
Synchronizing and the Offline Client

One of the great features provided by Microsoft Dynamics CRM Online is the ability to take the application offline (e.g., to work on a plane).  This capability is made possible by our S+S (software + services) vision of leveraging the best of services (CRM Online) coupled with the best of software (Outlook).  This is not possible in a pure SaaS (software as a service) model, since locally installed software is required to achieve the functionality and an internet connection would not be available to serve up the hosted service.  The offline CRM Outlook client functionality works by combining Outlook, a SQL Server Express database to store the offline CRM data, and IIS on the laptop to render the web pages.  This is a competitive advantage of the Microsoft CRM offering that many of our competitors do not offer or are trying to replicate (thus proving the efficacy of the S+S vision).

Recently one of our customers asked me the following question:

“When I go Offline with the Outlook client and create a CRM Appointment, will it show up on my Outlook calendar?”

I believed the answer was yes, but I wanted to investigate and take the scenario a bit further to include the core synchronization elements (Appointments, Contacts, and Tasks).  So to test I:

  1. Installed the Offline Outlook Client.  There is only one Outlook client, which supports both subscriptions (Professional – no Offline, and Professional Plus – Offline).  To gain the Offline capabilities you must choose to install Outlook with Offline Access during the installation process.
    Online or Offline installation option
    NOTE: If you have the Outlook Client installed and upgrade your CRM Online subscription to Professional Plus, you will need to uninstall and reinstall the Outlook Client to gain the offline functionality.  This WILL NOT delete any of the records previously synchronized with CRM, and upon reinstall the CRM and Outlook record linkage is maintained.
  2. Went offline by clicking on the Go Offline button in the CRM toolbar.
  3. Created a CRM Appointment.
    CRM Appointment Created while Offline
  4. Confirmed the synchronization of the CRM Appointment to the offline Outlook Calendar.
    Confirmation of Outlook Appointment created while Offline
  5. Created a CRM Contact
    CRM Contact created while Offline
  6. Confirmed the synchronization of the CRM Contact to the offline Outlook Contacts.
    Confirmation of Outlook Contact created while Offline
  7. Created a CRM Task.
    CRM Task creation while Offline
  8. Confirmed the synchronization of the CRM Task to the offline Outlook Task.
    Confirmation of Outlook Task created while Offline
  9. Created a CRM Appointment in the CRM web client.
    Appointment created in web client while Outlook is offline
  10. Confirmed the Appointment created in CRM Offline has not synchronized to the web database.
    Confirmation that Offline Outlook Appointment does not exist in CRM Online (web) database
  11. Confirmed that the Appointment created in the CRM web client did not synchronize to the Offline Outlook client (only appt. created in step 3 exists).  Synchronized Outlook and CRM by clicking on the Go Online button in the CRM toolbar.
    Confirmation that webclient Appointment does not exist in Outlook.  Synchronizing changes made to offline and web client
  12. Confirmed the Offline created CRM Appointment and CRM Task were synchronized to the CRM web database.
    Task created while Offline now exists in CRM webclient.
  13. Confirmed the Online created Appointment was synchronized to the Outlook client.
    Appointment created in the web client now exists in Outlook

Hopefully this will help you get a better understanding of how the offline synchronization process works with the Microsoft Dynamics CRM Outlook client.

Cheers,

Eric Boocock

Microsoft Dynamics CRM 4.0 IWorkflowContext Interface

The IWorkflowContext interface provides a way to obtain the business data needed to correctly implement your business logic in a custom workflow activity.

In order to start we first need to understand how the IWorkflowContext is initialized.

Where does the IWorkflowContext come from?

The IWorkflowContext is initialized for a particular asyncoperation of type workflow (operationtype 10) when that asyncoperation is processed and executed by the Asynchronous Service. However, that is only where the context object is created; the data contained in the context object was collected prior to this action.

The Workflow Expansion Task plugin executes asynchronously for a particular message in CRM. For example, if you have published a workflow to execute on Create of account, the Workflow Expansion Task plugin will be registered as a step on the Create message for the account entity when the workflow is published. The step will execute asynchronously. The Workflow Expansion Task plugin retrieves all workflow rules that are published and that are configured to execute when the current message executes. So if more than one workflow is published to execute on the Create message, has account as its primary entity, and is accessible to you based on its scope setting, all of those workflows will be retrieved.

The Workflow Expansion Task plugin will loop over each of the workflows retrieved and create an asyncoperation with an operationtype of 10 (Workflow). When the asyncoperation is created, certain attributes are populated so that when the asyncoperation is processed by the Asynchronous Service it knows how to process the record. One of these is the data attribute. In the SDK documentation the data attribute is a string property. The data property contains one of two types of serialized objects:

  • AsyncOperationData

  • WorkflowData

Since we are working with an asyncoperation record of type workflow, the object is a WorkflowData object. When the asyncoperation is processed and executed by the Asynchronous Service, this WorkflowData object will be the basis of the IWorkflowContext object that we will have access to in our custom workflow activity.

Now that we know where the IWorkflowContext comes from, let’s look at how we can get access to the interface.

How to get the IWorkflowContext object

The IWorkflowContext object can be obtained through the IContextService interface. Great, another interface! How do you get the IContextService interface? Simple: we can obtain it through the GetService method on the ActivityExecutionContext, which is provided as a parameter to our overridden Execute method in our custom workflow activity.

protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
{
    IContextService contextService = (IContextService)executionContext.GetService(typeof(IContextService));

Now that we have our IContextService interface, we can now access the Context property which is of type IWorkflowContext.

protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
{
    IContextService contextService = (IContextService)executionContext.GetService(typeof(IContextService));

    IWorkflowContext context = contextService.Context;

    // ... custom business logic goes here ...

    return ActivityExecutionStatus.Closed;
}

 

Now that we know how to get the IWorkflowContext, let’s look at what data it can provide.

Great, we have the IWorkflowContext! Now what can we do with it?

In order to see what we have access to through the IWorkflowContext, we can view its definition in Visual Studio by right-clicking IWorkflowContext and then clicking “Go To Definition”. When we do this we see the following:

public interface IWorkflowContext
{
    Guid ActivationId { get; }
    Guid AsyncOperationId { get; }
    Collection<EntityDependencyBase> EntityDependencies { get; }
    PropertyBag InputParameters { get; }
    string MessageName { get; }
    Guid OrganizationId { get; }
    PropertyBag OutputParameters { get; }
    Guid PluginTypeId { get; }
    Guid PrimaryEntityId { get; }
    DynamicEntity PrimaryEntityImage { get; }
    string PrimaryEntityName { get; }
    DynamicEntity PrimaryEntityPostImage { get; }
    DynamicEntity PrimaryEntityPreImage { get; }
    PropertyBag SharedVariables { get; }
    string StageName { get; }
    Guid UserId { get; }

    ICrmService CreateCrmService();
    ICrmService CreateCrmService(bool asAdminUser);
    IMetadataService CreateMetadataService();
    IMetadataService CreateMetadataService(bool asAdminUser);
    bool EvaluateCondition(ConditionOperator op, object primaryValue, params object[] values);
    object EvaluateExpression(ExpressionOperator op, params object[] values);
    void PopulateEntitiesFrom(CrmWorkflow workflow, string parameterName);
    int RetrieveActivityCount(bool includeActivitiesCreatedByWorkflow);
}

Let’s go over each member of this interface.
Properties

ActivationId: This is the workflowid of the workflowbase record that is the activation record for the workflow. In other words, it identifies which workflow this is an instance of.

AsyncOperationId: This is the ID of the asyncoperationbase record.

EntityDependencies: This is a generic collection of EntityDependencyBase objects. An EntityDependencyBase object can be one of the following types:

PrimaryEntityDependency: Specifies the objectTypeCode (Int32), the comma-delimited attributes to retrieve for the primary entity (String), and the type of parameter it is (Type)

RelatedEntityDependency: Specifies the objectTypeCode (Int32), the comma-delimited attributes to retrieve for a related entity (String), and the type of parameter it is (Type)

You will have one PrimaryEntityDependency and zero or more RelatedEntityDependency objects.

InputParameters: InputParameters is a property bag which contains the objects from the request that triggered the workflow. For example, if the workflow executed due to a create message for account, the InputParameters property bag would contain the target account as a DynamicEntity. You access the objects in the property bag just as you would when using a plugin.

MessageName: The name of the message which triggered the workflow to execute. Create, Delete, Update, or SetState.

OrganizationId: The ID of the organization to which this asyncoperation belongs.

OutputParameters: OutputParameters works just like InputParameters but contains the output of the request. For Create, it would contain the GUID of the business entity that was created.

PluginTypeId: The ID of the plugin type. This is the ID for the compiled Windows Workflow that this asyncoperation is an instance of.

PrimaryEntityId: The ID (GUID) of the primary entity record that the workflow is running against.

PrimaryEntityImage: A DynamicEntity object that represents the primary business entity that the workflow is regarding. If you want the attributes included in the Create or Update request, you don’t want to use this; instead get the Target property out of the InputParameters property bag.

PrimaryEntityName: The schema name for the entity that this workflow is regarding.

PrimaryEntityPostImage: DynamicEntity which contains the primary entity’s ID attribute along with the values for the attributes defined in the PrimaryEntityDependency after the operation which triggered the workflow to execute. This is not specific to a singular workflow, but contains all the needed attributes for all workflows registered to execute on the same message for the same entity type.

PrimaryEntityPreImage: DynamicEntity which contains the primary entity’s ID attribute along with the values for the attributes defined in the PrimaryEntityDependency before the operation which triggered the workflow to execute. This is not specific to a singular workflow, but contains all the needed attributes for all workflows registered to execute on the same message for the same entity type.

SharedVariables: Another property bag in which you can store data to be persisted to the database for use throughout the execution of the workflow. SharedVariables will also contain any objects that were stored in the SharedVariables property bag of the plugin execution context during the entire life of the pipeline for the message which triggered the workflow to execute. So you could store an object in the SharedVariables property bag pre-Create of an account, and that object would be available in any workflows that were triggered on Create of the account.

Note: The IWorkflowContext’s SharedVariables property gets populated from the SharedVariables property on the IPluginExecutionContext for the “Workflow Expansion Task” asynchronous plugin. The “Workflow Expansion Task” executes in the Child pipeline. So if you want values from before the main operation (BeforeMainOperationOutsideTransaction) you need to register your Pre plugin step in the Child pipeline as well. Further, any SharedVariables you populate after the main operation (AfterMainOperationOutsideTransaction) will not be included even if you register your Post plugin step in the Child pipeline since the “Workflow Expansion Task” executes in the pipeline stage that is not available to third parties (AfterMainOperationInsideTransaction).

StageName: The name of the stage that the workflow is currently in.

UserId: If the workflow was executed on demand, this will be the ID of the user who executed the workflow. If the workflow was executed due to a message executing, then it will be the ID of the workflow owner.
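As a quick illustration, here is a hedged sketch of reading a few of these properties inside Execute. The Target access pattern mirrors the plugin model mentioned above under InputParameters; ParameterName.Target is the standard SDK constant, but treat the exact shape as an assumption to verify against your SDK version.

IContextService contextService = (IContextService)executionContext.GetService(typeof(IContextService));
IWorkflowContext context = contextService.Context;

// Which message fired the workflow, and against which record.
string message = context.MessageName;            // e.g. "Create"
Guid recordId = context.PrimaryEntityId;         // the primary record's ID
string entityName = context.PrimaryEntityName;   // e.g. "account"

// The target of the triggering request, accessed as in a plugin.
DynamicEntity target = (DynamicEntity)context.InputParameters.Properties[ParameterName.Target];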

Methods

CreateCrmService:

CreateCrmService(bool asAdminUser):

Both of these methods return an ICrmService interface. This is the optimal way to interact with CRM to execute the various requests in the CRM web service. The overload allows you to execute with elevated privileges if needed, in case the current user doesn’t have the required privileges to perform some action in your custom business logic.

CreateMetadataService():

CreateMetadataService(bool asAdminUser):

Both of these methods return an IMetadataService interface. This is the optimal way to interact with CRM to execute the various requests in the Metadata web service. The overload allows you to execute with elevated privileges if needed, in case the current user doesn’t have the required privileges to perform some action in your custom business logic.

RetrieveActivityCount: Retrieves the number of CRM activity records created so far by this workflow.

The following methods are for internal use only and are not intended for third-party use:

EvaluateCondition(ConditionOperator op, object primaryValue, params object[] values):

EvaluateExpression(ExpressionOperator op, params object[] values):

PopulateEntitiesFrom(CrmWorkflow workflow, string parameterName):
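Putting it all together, here is a minimal sketch of a custom workflow activity that uses the context. The entity and attribute names (“annotation”, “subject”, “notetext”, “objectid”) follow the standard CRM schema, but the activity itself is illustrative only and omits error handling.

using System;
using System.Workflow.Activities;
using System.Workflow.ComponentModel;
using Microsoft.Crm.Sdk;
using Microsoft.Crm.Workflow;

[CrmWorkflowActivity("Create Audit Note", "Samples")]
public class CreateAuditNoteActivity : SequenceActivity
{
    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        IContextService contextService = (IContextService)executionContext.GetService(typeof(IContextService));
        IWorkflowContext context = contextService.Context;

        // Create a CRM service that runs as the workflow user.
        ICrmService crmService = context.CreateCrmService();

        // Write a simple note against the record the workflow is running on.
        DynamicEntity note = new DynamicEntity("annotation");
        note["subject"] = "Workflow audit";
        note["notetext"] = context.MessageName + " on " + context.PrimaryEntityName;
        note["objectid"] = new Lookup(context.PrimaryEntityName, context.PrimaryEntityId);
        crmService.Create(note);

        return ActivityExecutionStatus.Closed;
    }
}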

Cheers,

Clint Warriner

How to toggle Smart Matching in Microsoft Dynamics CRM 4.0

Smart matching is the default and the only email correlation strategy shipped with CRM 4.0. Smart matching takes the subject and the recipients of the current email, generates a subject hash and address recipient hashes, and compares these hashes against older emails: a new email correlates with an old email when the subject hash matches and at least HashMinAddressCount of the address recipients match. HashMinAddressCount is registry-key based and has a default value of 2. Smart matching is not always the solution that customers are looking for.

Read more about Smart Matching.

How Smart matching works

When two emails have the same subject and at least two (HashMinAddressCount) matching recipients, we treat this as a correlation and set the regarding object of the new email to that of the old email, which may not be right.

For example, a service-based CRM setup might send an email with “Service Details” in the subject to a customer service representative for a case related to customer Bob, while a second email with the same subject line is addressed to the same CSRs but belongs to customer John Doe. In this case smart matching wrongly ties these two emails together.

Another scenario is HR sending emails with a “New Hire” subject line to a manager: each email could relate to a different new hire, but smart matching will wrongly correlate them.

In all these cases, users would have to use caution and always generate unique email subjects to avoid these wrong correlations, which is not really feasible. That is the reason for offering the tracking token strategy used in V3, where a token is automatically generated and appended to the subject and emails are then correlated based on the tracking token only.

How to use the tracking token-based solution

1. There is an indexed trackingtoken column in the EmailBase table that is used for correlating new emails with old emails when using the tracking token strategy instead of smart matching. This column is not populated for all emails in CRM v4 because smart matching is used by default.

To get this tracking token column populated for the past emails in the database, use the tool that comes with hotfix KB 969946. It uses SQL CLR to bring the power of regular expressions, which SQL Server lacks but which the CRM server code uses, to extract the tracking token out of the subject of past emails in an efficient manner.

For this tool to work correctly when the CRM Server and SQL Server are on different machines, make sure to set up the same directory structure on the SQL Server machine as the one where the tool was dropped on the CRM Server. The reason is that the DB scripts in the tool install the EmailTrackingTokenExtractor assembly in SQL Server and create a user-defined function that links to the external assembly function, so SQL Server looks for this assembly on the SQL Server machine.

If CRM is 32-bit and SQL is 64-bit, you will need both the 32-bit hotfix and the 64-bit hotfix. The 32-bit dll and the exe must be placed on the CRM server, and the 64-bit dll must be placed in the same path on the SQL server. We found that when SQL is 64-bit and you attempt to execute a SQL CLR type from a 32-bit assembly, you get an error.

This tool uses the same regular expressions that the CRM server code uses to extract the tokens and so it honors all the different tracking token prefixes and other tracking token settings set in the past.

Once you have had the tool extract all the tracking tokens from the subjects and populate the tracking token column in EmailBase, you are ready to enable the tracking token strategy.

The tool has to run at a time when there is no activity (to reduce contention on the Email table and to avoid a flood of emails arriving in between); once the tool run is complete, the hotfix can be installed and smart matching can be disabled. There is a chance that some emails arrive in that small delta of time, so you can re-run the tool to cover those emails if needed.

Without the power of SQL CLR we would have been forced to retrieve all the emails from the database, process them through the regular expressions in the C# layer, and bulk update these rows back to the database.

With the power of SQL CLR we can run updates as simple as:

update email set trackingtoken = dbo.ExtractToken(subject)
where trackingtoken is null
and subject is not null
and subject <> ''

where dbo.ExtractToken is a user-defined function that refers to the C# function EmailTrackingTokenExtractor.[EmailTrackingTokenExtractor.TrackingTokenExtractor].ExtractToken, which is defined in EmailTrackingTokenExtractor.dll and is now installed in SQL Server.

This eliminates all the round trips between the mid-tier and SQL Server and all the processing including the regular expression processing is happening on SQL Server itself.
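For illustration, a SQL CLR scalar function of this general shape could back the dbo.ExtractToken UDF. The actual regular expressions ship with the hotfix tool and mirror the CRM server code, so the pattern below (the default “CRM:” prefix followed by digits) is a simplified assumption for demonstration only.

using System.Data.SqlTypes;
using System.Text.RegularExpressions;
using Microsoft.SqlServer.Server;

public class TrackingTokenExtractor
{
    // Simplified stand-in for the real pattern used by the CRM server code.
    private static readonly Regex TokenPattern = new Regex(@"CRM:\d+", RegexOptions.Compiled);

    [SqlFunction(IsDeterministic = true, IsPrecise = true)]
    public static SqlString ExtractToken(SqlString subject)
    {
        if (subject.IsNull)
            return SqlString.Null;

        Match match = TokenPattern.Match(subject.Value);
        return match.Success ? new SqlString(match.Value) : SqlString.Null;
    }
}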

2. Follow the instructions in Knowledge Base article KB 958084: after installing the KB 958084 hotfix, set the DisableSmartMatching registry key to 1 to turn off smart matching.

For this hotfix to work, make sure that the DisableSmartMatching registry key is set to 1 and that the Use Tracking Token option is set in Settings. This directs the CRM email tracking manager to correlate emails using the tracking token (the CRM V3 strategy) instead of the default CRM v4 smart matching strategy.
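If you prefer to script the change, here is a minimal sketch. The key path follows KB 958084; run it on the CRM server as an administrator, and verify the path against the article for your deployment.

using Microsoft.Win32;

class DisableSmartMatchingSetup
{
    static void Main()
    {
        // Assumed key path per KB 958084.
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Microsoft\MSCRM", true))
        {
            key.SetValue("DisableSmartMatching", 1, RegistryValueKind.DWord);
        }
    }
}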

There is no impact if you switch back to smart matching, as we still store the address hashes and the subject hashes.

Cheers,

Chandra Akkiraju

Tapping into the Power of Workflows

The workflow engine in Microsoft Dynamics CRM is an area I personally spend a ton of time playing with. It is for me what JScript was in versions 1.x and 3.0. It has A TON of potential that many people have barely scratched the surface of.

While working with a customer over the past month, they wanted to track the number of widgets each one of their accounts has at each location. This works well until they want to run some queries, because there is no way to get the total number of widgets per account. How many accounts have brand A widgets in a quantity of over 200? How many have brand B?

That is where workflow comes in. Jon Goodwin from Microsoft Consulting Services showed me a quick and easy way to have workflow maintain a summary number on a parent entity: take quantities from a child entity and, using workflow, create a summary of the number of widgets owned.

Step 1: Create the Widget entity and add a Quantity Owned field to it, along with a field for the type of widget. (I also wanted to track by competitor, but that isn’t used in this example.)

Step 2: Create N:1 Relationship Between Widget and Account Entities. (So the Widgets are a Child Record of an Account)

Step 3: Create Summary Field on Account. (To hold how many widgets total you have!)

Step 4: Create a workflow on the Widget entity. You will want the workflow to fire on the create of a Widget as well as on a change of the number of widgets.


Add a step with the type of Update Record. Select Account as the Entity to update.

If you are anything like me, you generally close the Form Assistant, but in this case we want to leave it open. Click the Total Number of Widgets field, set the Operator to Increment By, and under Look For select Widget and Number of Widgets. In the Default Value, put a 0 so that if the user creates the record without filling it in, it doesn’t inflate the total count.


Save and Close the Workflow and Publish it.

So here is the finished product:


And if I browse to Widgets, you can see that we do in fact have 1011 widgets at this location for this customer.


So there is one more way that workflow can make CRM work better for your customers. Attached is the workflow rule if you want to play with it in your demo or testing environment.

Happy Workflowing!

Ben Vollmer

Fetch It Part 2

In the first blog post on using Fetch XML, I introduced a tool that allows you to execute Fetch XML statements to retrieve CRM data. In this post, I want to discuss some of the inner workings of executing Fetch XML statements that retrieve more than a single page of records. Here’s the link to the updated FetchIt tool.

The mechanism is a paging cookie that tells the CRM system where to start the retrieve. The paging cookie is returned in the result set of a fetch execution. The first step is to execute a Fetch XML request. This is pretty simple and documented well in the SDK.

string QueryString = @"<fetch mapping='logical'><entity name='new_test'><all-attributes/></entity></fetch>";
Result = this.m_crmService.Fetch(QueryString);

This will return a string of XML data:

<resultset morerecords="1" paging-cookie="&lt;cookie page=&quot;1&quot;&gt;&lt;new_testid last=&quot;{C855BD56-D955-DE11-90B3-001E0B5E0BF6}&quot; first=&quot;{37F4AB96-D755-DE11-90B3-001E0B5E0BF6}&quot; /&gt;&lt;/cookie&gt;"><result>……

Notice the morerecords and paging-cookie attributes in the root element.

The morerecords attribute indicates there is more data for the query; the limit per retrieve is 5000 records. To get more, you simply add the paging cookie to the next request.

string QueryString = @"<fetch mapping='logical' page='2' count='5000' " + Cookie + "><entity name='new_test'><all-attributes/></entity></fetch>";

To accomplish this you’ll need to do a bit of parsing and string manipulation. Make sure you grab the quote on the end. The extracted value (the whole paging-cookie attribute, including its closing quote) looks like this as a C# string:

string Cookie = "paging-cookie=\"&lt;cookie page=&quot;1&quot;&gt;&lt;new_testid last=&quot;{C855BD56-D955-DE11-90B3-001E0B5E0BF6}&quot; first=&quot;{37F4AB96-D755-DE11-90B3-001E0B5E0BF6}&quot; /&gt;&lt;/cookie&gt;\"";

Then insert it into the query along with the page attribute:

string QueryString2 = @"<fetch mapping='logical' page='2' count='5000' " + Cookie + "><entity name='new_test'><all-attributes/></entity></fetch>";


Next, all you need to do is execute the fetch request with the new string. On each call, increase the page number from the last until morerecords = 0.

Here are some code snippets. Make sure you add appropriate error handling; I put this together as a conceptual example.


string GetCookie(int idx, string DataChunk)
{
    // Find the end of the root element, starting from idx.
    int end = DataChunk.IndexOf(">", idx);

    // Find where the paging-cookie attribute begins.
    int start = DataChunk.IndexOf("paging-cookie=");

    int nCnt = end - start;

    // Grab the whole attribute, including the closing quote.
    string Cookie = DataChunk.Substring(start, nCnt);

    return Cookie;
}

public static string InsertCookie(string fetchXml, int PageNum, string Cookie)
{
    // Build the new root element with the page number and cookie.
    string Root = "<fetch mapping='logical' page='"
        + PageNum.ToString() + "' count='5000' "
        + Cookie + ">";

    // Locate the end of the existing root element.
    int RootEnd = fetchXml.IndexOf(">");

    if (RootEnd < 0)
    {
        return "";  // need to handle empty string
    }

    StringBuilder builder = new StringBuilder(fetchXml);

    int length = RootEnd + 1;

    // Remove the old root element.
    builder.Remove(0, length);

    // Insert the new root element.
    builder.Insert(0, Root);

    return builder.ToString();
}
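Putting the pieces together, a minimal driver loop might look like this sketch. It is conceptual only: it assumes the m_crmService field from the first snippet, reuses the two helpers above, and does simple string checks instead of proper XML parsing.

int page = 1;
string fetchXml = @"<fetch mapping='logical'><entity name='new_test'><all-attributes/></entity></fetch>";
string result = this.m_crmService.Fetch(fetchXml);
// ... process the first page of results ...

while (result.Contains("morerecords=\"1\""))
{
    // Pull the paging cookie out of the last result set and request the next page.
    string cookie = GetCookie(0, result);
    page++;
    result = this.m_crmService.Fetch(InsertCookie(fetchXml, page, cookie));
    // ... process this page of results ...
}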


cheers

-jonw

 


Enhanced Internet Lead Capture capabilities now available

For those currently using the Internet Lead Capture capabilities within CRM Online, you may have noticed that the functionality was enhanced recently.

Here are some highlights of what has improved:

– Canadian customers can now take advantage of Internet Lead Capture (AdCenter integration does not apply in Canada)

– There is support for vanity URLs, which lets you leverage your unique CRM Online org name for a specific landing page URL if you decide to have Microsoft host your internet lead capture page

– You have the ability to add any lead field (including custom lead fields) to the Internet Lead Capture web form

– There is an improved Lead Capture Home Page with embedded charts to show lead capture effectiveness

If you aren’t using this functionality today, you can easily activate it by clicking on “Landing Pages” or “Internet Leads” from the Sales or Marketing tab within the left navigation.

Performance counters for Microsoft Dynamics CRM 4.0

Microsoft Windows operating systems include a tool named Reliability and Performance Monitor (or Performance Monitor in some versions of the operating system). You can configure this tool to obtain and display performance data as system components run on your Microsoft Dynamics CRM deployment.

Performance objects provide sets of counters that generate data about how various components perform as they run in your Microsoft Dynamics CRM environment. For example, the Processor object collects metrics that show how one or more microprocessors are performing on a particular server.

Many performance objects are built into the operating system, and more are installed with software applications and services. For example, the performance objects installed with SQL Server and Exchange Server can help you monitor the performance of components that are relevant to your system.

This article describes the performance objects and counters that are installed with Microsoft Dynamics CRM.

Read more…

Cheers,

Michael Guthmann

CRM Online: Reporting Options

Microsoft Dynamics CRM Online is a fantastic CRM platform for most small medium businesses (SMB) – it is so easy and inexpensive to get started. 

However, one area of challenge for Microsoft CRM Online is custom reporting.  Some background on this: unlike Microsoft CRM On-Premise, Microsoft CRM Online does not support uploading custom SQL Server Reporting Services (SRS) reports (.RDL) or accessing the SQL Server database filtered views.  CRM Online does not allow users to directly access the CRM application server or database server – this is a good thing for maintaining security and the 99.9% uptime for all the companies running on Microsoft CRM Online.  In fact, I’m not aware of any other major on-demand business application vendor allowing its users to directly access the application or database server.

After working on numerous Microsoft CRM Online deployments, here is what I would suggest as a good practice for reporting:

Start with the “Out of the Box” options such as Default Reports, Advanced Find / Export to Excel, Mail Merge, and the Report Wizard to meet your CRM Online reporting needs.  If none of them are “good enough”, then go with custom report development (SDK).

Here are some good references/links to the reporting options mentioned above:

Default Reports:

  • To view the list of default reports that come with Microsoft CRM Online –
    from the CRM Client >> Workplace >> Reports

Advanced Find / Export to Excel:

  • Blog: Advanced Find / Export to Excel
  • Video: Using Advanced Find – Time: 10:02 minutes
    This video will show you how to use the Advanced Find feature to find and take action on targeted sets of data. It will also show how to create Saved Views (My Views).

Mail Merge:

Report Wizard:

 

Export and Integrate CRM Data to a Local Data Source (Access, SQL Server, SQL Server Express, etc.):
There are two approaches for this.  Please note that the two approaches are not supported by Microsoft CRM Support – use at your own risk/support.

 

  1. Subscribe to the Microsoft CRM Online Professional Plus edition, which supports the Offline Data Synchronization feature.  Set up a Microsoft CRM Online Outlook Client and ensure that all the needed CRM Online data is continuously updated to the local database.  Then use the Microsoft SQL Server Reporting Services (SRS) report development tool or other reporting / Business Intelligence (BI) tools to build the custom report against the local SQL Server Express CRM database used by the Microsoft CRM Online Professional Plus edition.  Note: SQL Server Express has a 4 GB database limit.
  2. Utilize an integration tool to extract CRM data to a local data source and then use a corresponding reporting tool.  For example, you can purchase and set up Scribe Software’s Insight to integrate between Microsoft CRM Online and a local SQL Server database.  The local SQL Server database will mimic the needed CRM Online tables so you can use the Microsoft SQL Server Reporting Services (SRS) report development tool or other reporting / Business Intelligence (BI) tools to build the custom report against the local data.

Custom Development :
With custom development using the Microsoft Dynamics CRM SDK, the reporting options are endless. 

Here is a link to a FREE software utility download – Workopia CRM Online Statistics.  It was built by one of our developers using the Microsoft CRM SDK and produces a report of the current record count of Microsoft CRM Online entities including Account, Contact, Lead, Opportunity, and much more. It reports the breakdown for the different record states, the Lead count from different Lead Sources, and the count of Opportunity records that originated from a Lead record.   Enjoy.

Here’s a case study regarding custom report development for Microsoft CRM Online:

One of our Microsoft CRM Online clients in the high tech engineering services industry had a need for several reports, some of which are used as “summary printouts” to be given to the customer upon service completion.  These reports contain very specific formatting, layout, calculations, graphics, and data content from various related Microsoft CRM areas, spanning more than two levels of data relationships.

We tried the “Out of the Box” options to see if any was “good enough”, but none of them were acceptable in this case.  The client loves Microsoft CRM Online, but without their critical reports they would have had to switch to Microsoft CRM On-Premise or to another CRM system.

We decided to build their Microsoft CRM Online reports via custom report development – this worked great!   The client can now click a report button from either the CRM Online web client or the Outlook client to access these custom reports.  The client got the EXACT reports they were looking for and was able to stay with Microsoft CRM Online without switching to another CRM system.  The effort to build these custom reports was about the same as building them via SRS custom report development for an On-Premise environment.

In summary, if you have data in Microsoft CRM Online that you would like to report on, and none of the “Out of the Box” reporting options are good enough, then try custom report development.

Cheers,

CRM MVP Frank Lee

Fetch It

My colleague Tony Clayton mentioned the other day that he’d love to have a tool that allowed him to enter Fetch XML queries to rip data out of Microsoft Dynamics CRM Online. Fetch XML is a part of the CRM technology platform that allows you to create queries using an XML language. The queries can be executed against a CRM Online data store to fetch data. See the Microsoft CRM SDK topics Using FetchXML and FetchXML Schema.

Here are some examples of Fetch queries from the SDK.

1. Return all attributes from the account entity for accounts created today.

<fetch mapping='logical'>
   <entity name='account'><all-attributes/>
      <filter type='and'>
         <condition attribute='createdon' operator='today'/>
      </filter>
   </entity>
</fetch>

2. Inner Joins between Entities

<fetch mapping='logical' page='1' count='10'>
   <entity name='invoice'>
      <attribute name='name'/>
      <link-entity name='invoicedetail' to='invoiceid' from='invoiceid'>
      </link-entity>
      <filter type='and'>
         <condition attribute='accountid' operator='eq'
                    value='{7F901912-DD67-47B9-A5B3-B702B9F84680}'/>
      </filter>
      <link-entity name='account' to='accountid'>
         <filter type='and'>
            <condition attribute='name' operator='like'
                       value='%Account%'/>
         </filter>
      </link-entity>
   </entity>
</fetch>

 

While building Fetch XML Queries might take some time to learn, they can be very powerful in data searching.

How can you use Fetch XML queries?
Typically, fetch statements are used in custom-built software to access data. I’ve built a tool that will allow you to run them in a small client-side application and store the resulting data as an XML file. You can then use Excel 2007 to convert the XML to CSV. You could also write an XSLT transform to convert the XML into the needed data format.
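As a quick sketch of the XSLT route (the file names here are placeholders, not part of the tool):

using System.Xml.Xsl;

class TransformFetchResults
{
    static void Main()
    {
        XslCompiledTransform xslt = new XslCompiledTransform();
        xslt.Load("FetchResultsToCsv.xslt");   // your stylesheet
        xslt.Transform("FetchItResults.xml", "FetchItResults.csv");
    }
}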

Since I thought it might be nice to schedule data fetches, I wrote the tool to be ‘hands free’. This way you can run it automatically with the Windows Task Scheduler if you want.

The tool is pretty simple to use.

1. Unzip to a folder. Do not run out of the zip file.

2. Configure the settings

To configure the tool, you’ll need to set values in the FetchIt.exe.config file.

Set the values shown below:

  • Your OrgName is the first part of your URL.  For example, if your URL is https://acme.crm.dynamics.com then your org name is acme.
  • FileName is the name of the file that will be produced. The tool will add an .xml to the end of it and use the default directory.
  • RunUnAttended will toggle the application to automatically close when done.
  • FetchQuery is where you put the fetch statement to execute.

Make sure to replace < with &lt; and > with &gt; (sorry, but this is an unfortunate but necessary step).

Example –

Change:

<fetch mapping='logical'><entity name='account'><all-attributes/></entity></fetch>

To

&lt;fetch mapping='logical'&gt;&lt;entity name='account'&gt;&lt;all-attributes/&gt;&lt;/entity&gt;&lt;/fetch&gt;

 

 

<userSettings>
    <FetchIt.Properties.Settings>
        <setting name="UserName" serializeAs="String">
            <value>UserWindowsLiveID</value>
        </setting>
        <setting name="Password" serializeAs="String">
            <value>PassWord</value>
        </setting>
        <setting name="OrgName" serializeAs="String">
            <value>CRM Org Name</value>
        </setting>
        <setting name="FetchQuery" serializeAs="String">
            <value>&lt;fetch mapping='logical'&gt;&lt;entity name='account'&gt;&lt;all-attributes/&gt;&lt;/entity&gt;&lt;/fetch&gt;</value>
        </setting>
        <setting name="FileName" serializeAs="String">
            <value>FetchItResults</value>
        </setting>
        <setting name="RunUnAttended" serializeAs="String">
            <value>False</value>
        </setting>
    </FetchIt.Properties.Settings>
</userSettings>

The last tip is that the tool will write event messages to the Windows Event Viewer Log in a source named FetchIt.


Here is a screen shot of a successful fetch with RunUnAttended = false. That way I can see the window messages.


You can download the tool from my SkyDrive at FetchIt.

Cheers,

Jon White

Service Accounts – Non-Interactive Users

A non-interactive user is a user account in Microsoft Dynamics CRM Online that can access the system, but only via the web service layer; essentially, that user cannot use the user interface. Service accounts are used to access CRM Online using the service-to-service model. A service account is a non-interactive user account with the proxy role assigned to it. Microsoft Dynamics CRM Online allows 5 free non-interactive user accounts. To make a user account non-interactive, you need to change its access mode. The access mode attribute is not visible in the UI by default; you can either customize the form to show it, or manipulate it with an SDK call.  Setting the access mode to non-interactive simultaneously frees up a license and prevents that identity from logging in interactively.

To change the user account mode, add the access mode attribute to the form


Next, you need to publish your customizations, then open the specific user record and set the access mode.

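If you go the SDK route instead, here is a minimal sketch in the CRM 4 web service style. It assumes an authenticated CrmService instance (crmService) and the target user’s GUID (userId); the accessmode picklist value of 4 for Non-interactive is an assumption to verify against your deployment’s metadata.

DynamicEntity user = new DynamicEntity();
user.Name = "systemuser";

KeyProperty idProperty = new KeyProperty();
idProperty.Name = "systemuserid";
idProperty.Value = new Key();
idProperty.Value.Value = userId;

PicklistProperty accessMode = new PicklistProperty();
accessMode.Name = "accessmode";
accessMode.Value = new Picklist();
accessMode.Value.Value = 4; // assumed value for Non-interactive

user.Properties = new Property[] { idProperty, accessMode };
crmService.Update(user);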

See also:


Cheers –

Jon White

Including External Data in Microsoft Dynamics CRM 4.0 Reports

So you want to create a report that references data from another database.  Perhaps you want to correlate your CRM data with your ERP data to see how CRM activities are affecting your orders in financials.  Of course, you could do an integration and bring the ERP data into CRM.  However, an integration can require a big investment of time and money, and a report can be an affordable alternative if you don’t need the data for other purposes.

Another reason for doing this is if you want to have reports reference a mirrored copy of your CRM database, so heavy reporting doesn’t impact CRM performance.

The Challenge

To understand the challenge of reporting on external databases from CRM, we need to discuss the way CRM report data sources work.  When you upload an SSRS report to CRM, the data source of the report is modified to point to the CRM environment to which the report is uploaded.  For example, if the data source referenced in your SSRS report is Contoso_MSCRM and you upload the report to a different environment with database ACME_MSCRM, the data source will be changed to ACME_MSCRM on upload.

This is typically a good thing, as it enables us to write a report for one environment and upload it to multiple other environments without having to modify the data source.  However, in this case, it creates some challenges.

Say I have a report where my Microsoft Dynamics GP database is the data source.  If I upload the report to CRM, the data source will be changed to CRM, and the report will not work.

One option is to upload the report directly to the report server, bypassing CRM reporting.  When you upload the report directly to the report server, the report retains the original data source.  You can then make the report accessible to CRM users by creating a new report record and selecting “Link to Web Page” and adding the URL of your report on the report server as the “Web Page URL.”

This is not always a great option.  You lose certain CRM report functionality, like prefiltering, and users who don’t have access to the Report Server (such as users who access CRM via Internet Facing Deployment) will not be able to run the report.

The Solution

In this example, my Microsoft Dynamics CRM database is called CRM_MSCRM, and I have a copy of my CRM database called Mirror_MSCRM.

1.  Create the SSRS report, selecting your production MSCRM database as the data source.  In this case I selected CRM_MSCRM as my data source.


2. Write your query referencing the external database.  For example, in this case I want to select my accounts from the mirrored database.  This is how I would write the query:

SELECT *  FROM MIRROR_MSCRM.DBO.ACCOUNT

The result is that our report data source is our production database, but the query will return data from the external database.  This is a good start, but there is one more thing we will want to do.

3.  Join your query to the Filtered Views of your CRM database.  This will allow us to use prefiltering, and in the case of your mirrored database, this will prevent security issues, as users will only return records that they have a right to view.

SELECT * FROM MIRROR_MSCRM.DBO.ACCOUNT INNER JOIN
FilteredAccount AS CRMAF_FilteredAccount ON
MIRROR_MSCRM.DBO.ACCOUNT.accountid = CRMAF_FilteredAccount.accountid

Now you can upload your report through CRM, use CRM prefiltering, and reflect data from your external database, and your CRM data will remain secure.

One consideration is that the user running the report will need database read permissions on the external database.  Also, if you are running the SRS Data Connector, you will need to give NT AUTHORITY\NETWORK SERVICE read/select permissions on the external database.  After you do that, IFD users will be able to run the report.

Cheers,

Joel Lindstrom

CustomerEffective blog