Archive for July, 2008

How it Works: SQL Server Reporting Services and Dynamics CRM

At Tech-Ed Developer in Orlando a few weeks back I led an interactive session on CRM and Business Intelligence. The session was open to any CRM and BI topic, so I expected a lot of hard questions about data mining and the like, but the topic of greatest interest turned out to be the integration between CRM and SQL Server Reporting Services (SSRS). This should be helpful to folks with questions about deployment practices or with an interest in exposing CRM reports to users outside of the CRM application.

First things first:

  1. With CRM 4.0, reports are an entity within CRM. They have metadata, and CRM security is applied to determine whether or not a user may view the report. Note that this applies to the report itself, not the underlying data. Users can have access to CRM data that a report points to and not have access to the report itself. Likewise, a user may have access to a report but not the data that it would show (in which case the user could run the report, but it would return no data).
  2. The SQL Server Reporting Services Report Viewer is an ASP.NET control that runs on the CRM 4.0 Web server. In CRM 3.0, and when you interact with the SSRS Report Manager, that control runs on the Web server fronting SSRS. When you choose to run a report from CRM 4.0, the ASP.NET control requests the report and data from the remote SSRS box. In practical terms: in CRM 3.0 the URL for a report was the URL for the SSRS Web server; in CRM 4.0 the URL for a report is the URL for the CRM Web server.

Because CRM 4.0 reports are always run in a delegated mode, the CRM and SSRS integration has to handle security. There are two ways to do this in CRM 4.0. One is to use integrated authentication, where trust for delegation is required between the CRM server, the SSRS server, and the SQL server hosting the CRM database. This was the required configuration in CRM 3.0 and, frankly, it was a bit of a headache for folks to manage (see "HOW TO: Configure Kerberos authentication for Microsoft CRM 3.0 and Microsoft SQL Server Reporting Services" and "Microsoft CRM 3.0: Additional Setup Tasks Required if Reporting Services Is Installed on a Different Server").

The other mechanism is to use the SQL Server Reporting Services MS CRM connector. This connector runs as an SSRS data processing extension and handles all of the delegation for you. The data connector is recommended for Internet-facing deployments and anywhere users are not using Windows authentication to connect to CRM. When using the data connector, users of CRM cannot directly access the RDLs in SSRS; all management of reports must be done through the CRM reporting UI, and users connecting to the SSRS Report Manager will get an access-denied message if they try to browse reports.

Choosing a deployment type is up to you, and of course there are pros and cons either way. The following table describes some of them (if you have others, throw them in the comments).



|                                                            | SQL Server Reporting Services Data Connector | Kerberos Authentication |
|------------------------------------------------------------|:---:|:---:|
| Works with Internet Facing Deployments                     | ✓ |   |
| Schedule reports using the Report Scheduling wizard in CRM | ✓ | ✓ |
| Uses NT credentials to connect to SQL Views                |   | ✓ |
| Access CRM reports outside of CRM                          |   | ✓ |
| Use the CRM Report Wizard                                  | ✓ | ✓ |
| Keeps CRM data secure                                      | ✓ | ✓ |



The table speaks for itself, and I think that for most organizations the connector is probably the right way to go. But let me point out one item that is near and dear to me: "Access CRM reports outside of CRM". One of the great things about SSRS is its direct URL access to reports; along with that comes the ability to embed reports into Microsoft Office SharePoint sites, into PerformancePoint dashboards, onto your own ASPX pages using the ASP.NET control, or, my favorite, into forms of the CRM application itself. If you use the connector you won't be able to use URL access for reports; this is so useful, though, that we made sure to give you a workaround.

If you have the "Add Reporting Services Reports" privilege, you'll see a command on the Actions menu of the report form titled "Publish Report for External Use". This command publishes your report and any child reports to a directory in SSRS that is open to all CRM users. You can then embed the URL to that report, along with any arguments on the query string, within CRM or in the Report Viewer controls.

You won't get any feedback that this worked, so you'll just have to trust, but verify, that it did. Publishing multiple times will also overwrite any existing report with the same name in the target directory, so this isn't the most... elegant... solution, but there isn't a demo environment that I have that doesn't take advantage of it.
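To make the query-string part concrete, here's a small sketch of how such a direct-access URL could be assembled. The server name, folder, and parameter below are hypothetical placeholders; the path-then-parameters shape follows standard SSRS URL access.

```javascript
// Build a direct SSRS URL for a published report, passing report
// parameters on the query string. Server and report names below are
// hypothetical placeholders; substitute your own.
function buildReportUrl(serverUrl, reportPath, params) {
  var parts = [encodeURIComponent(reportPath)];
  for (var name in params) {
    parts.push(encodeURIComponent(name) + "=" + encodeURIComponent(params[name]));
  }
  return serverUrl + "?" + parts.join("&");
}

var url = buildReportUrl(
  "http://ssrsserver/ReportServer",
  "/CrmPublishedReports/Account Summary",
  { "rs:Command": "Render" });
console.log(url);
```

You would paste the resulting URL into an IFrame on a CRM form, a SharePoint page, or wherever you want the report to appear.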



Barry Givens


LinkedIn to Microsoft Dynamics CRM

Guest blogger CRM MVP Matt Wittemann is the director of the CRM practice at Customer Connect and has been working with CRM since it was first released. Matt has a passion for helping businesses get more profitable and efficient through the effective use of CRM tools and improved processes.

LinkedIn is slowly releasing APIs that allow developers to integrate its popular business-networking site with external applications. One of the first widgets they've released is the LinkedIn Company Insider. This widget allows Web sites to show LinkedIn connections at a given company from within the context of the site. More information is available on LinkedIn's developer site.

Of course, this would be tremendously helpful to salespeople who are trying to find the inside track when working a Lead in Microsoft Dynamics CRM. The following sample shows how to integrate this LinkedIn widget with CRM so it shows in an iFrame on the CRM Lead form.

The result is that you can open a Lead record and the iFrame will show you how many LinkedIn users work at the Lead’s company. You can click on them to open a new window to view all the connections, and, of course, you’ll get the most out of the widget if you have a LinkedIn account so you can reach out to these people in your sales process.


(Isn’t it cool that the top three people that come up for Microsoft are all CRM folks? Sorry Phil, Jim and Menno! I needed a real world example to show the functionality of this widget!)

To install this widget in CRM, create an IFrame on the Lead form. There's plenty of documentation on the web and in the SDK on creating IFrames for CRM, so I won't go into the steps here. But make sure to check the option to pass parameters to the IFrame and to clear the checkbox that restricts cross-frame scripting. You might also want to set the IFrame to be four or more rows high and expandable to fill any available space.

In my sample, I named the iFrame “linkedin” and I pointed it at my custom HTML page, which I put in the handy ISV folder inside the CRM web site. In previous versions of CRM, developers had to venture into unsupported territory and create their own custom folders inside the CRM web site folder structure, but in 4.0 there’s a handy ISV folder. Place a simple HTML page in the ISV folder with the following code, and point your new iFrame at this page by using a relative URL like “/ISV/linkedin.html” (be sure to replace the file name with whatever you’ve called your HTML page).




<!-- The widget script's src URL was dropped from the original post;
     get the current URL from LinkedIn's developer documentation. -->
<script src="" type="text/javascript"></script>

<span id="getlinkedin"></span>

<script type="text/javascript">
var parentForm = parent.frames.document.crmForm;
new LinkedIn.CompanyInsiderBox("getlinkedin", parentForm.all.companyname.DataValue);
</script>



This simple code in your HTML page references a JavaScript file housed on LinkedIn's server, and the script in the body of the page gets the value from the Company Name field on the Lead. (This won't work if the Company Name field is empty or missing from your Lead form; you might also consider adding some error checking to this sample for your production environment.)
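Along those lines, here's one way that error checking might look. The helper is a sketch; the field and element names match the sample above, and the widget call itself still requires LinkedIn's script and the CRM form.

```javascript
// Only initialize the widget when the Company Name field exists and has
// a value. The helper is pure so it can be reused or tested anywhere.
function getCompanyName(crmForm) {
  if (!crmForm || !crmForm.all || !crmForm.all.companyname) return null;
  return crmForm.all.companyname.DataValue || null;
}

// In the IFrame page you would then write something like:
//   var company = getCompanyName(parent.frames.document.crmForm);
//   if (company) new LinkedIn.CompanyInsiderBox("getlinkedin", company);

console.log(getCompanyName({ all: { companyname: { DataValue: "Microsoft" } } })); // Microsoft
console.log(getCompanyName({ all: {} })); // null
```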


Matt Wittemann

Microsoft Dynamics CRM E-mail Router and Exchange 2007: Keeping it Secure

In the last several months, we have seen a growing number of questions from customers and partners regarding configuration of the Dynamics CRM E-mail Router against Exchange 2007 in on-premise deployments.  Specifically, how can one work around those "invalid cert" pop-ups and the resultant error message during the Test Access phase of router configuration?  I mean, besides just un-enforcing the use of SSL, a solution more than a few have resorted to when pressed to find an easy answer.

By default, Exchange 2007 uses certificates to secure client access, and as such it's becoming increasingly important to familiarize oneself with the basics of PKI.  The problem is, even the basics tend to glaze the eyes of seasoned admins despite their best efforts.  Additionally, it's often tough to find a one-size-fits-all checklist, because when it comes to security, one size really can't fit all and still take into account specific environments and implementation goals.  The end result is that there are lots of articles scattered about that each tell part of the story.

Recently, I collaborated with our intrepid CRM Support Escalation team on such a case.  While I don't presume that my method can or will work for everyone in their production environments, it's the type of thing that I would recommend going through a couple of times in a lab or at home to demystify some of these concepts.  Once you've done this once or twice and are comfortable with the tools and methods, I think you'll find that implementing cert-based security, while not always the most intuitive process, can be done.


I have a lab environment that consists of four servers: a domain controller, an Exchange 2007 server, a CRM 4.0 full-server install, and its back-end SQL 2005 server, all running Windows 2003.  I'm trying to configure the E-mail Router on the DC, but I keep getting the following error:

Incoming Status: Failure – The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel. The remote certificate is invalid according to the validation procedure. 

This happens when I attempt Test Access in the Router Configuration Manager.  Additionally, I notice that when I try to use OWA, I get a pop-up error in IE regarding an invalid cert.

This seems to be the general blueprint for the types of questions we've been receiving, so I set out to repro the issue and document its resolution step by step.  I'll cite the resources I used further below since, as mentioned, it's tough to describe a catch-all procedure; I think you'll find you can adapt these and other steps to your own lab.

Establish a standalone root CA 

In my test environment, I decided it would be overkill to get a 3rd party cert but in production this might be the way you want to go.  In any case, for my lab the first step was to create a new Standalone Root Certificate Authority (CA).  A standalone root will basically be the first and last word in my environment as to a machine or user identity.  It will, of course, not be valid in the real world.  For my purposes, though, it’s perfect.  I decided to install the CA on my DC.

Open Control Panel > Add/Remove Programs > Windows Components > Certificate Services  and install Standalone Root CA.

You'll see some other options, including Enterprise CA, etc.  One of the biggest differences is that certificate requests must be *manually* approved or denied on a standalone CA.  Something to think about if you intend to use this in a non-lab environment.

Create a new certificate request on the Exchange 2007 server 

The next step is to generate a request for certification by your CA from the Exchange server.  There are different ways you can go about this, and you can even generate a request that validates the cert Exchange 2007 installs by default.  For my purposes, though, I used the following command:

From the Exchange Management Shell:

New-ExchangeCertificate -GenerateRequest -FriendlyName "AdamCert" -Path c:\request.req -SubjectName "c=us, o=adam bly, cn=ablye2k7.ablydom.local" -DomainName *.ablydom.local

FriendlyName –  This is just a simple, logical name for the cert

Path – File system location/filename for the REQ file this request will output

SubjectName – This is where a lot of people get hung up.  This is the NAME on the cert.  It should match whatever you want the name to resolve to (i.e., what you intend to enter in the router configuration).  This field can be multi-valued; that is, you could add the NetBIOS name so that you can use the cert internally without using the FQDN.  This could also be used to enter the CNAME or alias (e.g., my mail server is ablye2k7 but the CNAME is mail.ablydom.local).  Also, c is the country/region and o is the owner/organization (I used my name, but this could just as well be Contoso Ltd.; something to be aware of if you go the third-party route).

DomainName – The domain for which the cert is valid.  I used a wildcard, but this isn't strictly necessary.

The Certificate request (.REQ) file should be created where you specified in the Path parameter.
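As a side illustration (not part of the Exchange procedure), the subject string above breaks down into attributes like this. The parser is a simplification that assumes values contain no embedded commas.

```javascript
// Split a certificate subject string (like the one in the request above)
// into its attributes. Simplified: assumes no embedded commas in values.
function parseSubjectName(subject) {
  var attrs = {};
  subject.split(",").forEach(function (pair) {
    var i = pair.indexOf("=");
    attrs[pair.slice(0, i).trim().toLowerCase()] = pair.slice(i + 1).trim();
  });
  return attrs;
}

var subject = parseSubjectName("c=us, o=adam bly, cn=ablye2k7.ablydom.local");
console.log(subject.cn); // ablye2k7.ablydom.local; the name the router must use
```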

Submit the request to your CA and approve it 

The next step is to process the request and issue the certificate.  To start, I copied the REQ file to my Certificate Authority (my DC :)) and opened the Certification Authority MMC snap-in.    

Right-click on the server name and select All Tasks > Submit a new request.  In the Open Request File dialog, specify the REQ file you copied over.   Next, right-click on the Pending Requests node, select All Tasks > Issue.  Finally, click on the Issued Certificates node and double-click the cert you JUST issued. 

Make sure the certificate information includes "Ensures the identity of a remote computer" and that the "Issued to" name makes sense in your environment.  Mine would be ablye2k7.ablydom.local, based on the request above.  The "Valid from" range should run from the current date through at least a year in the future.  Now we need to export the cert in a format that Exchange can consume.

Open the certificate, click the Details tab, and select Copy to File, which invokes the Certificate Export Wizard.  Click Next at the Welcome screen, ensure the radio button for DER encoded binary X.509 (.CER) is selected, enter a filename and location for the .CER file, and click Finish.

Almost done!  Copy the certificate back to your Exchange server. 

Import and Enable the new cert 

Back on the Exchange server, open the Exchange Management Shell and run:

Import-ExchangeCertificate -Path c:\adam.cer | Enable-ExchangeCertificate -Services SMTP, IIS

If you are prompted, confirm the overwrite of the existing cert (if applicable).  Just a few more housekeeping chores and we're ready to go.

Fix up OWA, etc., Virtual Directories 

In the Exchange Management Console, expand Server Configuration > Client Access, select the Outlook Web Access tab, right-click owa (Default Web Site), and choose Properties.

Note the internal and external URLs; these should be the same (or whatever you specified in the SubjectName of the request).  Additionally, if you included the NetBIOS name of the server, you could set the internal URL to simply "http://ablye2k7/owa".

Theoretically, you would perform these same steps for the OAB, POP3, IMAP4, EWS, and AutoDiscover virtual directories:

[PS] C:\Documents and Settings\Administrator.ABLYDOM>Set-WebServicesVirtualDirectory -Identity "EWS*" -InternalURL https://ablye2k7.ablydom.local -ExternalURL https://ablye2k7.ablydom.local

[PS] C:\Documents and Settings\Administrator.ABLYDOM>Set-ClientAccessServer -Identity "ablye2k7" -AutoDiscoverServiceInternalUri https://ablye2k7.ablydom.local/autodiscover/autodiscover.xml

Configure the router 

Finally we’re ready to configure the router.  Our cert is in place and valid per our environment’s very own standalone root CA.  

My config profiles in the Router Configuration Manager look like the following:

IN – Location:  https://ablye2k7.ablydom.local

OUT – Location: https://ablye2k7.ablydom.local

Check "Use SSL"

It's very important that the URLs match what's on the cert!  This may seem like a "duh" type of thing, but you'd be surprised how many people (me included) have kicked themselves when they realized they were trying to use the NetBIOS name for the Exchange server when the FQDN is what was on the cert.  🙂
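The name check that bites people can be sketched roughly like this. It's an illustration of the matching rule, not the actual certificate validation logic.

```javascript
// Rough sketch of the host-name comparison that fails when the router URL
// does not match the certificate subject. Handles the single-label
// wildcard form (*.ablydom.local) used in the request above.
function certNameMatches(certName, host) {
  certName = certName.toLowerCase();
  host = host.toLowerCase();
  if (certName === host) return true;
  if (certName.indexOf("*.") === 0) {
    var dot = host.indexOf(".");
    return dot > 0 && certName.slice(2) === host.slice(dot + 1);
  }
  return false;
}

// The FQDN matches the wildcard; the bare NetBIOS name does not, which is
// the exact mistake described above.
console.log(certNameMatches("*.ablydom.local", "ablye2k7.ablydom.local")); // true
console.log(certNameMatches("*.ablydom.local", "ablye2k7"));               // false
```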

Test Access should succeed for BOTH methods, and everything should be secure.  Perhaps a little drawn out, but not entirely impossible, right?  Once you've run through this a few times, I think you'll agree it can be adapted to meet your environment's needs once you understand the end-to-end process.  The following resources helped me immensely when I was getting started:

More on Exchange 2007 and certificates – with real world scenario 

Creating a Certificate or Certificate Request for TLS

White Paper: Domain Security in Exchange 2007 

A big thank-you to the folks responsible for those from me. 

Adam Bly

Top 14 Microsoft Dynamics CRM Sites and Downloads

July 24, 2008

We put together this handout for the Microsoft Worldwide Partner Conference and it is something that people are finding really valuable so I thought I’d post a copy here.

Whether you’re interested in an online or on-premises solution, Microsoft Dynamics CRM has you covered. Check out just a few of the content and community areas available to help you create the perfect experience for your business.  

Microsoft Dynamics CRM


The Microsoft Dynamics CRM Web site is the place to go for information on all things CRM. You’ll find everything from an introduction to CRM and its value to your business to white papers, product specifications, customer testimonials, links to CRM communities, and much more.

Resource Center


The Resource Center is a one-stop shop, designed to help you get started, maximize your efficiency, and build your business. Using a community-centered approach, the Resource Center brings you some of the best community ideas via blogs, forums, and newsgroups right to your desktop. But if you like to explore them on your own, you’re just a click away. Ramp up now and gain the knowledge that will help you care for your customers and maximize profits.

Developer Center

Search for "crm developer center". Click the first link: Microsoft Dynamics CRM.

The Microsoft Dynamics CRM Developer Center is the place to go for information and sample code for developers. You’ll find both introductory and in-depth articles, overview and reference documentation, entity model diagrams for you to download, links to community and support, and much more.

SDK Download

Search for "crm 4.0 sdk". Click the first link: Download details: Microsoft Dynamics CRM 4.0 SDK.

The Microsoft Dynamics CRM SDK download package contains all the same documentation found in the MSDN library, as well as hundreds of code samples in both C# and Visual Basic .NET, code to build tools for registering plug-ins, and a design guide to help you create CRM add-ins that match the look of Microsoft Dynamics CRM.

CRM on the MSDN Code Gallery

Search for "code gallery". Click the first link: MSDN Code Gallery.

The MSDN Code Gallery is your one-stop resource for finding code to customize and enhance your CRM experience. Developers and software vendors can start here to find code solutions.

CRM on CodePlex

Search for "CRM". Click the first link to view all the CRM projects.

CodePlex is the Microsoft open-source, project-hosting Web site. Developers and software vendors can start a new project, join an existing one, or download software created by the community.

Implementation Guide

Search for "crm implementation guide". Click the link: Microsoft Dynamics CRM 4.0 Implementation Guide.

The Implementation Guide contains comprehensive information about how to plan, install, and maintain Microsoft Dynamics CRM 4.0. The planning tools include 43 templates, projects, and worksheets to help plan your implementation.

Community Home Page


The Microsoft Dynamics CRM Community is the place to go for information about all things CRM. You’ll find articles and links to blogs, forums, newsgroups, MVPs, and much more.

Team Blog


The Microsoft Dynamics CRM Team Blog is the place to go to connect with the CRM team. You’ll find articles on customization, development, and implementation. This is one of the fastest ways to get to know the CRM team.



Forums

The Microsoft Dynamics CRM Forums are your question-and-answer resource. You'll find answers to some of your most pressing questions and a team of crack experts ready to help.


Newsgroups

Search for "crm newsgroups". Click the link: Microsoft Business Solutions Newsgroups Home.

The Microsoft Dynamics CRM Newsgroups are threaded discussion groups that cover various aspects of CRM deployment and development. This is also where some customers come to ask questions.


CRM MVPs

Search for "mvp". Click the first link: Microsoft Most Valuable Professional.

The Microsoft MVP site recognizes exceptional technical community leaders from around the world. Here you can find experts who love to talk about all aspects of Microsoft Dynamics CRM.

CRM on Facebook

On Facebook, search for "Microsoft Dynamics CRM". Click the link: Microsoft Dynamics CRM.

The Microsoft Dynamics CRM group in Facebook is a community that facilitates networking and collaboration. Go here to meet people in your part of the world, view videos, and introduce your company to the group.

CRM on LinkedIn

In the LinkedIn group directory, search for "Microsoft Dynamics CRM". Click the link: Microsoft CRM. The Microsoft Dynamics CRM group is a community that fosters business networking. Here you can post a public resume, create and maintain business connections, and provide feedback to business acquaintances based on the services they provide.


Accessing a SQL Database from a Microsoft Dynamics CRM Plug-in

Have you ever had the need to access data in a non-CRM SQL database from within a plug-in? Let’s say that you register a plug-in with Microsoft Dynamics CRM that will pull additional data from another SQL database in order to pre-populate a newly created entity’s attributes or perform some calculation using the data from both databases.

The problem that you will run into is that the system account that the plug-in executes under needs to have login and data access to the SQL server and database, which is not enabled by default. In Microsoft Dynamics CRM, all plug-ins execute under the system account named “NT AUTHORITY\NETWORK SERVICE”. If you take a look at any Microsoft Dynamics CRM database, you will see that a login exists for the NETWORK SERVICE account.


Your SQL server administrator will need to create a SQL server login and assign database access permissions and roles for the NETWORK SERVICE account in order for your plug-in to be able to access the SQL database. Once this is configured you can connect to the database using a trusted connection string.

Data Source=myServer;Initial Catalog=myDataBase;Integrated Security=SSPI;

An alternate approach to creating a SQL server login account is to have your plug-in establish a connection to the SQL server using a connection string which includes login information. For example:

Data Source=myServer;Initial Catalog=myDataBase;User Id=myUsername; Password=myPassword;Integrated Security=false

Note that you must use Integrated Security=false, not Integrated Security=SSPI. This method has the disadvantage of sending login information in clear text over the network, which is less secure. You will also have to either hard-code the login information in the plug-in or pass it to the plug-in's constructor at run time. For more information on how to pass data to a plug-in at run time, refer to the Microsoft Dynamics CRM 4.0 SDK documentation under the topic "Writing the Plug-in Constructor".

How to Execute SQL Commands from a Plug-in using Impersonation

Sometimes you may need to execute SQL stored procedures or SQL commands in the context of the user who caused the plug-in to execute, instead of the Network Service system user. You can achieve this by using the EXECUTE AS command in SQL.

In order for impersonation to work, the NT AUTHORITY\NETWORK SERVICE login (see the previous figure), or the user ID used to connect from the plug-in when not using integrated authentication, must be granted the sysadmin role on the SQL server.


The following steps describe the process that a plug-in should implement.

1. Retrieve the domain name of the caller from Microsoft Dynamics CRM through the CrmService Web service. The systemuser entity contains domain information. You can execute a Retrieve on that entity to obtain the information.

2. Create the SQL connection to the target SQL database using the connection string specified in the secure or unsecure configuration attribute of the step. You can use integrated authentication or a hard coded SQL connection string as explained in the previous section of this blog.

3. Start the impersonation as the caller.

4. Execute any SQL commands or stored procedure that you want.

5. Revert the SQL execution context back to the Network Service system user.

The following plug-in sample code implements the previously described steps.

using System;
using System.Collections.Generic;
using System.Text;
using System.Xml;
using System.Data.SqlClient;
using Microsoft.Crm.Sdk;
using Microsoft.Crm.SdkTypeProxy;
using Microsoft.Crm.Sdk.Query;

public class AccessDatabase : IPlugin
{
    string m_secureConfig;
    string m_connectionString;

    public string SecureConfig
    {
        get { return m_secureConfig; }
        set { m_secureConfig = value; }
    }

    // Pass the connection string to the plug-in's constructor.
    // The string is defined during plug-in registration.
    public AccessDatabase(string config, string secureConfig)
    {
        m_connectionString = config;
        m_secureConfig = secureConfig;
    }

    public void Execute(IPluginExecutionContext context)
    {
        // Step 1. Get the domain name of the calling user.
        ICrmService crmService = context.CreateCrmService(false);
        systemuser callingUser = (systemuser)crmService.Retrieve(
            EntityName.systemuser.ToString(), context.UserId,
            new ColumnSet(new string[] { "domainname" }));

        // Step 2. Connect using the SQL connection string specified in
        // the configuration of the step.
        using (SqlConnection conn = new SqlConnection(m_connectionString))
        {
            conn.Open();
            SqlCommand comm = conn.CreateCommand();

            // Step 3. Start SQL impersonation as the caller.
            comm.CommandText = @"Execute as Login='" +
                callingUser.domainname + "'; ";

            // Step 4. Run the SQL commands that need to be executed.
            comm.CommandText += "SELECT SUSER_NAME(); ";

            // Step 5. Revert the SQL execution context back to the
            // Network Service system user.
            comm.CommandText += "revert;";
            comm.CommandType = System.Data.CommandType.Text;

            // For demonstration purposes, surface the user name returned
            // by the SELECT statement.
            throw new InvalidPluginExecutionException(
                "Impersonated user: " + (string)comm.ExecuteScalar());
        }
    }
}

For more information on the EXECUTE AS command, refer to SQL Server Books Online.


Ajith Gande and Peter Hecke

Categorizing and Displaying Reports in Different Languages

Microsoft Dynamics CRM includes a number of ready-to-use business reports and provides the capability for creating custom reports. In Microsoft Dynamics CRM 3.0, you could manage the reports only by using the Microsoft Dynamics CRM Web application. In Microsoft Dynamics CRM 4.0, you can manage the reports programmatically by using the Microsoft Dynamics CRM Web services. The reports are represented by a rich entity model that contains report, report category, report entity, report link and report visibility entities. One of the new features is categorizing and displaying the reports in different languages.  You can enable additional languages in Microsoft Dynamics CRM by installing Language Packs. This lets you display text in the user interface, online Help, and the reports in different languages. For more information about how to install Language Packs, see the Microsoft Dynamics CRM 4.0 Implementation Guide.

To categorize the reports by language, use the report.languagecode property. You can set the property to a specific locale ID (for example, 1033 for US English) to make the report visible to the users of that language. For example, the English out-of-the-box Account Summary report appears in the Reports grid in the English user interface, but not in the Spanish or German user interfaces in the same organization.

You can also set the report.languagecode property to -1 (minus one) to make the report visible to all users, both in the base-language user interface (the one installed during the original Microsoft Dynamics CRM server installation) and in the user interfaces in other languages. For more information about locale IDs, see "List of Locale ID (LCID) Values as Assigned by Microsoft".
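The visibility rule reduces to a one-line check; this sketch just restates the behavior described above.

```javascript
// A report tagged with a specific LCID is shown only in that language's
// UI; a report tagged -1 is shown in every language's UI.
function reportIsVisible(reportLanguageCode, userInterfaceLcid) {
  return reportLanguageCode === -1 || reportLanguageCode === userInterfaceLcid;
}

console.log(reportIsVisible(1033, 1033)); // true:  English report, English UI
console.log(reportIsVisible(1033, 3082)); // false: English report, Spanish UI
console.log(reportIsVisible(-1, 1031));   // true:  all-languages report, German UI
```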

You can use the report language information, in combination with information contained in the report entity, report category, and report visibility entities, to determine the areas and categories in the Microsoft Dynamics CRM Web application where the report is shown in different user interface languages.

Note   The Language element inside the report definition language (RDL) file does not determine where the report is shown inside the Microsoft Dynamics CRM Web application. It contains an expression that evaluates to a language code as defined in the Internet Engineering Task Force (IETF) RFC 1766 specification. The language code is used mainly for formatting numbers, dates, and times for a specified language. For more information, see "Language Element (Report) (RDL)" in SQL Server Books Online.
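To see what the language code actually affects, here's a quick formatting illustration. (Whether non-English locales format correctly under Node depends on its ICU data.)

```javascript
// The RDL Language element carries an RFC 1766 code such as "en-US" or
// "de-DE" that drives number, date, and time formatting, not visibility.
var value = 1234.5;
var enUS = new Intl.NumberFormat("en-US", { minimumFractionDigits: 1 }).format(value);
var deDE = new Intl.NumberFormat("de-DE", { minimumFractionDigits: 1 }).format(value);
console.log(enUS); // 1,234.5
console.log(deDE); // with full ICU data: grouped with "." and a "," decimal separator
```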

For more information, see the Report Writers Guide.


Inna Agranov

Plug-in Registration Tool 2.1

I have made some minor upgrades to the Plug-in Registration v2.0 tool, which is included in the Microsoft CRM 4.0 SDK download and can also be downloaded from the Code Gallery.

The following features were added for making the developer’s life easier.

1. Support for https:// in the tool

The Connections grid now accepts a Discovery Service server name in any of the following formats:

a. https://myMachine

b. mymachine

c. http://mymachine

When making Web service calls to https:// servers, the current logic blindly accepts all certificates. The program trusts whatever https:// endpoint you provide to the tool; it does not verify whether the certificate is expired or issued by a well-known authority such as VeriSign. You could change this behavior in the code.
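For illustration, the three accepted formats could be normalized like this. Defaulting a bare machine name to http:// is my assumption here, not the tool's documented behavior.

```javascript
// Normalize the accepted server-name formats to a full URL. A name that
// already carries a scheme is kept; a bare name is assumed to be http.
function normalizeServerName(input) {
  if (/^https?:\/\//i.test(input)) return input;
  return "http://" + input;
}

console.log(normalizeServerName("https://myMachine")); // https://myMachine
console.log(normalizeServerName("mymachine"));         // http://mymachine
console.log(normalizeServerName("http://mymachine"));  // http://mymachine
```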

2. Support to change the endpoint returned by the DiscoveryService Web service

Sometimes, due to incorrect configuration, the SDK endpoint returned by the discovery service is incorrect. I modified the tool to let you change the endpoint returned by the discovery service before any calls are made to the CrmService Web service.

3. Support registering images for a subordinate entity for Merge

The Merge request involves two entities, a parent entity and a subordinate entity. Inside the plug-in you might want to get the image of both the parent and the subordinate entity. Now, in Plug-in Registration Tool 2.1, when you register an image for Merge, a dialog is displayed that asks, "Do you want the image for Parent or Subordinate entity?" You can now register for both.

4. Plugin Registration in IFD/SPLA

IFD or SPLA installations do support plug-ins. You can now use the tool to register a plug-in on the Microsoft CRM server using AD authentication. You will need the DeploymentAdmin privilege to register a plug-in just like you would for an on-premise installation.

5. Works with Visual Studio 2008

6. Restrict image registration for a Create pre-event and Delete post-event

Images are snapshots of the entity's attributes at the corresponding stage in the pipeline. Consider the CreateRequest: a plug-in registered for a pre-event on Create fires before the Create platform operation occurs, that is, before the entity exists in the system. So it doesn't make sense to register an image for that step, and the new version of the tool no longer allows that registration.

Similarly, for a post-event Delete step, the entity no longer exists after the Delete operation, so it doesn't make sense to support registering a post-event image.
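The restriction boils down to a simple rule; this sketch restates it with hypothetical names.

```javascript
// No image can capture an entity that does not exist at that stage: no
// image before a record is created, and none after it is deleted.
function isValidImageRegistration(message, stage) {
  if (message === "Create" && stage === "PreEvent") return false;
  if (message === "Delete" && stage === "PostEvent") return false;
  return true;
}

console.log(isValidImageRegistration("Create", "PreEvent"));  // false
console.log(isValidImageRegistration("Delete", "PostEvent")); // false
console.log(isValidImageRegistration("Create", "PostEvent")); // true
```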

You can download the latest version of the Plug-in Registration tool (v2.1) from the Code Gallery.

Thanks for the feedback on prior versions of the tool.

Ajith Gande