In last week’s post, we took a look at how a custom Workflow activity can be implemented within Dynamics CRM/Dynamics 365 for Customer Engagement to obtain the name of the user who triggered the workflow. It may be useful to retrieve this information for a variety of reasons, such as debugging, logging user activity or automating the population of key record information. I mentioned in that post the “treasure trove” of information that the IWorkflowContext interface exposes to developers. Custom Workflow activities are not unique in exposing execution-specific information; an equivalent interface is at our disposal when working with plug-ins. No prizes for guessing its name – the IPluginExecutionContext.

When comparing both interfaces, some comfort can be found in that they share almost identical properties, thereby allowing us to replicate the functionality demonstrated in last week’s post as a Post-Operation Create step for the Lead entity. The order of work for this is virtually the same:

  1. Develop a plug-in C# class file that retrieves the User ID of the account that has triggered the plugin.
  2. Add supplementary logic to the above class file to retrieve the Display Name of the User.
  3. Deploy the compiled .dll file into the application via the Plug-in Registration Tool, adding on the appropriate execution step.

The emphasis with this approach, as will be demonstrated, is much more on working outside of the application; something you may not necessarily be comfortable with. Nevertheless, I hope that the remaining sections will provide enough detail to enable you to replicate the solution within your own environment.

Developing the Class File

As before, you’ll need to have ready access to a Visual Studio C# Class file project and the Dynamics 365 SDK. You’ll also need to ensure that your project has a Reference added to the Microsoft.Xrm.Sdk.dll. Create a new Class file and copy and paste the following code into the window:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PostLeadCreate_GetInitiatingUserExample : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity)
            {
                Entity lead = (Entity)context.InputParameters["Target"];

                //Use the context to obtain the GUID of the user who triggered the plug-in - only the user's ID is exposed, not their Display Name.

                Guid user = context.InitiatingUserId;

                //Then, use the custom GetUserDisplayName method to retrieve the fullname attribute value for the user record.

                string displayName = GetUserDisplayName(user, service);

                //Build out the Note record with the required field values: Title, Regarding and Description fields

                Entity note = new Entity("annotation");
                note["subject"] = "Test Note";
                note["objectid"] = new EntityReference("lead", lead.Id);
                note["notetext"] = @"This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" + Environment.NewLine + Environment.NewLine + "Executing User: " + displayName;

                //Finally, create the record using the IOrganizationService reference

                service.Create(note);
            }
        }
    }
}

Note also that you will need to rename the namespace value to match the name of your project.

To explain, the code replicates the same functionality developed as part of the Workflow in last week’s post – namely, create a Note related to a newly created Lead record and populate it with the Display Name of the User who triggered the plug-in.

Retrieving the User’s Display Name

After copying the above code snippet into your project, you may notice a squiggly red line on the following method call:

GetUserDisplayName is a custom method that needs to be added manually and is the means by which we retrieve the Display Name of the user, which is not returned as part of the IPluginExecutionContext. We therefore need to query the User (systemuser) entity to return the Full Name (fullname) field, which we can then use to populate our newly created Note record. The custom method that returns this value is provided below and should be placed after the closing brace of the Execute method, but before the final two closing braces of the class and namespace:

private string GetUserDisplayName(Guid userID, IOrganizationService service)
{
    Entity user = service.Retrieve("systemuser", userID, new ColumnSet("fullname"));
    return user.GetAttributeValue<string>("fullname");
}

Deploy to the application using the Plug-in Registration Tool

The steps involved in this do not differ greatly from what was demonstrated in last week’s post, so I won’t repeat myself 🙂 The only thing you need to make sure you do after you have registered the plug-in is to configure the plug-in Step. Without this, your plug-in will not execute. Right-click your newly deployed plug-in on the main window of the Registration Tool and select Register New Step:

On the form that appears, populate the fields/values indicated below:

  • Message: Create
  • Primary Entity: Lead
  • Run in User’s Context: Calling User
  • Event Pipeline Stage of Execution: Post-Operation

The window should look similar to the below if populated correctly. If so, then you can click Register New Step to update the application:
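
As an aside, if you would prefer to automate this part of the deployment, the step can also be created programmatically by writing a record to the sdkmessageprocessingstep entity. The sketch below is illustrative only; it assumes the plug-in assembly has already been registered, that an IOrganizationService connection is available and that the plug-in type name matches the example class from earlier in this post:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class StepRegistration
{
    //Helper - retrieves the Id of the first record matching a simple attribute = value filter.
    private static Guid GetId(IOrganizationService service, string entityName, string attribute, object value)
    {
        QueryExpression query = new QueryExpression(entityName) { ColumnSet = new ColumnSet(false), TopCount = 1 };
        query.Criteria.AddCondition(attribute, ConditionOperator.Equal, value);
        return service.RetrieveMultiple(query).Entities[0].Id;
    }

    public static Guid RegisterPostCreateLeadStep(IOrganizationService service)
    {
        Guid messageId = GetId(service, "sdkmessage", "name", "Create");
        Guid pluginTypeId = GetId(service, "plugintype", "typename", "D365.BlogDemoAssets.Plugins.PostLeadCreate_GetInitiatingUserExample");

        //The message filter ties the Create message to the lead entity.
        QueryExpression filterQuery = new QueryExpression("sdkmessagefilter") { ColumnSet = new ColumnSet(false), TopCount = 1 };
        filterQuery.Criteria.AddCondition("sdkmessageid", ConditionOperator.Equal, messageId);
        filterQuery.Criteria.AddCondition("primaryobjecttypecode", ConditionOperator.Equal, "lead");
        Guid filterId = service.RetrieveMultiple(filterQuery).Entities[0].Id;

        Entity step = new Entity("sdkmessageprocessingstep");
        step["name"] = "PostLeadCreate_GetInitiatingUserExample: Create of lead";
        step["plugintypeid"] = new EntityReference("plugintype", pluginTypeId);
        step["sdkmessageid"] = new EntityReference("sdkmessage", messageId);
        step["sdkmessagefilterid"] = new EntityReference("sdkmessagefilter", filterId);
        step["stage"] = new OptionSetValue(40); //40 = Post-Operation
        step["mode"] = new OptionSetValue(0);   //0 = Synchronous
        step["rank"] = 1;

        return service.Create(step);
    }
}

For a single step, though, the Registration Tool remains the quickest option.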

All that remains is to perform a quick test within the application by creating a new Lead record. After saving, we can then verify that the plug-in has created the Note record as intended:
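
If you prefer to verify the result from code rather than through the UI, a minimal smoke-test sketch along the following lines could be run from a console application. It assumes you already have an IOrganizationService connection to the instance and that the step above has been registered:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class PluginSmokeTest
{
    public static void Run(IOrganizationService service)
    {
        //Create a test Lead - the Post-Operation Create step registered above should fire.
        Entity lead = new Entity("lead");
        lead["subject"] = "Plug-in smoke test";
        lead["lastname"] = "Test";
        Guid leadId = service.Create(lead);

        //Retrieve any Notes (annotation) created against the new Lead and print their contents.
        QueryExpression query = new QueryExpression("annotation") { ColumnSet = new ColumnSet("subject", "notetext") };
        query.Criteria.AddCondition("objectid", ConditionOperator.Equal, leadId);

        foreach (Entity note in service.RetrieveMultiple(query).Entities)
        {
            Console.WriteLine(note.GetAttributeValue<string>("notetext"));
        }
    }
}

If the Note text prints out with the executing user's name, the plug-in has fired as expected.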

Having compared both solutions to achieve the same purpose, is there a recommended approach to take?

The examples shown in the past two blog posts illustrate nicely how solutions to specific scenarios within the application can be achieved in different ways. One could argue that there is a code-heavy (plug-in) and a light-touch coding (custom Workflow assembly) option available, depending on how comfortable you are working with the SDK. Plug-ins are a natural choice if you are confident working solely within Visual Studio or have a requirement to perform additional business logic as part of your solution; this could range from complex record retrieval operations within the application to an external integration piece involving specific and highly tailored code. The Workflow path clearly favours those of us who prefer to work within the application in a supported manner and, in this particular example, can make certain tasks easier to accomplish; as we have seen, retrieving the Display Name of a user is greatly simplified when we go down the Workflow route. Custom Workflow assemblies also offer greater portability and reusability, meaning that you can tailor logic that can be applied to multiple different scenarios in the future. Code reusability is one of the key drivers in many organisations these days, and the use of custom Workflow assemblies neatly fits into this ethos.

These are a few of the considerations you should weigh up when choosing the option that best fits your particular requirement, but it could be that the approach you feel most comfortable with ultimately wins the day; so long as this does not compromise the organisation as a consequence, that is an acceptable stance to take. Hopefully, this short series of posts has demonstrated the versatility of the application and the ability to approach challenges via equally acceptable pathways to resolution.

Dynamics CRM/Dynamics 365 for Customer Engagement (CRM/D365CE) is an incredibly flexible application for the most part. Regardless of how your business operates, you can generally tailor the system to suit your requirements and extend it to your heart’s content; often to the point where it is completely unrecognisable from the base application. Notwithstanding this argument, you will come across aspects of the application that are (literally) hard-coded to behave a certain way and cannot be straightforwardly overridden via the application interface. The most recognisable example of this is the Lead Qualification process. You are heavily restricted in how this piece of functionality acts by default but, thankfully, there are ways in which it can be modified if you are comfortable working with C#, JScript and Ribbon development.

Before we can start to look at options for tailoring the Lead Qualification process, it is important to understand what occurs during the default action within the application. In developer-speak, this is generally referred to as the QualifyLead message and most typically executes when you click the button below on the Lead form:

When called by default, the following occurs:

  • The Status/Status Reason of the Lead is changed to Qualified, making the record inactive and read-only.
  • New Opportunity, Contact and Account records are created and populated with (some of) the details entered on the Lead record. For example, the Contact record will have the First Name/Last Name values supplied on the preceding Lead record.
  • You are automatically redirected to the newly created Opportunity record.

This is all well and good if you are able to map your existing business processes to the application, but most organisations will typically differ from the application’s B2B-orientated focus. For example, if you are working within a B2C business process, creating an Account record may not make sense, given that this is typically used to represent a company/organisation. Or, conversely, you may want to jump straight from a Lead to a Quote record. Both of these scenarios would currently require bespoke development within CRM/D365CE. This can be broadly categorised into two distinct pieces of work:

  1. Modify the QualifyLead message during its execution to force the desired record creation behaviour.
  2. Implement client-side logic to ensure that the user is redirected to the appropriate record after qualification.

The remaining sections of this post will demonstrate how you can go about achieving the above requirements in two different ways.

Our first step is to “intercept” the QualifyLead message at runtime and inject our own custom business logic instead

I have seen a few ways that this can be done. One way, demonstrated here by the always helpful Jason Lattimer, involves creating a custom JScript function and a button on the form to execute your desired logic. As part of this code, you can then specify your record creation preferences. A nice and effective solution, but one that, in its guise above, will soon become obsolete as a result of the SOAP endpoint deprecation. An alternative is to instead deploy a simple C# plug-in class that ensures your custom logic is obeyed across the application, and not just when you are working from within the Lead form (e.g. you could have a custom application that qualifies Leads using the SDK). Here’s how the code would look in practice:

public void Execute(IServiceProvider serviceProvider)
    {
        //Obtain the execution context from the service provider.

        IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.MessageName != "QualifyLead")
            return;

        //Get a reference to the Organization service.

        IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        //Extract the tracing service for use in debugging sandboxed plug-ins

        ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        tracingService.Trace("Input parameters before:");
        foreach (var item in context.InputParameters)
        {
            tracingService.Trace("{0}: {1}", item.Key, item.Value);
        }

        //Modify the below input parameters to suit your requirements.
        //In this example, only a Contact record will be created
        
        context.InputParameters["CreateContact"] = true;
        context.InputParameters["CreateAccount"] = false;
        context.InputParameters["CreateOpportunity"] = false;

        tracingService.Trace("Input parameters after:");
        foreach (var item in context.InputParameters)
        {
            tracingService.Trace("{0}: {1}", item.Key, item.Value);
        }
    }

To work correctly, you will need to ensure this is deployed out on the Pre-Operation stage, as by the time the message reaches the Post-Operation stage, you will be too late to modify the QualifyLead message.
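
As an illustration of the earlier point about qualifying Leads outside of the form, the same message can be invoked directly via the SDK. The sketch below is a minimal example, assuming an existing IOrganizationService connection and the Guid of the Lead you wish to qualify; a Pre-Operation plug-in like the one above would still intercept and override the Create flags:

using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class LeadQualifier
{
    public static void Qualify(IOrganizationService service, Guid leadId)
    {
        QualifyLeadRequest request = new QualifyLeadRequest
        {
            LeadId = new EntityReference("lead", leadId),
            //These are the same flags that the Pre-Operation plug-in above overrides.
            CreateContact = true,
            CreateAccount = true,
            CreateOpportunity = true,
            Status = new OptionSetValue(3) //3 = Qualified (default Status Reason)
        };

        QualifyLeadResponse response = (QualifyLeadResponse)service.Execute(request);

        //Lists the Contact/Account/Opportunity records created as part of qualification.
        foreach (EntityReference created in response.CreatedEntities)
        {
            Console.WriteLine("{0}: {1}", created.LogicalName, created.Id);
        }
    }
}

With the Pre-Operation plug-in from this post registered, you would expect only the Contact reference to come back, regardless of the flags passed in.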

The next challenge is to handle the redirect to your record of choice after Lead qualification

Jason’s code above handles this effectively, redirecting to the newly created Account after the QualifyLead request has completed successfully (which can be tweaked to redirect to the Contact instead). The downside of the plug-in approach is that no equivalent redirect takes place. So, if you choose to disable the creation of an Opportunity record and then press the Qualify Lead button…nothing will happen. The record will qualify successfully (which you can confirm by refreshing the form), but you will then have to manually navigate to the record(s) that have been created.

The only way around this with the plugin approach is to look at implementing a similar solution to the above – a Web API request to retrieve your newly created Contact/Account record and then perform the necessary redirect to your chosen entity form:

function redirectOnQualify() {

    setTimeout(function(){
        
        var leadID = Xrm.Page.data.entity.getId();

        leadID = leadID.replace("{", "");
        leadID = leadID.replace("}", "");

        var req = new XMLHttpRequest();
        req.open("GET", Xrm.Page.context.getClientUrl() + "/api/data/v8.0/leads(" + leadID + ")?$select=_parentaccountid_value,_parentcontactid_value", true);
        req.setRequestHeader("OData-MaxVersion", "4.0");
        req.setRequestHeader("OData-Version", "4.0");
        req.setRequestHeader("Accept", "application/json");
        req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
        req.setRequestHeader("Prefer", "odata.include-annotations=\"OData.Community.Display.V1.FormattedValue\"");
        req.onreadystatechange = function () {
            if (this.readyState === 4) {
                req.onreadystatechange = null;
                if (this.status === 200) {
                    var result = JSON.parse(this.response);
                    
                    //Uncomment based on which record you wish to redirect to.
                    //Currently, this will redirect to the newly created Account record
                    var accountID = result["_parentaccountid_value"];
                    Xrm.Utility.openEntityForm('account', accountID);

                    //var contactID = result["_parentcontactid_value"];
                    //Xrm.Utility.openEntityForm('contact', contactID);

                }
                else {
                    alert(this.statusText);
                }
            }
        };
        req.send();
        
    }, 6000);     
}

The code is set to execute the Web API call 6 seconds after the function triggers, so as to allow adequate time for the QualifyLead request to finish and for the fields we need to become available.

To deploy out, we use the eternally useful Ribbon Workbench to access the existing Qualify Lead button and add on a custom command that will fire alongside the default one:

As this post has hopefully demonstrated, overcoming challenges within CRM/D365CE can often result in different – but equally valid – approaches to achieve your desired outcome. Let me know in the comments below if you have found any other ways of modifying the default Lead Qualification process within the application.

When Server-Side Synchronization (Server-Side Sync) was first introduced in Dynamics CRM 2013, I imagine that lots of application and email server administrators breathed a huge sigh of relief. The feature greatly simplified the amount of effort involved in integrating On-Premise/Online CRM instances with their equivalent Exchange Server versions. Previously, the only way of achieving such an integration was via the E-mail Router, a cumbersome application that provided limited integration between email servers and CRM (i.e. no synchronization of items such as Tasks, Appointments, and Contacts). Granted, the E-mail Router was desirable if you were running a non-Exchange based Email Server, but having to provision a dedicated computer/server as the “intermediary” for all email messages to flow through could start to make a simple application deployment grow arms and legs very quickly.

Since the addition of Server-Side Sync, the feature has been continually updated to make it more versatile and the de facto choice for getting your emails tagged back into CRM & Dynamics 365 for Enterprise (D365E) – to the extent that the Email Router will shortly be extinct. Server-Side Sync now supports hybrid-deployments (e.g. D365E Online to Exchange On-Premise and vice-versa), has been expanded to include other Email protocol types and is now tailored to provide easier mechanisms for diagnosing mail flow issues from within the application itself. The last of these developments is best epitomised by the introduction of the Server-Side Synchronization Monitoring dashboard, introduced in CRM 2016 Update 1:

With this Dashboard, Administrators now have a simplified means of monitoring the health of their Server-Side Sync settings, facilitating the easy identification of problematic mailboxes, mailboxes that have failed a Test & Enable and those that have a recurring error being generated on them. Having all of this information at their fingertips enables CRM administrators to be much more proactive in managing the health of their instance.

When recently working within some D365E organizations, which were originally provisioned as Dynamics CRM Online 2015 instances on the same Office 365 tenant, I noticed that the above Dashboard was missing:

Therefore, when clicking on the appropriate button in the sitemap area, an alternative Dashboard is loaded instead (either the user’s favourite Dashboard or a random one). So the question was – where has the Dashboard gone?

It turns out that the reason for its absence is down to an error during a previous major version upgrade, and it is an issue that may be encountered by CRM Organisations provisioned a few years back. After escalating to Microsoft Support for further assistance, we were able to find a workaround to make the Dashboard available on the instances that were missing it. The steps involved are relatively straightforward, but in our case we did have to resort to using a spare trial/demo instance that had the Dashboard installed successfully. The workaround steps are as follows:

  1. Log into an instance that contains the Dashboard. Go into Customizations and rename the Dashboard (doesn’t matter what you call it, as long as it’s not the same as the default name).
  2. Create a new unmanaged Solution and add in the renamed Dashboard. Export the solution as an Unmanaged Solution.
  3. Import the Solution into the instance that is missing it.
  4. Attempt to access the Dashboard and verify that it loads successfully.

You may be wondering why the Dashboard needs to be renamed before exporting. When attempting to import this Dashboard into any target instance with its default name, the component is automatically skipped during the import process. A Microsoft engineer advised that this is because an instance with the missing Dashboard actually thinks it is there and therefore does not try to import and overwrite a system component with the same name.
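
For those who would rather script steps 2 and 3 than work through the Solution export/import wizard, a rough sketch of the same operations via the SDK is shown below. The solution unique name is an assumption for illustration and would need to match whatever you called the unmanaged Solution containing the renamed Dashboard:

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class DashboardSolutionMover
{
    //Export the unmanaged Solution from the source instance that still has the Dashboard.
    public static byte[] ExportDashboardSolution(IOrganizationService sourceService)
    {
        ExportSolutionRequest export = new ExportSolutionRequest
        {
            SolutionName = "ServerSideSyncDashboard", //Assumed unique name of the unmanaged Solution
            Managed = false
        };
        ExportSolutionResponse response = (ExportSolutionResponse)sourceService.Execute(export);
        return response.ExportSolutionFile;
    }

    //Import the exported Solution zip into the target instance that is missing the Dashboard.
    public static void ImportDashboardSolution(IOrganizationService targetService, byte[] solutionZip)
    {
        ImportSolutionRequest import = new ImportSolutionRequest { CustomizationFile = solutionZip };
        targetService.Execute(import);
    }
}

The rename in step 1 would still need to be carried out in the source instance beforehand, for the reason described above.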

For the benefit of those who may also be experiencing the same issue and do not readily have access to another CRM/D365E instance, I have uploaded below two unmanaged solution files for the previous two versions of the application:

Version 8.1 (Dynamics CRM Online 2016 Service Pack 1)

Version 8.2 (December 2016 update for Dynamics 365)

The solution is a simple 1 component solution, containing a renamed version of the Server-Side Synchronization Monitoring Dashboard:

Once you have downloaded your Solution of choice above, follow steps 3-4 above, then go into the Solution and rename the Dashboard from Copy of Server-Side Synchronization Monitoring to Server-Side Synchronization Monitoring. The application will accept the changes and it will be as if you always had the Dashboard there in the first place 🙂

It seems strange that this issue occurred in the first place, and I suspect I may not be the only one who has faced it recently. The Microsoft engineer I spoke to seemed to confirm that they’ve had this type of issue crop up several times previously, which makes it strange that this has not been acknowledged and addressed as a bug within the application as part of the twice-yearly upgrade process. Notwithstanding this fact, it is good that an established workaround can be applied to fix the issue.

The talk around CRM at the moment is all about the Spring Wave, AKA CRM 2016 Update 1. Unlike Dynamics CRM 2015 Update 1, released at the same time last year, On-Premise customers can also take advantage of the latest update now by downloading Service Pack 1. It’s worth pointing out that some features, such as Guided Help and Field Service Management, are not made available to On-Premise customers at this time. Nevertheless, I think it’s a really positive step forward that On-Premise customers get most, if not all, of the latest updates as part of the Spring Wave, at the same time as Online customers. I’ve already talked previously about some of the great things to expect as part of the Spring Wave, so I would strongly recommend that organisations look at scheduling in their on-premise upgrade sooner rather than later. The main reason is the simplicity of the actual upgrade process, which is the focus for this week’s blog post:

How to Download Service Pack 1

If you have enrolled your Dynamics CRM installation into Windows Update, then currently SP1 does not appear when you run Check for Updates on your CRM Server (this will change starting from Q3, according to this knowledge base article). Therefore, you will need to download the update manually via the following link:

https://www.microsoft.com/en-us/download/details.aspx?id=52662

There are a number of install files available as part of the update. At first glance, it can be difficult to determine which update file relates to which component of your CRM installation, so I have provided a summary of this below:

CRM2016-Client-KB3154952-ENU-Amd64.exe – Outlook Client (64 Bit)
CRM2016-Client-KB3154952-ENU-I386.exe – Outlook Client (32 Bit)
CRM2016-Mui-KB3154952-ENU-Amd64.exe – Outlook Client (64 Bit) Language Pack – English
CRM2016-Mui-KB3154952-ENU-I386.exe – Outlook Client (32 Bit) Language Pack – English
CRM2016-Router-KB3154952-ENU-Amd64.exe – Email Router (64 Bit)
CRM2016-Router-KB3154952-ENU-I386.exe – Email Router (32 Bit)
CRM2016-Server-KB3154952-ENU-Amd64.exe – Server Update
CRM2016-Srs-KB3154952-ENU-Amd64.exe – Reporting Extensions

There is also a final file – CRM2016-Tools-KB3154952-ENU-amd64.exe – which I believe may be some kind of utility to perform database upgrades. I have been unable to get this working successfully on my end and, from the looks of it, the server update performs the database upgrade as part of the update. If anyone knows what this is for and how it works, let me know in the comments below.

Installing the Update

The update is really simple – so simple, in fact, that it can be completed in as little as six button presses!

After running the self-extracting installer, you will be greeted with the first screen of the setup process:

As is always the case, accept the license agreement:

Then, confirm that you are ready to begin the installation:

The installation process shouldn’t take long (about 10-15 minutes when I performed it). Once complete, you’ll then be able to view the installation log file and specify whether or not you want to restart the server immediately:

As part of the update, your organisations and databases will be automatically updated to the latest version, so there is no need to perform this step manually yourself afterwards.
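
If you want to double-check afterwards that an organization really has been brought up to the SP1 build, a quick sketch using the SDK’s RetrieveVersionRequest can confirm the version number (the 8.1 prefix check is purely illustrative):

using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class VersionCheck
{
    public static void PrintVersion(IOrganizationService service)
    {
        //Returns the organization's build version, e.g. a value starting 8.1 once SP1 has been applied.
        RetrieveVersionResponse response = (RetrieveVersionResponse)service.Execute(new RetrieveVersionRequest());
        Console.WriteLine("Organization version: " + response.Version);

        if (!response.Version.StartsWith("8.1"))
        {
            Console.WriteLine("This organization does not appear to be on the SP1 (8.1) build yet.");
        }
    }
}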

Reporting Extensions Update

The only thing to point out here is that this update needs to be installed on the same machine as your CRM deployment’s SSRS instance. Depending on your On-Premise deployment, your SSRS instance may be installed on a different server. The actual process of installing the update is identical to the above. Fortunately, however, a server reboot is not required 🙂

Outlook Client Update

Updating the Outlook client is also very similar and straightforward. Any existing connections will still work after the update. You will also need to install the update(s) for any currently installed language pack(s) immediately after this update is complete. In most cases, this will just be the one update for your CRM organization’s base language. Depending on the size and complexity of your CRM deployment, you may need to plan carefully to ensure a successful roll-out; you may be happy to note that the existing CRM client should still work against an 8.1 CRM instance, so you can always choose to defer this part of the update entirely.

The Testing Conundrum

As with any application update, it is always prudent to ensure that you have performed some kind of testing. This can be invaluable in identifying issues that can then be resolved in good time, without causing disruption to colleagues/end-users of the application.

The difficulty in this particular case is that the update appears to automatically update all organizations on a CRM instance in one fell swoop. So you cannot, for example, set up a copy of your primary CRM instance that you can perform a test upgrade on. You would therefore have to deploy an entirely separate CRM instance on which you can perform the necessary testing. You can use a trial key to cover the initial licensing hurdle, but you will still need to allocate hardware and put time aside to get your temporary testing environment set up.

Given that this release is not a “major” release of the application, I would tentatively argue that you can probably throw caution to the wind and perform the update with a minimal amount of testing, particularly if your CRM instance has not been significantly customised/extended. There do not appear to be any major changes under the hood of CRM in this release that could cause problems (unlike the 2015 Spring Wave release for CRM Online). To best inform your decision on this, you should consider the following factors:

  • Are you able to schedule your CRM update over a weekend or extended out of hours slot, and have resources in place to perform any post-update testing? If the answer is yes, then you may be able to get away with not performing any pre-update testing.
  • What is the maximum amount of time your CRM instance can be taken down for during normal working hours?
  • Does your CRM utilise the Email Router, CRM for Outlook and/or multiple language packs? If yes, then performing upgrade testing of these components may be beneficial.
  • Does your business have a spare Windows Server instance that can be used for a test upgrade, without incurring additional cost to the business? If yes, then a major impediment has been eliminated, meaning that you should look at performing a test update.

Has anyone performed an On-Premise upgrade to SP1 yet? Let me know your experience and comments below!