Dynamics CRM/Dynamics 365 for Customer Engagement (CRM/D365CE) is an incredibly flexible application for the most part. Regardless of how your business operates, you can generally tailor the system to suit your requirements and extend it to your heart’s content; often to the point where it is completely unrecognisable from the base application. Notwithstanding this argument, you will come across aspects of the application that are (literally) hard-coded to behave a certain way and cannot be straightforwardly overridden via the application interface. The most recognisable example of this is the Lead Qualification process. You are heavily restricted in how this piece of functionality acts by default but, thankfully, there are ways in which it can be modified if you are comfortable working with C#, JScript and Ribbon development.

Before we can start to look at options for tailoring the Lead Qualification process, it is important to understand what occurs during the default action within the application. In developer-speak, this is generally referred to as the QualifyLead message and most typically executes when you click the button below on the Lead form:

When called by default, the following occurs:

  • The Status/Status Reason of the Lead is changed to Qualified, making the record inactive and read-only.
  • New Opportunity, Contact and Account records are created and populated with some of the details entered on the Lead record. For example, the Contact record will have the First Name/Last Name values supplied on the preceding Lead record.
  • You are automatically redirected to the newly created Opportunity record.

This is all well and good if you are able to map your existing business processes to the application, but most organisations will typically differ from the application’s B2B-orientated focus. For example, if you are working within a B2C business process, creating an Account record may not make sense, given that this is typically used to represent a company/organisation. Or, conversely, you may want to jump straight from a Lead to a Quote record. Both of these scenarios currently require bespoke development to accommodate within CRM/D365CE. This can be broadly categorised into two distinct pieces of work:

  1. Modify the QualifyLead message during its execution to force the desired record creation behaviour.
  2. Implement client-side logic to ensure that the user is redirected to the appropriate record after qualification.

The remaining sections of this post will demonstrate how you can go about achieving the above requirements in two different ways.

Our first step is to “intercept” the QualifyLead message at runtime and inject our own custom business logic instead

I have seen a few ways that this can be done. One way, demonstrated here by the always helpful Jason Lattimer, involves creating a custom JScript function and a button on the form to execute your desired logic. As part of this code, you can then specify your record creation preferences. A nice and effective solution, but one that, in its guise above, will soon become obsolete as a result of the SOAP endpoint deprecation. An alternative is to instead deploy a simple C# plugin class that ensures your custom logic is obeyed across the application, and not just when you are working from within the Lead form (e.g. you could have a custom application that qualifies leads using the SDK). Here’s how the code would look in practice:

using System;
using Microsoft.Xrm.Sdk;

namespace D365.BlogDemoAssets.Plugins
{
    //Illustrative class name - adjust to suit your own assembly/naming conventions.
    public class PreOperationQualifyLead : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            //Obtain the execution context from the service provider.
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            if (context.MessageName != "QualifyLead")
                return;

            //Get a reference to the Organization service.
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Extract the tracing service for use in debugging sandboxed plug-ins.
            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            tracingService.Trace("Input parameters before:");
            foreach (var item in context.InputParameters)
            {
                tracingService.Trace("{0}: {1}", item.Key, item.Value);
            }

            //Modify the below input parameters to suit your requirements.
            //In this example, only a Contact record will be created.
            context.InputParameters["CreateContact"] = true;
            context.InputParameters["CreateAccount"] = false;
            context.InputParameters["CreateOpportunity"] = false;

            tracingService.Trace("Input parameters after:");
            foreach (var item in context.InputParameters)
            {
                tracingService.Trace("{0}: {1}", item.Key, item.Value);
            }
        }
    }
}

To work correctly, you will need to ensure this is deployed out on the Pre-Operation stage, as by the time the message reaches the Post-Operation stage, you will be too late to modify the QualifyLead message.
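For reference, the alternative scenario mentioned earlier – a custom application qualifying Leads through the SDK – would issue a QualifyLeadRequest along the lines of the sketch below. This is illustrative only (the IOrganizationService connection, the Lead GUID and the class/method names are assumed); the Pre-Operation plugin above will intercept a request like this in exactly the same way as a qualification triggered from the Lead form:

using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class LeadQualificationExample
{
    //"service" is assumed to be an already authenticated IOrganizationService;
    //"leadId" is the GUID of the Lead you wish to qualify.
    public static void QualifyLead(IOrganizationService service, Guid leadId)
    {
        QualifyLeadRequest qualifyRequest = new QualifyLeadRequest
        {
            LeadId = new EntityReference("lead", leadId),
            CreateContact = true,      //these flags will be overridden by the Pre-Operation plugin above
            CreateAccount = true,
            CreateOpportunity = true,
            Status = new OptionSetValue(3) //"Qualified" Status Reason in a vanilla organisation
        };

        QualifyLeadResponse qualifyResponse = (QualifyLeadResponse)service.Execute(qualifyRequest);

        //The response lists the records that were actually created once the plugin has run.
        foreach (EntityReference created in qualifyResponse.CreatedEntities)
        {
            Console.WriteLine("{0}: {1}", created.LogicalName, created.Id);
        }
    }
}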

The next challenge is to handle the redirect to your record of choice after Lead qualification

Jason’s code above handles this effectively, redirecting to the newly created Account after the QualifyLead request has completed successfully (this can be tweaked to redirect to the Contact instead). The downside of the plugin approach is that no equivalent redirect takes place. So, if you choose to disable the creation of an Opportunity record and then press the Qualify Lead button…nothing will happen. The record will qualify successfully (which you can confirm by refreshing the form), but you will then have to manually navigate to the record(s) that have been created.

The only way around this with the plugin approach is to look at implementing a similar solution to the above – a Web API request to retrieve your newly created Contact/Account record and then perform the necessary redirect to your chosen entity form:

function redirectOnQualify() {

    setTimeout(function(){
        
        var leadID = Xrm.Page.data.entity.getId();

        leadID = leadID.replace("{", "");
        leadID = leadID.replace("}", "");

        var req = new XMLHttpRequest();
        req.open("GET", Xrm.Page.context.getClientUrl() + "/api/data/v8.0/leads(" + leadID + ")?$select=_parentaccountid_value,_parentcontactid_value", true);
        req.setRequestHeader("OData-MaxVersion", "4.0");
        req.setRequestHeader("OData-Version", "4.0");
        req.setRequestHeader("Accept", "application/json");
        req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
        req.setRequestHeader("Prefer", "odata.include-annotations=\"OData.Community.Display.V1.FormattedValue\"");
        req.onreadystatechange = function () {
            if (this.readyState === 4) {
                req.onreadystatechange = null;
                if (this.status === 200) {
                    var result = JSON.parse(this.response);
                    
                    //Uncomment based on which record you wish to redirect to.
                    //Currently, this will redirect to the newly created Account record
                    var accountID = result["_parentaccountid_value"];
                    Xrm.Utility.openEntityForm('account', accountID);

                    //var contactID = result["_parentcontactid_value"];
                    //Xrm.Utility.openEntityForm('contact', contactID);

                }
                else {
                    alert(this.statusText);
                }
            }
        };
        req.send();
        
    }, 6000);     
}

The code is set to execute the Web API call 6 seconds after the function triggers, to ensure the QualifyLead request has adequate time to finish and make the fields we need available for access.

To deploy out, we use the eternally useful Ribbon Workbench to access the existing Qualify Lead button and add on a custom command that will fire alongside the default one:

As this post has hopefully demonstrated, overcoming challenges within CRM/D365CE can often result in different – but no less effective – approaches to achieve your desired outcome. Let me know in the comments below if you have found any other ways of modifying the default Lead Qualification process within the application.

Monday may not have been my day of choice for attending an all-day session on the General Data Protection Regulation (GDPR), but it was something that I walked away from feeling more well-informed on:

If you currently work within the IT industry, then I would be very surprised if you have not yet come across GDPR or are not already in the process of assessing what your organisation needs to do to prepare for it. In a nutshell, GDPR replaces existing data protection legislation within EU countries on May 25th 2018 (for the UK, this will be the Data Protection Act 1998). GDPR brings data protection guidelines firmly into the 21st century and provides a framework for organisations to apply the appropriate steps to protect individuals’ data. Whilst much within the updated guidelines remains unchanged, there is additional emphasis towards organisations implementing the appropriate levels of security (both physical and technical), applying regular auditing processes and documenting processes to protect against a possible data breach. For an IT professional, one of the overriding questions you should be starting to ask yourself is “What can I do to make the systems I support/implement compatible with GDPR?”

Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E) is one system that is likely to be in place within businesses/organisations across the EU, and one that is arguably best placed to help meet the challenges that GDPR brings to the table. The wide breadth of functionality within the application can be picked up and adapted to suit the following requirements:

  • Provide backend database encryption, to protect your key customer data in the event of a data breach.
  • Ensure that highly sensitive data categories are only accessed by relevant personnel within your organisation.
  • Enable you to implement a clear and comprehensive security model within your system, which can then be clearly documented.
  • Help you to implement a data retention policy that is in line with contractual and statutory requirements.
  • Allow you to quickly and effectively respond to subject access requests, via the use of easy to generate document templates.

All of the above can be achieved using out of the box functionality within the application and, in some cases, more straightforwardly than you may assume.

As part of this and the next couple of weeks’ blog posts, I will take a look at each of the bullet points above, step by step. The aim of this is to highlight the specific elements within GDPR that each potential situation covers, how to go about implementing a solution within CRM/D365E to address each one and to provide other thoughts/considerations to better prepare yourself for GDPR. By doing so, I hope to make you aware of functionality within the application that hitherto you may never have looked at before and to explore specific use cases that provide a wider business relevance.

All posts in the series will make frequent reference to the text contained within Regulation (EU) 2016/679, available online as part of the Official Journal of the European Union – a particularly onerous and long-winded document. If you are based in the UK, you may find solace instead by reading through the ICO’s rather excellent Overview of the General Data Protection Regulation (GDPR) pages, where further clarification on key aspects of the regulation can be garnered.

Without further ado, let’s jump into the focus for this week’s post: Understanding and effectively utilising Transparent Database Encryption (TDE) within your CRM/D365E deployment.

One area within GDPR that has changed significantly is data breaches and penalties for organisations that have demonstrated a clear dereliction of their responsibilities. When assessing whether a fine is issued by your country’s appropriate authority, which could number in the millions of pounds or more, a determination is made as to whether the company has implemented sufficient technical controls to mitigate the potential impact of a data breach. Article 32 sets this out in broad terms:

Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the [Data] controller and the [Data] processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:
(a) the pseudonymisation and encryption of personal data;
(b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
(c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
(d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

It is worth noting that an assessment will be made of your business’s size, turnover etc. when a judgement is made on what “appropriate” steps your organisation has taken to mitigate their risk in this regard. Smaller businesses can, therefore, breathe a sigh of relief in not having to implement large-scale and costly technical solutions within their businesses. Speaking more generally though, the importance of encryption within your organisation’s database and application systems becomes a primary concern in demonstrating GDPR compliance. It could also help you when it comes to determining whether you need to report a Data Breach, as an encrypted piece of hardware does not necessarily expose personal data; arguably meaning that no data breach has occurred.

CRM/D365E gives us the option to utilise a well-established feature within SQL Server to implement encryption for our data – Transparent Database Encryption (or TDE). Even better, it’s enabled by default. That being said, it is prudent for you to take a copy of the default encryption key or change it entirely if you haven’t done so already.

Doing either of the above is relatively straightforward. Navigate to Settings -> Data Management within the application and then click on the Data Encryption icon:

The Data Encryption pop-up window will appear, as indicated below:

From here you have two options at your disposal:

  • Use the Show Encryption Key option to copy and paste the key to your location of choice. Note that, as outlined by Microsoft, the key may contain Unicode characters, leading to a potential loss of data when using applications such as Notepad.
  • Generate a new key that meets the requirements set out above and then click on Change.

In both cases, ensure that the encryption key is stored securely and segregated as far away as possible from your CRM/D365E deployment. Keep in mind as well that there are specific privileges that control if a user can access the above or even modify the encryption key in the first place. These privileges can be found on the Core Records tab within a Security Role page:
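For completeness, the encryption key can also be retrieved programmatically via the SDK – useful if you want to automate taking a regular, securely stored copy of it. The below is a minimal sketch, assuming an already authenticated IOrganizationService whose user holds the Manage Data Encryption key – Read privilege (the class/method names are illustrative):

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class DataEncryptionKeyExample
{
    //"service" is assumed to be an authenticated IOrganizationService.
    public static string RetrieveEncryptionKey(IOrganizationService service)
    {
        var request = new RetrieveDataEncryptionKeyRequest();
        var response = (RetrieveDataEncryptionKeyResponse)service.Execute(request);

        //Store this value securely (e.g. in a password protected/encrypted file on a
        //separate server), as discussed above - never alongside the CRM/D365E databases.
        return response.EncryptionKey;
    }
}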

It may be tempting, knowing that encryption is enabled by default, to put your feet up and not worry about it. Here’s why it’s important to securely hold/segregate your database encryption key and also to think carefully about which users in your organisation have full Administrative privileges on the application:

Let’s assume the following scenario: your on-premise CRM 2016 organisation has database encryption enabled and SQL Server is installed on the same machine, along with all database files. The database encryption key is saved within a .txt file on the same computer.

A rogue member of staff with full Administrative privileges on CRM or an attacker manages to gain access to this server, in the process taking your CRM organisation’s .mdf database file. They also manage to either take a copy of the .txt file containing the encryption key or obtain the currently configured encryption key by accessing your CRM instance. This person now has the ability to both mount and access the database file without issue. Under GDPR, this would constitute a data breach, requiring your business to do the following as immediate steps:

  • Notify the supervisory body within your country within 72 hours of the breach occurring (Article 33)
  • Notify every person whose personal data was stored in the database that a breach has occurred (Article 34)
  • Record the nature of the breach, the actual effect caused by it and all remedial steps taken to prevent the occurrence of a breach again in the future. All of this may be required by the supervisory body at any time (Article 35)

The fun does not stop there: depending on what processes your business had in place and given the specific nature of the scenario, a fine may be more than likely. This is due to the clear steps that could have been taken to prevent the database from being so easily accessible. Having to explain this in front of senior executives of a business is not a prospect that any of us would particularly relish and could have been avoided had the following steps been implemented:

  • The rogue member of staff had been given a much more restrictive security role that did not grant the Manage Data Encryption key – Read privilege.
  • The SQL Server instance had been installed on a different server.
  • The database encryption key had been saved on a different server.
  • The database encryption key had been saved in a password protected/encrypted file.

This list is by no means exhaustive, and there is ultimately no silver bullet when it comes to situations like this; however, you can manage your risk much more effectively and demonstrate to authorities like the ICO that you have acted reasonably by implementing some of the measures highlighted above.

In next week’s post, we will take a look at the importance of Field Security Profiles and how they can be utilised to satisfy several of the key requirements of GDPR in a pinch!

The ability to incorporate document management functionality within Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E) is one of the ways that the application integrates neatly with other products in the Microsoft “stack”, giving you the ability to drive further benefit from your existing CRM/D365E deployment. Documents that specifically concern a particular record can be stored within SharePoint and accessed via a few clicks within the application, allowing for quicker collaboration and visibility behind a specific contact, business or sales opportunity. By leveraging the full functionality of SharePoint alongside this, businesses can begin to take advantage of features such as document history, check in/check out capability and the ability to access SharePoint content via the OneDrive desktop client, negating the need to work solely within a web browser to access your documents.

When you are first getting to grips with Document Management, you may come across an oddity with no apparent way of resolving it: GUID values are appended onto the names of your SharePoint folders:

There is an understandable reason why this is done (which will be discussed later on in this post), but the folder names can appear jarring and nonsensical to end users of the application. Fortunately, there is a way in which you can change the global setting for this within the application, but it requires making some modifications to your CRM/D365E Organisation’s global settings – one of those changes that is simple to do if you know how, but more complex if you don’t 🙂

CRM’s/D365E’s Organisation settings are not exposed within an accessible format natively within the application. To make changes to these settings, the best tool available for both Online and On-Premise versions of the application is the Organization Settings Editor. I have previously discussed how to install and use this tool on the blog, and it is a really straightforward way of updating some of CRM’s/D365E’s more obscure settings to suit your preferences. The setting that controls all of this is the aptly named CreateSPFoldersUsingNameandGuid, which needs to be set to false. Once this is configured, all newly created SharePoint folders will no longer have the GUID appended to them.
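If you would rather script this change than use the Organization Settings Editor, the same setting can also be flipped with a short piece of SDK code. The below is a sketch only – it assumes an authenticated IOrganizationService and that the createspfoldersusingnameandguid attribute is present on the organization entity in your version of the application:

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static class OrganisationSettingsExample
{
    //"service" is assumed to be an authenticated IOrganizationService.
    public static void DisableGuidFolderNames(IOrganizationService service)
    {
        //WhoAmI returns the OrganizationId of the current organisation.
        var whoAmI = (WhoAmIResponse)service.Execute(new WhoAmIRequest());

        Entity organisation = new Entity("organization") { Id = whoAmI.OrganizationId };
        organisation["createspfoldersusingnameandguid"] = false;

        service.Update(organisation);
    }
}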

How to Fix Existing Folders

It may be the case that you have lots of existing SharePoint folders that are configured in the default manner, but you want to look at “tidying” them up. Simply renaming the SharePoint folder will not work, as CRM/D365E stores “pointers” to each individual folder setup for Document Management, containing a partial URL link; a link that will become broken if anything is modified within SharePoint. To correctly fix your folders and not break anything, you will need to do the following:

  1. Locate the Document Location record within CRM/D365E and modify the Relative URL value to remove the GUID value (e.g. the Document Location record for Test Co5_B3D5C8DFDB77E51181023863BB357C38 should be updated to have a Relative URL value of Test Co5). You may receive a warning that the URL does not exist, which can be safely disregarded.
  2. Rename the folder in SharePoint to match the updated Relative URL value from above.

This could potentially be a laborious task, depending on the number of Document Locations involved. To ease you in your journey, I would recommend digging out the CRM/D365E SDK & the .NET Client API for SharePoint to iteratively update all Document Locations, and likewise all folders within SharePoint Online, based on a .csv/spreadsheet input.
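As a starting point for the CRM/D365E side of that exercise, the sketch below retrieves every SharePoint Document Location record and strips a trailing GUID from its Relative URL. It is illustrative only – the corresponding folder renames in SharePoint (via the .NET Client API) still need to be handled separately, paging is not catered for and you should test against a copy of your data first:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using System.Text.RegularExpressions;

public static class DocumentLocationCleanupExample
{
    //"service" is assumed to be an authenticated IOrganizationService.
    public static void RemoveGuidSuffixes(IOrganizationService service)
    {
        //Retrieve all Document Location records, along with their Relative URL values.
        var query = new QueryExpression("sharepointdocumentlocation")
        {
            ColumnSet = new ColumnSet("relativeurl")
        };

        EntityCollection locations = service.RetrieveMultiple(query);

        foreach (Entity location in locations.Entities)
        {
            string relativeUrl = location.GetAttributeValue<string>("relativeurl");

            if (string.IsNullOrEmpty(relativeUrl))
                continue;

            //Folders created by default are suffixed with "_" plus a 32 character GUID,
            //e.g. "Test Co5_B3D5C8DFDB77E51181023863BB357C38".
            string cleaned = Regex.Replace(relativeUrl, "_[0-9A-Fa-f]{32}$", string.Empty);

            if (cleaned != relativeUrl)
            {
                Entity update = new Entity("sharepointdocumentlocation") { Id = location.Id };
                update["relativeurl"] = cleaned;
                service.Update(update);
            }
        }
    }
}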

Having the ability to make your SharePoint folders look more visually appealing is nice, but keep in mind the following…

By storing a GUID within each folder name, the application can always ensure that the correct folder can be linked back to a CRM record. By implication, this also facilitates situations where duplicate record names are allowed within your environment (for example, you could have two Account records both called Test Company Ltd). By overriding the above setting within the application, you lose the ability to support duplicate folder names; instead, the application will resolve back to any existing folder that matches the Name of the record, essentially becoming a document repository for 2 records as opposed to 1.

An example can better demonstrate this. Below is a test Account record that has been setup for Document Management – with a few test documents saved within and a Document folder successfully configured in SharePoint called My Test Account:

When we then go and create a brand new Account record with the same name – My Test Account – and go to configure Document Management, we are asked to confirm the creation of a Folder with the same name as the one already setup:

Once confirmed, instead of seeing a blank folder, all of the test documents set up on the similarly named record are visible – because the application is pointing to the folder above:

It is highly unlikely that you would allow duplicate record names in the first place within your CRM/D365E instance (particularly for entity types such as Account), so this may be a concern that requires little notice. However, be sure to perform a full analysis of all the record types that you are hoping to use with Document Management, as there is no way to fine-tune the global setting to accommodate specific entities – it is all or nothing.

Conclusions or Wot I Think

Keeping things looking tidy and non-technical are endeavours that need to drive any successful software deployment. End users who are exposed to strange-looking, “techie” stuff as part of working with a system day in, day out may soon start to lose faith in it, hampering user adoption and faith that it is fit for purpose. Removing GUIDs from a SharePoint folder can obviously present some serious technical problems, depending on the nature of your deployment; but it is arguably something that you should consider if you want to ensure that end users feel that their SharePoint folders are “correctly” named (i.e. do not contain superfluous information). As with anything in IT, proper testing must be carried out for all possible scenarios before rolling out a feature change like this and, assuming your business ticks all of the boxes for safely removing GUIDs from your SharePoint folders, it is definitely something to be considered seriously.

When working with applications day in, day out, you sometimes overlook something that is sitting there, staring you in the face. It may be an important feature or an inconsequential piece of functionality, but you never really take the time to fully understand either way just what it is and whether it can offer any distinct benefits or assistance. I realised a great example of this when recently deploying some new Plug-ins into Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E). When you are setting up a new Step for your Plug-in, you are given the option of specifying an Unsecure Configuration and Secure Configuration via a multi-line text box to the right of the window:


I was curious about just what these are and why it is not something that you ever really come across when you are first learning about Plug-in development with the application. I took a closer look at what these text boxes do and, as part of this week’s blog post, I wanted to share my findings and provide a demonstration of how they work in practice.

The Theoretical Bit: Unsecure/Secure Configuration Overview

Typically, when we want to get some juicy information relating to a piece of CRM/D365E functionality, we would turn to our good friends TechNet or MSDN. In this instance, however, there is no dedicated page that covers this topic in-depth. We must instead navigate to the Write a Plug-in Constructor page to find dedicated information about how these work:

The Microsoft Dynamics 365 platform supports an optional plug-in constructor that accepts either one or two string parameters. If you write a constructor like this, you can pass any strings of information to the plug-in at run time.

These “one or two” parameters are the multi-line text boxes indicated above. Information is exposed as string objects within your C# code, and you enable this feature by specifying the following SDK-adapted constructor within your Plug-in class:

public MyPlugin(string unsecureString, string secureString)
    {
        if (String.IsNullOrWhiteSpace(unsecureString) ||
            String.IsNullOrWhiteSpace(secureString))
            {
                throw new InvalidPluginExecutionException("Unsecure and secure strings are required for this plugin to execute.");
            }

            _unsecureString = unsecureString;
            _secureString = secureString;
    }

As with anything, there are a number of important caveats to bear in mind with this feature. These can be gleaned via additional online sources:

In terms of use cases, the above articles highlight some potential scenarios that they are best utilised within. Perhaps the best example is for an ISV solution that requires integration with external web services to retrieve data that is then consumed by CRM/D365E. Credentials for these web services can be stored securely when the Plug-in is deployed via the use of Secure configuration parameters. Other than that, if you are developing a Plug-in for internal use, that is unlikely to be deployed/managed across multiple environments, then it is probably not worthwhile to look at utilising configuration parameters when you can just as easily specify these within your code.
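To make the ISV scenario above a little more concrete, the Secure Configuration value is typically populated with a small piece of XML that the plug-in then parses within its constructor. The below is a rough sketch only – the <credentials> layout and the ExternalServiceSettings class are entirely made up for illustration, so adjust the element names to whatever structure your solution defines:

using System;
using System.Xml.Linq;

public class ExternalServiceSettings
{
    public string Url { get; private set; }
    public string Username { get; private set; }
    public string Password { get; private set; }

    //Parses a secure configuration string shaped like:
    //<credentials><url>...</url><username>...</username><password>...</password></credentials>
    //(a hypothetical layout - adjust to suit your own solution)
    public static ExternalServiceSettings Parse(string secureConfiguration)
    {
        if (String.IsNullOrWhiteSpace(secureConfiguration))
            throw new ArgumentException("A secure configuration value is required.", "secureConfiguration");

        XElement credentials = XElement.Parse(secureConfiguration);

        return new ExternalServiceSettings
        {
            Url = (string)credentials.Element("url"),
            Username = (string)credentials.Element("username"),
            Password = (string)credentials.Element("password")
        };
    }
}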

Practice Makes (for) Perfect (Understanding)!

The best way to see how something works is by getting hands-on and seeing how it works in action. Let’s assume you wish to deploy a plugin that executes whenever a record is opened/viewed by any user across the platform. The plugin should update the First Name (firstname) and Last Name (lastname) fields to match the value(s) in the Unsecure and Secure Configuration properties accordingly. The below plugin code will achieve these requirements:

using System;
using Microsoft.Xrm.Sdk;

namespace D365.BlogDemoAssets.Plugins
{
    public class PostContactRetrieve_PluginConfigurationTest : IPlugin
    {
        private readonly string _unsecureString;
        private readonly string _secureString;
        public PostContactRetrieve_PluginConfigurationTest(string unsecureString, string secureString)
        {
            if (String.IsNullOrWhiteSpace(unsecureString) ||
                String.IsNullOrWhiteSpace(secureString))
            {
                throw new InvalidPluginExecutionException("Unsecure and secure strings are required for this plugin to execute.");
            }

            _unsecureString = unsecureString;
            _secureString = secureString;
        }
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is EntityReference)
            {
                Entity contact = new Entity("contact", ((EntityReference)context.InputParameters["Target"]).Id);
                
                contact["firstname"] = _unsecureString;
                contact["lastname"] = _secureString;
                service.Update(contact);

            }
        }
    }
}

When deploying the plugin using the Plugin Registration Tool, we specify the step to execute on the Retrieve message and to execute in the Pre-Operation Stage (otherwise the form will need to be refreshed to see the updated values!). We also need to specify our desired values for the First Name and Last Name fields in the appropriate Configuration fields. The Register New Step window should look similar to the below if configured correctly:

When we navigate into the Jim Glynn (sample) Contact record within CRM/D365E, we can see that the Plug-in has triggered successfully and updated the fields to match the values specified on the Step above:

We can also confirm that the appropriate error is thrown when one of the configuration properties is missing a value, by modifying our Plug-in step and attempting to reload our sample Contact record:

Can you spot what’s missing? 🙂

By clicking Download Log File, we can view the error message specified as part of the InvalidPluginExecutionException call. Below is a modified excerpt of the ErrorDetails XML that is generated:

  <InnerFault>
    <ActivityId>ed4a2021-9c87-4f06-a493-6d804676bf96</ActivityId>
    <ErrorCode>-2147220891</ErrorCode>
    <ErrorDetails xmlns:d3p1="http://schemas.datacontract.org/2004/07/System.Collections.Generic" />
    <Message>Unsecure and secure strings are required for this plugin to execute.</Message>
    <ExceptionSource i:nil="true" />
    <InnerFault i:nil="true" />
    <OriginalException i:nil="true" />
    <TraceText i:nil="true" />
  </InnerFault>

Conclusions or Wot I Think

It is impossible to become what I would like to term a “pub quiz champion” in CRM/D365E; what I mean by this is that I would defy anyone to rattle off every little detail and fact about the entire platform. As with any pub quiz, those that do would more than likely end up cheating by having their phone out. With this metaphor in mind, I think Plug-in configuration properties would be an excellent topic for a quiz of this nature. As mentioned previously, it is not something that I was ever made aware of when starting to learn about Plug-in development and is not a feature touted regularly within the online community. Perhaps this is because of its very specific and limited application – although it is handy to have at our disposal, I think its usage is really only targeted towards those who are developing solutions that are deployed across multiple environments AND need to store configuration properties for external URLs/web services in a compact and secure manner. Therefore, if you are currently having to use a custom entity within the application to store this type of information, it would make sense to reduce the footprint of your solution within the application itself and make the appropriate changes to use Secure configuration parameters instead. Using a bit of ingenuity (such as XML configuration parameters), you can achieve the same requirements without the need to customise the application unnecessarily.

Generally, when you are looking at adopting Dynamics CRM/Dynamics 365 for Enterprise (D365E) within your business, you can be reasonably satisfied that the majority of what is already configured within the system can be very quickly adapted to suit your business needs. Whether it’s the Lead to Opportunity sales process or the entire Case management module, the functionality at your disposal is suitable for many organisations across the globe. The great thing as well is that, should you wish to fine-tune things further, you have a broad range of options at your disposal that can help you achieve your objectives – sometimes in very specific and highly unique ways. I have previously looked at a good example of this on the blog – namely, how to override the system’s built-in pricing engine in favour of your own – and, assuming you have a good understanding of C# and how to deploy plugins to the application, you can spin an important aspect of the system’s functionality on its head to match how your business operates.

Having this ability is, undoubtedly, a real boon, but it can present some odd behaviours. For example, you may start to notice that suddenly the Extended Amount field is no longer being populated with data after implementing your custom pricing engine. The example pictures below demonstrate a before and after example of adding a Product line item to the Quote entity, using the exact same sample Product:

Before…

…and after.

The odd thing about this is that, as soon as you click into the record, you will suddenly see a value appear in this field. Very strange!

It is difficult to pinpoint exactly what is causing the problem, but I can take a "stab in the dark". CRM/D365E uses the CalculatePrice message to determine the points at which either a) the default pricing engine or b) a custom one is triggered to perform all necessary calculations. Although there is no official documentation to back this up, I suspect that this message is only triggered when you Update or Retrieve an existing Product line item record (regardless of whether it is an Opportunity Product, Quote Product etc.). This is borne out by the fact that, as soon as we click into our Product record, the Extended Amount field is suddenly populated – the platform triggers the Retrieve message as a result of you opening the record and then, as a next step, forces the CalculatePrice message to also fire. The important thing to clarify with this point is that you must have a custom pricing engine implemented successfully within the application for this to work. Otherwise, don't be too surprised if the Extended Amount value remains at 0.
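For reference, the skeleton of a custom pricing engine registered against the CalculatePrice message tends to follow the pattern below. This is a heavily simplified sketch, based on the pattern used by the SDK's sample pricing plug-in: it assumes the message supplies a Target entity reference for the parent record (a Quote in this case), simply multiplies Price Per Unit by Quantity for each line item and also tolerates being registered on the Create message of quotedetail, as per the workaround that follows. A real implementation would need to cover all four sales entities, discounts, taxes and totals:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class CalculateQuotePriceExample : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            if (!context.InputParameters.Contains("Target"))
                return;

            EntityReference quoteRef = null;
            object target = context.InputParameters["Target"];

            if (target is EntityReference)
            {
                //CalculatePrice passes a reference to the parent record (e.g. the Quote).
                quoteRef = (EntityReference)target;
            }
            else if (target is Entity)
            {
                //The additional Create step on quotedetail (discussed below) passes the new
                //line item itself, so resolve its parent Quote instead.
                quoteRef = ((Entity)target).GetAttributeValue<EntityReference>("quoteid");
            }

            if (quoteRef == null || quoteRef.LogicalName != "quote")
                return;

            //Retrieve every Quote Product attached to the Quote and recalculate its Extended Amount.
            var query = new QueryExpression("quotedetail")
            {
                ColumnSet = new ColumnSet("quantity", "priceperunit")
            };
            query.Criteria.AddCondition("quoteid", ConditionOperator.Equal, quoteRef.Id);

            foreach (Entity line in service.RetrieveMultiple(query).Entities)
            {
                decimal quantity = line.GetAttributeValue<decimal>("quantity");
                Money pricePerUnit = line.GetAttributeValue<Money>("priceperunit") ?? new Money(0);

                Entity update = new Entity("quotedetail") { Id = line.Id };
                update["extendedamount"] = new Money(quantity * pricePerUnit.Value);
                service.Update(update);
            }
        }
    }
}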

Whilst the workaround for this is somewhat tolerable if you are working with a small subset of records and do not rely on the Extended Amount as part of any existing reporting within the application, this could really start to cause problems for your end users in the long term and give an impression that the application does not “work” as it should do. Fortunately, there is a solution that we can look at implementing that will hopefully lead to some happy fingers from not needing to click into records anymore 🙂 Be sure to have the CRM/D365E SDK handy before you begin the below!

  1. Open up the Plugin Registration Tool from within the SDK, and log into your CRM/D365E instance.
  2. Scroll down to your Assembly and Plugin that contains your custom pricing engine. If already configured correctly, it should have a step configured for the CalculatePrice message on any entity, as a Synchronous, Post-Operation step.
  3. Right click your plugin and click on Register New Step to open the window that lets you specify the required settings for your step. Populate the form as follows:
    • Message: Create
    • Primary Entity: Select one of the Product line item entities that your custom pricing engine uses. The list of accepted entities is invoicedetail, opportunityproduct, salesorderdetail or quotedetail.
    • Event Pipeline Stage of Execution: Post-Operation
    • Execution Mode: Synchronous

All other settings can be left as default. Your window should look similar to the below if configured correctly for the quotedetail entity:

  4. Click on Register New Step to add the step to the application.
  5. Repeat steps 3-4 for any additional Product line item entities that are used with your custom pricing engine.

Now, when you go back into CRM/D365E, the Extended Amount values will start to be populated automatically as soon as you add a new Product onto the Product line item subgrid.

Conclusions or Wot I Think

Whilst the ability to override an important piece of CRM’s/D365E’s functionality is welcome, you do need to bear in mind the additional overhead and responsibility this leaves your organisation in ensuring that your custom pricing engine is correct and that you have adequately tested the solution to properly identify actions which are out of the ordinary, such as the one discussed in this post. What is slightly frustrating about this quirk, in particular, is the lack of clear documentation regarding the CalculatePrice message from Microsoft. Granted, the message is only exposed for minimal interaction from an SDK point of view and is, for all intents and purposes, an internal application message that we shouldn’t really mess with or care about. Having said this, even just a brief summary of when the message is triggered on the platform would have made it instantly more understandable why any custom pricing calculation engine will fail to provide you with an instant amount within your Extended Amount field. In the end, however, I am pleased that there is a straightforward workaround that can be put into place to ensure that things work as expected; hopefully to the extent that it becomes virtually impossible to determine easily whether your organisation is using the default or a custom pricing engine in the first place.