Back in the days of Dynamics CRM 2016 and earlier, one of the major benefits of opting for an Online subscription over an On-Premise license was the Dual-Usage rights granted to your organisation. This meant that, so long as your On-Premise Server installation was licensed, your individual Online User CALs would also be licensed like for like for CRM On-Premise. So, if you had 5 CRM Online Professional licenses on Office 365, you were also covered for the equivalent 5 On-Premise Professional user CALs.

The Dynamics 365 for Enterprise (D365E) release took this offer a step further by also including the coveted Server user license as part of this offer. The official licensing guide for D365E elaborates on this further:

One of the advantages of Microsoft Dynamics 365 is the option to deploy either in Microsoft’s cloud or in a private on-premises or partner-hosted cloud. In some cases, customers may want to deploy both modes simultaneously, for migrating a Microsoft Dynamics on-premises deployment to Microsoft Dynamics 365, running private Dev/Test deployments in Microsoft Azure. With Dual Use Rights, Microsoft Dynamics users licensed with the required User SL do not need to acquire CALs to access Server instances. Users or devices licensed with Dynamics 365 SLs have use rights equivalent to a CAL for the purpose of accessing equivalent on-premise workloads. With Microsoft Dynamics 365 the server license is included with the SLs.

Thanks to this change, organisations can look to fully realise their dual-usage ambitions with D365E, without having to pay a single additional penny – nice! 🙂

When I first learned about the above, my first question was “Great! How do I get my license key for D365E on-premise?”. Unfortunately, the documentation available online is not 100% clear on which route you need to take to obtain your license key. Attempting to get an answer directly from Microsoft Online support can sometimes lead to you being redirected to the above licensing guide, which sets out neither how the offer works nor a step-by-step account of how to claim your license key. I was recently involved in obtaining a dual-usage license key for an organisation, so I thought I would share my experiences as part of this week’s blog post and provide a straightforward guide for others who may find themselves in the same boat.

Before you Begin…

The route you have to take will be dictated largely by how your organisation obtained its Online D365E licenses. It could be that your business has:

  • Ordered your licenses directly on the Office 365 portal.
  • Purchased your online subscription through a Microsoft Partner.
  • Obtained a redemption key for your subscription via a Volume License agreement or similar with Microsoft.

So before you set out on your dual-usage license key quest, verify how your organisation originally obtained your D365E licenses. Assuming you have this information to hand, please see below for a summary that covers each license purchase route:

If you purchased your licenses via a Cloud Solutions Provider agreement (i.e. directly with a Microsoft partner)…

Then your license key should be viewable within your CustomerSource profile page for your organisation, underneath the Product And Service Summary Section.

If you purchased your licenses via a Microsoft Products and Services Agreement…

Your license key should be viewable within your Microsoft Business Centre page.

If you purchased your licenses via an Enterprise/Volume License Agreement…

Log into the Volume Licensing Service Centre and, underneath your list of Products, you should see your product and corresponding license key.

If you purchased your licenses directly via Office 365 and have a partner of record for your subscription…

You should reach out to them directly and they can then log a support request with Microsoft’s Pre-Sales team. The turnaround for this will generally be a couple of days, at the end of which you should be emailed your license key.

If you purchased your licenses directly via Office 365 and DO NOT have a partner of record for your subscription…

Then I believe you will need to log a support request directly with Microsoft Online support to obtain the license key information. I am unable to confirm whether this will work successfully or not, so I would be interested in hearing from anyone in the comments below if this works.

Getting D365E On-Premise Installed

This is essentially a two-step process. Given that the D365E release was not a major application release, there is no dedicated installer available for the product. Instead, you will need to install Dynamics CRM Server 2016 and then download and install the December 2016 update to bring the application version up to 8.2. All of the usual prerequisites for a CRM On-Premise install will apply – you will need a SQL Server instance deployed, an Active Directory domain to authenticate against, and a compatible version of Windows Server. The full list of requirements can be viewed on TechNet.

Conclusions or Wot I Think

The expansion of the Dual Usage offering as part of D365E is a welcome and highly beneficial development. Assuming your organisation already has existing infrastructure in place that supports an On-Premise deployment, you can begin to mitigate the extra costs of additional sandbox instances on Office 365 by quickly setting up local D365E organisations on the On-Premise version of the application. I think there is definitely some work to be done around the process of obtaining the license key in the first instance – for example, a button/link in the Dynamics 365 Administration Centre that lets you view your On-Premise license key or fill out a request form. But the very fact that organisations are able to take advantage of this at all is one of the reasons why D365E is starting to move ahead of the competition within the ERP/CRM application space.

When you have spent any length of time working with Dynamics CRM Online/Dynamics 365 for Enterprise (D365E) data programmatically, you become accustomed to how Option Set, State and Status Reason values are presented to you in code. To explain, the application does not store your Option Set display names within the SQL Server Entity tables; rather, the Option Set Value that has been specified alongside your Label is what is stored, as an integer value. That is why you are always required to provide both values within the application:

The following benefits are realised as a result of how this is set up:

That being said, when working with these field types in code, you always need to have the application window open, or a list of all Labels/Values to hand, so that you don’t get too confused… 🙂

I have previously extolled the virtues of the Data Export Service on the blog, and explained why you should consider it if you have basic integration requirements for your CRM/D365E deployment. One area in which it differs from other products on the market is how it handles the field types discussed above. For example, when exporting data to a SQL database via Scribe Online, new columns are created alongside each Option Set, Status and Status Reason field that contain the corresponding “Display Name” (i.e. Label value). So by running the following query against a Scribe export database:

SELECT DISTINCT statecode, statecode_displayname
FROM dbo.account

We get the best of both worlds – our underlying statecode value and their display names – all in 2 lines of code:

This is a big help, particularly when you are then using the data as part of a report, as no additional transformation steps are required and your underlying SQL query can be kept as compact as possible.

The Data Export Service differs from the above in an understandable way, as display name values for Status, Status Reason and Option Set column values are instead segregated out into their own separate table objects in your Azure SQL database:

OptionSetMetadata

GlobalOptionSetMetadata

StateMetadata

StatusMetadata

Why understandable? If you consider how the application can support multiple languages, then you realise that this can also apply to metadata objects across the application – such as field names, view names and – wouldn’t you have guessed it – Labels too. So when we inspect the OptionSetMetadata table, we can see that the table structure accommodates the storing of labels in multiple languages via the LocalizedLabelLanguageCode field:
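For example, assuming the column names described above, a query along the following lines would return every label stored for the account entity’s industrycode field (a standard Option Set field, used here purely for illustration), one row per provisioned language:

```sql
--Return all stored labels for the account industrycode field, one row per language
SELECT [Option], LocalizedLabel, LocalizedLabelLanguageCode
FROM dbo.OptionSetMetadata
WHERE EntityName = 'account'
 AND OptionSetName = 'industrycode'
ORDER BY [Option], LocalizedLabelLanguageCode;
```

An organisation with English (1033) and French (1036) enabled would see two rows per Option value, one for each language code.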

Unlike the Scribe Online route above (which I assume only retrieves the Labels that correspond to the user account that authenticates with CRM/D365E), the Data Export Service becomes instantly more desirable if you are required to build multi-language reports referencing CRM/D365E application data.

The issue that you have to reconcile yourself with is that your SQL queries, if being expressed as natively as possible, instantly become a whole lot more complex. For example, to achieve the same results as the query above, it would have to be adapted as follows for the Data Export Service:

SELECT DISTINCT statecode, LocalizedLabel
FROM dbo.account
 LEFT JOIN dbo.StateMetadata
  ON 'account' = EntityName
  AND statecode = [State]
  AND '1033' = LocalizedLabelLanguageCode

The above is a very basic example, but if your query is complex – and involves multiple Option Set fields – then you may have to resort to Common Table Expressions (CTEs) to accommodate each additional JOIN required to get the information you want.
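As a rough sketch of how quickly this escalates, retrieving just the State and Status Reason labels for account records already requires two separate joins against the metadata tables:

```sql
--Illustrative only: returning both the State and Status Reason labels for account records
SELECT DISTINCT a.statecode, st.LocalizedLabel AS statecode_displayname,
 a.statuscode, ss.LocalizedLabel AS statuscode_displayname
FROM dbo.account AS a
 LEFT JOIN dbo.StateMetadata AS st
  ON st.EntityName = 'account'
  AND st.[State] = a.statecode
  AND st.LocalizedLabelLanguageCode = 1033
 LEFT JOIN dbo.StatusMetadata AS ss
  ON ss.EntityName = 'account'
  AND ss.[Status] = a.statuscode
  AND ss.LocalizedLabelLanguageCode = 1033;
```

Each additional Option Set field on the query adds another join of this shape, which is where the maintenance burden starts to bite.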

In these moments, we can look at some of the wider functionality provided as part of SQL Server to develop a solution that keeps things as simple as possible and, in this particular instance, a user-defined function is an excellent candidate. User-defined functions enable you to perform complex operations against the database platform and encapsulate them within very simply expressed objects that can also accept parameters. A further benefit is that they can return either table objects or scalar (i.e. single) values.

Using a scalar function, we can, therefore, remove some of the complexity behind returning Option Set, Status and Status Reason labels by creating a function that returns the correct label, based on input parameters received by the function. You could look at creating a “master” function that, based on the input parameters, queries the correct Metadata table for the information you need; but in this example, we are going to look at creating a function for each type of field – Status, Status Reason, Option Set and Global Option Set.
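For illustration only, such a “master” function could be sketched along the lines below. The function name and the @FieldType parameter are hypothetical; the table and column names mirror those used elsewhere in this post:

```sql
--Hypothetical "master" function: dispatches to the correct Metadata table based on @FieldType
CREATE FUNCTION [dbo].[fnGetLabel]
(
	@FieldType NVARCHAR(20), --'State', 'Status', 'OptionSet' or 'GlobalOptionSet'
	@EntityOrOptionSetName NVARCHAR(64), --Entity logical name, or the Global Option Set name
	@OptionSetName NVARCHAR(64), --Only used for the 'OptionSet' field type; pass NULL otherwise
	@Value INT, --The option/state/status value to look up
	@LanguageCode INT --The language of the label to retrieve, e.g. 1033 for English
)
RETURNS NVARCHAR(256)
AS
BEGIN
	DECLARE @Label NVARCHAR(256);
	IF @FieldType = 'State'
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StateMetadata WHERE EntityName = @EntityOrOptionSetName AND [State] = @Value AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE IF @FieldType = 'Status'
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StatusMetadata WHERE EntityName = @EntityOrOptionSetName AND [Status] = @Value AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE IF @FieldType = 'OptionSet'
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.OptionSetMetadata WHERE EntityName = @EntityOrOptionSetName AND OptionSetName = @OptionSetName AND [Option] = @Value AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE IF @FieldType = 'GlobalOptionSet'
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @EntityOrOptionSetName AND [Option] = @Value AND LocalizedLabelLanguageCode = @LanguageCode);
	RETURN @Label;
END
```

The trade-off is a wider parameter list and slightly less self-documenting call sites, which is why the per-type functions below are arguably the cleaner option.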

To do this, connect up to your Data Export Service database and open up a new query window, ensuring that the context is set to the correct database. Paste the following code in the window and then hit Execute:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

--Create Function to return Global Option Set Labels

CREATE FUNCTION [dbo].[fnGetGlobalOptionSetLabel]
(
	@GlobalOptionSetName NVARCHAR(64), --The logical name of the Global Option Set
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST a string to INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain label for Global Option Set field ' + @GlobalOptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Option Set Labels

CREATE FUNCTION [dbo].[fnGetOptionSetLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Option Set field
	@OptionSetName NVARCHAR(64), --The logical name of the Option Set field
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST a string to INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain label for Option Set field ' + @OptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Labels

CREATE FUNCTION [dbo].[fnGetStateLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status field
	@State INT, --The Status option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST a string to INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain State label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Reason Labels

CREATE FUNCTION [dbo].[fnGetStatusLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status Reason field
	@Status INT, --The Status Reason option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST a string to INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain Status label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

This will then go off and create the functions listed in code, which should then show up under the Programmability folder on your SQL database:

For those who are unsure what the SQL code is doing, it first attempts to determine whether exactly 1 Label can be found for your appropriate field type, based on the parameters provided. If it is successful, then that value is returned; otherwise, the CAST function is designed to force an error back to the caller, indicating that either zero or more than 1 Label was found. In most cases, this would indicate a typo in the parameters you have specified.

As with anything, the best way to see how something works is in practice! So if we again look at our previous examples shown in this post, we would utilise the dbo.fnGetStateLabel function as follows to return the correct label in English:

SELECT DISTINCT statecode, dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname
FROM dbo.account

With our results returning as follows:

Now we can expose this through our reports and not worry about having to do any kind of transformation/lookup table to get around the issue 🙂
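The same pattern extends naturally to the other functions. For example, the following query (using the standard account industrycode field purely for illustration) returns several labels in a single, compact statement:

```sql
--Return State, Status Reason and Industry labels for account records in one pass
SELECT name,
 dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname,
 dbo.fnGetStatusLabel('account', statuscode, 1033) AS statuscode_displayname,
 dbo.fnGetOptionSetLabel('account', 'industrycode', industrycode, 1033) AS industrycode_displayname
FROM dbo.account;
```

One thing to be mindful of is that scalar functions execute once per row, so on very large result sets the JOIN approach may still perform better; for reporting queries of a sensible size, the readability gain is usually worth it.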

Attempting to keep things as simple as possible by encapsulating complex functionality within simple, clearly expressed functions is an excellent way of keeping code streamlined, and also ensures that other colleagues can accomplish complex tasks, even if they do not have in-depth knowledge of Transact-SQL.

Although CRM Online/Dynamics 365 for Enterprise (D365E) provides a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot just connect straight up to the application’s SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data out into a format that best pleases you, or even take the plunge and get a developer to write a custom application that satisfies your integration requirements.

With the recent D365E release and in line with Microsoft’s longstanding approach to customer data within their applications (i.e. “It’s yours! So just do what you want with it!”), the parallel introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product’s arsenal. By using the service, relatively straightforward integration requirements can be satisfied quickly, and a lot of the headache involved in setting up a backup of your organisation’s databases/LOB reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using this tool is not going to break the bank either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free*, add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point, given that the cost difference per month can amount to hundreds of pounds.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) Cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you’re doing, you can get all of the above setup very quickly; in some cases, less than 10 minutes if you opt to script out the entire deployment via PowerShell.

For your setup with D365E, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign into an external provider, you may also need to verify that your Web Browser pop-up settings/firewall is configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync: To clarify, this is basically asking if you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options for this will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be due to the fact that certain entities’ dependent records have not yet been synced across (for example, any Opportunity record that references the Account named Test Company ABC Ltd. will not sync until this Account record has been exported successfully). So rather than attempting to interrogate the error logs straightaway, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved. My experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. I anticipate that the fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay between this appears to be very small indeed – often tens of minutes, if not minutes itself. This, therefore, makes the Data Export Service an excellent candidate for a backup/primary reporting database to satisfy any requirements that cannot be achieved via FetchXML alone.

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records failed to sync across successfully which, upon closer inspection, is not the case. Hopefully this is something that will get ironed out; it appears to be due to the rather strange way that the listmember entity behaves when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive replacement. That’s not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In these cases, the names mentioned above will no doubt be the best services to look at to achieve your requirements. I also have a few concerns that the setup involved as part of the Data Export Service could be a barrier towards its adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, the Export Profile Wizard actually going off and creating all the required resources in Azure by simply entering your Azure login credentials.

The D365E release has brought a lot of great new functionality and features to the table that have been oft requested and add real benefit to organisations who already use, or plan to use, the application. The Data Export Service is perhaps one of the great underdog features that D365E brings, and one that you should definitely consider using if you want a relatively smooth-sailing data export experience.

As I tweeted a couple of days ago, my head has been spinning with Dynamics 365 for Enterprise (D365E) recently 🙂 :

https://twitter.com/joejgriffin/status/850419515401924609

I took a detailed look at the upgrade process for CRM Online organisations last year, and thankfully the process has not changed much. Indeed, the whole upgrade seemed to complete a lot quicker – 40-50 minutes as opposed to over an hour – which was nice to see.

As part of any planned upgrade, you should always endeavour to perform a thorough test of your existing application customisations against an environment running the latest version – either with a spare sandbox instance on your subscription or by spinning up a 30-day trial of D365E. We were quite thorough in our testing, and fortunately the upgrade completed with only some minor issues left to deal with. For those who are contemplating an upgrade, or have one scheduled over the next few weeks/months, there are a few things to be aware of ahead of time to avoid potential problems with your new D365E deployment. With this in mind, here are 3 things from our upgrade process that you should bear in mind before you make the jump to 8.2:

Switch Off Learning Path in System Settings To Prevent Annoying Popups

Attempting to keep up with the number of new features that Dynamics 365 brings to the table is a colossal task. It is for this reason that I only recently became aware of the new Learning Path feature. Boasting functionality not too dissimilar to products such as WalkMe, and available across the whole spectrum of the D365E experience (Web Client, ISH and Mobile/Tablet app), the feature is designed to provide a guided means of training new application users on how Dynamics 365 works within the specific context of your business. Induction and new user training can be one of the major hurdles affecting the success of a system deployment, so having a highly contextualised, built-in and guided process for training and reminding users how to complete tasks within the application is surely an important tool for any organisation to have at its disposal.

Unfortunately, the feature looks to be a little bit too intrusive from an end user experience viewpoint, as leaving the feature enabled post-upgrade will result in the application attempting to open pop-ups through your browser of choice:

To disable the feature until you are ready to start rolling it out across your business, then you have two options at your disposal:

  1. Direct your end-users to select the Opt Out of Learning Path option from the gear icon on the D365E sitemap area:
  2. Go to System Settings and then select No under the Enable Learning Path option. This is the recommended option, as it will disable the feature across the board for all users:

Modify Your Error Notification Preferences for All Users

Error messages can occur occasionally within the web application. Generally, these will take the form of the Send Error Report To Microsoft variety, and can result from either a problem within the application itself or an error that has been caused by a developer customisation (e.g. JScript function, Sitemap amend etc.). The default setting for this is that users will be prompted before an error report is sent to Microsoft on these occasions. Having the default setting enabled can prove useful when diagnosing issues with the application, but could cause problems and distress for your end users if the application is throwing them regularly.

Whether due to the customisations involved as part of the above upgraded system or a fault with D365E itself, these error messages seem to appear a lot more often in the latest version of the application; in fact, almost every time a user leaves a record. The error messages are sufficiently non-descript, and lack any reference to customisations made to the system (such as a JScript function name), that it is difficult to tell whether the customisations themselves are the problem. By selecting the option to send the error reports to Microsoft, you can ensure that these errors will be looked into and hopefully addressed as part of a security update in the future. But if you are upgrading to D365E, I would recommend selecting Automatically send an error report to Microsoft without asking the user for permission on the Privacy Settings page, to ensure that your end users are not getting bombarded with constant error messages:

Don’t Upgrade Just Yet If You Are Using Scribe Online

Scribe Online is currently one of the de-facto tools of choice if you are looking to accomplish very basic integration requirements around CRM. The tool enables you to straightforwardly export your application data into external sources – whether they are SQL-based data sources or even completely different applications altogether. I have not had much direct experience with the tool myself, but I can attest to its relative ease-of-use; I do take issue, however, with how the tool operates within a CRM environment. For example, it creates a custom entity directly within your CRM instance within the default solution, using the default solution prefix (new_). Most ISV solutions instead deploy any required customisations out to the application using the much better supported and best practice route of Managed Solutions, allowing application administrators to better determine which components are deployed out as part of a 3rd party solution and to expedite any potential removal of the solution in future. Having said all that, Scribe Online should be your first port of call if you have a requirement to integrate with external systems as part of your CRM solution.

Now, I deliberately avoided mentioning D365E in the above paragraph, as it looks as if the Scribe Online tool has issues either as a direct result of the upgrade process involved with D365E or due to Scribe Online itself. Shortly after upgrading, the application/Scribe Online will modify the properties of your entity records to set the modifiedon field to the same value for every single entity record. If the number of records in your entity exceeds the default amount of records that can be returned programmatically (thereby requiring the use of a paging cookie), then Scribe will return an error message similar to the below when it next attempts to run your RS solution:

Unable to get the next page of data. Dynamics CRM has not advanced the page cookie for Entity:new_mycustomentity, PagingCookie: <cookie page="2"><modifiedon last="2017-01-03T06:42:21-00:00" first="2017-01-03T06:42:21-00:00" /><new_mycustomentityid last="{A56661B7-C969-E611-80EF-5065F38A8A01}" first="{797EAA25-1645-E611-80E1-5065F38A4AD1}" /></cookie>
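If you suspect your instance has been affected, a quick way to confirm is to check whether every record in a returned page shares the same modifiedon value – the condition that stops CRM advancing the paging cookie. A minimal sketch, assuming an array of records shaped like the Web API's JSON output with a modifiedon property:

```javascript
// Sketch: detect the "collapsed" modifiedon values that break paging.
// Assumes records as returned by the Web API, each exposing a modifiedon
// property; adjust the property name for your own query.
function hasCollapsedModifiedOn(records) {
    if (records.length < 2) {
        return false;
    }
    var first = records[0].modifiedon;
    return records.every(function (record) {
        return record.modifiedon === first;
    });
}
```

Run this against a page of your entity's records; a true result suggests you will hit the paging cookie error above.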

This issue looks to be occurring for other organisations who have upgraded as well, and Scribe have published an online support article with a suggested workaround for this situation provided by Felix Chan:

To work around the issue, we used JavaScript with the Microsoft Dynamics 365 Web API to update all of the account records by changing the value of a field we don’t use (e.g. telephone3) from null to “” (which translated back to null). Needless to say, this effectively updated the modifiedon datetime stamp. It also resulted in the change to Telephone 3 to show up in the Audit History of each account record.

The above workaround is all very well and good if you are dealing with a small number of records and have the appropriate knowledge of how to implement some form-level JScript functions. My concern would be for organisations who lack this knowledge and are instead left with a solution that does not work. Despite not having firm proof of this, I suspect that the issue is a fault with Scribe itself and not a result of the upgrade; this is based solely on the value of the modifiedon field being well after the upgrade took place, during the time when our RS Solution was running. Ideally, Scribe need to acknowledge the existence of this issue and confirm what is causing the error; in the meantime, if you are reliant on Scribe Online for business-critical integrations, I would strongly recommend holding off on upgrading until this issue is acknowledged, or until you can identify a replacement service that does not suffer from this problem. In our case, we were only using Scribe Online to back up our application data to an Azure SQL database and were instead able to get up and running quickly with the rather excellent Dynamics 365 Data Export Service.
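For anyone who does want to script the touch update described in the quoted workaround, a minimal sketch of building the necessary Web API request is shown below. The v8.0 endpoint, the accounts entity set and the telephone3 field are assumptions taken from that workaround – adjust them for your own organisation, and bear in mind the Audit History side effect mentioned above:

```javascript
// Sketch: build a Web API PATCH request that "touches" a record by setting
// an unused field to "" (persisted as null), which bumps its modifiedon
// value. orgUrl, entitySet and fieldName are placeholders for your own
// environment.
function buildTouchRequest(orgUrl, entitySet, recordId, fieldName) {
    var payload = {};
    payload[fieldName] = "";
    return {
        method: "PATCH",
        url: orgUrl + "/api/data/v8.0/" + entitySet + "(" + recordId + ")",
        headers: {
            "Content-Type": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0"
        },
        body: JSON.stringify(payload)
    };
}
```

You would then loop over the affected record IDs and send each request, for example via XMLHttpRequest from within the application.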

When working with form-level JScript functionality on Dynamics CRM/Dynamics 365 for Enterprise (D365E), you often uncover some interesting pieces of exposed functionality that can be utilised neatly for a specific business scenario. I did a blog post last year on arguably one of the best of these functions when working with Lookup field controls – the Xrm.Page.getControl().addPreSearch method. As with other methods exposed via the SDK, its prudent and effective implementation can greatly reduce the number of steps/clicks involved when populating Entity forms.

I’ve already covered, as part of last year’s post, just what this method does, its sister method, addCustomFilter, and some of the interesting problems encountered when working with the Customer lookup field type; a special, recently introduced field type that allows you to create a multi-entity lookup/relationship onto the Account and Contact entities in one field. I was recently doing some work again using these methods in the exact same conditions, and again came across some interesting quirks that are useful to know when determining whether the utilisation of these SDK methods is a journey worth starting in the first place. Without further ado, here are two additional scenarios that involve utilising these methods and the “lessons learned” from each:

Pre-Filtering the Customer Lookup to return Account or Contact Records Only

Now, your first assumption here may be that, if you wanted your lookup control to only return one of the above entity types, surely it would be more straightforward to just set up a dedicated 1:N relationship between your corresponding entity types? The benefits of this seem pretty clear – it is a no-code solution that, with a bit of ingenious use of Business Rules/Workflows, could be implemented in a way that the user never even suspects what is taking place (e.g. a Business Rule to hide the corresponding Account/Contact lookup field if the other one contains a value). However, assume one (or all) of the following:

  • You are working with an existing System entity (e.g. Quote, Opportunity) that already has the Customer lookup field defined. This would, therefore, mean you would have to implement duplicate schema changes to your Entity to accommodate your scenario, a potential no-no from a best practice point of view.
  • Your entity in question already has a significant number of custom fields – more than 200-300 in total. Additional performance overheads may occur if you were then to create two separate lookup fields as opposed to one.
  • The entity you are customising already has a Customer lookup field built in, which is populated with data across hundreds, maybe thousands, of records within the application. Attempting to implement two separate lookups and then going through the exercise of updating every record to populate the correct lookup field could take many hours to complete and also have unexpected knock-on effects across the application.

In these instances, it may make more practical sense to implement a small JScript function that conditionally alters how the Customer Lookup field allows the user to populate records on the form. The benefit of this is that you can take advantage of the multi-entity capabilities that this field type was designed for, while also enforcing the integrity of your business logic/requirements on the application’s form layer.

To that end, what you can look at doing is applying a custom FetchXML snippet that prevents either Account or Contact records from being returned when a user clicks on the control. Paradoxically, this is not done, as I first assumed, by using the following snippet:

var filter = "<filter type='and'><condition attribute='accountid' operator='not-null' /></filter>";
Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "account");

This will lead to no records being returned on your lookup control. Rather, you need to filter the opposite way – only return Contact records where the contactid is null, i.e. the record does not exist:

var filter = "<filter type='and'><condition attribute='contactid' operator='null' /></filter>";
Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "contact");
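Putting the pieces together, the full pattern looks something like the sketch below, registered against the form’s OnLoad event. The mycustomerlookupfield name is a placeholder, consistent with the snippets above:

```javascript
// Sketch: restrict a Customer lookup so that only Account records are
// selectable. Register restrictCustomerLookupToAccounts on the form's
// OnLoad event; the field name is a placeholder for your own schema name.
function restrictCustomerLookupToAccounts() {
    Xrm.Page.getControl("mycustomerlookupfield").addPreSearch(applyAccountOnlyFilter);
}

function applyAccountOnlyFilter() {
    // Exclude every Contact record; only Account records will be returned
    var filter = "<filter type='and'><condition attribute='contactid' operator='null' /></filter>";
    Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "contact");
}
```

Note that the filter is applied inside the PreSearch handler, so it is re-applied each time the user opens the lookup.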

Don’t Try and Pass Parameters to your addCustomFilter Function (CRM 2016 Update 1)

If your organisation is currently on Dynamics CRM 2016 Update 1, you may encounter a strange – and, from what I can gather, unresolvable – issue when working with multiple, parameterised functions in this scenario. To explain further, let’s assume you have a Customer Lookup and a Contact Lookup field on your form. You want to filter the Contact Lookup field to only return Contacts associated with the Account populated in the Customer Lookup. Assume that there is already a mechanism in place to ensure that the Customer Lookup always has an Account record populated within it; your functions for this specific scenario may look something like this:

function main() {

    //Filter Contact lookup field if Customer lookup contains a value
    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue();

    if (customerID != null) {
        Xrm.Page.getControl("mycontactfield").addPreSearch(filterContactNameLookup(customerID[0].id));
    }
}

function filterContactNameLookup(customerID) {

    var filter = "<filter type='and'><condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' /></filter>";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "contact");
}

The above example seems a perfectly sensible means of implementing this; surely it makes more practical sense to obtain the ID of our Customer Lookup field in one place only and then pass this along to any subsequent functions? The problem is that CRM 2016 Update 1 throws some rather cryptic errors in the developer console when attempting to execute the code, and does nothing on the form itself:

Yet, when we re-write our functions as follows, explicitly obtaining our Customer ID on two occasions, this runs as we’d expect with no error:

function main() {

    //Filter Contact lookup field if Customer lookup contains a value

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue();

    if (customerID != null) {
   
        Xrm.Page.getControl("mycontactfield").addPreSearch(filterContactNameLookup);
    }
    
}

function filterContactNameLookup() {

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue()[0].id;
    var filter = "<filter type='and'><condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' /></filter>";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "contact");

}

I’ve been scratching my head at why this doesn’t work, and the most likely explanation is that addPreSearch expects a function reference as its argument. In the first example, filterContactNameLookup(customerID[0].id) invokes the function immediately as part of the form’s OnLoad event – calling addCustomFilter outside of the PreSearch event and handing addPreSearch the function’s (undefined) return value – whereas, in the second example, filterContactNameLookup is only triggered at the point the lookup control is selected, which is exactly when the custom filter needs to be applied. If anyone else can shed further light on this, or confirm whether this is a bug with Dynamics CRM 2016 Update 1, then do please let me know in the comments below.
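If you do want to avoid reading the Customer lookup twice, one pattern worth experimenting with is wrapping the parameterised call in an anonymous function, so that addPreSearch still receives a function reference and the filter is only built when the lookup control is selected. This is a hedged sketch, using the same placeholder field names as above, rather than something I have verified against every version of the application:

```javascript
// Sketch: pass a parameter into a PreSearch handler via a closure, so
// addPreSearch receives a function reference rather than the result of
// invoking the function immediately during OnLoad.
function mainWithClosure() {

    //Filter Contact lookup field if Customer lookup contains a value
    var customerID = Xrm.Page.getAttribute("mycustomerlookupfield").getValue();

    if (customerID != null) {
        var id = customerID[0].id;
        Xrm.Page.getControl("mycontactfield").addPreSearch(function () {
            filterContactLookupById(id);
        });
    }
}

function filterContactLookupById(customerID) {

    var filter = "<filter type='and'><condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' /></filter>";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "contact");
}
```

One caveat: the closure captures the lookup’s value at OnLoad, so if the Customer field can change after the form loads, you would still want to re-read the value inside the handler, as in the working example above.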

Conclusions or Wot I Think

It could be argued quite strongly that the examples shown in this article have little or no practical use if you are approaching your CRM/D365E implementation from a purely functional point of view. Going back to my earlier example, it is surely a lot less hassle, and less error-prone, to implement a solution using a mix of out-of-the-box functionality within the application. The problem you may eventually find with this is that the solution becomes so cumbersome and, frankly, undecipherable to someone coming into your system cold. As with anything, a balance should always be striven for and, with a bit of practical knowledge of how to write JScript functionality (something that any would-be CRM expert should have stored in their arsenal), you can put together a solution that is relatively clean from a coding point of view, but also benefits from some great functionality built into the application.