I was recently involved in deploying my first ever Office 365 Group. I already had a good theoretical understanding of them, thanks to the curriculum for the Business Applications MCSA, but I had not yet seen how they perform in action. The best way of summing them up is that they are, in effect, a distribution group on steroids. As well as getting a shared mailbox that can be used for all communications relating to the group’s purpose, they also support the following features:

  • Shared Calendar
  • SharePoint Document Site
  • Shared OneNote document
  • Shared Planner

In a nutshell, they can be seen as an excellent vehicle for bringing together the diverse range of features available as part of your Office 365 subscription. What helps further is that they are tightly integrated with the tools that you likely already use each day – for example, they can be accessed and worked with from the Outlook desktop client and the Outlook Web Access (OWA) portal.

Given that this is a very Office 365-centric feature, the natural question emerges as to why an exam for Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E) would want to test your knowledge of them. Since the release of Dynamics CRM 2016 Update 1, you have had the option of integrating Office 365 Groups with the application, providing a mechanism for easily working with groups from within the CRM/D365E web interface and effectively acting as a “bridge” for non-CRM/D365E users who are using Office 365.

You may be pleased to hear that the steps involved in getting set up with Office 365 Groups in CRM/D365E are remarkably straightforward. Here’s a step-by-step guide on how to get up and running with this feature within your business:

Microsoft provides a managed solution that contains everything you need to get going with Office 365 Groups, and this is made available as a Preferred Solution. These are installed from the Dynamics 365 Administration Center by navigating to your instance, selecting the little pen icon next to Solutions and clicking on the Office 365 Groups record on the list that is displayed:

Click on the Install button and then accept the Terms of Service – as Office 365 Groups creates an intrinsic link between your CRM/D365E and Office 365 tenant, it is only natural that data will need to be shared between both, so there are no major concerns in accepting this:

The solution will take a couple of minutes to install, and you can safely refresh the window to monitor progress. Once installed, the Settings sitemap area will be updated with a new button – Office 365 Groups:

Clicking into this will navigate you to the Office 365 Groups Integration Settings page, which allows you to start configuring the entities you wish to utilise with Office 365 Groups:

For reference purposes, the default out of the box entities that can be used with this feature are as follows:

  • Account
  • Competitor
  • Contact
  • Contract
  • Case
  • Invoice
  • Lead
  • Opportunity
  • Product
  • Quote
  • Sales Literature

You may be wondering if it is possible to enable additional entities for use with Office 365 Groups. At the time of writing, only the system entities recorded above and custom entities can be used with Office 365 Groups.

Now that we know how to get CRM/D365E set up for Office 365 Groups, let’s look at how it works when set up for the Account entity:

Going back to the Office 365 Groups Integration Settings (if you have closed it down), click on the Add entity button to enable a drop-down control, containing a list of the entities referenced above. Select Account and, when you are ready to proceed, click Publish all to enable this entity for Office 365 Groups functionality:

For this example, the Auto Create setting is left blank. I would recommend always leaving it this way, so as to prevent the creation of unnecessary Office 365 Groups that may end up named incorrectly as a consequence (you’ll see why this has the potential to occur in a few moments).

Once enabled, when you navigate to an existing Account record, you will see a new icon on the Related Records sitemap area:

After clicking on this, you are then asked to either Create a new group – with the ability to specify its name – or to Search for an existing group. The second option is particularly handy if you have already been using Office 365 Groups and wish to retroactively tie these back to CRM/D365E:

For this example, we are going to create a new group. The process can take a while (as indicated below) so now may be a good opportunity to go make a brew 🙂

Leaving the screen open will eventually force a refresh, at which point your new group will appear, with all the different options at your disposal:

With your group now up and running, you can start uploading documents, configure the shared calendar and fine-tune the group’s settings to suit your purposes. Here are some handy tips to bear in mind when using the group with CRM/D365E:

  • Just because the group is linked up with CRM/D365E doesn’t mean that you have to be a user from this application to access the group. This is one of the great things about utilising Office 365 Groups with CRM/D365E, as standard Office 365 users can join and work with the group without issue. The only thing you have to remember is that the Office 365 user has to have the appropriate license on Office 365 – as indicated by Microsoft, any subscription that gives a user an Exchange Online mailbox and SharePoint Online access will suffice.
  • Remember that the Conversations, Notebook and Documents features are not in any way linked with the equivalent CRM/D365E features. For example, any Conversation threads will not appear within the Social Pane as an activity; you will need to navigate to the Office 365 Group page to view these.
  • Utilising Office 365 Groups as an end-user requires that you have the appropriate security role access. If you do not, then you may be greeted with the following when attempting to open an Office 365 Group within the application:

That’s right – a whole heap of nothing! 🙂 To fix this, you will need to go into the user’s Security Role and ensure that they have the Organization-level ISV Extensions privilege, as indicated below:

Conclusions or Wot I Think

Office 365 Groups present a natural choice when working as part of large-scale teams or projects – especially when they are internally based. They can also be a good fit for when you wish to liaise with 3rd party organisations, thanks to the ability to grant Guest access to external accounts. Having the ability to then tie these groups back within CRM/D365E is useful, but I do wonder whether they are a good match for all of the record types that Microsoft suggests in the list above. Certainly, Account records are a justifiable fit if you are working with an organisation to deliver continuous services or multiple projects. I highly doubt, however, that you would want to go to the trouble of creating a shared document repository for a new Lead record right off the bat – particularly if your CRM/D365E deployment is more focused towards B2C selling.

You may be tempted to over-excitedly roll out Office 365 Groups carte blanche across your CRM/D365E deployment, but I would caution against this. Don’t forget that the creation of a new Office 365 Group will result in additional overhead when managing your Exchange Online mailbox lists and SharePoint sites, as well as having long-term storage implications for the latter. By acting prudently, you can identify a good business case for enabling specific entities for use with Office 365 Groups and ensure that you manage your entire Office 365 deployment in the most effective manner possible.

The sheer breadth of ways that you can utilise Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E) can sometimes boggle the mind. Whether it’s through a traditional web browser, mobile app, the new interactive service hub or even through your own website created via the SDK, organisations have an ever-increasing array of routes they can go down when deploying the application into their environment. Despite this, more often than not, you would expect a “standard” deployment to involve using the application via a web browser, either on a local machine or potentially via a Remote Desktop Session (RDS) instance. Whilst Microsoft’s support articles provide fairly definitive software requirements when working on a Windows desktop machine, it is difficult to determine if, for example, Google Chrome on a Windows Server 2012 RDS session is supported. This is an important omission that requires clarification and is worth discussing further to determine if a definitive conclusion can be reached, based on the available evidence.

In this week’s post, I will attempt to sleuth through the various pieces of evidence I can find on this subject, sprinkling this with some experience that I have had with CRM/D365E and RDS, to see if any definitive conclusion can be established.

Before we get into the heart of the matter, it may be useful to provide a brief overview of what RDS is:

RDS is a fancy way of describing connecting to a remote computer via the Remote Desktop Connection client on Windows or your OS of choice. Often referred to as Terminal Services, it is a de facto requirement when accessing remote servers for a variety of reasons. Most commonly, you will witness it deployed as part of an internal corporate network, as a mechanism for users to “remote on” when working outside the office. Due to the familiarity of Windows Server compared with each version’s corresponding desktop OS, the look and feel of working on a normal computer can be achieved with minimal effort, and you can often guarantee that the same types of programmes will also work without issue.

Whilst RDS is still frequently used, it could be argued to have taken a back seat in recent years with the rise of virtualisation technologies from the likes of Citrix and VMware. These solutions tend to offer the same benefits an RDS server can, but place more emphasis on utilising a local desktop environment to essentially stream desktops/applications to end users. As a result of the rise of these technologies, RDS is perhaps entering a period of uncertainty; whilst it will continue to be essential for remote server management, there are arguably much better technologies available that provide an enriched end-user experience, whilst offering the same benefits of having a centralised server within a backed-up/cloud environment.

Now that you (hopefully!) have a good overview of what RDS is, let’s take a look at the evidence available in relation to CRM/D365E and RDS:

Evidence #1: TechNet Articles

The following TechNet articles provide, when collated together, a consolidated view of supported internet browsers and operating systems for CRM/D365E:

From this, we can distill the following:

  • Windows 10, 8.1, 8 and 7 are supported, so long as they are using a “supported” browser:
    • Internet Explorer 10 is supported for Windows 7 and 8 only.
    • Internet Explorer 11 is supported for all Windows OS’s, with the exception of 8.
    • Edge is supported for Windows 10 only.
    • Firefox and Chrome are supported on all OS’s, so long as they are running the latest version.
  • OS X 10.8 (Mountain Lion), 10.9 (Mavericks) and 10.10 (Yosemite) are supported for Safari only, running the latest version
  • Android 10 is supported for the latest version of Chrome only
  • iPad is supported for the latest version of Safari only (i.e. the latest version of iOS)

The implication from this should be clear – although the following Windows Server operating systems (that are currently in mainstream support) can run a supported web browser, they are not covered by the supported operating system list above:

  • Windows Server 2016
  • Windows Server 2012 R2
  • Windows Server 2012

Evidence #2: Notes from the Field

I have had extensive experience both deploying into and supporting CRM/D365E environments running on RDS. These would typically involve servers with a significant user load (20-30 per RDS server), and the general experience and feedback from end users has always been…underwhelming. All issues generally came down to the speed of the application which, when compared to running on a standard local machine, was at a snail’s pace. Things like loading a form, an entity view or a Dialog became tortuous affairs and led to serious issues with user adoption across the deployments. I can only assume that the amount of local CPU/memory required to run the CRM/D365E web application was too much for the RDS server to handle; this was confirmed by frequent CPU spikes and high memory utilisation on the server.

I can also attest to working with Microsoft partners who have explicitly excluded issues concerning RDS and CRM/D365E from the scope of any support agreement. When this was queried, the reasoning boiled down to the perceived hassle and complexity involved in managing these types of deployment.

To summarise, I would argue that this adds further ammunition to Evidence #1, insomuch as RDS-compatible server operating systems may be absent from the supported operating system lists because these issues are generally known about.

Evidence #3: What Microsoft Actually Say

I was recently involved as part of a support case with Microsoft, where we were attempting to diagnose some of the performance issues discussed above within an RDS environment. The support professional assigned to the case came back and stated the following regarding RDS and CRM/D365E:

…using Windows remote desktop service is supported but unfortunately using Windows server 2012 R2 is not supported. You have to use Windows server 2012. Also windows server 2016 is out of our support boundaries.

Whilst this statement is not backed up by an explicit online source (and I worry whether some confusion has been derived from the Dynamics 365 for Outlook application – see below for more info on this), it can be taken as saying that Windows Server 2012 is the only supported operating system that can be used to access CRM/D365E, with one of the supported web browsers mentioned above.

The Anomalous Piece of Evidence: Dynamics 365 for Outlook Application

Whilst it may not be 100% clear cut with regard to supported server operating systems, we can point to a very definitive statement in respect of the Dynamics 365 for Outlook application when used in conjunction with RDS:

Dynamics 365 for Outlook is supported for running on Windows Server 2012 Remote Desktop Services

Source: https://technet.microsoft.com/en-us/library/hh699743.aspx

Making assumptions here again, but can we take this to mean that the web application is supported within Windows Server 2012 RDS environments, as suggested by the Microsoft engineer above? If not, then you may start thinking to yourself “Well, why not just use this instead of a web browser on RDS to access CRM/D365E?”. Here are a few reasons why you wouldn’t really want to look at rolling out the Dynamics 365 for Outlook application any time soon within RDS:

  • If deploying the application in offline mode, then you will be required to install a SQL Express instance onto the machine in question. This is because the application needs to store copies of your synchronised entity data for whenever you go offline. The impact of this on a standard user machine will be minimal, but on a shared desktop environment it could lead to eventual performance issues on the RDS server in question.
  • With the introduction of new ways to work with CRM/D365E data efficiently, such as the Dynamics 365 App for Outlook, the traditional Outlook client is becoming less of a requirement these days. There are plenty of rumours/commentary on the grapevine that the application may be due for deprecation in the near future, and even Microsoft have the following to say on the subject:

    Dynamics 365 App for Outlook isn’t the same thing as Dynamics 365 for Outlook. As of the December 2016 update for Dynamics 365 (online and on-premises), Microsoft Dynamics 365 App for Outlook paired with server-side synchronization is the preferred way to integrate Microsoft Dynamics 365 with Outlook.

  • I have observed performance issues with the add-in myself in the past – Outlook freezing, the occasional crash and also issues with the Outlook ribbon displaying incorrectly.

As you can probably tell, I am not a big fan of the add-in, but the position is fairly clear – Microsoft fully supports accessing CRM/D365E from the Outlook client on Windows Server 2012 RDS.

After reviewing all the evidence, do we have enough to solve this case?

Whilst there is a lot of evidence to consider, the main thing I would highlight is the lack of a “smoking gun” in what has been reviewed. What I mean by this is the lack of a clear support article that states either “X Browser is supported on Windows Server X” or “X Browser is NOT supported on Windows Server X“. Without any of these specific statements, we are left in a situation where we have to infer that RDS is not a supported option for using the CRM/D365E web application. Certainly, the experience I have had with the web client in these environment types would seem to back this up, which may go some way towards explaining why this is not explicitly supported.

So where does this leave you if you are planning to deploy CRM/D365E within an RDS environment? Your only option is to ensure that your RDS environment is running Windows Server 2012 and that your users are utilising the Outlook client, given that there is a very clear statement regarding its supportability. If you are hell-bent on ensuring that your end users have the very best experience with CRM/D365E, then I would urge you to reconsider how your environment is configured and, if possible, move to a supported configuration – whether that’s a local desktop or a VDI, running your browser of choice. Hopefully, the benefits of utilising the application will far outweigh any overriding concerns and business reasons for using RDS in the first place.

When you have spent any length of time working with Dynamics CRM Online/Dynamics 365 for Enterprise (D365E) data programmatically, you become accustomed to how Option Set, State and Status Reason values are presented to you in code. To explain, the application does not store your Option Set value display names within the SQL Server Entity tables; rather, the Option Set Value that has been specified alongside your Label is what is stored as an integer value. That is why you are always mandatorily prompted to provide both values within the application:

The following benefits are realised as a result of how this is set up:

That being said, when working with these field types in code, you do always have to have the application window open or a list of all Labels/Values to hand so that you don’t get too confused… 🙂

I have previously extolled the virtues of the Data Export Service on the blog, and why you should consider it if you have basic integration requirements for your CRM/D365E deployment. One area in which it differs from other products on the market is how it handles the field types discussed above. For example, when exporting data to a SQL database via Scribe Online, new columns are created alongside that contain the “Display Name” (i.e. label value) corresponding to each Option Set, Status and Status Reason value. So by running the following query against a Scribe export database:

SELECT DISTINCT statecode, statecode_displayname
FROM dbo.account

We get the best of both worlds – our underlying statecode values and their display names – all in 2 lines of code:

This is a big help, particularly when you are then using the data as part of a report, as no additional transformation steps are required and your underlying SQL query can be kept as compact as possible.

The Data Export Service differs from the above in an understandable way, as display name values for Status, Status Reason and Option Set column values are instead segregated out into their own separate table objects in your Azure SQL database:

  • OptionSetMetadata
  • GlobalOptionSetMetadata
  • StateMetadata
  • StatusMetadata

Why understandable? If you consider how the application can support multiple languages, then you realise that this can also apply to metadata objects across the application – such as field names, view names and – wouldn’t you have guessed it – Labels too. So when we inspect the OptionSetMetadata table, we can see that the table structure accommodates the storing of labels in multiple languages via the LocalizedLabelLanguageCode field:
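To illustrate, a query along the following lines – the account entity and its customertypecode field are used purely as examples, and the results assume that more than one language pack is enabled on the instance – would return one row per Option, per language, straight from the OptionSetMetadata table:

SELECT OptionSetName, [Option], LocalizedLabelLanguageCode, LocalizedLabel
FROM dbo.OptionSetMetadata
WHERE EntityName = 'account'
 AND OptionSetName = 'customertypecode'
ORDER BY [Option], LocalizedLabelLanguageCode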

Unlike the Scribe Online route above (which I assume only retrieves the Labels that correspond to the user account that authenticates with CRM/D365E), the Data Export Service becomes instantly more desirable if you are required to build multi-language reports referencing CRM/D365E application data.

The issue that you have to reconcile yourself with is that your SQL queries, if being expressed as natively as possible, instantly become a whole lot more complex. For example, to achieve the same results as the query above, it would have to be adapted as follows for the Data Export Service:

SELECT DISTINCT statecode, LocalizedLabel
FROM dbo.account
 LEFT JOIN dbo.StateMetadata
  ON 'account' = EntityName
  AND statecode = [State]
  AND '1033' = LocalizedLabelLanguageCode

The above is a very basic example, but if your query is complex – and involves multiple Option Set Values – then you would have to resort to using Common Table Expressions (CTE’s) to accommodate each potential JOIN required to get the information you want.
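To give a flavour of what this starts to look like, here is a minimal sketch of the CTE approach, assuming – purely for illustration – that we want the English labels for the industrycode and customertypecode Option Set fields on the account entity:

--Define one CTE per Option Set field, each filtered to the language we want
WITH IndustryLabels AS
(
	SELECT EntityName, [Option], LocalizedLabel
	FROM dbo.OptionSetMetadata
	WHERE OptionSetName = 'industrycode'
	 AND LocalizedLabelLanguageCode = 1033
),
CustomerTypeLabels AS
(
	SELECT EntityName, [Option], LocalizedLabel
	FROM dbo.OptionSetMetadata
	WHERE OptionSetName = 'customertypecode'
	 AND LocalizedLabelLanguageCode = 1033
)
--Join each CTE back onto the entity table to resolve the label for each record
SELECT a.name, i.LocalizedLabel AS industrycode_displayname, c.LocalizedLabel AS customertypecode_displayname
FROM dbo.account AS a
 LEFT JOIN IndustryLabels AS i
  ON i.EntityName = 'account'
  AND a.industrycode = i.[Option]
 LEFT JOIN CustomerTypeLabels AS c
  ON c.EntityName = 'account'
  AND a.customertypecode = c.[Option]

Repeat this for every Option Set, Status and Status Reason field you need to report on and the query soon becomes difficult to read and maintain.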

In these moments, we can look at some of the wider functionality provided as part of SQL Server to develop a solution that will keep things as simple as possible and, in this particular instance, a user-defined function is an excellent candidate to consider. These enable you to perform complex operations against the database platform and encapsulate them within simply and clearly expressed objects that can also accept parameters. The good thing about functions is that they can be used to return either table objects or scalar (i.e. single) values.

Using a scalar function, we can, therefore, remove some of the complexity behind returning Option Set, Status and Status Reason labels by creating a function that returns the correct label, based on input parameters received by the function. You could look at creating a “master” function that, based on the input parameters, queries the correct Metadata table for the information you need; but in this example, we are going to look at creating a function for each type of field – Status, Status Reason, Option Set and Global Option Set.

To do this, connect up to your Data Export Service database and open up a new query window, ensuring that the context is set to the correct database. Paste the following code in the window and then hit Execute:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

--Create Function to return Global Option Set Labels

CREATE FUNCTION [dbo].[fnGetGlobalOptionSetLabel]
(
	@GlobalOptionSetName NVARCHAR(64), --The logical name of the Global Option Set
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. Full list of supported languages (Correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain label for Global Option Set field ' + @GlobalOptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Option Set Labels

CREATE FUNCTION [dbo].[fnGetOptionSetLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Option Set field
	@OptionSetName NVARCHAR(64), --The logical name of the Option Set field
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. Full list of supported languages (Correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain label for Option Set field ' + @OptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Labels

CREATE FUNCTION [dbo].[fnGetStateLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status field
	@State INT, --The Status option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. Full list of supported languages (Correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain State label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Reason Labels

CREATE FUNCTION [dbo].[fnGetStatusLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status Reason field
	@Status INT, --The Status Reason option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. Full list of supported languages (Correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain Status label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

This will then go off and create the functions listed above, which should then show up under the Programmability folder in your SQL database:

For those who are unsure of what the SQL code is doing, it first attempts to determine whether exactly one Label can be found for your appropriate field type, based on the parameters provided. If it is successful, then that value is returned; otherwise, the CAST function is designed to force an error back to the caller to indicate that either no Label, or more than one, was found. In most cases, this would indicate a typo in the parameters you have specified.
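For example, calling one of the functions with parameters that do not match any row in the corresponding Metadata table – the value 999999 below is deliberately nonsensical – will surface a conversion error along the lines of “Conversion failed when converting the nvarchar value … to data type int”, rather than silently returning NULL:

--Deliberately invalid State value, used purely to demonstrate the error behaviour
SELECT dbo.fnGetStateLabel('account', 999999, 1033) AS statecode_displayname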

As with anything, the best way to see how something works is in practice! So if we again look at our previous examples shown in this post, we would utilise the dbo.fnGetStateLabel function as follows to return the correct label in English:

SELECT DISTINCT statecode, dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname
FROM dbo.account

With our results returning as follows:
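For the account entity, whose default State values are 0 (Active) and 1 (Inactive), you would expect output along the lines of the following:

statecode    statecode_displayname
0            Active
1            Inactive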

Now we can expose this through our reports and not worry about having to do any kind of transformation/lookup table to get around the issue 🙂

Attempting to keep things as simple as possible by encapsulating complex functionality into simply and clearly expressed functions is an excellent way of ensuring that code can be kept as streamlined as possible, and also ensures that other colleagues can accomplish complex tasks, even if they do not have in-depth knowledge of Transact-SQL.

Although CRM Online/Dynamics 365 for Enterprise (D365E) does provide a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot just connect straight up to the application’s SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data out into a format that best pleases you; or even take the plunge and get a developer to write your own application that satisfies your integration requirements.

With the recent D365E release and in line with Microsoft’s longstanding approach to customer data within their applications (i.e. “It’s yours! So just do what you want with it!”), the parallel introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product’s arsenal. By using the service, relatively straightforward integration requirements can be satisfied in a pinch, and a lot of the headache involved in setting up a backup of your organisation’s databases/LOB reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using this tool is not going to break the bank too much either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free*, add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point to make, given that the cost difference per month can amount to hundreds of pounds.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) Cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you’re doing, you can get all of the above setup very quickly; in some cases, less than 10 minutes if you opt to script out the entire deployment via PowerShell.

For your setup with D365E, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign into an external provider, you may also need to verify that your web browser pop-up settings/firewall are configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync: To clarify, this is basically asking if you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options for this will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities’ dependent records have not yet been synced across (for example, any Opportunity record that references the Account name Test Company ABC Ltd. will not sync until this Account record has been exported successfully). So rather than attempting to interrogate the error logs straight away, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved. My experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. I suspect the fact that the service is essentially an Azure-to-Azure export helps to ensure smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay appears to be very small indeed – often tens of minutes, if not just a few minutes. This, therefore, makes the Data Export Service an excellent candidate for a backup/primary reporting database to satisfy any requirements that cannot be achieved via FetchXML alone.

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records failed to sync across successfully, which upon closer inspection is not the case. Hopefully this is something that will get ironed out in time; it appears to be due to the rather strange way that the listmember entity behaves when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive way of replacing these services. That’s not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In these cases, the products mentioned above will no doubt remain the best route to achieving your requirements.

I also have a few concerns that the setup involved as part of the Data Export Service could be a barrier towards its adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, having the Export Profile Wizard go off and create all the required resources in Azure by simply entering your Azure login credentials.

The D365E release has brought a lot of great new functionality and features to the table that have been oft requested and add real benefit to organisations who already use the application or plan to in the future. The Data Export Service is perhaps one of the great underdog features that D365E offers, and is one that you should definitely consider using if you want a relatively smooth-sailing data export experience.

When working with form-level JScript functionality on Dynamics CRM/Dynamics 365 for Enterprise (D365E), you often uncover some interesting pieces of exposed functionality that can be utilised neatly for a specific business scenario. I did a blog post last year on arguably one of the best of these functions when working with Lookup field controls – the Xrm.Page.getControl().addPreSearch method. Similar to other methods exposed via the SDK, its prudent and effective implementation can greatly reduce the number of steps/clicks involved when populating Entity forms.

I’ve already covered as part of last year’s post just what this method does, its sister method, addCustomFilter, and also some of the interesting problems that are encountered when working with the Customer lookup field type; a special, recently introduced field type that allows you to create a multi-entity lookup/relationship onto the Account and Contact entities via a single field. I was doing some work again recently using these methods under the exact same conditions, and again came across some interesting quirks that are useful to know when determining whether the utilisation of these SDK methods is a journey worth starting in the first place. Without much further ado, here are two additional scenarios that involve utilising these methods and the “lessons learned” from each:

Pre-Filtering the Customer Lookup to return Account or Contact Records Only

Now, your first assumption with this may be that, if you wanted your lookup control to only return one of the above entity types, then surely it would be more straightforward to just set up a dedicated 1:N relationship between your corresponding entity types to achieve this? The benefits of this seem to be pretty clear – this is a no-code solution that, with a bit of ingenious use of Business Rules/Workflows, could be implemented in a way that the user never even suspects what is taking place (e.g. a Business Rule to hide the corresponding Account/Contact lookup field if the other one contains a value). However, assume one (or all) of the following:

  • You are working with an existing System entity (e.g. Quote, Opportunity) that already has the Customer lookup field defined. This would, therefore, mean you would have to implement duplicate schema changes to your Entity to accommodate your scenario, a potential no-no from a best practice point of view.
  • Your entity in question already has a significant number of custom fields, totalling 200-300 or more. Additional performance overheads may occur if you were to then choose to create two separate lookup fields as opposed to one.
  • The entity you are customising already has a Customer lookup field built in, which is populated with data across hundreds, maybe thousands, of records within the application. Attempting to implement two separate lookups and then going through the exercise of updating every record to populate the correct lookup field could take many hours to complete and also have unexpected knock-on effects across the application.

In these instances, it may make more practical sense to implement a small JScript function to conditionally alter how the Customer Lookup field allows the user to populate records when working on the form. The benefit of this is that you can take advantage of the multi-entity capabilities that this field type was designed for, and also enforce the integrity of your business logic/requirements on the application’s form layer.

To that end, what you can look at doing is applying a custom FetchXML snippet that prevents either Account or Contact records from returning when a user clicks on the control. Paradoxically, this is not done by, as I first assumed, using the following snippet:

var filter = "<filter type='and'><condition attribute='accountid' operator='not-null' /></filter>";
Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "account");

This will lead to no records returning on your lookup control. Rather, you will need to filter the opposite way – only return Contact records where the contactid equals Null (i.e. records that do not exist), which leaves just Account records available for selection:

var filter = "<filter type='and'><condition attribute='contactid' operator='null' /></filter>";
Xrm.Page.getControl("mycustomerlookupfield").addCustomFilter(filter, "contact");

Don’t Try and Pass Parameters to your addCustomFilter Function (CRM 2016 Update 1)

If your organisation is currently on Dynamics CRM 2016 Update 1, then you may encounter a strange – and, from what I can gather, unresolvable – issue if you are working with multiple, parameterised functions in this scenario. To explain further, let’s assume you have a Customer Lookup and a Contact Lookup field on your form. You want to filter the Contact Lookup field to only return Contacts that are associated with the Account populated on the Customer Lookup. Assume that there is already a mechanism in place to ensure that the Customer lookup will always have an Account record populated within it, and your functions to use in this specific scenario may look something like this:

function main() {

    //Filter Contact lookup field if Customer lookup contains a value

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue();

    if (customerID != null) {
   
        Xrm.Page.getControl("mycontactfield").addPreSearch(filterContactNameLookup(customerID[0].id));
    }
    
}

function filterContactNameLookup(customerID) {

    var filter = "<condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' />";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "account");

}

The above example is a perfectly sensible means of implementing this. Because, surely, it makes more practical sense to only obtain the ID of our Customer Lookup field in one place and then pass it along to any subsequent functions? The problem is that CRM 2016 Update 1 throws some rather cryptic errors in the developer console when attempting to execute the code, and does nothing on the form itself:

Yet, when we re-write our functions as follows, explicitly obtaining our Customer ID on two occasions, this runs as we’d expect with no error:

function main() {

    //Filter Contact lookup field if Customer lookup contains a value

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue();

    if (customerID != null) {
   
        Xrm.Page.getControl("mycontactfield").addPreSearch(filterContactNameLookup);
    }
    
}

function filterContactNameLookup() {

    var customerID = Xrm.Page.getAttribute('mycustomerlookupfield').getValue()[0].id;
    var filter = "<condition attribute='parentcustomerid' operator='eq' value='" + customerID + "' />";
    Xrm.Page.getControl("mycontactfield").addCustomFilter(filter, "account");

}

I’ve been scratching my head at why this doesn’t work, and the only thing I can think of is that the first function – main – would be executed as part of the form’s OnLoad event, whereas filterContactNameLookup is only triggered at the point at which the lookup control is selected. It’s therefore highly possible that the first instance of the customerID is unobtainable by the platform at this stage, meaning that you have to get the value again each time the lookup control is interacted with. Another possible explanation is that the first example passes the result of calling filterContactNameLookup(customerID[0].id) – rather than a function reference – into addPreSearch, leaving no valid handler to fire when the lookup is eventually used. If anyone else can figure out what’s going on here or confirm whether this is a bug or not with Dynamics CRM 2016 Update 1, then do please let me know in the comments below.

Conclusions or Wot I Think

It could be argued quite strongly that the examples shown in this article have little or no practical use if you are approaching your CRM/D365E implementation from a purely functional point of view. Going back to my earlier example, it is surely a lot less hassle, and less error-prone, to implement a solution using a mix of out of the box functionality within the application. The problem you may eventually find with this is that the solution becomes cumbersome and, frankly, undecipherable when someone is coming into your system cold. As with anything, a balance should be struck on all occasions and, with a bit of practical knowledge of how to write JScript functionality (something that any would-be CRM expert should have stored in their arsenal), you can put together a solution that is relatively clean from a coding point of view, but also benefits from utilising some great functionality built into the application.