Exams are always something that you end up worrying about to an obsessive degree. The thought of being placed on the spot and expected to demonstrate your knowledge of a particular subject can be daunting to even the most knowledgeable individuals. Technology exams, such as Microsoft certifications, can arguably be the worst of all; the level of detailed technical knowledge that you are expected to know off the top of your head can seem almost impossible, particularly for those who, like me, are heavily reliant on our friend of old! The pace of technological advancement only complicates this further and, when you are working with solutions as fast-paced as Dynamics 365 for Enterprise (D365E), the pace is almost marathon-like. New features are added regularly to the application, and this invariably leads to new exam content and accreditations to match. The introduction of an MCSA and MCSE for D365E is arguably one of the more welcome recent changes, giving those looking to showcase their knowledge an enhanced way of doing so.

I have previously reviewed the new exams in more detail on the blog and, having been through the process and successfully obtained my MCSA and MCSE, I can speak generally about the experience and hopefully guide those who are looking at sitting the exams in the near future. This week’s post provides some general guidance on how you can best prepare for the exams, an overview of the new badge sharing platform, Acclaim, and my general thoughts on the whole experience.

Disclaimer

Per the terms of the Exam NDA, this post will not make reference to any specific exam content; rather, it will discuss broadly what areas you should focus on to get that passing grade. Any discussion or question relating to exam content will be deleted from the comments section.

The journey to Accreditation may seem somewhat upside down

To achieve your MCSA, and eventually your MCSE, Microsoft suggests a recommended route through the exams. Although you are free to pass them in any order you wish, it is perhaps strange that, if you follow the prescribed route, you will learn about, and be tested on, how to customise, configure and manage Dynamics 365 before discovering what the application can offer natively. Many of the features available in D365E may very well speed along a deployment, so it is important to always remember the plethora of existing functionality available within the application and not accidentally over-customise when there is no need to.

Don’t underestimate the need to revise…

As with any new exam, the Skills Measured list is updated to reflect features freshly introduced as part of the exam’s targeted release. If you have not yet had experience with how these work, then I would highly recommend working through the e-learning courses available on the Dynamics Learning Portal in the first instance (some of which, incidentally, are also available on Imagine Academy), targeting yourself towards a) new features and b) functionality that you have the least experience in. With regards to “What’s New” with D365E, I would recommend brushing up on the following subjects as part of your revision:

There’s a good chance, based on each exam’s specification, that questions on all the above topics could appear on multiple exams, so be sure to prepare yourself for this.

…but realise that hands-on experience is the best route towards passing an exam.

Simply watching the e-learning courses or reading about new functionality is never sufficient when revising. You should ideally have a D365E tenant available to you as part of your revision (trials can be set up in a pinch) and be working through features of the application as you are learning. The above e-learning courses include labs, which are an excellent starting point; from there, you should very much work through setting up features from scratch, navigating around the interface and understanding what various buttons/actions do. You may surprise yourself and discover something about the application that you previously overlooked; something which happens to me all the time!

Be sure to set up your Acclaim account after passing

One of the nifty new perks of passing an exam this year is the introduction of Acclaim, which aims to provide a simplified mechanism for collating your various accreditations across multiple vendors in a fun way. Upon passing your first exam, you will be sent an email within 24 hours to let you know your badge is waiting to be claimed:

To accept the badge, you will need to set up an account with Acclaim. After this, all subsequent achievements will be saved to the same account, enabling you to build up your “Acclaim transcript” as you pass more exams. The social features of the application are varied and quite nice, meaning that you can quickly share the news of your exam pass with friends, colleagues and family members at the click of a button; at the time of writing, LinkedIn, Twitter and Facebook are supported. Finally, you can download images of your badges and include them as part of job applications/CVs, helping them stand out more visually.

If you are interested in finding out more about Acclaim, then you can check out my profile to get a feel for what it can offer you. Suffice it to say, having a straightforward means of sending potential customers/employers a website link to my Acclaim profile as opposed to an entire Microsoft transcript would undoubtedly simplify things. Now, if only you could get physical badges that you could stick on your bag… 🙂

Conclusions or Wot I Think

My own personal journey towards obtaining my first MCSA and MCSE has been challenging and rewarding in equal measure. It feels really good to know that D365E has a proper set of accreditations that individuals can aspire towards obtaining, and which exemplify the position of the application alongside other, well-known Microsoft solutions. That is not to say that exams are a definitive means of judging your expertise with a particular product; an exam fail may simply indicate a spur-of-the-moment, misjudged answer or a lack of revision for a particular new feature. The effort involved may put some people off attempting an exam, but it is important that you are not daunted in any way. I would readily encourage people who have a passion for D365E to put aside any concerns and not delay in working towards passing each of the exams. By doing so, you can proactively demonstrate your commitment towards D365E and the zeal that you have for it, giving those around you the confidence that you can not just talk the talk, but walk the walk as well.

The very recent Microsoft Data Amp event provided an excellent forum for the SQL Server 2017 announcement, which is due to be released at some point this year. Perhaps the most touted feature of the new version is that it can be installed on Linux; an entirely inconceivable premise 10 years ago, which just goes to show how far Microsoft have changed in their approach to supporting non-Windows platforms as standard. Past the big headline announcements, there is a lot to look forward to under the hood with SQL Server 2017 that may act as encouragement for organisations looking to upgrade in the near future.

In this week’s post, I’ll be taking a closer look at the 3 new features I am most looking forward to, which are present within the SQL Server Community Technology Preview (CTP) 2.0 and which will form part of the SQL Server 2017 release later this year.

Power BI in SSRS: A Match Made in Heaven

This is by far the feature I am most looking forward to seeing in action. I have been working more and more with Power BI this year, often diving into the deep end in respect of what can be achieved with the product, and I have been impressed with how it can be used to address reporting scenarios that SSRS may struggle with natively. The announcement earlier this year that Power BI would be included as part of SSRS in the next major release of the product was, therefore, incredibly welcome, and its place in SQL Server 2017 is confirmed by the presence of Power BI reports in the CTP 2.0 release.

For those who are already familiar with Power BI, there is thankfully not much that you need to learn to get up and running with Power BI in SSRS. One thing to point out is that you will need to download a completely separate version of the Power BI Desktop App to allow you to deploy your Power BI reports to SSRS. I would hope that this is mitigated once SQL Server 2017 is released, so that we can deploy from a single application to either Online or SSRS 2017. Users who are experienced with the existing Power BI Desktop application should have no trouble using the equivalent product for SSRS, as they are virtually identical.

The actual process of deploying a Power BI report is relatively straightforward. After making sure that you have installed the SSRS Power BI Desktop Application, you can then navigate to your SSRS homepage and select + New -> Power BI Report:

You will be greeted with a prompt similar to the below and the Power BI Desktop application will open automatically:

Now it’s time to build your report 🙂 As an example, I have used the WideWorldImporters sample database to build a simple Power BI report:

If you were working with Power BI online, then this would be the stage where you would click the Publish button to get it onto your online Power BI tenant. The option to deploy to your SSRS instance is currently missing from the Power BI in SSRS application; instead, you will need to manually upload your .pbix file into Reporting Services via the Upload button. Once uploaded, your report will be visible on the home page and can be navigated to in the usual manner:

Simplified CSV Importing

Anyone who has at least some experience working with databases and application systems should have a good overview of the nuances of delimited flat file types – in particular, Comma Separated Value (.csv) files. This file type is generally the de facto format when working with data exported from systems and, more often than not, will be the most common file type that you will regularly need to import into a SQL Server database. Previously, if you didn’t opt to use the Import Wizard/.dtsx package to get your .csv file imported, you would have to rely on a script like the following example:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH
	(
		FIELDTERMINATOR = ',',
		ROWTERMINATOR = '\n'
	)

Now, with SQL Server 2017, you can simplify your query by replacing FIELDTERMINATOR and ROWTERMINATOR with a new FORMAT parameter that specifies the file format we are concerned with:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH (FORMAT = 'CSV');

Whilst the overall impact on your query length is somewhat negligible, it is nice that a much more simplified means of accomplishing a common database task has been introduced, and that we now also have the option of accessing Azure Blob Storage locations for import files.
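To illustrate the Azure Blob Storage scenario, a minimal sketch is shown below; the external data source name, container URL and database scoped credential are all hypothetical, and the credential would need to be created beforehand with a valid Shared Access Signature:

--Both objects below are hypothetical examples; substitute your own storage account details
CREATE EXTERNAL DATA SOURCE AzureBlobImport
WITH
	(
		TYPE = BLOB_STORAGE,
		LOCATION = 'https://mystorageaccount.blob.core.windows.net/imports',
		CREDENTIAL = AzureBlobCredential --A pre-existing database scoped credential holding a SAS token
	);

BULK INSERT dbo.TestTable
FROM 'Test.csv' --Path is relative to the LOCATION of the external data source
WITH (DATA_SOURCE = 'AzureBlobImport', FORMAT = 'CSV');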

Updated Icons for SSMS

Typically, as part of any major update to an application, the “under the hood” visual side of things generally does not change much. A good example of this can be found within CRM Online/Dynamics 365 for Enterprise in the Customizations area of the application, which has not seen much of a facelift since CRM 2011. As a result, a lot of the icons can look inconsistent with the application as a whole. As these are generally the areas of the application that we use the most day in, day out, it can be a little discouraging not to see them get any love or attention as part of a major update… 🙁

With this in mind, it is pleasing to see that the updated SSMS client for SQL Server 2017 has been given refreshed icons that bring the application more in line with how Visual Studio and other Microsoft products look these days. Below is a comparison screenshot of SSMS 2014 and SSMS 2017:

Conclusions or Wot I Think

Whilst there is a lot more to look forward to with the new release that is not covered in this post (for example, the enhancements to R Server and deeper integration with AI tools), I believe that the most exciting and important announcement for those with their Business Intelligence/Reporting hats on is the introduction of Power BI into SSRS. Previously, each tool was well suited to a specific reporting purpose – SSRS was great for designing reports that require a lot of visual tailoring and widely common export formats, whereas Power BI is more geared towards real-time, dashboard views that marry together disparate data sources in a straightforward way. By being able to leverage SSRS to fully utilise Power BI reports, the application suddenly becomes a lot more versatile and the potential for combining functionality becomes a lot more recognisable. So, for example, having the ability to drill down to an SSRS report from a Power BI report would be an excellent way of providing reporting capabilities that satisfy end-user consumption in 2 different, but widely applicable, scenarios.

In summary, the SQL Server 2017 release looks to be very much focused on bringing the product up to date with the new state of play at Microsoft, successfully managing to achieve cross-platform requirements alongside bringing exciting functionality (that was previously cloud-only) into the hands of organisations who still have a requirement to run their database systems on their own on-premise infrastructure. I’m eagerly looking forward to the release later this year and to seeing the product perform in action. 🙂

Back in the days of Dynamics CRM 2016 and earlier, one of the major benefits of opting for an Online subscription versus an On-Premise license was the Dual-Usage rights granted to your organisation. This meant that, so long as your On-Premise Server installation was licensed, your individual Online User CALs would also be licensed like for like for CRM On-Premise. So, if you had 5 CRM Online Professional licenses on Office 365, you were also covered for the equivalent 5 On-Premise Professional user CALs.

The Dynamics 365 for Enterprise (D365E) release took this offer a step further by also including the coveted Server user license as part of this offer. The official licensing guide for D365E elaborates on this further:

One of the advantages of Microsoft Dynamics 365 is the option to deploy either in Microsoft’s cloud or in a private on-premises or partner-hosted cloud. In some cases, customers may want to deploy both modes simultaneously, for migrating a Microsoft Dynamics on-premises deployment to Microsoft Dynamics 365, running private Dev/Test deployments in Microsoft Azure. With Dual Use Rights, Microsoft Dynamics users licensed with the required User SL do not need to acquire CALs to access Server instances. Users or devices licensed with Dynamics 365 SLs have use rights equivalent to a CAL for the purpose of accessing equivalent on-premise workloads. With Microsoft Dynamics 365 the server license is included with the SLs.

Thanks to this change, organisations can look to fully realise their dual-usage ambitions with D365E, without having to pay a single additional penny – nice! 🙂

When I first learned about the above, my first question was “Great! How do I get my license key for D365E on-premise?”. Unfortunately, the documentation available online is not 100% clear on which route you need to take to get your license key. Attempting to get an answer directly from Microsoft Online support can sometimes lead to you being redirected to the above licensing guide, which does not set out clearly how the offer works or provide a step-by-step account of how to claim your license key. I was recently involved in attempting to obtain a dual-usage license key for an organisation, so I thought I would share my experiences as part of this week’s blog post and provide a straightforward guide for others who may find themselves in the same boat.

Before you Begin…

The route you have to traverse will be dictated significantly by the method by which your organisation obtained its Online D365E licenses. It could be that your business has:

  • Ordered your licenses directly on the Office 365 portal.
  • Purchased your online subscription through a Microsoft Partner.
  • Obtained a redemption key for your subscription via a Volume License agreement or similar with Microsoft.

So before you set out on your dual-usage license key quest, verify how your organisation originally obtained your D365E licenses. Assuming you have this information to hand, please see below for a summary that covers each license purchase route:

If you purchased your licenses via a Cloud Solutions Provider agreement (i.e. directly with a Microsoft partner)…

Then your license key should be viewable within your CustomerSource profile page for your organisation, underneath the Product And Service Summary Section.

If you purchased your licenses via a Microsoft Products and Services Agreement…

Your license key should be viewable within your Microsoft Business Centre page.

If you purchased your licenses via an Enterprise/Volume License Agreement

Log into the Volume Licensing Service Centre and, underneath your list of Products, you should see your product and corresponding license key.

If you purchased your licenses directly via Office 365 and have a partner of record for your subscription…

You should reach out to them directly and they can then log a support request with Microsoft’s Pre-Sales team. The turn-around for this will generally be a couple of days, and at the end of it, you should be emailed your license key.

If you purchased your licenses directly via Office 365 and DO NOT have a partner of record for your subscription…

Then I believe you will need to log a support request directly with Microsoft Online support to obtain the license key information. I am unable to confirm whether this will work successfully or not, so I would be interested in hearing from anyone in the comments below if this works.

Getting D365E On-Premise Installed

This is essentially a two-step process. Given that D365E was not a major application release, there is no dedicated installer available for the product; instead, you will need to install Dynamics CRM Server 2016 and then download and install the December 2016 update to bring the application version up to 8.2. All of the usual pre-requisites for a CRM On-Premise install apply – you will need a SQL Server instance deployed, an Active Directory to authenticate against and a compatible version of Windows Server. The full list of requirements can be viewed on TechNet.

Conclusions or Wot I Think

The expansion of the Dual Usage offering as part of D365E is a welcome and highly beneficial development. Assuming your organisation already has existing infrastructure in place that supports an On-Premise deployment, you can begin to mitigate the extra costs of additional sandbox instances on Office 365 by quickly setting up local D365E organisations on the On-Premise version of the application. I think there is definitely some work to be done around the process of obtaining the license key in the first instance – for example, perhaps a button/link in the Dynamics 365 Administration Centre that lets you view your On-Premise license key or fill out a request form – but the very fact that organisations are able to take advantage of this at all is one of the reasons why D365E is starting to move ahead of the competition within the ERP/CRM application space.

When you have spent any length of time working with Dynamics CRM Online/Dynamics 365 for Enterprise (D365E) data programmatically, you become accustomed to how Option Set, State and Status Reason values are presented to you in code. To explain, the application does not store your Option Set display names within the SQL Server entity tables; rather, the Option Set Value that has been specified alongside your Label is what is stored, as an integer value. That is why you are always required to provide both values within the application:

The following benefits are realised as a result of how this is set up:

That being said, when working with these field types in code, you do always have to have the application window open or a list of all Labels/Values to hand so that you don’t get too confused… 🙂

I have previously extolled the virtues of the Data Export Service on the blog, and explained why you should consider it if you have basic integration requirements for your CRM/D365E deployment. One area in which it differs from other products on the market is how it handles the field types discussed above. For example, when exporting data to a SQL database via Scribe Online, new columns are created alongside each such field, containing the “Display Name” (i.e. label value) that corresponds to each Option, Status and Status Reason Label. So, by running the following query against a Scribe export database:

SELECT DISTINCT statecode, statecode_displayname
FROM dbo.account

We get the best of both worlds – the underlying statecode values and their display names – all in 2 lines of code:

This is a big help, particularly when you are then using the data as part of a report, as no additional transformation steps are required and your underlying SQL query can be kept as compact as possible.

The Data Export Service differs from the above in an understandable way, as display name values for Status, Status Reason and Option Set column values are instead segregated out into their own separate table objects in your Azure SQL database:

  • OptionSetMetadata
  • GlobalOptionSetMetadata
  • StateMetadata
  • StatusMetadata

Why understandable? If you consider how the application can support multiple languages, then you realise that this can also apply to metadata objects across the application – such as field names, view names and, wouldn’t you know it, Labels too. So when we inspect the OptionSetMetadata table, we can see that the table structure accommodates the storing of labels in multiple languages via the LocalizedLabelLanguageCode field:
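For example, a quick query like the one below (a sketch; the results assume that at least one additional language pack has been provisioned on the instance) would return one row per language code for each Option:

--One LocalizedLabel row is stored per Option, per provisioned language
SELECT EntityName, OptionSetName, [Option], LocalizedLabel, LocalizedLabelLanguageCode
FROM dbo.OptionSetMetadata
WHERE EntityName = 'account'
ORDER BY OptionSetName, [Option], LocalizedLabelLanguageCode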

Unlike the Scribe Online route above (which I assume only retrieves the Labels that correspond to the user account that authenticates with CRM/D365E), the Data Export Service becomes instantly more desirable if you are required to build multi-language reports referencing CRM/D365E application data.

The issue that you have to reconcile yourself with is that your SQL queries, if expressed as natively as possible, instantly become a whole lot more complex. For example, to achieve the same results as the query above, it would have to be adapted as follows for the Data Export Service:

SELECT DISTINCT statecode, LocalizedLabel
FROM dbo.account
 LEFT JOIN dbo.StateMetadata
  ON EntityName = 'account'
  AND [State] = statecode
  AND LocalizedLabelLanguageCode = 1033

The above is a very basic example, but if your query is complex – and involves multiple Option Set fields – then you would have to resort to using Common Table Expressions (CTEs) to accommodate each potential JOIN required to get the information you want.
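To give a flavour of what this looks like, here is a hedged sketch that adds just one additional Option Set (the account entity’s industrycode field, assuming English labels are required) to the earlier query:

--Each additional Option Set needs its own CTE (or JOIN) against the relevant metadata table
WITH StateLabels AS
	(
		SELECT [State], LocalizedLabel
		FROM dbo.StateMetadata
		WHERE EntityName = 'account' AND LocalizedLabelLanguageCode = 1033
	),
IndustryLabels AS
	(
		SELECT [Option], LocalizedLabel
		FROM dbo.OptionSetMetadata
		WHERE EntityName = 'account' AND OptionSetName = 'industrycode' AND LocalizedLabelLanguageCode = 1033
	)
SELECT DISTINCT a.statecode, sl.LocalizedLabel AS statecode_displayname,
	a.industrycode, il.LocalizedLabel AS industrycode_displayname
FROM dbo.account AS a
 LEFT JOIN StateLabels AS sl
  ON a.statecode = sl.[State]
 LEFT JOIN IndustryLabels AS il
  ON a.industrycode = il.[Option]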

In these moments, we can look at some of the wider functionality provided as part of SQL Server to develop a solution that keeps things as simple as possible and, in this particular instance, a user-defined function is an excellent candidate to consider. Functions enable you to perform complex operations against the database platform and encapsulate them within very simply expressed objects that can also accept parameters. A good thing about functions is that they can return either table objects or scalar (i.e. single) values.

Using a scalar function, we can therefore remove some of the complexity of returning Option Set, Status and Status Reason labels by creating a function that returns the correct label based on the input parameters it receives. You could look at creating a “master” function that, based on the input parameters, queries the correct metadata table for the information you need; but in this example, we are going to create a function for each type of field – Status, Status Reason, Option Set and Global Option Set.

To do this, connect up to your Data Export Service database and open up a new query window, ensuring that the context is set to the correct database. Paste the following code in the window and then hit Execute:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

--Create Function to return Global Option Set Labels

CREATE FUNCTION [dbo].[fnGetGlobalOptionSetLabel]
(
	@GlobalOptionSetName NVARCHAR(64), --The logical name of the Global Option Set
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.GlobalOptionSetMetadata WHERE OptionSetName = @GlobalOptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		--Deliberately CAST a string to INT to force a conversion error back to the caller
		SET @Label = CAST('An error has occurred. Could not obtain label for Global Option Set field ' + @GlobalOptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Option Set Labels

CREATE FUNCTION [dbo].[fnGetOptionSetLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Option Set field
	@OptionSetName NVARCHAR(64), --The logical name of the Option Set field
	@Option INT, --The option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.OptionSetMetadata WHERE EntityName = @EntityName AND OptionSetName = @OptionSetName AND [Option] = @Option AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain label for Option Set field ' + @OptionSetName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Labels

CREATE FUNCTION [dbo].[fnGetStateLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status field
	@State INT, --The Status option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StateMetadata WHERE EntityName = @EntityName AND [State] = @State AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain State label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

--Create Function to return Status Reason Labels

CREATE FUNCTION [dbo].[fnGetStatusLabel]
(
	@EntityName NVARCHAR(64), --The Entity logical name that contains the Status Reason field
	@Status INT, --The Status Reason option value to retrieve
	@LanguageCode INT --The Language of the label to retrieve. English is 1033. A full list of supported languages (correct as of June 2015) can be found here: https://abedhaniyah.blogspot.co.uk/2015/06/list-of-supported-languages-by.html
)
RETURNS NVARCHAR(256)
AS
BEGIN

	DECLARE @Label NVARCHAR(256);
	DECLARE @RecordCount INT = (SELECT COUNT(*) FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	IF @RecordCount = 1
		SET @Label = (SELECT TOP 1 LocalizedLabel FROM dbo.StatusMetadata WHERE EntityName = @EntityName AND [Status] = @Status AND LocalizedLabelLanguageCode = @LanguageCode);
	ELSE
		SET @Label = CAST('An error has occurred. Could not obtain Status label for entity ' + @EntityName AS INT);
	RETURN @Label;

END

GO

This will then go off and create the functions listed in the code, which should then show up under the Programmability folder of your SQL database:

For those who are unsure what the SQL code is doing, it first attempts to determine whether exactly 1 Label can be found for the appropriate field type, based on the parameters provided. If it is successful, then the Label is returned; otherwise, the CAST function is designed to force an error back to the caller, indicating that either no Label or more than 1 Label was found. In most cases, this will indicate a typo in the parameters you have specified.
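You can see this error behaviour for yourself by passing a value that does not exist (a contrived example; the Option value 999999 below is assumed not to be present):

--Assuming no statecode of 999999 exists, this fails with a 'Conversion failed' error
SELECT dbo.fnGetStateLabel('account', 999999, 1033);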

As with anything, the best way to see how something works is in practice! So, returning to our previous example in this post, we would utilise the dbo.fnGetStateLabel function as follows to return the correct label in English:

SELECT DISTINCT statecode, dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname
FROM dbo.account

With our results returning as follows:

Now we can expose this through our reports and not worry about having to do any kind of transformation/lookup table to get around the issue 🙂
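The other functions work in exactly the same way. For instance, here is a sketch that extends the earlier query to also return the Status Reason label via dbo.fnGetStatusLabel (again assuming English labels are required):

SELECT DISTINCT statecode, dbo.fnGetStateLabel('account', statecode, 1033) AS statecode_displayname,
	statuscode, dbo.fnGetStatusLabel('account', statuscode, 1033) AS statuscode_displayname
FROM dbo.account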

Attempting to keep things as simple as possible by encapsulating complex functionality into simple, clearly expressed functions is an excellent way of ensuring that code is kept as streamlined as possible; it also ensures that other colleagues can accomplish complex tasks, even if they do not have in-depth knowledge of Transact-SQL.

Although CRM Online/Dynamics 365 for Enterprise (D365E) provides a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot just connect straight up to the application’s SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data into a format that best pleases you, or even take the plunge and get a developer to write a bespoke application that satisfies your integration requirements.

With the recent D365E release and in line with Microsoft’s longstanding approach to customer data within their applications (i.e. “It’s yours! So just do what you want with it!”), the parallel introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product’s arsenal. By using the service, relatively straightforward integration requirements can be satisfied in a pinch, and a lot of the headache involved in setting up a backup of your organisation’s databases/LOB reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using the tool is not going to break the bank either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free*, add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point to make, given that the cost difference per month can amount to hundreds of pounds.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you’re doing, you can get all of the above setup very quickly; in some cases, less than 10 minutes if you opt to script out the entire deployment via PowerShell.

For your D365E setup, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign into an external provider, you may also need to verify that your web browser pop-up settings/firewall are configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync: to clarify, this is basically asking whether you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities’ dependent records have not yet been synced across (for example, any Opportunity record that references the Account named Test Company ABC Ltd. will not sync until this Account record has been exported successfully). So, rather than attempting to interrogate the error logs straight away, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved; my experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. I anticipate that the fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring a smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay appears to be very small indeed – often tens of minutes, if not less. This therefore makes the Data Export Service an excellent candidate for a backup/primary reporting database, satisfying any requirements that cannot be achieved via FetchXML alone.
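One way of sanity-checking this behaviour (a speculative sketch, assuming the SinkCreatedOn/SinkModifiedOn timestamp columns that the service adds to each exported table) is to compare the application’s modifiedon value against the time each row landed in the export database:

--Compare when a record was modified in D365E with when it arrived in the export database
SELECT TOP 10 name, modifiedon, SinkModifiedOn
FROM dbo.account
ORDER BY SinkModifiedOn DESC;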

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records failed to sync across successfully, which, upon closer inspection, is not the case. Hopefully this is something that will get ironed out; it is perhaps due to the rather strange way that the listmember entity behaves when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive replacement. That’s not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In those cases, the names mentioned above will no doubt remain the best services for achieving your requirements in a straightforward way. I also have a few concerns that the setup involved could be a barrier to adoption: as mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself to utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may see the setup process simplified – for example, the Export Profile wizard actually going off and creating all the required resources in Azure when you simply enter your Azure login credentials.

The D365E release has brought a lot of great new functionality and features to the table that have been oft requested and add real benefit to organisations who already use, or plan to use, the application in the future. The Data Export Service is perhaps one of the great underdog features of the release, and is one that you should definitely consider using if you want a relatively smooth-sailing data export experience.