Microsoft Flow is a tool that I increasingly have to bring front and centre when considering how to straightforwardly accommodate certain business requirements. The problem I have had with it, at times, is that there are often notable caveats when attempting to achieve something that looks relatively simple at the outset. A good example of this is the SQL Server connector which, on a headline reading, enables you to trigger workflows when rows are added or changed within a database. Being able to send an email, create a document on OneDrive or even post a Tweet whenever a database record is added or modified instantly has a high degree of applicability across any number of different scenarios. When you read the fine print behind this, however, there are a few things which you have to bear in mind:

Limitations

The triggers do have the following limitations:

  • It does not work for on-premises SQL Server
  • Table must have an IDENTITY column for the new row trigger
  • Table must have a ROWVERSION (a.k.a. TIMESTAMP) column for the modified row trigger

A slightly frustrating side to this is that Microsoft Flow doesn’t intuitively tell you when your table is incompatible with these requirements – contrary to what is stated in the documentation quoted above. Whilst readers of this post may be correct in chanting “RTFM!”, it still would be nice to be informed of any potential incompatibilities within Flow itself. Certainly, this can help in preventing any needless head banging early on 🙂

Getting around these restrictions is fairly straightforward if you have the ability to modify the table you want to interact with using Flow. For example, executing the following script against the MyTable table will get it fully prepped for the service:

ALTER TABLE dbo.MyTable
ADD [FlowID] INT IDENTITY(1,1) NOT NULL,
    [RowVersion] ROWVERSION;

Accepting this fact, there are certain situations where this is not the best option to implement:

  • The database/tables you are interacting with form part of a proprietary application, therefore making it impractical and potentially dangerous to modify table objects.
  • The table in question could contain sensitive information. Keep in mind the fact that the Microsoft Flow service would require service account access with full SELECT privileges against your target table. This could expose a risk to your environment, should the credentials or the service itself be compromised in future.
  • If your target table already contains an inordinately large number of columns and/or rows, then the introduction of additional columns and processing via an IDENTITY/ROWVERSION seed could start to tip your application over the edge.
  • Your target database does not use an integer field and IDENTITY seed to uniquely identify rows, meaning that such a column would need to be (arguably unnecessarily) added.

An alternative approach to consider would be to configure a “gateway” table for Microsoft Flow to access – one which contains only the fields that Flow needs to work with, is linked back to the source table via a foreign key relationship and which uses a database trigger to automate the creation of the “gateway” record. Note that this approach only works if you have a unique row identifier in your source table in the first place; if your table is recording important, row-specific information and this is not in place, then you should probably re-evaluate your table design 😉

Let’s see how the above example would work in practice, using the following example table:

CREATE TABLE [dbo].[SourceTable]
(
	[SourceTableUID] UNIQUEIDENTIFIER PRIMARY KEY NOT NULL,
	[SourceTableCol1] VARCHAR(50) NULL,
	[SourceTableCol2] VARCHAR(150) NULL,
	[SourceTableCol3] DATETIME NULL
)

In this scenario, the table object is using the UNIQUEIDENTIFIER column type to ensure that each row can be…well…uniquely identified!

The next step would be to create our “gateway” table. Based on the table script above, this would be built out via the following script:

CREATE TABLE [dbo].[SourceTableLog]
(
	[SourceTableLogID] INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
	[SourceTableUID] UNIQUEIDENTIFIER NOT NULL,
	CONSTRAINT FK_SourceTable_SourceTableLog FOREIGN KEY ([SourceTableUID])
		REFERENCES [dbo].[SourceTable] ([SourceTableUID])
		ON DELETE CASCADE,
	[TimeStamp] ROWVERSION
)

The use of a FOREIGN KEY here will help to ensure that the “gateway” table stays tidy in the event that any related record is deleted from the source table. This is handled automatically, thanks to the ON DELETE CASCADE option.

The final step would be to implement a trigger on the dbo.SourceTable object that fires every time a record is INSERTed into the table:

CREATE TRIGGER [trInsertNewSourceTableToLog]
ON [dbo].[SourceTable]
AFTER INSERT
AS
BEGIN
	INSERT INTO [dbo].[SourceTableLog] ([SourceTableUID])
	SELECT [SourceTableUID]
	FROM inserted
END

For those unfamiliar with how triggers work, the inserted table is a special object exposed during runtime that allows you to access the values that have been…OK, let’s move on!
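
To illustrate how these pieces fit together, below is a quick smoke test you could run against the objects created above. The values used are purely illustrative:

-- Insert a record into the source table; the trigger should then create a matching "gateway" row.
INSERT INTO [dbo].[SourceTable] ([SourceTableUID], [SourceTableCol1], [SourceTableCol2], [SourceTableCol3])
VALUES (NEWID(), 'Test Value 1', 'Test Value 2', GETUTCDATE());

-- Confirm that the log row has been created automatically.
SELECT [SourceTableLogID], [SourceTableUID], [TimeStamp]
FROM [dbo].[SourceTableLog];

-- Deleting the source record should also remove the log row, courtesy of ON DELETE CASCADE.
DELETE FROM [dbo].[SourceTable];

SELECT COUNT(*) AS [RemainingLogRows]
FROM [dbo].[SourceTableLog];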

With all of the above in place, you can now implement a service account for Microsoft Flow to use when connecting to your database that is sufficiently curtailed in its permissions. This can either be a database user associated with a server level login:

CREATE USER [mydatabase-flow] FOR LOGIN [mydatabase-flow]
	WITH DEFAULT_SCHEMA = dbo

GO

GRANT CONNECT TO [mydatabase-flow]

GO

GRANT SELECT ON [dbo].[SourceTableLog] TO [mydatabase-flow]

GO
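
Note that the script above assumes a server-level login named [mydatabase-flow] already exists. If it does not, something along the following lines (run against the master database, with a suitably strong password in place of the placeholder) would be needed first:

CREATE LOGIN [mydatabase-flow] WITH PASSWORD = 'P@ssw0rd1'

GO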

Or a contained database user account (this would be my recommended option):

CREATE USER [mydatabase-flow] WITH PASSWORD = 'P@ssw0rd1',
	DEFAULT_SCHEMA = dbo

GO

GRANT CONNECT TO [mydatabase-flow]

GO

GRANT SELECT ON [dbo].[SourceTableLog] TO [mydatabase-flow]

GO

From there, the world is your oyster – you can start to implement whatever actions, conditions etc. you require for your particular requirement(s). There are a few additional tips I would recommend when working with SQL Server and Azure:

  • If you need to retrieve specific data from SQL, avoid querying tables directly and instead encapsulate your logic within Stored Procedures (a simple sketch of this is shown after this list).
  • In line with the ethos above, ensure that you always use a dedicated service account for authentication and scope the permissions to only those that are required.
  • If working with Azure SQL, you will need to ensure that you have ticked the Allow access to Azure services option on the Firewall rules page of your server.
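
To expand on the first tip, below is a minimal sketch of what a Stored Procedure for the example tables above might look like. The procedure name and the EXECUTE grant are purely illustrative:

CREATE PROCEDURE [dbo].[usp_GetSourceTableLogRecord]
	@SourceTableLogID INT
AS
BEGIN
	SET NOCOUNT ON;

	-- Return only the columns that Flow actually needs to work with.
	SELECT [SourceTableLogID], [SourceTableUID], [TimeStamp]
	FROM [dbo].[SourceTableLog]
	WHERE [SourceTableLogID] = @SourceTableLogID;
END

GO

-- Grant the Flow service account permission to execute the procedure only.
GRANT EXECUTE ON [dbo].[usp_GetSourceTableLogRecord] TO [mydatabase-flow]

GO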

Despite some of the challenges you may face in getting your databases up to spec to work with Microsoft Flow, this does not take away from the fact that the tool is incredibly effective at integrating disparate services, once you have overcome some initial hurdles at the starting pistol.

In last week’s post, we took a look at how a custom Workflow activity can be implemented within Dynamics CRM/Dynamics 365 for Customer Engagement to obtain the name of the user who triggered the workflow. It may be useful to retrieve this information for a variety of different reasons, such as debugging, logging user activity or automating the population of key record information. I mentioned in that post the “treasure trove” of information that the IWorkflowContext interface exposes to developers. Custom Workflow activities are not unique in exposing execution-specific information – an equivalent interface is at our disposal when working with plug-ins. No prizes for guessing its name – the IPluginExecutionContext.

When comparing both interfaces, some comfort can be found in that they share almost identical properties, thereby allowing us to replicate the functionality demonstrated in last week’s post as a Post-Operation Create step for the Lead entity. The order of work for this is virtually the same:

  1. Develop a plug-in C# class file that retrieves the User ID of the account that has triggered the plugin.
  2. Add supplementary logic to the above class file to retrieve the Display Name of the User.
  3. Deploy the compiled .dll file into the application via the Plug-in Registration Tool, adding on the appropriate execution step.

This approach, as will be demonstrated, is much more focused on working outside of the application; something you may not necessarily be comfortable with. Nevertheless, I hope that the remaining sections will provide enough detail to enable you to replicate it within your own environment.

Developing the Class File

As before, you’ll need to have ready access to a Visual Studio C# Class file project and the Dynamics 365 SDK. You’ll also need to ensure that your project has a Reference added to the Microsoft.Xrm.Sdk.dll. Create a new Class file and copy and paste the following code into the window:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PostLeadCreate_GetInitiatingUserExample : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity)

            {
                Entity lead = (Entity)context.InputParameters["Target"];

                //Use the Context to obtain the Guid of the user who triggered the plugin - this is the only piece of information exposed.
      
                Guid user = context.InitiatingUserId;

                //Then, use the GetUserDisplayName method to retrieve the fullname attribute value for the record.

                string displayName = GetUserDisplayName(user, service);

                //Build out the note record with the required field values: Title, Regarding and Description field

                Entity note = new Entity("annotation");
                note["subject"] = "Test Note";
                note["objectid"] = new EntityReference("lead", lead.Id);
                note["notetext"] = @"This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" + Environment.NewLine + Environment.NewLine + "Executing User: " + displayName;

                //Finally, create the record using the IOrganizationService reference

                service.Create(note);
            }
        }
    }
}

Note also that you will need to rename the namespace value to match against the name of your project.

To explain, the code replicates the same functionality developed as part of the Workflow in last week’s post – namely, create a Note related to a newly created Lead record and populate it with the Display Name of the User who triggered the plugin.

Retrieving the User’s Display Name

After copying the above code snippet into your project, you may notice a squiggly red line underneath the GetUserDisplayName method call:

GetUserDisplayName is a custom method that needs to be added in manually and is the only way in which we can retrieve the Display Name of the user, as this is not returned as part of the IPluginExecutionContext. We, therefore, need to query the User (systemuser) entity to return the Full Name (fullname) field, which we can then use to populate our newly created Note record. The method is provided below and should be placed after the closing brace of the Execute method, but before the final two closing braces of the class and namespace:

private string GetUserDisplayName(Guid userID, IOrganizationService service)
{
    Entity user = service.Retrieve("systemuser", userID, new ColumnSet("fullname"));
    return user.GetAttributeValue<string>("fullname");
}

Deploy to the application using the Plug-in Registration Tool

The steps involved in this do not differ greatly from what was demonstrated in last week’s post, so I won’t repeat myself 🙂 The only thing you need to make sure you do after you have registered the plug-in is to configure the plug-in Step. Without this, your plug-in will not execute. Right-click your newly deployed plug-in on the main window of the Registration Tool and select Register New Step:

On the form that appears, populate the fields/values indicated below:

  • Message: Create
  • Primary Entity: Lead
  • Run in User’s Context: Calling User
  • Event Pipeline Stage of Execution: Post-Operation

The window should look similar to the below if populated correctly. If so, then you can click Register New Step to update the application:

All that remains is to perform a quick test within the application by creating a new Lead record. After saving, we can then verify that the plug-in has created the Note record as intended:

Having compared both solutions to achieve the same purpose, is there a recommended approach to take?

The examples shown in the past two blog posts demonstrate excellently how solutions to specific scenarios within the application can be achieved in differing ways. As clearly evidenced, one could argue that there is a code-heavy (plug-in) and a light-touch coding (custom Workflow assembly) option available, depending on how comfortable you are with working with the SDK. Plug-ins are a natural choice if you are confident working solely within Visual Studio or have a requirement to perform additional business logic as part of your requirements. This could range from complex record retrieval operations within the application to an external integration piece involving specific and highly tailored code. The Workflow path clearly favours those of us who prefer to work within the application in a supported manner and, in this particular example, can make certain tasks easier to accomplish. As we have seen, the act of retrieving the Display Name of a user is greatly simplified when we go down the Workflow route. Custom Workflow assemblies also offer greater portability and reusability, meaning that you can tailor logic that can be applied to multiple different scenarios in the future. Code reusability is one of the key drivers in many organisations these days, and the use of custom Workflow assemblies neatly fits into this ethos.

These are perhaps a few considerations that you should make when choosing the option that fits the needs of your particular requirement, but it could be that the approach you feel most comfortable with ultimately wins the day – so long as this does not compromise the organisation as a consequence, this is an acceptable stance to take. Hopefully, this short series of posts has demonstrated the versatility of the application and the ability to approach challenges with equally acceptable pathways for resolution.

It’s sometimes useful to determine the name of the user account that executes a Workflow within Dynamics CRM/Dynamics 365 for Customer Engagement (CRM/D365CE). What can make this a somewhat fiendish task to accomplish is the default behaviour within the application, which exposes very little contextual information each time a Workflow is triggered. Take, for example, the following simplistic Workflow which creates an associated Note record whenever a new Lead record is created:

The Note record is set to be populated with the default values available to us regarding the Workflow execution session – Activity Count, Activity Count including Process and Execution Time:

We can verify that this Workflow works – and view the exact values of these details – by creating a new Lead record and refreshing the record page:

The Execution Time field is somewhat useful, but the Activity Count & Activity Count including Process values relate to Workflow execution sessions and are only arguably useful for diagnostic review – not something that end users of the application will generally be interested in 🙂

Going back to the opening sentence of this post, if we wanted to develop this example further to include the name of the user who executed the Workflow in the note, we would have to look at deploying a Custom Workflow Assembly to extract this information. The IWorkflowContext interface is a veritable treasure trove of information that can be exposed to developers to retrieve not just the name of the user who triggers a Workflow, but also the time when the corresponding system job was created, the Business Unit it is being executed within and information to determine whether the Workflow was triggered by a parent. There are three steps involved in deploying custom code into the application for utilisation in this manner:

  1. Develop a CodeActivity C# class file that performs the desired functionality.
  2. Deploy the compiled .dll file into the application via the Plugin Registration Tool.
  3. Modify the existing Workflow to include a step that accesses the custom Workflow Activity.

All of these steps will require ready access to Visual Studio, a C# class plugin project (either a new one or an existing one) and the CRM SDK that corresponds to your version of the application.

Developing the Class File

To begin with, make sure your project includes References to the following Frameworks:

  • System.Activities
  • Microsoft.Xrm.Sdk
  • Microsoft.Xrm.Sdk.Workflow

Add a new Class (.cs) file to your project and copy & paste the below code, overwriting any existing code in the window. Be sure to update the namespace value to reflect your project name:

using System.Activities;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

namespace D365.Demo.Plugins
{
    public class GetWorkflowInitiatingUser : CodeActivity
    {
        protected override void Execute(CodeActivityContext executionContext)
        {
            IWorkflowContext workflowContext = executionContext.GetExtension<IWorkflowContext>();
            CurrentUser.Set(executionContext, new EntityReference("systemuser", workflowContext.InitiatingUserId));
        }

        [Output("Current User")]
        [ReferenceTarget("systemuser")]
        public OutArgument<EntityReference> CurrentUser { get; set; }
    }
}

Right-click your project and select Build. Verify that no errors are generated and, if so, then that’s the first step done and dusted 🙂

Deploy to CRM/D365CE

Open up the Plugin Registration Tool and connect to your desired instance. If you are deploying an existing, updated plugin class, then right-click it on the list of Registered Plugins & Custom Workflow Activities and click Update; otherwise, select Register -> Register New Assembly. The same window opens in any event. Load the newly built assembly from your project (can be located in the \bin\Debug\ folder by default) and ensure the Workflow Activity entry is ticked before selecting Register Selected Plugins:

After registration, the Workflow Activity becomes available for use within the application; so time to return to the Workflow we created earlier!

Adding the Custom Workflow Activity to a Process

By deactivating the Workflow Default Process Values Example Workflow and selecting Add Step, we can verify that the Custom Workflow Assembly is available for use:

Select the above, making sure first of all that the option Insert Before Step is toggled (to ensure it appears before the already configured Create Note for Lead step). It should look similar to the below if done correctly:

Now, when we go and edit the Create Note for Lead step, we will see a new option under Local Values which, when selected, brings up a whole range of different fields that correspond to fields from the User entity. Modify the text within the Note to retrieve the Full Name value and save it onto the Note record, as indicated below:

After saving and reactivating the Workflow, we can verify it is working by again creating a new Lead record and refreshing to review the Note text:

All working as expected!

The example shown in this post has, on its own, very limited usefulness in a practical business scenario, but the same technique could prove useful in other circumstances:

  • If your Workflow contains branching logic, then you can test whether the Workflow has been executed by a specific user and perform bespoke logic based on this value.
  • Records can be assigned to other users/teams, based on who has triggered the Workflow.
  • User activity could be recorded in a separate entity for benchmarking/monitoring purposes.

It’s useful to know that the same kind of functionality can also be deployed when working with plug-ins in the application. We will take a look at how this works as part of next week’s blog post.

I took some time out this week to head down to Microsoft’s Reading offices for the November CRMUG meeting. There is often a whole host of reasons that can be conjured up to excuse yourself from events like this – “I’m too busy at work!”, “It’s such a long way away!” etc. – but, ultimately, it’s always best to make the effort and get involved. The theme of the day was around Awareness of your CRM system, which was neatly kicked off by a short presentation from Microsoft on the current roadmap for Dynamics 365 for Customer Engagement (D365CE). There was a clear emphasis towards GDPR on some of the available presentation tracks, a topic that regular readers of the blog should be well prepared for, I hope 🙂 Another key aspect of the day was networking, with ample opportunities to meet new people and to understand their current journey involving CRM/D365CE. Here are my thoughts on the sessions I attended, along with some closing remarks on why these types of events are always beneficial.

Accelerate GDPR with Microsoft Cloud

The first talk I attended was all about GDPR from Microsoft’s perspective. The session was co-led by David Hirst and David Reid from Microsoft and did a really good job in setting out the GDPR stall for the uninitiated, as well as offering some pointers towards Microsoft solutions/technologies that may prove beneficial towards achieving compliance. There were also some fascinating anecdotal pieces, such as, for example, the story of a UK based pub chain who has decided to completely remove all customer email address data from their systems, presumably with GDPR in mind. An extreme, but arguably pragmatic, approach.

The talk came across as refreshingly candid, with a real, demonstrable attempt at portraying a concerted effort behind the scenes at Microsoft to ensure that they – and their entire product range – are prepared for GDPR. Microsoft is not just talking the talk when it comes to GDPR (which, to be frank, can often result in a lot of scaremongering by other companies), but is instead providing current and new customers with the tools and information they need to streamline their route towards compliance. The key takeaway from the session, which was borne out by some of the Q&As at the end, is that it’s naive to assume that technology companies like Microsoft can provide a “silver bullet” solution to deal with all of your GDPR woes. Organisations need to go away and do a lot of the hard work when it comes to determining the type of data they hold across the entire organisation, whether the source of consent for this could be considered risky, and to implement the appropriate business processes and technological helper tools to make dealing with things such as subject access requests as simple as possible.

What is GDPR and how it impacts your business and your Dynamics 365 solutions, Get Ready for your new legal obligations.

The next talk was (again!) on GDPR, presented by CRM MVP Mohamed Mostafa and focused specifically on GDPR in the context of D365CE. Mohamed’s talk was very focused, assisted by some great visual aids, and he also presented some interesting examples of how you can leverage existing application features to help you towards GDPR compliance. Plenty of food for thought!

One area mentioned by Mohamed in his presentation where I perhaps disagree with him (sorry!) is the emphasis placed on the massive fine figures that are often quoted when it comes to GDPR. A heavy focus towards this does, in my view, present a degree of scaremongering. This is confirmed by the fact that Elizabeth Denham, the Information Commissioner, has gone public herself on the whole issue and cautioned businesses to be wary of the “massive fines” narrative. I agree with her assessment that fines should always be considered a “last resort” in targeting organisations that have demonstrated a wilful disregard for their obligations in handling personal data. My experience with the ICO on a personal level backs this up, and I have always found them to be fair and proportionate when dealing with organisations who are trying to do the best they can. GDPR presents a real opportunity for organisations to get to grips with how they handle their personal data, and I encourage everyone to embrace it and to make the necessary changes to accommodate it. But, by the same token, organisations should not be panicked into a narrative that causes them to adopt unnecessary technologies under the whole “silver bullet” pretence.

What’s new in Dynamics 365 9.0

To date, I have not had much of a chance to play around in detail with version 9.0 of D365CE. For this reason, MVP Sarah Critchley’s talk ranked highly on the agenda for me. Sarah’s enthusiasm for the application is infectious, and she covered a wide breadth of the more significant new features that can be found in the new version of the application, including (but not limited to):

  • Presentation changes to the Sitemap
  • Introduction to Virtual Entities and how to set them up
  • Changes to the mobile application

Sarah framed all of the changes with a before/after comparison to version 8.2 of the application, thereby allowing the audience to contextualise the changes a lot better. What I liked best about the whole presentation is that it scratched beneath the surface to highlight less noticeable changes that may have a huge impact for end users of the application. Attention was paid to the fact that the web application refresh is now a fully mobile-responsive template, meaning that it adjusts automatically to suit a mobile or tablet device screen size. Another thing which I didn’t know about the new Virtual Entities feature is that they can be used as Lookup fields on related entities. This immediately expands their versatility, and I am looking forward to seeing how the feature develops in the future.

Implementing a Continuous Integration Strategy with Dynamics 365

I’ll admit that I went into the final talk of the day with Ben Walker not 100% sure what to expect, but walked away satisfied that it was perhaps the most underrated session of the day 🙂 Ben took us through his journey of implementing a continuous integration strategy (translation: testing throughout the development process and automating the deployment process) for CRM 2015 in his current role, and he should be proud of what he has achieved in this respect. Ben showed the room a number of incredibly useful developer tidbits, such as:

  • The ability to export CRM/D365CE solution information into Visual Studio and then sync up to a Git repository.
  • Deep integration of unit testing, via the FakeXrmEasy framework.
  • The ability to trigger automated builds in TFS after a code check-in, which can then be used to push out solution updates into a dev/test environment automatically, with the additional option of allowing somebody else to approve the deployment before it starts.

Ben has been able to bring all of the above together into an end-to-end process which looks almost effortless in its construction. The main caveat from the whole session, it has to be said, is that Ben is primarily working within an on-premise CRM environment and is using tools which may not be fully supported with online versions of the application. For example, the ability to deploy Solution updates via PowerShell is definitely not supported by D365CE online. Despite this, Ben’s presentation should have left everyone in the room with enough things to go away with, research and implement to make their CRM/D365CE development more of a breeze in future.

Conclusions or Wot I Think

This was my first time attending a CRMUG meeting, and I am glad that I finally found the time to do so. As Sarah highlighted at the start of the day, the key benefit of the whole event is the opportunity to network, and I had ample opportunity to meet face-to-face some of my CRM heroes, as well as others working in the industry. It can often feel like a lonely journey working with applications like CRM/D365CE, particularly if you are working as part of a small team within your business. Events such as these very much bring you together with other like-minded individuals, who will happily talk to you non-stop about their passion for the application and technology. And, because the events are closely supported by Microsoft, Tuesday’s meeting allowed for lots of authoritative information to come to the fore throughout the entire day. I am very much looking forward to attending my next CRMUG meeting and would urge anyone with at least a passing interest in the world of CRM/D365CE to consider attending their next local meeting.

When it comes to technology learning, it can often feel as if you are fighting against a constant wave of change, as studying is outpaced by the introduction of new technical innovations. Fighting the tide is often the most desirable outcome to work towards, but it is understandable why individuals choose to specialise in a particular technology area. There is no doubt some comfort in becoming a subject matter expert and in not having to worry about “keeping up with the Joneses”. However, when working with an application such as Dynamics 365 for Customer Engagement (D365CE), I would argue it is almost impossible to ignore the wider context of what sits alongside the application, particularly Azure, Microsoft’s cloud services platform. Being able to understand how the application can be extended via external integrations is typically high on the list of any project requirements, and often these integrations require a light-touch Azure involvement, at a minimum. Therefore, the ability to say that you are confident in accomplishing certain key tasks within Azure instantly puts you ahead of others and in a position to support your business/clients more straightforwardly.

Here are 4 good reasons why you should start to familiarise yourself with Azure, if you haven’t done so already, or dedicate some additional time towards increasing your knowledge in an appropriate area:

Dynamics 365 for Customer Engagement is an Azure application

Well…we can perhaps not answer this definitively and say that 100% of D365CE is hosted on Azure (I did hear a rumour that some aspects of the infrastructure were hosted on AWS). Certainly, for instances that are provisioned within the UK, there is ample evidence to suggest this to be the case. What can be said with some degree of certainty is that D365CE is an Azure-leveraged application. This is because it uses key aspects of the service to deliver various functionality within the application:

  • Azure Active Directory: Arguably the crux of D365CE is the security/identity aspect, all of which is powered using Microsoft’s cloud version of Active Directory.
  • Azure Key Vault: Encryption is enabled by default on all D365CE databases, and the management of encryption keys is provided via Azure Key Vault.
  • Office 365: Similar to D365CE, Office 365 is – technically – an Azure cloud service provided by Microsoft. As both Office 365 and D365CE often need to be tightly knitted together, via features such as Server-Side Synchronisation, Office 365 Groups and SharePoint document management, it can be considered a de facto part of the base application.

It’s fairly evident, therefore, that D365CE can be considered a Software as a Service (SaaS) application hosted on Azure. But why is all this important? For the simple reason that, because as a D365CE professional you will be supporting the full breadth of the application and all it entails, you are already an Azure professional by default. Not having at least a cursory understanding of Azure and what it can offer will immediately put you at a disadvantage compared to others who do, and increasingly places you in a position where your D365CE expertise is severely blunted.

It proves to prospective employers that you are not just a one trick pony

When it comes to interviews for roles focused around D365CE, I’ve been on both sides of the table. What I’ve found separates a good D365CE CV from an excellent one all boils down to how effectively the candidate has been able to expand their knowledge into other areas. How much additional knowledge of other applications, programming languages etc. does the candidate bring to the business? How effectively has the candidate moved out of their comfort zone in the past in exploring new technologies, either in their current role or outside of work? More importantly, how much initiative and passion has the candidate shown in embracing change? A candidate who is able to answer these questions positively and is able to attribute, for example, extensive knowledge of Azure will instantly move up in my estimation of their ability. On the flip side of this, I believe that interviews that have resulted in a job offer for me have been helped, in no small part, by the additional technical skills that I can make available to a prospective employer.

To get certain things done involving D365CE, Azure knowledge is a mandatory requirement

I’ve talked about one of these tasks before on the blog – namely, how to set up the Azure Data Export solution to automatically synchronise your application data to an Azure SQL Database. Unless you are in the fortunate position of having an Azure-savvy colleague who can assist you with this, the only way you are going to be able to successfully complete this task is to know how to deploy an Azure SQL Server instance, a database on this instance and the process for setting up an Azure Key Vault. Having at least some familiarity with how to deploy simple resources in Azure and accomplish tasks via PowerShell script execution will place you in an excellent position to achieve the requirements of this task, and others like it.

The above is just a flavour of some of the things you can do with D365CE and Azure together, and there are doubtless many more I have missed 🙂 The key point I would highlight is that you should not just naively assume that D365CE is containerised away from Azure; in fact, often the clearest and cleanest way of achieving more complex business/technical requirements will require a detailed consideration of what can be built out within Azure.

There’s really no good reason not to, thanks to the wealth of resources available online for Azure training.

A sea change seems to be occurring at Microsoft with respect to online documentation/training resources. Previously, TechNet and MSDN would be your go-to resources to find out how something Microsoft-related works. Now, the Microsoft Docs website is where you can find the vast majority of technical documentation. I really rate the new experience that Microsoft Docs provides, and there now seems to be a concerted effort to ensure that these articles are clear, easy to follow and include end-to-end steps on how to complete certain tasks. This is certainly the case for Azure and, with this in mind, I defy anyone to find a reasonable enough excuse not to begin reading through these articles. They are the quickest way towards expanding your knowledge within an area of Azure that interests you the most or to help prepare you to, for example, set up a new Azure SQL database from scratch.

For those who learn better via visual tools, Microsoft has also greatly expanded the number of online video courses available for Azure, which can be accessed for free. There are also some excellent, “deep-dive” topic areas that can be used to help prepare you for Azure certification.

Conclusions or Wot I Think

I use the term “D365CE professional” a number of times throughout this post. This is perhaps an unhelpful label to ascribe to anyone working with D365CE today. A far better title is, I would argue, “Microsoft cloud professional”, as this gets to the heart of what I think anyone who considers themselves a D365CE “expert” should be. Building and supporting solutions within D365CE is by no means an isolated experience, as you might have argued a few years back. Rather, the onus is on ensuring that consultants, developers etc. are as multi-faceted as possible from a skillset perspective. I talked previously on the blog about becoming a Swiss Army knife in D365CE. Whilst this is still a noble and recommended goal, I believe casting the net wider can offer a number of benefits not just for yourself, but for the businesses and clients you work with every day. It puts you centre-forward in being able to offer the latest opportunities to implement solutions that can increase efficiency, reduce costs and deliver positive end-user experiences. And, perhaps most importantly, it means you can confidently and accurately attest to your wide-ranging expertise in any given situation.