When it comes to technology learning, it can often feel as if you are fighting against a constant wave of change, as studying is outpaced by the introduction of new technical innovations. Fighting the tide is often the most desirable outcome to work towards, but it is understandable why individuals choose to specialise in a particular technology area. There is no doubt some comfort in becoming a subject matter expert and in not having to worry about “keeping up with the Joneses”. However, when working with an application such as Dynamics 365 for Customer Engagement (D365CE), I would argue it is almost impossible to ignore the wider context of what sits alongside the application, particularly Azure, Microsoft’s cloud computing platform. Being able to understand how the application can be extended via external integrations is typically high on the list of any project requirements, and these integrations often require at least a light-touch Azure involvement. Therefore, the ability to say that you are confident in accomplishing certain key tasks within Azure instantly puts you ahead of others and in a position to support your business/clients more effectively.

Here are 4 good reasons why you should start to familiarise yourself with Azure, if you haven’t done so already, or dedicate some additional time towards increasing your knowledge in an appropriate area:

Dynamics 365 for Customer Engagement is an Azure application

Well…we can perhaps not answer this definitively and say that 100% of D365CE is hosted on Azure (I did hear a rumour that some aspects of the infrastructure were hosted on AWS). Certainly, for instances that are provisioned within the UK, there is ample evidence to suggest this is the case. What can be said with some degree of certainty is that D365CE is an Azure-leveraged application, in that it uses key aspects of the service to deliver various functionality within the application:

  • Azure Active Directory: Arguably the crux of D365CE is the security/identity aspect, all of which is powered using Microsoft’s cloud version of Active Directory.
  • Azure Key Vault: Encryption is enabled by default on all D365CE databases, and the management of encryption keys is provided via Azure Key Vault.
  • Office 365: Similar to D365CE, Office 365 is – technically – an Azure cloud service provided by Microsoft. As both Office 365 and D365CE often need to be tightly knitted together, via features such as Server-Side Synchronisation, Office 365 Groups and SharePoint document management, it can be considered a de facto part of the base application.

It’s fairly evident, therefore, that D365CE can be considered a Software as a Service (SaaS) application hosted on Azure. But why is all this important? For the simple reason that, as a D365CE professional supporting the full breadth of the application and all it entails, you are already an Azure professional by default. Not having even a cursory understanding of Azure and what it can offer will immediately put you at a disadvantage compared to others who do, and increasingly places you in a position where your D365CE expertise is severely blunted.

It proves to prospective employers that you are not just a one-trick pony

When it comes to interviews for roles focused around D365CE, I’ve been on both sides of the table. What I’ve found separates a good D365CE CV from an excellent one boils down to how effectively the candidate has been able to expand their knowledge into other areas. How much additional knowledge of other applications, programming languages etc. does the candidate bring to the business? How effectively has the candidate moved out of their comfort zone in the past in exploring new technologies, either in their current roles or outside of work? More importantly, how much initiative and passion has the candidate shown in embracing change? A candidate who is able to answer these questions positively and can attest to, for example, extensive knowledge of Azure will instantly move up in my estimation of their ability. On the flip side, I believe that interviews that have resulted in a job offer for me have been helped, in no small part, by the additional technical skills that I can make available to a prospective employer.

To get certain things done involving D365CE, Azure knowledge is a mandatory requirement

I’ve talked about one of these tasks before on the blog, namely, how to set up the Azure Data Export solution to automatically synchronise your application data to an Azure SQL Database. Unless you are in the fortunate position of having an Azure-savvy colleague who can assist you with this, the only way you are going to be able to successfully complete this task is to know how to deploy an Azure SQL Server instance, a database for this instance and the process for setting up an Azure Key Vault. Having at least some familiarity with how to deploy simple resources in Azure and accomplish tasks via PowerShell script execution will place you in an excellent position to achieve the requirements of this task, and others besides.
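To give a flavour of what that involves, below is a minimal PowerShell sketch of the sort of deployment the Data Export scenario calls for. The resource names, location and pricing tier are purely illustrative, and it assumes the AzureRM module is installed and that you have logged in to the correct subscription:

# Sketch only - names, location and pricing tier below are hypothetical; adjust for your environment.
Login-AzureRmAccount

$rg = "MyD365CEExportRG"
New-AzureRmResourceGroup -Name $rg -Location "UK South"

# Logical SQL Server and a database to receive the exported D365CE data
New-AzureRmSqlServer -ResourceGroupName $rg -ServerName "myd365ce-sql" `
    -Location "UK South" -SqlAdministratorCredentials (Get-Credential)
New-AzureRmSqlDatabase -ResourceGroupName $rg -ServerName "myd365ce-sql" `
    -DatabaseName "D365CEExport" -RequestedServiceObjectiveName "S0"

# Key Vault, used by the Data Export Service to store the database connection string securely
New-AzureRmKeyVault -VaultName "myd365ce-kv" -ResourceGroupName $rg -Location "UK South"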

The above is just a flavour of some of the things you can do with D365CE and Azure together, and there are doubtless many more I have missed 🙂 The key point I would highlight is that you should not naively assume that D365CE is sealed off from Azure; in fact, often the clearest and cleanest way of achieving more complex business/technical requirements will require a detailed consideration of what can be built out within Azure.

There’s really no good reason not to, thanks to the wealth of Azure training resources available online

A sea change seems to be under way at Microsoft with respect to online documentation/training resources. Previously, TechNet and MSDN would be your go-to resources to find out how something Microsoft-related works. Now, the Microsoft Docs website is where you can find the vast majority of technical documentation. I really rate the new experience that Microsoft Docs provides, and there now seems to be a concerted effort to ensure that these articles are clear, easy to follow and include end-to-end steps on how to complete certain tasks. This is certainly the case for Azure and, with this in mind, I defy anyone to find a reasonable excuse not to begin reading through these articles. They are the quickest way towards expanding your knowledge within an area of Azure that interests you the most or to help prepare you to, for example, set up a new Azure SQL database from scratch.

For those who learn better via visual tools, Microsoft has also greatly expanded the number of online video courses available for Azure that can be accessed for free. There are also some excellent “deep-dive” topic areas that can be used to help prepare you for Azure certification.

Conclusions or Wot I Think

I use the term “D365CE professional” a number of times throughout this post. This is perhaps an unhelpful label to ascribe to anyone working with D365CE today. A far better title is, I would argue, “Microsoft cloud professional”, as this gets to the heart of what I think anyone who considers themselves a D365CE “expert” should be. Building and supporting solutions within D365CE is by no means an isolated experience, as you might have argued a few years back. Rather, the onus is on ensuring that consultants, developers etc. are as multi-faceted as possible from a skillset perspective. I talked previously on the blog about becoming a Swiss Army knife in D365CE. Whilst this is still a noble and recommended goal, I believe casting the net wider can offer a number of benefits, not just for yourself, but for the businesses and clients you work with every day. It puts you centre-forward in being able to offer the latest opportunities to implement solutions that can increase efficiency, reduce costs and deliver positive end-user experiences. And, perhaps most importantly, it means you can confidently and accurately attest to your wide-ranging expertise in any given situation.

The world of database security and protection can be a difficult path to tread at times. I often find myself having to adopt a “tin-foil hat” approach, obsessing over the smallest potential vulnerability through which a database could be compromised. This thought process can be considered easy compared with any protective steps that need to be implemented in practice, as these can often prove to be mind-bogglingly convoluted. This is one of the reasons why I like working with Microsoft Azure and features such as Azure SQL Database Firewall Rules. They present a familiar means of securing your databases to specific IP address endpoints and are not inordinately complex to approach; just provide a name and a Start/End IP range and hey presto! Your client/application can communicate with your database. The nicest thing about them is that the feature is enabled by default, meaning you don’t have to worry about designing and implementing a solution to protect your database from unauthorised access at the outset.

As alluded to above, Database Firewall Rules are added via T-SQL code (unlike Server Rules, which can be specified via the Azure portal), using syntax that most SQL developers should feel comfortable using. If you traditionally prefer to design and build your databases using a Visual Studio SQL Database project, however, you may encounter a problem when looking to add a Database Firewall rule to your project. There is no dedicated template item that can be used to add this to the database. In this eventuality, you would have to look at setting up a Post-Deployment Script or Pre-Deployment Script to handle the creation of any requisite rules you require. Yet this can present the following problems:

  • Visual Studio will be unable to provide you with the basic syntax to create the rules.
  • Related to the above, Intellisense support will be limited, so you may struggle to identify errors in your code until it is deployed.
  • When deploying changes out to your database, the project will be unable to successfully detect (and remove) any rules that are deleted from your project.

The last one could prove to be particularly cumbersome if you are tightly managing the security of your Azure SQL database. Putting aside the obvious risk of someone forgetting to remove a rule as part of a deployment process, you would then have to manually remove the rules by connecting to your database and executing the following T-SQL statement:

EXECUTE sp_delete_database_firewall_rule 'MyDBFirewallRule'

Not the end of the world by any stretch, but if you are using Visual Studio as your deployment method for managing changes to your database, then having to do this step seems a little counter-intuitive. Fortunately, with a bit of creative thinking and utilisation of more complex T-SQL functionality, we can get around the issue by developing a script that carries out the following steps in order:

  • Retrieve a list of all current Database Firewall Rules.
  • Iterate through the list of rules and remove them all from the database.
  • Proceed to re-create the required Database Firewall Rules from scratch.

The second step involves the use of a T-SQL feature that I have traditionally steered away from using – Cursors. This is not because they are bad in any way, but because a) I have previously struggled to understand how they work and b) I had never found a good scenario in which they could be used. The best way of understanding them is to put on your C# hat for a few moments and consider the following code snippet:

string[] array = new string[] { "Test1", "Test2", "Test3" };

foreach (string s in array)
{
    Console.WriteLine(s);
}

To summarise how the above works, we take our collection of values – Test1, Test2 and Test3 – and carry out a particular action against each; in this case, printing out their value to the console. This, in a nutshell, is how Cursors work, and you have a great deal of versatility in what action you take during each iteration of the “loop”.

With a clear understanding of how Cursors work, the below script that accomplishes the aims set out above should hopefully be a lot clearer:

DECLARE @FirewallRule NVARCHAR(128)

-- Declare a cursor over the names of all existing Database Firewall Rules
DECLARE REMOVEFWRULES_CURSOR CURSOR
	LOCAL STATIC READ_ONLY FORWARD_ONLY
FOR
SELECT DISTINCT [name]
FROM sys.database_firewall_rules

-- Iterate through each rule and remove it from the database
OPEN REMOVEFWRULES_CURSOR
FETCH NEXT FROM REMOVEFWRULES_CURSOR INTO @FirewallRule
WHILE @@FETCH_STATUS = 0
BEGIN
	EXECUTE sp_delete_database_firewall_rule @FirewallRule
	PRINT 'Firewall rule ' + @FirewallRule + ' has been successfully deleted.'
	FETCH NEXT FROM REMOVEFWRULES_CURSOR INTO @FirewallRule
END
CLOSE REMOVEFWRULES_CURSOR
DEALLOCATE REMOVEFWRULES_CURSOR

GO

-- Re-create the required Database Firewall Rules from scratch
EXECUTE sp_set_database_firewall_rule @name = N'MyDBFirewallRule1',
		@start_ip_address = '1.2.3.4', @end_ip_address = '1.2.3.4';

EXECUTE sp_set_database_firewall_rule @name = N'MyDBFirewallRule2',
		@start_ip_address = '1.2.3.4', @end_ip_address = '1.2.3.4';
		

To integrate this as part of your existing database project, add a new Post-Deployment Script file and modify the above to reflect your requirements. As the name indicates, the script will run after all other aspects of your solution deployment have been completed. Now, the key caveat to bear in mind with this solution is that, during deployment, there will be a brief period of time where all Database Firewall Rules are removed from the database. This could potentially cause existing database connections to drop, or new connections to fail altogether. You should take care when using the above code snippet within a production environment, and I would recommend you look at an alternative solution if your application/system cannot tolerate even a second of downtime.

Perhaps one of the most useful features at your disposal when working with Azure SQL Databases is the ability to integrate your Azure Active Directory (Azure AD) login accounts, à la Windows Authentication for on-premises SQL Server. There are numerous benefits in shifting away from SQL Server-only user accounts in favour of Azure AD:

  • Ensures consistent login identities across multiple services.
  • Can enforce password complexity and refresh rules more easily.
  • Once configured, they behave exactly the same as standard SQL Server-only logins.
  • Supports advanced usage scenarios involving Azure AD, such as multi-factor authentication and Single Sign-On (SSO) via Active Directory Federation Services (ADFS).

Setup can be completed fairly quickly, although you will need to designate a single user, or group of users, as the Active Directory admin for the Azure SQL Server. You should also take due care when choosing your Active Directory admin(s); one suggestion would be to use a dedicated service account with a strong password for the Active Directory admin, instead of granting such extensive privileges to normal user accounts.
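For reference, the admin designation can also be scripted. Below is a minimal PowerShell sketch, where the resource group, server name and admin display name are all hypothetical; it assumes the AzureRM module is installed and that you are already logged in:

# Sketch only - substitute your own resource group, server and admin account/group display name.
Set-AzureRmSqlServerActiveDirectoryAdministrator -ResourceGroupName "MySQLRG" `
    -ServerName "my-sql-server" -DisplayName "SQL AD Admin Service Account"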

Regardless of how you go about configuring the feature, I would recommend using it wherever you can, both for internal purposes and for anyone who wishes to access your SQL Server from an external directory. This second scenario is, you may be surprised to hear, fully supported. It assumes, first off, that you have added this account to your directory as a Guest/External User account. Then, you just follow the normal steps to get the account created on your Azure SQL Server.

There is one major “gotcha” to bear in mind when doing this. Let’s assume that you have added john.smith@domain.co.uk to the Azure AD tenant test.onmicrosoft.com. You then go to set up this account to access a SQL Server instance on the tenant. You will more than likely receive an error when using the example syntax below to create the account:

CREATE USER [john.smith@domain.co.uk] FROM EXTERNAL PROVIDER

The issue is, thankfully, simple to understand and fix. When external user accounts are added to your Azure AD tenant, despite keeping the same login name from their source directory, they are stored in the new directory with a different UserPrincipalName (UPN). Consider the above example – the UPN in the source directory would be as follows:

john.smith@domain.co.uk

Whereas, as the Azure AD tenant name in this example is test.onmicrosoft.com, the UPN for the object would be:

john.smith_domain.co.uk#EXT#@test.onmicrosoft.com

I assume that this is done to prevent any UPN duplication across Microsoft’s no-doubt dizzying array of cloud Active Directory tenants and forests. In any event, knowing this, we can adjust our code above to suit – and successfully create our Database user account:

CREATE USER [john.smith_domain.co.uk#EXT#@test.onmicrosoft.com] FROM EXTERNAL PROVIDER
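If you are ever unsure of the exact UPN that has been generated for a guest account, one quick way of confirming it is to query the directory via PowerShell. The sketch below reuses the illustrative account name from above and assumes the AzureRM module is installed and connected to the correct tenant:

# Sketch only - the search string reflects the hypothetical guest account used above.
Get-AzureRmADUser -SearchString "john.smith" |
    Select-Object DisplayName, UserPrincipalName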

I guess this is one of those things where having at least a casual awareness of how other technologies within the Microsoft “stack” work can assist you greatly in troubleshooting what turn out to be simple errors in your code. Frustrating all the same, but we can claim knowledge of an obscure piece of Azure AD trivia as our end result 🙂

I would not recommend setting up the Windows Server Active Directory Domain Services role for the first time flying blind. Whilst not necessarily classing myself as a “newbie” in this respect, I have only run through this a few times in the past within lab environments. The process is always tricky – not just in deploying the role in the first instance, but more via the many quirks that get thrown up as you try to accomplish what should be simple tasks, such as domain-joining devices or getting DNS settings correctly mapped out. These and other tiresome rat races can leave you with a severely scratched head and distract you from your ultimate goal.

For this reason, you could argue that Azure Active Directory Domain Services (AADDS) is the perfect solution for “newbies”. The thing I like the most about it is that a lot of the hassle I refer to above is something you will never see sight of, thanks to the fact that Microsoft manages most aspects of the deployment behind the scenes. In addition, the step-by-step guides available on the Microsoft Docs website provide a very clear, no-nonsense guiding hand through every step of an AADDS rollout. What this ultimately means is that you can spend more time on achieving your end goal and reduce the need for extensive administration of the solution following its rollout. Having said that, it is always useful to ensure that you have tested any solution as extensively as possible for your particular scenario, as this will always throw up potential issues or useful tips to remember in future; AADDS is no exception to this rule.

Having recently worked closely with AADDS, I wanted to use this week’s blog post to share some of my detailed thoughts regarding the solution in practice and a few things to remember if you are checking it out for the first time.

Resetting AADDS User passwords could become the bane of your existence

If you are creating an AADDS resource in isolation from any existing identity providers you have in place (i.e. there is no requirement to use Azure AD Connect with an on-premises domain controller), then be aware that you will have to set a user’s password twice before they will be able to log in to the domain. Microsoft explains why this is better than I can:

To authenticate users on the managed domain, Azure Active Directory Domain Services needs credential hashes in a format that’s suitable for NTLM and Kerberos authentication. Azure AD does not generate or store credential hashes in the format that’s required for NTLM or Kerberos authentication, until you enable Azure Active Directory Domain Services for your tenant. For obvious security reasons, Azure AD also does not store any password credentials in clear-text form. Therefore, Azure AD does not have a way to automatically generate these NTLM or Kerberos credential hashes based on users’ existing credentials…If your organization has cloud-only user accounts, all users who need to use Azure Active Directory Domain Services must change their passwords

Source: https://docs.microsoft.com/en-us/azure/active-directory-domain-services/active-directory-ds-getting-started-password-sync

The above article goes into the required steps that need to be followed for each user account that is created and, at the time of writing, I do not believe there is any way of automating this process. Whether you choose to complete these steps yourself or get your end users to do so instead is up to you, but there is a good chance that if a user is experiencing login issues with an AADDS account, then the steps in the above article have not been followed correctly.

Make sure you’re happy with your chosen DNS Name

When first creating your Domain Services resource, you need to be pretty certain your desired DNS domain name will not be subject to change in the future. After some fruitless digging around on the portal and an escalated support request to Microsoft, I was able to confirm that there is no way this can be changed after the Domain Services resource is deployed; your only recourse is to recreate the resource from scratch with your newly desired DNS domain name. This could prove to be problematic if, say, you wish to change the domain name of your in-development Domain Services resource from the default onmicrosoft.com domain to a bespoke one…after you have already joined Virtual Machines to your new domain :/ Some efficient use of the Azure templates feature can save you some aggro here, but not if you have already expended considerable effort on bespoke customisation of each of your VMs’ operating systems.

Be aware of what’s supported…and what isn’t

Microsoft has published a few articles that can help you to determine whether AADDS is right for your particular scenario.

Whilst these are invaluable and, admittedly, demonstrate the wide array of features contained within AADDS, there are still a few hidden “gotchas” to be aware of. The articles hint towards some of these:

  • Only one AADDS resource is allowed per Azure AD tenant. You will need to configure a clean Active Directory tenant (and therefore a separate Azure portal directory) for any additional AADDS resource you wish to set up, which also requires an appropriate subscription for billing. This could add ever-growing complexity to your Azure footprint.
  • AADDS is a continually billable service. Unlike VMs, which can be set to a Stopped (deallocated) status at any time and therefore not incur any usage charges, your Domain Services resource will incur fees as soon as you create it, and these only cease when the resource is deleted.

One unsupported feature that the above articles do not provide any hint towards is Managed Service Accounts. Introduced as part of Windows Server 2008 R2, these provide a more streamlined means of managing service accounts for applications running on Windows Server, reducing the requirement to maintain passwords for these accounts and allowing administrators to grant domain-level privileges to essential service account objects. I try to use them whenever I can in conjunction with SQL Server installations, particularly if the service accounts for SQL need to access network-level resources that are secured via a security group or similar, and I would encourage you to read up on them further to see if they could be of help within your SQL Server deployments.

Back to the topic at hand – if you attempt to create a Managed Service Account via PowerShell, you will receive an error message saying that they are not supported within the domain. So, assuming that you want to go ahead and deploy SQL Server on an AADDS-joined VM, you would have to revert to using standard Active Directory user accounts for your service accounts to achieve the same functionality. Not great if you also have to enforce password refresh policies, but it would be the only supported workaround in this situation.
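To illustrate, the sort of attempt that will fail looks something like the sketch below. The account and DNS host names are hypothetical, and it assumes the RSAT Active Directory module is available on a domain-joined management VM:

# Sketch only - account and DNS host names are hypothetical.
Import-Module ActiveDirectory

# On an AADDS managed domain, this returns an error indicating that Managed Service
# Accounts are not supported, so a standard user account has to be used for the
# SQL Server services instead.
New-ADServiceAccount -Name "SQLSvcMSA" -DNSHostName "sqlsvc.mydomain.co.uk" -Enabled $true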

Conclusion or Wot I Think

When reviewing potential new IT vendors or products, I always try to judge “how dirty-handed” I would need to get with them. What I mean by this is the level of involvement I, or a business, would need to invest in managing physical server hardware, backend elements of the infrastructure or any aspect of the solution whose innards require an inordinate amount of time poking around in to troubleshoot basic problems. The great benefit of services such as Azure is that a lot of this pain is taken away by default – for example, there is no need to manage server, firewall and networking hardware at all, as Microsoft does this for you. AADDS goes a step further by removing the need to manage the server aspect of a Domain Services deployment, allowing you to focus more on building out your identities and integrating them within your chosen application. Whilst it does need some work to reach an acceptable level of parity with a “do-it-yourself” Domain Server (for example, extensive PowerShell support for the completion of common tasks), the service is still in a sufficiently developed and user-friendly state to warrant further investigation – particularly if you have a simple Active Directory domain in place or are looking to migrate across to Active Directory from another vendor. £80 per month for a directory smaller than 25,000 objects is also not an exorbitant price to pay, so I would definitely recommend you check AADDS out to see if it could be a good fit for your organisation/application in the near future.

In the early days of Office 365, you would be accustomed to a certain kind of experience when purchasing licenses as a small/medium-sized business (SMB) customer. As these types of organisations are typically too small to warrant the cost of Enterprise Agreements or Volume Licensing, your only recourse for buying Office 365 services was via the portal itself. At this time, any involvement with a Microsoft Partner would be minimal or even non-existent. Partners would only enter the picture if a business opted to grant them Partner of Record status, thereby allowing them to manage aspects of your subscription behind the scenes. This process, while wholly sufficient, did have some notable gaps and, if you were an organisation focused on tightly managing all aspects of your customer journey, could be prone to change or interruption via interference from Microsoft.

The Cloud Solution Provider programme (or CSP for short) is Microsoft’s “next-generation” opportunity for partners to get directly involved in a customer’s journey onto Office 365, Dynamics 365 and/or Azure, and aims to resolve some of the issues highlighted above. Instead of turning directly to Microsoft when you need a new license subscription or have an issue with a particular Office 365 service, customers instead deal directly with a CSP provider, who will be able to offer them all of this, and more. Everyone wins – the customer, the CSP provider and Microsoft itself – and here are just a few reasons why:

Moving to CSP can save you money

Perhaps the most important reason of all to consider CSP 🙂 The price you will pay for pretty much every Office 365 subscription offering available via Microsoft directly will be lower if purchased from a CSP partner. In most cases, this will typically mean a guaranteed 10-15% reduction across the board; a figure which, depending on the size of your organisation, could be a significant portion of your per annum IT spend. In addition, there is no need to wait for your subscription anniversary to switch – any early cancellation charges will be credited in full by Microsoft, should you cancel at any point in your subscription and migrate to the equivalent CSP subscription.

CSP users can benefit from special promotions, previews and other deals unavailable via Microsoft Direct

One example of this at the moment is the preview for Microsoft 365 Business – the next evolutionary step for Office 365 – which is accessible to those who are working with CSP providers currently. Other promotions may also appear from time to time, so you should be speaking to your CSP provider regularly to ensure that they are informing you of any potential discounts or offers available.

CSP enables your current Microsoft Partner to support you better

If you are working with a Microsoft Partner to help support your Office 365 services or Dynamics 365 deployments, there’s a good chance that they may have spoken to you about CSP or migrated you across already. The reasons for this will not be based purely on an altruistic desire to reduce your monthly running costs; by having their customers operating under CSP licensing, Partners are granted additional information regarding your subscriptions and their usage. They also become the de facto organisation that needs to be contacted if any issues occur relating to the subscription. A customer who, for example, contacts Microsoft directly regarding an Exchange Online issue on their CSP subscription will instead be referred back to the CSP Partner in the first instance; the Partner will then be responsible for escalating the case to Microsoft if required. In most circumstances, this can surely be seen as a plus, helping Partners to work more closely with their customers.

The above example also goes some way towards explaining why CSP license prices are cheaper compared to going directly to Microsoft. By placing Partner organisations at the front line of dealing with common 1st/2nd line support issues, Microsoft can reduce the number of support professionals it allocates internally and place the burden instead on Microsoft Partners to do the “heavy lifting”, particularly when it comes to dealing with easy-to-resolve issues (i.e. any support request that can be resolved via the Administration Centre).

It’s for Azure as well…with some caveats

Chances are if you are using Office 365 within your organisation, then you will also be consuming some additional Azure services on top of this – either Virtual Machine(s), storage, websites or even some database capacity. The good news is that you can also look at moving your Azure subscriptions across to CSP, with the same benefits available: reduced monthly costs and the ability for your partner of choice to support you better.

At the time of writing this post, the key “red flag” I would draw you towards when considering Azure CSP is what you lose compared to a Pay As You Go or other direct subscription. For example, ongoing and previous usage history will not be visible on the Azure portal and, chances are, you will only get full visibility of your Azure usage costs at the time when you are billed by your CSP partner. If you typically prefer to micro-manage your ongoing Azure usage costs, then a 10-15% saving may not be a fair trade-off for losing this visibility.

Finally, don’t be surprised if this becomes the de facto way of buying Office 365/Azure in the future if you are an SMB

I mentioned above one reason why CSP license costs are cheaper, and through this, you can begin to see the writing on the wall for SMBs. This is not necessarily a bad thing. My own personal preference would be dealing with a Microsoft Partner as opposed to Microsoft directly, as Partners will generally be a lot more flexible and reactive to work with. Assuming that Microsoft can generate significant internal cost savings and also give eager Partner organisations the opportunity to fill the void, why would they not then turn round and say, “We’re sorry, but if you are an organisation that employs 300 people or less, then please speak to a Microsoft Partner for further assistance.”? Certainly, the vibe and talk around CSP at the moment would seem to indicate that this is the long-term trajectory for the programme. Watch this space, but it will be interesting to see whether the Microsoft Direct route is downplayed or removed completely in future if your potential license order is only in the hundreds of pounds per annum.