The biggest headache when managing any database system is enforcing data quality and consistency across the entire dataset. This can range from ensuring that field values are entered correctly through to enforcing rules that prevent duplicate records from even touching the periphery of your database. If you are using an application like CRM Online/Dynamics 365 for Enterprise (D365E), then the built-in functionality within these systems can assist with this, via the use of Option Sets for field values and handy features such as Duplicate Detection Rules. If you have developed your own system, then all due thought should be directed towards ensuring that your system can adequately enforce data quality at the client side, wherever possible. Doing this early on can save you a lot of work down the line.

If the backend database system you are using is SQL Server, then, fortunately, the SQL standard (and, specifically, Transact-SQL or T-SQL, Microsoft’s implementation of the standard) has some tricks up its sleeve that are worth noting. CONSTRAINTs are database objects that can be set up to enforce…well, constraints on the values that are allowed within a particular field. There are a number of different CONSTRAINTs at our disposal, but the most commonly used ones are:

  • PRIMARY KEY CONSTRAINT: Typically used to indicate a column that will always contain a unique value in the database, to facilitate working with a specific row of data in a table. Any field type can be set up as a PRIMARY KEY, but it is generally recommended to use an Integer (INT) or Globally Unique Identifier field (UNIQUEIDENTIFIER), more generally referred to as a GUID.
  • FOREIGN KEY CONSTRAINT: FOREIGN KEYs are used to indicate a field within a database that is a reference to another table within the database. Values entered into this field must match against a record within the related table, and you can configure a wide range of options for how records referencing the “parent” record behave when that related record is modified or removed in the database. Those coming from a CRM/D365E background can grasp this better by realising that these are essentially lookup fields as part of a one-to-many (1:N) relationship.
  • DEFAULT CONSTRAINT: On some occasions, when a value is not entered into a column, you may need to ensure that something is put into the field. This is particularly the case when you are working with NOT NULL fields, which always require a value. A DEFAULT CONSTRAINT gets around this issue by allowing you to specify an initial value for the column, should an INSERT statement not specify one.
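To put the three together, here is a minimal sketch of all of the above in action. The table and column names are purely illustrative and not taken from any real system:

```sql
-- Hypothetical example: Customer/CustomerOrder names are illustrative only
CREATE TABLE [dbo].[Customer]
(
	[CustomerID]	UNIQUEIDENTIFIER	NOT NULL,
	CONSTRAINT PK_Customer PRIMARY KEY ([CustomerID]),
	[CustomerName]	VARCHAR(100)	NOT NULL,
	-- DEFAULT CONSTRAINT: stamp the creation date if the INSERT omits it
	[CreatedOn]	DATETIME	NOT NULL
		CONSTRAINT DF_Customer_CreatedOn DEFAULT (GETUTCDATE())
)

GO

CREATE TABLE [dbo].[CustomerOrder]
(
	[OrderID]	INT	IDENTITY(1,1) NOT NULL,
	CONSTRAINT PK_CustomerOrder PRIMARY KEY ([OrderID]),
	[CustomerID]	UNIQUEIDENTIFIER	NOT NULL,
	-- FOREIGN KEY CONSTRAINT: orders must reference an existing customer,
	-- and are removed automatically if the parent customer is deleted
	CONSTRAINT FK_CustomerOrder_Customer FOREIGN KEY ([CustomerID])
		REFERENCES [dbo].[Customer] ([CustomerID]) ON DELETE CASCADE
)
```

The ON DELETE CASCADE clause is just one of the “parent behaviour” options mentioned above; NO ACTION, SET NULL and SET DEFAULT are also available.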

W3Schools have a handy list that covers all possible CONSTRAINTs within SQL, but be sure to cross-reference this with the relevant Microsoft documentation! As an implementation of the standard, T-SQL can have a few nice – but sometimes surprising – differences that you should be aware of 🙂 The upside of all of this is that, if you need to protect your database column values against erroneous entries, then a CHECK CONSTRAINT is your first port of call. What’s even better is that these can be set up rather straightforwardly to ensure, for example, a field is only allowed to conform to a specific set of values.

A practical illustration is the best way to demonstrate the ease – and potential pitfall – you may hit when working with CHECK CONSTRAINTs containing a large list of potential values. Let’s say you want to create a table with a field – TestField – that should only ever accept the values A, B or C. Your CREATE TABLE script would look something like this:

CREATE TABLE [dbo].[Test]
(
	[TestField] CHAR(1)	NULL,
	CONSTRAINT CHK_TestField CHECK ([TestField] IN ('A', 'B', 'C'))
)

This works well within the following situations:

  • You are working with a handful of values that need to be checked – ideally no more than a dozen.
  • You can guarantee that the list of values will not be subject to frequent changes.

If your situation sits at the opposite end of the parameters specified above, you may assume that the best way to build a sustainable solution is via a dedicated lookup table within your database. The idea is that the list of values required for the CONSTRAINT can be managed in bulk, updated/removed via common T-SQL statements, and it also saves you from maintaining particularly long-winded table scripts within your database. The following script will create a lookup table that records the fields you require CONSTRAINTs for (assuming this is the case; this can be removed at your discretion) and the values that need checking:

CREATE TABLE [dbo].[lkp_Test] 
(
	[lkpTestID]	INT	IDENTITY(1,1) NOT NULL,
	CONSTRAINT PK_lkp_Test_lkpTestID PRIMARY KEY CLUSTERED ([lkpTestID]),
	[FieldName] VARCHAR(100)	NOT NULL,
	[Value]		VARCHAR(100)	NOT NULL
)
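With the table in place, the list of permitted values can then be maintained via ordinary T-SQL statements. For example, to seed the three values from our earlier scenario:

```sql
-- Populate the lookup table with the values the CONSTRAINT should allow
INSERT INTO [dbo].[lkp_Test] ([FieldName], [Value])
VALUES	('TestField', 'A'),
	('TestField', 'B'),
	('TestField', 'C');
```

Adding a new permitted value later is a single-row INSERT, rather than a DROP and re-ADD of the CONSTRAINT itself.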

Those who have good, but not extensive, experience with T-SQL may next assume that we can modify our CHECK constraint to directly query the table, similar to the below:

--To recreate an existing Constraint, it has to be dropped and recreated

ALTER TABLE [dbo].[Test]
DROP CONSTRAINT CHK_TestField;

GO

ALTER TABLE [dbo].[Test]
ADD CONSTRAINT CHK_TestField CHECK ([TestField] IN (SELECT [Value] FROM [dbo].[lkp_Test] WHERE [FieldName] = 'TestField' AND [TestField] = [Value]));

GO

After executing the command, your next reaction may be confusion as an error is thrown back to you:
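For reference, the error returned in my testing was along the lines of the below (the exact message may vary between SQL Server versions):

```
Msg 1046, Level 15, State 1
Subqueries are not allowed in this context. Only scalar expressions are allowed.
```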

What this means is that there is no way we can define a query within our CONSTRAINT to essentially “look up” the values that are allowed from our table and enforce only the list of approved values. The workaround is to utilise a user-defined function, or UDF. The function will perform the necessary query against our lookup table while returning a single (i.e. scalar) value, which we can then filter against in our CONSTRAINT. Functions are a topic that has been covered previously on the blog and, when used effectively, can help to encapsulate complicated functionality within simple objects that can be referenced via your T-SQL queries. Below is a function that can be set up to achieve the requisite conditions for our scenario:

CREATE FUNCTION dbo.fnGetTestConstraintValues(@FieldName VARCHAR(100), @Value VARCHAR(100))
RETURNS VARCHAR(5)
AS
BEGIN
	IF EXISTS (SELECT [Value] FROM [dbo].[lkp_Test] WHERE [FieldName] = @FieldName AND [Value] = @Value)
		RETURN 'TRUE'
	IF @Value IS NULL
		RETURN 'TRUE'
	RETURN 'FALSE'
END

GO

Let’s break down what this is doing in more detail:

  • When called, you must supply two parameters with values – the name of the field that needs to be checked against (@FieldName) and the value to check (@Value). We’ll see how this works in practice in a few moments.
  • The function will always return a single value – either TRUE or FALSE – thereby ensuring we have a scalar value to interact with.
  • Because our TestField has been defined as NULL, we have to add some additional logic to handle these occurrences. SQL Server will not enforce a CHECK constraint for NULL values when they are entered/updated into a database, but our function does not extend this far. Therefore, unless a record is inserted into our lkp_Test table for our field with a NULL value, NULL entries would cause the CONSTRAINT to fail. By adding in an IF condition to check for NULLs and return a value of ‘TRUE‘ in these instances, NULL values can be entered into the database successfully.

With this function now in place, our CONSTRAINT just needs to be modified to call the function, verifying that TRUE returns; if not, then the field value will not be allowed:

ALTER TABLE [dbo].[Test]
ADD CONSTRAINT CHK_TestField CHECK (dbo.fnGetTestConstraintValues('TestField', [TestField]) = 'TRUE');
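A quick test confirms the behaviour, assuming the lkp_Test table has been populated with rows for the values A, B and C against the TestField field name:

```sql
-- Succeeds: 'A' exists in lkp_Test for TestField
INSERT INTO [dbo].[Test] ([TestField]) VALUES ('A');

-- Succeeds: the IF @Value IS NULL branch of the function returns 'TRUE'
INSERT INTO [dbo].[Test] ([TestField]) VALUES (NULL);

-- Fails with a CHECK constraint violation: 'D' has no matching lookup row
INSERT INTO [dbo].[Test] ([TestField]) VALUES ('D');
```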

Now we can ensure that we are able to specify a subquery within our CONSTRAINT in a supported manner – excellent! 🙂 CONSTRAINTs are a really useful tool at the disposal of any database administrator/developer, and something that you should always have at the front of your mind as part of any design. Doing so will ensure that you are building a solution which, as far as is realistically conceivable, keeps the data within your database squeaky clean.

The sheer breadth of ways that you can utilise Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E) can sometimes boggle the mind. Whether it’s through a traditional web browser, mobile app, the new interactive service hub or even through your own website created via the SDK, organisations have an ever-increasing array of routes they can go down when deploying the application into their environment. Despite this, more often than not, you would expect a “standard” deployment to involve using the application via a web browser, either on a local machine or potentially via a Remote Desktop Session (RDS) instance. Whilst Microsoft’s support articles provide fairly definitive software requirements when working on a Windows desktop machine, it is difficult to determine if, for example, Google Chrome on a Windows Server 2012 RDS session is supported. This is an important omission that requires clarification and is worth discussing further to determine if a definitive conclusion can be reached, based on the available evidence.

In this week’s post, I will attempt to sleuth through the various pieces of evidence I can find on this subject, sprinkling this with some experience that I have had with CRM/D365E and RDS, to see if any definitive conclusion can be established.

Before we get into the heart of the matter, it may be useful to provide a brief overview of what RDS is

RDS is a fancy way of describing connecting to a remote computer via the Remote Desktop Connection client on Windows or your OS of choice. Often referred to as Terminal Services, it is a de facto requirement when accessing remote servers for a variety of reasons. Most commonly, you will witness it deployed as part of an internal corporate network, as a mechanism for users to “remote on” when working outside the office. Due to the familiarity of Windows Server compared with each version’s corresponding desktop OS, the look and feel of working on a normal computer can be achieved with minimal effort, and you can often guarantee that the same types of programmes will also work without issue.

Whilst RDS is still frequently used, it could be argued to have taken a back seat in recent years with the rise in virtualisation technologies from the likes of Citrix and VMware. These solutions tend to offer the same benefits an RDS server can, but place more emphasis on utilising a local desktop environment to essentially stream desktops/applications to end users. As a result of the rise of these technologies, RDS is perhaps entering a period of uncertainty; whilst it will continue to be essential for remote server management, there are arguably much better technologies available that provide an enriched end-user experience, whilst offering the same benefits of having a centralised server within a backed up/cloud environment.

Now that you (hopefully!) have a good overview of what RDS is, let’s take a look at the evidence available in relation to CRM/D365E and RDS

Evidence #1: Technet Articles

The following TechNet articles provide, when collated together, a consolidated view of supported internet browsers and operating systems for CRM/D365:

From this, we can distill the following:

  • Windows 10, 8.1, 8 and 7 are supported, so long as they are using a “supported” browser:
    • Internet Explorer 10 is supported for Windows 7 and 8 only.
    • Internet Explorer 11 is supported for all Windows OS’s, with the exception of 8.
    • Edge is supported for Windows 10 only.
    • Firefox and Chrome are supported on all OS’s, so long as they are running the latest version.
  • OS X 10.8 (Mountain Lion), 10.9 (Mavericks) and 10.10 Yosemite are supported for Safari only, running the latest version
  • Android 10 is supported for the latest version of Chrome only
  • iPad is supported for the latest version of Safari only (i.e. the latest version of iOS)

The implication from this should be clear – although the following Windows Server versions (that are currently in mainstream support) can run a supported web browser, they are not covered as part of the above operating system list:

  • Windows Server 2016
  • Windows Server 2012 R2
  • Windows Server 2012

Evidence #2: Notes from the Field

I have had extensive experience both deploying into and supporting CRM/D365E environments running RDS. These would typically involve servers with significant user load (20-30 per RDS server) and the general experience and feedback from end users has always been…underwhelming. All issues generally came down to the speed of the application which, when compared to running on a standard local machine, was at a snail’s pace. Things like loading a form, an entity view or a Dialog became tortuous affairs and led to serious issues with user adoption across the deployments. I can only assume that the amount of local CPU/memory required for CRM/D365E when running inside a web browser was too much for the RDS server to handle; this was confirmed by frequent CPU spikes and high memory utilisation on the server.

I can also attest to working with Microsoft partners who have explicitly excluded issues concerning RDS and CRM/D365E from the scope of any support agreement. When this was queried, the reasoning boiled down to the perceived hassle and complexity involved in managing these types of deployment.

To summarise, I would argue that this adds further ammunition to Evidence #1, insomuch as RDS-compatible servers are not covered on the supported operating system lists because these issues are generally known about.

Evidence #3: What Microsoft Actually Say

I was recently involved as part of a support case with Microsoft, where we were attempting to diagnose some of the performance issues discussed above within an RDS environment. The support professional assigned to the case came back and stated the following in regards to RDS and CRM/D365E:

…using Windows remote desktop service is supported but unfortunately using Windows server 2012 R2 is not supported. You have to use Windows server 2012. Also windows server 2016 is out of our support boundaries.

Whilst this statement is not backed up by an explicit online source (and I worry whether some confusion has been derived from the Dynamics 365 for Outlook application – see below for more info on this), it can be taken as saying that Windows Server 2012 is the only supported operating system that can be used to access CRM/D365E, with one of the supported web browsers mentioned above.

The Anomalous Piece of Evidence: Dynamics 365 for Outlook Application

Whilst it may not be 100% clear cut in regards to supported server operating systems, we can point to a very definitive statement in respect to the Dynamics 365 for Outlook application when used in conjunction with RDS:

Dynamics 365 for Outlook is supported for running on Windows Server 2012 Remote Desktop Services

Source: https://technet.microsoft.com/en-us/library/hh699743.aspx

Making assumptions here again, but can we take this to mean that the web application is supported within Windows Server 2012 RDS environments, as suggested by the Microsoft engineer above? If not, then you may start thinking to yourself “Well, why not just use this instead of a web browser on RDS to access CRM/D365E?”. Here are a few reasons why you wouldn’t really want to look at rolling out the Dynamics 365 for Outlook application any time soon within RDS:

  • If deploying the application into offline mode, then you will be required to install a SQL Express instance onto the machine in question. This is because the application needs to store copies of your synchronised entity data for whenever you go offline. The impact of this on a standard user machine will be minimal at best, but on a shared desktop environment, could lead to eventual performance issues on the RDS server in question
  • With the introduction of new ways to work with CRM/D365E data in an efficient way, such as with the Dynamics 365 App for Outlook, the traditional Outlook client is something that is becoming less of a requirement these days. There are plenty of rumours/commentary on the grapevine that the application may be due for deprecation in the near future, and even Microsoft have the following to say on the subject:

    Dynamics 365 App for Outlook isn’t the same thing as Dynamics 365 for Outlook. As of the December 2016 update for Dynamics 365 (online and on-premises), Microsoft Dynamics 365 App for Outlook paired with server-side synchronization is the preferred way to integrate Microsoft Dynamics 365 with Outlook.

  • I have observed performance issues with the add-in myself in the past – Outlook freezing, the occasional crash and also issues with the Outlook ribbon displaying incorrectly.

As you can probably tell, I am not a big fan of the add-in, but the writing on the wall is fairly clear – Microsoft fully supports you accessing CRM/D365E from the Outlook client on Windows Server 2012 RDS.

After reviewing all the evidence, do we have enough to solve this case?

Whilst there is a lot of evidence to consider, the main thing I would highlight is the lack of a “smoking gun” in what has been reviewed. What I mean by this is the lack of a clear support article that states either “X Browser is supported on Windows Server X” or “X Browser is NOT supported on Windows Server X“. Without any of these specific statements, we are left in a situation where we have to infer that RDS is not a supported option for using the CRM/D365E web application. Certainly, the experience I have had with the web client in these environment types would seem to back this up, which may go some way towards explaining why this is not explicitly supported.

So where does this leave you if you are planning to deploy CRM/D365E within an RDS environment? Your only option is to ensure that your RDS environment is running Windows Server 2012 and that your users are utilising the Outlook client, given that there is a very clear statement regarding its supportability. If you are hell bent on ensuring that your end users have the very best experience with CRM/D365E, then I would urge you to reconsider how your environment is configured and, if possible, move to a supported configuration – whether that’s local desktop or a VDI, running your browser of choice. Hopefully, the benefits of utilising the application will far outweigh any overriding concerns and business reasons for using RDS in the first place.

Exams are always something that you end up worrying about to an obsessive degree. The thought of being placed on the spot and expected to demonstrate your knowledge in a particular subject can be daunting to even the most knowledgeable individuals. Technology exams, such as Microsoft certifications, can arguably be the worst of all; the level of detailed technical knowledge that you are expected to know off the top of your head can seem almost impossible, particularly for those who are, like me, heavily reliant on our friend of old! The pace of technological advancement only complicates this further and, when you are working with solutions as fast-paced as Dynamics 365 for Enterprise (D365E), the pace is almost marathon-like. New features are added regularly to the application and this invariably leads to new exam content and accreditations to match. The introduction of an MCSA and MCSE for D365E is, arguably, one of the more welcome of recent changes made, and gives those looking to showcase their knowledge a more enhanced way of doing so.

I have previously reviewed the new exams in more detail on the blog and, after having been through the process and successfully obtained my MCSA and MCSE, I can speak generally about the experience and hopefully guide those who are looking at sitting the exams in the near future. This week’s post will provide some general guidance on how you can best prepare for the exams, an overview of the new badge sharing platform, Acclaim, and my general thoughts on the whole experience.

Disclaimer

Per the terms of the Exam NDA, this post will not make reference to any specific exam content; rather, it will discuss broadly what areas you should focus on to get that passing grade. Any discussion or question relating to exam content will be deleted from the comments section.

The journey to Accreditation may seem somewhat upside down

To achieve your MCSA, and eventually MCSE, Microsoft recommends that you follow a suggested route to attain your certification. Although you are free to pass the exams in any order you wish, it is perhaps strange that, if you follow the prescribed route, you will learn/be tested on how to customise, configure and manage Dynamics 365 before discovering what the application can offer natively. Many of the features available in D365E may very well speed along a deployment, and it is important to always remember the plethora of existing functionality available within the application and not accidentally over-customise when there is no need to.

Don’t underestimate the need to revise…

As with any new exam, the Skills Measured list is updated to reflect features freshly introduced as part of the exam’s targeted release. If you have not yet had experience with how these work, then I would highly recommend working through the e-learning courses available on the Dynamics Learning Portal in the first instance (some of which, incidentally, are also available on Imagine Academy), targeting yourself towards a) new features and b) functionality that you have the least experience in. With regards to “What’s New” with D365E, I would recommend brushing up on the following subjects as part of your revision:

There’s a good chance, based on each exam’s specification, that questions on all the above topics could appear on multiple exams, so be sure to prepare yourself for this.

…but realise that hands-on experience is the best route towards passing an exam.

Simply watching the e-learning courses or reading about new functionality is never sufficient when revising. You should ideally have a D365E tenant available to you as part of your revision (trials can be set up in a pinch) and be working through features of the application as you are learning. The above e-learning courses include labs, which is an excellent starting point; from there, you should very much work through setting up features from scratch, navigating around the interface and understanding what various buttons/actions do. You may surprise yourself and discover something about the application that you previously overlooked; something which happens to me all the time!

Be sure to set up your Acclaim account after passing

One of the nifty new perks of passing an exam this year is the introduction of Acclaim, which aims to provide a simplified mechanism for collating together your various accreditations across multiple vendors in a fun way. Upon passing your first exam, you will be sent an email within 24 hours to let you know your badge is waiting to be claimed:

To accept the badge, you will need to set up an account with Acclaim. After this, all subsequent achievements will be saved to the same account, enabling you to build up your “Acclaim transcript” as you pass more exams. The social features of the application are varied and quite nice, meaning that you can quickly share the news of your exam pass with friends, colleagues and family members at the click of a button; at the time of writing, LinkedIn, Twitter and Facebook are supported. Finally, you can download images of your badges and include them as part of job applications/CVs, helping them stand out more visually.

If you are interested in finding out more about Acclaim, then you can check out my profile to get a feel for what it can offer you. Suffice it to say, having a straightforward means of sending potential customers/employers a website link to my Acclaim profile as opposed to an entire Microsoft transcript would undoubtedly simplify things. Now, if only you could get physical badges that you could stick on your bag… 🙂

Conclusions or Wot I Think

My own personal journey towards obtaining my first MCSA and MCSE has been challenging and rewarding in equal measure. It feels really good to know that D365E has a proper set of accreditations that individuals can aspire towards obtaining and which exemplify the position of the application alongside other, well-known Microsoft solutions. That is not to say that exams are a definitive means of judging your expertise with a particular product, and an exam fail may indicate a spur of the moment, misjudged answer or a lack of revision for a particular new feature. This post may put people off from trying for an exam, due to the effort involved, but it’s important that you are not daunted in any way. I would readily encourage people who have a passion for D365E to put aside any concerns and not delay in working towards passing each of the exams. By doing so, you can proactively demonstrate your commitment towards D365E and the zeal that you have for it, giving those around you the confidence that you not just talk the talk, but can walk the walk as well.

The very recent Microsoft Data Amp event provided an excellent forum for the SQL Server 2017 announcement, which is due to be released at some point this year. Perhaps the most touted feature of the new version is that it will be available to be installed on Linux; an entirely inconceivable premise 10 years ago, which just goes to show how far Microsoft have changed in their approach to supporting non-Windows platforms as standard. Past the big headline announcements, there is a lot to look forward to underneath the hood with SQL Server 2017 that may act as encouragement for organisations looking to upgrade in the near future.

In this week’s post, I’ll be taking a closer look at 3 new features I am most looking forward to, that are present within the SQL Server Community Technical Preview (CTP) 2.0 version and which will form part of the SQL Server 2017 release later on this year.

Power BI in SSRS: A Match Made in Heaven

This is by far the feature I am most looking forward to seeing in action. I have been working more and more with Power BI this year, often diving into the deep-end in respect to what can be achieved with the product, and I have been impressed with how it can be used for addressing reporting scenarios that SSRS may struggle with natively. The announcement earlier this year that Power BI would be included as part of SSRS in the next major release of the product was, therefore, incredibly welcome and its inclusion as part of SQL Server 2017 is confirmed by the inclusion of Power BI reports in the CTP 2.0 release.

For those who are already familiar with Power BI, there is thankfully not much that you need to learn to get up and running with Power BI in SSRS. One thing to point out is that you will need to download a completely separate version of the Power BI Desktop application to allow you to deploy your Power BI reports to SSRS. I would hope that this is mitigated once SQL Server 2017 is released, so that we can deploy from just a single application for either Online or SSRS 2017. Users who are experienced with the existing Power BI Desktop application should have no trouble using the equivalent product for SSRS, as they are virtually identical.

The actual process of deploying a Power BI report is relatively straightforward. After making sure that you have installed the SSRS Power BI Desktop Application, you can then navigate to your SSRS homepage and select + New -> Power BI Report:

You will be greeted with a prompt similar to the below and the Power BI Desktop application will open automatically:

Now it’s time to build your report 🙂 As an example, I have used the WideWorldImporters Sample Database to build a simplistic Power BI report:

If you were working with Power BI online, then this would be the stage where you would click the Publish button to get the report onto your online Power BI tenant. The option to deploy to your SSRS instance is currently missing from the Power BI in SSRS application; instead, you will need to manually upload your .pbix file into Reporting Services via the Upload button. Once uploaded, your report will be visible on the home page and can be navigated to in the usual manner:

Simplified CSV Importing

Anyone who has at least some experience working with databases and application systems should have a good overview of the nuances of delimited flat file types – in particular, Comma Separated Value (.csv) files. This file type is generally the de-facto format when working with data exported from systems and, more often than not, will be the most common file type that you will regularly need to import into a SQL Server database. Previously, if you didn’t opt to use the Import Wizard/.dtsx package to straightforwardly get your .csv file imported, you would have to rely on the following example script:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH
	(
		FIELDTERMINATOR = ',',
		ROWTERMINATOR = '\n'
	)

Now, with SQL Server 2017, you can simplify your query by replacing FIELDTERMINATOR and ROWTERMINATOR with a new FORMAT parameter, which specifies the file format we are concerned with:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH (FORMAT = 'CSV');

Whilst the overall impact on your query length is somewhat negligible, it is nice that a much more simplified means of accomplishing a common database task has been introduced and that we now also have the option of accessing Azure Blob Storage locations for import files.
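As a sketch of that Azure Blob Storage option, the import can point at an external data source instead of a local path. The names below are illustrative, and the example assumes a database scoped credential and external data source have already been created against the storage container:

```sql
-- Assumes an external data source named 'AzureBlobStore' already exists,
-- created via CREATE EXTERNAL DATA SOURCE with TYPE = BLOB_STORAGE and
-- a SAS-based database scoped credential
BULK INSERT dbo.TestTable
FROM 'import/Test.csv'	-- path relative to the container root
WITH (DATA_SOURCE = 'AzureBlobStore', FORMAT = 'CSV');
```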

Updated Icons for SSMS

Typically, as part of any major update to an application, the “under the hood” visual side of things is generally not changed much. A good example of this can be found within CRM Online/Dynamics 365 for Enterprise, within the Customizations area of the application, which has not seen much of a facelift since CRM 2011. As a result, a lot of the icons can look inconsistent with the application as a whole. As these are generally the areas of the application that we use the most day in, day out, it can be a little discouraging not to see them get any love or attention as part of a major update… 🙁

With this in mind, it is pleasing to see that the updated SSMS client for SQL Server 2017 has been given refreshed icons that bring the application more in line with how Visual Studio and other Microsoft products are looking these days. Below is a comparison screenshot, comparing SSMS 2014 with SSMS 2017:

Conclusions or Wot I Think

Whilst there is a lot more to look forward to with the new release that is not covered in this post (for example, the enhancements to R server and deeper integration with AI tools), I believe that the most exciting and important announcement for those with their Business Intelligence/Reporting hats on is the introduction of Power BI into SSRS. Previously, each tool was well suited for a specific reporting purpose – SSRS was great for designing reports that require a lot of visual tailoring and widely common formats for exporting, whereas Power BI is more geared towards real-time, dashboard views that marry together disparate data sources in a straightforward way. By being able to leverage SSRS to fully utilise Power BI reports, the application suddenly becomes a lot more versatile and the potential for combining together functionality becomes a lot more recognisable. So, for example, having the ability to drill down to an SSRS report from a Power BI report would be an excellent way of providing reporting capabilities that satisfy end-user consumption in 2 different, but widely applicable, scenarios.

In summary, the SQL Server 2017 release looks to be very much focused on bringing the product up to date with the new state of play at Microsoft, successfully managing to achieve cross-platform requirements alongside bringing exciting functionality (that was previously cloud-only) into the hands of organisations who still have a requirement to run their database systems on their on-premises infrastructure. I’m eagerly looking forward to the release later on this year and to seeing the product perform in action. 🙂