The recent Microsoft Data Amp event provided an excellent forum for the announcement of SQL Server 2017, which is due for release at some point this year. Perhaps the most touted feature of the new version is that it can be installed on Linux – an entirely inconceivable premise 10 years ago, which just goes to show how far Microsoft have changed in their approach to supporting non-Windows platforms as standard. Beyond the big headline announcements, there is a lot to look forward to under the hood with SQL Server 2017 that may act as encouragement for organisations looking to upgrade in the near future.

In this week’s post, I’ll be taking a closer look at the 3 new features I am most looking forward to, which are present within the SQL Server Community Technology Preview (CTP) 2.0 release and will form part of SQL Server 2017 when it is released later this year.

Power BI in SSRS: A Match Made in Heaven

This is by far the feature I am most looking forward to seeing in action. I have been working more and more with Power BI this year, often diving into the deep end with respect to what can be achieved with the product, and I have been impressed with how it can address reporting scenarios that SSRS may struggle with natively. The announcement earlier this year that Power BI would be included as part of SSRS in the next major release of the product was, therefore, incredibly welcome, and its presence in CTP 2.0 confirms that Power BI reports will form part of SQL Server 2017.

For those who are already familiar with Power BI, there is thankfully not much to learn to get up and running with Power BI in SSRS. One thing to point out is that you will need to download a completely separate version of the Power BI Desktop application to deploy your Power BI reports to SSRS. I would hope that this is addressed once SQL Server 2017 is released, so that we can deploy to either Power BI Online or SSRS 2017 from a single application. Users who are experienced with the existing Power BI Desktop application should have no trouble using the equivalent product for SSRS, as they are virtually identical.

The actual process of deploying a Power BI report is relatively straightforward. After making sure that you have installed the SSRS version of the Power BI Desktop application, navigate to your SSRS homepage and select + New -> Power BI Report:

You will be greeted with a prompt similar to the below and the Power BI Desktop application will open automatically:

Now it’s time to build your report 🙂 As an example, I have used the WideWorldImporters sample database to build a simple Power BI report:

If you were working with Power BI Online, this would be the stage where you click the Publish button to push the report to your Power BI tenant. The option to deploy to your SSRS instance is currently missing from the SSRS version of the application; instead, you will need to manually upload your .pbix file into Reporting Services via the Upload button. Once uploaded, your report will be visible on the home page and can be navigated to in the usual manner:

Simplified CSV Importing

Anyone with some experience of working with databases and application systems should have a good overview of the nuances of delimited flat file types – in particular, Comma Separated Value (.csv) files. This file type is generally the de facto format when working with data exported from systems and, more often than not, will be the most common file type that you regularly need to import into a SQL Server database. Previously, if you didn’t opt to use the Import Wizard or a .dtsx package to get your .csv file imported, you would have to rely on a script like the following example:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH
	(
		FIELDTERMINATOR = ',',
		ROWTERMINATOR = '\n'
	)

Now, with SQL Server 2017, you can simplify your query by replacing FIELDTERMINATOR and ROWTERMINATOR with the new FORMAT parameter, which specifies the file format we are concerned with:

BULK INSERT dbo.TestTable
FROM 'C:\Test.csv'
WITH (FORMAT = 'CSV');

Whilst the overall impact on your query length is somewhat negligible, it is nice that a much simpler means of accomplishing a common database task has been introduced, and we now also have the option of accessing Azure Blob Storage locations for import files.
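For the latter, the documented pattern involves creating a database scoped credential and an external data source of type BLOB_STORAGE, then referencing it via the DATA_SOURCE option. Below is a minimal sketch – the storage account, container, table and SAS token are all hypothetical placeholders:

-- All names and the SAS token below are hypothetical placeholders;
-- a database master key must already exist for the scoped credential
CREATE DATABASE SCOPED CREDENTIAL MyAzureBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS token, without the leading ?>';

CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH
	(
		TYPE = BLOB_STORAGE,
		LOCATION = 'https://myaccount.blob.core.windows.net/mycontainer',
		CREDENTIAL = MyAzureBlobCredential
	);

-- The file path is now relative to the container specified above
BULK INSERT dbo.TestTable
FROM 'Test.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage', FORMAT = 'CSV');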

Updated Icons for SSMS

Typically, as part of any major update to an application, the less glamorous, “under the hood” visual side of things is not changed much. A good example of this can be found within CRM Online/Dynamics 365 for Enterprise in the Customizations area of the application, which has not seen much of a facelift since CRM 2011; as a result, a lot of the icons there look inconsistent with the application as a whole. As these are generally the areas of the application that we use the most day in, day out, it can be a little discouraging not to see them get any love or attention as part of a major update… 🙁

With this in mind, it is pleasing to see that the updated SSMS client for SQL Server 2017 has been given refreshed icons that bring the application more in line with how Visual Studio and other Microsoft products are looking these days. Below is a screenshot comparing SSMS 2014 with SSMS 2017:

Conclusions or Wot I Think

Whilst there is a lot more to look forward to in the new release that is not covered in this post (for example, the enhancements to R Server and deeper integration with AI tools), I believe that the most exciting and important announcement for those with their Business Intelligence/Reporting hats on is the introduction of Power BI into SSRS. Previously, each tool was well suited to a specific reporting purpose – SSRS was great for designing reports that require a lot of visual tailoring and widely used export formats, whereas Power BI is more geared towards real-time, dashboard views that marry together disparate data sources in a straightforward way. By being able to leverage SSRS to fully utilise Power BI reports, the application suddenly becomes a lot more versatile and the potential for combining the two sets of functionality becomes much clearer. For example, having the ability to drill down to an SSRS report from a Power BI report would be an excellent way of providing reporting capabilities that satisfy end-user consumption in 2 different, but equally applicable, scenarios.

In summary, the SQL Server 2017 release looks to be very much focused on bringing the product up to date with the new state of play at Microsoft, successfully achieving cross-platform support alongside bringing exciting functionality (that was previously cloud-only) into the hands of organisations who still need to run their database systems on on-premises infrastructure. I’m eagerly looking forward to the release later this year and to seeing the product perform in action. 🙂

Those who have experience working with an RDBMS like SQL Server become accustomed to a certain way of going about things, often involving a mixture of “lazy” query writing (e.g. using SELECT *… as opposed to SELECT Column1, Column2…), stylistic preferences in how queries are written (ALL CAPS or lower case) and best practice approaches. One arguable example of a best practice approach is the use of Stored Procedures, and an illustration is the best way to demonstrate their benefits. Take a look at the T-SQL query below, which should execute fine against the AdventureWorks2012 sample database:

SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
FROM [HumanResources].[Employee] AS E
 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
 INNER JOIN [HumanResources].[Department] AS D
  ON DH.[DepartmentID] = D.[DepartmentID]
 INNER JOIN [Person].[Person] AS P
  ON E.[BusinessEntityID] = P.[BusinessEntityID]
WHERE DH.[EndDate] IS NULL
AND E.[JobTitle] = 'Production Technician - WC50'

The query returns the data we need, but not in an ideal manner. Consider the following:

  • Executing a query like the above, in-code, as part of an end-user application could expose your database to the risk of an SQL Injection attack or similar.
  • The query reveals a lot of information regarding our underlying database structure – information which any client executing the query neither cares about nor should have to worry about.
  • The example is a very precise query, with a specific function – i.e. get me all the current employees who have the Job Title of Production Technician – WC50. If we wanted to modify it to instead obtain all Senior Tool Designers, we would have to write a completely separate query to accommodate this.

Implementing a Stored Procedure to encapsulate our query logic immediately addresses the above concerns: it provides a single-line query into the database, gives us just the data we need and enables us to utilise the query for other scenarios as well. Setting one up is very straightforward via the CREATE PROCEDURE command – the rest is pretty much what we have put together already:

CREATE PROCEDURE dbo.uspGetEmployeesByJobTitle 
	@JobTitle NVARCHAR(50)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
	FROM [HumanResources].[Employee] AS E
	 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
	  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
	 INNER JOIN [HumanResources].[Department] AS D
	  ON DH.[DepartmentID] = D.[DepartmentID]
	 INNER JOIN [Person].[Person] AS P
	  ON E.[BusinessEntityID] = P.[BusinessEntityID]
	WHERE DH.[EndDate] IS NULL
	AND E.[JobTitle] = @JobTitle
END
GO

By utilising a parameter for our WHERE clause filter on the Job Title, we can pass any valid value to our stored procedure, immediately making our initial query more versatile across our reporting/business application. And, as an added bonus, we can now safely take a 10-line query down to 1:

EXECUTE dbo.uspGetEmployeesByJobTitle @JobTitle = 'Senior Tool Designer'

So we have established that Stored Procedures are wicked cool awesome – but what does this have to do with PowerBI?!? 

Having worked extensively with SQL Server Reporting Services (SSRS) in the past, I have become accustomed to using Stored Procedures as a mechanism for storing underlying query logic within the database, giving me a straightforward means of referencing it from my .rdl file. I can only assume that this is the “norm” and the preferred method of querying SQL data, as opposed to a direct SELECT statement.

When recently doing some work within PowerBI involving Azure SQL databases, I was, therefore, surprised to find that there is no option to return data via a stored procedure by default. Instead, PowerBI would prefer you to query the underlying table/view objects directly:

Thankfully, when inspecting the underlying PowerQuery used to return an example table from the above, it doesn’t use any kind of SELECT query to get the data:

let
    Source = Sql.Databases("mydatabaseinstance"),
    AdventureWorks2012 = Source{[Name="AdventureWorks2012"]}[Data],
    Production_ProductModel = AdventureWorks2012{[Schema="Production",Item="ProductModel"]}[Data]
in
    Production_ProductModel

Unfortunately, the same cannot be said if you select the Advanced options area and input your own SQL query directly:

let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]#(lf)FROM [HumanResources].[Employee] AS E#(lf) INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH#(lf)  ON E.[BusinessEntityID] = DH.[BusinessEntityID]#(lf) INNER JOIN [HumanResources].[Department] AS D#(lf)  ON DH.[DepartmentID] = D.[DepartmentID]#(lf) INNER JOIN [Person].[Person] AS P#(lf)  ON E.[BusinessEntityID] = P.[BusinessEntityID]#(lf)WHERE DH.[EndDate] IS NULL#(lf)AND E.[JobTitle] = 'Senior Tool Designer'"])
in
    Source

I do NOT recommend you use the above method to query your SQL Server data!

I have spoken previously on the blog about conventions for working with datasets – i.e. only get what you need, and nothing else. As I work more and more with PowerBI, the tool very much seems to be geared towards flipping this mentality on its head, with a number of built-in features that seem to scream out “Just get ALL your data in here, we’ll worry about the rest!”. I realise that the difference between MB and GB these days, from a storage/cost point of view, is minimal; nevertheless, I still believe it is prudent not to put all your eggs in one basket and to ensure that your business data is not scattered in cavalier fashion across a multitude of different cloud services.

With this in mind, it is good to know that you can utilise stored procedures in PowerBI. There are basically two ways in which this can be achieved:

  • Going back to the Advanced options screen above on the SQL Server database wizard, you can EXECUTE your stored procedure directly using the following SQL Statement:
DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''
EXEC (@SP)

Be sure to specify your database and don’t forget the doubled-up single quotes around the parameter value!

  • If you prefer to use PowerQuery as opposed to the wizard above, then the following code will also work:
let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''#(lf)EXEC (@SP)"])
in
    Source

In both cases, you will be required to authenticate with the database and your result set should return as follows if using the AdventureWorks2012 example database/code:

Finally, as a best-practice security step, you should ensure that the account connecting to your SQL Server instance is restricted to only EXECUTE the procedures you have specified. This can be achieved via the following T-SQL snippet, executed against your database instance:

GRANT EXECUTE ON OBJECT::dbo.uspMyStoredProcedure  
    TO MySQLServerLogin 
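For completeness, a minimal sketch of provisioning such a restricted account – assuming a hypothetical server-level login named MySQLServerLogin already exists – would be:

-- Run against the target database; assumes the server login already exists
CREATE USER MySQLServerLogin FOR LOGIN MySQLServerLogin;

-- Grant EXECUTE on just the procedures the account genuinely needs
GRANT EXECUTE ON OBJECT::dbo.uspGetEmployeesByJobTitle
    TO MySQLServerLogin;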

Conclusions or Wot I Think

PowerBI is increasingly becoming a relevant tool for professionals with a background in traditional Business Intelligence/Reporting Services. The bellwether for this can surely be seen in the current Technical Preview of SQL Server Reporting Services, which includes PowerBI reports built into the application. Although we have no timescales at this stage as to when the next major version of SQL Server will be released, it is reasonable to assume it will arrive by the end of this year at the earliest, bringing PowerBI reports with it as a new feature. I am really excited about the introduction of PowerBI into SSRS, as it would appear to be a match made in heaven – giving those with experience in both products the flexibility to develop a unified, best-of-breed solution, using traditional .rdl reporting capability and/or PowerQuery/DAX functionality.

With the above on the horizon, the ability to integrate seamlessly with SQL Server and to support traditional, well-proven practices becomes a crucial indicator of whether this match will be over before the honeymoon. I would therefore hope to see the option to access SQL Server data via Stored Procedures become standard within PowerBI’s built-in data connector. Based on the feedback I have seen online, I’d wager that this feature would be very welcome indeed, and an excellent way of reducing the need for direct coding to achieve a common requirement within PowerBI.

The single biggest challenge when developing a reporting solution is data. Invariably, you won’t always have one database that contains all the information you need; often, you will need to bring disparate and unrelated data sources together into a common model. This problem can be exacerbated if your organisation has a number of application or database systems from different vendors. Finding a tool that can overcome some of these hurdles is a real challenge. For example, whilst I am a huge fan of SQL Server Reporting Services, the out-of-the-box data connection options are generally limited to common vendor products or ODBC/OLE DB data sources. Case in point: finding or even developing a Data Source that can support a JSON stream can be troublesome and complicated to configure. With the landscape as such, the reporting tool that offers the most streamlined method of overcoming these challenges is the tool that is going to win the day.

Following on from my deep dive with the product last year, I have been working more and more with PowerBI in recent weeks. What I like most about the tool is that a lot of the hassle is taken out of configuring your data sources. PowerBI does this by leveraging the existing Power Query language and equipping itself with a large number of data source connectors. The most surprising aspect of this? Microsoft products form only a subset of the options available, with connectors in place for many competitor products from the likes of SAP, Salesforce and Oracle. In my limited experience with the product to date, I have yet to find a data source that it does not support, either via a built-in connector or a manual Power Query.

A recent work example best illustrates the above, as well as showcasing some of the built-in functionality (and learning curves!) that come with retrieving data via Power Query and writing Data Analysis Expressions (DAX). There was a requirement to generate an internal department dashboard for an IT service desk. The dashboard had to meet the following key requirements:

  • Display a summary of each team member’s movements for the current week, including the location of each person on the current day. Each member of the team was already recording their weekly movements within their Exchange calendar as all-day appointments, configuring the Subject field of each appointment accordingly – for example, In Office, Working at Home etc. No other all-day appointments were booked in the calendars.
  • Query Dynamics CRM and return data relating to Active/Inactive Case records.
  • To be displayable on a TV/Screen, refresh automatically and be exportable as a .pdf document or similar.

A CRM dashboard can achieve about 50-60% of the above, but the key requirements of querying Exchange and exporting the dashboard are much more tricky; whilst it is certainly possible to perform web service requests from within CRM to Exchange, the process would be so convoluted to implement that it is arguably not worth the effort. Likewise, CRM is not particularly friendly when it comes to printing out dashboards, as you are often left at the mercy of the web browser in question. With all this in mind, we decided that PowerBI was the best option and proceeded to bring all the data together using PowerQuery.

We first used the out-of-the-box Exchange connector to query each person’s mailbox for all calendar items, performing two transformations on the data. First, we filtered the results to return only calendar items from the current week; second, we added a column to identify which calendar each record derives from (as there is no field on the record to determine this). We’ll see why this is required in a few moments:

let
    Source = Exchange.Contents("john.smith@domain.com"),
    Calendar1 = Source{[Name="Calendar"]}[Data],
    #"Filtered Rows" = Table.SelectRows(Calendar1, each Date.IsInCurrentWeek([Start])),
    #"Added Custom" = Table.AddColumn(#"Filtered Rows", "Owner", each "John Smith")
in
    #"Added Custom"

Next, we combined all the calendars together into one table – again using Power Query. Table.Combine accepts a list of all the tables you want to merge together, so you can add/remove tables accordingly to suit your requirements. We also take this opportunity to remove unnecessary fields and convert our Start and End field values to their correct type:

let
    Source = Table.Combine({Calendar1, Calendar2, Calendar3}),
    #"Removed Columns" = Table.RemoveColumns(Source,{"Folder Path", "Location", "DisplayTo", "DisplayCc", "RequiredAttendees", "OptionalAttendees", "LegacyFreeBusyStatus", "IsReminderSet", "ReminderMinutesBeforeStart", "Importance", "Categories", "HasAttachments", "Attachments", "Preview", "Attributes", "Body"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Removed Columns",{{"Start", type date}, {"End", type date}})
in
    #"Changed Type"

Our CRM data is returned via an adapted version of the query used previously on the blog, taking into account the benefits of using a Saved Query as opposed to FetchXML. No further work is required to manipulate this data once it is in PowerBI, so it won’t be covered any further. Our issue is now with the Exchange calendars. Because the appointments indicating each person’s movements are set up as all-day appointments spanning multiple days, we have no way of enumerating the days in between to determine whether any of them is the current day. For example, if an all-day appointment starts on Monday and ends on Wednesday, we have Monday’s and Wednesday’s dates, but not Tuesday’s. We, therefore, need to find a solution that determines whether the appointment falls on a specific day of the week – Monday, Tuesday, Wednesday, Thursday or Friday.

Our first step is to generate a date table covering the entire period we are concerned with. Using this very handy query, we can set up a PowerBI function that will allow us to generate just that – in this case, for the whole of 2017:
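The linked query is not reproduced here but, to give a flavour, a minimal sketch of such a function (assuming all we need is a single Date column – the real query generates several more) might look like the following:

let
    // Minimal sketch only - the linked query produces a far richer date table
    CreateDateTable = (StartDate as date, EndDate as date) as table =>
    let
        DayCount = Duration.Days(EndDate - StartDate) + 1,
        DateList = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
        DateTable = Table.FromList(DateList, Splitter.SplitByNothing(), {"Date"}),
        TypedTable = Table.TransformColumnTypes(DateTable, {{"Date", type date}})
    in
        TypedTable
in
    CreateDateTable

Invoking CreateDateTable(#date(2017, 1, 1), #date(2017, 12, 31)) as a new query named 2017 then produces the table we need.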

Why do we need this? Because we need to determine, for each date in 2017, which day of the week it falls on. For this reason, we now take off our Power Query hat and jam on our DAX one instead 🙂 Close & Apply your queries in PowerBI and then navigate to your new date table. Add a new column, using the following DAX formula to populate it:

Day of Week (Number) = WEEKDAY('2017'[Date], 2)
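Note that the IsMondayAllDay formula we will build shortly also references a Week Number column on the 2017 table. If the date query you used does not already provide one, a matching calculated column could be added in the same way – FORMAT is used here so that the value is text, agreeing with the FORMAT(WEEKNUM(…)) comparison performed later:

Week Number = FORMAT(WEEKNUM('2017'[Date], 2), "General Number")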

The WEEKDAY function is incredibly handy in this regard, enabling us to determine the day of the week for any date value. Nice! We can now go back to our “unified” calendar and perform the following modifications to it:

  • Add a column that returns a TRUE/FALSE value for each row of our calendar, telling us whether the Start date, the End date or any date in between falls on a specific day. For our IsMondayAllDay field, the DAX formula is below. This will need to be modified for each subsequent column, changing the final '2017'[Day of Week (Number)] lookup argument from 1 to 2 for Tuesday, 3 for Wednesday and so on:

IsMondayAllDay = IF(AND(CONTAINS(CALENDAR([Start], IF([End] = [Start], [End], [End] - 1)), [Date], DATEVALUE(LOOKUPVALUE('2017'[Date], '2017'[Week Number], FORMAT(WEEKNUM(AllCalendars[Start], 2), "General Number"), '2017'[Day of Week (Number)], 1))), AllCalendars[IsAllDayEvent] = TRUE()), "TRUE", "FALSE")

  • A calculated column that tells us whether the current row relates to today, by referencing each of the fields created in the previous step. Similar to above, a TRUE/FALSE value is returned:

IsToday = IF(([IsMondayAllDay] = "TRUE" && WEEKDAY(TODAY(), 2) = 1) || ([IsTuesdayAllDay] = "TRUE" && WEEKDAY(TODAY(), 2) = 2) || ([IsWednesdayAllDay] = "TRUE" && WEEKDAY(TODAY(), 2) = 3) || ([IsThursdayAllDay] = "TRUE" && WEEKDAY(TODAY(), 2) = 4) || ([IsFridayAllDay] = "TRUE" && WEEKDAY(TODAY(), 2) = 5), "TRUE", "FALSE")

We now have everything we need to configure a Measure that can be used on our Dashboard – the Subject of the calendar appointment and a way of indicating that the appointment is today. So our final DAX formula would be as follows for John Smith:

John's Location (Today) = LOOKUPVALUE(AllCalendars[Subject], AllCalendars[IsToday], "TRUE", AllCalendars[Owner], "John Smith")

Now, it is worth noting that the above solution is not fool-proof. For example, if a person has multiple all-day appointments configured in their calendar, then it is likely that the Measure used above will fall over. Given that this is unlikely to happen in the above scenario, no proactive steps have been taken to mitigate it, although you can certainly implement a solution to address this (e.g. use the MAX function, or only return records which contain “Working”, “Office” or “Home” in the Subject etc. – a sketch of one such approach follows the list below). Nevertheless, I feel the above solution provides an effective “crash course” in a number of fundamental PowerBI concepts, including:

  • PowerQuery data retrieval and manipulation
  • PowerBI Functions
  • DAX Formulas and the difference between Measures and Calculated Columns
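As an example of the kind of defensive step mentioned above, here is a hedged sketch (not part of the original solution) that swaps LOOKUPVALUE for FILTER and CONCATENATEX, so that multiple matching appointments are listed rather than causing an error:

John's Location (Today, Safe) =
CONCATENATEX(
    FILTER(
        AllCalendars,
        AllCalendars[IsToday] = "TRUE" && AllCalendars[Owner] = "John Smith"
    ),
    AllCalendars[Subject],
    ", "
)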

As a colleague recently said to me, “I think we will be using PowerBI a lot more in the future”. This is something that I would certainly agree with based on my experience with it so far 🙂

I have had an opportunity recently to start getting to grips with the wonderful world of PowerBI. For those who have walked the tightrope between Excel and SQL Server Reporting Services (SSRS) reports, PowerBI appears to be the tool designed with these individuals in mind. It enables you to leverage existing Excel knowledge (through PowerQuery or Excel-based formulas/calculations), whilst also offering a number of easy-to-configure visualisations that are not too dissimilar to the charts, maps and other objects that can be set up on a .rdl report. What’s also good to know, from a Dynamics CRM/365 for Enterprise (D365E) point of view, is that you can very quickly hook up your CRM data to a PowerBI dashboard/report. In addition to this, integrating with other data sources – such as SQL, JSON or flat file sources – or even with completely different application systems is possible (even with Salesforce – shock, horror!). In these days of increasing need for tight integration between a dazzling array of different application and database systems, PowerBI gives you a broad framework to achieve any reporting goal you may have in mind.

Having said that, it is still rough around the edges and in need of some further development before it, arguably, becomes the de facto tool to use. For example, I have found some of the formatting and design options available to be rather basic and light-years behind what is possible with SSRS or even Excel. There are also some features missing that are somewhat baffling, such as the ability to send out dashboards/reports via email on a predefined schedule. I would hope to see increasing attention towards PowerBI in the months and years ahead, bringing across the very best features of these more traditional applications in a non-daunting and wholly accessible way.

As referenced above, getting set up with your online CRM/D365E data is incredibly straightforward via the inbuilt Dynamics 365 connector “wizard” – simply log in to your online organisation, specify the entity data that you want to work with, and PowerBI will push this data into a table for you, enabling you to start building your report in seconds. The connector “wizard” is suited to most typical data retrieval scenarios, providing a GUI to visualise the entity data within your CRM/D365E instance and the ability to bring together related entities and return them as part of your query. Before you start using it, however, I would highlight the following:

  • The OData feed is rather indiscriminate in its retrieval – all records from an entity will be returned. Some pre-filtering will occur based on CRM’s/D365E’s security model (e.g. if the account you log in as has Business Unit-level Read privilege on the Lead entity, only records in that account’s Business Unit will be returned), but it will typically be System Administrators who set up a PowerBI dashboard, meaning you will have to deal with a lot of records being returned into PowerBI. Given that the two different PowerBI plans have limitations in regards to record retrieval, this could cause problems if you are working with large CRM datasets.
  • Tied in with the above, because the “wizard” interface gives you no way to restrict record retrieval, you cannot filter your data based on certain attributes or take advantage of some of the native functionality within CRM for aggregating your data. Whilst PowerBI is certainly more than capable of doing this itself, relatively n00bish users may find this an unwelcome barrier that hinders adoption.
  • Lookup and Option Set attributes are returned as a special data type of Record, with the underlying properties of the related record (GUID, display name etc.) stored within this sub-record. Having the data stored in this manner creates additional administrative effort on the PowerBI side, as you will need to figure out how to access the underlying properties of this PowerBI data type (a minimal sketch of doing so follows this list).
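By way of illustration, Record values can be flattened with Table.ExpandRecordColumn. The sketch below uses a hypothetical, hard-coded table in place of real entity data, purely to show the technique:

let
    // Toy table standing in for entity data, where a Lookup arrives as a Record
    Source = Table.FromRecords({
        [title = "Case A", customerid = [Id = "00000000-0000-0000-0000-000000000001", Name = "Contoso"]],
        [title = "Case B", customerid = [Id = "00000000-0000-0000-0000-000000000002", Name = "Fabrikam"]]
    }),
    // Flatten the Record column into ordinary, report-friendly columns
    Expanded = Table.ExpandRecordColumn(Source, "customerid", {"Id", "Name"}, {"Customer Id", "Customer Name"})
in
    Expanded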

Fortunately, if you are inclined towards building very specific and focused queries that you can execute against your Online CRM/D365E, there is a way – and the best thing is, we get to use something that has been recently introduced into CRM as well 🙂

The Web API to the rescue!

The Web API, which implements version 4 of the Open Data (OData) protocol, was introduced in Dynamics CRM 2016 and will eventually replace the traditional means by which developers access CRM via web services (namely, the Organization service and the old OData service). CRM developers will need to become increasingly familiar with the Web API in the years ahead, and no doubt a lot of work will need to be done updating existing code to target it.

Because the Web API is a web service, PowerBI can connect to it via the Web connector. By querying the Web API, you have access to all of the messages that are exposed for getting data out of CRM – Retrieve, Retrieve Multiple and Predefined Query – with multiple options available in terms of how you return, filter and aggregate your data. Results are returned in JSON format, so some additional work is needed to get the data into an accessible format. This post will now take a look at what you need to do in order to return data based on a FetchXML query, as well as (hopefully!) providing a useful code snippet that you can adapt to your environment.

Before starting, ensure that you have installed the CRM REST Builder managed solution within your CRM development environment. This tool allows you to quickly generate JScript code snippets that perform web service calls against CRM/D365E and is a massive help in a number of different ways. A big shout out and thanks to Jason Lattimer for the work he has done on this.

  1. To begin, you need a FetchXML query that returns the data you require. This can be written manually or generated automatically via Advanced Find. In this example, we are going to use the following snippet, which queries the Case entity and will return 7 sample data records from CRM:
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="incident">
    <attribute name="title" />
    <attribute name="ticketnumber" />
    <attribute name="createdon" />
    <attribute name="incidentid" />
    <attribute name="caseorigincode" />
    <attribute name="customerid" />
    <order attribute="ticketnumber" descending="false" />
    <filter type="and">
      <condition attribute="prioritycode" operator="eq" value="1" />
    </filter>
  </entity>
</fetch>
  2. Next, we need to generate the URL that will be used to query the Web API endpoint. There are two challenges here – the first being that the FetchXML query needs to be included in the URL, and the second that it needs to be encoded to form a valid URL. The start of the URL is fairly straightforward to put together – it’s just your CRM organisation URL, in the following format:

https://<Organisation Name>.<Region Code>.dynamics.com

So if your organisation name is crmchap and your CRM tenant is based in the EMEA region, then the URL would be as follows:

https://crmchap.crm4.dynamics.com

The rest of the URL can be obtained from the CRM REST Builder. Open the managed solution and navigate to the Configuration page; it will look something like this:


Update the page so that the Action is set to Predefined Query. Ensure that the Primary Entity is set to Incident (always something you have to remember when working with the Case entity 🙂 ) and then copy and paste the FetchXML query into the box below. The window should look similar to the below once ready:

Press Create Request to put together the code snippet. On Line 2, you will see the following piece of code:

req.open("GET", Xrm.Page.context.getClientUrl() + "/api/data/v8.0/incidents?fetchXml=%3Cfetch%20version%3D%221.0%22%20output-format%3D%22xml-platform%22%20mapping%3D%22logical%22%20distinct%3D%22false%22%3E%3Centity%20name%3D%22incident%22%3E%3Cattribute%20name%3D%22title%22%20%2F%3E%3Cattribute%20name%3D%22ticketnumber%22%20%2F%3E%3Cattribute%20name%3D%22createdon%22%20%2F%3E%3Cattribute%20name%3D%22incidentid%22%20%2F%3E%3Cattribute%20name%3D%22caseorigincode%22%20%2F%3E%3Cattribute%20name%3D%22description%22%20%2F%3E%3Cattribute%20name%3D%22customerid%22%20%2F%3E%3Corder%20attribute%3D%22ticketnumber%22%20descending%3D%22false%22%20%2F%3E%3Cfilter%20type%3D%22and%22%3E%3Ccondition%20attribute%3D%22prioritycode%22%20operator%3D%22eq%22%20value%3D%221%22%20%2F%3E%3C%2Ffilter%3E%3C%2Fentity%3E%3C%2Ffetch%3E"

The bit we are interested in is the string value after the Xrm.Page.context.getClientUrl() function, which will need to be appended to our CRM URL. So based on the above, our URL to use with PowerBI would be as follows:

https://crmchap.crm4.dynamics.com/api/data/v8.0/incidents?fetchXml=%3Cfetch%20version%3D%221.0%22%20output-format%3D%22xml-platform%22%20mapping%3D%22logical%22%20distinct%3D%22false%22%3E%3Centity%20name%3D%22incident%22%3E%3Cattribute%20name%3D%22title%22%20%2F%3E%3Cattribute%20name%3D%22ticketnumber%22%20%2F%3E%3Cattribute%20name%3D%22createdon%22%20%2F%3E%3Cattribute%20name%3D%22incidentid%22%20%2F%3E%3Cattribute%20name%3D%22caseorigincode%22%20%2F%3E%3Cattribute%20name%3D%22description%22%20%2F%3E%3Cattribute%20name%3D%22customerid%22%20%2F%3E%3Corder%20attribute%3D%22ticketnumber%22%20descending%3D%22false%22%20%2F%3E%3Cfilter%20type%3D%22and%22%3E%3Ccondition%20attribute%3D%22prioritycode%22%20operator%3D%22eq%22%20value%3D%221%22%20%2F%3E%3C%2Ffilter%3E%3C%2Fentity%3E%3C%2Ffetch%3E

A bit of a mouthful I agree!

  3. Now that we have the URL, we can connect to CRM within PowerBI. Create or open a new .pbix file and select Get Data -> Web:


  4. On the From Web window, copy and paste the URL we’ve built and press OK. You will be prompted to log into your CRM organisation; select Organisational account and log in as you would normally via the Office 365 login page. Once logged in, the data will be retrieved and the Query Editor will open, displaying something similar to the below:
  5. Some additional work is required in order to get our data into a standard, tabular format. In addition, the data is currently returning the underlying Option Set and Lookup values from the Incident entity, as opposed to their Display Names; not good from a reporting point of view:

We, therefore, need to modify the underlying PowerQuery in order to achieve the following:

Right click on Query1 and select Advanced Editor to open the underlying PowerQuery text. Delete everything here and then copy and paste the following into the window:

let
    //Get the CRM data, including the Formatted Values as part of the returned data
    Source = Json.Document(Web.Contents("https://crmchap.crm4.dynamics.com/api/data/v8.0/incidents?fetchXml=%3Cfetch%20version%3D%221.0%22%20output-format%3D%22xml-platform%22%20mapping%3D%22logical%22%20distinct%3D%22false%22%3E%3Centity%20name%3D%22incident%22%3E%3Cattribute%20name%3D%22title%22%20%2F%3E%3Cattribute%20name%3D%22ticketnumber%22%20%2F%3E%3Cattribute%20name%3D%22createdon%22%20%2F%3E%3Cattribute%20name%3D%22incidentid%22%20%2F%3E%3Cattribute%20name%3D%22caseorigincode%22%20%2F%3E%3Cattribute%20name%3D%22customerid%22%20%2F%3E%3Corder%20attribute%3D%22ticketnumber%22%20descending%3D%22false%22%20%2F%3E%3Cfilter%20type%3D%22and%22%3E%3Ccondition%20attribute%3D%22prioritycode%22%20operator%3D%22eq%22%20value%3D%221%22%20%2F%3E%3C%2Ffilter%3E%3C%2Fentity%3E%3C%2Ffetch%3E", [Headers=[#"Prefer"="odata.include-annotations=""OData.Community.Display.V1.FormattedValue"""]])),
    //Get the underlying list of records returned
    values = Source[value],
    //Create a new table with one column, populated with the values list of records
    #"Table from List" = Table.FromList(values, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    //Query will error if no results, therefore use an if statement to build an empty table with the correct column headers
    Expand = if List.IsEmpty(values) then #table({"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}, {"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}) else Table.ExpandRecordColumn(#"Table from List", "Column1", {"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}, {"title", "ticketnumber", "createdon", "incidentid", "caseorigincode", "_customerid_value@OData.Community.Display.V1.FormattedValue"}),
    //For some reason, if there are no results, then the empty table contains error records - so need to specifically remove them
    #"Removed Errors" = Table.RemoveRowsWithErrors(Expand),
    //Finally, rename the _customerid_value field
    #"Renamed Columns" = Table.RenameColumns(#"Removed Errors",{{"_customerid_value@OData.Community.Display.V1.FormattedValue", "Customer Name"}})

in
    #"Renamed Columns"

The comments should hopefully explain what the code is doing but, to summarise, the PowerQuery parses the JSON data into a tabular format, uses the returned data to build column headers and then renames the _customerid_value field to match our requirements. There is also a logic statement to check whether we actually have any data; if not, we build an empty table (since the JSON does not give us anything that we can use to build one when 0 records are returned).

With our updated PowerQuery, our result set should look similar to the below:


Nice, presentable, filtered and exactly what we need to start building our PowerBI report! 🙂

Conclusions or Wot I Think

Whilst I’ll be the first to admit the above example is rather code-heavy and would require significant tinkering to suit specific scenarios, I would argue that the approach demonstrated adheres most closely to some of the common rules when it comes to querying data for reports:

  • Only return the columns you need
  • Filter your data to reduce the number of results returned
  • Ensure that your report-side columns present database values in a readable, user-friendly way

The approach demonstrated here is the one that adheres most closely to the above and, therefore, I feel that the extra effort is warranted; particularly when it means that those who are more familiar with CRM can stick to the tools they know best. As the default Dynamics 365 connector uses the Organization Data service, I would imagine that it will eventually be updated to use the new Web API instead. I hope that, when this happens, we can begin to achieve all of the above via the PowerBI interface, as opposed to resorting to code.