A software upgrade/update always starts out with good intentions and hopeful optimism. Two opposing schools of thought tend to emerge when it comes to the merits and necessity of always being on the latest version of an application, and the arguments for and against will generally look something like this:

In Favour

  • Having the latest version protects against potential exploits or security vulnerabilities present within an older version of the application.
  • The latest version of the application is easier to use, supports X feature etc.
  • The application will be end of life in the very near future, so we need to be proactive as opposed to reactive*

Against

  • The upgrade will take time to complete – we will have to perform in-depth testing and schedule time outside of normal business hours to carry this out.
  • As part of the XXX update, X feature is no longer available and we have to use Y feature instead.
  • The current version of the application works just fine – upgrading could cause us problems as a result of a bad security patch or similar.

*"Very near future" is becoming a more common refrain these days, particularly with Microsoft Cloud products. For example, Office 2013 is technically considered end of life for ProPlus customers at the end of this month.

Whilst I would argue strongly that keeping your important business applications up to date should always be a top priority, the reality is less straightforward, and even I have fallen foul of this in the past. Case in point: I recently upgraded a personal machine from Windows 10 to the Windows 10 Anniversary Update; the machine had Hyper-V installed, along with a number of virtual images. The update itself went fine, but afterwards I was informed that the latest version of the .NET Framework needed to be downloaded. I dismissed the prompt, as I was in the middle of something else, and then spent many hours later on attempting to figure out why the update had removed the Hyper-V feature from my machine. After researching, I determined it was because of that very prompt when first booting up Windows: the updated version of Hyper-V required the latest .NET Framework. I was able to get the role reinstalled and re-configure all of my virtual images accordingly, but it took some time and was definitely an unwelcome distraction!

Suffice it to say, an upgrade can never be guaranteed to go exactly to plan, which is why I would always encourage dedicated testing environments within your business for your primary IT systems. These grant you sufficient latitude to perform all required testing of an update and give you the confidence that it can be deployed safely into your production environment(s).

Of course, the above does not help very much if you are upgrading your test environment and everything goes wrong there, as happened to me recently. The business in question wanted to upgrade from Visual Studio 2013 to Visual Studio 2015. Their development environment was a virtualised remote desktop server, which all of the developers logged into as their primary working environment. Development was carried out using the "out of the box" templates included in Visual Studio (C#, ASP.NET etc.) and also using SQL Server Data Tools for BIDS/SSIS development. All projects/solutions were stored in a central Git repository.

The process of installing Visual Studio 2015 and the then-current "production ready" 16.5 version of SQL Server Data Tools for Visual Studio 2015 went (rather too) swimmingly, and we began running tests to confirm that all Team Services projects opened correctly. We immediately hit a problem when attempting to open certain .rdl report files – Visual Studio would hang for 10-15 minutes every time the report was opened, with the following prompt stuck on the screen and the application remaining in a non-responsive state:

The report would open fine in the end, but the issue was repeated whenever the report was re-opened.

We initially tried the following in an attempt to resolve the problem:

  • Re-cloned the repository – no joy.
  • Attempted to open the report from VS 2013 – the report opened fine, so the problem was definitely with VS 2015.
  • Created a brand new Report Project template in VS 2015, added the report into the project (both as a copy and as a new report, with the underlying .xml definition copy + pasted) and then tried re-opening – the same issue occurred.

Being officially stumped at this juncture, I then did some further research online to see whether anyone else had encountered the same issue. Fortunately, I came across the following TechNet thread which contained the exact same symptoms we were experiencing:

https://social.technet.microsoft.com/Forums/sqlserver/en-US/ba55ce1b-0bac-4997-9e02-8748dfd38fae/opening-large-reports-in-ssrs-2016-takes-a-long-time-after-migratting-from-ssrs2012?forum=sqlreportingservices&prof=required

The thread seemed to attract some confused answers (in the sense that they didn’t grasp the underlying problem), before petering out with no apparent solution. Without holding my breath too much, I replied to the thread in the hopes of getting a definitive answer, which I received in almost record time:

Yes we did, we got a fix from Microsoft: https://go.microsoft.com/fwlink/?linkid=837939. After installing the reports where opening fine.

Not wishing to look a gift horse in the mouth, I did first double-check the contents of the link to verify it – it turned out to be 17.0 RC2 of SQL Server Data Tools for Visual Studio 2015. It's worth noting that the first hit on Google for SQL Server Data Tools Visual Studio 2015 is the download page for version 16.5, and not the following page, which contains links to both versions:

https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt

Those who have already read through the TechNet thread will know how things panned out in the end but, just to summarise – installing this fixed the issue. So major credit to Erwin de Kreuk for doing all of the "hard work" in finding the solution to this unusual case, and for responding so quickly to my forum post. This is definitely a great example of the old adage "You don't ask, you don't get", and of how the wider community can often prove invaluable when resolving an especially troublesome IT issue.

So what should you do if you are planning to upgrade SSDT from Visual Studio 2013 to Visual Studio 2015?

The key takeaway from the above should be the fact that a release-candidate version of SSDT provided the resolution to the problem at hand. It would, therefore, be foolhardy to recommend a general upgrade from VS 2013 whilst 17.0 remains at release-candidate stage. Given that this version was released at the start of the year, it is reasonable to expect that a production-ready version of 17.0 will arrive in the very near future. I would recommend holding off on your upgrade if your organisation works with a large number of SSRS reports, lest you also fall foul of this surprisingly strange bug.

Those who have experience working with an RDBMS like SQL Server will become accustomed to a certain way of going about things. These can often involve a mixture of "lazy" query writing (e.g. using SELECT *… as opposed to SELECT Column1, Column2…), stylistic preferences in how you write your queries (ALL CAPS or lower case) and best practice approaches. One arguable example of a best practice approach is the use of Stored Procedures, and an illustration of how to use one can most readily demonstrate their benefits. Take a look at the T-SQL query below, which should execute fine against the AdventureWorks2012 sample database:

SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
FROM [HumanResources].[Employee] AS E
 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
 INNER JOIN [HumanResources].[Department] AS D
  ON DH.[DepartmentID] = D.[DepartmentID]
 INNER JOIN [Person].[Person] AS P
  ON E.[BusinessEntityID] = P.[BusinessEntityID]
WHERE DH.[EndDate] IS NULL
AND E.[JobTitle] = 'Production Technician - WC50'

The query returns the data we need, but not in an ideal manner. Consider the following:

  • Executing a query like the above, in-code, as part of an end-user application could expose your database to the risk of an SQL injection attack or similar (see the sketch after this list).
  • The query exposes a lot of information regarding our underlying database structure – information which any client executing the query neither cares about nor should have to worry about.
  • The example is a very precise query, with a specific function – i.e. get me all the current employees who have the Job Title of Production Technician – WC50. If we wanted to instead obtain all Senior Tool Designers, we would have to write a completely separate query to accommodate this.
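
To make the first point concrete, here is a minimal sketch of the kind of in-code querying that invites trouble. The class, method and the jobTitle value standing in for raw user input are all hypothetical; the danger lies in concatenating that input straight into the command text:

using System.Data.SqlClient;

public class UnsafeEmployeeQuery
{
    // DANGER: jobTitle arrives straight from user input. A crafted value such as
    // "' OR 1 = 1 --" rewrites the WHERE clause and returns every employee.
    public static SqlCommand Build(SqlConnection conn, string jobTitle)
    {
        string sql =
            "SELECT P.[FirstName], P.[LastName], E.[JobTitle] " +
            "FROM [HumanResources].[Employee] AS E " +
            " INNER JOIN [Person].[Person] AS P " +
            "  ON E.[BusinessEntityID] = P.[BusinessEntityID] " +
            "WHERE E.[JobTitle] = '" + jobTitle + "'";

        return new SqlCommand(sql, conn);
    }
}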

Implementing a Stored Procedure to encapsulate our query logic immediately addresses the above concerns: it provides us with a single-line query into the database, gives us just the data we need and enables us to utilise the query for other scenarios as well. Setting one up is very straightforward via the CREATE PROCEDURE command – the rest is pretty much what we have put together already:

CREATE PROCEDURE dbo.uspGetEmployeesByJobTitle 
	@JobTitle NVARCHAR(50)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]
	FROM [HumanResources].[Employee] AS E
	 INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH
	  ON E.[BusinessEntityID] = DH.[BusinessEntityID]
	 INNER JOIN [HumanResources].[Department] AS D
	  ON DH.[DepartmentID] = D.[DepartmentID]
	 INNER JOIN [Person].[Person] AS P
	  ON E.[BusinessEntityID] = P.[BusinessEntityID]
	WHERE DH.[EndDate] IS NULL
	AND E.[JobTitle] = @JobTitle
END
GO

By utilising a parameter for our WHERE clause filter on the Job Title, we can pass any valid value to our stored procedure, immediately making our initial query more versatile across our reporting/business application. And, as a bonus, we can now safely take a ten-line query down to one:

EXECUTE dbo.uspGetEmployeesByJobTitle @JobTitle = 'Senior Tool Designer'
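
From application code, the stored procedure also neutralises the injection concern raised earlier, because the parameter value is bound as data rather than spliced into the command text. A minimal sketch, with a hypothetical connection string and class name:

using System;
using System.Data;
using System.Data.SqlClient;

public class SafeEmployeeQuery
{
    public static void PrintEmployees(string jobTitle)
    {
        // Hypothetical connection string - adjust for your own server/database.
        string connectionString = "Server=.;Database=AdventureWorks2012;Integrated Security=true";

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("dbo.uspGetEmployeesByJobTitle", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            // The value travels separately from the command text,
            // so it can never be interpreted as T-SQL.
            cmd.Parameters.Add("@JobTitle", SqlDbType.NVarChar, 50).Value = jobTitle;

            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} {1}", reader["FirstName"], reader["LastName"]);
                }
            }
        }
    }
}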

So we have established that Stored Procedures are wicked cool awesome – but what does this have to do with PowerBI?!? 

Having worked with SQL Server Reporting Services (SSRS) extensively in the past, I have become accustomed to using Stored Procedures as a mechanism for storing underlying query logic within the database, with a straightforward means of referencing it from my .rdl file. I can only assume that this is the "norm" and the preferred method of querying SQL data, as opposed to a direct SELECT statement.

When recently doing some work within PowerBI involving Azure SQL Databases, I was therefore surprised that there was no option to return data via a stored procedure by default. Instead, PowerBI would prefer me to query the underlying table/view objects directly:

Thankfully, when inspecting the underlying PowerQuery used to return an example table from the above, we can see that it doesn't use any kind of SELECT query to get the data:

let
    Source = Sql.Databases("mydatabaseinstance"),
    AdventureWorks2012 = Source{[Name="AdventureWorks2012"]}[Data],
    Production_ProductModel = AdventureWorks2012{[Schema="Production",Item="ProductModel"]}[Data]
in
    Production_ProductModel

Unfortunately, the same cannot be said if you select the Advanced options area and input your own SQL query directly:

let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="SELECT P.[FirstName], P.[LastName], E.[JobTitle], E.[HireDate], D.[Name]#(lf)FROM [HumanResources].[Employee] AS E#(lf) INNER JOIN [HumanResources].[EmployeeDepartmentHistory] AS DH#(lf)  ON E.[BusinessEntityID] = DH.[BusinessEntityID]#(lf) INNER JOIN [HumanResources].[Department] AS D#(lf)  ON DH.[DepartmentID] = D.[DepartmentID]#(lf) INNER JOIN [Person].[Person] AS P#(lf)  ON E.[BusinessEntityID] = P.[BusinessEntityID]#(lf)WHERE DH.[EndDate] IS NULL#(lf)AND E.[JobTitle] = 'Senior Tool Designer'"])
in
    Source

I do NOT recommend you use the above method to query your SQL Server data!

I have spoken previously on this blog with respect to conventions around working with datasets, i.e. only get what you need, and nothing else. As I work more and more with PowerBI, the tool very much seems geared towards flipping this mentality on its head; it has a number of built-in tools that seem to scream out "Just get ALL your data in here, we'll worry about the rest!". I realise that the difference between MB and GB these days, from a storage/cost point of view, is minimal; nevertheless, I still believe it is prudent not to put all your eggs in one basket and to ensure that your business data is not stored in cavalier fashion across a multitude of different cloud services.

With this in mind, it is good to know that you can utilise stored procedures in PowerBI. You basically have two ways in which this can be achieved:

  • Going back to the Advanced options screen above on the SQL Server database wizard, you can EXECUTE your stored procedure directly using the following SQL Statement:
DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''
EXEC (@SP)

Be sure to specify your database, and don't forget to double up the single quotes around the parameter value!

  • If you prefer to use PowerQuery as opposed to the wizard above, then the following code will also work:
let
    Source = Sql.Database("mydatabaseinstance", "AdventureWorks2012", [Query="DECLARE @SP VARCHAR(100) = 'dbo.uspGetEmployeesByJobTitle @JobTitle = ''Senior Tool Designer'''#(lf)EXEC (@SP)"])
in
    Source

In both cases, you will be required to authenticate with the database and your result set should return as follows if using the AdventureWorks2012 example database/code:

Finally, as a best-practice security step, you should ensure that the account connecting to your SQL Server instance is restricted to only EXECUTE the procedures you have specified. This can be achieved via the following T-SQL snippet, executed against your database instance:

GRANT EXECUTE ON OBJECT::dbo.uspMyStoredProcedure  
    TO MySQLServerLogin 

Conclusions or Wot I Think

PowerBI is increasingly becoming a more relevant tool for professionals experienced in traditional Business Intelligence/Reporting Services. The bellwether for this can surely be seen in the current Technical Preview of SQL Server Reporting Services, which includes PowerBI reports built in to the application. Although we have no timescales at this stage as to when the next major version of SQL Server will be released, it is reasonable to assume it will arrive by the end of this year at the earliest, bringing PowerBI reports with it as a new feature. I am really excited about the introduction of PowerBI into SSRS, as it would appear to be a match made in heaven – giving those with experience in both products the flexibility to develop a unified, best-of-breed solution, using traditional .rdl reporting capability and/or PowerQuery/DAX functionality.

With the above on the horizon, the ability to integrate seamlessly with SQL Server and to support traditional, well-proven practices become crucial indicators of whether this match will be over before the honeymoon. I would therefore hope to see the option to access SQL Server data via Stored Procedures become standard when using the built-in data connector within PowerBI. Based on the feedback I have seen online, I'd wager that this feature would be most welcome, and an excellent way of reducing the need for direct coding to achieve a common requirement within PowerBI.

For those who are well versed in rolling out solution updates within Dynamics CRM/365 for Enterprise (CRM/D365E), the process will always have a certain familiarity to it, with a few surprises rolled in now and again. Often, the update will proceed as anticipated; sometimes, you may encounter bizarre issues. I can remember a particularly strange incident last year where a solution import would get to about 90-95% completion…and then the green progress bar would suddenly start rolling back to nothing. The import progress window would then hang with no further guidance or error message. To try and determine the root cause, we had to interrogate the importjob entity within the system, which showed the import job progress stuck at 0.15846281% : / In the end, we had to escalate the issue to Microsoft for further investigation, but rest assured that if you have not yet witnessed your own curious solution import experience, it's definitely in the post 🙂

Thankfully, if you are new to the whole “rolling out solution update” thing, you can be assured that the process is relatively straightforward, and mostly without issue. If you have been handed a set of solution import instructions for the first time, though, you may be wondering why something similar to the following step is included:

Go into the Data Management -> Duplicate Detection Rules page and click Publish on all Duplicate Detection Rules that have a Status Reason of Unpublished

Unfortunately, after importing a solution update, CRM/D365E will automatically unpublish all of your Duplicate Detection Rules. You are therefore required to explicitly publish them again, lest you start to encounter a sudden increase in duplicate records and database storage within your system. The reason why this happens is both understandable and frustrating in equal measure, as outlined in the following MSDN article on the subject:

A duplicate rule condition specifies the name of a base attribute and the name of a matching attribute. For example, specify an account as a base entity and a contact as a matching entity to compare last names and addresses

As part of the above, explicit matchcodes are created for every record that the Duplicate Detection Rule is targeting, based on the current metadata of your CRM/D365E entities and attributes. Because your solution update can potentially alter significant aspects of this metadata, the system automatically unpublishes all Duplicate Detection Rules as a precaution.

The above is perhaps trivial in nature, as the actual effort of re-publishing all Duplicate Detection Rules is negligible. Difficulties can arise if someone innocently overlooks this part of the process, or if your system has many different Duplicate Detection Rules in a mixture of Unpublished/Published states – you would have to make a specific note of which rules were Published before beginning your solution import, so that you can ensure the correct rules are published after the fact. I would have thought that, after so many versions of the product, something would have been added to address this – for example, a checkbox at the start of the Solution Import Wizard that lets you specify whether all currently published rules should be reactivated after the import completes successfully.

If you find that the above is an annoyance you can no longer put up with then, as with many things on the platform, there is a solution that can be deployed in code. The SDK exposes the PublishDuplicateRuleRequest class, which does exactly what it says on the tin – meaning that you can write a plugin that applies this functionality accordingly. The tricky bit comes in determining which Message (i.e. action) on the platform to run this against. CRM/D365E does not expose a SolutionUpdate or SolutionImport message that we can piggy-back onto, so we have to look at the PublishAll message instead – the action that is triggered when you press Publish All Customizations in the system. This works because publishing is generally the action you will always need to take when importing an (unmanaged) solution. As a result, we can write a plugin class that is triggered on the Post-Operation event of this message to automatically publish all Unpublished Duplicate Detection Rules in the system!

The snippet below is adapted from the sample code provided by Microsoft, but has been tweaked as follows:

  • A QueryExpression is used as opposed to QueryByAttribute, since we need to query on two separate attributes and their values – statecode and statuscode. You also cannot return an easily accessible count of all results returned with QueryByAttribute; we will see why this is useful in a few moments.
  • The code explicitly checks for if there are any Unpublished rules first before attempting to proceed further – no point in running code unnecessarily!
  • Instead of activating each rule one-by-one using an Execute request, all of the requests are collected together as part of an ExecuteMultipleRequest, given that we now know the performance benefits that this can have.
  • Tracing has been implemented in liberal amounts, to provide remote debugging from within CRM/D365E.

Here’s the code – just copy into an empty class file on your plugin project, modify the namespace to reflect the name of your project and you will be good to go!

using System;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Crm.Sdk.Messages;

namespace MyPlugin.Plugins
{
    public class PostPublishAll_PublishDuplicateDetectionRules : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            //Obtain the execution context from the service provider.
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            //Get a reference to the Organization service.
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Extract the tracing service for use in debugging sandboxed plug-ins.
            ITracingService tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            tracing.Trace("Tracing implemented successfully!");

            if (context.MessageName == "PublishAll")
            {
                PublishRules(service, tracing);
            }
        }

        private void PublishRules(IOrganizationService service, ITracingService tracing)
        {
            EntityCollection rules = GetDuplicateDetectionRules(service);

            tracing.Trace("Obtained " + rules.TotalRecordCount.ToString() + " duplicate detection rules.");

            if (rules.TotalRecordCount >= 1)
            {
                // Create an ExecuteMultipleRequest object.
                ExecuteMultipleRequest request = new ExecuteMultipleRequest()
                {
                    // Assign settings that define execution behavior: don't continue on error, don't return responses.
                    Settings = new ExecuteMultipleSettings()
                    {
                        ContinueOnError = false,
                        ReturnResponses = false
                    },
                    // Create an empty organization request collection.
                    Requests = new OrganizationRequestCollection()
                };

                //Create a collection of PublishDuplicateRuleRequests, and execute them in one batch.
                foreach (Entity entity in rules.Entities)
                {
                    PublishDuplicateRuleRequest publishReq = new PublishDuplicateRuleRequest { DuplicateRuleId = entity.Id };
                    request.Requests.Add(publishReq);
                }

                service.Execute(request);
            }
            else
            {
                tracing.Trace("Plugin execution cancelled, as there are no duplicate detection rules to publish.");
            }
        }

        private EntityCollection GetDuplicateDetectionRules(IOrganizationService service)
        {
            QueryExpression qe = new QueryExpression("duplicaterule");
            qe.ColumnSet = new ColumnSet("duplicateruleid");

            //An unpublished rule has a statecode of 0 (Inactive) and a statuscode of 0 (Unpublished).
            ConditionExpression condition1 = new ConditionExpression();
            condition1.AttributeName = "statecode";
            condition1.Operator = ConditionOperator.Equal;
            condition1.Values.Add(0);

            ConditionExpression condition2 = new ConditionExpression();
            condition2.AttributeName = "statuscode";
            condition2.Operator = ConditionOperator.Equal;
            condition2.Values.Add(0);

            FilterExpression filter = new FilterExpression();
            filter.FilterOperator = LogicalOperator.And;
            filter.Conditions.Add(condition1);
            filter.Conditions.Add(condition2);

            qe.Criteria.AddFilter(filter);

            //Have to add this, otherwise the record count won't be returned correctly.
            qe.PageInfo.ReturnTotalRecordCount = true;

            return service.RetrieveMultiple(qe);
        }
    }
}

The only caveat with the above is that it is arguably only useful if you are regularly importing unmanaged, as opposed to managed, solutions – the Publish All Customizations step only comes into play for unmanaged solution imports, as managed solutions do not require a publish step. Nevertheless, by rolling out the above into your environment, you no longer need to scrabble around for the mental note you have to make when performing a solution update 🙂

It is often the case, as part of any application or database system, that certain record types will be well-suited to duplication. Whilst this is generally a big no-no for individual customer records or invoice details, for example, there are other situations where the ability to duplicate and slightly modify an existing record becomes incredibly desirable – to the point where end-users are given the ability to perform such duplication themselves.

A good example of this can be found within Dynamics CRM/Dynamics 365 for Enterprise (CRM/D365E). Email Templates are, in essence, a record type that is duplicated whenever a user selects the Template and creates a new Email record from within the application. Whilst there will always be details that need to be modified once the duplication is performed, having the ability to essentially “copy + paste” an existing record can generate the following benefits for a business:

  • Streamlining and adherence to business processes
  • Efficiency savings
  • Brand consistency

CRM/D365E does a pretty good job of making available a number of record types designed solely for this purpose, but a recent real-life example demonstrated a potential gap. A business I was working with was implementing the full sales process within the application – Lead to Opportunity to Quote etc. At the Quote stage, the business's existing process would generally involve having a number of predefined "templates" for Quotes, due to the fact that the business would very regularly quote for the same kind of work, often with little or no variation. Out of the box with CRM/D365E, the sales team would have to create a new Quote record and add every Quote Product line item each time a Quote was required – leading to little or no efficiency benefit from using the application.

To get around the issue, I was tasked with creating a means of setting up a number of "template" Quote records, and then having the ability to quickly copy these template records, along with all of their associated Quote Product records, with some minor details changed in the process (for example, the Name value of the Quote). A workflow was immediately the best candidate for addressing the second requirement of this task but would require some additional development work to bring to fruition. I decided then to look, rather nervously, at creating a custom workflow assembly.

Why nervously? To be frank, although I have had plenty of experience to date with writing plugins for CRM/D365E, I had not previously developed a custom workflow assembly, so I was a little concerned that the learning curve would be steep and take much longer than first anticipated. Fortunately, my fears were unfounded, and I was able to grasp the differences between a plugin and a custom workflow assembly very quickly:

  • Instead of implementing the IPlugin interface, your class needs to inherit from the CodeActivity base class (see the skeleton after this list). As with plugins, and depending on your Visual Studio version, you can then use CTRL + . to implement your Execute method.
  • Context (i.e. the information regarding the who, what and why of the execution: the User, the Entity and the action) is derived from the IWorkflowContext as opposed to the IPluginExecutionContext.
  • Input/Output Parameters are declared as properties on your activity class and can be given a label, a target entity and information regarding the data type that will be passed in/out. For example, to specify an Input Parameter for a Quote EntityReference, with the label Quote Record to Copy, you can use the following snippet:
[Input("Quote Record to Copy")]
[ReferenceTarget("quote")]
public InArgument<EntityReference> QuoteReference { get; set; }
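
Putting these pieces together, a bare-bones activity class ends up looking something like the sketch below. The namespace and class names are hypothetical, and the copy logic itself is elided:

using System.Activities;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

namespace MyPlugin.Workflows
{
    public class CopyQuoteActivity : CodeActivity
    {
        [Input("Quote Record to Copy")]
        [ReferenceTarget("quote")]
        public InArgument<EntityReference> QuoteReference { get; set; }

        protected override void Execute(CodeActivityContext executionContext)
        {
            //Context and services are obtained via GetExtension, rather than a service provider.
            IWorkflowContext context = executionContext.GetExtension<IWorkflowContext>();
            IOrganizationServiceFactory factory = executionContext.GetExtension<IOrganizationServiceFactory>();
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Read the input parameter declared above.
            EntityReference quoteRef = QuoteReference.Get(executionContext);

            //Copy logic goes here...
        }
    }
}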

The rest is as you would expect when writing a C# plugin. It is good to know that the jump from plugins to custom workflow assemblies is not too large, so I would encourage anyone to try writing one if they haven't done so already.

Back to the task at hand…

I implemented the appropriate logic within the custom workflow assembly to first create the Quote, using a Retrieve request to populate the quote variable with the Entity details and fields to copy over.
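
As a sketch of that Retrieve step (assuming quoteRef is the EntityReference obtained from the input parameter shown earlier; new ColumnSet(true) simply brings back every attribute):

Entity quote = service.Retrieve("quote", quoteRef.Id, new ColumnSet(true));

With the quote variable populated, the copy itself looks like this: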

Entity newQuote = quote;
//Clear the unique identifier, so the platform generates a new one on Create
newQuote.Id = Guid.Empty;
newQuote.Attributes.Remove("quoteid");
//Rename the copy so it can be distinguished from the original
newQuote.Attributes["name"] = "Copy of " + newQuote.GetAttributeValue<string>("quotenumber");
Guid newQuoteID = service.Create(newQuote);

The important thing to remember here is that you must set the ID of the record to blank and also remove the quoteid attribute from newQuote – otherwise, your code will attempt to create the new record with the existing GUID of the copied record, resulting in an error.

Next, I performed a RetrieveMultiple request, based off a QueryExpression, to return all Quote Product records related to the existing record. Once I had my results in my qp EntityCollection, I implemented my logic as follows:

foreach (Entity product in qp.Entities)
{
    Entity newProduct = product;
    newProduct.Id = Guid.Empty;
    newProduct.Attributes.Remove("quotedetailid");
    newProduct.Attributes["quoteid"] = new EntityReference("quote", newQuoteID);
    service.Create(newProduct);
}

After deploying to CRM and setting up the corresponding Workflow that referenced the assembly, I began testing. I noticed that the Workflow would occasionally fail on certain Quote records, with the following error message:

The plug-in execution failed because the operation has timed-out at the Sandbox Client.System.TimeoutException

The word Sandbox immediately made me think back to one of the key differences between CRM/D365E Online and the On-Premise version, specifically the detail pertaining to custom code deployed to Online versions of the application: it must always be deployed in Sandbox mode which, by default, only allows your code to run for a maximum of 2 minutes. If it exceeds this, the plugin/workflow will immediately fail and throw back the error message above. Upon closer investigation, the error was only being thrown for Quote records that had a lot of Quote Products assigned to them. I made the assumption that the workflow was taking longer than 2 minutes because my code was performing a Create request against CRM for every Quote Product record and, as part of this, only proceeding to the next record once a success/failure response was returned from the application.

The challenge was therefore to find an alternative means of creating the Quote Product records that would not cause the Workflow to fail. After doing some research, I came across a useful MSDN article and code example that utilised the ExecuteMultipleRequest message:

You can use the ExecuteMultipleRequest message to support higher throughput bulk message passing scenarios in Microsoft Dynamics 365 (online & on-premises), particularly in the case of Microsoft Dynamics 365 (online) where Internet latency can be the largest limiting factor. ExecuteMultipleRequest accepts an input collection of message Requests, executes each of the message requests in the order they appear in the input collection, and optionally returns a collection of Responses containing each message’s response or the error that occurred.

Source: https://msdn.microsoft.com/en-us/library/jj863631.aspx

Throwing caution to the wind, I repurposed my code as follows, in this instance choosing not to return a response for each request:

// Create an ExecuteMultipleRequest object.
ExecuteMultipleRequest request = new ExecuteMultipleRequest()
{
    // Assign settings that define execution behavior: don't continue on error, don't return responses.
    Settings = new ExecuteMultipleSettings()
    {
        ContinueOnError = false,
        ReturnResponses = false
    },
    // Create an empty organization request collection.
    Requests = new OrganizationRequestCollection()
};

// Queue a CreateRequest for each copied Quote Product, rather than executing them one-by-one.
foreach (Entity product in qp.Entities)
{
    Entity newProduct = product;
    newProduct.Id = Guid.Empty;
    newProduct.Attributes.Remove("quotedetailid");
    newProduct.Attributes["quoteid"] = new EntityReference("quote", newQuoteID);
    CreateRequest cr = new CreateRequest { Target = newProduct };
    request.Requests.Add(cr);
}

service.Execute(request);

Thankfully, after re-testing, we no longer encountered the same errors on our particularly large Quote test records.

As a learning experience, the above has been very useful in showcasing how straightforward custom workflow assemblies are when coming from a primarily plugin development background. It has also presented an alternative method for creating batches of records within CRM/D365E in a way that does not cause severe performance detriment. I was surprised, however, that there is no out-of-the-box means of quickly copying existing records, thereby requiring a code-based approach to resolve. Quotes are an excellent example of an Entity that could benefit from Template-isation in the near future, in order to expedite common order scenarios and help prevent carpal tunnel syndrome for CRM users the world over 🙂