When working with Virtual Machines (VMs) on Azure that have been deployed using a Linux operating system (Ubuntu, Debian, Red Hat etc.), you would be forgiven for assuming that the experience would be rocky when compared with working with Windows VMs. Far from it – you can expect an equivalent level of support for features such as full disk encryption, backup and administrator credentials management directly within the Azure portal, to name but a few. Granted, there may be some difference in the availability of certain extensions when compared with a Windows VM, but the experience on balance is comparable and makes the management of Linux VMs a less onerous journey if your background is firmly rooted in Windows operating systems.

To ensure that the Azure platform can effectively carry out some of the tasks listed above, the Azure Linux Agent is deployed to all newly created VMs. The Agent is written in Python and supports a wide variety of common Linux distributions, and I would recommend never removing it from your Linux VM, no matter how tempting this may be; doing so could have potentially debilitating consequences within Production environments. A good example of this would be if you ever needed to add an additional disk to your VM – this task would become impossible to achieve without the Agent present. As well as having the proper reverence for the Agent, it’s important to keep an eye on how the service is performing, even more so if you are using Recovery Services vaults to take snapshots of your VM and back it up to another location. Otherwise, you may start to see errors like the one below being generated whenever a backup job attempts to complete:

It’s highly possible that many administrators are seeing this error at the time of writing this post (January 2018). The recent disclosures around Intel CPU vulnerabilities have prompted many cloud vendors, including Microsoft, to roll out emergency patches across their entire data centres to address the underlying vulnerability. Whilst it is commendable that cloud vendors have acted promptly to address this flaw, the pace of the work and the dizzying array of resources affected has doubtless led to some issues that could not have been foreseen from the outset. I believe, based on some of the available evidence (more on this later), that one of the unintended consequences of this work is the rise of the above error message with the Azure Linux Agent.

When attempting to deal with this error myself, there were a few different steps I had to try before the backup started working correctly. I have collected all of these below and, if you find yourself in the same boat, one or all of them should resolve the issue for you.

First, ensure that you are on the latest version of the Agent

This one probably goes without saying as it’s generally the most common answer to any error 🙂 However, the steps for deploying updates out onto a Linux machine can be more complex, particularly if your primary method of accessing the machine is via an SSH terminal. The process for updating your Agent depends on your Linux version – this article provides instructions for the most common variants. In addition, it is highly recommended that the auto-update feature is enabled, as described in the article.
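As a rough guide – and assuming an Ubuntu-based VM, since package and service names can differ on other distributions (Red Hat, for example, packages the Agent as WALinuxAgent) – the Agent can be brought up to the latest available version and auto-update switched on along the following lines:

sudo apt-get update
sudo apt-get install --only-upgrade walinuxagent

#Auto-update is controlled via /etc/waagent.conf - ensure the line below is present, then restart the service
#AutoUpdate.Enabled=y
sudo service walinuxagent restart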

Verify that the Agent service starts successfully

It’s possible that the service is not running on your Linux VM. If this is the case, the service can be started by executing the following command:

service walinuxagent start

Be sure to have your administrator credentials handy, as you will need to authenticate to successfully start the service. You can then check that the service is running by executing the ps -e command.
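As a quick sketch, assuming the Agent daemon shows up as waagent in the process list (the exact name can vary between Agent versions and distributions), the following will confirm whether it is running:

ps -e | grep waagent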

Clear out the Agent cache files

The Agent stores a lot of XML cache files within the /var/lib/waagent/ folder, which can sometimes cause issues with the Agent. Microsoft has specifically recommended that the following command is executed on your machine after the 4th January if you are experiencing issues:

sudo rm -f /var/lib/waagent/*.[0-9]*.xml

The command will delete all “old” XML files within the above folder. A restart of the service should not be required. The date mentioned above links back to the theory suggested earlier in this post that the Intel chipset issues and this error message are linked in some way, as the dates seem to tie with when news first broke regarding the vulnerability.

If you are still having problems

Read through the entirety of this article, and try all of the steps suggested – including digging around in the log files, if required. If all else fails, open a support request directly with Microsoft.

Hopefully, by following the above steps, your backups are now working again without issue 🙂

This is an accompanying blog post to my YouTube video Dynamics 365 Customer Engagement Deep Dive: Creating a Basic Jscript Form Function, the first in a series that aims to provide tutorials on how to accomplish developer focused tasks within Dynamics 365 Customer Engagement. You can watch the video in full below:

Below you will find links to access some of the resources discussed as part of the video and to further reading topics.

PowerPoint Presentation (click here to download)

Full Code Sample

function changeAddressLabels() {

    //Get the control for the composite address field and then set the label to the correct, Anglicised form. Each line requires the current control name for 'getControl' and then the updated label name for 'setLabel'

    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_line1").setLabel("Address 1");
    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_line2").setLabel("Address 2");
    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_line3").setLabel("Address 3");
    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_city").setLabel("Town");
    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_stateorprovince").setLabel("County");
    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_postalcode").setLabel("Postal Code");
    Xrm.Page.getControl("address1_composite_compositionLinkControl_address1_country").setLabel("Country");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_line1"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_line1").setLabel("Address 1");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_line2"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_line2").setLabel("Address 2");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_line3"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_line3").setLabel("Address 3");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_city"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_city").setLabel("Town");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_stateorprovince"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_stateorprovince").setLabel("County");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_postalcode"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_postalcode").setLabel("Postal Code");

    if (Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_country"))
        Xrm.Page.getControl("address2_composite_compositionLinkControl_address2_country").setLabel("Country");
}

Download/Resource Links

Visual Studio 2017 Community Edition

Setup a free 30 day trial of Dynamics 365 Customer Engagement

W3 Schools JavaScript Tutorials

Source Code Management Solutions

Further Reading

MSDN – Use JavaScript with Microsoft Dynamics 365

MSDN – Use the Xrm.Page object model

MSDN – Xrm.Page.ui control object

MSDN – Overview of Web Resources

Debugging custom JavaScript code in CRM using browser developer tools (steps are for Dynamics CRM 2016, but still apply for Dynamics 365 Customer Engagement)

Have any thoughts or comments on the video? I would love to hear from you! I’m also keen to hear any ideas for future video content as well. Let me know by leaving a comment below or in the video above.

With 2018 now very firmly upon us, it’s time again to see what’s new in the world of Dynamics 365 certification. Nothing much has changed this time around, but there are a few noteworthy updates to be aware of if you are keen to keep your certifications as up to date as possible. Here’s my round-up of what’s new – let me know in the comments if you think I have missed anything out.

MCSE Business Applications 2018

The introduction of a dedicated Microsoft Certified Solutions Associate (MCSA) and Microsoft Certified Solutions Expert (MCSE) Business Applications certification track for Dynamics 365 was a positive step in highlighting the importance of Dynamics 365 alongside other core Microsoft products. Under the new system, those wishing to maintain a “good standing” MCSE will need to re-certify each year with a brand new exam to keep their certification current for the year ahead. Those who obtained their MCSE last year will now notice on their certification planner the opportunity to attain the 2018 version of the competency via the passing of a single exam. For Dynamics 365 Customer Engagement focused professionals, assuming you only passed either the Sales or Customer Service exam last year, passing the other exam should be all that is required to recertify – unless you fancy your chances trying some of the new exams described below.

New MCSE Exams

Regardless of which boat you find yourself in relating to the Business Applications MCSE, those looking to obtain the 2018 variant of the certification can expect to see two additional exams available that will count towards the necessary award requirements:

The exams above currently only appear on the US Microsoft Learning website but expect them to appear globally within the next few weeks/months.

MB2-877 represents an interesting landmark in Microsoft’s journey towards integrating FieldOne as part of Dynamics CRM/Dynamics 365 Customer Engagement, with it arguably indicating the ultimate fruition of this journey. To pass the exam, you are going to have to have a good grasp of the various entities involved as part of the field service app, as well as a thorough understanding of the mobile application itself. As is typically the case when it comes to Dynamics 365 certification, the Dynamics Learning Portal (DLP) is going to be your destination for preparatory learning resources for the exam; along with a good play around with the application itself within a testing environment. If you have access to the DLP, it is highly recommended you complete the following courses at your own pace before attempting the exam:

Dynamics 365 for Retail is a fairly new addition to the “family” and one which – admittedly – I have very little experience with. Its rather speedy promotion to exam-level status is, therefore, something I find somewhat surprising. This is emphasised further by the fact that there are no dedicated exams for other, similar applications to Field Service, such as Portals. Similar to MB2-877, preparation for MB6-897 will need to be directed primarily through the DLP, with the following recommended courses for preparation:

Exam Preparation Tips

I’ve done several blog posts in the past where I discuss Dynamics CRM/Dynamics 365 exams, offering help to those who may be looking to achieve a passing grade. Rather than repeat myself (or risk breaking any non-disclosure agreements), I’d invite you to cast your eyes over these and (I hope!) that they prove useful in some way as part of your preparation:

If you have got an exam scheduled in, then good luck and make sure you study! 🙂

Did you know that you can write Plug-ins for Dynamics 365 Customer Engagement/Dynamics CRM (D365CE/CRM) using Visual Basic .NET (VB.NET)? You wouldn’t have thought so after a thorough look through the D365CE/CRM Software Development Kit (SDK). Whilst there is a plethora of code examples available for C# plug-ins, no examples are provided on how to write a basic plug-in for the application using VB.NET. This is to be expected, perhaps, given the less prominent status that the language has when compared with C#. Whilst VB.NET knowledge is a great asset to have when extending Office applications via Visual Basic for Applications, you would struggle to find many at-scale application systems that are written using VB.NET. C# is pretty much the de facto language that you need to utilise when developing in .NET, and the commonly held view is that exclusive VB.NET experience is a detriment as opposed to an asset. With this in mind, it is somewhat understandable why the SDK does not have any in-depth VB.NET code examples.

Accepting the above, it is likely however that many long-standing developers will have knowledge of the BASIC language, thereby making VB.NET a natural choice when attempting to extend D365CE/CRM. I do not have extensive experience using the language, but I was curious to see how difficult it would be to implement a plug-in using it – and to hopefully provide assistance to any lonely travellers out there who want to put their VB.NET expertise to the test. The best way to demonstrate this is to take an existing plug-in developed in C# and reverse engineer the code into VB.NET. We previously took a look on the blog at a fully implemented plug-in that can be used to retrieve the name of the User who has created a Lead record. The entire class file for this is reproduced below:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PostLeadCreate_GetInitiatingUserExample : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            // Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity)

            {
                Entity lead = (Entity)context.InputParameters["Target"];

                //Use the Context to obtain the Guid of the user who triggered the plugin - this is the only piece of information exposed.
      
                Guid user = context.InitiatingUserId;

                //Then, use the GetUserDisplayName method to retrieve the fullname attribute value for the record.

                string displayName = GetUserDisplayName(user, service);

                //Build out the note record with the required field values: Title, Regarding and Description field

                Entity note = new Entity("annotation");
                note["subject"] = "Test Note";
                note["objectid"] = new EntityReference("lead", lead.Id);
                note["notetext"] = @"This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" + Environment.NewLine + Environment.NewLine + "Executing User: " + displayName;

                //Finally, create the record using the IOrganizationService reference

                service.Create(note);
            }
        }

        private string GetUserDisplayName(Guid userID, IOrganizationService service)
        {
            //Retrieve the specified systemuser record and return the value of its fullname attribute

            Entity user = service.Retrieve("systemuser", userID, new ColumnSet("fullname"));
            return user.GetAttributeValue<string>("fullname");
        }
    }
}

The code above encapsulates a number of common operations that a plug-in can seek to accomplish – creating a record, obtaining context-specific values and performing a Retrieve operation against an entity – thereby making it a good example for what will follow in this post.

With everything ready to go, it’s time for less talking and more coding 🙂 We’ll build out a VB.NET version of the above class file, covering some of the “gotchas” to expect on each step, before then bringing all of the code together in a finished state.

Importing References

As you would expect within C#, a class file requires references to the D365CE SDK DLL files. These should be imported into your project and then added to your class file using the Imports statement:
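For reference, the two lines in question – lifted from the complete listing towards the end of this post – are as follows:

Imports Microsoft.Xrm.Sdk
Imports Microsoft.Xrm.Sdk.Query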

With these two lines of code, there are immediately two things which you may need to untrain yourself from doing if you have come from a C# background:

  1. Make sure not to add semi-colons at the end of each line, as they are not required in VB.NET
  2. You may be tempted to use the Return key to auto-complete your syntax, which works fine in a C# project…but will instead skip you down to the next line in VB.NET. Instead, use the Tab key to autocomplete any IntelliSense prompts.

Adding a Namespace

By default, a VB.NET class project does not implement a Namespace for your class. This will need to be added next, underneath the project references like so:
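Taken from the finished example further down, the namespace declaration looks like this (the name D365.BlogDemoAssets.VB simply mirrors that example):

Namespace D365.BlogDemoAssets.VB

    'Class code goes here

End Namespace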

Implementing the IPlugin Interface

So far so good…and things continue in the same vein when implementing the IPlugin interface. This is configured like so:
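In outline form, again borrowing from the finished example, the class declaration reads as follows:

Public Class PostLeadCreate_GetInitiatingUserExample
    Implements IPlugin

    'Execute method goes here

End Class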

The only thing to remember here, if you are still in C# mode, is that your colon is replaced with the Implements statement and that this part of the code needs to be moved to the next line.

Putting together the Execute Method

The Execute method is the heart and soul of any plug-in, as this contains the code that will execute when the plug-in is triggered. In C#, this is implemented using a void method (i.e. a block of code that does not return a specific value, object etc.). Its equivalent within VB.NET is a Sub – short for “Subroutine” – which needs to be additionally peppered with an Implements statement pointing to IPlugin.Execute:
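The resulting signature, as used in the complete plug-in below, looks like this:

Private Sub IPlugin_Execute(serviceProvider As IServiceProvider) Implements IPlugin.Execute

    'Plug-in logic goes here

End Sub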

Implementing Variables

Here’s where things start to get different. Variables within C# are generally implemented using the following syntax:

<Type> <Variable Name> = <Variable Value>;

So to declare a string object called text, the following code should be used:

string text = "This is my string text";

Variables in VB.NET, by contrast, are always declared using the Dim keyword, with the variable name and then its Type declared afterwards. Finally, a value can then be (optionally) provided. A complete example of this can be seen below in the declaration of the IPluginExecutionContext variable:
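So, for instance, the C# string example above would translate to something like Dim text As String = "This is my string text", and the IPluginExecutionContext declaration from the finished plug-in reads:

Dim context As IPluginExecutionContext = serviceProvider.GetService(GetType(IPluginExecutionContext))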

In this above example, we also see two additional differences that C# developers have to reconcile themselves with:

  • There is no need to specifically cast the value as an IPluginExecutionContext object – a definite improvement over C# 🙂
  • Rather than using the typeof operator when obtaining the service, the VB.NET equivalent GetType should be used instead.

The process of creating variables can seem somewhat laborious when compared with C#, and there are a few other things to bear in mind with variables and this plug-in specifically. These will be covered shortly.

The If…Then Statement

Pretty much every programming language has an implementation of an If…Else construct to perform decisions based on conditions (answers in the comments if you have found a language that doesn’t!). VB.NET is no different, and we can see how this is implemented in the next bit of the plug-in code:
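From the finished listing, the conditional check around the Target input parameter looks like this:

If (context.InputParameters.Contains("Target") AndAlso TypeOf context.InputParameters("Target") Is Entity) Then

    'Remaining plug-in logic goes here

End If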

Compared with C#, you have to specifically remember to add a Then statement after your conditional test and also to include an End If at the end of your code block. It’s also important to highlight the use of different operators as well – in this case, AndAlso (the short-circuiting VB.NET equivalent of &&) should be used as opposed to &&.

Assigning Values to a CRM Entity Object

The assignment of entity attribute values differs only slightly compared with C# – you just need to ensure that you surround your attribute Logical Name value with brackets as opposed to square brackets:
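The note record from the finished plug-in illustrates this:

Dim note As Entity = New Entity("annotation")
note("subject") = "Test Note"
note("objectid") = New EntityReference("lead", lead.Id)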

String concatenation also works slightly differently. Be sure to use & as opposed to + to achieve the same purpose. For new line breaks, there is also an equivalent VB.NET constant that can be used for this, vbCrLf.

Obtaining the User’s Display Name Value

The final part of the class file is the retrieval of the Full Name value of the user. This has to be done via a Function as opposed to a Sub, as a specific value needs to be returned. Keep in mind the following as well:

  • Parameters that are fed to the Function must always be prefaced with the ByVal statement – again, another somewhat tedious thing to remember!
  • Note that for the GetAttributeValue method, we specify the attribute data type using the syntax (Of String) as opposed to <string>

Other than that, syntax-wise, C# experienced developers should have little trouble re-coding this method into VB.NET. This is evidenced by the fact that the below code snippet is approximately 75% similar to how it needs to be in C#:
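Here is the function as it appears in the finished plug-in:

Private Function GetUserDisplayName(ByVal userID As Guid, ByVal service As IOrganizationService) As String

    Dim user As Entity = service.Retrieve("systemuser", userID, New ColumnSet("fullname"))
    Return user.GetAttributeValue(Of String)("fullname")

End Function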

Bringing it all together

Having gone through the class file from top to bottom, the entire code for the plug-in is reproduced below:

Imports Microsoft.Xrm.Sdk
Imports Microsoft.Xrm.Sdk.Query

Namespace D365.BlogDemoAssets.VB

    Public Class PostLeadCreate_GetInitiatingUserExample
        Implements IPlugin

        Private Sub IPlugin_Execute(serviceProvider As IServiceProvider) Implements IPlugin.Execute

            'Obtain the execution context from the service provider.

            Dim context As IPluginExecutionContext = serviceProvider.GetService(GetType(IPluginExecutionContext))

            'Obtain the organization service reference.
            Dim serviceFactory As IOrganizationServiceFactory = serviceProvider.GetService(GetType(IOrganizationServiceFactory))
            Dim service As IOrganizationService = serviceFactory.CreateOrganizationService(context.UserId)

            'The InputParameters collection contains all the data passed in the message request.
            If (context.InputParameters.Contains("Target") AndAlso TypeOf context.InputParameters("Target") Is Entity) Then

                Dim lead As Entity = context.InputParameters("Target")

                'Use the Context to obtain the Guid of the user who triggered the plugin - this is the only piece of information exposed.

                Dim user As Guid = context.InitiatingUserId

                'Then, use the GetUserDisplayName method to retrieve the fullname attribute value for the record.

                Dim displayName As String = GetUserDisplayName(user, service)

                'Build out the note record with the required field values: Title, Regarding and Description field

                Dim note As Entity = New Entity("annotation")
                note("subject") = "Test Note"
                note("objectid") = New EntityReference("lead", lead.Id)
                note("notetext") = "This is a test note populated with the name of the user who triggered the Post Create plugin on the Lead entity:" & vbCrLf & vbCrLf & "Executing User: " & displayName

                'Finally, create the record using the IOrganizationService reference

                service.Create(note)

            End If

        End Sub
        Private Function GetUserDisplayName(ByVal userID As Guid, ByVal service As IOrganizationService) As String

            Dim user As Entity = service.Retrieve("systemuser", userID, New ColumnSet("fullname"))
            Return user.GetAttributeValue(Of String)("fullname")

        End Function

    End Class

End Namespace

Conclusions or Wot I Think

C# is arguably the de facto choice when programming for D365CE/CRM and, more generally, for .NET. All of the code examples, both within the SDK and online, will favour C#, and I do very much hold the view that development in C# should always be preferred over VB.NET. Is there ever then a good business case for developing in VB.NET over C#? Clearly, if you have a developer available within your business who can do amazing things in VB.NET, it makes sense for this to be the language of choice to use for time-saving purposes. There may also be a case for developing in VB.NET from a proprietary standpoint. VB.NET is, as acknowledged, not as widely disseminated as C#. By developing custom plug-ins in VB.NET that contain sensitive business information, you are arguably safeguarding the business by utilising a language that C# developers may have difficulty in initially deciphering.

All that being said, C# should be preferred when developing plug-ins, custom workflow assemblies or custom applications involving D365CE/CRM. Where some improvement could be made is in ensuring that all supported programming languages are adequately provisioned for within the D365CE SDK moving forward. Because, let’s be honest – there is no point in supporting something if people have no idea how to use it in the first place.

Working in-depth amidst the Sales entities (e.g. Product, Price List, Quote etc.) within Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) can produce some unexpected complications. What you may think is simple to achieve at the outset, based on how other entities work within the system, often leads you in a completely different direction. A good rule of thumb is that any overtly complex customisations to these entities will mean having to get down and dirty with C#, VB.Net or even JScript. For example, we’ve seen previously on the blog how, with a bit of developer expertise, it is possible to overhaul the entire pricing engine within the application to satisfy specific business requirements. There is no way in which this can be modified directly through the application interface, which can lead to CRM deployments that make imaginative and complicated utilisation of features such as Workflows, Business Rules and other native features. Whilst there is nothing wrong with this approach per se, the end result is often implementations that look messy when viewed cold and which become increasingly difficult to maintain in the long term. As always, there is a balance to be found, and any approach which makes prudent use of both application features and bespoke code is arguably the most desirable end goal for achieving certain business requirements within CRM/D365CE.

To prove my point around Sales entity “oddities”, a good illustration can be found when it comes to working with relationship field mappings and Product records. One of the most useful features at the disposal of CRM customisers is the ability to configure automated field mapping between Entities that have a one-to-many (1:N) relationship between them. What this means, in simple terms, is that when you create a many (N) record from the parent entity (1), you can automatically copy the field values to a matching field on the related entity. This can help to save data entry time when qualifying a Lead to an Opportunity, as all the important field data you need to continue working on the record will be there ready on the newly created Opportunity record. Field mappings can be configured from the 1:N relationship setting window, via the Mappings button:

There are a few caveats to bear in mind – you can only map across fields that have the same underlying data type and you cannot map multiple source fields to the same target (it should be obvious why this is 🙂 ) – but on the whole, this is a handy application feature that those who are more accustomed to CRM development should always bear in mind when working with CRM/D365CE.

Field mappings are, as indicated, a standard feature within CRM/D365CE Рbut when you inspect the field relationships between the Product and Quote Product entity, there is no option to configure mappings at all:

Upon closer inspection, many of the relationships between the Product entity and others involved as part of the sales order process are missing the ability to configure field mappings. So, for example, if you have a requirement to map across the value of the Description field on the Product entity to a newly created Quote Product record, you would have to look at implementing a custom plugin to achieve your requirements. The main benefit of this route is that we have relatively unrestricted access to the record data we need as part of a plugin execution session and – in addition – we can piggyback onto the record creation process to add on our required field “in-flight” – i.e. whilst the record is being created. The code for achieving all of this is as follows:

using System;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PreQuoteProductCreate_GetProductAttributeValues : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            //Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            //Get a reference to the Organization service.

            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Extract the tracing service for use in debugging sandboxed plug-ins

            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            tracingService.Trace("Tracing implemented successfully!");

            if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)

            {
                Entity qp = (Entity)context.InputParameters["Target"];

                //Only execute for non-write-in Quote Product records

                EntityReference product = qp.GetAttributeValue<EntityReference>("productid");

                if (product != null)

                {

                    Entity p = RetrieveProductID(service, product.Id);
                    string desc = p.GetAttributeValue<string>("description");
                    tracingService.Trace("Product Description = " + desc);
                    qp.Attributes["description"] = desc;

                }

                else

                {
                    tracingService.Trace("Quote Product with record ID " + qp.GetAttributeValue<Guid>("quotedetailid").ToString() + " does not have an associated Product record, cancelling plugin execution.");
                    return;
                }
            }
        }

        public Entity RetrieveProductID(IOrganizationService service, Guid productID)
        {
            ColumnSet cs = new ColumnSet("description"); //Additional fields can be specified using a comma separated list

            //Retrieve matching record

            return service.Retrieve("product", productID, cs);
        }
    }
}

The key thing to remember when registering your Plugin via the Plugin Registration Tool (steps which regular readers of the blog should have a good awareness of) is to ensure that the Event Pipeline Stage of Execution is set to Pre-operation. From there, the world is your oyster – you could look at returning additional fields from the Product entity to update on your Quote Product record, or you could even look at utilising the same plugin for the Order Product and Invoice Product entities (both of these entities also have a Description field, so the above code should work on them as well).

It’s a real shame that Field Mappings are not available to streamline the population of record data from the Product entity, or that there is no way to utilise features such as Workflows to provide an alternative means of achieving the requirement exemplified in this post. This scenario is another good reason why you should always strive to be a Dynamics 365 Swiss Army Knife, ensuring that you have a good awareness of periphery technology areas that can aid you greatly in mapping business requirements to CRM/D365CE.