This is an accompanying blog post to my YouTube video Dynamics 365 Customer Engagement Deep Dive: Creating a Basic Jscript Form Function, the first in a series that aims to provide tutorials on how to accomplish developer focused tasks within Dynamics 365 Customer Engagement. You can watch the video in full below:

Below you will find links to access some of the resources discussed as part of the video and to further reading topics.

PowerPoint Presentation (click here to download)

Full Code Sample

function changeAddressLabels() {

    //Helper that guards against controls which are not present on the current form

    function setControlLabel(controlName, labelText) {
        var control = Xrm.Page.getControl(controlName);
        if (control)
            control.setLabel(labelText);
    }

    //Get the control for each composite address field and set the label to the correct,
    //Anglicised form. Each call takes the current control name and the updated label name

    setControlLabel("address1_composite_compositionLinkControl_address1_line1", "Address 1");
    setControlLabel("address1_composite_compositionLinkControl_address1_line2", "Address 2");
    setControlLabel("address1_composite_compositionLinkControl_address1_line3", "Address 3");
    setControlLabel("address1_composite_compositionLinkControl_address1_city", "Town");
    setControlLabel("address1_composite_compositionLinkControl_address1_stateorprovince", "County");
    setControlLabel("address1_composite_compositionLinkControl_address1_postalcode", "Postal Code");
    setControlLabel("address1_composite_compositionLinkControl_address1_country", "Country");

    setControlLabel("address2_composite_compositionLinkControl_address2_line1", "Address 1");
    setControlLabel("address2_composite_compositionLinkControl_address2_line2", "Address 2");
    setControlLabel("address2_composite_compositionLinkControl_address2_line3", "Address 3");
    setControlLabel("address2_composite_compositionLinkControl_address2_city", "Town");
    setControlLabel("address2_composite_compositionLinkControl_address2_stateorprovince", "County");
    setControlLabel("address2_composite_compositionLinkControl_address2_postalcode", "Postal Code");
    setControlLabel("address2_composite_compositionLinkControl_address2_country", "Country");
}

Download/Resource Links

Visual Studio 2017 Community Edition

Setup a free 30 day trial of Dynamics 365 Customer Engagement

W3 Schools JavaScript Tutorials

Source Code Management Solutions

Further Reading

MSDN – Use JavaScript with Microsoft Dynamics 365

MSDN – Use the Xrm.Page. object model

MSDN – Xrm.Page.ui control object

MSDN – Overview of Web Resources

Debugging custom JavaScript code in CRM using browser developer tools (steps are for Dynamics CRM 2016, but still apply for Dynamics 365 Customer Engagement)

Have any thoughts or comments on the video? I would love to hear from you! I’m also keen to hear any ideas for future video content as well. Let me know by leaving a comment below or in the video above.

Working in-depth amidst the Sales entities (e.g. Product, Price List, Quote etc.) within Dynamics CRM/Dynamics 365 Customer Engagement (CRM/D365CE) can produce some unexpected complications. What you may think is simple to achieve at the outset, based on how other entities work within the system, often leads you in a completely different direction. A good rule of thumb is that any overtly complex customisations to these entities will mean having to get down and dirty with C#, VB.Net or even JScript. For example, we’ve seen previously on the blog how, with a bit of developer expertise, it is possible to overhaul the entire pricing engine within the application to satisfy specific business requirements. There is no way in which this can be modified directly through the application interface, which can lead to CRM deployments that make imaginative and complicated utilisation of features such as Workflows, Business Rules and other native functionality. Whilst there is nothing wrong with this approach per se, the end result is often implementations that look messy when viewed cold and that become increasingly difficult to maintain in the long term. As always, there is a balance to be found, and any approach which makes prudent use of both application features and bespoke code is arguably the most desirable end goal for achieving certain business requirements within CRM/D365CE.

To prove my point around Sales entity “oddities”, a good illustration can be found when it comes to working with relationship field mappings and Product records. One of the most useful features at the disposal of CRM customisers is the ability to configure automated field mapping between Entities that have a one-to-many (1:N) relationship between them. What this means, in simple terms, is that when you create a many (N) record from the parent entity (1), you can automatically copy the field values to a matching field on the related entity. This can help to save data entry time when qualifying a Lead to an Opportunity, as all the important field data you need to continue working on the record will be there ready on the newly created Opportunity record. Field mappings can be configured from the 1:N relationship setting window, via the Mappings button:

There are a few caveats to bear in mind – you can only map across fields that have the same underlying data type and you cannot map multiple source fields to the same target (it should be obvious why this is 🙂 ) – but on the whole, this is a handy application feature that those who are more accustomed to CRM development should always bear in mind when working with CRM/D365CE.
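
Incidentally, this mapping behaviour can also be invoked from code: when the application creates a related record in this way, it applies the configured mappings via the SDK’s InitializeFrom message. Below is a minimal sketch of calling it yourself – assuming you already have an IOrganizationService connection to hand and an illustrative Lead ID – which can be a handy way of testing how your mappings behave:

using System;

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static Guid CreateOpportunityFromLead(IOrganizationService service, Guid leadId)
{
    //Ask the platform to build a new Opportunity, pre-populated according to the
    //1:N field mappings configured between the Lead and Opportunity entities

    InitializeFromRequest request = new InitializeFromRequest
    {
        EntityMoniker = new EntityReference("lead", leadId),
        TargetEntityName = "opportunity",
        TargetFieldType = TargetFieldType.ValidForCreate
    };

    InitializeFromResponse response = (InitializeFromResponse)service.Execute(request);
    Entity opportunity = response.Entity;

    //The returned record is not yet saved - create it to persist the mapped values

    return service.Create(opportunity);
}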

Field mappings are, as indicated, a standard feature within CRM/D365CE – but when you inspect the field relationships between the Product and Quote Product entity, there is no option to configure mappings at all:

Upon closer inspection, many of the relationships between the Product entity and others involved as part of the sales order process are missing the ability to configure field mappings. So, for example, if you have a requirement to map across the value of the Description field to a newly created Quote Product record, you would have to look at implementing a custom plugin to achieve your requirements. The main benefit of this route is that we have relatively unrestricted access to the record data we need as part of a plugin execution and – in addition – we can piggyback onto the record creation process to add on our required field “in-flight” – i.e. whilst the record is being created. The code for achieving all of this is as follows:

using System;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PreQuoteProductCreate_GetProductAttributeValues : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            //Obtain the execution context from the service provider.

            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            //Get a reference to the Organization service.

            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //Extract the tracing service for use in debugging sandboxed plug-ins

            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            tracingService.Trace("Tracing implemented successfully!");

            if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
            {
                Entity qp = (Entity)context.InputParameters["Target"];

                //Only execute for Quote Product records that are not write-in products,
                //i.e. those with an associated Product record

                EntityReference product = qp.GetAttributeValue<EntityReference>("productid");

                if (product != null)
                {
                    Entity p = RetrieveProduct(service, product.Id);
                    string desc = p.GetAttributeValue<string>("description");
                    tracingService.Trace("Product Description = " + desc);
                    qp.Attributes["description"] = desc;
                }
                else
                {
                    tracingService.Trace("Quote Product with record ID " + qp.GetAttributeValue<Guid>("quotedetailid").ToString() + " does not have an associated Product record, cancelling plugin execution.");
                    return;
                }
            }
        }

        public Entity RetrieveProduct(IOrganizationService service, Guid productID)
        {
            ColumnSet cs = new ColumnSet("description"); //Additional fields can be specified using a comma separated list

            //Retrieve the matching Product record

            return service.Retrieve("product", productID, cs);
        }
    }
}

The key thing to remember when registering your Plugin via the Plugin Registration Tool (steps which regular readers of the blog should have a good awareness of) is to ensure that the Event Pipeline Stage of Execution is set to Pre-operation. From there, the world is your oyster – you could look at returning additional fields from the Product entity to update on your Quote Product record, or you could even look at utilising the same plugin for the Order Product and Invoice Product entities (both of these entities also have a Description field, so the above code should work on them as well).
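
To illustrate, here is a minimal sketch of how that generalisation might look – the entity logical names are the standard ones (quotedetail, salesorderdetail and invoicedetail), but the class name and exact structure are illustrative, so treat this as a starting point rather than a finished implementation:

using System;

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace D365.BlogDemoAssets.Plugins
{
    public class PreSalesDetailCreate_GetProductDescription : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            if (!context.InputParameters.Contains("Target") || !(context.InputParameters["Target"] is Entity))
                return;

            Entity target = (Entity)context.InputParameters["Target"];

            //Quote Product, Order Product and Invoice Product all carry the same
            //productid lookup and description field, so one plugin can serve all three

            if (target.LogicalName != "quotedetail" &&
                target.LogicalName != "salesorderdetail" &&
                target.LogicalName != "invoicedetail")
                return;

            EntityReference product = target.GetAttributeValue<EntityReference>("productid");

            //Write-in products have no associated Product record, so nothing to copy

            if (product == null)
                return;

            Entity p = service.Retrieve("product", product.Id, new ColumnSet("description"));
            target["description"] = p.GetAttributeValue<string>("description");
        }
    }
}

You would then register the assembly once and add a Pre-operation Create step for each of the three entities.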

It’s a real shame that Field Mappings are not available to streamline the population of record data from the Product entity, and that there is no way to utilise features such as Workflows as an alternate means of achieving the requirement exemplified in this post. This scenario is another good reason why you should always strive to be a Dynamics 365 Swiss Army Knife, ensuring that you have a good awareness of periphery technology areas that can aid you greatly in mapping business requirements to CRM/D365CE.

Dynamics CRM/Dynamics 365 for Customer Engagement (CRM/D365CE) is an incredibly flexible application for the most part. Regardless of how your business operates, you can generally tailor the system to suit your requirements and extend it to your heart’s content; often to the point where it is completely unrecognisable from the base application. Notwithstanding this argument, you will come across aspects of the application that are (literally) hard-coded to behave a certain way and cannot be straightforwardly overridden via the application interface. The most recognisable example of this is the Lead Qualification process. You are heavily restricted in how this piece of functionality acts by default but, thankfully, there are ways in which it can be modified if you are comfortable working with C#, JScript and Ribbon development.

Before we can start to look at options for tailoring the Lead Qualification process, it is important to understand what occurs during the default action within the application. In developer-speak, this is generally referred to as the QualifyLead message and most typically executes when you click the button below on the Lead form:

When called by default, the following occurs:

  • The Status/Status Reason of the Lead is changed to Qualified, making the record inactive and read-only.
  • New Opportunity, Contact and Account records are created and populated with some of the details entered on the Lead record. For example, the Contact record will have the First Name/Last Name values supplied on the preceding Lead record.
  • You are automatically redirected to the newly created Opportunity record.

This is all well and good if you are able to map your existing business processes to the application, but most organisations will typically differ from the application’s B2B-orientated focus. For example, if you are working within a B2C business process, creating an Account record may not make sense, given that this is typically used to represent a company/organisation. Or, conversely, you may want to jump straight from a Lead to a Quote record. Both of these scenarios currently require bespoke development to accommodate within CRM/D365CE. This can be broadly categorised into two distinct pieces of work:

  1. Modify the QualifyLead message during its execution to force the desired record creation behaviour.
  2. Implement client-side logic to ensure that the user is redirected to the appropriate record after qualification.

The remaining sections of this post will demonstrate how you can go about achieving the above requirements in two different ways.

Our first step is to “intercept” the QualifyLead message at runtime and inject our own custom business logic instead

I have seen a few ways that this can be done. One way, demonstrated here by the always helpful Jason Lattimer, involves creating a custom JScript function and a button on the form to execute your desired logic. As part of this code, you can then specify your record creation preferences. A nice and effective solution, but one that, in the guise above, will soon become obsolete as a result of the SOAP endpoint deprecation. An alternative is to instead deploy a simple C# plugin class that ensures your custom logic is obeyed across the application, and not just when you are working from within the Lead form (e.g. you could have a custom application that qualifies leads using the SDK). Here’s how the code would look in practice:

public void Execute(IServiceProvider serviceProvider)
{
    //Obtain the execution context from the service provider.

    IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

    if (context.MessageName != "QualifyLead")
        return;

    //Get a reference to the Organization service.

    IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
    IOrganizationService service = factory.CreateOrganizationService(context.UserId);

    //Extract the tracing service for use in debugging sandboxed plug-ins

    ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

    tracingService.Trace("Input parameters before:");
    foreach (var item in context.InputParameters)
    {
        tracingService.Trace("{0}: {1}", item.Key, item.Value);
    }

    //Modify the below input parameters to suit your requirements.
    //In this example, only a Contact record will be created

    context.InputParameters["CreateContact"] = true;
    context.InputParameters["CreateAccount"] = false;
    context.InputParameters["CreateOpportunity"] = false;

    tracingService.Trace("Input parameters after:");
    foreach (var item in context.InputParameters)
    {
        tracingService.Trace("{0}: {1}", item.Key, item.Value);
    }
}

To work correctly, you will need to ensure this is registered on the Pre-operation stage, as by the time the message reaches the Post-operation stage, it will be too late to modify the QualifyLead message.
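
For completeness, the Lead form is not the only place a qualification can originate: here is a rough sketch of how a custom application might fire the same message via the SDK – assuming a standard IOrganizationService connection, with the Lead ID as a placeholder – which the plugin above would intercept in exactly the same way:

using System;

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static void QualifyLead(IOrganizationService service, Guid leadId)
{
    QualifyLeadRequest request = new QualifyLeadRequest
    {
        LeadId = new EntityReference("lead", leadId),
        CreateContact = true,
        CreateAccount = false,
        CreateOpportunity = false,
        //3 is the default "Qualified" Status Reason value for the Lead entity
        Status = new OptionSetValue(3)
    };

    QualifyLeadResponse response = (QualifyLeadResponse)service.Execute(request);

    //References to any records created as part of the qualification
    foreach (EntityReference created in response.CreatedEntities)
        Console.WriteLine("{0}: {1}", created.LogicalName, created.Id);
}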

The next challenge is to handle the redirect to your record of choice after Lead qualification

Jason’s code above handles this effectively, with a redirect after the QualifyLead request has completed successfully to the newly created Account (which can be tweaked to redirect to the Contact instead). The downside of the plugin approach is that no equivalent redirect functionality is provided for you. So, if you choose to disable the creation of an Opportunity record and then press the Qualify Lead button…nothing will happen. The record will qualify successfully (which you can confirm by refreshing the form), but you will then have to manually navigate to the record(s) that have been created.

The only way around this with the plugin approach is to look at implementing a similar solution to the above – a Web API request to retrieve your newly created Contact/Account record and then perform the necessary redirect to your chosen entity form:

function redirectOnQualify() {

    setTimeout(function(){
        
        var leadID = Xrm.Page.data.entity.getId();

        leadID = leadID.replace("{", "");
        leadID = leadID.replace("}", "");

        var req = new XMLHttpRequest();
        req.open("GET", Xrm.Page.context.getClientUrl() + "/api/data/v8.0/leads(" + leadID + ")?$select=_parentaccountid_value,_parentcontactid_value", true);
        req.setRequestHeader("OData-MaxVersion", "4.0");
        req.setRequestHeader("OData-Version", "4.0");
        req.setRequestHeader("Accept", "application/json");
        req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
        req.setRequestHeader("Prefer", "odata.include-annotations=\"OData.Community.Display.V1.FormattedValue\"");
        req.onreadystatechange = function () {
            if (this.readyState === 4) {
                req.onreadystatechange = null;
                if (this.status === 200) {
                    var result = JSON.parse(this.response);
                    
                    //Uncomment based on which record you wish to redirect to.
                    //Currently, this will redirect to the newly created Account record
                    var accountID = result["_parentaccountid_value"];
                    Xrm.Utility.openEntityForm('account', accountID);

                    //var contactID = result["_parentcontactid_value"];
                    //Xrm.Utility.openEntityForm('contact', contactID);

                }
                else {
                    alert(this.statusText);
                }
            }
        };
        req.send();
        
    }, 6000);     
}

The code is set to execute the Web API call 6 seconds after the function triggers. This is to allow adequate time for the QualifyLead request to finish, so that the lookup fields we need are populated and available to retrieve.

To deploy this, we use the eternally useful Ribbon Workbench to access the existing Qualify Lead button and add a custom command that will fire alongside the default one:

As this post has hopefully demonstrated, overcoming challenges within CRM/D365CE can often result in different – but no less effective – approaches to achieve your desired outcome. Let me know in the comments below if you have found any other ways of modifying the default Lead Qualification process within the application.

Although CRM Online/Dynamics 365 for Enterprise (D365E) does provide a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot just connect straight up to the application’s SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data out into a format that best pleases you, or even take the plunge and get a developer to write a bespoke application that satisfies your integration requirements.

With the recent D365E release, and in line with Microsoft’s long-standing approach to customer data within their applications (i.e. “It’s yours! So just do what you want with it!”), the parallel introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product’s arsenal. By using the service, relatively straightforward integration requirements can be satisfied in a pinch, and a lot of the headache involved in setting up a backup of your organisation’s databases/LOB reporting application can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using this tool is not going to break the bank too much either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free*, add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point to make, given that the cost difference per month can amount to hundreds of pounds.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) Cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault, although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you’re doing, you can get all of the above setup very quickly; in some cases, less than 10 minutes if you opt to script out the entire deployment via PowerShell.

For your setup with D365E, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service is required to automatically sign into an external provider, you may also need to verify that your web browser pop-up settings/firewall are configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option (a programmatic sketch of this follows the list below):

  • Any Relationships that you want to include as part of the sync: To clarify, this is basically asking if you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options for this will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:
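
As an aside, if you have a lot of custom entities to enable, the Change Tracking option referenced above can also be toggled programmatically via the metadata API. A minimal sketch, assuming a standard IOrganizationService connection (the entity name passed in is illustrative):

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

public static void EnableChangeTracking(IOrganizationService service, string logicalName)
{
    //Retrieve the current entity definition

    RetrieveEntityRequest retrieveRequest = new RetrieveEntityRequest
    {
        LogicalName = logicalName,
        EntityFilters = EntityFilters.Entity
    };

    RetrieveEntityResponse retrieveResponse = (RetrieveEntityResponse)service.Execute(retrieveRequest);
    EntityMetadata metadata = retrieveResponse.EntityMetadata;

    //Flick the Change Tracking switch and push the definition back up

    metadata.ChangeTrackingEnabled = true;
    service.Execute(new UpdateEntityRequest { Entity = metadata });

    //Metadata changes need to be published before they take effect

    service.Execute(new PublishAllXmlRequest());
}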

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities’ dependent records have not yet been synced across (for example, any Opportunity record that references the Account named Test Company ABC Ltd. will not sync until this Account record has been exported successfully). So rather than attempting to interrogate the error logs straightaway, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved. My experience, however, suggests it is surprisingly quick – for example, just under 1 million records took around 3 hours to sync. The fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring a smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay between a change occurring and it appearing in the destination database appears to be very small indeed – often tens of minutes, if not just minutes. This, therefore, makes the Data Export Service an excellent candidate for a backup/primary reporting database to satisfy any requirements that cannot be achieved via FetchXML alone.
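
To illustrate the kind of reporting this unlocks, here is a minimal sketch of querying the replicated database directly – assuming the default dbo schema, no object prefix and that the account and opportunity entities were included in the export profile (tables and columns take their names from the corresponding entity/attribute logical names):

using System;
using System.Data.SqlClient;

public static void ReportOnAccounts(string connectionString)
{
    //connectionString is a placeholder for your Azure SQL database connection details

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();

        //A join/aggregation of the sort that FetchXML alone would struggle with -
        //a count of Open Opportunities per Account

        string sql = @"SELECT a.name, COUNT(*) AS OpenOpportunities
                       FROM dbo.account a
                       INNER JOIN dbo.opportunity o ON o.parentaccountid = a.accountid
                       WHERE o.statecode = 0
                       GROUP BY a.name";

        using (SqlCommand command = new SqlCommand(sql, connection))
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
                Console.WriteLine("{0}: {1}", reader["name"], reader["OpenOpportunities"]);
        }
    }
}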

One small bug I have observed is with how the application deals with the listmember intersect entity. You may get errors thrown back indicating that records have failed to sync across successfully which, upon closer inspection, is not the case. Hopefully, this is something that will get ironed out; it is likely due to the rather strange way that the listmember entity appears to behave when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive way of replacing these services. That’s not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In these cases, the names mentioned above will no doubt be the best services to look at to achieve your requirements. I also have a few concerns that the setup involved as part of the Data Export Service could be a barrier towards its adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, the Export Profile Wizard actually going off and creating all the required resources in Azure by simply entering your Azure login credentials.

The D365E release has brought a lot of great, oft-requested functionality and features to the table that add real benefit to organisations who already use, or plan to use, the application in the future. The Data Export Service is perhaps one of the great underdog features of the release, and is one that you should definitely consider using if you want a relatively smooth-sailing data export experience.

An oft-requested requirement as part of any Dynamics CRM/Dynamics 365 for Enterprise (D365E) deployment is a level of integration with another application system. In some of these cases, this will involve pulling through external web pages and passing them form-level attribute values, to load an external system’s report, record page etc. From a CRM/D365E point of view, this can be very straightforwardly achieved thanks to some of the functionality provided as part of the Xrm.Page object model. For example, let’s assume that you have an IFrame control on your form and you want to load an ASP.NET web page, passing the ID of the record as a query parameter in the URL. Set up your IFrame on your form with a placeholder URL and set it to hidden. Then, a JScript function like this on the OnLoad event would get the job done for you:

function loadIFrame() {

    //Get the current record ID

    var entityID = Xrm.Page.data.entity.getId();

    //Replace { & } with their appropriate URL counterparts in entityID

    entityID = entityID.replace("{", "%7b");
    entityID = entityID.replace("}", "%7d");

    //Create the URL

    var url = "http://myexternalwebpage.com/MyAspPage.aspx?id=" + entityID;

    //Then, update the IFrame with the new URL and make it visible on the form

    Xrm.Page.getControl("IFRAME_myiframe").setSrc(url);
    Xrm.Page.getControl("IFRAME_myiframe").setVisible(true);
}

What helps with the above is that there are well-documented code samples that assist when putting together this example, so you can be confident that the solution will work and is fully supported.

Things get a little more complicated once we are operating outside the standard CRM/D365E environment. Assume that instead of displaying this IFrame control on a form, it needs to be displayed as part of an Entity Form in Adxstudio/CRM Portals. Here is where the head scratching can commence in earnest, and you need to look at getting your hands dirty writing custom code to achieve your requirements. There are a few hurdles to overcome in the first instance:

  • How do you access attribute values from an Entity Form, such as a record ID?
  • Once you are able to access the attribute value, how do you make use of it on your Entity Form?
  • How do you embed an IFrame within an Entity Form?

Let’s take a look at one approach to the above, working on the same basis as above – an external URL that we pass the record ID to, from within an Entity Form Web Page. Things may get a bit more difficult if you need to access other entity attribute values, which may require some kind of trickery with Liquid Templates to achieve successfully.

Accessing Entity Form Record ID

When your Entity Form page is loaded on your Portal, there are a number of properties regarding the record that are exposed on the underlying web page – the name of the entity, the record ID, and the Status and Status Reason values. These can be accessed via a div element on the page, which can be viewed within the DOM Explorer as part of a web browser’s developer tools (in the below example, Internet Explorer is used):

The id of the div element will always follow the same pattern; the only variable part is the value in the middle, which is the GUID of the Entity Form record within CRM/D365CE, but without the dashes. So you don’t necessarily need to go into the DOM to get this value; as a time-saving mechanism, simply export your Entity Form record into Excel and view the first hidden column to obtain it.

Suffice to say, because we know that this value is accessible when our Portal page loads, we can look at programmatically accessing this via a JScript function. The following snippet will do the trick:

var recordID = document.getElementById('EntityFormControl_31c41a020771e61180e83863bb350f28_EntityFormView_EntityID').value;

Now that we have a means of accessing the attribute value, our options in terms of what we can do with it greatly increase 🙂

Executing Entity Form Custom JScript Functions

There are two ways you can place custom JScript on your portal page – you can either place your functions within the Custom JavaScript field, located on the Entity Form form within CRM:

Functions will be added to the bottom of your Web Page when loaded, meaning they can be freely accessed after the page has loaded. The second way, which leads us nicely onto the next section, is to wrap your JScript function as a custom HTML snippet in the Web Page’s Copy (HTML) field.

Embedding an IFrame on your Web Page

All Web Pages in Adxstudio/Portals – irrespective of what other content the page is loading – contain a Copy (HTML) field. This enables you to write your own bespoke text or other HTML content that is displayed on the Web Page. In the case of an Entity Form Web Page, the content will be displayed just below the Entity Form. Thanks to the ability to access and write our own custom HTML code for this, options for bespoke development are greatly increased – simply click the Source button to switch to the underlying HTML editor:

Then, using a combination of the snippet we used earlier and utilising the <iframe> HTML tag, we can place the following in our Copy (HTML) to do the lot for us – get our record ID, pass it to an external web page and then load this within an IFrame:

<p>
    <script>
        function getEntityID() {
            var url = "http://myexternalwebpage.com/MyAspPage.aspx?id=";
            var entityID = document.getElementById('EntityFormControl_31c41a020771e61180e83863bb350f28_EntityFormView_EntityID').value;
            var iframeSrc = document.getElementById('myiframe').src;

            if (iframeSrc != url + "%7b" + entityID + "%7d") {

                setTimeout(function () {
                    document.getElementById('myiframe').src = url + "%7b" + entityID + "%7d";
                }, 2000);
            }
        }
    </script>
</p>
<h1>My IFrame</h1>
<p>
    <iframe width="725" height="250" id="myiframe" src="" onload="getEntityID();"></iframe>
</p>

The reason why setTimeout is used is to ensure that the entity form <div> element loads correctly, as this is one of the last things that Adxstudio/Portals loads on the page. For obvious reasons, if this hasn’t loaded, then our JScript function will error. Putting this aside, however, the above solution gets us to where we want to be and means that we can achieve the same outcome as the CRM/D365E example demonstrated at the start of this post 🙂

Conclusions or Wot I Think

Adxstudio/Portals presents some interesting and different learning opportunities, given both its genesis as a separate product and its gradual integration into the CRM/D365E family. This can often mean that you have to abandon your base assumptions and ways of thinking when it comes to CRM/D365E development, and instead look at things from a more general approach. I would hope that, in time, we will begin to see the gradual introduction of common XRM object models within CRM Portals, as it is crucially important that there is a unified approach when developing Portal extensions in the future and that we do not end up in a situation where unsupported code becomes rampant across different Portal deployments. This latter concern would be my chief worry with the examples provided in this post, as there is currently no clear way of determining whether the approach taken is supported or considered “best practice” from an Adxstudio/Portals perspective. I would be interested in hearing from anyone in the comments below if they have any thoughts or alternative approaches that they would recommend to achieve the above requirement.