Perhaps one of the most useful features at your disposal when working with Azure SQL Databases is the ability to integrate your Azure Active Directory (Azure AD) login accounts, à la Windows Authentication for on-premises SQL Server. There are numerous benefits in shifting away from SQL Server-only user accounts in favour of Azure AD:

  • Ensures consistent login identities across multiple services.
  • Can enforce password complexity and refresh rules more easily.
  • Once configured, they behave exactly the same as standard SQL Server-only logins.
  • Supports advanced usage scenarios involving Azure AD, such as multi-factor authentication and Single Sign-On (SSO) via Active Directory Federation Services (ADFS).

Setup can be completed fairly quickly, although you will need to allocate a single user or group as the Active Directory admin for the Azure SQL Server. You should also take due care when choosing your Active Directory admin(s); one suggestion would be to use a dedicated service account with a strong password for this, instead of granting such extensive privileges to normal user accounts.
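
As a rough sketch of how the admin assignment can be done via PowerShell, assuming the AzureRM module is installed and that the resource group, server and group names below are placeholders:

# Sign in to the subscription containing the Azure SQL Server
Login-AzureRmAccount

# Assign an Azure AD group as the Active Directory admin for the server
Set-AzureRmSqlServerActiveDirectoryAdministrator -ResourceGroupName "MyResourceGroup" -ServerName "mysqlserver" -DisplayName "SQL Admins"

# Verify the assignment has taken effect
Get-AzureRmSqlServerActiveDirectoryAdministrator -ResourceGroupName "MyResourceGroup" -ServerName "mysqlserver"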

Regardless of how you go about configuring the feature, I would recommend using it wherever you can, both for internal purposes and for anyone who wishes to access your SQL Server from an external directory. This second scenario is, you may be surprised to hear, fully supported. It assumes, first off, that you have added the account to your directory as a Guest/External User account. Then, you just follow the normal steps to get the account created on your Azure SQL Server.
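
If you have not yet added the external account, a minimal sketch of doing so via the AzureAD PowerShell module is below – the email address and redirect URL are placeholder values:

# Connect to the Azure AD tenant
Connect-AzureAD

# Invite the external account into the directory as a Guest user
New-AzureADMSInvitation -InvitedUserEmailAddress "john.smith@domain.co.uk" -InviteRedirectUrl "https://portal.azure.com" -SendInvitationMessage $true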

There is one major “gotcha” to bear in mind when doing this. Let’s assume that you have added john.smith@domain.co.uk to the Azure AD tenant test.onmicrosoft.com. You then go to set up this account to access a SQL Server instance on the tenant. You will more than likely receive an error message when using the example syntax below to create the account:

CREATE USER [john.smith@domain.co.uk] FROM EXTERNAL PROVIDER

The issue is, thankfully, simple to understand and fix. When external user accounts are added to your Azure AD tenant, despite having the same login name as in their source directory, they are stored in the new directory with a different UserPrincipalName (UPN). Consider the above example – the UPN in the source directory would be as follows:

john.smith@domain.co.uk

Whereas, as the Azure AD tenant name in this example is test.onmicrosoft.com, the UPN for the object would be:

john.smith_domain.co.uk#EXT#@test.onmicrosoft.com

I assume that this is done to prevent any UPN duplication across Microsoft’s no-doubt dizzying array of cloud Active Directory tenants and forests. In any event, knowing this, we can adjust our code above to suit – and successfully create our Database user account:

CREATE USER [john.smith_domain.co.uk#EXT#@test.onmicrosoft.com] FROM EXTERNAL PROVIDER
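
Once created, and purely as a follow-on sketch (adjust the role membership to suit your own requirements), the account can then be granted permissions in the usual way:

ALTER ROLE db_datareader ADD MEMBER [john.smith_domain.co.uk#EXT#@test.onmicrosoft.com]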

I guess this is one of those things where having at least a casual awareness of how other technologies within the Microsoft “stack” work can greatly assist you in troubleshooting what turn out to be simple errors in your code. Frustrating all the same, but we can claim knowledge of an obscure piece of Azure AD trivia as our end result 🙂

Setting up the Domain Services role on Windows Server for the first time is not something I would recommend doing blind. Whilst not necessarily classing myself as a “newbie” in this respect, I have only run through the process a few times in the past within lab environments. It is always tricky – not just in deploying the role in the first instance, but more via the many quirks that get thrown up as you try to accomplish what should be simple tasks, such as domain-joining devices or getting DNS settings correctly mapped out. These and other tiresome rat races can leave you with a severely scratched head and distract you from your ultimate goal.

For this reason, you could argue that Azure Active Directory Domain Services (or AADDS) is the perfect solution for “newbies”. The thing I like most about it is that a lot of the hassle referred to above is something you will never catch sight of, thanks to the fact that Microsoft manages most aspects of the deployment behind the scenes. In addition, the step-by-step guides available on the Microsoft Docs website provide very clear, no-nonsense hand-holding through every step of an Azure Domain Services rollout. What this ultimately means is that you can spend more time on achieving your end goal and reduce the need for extensive administration of the solution following its rollout. Having said that, it is always useful to test any solution as extensively as possible for your particular scenario, as this will always throw up potential issues or useful tips to remember in future; AADDS is no exception to this rule.

Having recently worked closely with AADDS, I wanted to use this week’s blog post to share some of my detailed thoughts regarding the solution in practice and a few things to remember if you are checking it out for the first time.

Resetting AADDS User passwords could become the bane of your existence

If you are creating an AADDS resource in isolation from any existing identity providers (i.e. there is no requirement to use Azure AD Connect with an on-premises domain server), then be aware that you will have to set a user’s password twice before they will be able to log in to the domain. Microsoft explains why this is better than I can:

To authenticate users on the managed domain, Azure Active Directory Domain Services needs credential hashes in a format that’s suitable for NTLM and Kerberos authentication. Azure AD does not generate or store credential hashes in the format that’s required for NTLM or Kerberos authentication, until you enable Azure Active Directory Domain Services for your tenant. For obvious security reasons, Azure AD also does not store any password credentials in clear-text form. Therefore, Azure AD does not have a way to automatically generate these NTLM or Kerberos credential hashes based on users’ existing credentials…If your organization has cloud-only user accounts, all users who need to use Azure Active Directory Domain Services must change their passwords

Source: https://docs.microsoft.com/en-us/azure/active-directory-domain-services/active-directory-ds-getting-started-password-sync

The above article goes into the required steps that need to be followed for each user account that is created and, at the time of writing, I do not believe there is any way of fully automating this process. Whether you choose to complete these steps yourself or get your end users to do so instead is up to you, but there is a good chance that if a user is experiencing login issues with an AADDS account, then the steps in the above article have not been followed correctly.
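
You can, at least, prompt the required change via the MSOnline PowerShell module. A rough sketch, using placeholder values – the user still completes the final password change themselves at next sign-in, which is what generates the credential hashes AADDS needs:

# Connect to the tenant
Connect-MsolService

# Set a temporary password and force the user to change it at next sign-in
Set-MsolUserPassword -UserPrincipalName "jane.doe@test.onmicrosoft.com" -NewPassword "TempP@ssw0rd!" -ForceChangePassword $true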

Make sure you’re happy with your chosen DNS Name

When first creating your Domain Services resource, you need to be pretty certain your desired DNS domain name will not be subject to change in the future. After some fruitless digging around on the portal and an escalated support request to Microsoft, I was able to confirm that there is no way this can be changed after the Domain Services resource is deployed; your only recourse is to recreate the resource from scratch with your newly desired DNS domain name. This could prove problematic if, say, you wish to change the domain name of your in-development Domain Services resource from the default onmicrosoft.com domain to a bespoke one…after you have already joined Virtual Machines to your new domain :/ Some efficient use of the Azure Templates feature can save you some aggro here, but not if you have already expended considerable effort on bespoke customisation of each of your VMs’ operating systems.
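
As a pointer, the existing resource group can be exported to a reusable ARM template before you tear anything down – a minimal sketch via the AzureRM module, with placeholder names:

# Export the resource group definition to an ARM template for redeployment
Export-AzureRmResourceGroup -ResourceGroupName "MyDomainServicesRG" -Path "C:\Templates\aadds.json"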

Be aware of what’s supported…and what isn’t

There are a few articles that Microsoft has published to help you determine whether AADDS is right for your particular scenario.

Whilst these articles are invaluable and, admittedly, demonstrate the wide feature array contained within AADDS, there are still a few hidden “gotchas” to be aware of, which the articles only hint towards:

  • Only one AADDS resource is allowed per Azure AD tenant. You will need to configure a clean Active Directory tenant (and therefore a separate Azure portal login) for any additional AADDS resource you wish to set up, which also requires an appropriate subscription for billing. This could add ever-growing complexity to your Azure footprint.
  • AADDS is a continually billable service. Unlike VMs, which can be set to Stopped (deallocated) status at any time and, therefore, not incur any usage charges, your Domain Services resource will incur fees as soon as you create it, and these will only cease when the resource is deleted.

One unsupported feature that the above articles do not provide any hint towards is Managed Service Accounts. Introduced as part of Windows Server 2008 R2, these provide a more streamlined means of managing service accounts for applications running on Windows Server, reducing the requirement to maintain passwords for these accounts and allowing administrators to grant domain-level privileges to essential service account objects. I try to use them whenever I can in conjunction with SQL Server installations, particularly if the service accounts for SQL need to access network-level resources that are secured via a security group or similar. I would encourage you to read up on them further to see if they could help within your SQL Server deployments.

Back to the topic at hand – if you attempt to create a Managed Service Account via PowerShell, you will receive an error message saying that they are not supported within the domain, as the sketch below illustrates. So, assuming that you want to go ahead and deploy SQL Server on an AADDS-joined VM, you would have to revert to using standard Active Directory user accounts for your service accounts to achieve the same functionality. Not great if you also have to enforce password refresh policies, but it is the only supported workaround in this situation.
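
For illustration, this is roughly the sort of command that fails – the account and host names are placeholders:

# Requires the ActiveDirectory RSAT module on a domain-joined machine
Import-Module ActiveDirectory

# On an AADDS managed domain, this returns an error indicating that
# Managed Service Accounts are not supported
New-ADServiceAccount -Name "svc-sql" -DNSHostName "svc-sql.mydomain.com"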

Conclusion or Wot I Think

When reviewing potential new IT vendors or products, I always try to judge how “dirty handed” I would need to get with them. What I mean by this is the level of involvement I, or a business, would need to invest in managing physical server hardware, backend elements of the infrastructure or any aspect of the solution whose innards demand an inordinate amount of poking around just to troubleshoot basic problems. The great benefit of services such as Azure is that a lot of this pain is taken away by default – for example, there is no need to manage server, firewall and networking hardware at all, as Microsoft does this for you. AADDS goes a step further by removing the need to manage the server aspect of a Domain Services deployment, allowing you to focus more on building out your identities and integrating them within your chosen application. Whilst it does need some work to reach an acceptable level of parity with a “do-it-yourself” Domain Server (for example, extensive PowerShell support for the completion of common tasks), the service is very much in a developed and user-friendly enough state to warrant further investigation – particularly if you have a simplified Active Directory domain in place or are looking to migrate across to Active Directory from another vendor. £80 per month for a directory smaller than 25,000 objects is not an exorbitant price to pay either, so I would definitely recommend you check AADDS out to see if it could be a good fit for your organisation/application in the near future.

In the early days of Office 365, you would be accustomed to a certain kind of experience when purchasing licenses as a small/medium-sized business (SMB) customer. As these types of organisations are typically too small to warrant the cost of Enterprise Agreements or Volume Licensing, your only recourse for buying Office 365 services was via the portal itself. At this time, any involvement with a Microsoft Partner would be minimal or even non-existent. Partners would only enter the picture if a business opted to grant them Partner of Record status, thereby allowing them to manage aspects of your subscription behind the scenes. This process, while wholly sufficient, did have some notable gaps and, if you were an organisation focused on tightly managing all aspects of your customer journey, could be prone to change or interruption via interference from Microsoft.

The Cloud Solutions Provider programme (or CSP for short) is Microsoft’s “next-generation” opportunity for partners to get directly involved in a customer’s journey onto Office 365, Dynamics 365 and/or Azure, and aims to resolve some of the issues highlighted above. Instead of turning directly to Microsoft when you need a new license subscription or have an issue with a particular Office 365 service, customers instead deal directly with a CSP provider, who will be able to offer them all of this, and more. Everyone wins – the customer, the CSP provider and Microsoft itself – and here are just a few reasons why:

Moving to CSP can save you money

Perhaps the most important reason of all to consider CSP 🙂 The price you will pay for pretty much every single Office 365 subscription offering available via Microsoft directly will be lower if purchased from a CSP partner. In most cases, this will result in a guaranteed 10-15% reduction across the board; a figure which, depending on the size of your organisation, could be a significant portion of your per annum IT spend. In addition, there is no need to wait for your subscription anniversary to switch – any early cancellation charges will be credited in full by Microsoft, should you cancel at any point in your subscription and migrate to the equivalent CSP subscription.

CSP users can benefit from special promotions, previews and other deals unavailable via Microsoft Direct

One example of this at the moment is the preview for Microsoft 365 Business – the next evolutionary step for Office 365 – which is accessible to those who are working with CSP providers currently. Other promotions may also appear from time to time, so you should be speaking to your CSP provider regularly to ensure that they are informing you of any potential discounts or offers available.

CSP enables your current Microsoft Partner to support you better

If you are working with a Microsoft Partner to help support your Office 365 services or Dynamics 365 deployments, there’s a good chance that they have spoken to you about CSP or migrated you across already. The reasons for this will not be based purely on an altruistic desire to reduce your monthly running costs; by having their customers operating under CSP licensing, Partners are granted additional information regarding your subscriptions and their usage. They also become the de facto organisation that needs to be contacted should any issues occur relating to the subscription. A customer who, for example, contacts Microsoft directly regarding an Exchange Online issue on their CSP subscription will instead be referred back to the CSP Partner in the first instance; the Partner will then be responsible for escalating the case to Microsoft if required. In most circumstances, this can surely be seen as a plus, helping Partners to work more closely with their customers.

The above example also goes some way towards explaining why CSP license prices are cheaper compared with going directly to Microsoft. By placing Partner organisations on the front line of dealing with common 1st/2nd line support issues, Microsoft can reduce the number of support professionals it allocates internally and places the burden instead on Microsoft Partners to do the “heavy lifting”, particularly when it comes to dealing with easy-to-resolve issues (i.e. any support request that can be resolved via the Administration Centre).

It’s for Azure as well…with some caveats

Chances are, if you are using Office 365 within your organisation, then you will also be consuming some additional Azure services on top of this – Virtual Machine(s), storage, websites or even some database capacity. The good news is that you can also look at moving your Azure subscriptions across to CSP, with the same benefits available: reduced monthly costs and the ability for your partner of choice to support you better.

At the time of writing this post, the key “red flag” I would draw you towards when considering Azure CSP is what you lose compared to a Pay As You Go or other direct subscription. For example, ongoing and previous usage history will not be visible on the Azure portal and, chances are, you will only get full visibility of your Azure usage costs when you are billed by your CSP partner. If you typically prefer to micro-manage your ongoing Azure usage costs, then a 10-15% saving may not be a fair trade-off for losing this visibility.

Finally, don’t be surprised if this becomes the de facto way of buying Office 365/Azure in the future if you are an SMB

I mentioned above one reason why CSP license costs are cheaper and, through this, you can begin to see the writing on the wall for SMBs. This is not necessarily a bad thing. My own personal preference would be dealing with a Microsoft Partner as opposed to Microsoft directly, as Partners will generally be a lot more flexible and reactive to work with. Given that Microsoft can generate significant internal cost savings and also give eager Partner organisations the opportunity to fill the void, why would they not turn round and say “We’re sorry, but if you are an organisation that employs 300 people or fewer, then please speak to a Microsoft Partner for further assistance”? Certainly, the vibe and talk around CSP at the moment would seem to indicate that this is the long-term trajectory for the programme. Watch this space; it will be interesting to see whether the Microsoft Direct route is downplayed or removed completely in future if your potential license order per annum is only in the hundreds of pounds.

In the early days of working with Azure Virtual Machines, there was generally some effort involved in provisioning and maintaining storage for your machines, and a number of considerations that needed to be taken into account. Choosing the most appropriate storage type (Blob, File etc.), its location on the Azure network and the type of resiliency behind the data stored within…all questions that can leave you muddled and confused! What’s more, depending on which type of setup you opted for, you could find yourself having to maintain a complex storage solution for what is ultimately a simple deployment/workload.

The introduction of Managed Disks earlier this year is designed to solve these problems and provide a simple, easy-to-scale solution that does not require a high degree of knowledge of Storage Accounts to maintain successfully. Upon creation of your virtual machine, a VHD disk is created for you; after that, the only management you need to worry about is specifying the size of the disk, which can be scaled to suit your requirements. Copies of the disk can be quickly created via snapshots as well, and these (and indeed the disks themselves) can be straightforwardly migrated to other Virtual Machines at a few clicks of a button. Definitely a much nicer and more compact solution. 🙂
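
To give a flavour of how little is involved, here is a rough sketch of growing a managed disk and snapshotting it via the AzureRM module – the resource group and disk names are placeholders, and the VM typically needs to be deallocated before a resize:

# Grow an existing managed disk to 256 GB
$disk = Get-AzureRmDisk -ResourceGroupName "MyResourceGroup" -DiskName "MyVMDisk"
$disk.DiskSizeGB = 256
Update-AzureRmDisk -ResourceGroupName "MyResourceGroup" -DiskName "MyVMDisk" -Disk $disk

# Take a point-in-time snapshot of the disk
$snapshotConfig = New-AzureRmSnapshotConfig -SourceUri $disk.Id -Location $disk.Location -CreateOption Copy
New-AzureRmSnapshot -ResourceGroupName "MyResourceGroup" -SnapshotName "MyVMDisk-Snapshot" -Snapshot $snapshotConfig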

I was recently working with Managed Disks when deploying a new Virtual Machine onto Azure. I like to follow a consistent naming convention for my Azure resources, so I was a little frustrated to see that the platform had opted to name the disk resource for me:

The values used are anything but “friendly” and appear to be the GUID for the object in the backend database! What’s that all about?!?

One of the things you have to remember with Azure resources is that names cannot be adjusted once a resource is created; the only recourse in this scenario is to recreate the resource from scratch with the desired name. What’s worse is that the Azure interface gives you no option to specify a name for your Managed Disk when creating your VM. You only have the single option indicated below – Use managed disks:

So is there any way to manually specify a disk name when creating a new Virtual Machine on Azure? If you don’t mind working with JSON, deployment templates and the currently in-preview Templates feature, then keep reading to see how to achieve this via an, admittedly, rather convoluted workaround.

To begin with, start creating your Virtual Machine as you would normally via the interface. When you get to the final screen (4 – Purchase), click on the Download template and parameters hyperlink that sits next to the Purchase button:

This will then open up the entire code-based template for your deployment settings – basically, a large JSON file with all of the settings that you have just selected through the interface. Click on the Add to library button at the very top of the window:

Selecting this will enable you to save the deployment into your Azure account as a Template, thereby letting you open, modify and deploy it again in future – very handy if you find yourself deploying similar resource types often. Specify a Name and Description for the Template and then click on Save:

Once saved, the Template can then be accessed via the Templates area on Azure, which can be searched and (optionally) pinned to your favourites:

After opening the Template, you then have the option to Edit it – this includes both the Name/Description values already specified and also the ability to modify the JSON file in-browser by selecting the ARM Template setting below:

Scroll down the JSON file until you get to the key/value pairs for osDisk. There should be two pairs there already: createOption and managedDisk. There is an additional setting that can be specified in this area to let you define the resource name. No prizes for guessing what this is called 🙂 By adding this key/value pair into the JSON file, as indicated below, we can ensure our preferred name is utilised when the resource is created:
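
As a rough sketch – the name value below is a placeholder, and the storageAccountType shown is just an example – the amended osDisk section would look something like this:

"osDisk": {
    "name": "MYVM01-OSDisk",
    "createOption": "fromImage",
    "managedDisk": {
        "storageAccountType": "Standard_LRS"
    }
}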

 

Click Save on the Edit Template window and then verify that your changes have been accepted by the portal:

Now, by going back to the first window for the Template, we can deploy the VM and all its constituent components to the platform. One downside of this is that you will need to specify all of the other settings for your Virtual Machine, such as Location, Size and names, again:
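
Alternatively, if you would rather not re-key everything in the portal, the template and parameters files downloaded earlier can be deployed via PowerShell – a sketch assuming the AzureRM module and placeholder file paths:

# Deploy the edited template, re-using the downloaded parameter values
New-AzureRmResourceGroupDeployment -ResourceGroupName "MyResourceGroup" -TemplateFile "C:\Templates\template.json" -TemplateParameterFile "C:\Templates\parameters.json"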

Once all your values have been validated and accepted, you can then click on Purchase to submit the deployment to the platform. After this completes successfully, we can then verify that our preferred Managed Disk name has been saved along with the resource:

It is nice to know that there is a workaround enabling us to specify a name for Managed Disks in our preferred format, but I would hope that, in future, functionality is added to the portal that lets you specify the name there, instead of the platform making an arbitrary assumption about what the resource should be called. Managed Disks are still very much in their infancy within Azure, so I am confident that this improvement will be implemented eventually and that the feature will be continually reviewed to ensure the pain of provisioning Virtual Machine storage remains almost non-existent.

Although CRM Online/Dynamics 365 for Enterprise (D365E) provides a plethora of different tools aimed at satisfying reporting requirements for users of the application, you are restricted in how data can be queried within the application. For example, you cannot just connect straight up to the application’s SQL database and start writing stored procedures that perform complex data transformations or joins. Traditionally, to achieve this, you would need to look at one of the several tools in the marketplace that enable you to export your data into a format of your choosing; or even take the plunge and get a developer to write a bespoke application that satisfies your integration requirements.

With the recent D365E release and in line with Microsoft’s longstanding attitude towards customer data within their applications (i.e. “It’s yours! So just do what you want with it!”), the parallel introduction of the Data Export Service last year further consolidates this approach and adds an arguably game-changing tool to the product’s arsenal. By using the service, relatively straightforward integration requirements can be satisfied quickly, and a lot of the headache involved in standing up a backup of your organisation’s databases or a line-of-business reporting database can be eliminated. Perhaps the most surprising and crucial aspect of all of this is that using the tool is not going to break the bank either.

In this week’s blog post, I’m going to take a closer look at just what the Data Export Service is, the setup involved and the overall experience of using the service from end-to-end.

What is the Data Export Service?

The Data Export Service is a new, free*, add-on for your CRM/D365E subscription, designed to accomplish basic integration requirements. Microsoft perhaps provides the best summary of what the tool is and what it can achieve via TechNet:

The Data Export Service intelligently synchronizes the entire Dynamics 365 data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system. This helps enable several analytics and reporting scenarios on top of Dynamics 365 data with Azure data and analytics services and opens up new possibilities for customers and partners to build custom solutions.

The tool is compatible with versions 8.0, 8.1 and 8.2 of the application, which correspond to the following releases of CRM Online/D365E:

  • Dynamics CRM Online 2016
  • Dynamics CRM Online 2016 Update 1
  • Dynamics 365 December Update

*You will still need to pay for all required services in Azure, but the add-on itself is free to download.

The Installation Process

Getting everything configured for the Data Export Service can prove to be the most challenging – and potentially alienating – part of the entire process. For this, you will need the following at your disposal:

  • An active Azure Subscription.
  • An Azure SQL Server configured with a single database, or an Azure VM running SQL Server. Microsoft recommends a Premium P1 database or better if you are using an Azure SQL database, but I have been able to get the service working without any issue on S0 tier databases. This is an important point to make, given that the cost difference can amount to hundreds of pounds per month.
  • An Azure Key Vault. This is what will securely store the credentials for your DB.
  • PowerShell and access to the Azure Resource Manager (AzureRM) cmdlets. PowerShell can be installed as an OS feature on Windows-based platforms, and can now be downloaded onto OS X/Linux as well. PowerShell is required to create an Azure Key Vault (see the sketch after this list), although you can also use it to create your Azure SQL Server instance/Windows VM with SQL Server.
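
As a rough sketch of the Key Vault piece via the AzureRM cmdlets – the resource names, location and connection string below are all placeholder values:

# Create a resource group and Key Vault to hold the database credentials
New-AzureRmResourceGroup -Name "DataExportRG" -Location "UK South"
New-AzureRmKeyVault -VaultName "MyDataExportVault" -ResourceGroupName "DataExportRG" -Location "UK South"

# Store the destination database connection string as a secret
$secret = ConvertTo-SecureString "Server=tcp:mysqlserver.database.windows.net,1433;Database=ExportDB;User ID=exportuser;Password=StrongP@ssw0rd;" -AsPlainText -Force
Set-AzureKeyVaultSecret -VaultName "MyDataExportVault" -Name "ExportConnectionString" -SecretValue $secret

# The Data Export Service application also needs an access policy on the
# vault - refer to the official setup guide for the exact steps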

It is therefore recommended that you have at least some experience in how to use Azure – such as creating Resource Groups, deploying individual resources, how the interface works etc. – before you start setting up the Data Export Service. Failing this, you will have to kindly ask your nearest Azure whizz for assistance 🙂 Fortunately, if you know what you’re doing, you can get all of the above setup very quickly; in some cases, less than 10 minutes if you opt to script out the entire deployment via PowerShell.

For your setup with D365E, all that is required is the installation of the approved solution via the Dynamics 365 Administration Centre. Highlight the instance that you wish to deploy to and click on the pen icon next to Solutions:

Then click on the Solution with the name Data Export Service for Dynamics 365 and click the Install button. The installation process will take a couple of minutes, so keep refreshing the screen until the Status is updated to Installed. Then, within the Settings area of the application, you can access the service via the Data Export icon:

Because the Data Export Service needs to sign in automatically to an external provider, you may also need to verify that your web browser pop-up settings/firewall are configured to allow the https://discovery.crmreplication.azure.net/ URL. Otherwise, you are likely to encounter a blank screen when attempting to access the Data Export Service for the first time. You will know everything is working correctly when you are greeted with a screen similar to the below:

Setting up an Export Profile

After accepting the disclaimer and clicking on the New icon, you will be greeted with a wizard-like form, enabling you to specify the following:

  • Mandatory settings required, such as the Export Profile Name and the URL to your Key Vault credentials.
  • Optional settings, such as which database schema to use, any object prefix that you would like to use, retry settings and whether you want to log when records are deleted.
  • The Entities you wish to use with the Export Service. Note that, although most system entities will be pre-enabled to use this service, you will likely need to go into Customizations and enable any additional entities you wish to utilise with the service via the Change Tracking option:

  • Any Relationships that you want to include as part of the sync: to clarify, this is basically asking if you wish to include any default many-to-many (N:N) intersect tables as part of your export profile. The list of available options for this will depend on which entities you have chosen to sync. For example, if you select the Account, Lead and Product entities, then the following intersect tables will be available for synchronisation:

Once you have configured your profile and saved it, the service will then attempt to start the import process.

The Syncing Experience A.K.A Why Delta Syncing is Awesome

When the service first starts to sync, one thing to point out is that it may initially return a result of Partial Success and show that it has failed for multiple entities. In most cases, this will be because certain entities’ dependent records have not yet been synced across (for example, any Opportunity record that references the Account named Test Company ABC Ltd. will not sync until that Account record has been exported successfully). So rather than attempting to interrogate the error logs straight away, I would suggest holding off a while. As you may also expect, the first sync will take some time to complete, depending on the number of records involved. My experience, however, suggests it is reasonably quick – for example, just under 1 million records took around 3 hours to sync. I anticipate that the fact that the service is essentially an Azure-to-Azure export no doubt helps in ensuring smooth data transit.

Following on from the above, syncs will then take place as and when entity data is modified within the application. The delay appears to be very small indeed – often tens of minutes, if not just a few minutes. This therefore makes the Data Export Service an excellent candidate for a backup/primary reporting database, satisfying any requirements that cannot be achieved via FetchXML alone.

One small bug I have observed is with how the service deals with the listmember intersect entity. You may get errors thrown back indicating that records have failed to sync across successfully which, upon closer inspection, turns out not to be the case. Hopefully this is something that will get ironed out; it is presumably due to the rather strange way that the listmember entity behaves when interacting with it via the SDK.

Conclusions or Wot I Think

For a free add-on service, I have been incredibly impressed by the Data Export Service and what it can do. For those who have previously had to fork out big bucks for services such as Scribe Online or KingswaySoft to achieve very basic replication/reporting requirements within CRM/D365E, the Data Export Service offers an inexpensive way of replacing them. That’s not to say that the service should be your first destination if your integration requirements are complex – for example, integrating Dynamics 365 with SAP/Oracle ERP systems. In those cases, the names mentioned above will no doubt be the best services to look at. I also have a few concerns that the setup involved could be a barrier towards the Data Export Service’s adoption. As mentioned above, experience with Azure is a mandatory requirement to even begin contemplating getting set up with the tool, and your organisation may also need to reconcile itself with utilising Azure SQL databases or SQL Server instances on Azure VMs. Hopefully, as time goes on, we may start to see the setup process simplified – for example, having the Export Profile Wizard go off and create all the required resources in Azure by simply entering your Azure login credentials.

The D365E release has brought a lot of great new functionality and features to the table, much of it oft requested and adding real benefit for organisations who already use, or plan to use, the application in the future. The Data Export Service is perhaps one of the great underdog features that D365E brings to the table, and is one that you should definitely consider using if you want a relatively smooth-sailing data export experience.