I took some time out this week to head down to Microsoft’s Reading offices for the November CRMUG meeting. There is often a whole host of reasons that can be conjured up to excuse yourself from events like this – “I’m too busy at work!”, “It’s such a long way away!” etc. – but, ultimately, it’s always best to make the effort and get involved. The theme of the day was Awareness of your CRM system, which was neatly kicked off by a short presentation from Microsoft on the current roadmap for Dynamics 365 for Customer Engagement (D365CE). There was a clear emphasis on GDPR across some of the available presentation tracks, a topic that regular readers of the blog should, I hope, be well prepared for 🙂 Another key aspect of the day was networking, with ample opportunities to meet new people and to understand their current journey involving CRM/D365CE. Here are my thoughts on the sessions I attended, along with some closing remarks on why these types of events are always beneficial.

Accelerate GDPR with Microsoft Cloud

The first talk I attended was all about GDPR from Microsoft’s perspective. The session was co-led by David Hirst and David Reid from Microsoft and did a really good job of setting out the GDPR stall for the uninitiated, as well as offering some pointers towards Microsoft solutions/technologies that may prove beneficial in achieving compliance. There were also some fascinating anecdotes, such as the story of a UK-based pub chain that has decided to completely remove all customer email address data from their systems, presumably with GDPR in mind. An extreme, but arguably pragmatic, approach.

The talk came across as refreshingly candid, clearly portraying a concerted effort behind the scenes at Microsoft to ensure that they – and their entire product range – are prepared for GDPR. Microsoft is not just talking the talk when it comes to GDPR (which, to be frank, can often result in a lot of scaremongering by other companies), but is instead providing current and new customers with the tools and information they need to streamline their route towards compliance. The key takeaway from the session, borne out by some of the Q&As at the end, is that it’s naive to assume that technology companies like Microsoft can provide a “silver bullet” solution to deal with all of your GDPR woes. Organisations need to go away and do a lot of the hard work themselves: determining the type of data they hold across the entire organisation, assessing whether the source of consent for that data could be considered risky, and implementing the appropriate business processes and technological helper tools to make dealing with things such as subject access requests as simple as possible.

What is GDPR and how it impacts your business and your Dynamics 365 solutions, Get Ready for your new legal obligations.

The next talk was (again!) on GDPR, presented by CRM MVP Mohamed Mostafa and specifically focused on it in the context of D365CE. Mohamed’s talk was well structured, assisted by some great visual aids, and he also presented some interesting examples of how you can leverage existing application features to help you towards GDPR compliance. Plenty of food for thought!

One area mentioned in Mohamed’s presentation that I perhaps disagree with him on (sorry!) is the emphasis placed on the massive fine figures that are often quoted when it comes to GDPR. A heavy focus on this does, in my view, present a degree of scaremongering. Elizabeth Denham, the Information Commissioner, has gone public on this very issue and cautions businesses to be wary of the “massive fines” narrative. I agree with her assessment that fines should always be considered a “last resort”, reserved for organisations that have demonstrated a wilful disregard for their obligations in handling personal data. My experience with the ICO on a personal level backs this up, and I have always found them to be fair and proportionate when dealing with organisations that are trying to do the best they can. GDPR presents a real opportunity for organisations to get to grips with how they handle their personal data, and I encourage everyone to embrace it and make the necessary changes to accommodate it. But, by the same token, organisations should not be panicked into a narrative that causes them to adopt unnecessary technologies under the whole “silver bullet” pretence.

What’s new in Dynamics 365 9.0

To date, I have not had much of a chance to play around in detail with version 9.0 of D365CE. For this reason, MVP Sarah Critchley’s talk ranked highly on the agenda for me. Sarah’s enthusiasm for the application is infectious, and she covered a wide breadth of the more significant new features that can be found in the new version of the application, including (but not limited to):

  • Presentation changes to the Sitemap
  • Introduction to Virtual Entities and how to set them up
  • Changes to the mobile application

Sarah framed all of the changes with a before/after comparison to version 8.2 of the application, allowing the audience to contextualise the changes a lot better. What I liked best about the whole presentation is that it scratched beneath the surface to highlight less noticeable changes that may have a huge impact for end users of the application. Attention was paid to the fact that the refreshed web application is now a fully mobile-responsive template, meaning that it adjusts automatically to suit a mobile or tablet screen size. Another thing I didn’t know about the new Virtual Entities feature is that they can be used as Lookup fields on related entities. This immediately expands their versatility, and I am looking forward to seeing how the feature develops in the future.

Implementing a Continuous Integration Strategy with Dynamics 365

I’ll admit that I went into the final talk of the day with Ben Walker not 100% sure what to expect, but walked away satisfied that it was perhaps the most underrated session of the day 🙂 Ben took us through his journey of implementing a continuous integration strategy (translation: testing through the development process and automating the deployment process) for CRM 2015 in his current role, and he should be proud of what he has achieved in this respect. Ben showed the room a number of incredibly useful developer tidbits, such as:

  • The ability to export CRM/D365CE solution information into Visual Studio and then sync up to a Git repository.
  • Deep integration of unit testing, via the FakeXrmEasy framework.
  • The ability to trigger automated builds in TFS after a code check-in, which can then be used to push out solution updates into a dev/test environment automatically, with the additional option of requiring somebody else to approve the deployment before it starts.

Ben has been able to tie all of the above together into an end-to-end process that looks almost effortless in its construction. The benefit for Ben – and the main caveat from the whole session, it has to be said – is that he is primarily working within an on-premise CRM environment and is using tools which may not be fully supported with online versions of the application. For example, the ability to deploy Solution updates via PowerShell is definitely not supported by D365CE online. Despite this, Ben’s presentation should have left everyone in the room with enough to go away with, research and implement to make their CRM/D365CE development more of a breeze in future.
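As a rough illustration of the first of these points – getting solution components into source control – the SolutionPackager tool that ships with the Dynamics 365/CRM SDK can unpack an exported solution zip into individual component files that sit happily in a Git repository. The sketch below shows the general idea only (not necessarily Ben’s exact approach), and the paths and solution name are made up:

##Sketch only: unpack an exported solution zip into source-control-friendly files. Paths and names are hypothetical.

$solutionPackager = "C:\Tools\CoreTools\SolutionPackager.exe"
$solutionZip      = "C:\Exports\MySolution.zip"
$extractFolder    = "C:\Source\MySolution"

& $solutionPackager /action:Extract /zipfile:$solutionZip /folder:$extractFolder

##Commit the unpacked files so individual component changes show up in Git history

git -C $extractFolder add .
git -C $extractFolder commit -m "Update solution components"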

Conclusions or Wot I Think

This was my first time attending a CRMUG meeting, and I am glad that I finally found the time to do so. As Sarah highlighted at the start of the day, the key benefit of the whole event is the opportunity to network, and I had ample opportunity to meet face-to-face some of my CRM heroes, as well as others working in the industry. It can often feel like a lonely journey working with applications like CRM/D365CE, particularly if you are working as part of a small team within your business. Events such as these very much bring you together with other like-minded individuals, who will happily talk to you non-stop about their passion for the application and technology. And, because the events are closely supported by Microsoft, Tuesday’s meeting allowed lots of authoritative information to come to the fore throughout the entire day. I am very much looking forward to attending my next CRMUG meeting and would urge anyone with at least a passing interest in the world of CRM/D365CE to consider attending their next local meeting.

When it comes to technology learning, it can often feel as if you are fighting against a constant wave of change, as studying is outpaced by the introduction of new technical innovations. Fighting the tide is often the most desirable outcome to work towards, but it is understandable why individuals choose to specialise in a particular technology area. There is no doubt some comfort in becoming a subject matter expert and in not having to worry about “keeping up with the Joneses”. However, when working with an application such as Dynamics 365 for Customer Engagement (D365CE), I would argue it is almost impossible to ignore the wider context of what sits alongside the application, particularly Azure, Microsoft’s cloud computing platform. Being able to understand how the application can be extended via external integrations is typically high on the list of any project requirements, and often these integrations require at least a light-touch involvement with Azure. Therefore, the ability to say that you are confident in accomplishing certain key tasks within Azure instantly puts you ahead of others and in a position to support your business/clients more straightforwardly.

Here are 4 good reasons why you should start to familiarise yourself with Azure, if you haven’t done so already, or dedicate some additional time towards increasing your knowledge in an appropriate area:

Dynamics 365 for Customer Engagement is an Azure application

Well…we perhaps cannot say definitively that 100% of D365CE is hosted on Azure (I did hear a rumour that some aspects of the infrastructure were hosted on AWS). Certainly, for instances that are provisioned within the UK, there is ample evidence to suggest this to be the case. What can be said with some degree of certainty is that D365CE is an Azure-leveraged application, because it uses key aspects of the service to deliver various functionality within the application:

  • Azure Active Directory: Arguably the crux of D365CE is the security/identity aspect, all of which is powered using Microsoft’s cloud version of Active Directory.
  • Azure Key Vault: Encryption is enabled by default on all D365CE databases, and the management of encryption keys is provided via Azure Key Vault.
  • Office 365: Similar to D365CE, Office 365 is – technically – an Azure cloud service provided by Microsoft. As both Office 365 and D365CE often need to be tightly knitted together, via features such as Server-Side Synchronisation, Office 365 Groups and SharePoint document management, it can be considered a de facto part of the base application.

It’s fairly evident, therefore, that D365CE can be considered a Software as a Service (SaaS) application hosted on Azure. But why is all this important? For the simple reason that, as a D365CE professional supporting the full breadth of the application and all it entails, you are already an Azure professional by default. Not having even a cursory understanding of Azure and what it can offer will immediately put you at a disadvantage compared to others who do, and increasingly places you in a position where your D365CE expertise is severely blunted.

It proves to prospective employers that you are not just a one trick pony

When it comes to interviews for roles focused around D365CE, I’ve been on both sides of the table. What I’ve found separates a good D365CE CV from an excellent one boils down to how effectively the candidate has been able to expand their knowledge into other areas. How much additional knowledge of other applications, programming languages etc. does the candidate bring to the business? How effectively has the candidate moved out of their comfort zone in the past in exploring new technologies, either in their current roles or outside of work? More importantly, how much initiative and passion has the candidate shown in embracing change? A candidate who is able to answer these questions positively and can demonstrate, for example, extensive knowledge of Azure will instantly move up in my estimation of their ability. On the flip side, I believe that interviews that have resulted in a job offer for me have been helped, in no small part, by the additional technical skills that I can make available to a prospective employer.

To get certain things done involving D365CE, Azure knowledge is a mandatory requirement

I’ve talked about one of these tasks before on the blog, namely, how to set up the Azure Data Export solution to automatically synchronise your application data to an Azure SQL Database. Unless you are in the fortunate position of having an Azure-savvy colleague who can assist you with this, the only way you are going to complete this task successfully is to know how to deploy an Azure SQL Server instance and a database on it, and how to set up an Azure Key Vault. Having at least some familiarity with how to deploy simple resources in Azure and accomplish tasks via PowerShell script execution will place you in an excellent position to achieve the requirements of this task and others like it.
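If you have never provisioned these resources before, the following should give a flavour of what is involved. This is a minimal sketch only, assuming the Az PowerShell module is installed and that you have already signed in via Connect-AzAccount (the older AzureRM module offers equivalent cmdlets); every resource name and location below is a made-up example:

##Create a resource group to hold everything

New-AzResourceGroup -Name "D365CE-Data" -Location "UK South"

##Provision the logical SQL Server and the database that the Data Export solution will write to

$cred = Get-Credential -Message "SQL administrator credentials"

New-AzSqlServer -ResourceGroupName "D365CE-Data" -ServerName "d365ce-export-sql" -Location "UK South" -SqlAdministratorCredentials $cred

New-AzSqlDatabase -ResourceGroupName "D365CE-Data" -ServerName "d365ce-export-sql" -DatabaseName "D365CEExport" -Edition "Standard"

##Create a Key Vault, which the Data Export solution uses to store the database connection string

New-AzKeyVault -ResourceGroupName "D365CE-Data" -VaultName "d365ce-export-kv" -Location "UK South"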

This is just a flavour of some of the things you can do with D365CE and Azure together, and there are doubtless many more I have missed 🙂 The key point I would highlight is that you should not just naively assume that D365CE is containerised away from Azure; in fact, often the clearest and cleanest way of achieving more complex business/technical requirements will require a detailed consideration of what can be built out within Azure.

There’s really no good reason not to, thanks to the wealth of resources available online for Azure training.

A sea change seems to be occurring at Microsoft with respect to online documentation/training resources. Previously, TechNet and MSDN would be your go-to resources to find out how something Microsoft related works. Now, the Microsoft Docs website is where you can find the vast majority of technical documentation. I really rate the new experience that Microsoft Docs provides, and there now seems to be a concerted effort to ensure that these articles are clear, easy to follow and include end-to-end steps on how to complete certain tasks. This is certainly the case for Azure and, with this in mind, I defy anyone to find a reasonable excuse not to begin reading through these articles. They are the quickest way to expand your knowledge in the area of Azure that interests you most, or to prepare you to, for example, set up a new Azure SQL database from scratch.

For those who learn better via visual tools, Microsoft has also greatly expanded the number of online video courses available for Azure that can be accessed for free. There are also some excellent “deep-dive” topic areas that can be used to help prepare you for Azure certification.

Conclusions or Wot I Think

I use the term “D365CE professional” a number of times throughout this post. This is perhaps an unhelpful label to ascribe to anyone working with D365CE today. A far better title is, I would argue, “Microsoft cloud professional”, as this gets to the heart of what I think anyone who considers themselves a D365CE “expert” should be. Building and supporting solutions within D365CE is by no means an isolated experience, as you might have argued a few years back. Rather, the onus is on ensuring that consultants, developers etc. are as multi-faceted as possible from a skillset perspective. I talked previously on the blog about becoming a Swiss Army knife in D365CE. Whilst this is still a noble and recommended goal, I believe casting the net wider can offer a number of benefits, not just for yourself, but for the businesses and clients you work with every day. It puts you centre-forward in being able to offer the latest opportunities to implement solutions that can increase efficiency, reduce costs and deliver positive end-user experiences. And, perhaps most importantly, it means you can confidently and accurately attest to your wide-ranging expertise in any given situation.

The world of database security and protection can be a difficult path to tread at times. I often find myself having to adopt a “tin-foil hat” approach, obsessing over the smallest potential vulnerability that a database could be compromised with. This thought process can be considered easy compared with any protective steps that need to be implemented in practice, as these can often prove to be mind-bogglingly convoluted. This is one of the reasons why I like working with Microsoft Azure and features such as Azure SQL Database Firewall Rules. They present a familiar means of securing your databases to specific IP address endpoints and are not inordinately complex in how they need to be approached; just provide a name, Start/End IP range and hey presto! Your client/application can communicate with your database. The nicest thing about them is that the feature is enabled by default, meaning you don’t have to worry about designing and implementing a solution to restrict your database from unauthorised access at the outset.

As alluded to above, Database Firewall Rules are added via T-SQL code (unlike Server Rules, which can be specified via the Azure portal), using syntax that most SQL developers should feel comfortable with. If you traditionally prefer to design and build your databases using a Visual Studio SQL Database project, however, you may encounter a problem when looking to add a Database Firewall Rule: there is no dedicated template item that can be used to add one to the project. In this eventuality, you would have to look at setting up a Post-Deployment Script or Pre-Deployment Script to handle the creation of any rules you require. Yet this can present the following problems:

  • Visual Studio will be unable to provide you with the basic syntax to create the rules.
  • Related to the above, Intellisense support will be limited, so you may struggle to identify errors in your code until it is deployed.
  • When deploying changes out to your database, the deployment will be unable to detect (and remove) any rules that have been deleted from your project.

The last one could prove to be particularly cumbersome if you are tightly managing the security of your Azure SQL database. Putting aside the obvious risk of someone forgetting to remove a rule as part of a deployment process, you would then have to manually remove the rules by connecting to your database and executing the following T-SQL statement:

EXECUTE sp_delete_database_firewall_rule 'MyDBFirewallRule'

Not the end of the world by any stretch, but if you are using Visual Studio as your deployment method for managing changes to your database, then having to do this step seems a little counter-intuitive. Fortunately, with a bit of creative thinking and utilisation of more complex T-SQL functionality, we can get around the issue by developing a script that carries out the following steps in order:

  • Retrieve a list of all current Database Firewall Rules.
  • Iterate through the list of rules and remove them all from the database.
  • Proceed to re-create the required Database Firewall Rules from scratch.

The second step involves the use of a T-SQL feature that I have traditionally steered away from using – Cursors. This is not because they are bad in any way, but because a) I have previously struggled to understand how they work and b) I had never found a good scenario in which to use them. The best way of understanding them is to put on your C# hat for a few moments and consider the following code snippet:

string[] array = new string[] { "Test1", "Test2", "Test3" };

foreach (string s in array)
{
    Console.WriteLine(s);
}

To summarise how the above works, we take our collection of values – Test1, Test2 and Test3 – and carry out a particular action against each; in this case, printing the value out to the console. This, in a nutshell, is how Cursors work, and you have a great deal of versatility in what action you take during each iteration of the “loop”.

With a clear understanding of how Cursors work, the below script, which accomplishes the aims set out above, should hopefully be a lot clearer:

DECLARE @FirewallRule NVARCHAR(128)

-- Declare a cursor over the name of every database-level firewall rule currently defined
DECLARE REMOVEFWRULES_CURSOR CURSOR
	LOCAL STATIC READ_ONLY FORWARD_ONLY
FOR
SELECT DISTINCT [name]
FROM sys.database_firewall_rules

-- Iterate through the rule names, deleting each rule in turn
OPEN REMOVEFWRULES_CURSOR
FETCH NEXT FROM REMOVEFWRULES_CURSOR INTO @FirewallRule
WHILE @@FETCH_STATUS = 0
BEGIN
	EXECUTE sp_delete_database_firewall_rule @FirewallRule
	PRINT 'Firewall rule ' + @FirewallRule + ' has been successfully deleted.'
	FETCH NEXT FROM REMOVEFWRULES_CURSOR INTO @FirewallRule
END
CLOSE REMOVEFWRULES_CURSOR
DEALLOCATE REMOVEFWRULES_CURSOR

GO

-- Re-create the required rules from scratch; rule names and IP ranges below are placeholders
EXECUTE sp_set_database_firewall_rule @name = N'MyDBFirewallRule1',
		@start_ip_address = '1.2.3.4', @end_ip_address = '1.2.3.4';

EXECUTE sp_set_database_firewall_rule @name = N'MyDBFirewallRule2',
		@start_ip_address = '1.2.3.4', @end_ip_address = '1.2.3.4';
		

To integrate this as part of your existing database project, add a new Post-Deployment Script file and modify the above to reflect your requirements. As the name indicates, the script will run after all other aspects of your solution deployment have been completed. Now, the key caveat to bear in mind with this solution is that, during deployment, there will be a brief period of time where all Database Firewall Rules are removed from the database. This could potentially cause current database connections to drop or new connections to fail altogether. You should take care when using the above code snippet within a production environment, and I would recommend you look at an alternative solution if your application/system cannot tolerate even a second of downtime.

Office 365 Groups have been a recurring topic on the blog in recent months – we’ve seen how we can force Office 365 to use custom domains when creating groups for the very first time and how you can straightforwardly integrate an Office 365 Group within Dynamics 365 for Customer Engagement. With this in mind, there is little point in providing a detailed description of what they are and how they can be used; suffice to say, if you are wanting to collaborate closely with internal/external colleagues for a particular project or department, Office 365 Groups are an excellent candidate to consider.

One of the cornerstones of Office 365 Groups is the ability for all conversations to be tracked via the use of a dedicated shared mailbox. This perhaps explains why the Office 365 portal will refuse to let you add any user within your organisation who does not have an Exchange Online license assigned to them. Case in point – let’s assume we have a user account with no such license assigned to them on the Office 365 portal:

When attempting to add this user into an Office 365 group, we get a message to let us know No match was found for the user account entered and, as a consequence, it cannot be added to the group:

From this, you can perhaps make the assumption that Office 365 Groups are not supported at all for users who do not have a mailbox. This is despite the fact that there are several different business scenarios that may necessitate adding such a user:

  • A kiosk/”light-use” account may require access to the group to upload documents and manage the SharePoint site.
  • Integration with external applications may be required, stipulating the need for a service account to authenticate with the group to retrieve/add content dynamically.
  • The need to configure an account for external users to access, that is sufficiently locked down and inexpensive to maintain.

Fortunately, as with many other things relating to Office 365, we can get around this limitation within the Office 365 portal by resorting to PowerShell and adding the John Doe user account above to the Group.

The first step towards achieving this is to boot up a PowerShell window. Make sure you have access to PowerShell on your machine of choice and then, after opening the application using the Run as administrator option, execute the following script:

##Set Execution Policy to Remote Signed - required to fully execute script

Set-ExecutionPolicy RemoteSigned

##Connect to Exchange Online. Enter administrator details when prompted.

$UserCredential = Get-Credential

$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection

Import-PSSession $Session

##Add the non-mailbox user to the Office 365 Group. Substitute the Links value with the username of the account to add.

Add-UnifiedGroupLinks -Identity "Test Office 365 Group" -LinkType Members -Links john.doe@domain.com

##Confirm that the user has been added successfully by returning the Group member list

Get-UnifiedGroupLinks -Identity "Test Office 365 Group" -LinkType Members

##Cleanup by disconnecting from Exchange Online

Remove-PSSession $Session

The penultimate command will make something similar to the below appear in the console window. Interestingly, note that the John.Doe test user has a RecipientType value of User:

Now that the user has been added successfully, they will be able to access the SharePoint site for the group by navigating to the SharePoint library URL. This will look similar to the below and can be grabbed by logging in as another user who has the RecipientType value of UserMailbox and navigating to the Group’s SharePoint site:

https://<Your On Microsoft domain prefix>.sharepoint.com/sites/<Your Office 365 Group Name>/
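If the Exchange Online session from the script above is still open, the same address can also be pulled out programmatically. This is a hedged sketch, on the assumption that the group object exposes its SharePoint address via the SharePointSiteUrl property:

##Retrieve the SharePoint site URL for the group from Exchange Online

Get-UnifiedGroup -Identity "Test Office 365 Group" | Select-Object DisplayName, SharePointSiteUrl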

Note that this will be the only way the non-mailbox user can access the site. For example, there will be no link to SharePoint within Office 365 to guide you to the above location. After logging in, you should be greeted with a window similar to the one below:

The John Doe “light-use” account, as referenced above, will have full access to everything that is accessible within SharePoint concerning the Office 365 Group, such as:

  • The Home/News Page
  • Shared Documents Folder (“Documents“)
  • Shared OneNote (“Notebook“)
  • All Site Pages
  • Planner (navigated to via the following link: https://tasks.office.com/<Your Office 365 Primary domain>/en-GB/Home/Planner/)

Conversely, the following features will be inaccessible (due to requiring a Mailbox):

  • Conversations
  • Shared Calendar

If, for example, you attempt to navigate to Conversations within SharePoint, you will get the following error message:

This is, perhaps, a small price to pay for what ends up being a pretty feature-rich experience that can be given to additional users within your organisation at virtually no cost. Perhaps another good excuse to start rolling out Office 365 Groups across your tenant in the near future 🙂

Perhaps one of the most fiendish aspects of working with SQL Server Integration Services (SSIS) is the inevitable data transformation/conversion issues that get thrown up, even as part of relatively simplistic Extract, Transform & Load (ETL) packages. It doesn’t help, either, if, having come from a strictly T-SQL focused background, you then have to familiarise yourself with the differently named data types that SSIS has in comparison to SQL Server. Ultimately, whether you are a newbie or a seasoned veteran in creating .dtsx packages, you should never be disheartened if you find yourself having to tackle data conversion issues during package development – put another way, there is always going to be a new system or data file format that comes out of nowhere to test your patience 🙂

I had a rather strange occurrence of this issue recently when working to import Globally Unique Identifier (GUID) data into SQL Server’s equivalent data type – the uniqueidentifier. GUIDs are very much the first choice these days if you are building large-scale applications requiring unique values to distinguish database records. Whereas back in the old days you could get away with an integer column using the IDENTITY seed, the potential for current datasets to contain billions or more records makes this option less practical compared with GUIDs – a data type that is almost certainly going to be unique, even if you are generating values at an insane pace, and which has the headroom to accommodate huge datasets.

Going back to the strange occurrence I mentioned above – perhaps the best way to explain the issue (and its resolution) is to show the steps involved. To do this, access to a SQL Server database instance, interfaced with via SQL Server Management Studio (SSMS), is required. Once this has been obtained, a database needs to be created and the following example script executed against it to create the table used during this post:

CREATE TABLE [GUIDImportTest]
(
	[UID] UNIQUEIDENTIFIER NOT NULL,
	[TestCol1] NVARCHAR(MAX) NULL,
	[TestCol2] NVARCHAR(MAX) NULL
)

We then also have our test import file, saved as a .csv file.
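The file simply contains three columns matching the table created above, with the GUID values stored as plain strings. If you want to knock together a comparable test file yourself, a few lines of PowerShell will do the job – the path and values here are purely hypothetical:

##Illustrative only: generate a small .csv file in the same shape as the GUIDImportTest table

$rows = 1..3 | ForEach-Object {
    [pscustomobject]@{
        UID      = [guid]::NewGuid().ToString()
        TestCol1 = "Test value $_"
        TestCol2 = "Another value $_"
    }
}

$rows | Export-Csv -Path "C:\Imports\GUIDImportTest.csv" -NoTypeInformation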

With both of these ready, we can then reproduce the error using the SQL Server Import and Export Wizard – a handy tool that enables you to move simple datasets between applications and file formats with minimal fuss. This tool can be accessed via SSMS by right-clicking on any database and selecting Tasks -> Import Data…

Begin the wizard as indicated above and, when specifying the Data Source settings, select Flat File Source. In the Advanced tab, you should also override the default data type settings for the UID field and set it to unique identifier (DT_GUID):

The Target destination (accessed further along the wizard) should be set to SQL Server Native Client and to the server/database where the table created above resides.

On the Select Source Tables and Views screen, be sure that the correct table is selected in the Destination drop-down. By default, if your import source does not match the destination name, then the wizard will assume you want to create a brand new table:

On the Review Data Type Mapping tab, a data conversion warning will be flagged up for the two TestCol fields; these can be safely disregarded, as the import package will successfully convert these values for you without further complaint:

After clicking Next and letting the package run, we can then see the titular error of this post occur, halting the package execution:

Initially, I thought the error was occurring because the GUID values in the .csv file were not in upper case (when selecting uniqueidentifier data via a SQL query, values are always returned in upper case), but the same error is thrown when importing data in this exact format. It turns out the issue was down to something that I should have readily realised based on my experience working with Dynamics CRM/Dynamics 365 for Customer Engagement. When working with URLs and query string parameters in the application involving individual records, GUID values require special URL encoding to convert the curly brace values – { and } respectively – into a “URL friendly” format. So, for example, the following:

{06e82887-9afc-4064-abad-f6fb60b8a1f3}

Is converted into:

%7B06e82887-9afc-4064-abad-f6fb60b8a1f3%7D

What does this have to do with SSIS and the task at hand? Well, it turns out that when importing data into a uniqueidentifier column in this way, the package expects each value to be in the unencoded format above – that is, surrounded by curly braces. Each GUID in our source data, therefore, needs to be wrapped in curly braces for the import to succeed.
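Rather than editing the file by hand, the braces can be added with a small script. The following is a rough sketch, reusing the hypothetical file path from earlier:

##Wrap every GUID in the UID column in curly braces, the format the DT_GUID conversion expects

$csvPath = "C:\Imports\GUIDImportTest.csv"
$rows = Import-Csv -Path $csvPath

foreach ($row in $rows) {
    if ($row.UID -notmatch '^\{.*\}$') {
        $row.UID = "{$($row.UID)}"
    }
}

$rows | Export-Csv -Path $csvPath -NoTypeInformation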

After making the appropriate changes to the source data, the package will then execute successfully, loading the data into the desired SQL table:

I guess the lesson here is to never take for granted any knowledge you may have garnered from a particular source – even when dealing with what may, at first glance, be a completely disparate challenge. In all likelihood, this past experience could present a means of thinking differently about a problem and, ultimately, a way to overcome the challenge you are faced with.