SQL Script To Generate C# Model

6/5/2019 5:38:00 AM

While working on a data migration project that uses C# and Azure Functions, I found myself needing to generate a model of the table(s).  A Google search led me to the following blog post, which has a script to generate the model properties for me.  Simply replace the table name variable at the top of the script with the table name you want to generate the properties for.  You may also need to include the schema name in the table name if you have multiple tables with the same name under different schemas.  I did make one change to it in the select @Result line: it was adding a blank line between each property, so I removed the new line.

 declare @TableName sysname = 'TABLE_NAME'
 declare @Result varchar(max) = 'public class ' + @TableName + '
 {'

 select @Result = @Result + '
   public ' + (CASE WHEN ColumnName = 'RowVersion' THEN 'byte[]' ELSE ColumnType END) + NullableSign + ' ' + ColumnName + ' { get; set; }'
 from
 (
   select
     replace(col.name, ' ', '_') ColumnName,
     column_id ColumnId,
     case typ.name
       when 'bigint' then 'long'
       when 'binary' then 'byte[]'
       when 'bit' then 'bool'
       when 'char' then 'string'
       when 'date' then 'DateTime'
       when 'datetime' then 'DateTime'
       when 'datetime2' then 'DateTime'
       when 'datetimeoffset' then 'DateTimeOffset'
       when 'decimal' then 'decimal'
       when 'float' then 'double'
       when 'image' then 'byte[]'
       when 'int' then 'int'
       when 'money' then 'decimal'
       when 'nchar' then 'string'
       when 'ntext' then 'string'
       when 'numeric' then 'decimal'
       when 'nvarchar' then 'string'
       when 'real' then 'float'
       when 'smalldatetime' then 'DateTime'
       when 'smallint' then 'short'
       when 'smallmoney' then 'decimal'
       when 'text' then 'string'
       when 'time' then 'TimeSpan'
       when 'timestamp' then 'byte[]'
       when 'rowversion' then 'byte[]'
       when 'tinyint' then 'byte'
       when 'uniqueidentifier' then 'Guid'
       when 'varbinary' then 'byte[]'
       when 'varchar' then 'string'
       else 'UNKNOWN_' + typ.name
     end ColumnType,
     case
       when col.is_nullable = 1 and typ.name in ('bigint', 'bit', 'date', 'datetime', 'datetime2', 'datetimeoffset', 'decimal', 'float', 'int', 'money', 'numeric', 'real', 'smalldatetime', 'smallint', 'smallmoney', 'time', 'tinyint', 'uniqueidentifier')
       then '?'
       else ''
     end NullableSign
   from sys.columns col
     join sys.types typ on
       col.system_type_id = typ.system_type_id AND col.user_type_id = typ.user_type_id
   where object_id = object_id(@TableName)
 ) t
 order by ColumnId

 set @Result = @Result + '
 }'

 print @Result
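As a quick illustration, for a hypothetical Contact table (made up for this example) with a non-nullable int ContactId, a nullable nvarchar EmailAddress, and a nullable datetime BirthDate, the script would print something along these lines:

```csharp
public class Contact
{
    public int ContactId { get; set; }
    public string EmailAddress { get; set; }
    public DateTime? BirthDate { get; set; }
}
```

Note that only value types in the nullable list get the `?` suffix; reference types like string are already nullable.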

Habilis. (2017, May 01). Creating C# model class from SQL query. Retrieved from https://habilisbest.com/creating-c-model-class-from-sql-query

SQL Script to Create Enums For Table Columns

6/5/2019 5:24:00 AM

While working on a data migration project, using C# and Azure Functions, I found myself needing to create enums to use in my code for the columns.  Some of the tables were very large (over 200 columns), and doing this manually would have taken a while.  So, I wrote the SQL script below to create the enums for me.  To use this, simply replace the table and schema variables with the schema and table name you want to generate the enums for.

 declare @Table sysname = '<INSERT TABLE NAME HERE>';
 declare @Schema sysname = '<INSERT SCHEMA NAME HERE>';
 declare @Result varchar(max) = 'public enum ' + @Table + 'Columns' + '
 {'

 select @Result = CONCAT(@Result, '
      ', ColName, ' = ', ColNum, ',')
 from
 (
      select COLUMN_NAME as ColName, ORDINAL_POSITION - 1 as ColNum
      from INFORMATION_SCHEMA.COLUMNS
      where TABLE_NAME = @Table AND TABLE_SCHEMA = @Schema
 ) t
 order by ColNum

 set @Result = @Result + '
 }'

 print @Result
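As a quick illustration, for a hypothetical Contact table (made up for this example) whose columns ContactId, EmailAddress, and BirthDate sit at ordinal positions 1, 2, and 3, the script prints a zero-based enum:

```csharp
public enum ContactColumns
{
     ContactId = 0,
     EmailAddress = 1,
     BirthDate = 2,
}
```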

UPDATED: Setup Non-Interactive User with Non-Expiring Password

4/4/2019 2:47:00 PM

A while ago, I wrote a blog post on setting up a non-interactive user with a non-expiring password.  Since writing that post, the process has changed and I couldn't find much documentation on it.  The part about setting up the non-interactive user hasn't changed:

Setup Non-Interactive User:

  1. Create a user in the Office 365 admin center.  Be sure to assign a Dynamics 365 (online) license to the account.
  2. Go to Dynamics 365 (online).
  3. Go to Settings > Security.
  4. Choose Users > Enabled Users, and then click a user’s full name.
  5. In the user form, scroll down under Administration to the Client Access License (CAL) Information section and select Non-interactive for Access Mode.  You then need to remove the Dynamics 365 (online) license from the account.
  6. Go to the Office 365 admin center.
  7. Click Users > Active Users.
  8. Choose the non-interactive user account and under Product licenses, click Edit.
  9. Turn off the Dynamics 365 (online) license, and then click Save > Close multiple times.
  10. Go back to Dynamics 365 (online) and confirm that the non-interactive user account Access Mode is still set for Non-interactive.
  11. In CRM assign a role to the user account.

But, the process for setting up a non-expiring password has.  To make this easy, I created a small PowerShell script, shown below.  Simply change the $UserEMail value to the account you want to set.  The script connects to Azure AD (you will need to be a global admin), checks the password policy for the user, then updates the policy and shows you that it was changed.  Here is a link to the Microsoft article that I referenced when doing this.  I do want to point out that you should check the warning at the bottom of that article.

 $UserEMail = "<Replace with e-mail>"
 Connect-AzureAD
 Get-AzureADUser -ObjectId $UserEMail | Select-Object UserPrincipalName, @{
   N = "PasswordNeverExpires"; E = { $_.PasswordPolicies -contains "DisablePasswordExpiration" }
 }
 Set-AzureADUser -ObjectId $UserEMail -PasswordPolicies DisablePasswordExpiration
 Get-AzureADUser -ObjectId $UserEMail | Select-Object UserPrincipalName, @{
   N = "PasswordNeverExpires"; E = { $_.PasswordPolicies -contains "DisablePasswordExpiration" }
 }

Prevent NULL inserts during updates

12/5/2018 6:36:00 AM

Recently, I had an issue that I needed to overcome.  The Dynamics 365 instance I was working on was integrated with multiple other systems, with some long-running external processes, and stale data in the UI while a record was open could override values set by those external processes.  What was happening: on create of a contact, we can have a NULL e-mail.  This is normal behavior.  Our external process would update the NULL e-mail field while the record was open in the UI.  Because the UI hadn't been refreshed and the user made other changes, the e-mail would be blanked out (NULL).  To make sure that the field, once populated (yes, this is a business requirement), could not be cleared, I wrote a small pre-operation plugin that does the following:

  1. Checks the plugin context to see if we are trying to update emailaddress1 or emailaddress2.
  2. If we are trying to update either email address, we check whether the context value is null, "" or string.Empty.  We also double-check that the field is in the context, since my first if is an OR statement.
  3. If there is a value, then we move on.  But, if we are trying to blank out the email address, I simply remove the attribute from the context.
Here is a code snippet:

 using Microsoft.Xrm.Sdk;
 using Microsoft.Xrm.Sdk.Client;
 using System;
 using System.Linq;

 namespace ContactEmailValidation
 {
   public class ValidateEmail : IPlugin
   {
     public void Execute(IServiceProvider serviceProvider)
     {
       ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
       IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
       IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
       IOrganizationService service = factory.CreateOrganizationService(context.UserId);

       //Convert context to entity to make it easier to work with.
       Entity PreValidationData = (Entity)context.InputParameters["Target"];

       //Check if email fields are in the context. If they are then we are trying to update them.
       if (PreValidationData.Contains(ContactConstants.PrimaryEmail) || PreValidationData.Contains(ContactConstants.SecondaryEmail))
       {
         //Get current database contact data, in case business logic needs to compare values.
         Entity CurrentDatabaseData;
         using (OrganizationServiceContext orgContext = new OrganizationServiceContext(service))
         {
           CurrentDatabaseData = (from c in orgContext.CreateQuery("contact")
                                  where (Guid)c["contactid"] == PreValidationData.Id
                                  select c).SingleOrDefault();
         }
         tracer.Trace("Execution context does have either primary email or secondary email in it.");

         //If primary email is trying to update to null, remove it from the context so we don't clear out the email.
         if (PreValidationData.Contains(ContactConstants.PrimaryEmail) && (PreValidationData[ContactConstants.PrimaryEmail] == null || string.IsNullOrEmpty(PreValidationData[ContactConstants.PrimaryEmail].ToString())))
         {
           tracer.Trace("Trying to update primary email to null. Removing from context...");
           PreValidationData.Attributes.Remove(ContactConstants.PrimaryEmail);
         }

         //If secondary email is trying to update to null, remove it from the context so we don't clear out the email.
         if (PreValidationData.Contains(ContactConstants.SecondaryEmail) && (PreValidationData[ContactConstants.SecondaryEmail] == null || string.IsNullOrEmpty(PreValidationData[ContactConstants.SecondaryEmail].ToString())))
         {
           tracer.Trace("Trying to update secondary email to null. Removing from context...");
           PreValidationData.Attributes.Remove(ContactConstants.SecondaryEmail);
         }

         //Update the plugin context that will be saved to the database.
         tracer.Trace("Setting target to new context values.");
         context.InputParameters["Target"] = PreValidationData;
       }
     }
   }
 }

Because this is a pre-operation plugin, removing the values from the context prevents them from being cleared out.  Also, this is a stripped-down code snippet, so you will need to check it for any typos or syntax errors.

Multi-Select Option Sets Missing setVisible() for JavaScript (Workaround)

11/28/2018 8:04:00 AM

It is common practice in Microsoft Dynamics 365 to use Business Rules to show and hide fields on a form.  But, with the addition of multi-select option sets, you can't do this.  So the next logical step is to use JavaScript.  There is just one small problem: there is no setVisible() function for multi-select option sets.  Here are the controls we do have access to, as of this post:

So, how can we dynamically show / hide multi-select option sets?  The easiest way is to use a section.  Place a new section on your form.  Then use JavaScript to get the tab and section by name.  Once you have that, you can dynamically show / hide the section to make the multi-select option set show and hide.  Here is an example of the JavaScript:

 function ShowHide(executionContext) {
   //Create the form context
   var formContext = executionContext.getFormContext();
   var contactStatus = formContext.getAttribute("Field Name").getValue();
   var tabObj = formContext.ui.tabs.get("Tab Name");
   var sectionObj = tabObj.sections.get("Section Name");
   //Show the section (and the option set in it) only when the attribute has the value we care about.
   sectionObj.setVisible(contactStatus === 1);
 }

BETA Release Scribe .NET Library

11/10/2018 9:41:00 AM

In my spare time I have been building a .NET 4.5.2 library for the Tibco Scribe Online API.  For one developer this has been a daunting task and I am at the point that I need some help.  I have the library created and need people that can write unit tests to test the code.  The project is 100% open source and I am aiming for it to be a community build.  The project is hosted at GitHub.

This is a BETA release, because I have not created all the unit tests to make sure every single method is bug free.  Make sure to check the wiki on how to implement the code and use the project.  Once we are out of Beta I will create the NUGET package.

Thank you in advance to all that help out.

Dynamics 365 v9 Unit Testing and .NET Confusion

7/12/2018 11:13:00 AM

Recently while creating a plugin for Dynamics 365 v9, I ran into an issue trying to connect to CRM via the tooling connector in my unit test project.  The underlying problem was that the .NET version I was using in my unit test was 4.5.2.  This was preventing me from being able to connect to CRM to create my organization service.  I updated the .NET version on my unit test project to 4.6.1 and was then finally able to connect.  I will also add that I am using the latest nuget package version for Dynamics 365 v9.

For consistency, I updated the plugin project I was working on to .NET 4.6.1.  Locally, everything was working great.  I was able to connect to CRM and make sure that all the methods I had written did what they were supposed to do, using test-driven development practices.

Then when publishing my plugin via the latest version of the plugin registration tool, I received an error and could not publish my plugin.  The error was due to the .NET version of my plugin project not being 4.5.2.  REALLY!!!!  I just had connection issues and needed to upgrade the .NET version to 4.6.1 to resolve them.  Now I am being told to downgrade my .NET version to upload the plugins to Dynamics 365 v9.

While researching this, I came across this thread in the Microsoft forums, where people are seeing the same issue and are confused as to what is happening.  Basically, CRM plugins need to use .NET 4.5.2, but the XRM tooling connector uses .NET 4.6.1.  We need the tooling connector in our unit tests, because we are connecting to CRM from an external source to create the organization service context for us.  This means that our unit test project needs to be built on .NET 4.6.1 and our plugin project needs to be built on .NET 4.5.2.  Don't worry, the NuGet packages you need can work with both .NET versions without any problem, and you will be able to connect to CRM from your unit test project and call methods from the plugin project to test your code.  You will then be able to publish your plugins to CRM and not receive a .NET error message in the plugin registration tool.
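In old-style csproj terms, the split boils down to one property per project (a sketch; everything else in the two project files is omitted):

```xml
<!-- Plugin project: must target 4.5.2, or the plugin registration tool rejects the assembly -->
<TargetFrameworkVersion>v4.5.2</TargetFrameworkVersion>

<!-- Unit test project: targets 4.6.1 so the XRM tooling connector can connect to CRM -->
<TargetFrameworkVersion>v4.6.1</TargetFrameworkVersion>
```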

Working With N:N Relationships in LINQ - Using Late Binding

7/12/2018 10:54:00 AM

Recently I had a requirement where I needed to get related records via a many-to-many (N:N) relationship in Dynamics 365 via a plugin in C#.  During my searching online, I kept coming to posts that involved using query expressions.  I wanted to do this with LINQ, for consistency and readability in my code.  I will also add that what I was doing was done via the Associate and Disassociate messages of the plugin.  I won't be reviewing that part of the plugin in this post; I will just be focusing on the LINQ statement used to get the related records via the N:N relationship.  Here is the example code that I came up with:

 public List<Entity> GetRelatedRecords(Guid parentID, IOrganizationService service)
 {
   OrganizationServiceContext context = new OrganizationServiceContext(service);
   List<Entity> oProducts = (from r in context.CreateQuery(RELATIONSHIP_NAME)
                             join eOne in context.CreateQuery(ENTITY_ONE_LOGICALNAME) on r[ENTITY_ONE_UNIQUEID_LOGICALNAME] equals eOne[ENTITY_ONE_UNIQUEID_LOGICALNAME]
                             join eTwo in context.CreateQuery(ENTITY_TWO_LOGICALNAME) on r[ENTITY_TWO_UNIQUEID_LOGICALNAME] equals eTwo[ENTITY_TWO_UNIQUEID_LOGICALNAME]
                             where (Guid)eOne[ENTITY_ONE_UNIQUEID_LOGICALNAME] == parentID
                             select eTwo).ToList();
   return oProducts;
 }

What the above code does is use the relationship name (typically set up as prefix_entityOne_entityTwo), which is the name of the linking table, and join both linked entities together.  From that, we filter to the records linked to entity one via the GUID of the entity one record and return the entity two records.  We populate those in a list so we can loop through them or process them as needed.

SCRIBE Software Acquired By TIBCO

6/6/2018 5:52:00 AM

On June 6th, 2018 Scribe Software was acquired by TIBCO.  Below is the announcement I and other partners received in an e-mail with links to the press release.

"We are pleased to announce that on June 6, Scribe Software was acquired by TIBCO Software. This milestone reflects the increasing strategic importance that Scribe’s product line has with IT organizations and presents great opportunities for Scribe’s partner community.

In the short term, there will be no immediate impact to how you conduct business with Scribe. Your sales and support contacts will all remain the same. Over time, we expect that the combination of Scribe’s best-in-class iPaaS with TIBCO’s enterprise product portfolio, which includes messaging, application integration, API management, and analytics offerings, will provide significant capabilities and opportunities for Scribe’s partner community.

To learn more about the opportunities that lay ahead, read the press release..."

HttpWebRequest For Scribe Online Api

5/1/2018 5:20:00 AM

Scribe has made it extremely easy to work with their API.  Currently, in my limited free time, I am working on a reusable C# library to send and receive data from Scribe's API.  Once it is completed for Version 1, I will list it to GitHub, so anyone can consume it and provide feedback on improvements.  There will be a fully built out wiki with how to consume the library with references to the Scribe API documentation.  Also there will be both TDD (Test Driven Development) and BDD (Behavior Driven Development) projects.  This way you can see examples of how to consume the library as well as what I tested before release.  If you are not familiar with BDD, I recommend you check out my blog post in my Dynamics 365 blog, that gives a high level overview of it.

Anyways, back to what this post is about.  While working on the library, I came across one small issue around TLS 1.2 and making sure that the web request uses it.  Luckily, this is really easy to set in C# when using .NET 4.5.  By default, .NET 4.5 does not use TLS 1.2.  It is supported, just not the default.  So if an endpoint requires TLS 1.2, we need to specify it, decreasing the chance of an error occurring.  Here is a blog post I came across that helped me with my issue.

Because I am using C# .NET 4.5 I simply needed to add the following line of code:

 ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

The line of code is added right before my HttpWebRequest.  Here is the fully working request method:

 public string SendRequest(string username, string password, string url, string method)
 {
   WebResponseFactory factory = new WebResponseFactory();
   ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
   HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
   request.Method = method;
   request.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}")));
   request.ContentType = ParameterContentTypeSettings.applicationjson;
   request.ContentLength = 0;
   HttpWebResponse response = (HttpWebResponse)request.GetResponse();
   return factory.ProcessResponse(response);
 }

In this code I use a factory to determine what kind of response I need to return.  It could be just a string response or a JSON string response.  I have tested this against both Sandbox and Production APIs with Scribe Online, and it does work.  The only other thing I want to note in the above code is that I am sending a content length of 0.  This is because I received some error messages where content length was required, even though I wasn't sending a body.  If you need to send a JSON content body, here is the method where I do that:

 public string SendRequest(string username, string password, string url, string method, string contentBody)
 {
   ASCIIEncoding encoding = new ASCIIEncoding();
   byte[] encodedData = encoding.GetBytes(contentBody);
   WebResponseFactory factory = new WebResponseFactory();
   ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
   HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
   request.Method = method;
   request.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}")));
   request.ContentType = ParameterContentTypeSettings.applicationjson;
   request.ContentLength = encodedData.Length;
   using (var requestStream = request.GetRequestStream())
   {
     requestStream.Write(encodedData, 0, encodedData.Length);
   }
   HttpWebResponse response = (HttpWebResponse)request.GetResponse();
   return factory.ProcessResponse(response);
 }

If you need any help on working with the Scribe Api, please feel free to leave a comment below or leave a post in the Scribe Forums.  Also check out Scribe's Developer Portal for additional information.

Tarnovskiy, S. (2016, April 28). TLS 1.2 and .NET Support: How to Avoid Connection Errors. Retrieved April 30, 2018, from https://blogs.perficient.com/microsoft/2016/04/tsl-1-2-and-net-support/

Test Your Code With Your User Stories - Behavior Driven Development

4/16/2018 9:36:00 AM

When we are designing a new system, one of the tools we use is user stories.  User stories allow us to define what the feature should do from the view point of the end user.  This way we take a user centered approach to designing the system.  These are also used as part of our functional testing when writing code (plugin, JavaScript, etc.) to make sure what was written matches the user story.  Even if we use Test Driven Development (TDD) we could easily miss some of the key functions within the feature and need to go back to our code to make changes and then restart our testing process.  This can be time consuming.  Wouldn't it be better to start our testing off with the user story?

With advancements in frameworks and technology, we now have the capability to write test scripts directly from the user story using SpecFlow.  Taking this approach is known as Behavior Driven Development (BDD), because we are testing the user's interactions instead of just data and functions.  This becomes even more important when we have to work with a UI.  We can mimic button presses and navigation using other tools like Selenium and EasyRepro.
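As a tiny illustration of the idea (the feature and steps here are invented, not from a real project), the user story itself becomes the Gherkin scenario that SpecFlow binds to C# step definitions:

```gherkin
Feature: Contact email validation
  Scenario: A populated email cannot be cleared
    Given a contact with the email "test@example.com"
    When the user saves the contact with a blank email
    Then the contact still has the email "test@example.com"
```

Each Given/When/Then line maps to a method in a SpecFlow [Binding] class, so the test reads exactly like the user story it came from.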

I am not going to go through an entire setup process as SpecFlow has done an awesome job documenting how to set it up in CRM.  Also Wael Hamze has provided some great examples in GitHub when using FakeXrmEasy and EasyRepro.


SCRIBE Online April 2018 Update is Here!!!

4/9/2018 3:49:00 PM

On April 6th, 2018 Scribe released an update to SCRIBE Online.  You can read all the details of the changes here.  In this post about the update, I want to go over some of the stuff I have seen while working with this new update, and it's an AMAZING update to the platform.  Here are my highlights:

  • Performance - There does appear to be improvement in how well the platform functions in Chrome (Chrome is my primary browser).  Previously, I had some performance issues in Chrome, and on larger maps I would have to dump my cache or exit Chrome and re-launch to speed it back up again (not always the case, but it did need done from time to time).  This appears to be eliminated for me.  I can clearly see marked improvements when it comes to lag and performance. *See bullet point 4 under Platform, which is under Enhancements.
  • Agent Request Logs - Users now have the ability to request the logs from the agent without logging into the server where the agent is installed.  Simply go to the agent in Scribe Online and click on the gear that appears on the right side of the agent table.  Click on "Request Logs".  This will download a zip file with the logs for you to review. *See Agents section. 
  • Revision History - At the bottom of each mapping you will see a "Revision Comment" box.  While working on a mapping, you can insert a comment into this box to log what you are updating on the map.  Every time you click "Apply" or "OK" to save the map, a revision is stored.  This way, if you need to go back and restore a previous revision, you can.  To view the revision history, go to the map in your solution, click on the gear to the right of your map, and look for "Revisions".  I will do a more detailed blog post on this one in the future. *See Maps section.
  • Debugging - This was a bug that I ran into previously.  In previous releases, when debugging, sometimes the debugger wouldn't enter the block where the error happened.  This could cause confusion as to what the problem was.  Also, the error wouldn't appear when this happened.  This bug has been squashed.  Debugging is working like a charm, and errors are properly appearing, making it really easy to figure out what is happening when debugging a map. *See Debug section.
There are a lot of bug fixes and enhancements in this release.  What I highlighted in this post is only a small portion of them.  These are the ones that I chose to highlight, as they are perhaps the most notable and easily recognizable by people using the platform.  Again, for a complete list of bug fixes and enhancements, head over to the SCRIBE Success Community and check out the release notes.

Getting Started Connector Development

3/22/2018 12:36:00 PM

One of the benefits of working with Scribe Online is how easy they make it to create connectors if one does not exist.  In this blog post we are going to look at how to get setup, if this is the first time you have made a connector for Scribe Online.  But, before we get into that, we should first make sure that a connector doesn't already exist that can handle what we need.

We can do this by looking in the Scribe Online Marketplace.  Still not seeing what you need?  Reach out to Scribe directly or ask in the Scribe forums if a connector exists for an application.  There are instances where a connector exists but is not listed in the marketplace.  An example of this is for a client that I built a connector for.  They didn't want to setup a system to run the on-premise agent, so they asked me to set up the connector to run on Scribe's cloud agent.  This meant that I had to submit the connector to Scribe for validation.  Once published the connector is in the Scribe marketplace, but hidden.  Access to it is managed from within the client's Scribe Online org.  This means that only people that ask them for access can use it.  But, unless they tell you this, you won't know it.  So, it's worth asking Scribe before starting to develop a connector, if one exists.  As mentioned before, even if one doesn't Scribe makes it really easy to create one.

In this blog post we are only going to go over what you need to get setup.  We won't be getting in depth on connector development, that will come in future posts.  You will need the following to create connectors:

I will assume that if you are reading this, you are already familiar with writing code and have Visual Studio installed.  If that is the case, then all you need to do is install the GitLab extension (only needed if you are going to publish to Scribe for validation or if you don't have a current source control solution).  At this point we will install the fast connector framework (FCF) for both messaging and integration.  To do this, go into the FastConnectorFramework folder in the CDK that you downloaded.  There you will see 2 folders, each containing a .vsix file.  With Visual Studio closed, double-click on these files to install the extensions in Visual Studio.  Once this is done, you will see the below when creating a new project in Visual Studio:

With this all setup we can create connectors with the FCF for integration or messaging.  We can also create connectors from scratch using the CDK.  Then we can upload them to Scribe if we want them in the marketplace or use them with the cloud agent.  In future blog posts, we will go more in-depth on connector development and the differences between using the CDK or FCF.  I just wanted to put this post out as an introduction to connector development.

SCRIBE Labs - Variable Connector Used To Control A Loop (Dynamically)

3/9/2018 9:40:00 AM

Recently, while working on a data migration task, I had a requirement to transform data stored in a string delimited by ";" and parse it to create multiple line items in the target systems.  Here is an example as a visual:

productA;productB;productC;productD...


In the target system, each product would be a line item on an opportunity.  In the source system this was all listed in a text box as a string and delimited by ";".  Also the cell where this data was in the source could be null, have one item, ten items, fifty items, etc.  So I needed a way to loop through this.

To set this up I needed to use three connectors, source, target, and Scribe Labs - Variables.  Below is an image of the section of the mapping where I achieved what I needed to happen:

  1. If Product To Add - This is the null handler.  If the field where the products are listed in the source is null, then we skip adding any products in the target system.
  2. Lookup uom - This is target specific related record and not important for what we are working on.
  3. Upsert Counter - Here is where we declare our counter in the Scribe Labs - Variables connector.  For this I chose NumVariable, set the name to "counter" and the val to 1.
  4. Loop - Now we enter our loop.
  5. Get Counter - Here we do a lookup to get the val of the counter.  Yes, we just created the counter, but remember we are in a loop and need this lookup, since we will be updating the value in the loop.
  6. Lookup Product - Here is where we lookup the product in the target system.  For the lookup I am doing a like match against the product id field in the target system.  To do this I need to use the parse function in Scribe. Here is how the parse is setup: CONCATENATE("%", STRIP(PARSE(Source Field, Counter_NumVariableLookup.val, ";"),"AN")).  What we will focus on in this function is the Parse() piece.  Within the Parse we have to tell Scribe the field that we want to get the data from.  Then we tell it where specifically in that string we want to look.  Then how the string is delimited.  To make this dynamic, we are using our counter variable.  This means on first run our counter variable is set to one.  So, in a string of "ProductA;ProductB;ProductC" our parse is going to pull out "ProductA" because it is the first sub-string.  When the loop runs again, the counter will be two and "ProductB" will be returned.
  7. If Product Found - This if statement is to make sure we don't create any empty line items if the product doesn't exist in the target system.
  8. Create opportunityproduct - If we do find 1 product, then we create the record in the target system.
  9. Increase Counter - Here is where we increase the val of the counter by 1.
  10. If No More to Process - Now that we increased our counter, we need to check our string ("ProductA;ProductB;ProductC") to see if we need to run the loop again.  When you use Parse with a counter like this to get a specific sub-string and the counter moves past the end of the string (in our example, that would be 4), Parse returns null.  So for our If statement we use ISNULLOREMPTY() to check whether we reached the end of the string.  We would set up the condition to be ISNULLOREMPTY( PARSE(Source Field, Counter_NumVariableUpsert2.val, ";") ) equals True.
  11. Loop Exit - If we have reached the end of our string, then we exit our loop.
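To make the counter-and-Parse mechanics concrete, here is a minimal sketch of the loop above in JavaScript. The `parse` function mimics my understanding of Scribe's PARSE (1-based index, null past the end of the string); `lookupProduct` is a hypothetical stand-in for the "Lookup Product" block and is not a real Scribe API.

```javascript
// parse mimics Scribe's PARSE(field, index, delimiter): 1-based index,
// returns null when the index runs past the end of the string.
function parse(field, index, delimiter) {
  var parts = field.split(delimiter);
  return (index >= 1 && index <= parts.length) ? parts[index - 1] : null;
}

// Walks the delimited string exactly like steps 5-11 above.
// lookupProduct stands in for the "Lookup Product" block and returns
// the target-system product, or null when there is no match.
function migrateLineItems(sourceField, lookupProduct) {
  var created = [];
  var counter = 1;                                      // steps 4-5: create/get counter
  do {
    var product = lookupProduct(parse(sourceField, counter, ";")); // step 6
    if (product !== null) {                             // step 7: If Product Found
      created.push(product);                            // step 8: Create opportunityproduct
    }
    counter = counter + 1;                              // step 9: Increase Counter
  } while (parse(sourceField, counter, ";") !== null);  // step 10: If No More to Process
  return created;                                      // step 11: Loop Exit
}
```

Running `migrateLineItems("ProductA;ProductB;ProductC", ...)` visits each sub-string once and skips any product the lookup cannot find, which is exactly what the If Product Found block guards against.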
For more information check out these helpful links:

Loop Exit
Scribe Labs - Variable Connector

XrmToolBox Bulk Attachment Manager - Version 2018.2.3.2 Released!

3/9/2018 7:26:00 AM

I am pleased to announce that this week, I published an update to my XrmToolBox Bulk Attachment Manager plugin.  For those that are not aware, my plugin allows a user to download note attachments, email attachments, or both from Dynamics CRM.

This update includes bug fixes and a major enhancement that was requested by a user.  The enhancement is the ability to create a report of attachments in the system.  To do this, follow these steps:

  1. In Step One, choose if you want a report on notes, emails or both.
  2. In Step Two, choose "Report Only".
  3. In Step Three, click on "Run".
You will see the table populate with information.  Once it is done, click on export and you will be provided with a pipe delimited ("|") .csv file with the data in the table.  You can use this to quickly find the attachments you are interested in.  With the note GUIDs from the export, you can build a filtered-down list and export only those attachments using the "Specific Attachments" option.

This update also adds file size as a column of data in the table.  The file size provided is in bytes.

This is an open source project that I host on GitHub.  If you have any recommendations for the tool, run into any issues with it, or just want to support the plugin, please head over to my GitHub page.

Nuget Package
GitHub Project

XrmToolBox Bulk Attachment Manager - Version 2018.2.2.7 Released!

2/12/2018 1:24:00 PM

Today I published version 2018.2.2.7 of my XrmToolBox plugin.  The purpose of this plugin is to make it easy to download and back up attachments in CRM.  This is a major release, as it is the first build with all pieces working.  The first version only had note downloads working.

Overview of Plugin:

How To Use The Tool:
  1. Launch XRMToolBox.
  2. Connect to organization.
  3. Step 1, choose what you want to download. (Once a choice is made, Step 2 becomes usable)
    1. Notes - Download attachments from note entity.
    2. E-Mail - Download attachments from emails.
    3. Both - Downloads attachments from notes and emails entities.
  4. Step 2, choose if you want to download all attachments in the system or specific attachments.  
    1. All Attachments - This will search the chosen entity to find all records that have an attachment and download them.
      1. Click on "Browse" and choose where to download the attachments to.
    2. Specific Attachments - You will need to provide a .csv file with the GUID(s) of the records you want to download.
      1. Click on "Browse" and choose the .csv file to use.  Attachments will be downloaded to the same location as the .csv file.
  5. In Step 3, click on "Run".
  6. When complete, in step 4 click on "Export Results".  This will export the data in the output area into a pipe delimited (|) .csv file.
  • Note attachments are stored in a "Note Attachments" folder at the download location chosen.
  • Email attachments are stored in an "Email Attachments" folder at the download location chosen.
  • Inside the folders mentioned above are other folders.  The name of each of these sub-folders is the GUID of the source record (note / email); this way we maintain the relationship of the records.  Within each sub-folder are the attachment(s).
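The folder layout above can be sketched as a small path-building helper. The folder names come from the post; the helper itself is hypothetical, not part of the plugin's code.

```javascript
// Builds the download path described above: a per-entity-type folder,
// then a sub-folder named after the source record's GUID, which is what
// preserves the relationship between the record and its attachment(s).
function attachmentPath(downloadRoot, sourceEntity, recordGuid, fileName) {
  var entityFolder = sourceEntity === "note" ? "Note Attachments" : "Email Attachments";
  return [downloadRoot, entityFolder, recordGuid, fileName].join("/");
}
```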
If you run into any issues with this plugin please report the issue against the GitHub Project.

Validating User Input In CRM Portals With JavaScript

2/9/2018 6:53:00 AM

When we are setting up CRM Portals to allow customers to update their information, open cases, fill out applications, etc., we want to make sure that we are validating their input before it is committed to CRM.  This way we ensure that our data is clean and meaningful to us and the customer.

CRM Portals already has a lot of validation checks built into it.  But on occasion we need to add our own.  To do this we will use JavaScript to run the validation and also to output a message telling the user there is an issue they need to fix.

Before we can write any JavaScript, we need to check whether we are using it on an Entity Form or a Web Page, because the JavaScript, while similar, will be different.  First, we will go over the JavaScript for Entity Forms.  Then, we will go over the JavaScript for Web Pages.  Finally, we will look at the notification JavaScript.

Entity Form:
 if (window.jQuery) {  
   (function ($) {  
     if (typeof (entityFormClientValidate) != 'undefined') {  
       var originalValidationFunction = entityFormClientValidate;  
       if (originalValidationFunction && typeof (originalValidationFunction) == "function") {  
         entityFormClientValidate = function () {  
           var result = originalValidationFunction.apply(this, arguments);  
           // add custom validation here; return false to block the submit  
           return result;  
         };  
       }  
     }  
   }(window.jQuery));  
 }  

Web Page:
 if (window.jQuery) {  
   (function ($) {  
     if (typeof (webFormClientValidate) != 'undefined') {  
       var originalValidationFunction = webFormClientValidate;  
       if (originalValidationFunction && typeof (originalValidationFunction) == "function") {  
         webFormClientValidate = function () {  
           var result = originalValidationFunction.apply(this, arguments);  
           // add custom validation here; return false to block the submit  
           return result;  
         };  
       }  
     }  
   }(window.jQuery));  
 }  

Message JavaScript:
 $('.notifications').append($("<div class='notification alert alert-danger' role='alert'>PUT ERROR MESSAGE TEXT HERE</div>"));  
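Putting the pieces together, here is a minimal sketch of a custom validation on an Entity Form. The `address1_postalcode` field id, the ZIP-code rule, and the `isValidZip` helper are all illustrative assumptions, not part of the portal framework; only the `entityFormClientValidate` hook and the `.notifications` append come from the snippets above.

```javascript
// Hypothetical custom rule: require a 5-digit US ZIP code.
function isValidZip(zip) {
  return /^\d{5}$/.test(zip || "");
}

// Wire the rule into the portal's validation chain (browser only).
if (typeof window !== "undefined" && window.jQuery) {
  (function ($) {
    if (typeof entityFormClientValidate !== "undefined") {
      var originalValidationFunction = entityFormClientValidate;
      entityFormClientValidate = function () {
        // Run the portal's built-in validation first.
        var valid = originalValidationFunction.apply(this, arguments);
        // "address1_postalcode" is an assumed field id for illustration.
        if (valid && !isValidZip($("#address1_postalcode").val())) {
          $(".notifications").append($("<div class='notification alert alert-danger' role='alert'>Please enter a 5-digit ZIP code.</div>"));
          valid = false; // returning false blocks the submit
        }
        return valid;
      };
    }
  }(window.jQuery));
}
```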

Using Microsoft Dynamics 365 Within Microsoft Teams

1/26/2018 9:05:00 AM

In my SCRIBE Online blog, I wrote about how we can use SCRIBE Online in Microsoft Teams.  In this post I am going to talk about how we can use Microsoft Dynamics 365 in Microsoft Teams.  For those that might not be familiar with it, Microsoft Teams was released by Microsoft in 2017 to help increase communication and productivity within teams.  The best part about Microsoft Teams is that it is 100% free.  Click here to download Microsoft Teams.  Since its release there have been many updates and added connectors that increase its functionality and usability.

To leverage Microsoft Dynamics 365 within Microsoft Teams we can use either the Website Tab or the Dynamics 365 Connector.  First we will start with how to add the Website Tab.
  1. In your team, choose the channel you want to add the tab to.
  2. Click the "+" in the channel you have chosen and you will get a pop-up.
  3. Click on "Website"
  4. There will be a pop-up asking you to input the tab name and URL.  At this point you need to decide if you are going to have the website launch to the default landing page for Dynamics 365 or to a specific record in Dynamics 365.
  5. Leave the "Post to the channel about this tab" checked.  This will create a thread in the "Conversations" tab.  This way any communication around the Dynamics 365 record can be kept in one place.
  6. Click "Save".
Now that we know how to add Dynamics 365 in the website tab, you might be asking, "How can I get notifications if there is an update to the record in Dynamics 365 without logging into Dynamics 365?"  That is a really good question and one that the Dynamics 365 connector solves.  At the top of the Teams UI you will see the "breadcrumbs" of where you are.  This will be the team name followed by the channel.  After the channel name you will see three dots.  See the image below.

To add the connector we want, click on the three dots.  This will bring up a menu.  In the menu click on "Connectors" (if you don't see Connectors, wait a few minutes; this means you just created the team and it hasn't fully set up yet).  In the pop-up, you will see a list of connectors.  To filter these, click on CRM under "CATEGORY".

Next click on "Configure" next to Dynamics 365.  Wait a few seconds and you will see a screen asking you to pick the Dynamics 365 environment and then choose a record.  Once you see this, pick the Dynamics 365 instance you want to connect to.  Once Teams connects to the instance, you will see the instance URL appear.  Now we can search for the record we want in CRM; typically these will be leads, accounts or contacts, since almost all other records are linked to these in some fashion.  Once we find the record we want in the search results, click on save.  You will see a new thread start in the "Conversations" tab.  Now when any activities are updated for this record, there will be notifications within Microsoft Teams.

I hope this helps you with Microsoft Dynamics 365 within Microsoft Teams.  Stay tuned for any updates to Microsoft Teams and the Dynamics 365 connector for Microsoft Teams.

SCRIBE Online and Microsoft Teams

1/25/2018 7:39:00 AM

In 2017 Microsoft launched Microsoft Teams.  It's a free collaboration tool that brings together multiple different tools into one area and keeps communications organized.  But isn't this a SCRIBE Online blog?  Why are we talking about Microsoft Teams?  That's because we can use SCRIBE within Microsoft Teams to keep communications about our data migrations, integrations and replications in one area.  Let's get started...

Download and install Microsoft Teams from this link.  Once installed, open the application and right click on "Teams".  This will give you a pop-up to create a new team.  For this demo we will call our team SCRIBE.  Input a description and set the privacy to either private or public.  Private - you have to invite people to your team.  Public - others in your organization can see the team and join it.  Here is what you should see:

Clicking on "Next" will create your team.  After it is created you will get a pop-up to add more people.  For now we are going to skip this.  Now in your list of teams you should see "SCRIBE" with "General" under it.  "General" is referred to as a channel.

For this example we are going to work in the General channel.  But, if you want to add more channels, click on the "..." next to the SCRIBE team name and click on "Add channel".

In the large area to the right of the teams, you will see three circle images and a "Welcome to the team!" message.  Above that you will see "Conversations", "Files", "Wiki", and "+".  Conversations is where team communications will be logged in a thread style.  Files is where we can upload any files we create.  Wiki is where we can create an F.A.Q. or keep simple notes.  The plus sign is where we can add new items to our team.  For every new item we add, a conversation thread is created in the Conversations area.  For our practice we are going to add 3 websites.  Do the following:
  1. Click the "+"
  2. Click on Website icon.
  3. In the pop-up fill in the tab name (what will show in the navigation bar in teams) and URL.
  4. Leave "Post to the channel about this tab" checked.
  5. Click save.
Repeat the above 5 steps for each of these URLs:
  • https://app.scribesoft.com
  • https://help.scribesoft.com/scribe/en/index.htm
  • https://dev.scribesoft.com/en/home.htm
When you are done you should see the following at the top of the team:

In the Conversations tab you should see:

At this point our SCRIBE team is set up.  We can log into SCRIBE Online via the SCRIBE Online tab and create our solutions and mappings.  If you click on the little communication icon in the top right corner, it will open up the communication thread for the tab we are working in.  This way we can quickly see what other team members have said about this tab and what we have said.

There is a lot more that we can do with Microsoft Teams and SCRIBE, but for this post I just wanted to introduce the idea of using Microsoft Teams to increase communication.  I will leave you with a few things to look into that you can use in the SCRIBE team we set up:
  • Add the SCRIBE Status RSS feed to the conversation area to get updates on the status of SCRIBE Online.
  • Use Files to store the Google Drive spreadsheet we can create using the SCRIBE Documentation tool.
  • Use Files to store backups of our mappings and solutions, or add source control (TFS, VSTS, GitHub) to the team to store them.
  • Add other SCRIBE websites to the team.

Using SCRIBE Online to Migrate Opportunities Into Microsoft Dynamics 365

1/18/2018 8:33:00 AM

  Recently, while creating maps for a data migration from a legacy system to Microsoft Dynamics 365 using SCRIBE Online, I found that I needed to bring in opportunities.  Anyone that has worked with CRM for a time knows that migrating closed opportunities into CRM can be a bit of a pain.  I was having one issue in particular when bringing this data in: the actual close date was being set to the day the record was migrated into CRM.  This is because of how CRM handles opportunities when they are closed.

Here is what I was doing:

  1. I performed all my lookups first to get the associated account and contact from CRM.
  2. I created the opportunity with the closed status of won or lost (see links below for a website with the standard values for reference).
  3. This marks the opportunity as read only (which I expected to happen).  But the actual close date was set to the day the record was created, even though I provided a date in the actual close date field.
Why is this happening?
     This happens because when you close an opportunity in CRM, it creates an opportunity close activity.  Doing an Advanced Find, I was able to locate this record, and the actual value and actual close date were blank.  Keep in mind, my mapping didn't create this record.  CRM created it when the status of the opportunity was set to one of the closed statuses (won or lost).

How do we overcome this?
    The way to overcome this is really simple.  It was a post in the SCRIBE forums on how to do this with SCRIBE Insight that helped me come up with the way to do it with SCRIBE Online.  Here is the solution:
  1. Create the opportunity with the close status you want (won or lost).  This will make it read only.  Don't worry about that.
  2. I recommend doing an if/else block after the create opportunity step.  This way, if you are bringing in open opportunities, the if/else will be skipped.  We want to enter the if statement only if we have a closed opportunity.
  3. Inside our if/else, we want to do a lookup to get the opportunity we just created.
  4. After our lookup do a delete block against the opportunity close table.  Use the opportunity id in the matching criteria.
  5. After the delete we want to do a create opportunity close and populate it with the actual close date and actual revenue.
  6. Finally, do an update block to the original opportunity and populate the actual close date and actual revenue.  Even though the record is read only, it will update these fields.
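The six steps above can be sketched as code. The `target` object below stands in for the Scribe Online blocks against Dynamics 365; every one of its methods (`createOpportunity`, `deleteOpportunityClose`, and so on) is a hypothetical stand-in for a block, not a real API.

```javascript
// Sketch of the six-step fix for closed opportunities.
function fixClosedOpportunity(target, source) {
  // 1. Create the opportunity with its closed status (won or lost);
  //    CRM marks it read only and auto-creates an opportunity close activity.
  var opp = target.createOpportunity(source);
  // 2. Only run the fix-up for closed opportunities.
  if (source.status === "won" || source.status === "lost") {
    // 3. Look up the opportunity we just created.
    var found = target.lookupOpportunity(opp.id);
    // 4. Delete the auto-created opportunity close activity.
    target.deleteOpportunityClose(found.id);
    // 5. Recreate it with the real actual close date and actual revenue.
    target.createOpportunityClose({
      opportunityid: found.id,
      actualclosedate: source.actualclosedate,
      actualrevenue: source.actualrevenue
    });
    // 6. Update the (read-only) opportunity; these two fields still update.
    target.updateOpportunity(found.id, {
      actualclosedate: source.actualclosedate,
      actualrevenue: source.actualrevenue
    });
  }
}
```

The ordering matters: the delete has to happen before the create, otherwise two opportunity close activities end up on the record.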

Helpful Links:

Getting Your Data Into Dynamics 365 - Data Migration

1/8/2018 9:26:00 AM

When doing an implementation of Dynamics 365, it is rare that a new system is being set up with no data needing to be brought into it.  When looking into how to get that data into the new system, it can be really confusing trying to decide which tool to choose.  In my experience, people tend to choose the tool based on price, and this should really be the last thing to consider.  That is one of the reasons I will not talk about price in this post.  Instead I will stick to the pros and cons for each tool and provide an overview of each, along with any experience(s) I have had with it.

Here are the tools we will be looking at in this post:
KingswaySoft
SCRIBE Online
D365 Data Import Wizard
D365 Data Loader

KingswaySoft:
KingswaySoft is an integration toolkit built on SQL Server Integration Services (SSIS).  It is an on-premise solution (but can be cloud-based if installed on a virtual machine in Microsoft Azure or Amazon AWS).  It can be used for integration, migration and replication.  In full disclosure, I have not used KingswaySoft for a data migration yet, because of the reasons I list in the cons.  But I have had clients that use it.  They offer a free download and developer license, so you can give it a try.


Pros:
  • SSIS has been around for a long time and SQL developers are familiar with it.
  • Has starter packs that can be used to speed up development of mappings.
  • Easy integration with source control since development is done in Visual Studio.
Cons:
  • Since it's built on SSIS, you need to know the disadvantages of SSIS as well.
  • Can be intimidating if the user is not a developer, since you have to develop within Visual Studio or SQL Server Management Studio (SSMS).
  • Limited connections.
SCRIBE Online:
SCRIBE Online is a product offered by SCRIBE for data integration, migration and replication.  This is the tool I use the most when it comes to migrating data into Microsoft Dynamics 365 and integrating with other systems.  It runs in the cloud and has an on-premise agent you install to access your on-premise data / systems.  There is also a cloud agent, if all your applications are already in the cloud.  They offer a free trial that you can use to see if it will fit your needs.

Pros:
  • User interface is easy to understand and clean.
  • Runs in the cloud and uses a small tool (agent) to access on-premise data.  If all systems are in the cloud already, then a cloud agent can be used, so there is no need for an on-premise system.
  • Multiple connectors make it easy to connect to other systems and migrate the data into CRM.
  • API - They provide an API so you can access your instance programmatically.
  • Fast Connector Framework (FCF) \ Connector Development Kit (CDK) - If a connector doesn't exist, you can use the FCF or CDK to create one.
  • Provide starter kits to speed up your data mapping.
  • They offer a Dynamics 365 app, so you can monitor status from within CRM.
Cons:
  • Sometimes there are caching issues and you have to clear the browser cache to fix them.
  • Getting mappings into source control is a manual process, since you have to export them and copy them into a local folder.
D365 Data Import Wizard:
The Data Import Wizard is built into Microsoft Dynamics.  It allows for easy import of .csv, .xls, .xlsx and .zip files.  My very first data migration was done by importing .xls files into CRM using the Data Import Wizard (it was not easy).  If you are doing a large or complex migration, I recommend staying away from this approach and using a different tool.  Only use this for small, occasional imports.

Pros:
  • Built within Dynamics 365.
  • Mappings are saved in CRM for later use.
  • Great for small, non-complex mappings.
  • Can be used with the CRM SDK / Developer Guide.
Cons:
  • There is a limit to the file size you can use for importing data.  This means you will need to split up your data into multiple files, which can increase the possibility of data duplication.
  • If there are relationships to other entities in your mapping, it can be difficult to set them all up and have them work consistently.
  • The tool is very inconsistent when importing data.
D365 Data Loader:
This is a new tool that Microsoft has put out with Dynamics 365.  It was designed to help businesses go from CRM On-Premise to CRM Online.  I cannot give a pros and cons list for this one because it is a preview feature, which means it still has issues and is not fully released yet.  I will say that it is a viable option for going from CRM On-Premise to CRM Online when both systems are identical.

It can be really hard to choose a tool and say that it will work 100% of the time for all needs, because everyone's needs and skill sets are different.  If I had to choose one, I typically choose SCRIBE Online because it is the most user friendly and the easiest for people to pick up and maintain.

SCRIBE Stream Deck Version 2 Release - v02.01.2017

1/2/2018 8:03:00 AM

In a previous post, I talked about using SCRIBE Online with my Stream Deck.  The way I did this was by creating a few small programs that I connected to open buttons on my Stream Deck.  This was a temporary solution I created as a proof of concept.  What I didn't like about it was that I didn't have all the SCRIBE functions on it and I had to create multiple projects in Visual Studio.  But it did prove that what I wanted to do could be done.  Since that post I have rewritten the project into one application that does the following:
  • Allows for user created functions without having to change code.
  • Provides all the SCRIBE functions on the Stream Deck.
  • Provides buttons to launch SCRIBE Online, SCRIBE Forums, and SCRIBE Help.
  • Provides buttons for cut, copy, paste.
Here is an overview of the application:
The purpose of this application is to allow the Stream Deck to be used with SCRIBE Online.

On start, this application ends the stock Stream Deck application.  Only one instance of this application can run at a time, and measures have been taken to prevent multiple instances from running.  When the application is done running, it will automatically relaunch the stock Stream Deck application.

Home Page Buttons:

  • Back - Ends the program and launches the stock Stream Deck application.  (Only on the home screen; on other screens it takes you back one level.)
  • Open XML - Opens the XML that contains the SCRIBE and user defined functions.
  • Cut - Same as CTRL+X.
  • Copy - Same as CTRL+C.
  • Paste - Same as CTRL+V.
  • SCRIBE Online - Launches the SCRIBE Online web application.
  • SCRIBE Forums - Launches the SCRIBE Forums web site.
  • SCRIBE Help - Launches the SCRIBE Help web site.
  • Conversion - Folder containing the conversion functions.
  • Date - Folder containing the date functions.
  • Logical - Folder containing logical functions.
  • Math - Folder containing math functions.
  • Text - Folder containing text functions.
  • Miscellaneous - Folder containing miscellaneous functions.
  • User Defined - Folder containing user created functions.
When a folder button is pressed, the buttons refresh with the functions in that folder.  This matches what is in these folders in the SCRIBE Online application.  Because there are only 15 buttons (the top left is always used for back), only 14 functions can be viewed at once.  If there are more functions than can be displayed, you will see an ellipsis (...) button in the bottom right corner of the Stream Deck.  Also, when you go into a folder, the back button takes you one level back; it only exits the program from the home screen.

 Here is an outline of the basic usage:
  1. In SCRIBE Online open the formula editor and click where you want to insert the function.
  2. Navigate to the function you want on the Stream Deck. 
  3. Press the function button and release it. Upon releasing, the function will be placed at the cursor location in the formula editor.
Behind the scenes, the computer's clipboard is being used to do this.  This is important to know if you are copying and pasting other items.  When the button is pressed, the application looks up the function you want in the XML and gets its value.  Upon release of the button, it pastes the value of the function where the cursor is.
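The press/release flow can be sketched like this. The function table, clipboard, and editor objects are all hypothetical stand-ins for the application's XML, the system clipboard, and the formula editor; this is only a model of the behavior described above.

```javascript
// Models the two-phase button behavior: press copies the function's
// value to the clipboard, release pastes it at the cursor.
function makeStreamDeckHandler(functions, clipboard, editor) {
  return {
    // On press: look up the chosen function in the (XML-backed) table
    // and put its value on the clipboard.
    press: function (name) { clipboard.text = functions[name]; },
    // On release: paste the clipboard value at the cursor location.
    release: function () { editor.insertAtCursor(clipboard.text); }
  };
}
```

This also shows why copying something else between press and release would paste the wrong text: the handler trusts whatever is on the clipboard at release time.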

The back button will remain in the top left corner button. On the home screen this button will end the SCRIBE Stream Deck application. From any other screen it will take the user back one level.

To fully understand the application please check out the wiki on the GitHub project. 

Here are the links:
GitHub Project
Latest Release
Submit Feedback or Issue 

Dynamics Set IFrame URL - D365 v8 vs. D365 v9

12/29/2017 6:47:00 AM

While doing client work, I came across a problem with setting an IFrame URL dynamically.  The underlying issue was that the sandbox instance was on v8 of Dynamics 365 and production was on v9, because this client was set up around the time that Microsoft rolled out v9.  The JavaScript that I wrote to dynamically set the URL of the IFrame wasn't working in the v9 instance.  This was because of changes that Microsoft made to how IFrames are loaded on the form and also changes to JavaScript.

Here is my v8 setup:

  1. JavaScript runs OnLoad of the contact form.  This works because of how IFrames are loaded in v8.  You can also run it on either a tab change (hide / show) or the OnReadyStateComplete event of the IFrame.  Depending on your setup, you will need to choose which is best for you.  For me, in this case, it was the OnLoad event.
  2. Here is the JavaScript:
 function OnLoad() {  
   //Get memberid  
   var value = Xrm.Page.data.entity.attributes.get("new_memberid").getValue();  
   //Get the default URL for the IFRAME, which includes the   
   // query string parameters  
   var IFrame = Xrm.Page.ui.controls.get("IFRAME_NAMEofIFRAME");  
   var Url = IFrame.getSrc();  
   //Append the parameters to the new page URL  
   var newTarget = Url + "history?member=" + value;  
   // Use the setSrc method so that the IFRAME uses the  
   // new page with the existing parameters  
   IFrame.setSrc(newTarget);  
 }  

Here is my v9 setup:
  1. JavaScript runs on the OnReadyStateComplete event of the IFrame.  I had to add extra code to prevent constant refresh of the IFrame, because on this event, every time we update the URL it causes the OnReadyStateComplete event to fire again.  To prevent this, we use an if/else so we only update the URL if it hasn't been updated yet.
  2. Here is the JavaScript:
 function OnReadyStateComplete(executionContext) {  
   var formContext = executionContext.getFormContext();  
   var value = formContext.getAttribute("new_memberid").getValue();  
   //Get the default URL for the IFRAME, which includes the   
   // query string parameters  
   var IFrame = formContext.ui.controls.get("IFRAME_NAMEofIFRAME");  
   var Url = IFrame.getSrc();  
   var newTarget = "";  
   //Only set the URL if it hasn't been set yet; setSrc re-fires  
   //OnReadyStateComplete, which would refresh the IFRAME constantly  
   if (Url.indexOf("history?member=") === -1) {  
     //Append the parameters to the new page URL  
     newTarget = Url + "history?member=" + value;  
     // Use the setSrc method so that the IFRAME uses the  
     // new page with the existing parameters  
     IFrame.setSrc(newTarget);  
   }  
 }  

As you can see in my v9 code I have to pass in the execution context so I can create the form context.  To do that you have to check the box next to, "Pass execution context as first parameter" when adding the JavaScript to the form.

Microsoft has not deprecated Xrm.Page yet, but it is coming, so it is best to plan for it.  Here are some important links I used when solving my problem of the JavaScript not running:

Important Change (deprecations) coming
Execution Context
Form Context
Walkthrough: Write your first client script
Client API Reference

Using Stream Deck with SCRIBE

12/27/2017 1:15:00 PM

Because I am doing a lot of work with SCRIBE Online, I find myself trying to come up with solutions to speed up my work.  One solution I came up with is using my Stream Deck to automate some of the commands I type within SCRIBE Online.  Before we go too far, let me bring you up to speed on what the Stream Deck is.

The Stream Deck is manufactured by Elgato.  It is a small 15-key keyboard that can be customized.  What I like about it is how the keys are clean looking and can be easily changed by the application.  This hardware is marketed towards video editors and game streamers (neither of which I am).  I got it because of the number of keys it has, the ease with which you can update it, and because I didn't want to use macros on another external keyboard with paper labels stuck on the keys.

I will say that I am not using the Stream Deck for just SCRIBE, but also for other applications like Visual Studio, Microsoft Windows shortcut keys, and launching other applications.  The reason I am pointing it out here is because to use it with SCRIBE I did have to do a little programming, as it can't launch .bat files like I was hoping.  This meant I reached into my C# tool kit and wrote a few small programs to accomplish what I wanted.

Since SCRIBE Online runs in the browser, I wanted to automate the process of typing out commands like "IF(ISNULLOREMPTY(entity.attribute), "TRUE", "FALSE")".  In the past I would have Notepad open with that string pasted in it; then I would have to move my mouse over to it, copy it, paste it in the browser and then update it with the proper syntax.  That is easy enough, but time consuming when done repeatedly.  Now, with my Stream Deck, I simply press a button and it puts that string value on my system clipboard; then I press another button and it pastes it where my cursor is.  This means I don't have to type it out and I don't have to leave SCRIBE Online to go over to Notepad to copy it.  This saves me a few seconds that add up over time when working on large integration or migration projects.

Here are the following commands I have built so far into my Stream Deck:

  • IF(AND(),,)
  • IF(OR(),,)
  • FORMAT("yyyy-MM-dd",)

Also, all of those were just pasted into this post using the Stream Deck in about 2 seconds.  This is what my SCRIBE folder on the Stream Deck looks like right now:

I will be changing the icons to make them stand out from each other a little more, but I wanted to get it up and running quickly.  The middle row buttons use small applications that I wrote and published to GitHub for anyone else that might have a Stream Deck and want to do this.  Here is the link to my GitHub project.

Here are some other links you might find interesting:
Amazon to purchase Stream Deck
Open Source .NET Library (Unofficial)
Stream Deck Website

I would be interested to hear if anyone else is using this device and if they have any ideas on how else we could use it for SCRIBE Online.

Lookup Tables

12/19/2017 10:56:00 AM

While creating data mappings you may run into the need to store some variables in a table as key / value pairs.  This way you can call the variable by its key to get the value and pass it to the other system.  This is helpful when the two systems store data in different ways.  An example would be a table that stores United States state abbreviations as the key and the full state name as the value.  This is helpful when the source stores the full state name and the target stores the abbreviation, or vice versa.  So how do we set this up?

1) In the navigation bar click on "MORE".
2) Click on "Lookup Tables".
3) Click on the "+" sign on the right hand side.
4) In the pop-up input a Name.
5) On the ellipsis button on the right under Description you will get a dropdown with "Create", "Append" and "Export".
     5.1) Create - Click this to create a new key / value pair in the table.
     5.2) Append - If you have a CSV file already created, this is how you import it.  I do want to point out that it needs to be a CSV separated by "," and nothing else, or the import will fail.
     5.3) Export - Exports the table.

Now that we have our table built, we need to consume it.  To do this we will use the functions found in the Lookup Functions.  Here are the specific ones:

  • LOOKUPTABLEVALUE1() - Looks in column 2 for the value we pass it and returns the value in column 1 in the same row.
  • LOOKUPTABLEVALUE1_DEFAULT() - Looks in column 2 for the value we pass it and returns the value in column 1 in the same row.  If nothing is found in the table it returns the default value we pass it.
  • LOOKUPTABLEVALUE2() - Looks in column 1 for the value we pass it and returns the value in column 2 in the same row.
  • LOOKUPTABLEVALUE2_DEFAULT() - Looks in column 1 for the value we pass it and returns the value in column 2 in the same row.  If nothing is found in the table it returns the default value we pass it.
SCRIBE provides some detailed documentation around lookup tables that I encourage you to check out.  Here is a link to the help documentation.  Make sure to check out the "See Also" section at the bottom of the link to get more details on each of the functions we can use for the lookup tables.
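To make the column-1 / column-2 behavior of the four functions concrete, here is a small sketch using the state-abbreviation example above. The table data and the function names in lower camel case are illustrative; the lookup semantics follow the descriptions in the list.

```javascript
// A lookup table as rows of [column1, column2] pairs, mirroring the
// US-state example: abbreviation in column 1, full name in column 2.
var stateTable = [
  ["WI", "Wisconsin"],
  ["MN", "Minnesota"]
];

// LOOKUPTABLEVALUE1(_DEFAULT): search column 2, return column 1
// (full name -> abbreviation); null (or the default) when not found.
function lookupTableValue1(table, key, defaultValue) {
  for (var i = 0; i < table.length; i++) {
    if (table[i][1] === key) return table[i][0];
  }
  return defaultValue === undefined ? null : defaultValue;
}

// LOOKUPTABLEVALUE2(_DEFAULT): search column 1, return column 2
// (abbreviation -> full name); null (or the default) when not found.
function lookupTableValue2(table, key, defaultValue) {
  for (var i = 0; i < table.length; i++) {
    if (table[i][0] === key) return table[i][1];
  }
  return defaultValue === undefined ? null : defaultValue;
}
```

So a source that stores "Wisconsin" maps to a target that stores "WI" via `lookupTableValue1`, and the `_DEFAULT` variants are just the same lookups with a fallback value.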

Option Set Helper Code

12/19/2017 8:47:00 AM

When it comes to working with Option Sets while creating a plugin, custom workflow activity or other extension to CRM, it can sometimes be a pain to get the label of the Option Set.  There are many solutions out there on how to accomplish this, but for the most part they were not reliable.  So after doing a bunch of research online and going through the SDK I came up with 4 methods that help me when I am working with Option Sets.  There are 2 for global option sets and 2 for local option sets.  The reason there are 2 for each is because one method will return the value and the other will return the label.

1) Get Local Option Set Label - This method will return the label of the option set value you provide it.

     /// <summary>  
     /// Method to get local option set label  
     /// </summary>  
     /// <param name="entityLogicalName">Schema name of entity</param>  
     /// <param name="optionSetName">schema name of the optionset</param>  
     /// <param name="value">value of the option set option</param>  
     /// <param name="service">CRM Organizational Service</param>  
     /// <returns>String Label of the optionset value provided</returns>  
     public string GetLocalOptionSetLabel(string entityLogicalName, string optionSetName, int value, IOrganizationService service)
     {
       RetrieveEntityRequest retrieveDetails = new RetrieveEntityRequest
       {
         EntityFilters = EntityFilters.All,
         LogicalName = entityLogicalName
       };
       RetrieveEntityResponse retrieveEntityResponseObj = (RetrieveEntityResponse)service.Execute(retrieveDetails);
       EntityMetadata metadata = retrieveEntityResponseObj.EntityMetadata;
       PicklistAttributeMetadata picklistMetadata = metadata.Attributes.FirstOrDefault(attribute => string.Equals(attribute.LogicalName, optionSetName, StringComparison.OrdinalIgnoreCase)) as PicklistAttributeMetadata;
       OptionSetMetadata options = picklistMetadata.OptionSet;
       IList<OptionMetadata> optionsList = (from o in options.Options
                                            where o.Value.Value == value
                                            select o).ToList();
       string optionsetLabel = optionsList.First().Label.UserLocalizedLabel.Label;
       return optionsetLabel;
     }

2) Get Local Option Set Value - This method will return the value of the option set label you pass it.
     /// <summary>  
     /// Method to get local option set value  
     /// </summary>  
     /// <param name="entityLogicalName">Schema Name of entity</param>  
     /// <param name="optionSetName">Schema Name of option set.</param>  
     /// <param name="label">text label</param>  
     /// <param name="service">CRM Org Service</param>  
     /// <returns>Int option set value</returns>  
     public int GetLocalOptionSetValue(string entityLogicalName, string optionSetName, string label, IOrganizationService service)
     {
       RetrieveEntityRequest retrieveDetails = new RetrieveEntityRequest
       {
         EntityFilters = EntityFilters.All,
         LogicalName = entityLogicalName
       };
       RetrieveEntityResponse retrieveEntityResponseObj = (RetrieveEntityResponse)service.Execute(retrieveDetails);
       EntityMetadata metadata = retrieveEntityResponseObj.EntityMetadata;
       PicklistAttributeMetadata picklistMetadata = metadata.Attributes.FirstOrDefault(attribute => string.Equals(attribute.LogicalName, optionSetName, StringComparison.OrdinalIgnoreCase)) as PicklistAttributeMetadata;
       OptionSetMetadata options = picklistMetadata.OptionSet;
       IList<OptionMetadata> optionsList = (from o in options.Options
                                            where o.Label.UserLocalizedLabel.Label == label
                                            select o).ToList();
       int optionsetValue = optionsList.First().Value.Value;
       return optionsetValue;
     }

3) Get Global Option Set Label - This method will return the label of a global option set value you pass it.
     /// <summary>
     /// Method to get global option set label  
     /// </summary>  
     /// <param name="optionSetName">Schema name of option set</param>  
     /// <param name="value">Int value of option set value</param>  
     /// <param name="service">CRM Org service</param>  
     /// <returns>String option set label</returns>  
     public string GetGlobalOptionSetLabel(string optionSetName, int value, IOrganizationService service)
     {
       XrmServiceContext crmContext = new XrmServiceContext(service);
       string optionSetLabel = "";
       RetrieveOptionSetRequest retrieveOptionSetRequest = new RetrieveOptionSetRequest { Name = optionSetName };
       RetrieveOptionSetResponse retrieveOptionSetResponse = (RetrieveOptionSetResponse)crmContext.Execute(retrieveOptionSetRequest);
       OptionSetMetadata retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;
       OptionMetadata[] optionArray = retrievedOptionSetMetadata.Options.ToArray();
       foreach (var option in optionArray)
       {
         if (option.Value.Value == value)
         {
           optionSetLabel = option.Label.LocalizedLabels[0].Label;
         }
       }
       return optionSetLabel;
     }

4) Get Global Option Set Value - This method will return the value of the global option set label you pass it.
     /// <summary>  
     /// Method to get the option set value for a global optionset  
     /// </summary>  
     /// <param name="optionSetName">Schema name of the optionset</param>  
     /// <param name="label">text label</param>  
     /// <param name="service">CRM Org Service</param>  
     /// <returns>Int option set value</returns>  
     public int GetGlobalOptionSetValue(string optionSetName, string label, IOrganizationService service)
     {
       XrmServiceContext crmContext = new XrmServiceContext(service);
       int optionSetValue = 0;
       RetrieveOptionSetRequest retrieveOptionSetRequest = new RetrieveOptionSetRequest { Name = optionSetName };
       RetrieveOptionSetResponse retrieveOptionSetResponse = (RetrieveOptionSetResponse)crmContext.Execute(retrieveOptionSetRequest);
       OptionSetMetadata retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;
       OptionMetadata[] optionArray = retrievedOptionSetMetadata.Options.ToArray();
       foreach (var option in optionArray)
       {
         if (option.Label.LocalizedLabels[0].Label == label)
         {
           optionSetValue = option.Value.Value;
         }
       }
       return optionSetValue;
     }

XrmToolBox Bulk Attachment Manager - Version

12/14/2017 2:02:00 PM

I am pleased to announce that today I have published version 1 of my Bulk Attachment Manager plugin for XrmToolBox.  This is an open source plugin that I created because I have needed, multiple times, to migrate attachments out of CRM and into another system.  Because of the way CRM stores attachments in the database, migrating them is sometimes easier said than done.  So to make this process easy I came up with this tool.  Before I go any further I am sure you want the important links so you can take a look.  Here they are:

GitHub Project
Project Wiki
Nuget Package

Right now, this plugin can only download note attachments from CRM.  In future releases I will turn on email attachment downloads and note attachment uploads (the notes will need to already exist in the system).  To download note attachments you have 2 options: download all note attachments, or only selected note attachments.  For the latter, you will need to provide a .csv file with the GUIDs of the notes that contain the attachments you want to download.

When the attachments are downloaded, they go into a folder called "Note Attachments" in the directory you choose.  In that folder will be a folder for each note; the folder names are the GUIDs of the note records.  Inside those folders are the attached files themselves.  Also, for everything that is downloaded there is an output screen showing you what has been downloaded, along with data on the regarding record.  Once the process is done you can export these results into a pipe-delimited .csv file.

If you have any input on this plugin or experience any issues please leave a comment in the issue tab of the GitHub project.  Thank you.

Using SCRIBE Online For Scheduled Tasks

12/10/2017 8:17:00 AM

While doing client work, I ran into a need where we needed to scan for records where a date value is 3 months in the future (i.e. 3 months from today) in Microsoft Dynamics 365.  There are a few options to accomplish this:

  1. Microsoft Dynamics 365 SDK (Developer Guide)
  2. Timed workflows
  3. SCRIBE Online
The issue with using the Microsoft Dynamics 365 SDK (Developer Guide) is you need a software developer to create a console application, linked to a Windows scheduled task, that runs every night to connect to CRM, query the records and make the changes.  This means if you need any updates to the application, you need a developer to change the source code.  This is a viable option if you have an internal development team that can support the application.

When it comes to using timed workflows with wait statements, we would need to have the workflow fire at a specific time for each record.  In this scenario, this means the workflow needs to start the countdown (wait) when the record is created or when the date field we are monitoring changes.  The problem is that the more waiting workflows you have in the system, the more you will see a degradation in performance.  Also, if someone manually changes the date field you will need to find the previously waiting workflow and stop it, or you will have duplicate workflows waiting.  This can be a nightmare to manage.

The solution that we came up with for this client was to use SCRIBE Online to perform these actions.  Typically, when we think of SCRIBE Online, we think about connecting 2 systems, replicating data to another database or migrating data from one system to another.  But there is more we can do with it, and this is one of those areas.  In this scenario, we have an integration map that is set up with the same connection (Microsoft Dynamics 365) as both the source and the target.  We set the solution containing the mapping to run nightly and query for any records where the date field on the record is 3 months from the date the mapping runs.  If a record matches, the map creates a record in CRM (a triggering entity), which kicks off a workflow to generate a set of related records for the account with the date field.
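The nightly check the map performs amounts to simple date arithmetic.  A minimal sketch of that comparison (plain Java; the method name and dates are mine for illustration, not part of the SCRIBE map):

```java
import java.time.LocalDate;

public class ThreeMonthCheck {
    // True when the record's date is exactly three months after "today".
    static boolean isThreeMonthsOut(LocalDate today, LocalDate recordDate) {
        return recordDate.equals(today.plusMonths(3));
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.of(2017, 12, 10);
        System.out.println(isThreeMonthsOut(today, LocalDate.of(2018, 3, 10))); // true
        System.out.println(isThreeMonthsOut(today, LocalDate.of(2018, 3, 11))); // false
    }
}
```

Running this logic nightly is what lets the countdown live outside CRM instead of in a waiting workflow.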

With the above setup, there is no need for a developer to update code as the process can be updated with the internal CRM workflow and SCRIBE Online, which both use an easy to understand GUI.  By moving the counter outside of CRM, we also make sure that the CRM system won't lose performance.

The main reason for my blog post about this, is to show that we can use SCRIBE Online for more than just data migration, integration and replication.  The more that I work within SCRIBE Online, the more I can see its use for other tasks.

SCRIBE Online Training and Resources

11/30/2017 9:18:00 AM

SCRIBE Online is a massive tool that can be intimidating the first time you work with it.  Luckily, SCRIBE provides a lot of documentation and training to help you get started on your first go with SCRIBE Online.  When I first started working with SCRIBE Online it was in what they call the "Classic UI".  The latest version of SCRIBE is called the "Crystal UI".  I will say that there were a lot of changes between Classic and Crystal that took me a little bit to grasp.  But SCRIBE does a great job putting out content to help new users, and even experienced users, keep up to date with the platform.

In this post I wanted to go over some of the resources that SCRIBE offers to everyone for free.  This way you can get started with your first mapping or refresh your knowledge of SCRIBE Online.  First, we will talk about their webinars.  SCRIBE does regular webinars about its platform.  Past webinars can be found here.

Next, SCRIBE provides excellent video tutorials for SCRIBE Online.  As of me writing this post, here is a list of the videos with a small description of each:

  1. Crystal vs Classic UI - A Comparison.  This is a comparison of the classic UI to the Crystal UI.  I recommend this if you are familiar with the classic UI, but haven't worked in the Crystal UI yet.  If you are completely new to SCRIBE Online, you can skip this video.
  2. Introduction to Scribe Online.  As the title states this is an intro to SCRIBE Online platform and is good for people just getting into SCRIBE.
  3. Scribe Online Agent and its Setup.  This video goes over the agents, what they do and how to install them.
  4. Scribe Online Replication Service (RS).  Video that reviews Scribe's replication service.  Includes building solutions for creating and scheduling replications.
  5. Scribe Online Integration Service (IS). Video that reviews Scribe's integration service.  Includes building solutions and mappings for integrations.
  6. Scribe Online Event Maps.  Video that reviews Event driven mappings.  Also goes over how to set these mappings up.
If you need further help with SCRIBE Online please use the community forums to ask questions and search for answers.  Also, don't forget that you can sign up for a free trial to give SCRIBE a go.  Here is how you sign up for your free trial:
  1. Go to this link.
  2. Towards the middle of the page click on "Register For A Free Trial>".  Wait for the new page to load.  For me, this took about 2 minutes to complete.
  3. When the page loads fill in the form.  I recommend checking the check box to get updates.  Don't worry, SCRIBE won't spam you with information, like some companies do.  They will send you only needed information and updates.

SCRIBE Online Status Monitor

11/16/2017 10:39:00 AM

When working with cloud solutions it's always important to understand the options you have available to monitor that service.  For SCRIBE Online, there is a website where you can see the current status of:

  • Agents
  • API
  • Connectors
  • Event Solutions
  • User Interface
  • Sandbox
Not only can you see current statuses, but you can also see information about current and past incidents.  They also offer the ability to sign up for alerts by email, text and RSS.  Personally, I am signed up for email and text alerts.  To help keep the team I work with updated, I set up RSS notifications within our Microsoft Teams account.  I also recommend that, if you are a consultant, you let any client you have set up with SCRIBE integrations know about this page.

Here are some helpful tips for working with the site:
  • Subscribing To All Updates:

    1. Navigate to https://trust.scribesoft.com/
    2. Click on the "SUBSCRIBE TO UPDATES" button
    3. Choose how you want to subscribe:
       - Envelope = email
       - Phone = text message
       - Comment Bubble = support page
       - RSS = subscribe by RSS or Atom feeds
  • Subscribe To Current Incident:
    • If there is a current incident you will see a yellow banner with a message under it.  To subscribe to only that incident do the following:
    1. In the yellow banner click on "Subscribe"
    2. Input email address and/or phone number in the pop up
As I stated above, if there is a current incident happening you will see it in yellow as soon as you load the page.  Scrolling down the page you will see the current statuses for each of the items I outlined at the beginning of this post.  Under that you will see roughly the last 2 weeks broken down by day.  Here you can see if anything happened on those days.  At the very bottom of the page you will see "Incident History".  Clicking on it takes you to a page where you can see past incidents by month.

Documenting Maps

11/9/2017 9:15:00 AM

When we are creating mappings in SCRIBE, it's important that we document those maps, for two reasons.  First, we want to export the map and save the JSON file into the source control system.  This way we have a backup copy and can reuse the map if we need it again, similar to what you would do with a template.  The second reason is so others can have an understanding of the process.

In projects that I have worked on in the past, I have taken more of a manual approach to creating the supporting documentation for my integration and migration mappings.  This can be time consuming to say the least.  During one of these exercises, I found the SCRIBE Documentation Tool.  I walked through the steps they outlined and it auto-generated the documentation for me.  This has greatly sped up my work and gives me a spreadsheet that is easy to understand.

To create this documentation you will need a Google account as it uses Google Docs.  Here is a link to the detailed step by step process provided by SCRIBE.  Here is the high level process:

1) Allow API access to your SCRIBE Online organization.
2) Access the Documentation Tool by clicking this link. (Opens Google Spreadsheet).
3) Save a copy.  This is an important step as you create more than one of these; this way you don't overwrite a previous map's documentation.  Doing this will also make "SCRIBE" appear in the menu next to "Help".
4) Click SCRIBE -> Documentation Solution and follow the prompts.
5) Once the process starts, depending on the number of mappings in the solution it can run quickly or slowly.  Just wait for it to finish.

When the process is finished you will see the following:

  • Org Details - This contains info about your org.
  • Solution Details - Details about the solution.
  • All tabs after the first 2 are the individual mappings in the solution.
    • The top section of each mapping tab will contain the high level map info.
    • After that section we start at the beginning of the mapping.  Each block is highlighted, so it is easy to navigate between them.  Under each block is the information contained in that block and the attributes used, with what they are linked to or the conversion code we have written.
I hope this has helped you with documenting your data mappings.  Please check out the full documentation at SCRIBE's website.  Also SCRIBE provides the source code if you want to modify this tool for your specific needs.

Goodbye SDK...Hello Developer Guide

11/8/2017 8:36:00 AM

With the release of Microsoft Dynamics 365 version 9, we are saying goodbye to the large SDK we are used to downloading to extend Microsoft Dynamics and saying hello to the Developer Guide for Dynamics 365 Customer Engagement.  Here are some highlights of the version 9 release.

1) Previous SDK versions can still be downloaded for previous releases of Microsoft Dynamics CRM.

2) Per the Developer Guide, Dynamics 365 version 9 is an online-only release.

3) To get the early-bound generator, Plugin Registration Tool and other CRM tools, you will need to use PowerShell and NuGet.  Here is a link provided by Microsoft on how to get these tools.

4) Webhooks have been added to the plugin registration tool.  This allows for easy integration with Azure Functions.

5) As a developer you will have the ability to download the Developer Guide into a PDF for offline viewing.  To do this simply click on the "Download To PDF" on the bottom left of the screen of the Developer Guide.

6) In previous versions of the SDK, we were provided with sample code to help us quickly get up and running.  To get sample code and projects now, you will need to check out MSDN and GitHub.

There is a ton of information in the new Developer Guide and I highly recommend checking it out.  As I go through it, I will be writing posts about specifics in the new Developer Guide.  This post was meant to serve as a high-level overview and introduction.

Setup Non-Interactive User with Non-Expiring Password

11/2/2017 5:41:00 AM

When we need to connect CRM to another system, it is important that the connection remain working so data can flow easily between the systems.  One common issue that can arise is that the password of the user account used to create the connection can expire.  When this happens, it could be minutes to days before the issue is found, and that can lead to data synchronization between the systems getting messed up.  To mitigate this risk we can easily set up a user with a non-expiring password.  The user type for this in CRM is "Non-Interactive".  Non-interactive users can't log into CRM via the front end.  If you try, you will see this error message:

The other benefit of using a non-interactive user account is it doesn't require a CRM license to work.  You will only need to assign a license to the account for about 5 minutes to setup the account for the first time.  After that you can remove it and it will remain active in CRM.  You are allowed to have 5 non-interactive users.

Now let's get to the instructions on how to set this up.  You will need an admin account for CRM and Office 365.  You will also need PowerShell installed.

Setup Non-Interactive User:
  1. Create a user in the Office 365 admin center.  Be sure to assign a Dynamics 365 (online) license to the account.
  2. Go to Dynamics 365 (online).
  3. Go to Settings > Security.
  4. Choose Users > Enabled Users, and then click a user’s full name.
  5. In the user form, scroll down under Administration to the Client Access License (CAL) Information section and select Non-interactive for Access Mode.  You then need to remove the Dynamics 365 (online) license from the account.
  6. Go to the Office 365 admin center.
  7. Click Users > Active Users.
  8. Choose the non-interactive user account and under Product licenses, click Edit.
  9. Turn off the Dynamics 365 (online) license, and then click Save > Close multiple times.
  10. Go back to Dynamics 365 (online) and confirm that the non-interactive user account Access Mode is still set for Non-interactive.
  11. In CRM assign a role to the user account.
Setup Non-Expiring Password:
Important Links:
Install Azure Active Directory Module for Windows PowerShell (cmdlet)
Download Azure PowerShell Version 1 (file at bottom of page)
Set an individual user's password to never expire (initial steps I used)
Download Microsoft Online Services Sign-In Assistant for IT Professionals RTW
Initial Setup: (For installs always use the x64 version) – This only needs to be done once
  1. Install Microsoft Online Services Sign-In Assistant for IT Professionals RTW
  2. Install Microsoft Azure Active Directory Module for Windows PowerShell.  I did run into an issue here and needed to also install Azure AD PowerShell Version 1.
In PowerShell (run the following commands) – replace red text with user name of account you want to set the non-expiring password on:
 1.       Import-Module MSOnline
        2.       Connect-MsolService
      a.       You will get a pop-up and need to login with a global admin account
        3.       Set-MsolUser -UserPrincipalName serviceaccount@contoso.com -PasswordNeverExpires $true
        4.       Confirm the process worked by running the following command:
     a.       Get-MSOLUser -UserPrincipalName user ID | Select PasswordNeverExpires
     b.       If successful you will see this:
    Not Digitally Signed Error: Run this script in PowerShell –
    Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypas

October 2017 SCRIBE Online Feature Spotlight - Find and Replace

10/30/2017 3:51:00 PM

On October 26th, 2017, SCRIBE pushed out the fall update to SCRIBE Online.  One of the first changes I took notice of was the "Find & Replace" feature.  For those who are not aware, in the past, when importing a map backup (JSON or XML) into a different environment to use it as a template, you would need to rename connections.  Doing that in bulk required going into the JSON or XML file.  For someone not versed in either, this could be an intimidating task.  Well, SCRIBE has made this task even easier and, in my opinion, even safer.

First, if you don't see "Find & Replace" in your mapping, please clear your browser's cache and then close it.  Reopen your browser and you should see it now.  If you still don't see it, or see error messages, check out this troubleshooting article SCRIBE posted that can help.

When clicking on Find and Replace you will see the following:
Clicking on the down arrow between the search input and the ".*" button will reveal the replace field along with a bunch of different choices to filter results.
Here is what each section does:

  • Find - Input the string that you wish to search for.
  • .* - Toggles the ability to use the "Options" area.
  • Replace - Input the string you wish to use in place of the string in the find field.
  • Replace Button - Executes the replace on the chosen record results.
  • Select All - Chooses all results for bulk replace.
  • Search Section - Filter where you want to look for the string.  Use the Exp/Map/Filters to find the string within your attribute fields.
  • Options Section - Used if you are not 100% sure of the string and need to find strings similar to it.
  • Casing Section - Specify whether letter casing should be considered in the find or not.
In the little map I created to test out this new function, I used the account entity in Microsoft Dynamics CRM.  I wanted to find all attribute-level connections that contained the word "account", but I didn't want it to return my query.  Here is the find with results:
Using replace to simulate changing the connection to a new account source block (i.e. a lookup, while maintaining the same field-level mapping) - single record:
What I did in the above image was input my replace text in the replace input field and then select the record in blue.  I then clicked on the replace button.  What you can see is that it only replaced the "account" string on that one record but not the others.  Next, we will look at the bulk replace:
In the above image you can see that all "account" strings have been replaced.  This was done by holding the "CTRL" key and selecting multiple records.  But wait, there is the "Select All" next to the replace button.  Well, because I had already changed the highlighted record before the other 4, I didn't want to update it again.  In this scenario, had I used "Select All" it would have changed it to "accountLookupLookup", which is not what I wanted.

I hope you found this helpful.  Please read SCRIBE's full Release Notes on the Fall 2017 update.

Updated To Version 1.7 - Azure Blob Storage For Storing Microsoft Dynamics 365 Attachments

10/25/2017 3:18:00 PM

In another post I talked about using Azure Blob Storage to store CRM attachments in Azure rather than in CRM.  Today I installed the app in a client's CRM system and found that Microsoft Labs has updated the app to version 1.7.  This version has a lot of updates, from the UI to how notes attachment movement is set up.  They also now allow for moving existing attachments.  Here are the updated steps:

1) Go To AppSource
2) Install the Attachment Management App by clicking the "Get It Now" button under the image.
3) While it's installing, go to Azure and set up your blob storage account.
4) For each CRM instance you will need to set up 2 containers in your blob storage.  One is for email attachments and the other is for notes attachments.  So if you have a production instance and a sandbox instance you will need to have 4 containers (DevEmails, DevNotes, ProdEmails, ProdNotes).  If you want to separate note attachments by entity, then create one container per entity.  The DevNotes or ProdNotes container will be used as the default location, unless you specify a specific entity container for notes.
5) Still in Azure, click on "Shared Access Signatures".  Here you will set up your SAS key.  Complete all sections and click on "Generate SAS".
6) Copy the SAS key; we will need it later.
7) Go back to CRM and check to make sure the Attachment Management app finished installing.
8) In Settings -> Customizations -> Solutions, click on Publish All Customizations.
9) Refresh your screen.
10) You will see a new link in the navigation bar for Azure Attachment Storage.  Click on it and then on Azure Attachment Storage Setup.

11) Fill in the fields.
      - Name = The name of the blob storage account in Azure
      - SAS Token = Get this from step 6
      - Notes Attachment Container = Name of the container in Azure
      - Email Attachment Container = Name of the container in Azure
12) Click Confirm.
13) Go back to the navigation bar and click on Notes Attachment Entity Settings.  Here is where you choose which entities you want to upload notes attachments to Azure for.  You must choose each entity.  If you set up multiple containers in step 4 then you need to specify those here. ***SKIP ANYTHING WITH ADX_ IN THE ENTITY NAME.  DO NOT ADD THESE TO MOVE THE ATTACHMENTS TO AZURE.  IT WILL BREAK YOUR CRM PORTAL.*** I did notice that if I selected all entities at once the page seemed to hang, so I did it in groups and it worked fine.  Make sure to leave all entities selected that you want to automatically move attachments for.  Once unselected, they will stop moving.
That's it, your setup is now complete.

If you need to move existing attachments to Azure, do the following:
1) In the navigation bar click on Azure Attachment Storage and then on Reports and Admin.
2) Choose what you want to move in the left-hand column and click on Move to Blob.  You should see the number on the left decrease and the number on the right increase.

For additional information, including setting up bulk attachment upload, see the User Guide.

Azure Blob Storage For Storing Microsoft Dynamics 365 Attachments

10/21/2017 4:48:00 PM

In my Living In SCRIBE Online blog I talked about work I did to migrate CRM 4.0 on-premise to Dynamics 365 using SCRIBE Online.  In that blog I mentioned how there were about 80 GB worth of attachments that had to be moved into the cloud.  Well, this could quickly get expensive if we stored them directly in CRM.  Also, this client wasn't using SharePoint.  So what options do we have to store this data so end users can still access it?  Why not use Azure Blob Storage?

Microsoft has an app in AppSource to move attachments from CRM to Azure blob storage.  Not only does it allow for this, but it also has a web resource in it so you can allow for bulk uploads.  Here is how to set it up:

1) Go To AppSource
2) Install the Attachment Management App by clicking the "Get It Now" button under the image.
3) While it's installing, go to Azure and set up your blob storage account.
4) For each CRM instance you will need to set up 2 containers in your blob storage.  One is for email attachments and the other is for notes attachments.  So if you have a production instance and a sandbox instance you will need to have 4 containers (DevEmails, DevNotes, ProdEmails, ProdNotes).
5) Still in Azure, click on "Shared Access Signatures".  Here you will set up your SAS key.  Complete all sections and click on "Generate SAS".
6) Copy the SAS key; we will need it later.
7) Go back to CRM and check to make sure the Attachment Management app finished installing.
8) In Settings -> Customizations -> Solutions, click on Publish All Customizations.
9) Refresh your screen.
10) You may not see a link anywhere in the navigation to the Azure Blob Storage entity, so you can either turn it on to appear in the Settings area or just use advanced find to get to the entity.  It is called Azure Blob Storage Settings.
11) Add a new record.
      - Name = The name of the blob storage account in Azure
      - SAS Token = Get this from step 6
      - Notes Attachment Container = Name of the container in Azure
      - Email Attachment Container = Name of the container in Azure
12) Save

        Your attachments in CRM will now be stored in Azure.  You will still be able to access the notes within CRM with no problem at all, including even downloading them from CRM.  One thing to note about this is it will not move attachments that are already in CRM.  So it is best to install this early on if you are not going to use SharePoint integration.

        SCRIBE Connector Development - Handling Array List

        10/21/2017 4:20:00 PM

        Are you working on creating a connector with SCRIBE's CDK?  In your connector, do you have an array or list of strings that you need to pass?  SCRIBE makes this easy to do within the CDK and SCRIBE Online.

        I came across this scenario on a connector I was creating that passes a JSON message to an API.  In the JSON message it had a list of strings for entity IDs. Here is an easy way to accomplish this:

        1) Create your PropertyDefinition as part of your ObjectDefinition.

        1:  new PropertyDefinition  
        2:  {
        3:        Description = "Use TOLIST() to pass in a list of entity id's.",  
        4:        FullName = "Entity IDs",  
        5:        IsPrimaryKey = false,  
        6:        MaxOccurs = 1,  
        7:        MinOccurs = 0,  
        8:        Name = "PublishTo",  
        9:        Nullable = true,  
        10:       NumericPrecision = 0,  
        11:       NumericScale = 0,  
        12:       PresentationType = "string",  
        13:       PropertyType = typeof(string).Name,  
        14:       UsedInActionInput = true,  
        15:       UsedInActionOutput = true,  
        16:       UsedInLookupCondition = true,  
        17:       UsedInQueryConstraint = true,  
        18:       UsedInQuerySelect = true,  
        19:       UsedInQuerySequence = true  
        20:  }  

        Here I want to note lines 12 and 13.  Set these to the value types that your list will contain.  In this case we will have a list of strings.

        2) Publish connector so you can use it in SCRIBE Online.
        3) Create your mapping and for this attribute use the TOLIST() function.  It can be found in Functions -> Conversion -> TOLIST().

        This function takes 6 arguments, separated by commas.
        1) The list of items. Example "ABC123, DEF456, GHI789" - Here you can use a function to read from a source instead of typing in all the data.
        2) The delimiter we used to separate the list. Example ","
        3) The type for the data.  Example "string"
        4) How to sort the data.  Example "a" for ascending.
        5) Do you want to remove quotes?  Example true.
        6) Do you want to remove empty values? Example true.
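        The behavior these six arguments describe can be sketched in JavaScript.  This is only an illustration of the argument semantics, not SCRIBE's actual implementation; in particular the quote-stripping and the "d" descending option are my assumptions.

```javascript
// Illustrative sketch of TOLIST(items, delimiter, type, sort,
// removeQuotes, removeEmpty). Not SCRIBE's implementation.
function toList(items, delimiter, type, sort, removeQuotes, removeEmpty) {
  let values = items.split(delimiter).map(function (v) { return v.trim(); });
  if (removeQuotes) {
    // strip a single pair of surrounding double quotes, if present
    values = values.map(function (v) { return v.replace(/^"(.*)"$/, '$1'); });
  }
  if (removeEmpty) {
    values = values.filter(function (v) { return v.length > 0; });
  }
  if (sort === 'a') {
    values.sort();              // ascending
  } else if (sort === 'd') {
    values.sort().reverse();    // assumed: "d" for descending
  }
  // `type` describes the value type of the list; everything here stays a string
  return values;
}

console.log(toList('ABC123, DEF456, GHI789', ',', 'string', 'a', true, true));
// → [ 'ABC123', 'DEF456', 'GHI789' ]
```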

        Here is the expression completed with the examples above:
         TOLIST("ABC123, DEF456, GHI789", ",", "string", "a", true, true)  

        Here is the output:

        Using SCRIBE Online To Migrate From CRM 4.0 to Dynamics 365

        10/21/2017 12:17:00 PM

        Recently I had a client that needed to be migrated from CRM 4.0 on-premise to Dynamics 365 online.  First, I want to say that we opted not to do the upgrade path of CRM 2011 on-premise to CRM 2013 on-premise to CRM 2015 on-premise to Dynamics 365 on-premise, because the client wanted to start fresh with CRM customizations.  So personally, it would have been a waste of time to do that upgrade process.

        The following issues were identified with the data migration:
        1) CRM 4.0 is not supported by SCRIBE Online.
        2) The server OS that CRM 4.0 and SQL Server ran on was not supported.
        3) The version of SQL Server was not supported.

        How am I going to migrate this data? Hmm.....

        The solutions:
        1) The RDP Terminal Server I am using has a supported server OS.
        2) I am able to connect to SQL Server CRM 4.0 Database with ODBC.
        3) Install SCRIBE On-Premise Agent on the Terminal Server and use the ODBC connector to retrieve the data.

        By using the ODBC connector I could access the data in SCRIBE Online in a supported way.  The migration did run a little slower with this approach, because running multiple maps at the same time over the same ODBC connection would throw a connection error.

        One piece I was not able to migrate with SCRIBE Online was the attachments stored in CRM 4.0, which was about 80 GB worth.  I couldn't migrate these because of changes in how CRM stored them.  So to accomplish this I did the following:

        1) I downloaded this tool's source code from GitHub.
        2) I modified it to read a list of guids.
        3) I exported a list of attachment guids to csv.
        4) I modified the application to then download the attachments and put them into one root folder with subsequent folders inside it.  Each subsequent folder was named with a GUID, and inside that folder was the attachment.
        5) I then used the Microsoft Dynamics 365 SDK to create a console application to upload the attachments to Dynamics 365 and re-link them to the parent record.

        Once I made the code changes and wrote the application, the download and upload ran overnight.

        Now you might be asking yourself, "How did you store 80 GB of attachments in CRM online?  Isn't that expensive?"  I will be posting a separate blog on that in my Dynamics 365 blog.

        Creating Pre and Post Build Events For Connector Creation

        10/19/2017 11:29:00 AM

        While building a custom connector using the CDK, I found it time consuming to manually stop the SCRIBE Agent service, copy the connector .dll to the connector folder in the agent folder, and restart the service. To overcome this, I added some pre and post build events to my project in Visual Studio. Here is how you set up your development environment to automate all this:

        1) Right click on your project and go to properties.
        2) On left side go to Build Events.
        3) In "Pre-build event command line" input this:

        1:  net stop "Scribe Online Agent"  
        2:  Exit /b 0  

        4) In "Post-build event command line" input this:

        1:  copy /Y "$(TargetDir)$(ProjectName).dll" "C:\Program Files (x86)\Scribe Software\AGENTNAME\Connectors\CONNECTORNAME\$(ProjectName).dll"  
        2:  net start "Scribe Online Agent"  

        Replace AGENTNAME with the name of your agent and CONNECTORNAME with the name of the folder where you place the DLL file.
        What this will do is stop the service, build your connector, copy your connector to the agent folder and then start the service.

        Using Session Memory with CRM Portals

        10/19/2017 6:28:00 AM

        Recently while doing some client work, we noticed that CRM Portals does a post back when adding information within a sub-grid on an entity form.  Why is this an issue?  Because, if you have input fields on the form, these values are not written into CRM until the save button at the bottom of the form is clicked.  So if a user inputs anything into the sub-grid after they fill in the fields, the post back action will remove what the user input.  This can make for a bad user experience.  To overcome this, we can use session memory to temporarily store the values until the browsing session ends.

        To use session storage, the first thing we will need to do is register onChange handlers to update the values in session memory when the user changes a value.  Here are some examples of onChange handlers written in jQuery:

        1:  $(document).on('change', '#CHECKBOX ID', function () { SetSessionValue("#CHECKBOX ID") });  
        2:  $(document).on('change keyup paste', '#TEXTBOX ID', function () { SetSessionValue("#TEXTBOX ID") });  
        #1 is for check boxes and picklists
        #2 is for textbox inputs

        SetSessionValue is a reusable function I created to handle boolean, string and picklist fields.  Check my blog post on DateTime fields for how to handle onChange events with them.  Before we get into the source code for the SetSessionValue function, note that CRM Portals adds an _0 and _1 to the ID for boolean inputs on the portal.  That is why you see me add an _1 in the source code below:
        1:  function SetSessionValue(fieldname) {  
        2:    if ($(fieldname).hasClass('boolean-radio')) {  
        3:      if ($(fieldname + "_1").is(':checked')) {  
        4:        sessionStorage.setItem(fieldname, true);  
        5:      }  
        6:      else {  
        7:        sessionStorage.setItem(fieldname, false);  
        8:      }  
        9:    }  
        10:    else {  
        11:      var fieldValue = null;  
        12:      if ($(fieldname).hasClass('picklist')) {  
        13:        fieldValue = $(fieldname).find(":selected").val();  
        14:      }  
        15:      else {  
        16:        fieldValue = $(fieldname).val();  
        17:      }  
        18:      sessionStorage.setItem(fieldname, fieldValue);  
        19:    }  
        20:  }  

        To get the value out of session memory I created a small reusable function for that as well:
        1:  function GetSessionValue(fieldname) {  
        2:    var sessionValue = sessionStorage.getItem(fieldname);  
        3:    return sessionValue;  
        4:  }  

        Now with these parts we are writing to and reading from session memory.  All you need to do now is add a window.onload = function(){} or a $(document).ready to your code to populate your fields when a post back occurs.
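        As a sketch of that last step, the restore logic can be kept framework-agnostic so the jQuery wiring is only a thin wrapper.  The field IDs and the restoreSessionValues helper below are my own examples, not part of the portal:

```javascript
// Push saved values back into the form fields after a post back.
// `getItem` is sessionStorage.getItem in the browser; `setValue`
// writes a value back into an input (e.g. via jQuery).
function restoreSessionValues(fieldIds, getItem, setValue) {
  for (const id of fieldIds) {
    const saved = getItem(id);
    if (saved !== null) {
      setValue(id, saved);
    }
  }
}

// Browser wiring (example field IDs):
// $(document).ready(function () {
//   restoreSessionValues(['#CHECKBOX ID', '#TEXTBOX ID'],
//     function (k) { return sessionStorage.getItem(k); },
//     function (k, v) { $(k).val(v); });
// });
```

        For boolean fields you would set the checked state instead of the value, mirroring the _1 handling in SetSessionValue above.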

        CRM Portals OnChange Event For DateTime

        10/18/2017 10:13:00 AM

        Occasionally when working with CRM Portals you may run into the need to get the OnChange event for a date time field.  Because of the way that CRM Portals renders date time fields, this is slightly more complicated than working with text boxes, check boxes and pick lists.  During my process I ended up reaching out to Microsoft for help on this and below are the steps they provided to help me with this problem:

        1) Click on the date time control
        2) Press F12
        3) In console type $('div.control') and hit enter (this will give you a list of div controls)
        4) Locate the div control for the date time field

        5) Go to the entity form or web page in CRM and add the following code snippet (replace the 2 with the index where your div control is located):

        1:      $(document).ready(function ()  
        2:      {  
        3:        var dpcontrol = $('div.control')[2];  
        4:        $(dpcontrol).on("dp.change", function (e)   
        5:        {  
        6:           alert("On change Event Triggered");  
        7:           alert(e.date);  
        8:        });  
        9:      });  

        Change the 2 alerts to whatever you need to happen during the OnChange event.
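        For example, one thing you might do instead of the alerts is persist the changed date into session memory so it survives a post back, as in the session-memory post above.  The field name and the helper below are illustrative:

```javascript
// Build a dp.change handler that saves the new date. `store` stands in
// for sessionStorage; the field name is an example.
function makeDateChangeHandler(fieldName, store) {
  return function (e) {
    // e.date is the new value supplied by the dp.change event
    store.setItem(fieldName, e.date);
  };
}

// Browser wiring (same structure as the snippet above):
// var dpcontrol = $('div.control')[2];
// $(dpcontrol).on("dp.change",
//   makeDateChangeHandler("#DATETIME ID", sessionStorage));
```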