12/5/2018 6:36:00 AM
Recently, I had an issue I needed to overcome. The Dynamics 365 instance I was working on was integrated with multiple other systems through some long-running external processes, which meant a record open in the UI could hold stale data that would override values set by those processes. Here is what was happening: on create of a contact, the e-mail can be NULL, which is normal behavior. An external process would then update the NULL e-mail field while the record was open in the UI. Because the UI hadn't been refreshed, when the user made other changes and saved, the e-mail would be blanked out (set back to NULL). To make sure that the field, once populated (yes, this is a business requirement), could not be cleared, I wrote a small pre-operation plugin that does the following:
using System;
using System.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

namespace ContactEmailValidation
{
    public class ValidateEmail : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            // Convert the target in the context to an entity to make it easier to work with.
            Entity preValidationData = (Entity)context.InputParameters["Target"];

            // Check if either email field is in the context. If it is, we are trying to update it.
            if (preValidationData.Contains("emailaddress1") || preValidationData.Contains("emailaddress2"))
            {
                tracer.Trace("Execution context has either primary email or secondary email in it.");

                // Get the current database contact data for reference.
                Entity currentDatabaseData;
                using (OrganizationServiceContext orgContext = new OrganizationServiceContext(service))
                {
                    currentDatabaseData = (from c in orgContext.CreateQuery("contact")
                                           where (Guid)c["contactid"] == context.PrimaryEntityId
                                           select c).SingleOrDefault();
                }

                // If primary email is being set to null or empty, remove it from the context so we don't clear out the email.
                if (preValidationData.Contains("emailaddress1") && string.IsNullOrEmpty(preValidationData.GetAttributeValue<string>("emailaddress1")))
                {
                    tracer.Trace("Trying to update primary email to null. Removing from context...");
                    preValidationData.Attributes.Remove("emailaddress1");
                }

                // If secondary email is being set to null or empty, remove it from the context so we don't clear out the email.
                if (preValidationData.Contains("emailaddress2") && string.IsNullOrEmpty(preValidationData.GetAttributeValue<string>("emailaddress2")))
                {
                    tracer.Trace("Trying to update secondary email to null. Removing from context...");
                    preValidationData.Attributes.Remove("emailaddress2");
                }

                // Update the target in the plugin context; this is what will be saved to the database.
                tracer.Trace("Setting target to new context values.");
                context.InputParameters["Target"] = preValidationData;
            }
        }
    }
}
11/28/2018 8:04:00 AM
It is common practice in Microsoft Dynamics 365 to use Business Rules to show and hide fields on a form. But with the addition of multi-select option sets, you can't do this. So the next logical step is to use JavaScript. There is just one small problem: there is no setVisible() function for multi-select option set controls. So, as of this post, the workaround is to use what we do have access to and hide the tab or section that contains the field:
function ShowHide(executionContext)
{
    // Get the form context from the execution context.
    var formContext = executionContext.getFormContext();
    var contactStatus = formContext.getAttribute("Field Name").getValue();
    var tabObj = formContext.ui.tabs.get("Tab Name");
    var sectionObj = tabObj.sections.get("Section Name");
    if (contactStatus === 1)
    {
        sectionObj.setVisible(true);
        formContext.getAttribute("fieldName").setRequiredLevel("required");
    }
}
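For the multi-select option set itself, getValue() on the attribute returns an array of the selected numeric values (or null). As a minimal sketch (new_categories is a hypothetical field name), a check against a multi-select field could look like this:
function ShowHideMultiSelect(executionContext)
{
    var formContext = executionContext.getFormContext();
    // Multi-select option set attributes return an array of selected values, or null.
    var selected = formContext.getAttribute("new_categories").getValue();
    var sectionObj = formContext.ui.tabs.get("Tab Name").sections.get("Section Name");
    // Show the section only when option value 1 is among the selections.
    sectionObj.setVisible(selected !== null && selected.indexOf(1) !== -1);
}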
11/10/2018 9:41:00 AM
In my spare time I have been building a .NET 4.5.2 library for the Tibco Scribe Online API. For one developer this has been a daunting task, and I am at the point where I need some help. I have the library created and need people who can write unit tests to exercise the code. The project is 100% open source, and I am aiming for it to be a community build. The project is hosted on GitHub.
This is a BETA release, because I have not yet created all the unit tests needed to make sure every single method is bug free. Make sure to check the wiki on how to implement and use the project. Once we are out of beta I will create the NuGet package.
Thank you in advance to all who help out.
7/12/2018 10:54:00 AM
Recently I had a requirement to get related records via a many-to-many (N:N) relationship in Dynamics 365 from a plugin written in C#. During my searching online, I kept coming across posts that used query expressions. I wanted to do this with LINQ for consistency and readability in my code. I will also add that this work was being done in the Associate and Disassociate messages of the plugin. I won't be reviewing that part of the plugin in this post; I will just be focusing on the LINQ statement used to get the related records via the N:N relationship. Here is the example code that I came up with:
public List<Entity> GetRelatedRecords(Guid parentID, IOrganizationService service)
{
    OrganizationServiceContext context = new OrganizationServiceContext(service);
    // Join the intersect (relationship) entity to both related entities, filter on
    // the parent id, and select the records on the other side of the relationship.
    List<Entity> oProducts = (from r in context.CreateQuery(RELATIONSHIP_NAME)
                              join eOne in context.CreateQuery(ENTITY_ONE_LOGICALNAME) on r[ENTITY_ONE_UNIQUEID_LOGICALNAME] equals eOne[ENTITY_ONE_UNIQUEID_LOGICALNAME]
                              join eTwo in context.CreateQuery(ENTITY_TWO_LOGICALNAME) on r[ENTITY_TWO_UNIQUEID_LOGICALNAME] equals eTwo[ENTITY_TWO_UNIQUEID_LOGICALNAME]
                              where (Guid)eOne[ENTITY_ONE_UNIQUEID_LOGICALNAME] == parentID
                              select eTwo).ToList();
    return oProducts;
}
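For context, here is a hedged sketch of how the constants might be filled in; every name below is a hypothetical placeholder, with RELATIONSHIP_NAME being the logical name of the intersect entity:
// Hypothetical placeholders for an N:N between contact and a custom product entity.
private const string RELATIONSHIP_NAME = "new_contact_product";   // intersect entity logical name
private const string ENTITY_ONE_LOGICALNAME = "contact";
private const string ENTITY_ONE_UNIQUEID_LOGICALNAME = "contactid";
private const string ENTITY_TWO_LOGICALNAME = "new_product";
private const string ENTITY_TWO_UNIQUEID_LOGICALNAME = "new_productid";

// Usage from a plugin, once you have an IOrganizationService:
List<Entity> related = GetRelatedRecords(context.PrimaryEntityId, service);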
6/6/2018 5:52:00 AM
On June 6th, 2018, Scribe Software was acquired by TIBCO. Below is the announcement that I and other partners received in an e-mail, with links to the press release.
"We are pleased to announce that on June 6, Scribe Software was acquired by TIBCO Software. This milestone reflects the increasing strategic importance that Scribe’s product line has with IT organizations and presents great opportunities for Scribe’s partner community.
In the short term, there will be no immediate impact to how you conduct business with Scribe. Your sales and support contacts will all remain the same. Over time, we expect that the combination of Scribe’s best-in-class iPaaS with TIBCO’s enterprise product portfolio, which includes messaging, application integration, API management, and analytics offerings, will provide significant capabilities and opportunities for Scribe’s partner community.
To learn more about the opportunities that lay ahead, read the press release..."
5/1/2018 5:20:00 AM
Scribe has made it extremely easy to work with their API. Currently, in my limited free time, I am working on a reusable C# library to send and receive data from Scribe's API. Once version 1 is complete, I will publish it to GitHub so anyone can consume it and provide feedback on improvements. There will be a fully built-out wiki on how to consume the library, with references to the Scribe API documentation. There will also be both TDD (Test Driven Development) and BDD (Behavior Driven Development) projects, so you can see examples of how to consume the library as well as what I tested before release. If you are not familiar with BDD, I recommend you check out the post in my Dynamics 365 blog that gives a high-level overview of it.
Anyway, back to what this post is about. While working on the library, I came across one small issue around TLS 1.2 and making sure that the web request uses it. Luckily, this is really easy to set in C# when using .NET 4.5. By default, .NET 4.5 does not use TLS 1.2; it is supported, just not the default. So if TLS 1.2 is required by an endpoint, we need to specify it explicitly. This way we decrease the chance of an error occurring. Here is a blog post I came across that helped me with my issue.
Because I am using C# .NET 4.5, I simply needed to add the following line of code:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
public string SendRequest(string username, string password, string url, string method)
{
    WebResponseFactory factory = new WebResponseFactory();
    // Force TLS 1.2 before the request is created.
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = method;
    request.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}")));
    request.ContentType = ParameterContentTypeSettings.applicationjson;
    request.ContentLength = 0;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return factory.ProcessResponse(response);
}

public string SendRequest(string username, string password, string url, string method, string contentBody)
{
    ASCIIEncoding encoding = new ASCIIEncoding();
    byte[] encodedData = encoding.GetBytes(contentBody);
    WebResponseFactory factory = new WebResponseFactory();
    // Force TLS 1.2 here as well, so both overloads behave the same.
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = method;
    request.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}")));
    request.ContentType = ParameterContentTypeSettings.applicationjson;
    request.ContentLength = encodedData.Length;
    var requestStream = request.GetRequestStream();
    requestStream.Write(encodedData, 0, encodedData.Length);
    requestStream.Close();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return factory.ProcessResponse(response);
}
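For what it's worth, a call would look something like this; the URL is purely illustrative, so check the Scribe API documentation for the real endpoints:
// Hypothetical usage; replace the URL with a real endpoint from the Scribe API docs.
string response = SendRequest("user@example.com", "password", "https://api.scribesoft.com/v1/orgs", "GET");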
4/16/2018 9:36:00 AM
When we are designing a new system, one of the tools we use is user stories. User stories allow us to define what a feature should do from the viewpoint of the end user, so we take a user-centered approach to designing the system. They are also used as part of our functional testing when writing code (plugins, JavaScript, etc.) to make sure what was written matches the user story. Even if we use Test Driven Development (TDD), we could easily miss some of the key functions within the feature and need to go back to our code to make changes and then restart our testing process. This can be time consuming. Wouldn't it be better to start our testing off with the user story?
With advancements in frameworks and technology, we now have the capability to write test scripts directly from the user story using SpecFlow. Taking this approach is known as Behavior Driven Development (BDD) because we are testing the user's interactions instead of just data and functions. This becomes even more important when we have to work with a UI. We can mimic button presses and navigation using other tools like Selenium and EasyRepro.
I am not going to go through the entire setup process, as SpecFlow has done an awesome job documenting how to set it up. Also, Wael Hamze has provided some great examples on GitHub using FakeXrmEasy and EasyRepro. A minimal step-definition sketch is shown below.
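To give a flavor of what this looks like, here is a small step-definition sketch; it assumes the standard SpecFlow NuGet package, and the story and step names are hypothetical:
using TechTalk.SpecFlow;

[Binding]
public class CreateContactSteps
{
    // Steps for a hypothetical story: "A user can create a contact with an email address."
    [Given(@"a contact named ""(.*)""")]
    public void GivenAContactNamed(string name)
    {
        // Arrange: build the test contact here (e.g., with FakeXrmEasy).
    }

    [When(@"I set the email address to ""(.*)""")]
    public void WhenISetTheEmailAddressTo(string email)
    {
        // Act: apply the change under test.
    }

    [Then(@"the contact is saved without errors")]
    public void ThenTheContactIsSavedWithoutErrors()
    {
        // Assert: verify the expected outcome.
    }
}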
4/9/2018 3:49:00 PM
On April 6th, 2018, Scribe released an update to SCRIBE Online. You can read all the details of the changes here. In this post I want to go over some of what I have seen while working with this new update; it's an AMAZING update to the platform. Here are my highlights:
3/22/2018 12:36:00 PM
One of the benefits of working with Scribe Online is how easy they make it to create connectors when one does not exist. In this blog post we are going to look at how to get set up if this is the first time you have made a connector for Scribe Online. But before we get into that, we should first make sure that a connector doesn't already exist that can handle what we need.
We can do this by looking in the Scribe Online Marketplace. Still not seeing what you need? Reach out to Scribe directly or ask in the Scribe forums whether a connector exists for an application. There are instances where a connector exists but is not listed in the marketplace. An example of this is a connector I built for a client. They didn't want to set up a system to run the on-premise agent, so they asked me to set up the connector to run on Scribe's cloud agent. This meant that I had to submit the connector to Scribe for validation. Once published, the connector is in the Scribe marketplace, but hidden; access to it is managed from within the client's Scribe Online org, so only people who ask them for access can use it. But unless they tell you this, you won't know it. So it's worth asking Scribe whether a connector exists before starting to develop one. As mentioned before, even if one doesn't, Scribe makes it really easy to create one.
In this blog post we are only going to go over what you need to get set up. We won't be getting in depth on connector development; that will come in future posts. You will need the following to create connectors:
3/9/2018 9:40:00 AM
Recently, while working on a data migration task, I had a requirement to take data stored in a string delimited by ";" and parse it to create multiple line items in the target system. Here is an example as a visual:
Source:
productA; productB; productC; productD...
Target:
productA
productB
productC
productD
...
In the target system, each product would be a line item on an opportunity. In the source system, this was all listed in a text box as a single string delimited by ";". The cell holding this data in the source could also be null, or contain one item, ten items, fifty items, etc., so I needed a way to loop through it.
To set this up I needed three connectors: source, target, and Scribe Labs - Variables. Below is an image of the section of the mapping where I achieved what I needed to happen:
3/9/2018 7:26:00 AM
I am pleased to announce that this week I published an update to my XrmToolBox Bulk Attachment Manager plugin. For those who are not aware, my plugin allows a user to download note attachments, email attachments, or both from Dynamics CRM.
This update includes bug fixes and a major enhancement that was requested by a user: the ability to create a report of attachments in the system. To do this, follow these steps:
2/12/2018 1:24:00 PM
Today I published version 2018.2.2.7 of my XrmToolBox plugin. The purpose of this plugin is to make it easy to download and back up attachments in CRM. This is a major release, as it is the first build with all pieces working; the first version only had note downloads working.
2/9/2018 6:53:00 AM
When we set up CRM Portals to allow customers to update their information, open cases, fill out applications, etc., we want to make sure that we are validating their input before it is committed to CRM. This way we ensure that our data is clean and meaningful to us and the customer.
CRM Portals already has a lot of validation checks built into it, but on occasion we need to add our own. To do this we will use JavaScript to run the validation and also to output a message telling the user there is an issue they need to fix. The first block below hooks validation on entity forms, the second does the same for web forms, and the last line shows how to surface an error message:
if (window.jQuery) {
    (function ($) {
        if (typeof (entityFormClientValidate) != 'undefined') {
            var originalValidationFunction = entityFormClientValidate;
            if (originalValidationFunction && typeof (originalValidationFunction) == "function") {
                entityFormClientValidate = function () {
                    // Run the out-of-the-box validation first.
                    if (!originalValidationFunction.apply(this, arguments)) {
                        return false;
                    }
                    // DO VALIDATION HERE. RETURN TRUE IF PASSED AND FALSE IF FAILED.
                    return true;
                };
            }
        }
    }(window.jQuery));
}
if (window.jQuery) {
    (function ($) {
        if (typeof (webFormClientValidate) != 'undefined') {
            var originalValidationFunction = webFormClientValidate;
            if (originalValidationFunction && typeof (originalValidationFunction) == "function") {
                webFormClientValidate = function () {
                    // Run the out-of-the-box validation first.
                    if (!originalValidationFunction.apply(this, arguments)) {
                        return false;
                    }
                    // DO VALIDATION HERE. RETURN TRUE IF PASSED AND FALSE IF FAILED.
                    return true;
                };
            }
        }
    }(window.jQuery));
}
$('.notifications').append($("<div class='notification alert alert-danger' role='alert'>PUT ERROR MESSAGE TEXT HERE</div>"));
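Putting the pieces together, here is a sketch of one concrete check; the field id (#myfield) and the rule are hypothetical:
if (window.jQuery) {
    (function ($) {
        if (typeof (entityFormClientValidate) != 'undefined') {
            var originalValidationFunction = entityFormClientValidate;
            if (originalValidationFunction && typeof (originalValidationFunction) == "function") {
                entityFormClientValidate = function () {
                    // Run the built-in validation first.
                    if (!originalValidationFunction.apply(this, arguments)) {
                        return false;
                    }
                    // Hypothetical rule: the field must not be empty.
                    if ($("#myfield").val() === "") {
                        $('.notifications').append($("<div class='notification alert alert-danger' role='alert'>My Field is required.</div>"));
                        return false;
                    }
                    return true;
                };
            }
        }
    }(window.jQuery));
}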
1/18/2018 8:33:00 AM
Recently, while creating maps for a data migration from a legacy system to Microsoft Dynamics 365 using SCRIBE Online, I found that I needed to bring in opportunities. Anyone who has worked with CRM for a time knows that migrating closed opportunities into CRM can be a bit of a pain. I was having one issue in particular when bringing this data in: the actual close date was being set to the day the record was migrated into CRM. This is because of how CRM handles opportunities when they are closed.
Here is what I was doing:
1/8/2018 9:26:00 AM
When doing an implementation of Dynamics 365, it is rare that a new system is set up with no data needing to be brought into it. When looking into how to get that data into the new system, it can be really confusing trying to decide which tool to choose. In my experience, people tend to choose a tool based on price, and this should really be the last thing to consider; that is one of the reasons I will not talk about price in this post. Instead I will stick to the pros and cons of each tool and provide an overview of the tool and any experience(s) I have had with it.
Here are the tools we will be looking at in this post:
KingswaySoft
SCRIBE Online
D365 Data Import Wizard
D365 Data Loader
KingswaySoft:
KingswaySoft is an integration toolkit built on SQL Server Integration Services (SSIS). It is an on-premise solution (but can run in the cloud if installed on a virtual machine in Microsoft Azure or Amazon AWS). It can be used for integration, migration, and replication. In full disclosure, I have not used KingswaySoft for a data migration yet, because of the reasons I list in the cons, but I have had clients that use it. They offer a free download and developer license, so you can give it a try.
Pros:
12/29/2017 6:47:00 AM
While doing client work, I came across a problem with setting an IFrame URL dynamically. The underlying issue was that the sandbox instance was on v8 of Dynamics 365 while production was on v9, because this client was set up around the time that Microsoft rolled out v9. The JavaScript I wrote to dynamically set the URL of the IFrame wasn't working in the v9 instance. This was because of changes Microsoft made to how IFrames are loaded on the form, along with changes to the JavaScript client API.
Here is my v8 setup:
function OnLoad()
{
    // Get memberid.
    var value = Xrm.Page.data.entity.attributes.get("new_memberid").getValue();
    // Get the default URL for the IFRAME, which includes the query string parameters.
    var IFrame = Xrm.Page.ui.controls.get("IFRAME_NAMEofIFRAME");
    var Url = IFrame.getSrc();
    // Append the parameters to the new page URL.
    var newTarget = Url + "history?member=" + value;
    // Use the setSrc method so that the IFRAME uses the new page with the existing parameters.
    IFrame.setSrc(newTarget);
}
And here is my v9 setup:
function OnReadyStateComplete(executionContext)
{
    var formContext = executionContext.getFormContext();
    var value = formContext.getAttribute("new_memberid").getValue();
    // Get the default URL for the IFRAME, which includes the query string parameters.
    var IFrame = formContext.ui.controls.get("IFRAME_NAMEofIFRAME");
    var Url = IFrame.getSrc();
    var newTarget = "";
    // Only append the parameters if they are not already part of the URL.
    if (Url.includes(value))
    {
        newTarget = Url;
    }
    else
    {
        newTarget = Url + "history?member=" + value;
        // Use the setSrc method so that the IFRAME uses the new page with the existing parameters.
        IFrame.setSrc(newTarget);
    }
}
12/27/2017 1:15:00 PM
Because I am doing a lot of work with SCRIBE Online, I find myself trying to come up with ways to speed up my work. One solution I came up with is using my Stream Deck to automate some of the commands I type within SCRIBE Online. Before we go too far, let me bring you up to speed on what the Stream Deck is.
The Stream Deck is manufactured by Elgato. It is a small 15-key keyboard that can be customized. What I like about it is how clean the keys look and how easily they can be changed by the application. This hardware is marketed toward video editors and game streamers (neither of which I am). I got it because of the number of keys it has, the ease with which you can update it, and because I didn't want to use macros on another external keyboard with paper labels stuck on the keys.
I will say that I am not only using the Stream Deck for SCRIBE, but also for other applications like Visual Studio, Microsoft Windows shortcut keys, and launching other applications. The reason I am pointing it out here is that to use it with SCRIBE I did have to do a little programming, as it can't launch bat files like I was hoping. This meant I reached into my C# tool kit and wrote a few small programs to accomplish what I wanted.
Since SCRIBE Online runs in the browser, I wanted to automate the process of typing out commands like "IF(ISNULLOREMPTY(entity.attribute), "TRUE", "FALSE")". In the past I would have Notepad open with that string pasted in it; I would have to move my mouse over to it, copy it, paste it into the browser, and then update it with the proper syntax. That is easy enough, but time consuming when done repeatedly. Now, with my Stream Deck, I simply press a button and it puts that string value on my system clipboard; then I press another button and it pastes the string where my cursor is. This means I don't have to type it out and I don't have to leave SCRIBE Online to go over to Notepad to copy it. This saves me a few seconds that add up over time when working on large integration or migration projects.
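As a rough sketch of what those little helper programs look like (this is a minimal example, not the exact code I run; it assumes a reference to System.Windows.Forms):
using System;
using System.Windows.Forms;

static class ClipboardHelper
{
    // Clipboard access on Windows requires a single-threaded apartment (STA) thread.
    [STAThread]
    static void Main()
    {
        // Put a SCRIBE formula template on the clipboard; a Stream Deck key launches this exe.
        Clipboard.SetText("IF(ISNULLOREMPTY(entity.attribute), \"TRUE\", \"FALSE\")");
    }
}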
Here are the commands I have built into my Stream Deck so far:
12/19/2017 10:56:00 AM
While creating data mappings you may run into the need to store some variables in a table as key/value pairs. This way you can call a variable by its key to get the value and pass it to the other system, which is helpful when the two systems store data in different ways. An example of this would be a table that stores United States state abbreviations as the key and the full state name as the value. This is useful when the source stores the full state name and the target stores the abbreviation, or vice versa. So how do we set this up?
1) In the navigation bar click on "MORE".
2) Click on "Lookup Tables".
3) Click on the "+" sign on the right-hand side.
4) In the pop-up, input a Name.
5) On the ellipsis button on the right under Description you will get a dropdown with "Create", "Append" and "Export".
5.1) Create - Click this to create a new key/value pair in the table.
5.2) Append - If you have a CSV file already created, this is how you import it. I do want to point out that it needs to be a CSV separated by "," and nothing else, or the import will fail.
5.3) Export - Exports the table.
Now that we have our table built, we need to consume it. To do this we will use the functions found under Lookup Functions, as in the example below. Here are the specific ones:
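If memory serves (verify the exact function names in your org's formula editor), translating a state abbreviation into a full name with a lookup table named "StateCodes" looks something like this:
LOOKUPTABLEVALUE2("StateCodes", source.State)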
12/19/2017 8:47:00 AM
When it comes to working with option sets while creating a plugin, custom workflow activity, or other extension to CRM, it can sometimes be a pain to get the label of an option set. There are many solutions out there on how to accomplish this, but for the most part I did not find them reliable. So after doing a bunch of research online and going through the SDK, I came up with 4 methods that help me when I am working with option sets: 2 for global option sets and 2 for local option sets. There are 2 for each because one method returns the value and the other returns the label.
1) Get Local Option Set Label - This method will return the label of the option set value you provide it.
/// <summary>
/// Method to get a local option set label.
/// Requires: Microsoft.Xrm.Sdk, Microsoft.Xrm.Sdk.Messages, Microsoft.Xrm.Sdk.Metadata,
/// System.Collections.Generic, System.Linq.
/// </summary>
/// <param name="entityLogicalName">Logical name of the entity</param>
/// <param name="optionSetName">Logical name of the option set</param>
/// <param name="value">Value of the option set option</param>
/// <param name="service">CRM Organization Service</param>
/// <returns>String label of the option set value provided</returns>
public string GetLocalOptionSetLabel(string entityLogicalName, string optionSetName, int value, IOrganizationService service)
{
    // Retrieve the entity metadata, then find the picklist attribute on it.
    RetrieveEntityRequest retrieveDetails = new RetrieveEntityRequest
    {
        EntityFilters = EntityFilters.All,
        LogicalName = entityLogicalName
    };
    RetrieveEntityResponse retrieveEntityResponseObj = (RetrieveEntityResponse)service.Execute(retrieveDetails);
    EntityMetadata metadata = retrieveEntityResponseObj.EntityMetadata;
    PicklistAttributeMetadata picklistMetadata = metadata.Attributes.FirstOrDefault(attribute => string.Equals(attribute.LogicalName, optionSetName, StringComparison.OrdinalIgnoreCase)) as PicklistAttributeMetadata;
    OptionSetMetadata options = picklistMetadata.OptionSet;
    // Filter the options down to the one matching the supplied value.
    IList<OptionMetadata> optionsList = (from o in options.Options
                                         where o.Value.Value == value
                                         select o).ToList();
    string optionsetLabel = optionsList.First().Label.UserLocalizedLabel.Label;
    return optionsetLabel;
}

2) Get Local Option Set Value - This method will return the value of the option set label you provide it.

/// <summary>
/// Method to get a local option set value
/// </summary>
/// <param name="entityLogicalName">Logical name of the entity</param>
/// <param name="optionSetName">Logical name of the option set</param>
/// <param name="label">Text label</param>
/// <param name="service">CRM Organization Service</param>
/// <returns>Int option set value</returns>
public int GetLocalOptionSetValue(string entityLogicalName, string optionSetName, string label, IOrganizationService service)
{
    RetrieveEntityRequest retrieveDetails = new RetrieveEntityRequest
    {
        EntityFilters = EntityFilters.All,
        LogicalName = entityLogicalName
    };
    RetrieveEntityResponse retrieveEntityResponseObj = (RetrieveEntityResponse)service.Execute(retrieveDetails);
    EntityMetadata metadata = retrieveEntityResponseObj.EntityMetadata;
    PicklistAttributeMetadata picklistMetadata = metadata.Attributes.FirstOrDefault(attribute => string.Equals(attribute.LogicalName, optionSetName, StringComparison.OrdinalIgnoreCase)) as PicklistAttributeMetadata;
    OptionSetMetadata options = picklistMetadata.OptionSet;
    // Filter the options down to the one matching the supplied label.
    IList<OptionMetadata> optionsList = (from o in options.Options
                                         where o.Label.UserLocalizedLabel.Label == label
                                         select o).ToList();
    int optionsetValue = optionsList.First().Value.Value;
    return optionsetValue;
}

3) Get Global Option Set Label - This method will return the label of the global option set value you provide it.

/// <summary>
/// Method to get a global option set label
/// </summary>
/// <param name="optionSetName">Schema name of the option set</param>
/// <param name="value">Int value of the option set option</param>
/// <param name="service">CRM Organization Service</param>
/// <returns>String option set label</returns>
public string GetGlobalOptionSetLabel(string optionSetName, int value, IOrganizationService service)
{
    string optionSetLabel = "";
    // Global option sets are retrieved by name, independent of any entity.
    RetrieveOptionSetRequest retrieveOptionSetRequest = new RetrieveOptionSetRequest { Name = optionSetName };
    RetrieveOptionSetResponse retrieveOptionSetResponse = (RetrieveOptionSetResponse)service.Execute(retrieveOptionSetRequest);
    OptionSetMetadata retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;
    OptionMetadata[] optionArray = retrievedOptionSetMetadata.Options.ToArray();
    foreach (var option in optionArray)
    {
        if (option.Value.Value == value)
        {
            optionSetLabel = option.Label.LocalizedLabels[0].Label;
        }
    }
    return optionSetLabel;
}

4) Get Global Option Set Value - This method will return the value of the global option set label you provide it.

/// <summary>
/// Method to get the option set value for a global option set
/// </summary>
/// <param name="optionSetName">Schema name of the option set</param>
/// <param name="label">Text label</param>
/// <param name="service">CRM Organization Service</param>
/// <returns>Int option set value</returns>
public int GetGlobalOptionSetValue(string optionSetName, string label, IOrganizationService service)
{
    int optionSetValue = 0;
    RetrieveOptionSetRequest retrieveOptionSetRequest = new RetrieveOptionSetRequest { Name = optionSetName };
    RetrieveOptionSetResponse retrieveOptionSetResponse = (RetrieveOptionSetResponse)service.Execute(retrieveOptionSetRequest);
    OptionSetMetadata retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;
    OptionMetadata[] optionArray = retrievedOptionSetMetadata.Options.ToArray();
    foreach (var option in optionArray)
    {
        if (option.Label.LocalizedLabels[0].Label == label)
        {
            optionSetValue = option.Value.Value;
        }
    }
    return optionSetValue;
}
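As a quick illustration, calls to these methods look like this; the entity, fields, label, and value below are all hypothetical:
// Hypothetical usage: translate between a label and a value for custom status fields.
string label = GetLocalOptionSetLabel("contact", "new_status", 100000000, service);
int value = GetLocalOptionSetValue("contact", "new_status", "Active", service);
string globalLabel = GetGlobalOptionSetLabel("new_globalstatus", 100000000, service);
int globalValue = GetGlobalOptionSetValue("new_globalstatus", "Active", service);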
12/14/2017 2:02:00 PM
I am pleased to announce that today I have published version 1 of my Bulk Attachment Manager plugin for XrmToolBox. This is an open source plugin that I created because I have had the need, multiple times, to migrate attachments out of CRM and into another system. Because of the way CRM stores attachments in the database, migrating them is sometimes easier said than done, so I came up with this tool to make the process easy. Before I go any further, I am sure you want the important links so you can take a look. Here they are:
GitHub Project
Project Wiki
Nuget Package
Right now, this plugin can only download note attachments from CRM. In future releases I will turn on email attachment downloads and note attachment uploads (the notes will need to already exist in the system). To download note attachments, you have 2 options: download all note attachments, or download only selected note attachments. For the latter, you will need to provide a .csv file with the GUIDs of the notes that contain the attachments you want to download.
The attachments download into a folder called "Note Attachments" in the directory you choose. In that folder will be a folder for each note, named with the GUID of the note record. Inside those folders are the attached files themselves. For everything that is downloaded there is an output screen showing you what has been downloaded, along with data on the regarding record. Once the process is done you can export these results into a pipe-delimited .csv file.
If you have any input on this plugin or experience any issues, please leave a comment in the issues tab of the GitHub project. Thank you.
12/10/2017 8:17:00 AM
While doing client work, I ran into a need to scan for records in Microsoft Dynamics 365 where a date value is 3 months in the future (i.e., 3 months from today). There are a few options to accomplish this:
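One of those options, sketched with the SDK's QueryExpression (the entity and date field here are hypothetical):
// Find contacts whose (hypothetical) renewal date falls exactly three months from today.
QueryExpression query = new QueryExpression("contact")
{
    ColumnSet = new ColumnSet("fullname", "new_renewaldate")
};
query.Criteria.AddCondition("new_renewaldate", ConditionOperator.On, DateTime.Today.AddMonths(3));
EntityCollection results = service.RetrieveMultiple(query);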
11/30/2017 9:18:00 AM
SCRIBE Online is a massive tool that can be intimidating the first time you work with it. Luckily, SCRIBE provides a lot of documentation and training to help you get started on your first go with SCRIBE Online. When I first started working with SCRIBE Online, it was in what they call the "Classic UI". The latest version of SCRIBE is called the "Crystal UI", and I will say that there were a lot of changes between Classic and Crystal that took me a little while to grasp. But SCRIBE does a great job putting out content to help new users, and even experienced users, keep up to date with the platform.
In this post I wanted to go over some of the resources that SCRIBE offers to everyone for free, so you can get started with your first mapping or refresh your knowledge of SCRIBE Online. First, we will talk about their webinars. SCRIBE does regular webinars about its platform; past webinars can be found here.
Next, SCRIBE provides excellent video tutorials for SCRIBE Online. As of this writing, here is the list of videos with a small description of each:
11/9/2017 9:15:00 AM
When we create mappings in SCRIBE, it's important that we document those maps. This is a two-part process. First, we want to export the map and save the JSON file into the source control system; this gives us a backup copy, and we can reuse the map if we need it again, similar to a template. Second, we create supporting documentation so others can understand the process.
In past projects I have taken a more manual approach to creating the supporting documentation for my integration and migration mappings, which can be time consuming to say the least. During one of these exercises, I found the SCRIBE Documentation Tool. I walked through the steps they outlined, and it auto-generated the documentation for me. This has greatly sped up my work and gives me a spreadsheet that is easy to understand.
To create this documentation you will need a Google account, as it uses Google Docs. Here is a link to the detailed step-by-step process provided by SCRIBE. Here is the high-level process:
1) Allow API access to your SCRIBE Online organization.
2) Access the Documentation Tool by clicking this link. (Opens a Google Spreadsheet.)
3) Save a copy. This is an important step as you create more than one of these, so you don't overwrite a previous map's documentation. Doing this will also make "SCRIBE" appear in the menu next to "Help".
4) Click SCRIBE -> Documentation Solution and follow the prompts.
5) Once the process starts, it can run quickly or slowly depending on the number of mappings in the solution. Just wait for it to finish.
When the process is finished you will see the following:
11/8/2017 8:36:00 AM
With the release of Microsoft Dynamics 365 version 9, we are saying goodbye to the large SDK we are used to downloading to extend Microsoft Dynamics, and saying hello to the Developer Guide for Dynamics 365 Customer Engagement. Here are some highlights of the version 9 release.
1) Previous SDK versions can still be downloaded for previous releases of Microsoft Dynamics CRM.
11/2/2017 5:41:00 AM
When we need to connect CRM to another system, it is important that the connection remain working so data can flow easily between the systems. One common issue that can arise with an integration is that the password of the user account used to create the connection can expire. When this happens, it could be minutes to days before the issue is found, and that can lead to data synchronization between the systems getting messed up. To mitigate this risk we can easily set up a user with a non-expiring password. The user type for this in CRM is "Non-Interactive". Non-interactive users can't log into CRM via the front end; if you try, you will see this error message:
10/25/2017 3:18:00 PM
In another post I talked about using Azure Blob Storage to store CRM attachments in Azure rather than in CRM. Today I installed the app in a client's CRM system and found that Microsoft Labs had updated the app to version 1.7. This change included a lot of updates, from the UI to how notes attachment movement is set up. They also now allow for moving existing attachments. Here are the updated steps:
10/21/2017 4:48:00 PM
In my Living In SCRIBE Online blog I talked about work I did to migrate CRM 4.0 on-premise to Dynamics 365 using SCRIBE Online. In that blog I mentioned that there were about 80 GB worth of attachments that had to be moved into the cloud. This could quickly get expensive if we stored them directly in CRM, and this client wasn't using SharePoint. So what option do we have to store this data where end users can still access it? Why not use Azure Blob Storage?
Microsoft has an app in AppSource to move attachments from CRM to Azure Blob Storage. Not only does it allow for this, but it also has a web resource so you can allow for bulk uploads. Here is how to set it up:
1) Go To AppSource
2) Install the Attachment Management App by clicking the "Get It Now" button under the image.
3) While it's installing, go to Azure and set up your blob storage account.
4) For each CRM instance you will need to set up 2 containers in your blob storage: one for email attachments and the other for notes attachments. So if you have a production instance and a sandbox instance, you will need 4 containers (DevEmails, DevNotes, ProdEmails, ProdNotes).
5) Still in Azure, click on "Shared Access Signatures". Here you will set up your SAS key. Complete all sections and click on "Generate SAS".
6) Copy the SAS key; we will need it later.
7) Go back to CRM and check to make sure the attachment management app finished installing.
8) In Settings -> Customizations -> Solutions, click on Publish All Customizations.
9) Refresh your screen.
10) You may not see a link anywhere in the navigation to the Azure Blob Storage entity, so either turn it on to appear in the Settings area or just use Advanced Find to get to the entity. It is called Azure Blob Storage Settings.
11) Add a new record.
- Name = The name of the blob storage account in Azure
- SAS Token = Get this from step 6
- Notes Attachment Container = Name of the container in Azure
- Email Attachment Container = Name of the container in Azure
12) Save
Your attachments in CRM will now be stored in Azure. You will still be able to access the notes within CRM with no problem at all, including downloading them from CRM. One thing to note: it will not move attachments that are already in CRM, so it is best to install this early on if you are not going to use SharePoint integration.
10/21/2017 4:20:00 PM
Are you working on creating a connector with SCRIBE's CDK? Does your connector have an array or list of strings that you need to pass? SCRIBE makes this easy to do within the CDK and SCRIBE Online.
I came across this scenario on a connector I was creating that passes a JSON message to an API. The JSON message had a list of strings for entity IDs. Here is an easy way to accomplish this:
1) Create your PropertyDefinition as part of your ObjectDefinition.
new PropertyDefinition
{
    Description = "Use TOLIST() to pass in a list of entity id's.",
    FullName = "Entity IDs",
    IsPrimaryKey = false,
    MaxOccurs = 1,
    MinOccurs = 0,
    Name = "PublishTo",
    Nullable = true,
    NumericPrecision = 0,
    NumericScale = 0,
    PresentationType = "string",
    PropertyType = typeof(string).Name,
    UsedInActionInput = true,
    UsedInActionOutput = true,
    UsedInLookupCondition = true,
    UsedInQueryConstraint = true,
    UsedInQuerySelect = true,
    UsedInQuerySequence = true
}
2) In your map, use the TOLIST() function to build the list of strings for the field:
TOLIST("ABC123, DEF456, GHI789", ",", "string", "a", true, true)
This produces the following JSON in the message:
"EntityIDs":
[
    "ABC123",
    "DEF456",
    "GHI789"
]
10/21/2017 12:17:00 PM
Recently I had a client that needed to be migrated from CRM 4.0 on-premise to Dynamics 365 online. First, I want to say that we opted not to do the upgrade path of CRM 4.0 to CRM 2011 on-premise to CRM 2013 on-premise to CRM 2015 on-premise to Dynamics 365 on-premise, because the client wanted to start fresh with their CRM customizations; personally, I think that upgrade process would have been a waste of time.
The following issues were identified with the data migration:
1) CRM 4.0 is not supported by SCRIBE Online.
2) The server OS that CRM 4.0 and SQL Server ran on was not supported.
3) The version of SQL Server was not supported.
So how am I going to migrate this data? Hmm.....
The solutions:
1) The RDP Terminal Server I was using has a supported server OS.
2) I was able to connect to the CRM 4.0 SQL Server database with ODBC.
3) I installed the SCRIBE on-premise agent on the Terminal Server and used the ODBC connector to retrieve the data.
By using the ODBC connector I could access the data in SCRIBE Online in a supported way. With this approach the migration did run a little slower, because I couldn't run multiple maps at the same time over the same ODBC connection without it throwing a connection error.
One piece I was not able to migrate with SCRIBE Online was the attachments stored in CRM 4.0, about 80 GB worth. I couldn't migrate these because of changes in how CRM stores them. So to accomplish this I did the following:
1) I downloaded this tool's source code from GitHub.
2) I modified it to read a list of GUIDs.
3) I exported a list of attachment GUIDs to CSV.
4) I modified the application to download the attachments into one root folder with a subfolder per attachment; each subfolder was named with a GUID and contained the attachment itself.
5) I then used the Microsoft Dynamics 365 SDK to create a console application to upload the attachments to Dynamics 365 and re-link them to the parent records.
Once I made the code changes and wrote the upload application, the download and upload ran overnight. A sketch of the upload step is below.
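For step 5, a minimal sketch of creating a note attachment with the SDK and linking it to its parent; the parent entity type and variables here are hypothetical:
// Create an annotation (note) with the file attached and re-link it to the parent record.
Entity note = new Entity("annotation");
note["subject"] = "Migrated attachment";
note["filename"] = Path.GetFileName(filePath);
note["documentbody"] = Convert.ToBase64String(File.ReadAllBytes(filePath));
note["objectid"] = new EntityReference("account", parentId); // hypothetical parent entity
service.Create(note);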
Now you might be asking yourself, "How did you store 80 GB of attachments in CRM Online? Isn't that expensive?" I will post a separate blog on that in my Dynamics 365 blog.
10/19/2017 11:29:00 AM
While building a custom connector using the CDK, I found it time consuming to manually stop the SCRIBE agent service, copy the connector .dll to the connector folder under the agent folder, and restart the service. To overcome this I added pre- and post-build events to my project in Visual Studio. Here is how you set up your development environment to automate all this:
1) Right click on your project and go to properties.
2) On left side go to Build Events.
3) In "Pre-build event command line" input this:
net stop "Scribe Online Agent"
Exit /b 0
4) In "Post-build event command line" input this:
copy /Y "$(TargetDir)$(ProjectName).dll" "C:\Program Files (x86)\Scribe Software\AGENTNAME\Connectors\CONNECTORNAME\$(ProjectName).dll"
net start "Scribe Online Agent"
10/19/2017 6:28:00 AM
Recently while doing some client work, we noticed that CRM Portals does a post back when adding information within a sub-grid on an entity form. Why is this an issue? Because if you have input fields on the form, those values are not written into CRM until the save button at the bottom of the form is clicked. So if a user inputs anything into the sub-grid after they fill in the fields, the post back will wipe out what the user entered. This makes for a bad user experience. To overcome this, we can use session storage to temporarily hold the values until the browsing session ends.
To use session storage, the first thing we need to do is register onChange handlers that update session storage when the user changes a value. Here are some examples of onChange handlers written in jQuery:
$(document).on('change', '#CHECKBOX ID', function () { SetSessionValue("#CHECKBOX ID") });
$(document).on('change keyup paste', '#TEXTBOX ID', function () { SetSessionValue("#TEXTBOX ID") });
#1 is for check boxes and picklists, and #2 is for text boxes. Here is the SetSessionValue function that saves the value:
function SetSessionValue(fieldname) {
    if ($(fieldname).hasClass('boolean-radio')) {
        // Boolean radios render as two inputs; check which one is selected.
        if ($(fieldname + "_1").is(':checked')) {
            sessionStorage.setItem(fieldname, true);
        }
        else {
            sessionStorage.setItem(fieldname, false);
        }
    }
    else {
        var fieldValue = null;
        if ($(fieldname).hasClass('picklist')) {
            fieldValue = $(fieldname).find(":selected").val();
        }
        else {
            fieldValue = $(fieldname).val();
        }
        sessionStorage.setItem(fieldname, fieldValue);
    }
}
And here is the GetSessionValue function to read a value back out of session storage:
function GetSessionValue(fieldname) {
    var sessionValue = sessionStorage.getItem(fieldname);
    return sessionValue;
}
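To put the saved values back after the post back, the page can read them out of session storage on load; a sketch reusing the placeholder ids from above:
$(document).ready(function () {
    // Restore a text box value that was saved before the sub-grid post back.
    var saved = GetSessionValue("#TEXTBOX ID");
    if (saved !== null) {
        $("#TEXTBOX ID").val(saved);
    }
});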
10/18/2017 10:13:00 AM
Occasionally when working with CRM Portals you may run into the need to handle the OnChange event for a date time field. Because of the way that CRM Portals renders date time fields, this is slightly more complicated than working with text boxes, check boxes, and pick lists. I ended up reaching out to Microsoft for help on this, and below are the steps they provided:
1) Click on the date time control
2) Press F12
3) In console type $('div.control') and hit enter (this will give you a list of div controls)
4) Locate the div control for the date time field
Then wire up the "dp.change" event on that div (index 2 in this example):
$(document).ready(function ()
{
    var dpcontrol = $('div.control')[2];
    $(dpcontrol).on("dp.change", function (e)
    {
        alert("On change Event Triggered");
        alert(e.date);
    });
});