Blog

Insights on building software that moves your business forward.

Total Posts: 52

Lookup In Select of URL For Power Pages

August 8, 2023

While working on a Power Pages implementation, I was consuming the Power Pages API to query a lookup table so I could show and hide information on the form using jQuery.  One issue I ran into was that I couldn't get a lookup field to return, even though I had it as a field in the site setting for the table and included it in my select statement in the URL.  I kept getting an `unexpected error occurred` message.

This issue occurs because lookups are made up of three fields:

1) Formatted Value = The information you see in the lookup field (i.e. the record name)

2) Logical Name = The table the related record is a part of

3) Id = The id of the related record

The API also returns a fourth field, the Associated Navigation Property, which is the name of the lookup field's navigation property.

When setting up our select, and even when specifying which columns the API can return, we have to think about how the OData endpoint will return the data to us.  For a lookup, the trick is to prepend a `_` to the field's schema name and append `_value` to the end of it.

Example:

`_new_lookupField_value`
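
For reference, here is a sketch of what the full request URL might look like, assuming a table whose entity set name is `new_examples` (the table and field names here are placeholders, not from a real implementation):

 /_api/new_examples?$select=new_name,_new_lookupField_value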

Place that value in the fields site setting for the enabled endpoint (if not using `*`), and add it to your select statement in the URL.  The error should go away, and you should see data returned similar to this:


 {  
      "@odata.context": "https://myportal.powerappsportals.com/_api/$metadata#new_example(new_name,_new_lookupField_value)",  
      "@Microsoft.Dynamics.CRM.totalrecordcount": -1,  
      "@Microsoft.Dynamics.CRM.totalrecordcountlimitexceeded": false,  
      "@Microsoft.Dynamics.CRM.globalmetadataversion": "3068318",  
      "value": [  
           {  
                "@odata.etag": "W/\"3066510\"",  
                "new_name": "Example Name",  
                "_new_lookupField_value@OData.Community.Display.V1.FormattedValue": "Example Lookup",  
                "_new_lookupField_value@Microsoft.Dynamics.CRM.associatednavigationproperty": "new_LookupField",  
                "_new_lookupField_value@Microsoft.Dynamics.CRM.lookuplogicalname": "new_relatedTable",  
                "_new_lookupField_value": "2a9c7024-7ca0-4707-f086-419f27c27541"  
           },{  
                "@odata.etag": "W/\"2610804\"",  
                "new_name": "Example of Null",  
                "_new_lookupField_value": null  
           }  
      ]  
 }  
In the above JSON there are two records returned: one where the lookup contains data and one where it is empty.
DataVerse, Microsoft Dynamics 365, Power Pages, Power Portal, PowerPlatform

Reusable Method To Get Record By Id

July 29, 2023

I have a handful of reusable methods that I use when creating plugins or external processes (e.g., Azure Functions) that work with DataVerse.  The first one I am sharing is getting a record by Id:

 //Requires the Microsoft.Xrm.Sdk and Microsoft.Xrm.Sdk.Client namespaces.
 private static Entity GetFullRecord(string entityName, string primaryKey, Guid recordId, IOrganizationService service)
 {
      using (OrganizationServiceContext context = new OrganizationServiceContext(service))
      {
           //Single() throws if zero or multiple records match; use SingleOrDefault() if a missing record should return null.
           return (from e in context.CreateQuery(entityName)
                   where (Guid)e[primaryKey] == recordId
                   select e).Single();
      }
 }
  • entityName = The logical name of the entity
  • primaryKey = The primary key field for the entity. If using late binding you can create this dynamically by doing:
    • $"{target.LogicalName}id"
  • recordId = Guid of the record to get
  • service = Service to interact with DataVerse
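
As a quick usage sketch (hypothetical, assuming you are inside a plugin where `target` is the Target entity from the execution context):

 // 'target' and 'service' come from the plugin execution context.
 Entity fullRecord = GetFullRecord(target.LogicalName, $"{target.LogicalName}id", target.Id, service);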

DataVerse, Microsoft Dynamics 365, PowerPlatform

Power Pages Update Last Successful Login Using JavaScript and Power Pages API

July 26, 2023

Recently, while working on a Power Pages implementation for a client, I had the requirement to show the last time a user logged in on their profile page.  I thought this would be easy to do, as there is already a field on the contact record for "Last Successful Login" (`adx_identity_lastsuccessfullogin`).  This used to update when a user logged in, but it appears Microsoft has removed that automation.

While searching, I came across a few different ways of achieving this task.  One used Application Insights in Azure and another used an HTTP endpoint set up in Power Automate.  I thought this needed to be simpler.  What I came up with is to use Liquid with JavaScript to tell if a user is logged in or not, then use the new Power Pages API to update the logged-in user's contact record with the last time they logged in.

Here is the approach I setup:

1) Make sure you turn on the API for contact in Site Settings.

  a) See Microsoft's documentation on how to turn on the API.

  b) Make sure you include "adx_identity_lastsuccessfullogin" in the list of fields to return.

2) I tested my process on the home page.  On the home page, add two JavaScript functions.  The first function (CheckIfUserJustLoggedIn) uses Liquid to tell if the user is logged in.  If they are not, it deletes our value from session storage.  If the user is logged in, we check session storage to see if there is already a key with the date and time the user logged in.  If there is, we don't do anything else.  If there isn't, we create it and update the contact record for the user via the Power Pages API.  The second function is Microsoft's wrapper function that handles the Ajax call and the authentication piece, making sure the user is authorized to perform the operation.

 function CheckIfUserJustLoggedIn()
 {
   var sessionLastLogin = "sessionLastLogin";
   var isLoggedIn = "{% if user %}true{% else %}false{% endif %}";
   if (isLoggedIn == 'true')
   {
     var userId = "{{user.id}}";
     userId = userId.replace('{', '').replace('}', '');
     var lastLogin = sessionStorage.getItem(sessionLastLogin);
     if (lastLogin == null)
     {
       var dateTimestamp = new Date();
       sessionStorage.setItem(sessionLastLogin, dateTimestamp);
       webapi.safeAjax({
         type: "PATCH",
         url: "/_api/contacts(" + userId + ")",
         contentType: "application/json",
         data: JSON.stringify({
           "adx_identity_lastsuccessfullogin": dateTimestamp
         }),
         success: function (res) {
           console.log(res);
         }
       });
     }
   }
   else
   {
     sessionStorage.removeItem(sessionLastLogin);
   }
 }


 (function(webapi, $){
   function safeAjax(ajaxOptions) {
     var deferredAjax = $.Deferred();
     shell.getTokenDeferred().done(function (token) {
       // add headers for AJAX
       if (!ajaxOptions.headers) {
         $.extend(ajaxOptions, {
           headers: {
             "__RequestVerificationToken": token
           }
         });
       } else {
         ajaxOptions.headers["__RequestVerificationToken"] = token;
       }
       $.ajax(ajaxOptions)
         .done(function(data, textStatus, jqXHR) {
           validateLoginSession(data, textStatus, jqXHR, deferredAjax.resolve);
         }).fail(deferredAjax.reject); //AJAX
     }).fail(function () {
       deferredAjax.rejectWith(this, arguments); // on token failure pass the token AJAX and args
     });
     return deferredAjax.promise();
   }
   webapi.safeAjax = safeAjax;
 })(window.webapi = window.webapi || {}, jQuery);

If you run into permission issues, check your Power Pages Web Roles and make sure the user's role has the correct table permissions for the contact table.

Power Pages, Power Portal, PowerPlatform

How to Render Raw HTML Column in View in Power Pages

June 28, 2023

Recently I had a requirement for a client to convert their existing site over to Power Pages.  One of the items I needed to convert was a "Recent Announcements" page.  This is a pretty straightforward page with only a title of "Recent Announcements" and a grid showing the announcements.  The grid has only two columns, 'Created On' and 'Message'.  The problem I had to solve was that I used the Rich Text Editor control on the 'Message' field, which means raw HTML is stored in the field.  Out of the box, Power Pages can't render the raw HTML in the column, so the page would show raw HTML instead of a nicely formatted message.  This is easily solved with a small JavaScript function.

The first thing I did was create a custom entity in DataVerse called Announcement.  Here is the layout of the fields:

  1. Changed the primary name field from 'Name' to 'Subject'.
  2. Added a multi-line text field called 'Message'.
  3. Added a two-option set field called 'Show in Portal'.
    1. This is used to control whether to show the announcement in the portal or not, so they can have draft announcements.  I know there are other ways of doing this with Status and Status Reason; this was just the approach I took.
Since we are only showing a view in Power Pages, there was no need for me to set up a form for Power Pages.  Instead, the form I set up is for the Model-Driven App in DataVerse only.  Once I set up the form, I set the multi-line text field control to Rich Text Editor.  Now users can format the text any way they see fit for the announcement.  This is something they didn't have in the existing site.

Next, I created a view called 'Live Announcements'.  This view has two filters:
  1. Status of Active.
  2. Show on Portal set to Yes.
This view will be used to show the messages in the portal and will give users in the Model Driven App a quick reference of what is showing in Power Pages and what isn't.

At this point we set up the page.  I won't go step by step through how to do that in this post, as the new Power Pages process makes creating a page and adding a list to it really easy.  Just remember to have table permissions set up.  In my case the table permissions are a simple Global Read Only permission on the announcement table.

You should now see your list on your Power Pages site.  But you will notice that the message column contains raw HTML.  Here is how to fix that easily.

  1. Open up the 'List' record in DataVerse and click on the 'Options' tab.
  2. In the 'Custom JavaScript' field, input the below JavaScript:

 $(document).ready(function (){
   $(".entitylist.entity-grid").on("loaded", function () {
     $('td[data-attribute="REPLACE WITH FIELD SCHEMA NAME"]').each(function() {
       var rawHtml = $(this).attr('data-value');
       $(this).html(rawHtml);
     });
   });
 });
      
Here is what this JavaScript does.  When the document is ready, it listens for the loaded event from the list (grid).  If you don't do this, the code won't work, as the page loads asynchronously and the grid might not be loaded by the time the ready handler runs.  By listening for the loaded event, you trigger the remaining code once the grid is ready.  Once the grid is ready, we grab the column that contains the raw HTML.  Make sure to replace the 'REPLACE WITH FIELD SCHEMA NAME' text with the schema name of your column (the schema name of the DataVerse field).  For each row in that column, we read the raw HTML that the Rich Text Editor generated and render it.

Make sure to save your changes and sync your configuration.  Once that is done, clear your cache and check the list.  You should no longer see raw HTML but nicely formatted text, the way the user wanted it to be.

DataVerse, Microsoft Dynamics 365, Power Apps, Power Pages, Power Portal

Understanding Managed and Unmanaged Solutions in Dynamics 365

May 16, 2023

Dynamics 365, Microsoft's robust suite of business applications, boasts a myriad of features that can be customized to cater to the specific needs of any business. A vital concept to grasp when working with Dynamics 365 is the difference between managed and unmanaged solutions. This blog post aims to clarify these two types of solutions, providing a comprehensive analysis of the advantages and disadvantages of each.

Unmanaged Solutions

Unmanaged solutions act as a dynamic development environment, enabling direct alterations and additions to system components. They are often employed during the development and testing phase of a customization project but are equally effective when implemented in production instances, particularly for internal organizational operations.

Pros of Unmanaged Solutions:

  1. Flexibility: Unmanaged solutions provide a high degree of adaptability, permitting developers to modify system components, introduce new elements, or discard those that are no longer necessary.
  2. Streamlined Testing: The process of testing and debugging components becomes significantly simplified with unmanaged solutions, as immediate adjustments can be made, and their effects observed in real time.

Cons of Unmanaged Solutions:

  1. Risk of Errors: The enhanced flexibility also carries the risk of potential complications. Unintentional modifications or deletions of components can occur, potentially leading to issues on a system-wide level.
  2. Lack of Layering: Unmanaged solutions do not support layering, which can result in conflicts when multiple developers are working simultaneously on a project.

Managed Solutions

A managed solution is a finalized, locked package that can be distributed and installed across various environments. It is essentially an unmodifiable version of an unmanaged solution. Managed solutions are most suitable for the distribution of completed customizations or applications, especially when these solutions are being sold to external clients.

Pros of Managed Solutions:

  1. Protection of Intellectual Property: Managed solutions safeguard the intellectual property of the developers or the organization that created them, as they are sealed and cannot be directly modified.
  2. Convenient Distribution: Managed solutions can be easily distributed and installed across different environments, making them an excellent choice for delivering final customizations or applications.
  3. Supports Layering: Unlike unmanaged solutions, managed solutions support layering, allowing the simultaneous installation of multiple solutions without conflicts.

Cons of Managed Solutions:

  1. Limited Flexibility: Managed solutions offer less flexibility due to their sealed nature. Any necessary modifications would require an update from the developer and a subsequent reinstallation.
  2. Dependency Concerns: Managed solutions may create dependencies. Components within a managed solution cannot be deleted if they are being referenced by another solution.

In conclusion, the choice between managed and unmanaged solutions in Dynamics 365 largely depends on your specific needs. If you are developing, testing, or implementing internal operational changes, an unmanaged solution provides the flexibility needed for easy modifications and testing. Conversely, if you are distributing or selling a finalized customization or app, a managed solution offers intellectual property protection, convenient distribution, and supports layering. Understanding these differences and selecting the appropriate solution type is crucial for optimal use of Dynamics 365.

DataVerse, Microsoft Dynamics 365, Power Apps, PowerPlatform

Effective Logging in Microsoft Dynamics 365 Plugins: Best Practices and Examples

March 22, 2023

Microsoft Dynamics 365 is a powerful suite of business applications that provides organizations with tools for managing customer relationships, sales, and operations. One of the essential aspects of developing custom plugins for Dynamics 365 is proper logging to ensure smooth functionality and easy debugging.

In this blog post, we'll discuss best practices for logging in Microsoft Dynamics 365 plugins and provide examples to help you implement effective logging in your custom solutions.

Understanding the Plugin Trace Log

Microsoft Dynamics 365 provides a built-in logging mechanism called the Plugin Trace Log. The Plugin Trace Log can be used to record custom messages, exceptions, and other information for debugging purposes. To enable the Plugin Trace Log, follow these steps:

  • Navigate to Settings > Administration > System Settings.
  • Under the Customization tab, locate the "Plugin and Custom Workflow Activity Tracing" section.
  • Set the option to "All" or "Exception" based on your requirements.

With the Plugin Trace Log enabled, you can use the ITracingService to log messages within your plugin code.

Using the ITracingService

`ITracingService` is an interface provided by the Microsoft Dynamics 365 SDK. It offers a simple and powerful way to log messages and exceptions in your plugins. To use the `ITracingService`, you need to instantiate it from the `IServiceProvider` passed to your plugin's Execute method.

Example:

 public class MyPlugin : IPlugin  
 {  
   public void Execute(IServiceProvider serviceProvider)  
   {  
     // Obtain the tracing service  
     ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));  
     // Log a simple message  
     tracingService.Trace("MyPlugin started execution.");  
   }  
 }  

Logging Exceptions

When an exception occurs in your plugin, it's crucial to log the error details for troubleshooting purposes. With the `ITracingService`, you can log the exception message and the stack trace.

Example:

 public class MyPlugin : IPlugin  
 {  
   public void Execute(IServiceProvider serviceProvider)  
   {  
     // Obtain the tracing service  
     ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));  
     try  
     {  
       // Your plugin logic here  
     }  
     catch (Exception ex)  
     {  
       tracingService.Trace("An exception occurred in MyPlugin: {0}", ex.ToString());  
       throw new InvalidPluginExecutionException($"An error occurred in MyPlugin: {ex.Message}", ex);  
     }  
   }  
 }  

Monitoring the Plugin Trace Log

Once your plugin logs messages using the `ITracingService`, you can view them in the Plugin Trace Log. To access the log, navigate to Settings > Customizations > Plugin Trace Log in the Dynamics 365 web application. The Plugin Trace Log provides filtering and sorting options to help you analyze log data and troubleshoot issues.

Conclusion

In this blog post, we've discussed best practices for logging in Microsoft Dynamics 365 plugins, including enabling and monitoring the Plugin Trace Log, using the `ITracingService`, and logging exceptions. By implementing these practices, you can ensure that your custom plugins are more maintainable, reliable, and easier to troubleshoot.

DataVerse, Microsoft Dynamics 365, PowerPlatform

Power Up Your Business: Power Apps, Model-Driven Apps, and Power Portals Demystified

March 21, 2023

Welcome to the wonderful world of Microsoft's Power Platform! As the digital landscape keeps evolving, so do the ways we can create and manage applications for our businesses. If you're looking to power up your business, you've come to the right place! We'll dive into the whimsical world of Power Apps, Model-Driven Apps, and Power Portals, helping you pick the right tool for the job. So, grab a cup of coffee, get comfy, and let's get started! 😃

Power Apps: Your DIY App Builder

Power Apps is a low-code, drag-and-drop application builder that enables users to create custom applications for their organization without needing any fancy coding skills (1). If you're a non-developer with a brilliant app idea, Power Apps is your new best friend. It's a game changer for business users who can create mobile and desktop apps for managing data, automating processes, and connecting to various services.

When to use Power Apps:
  • You need a custom application but don't have the budget or time for traditional development.
  • You want a quick, low-cost solution for internal processes or data management.
  • You're looking to build an app that integrates with other Microsoft services (e.g., Office 365, Dynamics 365, etc.).

Model-Driven Apps: Data-First Design

Model-Driven Apps are part of the Power Apps family, but they take a different approach (2). They start with your data model and create a responsive app that automatically generates forms, views, and dashboards. Model-Driven Apps are perfect for managing business data and processes based on the Common Data Service (CDS), Microsoft's cloud-based storage platform.

When to use Model-Driven Apps:
  • You have a data-centric application, and your main goal is to manage and visualize data.
  • You want to build an app with minimal effort – just define the data model and let the app build itself!
  • You need to create a complex business application that integrates with Dynamics 365 and other Microsoft services.

Power Portals: Web-Based Interaction

Power Portals (formerly known as Dynamics 365 Portals) are a way to create external-facing websites that interact with your organization's data and processes (3). They enable you to create custom, secure, and scalable portals for your customers, partners, or employees to access and interact with your organization's data in a controlled environment.

When to use Power Portals:
  • You want to create a self-service portal for customers, partners, or employees.
  • You need to provide secure access to your organization's data and processes.
  • You're looking to create a web-based solution with seamless integration with Dynamics 365 or other Microsoft services.

Mixing It Up: Combine and Conquer

In some cases, you might find that a combination of Power Apps, Model-Driven Apps, and Power Portals is the best solution for your organization. These tools are designed to work together seamlessly, allowing you to create powerful, interconnected applications that cater to your specific needs (4).

When to combine tools:
  • You want to build a multi-faceted solution, with internal apps for employees and external portals for customers or partners.
  • You need a data-centric app that can be accessed both on mobile devices and via web portals.
  • You're looking to create a unified experience across different parts of your organization, leveraging the full power of Microsoft's ecosystem.

Tips for Success: Getting Started with Power Platform

Now that you're familiar with Power Apps, Model-Driven Apps, and Power Portals, here are a few tips to help you get started on your app-building adventure:
  1. Identify your needs: Before diving in, take some time to define the problem you're trying to solve or the process you'd like to improve.
  2. Evaluate your data: Consider your data sources, structures, and relationships. This will help you decide whether a canvas app, a model-driven app, or a portal is the best fit for your project.
  3. Think about user experience: Keep your users in mind throughout the development process. Consider their needs, preferences, and level of technical expertise.
  4. Leverage Microsoft's ecosystem: Take advantage of the seamless integration between Power Platform tools and other Microsoft services to create powerful, interconnected solutions.
  5. Keep learning and growing: Stay curious and keep exploring the Power Platform's capabilities. The more you learn, the better equipped you'll be to create fantastic applications!

Conclusion

Power Apps, Model-Driven Apps, and Power Portals offer versatile ways to create applications for various business needs. By understanding their individual strengths and how they can be combined, you can build tailored solutions that delight your users and drive your organization's success. And remember, the sky's the limit when it comes to creativity and innovation. So, buckle up and enjoy the ride on your app-building journey!

References:
(1) Microsoft. (n.d.). What is Power Apps? Retrieved from https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/getting-started
(2) Microsoft. (n.d.). What are model-driven apps? Retrieved from https://docs.microsoft.com/en-us/powerapps/maker/model-driven-apps/model-driven-app-overview
(3) Microsoft. (n.d.). What is a Power Apps portal? Retrieved from https://docs.microsoft.com/en-us/powerapps/maker/portals/overview
(4) Microsoft. (n.d.). Power Apps: Canvas and model-driven apps. Retrieved from https://docs.microsoft.com/en-us/powerapps/maker/
DataVerse, Microsoft Dynamics 365, Model Driven Apps, Power Apps, Power Portal, PowerPlatform

Taming the Beast: Early vs. Late Binding in Microsoft Dynamics 365

March 20, 2023

 Hey there, code wranglers! Are you ready to dive deep into the fascinating world of Microsoft Dynamics 365? Today, we're going to explore the differences between Early Binding and Late Binding, two popular programming techniques that you can use to make the most out of this powerful platform. We'll compare examples, performance metrics, and development times, and we'll even sprinkle in some references for those of you hungry for more information. So let's saddle up and get started!

Early Binding - A Swift Ride:

Early Binding is like a well-trained horse that knows exactly where it's going. With Early Binding, you'll be working with strongly-typed classes generated from the CRM metadata. This means you'll have access to IntelliSense, making it easy to spot errors during development and reducing the time it takes to write code.

Let's take a look at an example of Early Binding in action:

 using Microsoft.Xrm.Sdk;  
 using Microsoft.Xrm.Sdk.Client;  
 using Microsoft.Crm.Sdk.Messages;  
 using System.ServiceModel;  
 // Connect to the Organization Service  
 IOrganizationService service = GetOrgService();  
 // Create a new account  
 Account newAccount = new Account  
 {  
   Name = "Contoso Ltd.",  
   Address1_City = "Seattle",  
   Address1_StateOrProvince = "WA"  
 };  
 Guid accountId = service.Create(newAccount);  

In this example, we've created an Account entity using the strongly-typed Account class. Not only does this code look clean and easy to read, but it also allows for better error-checking during development.

Late Binding - A Wild Adventure:

Late Binding, on the other hand, is like riding a wild stallion: it offers more freedom, but it's also a bit more challenging to handle. Late Binding uses the Entity class, allowing you to create and manipulate records in a more dynamic way. Here's an example of Late Binding:

 using Microsoft.Xrm.Sdk;  
 using Microsoft.Xrm.Sdk.Client;  
 using Microsoft.Crm.Sdk.Messages;  
 using System.ServiceModel;  
 // Connect to the Organization Service  
 IOrganizationService service = GetOrgService();  
 // Create a new account  
 Entity newAccount = new Entity("account");  
 newAccount["name"] = "Contoso Ltd.";  
 newAccount["address1_city"] = "Seattle";  
 newAccount["address1_stateorprovince"] = "WA";  
 Guid accountId = service.Create(newAccount);  

In this case, we've used the generic Entity class and specified the entity type with a string. While this approach is more flexible, it lacks the benefits of IntelliSense, making it more error-prone and increasing development time.

Performance Metrics and Development Time:

Early Binding tends to have better performance due to its strongly-typed nature. However, the difference in performance between the two approaches is often negligible, particularly for smaller-scale projects. When it comes to development time, Early Binding is generally faster, thanks to IntelliSense and better error-checking.

Conclusion:

Whether you choose Early Binding or Late Binding, Microsoft Dynamics 365 has the power and flexibility to help you create amazing applications. Just remember, like any good wrangler, you need to choose the right technique for the job. Don't be afraid to mix and match both approaches based on your project's specific needs and your own coding style.

Recommendations on when to use each:

  1. Early Binding:
    • When you need better compile-time error checking and type safety.
    • If you prefer working with IntelliSense for a more streamlined development experience.
    • When your project requires a more structured and organized coding style.
    • If your team is already familiar with the Dynamics 365 SDK and its generated types.
  2. Late Binding:
    • When you need more flexibility and adaptability in your code.
    • If you're working with a dynamic system where the schema or entity types may change frequently.
    • When your project is smaller in scale and doesn't require the full advantages of Early Binding.
    • If you're building a generic or reusable solution that needs to work with multiple Dynamics 365 environments or versions.

Now that you know the differences between Early and Late Binding and have some recommendations on when to use each, you're ready to ride off into the sunset and create some fantastic solutions for your organization. Remember, practice makes perfect, so don't shy away from experimenting with both techniques to see which one works best for you. The more you learn and adapt, the better your coding skills will become. So grab your hat, your coding boots, and your trusty steed, and embark on your own coding adventure with Microsoft Dynamics 365!

Happy coding, partner!

DataVerse, Microsoft Dynamics 365

Navigating Microsoft Dynamics 365 Customization: Plugins vs. Azure Functions

March 18, 2023

Embarking on the Microsoft Dynamics 365 customization journey offers numerous opportunities to enhance your business processes. However, deciding between the available options, such as Plugins and Azure Functions, can be challenging. This engaging post will serve as your trusty guide, helping you choose the best option for your Dynamics 365 customization needs!

The Two Customization Pathfinders: Plugin and Azure Function

The Agile Plugin 🏃‍♂️

Reference: Microsoft Docs - Write a plug-in

Plugins are like the swift trail runners of the Dynamics 365 customization world. They're the go-to choice for quick, real-time (synchronous), or background (asynchronous) operations that occur within the platform. They can intercept events and modify data before it's saved or displayed to the user.

Choose Plugins when:

  • You need real-time processing (synchronous) or background processing (asynchronous).
  • You want to ensure data integrity.
  • You need tight integration with Dynamics 365.

Plugins might not be the best choice for long-running processes, as they could slow down the overall performance of your system if synchronous or consume system resources if asynchronous.

The Powerful Azure Function 🚀

Reference: Microsoft Docs - Azure Functions Documentation

Azure Functions, like the high-powered rocket ships of the customization world, bring power and scalability to your Dynamics 365 customizations. They're perfect for more complex or compute-intensive tasks that require the full power of the Azure platform. Azure Functions can be triggered by various events and can interact with other Azure services, opening up a world of possibilities.

Choose Azure Functions when:

  • You need advanced processing capabilities or complex logic.
  • You require integration with other Azure services.
  • You want to leverage the power and scalability of Azure.

Azure Functions could be overkill for simple, real-time operations, so consider Plugins if your customization needs are more straightforward.

To integrate Azure Functions with Dynamics 365, you can create a custom connector in Power Apps, which can also be applied to Dynamics 365. This approach allows you to call your Azure Function from within your Dynamics 365 solution.
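
To make the comparison concrete, here is a minimal sketch of an HTTP-triggered Azure Function that talks to DataVerse.  This assumes the in-process C# Functions model and the Microsoft.PowerPlatform.Dataverse.Client package; the function name, connection string setting, and the account-creation operation are illustrative placeholders, not part of the original post:

 using Microsoft.AspNetCore.Http;
 using Microsoft.AspNetCore.Mvc;
 using Microsoft.Azure.WebJobs;
 using Microsoft.Azure.WebJobs.Extensions.Http;
 using Microsoft.Extensions.Logging;
 using Microsoft.PowerPlatform.Dataverse.Client;
 using Microsoft.Xrm.Sdk;
 using System;

 public static class CreateAccountFunction
 {
   [FunctionName("CreateAccount")]
   public static IActionResult Run(
     [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
     ILogger log)
   {
     // Connection string stored in app settings (placeholder name).
     string connectionString = Environment.GetEnvironmentVariable("DataverseConnectionString");
     using (var service = new ServiceClient(connectionString))
     {
       // Example operation: create an account named via a query string parameter.
       Entity account = new Entity("account") { ["name"] = (string)req.Query["name"] };
       Guid id = service.Create(account);
       log.LogInformation("Created account {AccountId}", id);
       return new OkObjectResult(id);
     }
   }
 }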

Conclusion

To recap, choose Plugins for real-time (synchronous) or background (asynchronous) operations with tight integration to Dynamics 365, and Azure Functions for advanced processing capabilities and integration with other Azure services. With this guide, you're now equipped to navigate the Dynamics 365 customization landscape and select the perfect customization pathfinder for your needs! 🌄

Microsoft Dynamics 365

Microsoft Dynamics 365 vs. DataVerse: Unraveling the Mysteries!

March 17, 2023

Hey there, tech enthusiasts! Today, we're diving into the magical realm of Microsoft business solutions, where two powerful forces come together to save the day. That's right; we're talking about Microsoft Dynamics 365 and DataVerse. You might be wondering, "What's the difference between these two superhero-esque solutions?" Well, buckle up, because we're about to take a fantastic voyage to find out!

Section 1: Microsoft Dynamics 365 - The Multifaceted Hero

Microsoft Dynamics 365 is like the Swiss Army knife of business applications. It's a suite of intelligent, interconnected, cloud-based apps that cover various aspects of your business - from sales and marketing to finance and operations. These apps are designed to work together seamlessly, helping you manage every aspect of your business more efficiently. Some of the most popular apps include:

  • Dynamics 365 Sales - Your secret weapon for building better customer relationships! [1]
  • Dynamics 365 Marketing - The ultimate marketing companion for creating irresistible campaigns! [2]
  • Dynamics 365 Finance - The financial wizard that keeps your money matters in check! [3]

These apps, and many more, form a powerful alliance to help you streamline your business processes and make data-driven decisions.

Section 2: DataVerse - The Mighty Data Protector

DataVerse, formerly known as Common Data Service, is the backbone of the Dynamics 365 ecosystem. It's the data management platform that securely stores and manages data used by the Dynamics 365 apps, as well as other custom apps built on the Power Platform. [4]

Think of DataVerse as a mighty guardian, ensuring that your data is organized, secure, and easily accessible for your apps. Its superpowers include:

  • Structured data storage: DataVerse organizes your data into neat tables, making it easier for your apps to find and utilize the information they need. [5]
  • Data security and compliance: DataVerse is committed to keeping your data safe, ensuring it complies with industry standards and regulations. [6]
  • Easy app integration: DataVerse's robust API allows for seamless integration with both Dynamics 365 apps and custom-built apps, creating a powerful and unified data ecosystem. [7]

Section 3: Dynamics 365 and DataVerse - A Dynamic Duo!

While Microsoft Dynamics 365 and DataVerse might seem like two separate entities, they actually work together in perfect harmony. Dynamics 365 apps rely on DataVerse to store and manage the data they need to function, while DataVerse benefits from the functionality and versatility of Dynamics 365 apps.

In essence, Dynamics 365 is the star-studded lineup of apps that power your business operations, and DataVerse is the steadfast guardian that keeps your data secure, organized, and accessible. Together, they form an unstoppable force that helps you run your business like a well-oiled machine.

Conclusion:

Now that we've unraveled the mysteries of Microsoft Dynamics 365 and DataVerse, you can see how they complement each other to create a powerful and comprehensive business solution. This dynamic duo will not only help you streamline your processes but also empower you to make better, data-driven decisions.

So, whether you're in need of a superhero to boost your sales or protect your precious data, look no further than the formidable team of Microsoft Dynamics 365 and DataVerse. With their powers combined, they've got your back, and they're ready to take your business to new heights!

Happy adventuring in the exciting world of Microsoft business solutions!

References:

[1] Dynamics 365 Sales - https://dynamics.microsoft.com/en-us/sales/overview/

[2] Dynamics 365 Marketing - https://dynamics.microsoft.com/en-us/marketing/overview/

[3] Dynamics 365 Finance - https://dynamics.microsoft.com/en-us/finance/overview/

[4] DataVerse - https://docs.microsoft.com/en-us/powerapps/maker/data-platform/data-platform-intro

[5] DataVerse Tables - https://docs.microsoft.com/en-us/powerapps/maker/data-platform/create-edit-tables

[6] DataVerse Security - https://docs.microsoft.com/en-us/powerapps/maker/data-platform/data-platform-security

[7] DataVerse API - https://docs.microsoft.com/en-us/powerapps/developer/data-platform/webapi/overview

Once again, we hope this light-hearted and engaging exploration has given you a better understanding of the differences and synergies between Microsoft Dynamics 365 and DataVerse. With their combined powers, they can truly transform your business, making it more efficient, secure, and data-driven.

Now go forth, armed with the knowledge of this dynamic duo, and conquer your business goals!  Adventure awaits in the world of Microsoft business solutions.

DataVerse, Microsoft Dynamics 365

Report Authoring Extension Updated To Support Visual Studio 2019

January 11, 2021

FINALLY, Microsoft has pushed an update to the Report Authoring Extension for Dynamics 365 that allows it to be used with Visual Studio 2019.  This update went live on 12/18/2020 and can be found here.  Installing it into Visual Studio 2019 won't be straightforward, even if you selected everything when you first installed Visual Studio.  This is because SSDT (SQL Server Data Tools) is installed, but SSAS, SSIS, and SSRS are separate modules for SSDT that you have to install individually.  While you should be fine with just installing SSRS to make this work, I would recommend installing all three.

All of these can be found in Microsoft's documentation.  In the "Install SSDT with Visual Studio 2019" section, you will see a link to the marketplace where you can download the extensions and install them.  Once that is done, download the Report Authoring Extension and run the installer.  That is it; you can now work on SSRS reports within Visual Studio 2019 for Microsoft Dynamics.

Microsoft Dynamics 365

JavaScript to Allow Updating Specific Fields Post Record Completion

August 24, 2020

Recently, a client gave me the requirement that they needed to update a custom field on activity records whether the record was completed or not.  The problem is that when a record is completed, it is in a read-only state and users can't update the record easily, with minimal clicks.  To solve this business requirement, I wrote two JavaScript functions.

1) UnlockFieldsOnLoad - In this function you should list any field that you want to unlock if the record is complete (read-only).  This function should run on load of the form and will unlock the fields so they can be updated.

2) OnChangeUnlockedFields - This function does the save when the field(s) are updated.  What it does is check to see if the record is in a read-only status.  If it is, it makes the record active again.  After it makes the record active again, it writes the change to the database, changes the status back to the original status, and saves that change.

 function UnlockFieldsOnLoad(executionContext)  
 {  
   var formContext = executionContext.getFormContext();   
   formContext.getControl("fieldname").setDisabled(false);  
 }  
 //Saves field value if form is read only.  
 function OnChangeUnlockedFields(executionContext)  
 {  
   var formContext = executionContext.getFormContext();  
   var stateCodeValue = formContext.getAttribute("statecode").getValue()  
   //We only want this JS to run if the record is in read only mode. 1=Completed, 2=Canceled  
   if(stateCodeValue === 1 || stateCodeValue === 2)  
   {  
     //Reactivate the record  
     formContext.getAttribute("statecode").setValue(0);   
     formContext.data.save(6).then(function ()  
     {  
       //Complete Record again  
       formContext.getAttribute("statecode").setValue(stateCodeValue);   
       formContext.data.save();  
     });  
   }  
 }  
Microsoft Dynamics 365

Code Snippets

September 23, 2019
To help speed up my development process, I decided to go down the path of creating code snippets for code blocks that I end up copying and pasting a lot.  I have published these to GitHub and will keep adding to them as I find the need to add more.  If you have any that you would like to add, please feel free to send them my way.  Here is a link to the GitHub repository where I am keeping the code snippets.

https://github.com/JamesStuder/CodeSnippets

If you would like more information on how to create your own code snippets, check out Microsoft's documentation (link below).

https://docs.microsoft.com/en-us/visualstudio/ide/code-snippets?view=vs-2019
Microsoft Dynamics 365, Scribe Online

SQL Script To Generate C# Model

June 5, 2019
While working on a data migration project that uses C# and Azure Functions, I found myself needing to generate a model of the table(s).  While doing a Google search I came across the following blog post, which has a script to generate the model properties for me.  Simply replace the table name variable at the top of the script with the table name you want to generate the properties for.  You may also need to include the schema name in the table name if you have multiple tables with the same name under different schemas.  I did make one change to it on the `select @Result` line: it was adding a blank line between each property, so I removed the newline.


 declare @TableName sysname = 'TABLE_NAME'  
 declare @Result varchar(max) = 'public class ' + @TableName + '  
 {'  
 select @Result = @Result + '  
   public ' + (CASE WHEN ColumnName = 'RowVersion' THEN 'byte[]' ELSE ColumnType END) + NullableSign + ' ' + ColumnName + ' { get; set; }'  
 from  
 (  
   select   
     replace(col.name, ' ', '_') ColumnName,  
     column_id ColumnId,  
     case typ.name   
       when 'bigint' then 'long'  
       when 'binary' then 'byte[]'  
       when 'bit' then 'bool'  
       when 'char' then 'string'  
       when 'date' then 'DateTime'  
       when 'datetime' then 'DateTime'  
       when 'datetime2' then 'DateTime'  
       when 'datetimeoffset' then 'DateTimeOffset'  
       when 'decimal' then 'decimal'  
       when 'float' then 'float'  
       when 'image' then 'byte[]'  
       when 'int' then 'int'  
       when 'money' then 'decimal'  
       when 'nchar' then 'string'  
       when 'ntext' then 'string'  
       when 'numeric' then 'decimal'  
       when 'nvarchar' then 'string'  
       when 'real' then 'double'  
       when 'smalldatetime' then 'DateTime'  
       when 'smallint' then 'short'  
       when 'smallmoney' then 'decimal'  
       when 'text' then 'string'  
       when 'time' then 'TimeSpan'  
       when 'timestamp' then 'timestamp'  
       when 'rowversion' then 'byte[]'  
       when 'tinyint' then 'byte'  
       when 'uniqueidentifier' then 'Guid'  
       when 'varbinary' then 'byte[]'  
       when 'varchar' then 'string'  
       else 'UNKNOWN_' + typ.name  
     end ColumnType,  
     case   
       when col.is_nullable = 1 and typ.name in ('bigint', 'bit', 'date', 'datetime', 'datetime2', 'datetimeoffset', 'decimal', 'float', 'int', 'money', 'numeric', 'real', 'smalldatetime', 'smallint', 'smallmoney', 'time', 'tinyint', 'uniqueidentifier')   
       then '?'   
       else ''   
     end NullableSign  
   from sys.columns col  
     join sys.types typ on  
       col.system_type_id = typ.system_type_id AND col.user_type_id = typ.user_type_id  
   where object_id = object_id(@TableName)  
 ) t  
 order by ColumnId  
 set @Result = @Result + '  
 }'  
 print @Result  
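
To illustrate, for a hypothetical Customer table with an int Id, an nvarchar Name, and a nullable datetime CreatedOn column, the script would print something like:

 public class Customer
 {
   public int Id { get; set; }
   public string Name { get; set; }
   public DateTime? CreatedOn { get; set; }
 }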

Reference:
Habilis. (2017, May 01). Creating C# model class from SQL query. Retrieved from https://habilisbest.com/creating-c-model-class-from-sql-query
SQL

SQL Script to Create Enums For Table Columns

June 5, 2019
While working on a data migration project using C# and Azure Functions, I found myself needing to create enums to use in my code for the columns.  Some of the tables were very large (over 200 columns), and doing this manually would have taken a while.  So, I wrote the below SQL script to create the enums for me.  To use it, simply replace the table and schema variables with the schema and table name you want to generate the enums for.


 declare @Table sysname = '<INSERT TABLE NAME HERE>';  
 declare @Schema sysname = '<INSERT SCHEMA NAME HERE>';  
 declare @Result varchar(max) = 'public enum ' + @Table + 'Columns' + '  
 {'  
 select @Result = CONCAT(@Result, '  
      ',ColName, ' = ', SUM(ColNum), ',')  
 from  
 (  
      select COLUMN_NAME as ColName, SUM(ORDINAL_POSITION - 1) as ColNum  
      from INFORMATION_SCHEMA.COLUMNS  
      where TABLE_NAME = @Table AND TABLE_SCHEMA = @Schema  
      group by COLUMN_NAME  
 ) t  
 group by ColName, ColNum  
 order by ColNum  
 set @Result = @Result + '  
 }'  
 print @Result  
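
To illustrate, for a hypothetical three-column Customer table, the script would print something like this (the values are the zero-based ordinal positions of the columns):

 public enum CustomerColumns
 {
      Id = 0,
      Name = 1,
      CreatedOn = 2,
 }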
SQL

UPDATED: Setup Non-Interactive User with Non-Expiring Password

April 4, 2019
A while ago, I wrote a blog post on setting up a non-interactive user with a non-expiring password.  Since writing that post, the process has changed, and I couldn't find much documentation on it.  The part about setting up the non-interactive user hasn't changed:

Setup Non-Interactive User:
  1. Create a user in the Office 365 admin center.  Be sure to assign a Dynamics 365 (online) license to the account.
  2. Go to Dynamics 365 (online).
  3. Go to Settings > Security.
  4. Choose Users > Enabled Users, and then click a user’s full name.
  5. In the user form, scroll down under Administration to the Client Access License (CAL) Information section and select Non-interactive for Access Mode.  You then need to remove the Dynamics 365 (online) license from the account.
  6. Go to the Office 365 admin center.
  7. Click Users > Active Users.
  8. Choose the non-interactive user account and under Product licenses, click Edit.
  9. Turn off the Dynamics 365 (online) license, and then click Save > Close multiple times.
  10. Go back to Dynamics 365 (online) and confirm that the non-interactive user account Access Mode is still set for Non-interactive.
  11. In CRM assign a role to the user account.

But the process for setting up a non-expiring password has.  To make this easy, I created a small PowerShell script, shown below.  Simply change the `$UserEMail` value to the account you want to set.  The script connects to Azure AD (you will need to be a global admin), then checks the password policy for the user, then updates the policy and shows you that it was changed.  Here is a link to the Microsoft article that I referenced when doing this.  I do want to point out that you should check the warning at the bottom of that article.

 $UserEMail = "<Replace with e-mail>"  
   
 Connect-AzureAD  
   
 Get-AzureADUser -ObjectId $UserEMail | Select-Object UserprincipalName,@{  
   N="PasswordNeverExpires";E={$_.PasswordPolicies -contains "DisablePasswordExpiration"}  
  }  
   
  Set-AzureADUser -ObjectId $UserEMail -PasswordPolicies DisablePasswordExpiration  
   
  Get-AzureADUser -ObjectId $UserEMail | Select-Object UserprincipalName,@{  
   N="PasswordNeverExpires";E={$_.PasswordPolicies -contains "DisablePasswordExpiration"}  
  }  
Microsoft Dynamics 365

Prevent NULL inserts during updates

December 5, 2018
Recently, I had an issue that I needed to overcome.  The Dynamics 365 instance I was working on was integrated with multiple other systems, with some long-running external processes, and stale data in the UI while a record was open could override values set by those external processes.  What was happening is that on create of a contact, we can have a NULL e-mail.  This is normal behavior.  Our external process would then populate the NULL e-mail field while the record was open in the UI.  Because the UI hadn't been refreshed and the user made other changes, the e-mail would be blanked out (NULL) again.  To make sure that the field, once populated (yes, this is a business requirement), could not be cleared, I wrote a small pre-operation plugin that does the following:

  1. Checks the plugin context to see if we are trying to update emailaddress1 or emailaddress2.
  2. If we are trying to update either email address, then we check to see if the context value is null, "", or string.Empty.  We also double-check that the field is in the context, since the first if is an OR statement.
  3. If there is a value, then we move on.  But if we are trying to blank out the email address, I simply remove the attribute from the context.
Here is a code snippet:

 using Microsoft.Xrm.Sdk;
 using System;
 namespace ContactEmailValidation
 {
   public class ValidateEmail : IPlugin
   {
     public void Execute(IServiceProvider serviceProvider)
     {
       ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
       IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
       //Convert the target in the context to an entity to make it easier to work with.
       Entity preValidationData = (Entity)context.InputParameters["Target"];
       //Check if email fields are in the context. If they are, we are trying to update them.
       if (preValidationData.Contains("emailaddress1") || preValidationData.Contains("emailaddress2"))
       {
         tracer.Trace("Execution context has either primary email or secondary email in it.");
         //If primary email is being updated to null or empty, remove it from the context so we don't clear out the email.
         if (preValidationData.Contains("emailaddress1") && string.IsNullOrEmpty(preValidationData.GetAttributeValue<string>("emailaddress1")))
         {
           tracer.Trace("Trying to update primary email to null. Removing from context...");
           preValidationData.Attributes.Remove("emailaddress1");
         }
         //Same check for the secondary email.
         if (preValidationData.Contains("emailaddress2") && string.IsNullOrEmpty(preValidationData.GetAttributeValue<string>("emailaddress2")))
         {
           tracer.Trace("Trying to update secondary email to null. Removing from context...");
           preValidationData.Attributes.Remove("emailaddress2");
         }
         //Update the plugin context that will be saved to the database.
         tracer.Trace("Setting target to new context values.");
         context.InputParameters["Target"] = preValidationData;
       }
     }
   }
 }

Because this is a pre-operation plugin, removing the values from the context will prevent them from being cleared out.  Also, this is a stripped-down code snippet, so review and test it before using it in your own project.
Microsoft Dynamics 365

Multi-Select Option Sets Missing setVisible() for JavaScript (Workaround)

November 28, 2018
It is common practice in Microsoft Dynamics 365 to use Business Rules to show and hide fields on a form.  But with the addition of multi-select option sets, you can't do this.  So the next logical step is to use JavaScript.  There is just one small problem: as of this post, there is no setVisible() function among the controls we have access to for multi-select option sets.

So, how can we dynamically show / hide multi-select option sets?  The easiest way is to use a section.  Place a new section on your form.  Then use JavaScript to get the tab and section by name.  Once you have that, you can dynamically show / hide the section to make the multi-select option set show and hide.  Here is an example of the JavaScript:

 function ShowHide(executionContext)
 {
   //Create the form context
   var formContext = executionContext.getFormContext();
   var contactStatus = formContext.getAttribute("Field Name").getValue();
   var tabObj = formContext.ui.tabs.get("Tab Name");
   var sectionObj = tabObj.sections.get("Section Name");
   if (contactStatus === 1)
   {
     //Show the section containing the multi-select option set and make the field required.
     sectionObj.setVisible(true);
     formContext.getAttribute("fieldName").setRequiredLevel("required");
   }
   else
   {
     //Hide the section again when the condition no longer applies.
     sectionObj.setVisible(false);
     formContext.getAttribute("fieldName").setRequiredLevel("none");
   }
 }

Microsoft Dynamics 365

BETA Release Scribe .NET Library

November 10, 2018
In my spare time I have been building a .NET 4.5.2 library for the Tibco Scribe Online API.  For one developer this has been a daunting task and I am at the point that I need some help.  I have the library created and need people that can write unit tests to test the code.  The project is 100% open source and I am aiming for it to be a community build.  The project is hosted at GitHub.

This is a BETA release because I have not created all the unit tests to make sure every single method is bug-free.  Make sure to check the wiki on how to implement and use the project.  Once we are out of beta, I will create the NuGet package.

Thank you in advance to all that help out.
Scribe Online

Dynamics 365 v9 Unit Testing and .NET Confusion

July 12, 2018
Recently, while creating a plugin for Dynamics 365 v9, I ran into an issue trying to connect to CRM via the tooling connector in my unit test project.  The underlying problem was that the .NET version I was using in my unit test was 4.5.2.  This was preventing me from being able to connect to CRM to create my organization service.  I updated the .NET version on my unit test project to 4.6.1 and was then finally able to connect.  I will also add that I am using the latest NuGet package version for Dynamics 365 v9.

For consistency, I updated the plugin project I was working on to .NET 4.6.1.  Locally, everything was working great.  I was able to connect to CRM and make sure that all the methods I had written did what they were supposed to do, using test-driven development practices.

Then when publishing my plugin via the latest version of the plugin registration tool, I received an error and could not publish my plugin.  The error was due to the .NET version of my plugin project not being 4.5.2.  REALLY!!!!  I just had connection issues and needed to upgrade the .NET version to 4.6.1 to resolve them.  Now I am being told to downgrade my .NET version to upload the plugins to Dynamics 365 v9.

While researching this, I came across this thread in the Microsoft forums, where people are seeing the same issue and are confused as to what is happening.  Basically, CRM plugins need to use .NET 4.5.2, but the XRM tooling connector uses .NET 4.6.1.  We need the tooling connector in our unit tests because we are connecting to CRM from an external source to create the organization service context for us.  This means that our unit test project needs to be built on .NET 4.6.1 and our plugin project needs to be built on .NET 4.5.2.  Don't worry, the NuGet packages you need can work with both .NET versions without any problem, and you will be able to connect to CRM from your unit test project and call methods from the plugin project to test your code.  You will then be able to publish your plugins to CRM and not receive a .NET error message in the plugin registration tool.
Microsoft Dynamics 365

Working With N:N Relationships in LINQ - Using Late Binding

July 12, 2018
Recently I had a requirement where I needed to get related records via a many-to-many (N:N) relationship in Dynamics 365 via a plugin in C#.  During my searching online, I kept coming to posts that involved using query expressions.  I wanted to do this with LINQ for consistency and readability in my code.  I will also add that this was being done via the Associate and Disassociate messages of the plugin.  I won't be reviewing that part of the plugin in this post; I will just be focusing on the LINQ statement used to get the related records via the N:N relationship.  Here is the example code that I came up with:

 public List<Entity> GetRelatedRecords(Guid parentID, IOrganizationService service)  
 {  
      using (OrganizationServiceContext context = new OrganizationServiceContext(service))  
      {  
           // Query the N:N intersect table and join both related entities to it.  
           List<Entity> relatedRecords = (from r in context.CreateQuery(RELATIONSHIP_NAME)  
                                          join eOne in context.CreateQuery(ENTITY_ONE_LOGICALNAME) on r[ENTITY_ONE_UNIQUEID_LOGICALNAME] equals eOne[ENTITY_ONE_UNIQUEID_LOGICALNAME]  
                                          join eTwo in context.CreateQuery(ENTITY_TWO_LOGICALNAME) on r[ENTITY_TWO_UNIQUEID_LOGICALNAME] equals eTwo[ENTITY_TWO_UNIQUEID_LOGICALNAME]  
                                          // Filter to rows linked to the parent (entity one) record.  
                                          where (Guid)eOne[ENTITY_ONE_UNIQUEID_LOGICALNAME] == parentID  
                                          select eTwo).ToList();  
           return relatedRecords;  
      }  
 }  

The above code uses the relationship name (typically set up as prefix_entityOne_entityTwo), which is the name of the linking (intersect) table, and joins both linked entities to it.  From there we filter to the rows linked to the entity one record via its GUID and return the related entity two records.  We populate those in a list so we can loop through them or process them as needed.
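For context, the constants in the method are just schema names.  A minimal sketch with hypothetical names (yours will differ) might look like:

 // Hypothetical schema names - replace with your own relationship and entity names.  
 private const string RELATIONSHIP_NAME = "new_account_new_product";  
 private const string ENTITY_ONE_LOGICALNAME = "account";  
 private const string ENTITY_ONE_UNIQUEID_LOGICALNAME = "accountid";  
 private const string ENTITY_TWO_LOGICALNAME = "new_product";  
 private const string ENTITY_TWO_UNIQUEID_LOGICALNAME = "new_productid";  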
Microsoft Dynamics 365

SCRIBE Software Acquired By TIBCO

June 6, 2018
On June 6th, 2018, Scribe Software was acquired by TIBCO.  Below is the announcement that other partners and I received in an e-mail with links to the press release.

"We are pleased to announce that on June 6, Scribe Software was acquired by TIBCO Software. This milestone reflects the increasing strategic importance that Scribe’s product line has with IT organizations and presents great opportunities for Scribe’s partner community.

In the short term, there will be no immediate impact to how you conduct business with Scribe. Your sales and support contacts will all remain the same. Over time, we expect that the combination of Scribe’s best-in-class iPaaS with TIBCO’s enterprise product portfolio, which includes messaging, application integration, API management, and analytics offerings, will provide significant capabilities and opportunities for Scribe’s partner community.

To learn more about the opportunities that lay ahead, read the press release..."
Scribe Online

HttpWebRequest For Scribe Online Api

May 1, 2018
Scribe has made it extremely easy to work with their API.  Currently, in my limited free time, I am working on a reusable C# library to send and receive data from Scribe's API.  Once Version 1 is complete, I will publish it to GitHub so anyone can consume it and provide feedback on improvements.  There will be a fully built out wiki on how to consume the library, with references to the Scribe API documentation.  There will also be both TDD (Test Driven Development) and BDD (Behavior Driven Development) projects, so you can see examples of how to consume the library as well as what I tested before release.  If you are not familiar with BDD, I recommend you check out my blog post in my Dynamics 365 blog that gives a high-level overview of it.

Anyways, back to what this post is about.  While working on the library, I came across one small issue around TLS 1.2 and making sure the web request uses it.  Luckily, this is really easy to set in C# when using .NET 4.5.  By default, .NET 4.5 does not use TLS 1.2; it is supported, just not the default.  So if an endpoint requires TLS 1.2, we need to explicitly opt in to it, which decreases the chance of a connection error.  Here is a blog post I came across that helped me with my issue (see References below).

Because I am using C# .NET 4.5 I simply needed to add the following line of code:
 ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;  

The line of code is added right before my HttpWebRequest.  Here is the fully working request method:


     public string SendRequest(string username, string password, string url, string method)  
     {  
       WebResponseFactory factory = new WebResponseFactory();  
       // Opt in to TLS 1.2 before the request is created.  
       ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;  
       HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);  
       request.Method = method;  
       // Basic authentication header built from the Scribe Online credentials.  
       request.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}")));  
       request.ContentType = ParameterContentTypeSettings.applicationjson;  
       // Some endpoints reject requests without an explicit content length, even with no body.  
       request.ContentLength = 0;  
       HttpWebResponse response = (HttpWebResponse)request.GetResponse();  
       return factory.ProcessResponse(response);  
     }  

In this code I use a factory to determine what kind of response to return: either a plain string response or a JSON string response.  I have tested this against both Sandbox and Production APIs with Scribe Online and it does work.  The only other thing I want to note in the above code is that I am sending a content length of 0.  This is because I received some error messages where a content length was required, even though I wasn't sending a body.  If you need to send a JSON content body, here is the method where I do that:

     public string SendRequest(string username, string password, string url, string method, string contentBody)  
     {  
       ASCIIEncoding encoding = new ASCIIEncoding();  
       byte[] encodedData = encoding.GetBytes(contentBody);  
       WebResponseFactory factory = new WebResponseFactory();  
       // Opt in to TLS 1.2 here as well before creating the request.  
       ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;  
       HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);  
       request.Method = method;  
       request.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}")));  
       request.ContentType = ParameterContentTypeSettings.applicationjson;  
       // Content length now reflects the JSON body being sent.  
       request.ContentLength = encodedData.Length;  
       var requestStream = request.GetRequestStream();  
       requestStream.Write(encodedData, 0, encodedData.Length);  
       requestStream.Close();  
       HttpWebResponse response = (HttpWebResponse)request.GetResponse();  
       return factory.ProcessResponse(response);  
     }  

If you need any help working with the Scribe API, please feel free to leave a comment below or leave a post in the Scribe Forums.  Also check out Scribe's Developer Portal for additional information.

References:
Tarnovskiy, S. (2016, April 28). TLS 1.2 and .NET Support: How to Avoid Connection Errors. Retrieved April 30, 2018, from https://blogs.perficient.com/microsoft/2016/04/tsl-1-2-and-net-support/
Scribe Online

Test Your Code With Your User Stories - Behavior Driven Development

April 16, 2018
When we are designing a new system, one of the tools we use is user stories.  User stories allow us to define what the feature should do from the viewpoint of the end user.  This way we take a user-centered approach to designing the system.  These are also used as part of our functional testing when writing code (plugin, JavaScript, etc.) to make sure what was written matches the user story.  Even if we use Test Driven Development (TDD), we could easily miss some of the key functions within the feature and need to go back to our code to make changes and then restart our testing process.  This can be time consuming.  Wouldn't it be better to start our testing off with the user story?

With advancements in frameworks and technology we now have the capability to write test scripts directly from the user story using SpecFlow.  Taking this approach is known as Behavior Driven Development (BDD), because we are testing the user's interactions instead of just data and functions.  This becomes even more important when we have to work with a UI; we can mimic button presses and navigation using other tools like Selenium and EasyRepro.
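To make this concrete, here is a rough sketch of a SpecFlow step-binding class; the scenario wording and step bodies are hypothetical, purely to show the shape of a BDD test:

 using TechTalk.SpecFlow;  
   
 // Binds Gherkin steps from a user story such as:  
 //   Given a contact named "Jane Doe" exists  
 //   When the user opens the contact record  
 //   Then the credit limit field should be visible  
 [Binding]  
 public class ContactFormSteps  
 {  
      [Given(@"a contact named ""(.*)"" exists")]  
      public void GivenAContactNamedExists(string fullName)  
      {  
           // Arrange: create or look up the test contact (FakeXrmEasy, a test org, etc.).  
      }  
   
      [When(@"the user opens the contact record")]  
      public void WhenTheUserOpensTheContactRecord()  
      {  
           // Act: drive the UI with EasyRepro / Selenium, or call the code under test.  
      }  
   
      [Then(@"the credit limit field should be visible")]  
      public void ThenTheCreditLimitFieldShouldBeVisible()  
      {  
           // Assert: verify the behavior the user story describes.  
      }  
 }  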

I am not going to go through the entire setup process, as SpecFlow has done an awesome job documenting how to set it up for CRM.  Also, Wael Hamze has provided some great examples on GitHub using FakeXrmEasy and EasyRepro.


Microsoft Dynamics 365

SCRIBE Online April 2018 Update is Here!!!

April 9, 2018
On April 6th, 2018, Scribe released an update to SCRIBE Online.  You can read all the details of the changes here.  In this post I want to go over some of what I have seen while working with this new update, and it's an AMAZING update to the platform.  Here are my highlights:

  • Performance - There does appear to be improvement in how well the platform functions in Chrome (Chrome is my primary browser).  Previously, I did have some performance issues in Chrome, and on larger maps I would have to clear my cache or exit Chrome and re-launch to speed it back up again (not always, but it was needed from time to time).  That appears to be eliminated for me.  I can clearly see marked improvements when it comes to lag and performance. *See bullet point 4 under Platform, which is under Enhancements.
  • Agent Request Logs - Users now have the ability to request the logs from the agent without logging into the server where the agent is installed to get them.  Simply go to the agent in Scribe Online and click on the gear that appears on the right side of the agent table.  Click on "Request Logs".  This will download a zip file with the logs for you to review. *See Agents section. 
  • Revision History - At the bottom of each mapping you will see a "Revision Comment" box.  While working on a mapping you can insert a comment into this box to log what you are updating on the map.  Every time you click "Apply" or "OK" to save the map, a revision is stored.  This way, if you need to go back and restore a previous revision, you can.  To view the revision history, go to the map in your solution, click on the gear to the right of your map, and look for "Revisions".  I will do a more detailed blog post on this one in the future. * See Maps Section.
  • Debugging - This was a bug that I ran into previously.  In previous releases, the debugger sometimes wouldn't enter the block where the error happened, which could cause confusion as to what the problem was.  The error itself also wouldn't appear when this happened.  This bug has been squashed.  Debugging is working like a charm, and errors are properly appearing, making it really easy to figure out what is happening when debugging a map. * See Debug Section
There are a lot of bug fixes and enhancements in this release.  What I highlighted in this post is only a small portion of them.  These are the ones I chose to highlight, as they are perhaps the most notable and easily recognizable by people using the platform.  Again, for a complete list of bug fixes and enhancements, head over to the SCRIBE Success Community and check out the release notes.

Scribe Online

Getting Started Connector Development

March 22, 2018
One of the benefits of working with Scribe Online is how easy they make it to create connectors if one does not exist.  In this blog post we are going to look at how to get set up, if this is the first time you have made a connector for Scribe Online.  But before we get into that, we should first make sure that a connector doesn't already exist that can handle what we need.

We can do this by looking in the Scribe Online Marketplace.  Still not seeing what you need?  Reach out to Scribe directly, or ask in the Scribe forums whether a connector exists for an application.  There are instances where a connector exists but is not listed in the marketplace.  An example of this is a connector I built for a client.  They didn't want to set up a system to run the on-premise agent, so they asked me to set up the connector to run on Scribe's cloud agent.  This meant that I had to submit the connector to Scribe for validation.  Once published, the connector is in the Scribe marketplace, but hidden.  Access to it is managed from within the client's Scribe Online org, which means only people that ask them for access can use it.  But unless they tell you this, you won't know it.  So it's worth asking Scribe whether a connector exists before starting to develop one.  As mentioned before, even if one doesn't, Scribe makes it really easy to create one.

In this blog post we are only going to go over what you need to get set up.  We won't be getting in depth on connector development; that will come in future posts.  To create connectors you will need Visual Studio, the CDK download from Scribe, and optionally the GitLab extension for source control.

I will assume that if you are reading this you are already familiar with writing code and have Visual Studio installed.  If that is the case, then all you need to do is install the GitLab extension (only needed if you are going to publish to Scribe for validation or if you don't have a current source control solution).  Next we install the Fast Connector Framework (FCF) for both messaging and integration.  To do this, go into the FastConnectorFramework folder in the CDK that you downloaded.  There you will see 2 folders, each containing a .vsix file.  With Visual Studio closed, double click on these files to install the extensions in Visual Studio.  Once this is done, you will see the new FCF project templates when creating a new project in Visual Studio.

With this all set up, we can create connectors with the FCF for integration or messaging.  We can also create connectors from scratch using the CDK.  Then we can upload them to Scribe if we want them in the marketplace or want to use them with the cloud agent.  In future blog posts, we will go more in-depth on connector development and the differences between using the CDK or FCF.  I just wanted to put this post out as an introduction to connector development.
Scribe Online

SCRIBE Labs - Variable Connector Used To Control A Loop (Dynamically)

March 9, 2018
Recently, while working on a data migration task, I had a requirement to take data stored in a string delimited by ";" and parse it to create multiple line items in the target system.  Here is an example as a visual:

Source:
productA;productB;productC;productD...

Target:
productA
ProductB
ProductC
ProductD
...

In the target system, each product would be a line item on an opportunity.  In the source system this was all listed in a text box as a string and delimited by ";".  Also the cell where this data was in the source could be null, have one item, ten items, fifty items, etc.  So I needed a way to loop through this.

To set this up I needed to use three connectors: source, target, and Scribe Labs - Variables.  Below is the section of the mapping where I achieved what I needed to happen (a C# sketch of the loop logic follows the list):
  1. If Product To Add - This is the null handler.  If the field where the products are listed in the source is null, then we skip adding any products in the target system.
  2. Lookup uom - This is target specific related record and not important for what we are working on.
  3. Upsert Counter - Here is where we declare our counter in the Scribe Labs - Variables connector.  For this I chose NumVariable and set the name to "counter" and the val to 1.
  4. Loop - Now we enter our loop.
  5. Get Counter - Here we do a lookup to get the val of the counter.  Yes, we just created the counter, but remember we are in a loop and need this lookup, since we will be updating the value in the loop.
  6. Lookup Product - Here is where we look up the product in the target system.  For the lookup I am doing a like match against the product id field in the target system.  To do this I need to use the parse function in Scribe.  Here is how the parse is set up: CONCATENATE("%", STRIP(PARSE(Source Field, Counter_NumVariableLookup.val, ";"),"AN")).  What we will focus on in this function is the PARSE() piece.  Within the Parse we tell Scribe the field we want to get the data from, then where specifically in that string to look, then how the string is delimited.  To make this dynamic, we use our counter variable.  On the first run the counter is one, so in a string of "ProductA;ProductB;ProductC" our parse pulls out "ProductA" because it is the first sub-string.  When the loop runs again, the counter will be two and "ProductB" will be returned.
  7. If Product Found - This if statement is to make sure we don't create any empty line items if the product doesn't exist in the target system.
  8. Create opportunityproduct - If we do find 1 product, then we create the record in the target system.
  9. Increase Counter - Here is where we increase the val of the counter by 1.
  10. If No More to Process - Now that we have increased our counter, we need to check our string ("ProductA;ProductB;ProductC") to see if we need to run the loop again.  When you use Parse with a counter like this and the counter goes past the end of the string (in our example, position 4), Parse returns null.  So for our If statement we use ISNULLOREMPTY() to check whether we have reached the end of the string.  We would set up the condition to be ISNULLOREMPTY( PARSE(Source Field, Counter_NumVariableUpsert2.val, ";") ) equals True.
  11. Loop Exit - If we have reached the end of our string, then we exit our loop.
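As referenced above, here is a rough C# translation of the loop logic, purely illustrative (the field value and names are hypothetical):

 // Hypothetical sketch of the counter-driven parse loop.  
 string sourceField = "ProductA;ProductB;ProductC";  
 string[] parts = sourceField.Split(';');   // PARSE(field, counter, ";") by position  
 int counter = 1;                           // Upsert Counter: val = 1  
 while (true)  
 {  
      // Get Counter / Lookup Product: take the sub-string at the counter position.  
      string product = counter <= parts.Length ? parts[counter - 1].Trim() : null;  
      if (string.IsNullOrEmpty(product))    // If No More to Process -> Loop Exit  
           break;  
      // If Product Found / Create opportunityproduct happens here.  
      Console.WriteLine($"Create line item for {product}");  
      counter++;                            // Increase Counter  
 }  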
Scribe Online

XrmToolBox Bulk Attachment Manager - Version 2018.2.3.2 Released!

March 9, 2018
I am pleased to announce that this week I published an update to my XrmToolBox Bulk Attachment Manager plugin.  For those that are not aware, my plugin allows a user to download note attachments, email attachments, or both from Dynamics CRM.

This update includes bug fixes and a major enhancement that was requested by a user.  The enhancement is the ability to create a report of attachments in the system.  To do this, follow these steps:

  1. In Step One, choose if you want a report on notes, emails or both.
  2. In Step Two, choose "Report Only".
  3. Step Three, click on "Run".
You will see the table populate with information.  Once it is done, click on export and you will be provided with a pipe-delimited ("|") .csv file with the data in the table.  You can use this to quickly find attachments you are interested in.  With the note GUIDs from the export, you can provide a filtered-down list so you only download those attachments, using the "Specific attachment" option for export.

This update also adds file size (in bytes) as a column of data provided in the table.

This is an open source project that I have hosted on GitHub.  If you have any recommendations for the tool, run into any issues, or just want to support the plugin, please head over to my GitHub page.

Microsoft Dynamics 365XrmToolBox

XrmToolBox Bulk Attachment Manager - Version 2018.2.2.7 Released!

February 12, 2018
Today I have published version 2018.2.2.7 of my XrmToolBox plugin.  The purpose of this plugin is to make it easy to download and back up attachments in CRM.  This is a major release, as it is the first build with all pieces working.  The first version only had note downloads working.

How To Use The Tool:
  1. Launch XRMToolBox.
  2. Connect to organization.
  3. Step 1, choose what you want to download. (Once a choice is made, Step 2 becomes usable)
    1. Notes - Download attachments from note entity.
    2. E-Mail - Download attachments from emails.
    3. Both - Downloads attachments from notes and emails entities.
  4. Step 2, choose if you want to download all attachments in the system or specific attachments.  
    1. All Attachments - This will search the chosen entity to find all records that have an attachment and download them.
      1. Click on "Browse" and choose where to download the attachments too.
    2. Specific Attachments - You will need to provide a .csv file with the GUID(s) of the records you want to download.
      1. Click on "Browse" and choose the .csv file to use.  Attachments will be downloaded to the same location as the .csv file.
  5. Step 3, click on "Run".
  6. When complete, in step 4 click on "Export Results".  This will export the data in the output area into a pipe delimited (|) .csv file.
Notes:
  • Note attachments are stored in a "Note Attachments" folder at the download location chosen.
  • Email attachments are stored in an "Email Attachments" folder at the download location chosen.
  • Inside the folders mentioned above are sub-folders, each named with the GUID of the source record (note / email).  This way we maintain the relationship of the records.  Within each sub-folder are the attachment(s).
If you run into any issues with this plugin please report the issue against the GitHub Project.
Microsoft Dynamics 365XrmToolBox

Validating User Input In CRM Portals With JavaScript

February 9, 2018
When we are setting up CRM Portals to allow customers to update their information, open cases, fill out applications, etc., we want to make sure we are validating their input before it is committed to CRM.  This way we ensure our data is clean and meaningful to us and the customer.

CRM Portals already has a lot of validation checks built in.  But on occasion we need to add our own.  To do this we will use JavaScript to run the validation, and also to output a message telling the user there is an issue they need to fix.

Before we can do any JavaScript, we need to check and see if we are using JavaScript on an Entity Form or Web Page.  This is because the JavaScript, while similar, will be different.  First, we will go over the JavaScript for Entity Forms.  Then, we will go over the JavaScript for Web Pages.  Finally, we will look at the notification JavaScript.

Entity Form:
 if (window.jQuery) {  
   (function ($) {  
     if (typeof (entityFormClientValidate) != 'undefined') {  
       var originalValidationFunction = entityFormClientValidate;  
       if (originalValidationFunction && typeof (originalValidationFunction) == "function") {  
         entityFormClientValidate = function ()  
         {  
           // DO VALIDATION HERE. RETURN TRUE IF PASSED AND FALSE IF FAIL.  
           // e.g. (hypothetical field): if ($('#new_email').val() === '') { return false; }  
           return true;  
         };  
       }  
     }  
   }(window.jQuery));  
 }  

Web Page:
 if (window.jQuery) {  
   (function ($) {  
     if (typeof (webFormClientValidate) != 'undefined') {  
       var originalValidationFunction = webFormClientValidate;  
       if (originalValidationFunction && typeof (originalValidationFunction) == "function") {  
         webFormClientValidate = function ()  
         {  
           // DO VALIDATION HERE. RETURN TRUE IF PASSED AND FALSE IF FAIL.  
           // e.g. (hypothetical field): if ($('#new_email').val() === '') { return false; }  
           return true;  
         };  
       }  
     }  
   }(window.jQuery));  
 }  

Message JavaScript:
 $('.notifications').append($("<div class='notification alert alert-danger' role='alert'>PUT ERROR MESSAGE TEXT HERE</div>"));  

Microsoft Dynamics 365

Using Microsoft Dynamics 365 Within Microsoft Teams

January 26, 2018
In my SCRIBE Online blog, I wrote about how we can use SCRIBE Online in Microsoft Teams.  In this post I am going to talk about how we can use Microsoft Dynamics 365 in Microsoft Teams.  For those that might not be familiar with Microsoft Teams, it was released by Microsoft in 2017 to help increase communication and productivity within teams.  The best part about Microsoft Teams is it is 100% FREE.  Click here to download Microsoft Teams.  Since its release there have been many updates and added connectors that increase its functionality and usability.

To leverage Microsoft Dynamics 365 within Microsoft Teams we can use either the Website Tab or the Dynamics 365 Connector.  First we will start with how to add the Website Tab.
  1. In your team, choose the channel you want to add the tab to.
  2. Click the "+" in the channel you have chosen and you will get a pop-up
  3. Click on "Website"
  4. There will be a pop-up asking you to input the tab name and URL.  At this point you need to decide if you are going to have the website launch to the default landing page for Dynamics 365 or to a specific record in Dynamics 365.
  5. Leave the "Post to the channel about this tab" checked.  This will create a thread in the "Conversations" tab.  This way any communication around the Dynamics 365 record can be kept in one place.
  6. Click "Save".
Now that we know how to add Dynamics 365 in the website tab, you might be asking: "How can I get notifications if there is an update to the record in Dynamics 365 without logging into Dynamics 365?"  That is a really good question, and one that the Dynamics 365 connector solves.  At the top of the Teams UI you will see the "breadcrumbs" of where you are.  This will be the team name followed by the channel.  After the channel name you will see three dots.  See below image.

To add the connector we want, click on the three dots.  This will bring up a menu.  In the menu click on "Connectors" (if you don't see Connectors, wait a few minutes; it means you just created the team and it hasn't fully set up yet).  In the pop-up, you will see a list of connectors.  To filter these, click on CRM under "CATEGORY".

Next click on configure next to Dynamics 365.  Wait a few seconds and you will see a screen asking you to pick the Dynamics 365 environment and then choose a record.  Once you see this, pick the Dynamics 365 instance you want to connect to.  Once Teams connects to the instance, you will see the instance URL appear.  Now we can search for the record we want in CRM; typically these will be leads, accounts or contacts, since almost all other records are linked to these in some fashion.  Once we find the record we want in the search results, click on save.  You will see a new thread start in the "Conversations" tab.  Now when any activities are updated for this record, there will be notifications within Microsoft Teams.

I hope this helps you with Microsoft Dynamics 365 within Microsoft Teams.  Stay tuned for any updates to Microsoft Teams and the Dynamics 365 connector for Microsoft Teams.
Microsoft Dynamics 365

Using SCRIBE Online to Migrate Opportunities Into Microsoft Dynamics 365

January 18, 2018
Recently, while creating maps for a data migration from a legacy system to Microsoft Dynamics 365 using SCRIBE Online, I found that I needed to bring in opportunities.  Anyone that has worked with CRM for a time knows that migrating closed opportunities into CRM can be a bit of a pain.  I was having one issue in particular when bringing this data in: the actual close date was being set to the day the record was migrated into CRM.  This is because of how CRM handles opportunities when they are closed.

Here is what I was doing:
  1. I performed all my look ups first to get the associated account and contact from CRM.
  2. I created the opportunity with the closed status of won or lost (the standard status values are documented online for reference).
  3. This marks the opportunity as read only (which I expected to happen).  But the actual close date was set to the day the record was created, even though I provided a date in the actual close date field.
Why is this happening?
     This happens because when you close an opportunity in CRM, it creates an opportunity close activity.  Using an Advanced Find, I was able to locate this record, and the actual value and actual close date fields were blank.  Keep in mind my mapping didn't create this record; CRM created it when the status of the opportunity was set to one of the closed statuses (won or lost).

How do we overcome this?
    The way to overcome this is really simple.  A post in the SCRIBE forums on how to do this with SCRIBE Insight helped me come up with the way to do it in SCRIBE Online.  Here is the solution:
  1. Create the opportunity with the close status you want (won or lost).  This will make it read only.  Don't worry about that.
  2. I recommend doing an if/else block after the create opportunity step.  This way, if you are bringing in open opportunities, the if/else will be skipped.  We want to enter the if statement only if we have a closed opportunity.
  3. Inside our if else, we want to do a lookup to get the opportunity we just created.
  4. After our lookup do a delete block against the opportunity close table.  Use the opportunity id in the matching criteria.
  5. After the delete we want to do a create opportunity close and populate it with the actual close date and actual revenue.
  6. Finally, do an update block to the original opportunity and populate the actual close date and actual revenue.  Even though the record is read only, it will update these fields.
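For anyone doing the same fix in code rather than a SCRIBE map, here is a rough SDK sketch of steps 3 through 6.  It is illustrative only, using the standard opportunity field names and assumed opportunityId, realCloseDate and realRevenue values:

 // 1. Find the auto-created opportunityclose activity for the opportunity.  
 QueryExpression query = new QueryExpression("opportunityclose");  
 query.Criteria.AddCondition("opportunityid", ConditionOperator.Equal, opportunityId);  
 Entity autoClose = service.RetrieveMultiple(query).Entities[0];  
   
 // 2. Delete it and create a replacement with the real close date.  
 service.Delete("opportunityclose", autoClose.Id);  
 Entity newClose = new Entity("opportunityclose");  
 newClose["opportunityid"] = new EntityReference("opportunity", opportunityId);  
 newClose["actualend"] = realCloseDate;  
 service.Create(newClose);  
   
 // 3. Update the read-only opportunity; these two fields still accept updates.  
 Entity opp = new Entity("opportunity") { Id = opportunityId };  
 opp["actualclosedate"] = realCloseDate;  
 opp["actualvalue"] = new Money(realRevenue);  
 service.Update(opp);  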

Microsoft Dynamics 365Scribe Online

Getting Your Data Into Dynamics 365 - Data Migration

January 8, 2018
When doing an implementation of Dynamics 365, it is rare that a new system is being set up with no data needing to be brought into it.  When looking into how to get that data into the new system, it can be really confusing trying to decide which tool to choose.  In my experience, people tend to choose a tool based on price, and this should really be the last thing to consider.  That is one of the reasons I will not talk about price in this post.  Instead I will stick to the pros and cons for each tool and provide an overview of the tool and my experience(s) with each.

Here are the tools we will be looking at in this post:
KingswaySoft
SCRIBE Online
D365 Data Import Wizard
D365 Data Loader

KingswaySoft:
KingswaySoft is an integration toolkit built on SQL Server Integration Services (SSIS).  It is an on-premise solution (but can be cloud-hosted if installed on a virtual machine in Microsoft Azure or Amazon AWS).  It can be used for integration, migration and replication.  In full disclosure, I have not used KingswaySoft for a data migration yet, because of the reasons I list in the cons.  But I have had clients that use it.  They offer a free download and developer license, so you can give it a try.

Pros:
  • SSIS has been around for a long time and SQL developers are familiar with it.
  • Has starter packs that can be used to speed up development of mappings.
  • Easy integration with source control since development is done in Visual Studio.
Cons:
  • Since it's built on SSIS, you need to know the disadvantages of SSIS as well.
  • Can be intimidating if the user is not a developer, since you have to develop within Visual Studio or SQL Server Management Studio (SSMS).
  • Limited connections
SCRIBE Online:
SCRIBE Online is a product offered by SCRIBE for data integration, migration and replication.  This is the tool I use the most when it comes to migrating data into Microsoft Dynamics 365 and integrating with other systems.  It runs in the cloud and has an on-premise agent you install to access your on-premise data / systems.  There is also a cloud agent, if all your applications are already in the cloud.  They offer a free trial that you can use to see if it will fit your needs.

Pros:
  • User interface is easy to understand and clean.
  • Runs in the cloud and uses a small tool (agent) to access on-premise data.  If all systems are in the cloud already then a cloud agent can be used so there is no need for an on-premise system.
  • Multiple connectors.  Makes it easy to connect to other systems and migrate the data into CRM.
  • API - They provide an API so you can access your instance programmatically.
  • Fast Connector Framework (FCF) \ Connector Development Kit (CDK) - If a connector doesn't exist you can use the FCF or CDK to create one.
  • Provides starter kits to speed up your data mapping. 
  • They offer a Dynamics 365 app, so you can monitor status from within CRM.
Cons:
  •  Sometimes there are caching issues and you have to clear browser cache to fix them.
  • Getting mappings into source control is a manual process, since you have to export them and copy them into a local folder.
D365 Data Import Wizard:
The Data Import Wizard is built into Microsoft Dynamics.  It allows for easy import of .csv, .xls, .xlsx and .zip files.  My very first data migration was done by importing .xls files into CRM using the Data Import Wizard (it was not easy).  If you are doing a large or complex migration, I recommend staying away from this approach and using a different tool.  Only use this for small occasional imports.

Pros:
  • Built within Dynamics 365.
  • Mappings saved in CRM for later use.
  • Great for small non-complex mappings.
  • Can be used with the CRM SDK / Developer Guide.
Cons:
  •  There is a limit to the file size you can use for importing data.  This means you will need to split up your data into multiple files, which can increase the possibility of data duplication.
  • If there are relationships to other entities in your mapping, it can be difficult to set them all and have them work consistently.
  • The tool is very inconsistent when importing data.
D365 Data Loader:
This is a new tool that Microsoft has put out with Dynamics 365.  It was designed to help businesses go from CRM On-Premise to CRM Online.  I cannot give this one a pros and cons list because it is a preview feature, which means it still has issues and is not fully released yet.  I will say that it is a viable option for going from CRM On-Premise to CRM Online when both systems are identical.

Conclusion:
It can be really hard to choose a tool and say that it will work 100% of the time for all needs, because everyone's needs and skill sets are different.  If I had to choose one, I would typically choose SCRIBE Online, because it is the most user friendly and the easiest for people to pick up and maintain.
Microsoft Dynamics 365Scribe Online

SCRIBE Stream Deck Version 2 Release - v02.01.2017

January 2, 2018

In a previous post, I talked about using SCRIBE Online with my Stream Deck.  The way I did this was by creating a few small programs that I connected to open buttons on my Stream Deck.  This was a temporary solution I created as a proof of concept.  What I didn't like about it was that I didn't have all the SCRIBE functions on it, and I had to create multiple projects in Visual Studio.  But it did prove that what I wanted to do could be done.  Since that post I have rewritten the project into one application that does the following:
  • Allows for user created functions without having to change code.
  • Provides all the SCRIBE functions on the Stream Deck.
  • Provides buttons to launch SCRIBE Online, SCRIBE Forums, and SCRIBE Help.
  • Provides buttons for cut, copy, paste.
Here is an overview of the application:
The purpose of this application is to allow the Stream Deck to be used with SCRIBE Online.

On start, this application ends the stock Stream Deck application.  Only one instance of this application can run at a time, and measures have been taken to prevent multiple instances from running.  When the application exits, it automatically relaunches the stock Stream Deck application.

Home Page Buttons:


  • Back - Ends the program and launches the stock Stream Deck Application. (Only on home screen. Other screens it will take you back one level.)
  • Open XML - Opens the XML that contains the SCRIBE and user defined functions.
  • Cut - Same as CTRL+X.
  • Copy - Same as CTRL+C.
  • Paste - Same as CTRL+V.
  • SCRIBE Online - Launches the SCRIBE Online web application.
  • SCRIBE Forums - Launches the SCRIBE Forums web site.
  • SCRIBE Help - Launches the SCRIBE Help web site.
  • Conversion - Folder containing the conversion functions.
  • Date - Folder containing the date functions.
  • Logical - Folder containing logical functions.
  • Math - Folder containing math functions.
  • Text - Folder containing text functions.
  • Miscellaneous - Folder containing miscellaneous functions.
  • User Defined - Folder containing user created functions.
When a folder button is pressed, the buttons will refresh with the functions that are in that folder.  This matches what is in these folders in the SCRIBE Online application.  Because there are only 15 buttons (the top left is always used for back), this leaves only 14 functions that can be viewed at once.  If there are more functions than can be displayed, you will see an ellipsis (...) button in the bottom right corner of the Stream Deck.  Also, when you go into a folder, the back button will take you one level back.  It only exits the program from the home screen.

 Here is an outline of the basic usage:
  1. In SCRIBE Online open the formula editor and click where you want to insert the function.
  2. Navigate to the function you want on the Stream Deck. 
  3. Press the function button and release it. Upon releasing, the function will be placed at the cursor location in the formula editor.
Behind the scenes, the computer's clipboard is used to do this, which is important to know if you are copying and pasting other items.  When the button is pressed, the application looks up the function you want in the XML and gets its value.  Upon release of the button, it pastes the value of the function where the cursor is.

The back button will remain in the top left corner button. On the home screen this button will end the SCRIBE Stream Deck application. From any other screen it will take the user back one level.


To fully understand the application please check out the wiki on the GitHub project. 

Here are the links:
GitHub Project
Wiki
Latest Release
Submit Feedback or Issue 
Scribe Online

Dynamics Set IFrame URL - D365 v8 vs. D365 v9

December 29, 2017
While doing client work, I came across a problem with setting an IFrame URL dynamically.  The underlying issue was that the sandbox instance is on v8 of Dynamics 365 and production is on v9 of Dynamics 365, because this client was set up around the time that Microsoft rolled out v9.  Anyways, the JavaScript that I wrote to dynamically set the URL of the IFrame wasn't working in the v9 instance.  This was because of changes that Microsoft made to how IFrames are loaded on the form, and also changes to JavaScript.

Here is my v8 setup:
  1. JavaScript runs on the OnLoad event of the contact form.  This works because of how IFrames are loaded in v8.  You can also run it on either a tab change (hide / show) or the OnReadyStateComplete event of the IFrame.  Depending on your setup you will need to choose which is best for you.  For me, in this case, it was the OnLoad event.
  2. Here is the JavaScript:
 function OnLoad()  
 {  
      //Get memberid  
      var value = Xrm.Page.data.entity.attributes.get("new_memberid").getValue();  
      //Get the default URL for the IFRAME, which includes the   
      // query string parameters  
      var IFrame = Xrm.Page.ui.controls.get("IFRAME_NAMEofIFRAME");  
      var Url = IFrame.getSrc();  
      //Append the parameters to the new page URL  
      var newTarget = Url + "history?member=" + value;  
      // Use the setSrc method so that the IFRAME uses the  
      // new page with the existing parameters  
      IFrame.setSrc(newTarget);  
 }  

Here is my v9 setup:
  1. JavaScript runs on the OnReadyStateComplete event of the IFrame.  I had to add extra code to prevent constant refresh of the IFrame, because every time we update the URL it causes the OnReadyStateComplete event to fire again.  To prevent this, we use an if/else so we only update the URL if it hasn't been updated yet.
  2. Here is the JavaScript:
 function OnReadyStateComplete(executionContext)  
 {  
      var formContext = executionContext.getFormContext();  
      var value = formContext.getAttribute("new_memberid").getValue();  
      //Get the default URL for the IFRAME, which includes the   
      // query string parameters  
      var IFrame = formContext.ui.controls.get("IFRAME_NAMEofIFRAME");  
      var Url = IFrame.getSrc();  
      var newTarget = "";  
      //Append the parameters to the new page URL  
      if(Url.includes(value))  
      {  
           newTarget = Url;  
      }  
      else  
      {  
           newTarget = Url + "history?member=" + value;  
           // Use the setSrc method so that the IFRAME uses the  
           // new page with the existing parameters  
           IFrame.setSrc(newTarget);  
      }  
 }  

As you can see, in my v9 code I have to pass in the execution context so I can create the form context.  To do that you have to check the box next to "Pass execution context as first parameter" when adding the JavaScript to the form.


Microsoft has not deprecated Xrm.Page yet, but it is coming, so it is best to plan for it.  Here are some important links I used when solving my problem of the JavaScript not running:

Important Change (deprecations) coming
OnReadyStateComplete
Execution Context
Form Context
Walkthrough: Write your first client script
Client API Reference
Microsoft Dynamics 365

Using Stream Deck with SCRIBE

December 27, 2017
Because I am doing a lot of work with SCRIBE Online, I find myself trying to come up with solutions to speed up my work.  One solution I came up with is using my Stream Deck to automate some of the commands I type within SCRIBE Online.  Before we go too far, let me bring you up to speed on what the Stream Deck is.

The Stream Deck is manufactured by Elgato.  It is a small, 15-key keyboard that can be customized.  What I like about it is how the keys are clean looking and can be easily changed by the application.  This hardware is marketed towards video editors and game streamers (neither of which I do).  I got it because of the number of keys it has, the ease with which you can update it, and because I didn't want to use macros on another external keyboard with paper labels stuck on the keys.

I will say that I am not only using the Stream Deck for SCRIBE, but also for other applications like Visual Studio, Windows shortcut keys, and launching other applications.  The reason I point it out here is that to use it with SCRIBE I had to do a little programming, as it can't launch .bat files like I was hoping.  So I reached into my C# toolkit and wrote a few small programs to accomplish what I wanted.

Since SCRIBE Online runs in the browser, I wanted to automate the process of typing out commands like "IF(ISNULLOREMPTY(entity.attribute), "TRUE", "FALSE")".  In the past I would have Notepad open with that string pasted in it; I would have to move my mouse over to it, copy it, paste it in the browser, and then update it with the proper syntax.  That is easy enough, but time consuming when done repeatedly.  Now, with my Stream Deck, I simply press a button and it puts that string value on my system clipboard, then I press another button and it pastes it where my cursor is.  This means I don't have to type it out, and I don't have to leave SCRIBE Online to go over to Notepad to copy it.  This saves a few seconds that add up over time when working on large integration or migration projects.
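For anyone curious, the core of one of those small helper programs is only a few lines.  A minimal sketch, assuming a Windows Forms reference (the formula string is just an example):

 using System;  
 using System.Windows.Forms;  
   
 static class ScribeSnippet  
 {  
      // Clipboard access on Windows requires a single-threaded apartment.  
      [STAThread]  
      static void Main()  
      {  
           // Put the formula template on the system clipboard...  
           Clipboard.SetText("IF(ISNULLOREMPTY(entity.attribute), \"TRUE\", \"FALSE\")");  
           // ...then paste it at the cursor by sending Ctrl+V to the active window.  
           SendKeys.SendWait("^v");  
      }  
 }  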

Here are the following commands I have built so far into my Stream Deck:

  • IF(ISNULLOREMPTY(),,)
  • IF(AND(),,)
  • IF(OR(),,)
  • CONCATENATE()
  • FORMAT("yyyy-MM-dd",)

All of those were just pasted into this post using the Stream Deck in about 2 seconds.  This is what my SCRIBE folder on the Stream Deck looks like right now:


I will be changing the icons to make them stand out from each other a little more, but I wanted to get it up and running quickly.  The middle row uses small applications that I wrote and published to GitHub for anyone else that might have a Stream Deck and wants to do this.  Here is the link to my GitHub project.

Here are some other links you might find interesting:
Amazon to purchase Stream Deck
Open Source .NET Library (Unofficial)
Stream Deck Website

I would be interested to hear if anyone else is using this device and if they have any ideas on how else we could use it for SCRIBE Online.
Scribe Online

Lookup Tables

December 19, 2017
While creating data mappings you may run into the need to store some variables in a table as key/value pairs.  This way you can call the variable by its key to get the value and pass it to the other system.  This is helpful when the 2 systems store data in different ways.  An example of this would be a table that stores United States state abbreviations as the key and the full state name as the value.  This is helpful when the source stores the full state name and the target stores the abbreviation, or vice versa.  So how do we set this up?

1) In the navigation bar click on "MORE".
2) Click on "Lookup Tables".
3) Click on the "+" sign on the right hand side.
4) In the pop-up input a Name.
5) Click the ellipsis button on the right under Description to get a dropdown with "Create", "Append" and "Export".
     5.1) Create - Click this to create a new Key / Value Pair in the table.
     5.2) Append - If you have a CSV file already created, this is how you import it.  I do want to point out that it needs to be a CSV separated by "," and nothing else, or the import will fail.
     5.3) Export - Exports the table.

Now that we have our table built, we need to consume it.  To do this we will use the functions found in the Lookup Functions.  Here are the specific ones:
  • LOOKUPTABLEVALUE1() - Looks in column 2 for the value we pass it and returns the value in column 1 in the same row.
  • LOOKUPTABLEVALUE1_DEFAULT() - Looks in column 2 for the value we pass it and returns the value in column 1 in the same row.  If nothing is found in the table, it returns the default value we pass it.
  • LOOKUPTABLEVALUE2() - Looks in column 1 for the value we pass it and returns the value in column 2 in the same row.
  • LOOKUPTABLEVALUE2_DEFAULT() - Looks in column 1 for the value we pass it and returns the value in column 2 in the same row.  If nothing is found in the table, it returns the default value we pass it.
SCRIBE provides some detailed documentation around lookup tables that I encourage you to check out.  Here is a link to the help documentation.  Make sure to check out the "See Also" section at the bottom of that page to get more details on each of the functions we can use with Lookup Tables.
Scribe Online

Option Set Helper Code

December 19, 2017
When it comes to working with Option Sets while creating a plugin, custom workflow activity or other extension to CRM, it can sometimes be a pain to get the label of the Option Set.  There are many solutions out there on how to accomplish this, but for the most part they were not reliable.  So after doing a bunch of research online and going through the SDK, I came up with 4 methods that help me when I am working with Option Sets: 2 for global option sets and 2 for local option sets.  The reason there are 2 for each is that one method returns the value and the other returns the label.

1) Get Local Option Set Label - This method will return the label of the option set value you provide it.
     /// <summary>  
     /// Method to get local option set label  
     /// </summary>  
     /// <param name="entityLogicalName">Schema name of entity</param>  
     /// <param name="optionSetName">schema name of the optionset</param>  
     /// <param name="value">value of the option set option</param>  
     /// <param name="service">CRM Organizational Service</param>  
     /// <returns>String Label of the optionset value provided</returns>  
     public string GetLocalOptionSetLabel(string entityLogicalName, string optionSetName, int value, IOrganizationService service)  
     {  
       RetrieveEntityRequest retrieveDetails = new RetrieveEntityRequest  
       {  
         EntityFilters = EntityFilters.All,  
         LogicalName = entityLogicalName  
       };  
       RetrieveEntityResponse retrieveEntityResponseObj = (RetrieveEntityResponse)service.Execute(retrieveDetails);  
       EntityMetadata metadata = retrieveEntityResponseObj.EntityMetadata;  
       PicklistAttributeMetadata picklistMetadata = metadata.Attributes.FirstOrDefault(attribute => string.Equals(attribute.LogicalName, optionSetName, StringComparison.OrdinalIgnoreCase)) as PicklistAttributeMetadata;  
       OptionSetMetadata options = picklistMetadata.OptionSet;  
       IList<OptionMetadata> OptionsList = (from o in options.Options  
                          where o.Value.Value == value  
                          select o).ToList();  
       string optionsetLabel = (OptionsList.First()).Label.UserLocalizedLabel.Label;  
       return optionsetLabel;  
     } 

2) Get Local Option Set Value - This method will return the value of the option set label you pass it.
     /// <summary>  
     /// Method to get local option set value  
     /// </summary>  
     /// <param name="entityLogicalName">Schema Name of entity</param>  
     /// <param name="optionSetName">Schema Name of option set.</param>  
     /// <param name="label">text label</param>  
     /// <param name="service">CRM Org Service</param>  
     /// <returns>Int option set value</returns>  
     public int GetLocalOptionSetValue(string entityLogicalName, string optionSetName, string label, IOrganizationService service)  
     {  
       RetrieveEntityRequest retrieveDetails = new RetrieveEntityRequest  
       {  
         EntityFilters = EntityFilters.All,  
         LogicalName = entityLogicalName  
       };  
       RetrieveEntityResponse retrieveEntityResponseObj = (RetrieveEntityResponse)service.Execute(retrieveDetails);  
       EntityMetadata metadata = retrieveEntityResponseObj.EntityMetadata;  
       PicklistAttributeMetadata picklistMetadata = metadata.Attributes.FirstOrDefault(attribute => string.Equals(attribute.LogicalName, optionSetName, StringComparison.OrdinalIgnoreCase)) as PicklistAttributeMetadata;  
       OptionSetMetadata options = picklistMetadata.OptionSet;  
       IList<OptionMetadata> OptionsList = (from o in options.Options  
                          where o.Label.UserLocalizedLabel.Label == label  
                          select o).ToList();  
       int optionsetValue = (OptionsList.First()).Value.Value;  
       return optionsetValue;  
     }  

3) Get Global Option Set Label - This method will return the label of a global option set value you pass it.
     /// <summary>  
     /// Method to get global option set label  
     /// </summary>  
     /// <param name="optionSetName">Schema name of option set</param>  
     /// <param name="value">Int value of option set value</param>  
     /// <param name="service">CRM Org service</param>  
     /// <returns>String option set label</returns>  
     public string GetGlobalOptionSetLabel(string optionSetName, int value, IOrganizationService service)  
     {  
       XrmServiceContext crmContext = new XrmServiceContext(service);  
       string optionSetLabel = "";  
       RetrieveOptionSetRequest retrieveOptionSetRequest = new RetrieveOptionSetRequest { Name = optionSetName };  
       RetrieveOptionSetResponse retrieveOptionSetResponse = (RetrieveOptionSetResponse)crmContext.Execute(retrieveOptionSetRequest);  
       OptionSetMetadata retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;  
       OptionMetadata[] optionArray = retrievedOptionSetMetadata.Options.ToArray();  
       foreach (var option in optionArray)  
       {  
         if(option.Value.Value == value)  
         {  
           optionSetLabel = option.Label.LocalizedLabels[0].Label;  
         }  
       }  
       return optionSetLabel;  
     }  

4) Get Global Option Set Value - This method will return the value of the global option set label you pass it.
     /// <summary>  
     /// Method to get the option set value for a global optionset  
     /// </summary>  
     /// <param name="optionSetName">Schema name of the optionset</param>  
     /// <param name="label">text label</param>  
     /// <param name="service">CRM Org Service</param>  
     /// <returns>Int option set value</returns>  
     public int GetGlobalOptionSetValue(string optionSetName, string label, IOrganizationService service)  
     {  
       XrmServiceContext crmContext = new XrmServiceContext(service);  
       int optionSetValue = 0;  
       RetrieveOptionSetRequest retrieveOptionSetRequest = new RetrieveOptionSetRequest { Name = optionSetName };  
       RetrieveOptionSetResponse retrieveOptionSetResponse = (RetrieveOptionSetResponse)crmContext.Execute(retrieveOptionSetRequest);  
       OptionSetMetadata retrievedOptionSetMetadata = (OptionSetMetadata)retrieveOptionSetResponse.OptionSetMetadata;  
       OptionMetadata[] optionArray = retrievedOptionSetMetadata.Options.ToArray();  
       foreach (var option in optionArray)  
       {  
         if (option.Label.LocalizedLabels[0].Label == label)  
         {  
           optionSetValue = option.Value.Value;  
         }  
       }  
       return optionSetValue;  
     }  
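A quick usage sketch from inside a plugin or console application; the entity, option set and values here are hypothetical:

 // Hypothetical usage - substitute your own entity, option set and values.  
 string localLabel = GetLocalOptionSetLabel("account", "new_accounttype", 100000000, service);  
 int localValue = GetLocalOptionSetValue("account", "new_accounttype", "Partner", service);  
 string globalLabel = GetGlobalOptionSetLabel("new_globalstatus", 100000001, service);  
 int globalValue = GetGlobalOptionSetValue("new_globalstatus", "Approved", service);  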
Microsoft Dynamics 365

XrmToolBox Bulk Attachment Manager - Version 1.0.1.0

December 14, 2017
I am pleased to announce that today I have published version 1 of my Bulk Attachment Manager plugin for XrmToolBox.  This is an open source plugin that I created because I have needed, multiple times, to migrate attachments out of CRM and into another system.  Because of the way CRM stores attachments in the database, migrating them is sometimes harder said than done.  So to make this process easy I came up with this tool.  Before I go any further, I am sure you want the important links so you can take a look.  Here they are:

GitHub Project
Project Wiki
Nuget Package

Right now, this plugin can only download note attachments from CRM.  In future releases I will turn on email attachment downloads and note attachment uploads (the notes will need to already exist in the system).  To download note attachments, you have 2 options: download all note attachments, or only select note attachments.  For the latter, you will need to provide a .csv file with the GUIDs of the notes that contain the attachments you want to download.

Attachments download into a folder called "Note Attachments" in the directory you choose.  In that folder will be a folder for each note, named with the GUID of the note record.  Inside those folders are the attachments themselves.  For everything that is downloaded, there is an output screen showing what has been downloaded, along with data on the regarding record.  Once the process is done, you can export these results into a pipe-delimited .csv file.

If you have any input on this plugin or experience any issues please leave a comment in the issue tab of the GitHub project.  Thank you.
Microsoft Dynamics 365XrmToolBox

Using SCRIBE Online For Scheduled Tasks

December 10, 2017
While doing client work, I ran into a need to scan for records in Microsoft Dynamics 365 where a date value is 3 months in the future (i.e. 3 months from today).  There are a few options to accomplish this:
  1. Microsoft Dynamics 365 SDK (Developer Guide)
  2. Timed workflows
  3. SCRIBE Online
The issue with using the Microsoft Dynamics 365 SDK (Developer Guide) is that you need a software developer to create a console application, linked to a Windows scheduled task, that runs every night to connect to CRM, query the records and make the changes.  This means if you need any updates to the application, you need a developer to change source code.  This is a viable option if you have an internal development team that can support the application.
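For reference, the heart of such a console application is a small query.  A rough sketch with a hypothetical entity and date field:

 // Hypothetical sketch: find records whose renewal date is exactly 3 months out.  
 QueryExpression query = new QueryExpression("new_membership");  
 query.ColumnSet = new ColumnSet("new_name", "new_renewaldate");  
 // ConditionOperator.On matches a specific date.  
 query.Criteria.AddCondition("new_renewaldate", ConditionOperator.On, DateTime.Today.AddMonths(3));  
 EntityCollection results = service.RetrieveMultiple(query);  
 foreach (Entity record in results.Entities)  
 {  
      // Make whatever change kicks off the downstream process.  
 }  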

When it comes to using timed workflows with wait statements, we would need to have the workflow fire at a specific time for the record.  In this scenario, this means the workflow needs to fire to start the countdown (wait) when the record is created or when the date field we are monitoring changes.  The problem with this is that the more waiting workflows you have in the system, the more you will see performance degrade.  Also, if someone manually changes the date field, you will need to find the previously waiting workflow and stop it, or you will have duplicate workflows waiting.  This can be a nightmare to manage.

The solution that we came up with for this client was to use SCRIBE Online to perform these actions.  Typically, when we think of SCRIBE Online, we think about connecting 2 systems, replicating data to another database or migrating data from one system to another.  There is more we can do with it, and this is one of those areas.  In this scenario, we have an integration map that is set up with only 1 connection as both the target and the source (Microsoft Dynamics 365).  We set the solution containing the mapping to run nightly and query for any records where the date field on the record is 3 months from the date the mapping runs.  If a record matches, the map creates a record in CRM (a triggering entity) that kicks off a workflow to generate a bunch of related records for the account with that date field.

With the above setup, there is no need for a developer to update code, as the process can be updated with the internal CRM workflow and SCRIBE Online, both of which use an easy-to-understand GUI.  By moving the countdown outside of CRM, we also make sure that the CRM system won't lose performance.

The main reason for this blog post is to show that we can use SCRIBE Online for more than just data migration, integration and replication.  The more I work within SCRIBE Online, the more I see its use for other tasks.
Microsoft Dynamics 365Scribe Online

SCRIBE Online Training and Resources

November 30, 2017
SCRIBE Online is a massive tool that can be intimidating the first time you work with it.  Luckily, SCRIBE provides a lot of documentation and training to help you get started with SCRIBE Online.  When I first started working with SCRIBE Online, it was in what they call the "Classic UI".  The latest version of SCRIBE is called the "Crystal UI".  I will say that there were a lot of changes between Classic and Crystal that took me a little while to grasp.  But SCRIBE does a great job putting out content to help new users, and even experienced users, keep up to date with the platform.

In this post I wanted to go over some of the resources that SCRIBE offers to everyone for free, so you can get started with your first mapping or refresh your knowledge of SCRIBE Online.  First, we will talk about their webinars.  SCRIBE does regular webinars about its platform; past webinars can be found here.

Next, SCRIBE provides excellent video tutorials for SCRIBE Online. As of this writing, here is a list of the videos with a small description of each:

  1. Crystal vs Classic UI - A Comparison.  This is a comparison of the classic UI to the Crystal UI.  I recommend this if you are familiar with the classic UI, but haven't worked in the Crystal UI yet.  If you are completely new to SCRIBE Online, you can skip this video.
  2. Introduction to Scribe Online.  As the title states this is an intro to SCRIBE Online platform and is good for people just getting into SCRIBE.
  3. Scribe Online Agent and its Setup.  This video goes over the agents, what they do and how to install them.
  4. Scribe Online Replication Service (RS).  Video that reviews Scribe's replication service.  Includes building solutions for creating and scheduling replications.
  5. Scribe Online Integration Service (IS). Video that reviews Scribe's integration service.  Includes building solutions and mappings for integrations.
  6. Scribe Online Event Maps.  Video that reviews Event driven mappings.  Also goes over how to set these mappings up.
If you need further help with SCRIBE Online please use the community forums to ask questions and search for answers.  Also, don't forget that you can sign up for a free trial to give SCRIBE a go.  Here is how you sign up for your free trial:
  1. Go to this link.
  2. Towards the middle of the page click on "Register For A Free Trial>".  Wait for the new page to load.  For me, this took about 2 minutes to complete.
  3. When the page loads fill in the form.  I recommend checking the check box to get updates.  Don't worry, SCRIBE won't spam you with information, like some companies do.  They will send you only needed information and updates.

Scribe Online

SCRIBE Online Status Monitor

November 16, 2017
When working with cloud solutions, it's always important to understand the options you have available to monitor the service.  For SCRIBE Online, there is a status website where you can see the current status of:

  • Agents
  • API
  • Connectors
  • Event Solutions
  • User Interface
  • Sandbox
Not only can you see current statuses, but you can also see information about current and past incidents.  They also offer the ability to sign up for alerts by email, text and RSS.  Personally, I am signed up for email and text alerts.  To help keep the team of people I work with updated, I set up an RSS notification within our Microsoft Teams account.  I also recommend that, if you are a consultant, you let any client you have set up with SCRIBE integrations know about this page.

Here are some helpful tips for working with the site:
  • Subscribing To All Updates:
    1. Navigate to https://trust.scribesoft.com/
    2. Click on the "SUBSCRIBE TO UPDATES" button
    3. Choose how you want to subscribe:
      • Envelope = email
      • Phone = text message
      • Comment Bubble = support page
      • RSS = subscribe by RSS or Atom feeds
  • Subscribe To Current Incident:
    • If there is a current incident you will see a yellow banner with a message under it.  To subscribe to only that incident, do the following:
    1. In the yellow banner click on "Subscribe"
    2. Input your email address and/or phone number in the pop-up
    3. Click "SUBSCRIBE TO INCIDENT"
As I stated above, if there is a current incident happening you will see it in yellow as soon as you load the page.  Scrolling down the page, you will see the current statuses for each of the items I outlined at the beginning of this post.  Under that you will see roughly the last 2 weeks broken down by day, so you can see if anything happened on those days.  At the very bottom of the page you will see "Incident History"; clicking on it takes you to a page where you can see past incidents by month.
Scribe Online

Documenting Maps

November 9, 2017
When we are creating mappings in SCRIBE, it's important that we document those maps.  This is a two-stage process.  First, we want to export the map and save the JSON file into the source control system.  This way we have a backup copy and can reuse the map if we need it again, similar to a template.  Second, we want supporting documentation so others can understand the process.

In past projects I have taken a more manual approach to creating the supporting documentation for my integration and migration mappings, which can be time consuming to say the least.  During one of these exercises, I found the SCRIBE Documentation Tool.  I walked through the steps they outlined and it auto-generated the documentation for me.  This has greatly sped up my work and gives me a spreadsheet that is easy to understand.

To create this documentation you will need a Google account, as it uses Google Docs.  Here is a link to the detailed step-by-step process provided by SCRIBE.  Here is the high-level process:

1) Allow API access to your SCRIBE Online organization.
2) Access the Documentation Tool by clicking this link. (Opens Google Spreadsheet).
3) Save a copy.  This is an important step if you create more than one of these, so you don't overwrite a previous map's documentation.  Saving a copy is also what makes "SCRIBE" appear in the menu next to "Help".
4) Click SCRIBE -> Documentation Solution and follow the prompts.
5) Once the process starts, depending on the number of mappings in the solution it can run quickly or slowly.  Just wait for it to finish.

When the process is finished you will see the following:

  • Org Details - This contains info about your org.
  • Solution Details - Details about the solution.
  • All tabs after the first 2 are the individual mappings in the solution.
    • The top section of each mapping tab will contain the high level map info.
    • After that section comes the mapping itself.  Each block is highlighted, so it is easy to navigate between them.  Under each block is the information contained in that block and the attributes used, with what they are linked to or the conversion code we have written.
I hope this has helped you with documenting your data mappings.  Please check out the full documentation at SCRIBE's website.  Also SCRIBE provides the source code if you want to modify this tool for your specific needs.


Scribe Online

Goodbye SDK...Hello Developer Guide

November 8, 2017
With the release of Microsoft Dynamics 365 version 9, we are saying goodbye to the large SDK we are used to downloading to extend Microsoft Dynamics, and saying hello to the Developer Guide for Dynamics 365 Customer Engagement.  Here are some highlights of the version 9 release.

1) Previous SDK versions can still be downloaded for previous releases of Microsoft Dynamics CRM.

2) Per the Developer Guide, Dynamics 365 version 9 is an online-only release.

3) To get the early bound generator, plugin registration tool and other CRM tools, you will need to use PowerShell and NuGet.  Here is a link provided by Microsoft on how to get these tools.

4) Webhooks have been added to the plugin registration tool.  This allows for easy integration with Azure Functions.

5) As a developer, you have the ability to download the Developer Guide as a PDF for offline viewing.  To do this, simply click on "Download To PDF" at the bottom left of the Developer Guide screen.

6) In previous versions of the SDK, we were provided with sample code to help us quickly get up and running.  To get sample code and projects now, you will need to check out MSDN and GitHub.

There is a ton of information in the new Developer Guide and I highly recommend checking it out.  As I go through it, I will be writing posts about specifics in the new Developer Guide.  This post was meant to serve as a high-level overview and introduction.
Microsoft Dynamics 365

Setup Non-Interactive User with Non-Expiring Password

November 2, 2017
When we need to connect CRM to another system, it is important that the connection remain working so data can flow easily between the systems.  One common issue is that the password for the user account used to create the connection can expire.  When this happens, it could be minutes to days before the issue is found, and that can lead to data synchronization between the systems getting messed up.  To mitigate this risk we can easily set up a user with a non-expiring password.  The user type for this in CRM is "Non-Interactive".  Non-interactive users can't log into CRM via the front end; if you try, you will get an error message.

The other benefit of using a non-interactive user account is that it doesn't require a CRM license to work.  You will only need to assign a license to the account for about 5 minutes to set it up for the first time.  After that you can remove the license and the account will remain active in CRM.  You are allowed to have 5 non-interactive users.

Now let's get to the instructions for setting this up.  You will need an admin account for CRM and Office 365, and you will also need PowerShell installed.

Setup Non-Interactive User:
  1. Create a user in the Office 365 admin center.  Be sure to assign a Dynamics 365 (online) license to the account.
  2. Go to Dynamics 365 (online).
  3. Go to Settings > Security.
  4. Choose Users > Enabled Users, and then click a user’s full name.
  5. In the user form, scroll down under Administration to the Client Access License (CAL) Information section and select Non-interactive for Access Mode.  You then need to remove the Dynamics 365 (online) license from the account.
  6. Go to the Office 365 admin center.
  7. Click Users > Active Users.
  8. Choose the non-interactive user account and under Product licenses, click Edit.
  9. Turn off the Dynamics 365 (online) license, and then click Save > Close multiple times.
  10. Go back to Dynamics 365 (online) and confirm that the non-interactive user account Access Mode is still set for Non-interactive.
  11. In CRM assign a role to the user account.
Setup Non-Expiring Password:
Important Links:
Download Azure PowerShell Version 1 (File at bottom of page)
Steps:
Initial Setup: (For installs always use the x64 version) – This only needs to be done once
  1. Install Microsoft Online Services Sign-In Assistant for IT Professionals RTW
  2. Install Microsoft Azure Active Directory Module for Windows PowerShell.  I did run into an issue here and needed to also install Azure AD PowerShell Version 1.
In PowerShell, run the following commands, replacing serviceaccount@contoso.com with the user name of the account you want the non-expiring password on:
  1. Import-Module MSOnline
  2. Connect-MsolService
    a. You will get a pop-up and need to log in with a global admin account
  3. Set-MsolUser -UserPrincipalName serviceaccount@contoso.com -PasswordNeverExpires $true
  4. Confirm the process worked by running the following command:
    a. Get-MsolUser -UserPrincipalName serviceaccount@contoso.com | Select PasswordNeverExpires
    b. If successful, PasswordNeverExpires will come back as True

Troubleshooting:
Not Digitally Signed Error: run this command in PowerShell:
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
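
With the account and password configured, an external process can connect using it.  Here is a minimal sketch using the Microsoft.Xrm.Tooling.Connector NuGet package; the URL and credentials are placeholders, and this is just one way to build the connection:

 using Microsoft.Xrm.Tooling.Connector;  
  
 // Connect to Dynamics 365 (online) with the non-interactive service account.  
 // All values below are placeholders.  
 var connectionString = "AuthType=Office365;Username=serviceaccount@contoso.com;" +  
     "Password=passwordHere;Url=https://myorg.crm.dynamics.com";  
  
 var client = new CrmServiceClient(connectionString);  
  
 if (client.IsReady)  
 {  
     // CrmServiceClient implements IOrganizationService, so it can be passed  
     // to any code that expects one.  
     Console.WriteLine("Connected to: " + client.ConnectedOrgFriendlyName);  
 }  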

Microsoft Dynamics 365

Updated To Version 1.7 - Azure Blob Storage For Storing Microsoft Dynamics 365 Attachments

October 25, 2017

In another post I talked about using Azure Blob Storage to store CRM attachments in Azure rather than in CRM.  Today I installed the app in a client's CRM system and found that Microsoft Labs has updated the app to version 1.7.  This release includes a lot of changes, from the UI to how notes attachment movement is set up, and they now allow moving existing attachments.  Here are the updated steps:

1) Go to AppSource.
2) Install the Attachment Management app by clicking the "Get It Now" button under the image.
3) While it's installing, go to Azure and set up your blob storage account.
4) For each CRM instance you will need to set up 2 containers in your blob storage: one for email attachments and one for notes attachments.  So if you have a production instance and a sandbox instance, you will need 4 containers (DevEmails, DevNotes, ProdEmails, ProdNotes).  If you want to separate note attachments by entity, then create one container per entity; DevNotes or ProdNotes will be used as the default location unless you specify a specific entity container for notes.
5) Still in Azure, click on "Shared Access Signatures".  Here you will set up your SAS key.  Complete all sections and click on "Generate SAS".
6) Copy the SAS key; we will need it later.
7) Go back to CRM and check that the Attachment Management app finished installing.
8) In Settings -> Customizations -> Solutions, click on Publish All Customizations.
9) Refresh your screen.
10) You will see a new link in the navigation bar for Azure Attachment Storage. Click on it and then on Azure Attachment Storage Setup.
11) Fill in the fields.
      - Name = The name of the blob storage account in Azure
      - SAS Token = Get this from step 6
      - Notes Attachment Container = Name of the container in Azure
      - Email Attachment Container = Name of the container in Azure
12) Click Confirm.
13) Go back to the navigation bar and click on Notes Attachment Entity Settings.  Here is where you choose which entities note attachments will be uploaded to Azure for; you must choose each entity.  If you set up multiple containers in step 4, specify them here. ***SKIP ANYTHING WITH ADX_ IN THE ENTITY NAME.  DO NOT ADD THESE TO MOVE ATTACHMENTS TO AZURE.  IT WILL BREAK YOUR CRM PORTAL.*** I did notice that the page seemed to hang when I selected everything at once, so I did it in groups and it worked fine.  Make sure to leave all entities selected that you want to automatically move attachments for; once unselected, they will stop moving.

That's it, your setup is now complete.

If you need to move existing attachments to Azure, do the following:
1) In the navigation, click on Azure Attachment Storage and then on Reports and admin.
2) Choose what you want to move in the left-hand column and click on Move to Blob.  You should see the number on the left decrease and the number on the right increase.

For additional information, including setting up bulk attachment upload, see the User Guide.
Microsoft Dynamics 365

Azure Blob Storage For Storing Microsoft Dynamics 365 Attachments

October 21, 2017
In my Living In SCRIBE Online blog I talked about work I did to migrate CRM 4.0 on-premise to Dynamics 365 using SCRIBE Online.  In that blog I mentioned that there were about 80 GB worth of attachments that had to be moved into the cloud.  That could quickly get expensive if we stored them directly in CRM, and this client wasn't using SharePoint.  So what option do we have to store this data so end users can still access it?  Why not Azure Blob Storage?

Microsoft has an app in AppSource to move attachments from CRM to Azure blob storage.  Not only does it allow for this, but it also has a web resource so you can allow bulk uploads.  Here is how to set it up:

1) Go to AppSource.
2) Install the Attachment Management app by clicking the "Get It Now" button under the image.
3) While it's installing, go to Azure and set up your blob storage account.
4) For each CRM instance you will need to set up 2 containers in your blob storage: one for email attachments and one for notes attachments.  So if you have a production instance and a sandbox instance, you will need 4 containers (DevEmails, DevNotes, ProdEmails, ProdNotes).
5) Still in Azure, click on "Shared Access Signatures".  Here you will set up your SAS key.  Complete all sections and click on "Generate SAS".
6) Copy the SAS key; we will need it later.
7) Go back to CRM and check that the Attachment Management app finished installing.
8) In Settings -> Customizations -> Solutions, click on Publish All Customizations.
9) Refresh your screen.
10) You may not see a link anywhere in the navigation to the Azure blob storage entity, so you can either turn it on to appear in the Settings area or just use Advanced Find to get to the entity.  It is called Azure Blob Storage Settings.
11) Add a new record.
      - Name = The name of the blob storage account in Azure
      - SAS Token = Get this from step 6
      - Notes Attachment Container = Name of the container in Azure
      - Email Attachment Container = Name of the container in Azure
12) Save.

Your attachments in CRM will now be stored in Azure.  You will still be able to access the notes within CRM with no problem at all, including downloading them from CRM.  One thing to note: this will not move attachments that are already in CRM, so it is best to install it early on if you are not going to use SharePoint integration.
Microsoft Dynamics 365

SCRIBE Connector Development - Handling Array List

October 21, 2017
Are you working on creating a connector with SCRIBE's CDK?  In your connector, do you have an array or list of strings that you need to pass?  SCRIBE makes this easy to do within the CDK and SCRIBE Online.

I came across this scenario on a connector I was creating that passes a JSON message to an API.  The JSON message included a list of strings for entity IDs. Here is an easy way to accomplish this:

1) Create your Property Definition as part of your Object Definition.
1:  new PropertyDefinition  
2:  {  
3:       Description = "Use TOLIST() to pass in a list of entity id's.",  
4:       FullName = "Entity IDs",  
5:       IsPrimaryKey = false,  
6:       MaxOccurs = 1,  
7:       MinOccurs = 0,  
8:       Name = "PublishTo",  
9:       Nullable = true,  
10:      NumericPrecision = 0,  
11:      NumericScale = 0,  
12:      PresentationType = "string",  
13:      PropertyType = typeof(string).Name,  
14:      UsedInActionInput = true,  
15:      UsedInActionOutput = true,  
16:      UsedInLookupCondition = true,  
17:      UsedInQueryConstraint = true,  
18:      UsedInQuerySelect = true,  
19:      UsedInQuerySequence = true  
20:  }  

Here I want to note lines 12 and 13.  Set these to the value type that your list will contain; in this case we have a list of strings.

2) Publish the connector so you can use it in SCRIBE Online.
3) Create your mapping, and for this attribute use the TOLIST() function.  It can be found under Functions -> Conversion -> TOLIST().

This function takes 6 arguments, all separated by commas:
1) The list of items. Example: "ABC123, DEF456, GHI789" - Here you can use a function to read from a source instead of typing in all the data.
2) The delimiter used to separate the list. Example: ","
3) The type of the data.  Example: "string"
4) How to sort the data.  Example: "a" for ascending.
5) Do you want to remove quotes?  Example: true.
6) Do you want to remove empty values? Example: true.

Here is the expression completed with the examples above:
 TOLIST("ABC123, DEF456, GHI789", ",", "string", "a", true, true)  

Here is the output:
         "EntityIDs":   
         [  
           "ABC123",  
           "DEF456",  
           "GIH789"  
         ]  
Scribe Online

Using SCRIBE Online To Migrate From CRM 4.0 to Dynamics 365

October 21, 2017
Recently I had a client that needed to be migrated from CRM 4.0 on-premise to Dynamics 365 online.  First, I want to say that we opted not to do the step-by-step upgrade (CRM 4.0 to CRM 2011 to CRM 2013 to CRM 2015 to Dynamics 365, all on-premise) because the client wanted to start fresh with their CRM customizations, so that upgrade process would have been a waste of time.

The following issues were identified with the data migration:
1) CRM 4.0 is not supported by SCRIBE Online.
2) The server OS that CRM 4.0 and SQL Server ran on was not supported.
3) The version of SQL Server was not supported.

How am I going to migrate this data? Hmm.....

The solutions:
1) The RDP Terminal Server I am using has a supported server OS.
2) I am able to connect to the CRM 4.0 SQL Server database with ODBC.
3) Install the SCRIBE On-Premise Agent on the Terminal Server and use the ODBC connector to retrieve the data.

By using the ODBC connector, I could access the data in SCRIBE Online in a supported way.  With this approach the migration ran a little slower, because running multiple maps at the same time over the same ODBC connection would throw a connection error.

One piece I was not able to migrate with SCRIBE Online was the attachments stored in CRM 4.0, which were about 80 GB worth.  I couldn't migrate these because of changes in how CRM stores them.  So to accomplish this I did the following:

1) I downloaded this tool's source code from GitHub.
2) I modified it to read a list of guids.
3) I exported a list of attachment guids to csv.
4) I modified the application to download the attachments into one root folder, with a subfolder per attachment named with its guid and the attachment inside.
5) I then used the Microsoft Dynamics 365 SDK to create a console application to upload the attachments to Dynamics 365 and re-link them to the parent records.  A rough sketch of that upload step is below.
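
Here is a minimal sketch of that upload.  It assumes the subfolder names are the guids of the parent records, that the parents live in a hypothetical new_example entity, and that an IOrganizationService named service is already connected; my actual application also handled guid mapping and errors:

 using System;  
 using System.IO;  
 using Microsoft.Xrm.Sdk;  
  
 // Walk the root folder, and for each guid-named subfolder create a note  
 // (annotation) in Dynamics 365 linked to the parent record.  
 string rootPath = @"C:\Attachments";  
  
 foreach (string folder in Directory.GetDirectories(rootPath))  
 {  
     var parentId = new Guid(Path.GetFileName(folder));  
  
     foreach (string file in Directory.GetFiles(folder))  
     {  
         var note = new Entity("annotation");  
         note["objectid"] = new EntityReference("new_example", parentId); // hypothetical parent entity  
         note["filename"] = Path.GetFileName(file);  
         note["documentbody"] = Convert.ToBase64String(File.ReadAllBytes(file));  
         service.Create(note);  
     }  
 }  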

Once I made the code changes and wrote the application, the download and upload ran overnight.

Now you might be asking yourself, "How did you store 80 GB of attachments in CRM Online?  Isn't that expensive?"  I will be posting a separate blog on that in my Dynamics 365 blog.
Microsoft Dynamics 365Scribe Online

Creating Pre and Post Build Events For Connector Creation

October 19, 2017
While building a custom connector using the CDK, I found it time consuming to manually stop the SCRIBE Agent service, copy the connector .dll to the connector folder in the agent directory and restart the service. To overcome this I added pre and post build events to my project in Visual Studio. Here is how you set up your development environment to automate all of this:

1) Right click on your project and go to Properties.
2) On the left side, go to Build Events.
3) In "Pre-build event command line" input this:
1:  net stop "Scribe Online Agent"  
2:  Exit /b 0  
4) In "Post-build event command line" input this:
1:  copy /Y "$(TargetDir)$(ProjectName).dll" "C:\Program Files (x86)\Scribe Software\AGENTNAME\Connectors\CONNECTORNAME\$(ProjectName).dll"  
2:  net start "Scribe Online Agent"  

Replace AGENTNAME with the name of your agent and CONNECTORNAME with the name of the folder where you place the DLL file.
What this does is stop the service, build your connector, copy the connector DLL to the agent folder and then start the service.  The `Exit /b 0` in the pre-build event keeps the build from failing when the service is already stopped, since `net stop` returns a non-zero exit code in that case.
Scribe Online

Using Session Memory with CRM Portals

October 19, 2017
Recently, while doing some client work, we noticed that CRM Portals does a post back when adding information within a sub-grid on an entity form.  Why is this an issue?  Because if you have input fields on the form, their values are not written to CRM until the save button at the bottom of the form is clicked.  So if a user inputs anything into the sub-grid after filling in the fields, the post back will wipe out what the user entered.  This makes for a bad user experience.  To overcome this, we can use session storage to temporarily hold the values until the browsing session ends.

To use session storage, the first thing we need to do is register onChange handlers that update the values in session storage when the user changes a value.  Here are some example onChange handlers written in jQuery:
1:  $(document).on('change', '#CHECKBOX ID', function () { SetSessionValue("#CHECKBOX ID") });  
2:  $(document).on('change keyup paste', '#TEXTBOX ID', function () { SetSessionValue("#TEXTBOX ID") });  
#1 is for check boxes and picklists
#2 is for textbox inputs

SetSessionValue is a reusable function I created to handle booleans, strings and picklists (check my blog post on DateTime fields for how to handle onChange events with them).  Before we get to the source code: CRM Portals appends _0 and _1 to the IDs of boolean inputs on the portal, which is why you see me add "_1" in the source code below:
1:  function SetSessionValue(fieldname) {  
2:    if ($(fieldname).hasClass('boolean-radio')) {  
3:      if ($(fieldname + "_1").is(':checked')) {  
4:        sessionStorage.setItem(fieldname, true);  
5:      }  
6:      else {  
7:        sessionStorage.setItem(fieldname, false);  
8:      }  
9:    }  
10:    else {  
11:      var fieldValue = null;  
12:      if ($(fieldname).hasClass('picklist')) {  
13:        fieldValue = $(fieldname).find(":selected").val();  
14:      }  
15:      else {  
16:        fieldValue = $(fieldname).val();  
17:      }  
18:      sessionStorage.setItem(fieldname, fieldValue);  
19:    }  
20:  }  

To get the value out of session storage, I created a small reusable function as well:
1:  function GetSessionValue(fieldname) {  
2:    var sessionValue = sessionStorage.getItem(fieldname);  
3:    return sessionValue;  
4:  }  

Now with these pieces we are writing to and reading from session storage.  All you need to do now is add a window.onload = function(){} or a $(document).ready to your code to populate your fields when a post back occurs; a sketch of this is below.
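
For example, here is a minimal sketch that restores a single text box (the #TEXTBOX ID selector is a placeholder, and boolean and picklist fields would need the same special handling as in SetSessionValue):

 $(document).ready(function () {  
   // Repopulate the field from session storage after a post back.  
   var savedValue = GetSessionValue("#TEXTBOX ID");  
   if (savedValue !== null) {  
     $("#TEXTBOX ID").val(savedValue);  
   }  
 });  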
Microsoft Dynamics 365

CRM Portals OnChange Event For DateTime

October 18, 2017
Occasionally, when working with CRM Portals, you may need an OnChange event for a date time field.  Because of the way CRM Portals renders date time fields, this is slightly more complicated than working with text boxes, check boxes and pick lists.  I ended up reaching out to Microsoft for help on this, and below are the steps they provided:

1) Click on the date time control.
2) Press F12.
3) In the console, type $('div.control') and hit enter (this will give you a list of div controls).
4) Locate the div control for the date time field.
5) Go to the entity form or web page in CRM and add the following code snippet (replace the 2 with the index where your div control is located):

1:      $(document).ready(function ()  
2:      {  
3:        var dpcontrol = $('div.control')[2];  
4:        $(dpcontrol).on("dp.change", function (e)   
5:        {  
6:           alert("On change Event Triggered");  
7:           alert(e.date);  
8:        });  
9:      });  

Change the 2 alerts to whatever you need to happen during the OnChange event; for example, you could pair this with the session storage approach from my previous post, as sketched below.
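
Here is a minimal sketch of that idea (the "myDateField" session storage key is a placeholder of your choosing):

 $(document).ready(function () {  
   var dpcontrol = $('div.control')[2];  
   $(dpcontrol).on("dp.change", function (e) {  
     // Store the new date in session storage so it survives a post back.  
     // "myDateField" is a placeholder key; use the field's ID in practice.  
     sessionStorage.setItem("myDateField", String(e.date));  
   });  
 });  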
Microsoft Dynamics 365

        Have a topic you’d like us to cover?

        Let’s start a conversation—your challenges often make for the best posts.