Knowledge Base Management in Data Quality Services (DQS)

The DQS knowledge base is a repository of metadata that is used to improve data quality through data cleansing and data matching. A knowledge base consists of domains, each of which represents the data in a data field. DQS knowledge management includes the processes used to create and manage the knowledge base, both in a computer-assisted manner and interactively.

Knowledge Discovery

Knowledge discovery is a computer-assisted process that analyzes samples of data to build knowledge about the data. After the analysis completes, we can validate and enhance the knowledge, and then apply it to perform data cleansing, matching, and profiling.

To prepare knowledge for a data quality project, we build and maintain a knowledge base (KB) that DQS can use to identify incorrect or invalid data. DQS enables us to use both computer-assisted and interactive processes to create, build, and update a knowledge base. Knowledge in a knowledge base is maintained in domains, each of which is specific to a data field. The knowledge base is a repository of knowledge about our data that enables us to understand the data and maintain its integrity.

A DQS knowledge base has the following benefits:

  • Building knowledge about data is a detailed process. The DQS process of extracting knowledge about the data automatically, from sample data, makes this much easier.
  • DQS enables us to see its analysis of the data and to augment the knowledge in the knowledge base by creating rules and changing data values. We can do so repeatedly to improve the knowledge over time.
  • We can leverage pre-existing data quality knowledge by basing a knowledge base on an existing KB, importing domain knowledge from files into the KB, importing knowledge from a project back into a KB, or using the DQS default KB, DQS Data.
  • We can ensure the quality of data by comparing it to the data maintained by a reference data provider.
  • There is a clear separation between building a knowledge base and applying it in the data correction process, which gives flexibility in how to build and update the knowledge base.

We can use the Data Quality Client application to execute and control the computer-assisted steps, and to perform the interactive steps. The knowledge discovery activity builds the knowledge base by analyzing a sample of data for data quality criteria, looking for data inconsistencies and syntax errors, and proposing changes to the data. This analysis is based on algorithms built into DQS.

We can prepare the process by linking a knowledge base to a SQL Server database table or view that contains sample data similar to the data that the knowledge base will be used to analyze. We can then map a knowledge base domain to each column of sample data to be analyzed. A domain can either be a single domain that is mapped to a single field, or it can be a composite domain that consists of multiple single domains, each of which is mapped to part of the data in a single field. When we run knowledge discovery, DQS extracts data quality information from the sample data into domains in the knowledge base.

We can manually add value changes and we can import domain values from an Excel file. In addition, we can run the knowledge discovery process again at a later point if the data in the sample has changed. We can apply more knowledge from within the Domain Management activity and from within the Data matching activity.

The knowledge discovery process need not be performed on the same data that data correction is performed on. DQS provides the flexibility to create knowledge from one set of database fields and apply it to a second set of related data that needs to be cleansed. The data steward can create a new knowledge base from scratch, base it on an existing knowledge base, or import a knowledge base from a data file. We can also re-run knowledge discovery on an existing knowledge base. We can maintain multiple knowledge bases on a single Data Quality Server. We can also connect multiple instances of an application to the same knowledge base. DQS prevents concurrency conflicts by locking the knowledge base to a user who opens it in a knowledge management session.

Case Insensitivity in DQS

Values in DQS are case-insensitive. That means that when DQS performs knowledge discovery, domain management, or matching, it does not distinguish values by case. If we add a value in value management that differs from another value only by case, they will be considered the same value, not synonyms. If two values that differ only by case are compared in the matching process, they will be considered an exact match.
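To make the behavior above concrete, here is a small illustrative sketch in plain JavaScript (not DQS code; the function and value names are invented):

```javascript
// Hypothetical illustration of DQS case-insensitive value handling:
// domain values are keyed by a case-insensitive form, so two values
// that differ only by case collapse into one value, not synonyms.
function addDomainValue(domain, value) {
  const key = value.toLowerCase();       // case-insensitive key
  if (!domain.has(key)) {
    domain.set(key, value);              // first spelling seen is kept
  }
  return domain.get(key);                // an existing value wins
}

const domain = new Map();
addDomainValue(domain, "London");
addDomainValue(domain, "LONDON");        // treated as the same value

console.log(domain.size);                          // 1
console.log(addDomainValue(domain, "london"));     // "London"
```

In the same spirit, a matching comparison between "LONDON" and "london" is an exact match, because both normalize to the same key.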


Domain Management in Data Quality Services (DQS)

Domain management enables the user to interactively change and augment the metadata that is generated by the computer-assisted knowledge discovery activity.

We can perform the following activities in domain management:

  • Create a new domain. The new domain can be linked to or copied from an existing domain.
  • Set domain properties that apply to each term in the domain.
  • Apply domain rules that perform validation or standardization for a range of values.
  • Interactively apply changes to any specific data value in the domain.
  • Use the DQS Speller to check the syntax, spelling, and sentence structure of string values.
  • Import a domain from a .dqs data file or domain values from a Microsoft Excel file.
  • Import values that have been found by a cleansing process in a data quality project back into a knowledge base.
  • Attach a domain to the reference data maintained by a reference data provider, with the result that the domain values are compared to the reference data to determine their integrity and correctness.
  • Apply term-based relations for a single domain.

When the domain management activity is completed, we can publish the knowledge base for use in a data quality project.

Domain Properties

Domain properties define and drive the processing that will be applied to the associated values. We can set these properties in the domain management activity. We can set the data type of the values, specify that only the leading value in a group of synonyms will be exported, configure the formatting of the output (to upper case, lower case, or initial capitalization), and define which algorithms (syntax error, speller, and string normalization) will be activated.
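As an illustration (plain JavaScript, not a DQS API; the function names and synonym data are invented), the output-formatting property and the leading-value export option behave roughly like this:

```javascript
// Sketch of two domain properties: output formatting and
// "export leading value only" for a group of synonyms (not a DQS API).
function formatOutput(value, format) {
  switch (format) {
    case "Upper Case": return value.toUpperCase();
    case "Lower Case": return value.toLowerCase();
    case "Capitalize": return value.charAt(0).toUpperCase() + value.slice(1).toLowerCase();
    default:           return value;   // "None": leave the value as-is
  }
}

// With leading-value export enabled, every synonym is replaced by the
// leading value of its group when data is exported.
function exportValue(value, synonyms, useLeadingValue) {
  const leading = synonyms[value.toLowerCase()];   // case-insensitive lookup
  return useLeadingValue && leading ? leading : value;
}

const synonyms = { "n.y.": "New York", "ny": "New York" };
console.log(exportValue("NY", synonyms, true));       // "New York"
console.log(formatOutput("new york", "Upper Case"));  // "NEW YORK"
```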

Reference Data Services

In the domain management process, we can attach online reference data to a domain.

This is how we compare the data in our domain to the data maintained by a reference data provider. We must first configure the reference data provider through the DQS configuration capabilities in the Administration section of the Data Quality Client application.

Applying Domain Rules

We can create domain rules for validation or standardization. A validation rule ensures the accuracy of data, ranging from a basic constraint, such as the possible terms that a string value can take, to a more complex regular expression, such as the valid forms of an email address. A standardization rule is applied to achieve a common data representation; it ensures that data values from multiple sources with the same meaning do not appear in different representations. A standardization rule changes the format or presentation of a value according to a generic function, ensuring conformity with metadata such as data type, length, precision, scale, and formatting patterns. A standardization rule can be based on a character, date/time, numeric, or SQL function.

For a composite domain, we can create a CD rule that specifies a relation between a value in one single domain and a value in another single domain, both of which are parts of a composite domain.

When a domain rule is applied and a domain value fails the rule, the value is designated invalid. For example, we can create a phone rule which will validate the phone length based on country.
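The following sketch shows the shape of such rules in plain JavaScript (illustrative only, not a DQS API; the email pattern and per-country phone lengths are invented examples):

```javascript
// Sketch of a validation rule and a standardization rule (not a DQS API).
// A validation rule marks a value invalid when it fails; a
// standardization rule rewrites the value into a common representation.
const emailRule = v => /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(v);

// Hypothetical phone rule: validate the phone length based on country.
const phoneLengthByCountry = { US: 10, IN: 10, NL: 9 };
const phoneRule = (digits, country) =>
  digits.length === phoneLengthByCountry[country];

// A standardization rule changing the presentation of a value.
const standardizeCountry = v => v.trim().toUpperCase();

console.log(emailRule("user@example.com"));   // true
console.log(phoneRule("5551234567", "US"));   // true
console.log(standardizeCountry(" us "));      // "US"
```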

Domain Values

After we have built a knowledge base, we can populate and display data values in each domain of the knowledge base. After knowledge discovery, DQS will show how many times each term appears, what the status of each term is, and any corrections that it proposes. We can manage this knowledge as follows:

  • Change the status of a value, making it correct, in error, or not valid
  • Add a specific value to, or delete a specific value from, the knowledge base
  • Change the relation of one value to another value, including designating a replacement for a term that is in error or not valid
  • Add, remove, or change knowledge associated with the domain

Values can be created directly by the user or as part of the data discovery or import functionality. This enables us to align the domain with the business and makes it easily extensible.

We can set domain values either in the domain management activity or in the Manage Domain Values step at the end of the knowledge discovery activity. The domain-value functionality is the same in both activities.

Setting Term Relations

In domain management, we can specify a term-based relation for a single domain, specifying a change to a single value. This builds a list of Value/Correct To pairs, such as "LTD." and "Limited", or "CO." and "Company". This enables us to change a term throughout the domain without manually setting individual domain values as synonyms. If a term-based relation transformation causes two values to become identical, DQS will create a synonym relationship between them (in knowledge discovery).
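A rough sketch of the idea in plain JavaScript (illustrative only, not DQS internals; the pairs are the examples above):

```javascript
// Sketch of term-based relations (not a DQS API): each Value/Correct To
// pair rewrites a term wherever it occurs in a domain value.
const termRelations = [
  { value: "LTD.", correctTo: "Limited" },
  { value: "CO.",  correctTo: "Company" },
];

function applyTermRelations(domainValue) {
  let result = domainValue;
  for (const { value, correctTo } of termRelations) {
    // Case-insensitive replacement of the term (dot escaped for regex).
    const pattern = new RegExp("\\b" + value.replace(/\./g, "\\."), "gi");
    result = result.replace(pattern, correctTo);
  }
  return result;
}

console.log(applyTermRelations("Contoso LTD."));   // "Contoso Limited"
// If the transformation makes two values identical, DQS creates a
// synonym relationship between them (in knowledge discovery):
console.log(applyTermRelations("Fabrikam CO."));   // "Fabrikam Company"
```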

Composite Domains

A composite domain is a structure composed of two or more single domains, each of which contains knowledge about common data. Examples of data that can be addressed by composite domains are the first, middle, and family names in a name field, and the house number and street, city, state, postal code, and country in an address field. When we map a single field to a composite domain, DQS parses the data from the one field into the multiple domains that make up the composite. Sometimes a single domain does not represent field data in full; grouping two or more domains into a composite domain can enable us to represent the data more effectively.

The following are advantages of using composite domains:

  • Analyzing the different single domains that make up a composite domain can be a more effective way of assessing data quality.
  • When we use a composite domain, we can also create cross-domain rules that enable us to verify that the relationship between the data in multiple domains is appropriate. For example, we can verify that the string “London” in a city domain corresponds to the string “England” in a country domain. Note that cross-domain rules are taken into consideration after domain rules.
  • Data in composite domains can be attached to a reference data source, in which case the composite domain will be sent to the reference data provider. This is often done with address data.

The data can be parsed by a delimiter, by the order of the columns, or based upon reference data.
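For example, delimiter-based parsing might look like the following sketch (illustrative JavaScript, not the actual DQS parser; the single-domain names are assumptions):

```javascript
// Sketch: parse one address field into the single domains that make up
// a composite "Address" domain, using a delimiter (not the DQS parser).
const singleDomains = ["Street", "City", "State", "PostalCode"];

function parseComposite(fieldValue, delimiter) {
  const parts = fieldValue.split(delimiter).map(p => p.trim());
  const parsed = {};
  singleDomains.forEach((domain, i) => { parsed[domain] = parts[i]; });
  return parsed;
}

const parsed = parseComposite("1 Main St, Redmond, WA, 98052", ",");
console.log(parsed.City);        // "Redmond"
console.log(parsed.PostalCode);  // "98052"
```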

Composite domains are managed differently than single domains. We do not manage values in a composite domain; we do so for the single domains that comprise the composite domain. However, from the domain list in the Domain Management activity, we can see the relationships between the different values in a composite domain, and the statistics that apply to them.

In the Discover step of the Knowledge Discovery activity, profiling is performed on the single domains within a composite domain, not on the composite domain. However, in interactive cleansing, we cleanse data in the composite domain, not the single domains.

Matching can be performed on the single domains that comprise the composite domain, but not on the composite domain itself.


Data Quality Services (DQS) Security Management

Security management in Data Quality Services (DQS) is based upon SQL Server security management. A database administrator grants a user a set of permissions by associating the user with a DQS role.

DQS Roles

There are four roles for Data Quality Services (DQS). The first is the database administrator (DBA), who deals primarily with product installation, database maintenance, and user management. This role works primarily in SQL Server Management Studio rather than in the Data Quality Client application, and corresponds to the sysadmin server role.

The other three roles are for users who work in the Data Quality Client application:

  • The DQS Administrator (dqs_administrator role) can edit and execute a project, create and edit a knowledge base, terminate an activity, stop a process within an activity, and can change the configuration and Reference Data Services settings. The DQS Administrator cannot install the server or add new users.
  • The DQS KB Editor (dqs_kb_editor role) can perform all of the DQS activities, except for administration. The KB Editor can edit and execute a project, and create and edit a knowledge base. They can see the activity monitoring data, but cannot terminate or stop an activity or perform administrative tasks.
  • The DQS KB Operator (dqs_kb_operator role) can edit and execute a project. They cannot perform any kind of knowledge management; they cannot create or change a knowledge base. They can see the activity monitoring data, but cannot terminate an activity or perform administrative tasks.

User Management

The database administrator (DBA) creates DQS users and associates them with DQS roles in SQL Server Management Studio. The DBA manages their permissions by adding SQL Logins as users of the DQS_MAIN database, and associating each user with one of the DQS roles. Each role is granted permissions to a set of stored procedures on the DQS_MAIN database. The three DQS roles are not available for the DQS_PROJECTS and DQS_STAGING_DATA databases.


Introduction to Data Quality Services (DQS) of SQL Server

The data quality solution provided by Data Quality Services (DQS) of SQL Server is used to maintain the quality of data and ensure that the data is suited for business usage. DQS is a knowledge-driven solution that provides both computer-assisted and interactive ways to manage the integrity and quality of data sources. DQS enables us to discover, build, and manage knowledge about our data. We can use that knowledge to perform data cleansing, matching, and profiling.

Why DQS?

In general, incorrect data in our applications can result from user entry errors, corruption in transmission or storage, mismatched data dictionary definitions, and other data quality and process issues. Aggregating data from different sources that use different data standards can result in inconsistent data. Incorrect data affects the ability of a business to perform its business functions and to provide services to its customers, resulting in a loss of credibility and revenue, customer dissatisfaction, and compliance issues. Automated systems often do not work with incorrect data, and bad data wastes the time and energy of people performing manual processes. Incorrect data can wreak havoc with data analysis, reporting, data mining, and warehousing.

High-quality data is critical to the efficiency of businesses. An organization of any size can use DQS to improve the information value of its data, making the data more suitable for its intended use. A data quality solution can make data more reliable, accessible, and reusable. It can improve the completeness, accuracy, conformity, and consistency of data, resolving problems caused by bad data in business intelligence or data warehouse workloads.

DQS Features

DQS provides the following features to resolve data quality issues.

Data Cleansing: In this activity, DQS modifies, removes, or standardizes data that is incorrect or incomplete, using both computer-assisted and interactive processes.

Matching: In this activity, DQS identifies duplicate records in a rules-based process that enables us to determine what constitutes a match and to perform de-duplication.

Reference Data Services: This feature verifies the quality of data using the services of a reference data provider. We can use reference data services from Windows Azure Marketplace DataMarket to easily cleanse, validate, match, and enrich data.

Profiling: This enables us to analyze a data source to gain insight into the quality of the data at every stage in the knowledge discovery, domain management, matching, and data cleansing processes. Profiling is a powerful tool in a DQS data quality solution; we can create a solution in which profiling is just as important as knowledge management, matching, or data cleansing.

Monitoring: The monitoring activity provides the ability to verify that a data quality solution is doing what it was designed to do.

Knowledge Base: Data Quality Services is a knowledge-driven solution that analyzes data based upon knowledge that we build with DQS. This enables us to create data quality processes that continually enhance the knowledge about our data and, in so doing, continually improve the quality of our data.

DQS Components

Data Quality Services consists of Data Quality Server and Data Quality Client. These components provide the ability to perform data quality operations separately from other SQL Server operations. Both are installed from within the SQL Server setup program.

Data Quality Server is implemented as three SQL Server databases that we can manage and monitor in SQL Server Management Studio (DQS_MAIN, DQS_PROJECTS, and DQS_STAGING_DATA). DQS_MAIN includes DQS stored procedures, the DQS engine, and published knowledge bases. DQS_PROJECTS includes data that is required for knowledge base management and DQS project activities. DQS_STAGING_DATA provides an intermediate staging database where we can copy source data to perform DQS operations and then export processed data.

Data Quality Client is a standalone application that enables us to perform knowledge management, data quality projects, and administration in one user interface. It is a stand-alone executable file that performs knowledge discovery, domain management, matching policy creation, data cleansing, matching, profiling, monitoring, and server administration. Data Quality Client can be installed and run on the same computer as Data Quality Server or remotely on a separate computer. Wizard-driven operations are available for performing tasks in Data Quality Client.


Form script for keypress events and auto-completion feature in CRM 2016

The control object provides methods to change the presentation or behavior of a control and identify the corresponding attribute.

We can access controls using the Xrm.Page.ui.controls, Xrm.Page.ui section.controls, or attribute.controls collections. The Xrm.Page.getControl method is a shortcut to access Xrm.Page.ui.controls.get.

The new custom controls for CRM mobile clients (phones and tablets) support all the control properties and methods except the auto-completion methods, the getValue method, the keypress methods, and the lookup control methods and events.

Control properties and methods

Auto-completion methods

Configure the auto-completion experience in text controls in CRM forms. These methods were introduced in CRM 2016.


getValue method

Gets the latest value for a control as the user types characters in a specific text or number field. This method was introduced in CRM 2016.

Keypress methods

Add, remove, or perform a function when the user presses a key in a control. These methods were introduced in CRM 2016.

Auto-completion methods

We can use showAutoComplete and hideAutoComplete methods to configure the auto-completion experience in text controls in CRM forms.


showAutoComplete

We can use this method to show up to 10 matching strings in a drop-down list as users press keys to type characters in a specific text field. We can also add a custom command with an icon at the bottom of the drop-down list.

Xrm.Page.getControl(arg).showAutoComplete(object)




Type: Object that defines the result set, which includes results and commands, to be displayed in the auto-completion drop-down list.

Remarks: Call this method in a function that you added using the addOnKeyPress method to execute on the keypress event.

Example: The following example shows the definition of the object to be passed to the showAutoComplete method.


var resultset = {
   results: [{
         id: <value1>,
         icon: <url>,
         fields: [<fieldValue1>]},
      {
         id: <valueN>,
         icon: <url>,
         fields: [<fieldValue1, fieldValue2,..., fieldValueN>]}],
   commands: {
         id: <value>,
         icon: <url>,
         label: <value>,
         action: <function reference>
   }
};


hideAutoComplete

We can use this method to hide the auto-completion drop-down list we configured for a specific text field.

Xrm.Page.getControl(arg).hideAutoComplete()




getValue

This method gives the latest value in a control as the user types characters in a specific text or number field. It helps us build interactive experiences by validating data and alerting users as they type characters in a control.

Xrm.Page.getControl(arg).getValue()

The control getValue method is different from the attribute method Xrm.Page.getAttribute(arg).getValue(): the control method retrieves the value from the control as the user is typing, whereas the attribute method retrieves the value after the user commits (saves) the field.

For a sample of JavaScript code that uses the getValue method to configure auto-completion, see the CRM SDK.



Return Value

Type: String. The latest data value for a control.

Keypress methods

We can use the addOnKeyPress, removeOnKeyPress, and fireOnKeyPress methods to provide immediate feedback or take actions as the user types in a control. These methods enable us to perform data validations in a control even before the user commits (saves) the value in a form.


addOnKeyPress

We can use this method to add a function as an event handler for the keypress event so that the function is called when the user types a character in the specific text or number field.

For a sample of JavaScript code that uses the addOnKeyPress method to configure auto-completion, see the CRM SDK.


Xrm.Page.getControl(arg).addOnKeyPress([function reference])


Type: function reference

Remarks: The function will be added to the bottom of the event handler pipeline. The execution context is automatically set as the first parameter passed to the event handler set using this method.

We should use a reference to a named function rather than an anonymous function if we later want to remove the event handler for the field.


removeOnKeyPress

We can use this method to remove an event handler for a text or number field that we added using addOnKeyPress.


Xrm.Page.getControl(arg).removeOnKeyPress([function reference])


Type: function reference

Remarks: If an anonymous function is set using addOnKeyPress, it can’t be removed using this method.


fireOnKeyPress

We can use this method to manually fire an event handler that we created for a specific text or number field to be executed on the keypress event.

Xrm.Page.getControl(arg).fireOnKeyPress()
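To see how these pieces fit together, here is a hedged sketch combining addOnKeyPress, getValue, and showAutoComplete. The mock control below merely stands in for the real Xrm.Page control so the flow can be run outside CRM; the account names and helper functions are invented:

```javascript
// Minimal mock of a CRM text control (for illustration only; the real
// control is obtained with Xrm.Page.getControl on a CRM form).
function makeMockControl() {
  const handlers = [];
  let text = "", shown = null;
  return {
    addOnKeyPress: fn => handlers.push(fn),
    getValue: () => text,                       // value while typing
    showAutoComplete: rs => { shown = rs; },
    hideAutoComplete: () => { shown = null; },
    // Test helpers, not part of the real API:
    type: s => { text = s; handlers.forEach(fn => fn()); },
    visibleResults: () => shown,
  };
}

const accounts = ["A. Datum", "Adventure Works", "Contoso", "Fabrikam"];
const control = makeMockControl();

// Keypress handler: read the in-progress value with getValue (not the
// committed attribute value) and show up to 10 matching strings.
control.addOnKeyPress(function suggestAccounts() {
  const typed = control.getValue().toLowerCase();
  const results = accounts
    .filter(a => typed && a.toLowerCase().startsWith(typed))
    .slice(0, 10)
    .map((a, i) => ({ id: i, fields: [a] }));
  if (results.length > 0) {
    control.showAutoComplete({ results: results });
  } else {
    control.hideAutoComplete();
  }
});

control.type("a");
console.log(control.visibleResults().results.map(r => r.fields[0]));
```

On a real form, the same handler would be registered with Xrm.Page.getControl("name").addOnKeyPress(suggestAccounts), with results typically retrieved from CRM data rather than a hard-coded array.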



Security model of Microsoft Dynamics CRM

Microsoft Dynamics CRM provides a security model that protects data integrity and privacy and supports efficient data access and collaboration. Following are the main goals of the security model:

  • Provide users with access only to the levels of information that are required to do their jobs.
  • Categorize users by role and restrict access based on those roles.
  • Support data sharing so that users and teams can be granted access to records that they do not own for a specified collaborative effort.
  • Prevent a user from accessing records the user does not own or share.

Role-based security – Microsoft Dynamics CRM focuses on grouping a set of privileges together that describe the responsibilities (or tasks that can be performed) for a user. Microsoft Dynamics CRM includes a set of predefined security roles. Each aggregates a set of user rights to make user security management easier.

Record-based security – Microsoft Dynamics CRM focuses on access rights to specific records.

Field-level security – Microsoft Dynamics CRM restricts access to specific high business impact fields in an entity only to specified users or teams.

We can combine role-based security, record-level security, and field-level security to define the overall security rights.


Upload and manage document templates in CRM 2016

Document templates in CRM 2016 are used to export CRM data as Excel or Word files, which can then be used as templates to generate Excel or Word documents with standardized and up-to-date CRM data for analysis and reporting purposes.
Once we have created a document template by using the web client, we can programmatically upload the template file (.xlsx or .docx) to the CRM instance, update the name or the template file associated with a document template record, retrieve the document template record, and delete it. The DocumentTemplate entity is used to upload and manage organization-owned document templates, and the PersonalDocumentTemplate entity is used to upload and manage user-owned (personal) document templates. We can share or assign personal document templates to other users.
To upload a document template, specify the path to the document, the name, the document type (Excel or Word), and the content (file to be uploaded) as a base-64 encoded string.

string filePath = @"C:\MyContacts.xlsx";
DocumentTemplate docTemplate = new DocumentTemplate
{
    Name = "Contact Excel Document Template",
    // 1 - For uploading an Excel template, 2 - For uploading a Word template.
    DocumentType = new OptionSetValue(1),
    Content = Convert.ToBase64String(
        File.ReadAllBytes(Path.Combine(Directory.GetCurrentDirectory(), filePath)))
};
Guid templateId = _serviceProxy.Create(docTemplate);
Console.WriteLine("Uploaded template: '{0}'.", docTemplate.Name);

After the document template is uploaded, we have to activate it so that it can be used to generate documents. We can use the SetStateRequest message to activate the document template.


Use Connection strings in XRM tooling to connect to CRM

With Microsoft Dynamics CRM Online 2016 Update and Microsoft Dynamics CRM 2016 (on-premises), XRM tooling enables us to connect to a CRM instance by using connection strings. This is similar to the concept of connection strings used with Microsoft SQL Server. Connection strings have native support in configuration files, including the ability to encrypt the configuration sections for maximum security. This enables us to configure CRM connections at deployment time rather than hard-coding them in the application.

Create a connection string

Specify the connection string in the app.config or web.config file for our Visual Studio project, as shown in the following example.


<connectionStrings>
   <add name="CRMServer" connectionString="AuthType=AD;Url=http://amartestsrv:8080/Test;" />
</connectionStrings>

After creating the connection string, use it to create a CrmServiceClient object.


// Use the connection string named "CRMServer" from the configuration file.
CrmServiceClient crmSvc = new CrmServiceClient(ConfigurationManager.ConnectionStrings["CRMServer"].ConnectionString);

After creating a CrmServiceClient object, we can use the object to perform actions in CRM.

Connection string parameters

The connection string contains a series of name=value pairs separated by semicolons. The following list describes the supported parameters, which can be entered in any order.

  • ServiceUri, Service Uri, Url, or Server: Specifies the URL of the Microsoft Dynamics CRM server. The URL can use the http or https protocol, and the port is optional; the default port is 80 for http and 443 for https. The server URL is typically in the format http://crm-server:port/organization-name for CRM on-premises and https://organization-name. for CRM Online. The organization name is required; we can specify either the friendly name or the unique name of the organization to connect to. Examples: http://amartestsrv/test, http://amartestsrv:5555/test, https://amartestsrv/test
  • Domain: Specifies the domain that will verify user credentials.
  • UserName, User Name, UserId, or User Id: Specifies the user's identification name associated with the credentials.
  • Password: Specifies the password for the user name associated with the credentials.
  • HomeRealmUri or Home Realm Uri: Specifies the home realm URI.
  • AuthenticationType or AuthType: Specifies the authentication type used to connect to the CRM instance. Valid values are AD, IFD (AD FS enabled), OAuth, and Office365. AD and IFD are permitted for CRM on-premises instances only; OAuth is permitted for both CRM Online and on-premises instances; Office365 is permitted for CRM Online instances only.
  • RequireNewInstance: Specifies whether to force the creation of a new instance when the connection is created. Possible values are True or False.
  • ClientId, AppId, or ApplicationId: Specifies the client ID assigned when the application was registered in Microsoft Azure Active Directory or Active Directory Federation Services (AD FS). Applicable only when the authentication type is OAuth.
  • RedirectUri or ReplyUrl: Specifies the redirect URI of the application registered in Microsoft Azure Active Directory or AD FS. Applicable only when the authentication type is OAuth.
  • TokenCacheStorePath: Specifies the full path to the location where the user token cache should be stored. The running process should have access to the specified path; it is the process's responsibility to set and configure this path. Applicable only when the authentication type is OAuth.
  • LoginPrompt: Specifies whether the user is prompted for credentials if they are not supplied. Valid values are Always (always prompt the user to specify credentials), Auto (let the login control decide whether to display the prompt), and Never (never prompt the user; use this value when the connection method has no user interface). Applicable only when the authentication type is OAuth.
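Because a connection string is just a series of semicolon-separated name=value pairs, a client could parse it along these lines (an illustrative sketch in JavaScript, not the actual XRM tooling implementation):

```javascript
// Sketch: parse a "name=value;name=value" connection string into an
// object (illustrative only; XRM tooling does this internally).
function parseConnectionString(cs) {
  const params = {};
  for (const part of cs.split(";")) {
    const i = part.indexOf("=");            // split on the first "=" only
    if (i < 0) continue;                    // skip empty/trailing segments
    const name = part.slice(0, i).trim();
    const value = part.slice(i + 1).trim();
    if (name) params[name] = value;
  }
  return params;
}

const params = parseConnectionString(
  "AuthType=AD;Url=http://amartestsrv:8080/Test;Domain=crmhunt;Username=admin");
console.log(params.AuthType);  // "AD"
console.log(params.Url);       // "http://amartestsrv:8080/Test"
```

Note that a real parser must also handle the parameter aliases listed above; this sketch ignores them.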

Connection string examples

The following examples show how we can use connection strings to connect to different deployments and authentication scenarios.

Integrated on-premises authentication

<add name="CRMServer" connectionString="AuthType=AD;Url=http://amartestsrv:8080/Test;" />

Named account using on-premises authentication

<add name="CRMServer" connectionString="AuthType=AD;Url=http://amartestsrv:8080/Test; Domain=crmhunt; Username=admin; Password=pass#123$" />

Named account using Office 365

<add name="CRMServer" connectionString="AuthType=Office365; Username=[email protected]; Password=pass#123$; Url=" />

OAuth using named account in Office 365 with UX to prompt for authentication

<add name="CRMServer" connectionString="AuthType=OAuth; Username=[email protected]; Password=pass#123$; Url=; AppId=<GUID>; RedirectUri=app://<GUID>; TokenCacheStorePath=c:\MyTokenCache; LoginPrompt=Auto" />

OAuth using named account in CRM on-premises with UX to prompt for authentication

<add name="CRMServer" connectionString="AuthType=OAuth; Username=[email protected]; Password=pass#123$; Url=https://amartestsrv:8080/Test; AppId=<GUID>; RedirectUri=app://<GUID>; TokenCacheStorePath=c:\MyTokenCache; LoginPrompt=Auto" />

IFD using a named account with delegation to a sub realm

<add name="CRMServer" connectionString="AuthType=IFD;Url=http://amartestsrv:8080/Test; HomeRealmUri=; Domain=crmhunt; Username=admin; Password=pass#123$" />

Determine your connection status
To determine if the connection request was successful, check the value of the CrmServiceClient.IsReady property. If true, the connection is successful, and you are ready to work. Otherwise, check the values of the CrmServiceClient.LastCrmError and CrmServiceClient.LastCrmException properties for the cause of the connection failure.
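As a sketch, assuming the Microsoft.Xrm.Tooling.Connector assembly is referenced and the CRMServer connection string from the examples above is defined in app.config, the connection check might look like this (it requires a reachable CRM organization to actually succeed):

```csharp
using System;
using System.Configuration;
using Microsoft.Xrm.Tooling.Connector;

class ConnectionCheck
{
    static void Main()
    {
        // Read the connection string named "CRMServer" from app.config.
        string connStr =
            ConfigurationManager.ConnectionStrings["CRMServer"].ConnectionString;

        CrmServiceClient client = new CrmServiceClient(connStr);

        if (client.IsReady)
        {
            // Connection succeeded; the client is ready for service calls.
            Console.WriteLine("Connected to: " + client.ConnectedOrgFriendlyName);
        }
        else
        {
            // Inspect the failure details.
            Console.WriteLine("Connection failed: " + client.LastCrmError);
            if (client.LastCrmException != null)
                Console.WriteLine(client.LastCrmException.Message);
        }
    }
}
```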


Microsoft Dynamics CRM 2016 Solution Enhancements

Previously, when an entity was added to a solution and that solution was exported, the entity and all of its assets were exported in that solution. This included attributes, forms, views, relationships, visualizations, and any other assets packaged with the entity. All objects were exported regardless of whether the developer actually wanted to ship the object. This process potentially carried dependencies or modified unintended objects on the target deployment.

Now, we can create and publish solution patches that contain subcomponents of entities, as compared to publishing the entire entity and all of its assets. The original solution and multiple released patches can be rolled-up at a later time into an updated version of the original solution, which then can replace the original solution.


We can apply patches to either managed or unmanaged solutions, and a patch includes only changes to entities and related entity assets. Patches do not contain any non-customized system components or relationships that they depend upon, because these components already exist in the deployed-to organization. At some point in our development cycle, we can roll up all the patches into a new solution version that replaces the original solution the patches were created from.

Patches are stored in the CRM database as Solution entity records. A non-null ParentSolutionId attribute indicates that the solution is a patch. Patches can be created and managed through the Organization Service or Web APIs, which are useful for developing automation such as a product install script. However, the CRM web application provides various web forms that enable us to interactively create and manage patches.

  • Patches can only be created from a parent solution using CloneAsPatchRequest or CloneAsPatch Action.
  • The patch parent can’t be a patch.
  • Patches can only have one parent solution.
  • A patch creates a dependency (at the solution level) on its parent solution.
  • We can only install a patch if the parent solution is present.
  • We can’t install a patch unless the unique name and major/minor version number of the parent solution, as identified by ParentSolutionId, match those of the parent solution installed in the target organization.
  • A patch version must have the same major and minor number, but a higher build and release number, than the parent solution version number. The display name can be different.
  • If a solution has patches, subsequent patches must have a numerically higher version number than any existing patch for that solution.
  • Patches support the same operations as solutions, such as additive update and removal.
  • Patches exported as managed must be imported on top of a managed parent solution. The general rule is that a patch’s type (managed or unmanaged) must match that of its parent.
  • Don’t use unmanaged patches for production purposes.
  • Patches are only supported in CRM organizations of version 8.0 or later.

The SolutionPackager and PackageDeployer tools support solution patches.

Create a patch

Create a patch from an unmanaged or managed solution in an organization by using the CloneAsPatchRequest message or the CloneAsPatch Action, or by using the web application. Once we create the patch, the original solution becomes locked and we can’t change or export it as long as dependent patches exist in the organization that identify it as their parent solution. Patch versioning is similar to solution versioning and is specified in the format major.minor.build.revision; we can’t change the existing major or minor solution version when we create a patch.
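A minimal sketch of creating a patch through the Organization Service, assuming an already-authenticated IOrganizationService named service and a parent solution with the hypothetical unique name "SolutionA":

```csharp
using System;
using Microsoft.Crm.Sdk.Messages;  // CloneAsPatchRequest/Response
using Microsoft.Xrm.Sdk;

// 'service' is an authenticated IOrganizationService obtained elsewhere.
CloneAsPatchRequest patchRequest = new CloneAsPatchRequest
{
    ParentSolutionUniqueName = "SolutionA",  // hypothetical parent solution
    DisplayName = "SolutionA Patch",
    VersionNumber = "1.0.1.0"  // same major.minor as parent, higher build/revision
};

CloneAsPatchResponse patchResponse =
    (CloneAsPatchResponse)service.Execute(patchRequest);

// Id of the newly created patch solution record.
Guid patchId = patchResponse.SolutionId;
```

Note that the version number must obey the rules listed earlier: same major/minor as the parent, and numerically higher build/revision than any existing patch.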

Export and import a patch

We can use the Organization Service or Web APIs, the web application, or the Package Deployer tool to export and import a patch. The relevant Organization Service message requests are ImportSolutionRequest and ExportSolutionRequest. The relevant actions for the Web API are ImportSolution Action and ExportSolution Action.
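A sketch of the round trip with the Organization Service messages, assuming hypothetical names: a patch with the unique name "SolutionAPatch", a source service and a second targetService connected to the destination organization:

```csharp
using System;
using System.IO;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

// Export the patch as an unmanaged solution file from the source organization.
ExportSolutionRequest exportRequest = new ExportSolutionRequest
{
    SolutionName = "SolutionAPatch",  // hypothetical patch unique name
    Managed = false
};
ExportSolutionResponse exportResponse =
    (ExportSolutionResponse)service.Execute(exportRequest);
File.WriteAllBytes(@"c:\temp\SolutionAPatch.zip",
    exportResponse.ExportSolutionFile);

// Import the patch into the target organization. The parent solution must
// already be installed there, or the import will fail.
ImportSolutionRequest importRequest = new ImportSolutionRequest
{
    CustomizationFile = File.ReadAllBytes(@"c:\temp\SolutionAPatch.zip")
};
targetService.Execute(importRequest);
```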

Patching examples

The following table lists the details of a patching example. Note that in this example, the solution and its patches are imported in version order, starting with the base solution (version 1.0).

Patch Name | Description
SolutionA, version 1.0 (unmanaged) | Contains entityA with 6 fields.
SolutionA, first patch (unmanaged) | Contains entityA with 3 fields (3 fields removed) and adds entityB with 10 fields.
SolutionA, second patch (unmanaged) | Contains entityC with 10 fields.

The import process is as follows.

  1. The developer or customizer first imports the base solution (SolutionA 1.0) into the organization. The result is entityA with 6 fields in the organization.
  2. Next, the first SolutionA patch is imported. The organization now contains entityA with 3 fields plus entityB with 10 fields.
  3. Finally, the second SolutionA patch is imported. The organization now contains entityA with 3 fields, entityB with 10 fields, plus entityC with 10 fields.

Delete a patch

We can delete a patch or base (parent) solution by using DeleteRequest or, for the Web API, use the HTTP DELETE method. The delete process is different for a managed or unmanaged solution that has one or more patches existing in the organization.

For an unmanaged solution, we must uninstall all patches to the base solution first, in the reverse of the order in which they were created, before uninstalling the base solution.

For a managed solution, we simply uninstall the base solution. The CRM system automatically uninstalls the patches in reverse version order before uninstalling the base solution. We can also just uninstall a single patch.
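Deleting a single patch with DeleteRequest might look like the following sketch, assuming service is an authenticated IOrganizationService and patchId is the SolutionId of the patch (for example, the value returned by CloneAsPatchResponse):

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

// 'patchId' is the SolutionId of the patch (or base solution) to remove.
// Remember: for an unmanaged base solution, all of its patches must be
// deleted first, in reverse version order.
DeleteRequest deleteRequest = new DeleteRequest
{
    Target = new EntityReference("solution", patchId)
};
service.Execute(deleteRequest);
```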

Update a solution

Updating a solution involves rolling up (merging) all patches to that solution into a new version of the solution. Afterward, that solution becomes unlocked and can once again be modified (unmanaged solutions only) or exported. For a managed solution, no further modifications are allowed except for creating patches from the newly updated solution. To roll up patches into an unmanaged solution, use CloneAsSolutionRequest or the CloneAsSolution Action. Cloning a solution creates a new version of the unmanaged solution, incorporating all of its patches, with a higher major.minor version number, the same unique name, and a display name we specify.

For a managed solution things are handled slightly differently. We first clone the unmanaged solution (A), incorporating all of its patches and then exporting it as a managed solution (B). In the target organization that contains the managed version of the (A) solution and its patches, we import managed solution (B) and then execute DeleteAndPromoteRequest or the DeleteAndPromote Action to replace managed solution (A) and its patches with the upgraded managed solution (B) that has a higher version number.
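The two steps above can be sketched as follows, again with hypothetical names: service is connected to the development organization holding the unmanaged "SolutionA", and targetService to the organization where the managed version and its patches are installed:

```csharp
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

// Step 1 (development org): roll all patches into a new unmanaged
// solution version with a higher major.minor number.
CloneAsSolutionRequest cloneRequest = new CloneAsSolutionRequest
{
    ParentSolutionUniqueName = "SolutionA",  // hypothetical solution name
    DisplayName = "SolutionA",
    VersionNumber = "1.1.0.0"  // higher major.minor than the 1.0 original
};
service.Execute(cloneRequest);

// Step 2 (target org): after exporting the clone as managed and importing
// it as a holding solution, replace the old managed solution and its
// patches with the upgraded version.
DeleteAndPromoteRequest promoteRequest = new DeleteAndPromoteRequest
{
    UniqueName = "SolutionA"
};
targetService.Execute(promoteRequest);
```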


Microsoft Dynamics CRM Mobile Offline Capability

With Dynamics CRM 2016, tablets and phones have full offline mobile capabilities, and Dynamics CRM Online users get a full offline experience with the mobile apps. This provides the ability to get your work done even when connectivity is interrupted. Users can create, change, and delete records while offline, and automatic playback of offline actions synchronizes local changes with CRM Online when connectivity returns.

Stay productive when your phone doesn’t have service, or when your tablet’s not connected to the Internet. The mobile apps keep records that you’ve used recently, so you can still access them when you’re disconnected.

You can also capture new information by creating drafts of new records like accounts, contacts, and activities. When you’re connected again, save the records that you created while you were offline.

Your offline experience might look a little different than when you’re connected, because charts and some images aren’t available offline.

A few limitations for creating draft records

There are a few things you should know about working with offline drafts in the mobile apps:

  • While offline, you can only create new records. To edit existing records, you need to be connected. However, you can edit records that you created while you were offline.
  • While disconnected, you can only create standalone records or associate records to those that are available for offline access on your device. For example, you can create an opportunity for an account only if that account was created before you went offline, and if it’s available for offline access. You can’t create an opportunity for an account while offline if you also created the account while offline.
  • When you’re offline, you can’t set the value for lookup fields. If you create a record that is associated with another record, such as adding a phone call to a contact, some lookup fields might populate automatically (in this case, the To and From fields might pre-populate). You need to fill these fields in once you re-connect while you review and save your drafts.

Working offline with on-premises CRM deployments

If you’re using your CRM for tablets app with Microsoft Dynamics CRM 2015 (on-premises) or later, you can continue to use CRM for tablets while disconnected. However, with the Windows 8 app, once you close CRM for tablets (like when you start another app), you can’t use CRM for tablets until you can connect to the Internet. With the Windows 8.1 app, you can continue to access your data even if you close the app. If you’re not sure whether your organization has an on-premises deployment, contact your CRM admin to find out.
