I recently used the sample code provided in the SDK that allows you to have multiple OptionSets linked together so that selecting an item in the first OptionSet will filter the items appearing in the second.
https://msdn.microsoft.com/en-us/library/gg594433(v=crm.7).aspx
There was an error in the instructions for configuring the OnLoad event on the form. They say to use
"sample_TicketDependentOptionSetConfig.xml"
as a parameter. That's wrong. You need to use
"sample_TicketDependentOptionSetConfig"
I modified the SDK.DependentOptionSet.init function in the JavaScript file slightly to support multiple languages. The first few lines now read
// Retrieve the XML Web Resource specified by the parameter passed
var clientURL = Xrm.Page.context.getClientUrl();
var userLcid = Xrm.Page.context.getUserLcid();
var pathToWR = clientURL + "/WebResources/" + webResourceName + "_" + userLcid;
The data files were then suffixed with the relevant LCID for each language, e.g. "sample_TicketDependentOptionSetConfig_1033.xml"
A very cool and easy-to-use JavaScript library. Thanks, Microsoft.
Saturday, December 12, 2015
Sunday, November 29, 2015
ILMerge Command Line Syntax
I had cause the other day to use ILMerge to combine several DLLs. The first issue I came across was:
Unresolved assembly reference not allowed: System.Core.
This is mentioned in several blogs, and the suggested solution was to use the /lib switch to specify the path to the .NET reference assemblies for your target framework. In my case I wanted to target 4.5.2, so I used the following, which supposedly worked (so the blogs said):
ilmerge /lib:"C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.5.2" /out:MERGED.First.dll First.dll Second.dll /keyfile:key.snk
Note that if your first DLL is strongly named, don't assume the output will be; it won't be unless you add the /keyfile option.
Note also that several blogs warn against pointing at C:\Windows... because that is not the correct location; you need the Reference Assemblies path given above.
Well, I didn't get the error, ILMerge started off, and I waited. And waited. And waited. ILMerge just hangs, consuming 50% CPU to fool you into thinking it's actually doing something.
It's the wrong syntax. The correct syntax uses /targetplatform, not /lib:
ilmerge /targetplatform:v4,"C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.5.2" /out:MERGED.First.dll First.dll Second.dll /keyfile:key.snk
Creating a SharePoint Online Folder using CSOM
Surprisingly, I can't find a complete solution to this in the blogosphere, so let me put that right.
First add two references to
Microsoft.SharePoint.Client.dll
Microsoft.SharePoint.Client.Runtime.dll
I found them on my 64-bit server located here
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI
In your code add these using statements (System.Security is needed for SecureString)
using Microsoft.SharePoint.Client;
using System.Security;
Then in your method add the following
using (ClientContext ctx = new ClientContext(site))
{
    // Build a SecureString from the plain-text password
    var securePassword = new SecureString();
    foreach (char c in password)
    {
        securePassword.AppendChar(c);
    }
    ctx.Credentials = new SharePointOnlineCredentials(username, securePassword);

    // Get the document library and load its root folder
    var list = ctx.Web.Lists.GetByTitle(doclib);
    var folderRoot = list.RootFolder;
    ctx.Load(folderRoot);
    ctx.ExecuteQuery();

    // Add the new folder beneath the root and commit
    folderRoot = folderRoot.Folders.Add(folder);
    ctx.ExecuteQuery();
}
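That creates a single folder directly under the library root. If you need a nested path, one rough sketch (my own addition, not hardened for every edge case; folderPath is a hypothetical parameter like "2015/Invoices/March") is to walk the segments inside the same using block:
// Still inside the using (ClientContext ctx = ...) block above
var current = list.RootFolder;
ctx.Load(current);
ctx.ExecuteQuery();
foreach (var segment in folderPath.Split('/'))
{
    // In a document library, Folders.Add does not fail when the folder
    // already exists, so this loop also behaves as an "ensure folder"
    current = current.Folders.Add(segment);
    ctx.ExecuteQuery();
}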
Monday, August 31, 2015
Serialize a CRM Entity to XML in Sandbox mode Plugin
I have been struggling with this problem for a while. Previous posts have outlined the problem: how to get changes in CRM sent to Back Office systems as XML messages. When using CRM Online a big frustration is that you cannot serialize objects to XML because the WriteObject method of the serializer throws a security exception.
While you can post the Context of a plugin to the Azure Service Bus, you have to use the CRM SDK to interpret this and turn it into something useful. My previous post outlines how to do this with an Azure Worker Process.
(You could construct the XML message line by line with a StringBuilder, but seriously, who wants to do that?)
I found a simple solution. The inspiration came from this post, which describes how to turn FetchXML results into a DataTable. FetchXML returns an entity collection, and the code shows how to load that into a DataTable while handling the special CRM data types like EntityReference.
So there you are in a Sandbox plugin. You have the Post-Image entity and you want to convert that into a nicely structured XML file that you can post to the Azure Service Bus. You can use part of the code above (ignoring the FetchXML query) to convert the Post-Image to a data table. I put it into a function I called ConvertEntityToDataTable.
Now getting XML is easy.
DataSet ds = new DataSet("Invoice");
DataTable dt = new DataTable("Attributes");
ConvertEntityToDataTable(dt, entity);
ds.Tables.Add(dt);
string xml = ds.GetXml();
The result looks like this (I've just given an extract)
<Invoice>
<Attributes>
<billto_city>London</billto_city>
<billto_line1>12 High Street</billto_line1>
<billto_line2>Clapham Common</billto_line2>
<billto_line3>Clapham</billto_line3>
</Attributes>
</Invoice>
Remember that PostImages provide their attributes in alphabetical order, which is great for mapping structured XML messages. The Target entity does not, so you would need to address that.
The XML is missing some things I would need to add:
(a) the XML declaration at the top specifying the encoding
(b) a namespace to identify the message to an ESB
But I am delighted with the result because now I can construct proper XML messages in a Sandbox plugin.
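Both gaps are easy to patch afterwards with System.Xml, which is usable in the sandbox. A minimal sketch; the namespace URI is a placeholder for whatever your ESB expects:
using System.Xml;
// xml is the string returned by ds.GetXml() above
XmlDocument doc = new XmlDocument();
doc.LoadXml(xml);
// (a) add the declaration with an explicit encoding
XmlDeclaration declaration = doc.CreateXmlDeclaration("1.0", "utf-8", null);
doc.InsertBefore(declaration, doc.DocumentElement);
// (b) stamp a default namespace on the root element (a text-level fix;
// substitute your ESB's schema namespace for the placeholder URI)
doc.DocumentElement.SetAttribute("xmlns", "http://schemas.example.com/crm/invoice");
string xmlForTheBus = doc.OuterXml;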
The two functions required are shown below
///////
private void ConvertEntityToDataTable(DataTable dataTable, Entity entity)
{
DataRow row = dataTable.NewRow();
foreach (var attribute in entity.Attributes)
{
if (!dataTable.Columns.Contains(attribute.Key))
{
dataTable.Columns.Add(attribute.Key);
}
row[attribute.Key] = getAttributeValue(attribute.Value).ToString();
}
foreach (var fv in entity.FormattedValues)
{
if (!dataTable.Columns.Contains(fv.Key + "name"))
{
dataTable.Columns.Add(fv.Key + "name");
}
row[fv.Key + "name"] = fv.Value;
}
dataTable.Rows.Add(row);
}
///////
private object getAttributeValue(object entityValue)
{
object output = "";
// EntityReference, OptionSetValue, Money etc. don't override ToString(),
// so ToString() returns the type's full name, which is what this switch matches on
switch (entityValue.ToString())
{
case "Microsoft.Xrm.Sdk.EntityReference":
output = ((EntityReference)entityValue).Name;
break;
case "Microsoft.Xrm.Sdk.OptionSetValue":
output = ((OptionSetValue)entityValue).Value.ToString();
break;
case "Microsoft.Xrm.Sdk.Money":
output = ((Money)entityValue).Value.ToString();
break;
case "Microsoft.Xrm.Sdk.AliasedValue":
output = getAttributeValue(((Microsoft.Xrm.Sdk.AliasedValue)entityValue).Value);
break;
default:
output = entityValue.ToString();
break;
}
return output;
}
Sunday, August 16, 2015
Dynamics CRM 2015 Online and Azure Service Bus Messages
If you've configured Dynamics CRM 2015 Online with Azure Service Bus, you will know that the "message" it puts on the queue is the plugin execution context, the same context that you use within a plugin. To do anything with the context you have to use the Microsoft.Xrm.Sdk and extract the Pre or Post Image and the so-called Target image. ("Target" is a stupid name, I always think.) It contains the delta: the attributes that were changed during the operation. The Post Image of, say, an Update event will contain all the attributes that have values and exclude any attributes with null values.
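For reference, pulling those pieces out of the context in plugin code looks like this (a minimal sketch, assuming an image registered with the alias "PostImage"):
// Inside IPlugin.Execute(IServiceProvider serviceProvider)
IPluginExecutionContext context = (IPluginExecutionContext)
    serviceProvider.GetService(typeof(IPluginExecutionContext));
// "Target" holds the delta: only the attributes changed by this operation
Entity target = (Entity)context.InputParameters["Target"];
// The post image holds the record after the write; "PostImage" is the
// alias chosen when registering the step
Entity postImage = context.PostEntityImages["PostImage"];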
The reason for using the Azure Service Bus is usually to get data from CRM to your on-premise systems, and more often than not you will be using an ESB to read messages from the Azure Service Bus. BizTalk, for example, has an adaptor that you just need to configure to read messages from the Azure Service Bus. However, any BizTalk developer is going to be very disappointed if you provide them with just a plugin execution context, because they will have to use the CRM SDK to convert it into an XML message. Even then, the entity objects serialize to a collection of KeyValuePairs. The biggest problem with that is trying to map it to a strongly typed schema, particularly because the structure varies so much as attributes flip between null and non-null. I've tried using the BizTalk mapper and failed.
UPDATE: CRM Online restricts plugins to Sandbox mode, and you cannot serialize objects to XML because this is prohibited. I have recently stumbled on a way of getting round this, which may provide a much simpler solution than the one described in the rest of this article.
One solution is to have custom code that will read the context, convert it to a strongly typed schema and then put it back in another queue. This needs to be done serially to ensure that Ordered Delivery is maintained but at least it gets to a schema that is worthy of the name.
Now you have to be careful when using CRM Online because if your plugins consume a lot of CPU they will be terminated with extreme prejudice. It is best to keep the code within your plugin as minimal as possible and then have the bulk of your code execute on an Azure VM where you don't need to worry about CPU usage because you can size the VM to suit.
The best way to architect this is:
a) Have a Plugin that sends the Plugin Execution Context to an Azure Service Bus Queue
b) Create an Azure Worker Process that reads messages from the Queue, creates an XML message and puts it into another Azure Service Bus Queue (be sure to maintain the order; a sketch follows below)
c) If using BizTalk create a Receive Port that will read XML messages from the second Queue
The Azure Worker Process is a lot like a Windows Service (in Azure it's called a Cloud Service) and will automatically restart if it is stopped. When you deploy the Azure Worker Process it creates its own VM. You can supply configuration settings, which in our case would include the endpoint of the queue we are reading from and the endpoint of the queue we are writing to.
While your messages should be placed on the first queue using a plugin running synchronously, the Azure Worker Process runs asynchronously. If you want to post messages back to CRM because of a failure, you need to set up another asynchronous process to send them back.
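Step (b) is mostly plumbing. Here is a minimal sketch of the worker's read-transform-write loop, assuming the Microsoft.ServiceBus.Messaging client library; the connection strings and queue names are placeholders for your configuration settings, and BuildXml stands in for your own mapping code (such as the DataSet approach from my earlier post):
using System;
using Microsoft.ServiceBus.Messaging;   // Azure Service Bus client library
using Microsoft.Xrm.Sdk;                // RemoteExecutionContext
public class ContextRelay
{
    private readonly QueueClient inbound =
        QueueClient.CreateFromConnectionString("<inbound-connection-string>", "crmcontext");
    private readonly QueueClient outbound =
        QueueClient.CreateFromConnectionString("<outbound-connection-string>", "crmxml");
    public void ProcessNext()
    {
        // Receiving one message at a time on a single thread preserves order
        BrokeredMessage message = inbound.Receive();
        if (message == null) return;
        try
        {
            // CRM serializes the plugin execution context as a RemoteExecutionContext
            RemoteExecutionContext context = message.GetBody<RemoteExecutionContext>();
            string xml = BuildXml(context);   // your own context-to-XML mapping
            outbound.Send(new BrokeredMessage(xml));
            message.Complete();               // remove from the inbound queue only on success
        }
        catch
        {
            message.Abandon();                // make the message visible again
            throw;
        }
    }
    private string BuildXml(RemoteExecutionContext context)
    {
        return string.Empty;                  // placeholder
    }
}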
Labels:
Azure Service Bus,
BizTalk,
Dynamics CRM 2015
Saturday, July 4, 2015
Dynamics CRM 2015 Online and Azure Service Bus
In the previous post I talked about sending messages to another system via a message queue. This is the design pattern Microsoft recommends when you want to use Dynamics CRM Online to update systems that are on-premise. They recommend using Azure Service Bus and the integration is built into the Online version by creating a Service Endpoint. This is also available for the on-premise version.
The plugin registration tool for Dynamics CRM offers the ability to register a Service EndPoint when you are connected to CRM Online. What is not clear though is that there are two ways you can configure it.
First, though, there are two gotchas when setting up the Azure Service Bus.
GOTCHA #1: You set up the Azure Service Bus Namespace using the Portal. Wrong.
Doing it that way, you no longer have the option to use ACS authentication, which is what the Plugin Registration tool uses. Delete it. Download the PowerShell Azure Commands add-in and run this command:
New-AzureSBNamespace -Name yourservice -Location "West Europe" -CreateACSNamespace $true -NamespaceType Messaging
The response you get back includes the new namespace's keys. You need to use the DefaultKey when the Plugin Registration tool prompts you for the Management Key!
GOTCHA #2: You create the Queue (or whatever) using the Portal or PowerShell. Wrong.
You need to leave this for the Plugin Registration tool to do.
I won't give the rest of the details for configuring the endpoint because that is covered in other blogs.
Once you have the Service EndPoint registered there are two ways forward.
The first and seemingly the most attractive option is to register steps and images right there under the endpoint, just as you would with a plugin. The advantage is that this is a zero-code solution: just by configuring an entity with the appropriate step and image you can get messages in your queue (or whatever). The catch is that this method only supports asynchronous operation. That may be fine if you have a very simple CRM solution and want to configure only one or two entities, but in more real-world scenarios it is not going to work for you because it won't guarantee ordered delivery. That is what I covered in my previous post. To maintain Ordered Delivery you must use synchronous plugin steps.
The second route is to create an Azure-aware plugin. There is sample code for doing this in the SDK and out there in the blogosphere. In this case you just create the service endpoint and copy the Id that it creates. Create your Azure-aware plugin and paste the Id into the Unsecure Configuration section, then register your plugin steps and images as usual. The plugin uses an instance of the IServiceEndpointNotificationService and essentially posts the context (using the Execute method) to the Service Bus endpoint. The point here is that you have full choice over how to register your steps, so if you need Ordered Delivery you can choose Synchronous.
Personally, I find the whole method of configuring a Service EndPoint sucks. What about when I want to deploy this to other environments? I am going to have to repeat the manual steps for each environment, and when I deploy my Azure-aware plugin I am going to have to amend the Id each time. Now you might argue this is a one-off process and it's no big deal, but I prefer my deployments not to involve manual steps, so I'm inclined to post messages to the Azure Service Bus using code and have the connection string stored in a configuration entity along with other environment settings. Remember, though, that you have to use the REST API to post messages because the plugin runs in Sandbox mode.
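Posting to a queue over REST from a sandbox plugin only takes a few lines. A minimal sketch, assuming a queue secured with a Shared Access Signature policy (SAS, rather than the ACS authentication the registration tool needs); the namespace, queue, key name and key are placeholders you would read from your configuration entity:
using System;
using System.Net;
using System.Security.Cryptography;
using System.Text;
public static class ServiceBusRest
{
    public static void SendMessage(string ns, string queue, string keyName, string key, string body)
    {
        string uri = string.Format("https://{0}.servicebus.windows.net/{1}/messages", ns, queue);
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.Authorization] = CreateSasToken(uri, keyName, key);
            client.Headers[HttpRequestHeader.ContentType] = "application/xml";
            client.UploadString(uri, "POST", body);
        }
    }
    // Standard Service Bus SAS token: HMACSHA256 over "<encoded-uri>\n<expiry>"
    private static string CreateSasToken(string resourceUri, string keyName, string key)
    {
        string encodedUri = Uri.EscapeDataString(resourceUri);
        long expiry = (long)(DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalSeconds + 3600;
        string stringToSign = encodedUri + "\n" + expiry;
        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key)))
        {
            string signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return string.Format("SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
                encodedUri, Uri.EscapeDataString(signature), expiry, keyName);
        }
    }
}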
Dynamics CRM, Plugins, Ordered Delivery and Queues
This post applies to both Dynamics CRM Online and On-Premise. The scenario is where you need to keep another system synchronized with changes to Dynamics CRM entities. To make this loosely coupled you can write messages to a queue; if your target system is unavailable, the queue stores messages until it comes back online. The same design pattern is recommended for Dynamics CRM Online, where messages are written to an Azure Service Bus queue. You then have a process on-premise (it may be an ESB) that reads messages from the queue and sends them on to the target system.
This post stems from work I did on a previous project where we used CRM on-premise to write messages to MSMQ. If you are reading this far I assume you already know about Ordered Delivery but here is the bottom line:
If you want to maintain Ordered Delivery you must use Synchronous Plugins.
If you use a plugin registered as asynchronous, it may appear to give you Ordered Delivery four times out of five, but you cannot guarantee it for all messages.
You can spend the time proving it for yourself or read this explanation.
We had a custom entity for address that meant you could create an address that was the primary address, the regulatory address, or both. The business rule was that you could only have one active address for primary and regulatory. To achieve this we created a plugin that fires on Create of an address as a Pre-Operation: if you set both the primary and regulatory flags on the new address to true, it checks whether any existing addresses are primary or regulatory, sets those flags to false, and then deactivates the address(es). Now the target system has to obey the same logic, so we need to send messages to it in the correct order, i.e. with ordered delivery.
So I created a generic plugin that writes a message out to MSMQ. I registered it to run as a Post-Operation on Create and Update of an Address, and set it to run asynchronously.
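The plugin itself boils down to very little. A minimal sketch, assuming an on-premise deployment registered outside the sandbox (System.Messaging is not available to sandboxed plugins) and a hypothetical local private queue; BuildMessageBody stands in for whatever serialization you choose:
using System;
using System.Messaging;   // reference System.Messaging.dll
using Microsoft.Xrm.Sdk;
public class MsmqRelayPlugin : IPlugin
{
    // Hypothetical queue path; in practice read it from configuration.
    // The queue must be transactional for MessageQueueTransactionType.Single.
    private const string QueuePath = @".\private$\crmmessages";
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));
        var target = (Entity)context.InputParameters["Target"];
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(BuildMessageBody(context, target),
                       MessageQueueTransactionType.Single);
        }
    }
    private static string BuildMessageBody(IPluginExecutionContext context, Entity target)
    {
        // Placeholder; see my post on serializing an entity to XML
        return string.Format("{0}:{1}", context.MessageName, target.LogicalName);
    }
}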
In one test case we have two existing addresses, one set as primary and the other as regulatory (let's call them 'Primary' and 'Regulatory'). We create a new address ('New') and set it to both primary and regulatory. That creates five messages:
1. Update of Primary to set the primary flag to false
2. Update of Primary when status is set to deactivate
3. Update of Regulatory to set the regulatory flag to false
4. Update of Regulatory when status is set to deactivate
5. Create of the New address
Now you want to maintain the order in which the addresses were written to the database. The Create must come last, or you've broken the business rule about only having one active primary or regulatory address. With on-premise CRM I could examine the asynchronous table and see the five messages there. They were flagged as belonging to the same transaction, but their processed times were all identical. All five records execute simultaneously, and it's a matter of chance which message gets into the queue first. There is an order, but it's not consistent.
BizTalk works in a similar way to the CRM Asynchronous Service, and it's architected that way for performance reasons.
When I changed the plugin to work synchronously, it maintained the correct order of the messages. You do need to pay attention to the Rank when you have multiple plugins registered on the same entity for the same stage. By default Rank is zero, but you can set any integer up to 99, and this determines the order in which the plugins fire. I wanted my message to be the last plugin to execute, so I set it to 99. Remember, though, that Rank only affects the order of the plugins within the same stage. The plugin pipeline always executes as
1. Pre-Validation
2. Pre-Operation (before the database write)
3. Post-Operation (after the database write)
Here is the bottom line again
If you want to maintain Ordered Delivery you must use Synchronous Plugins.
Labels:
Azure Service Bus,
Dynamics CRM 2015,
MSMQ,
Ordered Delivery
Thursday, March 5, 2015
Calling a WCF Web Service over HTTPS (SSL)
I was recently trying to access a web service that I wanted to secure over HTTPS. I got it working as an HTTP service, as you do, and made sure that I had a certificate on the server and enabled the https protocol.
I was using basicHttpBinding, and here are the changes that need to be made to the binding configuration:
<security mode="Transport">
  <transport clientCredentialType="None" />
</security>
That now worked in the browser if I prefixed the URL with https://.
The next step was to call the web service from a client where I could not use a web.config or app.config. Without a service reference you have to set up the binding in code. First, make sure you have a copy of the service interface accessible in the client. It doesn't need the same name, but it does need to specify the operation contract exactly.
[ServiceContract]
public interface IFormDefinition
{
[OperationContract]
[FaultContract(typeof(CRMSoapFault))]
void PublishFormMetaData(string crmEndPoint, string formId, string webResource, string token);
}
[DataContract]
public class CRMSoapFault
{
public CRMSoapFault(string errorMsg)
{
this.ErrorMsg = errorMsg;
}
/// <summary>
/// This property is used to pass the custom error information
/// from service to client.
/// </summary>
[DataMember]
public string ErrorMsg { get; set; }
}
To call the web service and set the binding information through code to match this you need to add:
BasicHttpBinding myBinding = new BasicHttpBinding();
myBinding.Security.Mode = BasicHttpSecurityMode.Transport;
myBinding.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;
EndpointAddress myEndpoint = new EndpointAddress(endPointUrl);
ChannelFactory<IFormDefinition> myChannelFactory = new ChannelFactory<IFormDefinition>(myBinding, myEndpoint);
try
{
IFormDefinition wcfClient1 = myChannelFactory.CreateChannel();
// call the web service method
wcfClient1.PublishFormMetaData(crmEndPoint, formId, webResource, token);
}
catch (FaultException<CRMSoapFault> faultEx)
{
    // Inspect faultEx.Detail.ErrorMsg and faultEx.Reason here
}
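One refinement worth making to the snippet above: WCF channels should be closed when you are done, and aborted if they fault. A sketch of the usual pattern, restructured so the proxy is declared before the try block and stays in scope (IClientChannel lives in System.ServiceModel):
IFormDefinition wcfClient1 = myChannelFactory.CreateChannel();
try
{
    wcfClient1.PublishFormMetaData(crmEndPoint, formId, webResource, token);
    ((IClientChannel)wcfClient1).Close();
    myChannelFactory.Close();
}
catch (CommunicationException)   // FaultException<T> derives from CommunicationException
{
    ((IClientChannel)wcfClient1).Abort();
    myChannelFactory.Abort();
}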
Monday, March 2, 2015
Accessing SharePoint Online with Web Client
Misleading title, really, because if you try to access SharePoint Online using the WebClient it will fail to authenticate. What you need to do is use a CookieContainer and SharePointOnlineCredentials.
I got the basics of this from this post, and also from this post, which uses a class that inherits from WebClient. It mentions that you need the SharePoint Client Components SDK, which installs Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll. Add references to both DLLs in your project and add these using statements (System.Net is needed for WebClient and CookieContainer)
using Microsoft.SharePoint.Client;
using System.Net;
using System.Security;
Add this class to your project
public class ClaimsWebClient : WebClient
{
    private CookieContainer cookieContainer;

    public ClaimsWebClient(Uri host, string userName, string password)
    {
        cookieContainer = GetAuthCookies(host, userName, password);
    }

    protected override WebRequest GetWebRequest(Uri address)
    {
        // Attach the SharePoint Online authentication cookies to every request
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = cookieContainer;
        }
        return request;
    }

    private static CookieContainer GetAuthCookies(Uri webUri, string userName, string password)
    {
        var securePassword = new SecureString();
        foreach (var c in password) { securePassword.AppendChar(c); }

        // Exchange the credentials for the authentication cookie SPO expects
        var credentials = new SharePointOnlineCredentials(userName, securePassword);
        var authCookie = credentials.GetAuthenticationCookie(webUri);

        var cookieContainer = new CookieContainer();
        cookieContainer.SetCookies(webUri, authCookie);
        return cookieContainer;
    }
}
Then call the ClaimsWebClient class in the same way as you would the WebClient. Note that you do not need to set the credentials, because that is done within the ClaimsWebClient class.
ClaimsWebClient wc = new ClaimsWebClient(new Uri(sharePointSiteUrl), userName, password);
byte[] response = wc.DownloadData(sourceUrl);
Saturday, February 28, 2015
Raising SoapFaults on a Web Service
I have used SoapFaults (FaultExceptions) on web services before, so here is a quick recap. In your interface class add this declaration
[DataContract]
public class SPSoapFault
{
public SPSoapFault(string errorMsg)
{
this.ErrorMsg = errorMsg;
}
[DataMember]
public string ErrorMsg { get; set; }
}
Beneath the OperationContract declaration of the method you want to use this on, add the FaultContract attribute
[OperationContract]
[FaultContract(typeof(SPSoapFault))]
Now in the service you can throw FaultExceptions of this type
throw new FaultException<SPSoapFault>(new SPSoapFault("The byte array is null"), new FaultReason("Required parameter"));
Note that you must add a Fault Reason.
When calling this from a client application, make sure you declare the fault type from the web service reference.
SPService.SPSoapFault spfault = new SPService.SPSoapFault();
the catch block of your try catch should include this
catch (FaultException<SPSoapFault> faultEx)
{
spfault = faultEx.Detail;
string error = spfault.ErrorMsg;
string reason = faultEx.Reason.ToString();
}