Thursday, May 19, 2016

Deploying a Word document generation template to a new Dynamics 2016 environment.

Microsoft are treating the new document generation facility in Dynamics 2016 as the best thing since sliced bread. There are some genuine limitations with it at the moment which I've seen posted elsewhere.
But I've come across an issue with custom entities. I managed to create a template alright, but then I tried to install it in another environment. I wasn't trying anything fancy, just a manual install of the template. But when I added the template it bound to the wrong entity. It doesn't offer me the option to choose the entity; it just goes ahead and binds to the wrong one.
It is due to the object type code, sometimes called the entity type code. That code is fixed for all the standard entities, but custom entities are given a number in the 10,000 range. What I've discovered is that this number is not the same across environments. It's not that I've been stupid and manually created the entity: I deployed a solution as you normally do, but the new environment issues a different type code. You can see it easily if you look at the full URL for an entity record, the one you use when you want to find the unique GUID; the etc parameter is the entity type code.
Now this wouldn't be a problem, except that when you create a Word document template it adds the entity type code to the end of the namespace. You can see it in Word in the XML Mapping pane; it's there at the end of the URN. That is what causes the template to bind to the wrong entity when it's deployed into the new environment.
If you've spent hours creating the template you don't want to have to repeat the work in each environment.
To solve it, take the template and rename the extension from docx to zip. When you extract it you will see a customXml folder, and in there a file that contains the URN of the source data. Change the etc at the end to match the correct value in the target environment, and do the same for any other related custom entities. Zip it back up, rename it back to docx and load it into the target environment; it should now bind correctly. Maybe I'll find the time to write an app to do this for me, because I don't see Microsoft fixing this anytime soon. Damn, sometimes Microsoft just don't think things through.
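In case I ever do get round to it, here's a rough sketch of what such an app might look like. It treats the docx as the zip package it really is and patches the customXml parts in place. The file path and the two type codes are made-up values for illustration, and the simple string replace assumes the source type code only appears as part of the URN.

using System;
using System.IO;
using System.IO.Compression;   // needs .NET 4.5 and a reference to System.IO.Compression.FileSystem
using System.Text;

class RetargetTemplate
{
    static void Main(string[] args)
    {
        // Hypothetical values for illustration: the template and the custom entity's
        // type code in the source and target environments
        string templatePath = @"C:\Templates\MyCustomEntityTemplate.docx";
        string sourceEtc = "10021";
        string targetEtc = "10035";

        // A docx is just a zip package, so open it for update and patch the customXml parts
        using (ZipArchive archive = ZipFile.Open(templatePath, ZipArchiveMode.Update))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                if (!entry.FullName.StartsWith("customXml/", StringComparison.OrdinalIgnoreCase))
                    continue;

                using (Stream stream = entry.Open())
                {
                    string content;
                    using (var reader = new StreamReader(stream, Encoding.UTF8, true, 1024, leaveOpen: true))
                    {
                        content = reader.ReadToEnd();
                    }

                    // Swap the entity type code at the end of the urn (and anywhere else it appears)
                    string patched = content.Replace("/" + sourceEtc, "/" + targetEtc);

                    // Rewrite the entry with the patched XML
                    stream.SetLength(0);
                    stream.Position = 0;
                    using (var writer = new StreamWriter(stream, new UTF8Encoding(false)))
                    {
                        writer.Write(patched);
                    }
                }
            }
        }

        Console.WriteLine("Template retargeted to etc " + targetEtc);
    }
}

Run it against a copy of the template, then import the patched docx into the target environment as usual.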

Friday, May 13, 2016

Retrieving the Message from a Service Bus Queue

This is a follow-up to the previous post, where I sent a strongly typed XML message from a CRM plugin to a Service Bus queue. This is what the XML that I sent to the queue looks like.
Note that the root node is the name of the entity. This is a snippet showing the attributes and then the formatted values. You can easily manipulate the XML by changing the way the sending plugin populates the DataTable (there's a sketch of that after the snippet).

 <contact>
  <attributes>
    <address1_addressid>c6fd930f-512a-42fb-a645-af4a672c740f</address1_addressid>
    <address2_addressid>6a280e36-c009-463b-84f1-8666c2dfc5ba</address2_addressid>
    <address2_addresstypecode>1</address2_addresstypecode>
    <address2_freighttermscode>1</address2_freighttermscode>
    <address2_shippingmethodcode>1</address2_shippingmethodcode>
    <address3_addressid>5475a20e-3c79-479a-a1d5-05b0f144a8fc</address3_addressid>
    <contactid>1d50bcf0-2519-e611-80da-5065f38b46e1</contactid>
    <createdby>Charles Emes</createdby>
    <createdon>13/05/2016 16:15:51</createdon>
    <!-- ... more attributes ... -->
    <!-- ... now the formatted values ... -->
    <address2_addresstypecodename>Default Value</address2_addresstypecodename>
    <address2_freighttermscodename>Default Value</address2_freighttermscodename>
    <address2_shippingmethodcodename>Default Value</address2_shippingmethodcodename>
    <createdonname>13/05/2016 17:15</createdonname>
  </attributes>
 </contact>
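As an aside on manipulating the XML: the shaping happens in the ConvertEntityToDataTable method of the sending plugin (shown in full in the post below). Here is a hedged variation that simply filters out attributes you don't want in the outgoing message; the prefix list is made up for illustration, and the getAttributeValue formatting from the full version is left out for brevity.

using System.Data;
using System.Linq;
using Microsoft.Xrm.Sdk;

public static class SelectiveSerializer
{
    // Illustrative filter: drop the address2_*/address3_* noise seen in the snippet above
    private static readonly string[] SkipPrefixes = { "address2_", "address3_" };

    public static void ConvertEntityToDataTable(DataTable dataTable, Entity entity)
    {
        DataRow row = dataTable.NewRow();
        foreach (var attribute in entity.Attributes.OrderBy(a => a.Key))
        {
            // Leave out any attribute whose logical name starts with a skipped prefix
            if (SkipPrefixes.Any(p => attribute.Key.StartsWith(p)))
                continue;

            if (!dataTable.Columns.Contains(attribute.Key))
                dataTable.Columns.Add(attribute.Key);

            // The full version formats EntityReference, OptionSetValue, Money etc. via getAttributeValue
            row[attribute.Key] = attribute.Value == null ? string.Empty : attribute.Value.ToString();
        }
        dataTable.Rows.Add(row);
    }
}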

One thing the message is missing is a namespace, and that is easy enough to add once I load it into an XmlDocument. I'm a BizTalk developer, so right now I'm happy with a message that I can transform into anything I want. The attributes are in alphabetical order, so I can cope with missing attributes easily enough in the XSD by making them nillable="true" with minOccurs="0". But I can also deserialize this into a contact object by creating a class from the XSD using XSD.EXE. Note this is absolutely not a CRM Contact object but my own object. Check out my usings, because there is no Microsoft.Xrm.Sdk anywhere to be seen.
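For reference, here is a trimmed-down, hand-written stand-in for the kind of class xsd.exe generates, covering just a few of the fields from the snippet above. The real generated class has one member per element, but the shape is the same, and there's still no Microsoft.Xrm.Sdk in sight.

using System.Xml.Serialization;

// Hand-written stand-in for the xsd.exe-generated class, trimmed to a few fields
// from the sample message above. Dates are kept as strings to avoid culture issues.
[XmlRoot("contact")]
public class contact
{
    [XmlElement("attributes")]
    public contactAttributes attributes { get; set; }
}

public class contactAttributes
{
    [XmlElement("contactid")]
    public string contactid { get; set; }

    [XmlElement("createdby")]
    public string createdby { get; set; }

    [XmlElement("createdon")]
    public string createdon { get; set; }

    [XmlElement("address2_addresstypecodename")]
    public string address2_addresstypecodename { get; set; }
}

Any elements in the message that have no matching member are simply ignored by the XmlSerializer, which is exactly what I want for a trimmed class.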

This receiver is just a console app and I ripped the code from another blogger. I'm just using it to show what you can do with the message now you've got it.

     
using System;
using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Serialization;
using Microsoft.Azure;                  // CloudConfigurationManager (Microsoft.WindowsAzure.ConfigurationManager package)
using Microsoft.ServiceBus.Messaging;   // QueueClient, BrokeredMessage, MessagingException

class Program
{
    static void Main(string[] args)
    {
        string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
        QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "anoqueue");
        //Console.WriteLine("\nReceiving message from Queue...");
        BrokeredMessage message = null;
        while (true)
        {
            try
            {
                // Receive messages from the queue
                message = client.Receive(TimeSpan.FromHours(10));
                if (message != null)
                {
                    Console.WriteLine(string.Format("Message received: Id = {0}", message.MessageId));
                    string s = new StreamReader(message.GetBody<Stream>(), Encoding.ASCII).ReadToEnd();

                    // Load the string into an XmlDocument
                    XmlDocument xmldoc = new XmlDocument();
                    xmldoc.LoadXml(s);

                    // Add a namespace because the message doesn't have one
                    xmldoc.DocumentElement.SetAttribute("xmlns", "http://xrm.generic.schemas");
                    string mydocpath = @"C:\Projects\Test\ReceiveFromSB\";
                    xmldoc.Save(mydocpath + "saved.xml");

                    // Deserialize into my own contact object too, using the EntityName
                    // property on the message to set the root element name
                    XmlRootAttribute xRoot = new XmlRootAttribute();
                    xRoot.ElementName = message.Properties["EntityName"].ToString();
                    XmlSerializer serializer = new XmlSerializer(typeof(contact), xRoot);
                    StringReader rdr = new StringReader(s);
                    contact myContact = (contact)serializer.Deserialize(rdr);

                    // Now delete the message from the queue
                    message.Complete();
                }
                else
                {
                    // No more messages in the queue
                    break;
                }
            }
            catch (MessagingException me)
            {
                if (!me.IsTransient)
                {
                    Console.WriteLine(me.Message);
                    throw;
                }
                else
                {
                    // HandleTransientErrors(me);
                }
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
                throw;
            }
        }
        client.Close();
    }
}
 

Dynamics CRM Online: Posting Messages to the Azure Service Bus

I have to say I've never really liked the way Dynamics CRM integrates with the Azure Service Bus. I've posted on this before here and here. What I dislike is the way it posts the Context to the Service Bus. In order to do anything with the Context I need to use the RemoteExecutionContext and then extract the entity that I want. That means using the Xrm SDK and being familiar with the way to handle the Context. But I like a loosely coupled architecture. I would expect a CRM plugin to place a strongly typed XML message on the Service Bus; then processing of the message requires no knowledge of CRM. I'm a BizTalk developer, and for me interfaces are always about messages that comply with XML schemas, because the schema acts as the contract. In a development environment where you have different teams of developers with different skill sets, I believe this separation is vital.

Now you may know that Dynamics CRM only supports ACS authentication for the Service Bus, and you have to create this using PowerShell commands because it is not possible via the portal. See this post. You can find the details of setting up the Service Endpoint elsewhere, but the bottom line is that you are using a Guid that points to the Service Endpoint record. So what about deploying to another environment where you will use a different Service Endpoint? Well, you have to go through the manual process again. What, in a Production environment? Are you kidding me? No, I want to be able to deploy and then just configure the endpoint URL.

The challenge in doing this online is that the plugin runs in Sandbox mode, which restricts what .NET assemblies I can use, System.Security being one of them. That rules out SAS authentication because it relies on System.Security.Cryptography.

My solution is a plugin that does two things. It populates a DataTable with the attributes from the entity and produces strongly typed XML. It then uses WebClient to POST the XML as a string to the Service Bus using the REST API. The authentication uses the ACS token, and the only configuration parameters I need are the service bus namespace, the queue name and the issuer secret. The great thing about the solution is that it is totally generic: it works with any entity, and all you need to do is configure a plugin step. A couple of points to note.
1. I'm using a queue.
2. My plugin is registered as synchronous because I want to maintain ordered delivery.
3. I'm sending to the queue synchronously, because I want to know I successfully sent it.
4. I can choose whether I want to use the PreImage, PostImage or Target.
5. The plugin calls my EntitySerialize class (a sketch of a plugin step that does this follows the code below).
6. I am indebted to other bloggers for much of this code.

The next post shows how you can retrieve the message from the queue.

using System;

using System.Collections.Specialized;
using System.Data;
using System.Linq;
using System.Text;
using System.Net;
using Microsoft.Xrm.Sdk;

public class EntityHelper
    {
        // ************** Set these from your own Service Bus namespace, queue and ACS issuer secret **************
        //
        static string ServiceNamespace = "yournamespace";
        static string ServiceHttpAddress = "https://yournamespace.servicebus.windows.net/queuename/messages";
        const string acsHostName = "accesscontrol.windows.net";
        const string sbHostName = "servicebus.windows.net";
        const string issuerName = "owner";
        const string issuerSecret = "your_issuer_secret_goes_here";
        public static void EntitySerialize(ITracingService tracingService,  IOrganizationService service, Entity entity, string orgName, string msgName, string correlation)
        {
            try
            {
                DataSet ds = new DataSet(entity.LogicalName);
                DataTable dt = new DataTable("attributes");
                ConvertEntityToDataTable(dt, entity);
                ds.Tables.Add(dt);
                string xml = ds.GetXml();
                PostMessageToBus(entity, orgName, msgName, correlation, xml);
            }
            catch (System.Net.WebException  we)
            {
                tracingService.Trace(we.Message);
                throw new Exception("Web Exception: " +we.Message);
            }
            catch (Exception e)
            {
                tracingService.Trace(e.Message);
                throw;
            }
        }
        private static void PostMessageToBus(Entity entity, string orgName, string msgName, string correlation, string xml)
        {
            WebClient webClient = new WebClient();
            webClient.Headers[HttpRequestHeader.Authorization] = GetToken();
            webClient.Headers[HttpRequestHeader.ContentType] = "application/atom+xml;type=entry;charset=utf-8";
            // Set brokered Properties this way
            webClient.Headers.Add("BrokerProperties", "{ \"MessageId\":\"123456789\", \"Label\":\"M1\"}");
            // Examples of custom properties that can be passed; they are available to the receiver in the message.Properties collection
            webClient.Headers["Correlation"] = correlation;
            webClient.Headers["OrganisationName"] = orgName;
            webClient.Headers["MessageName"] = msgName;
            webClient.Headers["EntityName"] = entity.LogicalName;
            var response = webClient.UploadData(ServiceHttpAddress, "POST", System.Text.Encoding.UTF8.GetBytes(xml));
            string responseString = Encoding.UTF8.GetString(response);
        }
        ///////
          private static void ConvertEntityToDataTable(DataTable dataTable, Entity entity)
         {
             DataRow row = dataTable.NewRow();
             foreach (var attribute in entity.Attributes.OrderBy(a=>a.Key))
             {
                 if (!dataTable.Columns.Contains(attribute.Key))
                 {
                     dataTable.Columns.Add(attribute.Key);
                 }
                 if (getAttributeValue(attribute.Value) != null)
                 {
                     row[attribute.Key] = getAttributeValue(attribute.Value).ToString();
                 }
             }
             foreach (var fv in entity.FormattedValues.OrderBy(a=>a.Key))
             {
                 if (!dataTable.Columns.Contains(fv.Key + "name"))
                 {
                     dataTable.Columns.Add(fv.Key + "name");
                 }
                 row[fv.Key + "name"] = fv.Value;
             }
             dataTable.Rows.Add(row);
         }
        ///////
         private static object getAttributeValue(object entityValue)
         {
             object output = "";
              switch (entityValue.GetType().ToString())
             {
                 case "Microsoft.Xrm.Sdk.EntityReference":
                     output = ((EntityReference)entityValue).Name;
                     break;
                 case "Microsoft.Xrm.Sdk.OptionSetValue":
                     output = ((OptionSetValue)entityValue).Value.ToString();
                     break;
                 case "Microsoft.Xrm.Sdk.Money":
                     output = ((Money)entityValue).Value.ToString();
                     break;
                 case "Microsoft.Xrm.Sdk.AliasedValue":
                     output = getAttributeValue(((Microsoft.Xrm.Sdk.AliasedValue)entityValue).Value);
                     break;
                 default:
                     output = entityValue.ToString();
                     break;
             }
             return output;
         }
         private static string GetToken()
         {
             var acsEndpoint = "https://" + ServiceNamespace + "-sb." + acsHostName + "/WRAPv0.9/";
             // Note that the realm used when requesting a token uses the HTTP scheme, even though
             // calls to the service are always issued over HTTPS
             var realm = "http://" + ServiceNamespace + "." + sbHostName + "/";
             NameValueCollection values = new NameValueCollection();
             values.Add("wrap_name", issuerName);
             values.Add("wrap_password", issuerSecret);
             values.Add("wrap_scope", realm);
             WebClient webClient = new WebClient();
             byte[] response = webClient.UploadValues(acsEndpoint, values);
             string responseString = Encoding.UTF8.GetString(response);
             var responseProperties = responseString.Split('&');
             var tokenProperty = responseProperties[0].Split('=');
             var token = Uri.UnescapeDataString(tokenProperty[1]);
             return "WRAP access_token=\"" + token + "\"";
         }

     }
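For completeness, this is roughly the shape of a plugin step that calls the EntityHelper class above. It's a sketch rather than my actual plugin: the image name "PostImage" and the class name are assumptions, and in a real deployment the Service Bus settings would come from configuration rather than the constants above.

using System;
using Microsoft.Xrm.Sdk;

// Hypothetical plugin step that feeds EntityHelper.EntitySerialize.
// The "PostImage" image name is an assumption made for illustration.
public class PostToServiceBusPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Pick the image to send: the PostImage if one is registered, otherwise the Target
        Entity entity = null;
        if (context.PostEntityImages.Contains("PostImage"))
        {
            entity = context.PostEntityImages["PostImage"];
        }
        else if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
        {
            entity = (Entity)context.InputParameters["Target"];
        }

        if (entity == null)
        {
            return;
        }

        // Serialize the entity to XML and POST it to the Service Bus queue
        EntityHelper.EntitySerialize(
            tracing,
            service,
            entity,
            context.OrganizationName,
            context.MessageName,
            context.CorrelationId.ToString());
    }
}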

Friday, April 22, 2016

Sharing the Host WiFi with a Hyper-V image

I use Hyper-V for all of my development work. Up until now I've been providing internet access to my Hyper-V images by setting up a Virtual Switch in Hyper-V and configuring it as an Internal network. When my host is connected to the internet, I use Internet Connection Sharing to share that connection with this Virtual Switch. I then configure the Hyper-V images to use a legacy adapter and select this Virtual Switch. Here is how my WiFi adapter is shared.

This procedure is well documented in other posts. But it has a snag: Internet Connection Sharing insists on using IP addresses in the 192.168.0.* range, so your WiFi router must use an alternative IP address range, e.g. 192.168.1.*.

I recently upgraded my Virgin broadband and the new router was using the IP range 192.168.0.*. You could supposedly change it, but I could not get this to work properly, so I started to hunt around for another solution that would leave the IP range unchanged. I stumbled on the wonderful network bridge.

The solution is to go into the Virtual Switch settings and create a new Virtual Switch, but select External network. I selected my WiFi network card from the list and enabled "Allow management operating system to share this network adapter".


When you click OK it takes a minute, but what it's doing is creating a Network Bridge in your Network Connections, along with the Hyper-V Virtual Ethernet adapter you are probably familiar with. My virtual machines use the same approach: a legacy adapter that is configured for this external Virtual Switch.

Now when I look at my Network Connections it's a bit odd. My WiFi adapter displays "Enabled, Bridged" even when I am not connected to the Internet. When I make the connection in the usual way it is the Hyper-V Virtual adapter that goes through the whole "Identifying..." stage until it connects. It looks counter-intuitive but it works a treat. So that's it: my Hyper-V images are connected to the Internet and I didn't have to change the IP address of the router.


Friday, April 15, 2016

Dynamics CRM 2016 and Use Legacy form rendering

So you may know that Microsoft has improved form load by doing some extensive caching.  There is a setting under General Settings called "Use Legacy form rendering" which you can set to Yes or No.

Now you might wonder why I am writing a post about this setting. Well, it was a JavaScript error on Form Load that I was getting. I had my own JavaScript code on the Form Load, but the error shown was confusing:
Field:window
Event:onload
Error:Unable to get property '$o_3' of undefined or null reference

Huh? A bit of Googling showed that others had come across it too.


The post noted that this error only occurs with the new form rendering, i.e. when "Use Legacy form rendering" is set to No. Sure enough, changing the setting to Yes avoids the error.

But what if you want to preserve the faster form rendering? Patric provided the solution in his blog.
Add the following line before calling the prefilter function.

Xrm.Page.getControl("EntityName")._control && Xrm.Page.getControl("EntityName")._control.tryCompleteOnDemandInitialization && Xrm.Page.getControl("EntityName")._control.tryCompleteOnDemandInitialization();

Brilliant. 

Dynamics CRM 2016 and Legacy Entities

This is not the first time I've run into a problem with so called "Legacy Entities". I am referring to the entities that have been left out in the cold since CRM 2013. These include
  • address
  • opportunity product
  • quote product
  • order product
  • invoice product

There are some things that these entities don't support, including:
  • Access in the Tablet App
  • Updating attributes with workflows
  • Business Rules (though oddly Opportunity Product does support business rules)
  • Old style forms

I have also come across problems when upgrading to CRM 2016. This one occurred on the Opportunity Product form, which had been customised in CRM 2015. I had removed a number of fields I didn't need and it was working fine.
After the upgrade to 2016, though, I was getting this JavaScript error on form load:

Field:window
Event:onload
Error:Unable to get property 'addOnChange' of undefined or null reference

Now that was not from any JavaScript code that I had, but from a Microsoft library called OpportunityProduct_main_system_library.js.

When I looked into the code I could see that it was referring to attributes that were not on the form:
  • Manual discount amount
  • Tax

When I added these back onto the form and hid them, the error went away. 

Moral of the story: don't remove fields from legacy entity forms, just hide them.