Logic Apps and the Dynamics CRM Online Connector

An easy way to start integrating with your Dynamics CRM Online instance. It only takes a few minutes and some easy steps to get started!

Getting started with the Connector

Getting started with the Connector is easy; all you need is an account on Dynamics CRM Online.
Read more about how to connect and how to use the various actions and triggers at the MSDN forum.


Working with the Connector

It’s fairly easy to get started with the Connector, selecting data and creating/updating records, but as soon as the business requirements land on our table it gets a bit trickier. Often we need to handle scenarios where we link relations and/or assign ownership to the entity we are creating/updating. Then we need to know a little more about the setup in Dynamics CRM, and our best friend for this is the Settings and Customizations tab.


By then selecting Customize the System, a new window opens where it is possible to verify the entity, check keys, relations and so on.


Using Navigational properties

Navigational properties are used to link entities together. They can be set directly within the body of the update/insert request and will then be linked correctly inside Dynamics CRM Online. In the example below we set the currency (transactioncurrency) of the account using the navigational property _transactioncurrencyid_value:

"body": {
"_transactioncurrencyid_value": "@{first(body('Get_intgrationMetadata')['value'])['_stq_defaultcurrency_value']}",
"accountnumber": "@{triggerBody()['data']['Company_ID']}",
"address2_city": "@{triggerBody()['data']['Town_city_state']}",

Setting owner

A frequently used operation is assigning an owner. This is now implemented so that it is possible to do it directly via the Update/Create operation instead of as a separate operation, as was previously required.

In the Dynamics CRM Online Connector it is easy to set the owner. Just supply the following fields: _ownerid_type, which is set to either systemusers or teams depending on the owner type, and _ownerid_value, which is the key of the user. An example is shown below:

"body": {
"_ownerid_type": "systemusers",
"_ownerid_value": "@{first(body('Get_owner')['value'])['systemuserid']}"

Lessons learned

Don’t overflow the CRM! Since Logic Apps is really powerful when it comes to parallelism, it’s good to have some sort of control over how many new instances are created and executed against Dynamics CRM. We usually try to make sure that there are no more than 200 parallel actions towards a CRM at any given time.

Learn how to check fields, properties and keys, since you will get stuck on errors when sending in the wrong type, and then you will need to check what type it actually is.
OptionSets are commonly used and work well from a GUI perspective, but they are not as good in integration, since each value is translated to a number that we often need to translate to a code or text on the integration side. Learning how to check the values inside CRM will speed up this process.
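In our integrations the translation usually ends up as a small lookup. A minimal sketch of the idea (the option values and codes below are made up, so check the entity customization in your CRM for the real ones):

using System.Collections.Generic;

public static class CustomerTypeOptionSet
{
    // Hypothetical OptionSet values copied from the entity customization in CRM.
    private static readonly Dictionary<int, string> Map = new Dictionary<int, string>
    {
        { 100000000, "RETAIL" },
        { 100000001, "WHOLESALE" },
        { 100000002, "PARTNER" }
    };

    public static string ToCode(int optionSetValue)
    {
        string code;
        return Map.TryGetValue(optionSetValue, out code) ? code : "UNKNOWN";
    }
}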

When we started using the Connector there were problems with assigning ownership and handling float numbers, but these were fixed early by the product group, and since then we haven’t found any issues with the way we are using the Connector.

Validate incoming HTTP request messages in BizTalk

One of BizTalk’s greatest strengths is that messages are not lost. Even if there is an error processing a message we still get a suspended message that an administrator can handle manually.
Sometimes though that is not the preferred way of handling errors.
When working with web APIs it is common to let the client handle any errors by returning a status code indicating whether the request was successful or not, especially if the error is in the client request.

There is a standard pipeline component included in BizTalk that can validate any incoming XML message against its XSD schema. However, if the validation fails the request will only get suspended; all the client receives is a timeout, and an administrator will have to handle the suspended message even though there is not much to do about it since the client connection is closed.

One way of handling this is to do the validation in an orchestration, catch any validation errors and create a response message to return, but writing an orchestration just for validation doesn’t make much sense given the performance implications it brings.

A better solution would be if we could:

  • Abort the processing of the request as early as possible if there is a validation error.
  • Leverage the existing component for validating messages.
  • Return a HTTP 400 status code to indicate to the client that the request was bad.
  • Avoid suspending the message in BizTalk.

Since the receive pipeline is the earliest stage we can inject custom code in, a custom pipeline component would be the best fit.

The pipeline component would need to execute the standard XmlValidator component and catch any validation error thrown by the validator. We could of course log any validation errors to the event log if we still would like some error logging.

If a validation error was caught, we need to create a new message with the validation error details so that the client understands what is wrong.

The context property OutboundHttpStatusCode should be set to 400 so that the adapter knows to return the message with a status code of 400 bad request.

To prevent any further processing of the message and indicate to BizTalk that the new message is the response to return to the client, a number of context properties related to request response messages need to be set. Correlation tokens are copied from the request message to make the response go through the same receive port instance.
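A simplified sketch of the idea (not the published component’s actual source): wrap the stock XmlValidator, and on a validation failure build a fault message and set the relevant context properties. The error XML shape is made up, and the request/response correlation properties are only hinted at in a comment.

using System;
using System.IO;
using System.Text;
using Microsoft.BizTalk.Component;            // the stock XmlValidator component
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class ValidateWithHttpResponseComponent
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        var validator = new XmlValidator();

        try
        {
            // Let the standard component do the actual schema validation.
            return validator.Execute(pContext, pInMsg);
        }
        catch (Exception ex)
        {
            // Validation failed - build a response describing the error (shape is illustrative).
            var factory = pContext.GetMessageFactory();
            var response = factory.CreateMessage();
            var bodyPart = factory.CreateMessagePart();
            bodyPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(
                "<ValidationError>" + ex.Message + "</ValidationError>"));
            response.AddPart("Body", bodyPart, true);

            // Tell the adapter to return 400 Bad Request instead of suspending.
            response.Context.Write("OutboundHttpStatusCode",
                "http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties", 400);

            // To make BizTalk treat this message as the response and stop further
            // processing, the request/response context properties (RouteDirectToTP,
            // EpmRRCorrelationToken, CorrelationToken etc.) must also be copied or
            // promoted from pInMsg.Context - omitted here for brevity.

            return response;
        }
    }
}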

This component is available as a part of iBiz Solutions open source initiative. The source code is available on GitHub and the component is also available on NuGet.

Delete releases from Visual Studio Release Management

At iBiz Solutions we are heavy users of Visual Studio Release Management. Visual Studio Release Management in combination with Team Foundation Server for source control and TFS Build Services creates the Application Lifecycle Management control that we need in our projects. It enables us to have full control over exactly what gets built, which build packages are released to which environment, and so on.

Release Management in action in one of iBiz Solutions’ projects

Visual Studio Release Management is, as mentioned, a critical tool for us, but there are still a few places where the tool could do with some improvements. A critical feature is of course the ability to get an overview of which build is released into which environment – the current version, however, is not very efficient when it comes to searching and filtering the list of releases.

Another missing feature is the ability to delete previous releases. At first this sounds like a bad idea, since one should save all releases as they might provide important information in the future. There are however situations where one makes stupid mistakes and where releases just clutter the bigger picture and make it harder to actually see the releases that are important. An efficient way of filtering, or a way of marking a release as no longer relevant, might have solved the issue, but as mentioned this does not exist in the current version of the tool.

Long story short: here is the script that we run directly against the database to delete specific releases and all their related information in Visual Studio Release Management.

Method Not Allowed when consuming an API App in Azure App Services from C#

When consuming an API App from C#, the Azure App Service SDK helps us generate client code to improve the development process (read more here).

When testing the generated code you might get a ‘Method Not Allowed’ error with status code 405, even if the security settings are correct and the API App works perfectly when used via a Logic App, Postman, Fiddler, Swagger etc.

If the thrown exception looks something like this:

Then the problem is probably an incorrectly generated URI, where the code generator has used http instead of https (a common issue in several places; with API Apps it should always be https).

To check this, go into the class of the ServiceClient (in my case FlatFileEncoder) and check the base URI settings. As you can see in the image below, mine was generated with http instead of https.

After changing the URI from http to https it starts working: my code is executed and the result from the FlatFileToXML function is returned as expected.
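If you prefer not to edit the generated file, the same workaround can often be applied from the calling code, assuming the generated client exposes a settable base URI (the host name below is a placeholder):

// The generated default points at http://... - override it with the https address
// of the API App. Whether BaseUri is settable depends on the generated code; if it
// is not, change the URI in the generated constructor instead, as described above.
var client = new FlatFileEncoder();
client.BaseUri = new Uri("https://flatfileencoder.azurewebsites.net");

// Calls made through the client now go over https and are no longer rejected
// with 405 Method Not Allowed.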

Sharing a BizTalk pipeline component in the BizTalkComponents component library

So you have decided to share a pipeline component in our library? Great! The BizTalkComponents library is hosted on GitHub. A sample pipeline component that can be used as a template is also available on GitHub.

Setting up Git

Before a component is added to the official GitHub repository of BizTalkComponents it must be reviewed. To do that you must first create a public GitHub repository. Make sure to add the .gitattributes file and the .gitignore file for Visual Studio. Also create a README file and a license file; BizTalkComponents uses the MIT license.
To start working on the component, clone the newly created repository using your favorite Git client, such as Visual Studio or GitHub for Windows.

Creating the solution

Fire up Visual Studio and create an empty solution in the root folder of the newly cloned repository. The solution file should be called BizTalkComponents.PipelineComponents.{component name}.sln.

Adding unit tests

All components must have unit tests so the first thing we are going to do is to add a new Unit Test project to our solution. The unit test project should be located in Tests/UnitTests under the root folder.
To be able to run a pipeline component outside of BizTalk we use the Winterdom BizTalk Pipeline Testing library available as a NuGet package.

Build tests that not only ensure that the component’s output is as expected when everything is set up correctly, but also that relevant exceptions are thrown when any precondition fails, such as parameters set by the BizTalk administrator or the existence of context properties.
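A test could look something like the sketch below; the component name, parameter and assertions are placeholders for whatever your component actually does.

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Winterdom.BizTalk.PipelineTesting;

[TestClass]
public class MyComponentTests
{
    [TestMethod]
    public void Execute_ValidMessage_ProducesExpectedOutput()
    {
        // Host the component in an empty receive pipeline so it can run outside BizTalk.
        var component = new MyComponent { SourceProperty = "SomeValue" };   // placeholder component
        var pipeline = PipelineFactory.CreateEmptyReceivePipeline();
        pipeline.AddComponent(component, PipelineStage.Decode);

        var input = MessageHelper.CreateFromString("<ns0:Root xmlns:ns0='http://tempuri.org'/>");
        var output = pipeline.Execute(input);

        Assert.AreEqual(1, output.Count);
        // ...assert on output[0].Context or the part content as appropriate.
    }

    [TestMethod]
    [ExpectedException(typeof(ArgumentException))]
    public void Execute_MissingRequiredParameter_Throws()
    {
        // Preconditions should fail loudly when a required parameter is not set.
        var pipeline = PipelineFactory.CreateEmptyReceivePipeline();
        pipeline.AddComponent(new MyComponent(), PipelineStage.Decode);

        pipeline.Execute(MessageHelper.CreateFromString("<Root/>"));
    }
}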

Implementing the pipeline component

The pipeline component itself should be located in a Src folder under root.
All pipeline components should target .NET 4.0 to ensure compatibility with BizTalk 2010.


The pipeline components of BizTalkComponents use a shared utility library called BizTalkComponents.Utils. This library contains helper classes that reduce the amount of code we need to write for common tasks, like interacting with the message context and reading and writing the component’s property bag. The library is available on NuGet.

Partial classes for a more readable component

The interfaces that every pipeline component needs to implement contain a lot of plumbing code that has nothing to do with what the component actually does. To keep the implementation clean and easy to read, BizTalkComponents uses partial classes to separate plumbing code from the component implementation. The class files should be called {component name}.cs and {component name}.Component.cs. The component class should contain any component metadata properties and methods as well as any Validate method.
BizTalkComponents does not use resource files for component metadata. Name, Version and Description are set directly in the properties.

The IComponentUI interface contains a method for validating parameters. This method is called when the component is built and should use the ValidationHelper from the Utils library.
This method can be complemented with an additional Validate method that is not called at build time but rather when the component runs in BizTalk.
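As a rough sketch of the split (the component name and behaviour below are placeholders, not an actual library component), the two files could look like this:

// MyComponent.cs - the part that does the actual work.
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class MyComponent
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // The real component logic goes here.
        return pInMsg;
    }
}

// MyComponent.Component.cs - plumbing: metadata, property bag handling and validation.
// (same using directives as above)
public partial class MyComponent : IBaseComponent, IComponent
{
    public string Name { get { return "My component"; } }
    public string Version { get { return "1.0"; } }
    public string Description { get { return "Does something useful."; } }

    // The IComponentUI and IPersistPropertyBag members (Validate, Load, Save etc.)
    // also live in this file, so that MyComponent.cs stays free of plumbing.
}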

Component parameters

Any component parameter should have annotations:
• Parameters required to be set already at design time should be annotated with the Required attribute
• Parameters required to be set at runtime should be annotated with the RequiredRunTime attribute available in the Utils library.
• All parameters should be annotated with a user friendly name in the DisplayName attribute.
• All parameters should be annotated with a description in the Description attribute.

The parameter property should be accompanied by a string constant to be used when reading from and writing to the property bag.

The Load and Save methods should be implemented using the PropertyBagHelper from Utils.
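Put together, a parameter could look something like this sketch; the RequiredRunTime attribute and PropertyBagHelper are the ones mentioned above, but their namespaces and exact signatures are assumptions here.

using System.ComponentModel;
using BizTalkComponents.Utils;                  // RequiredRunTime, PropertyBagHelper (namespace assumed)
using Microsoft.BizTalk.Component.Interop;

public partial class MyComponent
{
    // String constant used when reading and writing the parameter from the property bag.
    private const string SourcePropertyPropertyName = "SourceProperty";

    [RequiredRunTime]                            // must be set before the component runs in BizTalk
    [DisplayName("Source property")]
    [Description("The context property to read the value from.")]
    public string SourceProperty { get; set; }

    public void Load(IPropertyBag propertyBag, int errorLog)
    {
        // Exact helper method names may differ in BizTalkComponents.Utils.
        SourceProperty = PropertyBagHelper.ReadPropertyBag<string>(propertyBag, SourcePropertyPropertyName);
    }

    public void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties)
    {
        PropertyBagHelper.WritePropertyBag(propertyBag, SourcePropertyPropertyName, SourceProperty);
    }
}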

The Execute method

If the component has any RequiredRuntime properties the custom Validate method should be called at the beginning of the Execute method to ensure that all parameters are set as expected.

All interactions with the message context should use the Utils library’s extension methods.
The ContextProperty entity should be used for referencing any context property. This entity can be initialized either with the BizTalk property notation, namespace#property name, or by separating the namespace and the property into different strings.
The Utils library also contains constants for some standard BizTalk properties.
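A sketch of how the Execute method, the runtime Validate call and a ContextProperty can fit together; the Read/Promote extension-method names and the Validate signature are assumptions, while the namespace#property notation is the one described above.

using System;
using BizTalkComponents.Utils;                  // ContextProperty and context extension methods (namespace assumed)
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class MyComponent
{
    private static readonly ContextProperty ReceivedFileName =
        new ContextProperty("http://schemas.microsoft.com/BizTalk/2003/file-properties#ReceivedFileName");

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Fail fast if any RequiredRunTime parameter is missing.
        string errorMessage;
        if (!Validate(out errorMessage))
        {
            throw new ArgumentException(errorMessage);
        }

        // Read from and promote to the context through the Utils extension methods.
        // SourceProperty is the parameter from the previous sketch.
        var fileName = pInMsg.Context.Read(ReceivedFileName);
        pInMsg.Context.Promote(new ContextProperty(SourceProperty), fileName);

        return pInMsg;
    }

    // Runtime validation of RequiredRunTime parameters (shape assumed).
    private bool Validate(out string errorMessage) { errorMessage = null; return true; }
}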

Building the component

All pipeline components should be strong named so that they can be installed into the GAC. A new strong name file (snk) is created for every pipeline component.

MSI files for all components will be generated. To be able to generate an MSI from MSBuild BtsMsiTask must be installed and used.
BtsMsiTask is called from a custom MSBuild build script that should be located in a Build folder under the root.
All MSIs should have a version. The AssemblyInformationalVersion attribute is used for versioning, and to be able to read that attribute from the build script the MSBuild Extension Pack must also be installed.
The build script should be called Build.proj and also be included as a Solution folder in the solution.

This sample build script can be used by just replacing the component name.

Readme file

The README.md Markdown file generated by GitHub should have a short description of the component. There are many tools for editing Markdown files, for example this extension to Visual Studio.

This blog post aims to introduce the BizTalk Components library and the coding standard it uses.
If you have any questions or comments on the component library or the coding standard don’t hesitate to write a comment to this blog post.

Azure App Services: Logic Apps – process a flat file, transform to EDIFACT, send to FTP

Logic Apps is part of the new and awesome feature set Microsoft released in preview this spring. The focus of this type of functionality is cloud integration: both hybrid integrations that bring data to and from servers inside your network in a secure and easy way, and pure cloud integration.

A Logic App is the application that is created in Azure as the workflow-like application; I won’t explain more in this post, but please read more here.

So the example I’ll take here is a classic one: I have a system that produces small and simple flat files on a local server, and we want to bring that data, in EDIFACT INVOIC format, to an FTP server. Normally this would be solved via an integration platform like BizTalk, but for this scenario we don’t have a BizTalk server that we can use, so we try to solve this via Logic Apps instead.


  1. Pick up a file on the local filesystem
  2. Decode the flatfile and transform it to EDIFACT
  3. Encode EDIFACT
  4. Send to FTP server

Let’s jump right into the portal. We need to start by setting up some connectors (File, FTP), and several API Apps from the marketplace will be used.


SQL Databases

First off we need SQL databases. You can use a previously created server, but we need two empty database instances; I did this in the “old” portal.

The two API Apps that need a database are the following; in brackets you can see my database instance names.

  1. BizTalk Trading Partner Management (TradingPartnerManagement)
  2. BizTalk EDIFACT (EDIDatabase)

Copy the connection strings and change the text {your_password_here} to your password so you can use it later.

Service Bus:

Create a Service Bus namespace for the File Connector.


Download the EDIFACT XSD schema here


Create a BizTalk Service mapper in Visual Studio 2012

  1. Create a Flatfile schema
  2. Create the mapping from the flatfile schema to Edifact in Visual Studio.
    1. BizTalk Service mapper (generating .trfm files). I don’t like this mapper since it has poor XSLT support, so I normally just create a standard BizTalk project, do the mapping in a temporary project and later copy the XSLT into the trfm mapper.

Resulting in these 3 files for me (after downloading the EDIFACT schema)


Now that we are prepared, let’s start creating the services.

Create and configure the API App instances.

In Web API section:

Create File Connector:
We need a File Connector for picking up files on the on-premises server. We need to provide the following information right away when we create the connector:

  • Root Folder: make sure to write down or remember the value, or prepare the server with the folder structure, since the value is impossible to find afterwards (if you do find where to read this value, please let me know how!)
  • Service Bus Connection String: take the appropriate SAS connection string from the earlier created Service Bus namespace

After creating this we can install it on the server, see the section about installing File Connector for more info.

Create FTP Connector:
We need an FTP Connector to be able to send messages to the FTP server. For this we need to provide the following information right away when we create the connector:

  • Server Address (make sure not to end it with a / or similar, since :21 will automatically be added for the port and you will then get an error when trying to use the connector; see the image below for how a file name is constructed in this case, and notice the misplaced :21)
  • User name for the FTP server
  • Password for the user

Create BizTalk Flat File Encoder:

We need a BizTalk Flat File Encoder that will decode and encode flat files to/from XML. In our case it will decode a flat file to XML.

Installation is straightforward and you only need to provide a name for the instance.

Configure BizTalk Flat File Encoder:

For the Flat File Encoder to work we need to add one or more flat file schemas. These schemas are the same ones that are used in BizTalk, so you can easily reuse the ones you have today.
Go into the instance we created (remember the name of the instance); the easiest way to find it is in the API Apps list (Browse -> API Apps).

  1. Click Schemas
  2. Press Add Button
  3. Upload the Flat File Schema (must be a BizTalk flat file schema XSD file)

Create BizTalk Transform Service

We need a BizTalk Transform Service to be able to transform the XML instance of the flat file to an XML instance of the EDIFACT schema.
Creating this is quite straightforward; you just need to provide a name for the service.

Configure BizTalk Transform Service

After creation we need to do some configuration; basically we need to add the maps. Find the app as with the others, and when you have found it do the following:

  1. Click Maps
  2. Press Add button
  3. Upload the Map you prepared in the preparations

Create BizTalk Trading Partner Management

BizTalk Trading Partner Management is used to keep information about partners and to set up agreements for AS2, EDIFACT and X12.

When creating we need to provide the connection string to the earlier created Database.

Configure BizTalk Trading Partner Management

When the Trading Partner Management instance is created we need to do some configuration. Mainly we need to create the partners that are supposed to be used in the B2B flow and then connect them via agreements, so that we can say that partner A is exchanging messages with partner B.

First we need to create at least two partners. Go into the instance (just like the others):

  1. Click Partners
  2. Press Add button and add at least 2 Partners (only name is mandatory)
  3. After the Partners are added we need to create Identities for them (you might want to reload the web API to make sure it will load properly). After that, press one of the partners
  4. Press Profiles
  5. Press the Profile in the Profiles section
  6. Press Identities (see image below for a guidance)
  7. Enter the identity value and pick a Qualifier
  8. Press Save, repeat the steps from point 3 for the other partner

When the partners are created and we have added identities to them we can start creating agreements BUT to be able to create EDIFACT agreements we also need to provide the EDIFACT schemas that we want to use.

  1. Press Schemas
  2. Press add
  3. Upload the Schema (repeat for more schemas if needed)

Now we can create the agreement:

  1. Click Agreements
  2. Press Add button and enter a name
  3. Choose Protocol, in our case it’s EDIFACT
  4. Partner Settings (this is a click-intensive part): set all values according to your setup (all must be set)
  5. Enter receive settings
    I left all of this at the defaults and just selected the INVOIC schema.
  6. Enter Send settings
    I left all of this at the defaults, just selected the INVOIC schema and added an Application Reference.
  7. Enter Batch settings
    I filled in the mandatory fields (name and counter = 1).
    If this is skipped Azure will try to create the agreement, but it will fail, so just fill it in.

Note! After Save write down the Agreement ID, we will use that later on in the Logic App

Create BizTalk EDIFACT

The BizTalk EDIFACT instance will do the hard work of encoding or decoding your EDIFACT message to or from XML. When creating the instance we need the following:

  1. Database Connection string: Connection string to the earlier created Database for EDIFACT.
  2. TPM Instance Name: The name of the earlier created BizTalk Trading Partner Management (from section 5)

There is no need to do any configuration, since the EDIFACT instance will use the configuration we did in the BizTalk Trading Partner Management instance for agreements and so on.


Now we are all set and the preparations are done, let’s start building the Logic Apps app.

Building the app looks easy and is smooth, but there are some tricky parts you will encounter, since we will have some looping over lists in this way of building the Logic App. (I am building it this way so that it’s easier to understand and test; the downside is that if I need to change this later I have some serious remodeling to do, and it’s often easier to create a new flow.)

As I created it I focused on a flow that would be easy to demonstrate, so I added the File Connector in read mode on a folder, instead of the event mode that is also available. (With read mode I can trigger it at any given time, perfect for demos or when we need manual runs.)

So let’s start!

First off we pick up the files from the server.

  1. Add a Recurrence; I set this to standard values (once per hour; depending on the tier choice this might be the lowest value you can set).
  2. Add the File Connector and use the List Files function. Since I will pick up the files from the root folder (which we set when we created the Connector, in my case C:\temp\NewFileConnector) I’ll leave the Folder Path blank.
  3. Add a second File Connector to actually pick up the files that the List Files function located.
    Use the Get File function, and since the List Files result is a collection we need to add “Repeat over list” and pick the list from List Files.

    1. Repeat over the items returned from the first File Connector.
    2. Set File Path to @repeatItem().FilePath, which declares that we should take the FilePath from the repeating item. (This needs to be set manually, since the automatic help will only use the first() function, which would give you the information from the first file only.)
  4. (Optional and not included in my sample) Add a third File Connector with the Delete File function to delete the file from the folder (to prevent picking it up several times)
    1. Repeat over the list from File Connector Get Files
    2. File Path, should be the File Path from the repeating item

Now we have this set up and we will start picking up files, either on the interval or at the time we manually start the Logic App.

Next section will be adding the BizTalk Flat File Encoder, transformation and the BizTalk EDIFACT component.

  1. Add the BizTalk Flat File Encoder, function Flat File to XML.
    1. Repeat over the File Connector Get File result.
    2. Flat File element should be the content from the Get File operation, and since it’s a list we will need to fix this manually.
    3. Schema name must be set; in my case it’s TestInvoice.
    4. Root name must be set; in my case it’s Invoice.
  2. Add the BizTalk Transform Service; now we will transform the XML version of the flat file to the XML version of the EDIFACT.
    1. Once again we need to repeat over a list, in this case the result list from the BizTalk Flat File Encoder.
    2. Input XML should be the output from the Flat File Encoder.
  3. Add the BizTalk EDIFACT to transform the XML version of the EDIFACT to an actual EDIFACT message.
    1. Repeat over the results from the BizTalk Transform Service.
    2. Content should be the XML result from the transformation.
    3. Agreement ID is the EDIFACT agreement ID that was created in the TradingPartnerManagement API App; we wrote this down earlier. (You can also look it up at any time in the Trading Partner Management instance.)
  4. To send this to the FTP server, add the FTP Connector.
    1. Repeat over the result from the BizTalk EDIFACT instance.
    2. Content should be the Payload that is returned as a result from the EDIFACT instance.
    3. File Path: make sure to use something unique. I used the tracking ID to make the file name unique, together with the built-in function @concat, which concatenates several strings. (Make sure not to use the @ sign inside the @concat function, since that will give you an error.)

In my setup the files will not be deleted, so I don’t need to redrop files over and over again. (If we want that behavior, just add another File Connector at position 4 (marked as optional in the setup) or change the whole setup so that we use the trigger function in the File Connector, which will keep track of new files, pick them up and afterwards delete the file in one step. The disadvantage with that is when you want to demo, since it’s not possible to manually trigger that flow via the start flow button.)

The result is a fully functional flow that will pick up the flat file and do all the handling that is needed to eventually end up as an EDIFACT file on the FTP server.
Here is how the full flow looks for me:

Key Takeaways


This is a preview, so don’t be afraid of crashes. I also learned that a lot of the time it was just the async GUI that had the problems, especially if I used the Free tier. Then I quickly filled my quota, and in combination with heavy traffic or high demand I was unable to work with it. But when I switched to the Basic tier almost all problems just disappeared =)

Updating the map

Strangely, I couldn’t update the map; I had to delete it and upload it again.

Installing the File Connector:

To install the File Connector, find it in the portal (the easiest way is to go via Browse), pick API Apps and select the File Connector.

You will notice that when the settings load, it will try to check whether the connector is installed; click the link.

Next is to download the installer; click “Download and Configure”, see the image below.

After downloading, copy and install the application on the preferred server.
When you get prompted for the connection string as below, you should use the Primary Configuration String (marked in yellow in the image above).


Tips when using the File Connector

  • File names cannot contain spaces; that will cause internal server errors in the File Connector (not sure why)

Finding the files and Installed content on your server:

After the microservice is installed it can be found in IIS.

When browsing the file system we will find a folder containing some files that represent a web API:

Testing the installed API App locally with Swagger on your server

Troubleshooting the flow:

So when you get this red error and have no idea what went wrong, we need to start troubleshooting. In the editor it looks and feels so smooth to do things, but it is not as easy to find the errors, especially not the first few times. Those of us who have been in the business for some time are used to troubleshooting and understanding XML notations and layouts, so this can be a little bit new, since a Logic App builds its settings, configuration and input/output on JSON notation, meaning no XPaths but a dot notation instead.

First of all, in a JSON object we use {object}.{attribute} to access something, so looking at this example we will need the following notation in the field to get the value.

Check the error output messages; the example below shows an error in the EDIFACT service, where the reason is found under output.body.Exception.

Other tips when working in the portal.

Watch out for portal-related issues as well. For example, when editing Partners in Trading Partner Management, make sure to wait for the green save signal before proceeding to the next step. I found myself having trouble when I jumped around too fast in the portal.

Thoughts on Microsoft Azure App Services after BizTalk Summit 2015

We’re just back from a couple of great days in London and BizTalk Summit 2015. Saravana and his BizTalk360 team put together a great show with excellent content and an overall superb arrangement!

This year Microsoft had a whole team of people at the conference that during the first day did a number of presentations on the new Azure App Services concept. In case you missed it, App Services is (among other things in the concept) Microsoft’s next attempt to move the integration capabilities from BizTalk Server into the cloud, and more specifically into Azure.

Keynote speaker Karandeep Anand started off by explaining that the vision for App Services is based on three main pillars: Democratize Integration, Becoming an iPaaS Leader and Creating a Rich Ecosystem.

Image by @wearsy

The focus on Democratization is a goal that aims to make it easier to get started and to quickly get to delivering value. This is a great focus, as this is a huge problem with today’s platform. Today we have to install servers, databases, heavy applications, set up accounts and a hundred other things before we can even send a file from A to B! I’m sure that in the end it won’t be as simple as in the demos, but what we’ve seen so far is definitely impressive when it comes to how simple it looks to get started.

Another part of Democratize Integration of course has to do with pricing. As it looks now we can get a platform that will scale not only technically but also price-wise. Our hope is that we’ll soon have a platform that can be used for a whole different segment of customers: customers with smaller budgets and smaller integration needs. That would truly Democratize Integration!

What’s different from BizTalk Services and why will it work this time?

Microsoft has always been great at backwards compatibility and has thought about the hybrid scenarios from the start when it comes to Azure. App Services is no different, and that is to us the main thing that differentiates this offering from what we have in BizTalk Services. The fact that with App Services we can read a flat file from IBM WebSphere MQ on-premises, parse it in an Azure Logic App and send it to, for example, Salesforce without basically any coding is powerful! We can now implement solutions and requirements that we deal with today, solve our customers’ current needs using one platform and deliver value. BizTalk Services never got that far and always felt like a bit of a step backwards and a subset of what we had in BizTalk Server.

So, it’s great to see how Microsoft this time actually has taken a step back and thought about what makes BizTalk Server so good and then tried to incorporate those pieces in the new platform.

What’s missing from App Services?

Experience, shared knowledge and trust

BizTalk Server has been around for ages and a lot of us have 10+ years of experience on the platform. We know exactly what works, what to avoid, and what are good and bad design patterns – we’ve learned to trust the platform (and even deal with its less pretty parts).

In App Services we’ve only done simple demos and minor PoCs so far. How to solve a complex request-response, how to implement scatter-gather (or any more complex pattern for that matter) is still very unclear. What happens when a destination service times out and a $10,000 order goes missing – will App Services watch my back the same way BizTalk Server has done so many times?

From what we’ve seen so far, and what’s already in Azure (for example Service Bus and Queues etc.), many of the tools to solve the more complex scenarios are there – but the knowledge of which pieces to use when isn’t. At iBiz Solutions we will try hard to do our part in filling that knowledge gap, and considering the great community that surrounds Microsoft-based integration I’m sure we won’t be the only ones. 😉


Logic App designer

As with any young platform we’re missing tooling. It seems that the way to build more complex scenarios is to chain a number of more specific Logic Apps together. This, in combination with a missing feature for searching and promoting values in incoming messages, will make it hard to see the status of a specific message, find a certain message and so on. Some sort of status overview and some sort of archiving with smart searching need to happen for this to work in busier and more complex scenarios.

Debugging and testing is another area that currently feels a bit weak. One can see the input and output of each part of the Logic App, but it requires a lot of clicking around and there’s no way of stepping through or replaying different parts etc. I can’t really say that this is an area that’s particularly good in BizTalk Server either, but it’s something that one is always struggling with and that ends up consuming a lot of time.


One thing that’s extremely important in our BizTalk-centric solutions today is how to handle Application Lifecycle Management and the development process it involves. Build servers, automated tests, versioning, test servers and production servers are standard in any BizTalk Server project today. How we can achieve similar workflows, where we guarantee quality, versioning, efficient deployment and the possibility to always roll back to previous versions, isn’t obvious today and needs more work.


At iBiz Solutions we are excited to see Microsoft make another attempt at evolving the integration story! The fact that we this time around can see how we can solve today’s requirements on the future platform makes it even better! We are looking forward to GA, and even though the platform is far from done we feel it is in a much better place than we have been previously, and we have already started talking Swagger and Swashbuckle with our customers. 😉

Build and generate environment specific binding files for BizTalk Server using Team Foundation Build Services

As most know, a BizTalk solution has two major parts: its resources in the form of DLLs, and its configuration in the form of bindings. In a previous post I described how to build and pack resources into an MSI using Team Foundation Server (TFS) Build Services. Managing the configuration in a similar way is equally important.

So let’s see how we can build environment-specific binding files using TFS Build Services and some config transformation syntax goodness!

Creating a simple binding example for development and test environment

Let’s start with a simple example of a binding with a receive port, a receive location and a send port for two different environments – one called “Test” and one called “Production”.

Port type        | Name                                     | Destination path in Test | Destination path in Production
Receive Port     | BtsSample_ReceivePort_A                  | N/A                      | N/A
Receive Location | BtsSample_ReceivePort_A_Location (File)  | C:\Temp\In\*.xml         | C:\Temp\In\*.xml
Send Port        | BtsSample_SendPort_A (File)              | C:\Temp\TEST\            |

As one can see there’s a small difference between the send port’s destination paths in Test and Production.

Exporting a binding template

Next we’ll create a binding template. The binding template will hold all information that is shared between the different environments. This is achieved by an ordinary export of the application binding from the BizTalk Administration Console – as you’ve probably done many times before.

Creating environment specific bindings using web.config Transformation Syntax

The Web.config Transformation Syntax is a feature that showed up in Visual Studio 2010 and is often used to transform app.config and web.config files between different versions and environments – but it will of course work on any type of configuration file, including BizTalk binding files!

So for each environment we’ll create an environment-specific config file that only contains the values that differ between the template and that environment. We use the Web.config Transformation Syntax to match the nodes and values that we’d like to update in the template. Below is the Test-environment-specific file, matching the send port and replacing the value with the value specific to Test.
The Production-specific file also matches the send port but with a different value for the destination path.

Using MSBuild to execute the transformation

As part of the Visual Studio installation an MSBuild target is installed for executing the transform. The target is installed into the standard MSBuild extensions path, which usually means something like C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets, depending on the Visual Studio version etc.

Finally we’ll add a small .proj file to pass some parameters to the MSBuild process. We need to tell the process what file to use as template and what different environment specific files we like to use.

Next we can kick off MSBuild and point it to the created proj file.

C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild build.proj.

Voila! MSBuild has performed our transformation and created two complete environment-specific binding files by combining the template with the environment-specific files – one for test and one for production.

Generating the file using TFS Build Services

Start by setting up a TFS build definition as described in the previous post on generating a BizTalk MSI using TFS Build Services.

We then need to add the $(DestinationPath) property to our Destination path to make sure the outputted binding files are written to the same path as the rest of the resources.

Once we have our build template all that’s needed is to add the build.proj file to the files to compile as shown below.

When finally running the build our two complete binding files are written to the deployment area and ready for installation!

Build and generate MSI for BizTalk Server using Team Foundation Build Services

Using a build server and leveraging continuous integration is good practice in any software development project. The idea behind automated builds and continuous integration is to have a server that monitors one’s source code repository and builds the solution as changes occur. This separate build activity alone will ensure that all artifacts are checked in and that a successful build doesn’t depend on any artifacts or settings on the development machines.

Today build servers do a lot more as part of the build – the build process usually involves execution of tests, labeling the source as well as packing the solution into a deployable artifact.

In this post we’ll see how a build process can be achieved using Team Foundation (TFS) Build Services, building a BizTalk project that results in a deployable MSI artifact.

TFS Build Services

TFS Build services is a component that is part of the standard TFS install media. Each TFS build controller controls a number of “Build Agents” that will perform the actual build process. For each solution to build one has to define its process. These processes are described in a “Build Template” that tells the agent what steps to go through and in what order.

“Build Templates” in TFS Build Services are defined using Visual Studio. The image below shows a build template accessed through Visual Studio Team Explorer.

Major steps in a build template

As one creates a new build template for a solution one has to go through the following major steps:

1. Define a trigger

Decides what should trigger the build. Should it be triggered manually, should it be a scheduled build or should it be triggered at a check-in of new code?

2. Source Setting

This will tell the build process what part of the source tree the build template is relevant for. When queueing a new build this is the part of the source tree that will be downloaded to the staging area. It also tells the build services where on disk the source should be downloaded to.

3. Process

This is where all the steps and activities that the build service should perform are defined. Team Foundation Build Services comes with a number of standard templates and custom ones can be added. In this post we’ll however stick with the default one.

Build your first BizTalk solution

Building a BizTalk Server solution using TFS Build Services is straightforward.

In this post I will use this sample BizTalk solution. After checking it into Team Foundation Source Control (I’ll use TFS source control in this post, but it works similarly using Git) I’ll create a new build template for the solution. All that needs to change is the MSBuild platform setting property, so that we’re using x86 when executing MSBuild, as shown below.

After queuing a build we can in the TFS Build Explorer see a successful build!

We can also download the output from the build where we can see all our build artifacts!

Using BtsMsiTask to create a MSI as part of the build

So far so good, but we started the article by saying that what we wanted was a deployable artifact. In the case of BizTalk this means a BizTalk MSI. Let’s see what we need to change to also have the build process create an MSI.

1. Install BtsMsiTask

Download and install BtsMsiTask. This will install an MSBuild task for generating the MSI.

2. Add a MsBuild project file

Add an MSBuild project file (build.proj) to the solution.
The project file will tell the BtsMsiTask process what artifacts to include. Add the created project file to the solution and check it in as part of the solution.

3. Add the MsBuild project file to the TFS build template

Add the created MsBuild project file to the TFS build template by adding it to the list of projects to build. After another successful build we can see that we also created a MSI as part of the build!

Adding build information to the MSI

File name

As we can see, the MSI we just created ended up with the default BtsMsiTask file name, which is a combination of the BizTalk application name property and the current date and time. Wouldn’t it be nice if we instead could use the build number as part of the name? BtsMsiTask has an optional property called FileName that we can set, for example to <FileName>$(TF_BUILD_BUILDNUMBER).msi</FileName>.

Source location

When installing the artifact to BizTalk Server we can see that the source location property in the BizTalk Administration Console is set to where the artifact was built on the staging area. It’d be nice to also have information about which build produced these artifacts. This gives us the required information to know exactly which builds are used for all the installed artifacts. We can change what is set in the source location by using the SourceLocation property of BtsMsiTask: <SourceLocation>c:\$(TF_BUILD_BUILDNUMBER)</SourceLocation>. So after setting the property as below, queue another build, reinstall using the MSI, and we’ll get the following result with the build number in the source location property. And finally, this is the MSBuild project file we ended up with in our example.

Sending HL7 messages from BizTalk – The pure messaging way

One thing that makes working with HL7 messages in BizTalk a bit different from working with other standards like EDIFACT and X12 is that the HL7 assembler and disassembler relies on multipart messages for separating the header segments, body segments and extension segments (Z segments).

This makes it a lot easier to use a BizTalk map to create any header segments. Unfortunately it also means that we need orchestrations, even if we are not orchestrating any request-response messages. Orchestrations tend to add a lot of complexity to BizTalk integrations, especially if no message orchestration is otherwise needed. Orchestrations are also hard to make reusable, and we typically end up creating an orchestration for every integration.

It is also not always a good thing that headers are created in a BizTalk map. Imagine we have a scenario where we receive an XML message with an envelope that should be mapped and sent as HL7, and the MSH header segment should be based on the envelope of the incoming message. A common way of handling XML envelopes in BizTalk is having the XML Disassembler remove the envelope and promote the elements of interest. But doing so makes the envelope elements inaccessible to the map. Creating a map for the envelope is usually not an option, since the envelope can contain many different types of messages and BizTalk uses namespace + root node for identifying the message.

What we would want to do is to send an HL7 message from BizTalk and base the MSH header segment on properties from the envelope of the incoming message, by promoting them on receive and demoting them on send. Having a generic pipeline component creating the message parts, instead of an orchestration, helps keep the integration simple and easy to maintain. The message body would be created with an ordinary non-multipart map executed on the send port.

Part 1 – Disassembling the incoming XML

There is nothing special going on here, just ordinary out-of-the-box XML envelope handling in BizTalk.

The first thing to do is to strip the envelope from the message and promote all properties that we want to use in the outgoing message.

To achieve this we need to create the schemas of the incoming message. One Envelope schema and one schema for the actual message.

Make sure to set the schema to an Envelope schema

And set the body xpath to the body node.

Create the body schema

Then create a property schema containing a property for every MSH field we want to set.

Now edit the envelope schema to promote each element to the corresponding MSH field, i.e. sender is promoted to the MSH31 context property and recipient is promoted to the MSH51 property.

Set up a Receive port and a Receive location with the XML Receive pipeline.

Part 2 – Transform the incoming message to HL7 XML representation

Just create an ordinary map with the incoming message body as input and the HL7 body segment schema as output.

Create a new send port with the newly created map as outbound map.

Part 3 – Assemble the HL7 message

If we try to run this as-is through the HL7 assembler it won’t work, since the assembler expects to find three message parts: MSHSegment, BodySegments and ZSegments. So we need to create a pipeline component that can create the MSH segment based on the previously promoted properties and create the message parts – this is where the magic happens.

To create the MSH segment we are going to use property demotion. For this to happen we need to setup property promotion from the BizTalk MSH schema to the property schema we created previously.

To create an instance of the MSH schema and demote the MSH context properties to it, I am going to use the same technique I used in a previous blog post.

What this method does is take a parameter for the DocumentSpecName of the BizTalk MSH schema used. It then creates an instance of the schema and iterates through all defined properties, demoting them from the context. The new XML is then added to a new message part.

Z segments are not used in this scenario, so we just set that part to an empty string (it must still exist).

The Execute method of the component is quite simple. It just uses the above methods to create the segments and add it to the outbound message.

The part.Charset = “utf-8” part is very important. Without this national characters like åäö won’t work. The casing of the charset is also important.
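A rough sketch of what such an Execute method could look like; CreateMshPart stands in for the MSH-demotion helper described above, and the part names are the three parts the HL7 assembler expects.

using System.IO;
using System.Text;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class Hl7MultipartCreator
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        var factory = pContext.GetMessageFactory();
        var outMsg = factory.CreateMessage();
        outMsg.Context = pInMsg.Context;

        // MSH segment: an instance of the BizTalk MSH schema with the previously
        // promoted envelope properties demoted into it (helper described above).
        var mshPart = CreateMshPart(factory, pInMsg.Context);
        mshPart.Charset = "utf-8";
        outMsg.AddPart("MSHSegment", mshPart, false);

        // Body segments: the message body, which the outbound map has already
        // turned into the HL7 body segment XML.
        var bodyPart = factory.CreateMessagePart();
        bodyPart.Data = pInMsg.BodyPart.GetOriginalDataStream();
        bodyPart.Charset = "utf-8";   // without this, national characters like åäö break
        outMsg.AddPart("BodySegments", bodyPart, true);

        // Z segments are not used, but the part must still exist.
        var zPart = factory.CreateMessagePart();
        zPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(string.Empty));
        zPart.Charset = "utf-8";
        outMsg.AddPart("ZSegments", zPart, false);

        return outMsg;
    }

    private IBaseMessagePart CreateMshPart(IBaseMessageFactory factory, IBaseMessageContext context)
    {
        // Instantiate the MSH schema and demote the promoted MSH* context
        // properties into it, as described earlier in the post. Omitted here.
        throw new System.NotImplementedException();
    }
}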


So what have we gained here? First of all we have gotten rid of any orchestration, which always feels good :). This pipeline component has no dependencies on any specific schemas, which makes it easy to reuse in any integration where we want to send HL7 messages, as long as we can promote the header data to a property schema.
The source code for this component is available here.