Method Not Allowed when consuming an API App in Azure App Service from C#

When consuming an API App from C#, the Azure App Service SDK helps us generate client code to improve the development process (read more here).

When testing the generated code you might get a ‘Method Not Allowed’ error with status code 405, even though the security settings are correct and the API App works perfectly when used via a Logic App, Postman, Fiddler, Swagger etc.

If the thrown exception looks something like this:

Then the problem is probably an incorrectly generated URI, where the code generator has used http instead of https (a common issue in several places, since API Apps should always be called over https).

To check this, go into the class of the ServiceClient (in my case FlatFileEncoder) and check the base URI settings. As you can see in the image below, mine was generated with http instead of https.

After changing the URI from http to https it starts working: my code is executed and the result from the FlatFileToXML function is returned as expected.
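If you prefer not to edit the generated file (it would be overwritten the next time the client is regenerated), the base URI can also be forced from the consuming code. A minimal sketch, assuming the generated client exposes a settable BaseUri (some generator versions take the URI in the constructor instead); the class name, operation signature and host name below are from my example and will differ in your solution:

```csharp
using System;

public static class FlatFileEncoderClientExample
{
    public static string ConvertFlatFile(string flatFileContent)
    {
        // FlatFileEncoder is the generated ServiceClient for my API App
        // (your generated class and operation signature will differ).
        var client = new FlatFileEncoder();

        // Force the base URI to https - the code generator produced http, which gives 405.
        client.BaseUri = new Uri("https://<your-api-app>.azurewebsites.net");

        // Schema name and root name here are placeholders matching my Flat File Encoder setup.
        return client.FlatFileToXML(flatFileContent, "TestInvoice", "Invoice");
    }
}
```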

Posted in: •Azure App Service  •Integration  | Tagged: •API Apps  •Azure Api Apps  •Azure App Service  •Custom Code  •Flat File Encoder  •Integration  •Windows Azure 


Sharing a BizTalk pipeline component in BizTalkComponents component library

So you have decided to share a pipeline component in our library? Great! The BizTalkComponents library is hosted on GitHub, and a sample pipeline component that can be used as a template is also available on GitHub.

Setting up Git

Before a component is added to the official GitHub repository of BizTalkComponents it must be reviewed. To do that you must first create a public GitHub repository. Make sure to add the .gitattributes file and the .gitignore file for Visual Studio. Also create a README file and a license file; BizTalkComponents uses the MIT license. To start working on your component, clone the newly created repository using your favorite Git client such as Visual Studio or GitHub for Windows.

Creating the solution

Fire up Visual Studio and create an empty solution in the root folder of the newly cloned repository. The solution file should be called BizTalkComponents.PipelineComponents.{component name}.sln.

Adding unit tests

All components must have unit tests so the first thing we are going to do is to add a new Unit Test project to our solution. The unit test project should be located in Tests/UnitTests under the root folder. To be able to run a pipeline component outside of BizTalk we use the Winterdom BizTalk Pipeline Testing library available as a NuGet package.

Build tests that not only ensure that the component's output is as expected when everything is set up correctly, but also that relevant exceptions are thrown when any precondition fails, such as parameters set by the BizTalk administrator or the existence of context properties.
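As an illustration, a minimal test using the Winterdom pipeline testing library could look something like the sketch below. MyComponent, its Destination parameter and the exception type are hypothetical placeholders; the assertions naturally depend on what your component actually does.

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Winterdom.BizTalk.PipelineTesting;

[TestClass]
public class MyComponentTests
{
    [TestMethod]
    public void ShouldReturnOneMessageWhenParameterIsSet()
    {
        // Hypothetical component with a hypothetical required parameter.
        var component = new MyComponent { Destination = "http://example.com/props#Destination" };

        var pipeline = PipelineFactory.CreateEmptyReceivePipeline();
        pipeline.AddComponent(component, PipelineStage.Decode);

        var input = MessageHelper.CreateFromString("<Sample/>");
        var output = pipeline.Execute(input);

        Assert.AreEqual(1, output.Count);
        // Assert on the output message body or promoted context properties here.
    }

    [TestMethod]
    [ExpectedException(typeof(ArgumentException))]
    public void ShouldThrowWhenRequiredParameterIsMissing()
    {
        var component = new MyComponent();

        var pipeline = PipelineFactory.CreateEmptyReceivePipeline();
        pipeline.AddComponent(component, PipelineStage.Decode);

        // The precondition check should fail since Destination was never set.
        pipeline.Execute(MessageHelper.CreateFromString("<Sample/>"));
    }
}
```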

Implementing the pipeline component

The pipeline component itself should be located in a Src folder under root. All pipeline components should target .NET 4.0 to ensure compatibility with BizTalk 2010.

BizTalkComponents.Utils

The pipeline components of BizTalkComponents use a shared utility library called BizTalkComponents.Utils. This library contains helper classes that reduce the amount of code we need to write for common tasks like interacting with the message context and reading and writing to the component's property bag. The library is available on NuGet.

Partial classes for a more readable component

The interfaces that every pipeline component needs to implement contain a lot of plumbing code that has nothing to do with what the component actually does. To keep the implementation clean and easy to read, BizTalkComponents uses partial classes to separate plumbing code from the component implementation. The class files should be called {component name}.cs and {component name}.Component.cs. The component class should contain the component metadata properties and methods as well as any Validate method. BizTalkComponents does not use resource files for component metadata; Name, Version and Description are set directly in the properties. The IComponentUI interface contains a method for validating parameters, which is called when the component is built. This method should use the ValidationHelper from the Utils library. It can be complemented with an additional Validate method that is not called at build time but rather when the component is executed in BizTalk.
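As a rough sketch (illustrative names, with most of the interface plumbing left out), the split could look like this:

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// MyComponent.cs - only the logic that makes this component what it is.
public partial class MyComponent : IComponent
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Component-specific work goes here.
        return pInMsg;
    }
}

// MyComponent.Component.cs - metadata and plumbing required by the pipeline runtime.
[ComponentCategory(CategoryTypes.CATID_PipelineComponent)]
public partial class MyComponent : IBaseComponent, IPersistPropertyBag, IComponentUI
{
    public string Name { get { return "My component"; } }
    public string Version { get { return "1.0"; } }
    public string Description { get { return "Describes what the component does."; } }

    // Load, Save, Validate, GetClassID, InitNew and Icon live here,
    // keeping the plumbing out of MyComponent.cs.
}
```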

Component parameters

Any component parameter should have annotations:

  • Parameters required to be set already at design time should be annotated with the Required attribute.
  • Parameters required to be set at runtime should be annotated with the RequiredRunTime attribute available in the Utils library.
  • All parameters should be annotated with a user friendly name in the DisplayName attribute.
  • All parameters should be annotated with a description in the Description attribute.

The parameter property should be accompanied with a string constant to be used when reading and writing from the property bag.

The Load and Save methods should be implemented using the PropertyBagHelper from Utils.
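A hedged sketch of what a parameter and its Load/Save plumbing could look like. The attribute and helper names follow the description above, but the exact names and signatures in BizTalkComponents.Utils may differ slightly from this sketch.

```csharp
using System.ComponentModel;
using System.ComponentModel.DataAnnotations;
using BizTalkComponents.Utils;
using Microsoft.BizTalk.Component.Interop;

public partial class MyComponent
{
    // String constant used when reading and writing the parameter from the property bag.
    private const string DestinationPropertyName = "Destination";

    [RequiredRuntime]                      // Runtime-required attribute from the Utils library.
    [DisplayName("Destination")]           // User friendly name shown to the administrator.
    [Description("The destination path to write to.")]
    public string Destination { get; set; }

    public void Load(IPropertyBag propertyBag, int errorLog)
    {
        // Helper from BizTalkComponents.Utils - exact signature may differ.
        Destination = PropertyBagHelper.ToStringOrDefault(
            PropertyBagHelper.ReadPropertyBag(propertyBag, DestinationPropertyName), string.Empty);
    }

    public void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties)
    {
        PropertyBagHelper.WritePropertyBag(propertyBag, DestinationPropertyName, Destination);
    }
}
```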

The Execute method

If the component has any RequiredRuntime properties the custom Validate method should be called at the beginning of the Execute method to ensure that all parameters are set as expected.

All interactions with the message context should use the Utils library's extension methods. The ContextProperty entity should be used for referencing any context property. This entity can be initialized either with the BizTalk property notation namespace#property name or by separating the namespace and the property name into different strings. The Utils library also contains constants for some BizTalk standard properties.
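Putting the two previous points together, an Execute method could look roughly like this. The property namespace is a made-up example, and the exact extension method and Validate signatures in BizTalkComponents.Utils may differ from this sketch.

```csharp
using System;
using BizTalkComponents.Utils;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class MyComponent
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Validate any RequiredRunTime parameters before doing the actual work.
        string errorMessage;
        if (!Validate(out errorMessage))
        {
            throw new ArgumentException(errorMessage);
        }

        // ContextProperty created from the namespace#property notation (illustrative names).
        var destinationProperty = new ContextProperty("http://example.com/props#Destination");

        // The Utils extension methods wrap the raw Read/Write/Promote calls on the context.
        pInMsg.Context.Promote(destinationProperty, Destination);

        return pInMsg;
    }
}
```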

Building the component

All pipeline components should be strong named so that they can be installed into the GAC. A new strong name file (snk) is created for every pipeline component.

MSI files for all components will be generated. To be able to generate an MSI from MSBuild, BtsMsiTask must be installed and used. BtsMsiTask is called from a custom MSBuild build script that should be located in a Build folder under the root. All MSIs should have a version. The AssemblyInformationalVersion attribute is used for versioning, and to be able to read that attribute from the build script MSBuild Extension Pack must also be installed. The build script should be called Build.proj and also be included as a solution folder in the solution.

This sample build script can be used by just replacing the component name.

Readme file

The README.md Markdown file generated by GitHub should have a short description of the component. There are many tools for editing Markdown files, for example this extension for Visual Studio.

This blog post aims to introduce the BizTalk Components library and the coding standard it uses. If you have any questions or comments on the component library or the coding standard don’t hesitate to write a comment to this blog post.

Posted in: •Integration  | Tagged:


Azure App Services - Logic Apps: process a flat file, transform to EDIFACT and send to FTP

Logic Apps is one of the new and awesome features Microsoft released in preview this spring. The focus of this type of function is cloud integration - both hybrid integration that brings data to and from servers inside your network in a secure and easy way, and pure cloud integration.

A Logic App is the workflow-like application that is created in Azure; I won't explain it further in this post, but please read more here.

So the example I'll use here is a classic one: I have a system that produces small and simple flat files on a local server, and we want to bring that data in EDIFACT INVOIC format to an FTP server. Normally this would be solved with an integration platform like BizTalk, but in this scenario we don't have a BizTalk server we can use, so we try to solve it with Logic Apps instead.

Scenario:

  1. Pick up a file on the local filesystem
  2. Decode the flatfile and transform it to EDIFACT
  3. Encode EDIFACT
  4. Send to FTP server

Let’s jump right in into the portal, we need to start by setting up some connectors (File, FTP) and several API Apps from the marketplace will be used.

Preparations

SQL Databases

First off we need SQL databases. You can use a previously created server, but we need two empty database instances. I did this in the "old" portal.

The two API Apps that need databases are the following; in parentheses you can see my database instance names.

  1. BizTalk Trading Partner Management (TradingPartnerManagement)
  2. BizTalk EDIFACT (EDIDatabase)

Copy the connection strings and change the text {your_password_here} to your password so you can use them later.

Service Bus:

Create a Service Bus namespace for the File Connector.

Schemas:

Download the EDIFACT XSD schema here

Mapper:

Create a BizTalk Service mapper in Visual Studio 2012

  1. Create a Flatfile schema
  2. Create the mapping from the flatfile schema to Edifact in Visual Studio.
    1. BizTalk Service mapper (generating .trfm files). I don't like this mapper since it has poor XSLT support, so I normally just create a standard BizTalk project, do the mapping in a temporary project and later copy the XSLT into the .trfm map. :)

Resulting in these 3 files for me (after downloading the EDIFACT schema)

 

Now that we are prepared, let's start creating the services.

Create and configure the API App instances.

In Web API section:

Create File Connector: We need a File Connector for picking up files on the on-premises server. We need to provide the following information right away when we create the connector:

  • Root Folder, make sure to write down or remember the value, or prepare the server with the folder structure, since the value is impossible to find afterwards (if you do find where to read this value, please let me know how!)
  • Service Bus Connection String, take the appropriate SAS connection string from the earlier created Service Bus

After creating this we can install it on the server, see the section about installing File Connector for more info.

Create FTP Connector: We need a FTP Connector to be able to send messages to the FTP server. For this we need to provide the following information right away when we create the connector:

  • Server Address (make sure not to end it with a / or similar, since :21 will automatically be added for the port and you will get an error when trying to use the connector; see the image below for how a file name is constructed in this case, notice the starting :21)
  • User name, for the ftp server
  • Password, for the user

Create BizTalk Flat File Encoder:

We need a BizTalk Flat File Encoder that will decode and encode flat files to/from XML. In our case it will decode a flat file to XML.

Installation is straightforward and you only need to provide a name for the instance.

Configure BizTalk Flat File Encoder:

For the Flat File Encoder to work we need to add one or more flat file schemas. These schemas are the same as those used in BizTalk, so you can easily reuse the ones you have today. Go into the instance we created (remember the name of the instance); the easiest way to find it is in the API Apps list (Browse -> API apps).

  • Click Schemas
    1. Press Add Button
    2. Upload the Flat File Schema (must be a BizTalk flat file schema XSD file)

Create BizTalk Transform Service

We need a BizTalk Transform Service to be able to transform the XML instance of the flat file to an XML instance of the EDIFACT schema. Creating this is quite straightforward; you just need to provide a name for the service.

Configure BizTalk Transform Service

After creation we need to do some configuration; basically we need to add the maps. Find the API App as with the others, and when you have found it do the following:

  • Click Maps
    1. Press Add button
    2. Upload the Map you prepared in the preparations

Create BizTalk Trading Partner Management

BizTalk Trading Partner Management is used to keep information about partners and to set up agreements for AS2, EDIFACT and X12.

When creating we need to provide the connection string to the earlier created Database.

Configure BizTalk Trading Partner Management

When the Trading Partner Management instance is created we need to do some configuration. Mainly we need to create the parties that are supposed to be used in the B2B flow and then connect them via agreements, so we can say that partner A is exchanging messages with partner B.

First we need to create at least two partners. Go into the instance (just like the others):

  • Click Partners
  • Press Add button and add at least 2 Partners (only name is mandatory)
    1. After the Partners are added we need to create Identities for them (you might want to reload the web API to make sure it loads properly). After that, press one of the Partners.
    2. Press Profiles
    3. Press the Profile in the Profiles section
  • Press Identities (see the image below for guidance)
  • Enter the identity value and pick a Qualifier
    1. Press Save, repeat the steps from point 3 for the other partner

When the partners are created and we have added identities to them we can start creating agreements BUT to be able to create EDIFACT agreements we also need to provide the EDIFACT schemas that we want to use.

  • Press Schemas
    1. Press add
    2. Upload the schema (repeat for more schemas if needed)

Now we can create the agreement:

  • Click Agreements
    1. Press Add button and enter a name
    2. Choose Protocol, in our case it’s EDIFACT
  • Partner Settings (here is a click intense part) set all values according to your setup (all must be set)
    1. Enter receive settings. I left all of this as standard and just pointed out the INVOIC schema.
    2. Enter send settings. I left all of this as standard, just pointed out the INVOIC schema and added the Application Reference.
    3. Enter batch settings. I filled in the mandatory fields (name and counter = 1). If this is skipped, Azure will try to create the agreement but it will fail, so just fill it in.

Note! After saving, write down the Agreement ID; we will use it later on in the Logic App.

Create BizTalk EDIFACT

The BizTalk EDIFACT instance will do the hard work of encode or decode your EDIFACT message to XML or vice versa. When creating the instance we need the following:

  1. Database Connection string: Connection string to the earlier created Database for EDIFACT.
  2. TPM Instance Name: The name of the earlier created BizTalk Trading Partner Management (from section 5)

There is no need to do any configuration, since the EDIFACT instance will use the configuration we did in the BizTalk Trading Partner Management instance when handling agreements and so on.

 

Now we are all set and the preparations are done, let’s start building the Logic Apps app.

Building the app looks easy and is smooth, but there are some tricky parts that you will encounter, since we will have some looping over lists in this way of building the Logic App. (I am building it this way so it is easier to understand and test; the downside is that if I need to change it later I have some serious remodeling to do, and it is easier to create a new flow.)

As I created it I focused on a flow that would be easy to demonstrate, so I added the File Connector in read mode on a folder, instead of the event mode that is also available. (With read mode I can trigger it at any given time, which is perfect for demos or when we need manual runs.)

So let’s start!

First off, we pick up the files from the server.

  1. Add a Recurrence, I set this to standard values (once per hour, depending on the Tier choice this might be the lowest value that you can set).
  2. Add the File Connector and use the function List Files, since I will pick up the files from the root folder (that we set when we created the Connector, in my case:  C:\temp\NewFileConnector) I’ll leave the Folder Path blank.
  3. Add a second File Connector to actually pick up the files that the List Files function located. Use function Get File, and since the List Files result is a collection we need to add “Repeat over list” and pick the list from List Files.
    1. Repeat over the items returned from the first File Connector.
    2. Set File Path to: @repeatItem().FilePath (which declares that we should take the FilePath result from the repeating item). This needs to be set manually, since the automatic help will only use the first() function, which would give you the information from the first file.
  4. (Optional and not included in my sample) Add a third File Connector with the function Delete File to delete the file from the folder (to prevent picking it up several times)
    1. Repeat over the list from File Connector Get Files
    2. File Path, should be the File Path from the repeating item

Now we have this set up and we will start picking up files, either on the interval or when we manually start the Logic App.

Next section will be adding the BizTalk Flat File Encoder, transformation and the BizTalk EDIFACT component.

  • Add the BizTalk Flat File Encoder, function Flat File to XML
    1. Repeat over the list from the File Connector Get File.
    2. Flat File element should be the content from the Get File operation, and since it's a list we will need to fix this manually.
    3. Schema name must be set, in my case it's TestInvoice.
    4. Root name must be set, in my case it's Invoice.
  • Add the BizTalk Transform Service, now we will transform the XML version of the flat file to the XML version of the EDIFACT.
    1. Once again we need to repeat over a list, in this case the result list from the BizTalk Flat File Encoder.
    2. Input XML should be the output from the Flat File Encoder.
  • Add the BizTalk EDIFACT to transform the XML version of the EDIFACT to an actual EDIFACT message.
    1. Repeat over the results from the BizTalk Transform Service.
    2. Content should be the XML result from the transformation.
    3. Agreement ID is the EDIFACT agreement ID that was created in the Trading Partner Management API App; we wrote this down earlier. (You can also look it up at any time in the Trading Partner Management instance.)
  • To send this to the FTP server, add the FTP Connector.
    1. Repeat over the result from the BizTalk EDIFACT instance.
    2. Content should be the Payload that is returned as a result from the EDIFACT instance.
    3. File Path, make sure to find something unique; I used the tracking ID to give the file a unique name, together with the built-in function @concat that concatenates several strings. (Make sure not to use the @ sign inside the @concat function, since that will give you an error.)

In my setup the files will not be deleted, so I don't need to re-drop files over and over again. If we want that behavior, just add another File Connector at position 4 (marked as optional in the setup), or change the whole setup to use the trigger function in the File Connector, which will keep track of new files, pick them up and afterwards delete the file in one step. The disadvantage of that is when you want to demo, since it is not possible to manually trigger that flow via the start flow button.

The result is that we have a fully functional flow that will pick up the flat file and do all the handling needed for it to eventually end up as an EDIFACT file on the FTP server. Here is how the full flow looks for me:

Key Takeaways

Crashes:

This is a preview so don’t be afraid of crashes, I also learned that a lot of the time it was just the async GUI that had the problems especially if I used the Free tier. Then I quickly filled my quote and in combination with heavy traffic or high demand I was unable to work with it. But when I switched to Basic Tier almost all problems just disappeared =) Updating the map

Strangely I couldn’t update the map, I had to delete it and upload it again.

Installing the File Connector:

To install the File Connector, find it in the portal (the easiest way is to go via Browse), pick API Apps and select the File Connector.

You will notice that when loading the settings it will try to check if the connector is installed, click the link.

Next, download the installer by clicking "Download and Configure", see the image below.

After downloading, copy and install the application on the preferred server. When you get prompted for the connection string as below, you should use the Primary Configuration String (marked in yellow in the image above).

 

Tips when using the File Connector

  • File names cannot contain spaces; that will cause internal server errors in the File Connector (not sure why)

Finding the files and Installed content on your server:

After the micro service is installed it will be found in IIS

When browsing the file system we will find a folder containing some files that represent a web API:

Testing the installed API App locally with swagger on your server

Troubleshooting the flow:

So when you get a red error and you have no idea what went wrong, we need to start troubleshooting. In the editor it looks and feels smooth to do things, but it is not as easy to find the errors, especially not the first few times. Those of us who have been in the business for some time are used to troubleshooting and understanding XML notations and layouts, so this can be a little bit new, since Logic Apps build their settings, configuration and input/output on JSON notation, meaning no XPaths but a dot notation instead.

First of all, in a JSON object we use {object}.{attribute} to access something, so when looking at this example we will need the following notation in the field to get the value.

Check the error output messages; the example below shows an error in the EDIFACT service, where the reason is found under output.body.Exception.

Other tips when working in the portal:

Watch out for portal related issues as well. For example, when editing partners in Trading Partner Management, make sure to wait for the green save signal before proceeding to the next step. I found myself having trouble when I jumped around too fast in the portal.

Posted in: •Integration  | Tagged: •AzureAPIApps  •API Apps  •EDIFACT  •FileConnector  •FlatFile  •FTP  •Logic Apps  •Microsoft Azure 


Thoughts on Microsoft Azure App Services after BizTalk Summit 2015

We’re just back from a couple of great days in London and BizTalk Summit 2015. Saravana and his BizTalk360 team put together a great show with excellent content and an overall superb arrangement!

This year Microsoft had a whole team of people at the conference who during the first day did a number of presentations on the new Azure App Service concept. In case you missed it, App Services is (among other things in the concept) Microsoft's next attempt to move the integration capabilities from BizTalk Server into the cloud, and more specifically into Azure.

Keynote speaker Karandeep Anand started off by explaining that the vision for App Services is based on three main pillars: Democratize Integration, Becoming an iPaaS Leader and Creating a Rich Ecosystem.

Image by @wearsy

The focus on democratization is a goal that aims to make it easier to get started and to quickly get to delivering value. This is a great focus, as this is a huge problem with today's platform. Today we have to install servers, databases and heavy applications, set up accounts and a hundred other things before we can even send a file from A to B! I'm sure that in the end it won't be as simple as in the demos, but what we have seen so far is definitely impressive when it comes to how simple it looks to get started.

Another part of Democratize Integration of course has to do with pricing. As it looks now, we can get a platform that not only will scale technically but also price-wise. Our hope is that we'll soon have a platform that can be used for a whole different segment of customers: customers with smaller budgets and smaller needs for integration. That would truly democratize integration!

What’s different from BizTalk Services and why will it work this time?

Microsoft has always been great at backwards compatibility and has thought about the hybrid scenarios from the start when it comes to Azure. App Services is no different, and that is to us the main thing that differentiates this offer from what we have in BizTalk Services. The fact that by using App Services we can read a flat file from an IBM WebSphere MQ queue on-premises, parse it in an Azure Logic App and send it to, for example, Salesforce basically without any coding is powerful! We can now implement solutions and requirements that we deal with today, solve our customers' current needs using one platform and deliver value. BizTalk Services however never got that far and always felt like a bit of a step backwards and a subset of what we had in BizTalk Server.

So, it’s great to see how Microsoft this time actually has taken a step back and thought about what makes BizTalk Server so good and then tried to incorporate those pieces in the new platform.

What’s missing from App Services?

Experience, shared knowledge and trust

BizTalk Server has been around for ages and a lot of us have 10+ years of experience on the platform. We know exactly what works, what to avoid, and what are good and bad design patterns – we've learned to trust the platform (and even deal with its less pretty parts).

In App Services we’ve only done simple demos and minor PoCs so far. How to solve a complex request-response, how to implement scatter and gather (or any more complex pattern for that matter) is still very unclear. What happens when a destination service times out and a $10 000 dollar goes missing – will App Services watch my back the same way as BizTalk Server has done so many times?

From what we have seen so far, and what's already in Azure (for example Service Bus and Queues etc.), many of the tools to solve the more complex scenarios are there – but the knowledge of which pieces to use when isn't. At iBiz Solutions we will try hard to do our part in filling that knowledge gap, and considering the great community that surrounds Microsoft-based integration I'm sure we won't be the only ones. ;)

Tooling

Logic App designer

As with any young platform, we're missing tooling. It seems that the way to build more complex scenarios is to chain a number of more specific Logic Apps together. This, in combination with the missing capability to search for and promote values in incoming messages, will make it hard to see the status of a specific message, find a certain message and so on. Some sort of status overview and some sort of archiving with smart searching needs to happen for this to work in busier and more complex scenarios.

Debugging and testing is another area that currently feels a bit weak. One can see the input and output of each part of the Logic App, but it requires a lot of clicking around, and there's no way of stepping through or replaying different parts etc. I can't really say that this is an area that's particularly good in BizTalk Server either, but it's something that one is always struggling with and that ends up consuming a lot of time.

ALM

One thing that’s extremely important in our BizTalk centric solutions today is how to handle the Application Lifecycle Management and the development process that it involves. Build servers, automated tests, versioning, test servers and production servers are standard in any BizTalk Server project today. How we can achieve similar workflows where we guarantee quality, versioning, efficient deployment and the possibility to always roll back to previous versions etc. isn’t today obvious and needs more work.

Conclusion

At iBiz Solutions we are excited to see Microsoft make another attempt at evolving the integration story! The fact that this time around we can see how to solve today's requirements on the future platform makes it even better! We are looking forward to GA, and even though the platform is far from done we feel it is in a much better place than we have been previously, and we have already started talking Swagger and Swashbuckle with our customers. ;)

Posted in: •Integration  | Tagged:


Build and generate environment specific binding files for BizTalk Server using Team Foundation Build Services

As most know, a BizTalk solution has two major parts – its resources in the form of DLLs, and its configuration in the form of bindings. In a previous post I described how to build and pack resources into an MSI using Team Foundation Server (TFS) Build Services. Managing the configuration in a similar way is equally important.

So let’s see how we can build environment specific binding files using TFS Build Services and some config transformations syntax goodness!

Creating a simple binding example for development and test environment

Let’s start with a simple example of a simple binding with a receive port, receive location and a send port for two different environments - one called “Test” and one “Production”.

| Port type | Name | Destination path in Test | Destination path in Production |
|---|---|---|---|
| Receive Port | BtsSample_ReceivePort_A | N/A | N/A |
| Receive Location | BtsSample_ReceivePort_A_Location (File) | C:\Temp\In*.xml | C:\Temp\In*.xml |
| Send Port | BtsSample_SendPort_A (File) | C:\Temp\TEST\Out\%MessageID%.xml | C:\Temp\PROD\Out\%MessageID%.xml |

As one can see there’s a small difference between the send ports destinations paths in Test and Production.

Exporting a binding template

Next we’ll create a binding template. The binding template will hold all information that is shared between the different environments. This is achieved this by an ordinary export of the application binding from the BizTalk Administration Console - as you’ve probably done many times before.

Creating environment specific bindings using web.config Transformation Syntax

The Web.config Transformation Syntax is a feature that showed up in Visual Studio 2010 and is often used to transform app.config and web.config files between different versions and environments – but it will of course work on any type of configuration file, including BizTalk binding files!

So for each environment we’ll then create an environment specific config file that only contains the values that differs between the template and the values for that environment. We use the Web.config Transformation Syntax to match the nodes and values that we like to update in the template. Below is the Test environment specific file matching the send port and replacing the value with the value specific for Test. The Production specific file also matches the send port but with a different value for the Destination path.

Using MSBuild to execute the transformation

As part of the Visual Studio installation an MSBuild target is installed for executing the transform. The target is installed into the standard MSBuild Extensions Path, which usually means something like C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets, depending on the Visual Studio version etc.

Finally we’ll add a small .proj file to pass some parameters to the MSBuild process. We need to tell the process what file to use as template and what different environment specific files we like to use.

Next we can kick off MSBuild and point it to the created proj file.

C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild build.proj.

Voila! MSBuild has performed our transformation and created two complete environment specific binding files by combining the template with the environment specific files - one for test and one for production.

Generating the file using TFS Build Services

Start by setting up a TFS build definition as described in the previous post on generating a BizTalk MSI using TFS Build Services.

We then need to add the $(DestinationPath) property to our Destination path to make sure the outputted binding files are written to the same path as the rest of the resources.

Once we have our build template all that’s needed is to add the build.proj file to the files to compile as shown below.

When finally running the build our two complete binding files are written to the deployment area and ready for installation!

Posted in: •Integration  | Tagged:


Build and generate MSI for BizTalk Server using Team Foundation Build Services

Using a build server and leveraging continuous integration is good practice in any software development project. The idea behind automated builds and continuous integration is to have a server that monitors one's source code repository and builds the solution as changes occur. This separate build activity alone will ensure that all artifacts are checked in and that a successful build doesn't depend on any artifacts or settings on the development machines.

Today build servers do a lot more as part of the build - the build process usually involves execution of tests, labeling the source as well as packing the solution into a deployable artifact.

In this post we’ll see how a build process can be achieved using Team Foundation (TFS) Build Services, building a BizTalk project that results in a deployable MSI artifact.

TFS Build Services

TFS Build services is a component that is part of the standard TFS install media. Each TFS build controller controls a number of “Build Agents” that will perform the actual build process. For each solution to build one has to define its process. These processes are described in a “Build Template” that tells the agent what steps to go through and in what order.

“Build Templates” in TFS Build Services are defined using Visual Studio. The image below shows a build template accessed through Visual Studio Team Explorer.

Major steps in a build template

As one creates a new build template for a solution one has to go through the following major steps:

1. Define a trigger

Decides what should trigger the build. Should it be triggered manually, should it be a scheduled build or should it be triggered at a check-in of new code?

2. Source Setting

This will tell the build process what part of the source tree the build template is relevant for. When queueing a new build this is the part of the source tree that will be downloaded to the staging area. It also tells the build services where on disk the source should be downloaded to.

3. Process

This is where all the steps and activities that the build service should perform are defined. Team Foundation Build Services comes with a number of standard templates and custom ones can be added. In this post we’ll however stick with the default one.

Build your first BizTalk solution

Building a BizTalk Server solution using TFS Build Services is straightforward.

In this post I will use this sample BizTalk solution. After checking it into Team Foundation Source Control (I'll use TFS source control in this post, but it works similarly using Git) I'll create a new build template for the solution. All that needs to change is the MSBuild platform setting property, so that we're using x86 when executing MSBuild, as shown below.

After queuing a build we can in the TFS Build Explorer see a successful build! We can also download the output from the build where we can see all our build artifacts!

Using BtsMsiTask to create a MSI as part of the build

So far so good, but we started the article by saying that what we wanted was a deployable artifact. In the case of BizTalk this means a BizTalk MSI. Let's see what we need to change to also have the build process create an MSI.

1. Install BtsMsiTask

Download and install BtsMsiTask. This will install an MSBuild task for generating the MSI.

2. Add a MsBuild project file

Add an MSBuild project file ('build.proj') to the solution. The project file will tell BtsMsiTask what artifacts to include. Add the created project file to the solution and check it in as part of the solution.

3. Add the MsBuild project file to the TFS build template

Add the created MSBuild project file to the TFS build template by adding it to the list of projects to build. After another successful build we can see that we also created an MSI as part of the build!

Adding build information to the MSI

File name

As we can see, the MSI we just created ended up with the default BtsMsiTask file name, which is a combination of the BizTalk application name property and the current date and time. Wouldn't it be nice if we could instead use the build number as part of the name? BtsMsiTask has an optional property called 'FileName' that we can, for example, set to '$(TF_BUILD_BUILDNUMBER).msi'.

Source location

When installing the artifact to BizTalk Server we can see that the source location property in the BizTalk Administration Console is set to where the artifact was built in the staging area. It'd be nice to also have information about which build produced these artifacts. This gives us the information needed to know exactly which builds were used for all the installed artifacts. We can change what is set in the source location by using the 'SourceLocation' property of BtsMsiTask, for example 'c:\$(TF_BUILD_BUILDNUMBER)'. So after setting the property as below, queue another build, reinstall using the MSI and we'll get the following result with the build number in the source location property. And finally, this is the MSBuild project file we ended up with in our example.
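Something along these lines, heavily abbreviated; the task's attribute names for the application and its resources are written from memory and may differ from the actual BtsMsiTask parameters, while FileName and SourceLocation are the two properties discussed above:

```xml
<Project ToolsVersion="4.0" DefaultTargets="Build"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- Path to the installed BtsMsiTask assembly is an assumption for this sketch. -->
  <UsingTask TaskName="BtsMsiTask"
             AssemblyFile="$(MSBuildExtensionsPath)\BtsMsiTask\BtsMsiTask.dll" />

  <ItemGroup>
    <!-- The BizTalk assemblies to include as resources in the MSI. -->
    <BtsResources Include="$(OutDir)BtsSample.Schemas.dll" />
  </ItemGroup>

  <Target Name="Build">
    <BtsMsiTask ApplicationName="BtsSample"
                Resources="@(BtsResources)"
                FileName="$(TF_BUILD_BUILDNUMBER).msi"
                SourceLocation="c:\$(TF_BUILD_BUILDNUMBER)" />
  </Target>
</Project>
```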

Posted in: •Integration  | Tagged:


Sending HL7 messages from BizTalk – The pure messaging way

One thing that makes working with HL7 messages in BizTalk a bit different from working with other standards like EDIFACT and X12 is that the HL7 assembler and disassembler rely on multipart messages for separating the header segments, body segments and extension segments (Z segments).

This makes it a lot easier to use a BizTalk map to create any header segments. Unfortunately it also means that we need orchestrations, even if we are not orchestrating any request-response messages. Orchestrations tend to add a lot of complexity to BizTalk integrations, especially when no real message orchestration is needed. Orchestrations are also hard to make reusable, and we typically end up creating an orchestration for every integration.

It is also not always a good thing that headers are created in a BizTalk map. Imagine we have a scenario where we receive an XML message with an envelope that should be mapped and sent as HL7, and the MSH header segment should be based on the envelope of the incoming message. A common way of handling XML envelopes in BizTalk is to have the XML Disassembler remove the envelope and promote the elements of interest. But doing so makes the envelope elements inaccessible to the map. Creating a map for the envelope is usually not an option, since the envelope can contain many different types of messages and BizTalk uses namespace + root node for identifying the message.

What we want to do is to send an HL7 message from BizTalk and base the MSH header segment on properties from the envelope of the incoming message, by promoting them on receive and demoting them on send. Having a generic pipeline component create the message parts, instead of an orchestration, helps keep the integration simple and easy to maintain. The message body is created with an ordinary non-multipart map executed on the send port.

Part 1 – Disassembling the incoming XML

There is nothing special going on here, just ordinary out of the box XML envelope handling in BizTalk.

The first thing to do is to strip the envelope from the message and promote all properties that we want to use in the outgoing message.

To achieve this we need to create the schemas of the incoming message. One Envelope schema and one schema for the actual message.

Make sure to set the schema type to an envelope schema

And set the body xpath to the body node.

Create the body schema

Then create a property schema containing a property for every MSH field we want to set.

Now edit the envelope schema to promote each element to the corresponding MSH field, i.e. sender is promoted to the MSH31 context property and recipient is promoted to the MSH51 property. Set up a receive port and a receive location with the XML Receive pipeline.

Part 2 – Transform the incoming message to HL7 XML representation

Just create an ordinary map with the incoming message body as input and the HL7 body segment schema as output.

Create a new send port with the newly created map as outbound map.

Part 3 – Assemble the HL7 message

If we try to run this as is through the HL7 assembler it won’t work since the assembler expects to find three message parts, MSHSegment, BodySegments and Z Segments. So we need to create a pipeline component that can create the MSH segment based on the previously promoted properties and create the message parts, this is where the magic happens.

To create the MSH segment we are going to use property demotion. For this to happen we need to set up property promotion from the BizTalk MSH schema to the property schema we created previously.

To create an instance of the MSH schema and demote the MSH context properties to it I am going to use the same technique I used in a previous blog post

What this method does is take a parameter for the DocumentSpecName of the BizTalk MSH schema used. It then creates an instance of the schema, iterates through all defined properties and demotes them from the context. The new XML is then added to a new message part.

Z segments are not used in this scenario, so we just set that part to an empty string (the part must still exist).

The Execute method of the component is quite simple. It just uses the methods above to create the segments and add them to the outbound message.

The part.Charset = “utf-8” part is very important. Without this national characters like åäö won’t work. The casing of the charset is also important.
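For illustration, a stripped down sketch of the component's core could look like this. The pipeline interface plumbing and the full property demotion logic are omitted, and the part names follow what the HL7 assembler expects as described above:

```csharp
using System.IO;
using System.Text;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class Hl7MultipartCreator
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        IBaseMessage outMsg = pContext.GetMessageFactory().CreateMessage();
        outMsg.Context = pInMsg.Context;

        // MSH segment built by demoting the promoted context properties (see above).
        outMsg.AddPart("MSHSegment", CreatePart(pContext, BuildMshXml(pInMsg)), false);

        // The original body becomes the body segments part of the multipart message.
        outMsg.AddPart("BodySegments", pInMsg.BodyPart, true);

        // Z segments are not used, but the part must still exist - an empty string will do.
        outMsg.AddPart("ZSegments", CreatePart(pContext, string.Empty), false);

        return outMsg;
    }

    private static IBaseMessagePart CreatePart(IPipelineContext pContext, string content)
    {
        IBaseMessagePart part = pContext.GetMessageFactory().CreateMessagePart();
        part.Data = new MemoryStream(Encoding.UTF8.GetBytes(content));

        // Lower case "utf-8" is required, otherwise national characters like åäö break.
        part.Charset = "utf-8";
        return part;
    }

    private string BuildMshXml(IBaseMessage inMsg)
    {
        // Placeholder: the real component creates an instance of the configured MSH schema
        // (DocumentSpecName parameter) and demotes every promoted context property into it,
        // as described in the linked blog post.
        return "<MSH/>";
    }
}
```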

Conclusion

So what have we gained here? First of all we have gotten rid of any orchestration, which always feels good :). This pipeline component has no dependencies on any specific schemas, which makes it easy to reuse in any integration where we want to send HL7 messages, as long as we can promote the header data to a property schema. The source code for this component is available here.

Posted in: •Integration  | Tagged: •BizTalk  •HL7 


Handling message enrichment with Business Rule Engine

One common thing that we need to handle at the integration level is enrichment of messages. This is often due to missing information in the incoming message, or a requirement for additional static data. It could be static hard coded data or more advanced rules consisting of several parameters, and the actions might be simple static values as well as more complex handling such as adding nodes and elements. The values might also change between environments (i.e. different values in test and production). By using the Business Rule Engine (from now on called BRE) we can create a solution with very high maintainability, with good tools and support for changing both the input parameters to a rule and the actions taken when the rule is executed, without changing any source code or doing any redeployment. This is possible because we use standard components with minimal coupling; the only link is the name of the policy, which we need to specify when we call the BRE from our pipeline or orchestration.

So, basics first: what is the BRE? It is a rule engine, which means that you basically put in rules that evaluate to true or false, and when the result is true an action is executed. These actions can be quite flexible and can be used to add new elements, records or attributes, or simply set a value on one or more existing attributes or elements.

Now, how and where do we use it? Well, the places where enrichment is natural are the following: when a message is received by BizTalk before the inbound map, inside an orchestration, or when leaving BizTalk after the outbound map (see picture). The possible execution places are the receive pipeline, the send pipeline or an orchestration, and to be able to do that we need some pipeline components and orchestration shapes. There is a great framework for working with BRE called the "BizTalk Business Rules Engine Pipeline Framework".

So let’s take an exampel:

I have received this message, and the sending system cannot provide the field "MinChef".

 

Let’s say that the rule will be following: If company code is 9999 then we add the following static information:

 

Small and easy, and also very flexible, since adding or changing information like this does not force or demand a recompilation or redeployment. Just deploy a new version of the policy and the change is done. It is possible to do more complex things as well; check this example where we add a node and records under the node. (It might look complex, but the tool will help you fill in the information.)

Installing and configuring the BRE rule engine and the BRE Pipeline Framework

BRE is shipped with BizTalk, so it is probably already installed; otherwise you need to run the BizTalk installer. Then download the BRE Pipeline Framework here and install it.

Tips and tricks regarding BRE:

If you need the possibility to run added .NET functions and libraries, you need to change a value in the registry accordingly:

HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\BusinessRules\3.0\

Add the following parameter:

  • Name: StaticSupport
  • Type: REG_DWORD
  • Value: 2

 

When we create a pipeline from the pipeline component, we need to add a new policy, an "InstructorLoaderPolicy", that will check the message root node text and, if it is a match, initiate and set a BRE pipeline context value.

The pipeline will then look like this:

Here TestManager is the policy where the manipulation of the message happens, and the InstructorLoaderPolicy is the metadata policy that helps load metadata, like the message type, into the BRE engine.

 Conclusion:

We can easily add static data this way and avoid hard coding it into maps, but it needs to be planned and designed for, since there is a limited number of places where the enrichment can be done. The enrichment components add data or apply functions to the message and return the result, and that result is the message BizTalk continues to work with. My advice is to keep this as isolated as you can, since it will quickly grow quite complex. By using this pattern we can create better and more maintainable solutions, where changes to the static data are made simply by updating these rules rather than changing code and redeploying solutions, which is a very time consuming task. This also gives the developer more time to spend on development and other fun stuff, rather than changing static data that the business suddenly had to change for some good reason.

Posted in: •Integration  | Tagged: •BizTalk  •BizTalk Business Rules Engine Pipeline Framework  •BRE  •BRE Pipeline Framework  •Business Rule Engine  •Maintainability  •Tips & Trix 


Why you need to be world class in Application Integration

In Sweden there is a radio program called "Spanarna" where a group of three people try to spot trends in the everyday noise and then present their visions of the future for the radio listeners.

The criterion for a valid trend-watch is that it requires at least three pieces of evidence to substantiate that this is something that will be realized in the coming future.

I will borrow this program idea and present my vision for application integration, based on what I see when I visit customers in Scandinavia and listen to their upcoming challenges, and on what Radar Group and Gartner advise us to be prepared for.

I’ll start by stating the trend and thesis and prophecy: Your most important critical applications will not only be running in the cloud they will also be running in different clouds. Your requirements on the suppliers will be for them to take greater responsibility on your internal processes, and so will your customers require you to do. Therefore you need to form a plan for the cloud and focus to become world class in the area of application  integration.

Since a valid trend-watch or thesis requires at least three pieces of evidence, I made it easy for myself: I borrowed most of the material from other trend-watchers, namely Gartner and Radar Group. It couldn't be safer, since everything that comes out of their mouths tends to become a self-fulfilling prophecy.

OK, now it is time to present the evidence that proves my trend-watch.

Evidence #1  Gartner Top 10 Strategic Trends by 2015

  • Computing Everywhere
  • The Internet of Things
  • 3D Printing
  • Advanced, Pervasive and Invisible Analytics
  • Context-Rich Systems
  • Smart Machines
  • Cloud/Client Computing
  • Software-Defined Applications and Infrastructure
  • Web-Scale IT
  • Risk-Based Security and Self-Protection

In the above list of hot technologies I have underlined the ones where the applications will most probably run in the cloud, and as you see they all involve application integration. To be able to take advantage of computing everywhere, collect data from smart machines and especially for the Internet of Things, where everything should be connected, application integration excellence is vital.

The Internet of Things is already happening, and there is a huge opportunity for early adopters to gain advantages if they can identify the value offering that comes with data, together with the skills to collect the data the products produce. This can then be used to share and improve your offering and strengthen the relation to the customer. Look at the car industry; they have already started.

We also see small manufacturers that have already understood that they are sitting on an "information gold mine", where they not only can extend the value of their product for the consumers but also sell the collected data to others.

Application integration in combination with the cloud are two of the enablers for this to happen.

Evidence #2

This is an older Gartner prediction from Las Vegas 2013, where Gartner emphasizes the need for organizations to strengthen their application integration skills to meet the predictions listed below.

  • Continuous and accelerating B2B growth
  • By 2016, midsize to large companies will spend 33% more on application integrations than in 2013.
  • By 2016, the integration of data on mobile devices will represent 20% of integration spending.
  • By 2017, over two-thirds of all new integration flows will extend outside the enterprise firewall.
  • By 2018, more than 50% of the cost of implementing 90% of new large systems will be spent on integration

Gartner also predicts that business critical applications will now start to run both in the cloud and as apps on our mobile devices. The first business people who saw the possibilities in the cloud were the sales group, who moved out the CRM, and by 2015 more than 50% of CRM will be deployed as SaaS.

The message is clear: make sure you have a plan for application integration, otherwise you will have a hard time keeping up with your competitors.

Evidence #3

This evidence is taken from another investigation made by Gartner, where both IT and business decision makers were asked how they prioritize and in which areas the IT investments are made. The results of the survey were quite interesting, since the two groups seem to be from different planets, with completely different agendas.

It get’s even more interesting when you look at how their respective IT-budget is developing the coming year where the IT-departments budget is increasing with 1.3% whereas the business are now getting 3.6% more money to spend.

Guess what? When I look at the table below, my guess is that the money our business representatives get will end up in the cloud, because I do not see any areas in the table where business and IT have mutual interests.

| IT driven prioritization | Business driven prioritization |
|---|---|
| 1. Application development | 1. Business Intelligence |
| 2. Application Management | 2. Business and finance system |
| 3. Cost reduction | 3. Mobile solutions |
| 4. Digitization of business | 4. Web |
| 5. Architecture | 5. Issue Management |
| 6. Information Security | 6. Distance Meetings (online) |
| 7. Control | 7. Document and information management |
| 8. Infrastructure Management | 8. Customer Relationship Management |
| 9. Competence development | 9. E-billing |
| 10. Application Consolidation | 10. Communication Solutions |

Peter Sondergaard from Gartner says that today 38% of total IT spend is outside of IT. "By 2017, it will be over 50%." Read more at: http://which-50.com/blog/2014/october/07/digital-investments-drive-global-it-spend-towards-us4-trillion-gartner/#.

To be able to meet this trend, and to make sure that the business won't bypass you totally, you need to be seen as an enabler rather than the IT guy who stops all innovation and spends the money solely on cost reduction rather than business development.

Two things the IT departments need to do to meet the new power of the business are to form a plan for the cloud and to strengthen their skills in application integration, because when your applications start to run in a mix of different clouds and on-premises applications you had better know how to connect them in a good way. It is no longer enough to keep systems in sync in a secure, efficient and reliable way. Now you also need to share data to make your customers better, or even sell information via your well designed information APIs if your business model allows for that.

Conclusion

Most of the top technology trends have the cloud as their playground and will require application integration to a large extent. With that in mind, I kindly recommend that you form a plan for how your organization should benefit from the cloud and stay competitive in your organization's offering.

New digital initiatives and start-ups sit inside your own organization, but not necessarily in IT. Probably the investments are made in your marketing department, HR, logistics and sales. You can bet that they will drive IT investments which will require excellence in application integration, and since these are non-IT people they will most probably choose the cloud to run their applications, since the cloud supports their need for speed.

At iBiz Solutions we know that to stand prepared for changes and stay competitive in the area of application integration, you need to think in terms of maintainable application integration. Our Integration Framework describes methodology, best practices, patterns, guidance and strategic directions based on where you are and what needs you have. In the Integration Framework we have collected all our knowledge, experience and skills in application integration so that you can benefit from it and stay competitive in your area of expertise. Let us tell you more about our thoughts on the future of application integration.

iBiz Solutions Integration Framework

This blog post relates mostly to the governance hexagons in iBiz Solutions Integration Framework 

The intention of this blog post was to make sure that if you do not have a plan for application integration, it is about time that you create one. The future has never been closer, and the speed at which the future travels towards us has increased dramatically. I hope I was able to prove the importance of both the cloud and excellence in application integration. If you have any questions on how you should be prepared to meet the coming opportunities, please do not hesitate to contact us.

I will end this post with another  interesting prediction also delivered by Gartner. Listen to this:

-By 2015, 10% of your online “friends” will be nonhuman. How many non-human friends did you have 5 years ago?

-Get used to the sentence: "Robot, please bring me a cup of coffee, and while you are away please check outside the door if the Amazon drone has delivered the stuff I ordered 10 minutes ago."

If the drone has delivered the goods, then you know that the network of connected systems is well integrated.

Posted in: •Integration  •Management and Business Development  | Tagged:


WCF-WebHttp adapter does not update HTTP header per message

Usually, in BizTalk we never work directly with transport specific data like HTTP headers, file names etc. Instead we manipulate the message context to instruct an adapter to add the transport specific data. Most of the time this works great, and the BizTalk adapter does a good job shielding us from the nitty gritty details of the transport and lets us focus on the data.

Sometimes though this is not enough. An example of this is when you need to add message specific data to the HTTP header using the WCF-WebHttp adapter. For example, we need to calculate an MD5 hash and add it to the Content-MD5 HTTP property for every outgoing message. This hash should, naturally, be unique for every unique message we send.

The problem

For most BizTalk developers it comes naturally to use a pipeline component to do these kinds of things, so that's where I started off.

As stated in this blog post there is a property called HttpHeaders where we can set the header to whatever we want and then the adapter uses this property to create the actual HTTP header.

So the first thing that comes to mind is to build a pipeline component that calculates the MD5 hash, formats the HTTP header appropriately and sets the HttpHeaders context property.
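For reference, the core of such a pipeline component could look roughly like the sketch below. The interface plumbing is omitted, the HttpHeaders property name and the WCF property namespace are the standard WCF adapter context properties, and the body is assumed to be a seekable stream (a real component would copy a non-seekable stream first).

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class Md5HttpHeaderComponent
{
    private const string HttpHeadersProperty = "HttpHeaders";
    private const string WcfPropertyNamespace =
        "http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties";

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Hash the message body and rewind the stream so it can still be sent.
        byte[] hash;
        var bodyStream = pInMsg.BodyPart.GetOriginalDataStream();
        using (var md5 = MD5.Create())
        {
            hash = md5.ComputeHash(bodyStream);
        }
        bodyStream.Seek(0, SeekOrigin.Begin);
        pInMsg.BodyPart.Data = bodyStream;

        // Write the complete header block to the WCF HttpHeaders context property.
        var header = "Content-MD5: " + Convert.ToBase64String(hash);
        pInMsg.Context.Write(HttpHeadersProperty, WcfPropertyNamespace, header);

        return pInMsg;
    }
}
```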

Testing the pipeline component

A quick test using Fiddler to inspect the HTTP request shows that an MD5 hash is indeed calculated and added to the HTTP header.

However, if we run the same test again, using another test file, the MD5 header is still the same even though the message body isn’t.

I haven’t found much documentation on this but I guess it has something to do with how BizTalk instantiate the adapter. In the blog post linked above the author mentions this behavior briefly in the last paragraph. The author suggests that using a dynamic send port will solve this issue. I haven’t tried that myself but at least for my scenario a dynamic port would be overkill and just add a bunch of complexity to the solution.

Thinking outside the BizTalk box

I think that in the BizTalk world it is often forgotten that when using a WCF adapter we have the full WCF stack at our disposal and in contrast to most other adapters in BizTalk we can actually execute code in the adapter.

I am not going to get in to the details about WCF extensibility and message inspectors but more information is available on MSDN.

Basically there is an interface called IClientMessageInspector that has a method called BeforeSendRequest that allows us to execute code just before the message is sent to the server.

An implementation of that interface that calculates the MD5 hash and adds it to the HTTP header could look like this:
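A sketch of such an inspector; how the body bytes are extracted is simplified here (the real implementation should hash the exact content that goes on the wire), but it shows the BeforeSendRequest hook and where the header is added:

```csharp
using System;
using System.Security.Cryptography;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;
using System.Text;

public class Md5MessageInspector : IClientMessageInspector
{
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // Get (or create) the HTTP request header property for this message.
        HttpRequestMessageProperty httpRequest;
        object property;
        if (request.Properties.TryGetValue(HttpRequestMessageProperty.Name, out property))
        {
            httpRequest = (HttpRequestMessageProperty)property;
        }
        else
        {
            httpRequest = new HttpRequestMessageProperty();
            request.Properties.Add(HttpRequestMessageProperty.Name, httpRequest);
        }

        // Reading the body consumes the Message, so work on a buffered copy
        // and hand an intact copy back to the WCF runtime.
        var buffer = request.CreateBufferedCopy(int.MaxValue);
        request = buffer.CreateMessage();

        // Simplified: hash the buffered message XML rather than the raw wire bytes.
        var body = buffer.CreateMessage().ToString();
        using (var md5 = MD5.Create())
        {
            var hash = md5.ComputeHash(Encoding.UTF8.GetBytes(body));
            httpRequest.Headers["Content-MD5"] = Convert.ToBase64String(hash);
        }

        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        // Nothing to do on the response.
    }
}
```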

Once this is registered in a WCF behavior and added to the Machine.config (more on that in the MSDN link above) our behavior is available for selection in the Behavior tab on the WCF-WebHttp adapter properties for our send port.

Now running the same two tests again proves that we get different MD5 hashes for both messages.

Conclusion

Only set the HTTP header in the BizTalk context, if ever, when the data is static or only varies per port and not per message.

WCF extensibility gives us the opportunity to execute code just before the data is sent; the pipeline doesn't.

 

Posted in: •Integration  | Tagged: •BizTalk  •WCF-WebHttp