Build and generate MSI for BizTalk Server using Team Foundation Build Services

Using a build server and leveraging continuous integration is good practice in any software development project. The idea behind automated builds and continuous integration is to have a server that monitors one’s source code repository and builds the solution as changes occur. This separate build activity alone ensures that all artifacts are checked in and that a successful build doesn’t depend on any artifacts or settings on the development machines.

Today build servers do a lot more as part of the build: the build process usually involves executing tests and labeling the source, as well as packaging the solution into a deployable artifact.

In this post we’ll see how a build process can be achieved using Team Foundation (TFS) Build Services, building a BizTalk project that results in a deployable MSI artifact.

TFS Build Services

TFS Build Services is a component included on the standard TFS install media. Each TFS build controller controls a number of “Build Agents” that perform the actual build process. For each solution to build, one defines a build process. These processes are described in a “Build Template” that tells the agent what steps to go through and in what order.

“Build Templates” in TFS Build Services are defined using Visual Studio. The image below shows a build template accessed through Visual Studio Team Explorer.

Major steps in a build template

As one creates a new build template for a solution one has to go through the following major steps:

1. Define a trigger

This decides what should trigger the build: should it be triggered manually, on a schedule, or on check-in of new code?

2. Source Settings

This will tell the build process what part of the source tree the build template is relevant for. When queueing a new build this is the part of the source tree that will be downloaded to the staging area. It also tells the build services where on disk the source should be downloaded to.

3. Process

This is where all the steps and activities that the build service should perform are defined. Team Foundation Build Services comes with a number of standard templates and custom ones can be added. In this post we’ll however stick with the default one.

Build your first BizTalk solution

Building a BizTalk Server solution using TFS Build Services is straightforward.

In this post I will use this sample BizTalk solution. After checking it into Team Foundation Source Control (I’ll use TFS Source Control in this post, but it works similarly using Git) I’ll create a new build template for the solution. The only thing that needs to change is the MSBuild platform setting property: we set it to X86 so the 32-bit version of MSBuild is used when building, as shown below.

After queuing a build we can see a successful build in the TFS Build Explorer! We can also download the output from the build and see all our build artifacts!

Using BtsMsiTask to create a MSI as part of the build

So far so good, but we started the article by saying that what we wanted was a deployable artifact. In the case of BizTalk this means a BizTalk MSI. Let’s see what we need to change to also have the build process create a MSI.

1. Install BtsMsiTask

Download and install BtsMsiTask. This will install an MSBuild task for generating the MSI.

2. Add a MsBuild project file

Add an MSBuild project file (‘build.proj’) to the solution. The project file tells the BtsMsiTask process what artifacts to include. Add the created project file to the solution and check it in as part of the solution.
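As a rough illustration, a minimal ‘build.proj’ could look something like the sketch below. Treat the task registration path, the task and attribute names, and the application/assembly names as assumptions to verify against the BtsMsiTask documentation; only the overall shape is the point here.

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- Register the MSBuild task installed by BtsMsiTask (install path assumed) -->
  <UsingTask TaskName="BtsMsiTask.BtsMsiTask"
             AssemblyFile="$(ProgramFiles)\BtsMsiTask\BtsMsiTask.dll" />

  <!-- The BizTalk assemblies to package into the MSI (hypothetical projects) -->
  <ItemGroup>
    <BtsResources Include="MyApp.Schemas\bin\Release\MyApp.Schemas.dll" />
    <BtsResources Include="MyApp.Maps\bin\Release\MyApp.Maps.dll" />
  </ItemGroup>

  <Target Name="Build">
    <BtsMsiTask ApplicationName="MyApp" BtsAssemblies="@(BtsResources)" />
  </Target>
</Project>
```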

3. Add the MsBuild project file to the TFS build template

Add the created MSBuild project file to the TFS build template by adding it to the list of projects to build. After another successful build we can see that an MSI was also created as part of the build!

Adding build information to the MSI

File name

As we can see, the MSI we just created ended up with the default BtsMsiTask file name, a combination of the BizTalk application name property and the current date and time. Wouldn’t it be nice if we could instead use the build number as part of the name? BtsMsiTask has an optional property called ‘FileName’ that we can, for example, set to ‘$(TF_BUILD_BUILDNUMBER).msi’.

Source location

When installing the artifact to BizTalk Server we can see that the source location property in the BizTalk Administration Console is set to where the artifact was built in the staging area. It’d be nice to also have information about which build produced these artifacts; that tells us exactly what build is behind every installed artifact. We can change what is set in the source location by using the ‘SourceLocation’ property of BtsMsiTask, for example ‘c:\$(TF_BUILD_BUILDNUMBER)’. After setting the property, queuing another build and reinstalling using the MSI, we get the build number in the source location property. And finally, this is the MSBuild project file we ended up with in our example.
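With the build-number-based properties added, the build target could look something like the following. FileName and SourceLocation are the BtsMsiTask properties discussed above, TF_BUILD_BUILDNUMBER is an environment variable TFS Build exposes to the build, and the remaining names carry the same caveats as the sketch earlier:

```xml
<Target Name="Build">
  <!-- FileName and SourceLocation put the TFS build number on the MSI
       and into the source location shown in the Administration Console -->
  <BtsMsiTask ApplicationName="MyApp"
              BtsAssemblies="@(BtsResources)"
              FileName="$(TF_BUILD_BUILDNUMBER).msi"
              SourceLocation="c:\$(TF_BUILD_BUILDNUMBER)" />
</Target>
```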

Posted in: •Integration  | Tagged:


Sending HL7 messages from BizTalk – The pure messaging way

One thing that makes working with HL7 messages in BizTalk a bit different from working with other standards like EDIFACT and X12 is that the HL7 assembler and disassembler relies on multipart messages for separating the header segments, body segments and extension segments (Z segments).

This makes it a lot easier to use a BizTalk map to create the header segments. Unfortunately it also means that we need orchestrations, even if we are not orchestrating any request-response messages. Orchestrations tend to add a lot of complexity to BizTalk integrations, especially when no actual orchestration of messages takes place. Orchestrations are also hard to make reusable, and we typically end up creating one orchestration for every integration.

It is also not always a good thing that headers are created in a BizTalk map. Imagine a scenario where we receive an XML message with an envelope that should be mapped and sent as HL7, and the MSH header segment should be based on the envelope of the incoming message. A common way of handling XML envelopes in BizTalk is to have the XML Disassembler remove the envelope and promote the elements of interest. But doing so makes the envelope elements inaccessible to the map. Creating a map for the envelope is usually not an option, since the envelope can contain many different types of messages and BizTalk uses namespace + root node to identify the message.

What we want to do is send an HL7 message from BizTalk and base the MSH header segment on properties from the envelope of the incoming message, by promoting them on receive and demoting them on send. Having a generic pipeline component create the message parts instead of an orchestration helps keep the integration simple and easy to maintain. The message body is created with an ordinary non-multipart map executed on the send port.

Part 1 – Disassembling the incoming XML

There is nothing special going on here, just ordinary out-of-the-box XML envelope handling in BizTalk.

The first thing to do is to strip the envelope from the message and promote all properties that we want to use in the outgoing message.

To achieve this we need to create the schemas for the incoming message: one envelope schema and one schema for the actual message.

Make sure to set the schema to an Envelope schema

And set the body xpath to the body node.

Create the body schema

Then create a property schema containing a property for every MSH field we want to set.

Now edit the envelope schema to promote each element to the corresponding MSH field, i.e. sender is promoted to the MSH31 context property and recipient is promoted to the MSH51 property. Set up a receive port and a receive location with the XML Receive pipeline.
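For illustration, a property schema is just an ordinary XSD whose schemaInfo is marked with schema_type="property". A minimal sketch, with a hypothetical namespace and only the two fields from the example, might look like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<xs:schema xmlns:b="http://schemas.microsoft.com/BizTalk/2003"
           xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://MyCompany.Hl7.MshPropertySchema"
           xmlns="http://MyCompany.Hl7.MshPropertySchema">
  <xs:annotation>
    <xs:appinfo>
      <!-- schema_type="property" is what makes this a property schema -->
      <b:schemaInfo schema_type="property" />
    </xs:appinfo>
  </xs:annotation>
  <xs:element name="MSH31" type="xs:string" />
  <xs:element name="MSH51" type="xs:string" />
</xs:schema>
```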

Part 2 – Transform the incoming message to HL7 XML representation

Just create an ordinary map with the incoming message body as input and the HL7 body segment schema as output.

Create a new send port with the newly created map as outbound map.

Part 3 – Assemble the HL7 message

If we try to run this as-is through the HL7 assembler it won’t work, since the assembler expects to find three message parts: MSHSegment, BodySegments and ZSegments. So we need to create a pipeline component that creates the MSH segment based on the previously promoted properties and builds the message parts. This is where the magic happens.

To create the MSH segment we are going to use property demotion. For this to happen we need to set up property promotion from the BizTalk MSH schema to the property schema we created previously.

To create an instance of the MSH schema and demote the MSH context properties into it, I am going to use the same technique I used in a previous blog post.

The method takes a parameter with the DocumentSpecName of the BizTalk MSH schema used. It then creates an instance of the schema, iterates through all defined properties and demotes each one from the message context into the instance. The resulting XML is then added to a new message part.
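The code itself isn’t reproduced here, but a rough C# sketch of such a method, using the standard pipeline interop APIs (the component class name and its DocumentSpecName property are assumptions, and error handling is left out), could look like this:

```csharp
using System.Collections;
using System.IO;
using System.Text;
using System.Xml;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class Hl7PartCreator
{
    // Design-time property on the component, e.g.
    // "MyProject.MSH_25_GLO_DEF, MyProject, Version=..., PublicKeyToken=..." (hypothetical).
    public string DocumentSpecName { get; set; }

    private IBaseMessagePart CreateMshPart(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Look up the deployed MSH schema and create an empty instance of it.
        IDocumentSpec docSpec = pContext.GetDocumentSpecByName(DocumentSpecName);
        var sb = new StringBuilder();
        using (var writer = new StringWriter(sb))
        {
            docSpec.CreateXmlInstance(writer);
        }

        var xml = new XmlDocument();
        xml.LoadXml(sb.ToString());

        // Iterate the schema's property annotations and demote each
        // context property into the node its XPath points at.
        IEnumerator annotations = docSpec.GetPropertyAnnotationEnumerator();
        while (annotations.MoveNext())
        {
            var annotation = (IPropertyAnnotation)annotations.Current;
            object value = pInMsg.Context.Read(annotation.Name, annotation.Namespace);
            if (value == null) continue;

            XmlNode node = xml.SelectSingleNode(annotation.XPath);
            if (node != null) node.InnerText = value.ToString();
        }

        IBaseMessagePart part = pContext.GetMessageFactory().CreateMessagePart();
        part.Charset = "utf-8"; // lower case is significant, see below
        part.Data = new MemoryStream(Encoding.UTF8.GetBytes(xml.OuterXml));
        return part;
    }
}
```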

Z segments are not used in this scenario, so we just set that part to an empty string (the part must still exist).

The Execute method of the component is quite simple. It just uses the methods above to create the parts and add them to the outbound message.
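Continuing the same sketch, a minimal Execute could look as follows; the part names MSHSegment, BodySegments and ZSegments are the ones the HL7 assembler looks for, but verify them against your BTAHL7 version:

```csharp
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    IBaseMessageFactory factory = pContext.GetMessageFactory();
    IBaseMessage outMsg = factory.CreateMessage();
    outMsg.Context = pInMsg.Context; // keep the promoted properties

    // MSH part demoted from context; the original body becomes BodySegments.
    outMsg.AddPart("MSHSegment", CreateMshPart(pContext, pInMsg), false);
    outMsg.AddPart("BodySegments", pInMsg.BodyPart, true);

    // Z segments are unused, but the part must still exist: add an empty one.
    IBaseMessagePart zPart = factory.CreateMessagePart();
    zPart.Charset = "utf-8";
    zPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(string.Empty));
    outMsg.AddPart("ZSegments", zPart, false);

    return outMsg;
}
```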

The part.Charset = “utf-8” part is very important. Without it, national characters like åäö won’t work. The casing of the charset value also matters.

Conclusion

So what have we gained? First of all we have gotten rid of the orchestration, which always feels good :). The pipeline component has no dependencies on any specific schemas, which makes it easy to reuse in any integration where we want to send HL7 messages, as long as we can promote the header data to a property schema. The source code for this component is available here

Posted in: •Integration  | Tagged: •BizTalk  •HL7 


Handling message enrichment with Business Rule Engine

One common thing we need to handle at the integration level is enrichment of messages, often due to missing information in the incoming message or a requirement for additional static data. This could be static hard-coded data or more advanced rules consisting of several parameters, and the actions taken might likewise be simple static values or more complex operations such as adding nodes and elements. The values might also differ between environments (i.e. different values in test and production). By using the Business Rule Engine (from now on called BRE) we can create a solution with very high maintainability, with good tooling and support for changing both the input parameters to a rule and the actions taken when the rule executes, without changing any source code or doing any redeployment. This is possible because we use standard components with minimal coupling; the only link is the name of the policy, which we need to specify when we call the BRE from our pipeline or orchestration.

So, basics first: what is the BRE? It is a rule engine, which basically means you put in rules that evaluate to true or false, and when the result is true an action is executed. These actions are quite flexible and can be used to add new elements, records or attributes, or simply to set a value on one or more existing attributes or elements.

Now, how and where do we use it? The places where enrichment is natural are: when a message is received by BizTalk before the inbound map, inside an orchestration, or when a message leaves BizTalk after the outbound map (see picture). The possible execution places are thus the receive pipeline, the send pipeline or an orchestration, and to be able to do that we need some pipeline components and orchestration shapes. There is a great framework for working with BRE called the “BizTalk Business Rules Engine Pipeline Framework”.

So let’s take an example:

I have received this message, and the sending system cannot provide the field “MinChef”:

 

Let’s say that the rule is the following: if the company code is 9999, we add the following static information:

 

Small and easy, and also very flexible, since adding or changing information like this does not force a recompilation or redeployment. Just deploy a new version of the policy and the change is done. It is possible to do more complex things as well; check this example where we add a node and records under the node. (It might look complex, but the tool will help you fill in the information.)

Installing and configuring the BRE and the BRE Pipeline Framework: BRE is shipped with BizTalk, so it is probably already installed; otherwise you need to run the BizTalk installer. Then download the BRE Pipeline Framework here and install it.

Tips and tricks regarding BRE:

If you need the possibility to call custom .NET functions and libraries from your rules, you need to add a value to the registry as follows:

HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\BusinessRules\3.0\

Add the following value: Name: StaticSupport, Type: REG_DWORD, Value: 2.
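The same setting expressed as a .reg file, using exactly the key, name, type and value given above:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\BusinessRules\3.0]
"StaticSupport"=dword:00000002
```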

 

When creating a pipeline from the pipeline component, we need to add a new policy, an “InstructorLoaderPolicy”, that checks the message root node text and, if it matches, initiates and sets a BRE pipeline context value.

The pipeline will then look like this:

Here TestManager is the policy where the manipulation of the message happens, and InstructorLoaderPolicy is the metadata policy that helps load metadata, like the message type, into the BRE.

 Conclusion:

We can easily add static data this way and avoid hard-coding it into maps, but it needs to be planned and designed for, since there is a limited number of places where the enrichment can be done. The enrichment components add data or perform functions on the message and return the result, and that result is the message BizTalk continues to work with. My advice is to keep this as isolated as you can, since it can quickly grow quite complex. Used with care, this pattern lets us create better and more maintainable solutions where changes to static data are made simply by updating rules, rather than changing code and redeploying solutions, which is a very time-consuming task. This also gives developers more time to spend on development and other fun stuff, rather than changing static data that the business suddenly had to change for some good reason.

Posted in: •Integration  | Tagged: •BizTalk  •BizTalk Business Rules Engine Pipeline Framework  •BRE  •BRE Pipeline Framework  •Business Rule Engine  •Maintainability  •Tips & Trix 


Why you need to be world class in Application Integration

In Sweden there is a radio program called “Spanarna” where a group of three people try to spot trends in the everyday noise and then present their visions of the future to the listeners.

The criterion for a valid trend-watch is that it requires at least three pieces of evidence to substantiate that this is something that will be realized in the near future.

I will borrow this program idea and present my vision for application integration, based on what I see when I visit customers in Scandinavia and listen to their upcoming challenges, and on what Radar Group and Gartner advise us to be prepared for.

I’ll start by stating the trend, thesis and prophecy: your most business-critical applications will not only be running in the cloud, they will be running in different clouds. Your requirements on your suppliers will be for them to take greater responsibility for your internal processes, and your customers will require the same of you. Therefore you need to form a plan for the cloud and focus on becoming world class in the area of application integration.

Since a valid trend-watch or thesis requires at least three pieces of evidence, I made it easy for myself: I borrowed most of the material from other trend-watchers, namely Gartner and Radar Group. It couldn’t be safer, since everything that comes out of their mouths tends to be a self-fulfilling prophecy.

Ok, now it is time to present the evidence that proves my trend-watch.

Evidence #1  Gartner Top 10 Strategic Trends by 2015

  • Computing Everywhere
  • The Internet of Things
  • 3D Printing
  • Advanced, Pervasive and Invisible Analytics
  • Context-Rich Systems
  • Smart Machines
  • Cloud/Client Computing
  • Software-Defined Applications and Infrastructure
  • Web-Scale IT
  • Risk-Based Security and Self-Protection

In the above list of hot technologies I have underlined the ones where the applications most probably will run in the cloud, and as you see they all involve application integration. To be able to take advantage of computing everywhere, collect data from smart machines and, especially, for the Internet of Things where everything should be connected, application integration excellence is vital.

The Internet of Things is already happening, and there is a huge opportunity for early adopters to gain advantages if they can identify the value offering that comes with data, together with the skills to collect the data their products produce. This can then be used to improve your offering and strengthen the relation to the customer. Look at the car industry; they have already started.

We also see small manufacturers that have already understood that they are sitting on an “information gold mine”, where they not only can extend the value of their product for the consumers but also sell the collected data to others.

Application integration in combination with the cloud are two of the enablers for this to happen.

Evidence #2

This is an older Gartner prediction, from Las Vegas 2013, where Gartner emphasizes the need for organizations to strengthen their application integration skills to meet the predictions listed below.

  • Continuous and accelerating B2B growth
  • By 2016, midsize to large companies will spend 33% more on application integrations than in 2013.
  • By 2016, the integration of data on mobile devices will represent 20% of integration spending.
  • By 2017, over two-thirds of all new integration flows will extend outside the enterprise firewall.
  • By 2018, more than 50% of the cost of implementing 90% of new large systems will be spent on integration.

Gartner also predicts that business-critical applications will now start to run both in the cloud and as apps on our mobile devices. The first business people who saw the possibilities in the cloud were the sales people, who moved out the CRM; by 2015, more than 50% of CRM will be deployed as SaaS.

The message is clear: make sure you have a plan for application integration, otherwise you will have a hard time keeping up with your competitors.

Evidence #3

This evidence is taken from a survey, again by Gartner, where both IT and business decision makers were asked how they prioritize and to which areas IT investments are made. The result of the survey was quite interesting, since the two groups seem to be from different planets, with completely different agendas.

It gets even more interesting when you look at how their respective budgets are developing in the coming year: the IT department’s budget is increasing by 1.3%, whereas the business is getting 3.6% more money to spend.

Guess what? When I look at the table below, my guess is that the money our business representatives get will end up in the cloud, because I do not see any areas in the table where business and IT have mutual interests.

IT-driven prioritization:

1. Application development
2. Application management
3. Cost reduction
4. Digitization of business
5. Architecture
6. Information security
7. Control
8. Infrastructure management
9. Competence development
10. Application consolidation

Business-driven prioritization:

1. Business Intelligence
2. Business and finance systems
3. Mobile solutions
4. Web
5. Issue management
6. Distance meetings (online)
7. Document and information management
8. Customer Relationship Management
9. E-billing
10. Communication solutions

Peter Sondergaard from Gartner says that “today 38% of total IT spend is outside of IT. By 2017, it will be over 50%”. Read more at: http://which-50.com/blog/2014/october/07/digital-investments-drive-global-it-spend-towards-us4-trillion-gartner/#.

To be able to meet this trend, and to make sure that the business won’t bypass you totally, you need to be seen as an enabler rather than the IT guy who stops all innovation and spends the money solely on cost reduction rather than business development.

Two things the IT department needs to do to meet the new power of the business are to form a plan for the cloud and to strengthen its skills in application integration, because when your applications start to run in a mix of different clouds and on-premise applications, you had better know how to connect them in a good way. It is not enough only to keep systems in sync in a secure, efficient and reliable way; now you also need to share data to make your customers better, or even sell information if your business model allows for that, via your well-designed information APIs.

Conclusion

Most of the top technology trends have the cloud as their playground and will require application integration to a large extent. With that in mind, I kindly recommend that you form a plan for how your organization should benefit from the cloud and stay competitive in your organization’s offering.

Because new digital initiatives and start-ups sit inside your own organization, but not necessarily in IT. Probably the investments are made in your marketing department, HR, logistics and sales. You can bet that they will drive IT investments which require excellence in application integration, and since these are non-IT people, they will most probably choose the cloud to run their applications, since the cloud supports their need for speed.

At iBiz Solutions we know that to stand prepared for changes and stay competitive in the area of application integration, you need to think in terms of maintainable application integration. Our Integration Framework describes methodology, best practices, patterns, guidance and strategic directions based on where you are and what needs you have. In the Integration Framework we have collected all our knowledge, experience and skills in application integration so that you can benefit from it and stay competitive in your area of expertise. Let us tell you more about our thoughts on the future of application integration.

iBiz Solutions Integration Framework

This blog post relates mostly to the governance hexagons in the iBiz Solutions Integration Framework.

The intention of this blog post was to make sure that if you do not have a plan for application integration, it is about time you create one. The future has never been closer, and the speed at which it approaches us has increased dramatically. I hope I was able to prove the importance of both the cloud and excellence in application integration. If you have any questions on how you should prepare to meet the coming opportunities, please do not hesitate to contact us.

I will end this post with another interesting prediction, also delivered by Gartner. Listen to this:

- By 2015, 10% of your online “friends” will be nonhuman. How many non-human friends did you have 5 years ago?

- Get used to the sentence: “Robot, please bring me a cup of coffee, and while you are away, please check outside the door whether the Amazon drone has delivered the stuff I ordered 10 minutes ago.”

If the drone has delivered the goods, then you know that the network of connected systems is well integrated.

Posted in: •Integration  •Management and Business Development  | Tagged:


WCF-WebHttp adapter does not update HTTP header per message

Usually, in BizTalk, we never work directly with transport-specific data like HTTP headers, file names, etc. Instead we manipulate the message context to instruct an adapter to add the transport-specific data. Most of the time this works great; the BizTalk adapter does a good job shielding us from the nitty-gritty details of the transport and lets us focus on the data.

Sometimes, though, this is not enough. An example is when you need to add message-specific data to the HTTP header using the WCF-WebHttp adapter: say we need to calculate an MD5 hash and add it to the Content-MD5 HTTP header for every outgoing message. This hash should, naturally, be unique for every unique message we send.

The problem

For most BizTalk developers it comes naturally to use a pipeline component for these kinds of things, so that’s where I started.

As stated in this blog post, there is a context property called HttpHeaders where we can set the header to whatever we want, and the adapter then uses this property to create the actual HTTP header.

So the first thing that comes to mind is to build a pipeline component that calculates the MD5 hash, formats the HTTP header appropriately and sets the HttpHeaders context property.
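The original component isn’t shown here, but a minimal sketch of the Execute method of such a component could look like this (HttpHeaders and the WCF-properties namespace are the context property the WCF adapters read; buffering and error handling are kept minimal):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    // Buffer the body so it can be hashed here and still be read by the adapter.
    var buffer = new MemoryStream();
    pInMsg.BodyPart.GetOriginalDataStream().CopyTo(buffer);

    string hash;
    using (var md5 = MD5.Create())
    {
        hash = Convert.ToBase64String(md5.ComputeHash(buffer.ToArray()));
    }

    buffer.Position = 0;
    pInMsg.BodyPart.Data = buffer;

    // The WCF adapters read outgoing HTTP headers from this context property.
    pInMsg.Context.Write(
        "HttpHeaders",
        "http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties",
        "Content-MD5: " + hash);

    return pInMsg;
}
```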

Testing the pipeline component

A quick test using Fiddler to inspect the HTTP request shows that an MD5 hash is indeed calculated and added to the HTTP header.

However, if we run the same test again, using another test file, the MD5 header is still the same even though the message body isn’t.

I haven’t found much documentation on this, but I guess it has something to do with how BizTalk instantiates the adapter. In the blog post linked above the author mentions this behavior briefly in the last paragraph and suggests that using a dynamic send port will solve the issue. I haven’t tried that myself, but at least for my scenario a dynamic port would be overkill and just add a bunch of complexity to the solution.

Thinking outside the BizTalk box

I think that in the BizTalk world it is often forgotten that when using a WCF adapter we have the full WCF stack at our disposal and in contrast to most other adapters in BizTalk we can actually execute code in the adapter.

I am not going to get in to the details about WCF extensibility and message inspectors but more information is available on MSDN.

Basically there is an interface called IClientMessageInspector that has a method called BeforeSendRequest that allows us to execute code just before the message is sent to the server.

An implementation of that interface that calculates the MD5 hash and adds it to the HTTP header could look like this:
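The code from the post isn’t reproduced here, but a minimal sketch of such an inspector could look as follows. One simplification to be aware of: the body is hashed as the XML this writer serializes, which with the WebHttp (raw) encoder is not necessarily byte-for-byte what goes on the wire, so the hashing step may need adjusting to your scenario:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;
using System.Xml;

public class Md5MessageInspector : IClientMessageInspector
{
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // Work on a buffered copy; a WCF Message can only be read once.
        MessageBuffer buffer = request.CreateBufferedCopy(int.MaxValue);
        request = buffer.CreateMessage();

        // Serialize the body so it can be hashed (see the caveat above).
        byte[] body;
        using (var ms = new MemoryStream())
        {
            using (XmlDictionaryWriter writer = XmlDictionaryWriter.CreateTextWriter(ms))
            {
                buffer.CreateMessage().WriteBodyContents(writer);
            }
            body = ms.ToArray();
        }

        string hash;
        using (var md5 = MD5.Create())
        {
            hash = Convert.ToBase64String(md5.ComputeHash(body));
        }

        // Add (or reuse) the HTTP property bag and set the Content-MD5 header.
        HttpRequestMessageProperty http;
        object existing;
        if (request.Properties.TryGetValue(HttpRequestMessageProperty.Name, out existing))
        {
            http = (HttpRequestMessageProperty)existing;
        }
        else
        {
            http = new HttpRequestMessageProperty();
            request.Properties.Add(HttpRequestMessageProperty.Name, http);
        }
        http.Headers["Content-MD5"] = hash;

        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState) { }
}
```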

Once this is registered in a WCF behavior and added to machine.config (more on that in the MSDN link above), our behavior is available for selection on the Behavior tab of the WCF-WebHttp adapter properties for our send port.

Now running the same two tests again proves that we get different MD5 hashes for both messages.

Conclusion

Only, if ever, set the HTTP header in the BizTalk context when the data is static or varies per port rather than per message.

WCF extensibility gives us the opportunity to execute code just before the data is sent; the pipeline doesn’t.

 

Posted in: •Integration  | Tagged: •BizTalk  •WCF-WebHttp 


Calling an on premise Microsoft Dynamics CRM using BizTalk and Active Directory Federation Services (ADFS)

Federated security is a great way of accomplishing single sign-on (SSO) security for your applications. It’s also a technique that is becoming increasingly relevant as things move to the cloud and we get more hybrid situations, with some applications on premise and some in the cloud.

Active Directory Federation Services (ADFS) is an implementation of federated security and is used by a number of Microsoft applications, Microsoft Dynamics CRM being one of them.

Windows Communication Foundation (WCF) has a few techniques to simplify federated security communication and this post will show an example of using Microsoft BizTalk Server and WCF to communicate with an ADFS secured CRM installation.

What is federated security?

Federated security at its core is pretty simple. In our scenario BizTalk Server (client) wants to login in and authenticate itself with the CRM system.

Traditionally the CRM system would have to manage the credentials for the client and verify them at login. There are a number of drawbacks to this, the main one being that it doesn’t scale well: as more and more separate systems need access to each other, login information and login logic get spread out across a number of systems.

When using federated security, each party instead chooses to trust a common party (in this case ADFS and AD), and as long as someone provides a token that can be validated with the trusted party, the CRM system will trust that the caller has already been authenticated and that everything is OK.

  1. Authentication and requesting token
  2. Authentication against AD
  3. Login authenticated
  4. ADFS token response
  5. ADFS token based authentication
  6. Response from CRM

So basically, a federated security model allows all authentication and authorization to be separated out into a dedicated system.

As mentioned, ADFS is just an implementation of federated security, where Active Directory acts as the main repository with a Security Token Service implementation on top of it.

BizTalk and ADFS

As BizTalk has great WCF support, we can use the WCF stack to handle all of the communication with ADFS and CRM. But it does involve a fair bit of configuration. BizTalk and Visual Studio will help in most mainstream WCF scenarios, where one can point Visual Studio at the WSDL location and a basic binding file is generated for us. However, in the case of an ADFS-based WSDL, this just results in an empty binding file that doesn’t help much.

Many projects I’ve seen make the decision at this point to use custom code and create a facade to solve the authentication. As Microsoft Dynamics CRM comes with a nice SDK, including a C# library that handles authentication, it’s easy to understand how one would end up using that. The problem, however, is that this creates another code base that needs to be maintained over time. One could also argue that custom code called by BizTalk further complicates the overall solution and makes it harder to maintain.

So let’s configure the authentication from scratch using Windows Communication Foundation.

Choosing the right WCF binding

First thing to do is to choose the right WCF binding. Let’s create a WCF-Custom Static Solicit-Response send port and choose the “ws2007FederationHttpBinding”.

Adding the Issuer

The first thing we need to add is information on how to connect to the issuer. The issuer is the one issuing the token; in our case that’s ADFS.

First we need to add information about the address of the Issuer. The WSDL tells us that the mex endpoint for the ADFS server is located at “https://adfs20.xxx.yy/adfs/trust/mex”.

Browsing the WSDL for the ADFS server shows a number of different endpoints. Which one to use depends on the kind of authentication used when requesting the token. In our case we’re using a simple username and password, so we’re using the “usernamemixed” endpoint (“https://adfs20.xxx.yy/adfs/services/trust/2005/usernamemixed”).

Secondly we need to add information about the binding and the binding configuration for communication with the ADFS service.

What this basically means is that we need to add the information to a second, or inner, binding configuration. The BizTalk Server WCF configuration GUI doesn’t provide a way to set this, so the only way to configure it is to use one of the relevant configuration files (“machine.config” or the BizTalk config) and add a binding manually.
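As a sketch, the inner binding added to machine.config could look something like this (the binding name is made up, and the exact security settings depend on your ADFS setup):

```xml
<system.serviceModel>
  <bindings>
    <ws2007HttpBinding>
      <!-- Inner binding used when talking to the ADFS usernamemixed endpoint -->
      <binding name="AdfsUserNameMixed">
        <security mode="TransportWithMessageCredential">
          <message clientCredentialType="UserName" establishSecurityContext="false" />
        </security>
      </binding>
    </ws2007HttpBinding>
  </bindings>
</system.serviceModel>
```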

Once this is set up, we can point our BizTalk Server WCF configuration to the correct URL and reference the inner WCF binding we just configured.

Finally we need to provide the username and password to authenticate ourselves to the ADFS server.

We now have communication setup to the ADFS service and should be able to get a valid ticket that we then can use to authenticate ourselves to the CRM system!

We now however also need to provide information on how to connect to the actual CRM system.

Configure communication to CRM

The rest is easy. Let’s start by adding the URL of the end service we want to call. As with any other service call, we also add the SOAP action header, which in this case is the Update operation (“http://schemas.microsoft.com/xrm/2011/Contracts/Services/IOrganizationService/Update”) of the “OrganizationService” service.

As our service also uses SSL for encryption, we need to tell the binding to use “TransportWithMessageCredential”.

Establishing a Security Context – or not

Finally there is a little tweak needed. WCF supports establishing a security context, which caches the token and avoids asking the STS for a new token for each call to the CRM system. BizTalk Server, however, doesn’t seem to support this, so we need to turn it off.
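In the ws2007FederationHttpBinding configuration this is the establishSecurityContext attribute on the message security element; a sketch tying the pieces above together (binding and configuration names are hypothetical):

```xml
<ws2007FederationHttpBinding>
  <binding name="CrmFederationBinding">
    <security mode="TransportWithMessageCredential">
      <!-- establishSecurityContext="false" turns off the cached security context -->
      <message establishSecurityContext="false">
        <issuer address="https://adfs20.xxx.yy/adfs/services/trust/2005/usernamemixed"
                binding="ws2007HttpBinding"
                bindingConfiguration="AdfsUserNameMixed" />
      </message>
    </security>
  </binding>
</ws2007FederationHttpBinding>
```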

Conclusion

Understanding federated security is important and will become increasingly important as we move to systems hosted in the cloud: federated security is the de facto authentication standard used by hosted systems in the cloud. Avoiding custom code and custom facades is a key factor in building maintainable large-scale BizTalk-based systems over time. BizTalk has great WCF support, and taking full advantage of it is important to be able to build solutions that are easy to oversee and possible to maintain, not just by those familiar and comfortable with custom code.

Posted in: •Integration  | Tagged:


Why full NuGet support for BizTalk projects is important!

Let’s start with a summary for those who don’t feel like reading the full post.

Using NuGet to handle BizTalk dependencies for shared schemas, pipeline components and so on works fine today.

However, as .btproj files aren’t supported by NuGet (as shown in this pull request) and are not in the current whitelist of allowed project types, Package Restore will not work (issue closed as by design here).

Not having Package Restore is of course a problem, as one is now forced to check in all packages as part of the solutions, something that in the end leads to bloated and messy solutions.

So please reach out to your Microsoft contacts and let’s get this fixed!

NuGet

As most people know, NuGet is the package management solution from Microsoft for .NET. It started off as an initiative to further boost open source within the .NET community, and NuGet packages uploaded to the NuGet.org platform are open and available directly within Visual Studio through the NuGet add-in. Currently there are well over 20 000 open packages for everyone to download and use.

Lately there have, however, been lots of discussions within the community about using NuGet as a package manager for internal shared resources as well (by Hanselman and others). Solutions like MyGet allow for private NuGet feeds, only available to those within your organization, while still leveraging all the ease and control offered by NuGet.

Using NuGet for references has a number of advantages:

  • Communication: All available resources are directly visible in Visual Studio, and when an update to a used library is available a notification is shown. No more spam emails about changes and no more never-read lists of available libraries.
  • Versioning: A NuGet package has its own versioning. This is useful as it isn’t always optimal to change the DLL version, but by using the NuGet package version one can still indicate that something has changed. As you also reference a specific version of a NuGet package from your solution, you always have full control of exactly what version you’re targeting and where to find the built and ready bits.
  • Efficiency: When starting to work on a project with many references, one first has to get the source code for the references from source control and build it (hopefully in the right version… hopefully you have your tags and labels in order…) until all the broken references are fixed. With NuGet references this just works straight away, and you can be sure you get the right version, as the resource isn’t the latest from source control but the actual built DLLs that are part of the referenced NuGet package.

NuGet Feeds

As mentioned, NuGet feeds can be public or private. A NuGet feed is basically an RSS feed of available resources and their versions. A feed and a NuGet server can be a hosted web-based solution, or something as simple as a folder that you write your NuGet packages to. The NuGet documentation covers these options in depth. The point being: creating your own private NuGet feed is very simple!

So if you haven’t already realized it by now: NuGet is not only a great tool to manage public external dependencies, it can add a lot of value for internal references as well.

Couple of relevant NuGet features

  • NuGet Package Restore: enables NuGet to download the referenced packages from the package area. The goal is to avoid having to check in the actual references in source control, as this would bloat the version control system and in the end create a messy solution.
  • NuGet Specification (nuspec) metadata token replacements: All packages are based on a nuspec file that dictates the version, package description and other meta information. NuGet has the capability, by using replacement tokens (such as $version$), to read some of this information from the AssemblyInfo files. This is far from a critical feature, but it’s nice to have and avoids having to repeat oneself with the same information in a number of places (see the sketch below).
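As an illustration, a nuspec for a hypothetical internal shared-schemas package using token replacement could look something like this:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <!-- $version$, $author$ and $description$ are replaced from AssemblyInfo at pack time -->
    <id>MyCompany.Shared.Schemas</id>
    <version>$version$</version>
    <authors>$author$</authors>
    <description>$description$</description>
  </metadata>
  <files>
    <!-- Ship the built assembly rather than the source -->
    <file src="bin\Release\MyCompany.Shared.Schemas.dll" target="lib\net40" />
  </files>
</package>
```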

BizTalk and NuGet?

A typical BizTalk solution has a number of shared resources, such as shared schemas, shared libraries and pipeline components. As these resources usually are shared between a number of projects, they often live in a separate development cycle. So when opening a BizTalk project with such resources, it’s not only a lot of work getting the referenced code and building the references; there’s also this nagging feeling that it might not be in the right version and that the source might have changed since the reference was first added.

Another reference issue occurs when using a build server for a solution with references. As the solution has a dependency on the referenced projects, one has to make sure not only that the main solution is fetched to the build workarea by the build server, but also that all the referenced projects are fetched from version control, and again, hopefully in the intended version… This kind of works using TFS Build Services and common TFS Source Control. If, however, one uses Git and has resources in separate repositories, this becomes impossible, as TFS Build Services currently only supports fetching a single repository per build definition to the build workarea. (This issue does not apply to TeamCity, which has a number of different options for dependency management.)

All of these issues are actually solved when using NuGet references instead of traditional references, as we can be sure we’re getting the packaged DLLs, in the version we referenced, as part of the NuGet package, and not the latest checked-in version. A NuGet reference also makes things a bit easier when it comes to managing the workarea for TFS Build Services, as one only has to make sure the NuGet package is available (either checked in as part of the solution or by using Package Restore).

But …

NuGet doesn’t support BizTalk projects!

As discussed here, NuGet currently doesn’t support .btproj files. As BizTalk project files are basically standard .NET project files with some extra information, a two-line change to the NuGet whitelist is all that’s needed, as in this pull request.

So the main issue is that by not having full support for .btproj files, Package Restore won’t work, and we’re for now forced to check in all the NuGet packages as part of our solutions. Another minor issue is that the token replacement feature also doesn’t work. I also think that if we actually got full BizTalk support, we’d see more BizTalk-specific open shared packages for things like useful libraries and pipeline components.

Call for action: Reach out to the NuGet Team or other Microsoft connections and let’s get those two lines added to the white list!

Posted in: •Integration  | Tagged: •BizTalk  •NuGet 


London BizTalk Summit 2014 Wrap Up

BizTalk Summit 2014 was a fully packed event with many interesting speakers. BizTalk360 and Saravana Kumar did a good job putting this together. After two intense days, we arrived back in Sweden with new contacts in our network and new knowledge about the latest techniques and announcements.

No less than 12 integration MVPs were represented at the sessions. On top of that, Harish and Guru from Microsoft initiated the summit by presenting some of the news regarding BizTalk 2013 R2 and BizTalk Services. At the summit as a whole, the subject focus was as broad as it can get within the narrow integration area. Amongst the covered areas we find BizTalk360 (of course), WABS and Azure Mobile Services, but also a fair share of sessions covering softer subjects on a higher, more abstract level.

First of all, we are all happy to once again see a united front from Microsoft: BizTalk is a product to rely on and will be around for many years ahead. Microsoft describes it as a vital part of the hybrid solutions that are now also reality. They also stress that it is still an important product as an on-premise platform, which is how it is often used by many of the customers. The release plan is one release per year, alternating between major and minor (R2) releases. Guru Venkataraman (Senior Program Manager, BizTalk Product Team, Microsoft) also revealed that there will be around 7800 tests just for the BizTalk engine. That large number of tests may sound insane, but as Guru explained, it is necessary since they need to maintain and support the heavy backpack BizTalk has been building up since back in 2000.

With that said, let us move on to the fun parts; a wrap-up of the sessions!

 

BizTalk @ Microsoft and upcoming server releases

The first session, opening the summit, was held by Guru, who presented the new features in BizTalk 2013 R2. The largest effort has been put into optimizing compatibility, but some new features will be released:

  • Native JSON support
  • JSON Schema wizard
  • Support for empty message when working with REST
  • Platform alignment with Windows Server 2012 R2 and SQL Server 2012 R2

The long-awaited native JSON support is very welcome to the BizTalk community, which today mostly uses third-party components to achieve this. Although, I clearly agree with Richard Seroter’s statement (Integration MVP, in his session “A look at choosing integration technology”) that the idea of using XSD schemas for JSON is somewhat absurd, since it really destroys the whole idea of JSON. However, with the architecture BizTalk is built on today, there is no choice but to implement it this way to be able to support transformation between messages in a btm map. Another new feature regarding REST is the processing of empty messages using an empty schema. You heard me, an empty message! It may sound strange, but it is in fact a very powerful feature. Guru had a great demo showing that this schema solves the problem the REST adapter had earlier when no message body was sent with a REST GET request, which is just a simple URL.

Some other features added in BizTalk 2013 R2 are requests from the American health care sector. The thing that struck me most here was the new XML field data type FreeText. This data type literally makes BizTalk stop parsing that field, saying “Hey BizTalk, what’s in this field is none of your business”, even if the field contains XML or invalid characters. I love loose coupling, so let’s hope this field is as useful as I think it could be. Planned release is June 2014.

 

Windows Azure BizTalk Services – Latest Updates

The latest news about WABS and Service Bus features was presented by Harish Kumar Agarwal (Program Manager, BizTalk Product Team, Microsoft). Some of the most useful new features presented are support for EDI and that WABS now can receive messages from Service Bus queues and topics.

As a short and neat summary of this, they simply took the EDI parties from BizTalk and put them in the cloud. They have tried to simplify the process and kept the most used parts visible and close together; the rest went under the “Advanced” section so as not to clutter the interface. Harish showed us how the setup is done and demonstrated how easy it is to add and manage parties. Since this is done in the portal, it is a lot easier to distribute the management of the parties to the users who really should administer these things.

 

Manageability of Windows Azure BizTalk Services

How about maintaining solutions in WABS? Steef-Jan Wiggers (Integration MVP, Author) had a lot of input on this, and as with solutions overall, the key is good source control management and good administration options. So let’s take the scenario where the source control part is covered and investigate how to maintain the deployed solutions. As it works today, the main tool you would actually use is Visual Studio, which might seem a bit odd, since we are used to having some sort of administration tool as well. At the moment we cannot edit, but we can read, the settings of adapters, for example in the Azure Portal, which might not be the best solution but it works. The REST API and PowerShell can be used for managing the solutions: stopping, starting, deploying, etc.

 

How to move BizTalk Services

So what if you would like to start moving some of your current integrations in BizTalk Server to the cloud using WABS?

Jon Fancey (Integration MVP, Author) showed us in a simple demo that migrating EDI parties and agreements is quite trivial. Migrating ports is fairly easy (as long as they are of the adapter types supported by WABS), as is migrating pipeline components to bridges and migrating maps, except for some minor necessary changes. How about orchestrations then? First of all, orchestrations are not part of WABS; they are replaced by workflows. You may question this: what, I cannot do these awesome cool orchestrations anymore? And the answer is: NO! Though that is not entirely true, since workflows have been added instead. Jon was not entirely pleased with this, as it means it will be hard to justify to a customer not migrating an orchestration to a workflow. WABS does not come with any tool for migrating this, so he set out to find a solution by actually making a converter from orchestration to workflow.

The same technique enabled him to also fully migrate pipelines to bridges in WABS. The workflow converter is not fully developed yet, but it will be, and his demo clearly showed that it will be possible to convert a really advanced orchestration into a workflow and still keep almost the same functionality. Even though I might not be the biggest fan of workflows, this approach might become one of the most common ways to solve migration problems, and it will probably bring the most value in terms of cost per migration. Still, my opinion is that it is better to use the new platform the way it is supposed to be used and in the most optimal way. Always use and take advantage of the framework, don’t fight it!

 

What if you mess up the configuration?

Tord Glad Nordahl (Integration MVP) had a memorable session on the first day, where he clearly stated that all developers are evil and lazy, using real examples in a humorous way. Personally, I can’t agree with that, not all the way at least. But I do agree with him that a good host instance strategy and a good level of logging make life easier and the solutions much more maintainable. A great admin who knows what he is doing will make the BizTalk server so much happier and healthier. He pointed out that he often sees problems with the disaster recovery (DR) model, which might be “forgotten” or just handled poorly. He presented examples like someone backing up their MessageBox database and then, two hours later, backing up the configuration database. What are the consequences of this? Well, you do back up, congrats! But has anyone actually tested that the backup can be restored? Probably not, since this setup is doomed to fail: the consistency is lost and you will run into problems when you try to run a DR. So basically you’re screwed. His real-life examples really lightened up the room and everyone had a good laugh, although almost everyone there was a developer.

 

BizTalk 2013 in Windows Azure IaaS

Staying with BizTalk Server 2013, we saw a pretty amazing demo by Stephen W. Thomas (Integration MVP) of how to automate the creation of complete BizTalk environments, with all the complex setup done automatically! I mean, who hasn’t dreamt of creating a complete environment with two clustered SQL Servers and three BizTalk nodes in one click, with the exception of a configuration file to change values in? Azure brings these great possibilities through easy virtualization and powerful APIs. Both PowerShell and the REST API have some real strengths.

With this, Stephen demonstrated the strengths of Azure in a clear and simple way, and that really powerful and useful things can be created with small means. Just think about how hard some complex test cases are to create, where you need multiple BizTalk servers and a fully functional clustered SQL Server, just as you have in production. Before Azure I would say almost impossible (unless you have all the time in the world), and even with the Azure portal it still takes a lot of time to create all the VMs, do the configuration and so on.

So Stephen’s work ended up in a couple of PowerShell scripts and one config file with three parameters, and he could create all of this in one click! With this we could actually do those tests: use a script to create the environment, create the tests, run them and verify the result. Then kill the machines, delete them and move on, just like a simple test case! Well, true, not all that simple, but still, now it’s doable. This is some really awesome stuff and I look forward to seeing more of it and being able to create all kinds of messed-up test scenarios.

 

Thinking like an Integration Person

At the Summit, a Nino Crudele (Integration MVP) on fire had a session on a Visual Studio tool for BizTalk he developed himself. He was literally so excited he could not stand still. He has created some extended tools for Visual Studio focused on speeding up the development process. I especially liked the tools he created for integration solution overview and automated documentation. You could, inside Visual Studio, right-click on an orchestration file or project to generate some really nice documents describing the process, along with a lot of metrics. There was also tuning of the build process, dependency tracking and extraction of code out of BizTalk Server using a DLL file. Awesome!

He marked an orchestration and said “compare it”, and the tool compared the orchestration installed in BizTalk with the one in Visual Studio; if there were differences, it downloaded the DLL file and extracted the source code of the orchestration. From this, you could manually check the differences and make corrections. This was actually possible for all artifacts in BizTalk. Talk about a useful tool for all those lost-project cases where several developers are involved and everyone thinks “uhm, I think this is the source code”. With this we can actually compare the source code with the installed code and, if there is a mismatch, decompile it out of BizTalk and continue developing from the decompiled code.

Another interesting part was the automated documentation, which was actually quite good: useful metrics and readable documentation, either for entire projects or for a specific orchestration. Guru stated that the tool is wanted, and he wants to add it to the Developer Toolkit distributed by Microsoft later this year, hopefully together with the R2 release. Another useful feature was the possibility of testing pipelines and pipeline components the same way you test maps inside Visual Studio today. This might be really useful as a complement to other tools, but I would still recommend using the unit test approach with Winterdom in addition, as you then build tests that are long-lived and can be really useful during the whole lifecycle of the pipeline or pipeline component.

 

BizTalk Server Operations and Monitoring using BizTalk360

We also saw a great presentation of the BizTalk360 product, which is an enhancement of the BizTalk Administration tool. They have not only improved the user experience but also added a lot of nice features, like auditing of all changes and an improved security and authorization model, so you can give a user read-only permissions for specific applications. Saravana Kumar (Integration MVP) also demonstrated the monitoring and alarm functions available, but the most impressive part was that with minimal effort you could access the portal directly via Azure: logging into the cloud with your Live ID, from where you can access multiple BizTalk360 instances that technically can be hosted anywhere in the world. That is pretty amazing and could be very useful!

 

Real world Business Activity Monitoring

Dan Rosanova (Integration MVP, Author) had a great presentation on BAM and how it can be used to give the business insight and status reports on what’s happening in the integration flows. Obviously this is not news, but he gave some very useful examples and tips on how to use BAM in a way that is understandable by administrative staff. I have seen and tried BAM before, but Dan made it look cool and interesting; he pointed out some obvious but still very nice areas where it can be very useful and give more insight into integration. He also pointed out that for most people in an organization, integration is just a black box, and with the right tools we can enlighten them about what is happening, or even make them understand integration a bit more. In this way, integration can be brought to the table again with a richer life and, as he said, “get the business people off our backs, living happy with their user-friendly tracking tool”.

 

Exposing Operational data to Mobile devices using Windows Azure

Kent Weare (Integration MVP, Author) talked about this topic. As we all know, mobility is smoking hot on the market and everyone is talking about it, but what does it have to do with BizTalk, WABS and Service Bus? Easy! Mobile solutions need integration as part of the infrastructure, otherwise they would be quite boring, living there alone on the mobile device. Admittedly, it is not quite that simple; mobile apps do know how to communicate with a server to get information. But there is still a gap from an integration point of view, especially when integrating a corporate app with an ERP system. We need to help the mobile developers so that they can work against a single API or endpoint, and then let integration specialists do the work of connecting all the systems together. The coolest and greatest part, however, is that the Service Bus really helps us do this in an easy way. They have now added C# support for programming Mobile Services, which means you are not forced to learn node.js. They have given the power back to us! As we all know by now, the Service Bus Relay enables communication with LOB and legacy systems and makes communication flow seamlessly in and out of firewalls in an extremely secure and controlled manner. Thus, integration will be key and play a bigger role in the future of mobility. We might need to speed up the process and make changes on the fly, but we need people who understand how things work.

 

When to use what: A look at choosing Integration Technology

Richard Seroter (Integration MVP, Author) had a session about management and different integration technologies, focusing on when and how to use them. His focus was core Microsoft technology such as Service Bus, BizTalk 2013, WABS, etc. The perspective in which he presented this was: “Is this right for me?”, “Do I have people who understand this technique?”, “Is this technique mature enough?”, “How can we maintain the solutions we create?” and other important questions to address before choosing a technology. A lot of arguing can start with “This technique is cool, let’s try it!”, and later on, when you have 15 different technologies and platforms, I bet no one wants to be responsible for maintaining all that. So from what I see, it is very important to understand your organization and your current products, and to use what has the most value for the organization. Another key question is how steep the learning curve is: how long does it take for my team to learn these new technologies? Is this technology strong enough to survive the next 5-10 years? These questions should be addressed before asking whether we should use BizTalk or WABS. Or both? How effective would it be if we had five C++ developers who suddenly had to learn node.js?

The best plan, as Richard put it, is to see the potential in your organization and what kind of developers you have, to make sure you believe in the products you choose, and not to be afraid of mixing a few of them. Just make sure to have good reasons for bringing new ones in, not just that they are new and cool. For example, the Service Bus makes life a lot easier when communicating in and out through firewalls, so it is a great add-on to BizTalk.

 

Master Data Services

Johan Hedberg’s (Integration MVP) session was about a very common and hard-to-solve topic: master data. Whether it is customer data or employee data, I think we have all faced this problem in some way. Master data problems are hard to solve, and the solutions developed for them are often complex and difficult to maintain. Johan demonstrated Master Data Services, a service in SQL Server for keeping track of and managing master data. Basically, the idea is to create a data type and an entity, where the data type is the name and the entity describes what the data looks like. The service then creates tables and views for the entities, and on top of this data validation rules can be added for automatic validation. When adding data we simply populate the staging table of the entity, and the data is imported and then validated. For pulling or sending data we read from a view, with a filter so that we only pull new, updated or otherwise interesting data. This way we can set up event-based pushing, or pull the whole table for a full push to other systems. That is an easy job for BizTalk 2013! A simple, neat and surprisingly powerful solution.

 

This blog post is written by Mattias Lögdberg, Therese Axelsson and Robin Hultman.

 

Posted in: •Integration  | Tagged: •BizTalk Summit 


Simplified usage of shared BizTalk artefacts using NuGet

Most BizTalk projects have a lot of dependencies! The assembly is the smallest deployable unit in BizTalk, and we typically do not want to redeploy all our integrations just because we updated one map in one of them. When one artifact needs to use another artifact, it is said to be dependent on that artifact, and if the artifacts are located in different assemblies we have an assembly reference.

There is a ton of material on how to handle these dependencies when it comes to deployment, but one of the big annoyances with having a lot of dependencies is simply getting the solution to build fresh from source control.

A typical implement-fix-in-existing-project workflow could look something like this:

  1. Clone the project from the source control repository.
  2. Build.
  3. < Get build errors due to missing dependencies >
  4. Clone all dependent source control repositories.
  5. Build all dependent projects.
  6. Build the project that we are supposed to update (crossing fingers that we didn’t make any mistakes in the steps above).
  7. Implement the fix.
  8. Commit changes.

It is not uncommon that these steps take more time than the actual fix, especially if there are a lot of dependencies.

Like most other development teams, we use a build server where built, tested and versioned assemblies are stored. So why do I need to download and build code when I only need the binary?

The ideal would be to pull down the project we need to update and have all dependencies automatically restored as pre-built, versioned assemblies from the build server.

NuGet to the rescue?

Note that BizTalk projects are not currently supported by NuGet. I have opened a pull request with support for BizTalk projects: https://nuget.codeplex.com/SourceControl/network/forks/robinhultman/NuGetWithBizTalkProjectExtension/contribution/5960. This example uses my private build of NuGet.

When NuGet is mentioned, people often think of public feeds for open source libraries, but you can just as well use it internally with a private feed.

NuGet-enabling a BizTalk project is easy. NuGet will use information from the AssemblyInfo, but to add some more metadata about our artifact we can add a nuspec file, as sketched below. The rest is done on the build server.
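As a minimal sketch, a project-level nuspec file could look something like this. The $-tokens are replaced with values from AssemblyInfo when the project is packed; the authors and description values here are placeholders, not taken from the actual demo solution.

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <!-- $-tokens are replaced with values from AssemblyInfo when packing the project -->
    <id>$id$</id>
    <version>$version$</version>
    <!-- Placeholder metadata, replace with your own values -->
    <authors>Integration Team</authors>
    <description>Shared BizTalk schema assembly.</description>
  </metadata>
</package>
```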

We use TeamCity as our build server. TeamCity has support for packing and publishing NuGet packages to both private and public feeds, and it is also able to act as a NuGet feed server.

Our build process looks like this:

With the package published to our NuGet server, we can simply reference it from the “Manage NuGet Packages” dialog in Visual Studio.
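Adding the reference records the dependency, pinned to an exact version, in the project’s packages.config file. A minimal sketch, using the demo package id from the build output later in this post (the targetFramework value is an assumption):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Each referenced package is listed with its exact version;
       this is what the restore step uses to download it again -->
  <package id="Demo.Shared.Schemas.EFACT_D93A_INVOIC" version="1.0.0.0" targetFramework="net45" />
</packages>
```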

If the NuGet Visual Studio setting “Allow NuGet to download missing packages” is enabled, NuGet will, as the label indicates, download all missing packages automatically. With this in place, the implement-fix-in-existing-project workflow instead looks like this:

1. Clone the project from the source control repository.
2. Build < Visual Studio automatically restores dependencies from the build server >.
3. Implement the fix.
4. Commit changes.

If we investigate the MSBuild output in Visual Studio from step 2, we can see that the dependencies are downloaded from the NuGet feed:

Restoring NuGet packages… To prevent NuGet from downloading packages during build, open the Visual Studio Options dialog, click on the Package Manager node and uncheck ‘Allow NuGet to download missing packages’.
Installing ‘Demo.Shared.Schemas.EFACT_D93A_INVOIC 1.0.0.0’.
Successfully installed ‘Demo.Shared.Schemas.EFACT_D93A_INVOIC 1.0.0.0’.
Demo.CustomerInvoice.Transforms ->
C:\Projects\Demo\CustomerInvoice\Transforms\bin\Debug\Demo.CustomerInvoice.Transforms.dll

If one of our dependencies is updated, NuGet detects this and enables us to update the reference.

If you think this could be useful, please upvote my pull request on the NuGet project:

https://nuget.codeplex.com/SourceControl/network/forks/robinhultman/NuGetWithBizTalkProjectExtension/contribution/5960

To read more about creating and publishing NuGet packages:

http://docs.nuget.org/docs/creating-packages/creating-and-publishing-a-package

More about package restore:

http://docs.nuget.org/docs/reference/package-restore

More about nuspec files:

http://docs.nuget.org/docs/reference/nuspec-reference

Posted in: •Integration  | Tagged: •BizTalk  •NuGet  •TeamCity 


Export BizTalk Server MSI packages directly from Visual Studio using BtsMsiTask

Getting a full Continuous Integration (CI) process working with BizTalk Server is hard!

One of the big advantages of a working CI process is to always have tested and verified artifacts from the build server to deploy into test and production. Packaging these build resources into a deployable unit is however notoriously hard in BizTalk Server, as a Visual Studio build will not provide a deployable artifact (only raw dlls). The only way to get a deployable MSI package for BizTalk Server has been to first install everything into the server and then export – until now.

Why Continuous Integration?

Continuous Integration is a concept popularized by Martin Fowler, whose well-known article on the subject was last revised in 2006. At its core it is about team communication and fast feedback, but it also often leads to better quality software and more efficient processes.

 

A CI process usually works something like this:

1. A developer checks in code to the source control server.

2. The build server detects that a check-in has occurred, gets all the new code and initiates a new build, while also running all the relevant unit tests.

3. The result from the build and the tests is sent back to the team of developers, providing them with an up-to-date view of the “health” of the project.

4. If the build and all the tests are successful, the built and tested resources are written to a deploy area.

As one can see, the CI build server acts as another developer on the team, but it always builds everything on a fresh machine and bases everything on what is actually checked in to source control – guaranteeing that nothing is built using artifacts that for some reason are not in source control, and that no special settings are required to achieve a successful build.

In step 4 above the CI server also writes everything to a deploy area. A golden rule for a CI workflow is to use artifacts and packages from this area for further deployment to test and production environments – and never directly build and move artifacts from developer machines!

As all resources from each successful build are stored safely and labeled, one automatically gets versioning and the possibility to roll back to previous versions and packages if needed.

What is the problem with CI and BizTalk?

It is important to have the build and feedback process as efficient as possible, to enable frequent check-ins and to catch errors and mistakes immediately. As mentioned, it is equally important that the resources written to the deploy area are the ones used to deploy to test and production, so one gets all the advantages of versioning, rollback possibilities and so on.

The problem with BizTalk Server is however that just building a project in Visual Studio does not give us a deployable package (only raw dlls)!

There are a number of ways to get around this. One popular option is to automate the whole installation of the dlls generated in the build. This not only requires a whole lot of scripting and work, it also requires a full BizTalk Server installation on the build server. The automated installation also takes time and slows down the feedback loop to the development team. There are however great frameworks, such as the BizTalk Deployment Framework, to help with this (this approach of course also enables integration testing using BizUnit and other frameworks).

Some people would also argue that the whole script package and the raw dlls could be moved to test and production and viewed as a deployment package. But MSI is a powerful packaging tool, and BizTalk Server has a number of specialized features around MSI. As MSI is also simple and flexible, it is usually the solution preferred by IT operations.

A final possibility is of course to add the resources one by one directly in the BizTalk Server Administration console. In more complex solutions this however takes time and requires deeper knowledge of the solution, as one manually has to know in what order the different resources should be added.

Another option: BtsMsiTask

Another option is to use BtsMsiTask to generate a BizTalk Server MSI directly from the Visual Studio build using MsBuild.

 

BtsMsiTask uses the same approach and tools as the MSI export process built into BizTalk Server, but packages them as an MsBuild task that can be executed directly as part of the build process.

BtsMsiTask enables the CI server to generate a deployable MSI package directly from the Visual Studio based build, without first having to install into BizTalk Server! A rough sketch of a build file using the task is shown below.
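As a minimal sketch, an MsBuild project file invoking the task could look something like this. Note that the task name, assembly path, item group name and attribute names below are assumptions for illustration, not the verified BtsMsiTask schema; consult the BtsMsiTask documentation for the actual usage.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative sketch only: the task name, assembly path and
     attribute names are assumptions, not the verified BtsMsiTask schema -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="CreateMsi">

  <!-- Register the MsBuild task installed by BtsMsiTask (path assumed) -->
  <UsingTask TaskName="BtsMsiTask"
             AssemblyFile="C:\Program Files (x86)\BtsMsiTask\BtsMsiTask.dll" />

  <ItemGroup>
    <!-- The built BizTalk assemblies to include as resources in the MSI -->
    <BtsAssemblies Include="**\bin\Release\*.dll" />
  </ItemGroup>

  <Target Name="CreateMsi">
    <!-- Generate the MSI from the built assemblies, without installing into BizTalk -->
    <BtsMsiTask ApplicationName="Demo.CustomerInvoice"
                Resources="@(BtsAssemblies)"
                Destination="$(MSBuildProjectDirectory)\msi" />
  </Target>
</Project>
```

The build server can then execute this project file as an additional MsBuild step and publish the resulting MSI to the deploy area.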

 

This post is a cross-post and was originally posted here

Posted in: •Integration  | Tagged: •BizTalk  •BtsMsiTask  •Continuous Integration  •Visual Studio