Understanding API Management Source Code

Development in Azure API Management is done in the Azure portal or in the Publisher Portal that comes with the API Management instance. Developing and testing is easy, but when it comes to moving the developed API to the next instance (Test or Prod) things get trickier.

In this demo instance I have two APIs, with logic applied to handle authentication and formatting.

Where is the code?

When developing APIs in the API Management Publisher Portal you quickly learn that the API signature and the policy code are separated and handled in different areas of the portal: API and Policy. This split is less obvious in the new API blade inside the Azure portal, where a lot of work has been done to make it easier to understand the APIs and see the whole picture in one place.

But the difference is important: the API is the “external” definition that clients need in order to understand and call our API, while the policy is the logic that is executed when a call is made to the API.

Exporting the API: press Export and select “OpenAPI specification”.

Inspecting the file shows that only the Swagger definition is exported, not the policy logic added to the API.

We now try to extract the code using the Git option, found in the Publisher Portal under “Security” -> “Configuration repository”.

First press “Save configuration to repository”; note that after the save is completed the Git icon in the top right corner turns green.

Get the clone URL and note that the username is “apim”; scroll down and generate a password.

Now use the credentials to connect. Tip: if you are using PowerShell or another tool that requires the credentials embedded in the URL, remember to URL encode the password:

git clone https://{user}:{password}@ibizmalo.scm.azure-api.net
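A minimal sketch of encoding the password, assuming you do it from C#; Uri.EscapeDataString does the URL encoding, and the password value below is just a placeholder:

using System;

class EncodeApimGitPassword
{
    static void Main()
    {
        var password = "p@ssw0rd/with+special=chars";   // placeholder, use the generated repository password
        var encoded = Uri.EscapeDataString(password);    // URL encode it so it is safe inside the clone URL
        Console.WriteLine("git clone https://apim:" + encoded + "@{your-instance}.scm.azure-api.net");
    }
}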

Inspecting the content (two folders down), we find the source code!

All of this we can copy to the next instance, but there are some things to take into consideration: each group, product, etc. has a unique ID. If these are created manually in each instance the IDs are not guaranteed to match, and then the import won’t work in the other instance, so import everything you need.

Import changes into an API Management instance

The next step is to import these files into another API Management instance. To get started, we need to clone the configuration repository from that API Management instance:

Copy the files and commit. After that we need to deploy the changes in the repository to the API Management instance, so once again go to the Publisher Portal, “Security” -> “Configuration repository”, scroll down and press “Deploy repository configuration”, and the changes are applied.
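A minimal sketch of the Git round trip against the target instance (instance names and the commit message are placeholders, and the password must be URL encoded as above):

git clone https://apim:{encoded-password}@{target-instance}.scm.azure-api.net
# copy the exported configuration folders from the source clone into this clone, then:
git add -A
git commit -m "Import configuration from source instance"
git push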

Posted in: •Azure API Management  •Integration  | Tagged: •API  •API Management  •Azure 


Exposing SAP Gateway services with API Management

Exposing services like the SAP Gateway is an important task for API Management, but not always an easy one. Security is often tight around these products, so we need to understand it in order to get the setup correct. To get started, let’s look at the setup we were facing: SAP Gateway is hosted inside the on-premises network behind several firewalls and hard restrictions, while API Management is hosted in Azure. In this case we had an ExpressRoute setup ready to be used, so we could deploy API Management inside a preexisting subnet. Firewall rules were set up based on the local IP of the API Management instance (according to the information I have, this should be static when deployed in a VNET).

API Management is deployed inside the network by setting External Network on the network tab to a subnet of the VNET that is connected to the ExpressRoute. Make sure that API Management is alone in this subnet; see the image for more information.

After this is done we set up the routing/firewall rules. To get the local IP of the API Management instance we put up a VM with IIS and created an API inside API Management that called the standard IIS start page; we then searched the IIS logs to find the local IP.

Now we can start creating the API, and I’ll jump right into the policy settings of the API created to expose the SAP Gateway. The security model on SAP Gateway can vary, but in this case Basic Authentication was the authentication mode. Handling this is quite straightforward in API Management; there is a ready-to-use policy, “Authenticate with Basic”:

So we started off by adding the authentication part; now the easy part is done. When calling the API we just got a 403 Forbidden response saying “Invalid X-CSRF-Token”. Looking into this we found that it’s the anti-forgery setup of SAP Gateway. To handle it, a token and a cookie are needed, and they are retrieved via a successful GET call to the Gateway. The initial call uses the same URL (make sure the GET operation is implemented so the result is successful, i.e. returns 200 OK, otherwise the token is not valid). Since I had no access to the SAP Gateway without API Management, my testing was done from Postman via API Management to SAP Gateway. Adding the “X-CSRF-Token” header with the value Fetch will retrieve the token and cookie, making the call look like:
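A sketch of the fetch call as sent from Postman; the host, path and subscription key are placeholders, and the Basic authentication towards SAP is added by the policy configured above:

GET https://{your-apim-instance}.azure-api.net/sapgw/ItHeaderSet('1000001') HTTP/1.1
Ocp-Apim-Subscription-Key: {subscription key}
X-CSRF-Token: Fetch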

The response looks like:
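A trimmed sketch of the relevant parts of the response; the token and cookie values are placeholders, and the exact cookie name depends on your SAP system ID and client:

HTTP/1.1 200 OK
x-csrf-token: pR3pFqTdeKsauV0b8hqhDw==
set-cookie: sap-usercontext=sap-client=100; path=/
set-cookie: sap-XSRF_SID_100={token-value}; path=/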

The interesting parts are the headers and the cookies, so let’s have a look. Under the headers we find the X-CSRF-Token that we need to send in the subsequent request.

Among the cookies we find two items, and we are interested in the one whose name starts with “sap-XSRF”; this is the anti cross-site request forgery cookie that is needed in the POST/PUT request to SAP Gateway.

Composing these makes a valid request look like this:
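A sketch of the composed POST, with the same placeholders as above; the token and cookie values are the ones returned by the fetch call:

POST https://{your-apim-instance}.azure-api.net/sapgw/ItHeaderSet('1000001') HTTP/1.1
Ocp-Apim-Subscription-Key: {subscription key}
Content-Type: application/json
X-CSRF-Token: pR3pFqTdeKsauV0b8hqhDw==
Cookie: sap-XSRF_SID_100={token-value}

{ ...order payload... }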

So now we can do these two requests to get a valid POST call into SAP Gateway from Postman; let’s move on to setting this up in API Management. In API Management we don’t want the client to know about this, or force them to implement the two calls and the logic to copy headers and so on; we just want to expose a “normal” API. So we need to configure API Management to make these two calls to the SAP Gateway within the same call from the client, in order to send a valid POST request to SAP Gateway.

In order to make this work we need to use policies, so after setting up the standard POST request to create a Service Order we go to the Policy tab.

In the beginning it looks something like this:

The first thing we need is a send-request policy. I configured it with mode new and used a response variable name of fetchtokenresponse. Since retrieving the token is done with a GET request to the SAP Gateway, we reuse the same URL as the API (after rewrite). We set the X-CSRF-Token header to Fetch, since we are fetching the token, and add the Authorization header with the value for Basic authentication. So let’s start by creating the call that does the GET token request; add this code in the inbound section:

<send-request mode="new" response-variable-name="fetchtokenresponse" timeout="10" ignore-error="false">
<set-url>@(context.Request.Url.ToString())</set-url>
<set-method>GET</set-method>
<set-header name="X-CSRF-Token" exists-action="override">
<value>Fetch</value>
</set-header>
<set-header name="Authorization" exists-action="override">
<value>Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==</value>
</set-header>
<set-body></set-body>
</send-request>

The next step is to extract the values from the send-request operation and add them to our POST request. Setting the X-CSRF-Token header is fairly straightforward: we use the set-header policy and retrieve the header from the response variable. The code looks like:

<set-header name="X-CSRF-Token" exists-action="skip">
<value>@(((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
</set-header>

The cookie is a bit trickier. Since there is no “standard” cookie handler we need to implement some more logic; in the sample I provide I make a lot of assumptions about the cookie. We need the cookie whose name starts with sap-XSRF, so I split the Set-Cookie header on ‘;’ and look for the part that contains “sap-XSRF”. In our case it also carried a domain that I didn’t need, so I removed it by splitting on ‘,’ (comma) and used the result in a set-header policy.

<set-header name="Cookie" exists-action="skip">
<value>@{
string rawcookie = ((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("Set-Cookie");
string[] cookies = rawcookie.Split(';');
string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
return xsrftoken.Split(',')[1];
}
</value>
</set-header>

All in all the policy will look like:

 <policies>
 <inbound>
 <base />
 <rewrite-uri template="sap/opu/odata/sap/ZCAV_AZURE_CS_ORDER_SRV/ItHeaderSet('{oid}')" />
 <send-request mode="new" response-variable-name="fetchtokenresponse" timeout="10" ignore-error="false">
 <set-url>@(context.Request.Url.ToString())</set-url>
 <set-method>GET</set-method>
 <set-header name="X-CSRF-Token" exists-action="override">
 <value>Fetch</value>
 </set-header>
 <set-header name="Authorization" exists-action="override">
 <value>Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==</value>
 </set-header>
 <set-body>
 </set-body>
 </send-request>
 <set-header name="X-CSRF-Token" exists-action="skip">
 <value>@(((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
	 </set-header>
 <set-header name="Cookie" exists-action="skip">
 <value>@{
 string rawcookie = ((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("Set-Cookie");
 string[] cookies = rawcookie.Split(';');
 string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
 return xsrftoken.Split(',')[1];
 }
 </value>
</set-header>
 </inbound>
 <backend>
<base />
</backend>
<outbound>
 <base />
 </outbound>
 <on-error>
 <base />
 </on-error>
 </policies>

The result is now complete: the client will see this as just another regular API, and we can expose it with the regular API Management API key security.

Wrap up

Exposing the SAP Gateway is not a straightforward task, but after understanding the process and implementing the SAP Gateway complexity inside API Management we can expose these functions just like any other API. I would suggest adding some rate limits and quotas to protect the gateway from overload, for example something like the policy sketch below. This scenario proves the value of API Management and provides possibilities to solve complex authentication and anti-forgery patterns, in order to standardise your own API facade with the same auth mechanism for the client without taking the backend security into consideration.
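A sketch of such protection; the numbers are examples only, and the rate-limit and quota policies are applied at product scope in API Management:

<rate-limit calls="10" renewal-period="60" />
<quota calls="10000" renewal-period="604800" />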

Full scenario:

 

Posted in: •Azure API Management  •Integration  •Uncategorized  | Tagged: •API Governance  •API Management  •SAP 


Connecting to Azure Subscriptions from VSTS for release management

When it comes to deployment and release management in VSTS we need to connect to our Azure subscription.

This is done via Service Endpoints created inside VSTS. There are two ways of authenticating to Azure subscriptions: with a user account or with an AAD application. Typical scenarios for AAD applications are when the subscription is not in your tenant, or when your account doesn’t have access to the subscription with the appropriate role.

1 User Account: The subscription is accessible with your user account

When you use the same user account to log in to VSTS and your Azure subscription, the subscription is auto-discovered and can be picked under the heading “Available Azure Subscriptions”.

  • Pick the Subscription
  • Press the Authorize button to make VSTS create the required authorization
    1. If you have the required access you can now start deploying to this subscription

2 AAD Application: The subscription is not accessible with your user account

If the VSTS user doesn’t have access to the subscription, it will not be listed under “Available Azure Subscriptions” and we will need to add it manually.

  • Press the Manage button
    1. A new tab is opened and you will come to the Service tab
  • Press “New Service Endpoint”
  • A new dialog is opened and you can now create a connection to an existing accessible subscription, just as before, but we want to create it based on a Service Principal, so press the “here” link
  • The dialog is transformed and now we can add the full information from an AAD application:
    1. Connection name: enter a name for the connection, e.g. “Customer A TEST”
    2. Subscription ID: the GUID of the subscription
    3. Subscription Name: a friendly, understandable name for the subscription; we often use the same as the connection name, e.g. “Customer A TEST”
    4. Service Principal Client ID: the AAD application’s Client ID
    5. Service Principal Key: the AAD application’s key
    6. Tenant ID: the GUID of the tenant
  • Sample from an AAD application in Azure, this is how you find the values:
    1. Tenant ID: the Directory ID, found in the Properties section of the Azure AD directory
    2. Service Principal Client ID: the Application ID of the AAD application
    3. Service Principal Key: a key on the Azure AD application, found under Keys; generate a new key (it is only visible once, right after saving)
  • Verify the connection. Tip: if the verification fails, make sure that the AAD application has at least “Contributor” rights on at least one resource group (not just on the subscription)
  • Press OK

This service endpoint can now be found in the subscription list.

I prefer using the AAD application connection setup for production environments, just to make sure there are no “personal account” connections that can mess things up.

 

Posted in: •Integration  •Uncategorized  | Tagged:


Logic Apps and the Dynamics CRM Online Connector

An easy way to start integrating with your Dynamics CRM Online instance. It takes just a few minutes and some easy steps to get started!

Getting started with the Connector

Getting started with the connector is easy; all you need is an account on Dynamics CRM Online. Read more about how to connect and how to use the various actions/triggers in the documentation:

https://azure.microsoft.com/sv-se/documentation/articles/connectors-create-api-crmonline/

Working with the Connector

It’s fairly easy to get started with the connector, selecting data and creating/updating records, but as soon as the business requirements land on our table it gets a bit trickier. Often we need to handle scenarios where we link relations and/or assign ownership to the entity we are creating/updating. Then we need to know a little more about the setup in Dynamics CRM, and our best friend for this is the Settings and Customizations tab.

crm_Settings

By then selecting Customize the System, a new window opens and it’s now possible to go in and verify the entity, check keys, relations, etc.

crm_customizethesystem

Using Navigational properties

Navigational properties are used to link entities together, and they can be set directly within the body of the update/insert request. The entities will then be linked correctly inside Dynamics CRM Online. An example would look like this, where we are setting the currency (transactioncurrency) of the account using the navigational property _transactioncurrencyid_value:

 "body": {
"_transactioncurrencyid_value": "@{first(body('Get_intgrationMetadata')['value'])['_stq_defaultcurrency_value']}",
"accountnumber": "@{triggerBody()['data']['Company_ID']}",
"address2_city": "@{triggerBody()['data']['Town_city_state']}",
...
..
}

Setting owner

A frequently used operation is assigning an owner. This is now implemented so it’s possible to do it directly via the update/create operation, instead of in a separate operation as before.

In the Dynamics CRM Online Connector it’s easy to set the owner: just apply the following fields. _ownerid_type is set to either systemusers or teams depending on the owner type, and _ownerid_value is the key of the user or team, as in the example below:

"body": {
"_ownerid_type": "systemusers",
"_ownerid_value": "@{first(body('Get_owner')['value'])['systemuserid']}"
}

Lessons learned

Don’t overflow the CRM! Since Logic Apps is really powerful in parallelism, it’s good to have some sort of control over how many new instances are created and executed against Dynamics CRM. We usually try to make sure that there are no more than 200 parallel actions towards a CRM at any given time.

Learn how to check fields, properties and keys, since you will get stuck on errors when sending in the wrong type, and then you will need to check what type it is. OptionSets are commonly used and good from a GUI perspective, but not as good in integration, since they are translated to a number that we in integration often need to translate to a code or text. Learning how to check the values inside CRM will speed up this process.

When we started using the connector there were problems with assigning ownership and handling float numbers. These were fixed early by the product group, and since then we haven’t found any issues with the way we are using the connector.

Posted in: •Integration  •Logic Apps  | Tagged:


Validate incoming HTTP request messages in BizTalk

One of BizTalk’s greatest strengths is that messages are not lost. Even if there is an error processing a message we still get a suspended message that an administrator can handle manually. Sometimes, though, that is not the preferred way of handling errors. When working with web APIs it is common to let the client handle any errors by returning a status code indicating whether the request was successful or not, especially if the error is in the client request.

There is a standard pipeline component included in BizTalk that can validate any incoming XML message against its XSD schema. However, if the validation fails the request will only get suspended; all the client receives is a timeout, and an administrator will have to handle the suspended message even though there is not much to do about it since the client connection is closed.

One way of handling this is to do the validation in an orchestration, catch any validation errors and create a response message to return, but writing an orchestration just for validation doesn’t make much sense given the performance implications.

A better solution would be if we could:

  • Abort the processing of the request as early as possible if there is a validation error.
  • Leverage the existing component for validating messages.
  • Return an HTTP 400 status code to indicate to the client that the request was bad.
  • Avoid suspending the message in BizTalk.

Since the receive pipeline is the earliest stage we can inject custom code in, a custom pipeline component would be the best fit.

The pipeline component would need to execute the standard XmlValidator component and catch any validation error thrown by the validator. We could of course log any validation errors to the event log if we still would like some error logging.

If a validation error was caught, we need to create a new message with the validation error details so the client understands what is wrong.

The context property OutboundHttpStatusCode should be set to 400 so that the adapter knows to return the message with a status code of 400 bad request.

To prevent any further processing of the message and indicate to BizTalk that the new message is the response to return to the client, a number of context properties related to request response messages need to be set. Correlation tokens are copied from the request message to make the response go through the same receive port instance.
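A rough sketch of the idea, wrapping the standard XmlValidator inside the custom component’s Execute method. The class name and error body format are illustrative, and where this sketch simply reuses the whole inbound context, the real component on GitHub copies and sets only the relevant request-response properties:

using System;
using System.IO;
using System.Security;
using System.Text;
using Microsoft.BizTalk.Component;            // XmlValidator
using Microsoft.BizTalk.Component.Interop;    // IPipelineContext
using Microsoft.BizTalk.Message.Interop;      // IBaseMessage

public partial class ValidateWithHttpResponse
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        var validator = new XmlValidator();
        try
        {
            // Let the standard component do the actual schema validation.
            return validator.Execute(pContext, pInMsg);
        }
        catch (Exception ex)
        {
            // Build a small error body so the client can see what was wrong with the request.
            var body = "<ValidationError>" + SecurityElement.Escape(ex.Message) + "</ValidationError>";

            var factory = pContext.GetMessageFactory();
            IBaseMessage response = factory.CreateMessage();
            response.AddPart("Body", factory.CreateMessagePart(), true);
            response.BodyPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(body));

            // Reuse the inbound context so the correlation tokens follow the response.
            response.Context = pInMsg.Context;

            // Tell the adapter to return the response with HTTP 400 Bad Request instead of suspending.
            response.Context.Write("OutboundHttpStatusCode",
                "http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties", "400");

            return response;
        }
    }
}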

This component is available as a part of iBiz Solutions open source initiative. The source code is available on GitHub and the component is also available on NuGet.

Posted in: •Integration  | Tagged: •BizTalk  •Open Source  •Pipeline Components 


Delete releases from Visual Studio Release Management

At iBiz Solutions we are heavy users of Visual Studio Release Management. Visual Studio Release Management in combination with Team Foundation Server for source control and TFS Build Services gives us the Application Lifecycle Management control that we need in our projects. It enables us to have full control over exactly what gets built, which build packages are released to which environment, and so on.

Release Management in action in one of iBiz Solutions’ projects.

Visual Studio Release Management is, as mentioned, a critical tool for us, but it still has a few places where the tool could do with some improvements. A critical feature is of course the ability to get an overview of which build is released into which environment - the current version however is not very efficient when it comes to searching and filtering the list of releases.

Another missing feature is the ability to delete previous releases. At first this sounds like a bad idea, and that one should save all releases as they might provide important information in the future. There are however situations where one makes stupid mistakes and where releases just clutter the bigger picture and make it harder to actually see the releases that are important. An efficient way of filtering, or a way of saying that a release is no longer relevant, might have solved the issue, but as mentioned this does not exist in the current version of the tool.

Long story short: here is the script that we run directly against the database to delete specific releases and all their related information in Visual Studio Release Management.

Posted in: •Integration  | Tagged:


Method not allowed when consuming API App in Azure App Services from C#

When consuming an API App from C#, the Azure App Service SDK helps us generate client code to improve the development process. (read more here)

When testing the generated code you might get a ‘Method Not Allowed’ error with status code 405, even if the security settings are correct and the API App works perfectly when used via a Logic App, Postman/Fiddler/Swagger, etc.

If the thrown exception looks something like this:

Then the problem is probably an incorrectly generated URI from the code generator, which has used http instead of https (a common issue in several places; API Apps should always use https).

To check this, go into the class of the ServiceClient (in my case FlatFileEncoder) and check the base URI settings. As you can see in the image below, mine was generated with http instead of https.

Changing the URI from http to https makes it start working: my code executes and the result from the FlatFileToXML function is returned as expected.
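A minimal sketch of the workaround, assuming the generated client exposes its base address through a settable BaseUri property (as the generated ServiceClient classes do); the alternative constructor and host name in the comment are placeholders:

using System;

class Program
{
    static void Main()
    {
        var client = new FlatFileEncoder();

        // Rewrite the generated default base URI to https before making any calls.
        client.BaseUri = new Uri(client.BaseUri.AbsoluteUri.Replace("http://", "https://"));

        // Alternatively, pass the https address explicitly if the generated client has such a constructor:
        // var client = new FlatFileEncoder(new Uri("https://{your-api-app}.azurewebsites.net"));
    }
}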

Posted in: •Azure App Service  •Integration  | Tagged: •API Apps  •Azure Api Apps  •Azure App Service  •Custom Code  •Flat File Encoder  •Integration  •Windows Azure 


Sharing a BizTalk pipeline component in the BizTalkComponents component library

So you have decided to share a pipeline component in our library? Great! The BizTalkComponents library is hosted on GitHub, and a sample pipeline component that can be used as a template is also available on GitHub.

Setting up Git

Before a component is added to the official GitHub repository of BizTalkComponents it must be reviewed. To do that you must first create a public GitHub repository. Make sure to add the .gitattributes file and the .gitignore file for Visual Studio. Also create a README file and a license file. BizTalkComponents uses MIT license. To start working on our component we need to clone the newly created repository using your favorite Git client like Visual Studio or GitHub for Windows.

Creating the solution

Fire up Visual Studio and create an empty solution in the root folder of the newly cloned repository. The solution file should be called BizTalkComponents.PipelineComponents.{component name}.sln.

Adding unit tests

All components must have unit tests so the first thing we are going to do is to add a new Unit Test project to our solution. The unit test project should be located in Tests/UnitTests under the root folder. To be able to run a pipeline component outside of BizTalk we use the Winterdom BizTalk Pipeline Testing library available as a NuGet package.

Build tests that not only ensure that the component’s output is as expected when everything is set up correctly, but also that relevant exceptions are thrown when preconditions fail, such as missing parameters set by the BizTalk administrator or missing context properties.
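A rough sketch of such a test using the Winterdom pipeline testing library; the component under test (RemoveNamespace), the input message and the expected output are placeholders for your own component:

using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Winterdom.BizTalk.PipelineTesting;

[TestClass]
public class RemoveNamespaceTests
{
    [TestMethod]
    public void ShouldProduceExpectedOutput()
    {
        var component = new RemoveNamespace();                       // placeholder: the component under test
        var pipeline = PipelineFactory.CreateEmptyReceivePipeline(); // run the component outside of BizTalk
        pipeline.AddComponent(component, PipelineStage.Decode);

        var input = MessageHelper.CreateFromString("<ns0:Root xmlns:ns0='http://tempuri.org' />");
        var output = pipeline.Execute(input);

        var body = new StreamReader(output[0].BodyPart.Data).ReadToEnd();
        Assert.AreEqual("<Root />", body);                           // placeholder assertion
    }
}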

Implementing the pipeline component

The pipeline component itself should be located in a Src folder under root. All pipeline components should target .NET 4.0 to ensure compatibility with BizTalk 2010.

BizTalkComponents.Utils

The pipeline components of BizTalkComponents use a shared utility library called BizTalkComponents.Utils. This library contains helper classes that reduce the amount of code we need to write for common tasks like interacting with the message context and reading from and writing to the component’s property bag. The library is available on NuGet.

Partial classes for a more readable component

The interfaces that every pipeline component needs to implement contain a lot of plumbing code that has nothing to do with what the component actually does. To keep the implementation clean and easy to read, BizTalkComponents uses partial classes to separate plumbing code from the component implementation. The class files should be called {component name}.cs and {component name}.Component.cs. The component class should contain any component metadata properties and methods as well as any Validate method. BizTalkComponents does not use resource files for component metadata; Name, Version and Description are set directly in the property. The IPersistPropertyBag interface contains a method for validating parameters, which is called when the component is built. This method should use the ValidationHelper from the Utils library, and it can be complemented with an additional Validate method that is not called at build time but rather when the component is called in BizTalk.

Component parameters

Any component parameter should have annotations:
  • Parameters required to be set already at design time should be annotated with the Required attribute.
  • Parameters required to be set at runtime should be annotated with the RequiredRunTime attribute available in the Utils library.
  • All parameters should be annotated with a user friendly name in the DisplayName attribute.
  • All parameters should be annotated with a description in the Description attribute.

The parameter property should be accompanied with a string constant to be used when reading and writing from the property bag.

The Load and Save methods should be implemented using the PropertyBagHelper from Utils.

The Execute method

If the component has any RequiredRuntime properties the custom Validate method should be called at the beginning of the Execute method to ensure that all parameters are set as expected.

All interactions with the message context should use the Utils library’s extension methods. The ContextProperty entity should be used for referencing any context property; it can be initialized either with the BizTalk property notation namespace#property name or by separating namespace and property into different strings. The Utils library also contains constants for some BizTalk standard properties.
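The sketch below is illustrative only: the file split, the annotated parameter and the property-bag constant follow the conventions above, but it falls back to a raw Context.Write call to stay self-contained, where the real components would use the ContextProperty entity and the Utils extension methods.

using System.ComponentModel;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// AddContextProperty.cs - the component implementation and its parameters.
public partial class AddContextProperty
{
    private const string ValuePropertyName = "Value";    // key used when reading/writing the property bag

    [RequiredRunTime]                                     // from BizTalkComponents.Utils
    [DisplayName("Value")]
    [Description("The value to write to the message context.")]
    public string Value { get; set; }

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        Validate();                                       // throws if any RequiredRunTime parameter is missing

        // Illustrative property; the real components reference it through a ContextProperty entity.
        pInMsg.Context.Write("MyProperty", "http://tempuri.org/properties", Value);
        return pInMsg;
    }
}

// AddContextProperty.Component.cs - plumbing: metadata, Load/Save and Validate live here.
public partial class AddContextProperty
{
    public string Name { get { return "AddContextProperty"; } }
    public string Version { get { return "1.0"; } }
    public string Description { get { return "Writes a configured value to the message context."; } }

    private void Validate() { /* uses ValidationHelper and PropertyBagHelper in the real components */ }
}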

Building the component

All pipeline components should be strong named so that they can be installed into the GAC. A new strong name file (snk) is created for every pipeline component.

MSI files for all components will be generated. To be able to generate an MSI from MSBuild BtsMsiTask must be installed and used. BtsMsiTask is called from a custom MSBuild build script that should be located in a Build folder under the root. All MSI should have a version. The AssemblyInformationalVersion attribute is used for versioning and to be able to read that attribute from the build script MSBuild Extension Pack must also be installed. The build script should be called Build.proj and also be included as a Solution folder in the solution.

This sample build script can be used by just replacing the component name.

Readme file

The README.md Markdown file generated by GitHub should have a short description of the component. There are many tools for editing Markdown files, for example this extension to Visual Studio.

This blog post aims to introduce the BizTalk Components library and the coding standard it uses. If you have any questions or comments on the component library or the coding standard don’t hesitate to write a comment to this blog post.

Posted in: •Integration  | Tagged:


Azure App Services - Logic Apps: process a flat file, transform to EDIFACT and send to FTP

Logic Apps is part of the new and awesome features Microsoft released in preview this spring. The focus of this type of function is cloud integration: both hybrid integration that brings data to and from servers inside your network in a secure and easy way, and pure cloud integration.

A Logic App is the application created in Azure as the workflow-like application; I won’t explain more in this post, but please read more here.

So the example I’ll take here is a classic one: I have a system that produces small and simple flat files on a local server, and we want to bring that data, in EDIFACT INVOIC format, to an FTP server. Normally this would be solved via an integration platform like BizTalk, but for this scenario we don’t have a BizTalk server that we can use, so we try to solve it via Logic Apps instead.

Scenario:

  1. Pick up a file on the local filesystem
  2. Decode the flatfile and transform it to EDIFACT
  3. Encode EDIFACT
  4. Send to FTP server

Let’s jump right into the portal. We need to start by setting up some connectors (File, FTP), and several API Apps from the Marketplace will be used.

Preparations

SQL Databases

First off we need SQL databases; you can use a previously created server, but we need two empty database instances. I did this in the “old” portal.

The two API Apps that need them are the following, and in parentheses you can see my database instance names.

  1. BizTalk Trading Partner Management (TradingPartnerManagement)
  2. BizTalk EDIFACT (EDIDatabase)

Copy the connection strings and change the text {your_password_here} to your password so you can use it later.

Service Bus:

Create a Service Bus namespace for the File Connector.

Schemas:

Download the EDIFACT XSD schema here

Mapper:

Create a BizTalk Service mapper in Visual Studio 2012

  1. Create a Flatfile schema
  2. Create the mapping from the flatfile schema to Edifact in Visual Studio.
    1. The BizTalk Service mapper (generating trfm files). I don’t like this mapper since it has poor XSLT support, so I normally just create a standard BizTalk project, do the mapping in that temporary project and later copy the XSLT into the trfm mapper.

This results in the following three files for me (after downloading the EDIFACT schema):

 

Now that we are prepared, let’s start creating the services.

Create and configure the API App instances.

In the Web API section:

Create File Connector: we need a File Connector for picking up files on the on-premises server, and we need to provide the following information right away when we create the connector:

  • Root Folder: make sure to write down or remember the value, or prepare the server with the folder structure, since the value is impossible to find afterwards (if you do find where to read this value, please let me know how!)
  • Service Bus Connection String: take the appropriate SAS connection string from the earlier created Service Bus

After creating this we can install it on the server, see the section about installing File Connector for more info.

Create FTP Connector: we need an FTP Connector to be able to send messages to the FTP server. For this we need to provide the following information right away when we create the connector:

  • Server Address: make sure not to end it with a / or similar, since :21 is automatically added for the port and you will then get an error when trying to use the connector; see the image below for how a file name is constructed in this case (notice the starting :21)
  • User name, for the ftp server
  • Password, for the user

Create BizTalk Flat File Encoder:

We need a BizTalk Flat File Encoder that will decode and encode flat files to/from XML. In our case it will decode a flat file to XML.

Installation is straightforward; you only need to supply a name for the instance.

Configure BizTalk Flat File Encoder:

For the Flat File Encoder to work we need to add one or more flat file schemas. These schemas are the same as those used in BizTalk, so you can easily reuse the ones you have today. Go into the instance we created (remember the name of the instance); the easiest way to find it is in the API Apps list (Browse -> API apps).

  • Click Schemas
    1. Press Add Button
    2. Upload the Flat File Schema (must be a BizTalk flat file schema XSD file)

Create BizTalk Transform Service

We need a BizTalk Transform Service to be able to transform the XML instance of the flat file to an XML instance of the EDIFACT schema. Creating this is quite straightforward; you just need to supply a name for the service.

Configure BizTalk Transform Service

After creation we need to do some configuration; basically, we need to add the maps. Find the app just like the others, and when you have found it do the following:

  • Click Maps
    1. Press Add button
    2. Upload the Map you prepared in the preparations

Create BizTalk Trading Partner Management

BizTalk Trading Partner Management is used to keep information about partners and setup agreements on AS2, EDIFACT and X12.

When creating we need to provide the connection string to the earlier created Database.

Configure BizTalk Trading Partner Management

When the Trading Partner Management instance is created we need to do some configuration. Mainly we need to create the parties that are supposed to be used in the B2B flow and then connect them via agreements, so we can say that partner A is exchanging messages with partner B.

First we need to create at least two partners. Go into the instance (just like the others):

  • Click Partners
  • Press Add button and add at least 2 Partners (only name is mandatory)
    1. After the partners are added we need to create identities for them (you might want to reload the web API to make sure it loads properly). After that, press one of the partners.
    2. Press Profiles
    3. Press the Profile in the Profiles section
  • Press Identities (see image below for a guidance)
  • Enter the identity value and pick a Qualifier
    1. Press Save, repeat the steps from point 3 for the other partner

When the partners are created and we have added identities to them we can start creating agreements BUT to be able to create EDIFACT agreements we also need to provide the EDIFACT schemas that we want to use.

  • Press Schemas
    1. Press add
    2. Upload the schema (repeat for more schemas if needed)

Now we can create the agreement:

  • Click Agreements
    1. Press Add button and enter a name
    2. Choose Protocol, in our case it’s EDIFACT
  • Partner Settings (here is a click-intensive part): set all values according to your setup (all must be set)
    1. Enter receive settings: I left all of this at the defaults and just pointed out the INVOIC schema.
    2. Enter send settings: I left all of this at the defaults, pointed out the INVOIC schema and added an Application Reference.
    3. Enter batch settings: I filled in the mandatory fields (name and counter = 1). If this is skipped Azure will try to create the agreement but it will fail, so just fill it in.

Note! After Save write down the Agreement ID, we will use that later on in the Logic App

Create BizTalk EDIFACT

The BizTalk EDIFACT instance will do the hard work of encoding or decoding your EDIFACT message to or from XML. When creating the instance we need the following:

  1. Database Connection string: Connection string to the earlier created Database for EDIFACT.
  2. TPM Instance Name: The name of the earlier created BizTalk Trading Partner Management (from section 5)

There is no need for any further configuration, since the EDIFACT instance will use the configuration we did in the BizTalk Trading Partner Management instance for agreements and so on.

 

Now we are all set and the preparations are done; let’s start building the Logic App.

Building the app looks easy and is smooth, but there are some tricky parts you will encounter, since we will have some looping lists in this way of building the Logic App. (I build it this way so it’s easier to understand and test; the downside is that if I need to change it later I have some serious remodeling to do, and it’s often easier to create a new flow.)

As I created it I focused on a flow that would be easy to demonstrate, so I added the File Connector in read mode on a folder, instead of the event mode that is also available. (With read mode I can trigger it at any time, which is perfect for demos or when we need manual runs.)

So let’s start!

First off, we pick up the files from the server.

  1. Add a Recurrence, I set this to standard values (once per hour, depending on the Tier choice this might be the lowest value that you can set).
  2. Add the File Connector and use the function List Files, since I will pick up the files from the root folder (that we set when we created the Connector, in my case:  C:\temp\NewFileConnector) I’ll leave the Folder Path blank.
  3. Add a second File Connector to actually pick up the files that the List Files function located. Use function Get File, and since the List Files result is a collection we need to add “Repeat over list” and pick the list from List Files.
    1. Repeat over the items returned from the first File Connector.
    2. Set File Path to: @repeatItem().FilePath (which says that we should take the FilePath from the repeating item). This needs to be set manually, since the automatic help will only offer the first() function, which would give you the information from the first file only.
  4. (Optional and not included in my sample) Add a third File Connector with function Delete File to delete the file from the folder (to prevent picking it up several times)
    1. Repeat over the list from File Connector Get Files
    2. File Path, should be the File Path from the repeating item

Now we have this set up and we will start picking up files, either on the interval or when we manually start the Logic App.

The next section adds the BizTalk Flat File Encoder, the transformation and the BizTalk EDIFACT component.

  • Add the BizTalk Flat File Encoder, function Flat File to XML
    1. Repeat over the File Connector Get File results.
    2. Flat File element should be the content from the Get File operation; since it’s a list we need to fix this manually.
    3. Schema name must be set; in my case it’s TestInvoice.
    4. Root name must be set; in my case it’s Invoice.
  • Add the BizTalk Transform Service; now we will transform the XML version of the flat file to the XML version of the EDIFACT.
    1. Once again we need to repeat over a list, in this case the result list from the BizTalk Flat File Encoder.
    2. Input XML should be the output from the Flat File Encoder.
  • Add the BizTalk EDIFACT to transform the XML version of the EDIFACT to an actual EDIFACT message.
    1. Repeat over the results from the BizTalk Transform Service.
    2. Content should be the XML result from the transformation.
    3. Agreement ID is the EDIFACT agreement ID that was created in the Trading Partner Management API App; we wrote this down earlier. (You can also look it up at any time in the Trading Partner Management instance.)
  • To send this to the FTP server, add the FTP Connector.
    1. Repeat over the result from the BizTalk EDIFACT instance.
    2. Content should be the Payload that is returned as a result from the EDIFACT instance.
    3. File Path: make sure to find something unique; I used the tracking ID to make the file name unique, together with the built-in function @concat that concatenates several strings. (Make sure not to use the @ sign inside the @concat function, since that will give you an error.)

In my setup the files will not be deleted, so I don’t need to re-drop files over and over again. (If we want that behavior, just add another File Connector at position 4, marked as optional in the setup, or change the whole setup to use the trigger function in the File Connector, which will keep track of new files, pick them up and afterwards delete the file in one step. The disadvantage with that is when you want to demo, since it’s not possible to manually trigger that flow via the start flow button.)

The result is a fully functional flow that will pick up the flat file and do all the handling needed for it to eventually end up as an EDIFACT file on the FTP server. Here is how the full flow looks for me:

Key Takeaways

Crashes:

This is a preview, so don’t be afraid of crashes. I also learned that a lot of the time it was just the async GUI that had the problems, especially when I used the Free tier; then I quickly filled my quota and, in combination with heavy traffic or high demand, I was unable to work with it. But when I switched to the Basic tier almost all problems just disappeared =)

Updating the map:

Strangely I couldn’t update the map, I had to delete it and upload it again.

Installing the File Connector:

To install the File Connector, find it in the portal (the easiest way is to go via Browse), pick API Apps and select the File Connector.

You will notice that when loading the settings it will try to check if the connector is installed, click the link.

Next, download the installer: click “Download and Configure”, see the image below.

After downloading, copy and install the application on the preferred server. When you get prompted for the connection string, as below, you should use the Primary Configuration String (marked in yellow in the image above).

 

Tips when using the File Connector

  • File names cannot contain spaces; that will cause internal server errors in the File Connector (not sure why)

Finding the files and Installed content on your server:

After the micro service is installed it will be found in IIS

When browsing the file system we will find a folder containing some files that represent a web API:

Testing the installed API App locally with swagger on your server

Troubleshooting the flow:

So when you get the red error and have no idea what went wrong, we need to start troubleshooting. In the editor it looks and feels so smooth to do things, but it is not as easy to find the errors, especially not the first few times. Those of us who have been in the business for some time are used to troubleshooting and understanding XML notation and layouts, so this can be a little new, since Logic Apps builds its settings, configuration and input/output on JSON notation, meaning no XPaths but a .(dot) notation instead.

First of all, in a JSON object we use {object}.{attribute} to access something, so when looking at this example we need the following notation in the field to get the value.

Check the error output messages; in the example below of an error in the EDIFACT service, the reason is found under output.body.Exception.
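A trimmed, hypothetical example of what such an output can look like; everything except the output.body.Exception path is illustrative:

 {
   "output": {
     "headers": { "Content-Type": "application/json" },
     "body": {
       "Exception": "<error details returned by the EDIFACT API App>"
     }
   }
 }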

Other tips when working in the portal.

Watch out for portal-related issues as well: for example, when editing partners in Trading Partner Management, make sure to wait for the green save signal before proceeding to the next step. I found myself having trouble when I jumped around too fast in the portal.

Posted in: •Integration  | Tagged: •AzureAPIApps  •API Apps  •EDIFACT  •FileConnector  •FlatFile  •FTP  •Logic Apps  •Microsoft Azure 


Thoughts on Microsoft Azure App Services after BizTalk Summit 2015

We’re just back from a couple of great days in London and BizTalk Summit 2015. Saravana and his BizTalk360 team put together a great show with excellent content and an overall superb arrangement!

This year Microsoft had a whole team of people at the conference who, during the first day, did a number of presentations on the new Azure App Service concept. In case you missed it, App Services is (among other things in the concept) Microsoft’s next attempt to move the integration capabilities from BizTalk Server into the cloud, and more specifically into Azure.

Keynote speaker Karandeep Anand started off by explaining that the vision for App Services is based on three main pillars: Democratize Integration, Becoming an iPaaS Leader and Creating a Rich Ecosystem.

Image by @wearsy

The focus on democratization is a goal that aims to make it easier to get started and to quickly get to delivering value. This is a great focus, as this is a huge problem with today’s platform. Today we have to install servers, databases and heavy applications, set up accounts and a hundred other things before we can even send a file from A to B! I’m sure that it in the end won’t be as simple as in the demos, but what we have seen so far is definitely impressive when it comes to how simple it looks to get started.

Another part of Democratize Integration of course has to do with pricing. As it looks now, we can get a platform that not only scales technically but also price-wise. Our hope is that we’ll soon have a platform that can be used for a whole different segment of customers: customers with smaller budgets and smaller needs for integration. That would truly democratize integration!

What’s different from BizTalk Services and why will it work this time?

Microsoft has always been great at backwards compatibility and has, from the start, thought about the hybrid scenarios when it comes to Azure. App Services is no different, and to us that is the main thing that separates this offer from what we have in BizTalk Services. The fact that by using App Services we can read a flat file from IBM WebSphere MQ on-premises, parse it in an Azure Logic App and send it to, for example, Salesforce basically without any coding is powerful! We can now implement solutions and requirements that we deal with today, solve our customers’ current needs using one platform and deliver value. BizTalk Services, however, never got that far and always felt like a bit of a step backwards and a subset of what we had in BizTalk Server.

So, it’s great to see how Microsoft this time actually has taken a step back and thought about what makes BizTalk Server so good and then tried to incorporate those pieces in the new platform.

What’s missing from App Services?

Experience, shared knowledge and trust

BizTalk Server has been around for ages and a lot of us have 10+ years of experience on the platform. We know exactly what works, what to avoid, what are good and what are bad design patterns – we’ve learned to trust the platform (and even deal with its less prettier parts).

In App Services we’ve only done simple demos and minor PoCs so far. How to solve a complex request-response, or how to implement scatter-gather (or any more complex pattern for that matter), is still very unclear. What happens when a destination service times out and a $10 000 order goes missing – will App Services watch my back the same way BizTalk Server has done so many times?

From what we have seen so far, and what’s already in Azure (for example Service Bus and Queues, etc.), many of the tools to solve the more complex scenarios are there – but the knowledge of which pieces to use when isn’t. At iBiz Solutions we will try hard to do our part in filling that knowledge gap, and considering the great community that surrounds Microsoft-based integration I’m sure we won’t be the only ones. ;)

Tooling

Logic App designer

As with any young platform, we’re missing tooling. It seems that the way to build more complex scenarios is to chain a number of more specific Logic Apps together. This, in combination with the missing ability to search for and promote values in incoming messages, will make it hard to see the status of a specific message, find a certain message and so on. Some sort of status overview and some sort of archiving with smart searching needs to happen for this to work in busier and more complex scenarios.

Debugging and testing is another area that currently feels a bit weak. One can see the input and output of each part of the Logic App, but it requires a lot of clicking around, and there’s no way of stepping through or replaying different parts, etc. I can’t really say that this is an area that’s particularly good in BizTalk Server either, but it’s something that one is always struggling with and that ends up consuming a lot of time.

ALM

One thing that’s extremely important in our BizTalk centric solutions today is how to handle the Application Lifecycle Management and the development process that it involves. Build servers, automated tests, versioning, test servers and production servers are standard in any BizTalk Server project today. How we can achieve similar workflows where we guarantee quality, versioning, efficient deployment and the possibility to always roll back to previous versions etc. isn’t today obvious and needs more work.

Conclusion

At iBiz Solutions we are excited to see Microsoft make another attempt at evolving the integration story! The fact that this time around we can see how we can solve today’s requirements on the future platform makes it even better! We are looking forward to GA, and even though the platform is far from done we feel it is in a much better place than we have been previously; we have already started talking Swagger and Swashbuckle with our customers. ;)

Posted in: •Integration  | Tagged: