Logic Apps Parameters vs ARM Parameters

I got a question about the difference between ARM template parameters and Logic App parameters and when each should be used, so that is what I'll try to explain in this post.

First of all, ARM template parameters are used with ARM templates, and an ARM template is used when deploying ARM-based resources to Azure; a Logic App is a resource that is deployed via ARM templates. The workflow definition language behind Logic Apps is very similar to ARM template language, and therefore it can be tricky to see the difference in the beginning.

So let's start off with ARM template parameters: where they are and how they work. If you are interested in more info about ARM templates, read more at https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-authoring-templates.

ARM Template Parameters

An ARM template with an empty Logic App looks as follows: two ARM template parameters, logicAppName and logicAppLocation, and one resource of type Microsoft.Logic/workflows. Inside the Microsoft.Logic/workflows resource, the Logic App parameters are found in the parameters object.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "",
      "metadata": {
        "description": "Name of the Logic App."
      }

    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    }
  },
  "variables": {
  },
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {},
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}

So again, the ARM template parameters are found here, containing a parameter named Flat_File_Encoding-SchemaName:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "Flat_File_Encoding-SchemaName": {
	  "type": "string",
	  "defaultValue": "TEST-INT0021.Intime.DailyStatistics"
	},

ARM template references use the syntax [parameters('armparam')], so accessing a parameter with the name Flat_File_Encoding-SchemaName would look like:

"name": "[parameters('Flat_File_Encoding-SchemaName')]"

This value is evaluated at deployment time, so if the ARM parameter looks like this (and no parameter file is used):

"Flat_File_Encoding-SchemaName": {
  "type": "string",
  "defaultValue": "TEST-INT0021.Intime.DailyStatistics"
}

The result would look like this after deployment:

Code View

"name": "TEST-INT0021.Intime.DailyStatistics"

Designer View

In the Designer View the value is already evaluated and easy to read for the operations team.

ARM Template Variables

This is an almost unknown feature that is used too little. Variables are where we can evaluate expressions for later use, which is really practical when we want to combine two parameter values or use functions to generate a value that shall be consistent across the Logic App.

To show how this works we will concatenate two parameters, which could be used for creating resource links etc. The simple Logic App will just be a Response action with a body set to the evaluated concat value.

The variables section can be found after the parameters:

    }
  },
  "variables": {
  },
  "resources": [

When adding a variable we can access the parameters, so combining two parameters with the concat function looks like:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "flowname": {
      "type": "string",
      "metadata": {
        "description": "Name of the flow"
      }
    }
  },
  "variables": {
    "combinedflowname" :  "[concat(parameters('logicAppName'),'-',parameters('flowname'))]"
  },
  "resources": [

So the variable is created and assigned a value; now we just need to use it, and the syntax for accessing the value is simple:

"Response": {
  "inputs": {
    "body": "[variables('combinedflowname')]",
    "statusCode": 200
  },
  "runAfter": {
   
  },
  "type": "Response"
}

The variables will, just like the ARM template parameters, be evaluated at deployment time, so when deployed the value will be displayed as text. If logicAppName was set to INT001-Example and flowname was set to Orders, the evaluated result of the concat function would be INT001-Example-Orders, and the deployed Logic App will look like:

Logic App Parameters

The Logic App parameters are found in the section below, under "parameters", here containing a parameter named "url".

"resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {},
        "parameters": {
			"url": {
	          		"defaultValue": "http://www.google.se/",
	         	 	"type": "String"
	        	}
			}
      }
    }
  ],

The Logic App parameters use a different syntax: @parameters('laparam'). Accessing a parameter with the name url would look like:

"uri": "@parameters('url')"

These parameters are evaluated at runtime, which means the expression is not changed at deployment or in the designer. Even after deployment, or after the first run, the code will always look like this; the value behind the parameter is only picked up when the Logic App runs. The result after deployment:

Code View

"uri": "@parameters('url')"

Designer View

But during runtime it will be evaluated, and if we check a run it will look like this:

Summary

It's good to know when and why to use the different types of parameters. I've seen a lot of overuse of Logic App parameters, and I just want to share the knowledge of how these are treated. We want to push static values to ARM template parameters so that it's easy to see the values when checking the Logic App in the designer to verify that everything is ok, since Logic App parameters will "hide" the value in Code View. It also makes it easy to switch values between DEV/TEST/PROD environments, since these often change between environments. Also make sure to use ARM template variables whenever you need to reuse a computed result over your Logic App.

With that said, when using reference objects for translations or lookups, Logic App parameters are the obvious choice since they are easy to access at runtime. We have seen this work successfully when keys are sent in the input data and then used in lookups: @parameters('myobjectparam')[triggerBody()['key']]
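
As a minimal sketch of that lookup pattern (the parameter name myobjectparam and the keys/values are made up for illustration), a Logic App parameter of type Object can act as a translation table and be indexed with a value from the trigger body:

"parameters": {
  "myobjectparam": {
    "type": "Object",
    "defaultValue": {
      "SE": "Sweden",
      "NO": "Norway"
    }
  }
},
...
"Compose": {
  "type": "Compose",
  "inputs": "@parameters('myobjectparam')[triggerBody()['key']]",
  "runAfter": {}
}

If the trigger body contains "key": "SE", the Compose action outputs "Sweden" at runtime.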

So make sure to use the appropriate type at the correct time, and I hope you got some more insight into how, when and where to use these three different constructs.

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps deployment template extractor trigger updates

Updates to the Logic App Template Extractor have come out; read more about it in the earlier post Logic Apps Deployment Template Extractor.

Recurrence properties on triggers automatically become ARM template parameters

With a Recurrence trigger we set an interval and a frequency, and we often want different values in dev/test and production: the trigger is a billable action, so the frequency is something we often change between environments. Therefore, any trigger that has recurrence properties such as Interval and Frequency now automatically gets them generated as parameters in the ARM template. Here we can see the standard Recurrence trigger:

And in the extracted ARM template we will automatically get:

"parameters": {
...
 "RecurrenceFrequency": {
      "type": "string",
      "defaultValue": "Day"
    },
    "RecurrenceInterval": {
      "type": "int",
      "defaultValue": 1
    }
...
  "triggers": {
    "Recurrence": {
      "recurrence": {
        "frequency": "[parameters('RecurrenceFrequency')]",
        "interval": "[parameters('RecurrenceInterval')]"
      },
      "type": "Recurrence"
    }
  },

File Connector and Base64 paths

The File Connector, amongst others, saves the folder path in Base64 format to make sure that the path is valid; for the File Connector this applies to the trigger and the List files action. The designers work with this without any problems, but when you want to automate the deployment this becomes something we need to know and understand how to handle.

Understanding the Base64 model

So let's start with understanding the Base64 model used in these actions/triggers. First we pick our path:

And if we then go in and look in the codeview:

"List_files_in_folder": {
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['filesystem_1']['connectionId']"
            }
        },
        "method": "get",
        "path": "/datasets/default/folders/@{encodeURIComponent('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')}"
    },
    "metadata": {
        "XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=": "\\\\server\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    },
    "runAfter": {},
    "type": "ApiConnection"
}

As we can see, the path we picked in the designer is in the metadata tag, and the name of the property is just a "random" name. The "random" name is not so random though; it's actually the Base64 representation of the path.

\\server\ARBETSORDER\OUT\XML_EXPORT\TO = decodeBase64('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')

The metadata tag is used in the GUI to present the path in text, and the actual value sent to the API Connection is found in the path property:

"path": "/datasets/default/folders/@{encodeURIComponent('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')}"

So in order to handle deployments, the path property is the actual value we need to change, but if we want the GUI to present this value we also need to update the metadata tag. To do this there is a handy function available in the ARM template language: [base64('somevalue')]. The full extract from the Deployment Template Creator handles this and looks like this:

"parameters": {
...
"List_files_in_folder-folderPath": {
      "type": "string",
      "defaultValue": "\\\\server\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    },
}
...
	"List_files_in_folder": {
	  "runAfter": {},
	  "metadata": {
	    "[base64(parameters('List_files_in_folder-folderPath'))]": "[parameters('List_files_in_folder-folderPath')]"
	  },
	  "type": "ApiConnection",
	  "inputs": {
	    "host": {
	      "connection": {
	        "name": "@parameters('$connections')['filesystem_1']['connectionId']"
	      }
	    },
	    "method": "get",
	    "path": "[concat('/datasets/default/folders/@{encodeURIComponent(''',base64(parameters('List_files_in_folder-folderPath')),''')}')]"
	  }
	}

Setting the parameter in the parameters file to a new value will then both make the designer present the path correctly and send the correct value to the File Connector.
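
For example, a parameters file entry pointing the Logic App at a test share could look like this (the server path is of course just an example):

"parameters": {
  "List_files_in_folder-folderPath": {
    "value": "\\\\testserver\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
  }
}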

Small fixes

There have also been a number of small fixes; the parameter handling now supports all kinds of types, so we can have objects, strings, integers etc.

Here is a sample of how parameters inside a Logic App of type Object will be handled:

 "parameters": {
    "ismanager": {
        "defaultValue": {
            "0": 572,
            "1": 571,
            "No": 572,
            "Yes": 571
        },
        "type": "Object"
    },

After generating the ARM template, the Logic App parameter will be pushed up to an ARM template parameter, making the ARM template look like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
	"parameters": {
		...
		paramismanager": {
		  "type": "Object",
		  "defaultValue": {
		    "0": 572,
		    "1": 571,
		    "No": 572,
		    "Yes": 571
		  }
		},
		....
	}
	"variables": {},
	"resources": [
	    {
	      "type": "Microsoft.Logic/workflows",
	      "apiVersion": "2016-06-01",
	      "name": "[parameters('logicAppName')]",
	      "location": "[parameters('logicAppLocation')]",
	      "dependsOn": [],
	      "properties": {
	        "definition": {
	          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
	          "contentVersion": "1.0.0.0",
	          "parameters": {
	            "ismanager": {
	              "defaultValue": "[parameters('paramismanager')]",
	              "type": "Object"
	            },
				...
			 }
          },
          "outputs": {}
        },
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}
				

Tools like the Logic App Template Creator help us focus on the fun and good parts: building great solutions for our customers.

I strongly recommend you to try it and help out evolving it!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps Variables and Do Until

Sometimes we have to handle a workload of undefined size; here I'm going to show how this can be solved with Do Until and variables.

The scenario is the following: we need to collect all users from a system that has about 14000 users. The problem is that due to API restrictions we can only collect about 150 users per call, and there is paging that we need to use, so we need to get page 1, then page 2 and so on with a page size of 150. In addition, there is a boolean called last_page that tells us if we are on the last page.

So how do we solve this in Logic Apps? For each cannot be used, and calling a function that would collect all users would take too long. Fortunately we can use the new variables actions in combination with the Do Until action. What we will do is create a variable to keep the page number, increment it after each GET of users, and then check if last_page is true; if it is, we end the Do Until.

Do Until

The Do Until action is a looping action that continues to execute until a condition is met. Fortunately Microsoft has provided some limits to make sure we don't end up in endless loops :) There are limits to make sure that the loop is not run more than x times (default is 60 times) or longer than a specified time (default timeout is 1 hour). But of course we will not need these since we will create good expressions, right? :)

Variable

The variables actions provide functionality to store and handle variables, currently only integer and float but hopefully soon also objects. There are three actions associated with variables:

  • increment - increments the value of a specified variable by a configurable amount
  • initialize - initializes a variable; sets its name, type (currently only integer or float) and a starting value
  • set - sets the specified variable to a specific value

So in order to use variables we need to initialize them via the initialize action, and if we want to change the value we can either increment it or set it with the appropriate action, and then use them later in expressions, or as in our case as part of the URL in the HTTP action. We get the value via the following simple code:

@variables('page')
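
As an illustration, the variable can be used directly in the HTTP action's URI to request the right page (the URL and query parameter names here are made up for the example):

"HTTP": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://example.com/api/users?page=@{variables('page')}&page_size=150"
  },
  "runAfter": {}
}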

Build the solution

So let's start adding these actions. When writing "variable" in the new action window, three actions show up. Let's start with initializing the variable.

Name it page and set the type to Integer and the value to 1, since it's the first page we should start with.

We will also add another variable to use in the condition for exiting the Do Until loop:

Then we add the Do Until action using the dountil flag, and we also change the limits: since 100 might be too low in the future we set the count to a really high number, 500, and set the timeout to 1 hour to make sure we can process all users.

After this we add the HTTP action and some other actions needed for processing; after that we add an increment page action to increment the value, and also a condition to check the last_page flag.

As the image shows, I've used increment by 2 and an @or statement in the condition. This is because I'm running two parallel executions to speed up the process, fetching 2 * 150 users per turn. So either of them could be the last page, and if one of them is, we set the dountil variable to 1 to exit the loop.

Now we have created a flow that will collect users until the system tells us we have collected them all.
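
In code view, a sketch of the resulting Do Until action could look like this (the action contents are omitted; the limits and the exit expression match the setup described above):

"Until": {
  "type": "Until",
  "expression": "@equals(variables('dountil'), 1)",
  "limit": {
    "count": 500,
    "timeout": "PT1H"
  },
  "actions": {
  },
  "runAfter": {}
}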

Thoughts

This is a good and useful pattern, but there are some considerations: when using Do Until loops we don't get the "pretty" logging that we get in For Each loops, where we can see all the runs; we just see the latest one.

Variables are really handy and easy to use, and they make it possible to keep track of indexes and such when building more complex workflows.

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Loops  •Variables  •Azure 


Logic Apps deployment template extractor

When you have developed your Logic App and it works well in the development environment, it's time to bring it to test and later on to production. If you have developed it in Visual Studio you will find this easy, as long as you don't use any linked resources like another Logic App, Azure Function, API Management or Integration Account. Otherwise the next phase starts: the phase where you manually break down the resource id's of these actions into concat statements with parameters and ARM template functions.

So what does it look like? Let's start with an innocent Workflow action, a call to another Logic App; in the designer it looks like this (both web designer and VS):

If we take a look at this from the code view, it looks as following:

"INT001_Work_Order": {
    "inputs": {
        "body": "@triggerBody()",
        "host": {
            "triggerName": "manual",
            "workflow": {
                "id": "/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Logic/workflows/INT001_Work_Order"
            }
        },
        "retryPolicy": {
            "type": "None"
        }
    },
    "runAfter": {},
    "type": "Workflow"
}

Id is a reference to the Logic App via its resource id, so the id in the field below points exactly to that other Logic App. Let's look into the field:

"/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Logic/workflows/INT001_Work_Order"

The hardcoded parts are the id of the subscription fake89f1-660a-4585-b4d8-bf5172cb7f70, the resource group name integration-we and the Logic App name INT001_Work_Order. In order to make this work in other subscriptions we need to remove the "hard codings" of subscription and resource group using ARM template language (read more at https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-authoring-templates). We will use the subscription() object to get the subscription we are executing in, and its id via the subscriptionId property, and the same for the resource group name via the resourceGroup() object and the name property; we then get the values for the subscription and resource group we are deploying to.

It will look like this (I assume the Logic App name will be the same in the next environment):

"[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Logic/workflows/INT001_Work_Order')]"

This can now be deployed to a new subscription, and if the INT001_Work_Order Logic App exists it will be referenced automatically; otherwise the deployment will fail.

The same goes for functions: if we add them, a lot of information is put into this id property, all of it "hard coded".

"TransformKNABContactToGeneric": {
    "inputs": {
        "body": "@body('HTTP_Webhook')",
        "function": {
            "id": "/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Web/sites/functionappname/functions/INT004-TransformContactToGeneric"
        }
    },
    "runAfter": {},
    "type": "Function"
}

It’s just as with the Logic App reference…

"/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Web/sites/functionappname/functions/INT004-TransformContactToGeneric"

As before we need to update it, to something like this. Note that I've added a parameter for the Function App (container) name, since this will be different in different environments, and to be on the safe side we added a parameter for the function name as well.

"[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Web/sites/', parameters('FunctionAppName'),'/functions/', parameters('FunctionName'),'')]"

API Management usage is the same, but here we have some more things to consider: the instance name and the subscription key (experience says it's easy to forget adding the parameters needed to get a smooth deployment).

"UploadAttachment": {
  "runAfter": {},
  "type": "ApiManagement",
  "inputs": {
    "api": { "id": "/subscriptions/FAKE9f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apiminstancename/apis/58985740990a990dd41e5392" },
    "body": {
      "attachment": {
        "data": "@{triggerBody()['data']}",
        "extension": "@{triggerBody()['extension']}",
        "filename": "@{triggerBody()['filename']}",
        "orderNumber": "@{triggerBody()['orderNumber']}"
      }
    },
    "method": "post",
    "pathTemplate": {
      "parameters": {},
      "template": "/api/v1/test/rest/UploadAttachment"
    },
    "subscriptionKey": "keytothesubscription"
  }
} 

So in this case we would need to do something like this to make a smooth deployment; note the parameters for instance name and api id, just to be safe.

 "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('apimResourceGroup'),'/providers/Microsoft.ApiManagement/service/', parameters('apimInstanceName'),'/apis/', parameters('apimApiId'),'')]"
 ..
 "subscriptionKey": "[parameters('subscriptionkey')]"

So as you can see, there are some things to do here, and if you, like me, are a fan of the web designer, the next step after development in order to get this into VSTS would be to start building the ARM template by hand. I've found this very time consuming and error prone; it's easy to make mistakes, and it takes time to test the deployment. It also often changes the Logic App a bit, which might not be so good if tests have already been done on the implementation.

So we want to automate all of this and more, i.e. pushing parameters in the Logic App up to the ARM template, making it possible to change the parameters with a parameter file between environments. We are using the Logic App Template Creator created by Jeff Hollan (https://github.com/jeffhollan/LogicAppTemplateCreator) to automate this; it's even possible to create the parameter file. With this we can extract the Logic App, set up parameters for TEST/PROD, add it to VSTS and start deploying directly. Note that gateways and their deployment are not supported yet, and if you find bugs, help out or report them so we can fix them together.

The Template Generator will automate this and do the following:

Logic App (workflow)

In addition to fixing the id to a generic concat path, it also adds a parameter at the top if the referenced Logic App is in another resource group.

Logic App in the same resource group (assumes that in the next environment it will be the same):

"INT001_Work_Order": {
  "runAfter": {},
  "type": "Workflow",
  "inputs": {
    "body": "@triggerBody()",
    "host": {
      "triggerName": "manual",
      "workflow": {
        "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Logic/workflows/INT001_Work_Order')]"
      }
    },
    "retryPolicy": {
      "type": "None"
    }
  }
}

Logic App not in the same resource group; a parameter is added for the resource group name:

"parameters": {
...
"INT0014-NewHires-ResourceGroup": {
      "type": "string",
      "defaultValue": "resourcegroupname2"
    }
...
}
...
"INT0014-NewHires": {
  "runAfter": {
    "Parse_JSON": [
      "Succeeded"
    ]
  },
  "type": "Workflow",
  "inputs": {
    "body": {
      "Completed_Date_On_or_After": "@{body('Parse_JSON')?['Completed_Date_On_or_After']}",
      "Completed_Date_On_or_Before": "@{body('Parse_JSON')?['Completed_Date_On_or_Before']}",
      "Event_Effective_Date_On_or_After": "@{body('Parse_JSON')?['Event_Effective_Date_On_or_After']}",
      "Event_Effective_Date_On_or_Before": "@{body('Parse_JSON')?['Event_Effective_Date_On_or_Before']}"
    },
    "host": {
      "triggerName": "manual",
      "workflow": {
        "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('INT0014-NewHires-ResourceGroup'),'/providers/Microsoft.Logic/workflows/INT0014-NewHires')]"
      }
    }
  }
}

Azure Function

Azure Functions are referenced via a Function App and, as the last part of the path, the function name, so both of these need to be parameters. We also have the same resource group handling here as with the Logic App reference.

"parameters": {
...
 "TransformContactToGeneric-FunctionApp": {
      "type": "string",
      "defaultValue": "functionAppName"
    },
    "TransformContactToGeneric-FunctionName": {
      "type": "string",
      "defaultValue": "INT004-TransContactToGeneric"
    }
...
 "TransformContactToGeneric": {
  "runAfter": {},
  "type": "Function",
  "inputs": {
    "body": "@body('HTTP_Webhook')",
    "function": {
      "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Web/sites/', parameters('TransformContactToGeneric-FunctionApp'),'/functions/', parameters('TransformContactToGeneric-FunctionName'),'')]"
    }
  }
}

API Management

API Management and its APIs are referenced in the same way as everything else, so we need the same handling. Here it's also important to be able to set the instance name and subscription key, since they will change in the next environment.

"parameters": {
...
"apimResourceGroup": {
      "type": "string",
      "defaultValue": "integration-we"
    },
    "apimInstanceName": {
      "type": "string",
      "defaultValue": "apiminstancename"
    },
    "apimApiId": {
      "type": "string",
      "defaultValue": "58778a86990a990f5c794e48"
    },
    "apimSubscriptionKey": {
      "type": "string",
      "defaultValue": "mysecretsubscriptionkey"
    }
...
"INT005_Add_Attachment": {
  "runAfter": {},
  "type": "ApiManagement",
  "inputs": {
    "api": {
      "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('apimResourceGroup'),'/providers/Microsoft.ApiManagement/service/', parameters('apimInstanceName'),'/apis/', parameters('apimApiId'),'')]"
    },
    "body": {
      "fileextension": "@{triggerBody()['fileextension']}",
      "filename": "@{triggerBody()?['filename']}",
      "image64": "@triggerBody()?['image64']",
      "workorderno": "@{triggerBody()?['workorderno']}"
    },
    "method": "post",
    "pathTemplate": {
      "parameters": {},
      "template": "/test/addattachment"
    },
    "subscriptionKey": "[parameters('apimSubscriptionKey')]"
  }
}

Integration Account

When working with Integration Accounts, these need to be set in the next environment as well, so if they are used there will be some parameters and settings added.

"parameters": {
...
"IntegrationAccountName": {
  "type": "string",
  "metadata": {
    "description": "Name of the Integration Account that should be connected to this Logic App."
  },
  "defaultValue": "myiaaccountname"
},
"IntegrationAccountResourceGroupName": {
  "type": "string",
  "metadata": {
    "description": "The resource group name that the Integration Account are in"
  },
  "defaultValue": "[resourceGroup().name]"
}
...
"integrationAccount": {
  "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourcegroups/',parameters('IntegrationAccountResourceGroupName'),'/providers/Microsoft.Logic/integrationAccounts/',parameters('IntegrationAccountName'))]"
}

Parameters

Parameters added inside the Logic App (for now via Code View) will be pushed up and added as ARM template parameters, so if you have this parameter inside your Logic App:

"definition": {
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "MyURL": {
      "defaultValue": "http://requestb.in",
      "type": "String"
    }
  },

After running the extractor, this parameter will be backed by an ARM template parameter, and its defaultValue will be set to the value of that ARM template parameter, so it can now be set at deployment time via the parameters file.

"parameters": {
...
"paramMyURL": {
  "type": "string",
  "defaultValue": "http://requestb.in"
},
...

"definition": {
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "MyURL": {
      "defaultValue": "[parameters('paramMyURL')]",
      "type": "String"
    }
  },

We are currently working on more, like trigger functionality to get frequency and interval settings promoted to parameters, and also functions to handle special triggers like file, where the folder path is Base64 encoded and built up in a special way.

Be aware that right now the gateway connectors strangely remove the user/pass if they are not supplied, so they will break; remove them manually from the template and create them manually for now. A fix is coming to not include them. But for now, hope this helps out!

Here is a sample of a PowerShell script that will take the Logic App, generate a template and create a parameter file:

#If you have problems with execution policy, execute this in a PowerShell window run as administrator:
#Set-ExecutionPolicy -ExecutionPolicy Unrestricted

#We have the module in a folder structure like this and it needs to be imported
Import-Module ".\LogicAppTemplateCreator\LogicAppTemplate.dll"

#Credentials will "flow" if you are logged in.


#Set the name of the Logic App
$logicappname = 'INT001_Work_Order'
#Set the resource group of the Logic App
$resourcegroupname = 'integration-we'
#Set the subscription id of where the Logic App is
$subscriptionid = 'fakeb73-d0ff-455d-a2bf-eae0b300696d'
#Set the tenant to use with the Logic App, make sure it's the right tenant
$tenant = 'ibiz-solutions.se'

#Set the output filenames
$filename = $logicappname + '.json'
$filenameparam = $logicappname + '.parameters.json'


Get-LogicAppTemplate -LogicApp $logicappname -ResourceGroup $resourcegroupname -SubscriptionId $subscriptionid -TenantName $tenant | Out-File $filename
#$filename = $PSScriptRoot + "\" + $filename
Get-ParameterTemplate -TemplateFile $filename | Out-File $filenameparam
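
To verify the extracted template it can then be deployed straight away; a minimal sketch using the AzureRM PowerShell module (the target resource group name here is just an example):

#Deploy the generated template together with the generated parameter file
New-AzureRmResourceGroupDeployment -ResourceGroupName 'integration-we-test' -TemplateFile $filename -TemplateParameterFile $filenameparam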

Sample of complete extraction INT001_Work_Order:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "INT001_Work_Order",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "apimResourceGroup": {
      "type": "string",
      "defaultValue": "integration-we"
    },
    "apimInstanceName": {
      "type": "string",
      "defaultValue": "instancename"
    },
    "apimApiId": {
      "type": "string",
      "defaultValue": "5833f8a9990a991064e0e128"
    },
    "apimSubscriptionKey": {
      "type": "string",
      "defaultValue": "fakekeyis here"
    },
    "apimApiId2": {
      "type": "string",
      "defaultValue": "58778a86990a990f5c794e48"
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {},
          "triggers": {
            "manual": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "schema": {}
              }
            }
          },
          "actions": {
            "Check_assigned_to": {
              "actions": {
                "Service_Order": {
                  "runAfter": {},
                  "type": "ApiManagement",
                  "inputs": {
                    "api": {
                      "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('apimResourceGroup'),'/providers/Microsoft.ApiManagement/service/', parameters('apimInstanceName'),'/apis/', parameters('apimApiId'),'')]"
                    },
                    "body": "@concat(replace(replace(string(triggerBody()),'<?xml version=\"1.0\" encoding=\"utf-8\"?>',''),'<workOrders ','<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">Submit</a:Action></s:Header><s:Body><workOrders xmlns=\"pv-work-order\" '),'</s:Body></s:Envelope>')",
                    "method": "post",
                    "pathTemplate": {
                      "parameters": {},
                      "template": "/external/Relacom/ServiceOrder/"
                    },
                    "subscriptionKey": "[parameters('apimSubscriptionKey')]"
                  }
                }
              },
              "runAfter": {
                "Compose": [
                  "Succeeded"
                ]
              },
              "expression": "@equals(outputs('Compose')['assigned'], 'RELACOM')",
              "type": "If"
            },
            "INT005_Add_Attachment": {
              "runAfter": {
                "INT001_Create_Order": [
                  "Succeeded"
                ]
              },
              "type": "Workflow",
              "inputs": {
                "body": {
                  "company": null,
                  "fileextension": "xml",
                  "filename": "@concat(concat(xpath(xml(triggerbody()), 'string(/*[local-name()=\"workOrders\"]/*[local-name()=\"workOrder\"]/*[local-name()=\"code\"])'),'_'),xpath(xml(triggerbody()), 'string(/*[local-name()=\"workOrders\"]/*[local-name()=\"header\"]/*[local-name()=\"messageDate\"])'))",
                  "id": null,
                  "image64": "@base64(triggerbody())",
                  "workorderno": "@xpath(xml(triggerbody()), 'string(/*[local-name()=\"workOrders\"]/*[local-name()=\"workOrder\"]/*[local-name()=\"code\"])')"
                },
                "host": {
                  "triggerName": "manual",
                  "workflow": {
                    "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Logic/workflows/INT005_Add_Attachment')]"
                  }
                }
              }
            },
            "Response": {
              "runAfter": {
                "INT005_Add_Attachment": [
                  "Succeeded"
                ]
              },
              "type": "Response",
              "inputs": {
                "body": "OK",
                "statusCode": 200
              }
            }
          },
          "outputs": {}
        },
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}

Sample of the parameters file generated:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "INT001_Work_Order"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "apimResourceGroup": {
      "value": "integration-we"
    },
    "apimInstanceName": {
      "value": "instancename"
    },
    "apimApiId": {
      "value": "5833f8a9990a991064e0e128"
    },
    "apimSubscriptionKey": {
      "value": "fakekeyis here"
    },
    "apimApiId2": {
      "value": "58778a86990a990f5c794e48"
    }
  }
}

As you can see, this handles all of the necessary concat setups and automatically adds parameters for the important things like the API Management instance name and subscription key, to make sure the parameters are not forgotten. This makes it better and easier to use the web designer rather than VS, since in VS you still need to do this work or export it afterwards (but then what is the point in using VS?). With this it's possible to just develop in the web designer, as I prefer, then extract the template, verify it and update it with parameters (preferably in the Logic App so you can re-extract the template to make sure all is working before the deployment) and put it into VSTS for deployment.

I strongly recommend you to try it.

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


API Management Backend Service with id “abc” not found

When deploying APIs to API Management via Git we can run into the error "could not find backend service with id abc"; this comes from the SOAP handling recently added to API Management.

If we take a look inside the Git repository, there is a new folder called "backend".

If we look inside, there are some folders named after the URLs.

Inside each of these folders we find a configuration.json that looks like this:

In one of these files you will find the id that matches the error message; make sure it is added in the commit. Done? If the API is created with this commit (i.e. deployed for the first time) the error will unfortunately persist; this is due to a minor bug. To solve it we can temporarily remove the set-backend policy, which in the SOAP case is found in "policies/api/{apiname}.xml".

File looks like:
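
A simplified sketch of such a policy file (the backend-id value is of course specific to your API):

<policies>
  <inbound>
    <base />
    <!-- this is the reference that must match an id in the backend folder -->
    <set-backend-service backend-id="abc" />
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
</policies>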

Temporarily remove the set-backend-service policy, create a commit and deploy the API; after that, add the backend service again and create a new commit. Now the deployment will work and the backend will be set correctly.

Posted in: •Azure API Management  •Integration  | Tagged: •API  •API Management  •Azure 


Understanding API Management Source Code

Developing in Azure API Management is done inside the Azure portal or in the publisher portal that comes with the API Management instance. Developing and testing is easy, but when it comes to moving the developed API to the next instance (test or prod) it becomes tricky.

In this demo instance I have two APIs and logic applied to handle authentication and formatting.

Where is the code?

When developing APIs in the API Management publisher portal you quickly learn that the API signature and the policy code are separated and handled in different areas of the portal: API and Policy. This separation is not as visible in the new API blade inside the Azure portal, where a lot of work has been done to make it easier to understand the APIs and see the whole picture in one place.

But the difference is quite important: the API is the "external" definition needed by clients to understand and call our API, and the Policy part is the logic that is executed when a call is made to the API.

Exporting the API: press Export and select "OpenAPI specification".

Inspecting the file shows that only the swagger definition is exported, not the policy logic added to the API.

We now try to extract the code using the Git option, found in the publisher portal under "Security" -> "Configuration repository".

First press "Save configuration to repository"; note that after the save is completed, the Git icon in the top right corner changes color to green.

Get the clone URL and note that the username is "apim"; scroll down and generate a password.

Now use the username and password to connect. Tip: if you are using PowerShell or another tool that requires the authentication inside the URL, remember to URL encode the password:

git clone https://{user}:{password}@ibizmalo.scm.azure-api.net
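
If you script this in PowerShell, the password can be URL encoded like this before it is put into the clone URL (the password here is made up):

#URL encode the generated password so special characters survive in the clone URL
$encodedPassword = [uri]::EscapeDataString('p@ssw/rd+123=')
git clone "https://apim:$encodedPassword@ibizmalo.scm.azure-api.net"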

Inspecting the content (two folders down), we find the source code!

All of this we can copy to the next instance, but there are some things to take into consideration: each group/product etc. has a unique ID, and if these are created manually in the instances they are not guaranteed to be the same, and the import won't work in the other instance. So import everything you need.

Import changes into a API Management instance

The next step is to import these files to another API Management instance, so to get started we need to clone the configuration from that API Management instance:

Copy the files and commit. After that we need to deploy the changes in the repository to the API Management instance, so once again we go to the publisher portal and "Security" -> "Configuration repository", scroll down and press "Deploy repository configuration", and the changes are applied.

Posted in: •Azure API Management  •Integration  | Tagged: •API  •API Management  •Azure 


Exposing SAP Gateway services with API Management

Exposing services like SAP Gateway is an important task for API Management, but not always so easy. Security is often tight around these products, so we need to understand it in order to get the setup correct. To get started, let's look at the setup we were facing: SAP Gateway is hosted inside the on-premises network behind several firewalls and hard restrictions, and API Management is hosted in Azure. In this case we had an ExpressRoute setup ready to be used, so we could deploy API Management inside a preexisting subnet. Firewall rules were set up based on the local IP of the API Management instance (according to the information I have, these should be static when deployed in a VNET).

API Management is deployed inside the network by setting the external network in the network tab to a subnet of the VNET that is connected to the ExpressRoute. Make sure that API Management is alone in this subnet; see the image for more information.

After this is done we set up the routing/firewall rules. To get the local IP of the API Management instance, we put up a VM with IIS and created an API inside API Management that called the standard IIS start page; after that we searched the IIS logs to get the local IP.

Now we can start creating the API, and I'll jump right into the policy settings of the API created to expose the SAP Gateway. The security model on SAP GW can vary, but in this case we had Basic Authentication as the authentication mode. Handling this is quite straightforward in API Management; there is a policy ready to be used, "Authenticate with Basic":

So we started off adding the authentication part; now the easy part is done. When calling the API we just got a 403 Forbidden response saying "Invalid X-CSRF-Token". Looking into this, we found that it is the anti-forgery setup of SAP Gateway. To handle it, a token and a cookie are needed, and these are retrieved via a successful GET call to the Gateway. The initial call uses the same URL (make sure that the GET operation is implemented so that the result is successful, i.e. returns 200 OK, otherwise the token is not valid). Since I had no access to the SAP GW without API Management, my tests are done from Postman via API Management to SAP GW. Adding the "X-CSRF-Token" header with the value Fetch will retrieve the token and cookie, making the call look like:

The response looks like:

The interesting parts are the headers and cookies, so let's have a look. Under Headers we find the X-CSRF-Token that we need to send in the subsequent request.

In the cookies we find two items, and we are interested in the one called "sap-XSRF…"; this is the anti cross-site request forgery cookie that is needed in the POST/PUT request to SAP Gateway.

Composing these makes a valid request look like this:

So now we can do these two requests to get a valid POST call into SAP Gateway from Postman; let's move on to setting this up in API Management. In API Management we don't want the client to know about this, or force them to implement the two calls and the functionality to copy headers and so on; we just want to expose a "normal" API. So we need to configure API Management to make these two calls to SAP Gateway within the same client call, in order to send a valid POST request to SAP Gateway.

In order to make this work we need to use policies, so after setting up the standard POST request to create a Service Order we go to the Policy tab.

In the beginning it looks something like this:

The first thing we need to do is add a send-request policy. I configured it with mode new and used the variable name fetchtokenresponse. Since retrieving the token is done with a GET request to the SAP Gateway, we reuse the same URL as the API (after rewrite), set the X-CSRF-Token header to Fetch since we are fetching the token, and add the Authorization header with the value for Basic authentication. So let's start with creating the call that does the GET token request; add this code in the inbound section:

<send-request mode="new" response-variable-name="fetchtokenresponse" timeout="10" ignore-error="false">
  <set-url>@(context.Request.Url.ToString())</set-url>
  <set-method>GET</set-method>
  <set-header name="X-CSRF-Token" exists-action="override">
    <value>Fetch</value>
  </set-header>
  <set-header name="Authorization" exists-action="override">
    <value>Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==</value>
  </set-header>
  <set-body></set-body>
</send-request>

The next step is to extract the values from the send-request operation and add them to our POST request. Setting the X-CSRF-Token header is fairly straightforward: we use the set-header policy and retrieve the header from the response variable. The code looks like:

<set-header name="X-CSRF-Token" exists-action="skip">
<value>@(((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
</set-header>

A bit trickier is the cookie: since there is no "standard" cookie handler, we need to implement some more logic. In the sample I provide I made a lot of assumptions about the cookie. We need the cookie starting with sap-XSRF, so I started by splitting all cookies on ';' and finding the cookie that contained "sap-XSRF". In our case it also had a domain part that I didn't need, so I removed it by splitting on ',' (comma) and used the result in a set-header policy.

<set-header name="Cookie" exists-action="skip">
<value>@{
string rawcookie = ((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("Set-Cookie");
string[] cookies = rawcookie.Split(';');
string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
return xsrftoken.Split(',')[1];
}
</value>
</set-header>

All in all the policy will look like:

<policies>
  <inbound>
    <base />
    <rewrite-uri template="sap/opu/odata/sap/ZCAV_AZURE_CS_ORDER_SRV/ItHeaderSet('{oid}')" />
    <send-request mode="new" response-variable-name="fetchtokenresponse" timeout="10" ignore-error="false">
      <set-url>@(context.Request.Url.ToString())</set-url>
      <set-method>GET</set-method>
      <set-header name="X-CSRF-Token" exists-action="override">
        <value>Fetch</value>
      </set-header>
      <set-header name="Authorization" exists-action="override">
        <value>Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==</value>
      </set-header>
      <set-body></set-body>
    </send-request>
    <set-header name="X-CSRF-Token" exists-action="skip">
      <value>@(((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
    </set-header>
    <set-header name="Cookie" exists-action="skip">
      <value>@{
        string rawcookie = ((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("Set-Cookie");
        string[] cookies = rawcookie.Split(';');
        string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
        return xsrftoken.Split(',')[1];
      }</value>
    </set-header>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>

The result is now complete; the client will assume that this is just a regular API like the others, and we can expose it with regular API Management API key security.

Wrap up

Exposing SAP Gateway is not a straightforward task, but after understanding the process and implementing the SAP Gateway complexity inside API Management, we can expose these functions just like any other API. I would suggest adding rate limits and quotas to protect the gateway from overflow. This scenario proves the value of API Management and provides possibilities to solve complex authentication and anti-forgery patterns, standardizing your own API facade with the same auth mechanism for the client without taking the backend security into consideration.

Full scenario:

Posted in: •Azure API Management  •Integration  •Uncategorized  | Tagged: •API Governance  •API Management  •SAP 


Connecting to Azure Subscriptions from VSTS for release management

When it comes to deployment and release management in VSTS, we need to connect to our Azure subscription.

This is done via Service Endpoints created inside VSTS. There are two ways of authenticating to Azure subscriptions: with a user account or with an AAD application. Typical scenarios for AAD applications are when the subscription is not in your tenant, or when you don't have access to the subscription with the appropriate role.

1 User Account: The subscription is accessible with your user account

When you use the same user account for logging in to VSTS and to your Azure subscription, the subscription is auto-discovered and can be picked under the headline "Available Azure Subscriptions".

  • Pick the Subscription
  • Press the Authorize button to make VSTS create the required authorization
    1. If you have the required access you can now start deploying to this subscription

2 AAD Application: The subscription is not accessible with your user account

If the VSTS user doesn't have access to the subscription, it will not be listed under "Available Azure Subscriptions", and we need to add it manually.

  • Press the Manage button
    1. A new tab is opened and you will come to the Services tab
  • Press "New Service Endpoint"
  • A new dialog is opened where you can create a connection to an existing accessible subscription just as before, but we want to create it based on a Service Principal, so press the "here" link
  • The dialog is transformed and we can now enter the full information from an AAD application:
    • Connection name: a name for the connection, e.g. "Customer A TEST"
    • Subscription ID: the GUID of the subscription
    • Subscription Name: a friendly, understandable name of the subscription; we often use the same as the Connection name, e.g. "Customer A TEST"
    • Service Principal Client ID: the AAD application Client ID
    • Service Principal Key: the AAD application key
    • Tenant ID: the GUID of the tenant
    This is how you find the values on an AAD application in Azure:
    • Tenant ID: the Directory ID, found in the Properties section of the Azure AD directory
    • Service Principal Client ID: the Application ID of the AAD application
    • Service Principal Key: a key on the Azure AD application, found under Keys; generate a new key (it is only visible once, right after save)
    1. Verify the connection. Tip: if the verification fails, make sure that the AAD application has at least "Contributor" rights on at least one resource group (not only on the subscription)
    2. Press OK

This service endpoint can now be found in the subscription list.
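
If you need to create the AAD application and service principal for such an endpoint, one way is the Azure CLI; a minimal sketch where the name and scope are examples, and the returned appId, password and tenant map to Service Principal Client ID, Service Principal Key and Tenant ID above:

az ad sp create-for-rbac --name "vsts-customer-a-test" --role Contributor --scopes "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"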

I prefer using the AAD application connection setup for production environments, just to make sure there are no "personal account" connections that can mess things up.

Posted in: •Integration  •Uncategorized  | Tagged:


Logic Apps and the Dynamics CRM Online Connector

An easy way of starting to integrate with your Dynamics CRM Online instance. It just takes a few minutes and some easy steps to get started!

Getting started with the Connector

Getting started with the connector is easy; all you need is an account on Dynamics CRM Online. Read more about how to connect and how to use the various actions/triggers etc. in the documentation:

https://azure.microsoft.com/sv-se/documentation/articles/connectors-create-api-crmonline/

Working with the Connector

It's fairly easy to get started with the connector, selecting data and creating/updating records, but as soon as the business requirements land on our table it gets a bit trickier. Often we need to handle scenarios where we link relations and/or assign ownership to the entity we are creating/updating. Then we need to know a little more about the setup in Dynamics CRM, and our best friend for this is the Settings and Customizations tab.


By then selecting Customize The System, a new window opens and it's now possible to verify the entity, check keys, relations etc.


Using Navigational properties

Navigational properties are used to link entities together; these can be set directly within the body of the update/insert request and will then be linked correctly inside Dynamics CRM Online. An example would look like this, where we set the currency (transactioncurrency) of the account using the navigational property _transactioncurrencyid_value:

 "body": {
"_transactioncurrencyid_value": "@{first(body('Get_intgrationMetadata')['value'])['_stq_defaultcurrency_value']}",
"accountnumber": "@{triggerBody()['data']['Company_ID']}",
"address2_city": "@{triggerBody()['data']['Town_city_state']}",
...
..
}

Setting owner

A frequently used operation is assigning an owner; this is now possible directly via the Update/Create operation instead of a separate operation as previously.

In the Dynamics CRM Online connector it's easy to set the owner; just apply the following fields: _ownerid_type, which is set to either systemusers or teams depending on the owner type, and _ownerid_value, which is the key to the user. An example below:

"body": {
"_ownerid_type": "systemusers",
"_ownerid_value": "@{first(body('Get_owner')['value'])['systemuserid']}"
}

Lessons learned

Don't overflow the CRM! Since Logic Apps is really powerful in parallelism, it's good to have some sort of control over how many instances are created and executed against Dynamics CRM. We usually try to make sure that there are no more than 200 parallel actions towards a CRM at any given time.

Learn how to check fields, properties and keys, since you will get stuck on errors when sending in the wrong type, and then you will need to check what type it is. OptionSets are commonly used and good from a GUI perspective, but not as good in integration, since they are translated to numbers that we on the integration side often need to translate to a code or text; learning how to check the values inside CRM will speed up this process.
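
One way we handle such OptionSet translations is a Logic App parameter of type Object used as a lookup table (the parameter name, values and the Get_record action here are made up for illustration):

"parameters": {
  "ordertypemap": {
    "type": "Object",
    "defaultValue": {
      "100000000": "Standard",
      "100000001": "Express"
    }
  }
}
...
"@parameters('ordertypemap')[string(body('Get_record')?['ordertype'])]"

The string() conversion is needed since the OptionSet value comes back as a number, while the lookup keys in the object are strings.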

When we started using the connector there were problems with assigning ownership and handling float numbers; these were fixed early by the product group, and since then we haven't found any issues with the way we are using the connector.

Posted in: •Integration  •Logic Apps  | Tagged:


Validate incoming HTTP request messages in BizTalk

One of BizTalk's greatest strengths is that messages are not lost. Even if there is an error processing a message, we still get a suspended message that an administrator can handle manually. Sometimes, though, that is not the preferred way of handling errors. When working with web APIs it is common to let the client handle any errors by returning a status code indicating whether the request was successful or not, especially if the error is in the client request.

There is a standard pipeline component included in BizTalk that can validate any incoming XML message against its XSD schema. However, if the validation fails the request will only get suspended; all the client receives is a timeout, and an administrator will have to handle the suspended message even though there is not much to do about it since the client connection is closed.

One way of handling this is to do the validation in an orchestration, catch any validation errors and create a response message to return, but writing an orchestration just for validation doesn't make much sense given the performance implications.

A better solution would be if we could:

  • Abort the processing of the request as early as possible if there is a validation error.
  • Leverage the existing component for validating messages.
  • Return a HTTP 400 status code to indicate to the client that the request was bad.
  • Avoid suspending the message in BizTalk.

Since the receive pipeline is the earliest stage we can inject custom code in, a custom pipeline component would be the best fit.

The pipeline component would need to execute the standard XmlValidator component and catch any validation error thrown by the validator. We could of course log any validation errors to the event log if we still would like some error logging.

If a validation error was caught, we need to create a new message with the validation error details so the client understands what is wrong.

The context property OutboundHttpStatusCode should be set to 400 so that the adapter knows to return the message with a status code of 400 bad request.

To prevent any further processing of the message and indicate to BizTalk that the new message is the response to return to the client, a number of context properties related to request response messages need to be set. Correlation tokens are copied from the request message to make the response go through the same receive port instance.

This component is available as a part of iBiz Solutions' open source initiative. The source code is available on GitHub and the component is also available on NuGet.

Posted in: •Integration  | Tagged: •BizTalk  •Open Source  •Pipeline Components