API Management DTAP Pipeline with VSTS

In order to handle a full lifecycle, we need to be able to export APIs from our API Management Development instance to the Test instance, to an Acceptance Test instance if we have one, and later on to the Production instance.

There are several ways of doing this and here we will go through the Git option. Out of the box the API Management instance comes with the ability to export its configuration to a dedicated Git repository (it comes with the product, read more about it here). You can then use this saved state to restore your API instance or export APIs to other instances.

Note that the Git export will not include users, Named Values (properties) or subscriptions, and if Named Values (properties) are used these need to be added manually before importing to the new instance.

The workflow is simple: we develop new APIs and update old ones in the Development instance first, then the configuration is saved to the Git repository and a build is initiated in VSTS to publish the artifacts for use during deployment. Release/deployment to the next instance is then made by cloning that instance's repository, merging the changes, and replacing URLs and so on with PowerShell scripts.

The Build Step

The build is all about creating artifacts for deployment to the next instance.

We do this by setting up a new build in VSTS; for more information about how to do this in VSTS follow this link.

The build is rather small. We create a new build with an Empty Process, pick "Remote Repo" and create a new Service Connection in the popup window that shows up; here you provide the credentials found on your APIM instance, and don't forget to generate the password (annoyingly, it expires every 30 days and needs to be replaced). Read more about how to find credentials and URLs for the Git repository here.

The next step is to publish the artifacts. Here we have the possibility to either push everything or add masks to prevent saving unnecessary settings. In this image we show how masking is used to only publish artifacts for a subset of APIs in the API Management instance (isolating the deployment to these APIs).

In order to create these masks you need to understand the Git structure; read more about how to understand the APIM source code in my earlier post here.

After a build has been initiated the files related to the API are published and ready to be used. As you can see we want to make sure that products, groups and policies are moved as well, so that we can manage the base set from the Development instance (since we merge, any special products or groups added in later instances will be left untouched).

The build part is now done and we have our artifacts published and ready to use with our releases.

The Release Steps

As stated in the beginning, the release steps also work with Git. We will clone the next instance's Git repository, merge the changes and push it all back to the same Git repository it came from; thereafter we initiate a deployment from the Git repository to the API Management instance.

So how do we get the next instance's source code and merge it? Using PowerShell!

The release definition is actually only PowerShell and looks like this.

Cloning the target APIM Instance GIT repository

To make this simpler we have created a few PowerShell scripts that help out with these steps; the first one clones the repository. (Tip: we use a shared repository for our PowerShell scripts and then link it to the release via Artifacts -> Link an artifact source.)

param([string] $apimname, [string] $urlencodedkey, [string] $user = "apim", [string] $copyfrompath = ".\APIM\*")

"Cloning repository"

#create the paths
$gitpath = $apimname + ".scm.azure-api.net"
$destpath = ".\" + $gitpath + "\api-management\"

#build the clone URL with the credentials embedded
$url = "https://" + $user + ":" + $urlencodedkey + "@" + $apimname + ".scm.azure-api.net/"
$url

git clone $url -q

"Start Copy files"

#copy the build artifacts into the cloned repository
Copy-Item -Path $copyfrompath -Destination $destpath -Recurse -Force

"Files copied"

As you can see from the script above we just need some parameters to get started; we provide these via the Arguments textbox on the PowerShell script step, and for easy reuse we use variables. Note that in this script the Git key needs to be URL encoded when provided as a variable.

-apimname $(apiname) -urlencodedkey $(apikey)  -copyfrompath ".\INT001 Work Orders APIM\APIM\*"
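Since the key often contains characters that are not URL safe, a one-liner can produce the encoded value; a minimal sketch (the raw key below is a made-up placeholder):

# Hypothetical raw key copied from the APIM Git credentials blade
$rawkey = "xxxx/yyyy+zzzz=="

# URL encode it so it is safe to embed in the clone URL
[uri]::EscapeDataString($rawkey)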

The copyfrompath can be found via the browse button on the script path setting, pointing to where you want to start copying files from. (The image shows a sample.)

For easy reuse we provide the variables at the environment level (it is then easy to clone the environment or the whole release; we are 80% done by just changing the variables, search paths and replacement scripts).

Changing URL’s and other values in the files

When the code is extracted we need to do some manipulation; almost always it's changing the URL plus some small adjustments. There are several ways of doing this, but we have found that the easiest way is to just use a regex with the replace function. Here is a commonly used PowerShell script for replacing text via regex expressions, and a sample of how we use it with parameters.

param([string] $filepath, [string] $regex, [string] $value)

#read the file, apply the regex replacement and write it back
(Get-Content $filepath) -replace $regex, $value | Out-File $filepath

The arguments provided need to point to the exact file location (on the disk where the release agent can find it) and the regex to what it should match. In the sample below we change the URL in the backend setting for the SOAP API we are using to the one matching the new instance.

-filepath "$(SYSTEM.ARTIFACTSDIRECTORY)\$(apiname).scm.azure-api.net\api-management\backends\BizTalkServiceInstance_Soa107AV4D\configuration.json" -regex '"url":\W*[0-9a-zA-Z:/\."]*' -value '"url": "https://newurlformysoapservice.com"'

The $(SYSTEM.ARTIFACTSDIRECTORY) variable is vital for finding the correct path, since it contains the path to the folder where we can find the file.

This is then repeated until all URLs or other settings that need to be changed are handled.
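If many files need the same treatment, a small wrapper around the script above can loop over a list of replacements instead of adding one release step per file; a minimal sketch (paths and patterns are placeholders):

# Hypothetical list of replacements: file path, regex and new value
$replacements = @(
    @{ filepath = ".\api-management\backends\MyBackend\configuration.json";
       regex = '"url":\W*[0-9a-zA-Z:/\."]*';
       value = '"url": "https://newurl.example.com"' }
)

foreach ($r in $replacements) {
    (Get-Content $r.filepath) -replace $r.regex, $r.value | Out-File $r.filepath
}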

Create a commit and push the changes

When all changes have been made we need to push them to the targeted APIM instance's Git repository. This is also done via PowerShell, and here is the script we are using for this.

param([string] $apimname)
#create the paths
$gitpath = $apimname + ".scm.azure-api.net"

#Part publish Start
"Start Publishing"

#move to git folder
cd $gitpath

#set a git user
git config user.email "deployment@vsts.se"
git config user.name "Visual Studio Team Services"

#add all new files
git add *

#commit and push
git commit -m "Deployment updates"
git push -q

"Publish done"

#return to original path
cd ..

As you can see this script is simpler since all Git settings are prepared already; all it does is commit locally and then push to the remote repository. When all this is done it's easy to clone and reuse on a new instance/environment or in a new release step for another API.

Deploy the changes from GIT to the APIM instance

Since we often want to have control over the deployment we normally do the deployment step manually, entering the Azure portal, browsing to the APIM instance, going to the Repositories blade and pressing the Deploy to API Management button. But this can also be automated via PowerShell or the REST API.
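As a sketch of what the automated variant could look like with the AzureRM PowerShell module (instance and resource group names are placeholders, and assuming you are already logged in):

# Get a context for the target APIM instance
$context = New-AzureRmApiManagementContext -ResourceGroupName "my-resource-group" -ServiceName "my-apim-instance"

# Deploy the configuration from the master branch of the APIM Git repository
Publish-AzureRmApiManagementTenantGitConfiguration -Context $context -Branch "master" -PassThru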

Summary

Our experience of the Git deployment setup is good, although some minor issues have been found with new functionality, as when the backend part was added with the SOAP support. But that is only natural, and the product group has been really helpful in finding workarounds and fixing the core problems.

With that said there is room for improvement in the experience. Using regex to replace URLs is not optimal, and parameter files would improve the experience and make it easier to automate URL changes and so on. The closest thing we have is Named Values, but these can't be used everywhere. Another downside is that these values need to be provided manually before deployment, otherwise it fails with a cryptic error; there is also poor visibility of what URL an API is actually using, and it's not much fun having to run traces just to make sure the URL is properly set.

This option is working well for us, but we are looking forward to the upcoming ARM template support to see if we can get better and easier ways of handling parameters.

Posted in: •Azure API Management  •Integration  | Tagged: •DTAP  •API Management  •Deployments  •Azure  •GIT 


Logic Apps deployment template extractor Connections update

An exciting update is now released. Those of you who have used the Logic Apps Deployment Template Creator know that the support for connections has been poor, but now we have released an update that handles all connections. Regardless of whether they are connected via a gateway or not, the needed parameters are generated and in the end populated into the parameters file. All parameters still need to be set manually, but automating this is the next step in this area. The files are then redeployable right away, and we can move them to VSTS for source control and for setting up release pipelines.

API Connection samples:

First off we take a SQL Azure instance and create a Logic App that lists the contents of a table (a simple start).

Extracting this Logic App with the Template Creator and PowerShell generates the output below (read more about how to extract Logic Apps with PowerShell and the Deployment Template Creator here).
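As a reference, the extraction itself is a couple of cmdlet calls once the module is imported; a minimal sketch (paths, subscription id and tenant name are placeholders):

# Load the Logic App Template Creator module
Import-Module ".\LogicAppTemplate.dll"

# Extract the Logic App to an ARM template, then generate a matching parameter file
Get-LogicAppTemplate -LogicApp "SQLAzure" -ResourceGroup "my-resource-group" -SubscriptionId "my-subscription-id" -TenantName "mytenant.onmicrosoft.com" | Out-File ".\SQLAzure.json"
Get-ParameterTemplate -TemplateFile ".\SQLAzure.json" | Out-File ".\SQLAzure.parameters.json"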

Parameters: all Azure SQL connection parameters are extracted and presented; all but the securestring parameters (username and password) are autopopulated with the values from the exported Logic App and API Connection.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "SQLAzure"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "sql-1_name": {
      "value": "sql-1"
    },
    "sql-1_displayName": {
      "value": "SQL Azure"
    },
    "sql-1_server": {
      "value": "dummyserverone.database.windows.net"
    },
    "sql-1_database": {
      "value": "dummydatabase"
    },
    "sql-1_username": {
      "value": ""
    },
    "sql-1_password": {
      "value": ""
    }
  }
}

Logic App:

 {
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "SQLAzure",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "[resourceGroup().location]",
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "sql-1_name": {
      "type": "string",
      "defaultValue": "sql-1"
    },
    "sql-1_displayName": {
      "type": "string",
      "defaultValue": "SQL Azure"
    },
    "sql-1_server": {
      "type": "string",
      "defaultValue": "dummyserverone.database.windows.net",
      "metadata": {
        "description": "SQL server name"
      }
    },
    "sql-1_database": {
      "type": "string",
      "defaultValue": "dummydatabase",
      "metadata": {
        "description": "SQL database name"
      }
    },
    "sql-1_username": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Username credential"
      }
    },
    "sql-1_password": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Password credential"
      }
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/connections', parameters('sql-1_name'))]"
      ],
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {
            "$connections": {
              "defaultValue": {},
              "type": "Object"
            }
          },
          "triggers": {
            "manual": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "schema": {}
              }
            }
          },
          "actions": {
            "Get_rows": {
              "runAfter": {},
              "type": "ApiConnection",
              "inputs": {
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['sql_1']['connectionId']"
                  }
                },
                "method": "get",
                "path": "/datasets/default/tables/@{encodeURIComponent(encodeURIComponent('[SalesLT].[Customer]'))}/items"
              }
            }
          },
          "outputs": {}
        },
        "parameters": {
          "$connections": {
            "value": {
              "sql_1": {
                "id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/sql')]",
                "connectionId": "[resourceId('Microsoft.Web/connections', parameters('sql-1_name'))]",
                "connectionName": "[parameters('sql-1_name')]"
              }
            }
          }
        }
      }
    },
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('sql-1_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
        },
        "displayName": "[parameters('sql-1_displayName')]",
        "parameterValues": {
          "server": "[parameters('sql-1_server')]",
          "database": "[parameters('sql-1_database')]",
          "username": "[parameters('sql-1_username')]",
          "password": "[parameters('sql-1_password')]"
        }
      }
    }
  ],
  "outputs": {}
}

If you add the SQL username and password this will be ready for deployment, creating/updating the API Connection and creating/updating the Logic App.
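Deploying is then a standard ARM template deployment; a minimal sketch with the AzureRM cmdlets (the resource group name is a placeholder):

New-AzureRmResourceGroupDeployment -ResourceGroupName "my-resource-group" `
    -TemplateFile ".\SQLAzure.json" `
    -TemplateParameterFile ".\SQLAzure.parameters.json"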

Redeploying this with username/password while changing the Logic App name to SQLAzureFromVS and the API Connection name to sql-2 will create a new Logic App and a new API Connection and link them, so they are ready to be used right away!

And the new API Connection is shown in the API Connections blade.

If this had been a SQL connector using a gateway there would be some differences, but only in the API Connection and the parameters file. The API Connection will then have a gateway object in the parameterValues object.

{
  "type": "Microsoft.Web/connections",
  "apiVersion": "2016-06-01",
  "location": "[parameters('logicAppLocation')]",
  "name": "[parameters('sql')]",
  "properties": {
    "api": {
      "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
    },
    "displayName": "[parameters('sqldisplayName')]",
    "parameterValues": {
      "server": "[parameters('sql_server')]",
      "database": "[parameters('sql_database')]",
      "authType": "[parameters('sql_authType')]",
      "username": "[parameters('sql_username')]",
      "password": "[parameters('sql_password')]",
      "gateway": {
        "id": "[concat('subscriptions/',subscription().subscriptionId,'/resourceGroups/',parameters('sql_gatewayresourcegroup'),'/providers/Microsoft.Web/connectionGateways/',parameters('sql_gatewayname'))]"
      }
    }
  }
}

And the parameter file will contain the two new parameters: sql_gatewayname, which should contain the name of the gateway, and sql_gatewayresourcegroup, which should contain the resource group where the gateway is deployed.

"sql_gatewayname": {
  "type": "string",
  "defaultValue": "Malos-LogicApp2015"
},
"sql_gatewayresourcegroup": {
  "type": "string",
  "defaultValue": "OnPremDataGateway"
}

As above, set the credentials, change the database settings to your new ones, point out the gateway via name and resource group, and we are good to go.

Mixing multiple connections is not a problem either. Here is a sample of how it looks when using both the Storage and Service Bus connectors: a simple sample below saving to storage and then publishing on a Service Bus queue.

As you can see below, the extractor will generate the needed connections and their parameters.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "ServicebusAndStorage",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "[resourceGroup().location]",
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "azureblob_name": {
      "type": "string",
      "defaultValue": "azureblob"
    },
    "azureblob_displayName": {
      "type": "string",
      "defaultValue": "dummy storage"
    },
    "azureblob_accountName": {
      "type": "string",
      "defaultValue": "dymmystorage",
      "metadata": {
        "description": "Name of the storage account the connector should use."
      }
    },
    "azureblob_accessKey": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Specify a valid primary/secondary storage account access key."
      }
    },
    "servicebus_name": {
      "type": "string",
      "defaultValue": "servicebus"
    },
    "servicebus_displayName": {
      "type": "string",
      "defaultValue": "dummy service bus"
    },
    "servicebus_connectionString": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Azure Service Bus Connection String"
      }
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/connections', parameters('azureblob_name'))]",
        "[resourceId('Microsoft.Web/connections', parameters('servicebus_name'))]"
      ],
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {
            "$connections": {
              "defaultValue": {},
              "type": "Object"
            }
          },
          "triggers": {
            "manual": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "schema": {}
              }
            }
          },
          "actions": {
            "Create_blob": {
              "runAfter": {},
              "type": "ApiConnection",
              "inputs": {
                "body": "hej hej",
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['azureblob']['connectionId']"
                  }
                },
                "method": "post",
                "path": "/datasets/default/files",
                "queries": {
                  "folderPath": "/orders",
                  "name": "@{guid()}.txt"
                }
              }
            },
            "Send_message": {
              "runAfter": {
                "Create_blob": [
                  "Succeeded"
                ]
              },
              "type": "ApiConnection",
              "inputs": {
                "body": {
                  "ContentData": "@{base64('hej hej')}",
                  "ContentType": "text"
                },
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['servicebus']['connectionId']"
                  }
                },
                "method": "post",
                "path": "/@{encodeURIComponent('ordersqueue')}/messages",
                "queries": {
                  "systemProperties": "None"
                }
              }
            }
          },
          "outputs": {}
        },
        "parameters": {
          "$connections": {
            "value": {
              "azureblob": {
                "id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/azureblob')]",
                "connectionId": "[resourceId('Microsoft.Web/connections', parameters('azureblob_name'))]",
                "connectionName": "[parameters('azureblob_name')]"
              },
              "servicebus": {
                "id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/servicebus')]",
                "connectionId": "[resourceId('Microsoft.Web/connections', parameters('servicebus_name'))]",
                "connectionName": "[parameters('servicebus_name')]"
              }
            }
          }
        }
      }
    },
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('servicebus_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'servicebus')]"
        },
        "displayName": "[parameters('servicebus_displayName')]",
        "parameterValues": {
          "connectionString": "[parameters('servicebus_connectionString')]"
        }
      }
    },
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('azureblob_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'azureblob')]"
        },
        "displayName": "[parameters('azureblob_displayName')]",
        "parameterValues": {
          "accountName": "[parameters('azureblob_accountName')]",
          "accessKey": "[parameters('azureblob_accessKey')]"
        }
      }
    }
  ],
  "outputs": {}
}

The parameter file will look like the following; only the access key for blob storage and the Service Bus connection string are not autopopulated with current values.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "ServicebusAndStorage"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "azureblob_name": {
      "value": "azureblob"
    },
    "azureblob_displayName": {
      "value": "dummy storage"
    },
    "azureblob_accountName": {
      "value": "dymmystorage"
    },
    "azureblob_accessKey": {
      "value": ""
    },
    "servicebus_name": {
      "value": "servicebus"
    },
    "servicebus_displayName": {
      "value": "dummy service bus"
    },
    "servicebus_connectionString": {
      "value": ""
    }
  }
}

Key Vault integration

A small step has been taken for Key Vault integration; read more about using Key Vault with ARM deployments here. The Get-ParameterTemplate operation has a new parameter, -KeyVault, which can be either "None" or "Static". When used with Static, as in the example code below, a static reference will be generated for Key Vault integration. When deployed, the value stored in the secret will be used as the parameter value, separating secrets from deployment templates.

Get-ParameterTemplate -TemplateFile $filename -KeyVault Static | Out-File $filenameparam

With one of our earlier samples it will look like this when used with Static:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "SQLInsertOnPrem"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "sql_name": {
      "value": "sql"
    },
    "sql_displayName": {
      "value": "SQL server OnPrem"
    },
    "sql_server": {
      "value": "."
    },
    "sql_database": {
      "value": "InvoiceDatabase"
    },
    "sql_authType": {
      "value": "windows"
    },
    "sql_username": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.KeyVault/vaults/{vault-name}"
        },
        "secretName": "sql_username"
      }
    },
    "sql_password": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.KeyVault/vaults/{vault-name}"
        },
        "secretName": "sql_password"
      }
    },
    "sql_gatewayname": {
      "value": "Malos-LogicApp2015"
    },
    "sql_gatewayresourcegroup": {
      "value": "OnPremDataGateway"
    }
  }
}

Replace the {bracketed} values with the correct ones. The secret name identifies the secret in the vault and can also be replaced; for simplicity we generate it with the same name as the parameter.

{subscriptionid} = your subscriptionid

{resourcegroupname} = the resourcegroup where the keyvault is deployed

{vault-name} = the name of the vault

Or just copy the resource id found on the Properties blade of the Key Vault, as the image shows.
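The secrets themselves must of course exist in the vault before deploying; a minimal sketch for creating them with the AzureRM cmdlets (vault name and values are placeholders):

# Create the secrets the static references point to
Set-AzureRmKeyVaultSecret -VaultName "my-vault-name" -Name "sql_username" `
    -SecretValue (ConvertTo-SecureString "mysqlusername" -AsPlainText -Force)
Set-AzureRmKeyVaultSecret -VaultName "my-vault-name" -Name "sql_password" `
    -SecretValue (ConvertTo-SecureString "mysqlpassword" -AsPlainText -Force)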

We think these features have really improved the experience and ease of use of the Logic App Template Creator, and I hope you like it.

I strongly recommend you to try it and help out evolving it; more updates are coming, so stay tuned!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps deployment template extractor updates June

Another update to the Logic App Template Extractor is out; read more about the Logic App Template Extractor in the earlier post Logic Apps Deployment Template Extractor.

This update focuses on usage of Integration Account actions and the standard HTTP action. With this update it's easier to create a standard template for multiple flows; we often see the same patterns again and again, and this update makes it easy to create a standard flat file/XML transform pattern that is reusable in multiple flows with just different parameter files, as the deployment sketch below shows.
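A minimal sketch of that reuse: the same extracted template deployed twice, once per integration, with only the parameter file differing (all names below are placeholders):

New-AzureRmResourceGroupDeployment -ResourceGroupName "integration-test" `
    -TemplateFile ".\FlatFileTransformPattern.json" `
    -TemplateParameterFile ".\INT0021.parameters.json"

New-AzureRmResourceGroupDeployment -ResourceGroupName "integration-test" `
    -TemplateFile ".\FlatFileTransformPattern.json" `
    -TemplateParameterFile ".\INT0022.parameters.json"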

Flat File Decode and Encode

The Flat File Decoding and Encoding actions contain a schema property that holds the name of the schema; this is automatically extracted as an ARM template parameter so it can easily be set via a parameter file.

And in the extracted ARM template we will automatically get:

"parameters": {
...
  "Flat_File_Decoding-SchemaName": {
      "type": "string",
      "defaultValue": "INT0021.SystemA.DailyStatistics"
    },
    "Flat_File_Encoding-SchemaName": {
      "type": "string",
      "defaultValue": "INT0021.SystemB.DailyStatistics"
    }
...
  "actions": {
    "Flat_File_Decoding": {
      "runAfter": {},
      "type": "FlatFileDecoding",
      "inputs": {
        "content": "@{triggerBody()}",
        "integrationAccount": {
          "schema": {
            "name": "[parameters('Flat_File_Decoding-SchemaName')]"
          }
        }
      }
    },
    "Flat_File_Encoding": {
      "runAfter": {
        "Transform_XML": [
          "Succeeded"
        ]
      },
      "type": "FlatFileEncoding",
      "inputs": {
        "content": "@{body('Transform_XML')}",
        "integrationAccount": {
          "schema": {
            "name": "[parameters('Flat_File_Encoding-SchemaName')]"
          }
        }
      }
    },

XML Transform

The XML Transform action contains a map property that holds the name of the map; this is automatically extracted as an ARM template parameter so it can easily be set via a parameter file.

And in the extracted ARM template we will automatically get:

"parameters": {
...
 "Transform_XML-MapName": {
  "type": "string",
  "defaultValue": "INT0021.DailyStatistics.SystemA.To.SystemB"
 }
...
  "actions": {
  	....
    ,
    "Transform_XML": {
      "runAfter": {
        "Flat_File_Decoding": [
          "Succeeded"
        ]
      },
      "type": "Xslt",
      "inputs": {
        "content": "@{body('Flat_File_Decoding')}",
        "integrationAccount": {
          "map": {
            "name": "[parameters('Transform_XML-MapName')]"
          }
        }
      }
    }

Http

The HTTP action is widely used and common in all types of setups; normally, moving from Dev to Test involves changes to both the URI and the authentication parameters. These are now moved out as ARM template parameters to make it easier to deploy changes between environments.

The basic case, promoting the URI

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 }
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {	   
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 
 

Later on you will probably use some sort of authentication, and the extractor will push these settings to ARM template parameters as well.

Http Basic Authentication

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 },
 "HTTP-Password": {
  "type": "string",
  "defaultValue": "myusername"
 },
 "HTTP-Username": {
  "type": "string",
  "defaultValue": "mypassword"
 }
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {
	  	"authentication": {
          "password": "[parameters('HTTP-Password')]",
          "type": "Basic",
          "username": "[parameters('HTTP-Username')]"
        },	   
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 

Http Client Certificate Authentication

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 },
 "HTTP-Password": {
  "type": "string",
  "defaultValue": "myusername"
 },
 "HTTP-Pfx": {
  "type": "string",
  "defaultValue": "mypfx"
 }
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {	   
	  	"authentication": {
          "password": "[parameters('HTTP-Password')]",
          "pfx": "[parameters('HTTP-Pfx')]",
          "type": "ClientCertificate"
        },
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 

Http Active Directory OAuth

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 },
 "HTTP-Audience": {
  "type": "string",
  "defaultValue": "myaudience"
 },
 "HTTP-Authority": {
  "type": "string",
  "defaultValue": "https://login.microsoft.com/my"
 },
 "HTTP-ClientId": {
  "type": "string",
  "defaultValue": "myclientid"
 },
 "HTTP-Secret": {
  "type": "string",
  "defaultValue": "https://login.microsoft.com/my"
 },
 "HTTP-Tenant": {
  "type": "string",
  "defaultValue": "mytenant"
 },
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {
	  	"authentication": {
          "audience": "[parameters('HTTP-Audience')]",
          "authority": "[parameters('HTTP-Authority')]",
          "clientId": "[parameters('HTTP-ClientId')]",
          "secret": "[parameters('HTTP-Secret')]",
          "tenant": "[parameters('HTTP-Tenant')]",
          "type": "ActiveDirectoryOAuth"
        },	   
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 

Tools like the Logic App Template Creator help us focus on the fun and good parts, building great solutions for our customers.

I strongly recommend you to try it and help out evolving it; more updates are coming, so stay tuned!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps Parameters vs ARM Parameters

I got a question about what the difference is between ARM template parameters and Logic App parameters and when each should be used, so that is what I'll try to explain in this post.

First of all, ARM template parameters are used with ARM templates; an ARM template is used when deploying ARM-based resources to Azure, and a Logic App is a resource that is deployed via ARM templates. The workflow definition language behind Logic Apps is very similar to ARM templates, and therefore it can be tricky to see the difference in the beginning.

So let's start off with ARM template parameters, where they are and how they work. If you are interested in more info about ARM templates, read more at https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-authoring-templates.

ARM Template Parameters

An ARM template with an empty Logic App looks as follows: two ARM template parameters, logicAppName and logicAppLocation, and one resource of type Microsoft.Logic/workflows. Inside the Microsoft.Logic/workflows resource the Logic App parameters are found in the parameters object.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "",
      "metadata": {
        "description": "Name of the Logic App."
      }

    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    }
  },
  "variables": {
  },
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {},
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}

So again, the ARM template parameters are found here, containing a parameter named Flat_File_Encoding-SchemaName:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "Flat_File_Encoding-SchemaName": {
	  "type": "string",
	  "defaultValue": "TEST-INT0021.Intime.DailyStatistics"
	},

ARM template parameters have the syntax [parameters('armparam')]; accessing the ARM parameter named Flat_File_Encoding-SchemaName would look like:

"name": "[parameters('Flat_File_Encoding-SchemaName')]"

This value is evaluated during deployment, and if the ARM parameter looks like this (and no parameter file is used):

,
"Flat_File_Encoding-SchemaName": {
  "type": "string",
  "defaultValue": "TEST-INT0021.Intime.DailyStatistics"
},

The result would look like this after deployment:

Code View

"name": "TEST-INT0021.Intime.DailyStatistics"

Designer View

In the Designer View the value is fully evaluated and easy to read for operations.

ARM Template Variables

This is an almost unknown feature and is used too little. Variables are where we can evaluate expressions for later use, which is really practical when we want to combine two parameter values or use functions to generate a value that shall be consistent over the Logic App.

In order to show how this works we will concatenate two parameters; this could be used for creating resource links etc. The simple Logic App will just contain a Response action with a body set to the evaluated, concatenated value of the parameters.

The variable can be found after the parameters in the ARM template:

    }
  },
  "variables": {
  },
  "resources": [

When adding a variable we can then access parameters as input to the evaluation; this makes it possible to combine two parameters with the concat function, and it will look like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "flowname": {
      "type": "string",
      "metadata": {
        "description": "Name of the flow"
      }
    }
  },
  "variables": {
    "combinedflowname" :  "[concat(parameters('logicAppName'),'-',parameters('flowname'))]"
  },
  "resources": [

So the variable is created and assigned a value; now we just need to use it, and the syntax for accessing the value is simple:

"Response": {
  "inputs": {
    "body": "[variables('combinedflowname')]",
    "statusCode": 200
  },
  "runAfter": {
   
  },
  "type": "Response"
}

The variables are evaluated during deployment, just like the ARM template parameters. This means that when deployed, the value will be displayed as text; so if logicAppName was set to INT001-Example and flowname was set to Orders, the evaluated result of the concat function would be INT001-Example-Orders. And when looking at the deployed Logic App it will look like this:

Logic App Parameters

The Logic App parameters are found in the section below, under "parameters", here containing a parameter "url".

"resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {},
        "parameters": {
			"url": {
	          		"defaultValue": "http://www.google.se/",
	         	 	"type": "String"
	        	}
			}
      }
    }
  ],

Logic App parameters have a different syntax: @parameters('laparam'). Accessing the Logic App parameter named url would look like:

"uri": "@parameters('url')"

These parameters are evaluated at runtime, which means the value is not changed after deployment or in the designer; even after the deployment or first run is done it will always look like this, but during runtime the value behind the parameter will be used. The result looks like this after deployment:

Code View

"uri": "@parameters('url')"

Designer View

But during runtime it will be evaluated, and if we check a run it will look like this:

Summary

It's good to know when and why to use the different types of parameters; I've seen a lot of overuse of Logic App parameters, and I just want to share the knowledge and spread how these are treated. We want to push static values to ARM template parameters so that it's easy to see the values when checking the Logic App in the designer and verify that everything is OK, since Logic App parameters will "hide" the value in Code View. It also makes it easy to switch values between DEV/TEST/PROD environments, since these values often change between environments. Also make sure to use ARM template variables whenever you need to reuse a computed result over your Logic App.

With that said, when using reference objects for translations or lookups, Logic App parameters are the obvious choice since they are easy to access. We have seen this work successfully and be easy to use when keys are sent in the input data and then used in lookups, i.e. @{parameters('myobjectparam')[triggerBody()['key']]}

So make sure to use the appropriate type at the correct time, and I hope you got some more insight into how, when and where to use these three different types.

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps deployment template extractor trigger updates

Updates to the Logic App Template Extractor are out; read more about it in the earlier post Logic Apps Deployment Template Extractor.

Recurrence properties on triggers automatically become ARM template parameters

With a Recurrence trigger we set an interval and a frequency, and we often want different values in dev/test and production; since the trigger is a billable action, the frequency is often something we change between environments. So any trigger that has recurrence properties, such as Interval and Frequency, automatically gets them generated as parameters in the ARM template. Here we can see the standard Recurrence trigger:

And in the extracted ARM template we will automatically get:

"parameters": {
...
 "RecurrenceFrequency": {
      "type": "string",
      "defaultValue": "Day"
    },
    "RecurrenceInterval": {
      "type": "int",
      "defaultValue": 1
    }
...
  "triggers": {
    "Recurrence": {
      "recurrence": {
        "frequency": "[parameters('RecurrenceFrequency')]",
        "interval": "[parameters('RecurrenceInterval')]"
      },
      "type": "Recurrence"
    }
  },

File Connector and Base64 paths

The File Connector, among others, saves the folder path in Base64 format to make sure the path is valid; for the File Connector this applies to the trigger and the List files action. The designers work with this without any problems, but when you want to automate the deployment it becomes something we need to know about and understand how to handle.

Understanding the Base64 model

So let's start with understanding the Base64 model used in these actions/triggers. First we pick our path:

And if we then go in and look in the code view:

"List_files_in_folder": {
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['filesystem_1']['connectionId']"
            }
        },
        "method": "get",
        "path": "/datasets/default/folders/@{encodeURIComponent('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')}"
    },
    "metadata": {
        "XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=": "\\\\server\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    },
    "runAfter": {},
    "type": "ApiConnection"
}

As we can see, the path we picked in the designer is in the metadata tag, and the name of the property is just a "random" name. The "random" name is not so random though; it's actually the Base64 representation of the path.

\\server\ARBETSORDER\OUT\XML_EXPORT\TO = decodeBase64('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')

The metadata tag is used in the GUI to present the path as text, and the actual value sent to the API Connection is found in the path property:

"path": "/datasets/default/folders/@{encodeURIComponent('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')}"

So in order to handle deployments, the path property holds the actual value we need to change, but if we want the GUI to present this value we also need to update the metadata tag. To do this there are handy functions available in the ARM template, like [base64('somevalue')]. The full extract from the Deployment Template Creator handles this and looks like this:

"parameters": {
...
"List_files_in_folder-folderPath": {
      "type": "string",
      "defaultValue": "\\\\server\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    },
}
...
	"List_files_in_folder": {
	  "runAfter": {},
	  "metadata": {
	    "[base64(parameters('List_files_in_folder-folderPath'))]": "[parameters('List_files_in_folder-folderPath')]"
	  },
	  "type": "ApiConnection",
	  "inputs": {
	    "host": {
	      "connection": {
	        "name": "@parameters('$connections')['filesystem_1']['connectionId']"
	      }
	    },
	    "method": "get",
	    "path": "[concat('/datasets/default/folders/@{encodeURIComponent(''',base64(parameters('List_files_in_folder-folderPath')),''')}')]"
	  }
	}

Setting the parameter in the parameters file to the new value will then populate both, working with the designer as well as passing the correct value as a parameter to the File Connector.
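If you want to verify a value yourself, the encoding is plain Base64 of the path string; a small PowerShell sketch (the path is just an example):

$path = "\\server\ARBETSORDER\OUT\XML_EXPORT\TO"

# Encode a folder path the way the metadata key is built
$encoded = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($path))
$encoded

# Decode an existing key taken from the code view
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($encoded))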

Small fixes

There have also been a number of small fixes; the parameter type now supports all kinds of types, so we can have objects, strings, integers etc.

Here is a sample of how parameters inside a Logic App of type Object will be handled:

 "parameters": {
    "ismanager": {
        "defaultValue": {
            "0": 572,
            "1": 571,
            "No": 572,
            "Yes": 571
        },
        "type": "Object"
    },

After generating the ARM template, the Logic App parameter will be pushed up to an ARM template parameter, making the ARM template look like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
	"parameters": {
		...
		paramismanager": {
		  "type": "Object",
		  "defaultValue": {
		    "0": 572,
		    "1": 571,
		    "No": 572,
		    "Yes": 571
		  }
		},
		....
	}
	"variables": {},
	"resources": [
	    {
	      "type": "Microsoft.Logic/workflows",
	      "apiVersion": "2016-06-01",
	      "name": "[parameters('logicAppName')]",
	      "location": "[parameters('logicAppLocation')]",
	      "dependsOn": [],
	      "properties": {
	        "definition": {
	          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
	          "contentVersion": "1.0.0.0",
	          "parameters": {
	            "ismanager": {
	              "defaultValue": "[parameters('paramismanager')]",
	              "type": "Object"
	            },
				...
			 }
          },
          "outputs": {}
        },
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}
				

Tools like the Logic App Template Creator help us focus on the fun and good parts, building great solutions for our customers.

I strongly recommend you to try it and help out evolving it!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps Variables and Do Until

Sometimes we have to handle a workload that is undefined; here I'm going to show how this can be solved with Do Until and Variables.

The scenario is the following: we need to collect all users from a system that has about 14,000 users. The problem is that due to API restrictions we can collect about 150 users per call, and there is paging that we need to use: we get page 1, then page 2 and so on, with a page size of 150. In addition there is a boolean called last_page that tells us if we are on the last page.

So how do we solve this in Logic Apps? For Each cannot be used, and calling a function that would collect all users would take too long. Fortunately we can use the new Variables actions in combination with the Do Until action. What we will do is create a variable to keep the page number, increment it after each GET of users, and then check if last_page is true; if it is, we end the Do Until.

Do Until

The Do Until action is a looping action that continues to execute until a condition is met. Fortunately Microsoft has provided some limits to make sure we don't end up in endless loops :) There are limits to make sure the loop is not run more than x times (the default is 60 times) or longer than a specified time (the default timeout is 1 hour). But of course we will not need these since we will create good expressions, right? :)

Variable

The Variables actions provide functionality to store and handle variables, currently only integer and float but hopefully soon also objects. There are three actions associated with variables:

  • increment - increments the value of a specified variable by a configurable amount
  • initialize - initializes a variable: sets its name, its type (currently only integer or float) and a starting value
  • set - sets the specified variable to a specific value

So in order to use variables we need to initialize them via the initialize action, and if we want to change the value we can either increment it or set it with the appropriate actions, and then use it later on in expressions or, as in our case, as part of the URL in the HTTP action. We get the value via the following simple code:

@variables('page')
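In the HTTP action the variable can then, for example, be placed directly in the URI; a sketch against a hypothetical endpoint:

https://api.example.com/users?per_page=150&page=@{variables('page')}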

Build the solution

So let's start adding these actions; when writing "variable" in the new action window, three actions show up. Let's start with initializing the variable.

Name it page and set the type to Integer and the value to 1, since the first page is where we should start.

We will also add another variable to use in the condition for exiting the Do Until loop:

Then we add the Do Until action using the dountil flag, and we also change the limits: with about 14,000 users at 150 per call we are looking at somewhere around 100 iterations, which might be too low as a limit in the future, so we set the count to a really high number, 500, and set the timeout to 1 hour to make sure we can process all users.

After this we add the HTTP action and some other actions needed for processing; after that we add an increment page action to increase the value, and also a condition to check the last_page flag.

As the image shows, I've used increment by 2 and an @or statement in the condition; this is because I'm running two parallel executions to speed up the process, fetching 2 * 150 users per turn. So either of them could be the last page, and if either one is, we set the dountil variable to 1 to exit the loop.
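The exit condition of the Do Until then only needs to check that flag; a sketch of the expression, using the dountil variable initialized above:

@equals(variables('dountil'), 1)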

Now we have created a flow that will collect users until the system tells us we have collected them all.

Thoughts

This is a good and useful pattern, but there are some considerations to take into account: when using Do Until loops we don't get the "pretty" logging that we get in For Each loops, where we can see all the runs; we just see the latest one.

Variables are really handy and easy to use, and make it possible to keep track of indexes and such when building more complex workflows.

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Loops  •Variables  •Azure 


Logic Apps deployment template extractor

When you have developed your Logic App and it works well in the development environment, it's time to bring it to test and later on production. If you have developed it in Visual Studio you will find this easy, as long as you don't use any linked resources like another Logic App, an Azure Function, API Management or an Integration Account, because then the next phase starts: the phase where you manually break down the resource ids of these actions into concat statements with parameters and ARM template functions.

So what does it look like? Let's start with an innocent Workflow action, a call to another Logic App. In the designer it looks like this (both the web designer and VS):

If we take a look at this from the code view, it looks as follows:

"INT001_Work_Order": {
    "inputs": {
        "body": "@triggerBody()",
        "host": {
            "triggerName": "manual",
            "workflow": {
                "id": "/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Logic/workflows/INT001_Work_Order"
            }
        },
        "retryPolicy": {
            "type": "None"
        }
    },
    "runAfter": {},
    "type": "Workflow"
}

The id is a reference to the Logic App via its resource id, so the id in the field below points exactly to that other Logic App. Let's look into the field:

"/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Logic/workflows/INT001_Work_Order"

The hardcoded parts are the id of the subscription fake89f1-660a-4585-b4d8-bf5172cb7f70, the resource group name integration-we and the Logic App name INT001_Work_Order. In order to make this possible to relate to in other subscriptions we need to remove the "hard codings" of subscription and resource group using the ARM template language (read more at https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-authoring-templates). We will use the subscription object to get the current subscription we are executing in and its id via the subscriptionId property, and the same for the resource group name via the resourceGroup object and the name property; we then get the values for the subscription and resource group we are deploying to.

It will look like this (assuming the Logic App name will be the same in the next environment):

"[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Logic/workflows/INT001_Work_Order')]"

This can now be deployed to a new subscription, and if the INT001_Work_Order Logic App exists it will be referenced automatically; otherwise the deployment will fail.

The same goes for functions: if we add them, a lot of information is put into this id property, "hard coded":

"TransformKNABContactToGeneric": {
    "inputs": {
        "body": "@body('HTTP_Webhook')",
        "function": {
            "id": "/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Web/sites/functionappname/functions/INT004-TransformContactToGeneric"
        }
    },
    "runAfter": {},
    "type": "Function"
}

It’s just as with the Logic App reference…

"/subscriptions/fake89f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/integration-we/providers/Microsoft.Web/sites/functionappname/functions/INT004-TransformContactToGeneric"

As before we need to update it, to something like this. Note that I've added a parameter for the Function App container name, as this will be different in different environments, and to be on the safe side we also added a parameter for the function name.

"[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Web/sites/', parameters('FunctionAppName'),'/functions/', parameters('FunctionName'),'')]"

API Management usage is the same, but here we also have some more things to consider: the instance name and the subscription key (experience says it's easy to forget adding a parameter that is needed for a smooth deployment).

"UploadAttachment": {
  "runAfter": {},
  "type": "ApiManagement",
  "inputs": {
    "api": { "id": "/subscriptions/FAKE9f1-660a-4585-b4d8-bf5172cb7f70/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apiminstancename/apis/58985740990a990dd41e5392" },
    "body": {
      "attachment": {
        "data": "@{triggerBody()['data']}",
        "extension": "@{triggerBody()['extension']}",
        "filename": "@{triggerBody()['filename']}",
        "orderNumber": "@{triggerBody()['orderNumber']}"
      }
    },
    "method": "post",
    "pathTemplate": {
      "parameters": {},
      "template": "/api/v1/test/rest/UploadAttachment"
    },
    "subscriptionKey": "keytothesubscription"
  }
} 

So in this case we would need to do something like this to make a smooth deployment; note the parameters for the instance name and the API id, just to be safe.

 "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('apimResourceGroup'),'/providers/Microsoft.ApiManagement/service/', parameters('apimInstanceName'),'/apis/', parameters('apimApiId'),'')]"
 ..
 "subscriptionKey": "[parameters('subscriptionkey')]"

So as you can see there are some things to do here, and if you are, like me, a fan of the Web designer, the next step after development in order to get this into VSTS would be to start building the ARM template by hand. I’ve found this very time consuming and error prone; it’s easy to make mistakes and it takes time to test the deployment. It also often changes the Logic App a bit, which might not be so good if tests have already been done on the implementation.

So in order to automate all this and more, i.e. pushing parameters in the Logic App to the ARM template so they can be changed with a parameter file between environments, we are using the Logic App Template Creator created by Jeff Hollan https://github.com/jeffhollan/LogicAppTemplateCreator. It can even create the parameter file. With this we can extract the Logic App, set up parameters for TEST/PROD, add it to VSTS and start deploying directly. Note that Gateways and their deployment are not supported yet, and if you find bugs, help out or report them so we can fix them together.

The Template Generator will automate this and do the following:

Logic App (workflow)

In addition to fixing the id to a generic concat path, it also adds a parameter at the top if the referenced Logic App is in another resource group.

Logic App in the same resource group (assumes that it will be the same in the next environment):

"INT001_Work_Order": {
  "runAfter": {},
  "type": "Workflow",
  "inputs": {
    "body": "@triggerBody()",
    "host": {
      "triggerName": "manual",
      "workflow": {
        "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Logic/workflows/INT001_Work_Order')]"
      }
    },
    "retryPolicy": {
      "type": "None"
    }
  }
}

Logic App not in the same resource group; a parameter is added for the resource group name:

"parameters": {
...
"INT0014-NewHires-ResourceGroup": {
      "type": "string",
      "defaultValue": "resourcegroupname2"
    }
...
}
...
"INT0014-NewHires": {
  "runAfter": {
    "Parse_JSON": [
      "Succeeded"
    ]
  },
  "type": "Workflow",
  "inputs": {
    "body": {
      "Completed_Date_On_or_After": "@{body('Parse_JSON')?['Completed_Date_On_or_After']}",
      "Completed_Date_On_or_Before": "@{body('Parse_JSON')?['Completed_Date_On_or_Before']}",
      "Event_Effective_Date_On_or_After": "@{body('Parse_JSON')?['Event_Effective_Date_On_or_After']}",
      "Event_Effective_Date_On_or_Before": "@{body('Parse_JSON')?['Event_Effective_Date_On_or_Before']}"
    },
    "host": {
      "triggerName": "manual",
      "workflow": {
        "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('INT0014-NewHires-ResourceGroup'),'/providers/Microsoft.Logic/workflows/INT0014-NewHires')]"
      }
    }
  }
}

Azure Function

Azure Functions are referenced via a Function App plus, at the end of the path, the function name, so both of these need to be parameters. We also get the same handling as with the Logic App around the resource group.

"parameters": {
...
 "TransformContactToGeneric-FunctionApp": {
      "type": "string",
      "defaultValue": "functionAppName"
    },
    "TransformContactToGeneric-FunctionName": {
      "type": "string",
      "defaultValue": "INT004-TransContactToGeneric"
    }
...
 "TransformContactToGeneric": {
  "runAfter": {},
  "type": "Function",
  "inputs": {
    "body": "@body('HTTP_Webhook')",
    "function": {
      "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Web/sites/', parameters('TransformContactToGeneric-FunctionApp'),'/functions/', parameters('TransformContactToGeneric-FunctionName'),'')]"
    }
  }
}

API Management

API Management and its APIs are referenced in the same way as everything else, so we need the same handling. Here it’s also important to be able to set the instance name and subscription key, since they will change in the next environment.

"parameters": {
...
"apimResourceGroup": {
      "type": "string",
      "defaultValue": "integration-we"
    },
    "apimInstanceName": {
      "type": "string",
      "defaultValue": "apiminstancename"
    },
    "apimApiId": {
      "type": "string",
      "defaultValue": "58778a86990a990f5c794e48"
    },
    "apimSubscriptionKey": {
      "type": "string",
      "defaultValue": "mysecretsubscriptionkey"
    }
...
"INT005_Add_Attachment": {
  "runAfter": {},
  "type": "ApiManagement",
  "inputs": {
    "api": {
      "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('apimResourceGroup'),'/providers/Microsoft.ApiManagement/service/', parameters('apimInstanceName'),'/apis/', parameters('apimApiId'),'')]"
    },
    "body": {
      "fileextension": "@{triggerBody()['fileextension']}",
      "filename": "@{triggerBody()?['filename']}",
      "image64": "@triggerBody()?['image64']",
      "workorderno": "@{triggerBody()?['workorderno']}"
    },
    "method": "post",
    "pathTemplate": {
      "parameters": {},
      "template": "/test/addattachment"
    },
    "subscriptionKey": "[parameters('apimSubscriptionKey')]"
  }
}

Integration Account

When working with Integration Accounts, these need to be set in the next environment as well, so if they are used some parameters and settings are added.

"parameters": {
...
"IntegrationAccountName": {
  "type": "string",
  "metadata": {
    "description": "Name of the Integration Account that should be connected to this Logic App."
  },
  "defaultValue": "myiaaccountname"
},
"IntegrationAccountResourceGroupName": {
  "type": "string",
  "metadata": {
    "description": "The resource group name that the Integration Account are in"
  },
  "defaultValue": "[resourceGroup().name]"
}
...
"integrationAccount": {
  "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourcegroups/',parameters('IntegrationAccountResourceGroupName'),'/providers/Microsoft.Logic/integrationAccounts/',parameters('IntegrationAccountName'))]"
}

Parameters

Parameters added inside the Logic App (for now only via Code View) will be pushed and added as an ARM template parameter. So if you have this parameter inside your Logic App:

"definition": {
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "MyURL": {
      "defaultValue": "http://requestb.in",
      "type": "String"
    }
  },

After running the extractor this parameter will be an ARM template parameter, and the defaultValue will be set to the value of that ARM template parameter, so it can now be set at deployment time via the parameters file.

"parameters": {
...
"paramMyURL": {
  "type": "string",
  "defaultValue": "http://requestb.in"
},
...

"definition": {
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "MyURL": {
      "defaultValue": "[parameters('paramMyURL')]",
      "type": "String"
    }
  },
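A minimal sketch of how this can then be used in a deployment, assuming you keep one copy of the generated parameters file per environment (resource group and file names are placeholders):

#Minimal sketch: deploy with an environment specific copy of the generated parameters file
#(resource group and file names are placeholders)
New-AzureRmResourceGroupDeployment -ResourceGroupName 'integration-we-test' `
    -TemplateFile '.\INT001_Work_Order.json' `
    -TemplateParameterFile '.\INT001_Work_Order.parameters.test.json'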

We are currently working on more, like trigger functionality to get frequency and interval settings set up, and also functions to handle special triggers like the file trigger, where the folder path is base64 encoded and built up in a special way.

Be aware that right now the Gateway Connectors strangely remove the user/pass if they are not supplied, so they will break; remove them manually from the template and create them manually for now. A fix is coming to not include them. But for now, hope this helps out!

Here is a sample of a PowerShell script that will extract the Logic App, generate a template and create a parameter file:

#If you have problems with the execution policy, run this in a PowerShell window started as administrator:
#Set-ExecutionPolicy -ExecutionPolicy Unrestricted

#We have the module in a folder structure like this and it needs to be imported
Import-Module ".\LogicAppTemplateCreator\LogicAppTemplate.dll"

#Credentials will "flow" if you are logged in.


#Set the name of the Logic App
$logicAppName = 'INT001_Work_Order'
#Set the resource group of the Logic App
$resourceGroupName = 'integration-we'
#Set the subscription id of where the Logic App is
$subscriptionId = 'fakeb73-d0ff-455d-a2bf-eae0b300696d'
#Set the tenant to use with the Logic App, make sure it is the right tenant
$tenant = 'ibiz-solutions.se'

#Set the output file names
$fileName = $logicAppName + '.json'
$fileNameParam = $logicAppName + '.parameters.json'

#Generate the ARM template and then the parameters file
Get-LogicAppTemplate -LogicApp $logicAppName -ResourceGroup $resourceGroupName -SubscriptionId $subscriptionId -TenantName $tenant | Out-File $fileName
#$fileName = $PSScriptRoot + "\" + $fileName
Get-ParameterTemplate -TemplateFile $fileName | Out-File $fileNameParam

Sample of complete extraction INT001_Work_Order:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "INT001_Work_Order",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "apimResourceGroup": {
      "type": "string",
      "defaultValue": "integration-we"
    },
    "apimInstanceName": {
      "type": "string",
      "defaultValue": "instancename"
    },
    "apimApiId": {
      "type": "string",
      "defaultValue": "5833f8a9990a991064e0e128"
    },
    "apimSubscriptionKey": {
      "type": "string",
      "defaultValue": "fakekeyis here"
    },
    "apimApiId2": {
      "type": "string",
      "defaultValue": "58778a86990a990f5c794e48"
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {},
          "triggers": {
            "manual": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "schema": {}
              }
            }
          },
          "actions": {
            "Check_assigned_to": {
              "actions": {
                "Service_Order": {
                  "runAfter": {},
                  "type": "ApiManagement",
                  "inputs": {
                    "api": {
                      "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', parameters('apimResourceGroup'),'/providers/Microsoft.ApiManagement/service/', parameters('apimInstanceName'),'/apis/', parameters('apimApiId'),'')]"
                    },
                    "body": "@concat(replace(replace(string(triggerBody()),'<?xml version=\"1.0\" encoding=\"utf-8\"?>',''),'<workOrders ','<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">Submit</a:Action></s:Header><s:Body><workOrders xmlns=\"pv-work-order\" '),'</s:Body></s:Envelope>')",
                    "method": "post",
                    "pathTemplate": {
                      "parameters": {},
                      "template": "/external/Relacom/ServiceOrder/"
                    },
                    "subscriptionKey": "[parameters('apimSubscriptionKey')]"
                  }
                }
              },
              "runAfter": {
                "Compose": [
                  "Succeeded"
                ]
              },
              "expression": "@equals(outputs('Compose')['assigned'], 'RELACOM')",
              "type": "If"
            },
            "INT005_Add_Attachment": {
              "runAfter": {
                "INT001_Create_Order": [
                  "Succeeded"
                ]
              },
              "type": "Workflow",
              "inputs": {
                "body": {
                  "company": null,
                  "fileextension": "xml",
                  "filename": "@concat(concat(xpath(xml(triggerbody()), 'string(/*[local-name()=\"workOrders\"]/*[local-name()=\"workOrder\"]/*[local-name()=\"code\"])'),'_'),xpath(xml(triggerbody()), 'string(/*[local-name()=\"workOrders\"]/*[local-name()=\"header\"]/*[local-name()=\"messageDate\"])'))",
                  "id": null,
                  "image64": "@base64(triggerbody())",
                  "workorderno": "@xpath(xml(triggerbody()), 'string(/*[local-name()=\"workOrders\"]/*[local-name()=\"workOrder\"]/*[local-name()=\"code\"])')"
                },
                "host": {
                  "triggerName": "manual",
                  "workflow": {
                    "id": "[concat('/subscriptions/',subscription().subscriptionId,'/resourceGroups/', resourceGroup().name,'/providers/Microsoft.Logic/workflows/INT005_Add_Attachment')]"
                  }
                }
              }
            },
            "Response": {
              "runAfter": {
                "INT005_Add_Attachment": [
                  "Succeeded"
                ]
              },
              "type": "Response",
              "inputs": {
                "body": "OK",
                "statusCode": 200
              }
            }
          },
          "outputs": {}
        },
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}

Sample of the parameters file generated:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "INT001_Work_Order"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "apimResourceGroup": {
      "value": "integration-we"
    },
    "apimInstanceName": {
      "value": "instancename"
    },
    "apimApiId": {
      "value": "5833f8a9990a991064e0e128"
    },
    "apimSubscriptionKey": {
      "value": "fakekeyis here"
    },
    "apimApiId2": {
      "value": "58778a86990a990f5c794e48"
    }
  }
}

As you can see, this handles all of the necessary concat setups, and it automatically adds parameters for the important things like the API Management instance name and subscription key, making sure the parameters are not forgotten. This makes it better and easier to use the Web designer rather than VS, since in VS you would still need to do this work or export it afterwards (and then what is the point of using VS?). With this it’s possible to just develop in the Web designer as I prefer, then extract the template, verify it and update it with parameters (preferably in the Logic App, so you can re-extract the template and make sure all is working before the deployment), and put it into VSTS for deployment.

I strongly recommend you to try it.

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


API Management Backend Service with id “abc” not found

When deploying APIs to API Management via GIT, we can run into the error “could not find backend service with id abc”. This comes from the SOAP handling recently added to API Management.

If we take a look inside the GIT repository, there is a new folder called “backend”.

Looking inside, there are some folders named after the backend URLs.

Inside each of these folders we find a configuration.json file.

In one of these files you will find the id that matches the error message; make sure that file is added in the commit. Done? If the API is created with this same commit (i.e. deployed for the first time) the error will unfortunately persist; this is due to a minor bug. To work around it we can temporarily remove the set-backend-service policy, which in the SOAP case we find in “policies/api/{apiname}.xml”.


Temporarily remove the set-backend-service policy, create a commit and deploy the API. After that, add the backend service again and create a new commit; now the deployment will work and the backend will be set correctly.
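A minimal sketch of the two-commit workaround from the cloned repository; the API name is a placeholder, and the actual deployments are triggered from the publisher portal as described above:

#Minimal sketch of the two-commit workaround (the API name is a placeholder)
#1. Edit policies/api/myapi.xml and remove the <set-backend-service /> element
git add .\policies\api\myapi.xml
git commit -m "Temporarily remove set-backend-service for the first deployment"
git push
#Deploy via "Deploy repository configuration" in the publisher portal
#2. Restore the <set-backend-service /> element in the same file
git add .\policies\api\myapi.xml
git commit -m "Restore set-backend-service"
git push
#Deploy again, this time the backend will be resolved correctly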

Posted in: •Azure API Management  •Integration  | Tagged: •API  •API Management  •Azure 


Understanding API Management Source Code

Developing in Azure API Management is done inside the Azure portal or in the Publisher Portal that comes with the API Management instance. Developing and testing is easy, but when it comes to moving the developed API to the next instance (Test or Prod) it becomes tricky.

In this demo instance I have 2 APIs and logic applied to handle authentication and formatting.

Where is the code?

When developing APIs in the API Management publisher portal you quickly learn that the API signature and the policy code are separated and handled in different areas of the portal, API and Policy. This separation is not as visible in the new API blade inside the Azure Portal, where a lot of work has been done to make it easier to understand the APIs and see the whole picture in one place.

But the difference is quite important: the API is the “external” definition needed by clients to understand and implement our API, and the Policy part is logic that is executed when a call is made to the API.

Exporting the API: press Export and select “OpenAPI specification”.

Inspecting the file shows that only the swagger definition is exported, not the logic added to the API via policies.

We now test extracting the code using the GIT option; in the publisher portal, go to “Security” -> “Configuration repository”.

First press “Save configuration to repository”; note that after the save is completed, the Git icon in the top right corner changes color to green.

Get the clone URL and note that the username is “apim”; scroll down and generate a password.

Now use the credentials and password to connect. Tip: if you are using PowerShell or another tool that requires the authentication in the URL, remember to URL encode the password:

git clone https://{user}:{password}@ibizmalo.scm.azure-api.net
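PowerShell can do the URL encoding for you; a minimal sketch, with the password value as a placeholder:

#Minimal sketch: URL encode the generated password before putting it in the clone URL
$password = 'the+generated/password=='   #placeholder
$encoded = [uri]::EscapeDataString($password)
git clone "https://apim:$encoded@ibizmalo.scm.azure-api.net"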

Inspecting the content (2 folders down) we find the source code!

All of these we can copy to the next instance, but! There are some things to take into consideration: each group/product etc. has a unique ID, and if these are created manually in each instance the IDs are not guaranteed to match, in which case the import won’t work in the other instance, so import everything you need.

Import changes into an API Management instance

The next step is to import these files into another API Management instance, so to get started we need to clone the code from that API Management instance:

Copy the files and commit. After that we need to deploy the changes in the repository to the API Management instance, so once again we go to the publisher portal and “Security” -> “Configuration repository”. Scroll down and press “Deploy repository configuration” and the changes are applied.
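A minimal sketch of the copy-and-commit step, assuming both repositories are cloned side by side (folder names are placeholders):

#Minimal sketch: copy the exported configuration into the clone of the target instance and push
#(folder names are placeholders)
Copy-Item -Path '.\dev-instance\api-management\*' -Destination '.\test-instance\api-management\' -Recurse -Force
Set-Location '.\test-instance'
git add -A
git commit -m "Import APIs from the Development instance"
git push
#then press "Deploy repository configuration" in the target publisher portal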

Posted in: •Azure API Management  •Integration  | Tagged: •API  •API Management  •Azure 


Exposing SAP Gateway services with API Management

Exposing services like the SAP Gateway is an important task for API Management, but not always so easy. Security is often tight around these products, so we need to understand it in order to get the setup correct. To get started, let’s look at the setup we were facing: SAP Gateway is hosted inside the on-prem network behind several firewalls and hard restrictions, and API Management is hosted in Azure. In this case we had an Express Route setup ready to be used, so we could deploy an API Management instance inside a preexisting subnet. Firewall rules were set up based on the local IP of the API Management instance (according to the information I have, these should be static when deployed in a VNET).

API Management is deployed inside the network by setting the External Network in the network tab to a subnet of the VNET that is connected to the Express Route. Make sure that API Management is alone in this subnet.

After this is done we set up the routing/firewall rules. To get the local IP of the API Management instance, we put up a VM with IIS and created an API inside API Management that called the standard IIS start page; after that we searched the IIS logs to find the local IP.

Now we can start creating the API, and I’ll jump right into the policy settings of the API created to expose the SAP Gateway. The security model on SAP GW can vary, but in this case we had Basic Authentication as the authentication mode. Handling this is quite straightforward in API Management, since there is a policy ready to be used, “Authenticate with Basic”.

So we started off adding the authentication part, and the easy part was done. When calling the API we just got a 403 Forbidden response saying “Invalid X-CSRF-Token”; looking into this we found that it comes from the anti-forgery setup in SAP Gateway. To handle it, a token and a cookie are needed, and these are retrieved via a successful GET call to the Gateway. The initial call uses the same URL (make sure that the GET operation is implemented so that the result is successful, i.e. returns 200 OK, otherwise the token is not valid). Since I had no access to the SAP GW without API Management, my testing was done from Postman via API Management to SAP GW. Adding the “X-CSRF-Token” header with the value Fetch retrieves the token and cookie.


The interesting parts of the response are the Headers and the Cookies, so let’s have a look. Under Headers we find the X-CSRF-TOKEN that we need to send in the following request.

In the Cookies we find 2 items, and we are interested in the one called “sap-XSRF…”; this is the anti-cross-site forgery cookie that is needed in the POST/PUT request to SAP Gateway.

Composing these gives us a valid request.
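To make the flow concrete, here is a minimal PowerShell sketch of the same two calls done outside Postman; the URL, credentials and payload are placeholders, and the cookie handling makes the same assumptions as the policy further down:

#Minimal sketch of the two-step anti-forgery flow (URL, credentials and payload are placeholders)
$url  = 'https://myapim.azure-api.net/sapgw/ServiceOrder'
$auth = @{ 'Authorization' = 'Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==' }

#Step 1: GET with X-CSRF-Token: Fetch to retrieve the token and the anti-forgery cookie
$fetch  = Invoke-WebRequest -Uri $url -Method Get -Headers ($auth + @{ 'X-CSRF-Token' = 'Fetch' })
$token  = $fetch.Headers['x-csrf-token']
$xsrf   = ($fetch.Headers['Set-Cookie'] -split ';') | Where-Object { $_ -match 'sap-XSRF' } | Select-Object -First 1
$cookie = ($xsrf -split ',')[1]

#Step 2: POST with the token and cookie copied from the fetch response
Invoke-WebRequest -Uri $url -Method Post -Body '<payload goes here>' `
    -Headers ($auth + @{ 'X-CSRF-Token' = $token; 'Cookie' = $cookie })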

So now we can do these two requests to get a valid POST call into SAP Gateway from Postman; let’s move on to setting this up in API Management. In API Management we don’t want the client to have to understand this and/or implement the two calls and the functionality to copy headers and so on; we just want to expose a “normal” API. So we need to configure API Management to make these two calls to the SAP Gateway within the same client call, in order to send a valid POST request to SAP Gateway.

In order to make this work we need to use policies, so after setting up the standard POST request to create a Service Order we go to the Policy tab.

In the beginning the policy is more or less just the default skeleton.

So the first thing we need to do is add a send-request policy. I configured it with mode new and used a variable name of fetchtokenresponse. Since retrieving the token is done with a GET request to the SAP Gateway, we reuse the same URL as the API (after rewrite). We set the header X-CSRF-Token to Fetch, since we are fetching the token, and add the Authorization header with the value for Basic authentication. So let’s start with creating the call that does the GET token request; add this code in the inbound section:

<send-request mode="new" response-variable-name="fetchtokenresponse" timeout="10″ ignore-error="false">*
<set-url>@(context.Request.Url.ToString())</set-url>
<set-method>GET</set-method>
<set-header name="X-CSRF-Token" exists-action="override">
<value>Fetch</value>
</set-header>
<set-header name="Authorization" exists-action="override">
<value>Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==</value>
</set-header>
<set-body></set-body>
</send-request>

The next step is to extract the values from the send-request operation and add them to our POST request. Setting the X-CSRF-Token header is fairly straightforward: we use the set-header policy and retrieve the header from the response variable. The code looks like:

<set-header name="X-CSRF-Token" exists-action="skip">
<value>@(((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
</set-header>

A bit trickier is the Cookie; since there is no “standard” cookie handler we need to implement some more logic, and in the sample I provide I just made a lot of assumptions about the cookie. We need the cookie starting with sap-XSRF, so I started by splitting all cookies on ‘;’ and finding the one that contained “sap-XSRF”. In our case this also had a domain that I didn’t need, so I removed it by splitting on ‘,’ (comma) and used the result in a set-header policy.

<set-header name="Cookie" exists-action="skip">
<value>@{
string rawcookie = ((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("Set-Cookie");
string[] cookies = rawcookie.Split(';');
string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
return xsrftoken.Split(',')[1];
}
</value>
</set-header>

All in all the policy will look like:

<policies>
  <inbound>
    <base />
    <rewrite-uri template="sap/opu/odata/sap/ZCAV_AZURE_CS_ORDER_SRV/ItHeaderSet('{oid}')" />
    <send-request mode="new" response-variable-name="fetchtokenresponse" timeout="10" ignore-error="false">
      <set-url>@(context.Request.Url.ToString())</set-url>
      <set-method>GET</set-method>
      <set-header name="X-CSRF-Token" exists-action="override">
        <value>Fetch</value>
      </set-header>
      <set-header name="Authorization" exists-action="override">
        <value>Basic aaaaaaaaaaaaaaaaaaaaaaaaaa==</value>
      </set-header>
      <set-body></set-body>
    </send-request>
    <set-header name="X-CSRF-Token" exists-action="skip">
      <value>@(((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("x-csrf-token"))</value>
    </set-header>
    <set-header name="Cookie" exists-action="skip">
      <value>@{
        string rawcookie = ((IResponse)context.Variables["fetchtokenresponse"]).Headers.GetValueOrDefault("Set-Cookie");
        string[] cookies = rawcookie.Split(';');
        string xsrftoken = cookies.FirstOrDefault( ss => ss.Contains("sap-XSRF"));
        return xsrftoken.Split(',')[1];
      }</value>
    </set-header>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>

The result is now complete; the client will see this as just a regular API like all the others, and we can expose it with the regular API Management API-key security.

Wrap up

Exposing the SAP Gateway is not a straightforward task, but after understanding the process and implementing the SAP Gateway complexity inside API Management, we can expose these functions just like any other API. I would suggest adding some rate call limits and quotas to protect the gateway from overflow. This scenario proves the value of API Management and provides possibilities to solve complex authentication and anti-forgery patterns, in order to standardise your own API facade with the same auth mechanism for the client without taking the backend security into consideration.


Posted in: •Azure API Management  •Integration  •Uncategorized  | Tagged: •API Governance  •API Management  •SAP