Integrate 2018 summary day 3

Yet another fabulous day here at Integrate 2018. We started a bit earlier today, which everyone was glad for; otherwise we most likely would have missed our flight back to Stockholm.

Let’s get started by talking about the last session. Our very own MVP, Mattias Lögdberg, finished off the conference with a splendid session that glued together what had been presented the previous days. He did that with the help of a pen, no demos, and the work that our clients have trusted us to implement. Great work!! It was a pure pleasure seeing our colleague take the stage. The audience's jaws were dropping. Some might say we are biased in our glorification, but the Twitter feed isn't. Nice work Mattias. Happy to be working with you.

So, what did the day start off with? Well, Richard Seroter had the first session, called “Architecting Highly Available Cloud Integrations”. This was very interesting and well executed; unfortunately the cameraman was not as prepared as Richard. Some of the things he had on the agenda were:

Core patterns

E.g. retrying transient failures, load balancing (scale out, scale up, auto-scaling), replication, user throttling, load leveling, security with encryption, automated builds and staging.

Configure for availability

Here we have lots to choose from: SQL Server (Always On), Cosmos DB (e.g. geo-redundancy), Service Bus and Event Hubs; using these is staying true to 24/7/365. Logic Apps (highly available by design). Functions (depending on the service plan, they too can be H/A by design).

VPN gateways deploy active/standby instances by default, and voilà: H/A 😊

Putting it all together

Only integrate with H/A services (where possible). Document and understand which services fail over together; since a process is made up of services, if one service fails the whole process fails.

Next up this day was Michael Stephenson, on how Flow can empower DevOps. He had the coolest demo of all: DevOps becomes a lot more interesting when you can do it through Minecraft. Talk about solving mundane tasks in a fun way. What we took with us from this session was how powerful Flow and PowerApps can be. Together they will change DevOps and how our customers look upon us as integrators. One can do some seriously good and cool stuff without being a UX developer.

After this our bugs-friend Johan Hedberg had a session about “Using VSTS to deploy to BizTalk Server”. Again, a fantastic session, very informative and an eye-opener for many in the audience. Anyone with thoughts around ALM and BizTalk who has a “bug-free” version of FP2 (i.e. it must include CU4) will most likely benefit from using the new template for deployment of BizTalk artefacts, and if you are familiar with deployment of web apps you will be very familiar with this process.

A big thanks to the Integrate organizers for inserting a break after these three splendid sessions.

When the break was over, Wagner Silveira entered the stage with how one can expose BizTalk to the world and why one might want to do it. Again we got a session with smashing demos that just worked. He showed us how “easy” it is to expose BizTalk Server in all its glory to the world in a professional way.

After this session, Dan Toomey took over and showed us the anatomy of an enterprise integration architecture. He gave everyone some food for thought. Just check out these bullet points.

  • Make sure your System of Record layer is solid (e.g. well defined APIs, extensible, enforcing security and data validation)
  • Limit customizations within the System Of Record (e.g. keep the ERP to be just that)
  • Consider the use of canonical schemas (believe it or not it can be very handy)
  • Low coupling and strong cohesion (does this sound familiar to anyone working with code?). That basically means using pub/sub patterns where applicable.
  • The most important one – Allow room for innovation! Encourage and build platforms for experimentation.

Next up was Toon Vanhoutte, who showed us how to “Unlock the power of hybrid integration with BizTalk and webhooks!”. Yet again we were delivered smashing demos. Webhooks are cool! Everybody should be using webhooks. It is pub/sub by design: like putting a reminder in your calendar, and when it triggers you can take action. Much better than polling.

But there are things to consider before jumping in.

The conference ended for us with Mattias Lögdberg's session “Refining BizTalk implementations” (we had a flight to catch). As mentioned in the beginning, his session glued together all the pieces from the previous days. He proved that the “pen” is mightier than the demo 😊

Through his session we got to see that what we are doing together with our clients is the path to take if one wants to stay ahead of the competition.

After Mattias' final words we had to rush to make it to Gatwick, so we missed the roundtable discussions with Microsoft.

Amazing to see such excellent work and how much preparation the speakers have put in just for us. It was a pure joy and pleasure to be part of this conference. We feel very privileged and hope to be part of Integrate 2019.

Until next time! iBiz

Posted in: •Integration  | Tagged: •Integrate2018 


Integrate 2018 summary day 2

Reminiscing and excerpts from Integrate 2018

Day 2 is, I think, an exciting day for most of the participants in the conference, since most of the topics, as the title implies, are deep dives into the technologies.

Here are excerpts of some of the bits and pieces from day 2, from my perspective.

MS (CSE) use of Logic Apps for Enterprise Integration

Divya and Amit did an excellent job presenting a tool that makes it seem easy to migrate from a BizTalk setup into an integration account, where the artifacts land directly in their particular place, i.e. BizTalk maps go to integration account maps and BizTalk schemas go into integration account schemas.

It was quite impressive to see that even internally, MS uses Logic Apps as a foundational tool for this.

Aside from a short disruption from the fire alarm exercise and a little lag in the demo, in the end it did work!

API Management deep dive

Vladimir showed different sets of patterns to address security policies in APIM. It was a good tip to see how he managed to do a demo within VS Code with the HTTP plugin, do all the different security work there, and see the policies play out in APIM.

Logic apps patterns & best practices

Kevin and Derek did an excellent job showing the possibilities of Logic Apps and a gallery of best practices, of which exception handling is one. This is quite interesting since, although BizTalk has exception handling capabilities, some solutions don't take this into account, and it's good to bring it up now for Logic Apps.

Microsoft Integration Roadmap

This is maybe the most talked-about topic right now, especially within the BizTalk community, since, as most of the participants at the conference anticipated, MS didn't seem to care so much about the roadmap for BizTalk; instead, more focus has gone into the Azure integration components… and of course Azure Integration Services (the new meaning of IaaS?)!

Day 3 has even more exciting events… of course including our own Mattias Lögdberg with his amazing ’handwritten’ drawings.

Some of those sessions, including one from one of my favorite speakers, Richard Seroter, are already out:

https://www.linkedin.com/feed/update/urn:li:activity:6410436335362023424

An impressive presentation from a familiar face, Dan Toomey, is out as well:

https://www.linkedin.com/feed/update/urn:li:activity:6410535047534870528

What’s there & what’s coming in BizTalk360 & ServiceBus360

Saravana stepped on stage to show off his products BizTalk360 and ServiceBus360. He started off with BizTalk360 and gave us a brief tour; he showed us how it can be used for administering and monitoring BizTalk Server environments. It comes with customizable dashboards that can, in a simple way, show you how your BizTalk environment is doing. BizTalk360 offers some actions in the product, such as Auto Correct. The Auto Correct function enables you to automatically start a resource that has been disabled. You can easily configure and set up alarms to be sent if an incident occurs, such as suspended messages, ports being disabled or an SQL job failing. You can take actions directly in the interface; for suspended messages you can download, resume and terminate. I found the interface intriguing, and very user friendly.

They have also added the capability of monitoring some Azure Services.

Saravana didn’t go in depth and therefore left a lot to our imagination.

Saravana moved on to ServiceBus360 and quickly announced that it will be renamed Serverless360. ServiceBus360 aims to gather all the resources that define your integration into an “Application”. Resources can be Logic Apps, queues, Functions etc.; it doesn't matter if they live in different subscriptions. Serverless360 gathers all the resources and monitors them as a whole. Saravana talked a lot about the security issues in Azure and how much easier they could be handled in Serverless360. He talked about feature requests that may soon be realized; he mentioned an edit-and-resubmit function that would seem useful when dealing with messages on, for example, a dead-letter queue.

What’s there & what’s coming in Atomic Scope & Document360

Saravana presented Atomic Scope, and basically he explained that it enables you to track messages throughout the integration. He said it provides an end-to-end visualization of the integration. It was clear that they put the business user's experience before the technician's. Using XPath or JSON paths etc. it is possible to extract values from the message body, making them searchable. Atomic Scope saves the messages from BizTalk into a separate database and makes it possible to archive messages for a long time. It's also possible to set up alarms if a message fails to reach its destination, though the alarms are not customizable.

Posted in: •Integration  | Tagged: •Integrate2018 


Integrate 2018 summary day 1

Keynote - The Microsoft Integration Platform

The first day of Integrate 2018 was exciting. Keynote speaker Jon Fancey started off with a walkthrough of IT from the 80s to the present. He talked about the big changes, how fast they have been implemented, and how they have affected IT in recent decades. He compared this with how slowly things changed in the past and gave an example about transferring money: it took the process of transferring funds hundreds of years to evolve from physically moving money from location A to B, and now it's just a few clicks away over the internet.

The point of Fancey's introduction was that change is coming whether you like it or not, and you had better be prepared.

He talked a bit about three stages you go through while dealing with change:

Denial

Questioning

Enlightenment

The first stage is denial: it's basically you turning your back on change, thinking it's not necessary to adapt or evolve. The second stage is when you start asking the right questions and being curious about change. The third stage is when you see the big picture and know what road you have to take.

He quoted Tina Fey: “Say yes and you'll figure it out afterwards”. By that he meant that if you say no to change, someone else will say yes and leave you behind. If you say no, do you expect them to ask you the next time something comes up? So according to Fey and Fancey, it's better to say yes and figure it out as you go.

Introduction to Logic Apps

Kevin Lam and Derek Li held the first product presentation, which was an introduction to Logic Apps. They had some basic demos of how easy it is to get started. Then they had a slightly more advanced demo using the Azure Vision API to interpret a scanned invoice; in the end, the amount of the invoice could be calculated.

Some announcements were made, such as:

  • Availability in the China cloud
  • MSI (Managed Service Identity) support
  • Key Vault support
  • Mocking data for testing

It was a good presentation, but the session was targeted more at people who haven't used Logic Apps before.

Azure Functions role in integration workloads

Jeff Hollan gave an introduction to Azure Functions. He reviewed some basic concepts such as triggers and bindings.

He had some good points on where Functions fits into the integration landscape.

He also had an interesting demo where he sent 30,000 requests against a Function to show how it automatically scales. He also gave some tips on how to control how the scaling occurs. He was a bit short of time at the end, but he managed to introduce the concept of Durable Functions. Durable Functions is a way to write stateful Functions, which can be used to implement workflows.

Hybrid integration with Legacy Systems

This was actually the first presentation where BizTalk was mentioned. Paul Larsen and Valerie Robb announced Feature Pack 3 for BizTalk Server 2016. They focused a lot on the new Office 365 adapters for handling contacts, mail and calendars. They also spoke about compliance with the GDPR and FIPS privacy standards.

Compliance with GDPR and FIPS will also come in Cumulative Update 5 for BizTalk along with some other awaited features.

API Management overview

Miao Jiang talked about "The rise of APIs" and why we need an API Management platform to manage our APIs. He introduced some basic concepts in API Management such as policies, named values, products and subscriptions.

He made a few announcements such as better Application Insights integration, more metrics and KeyVault integration.

Eventing, Serverless and the Extensible Enterprise

Clemens Vasters held this presentation, and he had a slightly different focus than the previous speakers. This talk was more about architecture at a higher level and a little less focused on specific platforms.

He made it clear that services should not be scoped by how many artifacts or how much code they contain, but by ownership.

He defined two types of data exchange:

Messaging: where the publisher has an intent or expectation about what the consumer should do with the message. This would typically be implemented with Service Bus.

Eventing: informs consumers that something has happened, without caring much what the consumer does with the information.

He also defined two types of events:

Discrete events: independent and immediately actionable. For these, Event Grid would typically be used.

Event series: typically for streaming purposes where thresholds are monitored.

Dan Rosanova – The Reactive Cloud: Azure Event Grid

To start off his session, Dan showed us the ever-growing amount of information being processed through the cloud. To handle this, Dan shared his findings and experience on how to think when choosing the appropriate data handler for different types of data. Dan's primary focus during his session was Azure Event Grid and Azure Event Hubs. To make a gross oversimplification:

Dan started off talking about Azure Event Grid, which is a platform for consuming data from many different providers. He introduced a couple of concepts to bear in mind when working with Event Grid: communication between applications/organizations, individual messages (meaning that the messages don't depend on one another), push semantics (trigger events), pay as you go, and fan-out. Dan also mentioned new features like being able to limit the number of retries, hybrid endpoints, and a dead-letter endpoint for unconsumed messages/events.

Azure Event Hubs was the next topic of the session. A big part of it was dedicated to Kafka, the open-source streaming platform for which Event Hubs now offers a compatible endpoint. Dan provided a demo showing the ability to publish messages through the Kafka endpoint and receive them in Azure Event Hubs. Azure Event Hubs is used for fan-in purposes: acquiring large amounts of messages in a stream (think of it as an ordered sequence) with strict ordering.

To conclude the session, Dan shared his point of view that Azure Event Grid should be used for fan-out of data, whereas Azure Event Hubs is meant for fan-in purposes, and left us with this thought: the best tool depends on the context.

Divya Swarnkar & Jon Fancey - Enterprise Integration using Logic Apps

Divya Swarnkar and Jon Fancey entered the stage with a session about Enterprise Integration with Logic Apps and updated the participants on the improvements made during the past year, and things to come!

If you recall, in a previous session they announced that the SAP connector for Logic Apps is now in private preview; here we got a demo showing the new connector and its functionality. As we all know integration between SAP and BizTalk from our previous experience working with these systems, it was nice to see that the connector also lets you specify namespaces. Based on the namespace specified, the connector will listen for that namespace and trigger once a message containing it arrives. If no namespace is specified, the connector will consume all messages sent to a specific SAP Program ID.

They also shared additional functionality that they are currently working on for the SAP connector.

They also demonstrated a lot of new features that they are currently working on for the enterprise connectors.

Jon Fancey then took the stage and talked about improvements made on the mapping side of the integration account. They have added support for XSLT 3.0 and briefly showed Liquid templates; this was announced last year, so consider it a quick recap.

There was a demo of the OMS template for Logic Apps; they also showed the bulk resubmit feature and tracked properties. Tracked properties can now be added through the designer and not only in the code view. Coming features also include bulk download and an identification method for resubmitted runs of a Logic App.

And last but not least they gave us some information about things to come for Enterprise Integration. The biggest announcement is that they are currently working on a CONSUMPTION based pricing model for the Integration Account!

Kent Weare – Microsoft Flow in the enterprise

The last session of the day was Kent Weare's (Principal Program Manager, Microsoft Flow) presentation about Microsoft Flow. Microsoft Flow is a “simplification” of Logic Apps. The difference between Microsoft Flow and Logic Apps is that Microsoft Flow does not provide any code view to implement or inspect its components. Office 365 users already have access to this, so be sure to explore it; this is all about CS, AIM, FAS. Then Kent talked about BAP (Business Application Platform). In the middle of BAP we find PowerApps & Power BI, which are targeted at power users. Flow and CDS (Common Data Services) are used to move data in and out and/or apply business rules along the way before the information is sent to the subscriber of the information.

Flow enables application/flow development for end users of the product, meaning that you do not have to be trained in machine learning to be able to use its functionality. An example of this is Microsoft Flow's connector for LUIS (Language Understanding Intelligent Service), which was also shown in one of the demos. This means less involvement from IT departments is required, and therefore less impact on the critical IT processes the IT department is potentially working on.

During Kent's session, four demonstrations of Microsoft Flow applications were shown:

  1. Incident management with Teams and Flow Bot
  2. Integration with Excel - Office 365 functionality
  3. Intelligent customer service
  4. Hot dog or not hot dog - inspired by Silicon Valley S4E4

Last but not least, a roadmap for Microsoft Flow was presented.

Posted in: •Integration  | Tagged: •Integrate2018 


Welcome to Integration Summit 2017

iBiz Solutions and Microsoft welcome you to this year's most important event within system integration. The event will be held at the Microsoft HQ in Oslo on the 15th of November and then continue to Stockholm on the 16th of November. Take this opportunity to learn and be inspired by the future of system integration, both in the cloud and on-premises.

To check out the agenda and register, please click the following links:

Oslo, 15th of November

Stockholm, 16th of November

Posted in: •Integration  | Tagged: •Integration  •Summit2017 


Logic Apps deployment template extractor August Update

New updates have been added to the Template Extractor, and they all make it easier to do deployments with Logic Apps.

First off, we added support for the new advanced scheduling; this means that if you are using these features they will end up as parameters in the generated ARM template. A sample is shown here: if the Designer GUI looks like this, the parameters will end up like the following, making it easy to change settings between environments, as we often want more frequent runs in production than in the development environment.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "ReccurenceSample"
    },
    "RecurrenceFrequency": {
      "value": "week"
    },
    "RecurrenceInterval": {
      "value": 3
    },
    "RecurrenceStartTime": {
      "value": "2017-01-01T15:00:00"
    },
    "RecurrenceTimeZone": {
      "value": "W. Europe Standard Time"
    },
    "RecurrenceSchedule": {
      "value": {
        "hours": [
          3,
          10,
          15
        ],
        "minutes": [
          10,
          25
        ],
        "weekDays": [
          "Monday",
          "Wednesday",
          "Saturday"
        ]
      }
    }
  }
}

The next thing is updates to connectors, making it easier to auto-generate ARM templates for Logic Apps using Blob Storage. Since this is an Azure resource that can be accessed during deployment, we added full support for auto-generating it, so now only the storage account name is needed. The connection generated by the ARM Template extractor then looks as follows:

{
  "type": "Microsoft.Web/connections",
  "apiVersion": "2016-06-01",
  "location": "[parameters('logicAppLocation')]",
  "name": "[parameters('azureblob_name')]",
  "properties": {
    "api": {
      "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'azureblob')]"
    },
    "displayName": "[parameters('azureblob_displayName')]",
    "parameterValues": {
      "accountName": "[parameters('azureblob_accountName')]",
      "accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
    }
  }
}

The trick behind this is the ARM function “listKeys”: as long as the user running the deployment has sufficient access to the storage account (rights to list its keys), the key will be collected at deployment time, making life easier for us developers.

"accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"

Last but not least, we have had a new contributor in the form of MVP Wagner (love the help!), who has increased the usability by adding support for generating nested ARM templates in the same deployment. This makes it easier to deploy solutions that depend on several Logic Apps or other Azure resources, and also makes it possible to reference Key Vaults dynamically.

We think these features have really improved the experience and ease of use of the Logic App Template Creator, so I hope you like it.

I strongly recommend you try it and help out evolving it; more updates are coming, so stay tuned!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


API Management DTAP Pipeline with VSTS

In order to handle a full lifecycle, we need to be able to export APIs from our API Management Development instance to the Test instance, to an Acceptance Test instance if we have one, and later on to the Production instance.

There are several ways of doing this, and here we will go through the Git option. Out of the box, the API Management instance comes with a function to export the configuration to a dedicated Git repository (it comes with the product; read more about it here). You can then use this saved state to restore your APIM instance or export APIs to other instances.

Note that the Git export will not export users, Named Values (Properties) or subscriptions, and if Named Values (Properties) are used these need to be added manually before import to the new instance.

The workflow is simple: we develop new APIs and update old ones in the Development instance first; then the code is saved to the Git repository and a build is initiated in VSTS to publish the artifacts for use during deployment. Release/deployment to the next instance is then made by cloning that instance's repository and merging the changes, replacing URLs and so on with PowerShell scripts.

The Build Step

The build is all about creating artifacts for deployment to the next instance.

We do this by setting up a new build in VSTS; for more information about how to do this in VSTS, follow this link.

The build is rather small: we just create a new build with an Empty Process, pick “Remote Repo” and create a new Service Connection. A popup window shows up where you provide the credentials found on your APIM instance; don't forget to generate the password (annoyingly, it expires every 30 days and needs to be replaced). Read more about how to find credentials and URLs for the Git repository here.

The next step is to publish the artifacts; we now have the possibility to either push everything or add masks to prevent saving unnecessary settings. In this image we show how masking is used to publish artifacts only for a subset of APIs in the API Management instance (isolating the deployment to these APIs).

In order to create these masks you need to understand the Git structure; read more about how to understand the APIM source code in my earlier post here.

After an initiated build, the files related to the API are published and ready to be used. As you can see, we want to make sure that products, groups and policies are moved along as well, so that we can manage the base set from the Development instance (since we merge, special products or groups added in later instances will be left untouched).

The build part of this is now done, and we have our artifacts published and ready to use with our releases.

The Release Steps

As stated in the beginning, the release steps also work with Git: we will now clone the next instance's Git repository, merge the changes and push it all back to the same Git repository it came from; thereafter we will initiate a deployment from the Git repository to the API Management instance.

So how do we get the next instance's source code and merge it? Using PowerShell!

The release definition is actually only PowerShell and will look like this.

Cloning the target APIM Instance GIT repository

To make this simpler we have created a few PowerShell scripts that help out with these steps; the first one clones the repository. (Tip: we are using a shared repository for our PowerShell scripts and link it to the release via Artifacts -> Link an artifact source.)

param([string] $apimname, [string] $urlencodedkey, [string] $user = "apim", [string] $copyfrompath = ".\APIM\*")

"Cloning repository"

#create the paths
$gitpath = $apimname + ".scm.azure-api.net"
$destpath = ".\" + $gitpath + "\api-management\"

$url = "https://" + $user +":"+ $urlencodedkey + "@" + $apimname + ".scm.azure-api.net/";
$url

git clone $url -q

"Start Copy files"


Copy-Item -Path $copyfrompath -Destination $destpath -Recurse -Force

"Files copied"

As you can see from the above script we just need some parameters to get started, and we provide these via the Arguments textbox on the PowerShell script step; for easy reuse we use variables. Note that in this script the Git key needs to be URL encoded when provided as a variable.

-apimname $(apiname) -urlencodedkey $(apikey)  -copyfrompath ".\INT001 Work Orders APIM\APIM\*"

The copyfrompath value can be found via the browse button on the script path, pointing to where you want to start copying files from (the image shows a sample).

For easy reuse we provide the variables at the environment level (it is then easy to clone the environment or the whole release, and we are 80% done after just changing the variables, search paths and replacement scripts).

Changing URL’s and other values in the files

When the code is extracted we need to do some manipulation; it is almost always changing a URL plus some small adjustments. There are several ways of doing this, but we think the easiest way is to just use regex with the replace function. Here is a commonly used PowerShell script for replacing text via regex expressions, and a sample of how we use it with parameters.

param([string] $filepath ,[string] $regex, [string] $value)

(Get-Content $filepath) -replace $regex,$value | Out-File $filepath

The arguments provided need to point to the exact file location (on the disk where the release agent can find it) and the regex of what it should match (in the sample below we change the URL in the backend setting for the SOAP API we are using to the one matching the new instance).

-filepath "$(SYSTEM.ARTIFACTSDIRECTORY)\$(apiname).scm.azure-api.net\api-management\backends\BizTalkServiceInstance_Soa107AV4D\configuration.json" -regex '"url":\W*[0-9a-zA-Z:/\."]*' -value '"url": "https://newurlformysoapservice.com"'

The $(SYSTEM.ARTIFACTSDIRECTORY) is vital for finding the correct path, since this variable contains the path to the folder where we can find the file.

This is then repeated until all URLs or other settings that need to be changed have been replaced.

Create a commit and push the changes

When all changes have been made we need to push them to the targeted APIM instance's Git repository. This is also done via PowerShell, and here is a script we are using for it.

param([string] $apimname)
#create the paths
$gitpath = $apimname + ".scm.azure-api.net"

#Part publish Start
"Start Publishing"

#move to git folder
cd $gitpath

#set a git user
git config user.email "deployment@vsts.se"
git config user.name "Visual Studio Team Services"

#Add all new files
git add *

#comit and push
git commit -m "Deployment updates"
git push -q

"Publish done"

#return to original path
cd ..

As you can see, this script is simpler since we have all Git settings prepared already; all it does is commit locally and then push to the remote repository. When all this is done it's easy to clone this and reuse it on a new instance/environment or in a new release step for another API.

Deploy the changes from Git to the APIM instance

Since we often want to have control over the deployment, we normally do the deployment step manually: entering the Azure Portal, browsing to the APIM instance, going to the Repositories blade and pressing the Deploy to API Management button. This can also be automated via PowerShell or the REST API, as sketched below.
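As a rough illustration of that automation, a minimal sketch using the AzureRM API Management cmdlets could look like the following (the resource group and service names are placeholders, and the branch name may differ on your instance):

# Minimal sketch, assuming the AzureRM.ApiManagement module is installed and you are
# already logged in (Login-AzureRmAccount). Resource group and service name are placeholders.
$context = New-AzureRmApiManagementContext -ResourceGroupName "my-apim-rg" -ServiceName "my-apim-test"

# Deploy the configuration stored in the instance's Git repository, i.e. the same operation as
# pressing the Deploy to API Management button on the Repositories blade.
Publish-AzureRmApiManagementTenantGitConfiguration -Context $context -Branch "master" -PassThru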

Summary

Our experience of the Git deployment setup is good, although some minor issues have been found with new functionality, as when the backends part was added with the SOAP support. But that is only natural, and the product group has been really helpful in finding workarounds for problems and fixing the core issues.

With that said, there is room for improvement around the experience: using regex to replace URLs is not optimal, and parameter files would improve the experience and make it easier to automate URL changes and so on. The closest thing we have is Named Values, but these can't be used everywhere. Another downside is that these values need to be provided manually before deployment, otherwise it fails with a cryptic error; there is also the question of visibility into which URL the API is actually using, and it's not much fun to start using traces just to make sure the URL is properly set.

This option is working well for us, but we are looking forward to the upcoming ARM template support to see if we can get better and easier ways of handling parameters.

Posted in: •Azure API Management  •Integration  | Tagged: •DTAP  •API Management  •Deployments  •Azure  •GIT 


Logic Apps deployment template extractor Connections update

An exciting update is now released. Those of you who have used the Logic Apps Deployment Template Creator know that the support for connections has been poor. But now we have released an update that handles all connections: regardless of whether they go via a gateway or not, the needed parameters are generated and, in the end, populated into the parameters file. Some parameters still need to be set manually, but automating this is the next step in this area. The files are then redeployable right away, and we can move them to VSTS for source control and for setting up release pipelines.

API connections Samples:

First off we take a SQL Azure instance and create a Logic App that lists the content of a table (a simple start).
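Just to set the scene, a minimal sketch of running the extraction could look like the lines below (the module path, resource group, subscription and tenant names are placeholders, and the exact parameter names may differ between versions of the Template Creator):

# Minimal sketch, assuming the Logic App Template Creator module has been built or downloaded locally.
Import-Module "C:\tools\LogicAppTemplateCreator\LogicAppTemplate.dll"

# Extract the Logic App as an ARM template (all names and ids below are placeholders).
Get-LogicAppTemplate -LogicApp "SQLAzure" -ResourceGroup "my-rg" -SubscriptionId "00000000-0000-0000-0000-000000000000" -TenantName "mytenant.onmicrosoft.com" | Out-File ".\SQLAzure.json"

# Generate the matching parameters file from the template.
Get-ParameterTemplate -TemplateFile ".\SQLAzure.json" | Out-File ".\SQLAzure.parameters.json"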

Extracting this Logic App with the Template Creator and PowerShell will generate the following (read more about how to extract Logic Apps with PowerShell and the Deployment Template Creator in the earlier post):

Parameters: All Azure SQL connection settings are extracted and presented; all but the securestring parameters (username and password) are auto-populated with the values from the exported Logic App and API connection.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "SQLAzure"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "sql-1_name": {
      "value": "sql-1"
    },
    "sql-1_displayName": {
      "value": "SQL Azure"
    },
    "sql-1_server": {
      "value": "dummyserverone.database.windows.net"
    },
    "sql-1_database": {
      "value": "dummydatabase"
    },
    "sql-1_username": {
      "value": ""
    },
    "sql-1_password": {
      "value": ""
    }
  }
}

Logic App:

 {
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "SQLAzure",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "[resourceGroup().location]",
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "sql-1_name": {
      "type": "string",
      "defaultValue": "sql-1"
    },
    "sql-1_displayName": {
      "type": "string",
      "defaultValue": "SQL Azure"
    },
    "sql-1_server": {
      "type": "string",
      "defaultValue": "dummyserverone.database.windows.net",
      "metadata": {
        "description": "SQL server name"
      }
    },
    "sql-1_database": {
      "type": "string",
      "defaultValue": "dummydatabase",
      "metadata": {
        "description": "SQL database name"
      }
    },
    "sql-1_username": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Username credential"
      }
    },
    "sql-1_password": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Password credential"
      }
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/connections', parameters('sql-1_name'))]"
      ],
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {
            "$connections": {
              "defaultValue": {},
              "type": "Object"
            }
          },
          "triggers": {
            "manual": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "schema": {}
              }
            }
          },
          "actions": {
            "Get_rows": {
              "runAfter": {},
              "type": "ApiConnection",
              "inputs": {
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['sql_1']['connectionId']"
                  }
                },
                "method": "get",
                "path": "/datasets/default/tables/@{encodeURIComponent(encodeURIComponent('[SalesLT].[Customer]'))}/items"
              }
            }
          },
          "outputs": {}
        },
        "parameters": {
          "$connections": {
            "value": {
              "sql_1": {
                "id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/sql')]",
                "connectionId": "[resourceId('Microsoft.Web/connections', parameters('sql-1_name'))]",
                "connectionName": "[parameters('sql-1_name')]"
              }
            }
          }
        }
      }
    },
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('sql-1_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
        },
        "displayName": "[parameters('sql-1_displayName')]",
        "parameterValues": {
          "server": "[parameters('sql-1_server')]",
          "database": "[parameters('sql-1_database')]",
          "username": "[parameters('sql-1_username')]",
          "password": "[parameters('sql-1_password')]"
        }
      }
    }
  ],
  "outputs": {}
}

If you add the SQL username and password, this will be ready for deployment, creating/updating the API connection and creating/updating the Logic App.
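As a minimal sketch of such a deployment with the AzureRM cmdlets (the resource group name is a placeholder, and the parameters file is assumed to have the username and password filled in):

# Minimal sketch, assuming you are logged in (Login-AzureRmAccount) and the securestring
# values have been added to the generated parameters file. To deploy the renamed copy
# described below, change logicAppName and sql-1_name in the parameters file first.
New-AzureRmResourceGroupDeployment -ResourceGroupName "my-rg" -TemplateFile ".\SQLAzure.json" -TemplateParameterFile ".\SQLAzure.parameters.json"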

Redeploying this with the username/password set, and changing the Logic App name to SQLAzureFromVS and the API connection name to sql-2, will create a new Logic App and a new API connection and link them so they are ready to be used right away!

And the new API Connection is shown in the API Connections blade.

If this had been a SQL connector using a gateway, there would be some differences, but only in the API connection and the parameters file. The API connection will then have a gateway object in the parameterValues object.

{
  "type": "Microsoft.Web/connections",
  "apiVersion": "2016-06-01",
  "location": "[parameters('logicAppLocation')]",
  "name": "[parameters('sql')]",
  "properties": {
    "api": {
      "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'sql')]"
    },
    "displayName": "[parameters('sqldisplayName')]",
    "parameterValues": {
      "server": "[parameters('sql_server')]",
      "database": "[parameters('sql_database')]",
      "authType": "[parameters('sql_authType')]",
      "username": "[parameters('sql_username')]",
      "password": "[parameters('sql_password')]",
      "gateway": {
        "id": "[concat('subscriptions/',subscription().subscriptionId,'/resourceGroups/',parameters('sql_gatewayresourcegroup'),'/providers/Microsoft.Web/connectionGateways/',parameters('sql_gatewayname'))]"
      }
    }
  }
}

And the parameter file will contain two new parameters: sql_gatewayname, which should contain the name of the gateway, and sql_gatewayresourcegroup, which should contain the resource group the gateway is deployed in.

"sql_gatewayname": {
  "type": "string",
  "defaultValue": "Malos-LogicApp2015"
},
"sql_gatewayresourcegroup": {
  "type": "string",
  "defaultValue": "OnPremDataGateway"
}

As above, set the credentials, change the database settings to your new ones, point to the gateway via name and resource group, and we are good to go.

Mixing multiple connections is not a problem; here is a sample of how it will look when using both the Storage and Service Bus connectors. The simple sample below saves to storage and then publishes to a Service Bus queue.

As you can see below, the extractor will generate the needed connections and their parameters.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "ServicebusAndStorage",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "[resourceGroup().location]",
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "azureblob_name": {
      "type": "string",
      "defaultValue": "azureblob"
    },
    "azureblob_displayName": {
      "type": "string",
      "defaultValue": "dummy storage"
    },
    "azureblob_accountName": {
      "type": "string",
      "defaultValue": "dymmystorage",
      "metadata": {
        "description": "Name of the storage account the connector should use."
      }
    },
    "azureblob_accessKey": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Specify a valid primary/secondary storage account access key."
      }
    },
    "servicebus_name": {
      "type": "string",
      "defaultValue": "servicebus"
    },
    "servicebus_displayName": {
      "type": "string",
      "defaultValue": "dummy service bus"
    },
    "servicebus_connectionString": {
      "type": "securestring",
      "defaultValue": "",
      "metadata": {
        "description": "Azure Service Bus Connection String"
      }
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/connections', parameters('azureblob_name'))]",
        "[resourceId('Microsoft.Web/connections', parameters('servicebus_name'))]"
      ],
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {
            "$connections": {
              "defaultValue": {},
              "type": "Object"
            }
          },
          "triggers": {
            "manual": {
              "type": "Request",
              "kind": "Http",
              "inputs": {
                "schema": {}
              }
            }
          },
          "actions": {
            "Create_blob": {
              "runAfter": {},
              "type": "ApiConnection",
              "inputs": {
                "body": "hej hej",
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['azureblob']['connectionId']"
                  }
                },
                "method": "post",
                "path": "/datasets/default/files",
                "queries": {
                  "folderPath": "/orders",
                  "name": "@{guid()}.txt"
                }
              }
            },
            "Send_message": {
              "runAfter": {
                "Create_blob": [
                  "Succeeded"
                ]
              },
              "type": "ApiConnection",
              "inputs": {
                "body": {
                  "ContentData": "@{base64('hej hej')}",
                  "ContentType": "text"
                },
                "host": {
                  "connection": {
                    "name": "@parameters('$connections')['servicebus']['connectionId']"
                  }
                },
                "method": "post",
                "path": "/@{encodeURIComponent('ordersqueue')}/messages",
                "queries": {
                  "systemProperties": "None"
                }
              }
            }
          },
          "outputs": {}
        },
        "parameters": {
          "$connections": {
            "value": {
              "azureblob": {
                "id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/azureblob')]",
                "connectionId": "[resourceId('Microsoft.Web/connections', parameters('azureblob_name'))]",
                "connectionName": "[parameters('azureblob_name')]"
              },
              "servicebus": {
                "id": "[concat(subscription().id,'/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/servicebus')]",
                "connectionId": "[resourceId('Microsoft.Web/connections', parameters('servicebus_name'))]",
                "connectionName": "[parameters('servicebus_name')]"
              }
            }
          }
        }
      }
    },
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('servicebus_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'servicebus')]"
        },
        "displayName": "[parameters('servicebus_displayName')]",
        "parameterValues": {
          "connectionString": "[parameters('servicebus_connectionString')]"
        }
      }
    },
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('azureblob_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/', 'azureblob')]"
        },
        "displayName": "[parameters('azureblob_displayName')]",
        "parameterValues": {
          "accountName": "[parameters('azureblob_accountName')]",
          "accessKey": "[parameters('azureblob_accessKey')]"
        }
      }
    }
  ],
  "outputs": {}
}

The parameter file will look like the following; only the access key for blob storage and the Service Bus connection string are not auto-populated with current values.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "ServicebusAndStorage"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "azureblob_name": {
      "value": "azureblob"
    },
    "azureblob_displayName": {
      "value": "dummy storage"
    },
    "azureblob_accountName": {
      "value": "dymmystorage"
    },
    "azureblob_accessKey": {
      "value": ""
    },
    "servicebus_name": {
      "value": "servicebus"
    },
    "servicebus_displayName": {
      "value": "dummy service bus"
    },
    "servicebus_connectionString": {
      "value": ""
    }
  }
}

Key Vault integration

A small step has been taken for Key Vault integration; read more about using Key Vault with ARM deployments here. When using the Get-ParameterTemplate operation there is a new parameter, -KeyVault; it can be either “None” or “Static”, and when used with Static, as in the example code below, a static reference will be generated for Key Vault integration. When deployed, the value stored in the secret will be used as the parameter value, separating secrets from deployment templates.

Get-ParameterTemplate -TemplateFile $filenname -KeyVault Static | Out-File $filennameparam

With one of our earlier samples, it will look like this when used with Static:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "SQLInsertOnPrem"
    },
    "logicAppLocation": {
      "value": "[resourceGroup().location]"
    },
    "sql_name": {
      "value": "sql"
    },
    "sql_displayName": {
      "value": "SQL server OnPrem"
    },
    "sql_server": {
      "value": "."
    },
    "sql_database": {
      "value": "InvoiceDatabase"
    },
    "sql_authType": {
      "value": "windows"
    },
    "sql_username": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.KeyVault/vaults/{vault-name}"
        },
        "secretName": "sql_username"
      }
    },
    "sql_password": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.KeyVault/vaults/{vault-name}"
        },
        "secretName": "sql_password"
      }
    },
    "sql_gatewayname": {
      "value": "Malos-LogicApp2015"
    },
    "sql_gatewayresourcegroup": {
      "value": "OnPremDataGateway"
    }
  }
}

Replace the {bracketed} values with the correct ones. The secretName is the name of the secret and can also be replaced; for simplicity we generate it with the same name as the parameter.

{subscriptionid} = your subscriptionid

{resourcegroupname} = the resourcegroup where the keyvault is deployed

{vault-name} = the name of the vault

Or just copy the resource ID found on the Properties blade of the Key Vault, as the image shows.
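As a minimal sketch of preparing such a vault (vault name, resource group and secret values are placeholders; note that the vault must be enabled for template deployment for the static references to resolve):

# Allow ARM deployments to read secrets from the vault (needed for parameter file references).
Set-AzureRmKeyVaultAccessPolicy -VaultName "my-vault" -ResourceGroupName "my-keyvault-rg" -EnabledForTemplateDeployment

# Create the secrets referenced by the generated parameter file (values are placeholders).
Set-AzureKeyVaultSecret -VaultName "my-vault" -Name "sql_username" -SecretValue (ConvertTo-SecureString "myuser" -AsPlainText -Force)
Set-AzureKeyVaultSecret -VaultName "my-vault" -Name "sql_password" -SecretValue (ConvertTo-SecureString "mypassword" -AsPlainText -Force)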

We think these features have really improved the experience and ease of use of the Logic App Template Creator, so I hope you like it.

I strongly recommend you try it and help out evolving it; more updates are coming, so stay tuned!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps deployment template extractor updates june

Another update to the Logic App Template Extractor has come out; read more about the Logic App Template Extractor in the earlier post Logic Apps Deployment Template Extractor.

This update focuses on usage of Integration Account actions and the standard HTTP action. With this update it's easier to create a standard template for multiple flows; we often see the same patterns again and again. This update makes it easy to create a standard flat file/XML Transform pattern that is reusable in multiple flows with just different parameter files, as sketched below.
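As a rough sketch of that reuse (file and resource group names are placeholders), the same template is deployed once per flow, varying only the parameter file:

# One shared template, one parameter file per integration flow (all names are placeholders).
$template = ".\FlatFileTransform.json"

New-AzureRmResourceGroupDeployment -ResourceGroupName "integration-rg" -TemplateFile $template -TemplateParameterFile ".\INT0021.parameters.json"
New-AzureRmResourceGroupDeployment -ResourceGroupName "integration-rg" -TemplateFile $template -TemplateParameterFile ".\INT0022.parameters.json"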

Flat File Decode and Encode

The flat file actions Decode and Encode contain a schema property; it holds the name of the schema, and this is extracted automatically as a parameter so it can easily be set via a parameter file.

And in the extracted ARM template we will automatically get:

"parameters": {
...
  "Flat_File_Decoding-SchemaName": {
      "type": "string",
      "defaultValue": "INT0021.SystemA.DailyStatistics"
    },
    "Flat_File_Encoding-SchemaName": {
      "type": "string",
      "defaultValue": "INT0021.SystemB.DailyStatistics"
    }
...
  "actions": {
    "Flat_File_Decoding": {
      "runAfter": {},
      "type": "FlatFileDecoding",
      "inputs": {
        "content": "@{triggerBody()}",
        "integrationAccount": {
          "schema": {
            "name": "[parameters('Flat_File_Decoding-SchemaName')]"
          }
        }
      }
    },
    "Flat_File_Encoding": {
      "runAfter": {
        "Transform_XML": [
          "Succeeded"
        ]
      },
      "type": "FlatFileEncoding",
      "inputs": {
        "content": "@{body('Transform_XML')}",
        "integrationAccount": {
          "schema": {
            "name": "[parameters('Flat_File_Encoding-SchemaName')]"
          }
        }
      }
    },

XML Transform

The XML Transform action contains a map property; this is the name of the map, and it is extracted automatically as a parameter so it can easily be set via a parameter file.

And in the extracted ARM template we will automatically get:

"parameters": {
...
 "Transform_XML-MapName": {
  "type": "string",
  "defaultValue": "INT0021.DailyStatistics.SystemA.To.SystemB"
 }
...
  "actions": {
  	....
    ,
    "Transform_XML": {
      "runAfter": {
        "Flat_File_Decoding": [
          "Succeeded"
        ]
      },
      "type": "Xslt",
      "inputs": {
        "content": "@{body('Flat_File_Decoding')}",
        "integrationAccount": {
          "map": {
            "name": "[parameters('Transform_XML-MapName')]"
          }
        }
      }
    }

Http

The HTTP action is widely used and common in all types of setups; moving from Dev to Test normally involves changes to both the URI and the authentication parameters. Now these are promoted to ARM template parameters to make it easier to deploy changes between environments.

The basic case, promoting the URI

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 }
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {	   
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 
 

Later on you will probably use some sort of authentication, and the extractor will push these settings to the ARM template as well.

Http Basic Authentication

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 },
 "HTTP-Password": {
  "type": "string",
  "defaultValue": "myusername"
 },
 "HTTP-Username": {
  "type": "string",
  "defaultValue": "mypassword"
 }
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {
	  	"authentication": {
          "password": "[parameters('HTTP-Password')]",
          "type": "Basic",
          "username": "[parameters('HTTP-Username')]"
        },	   
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 

Http Client Certificate Authentication

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 },
 "HTTP-Password": {
  "type": "string",
  "defaultValue": "myusername"
 },
 "HTTP-Pfx": {
  "type": "string",
  "defaultValue": "mypfx"
 }
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {	   
	  	"authentication": {
          "password": "[parameters('HTTP-Password')]",
          "pfx": "[parameters('HTTP-Pfx')]",
          "type": "ClientCertificate"
        },
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 

Http Active Directory OAuth

"parameters": {
...
 "HTTP-URI": {
  "type": "string",
  "defaultValue": "http://google.se"
 },
 "HTTP-Audience": {
  "type": "string",
  "defaultValue": "myaudience"
 },
 "HTTP-Authority": {
  "type": "string",
  "defaultValue": "https://login.microsoft.com/my"
 },
 "HTTP-ClientId": {
  "type": "string",
  "defaultValue": "myclientid"
 },
 "HTTP-Secret": {
  "type": "string",
  "defaultValue": "https://login.microsoft.com/my"
 },
 "HTTP-Tenant": {
  "type": "string",
  "defaultValue": "mytenant"
 },
...
	"HTTP": {
	  "type": "Http",
	  "inputs": {
	  	"authentication": {
          "audience": "[parameters('HTTP-Audience')]",
          "authority": "[parameters('HTTP-Authority')]",
          "clientId": "[parameters('HTTP-ClientId')]",
          "secret": "[parameters('HTTP-Secret')]",
          "tenant": "[parameters('HTTP-Tenant')]",
          "type": "ActiveDirectoryOAuth"
        },	   
	    "body": {
	      "test": "data"
	    },
	    "method": "POST",
	    "uri": "[parameters('HTTP-URI')]"
	  },
	  "conditions": []
	} 
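
To illustrate how environment-specific values are then supplied, here is a rough sketch of a parameters file for the ActiveDirectoryOAuth case. All values are made up, and in a real pipeline the secret would typically be injected from a secure store at deployment time rather than checked into source control:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "HTTP-URI": { "value": "https://api.test.example.com/orders" },
    "HTTP-Audience": { "value": "https://api.test.example.com" },
    "HTTP-Authority": { "value": "https://login.microsoftonline.com/" },
    "HTTP-ClientId": { "value": "00000000-0000-0000-0000-000000000000" },
    "HTTP-Secret": { "value": "replaced-at-deployment-time" },
    "HTTP-Tenant": { "value": "mytenant.onmicrosoft.com" }
  }
}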

Tools like the Logic App Template Creator help us focus on the fun and rewarding parts: building great solutions for our customers.

I strongly recommend that you try it out and help evolve it; more updates are coming, so stay tuned!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps Parameters vs ARM Parameters

I got the question about what the difference is between ARM template parameters and Logic App parameters and when these should be used, so that is what I’ll try to explain in this post.

First off, ARM template parameters are used within ARM templates, and ARM templates are used when deploying ARM-based resources to Azure; a Logic App is one such resource deployed via an ARM template. The workflow definition language behind Logic Apps is very similar to ARM templates, and therefore it can be tricky to see the difference in the beginning.

So let's start off with ARM template parameters: where they are and how they work. If you are interested in more info about ARM templates, read more at https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-authoring-templates.

ARM Template Parameters

An ARM template with an empty Logic App looks as follows: two ARM template parameters, logicAppName and logicAppLocation, and one resource of type Microsoft.Logic/workflows. Inside the Microsoft.Logic/workflows resource, the Logic App parameters are found in the parameters object under properties.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "defaultValue": "",
      "metadata": {
        "description": "Name of the Logic App."
      }

    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    }
  },
  "variables": {
  },
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {},
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}

So again, the ARM template parameters are found here, containing a parameter named Flat_File_Encoding-SchemaName:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "Flat_File_Encoding-SchemaName": {
	  "type": "string",
	  "defaultValue": "TEST-INT0021.Intime.DailyStatistics"
	},

ARM template parameters use the syntax [parameters('armparam')], so accessing the ARM parameter named Flat_File_Encoding-SchemaName would look like:

"name": "[parameters('Flat_File_Encoding-SchemaName')]"

This value is evaluated during deployment, so if the ARM parameter looks like this (and no parameters file is used):

,
"Flat_File_Encoding-SchemaName": {
  "type": "string",
  "defaultValue": "TEST-INT0021.Intime.DailyStatistics"
},

The result would look like this after deployment:

Code View

"name": "TEST-INT0021.Intime.DailyStatistics"

Designer View

In the Designer View the value is already evaluated and easy to read for operations.

ARM Template Variables

This is an almost unknown feature that is used far too little. Variables are where we can evaluate expressions for later use, which is really practical when we want to combine two parameter values or use functions to generate a value that should be consistent throughout the Logic App.

To show how this works we will concatenate two parameters, which could be used for creating resource links etc. The simple Logic App will just contain a Response action whose body is the evaluated, concatenated value of the parameters.

The variable can be found after the parameters in the ARM template:

    }
  },
  "variables": {
  },
  "resources": [

When adding a variable we can then access parameters as input to the evaluation; this makes it possible to combine two parameters with the concat function, and it will look like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "metadata": {
        "description": "Name of the Logic App."
      }
    },
    "flowname": {
      "type": "string",
      "metadata": {
        "description": "Name of the flow"
      }
    }
  },
  "variables": {
    "combinedflowname" :  "[concat(parameters('logicAppName'),'-',parameters('flowname'))]"
  },
  "resources": [

So the variable is created and assigned a value; now we just need to use it, and the syntax for accessing the value is simple:

"Response": {
  "inputs": {
    "body": "[variables('combinedflowname')]",
    "statusCode": 200
  },
  "runAfter": {
   
  },
  "type": "Response"
}

The variables are evaluated during deployment, just like the ARM template parameters. This means that after deployment the value is displayed as plain text: if logicAppName was set to INT001-Example and flowname was set to Orders, the evaluated result of the concat function would be INT001-Example-Orders. When looking at the deployed Logic App it will look like this:

Logic App Parameters

The Logic App parameters are found in this section under "parameters", here containing a parameter named "url":

"resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "dependsOn": [],
      "properties": {
        "definition": {},
        "parameters": {
			"url": {
	          		"defaultValue": "http://www.google.se/",
	         	 	"type": "String"
	        	}
			}
      }
    }
  ],

The Logic App parameters have a different syntax: @parameters('laparam'). Accessing the Logic App parameter named url would look like:

"uri": "@parameters('url')"

These parameters are evaluated at runtime, which means that the value is not resolved at deployment or shown in the designer. Even after a deployment or a first run it will always look like this, but at runtime the value behind the parameter is picked up. The result would look like this after deployment:

Code View

"uri": "@parameters('url')"

Designer View

But during runtime it will be evaluated, and if we check a run it will look like this:

Summary

It's good to know when and why to use the different types of parameters; I've seen a lot of overuse of Logic App parameters, and I just want to share how these are treated. We want to push static values to ARM template parameters so that the values are easy to see when checking the Logic App in the designer to verify that everything is OK, since Logic App parameters will "hide" the value in Code View. It also makes it easy to switch values between DEV/TEST/PROD environments, since these values often change between environments. Also make sure to use ARM template variables whenever you need to reuse a computed result across your Logic App.

With that said, when using reference objects for translations or lookups, Logic App parameters are the obvious choice since they are easy to access. We have used this successfully when keys are sent in the input data and then used in lookups, e.g. @{parameters('myobjectparam')[triggerBody()['key']]}
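
As a minimal sketch of that lookup pattern, assume a hypothetical Logic App object parameter named countrycodes and a trigger body containing a country property; a Compose action could then resolve the value like this:

"parameters": {
  "countrycodes": {
    "type": "Object",
    "defaultValue": {
      "Sweden": "SE",
      "Norway": "NO"
    }
  }
},
"actions": {
  "Compose_country_code": {
    "type": "Compose",
    "inputs": "@parameters('countrycodes')[triggerBody()['country']]",
    "runAfter": {}
  }
}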

So make sure to use the appropriate type at the right time, and I hope you got some more insight into how, when and where to use these three different kinds of parameters.

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template 


Logic Apps deployment template extractor trigger updates

Updates to the Logic App Template Extractor have come out; read more about it in the earlier post Logic Apps Deployment Template Extractor.

Recurrence properties on triggers automatically become ARM template parameters

With a recurrence trigger we set an interval and a frequency, and we often want different values in dev/test and production, since each trigger execution is a billable action and the frequency is therefore something we commonly change between environments. Any trigger that has recurrence properties, such as Interval and Frequency, now automatically gets these generated as ARM template parameters. Here we can see the standard recurrence trigger:

And in the extracted ARM template we will automatically get:

"parameters": {
...
 "RecurrenceFrequency": {
      "type": "string",
      "defaultValue": "Day"
    },
    "RecurrenceInterval": {
      "type": "int",
      "defaultValue": 1
    }
...
  "triggers": {
    "Recurrence": {
      "recurrence": {
        "frequency": "[parameters('RecurrenceFrequency')]",
        "interval": "[parameters('RecurrenceInterval')]"
      },
      "type": "Recurrence"
    }
  },
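
As a rough sketch, a production parameters file could then override these generated parameters, for example to run every five minutes instead of once a day (the values are just an example):

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "RecurrenceFrequency": { "value": "Minute" },
    "RecurrenceInterval": { "value": 5 }
  }
}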

File Connector and Base64 paths

The File Connector, amongst others, saves the folder path in Base64 format to make sure that the path is valid; for the File Connector this applies to the trigger and the List files action. The designer works with this without any problems, but as soon as you want to automate the deployment, this is something we need to know about and understand how to handle.

Understanding the Base64 model

So let's start with understanding the Base64 model used in these actions/triggers. First we pick our path:

And if we then go in and look at the code view:

"List_files_in_folder": {
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['filesystem_1']['connectionId']"
            }
        },
        "method": "get",
        "path": "/datasets/default/folders/@{encodeURIComponent('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')}"
    },
    "metadata": {
        "XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=": "\\\\server\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    },
    "runAfter": {},
    "type": "ApiConnection"
}

As we can see, the path we picked in the designer is in the metadata tag, and the name of the property looks like a "random" name. The "random" name is not so random, though; it's actually the Base64 representation of the path.

\\server\ARBETSORDER\OUT\XML_EXPORT\TO = decodeBase64('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=').

The metadata tag is used in the GUI to present the path as text, and the actual value sent to the API connection is found in the path property:

"path": "/datasets/default/folders/@{encodeURIComponent('XFxcXHNlcnZlclxcQVJCRVRTT1JERVJcXE9VVFxcWE1MX0VYUE9SVFxcVE8=')}"

So, in order to handle deployments, the path property is the actual value we need to change, but if we want the GUI to present the value we also need to update the metadata tag. To do this there is a handy function available in ARM templates: [base64('somevalue')]. The Deployment Template Creator handles this, and the full extract will look like this:

"parameters": {
...
"List_files_in_folder-folderPath": {
      "type": "string",
      "defaultValue": "\\\\server\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    },
}
...
	"List_files_in_folder": {
	  "runAfter": {},
	  "metadata": {
	    "[base64(parameters('List_files_in_folder-folderPath'))]": "[parameters('List_files_in_folder-folderPath')]"
	  },
	  "type": "ApiConnection",
	  "inputs": {
	    "host": {
	      "connection": {
	        "name": "@parameters('$connections')['filesystem_1']['connectionId']"
	      }
	    },
	    "method": "get",
	    "path": "[concat('/datasets/default/folders/@{encodeURIComponent(''',base64(parameters('List_files_in_folder-folderPath')),''')}')]"
	  }
	}

Setting the parameter in the parameters file to the new value will then populate both the metadata used by the designer and the correct path value sent to the File Connector.
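
As a small sketch, the parameters file for a test environment could then simply contain the plain-text path (the server name here is made up); the base64() function in the template takes care of the encoding during deployment:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "List_files_in_folder-folderPath": {
      "value": "\\\\testserver\\ARBETSORDER\\OUT\\XML_EXPORT\\TO"
    }
  }
}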

Small fixes

There have also been a number of small fixes; the parameter handling now supports all kinds of types, so we can have objects, strings, integers etc.

Here is a sample of how a parameter of type Object inside a Logic App will be handled:

 "parameters": {
    "ismanager": {
        "defaultValue": {
            "0": 572,
            "1": 571,
            "No": 572,
            "Yes": 571
        },
        "type": "Object"
    },

After generating the ARM template, the Logic App parameter will be pushed up to an ARM template parameter, making the ARM template look like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
	"parameters": {
		...
		"paramismanager": {
		  "type": "Object",
		  "defaultValue": {
		    "0": 572,
		    "1": 571,
		    "No": 572,
		    "Yes": 571
		  }
		},
		....
	},
	"variables": {},
	"resources": [
	    {
	      "type": "Microsoft.Logic/workflows",
	      "apiVersion": "2016-06-01",
	      "name": "[parameters('logicAppName')]",
	      "location": "[parameters('logicAppLocation')]",
	      "dependsOn": [],
	      "properties": {
	        "definition": {
	          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
	          "contentVersion": "1.0.0.0",
	          "parameters": {
	            "ismanager": {
	              "defaultValue": "[parameters('paramismanager')]",
	              "type": "Object"
	            },
				...
			 }
          },
          "outputs": {}
        },
        "parameters": {}
      }
    }
  ],
  "outputs": {}
}
				

Tools like the Logic App Template Creator help us focus on the fun and rewarding parts: building great solutions for our customers.

I strongly recommend that you try it out and help evolve it!

https://github.com/jeffhollan/LogicAppTemplateCreator

Posted in: •Logic Apps  •Integration  | Tagged: •DTAP  •Logic Apps  •Deployments  •Azure  •ARM  •ARM Template