What is it that makes the world we live and work in different from the world we knew 10 years ago? And what will it look like if we look five years ahead in time?
Last week I discovered Chris Hadfield on Twitter (@Cmdr_Hadfield). If you haven’t heard of him, he is a Canadian astronaut currently living in space aboard the ISS as a Flight Engineer.
Chris takes photos from outer space with his system camera and shares the pictures on Twitter. From his micro blog he also comments on and tells us non-astronauts what life is like for a person living in space. You will find spectacular pictures from his camera on Twitter, and you can of course also ask him questions and follow his discussion threads with other earthlings. As a nerd I find this, of course, amazing, and now I just have to know what it is like in space and what the earth looks like from up there. It also made me think of things closer to my daily life as an Enterprise Architect, particularly my involvement in systems integration at mid-size and larger companies, where I often help companies prepare to stay innovative: assisting them in the preparation work needed to give their customers the best experience and to make informed decisions with the help of IT. In my simple earthling thoughts I looked back in time and also made an attempt to look forward into the future, and this is what I came up with. So how come I started to think of my job instead of continuing with high-flying plans?
Well; 10 years ago I didn’t know that I would write an activity log on the web (Facebook), I didn’t know that my phone would have everything I need, from a flashlight to the internet and even a telephone, I didn’t know that I would use Instagram to share my images; hell, I didn’t even know that I had a need to share images… I certainly didn’t know that I would be “familiar” with an astronaut and able to chat with him from my mobile. The only contact you had with astronauts at that time was through the Discovery Channel, or on the news when well-known reporters had a few minutes to interview the astronauts with well-prepared questions. The media companies also had the power to adjust the questions and answers to suit their own purposes, be they political or religious. Excuse me for being a bit paranoid…
There are so many things I didn’t know about at that time, but in one sense I made a wise choice, even if I wasn’t fully aware of it, when deciding at the beginning of the 2000s to focus not only on technology, SOA and information management, but also on the governance of services, something that pays off over and over again as new technical enablers arise. A stable ground to build on, one that can absorb the changes in the way we communicate and do business, makes it possible to keep up and to change communication channels as the market changes. With well-defined SOA interfaces, and with processes to govern and develop the services so that they really mirror your organization’s daily business, you will stand prepared to use the new opportunities that your customers will take for granted tomorrow. You can rely on this because information is relatively stable and not sensitive to technical changes.
Looking ahead 5-10 years from now, we will most probably have developed in a totally different direction from what we believe is possible today, both in the way we do business and share our knowledge with our customers and friends, and in the tools at hand to support us in decision making and forecasting. Evolution has brought us to where we are today over a few billion years, and you can see that the slope has steepened over the last hundred years, but now things are about to change radically. The number one reason we will see a dramatic increase in the speed at which our behavior and our social and business habits change is the introduction of cloud computing.
With the introduction of the cloud, innovation will burst and we will most probably see technical change happen at a pace never seen before. This is because the cloud gives every developer and innovator (you and I, and even our neighbor) a low-cost supercomputer to carry out their vision, at a cost that is affordable, or at least worth the economic risk. The cloud will for sure change the way we act, and the basis on which we take decisions, in the future.
For companies that actively maintain an IT management and integration strategy this is good news, since they can benefit from the new opportunities in a fast and cost-effective way. Companies that have taken shortcuts and relied on technology alone will have it harder, and cannot take advantage of the cloud in the same way as competitors that had a plan for their business IT.
By the way: just when I finished this blog post I Googled the title, and the Google super cloud computer told me that my title was one of 20,600,000 pages containing the sentence “Sharing is caring”. Google also had the nerve to calculate the number of results in 0.26 seconds. Worse than Google, my colleagues laughed and told me that I was probably the last person in the universe not to know that “Sharing is caring” is a commonly used expression (I will of course ask Chris Hadfield about this).
Never mind! Even though I love preparing for new enablers in IT, I am still a stone-age man in his best years, and I can make decisions that are not recommended by a machine just because I feel like it; keeping the title of this blog post is my way of proving it. Hopefully that shows there is room for us people in the future, despite all the IT innovators sitting in front of their computers developing the most stunning artificial intelligences, deployed on a supercomputer near you, with the capability to index and analyze billions and billions of pages and throw a result in your face that you did not want to know, all in a fraction of a second.
Or maybe I am wrong; maybe there is no room for a grumpy man in his best years in the future.
Who knows? Only time will tell. But at least you can prepare for change, because it will come. And you can prepare by looking not only at technical enablers: make a plan for how you should meet the future, and make sure that integration is a part of that plan.
On March 7, iBiz Solutions and CompareKvinna arrange an After Work evening where integration is on the schedule. Two of iBiz Solutions’ stars, Marie Högkvist and Therese Axelsson, will talk about life as a systems integrator and the challenges and opportunities of integration.
“It feels really exciting and great to be trusted to present iBiz Solutions and integration during this evening, and to simply describe the efficiency and effects integration can have,” says Therese Axelsson.
“I often meet people who do not know what system integration means. It’s going to be great to talk about the possibilities of integration and why it is such an interesting topic,” continues Marie Högkvist.
In addition to an exciting presentation by these two women, food and drinks will be served.
If you wish to participate, register before March 1, at this link.
653 40 Karlstad
17.15-17.45: Welcome and mingle
19.15: Food, drinks and mingle
I’ve had the fortune to create two different invoice routing applications in BizTalk. Lessons learned from the first meant we made some improvements to the solution when it was time for #2 :)
Here I’m going to try to explain all the pitfalls and “good-to-knows”, which will hopefully save you a couple of hours of head scratching.
Some clarification before we start. I presume that you have already handled the invoice mapping from your sending invoice system, and that the invoice layout is already done, so that you are left with one or two files that you need to route to the appropriate customer. When talking about invoicing there are often two types of files involved: one xml file with the raw data, and one pdf or image file that visualizes the data from the xml invoice.
The xml invoice is sent to brokers that can deliver the invoice electronically to customers, and the pdf/image file is used for email, or for printing and sending the more traditional way: by post. In the example below I use an xml message of the type E2B.
Step 1: Pair the xml and pdf/image file with each other when BizTalk receives them. You can’t know exactly when you will receive them, or in which order. For this we have set up an orchestration with a scope containing a parallel action shape that waits for both files to be received from two different receive ports: one for the xml and one for the pdf.
We have a timeout set on the scope that fires after a given time if we haven’t received both files. This way we can take proactive action before the customer calls and wonders where the invoice is; it could be stuck in BizTalk forever if you don’t tell BizTalk what to do.
For this receive scenario we use a correlation set on the filename. Let’s presume the filename is set in the following format: [SendingOrgNumber]_[ReceiverOrgNumber]_[InvoiceNumber].[FileType]. We then need to match on everything except the [FileType]. For this we create a custom pipeline component, to be used in the receive pipeline, that strips the file extension and promotes the value as ReceivedFileName in the default context.
In this way we get two files that are named exactly the same, which lets BizTalk correlate them into the same orchestration instance. Fortunately we receive the files on different ports, so even though we are inside the orchestration with two identically named files, we know which is the xml and which is the pdf/image file.
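Outside BizTalk, the pairing rule boils down to stripping the extension so that both files share one correlation key. A small Python sketch of the idea (the filenames are made up):

```python
import os


def correlation_key(filename: str) -> str:
    """Strip the file extension so the xml and the pdf of the same
    invoice share one key, mirroring what the custom pipeline component
    promotes as ReceivedFileName.
    Expected format: Sender_Receiver_InvoiceNumber.ext"""
    return os.path.splitext(os.path.basename(filename))[0]


# Both halves of the same invoice yield the same correlation key:
xml_key = correlation_key("556677_998877_INV1001.xml")
pdf_key = correlation_key("556677_998877_INV1001.pdf")
assert xml_key == pdf_key == "556677_998877_INV1001"
```

The same key is what the BizTalk correlation set matches on, which is why both files end up in the same orchestration instance.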
Step 2: Set up routing information. Paper, email or electronic invoice? There is more than one way to get information on how to route the invoice. In the first solution I built, we had master data about each customer in a database and fetched routing information based on the organization number taken from the message itself. In the second one, the information was already in the xml file when we received it. Here it is up to you how to get this information.
The important thing is to promote this value so that we can filter on it on the send ports in the last step, e.g. 1 = paper invoice, 2 = email, 3 = eInvoice.
Step 3: Decide if the message should be routed via email or not. If the routing is email, there is a dynamic send port set up in the orchestration for this scenario.
The only thing we need to do is to construct the message, promote the receive type and reset the filename with its file extension, as shown in the picture above. The last step before sending the invoice through the dynamic email port is to set up the SMTP settings (here we read settings from an appsettings file, which makes it easier to change values without actually redeploying the solution). It is important to set the outbound SMTP server and credentials inside the construct message shape, and also the email address, which is set in an expression shape.
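An expression shape for this could look something like the sketch below (the message and port names, config keys and addresses are illustrative inventions; the SMTP.* context properties come from BizTalk’s SMTP adapter property schema):

```
// Illustrative only: EmailMessage, DynamicEmailPort, customerEmail and the
// AppSettings keys are made-up names for this sketch
EmailMessage(SMTP.SMTPHost) = System.Configuration.ConfigurationManager.AppSettings["SmtpHost"];
EmailMessage(SMTP.Username) = System.Configuration.ConfigurationManager.AppSettings["SmtpUser"];
EmailMessage(SMTP.Password) = System.Configuration.ConfigurationManager.AppSettings["SmtpPassword"];
EmailMessage(SMTP.Subject) = "Invoice " + invoiceNumber;

// The recipient address and transport are set on the dynamic port itself
DynamicEmailPort(Microsoft.XLANGs.BaseTypes.Address) = "mailto:" + customerEmail;
DynamicEmailPort(Microsoft.XLANGs.BaseTypes.TransportType) = "SMTP";
```

Reading the host and credentials from AppSettings is what makes the solution reconfigurable without a redeploy.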
Step 4: Send to the messagebox. If the routing was not email, we really need the routing information in the context of the message. This is where we send the file to the appropriate send port and out of BizTalk’s control. We could set up send ports in the orchestration for each scenario, i.e. one send port for the eInvoice xml and one for the eInvoice pdf, then one more for the printhouse pdf, and maybe we want archiving… That way you would end up with an orchestration full of send ports, and every time you need to add a scenario you would have to redeploy the solution. Instead we set up one send port for the xml and one for the pdf file, both sent to the messagebox.
From there we can set up send ports in the BizTalk Admin Console and filter on receive type and message type. That gives us a clean orchestration with all other configuration in the admin console, easily configurable and easy to extend with one more subscription.
One very important thing to keep in mind when sending the xml and pdf to the messagebox: if you want to be able to filter on custom context properties on the pdf, you have to set up a correlation set for the pdf file. To make custom context values show up as promoted on the message, you first have to promote BTS.MessageType on the pdf send port. Create a new correlation set; in the Correlation Properties window, scroll down to BTS and choose MessageType, and also choose your custom context values from your own custom property schema. Set the initializing correlation set on the xml send to the correlation set you just created, and then set the following correlation set on the pdf send to the same one.
Step 5: Filtering on send ports. The last step is to set up the actual filter and send the invoice. Here it is up to you how to send the files. In my scenario the xml file has an xml namespace, while the pdf file, which is treated as a System.Xml.XmlDocument inside the orchestration, has no namespace.
So if I want to send only the xml file to the eInvoice broker, I can set up a filter like this (in my scenario the invoice receiveType is 3 for eInvoice).
If you only want to send the pdf, all you have to do is set up a filter saying that MessageType is not like http://www.e2b.no/XMLSchema/Internal#Interchange. In this way you can set up arbitrarily many send ports subscribing to the same message, without the need to change or redeploy the BizTalk application.
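In the BizTalk Administration Console the two filters described above might look like the following (the custom property schema namespace is an assumption; BTS.MessageType and the E2B message type are from the scenario above):

```
Send port for the eInvoice xml:
  MyCompany.Invoicing.PropertySchema.ReceiveType == 3
  And BTS.MessageType == http://www.e2b.no/XMLSchema/Internal#Interchange

Send port for the pdf only:
  MyCompany.Invoicing.PropertySchema.ReceiveType == 3
  And BTS.MessageType != http://www.e2b.no/XMLSchema/Internal#Interchange
```

Because the pdf has no namespace, the “not equal to the E2B message type” condition is enough to single it out.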
Hope you found this useful and good luck setting up your own SalesInvoice routing application in the future! :)
On January 15, 2013, iBiz Solutions will host the NoBUG event taking place in Oslo. Two of iBiz Solutions’ experienced integration architects, Richard Hallgren and Michael Olsson, will each hold a presentation. Michael Olsson will present on BizTalk IaaS and PaaS: hybrid integration solutions using BizTalk locally and/or in the cloud. Richard Hallgren will then talk about efficient system documentation in an integration project.
When: January 15 2013 at 18:00
Where: Microsoft Norway, Lysaker Torg 45, 1366 Lysaker, Norway
More information and registration for the event can be found at: http://lnkd.in/4hzPcC
My name is Johan and I have been working with system integration using BizTalk for 1.5 years. I often come across situations where the customer wants changes to existing integration flows. The required changes often mean that logic needs to be added or modified. Many complex integration flows have been created using the graphic mapping tool and consist of thousands of links and several hundred functoids. This can look something like the picture below.
This is a small part from such a mapping:
When I am asked to make a change to an integration flow like the one shown in the picture above, I feel confused for a while before I understand how to make the required change. Updating a mapping like this can be time consuming when multiple developers have made modifications over a long period and the integration flow has started to look like a bowl of spaghetti in the GUI, even when the required changes are small. I believe it is common behavior amongst developers to always choose the same technique (XSLT or the graphic mapping tool) for all their integrations, because they are certain it is the best way of doing it, or maybe just because they are lazy. To make complex integration flows easier to maintain, I recommend XSLT as the way of creating a BizTalk map. The graphic mapping tool is more suitable than XSLT for small integrations without complex logic.
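As a minimal illustration of the XSLT approach, a hand-written map might look like the sketch below (the namespaces and element names are invented, not taken from any real schema). The same filtering logic built with functoids would need a loop, a condition and several links, while here it is readable at a glance:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative sketch only: src/dst namespaces and elements are made up -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:src="http://example.com/source"
    xmlns:dst="http://example.com/destination">
  <xsl:template match="/src:Invoice">
    <dst:Invoice>
      <dst:Number><xsl:value-of select="src:InvoiceNumber"/></dst:Number>
      <!-- Copy only non-zero lines; conditions like this are where XSLT
           stays readable while functoid chains become spaghetti -->
      <xsl:for-each select="src:Lines/src:Line[src:Amount != 0]">
        <dst:Line>
          <dst:Amount><xsl:value-of select="src:Amount"/></dst:Amount>
        </dst:Line>
      </xsl:for-each>
    </dst:Invoice>
  </xsl:template>
</xsl:stylesheet>
```

In a BizTalk map you can plug a stylesheet like this in via the map’s Custom XSLT Path property, keeping the transform under version control as plain text.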
To summarize this blog post in one sentence: Choose the best fitting mapping technique for every situation in order to make future maintenance of your integration easier.
In November iBiz Solutions participated in two job fairs in the country.
One of them was UTNARM, the career fair of the Uppsala Union of Engineering and Science Students, which is held in November each year. This year there were over 100 companies participating, making it one of the largest fairs in the country. During our days in Uppsala we got the chance to meet the approximately 8,700 students at the Faculty of Science and Technology (TekNat) at Uppsala University. It was a very well organized recruitment day, and we were very pleased to have participated. We had many interesting conversations and got to know lots of interesting and nice students.
Caj Rollny, iBiz Solutions and our host at UTNARM in Uppsala.
“For us, the contact with universities and students is central to our future development as a company. We will always depend on being able to recruit the stars of the future, and the best way to reach them is to participate in activities and collaborations with the universities, not least by attending job fairs,” says Caj Rollny.
The second career day we attended was Futurum, organized by the University of Gävle and Gefleteknologerna. This was a somewhat smaller job fair, with about 20 companies participating this year. Here we got the chance to show ourselves on our home ground, and to see if we could meet some interesting people who would fit in at our Gävle office.
Caj Rollny, iBiz Solutions and our host at Futurum in Gävle.
iBiz Solutions wishes all our customers and partners a Merry Christmas and a Happy New Year! As in previous years, we gave a donation to BRIS as a Christmas gift to our customers and partners. We consider it very important to support the work BRIS does for Sweden’s vulnerable children.
Read more about BRIS at: www.bris.se
It is the first change to your infrastructure that reveals how good your architecture really is.
It is really hard to estimate to what level you should structure your integration projects in terms of architectural guidelines, documentation and organization. It often feels as if structure is an obstacle that stops you from delivering the project on time, and it is of course tempting to take shortcuts with documentation, testing, and routines for installation and change. Most of us probably have the alternative of asking that someone in the company who knows the whole business and can fix it for you in no time. This someone often knows all the systems, people and techniques involved so well that documentation hardly seems worth doing, since it is all so obvious to him. Everything is well, so far.
The problem is that the people you work with during the project’s lifetime, the ones you are comfortable taking shortcuts with to gain valuable time, are probably long gone when it is time for the first change, or when the integration platform is outdated and you are standing there with n integrations to be migrated to a stunning new platform.
But since most of us know all about taking shortcuts, we also know what usually happens on the day that sooner or later always comes: the day when there is a reason for a change. This is when we suffer from the tactical delivery decision once taken, and all the “simple” integrations once deployed now appear as a huge question mark with undefined dependencies and unclear ownership.
Therefore, even though spending a little time on documentation and a proper handover to operations adds some extra time initially, you will benefit from this work later in the lifecycle of the solution. What appears obvious today, to your current team, will likely look like a mess to your successors who are going to migrate your “old”, unknown technology and its undocumented integrations to a modern platform. This is when it starts to cost both resources and time that could have been used for more important tasks than reverse engineering old solutions.
So what is the right level of quality for processes, documentation and architectural guidelines? Personally I believe there is an easy answer to that question: your level of structure should be exactly as much as you manage to govern with the resources, time and budget you have available. The answer is easy to state, but it is hard to find the level that best suits your conditions.
If you have a limited budget or set of resources and cannot ensure that your rules are actually applied, then don’t adopt all the state-of-the-art frameworks available. Keep it at a level that you know will be applied and used in real life, but no less than that. As a rule of thumb: your level of structure and processes needs to be such that you can assign an owner responsible for every framework, principle or process you are enforcing. No less, no more.
The Integration Framework describes components that we know by experience need to be in place to ensure integration solutions that are cost effective during their whole lifecycle.
Naming conventions, application structure, canonical formats.
I have worked with systems integration for a number of years and have encountered a number of strange names in both projects and applications. Below I share some tips and tricks that can be useful. Of course, the choices you make depend on the situation, future demands, BizTalk license, etc.
Naming and application structure
I stumbled on this a few years ago:
What can you get from this? That there is a message from Agresso named ABWtransaction? Yes, but it’s also the only thing you get. Direction? Mapping? What is xxx?
Try to have a common way to name the integrations, and try to plan ahead! Remember that a helpdesk will be responsible for the artifacts. If it’s possible, and you have some kind of integration number to identify integrations, use it in the naming.
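As an illustration, an integration number in the artifact names could look something like this (the numbers and system names here are invented):

```
INT0042_Agresso_To_SAP_SupplierInvoices
INT0043_SAP_To_Agresso_PaymentStatus
```

A helpdesk ticket can then reference INT0042 directly, and the name alone answers the questions above: source, destination, direction and content.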
Another black hole where you tend to put artifacts is the “Common” library, for “all that should be shared”, but that is a truth with modification. Those artifacts should be very static, for example messaging standards such as EDIFACT, papiNet, OAGIS, etc. Unfortunately, it is common practice to also add canonical schemas there, along with specific pipeline components that are only used by one specific integration. If you are forced to do so, I recommend that you have just one xsd per project. A problem that can arise is that you have to tear down all the applications to update these “shared” libraries. All systems are affected by the update, and the service window for the server becomes much longer than necessary. The production units could be affected by an update in a human resources system, and that’s not easy to explain to the customer or the production staff.
One should therefore think about the application structure and naming conventions before starting to develop integration artifacts. It is possible to divide the applications by system group, geographical area or domain. Several times I have had to take over an environment where the naming was “outdated” and not prepared for the integration environment and business needs. It is common to run a number of pilot projects before a full-scale project is started, and unfortunately the naming is often forgotten. Consider that there is a big difference between administering 10 BizTalk integrations and administering 200, spanning a variety of systems. My advice is to think about operation and management at an early stage. Is it easy to debug? Can you help someone from the business who calls and shouts “We can’t create the documents for the lorry in the warehouse”? Where do you start to look? The answer is that you should think about this early on and have an application structure based on domain areas, such as Logistics, Finance, Sales, BI and HR.
You check the applications in the logistics domain, find a port that is stopped and start it. Truck driver happy! Developers and administrators have a penchant for naming by system; this can be combined with a domain structure so that everyone is happy. The important part is the domain, and you can also include the geographical area if you want.
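As an illustration, a domain-based application structure could look something like this (the system, domain and country codes are made up for the example):

```
Logistics.Movex           (domain + system)
Finance.Agresso
HR.PersonecP
Logistics.SE.Movex        (with a geographical area included)
Logistics.NO.Movex
```

The domain comes first, so the “lorry in the warehouse” call above immediately points you at the Logistics applications.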
A canonical format is a common format to convert/map to and from. It is used to create loose couplings and minimize dependencies between systems. A tip when you start using canonical schemas is to open them up, i.e. remove constraints, cardinality (that is not justifiable) and in some cases even data types. Set the data type to string rather than DateTime in the date element. Why? Because you do not know how the schema will be used, or in which direction the messages will be sent; you do not know how the data will look. In other words, it is in the mappings from your canonical format that you must ensure the mappings are correct and the data is in the correct format. The only things you know are your endpoints and which format to use before the message is allowed to take off. What reasons exist to validate a canonical format? None at all, in my opinion! The canonical format is a transport format, and the canonical schema should be based on what you know. If you work in, for example, the paper industry there is a common format named papiNet. That is an accepted standard that has existed for several years, and thus it is smart to copy it (if possible) and use it as your canonical format.
If you don’t have a standard to copy, take a copy of the sending system’s schema format. Add a well thought out envelope schema part, and start your integrations!
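To illustrate the “opened up” canonical schema idea, a fragment might look like this (the namespace and element names are invented; note the deliberately loose string types and optional cardinality):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative fragment only: element names and namespace are made up -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/canonical/invoice"
           elementFormDefault="qualified">
  <xs:element name="Invoice">
    <xs:complexType>
      <xs:sequence>
        <!-- xs:string rather than xs:date: the canonical format only
             transports data, the outbound mapping enforces the format -->
        <xs:element name="InvoiceDate" type="xs:string" minOccurs="0"/>
        <xs:element name="Amount" type="xs:string" minOccurs="0"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Keeping the types this loose means a new source system with an odd date format never breaks the canonical layer; only the outbound mapping has to care.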
Unfortunately, there is no best practice for these types of projects, because there are a number of parameters to take into account. But these are some of my thoughts and ideas! I hope they interest some of you.
Over and Out!
Microsoft finally released the BizTalk 2013 beta. We can now start to adapt and test our build, test and deployment framework. Great!
Integration Software provides a great log tool, Integration Manager, which provides views with secured access and search capabilities for both Windows Azure EAI and EDI services as well as BizTalk 2013.
BizTalk Server 2013 Beta offers significant enhancements to the already rich integration capabilities by including the following feature additions:
• Integration with Cloud Services – BizTalk Server 2013 Beta includes new out-of-the-box adapters to send and receive messages from Windows Azure Service Bus. It also provides capabilities to transfer messages using different relay endpoints hosted on Azure.
• RESTful services – BizTalk Server 2013 Beta provides adapters to invoke REST endpoints as well as expose BizTalk Server artifacts as a RESTful service.
• Enhanced SharePoint adapter – Integrating with SharePoint using BizTalk Server 2013 Beta is now as simple as integrating with a file share. We have removed the need for dependency on SharePoint farms, while still providing backward compatibility.
• SFTP adapter – BizTalk Server 2013 Beta enables sending and receiving messages from an SFTP server.
• ESB Toolkit integration – With BizTalk Server 2013 Beta, ESB Toolkit is now fully integrated with BizTalk Server. Also, the ESB Toolkit configuration experience is vastly simplified to enable a quick setup.
• Dependency tracking – The dependencies between artifacts can now be viewed and navigated in the Admin console.
• Improvements in dynamic send ports – BizTalk Server 2013 Beta provides the ability to set a host handler per adapter, instead of always using the default send handler of the adapters.
Over and above these, BizTalk Server continues to enable integration solutions connecting heterogeneous line-of-business systems with Windows, .NET and SharePoint-based applications to optimize user productivity, gain business efficiency and increase agility. BizTalk Server 2013 allows .NET developers to take advantage of BizTalk services right out of the box to rapidly build solutions that need to integrate transactions and data from applications like SAP, mainframes, MS Dynamics and Oracle. Similarly, SharePoint developers can seamlessly use BizTalk services directly through the new Business Connectivity Services in SharePoint 2010. BizTalk Server 2013 includes a data mapping and transformation tool that dramatically reduces the development time needed to mediate data exchange between disparate systems. It also provides various management interfaces for managing BizTalk Server applications, managing performance parameters, and streamlining deployments from development to test to production. BizTalk Server 2013 includes the scalable Trading Partner Management (TPM) model with a graphical interface for flexible management of business partner relationships and an efficient on-boarding process.