In November, iBiz Solutions participated in two job fairs in Sweden.
One of them was UTNARM, the career fair of the Uppsala Union of Engineering and Science Students, which is held in November each year. This year over 100 companies participated, making it one of the largest fairs in the country. During our days in Uppsala, we got the chance to meet the approximately 8,700 students at the Faculty of Science and Technology (TekNat) at Uppsala University. It was a very well organized recruitment day, and we were very pleased to have participated. We had interesting conversations and got to know lots of interesting and nice students.
Caj Rollny, iBiz Solutions and our host at UTNARM in Uppsala.
“For us, the contact with universities and students is central to our future development as a company. We will always depend on being able to recruit the stars of the future, and the best way to reach them is to participate in activities and collaborations with universities, not least by attending job fairs,” says Caj Rollny.
The second career day we attended was Futurum, organized by the University of Gävle and Gefleteknologerna. This was a somewhat smaller job fair, with about 20 companies participating this year. Here we got the chance to present ourselves on our home ground and see if we could meet some interesting people who would fit in at our Gävle office.
Caj Rollny, iBiz Solutions and our host at Futurum in Gävle.
iBiz Solutions wishes all our customers and partners a Merry Christmas and a Happy New Year! As in previous years, we gave a donation to BRIS as a Christmas gift to our customers and partners. We consider it very important to support the work BRIS does for Sweden’s vulnerable children.
Read more about BRIS at www.bris.se
It is the first change to your infrastructure that reveals how good your architecture really is.
It is really hard to estimate to what level you should structure your integration projects in terms of architectural guidelines, documentation and organization. It often feels as if structure is an obstacle that stops you from delivering the project on time, and it is of course tempting to take shortcuts on documentation, testing, and routines for installation and change. Most of us probably have the alternative of asking someone in the company who knows the whole organization and can fix it in no time. This is often the person who knows all the systems, people and techniques involved so well that documentation hardly seems worth the effort, since everything is obvious to him. So far, so good.
The problem is that the people you work with during the project’s lifetime, and whom you trust when you take those shortcuts to save valuable time, are probably long gone when it is time for the first change, or when the integration platform is outdated and you are standing there with n integrations to migrate to a stunning new platform.
Since most of us know all about taking shortcuts, we also know what usually happens on the day that sooner or later arrives: the day when there is a reason for a change. Now we are suffering from the tactical delivery decision once taken, and all the “simple” integrations once deployed now appear as a huge question mark with undefined dependencies and unclear ownership.
Therefore, even though spending a little time on documentation and doing a proper handover to operations adds some extra time initially, you will benefit from this work later in the lifecycle of the solution. What appears obvious today, to your current team, will most likely look like a mess to your successors when they have to migrate your “old” unknown technology and its undocumented integrations to a modern platform. This is when it starts to cost both resources and time that could have been spent on more important tasks than reverse engineering old solutions.
So what defines the right level of quality for processes, documentation and architectural guidelines? Personally, I believe there is a simple answer: your level of structure should be exactly as much as you manage to govern with the resources, time and budget you have available. The answer is easy to state, but it is hard to find the level that best suits your conditions.
If you have a limited budget or set of resources and cannot ensure that your rules are actually applied, then don’t adopt every state-of-the-art framework available. Keep it at a level that you know will be applied and used in real life, but no less than that. As a rule of thumb: your level of structure and processes needs to be such that you can assign an owner responsible for every framework, principle or process you are enforcing. No less, no more.
Integration Framework describes the components that we know from experience need to be in place to ensure integration solutions that are cost effective throughout their whole lifecycle.
Naming conventions, application structure, canonical formats.
I have worked with systems integration for a number of years and have encountered a number of strange names in both projects and applications. Below I share some tips and tricks that can be useful. Of course, the choices you make depend on the situation, future demands, your BizTalk license, and so on.
Naming and application structure
I stumbled on this a few years ago:
What can you get from this? That there is a message from Agresso named ABWtransaction? Yes, but that is also the only thing you get. Direction? Mapping? What is xxx?
Try to have a common way of naming your integrations, and try to plan ahead! Keep in mind that a helpdesk will be responsible for the artifacts. If possible, and if you have some kind of integration number to identify integrations, use it in the naming.
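As a sketch of what such a convention can look like, here is a small Python helper that enforces a hypothetical naming scheme of the form `<IntegrationId>_<Source>_To_<Target>_<MessageType>`. The pattern and names are illustrative assumptions only, not a standard:

```python
import re

# Hypothetical convention: <IntegrationId>_<Source>_To_<Target>_<MessageType>,
# e.g. "INT0042_Agresso_To_SAP_Invoice". The helpdesk can read direction,
# systems and integration number straight out of the artifact name.
NAME_PATTERN = re.compile(
    r"^(?P<id>INT\d{4})_(?P<source>\w+?)_To_(?P<target>\w+?)_(?P<message>\w+)$"
)

def parse_integration_name(name):
    """Split an artifact name into its parts, or raise if it breaks the convention."""
    match = NAME_PATTERN.match(name)
    if not match:
        raise ValueError(f"'{name}' does not follow the naming convention")
    return match.groupdict()

parts = parse_integration_name("INT0042_Agresso_To_SAP_Invoice")
# parts["id"] is the integration number, parts["source"]/parts["target"]
# give the direction, and parts["message"] names the message type.
```

A check like this can run as part of a build step, so that names like the cryptic one above are rejected before they ever reach production.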
Another black hole where you tend to put artifacts is the “Common” library, “all that should be shared”, but that is only partly true. These artifacts should be very static, for example messaging standards such as EDIFACT, papiNet, OAGIS, etc. Unfortunately, it is common practice to also put the canonical schemas there, together with pipeline components that are only used by one specific integration. If you are forced to do so, I recommend that you have just one XSD per project. A problem that can arise is that you have to tear down all the applications to update these “shared” libraries. Every system is affected by the update, and the service window for the server becomes much longer than necessary. The production units could be affected by an update in a human resources system; that is not easy to explain to the customer or the production staff.
One should therefore think about the application structure and naming conventions before starting to develop integration artifacts. It is possible to divide the applications into system groups, geographical areas and domain areas. Several times I have had to take over an environment where the naming was “outdated” and not prepared for the integration environment and business needs. It is common to install a number of pilot projects before a full-scale project is started, and unfortunately the naming is often forgotten. Consider that there is a big difference between administering 10 BizTalk integrations and administering 200 BizTalk integrations for a variety of systems. My advice is to think about operation and management at an early stage. Is it easy to debug? Can you help someone from the business who calls and shouts “We can’t create the documents for the lorry in the warehouse”? Where do you start looking? The answer is that you should think about this early on and have an application structure based on domain areas, such as Logistics, Finance, Sales, BI and HR.
You check the applications in the Logistics domain, find a port that is stopped, and start it. Truck driver happy! Developers and administrators have a penchant for system naming; this can also be applied to a domain structure, and everyone is happy:
The important part is the domain. You could also have the geographical area if you want:
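As a small hypothetical illustration (the application names below are invented for the example), a domain-first structure makes the “lorry in the warehouse” call easy to narrow down:

```python
applications = [
    # Hypothetical application names following a <Domain>.<System> structure.
    "Logistics.WMS",
    "Logistics.TransportBooking",
    "Finance.Agresso",
    "HR.PayrollExport",
]

def applications_in_domain(domain):
    """When the business calls about the warehouse, start with the Logistics domain."""
    return [app for app in applications if app.startswith(domain + ".")]

applications_in_domain("Logistics")
# Returns only the Logistics applications, so troubleshooting starts
# with a short list instead of all 200 integrations.
```

The same filter works in the BizTalk Administration Console search box: with a consistent domain prefix, one string narrows the view to the applications that can actually be involved.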
A canonical format is a common format to convert or map to and from. It is used to create loose couplings and minimize dependencies between systems. A tip when you start using canonical schemas is to open them up, i.e. remove constraints, cardinality (that is not justifiable) and in some cases even data types. Set the data type to “string” rather than DateTime in the date element. Why? Because you do not know how the schema will be used or in which direction the messages will be sent. You do not know what the data will look like. In other words, it is in the mappings from your canonical format that you must ensure the mappings are correct and that the data is in the correct format. The only thing you know is your endpoints and which format to use before the message is allowed to take off. What reasons are there to validate a canonical format? None at all, in my opinion! The canonical format is a transport format. The canonical schema should be based on what you know. If you work in, for example, the paper industry, there is a common format named papiNet. That is an accepted standard that has existed for several years, and thus it is smart to copy it (if possible) and use it as your canonical format.
If you don’t have a standard to copy, take a copy of the sending system’s schema format, add a well-thought-out envelope schema part, and start your integrations!
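The idea of the loose canonical format can be sketched as two mappings in Python. All field names here (`TransactionNo`, `BELNR`, etc.) are invented for the example; the point is that the canonical message keeps the date as a plain string, and only the outbound mapping, which knows the target format, parses and validates it:

```python
from datetime import datetime

def to_canonical(agresso_msg):
    # Map the source system's fields into the loose canonical format.
    # Dates stay plain strings -- the canonical schema does not validate them.
    return {
        "OrderId": agresso_msg["TransactionNo"],
        "OrderDate": agresso_msg["Date"],  # e.g. "2012-11-05", kept as-is
    }

def to_sap(canonical_msg):
    # Only the outbound mapping knows the target format, so this is where
    # the date is parsed and reformatted -- and where bad data is caught.
    order_date = datetime.strptime(canonical_msg["OrderDate"], "%Y-%m-%d")
    return {
        "BELNR": canonical_msg["OrderId"],
        "BUDAT": order_date.strftime("%Y%m%d"),
    }

sap_msg = to_sap(to_canonical({"TransactionNo": "4711", "Date": "2012-11-05"}))
```

If a new receiving system with a different date format is added later, only a new outbound mapping is needed; the canonical format and the inbound mapping are untouched.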
Unfortunately, there is no best practice for these types of projects, because there are a number of parameters to take into account. But these are some of my thoughts and ideas! I hope they interest some of you.
Over and Out!
Microsoft finally released the BizTalk 2013 beta. We can now start to adopt and test our build, test and deployment framework. Great!
Integration Software provides a great log tool, Integration Manager, which offers views with secured access and search capabilities for both Windows Azure EAI and EDI services as well as BizTalk 2013.
BizTalk Server 2013 Beta offers significant enhancements to the already rich integration capabilities by including the following feature additions:
• Integration with Cloud Services – BizTalk Server 2013 Beta includes new out-of-the-box adapters to send and receive messages from Windows Azure Service Bus. It also provides capabilities to transfer messages using different relay endpoints hosted on Azure.
• RESTful services – BizTalk Server 2013 Beta provides adapters to invoke REST endpoints as well as expose BizTalk Server artifacts as a RESTful service.
• Enhanced SharePoint adapter – Integrating with SharePoint using BizTalk Server 2013 Beta is now as simple as integrating with a file share. We have removed the need for dependency on SharePoint farms, while still providing backward compatibility.
• SFTP adapter – BizTalk Server 2013 Beta enables sending and receiving messages from an SFTP server.
• ESB Toolkit integration – With BizTalk Server 2013 Beta, ESB Toolkit is now fully integrated with BizTalk Server. Also, the ESB Toolkit configuration experience is vastly simplified to enable a quick setup.
• Dependency tracking – The dependencies between artifacts can now be viewed and navigated in the Admin console.
• Improvements in dynamic send ports – BizTalk Server 2013 Beta provides the ability to set a host handler per adapter, instead of always using the default send handler of the adapters.
Over and above these, BizTalk Server continues to enable integration solutions that connect heterogeneous line-of-business systems with Windows, .NET and SharePoint-based applications to optimize user productivity, gain business efficiency and increase agility. BizTalk Server 2013 allows .NET developers to take advantage of BizTalk services right out of the box to rapidly build solutions that need to integrate transactions and data from applications like SAP, mainframes, MS Dynamics and Oracle. Similarly, SharePoint developers can seamlessly use BizTalk services directly through the new Business Connectivity Services in SharePoint 2010. BizTalk Server 2013 includes a data mapping and transformation tool that dramatically reduces the development time needed to mediate data exchange between disparate systems. It also provides various management interfaces for managing BizTalk Server applications, managing performance parameters, and streamlining deployments from development to test to production. BizTalk Server 2013 includes the scalable Trading Partner Management (TPM) model with a graphical interface for flexible management of business partner relationships and an efficient onboarding process.
Message queuing is an often forgotten technique to provide asynchronous communication between systems or services where the server and client do not need to interact with the message at the same time. MSMQ is Microsoft’s message queue implementation.
Some of the advantages of message queuing are:
An MSMQ queue can be configured to be either transactional or non-transactional. Messages sent to a transactional queue are transferred exactly once.
There are typically three transactions involved in a message transfer (if the client and the server are not on the same machine).
Client transaction
The client transaction takes place when the client pushes a message to the client queue. If the client commits the transaction, the message is pushed to the queue. If the client aborts the transaction, the message is rejected from the queue.
Delivery transaction
MSMQ is responsible for delivering the message from the client queue to the service queue. If the network is down, or the service machine has crashed, the removal of the message from the client queue is rolled back. MSMQ will eventually deliver the message when the service is up.
Destination transaction
The service tries to remove the message from the service queue. If the transaction aborts, the message removal is rolled back. If the transaction is committed, the message is removed from the queue.
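The destination-transaction semantics above can be sketched in a few lines of Python. This is a minimal in-memory illustration of the idea, not the real MSMQ API: the message is only removed from the queue if processing succeeds, otherwise the removal is rolled back and the message stays for a retry:

```python
class TransactionalQueue:
    """A minimal in-memory sketch of MSMQ-style transactional receive:
    a message is only removed from the queue if the transaction commits."""

    def __init__(self):
        self._messages = []

    def send(self, message):
        self._messages.append(message)

    def receive(self, handler):
        if not self._messages:
            return None
        message = self._messages[0]
        try:
            handler(message)          # process inside the "transaction"
        except Exception:
            return None               # abort: removal is rolled back, message stays
        return self._messages.pop(0)  # commit: message is removed

queue = TransactionalQueue()
queue.send("order-1")

queue.receive(lambda m: 1 / 0)   # handler fails -> message stays in the queue
queue.receive(lambda m: None)    # handler succeeds -> message is removed
```

In real MSMQ the rollback and redelivery are handled by the queue manager and the DTC, but the observable behavior for the service is the same: a crash in the middle of processing does not lose the message.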
SOA’izing the message queue
Working directly with MSMQ is hard. The developer must handle transaction commits and rollbacks explicitly. Retry functionality must be implemented on both the client and the service. Messages must be hand-crafted, and throttling behavior must be implemented manually. In the SOA world (Service Oriented Architecture) we are used to thinking not in terms of messages on a queue, but in transport-neutral business operations.
WCF to the rescue
WCF can act as an abstraction layer on top of MSMQ to shield you from many of the pains of using MSMQ directly. In the SOA world we are used to communicating via contracts, often exposed as WSDL documents. The service contract tells the client how it is supposed to communicate with the service, how transactions are implemented, which security settings should be used, the throttling behavior, etc. The data contract tells the client what data needs to be sent. A proxy is generated on the client side based on the contracts. The proxy does all the heavy lifting of converting WCF messages to MSMQ messages, enlisting in transactions, and so on. There is not a direct mapping of WCF messages to MSMQ messages. The client posts messages to a queue, not to a service, which means that all calls are asynchronous and disconnected.
Another important aspect is throttling. If there are 15 messages on the queue, the service may not want 15 concurrent instances and worker threads. The throttling behavior in WCF controls how many instances and sessions can be handled at once. Messages above the limit stay in the queue.
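The throttling idea can be sketched in Python with a queue and a semaphore. This is an illustration of the pattern, not WCF itself: the limit of 3 plays the role of a WCF throttling setting such as the maximum number of concurrent calls, and messages beyond the limit simply wait in the queue:

```python
import queue
import threading

MAX_CONCURRENT = 3  # plays the role of a WCF max-concurrent-calls setting
semaphore = threading.BoundedSemaphore(MAX_CONCURRENT)
work = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        message = work.get()
        if message is None:  # sentinel: shut the worker down
            break
        with semaphore:  # at most MAX_CONCURRENT messages are processed at once
            with results_lock:
                results.append(message.upper())
        work.task_done()

threads = [threading.Thread(target=worker) for _ in range(5)]
for t in threads:
    t.start()

for i in range(15):
    work.put(f"msg-{i}")  # 15 messages arrive, but only 3 run concurrently
work.join()

for _ in threads:
    work.put(None)
for t in threads:
    t.join()
```

All 15 messages are eventually processed; the semaphore just caps how many are in flight at any moment, which is exactly the effect of throttling a queued WCF service.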
This has been a brief introduction to the benefits of message queuing and how WCF makes life easier when it comes to it. For a more technical reference on leveraging MSMQ in WCF, I can really recommend Programming WCF Services by Juval Löwy.
When should you talk about TOGAF or any other architectural framework with your business representatives? The answer is NEVER!
Have you ever noticed, at a party or any social gathering, what happens when people start to talk about their jobs and what they do? I have, and it is strange: when the nurse talks about her job, everyone is excited and interested to listen, and the same goes for the fireman, the carpenter and the taxi driver. Everyone can relate to what these people do and can have a nice discussion about it.
But when it comes to you, and your story is IT, they all stop talking and you never get that second question. People just go back to the nurse, the carpenter, the doctor, and continue to talk about just how important and interesting their jobs are…
What I just described happened to me a few weeks back when I joined a fantastic initiative called Rosa Landet (http://rosalandet.se). This was a group of about 10 people who started from the far north of Sweden and bicycled down to Skåne in the very south, a ride of about 2,500 km, all to collect money for the breast cancer foundation in Sweden. I joined the group for one stage, between Falun and Uppsala. During the trip, which lasted about 9 hours, there were a lot of interesting discussions going on, and most of the people had jobs like fireman, doctor, salesman or electrician; in any case, jobs where people understand what they actually do. Except me! So while we were biking and talking, I was keen to learn more about what a typical working day looks like for a fireman. Of course there were loads of questions, and the woman I asked answered enthusiastically, so we had a really nice conversation. After about an hour we started to talk about me, and after a while she of course asked what my occupation is. I answered IT, and that was pretty much the end of the discussion…
Today most people love what IT can do for them, but they really don’t care how the apps and information got there. This is a mystery for us IT people, who just can’t stop ourselves from thinking about how things are integrated and what framework is used to get the pieces in place. But for normal people this should just happen by magic. Sad for us IT people, but true. It isn’t that strange, though: look how much you care about how your car gets fixed. You don’t, isn’t that so? So why should you burden your customer with details?
Take the example of leaving your car at the garage. You just don’t care what tools the mechanics are using; you only care that your car is returned in better condition than when you left it there. If you feel that the price you paid for fixing your car matches the condition the car is in now, then you will probably give the garage your trust the next time your car breaks down (because it will, for sure).
This is exactly what you should do next time you meet your business colleagues! Take the opportunity to understand the business challenge, and do not expose your internal challenges. Prior to the meeting you will probably have made sure that your IT is ready to meet the new challenge, and now it is time to show that your efforts are well worth the money spent, because you are ready to implement what your business requires. This is exactly what your car mechanic does before a car brand releases a new model: he prepares himself for the new stuff coming.
Like the car engine that needs oil, TOGAF is your architecture engine’s lubricant, which keeps things going smoothly and without too much friction. And like lubricant, it should not leak outside the engine, because when it does, it is an indication that something is wrong. If you as a driver get a warning signal, it puts you in a state of insecurity. That was probably not your intention when exposing your architectural tools, but it is a natural reaction when someone doesn’t understand your language. Your business representative will probably try to get a second opinion from someone who seems to know what he is doing, and then you are suddenly no longer the captain of the IT decision.
For me as an architect, it is always good to compare myself with other jobs where people just don’t care how the work is done; think about the people fixing your car. You just want it to work! And as a piece of advice from a friend: do not ever compare yourself with the fireman, because people do not see you as the savior from the fire; they see you as the representative who makes things work.
My personal lesson learned from this story is that the next time I am at a party and someone asks me what I work with, I will tell a lie. My answer will be that I am a fireman, but that I am soon out of business because I am doing such a good proactive job that I don’t have any fires to chase any longer. Hopefully I will get a second question, and then I will find something interesting to talk about. Bicycling, maybe :-)
iBiz Solutions Integration Framework gives you helpful direction in your dialogue with the business, and it also directs you in how to build your integration landscape.
My name is Robin Ericsson and I have been at iBiz Solutions for about two months. I am currently “in training” to become a part of the iBiz Solutions integration team.
During my first time at iBiz Solutions, my focus has been on learning to work with and understand system integration, with an emphasis on Microsoft BizTalk Server. I have had great help from the internal courses, course material and exercises that we have in the company. By doing the exercises and reading the material, I have gained a basic understanding and knowledge of the subjects.
As I am pretty much fresh out of school, I have been wondering why there are not more courses at universities that focus on system integration or on widely used system integration solutions such as Microsoft BizTalk Server or Tibco. I did some searches at antagning.se (a portal for university studies in Sweden) for “system integration”, “systemintegration”, “Microsoft BizTalk Server” and “Tibco”. What I found was that some courses cover system integration as a small part, but there is only one course that focuses entirely on system integration, and it is held at Karlstad University.
This is why I want to give a shout-out to all the universities in Sweden that have courses and/or programs in IT: consider adding a course in system integration. As we all know, it is one of the most important aspects of IT today, and I think students would find knowledge of the subject very useful.
On October 3rd 2012 I held an informational event at BUGS, in the premises of Informator in Stockholm, about running BizTalk 2010 R2 (or whatever the name will turn out to be in the end) in Windows Azure. Great news for us professional system integrators who have had over 10 years of pleasure working with BizTalk. The fact that Microsoft allows us to seamlessly move our integration engine from on-premise to the cloud (Microsoft BizTalk 2010 R2 comes with license mobility) will increase availability, as most local IT departments simply do not have the economics, power, resources or organization to match those of Microsoft. Microsoft will have better SLAs than most others. BizTalk will now run as an infrastructural component directly in the cloud (IaaS). Microsoft is also providing the new EAI and EDI functionality, which is based on long-established BizTalk patterns; the latter provides the platform as a service (PaaS).
I find it especially nice that Microsoft allows us to set up our own private networks directly in the cloud and, to top that, also provides us with the infrastructural configurations to easily connect the on-premise network and the cloud network through a static VPN tunnel. This means that we can move our BizTalk machines to a more reliable hardware platform and pay as we go. These machines in the cloud are of course easily accessible and manageable from everywhere. BizTalk machines in the cloud can use this VPN bridge to connect to LOB systems on premise. So there is no reason to wait any longer; you can have your development and test environments up in only a few hours. Production environments should of course wait until the final release, which probably lies within 6 months from now.
Microsoft has not yet announced what BizTalk 2010 R2 running in the cloud will cost, but indications are that the established model of no up-front costs and pay-as-you-go will apply.
Are you having a hard time keeping up with the pace of the introduction of the cloud, users who want to go mobile, project sites that have stopped using your traditional databases and moved out to the cloud? Too bad for you, but if it is of any comfort, you are not alone; and there is a way out.
If you were to summarize the last 10 years in IT, you would probably, like me, have the word change somewhere on your top 5 list. Change is of course not unique to IT. We see it in every business, where product lifecycles are getting shorter, which shortens the time we have to develop products. We seem to have a never-ending demand for new experiences, driven by mobile phones and all kinds of new devices and accessories. Change is a good thing; without change, life would of course be boring, and most of us probably wouldn’t have a job either. But with change constantly happening, as in IT, you never really have time to finish what you are currently working on before a new thing is on the rise. Maybe you start to dream that the evolution of IT would stop for at least a couple of years, so that we get the time we so well deserve to clean up what we have today and start all over again, with all the wisdom we have collected during the “wild years”. Unfortunately, that could only happen in our dreams…
But maybe there is another way out of the feeling of chasing technologies and never really reaching the finish line before the next version is knocking at the door. Instead of focusing on the devices, or whatever the new hot thing is, you should change focus and aim for a goal that you can reach with a reasonable amount of investment.
Think about it: how many different types of e-commerce solutions, PCs and mobile devices have been around over the last decade? I have lost count, and a few years ago my web hosting company sold me application hosting; now they are suddenly in the cloud. I don’t know how that happened! But think again and remind yourself what has happened to the information. An order, a customer or a product looks pretty much the same as it did 10 years ago from an information perspective. You would probably say that not very much has happened there: a few more attributes on the product, and maybe a geo position added to your customer if your company is in tune with the times and working with location analytics. But that is not a dramatic change in information compared to the difference between what my mobile phone can do today and what it did 10 years ago.
This is really good news for all of us, because information matters, and information is moving at a pace where we can actually follow the changes and maybe even be a bit proactive. Then we can suddenly feel a little bit better and realize that there is at least one area we are in control of. So now we are in control! Think a little further and imagine a world where you have also connected the enterprise business entities into well-defined and well-communicated interfaces and services. Then the world looks even better, because in that world you have built a foundation where you can meet the evolution of devices and technology by looking at interfaces and information flows, rather than fixing things based on project demands, and your service interfaces will most probably live much longer than the devices currently consuming them. If we are honest with ourselves, not that much has happened in the last 15 years when it comes to integration. Of course, the possibilities increased dramatically with the introduction of XML at the end of the 90s and with the use of integration middleware like Tibco, BizTalk and others. But those who focused on information and integration in the early 2000s have been well prepared to support the technical evolution since then. From an integration perspective, we went from XML-RPC to SOAP, and in recent years we have seen REST services and JSON-structured data become very popular, especially on the web for mobile devices. But on a modern integration platform it doesn’t really matter whether the information is presented as JSON or XML or any other format, for that sake. It is still the same old information.
So what do we do then? In a world of rapid change you need to have a plan! We know that we cannot stop the world from evolving, so you will not get very far by trying to stop Google, Apple and Salesforce from providing us with new exciting devices and services. We need to focus on something that is under our control. We know that we can at least affect our own organization, even if it takes a little longer in some organizations due to size or other factors; it is certainly possible. By finding the right ambassadors and starting to implement the new thinking about information and integration, and how it can make your IT productive and give great value to your organization, you will find that you sleep better at night. Find the low-hanging fruit in your company to gain trust and validate the plan you just worked out. By finding the low-hanging fruit that delivers value, you will be lifted by success stories rather than by pontifications and enforced standards that the business cannot link to business opportunities. Just make sure your first low-hanging fruit is achievable…
By working with information and integration closely together, you will be better prepared when your colleagues want to go mobile or move a module out to the cloud, because you will then work on identifying what information the piece of functionality you are moving requires, and map that information need to your enterprise services.
iBiz Solutions Integration Framework provides a methodology and framework that gives you a jump start in being better prepared for a world of rapid change. Integration Framework addresses both the architecture and the tools for how your information integrates into your IT landscape. It also supports strategic planning for coming opportunities, with city planning that makes it easier to manage the information flows in your organization. Important aspects such as the governance of SOA services and how integration patterns link to information in your organization are also covered by Integration Framework.
So let’s start meeting the changing technologies with a stable foundation built on information and integration, and a plan to put them together with the help of Integration Framework.