Across the IT industry, people are re-thinking their approach to application integration and electronic data interchange in the context of ubiquitous scale-out cloud platforms, the onward march of service-orientation, the success of the RESTful philosophy, the adoption of agile methodologies and the rise of devops and continuous deployment. In this rapidly evolving world, the microservices concept has gained attention as an architectural and design model that promotes best practice in building modern solutions.
In this post, I will explain the two worlds of integration and microservices and start to explore how they relate to each other. I will start with a description of integration. It is important to understand the central concerns that integration addresses before moving on to look at the emerging microservices world. Without this, we won’t be able to relate the two to each other appropriately. I will move on to describe the principles that underpin the microservices approach and discuss the notion of 'monolithic' applications. I will then discuss the relationship between microservices and integration, especially in the context of emerging microservice PaaS platforms.
What is Integration?
I’m specifically interested in the concepts of enterprise application integration (EAI) and electronic data interchange (EDI). These related solution categories are widely understood. Together, they constitute a few square inches in the rich tapestry of software development. However, for many organisations, they play a central role in the effective exploitation of information technology.
EAI is necessary in scenarios where the following statements are true:
“We need to protect our existing investment in different applications, systems and services, including ‘heritage’ systems, line of business applications from different vendors and custom-built solutions.”
“We have business processes and activities that depend on, or touch, multiple systems and applications. We need IT to automate and streamline those processes and activities as much as possible by integrating all those systems into our processes through robust forms of data interchange.”
“We have to accept and manage change in the most cost-effective and efficient way we can. This can include new investments and acquisitions, new business processes, the evolution or replacement of existing systems and other issues. Much of this change is beyond the control of IT and is either dictated by the business or forced on us by partners or software vendors.”
EAI is characterised by techniques and approaches which we apply when we need to integrate applications and systems that were never designed or envisaged to interoperate with each other. This is more than a matter of protocol. Different systems often model business data and process in radically different ways. The art of the EAI developer, rather like that of a diplomat, is to build lines of communication that honour the distinctive viewpoint of each participant while fostering a shared and coherent understanding through negotiation and mediation.
As we will see, the microservice community advocates the use of lightweight communication based on standardised interfaces. This principle is typically satisfied through the use of RESTful interfaces. From an EAI perspective, however, this principle is not particularly interesting. Integration handles the protocols and interfaces dictated to it, whatever they may be, and is primarily concerned with mediation. To continue the analogy of diplomacy, it’s rather like encouraging everyone to communicate face-to-face via Skype video. In theory, this may be convenient and efficient, but it is of little use if each participant speaks in a different language and is only interested in communicating their own concerns from their own perspective. The diplomat’s job is to mediate between the participants to enable meaningful interchange. In any case, some participants may not have the necessary bandwidth available to use the technology. Older participants may not be comfortable using Skype and may refuse to communicate this way.
In EAI, the message is king. The most fundamental task of the EAI developer is to mediate the interchange of messages by whatever means necessary. If it is possible to standardise the protocols by which this is done, then that is valuable. However, such standardisation is secondary to the central aim of ensuring robust mediation of messages. There is a strong correlation between the intrinsic value of individual messages to the business and the use of EAI. The value of a message may be financial (orders, invoices, etc.), reputational or bound up with efficiency and cost savings. The more each individual message is worth, the more likely it is that EAI approaches are required. Mediation ensures that each valuable message is delivered to its recipient in a coherent form and manner, or, if not, that appropriate action is taken. Any other concerns are secondary to this aim.
Messages are the primitives of integration. At their simplest, they are opaque streams of bytes that are communicated between participants. However, most EAI frameworks provide abstractions that support a distinction between content and metadata. These abstractions may be elaborated as required. For example, content may be treated as immutable while metadata remains mutable. Content may be broken down into additional parts, and each part may be provided with part-specific metadata. Message metadata is typically represented as name-value pairs. It may contain unique identifiers, routing information, quality-of-service data and so forth.
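To make the abstraction concrete, here is a minimal sketch in Python. It is illustrative only: the class and field names are my own, not taken from any EAI framework. Content is held as an opaque byte payload treated as immutable, while metadata is a mutable name-value map that carries a unique identifier:

```python
from dataclasses import dataclass, field
from typing import Dict
import uuid

@dataclass
class Message:
    """A minimal EAI-style message: opaque content plus name-value metadata."""
    content: bytes                       # payload, treated as immutable
    metadata: Dict[str, str] = field(default_factory=dict)

    def __post_init__(self) -> None:
        # Ensure every message carries a unique identifier in its metadata.
        self.metadata.setdefault("message-id", str(uuid.uuid4()))

msg = Message(b'{"order": 42}', {"content-type": "application/json"})
msg.metadata["priority"] = "high"        # metadata remains freely mutable
```

Real frameworks elaborate this shape considerably (multi-part content, typed headers), but the essential split between payload and name-value metadata is common to most of them.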
The centrality of messages in EAI cannot be overstated. As an abstraction, messages are decoupled from other abstractions such as endpoints and processes. This means that they exist independent of any notion of a service contract. Service-orientated approaches cannot be mandated in EAI, however desirable they may be. Perhaps more importantly, they can exist independent of any specific application, system or service. Messages possess the following characteristics:
Extensibility: Messages can be arbitrarily enriched and extended with additional metadata. Metadata supports correlation protocols, identity, routing information and any additional semantics with no requirement to change the message content or format.
Malleability: Message content can be amended or transformed as required as it is passed from source to recipient. In transforming a message, we often create a new message to hold the transformed content. Metadata can be used to record the association between the old and new versions of the message.
Routability: Static or dynamic routing decisions can be made at run-time to control the destination of messages, the protocols used to communicate those messages and other concerns. Such decisions are generally made by evaluating metadata against routing rules. This approach supports the flexibility needed in EAI to implement complex interchange and correlation patterns and to adapt rapidly to changes in business requirements.
Traceability: Messages are traceable as they pass between different services and applications and undergo multiple transformations. As well as the progress of individual messages, tracing can record the relationships between those different messages. This includes correlated messages (e.g., response messages correlated to request messages), messages that logically represent a given business transaction, batched messages, message sequences, etc. Tracing provides insight, supports troubleshooting and enables the gathering of metrics.
Recoverability: Messages can easily be persisted in highly available stores so that they are recoverable in the event of a failure. When individual messages are accorded significant worth to the business, this provides assurance that important data is not lost and will be processed appropriately. It supports high levels of service and is central to building robust automation of business processes.
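The malleability, routability and traceability points above can be sketched in a few lines. This is a hedged illustration under my own naming, not any product's API: a transformation yields a new message linked to its source via metadata, and a routing decision is made by evaluating ordered rules against that metadata, with unroutable messages diverted rather than dropped:

```python
import uuid

def transform(msg: dict, new_content: bytes) -> dict:
    """Malleability: produce a new message, recording its lineage in metadata."""
    return {
        "content": new_content,
        "metadata": {
            "message-id": str(uuid.uuid4()),
            "derived-from": msg["metadata"]["message-id"],  # traceability link
        },
    }

def route(msg: dict, rules: list) -> str:
    """Routability: evaluate metadata against ordered (predicate, destination) rules."""
    for predicate, destination in rules:
        if predicate(msg["metadata"]):
            return destination
    return "dead-letter"  # recoverability: unroutable messages are diverted, not lost

source = {"content": b"<order/>", "metadata": {"message-id": "m-1", "doc-type": "order"}}
enriched = transform(source, b'{"order": {}}')
dest = route(enriched, [
    (lambda md: md.get("doc-type") == "invoice", "accounts"),
    (lambda md: md.get("derived-from") == "m-1", "order-processing"),
])
# dest == "order-processing"
```

Production-grade routing engines add rule priorities, dynamic rule updates and persistence, but the underlying mechanism is this: decisions flow from metadata, not from inspection of the payload.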
In EAI, the focus is on the applications, systems and services that organisations invest in and rely on. For EDI, the focus is on the exchange of data with external organisations, including trading partners, suppliers and customers. EDI has much in common with EAI, but is about the integration of different businesses and organisations, rather than the integration of in-house applications. Like EAI, one of the main drivers is the need to manage change effectively and efficiently, even though that change is often beyond the control of IT.
What are Microservices?
Now that we have described the world of integration, we need to explore the concept of microservices. This is best understood as a set of principles for service-orientation. By ‘service-orientation’ I mean any approach to software design that conceives of systems and applications as a collaboration between discrete service components. For some people, the term has strong connotations with complex proprietary SOA frameworks and tooling. I use the term only in a literal and broad sense.
There is no one authoritative definition of the term ‘microservice’. However, a reasonable level of consensus has emerged. We can summarise the principles as follows:
Decompose applications into microservices: Microservices apply service-orientation within the boundaries of individual applications. Solutions are created from the ground up as a collaboration of fine-grained services, rather than monolithic applications with a front-end layer of service interfaces.
Let each microservice do one thing, and do it well: The ‘micro’ in microservices is too often equated with low SLOC counts. While SLOC can act as a useful heuristic for detecting microservice code smells, this misses the point. A microservice is focused on handling a small subset of well-defined and clearly delineated application concerns. Ideally, a microservice will handle just one concern. This focus makes it much easier to understand, stabilise and evolve the behaviour of individual microservices.
Organise microservices around business capabilities: Multi-tier architectures historically divide and group services by functionality. Services reside in the presentation tier, the business tier or the data tier. If we think of this as horizontal partitioning, then the microservices approach emphasises vertical partitioning. Microservices are grouped and partitioned according to the products and services offered by the organisation. This fosters cross-functional development teams that adopt product-centric attitudes aligned to business capabilities. It de-emphasises the boundaries of individual applications and promotes the fluid composition of services to deliver value to the business.
Version, deploy and host microservices independently: Microservices should be as decoupled and cohesive as possible. This minimises the impact of change to any individual microservice. Microservices can evolve independently at the hands of different individuals or small teams. They can be written in different languages and deployed to different machines, processes and runtime environments at different times using different approaches. They can be patched separately and retired gracefully. They can be scaled effectively, chiefly through the use of horizontal scaling approaches.
Use lightweight communication between microservices: Where possible, implement simple interchange through standardised interfaces. The general consensus is that REST and JSON are preferred approaches, although they are by no means mandated. Avoid complex protocols and centralised communication layers. Favour choreography over orchestration and convention over configuration. Implement lightweight design patterns such as API Gateways to act as intermediaries between microservices and clients. Design each microservice for failure using automatic retry, fault isolation, graceful degradation and fail-fast approaches.
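The ‘design for failure’ point deserves a concrete example. The following is a sketch of my own, not a reference implementation (real systems would typically reach for a circuit-breaker library): a call to a dependency is retried a bounded number of times with exponential backoff, and then fails fast rather than leaving callers blocked on a dead service:

```python
import time

class ServiceUnavailable(Exception):
    """Raised to fail fast once retries are exhausted."""

def call_with_retry(operation, attempts=3, backoff=0.01):
    """Retry a transient failure a bounded number of times, then fail fast."""
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise ServiceUnavailable(f"giving up after {attempts} attempts")
            time.sleep(backoff * (2 ** attempt))  # exponential backoff

# A flaky dependency that succeeds on its third invocation.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = call_with_retry(flaky)
# result == "ok" after two retries
```

The essential design choice is the bounded attempt count: an unbounded retry loop converts a partial failure in one microservice into back-pressure and resource exhaustion across its callers.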
Avoid centralised governance and management of microservices: Use contract-first development approaches, but don’t enforce centralised source code management, specific languages or other restrictions across different microservice development teams. Don’t depend on centralised discovery e.g., via service directories. Don’t enforce centralised data management or configuration, but instead let each microservice manage its own data and configuration in its own way.
The most common rationale for microservices contrasts them with the design and implementation of monolithic applications. At one extreme, I’ve seen monolithic applications defined as highly coupled solutions deployed as a single package to a single machine and run in a single process. Of course, few modern enterprise-level applications are designed this way. Multi-tier architectures, component-based design and service orientation, together with the widespread adoption of modern design patterns, have ensured that most enterprise-level development has long since moved on from the creation of such monstrosities.
A better definition of the term ‘monolith’ focuses on the need to deploy, version and patch entire applications as a whole, regardless of how they are physically structured. From this perspective, the problem is cast in terms of the impact of fine-grained changes on the application as a whole. A change in one component may require a full re-deployment of the entire application.
This type of monolithicity has a detrimental effect on the entire application lifecycle. Each developer must constantly re-build and re-deploy the entire application on their desktop just to test a small change to a single component. The build manager must maintain complex scripts and oversee slow processes that repeatedly re-construct the entire application from numerous artefacts, each worked on by different developers. The testers are restricted to black-box integration testing of large and complex combinations of components. Deployment is a fraught and costly battle to coax the entire ensemble to function in alien environments. Every patch and upgrade requires the entire application to be taken out of commission for a period, compromising the capability of the business to function. Significant change becomes infeasible just in terms of regression testing. To cap it all, once the architects and developers have moved on, no one is left with sufficient understanding of how the application functions as a whole.
Microservices replace the classic notion of the application, defined by tiers of functionality, with the concept of loosely-coupled collaborations of microservices grouped according to business capability. They facilitate the continuous evolution and deployment of solutions, allowing small groups of developers to work on constrained problem domains while minimising the need to enforce top-down governance on their choice of tools, languages and frameworks. Microservices support the agile, high velocity development of product-centric solutions using continuous deployment techniques. They support elastic scalability. They help to minimise technical debt.
An obvious objection could be that microservices lack novelty, by which I mean that they do not possess sufficient distinction from pre-existing and generally received architectural concepts to be of interest. Certainly each of the principles outlined above has a long history pre-dating the emergence of the term ‘microservice’. Such objections arise naturally when microservices are characterised as a remedy to mainstream notions of service-orientation. While it is true that some examples of service-orientated architecture prove vastly over-complicated for the problems they address, that is simply a matter of poor application of architectural principles. Any attempt to claim that ‘service-orientation is a bad thing’ and cast microservices as the solution misses the point entirely and quickly descends into caricature and absurdity.
In reality, microservice principles are a service-orientated response to the world of agility, devops and continuous deployment. As such, their novelty emerges from their ability to mould and fashion the direction of service-orientated development in the context of these concerns. They also represent the desire to ‘democratise’ software development, allowing developers from the widest circle to collaborate without unnecessary restriction.
A number of articles and presentations contrast the microservice approach to the use of proprietary integration and service bus frameworks. While some of the arguments are spurious and ill-informed, the underlying intention is good. It is the desire to avoid closed worlds with their ‘high-priesthoods’ in favour of a more open world in which mainstream development approaches can be used to solve problems by any suitably experienced developer.
I should declare my own position here. I have spent the last decade or more as a ‘high-priest’ of integration with a focus on a proprietary framework and set of tools. However, with the advent of cloud computing, I increasingly inhabit the ‘democratised’ world. I have, as it were, a foot in both camps. Indeed, I spend roughly equal time moving between these two camps. I see worth in both, and I believe they are more closely aligned than some imagine. However, I also recognise that the flow of history is clearly towards democratisation.
When integration and microservices meet
Now that we have defined the worlds of integration and microservices, we need to ask some obvious questions. Where and how do these two worlds meet? Do they overlap or not? Are they complementary or do they contradict each other?
There is plenty of scope for disagreement here. We can imagine an argument between two protagonists. Alice is an EAI veteran. Bob is a microservice evangelist.
Alice kicks things off by asserting that microservices are an irrelevance to her. Integration exists to handle the interchange between any applications, systems and services, regardless of their architecture. She is happy to integrate anything, including monolithic applications, microservices, SOA services and all systems of any kind, shape or size.
Bob, piqued by her dismissive attitude, retorts that if people had concentrated on creating microservices in the first place, rather than monolithic applications, there would be no need to integrate applications at all. It is Alice’s skills that would be irrelevant.
Alice, rising to the bait, responds loftily that Bob’s naïve idealism has no relevance in the real world and that she doesn’t expect to be looking for a new job anytime soon.
Bob, irritated by Alice’s tone, suggests that the very tools, approaches and architectures that Alice uses are monolithic, promote monolithic solutions and cause many of the problems microservices are there to solve. She is part of the problem, not part of the solution.
Now seriously annoyed, Alice claims that microservices represent a simplistic and childish approach that completely ignores the inherent complexity she faces every day. The tools she uses were selected because they address this complexity directly. Bob’s way of thinking, she claims, is born of lack of experience and a hopeless idealism. It can only promote chaos and invite failure.
I’m sure you agree this has to stop! We will leave Alice and Bob to their own devices, well out of earshot. For our part, wisdom dictates a cool-headed, reasoned response. We need to think through the issues carefully and honestly, making sure we take time to understand different perspectives and to properly analyse the root causes of the problems we face. I may be a high priest of integration, but I’m as keen as anyone to understand what works well, what works poorly and what is completely broken. Integration can certainly be a demanding discipline and the approaches we use sometimes leave much to be desired. Can the world of microservices inform us and help us do better?
There is a clear delineation of domains of interest that characterise the argument. My world broadly splits into two such domains. The first is the domain of business applications, systems and services. This is located firmly on the other side of the fence to where I am. I have no control over the applications that live in that domain. My job is to accept their existence, trust that the business has good reasons to invest in them and work out how to integrate them. My interests are different to, but do not conflict with, those of the developers who build and maintain those applications.
The second domain is that of integration. This is my natural home and here I have some control over my environment. I can select, or at least influence, the tools and frameworks I believe fit the problem domain, and I can design and implement integration solutions.
Clearly, microservice thinking applies to the first domain. It does so without conflict with the integration domain. However, microservices are unlikely to dominate the first domain any time soon. Most organisations will continue to invest in on-premises and SaaS line-of-business applications, enterprise-level CRM, CMS and ERP systems and data management systems. They will apply the principle of ‘buy before build’, and hence, even if the whole world moves to RESTful interchange at the API level, their services and applications will still be silos of functionality and data in need of integration.
Even in scenarios where organisations invest in writing custom applications and services, it is highly unlikely that they will be willing to re-write their entire custom estate around the principles of microservices. It is far more likely that organisations will adopt microservice approaches over an extended period, using evolutionary approaches to tackle new problems. They will only re-write existing applications and services as microservices when there is a compelling commercial reason to do so.
The rise of µPaaS
We are seeing the first stirrings of interest (this was written in late 2014) in merging the principles of microservices with the provision of Platform-as-a-Service in cloud environments. The concept is to build out public PaaS offerings through the provision of microservice marketplaces. In this emerging world, developers will create solutions by selecting pre-built microservices and combining and blending them with additional custom microservices. Public cloud platforms will support commercial approaches to monetise microservices. The PaaS platform itself, however, will leave developers free to exploit this marketplace, or not, as they choose. They can combine its offerings with custom-built and free/open-source microservices as required.
I cannot resist the temptation to call this new world ‘microPaaS’, or µPaaS. Its emergence is the main incentive to write this article. As soon as the µPaaS concept began to emerge, two key requirements came into sharp focus. The first is the need for better containerisation at the OS level. PaaS models must, of necessity, provide some kind of OS container for packaged code. This may be a virtual machine instance with automated provisioning. However, this locks developers into a single OS and any runtime environments that happen to target that OS. This violates the intention to allow developers to select the most appropriate tools and technologies for each individual microservice. In addition, microservices demand the freedom to deploy and host each microservice independently. Using an entire virtual machine as a container, possibly for a single microservice, is a top-heavy approach. Hence, a µPaaS needs lightweight, OS-agnostic containerisation. Efforts are currently focused on the evolution of Docker which, today, is a Linux-only technology, but tomorrow will emerge on other OS platforms, and specifically on future versions of Microsoft Windows.
The second issue is that of integration. In the microservices world, the vision often extends as far as different development teams collaborating within a larger organisation. However, on a public cloud platform, everyone gets to play. This is a problem. Microservices will be provided by entirely different teams and organisations. We can expect that, following the open source model, entire communities will emerge around mini-ecosystems of microservices that share common data representations, transactional boundaries and conventions. However, across the wider ecosystem as a whole, there will still be a need to provide mediation, transformation and co-ordination.
In the µPaaS world, the ideal is to provide integration capabilities as microservices themselves. The danger here lies in the constant re-invention of wheels, solving the same integration problems again and again. This suggests the need to provide first-class generic integration microservices as a fundamental part of the ecosystem. This, however, highlights a further risk. Generic integration microservices must cater for the complex and arcane issues that can arise when composing solutions from incompatible parts. They cannot afford to ignore this complexity. If they do so, they will be an endless source of frustration and will lower the perceived value of the ecosystem as a whole. Instead, they must implement first-class abstractions over the complexities of integration in order to avoid compromising the ‘democratised’ nature of a µPaaS. They must be easy for any developer to use. No high priests of integration allowed!
The need for integration capabilities in µPaaS is driven by another consideration. A µPaaS platform will be used to build new solutions. However, there will still be a need to integrate these with existing applications and services. This, of course, includes integration with on-premises applications as part of hybrid architectures. This integration can, of course, be achieved using existing EAI/ESB tools and products. However, µPaaS offers the chance to re-think the approach to EAI from a microservices perspective. Again, a driving force for this is the democratisation of EAI, bringing down the cost and effort required to integrate applications. Done well, a microservice approach to integration will result in solutions that are easier to maintain and evolve over time, which scale easily, but which provide the robust message handling capabilities at the heart of integration.
One other reason for providing integration services in µPaaS is to support EDI workloads. The cloud provides an obvious location to host EDI solutions, and we have already seen the emergence of iPaaS support for EDIFACT/X12 and AS2, together with trading partner management functionality. Expect to see this capability evolve over time.
The future landscape
Organisations that have made significant investment in EAI, EDI and service bus technologies are unlikely to replace those technologies with microservices in the near future. These tools will continue to play a critical role in enabling organisations to integrate their systems and applications effectively. Until we see µPaaS providing equivalent functionality, they will retain their role as the appropriate tools, frameworks and products for enabling robust, enterprise-level integration of mission-critical workloads.
Microservices apply service-orientated thinking inside application boundaries and serve to break down those boundaries. Contrast this with the application of service-orientation at the application boundary itself. Ten years ago, it was still rare for commercial applications to provide a web service API. Now, it is almost unthinkable for any modern business application to omit such features. In turn, this has allowed EAI tools to evolve more closely towards the concepts of the enterprise service bus. Likewise, many ESB products add value by incorporating integration technologies and tools.
Many of the concerns addressed by existing EAI tools are analogous to those of the microservices world. EAI emphasises strong decoupling of applications and services, ensuring those investments can vary and evolve over time, or even be removed or replaced with minimal impact on other applications and services. Within the integration domain itself, most modern EAI and ESB products implement integration components as services. They generally allow those services to be hosted independently and to be scaled horizontally, although cost issues related to licencing and hardware can place limits on this. Integration services are often fine-grained, supporting a constrained set of behaviours for mediation or transformation. They evolved before the notion of microservices was conceived, and they do not generally adhere to all the microservices principles. However, they share a common heritage with microservices and share similar goals.
One issue that muddies the waters in EAI is the hosting of business logic within the integration domain. This can be a controversial matter. Some argue that business logic should be hosted within separate applications, systems and services. This may be driven by the centrality of ERP systems within organisations, or the need to ensure that different parts of the organisation take responsibility for automating the activities in which they are engaged. In this case, the integration layer is viewed simply as a communication hub that mediates messages between these systems. Others argue that business logic services naturally belong within the integration layer. This approach emphasises the need to decouple automated business processes from supporting applications and systems in order to allow the organisation to rip and replace those systems over time with minimal impact on custom logic.
In my experience, the driving forces that dictate the best approach have more to do with organisational culture and longer-term IT strategy than with any architectural principle. Part of the art of integration is to intelligently predict how business requirements and IT landscapes are likely to evolve over time and to design solutions accordingly. This explains why, in many scenarios, the investment in EAI and ESB products results in the hosting of significant business logic within the integration domain.
What, then, of the future? Microservices and µPaaS will undoubtedly work their magic in the enterprise space. However, they won’t be used exclusively. Integration will, in part, move to the µPaaS world. µPaaS itself will predominantly favour the public cloud, but will also be available within private cloud implementations. Today’s EAI and ESB tools will evolve along the lines of cloud enablement and will continue, for the foreseeable future, to play an important role within the enterprise. Where business logic today is hosted in the integration domain, we can expect a move towards the use of microservices. Integration itself will be ‘democratised’, at least to an extent. This will reduce costs and timescales, and help organisations meet the challenges of the future.
Footnote: notwithstanding the reported advent of automated real-time translation capabilities in Skype. No analogy is perfect!