Defining Interoperability


What does the term interoperability really mean? What does it mean to "interoperate" or be "interoperable" when designing solutions? A number of definitions are available, depending on whom you speak to or where you search. Some define interoperability in terms of relationships; others have a much more specific view based on the technology that they're describing.

To define this term, I use a three-part explanation: one formal, one pictorial, and one comparative. The first part of the definition, the formal part, is derived from the ISO Information Technology Vocabulary and looks like this:

Interoperability enables communication, data exchange, or program execution among various systems in a way that requires the user to have little or no awareness of the underlying operations of those systems.

I find the final part of this definition the most pertinent: "in a way that requires the user to have little or no awareness of the underlying operations of those systems." To me, this is the ultimate goal of building a solution today. Interoperability is about connecting and building applications that work with each other to such an extent that the presentation to the user is seamless.

One analogy for this leads to my second definition of interoperability: the pictorial representation. When describing interoperability, I always make reference to a popular lake-dwelling species, the duck. If you've seen a duck swimming across a lake, you might have noticed that the top half of the duck looks very serene. The duck appears to glide across the lake with little to no effort. Despite this calm exterior, you know that underneath, the duck's legs are frantically kicking in all directions to get to the other side of the lake. Such is the way of interoperability. A well-designed solution that interoperates among many diverse systems should appear "calm and serene" to the user: the user should have no awareness that a click of a button or a switch of a page might create many calls to various systems throughout the enterprise, somewhat akin to the frantically kicking legs of our feathered friend.

The third part of this interoperability definition draws upon the previous two but offers a slightly different slant. Migration, portability, and interoperability are three terms used in many situations when developing applications to work in a cross-platform environment. These three terms can have different meanings to different people. When dealing with projects that require one or all of these characteristics, keep in mind the following comparison:

Imagine two components. They can be business components, Web pages, classes: anything at all. Imagine that one of the components has been written for the .NET platform and the other has been written for the J2EE environment.

If you convert or rewrite one of the components (using either an automatic process or a manual one) so that it now runs on the other platform, this is known as migration. If you move one of the components to a different vendor but keep it on the same platform, this is known as portability. If, however, you leave each of the components on its own native platform but enable communication and data exchange between the .NET and J2EE platforms, this is interoperability.
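To make the comparison concrete, here is a minimal sketch (in Java) of the interoperability case: a component on the J2EE side exchanging data with a component left running on the .NET platform, using nothing more than HTTP and an agreed response format. The endpoint address, class name, and payload are assumptions made purely for illustration; they do not refer to any real service.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Sketch only: a Java client exchanging data with a hypothetical .NET
    // component over HTTP. The URL and response format are illustrative
    // assumptions, not part of any real system described in this book.
    public class InteropSketch {

        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint exposed by the .NET component.
            URL endpoint = new URL("http://dotnet-host.example.com/orders/42/status");

            HttpURLConnection connection = (HttpURLConnection) endpoint.openConnection();
            connection.setRequestMethod("GET");
            connection.setRequestProperty("Accept", "text/xml");

            // Read the response; the caller needs no awareness of how the
            // remote platform is implemented, only of the wire format.
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(connection.getInputStream()));
            StringBuffer response = new StringBuffer();
            String line;
            while ((line = reader.readLine()) != null) {
                response.append(line);
            }
            reader.close();

            System.out.println("Response from the .NET component: " + response);
        }
    }

The point of the sketch is that neither component has been rewritten or moved; each stays on its native platform, and only the communication between them has been enabled.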

You might find your own definition of interoperability, one that better applies to the systems you use today, or you might find an alternative way to describe it. Either way, I hope my three-part definition helps illuminate the key messages put forth in this book.

Interoperability vs. Migration

So what did I learn from my experience at the Architect Council? Did I learn that no one cares about migration and that any attempt to migrate from J2EE to .NET would be futile? Absolutely not.

A huge opportunity does exist in the migration space. Customers do have code that can benefit from running on the .NET Framework, and some of this code already exists within J2EE applications today. For many customers, migration will happen over time, but at the time of this writing, interoperability is more important than migration.

Interoperability vs. Portability

As I mentioned in the last part of the interoperability definition, portability is the notion of running a single component or piece of code on platforms based on multiple implementations from various vendors. For the J2EE space, this is an easier concept to digest. I can, for instance, take a component written to the J2EE specification and, in theory, run it on either Application Server A or Application Server B, based on my strategy with (or my liking or disliking of) the application server vendor at that particular time. In addition, Application Server A might be written for the Microsoft Windows platform, and Application Server B might be written for UNIX.
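As a rough illustration of what "written to the J2EE specification" means in practice, the following sketch shows a component that depends only on the standard servlet API and on no vendor-specific classes. The class itself is invented for this example; in principle, such a component can be deployed on Application Server A or Application Server B without change.

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Sketch of a portable component: it imports only the javax.servlet API
    // defined by the J2EE specification, never a vendor's own classes, so it
    // should deploy unchanged on any compliant application server.
    public class PortableStatusServlet extends HttpServlet {

        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws IOException {
            response.setContentType("text/plain");
            PrintWriter out = response.getWriter();
            out.println("Status: OK");
        }
    }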

Because versions of .NET aren't developed by multiple vendors or for multiple operating systems, portability does not apply to a .NET application today. The point of this book isn't to detail (or argue about) the suitability of either the Java or .NET platform for developing applications. This book only succeeds if it promotes interoperability between the two platforms. Many arguments exist for and against realizing portability in the enterprise. Some developers see portability as key to maintaining a vendor-neutral approach, while others doubt that portability will ever be realized, citing an application-server vendor market that grows more fragmented with vendor-specific extensions each day.

I prefer to leave the argument about portability to other programming books and resources. This book is mainly concerned with showing you how to achieve interoperability. In my opinion, designing solutions that achieve interoperability allows a lot more flexibility for the enterprise than portability alone.

The Benefits of Interoperability

Many customers whom I've spoken with want to be confident that there's a good strategy for developing solutions using the .NET Framework that interoperate with heterogeneous systems. This is the real concept driving this book.

As a systems designer, architect, developer, or user, why should you be interested in interoperability? What does interoperability give you that replacing and recycling technology doesn't? I see four clear advantages:

  • Reuse of existing systems. Most established companies have a number of legacy systems. By the term legacy, I mean technology that's not being actively developed today. For example, a system located in the data center that's still in production but no longer offers a strategic advantage to the company is a legacy system. A plan to move these systems to a new platform might be a longer-term strategy. A solution that has the ability to interoperate with these systems has the potential to extend the life of those systems and, more importantly, the knowledge of the developers who work with them.

  • Delivery based on technical merit. Designing an architecture that can enable interoperability promotes the selection of platforms based on technical merit. One could argue that every platform and technology, whether it's .NET, J2EE, or anything else, has its own merits for deployment. These merits could include maturity, reliability, scalability, security, technical advancement, and so forth. An approach to developing applications and services that can interoperate with one another lets platforms be selected based on their merit and applicability to the job, giving greater choice regardless of the vendor.

  • Pilot for adoption. When organizations want to deploy a new technology such as the .NET Framework, it's rare that they simply rip and replace an entire application or system. In many cases, a replacement is triggered by a pilot, or proof-of-concept, project. Such a pilot tends to be a short-term project with the aim of proving that the technology can work well with existing systems and applications. The ability of this pilot or proof-of-concept project to interoperate with existing production systems is imperative and can often determine its success.

  • Migrations. Even if a system will be replaced or updated, it's rare to find a case where this can be done with a single "flick of a switch." Many migrations have to be well planned and carefully executed, and they often involve moving an application a few parts at a time. Dividing a system this way for migration purposes often creates a demand for interoperability, because parts that have already been migrated might still need to communicate with others that have not.

If interoperability helps enable these tasks, it might ultimately result in savings of developer time and resources. For example, you might increase the shelf life of existing applications and systems, maximize current developer skills, and provide agility by creating proof-of-concept projects and carefully planned migrations.

The History of Interoperability

Is the ability for systems to interoperate new? Well, yes and no.

Interoperability in its most general form has been around for years; it started when the first computers began communicating via a network. If you look at some of the early protocol definitions from the 1970s and earlier, you'll find concrete examples of two systems "interoperating" over a network.

If, on the other hand, you look at the history of interoperability between pre-.NET Microsoft technology and Java (for example, connecting a COM object to a JavaBean), you'll find that relatively few options existed until recently. During the past couple of years, a number of options for connecting the two platforms have emerged. These have included custom socket classes, custom implementations of Java RPC specifications (some deriving from standards such as Remote Method Invocation, or RMI), and a number of basic interoperability scenarios via HTTP.

The problem with these options is that although they're technically sound and suited to the task, they were never built on standards. For example, a developer might have created a custom protocol based on network sockets (which both a COM-based and a standalone Java application could support). This implementation might have signed and encrypted data across the network between the two platforms and been very successful at the task it was designed to perform. However, if the developer then wanted to connect this system to another system that a third party had created (which again might implement a similar set of security functionality), chances are that unless the two developers had worked together on both projects, the systems wouldn't interoperate.
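To make the problem more tangible, here is a hedged sketch of the kind of home-grown socket exchange described above. The host name, port, command string, and length-prefixed framing are all conventions invented for this example, and that is precisely the weakness: a third party who doesn't know these private conventions cannot interoperate with the system.

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.net.Socket;

    // Sketch of a custom, non-standard socket protocol between a Java client
    // and some COM-based server. The host, port, and length-prefixed framing
    // are private conventions invented for this example; a third party that
    // doesn't share them cannot interoperate.
    public class CustomProtocolClient {

        public static void main(String[] args) throws Exception {
            Socket socket = new Socket("legacy-com-host.example.com", 9099);

            DataOutputStream out = new DataOutputStream(socket.getOutputStream());
            DataInputStream in = new DataInputStream(socket.getInputStream());

            // Private framing: a 4-byte length prefix followed by UTF-8 bytes.
            byte[] request = "GET_ORDER_STATUS 42".getBytes("UTF-8");
            out.writeInt(request.length);
            out.write(request);
            out.flush();

            // Read the reply using the same private framing.
            int replyLength = in.readInt();
            byte[] reply = new byte[replyLength];
            in.readFully(reply);

            System.out.println("Reply from the COM side: " + new String(reply, "UTF-8"));
            socket.close();
        }
    }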

This is probably the most fundamental change in interoperability during the past few years, and it shapes how products that enable interoperability continue to evolve today. It's no longer a question of whether you can achieve interoperability; it's a question of how you achieve it. Existing and emerging standards bodies will play an increasingly important role as these products mature during the next few years. The later chapters in this book should highlight how important this standardization is.



