Welcome to Orcmid's Lair, the playground for family connections, pastimes, and scholarly vocation -- the collected professional and recreational work of Dennis E. Hamilton
Interoperability: No Code Need Apply?
Technorati Tags: interoperability, open-source development, open specifications, open standards, open-systems integration, protocol specifications
An article by Dino Chiesa has led me to look deeper at the ways that development of open protocols might be similar to open-source activities, even though open protocols for interoperability ideally do not require knowledge of anyone's code. It is more valuable, I say, to look at how the specifications and the verification of implementations are reconciled over time. It isn't always pretty, but maybe there is a place where the many-eyes ideas about improvement of open-source software apply to open specifications.
1. Interoperability and Open-Source Development Are Different
Dino Chiesa has a provocative blog post about Open Source and Interoperability (via Enzo De Lorenzi), arguing for a separation of open source distribution, a development and licensing approach, and interoperability, the ability to connect and operate systems and components together to accomplish some purpose:
I'm already convinced of that. What I am more taken with in Chiesa's post is his analysis of what is important for interoperability. I also think that practices associated with open-source can have a role there. Let's take a look.
2. The Advantage of Not Seeing the Code
First, Chiesa concedes that re-use of code from different sources may be aided by seeing the other code. That is a minor interoperability situation, however useful for understanding and repurposing of other code.
There are greater benefits in not having the code to look at. Chiesa puts it this way:
Protocol specifications describe the protocol data elements, interfaces and the essential behavior of the parties, independent of exactly how a particular software implementation accomplishes its rôle under the protocol. The liability of starting from code is that behavior is buried in accidental and inessential details that obscure recognition of precisely and solely what must happen for the parties to interoperate.
It is particularly easy to demonstrate the problem of working from code if the parties are unable to use the same programming-language system and platform for their implementations. Then it is necessary to reverse-engineer the actual protocol out of the code, freeing it from incidental, implementation-specific baggage that could be a terrible drag if simulated unnecessarily in the second implementation.
Having the behavior and data units be well-specified provides a superior basis for interoperability. It is also advantageous for the future maintenance and portability of the initial implementation, even when the protocol is first born in a single-platform product from a single producer.
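To make the distinction concrete, here is a minimal sketch in Python of what "the specification is the contract" looks like. The wire format is a toy invented for illustration (not drawn from any real specification): a frame is a 1-byte version, a 2-byte big-endian payload length, then the payload. Two parties that agree only on that description can interoperate without ever seeing each other's code.

```python
import struct

def encode_frame(version: int, payload: bytes) -> bytes:
    """Producer side: emit exactly what the (hypothetical) specification
    calls for -- version, big-endian length, then payload."""
    return struct.pack(">BH", version, len(payload)) + payload

def decode_frame(frame: bytes) -> tuple[int, bytes]:
    """Consumer side: rely only on the specified layout, and reject
    anything that violates it."""
    version, length = struct.unpack(">BH", frame[:3])
    payload = frame[3:]
    if len(payload) != length:
        raise ValueError("declared length does not match payload")
    return version, payload

# The two sides share only the format description above, not each
# other's implementation details.
frame = encode_frame(1, b"hello")
assert decode_frame(frame) == (1, b"hello")
```

Either function could be rewritten in another language, on another platform, with any internal representation, and the pair would still interoperate -- which is exactly the property that code-reading cannot guarantee and a specification can.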
As an adherent of this point of view, I was surprised to learn that Microsoft began to make source code available to licensees of its Open Specifications (those under the MCPP program at the time) in early 2006. This was understandably expedient, considering how difficult it would have been for an independent technical committee at that time to derive specifications (with whatever reverse-engineering and confirmation testing that might entail) and to develop prototype implementations. The lesson: arrange matters in the future so that inspection of the code becomes unnecessary for knowing the protocol essentials.
Dino Chiesa's example includes the prospective licensing of documented but proprietary protocols. Although that is certainly one case of interoperability arrangements, I want to separate out the licensing of closed protocols and consider the degree to which open-source development practices harmonize with the development of open protocol specifications, with or without licensing of intellectual property. The difference is that open protocol development involves an open approach to the specifications themselves, independent of whether there are open-source implementation efforts.
3. Open Protocols and Community Engagement
More from Chiesa:
Protocol specifications in which one or more parties have invested significant interoperable implementations are indeed a form of standard, even when entirely held in private arrangements. The specifications are standards because they provide authoritative statements of what is essential to achieve and preserve to accomplish implementation interoperability. Specifications provide the measure.
Specifications are also subject to versioning and consideration of ways that implementations will manage to interoperate appropriately with implementations developed to older and to newer versions of the specification. The tension between preservation of value and expanded utility is part of the evolution of interoperability-oriented specifications.
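One common way implementations built to different revisions of a specification manage that tension is version negotiation. The sketch below is illustrative only, with all names invented here: each peer declares the specification versions it implements, and the parties settle on the highest version in common.

```python
def negotiate(offered_versions, supported_versions):
    """Pick the highest specification version both parties implement,
    or None if no common version exists."""
    common = set(offered_versions) & set(supported_versions)
    return max(common) if common else None

# A peer built to versions 1-2 of a hypothetical specification, talking
# to a peer built to versions 2-3, settles on version 2: existing value
# is preserved while newer features wait for both sides to catch up.
assert negotiate([1, 2], [2, 3]) == 2
assert negotiate([1], [3]) is None
```

The design choice worth noting is that the negotiation rule itself must be part of the specification; if each implementation invents its own fallback behavior, the tension between old and new versions is resolved differently at every pairing.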
How do protocol specifications provide standards in this sense?
In practice, the refinement and testing of the specification itself happens when there are multiple implementations and their interoperability is tested and confirmed. Efforts to implement the protocol (a special case: a format for interchange of information) will lead to questions of interpretation and ambiguity as the specification is studied by implementers. Arrangements for laboratory verification of actual interoperation, perhaps with test suites, will reveal misunderstandings, bugs in implementations, and bugs in the specification.
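The shape of such laboratory verification can be sketched simply. In this illustration (all names and the "uppercase-hex" encoding are invented for the example), two independently written implementations are cross-tested: every producer/consumer pairing must round-trip every shared test vector, so a failure localizes a misunderstanding to one implementation or to an ambiguity in the specification itself.

```python
def run_interop_matrix(implementations, test_vectors):
    """Round-trip each test vector through every producer/consumer
    pairing; return (producer, consumer, vector, ok) results."""
    results = []
    for producer_name, producer in implementations.items():
        for consumer_name, consumer in implementations.items():
            for vector in test_vectors:
                try:
                    ok = consumer["decode"](producer["encode"](vector)) == vector
                except Exception:
                    ok = False
                results.append((producer_name, consumer_name, vector, ok))
    return results

# Two differently coded but conforming implementations of a hypothetical
# "uppercase-hex" text encoding.
impl_a = {"encode": lambda s: s.encode().hex().upper(),
          "decode": lambda h: bytes.fromhex(h).decode()}
impl_b = {"encode": lambda s: "".join(f"{b:02X}" for b in s.encode()),
          "decode": lambda h: bytes(int(h[i:i+2], 16)
                                    for i in range(0, len(h), 2)).decode()}

results = run_interop_matrix({"A": impl_a, "B": impl_b}, ["spec", "bugs", ""])
assert all(ok for *_, ok in results)  # every pairing interoperates
```

The matrix grows quadratically in the number of implementations, which is exactly why many eyes and many implementations shake out specification bugs faster than any single vendor's internal testing can.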
Once a specification is in widespread use, there are inevitably matters of imperfection:
The opportunity for open community involvement in what is held as an open protocol can be in accelerating the maturation and stabilization of a specification by the attraction of many eyes and interested parties. Efforts to comprehend and apply the specification in interoperability cases are invaluable.
Another opportunity, given that the specification is open enough for such use, is creation of a reference implementation and samples that (once themselves stabilized) do serve as a way to test implementations for essential functionality. It is appealing to use open-source development in this case.
4. What Was the Question, Again?
Have we come full circle? That depends. An open-source reference implementation need not be product-worthy. It may be designed to operate in a very straightforward way, with no optimizations or usability considerations. Samples are just samples, kept simple for understandability. And there is no need to disclose production-implementation code. (I am also neglecting intellectual-property considerations that may limit the degrees of freedom available for unconstrained community contribution.)
The reality is that specifications do not have value without implementations that establish their credibility and utility in achieving interoperability. And specifications themselves are tested and refined as the result of implementation effort. There is a cycle of learning and improvement between specifications, implementations, and experience. The challenge is to remove friction and accelerate that process.
Working to make an open specification and have community involvement in perfecting the specification may be crucial to fostering take-up of an interoperability opportunity. Open-source development practices are a low-friction way to invite community contribution and deliver a mutual benefit. Having the specification be open for public use, feedback, and discussion and safe for some kind of implementation (depending on license conditions) is a valuable way for a community to find and organize itself to collaborate on interoperability.
This is something to consider when designing for interoperability and when assessing the level at which interoperability is invited. Perhaps the most important consideration is that fostering and sustaining interoperability is a journey, not a destination. It won't look perfect.