2008-02-26: Interoperability by Design
My professional life is all about interoperability and designing for interoperability. It has become so much a part of how I think and the way I approach problems that I am mostly unaware of it. It all came back to me on Thursday, February 21, as I noticed how much Microsoft is working to accommodate interoperability beyond its own products and corporate business/development models.

Others have pontificated on the (lack of) significance of the Microsoft interoperability announcement and the delivery of interface, protocol, and format details to the public as part of Microsoft's interoperability principles. I will emphasize aspects of the announcement that fall in my areas of concern. We are all looking at the same tea leaves. This reading is mine. Lest there be any doubt, I am strongly in favor of this direction and look forward to its continuing progression and positive results for our industry and our society.

1. Design for Interoperability

The February 21 statement of Microsoft Interoperability Principles features open connections, standards support, and data portability as pillars, along with an approach to open engagement in the industry.

Interoperability does not happen and succeed by accident. Interoperability, just like other important architectural qualities, happens on purpose -- by design. Ultimately, it comes down to designing for interoperability, just as one designs for usability, designs for security, designs for performance, and designs for dependability. These strike deep into the design and engineering processes. An interoperability initiative, just like a security one, takes considerable nurturing and, at the end of the day, will have altered the engineering culture at Microsoft and at other organizations that take on comparable commitments.

In terms of the technical artifacts in which interoperability is manifest, the open-connection, standards-support, and data-portability elements of the principles stand out for me.
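To make the data-portability pillar concrete, here is a minimal sketch of my own, in Python; it is my illustration, not anything from the announcement, and the "format-version" field is a hypothetical stand-in for the explicit versioning a documented format would specify. It contrasts an opaque persistence choice with a portable-by-design one.

```python
import json
import pickle

record = {"title": "Interoperability by Design", "revision": 3}

# Opaque: a pickle ties the data to one language and one implementation.
# Another party cannot interoperate without reverse-engineering the bytes.
opaque_bytes = pickle.dumps(record)

# Portable by design: a documented, self-describing representation with an
# explicit version marker ("format-version" is a hypothetical field here),
# so an independent implementation can read and write it from the
# specification alone.
portable_text = json.dumps(
    {"format-version": "1.0", "document": record},
    indent=2,
    sort_keys=True,
)

print(portable_text)
```

The point is not the particular syntax; it is that the second form is governed by a specification that someone other than the original implementer can code to.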
I trust that designing for interoperability will be good for Microsoft, but it is also expensive, at the beginning and in the sustaining. It takes effort to develop systems in such a way that the behavior is well-specified, the conceptual models and interoperability elements are explainable, and structures are in place to keep the arrangement resilient. It is quite different to provide for the unexpected arrival of other participants, many of whom will find omissions and confusing aspects in the available materials. It takes a commitment to raise the quality of the now-interoperable surface, starting from a place where the work had previously been done by teams who could look over each other's shoulders and make the code work well enough for the case at hand. I can't see how Microsoft's product architectures will fail to benefit.

I was not surprised to learn of the effort and investment that it took to prepare specifications in compliance with the European Community regulations. It is a big deal to provide this kind of material. It will be a big deal to orient the engineering culture to anticipate the need for it and reflect that in the approach to design and development.

Some Tea Leaves. There is evidence for this awakening in what Chief Software Architect Ray Ozzie says.
He goes on to observe that the need to interoperate with unanticipated third-party elements in a distributed and connected world is also one of those concerns that can no longer be ignored.

Here's a specific illustration. Microsoft recently licensed code from a third party as part of extending the Microsoft Foundation Class libraries to allow Visual Studio 2008 C++ applications to match the fluent-interface techniques of Office 2007. This was done instead of extracting the implementing code from Office 2007 itself. Herb Sutter gave his impression of how that came about.
Joel Spolsky, who has his own history with Excel, has a useful account of how document formats become extremely difficult to interchange in his "Why are the Microsoft Office file formats so complicated? (And some workarounds)."

My experience is that it can be a painful and demanding journey to move from interesting internal usage to a product-quality API, protocol, or data format. Ask anyone from an IT organization whose employer thought they could make one of their custom-built business applications worthy of sale to others. The same happens for other internally-used protocols, libraries, and special-case APIs. We also saw how much effort it took to get from a Microsoft draft to a fully fleshed-out ECMA-376, one that is still being refined and will continue to be refined long after the February 2008 ballot-resolution meeting is past. The same goes for ODF (and may already be happening, though not so visibly or excitedly).

More Tea Leaves. Microsoft General Counsel Brad Smith talks about how the interoperability effort engages his attention.
Regardless of how much of that documentation was the work product of a greater undertaking, perfecting materials to be relied upon in an interoperability setting, open to all comers, poses extraordinary challenges, especially after the fact.

My main point is that designing for interoperability alters the character of software engineering. I see this announcement, and the effort that it portends, as moving Microsoft into that kind of transformation. Ray Ozzie said as much at the end of the announcement.
2. Engagement at the Interoperability Boundary

The just-announced principles establish the contours of the interoperability surface that Microsoft proposes to expose and be accountable for. The surface consists of specified entities: APIs (interfaces and their behaviors), protocols, and formats.

There are also noteworthy actions that go beyond the provision of specifications. I've already suggested the extensive impact on the Microsoft product-development organizations. But there is also a serious shift in how interoperability is worked out through engagement with others. Participation in industry standards efforts is a sanitary form of that, but more is involved in providing an assurance of interoperability. The interoperability principles address that as well.
We probably do need reminding that Microsoft is heavily involved with a wide variety of standards organizations and is a significant adopter of standards in its products. Along with the Interoperability Principles announcement, Microsoft has provided fact sheets on how standards fit with Microsoft products and their interoperability features.

To encourage industry support for interoperability and effective data exchange, Microsoft will launch a Document Interoperability Initiative with labs where implementers can verify interchange, optimize their data exchange, develop conformance tests, and publish templates that enable optimized interoperability among different formats. (My goodness, I already have a stake in that. This is the first practical offer I have seen with regard to conformance testing and, indirectly, qualification of products for use under various usage conditions. I harp on the need for that from time to time, and this is the first affirmative response I have encountered. A minimal sketch of what such a conformance check might automate appears at the end of this section.) Server and Tools Senior Vice President Bob Muglia says the point of the Document Interoperability Initiative is "to ensure that the documents that are created by users are fully exchangeable, regardless of the tools that they are using."

The pro-active engagement of a community of adopters and users around the Microsoft interoperability surface is where we should look for evidence of serious commitment. Microsoft has declared itself ready to demonstrate its commitment to the interoperability principles through appropriate action. Brad Smith spoke to that in response to a question at the announcement.
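To give a flavor of what a conformance test in such a lab might automate, here is a minimal sketch of my own, in Python, of some surface-level checks on an ODF package (which the ODF specification defines as a ZIP archive carrying a declared mimetype and a manifest). A real conformance suite would go much deeper, validating the XML content against the OASIS schemas; treat this as an illustration only.

```python
import zipfile

def check_odf_package(path):
    """Surface-level conformance checks for an OpenDocument package."""
    problems = []
    with zipfile.ZipFile(path) as package:
        names = set(package.namelist())
        # ODF packages declare their media type in a 'mimetype' entry.
        if "mimetype" not in names:
            problems.append("missing 'mimetype' entry")
        elif not package.read("mimetype").startswith(
                b"application/vnd.oasis.opendocument"):
            problems.append("'mimetype' does not declare an OpenDocument type")
        # Every package part is supposed to be listed in the manifest.
        if "META-INF/manifest.xml" not in names:
            problems.append("missing 'META-INF/manifest.xml'")
        # Document content lives in 'content.xml'.
        if "content.xml" not in names:
            problems.append("missing 'content.xml'")
    return problems

# Example use: an empty list means the package passed these checks.
# print(check_odf_package("example.odt"))
```

Independent implementers agreeing on checks like these, and on much stronger ones, is exactly the kind of work such labs would host.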
3. The Open-Source Conundrum

There is a serious creative tension for open-source developers with respect to the Open Protocols under the Microsoft Interoperability Principles. First, there is a specific protection with regard to the Open Protocols: a covenant not to sue open-source developers for the development and non-commercial distribution of implementations of those protocols.
That covenant applies to the identified Open Protocols (a list that will be extended over time). Implementations covered under the Open Specification Promise are not limited to non-commercial distribution. And finally, with regard to published APIs, "Third parties do not need licenses to any Microsoft patents to call these Open APIs." The last case simply perpetuates the existing ability to distribute open-source products that run on Microsoft Windows and rely on the APIs that Microsoft makes available for any developer to access Microsoft platform functions.

(By the way, the Open Specification Promise has moved; not all Microsoft links have caught up. The latest (February 15, 2008) version of the OSP is under "Interoperability"; an older (January 10) version is still under "Standards.")

Now, even for open-source developers, the patent disclosures that accompany the Open Protocols are important.
So, if you are interested in implementing an open protocol, you can find out exactly what patents Microsoft asserts on the protocol (but not necessarily anyone else's patents) and determine (with appropriate legal advice) whether or not you can work around them. When it is clear that a patent cannot be worked around, the open-source conundrum is present. The conundrum is, as others have noted, that the covenant protects only development and non-commercial distribution, while open-source licenses cannot restrict recipients from redistributing commercially.
Assuming that an open-source developer chooses to make an implementation available within the confines of the covenant, there are now two challenges under the conundrum: which open-source licenses can be used at all, and how to give notice to those downstream. I take these up in turn.
3.1 How Can Open Source Licenses Be Used?

[This is not legal advice. If you want legal advice about this, consult an attorney with understanding of the legal situation where you live and operate.]

Most open-source licenses are about copyrights. They involve granting perpetual, non-exclusive licenses to the exclusive rights you have as a software author. To the extent that your work is a derivative of the copyrighted work of others, you are also relying on the rights those authors have granted for you to do so. This has nothing to do with patents.

Got it? The aspects of open-source licenses that are about creative works of authorship and the exclusive rights of authors have nothing to do with patents or patent law (at least in the United States). An open-source license can comply fully with the 10 qualities of the Open Source Definition (version 1.9) without saying anything about patents, especially any that the author does not possess. It is clear that, with regard to patents, the Open Protocols not covered by the Open Specification Promise do not constitute Open Standards (2006-07-24 definition), but that is not about open-source licensing.

Some open-source licenses say nothing at all about patents. For example, my favorite BSD template license says nothing about patents or anything like patents. Some have clauses that are triggered by the assertion of a patent. The GNU General Public License (GPL 2.0), the one most often mentioned in this context, has interesting language in its section 7:

> If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program.
To be clear, that means you couldn't (re-)distribute under the GPL 2.0 (since you are not allowed to restrict commercial use under the GPL 2.0). This condition apparently survives into GPL 3.0, although as a consequence of different language; see that license for further details. If you are not constrained to use the GPL, there are other options.

Some licenses add declarations about patents. These are declarations made by you as the author employing that license; they are not about someone else who is not a party to the open-source development. Licenses such as the Apache License 2.0 require that you as a contributor (and those who contributed before you) grant a royalty-free license to any applicable patent of yours (or theirs). The GPL 3.0 license also constitutes your granting, as a contributor, a royalty-free license to any patent that you could assert. Some licenses, such as the Apache License 2.0, also have a trip wire that prevents you from using a work licensed under them if you assert a patent against use of the work by others. This is not, of course, about Microsoft; it is about you.

So, depending on the license that you are using (and any constraints under the license of the software you may be deriving your program from), you might be free to choose a license that does not prevent you from distributing the work, even though not all license-permitted forms of redistribution will be protected under the covenant not to sue.

3.2 What About Notice? The Ethical Consideration

Assume that an appropriate open-source license is available and acceptable for your purposes. There remains the question of how to alert those downstream from you that the covenant not to sue applies to them only if they independently honor the conditions of the covenant. It would seem that the appropriate measure is to provide notice as part of the distribution.
We have not heard the last of this. I have nothing useful to add on how to make this any easier for open-source developers and distributors. You'll notice that I have neglected those options that involve negotiating a patent license and making a commercial distribution. It remains to be seen whether open-source developers are able to stomach that level of compromise. I am sure that commercial developers will not be deterred.

4. The Journey

Microsoft is learning that, as an enterprise with monopoly power, it is subject to different standards than those who are unable to exercise such control over markets (no matter what their ambitions might be). I see in this announcement evidence that Microsoft is looking to define a responsible relationship that allows it, its competitors, and its customers to thrive in an industry that Microsoft was instrumental in bringing to the level it has now reached. Whether or not Microsoft must now address legal and regulatory considerations in the moves it makes, I grant that there is an enlightened self-interest at play in how Microsoft responds proactively as time goes on. Brad Smith spoke to this at the announcement and again in response to a question; I think the last word on it, though, was earned by Microsoft CEO Steve Ballmer.
Woven throughout this announcement, as in those executives' statements, is the clear assumption that Microsoft has products that a great many others want to interoperate with and to build upon. I'd say that is accurate.

5. What Others Are Saying
It is surprising to me that I found it necessary to say this in January 2002. Looking back from today, it seems longer ago than that. The wake-up call about document formats came for me during an address by Eliot Kimber on July 16, 1999, when he challenged us to answer the question "who owns your data?" I took this immediately to apply to documents in particular, and nowadays I also ask "who owns your computer?" (as distinct, perhaps, from "who owns your car?" and "who owns your home?"). Nowadays, those questions are always in the forefront, influencing my cautious use of OneNote and not-so-cautious use of Live Writer (and very ambivalent use of RSS Bandit).

My developing concerns for interoperability and open systems go much further back. My focus on interoperability was sufficiently evident that I was drawn back into Xerox in 1988 to look at interoperability for in-development electronic-document products. My participation in Document Enabled Networking (a Novell-Xerox initiative) in 1991, followed by the Document Management Alliance specification and ODMA, was a natural move from internal-product interoperability to industry-level product interconnection and interoperability. But I hadn't stepped back to look at the electronic documents that I had become accustomed to using while unable to preserve them with much confidence in their future accessibility. My "design for document system interoperability" motif has appeared on my business cards since I became a mostly-retired consultant at the end of 1998.

I can look back to the beginning of my career, in 1958, and the tutelage of Theodore Lomax at Boeing and Calvin Wright at the University of Washington for experiences that aroused my fascination with the portability of software designs, the creation of pluggable/linkable software components, and what would later be portable tools, APIs, and protocols. I was an eager receptacle for principles of accountability and scholarship as part of that. As the reach of software has increased, so has my attention to the higher levels of interoperability that are afforded.

None of that explains how I happened to receive e-mail from Microsoft's public-relations firm, Waggener Edstrom, when I powered up at 7:30 a.m. on February 21. I was intrigued to have been sent the teleconference call-in number and pass code. Feeling oddly privileged, I found myself listening to hold music for nearly 15 minutes and wondering if I would actually be able to enter the call. That interval gave me time to access the Microsoft web site and find the press release that was already there. Ah, it's about interoperability. OK, I can get that. Oh, and they put the teleconference call-in details on the web too. I wondered if they could handle the likely congestion. They did, and the call didn't start until they got us all tied in, 15 minutes after the scheduled start time. Now there's a copy of the announcement press release, a transcript that is mostly better than my notes, and a 38-minute audio recording (WMA stream) of the session. If you have any question about the tone and what the Microsoft executives saw as significant, you can check the words and context yourself. My speculation is that the e-mail notification is related to my visible interest in OOXML, ODF, and open standards generally.