Orcmid's Lair

2008-02-26

Interoperability by Design

My professional life is all about interoperability and designing for interoperability.  It has become so much a part of how I think and the way I approach problems that I am mostly unaware of it.  It all came back to me on Thursday, February 21, as I noticed how much Microsoft is working to accommodate interoperability beyond its own products and corporate business/development models. 

Others have pontificated on the (lack of) significance of the Microsoft interoperability announcement and the delivery of interface, protocol, and format details to the public as part of Microsoft's interoperability principles.  I will emphasize aspects of the announcement that fall in my areas of concern.  We are all looking at the same tea leaves.  This reading is mine.

Lest there be any doubt, I am strongly in favor of this direction and look forward to its continuing progression and positive results for our industry and our society. 

    1. Design for Interoperability
    2. Engagement at the Interoperability Boundary
    3. The Open-Source Conundrum
      3.1 How can open-source licenses be used?
      3.2 What about notice?  The ethical consideration
    4. The Journey
    5. What Others Are Saying

1. Design for Interoperability

The February 21 statement of Microsoft Interoperability Principles features open connections, standards support, and data portability as pillars, along with an approach to open engagement in the industry.

Interoperability does not happen and succeed by accident.  Interoperability, just like other important architectural qualities, happens on purpose -- by design.  Ultimately, it comes down to designing for interoperability, just as one designs for usability, designs for security, designs for performance and designs for dependability.  These strike deep into the design and engineering processes.  An interoperability initiative, just like a security one, takes considerable nurturing and, at the end of the day, will have altered the engineering culture at Microsoft and other organizations that take on comparable commitments.

In terms of the technical artifacts in which interoperability is manifest, these elements of the principles stand out for me:

Principle I: Open Connections to Microsoft Products

  1. Open Protocols, opening up protocols of Microsoft high-volume products that are used by other Microsoft products
      
  2. Open APIs, again featuring those of Microsoft high-volume products that are used by other Microsoft products

    ...

Principle II: Support for Standards

  1. Support for Key Standards, marked by participation in standards bodies, by documentation of how standards are supported, and by reliance on customer advice for identifying priorities for adherence to standards
      
    ...

Principle III: Data Portability

  1. Industry Standard Formats, aligned with Principle II
       
  2. Open Formats, submitted for standardization (and presumably under the Open Specification Promise in that case) or publicly documented otherwise (and then with possible RAND licensing on any applicable Microsoft patents)
      
    ...

I trust that designing for interoperability will be good for Microsoft, but it is also expensive, both at the outset and in the sustaining.  It takes effort to develop systems in such a way that the behavior is well-specified, the conceptual models and interoperability elements are explainable, and structures are in place to keep the arrangement resilient.  It is quite another matter to provide for the unexpected arrival of other participants, many of whom will find omissions and confusing aspects in the available materials. 

It takes a commitment to raising the quality of the now-interoperable surface, getting there from a place where the work had previously been done by teams who could look over each other's shoulders and make the code work well enough for the case at hand.  I can't see how Microsoft's product architectures will fail to benefit.

I was not surprised to learn of the effort and investment that it took to prepare specifications in compliance with the European Community regulations.  It is a big deal to provide this kind of material.  It will be a big deal to orient the engineering culture to anticipate the need for it and reflect that in the approach to design and development.

Some Tea Leaves.  There is evidence of this awakening when Chief Software Architect Ray Ozzie says

"When a new type of product or technology is introduced, vendors tend to focus first and foremost on little more than whether or not their product satisfies an immediate customer need, and in these early stage products innovation tends to trump interoperability, data portability, or any such concerns."

He goes on to observe that the need to interoperate with unanticipated third-party elements in a distributed and connected world is also one of those concerns that can no longer be ignored.

Here's a specific illustration.  Microsoft recently licensed code from a third party as part of extending the Microsoft Foundation Class libraries to allow Visual Studio 2008 C++ applications to match the fluent-interface techniques of Office 2007.  This was done instead of extracting the implementing code from Office 2007 itself.  Herb Sutter gave his impression of how that came about, including the observation that

"Internal code isn't at the level required of a product [library for developers] ... . I wonder if people realize this is the norm, not the exception, ... "

" ... because there's a huge difference between 'solid for internal use' and 'productized' with all the additional level of documentation and testing and external usability that requires."

Joel Spolsky, who has his own history with Excel, has a useful account of how document formats become extremely difficult to interchange in his "Why are the Microsoft Office file formats so complicated? (And some workarounds)"

My experience is that it can be a painful and demanding journey to move from interesting internal usage to product-quality API, protocol, or data format.  Ask anyone from an IT organization where their employer thought they could make one of their custom-built business applications worthy of sale to others.  The same happens for other internally-used protocols, libraries, and special-case APIs.

We also saw how much effort it took to get from a Microsoft draft to a fully fleshed-out ECMA-376, which is still being refined and will continue to be refined long after the February 2008 ballot-resolution meeting is past.  The same goes for ODF (and may already be happening, though not so visibly or excitedly).

More Tea Leaves.  Microsoft General Counsel Brad Smith talks about how the interoperability effort engages his attention:

"We're committing the company to actions consistent with these words, actions that will live up to these principles completely. For that reason, people like Ray Ozzie and Bob Muglia have spent substantial time with our engineers to really work through what this means. ...

"At the same time that the principles went up on the Web, so did 30,000 pages of technical documentation. This documentation took literally years, and millions of dollars of software engineering work to create ... And we recognize that this publication ..., as significant as it is, is, nonetheless, a first step to implement these principles."

Regardless of how much of that documentation was the work product of a greater undertaking, perfecting materials to be relied upon in an interoperability setting, open to all comers, has extraordinary challenges, especially after the fact.

My main point is that designing for interoperability alters the character of software engineering.  I see this announcement and the effort that it portends as moving Microsoft into that kind of transformation.  As Ray Ozzie added at the end of the announcement,

" ... Having our systems designed from the outset, engineered from the outset with such interoperability from day one is extremely important. This is a very important strategic shift in terms of how each and every engineer at the company views what their mission is and what their job is. They have to consider what the customer environment is, what the deployment environment is into which the software that they create is being put. As I said, I believe this is an important announcement for the engineers at Microsoft, for our partners, for our competitors, and for our customers."

2. Engagement at the Interoperability Boundary

The just-announced principles establish the contours of the interoperability surface that Microsoft proposes to expose and be accountable for.  The surface consists of specified entities: APIs (interfaces and their behaviors), protocols, and formats.  There are also noteworthy actions that go beyond the provision of specifications.

I've already suggested the extensive impact on the Microsoft product development organizations.  But there is also a serious shift in how interoperability is worked out through engagement with others.  Participation in industry standards efforts is a sanitary form of that.  But more is involved in providing assurance of interoperability.  The interoperability principles address that as well:

Principle I: Open Connections to Microsoft Products

  1. Open Protocols,
  2. Open APIs, plus ...
      
  3. Open Access, with publication of Open Protocols and Open APIs on the web with free access to all, including provisions for feedback and commentary, fostering improvement and facilitating successful third-party development
      
  4. RAND Patent Terms, identifying which protocols are covered by patents, listing the specific patents in advance, and providing reasonable and non-discriminatory (RAND) terms with low royalties for commercial use
      
  5. Open Source Compatibility, with a covenant not to sue for open-source development and non-commercial distribution of Open Protocol implementations

Principle II: Support for Standards

  1. Support for Key Standards, as already mentioned, plus ...
      
  2. Broad Compatibility, working with other major implementers to achieve interoperable implementations across a broad range
      
  3. Extensions, fully documented where relevant to interoperability with other implementations, with identification and reasonable availability of any applicable Microsoft patents

Principle III: Data Portability

  1. Industry Standard Formats,
  2. Open Formats, plus ...
      
  3. Open Import/Export in various Microsoft products to enable transfer of user data from one application to another
       
  4. Document Format Defaults, that allow customers of the core Microsoft Office applications (Word, Excel, and PowerPoint for sure) to choose default formats along with a plug-in architecture for adding support for opening and saving additional formats

Principle IV: Open Engagement

  1. Interoperability Forum, on the web, for information about what Microsoft is doing and for discussing interoperability-related issues and challenges with users and information technology companies
      
  2. Open Source Interoperability Initiative, involving bilateral arrangements with the open source community, including facilities, resources, labs, "plug festivals," and cooperative development projects

We probably do need reminding that Microsoft is heavily involved with a wide variety of standards organizations and is a significant adopter of standards in its products.  Along with the Interoperability Principles announcement, Microsoft has provided fact sheets on how extensively standards fit with Microsoft products and their interoperability features.

To encourage industry support for interoperability and effective data exchange, Microsoft will launch a Document Interoperability Initiative with labs where implementers can verify interchange, optimize their data exchange, develop conformance tests, and publish templates that enable optimized interoperability with different formats.  (My goodness, I already have a stake in that.  This is the first practical offer I have seen with regard to conformance testing and, indirectly, qualification of products for use under various usage conditions.  I harp on the need for that from time to time, and this is the first affirmative response I have encountered.)

Server and Tools Senior Vice President Bob Muglia says the point of the Document Interoperability Initiative is "to ensure that the documents that are created by users are fully exchangeable, regardless of the tools that they are using."

The proactive engagement of a community of adopters and users around the Microsoft interoperability surface is where we should look for evidence of serious commitment.  Microsoft has declared itself ready to demonstrate its commitment to the interoperability principles through appropriate action.  In response to a question at the announcement, Brad Smith added

"So we absolutely appreciate and respect that others will assess how we live up to these principles. We fully expect that, and are prepared for it, and yet I fully believe that as people do test this proposition in the months to come, I think they're going to come away with a high regard for the steps that our engineers are taking."

3. The Open-Source Conundrum

There is a serious creative tension for open-source developers with respect to the Open Protocols under the Microsoft Interoperability Principles.  First, there is the following protection specifically with regard to the Open Protocols:

"Microsoft will covenant not to sue open source developers for development and non-commercial distribution of implementations of these Open Protocols."

This statement is with respect to the identified Open Protocols (as that list will be extended over time).  Implementations covered under the Open Specification Promise are not limited to non-commercial distribution.  And finally, with regard to published APIs, "Third parties do not need licenses to any Microsoft patents to call these Open APIs."  The last case simply perpetuates the existing ability to distribute open-source products that run on Microsoft Windows and rely on the APIs that Microsoft makes available for any developers to access Microsoft platform functions. 
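
To make that last case concrete, here is a minimal sketch, entirely my own illustration rather than anything drawn from the announcement, of an open-source program calling a long-documented Windows API (MessageBoxW).  The point being perpetuated is that no patent license from Microsoft is needed merely to make such calls.

    /* Hypothetical illustration: an open-source program calling
       MessageBoxW, a long-documented Windows API in user32.  Simply
       making such a call involves no Microsoft patent license, and
       that is the situation the Open API statement perpetuates. */
    #include <windows.h>
    #pragma comment(lib, "user32.lib")   /* MSVC; with MinGW, link -luser32 */

    int main(void)
    {
        MessageBoxW(NULL,
                    L"Hello from an open-source caller.",
                    L"Calling a documented Windows API",
                    MB_OK);
        return 0;
    }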

(By the way, the Open Specification Promise has moved; not all Microsoft links have caught up.  The latest (February 15, 2008) version of the OSP is under "Interoperability"; there is an older (January 10) version still under "Standards.")

Now, even for open-source developers, these considerations are important:

"Some of Microsoft’s Open Protocols are covered by patents. Microsoft will indicate on its website which protocols are covered by Microsoft patents and will license all of these patents on reasonable and non-discriminatory terms, at low royalty rates. To assist developers in clearly understanding whether or not Microsoft patents may apply to any of the protocols, Microsoft will make available a list of the specific Microsoft patents and patent applications that cover each protocol. We will make this list available once for each release of a high-volume product that includes Open Protocols. Microsoft will not assert patents on any Open Protocol unless those patents appear on that list."

So, if you are interested in implementing an Open Protocol, you can find out exactly what patents (but not necessarily anyone else's patents) Microsoft asserts on the protocol and determine (with appropriate legal advice) whether or not you can work around them.

When it is clear that the patent cannot be worked around, the open-source conundrum is present.  The conundrum is, as others have noted, that

  • the open-source developer is protected under the covenant not to sue (but only with respect to applicable Microsoft patents, not others)
      
  • downstream recipients who make commercial versions or distributions are not so protected and are expected to obtain licenses to the applicable Microsoft patents (the arrangements for any patents of others not being something Microsoft can resolve)

Assuming that an open-source developer chooses to make an implementation available within the confines of the covenant, there are now two challenges under the conundrum:

  1. Appropriate use of an open-source license
      
  2. Appropriate notice to downstream recipients of their obligations with regard to commercial distribution and use of implementations subject to the assertion of (known and perhaps unknown) patents

3.1 How Can Open Source Licenses Be Used?

[This is not legal advice.  If you want legal advice about this, consult an attorney with understanding of the legal situation where you live and operate.]

Most open-source licenses are about copyrights.  They involve granting perpetual, non-exclusive licenses to exclusive rights you have as a software author.  To the extent that your work is a derivative of the copyrighted work of others, you are also relying on the rights those authors are granting for you to do so.

This has nothing to do with patents.  Got it?  The aspects of open-source licenses that are about creative works of authorship and the exclusive rights of authors have nothing to do with patents or patent law (at least in the United States).  An open-source license can comply fully with the 10 qualities of the Open Source Definition (version 1.9) without saying anything about patents, especially any that the author does not possess.

It is clear that, with regard to patents, the Open Protocols not covered by the Open Specification Promise do not constitute Open Standards (2006-07-24 definition), but that is not about open-source licensing.

Some open-source licenses say nothing at all about patents.  For example, my favorite BSD template license says nothing about patents or anything like patents.

Some provide clauses that are impacted by the assertion of a patent.  The GNU General Public License (GPL 2.0), the one most often mentioned in this context, has interesting language in its section 7:

"If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program."

To be clear, it means you couldn't (re-)distribute under the GPL 2.0 (since you are not allowed to restrict commercial use under the GPL 2.0).  This condition apparently survives into GPL 3.0, although as a consequence of different language.  See that license for further details.  If you are not constrained to use the GPL, there are other options.

Some add declarations about patents.  These are declarations made by you as the author employing that license; they are not about someone else who is not a party to the open-source development.  Licenses such as the Apache License 2.0 require that you as a contributor (and those who contributed before you) have granted a royalty-free license to any applicable patent of yours (theirs).  The GPL 3.0 license likewise constitutes your granting, as a contributor, a royalty-free license to any patent claim that you could assert.  

Some licenses, such as the Apache License 2.0, also have a trip wire that prevents you from using a work licensed under the license if you assert a patent against use of the work by others.  This is not, of course, about Microsoft; it is about you.

So, depending on the license that you are using (and any constraints under the license of the software you may be deriving your program from), you might be free to choose a license that does not prevent you from distributing the work even though not all license-permitted forms of redistribution will be protected under the covenant not to sue.

3.2 What About Notice? The Ethical Consideration

Assuming that an appropriate open-source license is available and acceptable for your purposes, there remains the question of how to alert those downstream from you that the covenant not to sue applies to them only if they independently honor the conditions of the covenant.  

It would seem that the appropriate measure is to provide notice (a hypothetical sketch of such a notice follows this list):

  1. I would give notice of the covenant and of the redistribution conditions under which the covenant is not available.
      
  2. In the case that I was using a license (such as a BSD one) where additional licenses can be introduced, I would caution that the limited scope of the covenant might interfere with attachment of such an additional license (e.g., the GPL 2.0).
      
  3. In the case of derivative works made using the source code, even non-commercial open-source ones, I might caution that Microsoft patents and other patents might still be asserted even though the Open Protocol is no longer being implemented, depending on the nature of the essential claims of such patents. 
      
    This is no longer about the covenant not to sue.  It reflects the simple fact that anyone might in some way derive open-source software, or apply open-source software, in a way that infringes on some patent for which a covenant or a royalty-free license, if available at all, is not applicable.  This is the ordinary, everyday condition that pertains to open-source and closed-source software of all kinds, distributed commercially or not.  Microsoft's proclamations about its software patents might activate our fears over such prospects, but the conditions exist independent of anything that Microsoft says or does (or that anyone else says or does while pointing at Microsoft).
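
To make the notice idea concrete, here is a purely hypothetical sketch of the kind of statement that might sit at the top of a source file (or, reformatted, in a NOTICE file) accompanying such an implementation.  The wording, the protocol placeholder, and the placement are all mine; nothing here is prescribed by Microsoft or by any open-source license.

    /*
     * NOTICE (hypothetical sketch)
     *
     * This code implements the <some Open Protocol> as published by
     * Microsoft.  Microsoft has published a covenant not to sue
     * open-source developers for development and non-commercial
     * distribution of implementations of its Open Protocols.
     *
     * 1. Commercial distribution is NOT covered by that covenant;
     *    commercial distributors are expected to obtain licenses to
     *    any applicable Microsoft patents, offered on RAND terms.
     *
     * 2. Combining this code with software under a license such as
     *    the GPL 2.0 may be impossible to reconcile with that patent
     *    situation (see GPL 2.0, section 7).
     *
     * 3. Derivative or re-purposed uses of this code may still be
     *    subject to Microsoft or third-party patents, whether or not
     *    the Open Protocol is still being implemented.
     */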

We have not heard the last of this.  I have nothing useful to add on how to make this any easier for open-source developers and distributors. 

You'll notice that I have neglected those options that involve negotiating a patent license and making a commercial distribution.   It remains to be seen whether open-source developers are able to stomach that level of compromise.  I am sure that commercial developers will not be deterred. 

4. The Journey

Microsoft is learning that, as an enterprise with monopoly power, it is subject to different standards than those who are unable to exercise such control over markets (no matter what their ambitions might be).   I see in this announcement evidence that Microsoft is looking to define a responsible relationship that allows it, its competitors, and its customers to thrive in an industry that Microsoft was instrumental in bringing to the level it has now reached.

Whether or not Microsoft must now address legal and regulatory considerations in the moves it makes, I grant that there is an enlightened self-interest at play in how Microsoft responds pro-actively as time goes on.

From Brad Smith:

Finally, just a few words on our competition law obligations. The interoperability principles and actions announced today reflect the changed legal landscape for Microsoft and the information technology industry. Today's announcement represents an important step in a positive direction, to address the obligations outlined in the September 2007 judgment of the European Court of First Instance. As we said immediately after the CFI decision last September, Microsoft is committed to taking all necessary steps to ensure that we're in full compliance with European law. We will take additional steps in the coming weeks to address the remaining portion of the CFI decision.

We're also committed to providing full information to the European Commission and other governments, so they can evaluate all of these steps. We'll look forward to addressing any feedback that the Commission or other governments may provide to us, and we will move promptly to address feedback in a constructive way.

and later, in response to a question,

At one level we've always been very clear when we've taken interoperability steps in the past. We've always said with respect to each step that we weren't claiming that any particular step would be the last step we'd ever take. We've always been clear in stating that we weren't claiming that any step that we took was the best step that could ever be taken. This has been a continuing evolution, not just, I might add, for Microsoft, but in our view for the entire information technology industry.

And I think the last word on this was earned by Microsoft CEO Steve Ballmer,

"Well, these steps are being taken on our own. They are being taken on our own. There certainly were things we did, absolutely, to get in compliance with the European Commission decision, and with the consent decree here in the United States, but these principles are being taken on our own accord, and do reflect both kind of the reality of our unique legal situation, and our view of what will be required, but also, quite frankly, what we see as new kind of opportunities and risks in the more connected world.

"The world we grew up in was primarily a world of individual machines with people writing programs. And the greatest source of value-add around most of our products, frankly, was the value-add that came on the machine that ran our products, Windows, Office, whatever the case may be. In the more connected, services-oriented world that Ray Ozzie has had a chance to describe, and others, one of the greatest value-adds in some senses will be what people do, so to speak, on the other end of the wire. And opening up, particularly for our high volume products, and letting third parties, you could say, hey, we open up new opportunities for third parties to take share from us, I guess that's right, but at the same time we open up new opportunities for third parties to add value around our offering, and the combination of the changed environment, the new opportunities that it presents for our customers, for developers adding value around us, there are risks certainly that come with it, but we think on balance it's both consistent with what we will be doing anyway from a legal perspective, and is pro customer, and frankly, net-net, should be a good thing in the long-run for our shareholders."

Woven throughout this announcement is, as in these statements, the clear assumption that Microsoft has products that a great many others want to interoperate with and to build upon.  I'd say that is accurate. 

5. What Others Are Saying 

  • chromatic: Unreasonable and Discriminatory Pay-to-Interoperate.  (blog post) O'Reilly OnLamp.com, 2008-02-22.  The Microsoft Open Protocol requirement for RAND licenses on commercial use and distribution of implementations developed as open-source creates a problem for the choice of an open-source license.  This is a problem that already exists with regard to software patents having nothing to do with Microsoft, but it is a problem for open-source developers and all existing open-source software (since patent infringement, unlike copyright infringement, does not depend on what the code's author knew).  Chromatic points out that the Microsoft RAND promise does not cure this.  I must point out that a royalty-free promise won't help either, if the code is put to a use that is not related to the Microsoft covenant or is found subject to someone else's patent.  This is not a problem of Microsoft's creation, but it is something that the announcement will draw attention to as if it were, as it did here.
       
  • Mary Jo Foley: Microsoft Pledges (Yet Again) That It Wants To Be Interoperable.  All about Microsoft (web log), ZDNet.com, 2008-02-21.  Foley, in an uncharacteristic display of cynicism, sees the interoperability pledges as empty promises designed to influence the National Body vote changes that may result following the February 25-29 DIS 29500 (OOXML) Ballot Resolution Meeting (which does not address adoption of OOXML at all, only technical changes to the DIS 29500 specification in response to comments from the balloting that has already occurred).  I disagree that this is anything so narrow.  I also see this as an important concession to issues that have been raised by Gary Edwards and Sam Hiser with regard to the Microsoft Office System "stack."
         
  • Bill Hilf: See Change.  Port25: Communication from the Open Source Community at Microsoft (weblog), technet.com, 2008-02-21.  Hilf fancies Tim O'Reilly's notion of an architecture of participation, a useful idea that can also be related to the Shewhart-Deming cycle of learning and improvement.  In this post, Hilf provides some context for the announcement in terms of the progression up to this point and how he spoke of it in mid-2007.  There's some useful tie-in to Ray Ozzie's announcement remarks and to earlier observations from the Chief Software Architect.
      
  • Gray Knowlton: Microsoft on Interoperability: "Significant, Strategic Change."  Gray Matter (web log), technet.com, 2008-02-21 (via Brian Jones).  This insider view dwells on how much effort has been building up to this point, focusing on the connection with the Microsoft Office System and an agnostic view of document formats, including OOXML.
      
  • Steven Musil: Week in Review: Microsoft the Magnanimous?  (article) CNET News.com, 2008-02-22.  Here is a cleanly balanced account of the announcement and some reactions to it.  The comments take the usual spin, and they are representative of that.
      
  • Dominic Sartorio: Microsoft's Interop Announcement: A Practical Response to Market Reality.  Blog, Open Solutions Alliance, 2008-02-22.  This note of cautious optimism fits my mood too.
       
  • Andy Updegrove: Why the OOXML Vote Still Matters: A Proposal to Recognize the Need for "Civil ICT Standards."  Standards Blog, ConsortiumInfo.org, 2008-02-24 (via Carol Geyer).  Updegrove proposes a notion of civil standards for Information and Communications Technology (ICT).  The five listed "rights" are taken as a justification for not approving OOXML because Microsoft has not gone far enough (in areas that have nothing to do with OOXML and not all that much to do specifically with Microsoft either), criteria that would also impeach ODF and its origination along with the current ICT standards ecology.  The insistence on coupling the clear-cut, well-delimited technical charge of the DIS 29500 (OOXML) ballot-resolution meeting with anticipation of how National Bodies might alter their votes (or not) is a common device for those whose agenda does not tolerate the National Bodies having such an opportunity (and the uncertain but nevertheless feared result).  The Civil ICT Standards, as a set of "rights," go far beyond the scope of the document-format subject, which is all the National Bodies have before them.
       Updegrove is an articulate champion.  It is valuable to read the complete piece for his vision even if, as I believe, he is mistaken about what will or will not encourage its realization and what is or is not already in service of that vision.  Read the comments too.  See if you can tell whether any of the posturing on ODF and/or OOXML involves technical knowledge of the actual content and provisions of either one of them.

It is surprising to me that I found it necessary to say this in January, 2002.  Looking back from today, it seems longer ago than that.

The wake-up call about document formats came for me during an address by Eliot Kimber on July 16, 1999, when he challenged us to answer the question "who owns your data?"  I took this immediately to apply to documents in particular, and nowadays I also ask "who owns your computer?" (as distinct, perhaps, from "who owns your car?" and "who owns your home?").  Those questions are always in the forefront now, influencing my cautious use of OneNote and not-so-cautious use of Live Writer (and very ambivalent use of RSS Bandit). 

My developing concerns for interoperability and open systems go much farther back.  My focus on interoperability was sufficiently evident that I was drawn back into Xerox in 1988 to look at interoperability for in-development electronic-document products.  My participation in Document Enabled Networking (a Novell-Xerox initiative) in 1991, followed by the Document Management Alliance specification and ODMA, was a natural move from internal-product interoperability to industry-level product interconnection and interoperability.  But I hadn't stepped back to look at the electronic documents that I had become accustomed to using while being unable to preserve them with much confidence in their future accessibility.

My "design for document system interoperability" motif has appeared on my business cards since I became a mostly-retired consultant at the end of 1998.  I can look back to the beginning of my career, in 1958, and the tutelage of Theodore Lomax at Boeing and Calvin Wright at the University of Washington for experiences that aroused my fascination with portability of software designs, creation of pluggable/linkable software components, and what would later be portable tools, APIs and protocols.  I was an eager receptacle for principles of accountability and scholarship as part of that.  As the reach of software has increased, so has my attention on the higher levels of interoperability that are afforded.

None of that explains how I happened to receive e-mail from Microsoft's public relations firm, Waggener Edstrom, when I powered up at 7:30 am on February 21.  I was intrigued to have been sent the teleconference call-in number and pass code.  Feeling oddly privileged, I found myself listening to hold music for nearly 15 minutes and wondering if I would actually be able to enter the call.  That interval gave me time to access the Microsoft web site and find the press release that was already there.  Ah, it's about interoperability.  OK, I can get that.  Oh, and they put the teleconference call-in details on the web too.  I wondered if they could handle the likely congestion.  They did, and the call didn't start until they got us all tied in, 15 minutes after the scheduled start time.  Now there's a copy of the announcement press release, a transcript that is mostly better than my notes, and a 38-minute audio recording (WMA stream) of the session.  If you have any question about the tone and what the Microsoft executives saw as significant, you can check the words and context yourself.  My speculation is that the e-mail notification is related to my visible interest in OOXML, ODF, and open standards generally.

 