Technorati Tags: Rick Jelliffe, OOXML, ODF, ODF-OOXML Harmonization, DIS 29500, nfoWorks, Harmony Principles, Rob Weir, Jan van den Beld
What Is in the New Draft of OOXML? Rick Jelliffe has put up an excellent post on the history of the ODF and OOXML progressions and on the results of the Ballot Resolution Meeting in Geneva. The entire post is a valuable summary. With regard to the improvements of DIS 29500 (if approved by ISO/IEC JTC1), Rick's description of those improvements is a valuable supplement to the offerings from Alex Brown and Jan van den Beld that I have applauded concerning BRM Closure in my just-updated "In Search of Initiative" post.
With regard to the prospects for harmonization, I found this Jelliffe tidbit particularly interesting:
I am intrigued by anything that happens in Italy. (Perspective: If I mention this to Vicki, she'll say, "More trips to Tuscany!" Around here, visiting Berlin would be wonderful, living in Italy would be heaven, and taking the bus to meet-ups in Redmond is not a consolation for her.)
More than that, I think this is what our attention should be on and I am happy to see that people are turning their interest to this matter.
Jelliffe also speaks to issues of maintenance. The maintenance prospects are important and I want to feature these passages (ripped from context but, I trust, preserving Jelliffe's sense of it), with my emphasis:
It is the opportunity before us that I find heartening in Jelliffe's analysis. I have the same reaction to the Patrick Durusau analyses that Jelliffe links.
Contra Durusau, Part 1. Rob Weir has a rather different post today (promising even more in his choice of serial title). Weir is articulate and very bright. (I base this on the fact that he plays chess, shows knowledge of philately, and probably also writes software, all better than I do.) He lays down a challenge for ECMA TC-45, a challenge that I think requires some pro-active behavior to remedy (and the W3C model, if not the OASIS one, comes to mind for a participation-inviting alternative):
If I understand Jan van den Beld on this topic, it is pretty much up to TC45 to deal with this (and put in the effort required, although there is ample computer-based support these days).
Weir has a great deal more to say, and it is a worthy read. It bothers me that it is a worthy read, and I understand my concern better after coming across Jelliffe's appraisal. Here's what I see:
It is easy to look at past conduct of Microsoft, including how it operates the revision cycle of its productivity and developer software, and interpret that as unequivocal evidence for no possibility of any other outcome around interoperability and pro-active support for broad-based, open-process industry standards when Microsoft is at the table. (I don't know how the joint Microsoft-IBM work on web services survived this imperative, unless it is something like certain historical non-aggression pacts designed to cynically divide up the spoils.)
Weir's no-possibility argument is his basis for claiming that DIS 29500 should not be approved.
For me, to act from that position of no possibility is a guarantee of no opportunity. I stand for the possibility of broad interoperability arrangements. There is an opportunity here, and it is up to us to seize it. If attention drifts away and maintenance and interoperability suffer, that is a challenge for those of us who see this as worthy and important work. Of course it feels risky. There is always uncertainty in changing the world, even in these small ways.
I'm looking forward to the bringing into being of the world that Rick Jelliffe and Jan van den Beld see as possible.
>It is easy to look at past conduct of Microsoft
Of course it is easy to see the similarity between what is happening now and what Microsoft usually does.
How hard is it to ignore past conduct of Microsoft?
The worst possible argument is that OOXML is as good as it can get. I refuse to believe that Microsoft could not have spotted the flaws already at the drafting stage. Microsoft has very talented engineers working for it, and it is foolish to assume that the defects in OOXML are not intentional.
"Of course it easy to see the similarity between what is happening now and what Microsoft usually does."
One problem I have is that there is no metric on "what Microsoft usually does" as if there is an absolute, always, every-time behavior. This absolutist anecdotalism is one of the weights that we all have to suffer with now. This is no more true of Microsoft than it is of IBM, or Sun Microsystems, or the Republic of France. I find there to be no useful guidance in such indictments.
I disagree about defects. This is very hard work and it is not something that Microsoft technical staff are accustomed to doing. I have said elsewhere that the move from 2000 to 6000 pages was evidence of that, and the continuing work is more evidence. I also agree with Jelliffe that the more times people pick up a standard and look at it, the more other clarification-requiring passages will be noticed. The people best equipped to notice that for OOXML are outside of Microsoft, because the Microsoft guys don't have the beginner's mind for approaching the material. Standards may benefit from multiple eyes even more than open-source code.
And thanks. I think both of your concerns deserve careful and pragmatic attention.
"I disagree about defects. This is very hard work and it is not something that Microsoft technical staff are accustomed to doing. I have said elsewhere that the move from 2000 to 6000 pages was evidence of that, and the continuing work is more evidence. I also agree with Jelliffe that the more times people pick up a standard and look at it, the more other clarification-requiring passages will be noticed."
Okay... we might speculate about Microsoft technical staff lacking the qualifications to make a workable standard. On the other hand, this makes what Microsoft is currently doing even worse.
If multiple eyes indeed significantly improve the OOXML standard, it is pretty much a given that everyone would want as much of such review as possible. Who will gain anything from a premature OOXML standard of poor quality?
Microsoft had every opportunity to send OOXML through the normal ISO track. This would have given them expert advice about how to make the standard good. Instead they chose to use Ecma, which advertises that it aims to reduce contributions and changes to the proposal.
Arnaud Le Hors has written more about the matter in one of his blog posts.
"And thanks. I think both of your concerns deserve careful and pragmatic attention."
It is always good when people exchange opinions instead of republishing spin and FUD.
I looked at the Arnaud Le Hors article and I stopped reading when he went into the rubber-stamp-Ecma tirade. You know, how it goes, I'll see your rubber-stamp-Ecma and raise you an OASIS and an ISO?
I don't want to go there.
Now, I can see lots of problems with the ISO for development of these kinds of standards. My favorite model is the IETF, as I have said, and it has an ultra-open process with regard to specifications and it has a laborious staging process by which a specification doesn't even become a *proposed* standard until there are independent, confirmed interoperable implementations. Even then, the process can be tortuous and there are sometimes doubts about how a problem is being solved through standardization. One can also get into specifications in-the-sky, something that has happened in ISO activities (and ANSI ones) but that the IETF process is designed to avoid.
Unfortunately, IETF doesn't do document-format standards of the kind we are dealing with here. So we will be seeing this come around the other way, where a specification precedes heterogeneous interoperability. That is not uncommon in ICT standards, especially programming languages and formats (look at SGML and XML, for two). Many standards are based on existing implementations and are fast-tracked (FT or PAS) to support the relevant industry (EcmaScript is an example).
Sometimes it takes some special forum to deal with the bits that standards organizations do not deal with: test suites, certification of products, agreement on profiles for interchange in various communities, etc. All of that is work that is ready to begin. OOXML and ODF are good enough for that, whatever their individual deficiencies, and this is what it will take to lead to their improvement and stabilization (not perfection). It will also provide quantitative and qualitative understanding of what we have in our hands and what it takes to work with them.
Meanwhile, I suggest that the ODF fast-track (the OASIS PAS equivalent) differed only from the OOXML Fast Track in that it lacked organized opposition (Microsoft being smart, in this case, in that anyone who thought that was a good idea was over-ruled). In this case ISO was a rubber stamp (and they sometimes are) because sometimes that is just how it goes (and you should take a look at the SQL standards if you can stand the hurt on your wallet). This is a human process, not some pristine technical one.
There's this automatic assumption that ODF is solid and OOXML is a rat's nest (and in important ways it is, largely attributable to their different origins and purposes). I say that comparison is a humbug. ODF is not solid, and we will, if the smoke is allowed to clear, now take on reconciling all of that. It is rather obvious that ODF 1.0 was rushed out the door. I don't know what the hang-up is with ODF 1.2, so we'll have to wait and see.
I believe it is ingenuous to suggest that the answer is for Microsoft to go to work with the ODF folk. I don't believe the parties are willing to do that, and it will work better when ODF and OOXML are both on the table as standards of the same level. I don't see, as a pragmatic matter, how harmonization could occur otherwise, and I share Jelliffe's optimism about the opportunity.
Whether this is the way it should have been, it is too late to do anything about it. I think there are lessons for everyone, but I am not sure that the same lessons have yet been felt in Armonk, Mountain View, and Redmond.
Whether OOXML needs to go through the normal track or not remains to be seen of course. I am not a stakeholder in this. From my personal selfish perspective, I would rather have the first ISO standardization of OOXML be done with so we can get into the realities of harmonization.
Meanwhile, there are more and more Office 2007 documents out there every day, as there are more and more OpenOffice.org documents (not pristine ODF documents) out there every day.
I think it is out of our hands now. The NBs will make their determinations, on whatever basis they do so, and the score will be re-tallied at the end of March. We'll then know what's next.
This reminds me of the words of one Bob Bemer (thought of as the grandfather of ASCII, a made-up standard): "Standards are arbitrary solutions to recurring problems."
PS: I don't want to say that the Microsoft team didn't know how to write a good specification. I think it is more about how they needed to move something that wasn't designed for interoperability to something that was, and have it be a well-specified specification too. I think the learning curve and the pain were underestimated all around. I'm relieved I wasn't the one who had to do that.
I apologize for the length of this response. There are ideas that might better be placed in a new post.
I do think we are touching on valuable concerns, though some of them are not anything we will be able to influence.
I appreciate your willingness to look at the different sides, but I think you are missing some important issues.
One, if this becomes a standard, in its current form, WE are stuck with it! What I mean here is that Microsoft will pour on the marketing and PR hype and make every government, institution, and corporation believe they are finally able to save their documents in a format which will give them complete freedom to use any application they want (i.e. "no more vendor lock in"). The problem is that MS spit out a format that has tons of legal issues that are supposedly only partially protected by a shaky "promise", the OSP. The document format itself has so many technical hurdles that I don't believe any application will ever have the ability to fully interpret the 6000 page specification. So this "standard" will only be for marketing purposes while MS knows that all it does is protect their product (I accept their right to protect their market, but I reject using the ISO to do it).
The other main issue that concerns me from the tone of your blog is the willingness of so many people to think MS all of a sudden is willing to work openly and create a new interoperability relationship with the computing world. I use MS daily and have since 1992, and I respect their products, but I don't think for a second they have any real desire to do any of that. If they did, they would have adopted, if not exclusively, ODF as a viable "save as" format while they developed a standard that was more feature-rich for saving files with MS-specific data. (They already warn you when saving to other formats that you may lose some features; why can't they do that with an implementation of ODF filters?!)
These are just some of my thoughts. I personally believe the maintenance phase is the wrong place to correct all the issues, but that this standard should go through the normal ISO channels to work out the bugs, fix all the issues that should have been corrected a year ago, and most of all, test MS's resolve to be honest and open with this format.
I think this is where we agree to disagree. I will keep this shorter.
1. In some sense, we are stuck with OOXML either way. I think we are overlooking something about maintenance of standard specifications though. Maintenance at the standards level doesn't have anything to do with code. What will happen is that as people actually deal with the format and the specification, and work to confirm interoperability (however that works out over time), people will submit concerns and questions and proposals for difficulties with the format and/or its specification. It does mean there needs to be an open forum for confirmation of interoperability and for questions and submissions concerning the specification. It is also important to avoid astronaut document standards work and keep things grounded in reality. I believe we are close enough. The proof is in the pudding, of course. Finally, on this point, there is mandatory periodic maintenance review of ISO/IEC specifications (and ANSI ones too; I don't know about Ecma and OASIS). Standard specifications are allowed to die.
2. I think there was a timing problem around ODF interception for Office (and I bet the ODF people knew it). I can think of lots of reasons that Microsoft did not provide native/built-in support, including NIH, IP concerns, and the problem of being accused of hijacking ODF. It seems to me that if Microsoft had an implementation in Office, that would be the reference implementation by now. You think that would have gone over very well?
3. More forward-looking, Microsoft is offering improved integration hooks so that anyone can incorporate converters that work smoothly and can be the locally-chosen default. They are going to demonstrate it with the OOXML-ODF translator project. So, however odd the governance of that project is, we will see how well your suggestion about ODF integration can be carried out.
4. Finally, I am fine with the OSP. It has as much strength as Sun's equivalent promise for ODF and it frees up open-source implementations for commercial use/sale and source-code distribution. If there are any qualms here, they are ones that apply to all open-source distributions and have nothing exclusively to do with Microsoft. But don't take my word for it. (Please don't: IANAL and this is not legal advice.) If you are in a position where this is a matter of personal concern, check with an IP lawyer. (Point them to Larry Rosen's analysis too.) We have this imaginary hammer over our heads with MSFT embossed on it. For me, that is my own geek fear of uncertainty, and a consequence of pointing at the OSP creating fear of the opposite consequence. I don't think Microsoft is who needs to be feared in this case. It's one of my odd human traits.
5. Finally, and I have gone on way too long, you're right. I do give credence to the Interoperability Principles and Microsoft realizing how important genuine interoperability is to it. I think they are learning that they are in a situation that requires this of them. This is hard to do and it is not where entropy would take them. I am willing to grant that there is a pro-active effort and I will be alert and continue to hold Microsoft to account for their execution, to the extent I, as a lone individual, have any say in the matter at all.
I talked about the prospect of Microsoft Office becoming the reference implementation for ODF. That's the wrong term. There is no reference implementation for ODF. There is a benchmark implementation (OpenOffice.org), and I assure you that if Microsoft Office had native support for ODF, it would have become the benchmark, accompanied by much wailing and appeals to the DOJ.
One more thing. In my (1) above I talk about maintenance of standards. I meant to point out the limitations of the long track to standards definition. The problem with the long track is getting the right people to participate (presumably in a working group under JTC1 SC34) in the particular way that is gone about.
It is my considered opinion that an open-process public maintenance operation will do more, more quickly, than what happens when a long-track process is initiated, and it will be grounded in work at implementing an agreed baseline spec.
>I looked at the Arnaud Le Hors article and I stopped reading when he went into the rubber-stamp-Ecma tirade. You know, how it goes, I'll see your rubber-stamp-Ecma and raise you an OASIS and an ISO?
I suggest that you really read his full argument. How else can you know what he means to say?
As for rubber-stamp-Ecma vs. OASIS... I don't agree. FT is rubber stamp, while PAS is critical review even though the draft work is done outside ISO. Rob Weir has explained it quite well.
One of the real problems is that Microsoft has not committed to actually implementing dis29500 or future versions of that standard, so what is the value of the OSP and of dis29500 for harmonization purposes if Microsoft will only maybe implement it?
What is needed is:
*Microsoft must promise to implement the _full_ dis29500, including upcoming versions, if they are to be trusted.
*Microsoft must extend the patent promise to exclude the right to sell the patents to patent trolls. As written, only Microsoft itself is bound not to use the patents, and that is obviously not enough.
*Microsoft must craft an OSP so that the protection applies to GPL and open-source developers even when their work is used for commercial purposes.
*Microsoft must publish all patents they have that concern dis29500. This includes the many patents that Microsoft has applied for since the OOXML approval process began.
*Microsoft/Ecma should remove the nonsense requirement that syntactic compatibility is enough to claim conformance.
The past behavior of Microsoft is not the reason people don't trust Microsoft about OOXML... it is the current behavior that is the problem, and that makes it a given that the only sane thing is to reject dis29500.
Unless Microsoft takes the above actions, the following results are very likely:
*The ISO label will be used by Microsoft to fool people into using Office since it "supports" an ISO standard
*Office 2007 will never faithfully implement dis29500, so all files produced will lack interoperability with producers that don't rely on the Microsoft DLL files.
*Microsoft will sell the patents to patent trolls to attack competitors.
*Microsoft will use all application-defined points and undefined points in the standard to break real compatibility.
Harmonization between Office 2007's real format and ODF would be great, but dis29500 as an ISO standard does nothing to ease that. The only thing lacking for things to be sorted out is that Microsoft starts publishing its mapping between formats.
The latest comments have specific claims about how Microsoft will/does conduct itself.
Let's put a time limit on confirmation of what actually occurs. Assuming that DIS 29500 is approved and issued as an ISO/IEC standard this year, let's check on these after five years (to 2013), or when the next version of OOXML is issued as an ISO/IEC standard, or when any of these feared behaviors occurs, whichever comes first?