Orcmid's Lair  
 
 
 

Welcome to Orcmid's Lair, the playground for family connections, pastimes, and scholarly vocation -- the collected professional and recreational work of Dennis E. Hamilton




Recent Items
 
Republishing before Silence
 
… And It Came to Pass
 
Amaze Your Friends: Datamine Unlimited Statistical...
 
Don’t You Just Hate It When …
 
Blog Template Unification: Template Trickiness
 
OOXML Implementation: Can Expectations Ever Trump ...
 
February Frights Redux: Unification for Creative D...
 
Worst Nightmare: OpenDocument Format Embraced-Exte...
 
Abstraction: Einstein on Mathematics+Theory+Realit...
 
Document-Security Theater: When the Key is More Va...

  


The nfoCentrale Blog Conclave
 
Millennia Antica: The Kiln Sitter's Diary
 
nfoWorks: Pursuing Harmony
 
Numbering Peano
 
Orcmid's Lair
 
Orcmid's Live Hideout
 
Prof. von Clueless in the Blunder Dome
 
Spanner Wingnut's Muddleware Lab (experimental)

nfoCentrale Associated Sites
 
DMA: The Document Management Alliance
 
DMware: Document Management Interoperability Exchange
 
Millennia Antica Pottery
 
The Miser Project
 
nfoCentrale: the Anchor Site
 
nfoWare: Information Processing Technology
 
nfoWorks: Tools for Document Interoperability
 
NuovoDoc: Design for Document System Interoperability
 
ODMA Interoperability Exchange
 
Orcmid's Lair
 
TROST: Open-System Trustworthiness

2010-05-01

 

Republishing before Silence

The nfoCentrale blogs, including Orcmid’s Lair, were published through Blogger via FTP transfer to my web sites. That service is ending.

Then there will be silence as Blogger is unhooked, although the pages will remain.

No new posts or comments will work until I update the web site to use its own blog engine. Once that migration is completed, posting will resume here, with details about the transition and any breakage that remains to be repaired.

Meanwhile, if you are curious to watch how this works out, check on Spanner Wingnut’s Muddleware Lab. It may be in various stages of disrepair, but that blog will come under new custodianship first.


2010-04-17

 

… And It Came to Pass

Prophets in Their Own Lands

Back in February, I posted “Document Security Theater: When the Key is More Valuable than the Lock.” I was objecting to a technique, now being immortalized in open-document formats such as ODF and OOXML, whereby a hashed copy of a password is stored in the document such that it can easily be retrieved and used to attack the password itself. As explained there, the value of the password is not in being used to overcome the protection of the document against alteration – that is easy to do without ever bothering to know the password. The value of the password is that it is a memorable secret of the password holder and it needs to be protected (i.e., disguised) because it is also used for a variety of valuable purposes.

The failure to achieve a separation of concerns is probably a tip-off here. Either way, the exposure of hashed copies of passwords is not a new issue. There are available expert reports that identify the flaw. Attacks on passwords whose hashed copies are known have been popular since the first widespread Internet worm was released against unprotected systems. For example, the Unix /etc/passwd file, with its hashed copies of passwords, was commonly readable by all users, and certainly readable anywhere once a root password was compromised. That users had the same passwords on different systems made leap-frog attacks from system to system particularly promising. It is like watching an elaborate arrangement of dominoes fall.

Encouraging Gullible Conduct

My argument then was that it is folly to increase the complexity of hash coding and believe that the password is thereby protected against discovery by a determined attacker. The defect in reasoning is in the assumption that the remedy to attackable hashed password copies is to use a “stronger” hashing technique. It does not make a memorable password stronger, and there is effectively a (disguised) copy of the password in plain sight. Having the copy and knowing the hashing technique allows that still-weak password to be attacked about as easily as it ever could be.
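The defect is easy to demonstrate. In this sketch (the password, digests, and attacker dictionary are all invented for illustration), swapping a “stronger” hash for SHA-1 does nothing for a weak, memorable password once its unsalted hashed copy is in plain sight:

```python
import hashlib

# A weak, memorable password, and the kind of hashed copies a document
# might carry in plain sight (the algorithm and digest are both visible).
password = "sunshine"
stored_sha1 = hashlib.sha1(password.encode()).hexdigest()
stored_sha512 = hashlib.sha512(password.encode()).hexdigest()

# A hypothetical attacker's dictionary of common passwords.
dictionary = ["letmein", "password", "qwerty", "sunshine", "dragon"]

def dictionary_attack(stored_digest, algorithm):
    """Recover a weak password from its exposed, unsalted hashed copy."""
    for guess in dictionary:
        if hashlib.new(algorithm, guess.encode()).hexdigest() == stored_digest:
            return guess
    return None

# The "stronger" hash falls exactly as fast as the weaker one:
assert dictionary_attack(stored_sha1, "sha1") == "sunshine"
assert dictionary_attack(stored_sha512, "sha512") == "sunshine"
```

The cost of the attack is one hash computation per guess in either case; the strength of the digest function is irrelevant when the password itself is guessable.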

Systems that use password hashing as a way of not keeping passwords around in plaintext also arrange to secure the hashed copies against discovery. Once the hashed copies are known, discovery of the password becomes child’s play, especially for memorable passwords that are reused by the password holder as a matter of convenience.

We’ve all learned by now that convenience trumps security, right? My objection is against willfully pandering to that conduct. You can imagine my dismay when my efforts to end that perpetration in the ODF specification were rebuffed by this argument:

“The justification for stronger algorithms than SHA1 is that many users use the same passwords for multiple tasks. So, it is worth to protect the key. Since we explicitly added the [SHA256 and stronger hashing methods] attributes to ODF 1.2 on request, we should not revert this.”

That is precisely the reason we should “revert” that so far draft-only provision of ODF 1.2.

Reality Will Not Be Fooled

Last week, there was an announcement that some servers at Apache.org had been attacked and compromised. I saw notices such as ZDNet’s “Apache.org hit by targeted XSS attack, passwords compromised” and PCWorld’s (via Yahoo) “Apache Project Server Hacked, Passwords Compromised.” I didn’t read the articles, since it was about an all-too-common sort of break-in. What I didn’t appreciate was that the attackers stole lists of user names and their hash-coded passwords.

What finally caught my undivided attention was the 2010-04-13 James Clark tweet, “Ouch. Hashed copy of password compromised for all users of Apache hosted JIRA, Bugzilla.”

The notice at the Apache Foundation could not be clearer: “If you are a user of the Apache hosted JIRA, Bugzilla, or Confluence, a hashed copy of your password has been compromised.” And, of course, if we are putting hashed copies of passwords in plain sight, it doesn’t take a hacked JIRA, Bugzilla, or Confluence configuration to get one. Even scarier is this observation: “JIRA and Confluence both use a SHA-512 hash, but without a random salt. We believe the risk to simple passwords based on dictionary words is quite high, and most users should rotate their passwords.”
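For contrast, here is a minimal sketch of what systems that must store password verifiers conventionally do instead: a fresh random salt per password plus a deliberately slow derivation (PBKDF2 here; the iteration count and salt length are illustrative). This raises the cost of dictionary attacks, but note that it still cannot make a weak, reused password strong:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative; chosen to make each guess expensive

def hash_password(password: bytes) -> tuple[bytes, bytes]:
    # A fresh random salt per password forces an attacker to mount a
    # separate dictionary attack against every stored entry.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha512", password, salt, ITERATIONS)
    return salt, digest

def verify(password: bytes, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha512", password, salt, ITERATIONS)
    # Constant-time comparison avoids leaking where the digests differ.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password(b"sunshine")
assert verify(b"sunshine", salt, digest)
assert not verify(b"letmein", salt, digest)
```

Salting prevents precomputed-table and cross-entry attacks; the slow derivation merely multiplies the per-guess cost. A targeted dictionary attack against one weak password still succeeds, which is the point of this whole post.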

What more do we need to know?

It is time to stop putting lipstick on what we know to be a pig.


I believe that this situation, for documents, arose through an over-constrained problem. We’ve been blinded into thinking that the safety of keys used for conveniently removing document protections is improved by strengthening the hashing for copies of those keys. All this does is encourage folks to be careless in the choice of passwords for this mundane purpose. We must find a way off that slippery spiral.

The intriguing problem is how to preserve the convenience of protection removal for document authors without subjecting their convenient, memorable password to discovery by attacking the plain-sight hashed copy. Is there a way out of the current awful practice? And if so, what do we do to overcome perpetuation of the flawed approach that is already in place?

[update 2010-04-17T19:09Z I broke up the first paragraph because it did not flow well. This allowed me to embellish the situation with more unpleasant historical facts. It is appalling to see how many years it’s been known that disclosure of hashed copies of passwords is a practically-attackable vulnerability.]

2010-04-16

 

Amaze Your Friends: Datamine Unlimited Statistical Nonsense

In today’s Techflash Research post, Todd Bishop wonders why it is that Seattle is only #14 as a “Mac Metropolis.”  (The catchy term is used in the report summary that Bishop links, and it is hard to resist repeating even if that is not what the report is about.)

First off, the Apple Market Ranker analysis that Experian Simmons summarizes is about owners of Apple products, not Macs.  The basic question is, if you scratch a resident of one of the 206 Designated Market Areas (DMAs – don’t you just love being sliced and diced by market analysts?) in the United States, how likely is it that they will own or use an Apple Product: an iPod, iPhone, or Macintosh computer.

OK Ed, Let’s Wow ‘Em with the Numbers

The most impressive number that I see is that fully 21.6% of all adults nationwide own or use one of these products.  I don’t know about you, but even if the iPod dominates the “ground truth” behind this statistical estimate, I am impressed.  I’m sure that Apple stockholders smile and rub their hands in glee over what the iPad launch may do to these figures.  For Apple executive management, on the other hand, I would consider this a cause for concern with regard to the prospect of market saturation.  The iPad would be urgently-welcome as well as a  potential market broadener.

In the San Francisco – Oakland – San Jose DMA, the gravity well of Silicon Valley (the red giant) and Apple (the blue dwarf), the figure is 32.3%.  This is transformed into the wonderful statement that the adult residents of this DMA are 49% more likely than the average adult American to own or use at least one of these products.  Well, sure: 32.3/21.6 = 1.495, so we see where that more-dramatic figure pops out.  If this were an election we’d say that Silicon Valley leads the nation by 10.7 points, but I guess that is not so sexy.  I’m not sure what any of this numerical magic tells us, but let’s play along with the idea that it provides something useful for people who worry about life in the DMAs and how we might discreetly dispose of our incomes.
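The arithmetic behind the press-release framing is easy to check, using the figures quoted above:

```python
national = 21.6   # percent of U.S. adults owning/using an Apple product
sf_bay = 32.3     # the same figure for the San Francisco-Oakland-San Jose DMA

relative = sf_bay / national   # ratio behind the "49% more likely" claim
points = sf_bay - national     # the plain percentage-point difference

print(round(relative, 3))   # 1.495 -> "about 49% more likely"
print(round(points, 1))     # 10.7 points
```

Same data, two framings: the ratio makes a dramatic headline, while the point spread is the election-night way of saying it.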

The analysis continues through the top 10 DMAs (4 being in California) by this measure of Apple friendliness, with Boston (the Semiconductor East) at a close second and with Las Vegas at 27.9% as number 10.   Starting with number 3, San Diego, we are told the populations of these DMAs (and the full report lists them all).  For example, #4 New York weighs in with 30.4% of nearly 16 million adults and an observation about the observable presence of the iPhone.  In contrast, the 8th and 9th ranked DMAs have adult populations of less than a million each.

Those Modestly Successful Puget Sound Folks

If you go through the registration to download the full report and find out about the Seattle-Tacoma DMA, you’ll see that we are #14 with a population of 3.6 million adults.  A mere 27.4% are estimated to be Apple users (26.9% above the national average, if that fills you with regional pride).  The largest DMA of those closest to the national average (and why not proud of it?) is #57, St. Louis Missouri, with 2.4 million adults and 21.5% estimated Apple lovers.  For perspective, I note that 152 of the 206 DMAs come in below the national average for Apple Love.

Looking at the map that is provided in the full report, the Seattle-Tacoma DMA is at the heart of an Apple Love territory that spans the Vancouver BC – Portland Pacific-to-Cascades corridor, along with the I90 wedge to Spokane.  We’re among friends.

Keeping Steve Ballmer Awake Nights

It’s not clear to me what this tells us about existing markets and market opportunities.  It would be useful to know what proportion of those same populations own or use any device of the kinds that Apple sells, and for how many of those none (and all) of them are made by Apple.  Obviously, economic conditions, educational achievement, and infrastructure in a DMA also matter.  There might even be a market differentiation between liberal (the “rest of us”?) and (economically-)conservative communities.

When the unpenetrated market consists of the owners of your competitors’ products, life becomes more difficult to the degree that people do not readily churn their discretionary possessions and favored brands.  Still, the king of the mountain always has to sleep with an uneasy crown.  If you are a pretender to the throne, I suppose having to fear disruptive forces other than your own is a condition to look forward to.

And a Little Reality Seasoning

The ZDNet report on personal-computer sales just reached my inbox: “Gartner: Apple sells 1.4 million Macs in US; captures 8% market share.”

To explain how these numbers are so widely different from that wonderful 21.6% of adults, nationwide, it is important to understand that market share is not about what folks own or use, but what was sold.  The market’s 100% is all of the sales in a particular timeframe.  Because sales of personal computers in 1Q2010 are 20% better in units sold than in 1Q2009, the market is spoken of as having increased by that much.  Notice that the statement is not about the revenue or the profit from those sales, which might sort out quite differently.

From this perspective, Apple sold 34% more Macintosh computers, moving from 7.2% of the units sold to 8.0% of the units sold in the most-recent quarter.  HP and Dell still dominate with over 50% between them but their unit sales did not grow as much as the market, which grew about 20%.  The sleeplessness at HP and Dell is of a different quality than what has Apple bounding out of bed every morning.  (Whether they made up for it in cash rather than volume, we won’t know from the Gartner analysis.)
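The Gartner figures hang together arithmetically. Taking the roughly 20% growth of the whole market together with the share movement from 7.2% to 8.0% of units, Apple’s implied unit growth comes out at about one-third, consistent with the reported 34%:

```python
market_growth = 0.20   # total PC units sold grew ~20% year over year
share_before = 7.2     # Apple's share of units sold, 1Q2009 (percent)
share_after = 8.0      # Apple's share of units sold, 1Q2010 (percent)

# Units sold scale with (share of units) x (total market size), so:
unit_growth = (share_after / share_before) * (1 + market_growth) - 1
print(f"{unit_growth:.0%}")   # 33% -- consistent with the reported ~34%
```

The same arithmetic run on the IDC numbers (8.3% unit growth against an 18.4% market) shows why share can fall even while sales grow.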

To estimate the fuzz in all of this, the ZDNet article also reports an IDC finding that Apple grew its sales but lost market share against the total market (which grew more).  There is not enough information to know which are oranges and which are, uh, apples, among these comparisons.  It could be that Apple lost market share worldwide, since Macintosh penetration is apparently not so hot outside the United States while HP and Dell maintain their positions globally.  [Update 2010-04-19T01:20Z It’s worse than that.  According to the IDC Analysis, Apple doesn’t even show in the top five world-wide, and their growth in the US was below the 18.4% of the total market.  An 8.3% growth in Apple computer shipments left them down from 7.0% to 6.4% of the market.  IDC describes its report as counting shipments and determines market share from that.]

Perhaps the oddest reporting of these latest figures for personal-computer sales is the underplayed fact that Toshiba sales grew faster than Apple’s, taking away 4th place.  Acer did even better, strengthening its 3rd-place position as well.  These two can be credited with capturing most of the market growth between them.  Although this phenomenon is noted in the ZDNet article, Apple gets the headline and the lede.  Interesting, aye?


[Update 2010-04-16T22:46Z Something led me back for a second look, adding a paragraph about the far-superior Toshiba and Acer performance as of 1Q2010.   The Tablet derby through to the end of 2010 is going to be fascinating.
 
Update 2010-04-16T20:46Z Repairing a typo allows me to speculate even more with almost no evidence.
 Update 2010-04-16T20:33Z I couldn’t resist adding the information about 1Q2010 personal-computer market volumes as evidence of how important it is to get beneath the numbers to find out what is really going on and who it matters to.]

2010-04-08

 

Don’t You Just Hate It When …

… You visit a site, create a comment, and

You are asked to log in and you have no idea that you have a password for the particular site.

… You attempt to register at a site, and

They tell you that your e-mail is already registered with them because they are part of a conglomeration of sites, none of which you recognize at all or have saved a password and account entry for in your password safe.

… They prefill a form with your user name or e-mail address

But it is because you created an account on some other blog of the same service, only you filed the password under the name of that other place, having no idea you were registering for WordPress or TypePad all over the galaxy, and actually had no intention of doing so, and you don’t remember what that other place was anyhow.

… They will take an OpenId

But you have to explicitly register an account anyhow, and your already-filled comment form is lost in the process.

… They insist on inviting your automatic Disqus logon if the cookie is spotted

But then you have to disable the indiscriminate e-mail river of Disqus commentary because it drowns your inbox and so, tell me again, why did I want to use Disqus?

… You can’t find your password and you seek their help

Only they clearly send you what must have been your original password in a plaintext e-mail.

This situation makes me very happy that I use a random-password generator for every new account, so that the password protects only that one logon and nothing else.  I am unwilling to have the key be more valuable than the lock.
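For anyone wanting the same practice, a generator along these lines (a sketch using the Python standard library; the length and alphabet are a matter of taste) is all it takes:

```python
import secrets
import string

def random_password(length: int = 20) -> str:
    """Generate an independent random password for a single account."""
    # secrets draws from the OS's cryptographic RNG, unlike random.choice.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_password())
```

Because every account gets its own throwaway secret out of the password safe, a compromised hashed copy exposes only that one logon: the key is worth no more than the lock.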


You may notice that I have stopped using Technorati tags, since they seem to have no effect whatsoever and I haven’t figured out how to have them make a difference with any alternative source of tags.  I should figure out del.icio.us, I suppose, except in that case I should first figure out why my del.icio.us feed has stopped.

I also use categories, well no … I use Blogger Labels which are sort of like categories except it is hard to find out what they are and place a current list and links on my sidebar.  Blogger backlinks and Blogger labels remind me of the propensity of some Microsoft developer types to do-it-their-way when there is already an established practice out there.  Yes, developers just want to have fun. But inflicting their NIH syndrome on the rest of us is not OK.  Go do that in the privacy of your own home, please.

For the labels, I think I will periodically post a message that simply goes into every category I have used (Windows Live Writer knows what they are), so I can remind myself not to make up more and maybe even prune the list where I tend to always use multiple labels in combination.

Aren’t you happy that I have spiffed up this blog to the point that it serves as an invitation to my regular blogging on whatever strikes my fancy in the moment?  Just wait, there are five more blogs and I have a great deal of pent-up blogging from my 18 months nose-down in document-standards work.


2010-04-07

 

Blog Template Unification: Template Trickiness

In Orcmid's Lair: February Frights Redux: Unification for Creative Destruction, I commented that I am in a death match between the decline of my web-development machine and May 1, 2010, when Blogger ceases publishing via FTP to my own domain and hosting service.

The laptop is now on life support and, so far, has not entered a vegetative state. But it can't sit up and stand on its own any longer.

Meanwhile, I have been working to unify my Blogger templates around one single "classic" layout. That has been interesting.

Testing Without Ruining the Blog

For dressing-up the sidebar and tidying up some aspects of the blog posts, I was able to confirm template changes using the template-preview provisions of Blogger.

Straightening Out the Archive Structure

It became trickier when I decided that each blog's archives should be in a separate folder. Orcmid's Lair wasn't done that way. Its archive pages were at the same level in the blog folder as the main page and some supporting items. Fortunately, I discovered that the archive-list pull-down would automatically change to reflect the new location once I said archives should go into a separate sub-folder. Then all I had to do was move the existing archive pages to the sub-folder to make the pull-down be true.

I now must remember to republish those few pages that have an older version of the pull-down. In this case, the blog is a little-bit broken, but easily fixed.

Being Conditional About Comments

The next tricky business was creating more cases that were conditional on which page was being generated.

In the past, I had full comments show up everywhere there is a copy of the related post. I decided to simplify and have comment detail only on the individual post pages. The version of an article on the main blog page, and in archives, provides a count and a link, but no comment content.

This became tricky to test because the template preview mechanism only shows what happens to the main page. To see what happens to posts, I must install the template and create a post or repost to see the effect on the individual post itself.

I can cause a repost, usually, by adding a comment to the post. If necessary, I can also change the conditionality to see how everything appears on the main page, but that is not a complete verification.

And Then There’s Backlinking

Even trickier was seeing how support for links would work. Blogger has a feature called backlinking that will report about other blogs that link to this one.

I don’t think it is exactly a track-back mechanism. I'm also not sure how it works for blogs that are published via FTP.

To test whether backlinking is operating, despite Blogger indicating that my blog is backlink-enabled, I need to create a blog post that links to another of mine, and see what happens. That is the provocation for this particular post.

Also, I am using the BlogThis! pop-up that is provided if the "Create a Link" link is followed from one of my blog posts. This seems to be one way for Blogger to notice that a link to a Blogger-generated post is being made. Once this post is up, I can also recall it into Windows Live Writer and see whether I could have done it from there too.

Next Attempt

Well, BlogThis! does an awful formatting job. I recalled the post into Windows Live Writer to touch it up as well as see if there is any special indication that this post is linking to another. I don’t see anything.

I’ll repost now and then see if I have to publish from Blogger itself to have backlinking be noticed.

Oh, and By the Way

While I have been rooting around in tweaking the individual blog items and how comments and backlinks appear, I noticed another problem. The permalinks on comments don’t work. I have attempted to use an alternative way for creating the backlink, but there is something not happening. I will have to look at the source-code of the generated HTML pages to figure this one out.


2010-04-04

 

OOXML Implementation: Can Expectations Ever Trump Reality?

I was startled to see the level of passion in Alex Brown’s 2010-03-31 post, Microsoft Fails the Standards Test.  Alex has two concerns: (1) dwindling OOXML standards-maintenance attention and resources; and (2) Microsoft silence with regard to implemented support for the strict level of IS 29500 and any retirement of the transitional level as the only level supported in Microsoft implementations of OOXML. 

Perhaps the most level-headed analysis is the “Wow” from Andy Updegrove in his 2010-04-01 post, Alex Brown: “Without action, the entire OOXML Project is now surely headed for failure.”

For me, the most peculiar aspect of the reactions I see is not that Alex has the concerns he announced, but that others treat his expression of concern for the future as a declaration of the actual present.  Furthermore, these observers who proudly pontificate that there is no action, there will be no action, and there was never going to be any action, excitedly congratulate Alex on having awakened from being hood-winked.  It is as if nothing has happened since 1998 and the book is closed on Microsoft forever.

Expectations Against Observable Reality

I want to look at just one part of this situation: expectations around IS 29500 implementation in Microsoft products.  The desire to have Microsoft abandon to-be-deprecated transitional provisions of IS 29500 in favor of producing only documents in the strict IS 29500 format is tied into that expectation.

I am eminently qualified to address this topic.  I have no information on what Microsoft is actually doing to incorporate support for IS 29500 in its products.  I have no idea what strategy Microsoft has, if any, with regard to the retirement of support for IS 29500 compliant transitional documents.  Microsoft doesn’t tell me anything about product efforts and I am happy to keep it that way.  Microsoft doesn’t seek my advice on the matter either.  So I am perfectly positioned to speculate, with my standing as a standards, interoperability, and architectural armchair astronaut unblemished.

What I am going to report is my observation of the simple state of affairs and how difficult it is to erase the past and jump to implementations that only produce what are called strict IS 29500 documents.  Expecting that to have been achieved in two years is about as unobservant as belief that all Microsoft needed to have done was adopt ODF as its native format in the first place.

Can I Has Me Strict Now Please?

The ISO/IEC International Standard for OOXML, IS 29500:2008, has two major levels that can be implemented.  There is a strict IS 29500 format that is the subject of the main part of the specification.  There is also a transitional IS 29500 format that mainly includes everything allowed in the strict format along with some other provisions that it was agreed would be retired but are retained for now for compatibility purposes.  The idea was that there would come a time when consumers might accept transitional documents but the routinely-expected output would be a strict IS 29500 document.  The strict-transitional differentiation and the notion that production of non-strict documents would be discouraged for new documents was a creation of the DIS 29500 Ballot Resolution Meeting in February 2008.  It was ratified in the approval of DIS 29500 as an International Standard on April 2, 2008.

In IS 29500:2008 as it was first published in November, 2008, the set of strict documents is essentially a subset of the transitional documents.  In addition, it is the transitional documents that include the most provisions of ECMA-376:2006, the specification supported by the already-distributed Microsoft Office System 2007 (and any Office 2003 configuration, like mine, to which the OOXML Compatibility Pack has been added).  Various translators also accept or emit ECMA-376 documents.  The .docx, .xlsx, and .pptx documents that are growing in numbers every day in production use satisfy the provisions of ECMA-376:2006 and tend to use transitional provisions of IS 29500 rather than preferable strict counterparts.

There is already a legacy situation with OOXML concerning the need for products to support ECMA-376:2006 in the documents that are accepted and produced.

Well, Not So Fast, Sparky

In addition, SC34 WG4, the standards-development working group that maintains IS 29500,  has now created a situation in which there is more than one strict IS 29500.

As part of the early maintenance work, some found it disturbing that there was no way to differentiate provisions of ECMA-376:2006 from IS 29500 transitional, and of IS 29500 transitional from IS 29500 strict.   The same namespaces were used for all of them.  This issue was being addressed before the ink was dry on IS 29500:2008.  In 2009, SC34 WG4 arrived at a set of amendments, most of which were designed to separate the namespace used for the strict provisions of IS 29500 from the namespace used for transitional documents with their additional/alternative to-be-deprecated provisions.

If Microsoft had already implemented a way to produce strict IS 29500:2008 documents as defined before these amendments, those documents would now be considered transitional documents.  Their XML parts would employ the transitional namespaces, not the recently-adopted ones for strict IS 29500.
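The distinction is visible right in a document’s XML. The sketch below classifies a .docx by the namespace of its main document part; the two namespace values used are the transitional (ECMA-376:2006) and strict WordprocessingML identifiers as I understand them to have been published, but verify them against the current schemas before relying on this:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Assumed namespace values: transitional is the ECMA-376:2006 namespace;
# strict is the one introduced by the FPDAM1 amendments.
TRANSITIONAL = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"
STRICT = "http://purl.oclc.org/ooxml/wordprocessingml/main"

def docx_flavor(docx_bytes: bytes) -> str:
    """Report which IS 29500 flavor a .docx main part claims, by namespace."""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as z:
        root = ET.fromstring(z.read("word/document.xml"))
    # ElementTree renders a namespaced tag as "{namespace}localname".
    namespace = root.tag[1:].split("}")[0] if root.tag.startswith("{") else ""
    if namespace == STRICT:
        return "strict"
    if namespace == TRANSITIONAL:
        return "transitional (or ECMA-376:2006)"
    return "unknown"

# Minimal in-memory package for demonstration (not a valid document).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml",
               f'<w:document xmlns:w="{TRANSITIONAL}"/>')
print(docx_flavor(buf.getvalue()))   # transitional (or ECMA-376:2006)
```

This is also why a pre-amendment “strict” producer now emits what counts as transitional: its parts carry the old shared namespace, not the new strict one.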

It has been a few months since the amendment solidified, and one would hope that Microsoft is looking at how to enact the production of strict IS 29500 documents in some customer-respectful manner.  Whether that is something that could possibly appear in the soon-to-be-released Microsoft Office 2010 products is not something I will even guess about.  Microsoft’s commitment to support IS 29500 is not specific on this topic, and there may be residual difficulties in how the separation of strict out from under transitional has been executed in the amendments and any anticipatory implementation work.

I can see ways to work through a gradual migration where the expected output is strict IS 29500 documents.  But I would expect transitional/ECMA-376 documents to be accepted for a long time and to be produced at least as long as “Save As … 97-2003 Document” also exists.


Appendix: Arriving at Separated Strict and Transitional Namespaces: The Progression

Although it has been two years since there was agreement on what IS 29500:2008 would be, it was not until November 25 that the specifications were publicly available.   In less than one year after that, amendments creating a substantial difference in the separation of strict OOXML documents from transitional OOXML documents were formulated and put out for ballot.  The official Corrigenda and Addenda carrying those and other changes are not yet available to the public.  Here is the progression.

  • ISO/IEC International Standard 29500:2008 Office Open XML File Formats.  First edition 2008-11-15.  This is the first official publication of IS 29500 for OOXML after the 2007 balloting and a subsequent Ballot Resolution Meeting in early 2008.  The specification is in four parts:
    • Part 1: Fundamentals and Markup Language Reference.  This is where the strict provisions are specified.
    • Part 2: Open Packaging Conventions.  There are some transitional considerations in packaging.  These are being treated by separate defect reports.  The OPC are adaptable for non-OOXML usage.
    • Part 3: Markup Compatibility and Extensibility.  A set of independent features by which extensions can be added to an XML document using a set of specific conventions that allow for graceful degradation when the extensions are not understood.
    • Part 4: Transitional Migration Features.  Additional features that are not included in the strict provisions.  As originally formulated, the strict provisions were to be a subset of the transitional OOXML documents and all of ECMA-376 was embraced by transitional OOXML with a few deviations.
  • SC34 N 1246 ISO/IEC 29500-1:2008/FPDAM1, Part 1: Fundamentals and Markup Language Reference – AMENDMENT 1, 2009-08-04 available as a public 1.56MB downloadable Zip file consisting of the proposed draft amendment and the corrected (RNG and W3C) schemas.
       
    This public document was taken to a four-month SC34 FPDAM Ballot that closed 2009-12-04.    
      
  • SC34 N 1251 ISO/IEC 29500-4:2008/FPDAM1, Part 4: Transitional Features – AMENDMENT 1, 2009-08-04 available as a public 1.59MB downloadable Zip file consisting of the proposed draft amendment and the corrected (RNG and W3C) schemas.
      
    This public document was taken to a four-month SC34 FPDAM Ballot that closed 2009-12-04. 
      
  • SC34 N 1253 IS 29500:2008 Defect Report Log [At Closure of the DCOR1 and FDAM1 Sets], 2009-08-04 edition available as a public 3.91MB downloadable PDF file.
      
    Defect Item 08-0012 was submitted by Ms. Ruth Schneider of the National Body (SNV) for Switzerland.  It was circulated to SC34 on 2008-11-03.  The defect involves the inability to distinguish between ECMA-376:2006 schemas and those of IS 29500 because the same namespaces are used.  From the log on this item we can see that there were many ways to deal with explicit versioning and that no reasonable solution stood out.  Finally, at a meeting in Prague on 2009-03-24, a small break-out group returned with proposed requirements for any versioning solution:
      
    • To push producers gently towards strict conformance [emphasis mine, orcmid:]
    • To improve interoperability
    • To do no evil to the 29500 ecosystem (i.e., files, end users, and implementations)

The option to change the namespace for strict schemas, and only strict schemas, emerged in subsequent discussions.  It took until the 2009-06-22/24 meeting in Copenhagen to finish wading through the considerations and agree that the only change would be new namespaces for the strict schemas and the associated narratives.  Along the way there was stumbling over versioning, conformance attributes, and features of transitional that had no existence in ECMA-376:2006 and were there now merely to preserve transitional as the superset.  Although the final changes needed to accomplish the namespace separation were contributed by Microsoft expert Shawn Villaron of the ECMA delegation, the issues and inter-dependencies were heavily discussed among all of the WG4 participants.
   
In the end, resolution of defect item 08-0012 involved over 300 amendment entries in N1246 for IS 29500 Part 1 and over 200 in N1251 for Part 4.  This is about 75% of all of the amendment entries in the FPDAM1 set, all on behalf of this single defect and its simply-stated disposition.
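To make concrete what that namespace separation enables, here is a minimal sketch, in Python, of how a consumer might classify a WordprocessingML part by the namespace of its root element once strict and transitional diverge. The namespace URIs below are the ones I understand to be involved (transitional retaining the ECMA-376:2006 namespace, strict moving to a new one); treat them as illustrative assumptions rather than authoritative values from the amendment text.

```python
import xml.etree.ElementTree as ET

# Assumed namespace URIs (illustrative, not authoritative): the
# transitional schemas keep the ECMA-376:2006 namespace, while the
# amendment gives the strict schemas a new namespace of their own.
TRANSITIONAL_WML = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"
STRICT_WML = "http://purl.oclc.org/ooxml/wordprocessingml/main"

def classify_wml(xml_text):
    """Return 'strict', 'transitional', or 'unknown' based on the
    namespace of the root element of a WordprocessingML part."""
    root = ET.fromstring(xml_text)
    # ElementTree encodes the namespace as "{uri}localname" in the tag.
    ns = root.tag.partition("}")[0].lstrip("{")
    if ns == STRICT_WML:
        return "strict"
    if ns == TRANSITIONAL_WML:
        return "transitional"
    return "unknown"
```

Before the amendment, this check was impossible in principle: an ECMA-376:2006 document and an IS 29500 strict document carried the very same namespace, which is the heart of defect 08-0012.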

  • Brad Smith’s 2009-12-16 Microsoft Statement on European Commission Decision.  This declaration from Microsoft establishes a “public undertaking” with regard to interoperability.  Part of this undertaking includes a Warranty Agreement covered in a 2009-12-16 Annex (downloadable Microsoft Office Word .doc file).  The timing of this announcement has nothing to do with events at ISO/IEC JTC1 SC34 (and some of the “undertaking” documents were first uploaded with 2009-10-06 dates).  The “undertaking” is in the spirit of the Interoperability Principles and other agreements with regulatory authorities (cf. my 2008-02-26 Interoperability by Design post).  It also puts some serious teeth into Microsoft commitments by asserting that “Microsoft will make available legally-binding warranties that will be offered to third parties.”
      
    In my reading of the available on-line documents, legally-binding warranties are provided for an unstated fee to parties that intend to provide interoperability with Microsoft implementations of various protocols and industry standards.  Under such warranties, there is assurance that successors to Microsoft Office Word 2007, Excel 2007, and PowerPoint 2007 will support IS 29500 rather than ECMA-376.  Nothing in the “undertaking” distinguishes between strict and transitional provisions of IS 29500.
      
  • Subsequent Creation of Documents.  There are no more-recent public documents, as of 2010-04-03.  On 2010-03-05, documents SC34 N 1382 and SC34 N 1384 appeared as final FDAM1 amendment texts on the SC34 document repository.  I presume that these reflect the disposition of FPDAM1 ballot comments and perhaps other feedback on N 1246 and N 1251, respectively.
      
  • New Defects from the Namespace Change.  I can’t imagine that the massive changes involved in separating strict into its own namespaces would not be accompanied by new defects, especially considering how dependent the transitional provisions were on the formerly-strict-but-same-namespace provisions.  This seems to be borne out in Outstanding Action Items 7-9 of the 2010-02-18 SC34 WG6 Teleconference minutes (available as a public 574kB downloadable PDF file).  Although those repairs, and perhaps others, will be needed, there is probably little that would further delay an implementation being able to recognize its supported strict features under either namespace, so long as the provisions are identical.  Preventing the clashing of unique-to-strict and unique-to-transitional provisions will require care in all cases.  I am blissfully ignorant of how the mixing of features might complicate the internal, “in-memory” document models of Microsoft and other existing products.  The added complexity of testing and building a relevant document corpus for all of the use cases strikes me as seriously daunting.

[update 2010-04-05T23:41Z And I am finally scrapping all of that useless white space that was on the end of my blog text.
 update 2010-04-05T23:11Z I misquoted Andy Updegrove’s blog title quoting Alex Brown and have fixed it, thanks to the good eye of Rob Weir.
 update 2010-04-05T16:49Z Too be more careful when there are twoo many words to get right the first time.  Sometimes, I just can’t bear it.
 update 2010-04-05T15:11Z I took inspiration from Alex Brown’s comment to tidy up some wording and add an after-thought about the unseen but material impacts of a simple namespace change in a world of comingled strict and transitional features.]

2010-04-03

 

February Frights Redux: Unification for Creative Destruction

I have moved from February Frights to April Insecurity. Here’s an update on the items that I had on my plate:

  1. Frailty of Compagno, the web-development server, is increasing. Danger, danger … . I am operating on chewing-gum and a prayer for now. I have to do this, but figuring out staging for coming up in a reliable way on the Windows Home Server is daunting for me. More urgency is required.
  2. I have successfully migrated Vicki onto a docked laptop. There are some software installs remaining for the Windows 7 Dell Inspiron 15, and WiFi roaming needs to be figured out. Everything is operating beautifully and I am jealous that Vicki has the most advanced system in the house at the moment. Before removing her 2006-vintage Dell desktop system, I repaved it with Windows 7 too. Worked like a champ, and we gave that system to our niece, Liza.
  3. I rolled Quadro, my Tablet PC, back to Windows XP from Windows 7 RC1. This worked easily, by using the restore discs for the Toshiba Satellite Tablet PC. I forgot about all of the craplets that Toshiba installed on this machine. I miss the improved Tablet functionality (and 3-d chess) of Vista, but being back on Toshiba drivers and a stable configuration is worth it. I gave up on the idea of having it be my e-mail machine and attempting to have more than one version of Microsoft Office was also too painful, but Quadro is in a stable, fully-functioning state now. Of course, I am far beyond Toshiba’s 2-year update commitment and must find a Windows 7 Tablet PC replacement eventually. I figure the later the better.
  4. Upgrading the Development Desktop will wait. There’s no technical reason to hurry and I will keep watching the desktop price-performance curves move. The biggie, besides running multiple VMs on a hot processor with fast RAM, will be managing to drive a 36” display, once energy-efficient and affordable ones become available.
  5. Moving off of FTP’d Blogger to different Blog Hosting. There has been a reprieve. Google will continue support for publishing of Blogger blogs via FTP until the end of April. I’m now trapped in two races, one against a Google deadline, the other against the entropy death of a frail 1998 laptop. Thanks to advice from Rob Weir and to researching other ways of self-hosting a blog, I have a plan. You are seeing the early steps right here:
    • I am going to unify all of the blogs I manage under a consistent Blogger template set. The features and layout will be the same, with changes in color scheme and symbols only. This means that it is all going to be table-based HTML 4.01 transitional too. I fooled around with CSS-only layout enough to know that I will stick with what works and solve the CSS problem later.
    • The first stage is to adjust the Orcmid’s Lair template as the pattern. I am starting with a generic sidebar on the left, and some cleanup to what I call the title block at the top of the page. If I have that right, this post will drive out those changes to the main page and all posts from here on out. I will continue working on changes to the body text and comments on Orcmid’s Lair.
    • The next stage is to propagate variations of the unified template to all blogs. This will include Spanner Wingnut, my laboratory for experimenting with further changes. The format unification accomplishes two things. First, I have been resisting blogging out of unhappiness with the formats. I wanted something more pleasant before investing in more posting, especially on Vicki’s blog and my nfoWorks: Pursuing Harmony blog. Second, once I have migrated Spanner Wingnut successfully, whatever I had to do to establish a new template matching my desired style can then be replicated back to the other blogs.
    • I want to host multiple blogs on a single web hosting account that has multiple domains implemented on a single site. I also want to retire the existing blogs in place. To continue to develop new, self-published pages in the same folder structure, I don’t think systems like WordPress will work easily. In general, I am reluctant to move to systems that generate pages dynamically and/or use directory-redirection techniques to map URLs to what the blog engine really uses. I learned a lot exploring WordPress, and more from checking into Drupal. At the moment, Movable Type looks like a better choice.
    • The only way to be certain that I have found a worked case is to attempt to migrate Spanner Wingnut first. Once I have a working migration that allows Windows Live Writer authoring and multiple blog hosting, I will move over the main blogs, including this one.
    • There will be some breakage. I will retire the current blogs “in place.” The new blogging system will generate all new pages. But the new system’s RSS feed will start anew and the archive links will not include the Blogger-generated pages. Also, comments will stop working on all “legacy” blog pages. I might find some sort of bulk change procedure to repair some of this later on, but it won’t be that pretty during the cutover. May 1 is no longer in the distant future. The big job is accomplishing migration of Spanner Wingnut and then one other.


 

template created 2002-10-28-07:25 -0800 (pst) by orcmid
$$Author: Orcmid $
$$Date: 10-04-30 19:00 $
$$Revision: 69 $