Blunder Dome Sighting  
Hangout for experimental confirmation and demonstration of software, computing, and networking. The exercises don't always work out. The professor is a bumbler and the laboratory assistant is a skanky dufus.





2004-12-28

 

EDOS Integrates Open-Source Software

ACM News Service: Researchers Get EU Funding for Linux Project.  This is a new European Union university-and-industry research project to ease the distribution and management of Linux-based software.  The blurb suggests that it is focused on distributions of Linux itself, though this seems confused.  The 30-month project will have deliverables every six months, and it is focused on peer-to-peer distributed operation and on an automated quality-testing suite.  It is difficult to see how the first applies, though I find the second quite interesting.  I wonder how much of this really has to be tied to Linux, too.

James Niccolai's 2004-12-22 ComputerWorld article seems broader, with attention on "building software development and management tools that will cut the costs of large open-source projects," with an eye toward doing "complex IT projects based on Linux and other open-source software."  That makes far more sense.  The emphasis on open source (equated with Linux in Europe) is partly ideological and political for Europe, which sees it as a way to foster an independent European IT competency.

It appears that the envisioned P2P application concerns automatic installation and updating, so it seems to be about distribution in the sense of deployment and configuration, not necessarily distributed-object operation.  Dealing with the interdependencies of components and the version status of them all is recognized as part of the challenge.

«"Testing a Linux OS, or indeed any large application built on free/open-source software is a time-consuming and essential operation. Part of [our] plan is to develop tools to make testing more efficient and more comprehensive," the group said in its statement.»  This is still odd.  It is not clear to me what about the use of free/open-source software is peculiar to this problem.  I am willing to believe there is something here, but it also strikes me that using open-source software is exactly what makes coordinated quality testing possible.  It seems much trickier when there are proprietary, closed-source components in the mix.

I would think that this kind of effort will break the hearts of some Microsoft developers, because it is easy to see a lot of wheels being reinvented here, with other good experiences, including lessons about what doesn't work so well, being ignored.  I imagine it goes back farther than Microsoft, too.

I suppose my biggest concern is that these Euro-centric efforts tend to be insular, and I'm not sure how that will be beneficial in the long run.  At the same time, I welcome a visible, open activity in this area.
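The interdependency challenge is easy to show in miniature.  Here is a toy Python sketch, with package names and version constraints invented for the illustration (this is not EDOS code): an installer must find a selection of versions that satisfies every chosen package's declared dependencies at once, and one wrong version anywhere breaks the whole configuration.

    # Invented example: name -> {version: {dependency: acceptable versions}}
    PACKAGES = {
        "editor":  {"2.0": {"gui-lib": {"1.4", "1.5"}}},
        "gui-lib": {"1.4": {"core": {"3.1"}},
                    "1.5": {"core": {"3.2"}}},
        "core":    {"3.1": {}, "3.2": {}},
    }

    def consistent(selection):
        """True if every chosen package's dependencies are satisfied
        by the versions chosen for the other packages."""
        for name, version in selection.items():
            for dep, accepted in PACKAGES[name][version].items():
                if selection.get(dep) not in accepted:
                    return False
        return True

    print(consistent({"editor": "2.0", "gui-lib": "1.5", "core": "3.2"}))  # True
    print(consistent({"editor": "2.0", "gui-lib": "1.5", "core": "3.1"}))  # False

Checking a given selection is the easy direction; searching the space of versions for a consistent selection is where an automated tool would earn its keep.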
 

Tech Challenges I Want to See

ACM News Service: Tech Trends That Should End.  This blurb points to a number of technology trends that should end or be curtailed in some way, referring to a 2004-12-27 eWeek article by Anne Chen, Cameron Sturdevant, and Jim Rapoza.  A few of these provide interesting challenges, and I feature those:
Sane Patching & Updating
Cameron Sturdevant says that patching chaos must be controlled.  "But the real answer to the patching problem is to obviate or at least decrease dramatically the need for patches.  And the only hope for that is if applications are developed much more carefully from the get-go."  I don't know where to start yammering over this.  Perhaps it is enough to observe that it is going to take great strength of will, consistency of purpose, and a test of character to pull this off.
Balanced Backup
Peter Coffee notices that the capacity of backup techniques is swamped by the ever-growing capacity of desktop computers.  He recommends movement to hierarchical storage with automatic archival tools and intuitive backup aids.
Standardized Information Lifecycle Management
Henry Baltazar comments on the current difficulties of ILM (where was I?).  ILM is intended to analyze data and support its migration to economical storage as access needs change.  "The storage industry must standardize the process of ILM and do it in a hurry.  With standardization in place, vendors will be able to develop systems that are interoperable."  This seems tied to records management and, in many circumstances, could be usefully tied to the balanced-backup situation too, with effective provisions for searching for digital assets that might be off-line.  A toy sketch of the tiering idea follows this list.
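To make the ILM idea concrete, here is a minimal Python sketch of policy-driven tiering.  The tier names and age thresholds are invented for the example; a real ILM system would also have to honor retention rules and records-management holds, not just access recency.

    from datetime import date

    POLICY = [              # (maximum age in days, tier) -- invented policy
        (30,   "online"),   # hot: fast disk
        (365,  "nearline"), # warm: cheaper disk
        (None, "archive"),  # cold: tape or off-line, but still searchable
    ]

    def tier_for(last_access: date, today: date) -> str:
        """Pick the storage tier for an item based on access recency."""
        age = (today - last_access).days
        for max_age, tier in POLICY:
            if max_age is None or age <= max_age:
                return tier
        return "archive"

    today = date(2004, 12, 28)
    print(tier_for(date(2004, 12, 20), today))  # online
    print(tier_for(date(2004, 3, 1), today))    # nearline
    print(tier_for(date(2001, 6, 15), today))   # archive

The standardization question is precisely about agreeing on what goes in POLICY and on the metadata that feeds it, so that different vendors' systems can carry out the same migrations.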
I notice how my mind leaps after solutions and how things I know about document-management technology should be applicable.  The rational me suggests that perhaps I don't understand the problem.  I suppose my own failure to back up my critical digital information might figure in here.  What would be a solution with value exceeding the cost of using it?

2004-12-26

 

Eliminating Mutual Incomprehension in Interoperability Arrangements

ACM News Service: Open Systems--Mutual Understanding Without Limits.  This blurb suggests that it may be possible to assure system coherence in the integration of heterogeneous open systems, using technology developed by groups of Russian researchers.

The 2004-12-17 Russian Science News article speaks of a universal open-system technology that can solve the problem by providing a universal translator interface among standard data representations that are comprehensible to all participants in a system.  There is not much more detail available in English, and there are no links.

There is apparently a translator or bridge arrangement involved, and I find it unusual for such strong claims to be made for an arrangement of that kind.  Forty years ago, we would have thought this a saving technology, but the practical results were never as strong as what we could imagine.  It would be interesting to see what has changed.  Shimon Nof suggests active middleware for bridging technologies as well, and there is a similar paucity of detail about practical demonstration in the E-Work model.
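The most charitable reading of the "universal translator" claim is a hub-and-spoke arrangement: instead of a translator for every pair of formats (N×(N-1) of them), each format converts to and from one canonical representation, so only 2N converters are needed.  Here is a minimal Python sketch of that idea, with formats and field names invented for the illustration (nothing here is from the undisclosed Russian work):

    canonical_importers = {}   # format name -> function yielding canonical dict
    canonical_exporters = {}   # format name -> function consuming canonical dict

    def register(fmt, importer, exporter):
        canonical_importers[fmt] = importer
        canonical_exporters[fmt] = exporter

    def translate(data, source_fmt, target_fmt):
        """Translate via the canonical form: 2N converters, not N*(N-1)."""
        canonical = canonical_importers[source_fmt](data)
        return canonical_exporters[target_fmt](canonical)

    # Two invented formats that disagree about field names:
    register("format-a",
             lambda d: {"title": d["ttl"], "body": d["txt"]},
             lambda c: {"ttl": c["title"], "txt": c["body"]})
    register("format-b",
             lambda d: {"title": d["heading"], "body": d["content"]},
             lambda c: {"heading": c["title"], "content": c["body"]})

    print(translate({"ttl": "Report", "txt": "..."}, "format-a", "format-b"))
    # {'heading': 'Report', 'content': '...'}

The hard part, and the part where such schemes have historically disappointed, is not the plumbing but devising a canonical form rich enough that every format's round trip through it loses nothing that matters.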
 

The Dimensions of E-Work

ACM News Service: Purdue Engineers Define 15 Dimensions of 'E-Work'.  This work by Shimon Nof involves four domains: (1) E-Work, (2) distributed decision support, (3) active middleware, and (4) integration, coordination, and collaboration.  The 15 e-dimensions are a means for establishing the integration conditions that apply for some e-work component.

The 2004-12-15 Purdue University News article ties the 15 e-dimensions to the work of the PRISM Center, though I found it difficult to track down anything more on the dimensions themselves.  It would seem that we must await the forthcoming book edited by Nof and Ceroni.
 

Legitimizing P2P, Maybe?

ACM News Service: P2P Battle Reaches FTC.  I stepped over this blurb the first time and saw it again because there is no 2004-12-24 edition of the ACM News Service.  It landed differently one week later.

There is the possibility of a form of P2P responsibility based on FTC regulations for truth in labeling and some additional practices around discouraging IP violations and extralegal, if not illegal, activity.  Although there is no result yet from the U.S. Supreme Court's examination of the Grokster case, the FTC involvement might provide an appropriate level of warning and compliance to permit ad hoc P2P to continue to flourish for lawful purposes.  That matters to me because I see P2P and distributed overlay discovery as important for the use of distributed objects in the long run.

Michael Grebb's 2004-12-16 Wired News article gives more flavor to the proceedings of the two-day FTC workshop held on December 15-16.  It seems that everyone is worried about the previous decision that requires substantial non-infringing use, with publishers looking at what the usage actually is and the P2P providers countering with how application-neutral the technology is.  There was also a little gaming that went on: I saw a number of blogs encouraging the publishing and downloading of public-domain materials via popular P2P services as a way of demonstrating that neutrality in practice.  (My sense is that this sort of thing rarely works, and it demonstrates a kind of prima facie guilty knowledge.)

I would like to see P2P find a way into the light, and I don't know whether that will be possible.  For my interests, it means being able to work with high-performance P2P privately but perhaps having to use a Web Services model for the broader public case, assuming that this allows individual services to be dealt with, if they deliver illegal or misappropriated content, without the entire Web Services infrastructure being threatened.

Legitimizing Peer-to-Peer

ACM News Service: Peer-to-Peer Comes Clean.  2004-10-09: There are many positive approaches to P2P, and the article described here indicates a number of them.  One of interest to me is LionShare from Penn State University, at least in terms of the idea of exchanging scholarly information among networks of academics.  I'm not sure about distributed hash tables as an interesting activity in themselves, but I am certainly enrolled in approaches to automatic discovery that "store data redundantly on numerous machines, shield information with encryption and digital signatures, and sustain participants' motivation and honesty by supporting distributed reputation, trust, and payment systems."

Simson Garfinkel's 2004-10-06 Technology Review article provides a link to the September 2003 LionShare proposal.  Other useful links cover the Microsoft P2P Development Kit, the August 2004 Fourth IEEE International Conference on Peer-to-Peer Computing, and something I need here on the Centrale LAN, Magic Mirror Backup.  Of course, it will take some doing before I give anything like that local-network privileges inside my residential firewall and router.
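For the redundant-storage part of the distributed-hash-table story, here is a toy Python sketch of consistent-hash placement: node names and data keys are hashed onto a ring, and an item is stored on the next k nodes around the ring, so each node's departure disturbs only its own slice of keys.  The node names, the example key, and the replication factor are all invented for the illustration.

    import hashlib

    def ring_position(name: str) -> int:
        """Map a node name or data key onto a 32-bit hash ring."""
        return int.from_bytes(hashlib.sha1(name.encode()).digest()[:4], "big")

    NODES = sorted(["node-a", "node-b", "node-c", "node-d", "node-e"],
                   key=ring_position)

    def replicas(key: str, k: int = 3):
        """The k successor nodes responsible for storing this key."""
        pos = ring_position(key)
        start = next((i for i, n in enumerate(NODES)
                      if ring_position(n) >= pos), 0)
        return [NODES[(start + i) % len(NODES)] for i in range(k)]

    print(replicas("lionshare:paper-42"))  # the three nodes holding this item

The encryption, signing, and reputation machinery the quotation mentions sits on top of this placement layer; the sketch shows only where the copies go.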
 