Hangout for experimental confirmation and demonstration of software, computing, and networking. The exercises don't always work out. The professor is a bumbler and the laboratory assistant is a skanky dufus.
2005-03-26
Open Authentication: One-Time Passwords and Crypto-Hashing

ACM News Service: SHA-1 Flaw Seen as No Risk to One-Time Password Proposal. I've seen several links to Mark Willoughby's 2005-03-22 Computerworld article and I passed over each one, thinking the title was self-explanatory and that I understood why SHA-1 is still usable based on Bruce Schneier's reporting on the topic. Fortunately, I did glance over this TechNews summary in my regular scanning of that source. Here's interesting material that you might have overlooked too, and that I want to examine as part of TROSTing development.

The Initiative for Open Authentication (OATH!) has the vision of developing strong universal authentication: among all users, all devices, and all networks. The consortium is out to produce a reference architecture based on existing "open standards." (The term "leveraging" is used, so your credibility may vary.) Vision is vision, and some of this may end up being a solution looking to reword the problem, but the effort is interesting to me, especially because the authentication part is based on Hashed Message Authentication Codes (HMACs) and what are called one-time passwords. The scheme is based on SHA-1. There is a very weird statement that this use is less vulnerable to connived collisions because only a small selection of the 160-bit hash is used, and that claim left my jaw hanging open. There is more to the protocol than that, unless information theory has failed. And I remain interested because I want to know how this might work with persistent entities (some of the everythings that the vision is intended to embrace). The one-time password scheme is being proposed to the IETF and there is expected to be a standards-track adoption real-soon-now. The question will be, as always, how trust is established and recognized with all of these wondrous technical mechanisms in place, and how symmetrical can that trust arrangement be?
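For concreteness, the HMAC-based one-time password construction behind the OATH proposal (the scheme eventually standardized as HOTP in RFC 4226) can be sketched in a few lines. The truncation step is the "small selection" of the 160-bit digest that the blurb alludes to:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (the OATH construction, later RFC 4226)."""
    # HMAC-SHA-1 over the 8-byte big-endian counter: a full 160-bit digest.
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # "Dynamic truncation": the low nibble of the last byte selects a 4-byte
    # window, so only 31 bits of the 160-bit digest survive into the code.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: shared secret "12345678901234567890", counter 0.
print(hotp(b"12345678901234567890", 0))  # 755224
```

Note that collision resistance of SHA-1 is largely beside the point here: the attacker never sees full digests, the key is secret, and each counter value is used once.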
We seem to forget that one can also connive an unreliable application atop a reliable protocol, and this may matter more.

2005-03-24
Is Faith in Innovation Wearing Thin?

ACM News Service: Decrypting the Future of Security. "Customers are demanding better software licensing terms, as well as input into the code development lifecycle, greater transparency, and code escrowing in the event vendors are unavailable when customers need them." Mary Kirwan's 2005-03-18 article in the Canadian Globe and Mail is far more interesting and entertaining than the dry account in the news blurb. It is at first rather startling to encounter so much sarcasm, but reading along it becomes clear how much the recent RSA Conference can be seen as a self-satirizing event among tech heavies, US-slanted economic and government mavens, and frightened enterprise IT personalities. I'm particularly fond of this observation:

"A veritable plague of locusts will descend on the planet and devour it inside out if software liability for vendors becomes a reality. Huge software vendors not usually unduly concerned about the trials and tribulations of the 'small software developer' are wracked with concern for their doomed brethren."

For me the most-important take-away is the observation that the community that IT and computing serves is becoming seriously impatient: "Many attendees and speakers expressed the view that if legislation and a new required emphasis on software quality assurance and accountability for code development eradicated purveyors of vapourware, and separated the wheat from the chaff, they were all for it." So, are we having fun yet?

Repairing Aberrant Behavior: But Is That the Threat?

ACM News Service: Supersmart Security. I notice that there is a great deal of interest these days in systems that detect odd behavior of applications and find some way to fence it in or even re-establish a fresh copy. The motivation is finding an active approach to repairing a system that has been compromised.
This is an interesting question, but I find myself puzzled about the problem being solved, especially when we talk about perturbing code so that exploits can't rely on a consistent pattern to apply their malevolent transformations against. I suspect that measure will have only temporary success, if the past is any guide here. A greater concern for me is whether this is the main or even most-serious threat. It addresses a symptom of compromise and might not deal with attacks that are really aimed at higher levels in the application stack, intended to compromise data and users and the business system, not damage the computer. And it raises tremendous trust concerns for the security software itself. I already can't be sure my firewall is really working, and I am not sure a root kit, my main worry, is caught by this sort of thing (nor do I expect to see this sort of thing on consumer PCs anyhow). The Gary Anthes 2005-03-21 Computerworld article has more for digging deeper. I am simply not sure that system damage is the most prevalent threat or even the intention of most attackers, yet this seems to be the focus of the article, which links compromise and damage together: "For some time, we have been losing the battle against those who would damage our computer systems. That's because computers are increasingly interconnected and the software they run is more complex. Both factors increase vulnerability to infection and intrusion."

[Updated 2005-03-24T18:00Z: I've been bravely posting directly from my Blog This! bookmarklet, but it means less spell-checking. This one had a howler that I couldn't overlook.]

Standards as Arbitrary Solutions to Recurring Problems

ACM News Service: Faster XML Ahead. The drive to introduce an official W3C specification for an easier-to-handle XML encoding may lead to initiation of an XML supplement for use in performance- or space-critical applications.
This raises the problem of introducing another standard in an area where one (XML 1.0) is already well-established. One does not want to make a move that creates a problem where none exists, and fragmenting the choice of XML encodings is a concern. This is especially the case where the new specification addresses what might be better served by a niche agreement, and there does seem to be some need for that. And then the tension arises around needing to emit more than one form, being able to inter-convert, and being able to deal with legacy and repurposing situations. This blurb simply accounts for the current tension in the XML community, observing that "another concern is that a binary XML standard would not be widely adopted; Rys notes that XML 1.1 has not met expectations and that Microsoft has not yet supported the specification because of backwards-compatibility fears." The 2005-03-23 Martin LaMonica CNET News.com article provides comprehensive coverage of the concerns and the forces that are struggling for resolution.

I see the usual tension between diving into the solution space and coughing up direct remedies (go binary, etc.) and taking a serious look at the XML specification life-cycle from a systems perspective. It seems to me that the demands of stability, consistency, and interoperability are quite high. The at-hand availability of XML as a format for everything is appealing, but edge-performance cases will always break down no matter what is done to the core specification. It seems to me that there are ways to assure interoperability by other mechanisms, although they could take more work and they can always be marginalized. The different variant markups of Wikis and Weblog implementations are an example of niche specialization, new variants of which may be simple laziness and lack of homework.
Either way, the existence of variant media and document formats is real, and it might be more useful to look at ways to know what the format is and ensure the formats are well-defined and mappable rather than expect a single popular standard to carry the weight of the world.

2005-03-22
Easy trouble-free use of IT tops the list

ACM News Service: The Dark Side--Looming Threats for the Future of IT. A Computerworld panel considers poor software quality to be the most serious issue confronting IT. The group pointed the finger at major software vendors and asserts that business customers will become increasingly demanding. There is also a pent-up demand for substitutes. I don't see how the commoditization of software fits into this picture, though. The Gary Anthes 2005-03-07 Computerworld article has the full story.

Maturing UML and Increasing Expressiveness

ACM News Service: UML Integration Reaches Impressive Degree. This blurb features the increasing integration of UML with development tools and the blending from high-level workflows to lower-level implementation details. Along with the notation's increasing familiarity, there are now good practices for making the material more understandable to non-experts and of greater use in envisioning and explaining a system -- with suitable care. Peter Coffee's full 2005-03-07 eWeek article has the details, and a few additional links to commercial UML-product vendors.

Coffee's article inspired me to check the OMG site for available materials on the now-approved OMG standard for UML 2.0. I wanted to use the latest and greatest in some design work as part of the TROST Project for my M.Sc. dissertation. It looks, uhh, interesting as well as daunting. I just downloaded the foundation bits along with the MOF 2.0 Core specification. Maybe I will finally get my head around the Meta-Object Facility.

More Open Than Open

More Open Than Open.
[From 2005-03-15-01:30 pst] Here's a find by Scoble on the different ways of looking at open as in open-source software. There are some interesting thoughts on different styles of licensing, especially with regard to restrictions and reciprocity conditions (as in the GPL, and also in licenses where Microsoft might have applicable patents, etc.). I think I am tired, so this didn't land all that well with me. I'll give it another look.

[Added 2005-03-22-10:36 pst] Well, I'm still tired. The article provides a nice discussion of open licensing based on needs of reciprocity. I think it is a little defensive about some of the licenses used in the Microsoft Shared Source initiative not all fitting under the Open Source Definition. It is a valuable discussion.

There's a little bit of "ick" in my reaction here. I looked over at my bookshelf and it all came back to me. My experience in purchasing the Stutz, Neward & Shilling Shared Source CLI Essentials actually left me feeling a little dirty and very curious (well, betrayed might be a better term) about O'Reilly letting that pass through their hands. Initially, I was excited to see the book appear and eager to learn more about how the CLI run-time works. Then, as I started reading all of the rules and license conditions on the CD-ROM, I stopped myself. I really don't want to learn something that I can't safely and simply use at any level of my activities. So to avoid falling afoul of the IP constraints, I am not using the CD. I should probably destroy it, but it's too late. I have the book and I have the CD, even though I regret ever breaking the seal. Now I just get to feel dirtied by having it in my possession. Meanwhile, I will learn what there is to learn about the CLI by sticking with the ECMA documents and the outpourings of the Mono Project.

Removing Complexity Makes Less Better

ACM News Service: Taming Your Tech.
This article uses the simplicity of Apple's iPod as a sign that technology companies are beginning to respond to the user's desire for simplicity and straightforward usability. The shackles of the myth that more is better are being broken. The David LaGesse 2005-03-14 U.S. News & World Report article is on-line.

Your Computer Is Insecure. Bad planning, eh?

ACM News Service: Our Frankenputer. This blurb looks at a variety of hardware fixes and software solutions for improving security of the PC. The idea is to have PCs designed with security in mind. While I think that is a good idea, I would like to look more at where Peter Neumann points concerning the user culture and our part in the insecurity of our systems, at least among developers-as-users and power users who do not take appropriate and already-available precautions. The Philip E. Ross 2005-03-14 Forbes.com article has the following provocative lead: "Hostile programs bend our computers to their own purposes because we designed them that way. Time for some new ideas." Now just exactly who is this we they're talking about?

Certification of Network-Attached Components?

ACM News Service: Protecting the Internet: Certified Attachments and Reverse Firewalls?. In his 2005-03-16 CircleID article, Karl Auerbach suggests that the Internet be protected at the edges by requiring certification of edge-attached components. Karl adopts "the converse point of view that the net is being endangered by the masses of ill-protected machines operated by users." This would prevent many PCs from engaging in zombie activity through the simple device of having routers and broadband gateways filter outgoing as well as incoming traffic. What I find interesting is that there are easier ways than waiting for household firewall-router technology to be forced into certification and upgrading over time. The service provider could be doing the same thing at the other end of the broadband pipe, at the true border onto the Internet.
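The outgoing-filter idea can be illustrated with a hypothetical policy check. The blocked port and the rate limit below are invented for illustration and are not drawn from Auerbach's article; they just show the kind of rule a gateway or provider could enforce:

```python
# Hypothetical egress policy for a broadband gateway or ISP edge router.
BLOCKED_OUTBOUND_PORTS = {25}   # e.g., direct SMTP, a favorite of spam zombies
MAX_NEW_DESTINATIONS = 100      # fan-out cap per host per minute

def allow_outbound(dst_port: int, new_destinations_this_minute: int) -> bool:
    """Return True if an outgoing connection should be permitted."""
    if dst_port in BLOCKED_OUTBOUND_PORTS:
        return False  # service disallowed for end nodes by terms-of-service
    if new_destinations_this_minute > MAX_NEW_DESTINATIONS:
        return False  # scanning- or worm-like fan-out from a subverted edge
    return True

print(allow_outbound(25, 1))   # False
print(allow_outbound(80, 3))   # True
```

The same check works whether it runs in the household router or, as suggested above, at the provider's end of the pipe.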
Protecting the network from subverted edges can be done much more readily there, with detailing in the terms-of-service offered to end nodes.

Reputation and Community Trust of Download Files

ACM News Service: Cleaning Spam from Swapping Networks. Although I am fascinated by the prospects of peer-to-peer approaches to distributed information systems, I have been wary of stepping onto this territory until there is some stability and better assurance of safety. The "Credence" program developed at Cornell University is designed to coordinate among peer-network nodes in order to establish the trustworthiness of a particular file that is available for download. That may qualify as a form of TROSTing, and I am definitely curious. John Borland's 2005-03-18 CNET News.com article provides a link to Credence. The motivation and approach are definitely interesting, along with the fact that vulnerabilities in the LimeWire Gnutella client were discovered in the course of adding in the Credence functions. I am particularly intrigued by the fact that the process of developing reputation assessments is dynamic, fluid, and pretty automatic. So there are interesting prospects for building up trustworthiness assessments out of this mechanism. There's much more here, and an interesting body of research projects that Emin Gün Sirer is engaged in.

The PITAS from PITAC And the Emperor's Security Cloak

ACM News Service: Study Criticizes Government on Cybersecurity Research. A subcommittee of the President's Information Technology Advisory Committee (PITAC) has indicated that the research investment being made, especially in civilian (a.k.a. academic research) areas, is inadequate for the Federal objectives being set for cybersecurity, infrastructure, and personnel.
"The report criticizes the commercial cybersecurity strategy of patching, and lists 10 cybersecurity research areas that should take precedence, including cyberforensics, authentication technologies, monitoring and detection tools, and secure protocols." The 2005-03-19 John Markoff New York Times article (registration and/or fee required) identifies the report, "Cybersecurity: A Crisis of Prioritization." The recommended priorities include "authentication technologies, secure protocols, improved engineering techniques, monitoring and detection tools and cyberforensics." I have a tangential interest with regard to trustworthiness in open-system integration, especially with software components. This doesn't map directly onto the overall problems of Internet infrastructure protection, but there is enough harmony for me to be paying attention.
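Returning to the Credence item above: what intrigues me is that a peer weights another peer's votes by how well their voting histories agree. Here is a simplified, hypothetical sketch of that idea. The names, the plus-or-minus-one vote encoding, and the clamping of negative weights are my own simplification, not the published protocol, which uses a proper statistical correlation:

```python
# Hypothetical sketch of correlation-weighted voting in the spirit of
# Credence. Votes are +1 (authentic) or -1 (spam/decoy).

def correlation_weight(my_votes: dict, their_votes: dict) -> float:
    """Agreement on files both peers have voted on, in [-1.0, 1.0]."""
    shared = set(my_votes) & set(their_votes)
    if not shared:
        return 0.0
    agree = sum(1 if my_votes[f] == their_votes[f] else -1 for f in shared)
    return agree / len(shared)

def estimate_authenticity(file_id: str, my_votes: dict, peers: list) -> float:
    """Weighted average of peer votes; distrusted peers get zero weight."""
    total = weight_sum = 0.0
    for their_votes in peers:
        w = max(correlation_weight(my_votes, their_votes), 0.0)
        if w > 0 and file_id in their_votes:
            total += w * their_votes[file_id]
            weight_sum += w
    return total / weight_sum if weight_sum else 0.0

me = {"a": +1, "b": -1}
honest = {"a": +1, "b": -1, "f": +1}    # agrees with my history
polluter = {"a": -1, "b": +1, "f": -1}  # disagrees, so its vote is ignored
print(estimate_authenticity("f", me, [honest, polluter]))  # 1.0
```

The appeal is that the weighting is dynamic and automatic: a polluter's votes fade from my view as soon as its history diverges from mine, with no central authority involved.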
template created 2004-06-17-20:01 -0700 (pdt) by orcmid