Thesis 55


The necessary standards for interoperability do not exist or are not yet widely observed.

A lack of widely observed standards in the dimensions of screw threading inhibited Charles Babbage in his quest to build the Difference Engine, a mechanical forerunner of the modern computer, in the first half of the nineteenth century. A lack of standards in languages and operating systems kept electronic computers from communicating with each other for decades. Well into the era of the personal computer, a lack of standards kept software development balkanized. A lack of standards led to the so-called Browser Wars, which suppressed adoption of the World Wide Web straight into the early years of this decade, as institutions that wanted Web sites were forced to build different versions compatible with each browser then widely used.

This afflicts almost all technologies at some point during their evolution, not just computing. Every American owner of a VW Beetle remembers the hassle of driving a car engineered to metric tolerances in an English-measurement culture; to this day, travelers arriving by rail at the French-Spanish border are forced to change trains because the countries' standard track gauges differ. The matter of standards seems to be a place where we are always having to learn the same lesson.

In some cases, there's a reasonable excuse for one or another system's failure to observe the relevant convention: the Octopus smartcard scheme we'll be discussing, for example, uses an idiosyncratic RFID architecture that does not conform to the ISO 14443 standard, simply because it was first deployed before the standard itself was established.

In other cases, the thicket of incompatible would-be standards is a matter of jockeying for advantageous position in a market that has not yet fully matured. We see this in the wireless networking arena, for example, where it can be hard even for a fairly knowledgeable observer to disentangle the competing specifications, to distinguish Wireless USB from WiMedia, next-generation Bluetooth, and IEEE 802.15.3a, or even to determine whether they compete on the same ground.

There are good commercial reasons for this, of course. Every manufacturer would ideally like to benefit from the "lock-in" effect, in which its entry is recognized as the universal standard, as happened when JVC's VHS beat Sony's ostensibly superior Betamax format early in the adoption of home video. (It is a persistent urban legend that this was in large part due to Sony's refusal to license pornographic content for Betamax.) VHS, of course, went on to become a multibillion-dollar industry, while the Beta format was more or less forgotten by all except a diehard few. Sony certainly remembers: It has absolutely no intention of letting its high-capacity Blu-ray format lose out to the competing HD DVD standard.

But what gets lost in the shuffle in such cases is that the jockeying can permanently retard adoption of a technology, especially when it goes on for long enough that the technology itself is leapfrogged. This was the case with early HDTV efforts: Competing producers advanced their incompatible analog standards for so long that a far superior digital HDTV technology emerged in the interim. None of the parties originally marketing analog standards are competitive in HDTV today.

Sometimes lock-in and other legacy issues inhibit the adoption of a standard that might otherwise seem ideal for a given application. We'll be seeing how powerful and general XML is where there is a requirement to communicate structured data between applications, but even given its clear suitability there are some prominent contexts in which it's not yet used. To take two familiar examples, neither the EXIF data that encodes properties such as date, time, and camera type in digital images, nor the ID3 tags that allow MP3 players to display metadata such as track, artist, and album name, is expressed in valid XML. And yet, as we'll be seeing, this is exactly the kind of application XML is well suited for. Whatever the reasons for maintaining separate formats, surely their advantages would be outweighed by those attending compliance with a more universal scheme?
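To make the point concrete, here is a minimal sketch of what ID3-style track metadata might look like if carried as XML rather than in a binary tag format. The element names are invented purely for illustration and follow no real schema; the sketch uses Python's standard-library XML support.

```python
# Illustrative only: serialize ID3-like track metadata as XML.
# The <track>, <title>, <artist>, <album>, and <year> element names
# are hypothetical, not part of any published metadata standard.
import xml.etree.ElementTree as ET

def track_to_xml(track: dict) -> str:
    """Serialize a flat dict of track metadata to an XML string."""
    root = ET.Element("track")
    for field, value in track.items():
        # Each metadata field becomes a child element with text content.
        ET.SubElement(root, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = track_to_xml({"title": "So What", "artist": "Miles Davis",
                        "album": "Kind of Blue", "year": 1959})
print(xml_doc)
```

The appeal of such a scheme is exactly the interoperability the chapter describes: any XML-aware application could parse, validate, and transform this record with generic tooling, instead of each player and camera vendor shipping its own binary-format parser.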

Finally, even where broadly applicable technical standards exist, compliance with them is still subject to the usual vagaries, a process that can be seen, in microcosm, in the market for pet-identification RFID transponders.

The United Kingdom mandates that all pet transponders and veterinary readers sold conform to the ISO 11784/11785 FDX-B standard. A single countrywide registry called PetLog, maintained by the national Kennel Club and recognized by the government's Department for Environment, Food and Rural Affairs, contains some two million records, and lost pets are routinely reunited with their owners as a result of the system's deployment.

By contrast, in the United States, there is no national standard for such tags; your vet has whatever scanner system he or she happens to have bought, which can read the proprietary tags sold by the scanner manufacturer, but not others. Should your pet wander into the next town over and get taken to a vet or a pound using an RFID system from a different vendor, the odds of its being properly identified are slim indeed.

In this case, as in so many others, it's not that a relevant standard does not exist; it does, and it's evidently being used successfully elsewhere. It's merely a question of when, or whether, some combination of pressures from the bottom up (market incentives, consumer action) and the top down (regulation, legislation) will result in a convergence on one universal standard. And we understand by now, certainly, that such processes can drag on for an indefinite amount of time.



Everyware: The Dawning Age of Ubiquitous Computing
ISBN: 0321384016