Thesis 42


In everyware, many issues are decided at the level of architecture, and therefore do not admit of any substantive recourse in real time.

Stanford law professor Lawrence Lessig argues, in his book Code and Other Laws of Cyberspace, that the deep structural design of informatic systems (their architecture) has important implications for the degree of freedom people are allowed in using those systems, forever after. Whether consciously or not, values are encoded into a technology, in preference to others that might have been, and then enacted whenever the technology is employed.

For example, the Internet was originally designed so that the network itself knows nothing about the systems connected to it, other than the fact that each has a valid address and handles the appropriate protocols. It could have been designed differently, but it wasn't. Somebody made the decision that the cause of optimal network efficiency was best served by such an "end-to-end" architecture.[*]

[*] In this case, the identity of the "somebody" in question is widely known: The relevant design decisions were set forth by Robert E. Kahn and Vint Cerf in a 1974 paper called "A Protocol for Packet Network Intercommunication." The identity of responsible parties will not always be so transparent.
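To make Lessig's point concrete, consider what "knowing nothing about the systems connected to it" looks like in practice. What follows is a minimal, hypothetical sketch, not drawn from any real router implementation: the forwarding logic consults only the destination address, and the payload passes through as opaque bytes the network never interprets.

```python
# Hypothetical sketch of end-to-end forwarding. The network layer sees only
# addresses; whatever the endpoints are doing lives in an opaque payload.
from dataclasses import dataclass


@dataclass
class Packet:
    src: str        # source address
    dst: str        # destination address
    payload: bytes  # opaque to the network; only the endpoints interpret it


class Router:
    def __init__(self, routes: dict):
        # routes maps a destination address to the name of an outgoing link
        self.routes = routes

    def forward(self, packet: Packet) -> str:
        # The only question the network asks is where this address goes next.
        # It never inspects packet.payload, and so never learns (or constrains)
        # what the endpoints are doing with it.
        return self.routes[packet.dst]


router = Router({"203.0.113.7": "link-A"})
print(router.forward(Packet("198.51.100.2", "203.0.113.7", b"anything at all")))
```

Everything above the endpoints is deliberately dumb; intelligence, and with it the anonymity Lessig describes, accrues to the edges.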

Lessig believes that this engineering decision has had the profoundest consequences for the way we present ourselves on the net and for the regulability of our behavior there. Among other things, "in real space... anonymity has to be created, but in cyberspace anonymity is the given." And so some rather high-level behaviors (from leaving unsigned comments on a Web site to being able to download a movie to a local machine, traceable to nothing more substantial than an IP address) are underwritten by a decision made years before, concerning the interaction of host machines at the network layer.

We needn't go quite that deep to get to a level where the design of a particular technical system winds up inscribing some set of values in the world.

Imagine that a large American company (say, an automobile manufacturer) adopts a requirement that its employees carry RFID-tagged personal identification. After a lengthy acquisition process, the company selects a vendor to provide the ID cards and their associated paraphernalia: card encoders and readers, management software, and the like.

As it happens, this particular identification system has been designed to be as flexible and generic as possible, so as to appeal to the largest pool of potential adopters. Its designers have therefore provided it with the ability to encode a wide range of attributes about a card holder: ethnicity, sex, age, and dozens of others. Although the automotive company itself never uses these fields, every card carried nevertheless has the technical ability to record such facts about its bearer.
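A hypothetical sketch of what such a "flexible and generic" card schema might look like follows; the field names are invented for illustration, but the point survives them: the capability to record these attributes ships with every card, whether or not a given adopter ever fills the fields in.

```python
# Hypothetical card schema for a generic ID system. None of these field names
# come from any real product; what matters is that the schema itself carries
# the capability, whether or not the issuing institution ever uses it.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class IDCardRecord:
    card_id: str
    name: str
    # Demographic fields provided "for flexibility." The automobile
    # manufacturer never populates them, but every card can carry them.
    sex: Optional[str] = None
    age: Optional[int] = None
    ethnicity: Optional[str] = None
    height_cm: Optional[int] = None
    weight_kg: Optional[int] = None
    extra: dict = field(default_factory=dict)  # dozens of other attributes


# The employer's use of the schema...
employee_card = IDCardRecord(card_id="0451", name="A. Employee")

# ...and a later adopter's use of the very same schema.
license_card = IDCardRecord(card_id="D-2203", name="A. Citizen", sex="F",
                            age=34, ethnicity="X", height_cm=170, weight_kg=62)
```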

And then suppose that, largely as a consequence of the automobile manufacturer's successful and public large-scale roll-out of the system, this identification system is adopted by a wide variety of other institutions, private and public. In fact, with minor modifications, it's embraced as the standard driver's license schema by a number of states. And because the various state DMVs collect such data, and the ID-generation system affords them the technical ability to do so, the new licenses wind up inscribed with machine-readable data about the bearer's sex, height, weight and other physical characteristics, ethnicity....

If you're having a hard time swallowing this set-up, consider that history is chock-full of situations where some convention originally developed for one application was adopted as a de facto standard elsewhere. For our purposes, the prime example is the Social Security number, which was never supposed to be a national identity number; in fact, it was precisely this fear that nearly torpedoed its adoption, in 1936.

By 1961, however, when the Internal Revenue Service adopted the Social Security number as a unique identifier, such fears had apparently faded. At present, institutions both public (the armed forces) and private (banks, hospitals, universities) routinely use the SSN in place of their own numeric identification standards. So there's ample justification not to be terribly impressed by protestations that things like this "could never happen."

We see that a structural decision made for business purposes (i.e., the ability given each card to record a lengthy list of attributes about the person carrying it) eventually winds up providing the state with an identity card schema that reflects those attributes, which it can then compel citizens to carry. What's more, private parties equipped with a standard, off-the-shelf reader now have the ability to detect such attributes, and program other, interlinked systems to respond to them.

Closing the loop, what happens when a building's owners decide that they'd rather not have people of a given age group or ethnicity on the premises? What happens if such a lock-out setting is enabled, even temporarily and accidentally?[*]

[*] It's worth noting, in this context, that the fundamentalist-Christian putsch in Margaret Atwood's 1986 novel The Handmaid's Tale is at least in part accomplished by first installing a ubiquitous, nationwide banking network and then locking out all users whose profiles identify them as female.
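To see how little stands between a schema like the hypothetical one sketched earlier and the lock-out scenario, consider the following equally hypothetical access rule, continuing that sketch (it reuses the invented IDCardRecord and license_card defined there). Nothing about it is exotic: once the attributes are machine-readable, the exclusionary policy is a short predicate over them, and an "accidental" lock-out is one bad configuration away.

```python
# Hypothetical building-access check keyed on the card schema sketched earlier.
# Once a reader can see demographic attributes, an exclusionary policy is
# nothing more than a predicate over them.
def door_policy(card: IDCardRecord, blocked_ethnicities: set, min_age: int) -> bool:
    if card.ethnicity in blocked_ethnicities:
        return False
    if card.age is not None and card.age < min_age:
        return False
    return True


# A setting enabled "even temporarily and accidentally":
print(door_policy(license_card, blocked_ethnicities={"X"}, min_age=21))  # False: entry denied
```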

No single choice in this chain, until the very last, was made with anything but the proverbial good intentions. The logic of each seemed reasonable, even unassailable, at the time it was made. But the clear result is that the world has now been provisioned with a system capable of the worst sort of discriminatory exclusion, and of doing it all cold-bloodedly, at the level of its architecture.

Such decisions are essentially uncontestable. In this situation, the person denied access has no effective recourse in real time; such options as do exist take time and effort to enact. Even if we are eventually able to challenge the terms of the situation (whether by appealing to a human attendant who happens to be standing by, hacking into the system ourselves, complaining to the ACLU, or mounting a class-action lawsuit), the burden of time and energy invested in such activism falls squarely on our own shoulders.

This scenario stands in for the many situations in which the deep design of ubiquitous systems will shape the choices available to us in day-to-day life, in ways both subtle and less so. It's easy to imagine being denied access to some accommodation, for example, because of some machine-rendered judgment as to our suitability; and given a robustly interconnected everyware, that judgment may well hinge on something we did far away in both space and time from the scene of the exclusion.

Of course, we may never know just what triggered such events. In the case of our inherent attributes, maybe it's nothing we "did" at all. All we'll be able to guess is that we conformed to some profile, or violated the nominal contours of some other.

One immediate objection is that no sane society would knowingly deploy something like this, and we'll accept this point of view for the sake of argument, although again history gives us plenty of room for doubt. But what if segregation and similar unpleasant outcomes are "merely" an unintended consequence of unrelated, technical decisions? Once a technical system is in place, it has its own logic and momentum; as we've seen, the things that can be done with such systems, especially when interconnected, often have little to do with anything their makers imagined.[*]

[*] As security expert Bruce Schneier says, "I think [a vendor of RFID security systems] understands this, and is encouraging use of its card everywhere: at sports arenas, power plants, even office buildings. This is just the sort of mission creep that moves us ever closer to a 'show me your papers' society."

We can only hope that those engineering ubiquitous systems weigh their decisions with the same consciousness of repercussion reflected in the design of the original Internet protocols. The downstream consequences of even the least significant-seeming architectural decision could turn out to be considerable, and unpleasant.


