Thesis 75


Everyware must be conservative of face.

Something too rarely considered by the designers of ubiquitous systems is how easily their ordinary operation can place a user's reputation and sense of dignity and worth at risk.

Thomas Disch illustrates this beautifully in his classic 1972 novel 334. The grimly futuristic Manhattan of 334 is a place whose entropic spiral is punctuated only by the transient joys of pills, commercial jingles, and empty sex. The world-weary residents of 334 East 11th Street survive under the aegis of a government welfare agency called MODICUM, a kind of Great Society program gone terminally sour.

In particular, 334's casual sketch of what would later be known as an Active Badge system hews close to this less-than-heroic theme. Disch shows us not the convenience of such a system, but how it might humiliate its human clients (in this case, the aging, preoccupied hospital attendant Arnold Chapel). Embroiled in an illicit plot, Chapel has allowed himself to wander from his course, and is audibly corrected by the hospital's ubiquitous traffic control system:

"Arnold Chapel," a voice over the PA said. "Please return along 'K' corridor to 'K' elevator bank. Arnold Chapel, please return along 'K' corridor to 'K' elevator bank."

Obediently he reversed the cart and returned to 'K' elevator bank. His identification badge had cued the traffic control system. It had been years since the computer had had to correct him out loud.

All that was, in fact, necessary or desirable in this scenario was that the system return Chapel to his proper route. Is there any justification, therefore, for the broadcast of information embarrassing to him? Why humiliate, when adjustment is all that is mandated?

Of course, no system in the world can keep people from making fools of themselves. About all that we can properly ask for is that our technology be designed in such a way that it is conservative of face: that ubiquitous systems must not act in such a manner as would unduly embarrass or humiliate users, or expose them to ridicule or social opprobrium, in the course of normal operations.

The ramifications of such an imperative in a fully developed everyware are surprisingly broad. With so many systems potentially able to provide the location of users in space and time, we've seen that finding people will become trivially easy. We also know that when facts about your location are gathered alongside other facts (who you are with, what time it is, what sorts of services happen to be available nearby) and subjected to data-mining operations, a relational system can begin to paint a picture of your behavior.

Whether or not this picture is accurate (and remember everything we said about the accuracy of machine inference), the revelation of such information can lead to awkward questions about our activities and intentions, the kind we'd rather not have to answer. Even if we don't happen to be doing anything "wrong," we will still naturally resent the idea that we should answer to anyone else for our choices.

Our concern here goes beyond information privacy per se, to the instinctual recognition that no human community can survive the total evaporation of its membrane of protective hypocrisy. We lie to each other all the time, we dissemble and hedge, and these face-saving mechanisms are critical to the coherence of our society.

So some degree of plausible deniability, including, above all, imprecision of location, is probably necessary to the psychic health of a given community, such that even (natural or machine-assisted) inferences about intention and conduct may be forestalled at will.

How might we be afforded such plausible deniability? In a paper on seamfulness, Ian MacColl, Matthew Chalmers, and their co-authors give us a hint. They describe an ultrasonic location system as "subject to error, leading to uncertainty about...position," and, as they recognized, this imprecision can, within reasonable limits, be a good thing. It can serve our ends by giving anyone looking for you most of the information they need about where you are, but not the pinpoint, granular location that might lend itself to unwelcome inference.
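What such deliberate imprecision might look like in practice can be sketched in a few lines of code. The sketch below is purely illustrative and is not drawn from MacColl and Chalmers's system: the Fix record, the zone map, and the 25-meter cell size are invented stand-ins for whatever a real positioning service would provide. The point is simply that a query about someone's whereabouts can be answered at the granularity of a wing or a floor rather than an exact coordinate.

```python
# A minimal sketch of deliberate coarsening. All names, zones, and sizes
# here are hypothetical placeholders, not any real system's API.

from dataclasses import dataclass

@dataclass
class Fix:
    x: float      # metres east of the building origin
    y: float      # metres north of the building origin
    floor: int

# Hypothetical zone map: (floor, coarse grid cell) -> human-readable label.
ZONES = {
    (3, (0, 0)): "3rd floor, west wing",
    (3, (1, 0)): "3rd floor, near the cafeteria",
    (3, (2, 0)): "3rd floor, east wing",
}

CELL_SIZE = 25.0  # metres; a cell spans a whole wing, not a single office

def coarse_location(fix: Fix) -> str:
    """Report only the zone a person is in, never the raw coordinates."""
    cell = (int(fix.x // CELL_SIZE), int(fix.y // CELL_SIZE))
    return ZONES.get((fix.floor, cell), f"somewhere on floor {fix.floor}")

print(coarse_location(Fix(x=31.4, y=8.2, floor=3)))
# -> "3rd floor, near the cafeteria"
```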

The degree to which location becomes problematic depends to some extent on which of two alternative strategies is adopted in presenting it. In a "pessimistic" presentation, only verifiably and redundantly known information is displayed, while an "optimistic" display also includes possibles: values with a weaker claim on truth. The less parsimonious optimistic strategy obviously raises the specter of false positives; but however undesirable this might be in ordinary circumstances, in this context a cloud of possible locations bracketing the true one might be just the thing we want. Still worse than the prospect of being nakedly accountable to an unseen, omnipresent network is being nakedly accountable to each other, at all times and places.
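As with the earlier sketch, the following is hypothetical; the candidate zones, confidence scores, and thresholds are invented for illustration rather than taken from any real system. It shows only the shape of the two strategies: the pessimistic view reports nothing it is not nearly certain of, while the optimistic view reports the whole cloud of plausible locations, the true one presumably among them.

```python
# Hypothetical candidate readings for one person: (zone, confidence in [0, 1]).
readings = [("K corridor", 0.55), ("K elevator bank", 0.30), ("staff lounge", 0.25)]

def pessimistic(candidates, threshold=0.9):
    """Display only verifiably known values; otherwise admit ignorance."""
    sure = [zone for zone, conf in candidates if conf >= threshold]
    return sure or ["location unknown"]

def optimistic(candidates, threshold=0.2):
    """Display every plausible value, deliberately bracketing the true one."""
    return [zone for zone, conf in candidates if conf >= threshold]

print(pessimistic(readings))  # ['location unknown']
print(optimistic(readings))   # ['K corridor', 'K elevator bank', 'staff lounge']
```

Either answer tells a colleague roughly where to find you; only the optimistic one leaves you room to have been somewhere slightly other than where you were.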

Some critics have insisted that there are, at least occasionally, legitimate social purposes served by using technology to shame. They point to the example of Korea's notorious "Dogshit Girl," a self-absorbed young lady whose fashion-accessory pet soiled a subway car; having made not the slightest effort to clean it up, she was immediately moblogged by angry onlookers. The pictures appeared online within minutes and throughout the national press after a few hours; according to that press, her humiliation was so total that she eventually withdrew from university.

The argument is that, had the technology not been in place to record her face and present it for all the world to see (and judge), she would have escaped accountability for her actions. There would have been no national furor to serve (ostensibly, anyway) as a deterrent against future transgressions along the same lines.

As to whether hounding someone until she feels compelled to quit school and become a recluse can really be considered "accountability" for such a relatively minor infraction, well, thereof we must be silent. Whatever the merits of this particular case, though, there is no doubt that shame is occasionally as important to the coherence of a community as hypocrisy is in another context.

But we are not talking about doing away with shame. The issue at hand is preventing ubiquitous systems from presenting our actions to one another in too perfect a fidelity (in too high a resolution, as it were), and thereby keeping us from maintaining the beneficial illusions that allow us to live as a community. Where everyware contains the inherent potential to multiply the various border crossings that do so much to damage our trust and regard for one another, we must design it instead so that it affords us moments of amnesty. We must build ourselves safe harbors in which to hide from the organs of an accountability that otherwise tends toward the total.

Finally, as we've seen, there is the humiliation and damage to self-worth we experience when we simply can't figure out how to use a poorly designed technical system of any sort. Sadly, no principle or guideline, however strongly stated and however widely observed, can ever endow all the world's designers with equal measures of skill, diligence, and compassion. Nor could any guideline ensure that designers are afforded the time and space they require to work out the details of humane systems. What we can insist on, however, is that those tasked with the development of everyware be reminded of the degree to which our sense of ourselves rides on the choices they make.


