In principle, at least as far as some of the more enthusiastic proponents of ubicomp are concerned, few human places exist that could not be usefully augmented by networked information processing.
Whether or not we happen to agree with this proposition ourselves, we should consider it likely that over the next few years we'll see computing appear in a very great number of places (and kinds of places) previously inaccessible to it. What would this mean in practice?
Some classic sites for the more traditional sort of personal computing are offices, libraries, dorm rooms, dens, and classrooms. (If we want to be generous, we might include static informational kiosks.)
When people started using wireless-equipped laptops, this domain expanded to include coffee houses, transit lounges, airliner seats, hotel rooms, airport concourses: basically anywhere it would be socially acceptable to sit and balance a five-pound machine on your knees, should it come to that.
The advent of mobile computing based on smartphones and wireless PDAs opened things up still further, both technically and interpersonally. On top of the kinds of places where laptops are typically used, we can spot people happily tapping away at their mobile devices on, in, and around sidewalks, cars, waiting rooms, supermarkets, bus stops, civic plazas, and commuter trains.
But extending this consideration to include ubiquitous systems is almost like dividing by zero. How do you begin to discuss the "place" of computing that subsumes all of the above situations, but also invests processing power in refrigerators, elevators, closets, toilets, pens, tollbooths, eyeglasses, utility conduits, architectural surfaces, pets, sneakers, subway turnstiles, handbags, HVAC equipment, coffee mugs, credit cards, and many other things?
The expansion not merely in the number of different places where computing can be engaged, but in the range of scales involved, is staggering. Let's look at some of them in terms of specific projects and see how everyware manifests in the world in ways and in places previous apparitions of computing could not.
Of all the new frontiers opening up for computation, perhaps the most startling is that of the human body. As both a rich source of information in itself and the vehicle by which we experience the world, it was probably inevitable that sooner or later somebody would think to reconsider it as just another kind of networked resource.
The motivations for wanting to do so are many: to leverage the body as a platform for mobile services; to register its position in space and time; to garner information that can be used to tailor the provision of other local services, like environmental controls; and to gain accurate and timely knowledge of the living body, in all the occult complexity of its inner workings.
It's strange, after all, to live in our bodies for as long as we do, to know them about as intimately as anything ever can be known, and to still have so little idea about how they work. The opacity of our relationship with our physical selves is particularly frustrating given that our bodies are constantly signaling their status beneath the threshold of awareness, beyond our ability to control them. In every moment of our lives, the rhythm of the heartbeat, the chemistry of the blood, even the electrical conductivity of the skin are changing in response to an evolving physical, situational, and emotional environment.
If you were somehow able to capture and interpret these signals, though, all manner of good could come from it. Bacterial and viral infections could be detected and treated, as might nutritional shortfalls or imbalances. Doctors could easily verify their patients' compliance with a prescribed regimen of pharmaceutical treatment or prophylaxis; a wide variety of otherwise dangerous conditions, caught early enough, might yield to timely intervention.
The information is there; all that remains is to collect it. Ideally, this means getting a data-gathering device that does not call undue attention to itself into intimate proximity with the body, over reasonably long stretches of time. A Pittsburgh-based startup called BodyMedia has done just that, designing a suite of soft sensors that operate at the body's surface.
Its SenseWear Patch prototype resembles a sexy, high-tech Band-Aid. Peel the paper off its adhesive backing and seat it on your arm, and its sensors detect the radiant heat of a living organism, switching the unit on. Once activated, the unit undertakes the production of what BodyMedia calls a "physiological documentary of your body," a real-time collection of data about heart rate, skin temperature, galvanic skin response, and so on, encrypted and streamed to a base station.
Other networked biosensors operate further away from the body. The current state of the art in such technology has to be regarded as Matsushita Electric's prototype Kenko Toware, an instrumented toilet capable of testing the urine for sugar concentration, as well as registering a user's pulse, blood pressure, and body fat. In what is almost certainly a new frontier for biotelemetry, a user can opt to have this data automatically sent to a doctor via the toilet's built-in Internet connection.[*]
Is such functionality of any real value? While nominally useful in the diagnosis of diabetes, urine testing is regarded as a poor second to blood testing. Most other types of urine-based diagnostics are complicated by the necessity of acquiring an uncontaminated "clean catch." Nevertheless, the significance of Kenko Toware is clear: From now on, even your bodily waste will be parsed, its hidden truths deciphered, and its import considered in the context of other available information.
What of that unfolding just the other side of the skin? Without leaving the scale of the body, we encounter a whole range of technical interventions less concerned with the body as process or oracle than with its possibilities as a convenient platform, one that follows us everywhere we go. These have generally been subsumed under the rubric of "wearable computing."
Early experiments in wearability focused on the needs of highly mobile workers (primarily couriers, logistics personnel, law enforcement officers, and other first responders) whose jobs relied on timely access to situational information yet required that they keep their hands free for other tasks. A series of successful academic studies in the 1980s and 1990s, including those at the MIT Media Lab, ETH Zürich, and the Universities of Bristol and Oregon, demonstrated that deploying informatic systems on the body was at least technically feasible.
They were less convincing in establishing that anything of the sort would ever be acceptable in daily life. Researchers sprouting head-up "augmented reality" reticules, the lumpy protuberances of prototype "personal servers," and the broadband cabling to tie it all together may have proven that the concept of wearable computing was valid, but they invariably looked like extras from low-budget cyberpunk films, or refugees from Fetish Night at the anime festival.
University of Toronto professor Steve Mann has easily trumped anyone else's efforts in this regard, willingly exploring full-time life as a cyborg over the course of several years (and still doing so, as of this writing). Mann attempted to negotiate modern life gamely festooned with all manner of devices, including an "eyetap" that provided for the "continuous passive capture, recording, retrieval, and sharing" of anything that happened to pass through his field of vision.
It was difficult to imagine more than a very few people ever submitting to the awkwardness of all this, let alone the bother that went along with being a constant focus of attention; Mann himself was the subject of a notorious incident at the U.S.-Canada border, soon after the September 11th attacks, in which his mediating devices were forcibly removed by immigration authorities.[*]
But happily for all concerned, the hardware involved in wearable computing has become markedly smaller, lighter, and cheaper. As ordinary people grew more comfortable with digital technology, and researchers developed a little finesse in applying it to the body, it became clear that "wearable computing" need not conjure visions of cyberdork accessories like head-up displays. One obvious solution, once it became practical, was to diffuse networked functionality into something people are already in the habit of carrying on their person at most times: clothing.
In 1999 Philips Electronics published a glossy volume called New Nomads, featuring a whole collection of sleek and highly stylish fashions whose utility was amplified by onboard intelligence. While the work was speculative (all the pieces on display were, sadly, nonfunctional mockups and design studies), there was nothing in them that would have looked out of place on the sidewalks or ski slopes of the real world. Philips managed to demonstrate that wearable computing was not merely feasible, but potentially sexy.
Nor did the real world waste much time in catching up. Burton released an iPod-compatible snowboarding jacket called Amp in 2003, with narrow-gauge wiring threaded down the sleeves to a wrist-mounted control panel; by winter 2004-2005, Burton and Motorola were offering a Bluetooth-equipped suite of jacket and beanie that kept snowboarders wirelessly coupled to their phones and music players. The adidas_1 sneaker did still more with embedded processors, using sensors and actuators to adjust the shoe's profile in real time, in response to a runner's biomechanics.
Beyond the things you can already buy, hundreds of student projects have explored the possibilities of sensors light and flexible enough to be woven into clothing, typified by Richard Etter and Diana Grathwohl's AwareCuffs, sleeves that sense the digital environment and respond to the presence of an open Wi-Fi network. Given how often the ideas animating such projects have turned up in commercial products just a few years (or even months) later, we can expect the imminent appearance of a constellation of wearables.
And while none of these products and projects are as total as the digital exoskeleton Steve Mann envisioned, maybe they don't have to be. For all his personal bravery in pioneering wearable computing, Mann's vision was the product of a pre-Web, pre-cellular era in which computational resources were a lot scarcer than they are now. When more such resources are deployed in the world, we probably have to carry fewer of them around with us.
So what's the next step? After decades of habituation due to the wristwatch, the outer surface of the forearm is by now a "natural" and intuitive place to put readouts and controls. Meanwhile, the generous expanse of the torso offers wireless communication devices enough space for a particularly powerful and receptive antenna; a prototype from the U.S. Army's Natick Soldier Center integrates such an antenna into a vest that also provides a (near-literal) backbone for warfighter electronics, optics, and sensor suites. (The Army, at least, is prepared to attest to the safety of such antennae for its personnel, but I'm less certain that anyone not subject to military law would be so sanguine about wearing one all day.)
We'll also see garments with embedded circuitry allowing them to change their physical characteristics in response to external signals. The North Face's MET5 jacket takes perhaps the simplest approach, offering the wearer a controller for the grid of microscopic conductive fibers that carry heat through the garment. But more than one high-profile consumer fashion brand is currently developing clothing whose fibers actually alter their loft, and therefore their insulation profile, when signaled. When coupled to a household management system, this gives us shirts and pants that get more or less insulating, warmer or cooler, depending on the momentary temperature in the room.
Finally, there are a number of products in development that treat the clothed body as a display surface, the garment itself as a site of mediation. The U.S. Army, again, is experimenting with electro-optical camouflage for its next-generation battle dress, which suggests some interesting possibilities for clothing, from animated logos to "prints" that can be updated with the passing seasons. (Real-world approximations of the identity-dissimulating "scramble suits," so memorably imagined by Philip K. Dick in his 1977 novel A Scanner Darkly, are another potential byproduct.)
Considered in isolation, these projects (from toilet to eyetap, from "body area network" to running shoe) are clearly of varying degrees of interest, practicality, and utility. But in the end, everything connects. Taken together, they present a clear picture of where we're headed: a world in which the body has been decisively reimagined as a site of networked computation.