Thesis 01


There are many ubiquitous computings.

Almost twenty years ago, a researcher at the legendary Xerox Palo Alto Research Center wrote an article, a sketch, really, setting forth the outlines of what computing would look like in a post-PC world.

The researcher's name was Mark Weiser, and his thoughts were summarized in a brief burst simply entitled "Ubiquitous Computing #1." In it, as in the series of seminal papers and articles that followed, Weiser developed the idea of an "invisible" computing, a computing that "does not live on a personal device of any sort, but is in the woodwork everywhere."

What Weiser was describing would be nothing less than computing without computers. In his telling, desktop machines per se would largely disappear, as the tiny, cheap microprocessors that powered them faded into the built environment. But computation would flourish, becoming intimately intertwined with the stuff of everyday life.

In this context, "ubiquitous" meant not merely "in every place," but also "in every thing." Ordinary objects, from coffee cups to raincoats to the paint on the walls, would be reconsidered as sites for the sensing and processing of information, and would wind up endowed with surprising new properties. Best of all, people would interact with these systems fluently and naturally, barely noticing the powerful informatics they were engaging. The innumerable hassles presented by personal computing would fade into history.

Even for an institution already famed for paradigm-shattering innovations (the creation of the graphical user interface and the Ethernet networking protocol notable among them), Weiser's "ubicomp" stood out as an unusually bold vision. But while the line of thought he developed at PARC may have offered the first explicit, technically articulated formulation of a ubiquitous computing in the post-PC regime, it wasn't the only one. The general idea of an invisible-but-everywhere computing was clearly loose in the world.

At the MIT Media Lab, Professor Hiroshi Ishii's "Things That Think" initiative developed interfaces bridging the realms of bits and atoms, a "tangible media" extending computation out into the walls and doorways of everyday experience. At IBM, a whole research group grew up around a "pervasive computing" of smart objects, embedded sensors, and the always-on networks that connected them.

And as mobile phones began to percolate into the world, each of them nothing but a connected computing device, it was inevitable that someone would think to use them as a platform for the delivery of services beyond conversation. Philips and Samsung, Nokia and NTT DoCoMo: all offered visions of a mobile, interconnected computing in which, naturally, their products took center stage.

By the first years of the twenty-first century, with daily reality sometimes threatening to leapfrog even the more imaginative theorists of ubicomp, it was clear that all of these endeavors were pointing at something becoming real in the world.

Intriguingly, though, and maybe a little infuriatingly, none of these institutions understood the problem domain in quite the same way. In their attempts to grapple with the implications of computing in the post-PC era, some concerned themselves with ubiquitous networking: the effort to extend network access to just about anyplace people could think of to go. With available Internet addresses dwindling by the day, this required the development of a new-generation Internet protocol; it also justified the efforts of companies ranging from Intel to GM to LG to imagine an array of "smart" consumer products designed with that network in mind.

Others concentrated on the engineering details of instrumenting physical space. In the late 1990s, researchers at UC Berkeley developed a range of wireless-enabled, embedded sensors and microcontrollers known generically as motes, as well as an operating system for them to run on. All were specifically designed for use in ubicomp.

Thirty miles to the south, a team at Stanford addressed the absence in orthodox computer science of an infrastructural model appropriate for the ubiquitous case. In 2002, they published a paper describing the "event heap," a way of allocating computational resources that better accounted for the arbitrary comings and goings of multiple simultaneous users than did the traditional "event queue."
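The contrast the Stanford team drew can be sketched in a few lines of code. What follows is a minimal, hypothetical illustration of the idea rather than the actual iRoom implementation (the class and method names here are invented for the example): events are posted as bags of fields, any number of clients can retrieve a matching event without consuming it, and events expire on their own instead of waiting in line for a single dequeuer.

```python
import time

class EventHeap:
    """Sketch of an event-heap-style store (illustrative names only).
    Unlike a FIFO event queue tied to one consumer, events here are
    retrieved by pattern matching, can be seen by several clients at
    once, and quietly expire when their time-to-live runs out."""

    def __init__(self):
        self._events = []  # list of (expiry_time, fields) pairs

    def post(self, fields, ttl=5.0):
        """Publish an event that survives for ttl seconds."""
        self._events.append((time.time() + ttl, dict(fields)))

    def _expire(self):
        now = time.time()
        self._events = [(t, f) for t, f in self._events if t > now]

    def retrieve(self, template):
        """Return all live events whose fields match the template.
        Retrieval is non-destructive, so multiple simultaneous users
        can each observe the same event."""
        self._expire()
        return [f for _, f in self._events
                if all(f.get(k) == v for k, v in template.items())]

heap = EventHeap()
heap.post({"type": "light", "room": "401", "state": "on"})
heap.post({"type": "projector", "room": "401", "state": "off"})
matches = heap.retrieve({"type": "light"})
```

The expiry mechanism is the detail that matters for the ubiquitous case: a user who wanders out of the room simply stops retrieving, and their pending events evaporate rather than clogging a queue.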

Developments elsewhere in the broader information technology field had clear implications for the ubiquitous model. Radio-frequency identification (RFID) tags and two-dimensional barcodes were just two of many technologies adapted from their original applications, pressed into service in ubicomp scenarios as bridges between the physical and virtual worlds. Meanwhile, at the human-machine interface, the plummeting cost of processing resources meant that long-dreamed-of but computationally intensive modes of interaction, such as gesture recognition and voice recognition, were becoming practical; they would prove irresistible as elements of a technology that was, after all, supposed to be invisible-but-everywhere.
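The "bridge" such tags provide is conceptually very simple, and a toy sketch makes it concrete. In the example below, the identifiers and URLs are hypothetical: a tag or barcode carries nothing but an ID, and it is a networked lookup that binds the physical object to its virtual counterpart.

```python
# Toy illustration (hypothetical IDs and URLs) of the physical-to-virtual
# bridge: the tag itself stores only an identifier, and a directory
# lookup resolves that identifier to networked information.
TAG_DIRECTORY = {
    "04:A2:19:7F": "https://example.org/objects/coffee-cup",
    "04:B3:55:01": "https://example.org/objects/raincoat",
}

def resolve(tag_id):
    """Map a scanned identifier to its virtual counterpart, if known."""
    return TAG_DIRECTORY.get(tag_id)

url = resolve("04:A2:19:7F")
```

All of the interesting behavior lives on the network side of the lookup, which is precisely why such inexpensive, passive tags were so readily adapted to ubicomp scenarios.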

And beyond that, there was clearly a ferment at work in many of the fields touching on ubicomp, even through the downturn that followed the crash of the "new economy" in early 2001. It had reached something like a critical mass of thought and innovation by 2005: an upwelling of novelty both intellectual and material, accompanied by a persistent sense, in many quarters, that ubicomp's hour had come 'round at last. Pieces of the puzzle kept coming. By the time I began doing the research for this book, the literature on ubicomp was a daily tide of press releases and new papers that was difficult to stay on top of: papers on wearable computing, augmented reality, locative media, near-field communication, body-area networking. In many cases, the fields were so new that the jargon hadn't even solidified yet.

Would all of these threads converge on something comprehensible, useful, or usable? Would any of these ubiquitous computings fulfill PARC's promise of a "calm technology"? And if so, how?

Questions like these were taken up with varying degrees of enthusiasm, skepticism, and critical distance in the overlapping human-computer interaction (HCI) and user experience (UX) communities. The former, with an academic engineering pedigree, had evolved over some thirty years to consider the problems inherent in any encounter between complex technical systems and the people using them; the latter, a more or less ad hoc network of practitioners, addressed similar concerns in their daily work, as the Internet and the World Wide Web built on it became facts of life for millions of nonspecialist users. As the new millennium dawned, both communities found ubicomp on their agendas, in advance of any hard data gleaned from actual use.

With the exception of discussions going on in the HCI community, none of these groups were necessarily pursuing anything that Mark Weiser would have recognized as fully cognate with his ubiquitous computing. But they were all sensing the rapidly approaching obsolescence of the desktop model, the coming hegemony of networked devices, and the reconfiguration of everyday life around them. What they were all grasping after, each in their own way, was a language of interaction suited to a world where information processing would be everywhere in the human environment.



Everyware: The Dawning Age of Ubiquitous Computing
ISBN: 0321384016
Year: 2006
Pages: 124
