Thesis 32


Everyware is strongly implied by the continuing validity of Moore's law.

No matter what we choose to do with it, the shape that information technology takes in our lives will always be constrained by the economic and material properties of the processors undergirding it. Speed, power consumption profile, and unit production cost are going to exert enormous influence on the kinds of artifacts we build with processors and on how we use them.

Pretty much right up to the present moment, these qualities have been limiting factors on all visions involving the widespread deployment of computing devices in the environment. Processors have historically been too expensive, too delicate, and too underpowered to use in any such way, leaving computing cycles too scarce a commodity to spend on extravagances like understanding spoken commands.

As the price of processors falls dramatically, and computing power begins to permeate the world, the logic behind such parsimony disappears: we can afford to spend that power freely, even lavishly, with the result that computing resources can be brought to bear on comparatively trivial tasks. We arrive at the stage where processor power can be economically devoted to addressing everyday life. As Mark Weiser put it, "where are the car keys, can I get a parking place, and is that shirt I saw last week at Macy's still on the rack?"

In fact, we know that scattering processors throughout the environment will only continue to get cheaper. The reasoning behind this assertion was first laid out in 1965 by engineer (and later Intel co-founder) Gordon Moore, in a now-legendary article in the industry journal Electronics. It would turn out to be one of the most profoundly influential observations in the history of computing, and as nakedly self-fulfilling a prophecy as there ever has been. (It's so well known in the industry, in fact, that if you feel like you've got a handle on what it implies for everyware, there's no reason for you not to skip ahead to Thesis 33.)

Moore's essay simply pointed out that the prevailing industry trend was for ever greater numbers of transistors to be packed into an ever smaller space, with the number of transistors per unit area approximately doubling every 24 months. He concluded almost parenthetically that the trend would continue for at least ten years into the future.

Transistor density being a fairly reliable stand-in for certain other qualities of a computer (notably, speed), this implied that future devices would offer sharply higher performance, in a smaller envelope, at a fixed cost. This "prediction" was actually a rather weak one, couched in a number of qualifiers, but nonetheless it has acquired the imposing name of "Moore's law."

Although the article never says so in so many words, Moore's law has almost universally been interpreted as a bald statement that the amount of processing power available at a given cost will double every eighteen months, indefinitely. Applied to the slightly different context of memory, the Moore curve predicts that a given amount of storage will cost roughly half as much a year and a half from now and take up half as much volume.[*]

[*] Nowhere in the annals of computing is it convincingly explained how the 24-month doubling period of Moore's original article became the 18-month period of geek legend. Moore himself insists to this day that he never used the latter number, either in his published comments or elsewhere.
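To make the popular reading concrete, here is a minimal sketch of the exponential curve that both the 18-month and 24-month formulations describe; only the doubling period differs. The helper names, the ten-year horizon, and the printed figures are illustrative assumptions, not anything drawn from Moore's article.

```python
# A minimal sketch of the popular reading of Moore's law: capability grows
# exponentially, doubling once per fixed period. Figures are illustrative.

def scale_factor(years_elapsed, doubling_period_years):
    """How much capability grows (or cost shrinks) over a span of years."""
    return 2 ** (years_elapsed / doubling_period_years)

# Over a decade, the two commonly quoted periods diverge substantially:
print(scale_factor(10, 2.0))   # 24-month doubling: ~32x
print(scale_factor(10, 1.5))   # 18-month doubling: ~101x

# The memory reading: relative cost of a fixed amount of storage 18 months out.
print(1 / scale_factor(1.5, 1.5))  # 0.5, i.e. roughly half of today's price
```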

That Moore's law was more or less consciously adopted as a performance goal by the chip-design industry goes a long way toward explaining the otherwise improbable fact that it still has some predictive utility after some forty years. Compare, for example, the original microprocessor, Intel's 1971 4004, to a 2004 version of the same company's Pentium 4 chip: the 4004 packed 2,300 transistors and ran at a clock speed of 740 kHz, while the Pentium 4 boasts a transistor count of 178 million and runs at 3.4 GHz. That's not so far off the numbers called for by a 24-month doubling curve.
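A quick back-of-the-envelope check, assuming a strict 24-month doubling period and the figures just cited, bears this out:

```python
# Back-of-the-envelope check of the 4004-to-Pentium 4 comparison against a
# strict 24-month doubling curve. The figures are the ones cited above; the
# helper function is purely illustrative.

def projected_transistors(initial_count, years_elapsed, doubling_period_years=2.0):
    """Project a transistor count forward along an exponential doubling curve."""
    return initial_count * 2 ** (years_elapsed / doubling_period_years)

projection = projected_transistors(2_300, years_elapsed=2004 - 1971)
print(f"24-month curve projects: {projection:,.0f} transistors")  # ~213 million
print(f"Pentium 4 actually has:  {178_000_000:,} transistors")
```

The curve overshoots the actual count by less than a factor of two over thirty-three years, which is all the comparison asks of it.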

In a purely technodeterminist reading, anyway, Moore's law tells us exactly where we're headed next. It's true that Gordon Moore made his observation in the long-ago of 1965, and so one might be forgiven for thinking that his "law" had little left to tell us. But as far as anyone knowledgeable can tell, its limits are a long way off. A vocal minority continues to assert the belief that even after the photolithography used in chip fabrication hits the limits inherent in matter, more exotic methods will allow the extension of Moore's unprecedented run. Whether or not Moore's law can be extended indefinitely, there is sufficient reason to believe that information-processing componentry will keep getting smaller, cheaper, and more powerful for some time yet to come.

Because processors will be so ridiculously cheap, the world can be seeded with them economically. Because their cheapness will mean their disposability, they'll be installed in places it wouldn't have made sense to put them before: light switches, sneakers, milk cartons. There will be so very, very many of them, thousands of them devoted to every person and place, and it won't really matter whether some percentage of them fail. They will be powerful individually, able to share computation among themselves besides, and so able to parse the complexities presented by the problems of everyday life. Whatever name it is called by, however little it may resemble the calm technology envisioned by Mark Weiser, a computing with these properties will effectively be ubiquitous, in any meaningful sense of the word.


