Given how conventional a component system may appear before it is incorporated in some context we'd be able to recognize as everyware, we're led to a rather startling conclusion: Relatively few of the people engaged in developing the building blocks of ubiquitous systems will consciously think of what they're doing as such.
In fact, they may never have heard the phrase "ubiquitous computing" or any of its various cognates. They will be working, rather, on finer-grained problems: calibrating the sensitivity of a household sensor grid so that it recognizes human occupants but not the cat, or designing an RFID-equipped key fob so that it reads properly no matter which of its surfaces is brought into range of the reader. With such a tight focus, they will likely have little sense for the larger schemes into which their creations will fit.
This is not an indictment of engineers. They are given a narrow technical brief, and they return solutions within the envelope available to them: an envelope that is already bounded by material, economic, and time constraints. Generally speaking, it is not in their mandate to consider the "next larger context" of their work.
And if this is true of professional engineers, how much more so will it apply to all the amateurs newly empowered to develop alongside them? Amateurs have needs and desires, not mandates. They'll build tools to address the problem at hand, and inevitably some of these tools will fall under the rubric of everyware, but the amateur developers will be highly unlikely to think of what they are doing in these terms.
In Weiser and Brown's seminal "The Coming Age of Calm Technology," it appears to have been the authors' contention that responses to a suddenly hegemonic computing would arise as a consequence of its very ubiquity: "If computers are everywhere, they better stay out of the way."
Given the topic, this is a strikingly passive way for them to frame the question. It's as if Weiser and Brown trusted all of the people developing ubiquitous technology to recognize the less salutary implications of their efforts ahead of time and plan accordingly.
Even in the pre-Web 1990s, this was an unreasonably optimistic stance. Taking into account all that we've concluded about how little developers may understand the larger context in which their work is embedded, and the difficulty of planning for the emergent properties of interacting systems, it would be indefensible today.
In fact, we should probably regard IT development itself as something unsuited to the production of an acceptably humane everyware. The reason has to do with how such development is conducted in organizations both large and small, from lean and hungry startups to gigantic international consultancies.
Every developer is familiar with the so-called "iron triangle." The version I learned was taught to me by a stereotypically crusty engineer, way back at my first dot-com job in 1999. In response to my request that he build a conduit between our Web site's shopping cart and the warehouse's inventory control system, he grunted, scrawled a quick triangle up on a handy whiteboard, hastily labeled the vertices FAST, GOOD, and CHEAP, and said, "Pick any two."
For all that this is obviously a cartoon of the technology development process, it's also an accurate one. For a variety of reasons, from the advantages that ostensibly accrue to first movers to the constraints imposed by venture capitalists, shareholders, and other bottom-liners, GOOD is rarely among the options pursued. Given the inherent pressures of the situation, it often takes an unusually dedicated, persistent, and powerful advocate (Steve Jobs comes to mind, as does vacuum-cleaner entrepreneur James Dyson) to see a high-quality design project through to completion with everything that makes it excellent intact.
Moreover, the more complex the product or service at hand, the more likely it is to have a misguided process of "value engineering" applied at some point between inception and delivery. Although the practice has its roots in an entirely legitimate desire to prune away redundancy and overdesign, it is disastrous when applied to IT development. However vital it may be, the painstakingly detailed work of ensuring a good user experience is frequently hard to justify on a short-term ROI basis, and this is why it is often one of the first things to get value-engineered out of an extended development process. Even if it's clearly a false efficiency from a more strategic perspective, reducing or even eliminating the user-experience phase of development can seem like getting rid of an unnecessary complication.
But we've seen that getting everyware right will be orders of magnitude more complicated than achieving acceptable quality in a Web site, let alone a desktop application. We have an idea how very difficult it will be to consistently produce ubiquitous experiences that support us, encalm us, strengthen and encourage us. Where everyware is concerned, even GOOD won't be GOOD enough. This is not the place for value engineers, not unless they have first earned a sensitive understanding of how difficult the problem domain is and what kinds of complexity it genuinely requires, both in process and product.