So Volkswagen was right. Small really is beautiful. Power rests not with the big and strong, but with the small and capable. You only have to witness the number of compact cars on the American landscape today to realize that millions of people have arrived at the same conclusion. And Volkswagen did it again with the "retro" bug. Small is "in." Forever "in."
This fondness for the diminutive doesn't stop with cars, either. People are discovering that little things generally have tremendous advantages over their larger counterparts. Paperback books have long outsold hardcover editions, partly because they're less expensive and partly because they're easier to take everywhere. Wristwatches have replaced pocket watches because of their smaller size and greater portability. Technological advances have given today's miniature electronic components more capability than the much larger components of the past. Sales of pocket TVs, palmtop computers, and handheld remote controls are skyrocketing. Even today's nuclear weapons are considerably smaller than the ones dropped on Japan during World War II, yet they possess substantially more destructive power.
We owe much of this shrinking universe to superior technology. It takes highly advanced technology to reduce a mainframe computer to a microchip small enough to fit in one's hand. Without the miniaturization afforded by high-density microprocessors, many of today's products would be too cumbersome to be useful.
Still, using these high-technology wonders takes a certain level of user sophistication. It seems that the smaller and more advanced a device is, the more its user must know to operate it.
One example is the evolution of microwave ovens for the home. Early versions employed a start button and a plain knob for the timer. Then, as computer chip prices fell, it became chic to produce programmable ovens. These ovens were considerably more intelligent, but they also required more intelligent users to take advantage of their advanced features.
Another way to look at it: children don't learn to write with an ultrafine marker first. They start with big, fat crayons. Why? A crayon accommodates whatever grip the child can manage, while writing with a razor-tip marker demands a level of manual dexterity that a child doesn't reach until several years after first picking up the crayon.
From these examples we can draw a couple of conclusions. First, small things don't interface well with people. While microtechnology may be making things smaller, people aren't getting any smaller. Once a certain point of reduction is reached, people lose the ability to deal with things directly and must resort to tools that enhance their normal senses. A watchmaker, for instance, doesn't scrutinize the tiny parts inside a fine Swiss watch with the naked eye; he uses magnifying lenses to see the components. Similarly, today's semiconductor manufacturing facilities use microscopes to spot defects in integrated circuits packing millions of transistors per square inch.
The human senses function only within narrow limits. Sounds can be too soft or too loud to be heard. Light exists at frequencies both below and above what the eye can perceive. With our sense of smell we can distinguish among perfume varieties or gag at the stench of a skunk, but we have trouble distinguishing between the scents of two skunks.
As technology causes things to grow smaller, they eventually reach a point where our senses can no longer perceive them, and a special interface is required for human beings to deal with them. As computers become increasingly sophisticated, this gap between what exists and what the senses can perceive only widens: the distance between the software carrying out a given task and its user interface becomes a void.
The second conclusion we can draw about small things is that, while they do not interface well with people, they tend to interface well with each other. Their small size gives them tremendous flexibility: they combine readily with one another in many situations.
Consider this example: The next time you see a moving van in your neighborhood, watch how the workers load the customer's belongings. If they're putting an automobile on the van, it goes on first. Then they decide where the next largest pieces go. They follow these with the mid-sized pieces and then the small pieces. The sequence is obvious. But what is really happening is a demonstration that smaller things combine with each other in myriad ways to accomplish a task. While you have some flexibility in the placement of the smaller pieces, the larger ones afford fewer placement choices.
What would happen if you had only small pieces? You would achieve maximum flexibility. This flexibility comes with a price, though. The more small pieces you have, the harder it is for people to deal with them. Managing them becomes a serious problem.
Similarly, in the world of computer software, building from many small programs and modules gives you the greatest ability to adapt to the environment. Unfortunately, as the modules get smaller, the problem of interfacing them with the user grows: the more modules there are, the more complex dealing with them becomes.
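A minimal sketch in the Unix shell makes the point (the file name document.txt is a stand-in for any text). Each program in the pipeline is small and knows nothing of the others, yet together they answer a question none of them could answer alone: which words appear most often?

    # Count the ten most frequent words in a text file.
    # Each stage is a tiny, general-purpose program; the pipe (|)
    # is the interface that lets them combine.
    tr -cs '[:alpha:]' '\n' < document.txt |  # one word per line
    tr '[:upper:]' '[:lower:]' |              # fold to lowercase
    sort |                                    # group identical words
    uniq -c |                                 # count each group
    sort -rn |                                # largest counts first
    head -10                                  # keep the top ten

The flexibility lives entirely in the glue: swap head for grep and the same parts answer a different question. Notice, though, that a casual user would never compose such a pipeline unaided; the parts interface beautifully with each other, but not with people.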
This presents a dilemma for the software designer. He wants maximum flexibility for his application, so he constructs it from a collection of small modules. Yet he is also bound by the requirement that his software be easy to use. People have difficulty dealing with too many small modules.
Unix takes a different approach to resolving this issue. Most other systems try to bridge the widening gap between the user and the module performing the task with a single piece of software known as a captive user interface (CUI). Unix developers, recognizing that the gap will only keep growing, do not link the user to the underlying modules with one mass of "spaghetti" code; instead, they work away at the gap with small chunks, or layers, of software.
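To make the idea of layers concrete, here is a hedged sketch rather than anything shipped with Unix: a short wrapper script (the name busiest is invented for illustration) presents the user with one simple command, while the gap down to the working modules is spanned by small, reusable programs.

    #!/bin/sh
    # busiest -- report the users with the most login sessions.
    # One friendly command on the surface; underneath, a thin layer
    # gluing together small existing programs.
    who |                  # one line per current login session
    awk '{ print $1 }' |   # keep just the user name column
    sort | uniq -c |       # count sessions per user
    sort -rn |             # busiest users first
    head -5                # show the top five

Each layer can be replaced independently: a different front end could reuse the same pipeline, and the pipeline can be rearranged without the user ever noticing. A captive user interface, by contrast, welds the user to one monolithic program.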
Since Unix and other operating systems diverge on this point, you may suspect that there is an underlying rationale. There is, and it comes from the next tenet of the Unix philosophy.