Systems, Software, and Other Abstractions


Unlike memory chips, which have a regular array of elements, processors and logic chips are limited by the rat's nest of wires that span the chip on multiple layers. The bottleneck in logic chip design is not raw numbers of transistors but the lack of a design approach that can use all that capability in a timely fashion. For a solution, several next-generation processor companies have redesigned "systems on silicon" with a distributed computing bent; wiring bottlenecks are localized, and chip designers can be more productive by using a high-level programming language instead of wiring diagrams and logic gates. Chip design benefits from the abstraction hierarchy of computer science.

Compared with the relentless march of Moore's Law, the cognitive capability of humans is relatively fixed. We have relied on the compounding power of our tools to achieve exponential progress. To take advantage of accelerating hardware power, we must further develop layers of abstraction in software to manage the underlying complexity. For the next thousandfold improvement in computing, the imperative will shift to the growth of distributed complex systems. Our inspiration will likely come from biology.

As we race to interpret the now complete map of the human genome and embark upon deciphering the proteome, the accelerating pace of learning is not only opening doors to the better diagnosis and treatment of disease but is also a source of inspiration for much more powerful models of computer programming and complex systems development.

The Biological Muse

Many of the most interesting software challenges involve growing complex systems or draw on other biological metaphors for inspiration. Some of the interesting areas include biomimetics, artificial evolution, genetic algorithms, artificial life, emergence, IBM's Autonomic Computing initiative, viral marketing, mesh networks, hives, neural networks, and the subsumption architecture in robotics. The Santa Fe Institute has just launched a BioComp research initiative.

In short, biology inspires IT, and IT drives biology. But how inspirational are the information systems of biology? If we took your entire genetic code, the entire biological program that produced your cells, organs, body, and mind, and burned it onto a CD, it would be smaller than Microsoft Office. Just as images and text can be stored digitally, each of the four DNA bases (A, T, C, and G) can be encoded in two digital bits, resulting in a 750MB file that can be compressed further because of the preponderance of structural filler in the DNA chain.
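As a rough sketch (in Python, assuming the commonly cited figure of roughly 3 billion base pairs for the human genome), the arithmetic behind the two-bit encoding and the 750MB figure looks like this:

```python
# Sketch: pack DNA bases into two bits each and estimate the resulting file size.
# Assumes roughly 3 billion base pairs for the human genome (an approximate figure).

BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack_bases(sequence: str) -> bytearray:
    """Pack a string of A/C/G/T characters into 2 bits per base (4 bases per byte)."""
    packed = bytearray((len(sequence) + 3) // 4)
    for i, base in enumerate(sequence):
        packed[i // 4] |= BASE_TO_BITS[base] << (2 * (i % 4))
    return packed

# Size estimate: 3e9 bases * 2 bits = 6e9 bits = 750 million bytes, i.e. ~750MB.
genome_bases = 3_000_000_000
print(genome_bases * 2 / 8 / 1e6, "MB")   # -> 750.0 MB

print(pack_bases("ACGT").hex())           # four bases fit in a single byte
```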

If, as many scientists believe, most of the human genome consists of vestigial evolutionary remnants that serve no useful purpose, then we could compress it to 60MB of concentrated information. Having recently reinstalled Office, I am humbled by the comparison between its relatively simple capabilities and the wonder of human life. Much of the power in bioprocessing comes from the use of nonlinear fuzzy logic and feedback in the electrical, physical, and chemical domains.

For example, in a fetus, the initial interneuronal connections, or "wiring," of the brain follow chemical gradients. The massive number of interneuron connections in an adult brain could not be simply encoded in our DNA, even if the entire DNA sequence were dedicated to this one task. Your brain has on the order of 100 trillion synaptic connections between 60 billion neurons.
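A back-of-the-envelope comparison (a sketch in Python using only the approximate figures quoted above) makes the point concrete:

```python
import math

# Rough arithmetic: could the genome directly specify every synaptic connection?
# All figures are the approximate ones quoted in the text.
synapses = 100e12            # ~100 trillion synaptic connections
neurons = 60e9               # ~60 billion neurons
genome_bits = 750e6 * 8      # ~750 MB of raw genetic information, in bits

# Even one bit per connection would exceed the genome by a factor of ~17,000.
print(genome_bits / synapses)   # ~0.00006 bits of genome per synapse

# Merely naming the target neuron of a single synapse would take ~36 bits.
print(math.log2(neurons))       # ~35.8
```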

This highly complex system is not "installed," like Microsoft Office, from your DNA. Rather, it is grown, first through widespread connectivity sprouting from "static storms" of positive electrochemical feedback, and then through the pruning of many underused connections through continuous usage-based feedback. In fact, human brains hit their peak at the age of two to three years, with a quadrillion synaptic connections and twice the energy burn of an adult brain.

The brain has already served as an inspirational model for artificial intelligence (AI) programmers. The neural network approach to AI involves the fully interconnected wiring of nodes, followed by the iterative adjustment of the strength of these connections through numerous training exercises and the back-propagation of feedback through the system.

Moving beyond rule-based AI systems, these artificial neural networks are capable of many humanlike tasks, such as speech and visual pattern recognition, with a tolerance for noise and other errors. These systems shine precisely in the areas where traditional programming approaches fail.
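A minimal sketch of that training loop (in Python with NumPy; the toy XOR-style dataset, the network size, and the learning rate are illustrative assumptions, not details of any system mentioned here):

```python
# Minimal neural-network sketch: fully connected nodes whose connection strengths
# are adjusted by back-propagating error over many training passes.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # toy inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # toy targets (XOR)

W1 = rng.normal(size=(2, 4))   # input -> hidden connection strengths
W2 = rng.normal(size=(4, 1))   # hidden -> output connection strengths
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):                      # repeated "training exercises"
    hidden = sigmoid(X @ W1)                 # forward pass
    output = sigmoid(hidden @ W2)
    error = y - output                       # feedback signal
    grad_out = error * output * (1 - output)               # back-propagation
    grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)
    W2 += 0.5 * hidden.T @ grad_out          # strengthen/weaken connections
    W1 += 0.5 * X.T @ grad_hidden

print(np.round(output, 2))   # outputs should approach [0, 1, 1, 0]
```

The weights play the role of connection strengths: each pass nudges them in the direction that reduces the output error, which is the essence of back-propagation.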

The coding efficiency of our DNA extends beyond the leverage of numerous feedback loops to the complex interactions between genes. Regulatory genes produce proteins that respond to external or internal signals by regulating the activity of previously produced proteins or of other genes. The result is a complex mesh of direct and indirect controls.

This nested complexity implies that genetic reengineering can be a very tricky endeavor when we have only partial knowledge of the systemwide side effects of tweaking any one gene. For example, recent experiments show that genetically enhanced memory comes at the cost of heightened sensitivity to pain.

By analogy, our genetic code is a dense network of nested hyperlinks, much like the evolving Web. Computer programmers already tap into the power and efficiency of indirect pointers and recursive loops. More recently, biological systems have inspired research in evolutionary programming, where computer programs are competitively grown in a simulated environment of natural selection and mutation. These efforts could transcend the local optimization inherent in natural evolution.
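A minimal sketch of that selection-and-mutation loop (in Python; the bit-string "genome" and the toy fitness function are illustrative assumptions, standing in for real programs and real performance tests):

```python
# Minimal genetic-algorithm sketch: a population of candidate solutions competes,
# the fittest reproduce with mutation, and quality improves over generations.
import random

random.seed(1)
GENOME_LENGTH = 20

def fitness(candidate):
    """Toy objective: count of 1-bits (a stand-in for a real scoring function)."""
    return sum(candidate)

def mutate(candidate, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in candidate]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LENGTH)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
              for _ in range(30)]

for generation in range(50):
    population.sort(key=fitness, reverse=True)      # natural selection
    survivors = population[:10]                     # the fittest survive
    offspring = [mutate(crossover(random.choice(survivors),
                                  random.choice(survivors)))
                 for _ in range(20)]
    population = survivors + offspring

print(fitness(population[0]), "of", GENOME_LENGTH)  # best score approaches 20
```

In genuine evolutionary programming the candidates are executable programs and the fitness function measures their behavior, but the survive-reproduce-mutate cycle is the same.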

But therein lies great complexity. We have little experience with the long-term effects of the artificial evolution of complex systems. Early subsystem work can determine the emergent and higher-level capabilities that follow, as it did with the neuron. (Witness the Cambrian explosion of structural complexity and intelligence in biological systems once the neuron enabled something other than nearest-neighbor intercellular communication. Prior to the neuron, most multicellular organisms were small blobs.)

Recent breakthroughs in robotics were inspired by the "subsumption architecture" of biological evolution: a layered approach that assembles reactive rules into complete control systems from the bottom up. The low-level reflexes are developed early and remain unchanged as complexity builds. Early subsystem work in any subsumptive system can have profound effects on its higher-order constructs. We may not have a predictive model of these downstream effects as we are developing the architectural equivalent of the neuron.
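A minimal sketch of the layered idea (in Python; the sensor value and the two behavior layers are illustrative assumptions):

```python
# Subsumption-style control sketch: simple reactive layers are stacked, and a
# higher layer acts only when no lower (more reflexive) layer takes over.

def avoid_obstacle(sensors):
    """Lowest layer: a reflex that always wins when triggered."""
    if sensors["obstacle_distance"] < 0.2:
        return "turn_away"
    return None

def wander(sensors):
    """Higher layer: move around when nothing more urgent applies."""
    return "move_forward"

# Layers ordered from most to least reflexive; lower layers subsume higher ones.
LAYERS = [avoid_obstacle, wander]

def control(sensors):
    for layer in LAYERS:
        action = layer(sensors)
        if action is not None:       # first (lowest) responding layer wins
            return action
    return "idle"

print(control({"obstacle_distance": 0.1}))   # -> turn_away
print(control({"obstacle_distance": 1.0}))   # -> move_forward
```

Higher-level behaviors are added as new layers without modifying the reflexes beneath them, which is what lets complexity build from the bottom up.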

The Web is the first distributed experiment in biological growth in technological systems. Peer-to-peer software development and the rise of low-cost Web-connected embedded systems raise the possibility that complex artificial systems will arise on the Internet, rather than on one programmer's desktop. We already use biological metaphors, such as "viral" marketing, to describe the network economy.

Nanotech Accelerants: Quantum Simulation and High-Throughput Experimentation

We have discussed the migration of the lab sciences to the innovation cycles of the information sciences and Moore's Law. Advances in multi-scale molecular modeling are helping some companies design complex molecular systems in silico. But the quantum effects that underlie the unique properties of nanoscale systems are a double-edged sword. Although scientists have known for nearly 100 years how to write down the equations that an engineer needs to solve in order to understand any quantum system, no computer has ever been built that is powerful enough to solve them. Even today's most powerful supercomputers choke on systems bigger than a single water molecule.

This means that the behavior of nanoscale systems can be reliably studied only by empirical methods: building something in a lab and then poking and prodding it to see what happens.

This observation is distressing on several counts. We would like to design and visualize nanoscale products in the tradition of mechanical engineering, using CAD-like (computer-aided design) programs. Unfortunately, this future can never be accurately realized using traditional computer architectures: the structures of interest to nanoscale scientists present intractable computational challenges to traditional computers.

The shortfall in our ability to use computers to shorten and reduce the cost of the design cycles of nanoscale products has serious business ramifications. If the development of all nanoscale products fundamentally requires long R&D cycles and significant investment, the nascent nanotechnology industry will face many of the difficulties that the biotechnology industry faces, without having a parallel to the pharmaceutical industry to shepherd products to market.

In a wonderful turn of poetic elegance, quantum mechanics itself turns out to be the solution to this quandary. Machines known as quantum computers, built to harness some simple properties of quantum systems, can perform accurate simulations of any nanoscale system of comparable complexity. The type of simulation conducted by a quantum computer results in an exact prediction of how a system will behave in nature, something that is literally impossible for any traditional computer, no matter how powerful.

Once quantum computers become available, engineers working at the nanoscale will be able to use them to model and design nanoscale systems, just as today's aerospace engineers model and design airplanes: completely virtually, with no wind tunnels (or their chemical analogs).

This may seem strange, but really it's not. Think of it this way: Conventional computers are really good at modeling conventional (that is, nonquantum) stuff, such as automobiles and airplanes. Quantum computers are really good at modeling quantum stuff. Each type of computer speaks a different language.

One of our companies is building a quantum computer using aluminum-based circuits. The company projects that by 2008 it will be building thumbnail-sized chips that, when applied to simulating the behavior and predicting the properties of nanoscale systems, will have more computing power than the aggregate of every computer ever built, a comparison that highlights the vast difference between the capabilities of quantum computers and those of conventional computers. This would be of great value to the development of the nanotechnology industry. It is also a jaw-dropping claim. Professor David Deutsch of Oxford summarized it this way: "Quantum computers have the potential to solve problems that would take a classical computer longer than the age of the universe."

Although any physical experiment can be regarded as a complex computation, we will need quantum computers that transcend Moore's Law into the quantum domain to make this equivalence realizable. In the meantime, scientists will perform experiments. Until recently, the methods used for the discovery of new functional materials differed little from those used by scientists and engineers a hundred years ago. It was a manual, skilled-labor-intensive process: one sample was prepared from among millions of possibilities, it was tested, the results were recorded, and the process was repeated. Discoveries routinely took years.

Companies like Affymetrix, Intematix, and Symyx have made major improvements in a new methodology: high-throughput experimentation. For example, Intematix performs high-throughput synthesis and screening to produce and characterize new materials for a wide range of technology applications. This technology platform enables the company to discover compound materials solutions more than 100 times as fast as conventional methods. Initial materials have been developed with applications in wireless communications, fuel cells, batteries, x-ray imaging, semiconductors, LEDs, and phosphors.

Combinatorial materials discovery replaces the traditional method by simultaneously generating a multitude of combinations, possibly all feasible combinations, of a set of raw materials. This "materials library" contains all the combinations of that set, and they can be quickly tested in parallel by automated methods similar to those used in combinatorial chemistry and the pharmaceutical industry. What used to take years to develop now takes only months.
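A sketch of the combinatorial idea (in Python; the precursor names and the screening function are purely illustrative stand-ins for real chemistry and real automated measurements):

```python
# Sketch of combinatorial materials discovery: enumerate mixtures of precursors
# (the "materials library") and screen every candidate with an automated test.
from itertools import combinations

# Hypothetical precursor set; real libraries would also vary composition ratios.
precursors = ["A", "B", "C", "D", "E"]

def screen(mixture):
    """Stand-in for an automated measurement of a property of interest."""
    return sum(ord(m) for m in mixture) % 7   # arbitrary toy score

# Build the library: every 2- and 3-component mixture of the precursor set.
library = [mix for k in range(2, 4) for mix in combinations(precursors, k)]
results = sorted(library, key=screen, reverse=True)

print(len(library), "candidate mixtures screened")
print("best candidates:", results[:3])
```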



