It's all John Backus' fault. Before his landmark work in the 1950s, everything was so simple.

Jim Gray once noted that "In the beginning, there was FORTRAN." But of course, that wasn't the beginning. The genesis was the recognition of a pair of problems. Dr. Cuthbert C. Hurd, Director of IBM's Applied Science Division, noticed in the summer of 1952 that programming the IBM Defense Calculator (later romantically renamed the IBM 701) was quite difficult. He also noticed, however, that the 701 was a speed demon. Watching leading mathematicians of the day have a try at the new machine, he has been quoted as saying, "They each got a shot at the computer. They would feed a program into the computer and, bam, you got the result…We all sat there and said, 'How are we going to keep this machine busy? It's so tremendously fast. How are we going to do that?'"

What a marvelous juxtaposition: The machine was fast as could be, and also difficult to program. In 1953, Hurd proposed that his staffer John Backus do something about this juxtaposition. The result was the Speedcoding system, developed in 1954, and released to the world in 1957 as the FORmula TRANslation system, or FORTRAN for short. Today we don't think much about compilation; it's a fact of life, and we have a plethora of languages from which to choose, each theoretically tuned to some class of problems or infrastructure. But in fact FORTRAN was a critically important first step, the greatest remove yet (in 1957) from the machine's own representation of programs.

Since then, we have apparently flown up the abstraction ladder. Later languages let the programmer pretend to have infinite memory, via automatic memory management (Lisp, for example, or Basic), or to use bizarre execution models (Prolog or MICROPLANNER being obvious examples). In each case, programmer productivity soared, sometimes (but not always) with an attendant increase in run-time cost in space and time.

It's notable, however, that the original FORTRAN system produced excellent code. A member of Backus' own team, Harlan Herrick, remembered in 1982, "I said, 'John, we can't possibly simulate a human programmer with a language like this, a language that would produce machine code that would even approach the efficiency of a human programmer like me, for example. I'm a great programmer, don't you know?'" Nevertheless, the cover of 1956's Fortran Automatic Coding System for the IBM 704 proudly states, "Object Programs produced by Fortran will be nearly as efficient as those written by good programmers." Somehow the high-level abstraction allowed by programming languages does not always carry significant run-time costs, so long as the precision of the abstraction allows complete definition of the algorithm.

And so abstraction has marched on. In fact this "compilation" idea is quite well-understood today: What a compiler (or interpreter or assembler, for that matter) does is to translate one model of an algorithm into another (presumably "lower-level") model of the same algorithm. Now if we could just allow the capture of algorithms in some language that feels closer to the mathematical or structural formalisms natural to those who develop the algorithms, and also ensure that mappings exist from those high-level formalisms to the low-level formalisms our real computers understand (i.e., machine language)!

This, in essence, is the promise of Model Driven Architecture (MDA). By asking developers to use precise but abstract and graphical representations of algorithms, MDA allows the construction of computing systems from models that can be understood much more quickly and deeply than can programming language "code." Coding languages, even "high-level" languages like Smalltalk and Lisp, overlay many unintentional constraints and structural styles on the specification of an algorithm. Modeling languages, in contrast, while certainly overlaying a style and structure of their own, attempt not to constrain the expression of the algorithm. This allows extra freedoms to the compiler, and much more clarity of expression to the developer and, more importantly, to the maintainer who must figure out the underlying algorithm in order to correct bugs or integrate an existing system with something new.

Not that this leap is a trivial one, of course. Thankfully, MDA doesn't expect that leap of abstraction (and of faith!) to be taken all at once. While it is certainly worth some investment, some "activation energy," to reduce the immense costs of software development, maintenance, and integration, one must expect some expense in shifting to a higher-abstraction view. You hold in your hands a first step: a distillation of the thoughts and techniques that comprise a complete and comprehensive approach to building, maintaining, and integrating software systems using a Model Driven Architecture approach.

The Oxford English Dictionary advises us that in the 14th century, the new word "distill" came to mean, "To trickle down or fall in minute drops…" Sometimes learning a new technology is unfortunately more like drinking from a fire hose. In this book, you are offered a step-by-step distillation (in the Oxford sense, "gentle dropping or falling," we hope!) of the techniques that will make you successful in building software better, faster, and cheaper. Layering on the abstractions one at a time, the authors take you through the modeling steps to succeed with this approach to building reliable, maintainable, and integratable systems.

One of my favorite authors, Chaim Potok, wrote, "All beginnings are hard." Though he may not have been thinking of MDA at the time, he also added a character note, "Then the drawing tells me what I'm trying to say." May your drawings tell you, and those who must interpret your work, what you are trying to say.

Richard Mark Soley, Ph.D.
Lexington, Massachusetts
December 2003

MDA Distilled. Principles of Model-Driven Architecture