There's no doubt about it: Software is expensive. The United States alone devotes at least $250 billion each year to application development, spread across approximately 175,000 projects and involving several million people. For all of this investment of time and money, though, software's customers continue to be disappointed: over 30 percent of those projects will be canceled before they're completed, and more than half will cost nearly twice their original estimates.
The demand for software also continues to rise. The developed economies rely to a large extent on software for telecommunications, inventory control, payroll, word processing and typesetting, and an ever-widening set of applications. Only a decade ago, the Internet was text-based, known only to the relatively few scientists connected via DARPAnet and email. Nowadays, it seems as if everyone has his or her own website. Certainly, it's become difficult to conduct even non-computer-related business without email.
There's no end in sight. A Star Trek world of tiny communications devices, voice-recognition software, vast searchable databases of human (for the moment, anyway) knowledge, sophisticated computer-controlled sensing devices, and intelligent displays is now imaginable. (As software professionals, however, we know just how much ingenuity will be required to deliver these new technologies.)
Software practitioners, industrial experts, and academics have not been idle in the face of this need to improve productivity. There have been significant improvements in the ways in which we build software over the last fifty years, two of which are worthy of note in our attempts to make software an asset. First, we've raised the level of abstraction of the languages we use to express behavior; second, we've sought to increase the level of reuse in system construction.
These techniques have undoubtedly improved productivity, but as we bring more powerful tools to bear on more difficult problems, the size of each problem we're expected to tackle increases to the point at which we can, once again, barely solve it.
MDA takes the ideas of raising the level of abstraction and increasing reuse up a notch. It also introduces a new idea that ties them together into a greater whole: design-time interoperability.