For philosophers, the eighteenth century is considered the Age of Enlightenment, or the Age of Reason. Other descriptive labels for the era include the Age of Science and the beginning of the Age of the Machine. The universe was considered a kind of complicated mechanism that operated according to discoverable laws. Once understood, those laws could be manipulated to bend the world to human will. Physicists, chemists, and engineers daily demonstrated the validity of this worldview with ever more clever and powerful devices.
In this particular case, our system metaphor is easy: it's the vending machine itself, since our task is to provide software simulations of the behaviors expected of a Universal Vending Machine. Even more helpfully, we have an entire row of vending machines in the hallway just outside our office that we can use as references as we talk about what we are building.
The team sits down with the on-site customer to discuss the project. We have all heard the CEO's excited description of the UVM, so we have a common starting point. The first cards that the customer writes include the following. (The story title precedes the colon; the story narrative follows the colon.)
Payment: The UVM accepts payment.
Selection: The UVM allows the customer to make a selection.
Change: The UVM dispenses change.
Inventory: The UVM updates its inventory.
Dispense: The UVM dispenses the selected product.
Rather obvious stories, but a starting point. The team discusses whether it can estimate the stories provided, decides that it cannot, and works with the on-site customer to refactor the story list. This effort yields a modified list:
VerifyCC: The UVM verifies the credit card.
VerifyDBT: The UVM verifies the debit card.
ChargeCC: The UVM charges the credit card.
DeductFunds: The UVM deducts funds from the debit card.
Cash: The UVM accepts cash.
CurrencyConversion: The UVM converts currencies.
WebConnect: The UVM accepts a Web connection.
Menu: The UVM informs the customer of what is available to purchase, making sure only in-stock items are presented.
Selection: The customer is allowed to make a selection. Do we allow her to change her mind?
ValidateSelection: Make sure that the amount of money available is equal to or greater than the cost of the selected item.
ChargeIt: Charge the credit card.
DeductIt: Deduct funds from the debit card.
DispenseItem: Give the customer what she asked for.
Change?: Calculate whether change is due.
DispenseChange: Return change to customer.
NoChange: Ask the customer to enter correct change only, or offer to post the change amount to a credit card or debit card instead of dispensing money.
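The refactored stories already hint at object behaviors. As a minimal sketch of two of them, ValidateSelection and Change?, the following might emerge (the class name `UVM`, the method names, and the catalog of prices are all hypothetical illustrations, not part of the stories themselves; the various payment sources are collapsed into a single tendered amount for simplicity):

```python
class UVM:
    """Hypothetical sketch of a Universal Vending Machine object.

    Only the ValidateSelection and Change? stories are modeled here;
    cash, credit, and debit payments are reduced to one tendered amount.
    """

    def __init__(self, catalog):
        # catalog maps item name -> price in cents
        self.catalog = catalog

    def validate_selection(self, item, tendered):
        # ValidateSelection: available money must equal or exceed
        # the cost of the selected item
        return item in self.catalog and tendered >= self.catalog[item]

    def change_due(self, item, tendered):
        # Change?: calculate whether (and how much) change is owed
        if not self.validate_selection(item, tendered):
            raise ValueError("insufficient funds or unknown item")
        return tendered - self.catalog[item]


uvm = UVM({"cola": 150, "chips": 125})
print(uvm.validate_selection("cola", 200))  # True
print(uvm.change_due("cola", 200))          # 50
```

Even this toy sketch shows why the team wanted stories it could estimate: each refactored story names a single behavior small enough to become a method.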
The next step is to turn the stories into objects. Yes, yes: the next explicit XP step is to write tests, but what tests, and what should the focus of the tests be? You know that the code you produce will consist of class definitions and methods. You know that the refactoring you will eventually do involves breaking up methods and redistributing methods among your classes. So even though your next activity may be to write tests, your next thinking will involve finding objects.
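That thinking surfaces first as tests. As a hedged sketch of what a test-first pass at the Change? and DispenseChange stories might look like (the `UVM` class here is a minimal stand-in invented for illustration; a real team would grow the class out of the tests themselves):

```python
import unittest


class UVM:
    """Minimal stand-in so the tests below are self-contained."""

    def __init__(self, catalog):
        self.catalog = catalog  # item name -> price in cents

    def change_due(self, item, tendered):
        cost = self.catalog[item]
        if tendered < cost:
            raise ValueError("insufficient funds")
        return tendered - cost


class ChangeStoryTest(unittest.TestCase):
    """Tests for the Change? story, written before the 'real' UVM exists."""

    def setUp(self):
        self.uvm = UVM({"chips": 125})

    def test_overpayment_yields_change(self):
        self.assertEqual(self.uvm.change_due("chips", 200), 75)

    def test_exact_payment_yields_no_change(self):
        self.assertEqual(self.uvm.change_due("chips", 125), 0)


# Run the tests programmatically so the script continues afterward
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ChangeStoryTest)
result = unittest.TextTestRunner().run(suite)
```

Writing the test forces the question the stories left open: which object answers `change_due`, and what does it need to know to answer it?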
Because you have internalized object thinking, you can look at the list of stories and foreshadow some likely outcomes. Stories 1 and 2 are likely to involve the same set of objects and might be a single story. The same is true of stories 13 and 15: dispensing is dispensing.
Developers separate into pairs, select a story, and begin development. Everyone is confident that we can begin development with such a sparse definition of requirements and in the absence of any kind of formal modeling. Everyone fully expects a final solution to emerge from ongoing discussions, elaborations, and implementations of this initial set of stories. No one feels the need to create syntactically precise models, but it is surprising how much gets jotted down and drawn in the process of discussing and writing code. Interestingly enough, those who have been doing objects longer tend to make fewer notes as they proceed, while those who are relatively new to objects have many more bits of paper on and near their workspace. Object thinking is a mental exercise: all the ideas, definitions, heuristics, and models presented in this book are internalized as developers put them into practice.
Writers such as Descartes, Hobbes, and Leibniz provided the philosophical ground that explained the success of science and extended the mechanical metaphor to include human thought. For these thinkers, the universe comprised a set of basic elements: literally, the periodic table of elements in the case of chemistry, properties such as mass for the physicists, and mental tokens in the case of human thought. These basic elements could be combined and transformed according to some finite set of unambiguous rules: the laws of nature in the case of the physical sciences, or classical logic in the case of human thought.
Descartes, Hobbes, and Leibniz are likely to be familiar to most readers as rationalist philosophers. Perhaps less well known is the fact that all three were convinced that human thought could be simulated by a machine, a foundation idea behind classical artificial intelligence (AI) research in the 1970s. All three built or attempted to build mechanical thinking/calculating devices. Leibniz was so entranced with binary arithmetic (he invented aspects of binary logic) that it influenced his theology ("The void is zero and God is one and from the One all things are derived") and prompted an intense interest in the Chinese I Ching (Book of Changes), which has a binary foundation.
This tradition of thought, this worldview or paradigm, has been labeled formalism. Other names with various nuances of meaning include rationalism, determinism, and mechanism. Central to this paradigm are notions of centralized control, hierarchy, predictability, and provability (as in math or logic). If you could discover the tokens and the manipulation rules that govern the universe, you could specify a syntax that would capture all possible semantics. You could even build a machine capable of human thought by embodying that syntax in its construction.
As science continued to advance, other philosophers and theoreticians refined the formalist tradition. Russell and Whitehead are stellar examples. In the world of computing, Babbage, Turing, and von Neumann ensured that computer science evolved in conformance with formalist worldviews. In some ways, the ultimate example of formalism in computer science is classical artificial intelligence, as seen in the work of Newell, Simon, and Minsky.
Formalist philosophy has shaped Western industrial culture so extensively that even cultural values reflect that philosophy. For example, scientific is good, rational is good, and being objective is good. In both metaphor ("Our team is functioning like a well-oiled machine") and ideals (scientific management, computer science, software engineering), Western culture-at-large ubiquitously expresses the value system derived from formalist philosophy.
Computer science is clearly a formalist endeavor. Its roots are mathematics and electrical engineering. Its foundation concepts include data (the tokens), data structures (combination rules), and algorithms (transformation and manipulation rules). Behind everything else is the foundation of discrete math and predicate calculus. Structured programming, structured analysis and design, information modeling, and relational database theory are all prime examples of formalist thinking. These are the things that we teach in every computer science curriculum.
As a formalist, the computer scientist expects order and logic. The goodness of a program is directly proportional to the degree to which it can be formally described and formally manipulated. Proof (as in mathematical or logical proof) of correctness for a piece of software is an ultimate objective. All that is bad in software arises from deviations from formal descriptions that use precisely defined tokens and syntactic rules. Art has no place in a program. In fact, many formalists would take the extreme position: there is no such thing as art; art is nothing more than a formalism that has yet to be discovered and explicated.
Countering the juggernaut of formalism is a minority worldview of equal historical standing, even though it does not share equal awareness or popularity. Variously known as hermeneutics, constructivism, interpretationalism, and most recently postmodernism, this tradition has consistently challenged almost everything advanced by the formalists. Iterative development practices, including XP, and object thinking are consistent with the hermeneutic worldview. Unfortunately, most object, XP, and agile practitioners are unaware of this tradition and its potential for providing philosophical support and justification for their approach to software development.
Hermeneutics, strictly speaking, is the study of interpretation, originally the interpretation of texts. The term is used in religious studies, where the meaning of sacred texts, written in archaic languages and linguistic forms, must be interpreted for a contemporary audience. Husserl, Heidegger, Gadamer, Dilthey, and Vygotsky are among the best-known advocates of hermeneutic philosophy.
Hermeneutics (her-me-NOO-tiks or her-me-NYOO-tiks) is derived from the name of the Greek god Hermes, the messenger or god of communication. It is a difficult name and does not flow off the tongue as easily as formalism. Unfortunately, there is no comfortable alternative term to use. Most of the philosophers most closely associated with this school of thought, excepting Heidegger, are probably unknown to most readers, and there isn't space to fully explicate their ideas in this book. It's strongly suggested, however, that your education, and your education as a software developer in particular, will not be complete without a reasonably thorough understanding of those ideas. The bibliography contains a section of references that can help the reader begin exploring hermeneutics.
The ideas of the hermeneutic philosophers are frequently illustrated with examples from linguistics, but hermeneutic principles are not limited to that domain. For example, words (the tokens of thought, according to formalists) do not have clear and unambiguous meaning. The meaning (semantics) of a word is negotiated, determined by those using it at the time of its use. Semantics are ephemeral, emerge from the process of communication, and are partially embodied in the minds of those involved in creating them.
According to the hermeneutic position, a document (say, a Unified Modeling Language (UML) class diagram) has semantic meaning only to those involved in its creation. However precise and correct the syntax of such a diagram may be, a significant portion of its meaning (its semantics) is not in the diagram but exists only in the minds of the developers who created it. If a team of analysts and designers creates a UML diagram and passes it to a team of programmers to implement, the programmers will not be able to discern the creators' meaning. The programmers will, of necessity, have to interpret the document and find their own meaning (semantics) in its syntax. And of course, it is the programmers' semantics, not the analysts'/designers' syntax, that actually gets implemented.
The hermeneutic conception of the natural world claims a fundamental nondeterminism. Hermeneuticists assert that the world is more usefully thought of as self-organizing, adaptive, and evolutionary, with emergent properties. Our understanding of the world, and hence the nature of systems we build to interact with that world, is characterized by multiple perspectives and constantly changing interpretation. Contemporary exemplars of this paradigm are Gell-Mann, Kauffman, Langton, Holland, Prigogine, Wolfram, Maturana, and Varela.
Murray Gell-Mann, Stuart Kauffman, Christopher Langton, and John Holland are closely associated with the Santa Fe Institute and the study of complexity and artificial life. This new discipline challenges many of the formalist assumptions behind classical science, suggesting that significant portions of the real world must be understood by using an alternative paradigm based on self-organization, emergent properties, and nondeterminism. Ilya Prigogine is a Nobel Prize-winning physicist (as is Gell-Mann) whose work laid many of the foundations for the study of emergent and chaotic physical systems. Stephen Wolfram, developer of Mathematica and an expert in cellular automata, has recently published A New Kind of Science (Wolfram Media, Inc., 2002), which suggests that all we know can best be explained in terms of cellular automata and emergent systems. Humberto Maturana and Francisco Varela are proponents of a new biology consistent with complex systems theory and are collaborators with Terry Winograd on a hermeneutic theory of design strongly influenced by the philosophy of Heidegger.
As exotic and peripheral as these ideas may seem, they have been at the heart of several debates in the field of computer science. One of the best examples is found in the area of artificial intelligence: the formalists, represented by Newell and Simon, arguing with the Dreyfus brothers and others representing hermeneutic positions.
Allen Newell (before his death) and Herbert Simon were among the leading advocates of traditional artificial intelligence: the theory that both humans and machines are instances of physical symbol systems. According to them, both humans and machines think by manipulating tokens in a formal way (Descartes redux), and therefore it is perfectly possible for a digital computer to think as well as (actually better than) a human being. Hubert L. Dreyfus, working with his brother, Stuart, has been one of the most vocal and visible critics of traditional AI. What Computers Can't Do (HarperCollins, 1979) and What Computers Still Can't Do (MIT Press, 1992), written by Dreyfus, present arguments based on the work of Husserl and Heidegger against the formalist understanding of cognition.
Another example centers on the claim for emergent properties in neural networks. Emergence is a hermeneutic concept inconsistent with the formalist idea of a rule-governed world. Arguments about emergence have been heated. The stronger the claim for emergence by neural network advocates, the greater the opposition from formalists. Current work in cellular automata, genetic algorithms, neural networks, and complexity theory clearly reflects hermeneutic ideas.
Marvin Minsky is another leading advocate of traditional AI. Originally, he was vehemently against the idea of emergent properties in systems, a view that seems to have softened in later years. His book Society of Mind (Touchstone Books, 1988) attempted to use object-oriented programming ideas to develop a theory of cognition that could rely on interactions of highly modularized components without the need for emergent phenomena.
The hermeneutic philosopher sees a world that is unpredictable, biological, and emergent rather than mechanical and deterministic. Mathematics and logic do not capture some human-independent truth about the world. Instead, they reflect the particularistic worldview of a specific group of human proponents. Software development is neither a scientific nor an engineering task. It is an act of reality construction that is political and artistic.
As you can see, formalism and hermeneutics contest each other's basic premises and assumptions about the nature of the universe and the place of humanity within that universe. Challenges to basic assumptions are frequently challenges to core values as well. And fundamental assumptions and core values are seldom examined; like articles of faith, they are blindly defended.
As noted previously, Western culture in general is largely formalist (using the labels rationalist and scientific rather than formalist) in its orientation. Anything challenging this position is viewed with suspicion and antagonism. It is for this reason that the conflict between hermeneutic and formalist worldviews frames the debate about an object paradigm.
A quick glance around the development room reveals some interesting graphical models drawn on paper or on the whiteboard. We also see people pointing to and modifying these models as they engage in developing tests or writing code. Figure 2-3 and Figure 2-4 are examples of two such artifacts. Figure 2-3 is just a rectangle with some text, but it seems to be a model of an object that one team is working on. Figure 2-4 consists of some labeled boxes, lines, and arrows; judging from the conversation of the development pair, it's a model of object interactions that they are trying to understand as a basis for writing a test.
Both of these models are sketches; there is nothing formal about them. Both are tools to facilitate communication among the developers. Neither tool is likely to be useful to those outside the team. Neither contains any truth. Both are ways for the developers to explore and share the thinking in their individual heads with each other. Both provide a kind of external memory for those involved in the development activities at hand. Neither is worth keeping around once the task that prompted their creation is completed.
These artifacts are quite consistent with hermeneutic philosophy and at odds with formalist philosophy. They are quite useful for extreme programmers.
XP is the most recent example of a series of attempts to apply hermeneutic, human-centric, and aformal ideas to software development. Some antecedents include the Greek versus Roman software development cultures identified by Robert L. Glass and the associated debates over the role of creativity in software development, the conflicts between the "fuzzies" and the "neats" in AI, and the classic debates between devotees of Smalltalk and C++.
A prolific writer and chronicler of ideas in software development as well as a leading practitioner, Robert L. Glass appears frequently in publications ranging from ACM Communications to his own newsletter, The Software Practitioner. His extensive experience in the real world and the world of academe makes his insights into the conflict between how it is done and how theorists think it is done invaluable for everyone involved in any kind of software development. One of his main themes is that practice requires a great deal more creativity and experimentalism than computer scientists, software engineers, and academicians are willing to publicly acknowledge. His latest work, Facts and Fallacies of Software Engineering (Addison-Wesley, 2003), presents in a concise and highly readable fashion many of the critiques of formalist software development methodology that are presented here as foundation positions for object thinking and extreme programming.
More recently, Michael McCormick notes that
What XP uncovered (again) is an ancient, sociological San Andreas Fault that runs under the software community: programming versus software engineering (a.k.a. the scruffy hackers versus the tweedy computer scientists). XP is only the latest eruption between opposing continents.
XP is the latest assertion of the view that people matter. XP is the latest challenger to the dominant (and hostile) computing and software engineering culture. XP is the latest attempt to assert that developers can do the highest-quality work using purely aformal methods and tools. And XP is the latest victim of the opprobrium of the formalists and the latest approach to be described as suitable only for dealing with small, noncritical problems.
To the extent that objects (our present focus) are seen as an expression of a hermeneutic point of view, they have been characterized as antirationalist, or at best nonrationalist, challenges to the prevailing philosophy. It is my assertion that objects are, or are perceived to be, a reflection of hermeneutic philosophy. One way to test this assertion is to compare objects with other ideas about software development that are clearly hermeneutic, for example, postmodernism.
 Do not confuse formalism with the use of formal tools, such as mathematics, that are employed in the cited fields of study.
 Glass, Robert L. Software Creativity . Englewood Cliffs, NJ: Prentice Hall, 1995.
 McCormick, Michael. Programming Extremism. Communications of the ACM 44(6), June 2001, 109-110.