Two Decades of Object Methodology


Scores of object development methods have been advanced since Alan Kay coined the term object-oriented and applied it to software. The first methods proposed clearly reflected the same presuppositions and philosophy that provided foundations for the Smalltalk programming language: specifically, an orientation toward understanding objects in terms of their behavior. One early method, by Adele Goldberg and Ken Rubin, originated within ParcPlace Systems (a spinoff from Xerox PARC, which developed and marketed the first version of Smalltalk). Object behavior analysis (OBA) was described in published papers but was never truly marketed as a method. An automated support product for OBA was developed but not, at that time, sold. [2]

start sidebar
Behind the Quotes: Adele Goldberg

Adele Goldberg was a well-known figure in the Smalltalk world from the time the language was conceived at Xerox PARC until it was eclipsed by Java. Her concerns with education helped shape the evolution of Smalltalk, and she contributed to its leaving PARC and becoming a commercial language. When PARC spun off ParcPlace Systems, she became a board member and later president of the company. With Ken Rubin, she authored a behavioral object analysis method (object behavior analysis, or OBA) and later a guide to object-oriented project management entitled Succeeding with Objects. While Goldberg was at ParcPlace Systems, the company acquired a version of Smalltalk with a much larger user base (from a company named Digitalk), and efforts were made to merge the languages and their customer bases. That effort failed, in part because ParcPlace expended resources trying to sidetrack yet another competitor (a company named OTI, allied with IBM), stopping the momentum of Smalltalk (which was threatening to become the next COBOL) and allowing Java (a pale imitation of Smalltalk at that time) to succeed. Goldberg left ParcPlace and founded a company and product named LearningWorks, which used the Smalltalk environment to teach computing and computer programming. For a time, the Open University in England adopted the Smalltalk environment as a foundation for its programming courses.

end sidebar
 

The year 1991 was a watershed for object methods in terms of books published. Booch, Coad and Yourdon, Jacobson, Rumbaugh, Shlaer and Mellor, and Wirfs-Brock published books describing various methods. All of these methods could be characterized as first generation: they were produced quasi-independently and exhibited significant syntactic variation.

A great deal of argument ensued. Which method was best? Much of that argument coincided with arguments about implementation languages. It was common for methods to be closely associated with languages and praised or damned based on that association. For example, Booch's method (which was originally written to support Ada development) was preferred in the C++ community, while Wirfs-Brock was more popular with the Smalltalk crowd.

In addition to arguing with each other, methodologists took note of one another's work and recognized omissions in their own. Points of concurrence were noted. Compliments were noted. Booch, for instance, specifically noted the value of Class, Responsibility, Collaborator (CRC) cards for object discovery and preliminary definition in the second edition of his method book. Other efforts were made to extend the first-generation methods to support specific types of problems: real-time systems and distributed systems, for example.

At the same time, The Market demanded a single right answer as to which method to adopt. Booch, Rumbaugh, and Jacobson, coming together under the Rational corporate umbrella, were quick to consolidate their methods and models in order to present a unified method. Given the sobriquet The Three Amigos, they advanced their cause. Their tool sets and methods became the foundation for the Unified Modeling Language (UML) and the Rational Unified Process (RUP), a tool and a method that dominate official [3] practice today.

A review of all of the methods advanced in the nineties is not possible in this space. It is possible to group the plethora of methods into three general categories (data-driven, software engineering, and behavioral) and look at exemplars of each category in order to outline the main points of divergence.

Data-driven approaches are exemplified by the work of Shlaer and Mellor, the Object Modeling Technique (OMT) of Rumbaugh et al., and the joint work of Coad [4] and Yourdon. The central focus of this category of methods is data and the distribution of that data, via the classical rules of normalization, across a set of objects. Objects in this instance are equivalent to entities in classical data models. Shlaer and Mellor, for example, describe the process for discovering objects in a manner that is indistinguishable from data modeling techniques.

OMT syntax and models allow greater variation and flexibility in relationships among objects than do the models of Shlaer and Mellor, capturing interaction among objects in non-data-specific ways, but they still define objects in terms of data attributes. Coad and Yourdon's object-oriented analysis (OOA) method essentially advocates the creation of an entity diagram, placing functions with the distributed data attributes and adding message passing to the data model.

Data-driven methods do not bring about the object paradigm shift. Instead, they carry forward a legacy abstraction, data, and use it as the basis for decomposing the world. Followers of this type of method think like data or think like a (relational) database. This obviously has the appeal of familiarity and consistency with legacy systems and provides the appearance of a quick path to objects. It's also quite consistent with the definition of an object as a package of data and methods.

It isn't consistent with decomposition of the world in a natural way, because the world isn't composed of computationally efficient data structures. There are no natural joints in the domain that map to normalized entities. This disjunction is evident in the world of data modeling itself: just consider the differences between a conceptual data model (which models data as understood in the domain) and a logical data model (which models data as normalized for implementation).

Data-driven methods tend to create objects with more frequent and tighter coupling than do other methods. Class hierarchies are therefore more brittle, and changes in class definitions tend to have greater impact on other class definitions than is desirable. (See Figure 6-1 for an example.) Distributing data across a set of objects in accordance with normalization rules, and following the dictum that a new class must be created whenever an entity has at least one attribute different from any existing class, needlessly multiplies the number of classes required to model a domain. (See Figure 6-2.)

Figure 6-1: In this Department of Motor Vehicles example (patterned after Coad and Yourdon, 1991), data modeling rules lead the designer to place the fee attribute, and therefore the calculate fee method, in the LegalDocument class. This mandates the need for a case statement to handle fee calculations and couples LegalDocument to the entire Vehicle class hierarchy.
Figure 6-2: In this example, the addition of a single new attribute requires the creation of a new class; allowing some customers to buy on credit, and therefore to carry a creditLimit attribute, is one such case.
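
To make the figures concrete, the following is a minimal sketch (in Python, which is not a language used in this book; the class names follow the figure captions, while the fee values and the CreditCustomer example are invented for illustration). The first half shows the Figure 6-1 problem: because the fee lives in LegalDocument, fee calculation must branch on every vehicle type, so the shape of the Vehicle hierarchy leaks into LegalDocument. The second half shows the behavioral alternative and the Figure 6-2 class proliferation.

    # Hypothetical illustration of Figure 6-1: the fee attribute (and thus
    # fee calculation) lives in LegalDocument, so a case statement over the
    # Vehicle hierarchy is unavoidable.
    class Vehicle: pass
    class Car(Vehicle): pass
    class Truck(Vehicle): pass

    class LegalDocument:
        def calculate_fee(self, vehicle):
            # Every new Vehicle subclass forces an edit here; LegalDocument
            # is coupled to the entire hierarchy.
            if isinstance(vehicle, Truck):
                return 90.00   # fee values are invented
            if isinstance(vehicle, Car):
                return 50.00
            raise ValueError("unknown vehicle type")

    # A behavioral (responsibility-driven) alternative: the responsibility
    # "know your registration fee" belongs to the vehicle, so the document
    # never inspects the hierarchy and new vehicle types require no change
    # to it.
    class PolymorphicVehicle:
        def registration_fee(self):
            raise NotImplementedError

    class PolymorphicCar(PolymorphicVehicle):
        def registration_fee(self):
            return 50.00

    class BehavioralLegalDocument:
        def calculate_fee(self, vehicle):
            return vehicle.registration_fee()

    # Hypothetical illustration of Figure 6-2: under the "one differing
    # attribute means a new class" rule, letting some customers buy on
    # credit forces a new subclass whose only reason for existing is to
    # hold creditLimit.
    class Customer:
        def __init__(self, name):
            self.name = name

    class CreditCustomer(Customer):
        def __init__(self, name, credit_limit):
            super().__init__(name)
            self.credit_limit = credit_limit

The brittleness described above is visible in the first version: any change to the Vehicle hierarchy ripples into LegalDocument, whereas the behavioral version confines such changes to the vehicle classes themselves.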

Booch and Jacobson exemplify software engineering methods. They openly acknowledge an intellectual debt to classical structured development and claim to be an appropriate response to the demand by management and government for a documented (formal) process for software development. They are characterized by multiple models that, when (and if) integrated, present a comprehensive specification of a desired software artifact. Supporting code generation is a frequently stated motive for these methods.

Most software engineering methods claim to be (partly) behavior driven. Jacobson's use cases are essentially identical, in fact, to the scenarios of OBA, a prototypical behavioral method. Because use cases are focused on discovering requirements to be used to define an artifact (the software), they tend to fall into thinking about an object as encapsulated data and procedures: they reflect thinking like a computer.

Nothing in these methods compels a design to reflect computer thinking. However, the replication of modeling syntax from structured development and the vocabulary employed in discussion (for example, attributes and operations) reflect the artifact-centric point of view borrowed from structured development. Developer focus remains on the artifact and its design to meet specification, rather than on understanding and modeling the problem domain. [5]

Software engineering methods (with the exception of Booch's 1991 book) fail to stress the paradigmatic differences [6] between objects and traditional modules and data structures. Even use cases, as developed by Jacobson, are less about responsibility discovery and assignment than they are about requirements gathering.

Behavioral methods, despite being the first and the most closely allied with the development of object thinking, have always received less attention than data-driven and software engineering methods. Proponents of other methods are quick to assert that this lack of popularity reflects fundamental flaws or lack of utility in behavioral approaches. They are wrong: other factors account for the failure of behavioralism to capture market share.

First, behavioralism was not promoted as competing methods were. Important advocates of behavioral ideas (Goldberg and Rubin at PARC) and the originators of the most widely used behavioral modeling technique (Beck and Cunningham, with CRC cards) never published any kind of method book. At a time when management almost demanded an automated tool to accompany and support any new method (this period was also known as the era of CASE tools), the following occurred:

  • ParcPlace Systems chose not to market the tool (OBA) it had created, although it did provide the tool to those taking its object analysis and design seminars.

  • Beck and Cunningham actively lobbied against the creation of any kind of automated CRC tool and, today, argue against automation of story cards in XP. Their objective was quite sound, philosophically, but disastrous in terms of marketing.

  • Knowledge Systems Corporation developed a tool named coDesign, based on Smalltalk, that was highly regarded by those who saw it demonstrated. It never became a commercial product because of financial and political concerns within KSC.

  • Corporate information technology organizations were actually moving to adopt Smalltalk at this time but refused to give up their relational databases, so tool vendors concentrated on object-relational mapping tools instead of behavioral modeling tools.

The first, and for a long time only, book promoting the behavioral approach to objects was that of Wirfs-Brock, Wilkerson, and Wiener. Nancy Wilkinson published a small volume outlining classical CRC cards a few years later, long after the opposition had all but won the method war.

Second, and more important, behavioral approaches were the least developed in terms of expressiveness of models and making the transition from analysis to design and implementation in a traceable fashion. It would not be misleading to characterize behavioral methods, circa 1991, as CRC cards, then Smalltalk.
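
For readers who have never seen one, a CRC card is simply an index card with three fields: the class name, its responsibilities, and its collaborators. A hypothetical card for the DMV example used earlier, and the class skeleton it maps onto (sketched here in Python rather than Smalltalk; the responsibilities and collaborators are invented for illustration), might read as follows.

    # A hypothetical CRC card:
    #
    #   Class:            Vehicle
    #   Responsibilities: know my registration fee
    #                     know my owner
    #   Collaborators:    LegalDocument, Owner
    #
    # The card translates almost directly into a class skeleton, which is
    # why "CRC cards, then Smalltalk" was workable, if underspecified, as
    # a method.
    class Vehicle:
        def registration_fee(self): ...
        def owner(self): ...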

Wilkinson, in fact, suggests that the primary value of CRC cards and behavioralism is to provide an informal and rapid way to obtain input for more formal software engineering methods. Booch, OMT, and OOSE (Jacobson's Object-Oriented Software Engineering) were the formal software engineering methods she seemed to have in mind.

Third, behavioral approaches are the most alien to established software developers, even today. Thinking like an object is very different from traditional conceptualizations of the software development process. It requires, at least at the beginning, constant diligence to avoid falling into old mental habits. It is hard. Most people are unwilling to engage in this kind of hard cognitive work without a compelling argument as to why it is worth their while. That argument was never made, except in a kind of oral tradition shared by a very small community. This book is an attempt to capture several aspects of that oral tradition. As with XP, the primary justification for object thinking derives from a better fit between information technology and business and from significant reductions in complexity.

In fact, behavioralism had the same reputation as XP does today: as being anti-method. There is a small element of truth in this assertion. Both OO and XP proponents oppose the use of methods as traditionally defined, especially comprehensive, highly formalized, and labor-intensive methods that prevent developers from immediate engagement with code, with programming itself.

This does not mean that object thinking lacks anything resembling a method. (The same is true of XP.) It merely means that the rigor and the systematization of work that accompany object thinking are quite different from what most developers have come to associate with the term method.

To understand the relationship between method and XP, it's necessary to take a few moments and reflect on how method provides value to software developers.

[2] Very late in the life of ParcPlace Systems (the spinoff that marketed the Smalltalk developed at PARC), the OBA tool was sold, under the name MethodWorks. But by that time, it had no hope of capturing any share of the development tools market. MethodWorks provided a set of forms, consistent with the OBA models, containing textual documentation about the nature and relationships of objects.

[3] A huge discrepancy remains between what developers officially use and what they actually use, in terms of both method and model. Licenses sold still do not equate to actual use by developers.

[4] Coad's later and independent work was far more focused on behavior and patterns of behavior among groups of objects.

[5] Only nominal attention is paid to the process of developing a domain class hierarchy as a tangible product in most software engineering methods.

[6] Booch is a notable exception to this rule. He clearly recognizes that a fundamental shift in thinking is required for object development. He also clearly recognizes the need to decompose the world based on the domain expert's point of view. He parallels Parnas's views on domain-centric (design-centric) decomposition. His method, however, does not develop or directly support this aspect of object development. Instead, his models and syntax are directly focused on what is needed to construct the software artifact.



