The Object-Oriented Development Process

The object-oriented (OO) approach to design and programming, introduced in the 1980s, represents a major paradigm shift in software development, and it will continue to shape the field for many years. Unlike traditional programming, which separates data and control, object-oriented programming is based on objects, each of which consists of a set of defined data and a set of operations (methods) that can be performed on that data. Like the paradigm of structured design and functional decomposition, the object-oriented approach has become a major cornerstone of software engineering. In the early days of OO technology deployment (from the late 1980s to the mid 1990s), much of the OO literature concerned analysis and design methods; there was little information about OO development processes. In recent years object-oriented technology has been widely accepted, and object-oriented development is now so pervasive that its viability is no longer in question.

Branson and Herness (1992) proposed an OO development process for large-scale projects that centers on an eight-step methodology supported by a mechanism for tracking, a series of inspections, a set of technologies, and rules for prototyping and testing.

The eight-step process is divided into three logical phases:

  1. The analysis phase focuses on obtaining and representing customers' requirements concisely, so as to visualize an essential system that represents the users' requirements independent of the implementation platform (hardware and software environment).
  2. The design phase involves modifying the essential system so that it can be implemented on a given set of hardware and software. Essential classes and incarnation classes are combined and refined into the evolving class hierarchy. The objectives of class synthesis are to optimize reuse and to create reusable classes.
  3. The implementation phase takes the defined classes to completion.

The eight steps of the process are summarized as follows:

  1. Model the essential system: The essential system describes those aspects of the system required for it to achieve its purpose, regardless of the target hardware and software environment. It is composed of essential activities and essential data. This step has five substeps:

    • Create the user view.
    • Model essential activities.
    • Define solution data.
    • Refine the essential model.
    • Construct a detailed analysis.

This step focuses on the user requirements. Requirements are analyzed, dissected, refined, combined, and organized into an essential logical model of the system. This model is based on the perfect technology premise: the model assumes perfect technology, so it captures what the system must do independent of how any particular technology will do it.

  2. Derive candidate-essential classes: This step uses a technique known as "carving" to identify candidate-essential classes and methods from the essential model of the whole system. A complete set of data-flow diagrams, along with supporting process specifications and data dictionary entries, is the basis for class and method selection. Candidate classes and methods are found in external entities, data stores, input flows, and process specifications.
  3. Constrain the essential model: The essential model is modified to work within the constraints of the target implementation environment. Essential activities and essential data are allocated to the various processors and containers (data repositories). Activities are added to the system as needed, based on limitations in the target implementation environment. The essential model, when augmented with the activities needed to support the target environment, is referred to as the incarnation model.
  4. Derive additional classes: Additional candidate classes and methods specific to the implementation environment are selected based on the activities added while constraining the essential model. These classes supply interfaces to the essential classes at a consistent level.
  5. Synthesize classes: The candidate-essential classes and the candidate-additional classes are refined and organized into a hierarchy. Common attributes and operations are extracted to produce superclasses and subclasses (see the sketch after this list). Final classes are selected to maximize reuse through inheritance and importation.
  6. Define interfaces: The interfaces, object-type declarations, and class definitions are written based on the documented synthesized classes.
  7. Complete the design: The design of the implementation module is completed. The implementation module comprises several methods, each of which provides a single cohesive function. Logic, system interaction, and method invocations to other classes are used to accomplish the complete design for each method in a class. Referential integrity constraints specified in the essential model (using the data model diagrams and data dictionary) are now reflected in the class design.
  8. Implement the solution: The implementation of the classes is coded and unit tested.
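
To make step 5 concrete, the following minimal Python sketch shows what extracting common attributes and operations into a superclass might look like. The class names and attributes are invented for illustration; they are not part of the Branson and Herness methodology itself.

```python
# Hypothetical "after" picture of class synthesis: the common attributes and
# operations of two candidate classes have been extracted into a superclass,
# and the candidates have become subclasses that keep only what differs.

class Account:
    """Superclass produced by synthesis: holds what the candidates share."""

    def __init__(self, owner: str, balance: float = 0.0) -> None:
        self.owner = owner
        self.balance = balance

    def deposit(self, amount: float) -> None:
        self.balance += amount


class CheckingAccount(Account):
    """Candidate class, now a subclass: adds only checking-specific data."""

    def __init__(self, owner: str, overdraft_limit: float = 0.0) -> None:
        super().__init__(owner)
        self.overdraft_limit = overdraft_limit


class SavingsAccount(Account):
    """Candidate class, now a subclass: adds only savings-specific behavior."""

    def __init__(self, owner: str, interest_rate: float = 0.01) -> None:
        super().__init__(owner)
        self.interest_rate = interest_rate

    def accrue_interest(self) -> None:
        self.deposit(self.balance * self.interest_rate)  # reuses the inherited method
```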

The analysis phase of the process consists of steps 1 and 2, the design phase consists of steps 3 through 7, and the implementation phase consists of step 8. Several iterations are expected during analysis and design. Prototyping may also be used to validate the essential model and to assist in selecting the appropriate incarnation. Furthermore, the process calls for several reviews and checkpoints to enhance control of the project. The reviews include the following:

  • Requirements review after the second substep of step 1 (model the essential system)
  • External structure and design review after the fourth substep of step 1 (refine the essential model)
  • Class analysis verification review after step 5
  • Class externals review after step 6
  • Code inspection after step 8, when code is complete

In addition to methodology, requirements, design, analysis, implementation, prototyping, and verification, Branson and Herness (1993) assert that the object-oriented development process architecture must also address elements such as reuse, CASE tools, integration, build and test, and project management. The Branson and Herness process model, based on their object-oriented experience at IBM Rochester, represents one attempt to deploy object-oriented technology in large organizations. Many more variations will surely emerge before a commonly recognized OO process model is reached.

Finally, the element of reuse merits more discussion from the process perspective, even in this brief section. Design and code reuse gives object-oriented development significant advantages in quality and productivity. However, reuse is not achieved automatically simply by using object-oriented development. Object-oriented development provides a large potential source of reusable components, but those components must be generalized before they are usable in new development environments. In terms of the development life cycle, generalization for reuse is typically treated as an "add-on" at the end of the project; yet generalization activities take time and resources, so they are easily cut when a project is under schedule pressure. Therefore, developing with reuse is what every object-oriented project aims for, but developing for reuse is difficult to accomplish. This reuse paradox explains why there is no significant amount of business-level reusable code despite the promises of OO technology, although many general-purpose reusable libraries exist. Organizations that intend to leverage the reuse advantage of OO development must therefore address this issue in their development process.
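
As an illustration of what "generalization" can involve, the following hypothetical Python sketch shows a project-specific class being reworked into a reusable one. The class names, fields, and format are invented for this example and do not come from the text.

```python
# Project-specific class: field names, separator, and header are hard-coded,
# so the class works only for the project it was written in.
class InvoiceExporter:
    def export(self, invoices: list) -> str:
        lines = ["invoice_id;customer;amount"]
        for inv in invoices:
            lines.append(f"{inv['invoice_id']};{inv['customer']};{inv['amount']}")
        return "\n".join(lines)


# Generalized class: the project-specific decisions have become parameters.
# Producing, documenting, and quality-checking this version is the extra,
# easily deferred work that "developing for reuse" demands.
class RecordExporter:
    def __init__(self, fields: list, separator: str = ";") -> None:
        self.fields = fields
        self.separator = separator

    def export(self, records: list) -> str:
        lines = [self.separator.join(self.fields)]
        for rec in records:
            lines.append(self.separator.join(str(rec[f]) for f in self.fields))
        return "\n".join(lines)
```

Here `RecordExporter(["invoice_id", "customer", "amount"])` reproduces the original behavior, and the project-specific class becomes a one-line special case; the point is that the parameterization and testing of the general version cost effort the current project does not itself need.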

Henderson-Sellers and Pant (1993) propose a two-library model for the generalization of reusable parts. The model addresses the problem of costing and is quite promising. The first step is to put project-specific classes from the current project "on hold" by placing them in a library of potentially reusable components (LPRC); the only cost to the current project is the identification of these classes. The second library, the library of generalized components (LGC), is the high-quality company resource. At the beginning of each new project, an early phase in the development process is an assessment of the classes in the LPRC and LGC in terms of their reuse value for the project. If they are of value, the additional spending on generalization is made: candidate parts in the LPRC undergo the generalization process and quality checks and are placed in the LGC. Because the reusable parts benefit the new project, it is reasonable to allocate the cost of generalization to that project's customer, for whom it will still be a savings.
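
A minimal sketch of this two-library flow, with invented names and deliberately simplified stand-ins for the generalization work and quality checks, might look like this in Python; it is only meant to make the LPRC-to-LGC movement concrete.

```python
from dataclasses import dataclass

# Sketch of the two-library model. The LPRC holds parts parked at the end of
# a project (identification cost only); the LGC holds generalized,
# quality-checked parts that form the company-wide resource.

@dataclass
class Part:
    name: str
    generalized: bool = False
    quality_checked: bool = False

lprc: list = []  # library of potentially reusable components
lgc: list = []   # library of generalized components

def end_of_project(candidates: list) -> None:
    """Park candidate parts; the only cost here is identifying them."""
    lprc.extend(candidates)

def start_of_new_project(has_reuse_value) -> None:
    """Assess parked parts; spend on generalization and quality checks only
    for those with reuse value for the new project, then promote them."""
    for part in list(lprc):
        if has_reuse_value(part):
            part.generalized = True       # stand-in for real generalization work
            part.quality_checked = True   # stand-in for real quality checks
            lprc.remove(part)
            lgc.append(part)
```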

As the preceding discussion illustrates, it may take significant research, experience, and ingenuity to piece together the key elements of an object-oriented development process and for it to mature. The Unified Software Development Process, developed by Jacobson, Booch, and Rumbaugh (1997) and owned by Rational Software Corporation, was published in 1997. The process relies on the Unified Modeling Language (UML) for its visual modeling standard. It is use-case driven, architecture-centric, iterative, and incremental. Use cases are the key components that drive this process model. A use case can be defined as a piece of functionality that gives a user a result of value, for example, "withdraw cash" in an automated teller system. All the use cases developed can be combined into a use-case model, which describes the complete functionality of the system and is analogous to the functional specification in a traditional software development process model. Use cases are developed with the users and are modeled in UML; they represent the requirements for the software and are used throughout the process model. The Unified Process is also described as architecture-centric: the architecture is a view of the whole design with its important characteristics made visible by leaving details out, and it works hand in hand with the use cases. Subsystems, classes, and components are expressed in the architecture and are also modeled in UML. Last, the Unified Process is iterative and incremental. Iterations represent steps in a workflow, and increments show growth in the functionality of the product. The core workflows for iterative development are:

  • Requirements
  • Analysis
  • Design
  • Implementation
  • Test

The Unified Process consists of cycles. Each cycle results in a new release of the system, and each release is a deliverable product. Each cycle has four phases: inception, elaboration, construction, and transition. A number of iterations occur in each phase, and the five core workflows take place over the four phases.

During inception, a good idea for a software product is developed and the project is kicked off. A simplified use-case model is created and project risks are prioritized. Next, during the elaboration phase, product use cases are specified in detail and the system architecture is designed. The project manager begins planning resources and estimating activities. All views of the system are delivered, including the use-case model, the design model, and the implementation model. These models are developed in UML and held under configuration management. Once this phase is complete, the construction phase begins: the architecture design grows into a full system, code is developed, and the software is tested. The software is then assessed to determine whether the product meets the users' needs, so that some customers can take early delivery. Finally, the transition phase begins with beta testing. In this phase, defects are tracked and fixed, and the software is transitioned to a maintenance team.

One very controversial OO process that has gained recognition and generated vigorous debate among software engineers is Extreme Programming (XP), proposed by Kent Beck (2000). This lightweight, iterative, and incremental process has four cornerstone values: communication, simplicity, feedback, and courage. On this foundation, XP advocates the following practices:

  • The Planning Game: Development teams estimate time, risk, and story order. The customer defines scope, release dates, and priority.
  • System metaphor: A metaphor describes how the system works.
  • Simple design: Designs are minimal, just enough to pass the tests that bound the scope.
  • Pair programming: All design and coding is done by two people at one workstation. This spreads knowledge through the team and provides constant peer review.
  • Unit testing and acceptance testing: Unit tests are written before the code, to make the intent of the code clear and to build up a complete library of tests (see the test-first sketch after this list).
  • Refactoring: Code is refactored before and after implementing a feature to help keep the code clean.
  • Collective code ownership: By switching teams and seeing all pieces of the code, all developers are able to fix broken pieces.
  • Continuous integration: The more code is integrated, the more likely it is to keep running without big hang-ups.
  • On-site customer: An onsite customer is considered part of the team and is responsible for domain expertise and acceptance testing.
  • 40-hour week: Stipulating a 40-hour week ensures that developers remain alert.
  • Small releases: Releases are small but contain useful functionality.
  • Coding standard: Coding standards are defined by the team and are adhered to.
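
As a concrete illustration of the test-first practice, here is a minimal, hypothetical Python sketch using the standard unittest module; the function under test is invented for this example.

```python
import unittest

# Step 1 (written first): the tests state the intended behavior of the
# not-yet-written code, and initially fail.
class TestOrderTotal(unittest.TestCase):
    def test_total_sums_line_items(self):
        self.assertEqual(order_total([("apple", 2, 0.50), ("bread", 1, 2.00)]), 3.00)

    def test_total_of_empty_order_is_zero(self):
        self.assertEqual(order_total([]), 0.00)

# Step 2 (written second): just enough code to make the tests pass,
# in keeping with XP's "simple design" practice.
def order_total(line_items):
    """Return the total price of (name, quantity, unit_price) line items."""
    return sum(quantity * unit_price for _, quantity, unit_price in line_items)

if __name__ == "__main__":
    unittest.main()
```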

According to Beck, because these practices balance and reinforce one another, implementing all of them in concert is what makes XP extreme. With these practices, a software engineering team can "embrace change." Unlike other evolutionary process models, XP discourages preliminary requirements gathering, extensive analysis, and design modeling. Instead, it intentionally limits planning for future flexibility, promoting a "You Aren't Gonna Need It" (YAGNI) philosophy that emphasizes fewer classes and reduced documentation. The XP philosophy and practices appear to be more applicable to small projects. For large and complex software development, some XP principles become harder to implement and may even run against traditional wisdom built on successful projects. Beck stipulates that to date XP efforts have worked best with teams of ten or fewer members.
