Without adequate documentation, the system can neither be utilized efficiently nor maintained properly. IT departments have been notoriously lax about developing and then maintaining documentation.
Documentation within the context of configuration management has the following attributes:
This chapter discusses the relationship between proper documentation techniques and related configuration status accounting procedures.
Configuration identification provides a unique identity to a product and its associated documentation. This uniqueness enables users and systems people to:
Configuration documentation consists of, but is not limited to:
As one can see, configuration documentation is consistent with the documentation produced throughout the life cycle of a typical systems development effort.
Configuration identification requires an organization to develop a nomenclature for naming or numbering systems. The nomenclature should adhere to the following rules:
The structure is the hierarchy of a product from the highest level to the lowest. Often represented as a tree, each level references its associated configuration documentation.
There are two levels of product and, thus, two levels of product identifier. End users need a product identifier so that they can order or discuss a particular product or part of that product. The developer, on the other hand, will be privy to internal documentation, such as test plans and system specifications, that the end user will never see.
Every industry and many companies develop their own nomenclature. The scheme might consist of:
An effective configuration identification methodology will enable:
Configuration management enforces the stability of product releases. All maintenance and modifications are usually variations of an agreed-upon baseline. Therefore, it is important to establish and then document the baseline of a system prior to any system modification. A baseline is established by agreeing to the stated definition of a product's attributes. Any deviation from this baseline is managed through the configuration management change process.
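To make the ideas of unique identification and baselining concrete, here is a minimal sketch in Python. The `<line>-<component>-<revision>` numbering scheme, the class names, and the payroll example are illustrative assumptions, not a prescribed nomenclature; an organization would substitute its own scheme.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class ConfigurationItem:
    """One uniquely identified product element and its documentation."""
    product_line: str       # e.g., "PAY" for a hypothetical payroll product line
    component: str          # e.g., "RPT" for its reporting component
    revision: int           # increments with each approved change
    documents: tuple = ()   # associated configuration documentation

    @property
    def identifier(self) -> str:
        # Assumed nomenclature: <line>-<component>-<three-digit revision>
        return f"{self.product_line}-{self.component}-{self.revision:03d}"

@dataclass
class Baseline:
    """An agreed-upon set of configuration items, frozen on a given date."""
    name: str
    established: date
    items: list

    def contains(self, identifier: str) -> bool:
        return any(ci.identifier == identifier for ci in self.items)

# Establish a baseline; any later deviation becomes a new revision that must
# pass through the configuration management change process.
ci = ConfigurationItem("PAY", "RPT", 1, ("requirements spec", "design spec"))
baseline = Baseline("Release 1.0", date(2002, 6, 1), [ci])
print(ci.identifier, baseline.contains("PAY-RPT-001"))  # PAY-RPT-001 True
```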
It is obvious that enforcing configuration management within a company requires the use of automated systems to keep track of the various configuration information about the product.
Configuration status accounting (CSA) enables appropriate users to review this information quickly and effectively to:
Table 7.1 shows typical status accounting information across a product life cycle as per EIA-649.
Table 7.1: Typical CSA Information across Life-Cycle Phases (per EIA-649)

Life-cycle phases: Conception, Definition, Build, Distribution, Operation, and Disposal. Typical CSA information (selected where applicable and appropriate in each phase) includes:

- Requirements documentation
- Product structure information
- Configuration documentation
- Configuration documentation change notice
- Change request and proposal
- Engineering change effectivity
- Variance documentation
- Verification and audit action item status
- Event date entries
- Product as-built record
- Product as-delivered record
- Product warranty information
- Product as-maintained and as-modified record
- Limited use, shelf-life restrictions, etc.
- Product operation and maintenance information revision status
- Product information change requests and change notices
- Online information access directory or index
- Restrictions due to facility/product performance degradation
- Product replacement information
- Environmental impact information (when applicable)
- Product or parts salvage information
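The automated tracking alluded to above might, in its simplest form, store status-accounting entries like those in Table 7.1 as structured records. The sketch below is illustrative only; the CsaRecord fields, the status values, and the open_items query are assumptions rather than an EIA-649 requirement.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CsaRecord:
    """One status-accounting entry for a configuration item."""
    item_id: str                     # configuration item identifier
    info_type: str                   # e.g., "Change request and proposal"
    phase: str                       # Conception, Definition, Build, ...
    status: str                      # assumed values: "open", "approved", ...
    event_date: date
    reference: Optional[str] = None  # document or change-notice number

def open_items(records, phase):
    """Return the entries still awaiting action in a given life-cycle phase."""
    return [r for r in records if r.phase == phase and r.status == "open"]

log = [
    CsaRecord("PAY-RPT-001", "Change request and proposal", "Build",
              "open", date(2002, 3, 4), "CR-0042"),
    CsaRecord("PAY-RPT-001", "Verification and audit action item status",
              "Build", "approved", date(2002, 3, 11)),
]
print(open_items(log, "Build"))  # only the CR-0042 entry is still open
```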
Documentation promotes software quality. There are numerous, well-documented reasons for this. David Tuffley, a consultant who specializes in producing high-quality documentation to a predefined standard, says that consistent, accurate project documentation is known to be a major factor contributing to information systems quality. He goes on to say that document production, version control, and filing are often not performed, contributing to a higher number of software defects that affect both the real and perceived quality of the software, as well as to time and expense spent on rework and higher maintenance costs [Tuffley 2002].
Marcello Alfredo Visconti [1993], in a dissertation that proposes a Software System Documentation Process Maturity Model, argues that one of the major goals of software engineering is to produce the best possible working software along with the best possible supporting documentation.
Decades' worth of empirical data shows that the software documentation process and its products are key components of software quality. These studies show that poor-quality, out-of-date, or missing documentation is a major cause of errors in subsequent software development and maintenance. For example, the majority of defects discovered during integration testing are design and requirements defects (i.e., defects in documentation that were introduced before any code was written).
Visconti's four-level documentation maturity model provides the basis for an assessment of an organization's current documentation process and identifies key practices and challenges to improve the process. The four-level enhanced model appears in Table 7.2. Key practices, as defined by Cook and Visconti [2000], are listed in Table 7.3.
Table 7.2: Documentation Process Maturity Model

| | Level 1: Ad hoc | Level 2: Inconsistent | Level 3: Defined | Level 4: Controlled |
|---|---|---|---|---|
| Keywords | Chaos, variability | Standards; check-off list; inconsistency | Product assessment; process definition | Process assessment; measurement; control feedback; improvement |
| Succinct description | Documentation not a high priority | Documentation recognized as important and must be done | Documentation recognized as important and must be done well | Documentation recognized as important and must be done well consistently |

Table 7.3: Key Practices, Indicators, and Challenges by Maturity Level

| | Level 1: Ad hoc | Level 2: Inconsistent | Level 3: Defined | Level 4: Controlled |
|---|---|---|---|---|
| Key practices | Ad hoc process; documentation not important | Inconsistent application of standards | Documentation quality assessment; documentation usefulness assurance; process definition | Process quality assessment and measures |
| Key indicators | Documentation missing or out-of-date | Standards established and use of check-off list | SQA-like practices | Data analysis and improvement mechanisms |
| Key challenges | Establish documentation standards | Exercise quality control over content; assess documentation usefulness; specify process | Establish process measurement; incorporate control over process | Automate data collection and analysis; continually striving for optimization |
An assessment procedure was developed to determine where an organization's documentation process stands relative to the model. This enables a mapping from an organization's past performance to a documentation maturity level and ultimately generates a documentation process profile. The profile indicates key practices for that level and identifies areas of improvement and challenges to move to the next-higher level.
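As a rough illustration of how such an assessment might map observed practices to a maturity level, consider the sketch below. It is not Visconti's actual questionnaire or scoring; the practice names and the "all lower levels satisfied" rule are assumptions made for illustration.

```python
# Illustrative only: the published assessment uses its own questionnaire and
# scoring; these practice names and the level-by-level rule are assumptions.
LEVEL_PRACTICES = {
    2: ["documentation standards exist", "check-off lists are used"],
    3: ["documentation quality is assessed", "documentation process is defined"],
    4: ["process is measured", "measurements feed improvement"],
}

def maturity_level(satisfied: set) -> int:
    """Highest level whose practices (and all lower levels') are satisfied."""
    level = 1
    for lvl in (2, 3, 4):
        if all(p in satisfied for p in LEVEL_PRACTICES[lvl]):
            level = lvl
        else:
            break
    return level

profile = {"documentation standards exist", "check-off lists are used",
           "documentation quality is assessed"}
print(maturity_level(profile))  # 2: level 3 practices are only partially met
```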
Application of the model has a definite financial benefit. The software documentation maturity model and assessment procedure have been used to assess a number of software organizations and projects, and a cost/benefit analysis of achieving documentation maturity levels has been performed using COCOMO, yielding an estimated return on investment of about 6:1 when moving from the least mature level to the next. According to Visconti [1993], these results support the main claim of this research: software organizations that are at a higher documentation process maturity level also produce higher-quality software, resulting in reduced software testing and maintenance effort.
Although the majority of software documentation is produced manually, using word processing programs or tools such as Microsoft Visio, there are also systems designed to ease the process by producing "automatic" documentation. Some of these automatic documentation capabilities are subsets of a wider range of capabilities, as is the case with many computer-aided software engineering (CASE) tools. These products are designed to support development efforts throughout the software development life cycle (SDLC), with documentation being just one small part.
An example of one such tool is Hamilton Technologies' 001 (http://world.std.com/~hti/), a CASE tool (now more often called an application development tool) built around an intriguing methodology called "Development before the Fact" (DBTF). The premise behind 001 and DBTF is that developing systems in a quality manner from the outset yields systems with fewer errors. A notable feature of the 001 toolset is that it not only generates programming source code from maps (i.e., models) of a business problem, but also generates the documentation for the resulting system.
On one end of the documentation spectrum, one will find that many companies utilize no tools other than a word processor and some drawing tool to extract documentation out of their reluctant programmers. On the other end of the documentation spectrum, forward-thinking companies make significant investments in their software development departments by outfitting them with tool suites such as 001. The vast majority of organizations lie somewhere in between these two extremes.
The world of client/server computing has given the developer new opportunities and new decisions to make in terms of which toolset to use. When Microsoft Office was first introduced, it was primarily utilized for word processing. Today, Microsoft Access, the database component of the MS Office product set, has become a significant player in corporations that require a robust but less-complex database than the powerhouse database systems that run their back offices (e.g., Sybase, Oracle, and Microsoft SQL Server).
Microsoft Access enables the automated production of several kinds of documents describing the databases implemented with the program. These documents cover schemas, queries, and entity relationship diagrams (ERDs), as shown in Figure 7.1.
Figure 7.1: An Access Entity Relationship Diagram (ERD)
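The general idea behind automated database documentation, pulling schema facts out of the database catalog rather than writing them up by hand, can be sketched as follows. The example uses SQLite's catalog purely for illustration; Access exposes comparable metadata through its own Documenter facility, and the membership.db file and member table here are hypothetical.

```python
import sqlite3

def document_schema(db_path: str) -> str:
    """Produce a plain-text schema description from the database catalog."""
    conn = sqlite3.connect(db_path)
    lines = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")
    for (table,) in tables:
        lines.append(f"Table: {table}")
        # PRAGMA table_info returns (cid, name, type, notnull, default, pk)
        for _, name, col_type, _, _, pk in conn.execute(
                f"PRAGMA table_info({table})"):
            lines.append(f"  {name} {col_type}{' PRIMARY KEY' if pk else ''}")
    conn.close()
    return "\n".join(lines)

if __name__ == "__main__":
    # Hypothetical membership database used only to demonstrate the output.
    conn = sqlite3.connect("membership.db")
    conn.execute("CREATE TABLE IF NOT EXISTS member ("
                 "member_id INTEGER PRIMARY KEY, last_name TEXT)")
    conn.commit()
    conn.close()
    print(document_schema("membership.db"))
```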
Some products are dedicated solely to producing documentation. One such product is Doc-o-Matic from toolsfactory.com, which is designed to work with the Borland Delphi software development environment. The product works with Delphi's internal structures, and its documentation sections may include Author, Bugs, Conditions, Examples, Exceptions, History, Ignore, Internal, Notes, Parameters, Remarks, Return Value, See Also, Todo, and Version [Leahey 2002]. Doc-o-Matic has been compared to a gigantic parsing routine.
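A toy version of that kind of parsing might look like the sketch below. The section names come from the list above; the "@tag:" comment syntax and the parser itself are illustrative assumptions, not Doc-o-Matic's actual input format.

```python
import re

# The section names mirror those listed above; the "@tag:" syntax and this
# parser are illustrative assumptions, not Doc-o-Matic's real input format.
KNOWN_TAGS = {"Author", "Bugs", "Notes", "Parameters", "Remarks",
              "Return Value", "See Also", "Todo", "Version"}

def parse_doc_comment(comment: str) -> dict:
    """Split a structured comment into {section: text} pairs."""
    sections, current = {}, None
    for line in comment.splitlines():
        match = re.match(r"\s*@(\w[\w ]*?):\s*(.*)", line)
        if match and match.group(1) in KNOWN_TAGS:
            current = match.group(1)
            sections[current] = match.group(2)
        elif current:
            sections[current] += " " + line.strip()
    return sections

sample = """\
@Author: J. Smith
@Parameters: pothole_id - identifier of the pothole being repaired
@Return Value: total repair cost in dollars
"""
print(parse_doc_comment(sample)["Return Value"])  # total repair cost in dollars
```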
As software systems grow in size and sophistication, it becomes increasingly difficult for humans to understand them and anticipate their behavior, says Charles Robert Wallace [2000] in his dissertation, "Formal Specification of Software Using Abstract State Machines." Wallace argues that formal specification techniques aim to foster understanding and increase reliability by providing a mathematical foundation for software documentation. His technique calls for layering information onto a model through a series of refinements, essentially enabling a walk-through of the system before any code is written.
At present, many organizations practice a "hit-or-miss" form of software documentation. These are usually companies that have few or no policies and procedures and follow standards only loosely.
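The flavor of the approach, a system state plus guarded update rules that can be refined and walked through before any production code exists, can be suggested with a toy example. The login scenario and rule set below are assumptions for illustration, not Wallace's formal notation.

```python
# Toy illustration of the abstract state machine idea: a state plus guarded
# update rules. The login scenario is an assumption made for illustration.
def asm_step(state: dict) -> dict:
    """Apply every rule whose guard holds against the current state."""
    nxt = dict(state)
    if state["mode"] == "idle" and state["request"] == "login":
        nxt["mode"] = "authenticating"
    if state["mode"] == "authenticating" and state["credentials_ok"]:
        nxt["mode"] = "active"
        nxt["request"] = None
    return nxt

state = {"mode": "idle", "request": "login", "credentials_ok": True}
for _ in range(3):
    state = asm_step(state)
    print(state["mode"])   # authenticating, active, active
```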
Good software development is standards based and, thus, documentation must also be standards based.
At a minimum, software documentation should consist of the following items.
1. All Documentation Produced Prior to the Start of Code Development
Most projects go through a systems development life cycle. The life cycle often starts with a feasibility study, goes on to create a project plan, and then enters into the requirements analysis and system design phases. Each of these phases produces one or more deliverables, schedules, and artifacts. In sum, the beginnings of a systems documentation effort should include the feasibility study, project plan, requirements specification, and design specification, where available.
2. Program Flowcharts
Programmers usually, although not always, begin a programming assignment by drawing one or more flowcharts that diagram the "nuts and bolts" of the actual program. Where systems analysts use diagrammatic tools such as dataflow diagrams (DFDs) or Unified Modeling Language (UML) class diagrams (Figures 7.2 and 7.3, respectively) to depict the entire system at a physical design level, the programmer is often required to use flowcharts (Figure 7.4) to depict the flow of a particular component of the DFD or UML class diagram.
Figure 7.2: A Dataflow Diagram (DFD)
Figure 7.3: A UML Class Diagram
Figure 7.4: A Flowchart
3. Use or Business Cases
Item 1 above (all documentation produced prior to the start of code development) recommends including all documentation created during the analysis and design phases of the systems development effort. Use cases may or may not be part of these documents, although they should be. Use cases, an example of which is shown in Figure 7.5, describe a series of end-user procedures that make use of the system in question. For example, in a system that handles student registration, typical use cases might include a student logging in, a student registering for the first time, and a student requesting financial aid. Use cases are valuable in all phases of systems development: (1) during systems analysis, they help analysts understand what the end user wants from the new system; (2) during programming, they assist the programmer in understanding the logic flow of the system; and (3) during testing, they can form the basis of the preliminary test scripts.
Figure 7.5: A Sample Use Case
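Use cases also lend themselves to a lightweight structured form that can be replayed as a preliminary test script. The student-registration scenario below follows the example in the text; representing it as a Python dictionary and generating numbered test steps is an illustrative convention, not a prescribed format.

```python
# Illustrative structure only; the scenario follows the student-registration
# example in the text, but the field names are assumptions.
use_case = {
    "name": "Student registers for a course",
    "actor": "Student",
    "preconditions": ["Student is logged in"],
    "main_flow": [
        "Student searches the course catalog",
        "Student selects an open section",
        "System checks prerequisites",
        "System confirms the registration",
    ],
    "postconditions": ["Student appears on the section roster"],
}

def as_test_script(uc: dict) -> str:
    """Turn the main flow into numbered preliminary test steps."""
    header = f"Test: {uc['name']} (actor: {uc['actor']})"
    steps = [f"  {i}. {step}" for i, step in enumerate(uc["main_flow"], 1)]
    return "\n".join([header, *steps])

print(as_test_script(use_case))
```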
4. Terms of Reference
Every organization is unique in that it has its own vocabulary. Systems people are also unique in that they often use jargon that is incomprehensible to most end users. A "dictionary" of the terms used is beneficial in clearing up any misunderstandings.
Table 7.4: Requester Logs into the System to Submit a New Request
5. Data Dictionary
While a data dictionary (DD) is usually included in a System Design Specification (SDS), if it is not included there, it should be included here. An excerpt of a DD is provided in Table 7.5 and in Appendix C. A data dictionary consists of the "terms of reference" for the data used in the system. It describes databases, tables, records, and fields, along with all attributes such as length and type (i.e., alphabetic, numeric). The DD should also describe all edit criteria, such as the fact that a social security number must be numeric and must contain nine characters; a minimal validation sketch based on such edit criteria follows Table 7.5.
Table 7.5: Data Dictionary Excerpt
| Name/CI: | Membership Database/mem001 |
|---|---|
| Aliases: | None |
| Where Used/How Used: | Used by the database management system to process requests and return results to the Inquiry and Administration sub-systems |
| Content Description: | Attributes associated with each asset, including: Membership Number = 10 numeric digits; Member Since Date = Date; Last Name = 16 alphanumeric characters; First Name = 16 alphanumeric characters; Address = 64 alphanumeric characters; Phone Number = 11 numeric digits (1, area code, phone number); Assets on Loan = array containing 10 strings, each containing 64 alphanumeric characters; Assets Overdue = array containing 10 strings, each containing 64 alphanumeric characters; Late Fees Due = 10 numeric digits; Maximum Allowed Loans = 2 numeric digits |

| Name: | Member Data |
|---|---|
| Aliases: | None |
| Where Used/How Used: | A file used to validate usernames and passwords for members, librarians, and the administrator when attempting to access the system. The username and password entered are compared with the username and password in this file. Access is granted only if a match is found. |
| Content Description: | Attributes associated with each member, including: Member Username = 16 alphanumeric characters; Member Password = 16 alphanumeric characters |
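The validation sketch promised above shows how data dictionary edit criteria might be enforced in code. The nine-digit social security number rule comes from the text and the field lengths echo Table 7.5; the FieldSpec helper and the regular expressions are illustrative assumptions.

```python
import re
from dataclasses import dataclass

# FieldSpec is an illustrative helper; the edit criteria (nine-digit numeric
# social security number, 16-character name field) follow the text and
# Table 7.5, while the regular expressions are assumptions.
@dataclass
class FieldSpec:
    name: str
    max_length: int
    pattern: str          # regular expression describing valid content

    def validate(self, value: str) -> bool:
        return (len(value) <= self.max_length
                and re.fullmatch(self.pattern, value) is not None)

DATA_DICTIONARY = {
    "ssn":       FieldSpec("Social Security Number", 9,  r"\d{9}"),
    "last_name": FieldSpec("Last Name",             16, r"[A-Za-z '-]+"),
    "member_no": FieldSpec("Membership Number",     10, r"\d{1,10}"),
}

print(DATA_DICTIONARY["ssn"].validate("123456789"))   # True
print(DATA_DICTIONARY["ssn"].validate("12345-678"))   # False: not all numeric
```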
6. Program/Component/Object Documentation
Aside from flowcharts, unless the programmer is using an automated CASE tool that generates documentation, the programmer should provide the following documentation: (1) a control sheet (see Appendix D); (2) comments within the program (Figure 7.6); and (3) a textual description of what the program does, including pseudocode, as shown in Table 7.6; a working-code sketch of this pseudocode follows Figure 7.6.
```
//Get cost of equipment
rsEquipment = Select * from Equipment Utilized Where Pothole ID = NewPotholeID
Loop through rsEquipment and keep running total of cost by equipment * rsRepairCrew("Repair Time")
Total Cost = Total Employee Cost + Total Equipment Cost + Material Cost
Update Employee Set Total Cost Where Pothole ID = NewPotholeID
```
Figure 7.6: Sample Program Comments
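The working-code sketch promised above shows one way the documented pseudocode might translate into an actual routine. The in-memory lists stand in for the "Equipment Utilized" and repair-crew recordsets, and the field names and rates are assumptions made for illustration.

```python
# A sketch of the computation the pseudocode documents. The in-memory lists
# stand in for the database recordsets; field names and rates are assumptions.
def total_repair_cost(pothole_id, equipment_rows, crew_rows, material_cost):
    # Equivalent of: Select * from Equipment Utilized Where Pothole ID = ...
    equipment = [r for r in equipment_rows if r["pothole_id"] == pothole_id]
    crew = [r for r in crew_rows if r["pothole_id"] == pothole_id]

    # Loop through the equipment, keeping a running total of cost * repair time
    repair_time = sum(r["repair_time"] for r in crew)
    total_equipment_cost = sum(r["hourly_cost"] * repair_time for r in equipment)
    total_employee_cost = sum(r["hourly_rate"] * r["repair_time"] for r in crew)

    # Total Cost = Total Employee Cost + Total Equipment Cost + Material Cost
    return total_employee_cost + total_equipment_cost + material_cost

equipment_rows = [{"pothole_id": 7, "hourly_cost": 40.0}]
crew_rows = [{"pothole_id": 7, "hourly_rate": 25.0, "repair_time": 3.0}]
print(total_repair_cost(7, equipment_rows, crew_rows, material_cost=120.0))
# 25*3 + 40*3 + 120 = 315.0
```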
7. All Presentation Material
It is likely that, at some point, the system team will be asked to make a presentation about the system. All presentation paraphernalia, such as slides, notes, etc., should be included in the system documentation.
8. Test Cases (Appendix E) and Test Plan
While use cases form the basis of the initial set of test cases, they are but a small subset of test cases. An entire chapter has been dedicated to software testing, so we will not prolong the discussion here. Suffice it to say that any and all test cases used in conjunction with the system, along with the results of those test cases, should be included in the system documentation.
9. Metrics
It is sad to say that most organizations do not measure the effectiveness of their programmers. Those that do should add this information to the system documentation. This includes a listing of all metrics (formulae) used and the results of those measurements. At a minimum, the weekly status reports and management reports generated from toolsets such as Microsoft Project should be included in the system documentation.
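As an example of the kind of metric and formula worth recording, defect density (defects per thousand lines of code) is a common choice; the module name and figures below are made up for illustration.

```python
# Defect density (defects per KLOC) is one common product/programmer metric;
# the module name and numbers below are hypothetical.
def defect_density(defects_found: int, lines_of_code: int) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / (lines_of_code / 1000)

weekly_report = {"module": "registration", "defects_found": 12,
                 "lines_of_code": 8000}
print(f"{defect_density(weekly_report['defects_found'], "
      f"weekly_report['lines_of_code'])}")  # placeholder avoided: see below
```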
10. Operations Instructions
Once the system is implemented, aside from the end users that the system was developed for, there might be some computer support operations personnel who are required to support this system in some way. Precise instructions for these support personnel are mandatory and must be included in the documentation for the system.
11. End-User Help Files
Most systems are built using a client/server metaphor that is quite interactive. Most systems, therefore, provide end users with online help. A copy of each help file should be saved as documentation. Most corporate systems are Windows based; hence, a Windows-style format for help files (Figure 7.7) has become the de facto standard. Microsoft Help Workshop is often used to assist in developing these help files, which are compiled from RTF (rich-text format) files.
Figure 7.7: A Typical Help File
12. User Documentation
Aside from the built-in help file, there must be a user manual included in what is provided to the end user. Increasingly, this user manual is being supplied right on the CD rather than on paper. There are two different types of end-user manuals. One is more of an encyclopedia that explains the terms and workings of the system when the end user has a specific question. The second type of end-user documentation is more of a tutorial.
User tutorials are easy to develop; it is important to approach the task in a step-by-step manner, going through all the motions of using the software exactly as a user would. Simply record every button pushed and every key pressed. A table format works well, as seen in Table 7.7, which documents the use of the SecureCRT program, a product of New Mexico-based Van Dyke Software.
Table 7.7: Sample Tutorial Format (columns: Steps, Screen)
Another advantage is that the user documentation development process serves double duty as a functional test. As the analyst or tech writer is developing the tutorial, he or she might just uncover some bugs.
In his discussion of system documentation for the article "Tools and Evidence," Ambler [2002] suggests that modeling and documentation are effective when employed with sense and restraint, thus enhancing system functionality. He makes a case that there is a need for restraint, that models should be discarded once they have fulfilled their purpose. As a project progresses, models are superseded by other artifacts such as other models, source code, or test cases that represent the information more effectively. Ambler takes a fresh approach: while it is important to know what to keep, it is also important to know what to throw away.
Documentation is particularly critical for maintenance work. Code can be mysterious to maintenance programmers who must maintain the system for years after the original system was written and the original programmers have moved on to other jobs [Graham et al. 2000].
Documentation is critically important. Kalakota [1996] wrote about organizing practices in his dissertation entitled "Organizing for Electronic Commerce." Echoing the concept of configuration management, Kalakota stressed that organizing has three distinct dimensions:
Distributed documents must be organized such that users and programs are able to locate, track, and use online documents. The growth of networking brings with it a corresponding increase in the number of documents to be organized. Current document organization techniques are derived from techniques used in file systems and are not sufficient for organizing the large number of heterogeneous documents that are becoming available for various purposes.
Kalakota suggests that:
Documentation is an often-neglected but very necessary component of the software development life cycle (SDLC). There are numerous approaches and methods available to software development teams to assist with the task. Most important are a commitment to documenting software, setting documentation standards for the organization, and making those standards stick, that is, ensuring that they are adhered to.
Configuration management (CM) enhances documentation by providing a framework of standardization through configuration identification and configuration status accounting.
[AISI 1996] Applied Information Science International, "Entity Relationship Diagram," 1996; available online at http://www.aisintl.com/case/olais/pb96/er_model.htm
[Ambler 2002] Ambler, Scott W., "Tools and Evidence," Software Development; available online at http://www.sdmagazine.com/documents/s=7134/sdm0205i/0205i.htm
[Cook and Visconti 2000] Cook, Curtis R. and Marcello Visconti, "Software System Documentation Process Maturity Model"; available online at http://www.cs.orst.edu/~cook/doc/Model.htm
[Graham et al. 2000] Graham, C., J.A. Hoffer, J.F. George, and J.S. Valacich, Introduction to Business Systems Analysis, Pearson Custom Publishing, Boston, MA, 2000.
[Kalakota 1996] Kalakota, Ravi Shankar, "Organizing for Electronic Commerce," DAI-A, 57/02, 1996, from University of Phoenix Online Collection [ProQuest Digital Dissertations], publication number AAT 9617262; available online at http://www.apollolibrary.com:2118/dissertations/fullcit/9617262
[Leahey 2002] Leahey, Robert, "Doc-O-Matic 1.0: Generates Docs in WinHelp, RTF, HTML or HTML Help," Delphi Informant; available online at http://www.delphizine.com/productreviews/2001/07/di200107rl_p/di200107rl_p.asp
[Liebhaber 2002] Liebhaber, Karen Powers, "Documentation for a Technical Audience," Intercom, 49(2), February 2002.
[Tuffley 2002] Tuffley, David, "How to Write, Version & File Software Development Documentation," 2002; available online at http://tuffley.hispeed.com/tcs20006.htm
[Visconti 1993] Visconti, Marcello Alfredo, "Software System Documentation Process Maturity Model," DAI-B, 55/03, 1993, from University of Phoenix Online Collection [ProQuest Digital Dissertations], publication number AAT 9422184; available online at http://www.apollolibrary.com:2118/dissertations/fullcit/9422184
[Wallace 2000] Wallace, Charles Robert, "Formal Specification of Software Using Abstract State Machines," DAI-B, 61/02, 2000, from University of Phoenix Online Collection [ProQuest Digital Dissertations], ISBN 0-599-63514-2; available online at http://www.apollolibrary.com:2118/dissertations/fullcit/9959880