Configuration Verification and Audit

OVERVIEW

A variety of things can go wrong with the CM (configuration management) process. Brown et al. [1999] list a set of "antipatterns," commonly repeated flawed practices:

  • Reliance on a software configuration tool to implement an SCM program.
  • The CM manager becomes a controlling force beyond his or her planned role. This leads to the CM manager dictating the delivery sequence and dominating all other processes.
  • Delegating CM functions to whoever happens to be available. Project managers, often strapped for resources, frequently parcel out the CM function to developers. CM really needs to be a process that stands apart from development. Brown et al. note that a developer's role as software configuration manager is compromised by his or her primary responsibility for developing the software; developers regard the software itself as the "product." From a CM perspective, the product is not just the software, but also the documentation.
  • Use of decentralized repositories. The key behind CM is shared information. This requires a shared repository. Many organizations utilize a decentralized mode of operation. Decentralization negates shared information.
  • Object-oriented development poses granularity problems. CM must happen at a detailed level of the interaction of a few objects and at a higher level where component interfaces are deployed.

The configuration verification and audit process includes:

  • Configuration verification of the initial configuration of a CI, and the incorporation of approved engineering changes, to assure that the CI meets its required performance and documented configuration requirements
  • Configuration audit of configuration verification records and the physical product to validate that the development program has achieved its performance requirements, and that the configuration documentation for the system/CI being audited is consistent with a product that meets those requirements

The common objective is to establish a high level of confidence in the configuration documentation used as the basis for configuration control and support of the product throughout its life cycle. Configuration verification should be an embedded function of the process for creating and modifying the CI or CSCI.

As shown in Figure 8.1, inputs to the configuration verification and audit activity include:

  • Configuration, status, and schedule information from status accounting
  • Approved configuration documentation (which is a product of the configuration identification process)
  • The results of testing and verification
  • The physical hardware CI or software CSCI and its representation
  • Manufacturing/build instructions and engineering tools, including the software engineering environment, used to develop, produce, test, and verify the product

Figure 8.1: Configuration Verification and Audit Activity Model

Successful completion of verification and audit activities results in a verified system/CI(s) and a documentation set that can be confidently considered a product baseline. It also results in a validated process to maintain the continuing consistency of product to documentation. Appendices V and W provide sample checklists for performing both a functional configuration and physical configuration audit.
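One hedged way to think about a verified product baseline is as a frozen snapshot of the documentation set, keyed by document identifier and content digest, against which continuing product-to-documentation consistency can be checked. The document IDs and contents below are invented for illustration; nothing in the handbook prescribes this mechanism.

```python
import hashlib

def baseline(documents: dict) -> dict:
    """Freeze a documentation set into {doc_id: sha256 digest}."""
    return {doc_id: hashlib.sha256(text.encode()).hexdigest()
            for doc_id, text in documents.items()}

def consistent_with_baseline(base: dict, documents: dict) -> bool:
    """True when the current documentation set is unchanged
    relative to the frozen baseline."""
    return baseline(documents) == base

# Hypothetical documentation set for a single CI
docs = {"SPEC-001": "performance spec rev B",
        "ICD-002": "interface doc rev A"}
base = baseline(docs)
```

Any later edit to a controlled document would change its digest and show up as an inconsistency, which is the property the verification process is meant to preserve.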


CONFIGURATION VERIFICATION AND AUDIT CONCEPTS AND PRINCIPLES

There is a functional and a physical attribute to both configuration verification and configuration audit. Configuration verification is an ongoing process. The reward for effective release, baselining, and configuration/change verification is delivery of a known configuration that is consistent with its documentation and meets its performance requirements. These are precisely the attributes needed to satisfy the ISO 9000 series requirements for design verification and design validation as well as the ISO 10007 requirement for configuration audit.

Configuration Verification

Configuration verification is a process that is common to configuration management, systems engineering, design engineering, manufacturing, and quality assurance. It is the means by which a developer verifies his or her design solution.

The functional aspect of configuration verification encompasses all of the tests and demonstrations performed to meet the quality assurance sections of the applicable performance specifications. The tests include verification/qualification tests performed on a selected unit or units of the CI, and repetitive acceptance testing performed on each deliverable CI, or on a sampling from each lot of CIs, as applicable. The physical aspect of configuration verification establishes that the as-built configuration conforms to the as-designed configuration. The developer accomplishes this verification by physical inspection, process control, or a combination of the two.
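The physical aspect, comparing as-built to as-designed, can be sketched as a simple reconciliation of part/revision mappings. The part numbers and discrepancy categories here are illustrative assumptions, not terminology from the handbook.

```python
def verify_physical_config(as_designed: dict, as_built: dict) -> list:
    """Return discrepancies between as-designed and as-built
    configurations, each expressed as {part: revision}."""
    discrepancies = []
    for part, rev in as_designed.items():
        if part not in as_built:
            discrepancies.append(f"MISSING: {part} rev {rev} not installed")
        elif as_built[part] != rev:
            discrepancies.append(
                f"REVISION MISMATCH: {part} designed {rev}, "
                f"built {as_built[part]}")
    for part in as_built:
        if part not in as_designed:
            discrepancies.append(
                f"UNDOCUMENTED: {part} not in released design")
    return discrepancies

# Hypothetical parts lists
designed = {"PWA-100": "C", "PWA-101": "A"}
built = {"PWA-100": "B", "PWA-102": "A"}
issues = verify_physical_config(designed, built)
```

An empty result would correspond to an as-built unit that conforms to its as-designed documentation.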

Once the initial configuration has been verified, approved changes to the configuration must also be verified. Figure 8.2 illustrates the elements in the process of implementing an approved change.

Figure 8.2: Change Implementation and Verification

Change verification may involve a detailed audit, a series of tests, a validation of operation, maintenance, installation, or modification instructions, or a simple inspection. The choice of the appropriate method depends on the nature of the CI, the complexity of the change, and the support commodities that the change impacts.
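The selection described above can be sketched as a simple decision rule. The factor names and thresholds are assumptions for illustration only; in practice the choice is a judgment call made per change.

```python
def select_verification_method(change_complexity: str,
                               impacts_support: bool) -> str:
    """Map change attributes to a verification method.
    Thresholds here are illustrative, not normative."""
    if impacts_support:
        # operation/maintenance/installation instructions affected
        return "validation of instructions"
    if change_complexity == "high":
        return "detailed audit and test series"
    if change_complexity == "medium":
        return "validation of operation"
    return "simple inspection"
```

A trivial change to a simple CI would warrant only inspection, while a complex change touching support commodities would drive the heavier methods.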

Configuration Audit

The dictionary definition of the word "audit" as a final accounting gives some insight into the value of conducting configuration audits. Configuration management is used to define and control the configuration baselines for the CIs and the system. In general, a performance specification is used to define the essential performance requirements and constraints that the CI must meet.

For complex systems and CIs, a "performance" audit is necessary to determine whether the CI actually meets those essential performance requirements. Also, because development of an item involves the generation of product documentation, it is prudent to ascertain that the documentation is an accurate representation of the design being delivered.

Configuration audits provide the framework, and the detailed requirements, for verifying that the development effort has successfully achieved all the requirements specified in the configuration baselines. If there are any problems, it is the auditing activity's responsibility to ensure that all action items are identified, addressed, and closed out before the design activity can be deemed to have successfully fulfilled the requirements.

There are three phases to the audit process, and each is very important. The pre-audit part of the process sets the schedule, agenda, facilities, and rules of conduct and identifies the participants for the audit. The actual audit itself is the second phase; and the third phase is the post-audit phase, in which diligent follow-up of the audit action items must take place. For complex products, the configuration audit process may be a series of sequential/parallel audits of various CIs conducted over a period of time to verify all relevant elements in the system product structure. Audit of a CI can include incremental audits of lower-level items to assess the degree of achievement of requirements defined in specifications/documentation.

Functional Configuration Audit

The functional configuration audit (FCA) is used to verify that the actual performance of the CI meets the requirements stated in its performance specification and to certify that the CI has met those requirements. For systems, the FCA is used to verify that the actual performance of the system meets the requirements stated in the system performance specification. In some cases, especially for very large, complex CIs and systems, the audits can be accomplished in increments. Each increment can address a specific functional area of the system/CI and will document any discrepancies found in the performance capabilities of that increment. After all the increments have been completed, a final (summary) FCA can be held to address the status of all the action items that have been identified by the incremental meetings and to document the status of the FCA for the system or CI in the minutes and certifications. In this way, the audit is effectively accomplished with minimal complications.
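The incremental approach amounts to accumulating action items per functional area and allowing the summary FCA to certify only when every item is closed. This sketch encodes that rule; the class, increment names, and discrepancy text are hypothetical.

```python
class IncrementalFCA:
    """Minimal tracker for action items raised across FCA increments."""

    def __init__(self):
        self.action_items = []  # each: {increment, issue, closed}

    def record_increment(self, increment: str, discrepancies: list):
        for issue in discrepancies:
            self.action_items.append(
                {"increment": increment, "issue": issue, "closed": False})

    def close_item(self, index: int):
        self.action_items[index]["closed"] = True

    def summary_fca_complete(self) -> bool:
        # The summary FCA can certify only when every action item
        # from every increment has been closed.
        return all(item["closed"] for item in self.action_items)

fca = IncrementalFCA()
fca.record_increment("navigation", ["timing requirement not demonstrated"])
fca.record_increment("communications", [])
fca.close_item(0)
```

The same structure generalizes to any number of increments; the summary meeting simply reviews the closure status.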

Physical Configuration Audit

The physical configuration audit (PCA) is used to examine the actual configuration of the CI that is representative of the product configuration in order to verify that the related design documentation matches the design of the deliverable CI. It is also used to validate many of the supporting processes that were used in the production of the CI. The PCA is also used to verify that any elements of the CI that were redesigned after the completion of the FCA also meet the requirements of the CI's performance specification.

Application of Audits During Life Cycle

It is extremely unlikely that FCAs or PCAs will be accomplished during the Concept Exploration and Definition phase or the Program Definition and Risk Reduction phase of the life cycle. Audits are intended to address the acceptability of a final, production-ready design and that is hardly the case for any design developed this early in the life cycle.

It is during the Engineering and Manufacturing Development (EMD) phase that the final, production, operationally ready design is developed. Thus, this phase is normally the focus for the auditing activity. A PCA will be performed for each HW CI that has completed the FCA process to "lock down" the detail design by establishing a product baseline. Hardware CIs built during this phase are sometimes "pre-production prototypes" and are not necessarily representative of the production hardware. Therefore, it is very common for the PCAs to be delayed until early in the Production phase of the program.

Requirements to accomplish FCAs for systems and CIs are included in the Statement of Work (SOW) tasking. The FCA is accomplished to verify that the requirements in the system and CI performance specifications have been achieved in the design. It does not focus on the results of the operational testing that is often accomplished by operational testing organizations in the services, although some of the findings from the operational testing may highlight performance requirements in the baselined specification that have not been achieved. Deficiencies in performance capability, as defined in the baselined specification, result in FCA action items requiring correction without a change to the contract. Deficiencies in the operational capability, as defined in user-prepared need documents, usually result in Engineering Change Proposals (ECPs) to incorporate revised requirements into the baselined specifications or to fund the development of new or revised designs to achieve the operational capability.

Because the final tested software design verified at the FCA normally becomes the production design, the PCAs for CSCIs are normally included as a part of the SOW tasking for the EMD phase. CSCI FCAs and PCAs can be conducted simultaneously to conserve resources and to shorten schedules.

During a PCA, the deliverable item (hardware or software) is compared to the product configuration documentation to ensure that the documentation matches the design. This ensures that the exact design that will require support is documented. The intent is that an exact record of the configuration will be maintained as various repair and modification actions are completed. The basic goal is sometimes compromised in the actual operation and maintenance environment. Expediency, unauthorized changes, cannibalization, overwork, failure to complete paperwork, and carelessness can cause the record of the configuration of operational software or hardware to become inaccurate. In some situations, a unit cannot be maintained or modified until its configuration is determined. In these kinds of circumstances, it is often necessary to inspect the unit against approved product configuration documentation, as in a PCA, to determine where differences exist. Then the unit can be brought back into conformance with the documentation, or the records corrected to reflect the actual unit configuration.
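The reconciliation described above, inspecting a unit against approved product configuration documentation and then either reworking the unit or correcting the records, can be sketched as follows. The field names and the `trust_unit` flag are illustrative assumptions; in practice the disposition of each difference is a case-by-case engineering decision.

```python
def reconcile(records: dict, actual: dict, trust_unit: bool) -> dict:
    """Return the reconciled configuration record for one unit.

    If trust_unit is True, the records are corrected to match the
    unit as found; otherwise the unit is assumed to be reworked to
    match the approved documentation.
    """
    differences = {k: (records.get(k), actual.get(k))
                   for k in set(records) | set(actual)
                   if records.get(k) != actual.get(k)}
    if not differences:
        return dict(records)          # unit already conforms
    if trust_unit:
        return dict(actual)           # correct the records
    return dict(records)              # rework unit to documentation

# Hypothetical unit whose recorded firmware lags what is installed
rec = {"firmware": "2.1", "antenna": "A3"}
found = {"firmware": "2.2", "antenna": "A3"}
updated = reconcile(rec, found, trust_unit=True)
```

Either outcome restores the property the PCA exists to protect: that the unit and its configuration record agree.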

As discussed, configuration audits address two major concerns:

  1. The ability of the developed design to meet the specified performance requirements (the FCA addresses this concern)
  2. The accuracy of the documentation reflecting the production design (the PCA addresses this concern)

Audit checklists are provided in Table 8.1.

Table 8.1: Audit Checklists

Audit Planning Checklist:

  1. Global plan and schedule for all FCAs/PCAs, expanding on the CM Plan
  2. CIs/CSCIs to be audited; specific units to be audited
  3. Scope: contract requirements, SOW, specification, approved plans
  4. Location and dates for each audit
  5. Composition of audit team and their functions in the audit
  6. Documentation to be audited and reference material
  7. Administrative requirements; security requirements

Audit Agenda Checklist:

  1. Covering a specific audit, targeted 60 days before audit
  2. Date, time, location, duration; unless otherwise specified, configuration audits will be conducted at the contractor or a designated subcontractor facility
  3. Chairpersons
  4. Specific CIs or CSCIs
  5. Documentation to be available for review
  6. Chronological schedule for conduct of the audit
  7. Detailed information pertinent to the audit (e.g., team requirements, facility requirements, administrative information, security requirements)

Audit Teams Checklist:

  1. Assign a co-chair for each audit in audit plan
  2. For FCA: base specific personnel needs on the type and complexity of the CIs to be audited, their technical documentation, and the logistics, training, human factors, safety, producibility, deployability, and other requirements of the governing specification
  3. For PCA: experts in engineering design, computer-aided design, engineering release, computer-aided manufacturing, manufacturing, assembly, and acceptance test processes are needed
  4. Task DCMC plant representatives to review and certify engineering release, configuration control, and verification processes
  5. Prior to each audit, provide organization and security clearance of each participating individual on the audit team

Conducting Configuration Audits

Introductory Briefings Checklist:

  1. All participants
  2. Purpose of the audit
  3. Specific items to be audited; pertinent information/characteristics of the system/CIs
  4. Basic criteria for problem identification and documentation
  5. Schedule and location of audit events
  6. Teams, team leaders, and location of teams
  7. Administrative procedures for the audit (e.g., problem input format, processing flow, audit logistics)
  8. Location of necessary facilities

Conduct Reviews. Prepare Audit Findings (problem write-ups) Checklist:

Sub-teams facilitate the conduct of the audit by enabling parallel effort; auditors are assigned to work in their areas of expertise.

  1. Review specification, verification processes, and results:

    1. Test plans/procedures comply with specification requirements
    2. Test results, analyses, simulations, etc.; verify CI requirements as required by specification
    3. ECPs are incorporated and verified
    4. Interface requirements verified
    5. Configuration documentation reflects configuration of item for which test data is verified
    6. Data for items to be provisioned are sampled to ensure that they reference applicable performance and test requirements
    7. For CSCIs:

      1. Database, storage allocation, timing, and sequencing are in compliance with specified requirements
      2. Software system operation and maintenance documentation is complete
      3. Test results and documentation reflect correct software version
      4. Internal QA audits are satisfied
  2. Temporary departures documented by approved Deviation Request
  3. Product baseline:

    1. Formal examination of the as-built configuration of a CI or CSCI against the specifications and design documentation constituting its product baseline
    2. Ensure proper parts as reflected in the engineering drawings (see below) are actually installed and correctly marked
    3. Determine that the configuration being produced accurately reflects released engineering data
  4. Engineering drawing or CAD representations (design detail) review:

    1. Representative number of drawings (or CAD representations) and associated manufacturing instructions reviewed for accuracy and to ensure that the manufacturing instructions (from which the hardware is built) reflect all design details and include authorized engineering changes

      1. Drawing number and revision on manufacturing instructions matches correct released drawing or CAD representation
      2. Drawing and revisions are correctly represented in release records; drawings do not have more than five unincorporated changes
      3. List of materials on manufacturing instructions matches drawing parts list
      4. Nomenclature, part number, and serial number markings are correct
      5. All approved changes have been incorporated
      6. There is a continuity of part references and other characteristics for a major assembly from the top drawing down to the piece part
      7. Required approvals are present
    2. Sampling of parts reflected on drawing reviewed to ensure compatibility with program parts selection list (or criteria)
  5. Acceptance test procedures and results:

    1. CI acceptance test data and procedures comply with item specification
    2. Acceptance test requirements prescribed by the documentation are adequate for acceptance of production units of a CI
    3. CIs being audited pass acceptance tests as reflected in test results
  6. Engineering release and configuration control:

    1. System is adequate to properly control the processing and release of engineering changes on a continuing basis
    2. Software changes are accurately identified, controlled, and tracked to the software and documentation affected
  7. Logistics support plan for pre-operational support:

    1. Spares and repair parts provisioned prior to PCA are the correct configuration
  8. For CSCIs:

    1. Documentation is complete and meets applicable conventions, protocols, coding standards, etc.
    2. Software listings reflect design descriptions
    3. Delivery media is appropriately marked and in agreement with specification requirements for packaging and delivery
    4. Documentation correctly relates the software to the components onto which it is to be loaded; for firmware, it contains complete installation and verification requirements
    5. Demonstrate that each CSCI can be compiled from library-based source code
    6. Review operational and support manuals for completeness, correctness, and incorporation of comments made at prior reviews (FCA, test readiness, QA audits, etc.)
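Two of the CSCI checks above, that each CSCI can be rebuilt from library-based source and that the delivery media agrees with specification requirements, lend themselves to automation by digest comparison. The build step here is a stand-in byte string, not a real build system, and the function name is an invention for this sketch.

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of an artifact's contents."""
    return hashlib.sha256(data).hexdigest()

def media_matches_build(built_artifact: bytes,
                        delivered_media: bytes) -> bool:
    """True when the delivered media is bit-identical to the
    artifact rebuilt from the controlled source library."""
    return digest(built_artifact) == digest(delivered_media)

# Hypothetical rebuild output and delivery-media contents
built = b"CSCI v1.4 object code"
delivered = b"CSCI v1.4 object code"
```

A mismatch would indicate either an uncontrolled change to the source library or delivery media that was not produced from the audited build.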

Problem Write-up Checklist:

  1. Originator:

    1. Identify contract or configuration document
    2. Item being audited
    3. Requirement
    4. Narrative description of the problem/discrepancy
    5. Recommendation
  2. Sub-team leader preliminary review:

    1. Preliminary control number assigned
    2. Approved and signed
    3. Disapproved
    4. Returned to originator for revision or further analysis
  3. If approved, forwarded to Executive Panel

Disposition Audit Findings Checklist:

  1. Executive panel:

    1. Final review of problem write-ups
    2. Assign control numbers and enter selected problems into official record of the audit
    3. Submit to developer with suspense time (typically a period of hours) for responding to the problem
  2. Developer response:

    1. Concur with problem and recommend action
    2. Offer additional information that resolves or clarifies the problem
    3. Disagree with problem finding or obligation
  3. Review response:

    1. Determine if it appears to provide satisfactory resolution
    2. Provide to Executive Panel
  4. Disposition all problem write-ups that were submitted
  5. Make final decision as to further action:

    1. Close item
    2. Agree on further actions to close out problem
  6. Officially record all dispositions, action assignments, and suspense dates in audit minutes
  7. Co-chairs sign all problem write-ups
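The write-up and disposition checklists above describe what amounts to a small state machine for an audit finding. This sketch encodes one plausible set of states and allowed transitions; the state names paraphrase the checklists and are not official terminology.

```python
# Allowed transitions for a problem write-up, paraphrased from the
# checklists: originator drafts, sub-team reviews, executive panel
# records, developer responds, panel dispositions.
ALLOWED = {
    "drafted": {"subteam_approved", "returned_to_originator"},
    "returned_to_originator": {"drafted"},
    "subteam_approved": {"panel_recorded"},
    "panel_recorded": {"developer_responded"},
    "developer_responded": {"closed", "further_action"},
    "further_action": {"closed"},
}

class ProblemWriteup:
    def __init__(self, description: str):
        self.description = description
        self.state = "drafted"

    def advance(self, new_state: str):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"cannot go {self.state} -> {new_state}")
        self.state = new_state

p = ProblemWriteup("interface timing requirement not verified")
p.advance("subteam_approved")
p.advance("panel_recorded")
p.advance("developer_responded")
p.advance("closed")
```

Encoding the workflow this way makes the rule in the checklists explicit: no finding reaches the official record without sub-team approval, and nothing closes without a recorded developer response and panel disposition.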

Documenting Audit Results Checklist:

  1. Prepare official audit minutes, to include:

    1. Typical meeting minutes: time, place, purpose, participants, etc.
    2. Action item lists reflecting all actions and suspense dates agreed to
    3. Applicable audit certifications documenting key audit review activities
    4. Specific items, systems, documents, or processes reviewed
    5. Summary of discrepancies/deficiencies in each area referenced to control number of applicable audit problem write-ups (action items)
    6. Definitive statements about acceptability or non-acceptability
    7. Final status of the developer's effort in the area being certified


SUMMARY

Testing is a critical component of software engineering and the final step taken prior to deploying a system. Configuration verification and audit organizes this process to ensure that the deployed system is what the end users expect and that its documentation matches the delivered product.


REFERENCES

This chapter is based on the following report: MIL-HDBK-61A(SE), Military Handbook: Configuration Management Guidance, February 7, 2001.

[Brown 1999] Brown, William, Hays McCormick, and Scott Thomas, AntiPatterns and Patterns in Software Configuration Management, John Wiley & Sons, New York, 1999.






Software Configuration Management
ISBN: 0849319765
Year: 2006
Pages: 235
Authors: Jessica Keyes