Support Process Areas

Configuration Management

The items that require configuration management have been identified. The items currently under configuration management for the CAU project include the System Specification Document, numerous software requirements specifications, and interface requirements definition documents. Test facilities, including labs, will have configuration management policies. Test cases will also be under configuration management.

The Documentum™ tool is used for managing most of the documentation associated with the project; it is also used across USA to manage many other documents. The USA standard document templates include change notices at the front of each document.

The planning activity to support the software system builds is under way. The project is currently documenting the procedures to be followed with the open-source Concurrent Versions System (CVS). There will be eight incremental system builds, with the final build used for formal testing.
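
As a concrete illustration, an incremental build baseline in CVS is typically captured by tagging the sources so that any build can be reproduced later. The Python sketch below shows one way such a procedure might be scripted; the module name and tag convention are hypothetical, not taken from the project's documented procedures:

    import subprocess

    MODULE = "cau_flight_sw"          # hypothetical CVS module name
    BUILD_TAG = "BUILD_03_BASELINE"   # e.g., the third of eight incremental builds

    def tag_build_baseline(workdir):
        """Apply an immutable tag to the checked-out sources for this build.
        The -c option makes cvs verify that no working file has uncommitted
        modifications before tagging."""
        subprocess.run(["cvs", "tag", "-c", BUILD_TAG], cwd=workdir, check=True)

    def recover_build(workdir):
        """Recover the exact sources of a prior build from its tag."""
        subprocess.run(["cvs", "checkout", "-r", BUILD_TAG, MODULE],
                       cwd=workdir, check=True)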

Change instruments will be used to investigate and track changes to the configuration items. The first time items are delivered, the change request is linked to the initial submittal. Documentum, CVS, and the build system all provide version control and allow any previous version to be recovered. The source code will contain identifiers that indicate when each specific line was modified.
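
One common mechanism for such identifiers (a sketch assuming CVS keyword substitution, not necessarily the project's actual convention) is to embed RCS/CVS keywords in each source file, which the tool expands at checkout; per-line history is then recoverable with cvs annotate:

    # $Id$ expands on checkout to file, revision, date, and author, e.g.:
    # $Id: nav_filter.py,v 1.7 2006/03/14 18:02:55 jdoe Exp $

    # "cvs annotate nav_filter.py" prefixes every line with the revision
    # and author that last modified it, tying each line to its change request.

    FILTER_GAIN = 0.98  # hypothetical value changed under CR-0142 (illustrative)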

The Version Description Document will contain the documented base release configuration. Software Quality Assurance will perform configuration audits.

Process and Product Quality Assurance

Quality engineers from Safety, Quality, and Mission Assurance independently audit the development of software requirements and will continue to audit the software products. It is USA policy to deliver Human Space Flight products and services safely, on time, and error-free. The USA Safety, Quality, and Mission Assurance organization provides independent oversight to ensure that the associated technical processes are implemented in accordance with this policy. Routine performance data, in the form of Safety, Quality, and Mission Assurance integrated metrics, will be collected and analyzed to measure the effectiveness of these processes.

The programmers, testers, and requirements analysts audit the technical quality of the flight software products (e.g., software requirements specifications, design, code, test results) by participating in inspections during each phase of the development process. All issues from inspections and testing are recorded and tracked until closure. Currently, requirements inspection issues are recorded and maintained; this practice will be carried forward to design, code, and test as those project phases are reached.

Measurement and Analysis

Within the Flight Software Program Element, the associate program manager expects support for the USA Vision Support Plan Program and expects measurements to be rolled out at the lowest levels of the process. The associate program manager also directs the process teams to collect their measurements and present them annually at the internal quality review meeting.

Some objectives for measurement are stated in the project documentation: for example, tracking the quality and timeliness of products; focusing analysis to identify trends that confirm steady progress against goals or suggest areas of concern; providing a basis for communication within the USA Flight Software Program Element concerning the evaluation of technical issues and accomplishments; analyzing schedule adherence; and identifying areas where improvements or corrections are required.

The metrics-related documents provide high-level information indicating the sources of metrics data and the approach to collecting them. The process documents define how the base measures are collected.

The list of derived measures is contained in the Metrics Product Development Plan, which USA and NASA have approved. Measure definitions are contained in the quarterly quality report, a contract deliverable. The PASS measurement reports include some summary analysis conclusions, and some reports are previewed before broader dissemination. Some documentation describes the administrative procedures for analyzing the measurement data and communicating the results.

The base measurement data is collected from multiple sources (e.g., the development environment database, the configuration management database, SLOC reports), and derived measures are computed through established links and formulas in Excel files. Derived measurements are kept in Excel, Word, and PowerPoint files stored on the local area network.
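
The Excel link computations amount to simple derived-measure formulas. As an illustration only (the measure name and definition below are assumptions; the project's actual definitions reside in the quarterly quality report), a defect density derivation might look like this:

    def defect_density(dr_count, sloc):
        """Derived measure: discrepancy reports per thousand source lines.
        Inputs are base measures pulled from the DR database and SLOC reports."""
        return dr_count / (sloc / 1000.0)

    # Example: 12 DRs written against a 48,000-SLOC build -> 0.25 DRs/KSLOC
    print(defect_density(12, 48000))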

Data integrity, accuracy, completeness, and currency checks are typically performed for each Flight Software build, and quarterly when discrepancy report data is reconciled during generation of the quarterly quality report. Some data integrity checks are built into the "process enactment" system at the source of the data recording. Discrepancies noted in collected base measures are reported to the appropriate data source (e.g., control board, process owner).
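
A minimal sketch of such an integrity check follows; the record fields and the 90-day currency rule are invented for illustration and are not the project's actual criteria:

    from datetime import date, timedelta

    REQUIRED_FIELDS = ("dr_id", "severity", "opened", "source")  # hypothetical

    def check_record(record, today):
        """Return a list of integrity problems found in one base-measure record:
        missing fields (completeness) and stale entries (currency)."""
        problems = ["missing field: " + field for field in REQUIRED_FIELDS
                    if not record.get(field)]
        opened = record.get("opened")
        if opened and today - opened > timedelta(days=90):
            problems.append("stale record: older than one quarterly cycle")
        return problems

    print(check_record({"dr_id": "DR-1841", "severity": 2,
                        "opened": date(2005, 11, 3), "source": "CM db"},
                       date(2006, 4, 1)))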

Appropriate methods and tools for analyzing the measurement data have been established for the project (e.g., error predictions, reliability models, trend analysis). The analysis results (reports) are stored on the local area network or, in the case of contract deliverables, on the USA intranet. A long-standing practice of not using measurement data inappropriately (e.g., for individual performance evaluations) has been established, and the project culture supports the open collection and reporting of measurements across the board.
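
Trend analysis over a measurement series can be as simple as a least-squares slope. The sketch below is illustrative only; the project's actual error-prediction and reliability models are defined in its own documentation:

    def linear_trend(values):
        """Least-squares slope of a series of per-build measurements.
        A negative slope indicates, for example, declining defect discovery."""
        n = len(values)
        mean_x = (n - 1) / 2.0
        mean_y = sum(values) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
        den = sum((x - mean_x) ** 2 for x in range(n))
        return num / den

    # Hypothetical DR counts for five successive builds: slope is -1.8 DRs/build
    print(linear_trend([14, 11, 9, 9, 6]))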

Measurement reports (e.g., monthly key project metrics, quarterly quality reports, milestone error predictions) are distributed to managers and key staff who have a vested interest in the report's content.

Measurement collection, analysis, and use to support managerial decision making are ingrained in all development processes. Measurement activities are also a means for identifying areas for process improvement.

Decision Analysis and Resolution

The formal evaluation process has been applied for two key aspects of the project: subcontractor source selection and trade studies. The subcontractor source selection is described in the Supplier Agreement Management PA.

Early in the project life cycle, as USA and NASA met to discuss the best way to identify possible software solutions, the decision was made to use prototyping. Where choices had to be made between commercial products (or between building and buying), company and customer expectations dictated the use of trade studies.

Trade studies were selected as the method of choice in three areas where decisions were needed: (1) operating system selection, (2) implementation language selection, and (3) display tool selection. Prototyping was also used to evaluate implementation techniques.

A trade study methodology was documented, evaluation criteria were defined and weighted, and alternative solutions were identified; the alternatives and the trade study results were documented in the trade study reports.
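
In essence, a weighted trade study reduces to a weighted-sum score over the alternatives. The following sketch shows the arithmetic; the criteria, weights, and scores are invented for illustration and are not the project's actual operating system evaluation:

    # Hypothetical weighted trade-study matrix (scores on a 1-10 scale)
    WEIGHTS = {"determinism": 0.4, "tool support": 0.3,
               "vendor maturity": 0.2, "cost": 0.1}
    ALTERNATIVES = {
        "OS A": {"determinism": 9, "tool support": 6,
                 "vendor maturity": 8, "cost": 5},
        "OS B": {"determinism": 7, "tool support": 9,
                 "vendor maturity": 7, "cost": 8},
    }

    def weighted_score(scores):
        """Sum each criterion score weighted by its agreed importance."""
        return sum(WEIGHTS[criterion] * score
                   for criterion, score in scores.items())

    for name, scores in ALTERNATIVES.items():
        print(name, round(weighted_score(scores), 2))   # OS A: 7.5, OS B: 7.7
    print("recommended:",
          max(ALTERNATIVES, key=lambda a: weighted_score(ALTERNATIVES[a])))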

Risk Management is another area where a formal evaluation process is used: medium- and high-risk areas are subject to formal evaluation, as identified by the company risk mitigation policy.

The USA decision package process is another form of structured decision making that is used across the company. USA requirements dictate when a decision package is needed. Developing a decision package requires identification and consideration of alternative solutions. The decision package documents the results of the evaluation and recommendation of a specific solution.

Causal Analysis and Resolution

Discrepancy reports are subject to causal analysis. Additional problems are selected for causal analysis based on their significance, as judged by management or control boards. Analysts record the root cause of the problem and any proposed solutions in the discrepancy report analysis to prevent recurrence. Additional analysts with knowledge of the same technical area review the root cause information; these reviewers also offer input on why the problem was missed and may suggest ways to avoid missing similar problems in the future.

For identified process deficiencies, the process teams determine which of the proposed actions will be implemented. Depending on the seriousness of the problem, they may be directed by the customer, management, the Discrepancy Report Board, or other control boards to implement certain action proposals.

Discrepancy report causal analysis information is recorded and maintained on the discrepancy report quality analysis form. Sometimes, however, detailed supplemental presentations contain causal analysis data. This information is not controlled in the same manner as the discrepancy report analyses; it is usually maintained but not easily accessible. Causal analysis data from other activities is similarly maintained but not readily available.

There is no systematic method in place to evaluate the effect of (individual) changes on process performance.

USA formed a company-wide team to support corporate standardization of root cause analysis. The team defined a standard process and selected a corresponding tool (REASON®) to support the tracking of root causes across the company for defined incidents.


