Classification and categories of requirements


Classification is an arrangement according to some systematic division into classes or groups. A category, on the other hand, is a class or division within a scheme of classification. When classifying requirements, the following typical codes are used (a brief sketch of these codes as a controlled vocabulary follows the list):

  • Unclassified. Initial classification of all requirements.

  • Does not apply. DVP team determined that the requirement is not applicable to the program.

  • Meet. DVP team intends to meet this requirement.

  • Deviate from requirement. DVP team intends to meet this requirement, but there is a potential to deviate. Deviation process is handled outside the design verification specification (DVS).

  • Support. DVP team is aware of the requirement, but does not own it and will not write a DVP. Can only be used with a cascaded requirement.

  • Open. DVP team is still investigating the impact of this requirement on the program.
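
The classification codes above form a small controlled vocabulary. As a minimal sketch only, the Python enumeration below (the names and values are illustrative assumptions, not part of any particular DVS tool) shows one way to hold the codes and the rule that every requirement starts out unclassified.

    from enum import Enum

    class Classification(Enum):
        """Typical classification codes for a requirement (illustrative only)."""
        UNCLASSIFIED = "Unclassified"            # initial classification of all requirements
        DOES_NOT_APPLY = "Does not apply"        # not applicable to the program
        MEET = "Meet"                            # DVP team intends to meet the requirement
        DEVIATE = "Deviate from requirement"     # intends to meet, but may deviate
        SUPPORT = "Support"                      # aware of, but does not own (cascaded only)
        OPEN = "Open"                            # impact still under investigation

    # Every requirement begins life as unclassified.
    initial_code = Classification.UNCLASSIFIED
    print(initial_code.value)                    # -> Unclassified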

There are six common categories of requirements:

  • New requirement. A new requirement is one that has not existed before, at least for the organization. The mandatory items of concern regarding a new requirement are (a record sketch follows this list):

    • Program.

    • Title.

    • DVP team.

    • Unit of measure.

    • Owning attribute or SDS information.
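
As a rough sketch, the mandatory items above can be treated as required fields on a requirement record. The dataclass below is hypothetical (the field names are assumptions, not the DVS schema) and simply rejects a new record with a blank mandatory field.

    from dataclasses import dataclass, fields

    @dataclass
    class NewRequirement:
        """Hypothetical record holding the mandatory items for a new requirement."""
        program: str
        title: str
        dvp_team: str
        unit_of_measure: str
        owning_attribute_or_sds: str

        def __post_init__(self):
            # All five mandatory items must be supplied; blanks are rejected.
            for f in fields(self):
                if not getattr(self, f.name).strip():
                    raise ValueError(f"Mandatory field '{f.name}' is empty")

    req = NewRequirement("Program X", "Door closing effort", "Closures DVP team",
                         "N", "SDS-CLOS-01")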

  • Modifying requirements. Generic and program-specific requirements can be modified to reflect program needs and assumptions. Modified requirements have the following typical characteristics (a sketch of the in-process/release cycle follows the list):

    • A modified requirement is an in-process copy of a requirement.

    • In-process requirements can only be viewed and modified by the owning DVP team.

    • Only a limited number of in-process copies of the same requirement can exist at one time; the limit depends on the project.

    • When an in-process requirement is released, other in-process copies are deleted.

    • In-process requirements must be released before they become part of the DVP team's set of requirements.

    • All released requirements must be classified using the classifications listed above.

    • Modified requirements retain the ID number of the original requirement.
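
The characteristics above describe a simple lifecycle: an in-process copy is created, at most a project-dependent number of copies may coexist, and releasing one copy deletes the others while keeping the original ID. The sketch below is illustrative only; the data structures and the copy limit are assumptions.

    # Minimal sketch of the in-process/release cycle for modified requirements.
    MAX_IN_PROCESS_COPIES = 3            # project-dependent limit (assumed value)

    in_process = {}                      # requirement ID -> list of in-process copies
    released = {}                        # requirement ID -> released text

    def modify(req_id, new_text):
        """Create an in-process copy of a requirement for the owning DVP team."""
        copies = in_process.setdefault(req_id, [])
        if len(copies) >= MAX_IN_PROCESS_COPIES:
            raise RuntimeError("Too many in-process copies of this requirement")
        copies.append(new_text)

    def release(req_id, chosen_text):
        """Release one in-process copy; the other copies are deleted and the ID is kept."""
        if chosen_text not in in_process.get(req_id, []):
            raise KeyError("Only an in-process copy can be released")
        released[req_id] = chosen_text   # the modified requirement retains the original ID
        in_process[req_id] = []          # remaining in-process copies are deleted

    modify("SDS-CLOS-01-007", "tightened wording")
    release("SDS-CLOS-01-007", "tightened wording")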

  • Adding requirements. New requirements can be added to reflect the program's needs and assumptions (an ID-numbering sketch follows the list):

    • New requirements can be based on existing requirements or can be created from scratch.

    • Once the mandatory fields are entered and saved, the new requirement is in process.

    • New requirements are given an ID number based on the owning SDS, followed by a sequential serial number.

    • All in-process requirements must be released and classified.
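
The ID-numbering rule above (an ID based on the owning SDS plus a serial number) can be sketched as follows; the prefix format and the serial width are assumptions for illustration.

    from itertools import count

    _serials = {}      # owning SDS -> running serial counter

    def next_requirement_id(owning_sds):
        """Return the next program-specific ID for the given SDS, e.g. SDS-CLOS-01-001."""
        serial = _serials.setdefault(owning_sds, count(1))
        return f"{owning_sds}-{next(serial):03d}"

    print(next_requirement_id("SDS-CLOS-01"))    # SDS-CLOS-01-001
    print(next_requirement_id("SDS-CLOS-01"))    # SDS-CLOS-01-002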

  • Deleting requirements. As the review progresses, it may be necessary to delete some requirements. The deletion may be because of obsolescence, irrelevance, or some other reason. The following are some rules to follow when evaluating the design for deleting requirements (a sketch of these guard rules follows the list):

    • Only in-process requirements can be deleted. (Caution: Once a requirement is deleted, it cannot be retrieved.)

    • Generic requirements can never be deleted. If they are not needed, they are classified as "does not apply."

    • Released information cannot be deleted.
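
The deletion rules above reduce to a few guard conditions. The function below is a hypothetical sketch (the status values are assumed): it allows deletion only for in-process, non-generic, unreleased requirements and reminds the caller that deletion is irreversible.

    def can_delete(requirement):
        """Return (allowed, reason) for a delete request on a requirement record."""
        if requirement.get("generic"):
            return False, "Generic requirements are never deleted; classify as 'does not apply'"
        if requirement.get("status") == "released":
            return False, "Released information cannot be deleted"
        if requirement.get("status") != "in-process":
            return False, "Only in-process requirements can be deleted"
        return True, "Deletion allowed (caution: a deleted requirement cannot be retrieved)"

    print(can_delete({"generic": False, "status": "in-process"}))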

  • Releasing requirements. Once the requirements have been agreed upon by the team, they must be released. Typical rules for releasing requirements are:

    • In-process requirements can be updated as necessary.

    • Released requirements allow changes only to accommodate robustness; if not, then the design should identify these items with specific identification marks. The fields that can still be changed on a released requirement are:

    • The unit of measure field on the requirement text and the target/assessment milestones.

    • The classification code, classification comments, and cascade relationship fields on the classification tab.

    • The engineer, DVP team, sequence number, part number, and DVP comments fields on the DVP information tab.

    • Only released requirements can be classified.

Special note:

It should be emphasized here that the engineer can only create and/or edit a requirement, whereas the manager has authority to release and classify a requirement or transfer ownership of a requirement to another DVP team.

Cascading requirements. At this stage of the review process the team is ready to pass on the design to the appropriate person(s), department, or process. It is imperative, then, that the team follow these typical rules to make sure that the requirements, as defined, are passed on correctly and effectively (a sketch of the three cascade options follows the list):

  • Cascade to existing. This option is used when the owner of a "from" requirement knows that a "to" requirement exists and wants to cascade, or link, to an existing lower-level "to" requirement. However, there is a caution here: using the cascade to same option causes the requirement to appear on two DVP teams. This is not recommended, as it can cause a great deal of confusion as to who is responsible for verification.

  • Cascade to same. This option is used when the owner of a "from" requirement wants to cascade the same requirement to a "to," or lower-level, DVP team. The cascaded requirement will be a released copy of the "from" requirement. The new requirement's ID number should be prefixed by the ID of the assigned SDS and be numbered as a program-specific requirement to reduce confusion if it needs to be traced.

  • Cascade to new. This option is used when the owner of a "from" requirement wants to create a new requirement and cascade it to a DVP team. The cascaded requirement will be an in-process requirement. The new requirement's ID number should be prefixed by the ID of the assigned SDS and be numbered as a program-specific requirement to avoid confusion if it needs to be traced.
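
The three options differ only in what they produce: a link to an existing lower-level requirement, a released copy on another DVP team, or a brand-new in-process requirement. The sketch below is illustrative; the requirements are plain dictionaries and the field names are assumptions.

    def cascade_to_existing(from_req, existing_to_req):
        """Link the 'from' requirement to a lower-level 'to' requirement that already exists."""
        from_req.setdefault("cascades_to", []).append(existing_to_req["id"])

    def cascade_to_same(from_req, to_team, new_id):
        """Copy the same requirement to a lower-level DVP team as a released copy."""
        copy = dict(from_req, id=new_id, team=to_team, status="released")
        return copy                      # the requirement now appears on two DVP teams

    def cascade_to_new(from_req, to_team, new_id, text):
        """Create a new, in-process requirement on a lower-level DVP team."""
        return {"id": new_id, "team": to_team, "status": "in-process",
                "text": text, "cascaded_from": from_req["id"]}

    parent = {"id": "SDS-CLOS-01-001", "team": "Vehicle DVP team", "status": "released",
              "text": "Door shall close with no more than 45 N of effort"}
    child = cascade_to_new(parent, "Closures DVP team", "SDS-CLOS-02-001",
                           "Latch shall engage at the specified closing effort")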

Requirement reports. Once the category has been identified and the classification has been defined, the appropriate requirement reports should be produced. The following reports can be generated, as required by the program (a classification-count sketch follows the list):

  • Ad hoc classifications. Reports on requirements, including details, graphics, and DVMs.

  • DVP team classifications. Prints paper version of requirements for review and classification purposes.

  • Requirement classification. Reports number of requirements and totals for each classification.

  • Classification issues. Lists the requirements that are open, unclassified, or deviate.

  • Engineering verification report. Prints your design verification plan at specific milestones as the plan is being developed.

  • Program metrics. An Excel download used to track the program's progress in recording a design verification plan.

  • DVP&R. Provides two separate reports:

    • Plan. Provides a snapshot of the current status of all applicable requirements (per DVP team or ad hoc search) for incomplete DVPs, with a focus on the detail of the plan.

    • Report. Provides a snapshot of the current status of all applicable requirements (per DVP team or ad hoc search) for assessment issues, with a focus on the detail of the status.
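
The requirement classification report above is essentially a tally of requirements per classification code. A minimal sketch, assuming requirements are plain dictionaries with a "classification" key:

    from collections import Counter

    def classification_report(requirements):
        """Count requirements per classification code and report the overall total."""
        counts = Counter(r["classification"] for r in requirements)
        counts["Total"] = len(requirements)
        return dict(counts)

    reqs = [{"classification": "Meet"}, {"classification": "Meet"},
            {"classification": "Open"}, {"classification": "Does not apply"}]
    print(classification_report(reqs))
    # {'Meet': 2, 'Open': 1, 'Does not apply': 1, 'Total': 4}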

Minimum feature configurations. A minimum feature configuration is a set of codes that defines the minimum features or systems required to verify requirements. This configuration is company- or project-dependent. These configurations are representative of different combinations of features that will be offered to the customer. The configurations are owned by the DVP team that created them and are used to determine the prototypes needed to complete the program's design verification. On the other hand, a minimum feature configuration is not:

  • Used to identify specific CAE models or products.

  • An order for a physical build.

  • A parts list.

How then do we proceed to generate these configurations? The following are some typical guidelines:

  • Identify the target configuration by:

    • Representing program-specific build features.

    • Allowing for the entry of target values.

    • Linking to requirements for DVP planning purposes.

  • Conduct a benchmark configuration by:

    • Representing your competitors' build features.

    • Allowing for entry of target values.

    • Not linking to DVMs for DVP planning purposes.

  • Ask perhaps the most important question:

    • Who defines a minimum feature configuration? There are two options:

      1. The project action team/project management team (PAT/PMT) engineers, and

      2. Anyone on a program who needs to schedule a prototype. In discussing the issue of minimum configuration, at least the following should be specified: a) a default value, and b) ad hoc (temporary) search results.

Creating a minimum feature configuration. A minimum feature configuration consists of two parts: a header record and the configuration content.

The header record contains identification information. A new minimum feature configuration record requires (a record sketch follows this list):

  • Program.

  • DVP team.

  • Prototype class.

  • Model year.

  • Product type.

  • Product line.

  • Description.
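
The header record translates directly into a small data structure. The dataclass below is a hypothetical sketch of the identification information listed above; the field names and sample values are assumptions.

    from dataclasses import dataclass

    @dataclass
    class ConfigurationHeader:
        """Identification information for a new minimum feature configuration record."""
        program: str
        dvp_team: str
        prototype_class: str
        model_year: int
        product_type: str
        product_line: str
        description: str

    header = ConfigurationHeader("Program X", "Closures DVP team", "P1", 2024,
                                 "Vehicle", "Mid-size sedan",
                                 "Base configuration for door-closing verification")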

The configuration content uses predefined codes or manually entered features. Minimum feature configurations can be copied by the owning DVP team and transferred to a requesting DVP team within the same program; this allows DVP teams to share configurations with common predefined codes, and the requesting team can modify its copy of the configuration without affecting the original (a copy-and-delete sketch follows the list below). Finally, if a specific configuration is not going to be used by a DVP team, it can be deleted. A deleted configuration:

  • Will not show up on the target matrix.

  • Will not be reported.

  • Must not have a target recorded against it.

  • Cannot be retrieved.
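
The copy, transfer, and delete rules above can be sketched as two small operations: a deep copy handed to a requesting team within the same program (so changes do not affect the original), and an irreversible delete. The dictionaries and field names are assumptions for illustration.

    import copy

    def transfer_copy(configuration, requesting_team):
        """Copy a configuration from the owning team to a requesting team in the same program."""
        if configuration["program"] != requesting_team["program"]:
            raise ValueError("Configurations can only be shared within the same program")
        duplicate = copy.deepcopy(configuration)     # independent of the original
        duplicate["owner"] = requesting_team["name"]
        return duplicate

    def delete_configuration(configurations, description):
        """Remove an unused configuration; a deleted configuration cannot be retrieved."""
        configurations[:] = [c for c in configurations if c["description"] != description]

    base = {"program": "Program X", "owner": "Closures DVP team",
            "description": "Base configuration", "features": ["power locks"]}
    shared = transfer_copy(base, {"name": "Electrical DVP team", "program": "Program X"})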

Entering target values. For any design verification system to work effectively, there are several prerequisites dealing with targets that need to be addressed. Therefore, before working with the target matrix, the generic requirements must be classified. These are requirements at appropriate levels that need to be identified, added or modified, and classified in the overall DVS. In addition, target and benchmark minimum feature configurations need to be developed and entered; they must be set at the system level and be able to be cascaded to other system or subsystem levels. By entering the appropriate and applicable target values, we are, in effect, planning for the effectiveness of our design. Therefore,

  • Target values must be linked to requirements.

  • A minimum feature configuration must be selected.

  • Target values must be either numeric or pass/fail.

  • Values may be changed until the targets are frozen, according to your program's timing.

Alignment of requirements and configurations may be used when assessments are made. (Special note: only enter targets for requirement and minimum feature configuration combinations that are absolutely necessary to sign off the specific requirement.) Appropriate and applicable risk assessment analysis must be performed. Typical risk statuses are: a) unassessed, b) comply, c) minor, d) major, e) major not evaluated, f) not acceptable, and g) not certified.
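
Entering a target value can be pictured as one validation step: the value is linked to a requirement and a minimum feature configuration, must be numeric or pass/fail, and cannot be changed once targets are frozen; the recorded entry then carries a risk status from the list above. The sketch below is illustrative only, with assumed identifiers.

    RISK_STATUSES = {"unassessed", "comply", "minor", "major",
                     "major not evaluated", "not acceptable", "not certified"}

    def enter_target(matrix, requirement_id, configuration_id, value, frozen=False):
        """Record a target for a requirement/configuration pair in the target matrix."""
        if frozen:
            raise RuntimeError("Targets are frozen per the program's timing; no changes allowed")
        if not (isinstance(value, (int, float)) or value in ("pass", "fail")):
            raise ValueError("Target values must be either numeric or pass/fail")
        matrix[(requirement_id, configuration_id)] = {"target": value, "risk": "unassessed"}

    targets = {}
    enter_target(targets, "SDS-CLOS-01-001", "CONF-BASE", 45.0)    # e.g. closing effort in N
    targets[("SDS-CLOS-01-001", "CONF-BASE")]["risk"] = "comply"   # a value from RISK_STATUSES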

Reviewing the application process

No system for any product is going to be effective unless there is continual feedback on the process. The verification process is no exception. An ongoing review is recommended to include at least the following actions:

  • Review your generic requirements either online or via on-paper reports.

  • Classify each requirement based on applicability to your program. Record the classification in the appropriate documentation log.

  • Modify or add requirements based on assumptions and targets. Record the changes in the appropriate documentation log.

  • Determine the minimum feature configurations that your DVP team will need to verify its requirements.

  • Establish targets for each of the requirements and configurations. Record the targets on the target matrix.

As the review progresses, the team should be able to consider the next steps. These are:

  • Contact your program administrator about the following:

    • Have you been set up with a user ID and password? (If you are working with a computer, this is a very important step.)

    • Has your DVP team been established?

    • Have the generic requirements been assigned to your DVP team?

    • Print the ad hoc classification report for your DVP team. (The content in DVS may continue to change. Use updated ad hoc classification reports as necessary.)

When these things have been completed, you need to start the application process. After the target recording process is completed, register for DVS.

Whereas the process evaluation is an ongoing activity throughout the verification, at the very end there should be an overall review of the verification process that should allow the following activities to take place:

  • Review generic DVMs.

  • Develop DVMs to verify requirements.

  • Plan when and how to best verify requirement-driven design.

  • Complete design verification plan for the requirement-driven designs.

  • Monitor progress of design.

  • Prepare sign-off documentation.

When everything has been reviewed and agreed upon, the team should consider the next steps to complete the design verification plan. Typical steps are:

  • Review requirement details for linked DVMs.

  • Review the generic DVMs.

  • Create and release new DVMs, if the appropriate generic does not exist.

  • Record a DVP request.

  • Maintain assessment records.

  • Generate sign-off documents.

  • Record risk assessments.

The typical tools used are:

  • A reliability and/or robustness plan.

  • A design verification plan with key noises.

  • Correlation tests with customer usage.

  • A reliability/robustness demonstration matrix.

The typical deliverables of any complete design review should include at least the following:

  • Test results, including:

    • Product performance over time.

    • Weibull test, hazard plot and so on.

    • Long-term process capabilities.

  • Completed robustness and reliability checklist (if available) with demonstration matrix.

  • Scorecard with actual values of y, σy.

  • Lessons learned, captured and documented.



