A Project Assessment Questionnaire

There are no right or wrong answers to these questions; their purpose is to ascertain an accurate view of the past and current development processes, methodologies, and practices used for your project.

  A. Project and Development Team Information

    1. Name of project: __________________________________________________

      1. Please provide a brief description of the project and product.
      2. Names and roles of respondent(s) to this questionnaire:
    2. Size of project (lines of code, function points, or other units):

      VA Java code? _____

      DB-related code? _____

      Other (C, C++, etc.)? _____

    3. Delivery dates for key functions (or target delivery dates), including original dates and any reset dates: _________________________________________________
    4. Current stage of the project if not already shipped (e.g., functional test almost complete, in final test phase, product/release in beta, etc.):
      1. Does/did the project involve cross-site or cross-lab development?
      2. If yes, what site(s) and lab(s)?
      3. Is there a cross-lab development process available?
      4. At what organizational level is cross-site/cross-team development implemented (e.g., 1st-line level, 2nd-line/functional level, etc.)?
      1. Did the design point of the project serve to satisfy multiple users or constituencies?
      2. Was the project implemented on an open/common platform (e.g., Intel, PowerPC, Linux, Windows, FreeBSD)?

        Please specify:

    5. On a scale of 1 to 10 (10 being the most complex), how would you rate the complexity of the project based on your experience and knowledge of similar types of software projects?
    6. Development cycle time (equate ship date with final delivery in an iterative model):

      1. From design start to ship: _____ months
      2. From design start to bring-up: _____ months
      3. From bring-up to code integration complete (all coding done): _____ months
      4. From code integration complete to internal customer use of the product (all development tests complete): _____ months
      5. From development test complete to GA: _____ months
    7. Development team information (please provide estimates if exact numbers are not available):

      1. Total size of the team for the entire project: _____
      2. Number of VA Java programmers spending 100% of time on project: _____
      3. Number of VA Java programmers spending less than 100% of time on project: _____
      4. Number of database programmers spending 100% of time on project: _____
      5. Number of database programmers spending less than 100% of time on project: _____
      6. Number of other programmers _____ (specify skills)
      7. Distribution of team members by education background (percent):

        Computer science _____ %

        Computer engineering _____ %

        Others (please specify) _____ %

        _________________ _____ %

        Total 100.0% (N = total number of members)

      8. Approximate annual turnover rate of team members: _____
    8. How would you describe the skills and experience levels of this team (e.g., years with tools experience, very experienced team, large percent of new hires, etc.)?

      1. Are there sufficient skilled technical leaders/developers in the organization to lead and support the whole team?
      2. If possible, please give percent distribution estimates with regard to years of industry software development experience:

        < 2 years _____ %

        2-5 years _____ %

        > 5 years _____ %

        TOTAL 100.0%

  B. Requirements and Specifications

    1. To what extent did the development team review the requirements before they were incorporated into the project? (Please mark the appropriate response for each item.)

       (Scale: Always / Usually / Sometimes / Seldom / Never)

       Functional requirements: __________
       Performance requirements: __________
       Reliability/availability/serviceability (RAS) requirements: __________
       Usability requirements: __________
       Web Publishing/ID requirements: __________
      1. Per your experience and assessment, how important is this practice (requirements review) to the success of tools projects? (Please mark the appropriate response for each item.)

         (Scale: Very Important / Important / Somewhat Important / Not Sure)

         Functional requirements: __________
         Performance requirements: __________
         Reliability/availability/serviceability (RAS) requirements: __________
         Usability requirements: __________
         Web Publishing/ID requirements: __________
    2. Specifications were developed based on the requirements and used as the basis for project planning, design and development, testing, and related activities.

      1. Always
      2. Usually
      3. Sometimes
      4. Seldom
      5. Never

        1. Per your experience and assessment, how important is this practice (specifications and requirements to guide overall project implementation) to the success of this project?

          1. Very important
          2. Important
          3. Somewhat important
          4. Not sure
        2. If your assessment of the above is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    3. How did your project deal with (a) late requirements and (b) requirements changes? Please elaborate.

Project Strengths and Weaknesses with Regard to Section B

(B1) Is there any practice(s) by your project with regard to requirements and specifications that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(B2) If you were to do this project all over again, what would you do differently with regard to requirements and specifications, and why?

  C. Design, Code and Reviews/Inspections

    1. To what extent did the design work of the project take the following into account? (Please mark the appropriate response for each item.)

       (Scale: Largest Extent Possible / Important Consideration / Sometimes / Seldom / Don't Know)

       (a) Design for extensibility: __________
       (b) Design for performance: __________
       (c) Design for reliability/availability/serviceability (RAS): __________
       (d) Design for usability: __________
       (e) Design for debugability: __________
       (f) Design for maintainability: __________
       (g) Design for testability: __________
       (h) Design with modularity (component structure) to allow for component ownership and future enhancements: __________
    2. Was there an overall high-level design document in place for the project, serving as overall guidelines for implementation and for common understanding across teams and individuals?

      a. Yes

      b. No

      1. Per your experience and assessment, how important is this practice (overall design document) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
    3. To what extent were design reviews of the project conducted? (Please mark the appropriate response for each item.)

       (Scale: All Design Work Done Rigorously / All Major Pieces of Design Items / Selected Items Based on Criteria (e.g., Error Recovery) / Design Reviews Were Occasionally Done / Not Done)

       Original design: __________
       Design changes/rework: __________
      1. Per your experience and assessment, how important is this practice (design review/verification) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      2. If your assessment in question 16(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    4. What is the most common form of design reviews for this project?

      1. Formal review meeting with moderators, reviewers/inspectors, and defect tracking; issue resolution and rework completion as part of the completion criteria
      2. Formal review but issue resolution is up to the owner
      3. Informal review by experts of related areas
      4. Codeveloper (codesigner) informal review
      5. Other (please specify).
    5. In your development process, are there entry/exit criteria for major development phases?

      1. If yes to question 18, is the review process related to the entry/exit criteria of process phases (e.g., is the successful completion of design reviews part of exit criteria of the design phase)?
      2. If yes to question 18a, how effectively are the criteria followed/enforced?

        1. Very effectively
        2. Effectively
        3. Somewhat effectively
        4. Not effectively
      3. If yes to question 18, if entry/exit criteria were not met, what did you do?
      4. Per your experience and assessment, how important is this practice (successful design review as part of exit criteria for the design phase) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      5. If your assessment in question 18d is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    6. Were any coding standards used?

      If yes, please briefly describe.

    7. To what extent did the code implementation of the project take the following factors into account? (Please mark the appropriate response for each item.)

       (Scale: Largest Extent Possible / Important Consideration / Sometimes / Seldom / Don't Know)

       Code for extensibility: __________
       Code for performance: __________
       Code for debugability: __________
       Code for reliability/availability/serviceability (RAS): __________
       Code for usability: __________
       Code for maintainability: __________
    8. To what extent were code reviews/inspections conducted? (Please mark the appropriate response for each item.)

       (Scale: Rigorously, 100% of the Code / Major Pieces of Code / Selected Items Based on Criteria (e.g., Error Recovery Code) / Occasionally Done / Not Done)

       Original code implementation: __________
       After significant rework/changes: __________
       Final (or near final) code implementation: __________
      1. Per your experience and assessment, how important is this practice (code reviews and inspections) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      2. If your assessment in question 21(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?

Project Strengths and Weaknesses with Regard to Section C

(C1) Is there any practice(s) by your project with regard to design, code, and reviews/inspections that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(C2) If you were to do this project all over again, what would you do differently with regard to design, code, and reviews/inspections, and why?

  D. Code Integration and Driver Build

    1. Was code integration dependency (e.g., with client software, with database, with information development, with other software, with other organizations or even with other sites) a concern for this project?

      a. Yes

      b. No

      1. If yes to question 22, please briefly describe how such dependencies were managed from a code integration/driver build point of view for this project and what (tools, process, etc.) was used.
      2. Per your experience and assessment, how important is this practice (code integration dependency management) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      3. If your assessment in question 22b is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    2. With regard to the integration and build process, how do you control part integration?

      1. In a cross-site development environment, how is the part integration handled from an organizational point of view? Is there an owning organization responsible for part integration?
      2. If yes to question 23(a), how is the development group involved in the integration/bring-up task?
    3. Please briefly describe your process, if any, in enhancing code integration quality and driver stability.

      1. Per your experience and assessment, how important is this practice (code integration control, action/process on integration quality and driver stability) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      2. If your assessment in question 24(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    4. What is your driver build cycle (e.g., daily, weekly, biweekly, monthly, flexible/build-when-ready, etc.)? Please provide your observations on your build cycle as it relates to your project progress (schedule and quality). If it varied throughout the project, please describe how this was handled through the different phases (e.g., early function delivery and bring-up vs. fix-only mode).

Project Strengths and Weaknesses with Regard to Section D

(D1) Is there any practice(s) by your project with regard to code integration and driver build that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(D2) If you were to do this project all over again, what would you do differently with regard to code integration and driver build, and why?

  E. Test

    1. Was there a test plan in place for this project at the functional (development test) and overall project level (including independent test team)? Who initiated the test plan? (Please fill in each blank.)

       Development Test:  Test plan in place (Yes/No) _____  Who initiated _____  Who executed _____
       Overall Project:   Test plan in place (Yes/No) _____  Who initiated _____  Who executed _____
    2. What types of test/test phases (unit, simulation test, functional, regression, independent test group, etc.) were conducted for this project? Please specify and give a brief explanation of each.

      1. Please elaborate on your error recovery or "bad path" testing.
      2. Please elaborate on your regression testing.
    3. Was test coverage/code coverage measurement implemented?

      If yes, for which test(s), and who does it?

    4. Are entry/exit criteria used for the major test phases/types?

      If yes,

      (a) please provide a brief description.

      (b) How are the criteria used or enforced?

      1. Per your experience and assessment, how important is this practice (entry/exit criteria for major tests) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      2. If your assessment in question 29a is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    5. Is there a change control process in place for integrating fixes?

      a. Yes (please briefly describe)

      b. No

      1. If yes, how effectively in your assessment is the process being implemented?

        1. Very effectively
        2. Effectively
        3. Somewhat effectively
        4. Not effectively
      2. Per your experience and assessment, how important is this practice (change control for defect fixes) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      3. If your assessment in question 30b is "very important" or "important" and your project's actual practice/effectiveness didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?

Project Strengths and Weaknesses with Regard to Section E

(E1) Is there any practice(s) by your project with regard to testing that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(E2) If you were to do this project all over again, what would you do differently with regard to testing, and why?

  F. Project Management

    1. Was there a dedicated project manager for this project?

      1. How would you describe the role of project management for this project?

        1. Project management was basically done by line management.
        2. There was a project coordinator, coordinating activities and reporting status across development teams and line managers.
        3. There was a project manager, but major project decisions were driven by line management.
        4. The project manager, together with line management, was responsible for the success of the project. The project manager drove progress (e.g., dependency, schedule, quality) of the project and improvements across teams and line management areas.
        5. Other (please specify/describe).
      2. Per your experience and assessment, how important is this practice (effective role of project management) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      3. If your assessment in question 31(b) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    2. How were sizing estimates of the project (specifically the amount of design and development work) derived?
    3. How was the development schedule developed for this project? Please provide a brief statement (e.g., top-down [GA date mandated], bottom-up, bottom-up and top-down converged with proper experience and history, based on sizing estimates, etc.).

      1. Per your experience and assessment, how important is this practice (effective sizing and schedule development process based on skills and experiences) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      2. If your assessment in question 33(a) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    4. Was a staged delivery/code drop plan developed early, based on priorities and dependencies, and then executed?

      a. Yes, please briefly describe.

      b. No, please briefly describe.

      1. Per your experience and assessment, how important is this practice (good staging and code drop plan) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      2. If your assessment in question 34a is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    5. Does this project have to satisfy multiple constituents or diverse users?

      If yes,

      1. How was work prioritized?
      2. How was workload distribution determined?
      3. How was conflict resolved?
    6. If this is a cross-site development project (Question 5), please briefly describe how cross-site dependency was managed.
    7. Under what level of management were major dependencies for deliverables managed?

      1. Under the same development manager
      2. Under the same functional manager
      3. Across functional areas but under the same development directors
      4. Coordination across development directors
      5. Under the same project executive
      6. Other (please describe).
    8. What were the major obstacles, if any, to effective team communications for your project?
    9. Were major checkpoint reviews conducted at various stages of the project throughout the development cycle?

      a. Yes

      b. No

      1. If yes to question 39, please describe the major checkpoint review deliverables.
      2. If yes to question 39, how effective in your view were those checkpoint reviews? Please briefly explain.

        1. Very effective
        2. Effective
        3. Somewhat effective
        4. Not effective
      3. Per your experience and assessment, how important is this practice (effective checkpoint process) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      4. If your assessment in question 39(c) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?

Project Strengths and Weaknesses with Regard to Section F

(F1) Is there any practice(s) by your project with regard to project management that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(F2) If you were to do this project all over again, what would you do differently with regard to project management, and why?

  G. Metrics, Measurements, Analysis

    1. Were any in-process metrics used to manage the progress (schedule and quality) of the project (e.g., function delivery tracking, problem backlog tracking, test plan execution, etc.)?

      a. Yes

      b. No

      1. If yes, please specify/describe where applicable.

        1. Metric(s) used at the front end of the development cycle (i.e., up to code integration)
        2. Metric(s) used for driver stability
        3. Metric(s) used during testing with targets/baselines for comparisons
        4. Others (simulation measurement, test coverage/code coverage measurement, etc.). Please specify.
      2. Per your experience and assessment, how important is this practice (good metrics for schedule and quality management) to the success of this project?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
      3. If your assessment in question 40(b) is "very important" or "important" and your project's actual practice didn't match the level of importance, what were the reasons for the disparity (e.g., obstacles, constraints, process, culture, experiences, etc.)?
    2. Was there any defect cause analysis (e.g., problem components, Pareto analysis) that resulted in improvement/corrective actions during the development of the project?

      If yes, please describe briefly.

Project Strengths and Weaknesses with Regard to Section G

(G1) Is there any practice(s) by your project with regard to metrics, measurements, and analysis you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(G2) If you were to do this project all over again, what would you do differently with regard to metrics, measurements, and analysis, and why?

  H. Development Environment/Library

    1. Please name and describe briefly your development environment/platform(s) and source code library system(s).
    2. To what extent was the entire team familiar with the operational, build, and support environment?
    3. Was your current development environment or any part of it a hindrance in any way? What changes might enhance the development process for quality, efficiency, or ease-of-use? Please provide specifics.

Project Strengths and Weaknesses with Regard to Section H

(H1) Is there any practice(s) by your project with regard to development environment/library system that you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(H2) If there is any development environment/library system that per your assessment is the best for tools development, please describe and explain.

(H3) If you were to do this project all over again, what would you do differently with regard to development environment/library system, and why?

  I. Tools/Methodologies

    1. In what language(s) was the code for the project written?
    2. Was the project developed with

      1. object-oriented methodology?
      2. procedural methods?
    3. Are multiple environments required in order to fully test the project? If so, please describe.
    4. Are any kind of simulation test environments available? Please describe.

      1. If yes to question 48, how important is this to the success of tools projects?

        1. Very important
        2. Important
        3. Somewhat important
        4. Not sure
    5. Please describe briefly any tools that were used for each of the following areas:

      1. Design
      2. Debug
      3. Test: code coverage
      4. Test: automation/stress
      5. Other (please explain).
    6. What was the learning curve of the development team to become proficient in using the above tools and the development environment/library discussed earlier? Please provide information if any specific education is needed.

Project Strengths and Weaknesses with Regard to Section I

(I1) Is there any practice(s) by your project with regard to tools and methodologies you consider a strength and that should be considered for implementation by other projects? If so, please describe and explain.

(I2) If there are any tools and methodologies that, per your assessment, are best for projects similar to this one, please describe and explain.

(I3) If you were to do this project all over again, what would you do differently with regard to tools and methodologies?

  J. Project Outcome Assessment

    1. Please provide a candid assessment of the schedule achievement (vs. original schedule) of the project. Please provide any pertinent information as appropriate (e.g., adherence to original schedule, meeting/not meeting GA date, meeting/not meeting interim checkpoints, any schedule reset, any function cutback/increase, unrealistic schedule to begin with, etc.).
    2. Please provide a candid assessment of the quality outcome of the project. Please provide any pertinent information as appropriate (e.g., in-process indicators, test defect volumes/rates, field quality indicators, customer feedback, customer satisfaction measurements, customer critical situations, any existing analysis and presentations); please attach relevant files or documents.
    3. How would you rate the overall success of the project (schedule, quality, costs, meeting commitments, etc.)?

      1. Very successful
      2. Successful
      3. Somewhat successful
      4. Not satisfactory
  K. Comments

    Please provide any comments, observations, insights with regard to your project specifically or tools projects in general.


Metrics and Models in Software Quality Engineering (2nd Edition)
ISBN: 0201729156