Quality Plans


Each game project should establish its own plan for how quality will be monitored and tracked during the project. This is typically documented in the Software Quality Assurance Plan (SQAP). The SQAP contains no information about testing the game. That is covered in the game's Software Test Plan. An SQAP is strictly concerned with the independent monitoring and correction of product and process quality issues. It should address the following topics, most of which are covered in more detail here:

  • QA personnel

  • Standards to be used in the product

  • Reviews and audits that will be conducted

  • QA records and reports that will be generated

  • QA problem reporting and corrective actions

  • QA tools, techniques, and methods

  • QA metrics

  • Supplier control

  • QA records collection, maintenance, and retention

  • QA training required

  • QA risks

The book's CD provides a link to an SQAP template document from Teraquest (www.teraquest.com) that follows this outline.

QA Personnel

Begin this section by describing the organizational structure of the QA team. Show who the front-line QA engineers work for and who the head of QA reports to. Identify at which level the QA reporting chain is independent from the person in charge of the game development staff. This helps establish a path for escalating QA issues and identifies which key relationships should be nurtured and maintained during the project. A good rapport between the QA manager and the development director will have a positive effect on both the QA staff and the development staff.

Describe the primary role of each person on the QA team for this project. List what kinds of activities each of them will be involved in. Be as specific as possible. If a person is going to be responsible for auditing the user interface screens against the company's UI standards, then say that. If another person is going to take samples of code and check them with a static code analysis tool, then say that. Use a list or a table to record this information.

Strictly speaking, QA and testing are separate, distinct functions. QA is more concerned with auditing, tracking, and reporting, whereas testing is about the development and execution of tests in the relentless pursuit of finding operational defects in the game. However, depending on the size and skills of your game project team, you may not have separate QA and test teams. It's still best to keep those two plans separate even if some or all of the same people are involved in both kinds of work.

Standards

Two types of standards should be addressed in this section: product standards and process standards. Product standards apply to the function of things that are produced as part of the game project. This includes code, graphics, printed materials, and so on. Process standards apply to the way things are produced. This includes file naming standards, code formatting standards, and maintenance of evolving project documents such as the technical design document. Document all of the standards that apply as well as which items they apply to. Then describe how the QA staff will monitor them and follow up on any discrepancies.
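Many process standards can be checked mechanically rather than by eye. As an illustration only, the sketch below audits file names against a hypothetical naming standard (lowercase letters, digits, and underscores, plus an approved extension); the pattern and the extension list are assumptions for the example, not a standard from this book.

```python
import re

# Hypothetical naming standard: lowercase letters, digits, and underscores,
# ending in one of the project's approved extensions. Adjust the pattern
# to match your own project's documented standard.
NAME_PATTERN = re.compile(r"^[a-z0-9_]+\.(?:png|wav|lua|cpp|h)$")

def audit_filenames(filenames):
    """Return the file names that violate the naming standard."""
    return [name for name in filenames if not NAME_PATTERN.match(name)]
```

A QA engineer could run such a check against the project's asset and source directories on a regular schedule and file a discrepancy for each violation returned.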

Reviews and Audits

The kinds of reviews performed by QA are not the same as those developers or testers would do for code or test designs. A QA review is usually done by a single QA engineer who evaluates a work product or ongoing process against some kind of reference such as a checklist or standard. QA reviews and audits span all phases and groups within the game project.

Project documents, project plans, code, tests, test results, designs, and user documentation are all candidates for QA review. QA should also audit work procedures used by the team. These can include the code inspection process, file backup procedures, and the use of tools to measure game performance over a network.

Reviews and audits can be performed on the results of the process, such as checking that all required fields in a form are filled in with the right type of data and that required signatures have been obtained. Another way to audit is to observe the process in action. This is a good way to audit peer reviews, testing procedures, and weekly backups. Procedures that occur very infrequently, such as restoring project files from backup, can be initiated by QA to make sure that the capability is available when it is needed.

QA itself should be subject to independent review (Rule 2). If you have multiple game projects going on, each project's QA team can review the work of the other in order to provide feedback and suggestions to ensure that they are doing what they documented in the SQAP. If no other QA team exists, you could have someone from another function such as testing, art, or development use a checklist to review your QA work.

The QA activities identified in this section of the SQAP should be placed on a schedule to ensure that the QA people will have the time to do all of the activities they are signed up for. These activities should also be coordinated with the overall project schedule and milestones so you can count on the work products or activities that are being audited to be available at the time you are planning to audit them.

As part of being a good citizen, planned QA activities that will disrupt other people's work, such as restoring backups or sitting down with someone to review a month's worth of TDD updates, should be incorporated into the overall project schedule so the people affected will be able to set aside the appropriate amount of time for preparing and participating in the audit or review. This is not necessary for activities such as sitting in on a code review because the code review was going to take place whether or not you were there.

Feedback and Reports

The SQAP should document what kinds of reports will be generated by SQA activities and how they will be communicated. Reporting should also include the progress and status of SQA activities against the plan. These get recorded in the SQAP along with how frequently the QA team's results will be reported and in what fashion. Items that require frequent attention should be reported on regularly. Infrequent audits and reviews can be summarized at longer intervals. For example, the QA team might produce weekly reports on test result audits, but produce quarterly reports on backup and restoration procedure audits. Test result audits would begin shortly after testing starts and continue through the remainder of the project. Backup and restoration audits could start earlier, once development begins.

SQA reporting can be formal or informal. Some reports can be sent to the team via email, while others may aggregate into quarterly results for presentation to company management at a quarterly project quality review meeting.

Problem Reporting and Corrective Action

SQA is not simply done for the satisfaction of the QA engineers. The point of SQA is to provide a feedback loop to the project team so that they are more conscientious about the importance of doing things the right way. This includes keeping important records and documents complete and up to date. It's up to QA to guide the team or the game company in determining which procedures and work products benefit the most from this compliance. Once an SQA activity finds something to be non-compliant, a problem report is generated.

Problem reports can be very similar to the bug reports you write when testing finds a defect in the software. They should identify which organization or individual will be responsible and describe a timeframe for resolving the issue. The SQAP should define what data and statistics on non-compliant issues should be reported, as well as how and when they are to be reviewed with the project team.

History has shown, unfortunately, that some project members might be more reluctant to spend time closing SQA problems because they have their "real job" to do: development, testing, artwork, and so on. As a consequence, it's a good idea to define the criteria and process for escalating unresolved issues. Similarly, there should be a defined way for resolving issues with products that can't be fixed within the game team, such as software tools or user manuals.
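One way to make escalation criteria concrete is to encode them as a simple age threshold on open problem reports. The record fields and the 14-day threshold below are hypothetical, shown only to illustrate the idea:

```python
from dataclasses import dataclass

@dataclass
class ProblemReport:
    report_id: int
    owner: str          # organization or individual responsible for resolution
    days_open: int
    resolved: bool = False

# Hypothetical escalation rule: unresolved QA problems open longer than
# 14 days are escalated up the independent QA reporting chain.
def to_escalate(reports, max_days_open=14):
    """Return the open reports that have exceeded the age threshold."""
    return [r for r in reports
            if not r.resolved and r.days_open > max_days_open]
```

Running such a query before each status report gives the QA team a consistent, documented trigger for escalation rather than an ad hoc judgment call.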

In addition to addressing compliance issues one at a time, SQA should also look for the causes of negative trends or patterns and suggest ways to reverse them. This includes process issues such as schedule slippages and product issues such as game asset memory requirements going over budget. The SQAP should document how the QA team will detect and treat the causes of such problems.

Tools, Techniques, and Methods

Just like development and testing, the QA team can benefit from tools. Since QA project planning and tracking needs to be coordinated with the rest of the project, it's best if they use the same project management tools as the rest of the game team. Likewise, tracking issues found in QA audits and reviews should be done under the same system used for code and test defects. Different templates or schemas might be needed for QA issue entry and processing, but this will keep the team software licensing and operation costs down and make it easy for the rest of the team to access and update QA issues.

Some statistical methods might be useful for QA analysis of project and process results. Many of these methods are supported by tools. Such tools and methods should be identified in the SQAP. For example, Pareto Charts graph a list of results in descending order. The bars furthest on the left are the most frequently occurring items. These are the issues you should spend your time on first. If you are successful at fixing them, the numbers will go down and other issues will replace them on the left of the chart. You can go on forever addressing the issue at the left of the chart because there will always be one. This is kind of like trying to clean out your garage. At some point in time, you can decide the results are "good enough" and move on to some entirely different result to improve.

Figure 6.7 shows an example Pareto Chart of the number of defects found per thousand lines of code (KLOC) in each major game subsystem. The purpose of such a chart could be to identify which portion of the code would benefit the most from using a new automated checking tool. Because there are costs associated with new technologies (purchasing, training, extra effort to use the tool, and so on), it should be introduced where it would have the greatest impact. In this case, start with the rendering code.


Figure 6.7: Pareto Chart of defects per KLOC for each game subsystem.
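The ranking behind a Pareto Chart is simply a descending sort of result categories. The subsystem names and defect densities below are invented for illustration; they are not the data behind Figure 6.7:

```python
# Defects per KLOC by subsystem (invented numbers for illustration).
defects_per_kloc = {
    "rendering": 12.4,
    "physics": 7.1,
    "audio": 3.2,
    "networking": 5.8,
    "ui": 2.0,
}

def pareto_order(counts):
    """Sort categories in descending order of frequency (Pareto order)."""
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

ranked = pareto_order(defects_per_kloc)
# ranked[0] is the leftmost bar on the chart: the category where a new
# tool or process change would have the greatest impact.
```

With real project data, regenerating this ranking after each improvement shows whether the leading issue has dropped and which issue has replaced it at the left of the chart.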

Another useful software QA method is to plot control charts of product or process results. The control chart shows the average result to expect and "control limit" boundary lines for the set of data provided. Any items outside of the control limits fall beyond the range of values that would indicate they came from the same process as the rest of the data. This is like having a machine that stamps metal squares a certain way, but every once in a while, one comes out very different from the others. If you have the right amount of curiosity to be a QA person, you would want to know why the square comes out wrong some of the time. The same is true for software results that come out "funny." The control chart reveals results that should be investigated to understand their cause. It might simply be a result of someone entering the wrong data (date, time, size, defects, and so on). Figure 6.8 shows an example control chart of the delta (added or deleted) lines of code changed in the game each week. The numbers are in KLOC.


Figure 6.8: Control chart of weekly code change in KLOC.

The solid line running across the middle of the chart is the average value for the data set. The two dashed lines labeled UCL and LCL represent the Upper Control Limit and the Lower Control Limit, respectively. These values are calculated from the data set as well. The data point for the week of 5/2/2004 lies above the UCL. This is a point that should be investigated.
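The mean and control limits can be computed directly from the data set. The sketch below uses the standard individuals (XmR) chart formulas, where the limits sit 2.66 times the average moving range above and below the mean; the weekly KLOC figures are invented for illustration and are not the data behind Figure 6.8:

```python
# Weekly delta code change in KLOC (invented data; week 5 is an outlier).
weekly_kloc = [4.1, 3.8, 4.5, 3.9, 4.2, 15.0, 4.0, 4.3]

def control_limits(data):
    """Return (mean, UCL, LCL) using the XmR individuals-chart constant 2.66."""
    mean = sum(data) / len(data)
    # Moving range: absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = mean + 2.66 * mr_bar
    lcl = max(0.0, mean - 2.66 * mr_bar)  # a KLOC count cannot be negative
    return mean, ucl, lcl

def out_of_control(data):
    """Return (index, value) pairs that fall outside the control limits."""
    _, ucl, lcl = control_limits(data)
    return [(i, x) for i, x in enumerate(data) if x > ucl or x < lcl]
```

The points returned by `out_of_control` are the ones worth investigating, like the week of 5/2/2004 lying above the UCL in Figure 6.8.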

Note:

The Pareto Chart and control chart in Figures 6.7 and 6.8, respectively, were created using SPC for Excel (www.spcforexcel.com). A link to a demo version is provided on the book's CD-ROM.

I remember one project where there was a noticeable dip in the number of defects submitted one week. This was a good result for the developers but bad for the testers. A quick investigation revealed that "Bud" (a particularly productive tester) had been on vacation that week. The test data for the rest of the team was within the normal range. Legitimately bad results should be understood and subsequently prevented from happening in the future. Especially good results are just as important to understand so they can be imitated. Additional tools and techniques can be identified in the SQAP for those purposes. This result also suggests that the data could be reported in a different way, such as defects per tester, to account for inevitable fluctuations in staffing. This could replace the original chart or be used in addition to it.

Supplier Control

Your game is not just software. It's a customer experience. The advertisements in the store, the game packaging, the user's manual, and the game media are all part of that experience. In many cases these items come from sources outside the game team. These are some of your "suppliers." Their work is subject to the same kinds of mistakes you are capable of producing on your own. You may also have software or game assets supplied to you that you use within the game, such as game engines, middleware, art, and audio files.

In both of these cases, QA should play a role in determining that the supplied items are "fit for use." This can be done in the same way internal deliverables are evaluated. Additionally, the QA team can evaluate the supplier's capability to deliver a quality product by conducting on-site visits to evaluate the supplier's processes. When you go to the deli, it's nice to see that the food is laid out nicely in the display case. You also appreciate the fact that a food inspector has checked out the plant from which the food originates to see that it is uncontaminated, and that the food is produced in a clean and healthy environment. The same should be true for game-related software and materials that are supplied to you from other companies.

Training

If new tools, techniques, and/or equipment are going to be used in the development of the project, it may be necessary for one or more QA personnel to become acquainted with them so they can properly audit the affected deliverables and activities. The new technologies may affect QA preparation as well, such as requiring new audit checklists to be created or new record types to be defined in the audit entry and reporting system.

The QA training should be planned and delivered in time for QA to conduct any activities related to work products or processes using the new technology. If the team is already having an in-house course delivered, then add some seats for QA. If the team is inventing something internally, try to get a briefing from one of the inventors. Some tools and development environments come with their own tutorials, so get some QA licenses and allocate time to go through the tutorial.

New tools or techniques identified for QA-specific functions should be accompanied with appropriate training. Identify these, document them in the SQAP, and get your training funded.

Risk Management

Risk management is a science all unto itself. In addition to all of the risks involved with developing a game, there are also risks that could hamper your team's QA efforts. Some typical SQA risks are:

  • Project deliverables go out of sync with planned audits

  • QA personnel diverted to other activities such as testing

  • Lack of independent QA reporting structure

  • Lack of organization commitment to take corrective actions and/or close out issues raised by QA

  • Insufficient funding for new QA technologies

  • Insufficient funding for training in new development and/or QA technologies

It's not enough to list your risks in the SQAP. You also need to identify the potential impact of each risk and devise action plans describing how you would proceed if the risk occurs and/or persists.




Game Testing All in One (Game Development Series)
ISBN: 1592003737
Year: 2005
Pages: 205
