5.6 Post-implementation Monitoring

It is important to continue monitoring databases after remedies have been implemented. This serves two distinct purposes: validating the improvement effort and checking for the emergence of new problems.

Validating Changes

Measuring the quality of data before and after changes accomplishes two things: it validates that the changes have had a positive impact, and it quantifies the value delivered to the business. The next chapter covers the factors considered in justifying data quality assurance functions. It shows that the effects of changes cannot be predicted with certainty in advance; the best indicator of the potential value of an investigation is the value returned from other, similar investigations. This is why measurement must be repeated after the changes are made.

Also remember that the real impact will never be known for sure. If an inaccuracy rate of 5% was found before changes and only 0.5% after changes, the logical conclusion is that an impact has been made. However, the real but unknown rates may have been 7% before and 1% after. Either way the impact is sizeable, even though its true statistical size is not known. This underscores the need to treat metrics not as absolutes but as indicators of the direction and relative size of impacts.
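
To make the point concrete, here is a minimal sketch (the sample sizes and error counts are hypothetical) that computes the observed inaccuracy rate before and after a change along with a simple normal-approximation confidence interval, showing why the measured rates are best read as indicators rather than absolutes:

```python
import math

def rate_with_interval(errors: int, sample_size: int, z: float = 1.96):
    """Observed error rate plus a normal-approximation 95% confidence interval."""
    p = errors / sample_size
    half_width = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Hypothetical counts: 5% inaccurate before the change, 0.5% after.
before = rate_with_interval(errors=50, sample_size=1000)
after = rate_with_interval(errors=5, sample_size=1000)

print(f"before: {before[0]:.3f}  (95% CI {before[1]:.3f} - {before[2]:.3f})")
print(f"after:  {after[0]:.3f}  (95% CI {after[1]:.3f} - {after[2]:.3f})")
# The two intervals do not overlap, so the direction and rough size of the
# improvement are credible even though the true rates remain unknown.
```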

If post-change monitoring does not show a significant difference, either the analysis of causes or the implementation of remedies was not successful. The issues team needs to circle back and rethink what it has done.

Sometimes changes have an unintended negative impact. For example, performance may be severely degraded by the extra error checking, or the number of rejected transactions may become too high. The trade-off is between "Do I let incomplete and inaccurate information get through in order to get the transactions processed?" and "Do I insist on perfectly accurate information before any transaction can complete?" There is no inherent need to compromise transaction processing to obtain accurate information, although it may take a lot of work and innovative process design to achieve both. The first attempts may not prove to be the optimal solution, and additional attempts may need to be made.

Impacts on the business process should also be observed and documented in the issue management system. These may be positive or negative. Often streamlining the business processes to obtain higher-quality data leads to other savings as well.

Continuous Checking

All information systems should be instrumented to provide ongoing monitoring of data quality parameters. Most older systems have few or no monitoring functions built into them. Monitoring functions should be retrofitted into such systems when important quality issues are addressed, and they should be included when new applications are developed or major renovations are made to existing ones. Monitoring can include a number of things: feedback on rejected transactions, periodic execution of rule sets over databases, and periodic thorough data profiling.
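
As an illustration of the second of these, here is a minimal sketch (the table names and rules are hypothetical, and sqlite3 stands in for whatever database the application actually uses) that runs a small rule set as violation-counting queries and records the results so they can be trended over time:

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical rule set: each rule is a name plus a query that counts violations.
RULES = {
    "missing_customer_name": "SELECT COUNT(*) FROM orders "
                             "WHERE customer_name IS NULL OR customer_name = ''",
    "negative_quantity":     "SELECT COUNT(*) FROM orders WHERE quantity < 0",
    "orphaned_order":        "SELECT COUNT(*) FROM orders o "
                             "LEFT JOIN customers c ON o.customer_id = c.id "
                             "WHERE c.id IS NULL",
}

def run_rule_set(conn: sqlite3.Connection) -> None:
    """Execute each rule and append its violation count to a monitoring log table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dq_monitor (run_at TEXT, rule TEXT, violations INTEGER)"
    )
    run_at = datetime.now(timezone.utc).isoformat()
    for name, query in RULES.items():
        violations = conn.execute(query).fetchone()[0]
        conn.execute(
            "INSERT INTO dq_monitor (run_at, rule, violations) VALUES (?, ?, ?)",
            (run_at, name, violations),
        )
    conn.commit()

# A scheduler (cron, or the application's own job runner) would call
# run_rule_set(sqlite3.connect("app.db")) on whatever cycle the issue warrants.
```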

Feedback on rejected transactions is important because excessive rejection indicates poor business process design. It is easy to accomplish this, but it is rarely done. Indications of the quantity of rejects, the point of rejection, and the data elements causing the rejection provide valuable information to data quality assurance staff, application developers, and business process designers.

An example of this is to design an Internet order form such that every time the user has a SUBMIT function denied because of checking errors, a quality packet is built and sent to the application server indicating the errors found. The alternative is to wait for a correct form completion and send only that. The latter approach provides no feedback that could lead to better form design and less frustrated customers.
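
A minimal sketch of the first approach follows, written in Python to stand in for whatever layer runs the validation; the /dq/rejections endpoint and the field names are hypothetical. The point is that every denied SUBMIT produces a small packet naming the fields that caused the rejection, rather than the evidence being discarded:

```python
import json
import urllib.request
from datetime import datetime, timezone

def send_quality_packet(form_name: str, field_errors: dict[str, str]) -> None:
    """Report a rejected submission: which form, which fields failed, and why.

    The endpoint URL is hypothetical; the idea is that every denied SUBMIT
    generates one of these packets instead of leaving no trace."""
    packet = {
        "form": form_name,
        "rejected_at": datetime.now(timezone.utc).isoformat(),
        "rejection_point": "form_validation",
        "field_errors": field_errors,           # e.g. {"zip_code": "not 5 digits"}
    }
    request = urllib.request.Request(
        "https://example.com/dq/rejections",     # hypothetical collection endpoint
        data=json.dumps(packet).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(request, timeout=5)

# Called each time validation denies the SUBMIT, for example:
# send_quality_packet("internet_order", {"zip_code": "not 5 digits", "email": "missing @"})
```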

Continuous monitoring tends to become decoupled from issue tracking. This is because the monitoring mechanisms become more global in nature and encompass information relevant to many issues. At the least, the issue tracking system should identify the specific additions to the monitoring functions made as a result of each issue.
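
One lightweight way to preserve that link, sketched with hypothetical names, is to record the originating issue identifier alongside each monitoring rule so the issue entry and the monitoring function can always be cross-referenced:

```python
from dataclasses import dataclass

@dataclass
class MonitoringRule:
    """A monitoring rule tagged with the issue that caused it to be added."""
    name: str
    query: str                  # the check itself, e.g. a SQL violation count
    originating_issue: str      # identifier in the issue tracking system

# Hypothetical example: a rule added while remediating issue DQ-142.
rule = MonitoringRule(
    name="missing_customer_name",
    query="SELECT COUNT(*) FROM orders WHERE customer_name IS NULL",
    originating_issue="DQ-142",
)
```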


