The electronics test lab received two main types of test requests: production tests and ad-hoc engineering tests. The bulk of the tests were production tests. Ad-hoc engineering tests were performed to check production parts against new requirements or to ensure that new products would meet customer and engineering specifications. Production testing was performed to ensure part quality throughout a product's manufacturing life.
Production tests consisted of randomly testing finished parts as they came off the production line. Each product that was produced by T1-Auto had test specifications and a test schedule created for it by the engineering team. Products typically had periodic tests defined on a weekly, monthly, quarterly, or yearly basis. Each plant had a quality team assigned to ensure that the test schedules for all of the parts were carried out. Plant quality personnel would pull finished parts from the line and place them in containers for shipment to T1-Auto's test lab.
Product engineers in the company could request the ad-hoc engineering tests at any time. One or more parts would be sent to the lab with specific test instructions including the type of test, number of tests, and specific test parameters.
The test lab consisted of various test machines capable of stress-testing parts by running them through their particular actions thousands of times in a short period. The lab could also simulate a part's working environment using special environmental chambers capable of reproducing severe weather conditions such as heat and humidity and subjecting the parts to elements such as road salt.
Once test requests were received at the electronics test lab, each test had to be assigned to a lab technician and scheduled. The schedule had to coordinate the technicians, the test equipment, and test requirements such as the required completion date.
After the parts had been tested, test reports were written and returned to the plant quality team and product engineer for production tests, or to the submitting engineer for ad-hoc tests. If an issue occurred during testing, the engineer would be contacted for instructions. If the issue was related to the testing process or procedure, the engineer could ask the technician to continue the test. If the issue was related to a part or set of parts falling below expectations, the engineer could ask that the test be stopped and the parts sent to the engineer's office for evaluation.
When a production test was stopped because of issues uncovered during testing, a process designed to inform all of the necessary participants had to be followed. A form called the Unsatisfactory Test Results Notification (UTRN) would be filled out and sent to key participants, including the product engineer, sales representative, plant quality team, and product manager. Sales representatives for the product were contacted so they could notify customers about the testing status of the products as soon as possible. In many instances, T1-Auto was contractually bound to notify its customers within twenty-four or forty-eight hours of the time that a testing irregularity was discovered. These deadlines created substantial problems for T1, often disrupting previously planned testing, which in turn led to further problems. Product engineers would then start a standard ISO/QS 9000 quality process called the "Corrective Action." (See Exhibit 1 for a workflow chart depicting the original pre-ELTS process.)
The ELTS was quickly identified as a transaction workflow problem by the IT Lotus Notes™ team. Because the ELTS involved policies and procedures that crossed many groups and divisions within T1-Auto, and because the process was consistent across the organization, the solution lent itself very well to Lotus Notes. At the same time, T1-Auto's rapid growth and the increasing number of tests had made the existing testing process prone to communication and coordination errors.
Given that T1-Auto was experiencing significant growth and the testing process involved so many different groups, consistency in the process became a concern. Ensuring that all the forms were complete and that they were filled out properly was a problem. Since the participants in the testing process had a number of other responsibilities, the likelihood of delays in time-sensitive events was high. To further complicate matters, the test lab had no advanced notification of required tests. This led to planning inefficiencies in scheduling staff and test equipment. Tests were scheduled once the truck arrived each day with the required tests from each plant. Management became aware that advance test schedule notification would improve utilization of test lab personnel and equipment.
Another problem with the testing process was that test lab technicians often did not know who the current product or quality engineers for each product were. This could result in an e-mail notification of unsatisfactory test results being sent to the wrong individuals. Such an incorrect communication alone could add two to three days to the entire process. In general, communication speed is critical in the auto industry, where contractual conditions often include delivery requirements for new product development. Failure on T1-Auto's part to provide timely notification of test issues to its customers could cascade through the systems, causing significant delays.
Another communication problem could occur when a customer or engineer called the plant or test labs to inquire about test status. Because the process was entirely paper-based, typically the best the plant quality personnel could do was to report on the portion of the process they controlled. The plant could tell if the parts had been pulled and shipped to the lab, which indicated that testing had begun, or if the test report had been returned and the test completed. For any more detail, the plant personnel would have to call the lab for an update. Again, the paper-based system used at the lab was inefficient, since a status inquiry would require tracking down the physical test submittal paperwork and locating the assigned technician. Simple status inquiries added excessive human interaction and non-value-added overhead to the process. Furthermore, because of the amount of effort required to handle all the testing paperwork, the entire process was error prone.
As with all paper-based processes, historical information was difficult to locate. Test reports were copied and sent to the required participants. Historical reporting or analysis meant rifling through file cabinets and transcribing data into Excel™ spreadsheets, creating another non-value-added and time-consuming process constraint. Communication pertaining to specific tests could also be lost, since the communication (e-mail, telephone calls, memos, and hallway conversations) was not collected and stored with the testing documentation. At times a duplicate process error or non-value-added event could take place on a test because the necessary historical information was unavailable.
Inefficiencies in the testing process adversely affected part production. The testing process was contractually mandated. Therefore, it had to be done well. At the same time, due to time to market pressures, it had to be done expeditiously. The existing process was replete with inefficiencies that resulted in delays, communication problems with the customer, and the lack of a knowledge base for analyzing previous tests. Many of these problems stemmed from difficulties in handling all the documentation associated with the testing process. These included document tracking and document sign-offs.
Workflow is concerned with the automation of procedures wherein documents, information or tasks are passed between participants according to a defined set of rules to achieve, or contribute to, an overall business goal. While workflow may be manually organized, in practice most workflow is normally organized within the context of an IT system to provide computerized support for the procedural automation and it is to this area that the work of the Coalition is directed (Workflow Management Coalition 1995).
A single Notes™ developer built the ELTS software in only two months. This rapid development time was achieved using the Lotus Notes™ development environment. All of the processing and development was accomplished with the core Notes™ application and the Lotus Notes™ macro development language. Electronic versions of the Lab Work Request, Unsatisfactory Test Results Notification, and Test Results Notification forms were duplicated in the Notes™ ELTS database. Additional forms were added for systems management and to allow system users to carry out electronic discussions using a comment form within the ELTS. The system was developed, tested, and deployed using proven software deployment methodologies.
The ELTS was initially developed using version 3.0 of Lotus Notes™, which was released in May 1993. The first release of Notes™ shipped in 1989, and the product had become widely deployed and accepted as the leading e-mail and workflow application on the PC desktop in major corporations. With Notes™, users could create and share information using personal computers and networks (Florio n.d.). Today, Lotus Notes™ is on version 5.0 and claims over 50 million users. Lotus Notes™ was first tested at T1-Auto for a specific workflow application in 1994. Due to its power and workflow capabilities, it quickly spread to other applications and became the corporate e-mail platform; prior to Lotus Notes™, T1-Auto had no corporate e-mail package. Version 4.5 of Notes™, released in 1996, tightly integrated Notes™ with the World Wide Web; the new server side of the product was named Lotus Domino™. With a small amount of tweaking, Notes™ applications can become Web-enabled and accessible via a Web browser.
Powerful workflow applications can be developed in Lotus Notes™ by designing forms and views. Forms are electronic versions of paper-based forms and contain fields. Fields store the information that is either entered on the form or automatically generated, such as the current date. Besides typical field formats such as numbers, dates, and text, Lotus Notes™ contains a signature field that allows for electronic signing of documents. Once a form has been created and filled in, it is called a document. Views within Notes™ display all of a specific set of documents that belong to an application. These views display in a list, using rows and columns, all of the desired fields from the documents. Multiple views for an application are typically created, sorting the documents by different fields. The forms, documents, and views for a specific application are stored in a Lotus Notes™ database. A typical workflow application developed in this manner is stored in a single Notes™ database. (At the time of this development project, Notes™ was the best option for developing workflow systems. It should be noted that at present there are a number of other options available, e.g., Microsoft Exchange™.)
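The form-document-view relationship described above can be sketched in Python. This is an illustrative model only, not actual Notes™ code; the form name, field names, and sample data are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

# A "document" is a filled-in form: a named form plus a bag of field values.
@dataclass
class Document:
    form: str                       # which form this document was created from
    fields: dict = field(default_factory=dict)

# A "view" lists selected fields, in rows and columns, from the set of
# documents belonging to one form, sorted by a chosen field.
def view(documents, form, columns, sort_by):
    rows = [d for d in documents if d.form == form]
    rows.sort(key=lambda d: d.fields.get(sort_by))
    return [[d.fields.get(c) for c in columns] for d in rows]

docs = [
    Document("LabWorkRequest", {"plant": "Plant 2", "status": "In Test",
                                "due": date(1995, 3, 1)}),
    Document("LabWorkRequest", {"plant": "Plant 1", "status": "Scheduled",
                                "due": date(1995, 2, 15)}),
]

# A lab-side view: tests from all plants, ordered by due date.
for row in view(docs, "LabWorkRequest", ["plant", "status", "due"], "due"):
    print(row)
```

The same document set can back many views (by plant, by status, by month) simply by changing the columns and sort key, which mirrors how multiple Notes™ views present one application's documents.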
The Lotus Notes™ development team at T1-Auto followed a standard development methodology for the development of the ELTS application. The project was broken down into the following phases: design, development and testing, pilot, full rollout, and ongoing operations. Prototyping was employed during the design phase to ensure that requirements were well understood and that they could be met.
The initial design simply created electronic versions of the Lab Submittal, Unsatisfactory Test Results Notification (UTRN), and Test Completion Notification (TCN) forms. The intent was to document and automate the existing process. It became clear, however, that process inefficiencies could be removed and value could be created if the test schedule was put on-line by including a Lab Submittal form with minimal information. A process map was developed using Visio™, a graphical charting program, and used to document the existing process and develop requirements for the new system. Exhibit 1 contains a flowchart of the original process. The standard Lotus Notes™ development environment lacks the ability to graphically design and test workflow scenarios; Lotus does, however, offer an add-on product called Lotus Workflow™ that provides this capability. The ELTS was designed and created without the use of a graphical design and testing tool.
The development team quickly (less than a week) developed prototypes of the electronic forms and demonstrated them to the plant and lab personnel. Once the forms were approved, requirements were documented with the users showing how the new application would work and how the completed forms would appear in Notes™ views. Once the requirements were approved, development began immediately.
The completion of the Design phase yielded a clear understanding of the requirements for the ELTS application. Since the majority of the work for developing the forms was completed in the prototype phase, the remaining work consisted of:
Completing the forms,
Designing and building views,
Coding the form and view logic,
Creating the agents,
Setting and testing security,
Writing ‘About’ and ‘Using’ help pages.
Forms were all checked for consistency and proper operation. Fields were checked for proper function and usability. The design team had set standards to ensure a consistent "look and feel" and ease of use from one application to the next.
Notes™ views were created to display current and future test status, tests by plant, and tests by month and day. Views were developed from the perspective of each process participant to make applications familiar and easy to learn and use. For instance, a view was developed for the plant personnel that listed each test in progress by plant, status, and date. This made it easier for the plants to find their specific tests. For the lab users, views were developed that listed the tests by status, date, and plant, since lab users were more interested in tests from all plants by date. Views were also created to provide specific test-status reports for management.
Once the forms and views were completed, the developers wrote the code that automated the workings of the applications. Fields that would need to be automatically set by computation, as opposed to user entry, had to be coded. Views could also contain fields that would be computed for display, such as the number of days until a test was required. Notes™ program agents are small programs that can be triggered to run on a fixed schedule or when an event occurs, such as saving a form. One of the agents developed for the ELTS ran each morning and checked all forms to see if they were being processed in a timely fashion. For instance, if a UTRN had not been acknowledged by the test engineer, an automatic e-mail notification would be re-sent to the engineer as well as to the engineer's manager.
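The morning agent's escalation rule can be sketched as follows. This is a Python model of the logic only, not actual Notes™ agent code; the field names, user names, and one-day grace period are assumptions for illustration.

```python
from datetime import date, timedelta

def escalation_notices(utrn_documents, today, grace=timedelta(days=1)):
    """Return (recipient, subject) pairs for UTRNs not yet acknowledged.

    Mirrors the daily agent: an unacknowledged UTRN re-notifies the
    engineer, and also notifies the engineer's manager once the
    acknowledgment is overdue.
    """
    notices = []
    for doc in utrn_documents:
        if doc["acknowledged"]:
            continue  # nothing to do for acknowledged UTRNs
        notices.append((doc["engineer"],
                        "Reminder: UTRN awaiting acknowledgment"))
        if today - doc["sent"] > grace:
            notices.append((doc["manager"],
                            "Overdue UTRN for " + doc["engineer"]))
    return notices

docs = [
    {"engineer": "j.doe", "manager": "m.boss", "acknowledged": False,
     "sent": date(1995, 3, 1)},
    {"engineer": "a.lee", "manager": "m.boss", "acknowledged": True,
     "sent": date(1995, 3, 2)},
]
print(escalation_notices(docs, today=date(1995, 3, 3)))
```

In the actual system this check ran on a fixed daily schedule and the notices went out as Notes™ e-mail; the sketch only captures the selection logic.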
While security is very important for workflow applications in general, it is even more critical for the ELTS given the sensitive nature of a potential test problem. Security is defined on a very granular basis, from who can access the system to who can enter data into a specific form field. Access levels range from permission to view a specific form (reader) to permission to edit data on a form (manager). Typically, the Notes™ team used roles for each application and managed security by putting users into roles, which lessened the overall security-management overhead for each application. A role called ‘Test Engineer’ was created for the ELTS; every engineer using the system was put into this role, and all were granted exactly the same security privileges within the ELTS.
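The role-based scheme amounts to mapping users to roles and roles to permissions, then checking permissions through the roles. A minimal sketch in Python, with role names, permission names, and users invented for illustration (they are not the actual ELTS configuration):

```python
# Each role grants a set of permissions.
ROLES = {
    "Test Engineer": {"read", "edit"},
    "Reader":        {"read"},
    "Manager":       {"read", "edit", "delete"},
}

# Each user is assigned one or more roles; security is managed here,
# not per user, which is what reduces the administration overhead.
USER_ROLES = {
    "j.doe":   {"Test Engineer"},
    "visitor": {"Reader"},
}

def allowed(user, action):
    # A user may perform an action if any of their roles grants it.
    return any(action in ROLES[r] for r in USER_ROLES.get(user, ()))

print(allowed("j.doe", "edit"))    # an engineer may edit
print(allowed("visitor", "edit"))  # a reader may not
```

Granting a new engineer access is then a one-line change (adding the user to the ‘Test Engineer’ role) rather than a per-form permission edit.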
Once the system had been coded and tested, the last step prior to releasing the application was to write the ‘About’ and ‘Using’ Notes™ documents. The ‘About’ document is accessed from the Notes™ menu and contains information about the intent and ownership of each application within the system, as well as the application version and release date. The ‘Using’ document is also accessed from the main menu and provides information on the specific usage of each application; the development team typically put a description of each field on the forms there. The ‘Using’ document provided on-line help for the system.
Once user documentation was written and the system tested, the ELTS system was ready for the pilot test.
Prior to full implementation, a pilot version of the new application was installed. A cross-functional team of users from the engineering group, the lab, and one plant was formed for pilot testing of the ELTS. The pilot lasted only three weeks. A number of small improvements were made to the application to incorporate the changes identified during the pilot. Most of the changes centered on the usability of the application rather than its core functionality, and most were made in near real-time: members of the development team worked with users in person and over the phone while viewing the pilot version, making and testing many of the changes as the user talked. If the changes were accepted, they were installed and tested in the production copy of the database.
The pilot was performed in parallel with the existing test process in order to minimize the impact to the organization and to individual workers. This parallel effort doubled the required work for the participants but lowered the overall risk of the project. The pilot group had to perform their typical duties as well as perform the testing process with the new system. This double-duty caused some heartache with an already busy group, but since the pilot only lasted three weeks, the members gracefully agreed to the extra effort. After the three-week pilot was completed, the application was deemed ready for full implementation.
Another by-product of the pilot was a list of additional functionality to extend the application into other processes. One such recommendation was to extend the UTRN sub-process to include the corrective action process by automatically creating and populating an electronic version of the corrective action form. This functionality was added after the initial version of the application was released.
Training for the new system was minimal since the user interface was familiar to most users. Wayne remembers:
Since the company used Lotus Notes™ as its e-mail system, it was a familiar interface for our people. Basically, anyone who was familiar with Lotus Notes™ and the product being tested could easily become a user of the system. Once they were on the system, they could easily navigate around the ELTS and receive information. ELTS was deployed to about fifty users dispersed over seven sites. The users included plant quality personnel, test lab managers and technicians, product engineers, quality engineers, and sales people. Many of the participants needed only view access into the new system. The system provided automatic notification via e-mail for many users when their participation in the process was necessary.
T1-Auto kept a small development team intact that maintained existing applications as well as wrote new ones. This team was augmented by an offsite team of Notes™ developers from T1-Auto's outsourcing partner. Given the ability to quickly and easily make changes to these systems, the development team had to balance rapid turn-around with the stream of change requests. The team tried to collect requests for each application and make monthly changes. This approach provided a reasonable way to quickly update each system while maintaining version control and quality standards. The T1 development team was responsible for developing and maintaining over 100 Notes™ databases. Their costs were comparable to those of other development groups at the time, but T1 management felt that their productivity and impact were superior.
Abdallah Shanti, the CIO at T1-Auto, recognized the benefit the ELTS system brought to the organization.
The ELTS project resulted in a benefit of approximately $900,000 annually through the elimination of redundant work. However, the biggest benefit came from the more efficient and traceable process introduced at the labs. Parts were no longer lost in the middle of a test; testers stuck to their assigned tasks; and plant managers were able to view a test's progress in real time. A key element in the effective management of the testing process was that, when there were irregularities in a test, the owner of the project was instantly notified. All this added up to better service to the customer.
The estimated software development cost of $18,000 over two months generated an estimated yearly saving of $900,000 in hard benefits, and potentially even more in non-quantifiable benefits.
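On the hard numbers alone, the payback period is easy to check (a simple-payback calculation that deliberately ignores the non-quantifiable benefits and any ongoing maintenance cost):

```python
development_cost = 18_000   # one-time cost, spread over two months
annual_saving = 900_000     # estimated hard benefits per year

# Simple payback: fraction of a year needed to recoup the development cost.
payback_years = development_cost / annual_saving
print(f"Payback: {payback_years:.2f} years (~{payback_years * 365:.0f} days)")
```

By this measure the system paid for itself in roughly a week of operation, which helps explain why the non-quantifiable benefits were treated as a bonus rather than the justification.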
Another valuable benefit was that the ELTS enabled T1 to begin mapping knowledge about the testing process. A knowledge store was developed that tracked all the information on each test, including unstructured data such as discussions and workflow notifications. No longer were there conflicts over when a part was sent and received by the participants. No longer were problems encountered when production engineers did not respond to a failed test. (If they did not respond, they knew their supervisor would automatically be notified.)
The plant quality technicians now had one system for scheduling and tracking the PPT process. In addition, they were given real-time access to test status and were able to quickly look up and answer questions related to tests. Many of the reports they had been asked to generate were made obsolete by incorporating them into standard Lotus Notes™ views accessible to all participants. With the use of color in views and workflow technologies, process irregularities were identified by the system and action was quickly taken to correct problems. The plant also had to spend less time filling out reports and copying data from one system to another, and the task of updating test status and receiving the Test Completion Notification form was eliminated. The quality staff in each plant consisted of very few people, so no staff reductions were required when the system was implemented; since cutting costs had not been a primary objective, headcount reduction was not an issue.
The test labs were given access to the PPT schedule and were better able to schedule their operations. The automatic generation and copying of data from the PPT form to the Test Completion and Test Failure Notification forms reduced errors and time. Workflow notification reduced the time it took for test lab technicians to notify product engineers of testing problems. Test lab supervisors were given a system to track all PPT test status and monitor communication with test technicians. By updating the test status regularly, the test technicians had to spend less time answering questions from engineers and plants particularly when inquiries were status checks.
Product engineers were immediately notified of problems, eliminating one entire day from the process. Engineers were also given ready access to historical test information for each of their products. This information proved to be invaluable when analyzing failure trends over time. For instance, engineers could identify if a failure occurred each year during hot months in the production process. The capability to access and analyze this type of information led to a much quicker resolution for many problems.
Overall, a number of factors led to the success of the project. Within T1 there had already been some successful workflow projects, so PPT users were open to the ELTS project. Once the system proved itself, making their work easier and making them more productive, they became supporters of it. The use of prototyping during the design phase ensured that requirements would be met and provided a vehicle for user involvement. The pilot solidified the project's success: not only did it provide a proof of concept within the testing environment, but it also enabled fine-tuning of the ELTS requirements.