The goal of custom application system testing should be to ensure that the work units or modules identified during unit testing interact with one another as designed. While unit testing procedures ensure that each component works correctly on its own, the system test process ensures that interactions between those units have no unintended consequences. Specifically, the end-to-end business functionality, including both front-end and back-end components, is tested. Though sometimes broken out separately, the system test plan can include security testing and performance testing as well; these are simply additional test cases in additional sections. The goal for each component should be to define quantifiable test cases that cover all interrelated functionality of the application.
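The distinction between a unit check and a system check can be sketched in a few lines. This is a minimal illustration, not a prescribed framework; the `Billing` and `OrderEntry` components are hypothetical stand-ins for a back-end and a front-end work unit.

```python
class Billing:
    """Back-end unit: computes an invoice total from line items."""
    def total(self, items):
        return sum(qty * price for qty, price in items)

class OrderEntry:
    """Front-end unit: collects line items and hands them to billing."""
    def __init__(self, billing):
        self.billing = billing
        self.items = []

    def add_item(self, qty, price):
        self.items.append((qty, price))

    def checkout(self):
        return self.billing.total(self.items)

# Unit test: Billing works in isolation.
assert Billing().total([(2, 5.0)]) == 10.0

# System test: the end-to-end flow through both units behaves as designed.
order = OrderEntry(Billing())
order.add_item(2, 5.0)
order.add_item(1, 3.0)
assert order.checkout() == 13.0
```

The unit assertion exercises one component alone; the system assertion exercises the same logic only through the interaction between components, which is where unintended consequences surface.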
The system test approach starts with a plan and test cases. The plan should include sections covering all application components, and those sections should include test cases that trace back to each and every functional requirement. This assumes that all functional requirements have been written in a way that makes them testable, as they should be. Your system test plan should not be a redo of the unit test; in many cases, that adds too much time to system testing and clouds the focus on the larger system. The system test plan should be written from an overview of the technical approach, not from coding specifics. For reports, this might mean watching a trend over simulated time or comparing different reports to verify continuity. The key is that individual pieces of code logic are not the focus; the function of the overall system as a whole is.
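Traceability from test cases back to functional requirements is easy to check mechanically. A minimal sketch, assuming hypothetical requirement IDs (`FR-…`) and test-case IDs (`TC-…`); a real plan would load these from the requirements document and the test plan itself:

```python
# Hypothetical requirements and the test cases that cover them.
requirements = {"FR-01", "FR-02", "FR-03"}

test_cases = {
    "TC-101": {"covers": "FR-01"},
    "TC-102": {"covers": "FR-02"},
    "TC-103": {"covers": "FR-02"},
}

# Any requirement with no covering test case flags the plan as incomplete.
covered = {tc["covers"] for tc in test_cases.values()}
untested = requirements - covered
print(sorted(untested))  # → ['FR-03']
```

Running a check like this before execution begins ensures no functional requirement slips through without at least one quantifiable test case.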
System test execution takes on many forms, from packaged applications designed specifically for testing, recording, and notifying, to the old standby of Excel and email. The goal and approach are the keys, but the execution pulls it all together and can make the difference between success and acceptance-test disappointment. From the get-go, develop your execution plan so that you won't need to reinvent the wheel each time you need to execute a system test for your application. Create scripts or use a tool to verify that you have all the data and scenarios necessary to perform every test. If you don't have that data, develop a reusable process to mock up the data in the data-entry or source system, as opposed to the data in your application. Create a process that will execute the scripts in a batch if possible, and keep a record of historic results. Write a process document that explains how to execute the system test automation, and include it in the system test plan. These steps will help ensure a robust and thorough regression test as well as validation of new functionality.
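The batch-execution and historic-record steps above can be sketched simply. This is one possible shape, assuming the test scripts are plain functions that return pass/fail and that a CSV file serves as the historic record; the script names and file name are illustrative, not prescribed.

```python
import csv
import datetime
import pathlib

# Hypothetical test scripts: each returns True on pass. A real suite
# would run SQL checks, API calls, or report comparisons here.
def check_source_data_loaded():
    return True

def check_report_totals_match():
    return True

SCRIPTS = [check_source_data_loaded, check_report_totals_match]
HISTORY = pathlib.Path("system_test_history.csv")

def run_batch():
    """Execute all scripts in one batch and append results to the record."""
    run_at = datetime.datetime.now().isoformat(timespec="seconds")
    is_new = not HISTORY.exists()
    with HISTORY.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["run_at", "test", "result"])
        for script in SCRIPTS:
            result = "PASS" if script() else "FAIL"
            writer.writerow([run_at, script.__name__, result])

run_batch()
```

Because every run appends to the same file, the history doubles as a regression baseline: a test that passed last release and fails now stands out immediately.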
System testing is the development team's conscience. The focus is on the entire application meeting the overall technical requirements. Independently verify that all the pieces fit together and find the holes before users have their first look. First impressions are crucial to early user adoption. Users seem to forgive and forget delays and agree to scope cuts much quicker than they forgive (and maybe never forget) a bug-ridden application that misses service level agreements and causes problems. A good system test can make this difference, so don't cut time here when the project timeline is squeezed. I've never seen such a "shortcut" lead to a more successful, earlier deployment. In fact, it usually takes longer once the issues unfold in acceptance testing. Follow the process, be complete, and focus on reusability and automation for a system test process that will help make a successful application.
Don Steffen is a cofounder and partner of AmberLeaf Partners, Inc. (formerly BI Solutions, Inc.), a consulting firm dedicated to enabling innovative companies with the information to make critical investment decisions. Steffen has been designing and delivering technical architecture and solutions in the business intelligence and data warehouse industry for more than a decade. He can be reached at firstname.lastname@example.org.