
  • Sid Adelman, Clay Rehm, Les Barbusinski
  • January 13 2003, 1:00am EST

We are working on a data warehouse project in which we extract data from the source systems (ETL) and build fact and dimension tables from it. End users then query our cubes through Cognos as their front end. I need quality assurance, and I need to test the complete application. Is there a tool that fits my case? If not, how should I approach the problem of quality assurance?


Sid Adelman’s Answer: The ETL process should be treated just like any other major application in its testing requirements. In addition, each time the ETL system runs, be sure to include tie-outs that validate the number of rows captured, the dollar amounts, the number of customers, and so on.
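A tie-out of this kind can be as simple as computing the same control totals on both sides of the load and comparing them. The sketch below uses in-memory SQLite tables as stand-ins for a source system and a warehouse fact table; all table and column names are illustrative assumptions, not part of any particular ETL product.

```python
import sqlite3

# Hypothetical in-memory databases standing in for the source system
# and the warehouse fact table; schemas and names are illustrative only.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.5), (1, 49.5)])

dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE fact_sales (customer_key INTEGER, amount REAL)")
dw.executemany("INSERT INTO fact_sales VALUES (?, ?)",
               [(1, 100.0), (2, 250.5), (1, 49.5)])

def tie_out(conn, table, amount_col, cust_col):
    """Return (row count, total dollars, distinct customers) for one table."""
    return conn.execute(
        f"SELECT COUNT(*), SUM({amount_col}), COUNT(DISTINCT {cust_col}) "
        f"FROM {table}").fetchone()

src_totals = tie_out(src, "orders", "amount", "customer_id")
dw_totals = tie_out(dw, "fact_sales", "amount", "customer_key")

# The load is only accepted if every control total matches.
assert src_totals == dw_totals, f"Tie-out failed: {src_totals} != {dw_totals}"
print("Tie-out passed:", dw_totals)
```

In practice the same three numbers would be logged for every ETL run, so a mismatch is caught on the day it happens rather than discovered by a user.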

Canned queries and reports in the query and report library should be thoroughly validated and tested by the submitter as well as by the library administrator, since users will be counting on the quality of these programs.

QA for ad hoc queries is almost impossible. This means the users must be trained to fully understand their data, to perform reasonableness tests on the results, and to do their own cross checks with trusted data.
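The reasonableness tests and cross-checks mentioned above can be made routine. The sketch below shows the kind of checks a trained user might apply to an ad hoc result; the field name, trusted control total, and tolerance are all assumptions for illustration.

```python
# Illustrative reasonableness checks on an ad hoc query result.
# "amount", the trusted control total, and the 2% tolerance are
# assumed values, not part of any specific tool.
def reasonableness_checks(result_rows, trusted_total, tolerance=0.02):
    """Return a list of problems found; an empty list means the result
    passed the basic sanity and cross-check tests."""
    issues = []
    if not result_rows:
        issues.append("result set is empty")
        return issues
    if any(r["amount"] < 0 for r in result_rows):
        issues.append("negative amounts present")
    total = sum(r["amount"] for r in result_rows)
    # Cross-check against a figure from a trusted source (e.g. the GL).
    if abs(total - trusted_total) / trusted_total > tolerance:
        issues.append(
            f"total {total} deviates more than {tolerance:.0%} "
            f"from trusted figure {trusted_total}")
    return issues

rows = [{"amount": 120.0}, {"amount": 80.0}]
ok = reasonableness_checks(rows, trusted_total=200.0)
bad = reasonableness_checks(rows, trusted_total=300.0)
print("clean run:", ok)
print("mismatched run:", bad)
```

The point is not the specific thresholds but the habit: every ad hoc result gets compared against something the user already trusts.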

Les Barbusinski’s Answer: In your case you need to certify two things: 1) the transformed fact and dimension data generated by your ETL tool (e.g., Cognos DecisionStream, Informatica PowerCenter, Ascential DataStage, Microsoft DTS, etc.), and 2) the metrics generated by the reporting tool itself (i.e. Cognos PowerPlay or Impromptu). In the first case, a variety of data quality and auditing tools (from Ascential, Trillium, IBM and others) can be applied to certify the data. In the latter case, you’re on your own (i.e., the proprietary nature of PowerPlay hypercubes and Impromptu "schema catalogs" precludes the use of third-party auditing software).

However, in my experience the really hard work with data and report certification is done manually. This involves preparing test plans that identify the expected result for every fact or dimension table attribute – as well as every report metric – under every possible circumstance. This very tedious and painstaking work lays the foundation for the scripts and test data that must be generated to properly certify the data and reports in any new data warehouse application.
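A test plan of the kind described, one expected result per attribute per circumstance, lends itself to a data-driven layout. The sketch below pairs input conditions with expected attribute values for a single, invented transformation rule; the rule, its boundary at 90 days, and the status codes are all assumptions standing in for whatever the real ETL logic specifies.

```python
# Data-driven test-plan sketch: each case names a circumstance, the
# input, and the expected value of one derived dimension attribute.
# The rule under test is illustrative, not from any real system.
def derive_customer_status(days_since_last_order):
    """Invented transformation rule: active within 90 days, else lapsed;
    missing history maps to UNKNOWN."""
    if days_since_last_order is None:
        return "UNKNOWN"
    return "ACTIVE" if days_since_last_order <= 90 else "LAPSED"

test_plan = [
    # (circumstance,          input, expected)
    ("recent order",          30,    "ACTIVE"),
    ("boundary at 90 days",   90,    "ACTIVE"),
    ("lapsed customer",       91,    "LAPSED"),
    ("missing order history", None,  "UNKNOWN"),
]

failures = [(desc, derive_customer_status(arg), expected)
            for desc, arg, expected in test_plan
            if derive_customer_status(arg) != expected]
assert not failures, f"test-plan failures: {failures}"
print(f"{len(test_plan)} test-plan cases passed")
```

Enumerating cases this way also makes the tedium visible: the plan itself becomes a reviewable artifact, and boundary conditions (the 90-day line, the missing value) are written down instead of left to memory.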

Clay Rehm’s Answer: The very first thing to do is to develop a quality assurance (QA) plan – that is, decide how you will deal with quality assurance. The QA plan is concerned with giving management appropriate visibility into the processes being used by the project and the products being built. It involves reviewing and auditing the software product and activities to verify that they comply with organizational procedures and standards, and reporting the results of these reviews and audits to the project team and the appropriate managers.

Examples of quality assurance tasks:

  • Management, progress and technical reviews/walkthroughs
  • Software development plan and planning data reviews
  • Audits performed by the QA representative to determine compliance with contract requirements, project plans and organizational procedures
  • Program status meeting
  • Review of program status reports
  • Quality audits
  • Corrective action process
  • Escalation of corrective action requests
  • Participation in the preparation and review of the project’s software development plan, standards, and procedures
  • Reviews of deliverable and non-deliverable products
  • Reviews of software and non-software products (e.g., documents)
  • Product development and product verification activities (e.g., executing test cases)

Quality assurance really boils down to a thorough plan developed by the business and IT together, and using tools already available to monitor the testing.
