In my last column, I described two key implications of operational (versus traditional) business intelligence - source data quality and governance/data stewardship. These are necessitated by the processing cycles of operational BI, which do not allow for batch processing and error correction. The two implications discussed in this column are continuous quality improvement and metadata capture and dissemination.
Continuous quality improvement is a concept introduced by Dr. Walter Shewhart in the 1930s and popularized by W. Edwards Deming as a critical component of any effective quality management program. The continuous quality improvement cycle shown in the figure emphasizes the importance of planning a task, performing it, measuring the performance, taking appropriate corrective actions based on the measurements, and then repeating the process. In operational BI, this concept recognizes that unless we address problems at their source, they will recur, which means that erroneous data will continue to be introduced into the BI environment.
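The plan-do-check-act cycle described above can be sketched in code. The following is a minimal, hypothetical illustration - the rule, record layout, and function names are assumptions for the sketch, not anything defined in this column - showing one pass of the cycle over source records, with the "act" step removing the error at its source so it cannot recur downstream.

```python
# Hypothetical sketch of the continuous quality improvement (plan-do-check-act)
# cycle applied to operational BI source data. Record layout and quality rule
# are illustrative assumptions, not taken from the column.

def plan():
    """Plan: define the quality rule to enforce (here: no empty customer IDs)."""
    return lambda record: bool(record.get("customer_id"))

def do(source_records):
    """Do: load the source records into the (simulated) BI environment."""
    return list(source_records)

def check(records, rule):
    """Check: measure performance by finding records that violate the rule."""
    return [r for r in records if not rule(r)]

def act(records, violations):
    """Act: correct the problem at the source so the error does not recur."""
    bad = {id(r) for r in violations}
    return [r for r in records if id(r) not in bad]

source = [{"customer_id": "C1"}, {"customer_id": ""}, {"customer_id": "C2"}]
rule = plan()
loaded = do(source)
violations = check(loaded, rule)
clean = act(loaded, violations)
print(len(violations), len(clean))  # one violation found, two clean records remain
```

In a real operational BI environment each pass of this loop would feed its measurements back into the next planning step, which is the "repeating the process" aspect the cycle emphasizes.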