I would like to thank DM Review for allowing me to take some time off from my monthly column. I would also like to thank my readers who noticed and emailed me. I needed a brief sabbatical in a busy time.

My office is undergoing its summer cleanup, and I have been reviewing files to determine which can be recycled and which still have value. One such file was the Y2K file. As I looked over the nearly 400 articles, I contemplated the process improvement cycle of "plan-do-study-act." Reflecting on the "study" phase, I suggest that we study what we have learned from the global Y2K remediation initiatives.

The $200 billion cost to the U.S., and as much as $600 billion to the world, for these "no added value" projects was spent solely on corrections to, or application and database workarounds for, the date change from December 31, 1999, to January 1, 2000. The entire cost was incurred just to enable business as usual!

But those $200 billion and $600 billion figures represent only the direct costs of the remediation. The real costs also include decreased U.S. economic output of 0.3 percentage points in 1999 and 0.5 percentage points in 2000 and 2001, according to Chief Economist Edward Yardeni of Deutsche Morgan Grenfell. How much more productivity was wasted in water-cooler talk and in the hysteria created by fearmongers who led everyday folks to stock up for the coming doomsday?

I was chagrined that virtually every post-January 1, 2000, article congratulated everyone on what a great job the world did to rise to the occasion and avoid disaster. But can we really congratulate ourselves for deliberately designing databases with two-digit years long after price-performance advances in technology had removed any need to compromise data integrity to save space?

The press also incorrectly characterized the problem as a programming problem. It was, in fact, an information design problem. The precipitating cause was the programming practice of saving space, but the root cause was the use of programming mechanisms to solve what is inherently an information design problem. It is the information requirements analysis, modeling and database design processes that are broken.
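To make that distinction concrete, here is a minimal sketch in Python (my illustration, not part of the original column, and the field names are hypothetical). It shows why a two-digit year is a design defect rather than a coding defect: once the century is discarded, every program must guess it back, whereas a properly designed date attribute never creates the ambiguity in the first place.

    # Hypothetical sketch: the two-digit-year shortcut is a data design
    # decision, not merely a coding one. Storing "00" forces every program
    # to guess the century; storing a full date removes the guess.
    from datetime import date

    legacy_record = {"expiry_yy": "00"}        # two-digit year: 1900 or 2000?

    def guess_century(yy: str, pivot: int = 50) -> int:
        """Windowing workaround many Y2K fixes relied on: years below the
        pivot are assumed to be 20xx, the rest 19xx. The ambiguity remains."""
        y = int(yy)
        return 2000 + y if y < pivot else 1900 + y

    print(guess_century(legacy_record["expiry_yy"]))   # 2000 -- by assumption only

    # A design-level fix stores an unambiguous value from the start.
    well_designed_record = {"expiry_date": date(2000, 1, 1)}
    print(well_designed_record["expiry_date"].year)    # 2000 -- no guessing

The windowing function above is exactly the kind of programming mechanism that papers over, rather than corrects, an information design flaw.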

Have We Learned From This?

According to a survey I conducted at the DAMA-Metadata Conference in March 2001, out of 118 information professionals, only 20 percent agreed with the statement, "The IRM/DRM/DA [information resource management/data resource management/data architecture] function used the Y2K problem as an opportunity to make fundamental improvements to the data development process to increase the quality of data definition and database design."

This is a timely topic in that many organizations are addressing the need to elevate their information management functions to an enterprise management capability, with the drivers coming from the business. Most organizations today are nearly crippled by poor information management practices, floundering in maintenance of disparately defined silo databases and the interface programs that transform and move data among them.

If an organization's information requirements analysis, modeling and database design processes are broken, they must be improved to achieve enterprise-strength, sharable databases that eliminate unnecessary data movement when data should be shared. These processes must also be improved to prevent partial-capture attributes, such as storing only the middle initial of a person's name when the person may be known by their middle name, or capturing a temperature value without an explicit reference to the scale in which the measurement was made.
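As an illustration only (a minimal Python sketch I am supplying, with hypothetical type and field names, not a design from the column), a data definition that avoids both partial-capture defects keeps the full middle name rather than an initial and records the measurement scale alongside every temperature value:

    # Hypothetical sketch of a data definition that avoids partial capture.
    from dataclasses import dataclass
    from enum import Enum

    class TemperatureScale(Enum):
        CELSIUS = "C"
        FAHRENHEIT = "F"

    @dataclass
    class PersonName:
        first_name: str
        middle_name: str         # full value, so "known by middle name" still works
        last_name: str

    @dataclass
    class TemperatureReading:
        value: float
        scale: TemperatureScale  # explicit reference to the measurement scale

    person = PersonName(first_name="J.", middle_name="Paul", last_name="Getty")
    reading = TemperatureReading(value=98.6, scale=TemperatureScale.FAHRENHEIT)

The point is that completeness and explicit context are decided in the data definition, long before any program reads or writes the value.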

One thing is certain. If we keep performing the broken processes of the past, we will keep getting the same broken results. To prevent the defects of the past, we must improve the processes by first analyzing root cause(s) and defining improvements that will prevent the causes. We must then implement the improvements in a controlled way to study the effects. Did the improvements work? What were the critical success factors? What are the pitfalls?

Read (or reread) chapter nine, "Improving Information Process Quality: Data Defect Prevention," in Improving Data Warehouse and Business Information Quality, to learn how to perform the "plan-do-study (check)-act" cycle of process improvement. Also read chapter five, "Assessing Data Definition and Information Architecture Quality," to determine how to find problems in the information requirements, modeling and database design processes.

What do you think? Let me hear from you at Larry.English@infoimpact.com.
