This is a column I hoped I would not have to write. I know we are all sick and tired of Y2K. But as I look at the press commentary published through January 9, 2000, I cannot stand idly by while people draw the wrong conclusions. It appears that many do not understand the tragic waste caused by the defective data design that has been romanticized as the "Y2K bug" or the "millennium bug." Those who do not understand the root cause of this problem will congratulate themselves for responding to the crisis and go back to business as usual, never learning from this opportunity.

The January 2, 2000, lead editorial of The TENNESSEAN was entitled, "New century aced its first challenge: Y2K preparation paid off for major systems." It went on to say, "The world can give itself a pat on the back for crossing into a new century without too many bumps and bruises." Nowhere did the editorial suggest that the Y2K bug pointed to a serious flaw in the way we develop application software and databases.

In my February 1997 column, three years ago, I predicted, "In January 1997 [sic, should have read "2000" (even quality consultants are not immune to data quality errors!!)], many organizations will be toasting the Year 2000 conversion teams who will have made applications executable. My hat is off to those organizations who will have spent their time creating competitive-advantage applications, because they did not have a Year 2000 'problem' to 'fix.'"

The popular press has called the Y2K problem a programming shortcut. Not so. The Y2K "bug" is a data design defect, one which in the early days of computing had some justification. When I began programming in the early 1970s, computer memory, disk storage and data communications were expensive: one megabyte of RAM cost about $1 million. We had to conserve computer resources. But from the late 1980s on, the price performance of computing reduced the need for such data design compromises. Even before then, however, there was no justification for compromising date fields without implementing proper data integrity controls. My first experience with Y2K dates back to 1973-74, when, as a programmer, I wrote a reusable date routine in IBM Assembler language that handled the 1/1/00 date change. I did not perceive I was doing anything extraordinary; it seemed to me a normal business requirement. But consider this: my Webmaster had to terminate a highly rated Web programmer who, in 1999, developed a database for my Web site with a two-digit year field! (It was corrected before problems arose.)
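
To see why a two-digit year field is a data defect rather than a mere shortcut, consider elapsed-time arithmetic across the century boundary. Here is a minimal sketch, in Python for readability rather than the Assembler of that era; the function and values are illustrative only.

```python
from datetime import date

def years_between(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-years arithmetic on a two-digit year field."""
    return end_yy - start_yy

# A policy issued in 1998 (stored as 98) and renewed in 2001 (stored as 01):
print(years_between(98, 1))   # -97 -- the century boundary breaks the arithmetic

# With four-digit years the same calculation is trivially correct:
print(date(2001, 1, 1).year - date(1998, 1, 1).year)   # 3
```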

Y2K: A Cost or a Boost to Productivity?

Possibly the most blatantly wrong conclusion about Y2K comes from the U.S. government's "Y2K czar," John Koskinen, in a press conference on January 2, 2000. According to The Dallas Morning News, Koskinen concluded that the $100 billion spent in the U.S. alone to fix the Y2K problem actually "boosted productivity" and may thereby have averted a recession! Please, let us set the record straight. There could have been a recession, or worse, if nothing had been done about the problem; but the money spent fixing the Y2K problem did not increase productivity. It decreased it.

It is true that some organizations received value when they took this opportunity to replace obsolete applications and data structures with new ones that added functionality, new attributes and better ergonomics; but there is zero value added in fixing applications and databases merely to support this date change. All that did was enable the organization to operate the same way on January 3, 2000, as it did on December 31, 1999. This time and money fall into the category of "information scrap and rework." In anyone's book, that equates to decreased productivity.

Worse yet, many organizations have "solved" the Y2K problem incorrectly by using what is called a "windowing" technique that retains the defective two-digit year field. This technique does not solve the problem; it simply changes the date on which the problem will occur. The original window for two-digit years was [19]00-[19]99, in which the century value "19" is programmatically prefixed to the two-digit year. By changing the window to [19]30-[20]29, years from 30 to 99 are prefixed with 19 to become 1930 to 1999, and years from 00 to 29 are prefixed with 20 to become 2000 to 2029. This means organizations will be spending money again to fix the same problem when dates beyond the window arrive. To illustrate: a 30-year mortgage originated in 2001 matures in 2031, but its stored maturity year of "31" falls in the 30-99 half of the window and would be expanded to 1931. I know of organizations that have used this technique on attributes such as birth dates and already have problems with age calculations!
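
The failure mode is easy to demonstrate. What follows is a minimal sketch of the fixed-window expansion described above, using the [19]30-[20]29 window; the function name and pivot constant are illustrative, not any vendor's actual remediation code.

```python
PIVOT = 30  # two-digit years 30-99 expand to 19xx; 00-29 expand to 20xx

def expand_two_digit_year(yy: int) -> int:
    """Expand a two-digit year using the fixed [19]30-[20]29 window."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(expand_two_digit_year(99))  # 1999 -- correct
print(expand_two_digit_year(5))   # 2005 -- correct
print(expand_two_digit_year(31))  # 1931 -- wrong for a mortgage maturing in 2031
```

The defect is not removed; it is merely rescheduled for the day a legitimate date falls on the wrong side of the pivot.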

According to the GartnerGroup, the worldwide cost of Y2K is between $300 billion and $600 billion; IDC estimates worldwide expenditures to date at around $331 billion. Did this "investment" increase productivity? What could have been accomplished with $300 billion if it had not been required to fix this defect?

Large organizations store every fact of data redundantly an average of ten times, according to my colleague Mike Brackett, and my own studies confirm this figure. This means organizations spent up to ten times more on the Y2K fix than would have been required had they had a well-architected, shared database environment. (Of course, if they had well-architected databases, they would not have had Y2K problems in the first place!) Even conservatively, redundant databases and applications alone accounted for at least $150 to $200 billion of the Y2K fix costs, half or more of the low-end worldwide estimate.

But the real question is not a technical one; it is a business question. How do you explain these costs to shareholders? How do you explain the value added by the Y2K "investment" to consumers? After all, it is consumers who ultimately pick up the $300-$600 billion price tag in the form of higher prices or higher taxes.

Root Causes of the Y2K Problem

The Y2K experience will create value only if we use it as an opportunity to examine its root causes and prevent recurrence of such defective data design problems. Those root causes include:

  • Valuing CPU cycles over data integrity, and compromising data design for computer resource efficiency.
  • Defective application development and data design processes that do not design quality in.
  • Failure to design integrity mechanisms that error-proof applications when data design compromises must be made.
  • Management rewarding the wrong things. How many managers rewarded project teams that delivered applications on time but designed the Y2K-flawed databases in the first place, and then turned around and paid other teams to "fix" the problem? Should we not instead reward teams that design defect-free or minimal-defect products?

Knowledge management consists of sharing our lessons learned from both our successes and failures. The "lessons learned" from Y2K will be a test of whether an organization is ready for knowledge management.
Lessons (that should be) learned from Y2K:

  • The cost (not investment) of fixing the Y2K problem was required to prevent even higher costs of process failure and to avert a potential global crisis.
  • The costs to fix Y2K problems were not necessary in the first place. Had organizations invested in sound data design when these applications were built or modified from the late 1980s onward, there would have been no Y2K crisis.
  • You cannot programmatically solve what is inherently an information problem. Take Al Gore's "Internet town hall" Web site, which displayed the date Monday, January 3, 19100, until it was corrected through still more information scrap and rework (see the sketch following this list).
  • The same development process that created the Y2K problem has created other defective data models and database designs. The Y2K error was simply the most visible. Design errors include poorly designed primary keys (embedded meaning); omission of fields needed by the enterprise but not by the immediate beneficiaries of the application; building redundant, proprietary and non-sharable databases to house data that is needed across the enterprise.
  • Data definition is important. Everyone celebrating the advent of the new millennium at midnight, January 1, 2000, celebrated it exactly one year early: because there was no year zero, the new millennium technically begins January 1, 2001. Precision in definition is eroding, leading ultimately to gross miscommunication.
  • Information quality is free. You do not have to fix problems that never existed because quality was designed in.
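
The "19100" display mentioned above is worth a closer look, because it shows an informational defect surviving a programmatic "fix." A common cause of this error at the time: date APIs such as C's struct tm and early JavaScript's getYear() returned the year as a count of years since 1900, and display code wrongly treated that count as a two-digit year. The actual cause on the Gore site is not documented here, so treat this Python sketch as illustrative of the general mechanism.

```python
def display_year(years_since_1900: int) -> str:
    """The defective assumption: treat the count as a two-digit year."""
    return "19" + str(years_since_1900)

print(display_year(99))    # "1999" -- works through 1999
print(display_year(100))   # "19100" -- January 2000

def display_year_fixed(years_since_1900: int) -> str:
    """The informational fix: compute and display the full year."""
    return str(1900 + years_since_1900)

print(display_year_fixed(100))   # "2000"
```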

The Information Quality Management Mandate

We can ill afford to sit back and do nothing. Information professionals everywhere must unite to raise awareness so we can prevent the squandering of money on information scrap and rework caused by defective data designs like the one behind Y2K. Isn't it our stewardship responsibility to help our organizations manage the resource that is second in importance only to their human resources?

If we do nothing, we condemn our organizations to continue making the same costly mistakes that can lead to business failure.

What have we learned from Y2K? I would like to know your thoughts on the following two questions. E-mail your answers to Larry.English@infoimpact.com, fax them to (615) 837-8804, or send comments through my Web site at www.infoimpact.com, and I will e-mail you a collated summary.

  1. Which general conclusion about Y2K will prevail?
    A. Our organization rose to the occasion, solved the problem, and now it is time to get back to business as usual. We will keep on developing applications and databases as we did before.
    B. We could have prevented the Y2K problem if we had designed our databases properly. We will improve our data design processes as a result of this experience.
  2. If positive change is going to happen to prevent recurrence of these kinds of problems, who will lead the way?
    A. Information management professionals (data administrators, data resource managers, data architects).
    B. Application developers.
    C. Business professionals (non-IT personnel).
    D. External entities, such as consumers disgusted with information quality problems (or their regulatory advocates), or shareholders disgusted with companies' financial performance.

So what have we learned?
