Raise Your IQ: Order, Structure and Accountability to Improve Information Quality

Published April 1, 2003

The ripple effect of last year's high-profile cases in which information and data controls broke down or were manipulated has traveled through the boardroom and finance department and into the IT department. In many public companies, more C-level officers are now active participants in evaluating the controls around the critical information assets required to manage the business effectively and ensure the accuracy of their financial statements for market and regulatory disclosure.

What the average citizen and reader of newspaper stories does not understand, however, is that given the hundreds of automated and manual information systems most organizations have today, many with complex timing and interface interdependencies, the oddity is not that controls break down, but rather that controls are possible at all. Legacy systems constitute a labyrinth of complexity; if you want to trace the path of a particular piece of data to find the root cause of data-related integrity issues, you typically face an arduous task. Complexity then breeds more complexity as a "don't fix what's not broken" mentality prevails. Instead of fixing the problem, many of those charged with system maintenance apply workarounds and simply build additional "pipes."

The technical problems are compounded by business attitudes toward data and information systems. Data, essential to the timely delivery of information for management reporting, is too often viewed as a "waste product" rather than a corporate asset that can generate new business. That's understandable if there is widespread mistrust of the company's own data. Fixing the problem? That is usually a function of the cost involved. Typically, "doing business" takes precedence over data repair and information quality improvement; data quality improvement becomes a business imperative only when the risks and the cost of the workarounds finally become greater than the cost of fixing the problems. These information quality (IQ) problems can, indeed, be addressed, but only by augmenting the traditional approaches, which begin the IQ improvement process by addressing issues at the data element level.

The Pitfalls of the Traditional Approach

For large and complex organizations with tens of thousands of data elements and hundreds of databases, systems and interfaces, even taking the first few steps to begin the IQ improvement process can seem daunting.

As an example, consider a products company that links the terms of its contracts to the qualifications of its customers. The company has a strong relationship with its customers and has favorable policies that allow the delivery of the original contract documents some time after the products have been purchased. However, an internal audit has revealed discrepancies between the contract terms recorded in the company's systems and the original contract documents delivered: for example, the financing terms provided to the customer are inconsistent with the product purchased. What happens? Over time, some contracts become delinquent. Products and pricing may also change, making it harder to reconcile the financing terms and revenues. In turn, these discrepancies have an impact on downstream processes that are dependent on the timeliness and accuracy of the contract data. As business conditions change, competition and market pressures rise, new products are introduced, newer policies favor the higher-volume customers and delays in delivery of the original contract documents continue to lengthen. Now, imagine this occurs for many years. These internal risks, spread across multiple interdependent business processes and systems, could result in millions of dollars of internal costs. In addition to the company's internal costs, there is an equivalent, and possibly more damaging, cost to its customers, whose data and systems could also be affected by the data discrepancies.

This company has a data quality issue; in fact, it has many data quality issues. Moreover, traditional approaches that begin the IQ improvement process by addressing issues at the data element level cannot help this company. The root cause of the data quality issues is the company's business policy that allows customers to delay delivering the original contract documents until some time after the contract has been ratified. The impact of these data quality issues can be seen in different and interdependent processes across the business' value chain: the pricing department, which uses sophisticated modeling and analytics to set competitive prices for products; the credit department, which establishes policies for qualifying customers; the customer service department, which enforces the company's policies; the servicing department, which collects payments from customers; and, finally, the finance and accounting department, which may face the brunt of potentially restating revenues and costs. Each department makes assumptions based on incorrect contract terms, resulting in inconsistent data proxy (substitution) and invalid financial projections.

How can companies fix IQ problems of this type? How can they negotiate the technical and business labyrinth in order to implement new corporate or board-driven information quality policies to improve control over that data? In the ancient Greek myth, Theseus successfully negotiated the labyrinth of Daedalus by unraveling a long string as he walked so that he could easily retrace his steps. In the world of IQ, what could serve as our piece of string?

These issues can be addressed by encouraging the business to consider the implications of the data-related risks across business processes, from the point of inception or entry into the organization, through transformations in the labyrinth of systems and finally to the point of retirement or destruction. Careful consideration of these risks will result in changes to business policies, business processes, customer relationships, and ultimately in changes to manual and automated systems. To begin to gain control, this approach emphasizes three principles to provide order and structure:

  1. Process: A repeatable process to qualify, assess and justify business process changes. Through this process, an organization can determine which activities to conduct and in what order.
  2. Accountability: Identification of who is responsible for what, and how the different functions interact optimally to produce the highest information quality.
  3. Assessment: Determination of the scale and impact of the data-related risks on the company and its customers before embarking on any change initiative.

This structure of process, accountability and assessment can begin to offer new hope to those who fear they will never emerge from the labyrinth.
Accountability ensures that there is cooperation between the company's officers responsible for each business process. Coordination of the risk mitigation strategies (the repeatable process) ensures that the necessary changes occur in the proper sequence (the assessment), without causing significant disruption to "doing business" or incurring additional internal risk or external exposure to the company.

Raising Information Quality

Essential to the successful execution of any IQ program is an understanding of the role of three major players. First, there is an administration function, typically performed by a data management group within the business. This group provides necessary governance over IQ risk mitigation activities to qualify, justify and coordinate the implementation of business process changes. Risk mitigation strategies are generally organized around categories such as business operations, financial, regulatory, legal and industry. Each organization defines and measures risk differently. If a company has an internal audit department, it may have defined a framework for qualifying risk and may have the mitigation strategies, management controls and implementation timeframes to address each type of risk. Data management would use this framework to qualify and justify the risk resolution.

Second is an authorization function, typically performed by officers within the company; they are often called "stewards." These stewards are decision-makers directly accountable for the success of business objectives and, therefore, for mitigating data-related risks associated with achieving those objectives. Stewards lead a business process area and are typically vice president-level individuals who can authorize, sponsor and fund business process changes. A trend in a number of organizations is board-driven commitments to IQ upon which these officers are obligated to act. On a periodic basis, data management notifies the stewards of the identified risks. Stewards must decide which risks to resolve and assess the impact on the company and its customers.

Usually, the steward furthest downstream in the business process (finance and accounting) would sponsor and fund business process changes, because that function is the one most affected by the improvement in data quality. Ideally, the steward furthest upstream in the business process (sales and marketing), where the data enters the company, should co-sponsor the change – that is, apply the principle that information quality improvement begins at the source of the data. In addition, based on the products company scenario, a third steward may need to be involved – the one responsible for establishing the company's business policies (credit risk management). The point is that, in a complex organization, no single person is likely to be entirely accountable for the successful mitigation of a data quality risk; nor is that necessary, provided there is strong cooperation among stewards.

Lastly, there is an activation function, which is typically performed by middle-management individuals who have significant knowledge of and responsibility for one or more business areas; we'll call them "advocates." Advocates report directly to a steward and are responsible for the implementation of business policy changes. Data management supports the advocates by coordinating the changes across business processes.

There is a close relationship between these three groups, and data management must assist stewards as they balance the need to resolve systemic IQ issues, politics and the pressures of "doing business" across a company's processes.

Assessing Information Quality

Assessing IQ issues is difficult at best, in part because it requires specific empirical or hard evidence of the problems and their impacts. The traditional approach to assessing data quality typically starts at the data element level and attempts to measure accuracy, completeness, consistency and so on. However, a company cannot easily assess and place management controls around every data element in order to produce periodic reports on which ones are in error. When such approaches fail, companies may instead begin by identifying the data issues that adversely affect or present a significant risk to the company. Identifying the critical data elements associated with a risk is the first step. Assuming we can do this, the second and much harder step is assessing the scale and impact of the risk on the company and its customers. This step has three components:

  1. Measuring specific data defect rates regularly and consistently over time, and establishing acceptable tolerances for defects per critical data element. A way to normalize data errors is to use a benchmark such as Six Sigma.1 I cannot do justice to Six Sigma in this short article, but I can use the products company example to illustrate its strength as a data quality benchmark. Assume the products company has 5 million contracts in its portfolio. Even operating at 99.38 percent accuracy (4 Sigma, or 6,200 defects per million), the company could still have 31,000 contracts in error (6,200 defects per million X 5 million contracts). Is that acceptable? Six Sigma also helps assess our ability to improve data quality. For example, suppose the products company wishes to improve from 99.38 percent (4 Sigma) to 99.98 percent (5 Sigma, or 230 defects per million). This appears to be a difference of just 0.6 percentage points; in fact, it represents a 27-fold improvement in data quality (6,200 / 230) – not an easy task for any company.
  2. Projecting to the company's customers the value of the difference between the current state and the desired state, over time. For the products company, let's assume the company has 2,000 customers, each with an average of 100,000 contracts (2,000 X 100,000, or 200 million defect opportunities). The company estimates that it will cost an average of $15 for a customer to correct and re-deliver each contract in error. At 3 Sigma (93.32 percent accuracy, or 66,800 defects per million), this would equal roughly $200 million (200 million opportunities X 66,800 defects per million X $15) in additional contract processing costs to the company's customers.
  3. Projecting to the company the value of the difference between the current state and the desired state, over time. This step requires translating the benefits of achieving the desired state into a common denominator such as basis points (BPS) of shareholder value. After quantifying the problem, the benefits of actually making improvements will be easier to justify; this typically involves net present value calculations. Assume it also costs the products company $15 per defective contract in data-related inefficiencies across business processes; that also equals roughly $200 million in additional costs, which directly affect gross margin. Further, let's say the products company has a market capitalization of $20 billion; this additional cost to the company represents 100 BPS to shareholders. For some companies, that may be significant. (A short worked version of these calculations follows below.)
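
To make this arithmetic concrete, here is a small worked sketch in Python that reproduces the example figures above. The sigma-to-defect-rate table, the variable names and the helper function are illustrative assumptions chosen for this example, not a prescribed implementation; actual defect rates and costs should come from a company's own measurements.

# A worked sketch of the assessment arithmetic, using the article's
# example figures. All names and values here are illustrative.

# Approximate defect rates (defects per million opportunities) for the
# sigma levels cited above.
DEFECTS_PER_MILLION = {
    3: 66_800,   # ~93.32 percent accuracy
    4: 6_200,    # ~99.38 percent accuracy
    5: 230,      # ~99.98 percent accuracy
}

def defect_count(opportunities, sigma_level):
    """Expected number of defective records at a given sigma level."""
    return opportunities * DEFECTS_PER_MILLION[sigma_level] / 1_000_000

# Step 1: defects in the company's own portfolio of 5 million contracts.
portfolio = 5_000_000
print(defect_count(portfolio, 4))                       # ~31,000 contracts in error at 4 Sigma
print(DEFECTS_PER_MILLION[4] / DEFECTS_PER_MILLION[5])  # ~27-fold improvement from 4 to 5 Sigma

# Step 2: projected cost to the company's customers at 3 Sigma.
customer_contracts = 2_000 * 100_000    # 200 million defect opportunities
cost_per_defect = 15                    # $15 to correct and re-deliver a contract in error
customer_cost = defect_count(customer_contracts, 3) * cost_per_defect
print(f"${customer_cost:,.0f}")         # ~$200 million

# Step 3: an equivalent internal cost, expressed as basis points of shareholder value.
company_cost = customer_cost            # the article assumes a comparable internal cost
market_cap = 20_000_000_000             # $20 billion market capitalization
basis_points = company_cost / market_cap * 10_000
print(f"{basis_points:.0f} BPS")        # ~100 BPS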

The Benefits of a Cross-Business Process Approach

Implementing any IQ program is a slow and painful process, often resulting in business process change – both internally and externally. While data issues are specific to a particular company, the benefits are typically generic and often include a greater understanding of the use of data across the business, greater control over data redundancy, greater control of multitudinous forms of data proxy and reduced system coupling.

Addressing non-quality data is about understanding the use of the data in the enterprise to conduct business, establish business policies and maintain successful relationships with the company's customers and external agents. The IQ approach discussed here allows a company to incrementally scale its IQ program, without letting the process become too overwhelming. Specific benefits of this approach include:

  1. Ensuring that the business is primarily responsible for data quality risk mitigation.
  2. Identifying stewards (senior executives) who will sponsor and fund IQ initiatives and be accountable for business process changes.
  3. Identifying advocates (middle management) who are responsible for implementing the business process changes in their respective departments.
  4. Establishing a data management function in the business responsible for the governance of the IQ program across business processes. This group must balance the politics of information management with "doing business" to ensure a coherent and repeatable approach to risk mitigation, rather than piecemeal and uncoordinated activities.
  5. Requiring hard evidence that quantifies and justifies making business process changes. This approach combined with a powerful quantitative benchmark such as Six Sigma provides a rigorous methodology to analyze and standardize data defect reporting for those data-related anomalies that present a significant risk to the company.
  6. Improving workforce productivity resulting from reduced data anomalies, improvements in system automation, increased data sharing across systems, and fewer manual processes and stopgap measures.
  7. Achieving economies of scale through incremental and coordinated data quality improvements that are aligned with business strategy. With each increment, the company's overall confidence in its own data will improve.

IQ programs need not cost tens of millions of dollars if you know where to start and how to start. The approach articulated here may permit companies that organize around data and process to pull ahead of their competition. These companies can be better equipped to methodically and purposefully evaluate systemic issues and prepare themselves for greater scrutiny of their financials, disclosures and auditing practices resulting from the passage of the Sarbanes-Oxley Act last year.2

Shifting an IQ program into the fast lane and staying there is not easy, but it is possible. With the successful implementation of each improvement, companies can increase their confidence in their data. Moreover, for today's C-level officers who are personally putting themselves at risk by attesting to the validity of financial statements, this has to be good news.

References:
1. Mikel Harry and Richard Schroeder. Six Sigma. Doubleday, 2000.
2. See http://www.aicpa.org/info/Sarbanes-Oxley2002.asp or http://www.sec.gov for more information.
