A Global 100 Application Snapshot
CHICAGO, June 12th - It took six months. Six months into the SAP implementation, data issues began raising executive temperatures. And yes, it came as a surprise to many - partly because the SAP implementation was already delivering better controls, process automation and integration, but mostly, as puzzled managers were quick to point out, because data quality had already been addressed heavily in the early SAP planning stages.
What exactly was the problem? No sustaining framework. The company had failed to develop an effective management framework to make sure that data quality - once achieved - could be continuously maintained. Thus, data quality had deteriorated to a point where senior management now considered it a major contributing factor to a number of problems such as unreconciled financial reports, failure of key system interfaces and automation bottlenecks. Outside advisors were called in. Twenty-three interviews and three workshops were conducted - all coordinated by the company's data quality center of excellence (CoE).
The findings that emerged from the interviews and workshops presented a mixed picture. On the one hand, the SAP implementation was solid. The data management group had addressed some good basic control functions, the CoE was providing relatively effective guidance and many group heads already saw themselves as data owners who recognized the benefits of improved data quality. On the other hand, 115 data-related issues were identified. And three of them were causing the most trouble:
- Requisitions: Because materials data had not been set up correctly - and in some cases didn't even exist - 56 percent of purchase orders were being created with free-text requisitions. This negatively impacted automation and made effective procurement management difficult.
- Interface Bottlenecks: Poor quality material data was causing key interfaces to fail on a regular basis. This failure was injecting delays into the supply chain process and also meant that product availability was not fully visible to customer service staff.
- Vendor Master Problems: One-time vendors were showing up in the top 10 vendor list - a clear indication that vendor master data was not being used correctly. This too was constraining automation and effective procurement.
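The first and third findings lend themselves to simple automated checks. The sketch below shows, in Python, the kind of tests an audit like this might run: the share of purchase orders created without a material master reference, and one-time vendor accounts that nonetheless rank among the top vendors by activity. The record layouts, field names and the `ONE_TIME` naming convention are illustrative assumptions, not the company's actual schema.

```python
from collections import Counter

def free_text_rate(purchase_orders):
    """Share of POs created as free-text requisitions, i.e. with no
    material master reference (material_id is None)."""
    if not purchase_orders:
        return 0.0
    free_text = sum(1 for po in purchase_orders if po.get("material_id") is None)
    return free_text / len(purchase_orders)

def one_time_vendors_in_top(invoices, top_n=10):
    """Return one-time vendor accounts that appear among the top-N vendors
    by invoice count - a symptom of one-time accounts being reused in place
    of properly maintained vendor master records."""
    counts = Counter(inv["vendor_id"] for inv in invoices)
    top = [vendor for vendor, _ in counts.most_common(top_n)]
    return [vendor for vendor in top if vendor.startswith("ONE_TIME")]
```

Checks like these only detect the symptoms, of course; as the article goes on to argue, fixing them durably requires ownership and a recurring measurement process, not a one-off script.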
A Focused Course of Action
Recommendations to management were frank. These problems would continue to worsen as long as the company had neither a means of quantifying data quality, nor a system for establishing management accountability for it. What the company needed was a data quality management framework - one that explicitly:
- Established data ownership and accountability for data quality through measurement and reporting, and
- Provided a mechanism by which process and system improvements could be continuously identified and implemented in order to improve data quality.
Management was ready to move quickly. In fact, within a week, the company had appointed an executive sponsor responsible for implementing the framework, approved the material master as the first implementation pilot and engaged data and process owners, who agreed to a detailed work and resource plan.
The nature of the problem dictated a fairly wide scope. After all, data is rarely system- or process-specific; it flows end to end throughout the organization. Because the main focus was SAP, the scope of the project necessarily extended to all data for which SAP was the master source - or where SAP made use of data created in another system. Therefore, it was suggested that data be categorized (e.g., a customer, a vendor, a material), that the framework be implemented for each category in phases and that category prioritization be determined according to where the impact of poor data quality on the business was greatest.
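The phasing logic above reduces to a small ordering exercise: group data into categories, estimate the business impact of poor quality in each, and roll the framework out highest-impact first. A toy sketch, with invented categories and impact scores:

```python
def prioritize(categories):
    """Order data categories so the highest-impact category becomes the
    first implementation pilot. Impact scores are assumed inputs, e.g.
    from the issue inventory produced by interviews and workshops."""
    return sorted(categories, key=lambda c: c["impact"], reverse=True)

# Invented scores for illustration; in the article's case the material
# master came out on top and was approved as the first pilot.
phases = prioritize([
    {"name": "customer", "impact": 6},
    {"name": "vendor", "impact": 7},
    {"name": "material", "impact": 9},
])
```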
Based on industry best practices, a framework blueprint was created and customized to the company's needs. It comprised two sets of components. The first set was a group of clearly defined roles - sponsor, manager, owner, producer and consumer - each with specified responsibilities. The second set was a group of key activities:
- Defining data quality requirements and metrics;
- Setting out an issue analysis process to help data owners identify the root causes of problems;
- Establishing a regular forum linking process, system and data owners in order to facilitate issue rectification;
- Creating a repository of tools, policies and knowledge covering analysis and reporting, data modeling and compliance; and
- Running a formal change process to ensure that the impact of ongoing business and system changes on data requirements was assessed and addressed on a continuous basis.
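The first activity - defining data quality requirements and metrics - is what makes ownership measurable: each rule produces a pass rate the data owner can be held accountable for and report on at the regular forum. A minimal sketch, assuming invented rule names, fields and valid-value sets rather than the company's actual requirements:

```python
# Illustrative quality rules for one data category (material master).
# Each rule is (name, predicate over a record); all values are assumptions.
RULES = {
    "material": [
        ("description_present", lambda r: bool(r.get("description"))),
        ("unit_of_measure_valid", lambda r: r.get("uom") in {"EA", "KG", "L"}),
    ],
}

def score_category(category, records):
    """Compute the pass rate of every rule for a data category, giving the
    data owner a concrete, repeatable measure to report against."""
    results = {}
    for name, check in RULES[category]:
        passed = sum(1 for record in records if check(record))
        results[name] = passed / len(records) if records else 1.0
    return results
```

Run periodically, scores like these feed the reporting and change-control activities: a dropping pass rate after a business or system change is the trigger for the issue analysis process.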
Data Quality Matters
Implementation was successful along a number of dimensions. Process automation and efficiency were markedly improved through a greater number of automated purchase orders, fewer process delays and a reduced need for staff to verify or gather data. The quality of the information supporting management decision making was better, and there was a clear reduction in the cost of rework and rectification.
Yes, data quality is just one among several critical success drivers for any major ERP implementation; however, remember: all by itself, bad data can sabotage a multimillion-dollar implementation - from the inside out. It's one thing to address data quality in the early stages, but without a framework for sustaining data quality day in and day out, you won't have a sustainably cost-effective and efficient ERP platform.