(I'm going to have one more look at a presentation coming up at our MDM Summit at the San Francisco Hyatt Regency next week. I hope to see you there. If you haven't already, take a look at www.mdm-summit.com, and if you're coming out, I look forward to meeting you at this learning and great networking event. -ed)
We all know that data quality is an important measure of good data management, usually because our processes for data entry are fragmented or misaligned, or because we know something else has probably gone wrong already.
But if organizations are always circling back to take corrective action, how can the promise of a synchronized and distributed master record ever really be up to date? Can we address latency in the data quality process and avoid some of the cleanup that always comes later? (The editors who clean up the stories I write might be cheering right about now.)
Lynn Weishaupt is an application architect who's served as the technical team lead for MDM at Weyerhaeuser, where a solid, management-approved enterprise data management strategy covering six subject areas has been in place for some time. As part of the strategy, each subject area gets evaluated based on how clean and high quality the data is at a point in time.
But something was making the Weyerhaeuser folks scratch their heads, and it had to do with how they react to data quality problems. "Our current data quality reports that run in subject areas come after the fact," Weishaupt says. "Data is entered into the source system, it gets saved in the database and, either nightly or weekly or monthly, that data is evaluated against business rules and any discrepancies are put out in a report."
After that, it falls to a data steward to go in and clean up the data, after which another report is run. The problem is that the reported data might already have been used for some sort of transaction, which would immediately make it incorrect. What Weyerhaeuser and the team wanted to do was to move the quality checking up front. In other words, data shouldn't be available for use in a transaction until it passes business rules on the input side.
What Weishaupt told me sounded like a policy for treating data quality as a process, rather than as an event. She's absolutely right in her observation that MDM projects harmonize after the fact: they pull from the source system, they merge and match, they run rules and check results on the back end. That was basically what Weyerhaeuser already had in SQL Server extracts and Business Objects reports.
Honestly, any advocate of root cause analysis would come to the same conclusion, but how do you execute against this problem and get in front of it?
Weyerhaeuser has deployed a Web interface for field users to complete with an expected set of fields. "They're basically requesting a new customer or an update to an existing customer," says Weishaupt. "They can search the database to find the customer, then we utilize some fuzzy match logic, so if they type in 'Bobs Lumber,' we'll pull up 'Roberts Lumber' or 'Lumber by Bob' and heuristic fuzzy things like that. If they find the customer they want, they can go into an update request. If they don't, they can go into a create request."
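Weishaupt didn't say how the fuzzy matching is implemented, so here's a minimal sketch of the idea using Python's standard-library difflib; the customer names and the 0.4 threshold are illustrative assumptions, not Weyerhaeuser's actual logic.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude similarity ratio between two names, ignoring case."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def fuzzy_search(query: str, customers: list[str], threshold: float = 0.4) -> list[str]:
    """Return existing customers that loosely match the query, best first."""
    scored = [(similarity(query, name), name) for name in customers]
    return [name for score, name in sorted(scored, reverse=True) if score >= threshold]

# Hypothetical customer master, echoing the article's example.
customers = ["Roberts Lumber", "Lumber by Bob", "Pacific Plywood"]
matches = fuzzy_search("Bobs Lumber", customers)
```

A real MDM hub would likely use more sophisticated heuristics (phonetic codes, token reordering, address corroboration), but the shape is the same: score candidates, rank them, and surface anything above a cutoff so the field user can pick an existing record instead of creating a duplicate.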
That starts a workflow process that leads eventually to the sales manager and finally to the credit manager, who approves it. What makes the solution nifty is that at each step, business rules check for elements such as valid pricing groups or valid sales regions.
"We check it along the way to the point that they can't enter invalid combinations of data from the field," Weishaupt told me. "We kick back errors and tell them to fix it before it can be approved to go to the next person in the workflow as defined in each phase."
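The pattern Weishaupt describes can be sketched as a gate in front of each workflow hand-off: validate the request against the business rules for that phase, and either kick back errors or let it advance. The rule values and field names below are invented for illustration.

```python
# Hypothetical reference data; in practice these would come from the MDM hub.
VALID_PRICING_GROUPS = {"retail", "wholesale", "export"}
VALID_SALES_REGIONS = {"northwest", "southeast", "midwest"}

def validate_request(request: dict) -> list[str]:
    """Apply the phase's business rules and collect any violations."""
    errors = []
    if request.get("pricing_group") not in VALID_PRICING_GROUPS:
        errors.append("invalid pricing group")
    if request.get("sales_region") not in VALID_SALES_REGIONS:
        errors.append("invalid sales region")
    return errors

def advance(request: dict, next_step: str) -> str:
    """Kick the request back to the submitter or forward it in the workflow."""
    errors = validate_request(request)
    if errors:
        return "kicked back: " + ", ".join(errors)
    return "forwarded to " + next_step
```

The point of checking at every step, rather than only at the end, is that an invalid combination never travels downstream where it could be used in a transaction.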
The last line of defense is still the data steward, but this person now has much more validation to work with and spends less time approving field data to gold status. At that point it's published to the transactional system, which for Weyerhaeuser is SAP.
Few companies would choose to expose field sales to an SAP interface, and this was certainly the case at Weyerhaeuser. The alternative was the old update process common to any field operation: a series of faxes, phone calls and emails that came with their own inefficiencies and versioning problems.
The Web-based solution creates controls that address data quality directly. "A key thing for us is the forms, the rules and the processes," Weishaupt says. "The workflow keeps track of things, and if somebody's delayed, maybe past two days, they get a reminder email before it's bounced and escalated."
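The reminder-then-escalate behavior is a simple aging rule on each pending workflow item. A rough sketch, assuming a two-day reminder as Weishaupt mentions and an invented four-day escalation threshold (the article doesn't say when the bounce actually happens):

```python
from datetime import datetime, timedelta

REMIND_AFTER = timedelta(days=2)    # "maybe past two days" per Weishaupt
ESCALATE_AFTER = timedelta(days=4)  # assumption: escalation point not stated in the article

def workflow_action(assigned_at: datetime, now: datetime) -> str:
    """Decide what the workflow engine should do with a pending approval."""
    age = now - assigned_at
    if age >= ESCALATE_AFTER:
        return "escalate"
    if age >= REMIND_AFTER:
        return "send reminder email"
    return "wait"
```

A scheduler would run this check over every open request, so a stalled approval can't silently hold up publication of the record.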
Workflow also informs related operations, perhaps the transportation department that might need to update routing information. The workflow and the MDM system provide the validity check via the rules required to authenticate a gold record.
Weyerhaeuser's story really appeals to me, and perhaps to others frustrated with the post-facto heavy lifting in MDM.
How do you feel about process efficiencies in MDM projects? If you have a thought, push the comment button below and sound off.