JUN 1, 2006 1:00am ET


The Long View: Where is Data Quality Headed?


Editor's note: This is the first article in a series that highlights several emerging business uses and implementation approaches for data quality technology. The second article, Discovering Data Quality, provides recommendations for extending a point solution into an enterprise-wide data quality platform. The third, Winning Hearts and Minds (and Money) with Data Quality, examines the practice of data quality - the tangible activities required to increase data usability, improve the effectiveness of business decision-making and build confidence among information consumers about data completeness and accuracy.

The data quality market has seen dramatic consolidation recently - yet another clear indication that the quality of data is finally being recognized as a mainstream component of good IT management. For the most part, companies have learned the cost of poor data quality the hard way, through repeated failures of major IT projects, cost overruns and badly blown schedules. Some forward-looking companies are now using data quality for wider purposes and as a distributed service for many systems, applications and business processes across the enterprise.

Early Adopters Pave the Way

Early adopters in large enterprises have now practiced data quality for several years. As usual with early adopters, these companies are finding competitive advantage by staying ahead of the technological pack. What is most interesting about their implementations, however, is not the higher-quality data they can now deliver to users, but the mature data quality practices they have built up from those early implementations.

One typical example is a company that initially wanted to standardize, deduplicate and consolidate customer data for one of its three divisions, integrating that data into e-commerce processes and a data warehouse. Once this was accomplished, the company began to extend its data quality processes outward to:

  • Implement data quality for customer data in the other two divisions,
  • Synchronize customer data between divisions,
  • Build nightly feeds for the reporting system,
  • Deliver updated data feeds and duplicate alerts to sales,
  • Provide marketing with nightly feeds for analytics,
  • Enrich data for new records,
  • Provide new leads through data sharing across divisions, and
  • Automate enterprise-wide reporting.

From that original project - customer data in a single system - the company was able to extend data quality practices to multiple systems and functions.
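The standardize-then-deduplicate step that started this project can be sketched in a few lines. This is an illustrative toy, not the company's actual process: the record fields, the normalization rules and the choice of email address as the match key are all assumptions made for the example.

```python
def standardize(record):
    """Normalize name and email so equivalent values compare equal."""
    return {
        "name": " ".join(record["name"].lower().split()),   # collapse spacing, lowercase
        "email": record["email"].strip().lower(),
    }

def deduplicate(records):
    """Consolidate records that share a standardized email address."""
    seen = {}
    for rec in records:
        std = standardize(rec)
        seen.setdefault(std["email"], std)  # keep the first occurrence per key
    return list(seen.values())

customers = [
    {"name": "Ann  Smith", "email": "Ann.Smith@example.com "},
    {"name": "ann smith", "email": "ann.smith@example.com"},
]
print(deduplicate(customers))  # the two variants collapse into one record
```

Real customer-data matching is far fuzzier than an exact key comparison - production tools use phonetic and probabilistic matching - but the pattern of standardizing first and matching second is the same.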

This pattern of best practices is essentially an adaptive one: it is one where business priorities are dictating how and when data quality technology gets used, a process where semantics is just as important - if not more important - than syntax and where the only certainty about data and technology requirements is that they will change. No longer a "fix it and forget it" process applied to mailing lists, data quality has become a practice where the solutions are applied in many different ways in order to build a cohesive and strong foundation. The result is more high-quality data for multiple business purposes.

Where to Start?

Faced with mountains of dirty data, disparate systems, tight budgets and long lists of business requests, companies often do not know where to start improving the quality of their data. This is why in the past many companies sought a data quality solution only when they faced failure in a large IT project due to poor data quality. Today, tools for data profiling and data discovery are available to avert those risks, and companies are using them to more precisely scope out large data projects. For this reason it may not especially matter which data you start with. What does matter is when you start thinking about data quality. It pays to plan ahead - and not just for today's data and technology requirements, but also for what users will need down the road.
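A data profiling pass of the kind described above can be approximated very simply. The sketch below, with hypothetical column names and sample rows, reports the basic statistics a profiling tool would use to scope a project: null rate, distinct-value count and the most frequent values in a column.

```python
from collections import Counter

def profile(rows, column):
    """Report null rate, distinct count and top values for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "null_rate": (len(values) - len(non_null)) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(3),  # most frequent values
    }

rows = [{"city": "Boston"}, {"city": ""}, {"city": "Boston"}, {"city": "Austin"}]
print(profile(rows, "city"))
```

Running such a pass over every column of a source system, before integration begins, is how profiling tools surface the surprises - unexpected nulls, rogue values, hidden duplicates - that otherwise derail large data projects mid-stream.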

Companies are taking advantage of external rules-based data quality processes to reuse work done on an initial project for additional projects. Even though the rules may be modified and refined from one project to another, this kind of reuse still saves time and money in implementation. It also promotes consistency across projects and systems. In effect, a single data quality initiative can grow into the start of a data governance program, where all the rules and metadata associated with various projects can be administered centrally if desired.
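One way to picture externalized, reusable rules is as a shared catalog that individual projects select from. The rule names and functions below are invented for illustration; the point is that the rules live outside any single project and each project composes the subset it needs.

```python
# Shared rule catalog, maintained centrally and reused across projects.
RULES = {
    "trim_whitespace": lambda v: v.strip(),
    "uppercase_country": lambda v: v.upper(),
}

def apply_rules(value, rule_names, rules=RULES):
    """Apply the named rules to a value, in order."""
    for name in rule_names:
        value = rules[name](value)
    return value

# Project A applies both rules; Project B reuses only one of them.
print(apply_rules("  us ", ["trim_whitespace", "uppercase_country"]))  # "US"
print(apply_rules("  Canada ", ["trim_whitespace"]))                   # "Canada"
```

Because the catalog is shared, refining a rule for one project immediately benefits every other project that references it - which is exactly the consistency argument the paragraph above makes for centrally administered rules and metadata.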

Multifaceted Data

Another change is in the type of data that companies want to improve and maintain. No longer is the focus simply on "name and address" data. In fact, as companies shift from product-centric data management to a more customer-centric design, even customer data has come to mean much more than names and addresses. Customer data also includes product information, purchasing histories, service reports, demographics and many other types of information that give complete pictures of customers and contain rich data for new analytics and business intelligence (BI) tools.
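The shift from "name and address" to a multifaceted customer entity can be made concrete with a simple data structure. The field names here are assumptions chosen to mirror the facets listed above:

```python
from dataclasses import dataclass, field

@dataclass
class Customer:
    """One customer entity carrying facets beyond name and address."""
    name: str
    address: str
    purchases: list = field(default_factory=list)        # purchasing history
    service_reports: list = field(default_factory=list)  # service interactions
    demographics: dict = field(default_factory=dict)     # segment, region, etc.

c = Customer("Ann Smith", "12 Main St",
             purchases=["order-1001"],
             demographics={"segment": "SMB"})
print(c.purchases, c.demographics["segment"])
```

Each facet typically has its own quality rules and its own source systems, which is why maintaining such a record is a data quality problem rather than merely a storage one.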

Global data, driven by increasing global business transactions and offshore outsourcing and operations, also presents large and difficult challenges for IT. It is not uncommon for large organizations to transact business in several countries and require technology that can automatically correct, standardize and consolidate records in multiple languages using various linguistic and cultural conventions - all in real-time or near real-time environments.

Today's data is also less likely to be a single, monolithic record of information delivered to all users and systems. Instead it is becoming more multifaceted and serving various business purposes in a number of operational, analytic and reporting applications, often simultaneously. The "single view" that businesses often want obscures the complexity of business purposes that such data must serve.

What's Behind the Consolidations

The current interest in master data management (MDM) and customer data integration (CDI) reflects the desire to consolidate and centralize data stores. The pressure to be able to build business processes that span applications in a service-oriented environment is driving these large integration initiatives. It is also clear that compliance and governance are important factors to consider in planning such projects.

The Connected Enterprise
