The Data Strategy Advisor

  • December 01 2004, 1:00am EST

With a substantial majority of respondents to the TDWI-Forrester Quarterly Technology Survey (61 percent) acknowledging that they do not operate an information or data quality tool, reasonable people could conclude this is an inactive market and one to be avoided. But Forrester believes the glass only appears to be half empty. The market is wide open and full of potential. Why? If end-user enterprises can attain a consistent, unified approach to information quality, and vendors can meet the requirement for integrated tools that deliver the essential, minimum set of functionality, then the market will be poised for growth.

Data quality, scrubbing and profiling tools have been available in the form of data processing technology for many years, yet the majority of enterprises still do not own a tool because:

Information quality is not a standalone application. Information and data quality are not separate applications such as order entry, accounting or market trend analysis, but a necessary condition of the possibility of all of these applications. The functionality that could be grouped into a tool and purchased as such tends to disappear into the background and be an invisible part of the context of the business application. Meanwhile, what has occurred is a gradual, incremental improvement in tools and in the awareness of end-user organizations that the data represents the lifeblood of the business. Information quality, while not a standalone application, is a success criterion for both transactional and business intelligence applications.

Enterprises do not have a consistent approach to information and data quality. A centralized or even federated approach would justify automating the information quality process, which, in turn, would motivate the acquisition of additional technology targeted at improving low quality data. Advances in awareness of the impact and cost of data defects, including at the CXO level, have also grown incrementally to the point where a critical mass of pain exists.

Information quality tool functionality has lacked integration. Until recently, tools have not integrated the minimum essential functions of data profiling, standardization, matching and deduplication into a single code base, fronted by a user-friendly interface and meta data-driven impact analysis and implementation; and, even now, the meta data part is often as much work-in-promise as implementation.
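To make those minimum essential functions concrete, the sketch below shows what standardization, matching and deduplication look like in miniature. This is a toy illustration, not any vendor's API: the helper names (`standardize`, `match_key`, `deduplicate`), the abbreviation table and the sample records are all invented for this example, and real products apply far richer parsing, reference data and fuzzy matching.

```python
import re

def standardize(record):
    """Normalize a raw customer record: case, whitespace and a few
    common street abbreviations (an illustrative subset only)."""
    name = re.sub(r"\s+", " ", record["name"].strip().upper())
    street = re.sub(r"\s+", " ", record["street"].strip().upper())
    # Expand trailing abbreviations; the dotted forms must come first.
    for abbr, full in (("ST.", "STREET"), ("ST", "STREET"),
                       ("AVE.", "AVENUE"), ("AVE", "AVENUE")):
        street = re.sub(rf"\b{re.escape(abbr)}$", full, street)
    return {"name": name, "street": street}

def match_key(record):
    """A crude blocking/match key: name prefix plus standardized street."""
    return (record["name"][:4], record["street"])

def deduplicate(records):
    """Standardize every record, then keep the first one seen per key."""
    seen, unique = set(), []
    for rec in map(standardize, records):
        key = match_key(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

raw = [
    {"name": "Acme Corp ", "street": "12 Main St."},
    {"name": "ACME CORP", "street": "12 Main Street"},
    {"name": "Widget Inc", "street": "5 Oak Ave"},
]
print(deduplicate(raw))  # the two Acme variants collapse to one record
```

The point of running standardization before matching is that the two Acme rows only become duplicates once case, whitespace and "St." versus "Street" have been normalized away; a tool that bolts these functions together from separate code bases has to repeat that normalization at every seam.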

For those enterprises that are ready to automate and acquire a tool, the market is a competitive one, and first-time buyers have the advantage of numerous choices. Candidates for second-generation information quality products include dfPower Studio from DataFlux, Information Quality Suite from Firstlogic, DataSight from Group 1 Software (recently acquired by Pitney Bowes), Athanor from Similarity Systems and Trillium from Trillium Software. Prospective buyers should diligently define their IQ requirements and bargain for favorable terms and prices with a shortlist of vendors as suggested here.

The Information Quality Tools Market Glass is Half Full

The fragmentation of the information and data quality market means that end-user enterprises can leverage the competition between the many small players to gain advantages in price, service and support as part of the tool acquisition cycle. In order to make the acquisition of an information quality tool worthwhile, end-user enterprises should:

Design and implement an information quality strategy. Implement a consistent, unified approach to information and data quality improvement within the enterprise. Perform a readiness assessment; identify missing commitments and take steps to secure them; and design and implement a defined, repeatable, measurable process to establish information quality as a priority.

Give priority to the process over the tool. Make the acquisition of an information quality tool part of a comprehensive approach to and method of information quality improvement. Deploy technology at key interfaces in the information quality supply chain as part of a design for quality.

Choose a second-generation tool. When purchasing an information quality tool, pick a second-generation product that provides integrated profiling, standardization and matching in a single code base with meta data-driven design. Match tool functionality to your requirements in these areas, understanding that technical strengths and weaknesses should be complemented with an assessment of vendor viability, service and support.

In The Data Warehousing Institute's TDWI-Forrester Quarterly Technology Survey (May 2004), the potential of the information quality software market is demonstrated in that 61 percent of respondents do not yet have any tool. While this data point can be taken to mean the glass is half empty, Forrester believes this is a market with huge potential for software providers and end users alike. What it will take to inspire end users to automate this set of critical functions holds the key to growth for vendors and the solution to knotty data quality issues for end-user enterprises.

Figure 1: Results of TDWI-Forrester Quarterly Technology Survey May 2004