
Enticing but Dangerous: Assessing Web Services from a Data Quality Perspective

May 01 2004, 1:00am EDT

From boardrooms to IT planning meetings, there is a wave of interest in the promise of Web services. It is a compelling promise: the prospect of leveraging Web technologies in new ways to realize an automated exchange of information in real time and across widely disparate applications. The business case supporting Web services certainly hits the right notes - automation will reduce human error; costs will decline precipitously; efficiency will accelerate. Additionally, new service bridges will spring up to link not only business partners, suppliers and customers, but also applications within the enterprise that have never been able to share information. That's a pretty good deal, and it's getting executive attention.

However, woven through this tapestry of benefits is a single issue with searing consequences for organizations that adopt a Web services initiative without understanding its strategic dependency: the quality of the underlying data.

A Crisis In Confidence: Executives are Concerned about Data

For most executives, data quality is already a problem. A study conducted by PricewaterhouseCoopers found that 75 percent of companies reported that inaccurate data led to "significant problems" including added costs, failure to bill or collect receivables, lost sales, poor service and eroded shareholder value. Executive confidence is particularly weak with respect to data originating from others. In fact, fewer than 20 percent of respondents reported feeling "very confident" about the quality of other organizations' data.

Data quality isn't just a problem - it's a priority. A survey of 1,648 companies by analysts at IDC has revealed that data cleansing/data quality is considered the second most urgent IT problem in 2003, right behind budget cuts (Source: "Business Analytics Implementation Challenges - Top 10 Considerations for 2003 and Beyond." May 2003, IDC).

These survey responses begin to raise some unsettling implications for the success of a Web services implementation. Data quality, of course, is important for any strategic IT initiative; but for Web services, it is the lifeblood of the solution.

The Excitement of Web Services: A Paradigm Shift in Interoperability

The excitement surrounding Web services is related to interoperability - the ability to pass information back and forth between applications designed around entirely different technologies, platforms and locales.

Today, the most common way to get disparate applications to exchange information is to rely upon application programming interfaces (APIs) and enterprise application integration (EAI) technologies that are fine-grained and tightly coupled. However, these approaches are notoriously finicky, typically expensive, don't scale well and require finding and retaining programmers with sophisticated skill sets. They can require a lot of time to design, test and implement; and though they may execute well in the walled rose garden of an enterprise's internal environment, they stumble just steps from the enterprise gate.

Web services appear ready to change much of this with a coarse-grained, loosely coupled architecture that is resilient and transcends the boundaries of the enterprise. Web services represent exciting new opportunities because the technology gives solution planners far greater interoperability than ever before - at lower cost, on a faster implementation timetable, and with far greater automation, reach and impact than most APIs and traditional EAI could ever deliver.

Web Services are Compelling From a Business Standpoint

At first glance, executives don't need to look hard for a strong business case. The rationale that supports aggressive ROI projections appears compelling:

  • Accelerated integration timetables: Web services allow an enterprise to accelerate the integration process. The complexity that bogs down a traditional API or EAI-centric design is replaced by standardized, text-based formats and protocols that make analysis and manipulation of data significantly easier (see the sketch following this list).
  • Lower implementation costs: Because Web services can be deployed more quickly and easily, a Web services initiative promises to dramatically cut integration costs. That's attractive to executives concerned with the high costs of IT initiatives, limited IT budgets and pressure to approve IT projects only when rapid and quantifiable ROI can be convincingly demonstrated.
  • Expanded reach and impact: Where traditional APIs merely bridge two different applications with a point-to-point architecture, Web services geometrically expand the solution reach with a "one-to-many" architecture.
  • High levels of automation: Any highly automated solution is attractive because, by definition, it minimizes the errors and inconsistencies generated by human involvement.
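
To make the "text-based format" point concrete, here is a minimal sketch in Python of how easily a standards-based payload can be inspected with ordinary tooling. The OrderStatus operation, its namespace and the field values are invented for illustration; they are not drawn from any particular product or specification.

    import xml.etree.ElementTree as ET

    # A SOAP-style envelope of the kind a Web service might exchange.
    # The OrderStatus operation and its namespace are hypothetical.
    envelope = """
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <OrderStatus xmlns="urn:example:orders">
          <OrderId>10045</OrderId>
          <Status>SHIPPED</Status>
        </OrderStatus>
      </soap:Body>
    </soap:Envelope>
    """

    # Because the payload is plain, self-describing text, any platform
    # can parse it with standard libraries rather than custom binaries.
    ns = {"soap": "http://schemas.xmlsoap.org/soap/envelope/",
          "o": "urn:example:orders"}
    root = ET.fromstring(envelope)
    status = root.find("soap:Body/o:OrderStatus/o:Status", ns)
    print(status.text)  # SHIPPED

Contrast this with a binary, point-to-point API, where the same lookup would require the producer's proprietary libraries on both ends of the exchange.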

Web Services Carry Dangerous Implications for Data Quality

However, managers who count on capturing each of these business benefits make one critical assumption: that the quality of the data that drives the solution is high. In most cases, that's an incorrect assumption. Not only is data quality frequently poor, but the very benefits of Web services that make it attractive to enterprises also magnify the negative impacts associated with poor data quality. Consider the following implications.

  • An accelerated timetable tends to short-change data quality. When IT projects are approved based on short implementation schedules, planners and programmers are far less likely to take time to thoroughly address strategic issues such as data quality - particularly if an enterprise does not already have a strong cultural awareness of the importance of data quality and the information governance structure necessary to translate awareness into a leverageable asset. Time and again, data quality gets short shrift. This is a dangerous oversight because, by design, Web services link fragmented, disparate and poorly aligned applications, and by nature these applications do not generate data in a standardized manner.
  • Lower integration costs can mean cutting the data quality budget too. With expectations of lower integration costs, executives can fall into the trap of also cutting the budgets that support data quality. What is typically overlooked is that the investment needed to improve and protect data quality becomes more critical - not less - because the very factors that make the solution appealing raise the performance benchmarks for data quality.
  • Expanded reach and impact creates problems if data is bad. With far more users invoking an enterprise's Web services (effectively with the enterprise's consent), two issues arise. On the one hand, the sponsoring enterprise's obligation to ensure the confidentiality, integrity and accuracy of the underlying data grows correspondingly. On the other hand, if those protective measures are not in place, the negative impacts of poor data grow in both volume and severity, depending on the nature of the solution, the sensitivity of the underlying data, the uses to which the data is put and the reach of the Web services.
  • Automation can increase the consequences of bad data. Automation is fundamentally a data-driven process, so its dependence on high-quality data is readily apparent. However, automation does more than simply move data around without human intervention; it is a strategic business process that applies business logic and rules to data on a near real-time basis. In effect, automation is the powerful engine that dramatically increases the speed with which 1) raw data is converted into information, and 2) information is translated into action. By extension, if the raw data is poor, the information is wrong, misleading or untimely, and bad information leads to poor decision making (see the sketch following this list).
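
As a concrete illustration of that last point, here is a minimal sketch of an automated business rule acting on an invented inventory feed. A single corrupt record is acted on just as quickly as a clean one - exactly the risk described above. The SKUs, quantities and reorder threshold are all assumptions made for the example.

    # A hypothetical automated reorder rule; all names, values and
    # the threshold below are invented for illustration.
    REORDER_POINT = 20

    inventory_feed = [
        {"sku": "A-100", "on_hand": 12},   # legitimately low: reorder
        {"sku": "B-200", "on_hand": -45},  # bad data: negative count
        {"sku": "B-200", "on_hand": 180},  # duplicate of the same SKU
    ]

    def needs_reorder(record):
        # The business rule, applied automatically with no human review.
        return record["on_hand"] < REORDER_POINT

    # Without a data quality gate, the corrupt record triggers a
    # purchase order just as quickly as the valid one does.
    for record in inventory_feed:
        if needs_reorder(record):
            print("Issuing purchase order for", record["sku"])

    # A simple validity check keeps bad records out of the rule engine.
    clean_feed = [r for r in inventory_feed if r["on_hand"] >= 0]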

What are the Business and Financial Consequences?

What is at stake? Problems with data quality, such as data duplication, inaccuracies and inconsistencies, can lead to more than poor decision making. Other business and financial impacts can include increased costs and missed revenue opportunities, in addition to possible disruptions to operations, harm to brand integrity or damage to the public's perception of the enterprise. Poor data management may also raise strategic legal and regulatory issues. An important question that should be addressed is whether poor data quality in the context of a Web services solution exposes the sponsoring enterprise to heightened legal, liability and regulatory risk.

If not properly addressed, these impacts can compromise - and in some cases completely eliminate - the cost savings and efficiencies originally targeted. Following are a number of pragmatic steps that are integral to ensuring that the underlying data will support a Web services solution.

  1. Start by linking Web services and data quality at a strategic level. A Web services initiative will not succeed unless executives and IT planners develop a clear picture of how the initiative should be strategically aligned with the enterprise's business objectives and its IT infrastructure. Getting this clarity requires a structured and disciplined approach that highlights not only the value Web services is expected to provide, but also specifically how data quality affects project goals. In some corporations, the complexity of these strategic issues will compel executives to temporarily table a Web services initiative until a comprehensive enterprise data management (EDM) strategy can be instituted.
  2. Make data quality a key objective. Because Web services expose data from heterogeneous sources, a central challenge is maintaining master data, controlling consistency and standardization, and establishing effective data quality metrics that support managerial oversight, benchmarking and a platform for continuous improvement (a minimal metrics sketch follows this list).
  3. Assign ownership and accountability for data quality. It is essential that accountability be identified at key points across the solution architecture, starting at the board level and extending into 1) the business units consuming the services, 2) the IT department maintaining the solution, and 3) any third-party organizations that will be originating, changing or transferring data. At each juncture, someone must be ultimately responsible for the quality of the data that will support the Web services application.
  4. Tighten the scope of the project. For many companies, a strategic approach to data quality on an enterprise scale may not be feasible. Furthermore, if results are needed in short order, narrowing the project's scope can bring limiting factors such as cost, time, risk or exposure down to acceptable levels. A staged approach can resolve the tension between a compressed implementation schedule and the long lead time needed to bring data quality standards up to par. In these cases, a pilot project may represent the most cost-effective way to 1) demonstrate the value of a Web services solution before committing to a full implementation, 2) provide a pragmatic illustration of the implications data quality has for solution performance, and 3) deliver other benefits, such as bringing a core task group up to speed before a full-scale rollout to other operating divisions.
  5. Emphasize collaboration with third parties. Whether a Web services project links internal applications or steps across the enterprise perimeter to share business information with suppliers, customers or alliance partners, maintaining a solid focus on data quality will require a high level of collaboration between multiple parties. Care should be taken, for example, to ensure that data quality standards receive the same level of priority among organizational participants and that the same performance metrics are embraced.
  6. Develop training and awareness. A strategic data quality commitment will fail unless it has support from all stakeholders, is explicit, is aligned with incentives and is backed by a means of measuring performance and enforcing compliance. Building this support at all levels of the organization requires a commitment to ongoing education and training as well as highly visible and explicit support from executive leadership.
  7. Recognize the need for an ongoing commitment. Data quality remediation is not a one-off exercise. Data quality must be addressed in the initial planning stages, throughout the development process and consistently over the lifetime of the solution.
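
To ground step 2, the following minimal sketch computes three common data quality metrics - completeness, validity and conformance to a chosen standard - over an invented customer extract. The field names, the e-mail pattern and the choice of "US" as the standard country code are assumptions for illustration only, not a prescribed standard.

    import re

    # Invented customer records standing in for a heterogeneous extract.
    records = [
        {"id": "C-1", "email": "pat@example.com", "country": "US"},
        {"id": "C-2", "email": "",                "country": "US"},
        {"id": "C-3", "email": "not-an-email",    "country": "USA"},
    ]

    EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    total = len(records)

    # Three simple metrics a data quality dashboard might track.
    completeness = sum(1 for r in records if r["email"]) / total
    validity = sum(1 for r in records if EMAIL.match(r["email"])) / total
    conformance = sum(1 for r in records if r["country"] == "US") / total

    print("completeness: %.0f%%" % (completeness * 100))  # 67%
    print("validity:     %.0f%%" % (validity * 100))      # 33%
    print("conformance:  %.0f%%" % (conformance * 100))   # 67%

Tracked over time, metrics of this kind give management the benchmarking and oversight platform that step 2 calls for.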

Look Beneath the Surface

The promise of Web services is real - as real as the dangers of poor data quality. As a bridge between disparate applications, Web services may be the most effective solution yet. However, unless enterprises address the underlying data quality issues from a strategic perspective, the compelling benefits they seek will remain elusive, and the consequences of that oversight will begin to appear on boardroom agendas.
