When it comes to maintaining data quality in the enterprise, too many corporate executives take a laissez-faire attitude that ultimately has a negative impact on the business.

That is the conclusion of a new study from 451 Research and Blazent, a Burlingame, CA-based provider of IT data intelligence. The study, the “2016 State of Enterprise Data Quality” report, reveals that fewer than half (40 percent) of C-level executives and data scientists are ‘very confident’ in their organization’s data quality, even though the vast majority (94 percent) recognize the impact that poor data quality can have on business outcomes.

The report is based on a survey of 200 C-level and senior IT leaders from companies with at least $500 million in annual revenue. It notes that this disconnect around data quality can affect a number of areas, including lost revenue (cited by 42 percent) and bad decision-making (cited by 39 percent).

“Too often, IT leaders become enamored by the concept of big data without questioning its quality or validity. This report reveals the cost of this oversight and the overall impact it has on the business,” said Carl Lehmann, research manager, Hybrid IT Architecture, Integration & Process Management at 451 Research. 

When asked about the root causes of poor data quality, nearly half (47 percent) of respondents cited data migration as a leading cause. In large enterprises, more often than not, IT still bears the burden of keeping data clean (79 percent) despite the introduction of data scientists (26 percent).

The study confirmed the findings of other recent research pointing to a problem of expensive data scientists being poorly used on data initiatives. In cases where data scientists are involved, one-third report spending up to 90 percent of their time “cleansing” raw data, a trend some analysts have described as using data scientists as ‘janitors.’
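To give a sense of what that “cleansing” work looks like in practice, here is a minimal, hypothetical Python sketch of common steps such as deduplication, dropping records with missing keys, type coercion, and normalizing inconsistent dates. The data set and column names are illustrative assumptions, not drawn from the study.

```python
import pandas as pd

# Hypothetical raw records with the kinds of defects that routinely
# consume data scientists' time: duplicates, missing keys, unparseable
# numbers, and inconsistent date formats (data is illustrative only).
raw = pd.DataFrame({
    "customer_id": ["001", "002", "002", "003", None],
    "revenue": ["1200", "950", "950", "n/a", "480"],
    "signup_date": ["2016-01-05", "2016/02/10", "2016/02/10",
                    "2016-03-01", "2016-03-15"],
})

clean = raw.drop_duplicates()                 # remove exact duplicate rows
clean = clean.dropna(subset=["customer_id"])  # drop rows missing the key

# Coerce numeric fields; unparseable values like "n/a" become NaN.
clean["revenue"] = pd.to_numeric(clean["revenue"], errors="coerce")

# Normalize inconsistent date separators, then parse to datetime.
clean["signup_date"] = pd.to_datetime(
    clean["signup_date"].str.replace("/", "-"), errors="coerce"
)

print(clean)
```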

The challenge of maintaining data quality is sometimes made worse by the conventional methods used to ensure it, the study notes. For example, 41 percent of respondents said they rely on applications to validate data, 38 percent manually cleanse data, and “a shocking 10 percent … either don’t know what they’re doing or employ a ‘hope for the best’ approach.”
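By way of contrast with the ‘hope for the best’ approach, here is a minimal sketch of the kind of rule-based check an application-level validator might apply before accepting a record. The field names and rules are assumptions for illustration, not taken from the report.

```python
from datetime import date

def validate_record(record: dict) -> list:
    """Return a list of validation errors (empty if the record is clean).
    Field names and rules are hypothetical examples."""
    errors = []
    if not record.get("customer_id"):
        errors.append("customer_id is required")
    revenue = record.get("revenue")
    if not isinstance(revenue, (int, float)) or revenue < 0:
        errors.append("revenue must be a non-negative number")
    signup = record.get("signup_date")
    if not isinstance(signup, date) or signup > date.today():
        errors.append("signup_date must be a valid, non-future date")
    return errors

# A record with two defects: an empty key and a negative revenue figure.
print(validate_record({"customer_id": "",
                       "revenue": -50,
                       "signup_date": date(2016, 3, 1)}))
```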

“Despite the lax attitude and outdated methods used to maintain quality, the importance of good data is tangible and cannot be overlooked. Eighty-one percent of respondents use data analytics to uncover new revenue opportunities, and it is perceived as having a direct impact on increased revenues (51 percent) and lower costs (49 percent),” the study notes.

Data quality also made its way to the top of the list of big data attributes, with the majority of respondents prioritizing integrity (71 percent) and accuracy (68 percent) over timeliness (47 percent) and accessibility (41 percent). Solutions to the problem range from machine learning for asset management (47 percent) to purpose-built data quality and analytics tools.

Other key findings from the research include:

  • 95 percent expect data sources and volumes to triple in the next 12 months.
  • 82 percent of respondents believe their organization overestimates the quality of its data.
  • 5 percent of respondents have no plan to implement data quality tools or solutions in the coming year.

“While data scientists became one of the most coveted roles in IT this past year, the reality is that CIOs and IT leaders still carry the burden of maintaining the proper checks and balances for data quality, and it will be incumbent on them to solve this unwieldy problem as data volumes continue to escalate,” said Gary Oliver, CEO at Blazent.
