According to the survey results, fewer than 1 percent of businesses lack a data quality strategy, and data accuracy improved an average of 8 percentage points from last year. The most common types of errors noted in the survey are incomplete or missing data, outdated information and duplicate data; 92 percent of organizations said they have duplicate data within their systems.
“The fast-paced, omni-channel environment often results in siloed touch-points and reduced resources,” said Thomas Schutz, SVP, general manager of Experian QAS. “To ensure a positive customer experience, many businesses are implementing new strategies to better utilize consumer intelligence and analytics. To gain a high level of insight that can create a more personalized experience across channels, organizations need to improve accuracy and incorporate data quality into strategic planning.”
Human error, cited by 65 percent of organizations, is the main cause of data problems, followed distantly by a lack of internal manual resources, an inadequate data strategy and insufficient budget; just 14 percent cited inadequate senior management support, which the authors said illustrates the importance the C-suite places on data quality.
The omni-channel environment, the report said, is changing how companies message to consumers. To create meaningful interactions and customer experiences, companies need to make real-time, dynamic offers, and therefore need demographic and behavioral details to better understand individual needs.
The report offers four steps to implementing real-time relevant customer messages:
- Clean internal data – The key to real-time consumer intelligence is being able to marry lots of different information quickly to provide relevant offers
- Clean incoming information – Verifying data as it enters the system improves the accuracy of inbound information, so organizations can get more from marketing efforts and receive more accurate matches from third-party data providers
- Enhance searching capabilities – Enhance capabilities to allow for matching, even with minor errors, to aid in pulling and truly understanding internal data.
- Plan – Marketers need a strategic plan for leveraging consumer intelligence and to articulate which data they need to achieve their goals.
Duplicate data is now among the most common data quality issues, and 92 percent of participants admit having duplicate data as a result of human error, multiple points of entry, multiple databases and multiple business channels. The report offered several techniques to remove existing duplicate records within databases:
- Standardize contact data
- Define the desired level of matching, as well as a tolerance level for duplicate records
- Use software to identify duplicates based on the defined criteria
- Prevent new duplicates from being created by implementing fuzzy-matching technology
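The report does not name a specific deduplication tool. As a minimal sketch of the technique it describes, the steps above (standardize, set a tolerance, let software flag matches) can be illustrated with Python's standard-library difflib; the sample records and the 0.85 tolerance are hypothetical:

```python
from difflib import SequenceMatcher

# Illustrative contact records; names, addresses and threshold are hypothetical.
records = [
    {"name": "Jon Smith",  "address": "12 Main St"},
    {"name": "John Smith", "address": "12 Main Street"},
    {"name": "Ana Lopez",  "address": "99 Oak Ave"},
]

def standardize(record):
    """Normalize case, whitespace and common abbreviations before comparing."""
    text = f"{record['name']} {record['address']}".lower()
    for full, abbrev in (("street", "st"), ("avenue", "ave")):
        text = text.replace(full, abbrev)
    return " ".join(text.split())

def is_duplicate(a, b, tolerance=0.85):
    """Fuzzy-match two standardized records against a chosen tolerance level."""
    return SequenceMatcher(None, standardize(a), standardize(b)).ratio() >= tolerance

def dedupe(records, tolerance=0.85):
    """Keep only records that do not fuzzily match an earlier record."""
    kept = []
    for rec in records:
        if not any(is_duplicate(rec, k, tolerance) for k in kept):
            kept.append(rec)
    return kept

print(dedupe(records))  # the two "Smith" variants collapse to a single record
```

In practice the tolerance level is the knob the report refers to: raise it and only near-identical records merge; lower it and more variants are treated as duplicates, at the risk of false matches.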
There are several steps businesses can take to eliminate human error, the report said:
Identify data entry points: Understand how information enters systems; consider all channels and data entry points to create full data workflow, then prioritize projects based on high-volume channels or excessive data quality errors.
Train staff: A lot of information is manually entered; explain the importance of accurate data to employees and how information is used throughout the business.
Automate verification processes: Software solutions can be implemented to help prevent inaccurate address and email contact details from entering systems, for example. Determine what data is most important to the business, then evaluate and prioritize solutions.
Clean the information: Regular database maintenance allows organizations to review information and make certain that tools are effectively managing data to the required level of quality.
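The report does not specify a particular verification product. As an illustrative sketch of the automated-verification step, a basic syntactic check on email entries at the point of capture might look like the following; the pattern is deliberately simple, not a full RFC 5322 validator, and real services also check deliverability:

```python
import re

# Hypothetical pattern: rejects obviously malformed addresses only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def verify_email(address):
    """Flag clearly malformed email addresses at the data entry point."""
    return bool(EMAIL_RE.match(address.strip()))

print(verify_email("chris@example.com"))  # True
print(verify_email("chris@@example"))     # False
```

A check like this catches typos before they reach the database, which is cheaper than cleaning them out later; the same idea applies to postal addresses via address-verification services.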
The most common problem according to those surveyed is sending mailings to the wrong address, followed by sending multiple mailings to the same customer, and staff inefficiencies; 32 percent said inaccurate contact data negatively influences customer perception and 29 percent said they had lost a customer due to inaccurate data input.
This story originally appeared at Insurance Networking News.
Chris McMahon is senior editor for Insurance Networking News, a SourceMedia publication.