September 16, 2010 – Can you put a price on good data? Intuitively and logically, we know it's important to have the cleanest, deduped, well-managed data possible.
Some researchers have actually put numbers to the financial impact of data, finding that maintaining good data could yield a 10-percent reduction in processing time, which in turn can translate into millions, if not billions, of dollars saved for organizations. The study was conducted by researchers from the McCombs School of Business at the University of Texas and the Indian School of Business, and underwritten by Sybase, now a part of SAP.
The researchers say that taking steps to increase “data usability,” meaning making data more accessible to users and of higher quality, can go a long way toward lowering the subsequent processing time needed at the back end.
The Sybase study looked at industry sectors and estimates that insurance companies could realize a 105-percent return on equity (ROE) as a result of a 10-percent improvement in data quality and mobility. ROE is defined as net income divided by shareholder equity, and is an important indicator of a business’s ability to grow.
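The ROE definition above is a simple ratio; a minimal sketch in Python, using hypothetical figures (not numbers from the study), shows how a 105-percent ROE would be computed:

```python
# Illustrative ROE calculation. The figures below are invented for
# illustration only; they do not come from the Sybase study.

def return_on_equity(net_income: float, shareholder_equity: float) -> float:
    """ROE = net income / shareholder equity, expressed as a percentage."""
    return net_income / shareholder_equity * 100

# e.g. $21M net income on $20M of shareholder equity
print(return_on_equity(21_000_000, 20_000_000))  # → 105.0
```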
“As sales forces become increasingly mobile, it is imperative for competitiveness that they have high quality data and IT systems that enable rich interactions with customers,” the study stated.
Data quality is a critical issue that insurance companies are starting to come to terms with. Jason Tiret, director of modeling and architecture solutions for Embarcadero Technologies, recently shared his insights on what it takes to get your data house in order.
“Poor data quality leads to a gross misunderstanding of your data, and misunderstanding your data is a bit like driving blindfolded,” he says. “You may eventually get to where you need to go, but it is going to be very risky and very costly.”
What's the best way to go about achieving data quality? Don't try to tackle your entire data store at once, Tiret advises. “Focus on the high-visibility, business-critical areas first. This will provide immediate value and, more importantly, demonstrate success that will gain momentum and buy-in to focus on other areas. If that is claim information and product codes, focus on that. If that is customers and their contact mechanisms, focus on that.”
Tiret also advocates a “thorough understanding of the data and the structure of the data — technical metadata.” To reach this understanding, a data model or data flow model is essential.
“Involving the subject matter experts and business will help gain understanding of the data and its accuracy according to the business,” he says. “It is easy, technically, to see if only dates are stored in a date column. It is much more difficult to understand that a claim submission date must be on or after the incident date.”
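Tiret's example of a business rule, that a claim's submission date must fall on or after its incident date, is exactly the kind of check a simple validation pass can catch. A minimal sketch (the record layout and field names here are assumptions for illustration, not from any real system):

```python
from datetime import date

# Hypothetical claim records; field names are invented for this example.
claims = [
    {"claim_id": "C-001", "incident_date": date(2010, 3, 1),
     "submission_date": date(2010, 3, 5)},   # valid: submitted after incident
    {"claim_id": "C-002", "incident_date": date(2010, 4, 10),
     "submission_date": date(2010, 4, 2)},   # invalid: submitted before incident
]

def invalid_claims(records):
    """Return IDs of claims whose submission date precedes the incident date,
    violating the business rule described above."""
    return [r["claim_id"] for r in records
            if r["submission_date"] < r["incident_date"]]

print(invalid_claims(claims))  # → ['C-002']
```

A type check on the column would pass both records; only a rule that encodes the business's own understanding of the data flags the second one, which is Tiret's point.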
This originally appeared on Insurance Networking News.