An insurer's IT systems can communicate among themselves only if they're speaking the same language. Accordingly, carriers are looking to ensure the data from various systems conform to the same standards and models.

With insurers pressed to respond rapidly to market opportunities, competitive threats and regulatory changes, much hinges on the smooth flow of data among departments, says Mark Lewis, GM of the IBM Global Insurance Industry, Armonk, N.Y. "This is about business as much as it's about IT."

The free flow of information throughout the enterprise can improve the performance of individual business units, says Rich Maynard, assistant VP and chief architect of property/casualty technology for The Hartford Financial Services Group, Hartford, Conn.

"We need to go beyond our silos and leverage each other's information to make our products better," says Maynard. A workers' comp executive, for example, can learn from the way colleagues handle small-business policies. "The only way we're going to achieve the benefits of trying to cross-sell everybody is to bring them together through a common model," he says.

Updating standards and models also adds years to the useful lives of aging IT systems, consultants say. Keeping those systems in operation postpones the need to spend millions of dollars to replace hardware and commit countless hours of development time, they point out. Therefore, it's up to the IT staff to respond to business-side demands, even if that entails teasing a 25-year-old legacy system that speaks "underwriting English" into meshing with a relatively new system that's fluent only in "claims English," says Mike Freel, a bureau statistics manager at Des Moines, Iowa-based EMC Insurance Group Inc.

The Difficulties

Beyond bridging that cultural divide within a single company, carriers that merge or make acquisitions can find themselves integrating data and definitions that are even further out of sync. Freel provides an example: "Company A may consider a claim closed when they make the last payment. But company B may hold the claim file open longer because they're anticipating recoveries coming in."
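
The gap Freel describes is a definitional one, so any integration has to encode an agreed-upon rule. A minimal sketch, assuming hypothetical field names and rules for the two companies, might look like this:

```python
# Hedged illustration of the "closed claim" gap Freel describes. The field names
# and the rules below are assumptions for illustration, not either company's logic.
def claim_is_closed(claim: dict, company: str) -> bool:
    if company == "A":
        # Company A closes the claim when the last payment goes out.
        return claim["last_payment_made"]
    # Company B holds the file open while recoveries are still anticipated.
    return claim["last_payment_made"] and not claim["recoveries_pending"]
```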

The huge storage capacity of today's IT systems also has widened the disparities between legacy data and recently produced data. In the old days, with storage at a premium, IT staffs saved space by stripping down field sizes and overloading fields with multiple meanings that depended on other details in the record, Freel says. Now, technical advances have eliminated the need to save space.

"When I started 30 years ago, some of our main company records were 80 and 180 characters in length. Now the records that serve those same purposes are 500 or 1,000 characters in length," he says. "So we've seen a huge increase in main data files that are serving exactly the same purpose."

Some systems still in use are so old, they were created before standards existed, says Karen Pauli, a research director in the insurance practice of Needham, Mass.-based TowerGroup. "People stuck data in odd, empty, little buckets," Pauli says. "When they try to integrate it, it's a huge problem."

Meanwhile, automated interpretation of data has eliminated or changed the jobs of people who used to look at pieces of paper and easily reconcile differing sets of data. Machines cannot understand how a dollar amount with no punctuation jibes with a dollar amount adorned with a dollar sign, comma and decimal point, Freel says.
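
Automating that reconciliation means normalizing both representations before comparing them. A minimal sketch, assuming the unpunctuated figure carries an implied two decimal places, follows:

```python
# A minimal sketch of the reconciliation Freel says machines can't do on their own:
# normalizing "$1,234.56" and "000123456" (implied decimal point) to one value
# before comparing them. Both formats are assumptions, not a real carrier's layouts.
from decimal import Decimal

def normalize_amount(raw: str, implied_decimals: int = 2) -> Decimal:
    """Convert either a formatted or an unpunctuated amount to a Decimal."""
    if "$" in raw or "," in raw or "." in raw:
        return Decimal(raw.replace("$", "").replace(",", ""))
    # Unpunctuated legacy amounts carry an implied decimal point.
    return Decimal(raw) / (10 ** implied_decimals)

assert normalize_amount("$1,234.56") == normalize_amount("000123456")
```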

Improving Data

Carriers that want to improve the consistency of their data could begin by making that mission a priority - a task that's not always easy, Pauli says. Getting executives to modify an antiquated technology stack or change their familiar ways of doing business can take some doing, she maintains.

"They'll dig in and defend their positions," Pauli says of those staff members. She recommends bringing all of the interested parties into a room, mapping out their tasks, and determining where the data is located. "Then, when we get it on one huge, gigantic, hideous spreadsheet, we say to them: 'You can't get to Point Z, which is fully integrated, transparent data that you can transact against, in this spaghetti mess.' "

Once that realization takes hold, IT can begin to persuade executives on the business side to embrace a data standards project, says The Hartford's Maynard. Twice during his 31 years at The Hartford, Maynard has seen standards efforts fail because IT neglected to include business executives in the creation of definitions, he says.

Instead of starting from scratch on the standards, IT should adopt an existing model, Maynard says. Choices include models from Pearl River, N.Y.-based ACORD, IBM's Insurance Application Architecture or the Needham, Mass.-based Object Management Group. If IT "re-engineers backward" from a company's established methods, executives defend their ways of doing business, indulging in unproductive debate and defensive posturing, he says. The model gives those executives a "common enemy" to argue against, he says.

After a carrier decides on a model, a new question arises: Which department gets to use the model and standards first? Look for a new business opportunity to exploit, demonstrate success, and then go to older parts of the organization and introduce the changes, Maynard suggests.

To help accomplish all that, a carrier can designate one executive as the data czar, Pauli says, suggesting the COO or CIO. "The line of business heads will have a tendency to respect a business person more," she notes. Still, many CIOs understand the business and work closely with the business side, she says. "Those CIOs carry as much authority as a line-of-business person," she adds.

In the initial stages of data improvement, carriers should back off from what's commonly viewed as a standard - the size of the field or the range of valid values, for instance - and begin by convincing varied groups to agree upon underlying definitions, Freel says.

"You need to get people together, rather than the newer trend of moving away from meetings because they're inefficient," he says, adding that too many executives now say, "Let's communicate by e-mail," and let it go at that.

In a face-to-face meeting on definitions or standards, an executive who wants a 50-character name can yield to one who advocates a 100-character name because the added capacity causes no harm, Freel says by way of example. Making those meetings routine can help, he adds.

Pauli agrees that IT staffs need to meet with the business side. "We talk to many business executives who say 'I'm very tightly aligned with my IT counterpart,'" she says. "Exactly what does that mean - a quarterly meeting? That's not tightly aligned." The two sides do not have to share lunch every day, but need to meet at least weekly, she maintains. "That's alignment."

Bringing together the chief claims officer and CIO does not result in the most productive data integration meetings because executives at that level steer the whole organization from above, Pauli adds. She recommends going down the "food chain" to VPs who maintain closer contact with daily operations.

Once everybody's talking, the real work begins with the preparation of data for integration. That process takes three steps, according to Martina Conlon, a principal at New York-based Novarica who focuses on IT in the firm's insurance practice. Conlon used ACORD as an example in her explanation of the process.

The first step - data quality analysis - doesn't require using the ACORD standards, she says. The analysis simply amounts to gaining an understanding of how good or bad the data in the source systems is before exposing it to trading partners or shipping it to other systems.
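
Conlon doesn't tie the analysis to a particular tool; at its simplest, a profiling pass just measures how complete and consistent the source data is. The sketch below assumes a CSV policy extract with hypothetical column names.

```python
# A minimal data-profiling pass of the kind Conlon describes: measuring how complete
# and consistent source-system data is before sharing it. Column names are hypothetical.
import csv
from collections import Counter

def profile(path: str, columns: list[str]) -> dict:
    """Report the fill rate and distinct-value count for each column of interest."""
    stats = {col: {"filled": 0, "values": Counter()} for col in columns}
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            for col in columns:
                value = (row.get(col) or "").strip()
                if value:
                    stats[col]["filled"] += 1
                    stats[col]["values"][value] += 1
    return {
        col: {
            "fill_rate": s["filled"] / total if total else 0.0,
            "distinct": len(s["values"]),
        }
        for col, s in stats.items()
    }

# e.g. profile("policy_extract.csv", ["policy_status", "state_code"])
```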

Second, carriers determine which systems need to share data - which systems consume the data and which provide it - Conlon says. In this step, carriers can map each system's data to the ACORD standard to show, for example, that the "policyholder last name" in System 1 maps to the "party last name" in System 2.
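
In code, that mapping step often amounts to a lookup table from each system's field names to the shared model's names. Only the last-name example comes from Conlon; the other fields and target names below are illustrative assumptions.

```python
# Sketch of the mapping step Conlon describes: tying each source system's fields
# to a common name. Field names other than the last-name example are hypothetical.
COMMON_MODEL_MAP = {
    "system1": {"policyholder_last_name": "party_last_name",
                "pol_eff_dt": "policy_effective_date"},
    "system2": {"party_last_name": "party_last_name",
                "effective_date": "policy_effective_date"},
}

def to_common(system: str, record: dict) -> dict:
    """Rename a source record's fields to the shared model's names."""
    mapping = COMMON_MODEL_MAP[system]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

print(to_common("system1", {"policyholder_last_name": "Rivera",
                            "pol_eff_dt": "2009-01-01"}))
```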

Third, carriers can employ ACORD XML standards as the data structure for exchanging data between systems, Conlon says. Several vendors offer software to help carriers adapt the ACORD XML standard, she says.
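
The last step swaps system-specific record layouts for a shared XML structure on the wire. The sketch below shows the general shape of that exchange; the element names are simplified placeholders, not the actual ACORD XML schema.

```python
# A sketch of exchanging data as XML between systems, in the spirit of the ACORD XML
# step Conlon describes. Element names are placeholders, not the real ACORD schema.
import xml.etree.ElementTree as ET

def to_exchange_xml(common_record: dict) -> str:
    """Wrap a common-model record in a simple XML envelope for transmission."""
    root = ET.Element("PolicyMessage")
    for name, value in common_record.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(to_exchange_xml({"party_last_name": "Rivera",
                       "policy_effective_date": "2009-01-01"}))
```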

Once the standard data exchange format is in place, carriers are positioned to "plug and play" commercially available core systems into the environment and achieve a best-of-breed approach, she says.

Outside Help

Such software and outside advice have become much more common in the insurance business in the last year, representing a "huge switch" from proprietary attitudes, Pauli says. She attributes the swing to technological improvements in the last three years, five years of declining premiums that necessitated efficiency and dictated cost-cutting, and a growing fear of failed IT projects. "They will help us be error-proof," she says of experienced outside experts and software.

As the aging labor pool of legacy programmers approaches retirement, converting older data to current standards takes on a new urgency, says Ron Nunan, product marketing manager for Verastream Modernization software from Seattle-based Attachmate Corp.

His company's services-oriented approach to software helps legacy and modern systems work together, producing Web sites, Web applications and other projects that operate far beyond the initial scope of the older systems, he says.

North Hollywood, Calif.-based Century-National Insurance Co. has been working with Attachmate for 20 years and began using the Verastream product about five years ago, says Lou Balicki, Century-National VP of information systems.

"We have used [Verastream] as a wrapper around some of our legacy systems to allow on-the-fly communication between browser-based applications - both internal and external - to these older mid-range applications," Balicki says.

"We have an integrated system, but it's like a quilt," he continues. "Some of the patches are older than others. This makes the quilt look like the seamless felt on a pool table to the outside world."

Without the product, Century-National probably would have replaced its legacy systems sooner - at much greater expense in dollars and development time - Balicki says. Besides, spending time at a small company to replace older systems instead of enhancing the attractiveness of the company to agents and consumers would have been "like re-paneling the basement and nobody's going to see it," he notes.

The Journey

The symbolism of data integration doesn't end with quilts and basement paneling. The process strikes many in the insurance industry as a journey instead of a destination.

"It's a long journey, and not for the faint of heart," The Hartford's Maynard says. His people are taking on a 40-year-old policy administration system with countless customizations. They anticipate taking eight to 10 years to modernize it.

IBM's Lewis uses the same metaphor. On the journey toward standards, he says, carriers are scattered along a spectrum, with some near the beginning, most somewhere in the middle and only a few near the end.

As carriers approach that end, he notes, they have made enough progress to catch up on their data integration and need to make more changes only as the business itself changes. For many, attaining that goal could take three to five years, he says.

Journey or not, compliance with standards and models provides transparency and keeps the risk-management engine running smoothly, says TowerGroup's Pauli. "If there are carriers still doubting they need standards and need to have data integrated," she asserts, "they've been living inside of a cave."
