Below is the 9th of 11 Novarica Research Council Impact Award nominee case studies, which Insurance Networking News is presenting in no particular order. The awards, which honor best practices in insurance industry IT initiatives and strategy, will be presented at the research and advisory firm’s August 13th event in New York.
Penn Mutual recently completed an initiative to significantly improve ease of access to critical information, as well as data quality and consistency, by creating a “data-as-a-platform” capability that unifies information about clients, policies and producers. The initiative also makes data available to all application systems and enables a 360-degree view across all information.
The project transformed the technical architecture by providing a reusable data platform with a services-oriented architecture (SOA) framework for all new project initiatives, speeding up development, improving data accuracy and consistency, and reducing costs.
Penn Mutual, a provider of life and annuities insurance, executed the project in conjunction with a series of major business undertakings over a period of time. The business was highly involved in the definition of requirements for these efforts.
The project was managed under the program management office using a waterfall methodology. Steering committees, enterprise architecture, compliance and other areas were also involved. In addition to requirements gathering, business leaders took on data stewardship functions, including analysis, completeness, consistency and remediation.
The project was staffed by an average of five IT employees over the three and a half years it took to bring the project from conception to production.
The overall architecture combines SOA with master data management (MDM) techniques to offer consistent and comprehensive access methods to the data. Underlying technology principles include industry-standard data representation and transaction interfaces, big data file systems, query interfaces and web service security.
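The MDM side of that architecture can be illustrated with a small sketch. The source-system names, field names and precedence order below are hypothetical assumptions for illustration, not Penn Mutual’s actual implementation: records for the same client arriving from multiple systems are merged into a single “golden record” by applying a precedence rule per attribute.

```python
# Hypothetical sketch of an MDM-style "golden record" merge.
# Source names, fields and precedence order are illustrative only.

SOURCE_PRECEDENCE = ["policy_admin", "crm", "legacy_batch"]  # highest trust first

def merge_client_records(records):
    """Merge per-source client records into one golden record.

    For each attribute, take the value from the highest-precedence
    source that supplies a non-empty value.
    """
    by_source = {r["source"]: r for r in records}
    fields = {f for r in records for f in r if f != "source"}
    golden = {}
    for field in fields:
        for source in SOURCE_PRECEDENCE:
            value = by_source.get(source, {}).get(field)
            if value:
                golden[field] = value
                break
    return golden

records = [
    {"source": "crm", "name": "J. Smith", "email": "jsmith@example.com"},
    {"source": "policy_admin", "name": "John Smith", "email": ""},
]
print(merge_client_records(records))
# name is taken from policy_admin (higher precedence); email falls back to crm
```

Exposing a merge like this behind a single web service, rather than in each consuming application, is what makes definitions consistent across the enterprise.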
The main challenge of the project was the goal to have one platform to support all business needs. Specific challenges included dealing with disparate data source formats, differing interface protocols, batch and real-time processing requirements, consistent data definitions, data quality, complex business processes and precedence rules.
Having a single platform for all data has delivered multiple benefits to Penn Mutual, including improved data quality, exposure of previously unidentified data issues, and enabling the development of enterprise-level business rules. Application development time also has significantly improved due to the company leveraging a proven data access layer.
Penn Mutual has seen about a 40 percent reduction in development time for interfaces due to reuse, more than a 75 percent reduction in quality assurance testing effort due to improved data quality, and a 60 percent decrease in developer support through the use of proven data services. Executive support for the project and associated investment, as well as close collaboration between business and IT, were critical success factors.
Straight from the Source:
In his own words, Mark Dash, AVP of Information Management & Technology at Penn Mutual, shares lessons learned from this project.
One important factor is to integrate new technology into the normal support areas within IT. For example, we have our DBA group managing and maintaining the Cassandra cluster. Managing the distributed database just like any other database ensures that proper controls and maintenance are in place, and it provides a natural integration into the operation.
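As a minimal sketch of the kind of routine control such a DBA group would own, here is a CQL keyspace definition. The keyspace name, datacenter name and replication factor are illustrative assumptions, not Penn Mutual’s actual settings.

```sql
-- Hypothetical CQL owned by the DBA group; names and replication
-- factor are illustrative, not Penn Mutual's actual settings.
-- Three replicas per datacenter lets the cluster tolerate node loss.
CREATE KEYSPACE IF NOT EXISTS client_data
  WITH replication = {
    'class': 'NetworkTopologyStrategy',
    'dc1': '3'
  };
```

Keeping definitions like this, along with scheduled tasks such as `nodetool repair`, under the same change control as relational databases is what puts the distributed store on an equal operational footing.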
All along we have taken the strategy of combining the data-as-a-platform infrastructure work with business projects; we call them “host” projects. This provides an immediate business benefit rather than a “build it and they will come” approach. It is also an opportunity to inform and educate the business stakeholders on the architectural approach and its applicability to the business solution.
For the acquisition of new skills, we have taken the approach of expanding the skill sets of internal staff rather than looking to source externally. Given the market demand for skills around big data and data science, we have found that internalizing provides strong motivation and raises enthusiasm within the department.
The technology landscape is rapidly evolving around the “big data” ecosystem, and therefore we need to expect change and be flexible in both adoption and transition related to technology choices. I expect the offerings to look quite different in coming years. We like the approach of using open source initially to explore capabilities and applicability, and then, if we continue with the technology, transitioning to a licensed model.
What would you do differently?
I would put a stronger emphasis on legacy re-wiring. It’s important to quickly address these cleanup activities, or technical debt, as they tend to linger and become difficult to return to. If not addressed, you run the risk of actually increasing complexity within the environment.
Another area is working with the business on future vision with regard to how the platform can enable new business capabilities not available previously. The natural tendency is to look for improvement areas instead of growth areas.
Originally published by Insurance Networking News.