Carolinas HealthCare System is one of the nation's leading and most innovative healthcare organizations. It’s also a big believer in big data and is leveraging the technology to deliver enhanced services to its patients.
The Charlotte-based provider is using an in-house data warehouse to divide its medical treatment population—consisting of millions of unique patients—into various segments, including disease, environmental and geographical categories. CHS’s aim is to analyze these segments, make predictions and reduce readmissions, hospitalizations and inappropriate emergency department use.
“CHS believes that in order to provide the best possible care to our patients and to improve the health of the communities we serve, we need to understand and leverage the massive amounts of data generated across the healthcare continuum,” says Michael Dulin, M.D., chief clinical officer for Analytics and Outcomes Research at the Dickson Advanced Analytics Group. The DA2 unit, launched in 2012, comprises more than 130 experts working to better use healthcare data.
“Environmental toxins, genomics, demographics, lab results, physician notes and patient generated data will all provide essential insight needed to optimize and personalize health care,” Dulin says.
The volume of data at CHS and other healthcare organizations is roughly doubling every two years, Dulin says, and it’s challenging for CHS and other providers to make sense of all the information without using statistical and data mining methods.
“The collection and understanding of [the] data is a strategic resource, with analytics enabling the constant transformation of healthcare,” Dulin says. “The ripples of this understanding are felt in all aspects of business—providing a complete patient-centered view, using predictive analytics to proactively deliver care to the right patients at the right time and to deliver high value care.”
CHS’s core, or source, system data is well into the multiple terabyte range, Dulin says. Its electronic medical records (EMR) system, for instance, pushes 20 million transactions daily, he says. And the organization’s enterprise data warehouse (EDW) doubled in size over the past year to 30 terabytes.
The EDW was planned in 2011-2012 and put into production in 2013.
“The past two years we have been working to create self-service capabilities and have launched several tools for reporting and interactive dashboards,” Dulin says. “The arena of big data/analytics is changing and expanding at a rapid pace,” he says. “The influx of non-traditional data types in healthcare, like social media and other streaming data, will quickly test traditional data storage and access models.”
In 2011, CHS began centralizing the various data repositories from across the organization and planning for the new cross-enterprise data warehouse, Dulin says. The EDW is a structured repository, based on the IBM Unified Health Care Data Model, that runs on IBM’s Netezza appliance, whose massively parallel architecture lets analytic algorithms run quickly against large data sets.
The EDW integrates patient information from a range of sources, including clinical, billing and claims data.
“The EDW was designed to be the single source of truth,” for data within the organization, Dulin says. “Within the EDW, we have built a patient-centered data mart, which serves as a platform for population health reporting, analytics and predictive modeling.” The data mart is a subset of the EDW, and is designed to serve as a consistent starting point for many of the analyses CHS performs. Called Panorama, the mart was created by DA2 to provide a 360-degree view of the patient. Data is populated automatically on a daily basis.
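The patient-centered mart pattern Dulin describes can be sketched with an in-memory SQLite example. The table names, columns, and toy data below are invented for illustration; Panorama itself sits on the Netezza-based EDW and refreshes daily.

```python
import sqlite3

# In-memory stand-in for the EDW; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clinical (patient_id INT, diagnosis TEXT);
CREATE TABLE billing  (patient_id INT, amount REAL);
INSERT INTO clinical VALUES (1, 'diabetes'), (2, 'asthma');
INSERT INTO billing  VALUES (1, 120.0), (1, 80.0), (2, 45.0);
""")

# A patient-centered mart rolls disparate sources up to one row per
# patient -- a small-scale analogue of the "360-degree view" idea.
conn.execute("""
CREATE TABLE panorama AS
SELECT c.patient_id,
       c.diagnosis,
       COALESCE(SUM(b.amount), 0) AS total_billed
FROM clinical c
LEFT JOIN billing b ON b.patient_id = c.patient_id
GROUP BY c.patient_id, c.diagnosis
""")

rows = conn.execute("SELECT * FROM panorama ORDER BY patient_id").fetchall()
```

In a production mart the join would span many more sources (clinical, billing, claims), but the design choice is the same: one consistent, patient-keyed starting point for downstream analyses.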
Earlier this year, CHS began building out a Hadoop big data management ecosystem that serves as a data set “landing zone,” where the organization stores unstructured and semi-structured data for analytics and predictive modeling, Dulin says. The EDW can be used to send data into the Hadoop environment.
“Hadoop provides a ‘schema-on-read’ methodology, as opposed to the traditional ‘schema-on-write’ of traditional database systems,” Dulin says. In schema-on-read, the structure of the data is applied at the moment the data is read, rather than defined before the data is loaded into the store. “This expedites the time to data exploration, as it does not require the same overhead of designing traditional data structures or source-to-destination mappings and ETL processes,” Dulin says.
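A minimal sketch of the schema-on-read idea, assuming JSON-lines records in a hypothetical landing zone: the raw data is stored untouched, and each analysis projects only the fields it needs at read time.

```python
import json

# Raw, semi-structured records land as-is -- no table design up front.
# (Field names and values here are invented for illustration.)
landing_zone = [
    '{"patient_id": 1, "note": "follow-up visit", "bp": "120/80"}',
    '{"patient_id": 2, "note": "ED visit", "lab": {"a1c": 7.2}}',
]

def read_with_schema(raw_lines, fields):
    """Apply a schema at read time: project only the fields the
    analysis needs, tolerating records that lack them."""
    for line in raw_lines:
        record = json.loads(line)
        yield {f: record.get(f) for f in fields}

# Two analyses can impose two different schemas on the same raw data.
vitals = list(read_with_schema(landing_zone, ["patient_id", "bp"]))
labs = list(read_with_schema(landing_zone, ["patient_id", "lab"]))
```

Contrast this with schema-on-write, where both analyses would have to wait for a shared table design and ETL mapping before seeing any data.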
A major potential gain is that analysts can begin using new data quicker, without occupying IT resources to source the data, Dulin says. “This brings further benefit in helping to understand the value of [the data] without spending large amounts of IT time/resources,” particularly when it’s discovered that the data offers little or no value, he says.
For data aggregation and analysis, CHS uses a variety of tools, including SQL, the R and Scala programming languages, MathWorks’ Matlab technical computing language and visualization tool, Tableau visualization software, Esri’s ArcGIS geographic information platform, Business Objects business intelligence tools, and SAS analytics software.
“When it comes to analytics tools, different tools have different strengths and weaknesses, and different tools have different capabilities,” and CHS looks to match the best tool for a given situation, Dulin says. “For this reason, we may build models using pre-built algorithms or packages in these various tools, or build our own from scratch,” he says. “Ultimately, we will evaluate multiple variations to determine the most accurate and/or most efficient tool given the parameters of the situation.”
Big Data in Practice
One of CHS’s first big data/analytics initiatives, completed in 2014, was a clinical segmentation, or clustering, of CHS’s entire patient population, drawing on millions of patient records pulled in over a three-year period.
“We had thousands of variables derived from clinical, procedural, medication order and demographic data,” Dulin says. “This was a pure data-driven approach; we wanted the data to speak for itself. That is, we did not arbitrarily decide how a group would be defined, like having diabetes or being above a certain age.”
The effort resulted in the creation of seven groups of patients based on the data. “We named the segments based on some of the key characteristics, and [they] include the complex chronic, aging/rising risk, and advanced cancer patients,” Dulin says. “The data that was used to create the segments included disease, economics, geography, etc.” The segments could then rapidly be further refined by filtering for items such as recency of care, payer type, area of residence and utilization patterns. “This segmentation strategy now serves as the underpinning for our care management systems and allows CHS to focus our clinical resources on patients with the highest risk,” Dulin says.
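The data-driven grouping Dulin describes can be illustrated with a bare-bones k-means clustering, where segments emerge from the feature vectors rather than from predefined rules. The toy features and k=2 below are assumptions for illustration; CHS’s actual segmentation used thousands of variables and yielded seven groups.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: each patient is a feature vector, and the
    clusters are discovered from the data, not defined in advance."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each patient to the nearest centroid (squared distance).
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # Recompute each centroid as the mean of its assigned points.
        centroids = [[sum(dim) / len(c) for dim in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Toy feature vectors: (chronic-condition count, age decile) -- hypothetical.
patients = [(5, 7), (6, 8), (5, 8), (0, 2), (1, 3), (0, 3)]
centroids, clusters = kmeans(patients, k=2)
```

Here the high-burden and low-burden patients separate into two groups without anyone deciding the cut-offs in advance, which is the sense in which “the data speaks for itself.”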
CHS has created, for example, a predictive model using more than 200,000 re-admissions events that have happened over the past several years. CHS used this information to rate patients in the hospital based on their risk, and to change the way it delivers care management for these patients based on the data. “We now have this system up and working in 16 facilities and have provided interventions [to more than 150,000 patients] based on the risk score,” Dulin says.
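CHS has not published the details of its readmission model, but the score-and-intervene pattern can be sketched with a generic logistic risk score. The feature names, weights, and threshold below are invented for illustration; a real model would be fit on the historical readmission events, not hand-set.

```python
import math

# Hypothetical coefficients -- stand-ins for parameters a real model
# would learn from the ~200,000 historical readmission events.
WEIGHTS = {"prior_admissions": 0.8, "chronic_conditions": 0.5, "age_over_75": 0.6}
INTERCEPT = -3.0
THRESHOLD = 0.5  # risk level above which care management intervenes

def readmission_risk(patient):
    """Logistic score: a probability-like risk in (0, 1)."""
    z = INTERCEPT + sum(WEIGHTS[f] * patient.get(f, 0) for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def needs_intervention(patient):
    return readmission_risk(patient) >= THRESHOLD

high_risk = {"prior_admissions": 3, "chronic_conditions": 2, "age_over_75": 1}
low_risk = {"prior_admissions": 0, "chronic_conditions": 1, "age_over_75": 0}
```

The operational point is the second function: the score only matters because it changes what happens at the bedside, routing high-scoring inpatients to care-management interventions.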
The big data and analytics endeavors have come with several challenges.
One challenge is the high demand for data analytics from across the organization. “Clinical and business partners have long sought to use data to drive key decisions, and we have to carefully weigh requests for analytic assistance against our resource limitations,” Dulin says.
CHS plans to expand its data and analytics capabilities along the lines of distributed computing.
“Supplementing with a Hadoop cluster, which runs on commodity hardware and has no limit in its scalability—simply add new nodes when needed—will allow us to more cost-effectively increase our data storage and processing capabilities to meet future demand,” Dulin says.
“This allows us to scale data, analytics and predictive modeling at the same time, so we can keep the pace and our understanding on par with the massive amounts of data that are being generated on a daily basis,” Dulin says. “Highlights include imaging and other unstructured or semi-structured data modeling.”
Big data and analytics have been described by senior CHS executives as a “vital organ” to the healthcare system’s body, Dulin says. And it has become a vital component in the healthcare system’s efforts to enhance patient care.
“There is one—and only one—reason Carolinas HealthCare System has a focus here, and that is to provide exceptional care to our patients; more personalized, more appropriate, more efficient, more accessible and more effective,” Dulin says. “A key component of our business strategy is to add value as defined by our patient. Big data and analytics are critical to that end.”
The CHS executive leadership team has championed the efforts and made the investments, including physical space, technology and talent, in big data and analytics with the goal of delivering better services.
And on a larger scale that goes beyond its own organization, CHS in 2013 joined other healthcare systems and IBM to launch the Data Alliance Collaborative (DAC), a first-of-its-kind national initiative to improve population health through data analytics and business intelligence.
Members of the DAC share their experiences and expertise in healthcare and analytics to co-develop solutions that integrate data across various healthcare settings and that enable health systems to use the same data model and information, enhancing the value of patient care delivery.
“As a healthcare system, CHS has found that deriving actionable information from big data/analytics is invaluable in the journey to providing the highest quality, value-laden healthcare for our patients and the communities we serve,” Dulin says. “Big data and analytics allow CHS to move into the new frontier of precision medicine through truly individualized knowledge of our patients.”