The healthcare industry is finally nearing the end of the task of digitizing health records. It has been an arduous and necessary journey worth celebrating. Now, however, healthcare organizations are faced with two major challenges.
First, many organizations are struggling to deliver a reasonable payback on their investments. And second, advances in scientific computing and analytics—things like next-generation electronic health records, genomic sciences, precision medicine, predictive analytics and machine learning, the enterprise imaging revolution, advances in electron microscopy (including cryo-EM), the exploration of unstructured data such as digital notes and more—have resulted in an unbounded data explosion.
The key to monetizing this new data lies in unlocking its meaning—applying math, statistics and analytics to understand how to optimize healthcare delivery, improve the experience of patients and providers, and deliver better patient care and outcomes. This is an extremely intense data management exercise, as is the next generation of predictive and prescriptive analytics applications. The industry will be generating new data on top of the data it already has, resulting in even more data. And the cycle continues.
To harness the power of this boom, the data infrastructure management model must undergo a fundamental transformation. It must shift from a tactical model—one that is analogous to the processes of procuring surgical gloves and thermometers—to a strategic model in which the data infrastructure is available for day-to-day tasks, but also enables clinicians to focus on proactively applying data to new problems and business models, while innovating and driving value.
With that in mind, the following represents a way that organizations can drive both value and meaning from the massive amounts of EHR data.
In healthcare, all data matters. The short list includes structured data (data in EMRs, OLAP and OLTP databases); exhaust data (data that’s generated as a byproduct of working with other data); unstructured data (data in clinical notes, flat files, images, BLOBs and more); and IoT data (sensors, wearables, trackers and more). It all matters.
The industry is in the early stages of harnessing value from some of this data, but in reality there's a lot of yet-to-be-mined value in all of it. With more than 80 percent of healthcare data being unstructured, it's no wonder that techniques such as natural language processing, machine learning and deep learning are attracting a significant share of investment dollars, press attention and hype.
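To make the "unstructured data" point concrete, here is a minimal, illustrative sketch of the kind of structure extraction that clinical NLP pipelines perform. It uses simple regular expressions as a stand-in for real natural language processing; the note text, field names and patterns are hypothetical examples, not a production method.

```python
import re

# Hypothetical free-text clinical note; the fields and patterns below
# are illustrative stand-ins for what a real NLP pipeline would extract.
NOTE = "Pt is a 67 y/o male. BP 142/88. HbA1c 7.2%. Started metformin 500 mg."

PATTERNS = {
    "age": re.compile(r"(\d{1,3})\s*y/o"),
    "blood_pressure": re.compile(r"BP\s*(\d{2,3}/\d{2,3})"),
    "hba1c": re.compile(r"HbA1c\s*([\d.]+)%"),
}

def extract_fields(note: str) -> dict:
    """Pull structured key/value pairs out of a free-text note."""
    fields = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(note)
        if match:
            fields[name] = match.group(1)
    return fields

print(extract_fields(NOTE))
# {'age': '67', 'blood_pressure': '142/88', 'hba1c': '7.2'}
```

Even this toy example shows why unstructured data is where much of the untapped value sits: once notes are reduced to structured fields, they can join the same analytics pipelines as EMR data.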
To truly succeed at harnessing all that data, two things are needed: interoperability and a data platform. Data integration and interoperability are very hard; consider that healthcare organizations—from facilities with fewer than 10 beds to mega-organizations with multiple hospitals—might have more than 350 applications running concurrently, each talking only peripherally to the others and each typically having its own database and data schema.
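The schema-mismatch problem behind that integration burden can be sketched in a few lines. The example below shows two hypothetical applications storing the same patient differently, with a thin adapter mapping each source schema to one canonical record; all field names are invented for illustration. Real integrations typically target a shared standard such as HL7 FHIR rather than an ad hoc canonical schema like this.

```python
# Two hypothetical source schemas for the same patient; adapters map
# each one to a single canonical record so downstream analytics see
# one consistent shape. All schemas here are illustrative only.

def from_lab_system(rec: dict) -> dict:
    # Hypothetical lab schema: {"pt_id", "fname", "lname", "dob_mdy"}
    month, day, year = rec["dob_mdy"].split("/")
    return {
        "patient_id": rec["pt_id"],
        "name": f'{rec["fname"]} {rec["lname"]}',
        "birth_date": f"{year}-{month}-{day}",  # normalize to ISO 8601
    }

def from_scheduling_system(rec: dict) -> dict:
    # Hypothetical scheduling schema: {"id", "full_name", "birthdate"}
    return {
        "patient_id": rec["id"],
        "name": rec["full_name"],
        "birth_date": rec["birthdate"],  # already ISO 8601
    }

lab = {"pt_id": "P001", "fname": "Ada", "lname": "Ng", "dob_mdy": "03/14/1956"}
sched = {"id": "P001", "full_name": "Ada Ng", "birthdate": "1956-03-14"}

# After normalization, both systems describe the same canonical patient.
assert from_lab_system(lab) == from_scheduling_system(sched)
```

Multiply adapters like these across 350 applications and the scale of the interoperability challenge—and the value of a common data platform—becomes clear.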
Traditional data centers have served healthcare organizations reasonably well in supporting care delivery. But today's more flexible, demanding data applications require a data platform that transforms the business by drawing meaning out of the data.
Organizations need a data platform that puts their data to work. Healthcare organizations should be able to run all operations with cloud-like agility, improve the economics of data analytics at high velocity and scale, and ultimately derive new insights that deliver data-driven patient outcomes and results never before possible. Data is the lifeblood of the digital generation.
As data creation pivots from humans to machines—driven by sensors, IoT, digital imaging and diagnostic breakthroughs, and myriad connected devices—it has exploded in volume.
Simultaneously, the ability to analyze this data has transcended human cognition, with traditional analytics giving way to artificial intelligence (AI), machine and deep learning, neural networks and real-time data stream analytics. These new, data-driven applications require a different approach to data center infrastructure, including storage designed to deliver massively parallel access to data at very high bandwidth.
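A toy sketch of what "massively parallel access" means in practice: instead of one process scanning a dataset end to end, many workers each scan a shard concurrently and the results are combined. The shard contents below are synthetic stand-ins for, say, imaging files or sensor logs; this is an illustration of the access pattern, not a storage-system design.

```python
from concurrent.futures import ThreadPoolExecutor

# Synthetic dataset split into 10 shards of 1,000 records each.
shards = [list(range(i, i + 1000)) for i in range(0, 10000, 1000)]

def scan_shard(shard):
    """Stand-in for an I/O- or compute-heavy scan of one data shard."""
    return sum(shard)

# Workers scan shards concurrently; results are merged afterward.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(scan_shard, shards))

print(total)  # same answer as a serial scan, reached in parallel
```

The point of the pattern is that aggregate bandwidth scales with the number of workers and shards, which is exactly the property the new class of analytics applications demands from storage.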
Modern science requires a data platform that enables healthcare organizations to deploy a new class of applications and to extract new insights from data in real time. Healthcare organizations not only need to continue supporting legacy applications reliably, but to do a lot more with them. And, above all, organizations need to keep modernizing their infrastructure and mindsets to embrace the digital generation.
By doing so, the industry can accelerate its pace of innovation, deliver better patient care, improve patient and provider satisfaction, and deliver better, more sustainable financial results.