Improving healthcare interoperability is a top priority for health systems, clinicians, patients, and even legislators. The latest governmental effort to address interoperability comes from the Office of the National Coordinator for Health IT (ONC), which issued a proposed interoperability and information-blocking rule in February 2019. The ONC rule defines requirements for healthcare and IT providers around data sharing and outlines exceptions to the restrictions on information blocking. While aspects of this legislation are promising, previous attempts to improve interoperability have failed because the main sources of healthcare data—EMRs—produce non-standardized, disparate data.
While the government’s push for Health Information Exchanges (HIEs) began in 2009 with the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act, health systems focused on EMR adoption rather than interoperability. Today, EMR adoption is widespread, with almost 98 percent of health systems using a government-certified EMR. But each of the hundreds of EMR systems in use today has its own technical specifications, clinical terminologies, and even unique customizations that prevent true interoperability and data sharing across systems.
Today’s EMR systems, and the lack of interoperability between them, reveal that healthcare has fundamental problems to address before interoperability can improve, including standardization of terminology and normalization of data to those standards. In addition, the sheer volume of data healthcare IT systems produce exacerbates these problems.
While the original EMR was created by and for a single provider (the Regenstrief Institute), most systems available today generate tremendous amounts of data while lacking the necessary tools for data analytics and integration. Expand the number of providers, systems, and data sources, and the record quickly becomes a collection of disparate, low-quality data that is a major contributor to provider discontent and burnout. Add to that the explosion of data from other sources—such as wearables, mobile phones, and genomics—and the interoperability problems only worsen. EMR systems aren’t designed to integrate data from other sources or to manage data at this scale, making additional integration tools necessary.
While the latest interoperability regulations focus on making it easy to find and exchange patient data across multiple organizations and HIEs, the legislation’s aggressive implementation timelines and lack of fine print addressing patient-matching challenges nearly ensure the proliferation of existing interoperability problems (i.e., more low-quality data that obscures critical information within a patient’s medical records). Patient allergies are one such example: HIEs frequently receive the same allergy list multiple times from multiple provider EMR systems, each coded using a different standard (e.g., RxNorm or SNOMED). Having different representations of the same medical concept is confusing for providers, who must sift through and reconcile the list.
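The reconciliation burden described above can be sketched in a few lines of Python: given allergy entries coded in different terminologies, a crosswalk table maps each entry to a single canonical concept so duplicates collapse into one. This is a minimal illustration only; the crosswalk entries below are placeholders, not an authoritative RxNorm-to-SNOMED mapping.

```python
# Sketch: collapse duplicate allergy entries coded in different terminologies.
# The crosswalk codes below are illustrative placeholders, not a real
# RxNorm/SNOMED mapping.

CROSSWALK = {
    ("RXNORM", "7980"): "penicillin-allergy",
    ("SNOMED", "91936005"): "penicillin-allergy",
    ("RXNORM", "1191"): "aspirin-allergy",
    ("SNOMED", "293586001"): "aspirin-allergy",
}

def reconcile(entries):
    """Map each (system, code) pair to a canonical concept and de-duplicate."""
    seen = set()
    result = []
    for system, code in entries:
        concept = CROSSWALK.get((system.upper(), code))
        if concept is None:
            # Surface mapping gaps instead of silently dropping entries.
            concept = f"unmapped:{system}:{code}"
        if concept not in seen:
            seen.add(concept)
            result.append(concept)
    return result

# Three feed entries from two EMRs collapse into two distinct concepts.
feeds = [("RxNorm", "7980"), ("SNOMED", "91936005"), ("SNOMED", "293586001")]
print(reconcile(feeds))  # two concepts, not three entries
```

The hard part in practice is not the de-duplication loop but building and maintaining the crosswalk itself, which is exactly the standardization work the rest of this article argues for.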
One answer to interoperability problems that has been gaining traction is to use machine learning (ML) and artificial intelligence (AI) to sift through the high volume of low-quality data. But, returning to the patient allergy example above, allergy lists in different formats and coded using different standards make ML more difficult and complex, because all of these terminologies for the same concept must first be mapped together. For AI to work, providers need high-quality data sets on which to train and run their ML/AI models.
While many in the healthcare industry have touted blockchain as the answer to healthcare’s interoperability issues, its practical application remains unclear. Blockchain, the underlying technology behind bitcoin, is a distributed ledger system for tracking transactions in a highly secure way. In healthcare, blockchain would enable someone looking at a patient’s medical record to know and trust each piece of data in that record. Blockchain databases are designed to be append-only and can’t be edited or deleted, potentially preventing fraud associated with altering transactional data in a permanent record. However, preventing fraud in a medical record full of low-quality data does nothing to improve interoperability. Instead, health systems first need to address their differing levels of maturity in data quality and governance.
Before healthcare organizations spend precious technology resources and dollars on blockchain, they need to fix the data at the source. Instead of having multiple standards for data and exchange—the product of EMR manufacturers using their lobbying power or government entities (e.g., the ONC) that can dominate the organizations responsible for creating these standards (e.g., IHE and HL7)—the industry should focus on creating a single set of terminology, data format, and exchange standards for a larger and more comprehensive patient health data set. This patient health data set should include maternal, perinatal, cardiac, and social determinants of health data. Instead of having to search through text notes for perinatal age or stroke index, providers would have access to data codified using standards-based terminology and formatted so that it can be analyzed by ML/AI tools. This, in turn, would make it easier for providers to identify critical information—not just critical data.
Enter the Health Catalyst® Data Operating System (DOS™), which combines the features of data warehousing, clinical data repositories, and HIEs in a single, comprehensive technology platform. DOS keeps data in its original source format to preserve the integrity of the source data, and instead of storing data in various proprietary formats, it places disparate data into a single, shared data structure, highly normalizing and standardizing the data. The DOS platform works with EMRs to ingest data from hundreds of data sources and deliver point-of-care insights within the clinical workflow. Having analytics at the point of care improves healthcare quality. Low-quality data can reduce the accuracy of any analytics measure, a risk that is even more serious at the point of care, where medical decisions are being made.
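The two-layer pattern described here, raw source records preserved alongside a normalized shared structure, can be sketched as follows. This is a simplified illustration under assumed names: the field names, the `TwoLayerStore` class, and the mini terminology map are hypothetical, not actual DOS internals.

```python
# Sketch of a two-layer store: keep each record in its original source form,
# and also project it into one shared, normalized structure for analytics.
# Class, field names, and the terminology map are illustrative assumptions.

TERMINOLOGY = {"hgba1c": "LOINC:4548-4", "a1c": "LOINC:4548-4"}

class TwoLayerStore:
    def __init__(self):
        self.raw = []         # source records, kept untouched
        self.normalized = []  # shared structure for analytics

    def ingest(self, source, record):
        # Layer 1: preserve the record exactly as the source sent it.
        self.raw.append({"source": source, "record": record})
        # Layer 2: normalize terminology and types into the shared structure.
        code = TERMINOLOGY.get(record["test"].lower(), f"unmapped:{record['test']}")
        self.normalized.append({
            "patient_id": record["patient_id"],
            "code": code,
            "value": float(record["value"]),
        })

store = TwoLayerStore()
store.ingest("emr_a", {"patient_id": "p1", "test": "HgbA1c", "value": "6.1"})
store.ingest("emr_b", {"patient_id": "p1", "test": "A1C", "value": "6.3"})
# Both feeds now share one code in the normalized layer, while the
# originals remain intact in store.raw for auditing.
```

The design choice worth noting is that normalization is additive: the source record is never rewritten, so a bad mapping can be corrected later without losing the original data.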
While new technologies like ML and blockchain hold promise to transform the healthcare industry, they won’t solve the biggest barriers to interoperability. The hype around blockchain is diverting focus from the real problems surrounding data quality, while the value of ML depends on high-quality data sets. Healthcare organizations need to focus their time and resources on creating structured, standardized, normalized, and complete patient data, paving the way for true improvement in healthcare interoperability.