Healthcare Interoperability: New Tactics and Technology
Interoperability is perhaps the highest-trending topic in healthcare IT today. But when you hear the term, what does it bring to mind? Perhaps a contraption of unnecessary complexity: a Rube Goldberg-like series of moving parts, where multiple over-engineered procedures sequentially perform a fairly simple task. Or maybe the kids’ board game Mouse Trap, which features a series of interrelated actions that accomplish the simple goal of capturing your opponents’ mice.
When it comes to defining interoperability, there are many perspectives. Let’s take a high-level look at what some of those are and at the challenges we face in “solving” this puzzle. We’ll also take a look at the Health Catalyst platform, applications, and methodologies, and how they are able to achieve some of the requirements of interoperability in a very pragmatic way.
Where a Lack of Healthcare Interoperability Hits Hardest
What does interoperability mean? At its simplest, it’s the ability of multiple systems to talk to each other and integrate data to improve care for the patient. The Center for Medical Interoperability, with a device-centric approach, defines it as the ability to share information across multiple technologies. What it ultimately boils down to is that all players in the healthcare IT space should play well together. This is a requirement if healthcare is to deliver on these three critical fronts in the future:
Accountable Care – ACOs exist to improve the quality and coordination of healthcare delivered to a defined population of patients. Care managers within an ACO need access to information about the patients for whom they are accountable. This means the collection and sharing of data is paramount for an ACO to succeed. Yet around 70 percent of ACOs struggle to collect data, according to a recent survey of 68 ACOs. This struggle is amplified as patients see an increasing number of specialists and out-of-network providers who are unable to convey data back to the initial ACO provider.
Population Health – Dale Sanders, Executive Vice President at Health Catalyst, describes 12 categories of data required for population health, with the minimal data sets being: (1) patient-reported outcomes; (2) social determinants of health; and (3) activity-based costing data that allow healthcare organizations to accurately manage their financial margins. Unfortunately, these are all but missing from the healthcare data landscape today, gaps that can be attributed to the lack of interoperability.
Precision Medicine – Precision medicine integrates research with clinical practice to understand a patient’s individual illness and deliver the right treatment at the right time. We’ve been talking about it since 2011, when the National Research Council first issued a news release calling for “a new data network that integrates emerging research on the molecular makeup of diseases with clinical data on individual patients” for the purposes of developing “a more accurate classification of disease and ultimately [enhancing] diagnosis and treatment.” Again, data sharing and interoperability are integral for it to succeed.
Who Is Impacted and Why
The biggest concern with the lack of interoperability goes back to the patient, whose care we are ultimately trying to improve. The patient is the eventual beneficiary regardless of which process needs to be improved. Providers suffer, too: they strive to deliver ideal care, but when they lack complete information about patients who often undergo care elsewhere, administering and coordinating overall care becomes difficult. There are also added costs of extracting relevant data points from EMRs, some of which charge fees; while different groups pay for this, the cost ultimately trickles down to the patient.
Also, inconsistent data quality can easily make two different records for one patient look like two different patients when tracking them from one system to the next. This becomes a master data problem. And from a resourcing and maintenance perspective, there are other costs in managing the different technologies involved with multiple systems, not to mention the churn and waste.
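As a concrete illustration of the master data problem, the sketch below (with hypothetical field names; real master data management tools use many more identifiers and probabilistic matching) shows how simple formatting differences can make one patient look like two, and why normalization is the first step in matching records across systems:

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    # Hypothetical fields; real systems carry many more identifiers
    first_name: str
    last_name: str
    birth_date: str  # ISO 8601, e.g. "1962-03-14"

def normalize(value):
    """Trim whitespace and casing so formatting noise doesn't split one patient in two."""
    return value.strip().lower()

def likely_same_patient(a, b):
    """Deterministic match on normalized name plus birth date.
    Production master-data tools extend this with probabilistic matching."""
    return (
        normalize(a.last_name) == normalize(b.last_name)
        and normalize(a.first_name) == normalize(b.first_name)
        and a.birth_date == b.birth_date
    )

# Two systems record the same person with different formatting
rec_a = PatientRecord("Maria", "GARCIA ", "1962-03-14")
rec_b = PatientRecord(" maria", "Garcia", "1962-03-14")
print(likely_same_patient(rec_a, rec_b))  # True
```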
What’s the Holdup?
In the past, EMR vendors haven’t had to worry about interoperability. They’ve had their own kingdoms. But lately, there has been enough public and government pressure—which stems in part from the billions in funding granted to EMR vendors—to improve patient care, so they’ve started to work with one another and with other large healthcare systems and IT providers. Where EMRs could afford to worry about their own needs in the past, this is now no longer the case.
Some of the biggest struggles with interoperability stem from workflow and policy differences between organizations, variability in the data captured, and differing contexts of use. This is where the government has tried to impose standards, but that still takes time and can have unforeseen consequences.
One of the top protocol concerns with interoperability is standards: getting everyone to agree on one way to send and receive data. HL7 has historically been the dominant interoperability standard, and most health information exchanges (HIEs) use it. Continuity of Care Documents are another standard; HL7 defines these as fostering “interoperability of clinical data by allowing physicians to send electronic medical information to other providers without loss of meaning and enabling improvement of patient care.” In practice, HL7 data has been a source of frustration because of its unreliability from both a completeness and a consistency standpoint. HIEs also struggle in this area because of insufficient data standards, inaccurate data, and differing privacy rules.

With the current variety and volume of technologies, there are no easy answers, but some solutions are beginning to separate themselves. Application programming interfaces (APIs) are increasingly used to interact with health systems. An API expresses a software component in terms of its operations, inputs, and outputs, which allows definitions and implementations to vary without compromising the interface. According to The Advisory Board, “the use of a standardized and accessible API is a critical step in allowing the appropriate flow of information across health care stakeholders.”
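To see why HL7 feeds can be unreliable, consider the sketch below. HL7 v2 messages are pipe-delimited and positional, so every consumer depends on every sender populating the same fields in the same positions; a missing or repurposed field silently shifts meaning. The sample message and parser are illustrative only:

```python
# Minimal sketch of an HL7 v2 message: segments separated by carriage returns,
# fields by pipes. The field positions below follow the v2 spec (PID-3
# identifiers, PID-5 name, PID-7 date of birth); real-world feeds often leave
# fields empty or use them inconsistently, which is the fragility described above.
SAMPLE_ADT = (
    "MSH|^~\\&|SENDING_APP|HOSPITAL_A|||20240101||ADT^A01|MSG0001|P|2.3\r"
    "PID|1||12345^^^MRN||DOE^JOHN||19620314|M"
)

def parse_pid(message):
    """Pull a few demographic fields out of the PID segment of an HL7 v2 message."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            return {
                "mrn": fields[3].split("^")[0],
                "last_name": fields[5].split("^")[0],
                "first_name": fields[5].split("^")[1],
                "birth_date": fields[7],
            }
    raise ValueError("no PID segment found")

print(parse_pid(SAMPLE_ADT))
```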
Fast Healthcare Interoperability Resources (FHIR) is an interoperability standard for the electronic exchange of healthcare information and the successor to HL7’s earlier messaging standards. FHIR is a healthcare exchange API that provides a simple and efficient way to discover and consume information across distributed systems. FHIR aims to make implementing the data exchange simpler so more time can be spent on the non-technical, hard interoperability issues.
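The sketch below shows what that simplicity looks like in practice: FHIR resources are plain JSON with element names fixed by the specification, so a trimmed Patient resource (the id and demographic values here are placeholders) can be consumed with ordinary tooling:

```python
import json

# A trimmed FHIR R4 Patient resource, as a server might return it from
# GET {base}/Patient/{id}. The id and values are placeholders; the element
# names (resourceType, name, family, given, birthDate) come from the FHIR spec.
PATIENT_JSON = """{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Garcia", "given": ["Maria"]}],
  "birthDate": "1962-03-14"
}"""

def summarize_patient(resource_json):
    """Extract the demographics an application needs from a Patient resource.
    Because FHIR standardizes these element names, any conformant server's
    output parses the same way."""
    resource = json.loads(resource_json)
    assert resource["resourceType"] == "Patient"
    name = resource["name"][0]
    return f'{" ".join(name["given"])} {name["family"]}, born {resource["birthDate"]}'

print(summarize_patient(PATIENT_JSON))  # Maria Garcia, born 1962-03-14
```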
Nurturing these ideas falls under the care of The Argonaut Project, which seeks to advance the adoption of interoperability standards “to enable expanded information sharing for electronic health records, documents, and other health information based on the FHIR specification.” It’s a joint project between HL7 and some of the top healthcare organizations in the U.S., including many of the main EMRs in the United States.
EMRs are going to leverage FHIR more and more, both to extract data from the EMR and to feed it back in. It will take time for the healthcare system to realize the promise of FHIR, but it represents progress: EMR vendors are demonstrating a commitment to implement it, and it is in a better technological state than its HL7 predecessors.
The Ultimate Goal: Closed-Loop Analytics
Patients with difficult-to-diagnose conditions (Parkinson’s, heart failure, cancers) will sometimes visit multiple physicians, quite often fragmenting their data ecosystem. The most subjective data content we have in healthcare is contained in clinical notes and diagnostic reports, so if we ever stand a chance of solving the subjective problems of diagnosis, we have to get our hands on that text data and start bouncing it against our discrete data. We also need 24×7 biometric, genomic, familial, and socioeconomic data to reduce the time to accurate diagnoses.
While EMR vendors are slow but improving, the space we want to get to, to truly improve care, is closed-loop analytics. Closed-loop analytics is about closing the loop between analytics and workflow. As a physician works in the EMR, they create valuable data on a patient. That data is typically resurfaced in an analytic environment where complex analyses and algorithms can create new data insights relevant to clinical care. Traditionally, those analytic insights are delivered in a siloed environment accessed completely separately from the EMR system. As a result, many valuable analytics never improve patient care, because accessing them is cumbersome and not part of the normal clinical workflow. Closed-loop analytics aims to bring analytic insights directly into the clinical workflow: analytics relevant to patient care should be displayed directly in the EMR. It’s a matter of getting the right data to the right person at the right time, in the right modality.
To do this requires getting data from the EMR, feeding it back into the enterprise data warehouse (EDW) to run algorithms and analytics, and then feeding the findings back into the EMR again to improve workflows and care.
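The three steps above can be sketched as a pipeline. Everything here is illustrative, not Health Catalyst’s actual implementation: the function names are hypothetical, and a toy linear score stands in for a real predictive model:

```python
def extract_from_emr():
    """Step 1: pull encounter data out of the EMR into the EDW (stubbed here)."""
    return [
        {"patient_id": "p1", "prior_admissions": 3, "ef_percent": 30},
        {"patient_id": "p2", "prior_admissions": 0, "ef_percent": 55},
    ]

def score_in_edw(rows):
    """Step 2: run an algorithm over the warehouse data. A toy linear score
    (weighting prior admissions and reduced ejection fraction) stands in for
    a real predictive model."""
    return {
        r["patient_id"]: 0.1 * r["prior_admissions"] + 0.01 * (60 - r["ef_percent"])
        for r in rows
    }

def write_back_to_emr(scores):
    """Step 3: surface each score inside the clinical workflow, here rendered
    as simple flag strings a clinician would see in the EMR."""
    return [f"{pid}: readmission risk {score:.2f}" for pid, score in scores.items()]

flags = write_back_to_emr(score_in_edw(extract_from_emr()))
for line in flags:
    print(line)
```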
How to Get On Track with Interoperability
Using a Late-Binding℠ Enterprise Data Warehouse and analytics applications built with this platform, health systems can reduce the pain of getting data out of various systems and into the EDW. This allows the healthcare systems we work with to spend less time gathering data and more time analyzing it and improving care. One of the ways to improve care is by creating predictive analytics based on the various data in the EDW, then feeding that data back into the EMR so clinicians can use those analytics to improve care. As discussed earlier, this is called closed-loop analytics.
Health Catalyst’s predictive analytics library continues to grow. One example is found in our Congestive Heart Failure application, which looks at this cohort of patients and then, based on a predictive algorithm, identifies which patients are at the highest risk for readmission. Providers use this data to follow up first with the highest-risk patients and apply best-known practices that help alleviate the risk for readmission.
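The prioritization step might look something like the sketch below; the patient identifiers, risk scores, and threshold are all assumed for illustration, and a real application would draw scores from the predictive algorithm itself:

```python
# Hypothetical cohort with risk scores already produced by a predictive model
cohort = [
    ("patient_a", 0.82),
    ("patient_b", 0.15),
    ("patient_c", 0.57),
]

def outreach_order(scored, threshold=0.5):
    """Return patients above the risk threshold, highest risk first, so care
    managers contact the most at-risk patients soonest. The threshold is an
    assumed operational choice, not a clinical standard."""
    flagged = [p for p in scored if p[1] >= threshold]
    return sorted(flagged, key=lambda p: p[1], reverse=True)

print(outreach_order(cohort))  # [('patient_a', 0.82), ('patient_c', 0.57)]
```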
One Approach to Interoperability
A best practice approach is to bring multiple source systems into the EDW in a standardized and very efficient way. The Health Catalyst Analytics Platform does this and captures metadata about the source systems. We use that metadata to quickly load the data into the EDW via direct database querying, flat files, or XML files. New connectors are being developed that will provide the ability to insert streaming HL7 messages, claims EDI data, and Hadoop data into the EDW. Because our platform pulls data into the warehouse quickly, analysis can begin quickly. Regardless of what standards or systems are on the organization’s side, we can get that data into the warehouse in a pragmatic way without worrying about getting all the data at once. This approach focuses on initially getting whatever provides the most analytical value. If it turns out later that other metrics are needed from a new table, it’s easy to bring them in because of the data acquisition functionality in our platform.
Again, a best practice for improving care is expanding the number of data sources that can be brought into the EDW. The Health Catalyst library includes over a hundred source connectors and is growing. It covers many of the prevalent EMRs, and these sources are available to our healthcare systems to further reduce the time it takes to bring data into the EDW. In terms of interoperability, organizations are able to leverage this library to bring disparate data from different hospitals and systems into one EDW. Time to value is realized even more quickly by leveraging these source starter sets. Data can get into the EDW faster, and end users can start integrating and aggregating the data, then use the algorithms and analytics to feed back into the workflows and improve care.
For example, we work with one large healthcare system that has more than 200 sites of care, including nine hospitals. Many of these hospitals have come by way of merger and acquisition. At one point, this was a health system with a single EMR that suddenly found itself contending with records from Epic, Cerner, Meditech, and some Centricity data. We worked with them to bring all this data into our platform and integrate everything using Source Mart Designer and SAM Designer.
On a related note, ACOs need to integrate various claims systems so their payers have a master data record of the patient to work with. With multiple systems, Health Catalyst can create this common master record for the patient or provider, the single source of truth, from which all analytics can originate.
Lastly, we have begun to ‘close the loop’ with our analytic platform because our applications have been delivered through the EMRs of our healthcare systems and directly into the clinical workflow. We are also developing tools and technology to deliver discrete analytic insights into the EMR through APIs like FHIR and others.
Can We Build a Better Mousetrap?
Time will tell if we can achieve true interoperability. We’re a good year or two from seeing the fruits of the FHIR efforts. But once standards are in place, there will still be issues with the many small ambulatory clinics running obscure EMRs that don’t have the technical or financial resources to share data. How do you integrate data from them? Some organizations will continue to look to HIEs to help with interoperability, but there are funding issues. Data quality is another big issue, and the breadth of data often isn’t enough.
We face challenges because EMR vendors in the past didn’t have to worry about interacting with one another, and multiple standards and various technologies have made things difficult. Security is another challenge, one that is a concern from a liability standpoint and is also tied to standards. But if we don’t get too hung up on standards and take a pragmatic approach to what data is needed, we will find an easier way to get the data and start working with it.
Building a better mousetrap starts with the platform, tools, and approach to acquire data from the various source systems and start analyzing it. Providers have master data management tools at their disposal to help in those scenarios. And Health Catalyst is developing closed-loop analytics for improving quality and outcomes, getting data back to providers so they can make better decisions and improve care for their patients.
Would you like to use or share these concepts? Download this presentation highlighting the key points.