As the digital trajectory of healthcare rises, health systems have an array of new resources available to make more effective and timely care decisions. However, to use tools such as data analytics, machine learning, predictive analytics, and wellness applications to gain real-time, data-driven insight at the point of care, health systems must fully integrate them with their EHRs. Integration brings technical and administrative challenges, requiring organizations to coordinate around standards, administrative processes, regulatory principles, and functional integration, as well as develop compelling integration use cases that drive demand. When realized, full EHR integration will allow clinicians to leverage data from across the continuum of care (from health plan to patient-generated data) to improve patient diagnosis and treatment.
Data: Quality, Management, Governance
As healthcare transitions from fee-for-service to value-based payment, payer organizations are increasingly looking to population health management strategies to help them lower costs. To manage individuals within their populations, payers must become data driven and establish the technical infrastructure to support expanding access to and reliance on data from across the continuum of care.
To fully leverage the breadth and depth of data that an effective health management strategy requires, payers must address six key challenges of becoming data driven:
Historically, technology and talent were the primary assets used to weigh the value of M&A activity, but data is now an equal pillar. Buyers (the acquiring organizations) face enormous responsibility and risk with M&A transactions. C-suite leaders have much to consider (enterprise-wide technology, finances, operations, facilities, talent, processes, workflows, etc.) during the due diligence process. But attention is often heavily weighted toward time-honored balance sheet and facility assets rather than the next-generation asset with long-term strategic value in the M&A process: data. The model for conducting due diligence around data involves four disciplines:
Establish the strategic objectives of the M&A with the leadership team.
Prioritize data along with the standardization of solutions and the design of a new IT organization (i.e., a co-equal effort for data, tools, and talent).
Identify the near-term data strategic priorities, stakeholders, and tools.
Assess the talent and consider creating an analytics center of excellence (ACOE) to harness organizational capabilities.
Data models are the backbone of innovation in healthcare; without them, many new technologies may never come to fruition. It's therefore important to build models that focus on relevant content and specific use cases.
Health Catalyst has been continuously refining its approach to building concise yet adaptive healthcare data models for years. Because of our experience, we’ve learned five key lessons when it comes to building healthcare data models:
Focus on relevant content.
Externally validate the model.
Commit to providing vital documentation.
Prioritize long-term planning.
Automate data profiling.
These lessons are essential to apply when building adaptive healthcare data models (and their corresponding methodologies, tools, and best practices) given the prominent role they play in fueling the technologies designed to solve healthcare’s toughest problems.
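The last lesson, automated data profiling, can be sketched in a few lines. The function below is an illustrative assumption (field names and the choice of metrics are not from the source): it scans a batch of records and reports null counts, distinct-value counts, and an example value per column, the kind of summary a profiling step would surface before data enters a model.

```python
from collections import defaultdict

def profile(rows):
    """Summarize a list of record dicts: per-column null count,
    distinct-value count, and an example value."""
    stats = defaultdict(lambda: {"nulls": 0, "distinct": set(), "example": None})
    for row in rows:
        for col, val in row.items():
            s = stats[col]
            if val is None or val == "":
                s["nulls"] += 1          # treat None/empty string as missing
            else:
                s["distinct"].add(val)
                if s["example"] is None:
                    s["example"] = val
    return {col: {"nulls": s["nulls"],
                  "distinct": len(s["distinct"]),
                  "example": s["example"]}
            for col, s in stats.items()}

# Hypothetical sample records
rows = [
    {"mrn": "1001", "dob": "1980-03-01", "sex": "F"},
    {"mrn": "1002", "dob": None,         "sex": "F"},
    {"mrn": "1003", "dob": "1975-11-20", "sex": ""},
]
report = profile(rows)
```

A report like this makes gaps (a missing date of birth, a blank sex field) visible automatically rather than after a model misbehaves.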
Most health systems suffer from data clutter and efficiency problems. As a result, analysts spend most of their time searching for data rather than performing high-value work. Three steps can help address these data management issues: 1) find all the dispersed analysts in the organization, 2) assess analytics risks and challenges, and 3) champion the creation of an EDW as the foundation for clinical data management.
The role of a data lake in healthcare analytics is essential in that it creates broad data access and usability across the enterprise. It has symbiotic relationships with an enterprise data warehouse and a data operating system.
To avoid turning the data lake into a black lagoon, it should feature four specific zones that optimize the analytics experience for multiple user groups:
Raw data zone.
Refined data zone.
Trusted data zone.
Each zone is defined by the level of trust in the resident data, the data structure and future purpose, and the user type.
Leadership and management responsible for maximizing the return on this considerable investment of human, technical, and financial resources would do well to understand and create zones in their data lake.
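The zone idea can be sketched as a simple routing rule: each dataset is placed in the zone matching its level of trust. The promotion criteria below (cleansed, validated) and the dataset names are illustrative assumptions, not from the source.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    cleansed: bool = False    # standardized, deduplicated, units normalized
    validated: bool = False   # passed governance/quality review

def assign_zone(ds: Dataset) -> str:
    """Route a dataset to the zone matching its level of trust."""
    if ds.validated:
        return "trusted"
    if ds.cleansed:
        return "refined"
    return "raw"

feed = Dataset("adt_feed")                                  # lands untouched
labs = Dataset("lab_results", cleansed=True)                # standardized
claims = Dataset("claims", cleansed=True, validated=True)   # governed
```

Data scientists might query all three zones; report consumers would see only the trusted zone.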
Using a supermarket analogy, this article helps healthcare leaders understand what data lakes are (open reservoirs for vast amounts of data), why they’re essential (they reduce the time and resources required to map data), and how they integrate with three common analytic architectures:
Early-Binding Data Warehouse
Late-Binding Data Warehouse
Map-Reduce Hadoop System
Data lakes are useful parts of all three platforms, but deciding which platform to integrate a data lake with depends heavily on a health system’s resources and infrastructure.
Once understood and appropriately integrated with the optimal analytics platform, data lakes save health systems time, money, and resources by adding structure to data only as use cases arise.
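"Adding structure only as use cases arise" is the schema-on-read idea, which can be sketched as follows. The message shapes, field names, and the fever threshold are illustrative assumptions; the point is that raw records land untyped and a schema is bound only when a specific question is asked.

```python
import json

# Raw messages land in the lake untouched (schema-on-read).
lake = [
    json.dumps({"pat_id": "1001", "temp_f": 101.2, "unit": "ICU"}),
    json.dumps({"pat_id": "1002", "temp_f": 98.6}),
]

def bind(records, schema):
    """Apply structure at query time: project and type only the
    fields the current use case needs."""
    out = []
    for rec in records:
        doc = json.loads(rec)
        out.append({field: (cast(doc[field]) if doc.get(field) is not None else None)
                    for field, cast in schema.items()})
    return out

# A fever-surveillance use case binds just two fields.
fevers = [r for r in bind(lake, {"pat_id": str, "temp_f": float})
          if r["temp_f"] is not None and r["temp_f"] >= 100.4]
```

Fields no use case has asked about (like `unit` above) cost nothing: they sit in the raw records until someone needs them.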
Interoperability in healthcare, despite frequent objections by EHR vendors and health systems (e.g., "EHR integration is too difficult to manage"), is integral to delivering high-quality patient care.
Interoperability means different things to different health system stakeholders, from leaders seeing it as a purchase they must defend, to clinicians relying on it to get the information they need, when they need it. But it boils down to delivering the highest-quality, most effective, and most efficient care to patients—a goal that’s easier to define than achieve.
One of interoperability's most important use cases, EHR integration, faces objections from worried EHR vendors and health systems, ranging from HIT vendors wanting to integrate too many tools to fears about opening up EHR access. Fortunately, these objections are dissipating with the introduction of national interoperability policy and better cooperation among industry participants.
Amidst these distractions, health systems need to regain focus on interoperability’s top goal: improving patient care by making the best information available at the point of care.
Given that up to 80 percent of clinical data is stored as unstructured text, healthcare organizations need to harness the power of text analytics. Surprisingly, though, fewer than five percent of health systems use it, due to resource limitations and the complexity of text analytics.
But given the industry's need for text analytics to create precise patient registries, enhance understanding of high-risk patient populations, and improve outcomes, this executive report explains why systems must start using it and how to get started.
Health systems can start using text analytics to improve outcomes by focusing on four key components:
Optimize text search (display, medical terminologies, and context).
Enhance context and extract values with an NLP pipeline.
Always validate the algorithm.
Focus on interoperability and integration using a Late-Binding approach.
This broad approach will position health systems for clinical and financial success.
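The "extract values" step of an NLP pipeline can be sketched minimally. The example below pulls an ejection fraction out of unstructured note text with a regular expression; the pattern and note text are illustrative assumptions, and a production pipeline would add full NLP, negation detection, and context handling as the report describes.

```python
import re

# Minimal sketch: extract a numeric value (ejection fraction) from
# free-text clinical notes. The pattern is an illustrative assumption.
EF_PATTERN = re.compile(
    r"\b(?:ejection fraction|ef)\s*(?:of|is|:)?\s*(\d{1,2})\s*%",
    re.IGNORECASE)

def extract_ef(note: str):
    """Return the ejection fraction as an int, or None if absent."""
    match = EF_PATTERN.search(note)
    return int(match.group(1)) if match else None

note = "Echo today. Ejection fraction of 35% with mild mitral regurgitation."
ef = extract_ef(note)   # 35
```

Turning "ejection fraction of 35%" into a discrete value is what lets unstructured text feed registries and risk models.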
For better or worse, hospitals are obligated to collect and report data for regulatory purposes. Or they feel compelled to meet some reputational metric. The problem is, an inordinate amount of time can be spent on what is considered data for accountability or punishment, when the real focus should be on data for learning and improvement. When time, effort, and resources are dedicated to the latter, it leads to real outcomes improvement.
Deming described three ways of focusing on a process, and this article applies them to healthcare:
Sub-optimization, over-emphasizing a single part at the expense of the whole.
Extreme over-emphasis, also called gaming the system.
The right amount of focus, the only path to improvement.
With data for learning as the primary goal, improving clinical, operational, and financial processes becomes an internal strategy that lifts the entire healthcare system.
There’s a new way to think about healthcare data analysts. Give them the responsibilities of a data detective. If ever there were a Sherlock Holmes of healthcare analytics, it’s the analyst who thinks like a detective. Part scientist, part bloodhound, part magician, the healthcare data detective thrives on discovery, extracting pearls of insight where others have previously returned empty-handed. This valuable role comprises critical thinkers, story engineers, and sleuths who look at healthcare data in a different way. Three attributes define the data detective:
They are inquisitive and relentless with their questions.
They let the data inform.
They drive to the heart of what matters.
Innovative analytics leaders understand the importance of supporting the data analyst through the data detective career track, and the need to start developing this role right away in the pursuit of outcomes improvement in all healthcare domains.
Bad healthcare data is inevitable. Whether it happens as a result of human input error or an incorrect rule, bad healthcare data will happen. And rather than ignoring it, hiding it, or scrubbing it, health systems need to take a more transparent approach.
Bad healthcare data, when approached correctly, has four surprising benefits:
Provides valuable feedback to application users/data consumers.
Inspires an improvement culture.
Creates a snowball effect of success.
Improves data accuracy.
It’s not easy to make the shift from fearing bad data to embracing it, but there are several steps systems can take to start creating a data transparency culture:
Empower: encourage data consumers to provide feedback.
Share: provide a mechanism for sharing feedback.
Act: dedicate time and resources to respond and act.
Health systems prepared and willing to fix bad data will ultimately improve data quality.
How to integrate data across systems of care depends on the organization’s perspective. In this report from the Scottsdale Institute, learn how leaders from Health Catalyst, Cerner, Geisinger, and CHI have tackled issues such as population health, HIEs, value-based payments, and data governance. Ultimately the starting point isn’t really how to integrate the data, but why the data needs to be integrated in the first place. The approach changes, for example, when an organization needs to combine data for a regulatory report versus using data for real-time patient-physician interaction.
The healthcare industry is currently obsessed with outcome measures — and for good reason. But tracking outcome measures alone is insufficient to reach the goals of better quality and reduced costs. Instead, health systems must get more granular with their data by tracking process measures. Process measures make it possible to identify the root cause of a health system’s failures. They’re the checklists of systematically guaranteeing that the right care will be delivered to every patient, every time. By using these checklists, organizations will be able to improve quality and cost by reducing the amount of variation in care delivery.
The data steward is critical to sustained outcomes improvement, yet they tend to be underappreciated members of the healthcare analytics family. Combining the invaluable technical expertise of a data analyst with the vital clinical knowledge of an experienced caregiver, the data steward’s skills and proficiency at both positions brings value beyond measure to any outcomes improvement project. Unfortunately, all too often, their role is non-existent even though potential candidates for the job are located in multiple data sources throughout the organization. Among other responsibilities, the data steward:
Reinforces the global data governance principles.
Helps develop and refine details of local data governance practices.
Is the eyes and ears of the organization with respect to data governance and the governance committee.
Provides direction to peers regarding appropriate data definitions, usage, and access.
Anticipates local consequences of global changes.
For innovative health system leaders who have specifically recognized this emerging role, the ROI of data stewards who help achieve improved outcomes is very worthwhile.
The phrase “data shopping” should conjure up images of crowded stores, out-of-stock items, long lines, and cranky sales clerks. This scenario is similar to what data users and analysts face when they operate without a strict data management policy and without a unified data platform. Many healthcare institutions attempt to operate with data stored in multiple locations, accessible in different ways. Users spend too much time looking for the one source of truth, analysts spend too much time gathering data to fulfill user requests, and not enough time is spent analyzing data and generating improvements. Data shopping is dangerous, and organizations caught up in the spree need to consider a cleanup on aisle 9 (that’s analytics-speak for “consider an enterprise data warehouse”).
The number of partnerships and collaboratives in healthcare continues to climb. One of the many complications of these deals involves integrating and governing data. In fact, 100 percent of the 2014 Pioneer ACOs reported difficulties with data integration, which had a major, negative impact on performance. Right now, data governance in healthcare is in a transitional stage not unlike the U.S. in the 1980s. Leaders who manage data governance in these partnerships must be like a data-savvy version of Henry Kissinger, able to bring the data of loosely affiliated organizations together for the benefit of all.
Volume doesn’t equal quality in the world of healthcare data governance. This case study from the dairy industry shows that a large data governance committee doesn’t necessarily add up to trustworthy data or effective decision-making. Why the dairy industry? Before joining Organic Valley, the author worked as a data architect in healthcare data warehousing for one of the largest healthcare systems in the Midwest. This article shows how valuable a strong data governance model is, regardless of industry, at aligning IT and business decision-making personnel from across the enterprise to bring about a greater appreciation of the importance of data and its role in guiding the business. A best-practice governance framework can also result in more trustworthy data and a faster track to developing the enterprise data warehouse (EDW) model.
The prescription for improving healthcare outcomes is pretty straightforward: improve quality by working with good data that’s based on patient perceptions of quality, as well as functional health outcomes. Then make that data accessible and actionable among your physicians and give them the leeway they need to reduce variation and, ultimately, improve outcomes. As simple as this may seem, it’s been complicated by an inefficient data infrastructure with non-standardized components (EHRs) and the inability to distribute analyses and visualizations where they are needed most (at the point of care). Dale Sanders explains these issues in detail and outlines solutions in this article published in the April 2015 edition of BMJ Outcomes.
National awareness for the privacy and security of patient electronic health information is currently at an all-time high. Yet providing HIPAA-compliant solutions has been an ongoing priority since the founding of Health Catalyst. While our handling of PHI isn’t as extensive as that of a payer or healthcare provider, we are committed to complete compliance with HIPAA and ensuring the privacy and security of our clients’ PHI. This is possible because of our culture and advanced technology. Technology features include tracking and audit trails, physical security of the data, limited user access to data during deployment, role-based security features, protection of sensitive subsets of PHI, and ongoing control of user access regardless of the hosting environment.
The Changing Role of Healthcare Data Analysts—How Our Most Successful Clients Are Embracing Healthcare Transformation (Executive Report)
The healthcare industry is undergoing a sea change, and healthcare data analysts will play a central role in this transformation. This report explores how the evolution to value-based care is changing the role of healthcare data analysts, how data analysts’ skills can best be applied to achieve value-based objectives and, finally, how Health Catalyst’s most successful health system clients are making this cultural transformation happen in the real world.
Finding the perfect data governance environment is an elusive target. It's important to govern to the least extent necessary to achieve the greatest common good. Of the three data governance cultures (authoritarian, tribal, and democratic), the democratic culture is best for a balanced, productive governance strategy.
The Triple Aim of data governance is: 1) ensuring data quality, 2) building data literacy, and 3) maximizing data exploitation for the organization’s benefit. The overall strategy should be guided by these three principles under the guidance of the data governance committee.
Data governance committees need to be sponsored at the executive board and leadership level, with supporting roles defined for data stewards, data architects, database and systems administrators, and data analysts. Data governance committees need to avoid the most common failure modes: wandering, technical overkill, political infighting, and bureaucratic red tape.
Healthcare organizations undergoing analytics adoption will also go through six phases of data governance: 1) establishing the tone for becoming a data-driven organization, 2) providing access to data, 3) establishing data stewards, 4) establishing a data quality program, 5) exploiting data for the benefit of the organization, and 6) strategically acquiring data to benefit the organization.
As U.S. healthcare moves into its next stage of evolution, the organizations that will survive and thrive will be those that most effectively acquire, analyze, and utilize their data to its fullest extent. Such is the mission of data governance.
Health data stewards are keepers of tribal knowledge, and they’re invaluable when a health system launches or expands a healthcare data analytics initiative. Their intimate and expansive knowledge of how data is collected to represent workflow across different systems can save days’ worth of time (and cost) in the development process while improving the accuracy of the analytics output. But getting anything more than a few spare moments of their time can be difficult because health data stewardship isn’t part of their job description. While it may seem difficult to justify at first, organizations need to formalize the role of the health data steward. The investment will ultimately return many times its value as the organization realizes the advantage of the analytics.
Disease Surveillance: Monitoring and Reacting to Outbreaks (like Ebola) with an Enterprise Data Warehouse
The current options for monitoring data to help identify disease outbreaks like Ebola are not great. These are: 1) Monitoring chief complaint/reason for admission data in ADT data streams. Although this is a real-time approach, the data is not codified and would require some degree of NLP. 2) Monitoring coded data collected in EHRs. The most precise option available, but the data is not available until after the patient encounter is closed, which would be too late in most cases. And 3) Monitoring billing data. This approach has the same problems as the two listed above, but it’s better than nothing in the absence of an EMR. All of these weaknesses can be solved with the use of a data warehouse.
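The first option, screening free-text chief complaints from an ADT stream, can be sketched with simple keyword matching standing in for NLP. The watch terms, the travel-mention rule, and the sample complaints below are illustrative assumptions; a production system would use real NLP against codified terminologies.

```python
# Sketch of option 1: flag outbreak-relevant chief complaints in an
# ADT stream. Keyword lists and the symptom-plus-travel rule are
# illustrative assumptions, not a clinical algorithm.
WATCH_TERMS = {"fever", "hemorrhage", "vomiting"}
TRAVEL_TERMS = {"travel", "returned from"}

def flag_complaint(text: str) -> bool:
    """True when a complaint mentions both a watch symptom and travel."""
    t = text.lower()
    symptomatic = any(term in t for term in WATCH_TERMS)
    travel = any(term in t for term in TRAVEL_TERMS)
    return symptomatic and travel

adt_stream = [
    "Fever and vomiting, recently returned from affected region",
    "Ankle sprain playing soccer",
]
alerts = [c for c in adt_stream if flag_complaint(c)]
```

Running a screen like this continuously against a data warehouse combines the real-time nature of ADT feeds with the retrospective precision of coded data.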
Master data management (MDM) is key for healthcare organizations looking to integrate different systems. The two types of master data are identity data and reference data, and master data management is the process of linking them. MDM is important for mergers and acquisitions and for health information exchanges. The three approaches to MDM are: IT system consolidation, upstream MDM implementation, and downstream master data reconciliation in an enterprise data warehouse.
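The third approach, downstream reconciliation, can be sketched as deterministic record linkage: patient records from two source systems are matched on normalized name and date of birth and assigned a shared enterprise ID. The field names, the `EMPI-` identifier format, and the exact-match rule are illustrative assumptions; real MDM adds probabilistic matching and manual review.

```python
# Sketch of downstream master data reconciliation in a warehouse:
# link records from multiple source systems to one enterprise ID.
def match_key(rec):
    """Normalize the fields used for matching (illustrative rule)."""
    return (rec["last"].strip().lower(), rec["first"].strip().lower(), rec["dob"])

def build_master_index(*systems):
    """Map each (source, local_id) pair to a shared enterprise ID."""
    master, index, next_id = {}, {}, 1
    for system in systems:
        for rec in system:
            key = match_key(rec)
            if key not in index:
                index[key] = f"EMPI-{next_id}"   # mint a new enterprise ID
                next_id += 1
            master[(rec["source"], rec["local_id"])] = index[key]
    return master

ehr = [{"source": "ehr", "local_id": "A1",
        "last": "Lee", "first": "Ana", "dob": "1980-03-01"}]
billing = [{"source": "billing", "local_id": "B9",
            "last": "LEE ", "first": "ana", "dob": "1980-03-01"}]
master = build_master_index(ehr, billing)
```

Because both records normalize to the same key, the EHR and billing entries resolve to one enterprise ID, which is what lets downstream analytics see a single patient across systems.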