Taylor Larsen joined Health Catalyst in December 2014 as a Data Architect. Prior to coming to Health Catalyst, he worked for the Colorado Department of Health Care Policy and Financing as a Budget and Data Analyst. Taylor has a Master’s degree in Economics from the University of Colorado.
The complexity and consequences of data-informed decision making in healthcare demand the highest-quality data, and COVID-19 has amplified that relationship. Pandemic-driven urgency, the variety of data, and limited resources have made it more critical than ever that organizations build a data quality coalition and strategy to ensure systemwide data is fit for purpose. Having the people, processes, and technology necessary to define, evaluate, and monitor data quality allows for a quick, effective, and sustained response at an organizational scale, and the coalition keeps all resources working together on the task at hand within a well-defined structure.
Healthcare providers knew that COVID-19 would threaten the lives of their patients, but few understood the greater ripple effects across their business and industry as a whole. For providers, two significant COVID-19-induced challenges arose: analytic strain and resource limitations. These challenges highlighted the critical importance of data quality.
Healthcare leaders can improve data quality throughout their organizations by understanding the data quality lessons learned from COVID-19. Five guidelines from these lessons will help organizations prepare for the next pandemic or significant analytic use case:
1. Assess data quality throughout the pipeline.
2. Do not leave analysts to firefight.
3. Look outside the four walls of the organization.
4. Data context and purpose matter.
5. Use a singular vision to scale data quality.
Healthcare organizations increasingly understand the value of data quality, but many lack a systematic process for establishing and maintaining that quality. As COVID-19 response and recovery further underscore the need for timely, actionable data, organizations must take a more proactive approach to data quality.
A structured process engages technical and subject matter expertise to define, evaluate, and monitor data quality throughout the pipeline. Health systems can follow a simple, four-level framework to measure and monitor data quality, ensuring that data is fit to drive high-quality, data-informed decisions:
1. Think of data as a product.
2. Address structural data quality first.
3. Define content-level data quality with subject matter experts.
4. Create a coalition for multidisciplinary support.
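The first three levels of the framework can be made concrete in code. A minimal sketch of the distinction between structural checks (level two: expected columns, types, completeness) and content-level checks (level three: value rules supplied by subject matter experts) is below; the table, column names, and the 0-180-day length-of-stay rule are all illustrative assumptions, not part of any specific health system's pipeline.

```python
import pandas as pd

# Hypothetical extract of an admissions table; column names are illustrative.
admissions = pd.DataFrame({
    "patient_id": ["P001", "P002", "P003", None],
    "admit_date": ["2020-03-01", "2020-03-02", "not a date", "2020-03-04"],
    "length_of_stay_days": [3, -1, 5, 210],
})

def structural_checks(df):
    """Structural data quality: does the data have the expected shape and types?"""
    issues = []
    for col in ["patient_id", "admit_date", "length_of_stay_days"]:
        if col not in df.columns:
            issues.append(f"missing column: {col}")
    if df["patient_id"].isna().any():
        issues.append("null patient_id values")
    if pd.to_datetime(df["admit_date"], errors="coerce").isna().any():
        issues.append("unparseable admit_date values")
    return issues

def content_checks(df):
    """Content-level data quality: rules defined with subject matter experts."""
    issues = []
    # Illustrative SME-defined rule: length of stay must fall between 0 and 180 days.
    out_of_range = ~df["length_of_stay_days"].between(0, 180)
    if out_of_range.any():
        issues.append(f"{out_of_range.sum()} length_of_stay_days out of range")
    return issues

print(structural_checks(admissions))
print(content_checks(admissions))
```

Treating these checks as repeatable, monitored tests rather than one-off queries is what "data as a product" implies in practice: the same checks run every time the pipeline refreshes.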
As health systems continue to adopt machine learning to impact significant outcomes (e.g., reducing readmissions, preventing hospital-acquired infections, and reducing length of stay), they must also leverage evidence-based medicine. Evidence adds critical insight to machine learning models, ensuring that models incorporate all necessary variables in their risk prediction, and builds credibility among clinicians.
Evidence-based medicine brings three essential elements to healthcare machine learning:
1. Boosts machine learning model credibility.
2. Engages data experts around healthcare projects.
3. Saves time and money and increases ROI.
In today’s improvement-driven healthcare environment, organizations must ensure that improvement measures help them reach desired outcomes and focus on the opportunities with optimal ROI. With data science-based analysis, health systems leverage machine learning to determine if improvement measures align with specific outcomes and avoid the risk and cost of carrying out interventions that are unlikely to support their goals.
There are four essential reasons that insights from data science help health systems implement and sustain improvement:
1. Measures aligned with desired outcomes drive improvement.
2. Improvement teams focus on processes they can impact.
3. Outcome-specific interventions might impact other outcomes.
4. Data science insights identify opportunities with optimal ROI.
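A common first-pass, data science-based check of whether a process measure aligns with a desired outcome is to quantify how strongly the measure relates to that outcome before committing to an intervention. The sketch below uses synthetic, illustrative data (the measure names and effect sizes are assumptions): one candidate measure is constructed to drive the outcome, the other is not, and a simple correlation separates them.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic, illustrative process measures (both scaled 0-1).
med_rec_rate = rng.uniform(0.5, 1.0, n)   # hypothetical medication reconciliation rate
survey_score = rng.uniform(0.0, 1.0, n)   # hypothetical measure unrelated to the outcome

# Outcome (e.g., a readmission rate) driven by the first measure only, plus noise.
readmission_rate = 0.3 - 0.2 * med_rec_rate + rng.normal(0, 0.02, n)

# Correlation with the outcome is a first-pass alignment check:
# a measure with near-zero correlation is unlikely to drive improvement.
for name, measure in [("med_rec_rate", med_rec_rate),
                      ("survey_score", survey_score)]:
    r = np.corrcoef(measure, readmission_rate)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```

In practice, health systems would replace this with richer models (for example, regression or tree-based feature importance) that control for confounders, but the goal is the same: focus improvement teams on measures the data shows they can impact.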