Healthcare Data Quality: Five Lessons Learned from COVID-19
COVID-19’s onset was unexpected, and its rapid spread across the globe was unprecedented. Healthcare providers knew the disease presented a threat to their patients’ health, but few fully grasped the extent to which it would affect other facets of their business.
Early in the pandemic, providers faced COVID-19-induced challenges that largely fell into two categories: analytic strain and resource limitations. Analytic strain came from new COVID-19-specific value sets, urgent lab code updates driven by the outbreaks, and ever-evolving guidelines from the Centers for Disease Control and Prevention (CDC) and other public health entities. Compounding these issues were resource limitations, which arose as providers furloughed staff due to canceled or postponed surgeries and office visits. Non-furloughed staff often had to work from new remote environments, straining communications and reporting structures.
Pandemic-driven urgency, variety of data, and a lack of resources have highlighted the critical importance of data quality as a prerequisite for any analytic use case. The COVID-19 pandemic will not be the last of its kind. Organizations must prepare for the next large-scale emergency by committing to a systemwide data quality strategy that produces accurate data at all organizational levels.
Data Quality Starts from the Ground Up
Data quality, the state of qualitative or quantitative pieces of information, ranks as “high” when it helps users make quick and accurate decisions. Healthcare providers must cultivate or adopt a systemwide approach to achieve and maintain this data quality level. For example, health data users can base their quality approaches on the Toyota Total Quality Management Approach, a widely accepted framework that integrates customer-centric data quality into every facet of the business. In a healthcare adaptation, all aspects of a health system work together to ensure the free flow of data across the organization, and all are accountable for the quality of that data.
Five Healthcare Data Quality Lessons Learned from COVID-19
The fast and furious nature of COVID-19 has highlighted areas for improvement in healthcare data quality that organizations must address in preparation for future analytic use cases. To get started, healthcare providers can follow the guidance of five data quality lessons learned from COVID-19.
- Assess Data Quality Throughout the Pipeline
End users discover most data quality issues too late—at the conclusion of the pipeline. At that point in the process, an analyst or subject matter expert (SME) must engage in a time-consuming root cause analysis to determine where things went awry, further delaying the delivery of accurate and actionable results.
Report writers, analysts, and SMEs can move quality up the pipeline by assessing their data and inserting quality checks on top of the model wherever they add data. For example, imagine a report is created that lists COVID-19 patients and their primary care providers (PCPs). When an analyst looks at that report, she sees that each patient is listed with every PCP they have ever encountered, as opposed to just their current PCP.
The analyst knows that patient and PCP should be a 1:1 relationship, not a one-to-many relationship, so she kicks off a root cause analysis to determine where the error entered the model. She may find that the join between the patient and PCP tables lacked a time component, so every historical PCP assignment survived the join. She can then build a data quality check on top of the model that triggers an alert whenever that 1:1 relationship is broken.
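A minimal sketch of this kind of check, using hypothetical pandas tables (the column names and data are illustrative, not from the article): joining without the time component returns every historical PCP, while filtering to the active assignment restores the 1:1 relationship, which a simple guard can then enforce.

```python
import pandas as pd

# Hypothetical data: each patient has a history of PCP assignments.
patients = pd.DataFrame({"patient_id": [1, 2]})
pcp_history = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "pcp": ["Dr. Adams", "Dr. Baker", "Dr. Chen"],
    # NaT end_date means the assignment is still active.
    "end_date": [pd.Timestamp("2019-06-30"), pd.NaT, pd.NaT],
})

# Join WITHOUT the time component: every historical PCP appears (one-to-many).
all_pcps = patients.merge(pcp_history, on="patient_id")

# Join WITH the time component: keep only the active assignment (1:1).
current = patients.merge(
    pcp_history[pcp_history["end_date"].isna()], on="patient_id"
)

def check_one_to_one(df: pd.DataFrame, key: str) -> None:
    """Quality check on top of the model: alert if the 1:1 relationship breaks."""
    dupes = int(df[key].duplicated().sum())
    if dupes:
        raise ValueError(f"1:1 relationship broken: {dupes} duplicate {key} rows")

check_one_to_one(current, "patient_id")      # passes: one row per patient
# check_one_to_one(all_pcps, "patient_id")   # would raise: patient 1 has two PCPs
```

Note that pandas can also enforce this directly at join time via `merge(..., validate="one_to_one")`, which fails fast instead of requiring a downstream check.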
- Do Not Leave Analysts to Firefight
Analysts are not firefighters. Organizations shouldn’t rely on them to quickly address analytic emergencies when they arise, though this is often the case. The onset of COVID-19 has created human resource shortages while also increasing requests for metrics and reports. Analysts need a framework that allows them to focus on data quality but should not carry the full burden of quality maintenance. SMEs and report writers access the data in the later stages of its lifecycle and must contribute to its quality by analyzing it from their unique perspectives and implementing quality checks as needed.
- Look Outside the Four Walls of the Organization
When analyzing data, team members should make an effort to look outside their organizational silo with an eye on the two Vs—verification and validation:
- Verification asks: Does the information meet system assumptions? Is local knowledge represented in the data? If so, the data is verified.
- Validation, on the other hand, aims to align data values with relevant external benchmarks. For example, during a merger and acquisition, one should compare data quality between the organizations to ensure no glaring disparities in quality.
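The two Vs can be sketched as lightweight checks. In this hypothetical example (the rates and the benchmark value are illustrative, not real published figures), verification confirms the data meets a system assumption, and validation compares a local value against an external benchmark:

```python
import statistics

# Hypothetical local data: weekly COVID-19 test positivity rates as fractions.
local_rates = [0.08, 0.11, 0.09, 0.10]

# Verification: does the data meet system assumptions?
# Here, a positivity rate must be a fraction between 0 and 1.
assert all(0.0 <= r <= 1.0 for r in local_rates), "verification failed: rate out of range"

# Validation: do local values align with a relevant external benchmark?
# (The benchmark and tolerance below are illustrative assumptions.)
external_benchmark = 0.10
tolerance = 0.05
local_mean = statistics.mean(local_rates)
if abs(local_mean - external_benchmark) > tolerance:
    print("validation flag: local mean diverges from external benchmark")
```

The same pattern applies during a merger: run the identical validation checks against both organizations' data and flag any glaring disparities.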
- Data Context and Purpose Matter
Both context and purpose matter when determining whether data quality is sufficient to support decision making, but they become increasingly important as data travels up the framework. Table 1 shows a four-level data quality framework.
| Level | Definition | Context Dependent? | Purpose Dependent? |
|---|---|---|---|
| Level 1 – Structural | Database constraints are enforced, including data types, NULLs, primary keys, and referential integrity. | No | No |
| Level 2 – Content: Single Subject Area | Values are reasonable within the context of the domain. | No | No |
| Level 3 – Content: Multiple Subject Areas | Values are reasonable across multiple domains. | Yes | No |
| Level 4 – Utility | Values represent information empirically demonstrated to support better decisions. | Yes | Yes |
Table 1: Four levels of a data quality framework.
The data quality framework suggests that providers think of data as a product and address its quality as it traverses the system. Analysts and data users should address structural data quality before moving on to more complex challenges. They can then work with the SMEs who use the data to define single subject area and multisubject area data quality use cases, resulting in a data quality coalition across the organization. Overall, the framework helps providers maintain quick access to accurate data for use in typical day-to-day operations or extreme cases such as the response to COVID-19.
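The first two framework levels lend themselves to simple automated checks. This sketch uses a hypothetical patient table (the column names and age range are illustrative assumptions) to show a Level 1 structural check followed by a Level 2 single-subject-area content check:

```python
import pandas as pd

# Hypothetical patient table used to illustrate the first two framework levels.
patients = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "age": [34, 67, 51],
})

# Level 1 – Structural: enforce NULL constraints, key uniqueness, and data types.
assert patients["patient_id"].notna().all(), "structural: NULL primary key"
assert patients["patient_id"].is_unique, "structural: duplicate primary key"
assert pd.api.types.is_integer_dtype(patients["age"]), "structural: age must be integer"

# Level 2 – Content (single subject area): values reasonable within the domain.
# An age of 0-120 is an illustrative plausibility range for patient records.
assert patients["age"].between(0, 120).all(), "content: age outside plausible range"
```

Levels 3 and 4 require joining domains and consulting SMEs, so they are typically defined collaboratively rather than as standalone assertions.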
- Use a Singular Vision to Scale Healthcare Data Quality
Team members in different departments across the organization likely spend only a small percentage of their time addressing data quality, resulting in low commitment and limited impact. When those team members adopt a singular vision and framework, they pool their efforts, turning scattered small percentages into a scalable, organization-wide data quality practice that yields cohesive insights.
Commit to Data Quality
COVID-19 has highlighted the crucial importance of embracing a systemic approach to healthcare data quality. Analysts and SMEs are necessary pieces of the puzzle, but they are not the solution, and just-in-time data is simply too late when lives are on the line. By taking these lessons learned from COVID-19, organizations can build a reliable data quality framework, preparing them to save jobs and lives when the next urgent analytic use case arrives.
Would you like to learn more about this topic? Here are some articles we suggest:
- How to Run Analytics for More Actionable, Timely Insights: A Healthcare Data Quality Framework
- Six Proven Methods to Combat COVID-19 with Real-World Analytics
- Achieve Data-Informed Healthcare in Eight Steps
- Six Strategies to Navigate COVID-19 Financial Recovery for Health Systems
Would you like to use or share these concepts? Download the presentation highlighting the key points.