3 Frequent Mistakes in Healthcare Data Analytics


Health systems and the healthcare industry in general are exploring the possibilities of healthcare data analytics. If you leaf through any current healthcare publication or glance around a health IT website/blog, there’s a good chance you’ll find something about Big Data, population health management, or accountable care – maybe even all three. These are hot topics because they are viewed by many people as tenets for healthcare reform. Underlying each of these themes is the concept of analytics. Without analytics, it is difficult (if not impossible) to manage population health effectively or determine how risk should be shared.

While healthcare analytics is critical, it is important to note that there is no such thing as a magic bullet for analytics. This may seem obvious, but it is surprising how often healthcare organizations view analytics as the answer to all their woes. 

Having worked with health systems and their analytics efforts for nearly two decades, I’ve seen the following three common mistakes that consistently plague analytic endeavors.

Healthcare Data Analytics Mistake 1: Analytics Whiplash

Recently, I spent some time fishing in the Northwest. It had been quite a while since my prior fishing trip and my technique was, well, a bit rusty. Years ago, my brother-in-law, a master fly fisher, taught me an important lesson. He showed me that a fish needs some time to study the fly. It must be confident that if it is going to make the effort to chase a fly, it will catch it. For the fish to get comfortable, a fisherman needs to make the presentation of the fly on the water flawless. It must land with the same delicate weighting as real flies do when dancing on the surface of the water.

“Perfect presentation,” as he called it, is a technique that takes time to master. As a river guide, he saw far too many fishermen make the mistake of impatiently working a river section without giving fish sufficient time to study the fly. Hurriedly, they would move on, abandoning what might be the perfect fishing hole on a river bend.

More than once, I gave up on a spot, moved on, and then moments later found my brother-in-law pulling a big brown trout from the same spot. He’d laugh and say, “I could catch a trout on a dusty road!” This lesson made all the difference on my most recent trip. I slowed down, focused on finding the perfect snap of the wrist, and in short order I was able to lay down repeated, perfect fly presentations. The payoff was exhilarating.

So, what does fly fishing have to do with healthcare analytics?

As I travel around the country meeting with health systems, I often interact with their top analysts and BI developers. A common theme that emerges is that analysts often feel like they are being whiplashed from one analysis to the next.

The cycle goes like this. Senior leadership has a problem they want to better understand. They approach a respected analyst for help. A task is assigned, the problem is outlined, the data is collected, and the analyst begins to study it. Just as the analyst feels she is really coming to understand the problem well enough to start making meaningful correlations, she is told to quickly wrap up what she is working on so she can move on to another task, and the process starts all over. It is frustrating for an analyst to feel that leadership is casting her to and fro without the patience to let her really settle on a problem that could yield a big catch.

Analytics projects are most successful when the analyst can follow an iterative process – developing a basic report, validating report assumptions with management and/or the people who own the processes being measured, and making adjustments to logic and assumptions for further review by process owners. If the analyst doesn’t have sufficient time to go through these steps thoroughly, the report analysis will likely be incomplete. Producing a handful of half-baked analyses will lead senior management and/or process owners to be uncertain about whether the information they’re receiving from the report can be trusted. This can lead to dissatisfaction with the organization’s data analytics capabilities.

An obvious answer would be to hire more analysts to do the work, but that doesn’t address the root cause; it merely gives the organization a larger capacity to generate incomplete analyses. The analyst doesn’t necessarily need to work harder; she needs enough time to do smarter analysis.

Prioritization from Leadership Is Key

For this to happen, leadership needs to become proficient at prioritization. Not everything can be priority number one. Furthermore, analysts shouldn’t be put in the position of deciding what comes first; that’s a function of leadership. If they haven’t already done so, senior management must step back as a group, determine which projects have the highest value (and which can wait), and then commit to seeing the highest-value projects through to completion, even if a shiny new object comes along. Putting an end to the constant whiplash of so-called “urgent” projects will ensure senior management receives quality information that helps drive business objectives.

While fishing for fishing’s sake can be a good thing (fresh air, the beauty of nature, good company, getting away from the office), its analytics equivalent, doing analytics just to have some analytics, may not be the best way to leverage your brilliant analysts.

Healthcare Data Analytics Mistake 2: Coloring the Truth

Which is more important — telling senior management what they want to hear or reporting bad news accurately? In the short run, telling your senior leadership what you think they want to hear may seem like the easier path. However, it won’t help them make meaningful quality improvements within the hospital. And it will ultimately engender mistrust in IT and analytics.

An example of this type of flawed thinking is illustrated below. Up until 2014, acute care hospitals were required by the Centers for Medicare and Medicaid Services (CMS) to report central line-associated bloodstream infections (CLABSIs) acquired during a patient’s hospital stay. Under that regulation, only the ICUs within a hospital were required to report CLABSI rates to CMS.

Though CLABSIs can occur outside the ICU setting, a hospital is required to report only those that occur during an ICU stay. Because CLABSI rates can affect future reimbursement from CMS, hospitals keep a very close watch on them. If an organization limits CLABSI reporting to the CMS-imposed definition (i.e., only infections within the ICU), the reported incidence of CLABSI will significantly underrepresent the actual rate.
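To make that gap concrete, here is a minimal Python sketch that compares what an ICU-only definition would report against the hospital-wide total. The unit names and infection counts are hypothetical, chosen purely to illustrate the arithmetic, not drawn from any real hospital.

```python
# Hypothetical CLABSI counts for one quarter -- illustrative only.
clabsi_events = [
    {"unit": "ICU", "count": 6},
    {"unit": "Med/Surg", "count": 7},
    {"unit": "Oncology", "count": 4},
]

# What gets reported under the ICU-only definition vs. what actually occurred.
reported = sum(e["count"] for e in clabsi_events if e["unit"] == "ICU")
actual = sum(e["count"] for e in clabsi_events)

print(f"CLABSIs reported under the ICU-only definition: {reported}")
print(f"CLABSIs occurring hospital-wide:                {actual}")
print(f"Share missing from the reported figure:         {1 - reported / actual:.0%}")
```

With these made-up numbers, nearly two-thirds of the infections never appear in the reported figure, which is exactly the kind of blind spot the following scenario explores.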

Imagine for a moment that you are an analyst tasked with outcomes reporting for CLABSI. You are very good at your job. Those you support with analysis trust the information you offer. Suppose one day you discover that what you have been reporting as CLABSI incidence is limited to the ICU, and that an alarming number of CLABSIs actually occur outside the ICU. In your mind, you begin to weigh the pros and cons of sharing this information. You tell yourself there is no future financial penalty associated with these incidents (from a payer perspective). You remind yourself that the hospital is not required to report these newly identified cases. So, what do you do? Do you share this new information with senior leadership? You keep asking yourself how they, the senior leadership, will respond to this kind of bad news. At length, you decide to let sleeping dogs lie. You figure that because there is no financial imperative, and because leadership may not react well to the news, it’s better that they don’t know. “What they don’t know won’t hurt ‘em,” you tell yourself.

Analysts Must Be Comfortable Sharing Bad News

This scenario has played out with many hospital systems. It’s symptomatic of a much bigger problem. In effect, is senior leadership unknowingly incentivizing analysts to lie to them?

To encourage analysts to act as true analysts, senior management has to be willing to hear the bad news along with the good and to include all the data. After all, it’s tough to drive system-wide improvement if you’re not looking under every stone. If the goal is to use data analytics to become a high-performing organization, you need to take all of the data into consideration. Doing so may cause uncomfortable or even painful moments early on, but it will be far more effective in helping the organization become a high-performing, data-driven health system.

Healthcare Data Analytics Mistake 3: Deceitful Visualizations

This is a technique politicians (and the media) use often. By presenting accurate data in deliberately misleading visualizations, in violation of Edward Tufte’s six principles of graphical integrity, they can lead the public to see what they want them to see instead of what’s really there.

For example, the image below (courtesy of the blog Political Math) shows the difference between an accurate scale and a misleading scale. The graph on the left is deceptive because it gives the impression that there was a large increase in deductibles over two years. The reality, shown on the right, is a more modest increase. While both graphs show accurate information, the scale has been manipulated to give a different impression, violating graphical integrity and making it difficult for the viewer to evaluate the data fairly.

 

[Figure: the same deductible data plotted with a misleading, truncated scale (left) and an accurate scale (right). Courtesy of the blog Political Math.]
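The effect is easy to reproduce. The sketch below is a minimal Python/matplotlib example using made-up deductible figures (not the numbers from the chart above); it plots identical data twice, once with the y-axis truncated just below the smallest value and once with the axis starting at zero.

```python
import matplotlib.pyplot as plt

# Made-up annual deductible figures -- illustrative only.
years = ["2013", "2014", "2015"]
deductibles = [1100, 1150, 1220]

fig, (misleading, honest) = plt.subplots(1, 2, figsize=(9, 4))

# Left panel: truncated y-axis makes a roughly 11 percent rise look like a surge.
misleading.bar(years, deductibles, color="steelblue")
misleading.set_ylim(1080, 1240)
misleading.set_title("Misleading: truncated axis")
misleading.set_ylabel("Average deductible ($)")

# Right panel: zero-based axis shows the same data at its true proportion.
honest.bar(years, deductibles, color="steelblue")
honest.set_ylim(0, 1300)
honest.set_title("Accurate: axis starts at zero")

fig.tight_layout()
plt.show()
```

Nothing about the data changes between the two panels; only the scale does, and with it the story the viewer takes away.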

A top-tier newspaper used this technique a few years ago to make it appear as though there was a sudden, significant jump in the price of oil. The graphic represented the price with little oil barrels, and, at a certain point, there were many more oil barrels on the graph. The manipulation came at the point where the price appeared to jump: the scale changed from decades to years, with only an almost imperceptible notation acknowledging the change. Most readers would glance at the chart, assume the scale was consistent, and conclude that the price had changed significantly.

Senior leadership must guard against directing analysts to create visualizations that make the numbers look better (or worse) than they are. The purpose of visualizations is to make complex data easier to understand in order to create better actions. Anything that gets in the way of presenting the data with complete accuracy is a disservice to the data – and to the organization.

Empowered Healthcare Data Analysts Lead to Usable Healthcare Analytics

These are the three most common mistakes I see when organizations work with analytics, and any one of them alone can cost a health system millions of dollars. When healthcare analytics is done correctly and accurately, with data analysts who are empowered to spend time with the data, to be open and honest with senior leadership, and to use accurate, truthful visualizations, it can make the difference between failing and thriving in a value-based healthcare environment.

Do you agree that these are major mistakes? Have you found others?


PowerPoint Slides

Would you like to use or share these concepts? Download this healthcare data presentation highlighting the main points.

Click Here to Download Slides
