Unleashing Data: The Key To Driving Massive Improvements


 

Tom Burton:        Thanks Tyler. I’m excited today to share some ideas and some thoughts on how we unleash data, which we at Health Catalyst believe is really the key to driving massive outcomes improvement. I have three core learning objectives. The first is to illustrate how important it is to invest in analytics and training, both the infrastructure and the people, to prepare for massive improvement. Second, we’ll talk about how you unleash data across the entire improvement spectrum. Last, we’ll talk about how to sustain and spread improvements across your entire organization.

We believe that to unleash the full potential of data, organizations should really adopt a balanced approach to improvement across the spectrums of both effort and value. Now, as Tyler mentioned, our mission at Health Catalyst is to help organizations across the country massively improve their healthcare outcomes. That’s why we exist; that’s what we do. We’re excited to share some of the experiences we’ve gained as we’ve learned with many of our partner organizations across the country. I want to first talk about the spectrum idea, and I want to think about the effort required to achieve an improvement.

On the one end of the spectrum, there’s a light amount of effort where we simply provide data to the right team and organic improvement happens. They have data; they make better decisions. There’s a medium-effort improvement where it still goes fairly fast but might require a little more effort. Then there’s deep, comprehensive improvement, where we’re trying to save lives, really impact patient care, or significantly reduce the cost of care. That requires a lot more effort. Then, there’s everything in between. I like to think of it on this matrix, a four-box matrix. You can’t have a good webinar without a four-box matrix.

If you think about it, on the X axis we have effort, and on the Y axis we have the value of the improvement. You might have a clinical improvement you’re working on which requires a significant amount of effort, but it may have a significant value. You may actually impact patient lives. They may have better clinical outcomes. It may reduce the mortality rate. You may have some improvement efforts that require a very light amount of effort but have a significant value. Maybe you change from using a high-cost drug to a generic drug, and you change that in an order set; it didn’t require much effort, but you have a significant, high-value impact pretty quickly.

You then may have patient experience improvements that require a medium or light effort. You’ll have combinations. You’ll have some efforts that produce both a clinical value and a financial value. Ultimately, when we hit the triple aim, when we have all three, financial, clinical, and experience improvements, that’s the sweet spot that we want to shoot for. What we want to avoid are high-effort but only light-value improvements, where we’re putting more money and effort into it than the value that we’re getting out. One of the things that we’ve noticed as we’ve worked with organizations across the country is that organizations sometimes tend to overemphasize certain aspects of the spectrums.

For example, we might have an organization that focuses only on deep improvements, and they’re not allowing the organic improvements to really take off. Or some organizations might just do the light-touch improvements, or the light-effort improvements, and they’re missing the opportunity to really deeply change processes. We may have some organizations that overly focus on financial improvements at the expense of clinical and experience improvements, or vice versa; they may focus too much on clinical without respect to the cost of care.

What we are really ideally aiming for is an even spread across the spectrum, avoiding those high-effort, low-value situations, but really looking for clinical, financial, and experience improvements at light, medium, and high effort and light, medium, and high value, as you can see shown on the chart here. That brings us to our very first poll question. We’re going to have a lot of poll questions today. We are excited to get your input. The first poll question is, which type of improvement does your organization focus on?

Tyler:                        All right, well, our poll is up. For just a few more moments, give everyone a chance to respond, then we’ll share the results. We’d like to remind everybody that’s joined that, if you have any questions or comments, you can type those into the questions pane of your control panel. We’ll close this poll and share our results.

Tom Burton:        It looks like most of you have a pretty good balanced mix, that’s wonderful, with some focusing on clinical, financial, and the lowest coming in with patient experience, very interesting. Well, I’m glad most of you have a balanced mix, that’s key. We really want to think about all three of those categories. We have another poll question which is, of those four quadrants that I showed, which does your organization tend to gravitate towards?

Tyler:                        All right, we’ve got that poll question up now. Which quadrant does your organization tend to gravitate towards, the high effort and high value, high effort and light value, light effort and high value, light effort and light value or balanced approach avoiding the high effort and light value? Again, we’ll leave this open for just a few moments, give folks a chance to respond. We will share the results. All right, we’re going to close that poll. Here are results.

Tom Burton:        All right, it looks like a lot of organizations are going after those two quadrants, the high effort and high value, and the light effort and high value, with not very many going after the light effort, light value, and a significant number trying to have a balanced approach, which is what we would recommend. Well, today we’re going to talk about how we can have that balanced approach, and really what the key capabilities and skills are that you’ll need for massive improvement leveraging your data. I’ve always believed that using an analogy is key to understanding. We actually developed a game called Spectrum. This webinar is based on this game that we developed.

Now, unfortunately, we won’t be able to play the game, although I’ve actually thought about creating an electronic version of it. We created this game to help teach these principles. We based it off a popular strategy game called Seven Wonders. The reason I like this game is that there are a lot of different ways to win; there are a lot of different ways to get improvement points. We play it over three rounds, and it has multiple aspects. We won’t get into the game itself, but I’ll give you a general feel, because you’ll see some of the game images throughout this webinar.

When we typically do this, we’re doing it with 50 to 75 people. Each table has six to seven players at it; each table represents a care delivery system, and each player represents a department within that system. In this case, the game board shown here is the urgent care department. We then have a lot of different ways that players can earn points. They earn points from light-effort improvements, system-wide adoption, high-value, high-effort improvements, collaborating between departments, and having enough money to fund improvement. There are a lot of different principles that we try to teach.

We’re going to go over all those principles today without actually playing the game. I wanted you to know that if you can pull together key executives and key stakeholders, a good cohort of people willing to learn and listen to these principles, we’re willing to come out and share this game with you, because we think these principles are key to the whole industry improving. We’re going to go into the first area, our first key learning objective, talking about investing for success. We’re going to talk about that in three core categories: first, investing in your data infrastructure; second, investing in your people; and finally, the progress on your journey towards becoming a true data-driven culture.

Let’s first talk about investing in your data infrastructure. There are four key capabilities that are needed here. First, acquiring data; second, granting access to data; third, building actionable metrics; and finally, finding insight in that data. As we think about the data that you need, there was an interesting study done by the province of Alberta where they found that only 8% of the data that’s required for really deep improvement, population health, and precision medicine is actually in your EMR today. That means we’re looking at a very low resolution picture and trying to make heads or tails of what it shows. That’s the situation we’re in. We’re just now beginning the digitization of our health.

It’s great that we have healthcare encounter data that’s found in the EMR. We need to expand into genomic data, outcomes data, biometrics data from wearables, consumer data, and other social data, all of which can be great predictors of what interventions are going to work best for the patients we’re trying to serve. A question to all of you: what is your one-, three-, and five-year strategic plan for acquiring more data? Gartner defines the need for a health data convergence hub, which is a place where you consolidate and converge all of the data that you need for interpreting what’s the best thing to do for your patients.

That convergence hub automates the ingestion of that data and then delivers that data to the right modality. At Health Catalyst, we call this our healthcare data operating system. It’s different from a traditional data warehouse. A traditional data warehouse does a lot of great things, but an operating system, or this convergence hub, enables a lot more. It includes things like R and Python and other predictive modeling tools. It uses the workflow of the EHR or other transactional systems and pushes analytic information back into those transactional systems. It’s real-time; most traditional data warehouses are updated nightly or weekly.

A data operating system is much more real-time. It leverages a lot of these newer technologies. Some of the key things to be aware of when you’re thinking about data acquisition: first, being able to get data into these data lakes or source marts and use the principle of late binding, where you don’t have to define everything upfront. You don’t know exactly how you’re going to need or use that data, but you gather as much data as you can. Second, having really good automated tools for that ingestion process, connectors that automatically connect and take some of the tedious work out of building data marts.
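As an illustration of the late-binding principle just described, here is a minimal sketch (the feed, field names, and event codes are all hypothetical): raw records are landed in a source mart untouched, and a vocabulary is bound to the codes only when a specific question is asked.

```python
import json

# Hypothetical rows from an ADT-style interface; landed exactly as received.
raw_adt_feed = [
    {"pat_id": "001", "evt": "A01", "ts": "2017-03-01T08:15:00"},
    {"pat_id": "002", "evt": "A03", "ts": "2017-03-01T09:40:00"},
]

# "Source mart" pattern: store the raw payloads untouched, no upfront schema.
source_mart = [json.dumps(row) for row in raw_adt_feed]

# Later, an analyst binds meaning to the raw codes for one specific question.
EVENT_NAMES = {"A01": "admit", "A03": "discharge"}  # binding applied late

def admissions(mart):
    """Count admit events, interpreting the codes only now, at query time."""
    rows = (json.loads(r) for r in mart)
    return sum(1 for r in rows if EVENT_NAMES.get(r["evt"]) == "admit")

print(admissions(source_mart))  # 1
```

The point of the pattern is that a new question only requires a new binding over the same untouched raw data, not a re-ingestion.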

Then finally, leveraging some of the newer technologies that Silicon Valley has created: Hadoop/Spark, machine learning, deep learning, natural language processing. All of these capabilities are going to be key as we try to acquire enough data to make really meaningful insights and predictions about the patients we’re trying to serve. The second key principle in investing in your infrastructure is granting access to that data. This is one of the most challenging polarities for organizations: how do they balance protecting the data with granting access to it. I like to use an analogy.

Some of you may have heard this analogy before, but one of my favorite movies is Indiana Jones and the Last Crusade. At the end of that movie, he’s got to save his father by crossing this huge chasm, and he can’t see that there’s an invisible bridge; he’s got to take a leap of faith. That’s like managing a polarity. If he goes to either side of this invisible bridge, he’s going to fall to his death. Sometimes we have a tendency to manage to one of the extremes in a polarity. Data access is a polarity where, on the one hand, we want to protect the data, protect the privacy and the security of the data, which is important, and on the other hand we want to share the data for improvement work.

Well, if we go to either extreme, we’re going to be in trouble. We want to go right down the middle. We want a balanced approach where we do both: we protect the data and we share it appropriately. If any of you remember, in that movie Indiana Jones threw sand over that invisible bridge, making it visible, turning the camouflage into a dusted bridge you could actually see. We can do that as well by establishing policies: what is our streamlined access policy, how can we really make sure the right people are getting access to data, and that we’re not over on the data protection extreme where only IT can get access to the data.

It takes six months to get access, and so forth. On the other hand, we also want to protect against data breaches. We want to do consistent audits of use of the data. We want to make sure data stewards review who has access to their data marts on a frequent basis, and have those data stewards be clinicians and business leaders that own stewardship for that data. We’ve talked about a couple of key concepts in granting access. Number one, data steward ownership; second, trust but verify. Make sure that you’re auditing on a regular basis. Finally, creating team- or role-based data access policies can increase efficiency.
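The team- or role-based access idea can be sketched as follows (the roles, mart names, and user are hypothetical): each role maps to a group of data marts, so adding someone to a team grants the whole group at once rather than mart by mart.

```python
# Hypothetical role-based access policy: a role maps to a set of data marts.
ROLE_GRANTS = {
    "cardiology_analyst": {"cardiology_mart", "costing_mart"},
    "ed_analyst": {"ed_mart", "costing_mart"},
}

user_roles = {}  # user -> set of roles held

def add_to_team(user, role):
    """Joining a team grants the role's whole group of marts at once."""
    user_roles.setdefault(user, set()).add(role)

def can_access(user, mart):
    # An audit log call could go here: every check is a "trust but verify" hook.
    return any(mart in ROLE_GRANTS[r] for r in user_roles.get(user, ()))

add_to_team("jsmith", "cardiology_analyst")
print(can_access("jsmith", "cardiology_mart"))  # True
print(can_access("jsmith", "ed_mart"))          # False
```

A data steward reviewing access then only has to review the role-to-mart mapping and the team rosters, not every individual grant.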

When a new team member joins a team, rather than having to manually grant that access, you can have groups of individuals automatically have access to groups of data. The next principle is all about building actionable metrics. The most sophisticated and accurate predictive model is absolutely worthless unless it promotes an action that would otherwise not have happened. This is a really key principle, actionable metrics. What makes something actionable is the right information, delivered to the right audience, at the right granularity, at the right time in their workflow, in the right visualization or modality; that’s what produces the right action to actually improve the outcomes.

The final principle in investing in the infrastructure is discovering insights. Data really becomes valuable when an insight is discovered, such as a trend or a pattern, some sort of correlation or causation that’s shown in the data. Here is an example of that. If you looked at this graph initially, you might think, wow, we did great. Before the change, we were at eight and now we’re at three; that’s a great improvement, we’ve reduced the delay in this process. Well, we’re not really sure if that’s accurate unless we have a little bit more data. Here are some examples of how we might have come to the wrong conclusion because of poor sampling.

Maybe we’re just not sampling enough data, and it’s random; we happened to sample something in the past that was bad and something in the current period that looks good. There may already have been a natural trend, and we’re just taking credit for it. We may have made a change and there was a temporary improvement, but it didn’t last. Something external may have improved, and we made the change and then took credit for it; it would have happened anyway. Or we may have an outlier, where there really wasn’t a problem; it was just one blip in the data, where there was a data quality problem or an outlier case that really wasn’t normal.

It’s only when we see a good control chart or run chart that we can say, yes, there was definitely a change from before we made the change to after, where we saw significant improvement. Here are some principles: avoid scorecards, because they often lack insights. I’ve seen a lot of scorecards in the healthcare industry, and a lot of them really don’t tell us much information. What’s much more effective is looking at patterns and distributions of data. We want to avoid what I call the rush-to-judgment trap, or the outliers trap. This is where we set a standard and we say everybody that’s below the standard is doing badly, and that may shame those below the standard into doing better.

It really doesn’t move the overall process. What we’d rather do is tighten the curve and shift it towards excellent outcomes, so even those that are already doing good work do great or excellent work. Now, we can translate this, if we tip it on its side, into a control chart. You can see how we’ve taken that distribution, tipped it on its side, and shown it over time with upper and lower control limits. You can see there’s quite a bit of variation in the process; we then do an intervention and we tighten and improve our process. Over time, we go from a broad distribution to a tighter distribution. You can see we’ve definitely improved that process by reducing the variation.
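A control chart check like the one described can be sketched in a few lines. This is a simplified sketch, not a full SPC implementation: it uses the overall standard deviation for the limits (real individuals charts typically use moving ranges), and it flags points outside the limits as well as long runs on one side of the center line.

```python
from statistics import mean, stdev

def control_limits(samples):
    """Center line and simplified +/-3-sigma limits for a run of samples."""
    m, s = mean(samples), stdev(samples)
    return m - 3 * s, m, m + 3 * s

def out_of_control(samples, run_length=7):
    """Flag points beyond the limits, plus runs of consecutive points
    all on one side of the center line (a common run-rule)."""
    lo, center, hi = control_limits(samples)
    flags = [i for i, x in enumerate(samples) if x < lo or x > hi]
    side = [1 if x > center else -1 for x in samples]
    for i in range(len(samples) - run_length + 1):
        if abs(sum(side[i:i + run_length])) == run_length:  # all one side
            flags.extend(range(i, i + run_length))
    return sorted(set(flags))

# A delay series that shifts from ~8 to ~3 after an intervention:
data = [8, 7, 9, 8, 7, 8, 9, 3, 3, 2, 3, 3, 2, 3]
print(out_of_control(data))
```

Here the run-rule, not the 3-sigma limits, is what detects the shift: every point sits in a long run on one side of the overall center line, which is exactly the "investigate further" signal described above.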

This is where insights, and using tools like control charts, can really be helpful in understanding those insights and drawing conclusions that will help improve outcomes. There are a number of different methods to look for patterns, where you see a certain number of consecutive points, or certain data points outside the control limits. Those are cause to investigate further, and they are some great techniques that your data architects or analytic engineers can use. All right, so we’ve covered the first topic, which was investing in the data infrastructure. Now, let’s talk about investing in your people.

There are three absolutely key roles that you want to invest in: your analytic engineers, your change agents, and your key stakeholders. Let’s talk about your analytic engineers. I love this cartoon. “I told you I wasn’t a hunter gatherer. I’m an analyst!” You’ve got the caveman there charting how many kills they’ve had on their recent hunt. The challenge is that most of our data folks today spend their time gathering, compiling, and hunting for the data versus analyzing the data. We want to reverse that. We want most of their time spent interpreting the data and understanding the questions that are being asked; that’s where the real value is. What are the core skills needed to do that?

Well, you’ve got to be able to create insights and present that information and insight in a compelling way. You also have to have a deep understanding of healthcare data. Some core technical skills needed are SQL, data modeling, and visualization and reporting skills. In addition, statistics, predictive modeling, machine learning and AI skills, an understanding of quality improvement, even deeper visualization techniques, and project management are also important. Let me share a fun example of why you’ve got to have both the technical skills and the context; otherwise, you might jump to the wrong conclusions.

For example, let’s say I’m analyzing some data and I notice that there seems to be a correlation between high ice cream sales and shark attacks. I’d say, you know what, I think I understand this. We taste better when we eat ice cream; therefore, what we should do is stop selling ice cream, and that will lead to fewer shark attacks. Well, this is a situation where we’ve confused correlation with causation. In reality, warmer weather is the cause of both higher ice cream sales and more shark attacks. Understanding the healthcare context, making sure your analytic engineers are paired with clinicians or operating leaders, can make a big difference in interpreting the insights in the data.
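The ice cream and shark attack example can be made concrete with a small simulation (the numbers are invented purely for illustration): temperature drives both variables, neither causes the other, yet the measured correlation between them is strongly positive.

```python
import random

random.seed(0)

def simulate(days=365):
    """Generate (temperature, ice cream sales, shark attacks) per day,
    where temperature is the common cause of the other two."""
    rows = []
    for _ in range(days):
        temp = random.gauss(20, 8)                      # the confounder
        sales = 50 + 3 * temp + random.gauss(0, 10)     # driven by temp
        attacks = max(0.0, 0.1 * temp + random.gauss(0, 0.5))  # also temp
        rows.append((temp, sales, attacks))
    return rows

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

data = simulate()
sales = [r[1] for r in data]
attacks = [r[2] for r in data]
print(round(corr(sales, attacks), 2))  # strongly positive, no causal link
```

Conditioning on the confounder (comparing sales and attacks only among days of similar temperature) would make the apparent relationship largely disappear, which is the statistical version of pairing the analyst with someone who knows the context.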

The second key capability, or the second key role, is change agents. I love this quote from Eric Hoffer: “In times of change, learners inherit the future, while the learned find themselves beautifully equipped to deal with a world that no longer exists.” Many of you have seen this chart, adapted from Everett Rogers in The Diffusion of Innovations. He talks about innovators, early adopters, and the early and late majority. When we’re trying to effect change, we’ve got to identify who those early adopters are. If you have 100 people, it’s usually going to be the square root of that 100, about 10 people, but it’s the right 10 people. It’s the 10 people that influence the others.

It’s the people that go out and buy the new iPhone the day it comes out. They are the ones that tell everybody, “Oh, it’s cooler.” “Oh, it’s not that great.” They’re going to be the ones that influence the early and late majority. Getting them on board for the change is really key. One way to do that is to do this analysis by understanding who is willing to lead change and who is able to lead change. Identify those that are willing by saying, who would like to lead this change? Then, poll the rest of the people that the change will affect and ask them who they’d be willing to follow. What you’ll find is a few names will bubble to the top.

Those are the ones you want to pick, because they’ll be able to convince their colleagues to make the change. Now, one of the key roles of a change agent is to prepare people for change. We all go through this process of hitting the pit of despair, if you will: realizing that my world is changing, and I’ve got to do things differently. We want to move people from that feeling of loss to awareness of why the change is needed, a desire to make the change, the ability to make the change, and then sustaining the improvement. That’s a hard process. This is an area that people often forget. They think that if we get the technology in place, change will just happen automatically.

That is definitely not what we’ve seen as we’ve worked with lots of health systems across the country. The final key role and skill set is engaging the right stakeholders. I love this quote: “Things get done only if the data we gather can inform and inspire those in a position to make a difference.” There are actually four key stakeholder groups that we need to engage: the executive group, the domain leadership group, those that will lead adoption in their individual units, and those that will design the innovation or the change in the process. We need all four of those groups engaged at the right level.

One of the tools that we recommend using is what we call a stakeholder analysis. This is where we determine whether the change is going to have a major impact or a minor impact, how important that change is to that particular subgroup of stakeholders, and how it is going to impact their, what we call, heat level. Are they going to be overwhelmed? Are they going to be disengaged and think, hey, that’s not my problem? By identifying and deliberately walking through the stakeholder analysis, you’re better able to prepare people for the changes that are coming.

One of the keys to keeping the change going well is providing visibility to all of the key stakeholders: letting people know how the project is going and when it’s going to impact them, and making challenges in the project visible early so that they can be solved. The final area we’re going to talk about is the journey towards becoming data-driven: moving away from the scorecards and some of the older static reports we may have used in the past towards much more sophisticated embedded analytics. Then, how do you change your culture, and what are the core capabilities needed to become a data-driven culture?

Billy Beane, who introduced analytics to baseball, said, “We’ve got to use every piece of data and piece of information, and hopefully that will help us be accurate with our player evaluation. For us, that’s our lifeblood.” This is the journey: we often start in siloed spreadsheets, we might move to centralized reporting, and where we want to eventually get to is a deep data-driven culture. Other industries have already done this, embedding analytics right in the workflow; think about Facebook, or think about Amazon. In Amazon, there are all these analytics embedded around an order entry system.

I go to Amazon to buy a particular good, but I have all these analytics, like “customers like me also bought this item with the item you’re looking at.” Usually, we find that pretty useful. Or I look at customer reviews and I say, wow, this item is much better reviewed than that item, and I make a much more informed decision; it supports my decision making. That’s why Amazon is so popular. We actually like the analytics that surround this workflow. A study was done at Intermountain Healthcare’s LDS Hospital a number of years ago. They found that clinicians are 15 times more likely to leverage analytics if they’re presented within the workflow.

What we want to do on that journey is move towards more sophisticated workflow-embedded analytics. Oftentimes, when we start working with an organization, they may have a significant number of their analytics in these more traditional static report cards or scorecards. What we want to do over time is shift away from those expensive scorecards towards these higher-level analytics. The first thing we can do is reduce the cost of a lot of these analytics by using better tools. Lowering the cost across the board for all the analytics with automated data platform tools can be a significant help.

Then over time, we can shift towards dynamic visualization and analytics embedded within the workflow. Over time, the types of analytics change to these more predictive models and embedded analytics right in the workflow, and that makes it easier for those needing to make the change to adopt the change. Finally, to scale outcomes improvement, there are more capabilities needed than just good analytics. You’ve got to be able to answer these questions. Analytics helps us understand how we’re doing, or how we predict we will do in the future; best practice helps us understand what we should be doing.

Adoption helps us understand how we will need to change and how we make those changes happen. Those are the three core capabilities to get outcomes to improve. Then to get them to scale, we’ve got to have leadership, culture, and governance, so we know where to focus. We also have to have financial alignment. We can’t have some people’s bonuses tied to things that are in opposition to making the improvement. We often see this with the way we’re compensated by our payers, and even how some of our physicians are compensated. We need to think about how we are compensated and whether that aligns with the improvement we’re trying to achieve.

We’ve covered a lot in our first 30 minutes of the webinar. I’d like to now ask a few more poll questions around this first area of investing for success. The next poll question is, which infrastructure component does your organization struggle with the most: acquiring data, granting access to data, building the actionable metrics, or finding insights in the data?

Tyler:                        Okay. We have that poll question live. We’ll leave that up for a few moments, give everyone a chance to respond. We have had a couple of questions come in regarding the slides. I want to let everybody know that, yes, we will be providing a copy of the slides as well as a link to this recorded webinar to everybody after the webinar. All right, we’ll close our poll and share our results.

Tom Burton:        Okay, very interesting. It looks like the most challenging is building actionable metrics, but close behind are acquiring data and finding insights in that data. Very interesting; it is a challenge, and the skill set required for building actionable metrics is usually pretty scarce. Let’s go to the next poll question, which is related to those key roles. Which of these key roles is most scarce in your organization: the analytics engineer, the change agent, or the engaged key stakeholder? I will give you a minute to answer that.

Tyler:                        All right, we’ve got that poll question up. Leave it open for just a few more moments. All right, let’s close our poll and share the results.

Tom Burton:        Interesting, I suspected that this would be the case. The analytic engineer is a very scarce resource. I’m glad many of you showed that the change agents are also scarce. Oftentimes, the key stakeholders are there, but they’re not engaged, which can be problematic. All three of these roles, as we discussed, are absolutely critical, and making sure you invest in training for all three of these groups is very important. I have one more poll question for this section. Which of these core capabilities does your organization struggle with the most? Just select one of the following. Which is your area: is it analytics, how are we doing? Is it best practice, what should we be doing? Is it adoption, how do we change or transform? Financial alignment, how are we compensated? Or governance, where should we focus?

Tyler:                        We’ll leave that open for just a few more moments, give folks a chance to respond. Let’s close this poll and share our results.

Tom Burton:        Yes, very interesting. Adoption is by far the hardest. You could have the best analytic system in the world, but if you can’t get people to use it to change their behaviors, it’s really not that valuable. That’s why we spend a lot of time educating and doing these webinars; we think one of the best ways to help with adoption is making everyone aware of what it takes to make change happen, and the tools, techniques, and skills you’ll need for change. That’s very interesting. Thank you all for participating in these poll questions. I find them really fascinating.

Tyler:                        We’ll be sure to send out all the results of these poll questions, including the number of respondents as well.

Tom Burton:        Great.

Tyler:                        That way everyone has access to that information.

Tom Burton:        Great, thanks Tyler. All right, let’s move on to the next section. Now, you’ve made your investments in your infrastructure and in your people, and you’re on your way towards massive outcomes improvement. What are some of the things you need to do to leverage the entire spectrum of improvement, those light-effort and deep-effort improvements? How do you decide what to work on? Then finally, the key: you’ve got to have enough money to invest. No margin, no mission. Those are the four things we’re going to cover in this next section. Let’s start first with what we call ongoing opportunity analysis. As opportunities present themselves, you won’t be able to work on all of them simultaneously.

How do you prioritize? What do you do to take advantage of the opportunities that present themselves? I love this quote from Jim Collins; he said, “Luck is not the key. How you handle good or bad luck is what matters.” There’s this concept of return on luck, where both great organizations and mediocre organizations are going to have both good and bad luck; it’s what we do with it that matters. Can you capitalize on even the bad luck? You may actually use bad luck or a bad result to motivate and promote making a change. That can be a great adoption accelerator, if you will, if you can rally around a sentinel event or around something that really needs attention.

Don’t squander your good luck or your bad luck. Now, some examples of luck in healthcare: you might get a payer that agrees to share savings, or you might get a really well-informed governor that understands healthcare. The bad luck might be a Joint Commission surveyor that shows up at the absolute worst time, or a competitor that opens a new facility right next door to you. Those are bad luck. Leverage them as an opportunity for improvement. The next key concept is to build a pipeline of opportunities, and this is what your analytic engineers’ primary role should be.

Going through the data, finding areas where we have opportunities, looking for variation, looking for patterns, looking for correlation and causation. You should have a pipeline that significantly exceeds your capacity to execute. That does a couple of things: it motivates you to increase your capacity, and it teaches the organization the importance of prioritization and working on the highest-value opportunities. If one of those opportunities stalls, you’ll have backup initiatives that can be worked on, so that you keep your analytic engineers fully deployed, working on multiple improvement opportunities simultaneously.

As you think about an opportunity, there are some key things you want to evaluate. The first is the return on investment. We like to think about that on that spectrum: how much effort is it going to require, and what's the value of the opportunity? The financial value you can quantify in dollars saved and the dollars it's going to cost you to achieve the improvement. You might categorize light, medium, and high effort; maybe you say light is less than an FTE, medium is one to three, and high is anything over three. You also want to evaluate the clinical value and the experience value. Maybe a light improvement is a process metric that improves.

A medium improvement might be an outcome that impacts the patient. A high improvement would be something like reducing the mortality of a patient cohort. You can do the same with experience. The other key component of evaluating an opportunity is to look at the organization's readiness. First, do we have the capability of actually making the change? Do we have the skills? What would it take to gain the skills? Do we have the right equipment and tools? Second, do we have the capacity? If there's too much on our plate, we can't make the change. What can we take off the plates of those who would need to make the change?

Finally, willingness. Oftentimes you may see the need for the change, but not everybody believes the change should be made. How do you get buy-in from the frontline who will need to make the change? How much resistance do you think will be encountered? All of those things should help in evaluating and prioritizing opportunities. One of the things that we offer for free on our website is an outcomes improvement readiness assessment. You can simply go to oira.healthcatlyst.com and take a 22-question competency survey that looks at each of the core competencies needed for significant improvement. You can take it individually or as an organization. We offer it free to help you start to think about your organization's readiness for change.
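The evaluation described above, weighing value, effort, and readiness, can be sketched as a simple scoring exercise. This is a minimal illustration under assumed weights; the `Opportunity` class, the example initiatives, and the scoring formula are all hypothetical, not a Health Catalyst method.

```python
# Minimal sketch: ranking a pipeline of improvement opportunities.
# The scoring formula and example data are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Opportunity:
    name: str
    value: int        # 1 = light, 2 = medium, 3 = high
    effort: int       # 1 = light (<1 FTE), 2 = medium (1-3), 3 = high (>3)
    readiness: float  # 0.0-1.0, combining capability, capacity, willingness

    def score(self) -> float:
        # Favor high value and high readiness; penalize high effort.
        return self.value / self.effort * self.readiness

pipeline = [
    Opportunity("Order-set drug swap", value=3, effort=1, readiness=0.9),
    Opportunity("Readmission reduction", value=3, effort=3, readiness=0.6),
    Opportunity("Stale report cleanup", value=1, effort=1, readiness=0.8),
]

# Work the pipeline highest score first.
for opp in sorted(pipeline, key=lambda o: o.score(), reverse=True):
    print(f"{opp.name}: {opp.score():.2f}")
```

Even a rough score like this makes the prioritization conversation explicit rather than political: every opportunity in the pipeline gets the same three questions.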

Tyler:                        Tom, we'll be sure to include that link in the follow-up email as well so that everyone has access to that.

Tom Burton:        Wonderful. All right, let's dive now into light effort improvements. I noticed that many of you answered that you weren't really focused on the light effort, light value improvements, but these can really add up. Thinking about how you enable light effort improvement, there are some core concepts. The first is that you want analytics capabilities spread across your organization, with a subset centralized in a hub: your training for analytics engineers, your visualization standards, your statistical analysis, your machine learning. Some of the deeper, higher complexity skill sets are probably better centralized.

Decentralized, you'll want to grant access to the data to a lot of folks who aren't in a centralized reporting or analytics group. That will allow individual departments to do their own ad-hoc queries and analysis and build their own custom dashboards for their specific needs. That's the first concept: you want a hub and spoke strategy for your analytic resources. Next, there are some core prerequisites. One is broadly accessible data. Second is a standard set of tools that your entire organization has access to. Third is adequate training support, not just for the core centralized team but for the broader spoke teams: those analysts and leaders throughout every department who should have access to data and tools.

You want to give them support and training. Then, communication: making sure people know what's available from the analytics infrastructure and what value has already been delivered. That can spark ideas about how to apply the same principles in a particular unit. Let me give you one example. I was working down at Texas Children's Hospital. We were working on asthma, and the physician leader was going through a dashboard that our analytics engineer had created as we were thinking about asthma action plans. There was some additional data on those screens. Suddenly, the physician said, "Whoa, whoa, whoa, stop. What's that?"

It was showing how many times chest x-rays had been ordered for these patients. He said, "Wow! That seems really high." It was 60% of the time, or some number much higher than he expected. Our immediate thought was, well, maybe there's a data problem. We dug into it a little bit, right in the meeting. No, it looked accurate. We did a couple of additional hours of investigation. The analytics engineer found out that the residents in the emergency department were using an old order set that automatically included a chest x-ray for these asthma patients, for these kids. That was not in accordance with best practice.

There were only a few medical conditions, maybe 5% of the time, not 60%, where these chest x-rays were needed. They quickly changed the order set, made a version without that chest x-ray the default, and then educated the residents: "Okay, this is when you would want to do a chest x-ray." Almost immediately, it went from 60% down to less than 5%. That's an example of a very light touch improvement which really hit the triple aim: better care for those patients, the cost of the x-rays eliminated, and a definitely better patient experience.

Those kids didn't have to have that uncomfortable exposure to radiation and get an x-ray when they didn't have to. This is an example of a light touch improvement, and we want a lot of these happening. That happened almost accidentally because that clinician had access to more data than they'd ever had before. All right, now deep continuous improvement also should be an area that you invest in. To do this well, you need as an organization to pick an improvement methodology. It's not so important which improvement methodology you pick, but across the organization, have the same methodology. There are a lot of great methodologies out there.

There's PDSA, there's Toyota, there's IHI; all of these stem back to Deming and some of the great principles learned last century. Have a standard so you avoid the Tower of Babel and you're all talking the same language. One of the keys then is answering these core questions as you're thinking about improvement. Every one of those methodologies has these key questions. First, do we understand the problem? Do we know where we want to be? What is best practice? What should we be doing? Do we know the cost of our less-than-stellar performance? Do we know what we want to change? Can we measure the change?

Then, are the changes resulting in improvement? Then, how do we sustain the gains? Those key questions appear in whatever methodology you're using. Just make sure that you're thinking about these core questions when you're doing a deep improvement, when you're significantly changing a process. Are you answering all seven of these questions? Now, to do that, you're going to have to have some organizational structure. You'll need folks that prioritize. You'll need folks that innovate, that come up with a different way of doing things. Then you've got to get that adoption piece that we talked about earlier: broad adoption across every unit that has that process.

We recommend for these deep improvement efforts establishing a consistent, on-going group of stakeholders that will own prioritizing the work, designing the innovations, and deploying those innovations broadly across the entire organization. The small teams may meet every week, and then they may take their drafts to a broader team that's going to poke holes in them and modify them slightly. You may go back and forth and iterate between the broad team and the small team. Then those groups are accountable to a broader guidance team that's in charge of prioritizing the work. Something like this might happen.

You might be trying to reduce the number of readmissions in a particular process. The team might set some process goals to try to get to that outcome goal. They may have a series of interventions that help them do that. As these interventions combine, the process improves and ultimately the patient's outcomes improve. Finally, we want to talk about no margin, no mission. There's a great quote from Sister Kraus that says, in effect, if we don't have money, it's really hard to accomplish the mission. This means getting the financial team involved early on.

You'll want them to set a baseline cost for the current process and then help calculate what hard cost savings have happened and what soft cost efficiency gains have happened: gains that may not impact the budget but may free up people to work on more important things. Or reversing the trend: maybe your costs were rising and you're able to flatten them in a particular department or care process area. One of the things we really have to think about is what type of improvement we're doing and how that interacts with the payment method we're under. These are four types of improvements. Who should get the care? What should the care be? How can we do it efficiently?

Depending on what payment method you have, some of those improvements, for example an indication for an intervention, could be great for the patient but bad for you financially. If you involve the finance team early, they can help you realize: you know what, you're going to work on spine surgeries and avoiding inappropriately sending folks to get a spine surgery who only need physical therapy. Well, if you're all fee for service, that could have a massive negative impact on your bottom line. You want the finance team involved early.

You'll probably want to go and renegotiate with your key payers a shared savings, full capitation, or condition capitation type arrangement. Then when you make that improvement, you not only benefit your patients clinically but your organization financially. The concept here is to have a good balance, so that the financial improvements you achieve can help fund some of those clinical and experience improvements. Another key way to fund improvement is doing what we call spring cleaning, or overhead value analysis. There are a lot of reports and analytics that for whatever reason were produced at one point in time but are no longer needed.
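One way to find candidates for that spring cleaning is to flag reports with no recent views in an access log. A minimal sketch, assuming a hypothetical log of last-viewed dates per report:

```python
# Sketch: flag reports with no views in the last N days.
# The report names and dates are invented for illustration.
from datetime import date, timedelta

access_log = {
    "daily_census":        date(2024, 5, 1),
    "legacy_asthma_audit": date(2023, 1, 15),
    "cfo_flash_report":    date(2024, 4, 28),
}

def stale_reports(log, today, max_idle_days=90):
    """Return report names not viewed within max_idle_days of today."""
    cutoff = today - timedelta(days=max_idle_days)
    return sorted(name for name, last_viewed in log.items()
                  if last_viewed < cutoff)

print(stale_reports(access_log, today=date(2024, 5, 2)))
# → ['legacy_asthma_audit']
```

Anything this flags becomes a candidate for the "is it still useful?" conversation rather than being silently retired.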

We have people producing these reports and spending time on them, but no one is really using them. We need to go through on a semi-frequent basis and ask: how much is it costing us to produce this? Is it still useful? If it's not, let's stop doing it. All right, so we've covered a lot in this section about leveraging the entire spectrum. Let's go to our poll questions now. First: how is your organization doing at prioritizing opportunities for improvement? How do you do it? Do you have trouble saying no, so everything gets approved and then nothing is a priority? Do you evaluate the value of those improvements? Do you evaluate both the effort and the value? Do you evaluate effort, value, and readiness? Or is it all about politics? Let's open the poll and see what the results are.

Tyler:                        That poll is up and we'll leave it open while folks are responding. We have had a couple of questions come in about the audio, so we'll let folks know that we are looking into it. The signal seems to be strong here, but we are looking into that. We're going to close the poll and share our results.

Tom Burton:        Interesting. The top answer was that it's all about the politics. 25% do evaluate effort, value, and readiness. That's great, that's very forward thinking. It's unfortunate that it is so often about the politics, and one of the ways we can get better at that is what we'll talk about in the next section. Helping people become data-driven versus politically-driven is one of the hardest transitions an organization has to go through, but it's absolutely critical. We've got two more poll questions. Next, which prerequisite for light effort or organic improvement is the weakest in your organization: broadly accessible data, alignment around an analytic tool set, analytic training support, communication about analytic value and the roadmap, or all of these components are working at least fairly well at my organization?

Tyler:                        We've got that poll question up and we'll leave it up for a few moments to give everyone a chance to respond. It looks like we were able to resolve the audio issues from before. We'd like to thank everybody for alerting us to that. All right, we're going to go ahead and close our poll and share our results.

Tom Burton:        All right, 31% have challenges communicating the analytic value and roadmap. Well, in our next section we'll talk about some strategies to do that, so that will be interesting. It's interesting that very few have all of these working well. As you strategically think about how to invest, take whichever prerequisite is weakest and invest in strengthening it, because organic improvement can produce some significant results as you give more and more people the ability to leverage data in a very light effort way. All right, our final poll question for this section, true or false: my organization has stale analytic reports that are never used but continue to be produced. In other words, do you need spring cleaning?

Tyler:                        All right, that poll question is up. Related to the responses from the previous poll question, I'd like to mention that David Grauer is doing a webinar on the 24th of this month. He'll be addressing a lot of those communication issues and other strategies for creating and sustaining those improvements throughout an organization.

Tom Burton:        Yeah, David is super talented. The former CEO of Intermountain Medical Center, the flagship hospital for Intermountain Healthcare, he joined us about a year ago and has great insight into how to communicate. He's one of the best communicators I've ever met. That will be a great webinar. Don't miss it.

Tyler:                        Let’s close our poll and share the results.

Tom Burton:        All right, it looks like a lot of us need to do some spring cleaning. Well, there are a lot of techniques for that. One technique that's really easy, though you have to be careful with it, is something I've done before: just turn off all the reports that you suspect aren't being used and see if anybody screams. If someone says something, you can quickly turn it back on, but you might eliminate a significant amount of work just by discontinuing some of those reports that you suspect aren't being used. All right, the final area we're going to talk about is sustaining and spreading improvement. We're going to go through three areas.

First, we're going to talk about collaboration between departments. Second, we'll talk about system-wide adoption. Finally, we'll talk about avoiding the politics and the conflict that are often associated with analytics. All right, first, collaborating. "Coming together is a beginning. Keeping together is progress. Working together is success." Henry Ford said that. If we think about data governance, this is a principle that many organizations struggle with. What we're really trying to do is maximize the value of that asset, our data, in our organization. There are three key components, or what we sometimes call the triple aim of data governance. The first is ensuring data quality.

The second is building data literacy, and the final is maximizing data utilization. Data quality is really made up of three components: the timeliness of the data, the accuracy of the data, and the completeness of the data. Now, depending on the use case, those three aspects may have a higher or a more moderate quality threshold. If we're doing something for research, the bar is much higher; we need a double-blinded study. If we're just observing, say, whether this process is more efficient than that process, we probably don't need data that's as detailed and accurate, and so the data quality requirement might be lower.
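The idea that the quality bar moves with the use case can be made concrete with per-use-case thresholds. The numbers below are illustrative assumptions only, not published standards:

```python
# Sketch: use-case-dependent data quality thresholds.
# Threshold values are illustrative assumptions.
QUALITY_BARS = {
    "research":            {"max_lag_days": 1,  "accuracy": 0.99, "completeness": 0.99},
    "clinical_dashboard":  {"max_lag_days": 7,  "accuracy": 0.95, "completeness": 0.90},
    "process_observation": {"max_lag_days": 30, "accuracy": 0.85, "completeness": 0.80},
}

def good_enough(use_case, lag_days, accuracy, completeness):
    """True if a data set meets the timeliness/accuracy/completeness bar."""
    bar = QUALITY_BARS[use_case]
    return (lag_days <= bar["max_lag_days"]
            and accuracy >= bar["accuracy"]
            and completeness >= bar["completeness"])

# The same data set can pass one bar and fail another:
print(good_enough("process_observation", 14, 0.90, 0.85))  # True
print(good_enough("research", 14, 0.90, 0.85))             # False
```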

We may be able to say, you know what, the data is not perfect, but it's good enough to make a decision that this process is going to be more effective. Make sure as you're thinking about data quality that you don't impose the same absolute quality requirements on every use case, because they are different. This chart shows some of the differences in data quality requirements. Now, data literacy: when we start working with a lot of organizations, we see this kind of distribution. There are a bunch of people that have access to reports, and we call them viewers. There may be a smaller group that has access to analytic dashboards where they can drill and slice and dice a little bit.

Then, there are very, very few who are what we would call knowledge workers, with both clinical and operational knowledge as well as access to the tools and the data. Instead of a pyramid or triangle shape, what we'd really like is an hourglass shape. We want a lot of people becoming knowledge workers with access to the data. We simplify the data structures they work with, we standardize naming conventions so the data is easier to interpret, and we give them better tools. If we do that, you enable a significant amount of collaboration between departments and the ability to interpret the data a lot faster.

That's what we call increasing data literacy, and that's the second aspect of data governance. The third is that as we grant broad access, we need to audit. This is that trust-but-verify concept I mentioned earlier. These are some screenshots of a great tool that we deploy called Idera: every single query and every single view of the data, by anyone, is tracked and audited. You need that kind of robust auditing capability to make sure the data is being used appropriately. All right, let's move now into the concept of system-wide adoption.

One of the key capabilities is getting the analytics folks, the analysts and the analytics engineers, together so that they can share knowledge and define reusable standards and components of the analytics infrastructure. What we recommend is establishing what we often call an analytic services working group. Their responsibilities are to standardize certain calculations and definitions, to recommend tools and processes (you want the right tools for the right situations), and to establish the data quality standards for the organization. What that does is create a flywheel effect.

You'll hear about an analyst that had a great result in one department, and that will trigger an idea in a different department. Even though this doesn't have to be a formal reporting relationship, and it's usually not, you get collaboration happening and you start to get system-wide adoption of great principles. Dale Sanders said, "Run your analytics department like a small business." Think of it as a startup: how do we stay customer oriented, how do we focus on the key team members within our analytics department, make sure they have everything they need, and make sure we've got technically sound tools?

When you have that service orientation, you have satisfied customers, and you adopt and develop excellent analytic assets that enable us to drive outcomes improvement throughout the organization. Run it like a small business. Now, a real key part of that is marketing. The marketing approach that we recommend is the same approach that Health Catalyst takes in general, and that is to educate people: teach the principles of analytics. Let your organization know what's available and then help them become more sophisticated users. Then when you have great results, share those results, because success breeds success.

One of the great ways to accelerate adoption is publishing what we call improvement vignettes. Here's an example from MultiCare, one of our customers up in Tacoma, Washington, and the work they did on reducing readmissions for COPD patients. It describes on the left-hand side the situation: it's a big problem, COPD is a severe condition. Then the paragraph on the right talks about what we actually did, and then it shows the results. This is a template that you can use. It's simple. It's one page. It's very quick to understand: the setting, the challenge, the turning point, and the result. So come up with a simple way of sharing it.

It doesn't have to be this, but something like this to share the great results. There's a headline, the setting, the challenge, the turning point, and the result. This is how you tell stories with data. The moral of the story, which isn't necessarily written but is implied, is: wow, we've got a great analytics infrastructure and we've got knowledgeable people that are really making a difference for our patients. As that message starts to get out there, it's going to generate more demand for better analytics, and it's a snowball effect. It will really start that flywheel spinning.

Be prepared for that, because there will be people starting to line up saying, well, how do I get access to the analytics infrastructure, and how do I get people trained to become analytics engineers? That's how you get that marketing engine going. Another key is improvement governance. Now, this is beyond data governance. This is helping the organization be more strategic about prioritization and what to focus on. We actually created a game just for this. We don't have time to get into all of that, but it's about identifying the right stakeholders, making sure they have a shared understanding, getting them aligned, and then picking the right initiatives to focus on.

All right, so we've talked about collaborating between departments and system-wide adoption. I want to now dive into avoiding conflict, and how we reduce the politics that we saw as a challenge in that poll. "Freedom is the ability to pause between stimulus and response, and in that pause to choose." That's a great quote. Easier said than done; how do we do that? How do we pause so that we don't create conflict and we become more data-driven? I'd like to cover a top 10 list of the worst practices in healthcare analytics interactions, and I've seen every one of these. Coming in at number 10, stick to the status quo.

Number nine, delay a decision with stall tactics. Number eight, budget cuts across the board: let's not look at the data, let's just be "fair" with 2% cuts across the board. Number seven, political favoritism. Number six, analysis paralysis. Number five, highlight the data imperfections and discredit the entire analysis. Number four, disengage from the process. Number three, prevent appropriate access to the data. Number two, misrepresent the data. And the worst practice, number one: use data as a weapon. Unfortunately, I've seen every single one of these across the country, analytics being used in a bad way. How do we avoid that?

Well, one way is to think about the adaptive leadership concept that Heifetz teaches, which is to keep people in the productive range. People often get overwhelmed, and when they get overwhelmed, they start to do some of those things on the top 10 list. What do we want to do? Start with compassion. Try to understand: okay, this is overwhelming. Can I break it down into smaller parts? Can I simplify it? Can I reallocate resources so that I lower the heat enough that someone can actually accept the change and get into that productive zone? Another way is to really understand what people are saying.

Oftentimes, our communications aren't really clear. I love this quote from Robert McCloskey: "I know that you believe you understand what you think I said, but I am not sure you realize that what you heard is not what I meant." You have to read that four times to really get it, but it's a key concept. The Harvard negotiation project talked about this principle of intent versus impact. We can think about four boxes: my intent, my impact, others' intent, and others' impact on me. The challenge is that we really only understand two of these four boxes. We understand what our intentions are, and we understand how others' words impacted us.

Where we get into trouble is when we think, well, I didn't mean to offend them. Just because we have good intentions doesn't make the impact of our poor communication irrelevant. And just because we think we understand what others' intentions were doesn't mean we truly do. What we can do is ask clarifying questions. We can say, this is what I meant when I said that; I can tell you may have taken it the wrong way. How did you feel? How did that land with you? Or if something offends us, or we feel like they're attacking us, we can say, this is how I felt when you said that; is that what you meant?

By just adding some of these clarifying questions, a lot of times we can avoid the politics and the contention that arise. All right, we have covered a lot. For this final section, let's ask a few more poll questions. First, within the triple aim of data governance, which is your organization struggling with the most: data quality, data literacy, or data utilization?

Tyler:                        Or we’re great with all three of these? Let’s not forget that response, Tom.

Tom Burton:        Yes, maybe we’ve got this nailed.

Tyler:                        All right, we’ll leave that open for just a few moments. Give everyone a chance to respond. We’ll close that poll and share our results.

Tom Burton:        All right, it looks like data utilization is the biggest challenge. I agree; I see that as a challenge a lot of the time. There are probably three aspects to improving that: first, making sure that everybody has access to the data; second, making sure that people have data literacy, that they understand how to use the data; and third, making sure that they can interpret the data, take action, and actually change a process or change behavior. Very interesting, thank you for sharing that. All right, next poll: which of the top five worst practices have you seen at your organization? Using data as a weapon, misrepresenting data, preventing appropriate data access, disengaging from the process, or highlighting data imperfections and discrediting the analysis?

Tyler:                        This is selecting all that apply?

Tom Burton:        Yeah, you can select all that apply here. This will be quite interesting.

Tyler:                        Okay, we’ll leave this up for a few more moments as everyone fills it out.

Tom Burton:        I hope this makes you feel like, wow, we're all in the same boat here. We've all got a lot to improve on, and I hope it doesn't discourage you, because it's not meant to. You can improve these. We've seen organizations significantly improve in all of these categories. All right, let's check out the results, Tyler. Interesting …

Tyler:                        Here are the results.

Tom Burton:        … we've seen all of these, it looks like, throughout the organizations. Okay, thank you. All right, well, that concludes our formal presentation portion. We covered a whole lot today. Let me just review the core concepts that we covered. We first talked about setting your organization up for success: making investments in infrastructure and training your people on the key roles and skills. We talked about a data-driven culture and that journey. Second, we talked about unleashing data across the entire spectrum, so continually performing opportunity analysis and evaluating value, effort, capability, capacity, and willingness so that you choose the right things to focus on.

We talked about light effort improvement and the prerequisites for that. We talked about investing in deep improvement, having one improvement methodology, and organizing interdisciplinary teams for improvement. We talked about making sure you have enough money to invest in the mission: no margin, no mission. We talked about spring cleaning, which we all need to do a little bit of on unused analytics that are still costing us effort to produce. Then finally, we talked about sustaining and spreading improvement: collaboration between departments, promoting system-wide adoption, and avoiding the politics, avoiding some of those worst practices in healthcare analytics.

I hope that these principles were helpful. Again, we have this in a pretty fun interactive game format. If you're able to get 25 of your key executives, stakeholders, et cetera, willing to participate, Health Catalyst would be happy to come out and share that game with you. It's a pretty fun experience. We did it at our healthcare analytics summit last year in September and it got a pretty good review. If you're interested, please let us know. I think we have one final poll question just to indicate whether you're interested in doing that at your organization.

Tyler:                        That’s right, related to that, just please respond. Are you interested in having Tom Burton facilitate the Spectrum game at your organization? Please respond to this. While we have this up, we can go right into our Q&A portion.

Tom Burton:        Great.

Tyler:                        Our first question is: predictive modeling has been a trend in many industries; however, the healthcare industry appears to be slow in adopting it. Why is that?

Tom Burton:        That is an excellent question. Part of the challenge is the data. Remember that slide I showed where only 8% of the data really needed for population health is found in the EMRs. Well, if we don't have the data to train these predictive models, it's very hard to produce models that are accurate, meaningful, and actionable. I think the biggest challenge has been that while we have a lot of data, it's not all of the data we need. The key there is expanding our data acquisition strategy. The second key challenge, I think, is that there's high demand for these data scientist skills. Every industry needs them.

I don't think the healthcare industry has been as aggressive as we need to be in recruiting and finding those people, and then helping to give them the healthcare context. Again, remember the shark and ice cream analogy. If you don't have the context of healthcare, it's really hard to build a predictive model that's useful. That's the other challenge: recruiting the data scientists and then getting them to have a deep enough knowledge of the context that they can actually make meaningful predictive models. Great question, thank you for that question.

Tyler:                        Our next question. When will we move from data-driven to information-driven or prescriptive analytics, and is there an opportunity for exponential change to get there versus incremental change?

Tom Burton:        Yes, absolutely. We've seen that. As we use machine learning, we can actually accelerate a lot of those manual processes that have traditionally taken incremental change. A great example: we're actually using machine learning with UPMC in their activity-based costing system. There used to be a very manual process to check that all of the cost data was accurate. We were able to identify those patterns and build them into machine learning algorithms that automated a large chunk of the closing process. In the past, they had six people working about 10 days at the end of each month to close out the books.

Using machine learning, we took that down to six hours, so pretty significant. As you said, that’s not just incremental change but exponential change: going from 10 days down to 6 hours is a really, really significant improvement. Again, that was easier in the cost management area. We’ve got to do more and more of that in the clinical area; the data there is even more complex, and that’s part of the challenge, but absolutely.
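Tom doesn’t describe UPMC’s actual algorithms, but the kind of automated check he alludes to — learning what normal cost records look like and flagging the exceptions that used to require manual review — can be sketched with a simple robust outlier rule. The function name, threshold, and sample records below are illustrative, not drawn from the real system.

```python
import statistics

def flag_suspect_costs(records, threshold=3.5):
    """Flag cost records that deviate sharply from the rest.

    Uses a modified z-score based on the median absolute deviation
    (MAD), which stays stable even when a few records are badly
    mis-keyed. `records` is a list of (item_id, cost) tuples.
    """
    costs = [c for _, c in records]
    med = statistics.median(costs)
    mad = statistics.median(abs(c - med) for c in costs)
    if mad == 0:
        return []  # no spread at all; nothing to flag
    return [item for item, c in records
            if 0.6745 * abs(c - med) / mad > threshold]

records = [("supply-1", 12.0), ("supply-2", 11.5), ("supply-3", 12.2),
           ("supply-4", 11.8), ("supply-5", 480.0)]  # one mis-keyed cost
suspects = flag_suspect_costs(records)
print(suspects)  # only the mis-keyed record is flagged
```

A real closing process would layer many such learned rules; the point of the sketch is that a check a person once performed by eye can run in seconds over the whole month’s data.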

Tyler:                        All right, thank you. Our next question is, how do you think you can use data analytics in a pharmacy setting?

Tom Burton:        There are a lot of different ways you can use that. One way that we saw Intermountain use this: we built a predictor for how effective an antibiotic would be, and we actually included that in the workflow. We predicted, for this patient with their condition, with their particular makeup, how effective will this particular antibiotic be? We had the top five antibiotics listed that a physician might prescribe. In addition, we put the cost of each of those next to that prediction. We saw a massive … we didn’t say anything. We didn’t say, use this antibiotic. Just by presenting that data, we saw physicians change their use of a particular drug pretty significantly.

I can see lots of additional types of use of that as we think about integrating pharmacy data with clinical outcomes data being able to predict for certain types of patients, which drugs are going to be most effective and then which drugs are also going to be most cost effective. Great question. Thank you.
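The display Tom describes — predicted effectiveness ranked alongside cost, with no recommendation attached — could look something like the sketch below. The drug names and effectiveness numbers are made up stand-ins for a real model’s output; nothing here reflects Intermountain’s actual predictor.

```python
# Each candidate: (name, predicted effectiveness for this patient, cost).
# In a real system the effectiveness values would come from a model
# scored against the patient's condition and history.
candidates = [
    ("drug A", 0.91, 182.00),
    ("drug B", 0.89, 14.50),
    ("drug C", 0.84, 9.75),
    ("drug D", 0.78, 41.00),
    ("drug E", 0.71, 6.20),
    ("drug F", 0.55, 3.10),
]

# Show the top five by predicted effectiveness, cost alongside --
# presenting the data and leaving the choice to the physician.
top_five = sorted(candidates, key=lambda d: d[1], reverse=True)[:5]
for name, effectiveness, cost in top_five:
    print(f"{name}: {effectiveness:.0%} predicted effective, ${cost:.2f}")
```

The design choice worth noting is what the code does not do: it never sorts by cost or marks a “recommended” row, which matches Tom’s point that simply surfacing the numbers changed prescribing behavior.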

Tyler:                        All right, our next question. What are your suggestions for incorporating social data?

Tom Burton:        We’ve started to experiment with this a little bit, and the type of social data we find very interesting is in our care management suite, as we start thinking about how care managers can reach out and work with patients who are really high risk and high cost. Oftentimes, some of the socioeconomic data, such as whether they have transportation, what zip code they live in, and how far away they are from accessing care, can be a great predictor of which interventions are going to be most effective. For example, we may decide it’s cost effective to have an Uber pick a person up and drive them to their appointment because it’s across town.

We know that if we don’t do that, the likelihood of them showing up to their appointment is 20%. So we’re starting to incorporate more of that nonclinical data: what their buying patterns are, even from credit card data, or what their zip code demographics look like. We did a whole documentary on hot spotting, which leverages a lot of data outside the traditional EMR to help clinicians proactively go after patients that are at high risk. Great question. Thank you.
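The kind of prediction behind Tom’s Uber example — turning socioeconomic factors into a no-show risk that triggers an intervention — can be sketched as a small logistic score. The coefficients and the 0.7 threshold below are invented for illustration; a real model would be fitted to the organization’s own appointment history.

```python
import math

def no_show_risk(distance_miles, has_transportation, prior_no_shows):
    # Illustrative logistic model: the weights are made up for this
    # sketch, not fitted to real data.
    z = (-1.0
         + 0.05 * distance_miles
         + (1.2 if not has_transportation else 0.0)
         + 0.8 * prior_no_shows)
    return 1.0 / (1.0 + math.exp(-z))  # squash score into (0, 1)

# A patient far across town, no ride, and a history of missed visits.
risk = no_show_risk(distance_miles=15, has_transportation=False,
                    prior_no_shows=2)
if risk > 0.7:  # hypothetical intervention threshold
    print("high no-show risk: consider arranging transportation")
```

The useful property of a score like this is that it makes the intervention decision explicit: if arranging a ride costs less than the expected cost of the missed appointment at this risk level, the ride pays for itself.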

Tyler:                        All right, our next question. Clinicians who become managers are often not comfortable with data. Have you seen this, and what are your suggestions for overcoming it?

Tom Burton:        Yeah. We have seen a lot of this, and I think education and training are the key there. It’s really not an option to say, well, you don’t have to learn that. I think it’s absolutely critical. Now, do they have to learn SQL and build their own dashboards? No, but they need to learn how to interpret data. They need to know the difference between correlation and causation. They need to be able to read a run chart and understand upper and lower control limits. Otherwise, you get managers who start to chase noise in the data versus signal. I think there’s a significant amount of training.

Some of our partner organizations are establishing that training onsite to teach their clinicians how to think about data. You want to do it in the context of improvement if you can. Clinicians are very motivated by getting better care for their patients. If you can help them understand that really understanding some of these analytic principles is going to improve care for their patients, they’re pretty motivated to learn. Our experience is that if you give them the chance, if you set aside and protect time so they don’t feel like, well, I’m not seeing patients or I’m getting behind on my work, it becomes a really effective way to educate and lift their ability to leverage that data. Thank you for the question.
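The run-chart idea Tom mentions — upper and lower control limits separating signal from noise — can be sketched in a few lines. This is a minimal Shewhart-style illustration with invented data; real SPC charts use chart-specific limit formulas (p-charts, u-charts, and so on) rather than a bare three-sigma rule.

```python
import statistics

def control_limits(values, sigmas=3):
    # Mean +/- 3 standard deviations. Points outside the limits are
    # treated as signal worth investigating; everything inside is
    # ordinary variation (noise) a manager should not chase.
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return mean - sigmas * sd, mean + sigmas * sd

# Hypothetical weekly infection counts; the last week spikes.
weekly_infections = [4, 5, 3, 6, 4, 5, 4, 5, 3, 14]
lcl, ucl = control_limits(weekly_infections[:-1])  # limits from baseline
latest = weekly_infections[-1]
print("signal" if not (lcl <= latest <= ucl) else "noise")
```

The baseline weeks all fall comfortably inside the limits; only the final spike crosses the upper control limit, which is exactly the distinction that keeps a manager from reacting to every weekly wiggle.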

Tyler:                        All right, our next question is, what is the value of a PhD statistician on the analytics team?

Tom Burton:        Good question. I actually think that, because of some of the better tools coming out, the need for PhD-trained statisticians is becoming less and less of a requirement. Obviously, if you’re fortunate enough to have that, it’s a great asset. However, some of the tools that are now available make it so you don’t need to be a PhD. You don’t need to be an advanced statistician. Now, there definitely is some training you do need, but you should not think, I can only do this if I have a PhD statistician on my team; that’s not the case. They’re valuable, but I’d rather have five really good analytics engineers than one PhD statistician. Thanks for the question.

Tyler:                        All right, thank you. We apparently have someone else who joined us with their audio live; I can’t seem to find them on our list. We ask that everyone check to make sure they’re on mute so that we can finish up our questions. We have time for a couple more questions, Tom.

Tom Burton:        Great.

Tyler:                        Let’s see. Our next question is, how could analytics benefit a support department such as clinical engineering for maintaining medical devices?

Tom Burton:        There are some great use cases around this. There are actually some new technologies that make this pretty interesting. One that we’ve seen a couple of organizations adopt is tagging medical equipment, particularly IV pumps, with RFID so they can locate where the equipment is and cut down on practices like stashing pumps in a closet so that you have one when you need it. You can also use that data to predict when a certain piece of equipment needs maintenance or is due for replacement, and preemptively do that maintenance work or preemptively replace it.

That’s much better than having it fail in a critical situation. We’ve seen some of our most forward-thinking organizations using RFID to track their equipment, and then also using predictive models to understand which equipment needs to be maintained on what schedule and when it should be replaced.

Tyler:                        All right, we’ve got time for one more question. We actually have several questions in around training. In order for individuals to generate valuable insights, what skill sets do they need to be trained in? There are other questions about machine learning training as well: what are some resources for training individuals?

Tom Burton:        Yeah, great question. I covered this a little bit early on in the presentation. I’ll try to scroll back to this one slide here. The core set of training that needs to happen is on the technical tools and then on understanding healthcare data. Let me just give you one example of how important it is to understand healthcare data. Think about a diagnosis code. Well, there are about 12 or 13 places a diagnosis code could show up. It could show up in the problem list, it could show up in an admit record, it could show up on an order; it could show up in a whole bunch of different places.

If you’re an analyst and someone says, can you tell me how many diabetes patients got prescribed this drug? Knowing where that data is and the context of that data is absolutely critical. Knowing whether someone has diabetes or has a family history of diabetes — there’s a big difference between those two. Those are examples of really understanding healthcare data deeply. On the core technical skills, one of the things Health Catalyst offers every September, the day before our Healthcare Analytics Summit, is Health Catalyst University, which has a whole number of tracks.

We actually had five different tracks: some around data science and predictive modeling, some around telling stories with data, some around visualization principles. We have a great education team, and you don’t have to be a Health Catalyst customer to participate in that training. We obviously offer a lot of our training materials online; there’s a lot you can find on our website. That’s a great place to start. There are also some great tools out there just for learning the basics. Learning SQL is a core fundamental skill for any analyst, and it pays to keep getting better at it. Those would be my suggestions to start off. You won’t be sorry for making those investments; one of the best investments you can make is investing in your analytics people. Thank you so much for all the questions. I hope the webinar was useful. Again, thanks for participating today.
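Tom’s diabetes example — the same diagnosis code meaning very different things depending on where it appears — is exactly the kind of thing SQL training has to cover. The sketch below uses a hypothetical three-table mini-schema (table and column names invented for illustration) to show why counting from the right source matters; E11 is the real ICD-10 prefix for type 2 diabetes.

```python
import sqlite3

# Hypothetical mini-schema: the same code can sit in the problem list
# (the patient has the condition) or only in family history (a
# relative does). The analyst must know the difference before
# answering "how many diabetes patients got prescribed this drug?"
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE problem_list   (patient_id INTEGER, icd10 TEXT);
    CREATE TABLE family_history (patient_id INTEGER, icd10 TEXT);
    CREATE TABLE prescriptions  (patient_id INTEGER, drug TEXT);
    INSERT INTO problem_list   VALUES (1, 'E11.9');  -- has diabetes
    INSERT INTO family_history VALUES (2, 'E11.9');  -- relative does
    INSERT INTO prescriptions  VALUES (1, 'metformin'), (2, 'metformin');
""")

# Counting from the problem list only, patient 2 is correctly excluded
# even though the same code appears in their record.
row = conn.execute("""
    SELECT COUNT(DISTINCT p.patient_id)
    FROM problem_list p
    JOIN prescriptions rx ON rx.patient_id = p.patient_id
    WHERE p.icd10 LIKE 'E11%' AND rx.drug = 'metformin'
""").fetchone()
print(row[0])
```

Naively searching every table that contains an `icd10` column would return 2 here and overcount; that gap between the naive and the context-aware answer is the healthcare-data knowledge Tom is describing.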

Tyler:                        Thank you so much, Tom. We’d like to remind everybody that shortly after this webinar, you will receive an email with links to the recording of the webinar, the presentation slides, and the poll question summary results. Also, look for the transcript notification, which we’ll send when that’s ready. In that email, we’ll also include the link to the outcomes readiness assessment and a link to our next webinar that we alluded to as well. On behalf of Tom Burton as well as the rest of us here at Health Catalyst, thank you for joining us today. This webinar is now concluded.