Rapid Response Analytics Accelerates Analytics ROI


Eric Just:                       Thanks, Sarah. As Sarah mentioned, my name is Eric Just. I am a Senior Vice President in Product Development here at Catalyst, and I've been with the company for a long time. Before I came to Catalyst, I was at Northwestern University for almost 10 years, where I held several roles: software developer, data architect, one of the principal architects of the data warehouse there, and analytics team manager.

Eric Just:                       And a lot of the solution that you're going to see today is born from my experience at Health Catalyst, but also from my experience in the analytics space at Northwestern before I came here. So for the agenda, we're going to go over a few background slides to set some context for you. Then we'll walk through the Rapid Response Analytics Solution and talk about its components in a little more detail. Next, we will do some demos of the software, and finally, we will end with a short conclusion.

Eric Just:                       So just to set the context a little bit, this is the analytics adoption model. It was developed by Dale Sanders and Tom Burton here at Health Catalyst and later adopted by HIMSS. And it provides a very good framework for me to give some context for the Rapid Response Analytics Solution. It's designed to show the varying levels of analytic maturity for a healthcare analytics organization. And it is also designed to be a progression, such that execution at each level better prepares you to handle the next level.

Eric Just:                       The upper levels, levels five and up, are where the really big analytics ROI is. That's where we actually use analytics to reduce variation and waste, manage population health, all the way up through personalized medicine. But across the industry, we see resourcing patterns that look like this, where you've got a large pool of talent focused on the lower levels of the adoption model, levels two, three, and four, and a relatively small number of analysts focused on those really big ROI items.

Eric Just:                       So you've got to remember it's a progression: if the lower levels aren't scaling, the upper levels won't benefit. Our solution today fits squarely into Level 2, and it's designed to scale the lower levels of this adoption model so that we can see resourcing patterns that look more like this, where more analytic resources and more analytic firepower are applied to the upper levels of the analytics adoption model.

Eric Just:                       So we're going to start with a quick poll question. Overall, how well do you agree with the following statement? Analytic results in my organization are consistent: if you ask different analysts the same question, you will get the same results.

Sarah Stokes:                Right. And your answer options are completely disagree, moderately disagree, maybe you're neutral, moderately agree and completely agree. We see those votes pouring in. I will take this moment while you're voting to remind you that if you missed the beginning of the webinar, you will have access to the webinar recording and the other assets afterward. We do encourage you to ask questions throughout the presentation. Those will be addressed in a Q&A session at the end.

Sarah Stokes:                Okay. We're going to go ahead and close that poll and share the results. So it looks like 19% said they completely disagree, 32%, which is our majority here, said moderately disagree, 22% said neutral, 26% said moderately agree, and 1% said completely agree.

Eric Just:                       Thank you for taking the time to answer that poll. That's a very interesting result, and it's in agreement with the patterns that we see in the industry. Over 50% of people either moderately disagree or completely disagree that their results are consistent; you get two different answers when you ask two different analysts the same question. And a relatively small percentage, 27%, either moderately agree or completely agree.

Eric Just:                       So that is among other trends that we've observed in the industry around scaling analytics. Another trend we've observed is that analysts and IT staff are just stressed with work; no amount of resources seems able to quell the demand, and they're constantly underwater. Organizations struggle with consistency in analytic results, the point that was just validated through the poll, and analytics is increasingly seen as an underperforming investment with questionable ROI.

Eric Just:                       And if you go back to that adoption model, where we saw a large amount of resources focused on the lower-ROI levels, this seems to make sense. And there's increasing scrutiny about whether the late binding model is scalable enough. So underlying many of these issues is a simple fact: at most organizations, the analytics process still takes too long. Development time is lost to repetitive tasks, inefficient communication, and using specialized resources for simple things.

Eric Just:                       So every time you ask an analyst to integrate diagnoses from multiple data sources, every time you ask an analyst to identify which encounters were inpatient encounters, every time you ask an analyst to calculate length of stay, every time you ask an analyst to identify heart failure diagnosis codes, and to repeat these for every project they're doing, these are all valuable tasks, but it really amounts to asking them to reinvent the wheel.

Eric Just:                       On the right-hand side, you can see some examples of patents for new wheel designs throughout the 20th century, all for the internal suspension wheel that was so elusive during that period. None of these designs were commercially successful. So asking an analyst to reinvent the wheel is not the best use of anybody's time.

Eric Just:                       I’m going to ask another poll question. The amount of analyst time at my institution that is lost due to performing repetitive tasks that could be automated.

Sarah Stokes:                And your options are 0% to 20%, 21% to 40%, 41% to 60%, 61% to 80%, and 81% to 100%. We'll give you just a moment to respond there. That's one you've got to think about a little bit. Okay. Votes are still coming in. All right. We're going to go ahead and close that poll and share the results. So 2% reported 0% to 20%, 31% reported 21% to 40%, 42% reported 41% to 60%, 23% reported 61% to 80%, and only 2% reported 81% to 100%.

Eric Just:                       That is interesting as well, and no matter how you slice it, the vast majority of people, 67%, are reporting that over 40% of analyst time is lost to repetitive tasks. And when you ask analysts where they spend their time, you often get answers like 25%, maybe 15%, of their time actually doing analytics, because the rest of their time they're performing repetitive tasks. So thank you again for filling out that poll and for helping to underscore that point for us.

Eric Just:                       Another thing that slows analytics down is inefficient communication cycles. An analyst and a subject matter expert need to work hand in hand to get good data out of the system, and oftentimes the communication pattern between the subject matter expert and the analyst is very disjointed. It takes place over email, it's asynchronous, and an analyst may have to wait days or even a week for a response to a simple question. Those inefficient communication cycles equal lost time.

Eric Just:                       I like to think of it like an air traffic control tower talking an inexperienced pilot through landing a plane. The tower is talking over the speaker while the pilot is looking for the controls, trying to figure out what to do based on what the tower is saying. This is an actual photo of a plane that was landed by a passenger after the pilot suffered a heart attack. The plane did get landed, but oftentimes there are better, more efficient ways to communicate.

Eric Just:                       Another thing that makes analytics take so long is using specialized resources for simple things. Think about questions like: How many diabetics did we see last year? Which patients received valsartan in the last six months? How many patients were diagnosed with scleroderma? How many patients with low back pain are also being prescribed opioids? Analysts are highly trained, and these questions are, or should be, fairly simple to answer.

Eric Just:                       So asking a highly trained analyst questions like this is like asking a surgeon to administer a Band-Aid. There are just better uses of the analyst's time. So how do we scale? The truth is that this kind of request is clogging the backlog at many healthcare organizations. When I was an analytics manager, these were exactly the kinds of questions that were high volume; they took a fair amount of time, and it was very difficult to dig through all of them.

Eric Just:                       So in order to understand how to scale the analytics development process, I think it's good to review the process itself, and then we'll start talking about how the Rapid Response Analytics Solution fits in. Most analytic use cases today start with a population definition; that includes financial reporting, population health, and clinical analytics. And the process of generating those analytics starts with data standardization. An analyst has to have a good place to start, with conformed data structures and terminology mapped to those data structures. Otherwise, the analyst ends up doing all of that work themselves every time they get a request.
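
To illustrate that standardization step, here is a minimal SQL sketch of terminology mapping: local codes from each source system are translated to a standard vocabulary through a mapping table. All table and column names here are hypothetical, for illustration only, not an actual DOS schema.

```sql
-- Hypothetical terminology mapping: translate each source system's local
-- diagnosis codes to a standard vocabulary via a mapping table, so that
-- downstream queries only ever see one code system.
SELECT src.encounter_id,
       src.local_code,
       map.standard_code          -- e.g., an ICD-10 code
FROM source_diagnosis AS src
JOIN terminology_map AS map
  ON map.source_system = src.source_system
 AND map.local_code    = src.local_code;
```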

Eric Just:                       Then there is an ad hoc analysis phase where the subject matter expert and the analyst start to iterate on the population definition. There's usually a first attempt that isn't perfect, and then the subject matter expert may start to ask the analyst questions like, how many patients do we get if I add this diagnosis code, or what if I add this exclusion criterion? This process of refinement is where that lost communication time can really happen, but it's extremely important and critical to the end result.

Eric Just:                       The next part is nailing down the criteria, the inclusion and exclusion criteria for the cohort. After the ad hoc analysis phase, that results in a final cohort definition and a way to execute it in your analytics environment. The next step is collecting data about the population and linking data to that population. And then the very last step is about algorithms and insights, and this is where the real analysis happens.
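
To make the cohort step concrete, here is a minimal SQL sketch of what an inclusion/exclusion definition might look like once data is standardized. The table and column names (patient, diagnosis, hospice_enrollment, and so on) are illustrative assumptions, not an actual DOS Mart schema.

```sql
-- Hypothetical cohort definition: adult heart failure patients,
-- excluding anyone enrolled in hospice.
-- Table and column names are illustrative only.
SELECT DISTINCT p.patient_id
FROM patient AS p
JOIN diagnosis AS d
  ON d.patient_id = p.patient_id
WHERE d.diagnosis_code LIKE 'I50%'   -- inclusion: heart failure (ICD-10)
  AND p.age_years >= 18              -- inclusion: adults only
  AND p.patient_id NOT IN (          -- exclusion: hospice enrollment
      SELECT h.patient_id FROM hospice_enrollment AS h
  );
```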

Eric Just:                       So all of these steps need to be in place for an analytics project to be developed, and the Rapid Response Analytics Solution is designed to address the first four parts of this process: have technology help scale those, and let the analyst focus on the last part, which is what they're trained to do.

Eric Just:                       We believe the Rapid Response Analytics Solution can reduce development time by up to 90%, because the analyst spends less time standardizing data, less time rewriting common definitions, less time in inefficient communication cycles, and less time answering simple questions. That means more time on analytics, and we'll walk through how we address each of these points in the rest of today's presentation.

Eric Just:                       First, let's do a high-level overview of the solution. The Rapid Response Analytics Solution, which we say reduces your analytics development time by up to 90%, starts with the foundation of the data operating system (DOS). This isn't part of the solution itself, but I need to mention it because it's a very important and critical piece. The data operating system is where we take all these different data sources, even multiple EHR sources, multiple claims sources, financial data sources, cost, and bring them into the platform very quickly. They're co-located and available for analysts to start querying in very short order.

Eric Just:                       And this was a very early value proposition for Health Catalyst; we've done this since our founding in 2008. We get data warehouses and the data operating system stood up very quickly. But the conversation today is about analytics: how do I take the data that's in the data operating system and make it more efficient for my analysts to get answers? That's where the Rapid Response Analytics Solution comes in.

Eric Just:                       It's an analytics infrastructure built into DOS to make analytics more efficient, and it starts with what we call DOS Marts. DOS Marts provide a curated, reusable, customizable layer of data content, business logic, and algorithms. The solution also includes a tool called Population Builder, a visual building tool that enables rapid authoring and sharing of definitions, such that you don't need advanced SQL or database querying skills to get answers from the platform. And it allows you to share those definitions; reuse is a recurring theme in the Rapid Response Analytics Solution.

Eric Just:                       So let's talk about DOS Marts in a little more detail. DOS Marts provide that reusable, curated content, and when I say they leverage decades of experience, I mean the combined decades of the people who have been at Health Catalyst. We've been a company for over 10 years now. We have many people who have been with us for a long time working on behalf of our clients, and a lot of individuals who worked in healthcare for many, many years, even before they joined Health Catalyst.

Eric Just:                       All of that experience goes into forming the DOS Mart Suite. These are data models, and the importance of the experience is not only in defining what's in each model but also what's not in it. So what's in the model? We really focus on the high-value data elements, the things that are reused over and over again: integrated EHR data and simple definitions, like length of stay, that you tend to use repeatedly. That's what's in the DOS Marts.
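
As a flavor of what one of those reusable definitions looks like, here is a minimal sketch of a length-of-stay calculation in SQL Server-style SQL. In a DOS Mart this logic would be defined once, centrally, so every analyst inherits the same answer; the table and column names here are assumptions for illustration.

```sql
-- Hypothetical centralized length-of-stay definition.
-- Defined once in the mart rather than rewritten by every analyst;
-- table and column names are illustrative only.
SELECT
    encounter_id,
    admit_datetime,
    discharge_datetime,
    DATEDIFF(day, admit_datetime, discharge_datetime) AS length_of_stay_days
FROM encounter
WHERE encounter_type = 'Inpatient';
```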

Eric Just:                       What's not in a DOS Mart is every data point under the sun that you're ever going to need. We still believe in a late binding approach for certain data elements, so really it's a balance between modeling data and late binding. Because we provide this as part of the DOS Mart Suite, analysts don't need to reinvent the wheel for these data definitions. They don't need to manually integrate data from multiple EHRs, and they don't need to define those common metrics.

Eric Just:                       In fact, it's brought in for them. This is a picture of an actual implementation of the internal suspension wheel, called the loop wheel. I'll talk a little more about it later, but I thought it adequately represented what I'm trying to say here: you don't need to reinvent the wheel, because it's already been reinvented, and we can bring that as part of the platform.

Eric Just:                       When you use the DOS Mart Suite, the results are more consistent, and this goes back to that poll question, right? How do we get our analysts singing the same tune? Give them a framework for reuse, a framework to reuse value that's already been created in the platform. That frees analysts to focus on creating value-added business analytics instead of writing complex SQL across multiple data sources.

Eric Just:                       As I mentioned before, this relates to our architectural strategy. When we came to market over 10 years ago, the enterprise data model was the prevailing data warehousing technique for healthcare, and we've seen it fail time and time again. This is the idea that you create one big data model, and every piece of data in your organization has to get mapped into that model before it can be used. That does not work; it's a failing strategy.

Eric Just:                       When we came to market, we had this late binding approach that said, "Hey, don't worry about creating that big data model. Don't bind definitions until the business use case requires it." That gets you to value much quicker, right? You can start using the data before you've gone through a complex modeling process. But over time, it can also lead to a proliferation of data objects and to data consistency issues if different analysts are late binding in different ways.

Eric Just:                       So our balanced approach is to continue to preserve that atomic detail in source marts, also known as data lakes, but provide standardized, reusable data marts that focus on the high-value data points and calculations. That's what we call DOS Marts, and we take a Pareto approach here: model the 20% of the data that's going to drive 80% of the downstream use cases.

Eric Just:                       One thing we've learned over many years is that extensibility and customizability are very important to our customers. So when we bring predefined data content into our platform, we need to make sure the client can add their own definitions and even modify the definitions that come with the platform. We designed DOS Marts to be extensible, allowing those local definitions to be created while still maintaining upgradeability and maintainability by Health Catalyst. It is a product with product managers, we release upgrades, and we look at how every client extends the marts, folding that back in to continually improve the product.

Eric Just:                       There are various flavors of DOS Marts. Level 0 DOS Marts are also called core DOS Marts; terminology and person are good examples. Terminology contains all of the value sets and terminologies that we use for mapping, and person represents the patients in a system. Every time our platform is installed, these two data marts come with it.

Eric Just:                       Level 1 DOS Marts are in the data domain. This is where we start to do things like standardizing and combining EHR data from multiple EHRs into a single data model, standardizing claims sources into a single data model, plus cost and patient satisfaction. This level of data mart is where you start to see the value of being able to quickly query across different systems. If I have three different EHRs, I can write one query against my clinical DOS Mart, and the way that data is integrated is going to be the same for every analyst.
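
For instance, a query like the following sketch would count heart failure encounters regardless of which of the three EHRs the data came from, because the Level 1 mart has already conformed them into one model. The mart, table, and column names are hypothetical, not the actual DOS Mart schema.

```sql
-- Hypothetical query against a standardized clinical DOS Mart.
-- The source_system column shows the data originated in multiple EHRs,
-- but the analyst writes one query against one conformed model.
SELECT e.source_system,
       COUNT(*) AS heart_failure_encounters
FROM clinical_mart.encounter AS e
JOIN clinical_mart.diagnosis AS d
  ON d.encounter_id = e.encounter_id
WHERE d.diagnosis_code LIKE 'I50%'   -- heart failure (ICD-10)
GROUP BY e.source_system;
```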

Eric Just:                       It also allows you to more easily query across DOS Marts; combining clinical, cost, and patient satisfaction data is all part of the value of these Level 1 DOS Marts. They also provide a standardized layer that makes it easier to generate what we call Level 2 DOS Marts, an emerging category for us. We spent a lot of time over the last year or two developing and deploying the Level 1 DOS Marts, and now we're starting to reap the rewards by developing things like our readmissions DOS Mart, which will be out later this spring and is based entirely on Level 1 content.

Eric Just:                       Because our readmissions data mart is based on Level 1 content, it takes only five hours to install, versus 60 hours for an equivalent backend to our readmissions explorer application. That's 92% less time installing and configuring this data mart. So expect to see more and more content delivered on top of DOS Marts from Health Catalyst.

Eric Just:                       And that alone is not the full justification for the Level 1 DOS Marts. Every time you write a metric, build a subject area data mart, or do an ad hoc analysis without doing manual data integration or going all the way back to the source data, you're getting value from them. I almost think of it like compound interest: the time it takes to implement the Level 1 DOS Marts is your principal investment, and every Level 2 DOS Mart or metric you create on top of them is the compound interest.

Eric Just:                       The more you build, the more time you save, and the more time your analysts can spend focusing on the upper levels of the analytics adoption model. We do have a success story with DOS Marts. Our client Orlando Health had a previous enterprise data model solution; they installed the Catalyst solution to set up their data warehouse very quickly, and they installed DOS Marts as part of that installation.

Eric Just:                       One of the things they noted, in the second-to-last square here, is a 95% reduction in work hours required to incorporate system enhancements. With their enterprise data model, it took them weeks or months to make changes to the data model. Because of the extensibility framework I mentioned and DOS Marts, it took them 95% less time to make the same enhancement within the Catalyst data operating system. That's a very nice validation of the DOS Mart approach to customization.

Eric Just:                       The other thing they noted was that 80% to 90% of their downstream reporting is coming from DOS Marts, which validates the Pareto principle I mentioned before: 20% of the data driving 80% of the downstream use cases. So going back to the solution, we'll talk now about Population Builder. Population Builder is the first Catalyst tool that demonstrates the benefits of the DOS Mart Suite; out of the box, it is designed to work on DOS Marts.

Eric Just:                       It's an easy-to-use visual analytics tool that enables users to rapidly develop, analyze, and visualize populations up to 90% faster. It brings reusable content and authoring to non-technical users, so somebody who doesn't know a database query language can build their own populations, and we'll do a demonstration of what that looks like in just a bit. It also allows those centrally defined populations to be published and governed, which is a big value add, and we'll demonstrate that capability as well.

Eric Just:                       Just a note before I go into the demos: the idea that DOS Marts are extensible applies to Population Builder as well. It's designed to be installed very quickly over DOS Marts, but it's not limited to them. If there's content you want to add to Population Builder to define your population that doesn't come as part of a DOS Mart, the whole workflow of adding it is supported, and I'll talk a little more about that in a bit.

Eric Just:                       Population Builder promotes reusability. It puts hundreds of publicly available definitions at your fingertips. The definitions come in the form of value sets; for example, when CMS has a value set that defines what it means to be a diabetic, that's available and exposed through the Population Builder tool. There's also the published registry: if I create a registry of patients, I can publish it into the DOS platform, and it's available throughout the analytics environment. It doesn't just live in the Population Builder tool.

Eric Just:                       You can write dashboards that query that population. So if you're using QlikView or Tableau, that definition of diabetes can be the same for every analyst across your analytic environment. Similarly, if they're using Catalyst tools like Leading Wisely, it's available for use in those tools as well. And eventually, it will be exposed through APIs as our vision of DOS as an application development framework continues.
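
As a sketch of what that reuse might look like from a BI tool, the query below counts diabetic patients seen per month against a published registry rather than re-deriving the definition. The published_population table and its columns are hypothetical stand-ins for however DOS exposes a published registry.

```sql
-- Hypothetical dashboard query against a centrally published registry.
-- The diabetes definition lives in the registry, not in this query,
-- so every dashboard that joins to it sees the same population.
SELECT YEAR(e.encounter_date)  AS encounter_year,
       MONTH(e.encounter_date) AS encounter_month,
       COUNT(DISTINCT e.patient_id) AS diabetic_patients_seen
FROM published_population AS pop
JOIN encounter AS e
  ON e.patient_id = pop.patient_id
WHERE pop.population_name = 'Diabetes Registry'
GROUP BY YEAR(e.encounter_date), MONTH(e.encounter_date);
```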

Eric Just:                       Population Builder improves collaboration between the subject matter expert and the analyst. Remember the analogy of the air traffic controller talking to the pilot? I think of Population Builder as providing more of a copiloting experience, where the analyst and the subject matter expert work together closely to define the population. Population Builder is also about getting data directly to consumers, right? You don't need to call a surgeon to administer a Band-Aid, and you don't need to call an analyst to answer, how many diabetics did we see last year? That should be easy to get at.

Eric Just:                       That ad hoc analysis capability puts power in the hands of the consumers instead of relying on overwhelmed analysts, and you'll see this experience in the demos. We have a few success stories before we get into the demo. This one is from The Queen's Health Systems. They implemented Population Builder when they had a grant deadline they needed to respond to, and to apply for the grant, they needed to identify undiagnosed diabetics.

Eric Just:                       They were able to use Population Builder to get the answer to that question in real time during the team's standing one-hour meeting, and they said it probably would have taken a couple of weeks to get that answer in their existing system. So when I do the math: a couple of weeks at 40 hours a week is 80 hours, and the answer was obtained in less than one hour using Population Builder. That is an 80X time savings for that question. Think of the scale your analytics team could have if questions that take weeks could be answered in minutes.

Eric Just:                       Here's another piece of feedback from a client. This is an exact quote that I have to keep anonymous; their marketing team did approve a quote, but it was missing some of the user's exact words, and I thought those words were very valuable, so I wanted to share them. This anonymous user said, "We saw the Population Builder application at HAS and knew we had to have it." HAS is our annual summit; they saw the tool back in September, appreciated the user interface, and knew they had to have it.

Eric Just:                       He said, "We field numerous requests for ad hoc reports for things like research and knew it would be a great tool to gather patient populations around specific illnesses." Here, he's validating that these simple requests tend to fill up a backlog; they have a lot of them, and he would like a way to deliver results for those types of requests more quickly.

Eric Just:                       He said, "We worked with the Pop Builder team to get the software configured and we are up and running." That implies the software was not complicated or complex to install, and it isn't; they were up and running very quickly. Then he said, "Our first test was to find patients who had been diagnosed with endocarditis. We used the tool and in under five minutes, we had our patient population and compared that to our existing tool's output. To create the same output in our current tool took a couple of hours but we got the same patient population."

Eric Just:                       So we see that multiplier again. The last thing he said was, "We love the efficiency of the Pop Builder tool." Going from a couple of hours to five minutes is a 24X efficiency gain. So we're starting to see where those multipliers, and that 90% savings in analyst development time, come from; that story is starting to emerge with how quickly these questions can be answered.

Eric Just:                       I'm going to end with a brief conclusion and then open it up for questions. Let me go back to our story about the wheel. We talked frequently about reinventing the wheel, and we showed that cool reinvented wheel; I want to tell its story for a minute. The inventor of that wheel saw a mother struggling to get her baby carriage over a curb, and he said, "There's got to be a better way to do the wheel so that that is easier." He drew this pencil sketch of what he called an internal suspension wheel.

Eric Just:                       This was just an idea, so he did some research. He found that patents had been filed for internal suspension wheels and that none were successful designs, so he did a lot of learning about what was wrong with them and how he could improve on the design, and a lot of thinking. He was thinking about materials he could use that maybe weren't available before, and he noticed somebody using a carbon composite bow and arrow.

Eric Just:                       And he said, "Well, that might be what we're looking for for the internal suspension of the wheel," and it was. He worked with a bow and arrow manufacturer to produce this version of the internal suspension wheel. Those are carbon fiber supports in the middle that make the wheel springier and easier to move over curbs. This model was actually for those little foldable bikes.

Eric Just:                       He wasn't able to get the production cost down, so the wheel ended up costing almost as much as the bike, and he went back to the drawing board. He found there was a market for better wheels for wheelchairs, to make the ride more comfortable and to help people in wheelchairs get to places that may have been harder for them to reach before. I love the story because it's about determination, but it's also about time: the time it took to reinvent the wheel.

Eric Just:                       It took him 10 years to get from that pencil sketch to the original version of the loop wheel, and he was specialized in it; this is what he was focused on day in and day out. I think the moral of the story is that reinventing the wheel is really hard. Don't make your analysts reinvent the wheel every day, because their time is valuable. We've specialized in this field for more than a decade, and our specialization has yielded a tool set that promotes reuse and prevents that everyday reinvention of the wheel.

Eric Just:                       The Rapid Response Analytics Solution provides a framework for reusing data content from both inside and outside your organization. That shows up when you use a data element from a DOS Mart, when you create a custom extension to a DOS Mart, and when you use value sets or populations generated with Population Builder. These are all ways you can keep your analysts from having to reinvent the wheel and let them focus on creating the ROI that we know is possible with data analytics.