The Power of Data: Igniting Meaningful, Scalable, and Sustainable Change
The Power of Data: Igniting Meaningful, Scalable, and Sustainable Change (transcript)
John L. Haughom
[John L. Haughom, MD]
Thank you, Tyler. It is my pleasure to once again be with all of you today. Based on the first three webinars in the series, I believe it’s safe to say that we have a pretty good sense of what is good about our system of care delivery, as well as the growing challenges that must be addressed. So what’s next? Well, now it’s time to roll up our sleeves to better understand and explore what it will take for healthcare organizations and individuals to lay the foundation for future success. The present state is not acceptable or sustainable. It is time for change, and change has begun. Thanks to some leading-edge healthcare organizations who are pioneering the way, I believe we can get a pretty good idea of what the future will look like and what it will take to get there. Over this decade, there will be an increasing emphasis on value production and effective population management. Individuals and organizations that want to assure their success will need to do what is necessary to prepare a solid foundation for the future. The next several webinars will take a detailed look at the key elements of this foundation. Based on years of experience in this game, I can assure you that the good news is that this will be a fun and rewarding experience for those who can see beyond the challenges and envision a new future for healthcare. We have the necessary skills and the technology solutions. The only real question is whether we have the will.
So let’s get started.
Healthcare: The Way It Should Be
As I mentioned in prior webinars, this webinar series is based on a book I am writing on healthcare transformation, entitled, ‘Healthcare: A Better Way. The New Era of Opportunity’. You can freely download part 1 and part 2 of the book at the book website. The final two chapters will be available in June. I am very excited to tell you today that parts 1 and 2 are now available in combined form, and if you go to the link just at the bottom of this slide, you can get the consolidated version of parts 1 and 2 of the book. If you have not registered, you can do so at this link. For those who have registered, we will provide you information about how to get access to the books, so you won’t have to re-register. At this point, we’re starting a deep dive into part 2 of the book, which focuses on the key elements of a foundation that will prepare healthcare providers for the future. This is the heart of the book and the core of its most important messages. The next several webinars will cover the important information in part 2.
As we discussed previously, a lot of healthcare’s present challenges are driven by massive complexity, complexity that is overwhelming a system of care delivery that is simply not equipped to handle it. Modern healthcare enjoys one of the most intelligent, educated, and committed workforces of any industry in the world. However, they try to deliver care using a delivery system that was developed over 100 years ago. While that system has carried us a long way, it is not designed to deal with the complexity that characterizes the modern era. While maintaining the best from the past, we now need to build a new, more modern care delivery system that is better equipped to deal with complexity. This new system of care delivery will be founded on the latest, most modern management system theories, which were specifically developed to deal with the growing organizational complexity that faces organizations all over the world, whether they’re in healthcare or outside of it. These systems have been hugely successful in other industries, and leading-edge healthcare organizations are already demonstrating that the same approaches can better manage complexity, increase reliability, drive out waste, improve quality, and reduce the incidence of harm.
Part 2 of the book is focused on helping individuals and organizations understand what they need to do to emulate these highly successful healthcare improvement pioneers. We must educate and engage both clinicians and operational leaders in these powerful new ideas. As bright and well educated as clinicians and others in healthcare are, you cannot assume they understand these key elements of change. However, every single time that I have walked good clinicians or operational leaders through this process, the vast majority of them not only get engaged, they tap into their strong desire to be the best they can be and we end up having to work to keep up with them.
Poll Question #1
On a scale of 1-5, how would you rate your organization’s ability to manage complexity?
Before we dive into today’s presentation, let’s pose the first poll question. On a scale of 1 to 5, how would you rate your organization’s ability to manage complexity? With A being 5 – excellent, and E being 1 – poor. I’ll wait for responses.
Alright. We have that poll live now for everyone. As you’re filling out that poll, I would like to let everybody know I have taken the link to Dr. Haughom’s ebook and placed it into chat. So if you would like to go and download it right now, you can follow that link in chat to get the ebook he’s talking about. We’ll leave the poll up for just a few more seconds.
Okay. We’re closing the poll right now. And Dr. Haughom, here are the results. It looks like 8% with a 5 – excellent, 26% with a 4, 45% with a 3, 17% with a 2, and 4% with a 1.
[John L. Haughom, MD]
Well these are very interesting findings. I am encouraged to see that 45% put themselves at a 3 and another 34% higher than that. That actually represents a lot of good progress compared to 5, 7, or 8 years ago. So that’s very encouraging news, very encouraging news. Thank you for your responses.
Implementing an Effective System of Production in Healthcare
Given the complexity of managing meaningful change in healthcare organizations, particularly in these turbulent times, I believe all healthcare organizations would benefit enormously from developing a systematic approach to improvement – that is, a systematic approach to managing complexity and change. It is useful to have a framework to discuss improvement, a framework that can guide discussions and help healthcare organizations craft a thoughtful and comprehensive strategy for a new way of delivering care that is safer, higher quality, and more reliable for the patients served.
In my experience, such an effective framework is illustrated on this slide. It divides the challenge into three critically important components that in combination can ignite sustainable, meaningful, and scalable change. The three systems are an analytic system, a deployment system, and a content system. In my experience, achieving scalable and sustainable outcomes requires effectively implementing each of these three systems in combination with each other.
The first system, the analytic system, is where an organization unlocks its data and standardizes the way it measures things – that is where the enterprise data warehouse or EDW and data visualization tools reside. We will be discussing the analytic system in much greater detail today and during my next webinar, so I will not comment on it further at this point.
The second system, the deployment system, involves standardizing organizational work. To improve deployment systems, an organization needs to start by organizing permanent teams to take ownership of the quality, cost, and patient satisfaction associated with care delivery. The organization also needs to organize team structures, provide training on roles, allow teams to design their own solutions, and make sure improvements are implemented consistently and broadly across an organization.
Encouraging physicians and nurses to design new ways of doing things creates a sense of ownership in the solution that they come up with. Furthermore, organizations need to integrate lean process improvements with measurement systems so they can have immediate automated feedback on performance improvements. All of these activities help organizations improve and sustain their improvement (09:38).
The third system, the content system, involves standardizing medical knowledge work. Even when a new study comes out and identifies best practices, it is widely known that it can take 17 to 20 years for physicians to integrate the new knowledge into everyday practice. This is obviously not acceptable. By standardizing knowledge assets, such as order sets, intervention criteria, value-stream maps and patient safety protocols, an organization can improve the speed at which new medical knowledge becomes everyday practice. This process includes a consistent, standard method for gathering evidence, evaluating that evidence, and integrating it into the care delivery process.
We will cover each of these three systems in some detail over the next several webinars.
Analytic System Components
So now let’s turn our attention to the first of the three systems, the analytic system. If you’re going to continuously improve, it is critically important for you to unlock your data and develop an effective measurement system. By the end of today’s webinar and the next one, you should be able to describe different data models, including their strengths and weaknesses in healthcare, know how to use Pareto analysis to prioritize your improvement opportunities, and discover patterns in the data to ignite meaningful, scalable and sustainable change.
Healthcare is a very data intense industry. Clinicians cannot deliver and sustain high quality safe care without accurate information that is readily available to them. Measurement is the basis for assessing and sustaining potential improvements in healthcare quality. In order to know whether a change is an improvement, an analytic system is absolutely essential. Key performance measures allow improvement teams to assess care against past performance, as well as against evidence-based clinical guidelines and nationally recognized standards.
As Lord Kelvin was once known to say, “If you cannot measure it, you cannot improve it.” In an improvement effort, you always need some form of objective measure to demonstrate how well things are working. Analytics have to do with how we make data accessible for use, how we use data, how we measure work, how we prioritize improvement opportunities, and how we monitor improvement efforts over time.
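To make the idea of prioritizing improvement opportunities concrete, here is a minimal Pareto-analysis sketch in Python. This is not from the webinar; the conditions and case counts are purely hypothetical:

```python
# A minimal sketch of a Pareto analysis for prioritizing improvement work.
# The conditions and case counts below are hypothetical, for illustration only.
cases_by_condition = {
    "heart failure": 480,
    "sepsis": 350,
    "pneumonia": 210,
    "asthma": 90,
    "migraine": 40,
}

total = sum(cases_by_condition.values())
ranked = sorted(cases_by_condition.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranked list until the cumulative share reaches 80%;
# those conditions are the "vital few" to target first.
cumulative = 0
priorities = []
for condition, count in ranked:
    if cumulative / total >= 0.8:
        break
    priorities.append(condition)
    cumulative += count

print(priorities)  # the conditions accounting for ~80% of the volume
```

The "vital few" conditions that account for roughly 80% of the volume are typically where improvement resources earn the greatest return.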
The three components of an analytic system are shown in this slide. First, an organization needs to effectively unlock its data. Second, an organization needs to broadly distribute the data to individuals across the organization in as close to real time as possible and teach them how to access and use the data. This is so-called self-serve analytics. This is in sharp contrast to the traditional report queue mentality, where one would request a report and wait days to weeks for a report to be built that may or may not meet their needs. And third, improvement teams need to discover patterns in the data so they can target areas for improvement and ignite meaningful and sustainable change. We’ll discuss each of these components in turn, starting with unlocking the data.
However, I first want to make some comments about the appropriate use of data in improvement.
Using Data Appropriately
Few would argue that data is necessary to drive improvement. However, it is equally important to understand at the outset of an improvement initiative how data should be used to optimize the likelihood that clinicians will engage in improvement efforts. New knowledge and a migration to a profession-based model of care require a move from the traditional judgment-based model to a learning-based model.
A judgment-based approach focuses on the person, while a learning-based model focuses on continuous improvement. A judgment-based approach tends to make most people defensive and creates resistance to learning. Therefore, it will likely impede continuous improvement. Based on the philosophy that the best defense is a good offense, the accused under judgment will often counterattack in an attempt to shift the blame elsewhere. In an attempt to kill the messenger, they may challenge the veracity of the accuser, the validity of the analytic system, the accuracy of the data, the legitimacy of the analytical methods, and the accuracy of the evaluation itself. They will also often question the competence and motives of those conducting the assessment.
This is an example of the Cycle of Fear described by Scherkenbach and illustrated in this slide. The behaviors described in the Cycle of Fear persist because the majority of situations where errors occur are in fact the result of a flawed system rather than a failure by an individual; that is, the accused often has a right to be defensive because it is genuinely not their issue.
Using Data: Learning vs. Accountability
In 2003, Dr. Brent James at Intermountain Healthcare, Dr. Don Berwick, who founded the Institute for Healthcare Improvement, and Dr. Molly Coye, who is currently chief innovation officer at UCLA, defined two ways of using data to get results. As illustrated in this slide, you can use data to hold people accountable or to measure improvement and encourage learning. Every organization needs to gather some data that encourages accountability, but the overall focus should be on learning, not on accountability or judgment. The focus you choose will in fact determine what you do to improve your numbers.
Deming identified three ways to get a better number. The first, and the most successful, is to improve the system. To do this, you have to improve your processes in an honest effort to add value at the frontline. The second is to suboptimize: you focus on improving the area being measured, often at the expense of other areas. The third is to game the numbers: you manipulate the data to make the numbers look better. In healthcare, this is often accomplished by eliminating troublesome populations from the cohort of patients.
A learning approach focuses on the process and the system. This is a bottom-up approach centered on the idea that people can study a flawed process and improve it over time. We need clinicians to embrace continuous improvement. Implementing a learning model can therefore be viewed as a very powerful engagement strategy for clinicians. A profession-based model allows and encourages people to continuously improve and learn. It involves them in the decisions. Organizations that focus on learning are more likely to improve their processes and systems. Organizations that focus on accountability are more likely to suboptimize or game the numbers.
The exciting news is this – if an organization effectively implements the three systems, every care environment can effectively become a learning laboratory. This would be better for patients, it would make care more fun and it would be very empowering for clinicians.
Enterprise data model
Now back to the discussion about the analytical infrastructure required for future success. A Healthcare Enterprise Data Warehouse, or EDW, is the core of an analytical infrastructure. Given the complexity and quantity of healthcare data, it is important that the data be readily accessible electronically and that the design of the EDW is maximally adaptable to support the very dynamic and unique nature of the healthcare environment. EDWs are described using conceptual data models. Over time, different data models have been developed to meet various analytical requirements in industries around the world. A data model can be thought of as a diagram or a flow chart that illustrates the relationships between data and data sources. The data model serves as the basis that IT professionals can then use to create the physical data model. The characteristics of the data model matter because of the complex and dynamic nature of healthcare data and the dynamic nature of the healthcare environment. Various types of data models and how they relate to healthcare will be described in the next few slides.
There are several approaches to unlocking data in healthcare using different data models. One approach is the Enterprise Data Model, which is pictured in this slide. In this model, an organization creates a perfect model of the data, representing all the concepts and relationships that need to be defined.
Enterprise Data Model
They then map the different source transactional systems into that model as now demonstrated on this slide. This model works well in some industries such as banking and retail that have minimal variability in their data and where concepts and definitions are relatively static. Unfortunately, this highly organized model cannot be delivered incrementally. It takes a long time to create it and it could be very expensive to build and maintain. The Extract, Transform, and Load or ETL routines used to move data into this model are very complex. Finally, because of the characteristics of healthcare data, including constant evidence-based care updates, you have to continuously redesign this model to make the new data fit. Some healthcare systems have spent years on this approach and still have not been able to move any data into the model. This model has had limited success in healthcare, although it has been very successful in other industries.
Dimensional Data Model
This slide illustrates another approach to unlocking data, the Dimensional Data Model. With this model, an organization builds an analytic data mart for a particular condition, such as heart failure, diabetes, or asthma. It then gathers the data it needs directly from the source systems and maps it to different areas. This model is easy to start. However, it grows very quickly, as do the data streams, until several redundant streams exist. This creates a challenge for those trying to maintain the model. If one underlying source system changes, they have to change each extraction routine that uses that particular source. Additionally, it often doesn’t have underlying patient-level detail. If a metric in a summary mart is unfavorable, you are unable to drill down to the patient level to determine the reasons why.
Late-Binding ™ Data Warehouse
A new approach to data modeling that Health Catalyst uses to address healthcare’s unique data needs is the Late-Binding ™ Data Warehouse, illustrated on this slide. The advantages of the Late-Binding ™ approach are that it is generally faster to launch, it is easier and less expensive to maintain, and, most importantly, it provides maximum adaptability and flexibility for clinicians who are involved in improving care in the highly dynamic healthcare environment.
In the Late-Binding ™ model, one brings data into the data warehouse in a raw format that keeps the same structure and feel of the underlying transaction system. This quick copy can be done in a few weeks unlike the Enterprise Model which can take years to develop. The structure stays the same, which enables analysts familiar with the transaction systems to recognize the data structure of the warehouse. Naming and data type standards are applied to make it easier for analysis but minimal transformation occurs.
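As a rough sketch of this "quick copy" step, the fragment below copies raw source rows into warehouse rows while applying only a naming standard. The source fields and the naming map are hypothetical, not from any actual Health Catalyst implementation:

```python
# Sketch of the "quick copy" load step: raw source rows are copied into
# the warehouse keeping their original structure, applying only naming
# standards. Source field names and the naming map are hypothetical.
source_rows = [
    {"PAT_ID": "A100", "ADM_DT": "2024-01-01"},
    {"PAT_ID": "A200", "ADM_DT": "2024-01-02"},
]

# Only column names are standardized; values and structure pass through
# unchanged, so analysts still recognize the source system's layout.
NAMING_STANDARD = {"PAT_ID": "PatientID", "ADM_DT": "AdmitDTS"}

warehouse_rows = [
    {NAMING_STANDARD[col]: value for col, value in row.items()}
    for row in source_rows
]

print(warehouse_rows[0])
```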
In the Late-Binding ™ platform, one can connect disparate data with common linkable vocabulary. For example, identifiers for patients, providers, and facilities can be linked across different data source systems such as electronic health records, financial systems, claims systems, etc. And one patient can be viewed across the entire system.
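One way to picture this linkage is a master patient index that maps each system's local identifier to a single enterprise id. The systems, field names, and identifiers below are illustrative assumptions, not a real schema:

```python
# Sketch: linking records from two source systems through a shared,
# linkable patient identifier vocabulary. All names are hypothetical.
ehr_rows = [
    {"mrn": "A100", "encounter": "visit-1"},
    {"mrn": "A200", "encounter": "visit-2"},
]
claims_rows = [
    {"member_id": "M9", "claim": "c-55"},
]

# A master patient index maps each (system, local id) to one enterprise id.
master_index = {
    ("ehr", "A100"): "P1",
    ("ehr", "A200"): "P2",
    ("claims", "M9"): "P1",
}

patients = {}
for row in ehr_rows:
    pid = master_index[("ehr", row["mrn"])]
    patients.setdefault(pid, []).append(row["encounter"])
for row in claims_rows:
    pid = master_index[("claims", row["member_id"])]
    patients.setdefault(pid, []).append(row["claim"])

print(patients)  # "P1" now holds data from both the EHR and claims
```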
From there, you can build marts focused on a particular clinical area, such as diabetes. These are called subject area marts. This can be done quickly because you are not going back to the individual source systems to accomplish it. You already have all that data in the Late-Binding ™ Data Warehouse. If an underlying source system changes, you update one extraction routine instead of multiple streams. The result of all this is just-in-time data binding. Rather than trying to define everything upfront, you bind the data later, when you are trying to solve the actual clinical or operational problem.
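A sketch of carving a subject area mart out of data already sitting in the warehouse follows. The table contents are hypothetical, though the E11 prefix is the real ICD-10 family for type 2 diabetes:

```python
# Sketch: building a diabetes "subject area mart" from data already
# loaded into the warehouse, instead of re-extracting from each source
# system. Table contents are hypothetical; E11 is the real ICD-10
# family for type 2 diabetes.
warehouse_diagnoses = [
    {"patient": "P1", "icd10": "E11.9"},   # type 2 diabetes
    {"patient": "P2", "icd10": "I50.9"},   # heart failure
    {"patient": "P3", "icd10": "E11.65"},  # type 2 diabetes w/ hyperglycemia
]

# The mart definition is bound here, when this clinical use case arises,
# not when the raw data was first copied into the warehouse.
DIABETES_PREFIX = "E11"
diabetes_mart = [
    r for r in warehouse_diagnoses if r["icd10"].startswith(DIABETES_PREFIX)
]

print([r["patient"] for r in diabetes_mart])
```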
Finally, you can build graphical data visualizations atop the subject marts, so it’s easier to interpret the data and identify trends and patterns.
Early versus late binding
Data binding is a technique in which raw data elements are mapped to conceptual definitions. One of the keys to the data model developed by Health Catalyst is binding the data late, when clinicians are trying to identify and solve a problem. But that doesn’t mean you always wait. Data that is stable, like vocabulary terms and patient and provider identifiers, can and probably should be bound early.
Data that is likely to change, however, should be bound later. For example, length of stay in a hospital may sound straightforward on paper. But surgeons might define length of stay as the point of (25:39) to discharge from the anesthesia unit, and a cardiologist might define length of stay as the time from entering the ED to discharge from the hospital. Because things like length of stay definitions will change for different use cases, you will want to bind them later.
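The length-of-stay example can be sketched as code: the warehouse stores raw timestamps, and each specialty's definition is bound only at query time. The field names and the two definitions below are illustrative assumptions:

```python
# Sketch of late binding: the warehouse keeps raw timestamps, and a
# length-of-stay definition is applied only at query time.
# Field names and the two definitions are illustrative assumptions.
from datetime import datetime

encounter = {
    "ed_arrival": datetime(2024, 1, 1, 8, 0),
    "anesthesia_discharge": datetime(2024, 1, 1, 14, 0),
    "hospital_discharge": datetime(2024, 1, 3, 8, 0),
}

def los_hours(row, start_field, end_field):
    """Bind a length-of-stay definition at query time, not at load time."""
    return (row[end_field] - row[start_field]).total_seconds() / 3600

# A cardiologist's definition: ED arrival to hospital discharge.
cardiology_los = los_hours(encounter, "ed_arrival", "hospital_discharge")
# A surgeon's definition might end at discharge from the anesthesia unit.
surgery_los = los_hours(encounter, "ed_arrival", "anesthesia_discharge")

print(cardiology_los, surgery_los)  # same raw data, two valid answers
```

Because the raw timestamps are preserved, both definitions remain answerable from the same data; nothing was thrown away by an early binding decision.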
This slide shows points where you can bind data in this process. The earliest you can bind data is when you are moving it from the transactional systems into the warehouse, during the extraction or ETL process. That’s points 1 and 2 on the slide. It is best to bind only low-volatility rules and vocabularies at these first points. You can also bind data in the target data source where the data lands or when moving it to the customized data marts. That’s points 3 and 4 on the slide. You can bind somewhat volatile data in a customized data mart. This is still considered Late-Binding ™, and it occurs at point 5.
The last place you can bind data is in the visualization layer, for rules or vocabulary that are likely to change – that is, highly volatile. Once you have established definitions, the data can be locked down at points 3, 4, and 5.
Automating data gathering
Once the data is unlocked, an organization can automate the broad distribution of the information. Ideally, the data should be distributed electronically to enable clinicians and operational leaders to effectively and efficiently view the information they need in as close to real time as possible. Today, in most healthcare delivery organizations, the distribution work falls primarily to analysts or interested clinicians who encounter many challenges in finding and gathering the data that they need. First, they must understand what types of data are needed. Before they can locate and compile that data, they have to wait for IT to run reports or queries. Only then can they start interpreting data and distributing it to the right people. Obviously, understanding the need for the data and interpreting the data are the two most value-added tasks but at many healthcare organizations, at least 80% of the analysts or clinicians’ time is spent gathering or waiting for data instead of analyzing information. There are several other examples of non-value added tasks.
If the person preparing the report doesn’t get all the necessary data, he or she has to do chart abstraction, where one pulls up the patient’s record and manually types the missing data into an Excel spreadsheet or another data collection tool. This is sometimes called sneaker ETL because the analyst spends a lot of time walking from one system to another to gather and enter data.
When you reach the stage where you want to provide others access to data, the distribution stage, the new data are typically integrated into another spreadsheet, but they aren’t tied back to the source information. The person creating the report might have been clever enough to build macros to grab the data, but if that person leaves the organization, they take the knowledge about how the macros work with them, and the people left behind can only hope that the data imports continue to occur correctly.
As illustrated in this slide, automating the distribution can help solve all of these problems. There are now many data visualization tools that provide very powerful support in gathering, distributing, and analyzing data. These tools can be extraordinarily powerful in supporting the ability of frontline healthcare workers to do improvement work. In my next webinar, I will be completing the discussion of the analytic system, and I will also demonstrate some of these powerful tools. They are truly amazing, and for most clinicians, something they have never seen or experienced before. An organization can also easily eliminate unused or obsolete reports and standardize data capture as part of the workflow, during or just after the events. Instead of sending reports, an organization can encourage frontline workers to explore data themselves by collecting data in the EDW, standardizing common definitions, and automating its information distribution.
Finally, an organization can use rollup instead of summary data by gathering patient-level detail and using it as the starting point for summaries. This allows end users to drill down and answer why questions that might otherwise go unanswered. By automating data capture, data distribution, and data analysis, an organization can encourage self-exploration. It is best if healthcare can get away from a report factory mentality, where an end user sends a data request and waits a couple of weeks or longer for the result. By getting rid of this report factory mentality and making tools available that the end user can use, the end user can explore their own data and answer their own questions.
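The rollup idea can be sketched as follows: the summary keeps a link back to the underlying patient-level rows, so an unfavorable number can be traced to specific patients. All values and field names are hypothetical:

```python
# Sketch of rollup versus summary-only reporting: summaries are computed
# from retained patient-level rows, so an end user can drill down to
# answer "why" questions. All data values are hypothetical.
patient_rows = [
    {"patient": "P1", "unit": "ICU", "readmitted": True},
    {"patient": "P2", "unit": "ICU", "readmitted": False},
    {"patient": "P3", "unit": "Med/Surg", "readmitted": True},
]

# Rollup: each summary bucket keeps a link back to its patients.
rollup = {}
for row in patient_rows:
    bucket = rollup.setdefault(row["unit"], {"readmissions": 0, "patients": []})
    if row["readmitted"]:
        bucket["readmissions"] += 1
        bucket["patients"].append(row["patient"])

# Drill-down: an unfavorable ICU number can be traced to specific patients,
# something a pre-aggregated summary report cannot do.
print(rollup["ICU"])
```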
Population Health Management
Once an organization’s data is unlocked and readily available, the next step in the improvement journey is to decide where to focus the improvement efforts. Every organization has limited resources. In fact, most physician groups, hospitals, and health systems are experiencing a declining bottom line. Their goal needs to be to achieve the greatest benefit from the resources they invest in improvement efforts. Therefore, they need to determine which investments will provide the greatest benefit: improving care for the largest number of patients, streamlining operations to the greatest extent possible, and lowering cost. Taking this approach will help organizations achieve the highest value for their investment in improvement. Anyone involved in healthcare delivery knows it is complex. Traditionally, healthcare has used clinical service lines to categorize clinical care. While traditional clinical service lines have been useful, they are generally not comprehensive enough to capture all of clinical care. A clinical service line model tends to be acute care centric, and it does not adequately describe the details of any given care delivery process: what the decisions are, how decisions are made, and who makes the decisions. In short, the traditional clinical service line model does not provide us with the level of detail and depth of understanding necessary to organize our thinking and manage the process of care most effectively. That (33:27) a need for a more comprehensive conceptual framework that supports our ability to do this.
The next few slides focus on a framework that lays out the process of healthcare delivery and provides context for a discussion about quality improvement opportunities. The framework covers population-based improvements in utilization as well as improvements in prevention, encounters, and cases.
As healthcare increasingly focuses on producing value, higher quality and safer care at the lowest possible cost, there is going to be a shift in emphasis towards managing care across the continuum. There will be a need to efficiently and effectively conceptualize and manage care from the home, to the clinic, to the urgent care unit, to the emergency department, to the special procedure unit, to the hospital, and ultimately, hopefully, back to the home. As the pressure grows to manage care more effectively, there will be an increasing emphasis on post acute care in order to reduce hospital length of stay or bypass the acute care admission altogether. Examples of post acute care environments include the home, the clinic, home healthcare, skilled nursing facilities, and hospice, as illustrated on this slide.
Population Health Management
Anatomy of Healthcare Delivery
The first step in developing a comprehensive understanding of clinical care is to understand the flow of care when patients interact with the delivery system. The anatomy of healthcare delivery framework developed by Dr. David Burton and shown on this slide demonstrates the potential pathways patients can go through in their interactions with the delivery system. This is a conceptual framework that enables us to organize our thinking about the care delivery process and to focus our attention on key processes and decision making points. As seen at the top of this slide, patients may present with symptoms or they may be seeking screening or preventative care. If they have concerning symptoms or there are positive findings identified in the process of screening, patients enter into a diagnostic workup. Once a provisional diagnosis is established, patients are triaged to a treatment venue, such as a clinic, acute medical facility, or invasive facility. The goal should be to triage the patient to the care venue that best matches delivery system resources to the patient’s needs in a manner that optimizes the balance between quality, safety, and cost. Furthermore, the goal is also to optimize the process of care anywhere care is provided across the entire continuum of care.
The degree to which an organization standardizes its approach in all patient interactions and in each of the knowledge asset categories, indicated by the blue and orange boxes shown in the slide, will impact the degree of variation in care delivery. In turn, this variation will drive the level of quality, safety, waste, and cost. This is what will be required to optimize care value for patients in the future. Condition-specific care guidelines and diagnostic and therapeutic protocols can and should be developed for each of the boxes in this flow of care. The more evidence-based and standardized the management is, the better the quality, the less the harm, the less wasteful variation, and the lower the cost. Using this approach will optimize clinical processes in every step or stage of treatment. This should apply regardless of whether the patient is simply being seen for health maintenance or prevention, for an illness or injury that’s amenable to outpatient treatment, or for more serious situations requiring acute care inpatient management.
For those who are interested, there is a more detailed discussion of Dr. Burton’s anatomy of healthcare delivery in chapter 4 of my book, as well as even more comprehensive whitepapers and lectures by Dr. Burton on the Health Catalyst website.
Population Health Management
Clinical Integration Hierarchy – care process families
Now that we have examined how patients flow through the care delivery system and explained its critical decisions, we can use this information to create a logical framework, a clinical integration hierarchy, to help us think about clinical care delivery in more detail. This hierarchy applies along the continuum of care delivery, from home and clinics to the outpatient and inpatient venues of acute care, and then to post acute care venues. The most granular level of the hierarchy is the care process. This slide illustrates examples of ischemic heart disease care processes, including hyperlipidemia, coronary atherosclerosis, AMI, PCI, coronary bypass, and cardiac rehab. These care processes belong to the next level of the hierarchy, the ischemic heart disease care process family. These are just a few of the many different care processes that I referred to in the patient flow diagram on the prior slide.
Population Health Management
Clinical Integration Hierarchy – Clinical Programs
Ischemic heart disease and its care process family siblings of heart failure, heart rhythm disorders and vascular disorders make up the cardiovascular clinical program, which is an example of the next level of the hierarchy. These care process families make up the vast majority of clinical conditions in the cardiovascular domain as illustrated on this slide.
In summary, care processes are part of care process families. Multiple care process families make up a clinical program such as the cardiovascular clinical program illustrated on this slide.
Clinical Integration Hierarchy
Clinical Programs – Ordering of Care
The cardiovascular clinical program is one of several major clinical domains, as shown in this slide. Clinical programs are organized based on the physician specialists and other clinicians who share the management of care processes and who are responsible for the ordering of care for patients with these conditions. Here, they work on things together, where one team's output is another team's input, for example, OB-GYN subspecialists providing patients (40:45). Each of these domains is a clinical program, and each consists of a group of care process families.
Clinical Integration Construct
Clinical Support Services – Delivery of Care
Once care is ordered by clinical program physicians, clinical support services are responsible for the delivery of care to patients. Clinical support services, as illustrated in this slide, include diagnostic, therapeutic, clinical care, acute, medical, and invasive clinical support services. The vertical clinical programs order the care and are responsible for defining the evidence-based scientific flow of the care.
The horizontal clinical support services implement the care that is ordered and are responsible for defining a safe and efficient workflow.
Value Stream Protocols to Help Prevent Patient Harm
Patient injury prevention is an integral element of the workflow, because a patient injury really should be viewed as a defect in the implementation of optimal care. This table shows which value stream protocols each department should use to prevent patient harm.
Mapping – admin codes to clinical
In order to prioritize improvement projects relating to the ordering of care and its implementation, we need to be able to measure the relative size and variability of the three levels of the clinical integration hierarchy. This requires linking each level of the hierarchy to quantitative metrics, such as cost. This is done by mapping the clinical processes of care to administrative codes such as ICD-9, diagnostic and therapeutic codes such as CPT-4, and APR-DRGs.
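[Editor's note: the mapping idea described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration; the code sets, claim records, and dollar figures are made up for the example and are far simpler than a real mapping.]

```python
# Hypothetical sketch: link one level of the clinical integration hierarchy
# (care processes) to administrative codes so cost can be rolled up.
# Code values and claim costs below are illustrative, not real data.

# Map each care process to the ICD-9 diagnosis codes that identify it.
CARE_PROCESS_TO_ICD9 = {
    "acute myocardial infarction": {"410.01", "410.11"},
    "coronary atherosclerosis": {"414.00", "414.01"},
    "hyperlipidemia": {"272.0", "272.4"},
}

def care_process_cost(claims, icd9_codes):
    """Sum the cost of claims whose diagnosis code falls in the care process."""
    return sum(c["cost"] for c in claims if c["icd9"] in icd9_codes)

# A toy claims extract with one diagnosis code and a direct cost per claim.
claims = [
    {"icd9": "410.01", "cost": 18000.0},
    {"icd9": "414.01", "cost": 5200.0},
    {"icd9": "272.4", "cost": 140.0},
    {"icd9": "410.11", "cost": 21000.0},
]

for process, codes in CARE_PROCESS_TO_ICD9.items():
    print(process, care_process_cost(claims, codes))
```

Summing the mapped claims per care process, and then per care process family, is what makes the cost ranking on the later slides possible.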
This slide is a conceptual diagram that illustrates the mapping with a cardiovascular example using ICD-9 and CPT-4 codes. As I'm sure you know, the ICD-9 codes are eventually going to be supplanted by ICD-10 codes. Prior to this webinar, Mark Kid asked me to comment on the impact of ICD-10 on analytics. In short, I believe the impact is going to be huge. ICD-10 will offer many advantages because it can describe clinical conditions in far greater detail. It is a much more detailed clinical coding system than ICD-9, and I will be glad to see it arrive.
Thank you, Mark for the question and thanks to everyone else who offers questions, comments, or input. Your input absolutely makes these webinars and the book better.
Population Health Management
Medicare FFS payments by venue – 2008-2012
Medicare recently published nationwide data for the benefit of those developing innovation proposals. Using this data, one can group care by venue. This slide illustrates nationwide dollars of care based on venue: clinical care, outpatient, inpatient, skilled nursing facility, inpatient rehab facility, home health, and hospice. This helps us understand the relative contribution of each venue of care to total Medicare expenditures nationwide. Based on this data, it's pretty obvious where resources are expended in healthcare.
Inpatient per case KPA
The anatomy of healthcare delivery framework helps clinicians and others understand the flow of patient care. It also provides a useful model for organizing the complex care delivery process and determining where to focus care improvement resources to achieve the greatest possible impact in terms of value.
So we will now turn our attention to a discussion of how we can use the anatomy of healthcare delivery framework and the associated clinical integration hierarchy to prioritize improvement efforts.
This graph helps to illustrate how this is done. The key variables in prioritization are resource consumption and variability. Once care process families are mapped to cost, relative resource consumption can be identified and ranked, as illustrated on the slide. Each of the blue dots represents one of the care process families that we talked about before, such as arthritis, pregnancy, lower gastrointestinal disorders, ischemic heart disease, and so on. The red dots represent the cumulative total of the blue dots. If you focus on the first 10 blue dots, the cumulative total is over 40%. This analysis looks at direct variable costs because they represent the costs over which providers have the most control. Extending out to 32 care processes, you reach a point of about 80% of total resource consumption. In addition to highlighting the cost, this approach provides a reasonably good (45:58) for the rest of the patients. For example, the higher the resource consumption, the more likely the patient is in an intensive and costly care environment such as the ICU or surgery. Both because of patient benefit and cost reduction opportunities, it is reasonable to assume that focusing on these 32 processes would yield the greatest benefit for an investment in improvement.
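[Editor's note: the Pareto-style ranking described above is straightforward to sketch. This is an illustrative example only; the family names are drawn from the talk, but every dollar figure is invented, and a real analysis would rank dozens of families, not seven.]

```python
# Sketch of the prioritization described above: rank care process families
# by direct variable cost, then walk down the ranking until the cumulative
# share reaches roughly 80% of total resource consumption.
# All cost figures are hypothetical (in $M).

families = {
    "ischemic heart disease": 420.0,
    "pregnancy": 380.0,
    "lower GI disorders": 310.0,
    "arthritis": 240.0,
    "heart failure": 150.0,
    "skin disorders": 60.0,
    "minor injuries": 40.0,
}

# Rank families from highest to lowest direct variable cost.
ranked = sorted(families.items(), key=lambda kv: kv[1], reverse=True)
total = sum(families.values())

cumulative = 0.0
for rank, (name, cost) in enumerate(ranked, start=1):
    cumulative += cost
    share = 100 * cumulative / total
    print(f"{rank:2d}. {name:25s} cumulative {share:5.1f}%")
    if share >= 80:
        # Families above this point are the high-yield improvement targets.
        break
```

With these made-up numbers, the top four families already cover more than 80% of the spend, which is the same shape of result the speaker describes for the first 32 care processes nationally.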
Organizations would likely not be wise to invest money in processes on the far right of the graph, because the benefit is less likely to outweigh the cost. If an organization is doing this, it may be prudent to refocus its improvement efforts on high priority areas that offer a potentially greater return on the investment in improvement.
One of the challenges that can be encountered when applying quality measures to different care process families is that many things cannot be compared, and hospitals often lack the clinical data needed to prioritize their problems. Instead of using clinical data, which they often do not have, health organizations can substitute financial data. This works surprisingly well because complicated, expensive care often entails greater risks to the patient. Additionally, financial variation often reveals clinical variation. If the cost of care for the same type of patient varies greatly between two physicians at the same facility, the physicians are probably using different clinical practices. By standardizing on evidence-based practice, clinicians can improve outcomes and reduce costs. Financial data can be used to prioritize clinical initiatives, but it should not be used to confront physicians about cost. Improvement in cost outcomes should be a byproduct of the standardization and improvement of clinical practices, not a goal in itself.
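[Editor's note: the "financial variation reveals clinical variation" idea above can be made concrete with a tiny calculation. The physician labels and per-case costs here are entirely hypothetical.]

```python
# Sketch of using financial data as a proxy for clinical variation:
# compare average direct cost per case for the same patient type across
# physicians at the same facility. All names and figures are made up.
from statistics import mean, pstdev

cost_per_case = {
    "physician_A": [9800, 10100, 9700, 10400],
    "physician_B": [16500, 15200, 17800, 16100],
}

averages = {doc: mean(costs) for doc, costs in cost_per_case.items()}
spread = max(averages.values()) - min(averages.values())

for doc, avg in averages.items():
    print(f"{doc}: mean ${avg:,.0f} per case, sd ${pstdev(cost_per_case[doc]):,.0f}")

# A large between-physician spread for the same condition suggests different
# clinical practices worth reviewing, not grounds for a cost confrontation.
print(f"between-physician spread: ${spread:,.0f}")
```

In the spirit of the passage above, a spread like this would prompt a look at the underlying clinical practices, with cost improvement arriving as a byproduct of standardization.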
In summary, organizations need a comprehensive framework, I believe, to help them implement a solid foundation for the future.
The three systems that I've described are an example of a comprehensive framework that in combination can lead to future success.
We should use data primarily for learning and less for judgment.
The Late-Binding ™ Model is the quickest to set up, the cheapest to maintain, and most importantly, offers the flexibility required to support continuous improvement.
Automating data distribution allows frontline workers to become self-service gatherers and analyzers of data.
The Anatomy of Healthcare Delivery and Clinical Integration Hierarchy can help organizations focus their improvement efforts and maximize value for the investment.
Coming Attractions (next webinar)
That concludes today's webinar. I just want to point out a coming attraction, the next webinar, which I believe will happen in the third week of June, but you'll know shortly: The Analytic Systems. At that point, I'll bring this all together, the whole concept of the analytic solution. We'll have a discussion on understanding variation and the role of SPC in quality improvement, an overview of a thoughtful approach to improvement, and finding meaningful patterns in data, and then, as I said earlier, a very exciting demonstration of the power of modern analytical tools to support improvement.
So with that, I’ll ask two poll questions.
Poll Question #2
On a scale of 1-5, how effective is your organization’s analytical strategy and capability (as described today)?
On a scale of 1-5, how effective is your organization’s analytical strategy and capability based on the discussion that we had today? Very robust down to very limited.
Alright. We have that poll open. And as you’re taking that poll, I would like to remind everyone that you can ask questions by writing your questions and comments using the question pane in your control panel. We’ll leave this poll up for just a few more moments and then share the results.
Okay. We’re going to close the poll now. Our results are, we see 4%, Dr. Haughom, answered at a 5, very robust, 16% have answered at a 4, 31% have answered at a 3, 35% a 2 and 14% a 1 or very limited.
[John L. Haughom, MD]
Thank you for your responses. Again, while I would like to see a higher percentage of 5, very robust, the reality is this is very encouraging to me. I can pretty much guarantee you that 5 years ago we would have seen responses heavily skewed towards 1 and 2, and I think this is another encouraging sign that healthcare is evolving.
Poll Question #3
Does your organization have a robust strategy to identify high value improvement opportunities?
Next question. Does your organization have a robust strategy to identify high value improvement opportunities? Definitely, down to not at all.
Alright. And we have that poll up and we’ll leave that up for a few moments so everyone has a chance to vote.
Okay. We’ll go ahead and close that poll now and let’s share the results.
And so, we have a 14% have answered at a 5, which is definitely, 13% at a 4, 37% have answered 3, 30% at a 2, and 6% at a 1 or not at all.
[John L. Haughom, MD]
Similar to the last results, this is also encouraging to me. There's clearly a shift in the bell-shaped curve, from being skewed towards the bottom of this array towards the top, and I will predict that 24 months from now, if we do this again, we'll see it weighted even more heavily towards 4 and 5.
Upcoming Educational Opportunities
Okay. I want to thank you all. Just before we go to the questions, I'd like to point out some upcoming educational opportunities. Health Catalyst continues its very robust series of webinars. On May 29th, there will be a discussion of Data Driven Care: The Key to Accountable Care Delivery from a Physician Group Perspective by Dr. Gary Spencer. Then, on June 4th, Dr. Burton will discuss Accountable Care Transformation: The Four Building Blocks of Population Health Management. I'm very excited about the Health Analytics Summit that Tyler spoke about at the beginning, for which we're going to have a drawing for tickets. This is going to be a very exciting conference on September 24th and 25th, with a great array of speakers. I'm going to be there. I'm really looking forward to it. It will be a great educational opportunity.
And then one last thing some of you might be interested in. You may be aware that a couple of weeks ago an orthopedic surgeon from California wrote an editorial in the Wall Street Journal, declaring independence and advocating that we resist government mandates. I understood the good doctor's frustration. I've been there myself in the past, and he made some very good points. What I didn't like about the editorial is that he didn't offer an alternative. I don't think we, as care providers, can just ignore mandates and not provide an alternative. And so, I wrote an editorial about that which you can read. That's the link shown there.
Okay. With that, Tyler, questions…
Transforming Healthcare Through Analytics
Yes. Before we jump in to questions, we do have the two registration passes to give away to the Healthcare Analytics Summit that Dr. Haughom has spoken about. The first is a registration pass for a single individual and the second is a registration pass for a team of 3.
Are you interested in attending the Healthcare Analytics Summit in Salt Lake City? (single ticket)
So the (54:21) is pretty simple. I'm launching a poll right now. If you are interested and would like to be entered in the giveaway for the single ticket, and you believe that you can attend the summit on these dates, September 24th and 25th, please respond to the poll. We'll leave that open for just a few more seconds, and then we'll have the second poll for the group registration pass.
Okay. We're going to go ahead and close that poll now. And 73% have shared that they know their schedules well enough that they believe they can attend. That's very encouraging.
Now let’s go ahead and open up the poll for the team registration passes. Again, this will be a pass for a team of up to 3 individuals who will attend the Analytics Summit on September 24th and 25th. So I’m giving another 5 seconds to answer this poll.
Okay. We can now go ahead and close that poll. And this is very interesting. It looks like about 57% who responded are very interested in being able to attend as a team. That’s very encouraging. So thank you very much for that.
So, with that, let’s jump into the question and answer portion.
QUESTIONS AND ANSWERS:
Tyler Morgan: Dr. Haughom, we’ve got a couple of questions here. The first question is by Dr. (56:03). He asks, he says, “My question about analytics is how much do we force end users to document in the EHR in one place to make it easier on the reporting side? We have folks here who feel that there should only be one place for everyone to document a specific piece of data and other folks who feel like the workflows have to be flexible to suit end users.”
Dr. Haughom: Well, as a long-term informaticist by interest, that question is (56:32). Let me explain why I say that. When we started implementing the electronic health record in the 1990s and then built our data warehouse after that, there was a lot of resistance to entering data and documenting in a standard way, until our data warehouse was built. And then all of a sudden people saw that when you didn't enter the data in a standard way, it greatly diminished the value of the data warehouse and the analytics you could do, because you were frequently in the position of comparing apples to (57:06). And so, you need to have standard documentation. Having said that, workflow is important, and I believe that we're making progress, but we've got a long way to go with our electronic health records to really integrate them into the workflow. Part of that has to do with understanding the workflow on the ground as you implement the electronic health record, and part of it has to do with redesigning how you document in electronic health records, making it more streamlined and easy for providers. Thank you.
Tyler Morgan: Okay. We have another question from Dr. Anthony Restuccio and he asks, “How do we prevent a well intention focused on evidence-based medicine from stopping well-established scientifically-based clinical practices from being incorporated in a timely fashion?”
Dr. Haughom: Well, that's a great question, and when I talk about the deployment system and the content system, the answer to your very good question will become clear. For now, I'll just make the point that you have to have standard protocols and a standard way of gathering evidence and vetting the evidence. And then, once you've vetted the evidence, groups of clinicians can build it into protocols so it can get into clinical care vastly faster than the current 17 years. And then, in addition to that content system function, you need a deployment system that allows those changes to be deployed broadly across an organization. It doesn't matter if it's just a physician group, or a physician group and a hospital, or an entire health system. You need a standard deployment system that really is effective in disseminating (59:09) knowledge. Thank you.
Tyler Morgan: Okay. Dr. Haughom, we are at the top of the hour, but it looks like we've got another 15 questions or so. If (59:16) would like to stay on and continue to answer questions, as long as you have time available, would you like to go ahead?
Dr. Haughom: Oh yeah. (59:25).
Tyler Morgan: Okay.
Dr. Haughom: And also any questions we don’t get to, I’ll respond to the people that asked the question.
Tyler Morgan: Okay. Wonderful. Well, first, Dr. Steve Carlos asks if the slides will be available afterwards. Yes, they will. That's a question I can answer. We'll send out a follow-up email to everyone, along with the recording of this webinar, and we'll also provide the results of all the poll questions asked.
So we have a question from Hue William, "Can big data, together with RFID, have a role in Lean?"
Dr. Haughom: Oh, absolutely they do. That's what we are going to cover in some detail in the final chapter of my book, chapter 8, which will be out in three or four weeks. RFID is just one example of many, many devices and technologies that are going to drive healthcare into the big data realm. They include things like RFID, remote patient management systems, (60:37), portal systems, social networking technologies, and genomics. All of those things are going to massively increase the flow of data into healthcare delivery systems, particularly the Accountable Care Organizations of the future. And they will be able to use that data to really refine their approach to population health and optimally manage care across the continuum. I think you're touching on a huge revolution. I can see the wave on the rise, and it's coming. I don't know if it's 3 or 4 years from now, or 5 or 6, but it's coming, and it's going to be very powerful. And if (61:27) wants to get to that point, you have to have the three systems that I talked about today in place, because otherwise you'll drown in data. You'll never be able to manage it well. Thank you for the question.
Tyler Morgan: Okay. It actually looks like he has another question. He asks, "In a clinically integrated network with multiple disparate EHRs, what is a good way to extract the data into a data warehouse?"
Dr. Haughom: Well, in that regard, it's always more complicated when you have multiple EHRs, but if you have a well designed enterprise data warehouse, it can in fact be an integrator of data, and there are organizations doing that. It gets back, though, to the talk I did about the three different types of data models. If you use an Enterprise Data Model, the cost of trying to make that work in the kind of environment you're describing would be massive and give you lots of headaches. On the other hand, the Late-Binding ™ Model that I talked about makes it much easier, and it can function as an integrator of data in an environment that has multiple EHRs.
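[Editor's note: the "late binding" idea referenced here can be sketched in miniature. This is an editor's illustration of the general concept, not Health Catalyst's actual implementation; the row shapes, code values, and the shared-concept table are all hypothetical.]

```python
# Minimal sketch of the late-binding idea with multiple EHRs: land data
# from each source in close to its native shape, and bind local codes to
# a shared concept only at query time, when the use case is known.

# Raw extracts, kept in each source system's own shape and code set.
ehr_a_rows = [{"pt": 1, "dx_code": "I21.9"}]          # ICD-10-style source
ehr_b_rows = [{"patient": 2, "diagnosis": "410.90"}]  # ICD-9-style source

# The vocabulary binding lives alongside the data and is applied per query,
# so it can change without reloading or remodeling the warehouse.
TO_SHARED_CONCEPT = {"I21.9": "AMI", "410.90": "AMI"}

def ami_patients():
    """Resolve both sources to one shared concept only when the query runs."""
    hits = []
    for row in ehr_a_rows:
        if TO_SHARED_CONCEPT.get(row["dx_code"]) == "AMI":
            hits.append(row["pt"])
    for row in ehr_b_rows:
        if TO_SHARED_CONCEPT.get(row["diagnosis"]) == "AMI":
            hits.append(row["patient"])
    return hits

print(ami_patients())
```

The design point is that neither source extract had to be forced into a single enterprise model up front; the mapping table does the integration when a question is actually asked.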
Tyler Morgan: Alright. We actually have a couple of questions from Phillip Fullis, and his first question, I think, relates closely to what we've discussed: "How do you handle a Late-Binding ™ platform when the underlying structures of the databases are completely different? An example would be an M-live system versus a SQL data warehouse."
Dr. Haughom: Is the question again what type of system in a SQL?
Tyler Morgan: He mentioned and that would be an M-live system versus a SQL data warehouse.
Dr. Haughom: I presume "M" means MUMPS, but anyway, to answer the question, the Enterprise Data Warehouse generally is built in a SQL-type environment, because that's designed well for analytics, and it can be successfully linked to transaction systems that have different underlying architectures and use different underlying technologies. It can be harder with some than with others, but it's done. And again, relating to the prior question, in that sense the Late-Binding ™ Data Warehouse can in fact be an integrator of data for you, because it's very hard to get data out of transaction systems, like electronic health records, and to analyze it, except with a data warehouse. Thank you.
Tyler Morgan: And so his other question is, “How do you move the analyst from 80% acquisition to 80% analysis?”
Dr. Haughom: The same way you move the providers there. It gets back to the conversation I had about that slide on automating data gathering, data analysis, and data distribution. The more you do that, the more you can move the work towards the frontline and let them do it. Analysts, I think, will play a role for a very, very long time, and these tools benefit them enormously. But ultimately, I believe we're going to have to get these kinds of tools into the hands of the frontline care providers as well. Now, that's scary to some providers when I talk to them about it in front of audiences, because they say, "Well, I don't have time to do that. How can I possibly?" Well, that gets back to a discussion about the model of care delivery, which I'll talk about in chapter 8. I believe our current model is going to have to change. Thank you.
Now, my phone is beeping and I may click off. I think the battery is dying, but go ahead.
Tyler Morgan: Well, Dr. Haughom, we’ll keep going and as long as we can, if that’s alright. We’ve got Dr. Sandy Sangupta asks, “How does the Late-Binding ™ Model manage innovation and new technologies that are introduced into healthcare? In other words, how does the model stay nimble and able to take advantage of emerging technologies?”
It looks like we may have lost Dr. Haughom with his phone. We would like to thank everyone who has joined.
Before we close the webinar, we do have one final poll question to ask. We've had many individuals ask us how to learn more about Health Catalyst and our solutions. In previous webinars, we've asked for your interest in learning more in the post-webinar survey, and we received some feedback that this caused some confusion. So, in the vein of responding to feedback, we're asking this question separately from the post-webinar survey. So I've put the poll up. How interested are you in a demonstration of Health Catalyst's solutions? From very interested, somewhat, not sure, or not at all. I will leave this up for just a few moments. And while this is up, we would like to remind everyone that after this (inaudible).
Dr. Haughom: Tyler, I’m back. Tyler, I’m back. (67:13).
Tyler Morgan: That’s great. Thank you, Dr. Haughom. I’m going to go ahead and close the poll now and I’ll come back to the question that we had. Dr. Sandy Sangupta asks, “How does the Late-Binding ™ Model manage innovation and new technologies that are introduced into healthcare? In other words, how does the model stay nimble and able to take advantage of emerging technologies?”
Dr. Haughom: Well, in fact, I think it handles innovation better, because it's more flexible and adaptable to changing environments, and one of the things that can change an environment is technology. It is actually the most flexible of the three data models that we discussed. And it doesn't matter whether you're talking about a technology like a diagnostic technology that pulls in data for clinical care that you want to analyze, or a visualization technology that sits on top of the data warehouse when a new one comes along that's much more exciting and powerful. In both cases, the Late-Binding ™ Data Warehouse is more flexible and adaptable, allowing integration of those things into the environment.
Tyler Morgan: Okay. Suzanne Rabideau asks the question, “Can you please comment on how to overcome the divide between the concept of standardized organizational work processes and the common pushback of I need space to use my own clinical judgment?”
Dr. Haughom: Well, that gets back to what we discussed in the last couple of webinars, and it gets to the whole concept of mass customization that is part of Lean. What Lean says is you take the routine things, the things that rarely change, the ones that are kind of a slam dunk, and you really don't need bright folks like docs and nurses doing things like remembering to order an aspirin on a patient who comes in with an MI, and that's one of dozens of examples. For those routine kinds of things, there's no need for high-priced (69:37) like a doc worrying about them. Now, if they notice that it shouldn't be ordered because of a bleeding problem of the patient, then they need to change the protocol. The simple, routine things should be standardized, and the doc will always be free to change things when there are patient-specific needs. That effectively is Lean mass customization. And I would encourage you to take a look at my book or listen to the last couple of webinars, because I covered that in fairly great detail. Lean fits very well in healthcare. You have to do some adaptation because of the complexity of care, but it works very well in healthcare, just as it works at Toyota. The example I gave in one of those lectures was that when you order a computer from Dell or somewhere like that, you can sit there and design your own specifications, and it's customized to your specifications. The standard stuff is done in a standard way, the customer's specifications change the profile the way they want it, and it frequently takes less time to build the computer than it does to ship it to somebody. So it works very well. Thank you.
Tyler Morgan: Alright. Mel Skern asks the question, "So you commented that ICD-10 will provide markedly greater granularity. However, physician and organization coding are primarily driven by payment methodology. Services and coding follow the dollar. How do you see this changing in ICD-10?"
Dr. Haughom: Well, first of all, the ICD-9 codes have been mapped to ICD-10, and there will still be the ability to make those links. But the other thing to realize is that all the insurance companies are being required to go to ICD-10 as well, and that makes ICD-10 a very big deal for healthcare providers, because if you haven't made the move by the time the insurance industry goes to ICD-10, you're not going to be able to get a bill out the door, and that's a pretty big incentive.
And the other thing I would say is that eventually insurance companies are going to like ICD-10, for the same reason that I, speaking as a clinician, like it: the greater detail. Insurance companies will have much more detail with which to assess care and to manage risk, which is ultimately what they do, because they'll have more information to do it. Information is always a good thing, and the more you add, the better. That's the start of a long conversation, though, but thank you for the question.
Tyler Morgan: It looks like Dr. Sangupta has another question, “In the horizontal diagnostic clinical support services, is it possible to integrate or allocate costs to services such as case costing. For example, is there a way to differentiate costs from charges in various clinical programs such as management of heart failure?”
Dr. Haughom: Absolutely there is, and some of those leading-edge organizations are doing that. An example I know pretty well is (73:09) on healthcare. When Dr. Burton was at Intermountain and Dr. (73:08) was still there, they developed a very nice cost assessment methodology and wired it in, so that in fact they could measure cost horizontally like that. And that's the future. It's going to happen. There will probably be some resistance to it, because a lot of people (73:31) don't completely understand it, but it's possible, it's doable, and it's coming. Thank you.
Tyler Morgan: Aimee Salen asks the question, “How do you get your organizational leaders on board with standardizing? It is a challenge to be able to keep players on board.”
Dr. Haughom: Oh, I lived that one in my last job. The biggest challenge I had in standardization wasn't the clinicians, although they were a bit of a headache. They paled in comparison to all the regional operational leaders and executives who, just like clinicians, all wanted to do things their own way. The only way I can answer that question with any surety is to say they've got to understand the message of this book, because if they believe that the world is going to change, and if they believe the vision that some of the leading-edge pioneer healthcare organizations are creating for us, which is telling us where it's going, then they'll see the need for all the things we've been talking about in these webinars.
So it ultimately comes back to education. Just like we need to educate physicians and nurses, because they are the frontline 'smart cogs' that I've been talking about in these webinars and they absolutely have to get this, the board has to get it, the C-suite has to get it, and the operational leaders, maybe down to the department level, have to get it. And so, we're talking about a major educational program in most organizations to get there, because operational leaders are no different. They grew up in a certain model and mindset, just like clinicians have, and both clinical models and operational models are going to change. Thank you.
Tyler Morgan: It looks like we have three questions left. So our next question is by Julie Grady and she asks, “How do you integrate population health management techniques, like hotspotting, within the big data flow and monitoring?”
Dr. Haughom: Well, I think they're kind of mutually inclusive, to be honest. I'll talk quite a bit about this in the final chapter of my book. Managing populations like that is going to require new care delivery models by ACOs, and I'm talking about the ACO concept here, less so the CMS version of the ACOs. It's going to require Accountable Care Organizations that have the capability, and it's going to require new models of care that are less acute-centric and employ advanced technology, so that more care can move into lower cost venues, including the home. The largest untapped healthcare resource in the country is patients and their families. So, as we move into that realm of population health, it will require new care delivery systems running new care models supported by advanced technology, all of which will lead to big data, which will in turn support population health. So they are integrated. They fit together. Stay tuned for chapter 8, because I'm going to talk about it there. Thank you.
Tyler Morgan: Well, Dr. Haughom, our last two questions are both by Alexander Baschef. He asks, "If patient satisfaction can be the opposite of best practices, how do you bridge the gap, especially when patient satisfaction factors into reimbursement?"
Dr. Haughom: Well, that's a tough question, and I suspect that the question (77:55). In my years of practice, the way I found to do it was with thought, reason, and compassion, and by developing a really sound patient and family relationship. Sometimes you can't win. In my years of practice, I certainly had patients who did things that I thought were absolutely the wrong thing to do, but ultimately it was their choice to decide to do that, and I believe that's their right. But I also found that the vast majority of the time, you can sit down with patients, particularly if you can show them good evidence, good data, including your own data, by the way, not just a randomized controlled trial published in the literature, but your own data, saying, you know, we've had 10,000 patients with your condition, and we carefully looked at all the ways of treating it and the different outcomes, and that's why I want to give you treatment X, because it has a vastly better outcome for you. Most reasonable patients and families don't turn down good outcomes, but some do. Thank you.
Tyler Morgan: Yeah. Our final question is, "Do you see individual patient responsibility playing a role in insuring patients for their healthcare? For example, the more obese one is, the more costly their premium."
Dr. Haughom: Yes, I think that's an absolute trend, and I think we're seeing it unfold before our (79:11) eyes right now. More and more accountability is going to be put on patients, and within limits I believe that's a good thing, because they've been largely insulated from the cost of care in the past, and I think the more we give them accountability, the better. It's just that there has to be a limit to that, because they also can't afford it if it goes too far. Also, I'm about to get pushed out of the space I'm in here, so I'm going to have to ring off pretty soon.
Tyler Morgan: We’re done with our last question. We would like to thank everyone who has joined us today. Shortly after this webinar, you will receive an email with links to the recording, the presentation slides, our poll question results, and the names of the winners of the summit registration pass giveaways.
On behalf of Dr. Haughom, as well as the folks at Health Catalyst, I would like to thank you for joining us today. This webinar is now concluded.
[END OF TRANSCRIPT]