The Analytics Adoption Model Explained (Webinar)
Healthcare Analytics Adoption Model Updated (Transcript)
Alright. Good day everyone and welcome to the Health Catalyst Fall Webinar Series. Thanks to all who have joined us for today’s webinar: Healthcare Analytics Adoption Model. My name is Tyler Morgan and I will be your moderator today. Throughout our presentation, we encourage you to interact with our presenters by typing in questions or comments using the questions pane in your control panel. We will be answering questions at the end of the presentation during our questions and answers time. If we don’t have time to address your question during the webinar, we will follow up with you afterward.
We are recording today’s session, and within 24 hours after the event, you will receive an email with links to the recording, the presentation slides, and a link to our new online knowledge center where you can register for any of our upcoming webinars, as well as view past webinar recordings.
I am very happy to introduce our presenter today, Dale Sanders, Senior Vice President of Strategy at Health Catalyst. Dale has a diverse 30-year background in IT and analytics, including 8 years as a CIO in the US Air Force and 3 years as a threat analyst for the National Security Agency. He has spent the last 17 years in healthcare as an adjunct senior research director for the Advisory Board Company, chief information officer for the National Health System in the Cayman Islands, CIO and chief data architect at the Northwestern University Medical Center, and regional director of medical informatics and chief data architect at Intermountain Healthcare.
I will now turn the time over to Dale Sanders.
Thank you, Tyler. And hello to everyone. I see a lot of dear friends in attendance today. Thanks again for sharing your time and I hope it’s a great use of your time.
Today, we’re going to talk about the Healthcare Analytics Adoption Model that a group of us have been working off and on for the past 15 years, more urgently the last couple of years. And we’re going to focus on the use of that model to evaluate vendors who are emerging in the analytic space in healthcare.
So to provide a little overview, we’ll set the scene and context, kind of talk about the shifting of focus from electronic medical records to enterprise data warehouses and learning from our mistakes around EMR Adoption. We’ll talk about my general criteria for evaluating vendors of any kind in IT. Then we’ll start talking about the details of the Healthcare Analytics Adoption Model and how I suggest we evaluate vendors according to that model.
Warning: Details Ahead
We’ll dive into the very details of that model, and I warn everybody ahead of time, it’s very detailed, so settle in. And then finally I also want to introduce the HIMSS Analytics Adoption Model. We’ve been collaborating with HIMSS on both models, trying to figure out a common ground for the industry. So as I mentioned, details lie ahead, but I’m a big fan of President Eisenhower, and one of his quotes that has held true for me is that “success is found in the shadow of details.”
And so, all of the best business leaders that I’ve ever known have this ability to work at both the strategic level and the very detailed level. And that’s the kind of direction and influence you’ll see here today.
One of the things that I like to communicate nowadays is that modern business moves at the speed of software, and business agility is directly proportional to software agility. I think the recent events around healthcare.gov are a classic example of this merger of policy and business models, of the important role technology plays in that, and of why managing the details is critically important in software engineering.
And I notice that the younger leaders coming into our industry get this. They appreciate the value of software as it relates to policy and business models. Frankly, people like me, in our 50s, sometimes tend to ignore it, and the older leaders that are uncomfortable with technology are frankly kind of denying it. So I would encourage all of us in healthcare to recognize the importance of this detail, and that is the importance of software as it plays out in the transformation of our industry.
The Era of Analytics is Here
There’s no doubt that the era of analytics is here, and I believe it’s really important that senior leaders understand this topic. I also know, as a practicing CIO who has implemented and managed EMRs, that the return on investment from EMRs is found in the data that we pull out of them and the knowledge that we bend back on healthcare. And from my own personal experience with certified independent ROIs of the data warehouses that I built and designed, we can achieve a triple-digit, measurable, 18-month return on investment from these systems if we implement them properly. We’re kind of stuck with subpar electronic medical records, I believe, because we didn’t demand better software. And I’m encouraging all of us not to make the same mistakes in analytics. We can do better, and we can expect more from the vendors that are emerging in this space, because we know more now.
Center of The Universe Shifts
So the center of the universe is shifting. I think some of you have seen this slide before in other presentations, but I really believe that the future of interaction with healthcare at the data level is going to be more influenced by analytics than by the EMR. We’re consumed by the EMR right now, in part because of HITECH. But when you look at the Triple Aim, it’s really driven more by analytics than by that transaction data. And my hope is that EMRs are going to become a commodity and the price is going to plummet. I’m encouraging some of the EMR vendors to embrace this model right now and start moving toward the commoditization of those transaction information systems, and instead make their money on the services and value-add from analytics.
Closed Loop Analytics: The Triple Aim
So the Triple Aim looks like this. There are patient-specific data items in the user interface. That’s the transaction-level data, data about Dale Sanders. But then the future of the Triple Aim is also going to require that we close the loop of analytics at the point of care, so that we understand population-based metrics about patients like Dale Sanders, as well as the cost of care from both the population and the patient-specific perspective. So you can see that 2/3 of the user interface in the future of EMRs is really driven by analytics. And again, this is one of the reasons I think it’s really important that we make the right decisions now about these analytic systems.
Assessing Vendors in General
So let’s talk a little bit about the way that I assess vendors in general and have used this model throughout my career.
Criteria For Any Vendor Assessment
So in general, this is the criteria that I use to assess a vendor.
Total Cost of Ownership
First and foremost is the total cost of ownership. I don’t want to fall in love with any system that I can’t afford. So unlike a lot of negotiations, I push discussions of costs right up to day 1 and I want to get that off the table as soon as we can because, again, if it’s not affordable, I don’t want to talk about it.
Ability to Execute
The next most important thing to me is the ability to execute, and there’s nothing like experience in the delivery of IT. So I spend a lot of time with vendors, making sure that they have great references, and I don’t use the screened references they provide. I expect them to provide a list of all of their clients and let me choose which of those I prefer to talk to. And then I’m really looking at whether they have experiences similar to mine and whether they’ve had the ability to execute in a short time-to-value timeframe.
Completeness of Vision
I also look at their completeness of vision: what do they show me about their understanding of the past, what do they show me about their understanding of the present situation, and what’s their vision of the future?
Culture and Values of Senior Leadership
I look at the culture and values of the senior leadership of the company, because the entire culture of the company starts at the top, and I constantly ask myself if I would hire these people as employees and be glad to do so. If I consistently find myself saying yes to that, I have a lot of faith that they have a company that’s probably going to be a good partner for me. If I find myself saying no to that question quite often, then it’s unlikely that I’ll do business with them, because I doubt they have products or services that will be a long-term match for my organization.
Technology Adaptability & Supportability
Also critically important is the adaptability and supportability of their technology. This is where I believe we need to start evaluating healthcare IT products from the ground up rather than from the demo down. Quite often we look at functionality and demonstrations and we never peel back the covers to look at the underlying software engineering. Just like buying a house with a poor foundation, or one that doesn’t have the ability to adapt to my changing lifestyle, if we don’t evaluate these healthcare IT products from a core engineering perspective, their core software engineering skills and abilities, then we’re almost guaranteed to end up with a product that can’t adapt to the future. There’s a reason that we don’t program in COBOL and Visual Basic anymore: there are new and better ways to develop software, and we need to look for that and push for that in our industry.
The last thing I look at is company viability: will they be around in 8 years? Because that’s the average life span of a significant IT decision nowadays. If not, can I live without them? And what am I going to do contractually and organizationally to help mitigate that risk if they’re not around?
Analytic Technology Assessment
To go just a little deeper now in the analytic technology assessment, these are the 8 areas that I look for most carefully when evaluating analytic systems. And again, I’m looking at this kind of from the bottom up first.
Data Modeling & Analytic Logic
And at the core of all analytic systems is the data modeling and analytic logic and the ability to wrap and bind data according to the rules in my organization and my industry.
Master Reference/Master Data Management
The next most important thing is master reference data and master data management within that context. What kind of tools, techniques, and strategies does the vendor have for managing that master reference data?
I also look at their ETL framework, the ability to extract data from the source systems. It’s probably the highest-risk technical area for a data warehouse. I spend a lot of time evaluating the efficiency, modularity, and flexibility of the ETL scripts in a vendor’s solution.
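To make the modularity point concrete, here is a minimal sketch, not any vendor’s actual framework, of what a composable ETL step looks like: each extract, transform, and load stage is a small, independently testable function. All field and table names are invented for the example.

```python
# A minimal sketch of a modular ETL pipeline. Each stage is small and
# independently testable, which is the kind of design worth probing for
# when evaluating a vendor's ETL framework. All names are invented.

def extract(rows):
    """Pull raw encounter rows from a (simulated) source system."""
    return list(rows)

def transform(rows):
    """Normalize field names and drop rows missing a patient ID."""
    out = []
    for r in rows:
        if r.get("pat_id"):
            out.append({"patient_id": r["pat_id"],
                        "charge": float(r.get("chg", 0))})
    return out

def load(rows, warehouse):
    """Append cleaned rows to the (simulated) warehouse table."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
source = [{"pat_id": "A1", "chg": "120.50"},
          {"pat_id": None, "chg": "75.00"},   # rejected: no patient ID
          {"pat_id": "B2"}]                   # missing charge becomes 0.0
loaded = load(transform(extract(source)), warehouse)
print(loaded)  # prints 2
```

Because each stage stands alone, a new source system only requires a new `extract`/`transform` pair; the `load` logic is reused unchanged. That is the flexibility worth asking a vendor to demonstrate.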
Metadata management is more and more important, especially in healthcare. The ability to expose metadata, the yellow pages, if you will, of the EDW content, is critical. Unfortunately, I see a lot of people wasting more time and money on metadata than they need to. There are some very expensive tools out there now that are kind of misleading in their value. They collect a lot of computable data about the metadata in the data warehouse. But the most important metadata associated with the data warehouse is actually the subjective data that the data stewards provide about the content and quality of the data in the data warehouse. The most valuable computable metadata is relatively easy to obtain from any database engine; you don’t need an expensive tool for that.
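As a small illustration of that point, the computable metadata really is cheap to pull from any engine’s catalog. The sketch below uses SQLite’s catalog because it ships with Python; SQL Server’s INFORMATION_SCHEMA views serve the same purpose. The table, columns, and steward annotation are all invented for the example.

```python
import sqlite3

# The "computable" metadata is cheap to pull from any database engine's
# catalog. Table and columns here are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE encounters (patient_id TEXT, admit_date TEXT, charge REAL)")

columns = conn.execute("PRAGMA table_info(encounters)").fetchall()
# PRAGMA table_info rows are (cid, name, type, notnull, default, pk)
computable = [{"column": c[1], "type": c[2]} for c in columns]

# The higher-value metadata is subjective, supplied by data stewards.
# (This annotation is invented for the example.)
steward_notes = {
    "charge": "Gross charge only; excludes contractual adjustments.",
}

for col in computable:
    note = steward_notes.get(col["column"], "(no steward annotation yet)")
    print(f"{col['column']:12} {col['type']:6} {note}")
```

The loop at the end is the whole idea of a metadata “yellow pages”: cheap catalog facts joined with the steward commentary that actually tells an analyst whether the data is fit for use.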
Visualization of data
Of course you have to have a robust and flexible tool for visualization of data, but don’t fall into the trap of believing that one tool will meet all of the needs of the clients in your organization. There will be a whole variety of different needs for visualizing and accessing data. One of the big mistakes that I see right now is that CIOs in particular will choose a single tool with the expectation that it meets all users’ needs of the data warehouse, and that’s a really flawed strategy. So give up on the notion that a single tool will provide common access for everyone.
You want to spend a lot of time on the security model, because a poor security model around the data warehouse can lead to enormous security risks, or it can constrain the value of and accessibility to the data warehouse. So it’s really important to find the balance between access and security in a data warehouse, and I’ve always erred toward greater access rather than greater security when it comes to internal access to that data. Of course you want to protect the perimeter very carefully.
EDW performance and utilization metrics
One area that is quite often overlooked in the purchase of a data warehouse and analytics is the performance and utilization metrics around that EDW. You want to be able to track who’s using what data and how often, not just for security and auditing purposes; think about it as your insight into your customers and the way they are interacting with your data in the EDW. You’ll then use that to adjust your content, your indexing, your access models, your access roles, and that sort of thing.
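A sketch of what that looks like in practice: summarizing an EDW query audit log by user and table. The log entries below are invented; a real feed would come from the database engine’s own auditing or query-store facilities.

```python
from collections import Counter

# Sketch: summarizing an EDW audit log to see who queries what, and how
# often. The entries are invented; a real feed would come from the
# database engine's auditing facilities.
audit_log = [
    {"user": "jdoe",   "table": "encounters"},
    {"user": "jdoe",   "table": "encounters"},
    {"user": "asmith", "table": "lab_results"},
    {"user": "jdoe",   "table": "lab_results"},
]

usage = Counter((e["user"], e["table"]) for e in audit_log)
for (user, table), hits in usage.most_common():
    print(f"{user:8} {table:12} {hits}")
```

Heavily hit tables become candidates for better indexing and richer content; tables nobody touches prompt questions about training, awareness, or access roles, exactly the adjustments described above.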
Hardware and software infrastructure
And then finally, hardware and software infrastructure: is it based upon good, solid hardware and software infrastructure? For the most part, that’s become a commodity. At Health Catalyst, we use Microsoft SQL Server. I think it’s the best overall value. I made that decision as a CIO when I was at Northwestern, switching away from a kind of Oracle-dominant mindset.
Oracle has been very good to my career, but Microsoft offered, I thought, a more affordable and higher-value trajectory back in 2005-2006 when I made that jump. But any of the products are decent, and they are for the most part a commodity now.
Assessing Vendors With The Healthcare Analytics Adoption Model
Okay. Now let’s get in to the details of the Analytics Adoption Model. We’ve talked about some of the general things. So let’s really drill down into what this adoption model can do.
Reviewers & Editors
And I want to acknowledge, and this is just a short list of the many folks that have contributed to this, very dear friends as well as professional colleagues: Jim Adams, Meg Aranow, Dr. David Burton, who was Dr. Brent James’ partner at Intermountain Healthcare and an important mentor in my life when I was at Intermountain and continues to be today. Denis Protti, in particular, is a good friend who has always prodded me to do more writing and speaking on this topic of analytics, and he has contributed significantly to this model. I appreciate all of these folks very much. Thank you if you’re on the call today.
Healthcare Analytics Adoption Model
So this is a high level overview of the model and you can see that we borrowed heavily from the familiarity of the HIMSS Analytics EMR Adoption Model because that model worked well for the industry. It helped raise awareness, it helped establish a benchmark for organizations to measure their own progress in the adoption of EMRs, it helped us evaluate vendors, and it helped vendors develop products that can meet the EMR Adoption Model. And those are exactly the same motives that we have in the production of this Healthcare Analytics Adoption Model. And by the way, as I go through this, in the next slide, we’re going to ask for you to participate in an interactive poll that describes your organization’s operating level in this model.
So much like the early days of EMR adoption, when the industry was characterized by fragmented point solutions all tied together with kind of awkward and unreliable HL7 messaging, that’s similar to what we’re seeing in the analytics world of the industry now. It’s fragmented point solutions that look at a particular area, like finance, maybe Joint Commission analytics, maybe Meaningful Use, but there’s no broad perspective in general on the entire picture of care for a patient when you rely on those point solutions. So like we did in the movement toward more monolithic structures in EMRs, you know, Epic, McKesson, Cerner, Allscripts, we’re advocating the same thing needs to happen with data warehousing and analytics.
So the first step up, at level 1, is collecting and integrating core data content into an enterprise data warehouse. And there’s no alternative to this step in any industry right now. Enterprise data warehouses are still the only way that we’ve figured out how to collectively harness data. Now, there are different designs, of course, for an enterprise data warehouse, but there’s no getting past this step. And I get a little frustrated at times when some of the query and visualization vendors sell themselves as a virtual data warehouse. I can tell you right now, that’s snake oil. It’s impossible to bind data in complex ways with those visualization tools, and you’ll need to bind data in complex ways to reach the upper levels of this adoption model.
At level 2 is the creation of standardized vocabularies that start to tie your data together, if those vocabularies didn’t exist in the source systems, and the organization of data around registries of the patients that we’re interested in understanding better: disease registries, medication registries, procedure registries, and device registries. There’s a whole variety of those. We should organize those registries around the high-value care processes first, and those generally need about 20 to 30 different registries initially. But over time you’ll have hundreds and possibly thousands of registries at the upper levels of this model. So a flexible data warehouse at levels 1 and 2 that can constantly add to those registries, improve the precision of your denominators, and improve the flexibility of the different numerators is really, really important.
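The denominator/numerator idea above can be sketched in a few lines. This is an invented toy example, not a real registry definition: the denominator says who belongs in the registry, and numerators are interchangeable measures applied to it.

```python
# Sketch of a level-2 registry: an explicit denominator (who is in the
# registry) with interchangeable numerators (what we measure about them).
# Patients, codes, and thresholds are invented for the example.
patients = [
    {"id": "A1", "dx": ["E11.9"], "hba1c": 9.2},   # type 2 diabetes
    {"id": "B2", "dx": ["I10"],   "hba1c": None},  # hypertension only
    {"id": "C3", "dx": ["E11.9"], "hba1c": 6.8},
]

def diabetes_registry(pts):
    """Denominator: any patient carrying a type 2 diabetes (E11.x) code."""
    return [p for p in pts if any(d.startswith("E11") for d in p["dx"])]

def poorly_controlled(registry):
    """One of many possible numerators: HbA1c above 9."""
    return [p for p in registry if (p["hba1c"] or 0) > 9]

denom = diabetes_registry(patients)
numer = poorly_controlled(denom)
print(len(numer), "/", len(denom))  # prints 1 / 2
```

Refining the denominator (say, adding lab-based inclusion criteria) or swapping in a different numerator should each be a one-function change; that is the flexibility the model asks of a level-2 warehouse.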
Level 3 and Level 4
At levels 3 and 4, we’re automating our internal and external reporting. Those kinds of reports are basic and utilitarian; they don’t do a whole lot to differentiate us, but we do have to get them done. So the whole idea is to get those off of our plate, free up the analysts’ time so that it’s not so laborious, make sure they’re consistent and reliable reports, and then turn that labor back to the value-added, differentiating analytics that exist at levels 5 and above.
The first step in the truly differentiating analytics is at level 5, and that’s where we focus initially, and recommend focusing, on waste and care variability. You can get into a lot of analyst debates right now about measuring clinical effectiveness and clinical outcomes, but measuring care variability is a relatively easy thing to do, and it’s a very common process improvement technique, borrowed from Deming, borrowed from Toyota. The principle is: where there’s variability, there’s opportunity for improvement in quality and cost. Variability is a proxy for both of those, and it’s a very easy thing to identify analytically.
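One simple way to operationalize “variability as a proxy” is the coefficient of variation of cost per case across providers; the care process with the widest relative spread is the first candidate for standardization. The figures below are invented for the sketch.

```python
import statistics

# Sketch: care variability as a proxy for improvement opportunity.
# Cost-per-case figures by care process are invented for the example.
cost_per_case = {
    "appendectomy": [4200, 4350, 4100, 4500],
    "total_knee":   [18000, 31000, 22500, 40000],
}

def coefficient_of_variation(values):
    """Standard deviation relative to the mean: a unitless spread measure."""
    return statistics.stdev(values) / statistics.mean(values)

# Rank care processes by relative spread, widest first.
ranked = sorted(cost_per_case,
                key=lambda k: coefficient_of_variation(cost_per_case[k]),
                reverse=True)
for proc in ranked:
    cv = coefficient_of_variation(cost_per_case[proc])
    print(f"{proc:14} CV = {cv:.2f}")
```

Here the knee replacements show far wider relative variation than the appendectomies, so by the level-5 logic they are the better place to start process improvement work.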
Once you identify those highly variable processes, you start looking at population health management and what I call suggestive analytics, and that is exposing the specific patient that’s under care to population health analytics that suggest improvements to that patient’s specific care protocol. The example I use all the time is what Amazon does. They surround the transaction of a purchase with all sorts of suggestive analytics, but their motives are a little different; they’re about upselling. Our motive in healthcare is to leverage the wisdom of crowds and the wisdom of treatment from population health management and apply that back at the patient-specific level.
And by the way, I’m sorry everyone, I should be emphasizing here that when you use this model in evaluating vendors, you should be asking them very specifically what their solutions and services are at each of these levels. So that’s the idea: when vendors are demonstrating their products to you and showing you what they have to offer, ask them and hold them accountable to demonstrate capability at each of these levels. Quite often what I see right now in the market is a proliferation of vendors that can reach level 4 in this model but not level 5 and above, because of the underlying restrictions in their core design. So don’t believe it without seeing it. If a vendor tells you that they can get you up to levels 5 and above, be sure to make them prove that.
At level 6 of this model, we’re preparing ourselves for the more complicated analytics at level 7. We’re laying the foundation for the more complicated algorithms associated with predictive analytics, and we’re intervening. The notion here is that a lot of people are interested in predictive analytics, and a lot of vendors are selling that right now, but there’s not a lot that we can predict accurately in healthcare right now, given our healthcare ecosystem of data. In particular, we’re missing patient outcomes data, and it is very difficult to predict outcomes without that. So we need to give the market some time to improve and round out the ability to track patient-reported outcomes. The other part of this that’s very challenging organizationally is risk intervention. Once you understand the predictive models, intervening is actually the more difficult part. So rather than rushing into the risk intervention of predictive analytics, our suggestion in this model is that you take care of the basics and the very achievable things in a disciplined fashion first. And by the time you get to level 7, you’ll be prepared from both a data perspective and an organizational perspective.
And then finally, at level 8, the aspirational level, is personalized medicine. Here we’re tailoring patient care based upon our population outcomes; we’ve got data collection processes in place for outcomes data; we have genomic and familial data integrated with our EMRs; and our reimbursement model is fee-for-quality, based on the maintenance of health, not necessarily the delivery of care. At this level, we’re trying to keep people out of hospitals and clinics, but the care providers are still being rewarded in that model.
Okay. So think about this for just a second and let me ask you again in this poll, and I think Tyler will pop up the poll, and that is, at what level does your organization consistently and reliably function in this model? And I’m going to back it up while Tyler pops up that, so that everybody can see it. Let’s go back to that.
Tyler: Dale, the poll is live and we’ll leave this open for another 10 to 15 seconds to give everyone a chance to answer. And while you’re answering, I would like to remind everyone that they can ask questions through the questions pane on their control panel. Also, we have gotten a few questions regarding whether or not we’ll be providing a copy of these slides. Yes, we will be providing copies of the presentation slides to everyone who’s registered for the webinar today.
Dale: This is very interesting data that we’re seeing here, and it’s actually quite informative to the industry. We’ll share these results with everyone.
Tyler: Alright. And here are your results, Dale.
Dale: Alright. And so, Tyler, can everyone else see these results or should I read them off?
Tyler: They can see them while I’ve got them posted up here and I will remove them when you’re ready. If you’d like to talk to them, great, I can do whatever you like.
Dale: I think this is just a point of curiosity for all of us, and it fits with similar polls that we’ve conducted in the past. I’m always curious about the organizations that are operating at levels 5, 6, 7 and 8, and we encourage you to reach out to us. We need a whole lot more role models that are operating consistently at those levels. We would be more than happy to sponsor webinars and interviews and things like that to highlight those role-model organizations.
Progression in the Model
Okay. So let’s talk about some of the patterns in this model that I think are subtle and kind of a prelude to the details.
At each level, as you progress, data content expands. We’re constantly adding new sources of data to expand our understanding of care delivery and the patient.
We’re also moving toward more timely updates of the data to support faster decision cycles. I’ve made the mistake in the past of pushing real-time data into a data warehouse before the organizations were actually capable of digesting that real-time data, and all I did was add confusion and chaos to the environment. So this timeliness of data updates has to be choreographed with the decision-making cycles and timelines of the organization that you’re in. Some organizations are capable of moving faster than others, and you need to be very careful about choreographing the update and distribution of data with that.
Data governance also expands, and in general it continues to advocate for greater access to data, greater utilization, greater data literacy in the organization, and higher data quality.
And the complexity of data binding and algorithms also increases as you go up: moving from descriptive to prescriptive analytics, from looking back to looking forward, from asking what happened to asking what we should do in the future. I’ve mentioned a couple of times that in college I made the mistake of taking two differential equations classes at the same time, trying to catch up to my colleagues in school. And it was brutal. It was very hard. If I had it to do over again, I would have stepped more deliberately through the progression of those classes as they build upon each other. There’s a lot of hype, again, around predictive analytics right now in the market, but my suggestion is to step back, do the basics well first, and then you’ll be even better at predictive analytics when the time comes.
The Expanding Ecosystem of Data Content
So this expanding ecosystem of data content is really important to organizations, and this is where flexibility with your vendor of choice or your strategy of choice is critical. Some folks emailed me earlier today asking about timelines and costs and things like that. These are the timelines that I advocate you should expect for the expansion of data content in an EDW.
So if it takes you any longer than 3 to 12 months to reach level 1, which is just right up to the edge of claims data, it’s taking too long. At Health Catalyst, my preference is to constantly push that down; we will eventually measure this in weeks, not months. We want to make this as commoditized and as quick to deploy as possible. In 1 to 2 years, you need to start planning a strategy for the addition of data in these areas: HIE data, detailed cost accounting, bedside monitoring, external pharmacy, familial data, and home monitoring data. And in 2 to 4 years, to reach these upper levels: patient-reported outcomes data, long-term care facility data, genomics data, and then finally real-time 7×24 biometric monitoring for all patients in the ACO. This is where we’re wired and instrumented as patients, constantly being monitored.
I should point out that there are no vendors showing any interest in these two areas right now that I know of. I’ve heard a little bit more recently about detailed cost accounting data, but these are two very critical data content areas for the progression of analytics in healthcare that need attention from vendors right now. So I would encourage all of us, both the vendors I assume are on the call, as well as the consumers: we need to start pushing the industry, and at the national policy level, for the acquisition of data in these two areas. It’s critical to our future.
Five Phases of Data Governance
Data governance is also going to progress, and it needs to progress a little faster than the data acquisition strategy. Let me back up to that slide for just a second.
You need to start planning now on the acquisition of data in those 1-to-2 and 2-to-4-year timeframes, because it’s going to take some time to work out the data use agreements and the information systems requirements to get to these upper levels of analytics. So your data acquisition strategy has to be accompanied by a business strategy as you move up these levels, and it’s something you need to work on in parallel with the development of the core data warehouse. That’s the role of data governance, and why it’s really important to get an active and senior data governance function in place right away.
But don’t get those senior-level data governance members involved in intricate details initially. Get them involved in setting the tone for analytics in the organization, encouraging access to data, knocking down any barriers or fears about access to data, and assigning data stewardship to the core areas of data content, like finance, materials management, clinical data content, patient-identifiable data content, provider-identifiable content, and that sort of thing. The stewards are going to be the frontline troops helping implement the policies and principles that the executive governance team establishes.
In the next phase of data governance, you’ll be focusing on quality of data. Once you’re exposing and getting access to data, you’re going to eventually expose data quality problems, and you want to bend that back to the source of the data to improve it. Your data stewards will play a really critical role in that, and you want to be asking your vendors what kind of experience they have in facilitating good data governance, what kind of tools they have for exposing data quality problems, what kind of tools they have for exposing data stewardship issues in the metadata library, and what kind of tools they have for managing access to data and supporting a security model that balances access to data with security of data.
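To make “exposing data quality problems” concrete, here is a toy profiling sketch of the kind a vendor’s tooling should make easy: null rates and out-of-range values per field. The rows and validity thresholds are invented for the example.

```python
# Sketch: the sort of data-quality profiling a data steward needs, and
# that vendor tooling should make easy. Rows and the plausible range
# for heart rate are invented for the example.
rows = [
    {"patient_id": "A1", "heart_rate": 72},
    {"patient_id": "B2", "heart_rate": None},
    {"patient_id": "C3", "heart_rate": 940},  # implausible: likely an entry error
]

def profile(rows, field, lo, hi):
    """Count nulls and out-of-range values for one field."""
    vals = [r[field] for r in rows]
    nulls = sum(v is None for v in vals)
    out_of_range = sum(v is not None and not (lo <= v <= hi) for v in vals)
    return {"nulls": nulls, "out_of_range": out_of_range, "n": len(vals)}

report = profile(rows, "heart_rate", 20, 250)
print(report)  # prints {'nulls': 1, 'out_of_range': 1, 'n': 3}
```

A report like this is exactly what a steward bends back to the source system: the null points to a capture gap, and the out-of-range value points to an entry or interface error upstream.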
Finally, the data governance body will focus on utilization of data: encouraging a data-literate organization, and encouraging the official declaration of reports that eliminates these multiple versions of the truth. And then in phase 5, they’ll get into the strategic acquisition of data; the data acquisition strategy for the organization is reflected in that previous slide.
So those are the 5 phases of data governance that I see evolving in my life and in my experience over and over again. Those are kind of the natural phases of progression.
And again, you want data governance to be chartered around encouraging more, not less, access to data, increasing content in the warehouse, enhancing quality, establishing standards for master reference data, campaigning for data literacy, and resolving analytic priorities. And again, you want to partner with a vendor that has a proven track record, with not only the experience in these settings but also the tools to support data governance.
Strategic Analytic Options in Healthcare
Analytics Platform Vendors
These are the strategic analytic options in healthcare that I use to keep my head around what’s available in the market. So we can buy and build from an analytics platform vendor, and there are some example vendors listed there. They have a lot of flexibility. They tend to be more appropriate for data-driven cultures with high aspirations, and more suited to a higher degree of data literacy and data management skills. They’re a little harder to manage, but when the ROI occurs, and it does occur, it can be huge. Triple-digit ROI is not uncommon; quadruple-digit ROI is also seen every once in a while.
Analytics Service Providers
The other option is buying from an analytics service provider such as Explorys, Lumeris, Optum, and Premier. They tend to be better suited for cultures that want to avoid the details of analytics and data management but aspire to basic internal and external reporting efficiency.
They generally have limited analytic flexibility and adaptability for moving up to level 5 and above. And I haven’t heard any stories of substantial ROI from these systems yet. Not to say they don’t exist, but if they do, I haven’t seen them, and I’ve been studying this for something like 20 years now.
“Best of Breed” Point Solutions
The current “best of breed” point solutions are very good at what they do individually, but they don’t support continuum-of-care analytics. So these vendors are going to have to figure out a way to evolve, or to contribute to a broader perspective on patient care than what they currently support.
EMR Vendor Solutions
The other option, of course, is buying from EMR vendors, and one of the things that’s attractive about this is the possibility of closed-loop analytics that can bend the Triple Aim back into the user interface of the EMR. But having been a CIO that managed a couple of these EMR vendors, I can say their track record for successful analytics is very poor. Not that we haven’t tried, and not that we haven’t worked closely with them to encourage improvements, but they don’t leave me with a whole lot of faith that they’re going to solve the problems. In particular, their expansion into, and appreciation for, data outside of their core product is a big concern for me.
“Build Your Own” EDW
And of course, building your own from scratch is also an option, but it’s a very risky one. There aren’t many people who have made as many mistakes as I have, and if you make mistakes in data warehousing, they typically have a very slow gestation period: they sit there and simmer and boil for 2 or 3 or 4 years, and then you find yourself painted into an analytics corner and you have to scrap that solution and start over. That happened to me a couple of times in my career, once at Intermountain Healthcare when we built the health plans data mart, and again in the Air Force when we were building an intercontinental ballistic missile analytic system.
So the problems have a slow gestation and simmer quietly and insidiously. I don’t suggest you build your own from scratch unless you’re very brave and have been very successful in past implementations.
The Details in Each Level
Alright. So let’s go through some of the details. And I should warn everyone, my apologies, these slides are very text-heavy. I will make them available for further study later.
Healthcare Analytics Adoption Model Level 0: Fragmented Point Solutions
So again, the fragmented point solutions, we talked about these. They can be vendor-based or internally developed applications that focus on a very specific analytic problem or situation in the organization. They don’t support the continuum of care, and we’re trying to move organizations up from this level to something better.
Healthcare Analytics Adoption Model
Level 1: Integrated, Enterprise Data Warehouse
So at the data warehouse level, level 1, we co-locate EMR Stage 3 data, Revenue Cycle data, Financial data, Costing, Supply Chain, and Patient Experience data. Those are the basic pieces of core data that need to be in the first iteration of your data warehouse. You need to have a searchable metadata repository that’s available across the enterprise. If you have access to claims data at this level, you need to include that. Generally speaking, updates within one month of the source system changes are adequate, but if you can design for something more timely, I encourage that.
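A searchable metadata repository can be as simple as keyword matching over table, column, definition, and steward fields. This is a toy sketch in Python; the table names, definitions, and stewards are made up for illustration:

```python
# A toy searchable metadata repository: each entry describes one EDW
# table/column with a plain-language definition and a steward.
# All entries and field names are illustrative assumptions.

METADATA = [
    {"table": "revenue_cycle.charges", "column": "charge_amount",
     "definition": "Gross charge in USD before adjustments",
     "steward": "Revenue Cycle team"},
    {"table": "clinical.lab_results", "column": "hba1c_value",
     "definition": "Hemoglobin A1c result, percent",
     "steward": "Lab Services"},
]

def search_metadata(term):
    """Case-insensitive keyword search across all metadata fields."""
    term = term.lower()
    return [m for m in METADATA
            if any(term in str(v).lower() for v in m.values())]

hits = search_metadata("hba1c")
print(hits[0]["steward"])  # Lab Services
```

The point is not the implementation but the contract: any analyst in the enterprise can discover what data exists, what it means, and who owns it.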
Data governance starts to form around the quality of data in the source systems and access to those source systems. At this level, data governance is advocating a transparent, data-driven culture and knocking down any barriers to the development of the data warehouse. I generally suggest that the EDW report to the CIO, because the CIO can generally knock down barriers, including any unwillingness on the part of the source system stewards to participate in the development of the data warehouse. That’s a real phenomenon that happens quite often, because the source system teams are busy or maybe a little resentful of the EDW. I think that’s less and less the case than it was 10 or 15 years ago. But the CIO can generally bridge that gap and get both parties working together: the front-end data collection team as well as the back-end analytics team. And by the way, the folks working on front-end data collection will be great partners to the data warehouse, because they know where the data is stored and they know the issues associated with data quality. That yin-and-yang partnership between the data collection teams and the data analysis teams is really, really important.
And again, ask yourself and ask vendors, what are you doing to help me achieve level 1 as fast and as affordably as possible? Ask the question and you want them to prove that they have the ability and a track record for meeting all of these level 1 requirements.
Healthcare Analytics Adoption Model
Level 2: Standardized Vocabulary & Patient Registries
So at level 2, we’re starting to organize the data in registries and vocabularies, tying it together. You want to ask what kind of registries the vendor provides out of the box, and I also want to emphasize that the registries at the upper levels must be informed by more than just ICD9 billing data. At this level, it’s okay to get by with just the ICD definitions of the patient cohorts, but to become more precise in your analytics at levels 5 and above, you’ll need to round out the precision of those registries with clinical data, lab data, and that sort of thing. So you’re tying the data together and you’re managing vocabularies. Data governance is starting to form around the definition and the evolution of patient registries: how do we define an asthmatic, a diabetic, a congestive heart failure patient? And also, how do we define what the master data elements in the organization will be, and who will be the stewards of that data? Things like facility codes, CPT codes, ICDs: who’s the master for those, and how do we incorporate them into the data warehouse?
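As a sketch of what a level 2, billing-code-only registry looks like, here’s a minimal Python example. The ICD-9 250.x family is the standard diabetes code family, but the encounter structure and patient data are invented:

```python
# Minimal ICD-9-based patient registry: define a cohort purely from
# billing codes, as is adequate at level 2. The ICD-9 250.xx family
# covers diabetes mellitus; the encounter data is fabricated.

DIABETES_ICD9_PREFIX = "250"

def build_registry(encounters, icd9_prefix):
    """Return the set of patient IDs with any matching billing code."""
    return {e["patient_id"] for e in encounters
            if any(code.startswith(icd9_prefix) for code in e["icd9_codes"])}

encounters = [
    {"patient_id": "P1", "icd9_codes": ["250.00"]},           # diabetes
    {"patient_id": "P2", "icd9_codes": ["401.9"]},            # hypertension only
    {"patient_id": "P3", "icd9_codes": ["250.02", "401.9"]},  # both
]
registry = build_registry(encounters, DIABETES_ICD9_PREFIX)
print(sorted(registry))  # ['P1', 'P3']
```

At the upper levels of the model, this billing-only definition gets rounded out with clinical and lab data, as discussed later at level 5.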
Healthcare Analytics Adoption Model Level 3: Automated Internal Reporting
At level 3, we’re trying to automate our internal reporting: those basic reports about clinic and hospital operations, pushing those out to a wide population of consumers in the organization, making them accessible on mobile devices and on desktops, so that everyone is part of the analytic environment of the organization. Data governance now is starting to focus on raising the data literacy of the organization and developing a data acquisition strategy for levels 4 and above.
Now, the important feature here is getting the corporate and business unit analysts, the centralized and decentralized analysts, working together to collaborate and steer the enterprise data warehouse. And I always suggest a 60/40 split: 60% of your analytic resources should be centralized, 40% decentralized. That’s a good balance between the two, and it encourages a balance between top-down initiatives and bottom-up creativity and prioritization.
Healthcare Analytics Adoption Model Level 4: Automated External Reporting
External reporting, the goal here is consistent and efficient production of reports for all of the external reporting requirements. We have CMS, Joint Commission, Meaningful Use, PQRS, I mean you name it. We’re inundated with these external reporting requirements right now and they change constantly and they are right now quite ill-defined in a lot of areas.
So the important things in the design, when you’re evaluating vendors in this space, are, number one, do they have the full complement of external reporting requirements baked into their products? And number two, is it an agile software engineering and data environment that can keep up with the changes in these external reporting requirements? Agility and adaptability are really important here, as is frankly just keeping up with all the changes. You need to ask: how does that vendor keep up with all these changes, and how is that incorporated into the products?
We’re also adding new clinical text content. Radiology reports, pathology reports, and progress notes are now exposed in the data warehouse for simple keyword searches. We’re not doing any complex NLP at this point; it’s very simple keyword search, augmented with discrete data analysis as well. And there’s a centralized data governance body in place now for reviewing and approving externally released data. Making sure that what you send outside the organization has the stamp of approval of the internal data steward is really important.
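The simple keyword search described here, no NLP, just matching terms against report text, can be sketched like this in Python (the report IDs and text are fabricated examples):

```python
# Simple keyword search over clinical text: the level-4 capability
# described above. No NLP, just case-insensitive matching whose
# results can be joined with discrete data. Reports are fabricated.

def keyword_search(reports, keywords):
    """Return report IDs whose text contains any of the keywords."""
    keywords = [k.lower() for k in keywords]
    return [r["id"] for r in reports
            if any(k in r["text"].lower() for k in keywords)]

reports = [
    {"id": "RAD-1", "text": "No acute infiltrate. Small pleural effusion."},
    {"id": "RAD-2", "text": "Lungs are clear. Heart size normal."},
    {"id": "PATH-1", "text": "Findings consistent with adenocarcinoma."},
]
matches = keyword_search(reports, ["effusion", "adenocarcinoma"])
print(matches)  # ['RAD-1', 'PATH-1']
```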
Healthcare Analytics Adoption Model
Level 5: Waste & Care Variability Reduction (1)
Now we’re finally getting into the areas of true analytic differentiation at level 5, and our motive here is focused on measuring adherence to best practices, minimizing waste in care delivery, and reducing variability. And it’s quite easy, I want to mention this again, it’s quite easy to identify variability and use that as a proxy for quality and cost reduction.
And again, ask your vendor: What kind of tools do you have to help me identify variability in care? And what kind of tools do you have that allow me to measure our adherence to clinical best practices? And by the way, I want to be able to define what those clinical best practices are. Even though external requirements may impose some of that on me, I want to define internally what those clinical best practices are, and I want to be able to institute those rules in the data warehouse.
The data governance body now expands to include care management teams that are focused on improving the health of patient populations. Multidisciplinary teams are forming around these highly variable care process families, so you have IT, nursing, physicians, administration, and finance all working together to understand that variability in care and its effect on quality and cost. We’re using population-based analytics to suggest improvements to individual care. If you folks haven’t read the book called “Nudge”, I highly recommend it. It’s a great kind of airport read, and while it doesn’t use the term suggestive analytics, that’s really what it’s all about: exposing people to data will generally nudge them in the direction of improvement, and you don’t have to go through a big transformational leadership, touchy-feely process to achieve those kinds of organizational changes.
Healthcare Analytics Adoption Model
Level 5: Waste & Care Variability Reduction (2)
Also at this level, the precision of registries has improved: you’re rounding out the precision of those ICD9-based registries by including data from labs, pharmacy, and clinical observations in the definitions of patient cohorts. You’ll need to increase the precision of your registries because you’re going to start taking on greater financial risk at levels 6, 7, and 8.
And if you don’t have precise patient registries, you won’t be able to manage your financial risk. EDW content is organized in evidence-based, standardized data marts that combine clinical and cost data associated with patient registries, so care process improvement teams have their own data marts supporting the care processes they’re focusing on. Data content expands to include insurance claims, if you haven’t done that already, and HIE data feeds. And on average, we’re seeing the EDW updated within 1 week of source system changes, because the pace of decision-making and improvement in the organization is starting to pick up.
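To illustrate how clinical data rounds out a billing-only registry, here’s a hedged Python sketch. The HbA1c >= 6.5% criterion is a commonly cited diagnostic threshold, but the data structures and field names are illustrative assumptions:

```python
# Rounding out an ICD-based registry with clinical data: a patient
# qualifies if they have the billing code OR a qualifying lab value.
# The HbA1c >= 6.5 threshold is a commonly cited diagnostic cutoff;
# patient data and field names are illustrative.

def refined_diabetes_registry(patients):
    registry = set()
    for p in patients:
        has_icd = any(c.startswith("250") for c in p["icd9_codes"])
        has_lab = any(lab["test"] == "HbA1c" and lab["value"] >= 6.5
                      for lab in p["labs"])
        if has_icd or has_lab:
            registry.add(p["patient_id"])
    return registry

patients = [
    {"patient_id": "P1", "icd9_codes": ["250.00"], "labs": []},
    {"patient_id": "P2", "icd9_codes": [],
     "labs": [{"test": "HbA1c", "value": 7.1}]},  # caught by lab data only
    {"patient_id": "P3", "icd9_codes": [],
     "labs": [{"test": "HbA1c", "value": 5.4}]},  # not qualifying
]
print(sorted(refined_diabetes_registry(patients)))  # ['P1', 'P2']
```

Note that P2 would be invisible to a billing-only registry; that’s the precision gain described above.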
Healthcare Analytics Adoption Model
Level 6: Population Health Management & Suggestive Analytics (1)
At level 6, we’re starting to really take on accountable care concepts, not necessarily from the federal perspective, but from the perspective of fixed-fee, guaranteed-quality outcomes, where we all start sharing in the financial risk and reward tied to that. It’s the fixed-fee contracting that a lot of us have been exposed to for many years, with patient deliverables and guarantees of quality and satisfaction.
At this level, we suggest at least 50% of acute care cases should be managed under bundled payments. And analytics are available at the point of care to support the Triple Aim of understanding individual patient care, population management, and the economics of care. This is where the EMR vendors have to participate with the rest of the industry to enable the Triple Aim. And I would advocate, and this is what we did at Northwestern, that you support those analytics from the enterprise data warehouse and plug those analytics into the user interface of the EMR.
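One way to picture the closed loop, analytics computed in the EDW and surfaced in the EMR’s user interface, is a sketch like this. The function, field names, and risk scores are hypothetical, not any vendor’s actual API:

```python
# Closed-loop sketch: the EDW computes a patient-level summary and an
# EMR plug-in pulls it at the point of care, rather than the EMR
# computing its own analytics. All names and values are hypothetical.

EDW_SUMMARIES = {
    "P1": {"risk_score": 0.82, "registry": ["diabetes"], "expected_cost": 9400},
}

def point_of_care_panel(patient_id):
    """What an EMR plug-in would request when a chart is opened."""
    summary = EDW_SUMMARIES.get(patient_id)
    if summary is None:
        return {"status": "no EDW data", "patient_id": patient_id}
    return {"status": "ok", "patient_id": patient_id, **summary}

panel = point_of_care_panel("P1")
print(panel["risk_score"])  # 0.82
```

The design choice is the direction of the dependency: the EMR consumes the warehouse’s answer instead of each system computing its own version of the truth.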
We’re expanding the data content at this level to include bedside devices, home monitoring data, external pharmacy data, and detailed activity-based costing. And again, we need to put some pressure at the policy level in the industry, as well as on vendors and entrepreneurs, to develop activity-based costing systems for the industry, because it’s a big missing piece of what we need. We can’t reduce costs if we don’t know what they are, and we really don’t know what the costs of care delivery are in healthcare. Intermountain is an example of an organization that does have a good handle on costing. University of Utah is another good example.
Healthcare Analytics Adoption Model
Level 6: Population Health Management & Suggestive Analytics (2)
Data governance now is playing a major role in the accuracy of metrics that are supporting quality-based compensation plans for clinicians and executives. One of the nastiest conflicts that can occur is when you start producing data that puts financial risk on physicians and that data is poorly organized and inaccurate. So it’s really important at this level, as physicians take on risk-based salaries and quality-based compensation plans, that your data governance teams work to ensure that the data is accurate and that the physicians believe it and have faith in it.
The EDW is now being updated on a daily basis. And I suggest at this level that the EDW shift its organizational reporting to a C-level executive who is accountable for balancing cost of care and quality of care. Sometimes this stays with the CIO; more often it moves out to a chief medical officer or chief quality officer, that kind of thing.
Healthcare Analytics Adoption Model
Level 7: Clinical Risk Intervention & Predictive Analytics (1)
At level 7, we’re talking about clinical risk intervention and predictive analytics, and we’re moving into more fixed fee per capita reimbursement models. Our analytic motive is focused around diagnosis-based reimbursement, not procedure-based reimbursement.
We’re forming teams and managing cases in collaboration with physicians and payers around episodes of care, intervening where risk is high, and developing innovative outreach programs, triage, escalation, and referral processes to manage that risk.
Everyone, including patients, starts to share in the economic risks and rewards of better healthcare.
One of the important things here is to find vendors that understand that the registries must be tailored, the numerators in those registries must be tailored to handle patients who are unable or unwilling to participate in care protocols. So it’s a really important step in the progression. You don’t want to hold physicians accountable for quality of care if patients can’t participate in their care. There is no way that you can hold a physician accountable for that. So you have to spin those patients up separately and manage them differently and don’t hold physicians accountable for patients that can’t participate in protocols.
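The registry-tailoring idea, separating out patients who can’t participate so physicians aren’t measured against them, can be sketched like this in Python (the flag name and patient data are illustrative):

```python
# Tailoring registry numerators: patients who are unable or unwilling
# to participate in a protocol are tracked separately rather than
# counted against physician adherence. Flags and data are illustrative.

def split_registry(registry):
    """Split a registry into an accountable cohort and an exclusion list."""
    accountable, excluded = [], []
    for p in registry:
        if p.get("unable_or_unwilling"):
            excluded.append(p["patient_id"])
        else:
            accountable.append(p["patient_id"])
    return accountable, excluded

registry = [
    {"patient_id": "P1", "unable_or_unwilling": False},
    {"patient_id": "P2", "unable_or_unwilling": True},   # manage separately
    {"patient_id": "P3", "unable_or_unwilling": False},
]
accountable, excluded = split_registry(registry)
print(accountable, excluded)  # ['P1', 'P3'] ['P2']
```

Only the accountable cohort feeds physician adherence metrics; the excluded patients get their own outreach and case management processes.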
We’re expanding the analytic content to include home monitoring data and protocol-specific patient reported outcomes, and again, another area that’s not being addressed by any vendors. And you want to ask your vendor, do you have the ability to take near real-time updates to your analytic design? Because now the pace of decision-making is really starting to pick up.
Healthcare Analytics Adoption Model
Level 8: Personalized Medicine & Prescriptive Analytics
And finally, the highest aspirational level. We’re talking about personalized medicine, driven from behavioral health perspectives and physical and functional health perspectives, and the mass customization of care based upon analytics brought back to the patient at a specific point in their life. We’re including NLP on text now. Prescriptive analytics is actually suggesting changes to protocols based on local data, and we’ve exposed that at the point of care. We have 24×7 biometrics data streaming into the data warehouse for risk stratification, and now we’re updating in near real-time.
Healthcare Analytics Adoption Model: One Page Self-Inspection Guide
This is a one page self-‐inspection guide. I won’t go through this. It basically has in one page what I just covered.
Now, I’d like to pause and ask Tyler to pop up our interactive poll. Even though you just go through…
Tyler: Dale, that poll is active.
Dale: Okay. Great. Thank you. How confident are you that your analytics solution will scale to the upper levels of this model?
Tyler: Alright. We will leave this poll open for another 10 to 15 seconds.
Dale: And again, for those organizations that have a lot of confidence and maturity in this area, we’d just love it if you would let us expose you to the industry. We really want to sponsor those kinds of role model stories.
Tyler: Alright. The poll is now closed and here are the results.
Dale: And Tyler, we’ll share these results with everyone in the follow-up emails.
Tyler: That’s right. These results will be a part of the webinar recording. They’ll be able to see the result as part of that as well.
Dale: Great. Thank you. Alright. Now, I just wanted to… Oops… Sorry. Go ahead…
Tyler: I apologize, Dale. We are nearing the top of the hour and it looks like we’ve got a ton of questions coming in. I just wanted to make sure you are aware of that, and to ask whether you have the time to go past the top of the hour for those individuals who are able to stay on.
Dale: Okay. Thank you, Tyler. My apologies for taking too much time.
Five Levels: DELTA Powered™ Analytics Maturity
Let’s go through this. I just want to quickly mention that HIMSS has an analytics maturity model as well, and I encourage everyone to become familiar with it. It’s a little less healthcare-specific than the model I just reviewed, but it does support cross-industry comparisons. I’m working with James Gaston, John Hoyt, and John Daniels. They’re all dear friends and colleagues of mine at HIMSS, and we’re trying to figure out a way to collaborate, but I think it’s definitely worth looking at. Tom Davenport, of Harvard Business Review fame, is a member of the leadership team, so I’m sure it’s very well thought out. That’s the DELTA model that they advocate. I won’t spend any more time on that.
So there is the closing.
But let’s get on to these questions. Let’s see here…
The data warehouse model works really well for larger organizations. What about smaller organizations? How do we get analytics down to the front lines of healthcare? What kinds of solutions do you see emerging to serve this long tail of healthcare providers?
Well, I think it’s critical for vendors, including Health Catalyst, to find an affordable solution. So I put the responsibility back on vendors like ourselves to solve this problem. And in fact, we are, and I’m confident that we’re going to do this. We support Providence Healthcare, which is the second largest system in the US, down to very small hospitals with a couple hundred beds. But if we don’t get analytics down to the front lines of healthcare, and this question is from Bryan Fenrell, we’re not going to meet the demands of the industry. So to me this is an Apollo 13 situation: there is no other option. We have to figure out a way to make it affordable, and speaking from Health Catalyst’s perspective, we’re working on that and we will do that.
Tyler: If you could advance to that final slide that has the further webinar information (57:52) while answering these questions, that would be great. Thank you.
Dale: Oh yeah. Okay. Thank you. There’s my contact info. You guys can Google me. If you want to reach out, I would love to collaborate with you. And these are some upcoming next webinars that I think you’ll find interesting as well.
How do we standardize the vocabulary?
That’s kind of a long topic, friend, and I would be happy to talk to you about that separately. Radiology and pathology reports of course have been pre-‐standardized for a long time. I wish we could follow that same kind of role model in clinical progress notes. Surgical notes tend to be fairly straightforward but it’s a fairly significant problem. And it’s again one reason I don’t suggest a lot of text analytics right now in healthcare because the notes are so unstructured. So Greg, I would be happy to talk to you more about that later.
From Constance Jackson, how do you navigate incorporating vendors that are specialized in providing patient self-‐management tools of EMR, essentially integrating segmented data analytics that are generated by patient self-‐management to provide predictive analytics?
That is a great question, and I don’t have a good answer for that right now, Constance. I think there are good examples of that; Vital tends to receive quite a bit of acknowledgement in the market for some of their patient self-management tools. But it’s not clear to me how we’re going to integrate those into an EMR yet. I have a separate webinar on Population Health Management that touches on this topic. I think, in part, it’s going to be driven by whether the EMR vendors open their application programming interfaces, and whether we start interacting at the software layer to integrate these systems rather than the message layer, so you can plug tools like these into the EMR. But it’s not clear to me right now. That’s a great question.
From Kevin Cozosky, regarding your earlier slide about the 8 features and linking these with the half-life of technology at right around 8 years: what do you feel the realistic lifecycle is for a visualization tool?
Oh my goodness. Kevin, you’ve asked a great question. We’ve had an ongoing email debate in Health Catalyst about this exact question, and that visualization layer is changing at an incredibly fast pace right now. So I think there will be great turnover: you’ll see a whole different set of capabilities and tools about every 2 years, for the next probably 8 years, if not forever. There’s something like Moore’s Law going on with the visualization layer. So it’s very important that you design so that you’re not so tightly coupled to a visualization tool that you can’t migrate away from it in a couple of years if that tool hasn’t kept up with the evolution of the market. Great question.
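One way to stay loosely coupled to the visualization layer is to put a thin adapter between the warehouse and each tool, so only the adapter changes when a tool is swapped. A minimal Python sketch, with hypothetical class and field names:

```python
# Loose coupling to the visualization layer: the EDW publishes rows
# through a stable interface, and each visualization tool gets a thin
# adapter, so swapping tools doesn't touch the warehouse side.
# Adapter and field names are hypothetical.

class VisualizationAdapter:
    """Base adapter: tools consume rows via a stable render() contract."""
    def render(self, rows):
        raise NotImplementedError

class CsvAdapter(VisualizationAdapter):
    """One concrete tool integration; replaceable without EDW changes."""
    def render(self, rows):
        header = ",".join(rows[0].keys())
        lines = [",".join(str(v) for v in r.values()) for r in rows]
        return "\n".join([header] + lines)

def publish(rows, adapter):
    # The EDW side never changes when the adapter is swapped.
    return adapter.render(rows)

rows = [{"unit": "ICU", "readmit_rate": 0.12},
        {"unit": "Med/Surg", "readmit_rate": 0.08}]
csv_out = publish(rows, CsvAdapter())
print(csv_out)
```

When the market turns over in two years, you write a new adapter rather than rebuilding the warehouse’s publishing layer.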
From Jamie Norrie: stakeholders see that the focus of data comes from EHRs, agreed. Would you agree that other data types… Yes, absolutely. You mentioned genetics, which we’re all working towards, but what about biologic disease pathway data, biomarker imaging, and phenotyping, the information that physicians use as part of their training? Will aggregating these data be a benefit?
Yes, absolutely. I totally agree. And I probably wrongly lumped biomarker and phenotyping data into that genomic data content area. I haven’t given a lot of thought, Jamie, to disease pathway and biologic data. I’ve spent some time on the notion of analytics around imaging. But yeah, I should give that some more thought. Thank you for asking.
Further explanation of data binding. That’s from David Cook.
David, there’s a whitepaper on our website, and I would be happy to share more with you about the concept of data binding. If you Google “Late Binding™ Data Warehouse”, I think that will come up, and I suspect you’ll find the concept is pretty straightforward. But I would be happy to chat with you in person about that.
Kevin asks another question. Is cost accounting a goal of Intermountain-‐Cerner partnership? Yes, that’s our understanding, and I’m glad to hear that. I’m actually hopeful that that new partnership is going to change the industry. But yes, they are including cost accounting as a part of that agreement and that’s going to be a great thing for the industry.
From Kevin again: given where the talk is headed, differentiation through effectiveness and quality-based analytics, will not all organizations need to go to buy-and-build eventually, if they will be the only ones who survive?
I think so probably, Kevin. I mean that would be my default suggestion. There’s – yeah, if I understand your question, I think that’s the most likely path to success.
Will analytics, in your view, be given the same importance in the ambulatory sector? Most of what you’ve seen and been talking about is inpatient, especially when you mentioned vendors and the service providers in the ambulatory space.
Totally valid question. And yeah, we are really interested in the ambulatory space and we actually have quite a few products in that space. So, sorry that I came across as being inpatient-focused. The center of healthcare is moving away from the hospital, out to the ambulatory space. There are a lot more accountable care organizations that are physician-centric right now, and there are even some folks who think that eventually, you know, hospital beds are going to be traded as a commodity on the stock exchange. So ambulatory-based analytics is critical, and speaking from a Health Catalyst perspective, chronic condition management with an ambulatory focus is kind of where we’re all headed.
Richard asks, didn’t see Orion listed as a potential vendor.
Sorry about that, Richard. I try to keep up with all of the leading vendors, but I also know that the list is not complete.
From Cindy Shahim, how does the plan for right access align with HIPAA requirements? Are there differences?
Well, yeah. HIPAA, you know, is really about PHI. And so, having a very formal stewardship and auditing process around the PHI data in a data warehouse is not that much different from what you manage right now with access to an EMR. So my suggestion is that you open up access to data but be very careful about access to PHI data, the same kind of thing that we do with EMR access.
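The open-access-but-formal-PHI-stewardship pattern can be sketched roughly like this in Python; the column names, redaction behavior, and audit structure are all illustrative assumptions:

```python
# Open access to non-PHI data, with formal stewardship around PHI:
# non-PHI columns pass through freely, PHI columns require an
# authorization check, and every PHI access attempt is audited.
# Column names and the audit structure are illustrative.

PHI_COLUMNS = {"mrn", "name", "dob"}
audit_log = []

def fetch(row, columns, user, phi_authorized=False):
    out = {}
    for col in columns:
        if col in PHI_COLUMNS:
            audit_log.append({"user": user, "column": col})  # always audit PHI
            if not phi_authorized:
                out[col] = "REDACTED"
                continue
        out[col] = row[col]
    return out

row = {"mrn": "12345", "name": "Jane Doe", "dob": "1970-01-01",
       "diagnosis": "250.00", "hba1c": 7.1}
result = fetch(row, ["mrn", "hba1c"], user="analyst1")
print(result)  # PHI redacted for an unauthorized user; access still logged
```

This mirrors the EMR model: broad access to clinical content for analysis, with PHI gated and every touch recorded for the stewards.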
From Don Manning: can an organization be successful moving through these 8 levels per prioritized information needs, i.e., building out horizontally based on priority data rather than going all-in vertically at each level?
I’m not entirely sure I understand your question, and I apologize for that. But if you’re talking about focusing on care process families as opposed to shared services analytics, I think you have to have a sort of battlefront on both. So you need to have an oncology clinical program analytic strategy alongside laboratory, radiology, and materials management analytic strategies, and you have to build those out in parallel. Level 3 is where a lot of those horizontal analytics should come into play; that’s where your key process metrics in each of those shared services areas should emerge. It’s in level 3 of the model.
Ankush Bhagat asks: when the presenter mentioned it’s easy to identify variability in a process, please provide 2 to 3 examples of variability.
Direct variable cost is one area that we use as a proxy for variability. Charges, you can also use as a proxy for variability. And we’re not worried about the specific accuracies or inaccuracies of costs or charges; it’s just the variability in those that serves as an indication of variability in care. So that’s a couple of examples.
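As a worked example of using cost variability as a proxy, the coefficient of variation (standard deviation divided by mean) of direct variable cost per case is one simple measure. The case costs below are fabricated for illustration:

```python
# Using cost variability as a proxy for care variability: compute the
# coefficient of variation of direct variable cost per case within a
# care process family. All case costs are fabricated examples.
import statistics

def coefficient_of_variation(costs):
    return statistics.pstdev(costs) / statistics.mean(costs)

# Direct variable cost per case for two hypothetical procedures:
appendectomy = [4100, 4300, 4000, 4200]      # tight clustering
spine_fusion = [18000, 31000, 22000, 45000]  # wide spread -> opportunity

cv_appy = coefficient_of_variation(appendectomy)
cv_spine = coefficient_of_variation(spine_fusion)
print(round(cv_appy, 2), round(cv_spine, 2))
assert cv_spine > cv_appy  # higher variability flags improvement targets
```

The absolute accuracy of the cost figures matters less than the spread: the high-variation care process family is where a multidisciplinary team digs in first.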
Denis Protti asks, will we see a list of organizations operating at level 6 or higher? Yeah, that would be great. For those organizations who are operating at that level and answered the poll at that level, if you would please let us know so that we can advertise you, we would really appreciate that. We don’t care whether you’re with Health Catalyst or not.
Denis goes on to ask, where is the VA? Where is Kaiser and Intermountain?
I would say Intermountain for sure; I’ve had more first-hand experience there, and it consistently operates at level 7. The VA, I would say maybe level 6; I have a little bit of insight there. Kaiser, not quite as much insight, so I don’t want to venture an answer there.
It’s 10 minutes after and I’m happy to keep going, guys, so I’ll just do that and whoever can hang on, welcome to do so.
What is the minimum size of an organization which can benefit from the adoption of analytics as outlined in the model?
That’s a great question, Chris. And I think we have to figure out a way to get the analytics down to critical access hospitals, down to 25 beds. I don’t know exactly how that’s going to happen right now. But the way that we’re operating at Health Catalyst, and the way that I hope other vendors are going to operate, is that we commoditize those lower levels, levels 1, 2, and 3 especially, from a technical perspective, so that we can install them really cheaply and really quickly in a local setting, or host them in a data center in the cloud. The whole idea for these critical access facilities and small organizations is that we have to make it affordable, we have to make it manageable, and we have to get them up to at least level 3, and eventually level 4 too. And then over time, we’ll figure out how to get them up to the more complicated levels.
Another question here: what advice would you give an organization that aspires to level 8 and has tried to reach that aspiration by working on all levels at once, which has yielded partial and minimally coordinated success at different levels?
Well, that happened to us at Intermountain, actually. We aspired and worked aggressively to operate around level 7, and touched on level 8, but every month and every quarter was a fire drill for us to produce level 3 and level 4 data. And so, some of our most talented analysts were consumed at levels 3 and 4 when they should have been applying those skills to the upper levels. So we actually backed up our strategy at Intermountain and, with a lot of focus and discipline, solidified levels 3 and 4 to a high level of maturity, and then freed up those analysts so they could work on levels 5, 6, 7, and 8. So I really believe this disciplined, progressive approach of moving through the model will have a payoff, but you have to have the discipline to back up and fix the problems that reside underneath the level you aspire to.
How big does the warehouse need to be to accommodate the data needed for population health? What have you implemented so far, and how did that scale up?
Let’s see… Well, the size of the warehouse varies depending upon the size of the organization. I think the largest warehouse that we have right now is maybe 10 to 15 terabytes, and probably the smallest would be less than a terabyte. It depends on the size of the organization and the enterprise. And that was from Joseph Jones. Thanks, Joseph.
Will we have access to the presentation?
Yes, Lucy, we’ll do that. Thank you.
Can you see a role for SaaS solutions in healthcare analytics?
Well, that’s kind of how I categorize Explorys and Lumeris, and to some degree Crimson: as analytics-as-a-service solutions. And I think it has a place in the industry. I don’t think the way those models work right now offers the flexibility and agility that organizations will need to get above level 5. I think they do okay getting organizations to level 4, but level 5 requires some pretty hands-on involvement with your own data. Thanks, Kevin.
Do you see HL7 version 3 implemented in the industry?
Yeah. Curtis is asking that question about where we’re going with HL7 version 3. That’s probably worth a separate conversation, Curtis. I would be happy to talk about that. I’m frankly very disappointed in the progression of the industry and HL7 to version 3, and I’m not a big fan of the CDA architecture. I don’t think it’s the kind of interoperability that we need, and there’s no real precedent for that kind of interoperability in any other industry. So I’m not a big fan of CDA as a means for achieving interoperability.
From Grant Vaughn, could you talk a bit about the state of analytics for small private practice physicians?
It’s limited. Most of the analytics supported at the small-practice level right now come from the EMR vendors themselves or from targeted analytics. We are about to sign our first client in this area, a large physician group, well, I’d say a medium-sized physician group, very forward-thinking. And it’s our goal to provide a solution for physician groups that’s going to leapfrog the current state of the art. But it’s not a good state of the art right now, Grant, and I wish I could say otherwise. Your best solution right now is relying on your EMR vendor.
From Corinne Egert: what correlations and laddering can you see between progression on the Analytics Adoption Model and progression on the EMR Adoption Model?
Well, we anchor our Analytics Adoption Model with the assumption that you have an EMR at Stage 3 by level 1 of our model. So I think that’s how I would answer that question, Corinne. Now, it will be interesting to see if the EMR Adoption Model, and Meaningful Use, progress to include detailed cost accounting data collection, exposing the cost of care at the point of care, and collecting patient-reported outcomes through the EMR in some fashion. And so I think the EMR Adoption Model needs to evolve into these new functions and roles, including the earlier question about the other functions of an EMR. Now I’m blank, I don’t know exactly what that question was, but there were some good points in it about extending the data content and user interface of EMRs. So EMRs have some time to evolve, but at the policy level, the Meaningful Use level, we need to put pressure on them to evolve into a more complete data ecosystem.
Okay. Last question: could you talk some more about the risks of building rather than buying, and the risks of having metadata developed on a commercial platform you later want to leave?
Well, that’s what I call tight coupling, right? If you tightly couple yourself to any vendor, the pain of conversion when something better comes along is very high. Metadata in particular: right now there is no standard metadata interchange format that’s commonly supported. So if you enter data into a metadata repository, you can almost guarantee it’s going to be difficult to export that data to something new. As for the risk of building rather than buying, we’re at the same stage now with analytics that we were probably 10 or 15 years ago with EMRs, when some organizations built their own EMRs, like Duke, Partners, and Intermountain, while commercial offerings from vendors were also emerging. We’re in that same state of affairs right now with analytics.
I would not recommend building your own data warehouse unless you have an incredibly talented team with healthcare experience. One of the mistakes that I see quite often is organizations hiring people from outside of healthcare who have an analytics background. That’s good experience, and I welcome those folks into the industry, we need them, but building a data warehouse in a manufacturing, retail, or banking environment is very different from building one in healthcare, in part because our comprehensive, persistent understanding of analytics in healthcare is much less mature than in those other industries. In those industries you can safely and tightly couple your data to rules and processes, but you can’t do the same thing in healthcare yet. We’re not mature enough as an industry. Our understanding of ourselves is still evolving.
So, you know, the Healthcare Data Warehousing Association is an organization that I founded in 2001. This is Adam Veet asking this question, Adam. It is, I think, the best collection of practitioners of data warehousing in healthcare, mostly at the director level and below, though some EVPs participate; those are the folks in the trenches. A lot of those folks have built their own systems over the years, and a lot have purchased products. So if you google Healthcare Data Warehousing Association and join that group, there are no fees associated with it, it’s all free, and vendors are allowed to participate, I think you’ll find the best collection of experience and knowledge about the pros and cons of buy versus build, other than my opinion.
That’s the last question, everyone. I appreciate everybody staying with us, and my apologies for running overtime. Best of luck, everyone, and please let us know if we can help in any way. Thank you so much.
Thank you, Dale, and thanks to everybody for joining us today. After the meeting closes, you will have the opportunity to take a short 6-question survey. Please take a few minutes and fill out the survey so we can continue to bring you relevant content. And within 24 hours after this event, you will receive an email with links to the recording, the presentation slides, and a link to our new online knowledge center, where you can register for any of our upcoming webinars, as well as view past webinar recordings.
On behalf of Dale Sanders, as well as the folks at Health Catalyst, thank you so much for joining us today. Have a great day. This webinar has now concluded.
[END OF TRANSCRIPT]