The shift from fee-for-service to value-based reimbursement has both positive and negative consequences for healthcare. While the shift will ultimately help health systems provide higher quality, lower cost care, the transition may be financially disastrous for some. In addition, the shifting revenue mix from commercial payers to Medicare and Medicaid is creating its own set of challenges. There are, however, three keys to surviving the transition: 1) effectively manage shared savings programs to maximize reimbursement, 2) improve operating costs, and 3) increase patient volumes. With an analytics foundation, health systems will be able to meet today’s healthcare challenges.
Jared Crapo joined Health Catalyst in February 2013 as a Vice President. Prior to coming to Health Catalyst, he worked for Medicity as the Chief of Staff to the CEO. During his tenure at Medicity, he was also the Director of Product Management and the Director of Product Strategy. Jared co-founded Allviant, a spin-out of Medicity, that created consumer health management tools. In his early career, he developed physician accounting systems and health claims payment systems.
Healthcare leaders often turn to healthcare IT analyst rankings and reports for information that drives vendor-related decision making.
Knowing the key differences between several notable healthcare and cross-industry IT analysts—what methodologies they employ to gather data, their missions and goals (ranking vs. consulting), and how much of their own opinions they interject (unbiased vs. opinionated)—will help healthcare leaders be more educated consumers of the reports and rankings that saturate healthcare.
This article provides a high-level overview of the key differences between several healthcare IT analysts:
KLAS Research (ranking focus)
Black Book Rankings (ranking focus)
Chilmark Research (ranking and consulting focus)
Advisory Board (consulting focus)
It also looks at the most notable cross-industry IT analysts that apply a healthcare-specific lens to their findings:
International Data Corporation
Frost & Sullivan
Healthcare leaders who can interpret these rankings and reports to extract the information they need will be more effective decision makers.
An effective population health management program must include three systems: Healthcare Analytics, Best Practice, and Adoption. Organizations with only one or two of these systems often display symptoms of weak and ineffective population health management capability. But when you have an analytics foundation based upon a data warehouse, combined with evidence-based practices contained in a best practice system and the ability to deploy and implement systematic changes to healthcare processes, health systems are truly prepared to manage populations of patients.
Too much is at stake in value-based healthcare and the technology needed to provide it. When it comes to investing in the best healthcare analytics tools for delivering data-driven care management and outcomes improvement, executives should compare these seven points to determine whether an electronic health record or an enterprise data warehouse should be the foundation of their analytics platform:
Incorporating data from a wide range of sources
Ease of reporting
The data mart concept
Relevance of each to value-based care
Relevance of each to managing population health
Surfacing results of sophisticated analysis for physicians at the right time
Ability to combine best practices, data, and technology tools into a system of improvement
This executive report starts by examining the origin of EHRs and EDWs, then dives into the value derived from both in terms of their contributions to the major issues impacting healthcare delivery today.
Healthcare quality reporting is integral to achieving the Triple Aim and improving outcomes. But the sheer volume of quality measures has become as much a part of healthcare as healing and prevention. Recently, CMS and AHIP took the unprecedented step of aligning and consolidating measures in seven care categories. This will go a long way toward reducing the amount of time physicians and staff spend every week on quality reporting, but it’s only a beginning. Healthcare’s focus needs to shift from volume to value of quality measures, such as those that concentrate on quality of life and patient-reported outcomes. The International Consortium for Health Outcomes Measurement is setting the right example for quality measures designed to actually improve outcomes rather than just processes.
Many healthcare organizations are entering, or are planning to enter, into some type of at-risk contract, be it a bundled payment program, a Medicare Advantage plan, or an ACO. In order to manage these contracts most effectively, integrating external and internal claims data into the EDW is critical. Aggregating data into an EDW from internal, disparate, clinical, administrative, and financial systems is the first critical step to identifying opportunities for quality improvement. However, external data from organizations such as CMS and commercial payers, along with benchmarking, consumer, and demographic data, also has the potential to improve the quality of care, increase patient satisfaction, and lower costs. In the new world of at-risk contracts, integrating external and internal data enables leaders to successfully oversee, manage, and strategically plan for future at-risk arrangements.
Early this year, CMS began a per member per month reimbursement for Medicare beneficiaries with two or more chronic conditions. It immediately validated the need for care management programs. Three models are used to measure the savings of an effective care management program:
Historical or intent-to-treat design
Matching comparison design
Randomized control design
All three place a heavy reliance on data and precise, tailored patient registries. Reliable patient registries are one of the most valuable tools in the care management toolbox. And the means to that reliability is an enterprise data warehouse, which essentially gives program managers an all-access pass to stratifying patient risk and leads to a more successful population health initiative.
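As a minimal sketch of the registry idea described above, the following query builds a registry of patients with two or more chronic conditions, the population targeted by CMS's per-member-per-month reimbursement. The table, column, and condition names are invented for illustration and do not reflect any particular EDW schema:

```python
import sqlite3

# Hypothetical illustration: a tiny diagnoses table standing in for
# data aggregated into an EDW. All names and values are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE diagnoses (patient_id TEXT, condition TEXT);
    INSERT INTO diagnoses VALUES
        ('p1', 'diabetes'), ('p1', 'hypertension'),
        ('p2', 'asthma'),
        ('p3', 'diabetes'), ('p3', 'copd'), ('p3', 'heart failure');
""")

# Patients qualify for the registry with 2+ distinct chronic conditions.
registry = conn.execute("""
    SELECT patient_id, COUNT(DISTINCT condition) AS n_conditions
    FROM diagnoses
    GROUP BY patient_id
    HAVING n_conditions >= 2
    ORDER BY n_conditions DESC
""").fetchall()

print(registry)  # p3 (three conditions) and p1 (two); p2 is excluded
```

In practice, a registry like this would draw on many source systems and far richer inclusion criteria; the point is that a precise, query-driven definition is what makes the registry reliable enough to stratify risk.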
The data lake is a flexible alternative to a traditional data warehouse architecture because it allows for unstructured data. When a warehousing approach requires that data be in a structured format, it constrains the analyses that can be performed, because not all data can be structured early. The data lake concept is very similar to our Late-Binding approach: data lakes play the role of our source marts. We increase the efficiency and effectiveness of these through: 1) metadata, 2) Source Mart Designer, and 3) Subject Area Mart Designer.
When an analyst from another health system asked our resident analytics expert about the practical value of the Analytics Adoption Model, our expert had a lot to say. Specifically, he elaborated on the results the organization would realize, especially if they used the Adoption Model as a roadmap on their journey to become data driven. But first, they would need to adopt a late-binding data warehouse and analytics applications. With both solutions, they would be able to confidently deliver evidence-based care.
Many industries, especially those handling huge amounts of data like Facebook, use Hadoop for their processing needs. So what exactly are Big Data and Hadoop, and what are their implications for healthcare? Hadoop is a distributed processing and storage platform. Its use is rare in the healthcare industry, but healthcare analytics hasn’t necessarily been stalled because of this. In fact, the quantity of data healthcare produces doesn’t justify Hadoop-level processing power. This article answers questions such as what Hadoop is, what drives this platform in other industries, how it might affect healthcare analytics, how clinicians would use data sources outside their environment, and what drawbacks currently exist for further adoption.
With rising healthcare costs, we often hear about rate pressures on hospitals and the risk these pressures pose for their future. With healthcare reform, the burden of rising healthcare costs is shifting from payers to providers. Hospitals need to move toward value-based reimbursement models or they will face a -15.8 percent operating margin by 2021. Over the last 15 years, premiums and employee contributions for an average family with employer-sponsored health insurance have risen 167 percent. Meanwhile, government payers are reimbursing at lower levels, producing negative margins for hospitals. These changes are not necessarily easy and can seem overwhelming. The question is whether your hospital will be a pioneer on the trail or will delay until it’s too late. The best way to get started is to understand exactly where you are today: your current cost structure and how each area of your organization is performing in terms of quality and cost, using an EDW.
The term Big Data seems to be everywhere. It can be defined by three characteristics: volume, velocity, and variety. Traditional data management techniques include the ever-popular approach of using SQL interactions with a relational database. Here’s how health care fits into big data as defined above:
i. Volume: A typical healthcare firm stores less than 500 terabytes of data, compared with an investment firm that stores almost 4,000 terabytes. A traditional database engine can handle far more data than most health organizations produce. This means healthcare systems should carefully consider the tradeoff before switching to big data tools.
ii. Velocity: The speed at which some applications generate new data can overwhelm a system’s ability to store that data. However, most healthcare data is entered by employees, who are unlikely to generate data fast enough to overwhelm a typical SQL database.
iii. Variety: Most large healthcare institutions have three different forms of data. The first two, discretely codified billing transactions and clinical transactions, are well suited to relational data models. The third consists of blobs of text. While stored electronically, this data sees very little analysis today, because SQL is not able to effectively query or process these large strings. There is much opportunity for progress in this area.
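The variety point can be sketched in a few lines. In this hypothetical example (the schema, codes, and note text are invented), codified transactions support clean relational aggregation, while a free-text note blob admits little beyond substring matching in plain SQL:

```python
import sqlite3

# Invented schema: a codified charges table alongside a free-text
# notes table, illustrating the structured/unstructured contrast.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE charges (patient_id TEXT, cpt_code TEXT, amount REAL);
    CREATE TABLE notes (patient_id TEXT, note TEXT);
    INSERT INTO charges VALUES ('p1', '99213', 120.0), ('p1', '80053', 45.0);
    INSERT INTO notes VALUES
        ('p1', 'Patient reports improved glucose control on metformin.');
""")

# Codified data: aggregation is straightforward and precise.
total = conn.execute(
    "SELECT SUM(amount) FROM charges WHERE patient_id = 'p1'"
).fetchone()[0]

# Text blobs: plain SQL offers little beyond crude pattern matching.
hits = conn.execute(
    "SELECT patient_id FROM notes WHERE note LIKE '%metformin%'"
).fetchall()

print(total, hits)
```

Extracting real meaning from the notes (negation, context, clinical intent) requires tooling beyond the relational model, which is the opportunity the paragraph above points to.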
We have found that many customers have similar questions about how the implementation process works when rolling out a Health Catalyst Late-Binding™ data warehouse platform and analytics solutions. So, we thought it would be useful to produce a document that answers these and other common questions. The keys to a successful Health Catalyst implementation are outlined below in a step-by-step format.
Pre-step (most important): identify the key personnel resources needed on the health system side. The implementation then proceeds through nine steps: 1) Implementation Planning, 2) Deploy Hardware, 3) Technical Kickoff Meeting with the Client and Health Catalyst Deployment Teams, 4) Access Source Data, 5) Install Platform, 6) Load Data, 7) Install Foundational Applications, 8) Install Discovery Applications, and 9) Install Advanced Applications.
At the beginning of the project, Health Catalyst will begin a collaborative implementation planning process resulting in a timeline tailored to each project. Some projects can be accelerated, with the initial phase completed in 90 days. Your health system will have questions specific to your organization and your circumstances. We are happy to answer those in person.