What Does It Take to Develop a Clinical Data Mart in 90 Days?
Designing and implementing a data mart that fits into an enterprise data warehouse can be a resource-intensive project that takes considerable time. Health systems faced with this reality often compromise by bringing in only portions of data sources or skipping entire systems wholesale. Over time, maintaining existing data marts becomes so burdensome that the resources it consumes discourage adding new sources of information.
Software applications come and go. Vendors consolidate, and niche vendors regularly appear to meet new needs. And as you probably know, the software situation in healthcare is perhaps even more complex, given the myriad parties involved in delivering and paying for healthcare.
Quick data mart development and installation, combined with a low maintenance burden, is now a requirement for health systems that want a true enterprise data warehouse.
We recently developed a data mart based on the core clinical tables from the Cerner EMR in just 90 days. By drawing on years of healthcare data warehousing experience, we benefited from an almost perfect storm of events that allowed us to accomplish this task.
I’ll summarize the approach as: “Principle-centered and Metadata-driven.” Outlined below are some of the strategic and tactical approaches.
Have you ever sat on a standards committee? They can be painstakingly slow: long debates, often over the tiniest details and usually among very well-intentioned people, bring the committee's output to a halt. With principles in hand, these debates can often be settled quickly. The existence of guidelines, with clear examples, speeds things up tremendously.
Some would characterize the life of an airline pilot as hours and hours of boredom, interrupted by a few moments of sheer terror. Likewise, teams assembled to perform mappings often experience a boring and somewhat unchallenging task, interrupted by a few episodes of difficult, complex, and significant decision making. With principles in hand, and a broad range of experience to draw on, quick decisions can be made and made to stick.
We have found a broad range of personnel, beyond just data architects, to be very effective. We assemble new hires and seasoned experts, IT folks and operational staff, and terminology experts. It also helps to have several clinicians available to answer the inevitable questions, usually about an acronym.
And of course, don’t forget to have plenty of treats available!
A tool that is easy to use and capable of gathering metadata also makes the team very productive. A tool that can retrieve the metadata, make sense of it, and store it in a repository is invaluable.
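As a hypothetical illustration of that kind of metadata gathering, the sketch below reads column definitions from a source table and loads one row per column into a simple repository table. It uses SQLite so it is self-contained; the table names (`clinical_event`, `metadata_repository`) and columns are illustrative assumptions, not the actual tool or the Cerner schema.

```python
import sqlite3

def harvest_metadata(cur, table_name):
    """Read column definitions for table_name (via SQLite's PRAGMA
    table_info) and insert one row per column into metadata_repository."""
    cols = cur.execute(f"PRAGMA table_info({table_name})").fetchall()
    for _cid, name, ctype, notnull, _default, _pk in cols:
        cur.execute(
            "INSERT INTO metadata_repository VALUES (?, ?, ?, ?)",
            (table_name, name, ctype, 0 if notnull else 1),
        )
    return len(cols)

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A stand-in source table (columns are illustrative, not Cerner's).
cur.execute("""CREATE TABLE clinical_event (
    event_id  INTEGER PRIMARY KEY,
    person_id INTEGER NOT NULL,
    event_cd  TEXT,
    event_dt  TEXT)""")

# A minimal metadata repository: one row per source column.
cur.execute("""CREATE TABLE metadata_repository (
    table_name TEXT, column_name TEXT, data_type TEXT, is_nullable INTEGER)""")

n = harvest_metadata(cur, "clinical_event")
conn.commit()
```

In a real deployment the harvest step would query the source system's catalog (for example, `information_schema`) rather than SQLite's PRAGMA, but the pattern is the same: read the catalog once, persist it, and let downstream tools consume the repository.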
Data Dictionary and ETL (Extraction, Transformation, and Loading)
Once all the metadata is deposited in a repository, it can be exploited by two critical data warehouse functions: the data dictionary and the ETL engine.
A properly designed ETL engine can do lots of things dynamically with a rich set of metadata at its disposal.
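One common metadata-driven pattern, for instance, is to generate the load SQL at run time from the repository's column list, so a column added to the metadata flows into the load without hand-written code changes. The sketch below is a minimal, hypothetical illustration of that idea; the function and table names are assumptions, not the actual engine.

```python
def build_load_sql(columns, source_table, target_table):
    """Generate an INSERT ... SELECT statement from a list of column
    names pulled from the metadata repository."""
    col_list = ", ".join(columns)
    return (f"INSERT INTO {target_table} ({col_list}) "
            f"SELECT {col_list} FROM {source_table}")

# Columns as they might come back from a metadata repository query.
columns = ["event_id", "person_id", "event_cd"]
sql = build_load_sql(columns, "stage.clinical_event", "mart.clinical_event")
```

The same repository can drive the data dictionary: the documentation and the load logic both read from one source of truth, so they cannot drift apart.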
This is certainly a tactical point, but it can yield enormous gains. We have all experienced delays in our requests for access to a system. A team member with no access is not very productive, and volunteers who have to wait days or weeks for access are not productive either. Eric Just, our Vice President of Technology, wrote about the importance of having a good data stewardship plan.
Comment below to let us know what strategies you’ve found essential for quickly completing complex projects.