Since the 2009 passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act, the share of hospitals using electronic medical record systems (EMRs) has risen from 12% to 96%. Leaders and operators looked to EMRs as a cure for the ailing healthcare system: a way to become more efficient and have more reliable data at their fingertips for thoughtful decision making. Many of us were there during the ramp-up: deciding on vendors, choosing modules, gathering requirements, process engineering, and testing. In many cases, the primary goal at the time was hitting our agreed-upon go-live date. Although we often achieved that goal, we also laid the foundation for other problems.
EMRs are designed to aggregate information about a given patient or encounter so that a clinician has everything they need to make a clinical decision. However, the EMRs' promise of efficiency was undermined by poor processes, as many facilities didn't design workflows that support data quality. Consequently, there is plenty of data but very little information.
Why Information Matters
A hospital's operating room is both its highest revenue generator and its highest source of costs, which makes managing one incredibly challenging: organizational politics inevitably focus on mission-critical areas. Without high-quality information, decision making can fall prey to those politics, resulting in sub-optimal decisions driven by forceful personalities rather than actionable information. This leads to underperformance, both financially and operationally, as valuable OR access is squandered and surgeon frustration builds. All the data you have is useless, and can even become a liability, if you can't understand and act on it as an organization. Stakeholders will interpret information according to their own agendas if they cannot agree on whether to trust the underlying data. Data that cannot provide information is expensive, wasteful, and frustrating.
A Better Way
Over the past decade there has been a tremendous increase in the number and size of hospital M&A transactions across the country. Many sophisticated organizations have invested in a data warehouse that pulls data from multiple systems, paired with data visualization software that provides complex analytics and timely information. Together, a data warehouse and visualization software can help standardize data from multiple EMRs and other systems at an aggregate level. This facilitates decision making, provided the aggregated information comes from trusted source systems. So how can you ensure your source systems, like EMRs, are delivering high-quality data?
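To make the idea concrete, here is a minimal Python sketch of a warehouse-style aggregation across extracts from two hypothetical facilities. All facility, field, and value names are invented for illustration; the point is that once both extracts share one standardized schema, cross-facility aggregation becomes a simple union.

```python
# Sketch: aggregating standardized case records pulled from two
# different source EMRs. Names and values are illustrative assumptions.
from collections import Counter

emr_a_cases = [
    {"facility": "hospital_a", "service_line": "orthopedics"},
    {"facility": "hospital_a", "service_line": "cardiology"},
]
emr_b_cases = [
    {"facility": "hospital_b", "service_line": "orthopedics"},
]

# Because both extracts share one standardized schema,
# the warehouse view is just a union of the records:
all_cases = emr_a_cases + emr_b_cases

# Enterprise-wide counts by service line, across facilities:
by_service_line = Counter(case["service_line"] for case in all_cases)
print(by_service_line)
```

Without the shared schema, each facility's extract would need its own counting logic, and the totals could not be trusted to mean the same thing.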
The first step is building consensus on the definition and requirements for each data element, so that all the data needed to build reports is agreed upon. Gaining consensus and gathering requirements is challenging, as everyone has their own perspective on what should be reported. Often, new data tables need to be developed to ensure that all data reflect the agreed-upon valid values.
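Once those valid values are agreed upon, they can be enforced mechanically. The following Python sketch shows one simple way to check records against a valid-value table; the field names and value sets are hypothetical, not drawn from any particular EMR.

```python
# Sketch: checking EMR records against agreed-upon valid values.
# Field names and value sets below are illustrative assumptions.

VALID_VALUES = {
    "case_status": {"scheduled", "completed", "cancelled"},
    "service_line": {"orthopedics", "cardiology", "general_surgery"},
}

def validate_record(record: dict) -> list:
    """Return a list of data-quality errors for one record."""
    errors = []
    for field, allowed in VALID_VALUES.items():
        value = record.get(field)
        if value not in allowed:
            errors.append(f"{field}: {value!r} is not an agreed-upon valid value")
    return errors

# A misspelled service line is caught before it reaches the warehouse:
bad_record = {"case_status": "completed", "service_line": "cardiolgy"}
print(validate_record(bad_record))
```

Running checks like this at the point of extraction, rather than at report time, is what keeps the downstream warehouse trustworthy.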
The next step is working through one of the main technical challenges posed by EMRs: each is limited to the tables within its own infrastructure. In the case of mergers, for example, facilities often run different EMRs, or different versions that don't interact well, which makes comparison across facilities impossible if you can only rely on each EMR's internal reports. An EMR standardization project that migrates all hospitals onto a single instance is generally necessary. If the groundwork of building consensus around valid values and data-quality standards has been done properly, a facility can quickly remap its new EMR data into the data warehouse and maintain continuity in its reporting, even on a new system.
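That remapping step can be sketched in a few lines of Python. The facility names and source codes here are invented for illustration; the idea is that each facility's local codes map onto the warehouse's agreed-upon standard values, so reports stay continuous across EMR migrations.

```python
# Sketch: remapping facility-specific EMR codes to the warehouse's
# agreed-upon standard values. All mappings are illustrative assumptions.

FACILITY_MAPPINGS = {
    "hospital_a": {"CANC": "cancelled", "COMP": "completed"},
    "hospital_b": {"X": "cancelled", "C": "completed"},
}

def to_standard(facility: str, source_code: str) -> str:
    """Map one facility-specific code to the warehouse standard value."""
    mapping = FACILITY_MAPPINGS.get(facility, {})
    try:
        return mapping[source_code]
    except KeyError:
        # Unmapped codes should fail loudly, not silently pollute reports.
        raise ValueError(f"Unmapped code {source_code!r} from {facility}")

# Two different source codes land on the same standard value:
assert to_standard("hospital_a", "CANC") == to_standard("hospital_b", "X")
```

When a facility migrates to a new EMR, only its entry in the mapping table changes; every downstream report built on the standard values is unaffected.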
This brings us to the final, and perhaps most important, piece of the puzzle. Well-run organizations know it is essential to bring key stakeholders together to build accountability into daily activities and to establish a governing body that interprets results and drives improvements. This is how new processes, hardwired with optimal workflows and data outputs, get established and maintained, and it is critical for enterprise-wide reporting initiatives that leverage those outputs to make key business decisions. Healthcare leaders are always looking for nuanced reports to run their facilities more effectively and efficiently, and producing these reports often requires data from multiple source systems. Distilling that complexity into powerful yet easy-to-understand information can be the difference between success and failure in today's environment.
This blog was co-authored by Brian Watha, Associate Vice President of Consulting Services, and Michael Besedick, Engagement Manager.