EHRs, quality measures: Study points to problems
AHA suggests collaboration to ease transitions
Electronic health records (EHRs) are supposed to make your life easier — everything at hand, collected automatically. But that’s not always the reality, and that fact is highlighted in a new report from the American Hospital Association (AHA) on how well hospitals are using EHRs to report on clinical quality measures. There are currently about 90 measures that hospitals have to collect for various quality programs, and the number seems to increase regularly.
The report notes the potential of EHRs to "ease the burden" of quality reporting, while improving the ability to see performance in real time. Meaningful Use Stage 1 requires providers angling for financial incentives to report on 15 measures with some 180 data points. But the rapidity with which the program was implemented and subsequently altered means that accuracy may have suffered. The report notes these problems:
- "existing measures modified for calculation by EHRs without robust testing to determine if all of the data were available in existing EHRs;
- "known errors in the [measures] and lack of implementation within EHRs to test the feasibility of data collection or validation of results; and
- "lack of mature e-specification development and updating process."
Several organizations were visited for the report, all of which had been lauded — by the industry, the press, and the government — for their efforts related to Meaningful Use, electronic health records, and quality improvement programs. They had been ahead of the curve in these areas by as many as 10 years, according to the report. And they all figured that the stage one requirements would be pretty straightforward for them to implement. Their actual experience was not what they expected.
According to the report, the organizations all conducted a gap analysis to compare the requirements against what they were already capturing, and all found a large gap — a lot of the data was there, but not captured in the right format. It took a lot of time and money to modify what they had in place so that it met Meaningful Use requirements. The hospitals all reported that they expected to rely on the electronic clinical quality measure (eCQM) reporting tools to perform eCQM calculations. If they had integrated EHRs, it was easier than if they had unique systems with little or no interoperability.
So they created workarounds — like manually entering data that was already in the system, but somewhere else. Validation of eCQM results was the source of a lot of effort but little success. Says the report: "Two hospitals were able to validate their technical ability to capture the necessary data; however, the use of these data fields was inconsistent, and they did not achieve clinical validation. One hospital achieved technical validation and did not directly compare the results of the eCQMs and the corresponding chart-abstracted measures from which the eCQMs were derived. As an ongoing step in the eCQM validation process, three organizations developed a staff-intensive and unsustainable concurrent review process to encourage documentation directly by nurses or order-entry by physicians."
These were not nay-saying hospitals that wanted no part of the new rules. They were all committed to the process and expected Meaningful Use to be part of a successful overall quality program. They figured they could get quality data from the EHRs, use all of the quality data they collected and share it with others, and use the EHR for clinical decision support related to the eCQMs. What they found instead were specifications that were hard to access, complex, sometimes inaccurate, and not maintained over time. There were technology challenges with tools that didn’t work as expected and didn’t generate accurate results in an efficient way.
Clinicians objected to the additional work, which didn’t seem to improve patient care in any way, since a lot of what they had to do was already included in the record in some other manner. Hospitals interviewed spent between two and 18 months of physician and nursing leadership time per measure making changes, and staff had to spend additional time making sure what was included was correct and manually correcting what was not.
"Organizations either spent considerable time in re-work to revise and validate the eCQM measurement process with the eCQM reporting tool, or chose to ignore the results in favor of those derived from the chart-abstracted versions of the measures," the report says.
The hospitals also reported that all this extra time and money meant other things didn’t get done — such as implementing medication bar coding programs.
The recommendations will bring cheers from many: Slow the pace of transition and use fewer, better-tested measures; make EHRs and eCQM reporting tools more flexible; improve standards for EHRs to make them more user-friendly and make Meaningful Use requirements easier to meet; test eCQMs before adopting them; and provide guidance and tested tools that will support hospitals in this transition. The summation: These tools should work for clinicians. Right now, clinicians are working for them and not getting much in return.
Diane Jones, senior associate director of policy at the AHA, says anyone involved in quality in a hospital can gain insight from the report, which includes a list of the workarounds developed by the interviewed facilities, as well as in-depth policy recommendations.
"One of the most interesting results from these interviews is that the people traditionally responsible for quality improvement and measurement are not tightly aligned with the IT department," she says. In some cases, they may not even know each other. But the best programs are those where the IT staff and QI staff are closely involved and collaborate often.
The number-one to-do item that emerges, Jones says, is that you have to develop that collaborative relationship. "You can’t possibly address the issues related to Meaningful Use if you don’t have a close relationship. Reach out to that other department. Foster bridges, bring people together. This is really a team effort."
The report seems like a downer, but Jones says one of the positive lessons is that there is now a clearer picture of what’s hard and what’s not working, even for hospitals that had a lot of EHR experience before Meaningful Use. "We thought if we talked to people across departments — finance, IT, nursing, quality — we could get a picture," she says. And the picture is of difficulty overcome by collaborative environments, not people working in individual silos without knowledge of or concern for what someone else is doing down the hall.
The complete report can be seen at http://www.aha.org/content/13/13ehrchallenges-issbrief.pdf.
For more information on this topic, contact Diane Jones, Senior Associate Director of Policy, American Hospital Association, Washington, DC. Email: [email protected].