Letting it all hang out does not seem to matter
Study: Reporting made little difference in mortality
It has been seven years since Medicare started requiring hospitals to publicly report their performance on core measures related to heart attack, heart failure, and pneumonia. Ask the hospitals participating in Hospital Compare whether this has affected their quality improvement and patient safety efforts, and the vast majority will answer in the affirmative. But does that mean that what they do is resulting in better outcomes, in lower mortality? Not according to a new study in the March issue of Health Affairs.1
Researchers looked at data from 2000 to 2008 for hospitalized Medicare patients with heart attack, heart failure, or pneumonia, as well as for patients admitted for conditions that were not reported to Hospital Compare: stroke, hip fracture, and gastrointestinal hemorrhage. The goal was to see whether public reporting of data for the first three conditions led to any statistically significant change in mortality rates compared with the conditions for which data were not publicly reported. The study period straddled the point when data collection was first required and the 2005 mandate that the data be reported publicly.
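In rough terms, that comparison follows a difference-in-differences logic: take the change in mortality for the reported conditions from before public reporting to after, and subtract the corresponding change for the conditions that were never reported. A minimal sketch of that framing, using illustrative notation rather than the study's own (the published analysis is more elaborate, adjusting for patient characteristics and pre-existing trends), might be written as:

$$
\Delta_{\text{DiD}} = \left(\bar{M}^{\text{reported}}_{\text{after}} - \bar{M}^{\text{reported}}_{\text{before}}\right) - \left(\bar{M}^{\text{non-reported}}_{\text{after}} - \bar{M}^{\text{non-reported}}_{\text{before}}\right)
$$

where \(\bar{M}\) stands for average mortality in each group of conditions in each period. A \(\Delta_{\text{DiD}}\) near zero would mean the reported conditions improved no faster than the unreported ones, so the improvement could not be credited to public reporting.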
Mortality rates declined for all three reported conditions, but after the researchers controlled for existing trends toward lower mortality, the decline was not attributable to the public reporting of Hospital Compare data: it all but disappeared for heart attack and pneumonia and was minimal for heart failure. The researchers could find no movement of patients from poorly performing hospitals to better-performing ones, and it could be that patients weren't using the data to choose providers at all. "To the extent that Hospital Compare reduced mortality, this was a result of within-hospital quality improvement," the study concludes, "which has been noted as a mechanism for quality improvement in other public reporting programs."
Author Andrew M. Ryan, Ph.D., assistant professor of public health at Weill Cornell Graduate School of Medical Sciences in New York City, says the question is not really whether an effect is there, but why we aren't seeing one. Everyone can agree that more hospitals are reporting that they are meeting the requirements of the core measures. The numbers are going up for things like aspirin for stroke and heart attack patients. "Compliance with a lot started out so low," he says. "It makes it look like a success that these numbers went up. And maybe it is a success. But is the success a function of clinical or administrative activities?"
Hospitals wanted to do better on these measures, but it is possible the improvement was merely based on better documentation, and hospitals were already doing the things clinically that they were finally writing down, says Ryan.
The underlying logic of the measures is sound, he continues. They are based on peer-reviewed consensus evidence, and in trial settings each of them resulted in better outcomes. "But clinical trial settings are different," Ryan notes. "You have more discretion to exclude patients." The indicators are also pretty narrow and do not effect whole system change. "It is like teaching to the test."
He also says to remember that the study period extended only a couple of years into public reporting and did not yet include public reporting of readmission and mortality rates. Now that those are part of the record any consumer can check, they may positively affect those rates in the future.
The measures themselves
Not everyone thinks the finding is a fluke that says nothing about the value of the indicators themselves. For some, the quality measures as they exist in hospitals are "completely perverse." That's the opinion of J. Deane Waldman, MD, MBA, a professor of pediatrics, pathology, and decision science at the University of New Mexico in Albuquerque and author of the book Uproot U.S. Healthcare, which offers an extensive critique of the way quality improvement is currently conducted.
Mortality is viewed as bad, and thus the opposite of it must be good, he says. But what people want is not the opposite of mortality; they want restored function and long life. Waldman also objects to the timeline associated with most quality measures. They do not look at things such as long-term survival rates or how the hip replacement patient does after two years, but at in-hospital death and 30-day readmissions.
They are, he says, the easy things to measure, and they rely more on processes than on outcomes; when they are outcomes measures, they are not the outcomes that really matter to people. "No one ever asks the customers what they think quality is," says Waldman. To patients, quality includes ease of getting into and navigating the system, knowing what will happen next, and a timeline of reasonable expectations. Patients want a smooth recovery if they are sick, or an easy experience if the care is routine. "But medicine makes it hard." They want to see a physician in a timely manner, not the days or weeks it sometimes takes to see a specialist.
Waldman acknowledges that things like parking, nice nurses, and decent food are real concerns for patients. But they are not the important things; they are just the easiest ones to measure and correct, he says.
What hospitals should be doing is looking beyond regulatory compliance, because compliance itself shouldn't be the desired outcome, Waldman notes. And while it is logical that following many if not all of the standards from The Joint Commission or the Conditions of Participation from the Centers for Medicare & Medicaid Services (CMS) will improve quality, no one has ever proved that to be the case.
And they should do this, Waldman continues, because eventually reporting positive outcomes measures over a longer timeline will be required. In the meantime, tackling these harder measures is a way for a hospital to differentiate itself from the competition, and payers will eventually respond to that.
Give providers incentives to do the things that improve these long-term positive outcomes measures, Waldman says. "What if you gave them an incentive to come up with ways to reduce infections or [improve] range of motion in ortho patients? You could save a vast amount of money by reducing your length of stay. If you have a fixed reimbursement for a procedure, you still get paid the same regardless of how long they stay." But for the patient, getting out sooner and being able to move more easily? Priceless.
Far to go, but do not despair
Not everyone is that pessimistic. "Over the last ten years, as a result of having a common set of data for all hospitals and physicians, we can finally have some meaningful conversations about quality across institutions," says Apurv Gupta, MD, MPH, managing partner at the Boston-based Physician Performance Improvement Institute. Yes, there are limitations, among them the fact that the data are outdated by the time they are posted and allow only limited severity adjustment. "One of the biggest limitations of the data from a quality standpoint is that they can be improved in documentation, but without any corresponding change in practice — which means that even though it appears that process measures are improving, if there is no actual change in care, then outcomes measures such as mortality will not change," Gupta says. "The fallacy is in our belief that the process measures are necessarily linked to the outcomes measures, which they are not. But in any case, the great value of the data is in the quality movement that it has fostered, including transparency and accountability. Real results will still take many more years to attain."
Gupta takes heart in how seriously hospitals take the measures that are required. "They are trying to develop a culture of measurement and systems for improvement. That's an OK place to be given when we started. We are building the measuring infrastructure. In a decade, there will be significant movement."
He knows that things are moving forward just from how much more willing providers are to discuss measures and report them. "Five years ago, docs would argue with you about what to measure, but now they do not actively fight you," Gupta says. "They are willing to sit at the table — not leading the efforts, but not fighting the efforts. You just have to figure out a way to engage with them."
According to Gupta, these things we are measuring may not be optimal, but they are still important, and it is beneficial to look at them, even if they do not move outcomes as much or as fast as you thought they would. "Improved documentation is critical because you are building a system that is necessary to impact quality. These core measures were what could be used to compare hospitals and gathered pretty easily."
Once a hospital is comfortable with the existing measures, Gupta suggests moving beyond them. For heart failure, for example, there are other things you can measure to see whether they help you improve further. But for a regulatory body to mandate 15 measures for one disease alone? That wouldn't have flown.
Ryan does not believe hospitals should stop doing what they have been doing simply because it does not appear to be having an impact. "Hospital Compare is a good thing. It has raised the profile of quality of care and gotten people to focus on issues that for a long time were taken for granted."
For now, this is a good first step. "As it evolves, it may better engage patients, which it hasn't yet," he says. "And it does not mean it hasn't benefited the individual patient."
For more information on this topic, contact:
- Andrew M. Ryan, Ph.D., Assistant Professor of Public Health, Weill Cornell Graduate School of Medical Sciences, New York, NY. Email: [email protected].
- J. Deane Waldman, MD, MBA, Professor of Pediatrics, Pathology and Decision Science, University of New Mexico, Albuquerque, NM. Email: [email protected].
- Apurv Gupta, MD, MPH, Managing Partner, Physician Performance Improvement Institute, Boston, MA. Email: [email protected].
Reference
1. Ryan AM, Nallamothu BK, Dimick JB. Medicare's public reporting initiative on hospital quality had modest or no impact on mortality from three key conditions. Health Aff 2012 Mar;31(3):585-592.