TJC praises top hospitals in annual report
Hospitals continue to make progress on quality and safety, according to key measures of evidence-based care processes. That’s the bottom line from The Joint Commission’s (TJC’s) 2015 annual report on quality and safety, which summarizes data on 49 accountability measures that more than 3,300 TJC-accredited hospitals collected and reported in 2014.
In the report, a total of 1,043 hospitals achieved “top performer” status, a designation that requires a hospital to:
• achieve a cumulative performance of at least 95% on all reported accountability measures;
• achieve a performance of at least 95% on every reported accountability measure with at least 30 denominator cases;
• have at least one core measure set that achieves a composite rate of at least 95%, and within the measure set, achieve a performance rate of at least 95% on all individual accountability measures.
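Read literally, the three criteria amount to a simple arithmetic check against reported numerator/denominator counts. The sketch below is only an illustration of that reading, not TJC’s actual scoring methodology; the data structure, the function name, and the handling of zero-denominator measures are assumptions.

```python
def is_top_performer(measure_sets, threshold=0.95, min_denominator=30):
    """Illustrative check of the three 'top performer' criteria.

    measure_sets maps a core measure set name to a list of
    (numerator, denominator) pairs, one pair per accountability measure.
    This is a hypothetical sketch, not TJC's published algorithm.
    """
    all_measures = [m for measures in measure_sets.values() for m in measures]

    # 1. Cumulative performance of at least 95% across all reported measures.
    total_num = sum(n for n, d in all_measures)
    total_den = sum(d for n, d in all_measures)
    if total_den == 0 or total_num / total_den < threshold:
        return False

    # 2. At least 95% on every measure with at least 30 denominator cases.
    for n, d in all_measures:
        if d >= min_denominator and n / d < threshold:
            return False

    # 3. At least one measure set with a composite rate of at least 95%
    #    and at least 95% on each of its individual measures.
    for measures in measure_sets.values():
        set_num = sum(n for n, d in measures)
        set_den = sum(d for n, d in measures)
        if (set_den > 0 and set_num / set_den >= threshold
                and all(d == 0 or n / d >= threshold for n, d in measures)):
            return True
    return False
```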
“This is the largest and most diverse set of data TJC has ever collected from U.S. hospitals measuring how well they are providing care for a variety of specific conditions such as heart attack, perinatal care, and children’s asthma,” noted Mark Chassin, MD, FACP, MPP, MPH, president and CEO of TJC, when announcing the findings on November 17. “This year, seven new accountability measures were added, including two new measure sets related to tobacco treatment and substance use. We also asked hospitals to submit data on six measure sets, up from four in 2013.”
While TJC-accredited hospitals have charted dramatic improvements since TJC’s core measures program debuted in 2002, the number of top-performing hospitals in this year’s report is down by 180 from a year ago. Chassin said TJC had anticipated an even steeper decline.
“The bar is higher because we added new measures and new requirements this year for the number of measures that had to be reported. We expected the number of hospitals qualifying as top performers and [meeting] this overall metric of 95% would decline a little bit, but it didn’t decline by very much,” Chassin said. “That is a bit of a surprise. I thought it would decline more.”
Chassin noted that in the first year of the “top performers” recognition program, only 405 hospitals, or 14%, qualified based on their performance in 2010. This year, more than 30% have earned the recognition, even with more measurement and reporting requirements. Chassin added that another 165 hospitals missed the top performer designation this year by only one measure.
While lauding the top performing participants, Chassin announced TJC will put the program on a one-year hiatus in 2016. During this period, he noted TJC intends to focus its attention on helping accredited hospitals transition to electronic clinical quality measures. Chassin added that TJC this month will launch a new Pioneers in Quality program focused on helping hospitals reach top performer status “in the electronic clinical quality measures world.”
Chassin acknowledged that with so many players now engaged in rating hospital quality, people are looking to multiple sources for information. However, he cautioned that not all of the measures tracked and reported hold up to close scrutiny.
“For a number of years … TJC and CMS were nearly perfectly aligned in the definition of measures and in the public reporting of those measures, and the public reporting drove a huge amount of improvement,” he said. “Those were the only data on hospital quality that were available, but now the situation is different.”
Chassin took particular issue with the way Medicare has added measures that are derived from billing data.
“We don’t believe those measures are valid measures of quality. We will also not use outcome measures … that rely on billing data to perform risk adjustment because those billing data don’t have any information on the severity of the condition, which is one of the most important things you have to adjust for when comparing different populations,” he explained.
Chassin also took exception to the ratings practice of giving hospitals a single letter grade to denote quality.
“That is just demonstrably misleading, because we know … that quality varies enormously within hospitals from one service to another and from one measure to another,” he said. “Even if you literally had great quality measures and averaged them across an entire hospital, that average would be very misleading because patients might expect to get whatever that average is in a particular service, but one service is going to be higher than average and one service is going to be lower.”
That is why TJC has refrained from coming up with a single measure to denote hospital quality, Chassin said.
“It just flies in the face of decades of research that shows the variability does not allow that kind of measurement to be accurate,” he said. “The evidence is crystal clear that quality varies quite a lot within individual hospitals from one service to another and from one measure to another.”
How can hospitals try to appease all the players in the hospital quality measurement field?
Chassin encouraged organizations to tune out the “noise” and focus on measures that are most important to their own patient populations. “The most important quality improvement that hospitals can do is to understand what risks their patients are facing, what improvements are necessary for their patients, and to act on those incentives,” he said.
“America’s Hospitals: Improving Quality and Safety: The Joint Commission’s Annual Report 2015” is available at: www.jointcommission.org/TJC_annual_report_2015.