Assess the quality of your core measure data: What you find may be surprising
Now that data are being made public, you can’t afford mistakes
Imagine hearing this as part of a competitor’s advertising campaign: At your hospital, fewer pneumonia patients receive antibiotics within recommended time frames. And your facility boasts the highest inpatient mortality rate for heart attacks.
There is just one catch: These claims are based on inaccurate, misleading core measure data submitted by your organization. Now, your competition is using your own data against you. Regardless of what damage control you attempt, lasting harm has been done to the public’s perception of the quality of patient care provided at your facility.
This is just one of the problems that can occur if you don’t keep a close watch on the accuracy of your core measure data. As of January 2004, hospitals are required to report on three of the following four core measure sets developed by the Joint Commission on Accreditation of Healthcare Organizations:
1. acute myocardial infarction;
2. heart failure;
3. community-acquired pneumonia;
4. pregnancy and related conditions.
In 2003, the Joint Commission conducted site visits to 30 hospitals to assess the interrater reliability of different abstractors collecting the same core measure. During the site visits, Joint Commission staff abstracted core measure data elements and compared them with the original data previously submitted by hospitals — and discovered significant discrepancies.
"We found that the vast majority of hospitals that do core measures don’t do any kind of formal performance assessment and improvement processes focusing on the core measures data collection and abstraction functions," reports Ann Watt, MBA, RHIA, associate project director in the division of research with the Joint Commission and the study’s principal investigator.1
She points to a lack of resources for collecting and analyzing core measure data, which forces quality managers to come up with creative solutions to meet the requirements.
"In many cases, additional resources for this function have not been made available. Instead, this has been heaped onto the quality manager’s other responsibilities," Watt says.
A strong incentive for improving core measure data is the knowledge that the general public and other organizations will have access to them, with the implementation of the National Voluntary Hospital Reporting Initiative — an effort led by the American Hospital Association, the Federation of American Hospitals, and the Association of American Medical Colleges to provide information about hospital quality to the public.
"Who knows how these data are going to be used? And so it’s very important that hospitals have the assurance going forward that their data are accurately represented in these public releases," Watt says.
More than 1,700 hospitals have agreed to share data for the initiative’s 10 quality measures. The first set of clinical process data was publicly released in October 2003, giving the general public access to hospitals’ performance in treating heart attacks, heart failure, and pneumonia.
In addition, the Joint Commission will begin making data on heart attack, heart failure, pneumonia, and other measures publicly available this summer as part of its new Quality Reports.
There is a significant potential problem with consistency of core measure data between facilities, notes Steve Osborn, CPHQ, vice president of clinical quality and patient safety at Saint Vincent Health Center in Erie, PA.
"For instance, when you abstract what time the patient arrived at the hospital, you’re supposed to use the first time in the chart. That could be a triage time or could be a registration time," he says. "Is every hospital doing this the same way? Undoubtedly not."
Still, it’s in your own best interest to improve accuracy of core measure data, Watt urges.
"Hospitals spend a fair amount of time and resources collecting these data. Presumably, they would like to know they’re accurate if they are going to be making clinical, system, and management decisions based on those data," she says.
If your data aren’t accurate, you may be allocating resources to the wrong areas, Watt warns. "People are focusing ever more limited resources in areas that are not of concern and are missing areas of clinical improvement and real impact on the clinical care they give," she says.
To improve the quality of core measure data, use the following strategies:
• Develop a formal interrater reliability process.
The Joint Commission study revealed that hardly any of the hospitals conducted rigorous interrater reliability studies. At regular intervals, a random sample of your cases should be re-abstracted by an independent reviewer and the results compared, Watt says.
"That seems, at this point, to be the most effective way of assessing core measure data," she says.
The reviewer doesn’t have to be an external person, but Watt acknowledges that many hospitals have only one quality person on staff.
"Some hospitals are being creative by swapping cases to review if they are part of multihospital systems, or if they have particularly cooperative consortiums in their region," she reports.
Watt points out that, in sharp contrast to the lack of formal assessment of core measure data, 75% of the hospitals in the study reported that they routinely formally assessed their coding performance.
You should pay equal attention to core measures, since their impact is going to be every bit as significant, Watt advises. Before diagnosis-related groups came into existence, she adds, there wasn’t much concern with medical records coding because it had nothing to do with hospital reimbursement.
"Nobody back then was assessing the accuracy of their coding. But suddenly, it had huge consequences and became a very important function," she says. "I suspect that something similar is going to happen with core measures."

Education is twofold

• Educate clinical staff.
A twofold goal at NorthEast Medical Center in Concord, NC, is educating clinical staff about the methodology behind the indicators and soliciting input from caregivers for strategies to make the data more accurate, says Pam Spach, RN, BSN, CPHQ, director of performance improvement and disease management.
"Improving the quality of core measure data submitted to JCAHO is a very important goal for us," she says. "It is quite clear that those with the most influence to effectively improve our outcomes are those closest to the point of care."
These steps were taken to educate clinical staff about core measures:
1. Clinical directors give one-on-one inservices to unit staff about the core measures and how they are being addressed. "They focus on the clinical practice guidelines, which address all of the indicators for the core measures we are following," Spach says.
2. Quarterly updates on core measure data are given at medical staff, leadership, quality council, and board meetings.
3. Core measure results are posted on the hospital’s web site.
4. Clinical processes have been "hard-wired" to meet specific core measure goals, such as giving pneumococcal vaccine to pneumonia patients.
"This was established as a nurse-driven protocol, as opposed to being dependent upon physician order," Spach reports. "Our results have dramatically improved, increasing from 50% to 93% over the past year."
• Make sure that abstracted data are collected consistently.
Abstracting staff at Saint Vincent Health Center discovered a problem while measuring timeliness of the first dose of antibiotics for the pneumonia core measures.
"There are other core measures that have timeliness as a component, but this one is the most particular because the Joint Commission is actually tracking average times," Osborn explains. "You need to have times under four hours, which actually isn’t all that easy. Most of your patients will be under four hours, but a lot will be close to that or over that."
Therefore, getting the exact time is very important, as compared with tracking whether patients receive aspirin within the first 24 hours, which requires only a yes or no response, he says.
When staff began collecting data, they simply documented the first time that they found on the chart, Osborn points out. "But we found out that antibiotics could be located in any one of four different places in the medical record." Those places were:
1. The medication administration record, where the time is rounded to the nearest hour. "It would say the patient got [cephalosporins] at 1400. Well, there is almost an hour difference between 1331 and 1429," he adds.
2. The time the medication is removed from the automated medication dispenser. "This is not exactly the time that the medication was administered, since the nurse might have gotten waylaid, especially in the ED," Osborn explains.
3. The nursing notes. "Nurses can enter a time there, but they are often writing the note hours after the drug was administered," he reports.
4. The ED worksheet, which documents physician orders.
"When we started collecting data, we found two or even three different times for the same drug," Osborn says.
"So the question is, which is the most correct, and are any guaranteed to be correct? It turned out that there are problems with each and every one of them," he adds.
To address the problem, the abstracting staff met with direct caregivers and decided that the automated medication dispenser time is the most accurate. "It may be slightly off, but at least it is consistent, as compared with nursing notes, which are often done after shift, when they go back and pull charts and document things they did hours ago," Osborn explains.
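Once a single authoritative timestamp is agreed upon, the timeliness calculation itself is straightforward. The sketch below assumes arrival and automated-dispenser times are available as datetime values; the patient data are invented, while the four-hour target and the tracking of average times follow Osborn’s description above.

```python
# Minimal sketch of the antibiotic-timeliness calculation using the
# automated-dispenser timestamp as the agreed-upon source of truth.
from datetime import datetime, timedelta

TARGET = timedelta(hours=4)

cases = [
    # (case ID, ED arrival time, dispenser time for first antibiotic)
    ("P1", datetime(2004, 3, 1, 13, 5), datetime(2004, 3, 1, 16, 40)),
    ("P2", datetime(2004, 3, 1, 9, 50), datetime(2004, 3, 1, 14, 10)),
]

elapsed = []
for case_id, arrival, first_dose in cases:
    delta = first_dose - arrival
    elapsed.append(delta)
    status = "OVER TARGET" if delta > TARGET else "ok"
    print(f"{case_id}: {delta} to first antibiotic ({status})")

# The Joint Commission tracks average times, not just pass/fail counts.
average = sum(elapsed, timedelta()) / len(elapsed)
print(f"average time to first antibiotic: {average}")
```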
• Make sure patients are not falling through the cracks.
When case managers began collecting real-time data for pneumonia core measures, they reported that they were following 95% of patients.
"It turns out they had data on 88 patients in five months, but our data showed 207 pneumonia patients in the same time period," he says.
The discrepancy was discovered when the abstractors reviewed the patients after discharge based on ICD-9 codes as required by the Joint Commission, and came up with a total of 207 pneumonia patients, as opposed to the 88 reported by the case managers.
"So the case managers may be actually influencing care on 40% of the patients, as opposed to 90%," Osborn says. "For us, it was a big surprise. We thought we would be doing great on these measures, but 40% are still getting the same care they always got."
It was determined that patients were falling through the cracks for a variety of reasons:
1. Some patients were admitted on a Friday and discharged by Monday morning. "The number was higher than they thought it would be," Osborn says. "It begs the question, why are we only case managing Monday through Friday?"
2. Some pneumonia patients were admitted to the telemetry unit because of concerns about cardiac status. This was resolved by making the cardiac case managers aware of the pneumonia initiative, Osborn says.
3. Some patients were coded after the fact as having pneumonia. "This raises the issue that all these initiatives are driven by coding data, and coding is not reflective of actual patient status," says Osborn. To address this, one of the physician team members reviewed a sample of the charts and met with coding staff. "The presence of the pneumonia code was generally found to be accurate when our physician sat down and reviewed these, but there was a debate as to whether that code should have been in the principal diagnosis," he explains.
The problem is that in addition to pneumonia, the patient may have been treated for other conditions, such as congestive heart failure. "Coding requires you to select one of those codes to be the principal diagnosis, the principal reason for the patient coming to the hospital," Osborn says. "And that is not always apparent, even by coders reading all the documentation as faithfully as possible."
To be included in the collection of core measure data, the patient needs to have a principal diagnosis of pneumonia, he explains. "Perhaps the discharge summary wasn’t dictated at the time they were doing coding, and the answer would have been there. Or maybe one doctor was focusing on pneumonia and another was focusing on the congestive heart failure, so it’s not really clear to the coder," he says.
The case managers had three goals: to ensure that pneumonia patients got the blood culture on admission and received one of the selected antibiotics; to encourage staff nurses to hang the antibiotic IV as soon as possible; and to make sure that patients got appropriate vaccines.
"So if the case managers are there and are aware of who the patients are, they can influence some of the data, but it turns out they may not be seeing all of them," Osborn says. "My guess is that hospitals trying to abstract in real time are going to have the same problems we had and will find out after the fact that they are missing significant chunks of patients."
Because of this, Osborn says, to meet the core measure requirements accurately, data must be abstracted after coding is complete, which for most hospitals means after the patient is discharged.
"We are not convinced that there is great value in collecting data in real time," he says. "However, we strongly believe that case managers or other staff need to be following these patients in real time and influencing the care they receive."
[For more information on improving the quality of core measure data, contact:
• Steve Osborn, CPHQ, Vice President, Clinical Quality and Patient Safety, Saint Vincent Health Center, 232 W. 25th St., Erie, PA 16544. Phone: (814) 452-7378. Fax: (814) 455-1524. E-mail: [email protected].
• Pam Spach, RN, BSN, CPHQ, Director, Performance Improvement & Disease Management, NorthEast Medical Center, 920 Church St. N., Concord, NC 28025. Phone: (704) 783-4009. Fax: (704) 783-2080. E-mail: [email protected].
• Ann Watt, MBA, RHIA, Associate Project Director, Division of Research, Joint Commission on Accreditation of Healthcare Organizations, One Renaissance Blvd., Oakbrook Terrace, IL 60181. Phone: (630) 792-5944. E-mail: [email protected].
A tool kit including a sample Heart Attack Discharge Form can be downloaded at no charge at the American College of Cardiology web site: www.acc.org. Click on "Guidelines in Applied Practice," "Guidelines Applied in Practice Program," "Acute Myocardial Infarction in Michigan," "Download the AMI GAP Tool Kit."]
Reference
1. Watt A, Williams S, Lee K, et al. Keen eye on core measures: Joint Commission data quality study offers insights into data collection, abstracting processes. Journal of AHIMA 2003; 74(10):20-25.