What if physicians don't believe your quality data?
Your reaction will determine whether you obtain buy-in
Have you ever presented physicians with carefully analyzed data to demonstrate areas in need of improvement, only to discover that your findings are disputed?
If physicians don't believe in the validity of your data, they won't use the findings to change their practice or address the systems of care that result in missed opportunities to provide guideline-based care, says Dale W. Bratzler, DO, MPH, medical director of the Oklahoma Foundation for Medical Quality in Oklahoma City.
If you fail to address improvement opportunities identified in your organization's performance measure rates, care may remain inconsistent with guideline recommendations, and your hospital could show up as an outlier, says Bratzler.
"Perhaps the most common pitfall or mistake for quality professionals is to try to promote compliance with national performance measures without carefully working with the medical staff to get buy-in and working to find physician champions for the clinical topics," says Bratzler. "My perception is that those hospitals for which there is medical staff buy-in to the concepts being measured do better on the quality measures."
Having a physician champion to support the work of quality professionals can be particularly helpful in promoting best practices. "Using resources from the physician's own specialty societies, most of which have endorsed aspects of the core measures, is useful," says Bratzler.
Another pitfall is "blindly believing" that every single case should pass the performance measure, says Bratzler. While performance rates can be very high, with benchmarks near 95% for most measures, the measure specifications do not account for every possible clinical exception. Compliance should therefore be high, but the target may not be 100% for many measures.
"On occasion, a case will fail the performance measure for legitimate clinical reasons," says Bratzler. "To achieve perfect care on a measure, you have to have a perfect measure. And we do not."
Benefits of buy-in
If physicians believe that quality data are timely and valid, it prompts them to find ways to provide better care, says Debbie Kiser, RN, MBA, CPHQ, director of clinical quality management at the Charleston (WV) Area Medical Center. "Some may be unaware of their own outcomes or data involving a specific issue. They may strive to become better due to competition with their peers," she says.
Buy-in from physicians encourages them to become involved in necessary changes to promote quality patient care. It also means they are more aware of what information is being measured externally and presented to the public regarding patient care at your organization, says Kiser.
At Mission Hospitals in Asheville, NC, the implementation of The Joint Commission's new requirements for ongoing and focused evaluation of practitioners has created "a surge of physician-level reporting," says Tom Knoebber, CPHQ, Six Sigma Black Belt and director of performance improvement.
"By far, the largest issue with physician data is clean attribution," says Knoebber. Health care is becoming more of a team activity, and rarely does a single physician provide the entire treatment plan for a patient, he explains. A consultant, for example, might impact a patient's complication rate, length of stay or even cost of care.
"Without physician buy-in for their performance data, there will always be an 'out' that they were not responsible, regardless of how well you promote the 'captain of the ship' mentality," says Knoebber.
At Mission, physicians currently receive a semi-annual report card, with a limited number of indicators for the areas of service quality, technical quality, and citizenship.
"The current process is somewhat manual, using a data export file to an Excel macro," says Knoebber. "With the implementation of computerized physician order entry, we hope to have more procedural-level data and less case-level aggregate data at the physician level. That should minimize the attribution issues."
Get physicians to educate others
When data have been questioned by physicians at Wellspan Health in York, PA, the measure is first reviewed, including operational definitions with the inclusions and exclusions. "That has always proven enlightening to the clinician," says Susan P. Nelson, MBA, RHIA, CPHQ, director of quality management.
"They tend to think within the parameters of how they treat the patient, but not with all the 'if A, then B' type of exclusions that are incorporated within valid measures," says Nelson.
Second, a sampling of the primary source documents is pulled and reviewed with the questioning clinicians. "That allows for clarifications and teaching moments, concerning what needs to be documented to allow the clinical care delivered to be 'counted,'" says Nelson.
These clarification and validation efforts have proved beneficial to both the care provider and the data collector. "These episodes have resulted in clinician 'storytelling' at other meetings and forums where data are presented," says Nelson.
For example, physicians were unclear about all the components and operational definitions for the Centers for Medicare & Medicaid Services (CMS) quality measures for heart failure discharge instructions. "I spent significant time with one of our physician leaders to be sure that he was clear that when discharging the patient his documentation needed to include both instructions for things to do and documentation about why he would be excluding the patient from the guideline," says Nelson.
As that physician's compliance with the measure got closer and closer to the top 10% benchmark, he shared his story with members of his medical staff department. "That storytelling did more to improve the overall compliance with the recommended guideline than any prior education and reminders, such as the infamous laminated pocket cards," says Nelson.
What to do when challenged
If any one piece of data is suspect, then the entire report is viewed as unreliable and unacceptable by that physician, and usually by the group as well. "We have had this problem in our institution, and that specific report has not been used," says Kiser.
The report involved CMS measures and attributed numbers to "responsible physicians" per specific definitions. "Several physicians did not think that certain measures should be assigned to them, so they thought the data were incorrect," says Kiser. "We have taken this information to specific groups, such as hospitalists, which use it to review their members for quality and give possible bonuses. We are also using these measures as indicators for some departments for use in credentialing."
Quality professionals at Charleston Area Medical Center work hard to validate data prior to distribution and, most importantly, explain the data and the methodology behind them, says Kiser. "I believe that explanation of new reports and/or data is crucial to acceptance," she says.
Many times, it is not that the data are incorrect, but that the methodology is unfamiliar, and thus, not readily accepted by physicians. "In these cases we may begin with small groups, using the data for other work so that it becomes more familiar and, hopefully, acceptable," says Kiser. "This is the process we are now using for the aforementioned report."
Unfortunately, ensuring that data are accurate is a time-consuming chore that is difficult to do with limited resources. "We have no tricks for this — just internal validation processes. This may delay the distribution of the data somewhat, but assures acceptance by the physicians," says Kiser.
For example, a process for internal validation comes into play if the organization falls below a certain percentage during an external validation. "We have a team for each focus area of the CMS measures. Members of the team review certain cases each month after abstraction to assure accuracy," Kiser says. "Since the average abstractor reviews approximately 400 to 450 charts a month, we understand that we are human and mistakes can occur."
For example, an abstractor might miss a physician's documentation of a contraindication to an angiotensin-converting enzyme (ACE) inhibitor at discharge, so the case is noted as a failure for that indicator. When the team reviews the chart, they find the documentation, which they communicate to the abstractor. The abstractor reviews that part of the chart again and changes the answer within the collection tool.
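A simplified sketch of the indicator logic in that example appears below. The field names are hypothetical, and the assumption that a documented contraindication excludes the case from the denominator is a common scoring convention rather than the exact CMS abstraction specification.

```python
# Simplified, hypothetical re-check of the ACE-inhibitor indicator described
# above. Field names and the exclusion convention are assumptions for
# illustration, not the CMS abstraction specification.

def score_case(case):
    """Return 'excluded', 'pass', or 'fail' for one abstracted case."""
    if case.get("ace_contraindication_documented"):
        return "excluded"   # drops out of the denominator; no longer a failure
    return "pass" if case.get("ace_prescribed_at_discharge") else "fail"

# Initial abstraction: the contraindication documentation was missed.
case = {"ace_prescribed_at_discharge": False,
        "ace_contraindication_documented": False}
print(score_case(case))     # fail

# After the team's review, the abstractor re-reviews the chart, confirms the
# documentation, and corrects the answer in the collection tool.
case["ace_contraindication_documented"] = True
print(score_case(case))     # excluded
```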
On the other hand, the team may communicate what they think is missed documentation that turns out not to be acceptable. "That is why, when contacted regarding an inaccuracy, the abstractor re-reviews the record," says Kiser. "The teams are not as knowledgeable of the abstraction guidelines. So they may inaccurately identify a 'correction.' We spend a great deal of time trying to verify an answer, either way, when there is a question from a team."
With other databases, such as those from the Society of Thoracic Surgeons and the American College of Cardiology, a quality nurse double-checks the data against the organization's internal data warehouse before distributing them to physicians. In some cases, physicians review data themselves. "We have an emergency department physician who reviews all 'failures' regarding CMS indicators and responds to each," says Kiser. "There is also a physician closely involved in the sepsis data collection review."
Physicians are encouraged to contact the quality management department if they believe any information to be incorrect or misleading. "We will re-review the data, or provide them with the case numbers so they can review it themselves," says Kiser.
[For more information, contact:
Dale W. Bratzler, DO, MPH, QIOSC Medical Director, Oklahoma Foundation for Medical Quality, 14000 Quail Springs Parkway, Suite 400, Oklahoma City, OK 73134. Phone: (405) 840-2891, Ext. 209. Fax: (405) 840-1343. E-mail: [email protected].
Debbie Kiser, RN, MBA, CPHQ, Director, Clinical Quality Management, Charleston Area Medical Center, 3200 MacCorkle Ave, SE, Charleston, WV 25314. Phone: (304) 388-8014. E-mail: [email protected].
Tom Knoebber, Director, Performance Improvement, Mission Hospitals, 509 Biltmore Avenue, Asheville, NC 28801. Phone: (828) 213-9194. E-mail: [email protected].
Susan Nelson, MBA, CPHQ, RHIA, Director, Quality Management, Wellspan Health, 45 Monument Road, York, PA 17403-5070. Phone: (717) 851-2003. E-mail: [email protected].]