Failed benchmarking effort teaches hospitals hard-learned lessons
Lack of focus dooms joint effort
Benchmarking can lead to wonderful improvements in your wound care program, but only if the study is done correctly. That’s the hard-learned lesson of participants in a benchmarking study that they later found lacked focus, good data collection, and follow-up.

After spending nearly a year benchmarking best practices in wound care in 1995, a group of six Southeastern hospitals abandoned the project. "We ended up with no guidelines of care from our basic results," says study participant Catherine Newhouse, RPT, director of rehabilitation services at the 588-bed Spartanburg (SC) Regional Medical Center.
"We were looking at a very broad area with very little in journals that have addressed best practices issues," notes participant Harriet Jeffords, RPT, HSHA, director of rehabilitation services at the 333-bed McLeod Regional Medical Center in Florence, SC. "But we weren’t able to draw any conclusions from the information we got."
"Essentially, we did all this work and it sounded great, but we didn’t get anything back from it," adds Donna Beck, PT, director of rehabilitation at Anderson (SC) Area Medical Center, with 579 beds.
For a year, these program managers and their hospitals coordinated a plan they believed would produce results. That plan turned out to be a cruel disappointment. Study managers identified several reasons for its failure:
• Insufficient data.
Some participating hospitals failed to collect data on the minimum number of diagnoses and had to be dropped from the study.
• Inconsistent data.
The types of wounds for which data were collected ranged too widely.
• Too many caregivers.
Too many different caregivers were involved in wound care, and not all of them had the same skill in identifying the condition of wounds according to the data collection tool.
• Lack of follow-through.
Compliance with collection slipped as time went on.

The study began with a standardized data collection tool and a lot of input from the six hospitals. Caregivers chosen to collect the data were taught how to use the tool to ensure data were collected uniformly, Newhouse says.
The hospitals also developed an initial assessment tool to identify the following (a sketch of such a record appears after this list):
• type and amount of tissue removed from the wound;
• stage of the wound;
• wound color;
• whether granulation was present;
• amount of tunneling;
• amount of undermining;
• edema;
• wound depth, width, and length in centimeters.
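To make the scope of that instrument concrete, here is a minimal sketch of how one assessment record might be represented in code. Every field name and type here is an assumption for illustration; the study’s actual paper tool was never published beyond the list above.

```python
# A minimal sketch of one wound assessment record, based on the field
# list above. All names and types are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class WoundAssessment:
    visit_number: int           # 1 = initial visit
    tissue_removed_type: str    # type of tissue removed from the wound
    tissue_removed_amount: str  # amount of tissue removed
    stage: int                  # stage of the wound
    color: str                  # wound color, e.g., "red", "yellow", "black"
    granulation_present: bool   # whether granulation was present
    tunneling_cm: float         # amount of tunneling
    undermining_cm: float       # amount of undermining
    edema_present: bool         # whether edema was observed
    depth_cm: float             # wound depth in centimeters
    width_cm: float             # wound width in centimeters
    length_cm: float            # wound length in centimeters
```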
Assessments were to be done on the first visit, at every sixth visit, and again at discharge. Treatment options were broken down into categories such as the following (see the sketch after this list):
• whether irrigation was done by hand or whirlpool;
• whether an antiseptic such as Betadine was added to the whirlpool;
• whether topical debridement was done and, if so, whether it was done chemically or mechanically with scissors or tweezers.
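The schedule and the treatment categories are simple enough to encode directly. The sketch below assumes "every sixth visit" means visits 6, 12, 18, and so on; that reading, like all the names here, is an assumption rather than a detail from the study.

```python
# A sketch of the assessment schedule and treatment categories described
# above. The reading of "every sixth visit" is an assumption.
from enum import Enum


class Irrigation(Enum):
    BY_HAND = "hand"
    WHIRLPOOL = "whirlpool"      # optionally with an antiseptic added


class Debridement(Enum):
    NONE = "none"
    CHEMICAL = "chemical"
    MECHANICAL = "mechanical"    # scissors or tweezers


def is_full_assessment_due(visit_number: int, discharge_visit: int) -> bool:
    """True on the first visit, every sixth visit, and at discharge."""
    return (
        visit_number == 1
        or visit_number % 6 == 0
        or visit_number == discharge_visit
    )
```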
Wounds were assigned a number according to their condition. This number was put into a computer for analysis. The hospitals also developed a tool for measuring the reliability of wound ratings to ensure caregivers were assessing wounds in the same way across all sites, Newhouse says.
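The article does not describe how that reliability tool worked. One standard statistic for checking that two raters score the same wounds the same way is Cohen’s kappa; the sketch below is a generic implementation offered as an illustration, not the hospitals’ actual method.

```python
# A generic Cohen's kappa for two raters scoring the same set of wounds.
# Shown as an illustration of inter-rater reliability; this is not the
# tool the six hospitals actually built.
from collections import Counter


def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Chance-corrected agreement between two raters on the same items."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must score the same, nonempty set of items")
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (freq_a[cat] / n) * (freq_b[cat] / n)
        for cat in set(rater_a) | set(rater_b)
    )
    if expected == 1.0:
        return 1.0  # agreement is total and trivially expected
    return (observed - expected) / (1.0 - expected)


# Two caregivers rate five wounds; kappa near 1.0 means consistent scoring.
print(cohens_kappa(["red", "red", "yellow", "black", "red"],
                   ["red", "red", "yellow", "red", "red"]))
```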
Despite the participants’ best efforts and training, the wound care data collection instrument proved to be too complicated, says Beck.
"I don’t think everybody consistently followed the guidelines [specifying what] needed to be done on initial assessment and post-treatment. Every time you saw someone you were supposed to measure some basic things like exudation and wound color. Some people just didn’t do it," Newhouse says.
Another problem was a lack of focus, participants say. Data should have been collected on a single wound care diagnosis, such as chronic ulcers, diabetic ulcers, or burns, Newhouse says. "We did every wound that came in."
For example, data were collected on abscesses, amputations that hadn’t healed, arterial ulcers, gangrene, and pressure ulcers. "If we had just looked at our volumes of wounds and what we seemed to have the most problems with or most patients across the sites, we could have focused our study on one wound condition and gotten information that would have been valuable," Newhouse emphasizes.
In addition, not enough education was provided to caregivers on how to collect data, Jeffords says. "It would have helped to have some type of film on data collection to provide reinforcement," she says. "We weren’t there to reinforce the practices as the study was being done."
"We learned we needed to focus our study and improve data collection," Newhouse says. In fact, some of the data were questionable. For example, one hospital reported it saw a patient 437 times and only saw a 45% improvement in the wound. "We thought, no way, some of these numbers are screwy," she continues. "It was a good idea, but we did too much." n