Mandatory patient safety training, from the board room to the linen room
Lawyers may cringe, but upfront practices work
In early December, a patient in Seattle settled a lawsuit with the University of Washington Medical Center after its surgeons left a 13-inch retractor in his abdomen. Errors happen. In this case, the hospital readily admitted its error and is working to learn from the mistake.
The outcome in Seattle was much better than the outcome of a case of medical error that hit the press in 1995, when physicians at the Dana-Farber Cancer Institute (DFCI) in Boston discovered that a Boston Globe reporter had died from an overdose of chemotherapy. Another patient experienced a similar overdose and ended up in intensive care. The two incidents cast a decidedly unflattering spotlight on the facility. But from the start, DFCI decided to be forthright with the patients, their families, the staff, and the public at large about what had happened and what would be done to ensure it didn’t happen again.
James Conway, senior vice president and chief operating officer at DFCI, says that the facility, at the time, had a typical response to medical mistakes. "When significant events occurred, we captured them and incident reports were filed. A closed group of people saw the information, and it was investigated. It was viewed as a people problem, not a systems issue. Clearly it was a culture where errors were seen as the failures of individuals."
Extraordinary confidentiality, avoiding the press, and involving risk management were typical of the old style of dealing with adverse medical events. "There was not a major focus on disclosure and learning from incidents," Conway notes.
Three things led to a change when the story of the reporter, Betsy Lehman, became public. First, Conway says, there were more than 30 separate front-page articles in the Boston papers about the mistake. Second, the patient’s husband worked at DFCI. Last, Conway says, the board of trustees was deeply shaken by the two overdose errors.
The chairman of the board was a principal at one of the largest consulting firms in the country, Conway recalls. "He was responsible for quality and audits in that company, and he knew the role of systems in errors."
No one denied that an error occurred. But the board decided a comprehensive evaluation and a change in the way it looked at errors were required. "They agreed to keep key constituencies involved in the process — patients, staff, the public. And they made a commitment not to go after staff." Conway says board members believed "they could have fired all the staff and done nothing to reduce the chance of a similar error happening again. But the board wanted to use the tragedy to leverage the organization to a different place. Our goal was to be seen within three or four years as a leader in cancer care."
With the commitment from the board, Conway, and staff, DFCI has achieved that goal, Conway says. His leadership efforts on the patient safety front led the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) and the National Committee for Quality Assurance (NCQA) to present him with their annual patient safety award in November.
New processes at work, new data collected
Rather than look at individuals, the process for dealing with medical errors at DFCI now focuses on systems defects and conducting root cause analyses. "We instantly disclose to the patient and family what we know to be true, but we also point out that we don’t yet know all the facts," Conway says.
Staff support is also key. "The nurse, the pharmacist, the physician — they are all in the middle of a nightmare when an error occurs," Conway says. "We make sure they feel supported."
A multidisciplinary team conducts a root-cause analysis, looking at environment, workload, and process factors. But discussions don’t occur only when a mistake happens; data surrounding errors are talked about regularly. Patient and family advisory committees and staff are provided with trending data at regular committee meetings.
Systems that once didn’t do anything to prevent error are now designed with error in mind. For instance, computerized chemotherapy order entry systems ensure that more than 85% of all drugs at DFCI are ordered by computer. Errors are also counted differently now. "We start counting at the pharmacy and end at the patient. We don’t just look at what happens on the patient floor," says Conway. Indeed, an increase in error reporting is celebrated. "Reporting doesn’t scare us. Instead we are concerned when reporting numbers go down. And we know that what is reported is probably just the tip of the iceberg."
Barbara Balik, EdD, chief executive officer at Allina Health System of St. Paul, MN, agrees that increased reporting of mistakes should be applauded. Gordon Sprenger, the now-retired CEO of Allina — a delivery system with 16 hospitals, 60 clinics, home care, and an ambulance service — made patient safety and error reporting such a mantra that he, too, was awarded the JCAHO/NCQA patient safety award.
Three years ago Sprenger participated in a Harvard forum on the topic and decided that patient safety should be a strategic goal. He even named a senior staff person to lead the effort, which Balik took over two years ago.
"Like most hospitals, our standard approach to errors in the past was that we felt badly, we didn’t mean for it to occur, and we’d investigate," Balik says. "And although we had moved away from disciplining people for errors, it was still more punitive a system than not. The emphasis was on individual responsibility and people were admonished to pay more attention."
Data were simply collected as a list of where errors occurred and what kind they were. Then, about five years ago, a decision was made to move to a more open disclosure with the patient and family. In part this was accepted because even risk management at Allina’s predecessor organization had a real quality improvement bent to it. "It wasn’t just about avoiding lawsuits," Balik explains.
But only the openness and disclosure changed. "For the rest, we were like everyone else, seeing error as episodic, and aren’t we happy it doesn’t happen more often," Balik says. "Error was seen as inevitable when you have many systems, staffed by humans, working with sick people."
Over time, an evolution in thinking occurred, thanks to Sprenger’s leadership, she notes. "We don’t have safe systems that sometimes fail. We have unsafe systems, and that they are as safe as they are is a credit to incredible people."
One of the first steps in fomenting change was to learn what was out there already, says Balik. "There is a whole science of error and error prevention in industries like aviation and nuclear energy that we didn’t know about." Understanding how human factors impact systems was also important. "Now we build those human factors into vendor and equipment selection."
Changing attitudes is much harder. A recent culture survey sent to all clinics and hospitals in the Allina system asked respondents about the messages they heard from leaders, about the value of a safe environment, and why errors occur. A sample statement: "I understand that errors are the result of a complex system failure." Staff members were asked to rate how strongly they agreed or disagreed. Half thought the statement was true; half didn’t.
"That’s in an organization where we have been talking about this for more than two years," says Balik. "It obviously gives us an area to work on."
That kind of data — what staff think about patient safety — would never have been considered important in the past, let alone sought out and tallied. Using the data has been easy. "Now there will be mandatory training in patient safety for everyone from the linen room to the board room. We want to have all eyes and ears open."
Balik calls old reporting systems for patient safety "weak" and says one change has been to integrate patient safety and risk reports. "Now we collect information on an issue whether it reached the patient or not, whether it caused harm or not." And to help in the culture change, errors that don’t reach or harm the patient are no longer called "near misses," but "good catches" instead.
The label is positive, and reporting such events is meant to be viewed as positive, too. To a degree, it must be working, says Balik, who reports that one Allina hospital had a goal of collecting 400 patient safety reports in three months. "They got more than that and want to use them to find out all they can about system flaws."
The problem that remains in many other facilities, however, is that errors that don’t reach the patient aren’t seen as important, despite all they can teach a hospital and its staff about system problems. Allina is also trying to figure out the costs of errors. "We took a sampling of errors and did a severity-adjusted comparison," Balik explains. "The only difference is that in cases where a patient safety report was filed, the average length of stay was five days longer and charges per case were $14,000 higher."
It’s that kind of data collection, collation, and distribution that Balik thinks will assist in her ongoing efforts to make patient safety a priority.
Another data effort overlaid patient safety data with length-of-stay data. That project showed that where there were opportunities for improvement in one area, there were opportunities in the other; the two coincided beautifully, Balik says, making it possible to integrate patient safety with care improvement activities. Medical directors were just starting to review the data at press time.
Letting it all hang out
Like Conway and DFCI, Allina provides the data to anyone who wants them — and probably many who don’t. "My role is to keep this in front of the medical staff," Balik says. "I constantly educate peers and medical staff about it."
And there is great appreciation for it too. For instance, at the nursing education day that Allina has for nursing leadership and shop stewards, having good, hard, understandable data helps to promote change. "It reinforces to them that this is a systemic issue, not a people issue," she says.
"Gordon deserves the lion’s share of the credit for this," says Balik of her predecessor. "He got people involved and took on a real leadership role." One of the loudest and most symbolic actions Sprenger took, says Balik, is that after those early Harvard sessions he asked risk management to phone him ever time there was a serious injury or patient death.
"When he got that call, he’d call me, ask about what happened, ask how everyone — including the patient’s family and caregivers — were doing. No one wanted to get those Gordy calls.’ But he was a good role model. He called staff at home. He put a lot of attention on this issue with a small act. And sometimes, those small things make a big impact."
Error reduction not immediate
The work is never done. There hasn’t been any reduction in the number of errors reported yet, but Balik doesn’t want to see that anyway. "We want to increase reporting and decrease the number of errors getting to patients," she says. "Two years ago, I would have expected we would have seen error reduction by now, but now I’m patient enough to wait some more. But I will be disappointed if we don’t see a reduction next year."
Despite that, there are visible successes. The Institute for Safe Medication Practices has a standard assessment of best practices. Allina has implemented those practices, and some types of errors are no longer an issue, says Balik.
Training has changed so that the focus is on the role of a team member, not the person. "In the [emergency department], it’s a nurse, a physician, and a tech person like an [emergency medical technician]. We tell people what the expectations are when they are in a certain role." The new system was launched in October and will be rolled out to high-risk teams such as those in the emergency department and the ICU, Balik explains.
Conway says the engagement of executive leadership is critical in making such changes work. "They have to accept that patient safety and protection and reduction of harm is important," he says. "All else will flow from that."
In addition, getting patients and their families involved helps to change the culture at a hospital. "The patient needs to know that you are interested in them and hearing what isn’t working well. They are a real key in risk management. Often when we bring up an issue we think is new, they tell us it’s about time."
Convincing the lawyers — and even some hospital executives — that openness in discussing and reporting errors isn’t bad can be difficult. "But we talk about errors and patient safety all the time and nothing terrible happens. We have this notion that when you disclose an error, someone sues you or someone will tell the paper. But we have six years of experience saying that is largely not the case."
It’s just one of many myths that Conway is constantly challenging. (For more information on some of the unlikely truths about errors that Conway has learned, see "9 things you should know about medical mistakes," below.) "I get the question over and over again about how we can do it this way. But it was easy for us. We had an opportunity to reinvent ourselves and learn from this tragedy. We had the tension for change thrust upon us. For others, it might be harder to create that tension. Leaders like what we are doing, they respect it. They don’t necessarily think I’m nuts any more. But they have the problem of how to move it forward when there are other concerns guiding the organization."
Conway even created a tool to help facilities make the changes necessary to improve patient safety and error reporting. (See "Excerpt of Patient Safety Awareness Self-Assessment Tool," below. A complete version is available from the American Hospital Association by calling (312) 422-3000.)
But despite a change in the body language that Conway sees when he gives talks on patient safety, there is still a long way to go. Balik says that when US News & World Report did a story on patient safety last year, the reporter told her that no one else in the country wanted to talk about it. "We were seen as loony for talking about this and owning up to our mistakes," she says. Maybe next year it will be different, Balik says, noting that the Harvard sessions that so influenced Sprenger are now being held outside Harvard for the first time. "We are replicating it here in Minnesota. So far, we have had two sessions, and leaders are acknowledging that safety isn’t a competitive issue. We don’t have to be secretive about what we know and don’t know."
[For more information, contact:
Jim Conway, Senior Vice President, Chief Operating Officer, Dana-Farber Cancer Institute, 375 Longwood Ave., Fifth Floor, Boston, MA 02215. Telephone: (617) 632-2158. Barbara Balik, EdD, CEO, Allina Health System, United Hospital, 333 N. Smith Ave., St. Paul, MN 55102. Telephone: (651) 220-8816.]
9 things you should know about medical mistakes
- You have plenty of errors and near misses.
- Error reports will initially go up in your journey.
- Firing staff and writing new policy will do little to reduce error.
- Disclosure of error is good for all involved.
- Patients often already know what’s going on.
- Root cause analysis is more powerful than one-on-one investigation.
- Very little is confidential.
- You can talk openly about error, survive, and thrive.
- The things you learned yesterday should inform, but not drive, today. It’s a very different day.
Source: James Conway, Dana-Farber Cancer Institute, Boston.
Excerpt of Patient Safety Awareness Self-Assessment Tool
Editor’s Note: Use this list as a checklist. For a complete copy of the tool, contact the American Hospital Association at (312) 422-3000.
- Openly engage with medical staff, nursing, and other leaders in patient safety planning.
- Continuously articulate the business case for safety improvement.
- Personally participate in a significant incident investigation/root cause analysis.
- Tell "my story" around incidents/errors that I have been involved with and the systems improvements that could have prevented them.
- Routinely involve myself, all levels of our staff, and our patients and family members in direct and ongoing communications around the patient safety work of our institution and areas for improvement.
- Routinely bring patient safety matters, trending data, and specific cases to the board and other hospital leadership committees.
- Routinely probe staff perceptions of risk areas from existing or proposed systems and take immediate actions wherever possible.
- Openly support staff involved in incidents and their root-cause analysis.
- Ensure that there is ongoing prioritization and achievement of safety improvement objectives.
- Ensure that articles on patient safety matters regularly appear in my organization’s communications vehicles.
- As part of annual budget preparation, ensure resources are funded for priority safety areas.
- Request and routinely receive reports on facility utilization of and comparison with best-practice information from the American Hospital Association, National Patient Safety Foundation, and Institute for Safe Medication Practices.
- Ensure self-assessments from the AHA and others are completed and used internally for quality improvement activities.
- Cultivate media understanding of patient safety and my organization’s efforts to improve safety.
- Ensure effective systems are in place to assess individual accountability and competence.
Suggested Readings on Patient Safety
- Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
- Reason JT. Human Error. Cambridge University Press; 1990.