Do you need to do an RCA on your RCA?
Make sure your analysis can fix a problem
Imagine a very busy hospital where there is a series of wrong-site surgeries. The hospital risk manager does a root-cause analysis (RCA) and figures the surgeons should implement a checklist. The quality director finds one online, instructs the chief nursing officer that all the surgical nurses should make sure to use it, and the hospital goes about its business. But the problems persist. The newspapers get wind of it. The CEO isn't happy. The board is less happy. Don't even ask about the patients.
A consultant is brought in to unsnarl the problem and figure out what's wrong. In the eyes of leadership, it's a matter of flushing out the bad actors, and fast, because time is money and bad publicity isn't good for business. In her interviews, the consultant hears something that cements her feeling that the problem has nothing to do with the individuals in the operating room. Those checklists? The nurses were filling them out before the patient got into the theater because doing so saved time. And time was money. Just ask the CEO.
That fictional example might sound extreme, but it highlights issues that can exist in real-life hospitals.
"Doing a root-cause analysis is more than just finding a problem and creating a solution," says Rhonda Filipp, RN, MPA, director of quality and patient safety for the California Hospital Patient Safety Organization (CHPSO) in Sacramento. "You have to have some way to check if that corrective action plan was put in place and whether it addresses the problem."
A quality and scientific perspective
A checklist is a fine thing if it's used properly. Interviews with nurses and audits of charts could determine whether a checklist in an operating room is being used, and being used at the right time, before wrong-site surgeries start piling up, she says.
Filipp notes a dramatic shift in the last two decades: adverse events were once handled by the risk management team, then became a patient safety focus, and are now viewed from a quality and scientific perspective. Those changes can be hard to digest. There is new lingo to learn, there are scientific principles to teach to people who never expected to need them in their jobs (or to apply them in a different manner), and there is a commitment to seeing change over time, not necessarily today or even tomorrow.
"If you have an event happen one day, and you start your root-cause analysis the next, a thorough process isn't going to be over quickly," Filipp says. "You can do a causal analysis and a whole lot of work understanding what happened and developing a corrective action plan. But by the time you get there, it is three weeks later, and the emotional response you had right after an event from the people involved — well, that has shifted to the other things that are grabbing their attention."
Addressing causal analysis
So what can you do? Filipp offers the following advice for conducting a successful causal analysis:
- First, she says, you can't stop at the first, obvious layers of causes. Keep working your way to the middle of the onion. The low-hanging fruit, the easy stuff? Sure, grab it. But don't assume that's all there is.
- Develop a corrective plan that includes a way to prove it was implemented. That might mean chart audits, interviews, or observations. If you have a checklist, you can see whether it is done. But if you want it done at a particular time, you either need to do it on a computer where it can be time-stamped (see the sketch after this list), or you will have to rely on the trustworthiness and willingness of people to speak up and speak out. If you don't have a culture that encourages and embraces that, it might be difficult.
- Not all things are easily measurable. "What if one of the things you want to do is encourage better communication?" she asks. "How do you measure that? It might mean creating surveys or walking through rounds and watching people and taking notes, and it might not be strictly scientific. That doesn't mean it's not valid."
- Include a timeline for implementation and periodic status reports.
- Have someone overseeing the efforts. One hospital where Filipp worked had a multidisciplinary committee in charge of root-cause analysis. The committee determined whether the causal analysis was complete and acceptable, approved the proposed fixes, looked at the evidence that those fixes were being implemented, and eventually reviewed the data to see whether the solutions were working.
Committee members also determined how long to keep a pilot project going and served as resources if something needed tweaking. (This assumes you have a quality- and safety-primed culture, where leadership encourages such activity.) When someone was late with a report, the committee was able to "hold their feet to the fire" and keep the team accountable for their role in improvement, Filipp says.
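To make the time-stamp point in the checklist item above concrete, here is a minimal, hypothetical sketch, not anything from Filipp or CHPSO, of how an automated chart audit might compare electronic checklist time stamps against procedure start times. The record fields, case IDs, and the 60-minute window are all illustrative assumptions.

```python
# Hypothetical sketch: audit electronic checklist time stamps against
# procedure start times. The record layout and the 60-minute window are
# illustrative assumptions, not an actual hospital data model.
from datetime import datetime, timedelta

records = [
    # (case ID, checklist completed at, procedure started at)
    ("CASE-001", "2023-03-01 07:15", "2023-03-01 07:30"),
    ("CASE-002", "2023-03-01 06:00", "2023-03-01 09:45"),  # filled out hours early
    ("CASE-003", "2023-03-01 08:40", "2023-03-01 08:20"),  # filled out after the fact
]

MAX_LEAD = timedelta(minutes=60)  # assume policy wants completion within an hour of the start


def parse(ts: str) -> datetime:
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")


for case_id, checked_at, started_at in records:
    checked, started = parse(checked_at), parse(started_at)
    if checked > started:
        print(f"{case_id}: checklist completed AFTER the procedure started")
    elif started - checked > MAX_LEAD:
        print(f"{case_id}: checklist completed {started - checked} before the start -- too early")
    else:
        print(f"{case_id}: OK")
```

In practice the comparison would run against whatever the electronic record actually stores, and the threshold would be whatever policy requires; the point is simply that a time stamp turns "was the checklist done at the right time?" into a question an audit can answer.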
Filipp has developed many tools for creating an effective, evidence-based analysis, and she would love to share them with anyone who is interested, but many hospitals view such tools as proprietary. She is hopeful, however, that in the near future there will be a single place where quality managers can go to find anything they need related to hospital quality.
"The California Hospital Association is launching the Hospital Quality Institute in January, and we are working to make a lot more of this kind of thing available," she says. "We want the website to be the Google of hospital quality."
Until then, she says, your patient safety organization and state hospital association may have some tools, possibly some of the very ones she developed, that can help you ensure your causal analysis has measurable, provable aims.
The Hospital Quality Institute website will be at hqi.org and goes live in late January.
For more information on this topic, contact Rhonda Filipp, RN, MPA, Director, Quality and Patient Safety, California Hospital Patient Safety Organization (CHPSO), Sacramento, CA. Email: [email protected].