Smart Training Can Prevent Problems with New Technology
Common Pitfalls tool created
Adopting new electronic submission technology across a research enterprise organization can prove to be challenging for IRBs, which have a long list of stakeholders to train and educate. But as one research health organization discovered, mistakes made along the way can be as instructive as any of the successes.
“It started at a meeting of IRB staff, where we brought to the table similar complaints,” says Pamela Johnson, MPH, research education and quality improvement specialist at Hartford HealthCare in Hartford, CT.
“We asked every IRB staff member to send me their top problems, mistakes people make consistently,” Johnson says. “Each administrator sent me a list and I noticed some themes, so I sent the same request to all [electronic submission system] super users and asked for their top complaints.”
Johnson learned that the biggest complaint had to do with documents not being attached properly or in the right place.
“People were attaching the wrong documents to the informed consent tab, or they were naming things in strange ways, maybe using the name the sponsor had on it, and we didn’t know whether it was the informed consent,” she explains. “Then we’d have to find the document and put it in the right section, which was a hassle.”
There also was confusion about how to modify a document that was already in the system, Johnson says.
The program has a check-in/check-out feature that stamps documents approved by the IRB. But many investigators were accessing earlier versions of the consent document, thinking it had been approved when it was the version prior to approval, Johnson explains.
“The problem was the coordinator wasn’t using the check-in/check-out procedure, so they were making changes to the wrong version and it was really hard for IRB administrators to track,” she says.
The educational solution was to address each of these common mistakes in a “Common Pitfalls” tool that was sent to super users for vetting and refinement. Then the IRB created a supplemental user’s manual with step-by-step guidance on how to identify and prevent each pitfall. The following are some examples of the common pitfalls:
• Incorrect response to stipulations. IRB administrators sent investigators all requests for clarification in the form of stipulations, Johnson says.
For example, if an investigator wanted to add a person to the study, IRB coordinators would check to see if the person had completed all CITI training. If the additional research staff person had not completed the CITI coursework, the IRB would send a notice to the investigator, asking him or her to please resubmit after the CITI coursework was completed, she explains.
The investigator would see that the person had completed the coursework and then return the request without making any changes to the submission itself, leaving the stipulation response incomplete.
“When investigators had trouble doing this, we’d send them this document and highlight the common pitfalls with stipulations, and we’d say, ‘The reason we can’t process this is because you didn’t complete the stipulation,’” Johnson says.
The investigator would have to show on the document that the training had been completed, and then it could be accepted.
Having the Common Pitfalls tool ready to cut and paste into communications with researchers via the electronic system gave the IRB a consistent, simple way to educate and train them on using the new system correctly, she says.
Johnson notes that the IRB did not want the back-and-forth communication to be handled through email because it needed to be tracked in the electronic system. “The system has tracking so you know when a stipulation was sent and when it was received, and we can log in to see when they are working on it,” she says. “In the electronic system it shows up as a task they have to do.”
• Attaching revised documents without noting a modification. “People were attaching consents and ads and documents to their continuation report, which was good,” Johnson says.
But sometimes researchers would revise the forms and check “no” when asked about any modifications in the electronic system. “If they said ‘no’ because they didn’t understand the question, we’d approve it without knowing there were changes,” Johnson says. “I think they thought that anything they submitted would be reviewed and approved by the IRB, which is true, but they have to tell us if they are making changes and what those changes are.”
This pitfall was sent to researchers, telling them they cannot attach revised documents without telling the IRB that they are requesting modifications, she adds.
The common pitfalls educational strategy worked, Johnson says.
“Submissions improved,” she says. “The best use for the tool was to help IRB administrators give the research community a consistent message.”