Solicit input on IRB process changes with "crowdsourcing"
Method results in high participation
A major obstacle to implementing process or quality improvement measures is finding out what the people impacted by the change think.
IRBs sometimes handle this through one-on-one meetings, focus groups, or feedback surveys. Each method has advantages and drawbacks. One-on-one meetings, for instance, can yield detailed feedback, but they are time-consuming and difficult to implement broadly. Focus groups offer somewhat broader input, but they are hard to schedule with busy professionals. And surveys often have a very low return rate.
Now one research institution has developed a solution that offers broad reach and is fairly easy to implement.
Called "crowdsourcing," the method enables the IRB office to solicit input from hundreds of individuals at once and gives them an opportunity to comment both on the IRB office's questions and on other participants' previous comments. It's an electronic solution that individuals can complete at any time that's convenient for them.
The Mayo Clinic IRB in Rochester, MN, used a crowdsourcing method to obtain ideas and solutions that would reduce the number of requests for clarification on protocol submissions. The crowdsourcing method resulted in a substantial number of ideas and comments from participants. And these led to the office reducing its overall change requests by 20%.1
"We sent out via email an invitation to participate in this online discussion forum," says Jim Pringnitz, CIP, PMP, a business analyst with the Mayo Clinic in Rochester, MN.
Of the roughly 250 people who were asked to provide input, more than 180 contributed. These included physicians, principal investigators (PIs), and study coordinators.
"I was really surprised," Pringnitz notes. "We had a good response rate; people liked using it, and it worked well with this group."
Here's how the crowdsourcing project worked:
Identify areas that need input: The Mayo Clinic IRB office has specialists who review protocol applications rigorously, sending out requests for clarification and questions before the application goes to the board, Pringnitz says.
"We don't do approvals with contingencies," he adds. "We do the screening up front which is why we have so many back-and-forth clarifications before it goes to the board."
Most IRB submissions are returned with clarification requests, and this significantly impacts resources, overall turnaround time, and staff satisfaction.1
The IRB office identified the top 10 reasons why a change was requested of an IRB submission. (See table on Top 10 Reasons for Returned IRB Submissions, below.)
Institution's top 10 reasons
The Mayo Clinic IRB in Rochester, MN, has identified a list of 10 common reasons for returned IRB submissions. Here is the IRB's list, along with some of the examples the IRB has provided to investigators:
1. Consent
What the IRB office needed next was to find out how the underlying problems that resulted in incomplete IRB submissions could be resolved. It was a process improvement that would require input from the IRB team, as well as principal investigators and others.
Develop a brainstorming method for obtaining the input: "The Mayo Clinic has a Center for Innovation that was key in this," Pringnitz says. "I contacted the Center, and they recommended we use an online tool called Launchpad, an in-house name for a tool from Imaginatik."
After identifying more than 200 recent users, the IRB sent them an email with the subject line "IRB Launchpad survey." The email posed questions based on the list of top 10 reasons and asked for ideas and input.
The email introduced the quality improvement request with these words:
"The IRB is looking for your ideas on how to reduce the requests for changes on IRB applications. Nearly 50% of all items submitted to the IRB are returned to the study teams with a request for changes to the application. The goal of the IRB Rework Reduction team is an 80% reduction in the number of change requests sent back to the study teams."1
The email asked participants four questions, beginning with the following:
What are your biggest challenges when submitting an application to the IRB?1
"We kept the forum open for two weeks with the challenge that people could reach other's ideas and responses and rate them and comment on the ideas," Pringnitz says. "It worked well: we had 68 ideas and 23 builds or comments."
Perhaps one reason the response rate was high was that the PIs were already familiar with Launchpad, he notes.
"The institution had used Launchpad for other efforts," he says. "The key is the email is in their Outlook account, ready for them to click on the link and go into this collaborative tool whenever they have time, and they can go back in later to see if anybody commented on their idea."
Use the comments, ratings, and input to improve processes: The crowdsourcing exercise produced a great deal of information, so the IRB office identified five high-level process steps. These are as follows:
Know your responsibilities
Anticipate level of review
Develop protocol and supporting documents
Complete IRBe application
Ongoing responsibilities
"Then we provided links on the right with resources," Pringnitz says.
For instance, the first process step, "know your responsibilities," includes links to education and training, the IRB handbook, the Belmont Report, the integrity and compliance program, and the Mayo Research Policy Manual.
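As a rough illustration of the resulting process map, the arrangement amounts to a lookup from each high-level step to its linked resources. The sketch below is not Mayo's actual system; only the step names and the resources just listed come from the article, and steps whose resources were not described are left empty.

```python
# Illustrative sketch only: the dictionary layout is hypothetical. The step
# names and the resources for the first step come from the article; steps
# whose resources were not listed are left empty.
PROCESS_MAP = {
    "Know your responsibilities": [
        "Education and training",
        "IRB handbook",
        "Belmont Report",
        "Integrity and compliance program",
        "Mayo Research Policy Manual",
    ],
    "Anticipate level of review": [],
    "Develop protocol and supporting documents": [],
    "Complete IRBe application": [],
    "Ongoing responsibilities": [],
}


def resources_for(step: str) -> list:
    """Return the resources linked alongside a given high-level process step."""
    return PROCESS_MAP.get(step, [])


print(resources_for("Know your responsibilities"))
```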
"We had created this handbook as a practical guide of how to work with our regulations, and we wanted to put that up front and center for them," Pringnitz says. "Our most helpful resources were identified through crowdsourcing."
The crowdsource email had asked participants to identify the additional resources or tools they needed to successfully complete an IRB submission.
The feedback also made it clear the IRB office needed to map its process a little better, Pringnitz notes.
"We worked to put all of it together, using their input on things they said had been helpful in the past," he adds. "We were able to put out all of the problems we identified and let the audience of users help us with improving it and matching what we found."
Reference
- Pringnitz J, Rice M, Kreofsky B, et al. Crowdsourcing for continuous improvement. Abstract presented at the PRIM&R Advancing Ethical Research Conference, Dec. 6-8, 2010, in San Diego, CA.