Put ‘metrics’ in the lexicon after IRB, before quality improvement
Metrics are key to use of e-data
It is a natural electronic data evolution: First, IRBs began using electronic data systems; then these morphed into full electronic submission, and now IRBs are collecting valuable electronic, real-time data that can be mined very quickly for process and quality improvement purposes. Metrics are key to this transition.
“It seemed intuitive to me,” says Daniel Nelson, MSc, CIP, director of the office of human research ethics and professor of social medicine and pediatrics at the University of North Carolina at Chapel Hill. Nelson spoke about using metrics to monitor, manage and improve IRB operations at the recent Advancing Ethical Research Conference of the Public Responsibility in Medicine and Research (PRIM&R), held Dec. 4-6, 2012 in San Diego. Nelson has been using metrics since before most IRBs collected all data electronically.
“When I moved to UNC 15 years ago, one of the first things I did was look through stacks of paper and data,” Nelson says. “With a scientific background it was second nature to analyze and track things.”
At first, collecting metrics was a paper-based, often tedious process of sifting through paper records. Now it’s a matter of running reports and analyses electronically.
“We do this at the end of the calendar year with some reports at the end of the fiscal year,” Nelson says. “The time frame varies, depending on how people run their operations.”
The UNC IRB collects metrics for routine, ongoing tracking, as well as for tackling specific performance improvement issues, he adds.
IRBs also are using metrics for benchmarking purposes.
“The important thing for any IRB is to have some method to measure indicators of quality, first, and then to act on them,” says David G. Forster, JD, MA, CIP, chief compliance officer at the Western Institutional Review Board in Olympia, WA. Forster spoke about collecting and analyzing metrics for quality improvement at the PRIM&R conference in December 2012.
Electronic data has made collecting metrics far easier, he notes.
“Back in the old days of paper-based IRBs, it was hard to measure much,” he says. “You had to record it somewhere to figure out what you had to work with.”
Quality improvement efforts at IRBs have taken a giant step forward with the transition from paper to electronic data over the past 15 years, Forster says. “We have an incredible ability to increase measuring what we do,” he adds. “There’s been a gradual process to electronic data; in 1995, WIRB had a rudimentary system that tracked what we had to find in a paper file. In 2002, we switched to a completely electronic system.”
At UNC Chapel Hill, a fully integrated electronic data collection process was put in place relatively recently, Nelson says.
“We put a home-grown application system in effect in 2011,” he says. “We did it in a phased-in approach, rolling it out with the behavioral and social sciences departments first and then with the biomedical departments.”
The IRB tracked information all along and learned from the metrics.
For instance, it discovered that IRB processing times actually increased by 300% when the electronic transition took place, Nelson says.
“Our processing times went from a couple of days to a couple of weeks,” he says. “It was a lot longer, which is why I realized we needed to do something because we had implemented a system that was supposed to make things easier for the staff and IRB, and, instead, things got much more complicated and slower.”
At first, this trend was observed by investigators and research staff, who complained that response times had slowed down. A metrics analysis confirmed their anecdotal evidence. The next step was determining what caused the negative change and correcting it.
“We analyzed what people were doing, how they were doing it, and how the system could be improved,” Nelson says. “We used an all-systems approach to handle the backlog snowballing on us; we made changes and adjustments that were able to bring down our turnaround time for expedited review.”
Changes included adjusting staffing positions to put more employees in the roles that needed more attention under an electronic system and fewer in the areas that needed less support once submissions moved online, Nelson explains.
The expedited review time, which had skyrocketed to 18 days from three days, was brought down to three days initially. Now it takes one or two days and is tracked on a quarterly basis, Nelson says.
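For an IRB that wants to reproduce this kind of quarterly tracking, the following is a minimal sketch in Python, assuming each expedited review is logged with a received date and a decision date; the record layout and dates are invented for illustration, not drawn from UNC’s actual system.

from collections import defaultdict
from datetime import date
from statistics import median

# Hypothetical log of expedited reviews: (date received, date of decision).
reviews = [
    (date(2012, 1, 9), date(2012, 1, 11)),
    (date(2012, 2, 20), date(2012, 2, 21)),
    (date(2012, 4, 2), date(2012, 4, 4)),
]

def quarter(d: date) -> str:
    # Label a date with its calendar quarter, e.g. "2012-Q1".
    return f"{d.year}-Q{(d.month - 1) // 3 + 1}"

# Group turnaround times (in days) by the quarter the submission arrived.
by_quarter = defaultdict(list)
for received, decided in reviews:
    by_quarter[quarter(received)].append((decided - received).days)

for q in sorted(by_quarter):
    print(q, "median turnaround:", median(by_quarter[q]), "days")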
In WIRB’s more than 10 years of experience with electronic data and metrics, the focus has gone from what to measure to how to measure, Forster notes.
“The first look by everyone is in turnaround time,” he explains. “But the devil is in the details: How do you measure it? Do you review the complete system, measuring it from when the application goes to the board or staff, or when they work on it after the board meets?”
Once an IRB sets up its measurement parameters, it can determine what is working, what is not, and which way the trend is moving. While collecting and comparing data is easier than it once was, the challenge continues to lie in the analysis.
For instance, metrics might show that an IRB’s submission packets were taking three days but now are taking five days.
“What is the cause of that? You can get a very detailed look at your processes,” Forster says. “You need to divide up each type of work and then track it.”
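The sketch below illustrates that point: once explicit clock points are recorded — when a submission is received, when it goes to the board, and when the decision issues — each segment of each type of work can be tracked separately. The field names, work types, and dates here are assumptions for illustration only.

from collections import defaultdict
from datetime import date
from statistics import mean

# Hypothetical submissions with three clock points: received by staff,
# sent to the board, and final decision.
submissions = [
    {"type": "full board", "received": date(2012, 3, 1),
     "to_board": date(2012, 3, 6), "decision": date(2012, 3, 9)},
    {"type": "expedited", "received": date(2012, 3, 2),
     "to_board": date(2012, 3, 3), "decision": date(2012, 3, 4)},
]

# Track each segment separately for each type of work.
segments = defaultdict(list)
for s in submissions:
    segments[(s["type"], "staff pre-review")].append((s["to_board"] - s["received"]).days)
    segments[(s["type"], "board to decision")].append((s["decision"] - s["to_board"]).days)

for (work_type, segment), days in sorted(segments.items()):
    print(f"{work_type:10s} {segment:18s} mean: {mean(days):.1f} days")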
Forster and Nelson outline these strategies IRBs can use to track and analyze metrics:
• Select initial areas to track and trend. Most IRBs begin by tracking the time it takes to complete an IRB review. They can select beginning and end points, such as the date the submission is received by the IRB to the date it is sent back to the investigator with an approval or a request for more information, Nelson suggests.
Other activities that might be selected for collecting and tracking metrics include these (a simple tally over such indicators is sketched below):
- total submissions;
- number of exemptions;
- how many submissions are screened and then returned as not requiring further review;
- staffing levels;
- number of unanticipated problems.
“Once you measure something you can make changes as part of a continuing quality improvement cycle,” Forster says. “For example, if you realize the timeline for processing unanticipated problems is three days, then you look at why it takes three days; once you measure it, you can improve it.”
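A simple annual tally over indicators like those listed above might look like the following sketch; the log format and outcome categories are invented for illustration.

from collections import Counter

# Hypothetical submission log: each entry carries an outcome category.
log = [
    {"outcome": "approved"},
    {"outcome": "exempt"},
    {"outcome": "returned, no review required"},
    {"outcome": "approved"},
    {"outcome": "unanticipated problem"},
]

counts = Counter(entry["outcome"] for entry in log)
print("total submissions:", len(log))
for outcome, n in counts.most_common():
    print(f"  {outcome}: {n}")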
• Measure error rates. IRBs can use metrics to identify errors in electronically submitted documents, including consent forms, Forster suggests.
Using metrics, IRBs can identify trends that point to certain areas of a form that cause the most errors or problems.
“You can use that information to do some real quality work of looking at root cause problems,” Forster says. “If you see a trend toward spelling errors in the consent form, you can measure this and find out why we’re making spelling errors.”
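One way to spot such a trend is to tally error categories across returned consent forms and rank them, so root-cause work starts with the biggest offenders. In this sketch, the categories and counts are invented.

from collections import Counter

# Hypothetical review log: error categories flagged in each consent form.
consent_form_errors = [
    ["spelling", "wrong version date"],
    ["spelling"],
    ["missing signature line", "spelling"],
    ["wrong version date"],
]

# Tally categories across all forms to see which errors dominate.
tally = Counter(err for form in consent_form_errors for err in form)
total_forms = len(consent_form_errors)
for category, n in tally.most_common(3):
    print(f"{category}: in {n}/{total_forms} forms ({100 * n / total_forms:.0f}%)")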
• Assess the relationship between staffing level and workload. IRBs can use metrics to make certain the staffing level can handle the current workload on an ongoing basis.
“If the workload is outpacing staffing, we make adjustments to create new positions,” Nelson says.
“We look at where our research is coming from, which department is contributing the most research,” he adds. “Every department thinks their research is the most important research, and sometimes it’s an eye-opener for them to realize they were only 5% of your research portfolio when they thought they were closer to 100%.”
Shifts in workload don’t occur that fast, Nelson notes.
“There may be a shift on a month-by-month basis or a spike when students come back to school,” he notes. “But it doesn’t change the overall workload of the office.”
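The portfolio-share calculation Nelson describes is straightforward once each submission is tagged with its originating department; the sketch below uses invented departments and counts.

from collections import Counter

# Hypothetical count of submissions by originating department.
submissions_by_dept = Counter(
    {"Pediatrics": 140, "Psychology": 90, "Oncology": 55, "Nursing": 15}
)

total = sum(submissions_by_dept.values())
for dept, n in submissions_by_dept.most_common():
    print(f"{dept:12s} {n:4d} submissions  {100 * n / total:5.1f}% of portfolio")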