Intense Competition, Inadequate Assessment Are Factors in Research Misconduct
Institutions should create a climate of research integrity, experts say
The number of retractions in scientific journals has increased significantly in recent years, according to research.1 Sometimes, it’s due to honest mistakes — researchers realize they made an error and want to correct the scientific record.
“But there is also misconduct,” says Zubin Master, PhD, assistant professor at Albany (NY) Medical College’s Alden March Bioethics Institute. While some journals publish the reason for retraction of a clinical trial’s finding, others do not. “There isn’t a day that I don’t hear about a case of misconduct. There is always some type of research integrity issue going on,” says Master.
Data are Largely Self-reported
A 2015 study found 1.97% of scientists admitted to having fabricated, falsified, or modified data or results at least once; up to 33.7% admitted to other questionable research practices.2 Such data are largely self-reported, however.
“We’re asking research scientists if they’ve ever engaged in misconduct, or seen others engage in misconduct,” says Master. “Obviously, there will be some bias.”
James M. DuBois, DSc, PhD, Steven J. Bander Professor of Medical Ethics and Professionalism at Washington University School of Medicine in St. Louis, is not convinced that research misconduct is actually increasing. “It is far easier today to detect plagiarism, fabricated data sets, and falsified images,” he says.
DuBois believes the movement toward increasing the rigor and reproducibility of science is likely to decrease instances of data falsification and fabrication, even if more instances are discovered in the short term. “It is much more likely that individuals will get caught when they engage in research misconduct — at least, once suspicions arise,” says DuBois.
As data sharing continues, peer oversight is also likely to increase. “We may see another rise in detected misconduct, but not necessarily misconduct itself,” says DuBois. He names the following approaches to deter misconduct:
- ensure researchers know the rules surrounding misconduct, and how seriously society takes them,
- increase transparency by providing adequate institutional server space for all research data, and requiring data sharing, and
- increase oversight of research staff, post-doctoral trainees, and peers by principal investigators, including review of raw data.
“However, this is often difficult as science grows more complicated, and individuals possess unique skills that others on the team lack,” says DuBois. The following are ethical issues involving research misconduct:
• Definitions of research misconduct may vary.
The U.S. Department of Health and Human Services’ (HHS’) Office of Research Integrity defines misconduct as “falsification, fabrication, and plagiarism.”
“Not every nation has such narrow definitions,” says Master. “Other nations have a broader definition of scientific dishonesty.” Some include conflict of interest violations, ethics violations involving authorship of publications, or poor recordkeeping.3
• Intense competition can create a culture that incentivizes misconduct.
“Everybody is ready to blame the bad apple, as if they are morally corrupt. But it’s the environment that pushes people, ethically,” Master says.
Under this kind of pressure, an investigator might decide to publish a study’s findings prematurely. “Right now, the research environment is at a state of hypercompetition. People are scrambling for grants and jobs,” says Master.
Researchers don’t always take that extra step to make sure the data are reproducible, he says. “Or maybe the data doesn’t fit the hypothesis, so they justify removing it and only show certain results,” says Master.
Conflict of interest in clinical trials sponsored by pharmaceutical companies gets a lot of attention. “But we’ve got to remember, academic scientists also have vested interests in their careers,” says Master.
In one study, researchers analyzed 40 cases of falsification, fabrication, and plagiarism. They identified poor ability to cope with research pressures as a contributing factor.4
“The reasons for misconduct are highly diverse,” says DuBois, the study’s lead author. Reasons can include the following:
- personality disorders combined with a desire for fame,
- intense pressure to obtain funding,
- recklessness with data,
- personal stress that clouds judgment,
- failure to realize how seriously U.S. society takes plagiarism, and
- fear of peer reviewers, which can lead to minor falsification rather than wholesale fabrication of data.
“The diversity of reasons is part of why it is difficult to craft a one-size-fits-all solution,” says DuBois.
• Research institutions typically react only after misconduct cases are reported in the press.
“The university tries to save its reputation and wants to make sure that the behavior isn’t prevalent, so they react and do a bunch of different things in response to a publicized case of misconduct,” Master says. The following are practices that, instead, can create a climate of research integrity:
1. Institutions can reduce the number of “soft money” positions, which require scientists to pay their own salaries with grant money.
“‘Hard money’ positions are becoming extinct,” says Master. “This kind of environment is ripe for research misconduct.”
2. Research funders can limit how many students and fellows are hired with a grant.
If a principal investigator gets a grant for several years, the usual approach is to hire many students and fellows to do the work. “When they graduate and become principal investigators, the system becomes saturated. Resources are limited, creating competition,” Master says. Limiting this number would encourage universities to hire research scientists as permanent employees, he says.
3. Institutions can do a better job of assessing the integrity of their research environment.
Keith Baggerly, PhD, a biostatistician at the University of Texas MD Anderson Cancer Center in Houston, says, “Most do not assume that their faculty are doing things in a fraudulent way. The baseline assumption is that everybody is trying to do things correctly.”
In one case, there initially appeared to be some honest mistakes made in data analysis. “But over time, when it persisted, we got more suspicious there was substantial fraud involved,” says Baggerly. The biggest problem was that after concerns were raised about the quality of the raw data, those concerns were largely ignored or dismissed, he says.
While some institutions do the bare minimum to promote a culture of research integrity, others do it really well. Master suggests using the indirect costs from grant funding for this purpose. “If you are doing well, great,” he says. “But if the climate of research integrity is underserved, ramp it up to do what you can.”
4. Institutions can acknowledge the growing recognition that reproducibility and reliability rates are not as high as they should be.
“If you are aware that this is a big problem — and it is — addressing it is fairly easy,” says Baggerly, who teaches a course on reproducibility of research.
Several journals have begun to ask for the data used to produce results, but don’t go so far as to check the results. “What is easy and fast is simply checking existence of the data. Checking accuracy is not as simple,” Baggerly says.
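To make that distinction concrete, the sketch below (an illustration, not a tool described in the article, and built around a hypothetical manifest file and directory layout) performs the quick check Baggerly describes: it confirms that each deposited data file exists and matches a recorded SHA-256 checksum. Verifying accuracy would mean re-running the reported analysis, which this check does not attempt.

```python
import hashlib
from pathlib import Path

def verify_deposit(manifest: Path) -> bool:
    """Check that every file listed in a 'filename,sha256' manifest exists
    and matches its recorded checksum. This is the quick 'existence' check;
    confirming the published results would require re-running the analysis."""
    ok = True
    for line in manifest.read_text().splitlines():
        if not line.strip():
            continue
        name, expected = [part.strip() for part in line.split(",")]
        path = manifest.parent / name
        if not path.exists():
            print(f"MISSING  {name}")
            ok = False
            continue
        actual = hashlib.sha256(path.read_bytes()).hexdigest()
        if actual != expected:
            print(f"ALTERED  {name}")
            ok = False
    return ok

if __name__ == "__main__":
    manifest = Path("deposit/manifest.csv")  # hypothetical deposit layout
    if manifest.exists():
        print("deposit intact:", verify_deposit(manifest))
    else:
        print("no deposit found at", manifest)
```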
One problem is that researchers tend not to question whether another factor could have been responsible for a given finding. “Instead, people say, ‘Oh my gosh, how brilliant I was to run this experiment.’ People get excited when they see a cool result,” Baggerly says.
Batch effects also can result in misleading findings — in some cases, due to improper calibration. “Randomizing the run order can minimize batch effects,” notes Baggerly. “Good design can separate the batch effects from the biological effects that we actually want to find.”
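A minimal sketch of the randomization Baggerly recommends appears below; the sample names, group labels, and batch size are hypothetical. Shuffling the run order spreads both biological groups across every batch, so a later analysis can treat batch as a covariate rather than mistaking instrument drift between batches for a biological effect.

```python
import random

# Hypothetical samples: (sample_id, biological_group). If run order follows
# group order, any drift or calibration change between batches becomes
# confounded with the biological effect of interest.
samples = ([(f"tumor_{i}", "tumor") for i in range(12)] +
           [(f"normal_{i}", "normal") for i in range(12)])

# Randomize run order so both groups appear in every batch, instead of
# processing all tumor samples first and all normal samples second.
rng = random.Random(42)      # fixed seed so the run plan is reproducible
run_order = samples[:]
rng.shuffle(run_order)

BATCH_SIZE = 8               # e.g., samples processed per plate or per day
for position, (sample_id, group) in enumerate(run_order):
    batch = position // BATCH_SIZE
    print(f"run {position + 1:2d}  batch {batch}  {group:6s}  {sample_id}")
```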
Wide availability of image comparison tools has also contributed to the increase in retractions. “If you can see if it matches, that becomes a lot easier to check,” says Baggerly. “The ability to detect it is getting better.”
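As a rough illustration of how such comparisons can work, the sketch below compares two image files with a simple difference hash; the approach and the file names are assumptions for illustration, not a description of any specific detection tool mentioned in the article. It assumes the Pillow imaging library is installed.

```python
from PIL import Image  # Pillow

def dhash(path, hash_size=8):
    """Difference hash: shrink to a small grayscale image and compare
    adjacent pixels. Near-duplicate images (even after rescaling or
    recompression) tend to produce hashes differing in only a few bits."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(left > right)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical figure panels taken from two different papers.
distance = hamming(dhash("figure2_panelA.png"), dhash("figure5_panelC.png"))
print("likely duplicate" if distance <= 5 else "looks distinct", distance)
```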
Too-light Penalties
Penalties for egregious research fraud are often viewed as too light — for both the investigator and the institution. “If an individual is found guilty of research fraud, a separate assessment needs to be done to determine the degree of culpability on the part of the institution,” says Baggerly.
If fraud is discovered, the institution is supposed to report it to the HHS Office of Research Integrity if grant funding is involved. “But I don’t believe they’ve ever acted or imposed sanctions on the institution — it’s wholly on the individual,” Baggerly says.
Typical penalties include suspension of grant funding, and barring the investigator from serving on various NIH committees. “This might be sufficient to get a bad actor out of the research community, but can be too light if there have been substantial real-world consequences,” says Baggerly.
In one highly publicized case, a prominent cancer researcher at Duke University was found to have falsified data in multiple published studies. His punishment was a five-year ban from federal grant funding.5
Baggerly co-authored an editorial noting that the researcher misused taxpayer dollars and put patients at risk.6 “In that case, I don’t think suspension from grant funding was adequate because it resulted in people being incorrectly treated in clinical trials,” he says.
Unless it is clear that the institution engaged in a cover-up or was negligent in some fashion, DuBois strongly opposes punishing institutions for the wrongdoing of individual investigators.
“Doing so is likely to increase the motivation of institutions to cover up wrongdoing,” he explains. “It may also increase an ‘us-them’ dynamic between institutional officials and researchers.”
Both of these consequences would serve to hinder, rather than promote, research integrity. “The airline safety movement has important lessons for the world of research integrity in terms of focusing on root cause analyses and problem-solving, rather than blame of institutions,” says DuBois.
REFERENCES
1. Steen RG, Casadevall A, Fang FC. Why has the number of scientific retractions increased? PLoS One 2013;8(7):e68397.
2. Pupovac V, Fanelli D. Scientists admitting to plagiarism: A meta-analysis of surveys. Sci Eng Ethics 2015;21(5):1331-1352.
3. Resnik DB, Master Z. Policies and initiatives aimed at addressing research misconduct in high income countries. PLoS Medicine 2013;10(3):e1001406. DOI: 10.1371/journal.pmed.1001406.
4. DuBois JM, Anderson EE, Chibnall J, et al. Understanding research misconduct: A comparative analysis of 120 cases of professional wrongdoing. Accountability in Research: Policies and Quality Assurance 2013;20:320-338.
5. Office of the Secretary, HHS. Findings of Research Misconduct. Fed Reg Nov. 11, 2015. http://bit.ly/2c6rwWt.
6. Baggerly K, Gunsalus CK. Penalty too light. The Cancer Letter. November 13, 2015.
SOURCES
- James M. DuBois, DSc, PhD, Steven J. Bander Professor of Medical Ethics and Professionalism, Washington University School of Medicine, St. Louis. Phone: (314) 747-2710. Email: [email protected].
- Zubin Master, PhD, Assistant Professor, Alden March Bioethics Institute, Albany (NY) Medical College. Phone: (518) 262-1548. Fax: (518) 262-6856. Email: [email protected].
- Keith Baggerly, PhD, Professor, Department of Bioinformatics and Computational Biology, University of Texas MD Anderson Cancer Center, Houston. Phone: (713) 563-4290. Fax: (713) 563-4244. Email: [email protected].