The ethics police? New book issues challenge for change
Calls for IRB training, standardization, appeals system
In a controversial book that both damns and praises the so-called “ethics police,” we are ultimately left to ponder a paraphrase of the classic Churchill quote on democracy: IRBs are “the worst system — except for all the others.”
Meet Robert L. Klitzman, MD, director of the Masters of Bioethics Program at Columbia University in New York City, and the author of The Ethics Police? The Struggle to Make Human Research Safe.1
Drawing on some 45 interviews with IRB members, Klitzman chronicles everything from the completely arbitrary nature of some board appointments to members’ petty concerns and heroic struggles in the ethical minefield of human research. (See related story in this issue.) We enter this wide-ranging discussion of risks and benefits, consent and betrayal — all on the frontier of an explosion in genetic research that could magnify every facet of the enterprise — in part because Klitzman was once a young doctor with a father dying of leukemia.
Given three months to live, his father was offered the option to try an experimental chemotherapy treatment and possibly extend life by as much as 18 months, says Klitzman, a clinical professor of psychiatry at Columbia. Noting that his mother thought it was too risky, Klitzman urged his father to pursue the experimental treatment. Hopes were dashed. His father’s white blood cell count improved but the side effects of the treatment killed him three months later — meaning the treatment failed to extend life beyond the original prognosis. This “haunting” event is our starting point for a compelling conversation with the author of The Ethics Police? (The interview has been edited for length and clarity.)
IRB Advisor: Given this powerful personal narrative — and some of your admitted bad experiences with research delays caused by IRBs — did you question whether you could look at the issues objectively in your book?
Klitzman: IRBs have been controversial and I think it’s important to look at any controversy from as many angles as possible. I argue that I bring different perspectives to it. On the one hand, I have been a researcher and have had to deal with IRBs and at times, frankly, that was frustrating. I have also been a family member who has seen how hard it is for families of patients to decide about research and experiments of some kind. But I also run a bioethics program and I’m very concerned about bioethics. I’m Jewish and I have read about the Nazis’ experiments on Jews in concentration camps. It’s horrific, and I’m glad that IRBs and ethical review have been established. I bring several different perspectives to try and understand IRBs.
In the book, I try to have IRB members speak for themselves. I worked with a great anthropologist for a number of years, so in trying to understand a social situation as a social scientist it’s important to understand the perspective of the people in the situation. Rather than just impose our views from the outside, it’s important to hear what people in the situation are saying. In the book, IRB members speak for themselves about the issues they see and face and how they deal with them.
I don’t think I went into this too biased against IRBs or for IRBs in a way that influenced what IRB members told me and how I presented that. These are complex issues that need to be seen from multiple sides. I think my father’s death led me to think about these issues and realize this is worth spending time on and understanding. And I have devoted several years of my life to studying IRBs and writing this book because I realized how hard it was to understand these issues — partly from having that experience. So I would argue that my personal experience led me to want to understand IRBs, but did not lead me to have a bias one way or the other.
IRB Advisor: Should there be more of an effort to inform people and patients in general about IRBs and human research, not just when they are asked to be research subjects?
Klitzman: Yes, absolutely. I have found that research subjects and the public at large know nothing about IRBs. They operate behind closed doors, which is a major problem, and, as I talk about in the book, they often don’t want to be studied. We only hear about them when there is a scandal — when someone dies from an experiment, these issues come up. But I would argue that these are important issues and are ultimately about who we want to have data about us. There is ‘Big Data’ out there in the world, whether it is biobanks that hospitals have of hundreds of thousands of individuals, or a national security agency trying to get data on people based on experiments done on us.
There is a tension between the fact that we want to advance science — science has brought us many wonderful things in our lives; I take cholesterol medication and I’m grateful to the science that led to the development of that — but at the same time we need to protect the rights of people. We have seen what it’s like to be railroaded into experiments or [as subjects and families] not to understand experiments. We know what happened with Tuskegee and we need to be sensitive to these issues. There needs to be more public understanding of these issues because they affect all of us.
IRB Advisor: You observe in the book that “given the importance of the work they do, and the potentially grave consequences of IRB lapses and oversights, the lack of preparedness for the work is especially striking. Both general members and chairs have been found to have little if any formal training in ethics.” This impedes research, you argue, but isn’t it also of some benefit to slow down research and carefully examine potential consequences?
Klitzman: IRBs do not need to slow things down. It’s not a matter of time. If anything, having people who are better trained might speed things up and result in reviews that are both of higher quality and more rapid. Sometimes IRBs get hung up because people need to take on the ethical issues. The process could go more smoothly and I think in many instances could yield better results [with more training]. Right now there is no federally required training of IRB members. In some institutions it is required. Some members want to get it, but it is not [widely] required. You could be the chair of an IRB and have no training in ethics whatsoever.
IRB Advisor: You note some investigators criticize IRBs as “the ethics police” and complain that the boards unnecessarily block or delay studies. But as you point out, they were created as a firewall against unethical, if not criminal, research like denying available treatment with penicillin in the Tuskegee syphilis study. Are you concerned that similar, highly unethical research is still being conducted somewhere under the radar or have IRBs been an effective deterrent?
Klitzman: This is an important question and I would argue we really don’t have complete data on this. My own sense is that the IRB system has had problems. One major set of problems is not about the research being unethical, but rather that the review process has gotten in the way of science without providing any more protection for people. The problem is that when the [Common Rule] regulations were issued in 1974, science was very different. Most research was done by doctors in their own clinics with maybe 100 patients. Now, to show that a new drug is significantly better than other drugs, I need 2,000 patients and I need to go to dozens of institutions, and so multisite research has become more common. As a result, 20 to 30 IRBs are reviewing the protocol and those IRBs often disagree. Some say this research is great, some say “change this,” some say “change that.” It gets much harder to pool the data from different sites. That gets in the way of research. The problem again is not that the research is unethical, but that the IRBs have impeded the science in ways that are not necessary.
I will also say, though, that there are two other problems. One is that there are still studies researchers do thinking they don’t need IRB approval. Another problem is that sometimes IRBs — because they need more education to help them understand [research and ethical] issues — approve studies that end up being problematic.
For example, there is the case of the Havasupai Native American tribe that lives at the bottom of the Grand Canyon. They believe that they originated there and they have told [this origin story as part of their culture]. Land was taken away from them by white settlers over the years and they have high rates of schizophrenia, diabetes, and alcoholism. Researchers wanted to study that but were afraid if they told the tribe, “We are interested in your high rates of schizophrenia and alcoholism,” the tribe would say no. So the researchers said, “We are interested in studying diabetes and other health problems.” The IRB went along with that [protocol, which included taking blood samples for DNA research], but later the tribe found out that papers were being published about their high rates of schizophrenia and alcoholism and the fact that they came from Siberia, not the Grand Canyon. So there were lawsuits. That is a case where I would argue that the IRB could have — and I personally think perhaps should have — done a better job. So there are cases sometimes where IRBs approve research that leads to unethical [consequences]. Again, I think the problem is that IRB review needs to be much more rigorous, there needs to be better training, etc. I would be surprised if there was another Tuskegee going on, but there are still unethical things that happen with IRB review or without IRB review. There are other examples where IRBs have approved research that has been problematic.
IRB Advisor: You call for clarity and standardization to address IRB problems and more transparency in interpretations and applications of principles in specific cases.
Klitzman: We need to be open to being studied. For example, a website where you say, “I think this is what we should do with this study — does anyone disagree?” Or, “Here is a controversial study — what do you think?” so more people can address the content. There have been studies published, for instance, showing that IRB chairs disagree about whether an allergy skin test is minimal risk or not. There should not be disagreement on stuff like that. It is minimal risk or it is not — there is no reason for disagreement, so I think we need much more standardization. Then we need more training on [the consensus standards].
My background is in psychiatry, and for many years one of the problems of research [was that psychiatrists disagreed about patient assessments]. There was disagreement about anxiety and depression, and rating scales were developed over several years so we could all agree that this patient meets these criteria and is depressed or is not depressed. This has become the so-called DSM-5 diagnostic manual in psychiatry, which is controversial, too, but at least there have been efforts to all get on the same page in terms of making the same diagnosis. I think there needs to be similar work with IRBs to build consensus on certain standardized things — things they now disagree on quite a bit.
IRB Advisor: You also make a kind of legal argument that “a body of case law” should be built based on documented precedents and complete with an appeals system.
Klitzman: I think there should be external appeals. For example, right now if a researcher disagrees with an IRB, he or she goes back to that IRB and they say, “No, we have made up our mind” — the researcher is stuck. There should be an external appeals process [like the] court system where you can appeal all the way up to the Supreme Court. Many of us may disagree with Supreme Court decisions, but the buck stops there and there is consensus.
IRB Advisor: You cite a need for “a change in attitudes” by all parties, getting away from the adversarial nature of IRBs and researchers. Can you give some examples of this?
Klitzman: Right now a lot of IRBs say, “Researchers cannot challenge us because we represent our local community values.” But what I have found is that there may be five IRBs at one institution and they sometimes review the same studies because there are different researchers involved. The IRBs in the same institution and the same community disagree. They are not disagreeing because of community values. They are disagreeing because of [issues with] a researcher, or the institution has been sued — sometimes it is just the personalities of people in the room. There are all kinds of institutional and psychological factors. IRBs need to realize the ways in which they disagree about issues and that there may be more than one acceptable decision about a study.
Some researchers need to change their attitudes. A lot of researchers don’t like IRBs — they call them the ethics police. What they really don’t like are all the regulations, but the IRB is the “face” of the regulations. So when researchers say, “I hate the IRB,” what they really mean is, “I hate having to follow regulations.” They blame the messenger.
Another issue is that right now, according to the regulations, there should be one non-scientific member and one unaffiliated member. IRBs often say, “We have one person combining these roles, a community member.” But, in fact, they are very different roles. [This person may be] a woman of color in a room full of white folks — mostly men. The community member feels intimidated or not empowered. There should be two different people in two different roles. And why not have three non-scientific members instead of just one?
Also, IRBs should have more support. Some institutions underfund IRBs. They are strapped and they can’t do as good a job as they want. There need to be changes at the federal level, the institution level, the IRB level, and the researcher level.
IRB Advisor: Are any of the major problems with IRBs you identify in your book addressed in the recently issued Common Rule Notice of Proposed Rulemaking (NPRM)?
Klitzman: Not really. The NPRM says we should have central IRBs for multisite research. There are a lot of details that need to be worked out, but there are some advantages to that. But it doesn’t talk about getting consensus, having more resources, being more open to research, or having better training or any training required of IRB members. External appeals is not addressed; changing attitudes is not addressed. The Notice of Proposed Rulemaking does not address [these issues]. I think it could and it should, but some of this does not require changes in federal law or regulations. I think if the Office for Human Research Protections said these are important things, that would do a lot. They could say, “We ‘strongly recommend’ that there be training of a certain kind and standards based on consensus.” Something could be done at the federal level that does not involve regulations per se.
IRB Advisor: You found that IRBs “wrestle with genuine dilemmas and are constantly trying to weigh possible future risks and potential benefits of studies that have not yet been conducted.” It seems the problems you describe are only going to be aggravated by the explosion of genetic research underway and expanding.
Klitzman: Absolutely. IRBs don’t know what to do in the area of genetics. This is one of the areas where the NPRM asks: What should we do with biospecimens and genetic samples? Nobody knows, so the proposed rulemaking, as I understand it, says you need consent if you are going to keep the specimens — even if they are de-identified. But if I have the data, I don’t need separate consent. The problem is — say I get your blood sample and work up your whole genome sequence. I can throw away the sample — I have the DNA. The proposed rule says researchers can study whatever they want in the data; they just can’t use the actual specimen. Some people may not want to be part of a research study, and I can’t use their actual specimen, but if someone sequences a specimen and gives me a computer printout of the data, I can go ahead with the study. That is not exactly logical. Another proposed change is that consent forms should be posted online, which I think is good. But the [rule] says this would happen after the study is done. I think they should be posted beforehand so there can be more transparency.
REFERENCE
1. Klitzman RL. The Ethics Police? The Struggle to Make Human Research Safe. New York: Oxford University Press; 2015.