Neurotechnology Takes Human Research Ethics to New Frontiers
Any IRB might someday review a study that involves making healthy people smarter, cognitively faster, and mentally more resilient.
Neurotechnology research, including government-funded work, is designed to help people with Parkinson’s disease, locked-in syndrome, mental illness, and other conditions. But it could take things a step further for people with no chronic conditions, and that potential raises ethical questions.1
“Novel neurotechnologies pose a particular problem because they are affecting people’s brains,” says Darcy McCusker, MA, MSEd, graduate student in the philosophy department at the University of Washington. “There are always ethical issues that come up, like drug treatments and how people in positions of power have access to experimental treatment.”
Neurotechnology can give people possibilities that previously were unimaginable. For instance, one area of research involves brain-computer interfaces: implants that allow people with locked-in syndrome to communicate.
“In the old version, they could control a computer with the motion of their eyes,” McCusker says. “But the brain interface takes it a step further, and people learn how to get their brain and computer to interact by the person thinking about an individual letter.”
By focusing on one letter of the alphabet at a time, the person can train the computer to recognize the signal their brain makes when they think about a particular letter. The goal would be for the person to communicate in written words simply by thinking of the letters of each word, and the computer would recognize their thoughts, she explains.
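The letter-spelling approach McCusker describes amounts to a calibration-then-decoding loop: record the brain signal evoked while the user focuses on each letter, build a per-letter template, and later match new signals against those templates. The sketch below is only a minimal illustration of that idea, using simulated NumPy data and a simple nearest-centroid matcher; it is not the University of Washington system, and every name and parameter in it is hypothetical.

```python
# Minimal, illustrative sketch of a letter-speller calibration loop.
# The "brain signals" here are simulated with NumPy, and the classifier is a
# nearest-centroid template matcher chosen only for clarity.

import numpy as np

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
N_FEATURES = 16        # hypothetical number of features per signal epoch
N_CALIBRATION = 20     # hypothetical number of practice trials per letter

rng = np.random.default_rng(0)

# Simulate a distinct "signature" for each letter (stands in for real neural data).
true_signatures = {letter: rng.normal(size=N_FEATURES) for letter in ALPHABET}

def record_epoch(letter):
    """Pretend to record one noisy signal epoch while the user focuses on `letter`."""
    return true_signatures[letter] + rng.normal(scale=0.5, size=N_FEATURES)

# Calibration: the user focuses on each letter repeatedly; the epochs are averaged
# into a per-letter template (the "training" step described in the article).
templates = {
    letter: np.mean([record_epoch(letter) for _ in range(N_CALIBRATION)], axis=0)
    for letter in ALPHABET
}

def decode(epoch):
    """Classify a new epoch as the letter whose template is closest."""
    return min(templates, key=lambda letter: np.linalg.norm(epoch - templates[letter]))

# Spelling: decode one simulated epoch per intended letter.
intended = "HELLO"
spelled = "".join(decode(record_epoch(letter)) for letter in intended)
print(f"Intended: {intended}  Decoded: {spelled}")
```

In a real brain-computer interface, the features would come from recorded neural activity and the decoder would be far more sophisticated, but the basic train-then-decode structure is the idea the article describes.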
“The ultimate goal is for someone with severe disability to use the computer without anyone else’s help and to get their needs met,” McCusker says.
With deep brain stimulators, people with depression, obsessive-compulsive disorder, and other mental illnesses could find some relief. Also, brain-stimulating technologies under investigation for Parkinson’s disease patients carry the possibility of side effects, such as making people act more impulsively, McCusker notes.
“Putting this device in someone’s brain will hopefully help them with their symptoms,” she explains. “But it can have an impact on people and other parts of their brain, making them question, ‘Is that me doing it? Or is it the device doing it?’”
McCusker researches brain-computer interfaces and deep brain stimulation, and how these therapies pose numerous moral risks. The results suggest ethical reflexivity practices can help build public trust in research on new technologies.1
“The idea of ethical reflexivity is an idea that researchers have an obligation to think about the values they’re bringing to the lab,” McCusker explains. “One of the most poignant examples of this is the neuroethics group I work with, and the end-user roundtables we’ve done.”
Researchers can meet with people who can benefit from a new neurotechnology. Rather than develop technological solutions to fix a problem the researcher envisions, they can find out exactly which problems the end users want to solve. For instance, researchers might think the top priority for people in a wheelchair is to walk again. But in talking with people in wheelchairs, investigators might learn they are more concerned about bowel function and sexual function, McCusker explains.
Maybe learning to walk again is not high on their priority list, but controlling when they go to the bathroom is something that affects their quality of life, McCusker adds.
“We want researchers, even without having the experience of interacting directly with someone who will use their device, to ask them questions,” she says. “How do they think about the end-user perspective or the perspective of people in the disability rights movement? Are they concerned about how their research could affect the choices they make?”
An example of ethical reflexivity in practice is the Scientific Perspectives and Ethics Commitments Survey (SPECS), developed by the Neuroethics Thrust of the Center for Neurotechnology at the University of Washington. With SPECS, researchers gather for a meeting to engage with pressing ethical issues related to their novel neurotechnology research.1
These are several sample prompts from SPECS:
- Ethical considerations ought to play a major role in directing neural engineering research.
- Neural engineering research focused on affective or cognitive conditions should aim to enhance affective or cognitive capabilities beyond normal functioning.
- Social inequality is a legitimate moral concern that should shape the direction of neural engineering research.
The trickiest ethical issues involve neurotechnology for the purpose of enhancement. “People think their goal is to take average people and make them better,” McCusker says. “Some researchers want to help people get to their baseline before they had symptoms, to return to their previous levels of functioning, but we don’t want to go beyond that; we don’t want to enhance people. We don’t expect them to change their minds, but we want them to recognize that they think enhancement is a good goal in neurotechnology and a goal to recognize in themselves.”
This recognition helps researchers frame their own goals, inform IRBs, and help the people with whom they work.
IRBs might consider the potential harms of neurotechnology used for enhancement purposes. For example, a major funder could be the Defense Advanced Research Projects Agency (DARPA), an arm of the Department of Defense, and such funders might be interested in neurotechnologies for military engagements, McCusker says.
A few technological evolutions from now, IRBs might review studies of devices that could make a person run faster or keep fighter pilots alert for longer periods. “Maybe the technology could help someone focus better,” McCusker says. “But there may be some reasons that we don’t want to try and build the bionic man.”
From a risk perspective, some of these neurotechnologies involve brain procedures. There is a physical risk to performing brain surgery on healthy people, she adds.
For research participants who might benefit from these technologies, IRBs should keep in mind that the devices pose a psychological risk. For example, a device might help someone control their tremor, but it also could increase their impulsive behaviors, McCusker says.
“If I gamble away money, am I responsible for it, or does the device share some responsibility for it?” she asks. “It also undermines your trust in yourself in an example like that.”
Research participants might worry about how they feel and what they do when the tremors stop, she adds.
Some neurotechnology devices can sense directly from a person’s brain when a tremor is about to happen, and the device can automatically turn on and prevent it. Others might need to be turned on and off by a technician, McCusker says.
“You want research that’s important and has a high impact, but you also want to make sure you are really engaged in what the people need, and what they want,” she explains. “You should take the time to think about all of this, or at least about some of the important issues that come up with this research.”
REFERENCE
- Tubig P, McCusker D. Fostering the trustworthiness of researchers: SPECS and the role of ethical reflexivity in novel neurotechnology research. Res Ethics 2020; doi: 10.1177/1747016120952500. [Online ahead of print].