Ethics in the age of big data go beyond privacy issues
Facebook was just the beginning
Human research protection program directors and IRB members witnessed a very public ethical dilemma this past summer when debates raged over Facebook’s social media study. Public rebukes of studies involving big data and social media are fairly rare, but these studies can raise all kinds of ethical challenges.
"It goes beyond privacy," says Ryan Spellecy, PhD, an associate professor of bioethics and medical humanities at the Medical College of Wisconsin in Milwaukee.
"This is challenging our usual ways of protecting human subjects, and it moves beyond privacy in the sense that it’s an opportunity for us to engage our communities in conversations that try to adhere to respect of persons," Spellecy says.
Think beyond HIPAA authorizations and consider how IRBs can engage research participants to find out their preferences when studies involve large data warehouses, he says.
One hurdle to having this type of ethical debate is that IRBs often treat the principle of respect for persons as synonymous with the informed consent process, says Emily E. Anderson, PhD, MPH, assistant professor of bioethics at Loyola University Chicago in Maywood, IL.
"We’re interested in responding beyond that process," she says. "Engaging participants in the discussion is one way of figuring out what respect means, and anything beyond the individual consent form is where we’re at — redefining respecting persons."
Here are some of the main issues that need to be discussed in the era of big data, experts say:
What do you do when informed consent doesn’t fit a study?
"In these large data warehouses, the IRB can simply waive consent, and they’ve met regulatory obligations," Spellecy says.
But if consent is waived, how else can an IRB make certain the study shows respect for persons? he asks.
"If we can’t get informed consent when the study is running, then maybe we can be foresighted and get community input about the design itself and the way data is handled to show respect for persons," he adds.
One reason informed consent is a challenge in many of today’s studies is that the regulations were written for past problems, developed in response to previous abuses, Anderson says.
"They’re not forward-looking, and they’re focused on the clinical research environment where there is a one-to-one patient/researcher relationship," she says.
"I think the issues we’re struggling with around big data are not unique to big data," Anderson adds. "They’re struggling with the same thing in oral history, where research doesn’t look the way informed consent looks."
Was data collected in an ethical manner?
When a researcher submits a study involving big data, one of the first questions an IRB might consider is whether the data were collected ethically, Anderson says.
If the answer isn’t "yes," then maybe the study should not be done, she says.
Often, there isn’t a clear answer. Data could have come from a different academic institution or source, and researchers might not have any control over how it was collected, she notes.
An institution could develop standards for data collection and use. If these standards are agreed upon by multiple institutions, then IRBs and researchers might have access to information that they know has been collected ethically.
Still, even if an academic research organization restricts its researchers to data collected in an agreed-upon ethical manner, someone outside that framework likely will use the data anyway.
"When I speak with health services researchers, they see data that other nonacademic entities have, and they’d love to get their hands on it because they could do amazing things with it," Spellecy says. "The amount of data out there is unprecedented, and it doesn’t just pose new ethical and regulatory challenges because it’s bigger."
The challenges include the ethical question of whether it is fair to use people’s data even when it’s de-identified, he adds.
"Research on data will happen whether or not it’s approved by IRBs and academics," Anderson says.
Restricting academic researchers’ access to such data could keep them from being able to harness information that could have great social utility, she adds.
"We might be flying blind for the next decade," Anderson says. "It’s just something we’ll have to figure out."
What are the biggest ethical challenges with technology?
"Our current framework assumes the researcher and participant have some sort of relationship and they know each other," Anderson says. "In studies involving big data that relationship is not there."
Studies using population data involve large numbers of individuals separated by geographic distance, and the people might not be identifiable, she adds.
"That’s how the technology creates issues," Anderson says. "The biggest issue now is that we’re talking about data that’s already out there as opposed to data that one person specifically asks for from another person."
There are good social reasons for using big data, including the potential for better answers from a scientific perspective.
"There’s more variation," Anderson says. "But people probably didn’t agree to be part of this type of research, and now we have to deal with that."
One simple solution is to cover this possibility in a HIPAA notification: "From time to time, researchers at this institution may review your medical information for research purposes," Spellecy suggests.
As technology continues to evolve, more complex dilemmas arise.
For instance, now it’s possible for multiple data warehouses to link and for researchers to access warehouse networks.
In these cases, it might be difficult to inform people through a HIPAA notification form of the possible uses and distribution of their data.
"As regional linkages come online, researchers can look at medical data from six or seven academic medical centers," Anderson says. "There are vast troves of data viewed by researchers at this institution and at multiple institutions."
IRBs should ask whether participants would be upset about how their data could be used. A person might have shared information with a family practitioner that now can be used by researchers from another city or state.
This type of issue should be addressed by the Office for Human Research Protections (OHRP), perhaps through changes to the Common Rule, Anderson says.
"Just as we’re thinking of ways to link data, we need to think of ways to involve our communities," she says. "We should learn from each other about what works."
How do IRBs deal with the phenomenon of never-disappearing data?
"What’s different now is that everything you do leaves a digital trace," Anderson says. "Every time I go to the doctor now, he enters something into an electronic medical record; every time I buy something online, it leaves some kind of trace online, whether it’s Googling or mapping, and that didn’t happen before."
There are good and bad things about such readily available data. Many consumers might like having Amazon suggest a book, but they might object to seeing ads about urinary incontinence, she explains.
"IRBs have to stop thinking just in terms of consent and permission," Anderson says. "Instead, we need to think in terms of accountability and transparency."
This requires a multi-pronged approach, Spellecy says.
"With privacy and the Internet, there are generational thoughts on it," he says. "Some people are comfortable with it because they’ve grown up around technology, and they’re comfortable with not having privacy."
For other people, these are troubling issues.
"How do we engage all of these groups when we want to use their medical data?" Spellecy says.
Even as these questions are debated and addressed, public perceptions of privacy on the Web are evolving, Anderson says.
People have become more comfortable with having their data saved and used online, but when the Facebook study came to light, they became distrustful, she says.
"What the public wants and expects is changing, and it’s what will drive our definition of what respect for persons means and our ways of achieving that," she says.
It would be a problem if people extended their distrust of Facebook’s study to medical records research, Spellecy says.
"There could be a chilling effect if people were to find out that their medical records were being used in a way that they feel is inappropriate," he adds.