For Sale: Private Mental Health Data and Consumer Trust
Mental health apps have become common, but there is growing concern about privacy and data safety.1-3 “The U.S. does not have a federal comprehensive privacy law, and the protections for health-related data are limited,” explains Jolynn Childers Dellinger, JD, a visiting lecturer and senior fellow at the Kenan Institute for Ethics at Duke University.
HIPAA only applies to “covered entities” (e.g., physicians and insurance companies) and only to “protected health information (PHI).”
“Any health-related data that people share with apps on their phones, for example, is not likely to receive those protections,” says Dellinger, who also teaches privacy law at Duke.
Generally, HIPAA does not apply to private companies operating mental health apps or other technologies that collect health data (e.g., wearables or social media platforms). In many cases, those companies can legally sell users' health data to third parties without the users' consent. What a commercial entity can do with a person's personal information is determined largely by the company's own privacy policy. "Privacy policies are notoriously full of legalese, difficult to understand, and rarely read by consumers," Dellinger notes.
Consumers likely assume their mental health data are private, just as they would be if they saw a mental health provider. In fact, a recent report found that data brokers are selling Americans' mental health information.4 Of 37 data brokers the report's author contacted about selling mental health data, 11 offered to sell the requested information. Some of the brokers offered data in aggregate form — for instance, giving the buyer information on how many people in a given ZIP code had been diagnosed with depression or anxiety. However, other brokers offered identifiable data with names, addresses, and incomes of people who had been diagnosed with specific conditions, such as bipolar disorder or anxiety.
“Companies that sell mental health data are using the individual as a means to make money, without appropriately considering the privacy interests and harm to vulnerable consumers,” Dellinger warns.
Such companies are taking advantage of consumers’ lack of knowledge about HIPAA, which is widely misunderstood. Because people hear about HIPAA and privacy protections when they visit their primary care providers, they wrongly think they are protected when sharing mental health data with an app or online. “Companies that exploit consumers’ lack of knowledge about the law, and abuse consumer trust, are acting unethically,” Dellinger says.
The Federal Trade Commission (FTC) recently issued a proposed order requiring an online counseling service to pay $7.8 million to consumers for revealing data to third parties after promising to keep the information private.5 “Hopefully, the FTC’s action will send a message to the industry,” Dellinger says.
Many consumers do not realize the data broker industry even exists, let alone the dangers it poses. Mental health data can be used to target individuals with advertisements. "It can be used to discriminate against people in any number of ways," Dellinger cautions. "Individuals and families can be targeted for theft, con jobs, or financial exploitation."
Once people learn mental health data can be sold or misused, trust erodes. “These issues will dissuade people from seeking care online or via an app,” Dellinger predicts. “For many people, that may be their only option.”
On the other hand, with the right laws and ethical practices in place to constrain the sale of mental health data, these apps conceivably could help protect privacy. Mental health apps could provide anonymity, allowing people to obtain treatment in the privacy of their home and avoid potential stigma. "Honesty, fairness, care, empathy, and justice are all undermined by the sale of mental health data," Dellinger says.
Samuel Lustgarten, PhD, a professional consultant for the National Register of Health Service Psychologists, co-authored a paper in which he and colleagues argued that mental health providers are ethically obligated to consider how technology used in their practices could affect clients’ privacy.6 “In general, I tell trainees and providers to seek information on the companies they use,” Lustgarten reports.
Those companies, as partners in data usage and storage, also bear responsibility. “Trust is always a factor when utilizing third-party technology,” Lustgarten explains.
As a psychologist in the technology space, Lustgarten routinely calls companies to question their policies. The responses reveal how seriously a company takes privacy and confidentiality. "There are a number of intermediaries, third-party services, and other organizations that might threaten patient privacy," Lustgarten warns.
Data are transferred in various ways, which might affect an individual’s privacy. For instance, the sale or purchase of a company that holds PHI might allow other companies to buy data. Third-party plug-ins and add-ons include tracking features that transfer cookies and relevant personal data to other companies. “There are hacks and breaches, as always,” Lustgarten laments. “Data are then sold on ‘dark’ markets, likely to be used for manipulation or false medical billing requests.”
From the individual consumer’s perspective, a company’s privacy policy and terms and conditions are a far cry from valid informed consent. “The biggest ethical concern in this burgeoning Wild West-like marketplace for apps and websites in mental health is around informed consent,” Lustgarten observes. “We just don’t have informed consent for the data-sharing that is occurring.”
As a mental health provider, says Lustgarten, “you’ve got to get curious and ask questions. The technology might feel like it’s beyond the scope of practice, but your patients deserve it.”
REFERENCES
1. Iwaya LH, Babar MA, Rashid A, Wijayarathna C. On the privacy of mental health apps: An empirical investigation and its implications for app development. Empir Softw Eng 2023;28:2.
2. Alfawzan N, Christen M, Spitale G, Biller-Andorno N. Privacy, data sharing, and data security policies of women’s mHealth apps: Scoping review and content analysis. JMIR Mhealth Uhealth 2022;10:e33735.
3. Schroeder T, Haug M, Gewald H. Data privacy concerns using mHealth apps and smart speakers: Comparative interview study among mature adults. JMIR Form Res 2022;6:e28025.
4. Kim J. Data brokers and the sale of Americans’ mental health data. February 2023.
5. Federal Trade Commission. FTC to ban BetterHelp from revealing consumers’ data, including sensitive mental health information, to Facebook and others for targeted advertising. March 2, 2023.
6. Lustgarten SD, Garrison YL, Sinnard MT, Flynn AW. Digital privacy in mental healthcare: Current issues and recommendations for technology use. Curr Opin Psychol 2020;36:25-31.