A sweeping investigation by the International Digital Privacy Consortium has uncovered disturbing data practices among popular mental health apps, prompting global regulatory action and a crisis of confidence in digital therapy tools. The report analyzed 150 top mental health apps and found that 73% shared sensitive user data with third parties—including pharmaceutical companies, data brokers, and even employers—often without proper anonymization. In the most egregious cases, therapy chatbots were found using session content to target users with personalized advertisements for antidepressants.
The fallout has been immediate and severe. The European Union has fast-tracked its Mental Health Data Protection Act, which imposes strict new requirements on digital mental health platforms, including mandatory local data storage and prohibitions on selling user information. In the U.S., the Federal Trade Commission has filed lawsuits against three major app developers for deceptive practices, while users whose private therapy details were exposed are mounting class-action suits.
These revelations come at a critical juncture for digital mental healthcare, which saw explosive growth during the pandemic. Psychiatrists report that many patients now refuse to use apps they previously relied on, creating treatment gaps for those who depended on these affordable, accessible tools. “We’re seeing a real crisis of trust,” explains Dr. Sanjay Gupta, a digital ethics researcher at MIT. “People who were comfortable sharing their deepest struggles with an app now feel violated and vulnerable.”
In response, a new wave of privacy-focused alternatives is emerging. Several nonprofit organizations have launched open-source therapy apps with end-to-end encryption and transparent data policies. Some healthcare systems are developing their own secure platforms to maintain digital access while ensuring regulatory compliance. The scandal has also renewed interest in analog alternatives: sales of paper journals and in-person therapy workbooks have surged 240% since the report’s release.
This reckoning highlights the fundamental tension between innovation and ethics in mental healthcare. As the industry rebuilds, the challenge will be creating digital tools that truly prioritize users’ psychological wellbeing and data privacy over profit motives. The solutions may determine whether app-based therapy remains a viable component of mental healthcare or becomes a cautionary tale about the dangers of unchecked technological expansion.