Mental health apps have terrible privacy protections, report finds

Mental health apps as a category have worse privacy protections for users than most other types of apps, according to a new analysis by researchers at Mozilla. Prayer apps also had poor privacy standards, the team found.

“The vast majority of mental health and prayer apps are exceptionally creepy,” Jen Caltrider, lead of Mozilla’s *Privacy Not Included guide, said in a statement. “They track, share and capitalize on users’ most intimate personal thoughts and feelings, such as moods, mental status and biometrics.”

In the latest version of the guide, the team analyzed 32 mental health and prayer apps. Of those, 29 were given a “privacy not included” warning label, indicating the team had concerns about how the app managed user data. The apps are designed for sensitive issues such as mental illness, yet collect large amounts of personal data under vague privacy policies, the team said in the statement. Most apps also had poor security practices, letting users create accounts with weak passwords despite the highly personal information the apps hold.

The apps with the worst practices, according to Mozilla, are BetterHelp, Youper, Woebot, Better Stop Suicide, and Talkspace. For example, the AI chatbot Woebot says it collects information about users from third parties and shares user information for advertising purposes. Therapy provider Talkspace collects transcripts of user chats.

The Mozilla team said in a statement that it contacted the companies behind these apps multiple times to ask about their policies, but only three responded.

Personalized, traditional mental health care can be difficult for many people to come by – most therapists have long waiting lists, and navigating insurance and costs can be a major barrier to care. The problem got worse during the COVID-19 pandemic, as more and more people began to need care. Mental health apps aimed to fill that void by making resources more accessible and readily available. But that access can come at a privacy cost, the report shows.

“They operate like data-sucking machines with a mental health app veneer,” Mozilla researcher Misha Rykov said in a statement. “In other words: a wolf in sheep’s clothing.”
