Mental health and self-help apps know your deepest secrets. In a growing string of cases, they’re selling your information and ignoring your right to privacy.
The rise of self-help apps has been a boon for those who seek assistance in dealing with issues in their lives without necessarily needing to share intimate details with others in a more traditional setting. But a new study from Mozilla suggests mental health, domestic violence, and self-help apps have a surprisingly poor — and deeply concerning — track record on privacy and security.
On Monday, Mozilla published its *Privacy Not Included study after analyzing 32 apps designed to improve your mental health or support religious practice. The company evaluated whether the apps shared user information, lacked encryption, had poor password policies, or otherwise failed to meet generally accepted privacy standards. The results were sobering.
“When it comes to protecting people's privacy and security, mental health and prayer apps are worse than any other product category Mozilla researchers have reviewed over the past six years,” the company said, according to ZDNet, which earlier reported on the findings.
The findings are staggering in their scope. Of the 32 apps Mozilla analyzed, only two — PTSD Coach and Wysa — uphold strong privacy and security policies. Others, including BetterHelp, Calm, Headspace, and Better Stop Suicide, fail to meet basic requirements for safeguarding user data, Mozilla says.
Mozilla also criticized these and other apps for vague, troubling privacy policies that, at the very least, leave the door open for the self-help programs to share or sell user data to third parties.
The implications are exceedingly concerning. People who turn to self-help and mental health apps are in need, are going through trying times, and in some cases may share things in these programs that they otherwise wouldn't share with anyone. The idea that their data could be passed to third parties without their knowledge is a deeply troubling potential violation of their privacy.
But the apps Mozilla analyzed are far from the only services that at least pave the way for them to sell or share data on very intimate topics.
The Wall Street Journal revealed on Monday that the dating app Grindr has been collecting user location data and selling it to third parties since at least 2017. The revelation came just two years after Grindr said it would curb its user location tracking, but the Journal's sources say legacy information may still be available. In 2018, Grindr was also criticized for sharing its users' HIV statuses with app optimization companies.
Earlier this year, Crisis Text Line, a non-profit suicide hotline, came under fire after it was found to be sharing troves of data with its for-profit spinoff Loris.ai. The organization said that any data it shared with Loris.ai was “anonymized” and all conversation details had been removed, but the revelation was concerning nonetheless, given the intimate details people share with Crisis Text Line.
Mozilla's findings, along with these recent data-sharing revelations in the self-help app space, leave people who need help facing a difficult decision.
At face value, self-help and mental health apps promise to help those in need, and in practice, they often do. But there are clear privacy implications to using them, and no easy way to know whether a program that promises to protect you will also protect your data. As valuable as these apps may be in helping you address the struggles you're facing, they may also fall short of your privacy expectations.
Before users download self-help apps, therefore, it’s critical to evaluate their privacy policies, utilize tools like Mozilla’s *privacy not included, and read through user reviews. Even the apps that seem best at helping people through trying times may not go far enough in protecting them and ensuring their most intimate details are kept private.
If you're using Apple hardware, the App Store also displays each app's privacy practices in an "App Privacy" label, so you can see before you download it what data the app collects, whether it tracks you, and more. Be sure to review that section of the app's App Store listing to get a sense of what it does behind the scenes and whether you find those practices acceptable.
During times of distress, it’s critical users know that the apps they trust to help them through their difficult period are working in their best interests behind the scenes.