Highest Rated Comments


Mozilla-Foundation (1586 karma)

Pen and paper is an option that has worked for people for years. For people who want something a little more advanced than that, an app like Euki is a good option. It’s made by a non-profit, so it isn’t collecting your data as a business asset. It stores all data locally, so you keep control over it as long as you keep your phone protected and safe. And it has a special passcode a user can enter, if they’re ever forced to unlock the app when they don’t want to, that keeps the app from showing your real information. There are a couple of decent privacy options out there; you just have to search for them. And do your due diligence to understand whether you can trust them.
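For anyone curious how that kind of "duress" passcode can work in principle, here is a minimal, hypothetical sketch. It is not Euki's actual implementation; the passcodes, salt, and entries below are made up for illustration. The idea is simply that the app stores hashes of two passcodes, and the one you enter decides whether the real, locally stored entries or an empty decoy view is shown.

```python
import hashlib
import hmac

# Hypothetical sketch of a duress passcode. The app keeps salted hashes of
# two passcodes: the real one unlocks the real, locally stored entries; the
# duress one unlocks an empty/decoy data set, so nothing sensitive is shown.

SALT = b"per-install-random-salt"  # assumption: generated once per install

def _hash(passcode: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 100_000)

REAL_HASH = _hash("correct-horse-battery")   # chosen by the user
DURESS_HASH = _hash("0000")                  # the "show nothing real" code

REAL_ENTRIES = ["2024-05-01: period started", "2024-05-14: symptoms noted"]
DECOY_ENTRIES: list[str] = []                # or innocuous placeholder data

def unlock(entered: str) -> list[str] | None:
    """Return the entries to display, or None if the passcode is wrong."""
    h = _hash(entered)
    if hmac.compare_digest(h, DURESS_HASH):
        return DECOY_ENTRIES        # looks like a normal, empty app
    if hmac.compare_digest(h, REAL_HASH):
        return REAL_ENTRIES
    return None                     # wrong passcode: stay locked

if __name__ == "__main__":
    print(unlock("0000"))                    # decoy view: []
    print(unlock("correct-horse-battery"))   # real entries
```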

-Jen

Mozilla-Foundation (1281 karma)

There were so many things that left us feeling like these apps were especially creepy. One of the things that really stood out for me was just how much sensitive, emotional, and personal information these apps can collect. And then you just have to trust them with all that information. And to be honest, I just don’t trust most of these companies. They seem to care about profit first and protecting their users’ privacy way down the line from that.

Another thing that struck me as creepy was the issue of consent. Lots of the privacy policies we read said things that sounded quite nice, like, “We will never share or sell your personal information without your consent.” Hey, that sounds great. I just don’t give my consent and I’m all good, right? Well, maybe not. Because consent is confusing when it comes to these (and all) apps. To some of these companies, by downloading and registering with the app, it appears you have given them consent to use your personal information. Which probably isn’t what most people think of as consent. And then they tell you that to withdraw consent, you have to delete the app. Yuck. And then there’s the idea that these companies can change their privacy policy whenever they want, changing how they use/protect your personal information. The Verge wrote a great article about that after we published our *Privacy Not Included guide that I really appreciated: https://www.theverge.com/2022/5/4/22985296/mental-health-app-privacy-policies-happify-cerebral-betterhealth-7cups

-Jen C

Mozilla-Foundation (858 karma)

Some of the most common red flags from our experience are:

An app confronts you with lots of questions about your mental health and other sensitive data straight after download, before actually informing you about its privacy practices and how this data might be used.

An app asks for excessive access, such as to your camera, photos/videos, voice recordings, precise location, etc.

An app lets you connect Facebook and numerous other third-party integrations into its UX

The privacy policy does not make it clear whether you can easily get your data deleted

Based on the app’s privacy policy (usually the CCPA section), some of the app’s practices may be considered a “sale” of personal information under California law.

You are able to log in with a super weak password, such as ‘111111’ or ‘qwerty’ (see the sketch after this list for the kind of minimal strength check that would catch this)

An app forces or manipulates you into giving ‘consent’ to share your data for advertising

After signing up for an app, your inbox gets flooded with the app's marketing communications, and you do not recall ever giving permission to send you any marketing emails whatsoever

The app’s age rating differs from its perceived age rating. For example, some apps that visibly target kids (one listed 5+ on the App Store) write that no one under the age of 13 or 16 is allowed to use the app. - Misha, Privacy Not Included
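On the weak-password red flag above, here is a minimal sketch of the kind of sign-up check that would reject throwaway passwords like ‘111111’ or ‘qwerty’. The length threshold and blocklist are illustrative assumptions, not any particular app's real policy.

```python
# Illustrative password-strength gate, the kind of basic check many of
# these apps skip. Thresholds and blocklist are assumptions for the example.

COMMON_PASSWORDS = {"111111", "123456", "qwerty", "password", "letmein"}

def is_acceptable(password: str) -> tuple[bool, str]:
    """Return (ok, reason). Rejects very short or well-known weak passwords."""
    if len(password) < 8:
        return False, "too short (minimum 8 characters)"
    if password.lower() in COMMON_PASSWORDS:
        return False, "appears on a common-password blocklist"
    if password.isdigit() or password.isalpha():
        return False, "use a mix of letters, digits, or symbols"
    return True, "ok"

if __name__ == "__main__":
    for candidate in ("111111", "qwerty", "sunny-day-42!"):
        print(candidate, "->", is_acceptable(candidate))
```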

Mozilla-Foundation (408 karma)

To answer your first question:

Soooo many threats. There’s the chance your data could be accessed by someone who wants to accuse you of having an abortion and send the police to investigate, which could potentially lead to your arrest and prosecution for seeking reproductive health care. There’s the chance your data can be shared or sold to data brokers and then sold to pretty much anyone, and that’s not information you want the world to have. There’s the chance you’ll be targeted with dumb ads forever because they think you’re having a baby. And there are the stories of women who lose their babies to miscarriage and the emotional harm seeing those ads does. Because we’re talking about things like when your period starts, what your moods are, what your symptoms are, when your doctor’s appointments are, what baby name you’ve picked out, how much you weigh, your sexual orientation, and on and on and on. So, the threats are large. And one thing I tell people is, once you share this information on the internet, it’s out there. You no longer control it.

-JC

Mozilla-Foundation (386 karma)

That is a great point. As Misha and I were doing this research, we had the same thought: should mental health apps even exist? However, there is a way to do this better than it is being done by many companies now. Non-profit organizations actually make a few of the apps on our list, and their privacy policies are generally really great. They don’t collect a lot of personal information, and they don’t share it all around for advertising or treat it like a business asset. That is a great approach to this: have non-profits create and run these apps. That would mean people would have to support these non-profits with donations, though. And we did see that non-profits just didn’t have the same resources for security teams to think about the security of these apps, which stinks.

The other thing we’d love to see is policy and regulation catching up with the use of mental health (and all health) apps. Right now there are not a lot of regulations on what data these apps can collect, how they can use it, and how it must be protected. That needs to change. Stricter health privacy laws like HIPAA do cover things like the actual conversation you have with a human therapist, but not things like the fact that you’re talking with a therapist, how often, for how long, when, etc. That’s all data a company can use or share or sell.

Also, there’s the really interesting question about whether AI chatbots are covered by HIPAA. Talking to a human therapist means you have stricter privacy protections. Talking to an AI therapist doesn't necessarily have the same protections. -Jen C