Hi, we’re Jen Caltrider and Misha Rykov, lead researchers of the Privacy Not Included buyer’s guide (https://foundation.mozilla.org/en/privacynotincluded/), from Mozilla!

We’re also joined by the team from mental health app Wysa (one of the apps doing it right!), and we’re all here to answer your burning questions.

We've reviewed the privacy & security of some of the most used mental health apps.

With so many people putting incredibly sensitive information in the hands of big tech, we wanted to get a better look at their privacy policies to understand how users’ information is stored and, sometimes, sold.

Here is a summary of what we found: 20 of the 32 apps we reviewed earned our *Privacy Not Included warning label.

This includes popular apps like BetterHelp and Sanvello, with millions of downloads, which happen to be some of the worst in the bunch.

Privacy is not the default, and consent is like the Wild West.

Two apps made our Best Of category, including Wysa, who’s joining us today!

Learn more about our findings here: https://foundation.mozilla.org/en/privacynotincluded/categories/mental-health-apps/

AMA about our research, our guide, or anything else!

Proof: Here's our proof! https://twitter.com/mozilla/status/1665726359459471362

Thank you all for participating! If you'd like to dig into more details about the apps we reviewed, please visit *Privacy Not Included, and if you're curious about other Mozilla Foundation projects, you can sign up for our newsletter here!

Comments: 179 • Responses: 28

FierySharknado395 karma

Were there any common "red flags" in the shadier apps you found that someone could be on the lookout for to indicate an application may not be handling user data ethically?

Mozilla-Foundation858 karma

Some of the most common red flags from our experience are:

An app confronts you with lots of questions about your mental health and other sensitive data straight after download, before actually informing you about its privacy practices and how this data might be used.

An app asks for excessive access, such as to your camera, photos/videos, voice recordings, precise location, etc.

An app lets you connect Facebook or numerous other third-party integrations into its UX.

The privacy policy does not make it clear whether you can easily get your data deleted.

Based on the app’s privacy policy (usually the CCPA section), some of the app’s practices may be considered a “sale” of personal information under California law.

You are able to log in with a super weak password, such as ‘111111’ or ‘qwerty’ (a minimal version of the check an app should be running is sketched after this list).

An app forces or manipulates you into giving ‘consent’ to share data for advertising.

After signing up for an app, your inbox is flooded with the app's marketing communications, even though you do not recall ever agreeing to receive marketing emails.

The app’s age rating doesn’t match who it appears to be for. For example, some apps that visibly target kids (one listed 5+ on the App Store) state in their policies that no one under the age of 13 or 16 is allowed to use the app. - Misha, Privacy Not Included
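To make the weak-password red flag concrete, here is a minimal sketch (Python, purely illustrative and not part of our review methodology) of the kind of baseline check an app should run before accepting a password. Apps that let ‘111111’ or ‘qwerty’ through fail even this trivial bar:

```python
# Illustrative only: a tiny length-plus-blocklist check.
# The blocklist is a hypothetical sample, not a real breached-password database.
COMMON_PASSWORDS = {"111111", "123456", "qwerty", "password", "abc123"}

def is_acceptable_password(password: str, min_length: int = 8) -> bool:
    """Reject passwords that are too short or on the known-common list."""
    if len(password) < min_length:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True

print(is_acceptable_password("qwerty"))              # False
print(is_acceptable_password("tr4in-sounds-4ever"))  # True
```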

Ecks_136 karma

This reminds me of when you guys reviewed mental health apps last year too. For companies that want a passing grade in your Privacy Not Included guide, what's the cheat sheet? What things do companies need to start doing, at a bare minimum, to get a thumbs up from Mozilla?

Mozilla-Foundation167 karma

Yes! We did this same research last year and our findings were DEPRESSING (no pun intended). Too many mental health apps were absolutely awful at privacy. We did this research again this year to see if any of them had improved. And a few did, which was nice to see. Most of them improved because we worked directly with them to help fix things like the language in their privacy policy so it grants everyone, regardless of where they live, the same rights to access and delete data.

So, what things do companies need to do to get at the very least a thumbs sideways from us at Privacy Not Included? Well, don’t share personal information with third parties for targeted advertising purposes. That’s a big one. Grant everyone, regardless of what privacy laws they live under, the same rights to access and delete data. Don’t have a terrible track record of protecting and respecting your users’ privacy (looking at you, BetterHelp). And meet our Minimum Security Standards. These are some of the big things we ding companies for. I also really appreciate it when companies have clear, easy-to-read privacy policies without lots of vague language. And we also love it when companies respond to the questions we send them at the contact they list in their privacy policy for privacy-related questions. It’s amazing how many companies don’t seem to actually monitor that contact.

If anyone would like to read our methodology for reviewing companies, you can find it [here](https://foundation.mozilla.org/en/privacynotincluded/about/methodology/).

As for what a company has to do to get a thumbs up from us, well, that is much harder. Companies that make our Best Of list go above and beyond with privacy. They have lean data practices, meaning they collect as little personal information as possible, sometimes none at all. They write excellent privacy policies (like Wysa’s, theirs is one of the best we’ve read). They have an excellent track record. There aren’t many of those companies, which is why it’s pretty cool that Wysa agreed to join us here today to talk about why they put privacy first. - Jen C, Privacy Not Included

Mozilla-Foundation44 karma

Thank you, Jen. It was so cool to get the Best Of status from Mozilla, and to date it's the one status we are most proud of.

Bare minimum is definitely not our culture around privacy. Rather, it is the bare minimum around data sharing. We ask what the bare minimum of data is that we need to deliver a good outcome, and then work creatively from there to find the safest, most private ways of collecting and storing it. - Jo from Wysa

Britoz79 karma

How could a user possibly keep track of an app changing from being good with data to bad? For example what if I used Wysa for two years and for the first year they're great but the second someone else is in control and they sell data to cash in. How would I ever know?

I know they'll have to send out a notification but I got one from Virgin Australia earlier about their membership and I don't have the time to read it all. Or the inclination to be honest. I just want companies to do the right thing. I'm sick of finding a good thing then learning they're no longer good.

Mozilla-Foundation82 karma

Jen from Privacy Not Included: Great question! And unfortunately, the answer is, it’s really hard. Companies often count on users not reading privacy policies or keeping up with changes to them. And then there is the fact that just about every privacy policy I’ve ever read has a line that says your personal information can be shared if the company is ever sold or merged with another.

We saw this recently with Amazon buying iRobot, who makes Roomba robot vacuums. iRobot has actually been one of the very good ones on privacy over the years we’ve reviewed them. They earned our Best Of. Now they are being bought by Amazon, and Amazon is certainly not one of the good ones when it comes to privacy. It sucks to be someone who bought a Roomba because they were better at privacy, only to have all that data transferred to Amazon.

There are a couple things you can do. Delete your data frequently! Most companies have a way to delete your data from them. And we note on Privacy Not Included which companies are good and guarantee people the same rights no matter what privacy laws they live under. So, delete, delete, delete. Just put a note in your calendar every couple months to delete your data from any device you’re worried about. It’s dumb that consumers have to do this, but this is the world we live in.

And yes, you can also regularly check in on privacy policies. But let’s be honest, that’s time consuming and hard and nerdy and most people aren’t going to do that. We just want companies to do the right thing too. Which is why we love to highlight the good companies and call out the bad ones. Don’t support the bad ones if you can avoid it! And do support the good ones when you can. Until then, we’ll keep working to hold companies accountable and we’ll go out and read those privacy policies for you. Well, as many as our little three-person team can.

Mozilla-Foundation35 karma

Shubhankar from Wysa: That is a great question. The main thing is that you shouldn’t have to read the privacy policy to see whether or not the company is good. Just ask yourself whether you are clear on why your data is being asked for and how it is used. Look for an option clearly marked in settings that allows you to delete your data, and check how easy the company’s privacy policy makes it to learn what data they use and why within the first 20 seconds of reading it. Another good sign is whether the company clearly and crisply highlights what has changed with every policy-update notification. Lastly, Mozilla will do some of the work for you: they didn’t just review us last year, they did it again this year, and if we changed our policy they would definitely call it out!

forward_only46 karma

We hear a lot about all the ways our privacy is being violated, but very little about solutions. Is there anything people can do to protect their privacy now, and are there any viable policies which could help protect people's privacy in America on a broader scale?

Mozilla-Foundation50 karma

Misha from Privacy Not Included: Some of the tips we share with users are:

Choose your apps wisely. Do not sign up for a service that has a history of data leaks and/or gets a *Privacy Not Included ding.

Do NOT connect the app to your Facebook, Google, or other social media accounts or third-party tools, and do not share medical data when connected to those accounts.

Do not sign up with third-party accounts. It's better to just log in with an email and a strong password.

Choose a strong password! You can use a password manager like 1Password, KeePass, etc. (see the sketch after this list).

Use your device's privacy controls to limit apps' access to your personal information (do not give access to your camera, microphone, images, or location unless necessary).

Keep your apps and phone regularly updated.

Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit Ad Tracking) and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization).

Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data

When signing up, do not agree to tracking of your data if you can avoid it.
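On the strong-password tip above: a password manager will generate one for you, but as a minimal sketch of what “strong” means in practice (long, random, drawn from a large character set), something like this Python snippet, shown purely for illustration:

```python
# Illustrative sketch: generate a long random password, the way a password
# manager like 1Password or KeePass does automatically.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different on every run; store it in your password manager
```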

Mozilla-Foundation7 karma

Prachi from Wysa: Great suggestions, Misha!

And yes, a lot of protecting your own data is in your hands... begin with what's literally in your hand, your device!

Ask yourself... did you just "OK" all permissions without checking which apps have access to your camera, location, folders, and contacts list? Consent is a BIG deal in the privacy world, so begin there.

As for Wysa, privacy is built in by default and by design. We believe we don't need personal information like credentials to help ease your worries. A nickname is all we need to personalize our conversation with you. You can also opt out at any time using the “reset my data” feature available in the app settings.

We lay out many validated best practices in our privacy policy too, which will help you keep your device secure.

yarash25 karma

I use the Calm app for literally one thing. I really like the train sound effect in the background while I work. From a privacy standpoint, I take it I should just find a similar MP3 of a train and listen to that instead?

Mozilla-Foundation37 karma

Misha from Privacy not Included:

From a privacy perspective, finding an original sound to listen to would indeed be better, since it would not compromise any of your personal data. It could also save you some fees. Finding a CD and listening to it the analogue way would be even better. The larger point is, people do not have to share so much data to get the simple things that they like.

Zoe from Privacy not Included:

I think with these apps it’s always going to be a question of “is the data I’m trading worth what I’m getting in return?”

Calm is OK, but they do collect third party information about you. If you’re really only in it for train sounds, you might consider a music-streaming app that has less privacy risk. And even though those music apps probably aren’t perfect either, it’s better to have fewer apps collecting your data.

Peior-Crustulum21 karma

It's great that you take the time and are willing to spend resources keeping track of this issue.

Are there any effective ways to tell the industry that these practices will not be tolerated?

Mozilla-Foundation13 karma

Jen C from Privacy Not Included: Absolutely! Vote with your dollars. Don’t spend money at the companies with bad privacy policies and practices. Do spend money with companies with good privacy policies and practices. That’s a (somewhat) easy one.

Also, paying attention is good. I know there is SO much going on in the world to pay attention to and privacy is a hard one to keep up with. But make a little effort. Read a privacy policy before you download an app or buy a device. Search for words like “sell” or “combine,” or vague words like “can,” “may,” and “could.” Those raise flags. Move on if the privacy policy makes you uncomfortable.

Something else that’s happening now in the US is that the FTC is really stepping up and cracking down on bad companies doing misleading and dishonest things with your personal information. GoodRX, BetterHelp, PreMom, and Amazon have all received recent judgments from the FTC for privacy violations. You can sign up with the FTC for their consumer announcements. Yes, it’s nerdy, but hey, it’s an easy way to stay informed. Oh, and you can always check us out at *Privacy Not Included too. =D

benv13819 karma

Do you find these privacy violations go against HIPAA regulations?

Mozilla-Foundation46 karma

Jen C: HIPAA is tough when it comes to mental health apps. This US health care privacy law covers communications between medical professionals and you. So, a conversation with your doctor is covered by HIPAA.

A conversation with an AI chatbot or an “emotional support coach” is not always going to be covered by HIPAA because those aren’t considered “medical professionals”. And then there is all the other data outside of HIPAA -- things like your answers to an onboarding questionnaire, your usage data, whether you log in with your Facebook login -- those things aren’t covered by HIPAA and are fair game. So, consumers have to be very careful if they expect any of their mental health conversations to be covered by HIPAA, and do a little extra homework by reading privacy notices (ugh, I know!) and asking questions of the app to determine that.

I’m sure Wysa has their own unique experience with this too.

Mozilla-Foundation13 karma

Jo from Wysa - If a person is talking to Wysa after a clinician referral in a healthcare pathway (not in b2c), certain parts of their interaction with the app become a part of their medical record. This is a very specific kind of implementation that is not anonymous from the healthcare provider perspective, though Wysa still doesn’t store any personal identifiers alongside their conversation. These parts of the healthcare record are covered by HIPAA, and providers do need to be very careful with data security here in any case. However, even in this case, with Wysa the conversation about what is bothering you and what thoughts you are having is never shared. These are private even when discussed in in-person healthcare settings, and they remain so here. Most of Wysa is used at a population health level though, where it is completely anonymous and not linked to health records, and as such HIPAA does not apply.

Mennix12 karma

What were some of the more surprising discoveries (both good and bad) that you came across? Any particularly good/creative ideas you can call out?

Mozilla-Foundation19 karma

Misha: There were more bad than good discoveries in my case.

Some of the bad discoveries:

The way many companies manipulate people into giving their ‘consent’, using tricky UX practices or outright denial of service if no consent is given

Many apps confront users with detailed questionnaires on gender, physical diseases, mental states, children, relationships, etc. BEFORE showing a user the Privacy Policy or asking for consent

Most apps are packed with third-party trackers, including advertising trackers.

The good discoveries: over the last year, many apps improved their password standards. Also, we were happy to see that the FTC went after BetterHelp, at last.

marklein11 karma

Has there yet been any legal restrictions proposed at the USA federal level regarding things like this?

Also, thanks for your work AND MOZILLA RULES.

Mozilla-Foundation16 karma

Thank you for supporting us here at Mozilla! That makes our day.

As for things happening at the federal level, we’ve noticed a couple things happening at the federal level in the US since we first released our mental health app research last year. The one that probably made me the happiest was when the FTC issued a $7.8 million judgment against BetterHelp for misleading their users about never sharing their personal information. That was great to see as BetterHelp did seem to us to be rather questionable (to put it nicely) in their privacy practices. You can read about that here.

And there have been some US Senators stepping up to ask questions and propose potential federal legislation to protect health data.

And some states, especially California, have proposed legislation to tighten restrictions on health data. We actually wrote more about that here.

I’m not a policy person, so I know there are other federal and state-level privacy laws being proposed. These are the things I’m most familiar with, though.

MudraMama9 karma

Do you have a newsletter or something that can give people regular updates on your research? Love what you're doing, so thanks to your team for putting the time into this very neglected topic.

Mozilla-Foundation10 karma

Zoe from Privacy Not Included: For sure! You can sign up here

And thanks so much, we love to get feedback that our work is appreciated. I also agree that privacy doesn’t get enough air time generally.

Privacy Not Included is just one slice of the work that the Mozilla Foundation is doing to help shape the future of the web for good. You can learn more about us here!

Onepopcornman7 karma

  1. Do you have a report/white paper/journal style article of your findings? (link provided is not the most friendly for finding that)

  2. Some of these apps aim to make services more accessible or more affordable outside of an insurance setting: is there an indication that data collection is subsidizing how these apps (thinking of Better Health) become affordable?

Mozilla-Foundation6 karma

Jo from Wysa: There is definitely an incentive for founders and startups to use data collection as a part of their fundraising story, especially when they are pre-revenue. However especially in healthcare, there are significant regulatory and ethical barriers, and we have not seen any company become successful in the long run either from a sustainability / impact perspective or even in their ability to monetize private data. Where we have seen success in the sector is in monetizing aggregate data, and analytics around it, and that is something that can co-exist with good privacy policies. Good privacy is good for business too.

Mozilla-Foundation5 karma

1. This article summarizes our findings on all the mental health apps we reviewed this year.

Mozilla-Foundation5 karma

Oof, where to begin. Short answer: yes! Your data is a business asset to these companies. They profit from collecting, sharing, and (sometimes) selling it.

So many of the apps and services we use mine our data for profit. It’s all bad but it feels especially wrong for apps that collect such intimate and sensitive information about our mental health.

That this makes apps more accessible is in some ways good, but at what cost? People shouldn’t have to pay for mental healthcare with their privacy.

Dontdothatfucker6 karma

Hey, thanks for doing this!

I hear all the time about my private info being leaked. Bank accounts, passwords, personal information, shopping habits, the list goes on.

Would you be able to point out some of the dangers of lax security in cases like this? What will the companies do with my information? What are some of the potential hazards of strangers knowing my mental (or other) health data?

Thanks!

Mozilla-Foundation4 karma

Andres from Wysa: Based on what you've shared, if personal information such as bank accounts, passwords, or shopping habits is leaked or breached, it becomes available to malicious actors, who can potentially use it to access your accounts on other services like Facebook or Instagram. This is especially risky if you reuse the same password across different services, or if attackers use your personal information to impersonate you and spam or phish others. If they have access to your bank account information, they may even try to impersonate you online or by phone to change your password and withdraw your funds. Companies collecting this information should have clear reasons for doing so, such as billing, and implement strong security practices to protect it. At Wysa, we don't ask for this information, as it's not necessary for our free chatbot app. It's also important to note that if a malicious actor learns your mental health status, they can use social engineering techniques to take advantage of you when you are vulnerable. To reduce this risk, Wysa anonymizes your data and what you share with us.

vandom6 karma

This is such an interesting subject. What led you to conduct this research in the first place?

Mozilla-Foundation10 karma

Well, to be honest, did any of us come out of the pandemic not feeling the strain on mental health? That really was it. We were feeling it and then we were seeing the explosion in mental health apps. Nerdy curious privacy researchers + mental health app explosion + anxiety about the world was kind of the perfect storm that led us to do this research. And we’re really glad we did because what we found was bonkers in a bad way and the good thing about working as a privacy researcher for Mozilla is, when you find something badly bonkers, you can help raise awareness to do something about it. That’s pretty cool.

Now we’re seeing regulators in the US and Europe pay more attention to these apps. We’re seeing the FTC in the US fine these companies and issue judgments against their bad privacy practices. And we’re slowly seeing companies make some positive changes. It’s a start. There is still so much farther to go, but hey, it’s a start.

Comfortable_Occasion6 karma

Hey all, thank you for doing this AMA, where does headspace fall on this list?

Mozilla-Foundation7 karma

Jen from Privacy Not Included: Headspace isn’t the worst mental health app we reviewed. But it’s far from the best either. In fact, this year they moved a bit down on our list and earned our *Privacy Not Included warning label for how they use your personal information for things like targeted advertising, and also for not clearly granting all users the same rights to delete data regardless of what privacy laws they live under. You can read our review of Headspace here.

To be fair to Headspace, they have been in communication with us about our concerns. And they have stated to us that they will review their privacy policy and look into updating their language to clearly state that all users have the same rights to delete data, no matter where they live. If/when they let us know they have made that update to their privacy policy, we will update our review.

ndmy4 karma

Do you see a future where the privacy and security of health apps in general improve?

This seems like a category with so much potential to improve the quality of life of its users, and it's a shame to see it bogged down by skeevy commercial practices.

Mozilla-Foundation4 karma

Shubhankar from Wysa - Google Play has lately introduced a Data Safety form which all developers have to complete and which will help users better understand an app’s privacy and security practices.

Similar initiatives can be seen on the App Store, letting you see what data an app will collect and why before installing it.

Such measures at an ecosystem level can push towards responsible disclosure of data collection and processing, but they are very limited in scope and enforcement as they stand today. There are also watchdogs and initiatives, like the ones led by the FTC, to put some form of security and governance guardrails in focus for products, which look promising in terms of enforcement where others fail.

Mozilla-Foundation7 karma

Jen from Privacy Not Included: Oh man Shubhankar!

You hit a hot button issue for me. Those Google Play Store Data Safety labels. Whoo! We released some research early this year into the accuracy of those labels, and it showed that nearly 80% of the apps we reviewed had false or misleading information in there. In part that's because the information is self-reported by the apps and Google doesn’t police that self-reporting very closely. And also because Google’s own rules for what companies must report on that page are rather crappy. You can read our research here if you are interested.

When Twitter and TikTok both claim on their Data Safety pages that they don’t share data with third parties, you know something is up. The bottom line is, I tell people not to trust that information and instead look directly at privacy policies.

All that being said, Wysa is correct that there is a push/pull here. And everyone holding companies accountable to do better -- consumers, regulators, the employees at the companies -- is needed to make this space better and safer. Because, yes, mental health apps are needed and helpful to many today and I don’t want to take away from that with privacy concerns. I just want companies to not try to make money by monetizing people’s personal information when they are at their most vulnerable.

Mozilla-Foundation2 karma

Ramakant from Wysa - There seems to be a push and pull here. There will always be apps that try to wing it, or push the envelope on what they can get away with. In general though, we feel that the direction is positive - lots of stuff has happened recently that will nudge everyone (sometimes with a carrot, sometimes with a stick) towards more responsible stewardship of user data. Apps will understand that bad data privacy is, in the long term, bad for business too.

drinkNfight3 karma

Did any of the "good" apps fund any part of your organization or study?

Mozilla-Foundation9 karma

Absolutely not! We don’t take money or incentives from companies to fund our work.

That is hugely important to me as a privacy researcher and consumer advocate leading Mozilla’s *Privacy Not Included work. We don’t do affiliate links, we don’t accept “test products”, we don’t take money from these companies. We do our research and accountability work with the resources we have from Mozilla. (Note: Mozilla Foundation is a non-profit. That means those resources are in large part funded by small-dollar donations from people like you all. So if anyone has a few bucks to donate to support our work, well, I will never say no to that! You can donate here.)

PS. Someday I will have to share my rant about how affiliate links have absolutely ruined the land of consumer product reviews. I loathe affiliate links. But that is a rant for another time.

pop_skittles2 karma

I have a family member who is currently using Betterhelp. He has been for about 3 weeks. What kind of things should he be aware of/ look out for?

Mozilla-Foundation6 karma

Jen C: The biggest issue I have with BetterHelp is trusting them to not use personal information in ways they claim they aren’t using it. They just got busted by the FTC for breaking the promises they made to their users not to share private health information. You can read more about that here.

The FTC did make them promise to do better. But that’s one of those “I’ll believe it when I see it” things for me. If you’d like to share our review of BetterHelp with your family member, you can find it here.

Basically, they earn all three of our privacy dings, which means they earn our *Privacy Not Included warning label. That means I would recommend your family member be very cautious about sharing any personal information with BetterHelp, frequently ask for their data to be deleted, and delete the app from their device when they aren’t using it.

TheLastMaker1 karma

Did you find differences specific to Android vs iOS app versions?

Mozilla-Foundation3 karma

For our research, we look mostly at publicly available information like privacy policies, company responses to our questions, white papers, news articles, and app store pages. This tells us a lot about the privacy and security practices of the companies who build the apps. Digging deep into the technical specifics of the app isn’t something we do a ton of. Although my research partner Misha does download all the apps and go through the set-up process and look to see how many trackers and such an app might be using. 

I think mostly both versions, Android and iOS, collect the data and then the company runs with it. Apple does make it a little easier to opt out of tracking when setting up an app, which is nice. But I don’t know that there are huge differences in privacy whether you use the Android or iOS app. Perhaps Misha can also weigh in here and offer his more technical insights on this.

Mozilla-Foundation1 karma

Misha from Privacy Not Included: Usually, iOS gives users wider privacy controls, like opt-in tracking and reminders about access permissions that a user might have granted and forgotten a long time ago. After all, Apple is not in the business of targeted advertising, unlike Google.

However, privacy-wise, the difference is marginal. On both platforms, apps are packed with trackers, and apps try to get the maximum access possible (often including to your camera, photos/videos, audio, precise location, etc.). So we suggest that on either iOS or Android, you manually adjust the access you give every app, limiting it to what is absolutely necessary.

Security-wise, iOS is closed-source and a bit less vulnerable to cyberattacks. However, that is also the reason why Android apps are a bit easier for us to research: there are numerous open-source investigative tools that allow us to track data flows from Android apps and call them out.
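As a rough illustration of how that kind of tracker scanning works (a simplified sketch, not one of the actual tools we use; the prefix list and sample class names are hypothetical), the core idea is just matching the package names bundled inside an app against known tracker SDK prefixes:

```python
# Simplified sketch: flag known tracker SDKs by matching package-name prefixes
# against a (hypothetical, far-from-complete) list of tracker signatures.
KNOWN_TRACKER_PREFIXES = [
    "com.facebook.ads",            # Facebook Audience Network
    "com.google.android.gms.ads",  # Google Mobile Ads
    "com.adjust.sdk",              # Adjust
    "io.branch",                   # Branch
]

# Hypothetical sample of class names you might see in a decompiled app.
sample_class_names = [
    "com.example.mindapp.MainActivity",
    "com.facebook.ads.AdView",
    "com.adjust.sdk.Adjust",
]

def find_trackers(class_names):
    """Return the tracker prefixes that appear among the app's class names."""
    hits = set()
    for name in class_names:
        for prefix in KNOWN_TRACKER_PREFIXES:
            if name.startswith(prefix):
                hits.add(prefix)
    return hits

print(find_trackers(sample_class_names))  # contains 'com.facebook.ads' and 'com.adjust.sdk'
```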

jofish221 karma

Hey folks (and particularly Jen C) — do you have any way to look at the effectiveness of these apps, and are you seeing any correlations, positive or negative, between ethical privacy approaches and effectiveness?

Mozilla-Foundation2 karma

That is a good question. Unfortunately, looking for correlations between good or bad privacy practices and the effectiveness of the apps goes beyond the scope of our work. If there is a research group out there with some good funding, this would be a very interesting research project to take on. I’m not exactly sure what the methodology would look like, but I would certainly read the results. Thank you for asking this question.