We're the researchers who looked into the privacy of 32 popular mental health apps and what we found is frightening. AMA!
UPDATE: Thank you for joining us and for your thoughtful questions! To learn more, you can visit www.privacynotincluded.org. You can also get smarter about your online life with regular newsletters (https://foundation.mozilla.org/en/newsletter) from Mozilla. If you would like to support the work that we do, you can also make a donation here (https://donate.mozilla.org)!
Hi, we’re Jen Caltrider and Misha Rykov, lead researchers of Mozilla’s *Privacy Not Included buyer’s guide!
You can learn more: https://foundation.mozilla.org/en/privacynotincluded/categories/mental-health-apps/
Proof: Here's my proof!
One of the things that really stood out for me was just how much sensitive, emotional, and personal information these apps can collect.
Isn't that data collection essential to their value proposition? How could an app like Bearable do what users want it to without storing sensitive personal info?
And to be honest, I just don’t trust most of these companies. They seem to care about profit first and protecting their users’ privacy way down the line from that.
Is that impression based on anything objective? If Happify, for example, were a privacy-first company that prized user privacy far above investor returns, what would that look like to privacy researchers on the outside?
To make those questions a bit more broad, if you were to found a company that made a personal health app that required collection and storage of personal information for its core functionality, what would you do differently to ensure that user privacy is prized above profit? How would privacy researchers be able to tell that that is the case?
This is true. A mood/symptom tracker like Bearable does have to collect that data to provide their service. The concerns come in when you have to trust companies to protect that sensitive information and then when they say they could use some of your personal information (but not necessarily your sensitive health data) to target you with ads or personalize the service to keep you using it even more. What we want to see are absolute assurances in the privacy policies of these companies that they are taking every step possible to protect and secure that data. We want to see things like:
- A clear statement promising to never sell that data.
- A clear statement showing what third parties they share this data with and for what purpose.
- When they say they will never share personal information without your consent, what does that consent look like? How clear is it?
- We want to see a promise to never collect data from third parties to combine it with data they already have on you unless there is an absolutely clear reason they need to do that to offer the service.
- And we want to see that all users, no matter where they live and what laws they live under, have the same rights to access and delete their data.

This is what we dream of seeing as privacy researchers. This, and for companies to stop using annoyingly vague and unclear language in their privacy policies that leaves them all sorts of wiggle room to use your personal information as they want.
Hi there, thanks for this important research! My question:
Given the way the app / data economy is built... would you say that apps + mental health are basically an incompatible pair? Is it even realistic to think this can *ever* be done in a trustworthy way?
It kinda seems like going to therapy in the middle of a crowded mall. It could never work!
That is a great point. As Misha and I were doing this research, we had the same thought: should mental health apps even exist? However, there is a way to do this better than many companies are doing it now. Non-profit organizations actually make a few of the apps on our list, and their privacy policies are generally really great. They don’t collect a lot of personal information, they don’t share it all around for advertising, and they don’t treat it like a business asset. That is a great approach: have non-profits create and run these apps. That would mean people would have to support those non-profits with donations, though. And we did see that non-profits just didn’t have the same resources for security teams to think about the security of these apps, which stinks.
The other thing that we’d love to see is policy and regulation catching up with the use of mental health (and all health) apps. Right now there are not a lot of regulations on what data these apps can collect, how they can use it, and how it must be protected. That needs to change. Stricter health privacy laws like HIPAA do cover things like the actual conversation you have with a human therapist, but not things like the fact that you’re talking with a therapist, how often, for how long, when, etc. That’s all data that a company can use or share or sell.
Also, there’s the really interesting question about whether AI chatbots are covered by HIPAA. Talking to a human therapist means you have stricter privacy protections. Talking to an AI therapist doesn't necessarily have the same protections. -Jen C
Which apps (nonprofit or otherwise) do you feel are "doing it right"? Which apps (if any) would you feel comfortable using?
BetterHelp is owned by the telehealth company Teladoc. Should we be concerned about how telehealth platforms are leveraging our data?
Absolutely! Be concerned. Ask questions to all your doctors and therapists about how they see your data being handled. Only share what is absolutely necessary. Opt out of data sharing where you can. Ask your health care provider to only share with the telehealth company what is absolutely necessary. Raise your concerns and have a conversation with your health care providers. -Jen C
Hi Jen and Misha,
What kind of data is compromised and in what way is that data dangerous (besides targeted ads of course)?
P.S. Thanks for researching this as maybe no one would have bothered with looking into mental health apps assuming they were for good.
Yes, that is exactly what could and does happen. I spoke with someone during the course of this research who told me they have OCD. And then told me about how they now have OCD ads that follow them around on the internet. Which, shockingly, isn’t good for their OCD.
The content of your conversations with your therapist may be secure. The fact that you're having that conversation with the therapist, when, how often, and potentially even what that therapist specializes in are not necessarily secure or protected by stricter privacy laws like HIPAA. -Jen C
Are these apps not subject to HIPAA?
Parts of them are. But not all of them. For example, if you use an online therapy app to talk with a therapist, those conversations are covered by HIPAA. However, there is data about those conversations that isn’t covered, like the fact that you are using an online therapy app, how often, when, and where. All of that information is not necessarily covered by HIPAA. This is the big, murky gray area where HIPAA ends and our data economy begins, which worries us so much. -Jen C
Great question! I suppose you can reach out to the company through customer service, mention you are a shareholder, and say that you have concerns and would like to know how you can get them addressed. I would caution that we’ve seen a good number of crisis communications from these companies that say things like our research is “untrue” or the like. Our research is based on what these companies say themselves publicly and the responses we get from them (which isn’t often). So, push these companies to really clarify their policies. In one question above, we outlined 5 things we’d like to see all these privacy policies include. You could ask them if they would be willing to include those 5 things. -Jen C
But that’s a problem with any app. The way this is being framed is a condemnation of mental health apps, but in reality it’s not unique to this space.
Would you draw the same conclusion studying apps of any category?
Mental health apps are not drumming apps. They collect a whole lot more personal information about me. Information that I absolutely do care if the world knows about me, like my mood, if I’m seeing a therapist, what mental illness I might be struggling with, what medications I’m on, and even conversations with others about my deepest, darkest thoughts. Hell yes, I want that information treated differently than the information my drumming app collects. And sometimes it is. But not all of it and not always. And when companies are trying to make money, you also have to worry about how secure that info is, how they handle it, how quickly they are trying to grow and expand their business, and whether that growth is costing them the time to worry about my privacy and the security of my personal information. -Jen C
Not necessarily. They could still collect your device ID. There are actually so many ways companies can track you on your phone, computer, and through connected devices. I’m not sure we even truly understand all the ways companies can track you. Resetting your advertising ID won’t hurt. Will it protect you from being tracked around the internet? Probably not in the way you hope. -Jen C
Are chats and such on these apps protected (legally) in the same way as an in person chat with a therapist? Obviously, they aren't as secure, but even in-person therapy is often recorded.
Chats with licensed therapists are generally protected by stricter health privacy laws, yes. Chats with AI chatbot therapists, those are not always covered by stricter privacy laws. And chats with listeners or other types of people who aren’t licensed therapists are often not covered by stricter health privacy laws, as far as we can tell. -Jen C
Hi, thanks for doing this for starters.
I used the BetterHelp app for a while, and while BetterHelp says all chat logs are secured and encrypted, a therapist on the app told me that chat logs were routinely checked and read by BetterHelp employees who were not therapists, even when the logs contained no threats of violence to or by the patient that would constitute a mandatory reporting scenario, and without prior authorization to review the chat logs in question. She told me that BetterHelp violates HIPAA in this regard by allowing employees other than the therapist access to the chat logs. It is the sole reason that I stopped using the BetterHelp service. She also implied that BetterHelp employees would check the video feeds while therapy was in session, but when I asked her for more information about this claim, she was unwilling to give me more specific details.
Due to this experience with BetterHelp, I will never use an online therapy service ever again.
My question to you is, was BetterHelp one of the services that refused to answer any questions?
We can confirm that BetterHelp ignored our questions. We cannot confirm whether the chat logs are read by BetterHelp employees - that would be horrible. What we do know is that the Economist reported that the app might be sharing chat information with advertisers. The article (https://www.economist.com/business/2021/12/11/dramatic-growth-in-mental-health-apps-has-created-a-risky-industry) quotes a user: “When I first joined BetterHelp, I started to see targeted ads with words that I had used on the app to describe my personal experiences.” -Misha R
One interesting note on this. A friend of mine uses BetterHelp and got a customer survey from them. In it, my friend mentioned that they were concerned about BetterHelp’s privacy practices because of our *Privacy Not Included review. The response she got was quite interesting. BetterHelp responded and said that what we said in our review was untrue (it’s not), and that they were “working to address the misunderstanding with Mozilla”. Interestingly, we have not heard once from BetterHelp, even after we reached out to them multiple times. -Jen C
Did you find any apps that were actually trustworthy?!
We recommend two apps (among the 32 we’ve reviewed). PTSD Coach (https://mobile.va.gov/app/ptsd-coach) is a free self-help app created by the US Department of Veterans Affairs’ National Center for PTSD. Since the app developers are not making money off users’ data, the privacy of this app is decent: it does not collect any personal data in the first place :) The second is Wysa (https://www.wysa.io/), which also earned our Best Of distinction.
Looks like you've linked the va tool twice rather than linking Wysa (https://www.wysa.io/) in the second instance.
Thank you for catching that! We have updated the answer with the correct link :)
Have you thought about doing this for DNA tests?
Here’s how that conversation went: Should we review DNA tests? Why would we do that? No one should give their DNA to a company, ever. That’s not personal information anyone should have anywhere outside of your doctor, and even then it’s scary.
So, that’s a no? Absolutely. People, never share your DNA with a DNA testing company! Even if they say they will protect it, they can’t guarantee that. And you don’t need anyone in the world to have access to your DNA. Finding out if you’re part Neanderthal, while really cool, is not that important.
But people are sharing their DNA with companies because those companies offer a service that those people see as having value. Personally, I don't even understand what the risk would be of sharing my DNA with a company? Like, I'm not going to try to clone myself and sell a million copies so the copyright on my DNA doesn't seem particularly relevant?
What do you think the risks actually are for people?
Think of it this way. You’re likely not just sharing your DNA with this one company but also: Law enforcement, hackers, snoopy employees, the company that buys this company in the future who might not have great intentions, your government, the government of another country that hates your country, the person digging through the dumpster behind the company you shared your DNA with after they accidentally throw out sensitive records, aliens, zombies, and maybe even that mad scientist friend of yours you went to high school with.
The problem is, once this data is shared, there's a very good chance it won’t be kept 100% secure over your lifetime or the lifetime of your kids or grandkids. And your DNA is about the most sensitive personal information you have. You do not want that in the hands of anyone else. -Jen C
Were you able to detect which companies are buying this sensitive information?
Great question! We haven’t done that research yet. But we are hoping to dig into this a little deeper in a couple of ways. First, by looking at the traffic shared between the apps and third parties like Facebook and Google. Unfortunately, that can only tell us so much (and not that much, usually). We are also looking into buying some data from data brokers in an ethical way to see what we can learn on this front. Truth be told, we just don’t know if we can find this out, which is really pretty damn scary. -Jen C
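(For the technically curious: here’s a minimal sketch of what that kind of traffic inspection can look like, written as a mitmproxy addon in Python. This is an illustration of the general approach, not our actual research tooling, and the tracker domains listed are just examples.)

```python
# tracker_log.py - a minimal mitmproxy addon: route a test device's
# traffic through the proxy, then log requests to known tracker hosts.
# Run with: mitmproxy -s tracker_log.py
# The domain list below is illustrative, not exhaustive.
from mitmproxy import http

TRACKER_DOMAINS = (
    "graph.facebook.com",
    "app-measurement.com",  # Firebase/Google Analytics traffic
    "appsflyer.com",
)

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
        print(f"[tracker] {host} {flow.request.path[:80]}")
```

As noted elsewhere in this thread, developers make this kind of inspection hard, so a proxy like this only catches so much.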
(PS. If you’d like to help support our work, Mozilla is a non-profit and *Privacy Not Included relies in part on donations from wonderful folks like you to do this work. Donations are always welcome, even just a few bucks if you have them. https://donate.mozilla.org/ )
It's really disheartening to hear about these apps that fail your privacy checklists! Can you tell us which ones you've found that didn't fail and/or which ones you would recommend (if any)?
We recommend two apps (among the 32 we’ve reviewed). PTSD Coach (https://mobile.va.gov/app/ptsd-coach) is a free self-help app created by the US Department of Veterans Affairs’ National Center for PTSD. Since the app developers are not making money off users’ data, the privacy of this app is decent: it does not collect any personal data in the first place :) The second is Wysa (https://www.wysa.io/), which also earned our Best Of distinction.
Why is it not well known that the point of most apps, social media and search engines is to sell out the user? If it is free you are the product.
We believe that “user is a product” must not be the norm. Especially when we are dealing with apps that collect data about vulnerable moments of potentially vulnerable groups of people, like in the case of mental health apps.
We could also see that most of these apps offer subscriptions AND capitalize on your data. When apps maximize profits, they do it in all possible ways. That is why, in certain jurisdictions, regulators step in. GDPR in Europe and CCPA in California are making selling data harder. They also require the consent of users. And we strongly believe that sharing personal data with third parties must not happen without a user’s clear consent (in an opt-in manner). -Misha R
Hi Jen, and hello Misha. I'm on the older side of users on this website, so a lot of the time people consider my concern about data collection to be the perspective of a local curmudgeon. What can I do to set these people right? How can we take down massive corporations that have poisoned the well in more ways than one, including literally, during my wrinkly lifetime?
First off, here’s to all the wrinkly curmudgeons of the world! Unite!!!! I’m right there with you.
And if you’re asking what works to help people see the light of the ever-growing privacy concerns in the world today, that is a great question. Here’s what I do know. People HATE to be told what to do (remember being a teenager and hating all the adults telling you what to do). People, however, LOVE to see themselves as the hero of the story. So approach these conversations not from a “Hey idiot, you should be doing this!” perspective (as good as that feels, it rarely works). Approach it more by asking them what they do to protect their privacy now, ask them to tell you how they are already the hero of their story, and then see how they might expand on that to do even more, to become an even bigger hero. Then maybe give them some tools to do that. Like, share our *Privacy Not Included guide and let them know it’s a great resource as they shop for connected products; then they’re an even better-armed superhero in the epic battle for privacy rights in our world. -Jen C
Thanks for the well-thought-out answer, JenC, much appreciated! What data can I trade you in exchange for the Privacy Not Included guide? And can one of you please swing back and answer the second question?
We don’t want your data! Mozilla follows lean data practices, so we don’t want your data. As for how we can take down massive corporations? Well, I don’t think you and I are going to do that. And maybe we shouldn’t. Maybe we should focus on working to make these massive corporations be better, do better, and treat us all better. It’s maybe a Sisyphean task, but not one we should give up on. -Jen C
Thanks once again, JenC, for your well-thought-out replies. What exactly are the principles of lean data practices? And how can Mozilla convince corporations to adopt the same practices when they are raking in the cash hand over fist with big data? Also, does Mozilla hire retirees or are young people preferred?
Here’s some info about Mozilla’s lean data practices that I hope helps. Basically, lean data practices mean only collecting the bare minimum of data you need to provide the service you offer. https://www.mozilla.org/about/policy/lean-data/stay-lean/
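(To make “bare minimum” concrete, here’s a tiny hypothetical sketch: a mood tracker really only needs the mood and a timestamp. The schema and names here are made up for illustration.)

```python
# Lean data in practice: a hypothetical mood-tracker entry that stores
# only what the feature needs - no name, email, location, or device IDs.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MoodEntry:
    mood: str            # e.g. "anxious", "calm"
    logged_at: datetime
    # deliberately absent: real name, email, GPS location, advertising ID

entry = MoodEntry(mood="calm", logged_at=datetime.now(timezone.utc))
print(entry)
```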
How can Mozilla convince corporations to adopt these practices? Well, corporations care about money. So, we need to show that there’s money in protecting users’ privacy. That means consumers have to vote with their dollars and choose to support companies that have better data practices over ones that don’t. And I’m not a hiring manager at Mozilla, but I do know we hire some pretty cool people of all ages. I myself am no spring chicken. -Jen C
I saw in your exploration that sites like BetterHelp, which provide a platform for virtual therapy services, share data/metadata with their parent companies, with Facebook or other social media entities, and the like.
As a therapist (in Canada) I am beholden to a certain privacy standard, I know similar laws exist in the US. These standards are strict. I cannot even acknowledge my client is a client of mine seeking services without consent or unless certain criteria are met.
I realize many therapists on these platforms probably don't realize this either, but outside of that deception, I am seriously concerned with how they're even recruiting therapists, who are beholden to the same or similar standards of data protection as I am. Do platforms like this not automatically breach such privacy legislation? Do they just pass the liability off onto therapists, who may not be as aware of these things as they should be?
Thank you for bringing this to light!
Thank you for your response! Your perspective as a therapist is one I’ve been very curious about. There are so many questions about the laws that govern you as a therapist on these apps and the data outside of your protected conversations with your clients that we just can’t seem to clearly answer. It’s scary. I would love to hear more from therapists like you about your experiences on these apps and the concerns you have. -Jen C
Good day! First of all, thank you for bringing this matter into the spotlight and secondly, for holding this AMA.
In the past, on Windows, one could install a stateful packet inspection firewall and use it to monitor, allow or disallow traffic. There seems to be no cognate to that in the app ecosystem.
My question: Is there any way for us mere mortals to find out what data is being sent/received, and where, from an individual app running on iOS?
Thanks for your time.
You can start inspecting your app activity data. Here is a guide for iOS: https://developer.apple.com/documentation/network/privacy_management/inspecting_app_activity_data
And Google announced just two weeks ago that you can do more of such inspection in Android, too: https://blog.google/products/google-play/data-safety/
That said, app developers are making it hard to inspect this data traffic. And much data sharing happens between an app and a third party behind closed doors. So, there is only so much that we can intercept. -Misha R
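(If you want to dig into that iOS data yourself: the App Privacy Report can be exported as an .ndjson file, one JSON object per line. Here’s a minimal sketch of tallying which domains each app contacted. The field names used below, "type", "bundleID", and "domain", are our assumption about the export format and may vary by iOS version.)

```python
# Tally app -> domain contacts from an App Privacy Report export.
# The export is NDJSON (one JSON object per line); the field names
# below are assumptions and may differ across iOS versions.
import json
from collections import Counter

def tally_domains(path: str) -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            if record.get("type") == "networkActivity":
                counts[(record.get("bundleID"), record.get("domain"))] += 1
    return counts

if __name__ == "__main__":
    for (app, domain), n in tally_domains("App_Privacy_Report.ndjson").most_common(20):
        print(f"{n:5d}  {app} -> {domain}")
```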
Good question! The simple answer is money. Data is valuable. Data that gives insights into people’s minds is really valuable. The more the company/advertisers/data brokers know about you, the better they can target you to sell you more things, keep you on the app longer, ask other companies to buy into the data they have so they can sell you more things, grow their audience of users, keep you engaged, and more. Data = Dollars.
Hi! Thanks for doing this!
Do y'all know of similar research into other sorts of self help apps? Apps that help you track diet, health, periods, sleep, habits, and so on? I imagine this area is just rife with data brokerage.
You’re correct that this area is likely rife with data brokerage. We have seen Consumer Reports and a study from Harvard also doing research into these apps. But not a whole lot at the moment. And when you learn that there are somewhere between 10,000 and 20,000 of these mental health and wellness apps (and more every day), it becomes overwhelming just how hard it is to look deeply into this area and keep up with the research. We hope that the research we’ve done on the 32 mental health apps in our *Privacy Not Included guide will give consumers an idea of what questions they should ask and how they can look into apps that we don’t research. -Jen C
Do you have guidance on best practices to NOT behave in a predatory or negligent or harmful way with user data?
- Do not sell personal data and do not share it with anybody without consent
- Do not ask for consent in a way that is misleading or unclear, and do not take consent for granted (with data or anything else)
- Do not make it hard to access or delete users’ data, regardless of where users live
- Do not keep data for longer than is needed for the purposes it was collected for
- Do not collect data for purposes that go beyond ensuring the functionality of the product
- Do not combine collected data with data from third party sources unless for a clearly stated, legitimate business purpose
- And finally, do not ignore security researchers when they reach out to report a security vulnerability; better yet, offer them a bug bounty!
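(To make the retention point in the list above concrete, here’s a minimal sketch of what enforcing a retention window could look like server-side. The table and column names, "mood_entries" and "created_at", are hypothetical.)

```python
# retention.py - purge user records older than the retention window.
# "mood_entries" and "created_at" are hypothetical names for illustration.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # keep data only as long as the service needs it

def purge_expired(db_path: str) -> int:
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM mood_entries WHERE created_at < ?",
            (cutoff.isoformat(),),
        )
    return cur.rowcount  # rows deleted; run this on a schedule (e.g. daily cron)
```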
Do Mozilla Foundation or anyone else you're aware of offer voluntary audit programs? Can companies proactively seek out 3rd party experts during the development process to ensure they're meeting those goals?
There are companies out there that make a living doing security and privacy reviews of companies. Mozilla doesn’t do that; we focus more on the consumer side of things with *Privacy Not Included. The problem is, our standards at Mozilla for privacy are very different from typical industry standards for privacy. We think consumers should have a right to privacy and have their data respected and protected. That’s just not the way of the world these days, unfortunately. There are also certification programs companies can offer their practices up to for review. We know that TÜV Rheinland offers a popular certification in Germany.
Sometimes you see a company do this and use this in their marketing. That is great. It’s still not a silver bullet, but it is a start. -Jen C & Misha R
Does it seem like this is really just the point of these apps? I don't feel like there's much sincerity in their efforts. These apps (both mental health and prayer) seem to scratch the "I'm doing better for myself" itch without the user really having to apply themselves too much or step into a zone that they're not entirely comfortable with.
It's a give and take. You make me feel like my efforts for self-improvement and faith aren't being wasted and I let you have my data.
I also do not see much sincerity in mental health apps’ efforts. Most of them seem to prioritize a business purpose over care about users’ mental health. At the same time, I would not put much of the burden on the users. Their willingness to seek help in apps is good. It is not their fault that those apps are mostly awful when it comes to privacy and security. And it is truly sad when apps take advantage of people in vulnerable positions. -Misha R
I’ve been using Cerebral since my father passed away and it’s been a tremendous help. My therapist and prescriber both have been professional and helpful. I do not like how my data is handled but I also know where I was before therapy and meds and how I am with them. I don’t have health insurance so Cerebral is all I can do.
I do agree that the business side of these apps is scummy and doesn’t care about my health. But I do want to stand up for the doctors, because they have been nothing but exemplary.
This is a great point and thank you for making it! These apps do provide benefits for people. Absolutely! We just want to see the business practices of these apps better protect people when they are at their most vulnerable.
And you’re right, therapists are a whole other side to this equation that I have so many questions about. How do they feel about the privacy and security of these apps? Do they feel safe? Do they feel these apps provide them a good environment to work in? Thank you for raising this point. It is a very good one. -Jen C
I'm professionally interested in psychiatry and don't "care" about my private data when it's used for ads and all. I have read studies that show a good outcome in cases of moderate depression with these apps (or apps of this kind). Are your results really so important that they should keep patients from using them? Keep in mind that these apps can help people with no access (for many different reasons) to a therapist. You can even prescribe an app to a patient. How should I take your results into account? (Thank you for your work on these apps; I'm genuinely asking for knowledge and not to underestimate anything.)
We don’t ever want people who need help for their mental health to not seek out that help. What we do want is for these companies to treat the personal, private, sensitive information these apps can collect with the respect it deserves. While you might not fear for your privacy, others do. Your feelings about not caring so much about privacy are valid, as are the feelings of people who do. We’re just out here trying to make the world a little better place for everyone. Asking mental health app companies to focus more on protecting the privacy and security of the personal information they collect from their often vulnerable users is one thing we can do to hopefully make the world a little better place for all. -Jen C
Now that you've done all this research do you recommend folks use mental health apps or stay away from them? If I'm gonna use one what best practices do you recommend?
Oh, good question. I don’t want to take away from the fact that these apps do offer people benefits. There’s a mental health crisis going on right now and people need help. Help that is often hard to find, access, and afford. These apps can fill in the blanks there. What I absolutely HATE is that companies see this as a money-making opportunity not to be missed, and so are jumping into the game as fast as they can to offer up a service without thinking through how to protect the privacy and security of people who are at their most vulnerable. Companies are moving so fast to grow while the crisis is hot, so to speak, and not thinking along the way about what impacts this might have on privacy and security, and the fact that once this data exists, once it is collected, it could live on forever.
So, yes, I would recommend a mental health app if someone is struggling with their mental health, needs help, and can’t find it any other way. I would just caution them to find a good one for privacy (which can be hard, but there are a few), and then be very careful what they share. If you meet online with a therapist, ask them to take notes offline and not upload them to the app system. Only share the personal information you are required to. Opt out of as much data sharing as the app will allow. Request your data be deleted periodically. And never, ever sign into one of these apps using social media like Facebook. If you get access to an app through your employer, ask them what data on you they can receive and how they use it. Ask them to create a policy on that. -Jen C
How much do you feel could be done to reduce the need for apps like these (and therefore the number of apps that are unsafe or predatory)?
We wish there were less need for mental health apps - fewer wars, pandemics, less poverty, discrimination, violence, etc. But since there are so many threats to our mental states, any resources or tools that offer help must stand up to the mission. Unfortunately, most mental health apps do not. We believe the app providers are able to fix the apps. It is the logic of “business over users’ needs” that harms the apps the most.
What's the use of such data - do you know who are the buyers? Is it just thrown away to the market and dealt with by data management platforms or is it intended to be used by some particular players like mental health clinics?
We wish we knew. Not all companies sell data. Some just “share” it with advertisers or with people who want to advertise on their platforms. That’s not technically selling the data, but it is giving a lot of third parties access to it to sell targeted ads and such. For companies that do sell the data, who are the buyers? We just don’t know exactly. Data brokers, generally. But that is a large, not very transparent group of companies that we don’t know much about. We did see that at least one of the prayer apps said they could share/sell your data with other faith-based organizations. Does that mean they’re giving or sharing your information with other faith-based orgs so those orgs can target you to raise more money? Sure sounds that way. And that feels icky. -Jen C
How many of these privacy concerns would be prevented if the US enforced privacy laws like GDPR (Europe)?
Have you found in your research whether some of these apps operate in Europe? If so, have you noticed any change in behavior?
GDPR does give people living in the EU more privacy protections than people in the US have. But it’s also not a silver bullet. It does allow sharing of personal data if the data is pseudonymized or combined. It does allow sharing data with advertisers with consent. But what we love about GDPR are the clear user rights, such as the right to erase or access your data at any point.
Most of the apps operate in Europe, and some are even produced in Europe (like MindDoc from Germany). Unfortunately, we did not see that European residency improved the data practices of the apps much. They still used very general, unclear clauses in their privacy policies, shared data with Facebook for ads, etc. -Jen C
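(For readers wondering what “pseudonymized” means in practice, here’s one minimal sketch: replacing a direct identifier with a keyed hash, so shared records can’t be tied back to a person without the secret key. This only illustrates the concept; GDPR compliance involves far more, and the names here are hypothetical.)

```python
# Pseudonymization sketch: swap a direct identifier for an HMAC-SHA256
# digest. Only whoever holds the secret key can re-link the records.
import hashlib
import hmac
import os

SECRET_KEY = os.environ["PSEUDONYM_KEY"].encode()  # held by the data controller

def pseudonymize(user_id: str) -> str:
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# Share {"user": pseudonymize("alice@example.com"), "mood": "anxious"}
# instead of the raw email address.
```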
As someone who may work for a mental health app in the future, what could I do or prioritize to ensure the app I work for is secured from a privacy perspective?
Ask questions! Read their privacy documentation. And if something makes you feel uncomfortable, don't be satisfied with it. My friend told me the story of how her therapist on one of these apps only took handwritten notes and declined to upload them into the app system and promised to destroy the notes after she was done with them. That’s a great practice therapists can do if they don't trust the privacy and security of the mental health app they work for. And if you’re still concerned, organize! Get together with your colleagues and push as a group for better privacy and security practices. -Jen C
Was the private info put in these apps able to be viewed by anyone? If I talked about a bothersome event and someone gained access to that material, would anyone who knew how to access be able to view what was said?
No, the personal, private information users share with these apps isn’t simply viewable by anyone. Many of these apps do take strong security measures, like using encryption and storing data on encrypted servers, which is great. But there’s still the chance some of your chats could be leaked, hacked, or viewed by an employee at the company who shouldn’t have access. And that’s just your chat data. That doesn’t include other personal information you might share, like your name, email address, location, how often you use the mental health app, your answers to intake questionnaires, and the like. -Jen C
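(As a sketch of the kind of safeguard we mean by encryption: here’s what encrypting a chat message before storing it could look like, using the Fernet recipe from the Python cryptography library. This is an illustration of the general idea, not any particular app’s implementation.)

```python
# Encrypt chat content before storage, so a leaked database or a snooping
# employee sees only ciphertext. Key management is the hard part in practice.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in reality, kept in a key-management service
cipher = Fernet(key)

token = cipher.encrypt(b"I've been feeling anxious this week.")
print(token)                  # what would actually sit in the database
print(cipher.decrypt(token))  # readable only with the key
```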
Is it really shocking??
Like, couldn’t we see it coming??
I was shocked. And I do privacy research for a living. I’m jaded about companies protecting privacy, just like it sounds you are. What shocked me the most was the huge amount of very, very personal information these apps can collect, and how too often that data, or the data surrounding it, was treated like a business asset. And how they target people at their most vulnerable. It’s yucky, to put it politely.
Then there are stories like the time BetterHelp teamed up with Travis Scott, after people were trampled to death at his concert, to do a promotion offering people affected by that event one free month of BetterHelp. (https://www.buzzfeed.com/natashajokic1/travis-scott-betterhelp-controversy)
That just felt so incredibly crass to me. So, yeah, as jaded as I am, I was still shocked. I hope we never get so desensitized to this that we don’t find this shocking. -Jen C
So what can we do to avoid apps that invade privacy like these, or it’s just unavoidable at this point?
Read our *Privacy Not Included guide and pick an app that does better than the others! If your company offers an app through a wellness program, ask your company to have a policy for what data they can collect and how they can use it. And if the app is one without strong privacy protections, ask your company to reach out to the app maker and push them to improve their privacy practices. -Jen C
What was the standard you tested the apps against?
You can read our methodology for all our research of *Privacy Not Included here on our About Page. https://foundation.mozilla.org/en/privacynotincluded/about/methodology/ -Jen C
Recently Prince Harry (the Chief Impact Officer of BetterUp) brought BetterUp to the Invictus Games. On the surface, bringing mental health services to wounded veterans seems commendable, but I have become very jaded with any of these celebrity endorsed mental health programs.
What is your opinion of BetterUp?
I expect that sort of thing from religious organizations and probably companies that are not related to the medical community directly. However, did you do any research into apps that are used by health care providers?
In fact, many of the apps reviewed are used by healthcare providers. Some are being recommended or reimbursed by health insurance providers in Europe and America. Some are offered by employers as part of their wellness plan. While mental health apps may not belong to the medical community directly, they offer a platform for medical therapists, and as such are used to prescribe medication and provide professional help. Given all that, the privacy and security standards of these apps are alarming. -Misha R
Is it fair to say that the business model of some of the worst offenders on your list is that the "patient" is the product?
That is, do they make more money by selling patient data to 3rd parties than they do from fees paid by patients?
What is the best app you found, from an efficacy and data privacy standpoint?
Wysa was great. PTSD Coach was also great. Those are the two apps that earned our Best Of distinction on *Privacy Not Included. (https://foundation.mozilla.org/en/privacynotincluded/categories/best-of/)
Why is the Bible listed as a mental health app?
Many people use prayer apps as mental health apps. The prayer apps offer Bible-based meditations and sleep stories as well as communities of other like-minded people for support. Some of these prayer apps also have terrible privacy practices that seem like they are hiding their data harvesting practices behind a promise to bring people closer to God. That doesn’t feel right to us. -Jen C
What are you going to do about it? It’s the way the system is designed to be. Privacy is dead. You would think there would be patient-doctor confidentiality but not these days. The Wild West days of the internet are long gone my friend.
Awww…please don’t give up. I know it’s hard to stay positive about us having privacy in our world anymore. Trust me, I feel this daily as I’m reading these terrible privacy policies and interacting with these disingenuous companies. But we can’t give up. There are things we can do. Choose not to use products from the worst offenders. Talk to your friends and family and help them understand why it’s important to care, and then do something about it by voting with their dollars. Talk to your employer if they offer a mental health app through a wellness program and ask them to work with the company to better their privacy policies (that’s what we’re doing here at Mozilla with the company we use).
I understand, it’s easy to get cynical and defeatist about privacy on the internet. Trust me, I really do. But we can’t give up. I got you all! Misha and I and all of us at Mozilla care and are going to keep working to fight for all our privacy. You can keep fighting too. -Jen C
What would you say is the worst thing that you found from an app?
Sharing data and selling it to advertisers? As in, if you are using TalkSpace you will get ads for other mental health apps or products?
Also: my company gave TalkSpace memberships to all their employees, so I thought I would give it a shot. It was AWFUL. The guy comes on, and asks if we have been having sessions before, and if so, what had we been talking about. (It was my very first one). I told him what I would like to talk about, and he just told me about himself and how he overcame those issues. Never went again.