PROOF: https://i.redd.it/9oybmy7d9sva1.jpg

I’m an investigative reporter at Bloomberg News, and I extensively examined how TikTok can serve up a stream of anxiety and despair to teens. “Death is a gift.” “The perfect ending.” “I wanna die.” I spent hours watching videos like this on the TikTok account of a New York teenager who killed himself last year. The hugely popular app says it’s making improvements, but it now faces a flood of lawsuits after multiple deaths.

While practically all tech companies are secretive about their data, insiders who had also worked at Google, Meta and Twitter cast TikTok as Fort Knox by comparison. You can read my story here and listen to me talk about it on The Big Take podcast here. You can read my other investigations into TikTok and others here.

EDIT: Thanks for joining me today. Social media has become ubiquitous in our lives, yet we do not know what its long-term impact on kids will be. These are important conversations to have, and we should all be thinking about how to better protect children in our new digital world. I will continue to report on this topic -- feel free to send thoughts or tips to: [email protected]

Comments: 210 • Responses: 5

SpaceElevatorMusic • 370 karma

Hello, and thanks for this AMA.

Is there a workable solution for how tech companies can avoid 'promoting' suicidality without suppressing, for lack of a better word, 'healthy' discussion of suicidality?

bloomberg • 247 karma

Great question. This is one of the hardest parts of moderating content on social media. These companies have strict guidelines around issues like suicide and eating disorders, and they strive to take down content that promotes or glorifies those topics. But they don't want to over-censor -- to take down posts that raise awareness of these issues or help those who are struggling. Distinguishing between content that "promotes or glorifies" a topic like suicide and content that "raises awareness" of it is subjective. These companies are constantly reworking their policies, based on advice from experts, to try to strike that balance.

As I watched some of the posts coming through to Chase Nasca's feed, I became aware of how tough this is. Some of the videos said vague things like "I don't want to be here tomorrow" -- should that be censored? One could argue that a caption like this promotes suicide, but what if the user posted it as a joke? Or what if they were referring to school, rather than life itself? Human moderators have only a few seconds to watch a video and decide whether to take it down. That's why the policies around what should stay up and what should come down are so important.

davincybla • 60 karma

Hello, thanks for the AMA.

Clearly there needs to be some sort of regulation of social media algorithms in general (not just TikTok's), as it is an arms race between tech companies to generate engagement and revenue. However, we've seen Congress fail to come up with even decent rules on how the internet should be moderated.

What do you think some ways we as a society could help alleviate the negative impact of social media algorithms outside of government regulation since that's probably not happening anytime soon?

bloomberg • 100 karma

I think we should be calling on these companies to be more transparent about their algorithmic design. Researchers should be able to study these algorithms so we can fully understand their impact on kids. It's hard to regulate -- and research -- an industry we don't understand. Yet these companies are notoriously protective of their algorithms and intellectual property because it's such a fiercely competitive space.

We should be teaching children about online safety in schools and arming them with information about how these algorithms work and why they are so addictive. Right now, digital safety is taught in some schools but not others. That's not good enough.

mysticfuko • 23 karma

Do you think this is on purpose?

bloomberg • 124 karma

The algorithm doesn't really have a purpose. It is a computer program trained to optimize engagement. It's not benevolent or malevolent. It just sends users what it thinks they want to see, in order to keep them glued to the screen for as long as possible. The company has been working to address this issue for years, but it is a complicated area and it hasn't found a solution yet.
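To make "trained to optimize engagement" concrete, here is a minimal sketch of how an engagement-driven recommender can end up amplifying whatever a user lingers on. This is an illustrative toy in Python, not TikTok's actual system; the class names, fields, and scoring rule are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str

@dataclass
class WatchEvent:
    topic: str
    watch_fraction: float  # share of the video the user watched, 0.0 to 1.0

def topic_affinity(history: list[WatchEvent]) -> dict[str, float]:
    """Estimate per-topic engagement as average watch fraction.

    The model has no notion of whether a topic is healthy or harmful;
    it only sees that the user lingered on it.
    """
    totals: dict[str, float] = {}
    counts: dict[str, int] = {}
    for event in history:
        totals[event.topic] = totals.get(event.topic, 0.0) + event.watch_fraction
        counts[event.topic] = counts.get(event.topic, 0) + 1
    return {topic: totals[topic] / counts[topic] for topic in totals}

def rank_feed(candidates: list[Video], history: list[WatchEvent]) -> list[Video]:
    """Order candidate videos by predicted engagement, highest first."""
    affinity = topic_affinity(history)
    return sorted(candidates, key=lambda v: affinity.get(v.topic, 0.0), reverse=True)

# A user who lingers on sad content gets served more of it, purely as
# a side effect of engagement optimization -- no intent required.
history = [
    WatchEvent("comedy", 0.2),
    WatchEvent("sadness", 0.9),
    WatchEvent("sadness", 0.95),
]
candidates = [Video("a", "comedy"), Video("b", "sadness"), Video("c", "sports")]
print([v.video_id for v in rank_feed(candidates, history)])  # ['b', 'a', 'c']
```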

dehydrated_bones • 21 karma

do you have any information on how to limit this content on my personal FYP (For You page)? i'm just barely out of the hospital and it's intensely triggering

bloomberg • 51 karma

I'm sorry to hear that, and I hope you are okay.

Yes, you can adjust your For You feed to suit your needs. Users can filter out certain content based on hashtags. You can also tap on a video and mark it "not interested," which essentially tells the algorithm not to send you similar videos. TikTok also just launched a feature that lets users refresh their For You feed entirely, starting from a clean slate as if you were a new user, so the algorithm relearns your interests from scratch.
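Continuing the toy model from the earlier answer, here is one hypothetical way a "not interested" signal could be folded into ranking -- as a hard demotion of flagged topics. Again, this is an invented illustration of the general idea, not TikTok's implementation.

```python
def rank_feed_with_feedback(
    candidates: list[Video],
    history: list[WatchEvent],
    not_interested: set[str],
) -> list[Video]:
    """Rank by predicted engagement, but suppress topics the user flagged."""
    affinity = topic_affinity(history)

    def score(video: Video) -> float:
        if video.topic in not_interested:
            return -1.0  # hard-demote anything marked "not interested"
        return affinity.get(video.topic, 0.0)

    return sorted(candidates, key=score, reverse=True)

# Flagging "sadness" pushes it to the bottom, even though past watch
# time predicted high engagement for it.
print([v.video_id for v in rank_feed_with_feedback(candidates, history, {"sadness"})])
# ['a', 'c', 'b']
```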

Live_Carpenter_1262 • 14 karma

How do you dredge up new stories or find new leads and how do you tend to get started on these cases?

bloomberg • 23 karma

Finding stories is always a challenge. All reporters will tell you that! You have to think outside the box, read everything that's been written about a topic, and try to approach it from a fresh angle.