On this week’s episode of The Times’s new TV show “The Weekly,” we investigate how YouTube spread extremism and conspiracies in Brazil, and explore the research showing how the platform’s recommendation features helped boost right-wing political candidates into the mainstream, including a marginal lawmaker who rose to become president of Brazil.

YouTube is the most watched video platform in human history. Its algorithm-driven recommendation system played a role in driving some Brazilians toward far-right candidates and electing their new president, Jair Bolsonaro. Since taking office in January, he and his followers have governed Brazil via YouTube, using the trolling and provocative tactics they honed during their campaigns to mobilize users in a kind of never-ending us-vs.-them campaign. You can find the episode link and our takeaways here and read our full investigation into how YouTube radicalized Brazil and disrupted daily life.

We reported in June that YouTube’s automated recommendation system had linked together a vast video catalog of prepubescent, partly clothed children, directing hundreds of thousands of views to what a team of researchers called one of the largest child sexual exploitation networks they’d seen.

We write The Interpreter, a column and newsletter that explore the ideas and context behind major world events. We’re based in London for The New York Times.

Twitter: @Max_Fisher / @amandataub

Proof: https://i.redd.it/kfen9ucij2g31.png

EDIT: Thank you for all of your questions! Our hour is up, so we're signing off. But we had a blast answering your questions. Thank you.


mydpy1394 karma

How do you investigate an algorithm without access to the source code that defines it? Do you treat it like a black box and measure inputs and outputs? How do you know your analysis is comprehensive?

thenewyorktimes1116 karma

That's exactly right. The good news is that the inputs and outputs all happen in public view, so it's pretty easy to gather enormous amounts of data on them. That allows you to make inferences about how the black box is operating but, just as important, it lets you see clearly what the black box is doing, even if you can't always tell how or why it's doing it. The way that the Harvard researchers ran this was really impressive and kind of cool to see. More details in our story and in their past published work that used similar methodology. I believe they have a lot more coming soon that will go even further into how they did it.
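For the curious, here is a minimal sketch of that kind of black-box audit, in Python. Everything in it is illustrative: the toy graph stands in for recommendation data a real study would scrape from YouTube's public "Up next" lists, and the random walk shows how funneling toward a few videos becomes visible from outputs alone, with no access to the recommender's internals.

```python
import random
from collections import Counter

# Toy stand-in for YouTube's recommendation outputs. In a real audit this
# would be data collected from the public "Up next" lists, not a dict we wrote.
TOY_GRAPH = {
    "seed_news": ["react_1", "react_2"],
    "react_1": ["rant_1"],
    "react_2": ["rant_1", "rant_2"],
    "rant_1": ["rant_2"],
    "rant_2": ["rant_2"],  # absorbing node: the walk keeps landing here
}

def fetch_related(video_id):
    """Hypothetical helper; a real study would fetch live recommendations."""
    return TOY_GRAPH.get(video_id, [])

def random_walk(seed_ids, steps=10_000):
    """Follow recommendations at random from seed videos and count where
    the walk ends up. Heavy funneling toward a few videos shows up in
    the counts, even though the recommender itself stays a black box."""
    counts = Counter()
    current = random.choice(seed_ids)
    for _ in range(steps):
        related = fetch_related(current)
        if not related:  # dead end: restart from a seed
            current = random.choice(seed_ids)
            continue
        current = random.choice(related)
        counts[current] += 1
    return counts.most_common()

print(random_walk(["seed_news"]))  # "rant_2" dominates the counts
```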

crateguy945 karma

How do you define extremism/a conspiracy theory?

thenewyorktimes1829 karma

That's a really important question, and we spent a lot of time on it. We did not want this to just be a story about how YouTube spread opinions or views that happened to jump out to us, so we wanted to set an extremely high bar for calling something extremism or a conspiracy. After all, one of the great virtues of social media is that it opens space for political discussion and for questioning the official story.

For this story, we only wanted to focus on conspiracy videos whose claims were manifestly false and demonstrably caused real-world harm. Unfortunately, so many videos met these criteria that we never had to worry about the many, many borderline cases. Everyone is familiar with anti-vaccine conspiracy videos, for example — absolutely rampant on YouTube in Brazil, and often served up by YouTube's algorithm to users who so much as searched for basic health terms. And there were many others like this. Videos claiming that diseases like Zika were manufactured by George Soros as an excuse to impose mandatory abortions on Brazil, and that parents should therefore ignore medical advice about the disease. Videos that told parents to ignore their doctors' advice on how to safely feed a developmentally disabled child, and to instead use "home remedy" methods that would put the child at potentially fatal risk. And so on.

Doctors, health experts, and former government officials told us that these videos were creating multiple public health crises. And I know it might be easy for internet-savvy folks on here to blame the people who were misled by the videos for going to YouTube for information, but remember that parts of Brazil are quite poor and that YouTube and Google are two of the biggest and most respected American tech companies in the world. YouTube doesn't come with a big disclaimer telling users that the content on the site could threaten your child's life. Some of the conspiracy videos are faked to look like news broadcasts or like doctors giving medical advice.

As for extremism, we did not want to be in the business of deciding which views count as mainstream and which count as extremist. (Though many of the folks we wrote about in Brazil are not at all shy about identifying themselves as well outside the mainstream.) So we approached this as a relative measure rather than an absolute one — is your content becoming consistently more extreme? In other words, if you start on YouTube by watching someone who says that taxes are a little too high and that gay people have too many protections, but then the algorithm consistently pushes you toward videos that call for a military takeover and accuse teachers of secretly indoctrinating children into homosexuality, then we would conclude that your YouTube experience has become more extreme. We documented this consistently enough that we felt comfortable saying that YouTube was pushing users toward extremism. And we asked a lot of Brazilian users themselves whether they considered this characterization fair, and they did.
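To illustrate that relative test (this is not the researchers' actual method, just a hedged sketch with invented scores): suppose human coders rate each video in a recommendation chain on an extremism scale, then you measure whether the ratings drift upward as the chain goes on.

```python
def average_drift(chain_scores):
    """Average step-to-step change in hand-coded extremism scores along one
    recommendation chain. A consistently positive drift across many chains
    is the relative signal described above; no single video needs to be
    labeled 'extreme' in absolute terms."""
    steps = [b - a for a, b in zip(chain_scores, chain_scores[1:])]
    return sum(steps) / len(steps)

# Hypothetical chain, scored 0-5 by coders: starts at a mild anti-tax
# complaint and ends at a video calling for a military takeover.
print(average_drift([1, 1, 2, 3, 3, 5]))  # 0.8: consistent escalation per hop
```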

guesting401 karma

What's in your recommended videos? Mine is all volleyball and 90's concert videos.

thenewyorktimes633 karma

Ha, that sounds awesome. To be honest, after the last few months, it's mostly a mix of far-right Brazilian YouTube personality Nando Moura and baby shark videos. We'll let you guess which of those is from work and which is from home.

Edit — Max double-checked and his recommendations also included this smokin'-hot performance of Birdland by the Buddy Rich Big Band. Credit to YouTube where due.

ChiefQuinby337 karma

Aren't all free products that make money off of your time designed to be addictive to maximize revenue and minimize expenses?

thenewyorktimes510 karma

You're definitely right that getting customers addicted to your product has been an effective business strategy since long before social media ever existed.

But we're now starting to realize the extent of the effects of that addiction. Social media is, you know, social. So it's not surprising that these platforms, once they became so massive, might have a broader effect on society — on relationships, and social norms, and political views.

When we report on social media, we think a lot about something that Nigel Shadbolt, a pioneering AI researcher, said in a talk once: that every important technology changes humanity, and we don't know what those changes will be until they happen. Fire didn't just keep early humans warm; it increased the area of the world where they could survive and changed their diets, enabling profound changes in our bodies and societies.

We don't know yet how social media and the hyper-connection of the modern world are changing us, but a lot of our reporting is us trying to figure that out.

abaconIcecream197 karma

What was the biggest challenge in your investigation?

thenewyorktimes396 karma

It was very important to us to speak with ordinary Brazilians — people who aren't politicians or online provocateurs — to learn how YouTube has affected them and their communities. But we're both based in London, and that kind of reporting is hard to do from a distance. We could get the big picture from data and research, and track which YouTubers were running for office and winning. But finding people who could give us the ground-level version, winning their trust (especially tricky for this story because we had a camera crew in tow), and asking the right questions to get the information we needed, was hard. Luckily we had wonderful colleagues helping us, particularly our two fixer/translators, Mariana and Kate. We literally could not have done it without them.

bacon-was-taken171 karma

Should YouTube be more strictly regulated? (And if so, by whom?)

thenewyorktimes340 karma

That is definitely a question that governments, activists — and, sometimes in private, even members of the big tech companies — are increasingly grappling with.

No one has figured out a good answer yet, for a few reasons. A big one is that any regulation will almost certainly involve governments, and any government is going to realize that social media absolutely has the power to tilt elections for or against them. So there's enormous temptation to abuse regulation in ways that promote messages helpful to your party or punish messages helpful to the other party. Or, in authoritarian states, temptation to regulate speech in ways that are not in the public interest. And even if governments behave well, there's always going to be suspicion of any rules that get handed down and questions about their legitimacy.

Another big issue is that discussion about regulation has focused on moderation. Should platforms be required to remove certain kinds of content? How should they determine what crosses that line? How quickly should they do it? Should government regulators or private companies ultimately decide what comes down? It's just a super hard problem and no one in government or tech really likes any of the answers.

But I think there's a growing sense that moderation might not be the right thing to think about in terms of regulation. Because the greatest harms linked to social media often don't come from moderation failures; they come from what these algorithms choose to promote, and how they promote it. Changing what gets promoted is a lot easier for tech companies to implement, because they control the algorithms and can just retool them, and it's a lot easier for governments to make policy around. But those sorts of changes would also cut right to the heart of Big Tech's business model — in other words, they could hurt their businesses significantly. So expect some pushback.

When we were reporting our story on YouTube's algorithm building an enormous audience for videos of semi-nude children, the company at one point said it was so horrified by the problem — it'd happened before — that it would turn off the recommendation algorithm for videos of little kids. Great news, right? One flip of the switch and the problem is solved, the kids are safe! But YouTube went back on this right before we published. Creators rely on recommendations to drive traffic, the company said, so recommendations would stay on. In response to our story, Senator Josh Hawley introduced a bill that would force YouTube to turn off recommendations for videos of kids, but I don't think it's gone anywhere.

Yuval8356130 karma

Why did you choose to investigate YouTube out of all things?

thenewyorktimes501 karma

Hey, good question. It's just a website, right? Until maybe two years ago, we hadn't really thought that social media could be all that important. We'd mostly covered "serious" stories like wars, global politics, mass immigration. But around the end of 2017 we started seeing more and more indications that social media was playing more of an active, and at times destructive, role in shaping the world than we'd realized. And, crucially, that these platforms weren't just passive hosts to preexisting sentiment — they were shaping reality for millions of people in ways that had consequences. The algorithms that determine what people see and read and learn about the world on these platforms were getting big upgrades that made them incredibly sophisticated at figuring out what sorts of content will keep each individual user engaged for as long as possible. Increasingly, that turned out to mean pushing content that was not always healthy for users or their communities.

Our first big story on this came several months later, after a lot of reporting to figure out what sorts of real-world impact was demonstrably driven by the platforms. It focused on a brief-but-violent national breakdown in Sri Lanka that turned out to have been generated largely by Facebook's algorithms promoting hate speech and racist conspiracy theories. That's not just us talking — lots of Sri Lankans, and eventually Facebook itself, acknowledged the platform's role. Our editor came up with a headline that sort of summed it up: Where Countries Are Tinderboxes and Facebook Is a Match. We followed that up with more stories on Facebook and other social networks seemingly driving real-world violence, for example a spate of anti-refugee attacks in Germany.

But none of that answers your question — why YouTube? As we reported more on the real-world impact of social networks like Facebook, experts and researchers kept telling us that we should really be looking at YouTube. The platform's impact is often subtler than sparking a riot or vigilante violence, they said, but it's far more pervasive because YouTube's algorithm is so much more powerful. Sure enough, as soon as we started to look, we saw it. Studies — really rigorous, skeptical research — found that the platform systematically redirects viewers toward ever-more extreme content. The example I always cite is bicycling videos. Watch a couple of YouTube videos of bike races, and soon enough it'll recommend a viral video of a 20-bike pile-up. Then more videos of bike crashes, then videos about doping scandals. Probably before long you'll get served a viral video claiming to expose the "real" culprit behind the Olympic doping scandals. Each subsequent recommendation is typically more provocative, more engaging, more curiosity-indulging. And on cycling videos that's fine. But on topics like politics or matters of public health, "more extreme and more provocative" can be dangerous.

Before we published this story, we found evidence that YouTube's algorithm was boosting far-right conspiracies and propaganda-ish rants in Germany — and would often serve them up to German users who so much as searched for generic news terms. We also found evidence that YouTube's automated algorithm had curated what may have been one of the largest rings of child sexual exploitation videos ever. Again, this was not a case where some bad actors just happened to choose YouTube to post videos of sexualized little kids. The algorithm had learned that it could boost viewership by essentially reaching into the YouTube channels of unwitting families, finding any otherwise-innocent home movie that included a little kid in underwear or bathing suits, and then recommending those videos to YouTube users who watched softcore porn.

In essence, rather than merely serving an audience for child sexual exploitation, YouTube's algorithm created this audience. And some of YouTube's own employees told Bloomberg that the algorithm seemed to have done the same thing for politics by creating and cultivating a massive audience for alt-right videos — an audience that had not existed until the algorithm learned that these videos kept users hooked and drove up ad revenue.

So that's why YouTube, and it's why we spent months trying to understand the platform's impact in its second-largest market after the US: Brazil.

bithewaycurious53 karma

> Hey, good question. It's just a website, right? Until maybe two years ago, we hadn't really thought that social media could be all that important. We'd mostly covered "serious" stories like wars, global politics, mass immigration. But around the end of 2017 we started seeing more and more indications that social media was playing more of an active, and at times destructive, role in shaping the world than we'd realized.

Why did it take until the end of 2017 to start taking social media seriously? So many academics and activists have been begging traditional news outlets to take social media seriously as a force for social change (in this case, radicalization).

thenewyorktimes135 karma

In our defense, there was some other stuff happening too! We spent most of 2016 and 2017 reporting on the rise of populism and nationalism around the world. And, to be clear, you're just talking to two NY Times reporters whose beat is normally well outside the tech world. The Times has had a huge and amazing team covering social media since before either of us even got Twitter accounts — and that team has been kind enough to let us pitch in a little.

bacon-was-taken68 karma

Will YouTube have this effect everywhere, or are these incidents random? Are there countries where "the opposite" has happened, with the algorithm linking to healthy things or promoting left-wing videos?

thenewyorktimes116 karma

YouTube and its algorithm seem to behave roughly the same everywhere (with the exception of certain protections the company has rolled out in the US but hasn't yet deployed elsewhere, such as the limits it recently placed on conspiracy-theory content). But there are differences in how vulnerable different countries are to radicalization.

Brazil was in a really vulnerable moment when right-wing YouTube took off there, because a massive corruption scandal had implicated dozens of politicians from both major parties. One former president is in jail; his successor was impeached. So there was tremendous desire for change. And YouTube is incredibly popular there — the audiences it reaches are huge.

So it's not surprising that this happened in Brazil so quickly. But that doesn't mean it won't happen elsewhere, or isn't already. Just that it might not be so soon, or so noticeable.

GTA_Stuff55 karma

What do you guys personally think about the Epstein death? (Or any other conspiracy theory you care to weigh in on?)

thenewyorktimes50 karma

The thing that amazes us most about conspiracy theories is the way that at their core they are so optimistic about the power and functioning of our institutions. In conspiracy theories, nothing ever happens because of accident or incompetence, it's always a complicated scheme orchestrated by the powerful. Someone is always in charge, and super-competent, even if they're evil.

We can kind of see the appeal. Because the truth is that people get hurt in unfair, chaotic ways all the time. And it is far more likely to happen because powerful institutions never thought about them at all. In a lot of ways that's much scarier than an all-powerful conspiracy.

Million202643 karma

Why is it seemingly the case that the right-wing has been far more successful at exploiting the existing technology and algorithms than the left or more moderate political viewpoints?

thenewyorktimes58 karma

We can’t point to just one answer. Some of it is probably down to the recommendation algorithm. It seems to reward videos that catch your attention and provoke an emotional response, because the system has learned they keep people watching and clicking. So videos that take a populist stance, promising unvarnished truth you can’t get from the mainstream media; personalities who convey a willingness to push boundaries; and content that whips up anger and fear against some outside group all do well. There’s no specific reason those have to be associated with the right — leftist populism certainly exists — but in recent years it’s the right that has been most adept at using them on YouTube.

And a lot of it is probably that right-wing groups these days are much more ideological and well-organized than their counterparts in mainstream politics or on the left, and are specifically optimizing their messages for YouTube. In Brazil we met with YouTubers who have big organizations behind them, like MBL, which give them training and resources and support. This isn’t happening in a vacuum. They’re learning from each other, and working together to get their message out and recruit people to their ideology.

DrJawn32 karma

How often do people make Rushmore jokes to you?

thenewyorktimes30 karma

lol. not as much as they used to. but.... a lot.

whaldener20 karma

Hi. If this kind of strategy (abusing internet resources to influence people) is being used exhaustively by all the different political parties and groups, don't you think that these opposing groups (left- and right-wing parties) may somehow neutralize the efficiency of such a strategy, and eventually just give voice to (and amplify) those who already share their own views and values? From my perspective, it seems that no politician, political party, or ideological or religious group is likely to keep its conduct within the desirable boundaries regarding this topic...

thenewyorktimes57 karma

It seems like what you're envisioning is a kind of dark version of the marketplace of ideas, in which everyone has the same chance to put forward a viewpoint and convince their audience — or, in your version, alienate them.

But something that has become very, very clear to us after reporting on social media for the last couple of years is that in this marketplace, some ideas and speakers get amplified, and others don't. And there aren't people making decisions about what should get amplification; it's all being done by algorithm.

And the result is that people aren't hearing two conflicting views and having them cancel each other out, they're hearing increasingly extreme versions of one view, because that's what the algorithm thinks will keep them clicking. And our reporting suggests that, rather than neutralizing extreme views, that process ends up making them seem more valid.

bacon-was-taken18 karma

Are these events something that would happen without youtube?

(E.g., if not YouTube, then on Facebook; if not there, then the next platform, because these predicaments would occur one way or another, like a flowing river finding its own path?)

thenewyorktimes45 karma

We attacked that question lots of different ways in our reporting. We mention a few elsewhere in this thread. And a lot of it meant looking at incidents that seemed linked specifically to the way that YouTube promotes videos and strings them together. See, for example, the stories in our article (I know it's crazy long and I'm sorry but we found SO much) about viral YouTube calls for students to film their teachers, footage that then goes viral nationally. And we have more coming soon on the ways that YouTube content can travel outside of social platforms and through low-fi, low-income communities in ways that other forms of information just can't.

And you don't have to take it from us. We asked tons of folks in the Brazilian far-right what role, if any, YouTube played in their rise to power. We expected them to downplay YouTube and take the credit for themselves. But, to the contrary, they seemed super grateful to YouTube. One after another, they told us that the far-right never would have risen without YouTube specifically.

Whitn3y15 karma

The issue with that is placing blame on YouTube when the extremism is just a mirror of the society placing it there. It's like blaming a robot for recommending Mein Kampf in a library where you've already read all the other Nazi literature. The robot is just trying to correctly do the math; it's the humans writing the literature who are the issue.

Does YouTube have a moral obligation? That's a different question, with different answers, than "Is YouTube spreading extremism?" YouTube the company is not spreading extremism any more than any company that allows users to self-publish their own work with little oversight, or even publications like the NYT or similar ones that publish Op-Eds that might be extremist in nature.

Don't forget that every media outlet on Earth has spread Donald Trump's messages about immigrants being rapists and thieves and Muslims being terrorists. If Youtube recommends Fox coverage of this, who is spreading what in that scenario?

thenewyorktimes16 karma

I think that you've answered your own question, actually: if the robot recommends Mein Kampf, then it has responsibility for that recommendation, even though it didn't write the book. But YouTube's robot isn't in a library of preexisting content. It also pays creators, encourages them to upload videos, and helps them build audiences. If YouTube didn't exist, most of its content wouldn't either.

And YouTube's version of that robot isn't just suggesting these videos to people who already agree with them. One person we met in Brazil, who is now a local official in President Bolsonaro's party, said that the algorithm was essential to his political awakening. He said that he would never have known what to search for on his own, but the algorithm took care of that, serving up video after video that he called his "political education."

But we did want to make sure that YouTube wasn't just reflecting preexisting interest in these videos or topics. As you say, then it wouldn't really be fair to say that YouTube had spread extremism; it would've been spread by the users. So we consulted with a well-respected team of researchers at the Federal University of Minas Gerais, which found that YouTube began promoting pro-Bolsonaro channels and videos before he had actually become popular in the real world. And a second, independent study out of Harvard found that the algorithm wasn't just waiting for users to express interest in far-right channels or conspiracy theories — it was actively pushing users toward them.

So to follow on your metaphor a little, it'd be like finding a library that aggressively promoted Mein Kampf, and recommended it to lots of people who had shown no interest in the topic — and if that library had a budget of billions of dollars that it used to develop incredibly sophisticated algorithms to get people to read Mein Kampf because that was one of the best ways to get customers coming back. And then if you noticed that, just after the library started doing this, lots of people in the neighborhood started wearing tiny narrow mustaches. You might want to look into that!

Throwawaymister28 karma

2 questions. What was it like being kicked out of Rushmore Academy? And how are people gaming the algorithm to spread their messages?

thenewyorktimes25 karma

  1. Sic transit gloria
  2. A few ways, which they were often happy to tell us about. Using a really provocative frame or a shocking claim to hook people in. Using conflict and us-versus-them to rile viewers up. One right-wing YouTube activist said they look for "cognitive triggers." Lots of tricks like that, and it's effective. Even if most of us consider ourselves too smart to fall for those tricks, we've all fallen down a YouTube rabbit hole at least a few times. But, all that said, I think in lots of cases people weren't consciously gaming the algorithm. Their message and worldview and style just happened to appeal to the algorithm for the simple reason that it proved effective at keeping people glued to their screens, and therefore kept YouTube's advertising income flowing.

CrazyKripple17 karma

What were some unexpected findings you found during the investigation?

Thanks for this AmA!

thenewyorktimes12 karma

By far the most unexpected (and most horrifying) thing was one of the Harvard researchers' findings: that the algorithm was aggregating videos of very young, partially dressed children and serving them to audiences who had been watching soft-core sexual content. So if, say, someone was watching a video of a woman doing what was euphemistically called "sex ed," describing in detail how she performs sex acts, the algorithm might then suggest a video of a woman provocatively trying on children's clothing. And then, after that, it would recommend video after video of very young girls swimming, doing gymnastics, or getting ready for bed.

It showed the power of the algorithm in a really disturbing way, because on their own, any one of the kids' videos would have seemed totally innocent. But seeing them knitted together by the algorithm, and knowing that it had been recommending them to people who were seeking out sexual content, made it clear that the system was serving up videos of elementary schoolers for sexual gratification.

leeter923 karma

Hey! Adam Conover did a bit about this. What were your boundaries as far as what's considered extremist?

thenewyorktimes7 karma

We already answered the extremism question. But <3 Adam Conover, he's great.

Hkatsupreme-3 karma

What are your favorite ice cream flavors?

thenewyorktimes3 karma

The correct answer, Amanda's, is Häagen-Dazs chocolate peanut butter.

Max likes chocolate chip, which is wrong.