Highest Rated Comments


Whitn3y · 22 karma

if the robot recommends Mein Kampf, then it has responsibility for that recommendation

How so? I would also recommend that literature to anyone interested in history, social science, political theory, etc., because it's still a historically significant document with a lot of pertinent information, whether I agree with its particular contents or not. In the same way, I would recommend the Bible, Quran, or Torah to someone looking to learn more about theology, despite not being interested in reading them myself and despite any violent history associated with them.

But YouTube's robot isn't in a library of preexisting content. It also pays creators, encourages them to upload videos, and helps them build audiences. If YouTube didn't exist, most of its content wouldn't either.

Libraries also add new work and maintain periodicals, including daily newspapers with op-eds and even completely unproven conspiratorial works and extremist literature. I could print a book right now and donate it to the library, just as I could upload a video to YouTube. Libraries are generally nonprofit, but the publications they carry are not always so. They even carry films for rental, and if we want to get super meta, you can go to the library, check out a computer, and then watch extremist content on the internet there. Is the library then responsible as well? Are the ISPs? The domain owners?

Saying that if YouTube didn't exist, its content wouldn't either is pure speculation that is impossible to back up. Even if it's true for specific creators or videos, it's not at all true for the general ideas, which have manifested all over the internet since its creation, nor would it be possible to accurately speculate about the services that would exist in its void.

Search engines do almost exactly the same thing, driving traffic to creators and giving both the engine and the website ad revenue and audiences. They also encourage people to make completely unregulated [barring national laws, such as those against child pornography] websites. This mentality could apply to the entire infrastructure of the internet, to old media, or to all of society since the printing press! I ask again: is the NYT responsible for propagating any extremist rhetoric that appears in its pages? Because the NYT also encourages users to submit opinions, helps authors build audiences, and pays contributors even outside the scope of pure journalism.

Television and radio are equally culpable for exactly the same reasons. This is not "whataboutism"; this is applying the logic equally across the spectrum.

One person we met in Brazil, who is now a local official in President Bolsonaro's party, said that the algorithm was essential to his political awakening. He said that he would never have known what to search for on his own, but the algorithm took care of that, serving up video after video that he called his "political education."

No one would have searched for Donald Trump either, but in 2016 the television, print, and radio "algorithms," fueled by profit, "suggested" him and his extremist ideologies to millions of Americans and billions of people globally, and now he is the POTUS. Who's responsible for that?

And a second, independent study out of Harvard found that the algorithm wasn't just waiting for users to express interest in far-right channels or conspiracy theories — it was actively pushing users toward them.

Did it account for the variables that cause a video to trend? As in, was it just recommending videos that were getting popular, regardless of content? Because if we did a meta-analysis of the popularity of YouTube videos and their extremist leanings, I think you'd be very disappointed in how few people watch any political video compared to YouTube's audience as a whole, seeing as the top 30 most-watched YouTube videos are music videos and the most-subscribed channels are either music or generally vanilla in nature, like PewDiePie or Jenna Marbles.

Whitn3y · 15 karma

The issue with that is placing blame on YouTube when the extremism is just a mirror of the society placing it there. It's like blaming a robot for recommending Mein Kampf in a library where you've already read every other piece of Nazi literature. The robot is just trying to do the math correctly; it's the humans writing the literature who are the issue.

Does YouTube have a moral obligation? That's a different question, with different answers, from "Is YouTube spreading extremism?" YouTube the company is not spreading extremism any more than any company that allows users to self-publish their own work with little oversight, or even publications like the NYT and similar outlets that publish op-eds that might be extremist in nature.

Don't forget that every media outlet on Earth has spread Donald Trump's messages about immigrants being rapists and thieves and Muslims being terrorists. If YouTube recommends Fox coverage of this, who is spreading what in that scenario?

Whitn3y · 3 karma

For anyone interested, here is a current snapshot of my recommended feed. The names circled in red are channels I am already subscribed to.

https://imgur.com/bLi1sO2

Even with the decent number of political videos I watch, there is very little political content in my feed, because the algorithm has done a good job of learning what I will most likely want to watch, even with occasional right-wing videos in my history for reasons outside of genuine interest.

EDIT: I've already watched the videos I wanted to watch today. To be clear, I don't necessarily want to watch these; well, maybe the quarters video, but not the American Pie cast reunion, haha. And I already watched the Freddy Got Fingered video because I actually like that film.

Here are some more refreshes, without the circles around my subscriptions:

https://imgur.com/a/NgM1UmX

Whitn3y · 2 karma

Obviously I don't work at Google, but the algorithm doesn't recommend such things to me unless I recently watched something related for other reasons, and when it does, I don't watch them unless I want to.

The algorithm doesn't force you to watch anything, nor does it remove the search or subscribe features. It also doesn't force you to maintain watch time on a video or interact with it in the likes, dislikes, or comment sections.

Whitn3y · 2 karma

They do and it is. That's why Google is being simultaneously accused of censoring right wing ideology and promoting right wing ideology depending on whether the person you ask is right wing or part of old media.

Like I said, if they want to claim YouTube helped get Bolsonaro elected, then I'm going to claim old media got Donald Trump elected (who himself helped get Bolsonaro elected, via his right-wing messages being propagated globally by old media), and the latter is far more of a concern and danger globally than the former.

The only answer here is to make YouTube nonprofit and funnel profits into content moderation, which is basically impossible for these purposes, since extremism is so subject to opinion. One television station can be calling Mexicans rapists (Fox), another can be flat-out stating without evidence that our president is indentured to foreign powers (CNN), yet another channel can be promoting conspiracy theories about Bigfoot, aliens, and the secrets of Area 51 (HISTORY), another can be pushing that vaccines are not trustworthy (HBO/Bill Maher), and a White House administration can be quoting Breitbart articles claiming a government conspiracy to overthrow the POTUS and pushing the extremist narrative that climate change is fake. And yet almost no one is writing an article on how American television is pushing extremism onto people at the behest of almighty profits. Just YouTube.