
YouTube’s algorithm seems to be funneling people to alt-right videos

January 29, 2020
Photo by Szabo Victor via Unsplash

A new study suggests that what we’ve suspected for years is true: YouTube is a pipeline for extremism and hate.

How do we know that? More than 330,000 videos on nearly 350 YouTube channels were analyzed and manually classified according to a system designed by the Anti-Defamation League. They were labeled as either media (what we think of as factual news), “alt-lite,” “intellectual dark web,” or alt-right.

The groups: The alt-right is what’s traditionally associated with white supremacy, pushing for a white ethnostate. Those who affiliate with the “intellectual dark web” justify white supremacy on the basis of eugenics and “race science.” Members of the alt-lite purport not to support white supremacy, though they believe in conspiracy theories about “replacement” by minority groups.

Gateway: The study’s authors hypothesized that the alt-lite and intellectual dark web often serve as a gateway to more extreme, far-right ideologies. They tested that by tracing the authors of 72 million comments on about two million videos between May and July of 2019. The results were worrying: more than 26% of people who commented on alt-lite videos later drifted over to alt-right videos and commented there as well.
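To get a feel for the kind of analysis involved, here is a minimal sketch of how one might measure that sort of commenter migration. The records, field names, and toy numbers below are hypothetical illustrations, not the study’s actual code or data.

```python
from collections import defaultdict

# Hypothetical comment records: (user_id, channel_category, timestamp).
# The real study traced ~72 million comments; this is a toy example.
comments = [
    ("u1", "alt-lite", 1), ("u1", "alt-right", 5),
    ("u2", "alt-lite", 2),
    ("u3", "media", 1), ("u3", "alt-lite", 3), ("u3", "alt-right", 9),
]

# For each user, record the earliest time they commented on each category.
first_seen = defaultdict(dict)
for user, category, ts in comments:
    prev = first_seen[user].get(category)
    if prev is None or ts < prev:
        first_seen[user][category] = ts

# Users who commented on alt-lite videos, and the subset who later
# (after their first alt-lite comment) also commented on alt-right videos.
alt_lite_users = [u for u, cats in first_seen.items() if "alt-lite" in cats]
migrated = [
    u for u in alt_lite_users
    if "alt-right" in first_seen[u]
    and first_seen[u]["alt-right"] > first_seen[u]["alt-lite"]
]

share = len(migrated) / len(alt_lite_users)
print(f"{share:.0%} of alt-lite commenters later commented on alt-right videos")
```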

Blame game: It’s fairly easy to get to alt-lite and intellectual dark web content with a simple search, but alt-right videos tend to be harder to find for first-time users. Yet the researchers found that YouTube’s algorithm often directed users who searched for specific keywords toward increasingly violent, extreme content.

And it’s getting worse: The team, from the Swiss Federal Institute of Technology Lausanne, also found evidence that the overlap between alt-righters and others who dabble in intellectual dark web and alt-lite material is growing. The authors estimate that about 60,000 people who commented on alt-lite or intellectual dark web content got exposed to alt-right videos over a period of about 18 months. The work was presented at the 2020 Conference on Fairness, Accountability, and Transparency in Barcelona this week.

We still don’t know a lot about YouTube radicalization: For one thing, we aren’t sure exactly what makes people move from alt-lite material to the far-right stuff. That’s partially because YouTube restricts access to recommendation data. It’s also possible some people are coming to YouTube having already been radicalized by some external, non-YouTube source. But this research suggests that YouTube’s recommendation algorithms may play a significant role.

The background: YouTube has long struggled with the balance between maintaining free speech and addressing hate speech. The company has taken some initial steps by banning some channels, most notably Alex Jones’s Infowars. But critics argue that YouTube hasn’t done enough.

In a statement, YouTube said it’s working through these issues: “Over the past few years ... We changed our search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations and begun reducing recommendations of borderline content and videos that could misinform users in harmful ways.”

A spokesperson added that YouTube disputes the methodology and that it doesn’t take into account more recent updates to its hate speech policy or recommendations. “We strongly disagree with the methodology, data and, most importantly, the conclusions made in this new research,” the spokesperson said.

Editor’s note: This story has been edited to include YouTube’s comment and dispute.
