
New report claims YouTube unintentionally acts as an indoctrinator for far-right beliefs

Algorithms create a direct pipeline from mainstream conservative videos to white nationalist content.


YouTube has created an ideological rabbit hole through which viewers interested in more traditional conservative and libertarian ideas rapidly find themselves immersed in hardline, far-right views, a new report reveals.

What’s more, YouTube has a vested monetary interest in allowing these far-right influencers to remain on the platform.

The report, published by Data & Society, draws on data from 65 political influencers across 81 channels. Researcher Rebecca Lewis describes them as the “Alternative Influence Network”: a network that pushes reactionary right-wing ideas while using brand-influencer techniques to project “authenticity” and build an audience.

A good portion of the influencers are mainstream conservative and libertarian thinkers, like Ben Shapiro, Jordan Peterson, and conservative comedian Steven Crowder. But by tracking who appeared in the same YouTube videos over the course of a year and a half, the report builds a map showing how, with the help of YouTube’s recommendation algorithms, a viewer can move seamlessly from their content into more hardline far-right and white nationalist corners of the platform.


For example, Lewis notes that Dave Rubin (of Bari Weiss’ “Intellectual Dark Web” fame) has hosted popular far-right figures like Stefan Molyneux and Lauren Southern on his show. Rubin has told The New York Times he does not consider himself a journalist; he simply wants to talk to people, record it, and let the audience make up their own minds.

But this lack of rebuttal makes it extremely easy, as the D&S report notes, “for audience members to be incrementally exposed to, and come to trust, ever more extremist political positions.” During his debate with Molyneux, for instance, Rubin never seriously pushes back on Molyneux’s belief that there are genetic IQ differences between races. And when Rubin encourages the audience to do their own research, the resources he lists were provided by Molyneux.

According to the report, the risk of radicalization posed by these YouTube personalities is exacerbated not only by their popularity (Molyneux regularly pulls in hundreds of thousands of views) but also by the fact that they straddle the fence between being “independent journalists” and social media influencers — making them difficult to pin down while also “selling” their political ideology.

“These approaches are meant to provoke feelings, memories, emotions and social ties,” the report reads. “In this way, the ‘accuracy’ of their messaging can be difficult to assess through traditional journalistic tactics like fact-checking.”

YouTube, like other Big Tech platforms, has been under pressure to clean up its act and rid its website of festering hate speech and disinformation. In the wake of the Parkland shooting in February, for instance, YouTube was criticized for allowing a conspiracy theory that the students who survived were “crisis actors” to spread like wildfire across the site. The company eventually removed the videos and suspended the accounts posting them. Both Facebook and Twitter faced similar challenges in the wake of that shooting.


YouTube has taken some concrete action in recent months, most notably joining Spotify and Apple in banning conspiracy theorist Alex Jones in August. However, the D&S report suggests YouTube’s attempts to tame popular far-right personalities and misinformation are fundamentally flawed, in part because the company profits from the advertising revenue their videos generate.

“YouTube is built to incentivize the behavior of these political influencers,” the report reads. “YouTube monetizes influence for everyone, regardless of how harmful their belief systems are. The platform, and its parent company, have allowed racist, misogynist and harassing content to remain online — and in many cases to generate advertising revenue — as long as it does not explicitly include slurs.”

The report suggests a moderation system for YouTube that focuses less on technical violations, such as copyright infringement, and more on governing content through the active promotion of certain values. It also suggests scrutinizing the guests political influencers host and evaluating what they say.