New research reveals how deep YouTube’s conspiracy rabbit hole goes

New research shows how conspiracy videos have flourished online, unchecked.

Mourners bring flowers as they pay tribute at a memorial for the victims of the shooting at Marjory Stoneman Douglas High School on Sunday, February 25, 2018. (David Santiago/Miami Herald/TNS via Getty Images)

In the wake of the mass shooting in Parkland, Florida, students at Marjory Stoneman Douglas High have channeled their grief and fury into demands for better gun control. Less than a fortnight has passed, but already this youth-led movement has triggered major concessions, including a proposal by Florida lawmakers to raise the minimum age for buying a gun and announcements by corporate sponsors that they are abandoning the NRA.

But as these teenagers speak out against the gun lobby, they’ve found themselves victims of paranoid far-right conspiracy theories. Since students like David Hogg and Emma Gonzalez began demanding action on gun control, videos and other content have spread online claiming that they are in fact “crisis actors”, paid by the left to smear gun ownership and make the school shooting seem worse than it actually was. The theories spread like wildfire online, and one video that claimed Hogg was an actor became the top trending video on YouTube, gathering 200,000 views before it was taken down. Meanwhile, fellow student Cameron Kasky was forced to leave Facebook after receiving “graphic death threats”.

In a statement at the time, Google apologized for allowing the video to appear on Trending in the first place. “Because the video contained footage from an authoritative news source, our system misclassified it,” a Google spokesperson told ThinkProgress. “As soon as we became aware of the video, we removed it from Trending and from YouTube for violating our policies.” Because of the sheer number of videos uploaded, humans aren’t involved in tracking which videos are trending on YouTube. Instead, the Trending tab is curated by an algorithm.

But while YouTube may have taken down that one particular video, new research has shown just how endemic conspiracy theory videos are on its platform. Over the weekend, professor and data journalist Jonathan Albright published research on the network of conspiracy theories to which those watching Parkland “crisis actor” videos might be exposed. By mapping the videos algorithmically recommended to people watching the Parkland “crisis actor” videos, Albright found a network of more than 9,000 related videos which, in total, had been viewed more than four billion times.

A number of the videos are innocuous, or simply clips from TV shows. However, as one looks through the list of mapped videos, the sheer number and breadth of interrelated conspiracy theories becomes apparent — all of them, as Albright says, “being hosted, monetized and promoted on YouTube.” Video titles include “Body Language: How to SPOT a Crisis actor” (112,000 views), “The Truth About The Orlando Terrorist Attack” (544,000 views) and “MUST WATCH! JOHNNY DEPP EXPOSED AS PEDOPHILE SATANIST! #PIZZAGATE” (316,000 views). A number of the videos also featured disturbing sexual content.

The significance of this conspiracy genre is twofold. First, it provides those susceptible to conspiracy theories with a lifetime of videos to fuel their paranoia. If you click on a video claiming David Hogg was a crisis actor, you could then be recommended a video on crisis actors at Sandy Hook and Las Vegas, then a video on false flag conspiracies, and so on until you are completely immersed in the genre. Second, and perhaps more worryingly, every mass shooting or terrorist attack makes this genre grow in size and increases its economic value to YouTube.

“It’s algorithmically and financially incentivizing the creation of this type of content at the expense of the truth,” Albright told BuzzFeed News. “Journalists and affected parties…are not only fighting the content on YouTube, they are fighting its algorithms — first at the ‘trending’ level and then again at the search and recommendation level.”

Since the 2016 election, attention has focused on Big Tech companies like Facebook, Twitter and Google as they attempt to tackle the problem of fake news, with varying degrees of success. But Albright’s research underscores a crucial point: fake news isn’t just the work of a few bad actors stirring up trouble whenever there’s a tragedy or major political story. Instead, it’s a thriving online ecosystem of its own, and one that only continues to grow.