Google-owned YouTube has become the latest social media platform to crack down on the pro-Trump conspiracy theory QAnon ahead of November’s US election, but stopped short of a full ban on the rapidly spreading movement.
In a blog post on Thursday, the video platform said that it would “prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence”, citing QAnon and related conspiracy theory Pizzagate.
The company also said that it had removed “tens of thousands” of videos and “hundreds of channels” related to QAnon, whose members believe US president Donald Trump is under threat from a Satanic “deep state” cabal of Democrats and Hollywood celebrities involved in child trafficking.
The move comes as Facebook and Twitter have also taken steps to eliminate the conspiracy theory from their platforms in recent months. In July, Twitter banned thousands of QAnon-related accounts and said it would stop recommending content linked to the movement, while Facebook announced plans to wipe it from its platform last week.
YouTube’s recommendation algorithms have long been criticised for helping draw users towards radical and extremist content, as well as conspiracy theories. In response to allegations in 2018 that it was pushing its audience “down the rabbit hole” of often baseless conspiracy content, it updated its systems to restrict the reach of harmful misinformation.
Nevertheless, QAnon — which was labelled a domestic terror threat by the FBI last year — has continued to proliferate across social media platforms in the lead-up to the November election and taken on increasingly violent undertones, while also spilling into the mainstream.
Left-leaning non-profit Media Matters has identified 27 congressional candidates who have endorsed or given credence to QAnon, or promoted related content. Last month, a director in Citigroup’s information technology department was dismissed after he was identified as the operator of one of the most important QAnon websites.
Instead of implementing a full ban, YouTube laid out several caveats to its changes: “content discussing [conspiracy theories] without targeting individuals or protected groups” will remain on the platform, it said, as well as news coverage of the issues.
The updates, introduced just weeks before the US vote, come as researchers have increasingly voiced frustration over what they see as a lack of transparency from YouTube over how much misinformation and co-ordinated manipulation is found on its platform, and how it is handled.
Others have pointed to lapses in the enforcement of existing policies. An external study by Media Matters, conducted before the announcement, found that 17 top QAnon YouTube channels, with more than 4.7m subscribers between them, “explicitly violated” the platform’s terms of service.
The move is likely to drive some QAnon believers towards a constellation of smaller alternative platforms with less stringent content moderation policies. Experts have also warned that members of the movement have already infiltrated less contentious communities, such as those dedicated to child protection, where they often attempt to win over new converts by presenting a less political version of the QAnon narrative.
YouTube said it would start enforcing the new policy immediately, adding that it would “look to ramp up in the weeks to come”.