YouTube is the latest social network to take action to limit the spread of baseless conspiracy theories online.
The Google-owned video platform followed the lead of Twitter and Facebook by announcing an expansion of its existing hate and harassment policies on Thursday.
YouTube will “prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence”, the company said in a blog post.
An example would be videos that threaten or harass users by suggesting they are complicit in the QAnon conspiracy theory.
Followers of the unfounded theory believe that US President Donald Trump is a secret warrior against an alleged child-trafficking ring run by corrupt celebrities and “deep state” government officials.
Another internet theory – Pizzagate – promoted baseless claims that children were being harmed at a pizza restaurant in Washington, D.C.
In December 2016, a man who believed the conspiracy theory entered the restaurant and fired an assault rifle. He was sentenced to prison in 2017.
In July, Twitter announced it would be strengthening its policies on tackling content that could lead to “offline harm”.
The social network banned thousands of accounts and blocked URLs associated with QAnon content, while also preventing QAnon tweets from being recommended to users.
Facebook, meanwhile, said it would remove pages, groups and Instagram accounts that “represented” QAnon, even if they did not promote violence.
However, a recent report by the Associated Press found that QAnon content was still spreading online and being recommended to users. Industry experts have long advocated for “cross-platform” action against online conspiracy theories.
YouTube said it had already removed tens of thousands of QAnon videos and eliminated hundreds of channels under its existing policies, especially those that explicitly threaten violence or deny the existence of major violent events.
The number of views that prominent QAnon-related channels received from recommendations shown to non-subscribers had dropped by over 80% since January 2019, according to the platform.
“All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” YouTube said on Thursday.
Jonathan Greenblatt, CEO of the Anti-Defamation League, said YouTube’s new policy was “good to see” and showed the platform was taking threats around violent conspiracy theories seriously.
YouTube said it would be enforcing the updated policy immediately and plans to “ramp up in the weeks to come.”
But QAnon has increasingly crept into politics in the United States, and analysts have suggested that the policy moves by social media companies were long overdue, coming three years after the conspiracy theory first appeared.
On Thursday, President Trump declined to condemn QAnon conspiracy theorists during NBC’s town hall event, saying he did not know much about the baseless theory.
Jonathan Greenblatt said the US president’s ignorance on the subject was “unacceptable” and “shockingly irresponsible”.
President Trump has previously retweeted accounts linked to QAnon and his supporters have been seen holding flags bearing the QAnon logo at election rallies.