YouTube Tightens Rules on QAnon and Other Conspiracy Content

Will such measures work or will they only backfire?

Google’s YouTube has updated its hate speech and harassment policies to prohibit videos that target individuals or groups with conspiracy theories that have been used to justify real-world violence.

“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” the company announced on its blog.

The new rules will ban any content that “threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate,” the blog post read.

YouTube said it would begin enforcing the updated policy immediately and plans to “ramp up in the weeks to come.”

The company added that it has already taken down tens of thousands of QAnon videos and terminated hundreds of channels under existing policies, “particularly those that explicitly threaten violence or deny the existence of major violent events.”

“Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue to adapt our policies to stay current and remain committed to taking the steps needed to live up to this responsibility,” the blog post read.

QAnon is a loosely organized network and community of believers who embrace a range of unsubstantiated claims, most centering on the idea that a cabal of Satan-worshipping pedophiles, many of them Hollywood stars and Democratic politicians, has long controlled much of the so-called “deep state” government.

Recent QAnon posts have spread false information about voter fraud and the novel coronavirus. Some have even claimed that President Donald Trump faked his coronavirus diagnosis so that he could make secret arrests.

The FBI has also identified QAnon as a potential domestic terrorism threat.

YouTube’s decision to restrict QAnon content follows similar changes made by other social media platforms. Twitter removed QAnon accounts and restricted content in July, and just last week, Facebook announced that it would remove groups, pages, and Instagram accounts that identified with QAnon.

“Starting today, we will remove Facebook Pages, Groups and Instagram accounts for representing QAnon,” Facebook said in a statement.

“We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks. Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”

Earlier this year, YouTube updated its policy to begin removing content that contains coronavirus-related misinformation, such as claims that 5G causes the virus.

Ethen Kim Lieser is a Minneapolis-based Science and Tech Editor who has held posts at Google, The Korea Herald, Lincoln Journal Star, AsianWeek, and Arirang TV. Follow or contact him on LinkedIn.

Image: Reuters