Facebook Removes QAnon Groups

The social-media platform is also increasing its vigilance toward anarchist groups such as antifa.


Facebook Inc. said it is removing, and will limit the spread of, accounts that celebrate or suggest violence, including those associated with QAnon, as part of a crackdown on the extremist conspiracy theory that has thrived on the company’s platforms in recent years.

The company said Wednesday it had removed more than 790 QAnon-related Facebook groups, 100 Facebook pages and 1,500 QAnon ads, and blocked more than 300 hashtags across Facebook and Instagram. Facebook has also restricted more than 1,950 groups and 440 pages on Facebook, as well as more than 10,000 accounts on Instagram, connected to QAnon.

The social-media giant said the moves reflect an expanded policy regarding dangerous individuals and organizations to include accounts that play host to discussions of potential violence even if they don’t meet its criteria for being marked as dangerous.

In some cases, Facebook will reduce the visibility of pages, groups and accounts associated with these movements, including preventing them from being recommended to users, pushing them lower in users’ news feeds and preventing the groups from buying ads.

Many of the restrictions outlined Wednesday would apply to groups and accounts representing QAnon but not to other users, including politicians, who espouse or support QAnon views from personal and professional accounts. Facebook said in the near future it will prohibit anybody from running ads “praising, supporting or representing” these movements.

The QAnon conspiracy centers on the idea that a powerful group of child traffickers controls the world and is undermining President Trump with the help of other elites and mainstream news outlets. Last year a Federal Bureau of Investigation field office warned that QAnon and other conspiracies could spark violence in the U.S.

Asked about his view of QAnon at a White House press briefing on Wednesday, Mr. Trump said: “I don’t know much about the movement other than I understand they like me very much—which I appreciate.” When a reporter said the theory alleges that Mr. Trump is protecting the world from a satanic group of pedophiles and cannibals, he said: “I haven’t heard that, but is that supposed to be a bad thing or a good thing?” He added: “We’re saving the world from a radical left philosophy that will destroy this country and when this country is gone, the rest of the world will follow.”

Facebook said it is taking similar steps against U.S.-based militia organizations and anarchist groups. They include antifa, short for “antifascist,” a loosely organized left-wing activist movement that has participated in civil-rights protests in Portland and elsewhere around the country.

“It is hugely important that QAnon is being categorized with militias and offline groups that are tied to violent acts,” said Molly McKew, an independent disinformation researcher and author of Stand Up Republic’s Defusing Disinfo blog. “QAnon is often dismissed as a wacky online conspiracy, but radicalized QAnon adherents have been responsible for violent attacks in real life,” she said. “This warrants serious monitoring and intervention.”

Facebook has been a powerful organizational tool for the QAnon movement, which got its start in late 2017 on the fringe site 4chan and later migrated to Facebook groups.

Since March, when the first pandemic-related lockdowns took place in the U.S., QAnon has exploded in popularity on Facebook and Instagram. Experts have connected QAnon’s recent rise to the fact that public-health lockdowns have forced people to spend more time in front of their screens. Researchers also say the content tends to do well on social-media platforms, which often reward sensational content.

Facebook has also removed more than 980 groups, 520 pages and 160 ads on Facebook from militia groups and organizations that encourage riots. It also restricted more than 1,400 hashtags related to those groups on Instagram.

Facebook’s rules already prohibit content that advocates violence and it bans organizations and individuals that adhere to a violent mission, such as terrorist groups. The company is now broadening its policy around dangerous individuals and organizations to factor in movements and groups that “have demonstrated significant risks to public safety but do not meet the rigorous criteria to be designated as a dangerous organization and banned from having any presence on our platform.”

“We have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior,” Facebook said in a press release Wednesday.

In an analysis of Facebook data conducted by social-media research firm Storyful, the average membership in 10 large public QAnon Facebook groups grew from about 6,000 in March to about 40,000 in July, The Wall Street Journal reported last week. Some Instagram accounts more than quadrupled their following.

Facebook said Facebook pages and groups as well as Instagram accounts associated with the militia, antifa and QAnon movements wouldn’t “be eligible to be recommended” to other users who are looking for groups to follow. Those groups would be ranked lower in users’ news feeds and in search results.

The company will also bar Facebook pages associated with these movements from running ads, selling products or fundraising using its products.

Write to Deepa Seetharaman at [email protected]

Copyright ©2020 Dow Jones & Company, Inc. All Rights Reserved.