
Facebook Bans QAnon, Including All Pages, Groups and Accounts Linked to the Movement

In a move that many consider long overdue, Facebook on Tuesday announced a full ban on QAnon, classifying the conspiracy theory movement as a ‘militarized social movement’ capable of causing serious, real-world harm if left unaddressed.

As per Facebook:

“Starting today, we will remove Facebook Pages, Groups, and Instagram accounts representing QAnon. […] We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks.”

The move is an expansion of Facebook’s crackdown on QAnon groups, announced back in August, which saw the removal of thousands of groups.

At that time, however, Facebook didn’t institute a full ban on QAnon-related discussion, explaining that:

“While we will allow people to post content that supports these movements and groups, so long as they do not otherwise violate our content policies, we will restrict their ability to organize on our platform.”

Now Facebook is removing that provision and taking action on all QAnon-related content.

And the impacts of this new push are already evident.

As noted, many have been calling on Facebook to take action against QAnon for years, with the platform identified as a key facilitator in the spread of the dangerous conspiracy movement.

QAnon is essentially an expansion of the ‘Pizzagate’ theory, which originally suggested that a pizzeria in Washington, D.C. was at the center of an international child trafficking ring connected to Hillary Clinton, Barack Obama, satanism, and more. The theory has been widely debunked, but as far back as 2016 it had already been linked to real-world incidents, with a man entering the pizzeria armed with a semi-automatic rifle to investigate for himself what was happening inside.

Back then, Facebook was warned, with some even suggesting that a Facebook post had started the movement. Yet no direct action was taken, and the theory evolved into a more organized movement, which then morphed into QAnon. An internal investigation conducted by Facebook this year, and leaked to NBC News, found that the platform had provided a home for thousands of QAnon groups and Pages with millions of members and followers. With further threats of violence and dangerous activity linked to the movement, Facebook finally chose to act.

That’s a long time: Facebook had at least some inkling of the potential dangers of QAnon four years ago, yet waited until now to take action.

So why so long?

According to Facebook, QAnon content, until now, hadn’t violated its policies.

“We remove content calling for or advocating violence, and we ban organizations and individuals that proclaim a violent mission. However, we have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior.”

So Facebook initially opted not to take action because most of the discussion was just that: web chatter that didn’t quite cross the line. But further violent incidents, including the murder of a mob boss in 2019 and several armed stand-offs, have since been linked back to QAnon fanaticism, and with these groups, as Facebook notes, celebrating such activity, that line became increasingly difficult to defend.

With each incident, Facebook was called upon to do more to stop the spread of QAnon content, while the movement has also been linked to anti-vaxxers, COVID-19 conspiracies, and more. In fact, QAnon is seen by some analysts as a key amplifier of many conspiracies, which again raises the question: why has it taken until now for Facebook to act?

Some have suggested that the recent Facebook ad boycott, launched in the wake of the killing of George Floyd, prompted Facebook to take a closer look at movements like QAnon, while several civil rights and political activist groups have called on Facebook to do more to address these concerns. It seems that the ongoing pressure has pushed Facebook into action, and while the platform’s preferred approach is to let its users decide what’s acceptable, QAnon clearly pushed that limit too far.

Now, Facebook will look to eliminate QAnon content entirely, with its ‘Dangerous Organizations Operations’ team enforcing the new rules on all related material.

“[The DOO team will] continue to enforce this policy and proactively detect content for removal instead of relying on user reports. These are specialists who study and respond to new evolutions in violating content from this movement and their internal detection has provided better leads in identifying new evolutions in violating content than sifting through user reports.”

Facebook says that it expects QAnon members to shift their approach in line with the new rules, and it will also be watching for new behaviors. If Facebook really does put the squeeze on QAnon groups, it could deal a major blow to the movement: Facebook provides the broadest reach, and a waiting audience of people receptive to such messaging. Without it, QAnon will likely shift to other platforms, but far fewer people will follow.

How will that impact QAnon, and how might it change the upcoming election? QAnon has already been linked to several theories circulating around the campaign, the most recent being that President Trump is not actually sick with COVID-19, but is instead carrying out “secret missions” in line with the movement.

If Facebook is successful, it could severely limit the spread of such theories online, but it could also raise more questions about the platform’s willingness to take action against similar dangerous movements. If Facebook can stop QAnon, why not ban all anti-vax discussion as well (Facebook limits anti-vax content but doesn’t ban it), or climate change denial? The list goes on.

As such, it’s a particularly significant move for Facebook to take, and it could hint at a wider shift in its approach to dangerous movements and hate speech moving forward.
