
QAnon: What is this conspiracy and why has Facebook banned it?

More time on the Internet due to the pandemic, coupled with Facebook’s recommendation algorithms, has added fuel to the QAnon fire.

THE QANON CONSPIRACY has gathered such momentum that it poses a misinformation threat both to the upcoming US election and to public understanding of the coronavirus pandemic.

Three years after the conspiracy theory first started, Facebook announced last night that it is banning pages and groups affiliated with QAnon. 

The move was immediate: misinformation analyst Aoife Gallagher of the Institute for Strategic Dialogue noted this morning that all four of the Irish QAnon groups she was tracking had been removed from Facebook. Of the roughly 50 international groups she was monitoring, only five still remain.

The claim at the heart of this conspiracy movement, which doesn’t have a leader or an organisation, is extreme: that Donald Trump is waging a war against a global elite who worship Satan and traffic children. 

There is a judgement day-esque part of this conspiracy too: that Trump is planning a day when members of a paedophile ring will be arrested. QAnon conspiracy theorists have decided that those supposedly complicit in this alleged ring are mainly Democrats, celebrities and members of the media.

This is similar to the ‘Pizzagate’ conspiracy in 2016, a bizarre rumour peddled on Reddit and 4chan which claimed that a Washington pizzeria called Comet was a sinister front hiding a politically-connected paedophile ring. A man who turned up at the restaurant with a gun, saying he believed he was rescuing children held there, was sentenced to four years in prison in 2017.

The QAnon assertions themselves originated a year after that, with a post on the anonymous forum 4chan in October 2017. A user calling themselves ‘Q’ claimed to be a high-level government insider tasked with informing the public about Trump’s master plan by drip-feeding information onto 4chan. A significant number of other users believed this.

Trump, meanwhile, has dodged answering questions about the conspiracy theory, and flown close to legitimising the conspiratorial movement that idolises him.

In August, for example, Trump tweeted his congratulations to a QAnon-supporting Republican nominee for the House of Representatives, Marjorie Taylor Greene.

Taylor Greene, who is running in one of Georgia’s 14 House districts this November, had called the theory “something worth listening to and paying attention to” and, in a YouTube video, called the anonymous 4chan user Q a “patriot, we know that for sure”.

Trump praised her as “strong on everything” and a “future Republican Star”.

Taylor Greene has previously denied that racism is a problem in the US, and when speaking about the election of two Muslim Democrats to Congress, said it was “an Islamic invasion into our government offices”.

Taylor Greene has since tried to distance herself from the ‘QAnon candidate’ label.

The pandemic has fuelled its rise

The QAnon conspiracy has escalated significantly in the past year, and there are increasing concerns of a repeat of the interference that plagued the 2016 US presidential election.

Although the QAnon claims began on the fringes of the internet and originated from an anonymous user, the movement has seen sharp growth on mainstream social media platforms over the course of this year.

It has extended to include further baseless claims that the coronavirus is a conspiracy to control people through vaccines and 5G.

QAnon followers have also hijacked hashtags such as #SaveTheChildren, which calls for an end to human trafficking and for the protection of children. As a consequence, this universally important message is being used to legitimise conspiracies and to draw concerned citizens and parents into extreme, evidence-free theories that usually have a political slant.

US researchers have detected sharp spikes in QAnon content and related searches in March, when many countries had started imposing lockdowns and other pandemic-related measures.

The anxiety, frustration and economic pain caused by the pandemic – coupled with the increased amount of time people were spending online – became an explosive mix that drew people to QAnon, experts say.

Why has Facebook taken action?

In August, Facebook deleted a QAnon-linked group with 200,000 members, for what it says were repeated violations of its policies. Today, it has gone a step further and banned all accounts on Facebook and Instagram linked to the QAnon conspiracy.

In the run-up to the US presidential election, Facebook will be eager not to repeat the mistakes of 2016, when it was accused of being used to influence voters’ opinions and interfere with the crucial election.

The growth of these conspiracy groups has also brought an increased threat of violence: the FBI has identified QAnon as a potential domestic terrorism threat, saying in a report last year that it was one of several movements that could drive “both groups and individual extremists to carry out criminal or violent acts.”

With tensions high during the US election, this will be watched closely by US authorities.

Aside from that, Facebook and other tech giants like YouTube have played a huge role in the growth of QAnon accounts and their followers. Tech analysts have highlighted social media’s recommendation algorithms as a key driver of QAnon growth.

This means that users who view, post or search for certain content are guided to what the platform’s algorithm determines to be other content they may be interested in. Analysts have said this helped link existing conspiracy theories – such as those about vaccines and 5G – with QAnon.

“They know this, especially the core of true believers, they are very good at leveraging the algorithmic… amplification techniques to drive engagement to their videos or posts,” said Alex Newhouse, a disinformation researcher at Middlebury College’s Center on Terrorism, Extremism, and Counterterrorism.

“QAnon would not exist in the volume that it exists without the recommendation algorithms on the big tech platforms.”

With reporting from AFP 
