
Mark Zuckerberg also announced the Meta changes on Threads. Alamy Stock Photo

The truth about Meta’s broadside against fact-checkers

Journal Media Managing Editor Susan Daly says that platforms, not fact-checkers, are the ones with the power to control what you see.

WHEN META DROPPED its declaration this week that it was replacing fact-checker partnerships in the US with Community Notes, the announcement came as a surprise.

The Journal FactCheck has been a partner in Meta’s Third Party Fact-checking Program (3PFC) since 2018. In common with the more than 100 fact-checker partners in Meta’s project, we had no prior notice of this decision.

Those of us in the European time zones had a morning’s work and some coffee under the belt on Tuesday before Meta slid into our inbox. The 10 US fact-checking partners whose contracts are being terminated would likely have woken to news alerts to that effect on their phones.

As “international” partners (ie, outside the US), we continue our work with Meta as planned for the year ahead, but what Mark Zuckerberg baldly phrased as the decision to “get rid of fact-checkers” in the US casts a long shadow.

I described the decision as “disappointing” on The Journal this week. In truth, while the move continues to provoke disappointment – veering into dismay – it is not all that surprising. There is a political and business reality in the US that has been apparent since the results of the presidential election in November. Meta is not alone among US companies and tech platforms in getting the memo.

The decision by Meta and the language used to convey it lay a lot of blame at the door of fact-checkers, accusing them, particularly the US cohort, of political bias and of being agents of censorship. We’ll come back to that.

Mark Zuckerberg, in his five-minute video on Meta’s new direction, explicitly namechecks Donald Trump, his incoming administration and his election as a “cultural tipping point”. In recent weeks, Meta has elevated Dana White, a friend of Trump, to its board of directors, and replaced former British deputy prime minister Nick Clegg as head of global policy with Joel Kaplan, a veteran of the George W Bush administration.

“Probably, yeah”

The President-elect has previously accused Zuckerberg of plotting against him in the 2020 election and threatened him with prison. And all this against the backdrop of billionaire X owner Elon Musk getting a seat at Trump’s table. So when Trump answered “Probably, yeah” after reporters asked whether he thought Meta’s declaration was an attempt to please him, then, yeah. Probably.

Turns out it’s not really about fact-checkers at all. 

The easiest and, let’s face it, most entertaining narrative is to paint Musk and Zuckerberg as nervous sons jostling for position in front of a mercurial and powerful father. A badly scripted episode of Succession, with elevated levels of cringe and sucking up.

This might be unfair to them both. Musk in particular seems to believe in every alarming word he utters. But keeping this background in mind puts in perspective a number of claims made by Meta and Zuckerberg about how the 3PFC partnership worked (or didn’t). 

In his video this week, Zuckerberg said that following the 2016 election of Donald Trump “the legacy media wrote non-stop about how misinformation was a threat to democracy”.

He says this like it was a bad thing.

The 3PFC project was created in 2016 as a proactive and welcome response from Meta to growing online disinformation, including Russian attempts on Facebook and other platforms to spread false claims during the US presidential election.

The truth is that it was and remains a Very Good Initiative. It isn’t designed to hide content; it is designed to provide context.

Fact-checkers cannot block or remove content

Fact-checkers can add labels indicating that extra information is available on posts assessed to contain false or misleading content. Users can still click through to the post, consult the fact-checker’s analysis, or both.

Meta's own example of 3PFC labelling. Meta

This brings us to a fundamental miscommunication in this week’s Meta statement about the role of fact-checking organisations in the 3PFC Program. They are described as using it as a “tool to censor”. This simply isn’t possible. Fact-checkers could never remove or block content on Meta platforms.

Meta’s own description of the 3PFC states it very clearly: “Fact-checkers do not remove content, accounts or Pages from our apps. We remove content when it violates our Community Standards, which are separate from our fact-checking programs.” 

 The truth is that only Meta has the power to remove or limit access to content on its own platforms.

Political expediency

There is irony too in Zuckerberg branding US fact-checkers as being “just too politically biased” in a statement that, in both timing and explicit alignment, hints strongly at political expediency. 

The truth is that the people and organisations charged with doing that fact-checking work are required to undergo an extensive annual audit of their adherence to the principles of the International Fact-Checking Network (IFCN). As the IFCN membership stated today, this requires that there are “no affiliations with political parties or candidates, no policy advocacy, and an unwavering commitment to objectivity and transparency”. 

It goes on: “Each news organization undergoes rigorous annual verification, including independent assessment and peer review. Far from questioning these standards, Meta has consistently praised their rigor and effectiveness.”

An additional fact: Meta does not allow fact-checkers to label content made by any politician or election candidate under its 3PFC Program. There has been no shortage of free speech opportunities in the political arena on Meta platforms. And, as Meta confirms in its statement this week, the company had itself downgraded the visibility of political content during that period, a policy it is now reversing.

It seeks to inform and be transparent

As counter-disinformation efforts go, the fact-checking partnership project has a good deal to recommend it. It seeks to inform and promote transparency. 

And the truth is that, up until Tuesday, Meta had expressed its satisfaction with the fact-checking partnerships, both publicly and to us as partner organisations. In 2021, two years after the Georgetown ‘freedom of expression’ address Zuckerberg mentions in his video, Meta had praised the Program – and with supporting data:

“We know this program is working and people find value in the warning screens we apply to content after a fact-checking partner has rated it.

“We surveyed people who had seen these warning screens on-platform and found that 74% of people thought they saw the right amount or were open to seeing more false information labels, with 63% of people thinking they were applied fairly.”

Here’s the truth: fact-checking and freedom of expression have not been at odds at Meta, nor has the company treated them as such. It has been a really useful effort, it has been innovative and it has had an impact in increasing the independent, factual information available to users of Meta’s platforms.

Meta has said: This works

And while there is still a decision to be made about the future of the Program in Europe and beyond, bear in mind that Meta’s own assessment of how the partnership with fact-checkers worked to counter misinformation and interference in the 2024 European elections is very positive indeed.

The hard data is there too: “On average, 48% of people who started to share fact-checked content on Facebook, and 44% on Instagram, did not complete this action after receiving a warning from Meta that the content has been fact-checked, demonstrating the impact of labelling efforts in reducing the spread of misinformation on both platforms.” Meta is saying: This works.

We hope that the 3PFC Program will continue in other jurisdictions. It’s not perfect – the ban on fact-checking political speech, for example, has been extremely problematic and we have publicly said so – but it has played a useful part in the fight to keep social media users aware and informed, so they can make their own decisions about what they are seeing online.

Why add to the war on expertise?

At The Journal FactCheck we have found the partnership beneficial, in terms of both resources and distribution. We think its loss would be a great one for users of Meta’s platforms, and that there is room for both this effort and something like Community Notes.

Why exclude expert research and analysis in favour of community sentiment when you can consult and present both? And why not continue to add further tools to support the challenge of enabling robust and informed debate on social media and messaging platforms? Why add to the war on expertise?

The Journal FactCheck existed before the partnership, and will continue to exist beyond any future decisions by external partners. But if there is any takeaway, to use Meta’s preferred term, it is that public service journalism like fact-checking requires support and resourcing, and we cannot depend on a major protagonist of the problem to provide it.

We know that our readers at The Journal value the work of our FactCheck team, sending us notes to say so along with contributions to our readers’ fund. A recent one simply stated: “Your work on fact-checking and disinformation in Ireland is absolutely vital. Thank you, and keep it up!”

Thank you. We stand with you too. 

The Journal FactCheck is a verified signatory of the IFCN, a partner in the Ireland EDMO (European Digital Media Observatory) hub and adheres to the Code of Practice of the Press Council of Ireland. 

Find our latest fact-checks here, and check out our media literacy project, Knowledge Bank, here.
