
‘Disinformation will be back’ warns expert as Meta rolls out Community Notes

Similar scheme is used on X – and research finds it still relies on fact-checkers.

AS META PREPARES to roll out its Community Notes feature to counter misinformation, experts warn that organised disinformation groups could exploit the new system.

Meta announced on Thursday it would begin testing Community Notes across its platforms in the United States next week, as it shifts away from third-party fact-checking toward a crowd-sourced approach to content moderation.

Chief executive Mark Zuckerberg announced the new system, popularised by the Elon Musk-owned platform X, in January as he appeared to align himself with the incoming Trump administration.

However, comprehensive analysis of Community Notes on X has revealed that the system still relies heavily on content published by fact-checking organisations.

Although both Musk and Zuckerberg have declared the Community Notes model a superior alternative to fact-checkers, an analysis by counter-disinformation organisation Maldita.es found that fact-checking organisations were the third most cited source by users in Community Notes on X.

Meta will start testing its Community Notes system on Tuesday, after several years of criticism from supporters of Trump that conservative voices were being censored or stifled on its platforms.

Meta has also scaled back its diversity initiatives and relaxed content moderation rules on its platforms Facebook and Instagram, particularly regarding certain forms of hostile speech.

“Many big disinformation actors abandoned Facebook because fact-checking labels made their work and business model more difficult,” said Carlos Hernández-Echevarría, Associate Director and Head of Public Policy at Maldita.es, a Spanish non-profit foundation dedicated to fact-checking.

“It does seem like they will be back when those labels disappear.”

The Community Notes initiative will allow users of Facebook, Instagram, and Threads to write and rate contextual notes.

Community Notes uses fact-checkers

However, Maldita.es's analysis of how the same system worked on X, which examined 1,175,837 notes proposed in 2024, showed that in many ways the system was still reliant on, and less effective than, traditional fact-checking efforts.

The study found that fact-checking organisations were the third most cited source in Community Notes, behind Wikipedia and other posts on X, and that proposed notes citing fact-checking organisations were published at a disproportionately high rate.

Such notes were also published faster: Community Notes citing evidence from fact-checkers became visible 90 minutes earlier than those that did not.

However, the report notes that the vast majority of proposed notes on X, including those citing fact-checkers as sources, are never published, and not because they lack good information.

“The baseline argument is that people who normally disagree on the usefulness of proposed notes need to agree so a note can become visible,” Hernández-Echevarría told The Journal.

“So by design they are ensuring a lot of notes showing useful, factual context will never become visible, particularly on highly polarized topics.”

How exactly Meta’s version of Community Notes will work is unclear.

“This isn’t majority rules,” a press release by the company reads. “No matter how many contributors agree on a note, it won’t be published unless people who normally disagree decide that it provides helpful context.”
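
That “bridging” requirement is the heart of the ranking algorithm X has open-sourced and which Meta says it is drawing on: each rating is split into a note’s intrinsic helpfulness and a term capturing shared viewpoint between the rater and the note, so one-sided support is absorbed by the viewpoint term rather than boosting the note’s score. The Python sketch below is a simplified illustration of that idea, not Meta’s or X’s production code; the function name, hyperparameters and toy data are invented for illustration.

```python
# Simplified sketch of "bridging-based" ranking, assuming a stripped-down version of the
# matrix-factorisation idea behind X's open-source Community Notes code. Not production code.
# Each rating is modelled as:
#   rating ≈ mu + rater_intercept + note_intercept + rater_factor * note_factor
# The factor term absorbs agreement explained by shared viewpoint, so a note's intercept
# (its "helpfulness" score) only rises when raters with opposing factors agree on it.
import numpy as np

def note_scores(ratings, n_raters, n_notes, epochs=300, lr=0.05, reg=0.03):
    """ratings: list of (rater_id, note_id, value) with value 1.0 = helpful, 0.0 = not helpful."""
    rng = np.random.default_rng(0)
    mu = 0.0
    b_rater = np.zeros(n_raters)              # how generously each rater rates overall
    b_note = np.zeros(n_notes)                # intrinsic helpfulness of each note
    f_rater = rng.normal(0, 0.1, n_raters)    # rater viewpoint factor
    f_note = rng.normal(0, 0.1, n_notes)      # note viewpoint factor

    for _ in range(epochs):                   # plain SGD on squared error with L2 regularisation
        for u, n, y in ratings:
            err = y - (mu + b_rater[u] + b_note[n] + f_rater[u] * f_note[n])
            mu += lr * err
            b_rater[u] += lr * (err - reg * b_rater[u])
            b_note[n] += lr * (err - reg * b_note[n])
            f_rater[u], f_note[n] = (f_rater[u] + lr * (err * f_note[n] - reg * f_rater[u]),
                                     f_note[n] + lr * (err * f_rater[u] - reg * f_note[n]))
    return b_note

# Toy example: note 0 is rated helpful by raters on both "sides"; note 1 only by one side.
ratings = [(0, 0, 1.0), (1, 0, 1.0), (2, 0, 1.0), (3, 0, 1.0),
           (0, 1, 1.0), (1, 1, 1.0), (2, 1, 0.0), (3, 1, 0.0)]
scores = note_scores(ratings, n_raters=4, n_notes=2)
# Note 0 (cross-viewpoint support) should score higher than note 1 (one-sided support);
# in the real system, only notes whose score clears a threshold become publicly visible.
print(scores)
```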

At the time of writing, Meta had not responded to requests for further information on its Community Notes system.

Bias and manipulation

Meta’s new approach ignores research that shows Community Notes users are often spurred by “partisan motives” and tend to over-target their political opponents, according to Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech.

Meta says it expects “Community Notes to be less biased than the third party fact checking program it replaces because it allows more people with more perspectives to add context to posts,” an approach Hernández-Echevarría compared to giving a doctor’s opinion on medical issues the same weight as anyone else’s.

“Meta is already saying notes will be anonymous, so any notion of expertise and confidence creation disappears,” Hernández-Echevarría told The Journal. “Meta’s justification that experts ‘like everyone else, have their biases’ is plainly wrong, and dangerous in my opinion.”

In its press release, Meta acknowledged that it is using X’s algorithm as the basis of its own rating system, a prospect unlikely to comfort critics who can point to many instances of the system failing to stop the spread of dangerous misinformation on X.

One analysis of the source code indicated that a coordinated attack could use X’s consensus model to “bully people away from certain topics” just by rating their notes as unhelpful.

Studies have shown major spikes in hate speech since Musk took over X, as well as major failures to take down child sexual abuse material.

And the Community Notes system intended to hinder lies has even been used as a vector to spread disinformation.

“The threshold for participation is very low,” Hernández-Echevarría told The Journal of Meta’s Community Notes project, “and the threshold for quality of evidence nonexistent.

“That combination is basically calling for organised groups to abuse the system. And they will, because the price is worth it.”

A testing period

Meta said approximately 200,000 potential contributors in the United States have already signed up across the three platforms. The new approach requires contributors to be over 18 with accounts more than six months old that are in “good standing”.

During the testing period, notes will not immediately appear on content and the company will gradually admit people from the waitlist and thoroughly test the system before public implementation.

Notes will be limited to 500 characters, must include supporting links and will initially support six languages commonly used in the United States: English, Spanish, Chinese, Vietnamese, French and Portuguese.
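
Those submission rules amount to a simple set of checks. Purely as an illustration, and assuming nothing about Meta’s actual implementation, they could be expressed as a validation function like the hypothetical sketch below; the function name, URL pattern and language codes are assumptions.

```python
import re

# Purely illustrative check of the submission rules Meta has described: a 500-character
# limit, at least one supporting link, and one of the six launch languages.
# The function name, URL pattern and language codes are assumptions, not Meta's API.
MAX_LENGTH = 500
LAUNCH_LANGUAGES = {"en", "es", "zh", "vi", "fr", "pt"}
URL_PATTERN = re.compile(r"https?://\S+")

def is_submittable(note_text: str, language_code: str) -> bool:
    if len(note_text) > MAX_LENGTH:
        return False                            # over the 500-character limit
    if not URL_PATTERN.search(note_text):
        return False                            # no supporting link included
    return language_code in LAUNCH_LANGUAGES    # must be one of the six launch languages

print(is_submittable("This claim was debunked: https://example.org/report", "en"))  # True
print(is_submittable("No source provided here.", "en"))                             # False
```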

“Our intention is ultimately to roll out this new approach to our users all over the world, but we won’t be doing that immediately,” the company said.

“Until Community Notes are launched in other countries, the third party fact checking program will remain in place for them,” it added.

The Journal remains in partnership with Meta as a third-party fact-checker in Europe.

However, unlike posts flagged by fact-checkers, which often had their distribution reduced, content with Community Notes attached will not face distribution penalties on Meta’s platforms.

“Purporting this as an anti-misinformation measure is difficult to reconcile with the fact that there are zero consequences for bad behavior,” Hernández-Echevarría told The Journal.

“Meta has made clear that getting a note attached to your post for lying, even if it happens every day, will have no consequences on visibility or in your capacity to monetize such content.

“It has also made sure that no community notes can be placed on ads, which is a huge incentive both for disinformers to promote their work in that way.”

With reporting from AFP.

