
Taylor Swift performs during her Eras Tour concert in Argentina on 9 November 2023. Alamy Stock Photo

Why X has moved to block searches for Taylor Swift after an AI-related controversy

Last week, AI-generated fake images of the popstar went viral on X.

SOCIAL MEDIA PLATFORM X has blocked searches linked to Taylor Swift after sexually explicit AI-generated images of the pop star went viral on the site.

At the time of writing, any search for “Taylor Swift” on the platform results in a message which reads: “Posts aren’t loading right now. Try again.”

AI-generated deepfake porn images of Swift were posted to X last week and caused widespread outrage. 

Deepfakes are digitally manipulated images, video and audio that are designed to create fake material featuring the likeness of an individual.

Deepfakes have become easier to make in recent years and are increasingly used for malicious purposes. 

Joe Benarroch, head of business operations at X, told the Wall Street Journal that the move was a “temporary action” and had been taken “with an abundance of caution” as the site “prioritises safety on this issue”.

It is not clear how long the search block will remain in place.

‘Alarming’

One post featuring the images was viewed close to 50 million times before it was removed from the platform.

According to US media, the post had been on the platform for around 17 hours.

“It is alarming,” said White House Press Secretary Karine Jean-Pierre when asked about the images on Friday.

“Sadly we know that lack of enforcement (by the tech platforms) disproportionately impacts women and they also impact girls who are the overwhelming targets of online harassment,” Jean-Pierre added.

While deepfake porn images of celebrities are not new, activists and regulators are worried that easy-to-use AI tools will create an uncontrollable flood of toxic or harmful content.

Non-celebrities are also victims, with increasing reports of young women and teens being harassed on social media with sexually explicit deepfakes that are ever more realistic and ever easier to manufacture.

The targeting of Swift could shine a new light on the phenomenon, with her millions of fans outraged at the development.

“The only ‘silver lining’ about it happening to Taylor Swift is that she likely has enough power to get legislation passed to eliminate it. You people are sick,” wrote influencer Danisha Carter on X.

In a statement on Friday, X said that “posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content.”

The Elon Musk-owned platform said that it was “actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”

It added that it was “closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”

X has since taken the decision to block searches for “Taylor Swift” on the platform.

‘Easier and cheaper’

Yvette Clarke is a Democratic congresswoman from New York who has backed legislation to fight deepfake porn.

When the images of Swift went viral last week, she remarked: “What’s happened to Taylor Swift is nothing new. For years, women have been targets of deepfakes without their consent.

“And with advancements in AI, creating deepfakes is easier & cheaper.”

Several states in the US have made it a crime to share sexually explicit deepfake images which were created without the person’s consent. 

Despite this, in a public service announcement last year the FBI said it “continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content”.

According to research cited by Wired magazine, 113,000 deepfake videos were uploaded to the most popular porn websites in the first nine months of 2023.

And research in 2019 from Deeptrace Labs, the creator of a service designed to identify deepfakes, found that 96% of deepfake videos on the internet were pornographic.

The issue first came to light in 2017 when deepfake porn images of female celebrities started appearing on forum sites like Reddit. 

The software needed to create these images is now widely available online and many celebrities, mostly women, have been targeted through the creation of fake, sexually explicit images.

However, it is not only celebrities who are the victims of deepfakes.

In September, the BBC reported on a town in Spain where 20 girls between the ages of 11 and 17 had come forward to report that they were the victims of sexually explicit deepfakes. 

The explicit images were created by using photos of the girls fully clothed, with some of the images taken from their own social media accounts.

These were then put through an AI app that converted them into sexually explicit deepfakes.

Meanwhile, here at home, Fianna Fáil politicians recently warned that deepfakes have “turbocharged” the disinformation threat to elections.

They called for the Electoral Commission to create a strategy to tackle the misuse of artificial intelligence in political campaigning.

Elsewhere, a study from the World Economic Forum identified AI-driven misinformation and disinformation ahead of elections in major economies as the biggest global risk this year and next.

With additional reporting from AFP. © AFP 2024
