
A screenshot from an AI 'nudify' app online

'Deep concerns' over AI 'undress' apps being advertised on Irish social media

So-called ‘nudify’ apps allow users to create deepfake naked images of any woman.

THE DUBLIN RAPE Crisis Centre has expressed concerns over online advertisements in Ireland for services that let users create deepfake naked images of women.

A number of ads on Facebook, seen by The Journal, have in recent weeks promoted so-called ‘nudify’ apps which allow users to create non-consensual, fake nude images of any woman.

The apps ask users to upload ordinary, clothed photos of women, then use artificial intelligence to ‘erase’ the clothing and produce a pornographic image.

Nudify bots have also proliferated on the messaging app Telegram, where they have reportedly been used by millions of people every month.

They normally require users to pay a fee, but ads seen by The Journal in recent weeks promoted one app that offered to produce deepfake images for free for a limited time.

One such ad claimed “this app can see through objects” and showed a video of a woman dancing with her clothes on, followed by a brief shot of the same woman apparently undressed.

Meta’s advertising policies prohibit nudity and sexual activity, as well as sexual exploitation involving non-consensual acts, and the relevant ads have since been removed. 

A spokesperson for the company said that Meta removes any content in violation of its policies, but that bad actors are “constantly evolving their tactics to avoid enforcement”.

Rachel Morrogh, Chief Executive of the Dublin Rape Crisis Centre, said that she was “deeply concerned” about the capacity of deepfake images to “amplify harm to women”.

“The Dublin Rape Crisis Centre believes apps that enable the generation of deep-fake images which demean, debase and dehumanise women should not be available to download,” she told The Journal.

“The ease of access to AI tools via apps such as the one identified means that this is a form of sexual violence that is likely to become even more prevalent in the years ahead.”

She added that the widespread use of deepfake technology to create non-consensual abusive content reinforces harmful attitudes that objectify women and perpetuate gender-based violence.

Sharing intimate images without a person’s consent, or threatening to do so, is a crime under Irish law, carrying a maximum sentence of seven years in prison and an unlimited fine.

The relevant legislation, the Harassment, Harmful Communications and Related Offences Act, also covers the sharing of deepfake images which purport to show an intimate image of someone.

Last October, Minister for Justice Helen McEntee revealed that there have been about 100 prosecutions under the law since its introduction in 2021, though it is not known how many relate to the distribution of deepfake images.

If you have been affected by any of the issues mentioned in this article, you can reach out for support through the following helplines:

  • DRCC - 1800 77 8888 (free, 24-hour helpline)
  • Samaritans - 116 123 or email jo@samaritans.org (suicide, crisis support)
  • Pieta - 1800 247 247, or text HELP to 51444 (suicide, self-harm)
  • Teenline - 1800 833 634 (for ages 13 to 19)
  • Childline - 1800 66 66 66 (for under 18s)

