

Child abuse images freely available on social media platforms, online safety expert warns

Minister for Justice Helen McEntee will later today launch the annual report of Hotline.ie.


CHILD SEXUAL ABUSE material is now freely available on social media platforms and in-house safety teams are struggling to keep up with the sheer volume of illegal content, the head of watchdog organisation Hotline.ie has warned.

Mick Moran – one of Europe’s leading experts in policing child sexual abuse material (CSAM) – was recently appointed CEO of the organisation.

Hotline.ie was set up in the late 1990s by tech companies so that people could report illegal activities online. The non-profit body – which launches its annual report today – works closely with tech companies, the Irish Government and NGOs to combat illegal content online.

Those activities include everything from sexual abuse images to online scams.

Today’s annual report shows that in 2023, there was a 110% increase in the amount of child sexual abuse material identified by Hotline.ie analysts.

Of the 38,435 reports of suspected CSAM in 2023, 29,044 were determined to contain child sexual abuse material or other illegal content, compared to 13,799 in 2022.

Hotline has warned that the increase is in line with reported global trends and that, based on the consistent increases shown over the last five years, these trends are likely to continue.

Moran served for 34 years in An Garda Síochána, during which he completed two secondments to Interpol, including as the Assistant Director in Charge of child exploitation, human trafficking and people smuggling investigations. 

He was at the forefront of international policing and child protection operations in Southeast Asia and West Africa, and headed a team of 40 law enforcement experts.

Speaking to The Journal, Moran detailed how the Irish Government and other states have to move forward with improved internet regulation to control the huge volume of child abuse material online.

He also warned that Artificial Intelligence (AI) can now be used by sex abusers to target children online, and that gaming platforms are places where paedophiles are approaching kids. He said that dealing with the online child abuse ecosystem is not just for law enforcement to tackle but a whole-of-society endeavour – starting in the home with parents and children.

His first time dealing with the issue was in 1997, in the early days of a ‘Wild West-like internet’. 

“The first time that I ever saw CSAM in the late 90s I was full sure that this couldn’t be tolerated, and that it would be sorted in a week,” he said. 

“Ironically you would have heard me saying 20 years ago that we won’t really have to worry too much about policing the internet later on, because the next generation coming through will have been reared online. Unfortunately, that has not happened.”

Playing catch up

Moran believes society is only catching up on the scale of the problem, adding that regulators and even judges are only now realising the damaging nature of the online world.

He referenced how discussions on introducing laws to limit child access to the internet are happening in countries like Australia and Ireland. He welcomed moves by the Department of Health to set up a new task force with an aim of reducing harms. 

Mick Moran, CEO of Hotline.ie (Niall O'Connor / The Journal)

Moran said he has met, down through the years, highly skilled and committed people working as trust and safety officers in tech companies who were dedicated to controlling the flow of the worst of the internet.

He paid a particularly warm tribute to those teams in Meta platforms and described them as “wonderful people, doing their best”. 

“These trust and safety teams are dedicated, hard working until they’re laid off,” he said of the wider industry.  

He said such is the volume of material out there that it is now almost impossible to catch all of it. 

Despite the efforts of those professionals, the desire of some internet companies to make profits is pushing safety processes down the agenda, Moran insisted. Companies are hiding behind a “cloak” of free speech advocacy to protect profits and justify those cuts, he believes.

“The very fact that Hotline exists at all shows that the companies that are our members care about the abuse of their networks.

“For bad behaviour to be online, of course, some of the companies outside of that don’t.

“By pulling on the cloak of privacy and free speech, they are, in my humble opinion, cloaks of convenience. They are being pulled on to justify bad behaviour, to justify making money. That’s the reality,” he said. 

“Twitter [now known as X] is a very good example, because we saw Twitter where efforts were being made to make it a safer platform by removing bad actors. That was deemed by the new owner to be having a negative effect on free speech.

“And so one of the first moves that the new owner made was to remove all of the moderation teams and everyone got laid off. What we are looking at now is massive amounts of misinformation, disinformation and hate speech,” he added. 

However it does not stop at that, Moran said.

There is child sex abuse material published across internet services and apps that people use every day.

“Some companies are better than others at removing it, and that’s where we fit in. If you are a member of the public and you see something on X – report it to us too. We’ll follow up and ensure that it’s taken down,” he said. 

It is not just the big social media companies where problems have been identified. Moran said a huge problem has been identified in gaming platforms and servers used by gamers.

He explained that paedophiles are going onto those platforms to find children and to communicate with them as they begin their grooming of potential victims. The ultimate goal is often to meet those children in real life.  

Moran said that companies have a “fiduciary duty” to their shareholders to make profits but that they must also make those platforms safe. 

“I’m not suggesting companies shouldn’t make money. They’re obliged, by law, to make money for their shareholders.

“However, safety should be the cost of doing business,” he added. 

More than a policing solution

While Moran has been involved in many successful prosecutions of child abusers and those disseminating child sexual abuse material, he does not believe that the solution is solely a law enforcement one.   

“We will never police our way out of this,” he said. 

He said the regulations coming from the European Union and the discussions about limiting children’s access to the internet are to be welcomed. 

“All of this has to start with the ‘net citizen’ themselves, this has to start with the child and by extension, the parents,” he said.  

“If you have a net citizen, a child who operates on the internet and you have them being properly supervised by their parents and in their schools and in the other spaces they’re engaged in… if their internet use is being properly monitored – not by technological solutions, but by their parents, by their families – you are creating a resilience in that child that will result in them later on being better able to deal with all the negative things that’s out there,” he added. 

One key burgeoning area of concern is the use of AI by groomers. This essentially means they can have consistency in the language they use and tailor it to better manipulate potential victims. 

Moran said that this will revolutionise grooming for paedophiles, allowing them to target multiple victims at once while reducing the effort that takes. 

They are using Large Language Models to compose effective messages and strategies. Moran says that his experience shows that offenders can be in touch with at least 70 grooming victims at any one time. 

Technological advances are not all negative for Moran. Hotline.ie is working on a solution to use AI to identify patterns in grooming messages. 

“In 2050, we will look back at this time and see it as an information revolution along the lines of the Industrial Revolution, along the lines of the printing press revolution and how it has affected humanity.

“And of course, when you have changes on that scale for humanity there’s always going to be people left behind, especially vulnerable groups who will suffer. And for me, this is my career – to help them,” he said.


Speaking on RTÉ Radio One’s Morning Ireland this morning, McEntee said that “we have very strong laws in this regard and our gardaí are doing fantastic work”.

“We have a specialised unit within An Garda Síochána specifically focused on this and Coimisiún na Meán now will be responsible for making sure that illegal content like child sexual abuse material is taken down,” she said.

“But I believe we need to do more. The facial recognition legislation that I will bring forward in the coming weeks will allow gardaí to use facial recognition to try to identify victims but also perpetrators,” McEntee said.

“Beyond that, I believe our biggest challenge is the fact that more and more of this type of abuse and material is going to the dark web. It’s going into end to end encryption, it’s going into spaces where gardaí cannot access them, and that’s why, at a European level, I believe we need to do more.

“I’ve been working closely with my colleague Commissioner Johansson [EU Commissioner for Home Affairs 2019-2024] who is absolutely determined that as a union, we work on this, and that companies do more.

“We’re not there yet, but there needs to be a step change here as well, because this is an issue that is growing. It’s getting worse, and I believe that we can do more.”
