

Coimisiún na Meán: Ireland's new watchdog on using 'teeth' to take on social media giants

Coimisiún na Meán published its obligations for the next nine months in a work programme today.

IRELAND’S NEW COMMISSION for regulating broadcasters and online content will have the power to enforce financial and criminal sanctions on social media companies if they do not comply with legally-binding online safety regulations.

It is the first time that platforms will have legally-binding obligations under Irish and EU law to reduce harmful content that is shared on them.

Coimisiún na Meán, which has taken over the role of the former Broadcasting Authority of Ireland, has published its first work programme, setting out its obligations for the next nine months.

The commission is responsible for overseeing the regulation of broadcasting and video-on-demand services, as well as introducing the new regulatory framework for online safety.

It is responsible for regulating platforms that have established their European headquarters in Ireland and will work with its counterparts in other EU member states and with the European Commission in doing so.

It currently has around 50 staff members and is aiming to have 160 by next year, when it becomes fully operational.

Coimisiún na Meán will begin enforcement of the EU’s Digital Services Act next February and develop an online safety code which will address harmful content on video-sharing platforms. 

The code will set out terms and conditions for firms aimed at reducing harmful content online. It will place binding obligations on those firms to provide transparent, user-friendly ways to report and flag harmful content, and to take action to remove it.

This content includes toxic beauty content, material promoting self-harm and suicide ideation, cyberbullying and misogynistic content, as well as hate speech directed against groups with protected characteristics and content relating to criminal offences, including terrorism, child sexual abuse material and racism.

If companies do not comply, the commission will have enforcement powers, including civil and criminal sanctions.

Fines

Under the Online Safety and Media Regulation Act 2022, it will be able to impose fines of up to 10% of relevant turnover or €20 million, whichever is greater.

It will also be able to pursue criminal prosecutions for certain offences under the Digital Services Act and to impose fines of up to 6% of worldwide turnover.

Jeremy Godfrey, the executive chairperson of the commission, said the obligations will ensure that platforms have “an easy, user-friendly way for people to report concerns about content”. 

“A lot of cases that cause a lot of distress are when people see video online, and they report it and then nothing happens, or they can’t find out how to report it.”

“[Platforms] will have to deal with those concerns and respond to the person who has reported it in a timely manner. We’ll be monitoring whether or not they’re actually doing that, and if they don’t do it, if they’re systematically not doing it, then we’ll be able to take enforcement action,” Godfrey said.

Speaking to The Journal, online safety commissioner Niamh Hodnett said the commission “has teeth”. 

“We’re not alone in relation to enforcement. In relation to the Digital Services Act, we’ll be working with the European Commission – they’re also ramping up their enforcement team,” she said.

“We will be putting the binding obligations in our online safety code on the platforms themselves for them to comply and demonstrate compliance and then where they don’t we have these significant civil financial sanctions, but we also have criminal sanctions, in relation to certain breaches of offences under the act as well. So we do believe we have teeth – together with the European Commission – to enforce.”

Hodnett said the commission might be able to engage with platforms for lesser infringements, but a formal investigation will be required for “very serious and egregious” infringements, or if platforms do not cooperate.

“These will be formal enforcement powers. We would have authorised officers who can carry out investigations and receive evidence. We can send formal binding information requests and if these are not complied with, there can be criminal sanctions,” she said.

“At that point, we can move to make a decision that there’s been a breach of the code or infringement and then we can fine or impose civil administrative sanctions in relation to that if it was very serious.”

The commission will hold a call for input over the coming weeks, seeking the public’s views on the online safety code, such as what it should contain and whether there should be a timeframe for how quickly platforms must remove content deemed to be harmful.

Safety-by-design

Hodnett said one obligation she would like to see in the online safety code is a safety-by-design requirement to stop the amplification of harmful content online through algorithms.

“This is the type of harmful content where each individual piece of content on its own may not necessarily look illegal or harmful, but when you take that as the totality of your entire feed, then it has a seriously detrimental effect on the mental health of a child or an adult,” she said.

“It becomes a toxic feed in relation to toxic beauty or in relation to suicide ideation or self-harm ideation. When that is the only feature you’re getting, and you’re getting that as an echo chamber, that has a very negative effect on people’s mental health.”

She said online platforms have not “been doing nothing” in relation to removing harmful online content, with some having made “substantial investment in content moderation”.

“Currently, they may be concerned about removing a particular item of content, whether it would reduce freedom of speech, for example, but if there’s a regulatory obligation to do that, then that gets them over the hurdle of worrying about ‘will I run into difficulty in the freedom of expression or freedom of speech if I remove this content?’”

Asked if she was concerned that the regulations would not change the behaviour of platforms, Hodnett said it’s “in the platform’s interest to have a safe and vibrant space”.

“Otherwise, their users will start turning away or become more polarised and that might not necessarily always be in their interest as well.

“I think it very much is a global movement, and we will see a reduction in harmful content online.”

Apart from online safety, the commission is also working on a strategy to promote gender equality and inclusion in the media, which was set out in the Future of Media Commission Implementation Strategy and Action Plan in January.

Irish language media review

The strategy will identify gaps in the media sector and set out how to bridge them through a system of standards.

The commission also intends to undertake a review of Irish language media services in Ireland, as well as to implement a court reporting scheme and a local democracy scheme for local journalism.

The commission will also produce a statutory report on the impact of changes in ownership of media services over the last three years and the subsequent impact on plurality.

There will also be changes to the complaints regime for broadcasters. Broadcasting commissioner Celine Craig said the commission is currently working on the regime based on the provisions of the Online Safety and Media Regulation Act.

“While the changes are quite extensive, we’ve been working very much on putting a new shape, if you like, on the new regime, and we’ll be putting that in train very soon,” Craig said.

Media Minister Catherine Martin today welcomed the publication of the work programme.

“[The programme] sets out a clear set of milestones to create a safer online environment, including the publication of a binding online safety code by the end of the year which will address the protection of minors, hate speech and criminal offences on video-sharing platform services.

“Coimisiún na Meán will have my full support in delivering this ambitious and vital work programme.”

Author: Jane Moore