Ireland’s media regulator is hopeful it will “soon” have increased powers to initiate investigations into online platforms facilitating child abuse, an Oireachtas committee heard on Thursday.
Jeremy Godfrey, regulatory leader with Coimisiún na Meán, told the committee on children and equality that his staff have powers to demand information from platforms on foot of a complaint, but not to initiate investigations.
The “ability to require the production of information, to have statutory information requirements at an earlier stage, not just when an investigation has been opened, that would be useful”, he said.
“We are talking to the Department [of Culture and Media] about that and there may be some legislation for you to consider, relatively soon I hope,” he told committee members.
He said a “workforce plan” was being finalised in anticipation of increased investigative powers.
Assistant Garda Commissioner Angela Willis, of the organised and serious crime bureau, said if the force had “more resources” it could “do more” to counter online child sex abuse, which was “growing in scale and sophistication” and “presenting unprecedented challenges for law enforcement” across the globe.
One of the “most challenging areas” was “victim identification” while reviewing material on seized devices.
Last year, working with international police forces, gardaí identified 151 child victims of online sexual exploitation, 16 of whom were in Ireland. Tusla, the Child and Family Agency, was contacted and the children “removed from harm” here.
Risks to children were “no longer [confined to] online grooming and cyberbullying” but had evolved to include “more sinister forms of exploitation through social media and gaming platforms”, said Ms Willis.
“Technology has increased the scale, speed and anonymity with which offenders can operate across jurisdictions, making enforcement more complex,” she said.
Asked if gardaí had sufficient powers to investigate online abuse, she said: “When it comes to child sex abuse material we have absolutely sufficient legislative power.”
However, when an adult was the victim of intimate image abuse, “we need a complainant and we also need for that image to have been circulated”.
“So there are offences there [that can be investigated] but we work within the legislation that’s provided.”
Fiona Jennings, head of policy at the Irish Society for the Prevention of Cruelty to Children (ISPCC), said that, while regulation may be increasing, there was a sense “we are entering a new iteration of self-regulation” where children and young people were being asked to “control their own screen time despite powerful algorithms designed to keep their attention”.
Research conducted by the ISPCC with Technological University Dublin found accounts used by 13-year-olds “generally received higher levels of harmful content across most platforms than 18-year-olds’ accounts”, she said.
It was “concerning” that children affected by online harm were increasingly “turning to AI and chatbots for support” rather than people in their lives or “traditional services like ourselves”.
The society has received numerous emails and calls from children and parents concerned about the Grok tool on X that enabled users to upload photos, including of children, and to “undress” them. X announced measures to prevent the artificial intelligence chatbot from undressing images of real people.
Jane McGarrigle, national co-ordinator of Webwise, an internet safety awareness organisation, said just 20 per cent of its youth panel voted in favour of a ban on social media for children under 16 years.
The 80 per cent against said social media “played an important role in learning, connection and entertainment”, she said. “Banning would fail to address underlying problems like mis- and disinformation ... [and] could push young people into more unregulated spaces.”
They wanted better education on online safety and digital literacy, and better support for parents. Several speakers stressed that young people wanted input into policy on their access to social media and its regulation.