Digital safety watchdog could prove a milestone online

Role would focus on youth, education and enforcing standards on social media firms

There is widespread frustration among those affected by online abuse that the likes of Twitter and Facebook are at best slow and at worst entirely unresponsive to requests for removal of objectionable content. File photograph: iStockPhoto/Getty

The announcement by Minister for Communications Denis Naughten that he intends to create a new State office tasked with making sure abusive or dangerous content is quickly removed from social media sites may prove to be a milestone in the history of online regulation here.

It reflects a growing concern in society about the problem of damaging behaviour in an online environment impervious to traditional norms of civilised behaviour. That concern has led to a number of sweeping proposals in recent years from politicians such as former Labour minister Pat Rabbitte.

The Government's response looks rather more measured. Naughten's ministerial colleague, Frances Fitzgerald, has already proposed the introduction of new and extended criminal offences in the area of digital harassment, stalking and revenge pornography.

Minister for Communications Denis Naughten intends to establish a statutory digital safety commissioner. Photograph: Gareth Chaney Collins

Now Naughten has announced his intention to establish a statutory digital safety commissioner, with the power to compel social media platforms to remove harmful abuse promptly from their services.


If introduced, will it work? The explosion in popularity of social media platforms and apps has exposed deficiencies in a regulatory framework which still largely dates from the pre-digital era.

Valid concern

The protection of children from harm is a particular and valid concern, now that many young people have daily access to the internet via mobile devices. Meanwhile, there is widespread frustration among those affected by online abuse that the likes of Twitter and Facebook are at best slow and at worst entirely unresponsive to requests for removal of objectionable content.

For their part, these companies have resisted any attempt at external regulation. Most people, though, will agree with Naughten when he says he “will not accept there is a place in this digital world for those who wish others dead, raped, disfigured . . . and the list goes on”.

The announcement, along with Fitzgerald's proposals, comes on foot of a Law Reform Commission report last year on "harmful communications and digital safety". That report recommended the establishment of a statutory commissioner, modelled on comparable offices in Australia and New Zealand.

There may, though, be rather a lot of devil in the detail.

If the Minister follows the report’s recommendations, the proposed commissioner would have an uncontentious role in promoting digital safety, including “positive digital citizenship among children and young people, in conjunction with the Ombudsman for Children and all the education partners”.

Digital safety

However, the commissioner would also publish a statutory code of practice on digital safety, which would “build on the current non-statutory takedown procedures and standards already developed by the online and digital sector, including social media sites and set out nationally agreed standards on the details of an efficient take-down procedure”.

In other words, it would force social media companies – and perhaps other digital operations such as message boards and comment sites – to act more quickly and more effectively or suffer the consequences in court.

All of which raises a number of questions. Will the new code of practice focus solely on behaviour directly relevant to child protection, such as bullying and personal abuse, or will it extend to broader issues such as hate speech?

Since many social media companies have their European headquarters in Ireland, will the new position – like the Data Protection Commissioner – have an EU-wide remit?

State regulation

And will these companies, when they are consulted about these proposals, mention – in an entirely non-threatening way, of course – the thousands of Irish jobs that depend on their presence here?

The answer to that last question is an unequivocal "yes", and social media companies have successfully rebuffed attempts at State regulation until now. But legislators across Europe are becoming impatient with their refusal to accept that the huge power they wield comes with significant responsibilities.

Politicians in Germany are currently proposing a law that would impose punitive fines on companies which do not react swiftly to complaints. Facebook and the other services will fight a vigorous rearguard action, but in this climate, they might seek a compromise – possibly along the lines of the voluntary code of practice applied to media companies which agree to be bound by the decisions of the Press Ombudsman and Press Council.