Dublin riots drag row over facial recognition technology back into spotlight

Some may welcome anything that can make policing easier, but others are likely to be wary of such technology in policing that could intrude on civil liberties

'eGates' for passport control in Terminal 1 of Dublin Airport after they came into use in November 2017. The machines use advanced facial recognition technology to identify passengers ahead of flights and integrate with national and international watch lists. Photograph: Alan Betson

There was an unwelcome sense of “digital deja vu” this week for opponents of facial recognition technology (FRT). In the wake of the Dublin riots there has been renewed impetus around the potential use of the controversial law-enforcement tool by gardaí.

Minister for Justice Helen McEntee reopened a debate on the matter that many hoped had largely been dealt with during the summer.

In response to a previous Government effort to include provision for FRT use in body camera legislation, campaigners, civil liberties groups and academics had all weighed in. Under political pressure, the Minister agreed in June to separate out FRT and deal with the issue another day.

However, following the chaotic scenes witnessed during last month’s riots, McEntee revived the debate about the use of the technology.

Body-worn cameras on gardaí. Under political pressure, Minister for Justice Helen McEntee agreed this year to separate out her facial recognition technology ambitions from the issue of gardaí wearing cameras. Photograph: Alan Betson

She argued that such technology could assist gardaí when trawling through “thousands of hours of CCTV footage” after similar events and could help to ensure that “the thugs and criminals who created so much destruction in our city” are brought to justice.


FRT is artificial intelligence-led software that can scan hours of footage for specific individuals. Many see it as a white knight for otherwise exhaustive police work. Opponents argue that a “creep” factor must be considered when using cameras and other technology to record, store and track people’s images. They say it opens the door to an unpalatable security-state vista.

Dr Elizabeth Farries, of University College Dublin’s Centre for Digital Policy (CDP), who coined the term “digital deja vu”, says something bad has happened and “we’ve got some grand claims from our Ministers that we are going to take this powerful technology, stir it into an existing social problem and solve it”.

From a human rights point of view and a citizen’s point of view, is FRT going to be used correctly, is it compliant with general data protection regulation and also privacy law requirements?

—  Ronan Lupton, senior counsel

During the debate on FRT earlier this year, the CDP was among a number of critics suggesting the technology was ethically and technically unsound. In plain terms, it said, it amounted to a proposition that gardaí would record their interactions with the public, store them indefinitely, take “faceprints” and in certain circumstances use AI algorithms to assess whether anyone is of interest to an investigation.

Such a scenario, it said, was tantamount to “mass surveillance of our entire population by our police force”, eroding basic freedoms. The notion that it would reduce the time gardaí spend reviewing CCTV was branded as no more than an “argument of convenience”.


The last attempt to legislate caused a rupture between McEntee, a Fine Gael Minister, and her Coalition partners in the Green Party, who were concerned about FRT being attached as an amendment to the Recording Devices Bill, which dealt, among other things, with the use of body cameras. The view was that it would not face a thorough pre-legislative scrutiny process.

McEntee ultimately split the two issues. Standalone FRT legislation is currently being drafted and, on foot of the chaos in the capital, the Minister instructed her officials to expand its reach to include rioting and violent disorder, joining other serious offences including murder and terrorism.

A sign disclosing the use of facial recognition technology in-store earlier this year at an entrance to Macy's retail store in New York City. Photograph: Kashmir Hill/The New York Times

As well as pointing to its established use by police in other jurisdictions, a spokesperson for the Department of Justice said the need to identify rioters quickly rendered FRT an “essential tool”. The draft legislation is expected to be submitted to Cabinet for approval “within weeks”.

Fears over the threat to democracy that many believe it represents have not gone unheeded. The department points to several safeguards, including limiting the technology to retrospective rather than live use, and oversight by a High Court judge.

According to Olga Cronin, a senior policy officer at the Irish Council for Civil Liberties (ICCL), which is deeply concerned by the forthcoming legislation, the technology is based on probability and attempts to match people caught on camera with a database of faces.

When an image is taken, it is measured and biometrically processed to make a template, essentially a string of numbers measuring the distance between facial features. Images are “matched” with a probability score, but a number of variables are at play, Cronin says, particularly the quality of the query image for which a match is being sought.
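To make that mechanism concrete, the sketch below (in Python, with invented names, template sizes and thresholds rather than details of any actual Garda or vendor system) shows the kind of matching Cronin describes: each face is reduced to a numeric template, and a query image is compared against a database of templates to produce a similarity score that must clear a threshold before a “match” is declared.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face templates (strings of numbers derived from facial measurements)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query, database, threshold=0.8):
    """Return the closest identity in the database, or None if no score clears the threshold."""
    scores = {name: cosine_similarity(query, template) for name, template in database.items()}
    name, score = max(scores.items(), key=lambda item: item[1])
    return (name, score) if score >= threshold else None

# Toy data: 128-number templates standing in for real biometric faceprints.
rng = np.random.default_rng(seed=1)
database = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}

# A poor-quality query image yields a noisier template, which lowers the match score.
query = database["person_a"] + rng.normal(scale=0.5, size=128)
print(best_match(query, database))
```

The point of the sketch is that both the threshold and the quality of the query image are variables, which is why the system outputs probabilities rather than identifications.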

Even if the technology could be fully accurate all the time, she adds, “that brings its own huge, huge issue, because that means that we are using a technology that literally puts us all in the frame for being just walking barcodes.

“That takes us into a realm of mass surveillance that is just not something that we want to be walking into.”

The CDP says the system would essentially turn members of An Garda Síochána into roaming surveillance units, something that would “fundamentally fracture” their relationship with the public.

Problems with the technology were highlighted in the UK, where it has been used by the Metropolitan Police since 2016. A study by the University of Essex, whose researchers were given access to police trials, found that four out of five people flagged by the technology as possible suspects were in fact innocent.

While this study examined the use of “live” FRT, which has been ruled out in Ireland, it nevertheless confirmed suspicions about the efficacy of the technology.
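A rough back-of-the-envelope calculation, using invented numbers rather than figures from the Essex study or the Met's trials, helps show how a result like that can arise: when only a handful of watch-listed people are actually present in a large crowd, even a matcher with a low error rate produces far more false alerts than genuine ones.

```python
# Illustrative only: all figures below are assumptions, not data from any real deployment.
crowd_size = 50_000          # assumed: people scanned during one deployment
suspects_present = 10        # assumed: watch-listed individuals actually in the crowd
true_positive_rate = 0.90    # assumed: chance a present suspect is correctly flagged
false_positive_rate = 0.001  # assumed: chance an innocent passer-by is wrongly flagged

true_alerts = suspects_present * true_positive_rate
false_alerts = (crowd_size - suspects_present) * false_positive_rate
precision = true_alerts / (true_alerts + false_alerts)

print(f"Genuine alerts: {true_alerts:.0f}")
print(f"False alerts:   {false_alerts:.0f}")                      # about 50 innocent people flagged
print(f"Share of alerts that are genuine: {precision:.0%}")        # roughly 15 per cent
```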

This year, the ICCL noted that 170 civil society organisations and activists from 55 countries around the world have called for an outright ban on biometric surveillance in public spaces.

“This kind of reaching for a tech solution to a deeply societal[ly] complex problem is very unhelpful,” says Cronin, who looks to practical, as well as moral or philosophical, imperatives.

“We cannot gadget our way out of societal breakdown.”

In this latest round of the FRT debate, Cronin adds, a major problem is that nothing has yet been put down on paper as regards the plans for its use, and the Government has yet to demonstrate its necessity.

Even so, might public incredulity over the burning of Garda cars and public transport vehicles be enough to drive proposals for FRT use through the legislature?

“I think that every time that FRT is presented as a magic solution, there is that risk [of greater public acceptance]. Because it’s a natural impulse to say and to feel, we want X to happen because that will equal Y,” she says.

What is in little doubt is that the next draft laws on the use of FRT will be scrutinised just as closely as the last, and the risk of legal challenge is very real.

Security surveillance cameras in Jerusalem's Old City in September 2023. The Israeli government in September endorsed a Bill to allow the police to place facial recognition cameras in public spaces. Photograph: Menahem Kahana/AFP/Getty

Green Party members have again said they want to see the details behind the pending measures. Legal concerns centre on data protection and privacy when it comes to storing or processing people’s images.

“When you’re looking at it from a law enforcement point of view, it’s probably a good thing to stop gardaí having to sit around looking at the footage,” says Ronan Lupton, a senior counsel with expertise in the area of data protection.

Why go the way of China with its surveillance of ethnic minorities? Why go the route of Russia with its surveillance of protesters?

—  Dr Elizabeth Farries, co-chair of UCD’s Centre for Digital Policy

“But on the other hand, from a human rights point of view and a citizen’s point of view, is it going to be used correctly, is it compliant with general data protection regulation and also privacy law requirements? The answer is: who knows? It depends what the Bill says.”

Of particular relevance are strict GDPR rules on processing biometric data, and there must be a legal basis for recording random footage in public. There is also the Law Enforcement Directive, which sets out how personal data can be processed in criminal investigations alongside the GDPR.

Lupton believes the supervision of a High Court judge would be sufficient to safeguard the use of such technology.

Schrems’ privacy group challenges Ryanair’s use of facial recognitionOpens in new window ]

“There needs to be temporal limitations. They can’t just go in all the time whenever they want, and there has to be [court] applications.”

Until legislative proposals are presented, the broader debate looks set to remain fixed somewhere between two polarities: the law-and-order standpoint that technology can aid policing, reduce pressure on stretched resources and keep streets safe, and the rights-based view that FRT could in future be used for different, potentially more sinister purposes.

“Why entrench further an overt mass surveillance society that is used in autocratic regimes?” asks Farries, noting that China and Russia have used it for purposes such as surveillance of ethnic minorities or protesters.

Farries says “the established problems of FRT would be ignored in favour of a rapid and overly simplistic solution” that does nothing to tackle the underlying causes of the issues it is meant to address.