Gardaí are now dealing with some 22,000 hours of footage from the Dublin riots, an Oireachtas committee will hear on Tuesday, as it considers evidence on the merits of introducing facial recognition technology (FRT) for policing.
Civil liberties groups will tell politicians that the risks of introducing FRT into Irish policing are “too high” and that it is not a “silver bullet solution”.
However, Garda Commissioner Drew Harris is expected to tell the committee that “manual processing by Garda personnel sitting at screens is becoming unfeasible and ineffective”.
Mr Harris says that every major criminal investigation now involves processing digital evidence, often images or footage from seized devices or CCTV. There has been an “explosion in the volume of digital footage as evidence”, he says, giving the example of the November 23rd riots. In the early days of the investigation into the unrest, gardaí said they were looking at 6,000 hours of footage, but TDs and Senators will hear on Tuesday that the footage from the riots runs to 22,000 hours.
Individual murder investigations can have “upwards of 50,000 hours of footage” and “seized devices can have over a million images of child sexual abuse”, the Oireachtas Committee on Justice will be told.
The committee will hear from several witnesses as it conducts pre-legislative scrutiny of the draft outline of the Garda Síochána (Recording Devices) (Amendment) Bill.
Minister for Justice Helen McEntee is seeking to introduce FRT to help in the investigation of serious crimes such as murder, child sexual abuse and abductions.
The technology would speed up Garda trawls through CCTV or other video footage and would apply to recorded footage retrospectively, not to live feeds.
Ms McEntee has previously said its introduction is not for the purposes of mass surveillance.
In an opening statement to the committee, the Irish Council for Civil Liberties (ICCL) and Digital Rights Ireland (DRI) say the use of FRT by police “engages many fundamental human rights”, including rights to human dignity, privacy, protection of personal data and protest.
The groups call on the committee to “urge the Government to reconsider this proposal to introduce FRT into Irish policing, as we believe the risks to these fundamental rights are too high”.
They argue that FRT is “unreliable and not the silver bullet solution it’s often presented to be”.
They say the technology is “discriminatory” and “studies have clearly demonstrated deeply inherent racial and gender biases”.
ICCL and DRI also say FRT “can enable powerful mass, indiscriminate and pervasive surveillance”, and argue that the Bill as proposed is unlawful under European Union law.
The Data Protection Commission’s (DPC) submission to the committee acknowledges the potential of facial recognition technology to benefit the work of gardaí.
However, it adds: “As the use of this technology presents serious risks to the individual’s right to data protection, the legislation must implement the necessary restrictions, limitations and safeguards to ensure that any deployment of facial recognition technology by An Garda Síochána is strictly necessary and proportionate”.
It continues: “Facial recognition technology does not provide definitive results” and there are factors that can “significantly impact the reliability and accuracy of the technology”.
The DPC says: “Consequently, it will be necessary for a Data Protection Impact Assessment (DPIA) to be carried out prior to the introduction of the technology.”
For the Garda, Mr Harris says two separate judgments from the Court of Appeal confirm that gardaí “have a duty to process available footage to identify or exclude suspects”.
He says that in dismissing the respective appeal cases of convicted murderers Freddie Thompson and Philip Dunbar, “the court’s rulings were instructive in terms of the balance between a suspect’s right to privacy and the human rights of the victim”.
Mr Harris says there is “understandable public concern” and “some confusion” about artificial intelligence technology.
He will seek to offer reassurance about any Garda use of such technology, saying: “There is no question of autonomous machine decision-making ever”, and “all decisions that can impact on a person are only taken by identifiable and accountable personnel”.
He is also expected to defend the reliability of modern biometric identification systems and say: “There must be safeguards, but these should be proportionate to the risks involved in the specific use cases.”