Indian city looks to facial-recognition system to combat harassment of women

Privacy concerns are raised over Lucknow’s plans to use cameras programmed to recognise women’s expressions

Cameras will activate ‘as soon as the expressions of a woman in distress change’, alerting a nearby police station, says police commissioner DK Thakur. File photograph: iStock

Digital rights experts have raised concerns over a plan in India to monitor women’s expressions using facial-recognition technology to prevent street harassment.

The move in the northern city of Lucknow – about 500km from the nation’s capital, New Delhi – will lead to intrusive policing and privacy violations, experts warned on Friday.

Police have identified about 200 hotspots in Lucknow that women frequent and where most harassment complaints are reported, said police commissioner DK Thakur.

Under the plan, five artificial intelligence-based cameras capable of recognising the expressions of a woman in distress will be set up in the city, he said, and will send an alert to the nearest police station.


“These cameras will become active as soon as the expressions of a woman in distress change,” Mr Thakur said, without giving further details on which expressions would trigger an alert.
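Neither the commissioner nor the article explains how "distress" would be detected. Purely as an illustration of the kind of pipeline such a description implies, the sketch below assumes a hypothetical classify_expression() model and a hard-coded set of trigger labels; none of these names, labels or thresholds come from the Lucknow police.

```python
# Illustrative sketch only: the Lucknow police have not published how the
# system would classify "distress", so classify_expression(), DISTRESS_LABELS
# and the alert format below are hypothetical placeholders.
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Alert:
    camera_id: str    # which of the five cameras raised the alert
    frame_index: int  # position of the frame in the camera feed
    label: str        # expression label that triggered the alert


# Assumed trigger expressions; the article does not say which expressions count.
DISTRESS_LABELS = {"fear", "distress"}


def classify_expression(frame: bytes) -> str:
    """Hypothetical stand-in for an emotion-recognition model.

    A real deployment would run a trained classifier on each video frame;
    no public details exist on the model Lucknow plans to use.
    """
    raise NotImplementedError("placeholder for an unpublished model")


def monitor(camera_id: str, frames: Iterable[bytes]) -> List[Alert]:
    """Flag frames whose classified expression is a 'distress' label,
    mirroring the description of cameras that become active when a
    woman's expression changes and alert the nearest police station."""
    alerts: List[Alert] = []
    for i, frame in enumerate(frames):
        label = classify_expression(frame)
        if label in DISTRESS_LABELS:
            alerts.append(Alert(camera_id, i, label))
    return alerts
```

Even in this toy form, the questions critics raise are visible: which labels belong in the trigger set, and what happens after a false alert, are policy choices no classifier can settle.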

Facial-recognition technology is increasingly being deployed in airports, railway stations and cafes across India, and there are plans for nationwide systems to modernise the police force and its information-gathering and criminal-identification processes.

Privacy breach

But technology analysts and privacy experts say the benefits are unclear, and that the technology could breach people’s privacy or lead to greater surveillance, with little clarity on how it works, how the data is stored and who can access it.

“The whole idea that cameras are going to monitor women’s expressions to see if they are in distress is absurd,” said Anushka Jain, an associate counsel at digital rights non-profit Internet Freedom Foundation.

“What is the expression of someone in distress – is it fear, is it anger? I could be talking to my mother on the phone and get angry and make a face – will that trigger an alert and will they send a policeman?”

A more feasible solution would be to increase police patrol numbers, Ms Jain said, adding that the technology is untested and could lead to over-policing and the harassment of women who trigger alerts.

India is one of the world’s most dangerous places for women, with a rape reported every 15 minutes, according to government data. Uttar Pradesh, where Lucknow is located, is the least safe state, recording the highest number of crimes against women in 2019.

Police often turn away women who go to register complaints or fail to take action, said Roop Rekha Verma, a women’s rights activist in Lucknow. “And they want us to believe they will take action watching our facial expressions,” she added.

Legal reforms

India introduced a slew of legal reforms after a fatal 2012 gang rape, including easier mechanisms to report sex crimes, fast-track courts and a tougher rape law with the death penalty, but conviction rates remain low.

While there is a growing backlash against facial-recognition technology in the United States and Europe, Indian officials have said it is needed to bolster a severely under-policed country, stop criminals and find missing children.

Digital rights activists say its use is problematic in the absence of a data-protection law and threatens the right to privacy, which the Supreme Court declared a fundamental right in a landmark 2017 ruling.

“The police are using the technology to solve a problem without considering that this will simply become a new form of surveillance, a new form of exercising power over women,” said Vidushi Marda, a researcher at human rights group Article 19.

“[Artificial intelligence] is not a silver bullet and no amount of fancy tech can fix societal problems,” she said. – Reuters