Activists Decry Police Use of Facial Recognition Cameras To Protect Women

LUCKNOW, India – Privacy concerns have been raised after an Indian city rolled out plans for police to install cameras that use artificial intelligence to “read” the expressions of women in distress and alert a nearby police station.

As facial recognition technology gains a foothold in the country, activists are questioning the police’s surveillance plans in Lucknow, the capital city of Uttar Pradesh state.

“Equipped with artificial intelligence, the cameras will perform multiple actions simultaneously — capturing data, analyzing it and sending alerts when required,” said Lucknow police commissioner D.K. Thakur.

Under the plan, smart cameras are to be installed in 200 “hotspots” frequented by women. If a woman is being harassed in a public place, an alert will reach police sooner than if she were to dial the emergency number 112 for help, said Thakur, who unveiled the plan to “protect women” at a workshop at Lucknow University on Jan. 20.

The “protectionist” approach toward women, instead of an “empowering” one, is the major reason for the failure of most initiatives, some activists say.

“The reaction of the state to any kind of harassment against women in public spaces is to control women’s movements further,” said Ambika Tandon, a senior policy officer at the Centre for Internet and Society, a Bangalore-based nonprofit research organization that works on policy issues relating to freedom of expression and privacy, among other matters.

Such policing systems are designed with the perception of a “good victim” — one who is out in public doing things that the police approve of, said Tandon.

“There are specific groups of women who might feel uncomfortable on being monitored this way, like sex workers, daily wage workers, homeless women, women beggars, or any other woman who does not have fair access to the justice system,” she said.

“Right from the policymakers to the cops — the majority of the people involved in the project are men. This idea of a male gaze on women to ensure their safety in public spaces is counterproductive and furthers patriarchy. Also, monitoring women’s bodies and movements in public places is a violation of their right to privacy,” she said.

The surveillance mechanism is being developed as part of the Home Ministry’s Safe City project, rolled out in 2018. Its aim, according to the project’s website, is to “create a safe, secure and empowering environment for women in public places to enable them to pursue all opportunities without the threat of gender-based violence and/or harassment.” In addition to Lucknow, it is to be launched in Mumbai, Delhi, Kolkata, Chennai, Ahmedabad, Bengaluru and Hyderabad.

Lucknow, the only city in Uttar Pradesh to be selected for the project, has been approved for INR 194.44 crore ($26.6 million) to implement it. Of the 405,861 crimes reported against women in India in 2019, Uttar Pradesh recorded the most, with 59,853, according to the National Crime Records Bureau’s Crime in India 2019 report. The state also had the highest number of crimes against girls under the Protection of Children from Sexual Offences Act, with 7,444 cases, followed by Maharashtra with 6,402 cases.

The Uttar Pradesh government is not the first to bring in surveillance measures perceived to invade people’s privacy. Madhya Pradesh Chief Minister Shivraj Singh Chouhan said on Jan. 13 that his government was setting up a registration system to keep track of youth traveling outside their district or the state for work.

“The state will have all information about the medium through which our sons and daughters go outside, and the nature of work for which they are being taken. The youth, in turn, will also have information on whom they can get in touch with first if they are in distress,” a statement from the chief minister’s office said.

Technology experts say a system that reads emotions from facial expressions isn’t feasible.

“I know India is a country where the fastest reaction to injustice is to kick the victim, especially if the victim is a woman, but this thing is so remarkably Kafkaesque that it’s funny,” tweeted Anupam Guha, an AI researcher and assistant professor at the Indian Institute of Technology in Bombay.

“First, facial expressions say nothing about the internal mental state of humans. Second, machine learning requiring facial data violates constitutional rights, and third, machine learning is non-deterministic,” said Guha.

A comprehensive review of studies in 2019 found that current emotion recognition systems are largely inaccurate. It said that though common or “prototypical” facial expressions might exist, the idea of “fingerprinting” an emotion through expression was unreliable.

The study showed that people often scowl when they are not angry, and when they are angry, they scowl less than 30 percent of the time.

Although facial recognition systems have been around for over a decade, privacy advocates today fear that relying on such technology could lead to wrong or biased judgments.

“While I can imagine that there are some genuinely useful use-cases, the privacy implications stemming from emotional surveillance, facial recognition and facial profiling are unprecedented,” Frederike Kaltheuner of Privacy International told the BBC in an interview.

From tracking “suspicious people” at railway stations and on roads to hunting down protesters, the Indian government has lately been deploying facial recognition technology for a range of purposes.

The government used facial recognition technology to identify 1,922 protesters during rallies against the controversial Citizenship (Amendment) Act, 2019.

The technology found its latest use in the ongoing farmers’ protests that took a violent turn in New Delhi on Jan. 26. Delhi Police Commissioner S. N. Srivastava has said that facial recognition techniques and surveillance cameras will be used to identify “culprits.”

Following nationwide protests against the gang-rape of a young woman in New Delhi in 2012, the central government set up the Nirbhaya Fund to ensure the safety and security of women. The fund, which aims to help deter crime and strengthen surveillance networks, is administered by the Department of Economic Affairs in the Ministry of Finance.

A 2019 survey by the Centre for Internet and Society, which asked about people’s perceptions of additional surveillance cameras installed in New Delhi under the Nirbhaya Fund, found that most respondents considered the cameras “useless” because monitoring was irregular.

“Some even shared their distrust in the police and felt the cameras could be misused by voyeuristic cops,” said Tandon.

“There is a lack of clarity on how the Lucknow police would be storing or processing the surveillance data from the smart cameras. At a time when the country is actively trying to push for facial recognition systems based on artificial intelligence or machine learning, how do citizens know if the captured surveillance data is not being fed into larger datasets?” she said.

The National Crime Records Bureau has been working for a year on the National Automated Facial Recognition System, considered among the world’s largest such systems. With an estimated budget of INR 308 crore ($42 million), the project aims to gather existing data and create a national database of photographs that will be used to identify criminals.

(Edited by Uttaran Dasgupta and Judith Isacoff)