


FACIAL RECOGNITION TECHNOLOGY


 
Source: The Hindu 
 

Context

Right to Information (RTI) responses received by the Internet Freedom Foundation, a New Delhi-based digital rights organisation, reveal that the Delhi Police treats matches of above 80 per cent similarity generated by its facial recognition technology (FRT) system as positive results.
 

Why is the Delhi Police using facial recognition technology?

  • The Delhi Police first obtained FRT to trace and identify missing children.
  • According to RTI responses received from the Delhi Police, the procurement was authorised as per a 2018 direction of the Delhi High Court in Sadhan Haldar vs NCT of Delhi.
  • The Delhi Police submitted in the Delhi High Court that the accuracy of the technology procured by them was only 2 per cent, which was not good.
  • Things took a turn after multiple reports came out that the Delhi Police was using FRT to surveil the Anti-CAA protests in 2019.
 

RTI Response in 2020 

  • In 2020, the Delhi Police stated in an RTI response that although they had obtained FRT as per the Sadhan Haldar direction, which related specifically to finding missing children, they were using FRT for police investigations.
  • The widening of the purpose for FRT use demonstrates an instance of function creep wherein a technology or system gradually widens its scope from its original purpose to encompass and fulfil wider functions.
  • As per available information, the Delhi Police has consequently used FRT for investigation purposes and also specifically during the 2020 northeast Delhi riots, the 2021 Red Fort violence and the 2022 Jahangirpuri riots.
 
 

Facial recognition

Facial recognition is an algorithm-based technology which creates a digital map of the face by identifying and mapping an individual's facial features, which it then matches against the database to which it has access.
It can be used for two purposes:
 
 1:1 verification of identity
  • In 1:1 verification, the facial map is obtained in order to match it against the person's photograph on a database and authenticate their identity.
  • For example, 1:1 verification is used to unlock phones.
  • However, it is increasingly being used to provide access to benefits and government schemes.
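The 1:1 flow described above can be sketched as a simple similarity check against a stored photograph. This is a minimal illustration, assuming faces have already been converted into fixed-length feature vectors ("facial maps") by some embedding model; the function names, toy vectors and the 0.8 threshold are hypothetical, not any vendor's actual system:

```python
# Minimal sketch of 1:1 face verification. Assumes faces are already
# reduced to fixed-length feature vectors by a hypothetical embedding
# model; the vectors and the 0.8 threshold are illustrative only.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def verify(live_map, enrolled_map, threshold=0.8):
    """1:1 verification: does the live capture match the enrolled photo?"""
    return cosine_similarity(live_map, enrolled_map) >= threshold

# Toy feature vectors standing in for real embeddings
enrolled = [0.9, 0.1, 0.4]
live_same = [0.85, 0.15, 0.38]   # same person, slight variation
live_other = [0.1, 0.9, 0.2]     # a different person

print(verify(live_same, enrolled))   # True
print(verify(live_other, enrolled))  # False
```

A phone-unlock feature is essentially this comparison run against a single enrolled template, which is why 1:1 systems need no central database of other people's faces.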
 
1:n identification
  • In 1:n identification, the facial map is obtained from a photograph or video and then matched against the entire database to identify the person in the photograph or video.
  • Law enforcement agencies such as the Delhi Police usually procure FRT for 1:n identification.
  • For 1:n identification, FRT generates a probability or a match score between the suspect who is to be identified and the available database of identified criminals.
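The 1:n process can be sketched as scoring the probe face against every database entry and ranking the candidates by match score, with the final selection left to a human analyst. The similarity measure and toy database below are illustrative assumptions, not the Delhi Police's actual system:

```python
# Sketch of 1:n identification: score a probe "facial map" against every
# entry in a database of identified persons and return candidates ranked
# by match score. Real FRT systems use learned embeddings and proprietary
# scoring; the data and similarity measure here are illustrative.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def identify(probe, database):
    """Return (name, score) pairs sorted from best to worst match.
    A human analyst would review this list and pick the final match."""
    scores = [(name, cosine_similarity(probe, vec))
              for name, vec in database.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

database = {                     # toy database of identified persons
    "person_a": [0.9, 0.1, 0.4],
    "person_b": [0.1, 0.9, 0.2],
    "person_c": [0.5, 0.5, 0.5],
}
probe = [0.85, 0.15, 0.38]       # facial map extracted from, say, CCTV footage

for name, score in identify(probe, database):
    print(f"{name}: {score:.2f}")
```

Note that this always produces a ranked list, even when the true person is absent from the database entirely, which is one reason raw match scores cannot be read as proof of identity.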

 

Spread of FRT in India

  • A list of possible matches is generated based on their likelihood to be the correct match with corresponding match scores.
  • Ultimately it is a human analyst who selects the final probable match from the list of matches generated by FRT.
  • According to the Internet Freedom Foundation's Project Panoptic, which tracks the spread of FRT in India, there are at least 124 government-authorised FRT projects in the country.
 

Issues with the use of FRT


India has seen the rapid deployment of FRT in recent years by both the Union and State governments, without any law in place to regulate its use.
The use of FRT presents two issues:
  1. Issues related to misidentification due to inaccuracy of the technology and
  2. issues related to mass surveillance due to misuse of the technology.
  • Extensive research into the technology has revealed that its accuracy rates fall starkly based on race and gender.
  • This can result in a false positive, where a person is misidentified as someone else, or a false negative, where a person is not verified as themselves.
  • Cases of a false positive result can lead to bias against the individual who has been misidentified.
 

Failure of facial recognition 

  • In 2018, the American Civil Liberties Union revealed that Amazon's facial recognition technology, Rekognition, incorrectly identified 28 members of Congress as people who had been arrested for a crime.
  • Of the 28, a disproportionate number were people of colour.
  • Also in 2018, researchers Joy Buolamwini and Timnit Gebru found that facial recognition systems had higher error rates while identifying women and people of colour, with the error rate being the highest while identifying women of colour.
  • The use of this technology by law enforcement authorities has already led to three people in the U.S. being wrongfully arrested.
 

Negative Results

  • On the other hand, cases of false negative results can lead to the exclusion of individuals from accessing essential schemes which may use FRT as a means of providing access.
  • One example of such exclusion is the failure of the biometric-based authentication under Aadhaar which has led to many people being excluded from receiving essential government services which in turn has led to starvation deaths.
 

Misuse

  • Even if accurate, this technology can result in irreversible harm as it can be used as a tool to facilitate state-sponsored mass surveillance.
  • At present, India does not have a data protection law or an FRT-specific regulation to protect against misuse.
  • In such a legal vacuum, there are no safeguards to ensure that authorities use FRT only for the purposes for which they have been authorised.
  • This enables the constant surveillance of individuals, resulting in the violation of their fundamental right to privacy.
     

2022 RTI Responses

  • The RTI responses dated July 25, 2022, were shared by the Delhi Police after the Internet Freedom Foundation filed an appeal before the Central Information Commission for obtaining the information after being denied multiple times by the Delhi Police.
  • In their response, the Delhi Police has revealed that matches above 80 per cent similarity are treated as positive results while matches below 80 per cent similarity are treated as false positive results which require additional corroborative evidence.
  • It is unclear why 80 per cent has been chosen as the threshold between positive and false positive.
  • There is no justification provided to support the Delhi Police's assertion that an above 80 per cent match is sufficient to assume the results are correct.
  • Secondly, the categorisation of below 80 per cent results as false positive instead of negative shows that the Delhi Police may still further investigate below 80 per cent results.
  • Thus, people who share similar facial features, such as in extended families or communities, could end up being targeted.
  • This could result in targeting communities that have been historically overpoliced and have faced discrimination at the hands of law enforcement authorities.
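The reported rule (treat a match at or above 80 per cent as positive, and a match below 80 per cent as a "false positive" still open to further investigation) can be sketched as a classification function. The threshold semantics follow the RTI responses as described above, but the function, labels and borderline scores are this sketch's own illustration:

```python
# Sketch of the thresholding policy described in the RTI responses:
# scores >= 80% are treated as positive results, while scores below 80%
# are labelled "false positive" but may still be pursued with
# corroborative evidence. Labels and example scores are illustrative.

def classify_match(score, threshold=0.80):
    if score >= threshold:
        return "positive"        # treated as a confirmed match
    return "false positive"      # still investigable with corroboration

# An arbitrary cut-off flips the outcome for near-identical scores:
print(classify_match(0.81))  # positive
print(classify_match(0.79))  # false positive
```

The two borderline scores differ by only two percentage points, yet land in different categories, which is precisely why the unexplained choice of 80 per cent as the dividing line matters.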
 

Criminal Procedure (Identification) Act, 2022

  • The responses also mention that the Delhi Police are matching the photographs and videos against photographs collected under Sections 3 and 4 of the Identification of Prisoners Act, 1920, which has now been replaced by the Criminal Procedure (Identification) Act, 2022.
  • This Act allows for wider categories of data to be collected from a wider section of people, i.e., "convicts and other persons for identification and investigation in criminal matters."
  • It is feared that the Act will lead to an overbroad collection of personal data in violation of internationally recognised best practices for the collection and processing of data.
  • This revelation raises multiple concerns as the use of facial recognition can lead to wrongful arrests and mass surveillance resulting in privacy violations.
 

Conclusion 

Delhi is not the only city where such surveillance is ongoing. Multiple cities, including Kolkata, Bengaluru, Hyderabad, Ahmedabad and Lucknow are rolling out "Safe City" Programmes which implement surveillance infrastructures to reduce gender-based violence, in the absence of any regulatory legal frameworks which would act as safeguards.


