AFRS: The Hawk Eye

January 10, 2020 | Expert Insights

Background 

The Automated Facial Recognition System (AFRS) uses a network of cameras digitally linked to a large central database of photos and videos, in which AI, typically in the form of neural networks, compares faces against the stored images and finds a match to establish identity.
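In broad terms, such matching works by converting each face image into a numerical "embedding" with a neural network and comparing a probe face against the enrolled gallery with a similarity score. The sketch below is a minimal, hypothetical illustration in Python; the embeddings, identities and threshold are assumed for illustration and do not describe any particular vendor's system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (closer to 1.0 means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """Return the best-matching identity in the gallery, or None if no score
    clears the decision threshold. Gallery maps identity -> stored embedding."""
    best_id, best_score = None, -1.0
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical example: 128-dimensional embeddings that would normally be produced
# by a face-recognition network; random vectors stand in for real ones here.
rng = np.random.default_rng(0)
gallery = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
probe = gallery["person_A"] + rng.normal(scale=0.1, size=128)  # noisy re-capture of person_A
print(identify(probe, gallery))
```

The decision threshold is the key operational knob: set it low and the system flags more people wrongly; set it high and it misses genuine matches.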

In June 2019, India's National Crime Records Bureau (NCRB) floated a tender to procure one of the world's largest automated facial recognition systems.

Analysis 

While AFRS will strengthen the fight against crime, there are concerns that the system may be misused if stringent regulations are not put in place.

Currently, the leaders in facial recognition technology are Amazon, Lambda Labs, Microsoft, Google and IBM. There is no Indian company in this field, while China, with a host of start-ups like Megvii and AI unicorns like CloudWalk, SenseTime and Yitu, has made facial recognition commonplace.

In India, AFRS will be integrated with existing digital systems such as the NCRB-managed Crime and Criminal Tracking Network & Systems (CCTNS), the Integrated Criminal Justice System (ICJS), the Immigration, Visa and Foreigners Registration & Tracking (IVFRT) system, the 'Khoya Paya' portal for missing children and other systems in use at the state and central levels.

It will be available as a mobile and web application hosted in NCRB’s Data Centre in Delhi.

AFRS will be used from the national level down to the police-station level. It is claimed that it will greatly improve identification and verification in criminal investigations by facilitating easy recording, analysis, retrieval and sharing of information between different organisations.

Counterpoint

AFRS is still far from perfect, and users have reported many flaws in the technology itself. Experts say that it can be deceived by wearing dark, shiny sunglasses. In fact, Japan's National Institute of Informatics has developed glasses fitted with infra-red LEDs that foil facial recognition. Such deceptions can either lead to mistaken identity or provide a loophole for terrorists and criminals to exploit.

The system used in Delhi to identify missing children has itself shown an accuracy of below one per cent and was grossly inefficient at distinguishing between boys and girls. A report on the Metropolitan Police's trials in the UK put the inaccuracy rate at 98 per cent. The accuracy of these systems is said to fall to as little as two per cent, particularly in the case of minorities, women and children.
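Figures of this kind are often explained, at least in part, as a base-rate effect: when a system scans large crowds in which genuine watchlist members are very rare, even a small false-match rate per face generates far more false alerts than true ones. A rough, illustrative calculation with assumed numbers (the rates below are hypothetical and are not taken from the Metropolitan Police's reports):

```python
# Hypothetical illustration of the base-rate effect in crowd scanning.
faces_scanned = 100_000            # assumed number of faces seen in a deployment
watchlist_prevalence = 1 / 10_000  # assumed fraction of passers-by on the watchlist
true_match_rate = 0.90             # assumed chance a watchlisted face is flagged
false_match_rate = 0.001           # assumed chance an innocent face is flagged

targets = faces_scanned * watchlist_prevalence
true_alerts = targets * true_match_rate                      # ~9
false_alerts = (faces_scanned - targets) * false_match_rate  # ~100

precision = true_alerts / (true_alerts + false_alerts)
print(f"True alerts: {true_alerts:.0f}, false alerts: {false_alerts:.0f}")
print(f"Share of alerts that are wrong: {1 - precision:.0%}")  # roughly 92% here
```

Under these assumed numbers, more than nine out of ten alerts would point at the wrong person, which is why per-comparison accuracy claims say little about how the system performs on a crowded street.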

The output of a machine-learning system reflects the way it is trained. With AFRS, building an accurate and comprehensive database requires recording, classifying and querying individuals, often in situations where they are not aware they are being identified. For example, each time a person walks in front of a CCTV camera, fresh data is collected and recorded.

These systems make some people more vulnerable than others, with performance varying by factors such as ethnicity, physical characteristics and gender. The MIT Media Lab has found that accuracy rates are extremely high if the subject is a white male and far lower for people of colour.

As reported in the media, police in many countries have not been very satisfied with AFRS. The London police are under pressure to end its use owing to discriminatory outcomes and the system's inherent inefficiencies. Similarly, San Francisco, Oakland (California) and Somerville (Massachusetts) have banned its use by their police departments.

Human rights activists and legal experts in India have voiced their concerns. Pavan Duggal, a top cyber law expert, has been quoted as saying that "the first casualty of the absence of a regulatory framework for facial recognition technology is people's right to privacy." Vidushi Marda, a human rights campaigner, has gone on record stating that real-time facial recognition, if combined with the world's largest biometric database, Aadhaar, could create the "perfect Orwellian state".

Assessment 

  • While modernisation of Indian law enforcement agencies is a crying need of the times, we must be circumspect about the kind of technology we entrust them with. With all its present shortcomings, AFRS can, at best, be a deterrent against crime.
  • The Supreme Court of India has upheld the 'right to privacy' as a fundamental right. Legal experts need to study AFRS to determine its legality.
  • Caution is advised so that we do not run the risk of becoming a surveillance state like China, which uses facial recognition ruthlessly across a wide spectrum to keep a sharp eye on its restive population.
  • The JAM trinity (the Jan Dhan-Aadhaar-Mobile combination) has created in India the largest data bank of citizens in the world, and AFRS will add its own flood of data to it. All this data is extremely vulnerable given our weak digital infrastructure and would require the most stringent protocols and systems to secure.