We might be in the market for a new kind of face mask.
Highlights
- Background: On June 28, 2019, the National Crime Records Bureau (NCRB) invited bids from reputed turnkey solution providers for the implementation of a centralised Automated Facial Recognition System (AFRS).
- Need for established safeguards: The proposal to introduce facial recognition into society comes with a variety of concerns that need to be addressed first. We have written a legal notice to the NCRB, along with a covering letter and a copy of the notice to the Home Minister, Shri Amit Shah, and the Home Secretary, Shri Rajiv Gauba, highlighting the features and scope of the AFRS and the unfathomable detriment it could bring to Indians if implemented.
Know your enemy
It's important to understand the characteristics the Request for Proposals (RFP) requires the system to possess, in order to understand how problematic it will be. Our legal notice highlights these features, which are covered here in brief to provide basic context.
- Function: The AFRS is intended to be a repository of all crime- and criminal-related facial data and should be able to identify or verify a person from a variety of inputs ranging from images to videos (a purely illustrative sketch of the difference between verification and identification follows this list).
- Integration: The system should be able to be integrated with various other databases such as ICJS, CCTNS, IVFRT, state police integration software in existence, or any others. The integration does not stop there: the RFP requires the system to be compatible with other biometric solutions such as Iris and AFIS, but doesn't specify what these databases are. Might we have yet another Aadhaar worry to add to our frown lines?
- Identification: As per the requirements of the RFP, the system should not only be able to match images from a variety of databases, but should also be equipped to capture images from CCTV footage and public or private video feeds. As concerning as this sounds, there's more. It should also be able to tag images uploaded from newspapers, raids, sketches etc. with identifiers based on sex, age, scars and tattoos; consider landmarks, features and contours in identifying individuals; and account for plastic surgery and make-up to maintain accuracy in identification.
- Technical requirements: The aspects that stuck out in the RFP largely relate to the need for the system to be compatible with biometric solutions such as Iris and fingerprint (AFIS) identification systems. It also requires that there be security in storage, user access and authentication.
- Security requirements: The bidder is largely responsible for additional measures in maintaining the integrity, confidentiality and availability of data that will be stored, apart from the established ISO standard prescribed in the RFP.
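The RFP does not explain how matching would work under the hood, but to make the "identify or verify" language concrete, here is a minimal, purely illustrative Python sketch of the two operations as they are commonly framed in facial recognition literature: 1:1 verification against a claimed identity and 1:N identification against an entire gallery. The threshold, embedding size and record names below are all assumptions made for illustration, not anything specified by the NCRB.

```python
# Purely illustrative sketch: 1:1 verification vs 1:N identification.
# All values (embedding size, threshold, records) are assumptions for illustration;
# the actual AFRS design and accuracy are not public.
import numpy as np

THRESHOLD = 0.6  # hypothetical distance cut-off for declaring a "match"

def distance(a, b):
    """Euclidean distance between two face embeddings."""
    return float(np.linalg.norm(a - b))

def verify(probe, claimed):
    """1:1 verification: is the probe the same person as the claimed identity?"""
    return distance(probe, claimed) < THRESHOLD

def identify(probe, gallery):
    """1:N identification: search the whole gallery for the closest record."""
    best_id = min(gallery, key=lambda person_id: distance(probe, gallery[person_id]))
    return best_id if distance(probe, gallery[best_id]) < THRESHOLD else None

# Toy gallery of random 128-dimensional "embeddings"
rng = np.random.default_rng(0)
gallery = {f"record-{i}": rng.normal(size=128) for i in range(5)}
probe = gallery["record-3"] + rng.normal(scale=0.01, size=128)  # noisy re-capture
print(identify(probe, gallery))  # -> "record-3"
```

Verification answers "is this person who they claim to be?"; identification trawls the entire repository for anyone who falls under the threshold, which is exactly where the scale of the databases listed above becomes a concern.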
Here's why we aren't quite ready for sci-fi
We take a closer look at these not-so-frightening features and identify concerns with the technology itself.
- Absence of legality: The requirement for a facial recognition system stems neither from a statutory basis nor from the executive power of the Government; it clearly lacks any sort of legal backing. This is in addition to the privacy it will so flagrantly undermine, as it fails to fulfil any of the elements laid down in Puttaswamy v. Union of India (2017 (10) SCALE 1) for permitting an infringement of privacy.
- Manifest arbitrariness: Beginning to picture the massive invasion of privacy the AFRS could bring about? Especially with its all-seeing eye looking not only at databases upon databases of images but also at strategically located CCTV footage, images of newspaper clippings, raids, sketches etc., all most likely without your knowledge. This kind of en-masse surveillance is bound to bring about a high degree of damage. Studies by MIT and Georgetown, and trials conducted by the London Metropolitan Police, acknowledge that the pervasive biases that currently exist within our societies are likely to be mimicked by the algorithms within these systems. Misidentification and discriminatory profiling are the results we're looking at if these systems are implemented (a back-of-the-envelope sketch after this list illustrates how quickly such errors add up at scale). Apparently, the already existing discriminatory practices perpetuated by human beings are no longer enough; we must look to AI to continue our dirty work.
- Absence of safeguards and accountability: In light of the above, there are currently no legal restrictions or limitations on this technology to ensure its proportional use or afford protection to those it interacts with. Add facial recognition to the ongoing debate on CCTVs and we have ourselves a full-fledged mix of India and China that is no longer restricted to Indo-Chinese cuisine.
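To get a sense of why misidentification at this scale is not a hypothetical worry, consider a back-of-the-envelope calculation. Every number below is an assumption made purely for illustration (the AFRS's actual error rates and database size are unknown): even with a very optimistic false-match rate per comparison, a 1:N search against a repository of millions of records surfaces innocent people as "matches" every single day.

```python
# Back-of-the-envelope: how 1:N search magnifies even tiny error rates.
# Every figure here is an illustrative assumption, not an AFRS specification.
false_match_rate = 1e-7      # assumed chance one comparison wrongly "matches" (very optimistic)
gallery_size = 10_000_000    # assumed number of records in the facial-data repository
searches_per_day = 1_000     # assumed daily probe images (CCTV stills, sketches, etc.)

# A 1:N search compares the probe against every record, so on average:
false_candidates_per_search = false_match_rate * gallery_size
false_candidates_per_day = false_candidates_per_search * searches_per_day

print(f"Expected innocent 'matches' per search: {false_candidates_per_search:.1f}")
print(f"Expected innocent 'matches' per day:    {false_candidates_per_day:,.0f}")
```

And this assumes the errors are spread evenly; the MIT and Georgetown studies cited above suggest they are not, falling disproportionately on groups that are already marginalised.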
Considering the trajectory India appears to be on with mass surveillance and the technological perpetuation of discrimination, a scarier version of the Orwellian dystopia seems to be right up our alley. We urge the NCRB to take a step back and recall the bidding process for the Automated Facial Recognition System until adequate safeguards that address these concerns are put into place. This request comes with the caveat that failure to do so may lead us to seek remedies in accordance with the law.
Important Documents:
- Request for Proposals by the National Crime Records Bureau for Automated Facial Recognition Systems [link]
- Legal notice to the National Crime Records Bureau [link]
- Covering letter to the Ministry of Home Affairs and the Home Secretary [link]
All your Black Mirror nightmares coming true? Support us as we fight to restrict dystopias to TV shows and books. Become an IFF member today!