#PrivacyofthePeople: The boom of facial recognition technology in private spaces

Anushka Jain



In the last #PrivacyofthePeople post, we looked at the scope of Section 91 notices, how they may impact your right to privacy, and what remedies you have under law. In this post, we examine the ongoing use of facial recognition technology by private actors such as commercial stores and private college campuses, among others.

Why should you care?

The use of facial recognition technology (FRT) in India is increasing rapidly. According to our Panoptic tracker, there are at least 124 ongoing FRT projects being developed by various government authorities in India. The use of this technology can lead to irreversible harms, such as bias and exclusion, while also contributing to a normalisation of surveillance. Furthermore, the use of this technology is taking place in a legal vacuum, as there is no data protection law or FRT-specific legislation which would regulate the use of FRT in India. In such a vacuum, the use of FRT by private actors and the corresponding data collection could lead to grave privacy violations for all who are subjected to it.

Why is the use of FRT by private actors harmful?

Have you ever had tea at Chaayos? Or enrolled yourself in a fitness programme at CultFit? A lot of us may have, but most of us do not know that both these businesses claim to use FRT on their customers, for reasons such as creating 'Loyaltea' programmes and “minimizing attendance fraud”. FRT is also being used increasingly by private universities, such as Amity University, Noida, to “enhance the security system” on campus.


It is important to understand why the use of facial recognition for providing convenience may be harmful in the long run. While some people may be worried about their privacy and refuse to participate in a facial recognition system, others may view it as a small price to pay for the convenience obtained. Joshua A.T. Fairfield and Christoph Engel, in their paper “Privacy As A Public Good”, deconstruct this assumption by framing privacy as a public good and the lack of privacy as a public bad. They suggest that while each individual may be unconcerned about the anticipated damage to themselves from their own information disclosure, if they were to also assess the anticipated damage from the disclosures made by others, they would see that the potential for damage is much higher. As Danielle Keats Citron and Daniel J. Solove put it in the context of such aggregation of small harms, “(f)rom the standpoint of each individual, the harm is minor, but from the standpoint of society, where the harm to everyone is aggregated, the total amount of harm is quite substantial”.

Thus, the use of FRT by these private actors is harmful. Further, there is no law to regulate how the technology is being used and how the collected data is being processed. This allows private actors to operate as they see fit and leaves unsuspecting people with little to no recourse in situations where their data is misused. The only existing regulations are the “Reasonable security practices and procedures and sensitive personal data or information Rules, 2011” (“Rules”) under S. 43A of the Information Technology Act, 2000 (“IT Act”). S. 43A provides for compensation by private actors for failure to protect data, and the Rules lay down certain obligations on them, such as providing a privacy policy, obtaining consent in writing, and having reasonable security practices and procedures. However, concerns have been raised about their implementation, since they apply only when there is “wrongful loss or wrongful gain” to a person and there is no dedicated regulatory authority to enforce them.

Withdrawal of the DPB, 2021 leaves Indians’ privacy in the lurch

The Data Protection Bill, 2021 was withdrawn by the Minister for Communications and Information Technology, Ashwini Vaishnaw, on August 3, 2022. The withdrawal of the draft bill marks the unsatisfactory end of a long and arduous consultation and review process for the legislation. While the 2021 version was certainly not perfect, we are concerned that this withdrawal has now brought us closer to where we started in 2018 instead of where we should be in 2022 (Read our brief on the issues with the DPB, 2021 here).

What are other countries doing?

Private actors in countries such as the UK and Australia have also started deploying facial recognition technology. However, they are facing considerable backlash. In the UK, the Southern Co-op supermarket chain has come under fire for using FaceWatch, a facial recognition security system. A complaint has also been filed against the chain with the Information Commissioner by the privacy group Big Brother Watch. In their complaint, they have called the use of facial recognition “unlawful” and “Orwellian in the extreme”.

Similarly, in Australia, the home improvement chain Bunnings and big-box retailer Kmart are under investigation by the Office of the Australian Information Commissioner (OAIC) for their personal data handling practices, after the consumer group CHOICE released a report revealing that they were using facial recognition. Amid the investigation, Bunnings and Kmart have halted their use of the technology.

Our recommendations

It is essential that a data protection framework be put in place which would enforce best practices such as purpose limitation, data minimisation, and storage limitation when it comes to how private actors process the data they collect. In the “reasons for withdrawal” shared with other Members of Parliament, including those who were part of the Joint Parliamentary Committee on the Personal Data Protection Bill, 2019 (JPC), Vaishnaw stated that the Data Protection Bill, 2021 was withdrawn to make way for a comprehensive legal framework for the digital ecosystem. In an interview with ET Tech dated August 4, 2022, Vaishnaw added that this comprehensive legal framework is almost ready and will soon be made available for public consultation.

The existing legal vacuum on data protection portends an Orwellian state and is clearly an infringement of the fundamental right to privacy. We hope that the government has taken previous deliberations into account while drafting the new comprehensive legal framework. Further, we hope that this legal framework will be accompanied by a white paper listing the issues it considers, will be reviewed by an independent and diverse group of experts, and will be put to extensive public consultation, to ensure that protection of the rights of Indian citizens is the cornerstone on which this new legal framework is built.

Important documents

  1. Here lies the Data Protection Bill, 2021 dated August 4, 2022 (link)
