
Editorial: Eyes wide open

There is an urgent need for a comprehensive law defining the fair use of facial recognition technology so that transgressors can be held accountable

The Editorial Board Published 02.09.21, 12:59 AM
Representational image. Shutterstock

Big Brother is watching. Digital surveillance technology has been used in India in a limited way for some time. But the evolution of the tools of surveillance and their gradual incursion into public space are causing growing concern about privacy. Consider the deployment of facial recognition technology. A number of State-owned agencies are now integrating FRT with closed-circuit cameras to identify citizens in real time, purportedly to track criminals, aid law enforcement and facilitate business. These range from the Airports Authority of India, which is currently testing the technology in Varanasi, to educational institutions: at least a dozen government schools in Delhi were reported to be using FRT. The profusion of FRT in a country that is yet to enact adequate data protection laws, despite a threefold increase in cybercrime in 2020, merits introspection. The personal data protection bill is yet to be passed by Parliament, and the Information Technology Act, 2000 is no longer adequate to address the newer challenges posed by state-of-the-art technology.

The lack of proper regulation governing the collection, storage and use of FRT data is problematic for several reasons. First, consent is often not respected: the National Thermal Power Corporation, a State-owned agency, reportedly disregarded employee consent while implementing FRT to record attendance. Second, given the poor state of digital literacy in India, the very idea of ‘informed consent’ is doubtful: most people are unaware of the ramifications of having their personal and biometric data recorded, or of when their right to privacy has been violated. Ethical concerns persist even when FRT is used by law enforcement agencies. Last year, the Delhi police, which is directly under the control of the Union home ministry, used facial recognition to profile 1,100 ‘rioters’; incidentally, the Delhi High Court has expressed concern about the nature of that investigation. Furthermore, in an age of increasing spyware attacks and data harvesting, there is the additional risk of collated personal data falling into the wrong hands, threatening individual and even national security. There is, then, an urgent need for a comprehensive law defining the fair use of FRT so that transgressors can be held accountable. Given the technical nature of the problem, this can only be realized through collaboration between policymakers and experts in the field.
