Face value: New surveillance
A report by the Internet Freedom Foundation's Project Panoptic, which tracks facial recognition technology in the country, estimates that 32 such systems are being installed across India. This is worrying in a country where, in spite of privacy being declared a fundamental right by the Supreme Court, the Union home minister declared in Parliament that the Delhi police had tapped into driving licence and voter identity databases to apprehend 1,900 ‘rioters’ after the Delhi riots. On paper, the Automated Facial Recognition System proposed by the home ministry is supposed to be used for “criminal identification, verification and its dissemination among various police organisations”. In reality, for the AFRS to be effective, biometric facial data from all individuals — not only the targets of surveillance or those suspected of criminal activity — would need to be collated. Worryingly, the accuracy rates of facial recognition algorithms are particularly low in the case of minorities, women and children, as multiple studies across the world have demonstrated. The use of such technology in a system where vulnerable groups are over-represented — more than half of all convicts and undertrials in Indian prisons are Muslims, Dalits or adivasis — risks replicating human biases.
What is worse, such surveillance is being contemplated without the checks and balances that a comprehensive data protection law would offer. Strikingly, the proposed personal data protection bill — yet to be passed by Parliament — gives significant leeway to State agencies. While the bill envisages alerting users when an attempt has been made to access their data from, say, a facial recognition database, the government has been kept out of the purview of this provision. Even if the bill is amended to safeguard citizens against such loopholes, implementation cannot be guaranteed. The Justice Srikrishna panel, which was constituted to look into data privacy, had pointed out that in spite of legal checks against phone-tapping, a review committee has to go through as many as 15,000-18,000 interception orders at every meeting. Significantly, such privacy concerns are being aired in other parts of the world as well. The European Commission is considering a five-year moratorium on the use of facial recognition technologies in the European Union; in the United States of America, municipalities have passed, or are considering, similar prohibitions. In India, where public awareness of digital depredation is minimal, such a system could multiply privacy violations and foster a pliant society. Deliberations on facial recognition tools must strike a balance between security concerns on the one hand and privacy, liberty and freedom on the other.