
When the mind’s on a leash

The manipulation of brain activity has also been around for some time now and has been ramped up with Artificial Intelligence-based ‘interventions’

Pramod K. Nayar Published 17.04.24, 07:10 AM

Representational image. Sourced by the Telegraph

Neurotechnologies are expanding at a dizzying pace. Brain-based lie detection, devices that assess moods and emotions, and the recognition of memories, personal knowledge and sexual or political inclinations have already been deployed in several nations with varying degrees of success. The manipulation of brain activity has also been around for some time and has been ramped up with AI-based ‘interventions’.

Neurointerventions are not just the examination and alteration of brain physiology; they can have profound effects on personhood. Alterations to the brain’s functioning could produce entirely or partially new persons. If, at some point in the future, law enforcement includes procedures to ‘correct’ criminal behaviour by modifying brains, then we are looking at a sci-fi scenario wherein a person could be remade.

This kind of situation opens up the field of neurorights, which has two key components. The first is the right to bodily integrity (enshrined in Article 3 of the Charter of Fundamental Rights of the European Union, the Convention on the Rights of Persons with Disabilities, and elsewhere), whereby a person’s body cannot be violated without her/his express consent. The second is the right to mental integrity, whereby a person’s mind cannot be violated without her/his express consent. The first prohibits violations of the physical/corporeal person, an injunction visible in the world’s first legislation on the use of AI, promulgated by the European Union and to be enforced from 2026. The second prohibits violations of the mind (not just the brain). Taken together, these rights call for respecting the bodily and mental autonomy of a person unless she/he is willing to have these modified (for example, those who take recreational drugs or voluntarily choose implants/prostheses are using their autonomy to enable corporeal and mental modification, permanent or temporary).

The view that an individual is an autonomous being and that this autonomy constitutes personhood informs the world’s human rights discourses initiated by the Universal Declaration of Human Rights. But neurotechnologies, human rights scholars fear, could disrupt this autonomy.

A ‘person’ is made up of biases, desires, political ideas, religious beliefs, fears and rational or irrational thoughts; the human being is the sum total of these. The person is, more often than not, a bundle of contradictions. This is why we are suspicious of automata, or even of humans so single-minded as to appear inhuman. There is concern that if contradictions were regularised, moods managed, and desires monitored, we would be a step away from overwhelming technological surveillance. Such surveillance, as George Orwell demonstrated, is itself a prelude to interventions in the political thought processes of an individual. We are already in an era of curbs on freedom of thought and expression, where curriculum and pedagogy, art and literature are all examined by fringe elements. What prevents such elements from calling for the neural management of artists, teachers and writers?

There is thus reason to believe that neurotechnologies could become weapons of mass surveillance and mass management. Persons with dissenting political ideas, those with low thresholds for socialising, and other so-called deviants could then be modified to fit the template of the ideal human: a law-abiding, socially pliant, controlled citizen. There would be no deviations from this norm.

Pramod K. Nayar teaches at the Department of English, the University of Hyderabad
