
Guardians of the Internet

Prasun Chaudhuri, Avijit Chatterjee and Varuna Verma meet the men and women who filter the porn, abuse and filth on the Internet before you log on

The Telegraph Online Published 06.12.14, 06:30 PM
FLAG THE SMUT: Employees at Foiwe Info Global Solutions, Bangalore

Satyabrata Das spends eight hours a day in front of a computer, peering at some of the most horrific shades of human behaviour on the Internet. He has seen images of animals getting slaughtered, the worst kinds of child abuse and the most twisted forms of pornography. One recent video showed hooded terrorists gruesomely beheading a man.

Das is no voyeur - this is work. As a content moderator working for Foiwe Info Global Solutions in Bangalore, he sifts through the thousands of images and texts that people upload on social networking and other sites and weeds out offensive material. People like him make it possible for the average user to log on to the Internet without being besieged by pornographic or brutal images.

'Our job is to block or flag the filth before it is posted or goes viral,' says Das, who has been working at Foiwe for over two years.

With the growth of social media and the proliferation of websites that allow Internet users to upload images and videos, these moderators are in great demand. Hemanshu Nigam, former chief security officer of social media group MySpace and software giant Microsoft in the US, estimates that the number of content moderators cleaning up social media sites, mobile apps and the wider Internet runs to well over 1,00,000 across the world.

'In India, the number could be above 10,000 and growing,' he says.

Some call them janitors of the Internet. Nigam refers to them as 'protectors'. They are the people who ensure that uploaders do not cross the line - a job that was initially handled by automated 'nanny' software when the social media and video sites had just started to grow.

User-generated content was filtered through such software - for instance, programs that detected copious flesh tones. Eventually it was realised that the job needed human intervention, as machines failed to distinguish smutty images from those showing, say, a mother breastfeeding an infant.
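A minimal sketch of what such a flesh-tone filter might have looked like, assuming the Pillow imaging library and a classic published RGB skin heuristic; the 40 per cent threshold is an invented illustration, not any site's actual rule:

```python
# A toy version of the early "nanny" filters: flag an image when too many
# pixels fall inside a crude RGB flesh-tone range. The thresholds below
# follow a widely cited skin-detection heuristic, not a production rule.
from PIL import Image

def flesh_tone_ratio(path: str) -> float:
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())

    def is_flesh(rgb) -> bool:
        r, g, b = rgb
        # Red clearly dominant, moderate green, lower blue.
        return (r > 95 and g > 40 and b > 20 and
                r > g and r > b and r - min(g, b) > 15)

    return sum(1 for p in pixels if is_flesh(p)) / max(len(pixels), 1)

def flag_for_human_review(path: str, threshold: float = 0.4) -> bool:
    # Above the threshold the image is queued for a human moderator,
    # which is exactly where a breastfeeding photo would be rescued.
    return flesh_tone_ratio(path) >= threshold
```

The weakness is plain: a beach snapshot, a medical photograph and genuine smut can all score alike, which is why the industry moved to human judgement.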

'Moderation requires swift monitoring and an extremely sophisticated series of judgements in split seconds,' says Kalyan Kar, former managing director of Acclaris (India), the off-shore services centre of a US-based tech firm.

Now, in offices across the country - Bangalore, Delhi, Hyderabad and other metros - young men and women pore over screens to block every kind of abuse. For instance, Chegg India, a subsidiary of a US-based e-learning outfit, has a bevy of employees who deal with abusive language and swear words targeted at children.

'Today, round-the-clock workers - "24/7/365 cycles" - ensure the highest degree of efficiency in blocking cruel and offensive content spewed non-stop by an array of creeps, bullies, racists and perverts,' says Aravind Rao, director, operations, Infosearch ITeS Pvt. Ltd, Hyderabad.

The job is not for everybody. Foiwe founder and head Suman Howlader points out that it demands analytical skills. 'This is an IT service job, but we require a certain amount of maturity from our employees,' adds Gujju Nalini, who handles human resources for the company and recruits people from tech college campuses and through job portals. Foiwe's primary clients are dating sites such as two.com, mamba.com and some undisclosed social media giants.

The job indeed calls for a high degree of alertness and acumen. For instance, Das explains, while moderating dating sites one needs to be sharp enough to spot the fake profile of a scamster from one continent trying to forge a relationship with an unsuspecting loner in another, mainly to extract money. Of course, there are occasions when a genuine user gets blocked. In such cases, if a Foiwe employee is responsible, the company has to pay a penalty to the client, Nalini adds.

The problem, often, is that the definition of offensive content varies from client to client, or from culture to culture. For instance, a US-based client of Infosearch asks moderators to screen anti-Christ content, but permits rabid criticism of the country's president, says Rao. 'Some companies don't allow swear words but can allow topless nudity,' adds Nigam, who now heads SSP Blue, a leading advisory firm for online safety, security, and privacy in the US.

Employees are trained to handle such nuances. 'We make specific guidelines - a moderation shield - based on client preferences, legal issues, copyright concerns and child safety considerations,' says Rao. Moderators are made familiar with these norms.
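One way to picture such a 'moderation shield' is as a per-client rule table consulted for every flagged item. The sketch below is hypothetical; the client names and rule labels are invented for illustration:

```python
# Hypothetical per-client rule tables: the same content can pass for one
# client and be blocked for another, as Rao and Nigam describe.
MODERATION_SHIELDS = {
    "us_dating_client": {
        "swear_words": True,       # block
        "topless_nudity": False,   # allow
        "political_abuse": False,  # allow criticism of the president
    },
    "kids_elearning_client": {
        "swear_words": True,
        "topless_nudity": True,
        "political_abuse": True,
    },
}

def is_blocked(client: str, labels: set[str]) -> bool:
    rules = MODERATION_SHIELDS[client]
    # Unknown labels default to blocked, erring on the side of safety.
    return any(rules.get(label, True) for label in labels)

print(is_blocked("us_dating_client", {"topless_nudity"}))       # False
print(is_blocked("kids_elearning_client", {"topless_nudity"}))  # True
```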

Often, the offensive matter is political in nature. 'I recently came across a twisted image of US President Barack Obama along with a nasty racial comment,' says Abdus Salam, a content moderator working at Infosearch. 'I deleted it in seconds,' he says.

While heading security at MySpace in the US, Nigam oversaw moderation, which he says is usually done in two steps. 'In the first phase, contract workers in remote locations (such as India, the Philippines, Costa Rica and Ireland) 'flag' an image or text they find offensive. In phase two, workers in the social media or Internet company double check the content and remove it if they feel it's wrong. All this happens in just a few minutes.'
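Nigam's two-phase workflow reduces to a simple state machine: offshore contractors can only flag, and in-house reviewers make the final call. A minimal sketch, with the state names assumed for illustration:

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    POSTED = "posted"
    FLAGGED = "flagged"    # phase 1: remote contractor marks it suspect
    REMOVED = "removed"    # phase 2: in-house reviewer confirms the flag
    RESTORED = "restored"  # phase 2: in-house reviewer overrules it

@dataclass
class Item:
    content_id: int
    status: Status = Status.POSTED

def phase_one_flag(item: Item) -> None:
    # Contractors in India, the Philippines, Costa Rica or Ireland only
    # flag content; they never get to delete it themselves.
    if item.status is Status.POSTED:
        item.status = Status.FLAGGED

def phase_two_review(item: Item, confirmed: bool) -> None:
    # The platform's own staff double-check every flag within minutes.
    if item.status is Status.FLAGGED:
        item.status = Status.REMOVED if confirmed else Status.RESTORED
```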

What's surprising about the whole procedure is that even though the employees work like super-efficient robots, most big tech firms want to hide their contribution. 'I know many social media giants who outsource such jobs to Indian companies in Bangalore, Chennai and Hyderabad, but I can't disclose their names,' Nigam says. There are strict non-disclosure agreements, and many companies bar moderators from talking even to other employees of the same outsourcing firm.

According to Nalini, Foiwe is the only start-up working in content moderation in Bangalore. 'IBM, Infosys, Wipro and Tata Consultancy Services also offer content moderation services, but they call it enterprise content moderation,' she adds.

Efforts to elicit a comment from Infosys were met with a standard response from their media spokesperson - 'Please excuse us from the story, we would not like to participate (in it). Thank you.' Mails to the big social media companies - Facebook, YouTube, Twitter and Pinterest - either elicited no response or were stonewalled.

Clearly, Internet companies wish to sweep the details of content moderation under the carpet because they would rather not draw attention to the unsavoury aspects of the social media industry. Sarah T. Roberts, a media studies scholar at the University of Western Ontario, Canada, holds that companies do not want to disrupt 'comfortable myths' about the Internet as a forum for democratic free expression and a virtual world for fun and leisure.

'The existence of and necessity for workers who screen out material that is, quite often, disturbing, violent, degrading and disgusting, paints quite a different picture of what people are using social media for. On a larger scale, it points to something troublesome about the nature of images and videos as commodities based on exploitation,' Roberts adds.

The other downside of the job is its possible impact on employees. Nigam points out that some of the images can haunt the moderator for a long time.

Howlader believes that the company's 90-odd moderators get desensitised by gory images over time. 'We've to go through millions of images every week catering to 71 million users spread across the US, Europe and elsewhere.'

But the job does take its toll on some. Take the case of Sudha (name changed), who used to moderate content on behalf of an outsourcing firm in Chennai. She quit her job recently after she came across a video showing the violent rape of a child.

'The image still haunts this single mother and she sometimes wakes up in the middle of the night,' says Debarati Halder, a lawyer and cyber victim counsellor in Tirunelveli, Tamil Nadu. 'Some moderators become numb to human feelings and fail to react to the pain of others,' she adds.

Roberts rues that despite such occupational and emotional hazards, many big companies don't offer adequate therapeutic support to these workers. According to her, they deliberately outsource work through layers of contracts and various complex arrangements designed to distance the workers - legally, geographically, and otherwise.

Nigam, however, thinks tech companies are slowly becoming aware of the psychological trauma associated with the job. 'Many of them offer support to the workers,' he says. According to him, therapeutic help is offered to employees who can't cope with the pressure. 'If people can't take it, they are shifted to other projects.'

Infosearch holds regular brainstorming sessions to help employees tackle the emotional hazards of work. 'These sessions are organised especially for fresh graduates,' Rao adds.

Content moderators at Foiwe or Infosearch, however, seem hardly bothered by these rough edges. Das and his team are gearing up for the holiday season - festive for some, but hard work for the moderators. 'December is the busiest month for people to have fun and upload all sorts of stuff. We have to work doubly hard to clean up the mess,' he says.

Additional reporting by Smitha Verma in New Delhi
