Query for Facebook: US fine, what about India?
In Sri Lanka and Myanmar, Facebook kept up posts that it had been warned contributed to violence. In India, activists have urged the company to combat posts by political figures targeting Muslims. And in Ethiopia, groups pleaded for the social network to block hate speech after hundreds were killed in ethnic violence inflamed by social media.
For years, Facebook and Twitter have largely rebuffed calls to remove hate speech or other comments made by public figures and government officials that civil society groups and activists said risked inciting violence. The companies stuck to policies, driven by American ideals of free speech, that give such figures more leeway to use their platforms to communicate.
But last week, Facebook and Twitter cut off President Donald Trump from their platforms for inciting a crowd that attacked the US Capitol.
Those decisions have angered human rights groups and activists, who are now urging the companies to apply their policies evenly, particularly in smaller countries where the platforms dominate communications.
“When I saw what the platforms did with Trump, I thought, ‘You should have done this before, and you should do this consistently in other countries around the world,’” said Javier Pallero, policy director at Access Now, a human rights group involved in the Ethiopia letter. “Around the world, we are at the mercy of when they decide to act.”
“Sometimes they act very late,” he added, “and sometimes they act not at all.”
David Kaye, a law professor and former UN monitor for freedom of expression, said political figures in India, the Philippines, Brazil and elsewhere deserved scrutiny for their behaviour online. But he said the actions against Trump raised difficult questions about how the power of American Internet companies was applied, and whether their actions set a new precedent for more aggressively policing speech around the world.
“The question going forward is whether this is a new kind of standard they intend to apply for leaders worldwide, and do they have the resources to do it?” Kaye said. “There is going to be a real increase in demand to do this elsewhere in the world.”
Many activists singled out Facebook for its global influence and not applying rules uniformly. They said that in many countries it lacked the cultural understanding to identify when posts might incite violence. Too often, they said, Facebook and other social media companies do not act even when they receive warnings.
In many countries, there’s a perception that Facebook bases its actions on its business interests more than on human rights. In India, Facebook’s largest market by number of users, the company has been accused of not policing anti-Muslim content from political figures for fear of upsetting the government of Prime Minister Narendra Modi and his ruling party.
“Developments in our countries aren’t addressed seriously,” said Mishi Choudhary, a technology lawyer and founder of the Software Freedom Law Centre, a digital rights group in India.
“Any takedown of content raises the questions of free expression, but incitement of violence or using a platform for dangerous speech is not a free speech matter but a matter of democracy, law and order.”
New York Times News Service