
Facebook: Humane approach needed

The social networking firm has matured with every scandal, but reforms have been slow, piecemeal, and inadequate
Mark Zuckerberg, the Facebook CEO. The company cannot argue that it was caught off guard by the violence exacerbated by its platform

Debarshi Chakraborty and Ishaan Banerji | Published 29.03.20, 06:40 PM

In a speech in 2019, Mark Zuckerberg said, “... our values at Facebook are inspired by the American tradition...”. What Zuckerberg conveniently discounted is that only a minuscule portion of Facebook’s 2.45 billion monthly active users are based in the United States of America. In spite of Facebook’s global reach, the company is least devoted to the needs of users and societies in nations outside North America and Europe. The consequences of this neglect are, at times, disturbing.

A report released by the United Nations’ Independent International Fact-Finding Mission on Myanmar relating to mass violence against the Rohingya Muslims indicated that Facebook had been a “useful instrument” for those striving to spread hate and propaganda. Myanmar is not the only example. From powering Rodrigo Duterte’s drug war in the Philippines to augmenting hostility in Libya, complaints of Facebook’s links to atrocities being committed around the world have been quite common. As it wrangles with these allegations, the company is also grappling with a different kind of problem in India where its subsidiary, WhatsApp, has been accused of abetting harm. In May 2018, a video that gained widespread circulation over WhatsApp featured an image of dozens of dead children with a male voiceover warning of outsiders harvesting the organs of children. After viewing the fake video, people in a village in Maharashtra lynched five strangers suspecting them to be child-abductors.

Facebook cannot argue that it had been caught off guard. In Myanmar, it had received several early warnings about the platform being used to mould events on the ground. Nor can Facebook claim to be unbiased since it exerts tremendous influence over the permissibility and visibility of online content. As a content porter, Facebook decides the categories of content that are permitted or forbidden on its platform. As content manager and amplifier, Facebook customizes the experience of its users, giving precedence to some content over others through algorithmic personalization. Facebook has matured with every scandal, but reforms have been slow, piecemeal, and inadequate.

UN experts, civil society groups and academics have proposed a human rights-based formula for content moderation. The starting point for defining such a formula is the three-pillar framework set forth in the UN’s Guiding Principles on Business and Human Rights (Guiding Principles). According to its second pillar, business enterprises have a corporate responsibility to respect human rights by preventing infringement on the human rights of others and remedying human rights transgressions that they are involved in. In fulfilling this global standard of expected conduct, businesses should put in place a range of policies and procedures appropriate to their size and circumstances.

Principle 23 of the Guiding Principles explains that business enterprises should treat the heightened risk of becoming complicit in gross human rights abuses committed by other actors in conflict-affected contexts as a legal compliance issue, given the possibility of corporate and individual legal liability for such acts. What is more, to guard against aggravating such situations, businesses must not rely on internal expertise alone; they should also consult externally with credible, independent experts drawn from governments, civil society, national human rights institutions and relevant multi-stakeholder initiatives.

Adopting a human rights-based formula would help Facebook swing from its ad hoc approach towards a principled framework strengthened by the common conceptual vocabulary of human rights law. Facebook would also be well-equipped to evaluate the real or potential adverse human rights impacts of its moderation rules, covering the aspects of conception, design and testing, deployment in different contexts, and evaluation.

Copyright © 2020 The Telegraph. All rights reserved.