Facebook workers dissect its election role
Sixteen months before last November’s presidential election, a researcher at Facebook described an alarming development. She was getting content about the QAnon conspiracy theory within a week of opening an experimental account, she wrote in an internal report.
On November 5, two days after the election, another Facebook employee posted a message alerting colleagues that comments with “combustible election misinformation” were visible below many posts.
Four days after that, a company data scientist wrote in a note to his co-workers that 10 per cent of all US views of political material — a startlingly high figure — were of posts that alleged the vote was fraudulent.
In each case, Facebook’s employees sounded an alarm about misinformation and inflammatory content on the platform and urged action — but the company failed or struggled to address the issues.
The internal dispatches were among a set of Facebook documents obtained by The New York Times that give new insight into what happened inside the social network before and after the November election, when the company was caught flat-footed as users weaponised its platform to spread lies about the vote.
Facebook has publicly blamed the proliferation of election falsehoods on former President Donald J. Trump and other social platforms. In mid-January, Sheryl Sandberg, Facebook’s chief operating officer, said the January 6 riot at the Capitol was “largely organised on platforms that don’t have our abilities to stop hate”.
Mark Zuckerberg said in March that the company “did our part to secure the integrity of our election”.
But the company documents show the degree to which Facebook knew of extremist movements and groups on its site that were trying to polarise American voters before the election.
The documents also give new detail on how aware company researchers were, after the election, of the flow of misinformation positing that votes had been manipulated against Trump.
What the documents do not offer is a complete picture of decision making inside Facebook. Some internal studies suggested that the company struggled to exert control over the scale of its network and how quickly information spread, while other reports hinted that Facebook was concerned about losing engagement or damaging its reputation.
Yet what was unmistakable was that Facebook’s own employees believed the social network could have done more, according to the documents.
“Enforcement was piecemeal,” read one internal review in March of Facebook’s response to Stop the Steal groups, which contended that the election was rigged against Trump. The report’s authors said they hoped the post-mortem could be a guide for how Facebook could “do this better next time”.
New York Times News Service