Horror was designed for the viral era
The horror was designed specifically for an era that has married social media and racism — a massacre apparently motivated by white extremist hatred, streamed live on Facebook and calculated to go viral.
The shooting represented a staggering corruption of a form of communication, used innocently by millions, that promised to draw people together but has also helped pry them apart into warring camps. It also shattered a veneer of civility and security in one of the safest and most highly developed countries in the world.
New safeguards developed by tech companies over the last 18 months were not enough to stop the video and statement from being widely posted, on Facebook, YouTube, Twitter and Instagram. While Facebook and Twitter took down pages thought to be linked to the gunman, the posted content was spread rapidly through other accounts. Some people appeared to be using techniques to evade automated systems that find and delete content.
The massacre was not the first internet broadcast of a violent crime, but it showed that stopping gory footage from spreading online remains a major challenge for tech companies.
The massacre in Christchurch was live-streamed by an attacker through his Facebook profile for 17 minutes, according to a copy seen by Reuters. Facebook said it removed the stream after being alerted to it by New Zealand police.
But a few hours later, footage from the stream remained on Facebook, Twitter and Alphabet Inc’s YouTube, as well as Facebook-owned Instagram and WhatsApp.
People who wanted to spread the material raced into action, repackaging and distributing the video across many apps and websites within minutes. Facebook, Twitter, YouTube and Mega said on Friday that they were taking action to remove the copies.
Other violent crimes have also been live-streamed. In Thailand in 2017, a father broadcast himself killing his daughter on Facebook; only after more than a day, and 370,000 views, did Facebook remove the video.
In the US, the 2017 assault in Chicago of an 18-year-old man with special needs, accompanied by anti-white racial taunts, and the fatal shooting of a man in Cleveland, Ohio, that same year were also live-streamed.
Facebook, the world’s largest social media network with about 2.3 billion monthly users around the world, tripled the size of its safety and security team to 30,000 people over the last three years to respond more quickly to reports of offensive content. It has also focused on developing artificial intelligence systems to catch material without the need for users to report it first.
But the viral reach of yet another obscene video led politicians around the globe to voice the same conclusion on Friday: Tech companies are failing. As the massacre video continued to spread, former New Zealand Prime Minister Helen Clark said companies had been slow to remove hate speech.