The challenges in dealing with the rapid dissemination of such content are significant. “This fits into a pattern we’ve seen time and time again,” said Ben Decker, CEO of digital investigative consultancy Memetica and an expert on online radicalization and extremism. “At this point, we know the consumption of these videos creates copycat mass shootings.”
Meta designated the event a “terrorist attack” on Saturday, prompting the company’s internal teams to identify and delete the suspect’s account and to begin removing copies of the video and the document, as well as links to them on other sites, according to a company spokesperson. The company added the video and the document to an internal database that automatically detects and removes copies if they are re-uploaded. Meta also banned content that praises or supports the attacker, the spokesperson said.
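To make that mechanism concrete, matching databases of this kind generally work by fingerprinting a designated file and checking every new upload against the stored fingerprints. The sketch below shows the general pattern in Python, not Meta’s actual system; the function names and the in-memory set are hypothetical stand-ins.

```python
import hashlib

# Hypothetical in-memory stand-in for the matching database described above.
banned_hashes: set[str] = set()

def fingerprint(path: str) -> str:
    """Compute a SHA-256 digest of a file, reading in chunks to keep memory flat."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def designate(path: str) -> None:
    """Record a file designated for removal in the matching database."""
    banned_hashes.add(fingerprint(path))

def should_block(path: str) -> bool:
    """Return True if a new upload exactly matches a previously designated file."""
    return fingerprint(path) in banned_hashes
```

An exact digest like this breaks the moment a single byte changes, which is why deployed systems rely on perceptual rather than purely cryptographic hashes so that re-encoded copies still match (more on that below).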
A Streamable spokesperson told CNN the company is “working diligently” to remove copies of the video “promptly.” The spokesperson did not respond when asked how the video reached millions of views before being deleted.
Copies of the document allegedly written by the shooter were uploaded to Google Drive and other smaller online storage sites and shared over the weekend via links to those platforms. Google did not respond to requests for comment on the use of Drive to spread the document.
Challenges in countering extremist content
Mainstream platforms have gotten better at quickly removing this kind of content, according to Tim Squirrell, head of communications at the Institute for Strategic Dialogue, a think tank dedicated to countering extremism. But big tech platforms also have to contend with the fact that not all internet platforms want to take action against this content.
“Now, technically, it failed. It was on Twitch, and copies then started being posted within the first 24 hours,” Decker said, adding that the platforms still have work to do to coordinate effectively to remove harmful content during crisis situations. Still, the work the major platforms have done since Christchurch means their response to Saturday’s attack was faster and more robust than the response three years ago.
“A lot of the threads on the 4chan message board were just people clamoring for the stream over and over again, and once they got a seven-minute version, just reposting it over and over” on bigger platforms, Squirrell said. As with other content on the internet, videos like the one from Saturday’s shooting are often quickly manipulated by extremist communities online and embedded into memes and other content that can be harder for consumer platforms to identify and remove.
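That manipulation matters technically: exact-match fingerprinting of the kind sketched earlier fails once a clip is cropped, captioned, or re-encoded, so matching systems typically lean on perceptual hashes, which change only gradually as an image changes. The toy average-hash below, written against the Pillow imaging library, illustrates the idea on a single frame; the file names and the distance threshold are assumptions, and real systems hash many frames plus audio.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """64-bit average hash: downscale to 8x8 grayscale, threshold each pixel at the mean."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits between two hashes; small distance suggests a near-copy."""
    return bin(a ^ b).count("1")

# A merely re-encoded copy usually stays within a few bits of the original,
# while a heavily meme-ified frame can drift past any safe threshold.
original = average_hash("frame_original.png")   # hypothetical file
reupload = average_hash("frame_reencoded.png")  # hypothetical file
if hamming(original, reupload) <= 10:           # threshold is an assumption
    print("likely a copy -- queue for review/removal")
```

The trade-off Squirrell describes falls out of that threshold: set it loose enough to catch meme-ified copies and it starts flagging unrelated images; set it tight and edited copies slip through.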
Like Facebook, YouTube, and Twitter, platforms such as 4chan rely on user-generated content and are shielded (at least in the US) from liability for much of what their users post by a law known as Section 230. But while mainstream Big Tech platforms are pushed by advertisers, social pressure, and users to fight harmful content, smaller, more fringe platforms aren’t driven by a desire to protect ad revenue or attract a broad user base. In some cases, they want to be online homes for speech that would be moderated elsewhere.
“The consequence of that is you can never complete the game of Whack-a-mole,” Squirrell said. “There will always be somewhere, someone passing around a Google Drive link or a Samsung cloud link or something else that allows people to access it… Once it’s in the ether, it is impossible to delete everything.”