In the first 24 hours after the attack, Facebook blocked or removed 1.5 million versions of the video from the platform.
Viewers of the mosque shootings recorded, repackaged and reposted the original video in a range of formats, creating a gruesome game of whack-a-mole. Platforms increasingly face a battle with bad actors who organize on forums such as 8chan to circumvent their detection systems.

“The attack demonstrated the misuse of technology to spread radical expressions of hate, and highlighted where we needed to improve detection and enforcement against violent extremist content,” Facebook said. The company said the incident “strongly influenced” its updates to its policies and enforcement.

Facebook’s systems failed to detect the livestreamed video of the shootings. The company announced in a blog post Tuesday that it would work with law enforcement to train its artificial intelligence systems to recognize videos of violent events as part of a wider crackdown on extremist content.
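The whack-a-mole dynamic comes down to how matching works: re-encoding, cropping or filtering a video changes every byte of the file, so an exact file hash of the original no longer matches the copies. One common countermeasure in the industry is perceptual hashing, where visually similar frames produce nearby hash values. The sketch below is a minimal, illustrative average-hash in Python; it is not Facebook’s production matcher (the article does not describe one), and the 8x8 grid and 10-bit threshold are assumptions chosen for the demo.

```python
# Minimal, illustrative "average hash" -- a toy perceptual hash, not
# Facebook's actual system. The 8x8 grid and the 10-bit Hamming
# threshold are assumptions made for this example.

def average_hash(frame, size=8):
    """Hash a grayscale frame (2-D list of 0-255 ints): downsample to
    size x size, then set one bit per cell brighter than the mean."""
    h, w = len(frame), len(frame[0])
    cells = [frame[r * h // size][c * w // size]   # nearest-neighbor sample
             for r in range(size) for c in range(size)]
    mean = sum(cells) / len(cells)
    return sum(1 << i for i, v in enumerate(cells) if v > mean)

def is_probable_copy(a, b, threshold=10):
    """Few differing bits => the frames are visually near-identical,
    even though re-encoding changed every byte of the underlying file."""
    return bin(a ^ b).count("1") <= threshold

# A re-upload that brightens every pixel defeats an exact byte-level
# comparison but lands within a few bits of the original's hash.
original = [[(r * c) % 256 for c in range(64)] for r in range(64)]
brightened = [[min(255, v + 5) for v in row] for row in original]

print(original == brightened)                                             # False
print(is_probable_copy(average_hash(original), average_hash(brightened))) # True
```

Real matching pipelines layer many such signals, such as audio fingerprints and hashes sampled across many frames, which helps explain how a platform could catch 1.5 million near-duplicates while heavily re-edited versions still slipped through.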