The Christchurch attacker documented his killing spree, which left 50 people dead, with a helmet-mounted GoPro camera and streamed the 17-minute video live on Facebook. Until the original was removed from the social network, other users could view and redistribute it. Facebook published some figures on this on Tuesday in an "Update on New Zealand".
Nearly 200 people watched live as the right-wing terrorist stormed the Al-Nur Mosque in Christchurch and indiscriminately shot people. Strikingly, none of the live viewers found the video problematic enough to flag it to Facebook for review. The first report arrived only after 29 minutes, about twelve minutes after the live broadcast had ended, the company said.
That it took so long until the first user reacted may be explained by the fact that the video was initially shared among like-minded people on the forum "8chan". There, radicals mingle with onlookers who treat such content as a spectacle. In addition, the attacker had announced the live broadcast on the forum in order to draw users' attention to his stream.
How long it took after that first report until the original video was finally removed is not apparent from Facebook's statement. By its own account, however, the company acted only once the New Zealand police alerted it to the content: "We removed the video of the attacker within minutes of their (the police) request to us."
By then, about 4,000 users had seen the original video on Facebook. The company also blocked the attacker's accounts on both Facebook and Instagram.
To find and delete the copies of the video, at least on its own platforms, Facebook says it relies on the "Global Internet Forum to Counter Terrorism" (GIFCT). The partnership was launched in 2017 by the Internet giants Google, Facebook, Microsoft, Twitter and YouTube to counter the spread of terrorist content on their platforms.
Through a shared database, the companies, according to Google, create "digital fingerprints" of terrorist content and share them with the other members in order to detect re-uploads. Using this "hashing" method, Facebook was able to delete about 1.5 million copies of the video worldwide in the first 24 hours after the attack. More than 1.2 million of these were already blocked during upload.
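The basic mechanism can be illustrated with a minimal sketch. Note the assumptions: the GIFCT database actually uses proprietary perceptual hashes that tolerate re-encoding and cropping, whereas this toy example uses an exact cryptographic hash (SHA-256), which only matches byte-identical copies; all function names here are hypothetical, not Facebook's API.

```python
import hashlib

# Shared blocklist of known fingerprints, standing in for the
# cross-company GIFCT hash database.
shared_blocklist: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Compute a 'digital fingerprint' of a file's bytes.

    Exact-match only: real systems use perceptual hashing so that
    re-encoded or slightly edited copies still match.
    """
    return hashlib.sha256(data).hexdigest()

def register_terrorist_content(data: bytes) -> None:
    """Add a known video's fingerprint to the shared database."""
    shared_blocklist.add(fingerprint(data))

def should_block_upload(data: bytes) -> bool:
    """Check an incoming upload against the shared database."""
    return fingerprint(data) in shared_blocklist
```

This matches the figures in the article: once one copy's fingerprint is registered, every byte-identical re-upload can be rejected at upload time, which is how more than 1.2 million of the 1.5 million copies could be blocked before they ever appeared.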
“We remain shocked and saddened by this tragedy and are determined to work with leaders in New Zealand and other countries and the technology industry to fight hate speech and the threat of terrorism,” Facebook said.
The New Zealand Prime Minister addressed the role of social networks in a widely acclaimed parliamentary speech on Tuesday and called on Facebook to prevent the spread of terrorist content more effectively in the future.
New Zealand companies are considering withdrawing their ads from Facebook, and some, such as the state-owned Lotto group, have already done so. A spokesman for the company explained that advertising on Facebook "does not feel right after these events". Google is also affected by the vote of no confidence.