Facebook: No one reported NZ shooting video during 17-minute livestream

CHRISTCHURCH, NEW ZEALAND—MARCH 19: People look on as men pray in a park near Al Noor mosque after a terrorist attack that killed 50 people. Getty Images | Carl Court

Facebook says a livestream of last week’s New Zealand mass shooting was viewed fewer than 200 times during its live broadcast and that nobody reported the video to Facebook while the livestream was ongoing.

“The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended,” Facebook VP and Deputy General Counsel Chris Sonderby wrote in an update posted yesterday.

Ultimately, the original Facebook Live video of the terrorist attack “was viewed about 4,000 times in total before being removed from Facebook,” the company said. Copies of the attack video were uploaded many times after the original was removed, and roughly 300,000 of them were viewable on Facebook before being taken down.

“In the first 24 hours, we removed about 1.5 million videos of the attack globally,” Facebook said. “More than 1.2 million of those videos were blocked at upload, and were therefore prevented from being seen on our services.”

Facebook removed video after hearing from police

Facebook didn’t say how long it took to remove the original video after the first user reports. But Facebook said it took action within minutes of being contacted by New Zealand police.

“We have been working directly with the New Zealand Police to respond to the attack and support their investigation,” Sonderby wrote. “We removed the attacker’s video within minutes of their outreach to us, and in the aftermath, we have been providing an on-the-ground resource for law enforcement authorities.”

Facebook said that police asked the company “not to share certain details” because of the active investigation.

Last week’s white nationalist terrorist attack took place during Friday Prayer at the Al Noor Mosque and the Linwood Islamic Centre in Christchurch, New Zealand, killing 50 people and injuring 50 more. The gunman live-streamed 17 minutes of the attack.

“All profit, no responsibility”

Facebook, YouTube, and Twitter all struggled to contain the spread of the video, as we reported last week. Facebook has faced criticism for not preventing broadcast of the livestream. Today, New Zealand Prime Minister Jacinda Ardern said in a speech at Parliament that social media sites must be accountable for what is published on their platforms.

“The form of distribution, the tools of organization, they are new,” Ardern said. “We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher, not just the postman. It cannot be a case of all profit, no responsibility.”

Facebook’s update yesterday seems designed to demonstrate that the social network made reasonable efforts to prevent the video’s spread despite factors outside its control. For example, Facebook wrote that “a user on 8chan posted a link to a copy of the video on a file-sharing site” before Facebook was alerted to the video’s existence.

“We removed the original Facebook Live video and hashed it so that other shares that are visually similar to that video are then detected and automatically removed from Facebook and Instagram,” Facebook wrote. “Some variants such as screen recordings were more difficult to detect, so we expanded to additional detection systems including the use of audio technology.”
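Facebook’s description of hashing the original video so that visually similar re-uploads can be matched automatically maps onto widely used perceptual-hashing techniques. The sketch below is a minimal illustration of that general idea, not Facebook’s actual pipeline; the sampling rate, the frame_hashes and likely_copy helpers, and the distance threshold are all assumptions made for this example.

```python
# Minimal sketch of perceptual hashing for near-duplicate video detection.
# This is NOT Facebook's system; it only illustrates hashing sampled frames
# so that visually similar re-uploads can be flagged automatically.
# Requires: pip install opencv-python pillow imagehash
import cv2
import imagehash
from PIL import Image


def frame_hashes(path, every_n_frames=30):
    """Sample frames from a video file and return their perceptual hashes."""
    hashes = []
    cap = cv2.VideoCapture(path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            # OpenCV decodes frames as BGR; convert before hashing.
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            hashes.append(imagehash.phash(Image.fromarray(rgb)))
        index += 1
    cap.release()
    return hashes


def likely_copy(original_hashes, candidate_hashes, max_distance=8):
    """Flag a candidate video whose sampled frames mostly match the original.

    Subtracting two ImageHash values gives a Hamming distance; a small
    distance means the frames stay visually similar even after re-encoding
    or resizing.
    """
    if not candidate_hashes:
        return False
    matches = sum(
        1
        for h in candidate_hashes
        if any(h - o <= max_distance for o in original_hashes)
    )
    return matches / len(candidate_hashes) > 0.5
```

As the example suggests, screen recordings and re-filmed variants change the frames enough that frame-level hashes stop matching, which is consistent with Facebook’s note that it fell back to additional detection systems, including audio matching, for those copies.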

Videos shared to industry database

Facebook said that it and other members of the Global Internet Forum to Counter Terrorism (GIFCT) “have been in close contact since the attack.”

GIFCT was formed by Facebook, Microsoft, Twitter, and YouTube in 2017 after they announced plans to create “a shared industry database of ‘hashes’—unique digital ‘fingerprints’—for violent terrorist imagery or terrorist recruitment videos or images that we have removed from our services.”

Facebook yesterday said it has “shared more than 800 visually distinct videos related to the attack via our collective database, along with URLs and context on our enforcement approaches. This incident highlights the importance of industry cooperation regarding the range of terrorists and violent extremists operating online.”
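GIFCT’s model, as described above, is to exchange fingerprints and enforcement context rather than the videos themselves, so each member can check its own uploads against hashes contributed by the others. The sketch below only illustrates that concept; the SharedHashDatabase class, its method names, and the sample hash value are hypothetical and not the actual GIFCT system.

```python
# Hypothetical sketch of a shared industry hash database in the spirit of
# GIFCT: members contribute fingerprints of removed content plus context,
# and other members query incoming uploads against those fingerprints.
from dataclasses import dataclass, field


@dataclass
class HashRecord:
    hash_value: str    # e.g. a hex-encoded perceptual or cryptographic hash
    contributor: str   # which member platform reported it
    context: str       # enforcement context shared alongside the hash


@dataclass
class SharedHashDatabase:
    records: dict = field(default_factory=dict)

    def report_hash(self, hash_value, contributor, context=""):
        """A member adds a fingerprint of content it has removed."""
        self.records[hash_value] = HashRecord(hash_value, contributor, context)

    def matches(self, hash_value):
        """Another member checks an upload's fingerprint before it goes live."""
        return self.records.get(hash_value)


# Example: one platform reports a hash, another blocks a matching upload.
db = SharedHashDatabase()
db.report_hash("9f86d081884c7d65", "facebook", "violent terrorist content")
if db.matches("9f86d081884c7d65"):
    print("Block upload: matches shared industry hash")
```

Exchanging only hashes and context lets members flag known content at upload time without redistributing the footage itself.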

Facebook said it also “identified abusive content on other social media sites in order to assess whether or how that content might migrate to one of our platforms.”

Facebook’s post didn’t say how many videos or copies of videos have been removed since the first 24 hours after the attack. “We will continue to work around the clock on this and will provide further updates as relevant,” the company said.
