Facebook to restrict Facebook Live after New Zealand mosque shootings, Sheryl Sandberg says
Published 5:27 p.m. ET March 29, 2019 | Updated 5:31 p.m. ET March 29, 2019
Video: Survivors of the mosque attacks in New Zealand describe terror at a door they couldn't open while trying to escape the shooting in the Al Noor mosque. (March 28, AP Domestic)
SAN FRANCISCO — Facebook is considering putting restrictions on who can post live videos on the social network following the deadly shootings at two mosques in New Zealand that were broadcast by the gunman, the company's chief operating officer, Sheryl Sandberg, said Friday.
The social media giant will restrict who can use its "Live" feature based on factors such as prior violations of Facebook's community standards, Sandberg wrote in a blog post.
Facebook has been under pressure to change its policies since a gunman livestreamed the killing of 50 people at two mosques in Christchurch, New Zealand, on March 15.
Critics say Facebook did not remove the video quickly enough, allowing it to spread across the internet and to be uploaded to other online services such as Google's YouTube.
The original video was seen by about 200 people during the 17-minute live broadcast, Facebook said. Facebook removed the video 12 minutes after the livestream ended, but users grabbed the footage and reposted clips from it, making it challenging for Facebook to block all the footage.
Sandberg said Facebook identified more than 900 different videos showing portions of the attacks and deployed artificial intelligence tools to identify and remove hate groups in Australia and New Zealand, including the Lads Society, the United Patriots Front, the Antipodean Resistance and National Front New Zealand.
Facebook is building "better technology" to quickly identify edited versions of violent videos and images and prevent people from re-sharing these new versions, Sandberg said.
"Many of you have also rightly questioned how online platforms such as Facebook were used to circulate horrific videos of the attack," Sandberg wrote. "We are committed to reviewing what happened and have been working closely with the New Zealand Police to support their response."
In a 2017 interview, Facebook Chief Executive Officer Mark Zuckerberg agreed that his company shoulders the responsibility for halting violence on Live and on Facebook in general. The company has hired thousands more moderators, in part, to spot violence in live videos and is training its artificial intelligence to detect it as well.
© 2021 USA TODAY, a division of Gannett Satellite Information Network, LLC.