Christchurch shootings: Social media races to stop attack footage

Media caption: Christchurch was put into lockdown as events unfolded

A gunman opened fire at two mosques in Christchurch, New Zealand, killing 50 people and injuring 50 more. As he did so, he filmed the attack and live-streamed it directly to Facebook.

What ensued was an exhausting race by social media platforms to take the footage down, as it was replicated seemingly endlessly and shared widely in the wake of the attack.

And through social media, it found its way onto the front pages of some of the world's biggest news websites in the form of still images, GIFs and even the full video.

This series of events has, once again, shone a spotlight on how sites like Twitter, Facebook, YouTube and Reddit try - and fail - to address far-right extremism on their platforms.

As the video continued to spread, other members of the public put up their own posts pleading with people to stop sharing it.

One pointed out: "That is what the terrorist wanted."

What was shared?

The video, which shows a first-person view of the killings, has been widely circulated.

  • About 10 to 20 minutes before the attack in New Zealand, someone posted on the /pol/ section of 8chan, a message board popular with the alt-right. The post included links to the suspect's Facebook page, where he stated he would be live-streaming, and published a rambling and hate-filled document
  • That document was, as Bellingcat analyst Robert Evans points out, filled with "huge amounts of content, most of it ironic, low-quality trolling" and memes, in order to distract and confuse people
  • The suspect also referenced a meme in the actual video. Before opening fire he shouted "subscribe to PewDiePie", a reference to a meme about keeping YouTube star PewDiePie as the most-subscribed-to channel on the platform. PewDiePie has been embroiled in a race row before, so some have speculated that the attacker knew that mentioning him would provoke a reaction online. PewDiePie later said on Twitter he was "absolutely sickened having my name uttered by this person"
  • The attacks were live-streamed on Facebook and, despite the original being taken down, were quickly replicated and shared widely on other platforms, including YouTube and Twitter
  • People continue to report seeing the video, despite the sites acting swiftly to remove the original and copies, and new copies are still being uploaded to YouTube faster than the platform can remove them
  • Several Australian media outlets broadcast some of the footage, as did other major newspapers around the world
  • Ryan Mac, a BuzzFeed technology reporter, compiled a timeline of where he had seen the video, including a copy shared by a verified Twitter account with 694,000 followers, which he said had been online for two hours

How have people reacted?

While huge numbers of people have been duplicating and sharing the footage online, many others responded with disgust - urging others not only not to share the footage, but not even to watch it.

Spreading the video, many said, was what the attacker had wanted people to do.

A lot of people were particularly angry at media outlets for publishing the footage.

Channel 4 News anchor Krishnan Guru-Murthy, for example, specifically named two British newspaper websites and accused them of hitting "a new low in clickbait".

BuzzFeed reporter Mark Di Stefano also wrote that MailOnline had allowed readers to download the attacker's 74-page "manifesto" from its news report. The website later removed the document and released a statement saying it was "an error".

Daily Mirror editor Lloyd Embley also tweeted that the paper had removed the footage, and that publishing it was "not in line with our policy relating to terrorist propaganda videos".

How have social media companies responded?

All of the social media firms have expressed heartfelt sympathy for the victims of the mass shootings, and reiterated that they act quickly to remove inappropriate content.

Facebook said: "New Zealand Police alerted us to a video on Facebook shortly after the live-stream commenced and we removed both the shooter's Facebook account and the video.

"We're also removing any praise or support for the crime and the shooter or shooters as soon as we're aware. We will continue working directly with New Zealand Police as their response and investigation continues."

And in a tweet, YouTube said "our hearts are broken", adding it was "working vigilantly" to remove any violent footage.

Historically, however, the social media companies' record on combating the threat of far-right extremism has been more chequered.

Twitter acted to remove alt-right accounts in December 2017. It had previously removed and then reinstated the account of Richard Spencer, an American white nationalist who popularised the term "alternative right".

Facebook, which suspended Mr Spencer's account in April 2018, admitted at the time that it was difficult to distinguish between hate speech and legitimate political speech.

This month, YouTube was accused of being either incompetent or irresponsible over its handling of a video promoting the banned neo-Nazi group National Action.

British MP Yvette Cooper said the video-streaming platform had repeatedly promised to block it, only for it to reappear on the service.

What needs to happen next?

Dr Ciaran Gillespie, a political scientist from Surrey University, thinks the problem goes far deeper than a video, shocking as that content has been.

"It is not just a question about broadcasting a massacre live. The social media platforms raced to close that down and there is not much they can do about it being shared because of the nature of the platform, but the bigger question is the stuff that goes before it," he said.

Image caption: Fifty people were killed in the shootings at two mosques in Christchurch (Getty Images)

As a political researcher, he uses YouTube "a lot" and says that he is often recommended far-right content.

"There is oceans of this content on YouTube and there is no way of estimating how much. YouTube has dealt well with the threat posed by Islamic radicalisation, because this is seen as clearly not legitimate, but the same pressure does not exist to remove far-right content, even though it poses a similar threat.

"There will be more calls for YouTube to stop promoting racist and far-right channels and content."

'Legitimate controversy'

His views are echoed by Dr Bharath Ganesh, a researcher at the Oxford Internet Institute.

"Taking down the video is obviously the right thing to do, but social media sites have allowed far-right organisations a place for discussion and there has been no consistent or integrated approach to dealing with it.

"There has been a tendency to err on the side of freedom of speech, even when it is obvious that some people are spreading toxic and violent ideologies."

Now social media companies need to "take the threat posed by these ideologies much more seriously", he added.

"It may mean creating a special category for right-wing extremism, recognising that it has global reach and global networks."

Neither underestimates the magnitude of the task, especially as many exponents of far-right views are adept at what Dr Gillespie calls "legitimate controversy".

"People will discuss the threat posed by Islam and acknowledge it is contentious but point out that it is legitimate to discuss," he said.

These grey areas are going to be extremely difficult for the social media firms to tackle, both researchers say, but after the tragedy that unfolded in New Zealand, many believe they must try harder.
