
Streamed to Facebook, spread on YouTube: New Zealand shooting video circulates online despite takedowns

The video continues to spread across the internet, illustrating how difficult it is to keep graphically violent images away from the public.
Ambulance staff take a man from outside a mosque in central Christchurch, New Zealand, on March 15, 2019. (Mark Baker / AP)

YouTube and other social media sites worked on Friday to stem the spread of the video allegedly recorded by a shooter who entered a mosque in Christchurch, New Zealand, and killed 49 people, highlighting the continued challenges companies face in moderating their platforms.

For the better part of a day after the shooting, the video spread wildly across the internet, illustrating how difficult it is to keep graphically violent images away from the public once they have been posted, even though few media outlets showed the footage. Only by Friday afternoon, as public interest waned, did the flow of new copies slow.

The video was originally livestreamed on Facebook, which released a statement in the hours after the shooting detailing the company’s plans to limit its spread.

“New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Mia Garlick, Facebook’s director of policy in Australia and New Zealand, said in an emailed statement. “We're also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware. We will continue working directly with New Zealand Police as their response and investigation continues.”

Facebook was not able to remove the video before it had been captured by viewers. The livestream was taken down after about 20 minutes, according to timestamped archives of the Facebook page seen by NBC News. Facebook removed the profile associated with the livestream about an hour and a half after the video first started streaming. The video then began to spread around the internet, including on YouTube and Twitter.

YouTube tweeted early Friday that the company was “working vigilantly to remove any violent footage.”

A series of searches on YouTube conducted by NBC News on Friday morning using keywords related to the shooting turned up more than a dozen versions of the video containing graphic violence, each uploaded within the previous hour, according to details publicly available on the platform. Many links led to pages where the video had been taken down, bearing the message: “This video has been removed for violating YouTube's policy on violent or graphic content.”

Facebook’s Watch platform also hosted copies of the video, according to a basic keyword search. One graphic video clip had been available for roughly nine hours, according to the Facebook post’s details.

By Friday afternoon, similar searches on YouTube and Facebook turned up only heavily censored videos of the shootings and some that used stills from the livestream that did not contain graphic content.

The spread of the videos, particularly on YouTube, drew criticism on social media from users who said the company was not taking the videos down quickly enough, as well as concerns that pieces of the video could end up interspersed in other videos targeted at young people.

Tom Watson, a member of the British Parliament and deputy leader of the Labour Party, tweeted that YouTube should stop user uploads until it could contain the spread of the video.

“If YouTube don't have the capability to halt the spread of the NZ massacre videos — because they are going up faster than they can take them down — then they should suspend all new uploads at this time,” Watson wrote.

A YouTube spokesperson said that the company is removing videos as they are found.

Similar searches on Twitter also turned up versions of the video. Some Twitter users reported seeing the video through the platform’s autoplay feature, which begins playing videos without user interaction. Others pleaded with people not to spread the video.

A spokesperson for Twitter said the company “has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. We also cooperate with law enforcement to facilitate their investigations as required.”

Videos of the shooting continued to spread on Twitter on Friday afternoon, with one such example having been online for more than 11 hours.

The video’s creation on Facebook and spread across Twitter and YouTube come as those companies remain under pressure to better moderate their platforms and quickly remove a wider range of content. Facebook and YouTube in particular have said they are now investing heavily in automated moderation systems and human intervention to deal with the massive amount of content uploaded to their platforms every day.

Reddit banned at least two of its communities, known as subreddits, in which users had posted links to video of the shooting, including one infamous subreddit dedicated to footage of people dying.

A Reddit spokesperson said in an emailed statement that subreddits are not allowed to glorify violence.

"We are very clear in our site terms of service that posting content that incites or glorifies violence will get users and communities banned from Reddit. Subreddits that fail to adhere to those site-wide rules will be banned,” the spokesperson wrote.

While the full version of the video has become more difficult to find, there is little chance it will disappear from the internet entirely. David Carroll, an associate professor of media design at the New School who studies the spread of misinformation online, warned that parents should be vigilant about where their children go on the internet.

“Parents might consider a temporary draconian moratorium of YouTube at home given the elevated risk of NZ massacre content being maliciously spliced into young kids content and otherwise recommended to teens/pre-teens,” Carroll tweeted.