
Cleveland Shooting Highlights Facebook's Responsibility in Policing Depraved Videos

Steve Stephens wanted an audience to see the moment he seemingly chose Robert Godwin Sr. at random and gunned down the grandfather on Easter Sunday.
Steve Stephens appears in stills from a video he broadcast of himself on Facebook in Cleveland, Ohio, on April 16, 2017. (Steve Stephens / Facebook via Reuters)

Steve Stephens wanted an audience to see the moment he seemingly chose Robert Godwin Sr. at random and gunned down the grandfather in cold blood on Easter Sunday — and he got one.

For one hour and 45 minutes, a video showing Godwin's death remained live on Facebook until it was reported, Justin Osofsky, Facebook's vice president of global operations, said in a blog post Monday.

Facebook wasn't made aware of a live video containing Stephens' confession until after it had ended, Osofsky said. An initial video in which Stephens shares his intent to murder was never reported, according to Osofsky.


Related: Facebook Killer Manhunt: Police on Lookout in Four States for Cleveland Man Steve Stephens

"We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind," Osofsky said. "But we know we need to do better."

He shared the following timeline of how the Easter tragedy unfolded on Facebook:

11:09 a.m. PT — First video, of intent to murder, uploaded. Not reported to Facebook.

11:11 a.m. PT — Second video, of shooting, uploaded.

11:22 a.m. PT — Suspect confesses to murder while using Live, is live for 5 minutes.

11:27 a.m. PT — Live ends, and Live video is first reported shortly after.

12:59 p.m. PT — Video of shooting is first reported.

1:22 p.m. PT — Suspect’s account disabled; all videos no longer visible to public.

The senseless act of violence that has spurred a nationwide manhunt has raised questions about Facebook Live and what can be done to better police criminal content. As a result, Osofsky said Facebook is "reviewing our reporting flows."

Facebook Live, a tool initially reserved for celebrities and public figures, has been available to all of the site's 1.86 billion users for only one year.

It created a star in Chewbacca Mom Candace Payne and has been used as a powerful social justice tool to capture protests and police-involved shootings.

However, during that time, a Chicago man was tortured on Facebook Live. A rash of suicides were broadcast. In Sweden, a gang rape was reportedly broadcast on Facebook Live. And that's just a few of the incidents that have created a unique challenge for Facebook as the social network strives to balance safety with free expression.

"A lot of this phenomenon of people, whether committing suicide or killing, torturing, raping, is because they know they have this captive audience," Jen Golbeck, a professor at the University of Maryland's College of Information Studies, told NBC News.

Facebook's existing community standards shed some light on how the company is tackling what it calls "the unique challenges of live video."

The protocol includes having a team on call at all hours, ready to respond to reports of content that may violate community standards.

Live video is treated like all other content, according to Facebook.

"Anyone can report content to us if they think it goes against our standards, and it only takes one report for something to be reviewed," Facebook's community standards state. "One of the most sensitive situations involves people sharing violent or graphic images of events taking place in the real world. In those situations, context and degree are everything."

Related: Facebook Addresses Growing Issue of Livestreamed Suicides

Last July, Facebook briefly took down a video showing the police shooting of Philando Castile, an unarmed black man fatally shot during a Minnesota traffic stop. The video was restored, with Facebook blaming the removal on a technical glitch.

"It's a place where in times of uncertainty or potentially bad things happening, it is a way of holding anyone, but especially authority, accountable," Golbeck said. "It's not just you and them. Lots of people are watching this interaction and it can help in a powerful way."

But what happens when a crime is broadcast on Facebook Live, or in the case of Godwin's murder, when the video is then uploaded to Facebook?

It can often turn into a game of whack-a-mole, with copies cropping up on Facebook and in other corners of the internet.

Related: Virtual Crime Scene: Police Shootings Address New Era of Violent Social Media

The social network should, presumably, have the technology to stop copies of these videos from spreading, Golbeck said.

"I think they are really trying to figure that out. It is one thing to keep up a police shooting, something that is violent, but speaking to a larger issue," Golbeck said. "But it is another thing to be like, 'Here is a guy getting shot.'"

Facebook's Efforts So Far

Earlier this month, Facebook said it was using photo-matching technologies to help stop revenge porn from being repeatedly shared on Facebook, Messenger and Instagram. If someone tries to share an image that has been reported and removed, their attempt will be automatically thwarted and they'll be notified the photo violates Facebook's terms of service.

Facebook did not immediately respond to NBC News' request for comment as to whether this technology might also work for video.
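Facebook has not published how its photo-matching works, but systems like this commonly rely on perceptual hashing: a compact fingerprint that stays nearly identical when an image is resized or lightly edited, so re-uploads of removed content can be caught by comparing fingerprints. The sketch below is an illustrative assumption, not Facebook's actual algorithm; the "average hash" scheme, the 8x8 grid, and the distance threshold are all simplifications chosen for clarity.

```python
# Illustrative sketch of hash-based image matching, in the spirit of the
# photo-matching Facebook describes. This is NOT Facebook's real system;
# the average-hash scheme and threshold below are assumptions for clarity.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid.

    pixels: list of 64 brightness values (0-255), e.g. sampled from a
    downscaled copy of the image. Each bit records whether that pixel
    is brighter than the image's average brightness.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits between two hashes; small means similar images."""
    return bin(h1 ^ h2).count("1")

def is_blocked(candidate_hash, blocked_hashes, threshold=5):
    """Reject an upload whose hash is close to any previously removed image."""
    return any(hamming_distance(candidate_hash, h) <= threshold
               for h in blocked_hashes)

# A previously removed image: its hash goes into the block list, so an
# attempted re-share of the same image is caught immediately.
removed = average_hash([0] * 32 + [255] * 32)
blocked_hashes = {removed}
print(is_blocked(removed, blocked_hashes))  # True
```

Because the comparison uses a distance threshold rather than exact equality, the same mechanism can in principle catch lightly altered copies, which is what makes this style of matching more robust than checksums against the whack-a-mole problem described above.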

Last month, Facebook also announced the integration of suicide prevention tools into Facebook Live, allowing viewers the option to report a video and to get resources to help them reach out to a friend in need. The person sharing the video will also see resources on their screen, letting them contact a helpline, reach out to a friend or get tips on what they can do.

Facebook's algorithm is also getting smarter and is working to "identify posts as very likely to include thoughts of suicide." After those posts have been flagged, a human member of Facebook's team will review them and, if appropriate, reach out to the person with resources.

The recent addition of enhanced safety features builds on a vision Mark Zuckerberg outlined in a 5,700-word letter in February about building a safer, more connected global community.

"In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us," he wrote.