As a gunman killed one person and wounded three others at a synagogue near San Diego on Saturday, white supremacists descended on a Facebook page connected to the suspected shooter to express support and push racist propaganda.
A link to the Facebook page was posted to the far-right message board 8chan before the shooting by a user claiming to be John T. Earnest, the white supremacist who has been charged in the attack on the Chabad of Poway synagogue. The page promised a livestream of the attack and an “open letter” filled with anti-Semitic tropes.
Many of the posts on the Facebook page celebrated mass shooters, and the first posts offered guesses as to how many people the shooter would kill. One user posted a meme featuring an AR-15 rifle with the words “here we go.” Another commenter asked for a video stream of the attack, saying he “needs the blood for Santa Muerte,” the saint of death. Both of those accounts remain active.
The shooting adds to a growing list of violent acts with ties to fringe parts of the internet that can also be found on mainstream platforms. Experts have warned that in addition to providing ways for extremists to organize, the internet has played a role in radicalizing some people.
Facebook announced in March that it had decided to ban white nationalism and white supremacy from its platform. Facebook’s terms of service also ban extremist and impersonation accounts, as well as calls to violence.
But despite that ban, 22 of the first 25 accounts that posted praise on the suspect’s page remained active on Facebook as of Monday.
Facebook did not respond when asked for comment. The company removed the page in Earnest’s name, which never hosted a livestream.
Facebook released a statement on Sunday saying that it would remove content that violated its policies.
“Content that praises, supports or represents the shooting violates our community standards and we will remove as soon as we identify it,” a Facebook spokesperson said.
Facebook has removed some users who have promoted white nationalist ideology.
The link between 8chan and Facebook echoed the attack last month on two mosques in Christchurch, New Zealand, where a gunman killed 50 people. Before that attack, a person claiming to be the gunman posted to 8chan a link to a Facebook page that would later livestream the shootings.
Melissa Ryan, an extremism researcher at Hope Not Hate, an advocacy group that tries to counter racism, said that Facebook needs to recognize when extremists from other parts of the internet are congregating on its social network.
“At this point, given all we know about how extremists organize across platforms, there’s no excuse for Facebook,” Ryan said. “Facebook still sees this as a PR problem and not a human lives problem. They have to be shamed to enforce their own policies. Until they actually see this as a human-centered problem, they’re always going to be reacting instead of working to stop these things from happening.”
Other Facebook accounts that remained active were homages to previous mass shooters. One account, which used an alt-right cartoon as a profile picture, used an anagram in its profile name to refer to the accused Christchurch mosque shooter. Another profile picture featured the face of the gunman who killed 58 people in Las Vegas photoshopped onto a video game cover.
Ryan said it’s possible for Facebook’s algorithms to detect when an unusual amount of traffic from one page is coming from a white nationalist hangout, and for a security team to see if that traffic is tied to threats.
“It’s their platform, so they should be able to tell. They should be able to have triggers for messaging,” Ryan said. “If they can target me with an Instagram ad for the competitor of some product I looked at a couple of hours ago, they should be able to figure out when an account is tied to an attack.”