SAN FRANCISCO — It’s become an almost inevitable problem on the internet: If you build it, they will troll.
That is, if a company builds a successful gaming or social media platform, trolls, extremists and other users spouting noxious speech will find a way to those online locations.
This week, a Twitter user by the name of @lululemew started to find neo-Nazi references on Roblox, a popular online game that has more than 100 million active users worldwide and is popular with children. While such disturbing user names, profiles and content in Roblox aren’t new, they got renewed attention from this woman’s tweets.
“My kid plays Roblox,” she wrote in an attempt to alert the company. “Did you know you have members on your site promoting #WhitePowerExtremist #DomesticTerrorism groups?”
Roblox, like Minecraft, allows users to create avatars and virtual worlds for those characters to roam around in. While most people use the game’s platform to create fun, innocuous characters, some have used it to try to spread hateful messages. The game has become yet another frontier in the ongoing battle over content moderation and appropriate lines of speech on private platforms that are now often spaces where people congregate.
For years, YouTube, Reddit, Instagram, Facebook and many other sites and services have faced similar struggles. “There are so many kids that play that game, and the company should be more careful and be aware of the content that’s on their site,” the woman, who asked that her name be withheld for fear of real-world harassment, said in a phone interview with NBC News.
NBC News quickly located and identified four hateful profile accounts, one of which included clear anti-Semitic language. Another showed a model of a Nazi-era uniform, and two others were Proud Boys-related profiles and included an avatar of its founder, Gavin McInnes. The Proud Boys organization has been identified as a hate group by the Southern Poverty Law Center.
After being informed of the existence of those four accounts, Roblox promptly removed them within hours. However, NBC News, along with help from another Twitter user, was able to find over 100 accounts that featured extremist and racist content, which included longstanding neo-Nazi coded language, phrases like “Jews to Gas!” and user names including “WhiteRaceBestRace.”
Roblox also removed those accounts after they were flagged to the company by NBC News.
"We strive to do everything we can to prevent and detect behavior that violates our Terms of Service and took immediate action to delete these groups,” the company said in a statement emailed by Teresa Brewer, the vice president of communications.
The challenge also lies in maintaining the near-constant vigilance now required to keep extremists off platforms. Extremists have been found on Roblox before, pushing the company to build its own moderation staff for its virtual world. Major tech platforms now routinely employ thousands of moderators to check what is posted.
The company said the Proud Boys pages were primarily used to point people to a related channel on Discord, a voice chat platform popular with gamers and also extremists.
“We employ over 800 human moderators who review millions of pieces of content per month, use automated machine learning technology to monitor communication between players, and empower players in our community, and parents, with features to report bad behavior or content,” the statement continued.
Experts who study extremism say that it is particularly troubling that gaming platforms popular with kids are now vulnerable to noxious ideas in new ways.
“We are just now beginning to understand the ways in which engagement with hate speech radicalizes individuals,” Tamar Mitts, a professor who studies online extremism at the School of International Public Affairs at Columbia University, said in an email.
“The risk of radicalization may be higher for children, who do not have the tools to distinguish between accurate or misleading information.”
Brian Levin, the head of the Center for the Study of Hate and Extremism at California State University, San Bernardino, said that extremists have tried to use video games as a means to spread their message for years.
He pointed to a 2002 video game called “Ethnic Cleansing,” a shooter game where the player assumes the role of a skinhead seeking minority targets.
“In the past you didn’t have the kind of interactivity that you have with the now networked gaming culture,” he said. “So before, for instance you’d have these distinct yet fragmented places where people could go. Now it comes to you.”
The issue of online extremism has recently become the subject of international action after a series of mass shootings that appeared to be connected to internet-based radicalization.
Jason Blazakis, a professor at the Middlebury Institute of International Studies who served for a decade as a top counterterrorism official at the State Department, underscored that the language spewed by extremists can be used to entice new supporters.
“My top concern with platforms like Roblox is that it can be used for recruitment of a younger generation that may be susceptible to the memes, in-group language and jargon that are proliferated by groups like, but not limited to, the Proud Boys,” he emailed.
“Sadly, it is like a game of whack-a-mole and they must rely on the due diligence of the gamers to flag problematic material, users, and groups.”
Blazakis said that after being contacted by NBC News, he took his own advice to heart: he and his four young nephews, who play Roblox regularly, logged in and flagged some in-game content.
“This speaks to the best way to counter the abuse of gaming platforms – the vigilance of the users themselves,” he concluded. “It will take civil society solutions to ensure that the gaming space is safe, free of hate and not being used to further terrorism.”