
Facebook has doubled down on groups — now it's looking to clean them up

Among other things, Facebook uses AI and machine learning to proactively identify and remove posts and groups that break the rules — even if the groups are private.
Facebook CEO Mark Zuckerberg speaks during the annual F8 summit at the San Jose McEnery Convention Center in San Jose, Calif., on May 1, 2018. Josh Edelson / AFP - Getty Images file

Facebook offered new insight on Wednesday into policies that ramp up enforcement of its community standards for the tens of millions of active groups on its platform.

“Being in a private group doesn’t mean that your actions should go unchecked,” Facebook’s vice president of engineering, Tom Alison, wrote in a company blog post. “We have a responsibility to keep Facebook safe, which is why our Community Standards apply across Facebook, including in private groups.”

A company spokesman confirmed that the announcement is part of an acceleration of Facebook's enforcement policy for groups, originally announced in April.

The announcement follows years of criticism that Facebook heavily promoted groups but did little to curtail fringe organizations that used the groups feature to traffic in hate, organize violence, or spread misinformation and conspiracy theories.

Facebook CEO Mark Zuckerberg announced a renewed focus on its Groups feature in 2017. By 2019, the company said more than 400 million users were in “meaningful groups.” Facebook also recently launched an advertisement campaign to promote its groups feature.

Alison wrote that Facebook was using artificial intelligence and machine learning to proactively identify and remove posts and groups that break the rules, and promoting tools that would hold group administrators more accountable for posted content.

"There’s a misperception that private groups go unchecked just because they aren't visible to the public." Facebook's groups product manager, Nir Matalon, told NBC News. "In reality, our proactive detection technology can find violations even if no one in the group reports it. We also have barriers in place to catch bad posts from people who have broken our rules before and are holding admins more accountable for what their members share."

While Facebook has taken steps in recent months to ban white nationalism and reduce the visibility of health misinformation, other popular fringe groups, including those peddling conspiracy theories like QAnon as well as anti-vaccination groups, have been allowed to remain. Alison’s post touched on — without definitively answering — what would trigger a group’s removal.

“Deciding whether an entire group should stay up or come down is nuanced,” Alison wrote. “If an individual post breaks our Community Standards, it comes down, but with dozens, hundreds, or sometimes thousands of different members and posts, at what point should a whole group be deemed unacceptable for Facebook?”

Factors include a group's subject matter, such as its name or description, and whether its administrators and moderators allow rule-breaking posts, Alison said.

Facebook declined to comment on whether the crackdown had resulted in group closures or to provide details on what kinds of groups may have been affected by the new policies. But for weeks, moderators of well-known fringe groups have been warning members that Facebook has been clamping down on their posts.

Last week, Larry Cook, the administrator for one of Facebook’s largest anti-vaccination groups, told his members that Facebook had been uncharacteristically vigilant in moderating the group's posts and notifying him about content that violated Facebook's policies.

“One day, this group could just vanish - prepare for that,” Cook posted.

Not everyone is convinced that a major crackdown is coming.

“Same stuff, different day,” said Megan Squire, a computer science professor at Elon University who tracks online extremism on Facebook. “Just in the past month, I reported groups calling for a global purge of Islam, extermination of people based on religion, and calling for violence through a race war, and Facebook’s response was that none of these groups was a violation of community standards.”

“They’ve been saying ‘We’re really going to get it right this time’ for literally years,” Squire said.

Facebook simultaneously announced that it is updating its group privacy settings, eliminating the “secret” category and relabeling groups as either “public” or “private.” The change was made for clarity, the company spokesman said.