Facebook has developed new artificial intelligence and machine learning technology to fight child exploitation on its platform, the company announced on Wednesday.
“We’ve been working on this issue for a really long time, because unfortunately since the inception of the internet people have been trying to exploit children,” Antigone Davis, head of global safety at Facebook, wrote in a post to the company’s blog.
Davis said Facebook is now using “classifiers,” algorithms capable of determining the nature of images and conversations, to combat “inappropriate interactions with children.”
“Recently, our engineers have been focused on classifiers to actually prevent unknown images, new images,” Davis said in a video accompanying the announcement. “Using a nudity filter, as well as signals to indicate that it’s a minor, we’re actually able to get in front of those images and to remove them.”
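Facebook has not published how these classifiers are combined, but the approach Davis describes, pairing a nudity signal with signals that the subject is a minor, can be illustrated with a minimal sketch. The function names, scores, and thresholds below are hypothetical placeholders, not Facebook's actual model.

```python
# Illustrative sketch only: combine two classifier confidence scores to
# decide whether content should be actioned. Thresholds are invented for
# the example and do not reflect any real production system.

def should_remove(nudity_score: float, minor_score: float,
                  nudity_threshold: float = 0.9,
                  minor_threshold: float = 0.8) -> bool:
    """Flag content only when both classifiers are confident."""
    return nudity_score >= nudity_threshold and minor_score >= minor_threshold

print(should_remove(0.95, 0.85))  # both signals strong -> flagged
print(should_remove(0.95, 0.10))  # a nudity signal alone is not enough
```

Requiring both signals, rather than either one alone, is one plausible way a system could "get in front of" violating images while avoiding false positives on adult content.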
Previously, Facebook used a “photo-matching” technology that could only detect “known” images of child nudity and remove them from the platform. Facebook’s new technology is able to detect “unknown images,” casting a wider net for exploitative content.
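Photo-matching systems of the kind described here are typically built on perceptual hashing: each known image is reduced to a compact fingerprint, and new uploads are flagged when their fingerprint falls close to one in the database. The toy average-hash below is only an illustration of the idea; Facebook's production matching relies on far more robust technology, such as Microsoft's PhotoDNA.

```python
# Toy illustration of hash-based photo matching (not Facebook's actual
# system). A database stores hashes of known images; a new upload is
# hashed and compared, so re-encoded near-duplicates still match.

def average_hash(pixels):
    """Simple average hash over a grayscale pixel grid (list of rows)."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image average.
    return tuple(1 if p > avg else 0 for p in flat)

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def is_known_image(pixels, known_hashes, max_distance=2):
    """Flag an upload if its hash is near any hash in the database."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)

# A 4x4 grayscale "image" and a slightly re-encoded near-duplicate.
original = [[10, 200, 10, 200]] * 4
near_dup = [[12, 198, 11, 201]] * 4
database = {average_hash(original)}

print(is_known_image(near_dup, database))          # True: matches despite noise
print(is_known_image([[128] * 4] * 4, database))   # False: unrelated image
```

The limitation the article describes follows directly from this design: a hash database can only catch images that have already been seen and fingerprinted, which is why detecting “unknown images” requires a classifier rather than a lookup.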
According to Davis, this has enabled the platform to take down “8.7 million pieces of content,” “99 percent of which was removed before anyone reported it.”
Facebook’s most recent technology can also analyze behaviors toward minors on the platform.
“Perhaps what’s most promising is the use of a behavioral classifier,” Davis said. “And what we can do here is actually get in the way of inappropriate interactions with children.”
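Davis does not detail how the behavioral classifier works, but a system that gets "in the way of" an interaction would presumably score conversational behavior and hold risky conversations for review before harm occurs. The phrase list, scoring rule, and threshold below are entirely hypothetical, included only to make the idea concrete.

```python
# Illustrative sketch of a behavioral classifier intervening in a message
# stream. The red-flag phrases, scoring, and threshold are invented for
# this example and are not Facebook's actual signals.

SUSPICIOUS_PHRASES = ("don't tell your parents", "our secret", "send a photo")

def risk_score(messages):
    """Crude behavioral score: fraction of messages with a red-flag phrase."""
    hits = sum(any(p in m.lower() for p in SUSPICIOUS_PHRASES)
               for m in messages)
    return hits / max(len(messages), 1)

def route_conversation(messages, threshold=0.3):
    """Hold a risky conversation for human review instead of delivering it."""
    if risk_score(messages) >= threshold:
        return "held_for_review"
    return "delivered"

print(route_conversation(["hi!", "how was school today?"]))
print(route_conversation(["this is our secret", "send a photo"]))
```

The key design point, consistent with Davis's description, is that a behavioral classifier acts on patterns of interaction rather than on any single image, so it can intervene before exploitative content is ever produced or shared.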
Facebook has previously struggled to moderate the content posted by the more than 2.2 billion users on its platform. In response, the company hired thousands of people to act as moderators and police the content on the site. Facebook CEO Mark Zuckerberg has said that the company is working on developing artificial intelligence technology to help with those efforts.
When Facebook detects child sexual abuse material, the company reports the content to the National Center for Missing and Exploited Children, a nonprofit organization.
Michelle DeLaune, chief operating officer of the NCMEC, explained the center’s role in responding to Facebook’s notifications.
“Our job is then to make that report available to the appropriate law enforcement agency,” DeLaune said. “At this point we have the ability to transfer the information to more than 100 law enforcement agencies around the globe.”
In March, Facebook faced criticism for a survey in which it asked users whether the platform should allow inappropriate messaging between sexual predators and children. Immediate backlash led the company to retract the survey and acknowledge the mistake.
“We run surveys to understand how the community thinks about how we set policies,” Guy Rosen, Facebook’s vice president of product management, tweeted. “But this kind of activity is and will always be completely unacceptable on FB.”
Facebook also said it will join Microsoft and other industry partners next month to begin building tools for smaller companies to prevent the grooming of children online by sexual predators.