
How Facebook keeps the porn, gore and hate out of your News Feed

msnbc

"Pedophilia, Necrophilia. Beheadings, Suicides, etc."

Those are some examples of what Facebook's outsourced content monitors must endure while filtering the Internet viscera, according to one who spoke to Gawker's Adrian Chen.  Animal abuse, "bad fights, a man beating another," and "KKK cropping up everywhere," were other examples provided by employees of oDesk, a California-based content-moderation service staffed by employees in India, Mexico, the Philippines and Turkey who look at Facebook content.

The Internet can be a dark and horrible place; on this we should all agree. Despite short-lived exceptions, such as the coordinated spam attack that littered Facebook with porn and gore in November, the social network remains a comparatively clean, well-lighted place. Thanks to a few disgruntled and/or traumatized content monitors in those countries, we now get a peek at how Facebook protects us and, more importantly, itself.

For every photo of a breast-feeding mother or nude drawing clumsily removed from Facebook, content monitors slog through overwhelming evidence of humanity at low tide. For the dirty job of censoring content on the social network that just filed a $100 billion IPO, at least one former oDesk employee told Chen he earned $1 an hour. Amine Derkaoui, a 21-year-old Moroccan man, vented to Chen about the oDesk job he describes as humiliating exploitation of workers, and cleared up some long-held mysteries about the why and the how of Facebook's censoring process.

Derkaoui shared a one-page cheat sheet for moderators with categories such as "Sex and Nudity," "Hate Content," "Graphic Content" and "Bullying and Harassment." Chen notes:

When it comes to sex and nudity, Facebook is strictly PG-13, according to the guidelines. Obvious sexual activity, even clothed, is deleted, as are "naked 'private parts' including female nipple bulges and naked butt cracks." But "male nipples are OK." Foreplay is allowed, "even for same sex (man-man/woman-woman)." Even the gays can grope each other on Facebook.

Facebook is more lenient when it comes to violence. Gory pictures are allowed, as long as somebody's guts aren't spilling out. "Crushed heads, limbs etc are OK as long as no insides are showing," reads one guideline. "Deep flesh wounds are ok to show; excessive blood is ok to show."

Drugs are a mixed bag. Pictures of marijuana are explicitly allowed, though images of other illegal drugs "not in the context of medical, academic or scientific study" are deleted. As long as it doesn't appear you're a dealer, you can post as many pictures of your stash as you want.

Though the cheat sheet is marked current as of January, a Facebook spokesperson told Chen the cheat sheet "provided a snapshot in time of our standards with regards to [one of our] contractors," adding that up-to-date information could be found on the social network at www.facebook.com/CommunityStandards.

Indeed, hours before Chen's original piece posted on Gawker, oDesk monitors received an updated version of the guidelines. Chen pointed out the prominent changes from the original:

  • In version 6.1, body fluids were banned. But in version 6.2, "bodily fluids (except semen) are ok to show unless a human being is captured in the process."
  • In version 6.1, all Photoshopped images of someone were banned, whether they were positive or negative. But in version 6.2, only photoshopped images that show someone in a negative light are banned.
  • All the pages now say "Proprietary and Confidential." Wonder why that is.

Here's what Facebook told msnbc.com in an official statement regarding its content-monitoring procedures:

"In an effort to quickly and efficiently process the millions of reports we receive every day, we have found it helpful to contract third parties to provide precursory classification of a small proportion of reported content. These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service. Additionally, no user information beyond the content in question and the source of the report is shared.We have, and will continue, to escalate the most serious reports internally, and all decisions made by contractors are subject to extensive audits.  

Of course, most of us are too busy complaining about the latest design change to think about who's keeping those bodily fluids blocked from our News Feed. That, Chen notes, is most likely how Facebook wants it. "If users knew exactly what criteria [were] being used to judge their content, they could hold Facebook to them," he writes. "It would be clear what Facebook was choosing to censor according to its policies, and what amounted to arbitrary censorship."

Still, considering what one might find via an accidental click on, say, social news site Reddit, the job of an oDesk content monitor is unenviable. As one monitor told Chen: "Think like that there is a sewer channel, and all of the mess/dirt/waste/s*** of the world flow towards you and you have to clean it."

via Gawker

More on the annoying way we live now:

Helen A.S. Popkin goes blah blah blah about the Internet. Tell her to get a real job on Twitter and/or Facebook. Also, Google+.