June 19, 2012 at 12:50 PM ET
It's easy to click the "report" button on Facebook, but what exactly happens once you do? The social network's safety team has a handy-dandy breakdown.
A post on the Facebook Safety page explains that the social network has multiple teams dedicated to handling user reports around the clock:
Hundreds of Facebook employees are in offices throughout the world to ensure that a team of Facebookers are handling reports at all times. For instance, when the User Operations team in Menlo Park is finishing up for the day, their counterparts in Hyderabad are just beginning their work keeping our site and users safe.
Four teams review reports — the Safety team, the Hate and Harassment team, the Access team and the Abusive Content team. The reason cited in a report determines which team sees it. "For example, if you are reporting content that you believe contains graphic violence, the Safety Team will review and assess the report," the blog post explains.
If a team member finds that reported content violates Facebook's policies, he or she can remove the content, warn the user who posted it, revoke the user's ability to share particular types of content, disable certain features for the user, completely disable the account or escalate the issue to law enforcement.
Alternatively, if content does not violate Facebook's policies, the social network offers tools that let users communicate directly "to better resolve their issues beyond simply blocking or unfriending another user."
You can take a peek at the image at the beginning of this post to see exactly how Facebook directs issues reported by users — if the image is making you squint, you can take a closer look at Facebook's reporting guide here.
Want more tech news, silly puns, or amusing links? You'll get plenty of all three if you keep up with Rosa Golijan, the writer of this post, by following her on Twitter, subscribing to her Facebook posts, or circling her on Google+.