
When Seeing the Most Depraved Side of the Internet Is Your Job

Machines are getting smarter, but human moderators are still crucial in the fight against child exploitation, bestiality and other violent crimes.
Image: A light illuminates the keyboard of a laptop computer. Chris Ratcliffe / Bloomberg via Getty Images

They see the most depraved side of humanity on a daily basis — from videos of children being exploited, to bestiality, suicide, and other brutal acts of violence.

While machines are getting smarter and algorithms even more effective, at the end of the day, every piece of content potentially showing one of these crimes still needs to be reviewed by a human.

Related: Video Fingerprints Could Help Fight Child Porn

"Machines aren’t perfect. They’ve done a pretty good job of relieving humans from certain responsibilities," said Robert Siciliano, an online safety expert. "However, I don’t know if we will ever be in a position where you can completely automate [the review process.]"

That's why people like Henry Soto and Greg Blauert are needed. The pair, who used to work as members of Microsoft's online safety team, saw the darkest side of humanity as they reviewed images showing child exploitation and other heinous crimes.

And after a while, it became too much to handle.

The Worst Side of Humanity

Soto and Blauert filed a lawsuit against Microsoft last week in Washington State.

In it, they alleged they "were not warned about the likely dangerous impact of reviewing" material showing child exploitation and other heinous crimes. They also alleged a work environment that didn't offer them enough wellness support to deal with the atrocities they had witnessed.

According to the complaint, the two men provided Microsoft with a list of recommendations to help ease the emotional burden of their jobs, including "pre-vacation vacations," mandatory weekly meetings with a psychologist, and a spousal wellness program.

However, they said they continued to suffer.

Soto, who said he saw a little girl abused and killed, reported hallucinations, panic and trouble being around his own son. Blauert, who is in treatment for post-traumatic stress disorder, according to the complaint, left the job feeling "depressed, anxious, isolated and withdrawn."

A Microsoft spokesperson said the company disagrees with the allegations in the complaint but has not yet filed a formal response in court.

"Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work," the spokesperson told NBC News in a statement.

Supporting the Internet's 'Watchdogs'

Microsoft has a digital crimes unit devoted to fighting internet scams and deploying its PhotoDNA technology, which creates a digital fingerprint of known sex abuse imagery. That fingerprint allows companies to track the material wherever it's disseminated on the internet and crack down on perpetrators.
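PhotoDNA itself is proprietary, so the sketch below is only an illustrative stand-in: a simple perceptual "average hash" in Python that conveys the general idea of fingerprinting an image so resized or recompressed copies can still be matched. The function names and the use of the Pillow library are assumptions made for illustration, not Microsoft's actual implementation.

```python
# Illustrative perceptual hashing, NOT PhotoDNA (which is proprietary).
# Idea: shrink the image to a tiny grayscale grid, then record which
# cells are brighter than the grid's average. Near-duplicate images
# (resized, recompressed) tend to produce nearly identical hashes.
from PIL import Image

def average_hash(path: str, grid: int = 8) -> int:
    """Return a 64-bit perceptual hash of the image at `path`."""
    img = Image.open(path).convert("L").resize((grid, grid))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")
```

A service could keep the hashes of known abuse imagery in a database and flag any upload whose hash falls within a small Hamming distance of an entry, which is what lets copies be found even after minor alterations.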

Related: Social Media Companies Join Forces to Take Down Terror Content

Blauert and Soto worked in a separate department, known as the online safety team, which operates as part of Microsoft's customer service support.

The typical Microsoft online safety team employee may help review offending images, though it's not their full-time duty, according to a person familiar with the work.

Employees are limited in how long they can do such work each day and must do it in a dedicated office, away from their main desks. This is apparently done to help them dissociate the terrible imagery from the rest of their lives.

Microsoft also uses technology to help reduce the "realism" of the images being reviewed. That means high-resolution photos are reduced to low resolution, and every image is said to be reviewed only at thumbnail size.

The audio and images from a video are separated, so moderators aren't seeing or hearing everything at the same time. Imagery can also be converted to black and white, helping to remove some of the realism, according to a person familiar with the work.
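As a rough sketch of that kind of preprocessing, the snippet below shrinks an image to a thumbnail and strips its color using the Pillow library. The function name, size cap and file paths are hypothetical; the article doesn't describe the actual tooling Microsoft uses.

```python
# A minimal sketch of "realism reduction" as described above: shrink a
# high-resolution image to thumbnail size and convert it to grayscale
# before a moderator ever sees it. Names, sizes and paths are hypothetical.
from PIL import Image

def prepare_for_review(src: str, dst: str, max_px: int = 128) -> None:
    img = Image.open(src)
    img.thumbnail((max_px, max_px))   # cap both dimensions, keep aspect ratio
    img.convert("L").save(dst)        # "L" mode = single-channel grayscale

prepare_for_review("upload.jpg", "review_thumb.png")
```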

Checking in with a psychologist once a month is mandatory, and employees must also attend monthly group meetings to help them learn how to manage the impact of what they've seen. Quarterly psychological training sessions are also offered for employees and managers, teaching them how to recognize the signs of trauma.

Employees can also take a break or a day off, or ask not to do this work at all, according to the source.

But as traumatic as it is, Blauert and Soto's complaint recognized the importance of what they had done.

"Throughout their careers at Microsoft, both plaintiffs were instrumental in saving children's lives and providing evidence for successful prosecutions," the complaint said.

The Human Touch

With ISIS propaganda spreading on social networks, and a slew of other high-profile violent content posted recently, a human element in the review system is still crucial.

Twitter has reported progress in weeding out terrorist content. After the social network increased the staff on its abuse reporting team in 2015 and leveraged "proprietary spam-fighting tools," it was able to suspend hundreds of thousands of accounts with apparent ties or sympathies to terrorism.

At Facebook, live video is surging in popularity and has been used to capture heart-wrenching yet important moments, such as when Philando Castile was shot dead by a Minnesota police officer.

Related: Police Shootings Test New Era of Violent Social Media

In a video broadcast live on Facebook earlier this month, an 18-year-old man, who was said to have mental health challenges, cowered in a corner as a group of teenagers kicked him, slapped him and cut his hair until his scalp bled.

In the case of Castile, the graphic video was briefly removed and then reinstated, as millions of people expressed outrage at what appeared to be an unjust shooting.

It was painful to watch, but it served an important purpose. Millions were taken inside the car as Castile's girlfriend, Diamond Reynolds, broadcast video while pleading with the police officer.

"Please officer don't tell me that you just did this to him. You shot four bullets into him, sir. He was just getting his license and registration, sir," she said. The entire incident unfolded as her 4-year-old daughter was in the backseat.

The Castile video stayed up, with millions recognizing the value Facebook Live had to offer in that moment. The torture video of the Chicago teen did not.

A Facebook spokesperson said the torture video was taken down because the social network does "not allow people to celebrate or glorify crimes."

"In many instances, though, when people share this type of content, they are doing so to condemn violence or raise awareness about it," the spokesperson said in a statement. "In that case, the video would be allowed."

A 12-year-old girl live-streamed her suicide last month, putting a noose over a tree branch, stepping into it and killing herself. The video originated on Live.me but eventually made the rounds on Facebook and YouTube before it was removed.

That's in addition to the photos and other incendiary items posted on Facebook, YouTube and other platforms that go well beyond merely offensive and actually violate community guidelines.

A person familiar with the work described it as "difficult" but said Facebook also has resources available to those who do it, including psychological and wellness support.