
Facebook Manipulates Emotions: Business as Usual for Social Media Giant

Anger erupted over a study in which what could be termed the social media company's "Emotions Lab" tweaked the News Feeds of some of its users.

Facebook attempted to toy with the emotions of nearly 700,000 of its users under the guise of science, reminding users once again they are more product than customer, experts said.

Anger erupted this past weekend over a study in which what could be termed the social media company's "Emotions Lab" tweaked the News Feeds of some of its users, but the study isn't new. In 2012, Facebook’s data science team wanted to nail down the answer to a question still common among academic and marketing researchers, not to mention users: Can Facebook make you happy or sad?


To figure it out, the group secretly altered the News Feed algorithms of the test subjects for one week, ensuring one group saw mostly positive posts and the other mostly negative. Earlier, some experts had suspected that seeing other users post the best parts of their own lives would make people feel left out. The counterintuitive results, published in Proceedings of the National Academy of Sciences in March, found that a positive News Feed inspired positive posts from the test subjects, and vice versa. For Facebook users, however, the real revelation of the study was learning they were all potential lab rats to the world’s largest social network.

Outrage is understandable: secret tests have a long and ugly history. And as a marketing study, this test doesn't appear to have followed the basic academic research safeguards meant to protect both the privacy and the well-being of test subjects. But experts note this shouldn't come as a surprise.

“Facebook could be doing this sort of manipulation all the time, and the fact is they probably are,” Adi Kamdar, activist at the Electronic Frontier Foundation, told NBC News. “We as users should use the publication of this study as a glimpse into the sort of power that Facebook has.”

Facebook declined to comment on the record in response to questions from NBC News. "In hindsight, the research benefits of the paper may not have justified all of this anxiety," study leader Adam Kramer posted on Facebook.


World Cup check-ins, how rumors spread and what Facebook interactions reveal about the health of romantic relationships are a few of the interesting dispatches we’ve seen so far from the Facebook Data Science team, which previously hadn't received much notice on its own. As of Monday, the team's comparatively quiet Facebook page had a modest 307,393 “likes” and a smattering of posts. (Facebook’s official Security page has more than 8,350,000 "likes".)

There's little to indicate the importance and potential power of this Facebook team, launched in 2012 to help monetize the reams of freely volunteered information and make the company more appealing to both advertisers and investors.

“For the first time we have a microscope that not only lets us examine social behavior at a very fine level that we’ve never been able to see before, but allows us to run experiments that millions of users are exposed to,” Cameron Marlow, Facebook’s founding Data Science leader, told MIT Technology Review in 2012. Marlow, who has since left the team, posited at the time, “If [Facebook’s] News Feed is the thing that everyone sees and it controls how information is disseminated, it’s controlling how information is revealed to society, and it’s something we need to pay very close attention to.”

Even before Facebook gained a dedicated team to rake through its data, an exercise the social media site performed around the 2010 elections revealed the potential power of data on the site's users. A June article in the New Republic recounted how political scientists worked with Facebook during that election cycle to create a graphic posted in tens of millions of News Feeds. The reminder showed up to six profile photos of Facebook friends who posted their voting status and included links to polling places. Researchers concluded the shareable graphic inspired 340,000 more votes that day. In other words, Facebook may have the power to drive people to the polls.

Certainly nothing manipulates the emotions of a Facebook user like the dystopian vision of the social network manipulating the outcome of an election. Today, however, the concern is access to an unprecedented amount of data, used to manipulate users who have agreed to nothing beyond a website’s terms of service. As for those fine-print, multi-screen warnings, multiple studies have shown that few users read them and fewer understand them.

This, too, is nothing new.

“Facebook has been unabashedly brash about people’s privacy and about how they use their data," Rey Junco, a social media scholar and fellow at the Berkman Center for Internet and Society at Harvard University, told NBC News. He cited Wikipedia’s Criticism of Facebook entry, a sprawling page with 18 thoroughly footnoted sections, such as treatment of users, privacy concerns and misleading campaigns.

“There’s this general kind of distrust of Facebook,” Junco said. “People feel like they’re being toyed with, and that makes perfect sense.” For many, Facebook is almost synonymous with the Internet, a huge part of everyday life, Junco said. “Users think that they’re the customers, but Facebook’s customers are advertisers, and we’re the product producing the data.”

“Consumers should understand that Facebook is not a neutral platform,” the EFF’s Kamdar said. “Facebook is an online tool that is run by a for-profit company that wants to tweak settings to provide a better product and also make more money. It’s become such an important part of our lives. We have the expectation that it is a public forum and that nothing will be altered or changed in any way, and that isn't totally true.”