
Opinion: Facebook Experiment Used Silicon Valley Trickery

Facebook said it had users' consent to manipulate their feeds, but that consent fell short of the requirements of informed consent.

Sometimes even carefully reading the fine print isn't enough. You can still wind up enrolled in a Silicon Valley study without even knowing it.

In a study recently published in the Proceedings of the National Academy of Sciences, a Facebook scientist teamed up with two academics to subtly tweak the news feeds of nearly 700,000 Facebook users. The researchers eliminated "negative" messages from some users' news feeds, making the feeds just a bit sunnier than they otherwise would have been. For other subjects, the researchers wiped out "positive" messages, making the feeds just a little bit more depressing. As a control, they also deleted a random selection of messages.


No doubt there was a small handful of Facebook users who took the trouble to venture three clicks deep into the website's terms and conditions, where they were, at best, vaguely warned that they might become research subjects. No doubt a few of the especially dogged scrolled down to near the bottom of the page, where it states that Facebook reserves the right to use your information for "internal operations, including troubleshooting, data analysis, testing, research and service improvement." But even those select few who made it that far would almost certainly be surprised to discover that they would be potential subjects in a Facebook experiment on manipulating users' moods and emotions.

Why would Facebook permit these researchers to do such a thing? Why manipulate Facebook users' news feeds? The whole exercise, according to the researchers, was to figure out whether the tinkering would also affect Facebook users' emotional state — to see whether happiness and depression are "contagious" through Facebook's social networks.

This should send a shiver down the spine of any Facebook user, or anyone thinking about becoming one. Even if everybody agreed that the question was one of great scientific importance (which isn't self-evident), and even if the methodology of the study were airtight (which is far from certain), the experiment should never have been performed. It is a violation of the rights of research subjects, and it does not come close to meeting the conditions required when research involves personal information.

The question of whether or not an experiment is ethical hinges upon the question of "informed consent." Generally, this means that a subject in a study needs to have basic information about the study he's participating in, understand the nature of the experiment and its risks and benefits, and have the ability to withhold his consent without fear of harm or retribution.

The authors of the study argue that they obtained subjects' consent: Their manipulation of Facebook users' emotions was "... consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research." This is nonsense; it's not informed consent. It is an old Silicon Valley trick for systematically eliminating its customers' legal rights.

If you've ever clicked "I agree" when installing a new piece of software, you're almost certainly aware of the scheme. As a condition of using a service that you desperately want to use (and often have already paid to use), you need to agree to a huge block of legalese that, more often than not, you won't read. This may be legally enforceable, but it's leagues away from informed consent.

What went on in this sub-rosa study is the tip of a growing iceberg. Innumerable Silicon Valley companies are collecting terabytes of behavioral and biomedical data about you right now — Google knows which sites you visit on the web, Fitbit knows your exercise habits, 23andMe has your genome — and many of these companies are doing research on you, whether you explicitly agree to it or not.

As Facebook has proven, these experiments aren't always just passive, observational studies. They are beginning to cross into territory where companies actively try to manipulate you in their own interests. Even, apparently, if it harms you. After all, Facebook didn't seem terribly worried about whether eliminating good news from a user's news feed might send him into a rage.

When entities feel entitled to experiment on human beings without informed consent and without accountability to anyone but themselves, that's when bad things happen to research subjects. And it's now clear that if we don't insist on greater regulatory oversight of their "research," we are likely to be next.