
OkCupid Founder: 'If You Use the Internet, You're the Subject of Hundreds of Experiments'

The widespread outrage directed at Facebook for manipulating its users' emotions has the dating site, which has been routinely running experiments on its own users, perplexed.
Source: Entrepreneur.com

Last month, Facebook revealed that it had manipulated the content in the news feeds of more than 600,000 people to see whether the changes would affect their emotional state.

If you weren't actively pissed off, chances are you at least found the study creepy; it attempted to manipulate mood by altering the number of positive or negative posts a person would see. Turns out, most people resent feeling like a lab rat (the clinical language of the study, published in the Proceedings of the National Academy of Sciences, didn't help on this count).

That's a shame, says OkCupid co-founder and president Christian Rudder, because if you use the Internet you are going to be the unwitting subject of hundreds of experiments. "That’s how websites work," he wrote in a blog post published Monday.

Related: Facebook Basically Shrugs Off User Outrage Over 'Emotional' Experiment

Facebook's tepid defense of its 'emotions' study was that it was an attempt to improve "user experience." Rudder goes a step further, arguing that any website that wants to improve its functionality needs to run experiments on its user base. "Most ideas are bad," he wrote. "Even good ideas could be better. Experiments are how you sort all this out."

Putting his money where his mouth is, Rudder revealed in the blog post that OkCupid has been running experiments on its users for years. In light of the recent Facebook debacle, he chose to publicize the details from "a few of the more interesting" ones (which means this is likely the tip of the iceberg as far as the whole OkCupid-experiments-on-its-users thing goes).

In one test, all profile pictures on the site were hidden, which led users to interact more frequently and reveal more details about themselves. In another, profile text was obscured to see how much an individual's looks affected his or her perceived personality (turns out, looks matter a lot). And in a third, the dating site told users that their compatibility rating with other users was either better or worse than the site's algorithm actually predicted. While these results are mildly interesting, what's really fascinating is the casual, almost blasé way Rudder delivers the news that not only has his site been experimenting on uninformed users for years, but most other sites are doing the same exact thing.

Related: Mushy Marketing Ploy: Pizza Hut Joins OKCupid

While all three experiments exemplify how easy it is for sites like OkCupid or Facebook (which have a trove of data on users' personalities, habits, likes and dislikes) to manipulate responses, the third is probably the best example.

In that experiment, OkCupid altered the compatibility scores for potential daters, taking pairs of bad matches (an actual 30 percent match) and telling them they were a great match (a 90 percent match). The research found that if a potential dater was told another user was a good match, he or she was slightly more likely to initiate a message exchange even if the recipient was, in reality, a poor match according to the site's compatibility metrics.
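
For readers curious about the mechanics, here is a minimal, hypothetical sketch of how a display-score experiment like the one described could be wired up. The bucket names and functions are illustrative assumptions, not OkCupid's actual implementation.

```python
import hashlib

# Hypothetical illustration of the display-score experiment described above.
# All names here are assumptions for the sketch, not OkCupid's code.

EXPERIMENT = "displayed_match_score"
BUCKETS = ["control", "inflated"]  # control: show real score; inflated: show poor matches as 90%

def assign_bucket(user_id: str) -> str:
    """Deterministically assign a user to an experiment bucket by hashing their ID."""
    digest = hashlib.sha256(f"{EXPERIMENT}:{user_id}".encode()).hexdigest()
    return BUCKETS[int(digest, 16) % len(BUCKETS)]

def displayed_score(user_id: str, actual_score: int) -> int:
    """Return the compatibility score shown to this user.

    In the 'inflated' bucket, poor matches (around 30 percent) are displayed
    as 90 percent, mirroring the manipulation the article describes.
    """
    if assign_bucket(user_id) == "inflated" and actual_score <= 30:
        return 90
    return actual_score

# Analysis would then compare message-initiation rates between buckets,
# e.g. by logging (user_id, bucket, actual_score, displayed_score, messaged)
# and checking whether the inflated bucket messages poor matches more often.
```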

Doctored compatibility ratings seem like as blatant a manipulation as Facebook's tweaked news feeds, but so far, the public backlash over OkCupid's experimentation appears far more subdued than it was over Facebook's.

Related: Mozilla's CEO Resigns in Wake of Criticism Over Stance on Gay Marriage

There are lots of possible explanations for this, including the fact that OkCupid didn't publish its results in a scientific journal, that the study was more "legitimately useful to users," that the tone was "self-deprecating" rather than clinical, and that OkCupid, as a niche service, is less scary than omnipresent Facebook.

That's not the real issue, though. The real issue is Rudder's argument that if you use the Internet you are, by default, the subject of a host of experiments. Companies don't need to get your explicit approval (they just have to sneak a line into their terms of service agreement), and there is no committee that distinguishes between an ethical experiment and an unethical one with the potential to inflict real damage. That means morally dubious experiments like Facebook's probably happen all the time; we just rarely find out about them because companies aren't obligated to publish their results.

If you don't like it, Rudder is essentially saying, get off the Internet. Despite his "self-deprecating" tone, it's a troubling message.

Related: Tell Us: Will Facebook's Unethical User Experiment Make You Quit?