March 8, 2013 at 3:42 AM ET
What impact will Facebook's new redesign have on users' privacy? It's far too soon to tell, but a study published this week by Carnegie Mellon University suggests that prior design changes to the social media site have nudged users into sharing more information than they want to.
The long-term study, which followed thousands of Facebook users and their privacy choices over seven years, found that users steadily shared less information with strangers over time. But it also found that they shared more with friends, which ultimately means they shared more with Facebook and third parties like app developers, which the researchers call "silent listeners."
"People are trying to reveal less publicly ... but in fact are disclosing more to these silent listeners," report author Alessandro Acquisti told NBC News, adding that the research is the first so-called “longitudinal study” to examine Facebook user behavior.
There was one sudden reversal in the trend toward more privacy-centric choices in 2009-10, when users who had been sharing less suddenly began sharing more, the study found. The reversal corresponded to major changes in Facebook's design.
"These findings highlight the tension between privacy choices as expressions of individual subjective preferences, and the role of the environment in shaping those choices," the report says.
Facebook had an entirely different interpretation of the data produced by the researchers.
"Independent research has verified that the vast majority of the people on Facebook are engaging with and using our straightforward and powerful privacy tools, allowing them to control what they're sharing, and with whom they're sharing,” the firm said in a statement. It would not answer additional questions about the study on the record.
Acquisti, along with fellow authors Ralph Gross and Fred Stutzman, examined the public sharing habits of 5,000 Carnegie Mellon students between 2005 and 2011, focusing on how frequently they posted information that any stranger could see, such as birthday, high school, political affiliation, phone, address and interests. The trend lines on open sharing of personal information like birthday and political affiliation fell steadily over the course of the study. For example, the share of users posting birthday information sank from 86 percent to 13 percent.
But for other items, public sharing ticked up in 2010. The percentage of those telling the world their hometown, for example, shrank steadily until 2010, when the percentage nearly tripled, from 13 percent to 33 percent. Those sharing their high school, address, or their favorite music and movies jumped similarly.
The authors argue that Facebook's introduction of additional privacy controls during this time actually led to consumers oversharing. Facebook also introduced pages that could be “liked,” which were linked to users’ interests, schools and other information. Links to these pages were public, by default, increasing the amount of information users shared.
"Through the addition of highly granular privacy controls, Facebook argued that individuals would be better able to share information with audiences of their choice. However, Facebook's new privacy interface proved to be confusing to users, resulting in public retractions and updates by the company," the report said. “Changes implemented by Facebook … countered privacy-seeking behavior by arresting and in some cases inverting the trend.”
The report’s main finding, however, is that two opposing trends are at work on Facebook – users are trying to share less with strangers, but are also sharing more with friends and, as a result, more with Facebook and its partners.
Information shared on Facebook with friends, but not with the general public, is also shared with Facebook, which may choose to release the information to law enforcement or other entities in the future, the authors argue. Such data is also shared with third-party app creators when they obtain a one-time consent from users.
“Users aren’t reminded every time they share something with friends that they might be sharing it with an app, too,” Acquisti said.
The data is also indirectly shared with advertisers. Firms that advertise on Facebook through programs such as its new “custom audiences” platform do not receive personally identifiable information about users, but can target groups of users with particular characteristics, such as new young mothers in California.
“The fact that advertisers don't get direct access to the data is some protection, but it does not change the reality that advertisers can indirectly get at you through the data you are revealing about yourself on Facebook,” Acquisti said. “Is your privacy violated only when someone gets your name and birthdate, or if they know you are pregnant and try to send you advertisements that use this information?”
Jules Polonetsky, director of the Future of Privacy Forum, a privacy-related think tank that is supported by corporations, said he saw more positive than negative in the Carnegie Mellon report.
“I think the most interesting thing about the report is that it shows that Facebook started out as a very public place, and over time it evolved into a place where you primarily share things with your friends, and that's a good thing,” he said.
He disagreed with the description of third-party app developers as “silent listeners,” noting that users give permissions to apps so they can automate tasks that they could do manually, such as finding out if a friend is playing “Words With Friends.” He also said that Facebook is doing a good job at keeping advertisers at arm’s length from the data it has on users, and that the firm has learned it doesn’t need to nudge users into oversharing to make them useful to advertisers.
“Ironically, the success of their advertising model may be dependent on more people doing more and more, and sharing more, but doing it privately,” he said. “The sweet spot Facebook has started finding is users don’t need to share things publicly for it to be able to monetize them in an advertising-supported network.”
The crux of the debate lies along this razor’s edge: Just how private is information shared privately with Facebook? And are users being induced to share more than they want to?
In previous studies, Acquisti’s research has shown that more granular privacy controls actually encourage users to share more information about themselves, and they can also distract users from noticing important privacy choices. He calls this the “paradox of control.”
“I don’t want to say it’s a seduction, but you could call it a nudge,” he said. “…The consequence of providing granular control settings is that users become more comfortable with revealing more and more sensitive data. People focus when they are about to put up a new post on whether they want to share that with friends or friends of friends. But you don’t get the option to say, ‘I don’t want Facebook to see this, or I don’t want a third-party app to see this.’”