
If Facebook wants to protect user data, it needs to disclose what it has and let people opt out

The company says it's 'outraged' over Cambridge Analytica's actions, but it doesn't want to stop selling access to your information
Image: A man walks past a mural in an office on the Facebook campus in Menlo Park, California, on June 11, 2014. (Jeff Chiu / AP file)

The Cambridge Analytica story has struck a chord because of the company’s connection to Trump and its resemblance to a cartoon villain, and there’s seemingly no doubt that the company violated Facebook’s rules. But Cambridge Analytica’s harvesting of more than 50 million Americans’ personal information without their knowledge or consent is really just a symptom of a larger disease.

Facebook — and countless companies like it — is in the business of collecting and selling access to everyone’s data on a scale that boggles the mind.

Facebook says that it is “outraged” over the brewing scandal and that it “will take whatever steps are required” to prevent a similar incident from happening again. But why should anyone trust it? Its business model, like that of countless other online companies, is to control and then sell to advertisers microtargeted connections to the most massive collection of personal data in human history, encompassing literally billions of people.


That is why Facebook — along with Google, Verizon and other tech giants — is at this very moment vigorously trying to stop a California ballot initiative that would go a long way toward letting users protect themselves from exactly the kind of unwitting data collection that Facebook says it is so upset about.

The LA Times reported on Tuesday that Facebook is plowing $200,000 into a political action committee whose goal is to prevent California residents from voting on a measure that would bring much needed transparency to what data tech companies collect and how that data can be used. As the LA Times described it, the proposed ballot measure "would require companies to disclose what personal information from Californians they collect, buy or share [and] allow many consumers to 'opt out' from those practices."

The reason the proponents of the ballot initiative feel it’s necessary is that Facebook knows everything about you — more than even your closest friends and family members do — and not just the information that you may choose to post to your personal profile. It knows many of the websites you visit, all the sites you are forced to sign into using your Facebook credentials, the drafts of posts you delete, what music you listen to and where you are at all times of the day when the app is open. And there’s no simple way to find out everything it knows, whom it shares that information with or how to change or stop any of it.

The sheer volume of information these companies have on you, as well as your limited ability to stop them from collecting it, is why 89% of all new advertising income is going to Facebook and Google.

Image: A server room at a Facebook data center. (Jonathan Nackstrand / AFP - Getty Images file)

To protect that business model, Facebook in some cases actively attempts to prevent users from knowing exactly what data it has on them and how it gathers that data. That’s why it took a years-long legal fight in Europe to force the company to allow users to download an archive of their profile information. It’s why Facebook refuses to disclose how its friend recommendation system works, despite the fact that it regularly outs people who would prefer to keep parts of their identities siloed and private. And it’s partly why Facebook refuses to release all the political ads bought by Russian actors and countless others.

It’s not just about protecting “trade secrets.” The more that people understand how Facebook actually works, the less they’ll like it. That is why we so often have to rely on leaks to the press to find out what the company is really doing.

This tension between the need for transparency and the drive to maximize profits has reportedly played out internally at Facebook. The company’s chief security officer, Alex Stamos, will reportedly be leaving the company later this year over disputes with the policy and legal teams about transparency around the Russia ads. The New York Times reported that “The security team generally pushed for more disclosure about how nation states had misused the site, but the legal and policy teams have prioritized business imperatives.”


The organizers of the important California ballot initiative are calling on Facebook to drop its opposition to the measure in the wake of the Cambridge Analytica scandal, but you can bet that it won’t. Because of California’s massive population, its laws are a bellwether for the rest of the country, and if the measure passes, it’s likely that Facebook would have to comply nationwide or that other states would soon follow.

Facebook can talk all it wants about how upset it is that its firehose of personal data was misused, but until it provides users the tools to truly control their own data, you can expect something similar to happen again.

Trevor Timm is the executive director of Freedom of the Press Foundation. His writing has also recently appeared in the New York Times, the Guardian, USA Today and the Columbia Journalism Review.

CORRECTION (March 22, 2018, 10:45 a.m.): An earlier version of this article misstated the amount that Facebook has donated to a political action committee opposing the privacy ballot initiative. It is $200,000, not $1 million.