
Breaking up Facebook won't solve any of the problems that make its users angry

None of the proposals for government intervention would resolve the concerns raised by the Cambridge Analytica scandal
Image: Mark Zuckerberg appears before the House Energy and Commerce Committee in Washington on April 11, 2018. (David Butow / Redux for NBC News)

How do you solve a problem like Facebook? Its fiercest critics attack the $500 billion social network as a corporate supervillain that has built a super-profitable, competition-killing monopoly using an immoral business model based on selling its users’ personal data. If the basher is a liberal, they might also slam Facebook for letting its mega-platform be politically weaponized in the 2016 U.S. elections; if a conservative, the critic might complain of bias and the suppression of “politically incorrect” speech.

But before Washington breaks up Facebook, heavily regulates it or nationalizes it — all various answers suggested by critics — maybe it would be wise to make sure there’s actually a serious problem demanding government intervention.

After all, it wasn’t so long ago that Facebook was almost universally considered one of the crown jewels of America’s post-recession economy. It’s still one of the great entrepreneurial stories of all time: From a Harvard dorm room to a global tech titan with two billion users in just over a decade. Facebook also directly employs over 25,000 workers with median annual pay of $240,000. And more than 60 million businesses, many of them small, use its advertising platform to cheaply and effectively target potential customers.

Then there’s the value that Facebook provides at no cost to its two billion users, which is a wickedly hard economic problem to calculate. But there’s little disagreement that the resulting “consumer surplus” is pretty massive.


None of that good stuff, of course, means that Facebook should be above the law or beyond the reach of government regulators. Yet being big, powerful and successful is an insufficient reason for action, even if it makes some people uncomfortable.

Take the first-order question of whether Facebook is a monopoly: It obviously is not. Facebook had an estimated fifth of the online advertising market in 2017, a share some analysts expect to decline this year in the face of growing competition from Amazon and Snap. And it’s not even the largest online advertiser: That position belongs to Google, with nearly 40 percent of the market (a share that is also expected to shrink a bit).

Moreover, the flywheel-like network effects that have driven the growth of big tech firms such as Facebook, Google, and Amazon — the more users a platform attracts, the more its appeal grows, and so on — mean there’s a certain inevitability to having a few very large platforms. Although that might allow them to squash future competitors and thus hurt consumers, the evidence that they are doing so is weak at best. Indeed, the mega-platforms spend heavily on research and development, compete intensely with each other, and provide an appealing off-ramp to potential innovators in the form of pricey buyouts of their startups.


But even if Facebook were a classic monopoly in terms of market dominance, the current standard for antitrust looks for evidence of consumer harm — of which there really isn’t any, given that the company supplies content at no price to consumers. And even a significant antitrust action, such as forcing Facebook to split off Instagram or WhatsApp, wouldn’t solve any concerns about privacy, “fake news” or political bias.

But isn’t “deviously” selling user data a most definite consumer harm that should prompt government action? Perhaps, if that were actually how Facebook did business. But, as Zuckerberg correctly kept repeating to congressional panels recently, Facebook doesn’t sell data; rather, it collects data to help advertisers target certain groups. And it has taken numerous steps to prevent the sort of data leakage at the heart of the Cambridge Analytica scandal.

Of course, as The Wall Street Journal put it, “You didn’t read Facebook’s previous data privacy policy, and you probably won’t read this new one either.” It’s funny, but no joke: It’s questionable how much users really care about their personal data as long as no one is using it to elect Donald Trump or drain their bank account. One bit of evidence for this is the study “How Consumers Value Digital Privacy,” which surveyed 1,579 internet users and found that “85% were unwilling to pay anything for privacy on Google.” (And, of the 15% of Google users willing to pay, the median acceptable fee was a paltry $20 or so per year.)


There’s another bit of evidence: Despite the Cambridge Analytica scandal and the raft of media stories attacking Facebook’s data handling, most users so far appear to be sticking with the social network.

Maybe at some deep level users really do understand and acquiesce to the economic bargain that lies at the heart of the internet economy: The exchange of free or subsidized content for advertising targeted using personal data.

Heavy regulation of that exchange could lead to a more subscription-based internet with tiered pricing for different levels of service. Or it could lead to more market power for Facebook, since it and the other incumbent tech giants already have plenty of users constantly generating oodles of data, and new regulation is more likely to raise barriers to entry for potential competitors.


It’s not crazy to think some form of industrywide self-regulation might actually work: Users give Facebook its true power over advertisers and publishers, so the company has a big market incentive not to alienate them. But, beyond that, if government were to act, maybe the most straightforward move would be to compel social media companies to let users port their friends’ names and emails when signing up for another social network, so that a potential competitor can build on that data. Other regulation-light ideas include a special court to adjudicate tech issues, like the ones we have for taxes.

If you’re really worried about Facebook’s power, remember that it’s not the end of history when it comes to any of the big tech companies, as tech analyst Benedict Evans has noted. The oldest story in business is that dominant firms don’t stay dominant forever, although predicting where the challenge will come from and how it will play out is hard. (The cover of Fortune magazine once declared Yahoo the winner of the “search engine wars.”) That’s why government regulation that de facto targets individual companies is often less effective than intended: The government is no better at predicting the future than anyone else.

As with doctors, “First, do no harm” should always be top of mind for policymakers and regulators.

James Pethokoukis is an economic policy analyst at the American Enterprise Institute. He is also an official CNBC contributor.