
The Facebook Papers: Documents reveal internal fury and dissent over site’s policies

Thousands of leaked documents highlight employees’ disillusionment with the spread of misinformation and calls to violence.
Frances Haugen, who worked as a Facebook product manager until May, has come forward as a whistleblower and provided internal Facebook documents to Congress and the Securities and Exchange Commission. (Doug Chayka for NBC News)

Hours after the Jan. 6 assault on the U.S. Capitol, Mike Schroepfer, Facebook’s chief technology officer, posted on the company’s internal message board.

“Hang in there everyone,” he wrote. Facebook should allow for peaceful discussion of the riot but not calls for violence, he added. 

His post was met with scathing replies from employees who blamed the company for what was happening. 

“I’m struggling to match my values to my employment here,” an employee wrote in a comment. (The employee’s name was redacted in a version seen by NBC News.) “I came here hoping to effect change and improve society, but all I’ve seen is atrophy and abdication of responsibility.” 

Another employee asked, “How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today?”  

The comments openly challenged the company’s leadership with a not-so-subtle message: Facebook’s well-documented problems in abetting violent polarization and encouraging the spread of misinformation weren’t getting fixed, despite the company’s investments and promises. 

The comments appear in thousands of pages of internal Facebook documents, given to NBC News, that detail the company’s internal debates over the societal impact of its platforms. Together the documents offer the deepest look yet provided to outsiders at the internal workings of the world’s largest social media company. 

They are a small fraction of the internal communications over the past several years at Facebook, where employee message boards that started as a way to embrace transparency have become an outlet for reflection and advocacy on the impact of social media.

The documents show employees — many of whom were hired to help Facebook address problems on its platforms — debating with one another on internal message boards free of public relations spin. Many tried to figure out how to turn stalled bureaucratic wheels and steer a company that now has so many departments that employees sometimes aren’t aware of overlapping responsibilities. Some employees defended management, with one calling Facebook executives “brilliant, data-driven futurists like many of us.” 

The documents were included in disclosures made to the Securities and Exchange Commission, or SEC, and provided to Congress in redacted form by legal counsel for Frances Haugen, who worked as a Facebook product manager until May and has come forward as a whistleblower. Digital versions of the disclosures — with some names and other personal information redacted — were obtained by a consortium of news organizations, including NBC News. Most of the documents are digital photographs of company material on computer screens. 

The news consortium is making at least some of the disclosures public beginning Monday. The Wall Street Journal reported some of the disclosures earlier. 

Haugen alleges in letters to the SEC Office of the Whistleblower that Facebook executives up to and including CEO Mark Zuckerberg have misled investors for years, giving them a false picture of the reality inside the company about subjects like Facebook’s user base and its record on human rights. She wrote at least eight separate letters, and her attorneys provided the internal documents to the SEC in support of her allegation that executives’ public statements don’t match the company’s internal findings. In the letters, she also offered her help to the SEC if it were to investigate potential violations of securities laws. 

But more broadly, Haugen has kick-started a debate about Facebook’s impact on society, both in the U.S. and abroad. 

“Facebook did not invent partisanship. They did not invent polarization. They didn’t invent ethnic violence,” Haugen said in a call with reporters this month. “But the thing that I think we should be discussing is what role, what choices did Facebook make to expose the public to greater risk than was necessary?” 

Haugen repeated her allegation against Facebook executives in testimony before Congress this month. 

“The company intentionally hides vital information from the public, from the U.S. government and from governments around the world,” she told the Senate Commerce Subcommittee on Consumer Protection. 

She is scheduled to testify Monday before a committee of the U.K. Parliament examining online safety. 

Zuckerberg has pushed back against Haugen’s allegations. “At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That’s just not true,” he said in a Facebook post Oct. 5. He also said Facebook was being punished for trying to study its impact on the world. 

Facebook spokesperson Drew Pusateri defended the disclosures to investors, saying the company is confident that it has given investors the information they need to make informed decisions. 

“We make extensive disclosures in our SEC filings about the challenges we face, including user engagement, estimating duplicate and false accounts, and keeping our platform safe from people who want to use it to harm others,” he said in an email. 

“All of these issues are known and debated extensively in the industry, among academics, and in the media,” he said. Facebook is ready to answer regulators’ questions and will cooperate with government inquiries, he said.  

According to the disclosures from Haugen: 

  • The company spends considerable time and resources studying how to solve problems like misinformation and incitement, but it has declined in some cases to implement potential solutions put forward by its own researchers. Employees complain that this is sometimes because Facebook’s Washington-based policy team has veto power over decisions. Joel Kaplan, Facebook’s global head of public policy, has repeatedly defended his influence, saying he pushes for analytical and methodological rigor about subjects such as the algorithms that power Facebook products. 
  • A change to Facebook’s news feed in 2018 intended to bring friends and family members closer together in meaningful ways often had the opposite effect, internal researchers wrote. Posts spread more easily if they included outrage or misinformation, causing an online “social-civil war” abroad in places like Poland. 
  • Engineers and statisticians struggle to understand why certain posts and not others get traction through re-shares on Facebook and how to fix the “unhealthy side effects.” In 2019, an internal researcher wrote: “We know that many things that generate engagement on our platform leave users divided and depressed.” 
  • Facebook has struggled to filter out many posts that violate its rules. Documents say the company’s automated systems deleted only about 2 percent of hate speech as of 2019 and, as of this year, less than 1 percent of content trying to incite violence. Facebook said in a blog post this month that the documents understate the company’s effectiveness and that the prevalence of hate speech — how often users actually view it, rather than the number of posts — has dropped. (A short worked example after this list illustrates the difference between the two measures.) 
  • Many documents highlight Facebook’s failure to police its platform outside the U.S., including in Myanmar and Sri Lanka, where the company has issued apologies for its actions contributing to physical violence against religious or ethnic groups. The documents describe translation issues and a lack of local cultural knowledge. 
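
The hate speech dispute turns on two different measures: the takedown rate (the share of violating posts that automated systems remove) and prevalence (the share of everything users actually view that is violating). The arithmetic below is a minimal sketch with entirely hypothetical numbers, invented for illustration and not drawn from Facebook’s documents, showing why a low takedown rate and a falling prevalence figure can both be true at once:

# All quantities here are hypothetical, chosen only to illustrate the metrics.
hate_posts = 1_000             # violating posts created in some period
removed_automatically = 20     # a roughly 2 percent takedown rate, as cited

views_per_surviving_post = 50  # reach of each post left up (e.g., after demotion)
total_content_views = 10_000_000

takedown_rate = removed_automatically / hate_posts
# 0.02, i.e., 2 percent of violating posts are removed

prevalence = ((hate_posts - removed_automatically)
              * views_per_surviving_post) / total_content_views
# 0.0049, i.e., about half a percent of all views are hate speech

# Facebook's argument is that prevalence can keep falling (demotion shrinks
# views_per_surviving_post) even while the takedown rate stays low.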

It isn’t clear whether the SEC is investigating Facebook or whether it would see enough material in the disclosures to warrant an investigation of whether the company could have misled investors. The SEC declined to comment. The commission isn’t required to take any action on whistleblowers’ tips, and when it conducts investigations, it does so on a confidential basis as a matter of policy. In an annual report, the SEC said it received over 6,900 whistleblower tips in the fiscal year ending September 2020. 

Several securities law experts said it wouldn’t be easy to prove wrongdoing. 

“Regulators like clean cases, and they like where someone is on tape doing something wrong,” said Joshua Mitts, a securities law professor at Columbia University. Haugen’s allegations are hardly a “clean case,” he said. 

Facebook pushback

Facebook’s public relations chief last week said Haugen’s disclosures were an “orchestrated ‘gotcha’ campaign” guided by her public relations advisers. 

“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” Facebook’s vice president for communications, John Pinette, said in a tweet ahead of the release of the Haugen disclosures. 

“Internally, we share work in progress and debate options. Not every suggestion stands up to the scrutiny we must apply to decisions affecting so many people,” Pinette said. 

Haugen has gotten help from experienced lawyers and public relations advisers. A firm run by Bill Burton, an Obama White House spokesperson, is handling media requests, and Haugen is represented by lawyers from Whistleblower Aid, a nonprofit organization. 

The disclosures made by Haugen’s attorneys illustrate a roiling internal debate at Facebook at the same time it has been in a harsh external spotlight, with congressional hearings, privacy investigations, antitrust lawsuits and other scrutiny by outsiders. 

And the upheaval may prove a bigger threat than any external scrutiny, because Facebook relies for its success on being able to attract and keep some of the world’s top software engineers and technologists. If the company can’t attract, retain and motivate talented employees, it could lose its ability to compete effectively, it said in its most recent annual report in January. 

A Facebook employee wrote on an internal message board on Jan. 6: “We have been dealing with questions we can’t answer from our friends, family, and industry colleagues for years. Recruiting, in particular, has gotten more difficult over the years as Facebook’s ethical reputation continues to deteriorate (all while our technical reputation continues to increase).” 

Facebook said in a statement that 83 percent of its employees say they’d recommend it as a great place to work and that it has hired more employees this year than in any previous year. 

Jan. 6 impact 

The internal turmoil over the Jan. 6 attack was apparent throughout the documents, beyond Schroepfer’s internal post. (Schroepfer plans to step down to a part-time role at Facebook next year.) 

According to a Facebook document, the riot so tested the company’s ability to halt incitement to violence that it reinstituted 25 safeguards it had put in place around the 2020 presidential election to minimize hate speech and other content that violated the platform’s rules. The effort was called “Break the Glass.” 

Later, a Facebook employee published an examination of the lead-up to the Capitol attack on the company’s internal message board, with scathing findings about its failure to stop the growth of “Stop the Steal,” the conspiracy theory movement promoted by then-President Donald Trump and his followers. Believers in the theory falsely assert that President Joe Biden stole the election. 

Facebook was alerted to the first such group on Election Night in early November and disabled it because of hate speech, calls to violence and incitement in the comments, the investigation found. But in the months that followed, new conspiracy-theory groups flourished.

Facebook watched the movement’s “meteoric growth rates,” and related groups were among the fastest-growing groups on all of Facebook, according to some of the documents. But managers failed to act because, they said, they were looking at rule violations one by one and didn’t see the big picture. 

“Because we were looking at each entity individually, rather than as a cohesive movement, we were only able to take down individual Groups and Pages once they exceeded a violation threshold,” the report said. Facebook realized that “Stop the Steal” was a cohesive movement only after the Capitol attack, the report said. 

Facebook seemed unable to understand the dynamics, influencers, tactics and ultimate intentions of the conspiracy movement, even as it operated in plain sight, the documents suggest. 

“This sort of deep investigation takes time, situational awareness, and context that we often don’t have,” the internal report said. 

Facebook’s enforcement was “piecemeal,” the team of researchers wrote, saying “we’re building tools and protocols and having policy discussions to help us do this better next time.” 

In a statement responding to questions about the research, Facebook said it has spent years building defenses and expertise to stop interference in elections. It said some of its tools are so blunt — equivalent to shutting off an entire town’s roads, it said — that they’re for emergencies only, not normal conditions. 

“It is wrong to claim that these steps were the reason for January 6th — the measures we did need remained in place well into February, and some like not recommending news, civic, or political Groups remain in place to this day,” Facebook said. “These were all part of a much longer and larger strategy to protect the election on our platform — and we are proud of that work.” 

Causing ‘social-civil war’ 

Another set of Haugen’s documents describes how the computer algorithm behind Facebook’s news feed — the formula that determines what posts people see and in which order — led to unintended consequences over months and years. 

Facebook announced that it would rewrite the algorithm in January 2018, saying it would emphasize “meaningful social interactions” and give more weight to comments, reactions and re-shares among friends, rather than posts from businesses and brands. 
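
The weighting scheme can be sketched in a few lines of code. This is a minimal illustration, not Facebook’s actual formula: the Post fields, the weight values and the msi_score function are all invented assumptions. It shows how a ranking that pays more for comments and re-shares than for passive reactions tends to favor posts that provoke responses:

from dataclasses import dataclass

@dataclass
class Post:
    likes: int      # passive reactions
    comments: int   # deeper interactions
    reshares: int   # deepest interactions

def msi_score(post: Post) -> float:
    # Hypothetical "meaningful social interactions" style weighting: comments
    # and re-shares count far more than likes. The weights are invented.
    return 1.0 * post.likes + 5.0 * post.comments + 15.0 * post.reshares

feed = [
    Post(likes=200, comments=3, reshares=1),   # widely liked, little discussion
    Post(likes=40, comments=60, reshares=25),  # provokes replies and re-shares
]
ranked = sorted(feed, key=msi_score, reverse=True)
# The second post ranks first (score 715 vs. 230) despite far fewer likes --
# the dynamic researchers said rewards outrage-driven content.

Under such a scheme, a post does not need to be liked to win distribution; it only needs to provoke interaction, which is the pattern the internal researchers flagged.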

By the next year, the changes had reverberated throughout European politics. 

“Political parties across Europe claim that Facebook’s algorithm change in 2018 [regarding social interactions] has changed the nature of politics. For the worse,” an employee wrote in an April 2019 internal post. Facebook was responsible for a “social-civil war” in online political discourse in Poland, the person said, passing on a phrase from conversations with political operatives there. (The Facebook employee doesn’t name the political parties or the operatives involved in the “social-civil war” or what issues were at the forefront. A Polish election later that year focused attention on expansion of the welfare state, European integration and gay rights, Reuters reported.) Extremist political parties in various countries celebrated the way the new algorithm rewarded their “provocation strategies” for subjects such as immigration, the Facebook employee wrote.

Studying the impact of the algorithm change became a priority for many of the economists, statisticians and other researchers at Facebook who study the platform, the documents show. A study posted internally in December 2019 said Facebook’s algorithms “are not neutral” but instead value content that will get a reaction, any reaction, with the result that “outrage and misinformation are more likely to be viral.” 

“We know that many things that generate engagement on our platform leave users divided and depressed,” wrote the researcher, whose name was redacted. 

In April 2020, managers presented Zuckerberg with a series of proposed changes to the algorithm, according to a written summary of the meeting included among Haugen’s disclosures. The summary says Zuckerberg rejected some of the proposed changes, including an idea to reduce re-shares — posts that get shared again and again, which researchers found were often misinformation. 

“Mark doesn’t think we could go broad” with the changes, employees wrote afterward in the summary, although the idea had already been implemented for content about health and politics. “We wouldn’t launch if there was a material tradeoff with MSI impact,” they wrote, using the initialism for “meaningful social interactions,” a measure of engagement. 

Zuckerberg defended his decisions this month, saying in his Facebook post that the introduction of the MSI system in 2018 led to fewer viral videos, “which we did knowing it would mean people spent less time on Facebook, but that research suggested it was the right thing for people’s well-being.”

“Is that something a company focused on profits over people would do?” he wrote. 

In a statement Friday, Facebook said it’s not responsible for existing problems in society. 

“Is a ranking change the source of the world’s divisions? No,” the company said. “Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed.”

Facebook also said that it has spent $13 billion since 2016 to counter bad content and that it employs 40,000 people to work on safety and security. It said it’s continuing to make changes to its platform, such as running tests to reduce political content. 

An employee who worked on platform integrity quit in frustration in August 2020, citing Facebook’s unwillingness to implement safeguards against conspiracy theories like QAnon. 

“We were willing to act only *after* things had spiraled into a dire state,” the employee wrote in a farewell note to colleagues. 

Facebook has made a series of announcements in the past year concerning efforts to limit the spread of fringe and conspiracy groups, including a move in March to change its recommendation system.

In another farewell note in December, a data scientist who was quitting listed what he would miss: interesting work, friendly colleagues and the “amazing” pay. 

Then came the parts he wouldn’t miss. 

“Unfortunately I don’t feel I can stay on in good conscience,” wrote the person, whose name was redacted. Facebook “is probably having a net negative influence on politics in Western countries,” the person wrote, adding that executives didn’t appear committed to fixing the problem. “I don’t think that I can substantially improve things by staying.” 

Facebook said it disagrees that it’s a net negative. “Facebook helps people connect with friends and family and helps businesses around the world thrive,” it said. 

Other languages 

Sometimes the company lacks the systems to enforce its own rules, especially among the large proportion of people who use Facebook in languages other than English, according to some of the documents shared with the SEC and Congress. 

Several documents focus on the Middle East and North Africa, including a December presentation detailing, in part, how the company’s tools for taking down Arabic-language content with suspected ties to terrorism were wrong 77 percent of the time, “resulting in a lot of false positives and a media backlash.” The full presentation on platform “integrity” problems runs more than 50 pages when printed. 

In a statement, Facebook said the measure seemed to be a mischaracterization because it included content related to Hamas and Hezbollah — organizations that some people in the Middle East wouldn’t consider tied to terrorism, but that the U.S. government has on its list of “foreign terrorist organizations.” Facebook said it has legal obligations to remove that content, as well as a policy against it. 

Other reported problems are language-based. In a separate analysis published in January, a Facebook researcher said that when an Afghan user tries to report hate speech, the instructions are mostly in English and not in Pashto, one of Afghanistan’s national languages. Facebook’s Community Standards, the list of content rules for users, also aren’t translated. 

“There is a huge gap in the hate speech reporting process in local languages in terms of both accuracy and completeness of the translation of the entire reporting process,” wrote the author of the analysis, a Facebook researcher whose name was redacted.

Facebook said it reviews content in Pashto and Dari, the two Afghan national languages. In August, the company also said it was putting additional resources into services for Afghan users, including security controls for people fearing the Taliban’s takeover of the government. 

Other documents show that when Facebook invests in more careful monitoring of a country’s social media activity, it can mitigate viral misinformation and dangerous hate speech. For example, a document outlines how Facebook convened almost 300 people from 40 teams to focus on the April 2019 elections in India, tackling political misinformation and fending off what it characterized as “bad regulation” for social media companies. The effort, which included the creation of a temporary “operations room” in Singapore, resulted in a “surprisingly quiet, uneventful election period,” the post-election analysis states.

The cache also includes the departing memo shared by former Facebook data scientist Sophie Zhang, who worked on Facebook’s site integrity team from January 2018 to September 2020. In the post, previously reported by BuzzFeed News, she outlined how she believed Facebook was ignoring manipulation of the platform by political leaders in India, Ukraine, Spain, Brazil, Bolivia, Ecuador and other countries. 

Facebook said it’s working to improve its capabilities around the world, but recognizes that it still has work to do. 

“In the last two years, we’ve made investments to add more staff with local language, including Arabic, as well as country and topic expertise to expand the number of languages and dialects we can review content in globally,” it said in a statement. “We’re reviewing a range of options to address these challenges including hiring more content reviewers with diverse language capabilities.” 

Zhang said in an interview: “The company has hired a lot of young idealistic people doing research that is largely not acted on. Facebook is looking for free wins in which they can improve things without harming their profit numbers.”  

On the other hand, Zhang said, Facebook must also grapple with complicated trade-offs and inefficient communication within a large organization. 

“It doesn’t act until something has already become a crisis,” she said. 

Potential consequences

Some securities law experts said allegations like Haugen’s wouldn’t necessarily trigger an SEC investigation. 

“Do they really go to the core of what the SEC is required to police?” asked Charles Clark, a former assistant director of the SEC’s enforcement division, who said parts of the accusations didn’t appear to clearly violate securities law. “Some of what she’s complaining about is important to Congress and is important to the world at large but isn’t really tied to the mandate of the SEC.” 

Clark added, however, that one of Haugen’s allegations — that Facebook is potentially inflating user counts and other metrics important to advertisers — “is the type of matter that the SEC has focused on for many years.” 

If a case progresses, the SEC could seek financial penalties against a company or a person after an investigation of securities law violations, experts said, and under federal law, whistleblowers may be given awards. In rare cases, the SEC can try to limit an executive’s ability to serve as a corporate director or officer. 

Facebook, which reported profits of $29 billion last year, could face bigger legal challenges than a potential fine, including a lawsuit from the Federal Trade Commission seeking to break it up, possibly into three pieces. 

Yet some securities law experts don’t rule out a forceful SEC response. Harvey Pitt, a former SEC chair, said that he thinks Haugen’s allegations are credible and that the commission should investigate whether Facebook met its legal obligations in making disclosures to investors. 

“The documents produced are damning,” Pitt, who is now a consultant, said in an emailed response to questions. “The integrity of the corporate disclosure process is at stake here, and this is too high a profile matter for the SEC staff to let it pass by.”