By Alyssa Newcomb

Facebook employees used to speak about the social platform's News Feed algorithm in vague terms, as though it were a closely guarded secret. Now, the company is pulling back the curtain with a tool that will allow users to understand why they're seeing a certain post in their News Feed.

A new feature called "Why am I seeing this post?" is being rolled out this month and will be available worldwide in May. Users can click the drop-down menu in the upper-right corner of a post to understand why it's in their News Feed. Facebook's algorithm considers a variety of factors, including how often you interact with the person, whether the post is getting a lot of engagement, and whether you have a history of engaging more often with posts that contain photos or videos.

But here’s the inherent problem: Facebook’s algorithm has enabled users to live within their own personal echo chambers, which are constructed with every click, like and comment.

"From working with students and the public, I don't think people are thinking about this stuff," said Jeremy Littau, an associate professor of journalism and communication at Lehigh University. "You can give them all the information you want — but without context, it's just noise."

When the feature has been enabled on a Facebook profile, users will see a pop-up next to a post, showing them how they can click to learn more about why it is in their feed. They’ll also be directed to options that will help them better control what they see, whether it’s unfollowing, muting or temporarily snoozing a person’s posts.

While Littau said the data might not be enlightening to people who aren't thinking critically about the personal preferences and biases the algorithm learns, he said the new data should be useful for journalists and researchers who have been studying why Facebook's algorithm can surface conspiracy theories and other extreme content.

“It will give critics and people who pick things apart some access to data we haven’t had before, and that is a good thing,” said Littau. “On a public advocacy side, this would possibly lead to some feedback that could help Facebook refine its algorithm, because we know it can be used to surface awful things.”

In a tweet, Chris Messina, a product designer and inventor with a large Silicon Valley social media following, said the new tool is a “responsible step,” and suggested it should come with statistics about how many people actually use it.

Facebook is also expanding an existing tool that lets people see if an advertiser was targeting them by age, gender, location, interests or the sites they’ve visited. The tool will now include whether a user’s profile data was a match in an advertiser’s database.

The move comes as Facebook focuses the future of its business on encrypted messaging and sharing in smaller groups. With News Feed ads driving the company's revenue, the company will also be forced to think about new ways to deliver advertisements. One such way could be through a premium news tab.

In an interview posted online Monday, Facebook co-founder and Chief Executive Officer Mark Zuckerberg told Mathias Döpfner, CEO of German media giant Axel Springer, that he envisions a section on Facebook where publishers can share high-quality news and be paid a fee for it.

“That’s definitely something that I think we should be thinking about here, because the relationship between us and publishers is different in a surface where we’re showing the content on the basis of us believing that it’s high-quality, trustworthy content,” Zuckerberg said.

Zuckerberg has spent the past few days on a massive media spree. Over the weekend, he published an op-ed in The Washington Post where he said “the internet needs new rules” and shifted some of the responsibility to lawmakers to regulate Facebook.

Facebook has faced a torrent of public criticism over its handling of Russian intervention in the 2016 U.S. presidential election and its policies on hate speech that many governments and users consider too lax.