Truth and transparency have been thrust into the spotlight at Facebook as the company grapples with its role in deciding what makes a story fake news.
"Facebook has morphed into what would be the biggest country on Earth," at 1.8 billion users, said Scott Talan, a social media expert and professor at American University in Washington, D.C.
"With those changes come new responsibilities and challenges," he said. "Facebook has a job to help ensure what is on there is accurate."
The State of Fake News
CEO and founder Mark Zuckerberg has said the amount of fake news on Facebook is minuscule, accounting for about 1 percent of content. But false articles can, to borrow a word from our next president, "bigly" impact how some people view the truth.
Days after Donald Trump was elected president, Zuckerberg said it was "pretty crazy" to think fake news could have influenced the election and warned Facebook "must be extremely cautious about becoming arbiters of truth."
Less than two weeks later, with the issue still simmering, Zuckerberg shared a more detailed account of projects he said were already underway to thwart the spread of misinformation.
While Zuckerberg didn't delve deeply into specifics, a post Friday night provided a seven-point plan for how Facebook intends to do its part to help rid the social network of fake news stories.
"Normally we wouldn't share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway," he wrote.
Facebook's approach has leaned on a mix of humans and algorithms, but Zuckerberg said even more can be done to foster "stronger detection."
"The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves," he said.
He also said he wants to make it easier for people to flag fake stories and that Facebook will focus on "raising the bar for stories that appear in related articles under links in News Feed."
Facebook is also exploring working with fact-checking organizations to provide third-party verification of stories, as well as labeling stories that have been flagged as false.
Zuckerberg's post comes amid a BuzzFeed report that "renegade" Facebook employees had organized their own task force to take on fake news.
"It's now more important than ever that platforms are trustworthy or believable," Talan said. People need to "know when something is accurate or not."
Aside from the approach of having the community and algorithms flag stories, Facebook is also taking on the lucrative business of fake news.
Some faux outlets can potentially make money by running advertisements from Facebook's network — something the company made clear last week won't be tolerated.
"Fake news has become more predominant because the barriers to creating a website and articles are much lower," Thales Teixeira, a professor at Harvard Business School who specializes in the "economics of attention," told NBC News.
Some register a domain name similar to a legitimate news outlet but add a different suffix, hoping readers won't notice and will think it's a reputable news source.
"What has happened with Facebook is they have reduced the exposure of the brand, who wrote the article," Teixeira said. "An article is one in a million. Giving back a little prominence [to major news outlets] is one way to reduce this issue."
"I think they have gradually escalated their communication efforts," Niklas Myhr, an assistant professor of marketing at Chapman University, told NBC News. "Even a strong company can fall if the quality of the product goes down."
Facebook Previously Dealt With Bias Allegations
Zuckerberg knows deciding what is the truth can be a tricky endeavor. Earlier this year, a report from tech website Gizmodo cited an anonymous source who said the social network's "news curators" were instructed to artificially "inject" selected stories into trending topics, raising allegations of bias.
Amid the fallout, Zuckerberg hosted leading conservatives at Facebook's campus in May to discuss ensuring Facebook remains a platform that is open to everyone, regardless of background or beliefs.
The company subsequently announced its investigation found "no evidence of systematic political bias in the selection or prominence of stories" curated by its trending topics section. However, Facebook said it would enact several reforms, including more training, transparency and working with a larger list of news outlets to nominate trending topics.
That history could make for a sensitive path as Facebook handles the latest news-based controversy, Myhr said.
"They need to have a filter in place to make sure their information isn't completely absurd, but there are always going to be things that go through," he said. "For them to have a user experience that is better, they need to up their game continuously."
Myhr said this is about as specific as the public should expect Zuckerberg to get.
"Facebook can not go out and say exactly what they are going to do because then everyone would game the system. The people who are behind these sites that have taken off are smart people ... reverse engineering Facebook's algorithm," he said.
"Facebook has to get smarter as manipulators get smarter."