Last month, academic researchers got word of a tantalizing offer. Facebook said it would give them a way to study how web addresses are shared on its social network, for the first time opening a window into the spread of hoaxes, partisan news and disinformation.
But there was a catch: Facebook wouldn’t let researchers look at anything before Jan. 1, 2017.
The restriction is significant because it blocks access to Facebook data about the 2016 presidential campaign, one of the most divisive periods in modern American history, as well as data from all earlier years.
Facebook’s decision is a roadblock for experts who want to examine possible factors behind President Donald Trump’s victory, such as alleged Russian influence, his campaign’s digital strategy or the spread of false stories, like one about the pope endorsing him.
“Obviously, that kind of leaves out a really big date everyone would like to look at,” said Sarah Oates, a University of Maryland journalism professor who studies Russian propaganda.
Facebook has drastically increased its overall transparency in the past year, launching a political ad archive and disclosing efforts to stop what it called “information operations” by Russia and other nations.
But the openness often stops at the beginning of 2017, highlighting how little Facebook has been willing to disclose about what happened on its platform in the run-up to the 2016 U.S. election.
An open book
One of Facebook’s recent steps toward transparency is a new effort to allow election-related academic studies based on Facebook data. Experts who previously had no such access have said they’re excited about the project, which is known as Social Science One and is led by professors at Harvard and Stanford.
The first area that Facebook is allowing research on is web addresses, such as links to news articles. Researchers will be able to study links that have been shared by at least 20 people and get data about the age, gender and location of the people sharing them.
But the dataset is limited to website addresses “shared on Facebook starting January 1, 2017 and ending about a month before the present day,” according to a five-page description of the project.
Facebook did not say in the project description why it chose that date, and the company declined to comment on Thursday. Facebook gets to decide what data it makes available for research, subject to privacy laws, ethical guidelines at universities and other restrictions.
In announcing the research collaboration in April, Facebook said it wanted to address the future.
“The focus will be entirely forward looking. And our goals are to understand Facebook’s impact on upcoming elections — like Brazil, India, Mexico and the US midterms — and to inform our future product and policy decisions,” Facebook executives Elliot Schrage and David Ginsberg said in a blog post. Schrage has since said he plans to leave the company.
The terms could change to include data from 2016 and earlier, said Gary King, director of Harvard’s Institute for Quantitative Social Science and a co-chair of Social Science One.
“I would like to expand the scope of what we’re doing, potentially backwards, but that’s not the only thing,” he said. “You have to start somewhere.”
King said that even without data from 2016 or earlier, the amount of information Facebook is offering is “probably the largest compilation of social media data ever made available to researchers.”
Facebook is doing more than its Silicon Valley peers, in some ways. Google has not announced an effort similar to Facebook’s partnership with Social Science One, and the company declined to comment on Thursday.
Political scientists have also not let corporate roadblocks stop all research. One major study on 2016 found that fake news took off that year but remained a small minority of the information people consumed.
Still, researchers called the limit imposed by Facebook disappointing.
“There’s still a lot about Russian disinformation that isn’t understood,” Oates said, though she praised Facebook for moving in the direction of openness.
The time restriction is “curious” and leaves out some of the most interesting data on Facebook, said Bret Schafer, a social media analyst at the Alliance for Securing Democracy, a transatlantic project set up last year to counter Russian disinformation.
On the other hand, Schafer said, discussion of 2016 can be a distraction from what’s happening on social media now.
“You start talking about it and it polarizes people,” he said. “The conversation becomes unhelpful.”
Charles Stewart III, a political science professor at the Massachusetts Institute of Technology, called Facebook's decision to wall off research into the 2016 U.S. election odd, but noted that it could keep the effort from being sensationalized.
“The team is trying to discourage people from purely fishing around looking for what the Russians did,” he said.