Facebook gives partial data to researchers studying misinformation 


On Friday, members of Facebook’s Open Research and Transparency team held a conference call with researchers studying misinformation and apologized for giving them flawed, incomplete data for their work examining how users interact with posts and links on the platform.

The researchers were relying on the Facebook-provided data, and some of them may have lost months or even years of work.

Years of Access to Flawed Data

The academics have been receiving and using the data for a couple of years. Facebook’s Open Research and Transparency team provided access to the data so researchers could track the spread of misinformation on the platform, promising them transparency and access to all user interactions so that their results would be valid.


The data the social media giant has been providing includes only about half the interactions of US users, and the data covering users outside the US is also flawed.

To make matters worse, most of the user interactions included in the shared reports came from highly political users. People who engage with political posts usually already hold a political preference and are less likely to be influenced.

The company told the researchers that it is fixing the issue, but that the fix could take weeks due to the sheer volume of data it has to process. Facebook apologized to the academics for the “inconvenience it may have caused.”

Facebook spokesperson Mavis Jones blamed the data inaccuracy on a “technical error,” which the company is “working swiftly to resolve.”

Some of the researchers question whether the mistakes were intentional, since “cherry-picked” data would sabotage the research and lead them to the wrong conclusions.

A Facebook spokesperson emphasized that this was a mistake resulting from a technical error, and that the company “proactively told impacted partners about and are working swiftly to resolve” the problem.

Researchers Studying Misinformation

The flaw in the data was first discovered by Fabio Giglietto, an associate professor at the University of Urbino in Italy. He compared the data Facebook had provided privately to researchers with the “Widely Viewed Content Report” for Q2 2021 that the company released publicly on August 18th. The numbers didn’t match.

The New York Times also reported that Facebook had initially shelved its first-quarter report because it portrayed the company in a less flattering light, though Facebook eventually released it.

And in August, Facebook cut off platform access for researchers on New York University’s Ad Observatory project. The group’s Ad Observer browser plug-in had highlighted several issues, and the ban came after the researchers discovered Facebook had failed to disclose some political ads on its site.

The team was using the browser extension to collect accurate information on political ads; the social network claimed this was “unauthorized scraping.”

Laura Edelson, the project’s lead researcher, claims Facebook is silencing the team because its “work often calls attention to problems on its platform.” Edelson added: “If this episode demonstrates anything it is that Facebook should not have veto power over who is allowed to study them.”