Facebook Failed to Prevent Billions of Views of Pages Sharing False Information

More than 10 billion views of content from Facebook pages known to share false and misleading information could have been prevented had the social media company acted earlier in the 2020 presidential campaign to curtail the dissemination of such posts.

That’s the conclusion from a report produced this week by Avaaz, an organization that describes itself as a “global web movement to bring people-powered politics to decision-making everywhere.” Facebook failed by waiting until October — very late in the election year — to alter its algorithms in order to prevent the spread of disinformation during the last leg of the election season, Avaaz suggested in its findings.

The group’s study was based on observations of the pages deemed to be the 100 most prominent spreaders of misinformation during the run-up to Election Day. Avaaz defined such pages as those that shared at least three pieces of misinformation over a 90-day period and declined to issue corrections after receiving Facebook fact-checks. In practice, the pages in the study shared an average of around eight pieces of uncorrected misinformation during that period.

Not every post from these pages was necessarily misinformation. Rather, the report’s 10 billion figure counts the total number of views of all content from the pages identified as spreaders of misinformation. Avaaz noted, however, that the 100 most popular posts known to be false were viewed around 162 million times on their own.

The organization also noted that these numbers were just the tip of the iceberg, as many other Facebook pages shared similar content.

“This is not the whole universe of misinformation,” said Fadi Quran, a research director at Avaaz who worked on the project. “This doesn’t even include Facebook Groups, so the number is likely much bigger. We took a very, very conservative estimate in this case.”

The misinformation shared both before and after the election had real consequences, the Avaaz report said, tying the false information directly to the attack on the U.S. Capitol building on January 6.

Facebook “creat[ed] the conditions that swept America down the dark path from election to insurrection,” the organization’s report stated.

The social media site has claimed that it is committed to stopping the spread of misinformation, saying it labels and flags posts from pages and politicians alike that contain false or questionable information. Critics, however, contend that Facebook continues to disseminate false information that can lead to violence, a conclusion that some had reached even before the breach of the Capitol.

Carmen Scurato, a senior policy counsel at the digital rights group Free Press, wrote an op-ed for Truthout in late November describing how Facebook had failed in its purported mission.

Scurato noted, for example, that statements by former President Donald Trump falsely claiming the election results were wrong and that he would challenge them received warning labels saying only that Joe Biden was the “projected” winner. That warning was itself misleading, Scurato wrote, because Biden’s victory was “far more certain than that.”

Scurato also explained that groups like QAnon and other conspiracy movements utilized Facebook for recruitment, even after Facebook had flagged the posts as being false or misleading.

“Facebook could do much more to prevent bad-faith actors from gaming its systems,” Scurato said. “Instead the company accommodates these users and allows them to inundate the network with dangerous disinformation.”