Facebook customizes each user’s online experience. Each of us has different friends, different likes, different group affiliations and different interests. As a result, Facebook’s algorithms generate a different experience for each user, specifically designed to keep us hooked on the platform. That’s one key to the platform’s success — but it’s also what makes Facebook so dangerous. This hyper-customization means each of us lives in a different reality, with a unique set of beliefs and facts that are reinforced by the people and news sources we follow, and the groups we join.
In one Facebook reality, people believe that Joe Biden was legitimately elected president. In an alternate reality, people believe the election was stolen from Donald Trump due to widespread voting irregularities.
In other words, Facebook’s tools have allowed Trump’s bogus and dangerous claims of election impropriety and voter fraud to go viral.
Journalists, scholars and civil-society groups continue to identify and debunk election-fraud claims that appear on Facebook. But it’s a difficult task as disinformation mushrooms and morphs. The New York Times has a page, “Daily Distortions,” that’s devoted to tracking viral disinformation. Researchers at Avaaz identified a network of Steve Bannon-driven voter-fraud disinformation spreaders, and flagged the exponential growth of “Stop the Steal” groups, which are mobilizing to stop a Biden presidency.
Facebook removed the networks Avaaz flagged, but speed, scope and continued vigilance are necessary for the enforcement actions to have a lasting impact. Removing the ubiquitous “Stop the Steal” groups has been the equivalent of pouring water on gremlins — they split and multiply, resulting in havoc and chaos across the platform.
Five days after the Associated Press declared Biden the winner of the election, Trump posted “WE WILL WIN!” along with a video that urged his followers to prove that the results were wrong. Facebook tacked a label onto the post stating that Biden was the “projected” winner. The label itself is misleading — Biden’s victory is far more certain than that — and it did nothing to stop the spread of Trump’s deceitful message: The post has been liked, shared and commented on by hundreds of thousands of people.
In the ensuing days, Trump has published multiple posts proclaiming that he “won” the election — and Facebook has responded to each of them just as inadequately.
And there’s every reason to believe that Facebook will do even less to combat disinformation the further we get from Election Day. In October, Mark Zuckerberg told his employees to expect fewer policy changes and content removals after the election.
Facebook knows that divisive and conspiratorial content drives engagement: A 2018 study by its own researchers found as much. But the platform’s executives buried the report, aware that doing anything to curtail engagement would threaten its massive growth rate and billion-dollar revenue streams. Another internal study found that 64 percent of those who joined an extremist group on the platform did so because Facebook’s algorithms recommended it to them.
Protecting the company’s users from disinformation should remain a top priority. In the absence of ongoing enforcement, bad actors will weaponize Facebook at ever greater rates to sow division and hate, destabilize our democracy, disenfranchise voters and poison our information ecosystem.
The fight against disinformation is as important during this post-election period as it was in the run-up to the vote. And given that Facebook has a metric to track “violence and incitement trends,” the company is at least aware that threats to our democracy don’t simply follow election cycles. Facebook’s ongoing efforts to tackle the spread of militarized and dangerous social movements like QAnon indicate that it understands, at some level, that it must remain vigilant against disinformation from people hellbent on destabilizing our democracy.
But is it vigilant enough? In short, no: Facebook could do much more to prevent bad-faith actors from gaming its systems. Instead the company accommodates these users and allows them to inundate the network with dangerous disinformation.
Disinformation is also being used as a tool to recruit and organize. Contrary to what Facebook wants us to believe, disinformation does not have to appear in our personalized newsfeeds to destabilize the democratic process.
On Nov. 10, Facebook’s vice president of analytics and chief marketing officer, Alex Schultz, wrote a post attempting to downplay the kind of content that often appears in the network’s top 10 list of most-engaging posts. He claimed that engagement is not the same as “reach,” a term used to track how many people actually see a piece of content. In essence, Schultz was arguing that ignorance is bliss — if you don’t see something in your feed, it allegedly has no effect on you.
But disinformation on Facebook too frequently jumps from the virtual to the real world. Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy, has tracked the true costs of disinformation, as when right-wing militia groups set up “identity” checkpoints after believing that antifa activists had set the California and Oregon wildfires.
While antitrust action against the Silicon Valley giant is reportedly on the horizon, fixing Facebook’s ad-driven business model — which algorithmically amplifies hateful content and disinformation — requires different measures. We need to update privacy laws to protect the civil rights of platform users and prevent platforms’ misuse of their data. We need to tax platforms’ online-advertising revenues to support independent, conspiracy-busting journalism. And we need tech companies to strengthen their community standards and terms of service — and enforce those rules — to prevent the spread of hate and disinformation across their networks.
The Change the Terms coalition, for example, has developed model corporate policies designed to disrupt hate and disinformation on social media. These policies call on internet companies to moderate content in a transparent manner and open themselves up for regular audits. The policies also urge companies to create better tools for identifying and removing hateful activities — and to deplatform groups that recruit and organize violence online.
Transparency would also help us better understand the grave impacts of disinformation by providing researchers, scholars, and others with the data they need to deconstruct the company’s divisive algorithms. Shining a light on the inner workings of Facebook would go far toward fixing many of the platform’s problems. It’s time for Facebook to finally put the health of people and our democracy over profits.