
Why White Supremacists Love Facebook

Far from deplatforming racists, Facebook is moving toward private groups that will be harder to monitor.


When Facebook CEO Mark Zuckerberg announced that the future of the social networking site was going to be closed private groups, media watchdogs cautioned that the move would create an opportunity for white supremacists to congregate in secret.

It took ProPublica’s report on the group “I’m 10-15” to show what such a racist private group would look like. The group, which took its name from the Border Patrol code for “alien in custody,” comprised around 9,500 past and present U.S. Border Patrol agents who used racist and dehumanizing language about asylum seekers and referred to Reps. Veronica Escobar and Alexandria Ocasio-Cortez as “scum buckets” and “hoes.” The discovery raised several questions: Why is this dangerous? Is there something in particular about Facebook that makes it susceptible to white supremacists? And what, if anything, can be done?

Jessie Daniels, who teaches sociology at the City University of New York and Hunter College, is a globally respected expert on racism on the internet. “Facebook has no internal system of moderation,” she told Truthout, “and that’s part of what white supremacists look for.”

Facebook outsources moderation to contractors in the United States and abroad. When The Verge unearthed the secret lives of Facebook moderators in the U.S., what they discovered was workers who were underpaid, overworked and operating under, at times, confusing guidelines. Many were also suffering from immense trauma caused by the content they were policing. In some cases, moderators even found themselves adopting the ideologies that they were hired to expunge. All these conditions only make it more difficult to monitor and shut down white supremacist propaganda.

Natalie Martinez, a researcher at the watchdog group Media Matters, says white supremacist groups in the U.S. often borrow European far-right stories about anti-Muslim and anti-African immigration. “These stories are used as a warning sign for the U.S. Then they’re repeated month after month, year after year, as if they’re new stories,” Martinez told Truthout. A single incident is thus made to look like an ongoing pattern of behavior.

There’s nothing about Facebook in particular, however, that makes it susceptible to such campaigns. “All platforms that don’t address the potential for being exploited by white supremacists up front are going to be exploited by them,” Daniels says. “Even if we close all the windows that are available on these platforms, white supremacists will be looking for whatever’s next. So, whatever we create next, we’ll need to preemptively program against them.”

When contacted for comment, a Facebook spokesperson declined to give an interview and instead sent Truthout a link to a post about how Facebook does not tolerate white nationalism on the platform, nor on its subsidiary, Instagram.

Given that Facebook was not designed with the goal of keeping white supremacists off the platform, the company has a unique and urgent problem to solve. As reported by Business Insider, Facebook had more than 2.3 billion monthly users as of February 4. That’s 2.3 billion users who may unwittingly become pawns in propaganda campaigns run by white supremacists and other dangerous ideologues.

That’s because there’s a very particular way of dealing with propaganda that most people simply don’t know. As John Cook and Stephan Lewandowsky explain in their Debunking Handbook, when dealing with misinformation, you need to focus more on the facts than on the myth in order to avoid accidentally spreading the propaganda further. You also need to explicitly frame the misinformation as false, and you need to offer an alternative explanation that fills the gap the debunked narrative leaves behind.

If these steps aren’t followed, then what many users will simply see is their Facebook friends sharing information. As reported by CNN, people are more likely to accept information as true simply because a Facebook friend shared it. However, not everyone agrees that this simplistic dynamic is how white supremacists “recruit” new members.

“It makes it seem like people are pure and unsuspecting, and that they’re lured into something that they had no prior interest in,” says Daniels. “If you think about the way we discover stuff online, it’s because we’re searching for it intentionally, or someone we know shared that content…. [People] click on the items, and they go further into that world.”

The world Daniels describes is becoming radicalized around who people believe has the right to citizenship, and who they believe has a right to the land we’re occupying. That question goes to the foundation of our country, Daniels said. “It’s complicated to talk about how we deal with white supremacy when it’s extreme and violent in a country based on white supremacy.”

Martinez says that Media Matters regularly finds white supremacist content on Facebook that doesn’t get banned. “We’ve reported countless pieces of content that use the phrase ‘Muslim invasion’ that don’t get banned. It’s not even pretending to be about immigration anymore; it’s clearly talking about a demographic shift that’s viewed as a threat to white identity.”

Spencer Sunshine, a researcher of far-right movements, echoes this sentiment. “The reluctance of social media platforms to take substantive action against white nationalists has been a key element in the emergence of the ‘alt-right,’ and white nationalism’s overall spread in the last few years,” he told Truthout.

But it’s not just Facebook users who use this type of language. As Martinez reported, Facebook let Trump’s campaign run more than 2,000 ads referring to immigration as an “invasion.”

“The consequence of this inciteful rhetoric is that there’s going to be more violence,” says Martinez, “especially when the word ‘invasion’ is often accompanied with vague calls to action.”

While the Southern Poverty Law Center has linked the white supremacist message board Stormfront to at least 100 murders, Daniels notes that such a link has yet to be established between Facebook and acts of white supremacist terrorism (even though a portion of the Christchurch mosque shooting in New Zealand was streamed on Facebook Live). That doesn’t mean, however, that the possibility of terrorist acts should be taken lightly. “My sense of the landscape is that as Stormfront has waned in influence, it’s been because these other social platforms are so available that people don’t need their own platform,” Daniels says.

What, then, is to be done?

Heidi Beirich, intelligence project director of the Southern Poverty Law Center, says that banning people from Facebook and other platforms is the most effective way to slow the spread of dangerous ideologies like white supremacy. “Don’t let white supremacists use the platform to create another ‘Unite the Right’ or another terrorist incident,” says Beirich. “Make them migrate to a place where they’ll have much less influence. We know from research that was done on banning very racist threads on Reddit that many people never even came back to these racist spaces.”

Daniels, however, sees a very large and orange obstacle in the way of successfully de-platforming white supremacy. “I’ve called for Trump to be de-platformed from Twitter, which I think would help a lot. Until that happens, I think we’re fiddling in the margins because he’s emboldening these people and has the biggest platform of all.”

As Daniels concludes, “The First Amendment of the Constitution talks about the right to dissent from the government, not about a right to post on Facebook.”
