
How Facebook and Surveillance Capitalism Empower Authoritarianism

On Tuesday, Facebook said it had removed 32 fake accounts and pages set up to covertly influence the midterm elections.

“Black Elevation.” “Mindful Being.” “Resisters.” “Aztlan Warriors.” Those are the names of some of the accounts removed from Facebook and Instagram Tuesday after Facebook uncovered a plot to covertly influence the midterm elections. The tech giant said 32 fake accounts and Facebook pages were involved in “coordinated inauthentic behavior.” This announcement comes just days after the company suffered the biggest loss of market value in stock market history: about $119 billion in a single day. This is just the latest in a string of controversies surrounding Facebook’s unprecedented influence on democracy in the United States and around the world, from its pivotal role in an explosion of hate speech inciting violence against Rohingya Muslims in Burma to its use by leaders such as Philippines President Rodrigo Duterte in suppressing dissent. Facebook has 2.2 billion users worldwide, and that number is growing. We speak with Siva Vaidhyanathan, author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. He is a professor of media studies and director of the Center for Media and Citizenship at the University of Virginia.

Transcript

JUAN GONZÁLEZ: “Black Elevation.” “Mindful Being.” “Resisters.” “Aztlan Warriors.” Those are the names of some of the accounts removed from Facebook and Instagram Tuesday, after Facebook uncovered a plot to covertly influence the midterm elections. The tech giant said it uncovered 32 fake accounts and Facebook pages that were involved in what it described as “coordinated inauthentic behavior.” The accounts had a total of 290,000 followers and had created 30 events since April of 2017. One of the accounts had created a Facebook event to promote the protest against the upcoming Unite the Right rally in Washington, D.C. Protest organizers say the fake account is not behind the event. Facebook says it does not have enough technical evidence to state who was behind the fake pages, but said the accounts engaged in some activity similar to that of pages tied to Russia before the 2016 election. Facebook’s announcement comes just days after the company suffered the biggest loss of market value in stock market history, losing about $119 billion—that’s right, billion dollars—in a single day.

AMY GOODMAN: Facebook has been at the center of a number of controversies in the United States and abroad. Earlier this year, Facebook removed more than 270 accounts it determined to be created by the Russia-controlled Internet Research Agency. Facebook made that move in early April, just days before founder and CEO Mark Zuckerberg was questioned on Capitol Hill about how the voter-profiling company Cambridge Analytica harvested data from as many as 87 million Facebook users without their permission in an effort to sway voters to support President Donald Trump. Zuckerberg repeatedly apologized for his company’s actions then.

MARK ZUCKERBERG: We didn’t take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.

AMY GOODMAN: Today we spend the hour with a leading critic of Facebook, Siva Vaidhyanathan, author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. He’s professor of media studies and director of the Center for Media and Citizenship at the University of Virginia. We’re speaking to him in Charlottesville.

Professor, welcome to Democracy Now!

SIVA VAIDHYANATHAN: Oh, thanks. It’s good to be here.

AMY GOODMAN: Well, let’s begin with this latest news. There are hearings today that the Senate Intelligence Committee is holding, and yesterday Facebook removed these—well, a bunch of pages, saying they don’t know if it’s Russian trolls, but they think they are inauthentic. Talk about these pages, what they mean, what research is being done and your concerns.

SIVA VAIDHYANATHAN: Yeah. Look, Facebook was unconcerned, for nearly a decade, that various groups, either state-sponsored or sponsored by some, you know, troublemaking group, were popping up around Facebook, not just in the United States, not just in reference to one election or another, but around the world, in a concerted effort, or in some ways a distributed effort, to undermine democracy and civil society. This has been going on almost as long as Facebook has allowed pages to pop up—right?—interest group pages to pop up. And Facebook got caught off guard, bizarrely, after the 2016 election, even though there were people within Facebook who were raising the alarm—right?—that there were these pages, these accounts, that were distributing nonsense, that were posing as Black Lives Matter pages. There were others that were posing as Texas independence pages. There were some supporting radical-right positions and others supporting radical-left positions. And, you know, they were planning events. So, all of this came out after the 2016 election. It should have come out before. And Facebook has been scrambling ever since.

So, what we’ve seen since the 2016 election in the United States is, every time there is a major election around the world, Facebook will put all hands on that election and try to make sure that it can claim that it is cleaning up its act in preparation for that election. So we saw that in 2017 with the elections in Germany and in the Netherlands and in France. We saw that with the referendum on abortion that was held in Ireland earlier this year. In all of these cases, you know, Facebook has made sure to crow about all that it has done to clean up the pollution that might distract people or disrupt the political process, the democratic process.

But, you know, it hasn’t done much in other parts of the world. It did almost nothing in Mexico before its election. It has so far come up with no strategy for dealing with the much larger mess in India, the world’s largest democracy. Facebook was instrumental in the election of Rodrigo Duterte in the Philippines in 2016. It was instrumental in the Brexit referendum in 2016. In all of these cases, forces, often from other countries, interfered in the democratic process, distributed propaganda, distributed misinformation, created chaos, often funneled campaign support outside of normal channels. And it’s largely because Facebook is so easy to hijack.

What we see just this week, as Facebook makes these announcements, is that they’ve managed to identify a handful of sites that, you know, a few hundred thousand people have interacted with. We don’t know if this is 5 percent, 10 percent, 50 percent or 100 percent of the disruptive element going on before our off-year elections coming up in November.

JUAN GONZÁLEZ: Well, Professor, one of the points that you make in your book is that so much attention has been focused on the work of Cambridge Analytica, but that you believe that there’s a much deeper structural problem with Facebook than just one company being able to access personal data and then use it for nefarious political ends. Could you talk about the structural issues that you see? And also, you mentioned the Philippines. Most people have not heard much about how Duterte was helped to win his election by Facebook. If you could give an example of how structurally it might have worked in the Philippines?

SIVA VAIDHYANATHAN: Yeah. Look, Cambridge Analytica was a great story, right? It finally brought to public attention the fact that for more than five years Facebook had encouraged application developers to get maximal access to Facebook data, to personal data and activity, not just from the people who volunteered to be watched by these app developers, but from all of their friends—right?—which nobody really understood except Facebook itself and the application developers. So, thousands of application developers got almost full access to millions of Facebook users for five years. This was basic Facebook policy. That fact was lost in the storm over Cambridge Analytica.

So, Cambridge Analytica was run by Bond villains, right? They look evil. They work for evil people, like Kenyatta in Kenya. You know, Steve Bannon helped run the company for a while. It’s paid for by Robert Mercer, you know, one of the more evil hedge fund managers in the United States. You know, it had worked for Cruz, for Ted Cruz’s campaign, and then for the Brexit campaign and also for Donald Trump’s campaign in 2016. So it’s really easy to look at Cambridge Analytica and think of it as this dramatic story, this one-off. But the fact is, Cambridge Analytica is kind of a joke. It didn’t actually accomplish anything. It pushed this weird psychometric model for voter behavior prediction, which no one believes works.

And the fact is, the Trump campaign, the Ted Cruz campaign, and, before that, the Duterte campaign in the Philippines, the Modi campaign in India, they all used Facebook itself to target voters, either to persuade them to vote or dissuade them from voting. Right? This was the basic campaign, because the Facebook advertising platform allows you to target people quite precisely, in groups as small as 20. You can base it on ethnicity and on gender, on interest, on education level, on ZIP code or other location markers. You can base it on people who are interested in certain hobbies, who read certain kinds of books, who have certain professional backgrounds. You can slice and dice an audience so precisely. It’s the reason that Facebook makes as much money as it does, because if you’re selling shoes, you would be a fool not to buy an ad on Facebook, right? And that’s drawing all of this money away from commercially based media and journalism. At the same time, it’s enriching Facebook. But political actors have figured out how to use this quite deftly.

So, when Modi ran in 2014, when Duterte ran in 2016, in both cases, Facebook had staff helping them work its system more effectively. Facebook also boasted about the fact that both Modi and Duterte were Facebook savvy—right?—the most connected candidates ever. In fact, Narendra Modi has more Facebook friends and followers than any other political figure in the world. He is the master of Facebook. It’s not a coincidence that Narendra Modi and Rodrigo Duterte are dangerous nationalist leaders who have either advocated directly for violence against their own people or have sat back and folded their arms as pogroms happened against Muslims in their countries.

JUAN GONZÁLEZ: You mentioned Narendra Modi. I want to turn to a meeting between Mark Zuckerberg and the Indian prime minister at the Facebook headquarters in California in 2015.

MARK ZUCKERBERG: You were one of the early adopters of the internet and social media and Facebook. And did you, at that point, think that social media and the internet would become an important tool for governing and citizen engagement in foreign policy?

PRIME MINISTER NARENDRA MODI: [translated] When I took to social media, even I actually didn’t know that I would become a chief minister at some point, I would become a prime minister at some point, so I never, ever did think that social media would actually be useful for governance. When I took up and I got onto social media, it was basically because I was curious about technology. And I saw that I had been trying to understand the world through books, but I think it’s a part of human nature that, instead of going onto textbooks, if you have a guide, it’s far easier. And, in fact, if, instead of a guide, somebody can give you pretty sure suggestions of what to do, it’s even better.

JUAN GONZÁLEZ: That was Indian Prime Minister Narendra Modi talking with Mark Zuckerberg at Facebook headquarters in California in 2015. Professor, could you talk about this whole—the impact that Modi has had on the internet? He has what? I mean, 43 million Facebook followers?

SIVA VAIDHYANATHAN: Right, and that doesn’t include WhatsApp, right? WhatsApp is the most popular messaging service in India. It’s also owned by Facebook, right? And it’s tremendously important not just in personal communication, but in harnessing mobs for mob violence, mostly against Muslims, but often against Christians and often against Hindus who happen to marry or date Muslims. You know, this sort of vigilante mob violence is breaking out all over India. It’s breaking out in Sri Lanka. We’ve seen the Rohingya massacres and expulsion in Myanmar, in Burma, often fueled—in fact, directly fueled by propaganda spread on Facebook and WhatsApp.

And Modi has taken full advantage of this. Right? He and his people mastered this technique early on. It’s a three-part strategy, which I call the authoritarian playbook. What they do is they use Facebook and WhatsApp to distribute propaganda about themselves, flooding out all other discussion about what’s going on in politics and government. Secondly, they use the same sort of propaganda machines, very accurately targeted, to undermine their opponents and critics publicly. And then, thirdly, they use WhatsApp and Facebook to generate harassment, the sort of harassment that can put any nongovernment organization, human rights organization, journalist, scholar or political party off its game, because you’re constantly being accused of pedophilia, you’re being accused of rape, or you’re being threatened with rape, threatened with kidnapping, threatened with murder, which makes it impossible to actually perform publicly in a democratic space. This is exactly what Modi mastered in his campaign in 2014, and, in fact, a bit before. And that same playbook was picked up by Rodrigo Duterte in the Philippines, and it’s being used all over the world by authoritarian and nationalist leaders, to greater or lesser degrees.

So, in the United States, when Trump’s campaign used Facebook almost as effectively to precisely target certain voters in certain states, like Michigan, Wisconsin, Pennsylvania and Florida, and, by choosing very targeted, specific issues, either turn them off from voting or turn them on to voting for Donald Trump when they might not otherwise have been motivated, that was a sort of soft, light version of Narendra Modi’s authoritarian playbook. We did not see, and we have not seen yet, and hopefully we will not see, the same level of coordinated harassment from the Republican Party. But what we are seeing, of course, in a distributed way, is that anybody involved in the public sphere, especially women, is constantly being assaulted with these messages, with all sorts of threats, both publicly and privately. So, you know, the culture of our democracy and the cultures of democracies around the world are directly threatened by these practices, which are not only enabled by Facebook, they’re actually accelerated by Facebook.

AMY GOODMAN: We are going to break, then come back to this discussion and talk about a number of issues, including whether you’re concerned about this massive monopoly determining the content we read and see. Siva Vaidhyanathan is the author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, professor of media studies and director of the Center for Media and Citizenship at the University of Virginia. His previous books include The Googlization of Everything. Stay with us.
