A longtime Facebook executive has admitted the company’s platform helped Donald Trump win the 2016 election, and it may happen again this year. In an internal memo, Facebook Vice President Andrew Bosworth wrote, “So was Facebook responsible for Donald Trump getting elected? I think the answer is yes.” Bosworth, who was a backer of Hillary Clinton in 2016, went on to write that the company should not change its policies in an effort to hurt Trump’s re-election chances. In his memo, Bosworth referenced the role of the shadowy data firm Cambridge Analytica but downplayed its significance. However, a new Oscar-shortlisted documentary called The Great Hack argues Cambridge Analytica has played a significant role not just in the U.S. election but in elections across the globe. The company harvested some 87 million Facebook profiles without the users’ knowledge or consent and used the data to sway voters during the 2016 campaign. We speak with the directors of The Great Hack, Jehane Noujaim and Karim Amer, as well as former Cambridge Analytica employee Brittany Kaiser and propaganda researcher Emma Briant.
AMY GOODMAN: A longtime Facebook executive has admitted the company’s platform helped Donald Trump win the 2016 election, and predicted it may happen again this year. In an internal memo, Facebook Vice President Andrew Bosworth wrote, “So was Facebook responsible for Donald Trump getting elected? I think the answer is yes.” Bosworth, who was a backer of Hillary Clinton in 2016, went on to write that the company should not change its policies in an effort to hurt Trump’s re-election chances. Bosworth credited Trump with running, quote, “the single best digital ad campaign I’ve ever seen from any advertiser.”
In his memo, Bosworth referenced the role of the shadowy data firm Cambridge Analytica but downplayed its significance. However, a new Netflix documentary called The Great Hack argues Cambridge Analytica has played a critical role in the U.S. election, as well as elections across the globe.
Cambridge Analytica was founded by the right-wing billionaire Robert Mercer. Trump’s former adviser Steve Bannon was the company’s vice president and claims to have named the company.
Cambridge Analytica harvested some 87 million Facebook profiles without the users’ knowledge or consent and used the data to sway voters during the 2016 campaign. The story of Cambridge Analytica is featured in the new documentary The Great Hack, which has been shortlisted for an Oscar.
DAVID CARROLL: All of your interactions, your credit card swipes, web searches, locations, likes, they’re all collected, in real time, into a trillion-dollar-a-year industry.
CAROLE CADWALLADR: The real game changer was Cambridge Analytica. They worked for the Trump campaign and for the Brexit campaign. They started using information warfare.
DAVID CARROLL: Cambridge Analytica claimed to have 5,000 data points on every American voter.
AMY GOODMAN: Well, earlier this week, I spoke to the directors of The Great Hack, Jehane Noujaim and Karim Amer, as well as propaganda researcher Emma Briant and a former employee at Cambridge Analytica, Brittany Kaiser, who has begun posting online a trove of documents detailing the company’s operations, including its work with President Trump’s former national security adviser John Bolton. Kaiser has also written about her experience at the company in the book Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again. Kaiser is one of two former Cambridge Analytica employees featured in The Great Hack. The other is Christopher Wylie.
CHRISTOPHER WYLIE: It’s incorrect to call Cambridge Analytica a purely sort of data science company or an algorithm company. You know, it is a full-service propaganda machine.
AMY GOODMAN: I asked Cambridge Analytica whistleblower Brittany Kaiser to talk about how she became involved with Cambridge Analytica.
BRITTANY KAISER: I think it’s very important to note this, because there are people all around the world that are working for tech companies, that I’m sure joined that company in order to do something good. They want the world to be more connected. They want to use technology in order to communicate with everybody, to get people more engaged in important issues. And they don’t realize that while you’re moving fast and breaking things, some things get so broken that you cannot actually contemplate or predict what those repercussions are going to look like.
Chris Wylie and I both really idealistically joined Cambridge Analytica because we were excited about the potential of using data for exciting and good impact projects. Chris joined in 2013 on the data side in order to start developing different types of psychographic models. So he worked with Dr. Aleksandr Kogan and the Cambridge Psychometrics Centre at Cambridge University in order to start doing experiments with Facebook data, to be able to gather that data, which we now know was taken under the false pretense of academic research, and was then used in order to identify people’s psychographic groupings.
AMY GOODMAN: Now, explain that, psychographic groupings, and especially for people who are not on Facebook, who don’t understand its enormous power and the intimate knowledge it has of people. Think of someone you’re talking to who’s never experienced Facebook. Explain what is there.
BRITTANY KAISER: Absolutely. So, the amount of data that is collected about you on Facebook and on any of your devices is much more than you’re really made aware of. You probably haven’t ever read the terms and conditions of any of these apps on your phone. But if you actually took the time to do it, and if you could understand it, because most of them are written in legalese, written for you not to understand, you would realize that you are giving away a lot more than you would ever have agreed to if there was transparency. This is your every move: everywhere you’re going, who you’re talking to, who your contacts are, what information you’re giving in other apps on your phone, your location data, all of your lifestyle, what you’re doing, what you’re reading, how long you spend looking at different images and websites.
This amount of behavioral data gives such a good picture of you that your behavior can be predicted, as Karim was talking about earlier, to a very high degree of accuracy. And this allows companies like Cambridge Analytica to understand how you see the world and what will motivate you to go and take an action — or, unfortunately, what will demotivate you. So, that amount of data, available on Facebook ever since you joined, allows a very easy platform for you to be targeted and manipulated.
And when I say “psychographic targeting,” I’m sure you’re probably a little bit more familiar with the Myers-Briggs test, the test that asks you a set of questions in order to understand your personality and how you see the world. The system that Cambridge Analytica used is actually a lot more scientific. It’s called the OCEAN five-factor model. OCEAN stands for O for openness; C for conscientiousness, whether you prefer plans and order or you’re a little bit more fly by the seat of your pants; E for extraversion, whether you gather your energy from being out surrounded by people, or you’re introverted and prefer to gather your energy from being alone; and A for agreeableness: if you are agreeable, you care about your family, your community, society, your country more than you care about yourself, and if you are disagreeable, then you are a little bit more egotistical and need messages that are about benefits to you. And then the last is N, for neuroticism. You know, it’s not bad to be neurotic. It means that you are a little bit more emotional. It means, unfortunately, as well, that you are motivated by fear-based messaging, so people can use tactics in order to scare you into doing what they want you to do.
And this is what was targeted when they were gathering that data out of Facebook, to figure out which group you belonged in. They found about 32 different groups of people, different personality types. And there were groups of psychologists that were looking into how they could understand that data and convert that into messaging that was just for you.
I need to remind everybody that the Trump campaign put together over a million different advertisements, with tens of thousands of different campaigns. Some of these messages were for just you, or for 50 people, a hundred people. Obviously, certain groups were thousands, tens of thousands or millions. But some of them were targeted very much directly at the individual, to know exactly what you’re going to click on and exactly what you care about.
AMY GOODMAN: So they were doing this before Cambridge Analytica. But describe — I want to actually go to a Bannon clip, Steve Bannon, who takes credit for naming Cambridge Analytica, right? Because you had SCL before, Defence.
BRITTANY KAISER: Yes.
AMY GOODMAN: And then it becomes Cambridge Analytica, for Cambridge University, right? Where Kogan got this information that he culled from Facebook.
BRITTANY KAISER: Yes.
AMY GOODMAN: This is the former White House chief strategist Steve Bannon in an interview at a Financial Times conference in March 2018. Bannon said that reports that Cambridge Analytica improperly accessed data to build profiles on American voters and influence the 2016 presidential election were politically motivated. Months later, evidence emerged linking Bannon to the Cambridge Analytica scandal, which resulted in a $5 billion FTC fine for Facebook. Bannon was vice president of the political consulting firm Cambridge Analytica.
STEPHEN BANNON: All Cambridge Analytica is is the data scientists and the applied applications here in the United States. It has nothing to do with the international stuff. The Guardian actually tells you that, and The Observer tells you that, when you get down to the 10th paragraph, OK? When you get down to the 10th paragraph. And what Nix does overseas is what Nix does overseas. Right? It was a data — it was a data company.
And by the way, Cruz’s campaign and the Trump campaign say, “Hey, they were a pretty good data company.” But this whole thing on psychographics was optionality in the deal. If it ever worked, it worked. But it hasn’t worked, and it doesn’t look like it’s going to work. So, it was never even applied.
AMY GOODMAN: So, that’s Steve Bannon in 2018, key to President Trump’s victory and to his years so far in office, before he was forced out. What was your relationship with Steve Bannon? You worked at Cambridge Analytica for over three years. You had the keys to the castle, is that right, in Washington?
BRITTANY KAISER: Yes, for a while I actually shared the keys to Steve’s house with Alexander Nix, because we used his house as our office. His house also held a Breitbart office in the basement. It’s called the “Breitbart Embassy,” on Capitol Hill. And that’s where I would go for meetings.
AMY GOODMAN: Who funded that?
BRITTANY KAISER: I believe it was owned by the Mercer family, that building. And we would come into the basement and use that boardroom for our meetings. And we would use that for planning who we were going to go pitch to, what campaigns we were going to work for, what advocacy groups, what conservative 501(c)(3)s and (c)(4)s he wanted us to go see.
And I didn’t spend a lot of time with Steve, but the time I did was incredibly insightful. Almost every time I saw him, he’d be showing me some new Hillary Clinton hit video that he had come out with, or announcing that he was about to throw a book launch party for Ann Coulter for ¡Adios, America!, which was something that he invited both me and Alexander to, and we promptly decided to leave the house before she arrived.
But Steve was very influential in the development of Cambridge Analytica and who we were going to go see, who we were going to support with our technology. And he made a lot of the introductions, which in the beginning seemed a little less nefarious than they did later on, when he got very confident and started introducing us to white right-wing political parties across Europe and in other countries. I tried to get meetings with the main political parties, or leftist or green parties instead, to make sure that those far-right-wing parties that do not have the world’s best interests at heart could not get access to these technologies.
AMY GOODMAN: You said in The Great Hack, in the film, that you have evidence of illegality in the Trump and Brexit campaigns, that they were conducted illegally. I was wondering if you can go into that. I mean, it was controversial even, and Carole Cadwalladr, the great reporter at The Observer and The Guardian, was blasted and was personally targeted, as is very well demonstrated in The Great Hack, for saying that Cambridge Analytica was involved in Brexit. They kept saying they had nothing to do with it, until she shows a video of you, who worked for Cambridge Analytica, at one of the founding events of Leave.EU, the Brexit campaign.
BRITTANY KAISER: Yeah, Leave.EU. That panel that I was on, which has now become quite an infamous video, was their launch event to launch the campaign. And Cambridge Analytica was in deep negotiations, through an introduction from Steve Bannon, with both of the Brexit campaigns. I was told, actually, that originally we pitched Remain, and the Remain side said that they did not need to spend money on expensive political consultants, because they were going to win anyway. And that’s actually what I also truly believed, and so did they.
So, Steve made the introductions, both to Vote Leave and Leave.EU, to make sure that we would still get a commercial contract out of this political campaign. Cambridge Analytica took Leave.EU, and AIQ, which was essentially Cambridge Analytica’s digital partner before Cambridge Analytica could run our own digital campaigns, ran the Vote Leave side. Both were funded by the Mercers, both with the same access to this giant database on American voters.
AMY GOODMAN: The Mercers funded Brexit?
BRITTANY KAISER: There was Cambridge Analytica work, as well as AIQ work, in both of the leave campaigns. So, a lot of that money, in order to collect that data and in order to build the infrastructure of both of those companies, came from Mercer-funded campaigns, yes.
AMY GOODMAN: And again, explain what AIQ is.
BRITTANY KAISER: AIQ was a company that actually ran all of Cambridge Analytica’s digital campaigns, until January 2016, when Molly Schweickert, our head of digital, was hired in order to build ad tech internally within the company. AIQ was based in Canada and was a partner that had access to Cambridge Analytica data the entire time that they were running the Vote Leave campaign, which was the designated and main campaign in Brexit.
AMY GOODMAN: So, when did you see the connection between Brexit and the Trump campaign?
BRITTANY KAISER: Actually, a lot of it started to come when I saw some of Carole’s reporting, because there were a lot of conspiracy theories over what was going on, and I didn’t know what to believe. All I knew was that we definitely did work in the Brexit campaign, “we” as in when I was at Cambridge Analytica, because I was one of the people working on the campaign. And we obviously played a large role in not just the Trump campaign itself, but Trump super PACs and a lot of other conservative advocacy groups, 501(c)(3)s, (4)s, that were the infrastructure that allowed for the building of the movement that pushed Donald Trump into the White House.
AMY GOODMAN: I mean, it looks like Cambridge Analytica was heading toward becoming a billion-dollar corporation.
BRITTANY KAISER: That’s what Alexander used to tell us all the time. That was the carrot that he waved in front of our eyes in order to have us keep going. “We’re building a billion-dollar company. Aren’t you excited?” And I think that that’s what so many people get caught up in, people that are currently working at Facebook, people that are working at Google, people that are working at companies where they are motivated to build exciting technology, that obviously can also be very dangerous, but they think they’re going to financially benefit and be able to take care of themselves and their families because of it.
AMY GOODMAN: So what was illegal?
BRITTANY KAISER: My original accusations come from the massive problems with the data collection, specifically, because data was collected under the auspices of being for academic research and was then used for political and commercial purposes. There are also different data sets that are not supposed to be matched and used without explicit transparency and consent in the United Kingdom, because they actually have good national data protection laws, and international data protection laws through the European Union, to protect voters. Unfortunately, in the United States, we have only seen the state of California coming out and doing the same.
Now, on the other side, we have laws against voter suppression. We have laws against discrimination in advertising, racism, sexism, incitement of violence. All of those things are illegal, yet somehow a platform like Facebook has decided that if politicians want to use any of those tactics, they will not be held to the same community standards as you or me, or the basic laws and social contracts that we have in this country.
AMY GOODMAN: Cambridge Analytica whistleblower Brittany Kaiser. When we come back, we speak to the directors of The Great Hack, the documentary that’s just been shortlisted for an Academy Award.
AMY GOODMAN: “Let Me Steal Your Secrets,” soundtrack from the documentary The Great Hack. This is Democracy Now! I’m Amy Goodman, as we continue our look at Cambridge Analytica, Facebook, and their roles in the 2016 U.S. election and other elections. Earlier this week, I spoke to the directors of The Great Hack, Jehane Noujaim and Karim Amer, as well as propaganda researcher Emma Briant and Cambridge Analytica whistleblower Brittany Kaiser. I asked Karim Amer to talk about Cambridge Analytica’s effort to suppress the vote in Trinidad and Tobago.
KARIM AMER: It was important for us to show in the film the expansiveness of Cambridge’s work. This went beyond the borders of the United States and even beyond the borders of the EU and the U.K. Because what we find is that Cambridge used the — in pursuing this global influence industry that they were very much a part of, they used different countries as Petri dishes to learn and get the know-how about different tactics. And from improving those tactics, they could then sell them for a higher cost — higher margin in Western democracies, where the election budgets are, you know — we have to remember, I think it’s important to predicate that the election business has become a multibillion-dollar global business, right? So, we have to remember that while we are upset with companies like Cambridge, we allowed for the commoditization of our democratic process, right? So, people are exploiting this now because it’s become a business. And we, as purveyors of this, can’t really be as upset as we want to be, when we’ve justified that. So I want to preface it with that.
Now, that being said, what’s happened as a result is a company like Cambridge can practice tactics in a place like Trinidad, that’s very unregulated in terms of what they can and can’t do, learn from that know-how and then, you know, use it — parlay it into activities in the United States. What they did in Trinidad, and why it was important for us to show it in the film, is they led something called the “Do So” campaign, where they admit to making it cool and popular among youth to get out and not vote. And they knew —
AMY GOODMAN: So, you had the Indian population and the black population.
KARIM AMER: And the black population. And there is a lot of historic tension between those two, and a lot of generational differences between them as well. And the “Do So” campaign was done in a way to, by looking at the data and the predictive analysis of which group would vote or not vote, dissuade enough people from voting that they could flip the election.
AMY GOODMAN: Targeted at?
KARIM AMER: Targeted at the youth. And so, this is really — when you watch —
AMY GOODMAN: “Do So” actually meant “don’t vote.”
KARIM AMER: “Do So,” don’t vote.
JEHANE NOUJAIM: Don’t vote.
KARIM AMER: Yes, exactly. And when —
AMY GOODMAN: With their fists crossed.
KARIM AMER: With their fists.
AMY GOODMAN: And that it became cool not to vote.
KARIM AMER: Exactly. And you look at the level of calculation behind this, and it’s quite frightening. Now, as Emma was saying, a lot of these tactics were born out of our own fears in the United States and the U.K. post-9/11, when we allowed for this massive weaponization of influence campaigns to begin. You know, if you remember President Bush talking about, you know, the battle for the hearts and minds of the Iraqi people, all of these kinds of industries were born out of this.
And now I believe what we’re seeing is the hens have come home to roost, right? All of these tactics that we developed in the name of, quote-unquote, “fighting the war on terror,” in the name of doing these things, have now been commercialized and used to come back to the biggest election market in the world, the United States. And how do we blame people for doing that, when we’ve allowed for our democracy to be for sale?
And that’s what Brittany’s files today, that she’s releasing and has released over the last couple of days, really give us insight into. The Hindsight Files that Brittany has released show us how there is an auction happening for influence campaigns in every democracy around the world. There is no vote that is protected in the current space that we’re living in.
And the thing that’s allowing this to happen is these information platforms like Facebook. And that is what’s so upsetting, because we can actually do something about that. We are the only country in the world that can hold Facebook accountable, yet we still have not done so. And we still keep going to their leadership hoping they do the right thing, but they have not. And why is that? Because no industry has ever shown in American history that it can regulate itself. There is a reason why antitrust laws exist in this country. There’s a tradition of holding companies accountable, and we need to re-embrace that tradition, especially as we enter into 2020, where the stakes could not be higher.
AMY GOODMAN: Brittany Kaiser, can you talk about the “Crooked Hillary” campaign and how it developed?
BRITTANY KAISER: Absolutely. So, this started as a super PAC that was built for Ted Cruz, Keep the Promise I, which was run by Kellyanne Conway and funded by the Mercers. That was then converted to becoming a super PAC for Donald Trump. They tried to register with the Federal Election Commission the name, Defeat Crooked Hillary, and the FEC, luckily, did not allow them to do that. So it was called Make America Number 1.
This super PAC was headed by David Bossie, someone that you might remember from Citizens United, who basically brought dark money into our politics and allowed endless amounts of money to be funneled into these types of vehicles so that we don’t know where all of the money is coming from for these types of manipulative communications. And he was in charge of this campaign.
Now, on that two-day-long debrief that I talked about — and if you want to know more, you can read about it in my book — they told us —
AMY GOODMAN: Wait, and explain where you were and who was in the room.
BRITTANY KAISER: So, I was in New York in our boardroom for Cambridge Analytica’s office on Fifth Avenue. And all of our offices from around the world had called in to videocast. And everybody from the super PAC and the Trump campaign took us through all of their tactics and strategies and implementation and what they had done.
Now, when we got to this Defeat Crooked Hillary super PAC, they explained to us what they had done, which was to run experiments on psychographic groups to figure out what was working and what wasn’t. Unfortunately, what they found out was the only very successful tactic was sending fear-based, scaremongering messaging to people that were identified as being neurotic. And it was so successful in their first experiments that they spent the rest of the money from the super PAC over the rest of the campaign only on negative messaging and fearmongering.
AMY GOODMAN: And crooked, the O-O in “crooked” was handcuffs.
BRITTANY KAISER: Yes. That was designed by Cambridge Analytica’s team.
AMY GOODMAN: Karim?
KARIM AMER: And one thing that I think it’s important to remember here, because there’s been a lot of debate among some people about: Did this actually work? To what degree did it work? How do we know whether it worked or not? What Brittany is describing is a debrief meeting where Cambridge, as a company, is saying, “This is what we learned from our political experience. This is what actually worked.” OK? And they’re sharing it because they’re saying, “Now this is how we want to codify this and commoditize this to go into commercial business.” Right?
So this is the company admitting to their own know-how. There is no debate about whether it works or not. This is not them advertising it to the world. This is them saying, “This is what we’ve learned. Based off that, this is how we’re going to run our business. This is how we’re going to invest in the expansion of this to sell this outside of politics.” The game was, take the political experience, parlay it into the commercial sector. That was the strategy. So, there is no debate whether it worked or not. It was highly effective.
And the thing that’s terrifying is that while Cambridge has been disbanded, the same actors are out there. And nothing has changed to allow us to start putting in place legislation to say there is something called information crimes. In this era of information warfare, in this era of information economies, what is an information crime? What does it look like? Who determines it? And yet, without that, we are still living in this unfiltered, unregulated space, where places like Facebook are continuing to choose profit over the protection of the republic. And I think that’s what’s so outrageous.
JEHANE NOUJAIM: And I think it’s pretty telling that only two people —
AMY GOODMAN: Jehane.
JEHANE NOUJAIM: Only two people have come forward from Cambridge Analytica. Why is that? Both of the people that have come forward, Brittany and Chris, and also Carole, with her writing, have been targeted personally. And it’s been a very, very difficult story to tell. Even with us, since we released the film in January, every single time we have entered the country, we have been stopped for four to six hours of questioning at the border. That —
AMY GOODMAN: Stopped by?
JEHANE NOUJAIM: Stopped by — on the border of the U.S., in JFK Airport, where you’re taken into the back, asked for all of your social media handles, questioned for four to six hours, every single time we enter the country. So —
AMY GOODMAN: Since when?
JEHANE NOUJAIM: Since we released the film, so since Sundance, since January, every time we’ve come back into the U.S.
AMY GOODMAN: And on what grounds are they saying they’re stopping you?
JEHANE NOUJAIM: No explanation. No —
AMY GOODMAN: And what is your theory?
JEHANE NOUJAIM: My theory is that it’s got something to do with this film. Maybe we’re doing something right. We were at first — we’ve been stopped in Egypt, but we’ve never been stopped in the U.S. in this way. We’re American citizens. Right?
AMY GOODMAN: You talk about people coming forward and not coming forward. I wanted to turn to former Cambridge Analytica COO, the chief operating officer, Julian Wheatland, speaking on the podcast Recode Decode.
JULIAN WHEATLAND: The company made some significant mistakes when it came to its use of data. They were ethical mistakes. And I think that part of the reason that that happened was that we spent a lot of time concentrating on not making regulatory mistakes. And so, for the most part, we didn’t, as far as I can tell, make any regulatory mistakes, but we got almost distracted by ticking those boxes of fulfilling the regulatory requirements. And it felt like, well, once that was done, then we’d done what we needed to do. And we forgot to pause and think about ethically what was — what was going on.
AMY GOODMAN: So, if you could decode that, Brittany? Cambridge Analytica COO Julian Wheatland, who, interestingly, in The Great Hack really condemned Chris Wylie, did not appreciate Chris Wylie stepping forward and putting Cambridge Analytica in the crosshairs in the British Parliament, but was more equivocal about you. Talk about Wheatland and his role, and what he’s saying about abiding by the regulations, which they actually clearly didn’t.
BRITTANY KAISER: Once upon a time, I used to have a lot of respect for Julian Wheatland. I even thought we were friends. I thought we were building a billion-dollar company together that was going to allow me to do great things in the world. But, unfortunately, that’s a story that I told myself and a story he wanted me to believe that isn’t true at all.
While he likes to say that they spent a lot of time abiding by regulations, I would beg to differ. Cambridge Analytica did not even have a data protection officer until 2018, right before they shut down. I begged for one for many years. I begged for more time with our lawyers and was told I was creating too many invoices. And for a long time, because I had multiple law degrees, I was asked to write contracts. And so were other —
AMY GOODMAN: Didn’t you write the Trump campaign contract?
BRITTANY KAISER: The original one, yes, I did. And there were many other people that were trained in human rights law in the company that were asked to draft contracts, even though contract law was not anybody’s specialty within the company. But they were trying to cut corners and save money, just like a lot of technology companies decide to do. They do not invest in making the ethical or legal decisions that will protect the people that are affected by these technologies.