Cambridge Analytica made “ethical mistakes” because it was too focused on regulation, former COO says

Cambridge Analytica made "ethical mistakes" because it was too focused on regulation, former COO says

This story is part of a group of stories called

Cambridge Analytica made "ethical mistakes" because it was too focused on regulation, former COO says

Uncovering and explaining how our digital world is changing — and changing us.

“There was always going to be a Cambridge Analytica,” Julian Wheatland says in the new Netflix documentary The Great Hack. “It just sucks for me it was Cambridge Analytica.”

It sucks for Wheatland because he was the COO of Cambridge Analytica, a political consulting and data firm that became synonymous with the unchecked power of Facebook to peer into our minds and, many believe, “persuade” us into voting a certain way. In an undercover video released in March 2018, CEO Alexander Nix bragged about creating “defeat Crooked Hillary” Facebook videos for the Trump campaign, which the firm would target to small numbers of people in battleground states.

“The company made some significant mistakes when it came to its use of data,” Wheatland said on the latest episode of Recode Decode. “They were ethical mistakes. I think that part of the reason that that happened was that we spent a lot of time concentrating on not making regulatory mistakes … It felt like, well, once that was done, then we’ve done what we needed to do, and we forgot to pause and think about, ethically, what was going on.”

Appearing on the new podcast — along with The Great Hack’s director and writer Karim Amer and Pedro Kos, and early Facebook investor Roger McNamee — Wheatland called for Facebook to be regulated as a utility and said “every company is a data company today, and how that data is ethically managed needs to spread through all companies.”

McNamee lamented that, so far, the political repercussions for Facebook have been a slap on the wrist. He compared it and Google to chemical companies in the 1950s that were “artificially profitable” because they could pollute the environment at no cost.

“By my very, very rough estimate, Facebook, if it were held accountable for the things that it has done, would be an unprofitable company and Google would be modestly profitable in comparison to today,” McNamee said. “I believe that if we do not do something about making companies in the economy generally accountable for the damage they do, the political arguments are going to turn out to be irrelevant.”

You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, and TuneIn.

Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Julian, Roger, Karim, and Pedro.

Kara Swisher: Well, that was a laugh riot, huh? That was great. Last night, I got to interview Megan Rapinoe. That was really fun. This was not quite as fun. I’m going to bring up our panel. Oh, Roger’s already here, hovering behind me. Why don’t we all come up? Roger McNamee, Julian [Wheatland]. Thank you. Pedro [Kos], come on up, and Karim [Amer]. Thank you.

This was a tremendous … I really enjoyed this one. I’ve been covering this stuff for years. There was a lot of stuff I didn’t know that was in it. It was really interesting. Especially [former Cambridge Analytica exec] Brittany Kaiser, which I thought was really interesting.

I actually want to start with you, Julian, because I don’t know what you think from this thing. I don’t know if you think [former Cambridge Analytica CEO Alexander] Nix is a victim or a hero or if Brittany is, or if you think Chris is. I’d love to get your take. And Julian was the COO of Cambridge Analytica. I can’t tell what you think happened here.

Julian Wheatland: That’s quite a big question.

Well, why don’t you give me a big answer?

Julian Wheatland: I’ll try. I was the chief operating officer and the chief finance officer of the company from the beginning of 2015 until early in 2018, when Alexander stepped down and I took over, but really with the aim of closing the company down at that point. What I think was that the company made some significant mistakes when it came to its use of data. They were ethical mistakes. I think that part of the reason that that happened was that we spent a lot of time concentrating on not making regulatory mistakes.

For the most part, we didn’t, as far as I can tell, make any regulatory mistakes. But we got almost distracted by ticking those boxes of fulfilling the regulatory requirements. It felt like, well, once that was done, then we’ve done what we needed to do, and we forgot to pause and think about, ethically, what was going on. That was part of it.

Then there’s a whole other part of it which … I’m sorry, that was the part of it to do with the data analytics and the digital targeting and the Trump campaign and the Cruz campaign and commercial work as well. Then there was a whole other part of it which was political campaigns elsewhere around the world. That was really very different work. It was very strange work, it was on the ground, doing a lot of research trying to understand people and view them as groups, rather than the attempt at one-to-one targeting for digital.

Well, I’ll be honest with you, there are lots of stuff in this movie about what was being promoted there and proposed there that was news to me. Certainly, stuff that on the undercover video with Alexander Nix, which I’m almost certain was never done. I don’t think the company ever had any capability to do it. I think he just got carried away. If you ask me, honestly, I think they both looked drunk in that undercover video.

So, they didn’t mean to say it?

Julian Wheatland: I could not answer whether they meant to say it or not. The company didn’t provide those services. Don’t think it could have provided those services. They look drunk to me.

All right, so: drunk. All right. Okay.

Julian Wheatland: I suppose the other thing I would say about Alexander is that Alexander’s favorite expression is “never let the truth get in the way of a good story.” He is, to say the very least, a consummate overseller of services. I think what you heard in there was trying to take credit for the Brexit result, for example, without actually saying it or taking credit, but with a nod and a wink.

Also, I think, in the Trump campaign, we took too much credit, more credit than we warranted. There was a reason for that, and that’s because we thought Trump was going to lose. So, we had started a PR campaign in advance of the election to try and explain, well, “it would have been worse if we hadn’t been there and we didn’t choose the candidate” and things like that. Then of course, he won. That groundwork came back to bite us.

You’re basically saying you were stupid and you were liars, which I find it’s kind of interesting to think about that.

Julian Wheatland: I’m fairly sure I didn’t say that.

No. I can’t get what you’re saying. So he was overselling it? He was lying about the capabilities, but he didn’t do the bad things that he said he did. Is that correct?

Julian Wheatland: I can’t tell you what bad things he did or didn’t do. But when it came to shipping girls from the Ukraine to Sri Lanka for a candidate, I don’t believe he did that. I don’t think that we had any capability to do that. Or indeed, that there was any desire of anyone else in the company to do that.

All right. So, Karim, you spent a lot of time in the piece, trying to paint a very powerful company that did have these capabilities. When you finish this, what did … Because one of the parts they thought was the [Alexander] Nix testimony, which I’d forgotten about that he made, was the victimization of Cambridge Analytica, to being thrown under the bus, essentially, for other more powerful players. Which I think Julian talked about, that it was about a bigger thing in the movie. How do you look at Cambridge Analytica in this? Is it just a useful idiot that managed to overstate itself or what?

Karim Amer: I don’t think Cambridge was a useful idiot. I think Cambridge was a sophisticated company that has a history in PSYOPS that we don’t … Its mother company, at least, SCL Group, the mother company of Cambridge, has a history in PSYOPS, running operations in different parts of the world to try and win the battle for the hearts and minds of different people.

Which governments have done for many years.

Karim Amer: Which governments subsidized, which our tax money subsidizes.

But it’s been done since the beginning of time, this kind of thing.

Karim Amer: Well, it’s always been done. But I think the sophistication and weaponization of it to a far more exponential level of power has happened with the advent of that, mixed with some of the power of Silicon Valley. Silicon Valley’s government contracts that allow for some of this testing to happen … oftentimes, this testing is happening in third-world countries.

What was your impression from this of Cambridge Analytica? Did it stop there?

Karim Amer: My impression of Cambridge Analytica is that Cambridge Analytica was involved in a lot of things that are ethically very, I think, morally bankrupt when you look back at some of them for sure. But to blame Cambridge Analytica is, I think, a little short sighted. Because Cambridge Analytica is not the reason why our democracy is for sale.

Well I think that’s —

Karim Amer: We agreed to that. We allowed for this country to have the largest marketplace when it comes to elections. If you’re an aspiring entrepreneur in the election business, this is the megaship. The American public has allowed that, has supported that, has encouraged that. So, you can’t blame companies for wanting to profit off the fact that we’ve commoditized our election process. You can’t blame companies for then realizing that our technology platforms have designed this incredible system whereby we have access to so many people’s data, and have inoculated people to not realize that their data can be used against them.

I think that the idea of blaming someone for the discontent that we’re in is challenging for me, because I think that we’re all complicit in building it. I think that companies like Facebook are far more problematic to me than even Cambridge Analytica. There would be no Cambridge Analytica without Facebook.

I think one of the most interesting quotes was actually Julian’s, “There was always going to be a Cambridge Analytica, it just was, unfortunately for me, Cambridge Analytica.” Which was the idea that they were taking advantage of what was already existing. All kinds of companies do that, all kinds of campaigns do that to manipulate election results now.

Roger, how did you look at what was here? Because I do think there wasn’t a whole lot about Facebook’s behavior in this, because since they’ve been doing this, they now seem to have slipped out of it at this point. Their stock is at an all-time high. They paid what I call the parking ticket to the FTC. They are enjoying, once again, Wall Street celebrating what they’re doing. They’ve made changes, of course, to their behavior, but it seems as if they have slipped out of it. Let me start with Roger and then Pedro.

Roger McNamee: I think this film is incredibly important because what Cambridge Analytica and the UK Brexit referendum and the US election and the interplay with Facebook demonstrates is, it’s like this tiny little example of what is a systemic problem with data in our society. That essentially, our human experience, as the professor from Harvard, Shoshana Zuboff would say, our human experience has been converted into data, then run through machine learning and being used to then manipulate the choices available to us and, through that, manipulating our behavior.

That can be done in any aspect of our lives. We were willing to go along with it right up until the moment when it affected the outcome in what should have been democratic elections. To me, the significance of the story is that this is the one that finally got people to pay attention. In my mind, I look at this and the important lesson is, we have not done one thing, not one thing to prevent this from morphing into greater and greater problems.

If you sit there and you look at what Facebook is doing with Libra — which is its so-called cryptocurrency, which is an attempt to displace reserve currencies around the world — if you look at what Facebook is doing with Sidewalk Labs, which is its Smart Cities initiative, that is essentially an effort to apply The Matrix or Minority Report to real cities.

It’s not just that we haven’t learned anything yet. We’re still at the point where the problems are compounding. I would say it’s not just the stocks are at an all-time high, regulators around the world are struggling to even have the correct vocabulary for dealing with the problems. That we continue to view our relationship with platforms as we give them a little bit of personal data, they give us a tremendously valuable service and do a little ad targeting.

That may have been true five or 10 years ago, it is definitely not true today. The data we give them is less than one-tenth of 1 percent of what they know about us. They have a data voodoo doll, which is a complete digital representation of our lives. With it, they can manipulate our behavior. The data we give them is literally irrelevant. This is all about the fact that your bank can sell your data, credit card receipts can be sold, your health data, your location data, you’re tracked endlessly and everywhere.

We’ve allowed all this to go on for reasons that, to me, make absolutely no sense. Again, we were ignorant, right? It’s okay to let something happen when you don’t know. But now we know. Thanks to this movie, we have a really crystal example of it. If we’re going to learn something from this, if we’re going to have to have gone through this experience, let’s at least learn something and not to assume.

Where do you feel we are? Pedro, where do you feel … Because just again, this week, the FTC settled with YouTube over children, data around children and with Facebook over breaking the decree that they had agreed to in 2011. When you have stuff like this … Look, here we are, we’ve gotten this lesson, have our regulators, especially in this country, because that’s who’s responsible for these largely US-based companies. Have they learned their lesson? Or do they … I feel like the FTC was asleep at the wheel for a decade over with Google and everyone else, and now that they’ve woken up, they don’t have staff, they don’t have funding, they don’t have the political will to do something about it.

Pedro Kos: Yeah, but political will starts with us. If we’re not asking those questions, they’re not going to act upon that. I think one of the things, as Roger said, there was a … When I came into this, I didn’t know anything, and I had a deficit of language. I didn’t understand the space. To me, it was this black box, this magic box. “Oh, I can go online and connect and create and I get instant gratification for a lot of things.” But when what I needed to do was peel back and learn at least what questions to ask, because we don’t even have that.

I think we wanted to approach this in a way where we can at least begin to tackle and have a conversation, because we’re not even having that conversation. For us, it was very important to spark that and to ask the questions, and the ethical questions that I think are desperately needed to be asked.

Right. So, Karim, when you’re thinking about wanting to do that, to create this question, what do you want people to do from watching this? One of the things that I find really interesting is this is going to appear on Netflix, correct? Which is run by Reed Hastings, who was on the board of Facebook, and has since left the board of Facebook, for reasons unknown. What were they trying to do here, and what were you trying to do here?

Karim Amer: Well, it’s important to know that this film, when we initially began it with Netflix, it was not about this topic. It was about, actually, the Sony hack. It had nothing to do with this. Netflix has allowed us to shift course. As we started looking at this word “hack” and realizing that the hack that was more interesting was not physical hacks that are happening to infrastructure or places, but actually the hack of the human mind.

So, we became obsessed with trying to look at how we could show how people’s minds have been hacked and changed and how we could change people’s minds, and the vulnerability that’s amongst us. But the problem we faced was one that I think this entire conversation faces. Which is — Pedro was alluding to this — that there’s a deficit of language.

Essentially, the only thing I can compare it to is similar to the environmental movement, which many people in this community are very familiar with. In the early days of environmentalism, people weren’t using the words “global warming.” We didn’t have these words, we didn’t know what to call the phenomena. Yet people were feeling the anxiety of the planet.

Now, the thing that the environmental movement had that we don’t have here was that at least you could show the imagery of the wreckage site. As data has surpassed oil, as we mentioned in the film, when we were living in the advent of oil, when there was an oil wreckage site, you could see the spillage, you could see the marine life destroyed. You could see that image of a bird barely able to fly because of the …

Covered in oil.

Karim Amer: Covered in oil. That image could transcend language and travel the world. We have had all of these data breaches, these hacks, these big headlines, yet we don’t have the imagery. Part of our task became similar to what David Carroll says in the film is how do you make the invisible visible? We really wrestled with that for a while.

It took a combination of things. One was finding characters — as we always do in a verité style of filmmaking — who can inhabit the story. People who you feel are jumping off a cliff into this world who can take you into the stakes of the conversation. We found that what David Carroll, he was a person saying, “I’m not going to wait for the Mueller report or whatever report’s out there, I want to figure out what do I get to know about myself?” Was just asking simple questions about data transparency and put himself into the ring to ask that.

He also had someone like Carole Cadwalladr who was obsessing in the journalistic endeavors of figuring out what happened and what do we have the right to know and how do we hold power accountable? Then it was important that you had people who were coming out from the inside of these places. People like Chris Wiley, people like Brittany Kaiser, and even people like Julian, who did not have to agree to be in this documentary.

No, I’m going to ask him next. Julian, why did you decide to be in this documentary?

Julian Wheatland: You asked me earlier on what I thought of the company. There was one bit which I didn’t get to, which was that there were 120 people in the company that were clever, innovative, creative, thought that they were doing great work, believed in the work that they were doing, sincere people, and they all lost their jobs from this. They all got tarnished with that undercover video. I wanted to speak up for them as much as anything.

The other thing is that I think that there are lessons. I’ve learned a huge amount through all of this. There are lessons that I’ve taken away from this that I want to talk about, because I don’t want there to be other Cambridge Analyticas. I don’t want us to think that the problem is just Facebook. The problem is Facebook, but every company is a data company today, and how that data is ethically managed needs to spread through all companies.

Right. When you were talking about that there were lots of lots of people that worked at Cambridge Analytica that weren’t part of this. It’s the same thing at these companies now, at Google and Facebook and other companies, that aren’t part of that. How do you sort that out when there’s so much money being made and so much opportunity to take advantage of this data?

I have a theme I’m thinking about writing about this week, that most of civilization is a cheap date to these companies. That you get a map and some email and something else and you think that’s a great trade, when in fact, it’s probably the worst trade you’ve ever made. How do you get that within the companies? You were running one of these companies. Obviously, you were surprised by that video, for example. It killed the company. That video killed the company.

Julian Wheatland: Yeah, it did. I think what’s missing in that … Again, I’m just going to stress, I divorce it from the likes of Facebook and Google because I think they require a different solution. To be honest with you, I would take Facebook and recognize it as being the monopoly utility that it is, that people can’t leave, and should be regulated as such.

But for the rest of … let’s call it the tech community and the ordinary companies, the controls and management processes need to be set up internally such that every time there’s a new data source, every time there’s a new way of analyzing, every time there’s a new technique that can be used, that it gets evaluated as to whether or not it’s in alignment with the ethical objectives, clearly stated ethical objectives of the company, and that that is reviewed up to the board level. And it’s not left for the individual engineer or technologist to do.

Such as Imgur or WhatsApp, not making it viral or something like that. I say the same thing over and over again. I’m trying to get people to repeat it over and over again within Silicon Valley, which is, if you can think of your product as an episode of Black Mirror, don’t make it, yet. Start to think about how you could not make it an episode of Black Mirror.

Karim Amer: I don’t think anybody wakes up in Silicon Valley … I think this is important for this audience. That we’re not here to just scold the Valley. I don’t think anybody …

That’s what I’m here for. But go ahead.

Karim Amer: I don’t think anyone wakes up in Silicon Valley and says, “How am I going to wreck democracy today?” If they are, then …

Well, there’s one guy, but go ahead.

Karim Amer: I think it’s, how do we get here? We get here from the place that this culture of “move fast and break things,” and that we could just innovate and there’s no cost to that innovation, is just fucking false. Sorry, call it what it is. Some of the smartest people in the world live in this community and you can’t call yourself out and realize that this culture has gone haywire, and that no culture ever has self-regulated. It just doesn’t happen because humanity — which some people may have skipped in the focus of just math and tech and science. In that humanity class we learned about hubris and ego and how it’s just that thing that every culture and civilization has and has always been the thing that broke every man. It still exists. That’s a thing that we have to hold accountable in this city.

Kara is doing her part in helping to do that. But I think it takes a community to have this conversation and say, it doesn’t matter how we got here. It doesn’t matter Facebook and Cambridge Analytica were working together in cahoots with the Russians, that doesn’t matter. What matters is there’s the wreckage site and how do we fix it?

I think it should be treated the same way you look at natural disasters. When there’s a natural disaster that hits a community, what happens? People band together and they start working together. They don’t worry about whose fault it was that the doors didn’t work. They just start trying to fix it. I don’t see that urgency …

Well except — Yeah, it’s interesting, that is absolutely true. But Roger, why don’t you talk about this, because I did a big long interview with Mark Zuckerberg that he probably shouldn’t have done. But I couldn’t get him to understand consequences. We spent a lot of time going back and forth about who’s [responsible] and he saw it as a blame thing and I saw it as, if you make something bad, you have to understand how you made it so you can fix it. It was a very different … The points of view were very … It was interesting to see Alexander Nix here because he felt like he was a victim of this, versus that he had any cause, he was any kind of player.

Roger, talk about that concept. Because that’s a famous Facebook [motto], “move fast and break things,” which they’ve changed to “move fast and create stable infrastructure,” which is less problematic, I guess. But that word “break,” I remember when they did it at the time, when I saw it … this was plastered on the walls of Facebook. They had all kinds of sayings and stupid stuff like that. Most of these companies do. But I remember saying, “break?” and they were like, “Yeah, break.” I’m like, “Break is bad.” It was like a debate. You were there. Did you think this was nifty?

Roger McNamee: It wasn’t just them, it was a broad notion. In 2003, 2004, finally, the limits of Moore’s Law, of Metcalfe’s Law, the things that prevented you from making global universal consumer products evaporated. Suddenly, the cost of creating startups went way down. So, the barriers to entry went way down, and everything became about eliminating friction. The notion was, if you wanted to dominate globally, you had to move faster than anyone else. Which also meant you had to move faster than the human reaction time or regulator reaction time or critic reaction time. The notion was, you would blow by everything.

That was something that I think came in with the PayPal mafia. This is Peter Thiel and Reid Hoffman and Elon Musk as well, that they came in and they brought this philosophy, it was so spectacularly successful so quickly. The issues took a long time to show up. There were glimmers all over the place of problems, but you didn’t have anything quite like the 2016 election or like Brexit, that focused everybody’s attention on one thing.

My observation about all of this is that I think it’s time for a forklift replacement of the culture. That essentially, Silicon Valley has to remember that it’s spent 50 years in the business of empowering the people who use technology. Now, it’s in the business of treating the people who use it as a source of fuel. We’re not the customer, we’re not even the product. We’re a source of data that isn’t used to improve something for us. It’s used to manipulate whole populations. In my mind, that is morally bankrupt, and I do not believe it can be fixed.

I think the way you get out of this is the way you always get out of tech problems, which is you create an alternative universe with a different value system and you give people choices. You’re not going to lose the things you like about these products, they’re not going to go away. You shouldn’t be afraid of change. What we should be doing is asking the question of, why isn’t this stuff designed to make my life better? Because it could be. In fact, I think that would be a bigger business opportunity, because all the stuff you’re doing today is still economically valuable, and there’s millions of things they don’t do, because those things don’t fit into the surveillance capitalism business model.

Right. You want to switch out the “Soylent Green is people” idea of things. That’s what you’re talking about: People are fuel. I’m making a really old Charlton Heston reference.

Roger McNamee: But in the current model, that’s right. The thing is, again, Silicon Valley exists in an ecosystem globally, which sits there and says the only constituency that matters is the shareholder, and the only thing that management should care about are metrics. Well, the reality is, ethics doesn’t stand a prayer in that environment. Because ethics, by definition, is the willingness to subordinate a metric to a higher value.

I think that we have a problem in business culture broadly. In a world where Wells Fargo Bank, a local bank here, took money out of the accounts of millions of account holders without their knowledge. Where you had the banks generally bankrupting the entire economy and getting away with it. Silicon Valley’s not alone, any more than Cambridge Analytica was alone. This is a pervasive issue. I just think we have to decide what we want capitalism to be. What do we want our economy to be? Because if this is it, if this is the best we can do, that’s …

One of the things about this movie that I did see, although I was riveted to how you got not just Julian but Brittany Kaiser to talk. I’d love to know how to do that. But first, one of the things that she’s … Wow. I just want to say, I’d love to know what you think of her because I can’t tell.

Julian Wheatland: I’m not saying that here.

Okay. All right. Okay. Yes, you are. This worked, the great hack worked. That’s what I came away from it. It’s like, everyone is finger-pointing and saying this is a mess, it’s a mess. But it worked. What Steve Bannon was doing originally when he was an investor, he wanted to break everything, which he used these tools to do so. It is his philosophy about that. How do you look at that? Because really, what you’re painting is a picture of success in that there’s no repercussions. You’re painting a picture of the damage, but what are you hoping to do?

Karim Amer: Well, what are we hoping to do? I think what we’re hoping to do is … I think that one of the problems is that people in the UK and the US believe that democracy is some god-given right. That it just comes from the sky, and it’s just like most people in this audience are just born with a democracy. So, you just assume it’s always going to be there. The problem is democracy isn’t a god-given right. It’s very fragile, and societies are fragile. Having come between immigrating to this country, and having lived between the United States and Egypt, and our previous film that Pedro and I and Geralyn [White Dreyfous] and Jehane [Noujaim] made about Tahrir Square, illustrated the fragility of that democracy.

I think right now we’re witnessing another kind of fragility. I think that for us, it was important to express that in this film. That fragility we’ve seen from the pendulum of technology swinging to the other direction. In Tahrir Square, we saw the tools of technology be heralded as these democratizing tools that would just catalyze freedom. Then we saw them swing the other direction, and no one in Silicon Valley was taking credit when they were being used by authoritarian governments or being used for ISIS recruitment, etc., etc.

I think what we found here is that there is a fragility of the notion of democratic governments in the face of monopolistic tech platforms. There just is. Who has more power when someone sits — a senator or Mark Zuckerberg? How many constituents does Zuckerberg have? There’s a fragility in the notion of truth when we have weaponized information systems.

Then I think there’s a fragility that I don’t know how to solve for and I’d love to hear from people who work in this, which is the fragility of the “shared values” that we subscribe to as a society, as a democracy, when everyone’s living in their own personal reality. The platform is incentivizing that confirmation bias constantly, and getting to the point where the platforms may actually be incentivizing the polarization of American society. No one’s saying anyone designed this. No engineer sat in a room and said, “Hey, how can we make people more neurotic?” I don’t believe that that’s how it happened. But the point is, we’re here and that’s what’s continuing to happen. The question is, whose balance sheet do these societal losses show up on?

Well, it’s similar to cigarettes or anything else. Just a second, Roger, I want to ask a Pedro question first. How did you get these people? Because in a lot of ways, you’re talking whatever you think, and Brittany, she’s talking, Chris [Wylie] is talking. When I do stories, there’s always one person … I remember there was one person in Uber, we had a very tough story about stealing medical records of a rape victim who was with an Uber driver. It was one person within the company who couldn’t take it.

It was really interesting. I remember the conversations, “I can’t live with myself.” They had lived with themselves and benefited mightily from being in Uber. But I remember that, it was that moment of turning someone who, I wasn’t trying to turn them, but it was a really interesting discussion. These people … Look, Brittany, for example, is a really complex character. I’m not sure what to think of her. I don’t know what her whole jam is going on. I didn’t even know if she was just telling the truth about her parents. I was like, she lies almost continually about, but there’s truth in there. There’s also truth. She was obviously there. She obviously has the emails, she was obviously part of it, that shift from the Obama administration.

I think the only thing she said I thought that was super truthful is like, “They didn’t pay me,” which was really, I thought that was a great moment. How did you get her to talk and how did you … Chris has been talking a lot, obviously. But how did you get her to talk? She’s a key character here.

Pedro Kos: I know, she’s integral. But I think Karim would be a better person ask how I got her to talk. But I will say one thing. I think there’s a lot of truth, and I think also that we’re all very quick to judge Brittany. But I think especially, we have to put ourselves in her shoes as well. I think there’s a lot of mirrors that Brittany can provide of who we are, and her journey, what we really tried … She goes on a journey. It wasn’t a lightswitch, I don’t think, at least in my book.

We really dug deep through … She was very open, but she really, through her journey, through her very intimate journey and got through to starting to see who we are, and really show us a mirror of what questions we need to be asking.

Karim, why don’t you tell us, how did you get her to talk? Because what I found was teaching about her, is someone who perpetually lies to themselves is actually telling the truth by accident. It was sort of a mindfuck that way. I just couldn’t tell what I thought of her. But how did you get her to talk? She loved the headlines. You could see her eating up the attention and everything else at the same time, which I thought was beautifully done from a photography point of view. You got that sense of she loved being part of the drama at the same time.

Karim Amer: Well, she was living it. When we met Brittany, she had just basically decided to begin her process of going public. I had been navigating the story for some time. I had met Carole, almost a year and a half before, or a year before, and I’d met Chris Wiley a year before he’d come public, and I’d met several people in the space. But there was a challenge of getting people on camera at the time to explain what was happening. We’d even tried to get Facebook to talk about it, and they refused.

I basically, through someone in the documentary film community, just searched, met Paul Hilder, who’s in the film, and he had been in touch with Brittany for some time and had heard that I was this obsessive documentary filmmaker who was really into Cambridge Analytica and was looking through all the nooks and crannies. Then we had a chat.

Basically, I was going to the airport and he was leaving the airport, and we spoke for 15 minutes. He’s like, “Well, look, if you really want to do this, you got to go to Thailand tomorrow.” I was like, “Tomorrow?”

Because she’s at a pool.

Karim Amer: I didn’t know where she was. I was like, “I got to go see my kids. I can’t just go to Thailand.” I have three kids under the age of 3.

“Somewhere in Thailand,” not Thailand.

Karim Amer: I had to go to Egypt and she was in Thailand. So, I called my partner and wife, Jehane, who is also an amazing filmmaker. I was like, “I got to go to Thailand.” She’s like, “No, you don’t. You got to come home.” I was like, “I’m telling you this is real.” She’s like, “It’s not.”

Pedro Kos: We all questioned, really, should you go to Thailand? Who was this Brittany anyway? Is it really worth it? But at that time it was really … I have to take my hat off to Karim, who really believed that there was something there, even if it doesn’t lead to anywhere, but I think we have been digging so long in the space, trying to find answers and …

Someone who was there.

Karim Amer: Also, trying to find a journey, because of the way of filmmaking that we follow is something that’s experiential. This is a story where it’s so interesting, and these articles, you can have all these amazing intellectual debates about it. But why the hell is it a movie? What makes it a movie if it’s not experiential?

We were looking for people who were actually actively on a journey that wasn’t just happening in their Twitter feed and it was in real life. Here was someone who not only was on an incredible journey, but in Brittany’s story, she could take you through the ascent of these technologies. They were used as a tool to inspire hope and change and create one of the most historic elections of Barack Obama and then used in the total flip side, in a very different election where a lot of other types of tactics were used, in not so hopeful ways. But led to equally historic outcomes.

For one person to embody both, the trajectory of the whole thing, it became an interesting metaphor for how all of us got here. I think that it’s difficult for many people to absorb who Brittany is because we’re in the most polarized time of our lives. You don’t switch from one uniform to the next. You don’t go from blue to red. That’s blasphemy. You don’t go from remain to leave. That’s blasphemy. Yet at the same time, I challenged the audience to ask this question. If you don’t have redemptive stories in this time, where do you go? Where does the road of polarization take you to? Civil war? Or do you just make these filter bubbles even more cemented, where you don’t have to deal with the people you don’t like and you don’t have to look at the people who challenge you or don’t look like you?

If we don’t provide a way for people who have done things that we may question to come out and not only tell us what actually happened, but to prove that they too can change, then we’re no better than anybody that we’re judging. So, I think it’s important to look at that.

Yeah. Let me ask you, just from a director perspective, why did you film …

Karim Amer: One of the things I was … The pool was the first interview.

Yeah, why did you film her in a pool?

Karim Amer: Because she was in James Bond Island and she was in this pool. I didn’t know if I’d never see her again. My wife told me, “If you don’t come back with something, don’t come back.” So, I was like, “Okay, well, if you’re willing to talk, let’s just do this now.” I just got in the pool and the DP I had never worked with was like, “Really?” I’m like, “Let’s just do it. I don’t know, what if she changes her mind?” So, we just started filming.

In these types of films, as some of the documentary filmmakers who might attest to, you got to film like every day is your last, because you just don’t know. You don’t know if people are going to change their mind. I think one thing that we also wanted to show is we’re these moral creatures who are crippled by our morality in many ways. It’s what holds us back, but also makes us human. But we’re being shaped by these algorithms, and we’re trying to tell a story about how these algorithms are shaping our lives.

These are amoral algorithms who aren’t crippled by that morality. Here was a character who could also traverse the complexity of morality and take us to different moral choices and show us what was. For me, that was also something that was quite important to capture, and that’s why we followed her on this journey.

Okay, Julian, I’m checking in again, what were your thoughts on Brittany?

Julian Wheatland: I guess, the parts of this film that are like a patient and a counselor …

Absolutely.

Julian Wheatland: An interaction between … I didn’t really understand people that go on reality TV shows, full stop. But Brittany has always been at the end of the spectrum. It’s actually the reason why, when the Leave.EU campaign said they were going to hire us, she was the first to put up her hand and say, “I’ll go and I’ll be at the launch with you.” I think she just likes being out in public.

Mm-hmm. And her journey here, do you think it’s truthful?

Julian Wheatland: I think I answered that in the film. I do not know why she said a lot of the things that she said. I don’t have the key to her mind on this.

I see. So, everyone you work for is crazy. It seems like you don’t know why [Alexander] said it. Why do you think they did it, though? Would you have any … You worked with these people for years.

Julian Wheatland: Well, it depends what we’re talking about. We’ve got Alexander and we’ve got Brittany going off and making sales presentations on their own. It seems to me that they were the only people who knew what they were pitching. I think that she was probably led down the wrong path by Alexander, but I don’t know that.

I agree with you on that. Yeah, it’s interesting.

Roger, when we’re talking about, if we’re going to talk to each other, if we’re going to get there, some people feel that’s the way to go to how to figure ourselves out of this. Other people feel, this is a group of people that are going to continue to game the system, and that good people lean forward and say, “Okay, now let’s get together.” But it’s shown to be a bit of an administration who just keeps pushing the boundaries this last week. Same thing with the tweets and everything else. It doesn’t stop. The constant propaganda machine does not quit, and now it’s overt. They’re not even hiding, in lots of ways.

Roger McNamee: The problem that I’m trying to address and have spent, I guess, now three years trying to deal with, is this notion that Silicon Valley has inadvertently created companies that are identical in our economy to what the chemical industry was in 1950. The fastest-growing, most-profitable companies in the economy. They are that way because they are not held accountable for the economic externalities that they created.

Like chemical companies, they would pour mercury into freshwater, that was not a problem. Mine companies would leave tailings on the side of the hill. Petrol stations would pour spent oil into sewers and there was no accountability for that. As a consequence, they were artificially profitable. Eventually, we caught up to them and said, “No, we’re going to hold you accountable.” These companies are doing toxic digital spills for which they’re not being held accountable.

By my very, very rough estimate, Facebook, if it were held accountable for the things that it has done, would be an unprofitable company and Google would be modestly profitable in comparison to today. I believe that if we do not do something about making companies in the economy generally accountable for the damage they do, the political arguments are going to turn out to be irrelevant.

I’m completely with Karim on the issue of, I am not interested in litigating the past, I’m not interested in attributing blame. I am interested in solving a problem. We have lost our sense of civic pride, civic engagement. We are focused on symbols, as opposed to the substance of what it means to be a citizen. In that context, what I’m hopeful of is that we can engage with everyone willing to engage, and that that group of people ideally can, by coming together, create enough mass to restore some function to democracy — not just here, but everywhere that’s being affected by this.

Because it’s this defaulting to authoritarianism, because you can’t see a way out of this mess, you feel like you’re powerless. And I’m going, “That’s nonsense.” Look what this tiny team produced, this amazing film which is going to have this massive ripple effect. And I look at this and … Kara, all by yourself, you have shone lights on so many of these problems and it’s having an effect.

Mm-hmm.

Roger McNamee: We haven’t solved any of those problems, yet …

I don’t think so, but okay. Sure.

Roger McNamee: But all of you are here today. No, but hang on, all of you guys are here today and, whether you like it or not, you’re part of the resistance now because Google and Facebook know you’re here with us.

All right. But Roger, in that regard, and I want to finish up with you guys saying what needs to happen. You’re saying that and you’re like, whatever. I’m the nag, you’re a nag also.

Roger McNamee: I’m a nag.

You’re a nag, we’re nags together. But then, Libra!

Roger McNamee: You’re snarkier.

Yes, I’m a snarkier nag. And then Libra, and I was like, “Geez Louise, these guys.”

Roger McNamee: They haven’t learned a thing, right?

Not a thing. They’re going to ruin global currency now. Yay!

Roger McNamee: This is the thing that just drives me absolutely insane, that they’re so focused on … In fact, this may be symbolic of a whole cultural problem of everybody being so focused on their own needs that they can’t imagine that there might be a flaw in that approach.

My parents were children of the Depression, they went through the Second World War. In the ’50s, we began the culture of consumerism, of having consumer packaged goods your way, prepared foods and all that. And what Google and Facebook did was they did the same thing for ideas. Have it your way, right? Facebook is 2.3 billion Truman Shows and Google is “whatever search results you want.” And I look at this and I just think, “Democracy,” and I think, “Personal choice, agency.” Those things are actually incredibly valuable and you may not realize that until they’re gone, right?

But we, right now, are being challenged to decide what we care about, what we believe in, and how much control of our own lives do we want to have. Because Google strategy’s really simple, right? They want to replace free will. They want to replace democracy with algorithmic processes. This is the genius of what Professor Shoshana Zuboff has written in her book, The Age Of Surveillance Capitalism. She doesn’t just describe how this economic system works. She describes the motivations that create it and, inevitably, the processes that it will lead to. And we ignore her advice at our peril.

Okay. So, let’s talk about what you want to come out of this. Each of you, very shortly … Why don’t we start with you, Karim? What is the goal here? What would you like to see happen with the impact of your movie?

Karim Amer: Just, if you’re not thinking for yourself, there’s an algorithm thinking for you. So, start thinking and asking questions.

Right. And I also love the way you use Endgame[-style graphics], with the … That was well done. There were people disappearing into their digital dots.

Karim Amer: So that’s the genius next to me.

All right. Pedro, what would you like to see happen?

Pedro Kos: I’m wondering who are the next Brittany Kaisers, who are the next people that are — especially here in this community, in this area, in the Bay, in Silicon Valley — who are getting a little uncomfortable and empower them. We talk about filmmaking as an art form that requires imagination. Well, I think everything requires imagination. I think it’s like imagining what could be beyond and really challenging that, and with the work that you’re doing really creating community.

Julian?

Julian Wheatland: I’d like to see competition, and I’d like to see the tools of competition. So, for Facebook and Google it’s different, it’s too late. Break them up, maybe Google we can break up. We could break Facebook up, but because of the network effect of the platform, I think they end up being regulated as a monopoly utility. But for everybody else, I’d like to see the tools of competition so people can compete on how ethically they use data. It’d be transparent, people can understand it and choose whether or not they want to be on that platform, whether or not they want to leave their data there.

There’s one proposal over there, that they share the data, that Facebook and Google share their data, so new companies can be created. But competition, I think, is 100 percent of the way, and then they’ll create new companies that later we have to deal with.

Julian Wheatland: And they’ll make it as hard as possible to share the data as they do now.

Right.

Julian Wheatland: Until someone steps in and makes them do it.

Karim Amer: And then one thing I wanted to say was related to data, as I’ve not involuntarily become more of a person speaking on this than I ever imagined, but I think the word that we can borrow from, that we’re also hearing debated in pop culture actively, is the word “consent.”

Mm-hmm.

Karim Amer: Right? Consent has never been more debated in our society — for good reasons — but we should apply that word as it relates to data, as it relates to this relationship with technology, and as it relates to this question of the admission fee to the connected world is giving up all your privacy without having any idea where it goes, and not being aware in each area of your recordable behavior what you’re consenting to and what your tolerance is for that kind of exchange. So, I think consent should be a key part of design, in Silicon Valley especially.

Oh, probably ruin their business plans, but go ahead. Finally, Roger, short, so we can…

Roger McNamee: I would like everybody here to go out and seek out every elected representative you can find and get in their grill and ask them, “Why in God’s name are companies like Google and Microsoft allowed to scan your emails and your documents and messages for valuable information? Why are banks, credit card processors, and credit-rating agencies allowed to sell your most intimate financial data? Why are health care service companies allowed to sell data about your health care situation? Why are cellular companies allowed to sell your location? Why is anybody allowed to collect data on minors and hold it? Why is anyone allowed to trade any of this stuff? Why are they allowed to exploit it at all?”

I don’t think this is a question of who owns your data. The problem we’re dealing with now is that society is being destroyed. It doesn’t actually hurt any of us individually, it is killing our country, it is killing the world. It is as likely to cause World War Three as anything.

There was, I thought, a terrifying op-ed in the Atlantic by Henry Kissinger and Eric Schmidt, in which they said, “AI has won, get over it, and we’re going to apply it to things like national security and diplomacy.” And I’m going, “Wait a minute. You’re going to take a technology which was perfected on zero-sum games and you’re going to apply that to global diplomacy?” That doesn’t work. Right?

Yeah. It’s just what Henry Kissinger and Eric Schmidt would say.

Roger McNamee: But I’m just saying. We can’t — we got to think …

Those guys, good God.

Roger McNamee: We have to think again. Okay? You know what I’m saying?

How did I miss that? That pair. I didn’t even want to, just look at my face. You gotta be kidding me.

Roger McNamee: Yeah.

Yeah.

Roger McNamee: With Eric Schmidt?

Yeah.

Roger McNamee: Two of your faves.

Okay. And Henry Kissinger adds icing on top of the toxic cake. Julian, are you going to be in any more movies? I’m kidding.

Julian Wheatland: I hope not.

What are you doing? What do you do [next]? I’m curious where you go after Cambridge Analytica.

Julian Wheatland: Yeah, me too.

Okay. All right. And what is the next movie you guys are making? Let’s finish it up.

Pedro Kos: I’m flashing back to the 1960s about a group of incredible Catholic nuns who rebelled against another structure of the Catholic Church.

Oh, that one.

Pedro Kos: And are still challenging that today.

I love a Catholic nun story. Go ahead, what are you doing?

Karim Amer: I’m making a series on this self-improvement group that turned out to be a cult called NXIVM.

Oh, them. Oh, wow, wow. That sounds like a great …

Pedro Kos: It’s both about this and that are both about how persuadable and vulnerable … One of the things someone mentioned about how Cambridge Analytica targeted the “persuadables.” I think the problem with that question is that implies that persuadables are one bucket of people. The reality is we’re all persuadable. We may not be persuadable on certain political topics, or immovable on it, but we’re all persuadable and our minds are a lot more vulnerable. And the sooner we accept that the better, in my opinion.

All right. On that note, thank you so much, Roger, Julian, Pedro, Karim. And tell everyone about the movie. It’s a terrific movie. It’s appearing when? When does it appear?

Karim Amer: Wednesday, so let everyone know about it.

Wednesday.

Karim Amer: Tweet about it and let the world know. And thank you for having us.

Thank you very much.

Karim Amer: And thank you, Kara, for continuing to speak truth to power.

All right, thank you so much.

Recode and Vox have joined forces to uncover and explain how our digital world is changing — and changing us. Subscribe to Recode podcasts to hear Kara Swisher and Peter Kafka lead the tough conversations the technology industry needs today.

Sourse: vox.com

No votes yet.
Please wait...

Leave a Reply

Your email address will not be published. Required fields are marked *