From the book MINDF*CK: Cambridge Analytica and the Plot to Break America, by Christopher Wylie. Copyright © 2019 by Verbena Limited. Published by Random House, an imprint and division of Penguin Random House LLC. All rights reserved.
At first, it was the most anticlimactic project launch in history. Nothing happened. Five, ten, fifteen minutes went by, and people started shuffling around in anticipation. “What the fuck is this?” Cambridge Analytica’s CEO, Alexander Nix, barked. “Why are we standing here?”
It was June 2014. Fresh out of university the previous year, I had taken a job at a London firm called SCL Group, which was supplying the U.K. Ministry of Defence and NATO armies with expertise in information operations. Western militaries were grappling with how to tackle radicalization online, and the firm wanted me to help build a team of data scientists to create new tools to identify and combat internet extremism. It was fascinating, challenging, and exciting all at once. We thought we would break new ground for the cyber defenses of Britain, America, and their allies and confront bubbling insurgencies with data, algorithms, and targeted narratives online. Then billionaire Robert Mercer acquired our project. His investment was used to fund an offshoot of SCL, which Steve Bannon named Cambridge Analytica.
By now people are familiar with the company: They have heard stories about how it used personality profiles built from Facebook interactions to target and sway potential voters; seen it debated before Congress; or read that it recently inspired Facebook to suspend tens of thousands of apps for improperly accessing data. Some have claimed CA helped sway the election for Trump, while others have said company executives exaggerated their influence. I first met Mercer in November 2013, in a meeting held at his daughter Rebekah’s apartment on the Upper West Side. Over the years, the hedge-fund CEO had donated millions of dollars to conservative campaigns. But in the months leading up to the launch, I believed Mercer’s interest in our work was primarily for its commercial potential, not politics. If we could copy everyone’s data profiles and replicate society in a computer — like the game The Sims but with real people’s data — we could simulate and forecast what would happen in society and the market. If you can predict what people will buy or not buy, or see a crash coming, you have the all-seeing orb for society. You might make billions overnight.
We had spent several weeks calibrating everything, making sure the app worked, that it would pull in the right data, and that everything matched when it injected the data into the internal databases. We were standing by the computer in London, and Dr. Aleksandr Kogan, a professor who specialized in computational modeling of psychological traits, was in Cambridge. Kogan launched the app, and someone said, “Yay.” With that, we were live.
The app worked in concert with Amazon Mechanical Turk, or MTurk. Researchers would invite MTurk members to take a short test, in exchange for a small payment. But in order to get paid, they would have to download our app on Facebook and input a special code. The app, which we called “This Is Your Digital Life,” would take all the responses from the survey and put those into one table. It would then pull all of the user’s Facebook data and put it into a second table. And then it would pull all the data for all the person’s Facebook friends and put that into another table.
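The three-table layout described above can be sketched roughly as follows. Every table and column name here is invented for illustration; the text only says that survey responses, the respondent's own Facebook data, and the friends' data each went into a separate table.

```python
# Illustrative sketch of the three tables described above. All names are
# hypothetical -- the original schema is not public.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE survey_responses   (respondent_id TEXT, question TEXT, answer TEXT);
CREATE TABLE respondent_profiles(respondent_id TEXT, fb_field TEXT, value TEXT);
CREATE TABLE friend_profiles    (respondent_id TEXT, friend_id TEXT,
                                 fb_field TEXT, value TEXT);
""")

# One paid respondent fans out into hundreds of friend rows:
conn.execute("INSERT INTO survey_responses VALUES ('u1', 'q1', 'agree')")
conn.executemany(
    "INSERT INTO friend_profiles VALUES ('u1', ?, 'likes', 'some_page')",
    [(f"f{i}",) for i in range(300)])

friend_count = conn.execute(
    "SELECT COUNT(*) FROM friend_profiles").fetchone()[0]
print(friend_count)  # 300 friend rows from a single respondent
```

The asymmetry is the whole point: the survey table grows one row per paid participant, while the friends table grows by hundreds.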
One person’s response would, on average, produce the records of three hundred other people. Each of those people would have, say, a couple hundred likes that we could analyze. We needed to organize and track all of those likes. How many possible items, photos, links, and pages are there to like across all of Facebook? Trillions. A Facebook page for some random band in Oklahoma, for example, might have 28 likes in the whole country, but it still counts as its own like in the feature set. We put $100,000 into the account to start recruiting people via MTurk, then waited.
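The fan-out the paragraph describes can be put into rough numbers. The figures below are the averages quoted in the text (about 300 friends per respondent, "a couple hundred" likes per profile), not exact data:

```python
# Back-of-envelope fan-out arithmetic for the harvest described above.
# Both constants are the rough averages quoted in the text.
FRIENDS_PER_SEED = 300    # ~300 friend profiles per paid respondent
LIKES_PER_PROFILE = 200   # "a couple hundred likes" per person

def profiles_harvested(seed_respondents: int) -> int:
    """Total profiles pulled: each paid seed plus their friends."""
    return seed_respondents * (1 + FRIENDS_PER_SEED)

def seeds_needed(target_profiles: int) -> int:
    """Paid respondents needed to reach a target profile count."""
    return -(-target_profiles // (1 + FRIENDS_PER_SEED))  # ceiling division

# Reaching the ~87 million accounts mentioned later takes surprisingly
# few paid survey takers:
print(seeds_needed(87_000_000))                    # 289037
print(profiles_harvested(1) * LIKES_PER_PROFILE)   # 60200 like-records per seed
```

At roughly a dollar or two per respondent, this arithmetic explains why a six-figure MTurk budget was enough to seed a database of tens of millions of profiles.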
I knew that it would take a bit of time for people to see the survey on MTurk, fill it out, then install the app to get paid. Not long after the underwhelming launch, we saw our first hit.
Then the flood came. We got our first record, then two, then twenty, then a hundred, then a thousand — all within seconds. Chief technology officer Tadas Jucikas added a random beeping sound to a record counter, and his computer started going boop-boop-boop as the numbers went insane. The increments of zeroes just kept building, growing the tables at exponential rates as friend profiles were added to the database. This was exciting for everyone, but for the data scientists among us, it was like an injection of pure adrenaline.
Bannon started traveling to London more frequently, to check on our progress. One of those visits happened to be not long after we launched the app. We all went into the boardroom again, with the giant screen at the front of the room. Jucikas made a brief presentation before turning to Bannon.
“Give me a name.”
Bannon looked bemused and gave a name.
“Okay. Now give me a state.”
“I don’t know,” he said. “Nebraska.”
Jucikas typed in a query, and a list of links popped up. He clicked on one of the many people who went by that name in Nebraska — and there was everything about her, right up on the screen. Here’s her photo, here’s where she works, here’s her house. Here are her kids, this is where they go to school, this is the car she drives. She voted for Mitt Romney in 2012, she loves Katy Perry, she drives an Audi. And not only did we have all her Facebook data, but we were merging it with all the commercial and state bureau data we’d bought as well. And imputations made from the U.S. Census. We had data about her mortgage applications, we knew how much money she made, whether she owned a gun. We had information from her airline mileage programs, so we knew how often she flew. We could see if she was married (she wasn’t). And we had a satellite photo of her house, easily obtained from Google Earth. We had re-created her life in our computer. She had no idea.
“Give me another,” said Jucikas. And he did it again. And again. And by the third profile, Nix suddenly sat up very straight.
“Wait,” he said, his eyes widening behind his black-rimmed glasses. “How many of these do we have?”
“We’re in the tens of millions now,” said Jucikas. “At this pace, we could get to 200 million by the end of the year with enough funding.”
“Do we have their phone numbers?” Nix asked. I told him we did. He reached for the speakerphone and asked for the number, and as Jucikas relayed it, Nix punched it in.
After a couple of rings, someone picked up. We heard a woman say, “Hello?” and Nix, in his most posh accent, said, “Hello, ma’am. I’m terribly sorry to bother you, but I’m calling from the University of Cambridge. We are conducting a survey. Might I speak with Ms. Jenny Smith, please?” The woman confirmed that she was Jenny, and Nix started asking her questions based on what we knew from her data.
“Ms. Smith, I’d like to know, what is your opinion of the television show Game of Thrones?” Jenny raved about it — just as she had on Facebook. “Did you vote for Mitt Romney in the last election?” Jenny confirmed that she had. Nix asked whether her kids went to such-and-such elementary school, and Jenny confirmed that, too. When I looked over at Bannon, he had a huge grin on his face.
After Nix hung up with Jenny, Bannon said, “Let me do one!” We went around the room, all of us taking a turn. It was surreal to think that these people were sitting in their kitchen in Iowa or Oklahoma or Indiana, talking to a bunch of guys in London who were looking at satellite pictures of where they lived, family photos, all of their personal information. Looking back, it’s crazy to think that Bannon — who then was a total unknown, still more than a year away from gaining infamy as an adviser to Donald Trump — sat in our office calling random Americans to ask them personal questions. And people were more than happy to answer him.
We had done it. We had reconstructed tens of millions of Americans inside of a computer, with potentially hundreds of millions more to come. This was an epic moment. I was proud that we had created something so powerful. I felt sure it was something that people would be talking about for decades.
By August 2014, just two months after we launched the app, Cambridge Analytica had collected the complete Facebook accounts of more than 87 million users, mostly from America. They soon exhausted the list of MTurk users and had to engage another company, Qualtrics, a survey platform based in Utah. Almost immediately, CA became one of their top clients and started receiving bags of Qualtrics-branded goodies. CA would get invoices sent from Provo, billing them each time for 20,000 new users in their “Facebook Data Harvest Project.”
As soon as CA started collecting this Facebook data, executives from Palantir, Peter Thiel’s data-mining firm, started making inquiries; their interest was apparently piqued when they found out how much data the team was gathering — and that Facebook was just letting CA do it. The executives CA met with wanted to know how the project worked, and soon they approached our team about getting access to the data themselves.
Palantir was still doing work for the NSA and GCHQ. Staffers there said working with Cambridge Analytica could potentially open an interesting legal loophole: Government security agencies, along with contractors like Palantir, couldn’t legally mass-harvest personal data on American citizens, but polling companies, social networks, and private companies could. And despite the ban on directly surveilling Americans, I was told that U.S. intelligence agencies were nonetheless able to make use of information on American citizens that was “freely volunteered” by U.S. individuals or companies. I didn’t think anyone was actually being serious, but I soon realized that I had underestimated everyone’s interest in accessing this data (which was surprisingly easy to acquire, given Facebook’s loosely supervised permissioning procedures).
Some of the staff working at Palantir realized that Facebook had the potential to become the best discreet surveillance tool imaginable for the NSA — that is, if that data was “freely volunteered” by another entity. To be clear, these conversations were speculative, and it is unclear if Palantir itself was actually aware of the particulars of these discussions, or if the company received any CA data. The staff suggested to Nix that if Cambridge Analytica gave them access to the harvested data, they could then, at least in theory, legally pass it along to the NSA. One lead data scientist from Palantir began making regular trips to the Cambridge Analytica office to work with the data science team on building profiling models. He was occasionally accompanied by colleagues, but the entire arrangement was kept secret from the rest of the CA teams — and perhaps Palantir itself. (It wasn’t clear whether these Palantir executives were visiting CA officially or “unofficially,” and Palantir has since asserted that it was only a single staff member who worked at CA in a “personal capacity.”)
By late spring 2014, Mercer’s investment had spurred a hiring spree of psychologists, data scientists, and researchers. Nix brought on a new team of managers to organize the fast-growing research operations. Although I remained the titular director of research, the new operations managers were given direct oversight and planning of this rapidly growing exercise. New projects seemed to pop up each day, and sometimes it was unclear how or why projects were being approved to go to field. At this point, I did start to feel weird about everything, but whenever I spoke with other people at the firm, we all managed to calm one another down and rationalize everything. And after Mercer installed Bannon, I overlooked or explained away things that, in hindsight, were obvious red flags. Bannon had his “niche” political interests, but Mercer seemed to be too serious a character to dabble in Bannon’s trashy political sideshows. At the time, many on the team simply assumed that to justify taking such a high financial risk on our ideas, Mercer must have expected that the research had the chance of making tons of money at his hedge fund.
After Kogan joined, I had professors at the University of Cambridge constantly fawning over the groundbreaking potential that the project could have for advancing psychology and sociology, which made me feel like I was on a mission. And if their colleagues at universities like Harvard or Stanford were also getting interested in our work, I thought that surely we must be onto something. As corny as this might sound, it really felt like I was working on something important — not just for Mercer or the company, but for science.
The firm became a revolving door of foreign politicians, fixers, security agencies, and businessmen with their scantily clad private secretaries in tow. It was obvious that many of these men were associates of Russian oligarchs who wanted to influence a foreign government, but their interest in foreign politics was rarely ideological. Rather, they were usually seeking help either to stash money somewhere discreet or to retrieve money that was sitting in a frozen account somewhere in the world. Staff were told to just ignore the comings and goings of these men and not ask too many questions, but they would joke about it on internal chat logs, and the visiting Russians in particular were usually the more eccentric variety of clients we would encounter. We hired a man named Sam Patten, who had lived a colorful life as a political operative for hire all over the world. He had just finished a project for pro-Russian political parties in Ukraine, working with a man named Konstantin Kilimnik, a former officer of Russia’s Main Intelligence Directorate (the GRU). Although Patten denies that he gave his Russian partner any data, it was later revealed that Paul Manafort, who was for several months Donald Trump’s campaign manager, did pass along voter polling data to Kilimnik in a separate instance.
Patten was a perfect fit to navigate the world of shady international influence operations, and he was also well connected among the growing number of Republicans joining Cambridge Analytica. When CA launched, the Democrats were far ahead of the Republicans in using data effectively. For years, they had maintained a central data system in VAN, which any Democratic campaign in the country could tap into. The Republicans had nothing comparable. CA would close that gap.
First we used focus groups and qualitative observation to unpack the perceptions of a given population and learn what people cared about — term limits, the deep state, draining the swamp, guns, and the concept of walls to keep out immigrants were all explored in 2014, years before the Trump campaign. We then came up with hypotheses for how to sway opinions. CA tested these hypotheses with target segments in online panels or experiments to see whether they performed as the team expected, based on the data. We also pulled Facebook profiles, looking for patterns in order to build a neural-network algorithm that would help us make predictions. Cambridge Analytica would target those who were more prone to impulsive anger or conspiratorial thinking than average citizens, introducing narratives via Facebook groups, ads, or articles that the firm knew from internal testing were likely to inflame the very narrow segments of people with these traits. CA wanted to provoke people, to get them to engage.
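The last step in that pipeline — singling out people far more prone to a given trait than the average citizen — can be illustrated with a minimal sketch. The trait scores here would come from an upstream model like the one described above; the function names, threshold, and example numbers are all hypothetical stand-ins:

```python
# Hypothetical sketch of trait-based segment selection as described above:
# given model-predicted trait scores for a population, keep only the very
# narrow slice far above the average. All names/values are illustrative.
from statistics import mean, stdev

def select_target_segment(profiles, trait_scores, z_threshold=2.0):
    """Return profiles whose predicted trait score is well above average.

    profiles:      list of opaque profile records (IDs here)
    trait_scores:  parallel list of predicted scores for one trait,
                   assumed to come from an upstream model
    z_threshold:   standard deviations above the mean required
    """
    mu, sigma = mean(trait_scores), stdev(trait_scores)
    return [p for p, s in zip(profiles, trait_scores)
            if (s - mu) / sigma > z_threshold]

# A 2-sigma cutoff admits only a few percent of any roughly normal
# population -- consistent with the "very narrow segments" targeted.
profiles = list(range(100))
scores = [0.0] * 99 + [10.0]
print(select_target_segment(profiles, scores))  # [99]
```

The design point is that the threshold, not the model, does most of the work: the higher the cutoff, the smaller and more extreme the segment that gets shown the inflammatory content.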
We began developing fake pages on Facebook and other platforms that looked like real forums, groups, and news sources, with vague names like Smith County Patriots or I Love My Country. When users joined CA’s fake groups, it would post videos and articles that would further provoke and inflame them. Conversations would rage on the group page, with people commiserating about how terrible or unfair something was. CA broke down social barriers, cultivating relationships across groups. And all the while it was testing and refining messages, to achieve maximum engagement.
Lots of reporting on Cambridge Analytica gave the impression that everyone was targeted. In fact, not that many people were targeted at all. CA didn’t need to create a big target universe, because most elections are zero-sum games: If you get one more vote than the other guy or girl, you win the election. Cambridge Analytica needed to infect only a narrow sliver of the population, and then it could watch the narrative spread.
Mercer looked at winning elections as a social-engineering problem. The way to “fix society” was by creating simulations: If we could quantify society inside a computer, optimize that system, and then replicate that optimization outside the computer, we could remake America in his image. Beyond the technology and the grander cultural strategy, investing in CA was a clever political move. At the time, I was told that because he was backing a private company rather than a PAC, Mercer wouldn’t have to report his support as a political donation. He would get the best of both worlds: CA would be working to sway elections, but without any of the campaign-finance restrictions that govern U.S. elections. His giant footprints would remain hidden.
CA’s client list grew into a who’s who of the American right wing. The Trump and Ted Cruz campaigns paid more than $5 million apiece to the firm. In the autumn of 2014, Jeb Bush paid a visit to the office. He began by telling Nix that if he decided to run for president, he wanted to be able to do it on his terms, without having to “court the crazies” in his party.
“Of course, of course,” Nix answered. When the meeting was over, he was so excited at the possibility of signing up another big American client that he insisted on immediately calling the Mercers with the good news, having apparently forgotten that the Mercers had told him on countless occasions of their support for Ted Cruz.
“We’ve just had Governor Jeb Bush in the office, and he wants to work with us. What do you think of that?” he said proudly. After a pause, Rebekah replied flatly, “Well, I hope you told him very clearly that that’s never happening.” Then she hung up.
For most of the time I was at SCL and Cambridge Analytica, none of what we were doing felt real, partly because so many of the people I met seemed almost cartoonish. The job became more of an intellectual adventure, like playing a video game with escalating levels of difficulty. What happens if I do this? Can I make this character turn from blue to red, or red to blue? Sitting in an office, staring at a screen, it was easy to spiral down into a deeper, darker place, to lose sight of what I was actually involved in.
But I couldn’t ignore what was right in front of my eyes. Weird PACs started showing up. The super-PAC of future national-security adviser John Bolton paid Cambridge Analytica more than $1 million to explore how to increase militarism in American youth. Bolton was worried that millennials were a “morally weak” generation that would not want to go to war with Iran or other “evil” countries.
Eventually I felt more and more as if I were part of something that I did not understand and could not control, and that was, at its core, deeply unsavory. The deeper I got into SCL’s projects, the more the office culture seemed to be clouding my judgment. Over time, I was acclimatizing to their corruption and moral disregard. Everyone was excited about the discoveries we were making, but how far were we willing to go in the name of this new field of research? Was there a point at which someone would finally say enough is enough? I didn’t know, and in truth, I didn’t want to think about it. Like so many people in technology, I stupidly fell for the hubristic allure of Facebook’s call to “move fast and break things.” I’ve never regretted something so much. I moved fast, I built things of immense power, and I never fully appreciated what I was breaking until it was too late.