Like many people, Roger McNamee has recently come to believe that Facebook is, in his words, “terrible for America.” Unlike many people, he played an integral role in helping the 15-year-old juggernaut wield so much power in the first place. A longtime venture capitalist, McNamee made a key early investment in the company, and served as a mentor to a young Mark Zuckerberg, who was then agonizing about whether to sell his creation or keep running it himself. McNamee also fatefully helped persuade Sheryl Sandberg to meet with Zuckerberg; she became Facebook’s COO in 2008.
But since 2016, McNamee — who had by then ceased involvement with Facebook — has grown into a full-time skeptic of the company he once championed, joining a chorus of onetime tech evangelists who have reevaluated tech’s role in society. His new book Zucked is a cri de coeur against a corporation and a chief executive who he thinks have badly lost their way. Intelligencer spoke with him about the dangers of Facebook’s unchecked power, whether he thinks the company is salvageable, and why Mark Zuckerberg just needs some sleep.
You said that you first started to really have misgivings about Facebook in the run-up to the 2016 election, when you noticed a series of suspicious-looking memes that were denigrating Hillary Clinton — at which point you realized that outside groups were manipulating the platform.
It was “Bay Area for Bernie” and “whatever for Bernie” type groups. Things that seemed legit, that seemed associated with the Sanders campaign. But the viral spread of them, where one day one of my friends was in that group, the next day four of my friends, the day after that, eight of my friends … there was something about it that said to me: in all probability, somebody was spending money to draw people into these groups. And then they were sharing stuff. These memes were so nasty, and so obviously not real in terms of information content, that I just couldn’t imagine even the Sanders campaign doing something like that. You know, they had a reputation, the Bernie bros.
But it just seemed too intense. I had no idea what it meant, but it got stuck in my head, and so, when a month later we had the report about that group that used Facebook advertising tools to gather data on people expressing an interest in Black Lives Matter, and then they were selling those names to police departments? That was obviously evil. The first thing was just a suspicion, the second thing was like, “Oh my God.” Facebook did the right thing, mind you. They expelled the group. But obviously the damage had been done, the data was sold to police departments, and harassment could take place. And that just struck me as completely not fitting my perception of what Facebook was all about, which was puppies and babies.
So you warned Mark Zuckerberg and Sheryl Sandberg.
Yes, but keep in mind that wasn’t ’til October. Brexit happens in June, and then I think, Oh my god, what if it’s possible that in a campaign setting, the candidate that has the more inflammatory message gets a structural advantage from Facebook? And then in August, we hear about Manafort, so we need to introduce the Russians into the equation. And then in October, we hear about Facebook using its ad tools to enable people in the housing market to discriminate in violation of the Fair Housing Act. At that point I write an op-ed for Recode, and instead of publishing it, I share it with Mark and Sheryl because I’m trying to warn them. I’m going, “Guys, there is something really wrong here” — the only logical explanation is that something about the algorithm and the business model enables bad actors to harm innocent people using Facebook.
I didn’t expect them to throw up their arms and promise to change everything, but I hoped that they would investigate and take it seriously. Of course, nine days later, you have the election, at which point I’m going, “Okay, guys, we cannot screw around, this is a disaster. Your brand is at stake, you’re a trust business. You must engage. You must treat this the way Johnson & Johnson treated the poisoning of Tylenol. You gotta protect the people who use your product.” And I tried that for three months, but with no luck. And so after that, I became an activist.
We’re now more than two years out from that experience, and obviously the controversies have not gone away — they’ve actually multiplied. Do you think Zuckerberg and Sandberg have made any progress on the stuff you warned about?
I want to avoid absolutes, but I think it’s safe to say that the business model is the source of the problem, and that it’s the same business model as before. And to the extent that they made progress, it’s in going after different moles in the Whack-a-Mole game. From the point of view of the audience, Facebook is as threatening as ever.
You’ve said it’s the ad-based business model that creates, in your opinion, damaging incentives that sort of reward the basest human emotions: fear, anger …
Can I take a moment to describe how it works?
Because if you think about the ad model — in a traditional ad model, you collect data in order to improve your product or service, with respect to the customers that you have. At Facebook and Google, they collect data in order to essentially create new products that take advantage of the weakness of the audience. It’s a fundamentally different thing. In a traditional ad product, the consumer is the product, not the customer — at Facebook and Google, they’re the fuel. They aren’t even the product. And here’s why it’s a problem: They need your attention. To get your attention, they appeal to low-level emotions like outrage and fear, and they tickle you with rewards, things like notifications. And those things are really habit-forming. And for many people, when you’re checking Facebook or Google two or three times a day for periods of years, habits become addictions. And when you’re addicted, you are vulnerable to manipulation.
Is this ad-based business model, which is the foundation of so many of these important internet companies, fundamentally flawed, in your view? Is there a more benevolent way of doing it?
To be clear, I think there was always a much more benevolent way of doing this, and I wish they had gone there. When I first met Mark in 2006, the company had $9 million in revenues, and he had solved what I thought was the fundamental issue for the social network, which was that anonymity basically allowed bullies to take over chat rooms, comment boards, and any kind of social thing. And Mark realized that he could provide authenticated identity, and in the early days, that, I felt, was the most exciting thing. He also gave people genuine privacy control. And I bet he could’ve gone to 100 million, maybe 200 million users, mostly in the English-speaking world, without any problems and a really reasonable ad business. But as they started to grow, they found a business model that really did require this surveillance, and they realized that there were no limits at all. The more friction they could get rid of, the faster they would grow. And so what they did was they basically said, “You know what, we’re not gonna enforce this identity thing. And we’re not going to enforce the privacy protections.”
The next thing you know, they’re growing like a bat out of hell, and the bad actors start coming in. To me, the miracle is that they got all the way to 2 billion active users before any problems showed up. And it’s a real tribute to how smart they were that it took that long. If they had a more traditional advertising business, I think everything would be fine. But now they’re at a scale where I don’t see how they can get back to that. You know what I would do? I think the best way to solve this problem from a policy point of view is to do what we did with the chemical industry, which is to say — there are all these externalities that come from your success, all this damage that right now society is paying? From now on, it’s on you. You have to pay all of that. And that would change the incentives pretty quickly.
But it’s pretty hard to measure what, exactly, the cost of the harm is. It gets very nebulous.
Well, yes and no. Because remember, some of the harm shows up at the individual level, so if everybody has the ability to sue for what they think the cost to them was …
Well that would be …
No, I’m not joking about this, right? I mean, a lot of this is letting litigation take its course. And, you’re right: I suspect that estimating the value of a Rohingya life in Myanmar would wind up producing the same kind of horrible outcome you got in Bhopal. But right now there’s no consequence at all, right? And so it would be better to make them do something. But I agree with you, I don’t think this is easy. And that’s the reason why we’re having this conversation, right? I mean, I didn’t write this book because I had all the answers, I wrote this book because I think I know what the questions are, and I’m trying to get the whole world to get engaged.
And I don’t really think people understand the issue that well, either …

Dude, I’ve spent an entire life doing this and I don’t understand it! I’m serious! Thirty-four years as a professional tech investor, studying this stuff day and night, being at these companies as they created these models in the early days. Not so much in recent years, so I really missed the model that they’re running now until I became an activist. But I’ve spent 30 years doing nothing but! So when somebody comes up to me and says, “Roger, this is really complicated,” I go, “I’m with you. You’re absolutely correct.” But it turns out that the parts that matter aren’t that complicated. These people were never elected, they are not accountable, and yet they have the most important voice in our politics. Every candidate is running his or her campaign on Instagram. You don’t think that gives these guys an enormous amount of power? Of course it does. We think that that’s inevitable, right? Collectively, we learned how to trust tech in the ’60s, ’70s, ’80s and ’90s. And we don’t realize that we should be not only very skeptical right now, we should actively be suspicious of the biggest companies on the internet.
You were in close contact with Zuckerberg years ago, but you haven’t been in close contact with him for a while. Do you think that Facebook can develop into the kind of company you want it to be with him in charge?
I think that Mark is one good night’s sleep from understanding this. The problem is …
Do you really?
No, hang on. I’m saying from understanding it, right? Now doing it is another whole animal. And the doing requires a dramatic change to the business model.
There’s been such a now-familiar cycle of these breaches of trust every two weeks …
No, no, no, no, no. I get it. We’re talking about two different things. You asked me, “Is he capable of making this change?” And I’m saying to you I think it’s not crazy for him to wake up one day the way I did and say, “You know what? This thing that I believed in so much is deeply flawed, and I owe everybody. I do need to fix it.” I believe Mark Zuckerberg is totally capable of that moment of illumination. Now, there is zero evidence that he has had it, or that he is going to have it, but I think he’s capable of it.
And the trick is, let’s assume he has it. Then you have to actually effect the change. And that would be difficult, but like I said, I think he has the gravitas inside the company to persuade people: “You know what, guys, what we’ve been doing here is wrong, we’ve gotta do it differently.” But I don’t want you to come out of this call thinking I’m optimistic that that’s how things are going to go down.
That’s not what I took from it.
I’m saying to you I’m really, really, really afraid of what these companies could do. And I’m afraid in no small measure because there are hundreds of millions of people — really, billions of people — affected by their actions. And you can fit every single person in the world who understands the problem I’m talking about at the level I’m talking about it, you know, in a basketball arena or something much smaller. And, you know, we still have a huge awareness-building campaign to complete, and we have to get the people in power up to speed. And they’re getting there very rapidly. Like the rest of us, they had every reason to trust these guys for 50, 60 years and, you know, it didn’t look like they needed any regulation, so why would anybody become a pro?
But we just elected 40 freshmen members of Congress where the average age is like, 40, right? And we retired a whole bunch of the people who understood it the least well, and the intersection of those two things is very positive. And here’s the good news: I think there are members on both sides who get how important it is to do something here. And they may not agree on what the path is, but at least if you understand that we need to find common ground, that’s good. And the other piece that’s really important is that we, the people formerly known as “users” — we have more power than we realize. We have the power to withdraw some or all of our attention from these guys, and they need that attention to make this model work. Facebook has seen a really significant decline in hours of use per month per user in North America in the last year. And there’s a reason they don’t report it anymore, because it’s not a good number — fortunately Nielsen keeps track of it. And it’s not exact, but it shows like a 20 percent decline, and that’s a big number. And that gives you some hope.
Again and again, Zuckerberg has said over the last 15 years that his whole mission with Facebook — and he always uses this kind of platitude-heavy language — is that it’s about “connecting the world” and “bringing people together,” etc., etc. And I think people have a difficulty reconciling that with the strange figure he cuts in public. Did you ever feel like you got a handle on what’s driving him, or what he believes in?

Step back for just a second, think about that mission statement: His goal is to connect the entire world on a network of his making.
And he perceives that that goal is so obviously important that it justifies whatever means are required to get there. Now, there are two flaws in that mission: One is that it is a network of his making, with his rules, one of which is “thou shalt share everything.” Not a cool rule at all. And the second one is, he is willing to justify whatever means are necessary to get there, including that famous Andrew Bosworth memo that justified the murder of somebody on Facebook Live on the basis of “our growth is so important that you just have to understand that things like this are gonna happen.”
So I would sit there and say, there’s an idealism there but it’s not a “puppies and flowers” idealism, it’s a very dark idealism. And, you know, Mark is an authoritarian, right? This is a company with two classes of stock, where he has absolute control. The same is true with Google, you know, Larry and Sergey have absolute control. And when you have that level of authoritarian control, and you have a product that’s ubiquitous, you must align with the powerful against the powerless. You can never afford to be on the wrong side of power. And for that reason, these guys are really scary in the political realm.
Does it seem unlikely to you that he would ever step down from the thing he created?
Would you even want him to? Because he does have the moral authority to effect change. Let’s just imagine you were given a choice: get rid of the management team but keep the same business model in place, or change the business model but be forced to retain the entire management team. Which would you pick? You’d change the business model.
If you change the incentives, it doesn’t matter who’s running it. If you don’t change the incentives, it also doesn’t matter who’s running it. Right? It’s all about the incentives. And they have evolved in steps, and realistically, the incentives that are in place now didn’t begin until 2011, and they weren’t really firm until 2013. So my problem was, I was out of there in 2009, so I didn’t see this coming up, and, candidly, wasn’t looking closely enough after that to catch it until I started to see it in 2016. And shame on me. There were probably a number of opportunities to see it that I missed. But I’m trying to make up for it by being an activist. And realistically … I don’t know if you happened to see the segment where I was on CBS …
I did, actually.
Okay, so did you see the Gayle King thing? [Editor’s note: King said that McNamee sounded like a “scorned lover.”]
I thought that that was a surprising thing to bring up. Because obviously the book isn’t that way at all. I have enormously high regard for the talent of both Mark and Sheryl. And I actually give them the benefit of the doubt on their motives. I think they got here by degrees, and then a filter bubble built up inside Facebook, and they were so convinced of the merits of their objective that they stopped being able to see the world properly. And they stopped being able to see the people who use their products as individuals with a right to self-determination. They became a metric. That doesn’t make them bad people, right? They’re not the first people to go blind because of success.
My issue is, now that we have evidence, now that we know there’s a real problem, I wish that they’d step up and go, “You know what? We do have a responsibility, we should be like Johnson & Johnson. We should do what’s necessary to protect everybody.” That would be the right thing to do. And because they have absolute control, because Mark has absolute control, that’s what I mean by he’s one good night’s sleep away from getting this.
That would have to be a hell of a good night’s sleep.
It would have to be a hell of a good night’s sleep! But we can always pray, right?
This interview has been edited and condensed for clarity.