Have you heard? Facebook is collecting vast troves of demographic and behavioral data about you — yes, you, even if you don’t use Facebook — and is using it to help advertisers target you and your friends. In some cases, developers hooked into the Facebook platform are extracting data about you and keeping it for themselves. That’s bad! … Right?
Facebook is new enough, and the effects of its business model unclear enough, that it can be a little hard to know what we should be insanely paranoid about and what we can safely ignore. I mean, maybe it’s not that bad if Facebook knows where I live and what movies I like. Why should I care that Facebook knows where I went to college? I’d probably tell that to a random person on the street. Is it really the end of the world that Facebook has my address? And … it’s just ads, right? I see ads on TV all the time. At least on Facebook the ads are more likely to be about stuff I’m interested in.
So why should you care? The answer isn’t that all this data-harvesting somehow made Trump president through acts of sinister data-based black magic. Most experts say that Cambridge Analytica, the company at the center of this week’s Facebook scandal, wasn’t essential to Trump’s victory — and, further, that Facebook is not a radically useful tool in political messaging or turning out the vote (it might just be a step up from relentless email blasts).
One answer might be that it’s creepy to know that a highly accurate portrait of your life, down to your politics and your friendship preferences, is held somewhere on a server you don’t own or control. That is, you might not care that Facebook has this data, but what happens when Facebook gives it away, or if Facebook were to be more directly breached?
If you keep secrets about yourself, even innocuous ones, Facebook is likely aware of them in some capacity thanks to the sites you visit, the profiles you linger over, the comments you leave. And it’s not comforting to think that you’re not in charge of that data. Remember when we found out the NSA was spying on everyone, and people argued that “if you have nothing to hide then you shouldn’t be worried”? This feels kind of like that. You should care about what companies can learn about you without your permission, and what data they can retain against your wishes.
More urgently, it’s worth noting that while political scientists remain skeptical about Facebook’s ability right now to directly affect political outcomes, its ability to manipulate information and give people what they want to see is growing in capability and speed with each passing day — fueled specifically by the data it gathers about us. The problem is that Facebook is both a surveillance operation and a publishing operation: it takes data from us, and serves back to us whatever it’s decided we want (or need) to hear. “Facebook can simultaneously measure everything about us, and control the information we consume,” Google AI researcher François Chollet recently tweeted. “When you have access to both perception and action, you’re looking at an AI problem. You can start establishing an optimization loop for human behavior.”
And, as Chollet points out, it’s only getting better at it. While Facebook iterates, human nature doesn’t.
But we don’t have to follow Chollet’s argument to the letter to recognize why it’s a bad idea to be complacent about internet surveillance and data-harvesting. The larger point is just as compelling: We’ve never experienced data-collection on this scale, or in this way, before. Sure, Facebook is more or less benevolent now, but what about down the road? What happens if it does actually get hacked? What happens if a government entity seizes all of that data? Taking it a step further: We barely know what Facebook is, let alone what it does and what kinds of things it’ll do in the future, intentionally or accidentally. If the last two years have demonstrated one thing in particular, it’s that we’re very bad at predicting the eventual effects of the internet. In the case of user data, it’s better to be cautious and conservative.
Facebook has proven continually that it cannot predict the effects of the tools it creates. Mark Zuckerberg admitted as much this week, waxing nostalgic about his dorm-room days. Do you really want to give Facebook the ingredients it needs to experiment without regard for consequence?