The scale of Facebook’s data-harvesting operation is, as we all know by now, vast. Through your use of Facebook, or of any modern website, or of an app that integrates with Facebook, the company has amassed a trove of data that is, to my pea-brained conception, virtually infinite. Is that bad? The jury is still out.
Consider two semi-related Facebook stories out today. The first concerns Onavo, the virtual private network (VPN) software that Facebook used to clandestinely track the market and its potential technology competitors. VPNs work by routing all of your internet traffic through an intermediary server (in this case, one run by Facebook), hiding your identity from the sites you visit while giving the VPN operator a full view of that traffic. In effect, Onavo gave Facebook detailed snapshots of privacy-conscious users’ habits, including “time you spend using apps, mobile and Wi-Fi data you use per app, the websites you visit, and your country, device and network type.”
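Because every connection flows through the operator’s servers, even encrypted traffic leaks metadata: which apps phone home, to which hosts, and how much data moves. A minimal sketch of the kind of per-app telemetry a VPN operator could derive from that vantage point (the record format here is hypothetical, not Onavo’s actual schema):

```python
from collections import defaultdict

def summarize_proxied_traffic(events):
    """Aggregate per-app usage from proxied connection records.

    Each event is a (app_name, host, bytes_transferred) tuple, the sort
    of metadata any VPN operator can observe, since all of a user's
    traffic passes through its servers.
    """
    usage = defaultdict(lambda: {"bytes": 0, "hosts": set()})
    for app, host, nbytes in events:
        usage[app]["bytes"] += nbytes
        usage[app]["hosts"].add(host)
    return dict(usage)

# Three connections observed by the proxy (hypothetical app and host names)
events = [
    ("rival_app", "api.rival.example", 1200),
    ("rival_app", "cdn.rival.example", 50_000),
    ("browser", "news.example", 8000),
]
summary = summarize_proxied_traffic(events)
# "rival_app" moved 51,200 bytes across two hosts: enough, at scale,
# to gauge how engaged users are with a competitor's product.
```

Scaled across millions of installs, aggregates like these are exactly the market intelligence the Onavo reporting describes.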
The app was removed from the iOS App Store last year because Apple has rules stating that an app’s data collection must relate to the app’s function, and Onavo’s data collection was not particularly necessary to its stated function (anonymized internet browsing). The Onavo app remained on Google’s Android marketplace, and earlier this month Facebook got in trouble again when it was revealed that the company was misusing Apple’s enterprise certificate program to pay teens to install similar, unauthorized apps on iOS.
Now, TechCrunch reports that Facebook is removing Onavo from Google Play and shutting down its controversial Research initiative. The issue here, to my mind, was not just that Facebook was performing market research (it’d be dumb not to) but the clandestine way in which it was doing so. Onavo’s stated purpose and its true purpose were at odds, and the Research program went to some lengths to avoid informing participants that it was Facebook conducting the research.
As today’s other big Facebook story shows, however, shaming Facebook for its data collection practices can only go so far. An investigation by The Wall Street Journal found that, in many cases, popular mobile apps were sending Facebook sensitive data anyway. One such app was Flo, a tool for tracking one’s menstrual cycle.
The piece of software at issue here is a tool for developers called Facebook App Events. “An app event is an action that takes place in your app or on your webpage such as a person installing your app or completing a purchase,” the documentation states. “Facebook App Events allows you to track these events to view analytics, measure ad performance, and build audiences for ad targeting.” Standard events you can track are things like an item being added to a shopping cart, or someone starting a free trial.
A hypothetical use case for this might be: you sign up for a free trial of a streaming service on your phone, and then suddenly you are inundated with Facebook and Instagram ads promoting said service. That’s because the app event has been linked to your Facebook account.
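Conceptually, an app event is just a small structured record tying an action to an app and, through a device advertising identifier, to a person. A hedged sketch of such a record; the function and field names here are illustrative, not Facebook’s actual SDK or wire format:

```python
import json
import time

def build_app_event(event_name, advertiser_id, params=None):
    """Build a minimal app-event record of the kind an analytics SDK
    might send home. The schema is a hypothetical illustration."""
    return {
        "event": event_name,             # the action, e.g. a trial starting
        "advertiser_id": advertiser_id,  # device ad ID linking the event to a person
        "timestamp": int(time.time()),   # when the action happened
        "params": params or {},          # developer-supplied details
    }

# A "start trial" event for a hypothetical streaming app
event = build_app_event(
    "start_trial",
    advertiser_id="ABCD-1234",
    params={"plan": "monthly", "currency": "USD"},
)
payload = json.dumps(event)  # what the SDK would POST to the analytics endpoint
```

Once a record like this lands on Facebook’s servers, the `advertiser_id` is what lets the event be matched to a Facebook account for ad targeting.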
Many of the apps using App Events were not upfront about the fact that such data was being shared with Facebook. In some cases, apps were sending Facebook information such as a user’s heart rate, or when they were having their period. And while Facebook’s policy prohibits developers from sending it this type of health data, it doesn’t appear there were any actual roadblocks preventing the transmission.
Flo, the period app, told the Journal that it would update the software to stop sending the data to Facebook, though it did not explain why it was passing the data along in the first place. One likely explanation is that it was using menstruation events as a trigger for ad targeting. “For example, you can target people who previously used your app, but have not come back to your app within the last 90 days,” the documentation states. “Or you can target people who have added an item to their cart but didn’t make a purchase.” Or you can target people who are menstruating? (For what it’s worth, Flo’s Facebook ads appear limited to branding campaigns, not direct-response ones.)
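The “last 90 days” rule the documentation describes is, mechanically, just a filter over event timestamps. A sketch, with made-up user IDs and dates, of how a lapsed-user audience could be computed from logged app events:

```python
from datetime import datetime, timedelta

def lapsed_users(last_active, now, window_days=90):
    """Return IDs of users who have used the app before but not within
    the retargeting window. `last_active` maps a user ID to the
    datetime of that user's most recent app event."""
    cutoff = now - timedelta(days=window_days)
    return {uid for uid, seen in last_active.items() if seen < cutoff}

# Hypothetical activity log; the cutoff here works out to 2018-11-24
now = datetime(2019, 2, 22)
last_active = {
    "u1": datetime(2019, 2, 1),    # active recently -> excluded
    "u2": datetime(2018, 10, 5),   # lapsed -> targeted
    "u3": datetime(2018, 11, 20),  # lapsed -> targeted
}
audience = lapsed_users(last_active, now)
```

Swap “opened the app” for any other logged event, including a sensitive one like a period being recorded, and the same one-line filter builds the audience. That is what makes the unrestricted flow of health data into App Events alarming.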
These two stories, taken together, paint a pretty good picture of what Facebook has become. It is far from a single piece of software; it’s a sprawling federation of smaller initiatives, all united behind the purpose of collecting as much personal data as possible. The case of Onavo was Facebook playing offense, but the case of App Events is about passivity: Facebook failing to moderate its developer platform effectively (the same lax attitude that spawned the Cambridge Analytica scandal). In both cases, however, Facebook benefited, because more data lets it extract more revenue from each user. Even if Facebook’s most aggressive initiatives are shut down, there are countless software companies out there willing to pump their users’ data into the Facebook machine, and to profit from doing so. It’s messy, to say the least, but it works for advertisers and companies, if not for individual users.
Compounding the issue of Facebook’s messy, varied data-collection operation is how Facebook has continually dragged its feet on letting users exercise any control. It is weirdly easy for Facebook to get data into its systems, but somehow the company has a big problem excising that same data. At the company’s annual F8 developer conference last year, Mark Zuckerberg announced a “Clear History” function that would let users delete all of the off-Facebook tracking info Facebook has collected on them. In December, the head of Facebook’s privacy product team, David Baser, said that the still-unlaunched product was taking longer than expected because Facebook stores user data all over the place, in many different ways.
The other reason it’s taking so long, according to a BuzzFeed News report, is that Mark Zuckerberg made up the feature as a reactive CYA stunt and not as a well-considered step that would benefit users. Zuckerberg and COO Sheryl Sandberg, according to one source, are “not keen on decision-making until they’re forced to do so.” That’s a cool, deliberate, thoughtful way to run the most complex data-hoarding operation in human history.