As plenty of people pointed out in the wake of the recent controversy over the emotional-manipulation-on-Facebook study, Facebook has been studying and experimenting on its users for a long time. Mother Jones has a great rundown of some of these studies (which, author Dana Liebelson points out, all involve observation of user behavior rather than attempted manipulation of it), and they cover interesting subjects like how people communicate with their significant others and children online. My favorite, though, involves conspiracy theories.
How you respond to conspiracy theories: In the spring of 2014, Facebook published a study on how rumors spread on the social network. The researchers looked at rumors identified by the rumor-debunking website Snopes.com that fall into a number of different categories, including politics, medicine, horror, "glurge" (i.e., sentimental stories that usually aren't true), and 9/11. Then the researchers found rumors posted on Facebook as photos and gathered 249,035 comments in which people responded to the rumor with a valid link to Snopes. Ultimately, the researchers found that reshared posts that received a comment linking to Snopes were more likely to be deleted. So feel free to keep telling your friends that the Russian sleep experiment story is BS.
Conspiracy theories are very, very damaging and difficult to fight, so every bit of information about how they spread through the profoundly powerful incubator and amplifier that is social media is useful. This is helpful research, and in general Facebook produces such rich data, and is such a daily fixture in so many of its users' lives, that a lot of good can come out of these sorts of investigations (which also, of course, help Facebook figure out how to make more money by manipulating us into getting even clickier). So things are more complicated than this brand of research being straightforwardly bad or evil or privacy-robbing, especially given that Facebook users have consented to having their information used in this manner.