The salary disparity between men and women has been a hot subject in recent years, what with leaning in and the confidence gap and, most recently, Jill Abramson's departure from the New York Times. So BuzzFeed decided to look into how it works within journalism.
First, the site invited a bunch of journalists to fill out a form anonymously disclosing their gender, salary, and so on. Then, after editors crunched the numbers from the almost 900 responses they had received, the site published an article with the bold headline "Here’s What Female And Male Journalists Actually Make," complete with a couple of snazzy charts laying out the results.
And yes, there were disparities, though apparently they didn't kick in until the higher levels of management (that is, entry-level male and female journalists are paid about the same). But it isn't worth analyzing the results in much depth, unfortunately, because the survey's many flaws mean it actually tells us very little about the subject at hand.
BuzzFeed acknowledges some of them:
The goal was to take an honest look at pay discrepancy. The sample wasn’t random, and the survey is far from perfect. People who felt less satisfied with their salaries, for example, might have been more likely to complete the survey. But it’s a useful step toward knowing how big the gap might be, and at what stage in our careers it widens the most.
When it comes to the science of extracting broadly meaningful results from surveys, "The sample wasn't random" should generally be viewed as a red flag waving violently while an alarm blares and firecrackers go off. You usually need a random sample — or close enough to one — if you're going to claim that your results can be interpreted in a general way. And when you scroll further down the BuzzFeed article, you get a sense of just how non-random this sample was, and of other flaws that make it really hard to take these results seriously:
1. More than half of the respondents came from New York City or Washington, D.C. Yes, BuzzFeed stamped "not a representative sample" on the main chart, but that's an understatement. Any useful survey like this will sample from a wide range of media markets. New York and D.C. are, suffice it to say, unique in a lot of ways, and the vast majority of journalists do not work in either market.
2. Eighty-three percent of the respondents were 30 or younger. BuzzFeed's editors basically sampled their friends and their friends' friends: The survey "was emailed to BuzzFeed editorial staffers, as well as to former colleagues and contacts working at the biggest websites, newspapers, magazines, and broadcast networks in the country; they were asked to pass it along." Obviously most journalists are older than 30, so this ends up being a survey that, statistically speaking, effectively excludes most journalists.
3. The categories are hard to interpret. Respondents were asked to describe themselves as "entry level," "mid-level," "senior non-management," or "senior management." It's unclear exactly how respondents understood these labels, so there's a chance that if the categories had been phrased just a little bit differently, the results would have changed a lot.
Yes, BuzzFeed deserves some credit for tackling this subject, and greater pay transparency would be a helpful way to combat the gender pay gap in journalism and elsewhere. But on the other hand, the site clearly wasn't all that interested in going about this in a rigorous way: One of the emails in which BuzzFeed solicited responses from journalists found its way to me, and it was time-stamped 3:50 p.m. on Thursday. The piece went live a little before 6 p.m. on Friday, suggesting that BuzzFeed may have been more concerned with riding the wave of Abramson coverage than with allowing some time for a bigger (and possibly more representative) sample to emerge.
This article is going to be passed around a lot and treated like an important data point in the pay-gap discussion, mostly because the headline and charts drown out the many problems. But a survey this flawed probably shouldn't have been published.