A Tetris-Related Science Reporting Failure, and How It Happened

Photo: Adi Costin

Wouldn’t it be great if video games, particularly fun, excellent ones like Tetris, could reduce unwanted cravings, and perhaps even help fight serious addictions? Six days ago, Plymouth University published a press release announcing a study in Addictive Behaviors purportedly showing just that. The release led off: “Playing Tetris for as little as three minutes at a time can weaken cravings for drugs, food and activities such as sex and sleeping by approximately one fifth, according to new research published this week.”

The press release had the desired effect, with many outlets taking the bait:

News.com.au: “Playing Tetris a few times a day for three minute increments can reduce cravings by up to 20 per cent”

Vocativ: “Playing Tetris Can Ease Addictions to Sex, Drugs And Food”

Fusion: “Study finds that Tetris can curb addictive urges — like eating and having sex”

I’m only picking on these outlets because they’re big ones. Just about everyone, it seems, claimed that a new study showed Tetris can help curb these sorts of urges. In reality, the study proved nothing of the sort. It didn’t even come close. But it’s a useful example of how bad science spreads these days.

Here’s what really happened: The researchers, from Plymouth and Queensland University of Technology, divvied up a group of 31 students: 16 were in the control group, 15 in the Tetris group. Over the course of a week, everyone got regular text-message reminders to fill out a questionnaire about whether they were craving anything at the moment, from food to alcohol to sex, how intense that craving was on a 0-100 scale, and whether they had indulged previously reported cravings. The control group — and this will be key — “served to show how craving varied during the study week, allowing us to test if playing Tetris ultimately reduced the tendency to experience cravings or weakened cravings that did occur.”

It’s worth pointing out that this is a tiny sample (31 participants split into groups of 16 and 15), and that effects measured over the course of a single week can’t tell us much about lasting behavior change. That alone would be reason to be skeptical of this study. As you’ll see, there are plenty of other, even bigger reasons.
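To put the sample size in perspective, here’s a back-of-the-envelope power calculation. This is my own illustration, not anything from the paper: it assumes a simple two-sided z-test approximation and a “medium” standardized effect of d = 0.5, and asks how often a study with groups of 16 and 15 would detect such an effect at the conventional p < 0.05 threshold.

```python
import math

# Rough power check for the study's group sizes (16 vs. 15).
# The effect size (d = 0.5) and the z-test approximation are
# illustrative assumptions, not figures taken from the paper.

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_sample(n1, n2, d, z_crit=1.96):
    """Approximate power of a two-sided, two-sample z-test at alpha = 0.05."""
    se_factor = math.sqrt(1.0 / n1 + 1.0 / n2)  # SE of the standardized mean difference
    return normal_cdf(d / se_factor - z_crit)

print(round(power_two_sample(16, 15, 0.5), 2))
```

Under these assumptions the study had only about a 28 percent chance of detecting even a real, medium-sized between-group effect, which is one concrete way of saying the sample was too small to settle much of anything.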

Anyway — those in the Tetris group played Tetris:

In the Tetris condition, participants completed the same questionnaire and in addition then played Tetris for 3 min, manipulating coloured basic shapes to form rows that would then disappear allowing for game play to continue; if the screen became full, a new game started. Participants played Tetris even if they had reported no craving. After 3 min, participants re-stated their craving strength for the same item, on the same 0–100 point scale.

Got it? If they were craving anything, they reported the strength of the craving, then played Tetris, then re-reported the craving strength.

Would it shock you to find out that if you ask someone how much they are craving something, distract them with a fun game, and then re-ask how much they are craving something, they will tend to say they are craving it less? Sure enough, here’s a graph showing the average strength of the cravings reported by participants in the Tetris group throughout the week, before and after playing:

[Graph: average reported craving strength in the Tetris group, before and after each play session, over the study week]
If you’ve had a bit of stats before, you might notice that the confidence intervals overlap, which is an immediate not-great sign (don’t worry: this isn’t vital to understanding all the things wrong with how this study has been reported). Whether or not you’re stats-inclined, you’ll notice that while the craving readings taken right after playing Tetris were in fact lower than the ones taken right before playing Tetris, there’s no significant reduction in the intensity of cravings over the course of the week. It’s not as though people in the Tetris group experienced weaker cravings as the week progressed and the effects of all that Tetris-playing kicked in, which would at least maybe, questionably, possibly be a sign that something more noteworthy was going on than “distracting tasks are distracting” (and even then, a week is just way too short to know for sure).

And … that’s basically it for the findings. If you read the paper itself rather than the press release, you’ll see that over and over again, the researchers openly report not having found all that much that’s exciting:

Craving strength did not differ across the week between the Tetris and control conditions … Playing Tetris did not appear to reduce people’s tendency to indulge their cravings … scores on the CEQ (Table 3), measuring intensity, imagery and intrusiveness of strongest craving in the past week, did not vary with time (baseline, follow-up) or condition (control, Tetris; smallest p = 0.15).

The researchers provide various explanations for many of these null findings, but they’re beside the point when it comes to the question of whether this study, on its own, provided evidence that playing Tetris might be a useful way to control cravings, which it didn’t. That doesn’t mean this approach couldn’t work — maybe a bigger, better-controlled study in the future will show the potential video games have to help nudge out unwanted cravings. It’s by no means a crazy idea. But again: This study just didn’t come close to providing substantive evidence for that claim.

And yet it has spread far and wide. An open secret in science writing is that the press releases put out by universities and other research institutions frequently overhype findings. In this case, Plymouth went with the rather Breitbartesque headline “New Study Reveals Tetris Can Block Cravings.” Having thus established that Tetris can block cravings, the release includes quotes from the researchers speculating about the underlying mechanisms, more or less implying that their research has already proven, or at the very least strongly suggested, that this approach works, and that it’s now time to understand why it works. Sure enough, several outlets cribbed directly from the press release’s researcher quotes. That isn’t always an automatic no-no, at least in my opinion, but it’s certainly one way in which journalists get hoodwinked on stuff like this, and outlets should always make clear when a quote comes straight from a release rather than from original reporting.

Overall, it just hasn’t been the strongest few days for behavioral-science reporting.