Sometimes you find yourself confronting questions about your gadgets that are so basic you don’t want to consult your IT guy, your manual, or your niece. That’s why we’re here. This week, in basic answers to basic questions: how to stop your TV from playing everything like a soap opera.
It’s something just about all of us have encountered by now: You’re at your aunt and uncle’s place for Thanksgiving, the football game is over, somebody puts on a movie. And it looks … wrong — almost like it was filmed with the same cameras used to broadcast the football game. Everything is too clear, maybe? Too sharp? Is that possible?
Or maybe you unpack and set up a big new TV, and settle in to enjoy the fruits of your labor. You pick out something that’s supposed to look fantastic — and instead you’re greeted with something that gives you the uncomfortable sensation of watching a soap opera in 35mm. What gives?
This is known within the TV-nerd community as the “soap opera effect.” It’s a common problem in which fancy new TVs make everything played on them look cheap, fake, and plasticky. It’s also one of the very few good names for anything in the tech community. It is fixable — and to be honest it’s pretty easy to fix, so if you just want to fix it you can scroll right to the bottom of this page. But if you want to know more about why it happens (so that you can explain it to your family next Thanksgiving), read on.
So! The first thing to understand is the concept of a frame rate, normally talked about as “FPS,” or frames per second. There’s no way to explain that without sounding incredibly condescending, so, whatever: Video of any sort (including video games) is displayed on a TV by showing a rapid series of still images, each one referred to as a “frame.” The frame rate measures how many of those images are shown in a single second.
Where this gets weird and confusing is that the raw material, like a film or a TV show or a video of a squirrel eating a bagel filmed with an iPhone, has its own native frame rate. Movies and many TV shows have, for decades, been shot at 24fps; it’s become a standard. Digital video (and, before it, analog video) shoots at 30fps.
Even if you can’t quite articulate the difference, you know it intuitively; it’s one of the major reasons why shows shot on video — sports, talk shows, soap operas, multi-camera sitcoms, and reality TV — feel so different from the kinds of prime-time and premium-cable dramas that are shot on film.
TVs, for their part, display at one built-in refresh rate. Older and cheaper TVs are listed at 60Hz, which in this case means the same thing as “frames per second.” That makes the math easy for digital video shot at 30fps: The TV just displays each frame twice. But 24fps video doesn’t divide cleanly into a 60Hz display. So the industry came up with something called 2:3 pulldown.
2:3 pulldown repeats frames in a staggered pattern: One frame gets displayed twice, the next three times, the next twice, the next three times, and so on. Your eyes notice this, even if you can’t quite tell what you’re seeing because it’s all happening so fast. The effect of 2:3 pulldown is normally referred to as “judder,” a sort of jittering-stuttering portmanteau, I think. Judder is responsible for the slight stutter you can see, especially in slow panning shots in movies. This isn’t necessarily a bad thing: It’s what 24fps movies have looked like to us for a long time. That judder spells out “film” to us, a major factor in differentiating film from cheap video.
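If it helps to see the pattern spelled out, here’s a toy sketch in Python — not anything a TV actually runs — of how 2:3 pulldown stretches 24 film frames across 60 display slots:

```python
# Toy sketch of 2:3 pulldown: alternate showing each source frame
# 2 times, then 3 times, so 24 film frames fill 60 display slots.
def pulldown_2_3(frames):
    out = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3
        out.extend([frame] * repeats)
    return out

film_second = list(range(24))            # one second of film: frames 0-23
display_second = pulldown_2_3(film_second)
print(len(display_second))               # 12*2 + 12*3 = 60 display slots
print(display_second[:5])                # [0, 0, 1, 1, 1] — the stagger
```

The uneven repeats — some frames lingering a beat longer than others — are exactly what your eyes register as judder.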
New TVs often come with a much higher refresh rate than before: 120Hz and even 240Hz are common. But most content is still filmed at 24fps or 30fps, so the TV has even more slots to fill. A 120Hz TV has a whopping 120 still images per second to display and, in the case of a nice film, only 24 images to do it with. (The math is at least cleaner here: 120 divides evenly by both 24 and 30, so each frame can simply be repeated five or four times.)
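The arithmetic behind that frame-filling is simple enough to sketch — a hypothetical helper, not any manufacturer’s actual code:

```python
# How many display refreshes each source frame gets on a given panel,
# assuming the TV does nothing fancier than repeating frames.
def repeats_per_frame(source_fps, panel_hz):
    return panel_hz / source_fps

print(repeats_per_frame(30, 60))    # 2.0 -> clean doubling
print(repeats_per_frame(24, 60))    # 2.5 -> not a whole number: 2:3 pulldown
print(repeats_per_frame(24, 120))   # 5.0 -> divides evenly again
```

Whenever that number isn’t whole, the TV has to distribute the repeats unevenly — hence pulldown, hence judder.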
Instead of just repeating those frames over and over, TV manufacturers got fancy. They decided, unanimously though separately, that they could make images appear smoother, crisper, and clearer with a new technique called “interpolation.”
Interpolation replaces that old technique of simply repeating each still image a few times. Instead, the TV looks at the colors and patterns of one frame and the frame that comes after it, and generates new frames somewhere in between. Say a football is thrown across the screen: In one frame the football might be all the way on the left side of the screen, and in the next frame all the way on the right. Interpolation sticks one or more invented frames in between those two, in which the football is somewhere in the middle of the screen.
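A toy version of that guesswork — real TVs do something far more sophisticated, analyzing motion across the whole image, but this linear sketch captures the idea:

```python
# Toy motion interpolation: linearly estimate the in-between positions
# of an object moving between two known frames.
def interpolate_positions(x_start, x_end, n_between):
    step = (x_end - x_start) / (n_between + 1)
    return [x_start + step * (k + 1) for k in range(n_between)]

# Football at x=0 in one frame and x=100 in the next; the TV invents
# four frames in between, as a 120Hz panel might for 24fps film.
print(interpolate_positions(0, 100, 4))   # [20.0, 40.0, 60.0, 80.0]
```

Those middle positions never existed in the source — the TV made them up, which is why the motion looks unnaturally smooth.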
This interpolation guesswork isn’t the worst idea — and works very well at doing what it’s intended to do. Except our eyes are used to seeing repeated frames. The guessed frames make everything too smooth-looking, too clean. Put another way: Most of us have been trained to prefer the “less realistic” blur of film.
Motion interpolation isn’t necessarily bad. For 30fps digital video you won’t notice anything wrong, and most experts suggest that live sports look better with interpolation turned on. (Video games are a different story: The extra processing adds input lag, which is why most TVs’ game modes switch it off.) But for movies? You’ll want to turn it off.
So how do you do that? Each TV manufacturer has buried, somewhere in the hellish lists that make up their menus, some kind of interpolation option. It’s different for every TV, but generally searching through the “picture” or “motion” settings will help you locate it. On Samsung TVs it’s called Auto Motion Plus; on LGs it’s TruMotion; Sony calls it Motionflow; Vizio calls it Smooth Motion Effect.
It should be about the first thing you set on your new TV. And if you want to do this secretly to your aunt and uncle’s TV, you probably should. They’ll thank you later.