YouTube’s Algorithm Wants You to Watch Conspiracy-Mongering Trash

The Wall Street Journal has an excellent piece up today, putting YouTube’s video-recommendation engine under a microscope and trying to pull apart why the site so often seems to steer users toward, as the Journal puts it, “divisive, misleading, or false content.” (We might call them something more like “shitvids,” but to each their own.)

That YouTube’s content-recommendation system tends to lead people toward less-than-stellar YouTube videos has been a known problem for a while — BuzzFeed took a long look at it a year ago, and major problems in its recommendations for kids became a hot topic toward the end of 2017.

YouTube introduced the content-recommendation algorithm several years ago in an attempt to keep users watching more videos on YouTube (and beefed up this algorithm with a deep neural network in 2016). As YouTube engineers tell the Journal:

The algorithm doesn’t seek out extreme videos, they said, but looks for clips that data show are already drawing high traffic and keeping people on the site. Those videos often tend to be sensationalist and on the extreme fringe, the engineers said.

But the Journal went a step further, hiring an ex-YouTube engineer, Guillaume Chaslot, who actually spent time working on YouTube’s recommendation engine. Chaslot built a program that searched YouTube for the 40 most popular search terms in November, December, and January, and traced the “rabbit holes” of content recommendations that users can be sent down.

The Journal and Chaslot found that news results often led to highly partisan channels. One prime example is the Conservative Network, which mainly posts video ripped from other networks and channels, titled in that odd YouTube syntax that seems to draw in clicks, like “Melania Trump MOCKS Joy Behar on Live TV.” The channel, which has only about 76,000 subscribers (a very small audience for YouTube), regularly posts videos that get millions of views, seemingly by heavily gaming the recommendation system.

But where Chaslot’s research gets more interesting — and, frankly, more disturbing — is in how quickly YouTube’s content recommendations lead users toward extreme and outright fake videos. Take a browser cleared of cookies and logged out of YouTube — in YouTube’s eyes, a brand-new user — and search for the term “CIA” or “9/11.” The first search results will usually come from credible sources, often a major news network. But after watching one video, the recommendation system starts to veer toward the batshit and conspiracy-minded. After just one 9/11 video, the recommendation system was serving up videos like “Footage Shows Military Plane Hitting WTC Tower on 9/11 — 13 Witnesses React.”

You can test this out for yourself at Chaslot’s site, AlgoTransparency, which tracks certain topics and ranks YouTube videos by how much more likely they are to be recommended than the average video on that topic. For instance, for someone searching “Is the Earth flat or round” on YouTube, the video “THE BEST Flat Earth VIDEO | 100% Proof the Earth Is Flat | Please Debunk This I Dare You!!!! 2018” is 8.6 times more likely to be recommended than the average video. For those searching for “vaccine facts,” the video “Madison: Before & After Vaccine Induced Autism” is 7.5 times more likely to be recommended than the average video.

YouTube says that its recommendation system drives more than 70 percent of total viewing time on the site. “We recognize that this is our responsibility, and we have more to do,” Johanna Wright, YouTube’s product-management chief for recommendations, told The Wall Street Journal.

There are, of course, precedents for cleaning up search results and content recommendations. Google has done two major cleanups of its search engine: first eliminating many low-quality “content farm” results with the Panda updates in 2011, and then favoring “authoritative” news sources for searches related to breaking news with Project Owl in 2017.

But YouTube and Google have different incentives in place. Google wins in the market by providing you the best search experience, which often means being a trustworthy and somewhat seamless transition point to another site (while serving up a few ads along the way, and gathering more data about what you’re interested in). YouTube wins by having you watch as many videos — with as many attached ads — as possible. And if what keeps people watching are videos about how 9/11 was an inside job, the Earth is flat, and vaccines induce autism, there’s little incentive for YouTube to change that formula — barring advertisers walking away en masse.
