There is perhaps no better pipeline to online radicalization, save for 4chan, than watching videos on YouTube. The video repository is loaded with many absurd things, but a lot of attention over the past year or so has been focused on the platform’s recommendation system. Its critics argue that YouTube has a “rabbit hole effect” — one in which viewing a video called “What is the Flat Earth Theory?” often leads users down a handful of increasingly extremist clips to a video called “Flat Earth theory PROVEN RIGHT — [Why ILLUMINATI and *NASA* Lied to Us] (and why they must die).” A recent survey of Flat Earthers found YouTube to be a central tool in their conversion to accepting the theory.
In an interview with the New York Times published today, YouTube chief product officer Neal Mohan denied that the company has a business interest in pushing users toward extremist content. “It is not the case that ‘extreme’ content drives a higher version of engagement or watch time than content of other types,” he said. He also reiterated that, gosh, YouTube is just so vast and it’s so tough to balance user safety and information integrity with concerns about freedom of speech — a standard talking point for large platforms.
Mohan’s denial of the “rabbit hole effect” feels odd, though, because to accept his explanation, one also has to deny the existence of a core feature of the YouTube experience: autoplay.
Mohan’s explanation, boiled down, is that the sidebar containing YouTube recommendations offers a number of options, some more extreme and some less. He did not offer any statistics or data to support his claims, only noting that, sure, there are bad videos and good videos on YouTube and one could theoretically watch either. He said:
[W]hen a video is watched, you will see a number of videos that are then recommended. Some of those videos might have the perception of skewing in one direction or, you know, call it more extreme. There are other videos that skew in the opposite direction. And again, our systems are not doing this [taking the level of a video’s extremity into account], because that’s not a signal that feeds into the recommendations. That’s just the observation that you see in the [sidebar] panel.
I’m not saying that a user couldn’t click on one of those videos that are quote-unquote more extreme, consume that and then get another set of recommendations and sort of keep moving in one path or the other. All I’m saying is that it’s not inevitable.
Except it sorta is inevitable! YouTube, configured with default settings, is designed to make these types of choices on behalf of the viewer. A user actively clicking to choose each video is hardly the intended experience. The site’s “Up Next” feature suggests to the viewer what they should watch next, and autoplay takes them there automatically, making the rabbit hole an opt-out situation (and anyone in the habit of regularly turning autoplay off knows that Google will flip it back on every so often). A recent experiment conducted by BuzzFeed News found in one instance that it took just nine steps through YouTube’s autoplay “to go from an anodyne PBS clip about the 116th United States Congress to an anti-immigrant video from a designated hate organization.”
In reality, YouTube gives users a heavily weighted single recommendation after each video, and takes them to it automatically. The system, when working as YouTube intends, requires no active decision-making from the audience. For Mohan, YouTube’s chief product officer, to assert that the site is only presenting impartial suggestions in its sidebar and letting viewers choose a direction feels disingenuous.