
YouTube’s Moderation Problem Didn’t Come As a Surprise


Concern over how large online platforms are moderated — or not moderated — has been growing for years, particularly in light of the surge in right-wing anger and violence globally. Facebook endured most of the scrutiny early on and continues to do so; Twitter is now viewed as the platform with the most Nazis on it (save for 8chan … maybe). And YouTube is pretty well regarded as a hive of scum and “free thinker” weirdos hawking nootropic powder. Sure, there’s tons of other stuff happening on these platforms and these descriptions are reductive, but they work for the purpose of understanding the broader problems.

We understand these problems now, and have insight into how platforms and algorithmic recommendations drive outrage and hyperbole, but even before public scrutiny rose to its highest levels ever, people at YouTube were aware of the issue. A lengthy report from Bloomberg published today outlines how employees at the video site over the years tried to alert senior leadership to the problem of extreme content. Unfortunately, they were stymied by the primary mandate: increase engagement. That mandate, in retrospect, came at a high price.

While YouTube has a system to deal with clickbait — videos that mislead users as to what they are actually about — it has been struggling to deal with the spread of harmful information stated plainly, such as videos endorsing anti-vaxxer thinking. YouTube reportedly dragged its feet in instituting a restriction on videos “close to the line,” troublesome but not in violation of the site’s policies. Those videos no longer appear as recommendations for viewers looking for their next clip.

According to the report, YouTube generally discouraged employees from trying to make the site safer for users. “Lawyers verbally advised employees not assigned to handle moderation to avoid searching on their own for questionable videos,” according to the report, fearing that making substantive content decisions would eliminate the protection YouTube receives from federal shield laws.

The company also experimented with, but ultimately rejected, a revenue-sharing scheme that would likely have only encouraged the site to be flooded with even more extreme videos. The idea was to reward creators not just based on how many ads they ran, but also on engagement — how long they kept viewers on the site. According to Bloomberg, “One person involved said that the algorithms for doling out payments were tightly guarded. If it went into effect then, this person said, it’s likely that someone like Alex Jones — the Infowars creator and conspiracy theorist with a huge following on the site, before YouTube booted him last August — would have suddenly become one of the highest paid YouTube stars.”

The popularity of the alt-right on YouTube should be no surprise to anyone who accidentally watches ten seconds of Ben Shapiro and then is stuck in an endless cycle of Ben Shapiro “epic ownage” clips. But YouTube employees were aware as well. From Bloomberg’s report:

An employee decided to create a new YouTube “vertical,” a category that the company uses to group its mountain of video footage. This person gathered together videos under an imagined vertical for the “alt-right,” the political ensemble loosely tied to Trump. Based on engagement, the hypothetical alt-right category sat with music, sports and gaming as the most popular channels at YouTube, an attempt to show how critical these videos were to YouTube’s business.

For a data-driven company that relies on metrics instead of gut checks to make key decisions, the numbers showed that alt-right content was as important as music to YouTube. That’s something to think about the next time the company says that controversial or problematic videos make up only a small fraction of what’s on its platform.
