In the wake of last week’s Parkland high-school shooting, right-wing conspiracy theorists are pushing the ludicrous story that the teenage survivors speaking out against gun violence are “crisis actors” — dupes hired to pretend to be victims of tragedy. Earlier this morning, the top trending video on YouTube was one implying that David Hogg, one of the students pushing for legislative action on gun control, is an actor.
What does it mean, exactly, for something to be “trending”? YouTube, Facebook, and Twitter all make frequent use of the term, but none of them has a public or transparent definition — let alone a common one. When we sort through our feeds, “latest” has an obvious chronological sorting mechanism; even “popular” has a fairly clear and agreed-upon definition. “Trending,” however, does not. It’s similar to, but not the same as, “popular”; generally speaking, it means “popular, in some relative, technically defined way.” That is, the “trending” sections of major platforms are, as of now, algorithmically determined, their contents selected by formulas developed internally at those companies and kept private. Automated software determines what is trending, and it does so by scoring content against a fixed set of factors.
YouTube, for instance, identifies trending videos by examining aspects like the view count, the rate of audience growth, and the age of the content. A five-hour-old video is more likely to be trending than a five-year-old video; a video that goes from 100 views to 1 million is more likely to trend (yeah, it’s a verb now) than a video that goes from 250 million views to 251 million. Other factors might be considered as well. A YouTube star with millions of subscribers and hundreds of uploads might be judged on a different acceleration rate than breaking-news footage uploaded by a guy with 19 subscribers.
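The relative-growth logic described above can be sketched as a toy score. To be clear, this is an illustration, not YouTube’s actual formula, which is private; the weights and the decay constant here are invented:

```python
import math

def trending_score(views_now, views_day_ago, age_hours):
    """Toy 'trending' score: reward relative growth, penalize age.

    Illustrative only -- the real formulas are private, and every
    constant here is invented for the example.
    """
    # Relative growth: 100 -> 1,000,000 views scores far higher than
    # 250,000,000 -> 251,000,000, even though the absolute gain in
    # the second case is similar in size.
    growth = (views_now - views_day_ago) / max(views_day_ago, 1)
    # Freshness: a five-hour-old video beats a five-year-old one.
    freshness = math.exp(-age_hours / 24.0)
    return growth * freshness

# The article's two hypothetical videos:
viral = trending_score(1_000_000, 100, age_hours=5)
steady = trending_score(251_000_000, 250_000_000, age_hours=5 * 365 * 24)
```

Under this toy scoring, the small channel’s overnight hit handily outscores the huge but slow-growing back-catalog video, which is the behavior the platforms’ “trending” modules are built to capture.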
The first problem with “trending” is that it selects and highlights content with no eye toward accuracy or quality. Automated trending systems are not equipped to make judgments; they can determine if things are being shared, but they cannot determine whether that content should be shared further. Facebook’s trending section is fully automated. A spokesperson for the company said, “To determine what goes into the Trending section, we look at the number of publishers that are posting articles on Facebook about the same topic, and the engagement around that group of articles.” Facebook used to have an editorial team overseeing Trending Topics, a bulwark against fake news and online hoaxes, until Gizmodo reported in the spring of 2016 that the editors “suppressed conservative news.” Days after Facebook switched to an entirely automated system, a false report about Megyn Kelly being fired from Fox News for supporting Hillary Clinton made it into the module.
Twitter, for its part, selects content via its human-curated Moments section, which informs users of popular posts on the site, both frivolous and substantive, without perpetuating hoaxes on a repeat basis. But it still has a “trending” section; per Twitter, trends are “determined by an algorithm and, by default, are tailored for you based on who you follow, your interests, and your location.” So, if a ton of people you follow are talking about a niche topic, it’s more likely to be presented to you as a trending topic. That means that “trending” news and events on Twitter are, unlike YouTube’s hub, unique to each user.
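That per-user tailoring can be illustrated with another toy sketch: count each mention of a topic, but weight mentions from accounts the user follows more heavily. Again, the weighting scheme is invented for illustration; Twitter’s real personalization is not public:

```python
from collections import Counter

def personalized_trends(posts, following, follow_weight=5.0):
    """Toy per-user trending: posts are (author, topic) pairs.

    Mentions from accounts the user follows count extra, so a niche
    topic popular in your corner of the graph can outrank a globally
    bigger one. The weight is invented for illustration.
    """
    scores = Counter()
    for author, topic in posts:
        scores[topic] += follow_weight if author in following else 1.0
    return [topic for topic, _ in scores.most_common()]

posts = [
    ("alice", "#NicheTopic"), ("bob", "#NicheTopic"),
    ("x1", "#BigNews"), ("x2", "#BigNews"), ("x3", "#BigNews"),
]
# A user who follows alice and bob sees the niche topic ranked first,
# while a user who follows neither sees the bigger story first.
fan_view = personalized_trends(posts, following={"alice", "bob"})
stranger_view = personalized_trends(posts, following=set())
```

Two users looking at the same pool of posts get two different “trending” lists, which is exactly why a topic surging inside one community can look, from inside that community, like the whole internet is talking about it.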
When Twitter was younger and flooded with Beliebers, it had to change its trending algorithm because Justin Bieber landed on the list practically every time he made the news.
If we’d been smart, Twitter removing Justin Bieber from its “trending” module should’ve been the moment we all agreed to stop using “trending” as a prominent way to sort content on major social networks: If adolescents, accidentally or on purpose, could game a system, force a topic to “trend,” and attract more attention toward it, what would stop more pernicious actors? And if trending algorithms can be changed, then doesn’t the common industry refrain that algorithms are value-neutral and unbiased, unlike humans, ring false?
This is the other problem of “trending,” conceptually: It’s eminently gameable, but the platforms that use the term never make the rules clear. “Trending” is given the imprimatur of authority — videos or topics handed down from on high, scientifically determined to have trended — when really it’s a cobbled-together list of content being obsessively shared or tweeted about by people who love Justin Bieber. Or Logan Paul. Or who believe in crisis actors.
Which is why it’s worth noting that before YouTube removed the crisis-actor trending video (for “violating YouTube’s policy on harassment and bullying”) it’d been seen by more than 209,000 people in less than 24 hours. That’s a lot of people, but not quite as many as you might expect for something atop the great “trending” list. Automated trending systems perpetuate themselves by identifying content in the nascent stages of virality and telling users that “this is already being shared a lot.” But what does “a lot” even mean in the age of megaplatforms? Is 10,000 retweets a lot? Is 200,000 views a lot? The internet has screwed with our sense of scale to such an extent that 1,000 people alleging that leftists secretly carried out a school-shooting hoax using hired actors is either too many people or just a blip in a sea of billions (or, somehow, both). The “trending” designation is a worthless metric, but in the online-content economy, it somehow means everything.