In retrospect, it’s amazing that YouTube has lasted this long. A video-streaming service that lets anyone upload anything they want, for immediate playback by anyone? The number of problems such a service would have — not just the technical problems associated with delivering video files at that scale, but the legal problems involved in ensuring copyright protection, and the moral and ethical problems of moderating that content — would seem to make its demise inevitable.
Yet here we are: Since its founding in 2005, YouTube has been the standard for sharing video on the web. So much so that YouTube isn’t just a video provider — it is web video, and has been so for more than a decade. And only now that YouTube has cemented its place in the web firmament, and turned itself into an unshakable pillar of the internet, are we beginning to reckon with how vast, influential — and potentially dangerous — the site is. If 2017 was the year of the great Facebook backlash, 2018 is shaping up to be YouTube’s turn.
Earlier this week, a YouTube star named Logan Paul uploaded footage of a dead body that he’d found in a forest. Paul was traveling in Japan and went to Aokigahara, a forest known unofficially as “suicide forest.” Surprise! He found a recent victim of suicide; he filmed the body; he filmed himself reacting to it (“So, okay, there are a lot of things going through my mind right now”); he edited the footage together and found a soundtrack for it; he uploaded the video to his YouTube channel; he added a provocative thumbnail of the body to pique viewer interest; his various hangers-on uploaded their own videos of themselves and Paul finding the body. Before it was taken down, Paul’s video was watched more than 6 million times.
The video is grotesque; Paul has earned the criticism that he is currently receiving. And, to be fair, much of that criticism is coming from Paul’s fellow vloggers and YouTube celebrities. But once the initial shock subsides, the video feels less like an aberration than an inevitability. A cult of stardom based upon incessant self-recording, a celebrity economy built on outsize reaction: Where else could YouTube vlog culture go besides filming a dead body? It is the current but not permanent nadir of YouTube culture.
Paul’s Aokigahara video is not the first moment that the rest of the internet has seemed to seize up in horror at YouTube. In November, the artist and writer James Bridle outlined the unsettling nature of YouTube content aimed at kids, which usually stops just short of being outright traumatizing. “Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale,” Bridle wrote, with those people and things taking advantage of how the site uses automation to suggest an endless stream of videos to users.
And, to complete the trinity begun by amoral vloggers and creepy kids’ videos, we have: the Nazis, white supremacists, and neo-reactionaries, who have identified YouTube as fertile ground to plant and spread their ideologies. In August, the New York Times compared YouTube’s far-right user activity to conservative talk radio: “the value it places on personalities; its reliance on monologue and repetition; its isolation and immunity from direct challenge; its promise to let listeners in on the real, secret story.” The analogy “is also a useful reminder of how potent a medium can become while still appearing marginal to those who don’t care for it or know much about it.”
Of course, YouTube is not only white nationalists, dead-body reactions, and videos of Peppa Pig operating on Spider-Man. It’s also family videos and amateur web series and funny viral clips. It’s often a great site — one of those vast, weird, democratic internet spaces where people meet to argue about Star Wars and share videos of themselves high on laughing gas after dental surgery.
And for a long time, that’s been the chief image we have of YouTube. For the most part, YouTube has been defined by two poles: amateur videos (shaky-cam footage, anime music videos, and fan tributes put together in Windows Movie Maker or iMovie) and professional and semi-professional works (comedy skits, music videos, special-effects reels, cooking and makeup tutorials). The early beauty of YouTube was that it was both a way to watch heavily produced content for free, and a never-ending episode of America’s Funniest Home Videos.
It is easy to distinguish between these two poles. You know what shaky-cam, off-the-cuff footage looks like, and you can tell when someone is putting effort into making a slickly produced video package. To the average viewer, slickly produced video — say, featuring a lot of computer animation or motion graphics, or even just efficient editing — indicates a level of competence, intelligence, and trustworthiness. Amateurish video — a guy ranting about NASA to his low-quality webcam — indicates, well, the opposite. But that distinction, which seemed to many to render YouTube harmless, is now meaningless. Instead of a lot of dumb, amateur video and a few professional-grade producers, many of YouTube’s popular native channels are run by dangerous or careless individuals with professional editing rigs and enormous online followings. YouTube is no longer two poles; it’s a swampy middle ground where aesthetic signifiers, such as production value, can no longer act as a shorthand for level of quality (in part thanks to rapid technological innovation).
Again: The vast majority of YouTube’s content is inoffensive. Much of it is just spam channels trying to game the system for ad money; some of it is genuinely great stuff that the world would be poorer without. But 65 years’ worth of video is uploaded to YouTube every single day, making the site’s full contents unknowable to, and impossible to moderate by, any one person or small group of human beings. The technical hurdles — the unscannable, unindexed nature of video — make each rambling 15-minute vlog difficult to evaluate at speed and at scale. And just as Facebook’s enormous size transformed once-funny hoax posts into a global misinformation crisis, YouTube’s size and prominence mean its worst, scariest stuff, no matter how marginal in the scheme of things, is still reaching millions of people, many of them young. For adolescents and teenagers across the globe, YouTube is the de facto time-waster, a place where you go to search for toy videos and news — and where, thanks to the wonder of automation, you could wind up in a distorted world of disturbing images, resentment politics, and brain-dead reaction-bait, all of it feeding off of itself.
And as with Facebook, this is one of YouTube’s key problems: The bad stuff is not an anomaly, but a rational response to the platform’s incentive scheme. Logan Paul is not the first person to post video of a dead body on YouTube, and he will not be the last (someone’s probably already done it since Saturday); his video simply operates in a tradition of crazy stunts and wild reactions that generate high engagement numbers on YouTube. Already, dozens of other prominent YouTubers are posting response videos, their own condemnations of Paul’s obviously terrible act, rushing to be the algorithmically determined next video in YouTube’s autoplay queue. That same autoplay function guides children from videos like “Peppa Pig Episodes — Birthday Compilation — Cartoons for Children” to videos like “MLG Peppa Pig Daddy Pig Dies in the BlackBerry Bush.” A parent glancing over their kid’s shoulder every once in a while might not notice the difference.
Might not notice the difference — until stories about creepy animated videos and unthinking vloggers are in the news all of the time. YouTube is now unavoidable, which means that its problems are unavoidable, too.
But how the coming YouTube backlash might actually change the site is unclear. Because of that scale problem, and because of how much more difficult it is to use automated computer systems to process and organize video, the company might not ever be able to sort through video to adequately moderate it. It should also be said that the self-contained nature of the YouTube ecosystem, and the tribalism YouTubers inspire among their fans, often inoculates people like Paul against substantial outside criticism.
For now, YouTube has been flirting with “demonetization” — preventing creators who do or say controversial things from “monetizing” their videos through automated advertising. But this is a messy system: Videos espousing Nazism might be demonetized, but so, caught in the same sweep, are videos of games in which the player fights Nazis. (Creators then decide to chronicle and explain this problem … on YouTube.) Facebook has all but eliminated links to outside sites from its News Feed; it’s even toyed with putting them on a separate feed entirely. But YouTube doesn’t exactly have that luxury — the site feeds itself. Everything, from the creepy to the reactionary to the heartwarming, already lives on YouTube in the first place.