YouTube Is Dangerous as Hell. But We Still Need It.

THE-YOUTUBE-EFFECT_KRISTY-TULLY_6 - Credit: Trouper Productions

YouTube is an unparalleled community and resource. It’s the only place I’m sure to find my acting and directing work, like unseen audition clips from our Bill & Ted movies or a terrible KFC commercial I acted in as a pre-teen that my son sleuthed out. As a director of documentaries, YouTube is a mainstay for my researchers and often where we discover the most critical archive material, whatever the topic. But YouTube is also home to some of the darkest and most dangerous content online. It continues to negatively influence and radicalize users, even driving them to real-world violence. It’s the largest and most impactful of all media platforms, yet it is often entirely left out of the conversation about the harmful influence of the internet. My latest film sought to understand why that is.

The YouTube Effect is my fourth feature documentary about the rise of various online communities. I began to look closely at YouTube and its parent company Google in the wake of the 2016 election when the online world appeared to lose its collective mind, rife with conspiracy theories, election-oriented propaganda, and hate speech. At the same time, YouTube continued to be an invaluable resource for everything it does well: granular news and archival media from around the world, creative influencer content, and entertaining videos. But issues with the “rabbit hole effect” began to surface, where the company’s recommender algorithm would offer users increasingly extremist and radicalizing content.

Things came to a head in 2019, when a lone gunman in Christchurch, New Zealand, murdered 51 Muslims, leaving a manifesto that explicitly credited YouTube as the chief motivator for his spree. The company responded immediately: de-platforming the extremist YouTuber Stefan Molyneux, a primary influence on the shooter; forming task forces to mitigate future harm; and working to correct issues with the recommender algorithm. But the problems persisted, making me wonder if this really was an algorithmic issue. What gave YouTube its unrivaled power, and did it pose significant danger to its users? Later that year, producer Gale Anne Hurd (The Walking Dead) asked me if I was interested in making a documentary about YouTube, and I dove headfirst into these questions.

It’s hard for people to pin down what YouTube is. It’s not social media. It’s not home videos. It’s not news, TV, movies, radio, or a search engine. It’s not an influencer destination, a DIY source, or the repository of all recorded human history. It’s all of those things — the mutant offspring of the Library of Alexandria and the Tower of Babel. YouTube is Google’s media front end, the second most visited website on the planet after Google itself, and owned and run by its parent company, Alphabet, which is the largest of all the Big Tech companies in terms of user engagement.

The scale of YouTube is staggering: over 4.6 billion views a day, far surpassing any other web platform or social media app. The product went online as a small video service in 2005 and was bought by Google for $1.65 billion in 2006. Within a year of that purchase, most of us were hooked. We were hooked by the ease of use, breathtaking diversity, and wide spectrum of voices, but mainly because this platform was visual. It provided a newfound intimacy, or an effective simulacrum of intimacy, and the emotional impact of seeing, not just connecting via audio or on-screen text. YouTube was huge, like everyone-in-the-world-at-once huge, and the problems began immediately.

There was vicious sexual harassment and serious death threats directed at users — predominantly women, people of color, and the LGBTQ+ community; incitement to violence by radical influencers funded by dark money operatives with political and often extremist agendas; highly inappropriate content fed into children’s programming. And it was scaling, meaning this content wasn’t on the fringe; it was getting colossal engagement, and as we worked on the film, everything pointed to another tragedy. This time it was the deadly insurrection at the U.S. Capitol. As writer Talia Lavin divulges in our movie, the journalist group Bellingcat’s study showed that the perpetrators’ mobilization was driven by radicalization on YouTube more than any other platform.

The distracting “algorithm” narrative has undermined efforts to explain the dangers of YouTube to the public and policymakers alike, hampering progress toward a solution. There are algorithms at play, and they can be insidious. But an algorithm isn’t at the root of the harm, and two well-known, non-technical precedents explain what is: One is the heated rivalry between Hearst and Pulitzer in the 1890s that birthed the term “yellow journalism,” the dissemination of knowingly false or exaggerated news for profit. The other is Ralph Nader’s book Unsafe at Any Speed, which exposed the automobile industry for resisting safety features for their cars, and helped lead to the passage of the National Traffic and Motor Vehicle Safety Act in 1966.

The issue with YouTube is its off-the-charts monetization of yellow journalism via a global media platform with no meaningful safeguards. And when a profit motive is tethered to a video-based platform, where a content creator is seemingly looking and talking directly to the end user, a parasocial bond is created that has enormous power. Now, multiply that power by 4.6 billion views a day.

Alex Winter, director of 'The YouTube Effect.'

When I asked then-CEO Susan Wojcicki in my film about the platform driving real-world harm, she spoke sincerely about their attempts to address these issues, such as deplatforming incendiary influencers, which they had just done in the case of conspiracy theorist Alex Jones, and flagging and removing content they don’t deem suitable for their advertisers. All true and beneficial. Yet soon after that interview, Steven Crowder, a well-funded, far-right influencer, called for civil war to his four million subscribers on his ad-heavy YouTube channel after the FBI peacefully searched former president Trump’s Florida home for classified documents. Only a few weeks ago, YouTube ended its policy of removing content related to the “Stop The Steal” movement that claims the 2020 election was a fraud, the primary conspiracy theory that drove the insurrection and still encourages real-world violence.

The harm is ongoing, and will be worse in the era we’re about to enter when algorithms are paired with sophisticated AI and disinformation spreads like gasoline on a brush fire. It’s not surprising that as long as YouTube’s business model works, profit will take priority over concerns for end-user safety. But how can a vast monopoly this powerful and influential be allowed to operate for this long without oversight?

Google has unparalleled lobbying power. Its money is sprinkled liberally across all parties worldwide, allowing it to keep lawmakers off its back with the tacit agreement that it will police itself, which it has no incentive to do. As a result, the narrative around internet-fueled harm is being controlled by the tech companies, who blame the end user for dangers caused by their product and promote impractical non-solutions: “Put down your device, go outside, and touch grass!”

We urgently need our Congress and international governments to pass meaningful legislation and anti-trust law and ultimately to break up these monopolies. But in this climate, we’ll more likely get posturing meant to appease constituents. If we understand the human incentives that cause harm and reject dishonest, diversionary narratives, we can create workable solutions and demand specific action from Congress. It’s on all of us, and it requires an acceptance of the good that platforms like YouTube provide. When we “other” or demonize these platforms, we deny their influence and purpose, ignoring that we need and use them daily.

As Natalie Wynn, creator of the highly popular ContraPoints YouTube channel, states in our film, we can’t expect Big Tech to save us; everyone has to get involved and show up for one another. To Natalie’s point, many of the loudest voices counteracting the harm caused by YouTube can be found on YouTube itself. It may seem unpalatable, but I believe the answer to the damage caused by Big Tech is not to put our devices down, but to engage — to become more tech-aware, and thus more discerning and informed about what we, and our children, take in, and therefore better armed to help fight for viable solutions. And where else can my kids go to dig up embarrassing Bill & Ted outtakes of their dad? So, I’m sticking around. Until AI comes for me.
