OPINION: Media literacy is essential on online platforms
It seems no one can spend a year, or even a week, in college without at least one YouTube binge. These sessions start with a video sent by a friend or linked in an article and turn into chains of “next” and “recommended” clicks, carrying you further and further from the video you began with. You finally resurface, hours later, your brain feeling like a swollen sponge, wondering how you went from a harmless Buzzfeed clip to “Why Illuminati Replaced Nicki Minaj With Cardi B – THE TRUTH”.
The answer is YouTube’s recommendation system, a closely guarded algorithm that queues videos based on user history and machine learning. The algorithm is designed to keep the viewer watching, generating ad revenue for the site, and it is remarkably effective. It harnesses the human desire for novelty and entertainment, automatically recommending and playing videos endlessly until the user chooses to stop. To keep us watching, the algorithm trends toward extremes in everything from $1,000 makeup hauls and all-fruit diets to the more dangerous territory of conspiracy theory, questionable science and fake news.
In this territory, YouTube poses a much greater danger than a few wasted hours. After the mass shooting in Las Vegas in October, the algorithm was discovered to be promoting conspiracy theories calling the shooting a hoax. Just weeks ago, a video alleging that the shooting at a high school in Parkland, Florida, was the work of “crisis actors” was boosted by the recommendation algorithm, landing on the site’s coveted “trending” page after an extreme spike in its popularity. Videos peddling blatantly false and conspiratorial news stories seem to be favored by the algorithm, and if someone is exposed to enough of these ideas over time, their view of reality can be significantly distorted.
Additionally, there seems to be a bias in the algorithm toward right-wing and far-right content. Former YouTube engineer Guillaume Chaslot conducted a statistical analysis of the algorithm just before the 2016 presidential election by seeding accounts with pro-Hillary Clinton and pro-Donald Trump searches. A total of 86 percent of recommended videos contained damaging messages about Clinton, regardless of the initial search. This does not mean that a right-wing bias is written into the site’s code, but rather that over years of aggregated user data, videos with right-wing views have taken precedence over their liberal counterparts, so the algorithm tends to recommend them when all other variables remain consistent.
Leaders of alt-right and white supremacist groups have taken full advantage of this fact, and YouTube is teeming with right-wing pundits who can serve as a gateway into the vitriolic and racist world of the alt-right for vulnerable young people. Transcripts of conversations in alt-right chatrooms even show members coordinating to create thousands of fake accounts and vote down videos with more leftist views, further gaming the algorithm in their favor. In an interview with The Daily Beast, Southern Poverty Law Center representative Ryan Lenz said “the alt-right is incredibly adept at navigating and exploiting loopholes in social media platforms.”
As students develop and refine their opinions, it is important to recognize the power of platforms like YouTube. In these companies’ missions to maximize ad revenue, they can lead users down dangerous paths of radicalization, recommending increasingly extreme content and pushing viewers on both sides of the political spectrum ever deeper into their ideological echo chambers.
So far, YouTube’s responses to these controversies have been inconsistent and ineffective, relying on a handful of human moderators and demonetization to remove only some of the most offensive channels. Yet the site’s artificial intelligence brought a Parkland conspiracy video to the trending page as recently as last month. There may be something deeply wrong with this algorithm, but since it remains incredibly good at keeping users engaged, YouTube seems reluctant to make substantial changes.
In a world where the average American adult spends ten hours a day consuming media, platforms like YouTube can have as much impact on students’ worldviews as any teacher or classroom. But at the end of the day, it is companies, not educational institutions, that are competing for our attention. It is important to push for improvement at YouTube (and its parent company, Google), but consumers also need to exercise moderation and critical thinking in their media consumption, however difficult that may be.