
TheirTube shows how YouTube’s algorithm creates conspiracy theorists


Ever wonder how your dear Aunt Karen got radicalized into believing the bizarre conspiracy theories she shares on social media? What about your apolitical college buddy who suddenly can’t seem to stop complaining about social justice and “cancel culture”?

Well, there’s a good chance they fell down the YouTube rabbit hole. And a new website, TheirTube, wants to show you how that happened.

TheirTube is a new online project that gives visitors a glimpse into what videos YouTube recommends to certain types of users based on their watch history.

A screenshot of TheirTube

The project breaks down these types of users into six different personalities: conspiracist, prepper, conservative, liberal, fruitarian, and climate denier. 

Clicking on each type of YouTube viewer brings up a list of videos that the platform recommends based on that personality’s watch history. TheirTube provides a playlist for each personality so you can see what videos the prepper or the climate denier, for example, could be watching in order to receive each day’s recommendations from YouTube.

TheirTube was created by developer Tomo Kihara, a Mozilla Creative Media Award recipient, and funded by the Mozilla Foundation.

Kihara told me over a private message that he was inspired to create TheirTube after seeing the YouTube frontpage of a person he knew who had turned into a “conspiracist.” He explained to me that it was “vastly different” from his own YouTube homepage.

So, how does TheirTube work?

“Each of these TheirTube personas is informed by interviews with real YouTube users who experienced similar recommendation bubbles,” said Kihara. “The videos are taken randomly from Youtube channels that they subscribed.”
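The article doesn’t include TheirTube’s actual code, but the idea Kihara describes — a persona defined by its channel subscriptions, with each day’s watch history drawn at random from those channels’ videos — can be sketched in a few lines of Python. Everything below (the channel names, video titles, and the daily_watch_history helper) is illustrative, not TheirTube’s real implementation:

```python
import random

# A minimal sketch of the persona idea Kihara describes: each persona
# subscribes to a set of channels, and its daily "watch history" is
# sampled at random from those channels' videos. All names and titles
# here are placeholders, not data from TheirTube.

PERSONA_SUBSCRIPTIONS = {
    "prepper": {
        "OffGridLiving": ["Bug-out bag basics", "Water storage 101"],
        "SurvivalSkills": ["Fire without matches", "Foraging tips"],
    },
    "fruitarian": {
        "RawEats": ["30-day fruit challenge", "Mango haul"],
        "PlantPower": ["Why I quit cooked food"],
    },
}

def daily_watch_history(persona, n_videos=3, seed=None):
    """Randomly sample videos from the persona's subscribed channels."""
    rng = random.Random(seed)
    # Flatten all (channel, video) pairs the persona is subscribed to.
    pool = [
        (channel, video)
        for channel, videos in PERSONA_SUBSCRIPTIONS[persona].items()
        for video in videos
    ]
    picks = rng.sample(pool, k=min(n_videos, len(pool)))
    return [f"{channel}: {video}" for channel, video in picks]

if __name__ == "__main__":
    for title in daily_watch_history("prepper", seed=42):
        print(title)
```

Feeding a history like this to a fresh YouTube account each day is, in essence, how the project keeps each persona’s recommendation bubble alive.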

For example, by clicking on TheirTube’s “conspiracist” personality, I was shown a number of recommendations, including one from a QAnon channel saying PizzaGate is real. PizzaGate is a bizarre far-right conspiracy theory that promotes the idea that children were being trafficked out of a popular D.C. pizza place’s basement. The conspiracy was birthed out of hacked emails from Hillary Clinton’s campaign manager in 2016. At the height of this conspiracy, an armed gunman who believed PizzaGate was real entered the D.C. establishment and demanded to be taken to its basement. One problem, however: The pizza place at the heart of the conspiracy doesn’t have a basement.

So what videos did the “conspiracist” personality need to watch to trigger YouTube’s algorithm to recommend that video? According to TheirTube, this profile had viewed content from creators such as the popular YouTuber Shane Dawson, as well as a few BuzzFeed channels.

“YouTube’s recommendation algorithm accounts for 70% of all views on the platform, and while it is useful, it can show the same points of view over and over again, confirming and amplifying your existing bias,” Kihara wrote on Twitter. “This sometimes can lead people to have radical beliefs and ideas.”

Over the years, YouTube has increasingly been criticized for its controversial recommendation engine. That algorithm determines which other videos the platform recommends to users based on what they consume on the site. Critics have pointed out how this can often lead YouTube’s recommendation engine to promote videos with ever more extremist viewpoints, pushing users down a “rabbit hole” where all they see on the platform are these kinds of videos.

Kihara told me that there is an upcoming open-source aspect to TheirTube where users will be able to create their own personas based on their own personal watch history, and “track and trace” their YouTube recommendations. He’s also accepting suggestions for additional personality types to add to TheirTube for public view.

YouTube has made significant progress in recent years to address the problem. However, the issue persists. You can now see the proof for yourself on TheirTube.
