YouTube promises to stop recommending flat Earth and 9/11 truther videos
Even after YouTube banned Alex Jones, harmful conspiracy theory videos were running rampant on the platform. Now, the company says it's going to take action.
In a blog post published on Friday, YouTube said it would be making changes to its recommendations algorithm to explicitly deal with conspiracy theory videos. The company says the update will reduce the suggestion of "borderline content and content that could misinform users in harmful ways."
YouTube clarified what kind of videos fit that description by providing three examples: “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
The company clarified that this content doesn't necessarily violate its Community Guidelines. This means that while the content may still exist on YouTube, the site's algorithm will omit these videos from being recommended to its users.
In order to deal with this sort of problematic content, YouTube says it relies on “a combination of machine learning and real people.” Human evaluators and experts will train the recommendation system to evaluate these videos. At first, the changes will only be visible on a small number of videos in the U.S.
YouTube says that, overall, less than 1 percent of videos will be affected by this change. But with the platform's massive video archive and the hours of new content being uploaded every minute, that still amounts to a lot of videos.
The video site, the second most trafficked website in the world, has long drawn criticism for its recommendation engine. The company did make changes in an attempt to combat misinformation. For example, in September, YouTube adjusted its search algorithm to prioritize trusted news sources.
YouTube recommendations continued to be a problem, however.
The Washington Post recently found conspiracy theory videos, including ones about Supreme Court Justice Ruth Bader Ginsburg, being recommended on YouTube. Motherboard reported last week on a 9/11 newscast that was being suggested to YouTube users en masse.
Just yesterday, BuzzFeed News published an investigation into YouTube's recommendation algorithm. BuzzFeed found that YouTube would eventually recommend conspiracy theory and hate videos from far-right commentators for the most basic of current events searches.
A study published in November found that an increasing number of Americans are researching topics on YouTube and going to the service for news. The study also found that the site's recommendation engine plays a large role in what videos its users consume.
Omitting flat Earthers, 9/11 truthers, and bogus MDs from YouTube recommendations would be a big step toward fixing one of the platform’s many problems.