YouTube employees warned about its ‘toxic’ video problems, but the company ignored them
Over the past couple of years, YouTube has taken steps to tackle one of the company’s biggest challenges: the rise of toxic content on the platform.
Just recently, the company addressed misinformation with features that fact-check certain video search results. Anti-vaccination videos, which could harm the public, have been demonetized. YouTube has even promised to address its long-criticized recommendation engine so that it stops actively promoting extremist and conspiratorial content.
There’s no doubt that YouTube is taking platform safety more seriously now than ever before. YouTube certainly wants you to know that. However, a report from Bloomberg now shines a light on how YouTube was consistently warned about these problems by its employees well before it decided to address them. And while YouTube stresses how it has focused on these issues over the past two years, one rejected proposal could have helped stifle the spread of Parkland shooting conspiracies just last year.
According to former YouTube employees who spoke to Bloomberg, the company was repeatedly warned about toxic content and misinformation on the service but pushed the concerns aside to focus on the growth of the platform. As recently as February 2018, YouTube employees proposed a solution to limit recommended videos to legitimate news sources in response to conspiracy theory videos calling the Parkland shooting victims “crisis actors.” According to Bloomberg, the proposal was turned down.
Some former senior-level YouTube employees even cited the spread of this type of content as the reason they left the company.
One early YouTube employee, who worked there before Google acquired the video site in 2006, explained how the site had previously moderated and demoted problematic videos, citing content that promoted anorexia as an example. He pointed out how things seemed to change once Google came along and prioritized engagement.
With this push to grow engagement, and revenue along with it, creators of toxic videos took advantage of the changes. The problem became so well known that, according to Bloomberg, YouTube employees had a nickname for this brand of content: “bad virality.”
Concerns over videos skirting the company’s hate policies, the recommendation engine pushing disinformation, and extremist content being promoted were effectively ignored. Proposed changes to policies to address these issues were also turned down. The company went so far as to tell staff not on the moderation teams to stop looking for problematic content to flag.
As YouTube notes in its response to Bloomberg’s report, the company has begun to take these issues more seriously. YouTube has been especially responsive to toxic content relating to children. The company has even instituted policies similar to the proposal introduced following Parkland.
The changes ongoing at YouTube are undoubtedly a good thing for the platform’s future. But it’s pretty clear that much more could have been done even sooner.
Unfortunately, some of the damage is already done.