YouTube employees warned about its ‘toxic’ video problems, but the company ignored them
Over the past couple of years, YouTube has taken steps to tackle one of the company's biggest challenges: the rise of toxic content on the platform.
Just recently, the company addressed misinformation with features that fact-check certain video search results. Anti-vaccination videos, which could harm the public, have been demonetized. YouTube has even promised to address its long-criticized recommendations system, so that it stops actively promoting extremist and conspiratorial content.
There's no doubt that YouTube is taking platform safety more seriously now than ever before. YouTube certainly wants you to know that. However, a report from Bloomberg now shines a light on how YouTube was consistently warned about these problems by its own employees well before it decided to address them. And while YouTube stresses how focused it has been on these issues over the past two years, one such rejected proposal could have helped stifle the spread of Parkland shooting conspiracies just last year.
According to former YouTube employees who spoke to Bloomberg, the company was repeatedly warned about toxic content and misinformation on the service but pushed the concerns aside to focus on the growth of the platform. As recently as February 2018, YouTube employees proposed a solution to limit recommended videos to legitimate news sources in response to conspiracy theory videos calling the Parkland shooting victims “crisis actors.” According to Bloomberg, the proposal was turned down.
Some former senior-level YouTube employees even cited the spread of this type of content as the reason they left the company.
One early YouTube employee, who had worked there before Google acquired the video site in 2006, explained how the site had previously moderated and demoted problematic videos, using content that promoted anorexia as an example. He pointed out how things seemed to change once Google came along and prioritized engagement.
With this push to grow engagement, and revenue along with it, creators of toxic videos took advantage of the changes. The problem became so well known that, according to Bloomberg, YouTube employees had a nickname for this brand of content: "bad virality."
Concerns over videos skirting the company’s hate policies, the recommendation engine pushing disinformation, and extremist content being promoted were effectively ignored. Proposed changes to policies to address these issues were also turned down. The company went so far as to tell staff not on the moderation teams to stop looking for problematic content to flag.
As YouTube notes in its response to Bloomberg's report, the company has begun to take these issues more seriously. YouTube has been especially responsive to toxic content relating to children. The company has even instituted policies similar to the proposal introduced following Parkland.
The changes now underway at YouTube are undoubtedly good for the platform's future. But it's clear that much of this could have been done far sooner.

Unfortunately, the damage is already done.