
YouTube execs reportedly ignored employee warnings about toxic videos

June 11, 2025

One of the biggest problems with YouTube in recent years has been the prevalence of toxic videos covering conspiracy theories and other misinformation. Worse, the video-sharing website has actively recommended these high-engagement videos to its users, despite their questionable, false, or incendiary content. Why? To get more views.

Now, Bloomberg reports that current and former Google and YouTube employees raised concerns with the company about these videos and offered solutions. Unfortunately, these employees were told not to “rock the boat.” The outlet interviewed over 20 former and current staffers, painting a picture of a company that apparently refused to act for fear of reducing engagement numbers.



One reported solution was offered by former Googler Yonatan Zunger, who left in 2016: simply flag “troubling” videos so they weren’t recommended to users. The outlet claimed that the proposal reached the head of YouTube policy, where it was promptly rejected.
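To make the idea concrete, here is a minimal sketch of what “flag but don’t recommend” could look like, assuming a hypothetical Video record and recommend() helper — the report describes the proposal only at a high level, so the names and scores below are purely illustrative:

```python
# Hypothetical sketch of "flag but don't recommend": flagged videos remain
# watchable via search or direct links but are excluded from recommendations.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    engagement_score: float   # e.g. predicted watch time (illustrative)
    flagged_troubling: bool   # set by human reviewers or a classifier

def recommend(candidates: list[Video], k: int = 5) -> list[Video]:
    """Rank by engagement, but drop anything flagged as 'troubling'."""
    eligible = [v for v in candidates if not v.flagged_troubling]
    return sorted(eligible, key=lambda v: v.engagement_score, reverse=True)[:k]

videos = [
    Video("cat-compilation", 0.72, False),
    Video("crisis-actor-claim", 0.95, True),   # high engagement, but flagged
    Video("science-explainer", 0.61, False),
]
print([v.video_id for v in recommend(videos)])
# -> ['cat-compilation', 'science-explainer']
```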

Another workaround was proposed after a video claiming that Parkland school shooting victims were “crisis actors” went viral. Policy staff called for recommendations of the video to be limited to vetted news sources, but a source told Bloomberg that this solution was rejected as well.

Engagement at all costs?

These proposals also came against the backdrop of YouTube’s internal goal of hitting one billion hours of viewing per day. The recommendation system, built on a neural network, was reportedly overhauled in order to meet this goal.
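As a rough, hypothetical illustration of why that objective matters: a recommender that ranks purely on predicted watch time will surface whatever keeps people watching longest, accurate or not. The candidates and numbers below are invented for the example and are not YouTube’s actual system.

```python
# Toy illustration (hypothetical): ranking candidates purely by predicted
# watch time, the kind of signal a "one billion hours per day" goal rewards.
candidates = {
    "measured-news-report": 4.2,     # predicted minutes watched
    "conspiracy-rabbit-hole": 11.8,  # sensational content often keeps viewers longer
    "howto-tutorial": 6.5,
}

# With watch time as the only signal, the most "engaging" video wins,
# regardless of whether its content is accurate.
ranking = sorted(candidates, key=candidates.get, reverse=True)
print(ranking)
# -> ['conspiracy-rabbit-hole', 'howto-tutorial', 'measured-news-report']
```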

According to Bloomberg, computer scientist Francis Irving, who has been critical of YouTube’s AI, said that he had informed YouTube representatives of the problems facing this system, calling it an “addiction engine.” The scientist said the representatives either responded with doubt or indicated that they had no plans to change the system.

YouTube announced earlier this year that it’ll no longer recommend videos with “borderline content” or those that “misinform users in a harmful way.” The solution sounds similar to Zunger’s proposal before he left the company. But if these solutions were indeed proposed beforehand, why were they rejected at first? Was it a case of advertisers voicing their displeasure at Google’s recommendations? It wouldn’t be the first time they intervened following inaction by the platform.

The website has also since implemented information boxes below certain videos, questioning dubious claims and linking users to established sources. But it’s unclear whether these measures are enough to shake YouTube’s reputation as both a repository and a promoter of misinformation.

Bloomberg’s article also detailed a proposal by YouTube CEO Susan Wojcicki and senior staff to change the way YouTubers earned cash. The proposal called for users to be paid based on engagement, with incoming money being pooled and then shared among uploaders (even if some creators didn’t have ads on their channel). This proposal was rejected by Google CEO Sundar Pichai, who felt that it could exacerbate the site’s filter bubble problem.
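The report doesn’t spell out exact figures or a formula, but the mechanics of such a pooled, engagement-weighted payout can be sketched with hypothetical numbers:

```python
# Hypothetical pooled payout: ad revenue goes into one pot and is split
# among creators in proportion to the watch time their videos generated.
monthly_ad_pool = 1_000_000.00  # dollars, illustrative only

watch_hours = {
    "creator_a": 500_000,   # runs ads on their channel
    "creator_b": 300_000,
    "creator_c": 200_000,   # runs no ads, but would still be paid under the proposal
}

total_hours = sum(watch_hours.values())
payouts = {name: monthly_ad_pool * hours / total_hours
           for name, hours in watch_hours.items()}

for name, amount in payouts.items():
    print(f"{name}: ${amount:,.2f}")
# creator_a: $500,000.00 / creator_b: $300,000.00 / creator_c: $200,000.00
```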

