YouTube is more likely to serve problematic videos than useful ones, study (and common sense) finds

Shouldn't your next YouTube recommended video be a beauty tutorial, not a conspiracy theory?
By Jennimai Nguyen
Shouldn't recommended videos prioritize my interests? Credit: SOPA Images/LightRocket via Getty Images

Here's a study supported by the objective reality that many of us experience already on YouTube.

The streaming video company's recommendation algorithm can send you on an hours-long video binge so captivating that you never notice the time passing. But according to a study from software nonprofit Mozilla Foundation, trusting the algorithm means you're actually more likely to see videos featuring sexualized content and false claims than content tailored to your personal interests.

In a study with more than 37,000 volunteers, Mozilla found that 71 percent of the videos participants flagged as objectionable had been served to them by YouTube's recommendation algorithm. The volunteers used a browser extension to track their YouTube usage over 10 months, and when they flagged a video as problematic, the extension recorded whether they came across the video via YouTube's recommendations or on their own.

The study called these problematic videos "YouTube Regrets," signifying any regrettable viewing experience a user had on YouTube. Such Regrets included videos "championing pseudo-science, promoting 9/11 conspiracies, showcasing mistreated animals, [and] encouraging white supremacy." One girl's parents told Mozilla that their 10-year-old daughter fell down a rabbit hole of extreme dieting videos while seeking out dance content, leading her to restrict her own eating habits.


These videos tend to get recommended because of their ability to go viral. If videos with potentially harmful content manage to accrue thousands or millions of views, the recommendation algorithm may circulate them to users rather than focusing on their personal interests.

YouTube removed 200 videos flagged through the study, and a spokesperson told the Wall Street Journal that "the company has reduced recommendations of content it defines as harmful to below 1% of videos viewed." The spokesperson also said that YouTube has launched 30 changes over the past year to address the issue, and the automated system now detects and removes 94 percent of videos that violate YouTube's policies before they reach 10 views.

While it's easy to agree on removing videos featuring violence or racism, YouTube faces the same misinformation policing struggles as many other social media sites. It previously removed QAnon conspiracy videos that it deemed capable of causing real-world harm, but plenty of like-minded videos slip through the cracks by invoking free speech or claiming to be entertainment only.

YouTube also declines to disclose how exactly its recommendation algorithm works, claiming it as proprietary. Because of this, it's impossible for us as consumers to know whether the company is really doing all it can to keep such videos from circulating via the algorithm.

While 30 changes over the past year is an admirable effort, if YouTube really wants to eliminate harmful videos on its platform, letting its users plainly see its work would be a good first step toward meaningful action.

Jennimai Nguyen

Jennimai is a tech reporter at Mashable covering digital culture, social media, and how we interact with our everyday tech. She also hosts Mashable’s Snapchat Discover channel and TikTok, so she naturally spends way too much time scrolling the FYP and thinking about iPhones.

