YouTube's algorithm has a pattern of recommending content showing self-harm or eating disorders, particularly to teenage users.
That is according to a new report released Tuesday by the Center for Countering Digital Hate (CCDH) titled "YouTube's Anorexia Algorithm." For the study, the researchers used a fictional 13-year-old user to determine how the platform's algorithm works.
Specifically, the CCDH researchers used the profile to perform 100 searches with popular eating disorder keywords, then examined the first 1,000 videos the platform recommended to the fictional 13-year-old after the account watched videos about weight loss, diet, and exercise.
What Were the Findings?
The study found that 70% of the videos recommended to users showing signs of wanting to diet or lose weight contained content likely to create or worsen anxieties about body image. One in every three videos recommended by YouTube's algorithm contained harmful eating disorder content.
Some of the eating disorder videos fell into categories such as "Anorexia Boot Camp," videos that encourage users to limit their calorie intake to induce anorexia, and "Meanspo," content that promotes weight loss through bullying comments.
Additionally, 50 of the videos recommended to the user involved content about self-harm or suicide.
What Is YouTube Doing To Address the Issue?
The researchers also found that videos promoting self-harm and eating disorders averaged about 344,000 views each, roughly 60 times the views of the average YouTube video. These videos also displayed major brands, including Nike, Grammarly, and HelloFresh.
As part of the research, CCDH flagged 100 videos that promoted harmful content on eating disorders and weight loss. YouTube only removed or age-restricted 17 of the flagged videos.
The platform did not comment directly on the study's findings, but said that it is "continuously working with mental health experts to refine its approach to content recommendations for teens," per CBS News.
In 2023, YouTube added the ability to age-restrict videos containing disordered eating or discussions that could potentially trigger at-risk viewers. It has also blocked certain keywords, such as "thinspiration," from its searches, though videos related to the term still appear in recommendations.