
YouTube’s role in shaping youth culture has grown exponentially, offering a vast library of content that caters to millions. The platform’s algorithm, designed to maximise engagement, often prioritises harmful content that glorifies skeletal bodies and extreme dieting practices. This troubling trend has disproportionately impacted young girls, amplifying distorted perceptions of beauty and health and increasing mental health challenges such as body dysmorphia and eating disorders.
According to a report by the Center for Countering Digital Hate (CCDH), one-third of video recommendations for adolescent users are linked to eating disorders. This algorithmic bias prioritises engagement over safety, perpetuating harmful narratives that equate thinness with success and beauty. The result is a feedback loop that amplifies exposure to extreme content: a user who watches a video about "healthy eating" may quickly find their recommendations filled with content promoting restrictive diets or weight loss transformations.
As these recommendations gain traction, they shape users' perceptions, making it increasingly difficult to escape the cycle of toxic content.
The impact of algorithm-driven content on young viewers is profound. Young girls, already vulnerable to societal pressures on body image, face amplified insecurities from repeated exposure to harmful content. Research shows a strong link between viewing extreme dieting videos and the onset of anxiety, depression, and disordered eating. The CCDH report revealed that 34% of YouTube’s recommendations for a simulated 13-year-old user promoted eating disorders. Over time, these messages erode self-esteem and encourage behaviours that jeopardise mental and physical health. As an overweight teenager in the ’80s, I struggled with self-esteem and an eating disorder fuelled by monthly magazines, a challenge I didn’t overcome until my 40s. Today, young girls face these pressures 24/7 with on-demand content, magnifying the urgency to address this issue.
Extreme dieting trends have gained significant traction on YouTube, often promoted by influencers who portray rapid weight loss as a pathway to beauty and success. Popular content includes prolonged fasting challenges, carnivore diets featuring raw steak for breakfast, detox plans, and other restrictive eating practices that promise quick results while ignoring the health consequences. The dangers of these trends are well-documented. Adolescents who follow extreme diets risk malnutrition, hormonal imbalances, and long-term organ damage. Beyond physical health, the psychological toll of constant calorie monitoring and food restriction can lead to obsessive behaviours and chronic dissatisfaction with one’s body. Despite these risks, YouTube’s engagement-driven algorithm prioritises such content, normalising these practices among young audiences.
There continues to be an urgent need for systemic change. Platforms must take greater responsibility for the material they promote, implementing stricter content moderation and re-evaluating their algorithms. While YouTube claims to moderate flagged videos, studies indicate that 81% of harmful material remains accessible. This failure highlights the platform’s negligence in protecting its most vulnerable users. Creators also have a role to play: by producing content that celebrates diverse body types and promotes balanced, evidence-based health advice, they can counteract the toxic narratives dominating online spaces.
We know the dangers posed by YouTube’s algorithm are multifaceted, impacting physical health, mental well-being, and societal norms surrounding beauty and success. Addressing these challenges requires a collective effort from policymakers, creators, and users alike.
It is unlikely that Google will ever prioritise user safety on YouTube by refining its algorithm to limit the promotion of harmful content. Transparency in content recommendation processes and proactive moderation are critical steps towards a healthier digital ecosystem, but retrofitting safety by design is a pipe dream, which is why education must remain front and centre.
Governments can demand that content creators adopt ethical practices and avoid sensationalised posts that glorify unhealthy behaviours. By mandating industry standards and codes of practice focused on positive, inclusive messaging, they can help shape a more supportive online community.
Parents play a vital role in combating these harmful influences. Open communication is key: discussing body image, media messages, and the risks of algorithm-driven content helps build awareness. Encouraging children to take breaks from social media and explore offline activities creates a healthier balance. Parents can also model positive behaviours by embracing body positivity and mindful content consumption, setting a strong example at home.
Critical media literacy is crucial as content shapes perceptions and influences behaviours more profoundly than ever before. Viewers, particularly young ones, must learn to question the content presented in their feeds, recognising the potential dangers of algorithmically amplified materials that often prioritise engagement over truth or well-being. This skill is not just about identifying misinformation but also about understanding the biases in the platforms they use daily. Families can work together to discuss content, reinforcing the importance of self-worth and emphasising the value of authenticity over curated perfection. Open communication, creating safe spaces for questions, and modelling thoughtful social media consumption can help young people navigate online worlds confidently and resiliently. Building these habits lays the foundation for a generation that can engage with social media critically, compassionately, and responsibly, ensuring that technology is a tool for growth rather than a source of harm.