Instagram’s algorithm is showing vulnerable teenagers more “eating disorder-adjacent” content, according to an internal Meta research document reviewed by Reuters. The study found that teens who said the app often made them feel bad about their bodies were exposed to three times more harmful body-related posts than those who didn’t report such feelings.

Researchers tracked 1,149 teens during the 2023–2024 academic year, analyzing what appeared in their feeds. For the 223 users who frequently reported body dissatisfaction, 10.5% of viewed content was connected to disordered eating or negative body image, compared to 3.3% for other teens. The posts often included sexualized body imagery, harsh appearance judgments, and themes of suffering.

Meta’s researchers acknowledged that the study could not determine whether Instagram caused the body dissatisfaction or merely reflected users’ existing interests. Still, they expressed concern that the platform’s algorithm may be amplifying harmful material to teens already at risk. They also found that Meta’s safety filters failed to detect 98.5% of sensitive content, including images of self-harm and extreme thinness.

Meta spokesperson Andy Stone said the company is committed to understanding young users’ experiences and improving safety, pointing to new restrictions on age-inappropriate content. But experts such as Jenny Radesky from the University of Michigan said the findings suggest Instagram’s algorithm profiles psychologically vulnerable teens, serving them more damaging posts instead of shielding them.

The revelation adds to the pressure on Meta, which already faces U.S. investigations and lawsuits over the platform’s impact on youth mental health and allegations of deceptive safety claims.