Given the troubling impacts of social media on teens’ mental health, I was happy to see Instagram taking steps to address these harms and make the platform safer for our children. Instagram’s recent initiative to create “Teen Accounts” with built-in protections seemed like a hopeful step forward. Rolled out in September 2024, these updates promised a more secure and supportive online experience for young users.
Like many, I was cautiously optimistic. But also curious—are these changes actually working?
Unfortunately, recent findings from Jonathan Haidt, author of The Anxious Generation, and from advocacy groups like Accountable Tech and Design It For Us suggest otherwise. Despite Meta’s claims, testing of the new Teen Accounts has revealed deeply concerning patterns:
- 5 out of 5 test Teen Accounts were recommended sensitive content, even with default sensitive content controls enabled.
- 5 out of 5 were shown sexual content.
- 4 out of 5 were pushed content related to body image issues and disordered eating.
- 4 out of 5 users had distressing experiences while using the platform.
- Just 1 out of 5 was shown educational content.
These findings are alarming, especially considering that Meta made $164 billion in revenue last year and $45 billion in the last quarter alone. With resources like that, there is no excuse for failing to adequately protect young users.
Instagram doesn’t just fail to protect vulnerable teens—it actively targets them. The platform’s algorithm is designed to maximize engagement, often by exploiting insecurities. For example, if a young girl likes a few posts of fit female athletes, she may soon find her feed flooded with extreme workout content, dieting tips, and “thinspiration” images. This progression is not accidental—it’s driven by a system that amplifies content designed to keep users scrolling, regardless of the psychological toll. Teens with poor self-image are especially vulnerable, and the data shows that Instagram pushes harmful content to them even when safety settings are in place.
A Call to Parents and Caregivers
The lesson here is clear: don’t hand your child a smartphone—or unrestricted social media access—until they are truly ready. Make sure they have the maturity to navigate these platforms and that proper parental controls, education, and ongoing dialogue are in place.
Social media giants like Meta have proven they will not self-regulate. They are building platforms that target our kids’ insecurities, hijack their attention, and shape their sense of self—often with damaging consequences.
Let’s not wait for more studies to confirm what we already suspect. Delay smartphone use. Stay involved. And help your child build the resilience and skills they’ll need to stay mentally healthy in a digital world designed to do the opposite.