Instagram recently updated its guidelines for users under 18, rolling out a set of changes Sept. 17 intended to limit the negative effects of social media on minors. Dubbed “Teen Accounts,” these changes include setting all minors’ accounts to private by default, muting notifications from 10 p.m. to 7 a.m. and increasing parental supervision.
Adam Mosseri, the head of Instagram, announced the feature on his Instagram account. “We’re introducing Teen Accounts to give teens automatic safe protections and parents more peace of mind,” Mosseri wrote.
He broke down the changes, explaining, “Designed based on parents’ biggest concerns, Teen Accounts will automatically apply our strictest settings around who can contact teens and the content they see. Teens under 16 won’t be able to change these settings to be less strict without a parent’s permission.”
Minors will experience these changes in action within 60 days of the announcement. But are these restrictive accounts actually safer for teen users?
Social media has become an integral part of the teenage experience, with 59% of U.S. teens using Instagram, according to the Pew Research Center. In a time when young people are more tech-savvy than ever, will daily time limits, messaging supervision and content censoring really prevent social media’s damaging effects?
A simple loophole presents itself: lying about one’s age in Instagram’s settings. With no true age verification measures in place, bypassing these restrictions is fairly simple.
James P. Steyer, the founder and CEO of Common Sense Media, a nonprofit that rates media and technology based on their suitability for adolescents, released a statement in light of the “Teen Accounts” update, calling the move “transparently timed.”
“This … announcement from Meta, the day before a key congressional committee is set to mark up social media legislation, is yet another attempt to make a splashy announcement when the company is feeling the heat,” the statement reads. “Clearly, they have always had these capabilities to protect kids and are only acting when they are under pressure from lawmakers and advocates.”
The Wall Street Journal reported in 2021 that Meta, known as Facebook at the time, had conducted research that found that Instagram is harmful to a “sizable percentage” of teens, specifically girls. The research found that, “[Instagram makes] body image issues worse for one in three teen girls. … Teens blame Instagram for increases in the rate of anxiety and depression.”
From exploitation to cyberbullying to suicidal ideation, Instagram has faced ongoing criticism for its failure to protect minors. Meta faces a federal lawsuit from several states over allegedly addictive features designed to captivate young users and over the intentional shielding of this information from the public sphere.
“Our kids can’t keep waiting for these companies to do the right thing. And we cannot allow them to decide when to have strong protections, or when to remove them. We need Congress to pass federal protections now,” Steyer wrote in his statement.
In a guest essay for The New York Times this summer, United States Surgeon General Vivek H. Murthy advocated for a surgeon general’s warning label on social media apps due to the mental health harms associated with adolescent usage. Murthy shared that almost half of U.S. teens aged 13-17 feel worse about their bodies due to social media.
“We have the expertise, resources and tools to make social media safe for our kids,” Murthy wrote in the essay. “Now is the time to summon the will to act. Our children’s well-being is at stake.”
For years, the U.S. government has tried to step in to advocate for further protections. The Kids Online Safety Act, which passed in the Senate this summer, aims to require major tech companies to shield children from dangerous, sexual or violent content.
Despite the negative discourse surrounding the “Teen Accounts” update, Mosseri asserts that “this is a big change to Instagram and it’s one that [he’s] personally very proud of.”
This move by Instagram is not the first attempt by a popular company to further child safety.
Uber launched “Uber for teens” this year, which allows parents to track their teen’s rides and automatically sets up PIN verification and RideCheck.
TikTok has default privacy settings for users under 18 years old and a Teen Safety Center with tools like a “Privacy tools 101 quiz” and a digital well-being guide for users and their parents.
Well-intentioned efforts to implement guidelines on social media platforms aim to address these concerns and improve teens’ experience. However, teens’ prior unrestricted access to these platforms may have already contributed to mental health challenges.
These “teen modes” are surface-level solutions meant to quell concerned parents rather than confront the root harms of social media on adolescent minds. The rollout of Instagram’s “Teen Accounts” will be a test of how effective such regulations can be.