Opinion editors are not responsible for agreeing or disagreeing with their writers but rather for elevating each individual’s specific voice.
Trigger Warning: Mentions of suicide, depression, anxiety, and other mental health issues.
Depression, suicidal ideation, and damaged body image are just a few toxic consequences of social media use among young users. Mainstream platforms, namely Instagram, Snapchat, and TikTok, lure in children as young as 13 under the pretense of creating a space to share life updates with friends, when in reality their purpose is to make money through an addictive algorithm.
On Feb. 16, New York City announced that it is suing the platforms named above “for an uptick in mental health issues among young people,” according to a CNN Business article. While NYC is the first major American city to take steps of this magnitude, such lawsuits are nothing new, especially from families who hold these platforms responsible for their children’s deteriorating mental health.
Several families have come forward with stories about social media’s toll on their children. Seventeen-year-old Christopher James Dawley, 16-year-old Ian Mitchell, and 11-year-old Selena Rodriguez are just a few victims of social media addiction who were driven to take their own lives; some were reportedly still holding their phones with Snapchat open.
Sites like Instagram, TikTok, and Snapchat tell kids what to wear, whom to look like, and how to act, pushing users to compare themselves to the latest top influencer and the girl who sits in front of them in history class.
NYC Mayor Eric Adams has previously called social media a “public health hazard” and an “environmental toxin,” and Surgeon General Dr. Vivek Murthy has warned that the platforms pose a “profound risk of harm.”
So why aren’t these tech companies making changes for the betterment of the youth, who are their biggest consumers? For money.
Social media use among kids is “nearly universal”: up to 95 percent of 13- to 17-year-olds report using social media, according to the Surgeon General’s advisory. While 13 is the minimum age to use sites like TikTok, Instagram, and Snapchat, there is evidence that 40 percent of children ages 8 to 12 also have accounts. These numbers are simply too high for addictive sites that foster little positive engagement. Instead, these kids are exposed to content that fuels depression, anxiety, and low self-esteem.
The algorithms of these platforms succeed because of tailored content that directly engages users. Each user’s data, whether age, gender, ethnicity, previously liked posts, internet searches, or even living room conversations, is stored to curate specific content that draws users in for hours on end, triggering pathways comparable to those of addiction.
In May 2023, the state of Montana sought to ban TikTok, an attempt a federal judge blocked several months later on First Amendment grounds, according to NPR. Why did Montana want the ban? Because of the belief that U.S. citizens’ sensitive data is being shared, creating a privacy risk for users.
While U.S. District Judge Donald Molloy noted that the law carried anti-Chinese sentiment, platforms like TikTok do have access to consumers’ detailed information, which is what allows them to build feeds unique to each user. It has reached the point where, in April 2023, the U.K. Information Commissioner’s Office fined the platform £12.7 million for illegally processing the data of more than one million U.K. children under the age of 13, according to Information Commissioner John Edwards. TikTok simply was not doing enough to detect users under 13, the minimum age to sign up, and remove them from the site. The fine came just weeks after the app was banned from U.K. government phones over security concerns.
These tech companies’ highly profitable algorithms are fueled by the time users spend on a platform: the never-ending scroll through tailored content. These sites are designed to keep thumbs swiping across screens, which maximizes advertising revenue while exposing users to harmful feeds. Banning TikTok may be an extreme route, especially considering the free speech and small business opportunities the app creates, but changes must still be made.
The toxicity of social media lies not only in performing to match an influencer’s lifestyle but also in the dangers of indoctrination and political pipelines. When biased content isn’t moderated, it is effectively endorsed, and then consumed by uncritical users. Whether it concerns serious political turmoil across the globe or day-to-day difficulties, these sites spread a flood of information with little regard for users’ well-being.
It’s true that there are benefits to social media, like community and connection, which matter especially for marginalized users. However, the description of those positives took up only half a page in the Surgeon General’s five-page May 2023 advisory on social media’s impacts.
To all these concerns, however, the platforms offer little more than boilerplate. A Snapchat spokesperson said, “while we always have more work to do, we feel good about our role in helping friends feel connected…” while Meta offered, “we’ve spent a decade working on these issues and hiring people to keep young people safe online.”
It’s taken you 10 years to make someone as young as eight feel safe on your platform? That eight-year-old is now 18 and off to college, having most likely dealt with anxiety or low self-esteem on your platform for the majority of their life. Now what?
The bottom line is this: if the algorithms don’t change, and the capitalistic mindset of exploiting consumers doesn’t subside, users will suffer. Don’t blame the addict; blame the environment the addict was forced into, and those who created it.