Attorneys general for Illinois, 12 other states, and the District of Columbia each filed a lawsuit against social media platform TikTok on Oct. 8. They allege the app is purposefully designed to addict users, imperiling the mental health of teenagers nationwide. Specifically, the suits claim the app’s endless scrolling and push notifications lead consumers to use it compulsively.
In an Oct. 8 press release, Illinois Attorney General Kwame Raoul asserted that TikTok has concealed from the public the risks of harm involved in using the app. The release further stated that TikTok seeks to keep people on the app as long as possible in order to expose them to advertisements, a strategy the attorneys general say is particularly exploitative of children.
“Social media is harmful to kids in the same way any addictive substance is,” Marketing teacher Kara Mielke said. “It has that same dopamine outburst when you get a ‘like’ on your social media, similar to the long term effects that we’ve seen from the tobacco companies, vape companies, and things like that.”
Such heavy exposure to the app’s contents, Raoul says, has inhibited Illinois children’s sleep and learning while also causing body dysmorphia, depression, anxiety, and thoughts of self-harm. According to the press release, the lawsuit seeks financial penalties against TikTok as well as an order forcing it to end the practices Raoul considers harmful.
Other suits claim that TikTok has enabled the sexual exploitation of minors, since users who livestream on the platform can receive an app-specific currency from viewers, which they can then convert into real money. (NPR reports that TikTok previously discovered that one million minors had received compensation for illicit livestreams in a single month.) The District of Columbia’s attorney general asserted in a press release that TikTok pockets half of all money sent to a streamer, and that the app serves as a way to transmit money without approval from the required government agencies.
Though TikTok claims it takes steps to keep underage users safe, some find that the app’s restrictions are flimsy at best.
“It’s so easy to lie about your age,” Rhiann Michael Calimutan, senior and AP Government & Politics student, said. Calimutan had previously authored a fake bill to make both TikTok and X (formerly Twitter) illegal as an activity for his class (though he notes this was meant as a joke). “You could just switch up the age, or you could put your actual birth date, but you could just switch up the year.”
According to Raoul, TikTok is lying about the extent to which it safeguards the young people who use it. This is partially borne out by portions of internal documents obtained by Kentucky Public Radio, in which, for example, the company notes the percentages of various categories of inappropriate content that had not been removed, including 100 percent of content in the “Fetishizing Minors” category. (NPR quotes a TikTok spokesperson who claims that the material Kentucky Public Radio and NPR obtained is not genuinely representative of the company’s practices.)
This is not the first time TikTok has found itself in legal trouble. A number of other states have sued the company in the past on similar charges, as has the Justice Department, which accused the company of collecting information from underage users without their parents’ consent. Looming over all of this is TikTok’s ongoing battle against a recent law that would ban the app in the United States unless its parent company, ByteDance, sells it to a new owner outside of China.
TikTok, however, is not the first social media site to face legal backlash over child safety. In 2023, social media company Meta, owner of Instagram and Facebook, was hit with lawsuits of its own from dozens of states’ attorneys general and D.C. The cases against Meta likewise focused on compulsive use and the possibility of the sexual solicitation of minors. After those cases were filed, United States Surgeon General Dr. Vivek H. Murthy called for warning labels on social media sites to discourage addictive use.
“I definitely foresee limitations on social media, not only from a marketing perspective and from companies having to [have a] surgeon general warning right on the ad to their customers. But I also just foresee an age restriction going into social media in general at some point in time, and restricting it from users of younger generations,” Mielke said.
Social media has become ingrained in modern society, with 239 million people using it in the United States. The broader effects of the suits against TikTok remain to be seen, and a possible ban on the app could send ripples through the social media-using world. Whatever the outcome, it will likely change the way social media companies see their sites, and perhaps the way users see those same sites.