
US States Sue TikTok Over Addictive Design That Harms Children

More than a dozen states file lawsuits against TikTok, claiming its addictive design negatively impacts children's mental health.

TikTok (Photo: AP)

On October 8, 2024, more than a dozen states and the District of Columbia filed lawsuits against TikTok, arguing that the app is deliberately designed to be addictive, especially for children, and that it harms their mental health as a result. This legal action goes beyond individual complaints; it reflects a rising concern about the influence of social media on youth.

The Algorithms at Play

These lawsuits focus on TikTok's recommendation algorithm, which selects the content that appears in each user's "For You" feed. The algorithm is designed to keep users engaged for as long as possible, often at the expense of their well-being. The lawsuits cite design features that the states say addict children to the platform, such as the ability to scroll endlessly through content, push notifications that come with built-in "buzzes," and face filters that create unrealistic appearances for users.

California Attorney General Rob Bonta emphasized the profit-driven motives behind this design: "They've chosen profit over the health and safety, well-being, and future of our children. And that is not something we can accept. So we've sued." This statement reflects the wider concern that social media companies prioritize ad revenue over user safety, particularly when it comes to vulnerable populations like children.

The Growing Concern

The lawsuits against TikTok are part of a broader push against social media companies, echoing earlier legal campaigns against the tobacco and pharmaceutical industries. Lawmakers are increasingly treating the mental health crisis affecting young people as a collective problem and are coming together to tackle it. Just last year, states sued Meta Platforms, the company behind Instagram, alleging that it built features designed to keep kids hooked and that those features contribute to mental health issues.

TikTok makes its money by keeping users engaged, which drives its advertising revenue. District of Columbia Attorney General Brian Schwalb pointed out that the same strategies that boost profits can also hurt mental health. This raises an important question: how can we hold these companies accountable for the harm their platforms cause to young users?

A Question of Regulation

The need for social media regulation is becoming more urgent, especially with TikTok facing serious legal challenges. A new federal law could lead to TikTok being banned in the U.S. by mid-January if its parent company, ByteDance, doesn’t sell it. This adds more complications to the ongoing lawsuits, which aim to address TikTok’s addictive features and protect users' rights and safety.

The filings from the District of Columbia go so far as to call TikTok's algorithm "dopamine-inducing," saying it was designed to keep young users glued to their screens. This can lead to serious issues like anxiety, depression, and body image problems. It raises an obvious question: shouldn't social media companies be required to implement stronger safety measures to protect young users from this harm?

The Role of Parental Control

TikTok claims to restrict access for users under 13 and limits some content for those under 18. In practice, however, children can easily circumvent these restrictions. This loophole puts them at risk of seeing harmful content, highlighting the need for better age verification and parental controls. Parents should have tools to effectively monitor and manage their children's social media use.

Additionally, TikTok allows users to buy TikTok Coins, a form of virtual currency, which creates further concerns. The app functions like an "unlicensed virtual economy," letting users send gifts to streamers during live broadcasts. TikTok profits from these transactions without being registered as a money transmitter with the U.S. Treasury Department. This raises serious legal and ethical questions about TikTok's duty to protect its young users.

The Vulnerability of Young Users

TikTok's live-streaming feature has raised concerns about the potential exploitation of minors. Critics say this function allows the app to operate like a “virtual strip club,” exposing young users to sexually explicit content without proper age restrictions. This troubling situation forces us to rethink how social media platforms operate and whether they can truly regulate themselves.

The impact of these lawsuits goes beyond TikTok alone, as they point to a larger issue regarding social media's influence on young minds. According to research from the Pew Research Center, nearly all teens aged 13 to 17 in the U.S. use social media, with about a third admitting they use it "almost constantly." Additionally, a CDC survey found that high school students who use social media frequently are more likely to feel persistent sadness or hopelessness. These findings underscore the urgent need to hold social media companies accountable and for parents and educators to be more aware of these risks.

The Need for Collective Action

A coalition of 14 state attorneys general is working to address TikTok's practices by seeking to impose financial penalties for alleged illegal activities and to secure damages for users harmed by the platform. This collective effort shows a strong commitment to user safety and aligns with the broader concern for mental health and well-being.

Additionally, there is a growing movement against social media companies, with lawmakers and advocacy groups pushing for rules to protect young users. As these legal cases move forward, they could set important precedents for how social media platforms are regulated and held accountable. The aim should be to strike a balance between encouraging innovation and ensuring user safety.

The lawsuits against TikTok signify a critical moment in how we view and regulate social media's influence on youth. These legal actions highlight a growing recognition of the potential dangers of these platforms, especially for young and vulnerable users. The outcomes of these legal challenges could reshape the landscape of social media regulation, pushing companies to adopt more responsible practices and better protect young users.

While the future of TikTok and similar platforms is uncertain, one thing is clear: the stakes are high. It is essential for social media companies, lawmakers, and parents to work together to ensure that technology empowers rather than harms. Through collective action and greater awareness, we can create a safer digital environment for future generations.