
Toxic Algorithms: Social Media Platforms Face Legal and Regulatory Challenges Over Addictive Algorithms and Youth Mental Health


Recent legal and regulatory actions against major social media companies have brought the issue of algorithmic addiction and its impact on youth mental health to the forefront of public discourse. These lawsuits signal a significant shift in how regulators and lawmakers are approaching the responsibilities of tech giants, particularly in relation to their youngest and most vulnerable users.

Strict Measures: Australia Bans Children Under 16 from Social Media

In the strictest regulatory move to date, Australia’s parliament has passed an amendment to its Online Safety Act that bars children under the age of 16 from social media[1]. The law, which takes effect in late 2025, is a landmark example of command-and-control regulation of social media use.

TikTok and Meta Under Legal Scrutiny

In 2024, TikTok faced a barrage of lawsuits from multiple U.S. states, alleging that the platform’s design is inherently addictive and harmful to children and teenagers[2]. This legal action follows similar suits filed against Meta in October 2023, which accused the company of knowingly exploiting young users on Instagram and Facebook[3].

The core argument in these lawsuits is that these platforms have deliberately engineered features to encourage excessive and compulsive usage among young users, prioritizing engagement and profits over user well-being. Internal documents cited in the lawsuits suggest that the companies were aware of the potential harm to adolescent mental health but failed to implement adequate safeguards.

A bipartisan coalition of 14 state attorneys general claims that TikTok employs addictive design elements to ensnare young users and has deliberately misrepresented the risks associated with extended use[4]. New York Attorney General Letitia James stated that numerous young people across the country have been injured or killed while taking part in TikTok “challenges,” and that many more have experienced heightened sadness, anxiety, and depression because of the app’s addictive design. The lawsuit identifies specific features, including:

  • Notifications that disrupt sleep patterns.
  • Autoplay that serves an endless stream of videos.
  • Disappearing videos that compel users to check the app frequently.
  • Beauty filters that let users alter their appearance.

Although TikTok has introduced tools designed to help users manage their screen time and curate their content, the lawsuit claims that the effectiveness of these features has been misrepresented.

Debating Causality: Social Media and Youth Mental Health

While the scientific literature continues to debate the exact nature of the relationship between social media use and adolescent mental health, these lawsuits indicate that there is sufficient evidence to warrant legal action. The legal challenges against TikTok and Meta suggest that the plaintiffs believe they can establish a causal link between the platforms’ design and negative mental health outcomes among young users.

Recent studies have contributed to this debate. A 2024 study published in the Journal of Psychiatric Research found that adolescents who used social media frequently were at higher risk of mental health challenges such as depression and anxiety[5]. The study identified two groups of adolescents: one with better mental health and minimal indicators of anxiety and depression, and another with poorer mental health and more pronounced levels of these indicators.

Social psychologist Jonathan Haidt has been vocal about the broader impact of social media on teenage mental health, particularly for girls. Haidt argues that the rise of social media correlates with increased rates of anxiety, depression, and self-harm among teenagers, especially since 2012, when smartphones became ubiquitous[4].

Regulatory Approaches: EU vs. US

The European Union’s Digital Services Act (DSA)[6], which became fully applicable in February 2024, takes a different approach to these concerns than the United States. The DSA requires very large online platforms (VLOPs) to conduct risk assessments and implement mitigation measures for systemic risks, including potential negative effects on mental health. Key provisions of the DSA include:

  • Mandatory risk assessments for VLOPs to identify and mitigate systemic risks, including those related to mental health.
  • Increased transparency requirements regarding algorithmic recommendation systems.
  • Enhanced user controls, including options for non-personalized content feeds (sketched in code after this list).
  • Restrictions on targeted advertising to minors.
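
One of these provisions, the user-facing option of a non-personalized feed, maps directly onto application code. The sketch below is hypothetical: the DSA mandates the choice, not any particular implementation, and every name in it is invented for illustration.

    # Hypothetical sketch of a DSA-style user control: when a user opts out
    # of profiling, the feed falls back to a non-personalized ordering.
    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: int
        published_at: int            # Unix timestamp
        predicted_engagement: float  # output of a profiling model

    def build_feed(posts: list[Post], personalization_opt_in: bool) -> list[Post]:
        if personalization_opt_in:
            # Personalized: rank by the engagement model's predictions.
            return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
        # Non-personalized: reverse-chronological, no profiling signal used.
        return sorted(posts, key=lambda p: p.published_at, reverse=True)

    posts = [Post(1, 1700000000, 0.9), Post(2, 1700005000, 0.2)]
    print([p.post_id for p in build_feed(posts, personalization_opt_in=False)])  # [2, 1]

The design choice worth noting is that opting out removes the profiling signal from ranking entirely, rather than merely down-weighting it.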

The EU approach relies on mandated risk mitigation and increased transparency, emphasizing prevention. This stands in contrast to the more adversarial stance taken in the United States, where lawsuits are the primary tool for holding social media companies accountable. The US approach mirrors the landmark litigation against tobacco companies, which similarly alleged that the industry knowingly promoted an addictive and harmful product; the comparison underscores the seriousness with which these claims are being pursued and the potential for significant legal and regulatory consequences.

This difference highlights the ongoing global debate about how best to regulate social media platforms and protect vulnerable users. In the U.S., the lawsuits seek to bar TikTok from certain practices and to impose financial penalties. Similar suits have been filed against Facebook and Instagram over their influence on the mental health of young people, and states such as Texas and Utah have previously pursued comparable actions against TikTok with an emphasis on child safety.

The Role of Platform Design and Algorithms

At the heart of the issue lies the design of social media platforms and their underlying algorithms. TikTok’s success, for instance, is largely attributed to its powerful recommendation system, which analyzes user behavior to infer interests and decide which videos to display next, creating a highly personalized and engaging experience. The same technology, however, can lead users down harmful rabbit holes of content. As Johnny Ryan, Senior Fellow at the Irish Council for Civil Liberties (ICCL), explains, “To keep the young person on the screen, the secret is to give them more and more extreme versions of what they’ve shown some interest in”[7].

The lawsuits against Meta likewise allege that the algorithms, design, and features of Facebook and Instagram are to blame for the rise in mental health disorders among teens and adolescents. Meta CEO and founder Mark Zuckerberg has been accused of repeatedly misleading the public about the dangers of the platforms and of the algorithms used to personalize content and drive engagement. Leaked documents and whistleblower testimony indicate that Facebook and Instagram knew they were causing mental health harms among teens, particularly young girls, and opted not to address the issue.
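
The dynamic Ryan describes can be illustrated with a deliberately simplified toy model. Everything below is hypothetical, not a reconstruction of any platform’s actual system: under the stated assumption that slightly more extreme content holds attention a little longer, a greedy ranker that maximizes predicted watch time serves a steadily escalating feed.

    # Toy model of an engagement-maximizing recommender (illustrative only;
    # all numbers and the engagement model itself are invented).

    # Each candidate video has an "intensity" score in [0, 1], where higher
    # values stand in for more extreme variants of the same topic.
    CATALOG = [i / 100 for i in range(101)]

    def predicted_watch_time(user_interest: float, intensity: float) -> float:
        """Assumed model: users watch longest when a video is slightly
        MORE intense than what they last engaged with."""
        return max(0.0, 1.0 - abs(intensity - (user_interest + 0.05)) * 4)

    def recommend(user_interest: float) -> float:
        """Greedy ranking: pick the candidate with the highest predicted
        watch time. Nothing about well-being enters the objective."""
        return max(CATALOG, key=lambda v: predicted_watch_time(user_interest, v))

    interest = 0.10  # the user starts with a mild interest in a topic
    for step in range(10):
        video = recommend(interest)
        # Watching shifts the user's baseline toward what was served.
        interest = 0.7 * interest + 0.3 * video
        print(f"step {step}: served intensity {video:.2f}, interest now {interest:.2f}")

Each iteration nudges the simulated user’s baseline upward, so the optimum the ranker chases keeps moving toward the extreme end of the catalog; this is the “rabbit hole” pattern the lawsuits describe.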

The Challenge of Content Moderation

Social media platforms face significant challenges in moderating content effectively. While automated systems can filter out certain keywords or phrases, users often find ways to circumvent these restrictions: an RTÉ investigation found that many TikTok users employed “algospeak,” or coded language, to avoid automatic content removal[7]. This cat-and-mouse game between users and moderation systems highlights the need for more sophisticated approaches to protecting young users. It also raises questions about platforms’ responsibility to proactively identify and remove potentially harmful content, especially when it is being served to underage users.
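
A deliberately naive sketch shows why this game favors the evaders; the banned terms and substitutions below are invented examples.

    # Minimal illustration of why exact keyword matching is easy to evade.
    BANNED = {"self-harm", "suicide"}

    def naive_filter(text: str) -> bool:
        """Flag a post for removal if it contains a banned term verbatim."""
        lowered = text.lower()
        return any(term in lowered for term in BANNED)

    print(naive_filter("a post about self-harm"))           # True: caught
    print(naive_filter("let's talk about s3lf h4rm"))       # False: leetspeak evades
    print(naive_filter("thoughts on sui cide prevention"))  # False: spacing evades

Hardening the matcher against one trick (normalizing digits, collapsing spaces) simply pushes users to the next variant, which is why critics argue moderation cannot rest on keyword lists alone.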

The Push for Regulation and Platform Accountability

As concerns about the impact of social media on teen mental health continue to grow, there are increasing calls for stronger regulation and platform accountability. In the European Union, the Digital Services Act (DSA)[6] aims to address some of these issues by placing greater responsibility on large social media companies to protect their users, especially minors. In Ireland, where TikTok has its EU headquarters, the online media regulator Coimisiún na Meán has finalized the Online Safety Code[8], which will apply to video-sharing platforms. The code prohibits content that promotes eating disorders, self-harm, or suicide, among other harmful and illegal content.

However, some experts argue that more immediate action is needed. Johnny Ryan of the ICCL believes that regulators already have the power to compel platforms to turn off algorithms that are harming children: “We are past the point where we’re asking these tech companies to fix the problems that they have created for our children”[7].

Conclusion

As society grapples with the complex relationship between technology and mental health, these developments highlight the need for an approach that involves legal action, regulatory frameworks, and industry self-regulation. The comparison between the EU’s preventive approach and the U.S.’s more confrontational legal strategy offers valuable insights into different models of tech regulation. Ultimately, the goal is to create a digital environment that harnesses the benefits of social media while minimizing its potential harms, especially for vulnerable young users. This will require ongoing collaboration between tech companies, policymakers, mental health professionals, and users themselves to develop solutions that protect well-being without stifling innovation.


[1] https://www.aph.gov.au/Parliamentary_Business/Bills_Legislation/Bills_Search_Results/Result?bId=r7284

[2] https://www.bbc.com/news/articles/c20m4k56relo

[3] https://www.robertkinglawfirm.com/personal-injury/social-media-addiction-lawsuit/facebook-mental-health-lawsuit/

[4] https://ag.ny.gov/press-release/2024/attorney-general-james-sues-tiktok-harming-childrens-mental-health

[5] https://www.sciencedirect.com/science/article/abs/pii/S0022395624003704?via%3Dihub

[6] https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065

[7] https://www.rte.ie/news/primetime/2024/0416/1443731-13-on-tiktok-self-harm-and-suicide-content-shown-shocks-experts/

[8] https://www.cnam.ie/wp-content/uploads/2024/10/Coimisiun-na-Mean_Online-Safety-Code.p
