From Bed Rotting to Doomscrolling: Can Addictive Social Media Be Stopped?

Despite growing sentiment that social media use is detrimental to mental health, there have been few attempts to reform the systems that largely define life in the 21st century.

Rory Powell, Impact Fund Grants Intern

As a college student, it is not uncommon to hear my peers passively comment on their social media use. Descending into Berkeley’s Main Stacks to study, I’ve overheard students complain, “I was going to get started on my outline, but then I rotted on TikTok for 4 hours,” or “I was doomscrolling and didn’t even realize it got so dark outside.” Many of my friends periodically delete Instagram or TikTok from their phones to physically prevent themselves from procrastinating. I also regularly hear peers who seek out alone time on social media say, “I'm so exhausted, I just need time to bed rot.” The addictive quality of these platforms seems to leave many people in my generation trapped: they want to keep “scrolling” online even though they inevitably regret it later.

While these comments are expressed jokingly, the language used to describe online habits is anything but lighthearted. For example, the use of “rot” in expressions such as “rotting online” and “brain rot” reflects an intuitive sense that social media use is unhealthy. It is no coincidence, then, that recent studies reveal that extended social media use is correlated with long-term negative effects on mental health.

Medical Evidence Ties Social Media Use to Negative Mental Health Outcomes  

A stylized image of a hand holding a phone, which is projecting a large magnet that pulls in heart and eye icons.

The psychological effects of social media use have become an increasingly popular area of study amongst researchers across the world. Studies have found widespread correlations between time spent on social media and increases in anxiety, depression, and negative self-image.  

A Warning From the Surgeon General 

In response to these studies, former U.S. Surgeon General Dr. Vivek Murthy published a formal advisory in 2023 regarding social media’s impact on mental health. Murthy advocated for social media advisory labels that would warn users about the link between social media use and negative mental health outcomes. Murthy argued that these advisories would have a similar effect to warning labels placed on tobacco products, spreading awareness and changing social media habits.  

No Justice  

Despite a widespread sense that social media is detrimental to mental health and a growing body of scientific evidence to back these claims, little has changed. In fact, the only changes made by these platforms seem to intentionally make the online experience more addictive. Social media companies show little interest in addressing these problems through self-regulation, and external legislative solutions, such as social media warning labels, have not been enacted into law.

“This raises two key questions: Where can people find justice for personal harms caused by social media use? How can social media companies be held accountable for designing products that maximize profit at the expense of human health?”

Legal Challenges 

One potential avenue for pursuing justice is to challenge social media companies in the courts via consumer protection or personal injury laws. But bringing these kinds of lawsuits against social media companies is not easy.  

Current Statutes Are Outdated

Laws such as the Communications Decency Act were created during the internet’s infancy and focused on promoting the growth of online infrastructure rather than protecting users. Section 230 of the Act offers protections for social media companies, treating them as separate from publishers and content creators and therefore not liable for content posted on their platforms. But this law, enacted in 1996, frames online platforms as simple virtual message boards, and does not consider all the ways that platforms influence how content is consumed by users.

Recently, however, through cases such as Anderson v. TikTok, Inc., the courts have begun to reevaluate the protections provided under Section 230. In Anderson, the U.S. Court of Appeals for the Third Circuit ruled that TikTok could be held liable for the death of 10-year-old Nylah Anderson, who died while attempting the viral “Blackout Challenge.” While Anderson reflects an attempted shift towards placing guardrails on social platforms, the ruling serves, at best, as a warning. The adoption of formal regulations compelling social media platforms to stop dangerous content from reaching vulnerable audiences requires legislative reform—a step elected officials have yet to take.

Mental Health Harms Are Hard To Prove

The young victim in Anderson suffered a physical and particularly extreme harm tied to specific content on TikTok’s platform. For people who develop mental health issues gradually over an extended period of platform use, proving a tie to social media can be far more challenging.

Consider a scenario in which a car with defective brakes causes the driver to crash and break a leg. Before the crash, the driver’s leg is intact; after the crash, it is broken, and an x-ray offers clear proof. In contrast, harms from social media do not usually stem from a single interaction with a platform, but rather accumulate gradually over time. Furthermore, diagnosing a mental health disorder requires assessing an individual’s mental state, a complex analysis with multiple contributing factors. Unlike the car crash scenario, then, mental health harms from social media rarely show such a clean cause-and-effect relationship, and proof can be harder to come by.

This ambiguity allows social media companies to raise doubts about their direct involvement in harms experienced by their users. As part of their defense, social media companies might shift blame from themselves by prying into other aspects of the plaintiffs’ lives or implicating another platform that the individual interacted with.  

Despite these significant challenges, public interest attorneys have recently developed creative legal strategies to help individuals and families seeking justice. One of these attorneys is a member of the Impact Fund family. 

A young person lying dejectedly in a stylized field of social media hearts.

Can litigation be used as a tool to combat social media’s effects on mental health?

Social Media Addiction Lawsuits 

Impact Fund Board and Grant Advisory Committee member Andre Mura, a partner at the Oakland-based firm Gibbs Mura, is currently litigating a mass tort action seeking to hold social media companies responsible for designing addictive apps and for failing to warn children and their families of those risks. This collection of lawsuits in state and federal court challenges Meta, TikTok, YouTube, and Snap and includes thousands of families and numerous school districts across the country.  

The various plaintiffs argue that developers utilize specific design elements in their products to addict users to the platform. This legal approach focuses on the companies’ choices in designing their platforms. 

Is Meta the New Marlboro?

The legal arguments for social media addiction share much in common with those from lawsuits filed against tobacco companies a few decades ago. Throughout the twentieth century, lawsuits against tobacco companies failed to secure damages for tobacco-related illnesses despite a body of medical evidence linking smoking to negative health effects. For a long time, the industry dodged liability claims by casting doubt on or downplaying medical evidence tying long-term tobacco use to cancer and chronic obstructive pulmonary disease (COPD). In the 1990s, however, the industry’s seemingly unbeatable defenses crumbled when documents were discovered detailing the steps that tobacco companies took to enhance the addictiveness of their products. Tobacco companies were ultimately forced to pay billions of dollars in damages for deceiving consumers in the pursuit of company profit.

Mura’s team argues that social media companies are exhibiting the same type of deceptive behaviors today. Platform characteristics such as infinite scrolling and excessive notifications are intentionally designed to maximize user engagement and company profits. The more time users spend on platforms, the more advertising revenue the companies earn.  

Today the families represented in Mura’s mass tort action argue that the social media companies concealed internal research from the public and failed to warn users about risks associated with platform use. The plaintiffs accuse these companies of negligence, arguing that they violated a duty to keep parents informed about the potential health risks associated with children’s use of their platforms.  

Moving Forward: Potential for Systemic Impact? 

For cases similar to Mura’s, a successful outcome would provide direct monetary relief for families and children affected by social media addictions. More broadly, a successful ruling would establish a precedent of company liability for harmful product design. Whether this kind of warning from the courts will motivate social media platforms to change company behavior is greatly dependent on the size of the damage award in relation to the companies’ profits.  

A stylized image of a disoriented young person sitting in front of a tablet with a hypnotizing swirl for a screen.

Top social media companies collectively generate a shocking $11 billion in yearly ad revenue just from American users under the age of 18. If the payout from a mass tort action is small relative to the revenue the companies receive, it may not affect their bottom line any more than other expenses involved in running a business, thus limiting its impact. Mura and others agree that government regulation or legislative action is necessary to change the social media industry. To date, the public has been surprisingly complacent in allowing these companies to continuously dodge accountability. Society has largely accepted the company narrative that the development of mental health disorders is a problem with the user, not the technology. But if the problem is with the user, then why are social media addiction and the resulting mental health impacts becoming more and more common?

Good mental health and social media use are not inherently incompatible. Yet the social media industry has shown deep resistance to change, a resistance that is fundamentally antithetical to the tech industry values of “innovation” and “iterative design.” Given the ample creative and technical talent at their disposal, it is well within the capacity of social media companies to develop solutions that put users’ health first. Their lack of empathy for their users is particularly egregious in light of the growing body of scientific evidence concluding that excessive time spent on these platforms harms users. Litigation is a necessary first step to identifying the current dangers. But it is also time for the public to demand a better alternative: one in which users can take advantage of social media platforms’ many benefits without jeopardizing their mental health.

Production Credits

Writer: Rory Powell

Editors: Amy Daniewicz, Teddy Basham-Witherington, Lindsay Nako

Web Producer: John Henry Frankel

Web Editors: Rory Powell, Amy Daniewicz, Teddy Basham-Witherington

