"The only certainty of this litigation is it's going to be a big fight, and a long fight," said Matthew Bergman, founding attorney of the Social Media Victims Law Center.
Social media platforms such as Facebook, Instagram and TikTok are facing an onslaught of lawsuits alleging they cause mental health problems in teens, from eating disorders to suicide attempts.
Law.com Radar, which recently rolled out new analytic and trend-detection systems, surfaced eight lawsuits of this type in the first week of August. It found more than 30 cases since the start of June.
Some law firms are aggressively seeking new cases. Orlando, Florida-based Morgan & Morgan, for instance, is running ads on Google headlined, “Why Social Media Can Be Dangerous for Your Child.”
“Parents are saying enough is enough,” said Matthew Bergman, founding attorney of the Social Media Victims Law Center. “Frances Haugen’s revelations combined with the surgeon general’s report on teen mental health has made clear that social media products are dangerous and are harming our kids.”
Bergman is among the attorneys bringing several of the lawsuits, including one targeting ByteDance, creator of the popular social app TikTok. That suit alleges ByteDance intentionally built in addictive qualities with teens in mind.
The potential mental health risks of social media splashed into the headlines last year after Haugen, a former data engineer for Facebook parent Meta, leaked internal company documents showing the company’s own research found its Instagram service harmed the mental health of some young people.
U.S. Surgeon General Dr. Vivek Murthy highlighted similar concerns in a report last year and in public comments. In an interview with NPR in December, he said, “Right now, we’re conducting this national experiment on our kids with social media. And it’s worrisome to me as a parent.”
Social media companies have not yet responded to the latest batch of suits, and they did not respond to requests for comment. However, in response to criticism of its Instagram service, Meta has said, “We continue to build new features to help people who might be dealing with negative social comparisons or body image issues.”
The companies also believe they have a legal shield of sorts via Section 230 of the Communications Decency Act, which protects internet companies from being liable for content created by their users.
Typical of the mental health lawsuits is one filed in January by Tammy Rodriguez in the Northern District of California against Meta and Snap Inc., parent of Snapchat. It says her daughter, 11-year-old Selena Rodriguez, became addicted to social media, particularly Instagram, and eventually took her own life.
The suit alleges the companies designed the platforms to “promote problematic and excessive use that they know is indicative of addictive and self-destructive use.”
Meta responded to the suit by stating that it simply provides services and tools for users to publish their own content, and that those services and tools are not themselves unlawful.
“Selena’s death is a tragedy, and Meta deeply sympathizes with Plaintiff and her family,” Meta wrote in its June motion to dismiss the suit. “But the First Amendment prohibits forcing a communications platform to adopt or enforce particular content policies or practices.”
Jim Wagstaffe, who is among the attorneys bringing a separate lawsuit against Instagram, said pushback is growing in Congress against the protections that Section 230 affords social media companies.
Just as the First Amendment doesn’t shield newspapers from libel claims, some lawmakers are coming around to the idea that Section 230 doesn’t protect social media companies from product liability, he said.
But David Anderson, a retired law professor at the University of Texas School of Law, told the Texas Tribune last year that outright repealing Section 230 would spell the end of social media sites altogether.
“In my opinion, Sec. 230 definitely needs to be limited, but how to do that without killing a lot of useful communication is very difficult,” he told the publication in an email. “It would require some extensive hearings, careful drafting, and contentious trade-offs.”
The Information Technology and Innovation Foundation argues that any solution requiring social media platforms to review content is unworkable because of the massive volume of content on the sites.
Also complicating the issue is that Republicans and Democrats want to replace Section 230 with diametrically opposed frameworks, Mark Lemley, a Stanford Law School professor, told The Texas Tribune.
“Democrats want more content moderation targeting hate speech and misinformation. Republicans want to apply the First Amendment to social media sites even if they are private actors,” he said.
Meanwhile, lawsuits continue to pile up. Eventual rulings in the cases will shed light on how much of a shield Section 230 will prove to be for social media companies.
“The only certainty of this litigation is it’s going to be a big fight, and a long fight,” Bergman said. “To paraphrase Thomas Hobbes, it’s going to be nasty, brutish—and long.”