'Addictive Qualities'

Parents 'Powerless' to Protect Minors From Meta, Snapchat Harm: Complaint

Instagram and Snapchat are responsible for causing and contributing to “the burgeoning mental health crisis” among U.S. children and teenagers, alleged a product liability complaint (docket 4:23-cv-00646) filed Tuesday against Meta, Snap and additional parties in U.S. District Court for Northern California in Oakland.

Injuries suffered by the plaintiffs -- A.C. and his son, “John Doe,” who was 14 at the time of the harm and 15 at the time the complaint was filed -- were caused by defendants’ “defective and unreasonably dangerous” Instagram and Snapchat products, including features unnecessary to the products’ functionality and “known by them to be harmful to a significant number of their minor users,” said the complaint.

Meta and Snap designed their products in ways that help minors evade parental authority and control, including features that “exploit minor users’ physiological and psychosocial vulnerabilities” and enable minors to connect with “predatory adults,” resulting in “exploitation and abuse,” alleged the complaint.

The social media companies don’t warn minors or parents about harmful product features and don’t provide reasonable, accessible or staffed reporting mechanisms for parents to protect their children from harm on their sites, said the complaint. Reporting systems are “operated in such a defective manner that even parents who work in the technology industry are powerless to protect their children” from harm, and Meta doesn’t respond to reports in a manner designed to protect minors, it said.

Direct messaging and mass-messaging capabilities give anonymous adults, bullies and other strangers “unrestricted and unsupervised access” to minors, who “lack the cognitive ability and life experience to identify online grooming behavior by prurient adults and the psychosocial maturity to decline invitations to exchange salacious material,” it said. It’s technologically feasible to design social media sites that reduce the magnitude of harm to minors with a “negligible increase in production cost,” the complaint said. Though programming changes could help protect the youngest users, defendants choose not to make such changes “as a matter of [their] cost-benefit analysis.”

Defendants fail to provide adequate warnings to minors and their parents about the danger of “mental, physical, and emotional harms” arising from social media use, said the complaint. The “addictive qualities” of recommendation technologies and account settings are known to the companies but “unknown” to minors and their parents, it said.

Plaintiffs claim intentional or negligent infliction of “emotional distress” against Meta, which they say distributes a product that’s harmful to a significant number of minor users “and then deprives parents of any reasonable or effective means to report and put a stop to such harms.” Meta and Snap designed their social media products to be “addictive” to minors and “failed to include safeguards to account for and ameliorate the psychosocial immaturity of their minor users,” the complaint said.

Snap’s “self-destructing content” feature “promises appeal to minor users” and encourages them to exchange “harmful, illegal, and sexually explicit images with adults,” alleged the complaint. Snapchat gives predators a “safe and efficient vehicle to recruit victims” and is a “go-to application for sexual predators,” it said. The disappearing design is particularly harmful to teens who use the platform to send photos but learn only “after the fact that recipients have means to save photos -- and are often bullied, exploited and/or sexually abused as a direct result,” alleged the complaint. Meta implemented a similar feature on Instagram in 2021, the complaint said, though it’s not the default setting.

Plaintiffs seek to hold Meta and Snap liable for their own speech “and their own silence in failing to warn of foreseeable dangers arising from anticipated use of their products.” Plaintiffs’ suggestions for a “reasonably safe” social media product include prioritizing the content pushed to minors, requiring identification and parental consent to use the platforms, setting all minor accounts to private, limiting the types of data collected from minors, limiting direct message access, and restricting access to “addictive product features.”

In addition to product liability, plaintiffs claim negligence, violation of California’s Unfair Competition Law, unjust enrichment, invasion of privacy and infliction of emotional distress. Both plaintiffs seek monetary damages for ongoing physical and mental pain, loss of future income and earning capacity, and past and future medical expenses, plus punitive damages, legal costs, attorneys’ fees and injunctive relief.