Communications Litigation Today was a Warren News publication.
Lawsuits Bring 'Validation'

Advocates Clash on Need to Regulate Social Media Platforms' Youth Policies

Speakers from different advocacy groups clashed during an Information Technology and Innovation Foundation webinar Wednesday over what controls the courts and lawmakers should place on social media companies amid what the multitude of pending lawsuits claim is a youth mental health crisis fueled by internet addiction.

The webinar came a day after U.S. District Judge Yvonne Gonzalez Rogers of the Northern District of California in Oakland denied the social media defendants’ motions to dismiss the In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation multidistrict litigation. The webinar's participants referenced the MDL (docket 3047) and California v. Meta et al., with similar claims brought by 33 state attorneys general last month (see Ref: 2310240060), plus pending suits in state courts.

Carl Szabo, general counsel at NetChoice, representing social media platforms, and Ava Smithing, congressional advocate for Young People’s Alliance, had opposite takes on who should be accountable for the content young people see online and how to create safe spaces for them. Young People's Alliance advocates for algorithm-focused provisions in federal social media legislation. Both advocates drew from their personal experience to shape their arguments. Szabo, a parent of children in the affected age group, argued parents, “not Silicon Valley,” should have the role and responsibility to figure out what’s best for their kids as they “delve into the digital realm.”

Smithing, who said she had eating disorder issues as a teen that were exacerbated by social media algorithms, said the various social media lawsuits are giving many kids a “sense of validation.” “To see that there were other people outside of just the affected young people who wanted to stand up and understand the gravity of these situations, and do right on our behalf, is an excellent step in the right direction," she said. "These lawsuits can greatly strengthen the evidence that we use behind the push for legislation to make social media safer.”

Social media companies aren’t incentivized to keep kids safe, said Smithing, so she’s “excited” by the ability to use the social media lawsuits “as a tool to disrupt those questionable incentivization models.” Litigation might not be enough to change platforms’ behavior: “To accomplish what we need, we need federal legislation,” which can be influenced by facts presented in the lawsuits, she said.

Szabo objected to the use of the term “addiction” in the MDL, calling it a case of tort law, where someone feels harmed and wants “to be made whole.” Tort law requires proximate cause, a “but for" cause, he said. The term "addicted" is being misappropriated to mean “anything that makes me want to continue using something,” he said.

A problem with the social media lawsuits is that youths are generally “distressed,” Szabo said, attributing that to societal trauma stemming from the COVID-19 pandemic rather than to social media use. The social media MDL can’t stand on the legal merits, he said, “because you can’t show ‘but for social media, kids would be happy.’” Some social media lawsuits blame a decline in learning on social media, he said, calling that assertion “bull hockey.” On the contrary, he said, being “virtually schooled for three years has done monstrous amounts of damage to the educational development of my children."

Jess Miers, legal advocacy counsel for Chamber of Progress, which backs public policies for inclusion of marginalized groups through technology, said something that gets lost in conversations about how kids use social media is that “not every kid's situation at home is the same.” Marginalized youths rely on social media platforms for “a sense of community and well-being," she said. The internet gives them a way to get resources for topics they may not be able to explore at home, she said.

Asked how platforms can minimize the harm they do to young people, Smithing said she would like to see a way to “interfere with the way algorithms can target people with content and advertisements." The algorithms are what bring people “from nonharmful to harmful content,” she said. Social media algorithms don’t just take into consideration content that users engage with positively; they also register “how long you look at something, and if you look at something longer, it will show you something like that,” she said.

Human negativity bias shows that just because someone looks at something for a longer period of time “doesn’t mean you like it,” said Smithing. Algorithms today interpret what users look at for a longer period of time as something they’re interested in, and in her case, algorithms misinterpreted what she was interested in to be a Victoria’s Secret ad, she said. The ad “triggered” her and led her to look at it longer, causing the algorithm to think she might also be interested in “a daily diet of those same models, which are unrealistic,” she said.

Algorithms “understand our biases better than we do, and they move faster than we do,” Smithing said. “We can’t have this machine that’s moving this fast and unchecked deciding what content to deliver to our kids based on nonaffirmative actions that they do not even make themselves,” she said. The Young People’s Alliance wants to interfere with the “recommended content that manipulates user psychology,” she said.

Szabo said the various social media addiction lawsuits are an effort by states to control speech online and a way to make tech companies the scapegoat for the failure of government and law enforcement to protect children from child sex abuse. The lawsuits are an “excuse for trial attorneys to get a quick payday and government officials to point the finger to blame at somebody who’s not them for failure to do their jobs,” he said.

If that were the case, said Smithing, “you wouldn’t have people like me, and kids younger than me, coming forward and raising red flags about this issue.” It’s the “digital native generation that social media has harmed the most, who’ve watched so many of their friends starve and die and hurt, who miss their friends and miss social interaction.” That generation has become vocal about changes it wants to see, she said. “Random rewards, through the algorithms, keep individuals responding for longer,” and that was platforms’ goal, she said, referencing Facebook documents cited in lawsuits. “It’s not fair to exploit our psychology to keep us doing something we do not want to do,” Smithing said. A majority of young people say they wish they spent less time online, she said, but kids, facing social pressure, “don’t have the agency to leave Facebook.”

If there’s too much regulation of social media, and platforms are forced to make their algorithms conform to the particular requirements of, say, Texas, Florida, California and other states, “these services are going to just say, ‘This isn’t worth it; we’re not going to have people under 18 anymore on our services,'" Miers said. "They're going to kick you all out." Then young people "will lose the ability to communicate via social media and to share and advocate," she said. As for why so much legislative and legal activity to regulate social media is happening now, she said it's “because we’re heading into an election year.”