'Speculative Arguments'

Any 'Incidental Effect' of Social Media Law on Businesses' Speech Justified, Says Calif. AG

Nothing in California’s Age-Appropriate Design Code Act, AB-2273, restricts the content that businesses can provide to minors, and “any incidental effect the Act may have on businesses’ speech is justified by the State’s compelling interest in children’s welfare,” said California Attorney General Rob Bonta (D), in a Friday opposition (docket 5:22-cv-08861) to NetChoice’s April motion for preliminary injunction (see 2304070041) in U.S. District Court for Northern California in San Jose.

An injunction "would inflict irreparable harm upon California by preventing enforcement of a statute enacted by representatives of the people, and in the public interest to advance the safety and protection of minors," said Bonta in a Monday news release. The law is not preempted because it does not conflict with existing federal law, he said.

U.S. District Judge Beth Labson Freeman for Northern California signed an order this month (see 2304070041) continuing the initial case management conference in NetChoice’s lawsuit to Dec. 7 from April 13. NetChoice seeks a preliminary injunction to block the social media design law from taking effect in July 2024. The judge opted to continue the conference to Dec. 7 rather than doing so indefinitely but will grant “further joint requests for continuance” as necessary, said her order.

In the Friday opposition, Bonta said the act, which regulates businesses that trade in consumers’ personal information and offer products, services and features likely to be accessed by children, requires “certain actions that proactively protect that information and prohibits certain actions that involve the collection and use of that information.” The trade association’s members “do not have a First Amendment right to children’s personal information,” he said.

The act’s specific requirements and prohibitions, plus its procedural protections and scienter requirements, “ensure that businesses’ rights remain protected,” said the filing. Compliance with AB-2273 doesn’t trigger concerns under the “dormant Commerce Clause,” it said, and the act isn’t inconsistent with existing federal law.

Children are especially vulnerable to the risks of businesses’ data collection, said Bonta. Despite being aware that children use their services – and the existence of technology that can provide a safer experience for children – businesses “often fail to take steps to provide that safer experience,” he said. And children’s use of social media is on the rise, with 6-12-year-olds online an average of 5 hours, 33 minutes per day and 13-17-year-olds online 8 hours, 39 minutes per day, he said.

Despite the “ubiquitous presence” of the internet in children’s lives, specific legal protections for them are limited, Bonta said. The federal Children’s Online Privacy Protection Act (COPPA) requires that online businesses protect the personal information of children, but only where the platform is “directed towards” children under 13, he noted. Unless a website identifies itself as one that targets children, it only has to “prevent the disclosure of personal information from visitors who identify themselves as under age 13” without parental consent. Though businesses can’t share or sell personal data of users under 16, there’s no “comprehensive law” to protect against the collection and use of kids’ data, he said.

Citing an “agnosticism” businesses have about services used by adults and kids, Bonta said businesses are “disincentivized from identifying their services as ‘child-directed’” because that limits their ability to monetize their products by collecting and selling user data and showing targeted ads. “This leads not only to the over-collection of children’s data, but to using this data to lead children to inappropriate ads and other content.”

Bonta identified several prohibitions under AB-2273, including that regulated businesses can’t use a child’s personal information in a way that is “materially detrimental to the physical health, mental health, or well-being of the child.” They can’t profile a child except in cases where there are safeguards in place and there’s a “compelling reason profiling is in the best interest of children.” Nor can they share precise geolocation information.

Businesses can’t collect, share, sell or retain personal information not necessary to provide a service without a compelling reason that the practice is in the best interest of children, and they can’t use “dark patterns”: interfaces designed with the effect of “subverting or impairing user autonomy, decision-making, or choice” that lead kids to share information beyond what’s “reasonably expected to provide the service.”

NetChoice has failed to show why conduct required or prohibited by AB-2273 can’t constitutionally be subject to some regulation, Bonta said. NetChoice “relies on speculative arguments about children’s inability to access content,” he said, but nothing in the act bars children from seeking, or businesses from providing access to, any content.

A plaintiff must establish that it will likely suffer irreparable harm if its requested injunction is not granted, Bonta said, but NetChoice “has not been deprived of any constitutional right.” Because AB-2273 doesn’t take effect until July 1, 2024, any allegation of “imminent enforcement fails,” he said. NetChoice’s “vague and speculative allegations of financial injury are insufficient to support a finding of irreparable harm,” he said. That’s especially so because many of NetChoice’s members must already comply with the UK Children’s Code, a “substantially similar” law, he said.