AI Bill Also Passes

Calif. Panel Advances Bill Targeting Online Child Exploitation

The California Assembly Privacy Committee advanced legislation Tuesday that would hold online platforms liable for knowingly, recklessly or negligently helping facilitate child sex trafficking.

AB-1394 would require platforms to provide a tool for users to report child sexual abuse material (CSAM). Platforms would have 30 days to verify the material is CSAM and block it from reappearing, according to bill language. Victims of commercial exploitation would have the right to sue platforms for deploying features that were a “substantial factor” in causing the exploitation. The bill would impose penalties of up to $250,000 on the “worst offenders.” The committee voted 5-0 to pass the bill to the Assembly Judiciary Committee, with Republicans abstaining but making encouraging remarks about the measure.

Assemblymember Buffy Wicks (D) introduced AB-1394. One example of child exploitation she highlighted involved TikTok users offering young girls gifts through comment sections in exchange for performing sexually explicit acts. Platforms are aware that sex trafficking originates on their services, said Common Sense Media State Policy Associate Kami Peer, testifying in support. Facebook’s own internal research shows the platform “enables all three stages of the human exploitation life cycle: recruitment, facilitation and exploitation,” she said. Peer noted the average victim is 13-14 years old, and the average life expectancy after exploitation is seven years.

TechNet and NetChoice testified against the bill. It opens the door to liability for any feature a predator uses, including every form of communication between users, features that are essential to online functionality, said TechNet Executive Director Lia Nitake. AB-1394 could result in platforms banning all users under the age of 18, she said: It also violates free speech rights and is preempted by federal law.

Platforms don’t want this activity on their services, but they need to be held accountable for negligence and intentional facilitation, said Vice Chair Joe Patterson (R). He said he’s inclined to support the bill, but he abstained pending further amendments. Platforms are being used for “very bad, bad things,” including open drug sales, he said: “More needs to be done.”

California parents say legislators should use every means possible to protect children, and these platforms have a responsibility, a “moral obligation,” to come to the table and be part of the solution, said Chair Jesse Gabriel (D). The bill passed with five Democrats in support.

The committee also passed legislation to the Judiciary Committee that’s intended to protect internet users from discrimination and bias embedded in artificial intelligence tools. AB-331, authored by Assemblymember Rebecca Bauer-Kahan (D), is modeled after the Biden administration’s AI bill of rights blueprint. It would require developers to assess AI tools for discriminatory biases and “mitigate accordingly,” and would provide a private right of action for individuals to sue for discrimination. Fines would be capped at $10,000 per infraction.

TechNet, NetChoice and California business groups testified against AB-331. Smaller companies don’t have the resources to do yearly assessments, said California Chamber of Commerce Policy Advocate Ronak Daylami. The chamber is aligned in principle and agrees bias and discrimination are serious problems, but a “more measured approach is needed,” she said.

Tech companies have deployed AI practices that result in discrimination based on gender and race, said Taneicia Herring, a regional government relations specialist with the state NAACP. Amazon, for example, tried to use a hiring algorithm based on historical data to select software engineers, and the tool disproportionately rejected women, she said. Discrimination is unlawful whether the decision is made by a person or an algorithm, said Assemblymember Bill Essayli (R). He said that in that particular instance, Amazon could probably be sued even without the proposed regulation. The liability is too much while AI technology is still in its infancy, said Patterson. AB-331 passed 7-2 with Patterson and Essayli opposed.

The committee passed AB-1546, which would grant the state attorney general the same five-year statute of limitations for bringing civil actions to enforce the California Consumer Privacy Act that the California Privacy Protection Agency has for bringing administrative enforcement actions under the same law. Due to a drafting oversight, the law doesn’t specify that the AG also has five years to bring a civil action, said Gabriel, author of the bill. If it’s not specified, under state law, the AG has only one year, which is far too short a time to develop a case, he said. Patterson raised concerns about companies being unaware of wrongdoing and incurring multiple violations if action isn’t brought within the first year. The California Chamber of Commerce believes voters’ silence on the AG’s statute of limitations was intentional, and timely action would best serve all parties by allowing businesses to better mitigate violations, said Daylami. The bill passed 5-3 with Patterson, Essayli and Assemblymember Vince Fong (R) opposed.