Md., Minn. Hearings

More States Weigh Age-Appropriate Design Code Bills

Maryland’s attorney general found no potential constitutional or preemption problems with a state bill that would set kids’ privacy rules, said its sponsor, Del. Jared Solomon (D), at a livestreamed hearing Wednesday. House Economic Matters Committee members appeared to support requirements for websites at a hearing on a bill (HB-901) based on California’s Age-Appropriate Design Code Act. The Minnesota House Commerce Committee voted by voice to advance a similar bill (HF-2257) to the Judiciary Committee at a hearing the same day.

Solomon asked Maryland AG Anthony Brown (D) to review HB-901, the delegate said. The AG said the bill has no constitutional problems and its proposed rules for children under 18 wouldn’t be preempted by the federal Children’s Online Privacy Protection Act (COPPA), which covers only kids under 13, Solomon said. COPPA is “dramatically out of date,” the delegate said.

The bill lets companies “innovate” their platforms to keep children safe, said Solomon. Including a 90-day right to cure in the bill shows “we’re not out to get these companies.” The bill requires companies to estimate -- not authenticate -- age, and the authors left some terms “purposely vague” so the proposed law wouldn’t be too prescriptive, he added. “This is not about restricting content,” said the delegate. “This is about making sure our young people are not manipulated into seeing content that is not appropriate for them.”

Online platforms have become “unregulated necessities of everyday life,” said committee Chair C.T. Wilson (D), who co-sponsored HB-901. Just as cars must have seatbelts, “this technology needs to have some standards for consumer safety,” he said. “We’re not asking for anything ridiculous” or “onerous.” British House of Lords member Beeban Kidron, who chairs the 5Rights Foundation, testified in support, as did advocates for parents and state educators.

Industry lined up against the bill. Parents and guardians, not companies, should decide what content is appropriate for children, said TechNet Mid-Atlantic Executive Director Margaret Durkin. Age verification runs counter to many companies' data minimization efforts, she said. Age verification could sacrifice user privacy because companies would need to collect even more sensitive data, agreed Claire Park, Chamber of Progress external affairs manager. HB-901 lacks nuance and “narrowly tailored definitions,” said Computer & Communications Industry Association (CCIA) State Policy Director Khara Boender.

Multiple Democratic committee members wondered what industry wouldn’t oppose. “Without this bill … how exactly do you suggest that parents act in the best interest and prepare their children?” asked Del. Lorig Charkoudian. Industry can’t perennially call proposals imperfect without suggesting an alternative approach, she said. Del. Adrian Boafo asked, “What policy standpoint do you guys actually agree with?”

Committee Vice Chair Brian Crosby pointed out that the bill requires companies to estimate age. Platforms without objectionable content shouldn’t have problems, said the Democrat, adding that the right to cure makes the bill more reasonable.

Minn. Panel Seeks Rules

The same industry groups rejected Minnesota’s HF-2257 at a hearing later that afternoon for many of the same reasons. The committee advanced the bill despite those and Republican concerns.

Rep. Kurt Daudt (R) worries about requiring companies to collect data they wouldn’t have otherwise, he said. Age data could be combined with a user’s geocoded location, which also would have to be collected to ensure the law applies only to Minnesotans, he said. “It seems really, really scary to me.” Rep. Anne Neu Brindley (R) hadn’t thought of that, she said. She came into the hearing thinking she would support HF-2257 but changed her mind after hearing “doublespeak runaround” from supporters. California is “under litigation for a reason,” she added.

Sponsor Rep. Kristin Bahner (D) sees no reason companies can’t design software in a way that protects privacy from the outset, she said: Don’t wait until harm is done. “We don’t want to make life harder for businesses,” said Bahner. “What we’re asking you to do is to put your customer first.”

Supporter 5Rights agrees there's always “tension” between privacy and safety, but the bill requires age estimation, not verification, said Head of U.S. Affairs Nichole Rocha. There are “dozens” of ways to check age, she said. The bill requires companies to use the least invasive method possible and to delete the data quickly after checking, she said.

The committee also cleared a Republican bill to regulate social media despite business opposition, including from the same industry groups that opposed the children’s privacy measure. HF-1503, which goes next to the Judiciary panel, would ban social media “from using algorithms to target unsolicited content at kids,” said sponsor Rep. Kristin Robbins (R). Also, companies would have to obtain verifiable parental consent before allowing a child to open an account, she said.

The internet “runs on algorithms,” said TechNet Executive Director-Midwest Tyler Diers in opposition. Age verification required by the parental consent provision would be problematic, requiring companies to collect more data on children, he said. The bill’s private right of action would lead to frivolous lawsuits, said CCIA State Policy Manager Jordan Rodell.

Websites complaining a bill would cause too much data collection is “the pot calling the kettle black,” remarked Chair Zack Stephenson (D).

But HF-1503 co-author Neu Brindley said she now has questions about what social media companies would use instead of algorithms. Also, the bill may have the same geocoding privacy problem that was raised about HF-2257, she said.

More States Weigh Measures

Inspired by the California kids’ privacy law, “19 proposals have emerged across 15 states that regulate specific issues related to children’s interactions with online services and social media platforms,” WilmerHale attorneys wrote Wednesday. “If enacted into law, many of these bills would impose new regulatory compliance requirements on companies with users under the age of 18.” The lawyers warned that most bills, if enacted, “would expose companies to significant liability for mishandling children’s information including civil penalties, damages, and administrative fees.”

Problems with several state bills meant to protect children online could lead to constitutional challenges, CCIA said Wednesday. Bills may conflict with federal law or make it tougher for websites to restrict harmful content, said CCIA. In recent letters, the internet industry association raised concerns about bills in Connecticut, Maryland, Minnesota and New Mexico. Connecticut lawmakers considered multiple social media proposals, including an age-appropriate design bill, at a Tuesday hearing (see 2302280075).

CCIA instead backs social media literacy bills in several states (see 2301100026). The association supported a Connecticut measure (HB-6760). “Given the complexity of tackling this critical issue, existing industry efforts to support child safety and privacy online could be bolstered by educational curricula focused on how to be a good citizen online,” it said in written comments submitted for a Wednesday hearing.

Advocates for students with disabilities supported Connecticut’s digital literacy bill in other written testimony. “Because some students with disabilities have separate classes, they are treated as the Other, different from and inferior to neurotypical students,” wrote Andrew Feinstein, Special Education Equity for Kids in Connecticut legislative chair. He said HB-6760 “is an excellent vehicle to confront this issue.” Other supporters include the School and State Finance Project, the Connecticut Hate Crimes Advisory Council and the Commission on Women, Children, Seniors, Equity and Opportunity.