Communications Litigation Today was a Warren News publication.
‘Act More Responsibly’

NTIA’s Davidson: Parents Not Solely Responsible for Kids’ Online Safety

Parents aren't the only ones responsible for protecting their children online, and social media companies should do more as their safety obligations evolve with the rise of AI, NTIA Administrator Alan Davidson said Monday.

NTIA in November closed public comment on its inquiry into child safety-related risks associated with social media use (see 2311160059 and 2311200054). President Joe Biden’s Task Force on Kids Online Health and Safety -- which includes officials from NTIA, the National Institute of Standards and Technology, the FTC and DOJ -- shows the administration has a “strong feeling” this is no longer the responsibility solely of parents, Davidson said during the Knight Foundation’s Informed conference Monday evening. Policymakers should rethink service providers’ obligations, he said: “That may be a bit of a change in outlook that we need to embrace.”

The Biden task force will consider NTIA’s public comments when formulating voluntary guidance, policy recommendations and an industry “toolkit” aimed at protecting children, the agency said in November.

As agencies implement Biden’s AI executive order, advocates are pushing Congress for kids’ privacy legislation. The Parents Television and Media Council on Tuesday urged passage of the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) (see 2312040058). PTC noted the Senate Judiciary Committee will hold a Jan. 31 hearing with the CEOs of TikTok, Meta, Snap, Discord and X (see 2311290072). The five companies have failed to properly protect children online, PTC Vice President Melissa Henson said: “Congress cannot stand back and keep hoping that Big Tech will do more, be better, or act more responsibly.”

It’s “crazy” that there’s no federal privacy law, given that people have been discussing these issues for more than 25 years, Davidson said. Congressional solutions on data ownership and data governance would go a long way toward addressing issues raised by the increased reliance on AI, he said: “It's a fruitful area for further exploration.”

Along with privacy concerns, AI raises competition matters, said Davidson. He discussed a need to “democratize” AI to avoid “a world where only a small set of players have access to the most important and powerful AI systems out there.”

NTIA is working on a separate inquiry about how policies can ensure AI technology is trustworthy (see 2306300036). Davidson expects a report sometime “this winter.” The agency is examining the “whole lifecycle” of accountability, starting with transparency and the information needed to hold AI models accountable, he said. He suggested creating a system of third-party auditors similar to those in the financial sector: “Responsible AI innovation is going to make a huge difference in people’s lives, a huge positive difference in many people’s lives. But we’re only going to realize that potential if we deal with the very risks that exist.”

Openness is also important, Davidson added: The AI discussion is similar to the “early days” of open source software, which ultimately benefited the tech industry. NTIA’s AI examination has shown there’s a “gradient” of openness that “we can lean into,” he said, and that’s “what we’re going to be exploring this spring.”