Rodgers ‘Encouraged’

House Commerce Democrats Send Privacy Draft to Republicans

House Commerce Committee Democrats sent draft privacy bill language to Republicans last week in hopes the two sides can reach agreement in the near future, said Chairman Frank Pallone, D-N.J. Updates were made to the bipartisan staff discussion draft, which the two sides have been negotiating since last Congress, he said. Democrats are “confident” the two sides can work together and reach agreement, said Consumer Protection Subcommittee Chair Jan Schakowsky, D-Ill., who hosted Thursday’s hearing on Big Tech issues.

Though the sides are nowhere near introducing a bipartisan product, Republicans are “looking at it, and I’m encouraged that they have presented some language,” committee ranking member Cathy McMorris Rodgers, R-Wash., told us. “I don’t believe it has bipartisan support. We’re anxious to get serious about a national framework around privacy. We believe it’s overdue. We’re looking at it now.”

Asked how far apart the parties are on a potential bipartisan draft, Pallone said he didn’t know. The work continues, and he’s optimistic, he said during the hearing. Schakowsky said she shares the urgency of Republicans. Consumer Protection ranking member Gus Bilirakis, R-Fla., noted his support for Republican legislation that would create an FTC privacy bureau.

Bilirakis noted the subcommittee asked TikTok to testify, but the company declined. He hopes to work with Schakowsky on getting the social networking service to appear in the near future. There were many unanswered questions from the company’s recent Senate hearing, he said (see 2110260070). The company didn’t comment.

Pallone discussed the need to pass whistleblower protections so more individuals like Frances Haugen, formerly of Facebook, will come forward and shed light on tech companies (see 2110050062). Members discussed the FTC Whistleblower Act (HR-6093), which would provide incentives and protections for whistleblowers through FTC authority.

Facebook

Whistleblowers like Haugen are “incredibly important,” said Center for Countering Digital Hate CEO Imran Ahmed. Haugen exposed deception, which is hard to do unless you’re on the inside, he said. Haugen's testimony helped reveal lies of Facebook CEO Mark Zuckerberg, said Anti-Defamation League CEO Jonathan Greenblatt. He noted Facebook’s AI fails to identify 95%-97% of hate speech.

The FTC should investigate whether Facebook misled advertisers and users “about ensuring brand safety and the reach of its advertisements,” Senate Commerce Committee Chair Maria Cantwell, D-Wash., wrote the agency Thursday. She noted Facebook’s community standards enforcement report “states that its algorithms remove 97 percent of the content it eventually takes down for hate speech before the content is posted.” Whistleblower documents show Facebook “believes that its processes miss more than 90 percent of hate speech content despite being 97 percent effective at catching the hate speech Facebook eventually takes down,” she wrote. The FTC confirmed receiving the letter.

Rodgers said she's disappointed the hearing wasn’t used to focus on privacy issues. She asked how the bills Democrats presented will fit into a national privacy framework. Not having a national privacy standard has consequences, said Bilirakis. Kelley Drye’s Jessica Rich, a former FTC Consumer Protection Bureau director, agreed, saying it’s a “highly confusing environment,” in which no one “really knows the rules.”

Facebook’s goal is to reduce hate speech prevalence, or “the amount of it that people actually see,” a Meta spokesperson emailed. “The prevalence of hate speech on Facebook is now 0.03 percent of content viewed and is down by 50 percent in the last four quarters, facts that are regrettably being glossed over. We report these figures publicly four times a year and are even working with an independent auditor to validate our results.”

All six witnesses said they would support a federal privacy law. Nathalie Marechal, Ranking Digital Rights senior policy and partnerships manager, said her group's support depends on the law's strength.

Communications Subcommittee ranking member Bob Latta, R-Ohio, drew attention to issues with ICANN’s Whois database and data access problems under the EU’s general data protection regulation (see 2102160001). He suggested restored access would improve internet safety. ICANN’s leadership has said the organization is limited in what it can do because of the GDPR, said Iggy Ventures CEO Rick Lane: “Our own security is put at risk because of a foreign entity’s legislation and regulation.”

The committee is interested in banning certain platform design features to protect children online, said Pallone, referencing HR-5439, the Kids Internet Design and Safety Act. It should be a priority for Congress to get rid of data-driven ads for children, said Fairplay Executive Director Josh Golin.

Senate Hearing

Senate Commerce Committee member Amy Klobuchar, D-Minn., touted her filing of the Platform Accountability and Transparency Act with Sens. Chris Coons, D-Del., and Rob Portman, R-Ohio, during a separate Communications Subcommittee hearing Thursday. The measure would allow independent researchers to submit proposals to the National Science Foundation seeking access to social media platforms’ data. The bill would require the companies to provide the requested data following NSF approval, while adhering to certain privacy protections.

“We need more transparency on algorithms, and we need to do something about this,” Klobuchar said. She noted Facebook’s much-publicized cutoff of researchers’ access to some data, and instances of claimed algorithm misuse by Amazon, Apple and others to “increase the time users spend on their platform.” Interest in smoothing researchers’ access to social media algorithm data has increased in recent months, prompting YouTube, TikTok and Snap to commit to sharing internal research information with lawmakers (see 2110260070).

“Congress can act to protect external researchers and enable independent oversight and auditing of systems in privacy-preserving ways,” said Communications Subcommittee Chairman Ben Ray Lujan, D-N.M. He and subpanel ranking member John Thune, R-S.D., cited their Protecting Americans from Dangerous Algorithms Act (S-3029). Platforms need to face “real accountability when their algorithms are at fault for making” content “recommendations that contribute to terrorism,” Lujan said.

Thune also cited his Filter Bubble Transparency Act (S-2024), which would require platforms that collect data from more than one million users and gross more than $50 million per year to let users view content that hasn’t been curated by a secret algorithm. There’s “a growing bipartisan consensus that we need to shed greater light on the secretive content moderation processes that social media companies use, and it’s time, in my view, to make Big Tech more transparent, more accountable,” said Thune.