Censorship, Misinformation?

Democrats, GOP Attack CDA Section 230 at House Commerce Hearing

Section 230 of the Communications Decency Act needs recalibration because Big Tech isn’t doing enough to combat disinformation, House Commerce Committee Democrats said Wednesday. Republicans suggested platforms provide more transparency about content moderation decisions, citing political bias. They spoke at a joint hearing of the Communications and Consumer Protection subcommittees (see 2006110064).

Social media companies are distributing and amplifying disinformation that is dividing Americans during an era of racial turmoil, said Communications Chairman Mike Doyle, D-Pa. He cited Facebook, YouTube and Twitter, arguing they are pervasive sources of information and news. Section 230 gives companies both a sword and a shield for moderating content, but they have failed to act on misinformation, which lets them monetize their businesses, he said. He noted that when platforms do moderate, they are attacked for censorship, citing Twitter’s recent actions against President Donald Trump (see 2006180059).

Section 230 protections allowed tech companies to become gatekeepers, but too often companies don’t take responsibility for the content within those gates, said ranking member Bob Latta, R-Ohio. He emphasized he isn’t advocating repealing Section 230 or creating niche carve-outs. Broad immunity was granted without companies having to prove they are doing everything possible to moderate, he said. More oversight of content moderation practices is needed, showing what platforms choose to censor and what they don’t, he said. If platforms can make complex decisions when moderating conservative speech, surely they can make the easy decisions about hateful or racist content, he argued.

Big Tech uses the statute as a shield from liability when it fails to protect consumers and as a sword to defend its interests in the legislative discussion, said Consumer Protection Chair Jan Schakowsky, D-Ill. She accused Trump of using his position to chill speech, arguing content moderation should foster a safer online world.

Rep. Brett Guthrie, R-Ky., said better transparency is needed about content moderation guidelines and appeals mechanisms. Congress must ensure platforms apply those standards fairly instead of labeling differing opinions as misinformation, he argued. Rep. John Shimkus, R-Ill., questioned whether any protections exist to ensure decisions aren’t one-sided, made “at the whims” of platform employees.

Guthrie cited his Countering Online Harms Act, which would direct the FTC to study how artificial intelligence is used to remove harmful online content such as misinformation and fraudulent material.

Tech companies failed to uphold transparency, accountability and fairness, testified Color of Change Senior Campaign Director Brandi Collins-Dexter. She recommended Congress convene civil rights-focused hearings with tech executives. She backed restoring funding for the Office of Technology Assessment to help Congress tackle issues like data privacy, election protection and disinformation campaigns.

University of California-Berkeley professor Hany Farid cited a “litany of daily horrors” online: the spread of child sexual abuse material, radicalization of domestic and international terrorists, distribution of illegal drugs, disinformation campaigns disrupting democratic elections, harassment of women and minorities, and failures to protect personal data. Big Tech failed to deploy proper safeguards, he said.

Congress should restore a duty of care online by requiring platforms to take good faith steps as a condition of receiving Section 230 protections, testified DigitalFrontiers Advocacy Principal Neil Fried. This would better protect users and address competition concerns, he said. He agreed with Republicans that increased transparency would help. It’s a losing battle if platforms can facilitate illicit activity with impunity, he said.

Disinformation is being used to divide Americans along racial lines, testified George Washington University law professor Spencer Overton. Section 230 gives platforms the authority to remove disinformation, and Big Tech needs to use that authority to do a better job, he said. Some platforms say they want to protect free speech, but the harms tolerated in the name of free expression have a disproportionate impact on minority communities, he added.

Rep. G.K. Butterfield, D-N.C., asked witnesses to highlight how disinformation disproportionately affects minority communities. Microtargeting allowed state actors like Russia to target Black populations with messages urging them to protest police brutality by staying home and not voting, Overton said as one example.

Rep. Michael Burgess, R-Texas, asked about restoring a duty of care by conditioning Section 230 immunity on good-faith efforts, and what would qualify as “good faith.” The section means companies can’t be held culpable even when there’s negligence, said Fried: There should be requirements for “reasonable steps.”