Rep. Schakowsky to Circulate Draft Legislation on Platform Liability
Draft legislation will be circulated next week that would “fundamentally alter” tech companies’ business models, House Consumer Protection Subcommittee Chair Jan Schakowsky, D-Ill., said during a hearing. Her draft bill will be aimed at giving regulators and consumers recourse when companies fail to deliver on basic, stated commitments, she said. Reached after the hearing, Schakowsky wouldn’t say whether the draft bill directly targets Communications Decency Act Section 230.
The lawmaker said platform liability will be the focus. Asked whether there’s bipartisan momentum on Section 230, Schakowsky said in an interview, “I really felt that in the hearing today.” Both sides see a need to do something, and there hasn’t been voluntary compliance from social media companies, she said.
Manipulation and platform surveillance are getting worse, Rep. Kathy Castor, D-Fla., told us. Thursday’s “eye-opening” hearing showed bipartisan potential for solutions on Section 230, she added.
Facebook has shown an internal tendency to ignore privacy concerns in favor of profits and network growth, testified Moment CEO Tim Kendall, a former Facebook monetization director. Schakowsky asked him about companies that release new products despite potential harm to consumers. Facebook’s internal discussions about harm focused on user privacy, but those concerns were often ignored during new product releases, Kendall said, noting the motivation was always to gain more users. The company responds not to constraints or threats from regulators but only to financial impact, he said, citing the effect of the FTC’s $5 billion settlement on privacy practices at the company. The platform didn’t comment.
The internet has its faults, especially when companies don’t fulfill their responsibilities, but it has been an overwhelming force for good, said ranking member Cathy McMorris Rodgers, R-Wash. She raised concerns about platforms applying inconsistent standards for moderating content and accused Twitter of singling out President Donald Trump (see 2005290058). Censorship shouldn’t be the answer to political content a platform disagrees with, she said. In his opening remarks, Commerce Committee ranking member Greg Walden, R-Ore., highlighted the spread of misinformation on platforms, calling social media a “cancer on civility.”
Like Schakowsky, House Commerce Committee Chairman Frank Pallone, D-N.J., drew attention to social media company business models. He said algorithms are designed to optimize user engagement and attract more users, which leads to the amplification of harmful content. Kendall agreed, saying engagement drives every decision at Facebook.
Castor suggested Section 230 has become an impediment to protecting children. Facebook and YouTube make millions off a “stomach-turning stew” of illegal activity without any threat of liability, she said. As long as Section 230 is on the books, platforms will have only a moral obligation to protect children, not a legal one, testified Coalition for a Safer Web President Marc Ginsberg.
Also testifying was Taylor Dumpson, American University’s first black female student body president and the target of a racist online harassment campaign. Dumpson’s story alone is enough to convince Congress that Section 230 needs to be addressed, said Rep. Larry Bucshon, R-Ind. Platforms are no longer innocently hosting content, he said: Congress should consider whether they’re more like traditional publishers and have an honest discussion about Section 230.
Rep. Robin Kelly, D-Ill., questioned whether social media platforms are capable of self-correcting. Kendall said no, that the incentive to maintain the status quo is too lucrative. Ginsberg noted the Coalition for a Safer Web wants a social media standards board to create a harmonized code of conduct, modeled after the Financial Accounting Standards Board, which was established in the 1970s to standardize accounting practices.
Rep. Darren Soto, D-Fla., suggested Congress can establish social media platform liability to compensate victims and force business changes. He floated the idea of making platforms liable for damages in court for criminal activity perpetrated on their services.