New Hart-Scott-Rodino rules will mean increased time and cost when filing premerger notifications with the FTC and DOJ, antitrust attorneys said Monday. The FTC on Thursday announced finalization of new HSR filing rules. It said the changes will help enforcers “detect illegal mergers and acquisitions prior to consummation.” The commission voted 5-0 to finalize the changes. They include requirements for filing additional transaction documents, high-level business plans and disclosures of investors in the buying party. The FTC estimates the new rules will add 68 hours to the average time for preparing an HSR filing and increase the cost by about $39,644. The rules are expected to become effective in January, 90 days after Federal Register publication. Hunton Andrews Kurth said they will create uncertainty about the level of detail required for filings, including “descriptions of the ownership structure, transaction rationale and overlaps.” Paul Weiss said that based on the FTC announcement, the most significant increases in costs could be felt in “transactions with complex party or deal structures, those involving entities with many overlapping business operations or existing business relationships in the supply chain, or where the parties have a history of acquisitions in the same business lines.” The FTC said it will lift its suspension of early termination when the rules go into effect. The “temporary” suspension was implemented in February 2021 (see 2102040025 and 2203170004).
Meta, Google, TikTok and Snapchat must defend themselves against claims that their platforms are designed to “foster compulsive use by minors,” the U.S. District Court for the Northern District of California ruled Tuesday (docket 4:22-md-03047-YGR). Judge Yvonne Gonzalez Rogers ruled on hundreds of consolidated legal claims filed on behalf of children, school districts, local governments and state attorneys general. The ruling covered lawsuits from 35 states, including California, New York, Georgia and Florida. Rogers “generally denied” the companies’ motions to dismiss but limited the scope of many claims. “Much of the States’ consumer protection claims are cognizable,” she said. “Meta’s alleged yearslong public campaign of deception as to the risks of addiction and mental harms to minors from platform use fits readily within these states’ deceptive acts and practices framework.” However, she noted Communications Decency Act Section 230 provides a “fairly significant limitation on these claims.” Section 230 also protects against “personal injury plaintiffs’ consumer-protection, concealment, and misrepresentation theories,” she said. Rogers declined to dismiss “theories of liability predicated on a failure-to-warn of known risks of addiction attendant to any platform features or as to platform construction in general,” including claims against YouTube, Snap and TikTok. The companies didn’t comment.
Utah is appealing a preliminary injunction against the state’s social media age-verification law, Attorney General Sean Reyes (R) said in a Thursday filing with the U.S. District Court of Utah (docket 2:23-cv-00911). NetChoice won an injunction against SB-194 in September on First Amendment grounds (see 2409110025). Reyes and Katherine Hass, the state's Department of Commerce Consumer Protection Division director, are appealing to the 10th U.S. Circuit Court of Appeals.
It “doesn’t make sense” to break up Google because the company is facing its most intense competition, Rep. Ro Khanna, D-Calif., said Thursday. However, he told CNBC Google should drop its exclusive contract with Apple and not self-preference its own products. One of Google’s AI competitors, ChatGPT, has more than 100 million users, he said: Google is “actually having the most competition today than ever before.” DOJ on Tuesday filed a proposed remedy framework in its antitrust lawsuit against the company, and said breaking up Google should be considered (see 2410090035).
The FTC investigated 70% of Hart-Scott-Rodino merger transactions in fiscal 2023, the agency said in its annual report with DOJ on Thursday. DOJ issued second requests, the mechanism used to initiate an investigation, in 30% of transactions in 2023. There were 1,723 total HSR transactions in 2023, 124 of which were cleared to the FTC for investigation; DOJ was cleared to investigate 61. The FTC in fiscal 2022 issued second requests on 53.2% of transactions it reviewed, and DOJ initiated investigations in about 47% of its cases. During President Donald Trump’s last year in office, fiscal 2020, the FTC issued second requests on 48% of transactions, and DOJ issued second requests on 52% of its cases. There were 1,580 total transactions in 2020 and 3,029 in 2022. The commission voted 3-2 to issue the 2023 report, with Republicans Melissa Holyoak and Andrew Ferguson dissenting. Holyoak and Ferguson urged the two agencies to fix discrepancies between how the FTC and DOJ report certain figures, specifically how they measure litigation results.
FTC Chair Lina Khan “only attends official events at the request of members of Congress" and abides by “all the rules governing her role as chair,” an agency spokesperson said Wednesday (see 2410020046). House Oversight Committee Chairman James Comer, R-Ky., wrote Khan Tuesday saying his committee is expanding its investigation of FTC “politicization” under her leadership to also probe her participation in “campaign-season events with Democrat candidates” (see 2410080062). The agency spokesperson said members of Congress invite Khan to “official events so she can hear from their constituents, because every community has a stake in fair competition.”
The 3rd U.S. Circuit Court of Appeals should grant TikTok's request for a full-court review of a three-judge panel’s decision that Section 230 doesn’t protect its algorithmic recommendations (see 2408280014) (docket 22-3061), tech associations said in an amicus brief filed Tuesday. Signees included CTA, the Computer & Communications Industry Association, NetChoice, TechNet and the Software & Information Industry Association. Chamber of Progress, Engine and the Interactive Advertising Bureau also signed. TechFreedom signed a separate amicus brief supporting TikTok. The three-judge panel remanded a district court decision dismissing a lawsuit from the mother of a 10-year-old TikTok user who unintentionally hanged herself after watching a “Blackout Challenge” video on the platform. The platform can’t claim Communications Decency Act Section 230 immunity from liability when its content harms users, the panel found. That decision threatens the internet “as we know it,” the associations said in their filing: It jeopardizes platforms’ ability to “disseminate user-created speech and the public’s ability to communicate online.” TechFreedom Appellate Litigation Director Corbin Barthold said the panel wrongly concluded that “because recommendations are a website’s own First Amendment-protected expression, they fall outside Section 230’s liability shield. A website’s decision simply to host a third party’s speech at all is also First Amendment-protected expression. By the panel’s misguided logic, Section 230’s key provision -- Section 230(c)(1) -- is a nullity; it protects nothing.”
Rules for implementation of Florida’s social media age-verification law state that a “commercial entity willfully disregards a person’s age if it, based on the facts or circumstance readily available to the respondent, should reasonably have been aroused to question whether the person was a child and thereafter failed to perform reasonable age verification.” The Florida attorney general’s office on Monday sent us rules that it adopted for implementing the state law (HB-3), which takes effect Jan. 1. Gov. Ron DeSantis (R) signed the law in March after vetoing an earlier proposal that would have banned kids younger than 16 from having social media accounts; lawmakers then approved a revised version that includes parental consent (see 2403080063). “Willful disregard of a person’s age constitutes a knowing or intentional violation” of Florida’s social media age-restriction law, the AG rules say. “The department will not find willful disregard of a person’s age has occurred if a commercial entity establishes it has utilized a reasonable age verification method with respect to all who access the social media platform and that reasonable age verification method determined that the person was not a child unless the social media platform later obtained actual knowledge that the person was a minor and failed to act.” The rules define a “commercially reasonable method of age verification” as “a method of verifying age that is regularly used by the government or businesses for the purpose of age and identity verification.” Meanwhile, “reasonable parental verification” is defined as “any method that is reasonably calculated at determining that a person is a parent of a child that also verifies the age and identity of that parent by commercially reasonable means.” That might include asking the child for a parent’s name, address, phone number and email address, contacting that person and “confirming that the parent is the child’s parent by obtaining documents or information sufficient to evidence that relationship,” and “utilizing any commercially reasonable method regularly used by the government or business to verify that parent’s identity and age.” Under another rule, social media platforms must permanently delete all personal information related to an account within 14 business days of the account’s termination.
The 5th U.S. Circuit Court of Appeals should affirm a lower court’s decision that blocks Mississippi’s social media age-verification law because it violates the First Amendment, the American Civil Liberties Union, the Electronic Frontier Foundation, Chamber of Progress and other groups argued in filings Thursday (docket 24-60341). The amici filed in support of NetChoice, which won a preliminary injunction against the law from the U.S. District Court for the Southern District of Mississippi on July 1 (see 2407160038). The ACLU and EFF filed a joint brief, arguing that online age verification blocks access to protected speech for millions of adults who lack proof of identification. Users have a right to be anonymous online, and age-verification requirements force people to put sensitive data at risk of inadvertent disclosure in data breaches, they said. Chamber of Progress filed with LGBT Tech, the Woodhull Freedom Foundation and the Coalition for Responsible Home Education. Minors don’t “shed their First Amendment rights at the gateway to the internet,” they said: Their participation in the “marketplace of ideas,” which includes unpopular ideas, is “essential to a functioning democracy.” The Foundation for Individual Rights and Expression argued that legal precedent requires government to show there isn’t a “less restrictive alternative” to achieving its objective, and Mississippi hasn’t shown the new law is the “least restrictive means of addressing concerns about young people’s use of social media.”
Texas sued TikTok for allegedly violating the state’s new social media parental-consent and age-restriction law (HB-18). The platform shared minors’ personal data in violation of the law, Texas said in a complaint at the Texas District Court in Galveston County (case 24-CV-1763). “Texas law requires social media companies to take steps to protect kids online and requires them to provide parents with tools to do the same,” said Ken Paxton (R), the Texas attorney general. The complaint claims that TikTok failed to provide those tools or develop a commercially reasonable parental-consent mechanism. In addition, Texas alleged that TikTok shared and disclosed minors’ personal identifying information without parental consent. Paxton sought injunctive relief and civil penalties of up to $10,000 per violation. A TikTok spokesperson said, “We strongly disagree with these allegations and, in fact, we offer robust safeguards for teens and parents, including Family Pairing, all of which are publicly available. We stand by the protections we provide families.” The lawsuit comes roughly one month after the U.S. District Court for the Western District of Texas granted a preliminary injunction (see 2409030039) against the 2024 law in a case that tech industry groups NetChoice and the Computer & Communications Industry Association (CCIA) brought. However, TikTok is not a member of NetChoice or CCIA. “The injunction granted by Judge [Robert] Pitman of the Western District of Texas bars the state from enforcing particular provisions of [HB-18] only as to CCIA, NetChoice, and their members,” said Stephanie Joyce, CCIA chief of staff.