X Says Court Ruling in BIPA Plaintiffs' Favor Would Leave It With 'Impossible Choice'

The court should dismiss a privacy complaint against X, formerly Twitter, because the plaintiff has no viable claim that the social media platform violated the Illinois Biometric Information Privacy Act (BIPA), said X's Nov. 20 reply (docket 1:23-cv-05449) in support of its motion to dismiss. Plaintiff Mark Martell’s August complaint alleged X implemented software to police pornographic and other “not-safe-for-work” images uploaded to Twitter without adequately informing individuals who interacted with the platform “that it collects and/or stores their biometric identifiers in every photograph containing a face that is uploaded to Twitter" (see 2308160021).

In his opposition to X’s motion to dismiss, Martell failed to explain how PhotoDNA collects scans of facial geometry when the PhotoDNA webpage “directly refutes that allegation,” said the reply. Martell concedes that PhotoDNA doesn’t enable X to identify individuals depicted in images uploaded to Twitter, but he argued that biometric identifiers don’t have to be capable of identifying an individual, which X called an “oxymoron.” A biometric identifier, by definition, must be capable of identifying an individual, X said. Because BIPA applies only to data that can be used to identify an individual, and because Martell “fails to allege that PhotoDNA enables X to identify anyone,” his claims should be dismissed, it said.

The plaintiff also fails to explain why Section 230(c)(2)(A) of the Communications Decency Act doesn’t immunize X from BIPA liability, said the reply. Though Martell said he didn’t seek to post obscene images on X and isn’t suing X for its actions as a publisher or speaker, “both arguments are irrelevant,” said the reply: All that matters under the CDA is whether X’s alleged conduct is an “action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” That is what X does with PhotoDNA: It uses the software to “identify and remove known images of child exploitation,” the reply said.

If PhotoDNA can’t be used to identify individuals depicted in images, “how can X possibly seek those individuals’ consent” under BIPA, X asked. If the court determines that Martell’s understanding of BIPA is correct, and that Section 230 doesn’t immunize X from liability, then X will have an “impossible choice”: “Subject itself to recurring and outsized liability under BIPA by continuing to use PhotoDNA to remove child-exploitation images, or halt that use and allow child-exploitation images to circulate on its platform,” it said.