BIPA Lawsuit Proceeds Against Apple in Federal Court

On November 12, 2020, Chief Judge Nancy J. Rosenstengel of the U.S. District Court for the Southern District of Illinois rejected Apple Inc.’s (“Apple’s”) motion to dismiss a class action alleging its facial recognition software violates Illinois’ Biometric Information Privacy Act (“BIPA”). Judge Rosenstengel agreed with Apple, however, that the federal court lacks subject matter jurisdiction over portions of the complaint.

The case involves a group of Illinois residents who allege that Apple uses technology to collect facial geometries from user pictures stored in the Photos app on Apple devices. According to the plaintiffs, this feature has come preinstalled on Apple devices made since mid-2016, and users have no way to remove or disable the software. They claim that Apple violated BIPA by collecting, possessing and profiting from biometric information without the knowledge or consent of Apple device users. The lawsuit seeks to represent a class of Illinois residents whose photos were stored on an Apple device equipped with the facial scanning technology, and seeks damages for each alleged violation of BIPA.

After removing the case from state court to federal court, Apple sought dismissal, asserting that the lawsuit did not belong in federal court because the plaintiffs had not demonstrated how they were harmed by the facial recognition technology and thus lacked standing in federal court. Apple also contended that the facial scans are not linked to identifiable individuals and should not be considered identifiable under BIPA, and that BIPA should not apply because the facial recognition takes place on the users' devices and is not transmitted from the Photos app to Apple. Apple further asserted that the plaintiffs did not show how Apple profited from the facial recognition technology.

Judge Rosenstengel remanded the following two allegations back to state court for lack of standing in federal court: (1) the allegation that Apple possesses the biometric information without having created a retention policy specifying when it will destroy the information; and (2) the allegation that Apple profits by selling devices with the facial recognition software. Judge Rosenstengel allowed the allegation that Apple violated BIPA by collecting the biometric information to proceed in federal court, concluding that the plaintiffs have standing because they allege Apple never obtained informed consent before collecting facial scans within the Photos app.

The Judge concluded that, at this stage of the litigation, questions of fact remained as to the data collected by the facial recognition technology, and that it is not yet clear whether BIPA applies. Specifically, she pointed out that BIPA “exhaustively” defines biometric information and that scans of face geometry are included in the definition. Taking all allegations as true and drawing all inferences in favor of the plaintiffs, the Judge wrote that the alleged violation “would create a concrete, particularized injury to plaintiffs, as their power to make informed decisions about the collection and storage of their biometric data has been eroded.”
