Judge Approves $92 Million TikTok Settlement

On July 28, 2022, a federal judge approved TikTok’s $92 million class action settlement resolving privacy claims brought under state and federal law. The agreement resolves litigation dating to 2019 alleging that TikTok, owned by the Chinese company ByteDance, violated the Illinois Biometric Information Privacy Act (“BIPA”) and the federal Video Privacy Protection Act (“VPPA”) by improperly harvesting users’ personal data. U.S. District Court Judge John Lee of the Northern District of Illinois also awarded approximately $29 million in fees to class counsel.

The class action claimants alleged that TikTok violated BIPA by collecting users’ faceprints without their consent and violated the VPPA by disclosing personally identifiable information about the videos people watched. The settlement agreement also provides for several forms of injunctive relief, including:

  • Refraining from collecting and storing biometric information, collecting geolocation data and collecting information from users’ clipboards, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Not transmitting or storing U.S. user data outside of the U.S., unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • No longer pre-uploading U.S. user generated content, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Deleting all pre-uploaded user generated content from users who did not save or post the content; and
  • Training all employees and contractors on compliance with data privacy laws and company procedures.

