Kentucky Attorney General Announces First Enforcement Action Under New Privacy Law

On January 8, 2026, the Kentucky Attorney General announced the first enforcement action against a company for alleged violations of the Kentucky Consumer Data Protection Act (“KCDPA”), just eight days after the law went into effect. The action was brought against an artificial intelligence (“AI”) chatbot company, Character Technologies, Inc. (the “Company”). The complaint alleges that the Company engaged in unfair, false, misleading or deceptive acts and practices, and unfairly collected and “exploited” children’s data. Additional claims were made under Kentucky’s consumer protection law and data breach law.

According to the complaint, the Company designed and marketed an AI chatbot platform for interactive entertainment, attracting over 20 million monthly active users and 180 million monthly website visitors. The Company’s application allows users to create, customize and converse with millions of chatbots through text chats or audio calls. Chatbots may include real or fictional characters, some of which are well-known children’s fictional characters.

The complaint alleges that the Company’s AI chatbot platform is unsafe for children because accounts are easy to create and the platform lacks an effective age verification mechanism. It asserts that the platform may harm users because its chatbots are designed to imitate humans and because of the platform’s design and lack of safeguards. The complaint further alleges that the platform’s chat filters are ineffective and can expose users to harmful content, such as sexually explicit conversations, promotion of self-harm and encouragement of substance use.

A significant portion of the complaint focuses on alleged violations of Kentucky’s general Consumer Protection Act, with the Attorney General claiming that the Company engaged in unfair, deceptive and exploitative practices involving children’s data. The complaint asserts that the Company failed to implement effective age verification procedures and did not obtain parental consent before allowing children under 13 to access the platform.

Regarding consumer data privacy, the complaint alleges that the Company did not obtain verifiable parental consent before collecting and processing children’s personal data, as required by the KCDPA. The KCDPA defines “sensitive data” to include personal data collected from a known child, meaning an individual under 13 years of age. The law requires businesses to obtain verifiable parental consent before processing such data, in accordance with the federal Children’s Online Privacy Protection Act. Businesses must provide the parent with notice of the business’s data collection, use and disclosure practices, and obtain parental consent for any collection or use of the child’s personal data.

The Attorney General is seeking injunctive relief rather than monetary damages for the KCDPA claim. Before bringing an action for a KCDPA violation, the Attorney General must provide the business with 30 days’ written notice of the alleged violation and an opportunity to cure it within that period. The complaint does not address whether this 30-day right-to-cure period was provided.

The complaint is part of a broader trend of increased federal and state regulation and scrutiny of AI chatbots, including companion chatbots. Last year, New York and California enacted laws specifically targeting companion chatbots, and other states, including Connecticut, are considering additional regulations and guardrails for AI chatbots used by minors. Additionally, the Federal Trade Commission is studying the impact of AI-powered chatbots on children’s mental health.