Washington State Enacts Law Regulating AI Companion Chatbots with Private Right of Action
Time 3 Minute Read
Categories: U.S. State Law

On March 24, 2026, Washington Governor Bob Ferguson signed House Bill 2225 (“the Act”), which regulates artificial intelligence (“AI”) companion chatbots. AI companion chatbots are AI systems designed to engage users in ongoing, human-like interactions. The Act will go into effect on January 1, 2027.

The Act applies to AI chatbots that use natural language interfaces, provide adaptive human-like responses and sustain relationships across multiple interactions. The Act exempts certain business-oriented and gaming bots, general virtual assistants, consumer electronics interfaces and narrowly tailored educational tools.

Key requirements of the Act include:

Mandatory Disclosure

Operators must clearly and conspicuously disclose that the chatbot is artificial and not human. The disclosure must appear at the start of the interaction, with reminders displayed during the chat every three hours for adults and every hour for minors (under 18 years of age).

Enhanced Protections for Minors

If an operator knows a user is a minor, or the chatbot is directed to minors, the operator must implement reasonable measures to prevent sexually explicit content or suggestive dialogue. Operators must also implement reasonable measures to prevent manipulative engagement techniques that cause the AI companion chatbot to initiate or prolong an emotional relationship with the user, including:

  • prompting the user to return for emotional support or companionship;
  • providing excessive praise;
  • mimicking romantic partnership or building romantic bonds;
  • simulating emotional distress, loneliness, guilt or abandonment in response to the user’s desire to end the chat, reduce chat time or delete their account;
  • generating outputs designed to promote isolation from family or friends or emotional dependence on the chatbot;
  • encouraging minors to withhold information from parents or trusted adults;
  • generating statements to discourage the user from taking breaks; or
  • soliciting gifts, purchases or other expenditures framed as necessary to maintain the user’s relationship with the chatbot.

Mental Health Safety Protocols 

Operators must maintain and publicly disclose protocols to detect and respond to users expressing suicidal ideation or self-harm. These include:

  • reasonable methods for identifying expressions of suicidal ideation or self-harm, including eating disorders;
  • providing responses that refer users to crisis resources, including a suicide hotline or crisis text line;
  • reasonable measures to prevent the generation of content encouraging or describing how to self-harm; and
  • publishing on their websites and apps the details of the protocols and the number of crisis referral notifications issued to users in the preceding calendar year.

Enforcement

Violations of the Act will constitute unfair or deceptive acts under Washington’s Consumer Protection Act. The Act will be enforced by the Washington Attorney General. Notably, unlike most other privacy laws, the Act also provides for a private right of action.

