As reported on the Hunton Employment & Labor Perspectives blog, on February 15, 2024, California lawmakers introduced AB 2930, a bill that seeks to regulate the use of artificial intelligence (“AI”) across various industries to combat “algorithmic discrimination.” The bill defines “algorithmic discrimination” as a “condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people” based on various protected characteristics, including actual or perceived race, color, ethnicity, sex, national origin, disability and veteran status.
Specifically, AB 2930 seeks to regulate “automated decision tools” that make “consequential decisions.” An “automated decision tool” is any system that uses AI and has been developed to make, or be a controlling factor in making, “consequential decisions.” A “consequential decision,” in turn, is defined as a decision or judgment that has a legal, material, or similarly significant effect on an individual’s life relating to the impact of, access to, or the cost, terms, or availability of, any of the following: 1) employment, including any decision regarding pay or promotion, hiring or termination, and automated task allocation; 2) education; 3) housing or lodging; 4) essential utilities; 5) family planning; 6) adoption services, reproductive services or assessments related to child protective services; 7) health care or health insurance; 8) financial services; 9) the criminal justice system; 10) legal services; 11) private arbitration; 12) mediation; and 13) voting.
AB 2930 aims to prevent algorithmic discrimination through impact assessments, notice requirements, governance programs, policy disclosure requirements and civil liability provisions.
Impact Assessments
By January 1, 2026, any employer or developer using or developing an automated decision tool will be required to perform annual impact assessments. The annual impact assessment requirements are largely the same for employers and developers and include, among other things, a statement of purpose for the automated decision tool; descriptions of the automated decision tool’s outputs and how they are used in making a consequential decision; and an analysis of potential adverse impacts. Employers, but not developers, are additionally required to: 1) describe the safeguards in place to address reasonably foreseeable risks of algorithmic discrimination, and 2) provide a statement of the extent to which the employer’s use of the automated decision tool is consistent with or varies from the developer’s statement of the intended use of the tool (which developers must provide under Section 22756.3 of the proposed bill). Employers with fewer than 25 employees will not be required to perform this assessment unless the automated decision tool impacted more than 999 people in the calendar year.
Notice Requirements
Employers using automated decision tools are required to notify any person subject to a consequential decision that an automated decision tool is being used to make that decision. The notice must include: 1) a statement of the purpose of the automated decision tool; 2) contact information for the employer; and 3) a plain language description of the automated decision tool. Additionally, if the consequential decision is made solely based on the output of the automated decision tool, the employer is required to, if technically feasible, accommodate a person’s request to be subject to an alternative selection process.
Governance Programs
Employers using automated decision tools are required to establish a governance program to address any reasonably foreseeable risks of algorithmic discrimination associated with the use of an automated decision tool. The governance program must, among other things, designate at least one employee responsible for overseeing and maintaining the governance program and compliance with AB 2930; implement safeguards to address reasonably foreseeable risks of algorithmic discrimination; conduct an annual and comprehensive review of policies, practices and procedures to ensure compliance with AB 2930; and maintain the results of impact assessments for at least two years. Employers with fewer than 25 employees will not be required to establish a governance program unless the automated decision tool impacted more than 999 people in the calendar year.
Policy Disclosure Requirements
Employers and developers using or developing automated decision tools are also required to make publicly available a clear policy summarizing both of the following: 1) the types of automated decision tools currently in use; and 2) how the employer or developer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of those tools.
Civil Liability
A person may bring a civil action against an employer for violating AB 2930 if the person can demonstrate that the automated decision tool caused actual harm to the person. A prevailing plaintiff may recover compensatory damages, declaratory relief, and reasonable attorney’s fees. Public attorneys, including district attorneys and city prosecutors, may also bring civil actions against employers for violating AB 2930.