On June 7, 2024, following a public consultation, the French Data Protection Authority (the “CNIL”) published the final version of its guidelines addressing the development of AI systems from a data protection perspective (the “Guidelines”). Read our blog on the pre-public consultation version of these Guidelines.
In the Guidelines, the CNIL states that, in its view, the successful development of AI systems can be reconciled with the challenges of protecting privacy.
The Guidelines are divided into seven “AI how-to sheets” in which the CNIL seeks to guide organizations through the necessary steps to take in order to develop AI systems in a manner compatible with the GDPR. The “AI how-to sheets” provide guidance on: (1) determining the applicable legal regime (e.g., the GDPR or the Law Enforcement Directive); (2) defining a purpose; (3) determining the legal qualification of AI system providers (e.g., controller, processor or joint controller); (4) ensuring the lawfulness of the data processing; (5) carrying out a data protection impact assessment (“DPIA”) when necessary; (6) taking into account data protection when designing the AI system; and (7) taking into account data protection in data collection and management.
Noteworthy takeaways from the Guidelines include:
- In line with the GDPR, the purpose for processing personal data in the development of an AI system must be specific, explicit and legitimate. Additionally, under the principle of purpose limitation, the data must not be further processed in a manner incompatible with this initial purpose. The CNIL clarifies that where an AI system is developed for a single operational use, the purpose for processing personal data in the development phase is directly related to the purpose of processing in the deployment phase. Therefore, if the purpose in the deployment phase is specific, explicit and legitimate, the purpose in the development phase will also be considered sufficiently determined. However, for certain AI systems, such as general purpose AI systems, the operational use may not be clearly identifiable in the development phase. In this case, for the purpose of the processing to be deemed sufficiently precise, the data subject must be provided with clear and intelligible information on the type of system developed (e.g., a large language model), and the controller should assess the technically feasible functionalities and capabilities of the AI system (e.g., by drawing up a list of capabilities it can reasonably foresee at the development stage).
- The role of the parties involved in processing operations related to AI systems should be assessed on a case-by-case basis. However, the CNIL draws attention to certain elements that should be considered when carrying out this analysis. For example, a provider that initiates the development of an AI system and creates the training dataset from data it has selected on its own account should generally be considered a controller.
- Consent, legitimate interests, performance of a contract and public interest may all theoretically serve as legal bases for the development of AI systems. Legal obligation could also serve as a legal basis for the deployment of AI systems, but the CNIL considers it difficult to rely on this basis for development in most cases. Controllers must carefully assess the most appropriate legal basis for their specific case.
- DPIAs covering the processing of personal data for the development of AI systems must address AI-specific risks, such as the risk of automated discrimination caused by the AI system, the risks related to the confidentiality of data that could be extracted from the AI system, the risk of producing fictitious content about a real person, and the risks associated with known attacks specific to AI systems (e.g., data poisoning, insertion of a backdoor or model inversion).
- Involving an ethics committee in the development of AI systems is good practice to ensure that ethical issues and the protection of human rights and freedoms are taken into account upstream.
- Data minimization and data protection measures that have been implemented during data collection may become obsolete over time and must be continuously monitored and updated when required.
- Re-using datasets, particularly those publicly available on the Internet, to train AI systems is possible, provided that the data was lawfully collected and that the purpose of re-use is compatible with the original collection purpose.
In the coming months, the CNIL will supplement these Guidelines with further how-to sheets, including on the legal basis of legitimate interest, the management of data subject rights, the provision of information to data subjects, and annotation and security during the development phase.
Read the Guidelines and the press release.