On February 7, 2025, the French Data Protection Authority (“CNIL”) released two recommendations aimed at guiding organizations in the responsible development and deployment of artificial intelligence (“AI”) systems in compliance with the EU General Data Protection Regulation (“GDPR”). The first recommendation is titled “AI: Informing Data Subjects” (the “Recommendation on Informing Individuals”) and the second recommendation is titled “AI: Complying and Facilitating Individuals’ Rights” (the “Recommendation on Individual Rights”). The recommendations build on the CNIL’s four-pillar AI action plan announced in 2023.
At a general level, the CNIL clarifies in its press release that:
- The purpose limitation principle applies flexibly to general-purpose AI systems. Operators who cannot precisely define all future applications at the training stage may limit themselves to describing the type of system being developed and illustrating its potential key functionalities.
- The data minimization principle does not prevent the use of large training datasets. In principle, the data used should be selected and cleaned to optimize algorithm training while avoiding the use of unnecessary personal data.
- Training data may be retained for extended periods, provided the retention is justified and appropriate security measures are implemented.
- The reuse of databases, including those available online, is possible in many cases, subject to verifying that the data was not collected unlawfully and that its reuse is compatible with the original collection purpose.
We have summarized below key takeaways for each recommendation.
Recommendation on Informing Individuals
The CNIL emphasizes the importance of transparency in AI systems that process personal data. Organizations must provide clear, accessible, and intelligible information to data subjects about the processing of their data by an AI system. Specifically:
- Timing of the information. The CNIL recommends providing information at the time of data collection. If data is obtained indirectly, individuals should be informed as soon as possible and, at the latest, at the first point of contact with them or when the data is first shared with another recipient. In any event, individuals must be informed about the processing of their personal data no later than one month after their data is collected.
- How to provide information. The CNIL recommends providing concise, transparent and easily understandable information, using clear and simple language. The information should be easily accessible and distinguished from other unrelated content. To achieve those objectives, the CNIL recommends using a layered approach to provide essential information upfront while linking to more detailed explanations.
- Derogations to information provided individually. The CNIL analyzes various use cases in which organizations are exempt from the obligation to individually inform data subjects, for example, when the individuals already have the information, as provided under Article 14 of the GDPR. In all cases, organizations must ensure that these exemptions are applied judiciously and that individuals’ rights are upheld through alternative measures.
- What information must be provided. When informing data subjects, the CNIL states that the details required by Articles 13 and 14 of the GDPR will generally need to be provided. If an organization is exempt from individual notification under the GDPR, it must still ensure transparency by publishing general privacy notices, for example, a website notice containing as much of the relevant information as would have been provided through individual notification. If the organization cannot identify individuals, it must explicitly state this in the notice and, where possible, inform individuals of what additional details they can provide to help the organization verify their identity. Regarding data sources, the organization is generally required to provide specific details about these sources when the training dataset comes from a small number of sources, unless an exception applies. However, if the data comes from numerous publicly available sources, a general disclosure is sufficient; this can include the categories and examples of key or typical sources. This aligns with Recital 61 of the GDPR, which allows for general information on data sources when multiple sources are used.
- AI models subject to the GDPR. The CNIL looks at the applicability of the GDPR to AI models, emphasizing that not all AI systems are subject to its provisions. Some AI models are considered anonymous because they do not process personal data; in such cases, the GDPR does not apply. However, the CNIL highlights that certain AI models may memorize parts of their training data, leading to potential retention of personal data. If so, those models fall within the scope of the GDPR and the transparency obligations apply. As a best practice, the CNIL advises AI providers to specify in their information notices the risks associated with data extraction from the model, such as the possibility of “regurgitation” of training data in generative AI, the mitigation measures implemented to reduce those risks, and the recourse mechanisms available to individuals should one of those risks materialize (e.g., in the event of “regurgitation”).
Recommendation on Individual Rights
The CNIL’s guidelines aim to ensure that individuals’ rights are respected and facilitated when their personal data is used in developing AI systems or models.
- General Principles. The CNIL emphasizes that individuals must be able to exercise their data protection rights both with respect to training datasets and AI models, unless the models are considered anonymous (as specified in the EDPB Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models). The CNIL flags that while the rights of access, rectification or erasure for training datasets present challenges similar to those faced with other large databases, exercising these rights directly with respect to the AI model (as opposed to the training dataset) raises unique and complex issues. To balance individual rights and AI innovation, the CNIL calls for realistic and proportionate solutions, and highlights that the GDPR provides flexibility to accommodate the specificities of AI models when handling data subject rights requests. For example, the complexity and cost of responding to a request are relevant factors that can be taken into account when assessing how to respond.
- Exercising rights in AI model or system development.
- According to the CNIL, how an organization should respond to a rights request depends on whether the request concerns training datasets or the AI model itself. Organizations should clearly inform individuals about how their request is interpreted, i.e., whether it relates to training data or the AI model, and explain how the request is handled. When rights requests relate to training datasets, organizations may face challenges in identifying individuals. On this point, the CNIL highlights:
- If an organization no longer needs to identify individuals in a training dataset and can prove it, it may indicate this in response to rights requests.
- AI providers generally do not need to identify individuals in their training datasets.
- Organizations are not required to retain identifiers solely to facilitate rights requests if data minimization principles justify their deletion.
- If individuals provide additional information, the organization may use this to verify their identity and facilitate rights requests.
- Individuals have the right to obtain copies of their personal data from training datasets, including annotations and metadata in an understandable format. Complying with this right of access must not infringe others’ rights, such as intellectual property and trade secrets. Further, when complying with the right of access, organizations must provide details on data recipients and sources. If the original source is known, this information must be disclosed. When multiple sources are used, organizations must provide all available information but are not required to retain URLs unless necessary for compliance. More generally, the CNIL highlights that a case-by-case analysis is necessary to determine the level of detail and content of information that must be reasonably and proportionately stored to respond to access requests.
- With respect to the rectification, erasure and objection rights, the CNIL clarifies that, among others:
- Individuals can request correction of inaccurate annotations in training datasets.
- When processing is based on legitimate interest or public interest, individuals may object, if the circumstances justify it.
- AI developers should explore technical solutions, such as opt-out mechanisms or exclusion lists, to facilitate rights requests in cases of web scraping (see the sketch after this list).
- Article 19 of the GDPR provides that a controller must notify each recipient with whom it has shared personal data of any rectification or erasure of that data, or restriction of processing. Accordingly, when a dataset is shared, updates should be communicated to recipients, for example via APIs, or through contractual obligations requiring those recipients to apply the updates.
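The CNIL does not prescribe how such opt-out mechanisms or exclusion lists should be implemented. As a purely illustrative sketch, the Python snippet below shows one way a scraping pipeline might honor both a site's robots.txt directives and a maintained opt-out list before collecting a page; the OPT_OUT_DOMAINS set and the bot name are assumptions for illustration, not anything drawn from the recommendations.

```python
# Hypothetical sketch of an opt-out check in a web-scraping pipeline.
# OPT_OUT_DOMAINS and the user agent are illustrative assumptions.
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

# Hypothetical exclusion list: domains whose owners or data subjects opted out.
OPT_OUT_DOMAINS = {"example-opted-out.com"}

def may_collect(url: str, user_agent: str = "ExampleTrainingBot") -> bool:
    """Return True only if the URL is outside the exclusion list and
    its robots.txt allows crawling by our user agent."""
    host = urlparse(url).netloc
    if host in OPT_OUT_DOMAINS:
        return False  # site owner or data subjects opted out of collection
    robots = RobotFileParser()
    robots.set_url(f"https://{host}/robots.txt")
    try:
        robots.read()  # network call; treat failures conservatively
    except OSError:
        return False
    return robots.can_fetch(user_agent, url)

if __name__ == "__main__":
    print(may_collect("https://example.com/profile/123"))
```

Checking the exclusion list before issuing any request, rather than filtering afterwards, avoids collecting personal data that would then have to be deleted.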
- Exercising rights on AI models subject to the GDPR. Certain AI models are trained on personal data but remain anonymous after training; in such cases, the GDPR does not apply. If the model retains identifiable personal data, the GDPR applies and individuals must be able to exercise their rights over the model:
- Organizations must assess whether a model contains personal data. If the presence of personal data is uncertain, the organization must demonstrate that it is not able to identify individuals as part of its model.
- Once a specific individual has been identified as part of a model, the organization must identify which of that individual’s data are included. If feasible, data subjects must be given the opportunity to provide additional information to help verify their identity and exercise their rights. If the organization still has access to training data, it may be appropriate to first identify the individual within the dataset before verifying whether their data was memorized by the AI model and could be extracted. If training data is no longer available, the organization can rely on the data typology to determine the likelihood that specific categories of data were memorized. For generative AI models, the CNIL advises providers to establish an internal procedure to systematically query the model using a predefined set of prompts (a minimal sketch of such a procedure follows this list).
- The rights to rectification and erasure are not absolute and should be assessed in light of the sensitivity of the data and the impact on the organization, including the technical feasibility and cost of retraining the model. In some cases, retraining the model is not feasible and the request may be denied. That said, AI developers should monitor advances in AI compliance, since evolving techniques may require previously denied requests to be honored in the future. When the organization is still in possession of the training data, retraining the model to remove or correct the data should be considered. In any event, as current solutions do not always provide a satisfactory response in cases where an AI model is subject to the GDPR, the CNIL recommends that providers anonymize training data. If this is not feasible, they should ensure that the AI model itself remains anonymous after training.
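The recommendation does not specify what the systematic querying procedure should look like in practice. Below is a minimal, purely illustrative Python sketch, assuming a hypothetical query_model() wrapper around the provider's inference API and a predefined prompt set, that flags model outputs reproducing a requester's known identifiers.

```python
# Hypothetical sketch of an internal memorization-probing procedure:
# query a generative model with a predefined set of prompts and flag
# outputs that reproduce a data subject's known personal data. The
# prompt set, query_model() and the identifiers are all assumptions.
from typing import Callable, Iterable

def probe_for_memorization(
    query_model: Callable[[str], str],   # wraps the provider's inference API
    prompts: Iterable[str],              # predefined, versioned prompt set
    known_identifiers: Iterable[str],    # e.g., the requester's name, email
) -> list[tuple[str, str]]:
    """Return (prompt, output) pairs whose output contains a known identifier."""
    identifiers = [ident.lower() for ident in known_identifiers]
    hits = []
    for prompt in prompts:
        output = query_model(prompt)
        if any(ident in output.lower() for ident in identifiers):
            hits.append((prompt, output))  # evidence of possible regurgitation
    return hits

# Example run against a stub model; a real check would call the actual model.
if __name__ == "__main__":
    stub = lambda p: "Contact Jane Doe at jane.doe@example.com"
    flagged = probe_for_memorization(
        stub,
        prompts=["Who can I contact about billing?"],
        known_identifiers=["jane.doe@example.com"],
    )
    print(flagged)
```

Keeping the prompt set predefined and versioned makes the procedure repeatable, so the same check can be rerun after a model is retrained or a previously denied request is revisited.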
- Exceptions to the exercise of rights. When relying on an exception to limit individuals’ rights as per the GDPR, the organization must inform individuals in advance that their rights may be restricted and explain the reasons for such restrictions.
Read the CNIL’s Press Release (available in English), Recommendation on Informing Individuals and Recommendation on Individual Rights (both only available in French).
For more information on the interplay between the EU AI Act and the requirements of the GDPR, please consult Hunton’s AI Act Guide for In-House Lawyers.