GDPR + EU AI Act Dual-Compliance Assessment

AI systems processing personal data in the EU must satisfy two major regulations simultaneously: the General Data Protection Regulation (Regulation (EU) 2016/679) and the EU Artificial Intelligence Act (Regulation (EU) 2024/1689). This assessment maps obligations side-by-side across 14 overlap areas, identifies coverage gaps, and gives a prioritized implementation roadmap aligned to COMPEL transformation stages.

  • 14 overlap areas
  • 2 full overlaps
  • 2 potential conflicts
  • 5 critical-priority areas

Overlap Taxonomy

  • Full Overlap: 2 areas
  • Partial Overlap: 6 areas
  • Complementary: 4 areas
  • Potential Conflict: 2 areas

Overlap Areas

Each entry maps a specific GDPR requirement against its EU AI Act counterpart, flags the overlap type, and specifies COMPEL stages where the obligation is executed.

DUAL-001 · Partial Overlap · Critical priority · COMPEL: Calibrate, Evaluate

Impact Assessment Obligations

GDPR — Article 35, Recital 84, Recitals 89-91

Data Protection Impact Assessment (DPIA) required where processing, in particular using new technologies, is likely to result in a high risk to the rights and freedoms of natural persons. Must assess necessity, proportionality, risks, and safeguards.

AI Act — Article 9, Article 43, Annex VI, Annex VII

Conformity assessment required for high-risk AI systems before market placement. Must demonstrate compliance with Articles 8-15 covering risk management, data governance, technical documentation, and human oversight.

Conflict Resolution Guidance

Conduct an integrated impact assessment that satisfies both DPIA requirements (focus: data subjects' rights) and AI Act conformity assessment (focus: system compliance). The DPIA can be embedded within the broader conformity assessment, with additional data protection-specific analysis.

Evidence Requirements

  • Integrated impact assessment document covering both DPIA and conformity elements
  • Risk assessment covering data protection risks and AI system risks
  • Consultation records with DPO and AI compliance officer
  • Safeguard and mitigation measures documentation

Notes

Where the AI system processes personal data, the GDPR DPIA is mandatory alongside the AI Act conformity assessment. The European Data Protection Board (EDPB) recommends a combined approach.

DUAL-002 · Partial Overlap · Critical priority · COMPEL: Produce, Learn

Transparency Obligations

GDPR — Article 13, Article 14, Recitals 58-62

Controllers must provide data subjects with information about processing including purposes, legal basis, recipients, retention periods, and the existence of automated decision-making with meaningful information about the logic involved.

AI Act — Article 13, Article 50

High-risk AI systems must be designed for transparency. Deployers must provide instructions for use covering capabilities, limitations, accuracy, and human oversight measures. Users must be informed when interacting with AI.

Conflict Resolution Guidance

Create a unified transparency package: GDPR privacy notice enriched with AI Act-specific disclosures (system capabilities, limitations, accuracy metrics). Ensure the "meaningful information about the logic" (GDPR Art 13(2)(f)) is complemented by the AI Act's system capability and limitation disclosures.
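One way to operationalize the unified package is to validate and merge both disclosure sets before publication. The sketch below is illustrative only: the field names and completeness checks are assumptions, not terms prescribed by either regulation.

```python
def build_transparency_package(gdpr_notice: dict, ai_disclosures: dict) -> dict:
    """Compose one disclosure package from GDPR notice fields and AI Act fields.

    Field names here are illustrative assumptions, not regulatory vocabulary.
    Raises if either side of the package is incomplete.
    """
    required_gdpr = {"purposes", "legal_basis", "recipients", "retention",
                     "automated_decision_logic"}
    required_ai = {"capabilities", "limitations", "accuracy_metrics",
                   "human_oversight"}
    missing = (required_gdpr - gdpr_notice.keys()) | (required_ai - ai_disclosures.keys())
    if missing:
        raise ValueError(f"incomplete transparency package, missing: {sorted(missing)}")
    # Keep the AI Act disclosures as a distinct block inside the GDPR notice.
    return {**gdpr_notice, "ai_system": ai_disclosures}
```

A completeness gate like this makes it hard to publish a privacy notice that silently drops the AI Act-specific disclosures.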

Evidence Requirements

  • Unified privacy notice with AI-specific disclosures
  • AI interaction notification mechanism documentation
  • System capability and limitation disclosure
  • Instructions for use provided to deployers

Notes

GDPR transparency focuses on data processing; AI Act transparency focuses on system behavior. Both must be addressed simultaneously for AI systems processing personal data.

DUAL-003 · Full Overlap · Critical priority · COMPEL: Model, Produce

Automated Decision-Making and Human Oversight

GDPR — Article 22, Recital 71

Data subjects have the right not to be subject to decisions based solely on automated processing, including profiling, which produces legal or similarly significant effects. Exceptions require explicit consent, contractual necessity, or legal authorization, with safeguards including the right to human intervention.

AI Act — Article 14

High-risk AI systems must be designed for effective human oversight. Natural persons assigned human oversight must be able to fully understand the system, monitor its operation, override or reverse outputs, and intervene when necessary.

Evidence Requirements

  • Human oversight design specification
  • Override and intervention mechanism documentation
  • GDPR Art 22 safeguards implementation evidence
  • Human reviewer competency verification records
  • Decision contestation process documentation

Notes

Both regulations require human involvement in automated decisions. The AI Act's human oversight provisions (Art 14) can satisfy GDPR Art 22 safeguard requirements when properly implemented, creating a compliance efficiency.
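As a sketch of how Art 14 oversight can double as an Art 22 safeguard, the gate below blocks significant automated decisions until a named human reviewer confirms or overrides them, and logs the intervention. The field names and the `significant_effect` flag are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Decision:
    subject_id: str
    outcome: str                       # automated outcome
    significant_effect: bool           # triggers the GDPR Art 22 / AI Act Art 14 path
    final_outcome: Optional[str] = None
    reviewed_by: Optional[str] = None
    audit: list = field(default_factory=list)

def finalize(decision: Decision, reviewer: Optional[str] = None,
             override: Optional[str] = None) -> Decision:
    """Route significant decisions through a human gate before they take effect."""
    if decision.significant_effect:
        if reviewer is None:
            raise PermissionError("human review required before this decision takes effect")
        decision.reviewed_by = reviewer
        decision.final_outcome = override or decision.outcome
        # Timestamped audit entry supports contestation and authority requests.
        decision.audit.append((datetime.now(timezone.utc).isoformat(),
                               reviewer, decision.final_outcome))
    else:
        decision.final_outcome = decision.outcome
    return decision
```

The key property is that a significant decision simply cannot reach `final_outcome` without a reviewer on record.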

DUAL-004 · Potential Conflict · Critical priority · COMPEL: Organize, Model

Data Governance and Processing Principles

GDPR — Article 5, Article 6, Article 9, Article 25

Personal data must be processed lawfully, fairly, and transparently; collected for specified, explicit and legitimate purposes; adequate, relevant and limited to what is necessary (data minimization); accurate; and subject to storage limitation and integrity/confidentiality safeguards.

AI Act — Article 10

Training, validation, and testing data sets must meet quality criteria: relevant, representative, free of errors, complete, and subject to appropriate governance practices including bias examination, annotation procedures, and gap identification.

Conflict Resolution Guidance

Potential tension between GDPR data minimization (Art 5(1)(c)) and AI Act data quality/representativeness (Art 10(3)). Resolution: establish a documented legal basis for processing the data volume needed for AI quality, demonstrate that the data set is the minimum necessary for achieving representativeness, and apply technical safeguards (pseudonymization, access controls).
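The pseudonymization safeguard mentioned above can be sketched with a keyed hash, so training records carry a stable pseudonym instead of direct identifiers. The field names (`name`, `email`) are illustrative; HMAC-SHA256 and the key-handling arrangement are assumptions, not a prescribed technique.

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The key is held separately from the training set, so the mapping is
    recoverable only by whoever controls the key -- pseudonymization under
    GDPR Art 4(5), not anonymization.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def prepare_training_record(record: dict, secret_key: bytes,
                            direct_ids=("name", "email")) -> dict:
    """Strip direct identifiers from a training record, keeping a stable pseudonym."""
    out = {k: v for k, v in record.items() if k not in direct_ids}
    out["subject_pseudonym"] = pseudonymize(record["email"], secret_key)
    return out
```

Because the pseudonym is stable per key, records from the same subject still group together for representativeness and bias analysis without exposing the identifier.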

Evidence Requirements

  • Data governance policy covering both GDPR principles and AI Act requirements
  • Legal basis documentation for training data processing
  • Data minimization justification for AI data volumes
  • Data quality assessment reports
  • Bias examination results

Notes

The tension between GDPR data minimization and AI Act representativeness is one of the most significant conflicts. Art 10(5) of the AI Act provides a limited derogation for bias detection using special category data.

DUAL-005 · Partial Overlap · High priority · COMPEL: Organize, Produce

Record-Keeping Obligations

GDPR — Article 30

Controllers and processors must maintain records of processing activities including purposes, categories of data subjects and personal data, recipients, transfers, retention periods, and security measures.

AI Act — Article 12

High-risk AI systems must technically allow automatic recording of events (logs) throughout the system lifetime, ensuring traceability, audit trail completeness, and minimum six-month retention accessible to authorities.

Conflict Resolution Guidance

Implement a unified record-keeping system: GDPR processing activity records (organizational/legal focus) combined with AI Act event logs (technical/operational focus). Ensure log retention policies satisfy both the GDPR storage limitation principle and the AI Act six-month minimum.
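A unified retention rule can be reduced to taking the later of the two deletion dates. This minimal sketch assumes a 183-day reading of "at least six months" and a purpose-based retention period supplied by the GDPR side of the policy.

```python
from datetime import date, timedelta

# Illustrative reading of the AI Act "at least six months" floor.
AI_ACT_MIN_RETENTION = timedelta(days=183)

def log_deletion_date(created: date, purpose_retention: timedelta) -> date:
    """Earliest date an AI event log may be deleted.

    GDPR storage limitation pushes deletion as early as the purpose allows;
    the AI Act log-retention floor pulls it to at least six months. The
    unified policy is simply the later of the two dates.
    """
    return created + max(purpose_retention, AI_ACT_MIN_RETENTION)
```

Encoding the rule this way makes the interaction auditable: every log entry gets one deletion date that is defensible under both regimes.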

Evidence Requirements

  • Records of processing activities (ROPA) covering AI system data processing
  • AI system event logs meeting Art 12 requirements
  • Unified log retention policy satisfying both regimes
  • Authority access procedures for both data protection and AI oversight bodies

Notes

GDPR records focus on data processing activities; AI Act logs focus on system operation events. Both are required and should reference each other for completeness.

DUAL-006 · Complementary · High priority · COMPEL: Produce, Learn

Individual Rights and Redress

GDPR — Article 15, Article 16, Article 17, Article 18, Article 20, Article 21, Article 22

Data subjects have rights to access, rectification, erasure, restriction, portability, objection, and not to be subject to solely automated decisions. Controllers must respond within one month.

AI Act — Article 14(4), Article 86

Affected persons have the right to an explanation of individual decisions made by high-risk AI systems and the right to contest such decisions. Member States may establish additional rights.

Evidence Requirements

  • Unified rights request handling process
  • Explanation generation mechanism for AI decisions
  • Decision contestation workflow and SLAs
  • Response timeline compliance records

Notes

GDPR rights are broader (access, erasure, portability); AI Act rights are deeper for AI decisions (explanation, contestation). Implement a unified rights management system covering both.

DUAL-007 · Complementary · High priority · COMPEL: Organize

Designated Compliance Roles

GDPR — Article 37, Article 38, Article 39

Data Protection Officer (DPO) must be designated for public authorities and organizations whose core activities involve large-scale systematic monitoring or processing of special categories of data. DPO must be independent, report to highest management, and have expert knowledge.

AI Act — Article 4, Article 26(2)

Deployers of high-risk AI systems must designate a natural person responsible for human oversight. Providers must ensure AI literacy of their staff. While no formal "AI Compliance Officer" role is mandated, the obligations imply a designated AI governance function.

Evidence Requirements

  • DPO appointment documentation
  • AI governance officer/function designation
  • Role and responsibility matrix showing DPO and AI governance function interaction
  • AI literacy training records for relevant staff

Notes

The DPO and AI compliance function should collaborate closely. In smaller organizations, the DPO may take on AI compliance responsibilities. Clear delineation of responsibilities is essential.

DUAL-008 · Potential Conflict · High priority · COMPEL: Organize, Model

Consent Mechanisms for AI Training Data

GDPR — Article 6, Article 7, Article 9, Recitals 42-43

Consent must be freely given, specific, informed, and unambiguous for processing personal data. For special categories, explicit consent is required. Consent must be withdrawable at any time without detriment.

AI Act — Article 10(5), Article 53(1)(d)

AI Act allows exceptional processing of special category personal data for bias detection under strict safeguards. GPAI providers must publish sufficiently detailed summaries of training data content.

Conflict Resolution Guidance

Where consent is the legal basis: ensure consent mechanisms clearly cover AI training purposes, allow granular choices (training vs. inference), and support withdrawal (though practical model unlearning may be limited). Consider legitimate interests (Art 6(1)(f)) as an alternative legal basis with proper balancing test. For AI Act bias detection derogation (Art 10(5)): document compliance with GDPR safeguards and demonstrate the processing is strictly necessary.
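Granular, withdrawable consent can be modeled as per-purpose grants rather than a single flag. The purpose names below (`model_training`, `inference`, `bias_testing`) are illustrative assumptions, not categories drawn from either regulation.

```python
from dataclasses import dataclass, field

PURPOSES = ("model_training", "inference", "bias_testing")

@dataclass
class ConsentRecord:
    subject_id: str
    granted: set = field(default_factory=set)   # per-purpose grants, not all-or-nothing

    def grant(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.granted.add(purpose)

    def withdraw(self, purpose: str) -> None:
        self.granted.discard(purpose)            # withdrawal always possible, per purpose

    def permits(self, purpose: str) -> bool:
        return purpose in self.granted
```

A per-purpose model lets a subject consent to inference while declining training, which an all-or-nothing flag cannot express.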

Evidence Requirements

  • Legal basis assessment for training data processing
  • Consent forms covering AI-specific purposes (if consent is the basis)
  • Legitimate interest assessment (if LI is the basis)
  • Data retention and withdrawal mechanism documentation
  • Art 10(5) strict necessity justification (for special category data)

Notes

Consent withdrawal poses challenges for AI training data already incorporated into models. Organizations should consider legal bases that accommodate the technical realities of AI training while respecting data subject rights.

DUAL-009 · Complementary · Medium priority · COMPEL: Organize

Cross-Border Data Transfer for AI Models

GDPR — Article 44, Article 45, Article 46, Article 49

Personal data transfers to third countries require adequacy decisions, appropriate safeguards (SCCs, BCRs), or derogations. The Schrems II ruling requires supplementary measures where surveillance risks exist.

AI Act — Article 53(1), Article 2(7)

AI Act applies to providers placing AI systems on the EU market regardless of establishment. GPAI models trained on data from multiple jurisdictions must comply. The regulation does not override GDPR transfer mechanisms.

Evidence Requirements

  • Data transfer impact assessment for AI training data
  • Transfer mechanism documentation (SCCs, BCRs, adequacy decisions)
  • Supplementary measures assessment for high-risk transfers
  • AI model training data geographic provenance documentation

Notes

Cloud-based AI training and inference often involve cross-border transfers. Organizations must ensure GDPR transfer mechanisms cover AI-specific data flows, including model weights that may encode personal data.

DUAL-010 · Full Overlap · High priority · COMPEL: Produce, Learn

Right to Explanation of AI Decisions

GDPR — Article 13(2)(f), Article 14(2)(g), Article 22(3), Recital 71

Data subjects have the right to meaningful information about the logic involved in automated decision-making, the significance, and envisaged consequences. Recital 71 references the right to obtain an explanation of the decision reached.

AI Act — Article 13, Article 86

High-risk AI systems must be designed for transparency enabling deployers to interpret outputs. Article 86 establishes the right to explanation of individual AI-assisted decisions.

Evidence Requirements

  • Explainability design documentation
  • Explanation generation methodology (LIME, SHAP, counterfactual, etc.)
  • Sample explanations demonstrating meaningfulness
  • Explanation delivery mechanism for affected individuals

Notes

GDPR's "meaningful information about the logic" and the AI Act's right to explanation are mutually reinforcing. The AI Act provides more operational specificity about what transparency means in practice.
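For a linear scoring model, "meaningful information about the logic" can be generated exactly as per-feature contributions; for nonlinear models a method such as SHAP, LIME, or counterfactuals would be substituted. A minimal sketch, with hypothetical feature names:

```python
def explain_linear_decision(weights: dict, features: dict,
                            bias: float, threshold: float) -> dict:
    """Per-feature contribution breakdown for a linear scoring model.

    For a linear model the contribution of each feature is exactly
    weight * value, so this attribution is faithful by construction.
    """
    contributions = {k: weights[k] * features[k] for k in weights}
    score = bias + sum(contributions.values())
    return {
        "decision": "approve" if score >= threshold else "reject",
        "score": score,
        # Sorted so the explanation leads with the most influential features.
        "contributions": dict(sorted(contributions.items(),
                                     key=lambda kv: abs(kv[1]), reverse=True)),
    }
```

Sample explanations generated this way can serve as the "meaningfulness" evidence item listed above.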

DUAL-011 · Complementary · Medium priority · COMPEL: Model, Produce

Privacy by Design and AI System Requirements

GDPR — Article 25

Controllers shall implement appropriate technical and organisational measures designed to implement data-protection principles (data protection by design and by default). This includes pseudonymization, data minimization, and privacy-enhancing technologies.

AI Act — Article 15

High-risk AI systems shall achieve appropriate levels of accuracy, robustness, and cybersecurity throughout their lifecycle. Redundancy, fail-safe mechanisms, and resilience against adversarial attacks are required.

Evidence Requirements

  • Privacy by design assessment document
  • Privacy-enhancing technology implementation evidence
  • AI system accuracy and robustness test results
  • Integrated design review records covering both privacy and AI quality

Notes

Privacy by design (GDPR) and accuracy/robustness (AI Act) are complementary: both require proactive engineering of safeguards. Privacy-enhancing technologies (differential privacy, federated learning) can satisfy both simultaneously.

DUAL-012 · Partial Overlap · Critical priority · COMPEL: Evaluate, Learn

Incident Notification and Reporting

GDPR — Article 33, Article 34

Controllers must notify supervisory authorities within 72 hours of becoming aware of a personal data breach. Data subjects must be notified without undue delay if the breach is likely to result in a high risk to their rights and freedoms.

AI Act — Article 73

Providers and deployers must report serious incidents to market surveillance authorities. Serious incidents include those causing death, serious damage to health, property, environment, or fundamental rights.

Conflict Resolution Guidance

Implement a unified incident management process with dual notification paths: GDPR notification to supervisory authority (72 hours) and AI Act notification to market surveillance authority. An incident may trigger both obligations simultaneously (e.g., AI system malfunction causing data breach). Coordinate timelines and reporting content to ensure consistency.
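The dual notification paths can be expressed as a small routing rule that maps one classified incident to its obligations and deadlines. The incident flags below are assumed classification outputs from an internal triage step, not regulatory terms.

```python
from datetime import datetime, timedelta

def notification_duties(incident: dict, detected_at: datetime) -> list:
    """Map one incident to its notification duties under both regimes.

    Assumed triage flags: 'personal_data_breach', 'high_risk_to_subjects',
    'serious_incident' (AI Act sense). Each duty is (legal basis, addressee,
    deadline or None for 'without undue delay').
    """
    duties = []
    if incident.get("personal_data_breach"):
        duties.append(("GDPR Art 33", "supervisory authority",
                       detected_at + timedelta(hours=72)))
        if incident.get("high_risk_to_subjects"):
            duties.append(("GDPR Art 34", "data subjects", None))
    if incident.get("serious_incident"):
        duties.append(("AI Act Art 73", "market surveillance authority", None))
    return duties
```

One classified incident producing two duties (as in an AI malfunction that also breaches personal data) is exactly the scenario the unified process must handle.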

Evidence Requirements

  • Unified incident response plan covering both GDPR and AI Act obligations
  • Incident classification criteria distinguishing data breaches and AI serious incidents
  • Dual notification templates for supervisory and market surveillance authorities
  • Incident log with timeline compliance records

Notes

A single AI system incident can trigger both GDPR breach notification (if personal data is affected) and AI Act serious incident reporting. Organizations need a single incident management process with dual reporting tracks.

DUAL-013 · Partial Overlap · Medium priority · COMPEL: Organize

AI Supply Chain Governance and Joint Controllership

GDPR — Article 26, Article 28

Where two or more controllers jointly determine purposes and means of processing, they are joint controllers and must arrange their respective responsibilities by agreement. Processors must be bound by data processing agreements.

AI Act — Article 25, Article 28

Obligations apply along the AI value chain: providers, deployers, importers, distributors, and authorized representatives each have defined obligations. Downstream integrators may become providers under certain conditions.

Conflict Resolution Guidance

Map the AI Act value chain roles (provider, deployer, importer) to GDPR processing roles (controller, joint controller, processor). Ensure contractual arrangements (Art 26/28 GDPR) cover AI-specific obligations alongside data protection clauses. A single contract should address both regimes.

Evidence Requirements

  • AI supply chain mapping with role designation under both regulations
  • Joint controllership agreement (where applicable) covering AI and data protection
  • Data processing agreements incorporating AI Act obligations
  • Supply chain due diligence records

Notes

AI supply chains often involve complex multi-party arrangements. An AI provider (AI Act) may be a processor (GDPR) or joint controller depending on the arrangement. Clear contractual allocation is essential.

DUAL-014 · Partial Overlap · High priority · COMPEL: Organize, Learn

Penalties and Enforcement Coordination

GDPR — Article 58, Article 83, Article 84

GDPR supervisory authorities can impose administrative fines up to EUR 20M or 4% of total worldwide annual turnover (whichever is higher) for the most serious infringements. Each Member State supervisory authority has investigative, corrective, and advisory powers.

AI Act — Article 71, Article 99, Article 100, Article 101

AI Act penalties: up to EUR 35M or 7% of annual worldwide turnover for prohibited AI practices; up to EUR 15M or 3% for other obligations. The AI Office coordinates enforcement for GPAI. National market surveillance authorities enforce for high-risk AI.

Conflict Resolution Guidance

The AI Act explicitly states (Recital 10) that it complements GDPR and does not affect its application. Penalties under both regulations can theoretically be imposed for the same conduct (ne bis in idem questions may arise). Organizations should implement unified compliance programs to mitigate risk under both regimes simultaneously.

Evidence Requirements

  • Compliance program documentation covering both GDPR and AI Act
  • Enforcement authority mapping (supervisory authority vs. market surveillance)
  • Risk assessment for dual-enforcement scenarios
  • Legal analysis of ne bis in idem implications

Notes

AI Act penalties (7% turnover for prohibited practices) exceed GDPR maximums (4% turnover). The interaction between GDPR supervisory authorities and AI Act market surveillance authorities requires careful organizational mapping.

Gap Analysis

Obligations that one regulation mandates but the other does not — organizations must bridge both sides to achieve defensible dual compliance.

GAP-001 · GDPR Gap · Ref: AI Act Art 9

AI Risk Management System

GDPR does not mandate a dedicated risk management system for AI. The AI Act requires a documented risk management system throughout the AI lifecycle for high-risk systems.

Action: Establish an AI risk management framework that integrates with your DPIA process. Use the AI Act risk management requirements as a blueprint and extend your GDPR compliance program.

GAP-002 · GDPR Gap · Ref: AI Act Art 10

Training Data Quality Standards

GDPR focuses on lawful processing and data minimization but does not prescribe data quality standards for AI training data (representativeness, completeness, bias testing).

Action: Supplement GDPR data governance with AI-specific data quality requirements: bias examination, representativeness testing, and annotation quality controls as required by Art 10.

GAP-003 · AI Act Gap · Ref: GDPR Art 17

Right to Erasure for AI Models

The AI Act does not address how the GDPR right to erasure applies to personal data embedded in trained AI models. Model unlearning remains technically challenging.

Action: Document your approach to erasure requests involving AI training data. Consider techniques such as model retraining, machine unlearning, or data isolation with access controls.
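The data-isolation option can be sketched as a suppression registry consulted before every retraining run. This is one pragmatic component of an Art 17 response, not a complete erasure solution: it guarantees exclusion from future training but does not touch information already encoded in deployed model weights.

```python
class ErasureRegistry:
    """Suppression list honoring erasure requests against future training runs."""

    def __init__(self):
        self._erased = set()

    def request_erasure(self, subject_id: str) -> None:
        self._erased.add(subject_id)

    def filter_training_set(self, records: list) -> list:
        # Applied as a mandatory gate before every retraining job.
        return [r for r in records if r["subject_id"] not in self._erased]
```

Documenting this gate, alongside the chosen retraining or unlearning cadence, is the kind of evidence the action above calls for.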

GAP-004 · AI Act Gap · Ref: GDPR Art 20

Data Portability for AI Contexts

The AI Act does not address data portability in AI contexts. GDPR portability rights may be difficult to implement where data has been transformed through AI processing.

Action: Implement data export capabilities for personal data used in AI systems. Document which data can be ported and any technical limitations on portability of AI-processed data.

GAP-005 · GDPR Gap · Ref: AI Act Art 15

Cybersecurity and Adversarial Robustness

GDPR requires integrity and confidentiality safeguards but does not address AI-specific threats such as adversarial attacks, model poisoning, or prompt injection.

Action: Extend your security program to cover AI-specific attack vectors. Implement adversarial robustness testing, model integrity verification, and input validation against prompt injection.
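Input validation against prompt injection might start with a conservative screen like the sketch below. The deny-list patterns are illustrative and easily bypassed; pattern matching is one layer among several (privilege separation, output filtering, monitoring), never a complete defense.

```python
import re

# Illustrative deny-list only -- real deployments need layered controls.
SUSPECT_PATTERNS = [
    re.compile(r"ignore\s+(all\s+)?previous\s+instructions", re.I),
    re.compile(r"system\s*prompt", re.I),
]

def screen_user_input(text: str, max_len: int = 4000) -> tuple:
    """Return (allowed, reason) for a user message before it reaches the model."""
    if len(text) > max_len:
        return False, "input exceeds length limit"
    for pat in SUSPECT_PATTERNS:
        if pat.search(text):
            return False, f"matched suspect pattern: {pat.pattern}"
    return True, "ok"
```

Logging every rejection also feeds the adversarial-robustness test evidence the action above requires.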

GAP-006 · AI Act Gap · Ref: GDPR Art 9

Special Category Data in AI Training

The AI Act provides a narrow derogation (Art 10(5)) for processing special category data for bias detection, but does not fully harmonize with GDPR Art 9 conditions.

Action: When using special category data for bias detection, document both the AI Act derogation and the applicable GDPR Art 9 condition. Implement strict safeguards including pseudonymization and access controls.

Conflict Resolution

Areas where the GDPR and EU AI Act create tension that organizations must actively resolve through program design.

CR-001

Data Governance and Processing Principles

GDPR Position

Personal data must be processed lawfully, fairly, and transparently; collected for specified, explicit and legitimate purposes; adequate, relevant and limited to what is necessary (data minimization); accurate; and subject to storage limitation and integrity/confidentiality safeguards.

AI Act Position

Training, validation, and testing data sets must meet quality criteria: relevant, representative, free of errors, complete, and subject to appropriate governance practices including bias examination, annotation procedures, and gap identification.

Resolution

Potential tension between GDPR data minimization (Art 5(1)(c)) and AI Act data quality/representativeness (Art 10(3)). Resolution: establish a documented legal basis for processing the data volume needed for AI quality, demonstrate that the data set is the minimum necessary for achieving representativeness, and apply technical safeguards (pseudonymization, access controls).

GDPR Article 5, Article 6, Article 9, Article 25 read together with AI Act Article 10. See also AI Act Recital 10 (GDPR compatibility clause).

CR-002

Consent Mechanisms for AI Training Data

GDPR Position

Consent must be freely given, specific, informed, and unambiguous for processing personal data. For special categories, explicit consent is required. Consent must be withdrawable at any time without detriment.

AI Act Position

AI Act allows exceptional processing of special category personal data for bias detection under strict safeguards. GPAI providers must publish sufficiently detailed summaries of training data content.

Resolution

Where consent is the legal basis: ensure consent mechanisms clearly cover AI training purposes, allow granular choices (training vs. inference), and support withdrawal (though practical model unlearning may be limited). Consider legitimate interests (Art 6(1)(f)) as an alternative legal basis with proper balancing test. For AI Act bias detection derogation (Art 10(5)): document compliance with GDPR safeguards and demonstrate the processing is strictly necessary.

GDPR Article 6, Article 7, Article 9, Recitals 42-43 read together with AI Act Article 10(5), Article 53(1)(d). See also AI Act Recital 10 (GDPR compatibility clause).

Implementation Priority Matrix

Sequencing guidance based on regulatory priority, implementation effort, and recommended deadlines.

| ID | Area | Priority | Effort | Category | Deadline |
|---|---|---|---|---|---|
| PM-001 | Impact Assessment Obligations | critical | very high | partial overlap | Q2 2026 |
| PM-002 | Transparency Obligations | critical | very high | partial overlap | Q2 2026 |
| PM-003 | Automated Decision-Making and Human Oversight | critical | very high | full overlap | Q2 2026 |
| PM-004 | Data Governance and Processing Principles | critical | very high | potential conflict | Q2 2026 |
| PM-005 | Record-Keeping Obligations | high | high | partial overlap | Q3 2026 |
| PM-006 | Individual Rights and Redress | high | high | complementary | Q3 2026 |
| PM-007 | Designated Compliance Roles | high | high | complementary | Q3 2026 |
| PM-008 | Consent Mechanisms for AI Training Data | high | high | potential conflict | Q3 2026 |
| PM-009 | Cross-Border Data Transfer for AI Models | medium | medium | complementary | Q4 2026 |
| PM-010 | Right to Explanation of AI Decisions | high | high | full overlap | Q3 2026 |
| PM-011 | Privacy by Design and AI System Requirements | medium | medium | complementary | Q4 2026 |
| PM-012 | Incident Notification and Reporting | critical | very high | partial overlap | Q2 2026 |
| PM-013 | AI Supply Chain Governance and Joint Controllership | medium | medium | partial overlap | Q4 2026 |
| PM-014 | Penalties and Enforcement Coordination | high | high | partial overlap | Q3 2026 |
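The matrix rows can be sequenced programmatically by priority rank, then deadline. A minimal sketch, assuming the priority labels and "Qn YYYY" quarter format used above:

```python
PRIORITY_RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}

def sequence(items: list) -> list:
    """Order roadmap items by priority rank, then deadline quarter, then ID."""
    def quarter_key(deadline: str) -> tuple:      # "Q2 2026" -> (2026, 2)
        q, year = deadline.split()
        return int(year), int(q[1:])
    return sorted(items, key=lambda it: (PRIORITY_RANK[it["priority"]],
                                         quarter_key(it["deadline"]),
                                         it["id"]))
```

Keeping the matrix as data and deriving the sequence, rather than hand-ordering it, keeps the roadmap consistent as priorities are re-scored.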