This article provides a governance-professional-level analysis of conformity assessment under Articles 43-49, covering the internal assessment pathway (Annex VI), the notified body pathway (Annex VII), the quality management system requirements (Article 17), the technical documentation requirements (Annex IV), and the practical considerations that determine pathway selection.
Pathway Determination
The first governance decision is which conformity assessment pathway applies. This determination flows directly from the classification analysis conducted in the previous article.
When Internal Assessment (Annex VI) Applies
Internal conformity assessment is the default pathway for most high-risk AI systems. It applies when:
- The AI system is classified as high-risk under Article 6(2) (Annex III categories), AND
- The AI system is not a safety component of a product requiring third-party assessment under Annex I Union harmonisation legislation, AND
- The provider has not voluntarily elected to involve a notified body
Under internal assessment, the provider conducts the conformity assessment using its own resources and personnel. No external certification body is required. However, the assessment must be conducted by personnel with appropriate competence and independence from the development team.
When Notified Body Assessment (Annex VII) Applies
Notified body assessment is required when:
- The AI system is classified as high-risk under Article 6(1) (Annex I product safety), AND
- The applicable Union harmonisation legislation requires third-party conformity assessment
Notified body assessment is also available as a voluntary option for providers of Annex III systems who wish to obtain third-party assurance.
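The pathway logic above can be sketched as a small triage helper. This is an illustrative simplification, not a legal decision tool; the field and function names are assumptions, and for Article 6(1) systems the applicable sectoral procedure ultimately governs.

```python
from dataclasses import dataclass

@dataclass
class SystemProfile:
    high_risk_under_6_1: bool             # Annex I product-safety route (Article 6(1))
    high_risk_under_6_2: bool             # Annex III category route (Article 6(2))
    annex_i_requires_third_party: bool    # sectoral legislation mandates third-party assessment
    provider_opts_for_notified_body: bool = False

def select_pathway(p: SystemProfile) -> str:
    """Return 'internal' (Annex VI) or 'notified_body' (Annex VII)."""
    if p.high_risk_under_6_1:
        # Annex I route: notified body only where the sectoral legislation
        # requires third-party assessment (a simplification; the sectoral
        # procedure governs in practice)
        return "notified_body" if p.annex_i_requires_third_party else "internal"
    if p.high_risk_under_6_2:
        # Annex III systems default to internal assessment unless the
        # provider voluntarily engages a notified body
        return "notified_body" if p.provider_opts_for_notified_body else "internal"
    raise ValueError("system is not high-risk; no conformity assessment pathway applies")

# An Annex III system with no voluntary notified-body involvement:
print(select_pathway(SystemProfile(False, True, False)))  # internal
```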
Practical Decision Factors
Beyond the legal determination, several practical factors influence pathway choice:
| Factor | Internal Assessment | Notified Body Assessment |
|---|---|---|
| Cost | Lower direct cost (internal resources) | Higher cost (notified body fees) |
| Timeline | Typically faster (2-4 weeks for assessment phase) | Longer (6-12 weeks including engagement, audit, and certification) |
| Credibility | Provider self-declaration | Third-party certification |
| Capacity constraint | Limited by internal competence | Limited by notified body availability |
| Customer expectation | May be sufficient for B2B deployers | May be expected for safety-critical products |
| Regulatory perception | Accepted for Annex III systems | May carry greater weight in enforcement |
Internal Conformity Assessment (Annex VI): Step-by-Step
Step 1: Verify Quality Management System Compliance
The assessor must verify that the provider’s QMS complies with Article 17. This requires examining:
QMS Policy and Objectives
- Is there a documented quality policy that references EU AI Act compliance?
- Are quality objectives defined, measurable, and tracked?
- Is the policy communicated to all relevant personnel?
Design, Development, and Verification Procedures
- Are there documented procedures for AI system design and development?
- Do procedures include specification of requirements, architecture design, algorithm selection, and model training?
- Are verification and validation procedures defined for each development stage?
Testing and Validation Procedures
- Are test strategies documented before testing begins?
- Do test procedures cover functional testing, performance testing, robustness testing, and bias testing?
- Are acceptance criteria defined and objectively measurable?
- Are test results recorded, reviewed, and retained?
Data Management Procedures
- Is there a documented data governance framework covering the AI system lifecycle?
- Do procedures cover data collection, preparation, annotation, quality assessment, and bias detection?
- Is data lineage traceable from source to model?
Risk Management Integration
- Is the risk management system (Article 9) integrated with the QMS?
- Do QMS procedures reference risk management outputs?
- Are risk management review records maintained?
Post-Market Monitoring
- Is there a documented post-market monitoring plan?
- Does the plan specify data collection methods, monitoring metrics, and review cadence?
- Are corrective action procedures defined for monitoring findings?
Incident Reporting
- Are incident classification criteria defined?
- Is the reporting procedure documented with timelines and responsible parties?
- Have incident reporting procedures been tested?
Record-Keeping and Accountability
- Are document control procedures in place?
- Is there a defined accountability framework with assigned roles?
- Are records retained for the required period (10 years per Article 18)?
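For tracking purposes, the eight checklist areas above can be reduced to a simple gap report. The area names and status vocabulary here are illustrative assumptions, not prescribed by Article 17:

```python
# Illustrative encoding of the Article 17 QMS verification checklist.
QMS_AREAS = [
    "policy_and_objectives",
    "design_development_verification",
    "testing_and_validation",
    "data_management",
    "risk_management_integration",
    "post_market_monitoring",
    "incident_reporting",
    "record_keeping_accountability",
]

def qms_gaps(findings: dict) -> list:
    """Return the areas that are missing or not yet marked compliant.

    `findings` maps an area name to "compliant", "non-compliant",
    or "observation" (an assumed status vocabulary).
    """
    return [area for area in QMS_AREAS if findings.get(area) != "compliant"]

findings = {area: "compliant" for area in QMS_AREAS}
findings["incident_reporting"] = "non-compliant"
print(qms_gaps(findings))  # ['incident_reporting']
```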
Step 2: Examine Technical Documentation
The assessor must examine the technical documentation (Annex IV) to assess whether the AI system complies with all relevant requirements. This examination covers:
General System Description
- Is the intended purpose clearly and completely described?
- Is the system’s interaction with hardware, software, and other systems documented?
- Are the versions of software and firmware identified?
- Are affected persons and use contexts described?
Development Process Description
- Are design specifications documented (including choices about algorithms, data, and architecture)?
- Is the training methodology described (objective function, optimisation, hyperparameters)?
- Are validation and testing methodologies described with results?
- Are the computational resources used documented?
Data Documentation
- Are training, validation, and testing data sets described?
- Are data governance measures documented?
- Is bias assessment conducted and documented?
- Are data gaps and limitations acknowledged?
Performance Metrics
- Are accuracy, precision, recall, and other relevant metrics declared?
- Are metrics disaggregated by relevant groups?
- Is the appropriateness of chosen metrics justified?
- Are validation results consistent with declared metrics?
Risk Management Documentation
- Does the risk management system documentation cover all Article 9 requirements?
- Are identified risks, treatment decisions, and residual risks documented?
- Is testing evidence for risk management measures available?
Step 3: Verify Design and Development Process Consistency
The assessor must verify that the design and development process and the post-market monitoring are consistent with the technical documentation. This means checking that:
- The system as built matches the system as documented
- Design decisions referenced in documentation are traceable to implementation
- Post-market monitoring procedures described in documentation are actually operational
- Changes made during development are reflected in updated documentation
Step 4: Document Assessment Findings
The assessor must document:
- The scope of the assessment (which requirements were assessed)
- The methodology used (what was examined, how it was evaluated)
- Findings for each requirement (compliant, non-compliant, observation)
- Non-conformities identified and their severity
- Corrective actions required and agreed timelines
- The overall assessment conclusion
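One way to capture Step 4 consistently is a structured findings record. The fields below mirror the bullet list; the format itself is an assumption, not a mandated schema:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    requirement: str            # e.g. "Article 10 data governance"
    status: str                 # "compliant" | "non-compliant" | "observation"
    severity: str = ""          # for non-conformities: "major" | "minor"
    corrective_action: str = ""

def overall_conclusion(findings: list) -> str:
    """A positive conclusion requires that no non-conformity remains open."""
    open_ncs = [f for f in findings if f.status == "non-compliant"]
    if open_ncs:
        return f"{len(open_ncs)} open non-conformity(ies); corrective action required"
    return "assessment passed; proceed to declaration of conformity"

report = [
    Finding("Article 10 data governance", "compliant"),
    Finding("Article 12 record-keeping", "non-compliant", severity="minor",
            corrective_action="define retention policy for inference logs"),
]
print(overall_conclusion(report))  # 1 open non-conformity(ies); corrective action required
```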
Step 5: Corrective Actions and Verification
For any non-conformities identified:
- The provider must investigate the root cause
- Corrective actions must be defined, implemented, and verified
- The assessor must verify that corrective actions effectively resolve the non-conformity
- Re-assessment of affected requirements may be necessary
Step 6: Assessment Conclusion
Upon satisfactory completion of all assessment activities:
- The assessor issues an internal conformity assessment report
- The provider draws up the EU declaration of conformity (Article 47, Annex V)
- CE marking is affixed (Article 48)
- Registration in the EU database is completed (Article 49)
Notified Body Assessment (Annex VII): Step-by-Step
Phase 1: Engagement
Selecting a Notified Body
- Notified bodies are designated by Member States and listed in the NANDO database
- Select a notified body with expertise in your system’s domain
- Verify that the notified body’s designation covers the applicable legislation
- Engage early — capacity is limited, especially in the initial years
Application
- Submit a formal application including system description, intended purpose, and classification rationale
- Provide access to technical documentation
- Agree on assessment scope, timeline, and fees
- Sign a contractual engagement
Phase 2: QMS Audit
The notified body audits the provider’s QMS to determine whether it meets Article 17 requirements. This is a formal audit comparable to ISO 9001 or ISO 13485 audits:
- Stage 1 audit: Document review and readiness assessment
- Stage 2 audit: On-site (or remote) audit of QMS implementation and effectiveness
- Audit findings: Categorised as major non-conformities, minor non-conformities, or observations
- Major non-conformities must be resolved before certification
Phase 3: Technical Documentation Assessment
The notified body examines the technical documentation to verify:
- Completeness against Annex IV requirements
- Compliance with all applicable Articles 8-15
- Accuracy and consistency of documented information
- Adequacy of testing and validation evidence
Phase 4: Certification
Upon satisfactory completion:
- QMS certificate issued (valid for up to five years for Annex I systems and four years for Annex III systems under Article 44, subject to surveillance audits)
- Technical documentation certificate issued (where applicable)
- The provider draws up the EU declaration of conformity referencing the certificates
- CE marking is affixed with the notified body’s identification number
Ongoing: Surveillance
The notified body conducts periodic surveillance to verify continued compliance:
- Annual surveillance audits of the QMS
- Review of significant changes to the AI system
- Verification that corrective actions from previous audits remain effective
- The certificate may be suspended or withdrawn for persistent non-compliance
Quality Management System Requirements (Article 17)
Article 17 requires providers of high-risk AI systems to establish, implement, document, and maintain a QMS. The QMS must cover each of the aspects listed in Article 17(1), points (a) to (m).
For organisations with existing QMS certifications, the most efficient approach is to extend the existing system rather than create a parallel one:
ISO 9001 certified organisations: The quality management principles and process approach transfer directly. AI-specific elements to add: risk management per Article 9, data governance per Article 10, human oversight design per Article 14, and post-market monitoring per Article 72.
ISO 13485 certified organisations (medical devices): Strong alignment with the EU AI Act’s approach. The design control, risk management, and post-market surveillance elements map closely. Key extensions: AI-specific testing (robustness, bias), training data governance, and GPAI-specific requirements where applicable.
ISO 42001 certified organisations (AI management systems): The strongest alignment. ISO 42001 was developed with awareness of the EU AI Act and covers most QMS requirements. Verify coverage of all Article 17(1) elements and supplement where needed.
Technical Documentation Requirements (Annex IV)
Annex IV specifies the minimum content of technical documentation. The following table maps each element to the practical documentation artefact and the typical source within a development organisation:
| Annex IV Element | Documentation Artefact | Typical Source |
|---|---|---|
| General description | System specification document | Product management / systems engineering |
| Intended purpose | Purpose statement and use case descriptions | Product management |
| Interaction with other systems | Integration architecture documentation | Systems engineering |
| Versions of relevant software | Software bill of materials, dependency list | Engineering |
| Hardware requirements | Deployment specification | DevOps / infrastructure |
| Design specifications | Architecture decision records, design documents | Engineering |
| System architecture | Architecture diagrams (component, deployment, data flow) | Engineering |
| Algorithm description | Model card, algorithm documentation | ML engineering / research |
| Data requirements | Data requirements specification | Data engineering |
| Training methodology | Training pipeline documentation, experiment logs | ML engineering |
| Evaluation techniques | Evaluation protocol, test plan | QA / ML engineering |
| Validation and testing results | Test reports, benchmark results | QA |
| Monitoring and functioning | Monitoring system documentation, logging specification | DevOps / ML engineering |
| Performance metrics | Metric definitions, validation results | ML engineering / QA |
| Risk management system | Risk management plan, risk register, FMEA | Risk management |
| Lifecycle changes | Change log, release notes | Engineering |
| Harmonised standards applied | Standards compliance matrix | Quality / compliance |
The EU Declaration of Conformity (Annex V)
The EU declaration of conformity is the formal statement by the provider that the AI system meets all applicable requirements. It must contain:
- AI system name and any additional unambiguous reference enabling identification
- Provider name and address (and authorised representative, if applicable)
- Statement that the declaration is issued under the sole responsibility of the provider
- Statement that the AI system is in conformity with Regulation (EU) 2024/1689 and, where applicable, other relevant Union legislation
- References to harmonised standards or common specifications used
- Where applicable, the name and identification number of the notified body, and reference to the certificate issued
- Place and date of issue
- Name and function of the person signing
- Signature
The declaration must be kept for 10 years after the system is placed on the market. It must be provided to competent authorities upon request.
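A completeness check against the Annex V content list can catch omissions before signature. The keys below paraphrase the required elements and are illustrative assumptions, not official field names:

```python
# Paraphrased Annex V content elements for an EU declaration of conformity.
REQUIRED_FIELDS = [
    "system_name_and_reference",
    "provider_name_and_address",
    "sole_responsibility_statement",
    "conformity_statement",
    "standards_references",
    "place_and_date_of_issue",
    "signatory_name_and_function",
    "signature",
]

def missing_declaration_fields(declaration: dict,
                               notified_body_involved: bool = False) -> list:
    """Return the required elements that are absent or empty."""
    required = list(REQUIRED_FIELDS)
    if notified_body_involved:
        # Only required where a notified body issued a certificate
        required.append("notified_body_and_certificate")
    return [field for field in required if not declaration.get(field)]

draft = {field: "completed" for field in REQUIRED_FIELDS}
print(missing_declaration_fields(draft, notified_body_involved=True))
# ['notified_body_and_certificate']
```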
CE Marking (Article 48)
The CE marking indicates that the AI system conforms to the requirements of the EU AI Act. It must be:
- Affixed visibly, legibly, and indelibly to the AI system
- If that is not possible, affixed to the packaging or accompanying documentation
- Affixed before the system is placed on the market
- Followed by the notified body’s identification number (if a notified body was involved)
For software-based AI systems (which have no physical product to mark), the CE marking is typically included in the software documentation, user interface, or terms of service.
Conformity Assessment for Substantial Modifications
Article 43(4) addresses what happens when a high-risk AI system undergoes a substantial modification. A new conformity assessment is required when both of the following hold:
- The modification affects the system’s compliance with the requirements or changes its intended purpose, AND
- The modification was not pre-determined by the provider and assessed in the initial conformity assessment
What constitutes a “substantial modification”?
The regulation does not provide a precise threshold. Governance professionals should assess modifications against these criteria:
- Does the modification change the system’s intended purpose or use case?
- Does the modification affect the system’s accuracy, robustness, or cybersecurity?
- Does the modification change the data used for training, validation, or testing?
- Does the modification alter the human oversight mechanisms?
- Does the modification affect the risk assessment?
If the answer to any of these questions is “yes,” a new conformity assessment (or re-assessment of affected elements) is likely required.
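The five triage questions above lend themselves to a simple screening function. The question keys are paraphrases, not regulatory terms:

```python
# Paraphrased triage criteria for substantial modification screening.
MODIFICATION_QUESTIONS = [
    "changes_intended_purpose",
    "affects_accuracy_robustness_cybersecurity",
    "changes_training_validation_test_data",
    "alters_human_oversight",
    "affects_risk_assessment",
]

def needs_reassessment(answers: dict) -> bool:
    """True if any triage question is answered yes.

    A True result signals that a new conformity assessment (or
    re-assessment of affected elements) is likely required.
    """
    return any(answers.get(question, False) for question in MODIFICATION_QUESTIONS)

print(needs_reassessment({"alters_human_oversight": True}))  # True
```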
Timeline Planning
Governance professionals should plan conformity assessment timelines carefully:
| Activity | Internal Assessment | Notified Body Assessment |
|---|---|---|
| Pre-assessment preparation | 4-8 weeks | 4-8 weeks |
| QMS verification/audit | 1-2 weeks | 4-8 weeks (Stage 1 + Stage 2) |
| Technical documentation review | 1-2 weeks | 4-8 weeks |
| Corrective actions | 2-4 weeks (if needed) | 4-8 weeks (if needed) |
| Certification/declaration | 1 week | 2-4 weeks |
| Total | 7-17 weeks | 14-36 weeks |
These timelines assume that technical documentation and QMS documentation are substantially complete before assessment begins. If significant documentation gaps exist, add the documentation production time.
For organisations with multiple high-risk systems, batch processing can improve efficiency — conducting the QMS assessment once and applying it across multiple systems, while conducting individual technical documentation assessments per system.
Integration with the COMPEL Framework
Conformity assessment sits at the intersection of the Produce and Evaluate stages:
- Produce: The documentation and controls required for conformity assessment are produced during this stage
- Evaluate: The assessment itself — internal or by notified body — validates that the produced outputs meet regulatory requirements
- Learn: Assessment findings feed back into improvement of documentation, processes, and controls
The COMPEL cycle ensures that conformity assessment is not a one-time exercise but part of a continuous governance loop. Post-market monitoring (Article 72) feeds operational data back into risk management, which may trigger documentation updates, which may require re-assessment of affected elements.
This continuous loop is what the EU AI Act requires and what the COMPEL framework delivers: governance as an ongoing operational capability, not a project with a completion date.