AI System Impact Assessment
An AI System Impact Assessment is a structured, documented evaluation of how a proposed or existing AI system affects individuals, groups, organizations, and society across dimensions including fundamental rights, safety, privacy, fairness, environmental impact, and labor market effects.
What this means in practice
An impact assessment is often required by regulation before deploying high-risk AI systems, particularly under the EU AI Act and similar frameworks emerging in other jurisdictions. A well-run assessment documents anticipated harms and the mitigations designed for them before deployment, not after harm has occurred. Module 3.4, Article 4 covers advanced ethics architecture, including how impact assessments connect to the broader algorithmic accountability framework.
Why it matters
Deploying AI without assessing its real-world impact on individuals, communities, and society is both ethically irresponsible and increasingly illegal under frameworks like the EU AI Act. Impact assessments force organizations to think beyond technical performance to consider consequences across fundamental rights, safety, privacy, fairness, and environmental dimensions. Proactive assessment is far less costly than reactive remediation after harm has occurred.
How COMPEL uses it
Impact assessments are governance deliverables produced during the Model stage within the Governance pillar and reviewed during Evaluate. The Calibrate stage identifies which existing and planned systems require an impact assessment based on risk classification (a minimal sketch of this triage follows), and the Learn stage captures lessons from assessment findings to improve future assessment practice.
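As a concrete illustration of the Calibrate-stage triage, the following minimal Python sketch maps a system's risk classification to whether an impact assessment is required. All names here (RiskClass, AISystem, requires_impact_assessment) and the example inventory are hypothetical assumptions for illustration; actual risk classes and thresholds come from the applicable regulation and the organization's own risk taxonomy.

```python
# Hypothetical sketch of Calibrate-stage triage: deciding which systems
# in an inventory require an AI System Impact Assessment. Risk classes
# and example systems are illustrative, not taken from COMPEL or the
# EU AI Act text.
from dataclasses import dataclass
from enum import Enum

class RiskClass(Enum):
    MINIMAL = "minimal"
    LIMITED = "limited"
    HIGH = "high"
    UNACCEPTABLE = "unacceptable"

@dataclass
class AISystem:
    name: str
    risk_class: RiskClass
    deployed: bool  # existing systems are triaged alongside planned ones

def requires_impact_assessment(system: AISystem) -> bool:
    # High-risk systems need an assessment before deployment (or
    # retroactively, if already deployed); unacceptable-risk systems
    # are flagged here too so they surface for escalation.
    return system.risk_class in (RiskClass.HIGH, RiskClass.UNACCEPTABLE)

inventory = [
    AISystem("resume-screening", RiskClass.HIGH, deployed=True),
    AISystem("spam-filter", RiskClass.MINIMAL, deployed=True),
    AISystem("credit-scoring", RiskClass.HIGH, deployed=False),
]

for system in inventory:
    if requires_impact_assessment(system):
        print(f"{system.name}: impact assessment required")
```

In practice the triage output feeds the Model-stage backlog, so each flagged system enters the governance pipeline with its risk classification already recorded.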