Algorithmic Impact Assessment

An Algorithmic Impact Assessment (AIA) is a formal, structured evaluation conducted before deploying an AI system to identify and quantify potential negative impacts on individuals and communities, particularly regarding fairness, privacy, civil rights, employment, and access to services.

What this means in practice

An AIA goes beyond technical testing to consider a system's social, economic, and political context, examining who benefits from the system, who might be harmed, and what mitigations are available. In COMPEL, AIAs are a governance deliverable within Module 3.4, Article 4, where they are positioned as Component Three of the Advanced Ethics Architecture framework.

Why it matters

Deploying AI without systematically assessing its impact on individuals and communities is increasingly a legal liability under regulations such as the EU AI Act, and it is ethically indefensible for high-stakes applications. Impact assessments force organizations to confront potential harms and plan mitigations before damage occurs, rather than after. They also provide a documented, defensible basis for deployment decisions that demonstrates proactive risk management to regulators and the public.

How COMPEL uses it

Algorithmic Impact Assessments are governance deliverables within the Model stage, positioned as a core component of the Advanced Ethics Architecture framework in the Governance pillar. The Calibrate stage identifies which systems require an impact assessment based on their risk classification. During Produce, assessments are conducted before deployment, and the Evaluate stage reviews assessment accuracy against actual outcomes. The Learn stage refines the assessment methodology based on real-world experience.
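The stage-by-stage flow above can be sketched as a minimal data model. Everything here is illustrative assumption rather than COMPEL's actual schema: the risk tiers, the `ImpactAssessment` fields, and the gating rules in `requires_assessment` and `ready_to_deploy` are hypothetical names chosen to show how Calibrate-style risk triage and a Produce-stage pre-deployment gate might fit together.

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical risk tiers; this entry does not specify COMPEL's actual
# risk classification scheme.
class RiskTier(Enum):
    MINIMAL = "minimal"
    LIMITED = "limited"
    HIGH = "high"

@dataclass
class ImpactAssessment:
    """Illustrative record of one AIA moving through the lifecycle stages."""
    system_name: str
    risk_tier: RiskTier
    harms_identified: list = field(default_factory=list)  # who might be harmed
    mitigations: list = field(default_factory=list)       # planned safeguards
    conducted: bool = False  # Produce: assessment completed before deployment

def requires_assessment(aia: ImpactAssessment) -> bool:
    # Calibrate: only systems above a (hypothetical) risk threshold
    # need a full impact assessment in this sketch.
    return aia.risk_tier in (RiskTier.LIMITED, RiskTier.HIGH)

def ready_to_deploy(aia: ImpactAssessment) -> bool:
    # Produce: a covered system deploys only after the assessment is
    # conducted and every identified harm has at least one mitigation.
    if not requires_assessment(aia):
        return True
    return aia.conducted and len(aia.mitigations) >= len(aia.harms_identified)
```

In this sketch, the Evaluate and Learn stages would later compare the recorded `harms_identified` against observed outcomes and adjust the tier thresholds accordingly; the point is only that a risk-tiered gate, not the specific fields, is what the framework's lifecycle implies.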

Related Terms

Other glossary terms mentioned in this entry's definition and context.