COMPEL Glossary / ai-reference-architecture

AI reference architecture

A canonical layered model — client, orchestration, model, knowledge, and observability planes — onto which any AI system can be mapped.

What this means in practice

The reference architecture gives a vendor-neutral vocabulary for evaluating, comparing, and governing AI implementations across the enterprise portfolio.
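To make the vendor-neutral vocabulary concrete, the five planes can be sketched as a record type, with one record per system in the portfolio. This is an illustrative sketch, not part of the glossary: the class, field names, and the example system below are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ReferenceArchitecture:
    """Hypothetical mapping of one AI system onto the five planes."""
    client_plane: str         # how users reach the system
    orchestration_plane: str  # prompt assembly, routing, safety filters
    model_plane: str          # the model(s) performing inference
    knowledge_plane: str      # non-parametric knowledge: retrieval, tools
    observability_plane: str  # logging, tracing, evaluation

# An illustrative system described in plane-by-plane terms.
support_bot = ReferenceArchitecture(
    client_plane="web chat widget",
    orchestration_plane="prompt templates + tool router",
    model_plane="hosted LLM API",
    knowledge_plane="vector store over product docs",
    observability_plane="trace logging + eval harness",
)
```

Evaluating or comparing two implementations then reduces to a plane-by-plane diff of such records, regardless of which vendors fill each plane.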

Synonyms

AI five-plane model, enterprise AI reference architecture

See also

  • Orchestration plane — The layer of an AI system that coordinates prompts, retrievals, tool calls, safety filters, and routing between the user and one or more models.
  • Knowledge plane — The layer of an AI system that stores and serves non-parametric knowledge to the model — through retrieval over vector stores, traditional indexes, and tool-based data access.
  • Serving pattern — The architectural shape of the inference path — managed API, cloud-platform hosted, self-hosted online, self-hosted batch, or edge.
  • Architecture runway — The reusable platform components — inference infrastructure, retrieval stack, observability, policy engine, evaluation harness — that future AI use cases inherit rather than re-build.
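The division of labor among the planes listed above can be sketched as a minimal inference path: the orchestration plane applies a safety filter, pulls context from the knowledge plane, and routes a prompt to the model plane. Every function and string below is a hypothetical stub for illustration only.

```python
def retrieve(query: str) -> list[str]:
    # Knowledge plane: non-parametric lookup, stubbed with a fixed corpus.
    corpus = {"refund": "Refunds are issued within 14 days."}
    return [text for key, text in corpus.items() if key in query.lower()]

def call_model(prompt: str) -> str:
    # Model plane: stand-in for an inference call via any serving pattern.
    return f"[model answer grounded in: {prompt!r}]"

def orchestrate(user_query: str) -> str:
    # Orchestration plane: safety filter, context assembly, model routing.
    if "password" in user_query.lower():
        return "Request blocked by safety filter."
    context = "\n".join(retrieve(user_query))
    prompt = f"Context:\n{context}\n\nQuestion: {user_query}"
    return call_model(prompt)
```

Note that the client plane never touches the model or knowledge planes directly; swapping the vector store or the model vendor changes only the stubs, not the orchestration logic — which is what the architecture runway reuses across use cases.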