COMPEL Glossary / etl-elt-pipeline
ETL/ELT Pipeline
An ETL (Extract-Transform-Load) or ELT (Extract-Load-Transform) pipeline is a data processing workflow that moves data from source systems into target repositories where it can be used for AI training and operations.
What this means in practice
ETL extracts data from source systems, transforms it into the required format, then loads it into the target; ELT loads raw data first, then transforms it in place, typically using the target system's own compute. The choice affects where transformation logic lives and which system bears the processing cost, but both patterns serve the same purpose: delivering clean, current data to downstream consumers.
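The ETL ordering can be sketched in a few lines. This is a minimal illustration (the table name, fields, and helper functions are hypothetical), not a production pipeline:

```python
# Minimal ETL sketch: extract rows from a source, transform them into
# the required shape, then load them into a target store.
import sqlite3

def extract():
    # Stand-in for reading from a real source system (API, file, database).
    return [
        {"id": 1, "amount": "19.99", "region": " us-east "},
        {"id": 2, "amount": "5.00", "region": "EU-WEST"},
    ]

def transform(rows):
    # Normalize types and formatting before loading (the "T" in ETL).
    return [
        (r["id"], float(r["amount"]), r["region"].strip().lower())
        for r in rows
    ]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)

# Under ELT, the raw rows would be loaded first and the normalization
# above would instead run as SQL inside the target system.
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 24.99
```

The same `transform` logic moving from application code into warehouse SQL is, in essence, the ETL-to-ELT shift described above.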
Why it matters
ETL/ELT pipelines are the plumbing that delivers data to AI models. Pipeline reliability directly affects AI system availability: if a data pipeline breaks, models may receive stale or incomplete data, degrading predictions without any change to the model itself. Organizations that neglect pipeline engineering create invisible fragility in their AI systems, where a single upstream failure cascades into unreliable business decisions.
How COMPEL uses it
In the COMPEL Operational Readiness assessment, data pipeline maturity is one of ten dimensions evaluated, with minimum thresholds required before AI initiatives can pass through the Produce stage gate. During Calibrate, pipeline reliability is assessed under the Technology pillar. The Model stage designs pipeline architecture as part of the AI platform, and the Evaluate stage tracks pipeline uptime and data freshness as operational health indicators.
Related Terms
Other glossary terms mentioned in this entry's definition and context.