Weight Decay
Weight decay is a regularization technique used during AI model training that adds a penalty proportional to the magnitude of the model's weights (in its most common L2 form, the squared magnitude). By discouraging any individual weight from growing large, it steers training toward simpler models that rely less on any single feature and generalize better.
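The mechanism can be shown in a minimal sketch of a gradient-descent step. This is illustrative pure Python, not any particular library's API; the names `sgd_step`, `lr`, and `lam` are assumptions for the example. With decay coefficient `lam`, each update shrinks every weight toward zero in addition to the usual gradient step.

```python
def sgd_step(weights, grads, lr=0.1, lam=0.01):
    """One gradient-descent step with L2-style weight decay.

    Update rule: w <- w - lr * (grad + lam * w)
    The lam * w term is the weight-decay penalty's gradient.
    """
    return [w - lr * (g + lam * w) for w, g in zip(weights, grads)]


# With zero gradients, the decay term alone shrinks the weights:
# each step multiplies every weight by (1 - lr * lam) = 0.999.
w = [1.0, -2.0]
for _ in range(100):
    w = sgd_step(w, grads=[0.0, 0.0])
print(w)  # magnitudes have decayed toward zero
```

The name "weight decay" comes from exactly this effect: absent any gradient signal, the weights decay geometrically toward zero, so only features with consistent predictive value keep large weights.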
What this means in practice
By keeping individual weights from growing excessively large, weight decay helps the model avoid overfitting to its training data and perform better on new, unseen data. In practice the decay coefficient is tuned like any other hyperparameter: too much decay produces an oversimplified model that underfits, while too little leaves overfitting unchecked. For non-technical governance professionals, the key insight is that this coefficient is a deliberate training choice that shapes model behavior and therefore belongs in the model's documentation.
Why it matters
Weight decay is one of many training hyperparameters that affect model behavior and should be documented as part of model governance. While technical in nature, its impact on model generalization means it directly influences production performance. For governance professionals, understanding that training choices like weight decay materially affect outcomes reinforces the importance of documenting and governing the full model development process, not just final outputs.
How COMPEL uses it
Weight decay and other training hyperparameters are part of model documentation and governance controls within the Technology pillar. During the Produce stage, hyperparameter choices are recorded in model cards and training logs. The Evaluate stage assesses whether documented hyperparameters are consistently applied across retraining cycles, and the Process pillar's MLOps maturity assessment includes hyperparameter governance as a maturity indicator.
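As a concrete sketch of what recording a hyperparameter choice might look like, the snippet below builds a simple training-run record of the kind a model card or training log could capture. The field names and the `model_id` value are hypothetical illustrations, not a schema defined by COMPEL.

```python
import json

# Hypothetical training-run record; all field names are illustrative.
run_record = {
    "model_id": "credit-risk-v3",        # assumed example identifier
    "stage": "Produce",                  # COMPEL stage where this is logged
    "hyperparameters": {
        "learning_rate": 0.001,
        "weight_decay": 0.01,            # documented so the Evaluate stage
        "batch_size": 256,               # can check retraining consistency
    },
}

print(json.dumps(run_record, indent=2))
```

Keeping records like this machine-readable makes it straightforward to diff hyperparameters across retraining cycles, which is what the Evaluate-stage consistency check described above requires.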