Hyperparameter search

Structured or randomized exploration over model-configuration space — learning rate, depth, regularization strength, and similar — to find a configuration meeting a target criterion.

What this means in practice

Random search is generally preferred over grid search in higher-dimensional spaces: when only a few hyperparameters strongly affect performance, random sampling covers more distinct values of each one for the same trial budget. Bayesian optimization and population-based methods refine this further by using results from completed trials to guide where to sample next.
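
As a concrete illustration, here is a minimal random search sketch using scikit-learn's RandomizedSearchCV. The model, dataset, parameter ranges, and trial budget are illustrative assumptions, not part of this glossary entry.

```python
# A minimal random-search sketch; model and ranges are illustrative assumptions.
from scipy.stats import loguniform, randint
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=500, random_state=0)

# Sample each hyperparameter from a distribution rather than a fixed grid;
# with a fixed budget this explores more distinct values per dimension.
search = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_distributions={
        "learning_rate": loguniform(1e-3, 1e0),  # log-uniform spans orders of magnitude
        "max_depth": randint(2, 8),              # model depth
        "subsample": [0.5, 0.75, 1.0],           # regularization via row subsampling
    },
    n_iter=20,           # trial budget: 20 sampled configurations
    scoring="accuracy",  # the target criterion from the definition above
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```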

Synonyms

hyperparameter optimization, HPO, hyperparameter tuning

See also

  • Offline evaluation — Assessment of an AI system against static datasets — training hold-out, validation set, benchmark corpus — without exposure to live user traffic.
  • Experiment tracking — The infrastructure and practice of recording artifacts, metrics, parameters, environment, and lineage for every experiment run — enabling later reproduction, comparison across runs, and audit.
  • Reproducibility — The property that re-running an experiment with the same code, data, and configuration produces the same results within declared tolerance.