Published April 4, 2024 | Version 0.7.0

deephyper/deephyper: Changelog - DeepHyper 0.7.0

Description

deephyper.search

  • If a results.csv file already exists in the log_dir folder, it is renamed instead of overwritten.
  • Parallel and asynchronous standard experimental designs can now be run through DeepHyper to perform Random, Grid, or Quasi-Monte-Carlo evaluations; see the Example on Standard Experimental Design (Grid Search) in the documentation and the sketch after this list.
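
The following is a minimal sketch of how such a design could be launched. Only the feature itself is confirmed above; the class name ExperimentalDesignSearch, its keyword arguments (n_points, design), and the module paths are assumptions based on the 0.7.x documentation layout and should be checked against the linked example.

    # Sketch only: ExperimentalDesignSearch, n_points, and design are assumed names;
    # verify them against the "Standard Experimental Design (Grid Search)" example.
    from deephyper.problem import HpProblem
    from deephyper.evaluator import Evaluator
    from deephyper.search.hps import ExperimentalDesignSearch

    def run(job):
        # Black-box function evaluated at each design point.
        x = job.parameters["x"]
        return -(x ** 2)

    problem = HpProblem()
    problem.add_hyperparameter((-10.0, 10.0), "x")

    # Parallel, asynchronous evaluations through a thread-based evaluator.
    evaluator = Evaluator.create(run, method="thread", method_kwargs={"num_workers": 4})

    # design="grid" requests the grid design; "random" or a quasi-Monte-Carlo
    # design would be requested the same way.
    search = ExperimentalDesignSearch(problem, evaluator, n_points=100, design="grid")
    results = search.search(max_evals=100)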

Bayesian optimization (CBO and MPIDistributedBO)

  • New optimizers for the acquisition function: the acquisition function of non-differentiable surrogate models (e.g., "RF", "ET") can now be optimized with acq_optimizer="ga" or acq_optimizer="mixedga". This makes each BO iteration more efficient but adds overhead (negligible when the evaluated function is slow); the acq_optimizer_freq=2 parameter can be used to amortize this overhead. These options are combined in the sketch after this list.
  • New exploration/exploitation scheduler: the periodic exponential decay scheduler can now be specified with its initial and final values, e.g., CBO(..., kappa=10, scheduler={"type": "periodic-exp-decay", "period": 25, "kappa_final": 1.96}). This mechanism helps the search escape local optima.
  • New family of acquisition functions for random-forest surrogate models: acq_func="UCBd", "EId", or "PId". The "d" postfix stands for "deterministic": these acquisition functions use only the epistemic uncertainty of the surrogate about the black-box function and ignore the aleatoric uncertainty (i.e., the noise estimate).
  • The default surrogate model was renamed "ET", which stands for "Extremely Randomized Trees", to better match the machine-learning literature. It provides better epistemic uncertainty estimates than the standard "RF" ("Random Forest"). It is also a randomized ensemble of trees, but it uses a randomized split decision rule instead of an optimized one.
  • HpProblem instances built from ConfigSpace objects with constraints now use the lower bound of each hyperparameter as its slack value.
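
The sketch below shows how several of these options combine in a single CBO call. The keyword arguments (surrogate_model, acq_func, acq_optimizer, acq_optimizer_freq, kappa, scheduler) are the ones quoted above; the problem definition, run function, and module paths are illustrative assumptions following the 0.7.x layout.

    from deephyper.problem import HpProblem
    from deephyper.evaluator import Evaluator
    from deephyper.search.hps import CBO

    def run(job):
        # Hypothetical objective; replace with the real black-box function.
        x = job.parameters["x"]
        return -((x - 2.0) ** 2)

    problem = HpProblem()
    problem.add_hyperparameter((-10.0, 10.0), "x")

    evaluator = Evaluator.create(run, method="process", method_kwargs={"num_workers": 4})

    search = CBO(
        problem,
        evaluator,
        surrogate_model="ET",  # default surrogate: Extremely Randomized Trees
        acq_func="UCBd",  # "deterministic" UCB, epistemic uncertainty only
        acq_optimizer="mixedga",  # optimize the acquisition with a genetic algorithm
        acq_optimizer_freq=2,  # amortize the acquisition-optimization overhead
        kappa=10,
        scheduler={"type": "periodic-exp-decay", "period": 25, "kappa_final": 1.96},
    )
    results = search.search(max_evals=200)

Here kappa=10 is the initial exploration weight and kappa_final=1.96 the value it periodically decays to (with a period of 25 iterations), matching the scheduler bullet above.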

deephyper.stopper

  • The stopper based on learning-curve extrapolation has improved fit quality and speed.

Files

  • deephyper/deephyper-0.7.0.zip (558.9 kB, md5:3554abfa47e9c2f910ee5b3b1de2be1d)
