Releases: hyperactive-project/Hyperactive

v4.0.0

01 Dec 14:54

v4.0.0

v3.2.4

07 Jul 12:40

Changes from v3.0.0 -> v3.2.4:

  • Decouple the number of runs from the number of active processes (thanks to PartiallyTyped). This reduces memory load when the number of jobs is very large.
  • New feature: The progress board lets the user monitor the optimization progress during the run.
    • Display the trend of the best score
    • Plot parameters and score in parallel coordinates
    • Generate a filter file to define an upper and/or lower bound for every parameter and the score in the parallel-coordinate plot
    • List the parameters of the 5 best scores
  • add Python 3.8 to tests
  • add warnings if search-space values are not lists
  • improve stability of the result methods
  • add tests for hyperactive memory + search spaces
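The list requirement for search-space values can be illustrated with a small stand-alone check. This is a stdlib sketch of the idea, not Hyperactive's actual implementation; the function name `check_search_space` is hypothetical:

```python
import warnings

def check_search_space(search_space):
    # Hypothetical helper: warn for every value that is not a list,
    # mirroring the kind of warning described in the release notes.
    for name, values in search_space.items():
        if not isinstance(values, list):
            warnings.warn(
                f"search-space value for '{name}' is {type(values).__name__}, "
                "expected a list of discrete values"
            )

# A valid search space maps each parameter name to a list of candidates:
search_space = {
    "n_estimators": list(range(10, 200, 10)),
    "max_depth": [3, 5, 7, None],
}
check_search_space(search_space)               # no warning

check_search_space({"max_depth": range(3, 8)})  # warns: range is not a list
```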

v2.3.0

16 Jul 10:23

  • add Tree-structured optimization algorithm (idea from Hyperopt)
  • add Decision-tree optimization algorithm (idea from sklearn)
  • enable new optimization parameters for bayes-opt:
    • max_sample_size: maximum number of samples the gaussian-process regressor trains on. Samples are drawn by random choice.
    • skip_retrain: occasionally skips retraining the gaussian-process regressor during the optimization run, so that multiple next positions are predicted from the same fitted model (these predictions should lie apart from one another)
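The effect of a skip_retrain-style setting can be sketched independently of Hyperactive: a surrogate model is retrained only every few iterations, and between retrainings several suggestions are taken from the same fitted model. All names below are hypothetical stdlib stand-ins for the real gaussian-process regressor:

```python
import random

class DummySurrogate:
    """Hypothetical stand-in for a gaussian-process regressor."""
    def __init__(self):
        self.n_fits = 0

    def fit(self, X, y):
        self.n_fits += 1  # count how often we actually retrain

    def suggest(self):
        # A real surrogate would return the point with the best
        # acquisition value; here we just sample randomly.
        return random.random()

def optimize(n_iter, retrain_every):
    model, X, y = DummySurrogate(), [], []
    for i in range(n_iter):
        if i % retrain_every == 0:      # skip_retrain-style behaviour:
            model.fit(X, y)             # retrain only every few steps
        x = model.suggest()             # multiple suggestions per fit
        X.append(x)
        y.append(-(x - 0.5) ** 2)       # toy objective
    return model.n_fits

# Retraining every 5 iterations cuts 50 fits down to 10:
print(optimize(n_iter=50, retrain_every=5))  # 10
```

The trade-off is fewer (expensive) surrogate fits per run at the cost of suggestions that are based on slightly stale data.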

v2.1.0

16 Jul 10:15

  • first stable implementation of the "long-term-memory", which saves/loads search positions/parameters and results.
  • enable warm starts of sequence-based optimizers (bayesian optimization, ...) with results from the "long-term-memory"
  • enable the use of gaussian-process regressors other than sklearn's. A GPR class (from gpy, GPflow, ...) can be passed to the "optimizer"-kwarg
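The long-term-memory idea (persist evaluated positions and scores, then warm-start a later run from them) can be sketched with a plain JSON file. The file name and helper functions are hypothetical, not Hyperactive's API:

```python
import json
import os
import tempfile

def save_memory(path, positions, scores):
    # Persist evaluated parameter positions and their scores.
    with open(path, "w") as f:
        json.dump({"positions": positions, "scores": scores}, f)

def load_memory(path):
    # Return empty memory when no previous run exists.
    if not os.path.exists(path):
        return [], []
    with open(path) as f:
        data = json.load(f)
    return data["positions"], data["scores"]

path = os.path.join(tempfile.gettempdir(), "ltm_demo.json")
save_memory(path, [{"x": 1}, {"x": 2}], [0.4, 0.9])

# A later run warm-starts from the stored results instead of
# re-evaluating those positions:
positions, scores = load_memory(path)
best = positions[scores.index(max(scores))]
print(best)  # {'x': 2}
```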

v2.0.0

16 Jul 10:09

API change to improve usability. The class now accepts the training data, while the "search" method accepts the search_config and other arguments specific to the optimization run, such as n_iter, n_jobs and optimizer.
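Schematically, that split between constructor and search method looks like the sketch below. This is a toy illustration of the described structure, with hypothetical names and a random-choice stand-in for the real optimizers:

```python
import random

class HyperactiveLike:
    """Sketch of a v2-style API: training data goes to the constructor."""
    def __init__(self, X, y):
        self.X, self.y = X, y

    def search(self, search_config, n_iter=10, n_jobs=1, optimizer="random"):
        # Run-specific settings (n_iter, n_jobs, optimizer) live here,
        # not in the constructor.
        best_score, best_params = float("-inf"), None
        for _ in range(n_iter):
            params = {k: random.choice(v) for k, v in search_config.items()}
            score = -abs(params["x"] - 3)   # toy objective
            if score > best_score:
                best_score, best_params = score, params
        return best_params

hyper = HyperactiveLike(X=[[0], [1]], y=[0, 1])
best = hyper.search({"x": [1, 2, 3, 4]}, n_iter=50)
```

The benefit of this shape is that one object holding the data can run several searches with different iteration counts or optimizers.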

v1.1.1

08 Oct 17:31

  • small API change
  • extend progress-bar information
  • re-enable multiprocessing for the new API

v1.0.0

25 Sep 07:11

  • new API that creates the model via a function and the search space via a dict
  • enables more flexible usage (e.g. free choice of framework, ensembles, nn-structures)
  • 100% test coverage
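The function-plus-dict pattern described above can be shown with a minimal random search. This is a stdlib sketch of the idea, not Hyperactive's own code; all names are illustrative:

```python
import random

def model(params):
    # Any user-defined function can serve as the "model": it receives a
    # parameter dict and returns a score, so the user is free to use
    # any framework (or none) inside it.
    return -(params["x"] - 2) ** 2 - (params["y"] + 1) ** 2

# The search space is a plain dict mapping names to candidate lists:
search_space = {
    "x": list(range(-5, 6)),
    "y": list(range(-5, 6)),
}

def random_search(objective, space, n_iter=2000, seed=42):
    random.seed(seed)
    best_score, best_params = float("-inf"), None
    for _ in range(n_iter):
        params = {k: random.choice(v) for k, v in space.items()}
        score = objective(params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

best_params, best_score = random_search(model, search_space)
```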

v0.4.2

09 Sep 13:24

  • performance fixes for bayesian optimization and parallel tempering
  • better default parameters for most optimizers
  • better implementation of metrics
  • add support for catboost
  • integrate the meta-learn code into hyperactive
  • cleanup to avoid duplicated code

v0.4.1.2

31 Jul 06:41

  • k-fold cross-validation works with keras models
  • a cv of < 1 trains the model on a fraction of the training data and tests on the rest
  • better testing and code coverage
  • fix the score and predict methods
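The cv < 1 behaviour amounts to a single fractional train/test split rather than k folds. A sketch of that split logic, with a hypothetical function name:

```python
def fractional_split(samples, cv):
    # cv < 1 is interpreted as the training fraction: e.g. cv=0.8
    # trains on 80% of the data and tests on the remaining 20%.
    if not 0 < cv < 1:
        raise ValueError("this sketch only handles 0 < cv < 1")
    n_train = int(len(samples) * cv)
    return samples[:n_train], samples[n_train:]

data = list(range(10))
train, test = fractional_split(data, cv=0.8)
print(len(train), len(test))  # 8 2
```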

v0.4.0

23 Jul 17:12

  • improve the optimizer class structure
  • lower memory usage
  • add testing of the optimization process
  • a lot of cleanup and several bug fixes (mostly in parallel tempering)