TunerAsync using Asynchronous Decentralized Bayesian Optimization
Source: R/TunerADBO.R
mlr_tuners_adbo.Rd
TunerADBO class that implements Asynchronous Decentralized Bayesian Optimization (ADBO).
ADBO is a variant of Asynchronous Model Based Optimization (AMBO) that uses AcqFunctionStochasticCB with exponential lambda decay.
This is a minimal interface that internally passes on to OptimizerAsyncMbo.
For additional information and documentation see OptimizerAsyncMbo.
Currently, only single-objective optimization is supported. TunerADBO is considered an experimental feature and its API might be subject to change.
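Construction typically goes through the tuner dictionary, mirroring the example at the end of this page (a minimal sketch; it assumes mlr3mbo is loaded so that the "adbo" key is registered):

library(mlr3tuning)
library(mlr3mbo)  # assumed: provides TunerADBO and registers the "adbo" key
tuner = tnr("adbo")
tuner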
Parameters
initial_design
data.table::data.table()
Initial design of the optimization. If NULL, a design of size design_size is generated with the specified design_function. Default is NULL.

design_size
integer(1)
Size of the initial design if it is to be generated. Default is 100.

design_function
character(1)
Sampling function to generate the initial design. Can be random (paradox::generate_design_random), lhs (paradox::generate_design_lhs), or sobol (paradox::generate_design_sobol). Default is sobol.

n_workers
integer(1)
Number of parallel workers. If NULL, all rush workers specified via rush::rush_plan() are used. Default is NULL.
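For illustration, these parameters are set at construction or changed afterwards via the parameter set (a sketch with illustrative values; parameter names as documented above):

tuner = tnr("adbo", design_size = 20, design_function = "lhs", n_workers = 2)
# change a value after construction via the parameter set
tuner$param_set$set_values(design_size = 50)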
References
Egelé, Romain, Guyon, Isabelle, Vishwanath, Venkatram, Balaprakash, Prasanna (2023). “Asynchronous Decentralized Bayesian Optimization for Large Scale Hyperparameter Optimization.” In 2023 IEEE 19th International Conference on e-Science (e-Science), 1–10.
Super classes
mlr3tuning::Tuner
-> mlr3tuning::TunerAsync
-> mlr3tuning::TunerAsyncFromOptimizerAsync
-> TunerADBO
Active bindings
surrogate
(Surrogate | NULL)
The surrogate.

acq_function
(AcqFunction | NULL)
The acquisition function.

acq_optimizer
(AcqOptimizer | NULL)
The acquisition function optimizer.

result_assigner
(ResultAssigner | NULL)
The result assigner.

param_classes
(character())
Supported parameter classes that the optimizer can optimize. Determined based on the surrogate and the acq_optimizer. This corresponds to the values given by a paradox::ParamSet's $class field.

properties
(character())
Set of properties of the optimizer. Must be a subset of bbotk_reflections$optimizer_properties. MBO in principle is very flexible and by default we assume that the optimizer has all properties. When fully initialized, properties are determined based on the loop, e.g., the loop_function, and the surrogate.

packages
(character())
Set of required packages. A warning is signaled prior to optimization if at least one of the packages is not installed, but loaded (not attached) later on-demand via requireNamespace(). Required packages are determined based on the acq_function, the surrogate, and the acq_optimizer.
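After a run, the fitted components can be inspected through these bindings (a sketch; it assumes an instance set up and optimized as in the Examples section below):

tuner = tnr("adbo", design_size = 4, n_workers = 2)
tuner$optimize(instance)
tuner$surrogate     # the fitted Surrogate (NULL if not yet initialized)
tuner$acq_function  # AcqFunctionStochasticCB, as described above
tuner$packages      # packages required by surrogate, acq_function and acq_optimizer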
Methods
Method reset()
Reset the tuner.
Sets the following fields to NULL: surrogate, acq_function, acq_optimizer, result_assigner.
Resets the parameter values design_size and design_function to their defaults.
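A sketch of the effect of reset() (field and parameter names as documented above; defaults as in the Parameters section):

tuner = tnr("adbo", design_size = 4, design_function = "lhs")
tuner$reset()
tuner$surrogate                         # NULL again
tuner$param_set$values$design_size      # back to the default of 100
tuner$param_set$values$design_function  # back to the default "sobol"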
Examples
# \donttest{
if (requireNamespace("rush") &
requireNamespace("mlr3learners") &
requireNamespace("DiceKriging") &
requireNamespace("rgenoud")) {
if (redis_available()) {
library(mlr3)
library(mlr3tuning)
# single-objective
task = tsk("wine")
learner = lrn("classif.rpart", cp = to_tune(lower = 1e-4, upper = 1, logscale = TRUE))
resampling = rsmp("cv", folds = 3)
measure = msr("classif.acc")
instance = TuningInstanceAsyncSingleCrit$new(
task = task,
learner = learner,
resampling = resampling,
measure = measure,
terminator = trm("evals", n_evals = 10))
rush::rush_plan(n_workers=2)
tnr("adbo", design_size = 4, n_workers = 2)$optimize(instance)
} else {
message("Redis server is not available.\nPlease set up Redis prior to running the example.")
}
}
#> Redis server is not available.
#> Please set up Redis prior to running the example.
# }