TunerMbo
class that implements Model Based Optimization (MBO).
This is a minimal interface that internally passes everything on to OptimizerMbo.
For additional information and documentation see OptimizerMbo.
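For example, a TunerMbo is typically obtained via the mlr3tuning dictionary sugar tnr(). A minimal sketch (mlr3mbo must be loaded so that "mbo" is registered in mlr_tuners):

library(mlr3tuning)
library(mlr3mbo)

# all components start out as NULL and are either set explicitly
# or filled in with defaults during optimization (see ?mbo_defaults)
tuner = tnr("mbo")
tuner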
Super classes
mlr3tuning::Tuner
-> mlr3tuning::TunerBatch
-> mlr3tuning::TunerBatchFromOptimizerBatch
-> TunerMbo
Active bindings
loop_function
(loop_function | NULL)
Loop function determining the MBO flavor.

surrogate
(Surrogate | NULL)
The surrogate.

acq_function
(AcqFunction | NULL)
The acquisition function.

acq_optimizer
(AcqOptimizer | NULL)
The acquisition function optimizer.

args
(named list())
Further arguments passed to the loop_function. For example, random_interleave_iter.

result_assigner
(ResultAssigner | NULL)
The result assigner.

param_classes
(character())
Supported parameter classes that the optimizer can optimize. Determined based on the surrogate and the acq_optimizer. This corresponds to the values given by a paradox::ParamSet's $class field.

properties
(character())
Set of properties of the optimizer. Must be a subset of bbotk_reflections$optimizer_properties. MBO in principle is very flexible and by default we assume that the optimizer has all properties. When fully initialized, properties are determined based on the loop_function and surrogate.

packages
(character())
Set of required packages. A warning is signaled prior to optimization if at least one of the packages is not installed, but loaded (not attached) later on-demand via requireNamespace(). Required packages are determined based on the acq_function, surrogate and the acq_optimizer.
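As a sketch of how these bindings interact (assuming mlr3mbo, mlr3learners, DiceKriging and bbotk are installed; the chosen components are purely illustrative), the components can be set after construction and the derived fields inspected:

library(bbotk)
library(mlr3learners)
library(mlr3mbo)
library(mlr3tuning)

tuner = tnr("mbo")

# settable active bindings, passed through to the wrapped OptimizerMbo
tuner$loop_function = bayesopt_ego
tuner$surrogate = srlrn(lrn("regr.km"))
tuner$acq_function = acqf("ei")
tuner$acq_optimizer = acqo(opt("random_search", batch_size = 100),
  terminator = trm("evals", n_evals = 100))
tuner$args = list(random_interleave_iter = 2)

# fields derived from the components above
tuner$param_classes
tuner$properties
tuner$packages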
Methods
Method new()
Creates a new instance of this R6 class.
For more information on default values for loop_function, surrogate, acq_function and acq_optimizer, see ?mbo_defaults.
Note that all parameters below are simply passed on to the OptimizerMbo; the respective fields of this tuner are (settable) active bindings to the fields of that OptimizerMbo.
Usage
TunerMbo$new(
loop_function = NULL,
surrogate = NULL,
acq_function = NULL,
acq_optimizer = NULL,
args = NULL,
result_assigner = NULL
)
Arguments
loop_function
(loop_function | NULL)
Loop function determining the MBO flavor.

surrogate
(Surrogate | NULL)
The surrogate.

acq_function
(AcqFunction | NULL)
The acquisition function.

acq_optimizer
(AcqOptimizer | NULL)
The acquisition function optimizer.

args
(named list())
Further arguments passed to the loop_function. For example, random_interleave_iter.

result_assigner
(ResultAssigner | NULL)
The result assigner.
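A sketch of constructing the tuner with explicit components (the sugar functions srlrn(), acqf() and acqo() from mlr3mbo are used; any component left at NULL falls back to the defaults described in ?mbo_defaults when optimization starts):

library(mlr3learners)
library(mlr3mbo)
library(mlr3tuning)

# only the surrogate and acquisition function are specified here; the loop
# function and acquisition function optimizer are filled in with defaults later
tuner = TunerMbo$new(
  surrogate = srlrn(lrn("regr.km")),
  acq_function = acqf("ei"))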
Method reset()
Reset the tuner.
Sets the following fields to NULL: loop_function, surrogate, acq_function, acq_optimizer, args, result_assigner.
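A minimal sketch of resetting a configured tuner (continuing from the sketches above, with mlr3mbo and mlr3tuning loaded):

tuner = tnr("mbo")
tuner$acq_function = acqf("ei")
tuner$reset()
tuner$acq_function  # NULL again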
Examples
# \donttest{
if (requireNamespace("mlr3learners") &&
    requireNamespace("DiceKriging") &&
    requireNamespace("rgenoud")) {

  library(mlr3)
  library(mlr3mbo)
  library(mlr3tuning)

  # single-objective
  task = tsk("wine")
  learner = lrn("classif.rpart", cp = to_tune(lower = 1e-4, upper = 1, logscale = TRUE))
  resampling = rsmp("cv", folds = 3)
  measure = msr("classif.acc")

  instance = TuningInstanceBatchSingleCrit$new(
    task = task,
    learner = learner,
    resampling = resampling,
    measure = measure,
    terminator = trm("evals", n_evals = 5))

  tnr("mbo")$optimize(instance)

  # multi-objective
  task = tsk("wine")
  learner = lrn("classif.rpart", cp = to_tune(lower = 1e-4, upper = 1, logscale = TRUE))
  resampling = rsmp("cv", folds = 3)
  measures = msrs(c("classif.acc", "selected_features"))

  instance = TuningInstanceBatchMultiCrit$new(
    task = task,
    learner = learner,
    resampling = resampling,
    measures = measures,
    terminator = trm("evals", n_evals = 5),
    store_models = TRUE) # required by the selected_features measure

  tnr("mbo")$optimize(instance)
}
#> WARN [16:46:06.066] [mlr3] train: BFGS hit on best individual produced Out of Boundary individual.
#> WARN [16:46:06.067] [mlr3] train: BFGS hit on best individual produced Out of Boundary individual.
#> WARN [16:46:06.067] [mlr3] train: BFGS hit on best individual produced Out of Boundary individual.
#> cp learner_param_vals x_domain classif.acc selected_features
#> <num> <list> <list> <num> <num>
#> 1: -3.599781453 <list[2]> <list[1]> 0.8309793 2.333333
#> 2: -8.204951502 <list[2]> <list[1]> 0.8648776 2.666667
#> 3: -1.297196360 <list[2]> <list[1]> 0.7860640 2.000000
#> 4: -5.902366271 <list[2]> <list[1]> 0.8648776 2.666667
#> 5: -0.000381697 <list[2]> <list[1]> 0.3989642 0.000000
# }