Sequential Single-Objective Bayesian Optimization
Source:R/bayesopt_ego.R
mlr_loop_functions_ego.Rd
Loop function for sequential single-objective Bayesian Optimization. Normally used inside an OptimizerMbo.
In each iteration after the initial design, the surrogate and acquisition function are updated and the next candidate is chosen based on optimizing the acquisition function.
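The iteration described above can be sketched in schematic R; the helper calls below mirror the public fields of the involved classes, but this is an illustrative outline, not the actual body of bayesopt_ego() (see R/bayesopt_ego.R for the implementation):

```r
# Illustrative sketch of one EGO-style loop; not the real internals.
repeat {
  surrogate$update()                    # refit the surrogate on the archive
  acq_function$update()                 # e.g. refresh the current incumbent
  candidate = acq_optimizer$optimize()  # optimize the acquisition function
  instance$eval_batch(candidate)       # evaluate; result lands in the archive
  if (instance$is_terminated) break    # stop when the terminator fires
}
```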
Usage
bayesopt_ego(
instance,
surrogate,
acq_function,
acq_optimizer,
init_design_size = NULL,
random_interleave_iter = 0L
)
Arguments
- instance
  (bbotk::OptimInstanceBatchSingleCrit)
  The bbotk::OptimInstanceBatchSingleCrit to be optimized.
- surrogate
  (Surrogate)
  Surrogate to be used. Typically a SurrogateLearner.
- acq_function
  (AcqFunction)
  AcqFunction to be used as acquisition function.
- acq_optimizer
  (AcqOptimizer)
  AcqOptimizer to be used as acquisition function optimizer.
- init_design_size
  (NULL | integer(1))
  Size of the initial design. If NULL and the bbotk::ArchiveBatch contains no evaluations, 4 * d is used, with d being the dimensionality of the search space. Points are generated via a Sobol sequence.
- random_interleave_iter
  (integer(1))
  Every random_interleave_iter iteration (starting after the initial design), a point is sampled uniformly at random and evaluated (instead of a model-based proposal). For example, if random_interleave_iter = 2, random interleaving is performed in the second, fourth, sixth, ... iteration. Default is 0, i.e., no random interleaving is performed at all.
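As a sketch of how this argument reaches the loop function (assuming the args field of OptimizerMbo, which forwards additional arguments to the loop function), random interleaving in every second iteration could be requested like this:

```r
# Sketch: `args` entries are assumed to be forwarded to bayesopt_ego(),
# so every second model-based proposal is replaced by a uniform random point.
optimizer = opt("mbo",
  loop_function = bayesopt_ego,
  args = list(random_interleave_iter = 2))
```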
Note
The acq_function$surrogate, even if already populated, will always be overwritten by the surrogate.
The acq_optimizer$acq_function, even if already populated, will always be overwritten by acq_function.
The surrogate$archive, even if already populated, will always be overwritten by the bbotk::ArchiveBatch of the bbotk::OptimInstanceBatchSingleCrit.
References
Jones, Donald R., Schonlau, Matthias, Welch, William J. (1998). “Efficient Global Optimization of Expensive Black-Box Functions.” Journal of Global Optimization, 13(4), 455–492.
Snoek, Jasper, Larochelle, Hugo, Adams, Ryan P. (2012). “Practical Bayesian Optimization of Machine Learning Algorithms.” In Pereira F, Burges CJC, Bottou L, Weinberger KQ (eds.), Advances in Neural Information Processing Systems, volume 25, 2951–2959.
Examples
# \donttest{
if (requireNamespace("mlr3learners") &&
    requireNamespace("DiceKriging") &&
    requireNamespace("rgenoud")) {

  library(bbotk)
  library(paradox)
  library(mlr3learners)

  fun = function(xs) {
    list(y = xs$x ^ 2)
  }
  domain = ps(x = p_dbl(lower = -10, upper = 10))
  codomain = ps(y = p_dbl(tags = "minimize"))
  objective = ObjectiveRFun$new(fun = fun, domain = domain, codomain = codomain)

  instance = OptimInstanceBatchSingleCrit$new(
    objective = objective,
    terminator = trm("evals", n_evals = 5))

  surrogate = default_surrogate(instance)

  acq_function = acqf("ei")

  acq_optimizer = acqo(
    optimizer = opt("random_search", batch_size = 100),
    terminator = trm("evals", n_evals = 100))

  optimizer = opt("mbo",
    loop_function = bayesopt_ego,
    surrogate = surrogate,
    acq_function = acq_function,
    acq_optimizer = acq_optimizer)

  optimizer$optimize(instance)

  # expected improvement per second example
  fun = function(xs) {
    list(y = xs$x ^ 2, time = abs(xs$x))
  }
  domain = ps(x = p_dbl(lower = -10, upper = 10))
  codomain = ps(y = p_dbl(tags = "minimize"), time = p_dbl(tags = "time"))
  objective = ObjectiveRFun$new(fun = fun, domain = domain, codomain = codomain)

  instance = OptimInstanceBatchSingleCrit$new(
    objective = objective,
    terminator = trm("evals", n_evals = 5))

  surrogate = default_surrogate(instance, n_learner = 2)
  surrogate$cols_y = c("y", "time")

  optimizer = opt("mbo",
    loop_function = bayesopt_ego,
    surrogate = surrogate,
    acq_function = acqf("eips"),
    acq_optimizer = acq_optimizer)

  optimizer$optimize(instance)
}
#> x x_domain y
#> <num> <list> <num>
#> 1: 1.55992 <list[1]> 2.433351
# }