Lower / Upper Confidence Bound with lambda sampling and decay. The initial \(\lambda\) is drawn from a uniform distribution between min_lambda and max_lambda or from an exponential distribution with rate 1 / lambda. After each update, \(\lambda\) is decayed according to lambda * exp(-rate * (t %% period)), where t is the number of times the acquisition function has been updated.
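
The sampling and decay rule can be sketched in plain R as follows. This is an illustrative reimplementation of the formula above, not the package internals; the helper names sample_lambda and decay_lambda are made up for this sketch:

sample_lambda = function(distribution = "uniform", lambda = 1.96,
                         min_lambda = 0.01, max_lambda = 10) {
  if (distribution == "uniform") {
    runif(1L, min = min_lambda, max = max_lambda)  # uniform on [min_lambda, max_lambda]
  } else {
    rexp(1L, rate = 1 / lambda)  # exponential with rate 1 / lambda
  }
}

decay_lambda = function(lambda_0, rate, t, period = NULL) {
  # with period = NULL the decay is aperiodic, otherwise t wraps around
  t = if (is.null(period)) t else t %% period
  lambda_0 * exp(-rate * t)
}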

While this acquisition function would usually be used within an asynchronous optimizer, e.g., OptimizerAsyncMbo, it can in principle also be used within synchronous optimizers, e.g., OptimizerMbo.
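
For instance, plugging the acquisition function into a synchronous optimizer could look as follows (a minimal sketch assuming bbotk and mlr3mbo are loaded; the remaining components such as the loop function, surrogate, and acquisition function optimizer are left at their defaults):

optimizer = opt("mbo", acq_function = acqf("stochastic_cb"))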

Dictionary

This AcqFunction can be instantiated via the dictionary mlr_acqfunctions or with the associated sugar function acqf():

mlr_acqfunctions$get("stochastic_cb")
acqf("stochastic_cb")

Parameters

  • "lambda" (numeric(1))
    \(\lambda\) value for sampling from the exponential distribution. Defaults to 1.96.

  • "min_lambda" (numeric(1))
    Minimum value of \(\lambda\) for sampling from the uniform distribution. Defaults to 0.01.

  • "max_lambda" (numeric(1))
    Maximum value of \(\lambda\) for sampling from the uniform distribution. Defaults to 10.

  • "distribution" (character(1))
    Distribution to sample \(\lambda\) from. One of c("uniform", "exponential"). Defaults to "uniform".

  • "rate" (numeric(1))
    Rate of the exponential decay. Defaults to 0, i.e., no decay.

  • "period" (integer(1))
    Period of the exponential decay. Defaults to NULL, i.e., the decay has no period.
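
As an illustration of these parameters, the following would construct the acquisition function with exponential \(\lambda\) sampling and a periodic decay (the concrete values are arbitrary):

acq_function = acqf("stochastic_cb",
  lambda = 2,
  distribution = "exponential",
  rate = 0.1,
  period = 25L
)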

References

  • Snoek, Jasper, Larochelle, Hugo, Adams, Ryan P (2012). “Practical Bayesian Optimization of Machine Learning Algorithms.” In Pereira F, Burges CJC, Bottou L, Weinberger KQ (eds.), Advances in Neural Information Processing Systems, volume 25, 2951–2959.

  • Egelé, Romain, Guyon, Isabelle, Vishwanath, Venkatram, Balaprakash, Prasanna (2023). “Asynchronous Decentralized Bayesian Optimization for Large Scale Hyperparameter Optimization.” In 2023 IEEE 19th International Conference on e-Science (e-Science), 1–10.

Super classes

bbotk::Objective -> mlr3mbo::AcqFunction -> AcqFunctionStochasticCB

Methods

Method new()

Creates a new instance of this R6 class.

Usage

AcqFunctionStochasticCB$new(
  surrogate = NULL,
  lambda = 1.96,
  min_lambda = 0.01,
  max_lambda = 10,
  distribution = "uniform",
  rate = 0,
  period = NULL
)

Arguments

  • surrogate
    (NULL | SurrogateLearner).

  • lambda
    (numeric(1)).

  • min_lambda
    (numeric(1)).

  • max_lambda
    (numeric(1)).

  • distribution
    (character(1)).

  • rate
    (numeric(1)).

  • period
    (NULL | integer(1)).


Method update()

Update the acquisition function. Samples and decays \(\lambda\).

Usage

AcqFunctionStochasticCB$update()


Method reset()

Reset the acquisition function. Resets the private update counter .t used within the \(\lambda\) decay.

Usage

AcqFunctionStochasticCB$reset()


Method clone()

The objects of this class are cloneable with this method.

Usage

AcqFunctionStochasticCB$clone(deep = FALSE)

Arguments

  • deep
    Whether to make a deep clone.

Examples

if (requireNamespace("mlr3learners", quietly = TRUE) &&
    requireNamespace("DiceKriging", quietly = TRUE) &&
    requireNamespace("rgenoud", quietly = TRUE)) {
  library(bbotk)
  library(mlr3mbo)
  library(paradox)
  library(mlr3learners)
  library(data.table)

  # objective: minimize y = x^2 over x in [-10, 10]
  fun = function(xs) {
    list(y = xs$x ^ 2)
  }
  domain = ps(x = p_dbl(lower = -10, upper = 10))
  codomain = ps(y = p_dbl(tags = "minimize"))
  objective = ObjectiveRFun$new(fun = fun, domain = domain, codomain = codomain)

  instance = OptimInstanceBatchSingleCrit$new(
    objective = objective,
    terminator = trm("evals", n_evals = 5))

  # evaluate an initial design of four points
  instance$eval_batch(data.table(x = c(-6, -5, 3, 9)))

  # Gaussian process surrogate trained on the archive
  learner = default_gp()

  surrogate = srlrn(learner, archive = instance$archive)

  acq_function = acqf("stochastic_cb", surrogate = surrogate, lambda = 3)

  # update the surrogate, then the acquisition function (samples lambda)
  acq_function$surrogate$update()
  acq_function$update()

  # evaluate the acquisition function on candidate points
  acq_function$eval_dt(data.table(x = c(-1, 0, 1)))
}
#>       acq_cb acq_lambda acq_lambda_0
#>        <num>      <num>        <num>
#> 1: -165.5554   7.067187     7.067187
#> 2: -162.4109   7.067187     7.067187
#> 3: -140.7614   7.067187     7.067187