Subclass for generalized simulated annealing tuning. Calls GenSA::GenSA() from package GenSA.

Source

Tsallis C, Stariolo DA (1996). “Generalized simulated annealing.” Physica A: Statistical Mechanics and its Applications, 233(1-2), 395-406. doi:10.1016/s0378-4371(96)00271-3.

Xiang Y, Gubian S, Suomela B, Hoeng J (2013). “Generalized Simulated Annealing for Global Optimization: The GenSA Package.” The R Journal, 5(1), 13. doi:10.32614/rj-2013-002.

Dictionary

This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

TunerGenSA$new()
mlr_tuners$get("gensa")
tnr("gensa")

Parallelization

In order to support general termination criteria and parallelization, points are evaluated in batches of size batch_size. Larger batches allow more parallelization, while smaller batches imply a more fine-grained checking of termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, if you set a batch size of 10 points and use a 5-fold cross-validation, you can utilize up to 50 cores.

Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
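A minimal sketch, assuming the future package is installed; the plan and worker count are illustrative choices, not defaults:

# illustrative: run resampling iterations on 4 local background processes
future::plan("multisession", workers = 4)

# subsequent calls to tune() or $optimize() then evaluate each batch in parallel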

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
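For example, to silence everything below the warning level (set_threshold() is the standard lgr method for this):

# show only warnings and errors from the tuning machinery
lgr::get_logger("bbotk")$set_threshold("warn")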

Parameters

smooth (logical(1))
temperature (numeric(1))
acceptance.param (numeric(1))
verbose (logical(1))
trace.mat (logical(1))

For the meaning of the control parameters, see GenSA::GenSA(). Note that we have removed all control parameters that refer to the termination of the algorithm, since the same behavior can be obtained with our Terminators.
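As a sketch, control parameters can be passed to tnr() on construction or set later via the param set; the values below are arbitrary examples, not recommendations:

# construct the tuner with a custom control parameter
tuner = tnr("gensa", smooth = FALSE)

# or set a value later via the param set (5230 is an arbitrary example)
tuner$param_set$values$temperature = 5230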

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend the package progress as the backend, which can be enabled with progressr::handlers("progress").
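A minimal sketch, assuming a tuner and a tuning instance have already been created:

# render progress bars with the progress package
progressr::handlers("progress")

# enable progress bars for this optimization run
progressr::with_progress(tuner$optimize(instance))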

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerGenSA

Methods

Method new()

Creates a new instance of this R6 class.

Usage

TunerGenSA$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

TunerGenSA$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the pima indians diabetes data set
instance = tune(
  method = "gensa",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)

# best performing hyperparameter configuration
instance$result
#>           cp learner_param_vals  x_domain classif.ce
#> 1: -3.715492          <list[2]> <list[1]>  0.2421875

# all evaluated hyperparameter configurations
as.data.table(instance$archive)
#>            cp classif.ce  x_domain_cp runtime_learners           timestamp
#>  1: -3.004370  0.2773438 0.0495699974            0.013 2022-01-23 04:26:38
#>  2: -7.041797  0.2617188 0.0008745538            0.015 2022-01-23 04:26:38
#>  3: -8.499903  0.2617188 0.0002034881            0.014 2022-01-23 04:26:38
#>  4: -7.041797  0.2617188 0.0008745538            0.015 2022-01-23 04:26:38
#>  5: -7.041796  0.2617188 0.0008745546            0.015 2022-01-23 04:26:38
#>  6: -7.041798  0.2617188 0.0008745529            0.014 2022-01-23 04:26:38
#>  7: -6.963782  0.2617188 0.0009455138            0.013 2022-01-23 04:26:38
#>  8: -6.123928  0.2617188 0.0021898370            0.014 2022-01-23 04:26:38
#>  9: -3.715492  0.2421875 0.0243434635            0.013 2022-01-23 04:26:38
#> 10: -3.314132  0.2773438 0.0363656027            0.013 2022-01-23 04:26:38
#>     batch_nr      resample_result
#>  1:        1 <ResampleResult[22]>
#>  2:        2 <ResampleResult[22]>
#>  3:        3 <ResampleResult[22]>
#>  4:        4 <ResampleResult[22]>
#>  5:        5 <ResampleResult[22]>
#>  6:        6 <ResampleResult[22]>
#>  7:        7 <ResampleResult[22]>
#>  8:        8 <ResampleResult[22]>
#>  9:        9 <ResampleResult[22]>
#> 10:       10 <ResampleResult[22]>

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)
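
As a follow-up sketch, the trained learner can then be used for prediction; scoring on the training task here is purely illustrative:

# predict on the training task and score the result
prediction = learner$predict(task)
prediction$score(msr("classif.ce"))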