Subclass for generalized simulated annealing tuning, calling GenSA::GenSA() from package GenSA.

Source

Tsallis C, Stariolo DA (1996). “Generalized simulated annealing.” Physica A: Statistical Mechanics and its Applications, 233(1-2), 395–406. doi:10.1016/s0378-4371(96)00271-3.

Xiang Y, Gubian S, Suomela B, Hoeng J (2013). “Generalized Simulated Annealing for Global Optimization: The GenSA Package.” The R Journal, 5(1), 13. doi:10.32614/rj-2013-002.

Dictionary

This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

TunerGenSA$new()
mlr_tuners$get("gensa")
tnr("gensa")

Parallelization

In order to support general termination criteria and parallelization, we evaluate points in batches of size batch_size. Larger batches allow for more parallelization, while smaller batches imply more fine-grained checking of termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, with a batch size of 10 points and 5-fold cross-validation, up to 50 cores can be utilized.

Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
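
As a minimal sketch, registering a future backend before tuning parallelizes the resampling iterations within each batch (the worker count below is illustrative):

# run the resampling iterations of each batch on up to 5 workers
future::plan("multisession", workers = 5)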

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
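
For example, to suppress informational tuning output (set_threshold() is part of the lgr Logger API):

# raise the threshold so only warnings and errors are printed
lgr::get_logger("bbotk")$set_threshold("warn")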

Parameters

smooth (logical(1))
temperature (numeric(1))
acceptance.param (numeric(1))
verbose (logical(1))
trace.mat (logical(1))

For the meaning of the control parameters, see GenSA::GenSA(). Note that we have removed all control parameters that refer to the termination of the algorithm, as our Terminators provide the same behavior.
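
As a sketch, the remaining control parameters can be passed directly to the sugar function; the values below are illustrative, not tuned recommendations:

tuner = tnr("gensa",
  smooth = FALSE,     # objective is noisy due to resampling
  temperature = 5230  # initial temperature, GenSA's documented default
)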

Progress Bars

$optimize() supports progress bars via the package progressr, combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using package progress as the backend; enable it with progressr::handlers("progress").
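
A minimal sketch, assuming a tuner (e.g., tnr("gensa")) and a tuning instance with a Terminator have already been constructed:

# use the progress package as backend and wrap the optimization call
progressr::handlers("progress")
progressr::with_progress(
  tuner$optimize(instance)
)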

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerGenSA

Methods

Method new()

Creates a new instance of this R6 class.

Usage

TunerGenSA$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

TunerGenSA$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
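
For example, a minimal sketch of copying a configured tuner:

tuner = tnr("gensa")
tuner2 = tuner$clone(deep = TRUE)  # independent copy, including its param_set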

Examples

# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the pima indians diabetes data set
instance = tune(
  method = "gensa",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)

# best performing hyperparameter configuration
instance$result
#>           cp learner_param_vals  x_domain classif.ce
#> 1: -4.930703          <list[2]> <list[1]>  0.2851562
# all evaluated hyperparameter configurations
as.data.table(instance$archive)
#>            cp classif.ce  x_domain_cp runtime_learners           timestamp
#>  1: -5.550318  0.3046875 0.0038862207            0.012 2021-09-16 04:23:17
#>  2: -2.679990  0.3164062 0.0685638304            0.010 2021-09-16 04:23:17
#>  3: -7.398271  0.3242188 0.0006123106            0.012 2021-09-16 04:23:17
#>  4: -5.550318  0.3046875 0.0038862207            0.012 2021-09-16 04:23:17
#>  5: -5.550317  0.3046875 0.0038862246            0.012 2021-09-16 04:23:17
#>  6: -5.550319  0.3046875 0.0038862168            0.013 2021-09-16 04:23:18
#>  7: -4.930703  0.2851562 0.0072214248            0.012 2021-09-16 04:23:18
#>  8: -4.090849  0.2890625 0.0167250258            0.015 2021-09-16 04:23:18
#>  9: -4.930703  0.2851562 0.0072214248            0.012 2021-09-16 04:23:18
#> 10: -4.930702  0.2851562 0.0072214320            0.015 2021-09-16 04:23:18
#>     batch_nr      resample_result
#>  1:        1 <ResampleResult[20]>
#>  2:        2 <ResampleResult[20]>
#>  3:        3 <ResampleResult[20]>
#>  4:        4 <ResampleResult[20]>
#>  5:        5 <ResampleResult[20]>
#>  6:        6 <ResampleResult[20]>
#>  7:        7 <ResampleResult[20]>
#>  8:        8 <ResampleResult[20]>
#>  9:        9 <ResampleResult[20]>
#> 10:       10 <ResampleResult[20]>
# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)