
Subclass for random search tuning.

The random points are sampled by paradox::generate_design_random().
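
For illustration, a minimal sketch of the underlying sampling; the search space below is a made-up example, not part of this Tuner's interface:

library(paradox)

# define a toy search space (illustrative parameter ranges)
search_space = ps(
  cp = p_dbl(lower = 1e-04, upper = 0.1),
  minsplit = p_int(lower = 1, upper = 10)
)

# draw 5 points uniformly at random from the search space
design = generate_design_random(search_space, n = 5)
design$data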

Source

Bergstra J, Bengio Y (2012). “Random Search for Hyper-Parameter Optimization.” Journal of Machine Learning Research, 13(10), 281-305. https://jmlr.csail.mit.edu/papers/v13/bergstra12a.html.

Dictionary

This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

TunerRandomSearch$new()
mlr_tuners$get("random_search")
tnr("random_search")

Parallelization

To support general termination criteria and parallelization, points are evaluated in batches of size batch_size. Larger batches allow more parallelization, while smaller batches imply a more fine-grained checking of termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, if you set a batch size of 10 points and use a 5-fold cross-validation, you can utilize up to 50 cores.
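
For instance, a sketch of constructing the tuner with an explicit batch size (the value 10 is illustrative):

# construct the tuner with a batch size of 10 points
tuner = tnr("random_search", batch_size = 10)

# with 5-fold cross-validation, each batch spawns 10 * 5 = 50 resampling jobs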

Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
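
A minimal sketch of enabling parallelization, assuming the future package is installed; the number of workers is illustrative:

# evaluate resampling jobs in 4 parallel local R sessions
future::plan("multisession", workers = 4)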

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
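
For example, a sketch of reducing the logger's verbosity; the threshold "warn" suppresses the default info-level messages:

# silence info-level tuning logs from bbotk
lgr::get_logger("bbotk")$set_threshold("warn")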

Optimizer

This Tuner is based on bbotk::OptimizerRandomSearch which can be applied on any black box optimization problem. See also the documentation of bbotk.
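
A short sketch of retrieving the underlying optimizer directly from bbotk:

# the same random search, applicable to any bbotk optimization instance
optimizer = bbotk::opt("random_search")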

Parameters

batch_size

integer(1)
Maximum number of points to try in a batch.

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
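
A sketch, reusing the tune() call from the Examples section below:

progressr::handlers("progress")
progressr::with_progress({
  instance = tune(
    method = "random_search",
    task = tsk("pima"),
    learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE)),
    resampling = rsmp("holdout"),
    measure = msr("classif.ce"),
    term_evals = 10
  )
})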

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerRandomSearch

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

TunerRandomSearch$new()
Method clone()

The objects of this class are cloneable with this method.

Usage

TunerRandomSearch$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the Pima Indians Diabetes data set
instance = tune(
  method = "random_search",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)

# best performing hyperparameter configuration
instance$result
#>           cp learner_param_vals  x_domain classif.ce
#> 1: -4.433908          <list[2]> <list[1]>  0.2460938

# all evaluated hyperparameter configurations
as.data.table(instance$archive)
#>            cp classif.ce  x_domain_cp runtime_learners           timestamp
#>  1: -4.433908  0.2460938 0.0118680165            0.016 2022-08-13 04:25:26
#>  2: -8.714308  0.2656250 0.0001642193            0.016 2022-08-13 04:25:26
#>  3: -5.506266  0.2617188 0.0040612444            0.016 2022-08-13 04:25:26
#>  4: -2.642002  0.2773438 0.0712185711            0.020 2022-08-13 04:25:26
#>  5: -4.700957  0.2500000 0.0090865755            0.015 2022-08-13 04:25:26
#>  6: -6.457298  0.2656250 0.0015690289            0.017 2022-08-13 04:25:26
#>  7: -8.884045  0.2656250 0.0001385825            0.017 2022-08-13 04:25:26
#>  8: -7.126708  0.2656250 0.0008033594            0.017 2022-08-13 04:25:26
#>  9: -3.485727  0.2500000 0.0306314693            0.021 2022-08-13 04:25:26
#> 10: -7.211394  0.2656250 0.0007381277            0.015 2022-08-13 04:25:27
#>     batch_nr warnings errors      resample_result
#>  1:        1        0      0 <ResampleResult[21]>
#>  2:        2        0      0 <ResampleResult[21]>
#>  3:        3        0      0 <ResampleResult[21]>
#>  4:        4        0      0 <ResampleResult[21]>
#>  5:        5        0      0 <ResampleResult[21]>
#>  6:        6        0      0 <ResampleResult[21]>
#>  7:        7        0      0 <ResampleResult[21]>
#>  8:        8        0      0 <ResampleResult[21]>
#>  9:        9        0      0 <ResampleResult[21]>
#> 10:       10        0      0 <ResampleResult[21]>

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)
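
The tuned learner can then be used for prediction; a short sketch, scoring on the training task purely for illustration:

# predict and score on the training task (in-sample, for illustration only)
predictions = learner$predict(task)
predictions$score(msr("classif.acc"))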