Subclass for tuning w.r.t. fixed design points.

We simply search over a set of points fully specified by the user. The points in the design are evaluated in order as given.
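
For example, a design is simply a table whose columns match the hyperparameters of the search space and whose rows are the configurations to evaluate, from top to bottom. A minimal sketch (the hyperparameters cp and minsplit of a classification tree are only an assumed example):

library(data.table)

# each row is one configuration; rows are evaluated in the given order
design = data.table(
  cp       = c(0.01, 0.05, 0.1),
  minsplit = c(10, 10, 20)
)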

Dictionary

This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

TunerDesignPoints$new()
mlr_tuners$get("design_points")
tnr("design_points")

Parallelization

In order to support general termination criteria and parallelization, we evaluate points in batches of size batch_size. Larger batches allow more parallelization, while smaller batches imply a more fine-grained checking of termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, if you set a batch size of 10 points and use 5-fold cross-validation, you can utilize up to 50 cores.

Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
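
A sketch of a batched, parallel run (the task, learner, design and terminator are only illustrative; the future package must be installed):

library(mlr3tuning)

# resampling iterations within a batch are parallelized via future
future::plan("multisession")

instance = TuningInstanceSingleCrit$new(
  task = tsk("pima"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1)),
  resampling = rsmp("cv", folds = 5),
  measure = msr("classif.ce"),
  terminator = trm("none") # the tuner stops once the design is exhausted
)

# batch_size = 10 with 5-fold cross-validation yields up to 50 parallel jobs
tuner = tnr("design_points",
  design = data.table::data.table(cp = seq(1e-4, 1e-1, length.out = 10)),
  batch_size = 10
)
tuner$optimize(instance)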

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
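
For example, to reduce the tuning output to warnings and errors (a minimal sketch):

# lower the threshold of the bbotk logger used during tuning
lgr::get_logger("bbotk")$set_threshold("warn")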

Parameters

batch_size

integer(1)
Maximum number of configurations to try in a batch.

design

data.table::data.table
Design points to try in search, one per row.
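
Both parameters can be set when constructing the tuner via the sugar function; a sketch (the column cp is only an assumed search space parameter):

tuner = tnr("design_points",
  design = data.table::data.table(cp = c(0.01, 0.1)),
  batch_size = 1
)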

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function call in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
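
A minimal sketch, assuming tuner and instance have been constructed as in the parallelization example above:

# enable progress bars for $optimize()
progressr::handlers("progress")
progressr::with_progress(tuner$optimize(instance))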

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerDesignPoints

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

TunerDesignPoints$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

TunerDesignPoints$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

library(mlr3tuning)
library(data.table)

# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the pima indians diabetes data set
instance = tune(
  method = "design_points",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  design = data.table(cp = c(log(1e-1), log(1e-2)))
)

# best performing hyperparameter configuration
instance$result
#>          cp learner_param_vals  x_domain classif.ce
#> 1: -4.60517          <list[2]> <list[1]>  0.2734375

# all evaluated hyperparameter configurations
as.data.table(instance$archive)
#>           cp classif.ce x_domain_cp runtime_learners           timestamp
#> 1: -2.302585  0.2773438        0.10            0.014 2022-01-23 04:26:37
#> 2: -4.605170  0.2734375        0.01            0.013 2022-01-23 04:26:37
#>    batch_nr      resample_result
#> 1:        1 <ResampleResult[22]>
#> 2:        2 <ResampleResult[22]>

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)