
Subclass for tuning w.r.t. fixed design points.

We simply search over a set of points fully specified by the user. The points in the design are evaluated in the order given.

Dictionary

This Tuner can be instantiated with the associated sugar function tnr():

tnr("design_points")

Parallelization

In order to support general termination criteria and parallelization, we evaluate points in batches of size batch_size. Larger batches allow more parallelization, while smaller batches enable more fine-grained checking of termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, if you set a batch size of 10 points and use a 5-fold cross-validation, you can utilize up to 50 cores.

Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
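
For illustration, here is a minimal sketch of a parallel setup; the worker count, the design, and the search space below are assumptions, not package defaults:

library(mlr3tuning)

# assumed design: four values for the cp hyperparameter of classif.rpart
design = data.table::data.table(cp = c(0.1, 0.01, 0.001, 0.0001))

# run on 4 local workers; a batch of 4 points resampled with
# 5-fold cross-validation yields up to 20 jobs that can run concurrently
future::plan("multisession", workers = 4)

instance = tune(
  tuner = tnr("design_points", design = design, batch_size = 4),
  task = tsk("penguins"),
  learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1)),
  resampling = rsmp("cv", folds = 5),
  measure = msr("classif.ce")
)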

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
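
For example, to silence informational messages from the tuning process, raise the logger's threshold (a minimal sketch using the standard lgr API):

# only print warnings and errors from bbotk
lgr::get_logger("bbotk")$set_threshold("warn")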

Optimizer

This Tuner is based on bbotk::OptimizerBatchDesignPoints, which can be applied to any black box optimization problem. See also the documentation of bbotk.
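
As a minimal sketch of the underlying optimizer on a plain black box problem (assuming the bbotk, paradox, and data.table packages are installed; the toy objective below is an assumption for illustration):

library(bbotk)

# toy objective: minimize y = x^2 over the domain [-5, 5]
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = xs$x^2),
  domain = ps(x = p_dbl(-5, 5)),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

# no extra termination criterion: the optimizer stops once the design is exhausted
instance = OptimInstanceBatchSingleCrit$new(
  objective = objective,
  terminator = trm("none")
)

# evaluate three fixed points in the given order
optimizer = opt("design_points", design = data.table::data.table(x = c(-2, 0, 1)))
optimizer$optimize(instance)
instance$result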

Parameters

batch_size

integer(1)
Maximum number of configurations to try in a batch.

design

data.table::data.table
Design points to try in search, one per row.

Resources

There are several sections about hyperparameter optimization in the mlr3book.

  • An overview of all tuners can be found on our website.

  • Learn more about tuners.

The gallery features a collection of case studies and demos about optimization.

  • Use the Hyperband optimizer with different budget parameters.

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function call in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
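
A minimal sketch (the design and search space below are assumptions for illustration):

library(progressr)
library(mlr3tuning)

# assumed two-point design over cp
design = data.table::data.table(cp = c(0.1, 0.01))

# use the progress package as backend and wrap the call
handlers("progress")
with_progress({
  instance = tune(
    tuner = tnr("design_points", design = design),
    task = tsk("penguins"),
    learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1)),
    resampling = rsmp("holdout"),
    measure = msr("classif.ce"),
    term_evals = 2
  )
})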

Methods

Method new()

Creates a new instance of this R6 class.

Usage

TunerBatchDesignPoints$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

TunerBatchDesignPoints$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Hyperparameter Optimization
library(mlr3tuning)

# load learner and set search space
learner = lrn("classif.rpart",
  cp = to_tune(1e-04, 1e-1),
  minsplit = to_tune(2, 128),
  minbucket = to_tune(1, 64)
)

# create design
design = mlr3misc::rowwise_table(
  ~cp,   ~minsplit,  ~minbucket,
  0.1,   2,          64,
  0.01,  64,         32,
  0.001, 128,        1
)

# run hyperparameter tuning on the Palmer Penguins data set
instance = tune(
  tuner = tnr("design_points", design = design),
  task = tsk("penguins"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce")
)

# best performing hyperparameter configuration
instance$result
#>       cp minbucket minsplit learner_param_vals  x_domain classif.ce
#>    <num>     <num>    <num>             <list>    <list>      <num>
#> 1:  0.01        32       64          <list[4]> <list[3]> 0.06956522

# all evaluated hyperparameter configurations
as.data.table(instance$archive)
#>       cp minbucket minsplit classif.ce x_domain_cp x_domain_minbucket
#>    <num>     <num>    <num>      <num>       <num>              <num>
#> 1: 0.100        64        2 0.15652174       0.100                 64
#> 2: 0.010        32       64 0.06956522       0.010                 32
#> 3: 0.001         1      128 0.06956522       0.001                  1
#>    x_domain_minsplit runtime_learners           timestamp batch_nr warnings
#>                <num>            <num>              <POSc>    <int>    <int>
#> 1:                 2            0.011 2024-06-30 09:41:23        1        0
#> 2:                64            0.009 2024-06-30 09:41:23        2        0
#> 3:               128            0.013 2024-06-30 09:41:23        3        0
#>    errors  resample_result
#>     <int>           <list>
#> 1:      0 <ResampleResult>
#> 2:      0 <ResampleResult>
#> 3:      0 <ResampleResult>

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("penguins"))