TuningInstanceSingleCrit: Class for Single Criterion Tuning
The instance contains an ObjectiveTuning object that encodes the black box objective function a Tuner has to optimize.
The instance allows the basic operations of querying the objective at design points ($eval_batch()).
This operation is usually done by the Tuner.
Evaluations of hyperparameter configurations are performed in batches by calling mlr3::benchmark() internally.
The evaluated hyperparameter configurations are stored in the Archive ($archive).
Before a batch is evaluated, the bbotk::Terminator is queried for the remaining budget.
If the available budget is exhausted, an exception is raised, and no further evaluations can be performed from this point on.
The tuner is also supposed to store its final result, consisting of a selected hyperparameter configuration and associated estimated performance values, by calling the method $assign_result().
If no measure is passed, the default measure is used. The default measure depends on the task type.
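For illustration, a minimal sketch of this life cycle without a Tuner (the configuration values below are arbitrary; evaluating batches by hand is what a Tuner normally does internally):

library(mlr3)
library(mlr3tuning)
library(data.table)

instance = ti(
  task = tsk("penguins"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 4)
)

# Evaluate a batch of two configurations. With logscale = TRUE the search
# space operates on log(cp), so values are supplied on the log scale.
instance$eval_batch(data.table(cp = log(c(1e-3, 1e-2))))

# Best configuration evaluated so far, taken from the archive
instance$archive$best()

# After n_evals = 4 evaluations the budget is exhausted and any further
# call to eval_batch() raises an exception.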
There are several sections about hyperparameter optimization in the mlr3book:

- Getting started with hyperparameter optimization.
- Tune a simple classification tree on the Sonar data set.
- Learn about tuning spaces.

The gallery features a collection of case studies and demos about optimization.
mlr3tuning is extended by the following packages:

- mlr3tuningspaces is a collection of search spaces for commonly used learners.
- mlr3hyperband adds the Hyperband and Successive Halving algorithms.
- mlr3mbo adds Bayesian optimization methods.
For analyzing the tuning results, it is recommended to pass the ArchiveTuning to as.data.table(). The returned data table is joined with the benchmark result, which adds the mlr3::ResampleResult for each hyperparameter evaluation.
The archive provides various getters (e.g. $learners()) to ease the access. All getters extract by position (i) or unique hash (uhash).
For a complete list of all getters see the methods section.
The benchmark result ($benchmark_result) allows scoring the hyperparameter configurations again on a different measure. Alternatively, measures can be supplied to as.data.table().
The mlr3viz package provides visualizations for tuning results.
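For example, a short sketch of these analysis steps, assuming a tuned instance like the one in the examples below:

# Archive as a data.table, joined with the benchmark result
as.data.table(instance$archive)

# mlr3::ResampleResult of the first evaluation, extracted by position
instance$archive$resample_result(1)

# Score the evaluated configurations on an additional measure
as.data.table(instance$archive, measures = msrs("classif.acc"))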
result_learner_param_vals: Param values for the optimal learner call.
Creates a new instance of this R6 class.
task (mlr3::Task): Task to operate on.
learner (mlr3::Learner): Learner to tune.
resampling (mlr3::Resampling): Resampling that is used to evaluate the performance of the hyperparameter configurations. Uninstantiated resamplings are instantiated during construction so that all configurations are evaluated on the same data splits. Already instantiated resamplings are kept unchanged. Specialized Tuners may change the resampling, e.g. to evaluate a hyperparameter configuration on different data splits. This field, however, always returns the resampling passed in construction.
measure (mlr3::Measure): Measure to optimize. If NULL, the default measure is used.
terminator (bbotk::Terminator): Stop criterion of the tuning process.
search_space (paradox::ParamSet): Hyperparameter search space. If NULL (default), the search space is constructed from the TuneToken of the learner's parameter set (learner$param_set); see the sketch after this argument list.
store_benchmark_result (logical(1)): If TRUE (default), store the resample results of the evaluated hyperparameter configurations in the archive as a mlr3::BenchmarkResult.
store_models (logical(1)): If TRUE, fitted models are stored in the benchmark result ($benchmark_result). If store_benchmark_result = FALSE, models are only stored temporarily and are not accessible after the tuning. This combination is needed for measures that require a model.
check_values (logical(1)): If TRUE, hyperparameter values are checked before evaluation and performance scores after. If FALSE (default), values are unchecked, but computational overhead is reduced.
allow_hotstart (logical(1)): Allow hotstarting of learners with previously fitted models; see also mlr3::HotstartStack and the sketch before the examples below. The learner must support hotstarting. Sets store_models = TRUE.
keep_hotstart_stack (logical(1)): If TRUE, the mlr3::HotstartStack is kept in $objective$hotstart_stack after tuning.
evaluate_default (logical(1)): If TRUE, the learner is evaluated with hyperparameters set to their default values at the start of the optimization.
callbacks (list of CallbackTuning): List of callbacks.
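As a minimal sketch of the two equivalent ways to define the search space mentioned for search_space above (the explicit paradox construction mirrors the TuneToken):

library(mlr3)
library(mlr3tuning)
library(paradox)

# Search space encoded as a TuneToken inside the learner's parameter set
learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE))

# Equivalent explicit search space, passed via the search_space argument
search_space = ps(cp = p_dbl(1e-4, 1e-1, logscale = TRUE))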
assign_result(): The Tuner object writes the best found point and estimated performance value here. For internal use.
learner_param_vals (List of named list()s): Fixed parameter values of the learner that are not part of the search space.
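A brief sketch of allow_hotstart, assuming the classif.debug learner from mlr3, which supports forward hotstarting over its iter hyperparameter:

library(mlr3)
library(mlr3tuning)

instance = TuningInstanceSingleCrit$new(
  task = tsk("penguins"),
  learner = lrn("classif.debug", iter = to_tune(1, 8)),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  terminator = trm("evals", n_evals = 4),
  allow_hotstart = TRUE # also sets store_models = TRUE
)

# Models fitted with fewer iterations are resumed instead of retrained
tnr("grid_search", batch_size = 2)$optimize(instance)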
# Hyperparameter optimization on the Palmer Penguins data set
task = tsk("penguins")

# Load learner and set search space
learner = lrn("classif.rpart",
  cp = to_tune(1e-04, 1e-1, logscale = TRUE)
)

# Construct tuning instance
instance = ti(
  task = task,
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 4)
)

# Choose optimization algorithm
tuner = tnr("random_search", batch_size = 2)

# Run tuning
tuner$optimize(instance)
#>           cp learner_param_vals x_domain classif.ce
#> 1: -8.429848             <list>   <list> 0.05232647

# Set optimal hyperparameter configuration to learner
learner$param_set$values = instance$result_learner_param_vals

# Train the learner on the full data set
learner$train(task)

# Inspect all evaluated configurations
as.data.table(instance$archive)
#>           cp classif.ce  x_domain_cp runtime_learners           timestamp
#> 1: -8.429848 0.05232647 0.0002182546            0.023 2023-06-27 06:09:19
#> 2: -6.796334 0.05232647 0.0011178653            0.022 2023-06-27 06:09:19
#> 3: -8.511941 0.05232647 0.0002010532            0.039 2023-06-27 06:09:19
#> 4: -6.303767 0.05232647 0.0018294009            0.022 2023-06-27 06:09:19
#>    batch_nr warnings errors  resample_result
#> 1:        1        0      0 <ResampleResult>
#> 2:        1        0      0 <ResampleResult>
#> 3:        2        0      0 <ResampleResult>
#> 4:        2        0      0 <ResampleResult>