Function to construct a TuningInstanceBatchSingleCrit or TuningInstanceBatchMultiCrit.
Usage
ti(
  task,
  learner,
  resampling,
  measures = NULL,
  terminator,
  search_space = NULL,
  store_benchmark_result = TRUE,
  store_models = FALSE,
  check_values = FALSE,
  callbacks = NULL
)
Arguments
- task
  (mlr3::Task)
  Task to operate on.
- learner
  (mlr3::Learner)
  Learner to tune.
- resampling
  (mlr3::Resampling)
  Resampling that is used to evaluate the performance of the hyperparameter configurations. Uninstantiated resamplings are instantiated during construction so that all configurations are evaluated on the same data splits. Already instantiated resamplings are kept unchanged. Specialized Tuners may change the resampling, e.g. to evaluate a hyperparameter configuration on different data splits. This field, however, always returns the resampling passed in construction.
- measures
  (mlr3::Measure or list of mlr3::Measure)
  A single measure creates a TuningInstanceBatchSingleCrit and multiple measures a TuningInstanceBatchMultiCrit. If NULL, the default measure is used.
- terminator
  (bbotk::Terminator)
  Stop criterion of the tuning process.
- search_space
  (paradox::ParamSet)
  Hyperparameter search space. If NULL (default), the search space is constructed from the paradox::TuneToken objects in the learner's parameter set (learner$param_set). An explicit search space can also be passed, as shown in the sketch after this list.
- store_benchmark_result
  (logical(1))
  If TRUE (default), store the resample results of the evaluated hyperparameter configurations in the archive as an mlr3::BenchmarkResult.
- store_models
  (logical(1))
  If TRUE, fitted models are stored in the benchmark result (archive$benchmark_result). If store_benchmark_result = FALSE, models are only stored temporarily and are not accessible after tuning. This combination is needed for measures that require a model.
- check_values
  (logical(1))
  If TRUE, hyperparameter values are checked before evaluation and performance scores after. If FALSE (default), values are unchecked, which reduces computational overhead.
- callbacks
  (list of mlr3misc::Callback)
  List of callbacks.
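Instead of tagging learner parameters with to_tune(), a search space can also be passed explicitly via the search_space argument. A minimal sketch, assuming the paradox helpers ps() and p_dbl(); the learner and parameter range are illustrative:
# Define the search space explicitly with paradox
search_space = ps(
  cp = p_dbl(lower = 1e-04, upper = 1e-1, logscale = TRUE)
)
instance = ti(
  task = tsk("penguins"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 4),
  search_space = search_space
)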
Resources
There are several sections about hyperparameter optimization in the mlr3book.
- Getting started with hyperparameter optimization.
- Tune a simple classification tree on the Sonar data set.
- Learn about tuning spaces.
The gallery features a collection of case studies and demos about optimization.
- Learn more advanced methods with the practical tuning series.
- Simultaneously optimize hyperparameters and use early stopping with XGBoost.
- Make use of proven search spaces.
- Learn about hotstarting models.
- Run the default hyperparameter configuration of learners as a baseline.
Default Measures
If no measure is passed, the default measure is used. The default measure depends on the task type.
| Task         | Default Measure | Package     |
|--------------|-----------------|-------------|
| "classif"    | "classif.ce"    | mlr3        |
| "regr"       | "regr.mse"      | mlr3        |
| "surv"       | "surv.cindex"   | mlr3proba   |
| "dens"       | "dens.logloss"  | mlr3proba   |
| "classif_st" | "classif.ce"    | mlr3spatial |
| "regr_st"    | "regr.mse"      | mlr3spatial |
| "clust"      | "clust.dunn"    | mlr3cluster |
Examples
# Hyperparameter optimization on the Palmer Penguins data set
library(mlr3tuning)
task = tsk("penguins")
# Load learner and set search space
learner = lrn("classif.rpart",
cp = to_tune(1e-04, 1e-1, logscale = TRUE)
)
# Construct tuning instance
instance = ti(
task = task,
learner = learner,
resampling = rsmp("cv", folds = 3),
measures = msr("classif.ce"),
terminator = trm("evals", n_evals = 4)
)
# Choose optimization algorithm
tuner = tnr("random_search", batch_size = 2)
# Run tuning
tuner$optimize(instance)
#> cp learner_param_vals x_domain classif.ce
#> <num> <list> <list> <num>
#> 1: -4.104187 <list[2]> <list[1]> 0.0640478
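# The best configuration and its estimated performance are also
# stored on the instance itself (a usage sketch)
instance$result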
# Set optimal hyperparameter configuration to learner
learner$param_set$values = instance$result_learner_param_vals
# Train the learner on the full data set
learner$train(task)
# Inspect all evaluated configurations
as.data.table(instance$archive)
#> cp classif.ce x_domain_cp runtime_learners timestamp
#> <num> <num> <num> <num> <POSc>
#> 1: -4.104187 0.0640478 0.0165034335 0.018 2024-09-11 07:59:47
#> 2: -8.934757 0.0640478 0.0001317298 0.017 2024-09-11 07:59:47
#> 3: -7.573822 0.0640478 0.0005137254 0.016 2024-09-11 07:59:47
#> 4: -7.761800 0.0640478 0.0004256896 0.015 2024-09-11 07:59:47
#> warnings errors batch_nr resample_result
#> <int> <int> <int> <list>
#> 1: 0 0 1 <ResampleResult>
#> 2: 0 0 1 <ResampleResult>
#> 3: 0 0 2 <ResampleResult>
#> 4: 0 0 2 <ResampleResult>
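Passing a list of measures instead constructs a TuningInstanceBatchMultiCrit. A minimal sketch; the Sonar task and the chosen measures are illustrative:
# Multi-criteria tuning: a list of measures yields a TuningInstanceBatchMultiCrit
instance = ti(
  task = tsk("sonar"),
  learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE)),
  resampling = rsmp("cv", folds = 3),
  measures = msrs(c("classif.fpr", "classif.fnr")),
  terminator = trm("evals", n_evals = 4)
)
# With multiple measures, instance$result holds the Pareto-optimal configurations
class(instance)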