Subclass that conducts only the internal hyperparameter tuning of a mlr3::Learner.
Note
The selected mlr3::Measure does not influence the tuning result. To change the loss function for the internal tuning, consult the hyperparameter documentation of the tuned mlr3::Learner.
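For example, with lrn("classif.xgboost") the internal loss is selected through the learner's eval_metric hyperparameter rather than through an mlr3::Measure. A minimal sketch (the available metric names, such as "logloss", are documented by xgboost itself):

```r
library(mlr3learners)

# The internal tuning loss is a hyperparameter of the learner:
# here, early stopping monitors the log loss on the validation set.
learner = lrn("classif.xgboost",
  nrounds = to_tune(upper = 1000, internal = TRUE),
  early_stopping_rounds = 10,
  validate = "test",
  eval_metric = "logloss"
)
```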
Progress Bars
$optimize() supports progress bars via the package progressr, combined with a bbotk::Terminator. Simply wrap the function call in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
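The setup described above can be sketched as follows, assuming a tuner and a tuning instance have already been constructed:

```r
library(progressr)

# use the progress package as the progress bar backend
handlers("progress")

# wrap the optimization call to display a progress bar
with_progress(
  tuner$optimize(instance)
)
```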
Logging
All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
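For instance, the logger's verbosity can be reduced so that only warnings are printed during tuning:

```r
library(lgr)

# retrieve the logger used by bbotk and suppress info messages
logger = get_logger("bbotk")
logger$set_threshold("warn")
```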
Resources
There are several sections about hyperparameter optimization in the mlr3book.
The gallery features a collection of case studies and demos about optimization.
Use the Hyperband optimizer with different budget parameters.
Super classes
mlr3tuning::Tuner
-> mlr3tuning::TunerBatch
-> TunerBatchInternal
Examples
library(mlr3learners)
# Retrieve task
task = tsk("pima")
# Load learner and set search space
learner = lrn("classif.xgboost",
nrounds = to_tune(upper = 1000, internal = TRUE),
early_stopping_rounds = 10,
validate = "test",
eval_metric = "merror"
)
# Internal hyperparameter tuning on the pima indians diabetes data set
instance = tune(
tnr("internal"),
task,
learner,
rsmp("cv", folds = 3),
msr("internal_valid_score", minimize = TRUE, select = "merror")
)
# best performing hyperparameter configuration
instance$result_learner_param_vals
#> $eval_metric
#> [1] "merror"
#>
#> $nrounds
#> [1] 1
#>
#> $nthread
#> [1] 1
#>
#> $verbose
#> [1] 0
#>
# the internally tuned values are already merged into
# result_learner_param_vals (see nrounds above), so this element is NULL
instance$result_learner_param_vals$internal_tuned_values
#> NULL