Function to tune the hyperparameters of an mlr3::Learner.

tune(
  method,
  task,
  learner,
  resampling,
  measures,
  term_evals = NULL,
  term_time = NULL,
  search_space = NULL,
  store_models = FALSE,
  allow_hotstart = FALSE,
  ...
)

Arguments

method
(character(1))
Key to retrieve tuner from mlr_tuners dictionary.

task
(mlr3::Task)
Task to operate on.

learner
(mlr3::Learner)
Learner to tune.

resampling
(mlr3::Resampling)
Resampling that is used to evaluate the performance of the hyperparameter configurations. Uninstantiated resamplings are instantiated during construction so that all configurations are evaluated on the same data splits. Already instantiated resamplings are kept unchanged. Specialized Tuners change the resampling, e.g. to evaluate a hyperparameter configuration on different data splits. This field, however, always returns the resampling passed in construction.

measures
(list of mlr3::Measure)
Measures to optimize.

term_evals
(integer(1))
Number of allowed evaluations.

term_time
(integer(1))
Maximum allowed time in seconds.

search_space
(paradox::ParamSet)
Hyperparameter search space. If NULL (default), the search space is constructed from the TuneToken of the learner's parameter set (learner$param_set).

store_models
(logical(1))
If TRUE, fitted models are stored in the benchmark result (archive$benchmark_result). If store_benchmark_result = FALSE, models are only stored temporarily and not accessible after the tuning. This combination is needed for measures that require a model.

allow_hotstart
(logical(1))
Allow to hotstart learners with previously fitted models. See also mlr3::HotstartStack. The learner must support hotstarting. Sets store_models = TRUE.

...
(named list())
Named arguments to be set as parameters of the tuner.
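As a sketch of how an explicit search_space can be supplied instead of tagging the learner's parameters with to_tune() (the parameter name cp below assumes classif.rpart):

```r
library(paradox)

# Define the search space explicitly; it is passed to tune() via the
# search_space argument instead of being read from TuneToken tags.
search_space = ps(
  cp = p_dbl(lower = 1e-04, upper = 0.1, logscale = TRUE)
)
```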

Value

TuningInstanceSingleCrit | TuningInstanceMultiCrit
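The returned instance carries the optimization result and the archive of evaluated configurations; a minimal sketch of inspecting it (assuming a finished tuning run stored in instance):

```r
# Best configuration found, including the measured performance
instance$result

# Hyperparameter values ready to be assigned to a learner's param_set
instance$result_learner_param_vals

# All evaluated configurations as a data.table
as.data.table(instance$archive)
```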

Examples

learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

instance = tune(
  method = "random_search",
  task = tsk("pima"),
  learner = learner,
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  term_evals = 4
)

# apply the optimized hyperparameter values to the learner
learner$param_set$values = instance$result_learner_param_vals