This package provides hyperparameter tuning for mlr3. It offers several tuning methods, such as grid search, random search, and generalized simulated annealing, and allows different termination criteria to be set and combined.
The AutoTuner class provides a convenient way to perform nested resampling in combination with mlr3 (see the sketch after the example below). The package is built on bbotk, which provides a common framework for optimization.
Install the latest release from CRAN:
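A minimal sketch, assuming the package is published on CRAN under the name `mlr3tuning`:

```r
install.packages("mlr3tuning")
```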
Install the development version from GitHub:
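A minimal sketch, assuming the development sources live in the mlr-org GitHub organization and that the remotes package is available:

```r
remotes::install_github("mlr-org/mlr3tuning")
```

The following example tunes the complexity and minimum split parameters of a classification tree on the pima task with a small grid search: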
library("paradox") task = tsk("pima") learner = lrn("classif.rpart") resampling = rsmp("holdout") measure = msr("classif.ce") # Create the search space with lower and upper bounds learner$param_set$values$cp = to_tune(0.001, 0.1) learner$param_set$values$minsplit = to_tune(1, 10) # Define termination criterion terminator = trm("evals", n_evals = 20) # Create tuning instance instance = TuningInstanceSingleCrit$new( task = task, learner = learner, resampling = resampling, measure = measure, terminator = terminator) # Load tuner tuner = tnr("grid_search", resolution = 5) # Trigger optimization tuner$optimize(instance)