The Tuner implements the optimization algorithm.
Details
Tuner is an abstract base class that implements the base functionality each tuner must provide. A subclass is implemented in the following way:
Inherit from Tuner.
Specify the private abstract method $.optimize() and use it to call into your optimizer. You need to call instance$eval_batch() to evaluate design points. The batch evaluation is requested at the TuningInstanceSingleCrit/TuningInstanceMultiCrit object instance, so each batch is possibly executed in parallel via mlr3::benchmark(), and all evaluations are stored inside of instance$archive. Before each batch evaluation, the bbotk::Terminator is checked, and if it is positive, an exception of class "terminated_error" is generated. In the latter case, the current batch of evaluations is still stored in instance, but the numeric scores are not sent back to the handling optimizer, as it has lost execution control. After such an exception has been caught, we select the best configuration from instance$archive and return it. Note that therefore more points than specified by the bbotk::Terminator may be evaluated, as the Terminator is only checked before a batch evaluation and not in between the evaluations within a batch; how many more depends on the batch size. (A minimal sketch of such a subclass follows this list.)
Overwrite the private super-method .assign_result() if you want to decide yourself how the final configuration in the instance and its estimated performance are determined. The default behavior is to pick the best resample experiment, according to the given measure, and then assign its configuration and aggregated performance to the instance.
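As a rough illustration, the following sketch implements such a subclass: a plain random search written directly on top of Tuner. This is a hypothetical example, not part of mlr3tuning; the class name TunerRandomSketch, its id, and its label are made up, and the constructor arguments correspond to the ones documented under Method new() below.

library(R6)
library(paradox)
library(mlr3tuning)

# Hypothetical minimal tuner: evaluates random design points in batches.
TunerRandomSketch = R6Class("TunerRandomSketch",
  inherit = Tuner,
  public = list(
    initialize = function() {
      super$initialize(
        id = "random_sketch",  # made-up identifier
        param_set = ps(batch_size = p_int(lower = 1L)),
        param_classes = c("ParamLgl", "ParamInt", "ParamDbl", "ParamFct"),
        properties = c("dependencies", "single-crit", "multi-crit"),
        label = "Random Search Sketch"
      )
    }
  ),
  private = list(
    .optimize = function(inst) {
      batch_size = self$param_set$values$batch_size
      if (is.null(batch_size)) batch_size = 1L

      repeat {
        # sample a batch of design points from the search space and request
        # their evaluation; eval_batch() checks the Terminator first and
        # raises "terminated_error" once the budget is exhausted, which
        # ends this loop and is caught by the framework
        design = generate_design_random(inst$search_space, batch_size)$data
        inst$eval_batch(design)
      }
    }
  )
)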
Private Methods
.optimize(instance) -> NULL
Abstract base method. Implement to specify the tuning logic of your subclass. See the Details section.

.assign_result(instance) -> NULL
Abstract base method. Implement to specify how the final configuration is selected. See the Details section.
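For illustration, the following hypothetical override of .assign_result() (to be placed in the private list of a subclass such as the sketch above) reproduces the default behavior for the single-criterion case. It assumes the archive helpers archive$best() and archive$cols_y provided by bbotk.

# Hypothetical override: pick the best archive entry and assign it.
.assign_result = function(inst) {
  best = inst$archive$best()                                # best-scoring evaluation
  xdt  = best[, inst$search_space$ids(), with = FALSE]      # its configuration
  y    = unlist(best[, inst$archive$cols_y, with = FALSE])  # its aggregated score
  inst$assign_result(xdt, y)
}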
Resources
There are several sections about hyperparameter optimization in the mlr3book.
Learn more about tuners.
The gallery features a collection of case studies and demos about optimization.
Use the Hyperband optimizer with different budget parameters.
Extension Packages
Additional tuners are provided by the following packages.
mlr3hyperband adds the Hyperband and Successive Halving algorithms.
mlr3mbo adds Bayesian optimization methods.
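Assuming mlr3hyperband is installed, its tuners become available through the tnr() shortcut once the package is loaded; the parameter setting below is illustrative only.

library(mlr3tuning)
library(mlr3hyperband)

# tuners registered by mlr3hyperband in the mlr_tuners dictionary
tuner_hb = tnr("hyperband", eta = 2)   # halving rate eta, illustrative value
tuner_sh = tnr("successive_halving")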
Active bindings
param_set
(paradox::ParamSet)
Set of control parameters.

param_classes
(character())
Supported parameter classes for learner hyperparameters that the tuner can optimize. Subclasses of paradox::Param.

properties
(character())
Set of properties of the tuner. Must be a subset of mlr_reflections$tuner_properties.

packages
(character())
Set of required packages. Note that these packages will be loaded via requireNamespace(), and are not attached.

label
(character(1))
Label for this object. Can be used in tables, plots, and text output instead of the ID.

man
(character(1))
String in the format [pkg]::[topic] pointing to a manual page for this object. The referenced help package can be opened via method $help().
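For illustration, these bindings can be inspected on any tuner, for example the random search tuner shipped with mlr3tuning:

library(mlr3tuning)

tuner = tnr("random_search")
tuner$param_set      # control parameters, e.g. batch_size
tuner$param_classes  # parameter classes the tuner can handle
tuner$properties     # e.g. "dependencies", "single-crit", "multi-crit"
tuner$packages       # packages loaded via requireNamespace()
tuner$label          # human-readable label
tuner$man            # manual page in [pkg]::[topic] format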
Methods
Method new()
Creates a new instance of this R6 class.
Arguments
id
(character(1))
Identifier for the new instance.

param_set
(paradox::ParamSet)
Set of control parameters.

param_classes
(character())
Supported parameter classes for learner hyperparameters that the tuner can optimize. Subclasses of paradox::Param.

properties
(character())
Set of properties of the tuner. Must be a subset of mlr_reflections$tuner_properties.

packages
(character())
Set of required packages. Note that these packages will be loaded via requireNamespace(), and are not attached.

label
(character(1))
Label for this object. Can be used in tables, plots, and text output instead of the ID.

man
(character(1))
String in the format [pkg]::[topic] pointing to a manual page for this object. The referenced help package can be opened via method $help().
Method optimize()
Performs the tuning on a TuningInstanceSingleCrit or TuningInstanceMultiCrit until termination. The single evaluations will be written into the ArchiveTuning that resides in the TuningInstanceSingleCrit/TuningInstanceMultiCrit. The result will be written into the instance object.
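A minimal usage sketch follows; the task, learner, resampling, and search space are chosen only for illustration, and the rpart package must be installed.

library(mlr3)
library(mlr3tuning)
library(paradox)

# search space: tune the complexity parameter of a classification tree
search_space = ps(cp = p_dbl(lower = 1e-4, upper = 1e-1))

instance = TuningInstanceSingleCrit$new(
  task = tsk("sonar"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce"),
  search_space = search_space,
  terminator = trm("evals", n_evals = 10)
)

tuner = tnr("random_search", batch_size = 5)
tuner$optimize(instance)

instance$result   # best configuration and its estimated performance
instance$archive  # all evaluated configurations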