R/tuneParamsMultiCrit.R
tuneParamsMultiCrit.Rd
Optimizes the hyperparameters of a learner in a multi-criteria fashion. Allows for different optimization methods, such as grid search, evolutionary strategies, etc. You can select such an algorithm (and its settings) by passing a corresponding control object. For a complete list of implemented algorithms look at TuneMultiCritControl.
tuneParamsMultiCrit(
  learner,
  task,
  resampling,
  measures,
  par.set,
  control,
  show.info = getMlrOption("show.info"),
  resample.fun = resample
)
Argument | Description
---|---
learner | (Learner or character(1)) The learner. Either a Learner object or the class name of a learner.
task | (Task) The task.
resampling | (ResampleInstance or ResampleDesc) Resampling strategy used to evaluate points in the hyperparameter space.
measures | (list of Measure) Performance measures to optimize simultaneously.
par.set | (ParamHelpers::ParamSet) Collection of parameters and their constraints for optimization.
control | (TuneMultiCritControl) Control object for the search method; also selects the optimization algorithm.
show.info | (logical(1)) Print verbose output on console?
resample.fun | (closure) The function to use for resampling; defaults to resample.
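The control argument also determines which optimizer is used. A minimal sketch of constructing a few of the available control objects (assuming the makeTuneMultiCritControlGrid, makeTuneMultiCritControlRandom and makeTuneMultiCritControlNSGA2 constructors exported by mlr; the budget and population settings below are purely illustrative):

# grid search over a discretized parameter space (resolution per numeric parameter)
ctrl.grid = makeTuneMultiCritControlGrid(resolution = 5L)
# random search with a fixed budget of evaluations
ctrl.rnd = makeTuneMultiCritControlRandom(maxit = 20L)
# NSGA-II evolutionary multi-objective search
ctrl.nsga2 = makeTuneMultiCritControlNSGA2(popsize = 8L, generations = 2L)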
Other tune_multicrit: TuneMultiCritControl, plotTuneMultiCritResult()
# \donttest{
# multi-criteria optimization of (tpr, fpr) with NSGA-II
lrn = makeLearner("classif.ksvm")
rdesc = makeResampleDesc("Holdout")
ps = makeParamSet(
  makeNumericParam("C", lower = -12, upper = 12, trafo = function(x) 2^x),
  makeNumericParam("sigma", lower = -12, upper = 12, trafo = function(x) 2^x)
)
ctrl = makeTuneMultiCritControlNSGA2(popsize = 4L, generations = 1L)
res = tuneParamsMultiCrit(lrn, sonar.task, rdesc, par.set = ps,
  measures = list(tpr, fpr), control = ctrl)
# }
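The returned object holds the Pareto-optimal hyperparameter settings and their measure values. A short sketch of inspecting the result (assuming the res$x and res$y fields of a TuneMultiCritResult and the plotTuneMultiCritResult helper listed above):

# list of Pareto-optimal hyperparameter settings
res$x
# matrix of corresponding (tpr, fpr) values, one row per Pareto point
res$y
# visualize the approximated Pareto front
plotTuneMultiCritResult(res)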