Optimizes the hyperparameters of a learner. Allows for different optimization methods, such as grid search, evolutionary strategies, iterated F-race, etc. You can select such an algorithm (and its settings) by passing a corresponding control object. For a complete list of implemented algorithms look at TuneControl.
Multi-criteria tuning can be done with tuneParamsMultiCrit.
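For illustration, a minimal sketch of how the optimization method is selected purely through the control object, and how the multi-criteria variant is called; the budgets, task, and measures below are arbitrary placeholders, not recommendations:

# Different optimizers for the same tuning problem, selected via the control object
# (budgets below are arbitrary illustrative values):
ctrl = makeTuneControlGrid(resolution = 10L)        # exhaustive grid search
ctrl = makeTuneControlRandom(maxit = 50L)           # random search, 50 evaluations
ctrl = makeTuneControlIrace(maxExperiments = 200L)  # iterated F-racing

# Multi-criteria tuning is done with tuneParamsMultiCrit and a multi-criteria
# control object (sketch; assumes ps and rdesc are defined as in the examples below):
# mc.ctrl = makeTuneMultiCritControlRandom(maxit = 50L)
# mc.res = tuneParamsMultiCrit("classif.ksvm", sonar.task, rdesc,
#   measures = list(fpr, fnr), par.set = ps, control = mc.ctrl)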
tuneParams(
  learner,
  task,
  resampling,
  measures,
  par.set,
  control,
  show.info = getMlrOption("show.info"),
  resample.fun = resample
)
Argument | Description
---|---
learner | (Learner | character(1)) The learner to tune. If you pass a string, the learner is created via makeLearner.
task | (Task) The task.
resampling | (ResampleInstance | ResampleDesc) Resampling strategy used to evaluate each point in the hyperparameter space.
measures | (list of Measure | Measure) Performance measures to evaluate; the first measure, aggregated by its first aggregation function, is optimized, the others are only recorded.
par.set | (ParamHelpers::ParamSet) Collection of parameters and their constraints for optimization.
control | (TuneControl) Control object that selects and configures the optimization algorithm.
show.info | (logical(1)) Print verbose output on the console?
resample.fun | (closure) The function to use for resampling; defaults to resample (see the sketch after the table).
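As a sketch of a custom resample.fun, assuming (per the default) that it must accept the same arguments as resample; the logging line is purely illustrative:

# Hypothetical wrapper around resample(): forwards all arguments unchanged and
# additionally prints the learner id for every evaluated configuration.
my.resample.fun = function(learner, task, resampling, measures, ...) {
  message("evaluating learner ", learner$id)
  resample(learner, task, resampling, measures = measures, ...)
}
# tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl,
#   resample.fun = my.resample.fun)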
Value: (TuneResult).
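The most commonly used fields of the returned object are sketched below (res denotes the result of a tuneParams() call; field names as in mlr's TuneResult):

res$x        # named list with the best found hyperparameter values
res$y        # performance of that setting, named by measure and aggregation
res$opt.path # optimization path with all evaluated points
# the tuned settings can be written back into the learner:
tuned.lrn = setHyperPars(res$learner, par.vals = res$x)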
If you would like to include results from the training data set, make sure to appropriately adjust the resampling strategy and the aggregation for the measure. See example code below.
Other tune: TuneControl, getNestedTuneResultsOptPathDf(), getNestedTuneResultsX(), getResamplingIndices(), getTuneResult(), makeModelMultiplexerParamSet(), makeModelMultiplexer(), makeTuneControlCMAES(), makeTuneControlDesign(), makeTuneControlGenSA(), makeTuneControlGrid(), makeTuneControlIrace(), makeTuneControlMBO(), makeTuneControlRandom(), makeTuneWrapper(), tuneThreshold()
set.seed(123)

# a grid search for an SVM (with a tiny number of points...)
# note how easily we can optimize on a log-scale
ps = makeParamSet(
  makeNumericParam("C", lower = -12, upper = 12, trafo = function(x) 2^x),
  makeNumericParam("sigma", lower = -12, upper = 12, trafo = function(x) 2^x)
)
ctrl = makeTuneControlGrid(resolution = 2L)
rdesc = makeResampleDesc("CV", iters = 2L)
res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl)
#> ... (verbose tuning output omitted)
res
#> Tune result:
#> Op. pars: C=4.1e+03; sigma=0.000244
#> mmce.test.mean=0.0533333

# access data for all evaluated points
df = as.data.frame(res$opt.path)
df1 = as.data.frame(res$opt.path, trafo = TRUE)
print(head(df[, -ncol(df)]))
#>     C sigma mmce.test.mean dob eol error.message
#> 1 -12   -12     0.73333333   1  NA          <NA>
#> 2  12   -12     0.05333333   2  NA          <NA>
#> 3 -12    12     0.73333333   3  NA          <NA>
#> 4  12    12     0.73333333   4  NA          <NA>
print(head(df1[, -ncol(df1)]))
#>     C sigma mmce.test.mean dob eol error.message
#> 1 -12   -12     0.73333333   1  NA          <NA>
#> 2  12   -12     0.05333333   2  NA          <NA>
#> 3 -12    12     0.73333333   3  NA          <NA>
#> 4  12    12     0.73333333   4  NA          <NA>

# access data for all evaluated points - alternative
df2 = generateHyperParsEffectData(res)
df3 = generateHyperParsEffectData(res, trafo = TRUE)
print(head(df2$data[, -ncol(df2$data)]))
#>     C sigma mmce.test.mean iteration
#> 1 -12   -12     0.73333333         1
#> 2  12   -12     0.05333333         2
#> 3 -12    12     0.73333333         3
#> 4  12    12     0.73333333         4
print(head(df3$data[, -ncol(df3$data)]))
#>              C        sigma mmce.test.mean iteration
#> 1 2.441406e-04 2.441406e-04     0.73333333         1
#> 2 4.096000e+03 2.441406e-04     0.05333333         2
#> 3 2.441406e-04 4.096000e+03     0.73333333         3
#> 4 4.096000e+03 4.096000e+03     0.73333333         4

if (FALSE) {
  # we optimize the SVM over 3 kernels simultaneously
  # note how we use dependent params (requires = ...) and iterated F-racing here
  ps = makeParamSet(
    makeNumericParam("C", lower = -12, upper = 12, trafo = function(x) 2^x),
    makeDiscreteParam("kernel", values = c("vanilladot", "polydot", "rbfdot")),
    makeNumericParam("sigma", lower = -12, upper = 12, trafo = function(x) 2^x,
      requires = quote(kernel == "rbfdot")),
    makeIntegerParam("degree", lower = 2L, upper = 5L,
      requires = quote(kernel == "polydot"))
  )
  print(ps)
  ctrl = makeTuneControlIrace(maxExperiments = 5, nbIterations = 1, minNbSurvival = 1)
  rdesc = makeResampleDesc("Holdout")
  res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl)
  print(res)
  df = as.data.frame(res$opt.path)
  print(head(df[, -ncol(df)]))

  # include the training set performance as well
  rdesc = makeResampleDesc("Holdout", predict = "both")
  res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl,
    measures = list(mmce, setAggregation(mmce, train.mean)))
  print(res)
  df2 = as.data.frame(res$opt.path)
  print(head(df2[, -ncol(df2)]))
}