Fuses a base learner with a search strategy to select its hyperparameters. Creates a learner object that can be used like any other learner object, but which internally uses tuneParams. When train is called on it, the search strategy and resampling are invoked to select an optimal set of hyperparameter values. Finally, a model is fitted on the complete training data with these optimal hyperparameters and returned. See tuneParams for more details.
After training, the optimal hyperparameters (and other related information) can be retrieved with getTuneResult.
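As a minimal sketch of this train-then-inspect workflow (task, learner, and grid values are arbitrary illustration choices, not defaults):

```r
library(mlr)

# wrap an rpart learner with a small grid search over cp
lrn = makeTuneWrapper(
  makeLearner("classif.rpart"),
  resampling = makeResampleDesc("CV", iters = 3),
  par.set = makeParamSet(makeDiscreteParam("cp", values = c(0.01, 0.05))),
  control = makeTuneControlGrid()
)
# train() runs the tuning internally, then refits on the full data
mod = train(lrn, makeClassifTask(data = iris, target = "Species"))
# retrieve the optimal hyperparameters found during tuning
getTuneResult(mod)
```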
makeTuneWrapper( learner, resampling, measures, par.set, control, show.info = getMlrOption("show.info") )
learner | (Learner | character(1)) | The learner. If you pass a string, the learner will be created via makeLearner.
resampling | (ResampleInstance | ResampleDesc) | Resampling strategy to evaluate points in the hyperparameter space. If you pass a description, it is instantiated once at the beginning by default, so all points are evaluated on the same training/test sets.
measures | (list of Measure | Measure) | Performance measures to evaluate. The first measure, aggregated by its default aggregation scheme, is optimized; the others are simply evaluated. Default is the default measure for the task.
par.set | (ParamHelpers::ParamSet) | Collection of parameters and their constraints for optimization.
control | (TuneControl) | Control object for the search method; also selects the optimization algorithm for tuning.
show.info | (logical(1)) | Print verbose output on console? Default is set via configureMlr.
Other tune: TuneControl, getNestedTuneResultsOptPathDf(), getNestedTuneResultsX(), getResamplingIndices(), getTuneResult(), makeModelMultiplexerParamSet(), makeModelMultiplexer(), makeTuneControlCMAES(), makeTuneControlDesign(), makeTuneControlGenSA(), makeTuneControlGrid(), makeTuneControlIrace(), makeTuneControlMBO(), makeTuneControlRandom(), tuneParams(), tuneThreshold()
Other wrapper: makeBaggingWrapper(), makeClassificationViaRegressionWrapper(), makeConstantClassWrapper(), makeCostSensClassifWrapper(), makeCostSensRegrWrapper(), makeDownsampleWrapper(), makeDummyFeaturesWrapper(), makeExtractFDAFeatsWrapper(), makeFeatSelWrapper(), makeFilterWrapper(), makeImputeWrapper(), makeMulticlassWrapper(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelNestedStackingWrapper(), makeMultilabelStackingWrapper(), makeOverBaggingWrapper(), makePreprocWrapperCaret(), makePreprocWrapper(), makeRemoveConstantFeaturesWrapper(), makeSMOTEWrapper(), makeUndersampleWrapper(), makeWeightedClassesWrapper()
# \donttest{
task = makeClassifTask(data = iris, target = "Species")
lrn = makeLearner("classif.rpart")
# small example grid
ps = makeParamSet(
  makeDiscreteParam("cp", values = c(0.05, 0.1)),
  makeDiscreteParam("minsplit", values = c(10, 20))
)
ctrl = makeTuneControlGrid()
inner = makeResampleDesc("Holdout")
outer = makeResampleDesc("CV", iters = 2)
lrn = makeTuneWrapper(lrn, resampling = inner, par.set = ps, control = ctrl)
mod = train(lrn, task)
# nested resampling for evaluation
# we also extract the tuned hyperparameters in each iteration
r = resample(lrn, task, outer, extract = getTuneResult)
r$extract
#> [[1]]
#> Tune result:
#> Op. pars: cp=0.1; minsplit=20
#> mmce.test.mean=0.0800000
#>
#> [[2]]
#> Tune result:
#> Op. pars: cp=0.1; minsplit=10
#> mmce.test.mean=0.0800000
getNestedTuneResultsOptPathDf(r)
#>     cp minsplit mmce.test.mean dob eol error.message exec.time iter
#> 1 0.05       10           0.08   1  NA          <NA>     0.012    1
#> 2 0.10       10           0.08   2  NA          <NA>     0.011    1
#> 3 0.05       20           0.08   3  NA          <NA>     0.011    1
#> 4 0.10       20           0.08   4  NA          <NA>     0.011    1
#> 5 0.05       10           0.08   1  NA          <NA>     0.011    2
#> 6 0.10       10           0.08   2  NA          <NA>     0.012    2
#> 7 0.05       20           0.08   3  NA          <NA>     0.011    2
#> 8 0.10       20           0.08   4  NA          <NA>     0.012    2
getNestedTuneResultsX(r)
#>    cp minsplit
#> 1 0.1       20
#> 2 0.1       10
# }