Fuses a base learner with a search strategy to select variables. This creates a learner object that can be used like any other learner, but internally uses [selectFeatures]. When train is called on it, the search strategy and resampling are invoked to select an optimal set of variables, and a model is then fitted on the complete training data with these variables and returned. See [selectFeatures] for more details.

After training, the optimal features (and other related information) can be retrieved with [getFeatSelResult].
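
For example, a minimal sketch of direct training and retrieval (the rpart learner, holdout resampling, and random-search control are illustrative choices, not defaults):

library(mlr)
ctrl = makeFeatSelControlRandom(maxit = 3L)
lrn = makeFeatSelWrapper("classif.rpart", resampling = makeResampleDesc("Holdout"),
  control = ctrl)
mod = train(lrn, iris.task)
# FeatSelResult object with the selected features and the optimization path
getFeatSelResult(mod)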

makeFeatSelWrapper(learner, resampling, measures, bit.names,
  bits.to.features, control, show.info = getMlrOption("show.info"))

Arguments

learner ([Learner] | character(1)): The learner. If you pass a string, the learner will be created via [makeLearner].

resampling ([ResampleInstance] | [ResampleDesc]): Resampling strategy for feature selection. If you pass a description, it is instantiated once at the beginning by default, so all points are evaluated on the same training/test sets. If you want to change that behaviour, look at [FeatSelControl].

measures (list of [Measure] | [Measure]): Performance measures to evaluate. The first measure, aggregated by the first aggregation function, is optimized; the others are simply evaluated. Default is the default measure for the task, see [getDefaultMeasure].

bit.names [character]: Names of the bits encoding the solutions. Also defines the total number of bits in the encoding. By default these are the feature names of the task. Has to be used together with bits.to.features.

bits.to.features [function(x, task)]: Function which transforms an integer 0-1 vector into a character vector of selected features. By default a value of 1 in the i-th bit selects the i-th feature to be in the candidate solution. The vector x corresponds to bit.names and has to be of the same length.

control ([FeatSelControl]): Control object for the search method. Also selects the optimization algorithm for feature selection.

show.info (logical(1)): Print verbose output on console? Default is set via [configureMlr].
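
To illustrate the bit encoding, a sketch that searches over feature groups instead of single features (the two-group split of the iris features and the rpart learner are illustrative assumptions):

library(mlr)
# hypothetical grouped encoding for iris.task: bit 1 toggles both sepal
# features, bit 2 toggles both petal features
bn = c("sepal", "petal")
btf = function(x, task) {
  fns = getTaskFeatureNames(task)
  as.character(c(if (x[1] == 1) fns[1:2], if (x[2] == 1) fns[3:4]))
}
ctrl = makeFeatSelControlRandom(maxit = 2L)
lrn = makeFeatSelWrapper("classif.rpart", resampling = makeResampleDesc("Holdout"),
  bit.names = bn, bits.to.features = btf, control = ctrl)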

Value

[Learner].

See also

Other featsel: FeatSelControl, analyzeFeatSelResult, getFeatSelResult, selectFeatures

Other wrapper: makeBaggingWrapper, makeClassificationViaRegressionWrapper, makeConstantClassWrapper, makeCostSensClassifWrapper, makeCostSensRegrWrapper, makeDownsampleWrapper, makeDummyFeaturesWrapper, makeExtractFDAFeatsWrapper, makeFilterWrapper, makeImputeWrapper, makeMulticlassWrapper, makeMultilabelBinaryRelevanceWrapper, makeMultilabelClassifierChainsWrapper, makeMultilabelDBRWrapper, makeMultilabelNestedStackingWrapper, makeMultilabelStackingWrapper, makeOverBaggingWrapper, makePreprocWrapperCaret, makePreprocWrapper, makeRemoveConstantFeaturesWrapper, makeSMOTEWrapper, makeTuneWrapper, makeUndersampleWrapper, makeWeightedClassesWrapper

Examples

library(mlr)

# nested resampling with feature selection (using a deliberately crude
# random search for the selection)
outer = makeResampleDesc("CV", iters = 2L)
inner = makeResampleDesc("Holdout")
ctrl = makeFeatSelControlRandom(maxit = 1)
lrn = makeFeatSelWrapper("classif.ksvm", resampling = inner, control = ctrl)
# we also extract the selected features for all iterations here
r = resample(lrn, iris.task, outer, extract = getFeatSelResult)
#> Resampling: cross-validation
#> Measures:             mmce
#> [FeatSel] Started selecting features for learner 'classif.ksvm'
#> With control class: FeatSelControlRandom
#> Imputation value: 1
#> [FeatSel-x] 1: 0111 (3 bits)
#> [FeatSel-y] 1: mmce.test.mean=0.0400000; time: 0.0 min
#> [FeatSel] Result: Sepal.Width,Petal.Length,Pe... (3 bits)
#> [Resample] iter 1:    0.0533333
#> [FeatSel] Started selecting features for learner 'classif.ksvm'
#> With control class: FeatSelControlRandom
#> Imputation value: 1
#> [FeatSel-x] 1: 0100 (1 bits)
#> [FeatSel-y] 1: mmce.test.mean=0.5600000; time: 0.0 min
#> [FeatSel] Result: Sepal.Width (1 bits)
#> [Resample] iter 2:    0.5733333
#>
#> Aggregated Result: mmce.test.mean=0.3133333
#>
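
The extracted per-iteration results are [FeatSelResult] objects, so the chosen feature sets can be listed afterwards, e.g.:

# selected features per outer iteration
lapply(r$extract, function(res) res$x)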