Optimizes the features for a classification or regression problem by choosing a variable selection wrapper approach. Allows for different optimization methods, such as forward search or a genetic algorithm. You can select such an algorithm (and its settings) by passing a corresponding control object. For a complete list of implemented algorithms look at the subclasses of FeatSelControl.
All algorithms operate on a 0-1-bit encoding of candidate solutions. By default a single bit corresponds to a single feature, but you can change this via the arguments bit.names and bits.to.features, which lets you switch whole groups of features on or off with a single bit.
selectFeatures(
  learner,
  task,
  resampling,
  measures,
  bit.names,
  bits.to.features,
  control,
  show.info = getMlrOption("show.info")
)
| Argument | Description |
|---|---|
| learner | (Learner \| character(1)) The learner. Either a Learner object or the class name of a learner. |
| task | (Task) The task. |
| resampling | (ResampleInstance \| ResampleDesc) Resampling strategy for the feature selection. |
| measures | (list of Measure \| Measure) Performance measures to evaluate. The first one is optimized. |
| bit.names | (character) Names of the bits encoding the solutions. Also defines the total number of bits. Default is the feature names of the task. |
| bits.to.features | (function(x, task)) Function which maps a bit vector of 0s and 1s to a character vector of selected features. Default maps each bit to one feature. |
| control | (FeatSelControl) Control object for the search method. Also selects the optimization algorithm for feature selection. |
| show.info | (logical(1)) Print verbose output on console? Default is set via configureMlr. |
Other featsel:
FeatSelControl,
analyzeFeatSelResult(),
getFeatSelResult(),
makeFeatSelWrapper()
# \donttest{
rdesc = makeResampleDesc("Holdout")
ctrl = makeFeatSelControlSequential(method = "sfs", maxit = NA)
res = selectFeatures("classif.rpart", iris.task, rdesc, control = ctrl)
analyzeFeatSelResult(res)
#> Features         : 1
#> Performance      : mmce.test.mean=0.0400000
#> Petal.Length
#>
#> Path to optimum:
#> - Features: 0  Init   :               Perf = 0.64  Diff: NA   *
#> - Features: 1  Add    : Petal.Length  Perf = 0.04  Diff: 0.6  *
#>
#> Stopped, because no improving feature was found.
# }