Fuses a base learner with a filter method. Creates a learner object, which can be used like any other learner object. Internally uses filterFeatures before every model fit.
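For example, a minimal sketch (assuming the FSelectorRcpp package, which provides the default filter, is installed):

# The string is turned into a learner via makeLearner; the wrapped
# learner keeps the 50% top-scoring features before each model fit.
lrn = makeFilterWrapper("classif.lda", fw.perc = 0.5)
# lrn can now be used with train(), predict() or resample() like any learner.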
Usage
makeFilterWrapper(
  learner,
  fw.method = "FSelectorRcpp_information.gain",
  fw.base.methods = NULL,
  fw.perc = NULL,
  fw.abs = NULL,
  fw.threshold = NULL,
  fw.fun = NULL,
  fw.fun.args = NULL,
  fw.mandatory.feat = NULL,
  cache = FALSE,
  ...
)
Arguments
- learner (Learner | character(1))
  The learner. If you pass a string, the learner will be created via makeLearner.
- fw.method (character(1))
  Filter method. See listFilterMethods. Default is "FSelectorRcpp_information.gain".
- fw.base.methods (character(1))
  Simple filter methods for ensemble filters. See listFilterMethods. Can only be used in combination with ensemble filters. See listFilterEnsembleMethods.
- fw.perc (numeric(1))
  If set, select fw.perc * 100 top scoring features. Mutually exclusive with arguments fw.abs, fw.threshold and fw.fun.
- fw.abs (numeric(1))
  If set, select fw.abs top scoring features. Mutually exclusive with arguments fw.perc, fw.threshold and fw.fun.
- fw.threshold (numeric(1))
  If set, select features whose score exceeds fw.threshold. Mutually exclusive with arguments fw.perc, fw.abs and fw.fun.
- fw.fun (function)
  If set, select features via a custom thresholding function, which must return the number of top scoring features to select. Mutually exclusive with arguments fw.perc, fw.abs and fw.threshold.
- fw.fun.args (any)
  Arguments passed to the custom thresholding function.
- fw.mandatory.feat (character)
  Mandatory features which are always included regardless of their scores.
- cache (character(1) | logical)
  Whether to use caching during filter value creation. See details.
- ... (any)
  Additional parameters passed down to the filter. If you are using more than one filter method, you need to pass the arguments in a named list via more.args, for example more.args = list("FSelectorRcpp_information.gain" = list(equal = TRUE)). See the sketch after this list.
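For a single filter method, such arguments can be passed directly through .... A minimal sketch (assuming FSelectorRcpp is installed):

# Keep the 3 top-scoring features; `equal = TRUE` is passed down
# through ... to the information gain filter.
lrn = makeFilterWrapper(
  makeLearner("classif.rpart"),
  fw.method = "FSelectorRcpp_information.gain",
  fw.abs = 3,
  equal = TRUE
)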
Details
If fw.method is set to an ensemble filter (see listFilterEnsembleMethods), ensemble feature selection using all base filter methods specified in fw.base.methods is performed. At least two base methods need to be selected.
After training, the selected features can be retrieved with getFilteredFeatures.
Note that observation weights do not influence the filtering and are simply passed down to the wrapped learner.
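A minimal sketch of both points (assuming FSelectorRcpp is installed; the Examples below run the full version with resampling):

# Ensemble filter with two base methods (the minimum allowed).
task = makeClassifTask(data = iris, target = "Species")
lrn = makeFilterWrapper(makeLearner("classif.lda"), fw.method = "E-Borda",
  fw.base.methods = c("FSelectorRcpp_gain.ratio", "FSelectorRcpp_information.gain"),
  fw.perc = 0.5)
mod = train(lrn, task)
getFilteredFeatures(mod)  # features that passed the ensemble filter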
Caching
If cache = TRUE, the default mlr cache directory is used to cache filter values. The directory is operating system dependent and can be checked with getCacheDir(). Alternatively, a custom directory can be passed via cache to store the filter values. The cache can be cleared with deleteCacheDir(). Caching is disabled by default. Take care when operating on large clusters: write conflicts are possible if multiple workers try to write to the same cache at the same time.
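A minimal sketch of enabling the cache, using the helpers named above:

# Cache filter values in the default mlr cache directory.
lrn = makeFilterWrapper(makeLearner("classif.rpart"),
  fw.method = "FSelectorRcpp_information.gain", fw.perc = 0.5, cache = TRUE)
getCacheDir()      # inspect the cache location
# deleteCacheDir() # clear the cache when done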
See also
Other filter: filterFeatures(), generateFilterValuesData(), getFilteredFeatures(), listFilterEnsembleMethods(), listFilterMethods(), makeFilterEnsemble(), makeFilter(), plotFilterValues()
Other wrapper: makeBaggingWrapper(), makeClassificationViaRegressionWrapper(), makeConstantClassWrapper(), makeCostSensClassifWrapper(), makeCostSensRegrWrapper(), makeDownsampleWrapper(), makeDummyFeaturesWrapper(), makeExtractFDAFeatsWrapper(), makeFeatSelWrapper(), makeImputeWrapper(), makeMulticlassWrapper(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelNestedStackingWrapper(), makeMultilabelStackingWrapper(), makeOverBaggingWrapper(), makePreprocWrapperCaret(), makePreprocWrapper(), makeRemoveConstantFeaturesWrapper(), makeSMOTEWrapper(), makeTuneWrapper(), makeUndersampleWrapper(), makeWeightedClassesWrapper()
Examples
task = makeClassifTask(data = iris, target = "Species")
lrn = makeLearner("classif.lda")
outer = makeResampleDesc("CV", iters = 2)
lrn = makeFilterWrapper(lrn, fw.perc = 0.5)
mod = train(lrn, task)
print(getFilteredFeatures(mod))
# now nested resampling, where we extract the features that the filter method selected
r = resample(lrn, task, outer, extract = function(model) {
  getFilteredFeatures(model)
})
#> Resampling: cross-validation
#> Measures: mmce
#> [Resample] iter 1: 0.0266667
#> [Resample] iter 2: 0.0666667
#>
#> Aggregated Result: mmce.test.mean=0.0466667
#>
print(r$extract)
#> [[1]]
#> [1] "Petal.Length" "Petal.Width"
#>
#> [[2]]
#> [1] "Petal.Length" "Petal.Width"
#>
# usage of an ensemble filter
lrn = makeLearner("classif.lda")
lrn = makeFilterWrapper(lrn, fw.method = "E-Borda",
  fw.base.methods = c("FSelectorRcpp_gain.ratio", "FSelectorRcpp_information.gain"),
  fw.perc = 0.5)
r = resample(lrn, task, outer, extract = function(model) {
  getFilteredFeatures(model)
})
#> Resampling: cross-validation
#> Measures: mmce
#> [Resample] iter 1: 0.0400000
#> [Resample] iter 2: 0.0533333
#>
#> Aggregated Result: mmce.test.mean=0.0466667
#>
print(r$extract)
#> [[1]]
#> [1] "Petal.Length" "Petal.Width"
#>
#> [[2]]
#> [1] "Petal.Length" "Petal.Width"
#>
# usage of a custom thresholding function
biggest_gap = function(values, diff) {
  # Return the number of top-scoring features before the largest score gap.
  gap_size = 0
  gap_location = 0
  for (i in (diff + 1):length(values)) {
    gap = values[[i - diff]] - values[[i]]
    if (gap > gap_size) {
      gap_size = gap
      gap_location = i - 1
    }
  }
  return(gap_location)
}
lrn = makeLearner("classif.lda")
lrn = makeFilterWrapper(lrn, fw.method = "FSelectorRcpp_information.gain",
  fw.fun = biggest_gap, fw.fun.args = list("diff" = 1))
r = resample(lrn, task, outer, extract = function(model) {
  getFilteredFeatures(model)
})
#> Resampling: cross-validation
#> Measures: mmce
#> [Resample] iter 1: 0.0533333
#> [Resample] iter 2: 0.0266667
#>
#> Aggregated Result: mmce.test.mean=0.0400000
#>
print(r$extract)
#> [[1]]
#> [1] "Petal.Length" "Petal.Width"
#>
#> [[2]]
#> [1] "Petal.Length" "Petal.Width"
#>