Use stacking method (stacked generalization) to create a multilabel learner.
Source: R/MultilabelStackingWrapper.R
Every learner implemented in mlr that supports binary classification can be converted to a wrapped stacking multilabel learner. Stacking trains one binary classifier per label, using the predicted label information of all labels (including the target label itself) as additional features; these label predictions are obtained via inner cross-validation. During prediction, the additional label features are first obtained with the binary relevance method, using the same binary learner.
Models can easily be accessed via getLearnerModel().
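A minimal sketch of constructing the wrapper and inspecting the fitted per-label models (assuming the yeast.task example task shipped with mlr; the base learner and fold number are arbitrary choices):
lrn = makeMultilabelStackingWrapper(makeLearner("classif.rpart"), cv.folds = 2)
mod = train(lrn, yeast.task)
# getLearnerModel returns the fitted binary models, one per target label
label.models = getLearnerModel(mod)
length(label.models)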
Arguments
- learner (Learner | character(1))
  The learner. If you pass a string the learner will be created via makeLearner().
- cv.folds (integer(1))
  The number of folds for the inner cross-validation used to predict the labels for the augmented feature space. Default is 2. Both arguments are illustrated in the short sketch below.
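For illustration, a minimal sketch of both ways to specify the base learner ("classif.rpart" and cv.folds = 3 are arbitrary choices):
lrn1 = makeMultilabelStackingWrapper(makeLearner("classif.rpart"))  # Learner object, cv.folds defaults to 2
lrn2 = makeMultilabelStackingWrapper("classif.rpart", cv.folds = 3)  # string, converted via makeLearner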
References
Montanes, E. et al. (2013). Dependent binary relevance models for multi-label classification. Artificial Intelligence Center, University of Oviedo at Gijon, Spain.
See also
Other wrapper:
makeBaggingWrapper(), makeClassificationViaRegressionWrapper(), makeConstantClassWrapper(), makeCostSensClassifWrapper(), makeCostSensRegrWrapper(), makeDownsampleWrapper(), makeDummyFeaturesWrapper(), makeExtractFDAFeatsWrapper(), makeFeatSelWrapper(), makeFilterWrapper(), makeImputeWrapper(), makeMulticlassWrapper(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelNestedStackingWrapper(), makeOverBaggingWrapper(), makePreprocWrapperCaret(), makePreprocWrapper(), makeRemoveConstantFeaturesWrapper(), makeSMOTEWrapper(), makeTuneWrapper(), makeUndersampleWrapper(), makeWeightedClassesWrapper()
Other multilabel:
getMultilabelBinaryPerformances(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelNestedStackingWrapper()
Examples
d = getTaskData(yeast.task)
# drop some labels so example runs faster
d = d[seq(1, nrow(d), by = 20), c(1:2, 15:17)]
task = makeMultilabelTask(data = d, target = c("label1", "label2"))
lrn = makeLearner("classif.rpart")
lrn = makeMultilabelBinaryRelevanceWrapper(lrn)
lrn = setPredictType(lrn, "prob")
# train, predict and evaluate
mod = train(lrn, task)
pred = predict(mod, task)
performance(pred, measures = list(multilabel.hamloss, multilabel.subset01, multilabel.f1))
# the next call basically has the same structure for any multilabel meta wrapper
getMultilabelBinaryPerformances(pred, measures = list(mmce, auc))
# above works also with predictions from resample!
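# The same pattern applies to the stacking wrapper itself, including predictions
# obtained from resample (a sketch; the 2-fold CV is chosen only for speed)
lrn = makeMultilabelStackingWrapper(makeLearner("classif.rpart"), cv.folds = 2)
lrn = setPredictType(lrn, "prob")
rdesc = makeResampleDesc("CV", iters = 2)
r = resample(lrn, task, rdesc, measures = list(multilabel.hamloss))
getMultilabelBinaryPerformances(r$pred, measures = list(mmce, auc))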