Use the nested stacking method to create a multilabel learner.
Source: R/MultilabelNestedStackingWrapper.R
makeMultilabelNestedStackingWrapper.Rd
Every learner that is implemented in mlr and supports binary classification can be converted to a wrapped nested stacking multilabel learner. Nested stacking trains a binary classifier for each label following a given order. In the training phase, the feature space of each classifier is extended with the predicted labels (obtained via cross-validation) of all previous labels in the chain. During the prediction phase, the predicted labels are obtained from the classifiers, which have been trained on all training data.
Models can easily be accessed via getLearnerModel.
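For instance, a binary classification learner can be wrapped, trained and inspected roughly as follows. This is a minimal sketch, assuming the yeast.task example task shipped with mlr:
lrn = makeLearner("classif.rpart")
lrn = makeMultilabelNestedStackingWrapper(lrn)
mod = train(lrn, yeast.task)
# the fitted binary models (one per label) should be retrievable here
getLearnerModel(mod)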
Arguments
- learner
(Learner | character(1))
The learner. If you pass a string, the learner will be created via makeLearner.
- order
(character)
Specifies the chain order using the names of the target labels. E.g. for m target labels, this must be a character vector of length m that contains a permutation of the target label names. Default is NULL, which uses a random ordering of the target label names.
- cv.folds
(integer(1))
The number of folds for the inner cross-validation used to predict the labels for the augmented feature space. Default is 2 (see the sketch after this list).
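For example, both the chain order and the number of inner cross-validation folds can be set when constructing the wrapper. A minimal sketch, assuming a task whose target labels are named "label1" and "label2":
lrn = makeMultilabelNestedStackingWrapper("classif.rpart",
  order = c("label2", "label1"), cv.folds = 3)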
References
Montanes, E. et al. (2013). Dependent binary relevance models for multi-label classification. Artificial Intelligence Center, University of Oviedo at Gijon, Spain.
See also
Other wrapper:
makeBaggingWrapper(), makeClassificationViaRegressionWrapper(), makeConstantClassWrapper(), makeCostSensClassifWrapper(), makeCostSensRegrWrapper(), makeDownsampleWrapper(), makeDummyFeaturesWrapper(), makeExtractFDAFeatsWrapper(), makeFeatSelWrapper(), makeFilterWrapper(), makeImputeWrapper(), makeMulticlassWrapper(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelStackingWrapper(), makeOverBaggingWrapper(), makePreprocWrapperCaret(), makePreprocWrapper(), makeRemoveConstantFeaturesWrapper(), makeSMOTEWrapper(), makeTuneWrapper(), makeUndersampleWrapper(), makeWeightedClassesWrapper()
Other multilabel:
getMultilabelBinaryPerformances(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelStackingWrapper()
Examples
d = getTaskData(yeast.task)
# drop some labels so example runs faster
d = d[seq(1, nrow(d), by = 20), c(1:2, 15:17)]
task = makeMultilabelTask(data = d, target = c("label1", "label2"))
lrn = makeLearner("classif.rpart")
lrn = makeMultilabelBinaryRelevanceWrapper(lrn)
lrn = setPredictType(lrn, "prob")
# train, predict and evaluate
mod = train(lrn, task)
pred = predict(mod, task)
performance(pred, measures = list(multilabel.hamloss, multilabel.subset01, multilabel.f1))
# the next call basically has the same structure for any multilabel meta wrapper
getMultilabelBinaryPerformances(pred, measures = list(mmce, auc))
# above works also with predictions from resample!
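# A resampling variant with the nested stacking wrapper documented on this
# page could look as follows. This is a sketch that assumes the task object
# from the example above and 3-fold cross-validation:
lrn2 = makeMultilabelNestedStackingWrapper(makeLearner("classif.rpart"))
rdesc = makeResampleDesc("CV", iters = 3)
r = resample(lrn2, task, rdesc, measures = multilabel.hamloss)
# binary per-label performances also work with resample predictions
getMultilabelBinaryPerformances(r$pred, measures = list(mmce))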