R/WeightedClassesWrapper.R
makeWeightedClassesWrapper.Rd
Creates a wrapper which can be used like any other learner object.
Fitting is performed in a weighted fashion, where each observation receives a weight
depending on the class it belongs to; see wcw.weight.
This can help to mitigate problems caused by imbalanced class distributions.
The weighted fitting can be achieved in two ways:
a) The learner already has a parameter for class weighting, so one weight can be defined
directly per class. Example: “classif.ksvm” and its parameter class.weights.
In this case we do not do anything fancy: we convert wcw.weight slightly,
but essentially just bind its value to the class weighting parameter.
The wrapper then simply offers a convenient, consistent interface for class weighting -
and for tuning it. See the example below.
b) The learner does not have a direct parameter to support class weighting, but
supports observation weights, so hasLearnerProperties(learner, 'weights') is TRUE.
This means that an individual, arbitrary weight can be set per observation during training.
The wrapper sets this weight internally depending on the class of each observation. Essentially,
this introduces something like a new “class.weights” parameter for the learner via observation weights.
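The mechanism in case b) can be sketched in base R alone. The labels and the weight value below are hypothetical, and the snippet is only an illustration of the mapping, not mlr's actual implementation:

```r
# Hypothetical illustration (base R only): expand a single per-class weight
# into one weight per observation, as the wrapper does internally in case b).
y <- factor(c("M", "M", "R", "M", "R"))   # class labels of the training data
wcw.weight <- 0.01                        # weight assigned to class "M"
class.weights <- c(M = wcw.weight, R = 1) # one weight per class
# Look up each observation's class weight to get one weight per row:
obs.weights <- unname(class.weights[as.character(y)])
print(obs.weights)
#> [1] 0.01 0.01 1.00 0.01 1.00
```

These observation weights would then be passed to the underlying learner's weights mechanism during training.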
makeWeightedClassesWrapper(learner, wcw.param = NULL, wcw.weight = 1)
learner | (Learner \| character(1))
---|---
wcw.param | (character(1))
wcw.weight | (numeric)
Other wrapper: makeBaggingWrapper(), makeClassificationViaRegressionWrapper(), makeConstantClassWrapper(), makeCostSensClassifWrapper(), makeCostSensRegrWrapper(), makeDownsampleWrapper(), makeDummyFeaturesWrapper(), makeExtractFDAFeatsWrapper(), makeFeatSelWrapper(), makeFilterWrapper(), makeImputeWrapper(), makeMulticlassWrapper(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelNestedStackingWrapper(), makeMultilabelStackingWrapper(), makeOverBaggingWrapper(), makePreprocWrapperCaret(), makePreprocWrapper(), makeRemoveConstantFeaturesWrapper(), makeSMOTEWrapper(), makeTuneWrapper(), makeUndersampleWrapper()
# \donttest{
set.seed(123)

# using the direct parameter of the SVM (which is already defined in the learner)
lrn = makeWeightedClassesWrapper("classif.ksvm", wcw.weight = 0.01)
res = holdout(lrn, sonar.task)
#>          predicted
#> true      M  R -err.-
#>   M       0 38     38
#>   R       0 32      0
#>   -err.-  0 38     38

# using the observation weights of logreg
lrn = makeWeightedClassesWrapper("classif.logreg", wcw.weight = 0.01)
res = holdout(lrn, sonar.task)
#> Warning: glm.fit: algorithm did not converge
#> Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
#>          predicted
#> true      M  R -err.-
#>   M      28  7      7
#>   R      16 19     16
#>   -err.- 16  7     23

# tuning the imbalance param and the SVM params in one go
lrn = makeWeightedClassesWrapper("classif.ksvm", wcw.param = "class.weights")
ps = makeParamSet(
  makeNumericParam("wcw.weight", lower = 1, upper = 10),
  makeNumericParam("C", lower = -12, upper = 12, trafo = function(x) 2^x),
  makeNumericParam("sigma", lower = -12, upper = 12, trafo = function(x) 2^x)
)
ctrl = makeTuneControlRandom(maxit = 3L)
rdesc = makeResampleDesc("CV", iters = 2L, stratify = TRUE)
res = tuneParams(lrn, sonar.task, rdesc, par.set = ps, control = ctrl)
#> Tune result:
#> Op. pars: wcw.weight=1.11; C=441; sigma=0.0013
#> mmce.test.mean=0.2644231
# print(res$opt.path)
# }