Fuses a learner with the bagging method (similar in spirit to what randomForest does). Creates a learner object that can be used like any other learner object. The fitted models can be accessed via getLearnerModel.

Bagging is implemented as follows: for each iteration a random subset of the data is sampled (with or without replacement), and optionally the features are also restricted to a random subset. Note that a random forest handles feature sampling slightly differently: there, features are sampled anew at each tree split.
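The per-iteration sampling described above can be sketched in base R. This is an illustrative sketch only, not the internal mlr implementation; the data frame `task.data` and all variable names are assumptions:

```r
# Sketch: how each bagging iteration could draw its training subset.
set.seed(1)
task.data <- data.frame(x1 = rnorm(20), x2 = rnorm(20), x3 = rnorm(20),
                        y = rnorm(20))
bw.iters   <- 3L
bw.replace <- TRUE
bw.size    <- 1      # fraction of observations per bag
bw.feats   <- 2 / 3  # fraction of features per bag

feat.names <- setdiff(names(task.data), "y")
bags <- lapply(seq_len(bw.iters), function(i) {
  n    <- nrow(task.data)
  rows <- sample(n, size = round(bw.size * n), replace = bw.replace)
  # at least one feature is always kept
  k     <- max(1L, round(bw.feats * length(feat.names)))
  feats <- sample(feat.names, size = k)
  task.data[rows, c(feats, "y")]
})
length(bags)  # one data subset per iteration
```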

Prediction works as follows: for classification, majority voting over the iterations yields a discrete label, and probabilities are estimated from the proportions of the predicted labels. For regression, the mean and the standard deviation across the individual predictions are computed.
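The aggregation step can be sketched in base R. Again, this is an illustrative sketch with assumed variable names, not mlr's internal code:

```r
# Classification: majority vote plus label proportions as probabilities.
votes <- c("setosa", "setosa", "versicolor", "setosa", "versicolor")
tab   <- table(votes)
majority.label <- names(which.max(tab))  # "setosa"
probs <- tab / length(votes)             # setosa 0.6, versicolor 0.4

# Regression: mean and standard deviation across the per-model predictions.
preds <- c(2.1, 1.9, 2.4, 2.0)
mean(preds)  # aggregated prediction
sd(preds)    # uncertainty estimate
```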

Note that the passed base learner must always have predict.type = 'response'. The BaggingWrapper itself can estimate probabilities and standard errors, so its predict.type can be set, e.g., to 'prob'. For this reason, setPredictType only changes the type of the BaggingWrapper and does not pass it down to the inner learner.
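A short usage sketch with mlr, assuming the rpart package is available; the built-in iris.task is used for illustration:

```r
library(mlr)

base.lrn <- makeLearner("classif.rpart")  # inner learner stays predict.type = "response"
bag.lrn  <- makeBaggingWrapper(base.lrn, bw.iters = 10L, bw.feats = 0.75)

# Probabilities are enabled on the wrapper only, not on the inner learner.
bag.lrn <- setPredictType(bag.lrn, predict.type = "prob")

mod  <- train(bag.lrn, iris.task)
pred <- predict(mod, iris.task)
getLearnerModel(mod)  # access the fitted bagging model
```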

makeBaggingWrapper(
  learner,
  bw.iters = 10L,
  bw.replace = TRUE,
  bw.size,
  bw.feats = 1
)

Arguments

learner

(Learner | character(1))
The learner. If you pass a string the learner will be created via makeLearner.

bw.iters

(integer(1))
Number of fitted models, i.e., bagging iterations. Default is 10.

bw.replace

(logical(1))
Sample bags with replacement (bootstrapping)? Default is TRUE.

bw.size

(numeric(1))
Fraction of observations sampled per bag, between 0 and 1. Default is 1 for bootstrapping (bw.replace = TRUE) and 0.632 for subsampling (bw.replace = FALSE).

bw.feats

(numeric(1))
Fraction of features randomly selected per bag, between 0 and 1. Default is 1 (all features). At least one feature is always selected.

Value

Learner.

See also