Combines multiple base learners by dispatching on the hyperparameter “selected.learner” to a specific model class. This allows you to tune not only the model class (SVM, random forest, etc.) but also its hyperparameters in one go. Combine this with [tuneParams] and [makeTuneControlIrace] for a very powerful approach; see the example below.
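
As a minimal sketch (assuming the mlr package is loaded and using the built-in [iris.task]), dispatching simply means setting “selected.learner” on the multiplexer:

bls = list(makeLearner("classif.ksvm"), makeLearner("classif.randomForest"))
lrn = makeModelMultiplexer(bls)
# select the random forest; training then dispatches to that base learner
lrn = setHyperPars(lrn, selected.learner = "classif.randomForest")
mod = train(lrn, iris.task)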

The parameter set of the multiplexer is the union of the parameter sets of all (unique) base learners. In order to avoid name clashes, all parameter names are prefixed with the base learner id, i.e. “[learner.id].[parameter.name]”.
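
Because of this prefixing, a base learner's parameter is addressed on the multiplexer under its prefixed name (a sketch, reusing lrn from above):

# set 'ntree' of the random forest through the multiplexer
lrn = setHyperPars(lrn, classif.randomForest.ntree = 100L)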

The predict.type of the Multiplexer is inherited from the predict.type of the base learners.
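
For instance (a sketch, under the assumption that all base learners are constructed with the same predict.type), a probability-predicting multiplexer can be obtained like this:

bls.prob = list(
  makeLearner("classif.ksvm", predict.type = "prob"),
  makeLearner("classif.randomForest", predict.type = "prob")
)
lrn.prob = makeModelMultiplexer(bls.prob)
getLearnerPredictType(lrn.prob)  # should report "prob"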

The getter [getLearnerProperties] returns the properties of the selected base learner.
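
For example (a sketch, reusing lrn from above), the reported properties change with the selected base learner:

getLearnerProperties(setHyperPars(lrn, selected.learner = "classif.ksvm"))
getLearnerProperties(setHyperPars(lrn, selected.learner = "classif.randomForest"))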

Usage

makeModelMultiplexer(base.learners)

Arguments

base.learners

(list of [Learner])
List of Learners with unique IDs.

Value

([ModelMultiplexer]). A [Learner] specialized as `ModelMultiplexer`.

Note

The logging output during tuning is somewhat shortened to make it more readable: the artificial prefix before parameter names is suppressed.

See also

[makeModelMultiplexerParamSet], [tuneParams], [makeTuneControlIrace]

Examples

library(BBmisc)
#> 
#> Attaching package: ‘BBmisc’
#> The following object is masked from ‘package:base’:
#> 
#>     isFALSE
bls = list(
  makeLearner("classif.ksvm"),
  makeLearner("classif.randomForest")
)
lrn = makeModelMultiplexer(bls)
# simple way to construct the param set for tuning
# parameter names are prefixed automatically and the 'requires'
# element is set, too, to make all parameters subordinate to 'selected.learner'
ps = makeModelMultiplexerParamSet(lrn,
  makeNumericParam("sigma", lower = -10, upper = 10, trafo = function(x) 2^x),
  makeIntegerParam("ntree", lower = 1L, upper = 500L)
)
print(ps)
#>                                Type len Def                            Constr
#> selected.learner           discrete   -   - classif.ksvm,classif.randomForest
#> classif.ksvm.sigma          numeric   -   -                         -10 to 10
#> classif.randomForest.ntree  integer   -   -                          1 to 500
#>                            Req Tunable Trafo
#> selected.learner             -    TRUE     -
#> classif.ksvm.sigma           Y    TRUE     Y
#> classif.randomForest.ntree   Y    TRUE     -
rdesc = makeResampleDesc("CV", iters = 2L)
# to save some time we use random search, but you probably want something like this:
# ctrl = makeTuneControlIrace(maxExperiments = 500L)
ctrl = makeTuneControlRandom(maxit = 10L)
res = tuneParams(lrn, iris.task, rdesc, par.set = ps, control = ctrl)
#> [Tune] Started tuning learner ModelMultiplexer for parameter set:
#>                                Type len Def                            Constr
#> selected.learner           discrete   -   - classif.ksvm,classif.randomForest
#> classif.ksvm.sigma          numeric   -   -                         -10 to 10
#> classif.randomForest.ntree  integer   -   -                          1 to 500
#>                            Req Tunable Trafo
#> selected.learner             -    TRUE     -
#> classif.ksvm.sigma           Y    TRUE     Y
#> classif.randomForest.ntree   Y    TRUE     -
#> With control class: TuneControlRandom
#> Imputation value: 1
#> [Tune-x] 1: selected.learner=classif.rand...; ntree=390
#> [Tune-y] 1: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 2: selected.learner=classif.rand...; ntree=241
#> [Tune-y] 2: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 3: selected.learner=classif.ksvm; sigma=0.00109
#> [Tune-y] 3: mmce.test.mean=0.7200000; time: 0.0 min
#> [Tune-x] 4: selected.learner=classif.ksvm; sigma=0.216
#> [Tune-y] 4: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune-x] 5: selected.learner=classif.rand...; ntree=178
#> [Tune-y] 5: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 6: selected.learner=classif.rand...; ntree=119
#> [Tune-y] 6: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 7: selected.learner=classif.ksvm; sigma=141
#> [Tune-y] 7: mmce.test.mean=0.6933333; time: 0.0 min
#> [Tune-x] 8: selected.learner=classif.ksvm; sigma=0.0075
#> [Tune-y] 8: mmce.test.mean=0.1200000; time: 0.0 min
#> [Tune-x] 9: selected.learner=classif.ksvm; sigma=0.00156
#> [Tune-y] 9: mmce.test.mean=0.6066667; time: 0.0 min
#> [Tune-x] 10: selected.learner=classif.ksvm; sigma=0.106
#> [Tune-y] 10: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune] Result: selected.learner=classif.rand...; classif.randomForest.ntree=390 : mmce.test.mean=0.0400000
print(res)
#> Tune result:
#> Op. pars: selected.learner=classif.rand...; classif.randomForest.ntree=390
#> mmce.test.mean=0.0400000
df = as.data.frame(res$opt.path)
print(head(df[, -ncol(df)]))
#>       selected.learner classif.ksvm.sigma classif.randomForest.ntree
#> 1 classif.randomForest                 NA                        390
#> 2 classif.randomForest                 NA                        241
#> 3         classif.ksvm          -9.835690                         NA
#> 4         classif.ksvm          -2.208258                         NA
#> 5 classif.randomForest                 NA                        178
#> 6 classif.randomForest                 NA                        119
#>   mmce.test.mean dob eol error.message
#> 1     0.04000000   1  NA          <NA>
#> 2     0.04666667   2  NA          <NA>
#> 3     0.72000000   3  NA          <NA>
#> 4     0.05333333   4  NA          <NA>
#> 5     0.04000000   5  NA          <NA>
#> 6     0.04000000   6  NA          <NA>
# a less ambiguous and more reliable way to construct the param set
ps = makeModelMultiplexerParamSet(lrn,
  classif.ksvm = makeParamSet(
    makeNumericParam("sigma", lower = -10, upper = 10, trafo = function(x) 2^x)
  ),
  classif.randomForest = makeParamSet(
    makeIntegerParam("ntree", lower = 1L, upper = 500L)
  )
)
# this is how you would construct the param set manually, works too
ps = makeParamSet(
  makeDiscreteParam("selected.learner", values = extractSubList(bls, "id")),
  makeNumericParam("classif.ksvm.sigma", lower = -10, upper = 10,
    trafo = function(x) 2^x, requires = quote(selected.learner == "classif.ksvm")),
  makeIntegerParam("classif.randomForest.ntree", lower = 1L, upper = 500L,
    requires = quote(selected.learner == "classif.randomForest"))
)
# all three ps objects are exactly the same internally.
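
# A natural follow-up, sketched here and not part of the original example:
# configure the multiplexer with the tuned values and train a final model.
lrn.tuned = setHyperPars(lrn, par.vals = res$x)
mod = train(lrn.tuned, iris.task)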