ResampleResult now stores the runtime it took to resample in a slot
getTaskFormula / getTaskFormulaAsString have a new argument 'explicit.features'
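A minimal sketch of the new argument on a standard classification task (the exact printed formulas are illustrative, not verified output):

```r
library(mlr)

# iris.task ships with mlr.
getTaskFormulaAsString(iris.task)
# by default the features are abbreviated, e.g. "Species ~ ."

# with explicit.features = TRUE every feature name is spelled out
getTaskFormulaAsString(iris.task, explicit.features = TRUE)
```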
getTaskData now has recodeY = “drop.levels” which drops empty factor levels
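A short sketch of recodeY = "drop.levels", assuming the stock iris.task; subsetting away one class leaves an empty factor level that the new option removes:

```r
library(mlr)

# keep only two of the three Species classes
sub = subsetTask(iris.task, subset = which(iris$Species != "virginica"))

# recodeY = "drop.levels" drops the now-empty "virginica" level from the target
df = getTaskData(sub, recodeY = "drop.levels")
levels(df$Species)  # "virginica" should no longer appear
```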
option fix.factors in makeLearner was renamed to fix.factors.prediction for clarity
showHyperPars was removed. getParamSet does exactly the same thing
'resample' and 'benchmark' got the argument keep.pred; setting it to FALSE allows you to discard the prediction objects to save memory
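A minimal sketch of keep.pred = FALSE in resample (learner and resampling choice are illustrative):

```r
library(mlr)

rdesc = makeResampleDesc("CV", iters = 3)

# keep.pred = FALSE drops the per-iteration prediction objects to save memory;
# the aggregated performance values are still available
res = resample("classif.rpart", iris.task, rdesc, keep.pred = FALSE)
res$aggr  # performance is kept
res$pred  # should be NULL, since predictions were discarded
```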
we had to slightly change how the memory usage is reported in tuning and feature selection; see TuneControl and FeatSelControl, where the current behavior is documented
tuneIrace: allows setting the precision / digits within irace (using the argument 'digits' in makeTuneControlIrace); default is maximum precision
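A sketch of the 'digits' argument; the learner, parameter set, and maxExperiments budget below are illustrative choices, not values from the entry above:

```r
library(mlr)

ps = makeParamSet(
  makeNumericParam("cp", lower = 0.001, upper = 0.1)
)

# digits = 4 limits the precision irace uses for numeric parameters;
# omitting it keeps the default of maximum precision
ctrl = makeTuneControlIrace(maxExperiments = 200L, digits = 4L)

res = tuneParams("classif.rpart", iris.task,
  makeResampleDesc("CV", iters = 2), par.set = ps, control = ctrl)
```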
for plotting in general we try to introduce a "data layer": the data is first generated independently of the plotting, into well-defined objects, which can then be plotted with mlr or custom code; the naming scheme is always a generate*Data function paired with a plot* function
getFilterValues is deprecated in favor of generateFilterValuesData
plotFilterValues can now plot multiple filter methods using faceting
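The data-layer pattern for filter values can be sketched as follows; the filter method names assume the FSelector package is installed:

```r
library(mlr)

# step 1: generate a well-defined data object, independent of any plotting
fv = generateFilterValuesData(iris.task,
  method = c("chi.squared", "information.gain"))
fv$data  # a plain data.frame you can inspect or plot with custom code

# step 2: plot it with mlr; the two filter methods appear as facets
plotFilterValues(fv)
```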
plotROCRCurves has been rewritten to use ggplot2
classif.ada: added 'loss' hyperparameter
added the 'missings' property to all ctree, cforest and blackboost methods: regr/classif for ctree, regr/classif/surv for cforest, and regr/classif for blackboost
learner xgboost was removed, because the package is unfortunately no longer on CRAN
regr.km: added param 'iso'
classif.mda: added param ‘start.method’ and changed its default to ‘lvq’, added params ‘sub.df’, ‘tot.df’ and ‘criterion’
classif.randomForest: 'sampsize' can now be an integer vector (instead of a scalar)
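A sketch of a vector-valued sampsize (the per-class counts are illustrative; per the randomForest docs, the vector is interpreted as one sample size per class, in factor-level order):

```r
library(mlr)

# draw 25 cases from each of the three iris classes in every tree
lrn = makeLearner("classif.randomForest", sampsize = c(25L, 25L, 25L))
mod = train(lrn, iris.task)
```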
plotThreshVsPerf and plotLearningCurve now have param ‘facet’