Get the resampling indices from a tuning or feature selection wrapper.
Source: R/getResamplingIndices.R
After resampling a tuning or feature selection wrapper (see makeTuneWrapper) with resample(..., extract = getTuneResult) or resample(..., extract = getFeatSelResult), this helper returns a list with the resampling indices used for the respective method.
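A minimal call sketch (r denotes the ResampleResult produced in the Examples below; with the default inner = FALSE the outer indices are returned):

getResamplingIndices(r)                # outer train/test indices, one element per outer fold
getResamplingIndices(r, inner = TRUE)  # inner indices used by the tuning inside each outer fold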
Arguments
- object (ResampleResult)
  The result of resampling a tuning or feature selection wrapper.
- inner (logical)
  If TRUE, returns the inner indices of a nested resampling setting; otherwise the outer indices.
Value
(list). One list element per outer resampling fold, each containing the train.inds and test.inds used.
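As the example output below shows, each element holds the train.inds and test.inds of that fold, so a single index vector can be extracted directly (inds is a hypothetical name):

inds = getResamplingIndices(r, inner = TRUE)
# training indices of the first inner fold within outer fold 1
inds[[1]]$train.inds[[1]]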
See also
Other tune: TuneControl, getNestedTuneResultsOptPathDf(), getNestedTuneResultsX(), getTuneResult(), makeModelMultiplexerParamSet(), makeModelMultiplexer(), makeTuneControlCMAES(), makeTuneControlDesign(), makeTuneControlGenSA(), makeTuneControlGrid(), makeTuneControlIrace(), makeTuneControlMBO(), makeTuneControlRandom(), makeTuneWrapper(), tuneParams(), tuneThreshold()
Examples
library(mlr)

task = makeClassifTask(data = iris, target = "Species")
lrn = makeLearner("classif.rpart")
# small toy grid
ps = makeParamSet(
makeDiscreteParam("cp", values = c(0.05, 0.1)),
makeDiscreteParam("minsplit", values = c(10, 20))
)
ctrl = makeTuneControlGrid()
inner = makeResampleDesc("Holdout")
outer = makeResampleDesc("CV", iters = 2)
lrn = makeTuneWrapper(lrn, resampling = inner, par.set = ps, control = ctrl)
# nested resampling for evaluation;
# extracting the tune result in each iteration also stores the resampling indices
r = resample(lrn, task, outer, extract = getTuneResult)
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune] Started tuning learner classif.rpart for parameter set:
#> Type len Def Constr Req Tunable Trafo
#> cp discrete - - 0.05,0.1 - TRUE -
#> minsplit discrete - - 10,20 - TRUE -
#> With control class: TuneControlGrid
#> Imputation value: 1
#> [Tune-x] 1: cp=0.05; minsplit=10
#> [Tune-y] 1: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 2: cp=0.1; minsplit=10
#> [Tune-y] 2: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 3: cp=0.05; minsplit=20
#> [Tune-y] 3: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 4: cp=0.1; minsplit=20
#> [Tune-y] 4: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune] Result: cp=0.1; minsplit=20 : mmce.test.mean=0.0400000
#> [Resample] iter 1: 0.0666667
#> [Tune] Started tuning learner classif.rpart for parameter set:
#> Type len Def Constr Req Tunable Trafo
#> cp discrete - - 0.05,0.1 - TRUE -
#> minsplit discrete - - 10,20 - TRUE -
#> With control class: TuneControlGrid
#> Imputation value: 1
#> [Tune-x] 1: cp=0.05; minsplit=10
#> [Tune-y] 1: mmce.test.mean=0.1600000; time: 0.0 min
#> [Tune-x] 2: cp=0.1; minsplit=10
#> [Tune-y] 2: mmce.test.mean=0.1600000; time: 0.0 min
#> [Tune-x] 3: cp=0.05; minsplit=20
#> [Tune-y] 3: mmce.test.mean=0.1600000; time: 0.0 min
#> [Tune-x] 4: cp=0.1; minsplit=20
#> [Tune-y] 4: mmce.test.mean=0.1600000; time: 0.0 min
#> [Tune] Result: cp=0.05; minsplit=20 : mmce.test.mean=0.1600000
#> [Resample] iter 2: 0.0266667
#>
#> Aggregated Result: mmce.test.mean=0.0466667
#>
# get the inner resampling indices used during tuning
getResamplingIndices(r, inner = TRUE)
#> [[1]]
#> [[1]]$train.inds
#> [[1]]$train.inds[[1]]
#> [1] 97 39 149 119 91 72 88 84 85 13 79 138 36 106 89 22 108 100 67
#> [20] 129 42 92 58 69 77 17 26 19 46 121 147 117 50 7 140 120 21 1
#> [39] 49 82 105 5 12 71 45 30 57 52 118 126
#>
#>
#> [[1]]$test.inds
#> [[1]]$test.inds[[1]]
#> [1] 125 6 75 11 38 51 96 133 23 25 8 139 114 47 34 123 37 20 32
#> [20] 48 144 4 109 59 44
#>
#>
#>
#> [[2]]
#> [[2]]$train.inds
#> [[2]]$train.inds[[1]]
#> [1] 35 70 102 141 61 66 150 98 146 104 80 111 41 110 18 81 65 16 14
#> [20] 55 60 28 24 74 62 87 148 10 73 29 76 135 145 43 134 3 93 101
#> [39] 131 83 15 9 63 68 107 127 56 115 64 116
#>
#>
#> [[2]]$test.inds
#> [[2]]$test.inds[[1]]
#> [1] 40 128 95 122 112 124 132 136 2 54 90 33 94 142 31 53 103 130 86
#> [20] 78 137 27 113 143 99
#>
#>
#>
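The returned indices refer to rows of the original task data, so they can be used to reconstruct, e.g., the inner training sets. A short sketch building on the example above:

inner.inds = getResamplingIndices(r, inner = TRUE)
# rows of iris the tuner trained on in the first outer fold
head(iris[inner.inds[[1]]$train.inds[[1]], ])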