After you have resampled a tuning or feature selection wrapper (see makeTuneWrapper) with resample(..., extract = getTuneResult) or resample(..., extract = getFeatSelResult), this helper returns a list with the resampling indices used by the respective method.

getResamplingIndices(object, inner = FALSE)

Arguments

object

(ResampleResult)
The result of resampling a tuning or feature selection wrapper.

inner

(logical)
If TRUE, returns the indices of the inner resampling of a nested resampling setting instead of the outer ones.

Value

(list). One list for each outer resampling fold, holding the train.inds and test.inds used in that fold.
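
A minimal sketch of how the returned structure can be indexed, assuming r is a ResampleResult obtained with extract = getTuneResult as in the example below (the element names train.inds and test.inds follow the output shown there):

inds = getResamplingIndices(r, inner = TRUE)
# inner training indices of the first outer fold
inds[[1]]$train.inds[[1]]
# corresponding inner test indices
inds[[1]]$test.inds[[1]]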

See also

Examples

task = makeClassifTask(data = iris, target = "Species")
lrn = makeLearner("classif.rpart")
# small demo grid for tuning
ps = makeParamSet(
  makeDiscreteParam("cp", values = c(0.05, 0.1)),
  makeDiscreteParam("minsplit", values = c(10, 20))
)
ctrl = makeTuneControlGrid()
inner = makeResampleDesc("Holdout")
outer = makeResampleDesc("CV", iters = 2)
lrn = makeTuneWrapper(lrn, resampling = inner, par.set = ps, control = ctrl)
# nested resampling for evaluation
# we also extract the tuned hyperparameters in each iteration and thereby the resampling indices
r = resample(lrn, task, outer, extract = getTuneResult)
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune] Started tuning learner classif.rpart for parameter set:
#>              Type len Def   Constr Req Tunable Trafo
#> cp       discrete   -   - 0.05,0.1   -    TRUE     -
#> minsplit discrete   -   -    10,20   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: 1
#> [Tune-x] 1: cp=0.05; minsplit=10
#> [Tune-y] 1: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune-x] 2: cp=0.1; minsplit=10
#> [Tune-y] 2: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune-x] 3: cp=0.05; minsplit=20
#> [Tune-y] 3: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune-x] 4: cp=0.1; minsplit=20
#> [Tune-y] 4: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune] Result: cp=0.05; minsplit=20 : mmce.test.mean=0.0000000
#> [Resample] iter 1: 0.0400000
#> [Tune] Started tuning learner classif.rpart for parameter set:
#>              Type len Def   Constr Req Tunable Trafo
#> cp       discrete   -   - 0.05,0.1   -    TRUE     -
#> minsplit discrete   -   -    10,20   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: 1
#> [Tune-x] 1: cp=0.05; minsplit=10
#> [Tune-y] 1: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 2: cp=0.1; minsplit=10
#> [Tune-y] 2: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 3: cp=0.05; minsplit=20
#> [Tune-y] 3: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 4: cp=0.1; minsplit=20
#> [Tune-y] 4: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune] Result: cp=0.05; minsplit=10 : mmce.test.mean=0.0400000
#> [Resample] iter 2: 0.0800000
#>
#> Aggregated Result: mmce.test.mean=0.0600000
#>
# get tuning indices
getResamplingIndices(r, inner = TRUE)
#> [[1]]
#> [[1]]$train.inds
#> [[1]]$train.inds[[1]]
#>  [1] 113 116  56 118  66  84  38  50 141  30  12 135  15  99 127 119   2  19 125
#> [20] 149  69 130  90  23 132  68  31 139  25   9  98 100 147  81 107 144 148 128
#> [39]  16   1  27  78  21  36 123  47   6  42 103 133
#> 
#> 
#> [[1]]$test.inds
#> [[1]]$test.inds[[1]]
#>  [1] 109  79  73 121 105  22 137  40 102  49  48  45  65  35  91  83  58 129  13
#> [20]  51  39  86 140  75 106
#> 
#> 
#> 
#> [[2]]
#> [[2]]$train.inds
#> [[2]]$train.inds[[1]]
#>  [1]  54  85  93   3  17  28  95  32  37 138  80  71 143  89  70 131  44  76  52
#> [20] 145  29  82 146 115 124 122 111 126  97  24 112  74  61  60 104  33  57  59
#> [39]  10 150  94 108  96 142  62   8 117  64   4  87
#> 
#> 
#> [[2]]$test.inds
#> [[2]]$test.inds[[1]]
#>  [1]  72  34  11 136  67  77  14  88 114  20  63  53  18  92 110  55  26   7 101
#> [20] 120  41  43   5  46 134
#> 
#> 
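
# The same extraction mechanism works for a feature selection wrapper.
# A hedged sketch, reusing task, inner and outer from above; maxit = 5 is
# an arbitrary illustration value for the random search control.
lrn.fs = makeFeatSelWrapper("classif.rpart", resampling = inner,
  control = makeFeatSelControlRandom(maxit = 5))
r.fs = resample(lrn.fs, task, outer, extract = getFeatSelResult)
# indices of the inner feature selection resampling, one list per outer fold
getResamplingIndices(r.fs, inner = TRUE)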