After resampling a tuning or feature selection wrapper (see makeTuneWrapper) with resample(..., extract = getTuneResult) or resample(..., extract = getFeatSelResult), this helper returns a list of the resampling indices used by the respective method.

getResamplingIndices(object, inner = FALSE)

Arguments

object

(ResampleResult)
The result of resampling a tuning or feature selection wrapper.

inner

(logical)
If TRUE, returns the inner indices of a nested resampling setting.

Value

(list). One list for each outer resampling fold.
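The returned elements can be indexed directly. The following is a minimal sketch, assuming r is a ResampleResult produced as in the Examples below; the element names train.inds and test.inds correspond to the output shown there:

# inner resampling indices of the first outer fold (sketch, not run)
idx = getResamplingIndices(r, inner = TRUE)
train_ids = idx[[1]]$train.inds[[1]]
test_ids = idx[[1]]$test.inds[[1]]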

See also

Examples

task = makeClassifTask(data = iris, target = "Species")
lrn = makeLearner("classif.rpart")
# small demo grid
ps = makeParamSet(
  makeDiscreteParam("cp", values = c(0.05, 0.1)),
  makeDiscreteParam("minsplit", values = c(10, 20))
)
ctrl = makeTuneControlGrid()
inner = makeResampleDesc("Holdout")
outer = makeResampleDesc("CV", iters = 2)
lrn = makeTuneWrapper(lrn, resampling = inner, par.set = ps, control = ctrl)
# nested resampling for evaluation;
# extracting the tuned hyperparameters in each iteration also gives access to the resampling indices
r = resample(lrn, task, outer, extract = getTuneResult)
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune] Started tuning learner classif.rpart for parameter set:
#>              Type len Def   Constr Req Tunable Trafo
#> cp       discrete   -   - 0.05,0.1   -    TRUE     -
#> minsplit discrete   -   -    10,20   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: 1
#> [Tune-x] 1: cp=0.05; minsplit=10
#> [Tune-y] 1: mmce.test.mean=0.0800000; time: 0.0 min
#> [Tune-x] 2: cp=0.1; minsplit=10
#> [Tune-y] 2: mmce.test.mean=0.0800000; time: 0.0 min
#> [Tune-x] 3: cp=0.05; minsplit=20
#> [Tune-y] 3: mmce.test.mean=0.0800000; time: 0.0 min
#> [Tune-x] 4: cp=0.1; minsplit=20
#> [Tune-y] 4: mmce.test.mean=0.0800000; time: 0.0 min
#> [Tune] Result: cp=0.05; minsplit=20 : mmce.test.mean=0.0800000
#> [Resample] iter 1: 0.0533333
#> [Tune] Started tuning learner classif.rpart for parameter set:
#>              Type len Def   Constr Req Tunable Trafo
#> cp       discrete   -   - 0.05,0.1   -    TRUE     -
#> minsplit discrete   -   -    10,20   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: 1
#> [Tune-x] 1: cp=0.05; minsplit=10
#> [Tune-y] 1: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune-x] 2: cp=0.1; minsplit=10
#> [Tune-y] 2: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune-x] 3: cp=0.05; minsplit=20
#> [Tune-y] 3: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune-x] 4: cp=0.1; minsplit=20
#> [Tune-y] 4: mmce.test.mean=0.0000000; time: 0.0 min
#> [Tune] Result: cp=0.05; minsplit=10 : mmce.test.mean=0.0000000
#> [Resample] iter 2: 0.0666667
#>
#> Aggregated Result: mmce.test.mean=0.0600000
#>
# get tuning indices
getResamplingIndices(r, inner = TRUE)
#> [[1]]
#> [[1]]$train.inds
#> [[1]]$train.inds[[1]]
#>  [1]  32  24 129 141  29   4 133 119 134  16  62  31  10 125  50  87 135  37  75
#> [20] 107 147 115 100 124   5  54 111 143  36 120  67 112  69  65   6  52 148  13
#> [39] 103  70 146 117  27  46  59  99 121  48  58 102
#> 
#> 
#> [[1]]$test.inds
#> [[1]]$test.inds[[1]]
#>  [1] 101  26  78  34  95  90  11  53 140   2  39  77  71 126 108  88 144  47  41
#> [20] 122  25  40  93  42   3
#> 
#> 
#> 
#> [[2]]
#> [[2]]$train.inds
#> [[2]]$train.inds[[1]]
#>  [1]  60  63 109  85  49 131  17  73  35  15   1  33  61 136 145 149  44 113  21
#> [20]  80  98  30  57 128  14  79 137 116 150  74  84  91 105 130 118  76   9  56
#> [39]  86  82  97 142   7 138  89  51  22  19 114  68
#> 
#> 
#> [[2]]$test.inds
#> [[2]]$test.inds[[1]]
#>  [1]  81  92  12  20  38  96  64 139 127 106  72  28  94  45 104  83  55 132 123
#> [20]  18  66   8  43 110  23
#> 
#> 
#> 
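
# The outer train/test indices can be retrieved analogously by leaving inner at its
# default of FALSE, as in the usage above. A minimal sketch; output not shown:
outer_idx = getResamplingIndices(r)
str(outer_idx, max.level = 1)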