Resume training learners on cross-validation folds
ens1 = resume(ens,nlearn)
ens1 = resume(ens,nlearn,Name,Value)
ens1 = resume(ens,nlearn) trains ens in every fold for nlearn more cycles. resume uses the same training options fitcensemble used to create ens, except for parallel training options. If you want to resume training in parallel, pass the 'Options' name-value argument in the resume call.
ens1 = resume(ens,nlearn,Name,Value) trains with additional options specified by one or more Name,Value arguments.
ens — A cross-validated classification ensemble, created either by fitcensemble with 'CrossVal' set to 'on' or by the crossval method of a classification ensemble.
nlearn — A positive integer, the number of cycles for additional training of ens.
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value.
Name-value arguments must appear after other arguments, but the order of the
pairs does not matter.
Before R2021a, use commas to separate each name and value, and enclose
Name in quotes.
Printout frequency, specified as a positive integer scalar or 'off' (no printouts, the default). When you specify a positive integer m, resume displays a message to the command line every time it finishes training m weak learners.
For fastest training of some boosted decision trees, set 'NPrint' to the default value 'off'.
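The following sketch shows 'NPrint' in a resume call. The ensemble construction mirrors the example later on this page; the value 5 is an illustrative choice, not a requirement.

```matlab
% Resume a cross-validated ensemble, reporting progress every 5 weak learners.
load ionosphere
t = templateTree('MaxNumSplits',1);           % weak learner template
rng(10,'twister')                             % for reproducibility
cvens = fitcensemble(X,Y,'Method','GentleBoost', ...
    'NumLearningCycles',10,'Learners',t,'CrossVal','on');
cvens = resume(cvens,20,'NPrint',5);          % prints a message after every 5 new learners
```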
Options for computing in parallel and setting random numbers, specified as a structure. Create the Options structure using statset, for example, options = statset('UseParallel',true).
You need Parallel Computing Toolbox™ to compute in parallel.
You can use the same parallel options for resume that you used in the original call to fitcensemble.
For dual-core systems and above, fitcensemble parallelizes training using Intel® Threading Building Blocks (TBB), so specifying 'UseParallel' as true might not provide a significant speedup on a single computer.
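As an illustrative sketch, the following resumes training in parallel. Parallel training applies to bagged tree ensembles ('Method','Bag'); the ionosphere data set stands in for your own data, and the cycle counts are arbitrary.

```matlab
% Resume training of a cross-validated bagged ensemble in parallel
% (requires Parallel Computing Toolbox).
load ionosphere
rng(1,'twister')                              % for reproducibility
cvens = fitcensemble(X,Y,'Method','Bag', ...
    'NumLearningCycles',10,'CrossVal','on');
opts = statset('UseParallel',true);           % enable parallel computation
cvens = resume(cvens,10,'Options',opts);      % 10 more cycles, in parallel
L = kfoldLoss(cvens)
```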
ens1 — The cross-validated classification ensemble ens, augmented with additional training.
Train Partitioned Classification Ensemble for More Cycles
Train a partitioned classification ensemble for 10 cycles, and compare the classification loss obtained after training the ensemble for more cycles.
Load the ionosphere data set.
load ionosphere
Train a partitioned classification ensemble for 10 cycles and examine the error.
t = templateTree('MaxNumSplits',1); % Weak learner template tree object
rng(10,'twister') % For reproducibility
cvens = fitcensemble(X,Y,'Method','GentleBoost', ...
    'NumLearningCycles',10,'Learners',t,'crossval','on');
L = kfoldLoss(cvens)
L = 0.0940
Train for 10 more cycles and examine the new error.
cvens = resume(cvens,10); L = kfoldLoss(cvens)
L = 0.0712
The cross-validation error is lower in the ensemble after training for 10 more cycles.
Automatic Parallel Support
Accelerate code by automatically running computation in parallel using Parallel Computing Toolbox™.
resume supports parallel training through the 'Options' name-value argument. Create options using statset, such as options = statset('UseParallel',true).
Parallel ensemble training requires you to set the 'Method' name-value argument to 'Bag' when creating the ensemble. Parallel training is available only for tree learners, the default learner type for 'Bag'.
GPU Arrays
Accelerate code by running on a graphics processing unit (GPU) using Parallel Computing Toolbox™.
This function fully supports GPU arrays. For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).
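As a minimal sketch of the GPU workflow, assuming your MATLAB release and ensemble method accept gpuArray predictors (check the fitcensemble GPU support notes for your release), a supported GPU, and Parallel Computing Toolbox:

```matlab
% Train a cross-validated ensemble on GPU arrays, then resume it.
load ionosphere
Xg = gpuArray(X);                             % move predictors to the GPU
rng(1,'twister')                              % for reproducibility
cvens = fitcensemble(Xg,Y,'Method','Bag', ...
    'NumLearningCycles',10,'CrossVal','on');
cvens = resume(cvens,10);                     % resumed training also uses the GPU data
L = kfoldLoss(cvens)
```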