Classification Ensembles

Boosting, random forest, bagging, random subspace, and ECOC ensembles for multiclass learning

A classification ensemble is a predictive model composed of a weighted combination of multiple classification models. In general, combining multiple classification models increases predictive performance.

To explore classification ensembles interactively, use the Classification Learner app. For greater flexibility, use the fitcensemble function at the command line to boost or bag classification trees, or to grow a random forest [11]. For details on all supported ensembles, see Ensemble Algorithms. To reduce a multiclass problem to an ensemble of binary classification problems, train an error-correcting output codes (ECOC) model. For details, see fitcecoc.
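As a sketch of the command-line workflow, the following boosts classification trees on the fisheriris sample data set that ships with the toolbox; the choice of method and number of learning cycles here is illustrative, not a recommendation:

```matlab
% Load Fisher's iris data (ships with Statistics and Machine Learning Toolbox)
load fisheriris

% Boost 100 classification trees; AdaBoostM2 handles the three classes
ens = fitcensemble(meas, species, 'Method', 'AdaBoostM2', ...
    'NumLearningCycles', 100);

% Estimate generalization error with 10-fold cross-validation
cvens = crossval(ens);
kfoldLoss(cvens)

% Predict the class of a new observation
label = predict(ens, [5.9 3.0 5.1 1.8])
```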

To boost regression trees using LSBoost, or to grow a random forest of regression trees [11], see Regression Ensembles.


Classification Learner: Train models to classify data using supervised machine learning


templateDiscriminant: Discriminant analysis classifier template
templateECOC: Error-correcting output codes learner template
templateEnsemble: Ensemble learning template
templateKNN: k-nearest neighbor classifier template
templateLinear: Linear classification learner template
templateNaiveBayes: Naive Bayes classifier template
templateSVM: Support vector machine template
templateTree: Create decision tree template
fitcensemble: Fit ensemble of learners for classification
predict: Predict labels using ensemble of classification models
oobPredict: Predict out-of-bag response of ensemble
TreeBagger: Create bag of decision trees
fitcensemble: Fit ensemble of learners for classification
predict: Predict responses using ensemble of bagged decision trees
oobPredict: Ensemble predictions for out-of-bag observations
fitcecoc: Fit multiclass models for support vector machines or other classifiers
templateSVM: Support vector machine template
predict: Predict labels using multiclass, error-correcting output codes model
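A minimal ECOC sketch that combines fitcecoc, templateSVM, and predict from the list above, again on the fisheriris sample data; the template settings are illustrative:

```matlab
load fisheriris

% Template for the binary SVM learners; standardize the predictors
t = templateSVM('Standardize', true);

% Train an ECOC model built from binary SVMs (default one-versus-one coding)
mdl = fitcecoc(meas, species, 'Learners', t);

% Predict the class of a new flower measurement
label = predict(mdl, [5.0 3.4 1.5 0.2])
```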


ClassificationEnsemble: Ensemble classifier
CompactClassificationEnsemble: Compact classification ensemble class
ClassificationPartitionedEnsemble: Cross-validated classification ensemble
TreeBagger: Bag of decision trees
CompactTreeBagger: Compact ensemble of decision trees grown by bootstrap aggregation
ClassificationBaggedEnsemble: Classification ensemble grown by resampling
ClassificationECOC: Multiclass model for support vector machines or other classifiers
CompactClassificationECOC: Compact multiclass model for support vector machines or other classifiers
ClassificationPartitionedECOC: Cross-validated multiclass model for support vector machines or other classifiers


Train Ensemble Classifiers Using Classification Learner App

Create and compare ensemble classifiers, and export trained models to make predictions for new data.

Framework for Ensemble Learning

Obtain highly accurate predictions by using many weak learners.

Ensemble Algorithms

Learn about different algorithms for ensemble learning.

Train Classification Ensemble

Train a simple classification ensemble.

Test Ensemble Quality

Learn methods to evaluate the predictive quality of an ensemble.

Handle Imbalanced Data or Unequal Misclassification Costs in Classification Ensembles

Learn how to set prior class probabilities and misclassification costs.

Classification with Imbalanced Data

Use the RUSBoost algorithm for classification when one or more classes are over-represented in your data.
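As a sketch, RUSBoost is one of the Method options accepted by fitcensemble; the artificial class imbalance below (keeping only 10 of the 50 setosa observations) and the tree-depth setting are purely illustrative:

```matlab
load fisheriris

% Create an artificially imbalanced training set (illustrative only)
idx = [1:10, 51:150];
X = meas(idx, :);
Y = species(idx);

% Shallow-tree template for the weak learners
t = templateTree('MaxNumSplits', 5);

% RUSBoost undersamples the majority classes at each boosting step
mdl = fitcensemble(X, Y, 'Method', 'RUSBoost', ...
    'Learners', t, 'NumLearningCycles', 50);
```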

Classification with Many Categorical Levels

Train an ensemble of classification trees using data containing predictors with many categorical levels.

LPBoost and TotalBoost for Small Ensembles

Create small ensembles by using the LPBoost and TotalBoost algorithms. (LPBoost and TotalBoost require Optimization Toolbox™.)

Tune RobustBoost

Tune RobustBoost parameters for better predictive accuracy. (RobustBoost requires Optimization Toolbox.)

Surrogate Splits

Gain better predictions when you have missing data by using surrogate splits.
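A brief sketch of enabling surrogate splits through a tree template, so the ensemble can still predict when a predictor value is missing; the NaN in the test observation is contrived for illustration:

```matlab
load fisheriris

% Tree template with surrogate splits turned on
t = templateTree('Surrogate', 'on');
ens = fitcensemble(meas, species, 'Method', 'AdaBoostM2', 'Learners', t);

% Surrogate splits let the trees route an observation with a missing
% predictor value (NaN) using correlated predictors instead
label = predict(ens, [5.1 NaN 1.4 0.2])
```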

Bootstrap Aggregation (Bagging) of Classification Trees

Create a TreeBagger ensemble for classification.
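A minimal TreeBagger sketch on the fisheriris sample data; the ensemble size of 50 trees is an arbitrary illustrative choice:

```matlab
load fisheriris

% Grow a bag of 50 decision trees; keep out-of-bag info for error estimation
b = TreeBagger(50, meas, species, 'OOBPrediction', 'on');

% Out-of-bag classification error as a function of the number of trees
oobErr = oobError(b);

% Predict labels for new observations
labels = predict(b, meas(1:5, :));
```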

Random Subspace Classification

Increase the accuracy of classification by using a random subspace ensemble.
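As a sketch, a random subspace ensemble can be requested through fitcensemble with the Subspace method; the k-nearest-neighbor learners and ensemble size below are illustrative:

```matlab
load fisheriris

% Random subspace ensemble: each KNN learner sees a random predictor subset
ens = fitcensemble(meas, species, 'Method', 'Subspace', ...
    'Learners', 'knn', 'NumLearningCycles', 30);

% Cross-validated misclassification rate
kfoldLoss(crossval(ens))
```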
