To train a k-nearest neighbor model, use the Classification
Learner app. For greater flexibility, train a k-nearest neighbor
model using fitcknn in the command-line interface. After training, predict
labels or estimate posterior probabilities by passing the model and predictor data to predict.
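A minimal sketch of this command-line workflow, using the fisheriris sample data set that ships with Statistics and Machine Learning Toolbox (the variable names mdl and newX are illustrative):

```matlab
% Load Fisher's iris data: meas (predictors) and species (labels)
load fisheriris

% Train a 5-nearest-neighbor classifier with fitcknn
mdl = fitcknn(meas, species, 'NumNeighbors', 5);

% Predict a label and posterior probabilities for new predictor data
newX = [5.9 3.0 5.1 1.8];
[label, posterior] = predict(mdl, newX);
```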
Classification Learner — Train models to classify data using supervised machine learning
Create Nearest Neighbor Model
Create Nearest Neighbor Searcher
Interpret Nearest Neighbor Model
crossval — Cross-validate machine learning model
kfoldEdge — Classification edge for cross-validated classification model
kfoldLoss — Classification loss for cross-validated classification model
kfoldfun — Cross-validate function for classification
kfoldMargin — Classification margins for cross-validated classification model
kfoldPredict — Classify observations in cross-validated classification model
loss — Loss of k-nearest neighbor classifier
resubLoss — Resubstitution classification loss
testcholdout — Compare accuracies of two classification models using new data
edge — Edge of k-nearest neighbor classifier
margin — Margin of k-nearest neighbor classifier
resubEdge — Resubstitution classification edge
resubMargin — Resubstitution classification margin
testckfold — Compare accuracies of two classification models by repeated cross-validation
Gather Properties of Nearest Neighbor Model
Create and compare nearest neighbor classifiers, and export trained models to make predictions for new data.
Visualize the decision surface for different classification algorithms.
Understand the steps for supervised learning and the characteristics of nonparametric classification and regression functions.
Categorize data points based on their distance to points in a training data set, using a variety of distance metrics.
Speaker Identification Using Pitch and MFCC (Audio Toolbox)
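The distance-based categorization described in the topics above can be sketched with knnsearch, which supports a variety of distance metrics (the data points here are illustrative):

```matlab
% Training points and a query point
X = [1 1; 2 2; 3 3; 10 10];
Y = [2.1 2.2];

% Find the 2 nearest neighbors under two different distance metrics
idxEuc = knnsearch(X, Y, 'K', 2);                          % Euclidean (default)
idxCb  = knnsearch(X, Y, 'K', 2, 'Distance', 'cityblock'); % city block (Manhattan)
```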