k-nearest neighbor classification
ClassificationKNN is a nearest-neighbor classification model in which you can alter both the distance metric and the number of nearest neighbors. Because a ClassificationKNN classifier stores training data, you can use the model to compute resubstitution predictions. Alternatively, use the model to classify new observations using the predict method.
Create a ClassificationKNN model using fitcknn.
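For example, a minimal sketch of this train-then-predict workflow using Fisher's iris data (the meas predictors and species labels that ship with Statistics and Machine Learning Toolbox); the NumNeighbors value and the query point are illustrative choices, not requirements:

```matlab
% Load Fisher's iris data: meas is 150-by-4, species holds class labels.
load fisheriris

% Train a k-nearest neighbor classifier in one step.
% 'NumNeighbors',5 is an illustrative choice; the default is 1.
mdl = fitcknn(meas,species,'NumNeighbors',5);

% Classify a new observation.
label = predict(mdl,[5.0 3.4 1.5 0.2])
```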
| Function | Description |
| --- | --- |
| compareHoldout | Compare accuracies of two classification models using new data |
| crossval | Cross-validate machine learning model |
| edge | Edge of k-nearest neighbor classifier |
| gather | Gather properties of machine learning model from GPU |
| lime | Local interpretable model-agnostic explanations (LIME) |
| loss | Loss of k-nearest neighbor classifier |
| margin | Margin of k-nearest neighbor classifier |
| partialDependence | Compute partial dependence |
| plotPartialDependence | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots |
| predict | Predict labels using k-nearest neighbor classification model |
| resubEdge | Resubstitution classification edge |
| resubLoss | Resubstitution classification loss |
| resubMargin | Resubstitution classification margin |
| resubPredict | Classify training data using trained classifier |
| shapley | Shapley values |
| testckfold | Compare accuracies of two classification models by repeated cross-validation |
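As a hedged sketch of two entries in this table, the following contrasts the resubstitution loss with a cross-validated estimate (the 5-neighbor setting is again illustrative):

```matlab
load fisheriris
mdl = fitcknn(meas,species,'NumNeighbors',5);

% Resubstitution loss: error measured on the stored training data.
trainErr = resubLoss(mdl)

% Cross-validated loss (10-fold by default) is usually a less
% optimistic estimate of generalization error.
cvmdl = crossval(mdl);
cvErr = kfoldLoss(cvmdl)
```

Because resubstitution predictions reuse the training data, resubLoss typically underestimates the loss that kfoldLoss reports on held-out observations.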
The compact function reduces the size of most classification models by removing the training data properties and any other properties that are not required to predict the labels of new observations. Because k-nearest neighbor classification models require all of the training data to predict labels, you cannot reduce the size of a ClassificationKNN model.
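One quick way to confirm this, assuming the same iris data as above: the full training set remains available through the model's X and Y properties, so there is nothing for compact to remove:

```matlab
load fisheriris
mdl = fitcknn(meas,species);

% The model stores the complete training set.
size(mdl.X)   % 150-by-4 predictor matrix
size(mdl.Y)   % 150-by-1 class labels
```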
knnsearch finds the k-nearest neighbors of points. rangesearch finds all the points within a fixed distance. You can use these functions for classification, as shown in Classify Query Data. If you want to perform classification, then using ClassificationKNN models can be more convenient because you can train a classifier in one step (using fitcknn) and classify in other steps (using predict). Alternatively, you can train a k-nearest neighbor classification model using one of the cross-validation options in the call to fitcknn. In this case, fitcknn returns a ClassificationPartitionedModel cross-validated model object.
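The sketch below contrasts these approaches; the query point, the 'K' value, the search radius, and the NumNeighbors setting are all illustrative assumptions:

```matlab
load fisheriris
queryPt = [5.0 3.4 1.5 0.2];   % illustrative query point

% knnsearch: indices of the 3 nearest training points to the query.
idx = knnsearch(meas,queryPt,'K',3);
nearestLabels = species(idx)

% rangesearch: indices of all training points within distance 0.5.
idxInRange = rangesearch(meas,queryPt,0.5);

% Cross-validation requested directly in the call to fitcknn;
% the result is a ClassificationPartitionedModel, not a ClassificationKNN.
cvmdl = fitcknn(meas,species,'NumNeighbors',5,'CrossVal','on');
cvErr = kfoldLoss(cvmdl)
```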