# objectDetectionMetrics

## Description

An `objectDetectionMetrics` object stores object detection quality metrics, such as the confusion matrix and average precision, for a set of images.

## Creation

Create an `objectDetectionMetrics` object by using the `evaluateObjectDetection` function.

## Properties

`ConfusionMatrix` — Confusion matrix
numeric matrix | numeric array

This property is read-only.

Confusion matrix, returned as a numeric matrix or numeric array.

- When `OverlapThreshold` is a scalar, `ConfusionMatrix` is a square matrix of size *C*-by-*C*, where *C* is the number of classes. Each element (*i*, *j*) is the count of objects known to belong to class *i* but predicted to belong to class *j*.
- When `OverlapThreshold` is a vector, `ConfusionMatrix` is an array of size *C*-by-*C*-by-*numThresh*, containing one confusion matrix for each of the *numThresh* overlap thresholds.
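The counting rule above is language-agnostic. As a minimal sketch (an illustration of the tabulation, not the MATLAB API), given the matched ground-truth and predicted class indices of each object:

```python
# Sketch of the confusion-matrix tabulation described above.
# Element (i, j) counts objects known to be class i but predicted as class j.

def confusion_matrix(true_classes, pred_classes, num_classes):
    cm = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(true_classes, pred_classes):
        cm[t][p] += 1
    return cm

# Two classes (0 and 1): three objects are classified correctly, and one
# object of class 0 is confused with class 1.
cm = confusion_matrix([0, 0, 1, 0], [0, 1, 1, 0], num_classes=2)
print(cm)  # [[2, 1], [0, 1]]
```

With a vector of overlap thresholds, one such matrix would be produced per threshold, since the set of matched detections changes with the threshold.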

`NormalizedConfusionMatrix` — Normalized confusion matrix
numeric matrix | numeric array

This property is read-only.

Normalized confusion matrix, returned as a numeric matrix or numeric array with elements in the range `[0, 1]`. This property contains a confusion matrix normalized by the number of objects known to belong to each class. For each overlap threshold, each element (*i*, *j*) in the normalized confusion matrix is the count of objects known to belong to class *i* but predicted to belong to class *j*, divided by the total number of objects known to belong to class *i*.
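The normalization divides each row of the confusion matrix by that row's ground-truth object count. A minimal sketch of this row normalization (an illustration, not the MATLAB API):

```python
# Sketch of the row normalization described above: each row of the confusion
# matrix is divided by that row's total, i.e. the number of objects known to
# belong to that class.

def normalize_rows(cm):
    out = []
    for row in cm:
        total = sum(row)  # objects known to belong to this class
        out.append([v / total if total else 0.0 for v in row])
    return out

print(normalize_rows([[2, 1], [0, 1]]))  # [[2/3, 1/3], [0.0, 1.0]]
```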

`DatasetMetrics` — Metrics aggregated over the data set
table

This property is read-only.

Metrics aggregated over the data set, returned as a table with one row. If you do not specify additional metrics through the `AdditionalMetrics` argument of the `evaluateObjectDetection` function, `DatasetMetrics` has three columns, corresponding to these object detection metrics. The metrics are not listed here in the order in which they appear in the output table.

- `NumObjects` — Number of objects in the ground truth data.
- `AP` — Average precision across all classes at each overlap threshold specified in `OverlapThreshold`, returned as a *numThresh*-by-1 vector, where *numThresh* is the number of overlap thresholds.
- `mAP` — Mean average precision, calculated by averaging the corresponding `AP` values in the same table across all overlap thresholds. Specify the overlap thresholds for the data set using the `threshold` argument.

For information on optional additional metrics for this table, see the `AdditionalMetrics` argument of the `evaluateObjectDetection` function.
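The relationship between `AP` and `mAP` above is a simple average over the threshold dimension. A minimal sketch (an illustration, not the MATLAB API; the AP values are hypothetical):

```python
# Sketch of the AP/mAP relationship described above: AP holds one value per
# overlap threshold, and mAP is the mean of that vector.

def mean_average_precision(ap_per_threshold):
    return sum(ap_per_threshold) / len(ap_per_threshold)

# Hypothetical AP values at overlap thresholds 0.5, 0.75, and 0.9.
ap = [0.5, 0.75, 1.0]
print(mean_average_precision(ap))  # 0.75
```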

`ClassMetrics` — Metrics for each class
table

This property is read-only.

Metrics for each class, returned as a table with *C* rows, where *C* is the number of classes in the object detection. If you do not specify additional metrics through the `AdditionalMetrics` argument of the `evaluateObjectDetection` function, `ClassMetrics` has five columns, corresponding to these object detection metrics. The metrics are not listed here in the order in which they appear in the output table.

- `NumObjects` — Number of objects in the ground truth data for a class.
- `AP` — Average precision calculated for a class at each overlap threshold in `OverlapThreshold`, returned as a *numThresh*-by-1 vector, where *numThresh* is the number of overlap thresholds.
- `mAP` — Mean average precision, calculated by averaging the corresponding `AP` values in the same table across all overlap thresholds. Specify the overlap thresholds for a class using the `threshold` argument.
- `Precision` — Precision values, returned as a *numThresh*-by-(*numPredictions*+1) matrix, where *numPredictions* is the number of predicted boxes. Precision is the ratio of the number of true positives (*TP*) to the total number of predicted positives:

  Precision = *TP* / (*TP* + *FP*),

  where *FP* is the number of false positives. Larger precision scores imply that most detected objects match ground truth objects.
- `Recall` — Recall values, returned as a *numThresh*-by-(*numPredictions*+1) matrix, where *numPredictions* is the number of predicted boxes. Recall is the ratio of the number of true positives (*TP*) to the total number of ground truth positives:

  Recall = *TP* / (*TP* + *FN*),

  where *FN* is the number of false negatives. Larger recall scores indicate that more of the ground truth objects are detected.

For information on optional additional metrics for this table, see the `AdditionalMetrics` argument of the `evaluateObjectDetection` function.
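The precision and recall formulas above can be sketched directly (an illustration of the formulas, not the MATLAB API; the counts are hypothetical):

```python
# Sketch of the precision/recall formulas stated above.

def precision(tp, fp):
    """TP / (TP + FP): fraction of detections that match a ground truth object."""
    return tp / (tp + fp)

def recall(tp, fn):
    """TP / (TP + FN): fraction of ground truth objects that are detected."""
    return tp / (tp + fn)

# Hypothetical counts: 8 true positives, 2 false positives, 4 missed objects.
print(precision(8, 2))  # 0.8
print(recall(8, 4))     # 8/12, about 0.667
```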

`ImageMetrics` — Metrics for each image
table

This property is read-only.

Metrics for each image in the data set, returned as a table with *numImages* rows, where *numImages* is the number of images in the data set. If you do not specify additional metrics through the `AdditionalMetrics` argument of the `evaluateObjectDetection` function, `ImageMetrics` has three columns, corresponding to these object detection metrics. The metrics are not listed here in the order in which they appear in the output table.

- `NumObjects` — Number of objects in the ground truth data in each image.
- `AP` — Average precision computed at each overlap threshold specified by `OverlapThreshold`, returned as a *numThresh*-by-1 vector, where *numThresh* is the number of overlap thresholds.
- `mAP` — Mean average precision, calculated by averaging the corresponding `AP` values in the same table across all overlap thresholds. Specify the overlap thresholds for an image using the `threshold` argument.

For information on optional additional metrics for this table, see the `AdditionalMetrics` argument of the `evaluateObjectDetection` function.

`ClassNames` — Class names
cell array of character vectors

Class names of detected objects, returned as a cell array of character vectors.

**Example:**

```
{'sky'}    {'grass'}    {'building'}    {'sidewalk'}
```

`OverlapThreshold` — Overlap threshold
numeric scalar | numeric vector

Overlap threshold, specified as a numeric scalar or numeric vector of box overlap threshold values over which the mean average precision is computed. When the intersection over union (IoU) of the pixels in the ground truth bounding box and the predicted bounding box is equal to or greater than the overlap threshold, the detection is considered a match to the ground truth. The IoU is the number of pixels in the intersection of the bounding boxes divided by the number of pixels in the union of the bounding boxes.
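For axis-aligned boxes, the IoU match criterion described above reduces to simple rectangle arithmetic. A minimal sketch (an illustration, not the MATLAB API; the `[x, y, width, height]` box format is a hypothetical choice for this example):

```python
# Sketch of the IoU (intersection over union) match criterion described above.
# Boxes use the hypothetical format [x, y, width, height].

def iou(box_a, box_b):
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Width and height of the intersection rectangle (0 if the boxes are disjoint).
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

# A predicted box covering half of a 10x10 ground truth box:
print(iou([0, 0, 10, 10], [0, 0, 10, 5]))  # 0.5

# A detection counts as a match when iou(...) is >= the overlap threshold.
```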

## Object Functions

| Function | Description |
| --- | --- |
| `metricsByArea` | Evaluate detection performance across object size ranges |

## Version History

**Introduced in R2023b**
