File Exchange


Cohen's Kappa

version (27.3 KB) by Giuseppe Cardillo
Compute Cohen's kappa


Updated 15 Mar 2018

View license on GitHub

This function computes Cohen's kappa coefficient.
Cohen's kappa coefficient is a statistical measure of inter-rater reliability. It is generally considered more robust than a simple percent-agreement calculation, since kappa takes into account the agreement occurring by chance. Kappa measures the degree to which two judges, A and B, concur in their respective sortings of N items into k mutually exclusive categories. A 'judge' in this context can be an individual human being, a set of individuals who sort the N items collectively, or some non-human agency, such as a computer program or diagnostic test, that performs a sorting on the basis of specified criteria.

The original and simplest version of kappa is the unweighted coefficient introduced by J. Cohen in 1960. When the categories are merely nominal, Cohen's simple unweighted coefficient is the only form of kappa that can meaningfully be used. If the categories are ordinal, so that category 2 represents more of something than category 1, category 3 more of that same something than category 2, and so on, then it is potentially meaningful to take this into account, weighting each cell of the matrix according to how near it is to the cell in that row that contains the absolutely concordant items. This function can compute linear or quadratic weights.
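The linear and quadratic weighting schemes mentioned above follow the standard definitions w_ij = 1 - |i-j|/(k-1) and w_ij = 1 - ((i-j)/(k-1))^2; a minimal Python sketch of those weight matrices (an illustration of the standard formulas, not code taken from the MATLAB source):

```python
# Build the standard linear or quadratic agreement-weight matrix for
# k ordered categories: diagonal cells get weight 1 (full credit),
# and weights decrease with distance from the diagonal.
def weight_matrix(k, kind="linear"):
    w = [[0.0] * k for _ in range(k)]
    for i in range(k):
        for j in range(k):
            d = abs(i - j) / (k - 1)          # normalized distance from diagonal
            w[i][j] = 1 - d if kind == "linear" else 1 - d ** 2
    return w
```

With quadratic weights, off-diagonal disagreements close to the diagonal are penalized less than with linear weights (e.g. for k = 3, the cell one step off the diagonal gets weight 0.75 instead of 0.5).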
Syntax: kappa(X,W,ALPHA)

Inputs:
X - square data matrix
W - weight (0 = unweighted; 1 = linear weights; 2 = quadratic weights; -1 = display all). Default = 0
ALPHA - significance level. Default = 0.05

Outputs:

- Observed agreement percentage
- Random agreement percentage
- Agreement percentage due to true concordance
- Residual not random agreement percentage
- Cohen's kappa
- kappa error
- kappa confidence interval
- Maximum possible kappa
- k observed as proportion of maximum possible
- k benchmarks by Landis and Koch
- z test results


x=[88 14 18; 10 40 10; 2 6 12];

Calling the function in MATLAB: kappa(x)

The answer is:


Observed agreement (po) = 0.7000
Random agreement (pe) = 0.4100
Agreement due to true concordance (po-pe) = 0.2900
Residual not random agreement (1-pe) = 0.5900
Cohen's kappa = 0.4915
kappa error = 0.0549
kappa C.I. (alpha = 0.0500) = 0.3839 0.5992
Maximum possible kappa, given the observed marginal frequencies = 0.8305
k observed as proportion of maximum possible = 0.5918
Moderate agreement
Variance = 0.0031 z (k/sqrt(var)) = 8.8347 p = 0.0000
Reject null hypothesis: observed agreement is not accidental
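The unweighted figures above can be reproduced in a few lines from the marginal totals of the example matrix; a minimal Python sketch (independent of the MATLAB code):

```python
# Reproduce the unweighted kappa for the worked example above.
x = [[88, 14, 18],
     [10, 40, 10],
     [ 2,  6, 12]]

n = sum(sum(row) for row in x)                        # total items: 200
po = sum(x[i][i] for i in range(3)) / n               # observed agreement
rows = [sum(row) for row in x]                        # row marginal totals
cols = [sum(x[i][j] for i in range(3)) for j in range(3)]  # column totals
pe = sum(r * c for r, c in zip(rows, cols)) / n ** 2  # chance agreement
kappa = (po - pe) / (1 - pe)

print(round(po, 4), round(pe, 4), round(kappa, 4))    # 0.7 0.41 0.4915
```

This matches the function's output: po = 0.7000, pe = 0.4100, kappa = 0.4915.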

Created by Giuseppe Cardillo

To cite this file, an appropriate format would be: Cardillo G. (2007) Cohen's kappa: compute the Cohen's kappa ratio on a 2x2 matrix.

Cite As

Giuseppe Cardillo (2020). Cohen's Kappa, GitHub. Retrieved .

Comments and Ratings (17)

Dear Gordana Panic, if you read the code before asking... you will find the same cut-offs

I see this was submitted a while ago, so I hope the board is still active. The output also provides a categorical evaluation of the kappa statistic such as "fair" or "moderate". Can you please provide the cut-offs that you used for these evaluations? To my knowledge, it should be something like:
Value of K Strength of agreement
< 0.20 Poor
0.21 - 0.40 Fair
0.41 - 0.60 Moderate
0.61 - 0.80 Good
0.81 - 1.00 Very good

Kind regards.
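The cut-off table quoted in the comment above can be expressed as a small lookup; a Python sketch of that table (these are the commenter's quoted cut-offs, not necessarily the exact boundaries used inside the MATLAB function):

```python
# Map a kappa value to the strength-of-agreement labels from the
# cut-off table quoted in the comment above.
def benchmark(k):
    if k <= 0.20:
        return "Poor"
    if k <= 0.40:
        return "Fair"
    if k <= 0.60:
        return "Moderate"
    if k <= 0.80:
        return "Good"
    return "Very good"
```

For the worked example, kappa = 0.4915 falls in the 0.41-0.60 band, consistent with the "Moderate agreement" line in the function's output.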


You're right, I double-checked. Thanks for your help.


Thank you for your answer. I do get an output when I feed your code with the 3d matrix, though. Does the k correspond to the agreement in the first 2d matrix in the 3d variable or does it re-shape it?

Nothing, because the script uses a two-dimensional matrix


Hi, what happens when 'X' is a 3 dimension variable (each third dimension being a square matrix)? Are the outputs the average for all square matrices? Thank you.

Amazing job!


Thank you very much! Your comment is very useful to me!

I have read the help, but wanted to be 100% sure about these questions.

1 & 2) The confusion matrix is a square matrix, so the function will compute kappa. Cohen's kappa is used to test the agreement between judges. If they can classify "objects" into 16 categories, you will have a 16x16 square matrix: on the main diagonal you will have the "objects" that both judges classified in the same category.
3) No; please read the help section


I have a confusion matrix (dimension 16x16) resulted from a classification in 16 classes.

I use >> kappa(cf_mat);

1) If I give this matrix to your function, will it calculate the kappa coefficient for this classification? You only specify X as a square data matrix, not as a confusion matrix.

2) Does your function also work on multi-class data?

3) Do I need to provide weights if the classes are not balanced?

Thank you!



And how could I help you? The error message is clear

Thanks Giuseppe Cardillo
I am getting an error at kappa line 98:
Warning: all X values must be numeric and finite
Please help

Because you saved the file into a directory that is unreachable by MATLAB

This function is giving an error:
Undefined function or method 'kappa' for input arguments of type 'double'

I think there is an error in the quadratic loop: lowercase j needs to be J.

Stefano Cavazza


inputparser and github link

Changes in description

correction after tzur Karelitz observation

Changes in help section

Improvement in input error handling

NORMINV was replaced by ERFCINV so the Statistics Toolbox is no longer needed

Introduction of asymptotic calculation of the variance for large populations. Some comments added.

Weighted k calculation added

MATLAB Release Compatibility
Created with R2014b
Compatible with any release
Platform Compatibility
Windows macOS Linux