estimates Cohen's kappa coefficient
[...] = nan_kappa(d1, d2);
     NaNs are handled as missing values and are ignored.
[...] = nan_kappa(d1, d2, 'notIgnoreNAN');
     NaNs are handled as just another label.
[kap, sd, H, z, ACC, sACC, MI] = nan_kappa(...);
X = nan_kappa(...);
d1     data of scorer 1
d2     data of scorer 2
kap    Cohen's kappa coefficient (point estimate)
sd     standard error of the kappa estimate
H      confusion matrix
z      z-score
ACC    overall agreement (accuracy)
sACC   specific accuracy
MI     mutual information or transfer information (in [bits])
X      struct containing all of the fields above
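For reference, kappa can be derived from the confusion matrix H by comparing the observed agreement with the agreement expected by chance. A minimal MATLAB/Octave sketch (the variable names n, p0, and pe are illustrative, not part of nan_kappa's interface):

    n   = sum(H(:));                   % total number of rated samples
    p0  = sum(diag(H)) / n;            % observed agreement (= ACC)
    pe  = (sum(H,1) * sum(H,2)) / n^2; % chance agreement from the marginals
    kap = (p0 - pe) / (1 - pe);        % Cohen's kappa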
For two classes, a number of additional summary statistics are provided, including TPR, FPR, FDR, PPV, NPV, F1, dprime, Matthews correlation coefficient (MCC), specificity, and sensitivity. Note that the positive category must be the larger label (in d1 and d2); otherwise the confusion matrix is transposed and the summary statistics are invalid.
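These quantities follow from the 2x2 confusion matrix in the standard way. A hedged sketch, assuming H(i,j) counts samples rated i by scorer 1 and j by scorer 2, with scorer 1 taken as the reference and the positive (larger) label in the second row/column:

    TP = H(2,2);  FN = H(2,1);      % positives: correctly / incorrectly rated
    FP = H(1,2);  TN = H(1,1);      % negatives: incorrectly / correctly rated
    TPR = TP / (TP + FN);           % sensitivity (true positive rate)
    FPR = FP / (FP + TN);           % 1 - specificity
    PPV = TP / (TP + FP);           % positive predictive value (precision)
    NPV = TN / (TN + FN);           % negative predictive value
    FDR = FP / (TP + FP);           % false discovery rate
    F1  = 2*TP / (2*TP + FP + FN);  % F1 score
    MCC = (TP*TN - FP*FN) / sqrt((TP+FP)*(TP+FN)*(TN+FP)*(TN+FN));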
ratings1 = [0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0];
ratings2 = [0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0];
X = nan_kappa(ratings1, ratings2)
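In this example the two scorers agree on 18 of 20 samples, so the overall agreement is high (ACC = 0.90), but the chance agreement from the marginals is even higher, pe = (19/20)^2 + (1/20)^2 = 0.905. Kappa therefore comes out slightly negative, (0.90 - 0.905)/(1 - 0.905) = -0.053, illustrating that high raw agreement does not imply agreement beyond chance.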
[1] Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37-46.
[2] J. Bortz, G. A. Lienert (1998). Kurzgefasste Statistik für die klassische Forschung. Springer, Berlin - Heidelberg.
    Kapitel 6: Übereinstimmungsmaße für subjektive Merkmalsurteile, pp. 265-270.
[3] http://www.cmis.csiro.au/Fiona.Evans/personal/msc/html/chapter3.html
[4] Kraemer, H. C. (1982). Kappa coefficient. In S. Kotz and N. L. Johnson (Eds.), Encyclopedia of Statistical Sciences. New York: John Wiley & Sons.
[5] http://ourworld.compuserve.com/homepages/jsuebersax/kappa.htm
[6] http://en.wikipedia.org/wiki/Receiver_operating_characteristic