Interrater reliability measures the degree of agreement between two or more raters who assess the same set of items. A worked example for Cohen’s kappa follows the topic list below.
Topics
- Cohen’s Kappa
- Weighted Cohen’s Kappa
- Fleiss’ Kappa
- Krippendorff’s Alpha
- Gwet’s AC2
- Intraclass Correlation
- Kendall’s Coefficient of Concordance (W)
- Kendall’s Coefficient of Agreement u for Paired Comparisons
- Kendall’s Coefficient of Agreement u for Paired Rankings
- TC Correlation between several Judges and a Criterion
- Bland-Altman Analysis
- Lin’s Concordance Correlation Coefficient (CCC)
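Several of the kappa-type statistics above correct the observed agreement for the agreement expected by chance. As a concrete illustration, the sketch below computes Cohen’s kappa for two raters from first principles; the sample ratings, function name, and labels are illustrative assumptions, not taken from the reference.

```python
# Minimal sketch: Cohen's kappa for two raters on nominal labels.
# The example ratings below are illustrative only.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of nominal ratings."""
    assert len(rater_a) == len(rater_b), "raters must rate the same items"
    n = len(rater_a)
    # Observed agreement: proportion of items where the two raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() & freq_b.keys()) / n**2
    # Kappa = (observed - chance) / (1 - chance).
    return (p_o - p_e) / (1 - p_e)

if __name__ == "__main__":
    a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
    b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]
    print(f"kappa = {cohens_kappa(a, b):.3f}")  # 0.500 for this toy data
```

The same observed-versus-chance structure underlies Fleiss’ kappa, Krippendorff’s alpha, and Gwet’s AC2; they differ mainly in how chance agreement is modeled and how many raters they accommodate.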
Reference
Siegel, S., & Castellan, N. J. (1988). Nonparametric statistics for the behavioral sciences (2nd ed.).
https://psycnet.apa.org/record/1988-97307-000