Cohen's Kappa Calculator
Measures agreement between two raters classifying the same items (inter-rater reliability), correcting for the agreement expected by chance.
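Concretely, kappa compares the observed proportion of agreement $p_o$ with the proportion of agreement $p_e$ expected by chance from the raters' marginal totals:

$$
\kappa = \frac{p_o - p_e}{1 - p_e}
$$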
Enter Confusion Matrix (Counts)
|              | Rater 2: Yes | Rater 2: No |
| ------------ | ------------ | ----------- |
| Rater 1: Yes |              |             |
| Rater 1: No  |              |             |
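As a minimal sketch of the computation behind the calculator, the snippet below derives kappa from the four cell counts, assuming they are labelled a (Yes/Yes), b (Yes/No), c (No/Yes), and d (No/No) reading the table row by row. The function name and labels are illustrative, not part of the tool itself.

```python
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Cohen's kappa for a 2x2 confusion matrix of counts.

    a: both raters said Yes       b: Rater 1 Yes, Rater 2 No
    c: Rater 1 No, Rater 2 Yes    d: both raters said No
    """
    n = a + b + c + d
    if n == 0:
        raise ValueError("Confusion matrix is empty")

    p_o = (a + d) / n                         # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)     # chance both say Yes
    p_no = ((c + d) / n) * ((b + d) / n)      # chance both say No
    p_e = p_yes + p_no                        # expected (chance) agreement

    if p_e == 1:                              # degenerate marginals; kappa is undefined,
        return 1.0 if p_o == 1 else 0.0       # return a conventional value
    return (p_o - p_e) / (1 - p_e)            # kappa = (p_o - p_e) / (1 - p_e)
```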
Interpretation
- ≤ 0: No agreement
- 0.01 - 0.20: None to slight
- 0.21 - 0.40: Fair
- 0.41 - 0.60: Moderate
- 0.61 - 0.80: Substantial
- 0.81 - 1.00: Almost perfect agreement
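For completeness, a small helper (hypothetical, not part of the calculator) that maps a kappa value to the bands above, followed by a worked example:

```python
def interpret_kappa(kappa: float) -> str:
    """Map a kappa value to the agreement bands listed above."""
    if kappa <= 0:
        return "No agreement"
    if kappa <= 0.20:
        return "None to slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost perfect"

# Example: the counts a=45, b=5, c=10, d=40 give kappa = 0.70 (see the sketch above).
print(interpret_kappa(0.70))   # -> Substantial
```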