
Cohen's kappa coefficient

Description

Cohen's kappa coefficient is a statistical measure of inter-rater agreement (also called inter-annotator agreement) for qualitative (categorical) items. It is generally considered more robust than a simple percent-agreement calculation, since κ takes into account the agreement that would be expected to occur by chance. Cohen's kappa measures the agreement between two raters who each classify N items into C mutually exclusive categories.

Related formulas
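
κ = ( Pr(a) - Pr(e) ) / ( 1 - Pr(e) )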

Variables

κ: Cohen's kappa coefficient (dimensionless)
Pr(a): The relative observed agreement among raters (dimensionless)
Pr(e): The hypothetical probability of chance agreement (dimensionless)
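
The coefficient can be computed directly from two raters' label sequences using the formula above. The following is a minimal Python sketch, assuming plain lists of categorical labels; the function name cohens_kappa and the example data are illustrative, not part of any particular library.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's kappa for two raters labelling the same N items."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must label the same number of items")
    n = len(rater_a)

    # Pr(a): relative observed agreement among the raters
    pr_a = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Pr(e): hypothetical probability of chance agreement, estimated from
    # each rater's marginal category frequencies
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    categories = set(counts_a) | set(counts_b)
    pr_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)

    return (pr_a - pr_e) / (1 - pr_e)

# Example: two raters classifying 10 items as "yes" or "no"
a = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
print(cohens_kappa(a, b))  # observed agreement 0.8, chance agreement 0.52, kappa ≈ 0.583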