Rater Agreement
===

Cohen's kappa, Fleiss' kappa, and Krippendorff's alpha are coefficients that measure the agreement between raters on a nominal or ordinal scale (Cohen, 1960; Cohen, 1968; Krippendorff, 1970; Fleiss, 1971). Cohen's kappa is limited to measuring the agreement between two raters, whereas Fleiss' kappa and Krippendorff's alpha measure the agreement between two or more raters.

Here and in the sections below, "rater" can refer to different judges, tests, or other forms of rating.
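
The same coefficients can also be computed directly in R, for instance with the `irr` package. The sketch below is only an illustration with made-up ratings; it is not necessarily the implementation used by this analysis, and the reported values may additionally depend on the selected options (e.g., the confidence interval method).

```r
# Illustration only: agreement coefficients for 10 hypothetical subjects
# rated by 3 raters into the categories 1-3.
library(irr)

ratings <- data.frame(
  rater1 = c(1, 2, 2, 3, 1, 1, 2, 3, 3, 1),
  rater2 = c(1, 2, 3, 3, 1, 2, 2, 3, 3, 1),
  rater3 = c(1, 2, 2, 3, 2, 1, 2, 3, 1, 1)
)

# Cohen's kappa is defined for one pair of raters at a time.
kappa2(ratings[, c("rater1", "rater2")])

# Fleiss' kappa and Krippendorff's alpha accept two or more raters.
kappam.fleiss(ratings)
kripp.alpha(t(as.matrix(ratings)), method = "nominal")
```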

### Input

### Output

#### Cohen's kappa (table)

This table shows Cohen's kappa for all possible pairs of raters and the average kappa. If selected, a confidence interval for the kappa estimate will be reported.
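
For reference, the unweighted form of Cohen's kappa can be written as

$$\kappa = \frac{p_o - p_e}{1 - p_e},$$

where $p_o$ is the observed proportion of agreement between the two raters and $p_e$ is the proportion of agreement expected by chance. The weighted variant (Cohen, 1968) extends this by giving partial credit to near-agreement on ordinal scales.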

Landis and Koch (1977) suggest the following guidelines for the interpretation of Cohen's kappa:

- Less than 0: poor agreement
- Between 0.01 and 0.20: slight agreement
- Between 0.21 and 0.40: fair agreement
- Between 0.41 and 0.60: moderate agreement
- Between 0.61 and 0.80: substantial agreement
- Between 0.81 and 1: almost perfect agreement

#### Fleiss' kappa (table)

This table shows Fleiss' kappa for the overall agreement and per rating category. If selected, a confidence interval for the kappa estimate will be reported.
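
For reference, Fleiss' kappa has the same general form,

$$\kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e},$$

where $\bar{P}$ is the mean observed pairwise agreement across subjects and $\bar{P}_e$ is the agreement expected by chance given the overall category proportions; the per-category values in the table summarize agreement for each rating category separately.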

Landis and Koch (1977) suggest the following guidelines for the interpretation of Fleiss' kappa:

- Less than 0: poor agreement
- Between 0.01 and 0.20: slight agreement
- Between 0.21 and 0.40: fair agreement
- Between 0.41 and 0.60: moderate agreement
- Between 0.61 and 0.80: substantial agreement
- Between 0.81 and 1: almost perfect agreement

#### Krippendorff's alpha (table)

This table shows Krippendorff's alpha for the overall agreement. If selected, a confidence interval for the alpha estimate will be reported.
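
For reference, Krippendorff's alpha is defined as

$$\alpha = 1 - \frac{D_o}{D_e},$$

where $D_o$ is the observed disagreement among raters and $D_e$ is the disagreement expected by chance. The disagreement (distance) function depends on the measurement level of the ratings, for example nominal or ordinal.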

Krippendorff (2004) suggests the following guidelines for the interpretation of Krippendorff's alpha:

- Less than 0.66: unacceptable agreement
- Between 0.66 and 0.80: tentatively acceptable agreement
- Between 0.81 and 0.99: acceptable agreement
- 1: perfect agreement

### References

### R Packages


