kappa2: Cohen's Kappa and weighted Kappa for two raters

Description

Calculates Cohen's Kappa and weighted Kappa as an index of interrater agreement between 2 raters on categorical (or ordinal) data. Custom weights for the various degrees of disagreement can be specified.

Usage

kappa2(ratings, weight = c("unweighted", "equal", "squared"), sort.levels = FALSE)

Arguments

ratings

n*2 matrix or data frame; n subjects, 2 raters.

weight

either a character string specifying one of the predefined sets of weights or a numeric vector of custom weights (see Details).

sort.levels

a logical indicating whether factor levels should be (re-)sorted during the calculation.

Details

Missing data are omitted in a listwise way.

During computation, ratings are converted to factors, and the rating categories are ordered according to the factor levels. When ratings are numeric, factor levels are sorted automatically; otherwise, levels are sorted only when the function is called with sort.levels=TRUE.

kappa2 allows for calculating weighted Kappa coefficients. Besides '"unweighted"' (the default), the predefined sets of weights are '"equal"' (disagreement weights increase in equal steps with the distance between categories) and '"squared"' (disagreements are weighted according to their squared distance from perfect agreement). Under certain conditions, the weighted Kappa coefficient with '"squared"' weights equals the product-moment correlation. Custom weights can be specified by supplying a numeric vector of weights, ordered from perfect agreement to worst disagreement; the length of this vector must equal the number of rating categories.
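
For orientation, the following minimal sketch (an illustration of the underlying formula, not the irr source code) computes a weighted Kappa from the confusion matrix of two raters, taking the disagreement weight of each cell from the supplied vector according to the distance between the rating categories:

library(irr)

# Minimal sketch, assuming complete cases and a disagreement-weight vector w
# ordered from perfect agreement (w[1] = 0) to worst disagreement
weighted_kappa <- function(r1, r2, w) {
  lev <- sort(unique(c(r1, r2)))
  po  <- table(factor(r1, levels = lev),
               factor(r2, levels = lev)) / length(r1)  # observed proportions
  pe  <- outer(rowSums(po), colSums(po))               # expected proportions
  W   <- outer(seq_along(lev), seq_along(lev),
               function(i, j) w[abs(i - j) + 1])       # cell weights by category distance
  1 - sum(W * po) / sum(W * pe)
}

data(anxiety)
weighted_kappa(anxiety[, 1], anxiety[, 2], (0:5)^2)
# should agree with kappa2(anxiety[, 1:2], "squared")$value (0.297, see the output below)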

Value

A list with class '"irrlist"' containing the following components:

$method

a character string describing the method and the weights applied for the computation of weighted Kappa.

$subjects

the number of subjects examined.

$raters

the number of raters (=2).

$irr.name

a character string specifying the name of the coefficient.

$value

the value of Kappa.

$stat.name

a character string specifying the name of the corresponding test statistic.

$statistic

the value of the test statistic.

$p.value

the p-value for the test.
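
Since the result is a plain list (of class '"irrlist"'), the components can be accessed by name, for example:

library(irr)
data(anxiety)

res <- kappa2(anxiety[, 1:2], "squared")
res$value      # the Kappa coefficient
res$statistic  # the z test statistic
res$p.value    # the p-value of the test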

Author(s)

Matthias Gamer

References

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37-46.

Cohen, J. (1968). Weighted kappa: Nominal scale agreement with provision for scaled disagreement or partial credit. Psychological Bulletin, 70, 213-220.

Fleiss, J.L., Cohen, J., & Everitt, B.S. (1969). Large sample standard errors of kappa and weighted kappa. Psychological Bulletin, 72, 323-327.

See Also

cor, kappam.fleiss, kappam.light

Examples

data(anxiety)
kappa2(anxiety[,1:2], "squared") # predefined set of squared weights
kappa2(anxiety[,1:2], (0:5)^2)   # same result with a custom set of squared weights

# own weights increasing gradually with larger distance from perfect agreement
kappa2(anxiety[,1:2], c(0,1,2,4,7,11))

data(diagnoses)
# Unweighted Kappa for categorical data without a logical order
kappa2(diagnoses[,2:3])

Example output

Loading required package: lpSolve

 Cohen's Kappa for 2 Raters (Weights: squared)

 Subjects = 20 
   Raters = 2 
    Kappa = 0.297 

        z = 1.34 
  p-value = 0.18 

 Cohen's Kappa for 2 Raters (Weights: 0,1,4,9,16,25)

 Subjects = 20 
   Raters = 2 
    Kappa = 0.297 

        z = 1.34 
  p-value = 0.18 

 Cohen's Kappa for 2 Raters (Weights: 0,1,2,4,7,11)

 Subjects = 20 
   Raters = 2 
    Kappa = 0.263 

        z = 1.46 
  p-value = 0.144 

 Cohen's Kappa for 2 Raters (Weights: unweighted)

 Subjects = 30 
   Raters = 2 
    Kappa = 0.631 

        z = 7.56 
  p-value = 4.04e-14 
