ckappa: Cohen's Kappa for 2 raters

View source: R/ckappa.R


Cohen's Kappa for 2 raters

Description

Computes Cohen's Kappa, a measure of agreement between 2 raters. The diagnosis (the object of the rating) may have k possible values.

Usage

ckappa(r)

Arguments

r

an n*2 matrix or data frame (n subjects, 2 raters; one column per rater)

Details

The function handles the case where the two raters do not use exactly the same set of rating categories (some software packages raise an error in this situation). Missing values are omitted.
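As an illustration, here is a minimal sketch in base R (an assumed helper, not the actual body of ckappa) that drops incomplete pairs, pools the categories used by either rater before cross-tabulating, and applies the usual formula kappa = (po - pe) / (1 - pe), where po is the observed proportion of agreement and pe the proportion expected by chance:

## Minimal sketch, base R only; kappa2_sketch is a hypothetical helper,
## not the function exported by the psy package
kappa2_sketch <- function(r) {
  r <- r[complete.cases(r), ]                  # missing values are omitted
  lev <- union(unique(r[, 1]), unique(r[, 2])) # pool categories seen by either rater
  f1 <- factor(r[, 1], levels = lev)
  f2 <- factor(r[, 2], levels = lev)
  tab <- table(f1, f2)                         # first rater in rows, second in columns
  p <- tab / sum(tab)
  po <- sum(diag(p))                           # observed agreement
  pe <- sum(rowSums(p) * colSums(p))           # chance-expected agreement
  list(table = tab, kappa = (po - pe) / (1 - pe))
}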

Value

A list with:

$table

the k*k contingency table of raw data (first rater in rows, second rater in columns)

$kappa

Cohen's Kappa
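For example, with the expsy data used in the Examples below, the two components are accessed as:

data(expsy)
res <- ckappa(expsy[, c(12, 14)])
res$table   # cross-classification of the two raters
res$kappa   # the kappa estimate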

Author(s)

Bruno Falissard

References

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37-46.

Examples

data(expsy)
## Cohen's kappa for binary diagnosis
ckappa(expsy[,c(12,14)])

## to obtain a 95% confidence interval:
#library(boot)
#ckappa.boot <- function(data,x) {ckappa(data[x,])[[2]]}
#res <- boot(expsy[,c(12,14)],ckappa.boot,1000)
## two-sided bootstrapped confidence interval of kappa
#quantile(res$t,c(0.025,0.975))
## adjusted bootstrap percentile (BCa) confidence interval (better)
#boot.ci(res,type="bca")
## Cohen's kappa for non-binary diagnosis
#ckappa(expsy[,c(11,13)])
