| immer_agree2 | R Documentation | 
Some agreement statistics for two raters, including raw agreement, Scott's Pi, Cohen's Kappa, Gwet's AC1 and Aickin's Alpha (see Gwet, 2010).
immer_agree2(y, w=rep(1, nrow(y)), symmetrize=FALSE, tol=c(0, 1))
## S3 method for class 'immer_agree2'
summary(object, digits=3,...)
| y | Data frame with responses for two raters | 
| w | Optional vector of frequency weights | 
| symmetrize | Logical indicating whether contingency table should be symmetrized | 
| tol | Vector of integers indicating tolerance for raw agreement | 
| object | Object of class immer_agree2 | 
| digits | Number of digits after decimal for rounding | 
| ... | Further arguments to be passed | 
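The tol argument controls raw agreement within a tolerance: for tolerance t, the proportion of subjects whose two ratings differ by at most t categories. The lines below are an illustrative sketch with hypothetical data, not code taken from the package; the hand computation is only intended to match agree_raw in the unweighted, default case.

# hypothetical ratings of eight subjects by two raters on a 0-3 scale
y <- data.frame(rater1=c(0,1,2,3,2,1,0,3),
                rater2=c(0,1,3,3,1,1,0,2))
# raw agreement within tolerance t=0 and t=1, computed by hand
sapply(c(0,1), function(t) mean(abs(y$rater1 - y$rater2) <= t))
# the same raw agreement (plus chance-corrected statistics) via the package
res <- immer::immer_agree2(y=y, tol=c(0,1))
res$agree_raw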
List with entries
| agree_raw | Raw agreement | 
| agree_stats | Agreement statistics | 
| agree_table | Contingency table | 
| marg | Marginal frequencies | 
| Pe | Expected chance agreement probabilities | 
| PH | Probabilities for hard-to-classify subjects according to Aickin | 
| nobs | Number of observations | 
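The chance-corrected coefficients in agree_stats follow the usual pattern: observed agreement minus chance agreement, divided by one minus chance agreement. As a purely illustrative check, Cohen's Kappa can be recomputed from the returned contingency table; the helper below is a sketch, not the package's internal code, and it assumes agree_table is a square table of (weighted) counts.

kappa_from_table <- function(tab){
  tab <- tab / sum(tab)                    # joint proportions
  po <- sum(diag(tab))                     # observed agreement
  pe <- sum(rowSums(tab) * colSums(tab))   # Cohen's chance agreement
  (po - pe) / (1 - pe)
}
# e.g. kappa_from_table(res$agree_table) after running Example 1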
Gwet, K. L. (2010). Handbook of inter-rater reliability. Gaithersburg: Advanced Analytics.
For more inter-rater agreement statistics see the R packages agRee, Agreement, agrmt, irr, obs.agree, rel.
#############################################################################
# EXAMPLE 1: Dataset in Schuster & Smith (2006)
#############################################################################
data(data.immer08)
dat <- data.immer08
y <- dat[,1:2]   # responses of the two raters
w <- dat[,3]     # frequency weights
# agreement statistics
res <- immer::immer_agree2( y=y, w=w )
summary(res)
# extract some output values
res$agree_stats
res$agree_raw
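# Possible follow-up (sketch, not part of the original example): inspect
# further output components and rerun with a symmetrized contingency table
res$agree_table   # (weighted) contingency table of the two raters
res$marg          # marginal frequencies
res$Pe            # expected chance agreement probabilities
res2 <- immer::immer_agree2( y=y, w=w, symmetrize=TRUE )
summary(res2)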