kappam_vanbelle (R Documentation)
Description

This function extends Cohen's and Fleiss' kappa, measures of interrater agreement, to the agreement between two groups of raters, taking the heterogeneity within each group into account.
Usage

kappam_vanbelle(
ratings,
refIdx,
ratingScale = NULL,
weights = c("unweighted", "linear", "quadratic"),
conf.level = 0.95
)
Arguments

ratings: matrix of subjects x raters covering both groups of raters.

refIdx: numeric. Indices of the raters that constitute the reference group. Can also be all negative, to define the reference group by exclusion (see the sketch after the Details section).

ratingScale: character vector of the levels of the rating scale, or NULL (the default).

weights: optional weighting scheme, one of "unweighted" (the default), "linear", or "quadratic".

conf.level: confidence level for interval estimation.
Details

Data need to be stored with raters in columns.
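The following is a minimal sketch of this layout, not taken from the package's own examples: the data are purely hypothetical (6 subjects in rows, 5 raters in columns, a three-level rating scale), and it assumes the package providing kappam_vanbelle() is attached. It also illustrates passing refIdx as negative indices.

# hypothetical ratings: 6 subjects (rows) x 5 raters (columns)
set.seed(1)
ratings <- matrix(
  sample(c("low", "mid", "high"), 6 * 5, replace = TRUE),
  nrow = 6, ncol = 5,
  dimnames = list(paste0("subj", 1:6), paste0("rater", 1:5))
)

# raters 1 and 2 form the reference group, given by their column indices
kappam_vanbelle(ratings, refIdx = 1:2, ratingScale = c("low", "mid", "high"))

# equivalently, define the reference group by excluding raters 3 to 5
kappam_vanbelle(ratings, refIdx = -(3:5), ratingScale = c("low", "mid", "high"))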
Value

list. Kappa agreement between the two groups of raters.
References

Vanbelle, S., Albert, A. (2009). Agreement between Two Independent Groups of Raters. Psychometrika 74, 477–491. doi:10.1007/s11336-009-9116-1
Examples

# compare student ratings with the ratings of 11 experts
kappam_vanbelle(SC_test, refIdx = 40:50)
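The optional arguments documented above can also be set explicitly; the following variants reuse the call from the example and are illustrative only.

# quadratic weights instead of the unweighted default
kappam_vanbelle(SC_test, refIdx = 40:50, weights = "quadratic")

# a 90% confidence interval instead of the default 95%
kappam_vanbelle(SC_test, refIdx = 40:50, conf.level = 0.90)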