kappam_vanbelle: Agreement between two groups of raters

Description

This function extends Cohen's and Fleiss' kappa to measure interrater agreement between two groups of raters, taking into account the heterogeneity within each group.

Usage

kappam_vanbelle(
  ratings,
  refIdx,
  ratingScale = NULL,
  weights = c("unweighted", "linear", "quadratic"),
  conf.level = 0.95
)

Arguments

ratings

matrix of ratings (subjects in rows, raters in columns), containing the raters of both groups

refIdx

numeric. Indices of the raters (columns) that constitute the reference group. Can also be all negative to define the reference group by exclusion.

ratingScale

character vector of the levels of the rating scale, or NULL.

weights

optional weighting scheme: "unweighted" (default), "linear", or "quadratic"

conf.level

confidence level for interval estimation

Details

Ratings need to be stored with subjects in rows and raters in columns.
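
As a minimal sketch of the expected layout (the data values below are made up for illustration, and it is assumed that the rating levels are derived from the data when ratingScale is NULL):

# toy ratings: 5 subjects in rows, 4 raters in columns;
# columns 3 and 4 serve as the reference group
ratings <- matrix(
  c("a", "b", "a", "a",
    "b", "b", "b", "a",
    "a", "a", "a", "a",
    "b", "a", "b", "b",
    "a", "a", "b", "a"),
  nrow = 5, byrow = TRUE,
  dimnames = list(paste0("subj", 1:5), paste0("rater", 1:4))
)
kappam_vanbelle(ratings, refIdx = 3:4)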

Value

list. Kappa coefficient for the agreement between the two groups of raters, together with the interval estimate at the requested confidence level.
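
A quick way to inspect the returned components is str(); this is a generic sketch, since the element names are not spelled out on this page:

res <- kappam_vanbelle(SC_test, refIdx = 40:50)
str(res)  # show the structure of the returned list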

References

Vanbelle, S., Albert, A. Agreement between Two Independent Groups of Raters. Psychometrika 74, 477–491 (2009). doi:10.1007/s11336-009-9116-1

Examples

# compare student ratings with ratings of 11 experts
kappam_vanbelle(SC_test, refIdx = 40:50)
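
# sketch of further calls (assumes the student raters occupy the
# remaining columns, here 1 to 39, so the all-negative index selects
# the same 11 experts by exclusion):
kappam_vanbelle(SC_test, refIdx = -(1:39), weights = "quadratic")

# use a 90% confidence level for interval estimation
kappam_vanbelle(SC_test, refIdx = 40:50, conf.level = 0.90)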

