# CohenK: Cohen's Kappa

In BavoDC/AGREL: a package for calculating agreement/reliability indices

## Description

A slightly adjusted version of the `kappa2` function for calculating Cohen's Kappa (Cohen, 1960).
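For reference, Cohen's Kappa corrects the observed agreement for the agreement expected by chance (the quantities below correspond to the `Po` and `Pe` components listed under Value):

```latex
\kappa = \frac{P_o - P_e}{1 - P_e}
```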

## Usage

```r
CohenK(ratings, weight = c("unweighted", "equal", "squared"), sort.levels = FALSE)
```

## Arguments

- `ratings`: An N x 2 dataframe or matrix, with N the number of observations. The columns contain the ratings of the two raters.
- `weight`: The weighting scheme used to calculate the reliability. The default is `"unweighted"`; `"squared"` can be used to calculate Cohen's Weighted Kappa (Cohen, 1968).
- `sort.levels`: Sort the levels if they are numeric. Default is `FALSE`.
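As a rough illustration of how the weighting options differ, the disagreement weights for `k` ordered categories can be sketched as follows. This is a Python sketch of the standard weighting schemes for kappa, not the package's internal code; the scaling by `k - 1` is an assumption.

```python
import numpy as np

k = 4  # hypothetical number of rating categories
idx = np.arange(k)
diff = np.abs(np.subtract.outer(idx, idx))  # |i - j| for every category pair

# "unweighted": every disagreement is penalized equally.
unweighted = (diff > 0).astype(float)

# "equal": penalty grows linearly with the distance between categories
# (assumed scaling so the maximum disagreement has weight 1).
equal = diff / (k - 1)

# "squared": quadratic penalty, as in Cohen's Weighted Kappa (Cohen, 1968).
squared = (diff / (k - 1)) ** 2
```

With squared weights, disagreeing by one category out of four costs only (1/3)^2 of a maximal disagreement, which is why weighted kappa is preferred for ordinal ratings.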

## Value

- `method`: Method that was used to calculate the reliability and the weights used.
- `subjects`: Number of subjects in the dataframe.
- `nraters`: Number of raters.
- `irr.name`: Type of reliability measure.
- `value`: Value of Cohen's Kappa.
- `StdErr`: The standard error of the estimated Kappa value.
- `stat.name`: The name of the corresponding test statistic.
- `statistic`: The value of the test statistic.
- `p.value`: The p-value for the test.
- `Po`: Overall proportion of agreement.
- `Pe`: Proportion of agreement expected by chance.
- `ratings`: The original dataframe with the ratings.
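To make `Po`, `Pe`, and the kappa value concrete, unweighted Cohen's Kappa can be computed from scratch on toy data. This Python sketch with hypothetical ratings mirrors the formula, not the package's implementation:

```python
import numpy as np

# Hypothetical ratings of 10 subjects by two raters (categories 1-3).
r1 = np.array([1, 2, 3, 1, 2, 2, 3, 1, 2, 3])
r2 = np.array([1, 2, 3, 1, 2, 3, 3, 1, 2, 2])

cats = np.union1d(r1, r2)
n = len(r1)

# Joint proportion table: p[i, j] is the fraction of subjects placed in
# category i by rater 1 and category j by rater 2.
p = np.zeros((len(cats), len(cats)))
for a, b in zip(r1, r2):
    p[np.searchsorted(cats, a), np.searchsorted(cats, b)] += 1 / n

Po = np.trace(p)                    # observed agreement (diagonal mass)
Pe = p.sum(axis=1) @ p.sum(axis=0)  # chance agreement from the marginals
kappa = (Po - Pe) / (1 - Pe)
```

With these ratings, `Po` = 0.8 and `Pe` = 0.34, giving a kappa of about 0.70: the raters agree on 80% of subjects, but 34% agreement was expected by chance alone.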

## References

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37-46.

Cohen, J. (1968). Weighted kappa: Nominal scale agreement provision for scaled disagreement or partial credit. Psychological Bulletin, 70(4), 213-220.

## Examples

```r
# Load data
data(PsychMorbid)
Df = PsychMorbid[, 1:2]

# Unweighted kappa
CohenK(Df)

# Weighted kappa
data(Agreement_deVet)
CohenK(Agreement_deVet[, 2:3], weight = "squared")
```

BavoDC/AGREL documentation built on May 6, 2019, 7:22 a.m.