ac_scale | Interpretation Scale for Agreement Coefficients |
ac_weights | Weight-generating functions |
bp_3_raw | Brennan-Prediger (BP) agreement coefficient among multiple... |
calc_ac | Calculate Agreement Coefficients |
calc_ac_dist | Calculate Agreement Coefficients (AC) from a distribution |
calc_ac_raw | Calculate Agreement Coefficients (AC) from raw data set |
calc_ac_table | Calculate Agreement Coefficients (AC) from a contingency...
calc_agree_mat | Calculate an agreement matrix |
calc_classif_mat | Calculate a rater-category matrix
calc_p_val | Calculate P-value from T-statistic |
calc_pw_kappa | Calculate pairwise agreement statistics for multiple raters |
calc_test_txt | Generate Hypothesis Test Text |
calc_t_stat | Calculate T-statistic |
conger_3_raw | Conger's generalized kappa coefficient among multiple raters... |
diagnosis | Fleiss (1971): Diagnoses on 30 subjects by 6 raters per... |
finn_1970 | Finn (1970): 5 raters classified 4 subjects into 3 categories |
fleiss_3_raw | Fleiss' generalized kappa coefficient among multiple raters... |
get_ac_weights | Get Agreement Coefficient Weights |
gwet_3_raw | Gwet's AC1/AC2 agreement coefficient among multiple raters... |
interpret_kappa | Interpretation of kappa statistic |
k_alpha | Calculate Fleiss' kappa and Krippendorff's alpha |
krippen_2_raw | Krippendorff's Alpha |
krippen_3_raw | Krippendorff's alpha coefficient among multiple raters (2, 3,... |
lagree-package | lagree: Calculate various interrater agreement coefficients
neurologists | Landis and Koch (1977): Diagnostic Classification of Multiple... |
pa_3_raw | Percent agreement coefficient among multiple raters (2, 3, +) |
pipe | Pipe operator |
radiologist | Classification by two radiologists of 85 xeromammograms as...
rvary2 | rvary2 from Stata - simple tables for categorical variables
two_raters | Generic ratings by two raters |