kappam_vanbelle {kappaGold}	R Documentation
Agreement between two groups of raters
Description
This function extends Cohen's and Fleiss' kappa to measure agreement between two groups of raters, taking into account the heterogeneity within each group.
Usage
kappam_vanbelle(
  ratingsGr1,
  ratingsGr2,
  ratingScale = NULL,
  weights = c("unweighted", "linear", "quadratic"),
  conf.level = 0.95
)
Arguments
ratingsGr1: matrix of subjects x raters for the 1st group of raters
ratingsGr2: matrix of subjects x raters for the 2nd group of raters
ratingScale: character vector of the levels for the rating, or NULL (the default)
weights: optional weighting scheme: "unweighted" (the default), "linear" or "quadratic"
conf.level: confidence level for interval estimation
Details
Data need to be stored with subjects in rows and raters in columns, as in the sketch below.
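A minimal sketch of this layout, using the diagnoses example data shipped with kappaGold (each row one subject, each column one rater); the column split into two groups is illustrative:

library(kappaGold)

# rows are subjects, columns are raters
grp1 <- diagnoses[, 1:2]   # ratings from the 1st group of raters
grp2 <- diagnoses[, 3:6]   # ratings from the 2nd group of raters
kappam_vanbelle(grp1, grp2)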
Value
list. kappa agreement between two groups of raters
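The component names of the returned list are not spelled out here, so a hedged way to explore the result is to inspect its structure:

res <- kappam_vanbelle(diagnoses[, 1:2], diagnoses[, 3:6])
str(res)   # list the returned components (a point estimate and a confidence interval are expected)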
References
Vanbelle, S., Albert, A. Agreement between Two Independent Groups of Raters. Psychometrika 74, 477–491 (2009). doi:10.1007/s11336-009-9116-1
Examples
# compare rater1-rater2 vs rater3-rater6 from the diagnoses data
# (there is no systematic difference between both groups
#  as the raters are randomly selected per subject)
kappam_vanbelle(diagnoses[,1:2], diagnoses[,3:6])
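
# a further illustrative call (not part of the original example) showing the
# optional arguments; the weighted variants assume the rating categories are
# meaningfully ordered, so the nominal diagnoses are kept unweighted here
kappam_vanbelle(diagnoses[,1:2], diagnoses[,3:6],
                weights = "unweighted", conf.level = 0.90)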