ckappa {psy}    R Documentation
Description

Computes Cohen's Kappa for agreement in the case of 2 raters. The diagnosis (the object of the rating) may have k possible values.
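As an illustration of the quantity being estimated (a minimal sketch on invented ratings, not the package's internal code), kappa is the observed agreement corrected for the agreement expected by chance:

## invented ratings for two raters (not data from the psy package)
rater1 <- c(1, 1, 2, 2, 3, 3, 1, 2)
rater2 <- c(1, 1, 2, 3, 3, 3, 1, 1)
tab <- table(rater1, rater2)                         # k*k cross-classification
po <- sum(diag(tab)) / sum(tab)                      # observed agreement
pe <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # agreement expected by chance
(po - pe) / (1 - pe)                                 # Cohen's Kappa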
Usage

ckappa(r)
Arguments

r: n*2 matrix or data frame (n subjects, 2 raters)
Details

The function deals with the case where the two raters do not have exactly the same scope of rating (some software packages raise an error in this situation). Missing values are omitted.
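For example (a hedged sketch on invented ratings, assuming the psy package is installed), one rater may never use a category that the other does, and incomplete pairs are dropped rather than causing an error:

library(psy)
## invented ratings: rater B never uses category 3, and one pair is incomplete
ratings <- data.frame(A = c(1, 2, 3, 2, 1, 3, NA, 2),
                      B = c(1, 2, 2, 2, 1, 2, 1, 2))
ckappa(ratings)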
Value

A list with:

$table: the k*k table of raw data (first rater in rows, second rater in columns)
$kappa: Cohen's Kappa
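A brief sketch of how the returned components can be accessed (the column choice simply mirrors the examples below):

library(psy)
data(expsy)
res <- ckappa(expsy[, c(12, 14)])
res$table   # cross-classification of the two raters
res$kappa   # the kappa coefficient alone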
Author(s)

Bruno Falissard
References

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37-46.
Examples

data(expsy)
## Cohen's kappa for binary diagnosis
ckappa(expsy[,c(12,14)])
## to obtain a 95% confidence interval:
#library(boot)
#ckappa.boot <- function(data,x) {ckappa(data[x,])[[2]]}
#res <- boot(expsy[,c(12,14)],ckappa.boot,1000)
## two-sided bootstrapped confidence interval of kappa
#quantile(res$t,c(0.025,0.975))
## adjusted bootstrap percentile (BCa) confidence interval (better)
#boot.ci(res,type="bca")
## Cohen's kappa for non-binary diagnosis
#ckappa(expsy[,c(11,13)])