...many thanks for all the answers and clarity!!!
regards, christian
See also section 6.4 of
http://www.psych.upenn.edu/~baron/rpsych/rpsych.html
which also points to a few packages that have kappa code in them.
--
Jonathan Baron, Professor of Psychology, University of Pennsylvania
Home page: http://www.sas.upenn.edu/~baron
Hi,
I'm a little bit confused about Cohen's Kappa, and I have been looking into the
Kappa function code. Is the simple formula really wrong?
kappa = (agreement - chance) / (1 - chance)
many thanks
christian
###
true-negative: 7445
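The formula in question, written with explicit parentheses so the operator precedence is unambiguous, can be sketched as a small function (Python here purely for illustration; the helper name `cohen_kappa` is my own, not from the thread):

```python
def cohen_kappa(observed_agreement: float, chance_agreement: float) -> float:
    """Cohen's kappa: the excess of observed agreement over chance,
    scaled by the maximum possible excess over chance."""
    return (observed_agreement - chance_agreement) / (1.0 - chance_agreement)
```

Note the parentheses around the numerator: without them, `agreement - chance / (1 - chance)` parses as `agreement - (chance / (1 - chance))`, which is a different quantity.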
Shouldn't that be
kappa = rater1 - rater2 / (1-chance)
?
Andy
--
Andrew J Perrin - andrew_perrin (at) unc.edu - http://perrin.socsci.unc.edu
Assistant Professor of Sociology; Book Review Editor, _Social Forces_
University of North Carolina, Chapel Hill
Hi,
Chance is not .5 in your data, it's a function of the expected values
for presence and absence:
(((7792*10855)/11974) + ((4182*1119)/11974))/11974
[1] 0.6225686
(.6862368-.6225686)/(1-.6225686)
[1] 0.1686881
Scot
On Thu, 22 Mar 2007, Christian Schulz wrote:
Hi,
I'm a little bit confused about Cohen's Kappa, and I have been looking into the
Kappa function code. Is the simple formula really wrong? [...]
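The arithmetic in the reply above can be cross-checked with a short script (Python here purely for illustration; the thread itself is about R). The marginal totals 7792/10855 and 4182/1119, the grand total 11974, and the observed agreement 0.6862368 are all taken from the messages above:

```python
n = 11974                  # total number of paired ratings
absence = (7792, 10855)    # marginal totals for "absence" (rater 1, rater 2)
presence = (4182, 1119)    # marginal totals for "presence" (rater 1, rater 2)

# Expected chance agreement: for each category, the product of the two
# marginal proportions, summed over categories.
p_chance = (absence[0] * absence[1] / n + presence[0] * presence[1] / n) / n

p_observed = 0.6862368     # observed agreement reported in the thread

kappa = (p_observed - p_chance) / (1 - p_chance)

print(round(p_chance, 7))  # 0.6225686, matching the thread
print(round(kappa, 6))     # 0.168688
```

So chance agreement here is far above .5, and kappa is correspondingly much smaller than the raw agreement would suggest.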
Cohen, J. (1960). A coefficient of agreement for nominal scales.
Educational and Psychological Measurement, 20, 37-46.
Cohen, J. (1968). Weighted kappa: Nominal scale agreement with provision
for scaled disagreement or partial credit. Psychological Bulletin, 70,
213-220.
Francisco
Dr.