
Calculating and Interpreting Cohen's Kappa in Excel 

Dr. Todd Grande
1.5M subscribers · 94K views

Published: 20 Aug 2024
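The video walks through computing Cohen's kappa from two raters' codes in Excel, tallying observed agreement (Po) and chance agreement (Pe) from each rater's marginal proportions. As a companion to the spreadsheet steps, here is a minimal Python sketch of the same arithmetic (the ratings below are made up for illustration):

```python
# Cohen's kappa for two raters, mirroring the COUNTIF-style Excel approach:
# observed agreement Po versus chance agreement Pe from the raters' marginals.
def cohens_kappa(rater1, rater2):
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed proportion of agreement
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement from each rater's marginal proportions
    pe = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

r1 = [0, 0, 1, 1]
r2 = [0, 1, 1, 1]
print(cohens_kappa(r1, r2))  # 0.5
```

Here Po = 0.75 and Pe = 0.5, so kappa = (0.75 − 0.5)/(1 − 0.5) = 0.5; the same three quantities are what the spreadsheet cells hold.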

Comments: 42
@DavidKirschnerPhD · 3 years ago
Super useful, helped me calculate interrater reliability for program assessment of student literature reviews. Thanks!
@medicine6932 · 7 years ago
THANK YOU SO MUCH!!!! On a time crunch and SPSS seems like it'd take too much time to even learn to use @_@. This video really helped.
@DrGrande · 7 years ago
You're welcome - thanks for watching.
@sofiaquijada7398 · 2 years ago
Thank you so much, this is exactly what I've been looking for.
@Muuip · 5 years ago
Great concise presentation, very useful! Much appreciated!
@ruudparklimy · 8 years ago
What if I have more than 3 values? E.g., not just 0 and 1, but 2, 3, 4, or even more?
@johnmarkgutierrez1599 · 5 years ago
@Lim Yufan just code it as 0, 1, and 2. Hope this helps.
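Following up on the multi-category question: the kappa formula extends unchanged to any number of categories. Lay the counts out as a k × k agreement table (rows = rater 1, columns = rater 2), take Po from the diagonal and Pe from the marginals. A sketch, with a made-up three-category table:

```python
# Cohen's kappa from a k x k agreement table; works for any number of
# categories, not just the 0/1 case demonstrated in the video.
def kappa_from_table(table):
    n = sum(sum(row) for row in table)
    k = len(table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    po = sum(table[i][i] for i in range(k)) / n  # diagonal cells = agreements
    pe = sum(row_tot[i] * col_tot[i] for i in range(k)) / n ** 2
    return (po - pe) / (1 - pe)

# Three categories (0, 1, 2), counts invented for illustration:
table = [[10, 2, 0],
         [1, 8, 1],
         [0, 3, 5]]
print(round(kappa_from_table(table), 3))  # 0.644
```

In Excel the same table is built with COUNTIFS over the two rating columns, one cell per (row category, column category) pair.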
@LukyDi · 8 years ago
Thank you so much for the well-explained video, it really helped me very much. You are an excellent teacher.
@emilyhughes1315 · 4 years ago
Super helpful and clear thank you!
@SPORTSCIENCEps · 3 years ago
Thank you for uploading it!
@zacrogers3975 · 8 years ago
Thanks Todd! This is great. @Ben van Buren - Cohen's Kappa is used in many academic articles but it did not originate there; it comes from Jacob Cohen's 1960 article "A coefficient of agreement for nominal scales." I'm using a more recent book by the same author, the citation is: Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2013). Applied multiple regression/correlation analysis for the behavioral sciences. Routledge.
@ringwormts115 · 8 years ago
Thanks for this very good video. The Excel functions make my life so much easier :-)
@greggelliott4570 · 9 years ago
Probably need a little more discussion of sensitivity and specificity, although I expect it's also addressed in some other videos and in the book.
@charlesdrehmer87 · 3 years ago
Thank you for your video. Could you explain how to handle ratings that are missing, where one rater recorded a score and the other did not?
@ceccapaglia · 4 years ago
Great! Really useful! Thank you
@alejandrabeghelli38 · 8 years ago
Thanks! You saved me a lot of time.
@derejebirhanu7098 · 5 years ago
You are a very important person. Thank you!!
@is4220 · 8 years ago
such a wonderful and helpful video! Thanks a lot!
@DrGrande · 7 years ago
I'm glad you found the video useful. Thanks for watching.
@houchj0372 · 2 years ago
Very clear, thank you.
@MarkVanderley · 9 years ago
I imagine this would be helpful in research pertaining to rating the acquisition of counseling skills in student counselors
@franciscocallebernal · 2 years ago
Hi! How do you calculate confidence intervals and standard error for the kappa values using Excel? Thank you for your very helpful video.
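Regarding the confidence-interval question above: one common large-sample approximation (not covered in the video; treat the formula as an assumption here) is SE ≈ sqrt(Po(1 − Po) / (n(1 − Pe)²)), with a 95% CI of kappa ± 1.96·SE. Once Po, Pe, and n are in spreadsheet cells, this is a one-cell formula; a Python sketch with illustrative numbers:

```python
import math

# Approximate standard error and 95% CI for Cohen's kappa
# (large-sample approximation; assumes Po, Pe, and the sample size n
# have already been computed, e.g. in spreadsheet cells).
def kappa_ci(po, pe, n, z=1.96):
    kappa = (po - pe) / (1 - pe)
    se = math.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
    return kappa, (kappa - z * se, kappa + z * se)

kappa, (lo, hi) = kappa_ci(po=0.8, pe=0.5, n=100)
print(kappa, lo, hi)  # kappa = 0.6, CI roughly (0.44, 0.76)
```

A CI that excludes 0 suggests agreement beyond chance; for small samples or sparse tables, a bootstrap over the rated items is a more defensible choice than this approximation.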
@GamingBoxChannel · 7 years ago
Wish my lecturer could explain like you.
@mrd8300 · 2 years ago
Excellent!
@soleilaugust · 8 years ago
Thank you, very helpful!
@sparkly5031 · 7 years ago
Very informative. However, what do you do when: a) Pe is 1 (so the denominator 1 − Pe is 0)? Assume the kappa is 1? b) you get a very low kappa when the raters agree on all but one of the ratings? Surely it should be higher? I have 2 raters, 20 subjects. If they agree on 19 and differ on 1, the kappa is nearly 0.
@ProfGarcia · 2 years ago
I have a very strange kappa result: I checked for a certain behavior in footage of animals, which I assessed twice. For 28 animals, I agreed 27 times that the behavior is present and disagreed only once (the behavior was present in the first assessment, but not in the second). My data is organized as the 2x2 matrix [0, 1; 0, 27], and that gives me a kappa value of zero, which I find very strange because I disagreed in only 1 of 28 assessments. How come these results are considered pure chance?
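The zero result in the comment above is arithmetically correct and illustrates a well-known quirk (the "kappa paradox"): when nearly every case lands in a single category, chance agreement Pe is already almost as high as Po, so kappa collapses toward zero despite 27/28 raw agreement. Checking the numbers, taking the matrix as written (the orientation of the single off-diagonal count does not change the result here):

```python
# The 2x2 table from the comment: one assessment per axis,
# categories (absent, present) in that order.
table = [[0, 1],
         [0, 27]]
n = 28
po = (table[0][0] + table[1][1]) / n            # 27/28 observed agreement
row = [sum(r) for r in table]                   # [1, 27]
col = [table[0][0] + table[1][0],
       table[0][1] + table[1][1]]               # [0, 28]
pe = (row[0] * col[0] + row[1] * col[1]) / n ** 2  # also 27/28
kappa = (po - pe) / (1 - pe)
print(po, pe, kappa)  # Po = Pe = 27/28, so kappa = 0.0
```

Because one assessment marked the behavior present in all 28 cases, two raters guessing from those marginals would agree 27/28 of the time by chance alone, so the observed agreement adds nothing. This also bears on the Pe = 1 question above: if both raters use only one category, 1 − Pe is zero and kappa is undefined (0/0), not 1. Reporting raw percent agreement alongside kappa is a common workaround for such skewed tables.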
@ashoklodhi1910 · 9 years ago
It's really helpful...
@antonioflores6148 · 3 years ago
Thank you!!!
@tyrk2926 · 4 years ago
Many thanks!
@Lyn-eg3id · 7 years ago
Suppose you have five categories from low to high. Since it is not dichotomous as it is here, do you still use the same approach?
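On the five-ordered-categories question above: unweighted kappa still works, but it treats a one-step disagreement the same as a four-step one. For ordered scales, weighted kappa is the usual choice; a sketch with linear weights (ratings assumed coded 0..k−1; data invented for illustration):

```python
# Linearly weighted Cohen's kappa for ordered categories coded 0..k-1.
# A 1-step disagreement is penalized less than a 4-step disagreement.
def weighted_kappa(r1, r2, k):
    n = len(r1)
    # Observed disagreement, weighted by distance |a - b|
    obs = sum(abs(a - b) for a, b in zip(r1, r2)) / n
    # Expected weighted disagreement from the raters' marginals
    p1 = [r1.count(c) / n for c in range(k)]
    p2 = [r2.count(c) / n for c in range(k)]
    exp = sum(p1[i] * p2[j] * abs(i - j) for i in range(k) for j in range(k))
    return 1 - obs / exp

r1 = [0, 1, 2, 3, 4, 4]
r2 = [0, 1, 2, 4, 3, 4]
print(weighted_kappa(r1, r2, 5))  # 0.8
```

Quadratic weights ((i − j)² in place of |i − j|) are also common; with quadratic weights, weighted kappa is closely related to the intraclass correlation.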
@ruzelasesoria5891 · 6 years ago
Excellent video!!! Thanks!
@DrGrande · 6 years ago
You're welcome!
@jorgemmmmteixeira · 3 years ago
Hi. What about calculating sample size for kappa? Do you think it is problematic to set the null hypothesis at K = 0.0? I believe this would be the same as what others call setting K1 = 0.0, when many state that K1 should be the minimum kappa expected. Thanks.
@is4220 · 8 years ago
Dear Dr. Grande, I have maybe a simple question. But the researcher and RA are people who give their responses to the survey, right?! So this number can be very high then. And I've got 5 criteria, like "satisfactory" etc. But I think I've understood how to do this. I should probably split the people giving responses into groups in order to come up with the coefficient.
@MrJsanabria · 3 years ago
Dr. Grande: what would you do if the kappa agreement turns out to be too low? Should both coders recode the material in order to match and increase the value? Or what do you suggest? Thanks in advance.
@RozgarOmar · 2 years ago
It depends on the matter. In my field, when the value turns out too low, a discussion is held to troubleshoot the disagreements, and then the material is re-coded.
@westbourne94 · 7 years ago
Can this test be used to measure reliability of categorical data?
@yourspecial-child3541 · 7 years ago
Hi Todd. Can you do Fleiss' Kappa in Excel as well?
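On the Fleiss' kappa question above: Fleiss' kappa generalizes agreement to any fixed number of raters per subject. It is computable in Excel from a subjects × categories table of counts; a Python sketch of the standard formula (statsmodels also provides `statsmodels.stats.inter_rater.fleiss_kappa` if a library is preferred):

```python
# Fleiss' kappa from an N x k table of counts: rows = subjects,
# columns = categories, each row summing to the number of raters n.
def fleiss_kappa(counts):
    N = len(counts)            # subjects
    n = sum(counts[0])         # raters per subject (assumed constant)
    k = len(counts[0])         # categories
    # Mean per-subject agreement P_i
    p_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N
    # Chance agreement from overall category proportions
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    pe = sum(p * p for p in p_j)
    return (p_bar - pe) / (1 - pe)

# Two raters, two categories; disagree on subject 1, agree on subject 2:
print(fleiss_kappa([[1, 1], [2, 0]]))  # -0.333...
```

Note that even with exactly two raters, Fleiss' kappa is not identical to Cohen's kappa: it pools both raters' marginals when estimating chance agreement (like Scott's pi) rather than keeping them separate.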
@alisalm5022 · 8 years ago
Should the researcher and the research assistant have the same experience, or not?
@zenmedia3782 · 4 years ago
Then you take Kendall out for a spin.
@Brolnox · 7 years ago
A bit slow-paced but otherwise an excellent video, thanks.