What if I have four levels? Thank you.
1 = definitely exclude
2 = not sure, but tendency to exclude
3 = not sure, but tendency to include
4 = definitely include
Hi, then you can still use Cohen's Kappa; since your four levels are ordered, the weighted Cohen's Kappa may be even more appropriate.
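For illustration, here is a minimal sketch of the weighted case in Python, assuming scikit-learn is available; the ratings below are invented example data, not from the video:

```python
# Weighted Cohen's Kappa for two raters on a 4-level ordinal scale
# (1 = definitely exclude ... 4 = definitely include).
# The ratings are invented example data.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 2, 3, 4, 4, 2, 1, 3, 4, 2]
rater_b = [1, 2, 2, 4, 3, 2, 1, 3, 4, 1]

# Unweighted kappa treats every disagreement as equally severe;
# quadratic weights penalize large ordinal disagreements more heavily.
plain = cohen_kappa_score(rater_a, rater_b)
weighted = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(plain, weighted)
```

For ordinal scales the weighted value is usually the more informative one, because disagreeing by one level (2 vs. 3) is less serious than disagreeing by three (1 vs. 4).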
Thank you for such an excellent yet simple explanation 👍
Thank you for great explanations. Found you searching for Fleiss Kappa and now keep watching other videos for the sheer pleasure.
Hello. Can you do an example where you use Cohen's Kappa but there are 3 raters? Thank you.
Hi, in that case you use Fleiss' Kappa; here is our video: ua-cam.com/video/ga-bamq7Qcs/v-deo.html Regards, Hannah
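For reference, a minimal pure-NumPy sketch of Fleiss' Kappa for three or more raters; the count table below is invented example data (statsmodels also provides a ready-made `fleiss_kappa` in `statsmodels.stats.inter_rater`):

```python
# Fleiss' Kappa computed from a subjects x categories count table.
# All data here are invented examples.
import numpy as np

def fleiss_kappa(table):
    """Fleiss' Kappa from a subjects x categories table of rating counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum(axis=1)[0]                # raters per subject (must be equal)
    p_j = table.sum(axis=0) / table.sum()   # overall category proportions
    # per-subject agreement, its mean, and the chance agreement
    P_i = (np.sum(table**2, axis=1) - n) / (n * (n - 1))
    P_bar, P_e = P_i.mean(), np.sum(p_j**2)
    return (P_bar - P_e) / (1 - P_e)

# 6 subjects rated by 3 raters into 2 categories (e.g. no / yes):
# each row counts how many raters chose each category for that subject.
table = [[3, 0], [0, 3], [2, 1], [1, 2], [3, 0], [0, 3]]
print(round(fleiss_kappa(table), 3))
```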
Can you use Cohen's Kappa when two different instruments are used to measure the same thing, rather than two individual raters?
Thank you!! Such an easy and good explanation!
Glad it was helpful!
Can this test be used for more than 2 categories rated by two raters, e.g. depressed, not depressed, unknown?
Thanks for the great video. Could you perhaps do a video where you use articles to calculate the Kappa?
Great suggestion! I will put it on my to-do list!
Can Cohen's Kappa be used for 5 raters?
Hi, excellent lesson. I would like to know how to proceed when one of the raters' responses is only yes or only no for all questions.
Excellent, many thanks!
Glad it was helpful and many thanks for your feedback! Regards Hannah
Such a perfect explanation! Thank you ❤
Glad it was helpful!
Thank you so muchhhhhh!!!!! A great explanation!
Many thanks!
No one can explain things as clearly as you, perfect!
Many, many thanks for your nice feedback! Regards Hannah
Very informative video, however, I have a question. When manually calculated Kappa was found to be 0.44 while using DATAtab it was 0.23, why this difference?
Many thanks for the clear explanation!
Thanks for your video. It's really helpful.
Glad it was helpful! Regards Hannah
Very good video.
Thank you!
Great explanation of the kappa, thank you very much!!!!!! :)
Glad you liked it!
Excellent, thank you very much
Glad it was helpful!
How is Cohen's Kappa different from the Cronbach's Alpha?
Beautifully explained!
Glad it was helpful!
2:15 inter-rater reliability
5:45 calculating
Randolph's Kappa = 0.84, what does it mean?
well explained
Many thanks : )
Great video! Thanks.
Glad it was helpful! Thanks for your nice feedback! Regards Hannah
Wonderful Explanation!!! Thanks
Excellent explanation . Tks a lot
Glad you liked it!
@datatab you are my life-saver for my statistics subject. Tks so much from the bottom of my heart.
Superb, ma'am
👍
Thank you!
You're welcome! : ) Regards Hannah
very informative
How did you get 0.72?
Perfect
Thanks!
Why did you make an example where the sums of the different variables are identical (two times 25)? It gives me a headache.
Oh sorry!
Thank you so much for your effort. I have 2 questions:
Is 0.081 a good Kappa value?
Can I run Cronbach's alpha test to assess the agreement between two raters?
0.081 is very bad. There are debates about what is considered good, and it also depends on the research context. In general, I'd say that in most cases anything above 0.5 is where it starts to get interesting.
The video actually talks about good and bad values at 9:25 onwards
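To make the scale concrete, here is a minimal pure-Python sketch of the formula kappa = (po - pe) / (1 - pe) applied to an invented 2x2 confusion matrix (the counts are made-up example data, not from the video):

```python
# Cohen's Kappa computed by hand from a 2x2 confusion matrix.
# The counts are invented example data (two raters, yes/no decisions).
#                rater B: yes   no
confusion = [[20, 5],   # rater A: yes
             [10, 15]]  # rater A: no

n = sum(sum(row) for row in confusion)             # total number of ratings
p_o = (confusion[0][0] + confusion[1][1]) / n      # observed agreement
row = [sum(r) for r in confusion]                  # rater A marginals
col = [sum(c) for c in zip(*confusion)]            # rater B marginals
p_e = sum(r * c for r, c in zip(row, col)) / n**2  # agreement expected by chance
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 3))
```

Here the raters agree 70% of the time, but half of that agreement is expected by chance alone, which is why the kappa ends up well below the raw agreement rate.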