Cohen's Kappa (Inter-Rater-Reliability)

  • Published Jan 21, 2025

COMMENTS • 60

  • @Natasha-vz8dd 3 months ago +1

    Thank you for the great explanations. I found you while searching for Fleiss' Kappa and now keep watching your other videos for the sheer pleasure of it.

  • @amalanassar 2 years ago +3

    No one can explain the obvious like you do. Perfect!

    • @datatab 2 years ago

      Many, many thanks for your nice feedback! Regards, Hannah

  • @pghjmuhammadfirdauspghjrus8870

    Thank you for such an excellent yet simple explanation 👍

  • @lulisboa81 2 years ago +2

    Beautifully explained!

    • @datatab 1 year ago

      Glad it was helpful!

  • @SoniaBiatrizCarlosNhancale 4 months ago

    Hi, excellent lesson. I would like to know how to proceed when one of the raters' responses is only "yes" or only "no" for all questions.
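
One way to see what happens in this case (a sketch with made-up yes/no ratings, not data from the video): if one rater uses only a single category, the chance-expected agreement equals the observed agreement, so Cohen's Kappa drops to 0 even when raw agreement is high, and it becomes undefined (0/0) if both raters are constant.

```python
# Minimal sketch (made-up data): Cohen's Kappa when rater A answers "yes" for every item.
from collections import Counter

rater_a = ["yes"] * 10              # rater A is constant
rater_b = ["yes"] * 8 + ["no"] * 2  # rater B mostly agrees

n = len(rater_a)
# Observed agreement: share of items rated identically
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance-expected agreement: sum over categories of the product of the raters' marginal proportions
cats = set(rater_a) | set(rater_b)
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in cats)

kappa = (p_o - p_e) / (1 - p_e)  # raises ZeroDivisionError if p_e == 1 (both raters constant)
print(p_o, p_e, kappa)           # 0.8 0.8 0.0 -> Kappa is 0 despite 80% raw agreement
```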

  • @mukasamohammed716 1 year ago +1

    Can you use Cohen's Kappa when two different instruments are used to measure the same thing, rather than two individual raters?

  • @sanikagodbole8874 1 year ago +3

    Such a perfect explanation! Thank you ❤

    • @datatab 1 year ago

      Glad it was helpful!

  • @user-yw1ul4xq1g 1 year ago +1

    Many thanks for the clear explanation!

  • @themightyenglish5310 9 months ago +1

    What if I have four levels? Thank you.
    1 = definitely exclude
    2 = not sure, but tendency to exclude
    3 = not sure, but tendency to include
    4 = definitely include

    • @datatab 9 months ago

      Hi, then you can still use Cohen's Kappa; with an ordered scale like this you could also use the weighted Cohen's Kappa.
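
For an ordered scale like the one above, the weighted Kappa mentioned in the reply counts a 1-vs-2 disagreement as less severe than a 1-vs-4 disagreement. Below is a minimal sketch using scikit-learn and made-up ratings (the library choice is an assumption on my part; the video itself works with DATAtab):

```python
# Hedged sketch: weighted Cohen's Kappa for the ordinal 4-level scale
# (1 = definitely exclude ... 4 = definitely include). Ratings are made up.
from sklearn.metrics import cohen_kappa_score

rater_1 = [1, 2, 4, 3, 4, 1, 2, 3, 4, 2]
rater_2 = [1, 3, 4, 3, 3, 1, 2, 4, 4, 2]

# Unweighted Kappa treats a 1-vs-2 disagreement the same as 1-vs-4
print(cohen_kappa_score(rater_1, rater_2))
# Weighted Kappa penalizes disagreements by their distance on the scale
print(cohen_kappa_score(rater_1, rater_2, weights="linear"))
print(cohen_kappa_score(rater_1, rater_2, weights="quadratic"))
```

Linear weights grow proportionally with the distance between the two ratings; quadratic weights punish large disagreements even more strongly.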

  • @jhulialeigho.climaco9292 2 years ago +3

    Hello. Can you do an example where you use Cohen's Kappa but there are 3 raters? Thank you.

    • @datatab 2 years ago

      Hi, with three raters you use Fleiss' Kappa; here is our video: ua-cam.com/video/ga-bamq7Qcs/v-deo.html Regards, Hannah

    • @shishi-g6l 1 month ago

      @datatab Can you do an example where you use Cohen's Kappa for 6 raters?
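
Since the question about three (and six) raters comes up repeatedly in this thread, here is a minimal Fleiss' Kappa sketch using statsmodels and made-up ratings (the library choice is an assumption; the linked video shows the same idea in DATAtab). Adding more raters just means adding more columns:

```python
# Hedged sketch: Fleiss' Kappa for three or more raters.
# Rows = subjects, columns = raters, values = category codes (made-up data).
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

ratings = np.array([
    [0, 0, 1],
    [1, 1, 1],
    [0, 1, 0],
    [2, 2, 2],
    [1, 1, 2],
    [0, 0, 0],
])

# Convert raw ratings into a subjects x categories count table
table, _ = aggregate_raters(ratings)
print(fleiss_kappa(table, method="fleiss"))
```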

  • @bongekamkhize3244 2 years ago +3

    Thanks for the great video. Could you perhaps do a video where you use articles to calculate the Kappa?

    • @datatab 2 years ago

      Great suggestion! I will put it on my to-do list!

  • @qqqq-mh5if 5 months ago

    Can this test be used for more than 2 categories rated by two raters, e.g. depressed, not depressed, unknown?
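
To the question above: yes, Cohen's Kappa handles any number of categories, as long as there are exactly two raters. A minimal sketch with the three categories mentioned, using scikit-learn and made-up ratings (the library choice is an assumption; the video itself uses DATAtab):

```python
# Hedged sketch: Cohen's Kappa with two raters and three nominal categories (made-up data).
from sklearn.metrics import cohen_kappa_score

rater_1 = ["depressed", "not depressed", "unknown", "depressed",
           "not depressed", "unknown", "depressed", "not depressed"]
rater_2 = ["depressed", "not depressed", "depressed", "depressed",
           "unknown", "unknown", "depressed", "not depressed"]

print(cohen_kappa_score(rater_1, rater_2))
```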

  • @justice3530 2 years ago +4

    Thank you!! Such an easy and good explanation!!

    • @datatab 2 years ago

      Glad it was helpful!

  • @ericdizon3320 2 years ago +1

    Can Cohen's Kappa be used for 5 raters?

  • @alonsamuel7106 2 years ago +1

    Great explanation of the kappa, thank you very much!!!!!! :)

    • @datatab 2 years ago

      Glad you liked it!

  • @jorgeulloagonzalez550 10 months ago +2

    Excellent, thank you very much!

    • @datatab 10 months ago

      Glad it was helpful, and many thanks for your feedback! Regards, Hannah

  • @dinukabimsarabodaragama716 2 years ago +2

    Thank you so muchhhhhh!!!!! A great explanation!

    • @datatab 2 years ago +1

      Many thanks!

  • @archanakamble5212 1 year ago

    Very informative video; however, I have a question. When I calculated Kappa manually it was 0.44, while using DATAtab it was 0.23. Why this difference?

  • @janitosalipdan6002 2 years ago

    How is Cohen's Kappa different from Cronbach's Alpha?

  • @kebenny 8 months ago +1

    Thanks for your video. It is really helpful.

    • @datatab 8 months ago +1

      Glad it was helpful! Regards, Hannah

  • @dr.jeevamathan718 2 years ago

    Randolph's K = 0.84: what does it mean?

  • @chrischan-e2s 1 year ago

    How did you get 0.72?
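
The exact 0.72 depends on the frequency table used in the video, but every Cohen's Kappa value comes from the same three steps: observed agreement, chance-expected agreement from the marginal totals, and then kappa = (p_o - p_e) / (1 - p_e). A sketch with made-up counts (not the video's table):

```python
# Hedged sketch of the general Kappa calculation (made-up 2x2 counts).
import numpy as np

# Rows = rater 1 (yes/no), columns = rater 2 (yes/no)
confusion = np.array([[20, 5],
                      [5, 20]])
n = confusion.sum()

# Step 1: observed agreement = share of cases on the diagonal
p_o = np.trace(confusion) / n          # 40/50 = 0.80

# Step 2: chance-expected agreement from the marginal proportions
row_marg = confusion.sum(axis=1) / n   # [0.5, 0.5]
col_marg = confusion.sum(axis=0) / n   # [0.5, 0.5]
p_e = np.sum(row_marg * col_marg)      # 0.50

# Step 3: Kappa
kappa = (p_o - p_e) / (1 - p_e)
print(kappa)                           # 0.6 for these made-up counts
```

Plugging the video's own table into the same three steps is what produces its 0.72.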

  • @musasilahl463 3 months ago

    great explanation

  • @rohanirohan9756 2 years ago

    Wonderful Explanation!!! Thanks

  • @irispham8807 2 years ago +1

    Excellent explanation. Thanks a lot!

    • @datatab 2 years ago +1

      Glad you liked it!

    • @irispham8807 2 years ago

      @datatab You are a life-saver for my statistics subject. Thanks so much from the bottom of my heart.

  • @surayuthpintawong8332 2 years ago +1

    Great video! Thanks.

    • @datatab 2 years ago +1

      Glad it was helpful! Thanks for your nice feedback! Regards, Hannah

  • @dentistryhm1804 1 year ago +1

    Excellent, thank you very much

    • @datatab 1 year ago

      Glad it was helpful!

  • @SupeRafa500 2 years ago +2

    Very good video.

  • @arberg200x 10 months ago +1

    Why did you make an example where the sums of the different variables are identical (two times 25)? It gave me a headache.

    • @datatab 10 months ago

      Oh sorry!

  • @originalvonster 1 year ago +1

    2:15 inter-rater reliability
    5:45 calculating

  • @nitinraturi2289 1 year ago +1

    well explained

  • @prathyu2559 2 years ago +1

    Superb mam

  • @larissacury7714 2 years ago +1

    Thank you!

    • @datatab 2 years ago

      You're welcome! :) Regards, Hannah

  • @KOTESWARARAOMAKKENAPHD 1 year ago

    very informative

  • @nourchenekharrat3860 2 years ago

    Thank you so much for your effort. I have 2 questions:
    Is 0.081 a good Kappa value?
    Can I run Cronbach's alpha test to assess the agreement between two raters?

    • @Cell1808 2 years ago +1

      0.081 is very bad. There are debates about what counts as good, and it also depends on the context of the research. In general, I'd say that in most cases anything above 0.5 is where it starts to get interesting.

    • @Cell1808 2 years ago +2

      The video actually talks about good and bad values from 9:25 onwards.
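
For orientation, the benchmarks most often cited for interpreting Kappa are the Landis & Koch (1977) categories; they are conventions rather than hard rules, and the cut-offs discussed in the video from 9:25 may be labelled slightly differently. A small sketch:

```python
# Hedged sketch: the commonly cited Landis & Koch (1977) benchmarks for Kappa.
def interpret_kappa(kappa: float) -> str:
    if kappa < 0:
        return "poor (worse than chance)"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(interpret_kappa(0.081))  # "slight" -- very weak agreement
```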

  • @javedkalva659 2 years ago +1

    Perfect