Kappa Coefficient

  • Published 4 Apr 2015
  • Cohen's kappa coefficient is a statistical measure of inter-rater agreement for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent-agreement calculation, since κ takes into account the possibility of the agreement occurring by chance. (A short computational sketch follows below.)
  • Science & Technology
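As a quick illustration of that definition, here is a minimal Python sketch of Cohen's κ for two raters and a yes/no item; the counts are hypothetical and not taken from the video:

```python
# Minimal sketch of Cohen's kappa for two raters and a yes/no item.
# The counts used below are hypothetical, purely for illustration.

def cohens_kappa(both_yes, yes_no, no_yes, both_no):
    """Return (observed agreement, chance agreement, kappa) for a 2x2 table."""
    n = both_yes + yes_no + no_yes + both_no
    # Observed agreement (OA): fraction of subjects on which the raters match.
    oa = (both_yes + both_no) / n
    # Chance agreement (AC): product of the raters' marginal "yes" rates,
    # plus the product of their marginal "no" rates.
    rater1_yes = (both_yes + yes_no) / n
    rater2_yes = (both_yes + no_yes) / n
    ac = rater1_yes * rater2_yes + (1 - rater1_yes) * (1 - rater2_yes)
    # Kappa: agreement achieved beyond chance, relative to the maximum possible.
    return oa, ac, (oa - ac) / (1 - ac)

oa, ac, kappa = cohens_kappa(both_yes=20, yes_no=5, no_yes=10, both_no=15)
print(f"OA = {oa:.2f}, AC = {ac:.2f}, kappa = {kappa:.2f}")  # OA = 0.70, AC = 0.50, kappa = 0.40
```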

COMMENTS • 61

  • @Teksuyi · 8 years ago · +4

    I didn't understand a damn thing you said (I don't understand English), but these 4 minutes were better than my professor's hour-long class. Thank you very much.

  • @chenshilongsun1581 · 3 years ago · +3

    So helpful, watching a 4.5 min video sure beats a 50 minute lecture

  • @fernandoduartemolina · 9 years ago · +5

    Simple, very well explained, nicely presented, clear voice. Excellent, thank you so much, this video is very useful.

  • @ezzrabella6624 · 8 years ago · +14

    This was VERY helpful and simplified the concept. Thank you, please do more videos!

  • @heikochujikyo · 1 year ago

    This is pretty quick and effective it seems.
    Understanding the formula and how it works in depth surely takes more than 5 minutes, but it sure saves some work lmao
    Thank you for this

  • @66ehssan · 2 years ago

    What I thought was impossible to understand only took a great 4-minute video. Thanks a lot!

  • @gokhancam1754 · 4 years ago · +1

    Accurate, sharp and to the point. Thank you, sir! :)

  • @rafa_leo_siempre · 3 years ago · +1

    Great explanation (with nice sketches as a bonus)- thank you!

  • @lakshmikrishakanumuru9043 · 5 years ago · +2

    This was made so clear thank you!

  • @nhaoyjj · 3 years ago

    I like this video so much, you explained it very clearly. Thank you

  • @bhushankamble4087 · 8 years ago · +1

    Just awesome!!! THANKS. Please make more such videos on biostatistics.

  • @TheMohsennabil · 8 years ago · +5

    Thank you. You make it easy.

  • @riridefrog · 2 years ago

    Thanks so much, VERY helpful and simplified the concept

  • @jaminv4907 · 1 year ago

    Great concise explanation, thank you. I will be passing this on.

  • @handle0617 · 4 years ago

    A very well explained topic

  • @LastsailorEgy · 2 years ago

    very good simple clear video

  • @byron9570 · 8 years ago · +1

    Great explanation!!!!

  • @drsantoo · 4 years ago

    Superb explanation. Thanks sir.

  • @MinhNguyen-kv8el · 4 years ago

    thank you for your clear explanation.

  • @UsedHeartuser · 9 years ago · +4

    Thanks, this helped me! :)

  • @vikeshnallamilli · 2 years ago

    Thank you for this video!

  • @samisami25 · 8 years ago · +1

    Thank you. More videos please :-)

  • @Adrimartja · 7 years ago

    thank you, this is really helping.

  • @salvares8323 · 9 years ago · +1

    Awesome. Can you make more videos like this? They are so simple & nice.
    Thanks

  • @Lector_1979 · 3 years ago

    Great explanation. Thanks a lot.

  • @nomantech8813 · 3 years ago

    Well explained. Thank you sir

  • @arnoudvanrooij · 4 years ago

    The explanation is quite clear, though the numbers could be given a bit more precisely. Agreement: 63.1578947368421%, Cohen's k: 0.10738255033557026. Thanks for the video! (See the sketch below.)
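For anyone who wants to reproduce those figures: a 2×2 table with 10 yes/yes, 3 yes/no, 4 no/yes and 2 no/no (19 subjects) yields exactly the quoted values. The table itself is a reconstruction from the numbers above, not confirmed against the video:

```python
# Reconstructed (assumed) table: 10 yes/yes, 3 yes/no, 4 no/yes, 2 no/no -> n = 19.
n = 19
oa = (10 + 2) / n                              # observed agreement = 0.631578...
ac = (13 / n) * (14 / n) + (6 / n) * (5 / n)   # chance agreement  = 0.587257...
kappa = (oa - ac) / (1 - ac)                   # 0.10738255033557...
print(f"Agreement: {oa * 100}%, Cohen's k: {kappa}")
```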

  • @daliael-rouby2411 · 3 years ago

    Thank you. If I have data with high agreement between both observers, should I use the results from just one of the raters, or should I use the mean of both raters' ratings?

  • @galk32 · 5 years ago

    great explanation

  • @louiskapp · 2 years ago

    This is phenomenal

  • @mayralizcano8892 · 2 years ago · +1

    Thank you, you helped me so much.

  • @isa..333 · 1 year ago

    this video is so good

  • @anasanchez2935 · 3 years ago

    Thank you teacher, I understood it :)

  • @danjosh20 · 8 years ago · +2

    Question please:
    We are supposed to do kappa scoring for dentistry, but we have 5 graders. How do we do such a thing?

  • @zicodgra2684 · 7 years ago · +3

    What is the range of kappa values that indicates good agreement, and what range indicates low agreement?

    • @zicodgra2684 · 7 years ago · +15

      I did my own research and figured I'd post it here in case anyone ever has the same question. Taken from the source, an article named "Interrater reliability: the kappa statistic", it reads:
      Similar to correlation coefficients, it can range from −1 to +1, where 0 represents the amount of agreement that can be expected from random chance, and 1 represents perfect agreement between the raters. While kappa values below 0 are possible, Cohen notes they are unlikely in practice (8). As with all correlation statistics, the kappa is a standardized value and thus is interpreted the same across multiple studies.
      Cohen suggested the Kappa result be interpreted as follows: values ≤ 0 as indicating no agreement and 0.01-0.20 as none to slight, 0.21-0.40 as fair, 0.41-0.60 as moderate, 0.61-0.80 as substantial, and 0.81-1.00 as almost perfect agreement.
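Those bands are easy to keep at hand in code; a convenience sketch of the interpretation quoted above (not something from the video):

```python
def interpret_kappa(kappa):
    """Map a kappa value to the interpretation bands suggested by Cohen."""
    if kappa <= 0:
        return "no agreement"
    for upper, label in [(0.20, "none to slight"), (0.40, "fair"),
                         (0.60, "moderate"), (0.80, "substantial"),
                         (1.00, "almost perfect")]:
        if kappa <= upper:
            return label
    return "almost perfect"   # kappa cannot exceed 1

print(interpret_kappa(0.12))  # none to slight
print(interpret_kappa(0.75))  # substantial
```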

  • @MrThesyeight · 6 years ago

    How do you calculate the agreement for ratings on a "strongly disagree, disagree, agree, strongly agree" scale? What is the formula for calculating only the 'observed agreement'? (See the sketch below.)
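Observed agreement works the same way for a multi-category scale: it is the share of subjects on the diagonal of the two raters' cross-table. A minimal sketch with made-up counts for the four categories:

```python
# Hypothetical cross-table for two raters over the categories
# [strongly disagree, disagree, agree, strongly agree]; rows = rater 1, columns = rater 2.
table = [
    [5, 2, 0, 0],
    [1, 7, 3, 0],
    [0, 2, 9, 1],
    [0, 0, 2, 4],
]
total = sum(sum(row) for row in table)
diagonal = sum(table[i][i] for i in range(len(table)))  # both raters picked the same category
observed_agreement = diagonal / total
print(observed_agreement)  # 25/36 ≈ 0.694
```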

  • @KnightMD · 3 years ago

    Thank you so much! Problem is, I don't have a "YES" or "NO" answer from each rater. I have a grade of 1-5 given by each rater. Can I still calculate Kappa?

  • @genwei007 · 1 year ago

    Still not clear how we arrive at the final kappa equation. Why (OA − AC)? Why divide by (1 − AC)? The rationale is obscure to me.
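One way to read the formula: OA − AC is how much agreement the raters actually achieved beyond what chance alone would produce, and 1 − AC is the most they could possibly have achieved beyond chance, so κ is the ratio of the two (a reading of the formula, not a derivation from the video):

```latex
\[
  \kappa \;=\; \frac{OA - AC}{1 - AC}
  \;=\; \frac{\text{agreement actually achieved beyond chance}}{\text{maximum agreement achievable beyond chance}}
\]
```

Hence OA = AC gives κ = 0 (no better than chance) and OA = 1 gives κ = 1 (perfect agreement), regardless of how large the chance agreement happens to be.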

  • @rekr6381 · 2 years ago

    Thank you!

  • @EvaSlash · 7 years ago · +3

    The only thing I do not understand is the "Chance Agreements", the AC calculation of .58. I understand where the numbers come from, but I do not understand the theory behind why the arithmetic works to give us this concept of "chance" agreement. All of the numbers in the table are what was observed to have happened...how can we just take some of the values in the table and call it "chance" agreement? Where is the actual proof they agreed by chance in .58 of the cases?

    • @farihinufiya · 6 years ago · +7

      for the "chance" of agreement, we are essentially multiplying the probability of rater 1 saying yes and the probability of rater 2 saying yes and doing the same for the no(s). The same way you would calculate the "chances" of getting both heads on two coins, we would multiply the probability of obtaining heads in coin 1 (0.5) and the probability of obtaining heads in coin 2 (0.5). The chance of us obtaining heads by mere luck for both is hence 0.25, the same way the chance of the two raters agreeing by chance is 0.58

  • @simonchan2394 · 7 years ago · +3

    Can you please elaborate on the meaning of a high or low kappa value? I can now calculate kappa, but what does it mean?

    • @jordi2808 · 3 years ago · +1

      A bit late. But in school we learned the following.

  • @zubirbinshazli9441 · 7 years ago · +1

    How about weighted kappa?

  • @alejandroarvizu3099 · 6 years ago

    It's a value clock-agreement chart.

  • @atefehzeinoddini9925 · 3 years ago

    Great, thank you.

  • @anukulburanapratheprat7483 · 2 years ago

    Thank you

  • @o1971 · 5 years ago · +1

    Great video. Could you also explain if 0.12 is significant or not?

    • @robinredhu1995 · 4 years ago

      Cohen suggested the Kappa result be interpreted as follows: values ≤ 0 as indicating no agreement and 0.01-0.20 as none to slight, 0.21-0.40 as fair, 0.41- 0.60 as moderate, 0.61-0.80 as substantial, and 0.81-1.00 as almost perfect agreement.

  • @llxua7487 · 2 years ago

    Thank you for your video.

  • @michaellika6567 · 1 year ago

    THANK U!!!

  • @ProfGarcia · 2 years ago

    I have a very strange kappa result: I checked for a certain behavior in footage of animals, which I assessed twice. For 28 animals, I agreed 27 times that the behavior is present and disagreed only once (the behavior was present in the first assessment, but not in the second). My data is organized as the following matrix:
    0  1
    0 27
    And that gives me a kappa value of zero, which I find very strange because I disagree in only 1 of the 28 assessments. How come these results are considered pure chance? (A worked check follows after the replies below.)

    • @krautbonbon · 1 year ago

      I am wondering the same thing.

    • @krautbonbon · 1 year ago

      I think that's the answer: pubmed.ncbi.nlm.nih.gov/2348207/
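For what it's worth, the arithmetic behind @ProfGarcia's κ of zero (and the "high agreement, low kappa" paradox the linked article discusses) looks like this:

```python
# @ProfGarcia's table: 27 animals scored "present" in both assessments,
# 1 scored "present" only in the first assessment, nothing in the other cells.
both_present, present_absent, absent_present, both_absent = 27, 1, 0, 0
n = both_present + present_absent + absent_present + both_absent   # 28

oa = (both_present + both_absent) / n                              # 27/28 ≈ 0.964
p1 = (both_present + present_absent) / n     # assessment 1 "present" rate: 28/28 = 1.0
p2 = (both_present + absent_present) / n     # assessment 2 "present" rate: 27/28
ac = p1 * p2 + (1 - p1) * (1 - p2)                                 # also 27/28 ≈ 0.964

kappa = (oa - ac) / (1 - ac)
print(oa, ac, kappa)                                               # 0.964... 0.964... 0.0
```

Because assessment 1 never recorded "absent", the agreement expected by chance is already 27/28, leaving κ no room to credit the observed agreement.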

  • @lakesidemission7172 · 3 years ago

    👍♥️♥️🐾