Calculating Inter Rater Reliability/Agreement in Excel

  • Published 21 Feb 2015
  • A brief description of how to calculate inter-rater reliability or agreement in Excel.

COMMENTS • 25

  • @Simmy56 · 8 years ago · +36

    Technically this should not be called inter-rater reliability, as you are presenting only inter-rater agreement (absolute consensus) as opposed to inter-rater consistency (moving in the same direction, or maintenance of rank order across judges). Further, if using % agreement it would be beneficial to correct for chance using Cohen's kappa, or, where your data is continuous (as it appears to be here), it might be simpler to just calculate a correlation between the two raters.

    • @drkayotu · 7 years ago · +2

      Yes - both good suggestions. I suppose I have simply followed the model I was taught and have read most often in education. But these are good options. Thank you.
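
    As a rough illustration of the two options suggested in this thread (a sketch only, assuming the two raters' scores sit in A2:A21 and B2:B21 of a hypothetical Excel sheet), consistency between continuous ratings could be checked with

        =CORREL(A2:A21,B2:B21)

    while Cohen's kappa would correct observed agreement po for chance agreement pe, kappa = (po - pe) / (1 - pe), which requires tabulating how often each pair of categories co-occurs.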

  • @DvdBdjzDvl · 4 years ago · +4

    What would you recommend for when you have 10 coders and each variable has 4-7 values (nominal)?

  • @SPORTSCIENCEps · 3 years ago

    Thank you for uploading it!

  • @FooodConfusion · 3 years ago · +2

    Thanks for making it so easy, but can you explain rwg(j) or rwg, which is used for team-level variables?

  • @donnaderose4959 · 4 years ago · +1

    Thank you.
    Could you share a video on how to calculate inter-rater reliability using SPSS?

  • @sanjnathakur6489 · 3 years ago

    Sir, please tell me: for a Likert scale, is inter-rater reliability used or not?
    And for a checklist, can test-retest be used?
    Or is it vice versa?
    Please, it's urgent.

  • @Machina3123 · 3 years ago · +4

    Excuse me sir, I would like to ask how high the result should be to be considered reliable. Is it above 80%?

    • @drkayotu · 3 years ago

      It depends, and I do not believe there are set guidelines, but I like to get 80-95%. Remember, you can always go back and discuss with your rating partner.

  • @Ammar__Ninja7973 · 1 year ago · +1

    thank youuu

  • @madelineespunkt6240 · 8 years ago · +2

    What if I have more than 2 raters? Should I give a 1 if everyone rates the question with the same number? Is this the same procedure for a nominal scale? Which is this (Pearson, Spearman correlation, or none of those)?

    • @drkayotu · 8 years ago · +2

      +Madeline Espunkt Good question. There are different ways. You could require all three to agree to get a 1, but that seems a bit strict. You could instead calculate and present IRR for each pair. The same procedure applies for any scale.

    • @madelineespunkt6240 · 8 years ago

      +Robin Kay Thanks for your quick reply :)! I plan to run a test manual with my fellow students, so I have about 20 raters for one object. There are 30 points to reach; they watch a video of the test subject and then rate it, but I'm not sure how to calculate it. :/
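
    Building on the pairwise approach suggested in the reply above, a minimal Excel sketch (assuming three hypothetical raters in columns A, B and C, rows 2-21):

        D2:  =IF(A2=B2,1,0)      (per-item agreement flag for the A/B pair; fill down)
        D22: =AVERAGE(D2:D21)    (percent agreement for that pair)

    Repeat with extra columns for the A/C and B/C pairs, then report each pairwise percentage (or their mean).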

  • @lindsyrichardson2708 · 8 years ago

    Hi Robin. Great video! Is this Cohen's Kappa unweighted? Lindsy

    • @drkayotu · 8 years ago

      +Lindsy Richardson No, this is a simple percent formula. No fancy name. Basically, the number of agreements over the total number of answers.
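
    A minimal sketch of that percent calculation, assuming the two raters' scores are in A2:A21 and B2:B21 of a hypothetical sheet:

        C2:  =IF(A2=B2,1,0)              (1 if the raters agree on this item, 0 if not; fill down)
        C22: =SUM(C2:C21)/COUNT(C2:C21)  (number of agreements over the total number of answers)

    Formatting C22 as a percentage gives the figure the reply above describes.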

  • @soukainaaziz1623 · 6 months ago

    Thank you. What is the formula used, please?

  • @shurooqa.a.3245 · 8 years ago

    Please, I need the reference that explains this procedure.
    Thank you

    • @drkayotu · 7 years ago

      Well there are a number of references - here is one quick one: www.statisticshowto.com/inter-rater-reliability/

  • @subhayumitra5417 · 4 years ago · +1

    Thank you 😊

    • @drkayotu · 4 years ago · +1

      You are very welcome

  • @dr.yousufmoosa2561 · 8 years ago

    Please show us how to calculate intra-examiner reliability.

    • @drkayotu · 8 years ago

      +Dr. Yousuf Moosa I think the same procedure would apply - Rater 2 would simply be the scores for the same person on a second occasion. The calculations would be the same.

    • @dr.yousufmoosa2561 · 8 years ago

      Thank you
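
    Following the reply above, a sketch for the intra-rater case under an assumed layout (one rater's first-occasion scores in A2:A21, second-occasion scores in B2:B21); a single-cell version of the same agreement percentage is

        =SUMPRODUCT(--(A2:A21=B2:B21))/COUNTA(A2:A21)

    where the double negative converts the TRUE/FALSE matches to 1/0 before summing.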

  • @jegathiswarymou3117 · 2 years ago

    What if none of the scores match?

  • @SufraDIYCrafts1 · 3 years ago

    This is not how we calculate inter-rater reliability on a scale of 10. There is a much higher chance of disagreement on a scale of 10; there must be some normalization.