BLEU Score Explained

  • Published Jan 3, 2025

COMMENTS • 7

  • @datamlistic
    @datamlistic  8 months ago +1

    A really popular model used for machine translation is BART. Take a look at this video to see how it works: ua-cam.com/video/jTvPJD81m8E/v-deo.html

  • @Accorinrin
    @Accorinrin 26 days ago +1

    Thanks a lot! Please keep making more videos. They are very useful and informative. Videos like yours make learning so much fun and engaging :D

  • @stanleykurniawan3053
    @stanleykurniawan3053 8 months ago +1

    I just love you so much! I could really see your effort to make things intuitive. You are so underrated and I say this from the bottom of my heart.

    • @datamlistic
      @datamlistic  8 months ago +1

      Thank you so much for your kind words!

  • @lamrin9178
    @lamrin9178 7 months ago

    Recall doesn't worry about FP, so why is it 2/5? If all the ground truths appear in the list of predicted positives, isn't recall 1?

    • @datamlistic
      @datamlistic  7 months ago

      Recall is interested in how many n-grams from the reference text are found in the predicted text. Thus, it is defined as the number of overlapping n-grams divided by the number of reference n-grams. There are 2 overlapping unigrams and 5 unigrams in the reference, which gives us a recall of 2/5. Hope it makes sense now! :)
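
To make the reply above concrete, here is a minimal Python sketch of clipped unigram overlap, precision, and recall. The sentences are made up for illustration (they are not the example from the video) and were chosen so that 2 of the 5 reference unigrams overlap, reproducing the 2/5 recall discussed in the thread.

```python
from collections import Counter

def unigram_overlap_stats(candidate: str, reference: str):
    """Compute clipped unigram overlap, precision, and recall for one sentence pair."""
    cand_counts = Counter(candidate.split())
    ref_counts = Counter(reference.split())
    # Clipped overlap: a candidate word is counted at most as many
    # times as it appears in the reference.
    overlap = sum((cand_counts & ref_counts).values())
    precision = overlap / sum(cand_counts.values())  # overlap / number of candidate unigrams
    recall = overlap / sum(ref_counts.values())      # overlap / number of reference unigrams
    return overlap, precision, recall

# Hypothetical sentences: "he" and "apple" overlap (2 unigrams),
# and the reference has 5 unigrams, so recall = 2/5.
candidate = "he is eating apple now"
reference = "he eats an apple daily"
print(unigram_overlap_stats(candidate, reference))  # (2, 0.4, 0.4)
```

BLEU itself is built on the clipped n-gram precision side of this computation (combined across n-gram orders with a brevity penalty); the recall shown here is only the quantity the question asked about.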