A really popular model used for machine translation is BART. Take a look at this video to see how it works: ua-cam.com/video/jTvPJD81m8E/v-deo.html
thanks a lot! please keep making more videos. they are very useful and informative. videos like yours make learning so much fun and engaging :D
Thanks! More to come! :)
I just love you so much! I could really see your effort to make things intuitive. You are so underrated and I say this from the bottom of my heart.
Thank you so much for your kind words!
Recall doesn't worry about FPs, so why is it 2/5? If all the ground truths appear in the list of predicted positives, isn't recall 1?
Recall measures how many ngrams from the reference text are found in the predicted text. Thus, it is defined as the number of overlapping ngrams divided by the number of reference ngrams. There are 2 overlapping unigrams and 5 unigrams in the reference, which gives us a recall of 2/5. Hope it makes sense now! :)
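The calculation above can be sketched in a few lines of Python. The sentences below are hypothetical (not the ones from the video); they are chosen so that the reference has 5 unigrams, 2 of which also appear in the prediction:

```python
from collections import Counter

def unigram_recall(reference: str, prediction: str) -> float:
    """Unigram recall (as in ROUGE-1): overlapping unigrams / reference unigrams."""
    ref_counts = Counter(reference.split())
    pred_counts = Counter(prediction.split())
    # Clip each word's overlap at its count in the prediction
    overlap = sum(min(c, pred_counts[w]) for w, c in ref_counts.items())
    return overlap / sum(ref_counts.values())

# 5 reference unigrams; "cat" and "on" also occur in the prediction -> 2/5
print(unigram_recall("the cat sat on mat", "a cat lay on sofa"))  # 0.4
```

Note that only the reference length appears in the denominator, which is why extra predicted words (false positives) don't lower recall.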