Thank you, Nikhil, for your kind words. Even though my YouTube channel is underrated, the comments on my recent video have made me proud of my teaching. Your motivation and encouragement inspire me to create more videos. Let's keep learning together!!
Thank you so much, sir. You surpass my college professors in both teaching and knowledge of this subject. Thank you from the heart again, and I wish you success.
Hi, please add more videos continuing from CKY: chart parsing, PCFG, dependency parsing, transition-based parsing. Thanks for your lectures, but we need more videos.
This man earned a lot of respect literally 💌
Sir, your way of explaining is simply superb.
Thank you so much sir. I’m trying to learn this concept from different sources but yours was clear.
Most underrated YouTube channel, it seems!!
Thanks Binod, it's really simple to understand. When I go through your videos, you make the concepts easy to understand and quick to grasp. Thanks once again.
I really want to thank you, Mr. Binod. This was made so easy to understand thanks to your genius.
Thanks a lot, sir, for explaining it clearly. You really helped me understand this concept better than any other lecturer.
Brother, you are a god, truly a god. You are covering all this advanced stuff that is used in NLP today, and that too for free ❤❤
Tomorrow is my exam. Watching this video boosted my confidence. Easy to understand.
Sir, brilliant explanation. Thanks for teaching; respect for your efforts.
Very useful video! Clear explanation on the concept and easily understandable numerical.
Thank you Binod for your videos. They helped for my exams!
Glad to know this NLP (Natural Language Processing) tutorial video helped you. Thanks for your nice words!!
Fantastic explanation. Thank you Binod!
Thank you sir, you made this topic clear for me.
this is a super helpful explanation. Thanks for making this video..
Thank you ! This was an amazing explanation
You're very welcome! Good to know this CKY video helped you.
Thank you so much sir 🙏
Thanks for your video, very helpful and understandable
Glad this NLP video was helpful for you! Keep Learning !!
Thank you for explaining this clearly
Happy to hear, Nihal, that this NLP (Natural Language Processing) CKY video tutorial series helped you. Keep Learning!! @binodsumanacademy
Wonderful explanation ❤sir.Thank you
explained so well thanks so much
Thank you Soooooo Much it was really helpful
Good to know this NLP CKY algorithm YouTube video helped you. Thank you for the nice words. Keep Learning!!
Thank you all for the nice words. It keeps me motivated to create good content and share it on YouTube. I need all of your support. Keep learning!!
What a lovely teacher, keep it up 🙃🌹
Thank you Sir ! You make subject interesting.
Glad to hear Nidhi that this Natural Language Processing NLP Tutorial series helped you. Keep Learning and thank you for your nice words !!
I could be wrong, but I calculated a different final probability. In chart[0,5] I get 2.304×10^-8.
Thanks for the explanation though! Very much enjoyed
thank you sir, great explanation
This is really helpful, thank you ! :))
Glad this Natural Language Processing tutorial was useful for you, Keep Learning !!
clear and concise!
Thanks so much 👌
Hello, so if the probabilistic CKY gives more than one parse, do you choose the more probable one?
Thank You, It is a nice video.
Big thanks 🙏.
Sir ! Please explain Chu- Liu-Edmonds algorithm
Thanks sir, this has been very helpful. I really appreciate it.
Good explanation; it also helps when implementing it in programs.
Thank You, Sir.
Thank you sir
Will the set of rules be given in the question paper, or do we have to come up with them on our own?
awesome
❤
Please complete the playlist sir
Apply probabilistic CKY Parsing of PCFG to the sentence “John eats pie with ice-cream” and update the table.
Production rules Probabilities
S → NP VP 1.0
NP → N 0.4
NP → NP PP 0.6
VP → V NP 0.7
VP → VP PP 0.3
PP → P NP 1.0
N → "John" 0.2
N → "pie" 0.3
N → "ice-cream" 0.5
V → "eats" 1.0
P → "with" 1.0
Can you please explain this?
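The question above can be worked through mechanically. Below is a minimal Viterbi-style probabilistic CKY sketch in Python, using only the rules and probabilities quoted in the comment (the grammar and numbers come from the question itself, not from the video); the unary rule NP → N is handled with a simple unary pass after each cell is filled.

```python
from collections import defaultdict

# PCFG exactly as given in the question above.
binary = {  # (A, B, C): prob for A -> B C
    ("S", "NP", "VP"): 1.0,
    ("NP", "NP", "PP"): 0.6,
    ("VP", "V", "NP"): 0.7,
    ("VP", "VP", "PP"): 0.3,
    ("PP", "P", "NP"): 1.0,
}
unary = {("NP", "N"): 0.4}  # NP -> N, applied as a unary-closure pass
lexicon = {
    "John": {"N": 0.2}, "eats": {"V": 1.0}, "pie": {"N": 0.3},
    "with": {"P": 1.0}, "ice-cream": {"N": 0.5},
}

def pcky(words):
    n = len(words)
    # chart[i][j][A] = best probability of A spanning words i..j-1
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]

    def apply_unaries(cell):
        # One pass suffices here: the only unary rule is NP -> N.
        for (a, b), p in unary.items():
            if cell[b] * p > cell[a]:
                cell[a] = cell[b] * p

    for i, w in enumerate(words):
        for tag, p in lexicon[w].items():
            chart[i][i + 1][tag] = p
        apply_unaries(chart[i][i + 1])

    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cell = chart[i][j]
            for k in range(i + 1, j):          # split point
                for (a, b, c), p in binary.items():
                    cand = p * chart[i][k][b] * chart[k][j][c]
                    if cand > cell[a]:          # Viterbi: keep the max
                        cell[a] = cand
            apply_unaries(cell)
    return chart

chart = pcky("John eats pie with ice-cream".split())
print(chart[0][5]["S"])  # probability of the most probable parse
```

Under these question-supplied numbers the winning parse attaches "with ice-cream" to "pie" via NP → NP PP (0.7 × 0.0144 = 0.01008 beats the VP → VP PP reading at 0.00504), so chart[0,5] for S comes out around 8.064 × 10⁻⁴; different rule probabilities would change both the value and the winning attachment.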
Please make a video on how you calculate so fast.
Nice sir
The IIT professor failed to teach the CYK algorithm as effectively as you did.
💯
How can precision be achieved with the probabilistic CKY algorithm?
Sir, which book are you referring to?
Why didn't NP and V combine in the [0,3] box??
Two non-terminals combine with each other, not one terminal and one non-terminal.
Sir, you made an error here: 0.3*0.4+0.02 = ??
By any chance, are you from Assam?
Next video when?
parse tree ??
What if the probabilities are not given?
Binod Suman, I think the calculation at the end is wrong.
Binod
🫡
Binod
Binod
ha ha Binod