I couldn't understand the topic despite watching many videos on YouTube. However, after watching your video, all my doubts were cleared. I hope you realize what an amazing teacher you are!
It's been 6 years since this video, but it's still so relevant, both for knowledge and for the end-sem paper. Thanks a lot 👍👍
I hope you get a lot more views on this channel man, great explanation. You'll make a great professor.
Thanks a lot for your valuable response.
@@5MinutesEngineering 84k subscribers all over the globe and still counting
@@jamirraja1397 I think he needs to multiply by P(Orange), P(Banana), and P(Others) respectively and divide by P(Fruit) in each case, but P(Fruit) is the same in every case, so it can be neglected.
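To spell out what's being suggested above (a sketch, assuming the video's setup of classes Orange/Banana/Others and the observed fruit features as the evidence):

\[
P(\text{Orange} \mid \text{Fruit}) = \frac{P(\text{Fruit} \mid \text{Orange})\, P(\text{Orange})}{P(\text{Fruit})}
\]

The same denominator P(Fruit) appears for Banana and Others too, so when we only want to compare the three classes it can be dropped, and it is enough to compare P(Fruit | class) · P(class).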
Whenever I took any CS-related course on YouTube, I ran into many problems understanding the topics. But since I started following you, that has been changing day by day.
I always thought machine learning was very hard, but after watching this video, machine learning feels very easy.
Thanks Brother
bro it is hard
bro, it looks easy to you because you've already learned it. If you have time, build the algorithm yourself;
then you will know its essence.
You are going to be / already are a great teacher. Your teaching style is just fantastic. Watching your video once is enough to grasp the concept. You have a long, long way to go! Please post many more videos and keep up the good job! (Y)
I was hopeless...but now I can see a ray of hope😇
👏👏👏 You deserve thousands of views. Best explanation skills.
Brother, what won't you do! I didn't understand as much in 4 years of college as you explain in 10 minutes 🔥 Big fan of your way of teaching ❤ Precise, to the point, and the easiest on YT ❤
Bro, just tell us what exactly you want him to do, then.
Nice explanation. One correction, I think: in calculating the final probabilities, it should be P(Orange|Fruit), because we are calculating the probability of the item being an orange given that it has the mentioned characteristics.
and also multiply by P(Orange)
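A minimal Python sketch of what "multiply by P(Orange)" means in practice, using hypothetical counts (the numbers below are illustrative placeholders, not necessarily the video's exact table):

```python
# Hypothetical count table: for each class, (yellow, sweet, long, total) counts.
# These numbers are illustrative placeholders, not necessarily the video's exact data.
counts = {
    "Orange": (350, 450, 0, 650),
    "Banana": (400, 300, 350, 400),
    "Others": (50, 100, 50, 150),
}
grand_total = sum(row[3] for row in counts.values())  # total number of fruits

scores = {}
for cls, (yellow, sweet, long_, total) in counts.items():
    prior = total / grand_total                                        # P(class)
    likelihood = (yellow / total) * (sweet / total) * (long_ / total)  # naive P(Yellow, Sweet, Long | class)
    scores[cls] = prior * likelihood                                   # proportional to P(class | features)

# Dividing every score by the shared P(features) term would not change the ranking.
print(scores, "->", max(scores, key=scores.get))
```

Ranking the classes by prior × likelihood gives the same winner as the full posterior, since the evidence term is common to all of them.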
Thank you so much, I have my semester final exam on this tomorrow. Great help indeed, you make it look so easy.
Sir, thank you so much for all the help you have given for so long. I would say I passed Engineering just on your lectures. The spirit you show in your lectures motivates us a lot.
That was clear and crisp 🔥
Just love you, sir... I was a zero in data mining before I found this channel.
The best channel to get the concepts cleared :)
Very nice explanation, sir. We need more tutorials of this type from you.
I don't need Decode or Easy Solutions... 5 Minutes Engineering is enough for BE exams! ❤️
Did you clear your BE?
Thanks for an amazing explanation.
Can't hope for a better explanation than this.
Thanks a lot!
Great, sir... This video is very helpful and in simpler language than other videos.
Superb methodology for conveying knowledge... I saw Dr. Vivek Bindra in your expressions.
Thank you so much, sir, for the simplest and most excellent explanations. I wrote my DAA & CD exams after learning from your lecture videos. ❤❤❤ #Thanks5MinEngg
0:27 is perfect screenshot
Your way of expressing the content is too good, sir 👍... every point becomes clear.
Sir, you are a ray of hope... God bless you... Thanks so much.
You are a lifesaver, brother.
Thanks a lot for all your videos.
Really well explained.
Exactly what you need a night before the exam.
Good work.
Brother, which course does this machine learning come in? I'm in CS, and there we only have programming in C, C++, and Java. Brother, where is this subject?
@@anishjain8096 Brother, which year are you in? ML is mostly in the 3rd year of CS. It's in the 3rd year at my college too, and it's an elective subject; we got a choice between data mining and ML.
Sir... if it weren't for you, I would never have passed data mining... thank you for the wonderful videos.
What a great course, sir, with a very easy explanation. You are awesome, sir. 👍👍
Fantastic explanation, brother... very helpful.
Great, exactly what I needed with 1 hour left before my exam lol. Good.
I really love the way you explained it, sir... God bless you. (From Nepal)
Better than any college professor.
Thanks 🙏 so much Sir!
Just saw this before the exam & my doubts are somewhat cleared. 😌
Great explanation. Thank you so much, sir 🙏😊
Concise and to the point... great job.
Please make a video on the ID3 algorithm, smoothing, and diagnostics...
Very nicely explained. Please keep posting such videos; they help a lot.
Very nice video. Also, please explain the zero-probability case in Naive Bayes and the corresponding Laplace smoothing technique.
Thanks man... It helps me!
Thanks, sir, secured 15% with your help 😋
Thank you, sir ♥
Stay blessed, sir, it was a fantastic explanation, thank you so much.
Thank you very much. The explanation is simple and elegant.
Superb, bro, great way of explaining...
I think, sir, you should calculate P(Banana|Fruit) instead of P(Fruit|Banana), and likewise for oranges and others, because P(Banana|Fruit) gives the probability of the fruit being a banana given the observed features, which is the desired outcome in classification tasks, whereas P(Fruit|Banana) gives the probability of observing the features given that the fruit is a banana and does not directly tell us how likely the fruit is to be a banana.
exactly
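For anyone following this thread, the "naive" part is the independence assumption that lets the likelihood be factored feature by feature; roughly, with Yellow, Sweet, Long as the observed features:

\[
P(\text{Banana} \mid \text{Fruit}) \;\propto\; P(\text{Yellow} \mid \text{Banana})\, P(\text{Sweet} \mid \text{Banana})\, P(\text{Long} \mid \text{Banana})\, P(\text{Banana})
\]

The per-feature terms on the right are what the table provides; the posterior on the left is what the classifier actually reports.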
Very nice, simple, and to-the-point example.
My college professor used your examples to explain.
IMP 🛑 But if we have Fruit = {Yellow, Sweet, Sweet, Sweet, Sweet, Long}, then by conventional logic, seeing Sweet so many times (and Sweet having a higher probability for Orange), we might conclude that it is an orange; but the single term Long will make the whole product 0 for Orange compared to the other fruits. To counter this, we add a small number, say 1, to all the zero counts in the dataset so that we do not get an incorrect output just because a zero is present.
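One common form of that fix is add-one (Laplace) smoothing; as a sketch, for a feature that can take K distinct values:

\[
P(\text{Long} \mid \text{Orange}) \approx \frac{n_{\text{Long},\,\text{Orange}} + 1}{n_{\text{Orange}} + K}
\]

The +1 in the numerator stops a zero count from wiping out the whole product, and the +K in the denominator keeps the smoothed probabilities for that feature summing to 1.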
thanks sir😊😊😊
Thanks from Kerala 😀😀
KTU Data Mining, isn't it? 😂
@@sneh9817 haha yaa
Sir, please upload the remaining Data Analytics videos... please... C4.5, CART, evaluating decision trees, smoothing, diagnostics, classification diagnostics. Sir, if possible, please explain the syllabus topics of the last 3 units as well; the textbook language is very hard, sir, please.
Outstanding explanation of everything.
Thank you so much sir... really very helpful❤❤❤❤
Love from Lucknow.
Great, sir, you are a great teacher 👍
Very good explanation.
Thank you, sir.
Awesome explanation 👌👌👌👌🙏
Or we can use P(A|B) = P(A ∩ B) / P(B), which works in every case; Bayes' theorem is derived from this property.
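Spelling that derivation out, since the definition works in both directions:

\[
P(A \cap B) = P(A \mid B)\, P(B) = P(B \mid A)\, P(A)
\quad\Longrightarrow\quad
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
\]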
Nice video ❤
This is an awesome explanation... great job.
Lots of respect to you sir
Your explanation is good
Thank you so much, sir, a million thanks to you, sir!!!
Great video sir!!
Thanks, sir, spot on 👍🏻❤
Very nice explanation bro 👌
Who needs to worry about machine learning algorithms when you have sir?
You are great, bro. Keep it up.
First-class, sir, great.
You’re amazing !!! 😊😊
Great explanation, sir.
Love from Pakistan.
Well explained 👍👍🙏
Nice and easy explanation
Great video and thanks sir!!!
Awesome brother .. thanks
Best teacher
Well explained 👌👌👌👍
Great one. Thanks a lot.
Sir, your style of explanation is awesome. However, do you have any videos about implementing these algorithms in code? Please do guide me; it would be highly appreciated. Great respect from Karachi, Pakistan.
Thank you for the explanation.
Bayes' Theorem:
A fundamental theorem in probability theory that describes how to update the probability of a hypothesis based on new evidence.
Naive Bayes:
A classification algorithm based on Bayes' Theorem, with the assumption of feature independence.
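For readers who want to try this in code, here is a minimal sketch using scikit-learn's CategoricalNB, assuming scikit-learn is installed; the tiny dataset below is made up purely for illustration:

```python
# A small illustrative sketch, not the video's data: each row is a fruit,
# with integer-coded features [is_yellow, is_sweet, is_long].
import numpy as np
from sklearn.naive_bayes import CategoricalNB

X = np.array([
    [1, 1, 0],  # yellow, sweet, not long (e.g. an orange)
    [1, 1, 1],  # yellow, sweet, long (e.g. a banana)
    [1, 0, 1],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
])
y = np.array(["Orange", "Banana", "Banana", "Orange", "Orange", "Others"])

clf = CategoricalNB(alpha=1.0)  # alpha=1.0 is add-one (Laplace) smoothing
clf.fit(X, y)

query = np.array([[1, 1, 1]])    # a yellow, sweet, long fruit
print(clf.predict(query))        # most probable class
print(clf.predict_proba(query))  # posterior P(class | features)
```

CategoricalNB treats each column as a categorical feature and applies exactly the prior × per-feature-likelihood scoring described in this thread, with smoothing built in via alpha.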
Great, sir, thank you for your video.
Excellent 👍😌
Wonderful tutorial
Won't you apply Laplace smoothing when the count behind P(L|O) is zero?
Sir, all the examples are of classification... and they're too good. But can you give examples of the same classifier with continuous data as the label, or regression-based predictive modelling?
great
Sir, please upload videos on...
an analytics project: communicating, operationalizing, and creating final deliverables.
Amazing explanation
It could also be a mango, right? XD
Thanks for explaining it so easily though!!
Thanks a lot broo🔥❤️❤️
Simple and easy explanation! Thanks, my online teacher!
I've gotten good at it now, teacher.
You're doing a great job. Do you think the row totals in the table are incorrect?
Simply brilliant! So glad I found your channel!
Thanks, sir, this video is helpful.
Dear Sir,
Can you please explain how the row totals of 650, 400, and 150 are arrived at?
They are just given in the dataset: although the total of yellow oranges and sweet oranges is greater than 650, that's fine, because the same orange can be both yellow and sweet, so the attribute counts overlap. 🙏
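To see why the attribute counts can exceed the class total: a single orange may be counted in both the Yellow and Sweet columns, so by inclusion-exclusion (with illustrative numbers, not necessarily the video's):

\[
|\text{Yellow} \cup \text{Sweet}| = |\text{Yellow}| + |\text{Sweet}| - |\text{Yellow} \cap \text{Sweet}|,
\qquad \text{e.g. } 350 + 450 - 150 = 650.
\]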
Brother, very nice and to the point, like maths. If you have a video on joint probability & conditional probability,
then please share the link.
Sir, how did we get the totals of orange, banana, and others as 650, 400, and 150? I was calculating manually across every row and the result turns out different. I might be wrong; please explain.
I'm confused about the same thing.
Sir, thank you for giving such an easy explanation, but please upload videos on the ID3, C4.5, and CART algorithms, smoothing, and diagnostics. Please, sir, the exam is tomorrow.