No concept is too difficult to understand if it's explained in a way that can be comprehended. Great job Luis! I keep coming back to your videos whenever I am stuck. Your style of explanation with examples is amazing.
You are the coach who could teach students two weeks before the exams and get them all to earn distinctions. I only wish you would take up master classes; people would be pouring in to sign up for them.
Such a brilliant explanation. Thank you Luis. Kindly add more lectures on traditional ML related topics.
This is the best explanation of Naive Bayes & Bayes' theorem... you rocked it... Thanks for this
Great Video!
This is the best explanation of Naive Bayes I have seen or read. Better than Bishop. It starts with a crystal clear, intuitive example and concludes with a thorough explanation of how to apply Bayes' theorem. Thank you so much for this.
Appreciate the way (visualization) you explained a more complicated concept.
First time ever I understood this Naive Bayes. Thank you so much
one of the best videos on Naive Bayes
Fantastic explanation! Such amazing teaching skills! I wish every teacher was like you! Great work and thank you!
Gemini: This video is about the Naive Bayes classifier, illustrated with a spam detector based on Bayes' theorem.
The video uses an example of building a spam detector to illustrate the concept. The idea is that we can classify an email as spam or not spam based on the presence of certain words in the email. For instance, emails containing the word "buy" are more likely to be spam than those which do not contain "buy".
Bayes' theorem allows us to calculate the probability of an event (e.g. an email being spam) given another event (e.g. the email containing the word "buy"). The video uses a simple example with two properties (presence of "buy" and presence of "cheap") to illustrate this concept.
However, the challenge arises when we want to consider more than two properties at the same time. Ideally, we would like to calculate the probability of an email being spam given the presence of all the properties we are considering (e.g. "buy", "cheap", "work").
But calculating the probability of all these properties appearing together becomes cumbersome as the number of properties increases. This is where the Naive Bayes assumption comes in: Naive Bayes assumes that all these properties are independent of each other. This assumption, although not always true, simplifies the calculation significantly.
The video concludes by explaining how the Naive Bayes classifier works with this assumption and shows how to calculate the probability of an email being spam given multiple properties.
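The product-of-probabilities calculation this summary describes can be sketched in a few lines of Python. The counts below are made up for illustration and are not taken from the video:

```python
# A minimal Naive Bayes spam score. Hypothetical counts, not the video's numbers.
# Naive assumption: P(words | class) is the product of the per-word P(word | class).

def naive_bayes_spam_prob(words, counts, n_spam, n_ham):
    # counts[word] = (spam emails containing word, ham emails containing word)
    p_spam = n_spam / (n_spam + n_ham)   # prior P(spam)
    p_ham = n_ham / (n_spam + n_ham)     # prior P(not spam)
    for w in words:
        spam_c, ham_c = counts[w]
        p_spam *= spam_c / n_spam        # multiply in P(word | spam)
        p_ham *= ham_c / n_ham           # multiply in P(word | not spam)
    return p_spam / (p_spam + p_ham)     # normalize (Bayes' theorem)

counts = {"buy": (20, 5), "cheap": (15, 10), "work": (5, 30)}
p = naive_bayes_spam_prob(["buy", "cheap"], counts, n_spam=25, n_ham=75)
```

With these made-up counts, an email containing both "buy" and "cheap" comes out at about a 95% probability of spam.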
Thanks Luis. This was a lot easier to follow than most of my profs, to be honest. The fact that you explained first and then put it in equation terms helps me remember the equation and understand it better. Many, many thanks, and God bless!
So much clearer than my professor explaining it for 80 minutes.
My first encounter with your teaching style was in the PyTorch Udacity Challenge. I loved it, and I've been following you since then.
Amazing, been researching this for a while and the way you break this down really gets through
Beautifully and clearly explained....Thank You sir.
Explained intuitively! Thanks! :)
Best explanation. Thank you.
Thank you so much for the crisp and clear explanation !!
Thanks for your detailed and friendly explanation. It really helps me a lot :)
I think the probabilities you picked might be a bit confusing, didactically: P(S|B) happens to be equal to P(B|S) in the Bayes formula at 17min.
Thanks, it's the best explanation!
Awesome explanation...👍
Amazing explanation, thank you for making my life easier.
Thank you as always. Top tier stuff.
Great explanation
Kindly Make a video on Expectation Maximization
Excellent explanation..
Beautiful and Effective.
Thank you!
Thanks for the episode
thank you! its really well explained and good animation. please do a video on Generalized linear model.
Thanks A LOT! great video!
dude.. you are amazing
At 6:55, how did you conclude 0.5%? Please make it clear.
Thanks
excellent! best video out there :) many, many thanks
Thx a lot
the best of best
Perfect
1:31 I counted 80 non-spam emails; you should have chopped off one column there!
How is 10% of 5% equal to 0.5%?
Probability that he is the god =100%
Wow!!!!! Great
Hi Luis, it would be great to see you publish videos more frequently and yes, could you please tell how any of us could reach out to you be it email or personal message
Thank you! Definitely, the easiest is to add me on LinkedIn: linkedin.com/in/luisgserrano/ or you can also see my details on the "About" page of the channel.
@SerranoAcademy thank you, adding you on LinkedIn
Why is it assumed that 0.5% of the words contain "buy" and "cheap"?
Could you explain how you calculated "0.5% 'buy' and 'cheap" (at 6:52)?
Abhimanyu tiwari I guess 5% of 10%
(5/100) * 10 in percentage units, which is 0.5%. Good luck!
You sound like Heath Ledger alias Joker :)
You don't get it if you don't know how to explain it... great video
The two people who disliked were looking for baes, but got Bayes.
haha! Imagine how they would have felt by end of the video!
they were naive.
😂
This is explained so well, this video is so beautiful that I want to cry
same here
I am crying as I type this line...... *snif* .....so good !
"So if you like formulas..." OMG! Thank you so much, Dr. Serrano. You helped my brain find the missing piece in my puzzle. The whole explanation was so clear, but the formula helped me transition from Bayes to Naive Bayes. I was looking for the missing piece on YouTube and somehow landed on your video. I actually came here after attending your AIND class.
It's been around 8 months; I'm moving towards ML, and your guidance and teaching strategy are playing a major role in it.
I can't simply say thank you.
Stay blessed.
Thank you, that's really nice to hear! Keep up the good work in ML!
Maybe I missed this part in the video, but Naive Bayes assumes only conditional independence. For example, this training set suggests that the words "Buy" and "Cheap" are far from being (unconditionally) independent. Namely, P("Buy")=P("Cheap")=25/100=1/4. So, if the two words were independent, we would expect P("Buy" and "Cheap")=1/16=6.25%. However, there are 12 emails containing both words out of 100, which is 12%.
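The arithmetic in this remark checks out; here it is as a quick Python sanity check, using exactly the numbers from the comment:

```python
# Numbers from the comment above: 100 emails, 25 contain "Buy", 25 contain
# "Cheap", and 12 contain both.
p_buy = 25 / 100
p_cheap = 25 / 100
if_independent = p_buy * p_cheap      # 1/16 = 6.25% if the words were independent
observed = 12 / 100                   # 12% of emails actually contain both

# The observed joint frequency is almost twice the independent prediction,
# so "Buy" and "Cheap" are clearly not (unconditionally) independent.
```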
I have exactly the same remark
Beautiful Luis. You clearly draw the distinction between an educator and an instructor.
Great Explanation for Bayes Theorem, I have never understood naive Bayes so well....Thanks for this Luis
Awesome session.. Thanks a million!! God Bless..
Thank you very much for this video. I've spent days trying to work out the intuition for how to apply Naive Bayes to spam detection, but all the other videos just repeat the Bayes probability formula and show you the answer. Formulas give you zero understanding unless you figure out the logic behind the approach, and only then do they become useful.
This is really the best explanation of Naive Bayes... it beats even Andrew Ng's and many others'...
Hey Luis, can you explain Bayes' theorem with the Laplace correction applied when a conditional probability is zero?
Just add 'alpha' to the numerator and 'k' * 'alpha' to the denominator for each class probability, where k is the number of classes (here binary, so 2) and alpha is typically a hyperparameter (varying between 10^-3 and 1).
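A minimal sketch of that correction in Python, following the formula exactly as stated in the comment above (for word-given-class probabilities, the vocabulary size is often used in place of k, but the idea is the same):

```python
# Laplace (additive) smoothing: add alpha to the count and k * alpha to the
# total, so a probability estimate is never exactly zero.
def smoothed_prob(count, total, alpha=1.0, k=2):
    return (count + alpha) / (total + k * alpha)

# A word never seen among, say, 25 spam emails still gets a small nonzero
# probability instead of wiping out the whole product:
p_unseen = smoothed_prob(count=0, total=25, alpha=1.0, k=2)   # 1/27, roughly 0.037
```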
Thank you, Luis. Your classes are amazing, keep the good work.
Best regards from Brazil.
Great explanation...It was very easy to understand Naive Bayes ...Thank you very much for this video...!!
Came here from "codebasics" youtube channel.
Pretty amazingly explained by you, man. Thanks a lot.
Great explanation! Can someone please explain how we get P(B/S) = 20/25 at 17:45?
Really amazing sir ❤ love from India, watching your videos on 3G internet 😅
First I saw the video from 3b1b then from statquest. Both of them are great videos. But I was not able to find a connection between them. Your video helped me to connect all the dots
I love these easy explanations, though even when it's this easy I can't connect them to the formulas and material I read before. It would have been great if you'd done that too.
Thank you for such a clear and well-structured explanation!
Wow ... I love the way you present this topic. Thank you very much.
Absolutely fantastic explanation. Thank you so much for this.
Explained with great simplicity, thanks for this!
thank you for excellent explanation !
This video is SOLID!!! Thank you so much for this and please keep making more videos! You made this concept so digestible it is not even funny.
Indeed! Best explanation of Naive Bayes & Bayes' theorem.
Finally, I noticed that there is a difference between Bayes theorem and Naive Bayes.
Really great 🙏 my understanding is much better now 😊
awesome explanation. thank you very much :)
Great video dude, helped a lot. Thank U.
Thank You for this video. You are an inspiration ❤️
Wow, that was a wonderful explanation. Thanks!
This is really good! Thank you so much for your time and effort to make these topics accessible to the masses 🙂
Best Naive Bayes explanation ever! thank U Luis
Would love to see you explain the Kalman filter.
Please, could you send a link to download this presentation?
This is really FRIENDLY. Thank you!
your video should be the first result of "naive bayes"
Thank you so much, Professor Serrano! =)
I've seen many explanations on YouTube regarding Naive Bayes, most of them from channels that I really appreciate, but your explanation is the best one by far. Thank you so much for making the link between the logic and the Bayes formula!
Amazing breakdown! I liked how you made it visually first and gradually turned it into the formula. That really made it click in my head!
Holy, what a good explanation of the concept, damn dude. Thanks a lot!
Excellent explanation. Thank you 🙏
Very well done thank you
very clear explanation, thank you!
The people who unliked this are the spams :P
Great explanation , perfectly paced.
no thank you Luis you are an excellent educator
Amazing explanation so far... I am watching this in the morning and you literally made my morning! I have one question (after that I'll understand it perfectly): after training your Naive Bayes model, the output of the whole model is in terms of probability, i.e., P(S/(buy and cheap)) and P(not S/(buy and cheap)), right? If yes, what happens in the testing phase of the Naive Bayes model? I mean, how does this work during the testing phase? Eagerly waiting for your reply, sir :)
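(Not from the video, but a sketch of one common answer to this question: the trained model is just the stored prior and per-word probabilities, and at test time you evaluate the same product for each class on the new email's words and take the class with the higher score. The numbers below are hypothetical.)

```python
import math

# Hypothetical trained model: class priors and P(word | class) estimates.
model = {
    "prior": {"spam": 0.25, "not_spam": 0.75},
    "likelihood": {
        "spam":     {"buy": 0.8,    "cheap": 0.6},
        "not_spam": {"buy": 1 / 15, "cheap": 2 / 15},
    },
}

def predict(words, model):
    scores = {}
    for cls, prior in model["prior"].items():
        # Work in log space so products of many small probabilities don't underflow.
        score = math.log(prior)
        for w in words:
            score += math.log(model["likelihood"][cls][w])
        scores[cls] = score
    return max(scores, key=scores.get)   # class with the highest posterior score

label = predict(["buy", "cheap"], model)
```

For this particular model, `predict(["buy", "cheap"], model)` returns "spam", since the spam product dominates.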
Thank you very, very much; I would have liked your video twice if it were possible. It's so clear and plain that after a while I came back to it again to review Naive Bayes.
Thomas Bayes wouldn’t have explained it better !! Thank you for this explanation 👏🏼👏🏼👏🏼
GREAT explanation. But it didn't click for me until the 2nd half. Stick with it folks....and thank you, Professor Serrano!
Thank you so much for this video, Luis! Respect!
This is a beautiful explanation!
Excellent video. So well explained!