NOTE: Conditional probabilities can be written two ways: a "long" version and a "short" version. In this video I use the long version (because it's easier for me to read)...
P(A and B | B) = P(A and B)/P(B)
...however, the short version is commonly used...
P(A | B) = P(A and B)/P(B)
Both mean the same thing, in other words: P(A and B | B) = P(A | B). The short version implies the "and B" part.
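A tiny numeric sketch of the note above, using a hypothetical contingency table (the counts are made up for illustration, not the exact numbers from the video), showing that the "long" and "short" versions compute the same value:

```python
# Hypothetical 2x2 contingency table of 14 people (made-up counts):
#                       loves soda   does not love soda
# loves candy                5               1
# does not love candy        4               4
total = 14
candy_and_soda = 5          # joint count for (A and B)
loves_soda = 5 + 4          # column total for B

p_a_and_b = candy_and_soda / total   # P(A and B)
p_b = loves_soda / total             # P(B)

# Long version:  P(A and B | B) = P(A and B) / P(B)
# Short version: P(A | B)       = P(A and B) / P(B)
# Both share the same right-hand side, so they are the same number:
p_long = p_a_and_b / p_b
p_short = p_a_and_b / p_b

print(p_long, p_short)   # both equal 5/9
```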
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Your version is much easier for non-statisticians like me :)
@@exploringdiversity572 bam!
omg this cleared up so much confusion i had from reading my text book where they were using the shorthand
I love how I need a video that seems designed for 7 year olds to understand my uni course.
Great vid thanks
Glad it was helpful!
I like the style. It's fun and nostalgic
this is secondary school level maths
bam!
I always thought I was incapable of learning math, but it had always intrigued me. I just discovered what was missing: a good teacher. Thank you very much.
bam!
I'm in the same boat as you man, love this channel
In India, due to competition for engineering entrance exams, teachers mostly focus on question types rather than concepts in depth. I'm glad there is a place where people learn for fun, not just for competition. I also appreciate teachers like you who can turn subjects into magic. Keep up the good work! From now on I'm permanent on this channel. You are the best!
Thank you very much!
I "stumbled" upon the contingency table method many years ago when trying teach myself the basics of probability. This approach is really the only approach for developing a solid understanding of conditional probability, as well as joint and marginal probability. I've never seen anyone present it this way and I'm so glad to see it here! Just more evidence that Josh Starmer is an absolute master instructor.
BAM! Thank you very much! :)
you know, me too.
Hey this table reminds me of confusion matrix from a ML classification problem
Very nice video for sure. Conditional probability is one of the topics students have the most difficulty with when they start studying stats! Great job!
Thank you! :)
this is how i wish people taught stuff, i really hope these ways of learning statistics get the recognition they deserve! please keep it up !!!!
Thank you!
I should have paid you instead of my useless lecturer at university! This is so much more useful, thank you!
Glad I could help!
Logical explanation, precision and genuineness - your way of teaching exemplifies these essential qualities. Thanks so much Josh for these great lessons. Invaluable effort!
Wow, thank you!
You are a GOD of simple explanations
Thanks!
Soda and Candy > Mechanical engineering degree. Thank you it was very helpful!
BAM! :)
Great video!
I had never seen the "long version" for writing conditional probabilities before, but it's easier to understand. Very intuitive!
Yes, I like it more, too.
Just about started crying when the cartoon-y jingle and graphics started, and felt a weight lift off my shoulders when you introduced the contingency table. I have zero background in prob and stats, always hated the subject, and was terrified it would be impossible for me to learn. None of the symbols made sense to me. The concepts gave me a headache. And I wouldn't even know where to begin asking questions, considering that no matter how much I seemed to "understand" what my professor was saying in class, the moment I saw a homework question: complete blank. I thought I was doomed to just memorizing formulas that made no sense to me, which isn't a great thing to say as an engineering student.
Thank you. I've been saved by your channel.
Happy to help! :)
erm what the sigma
Thank you Josh & the team. I'm so grateful I've found your channel. You make me truly love statistics!
Glad you enjoy it!
This is one of the best explanations of absolutely anything I've ever seen and heard. Well done. Subscribed.
Awesome, thank you!
I love these classes. They are so easy to understand. Thank you for sharing with us!
Thank you very much! :)
Struggled with this topic literally for years! You solved my problem within just 10 min.! 🎉
Bam! :)
Exceptional explanation... I'm Completely in love with your teaching style
Thank you!
I FUCKING HATE PROBABILITIES SO MUCH I ACTUALLY GAVE UP 3 TIMES LEARNING THE COURSE (THE COURSE IS HIGH QUALITY IM JUST VERY STUPID) IM SO SO SO GLAD IM STARTING TO GET IT FROM YOUR VIDEOS JOSH. YOU'RE DEFINITELY THE #1 GUY ON UA-cam.
thanks! :)
Thank you so much! This is the best video on conditional probability on internet.
Thank you!
THANK YOU! I've really been struggling to get an intuitive understanding of conditional probability and this has helped so much!
Glad it helped!
Never thought understanding conditional probabilities could be this easy … thanks prof 🎯
bam! :)
I get a dopamine rush every time this man says bam
bam!
I am forever in my uni lecturer's debt for being too lazy to explain and instead giving us the link to your index!! I need to learn stats as part of my Data Science Programming course, and you have been VERY helpful
BAM! Good luck with the course!
This is the best video explaining conditional probability. Hands down.
Thanks!
Great job Josh!!!!! This is such a clear and precisely laid out video perfect for my students.
bam!
Awesome. I always come here to get my doubts cleared. Love the way you teach. Lots of BAM!!!
Thanks!
A 6-year-old can understand this. That's genius
Thanks!
can't express how grateful I am to have stumbled across your channel!!!! I thought I was dumb, but not anymore haha
bam! :)
Incredibly well-explained. Thank you!
Thank you! :)
😭😭😭😭thanks a lot😭😭😭god bless you, u are doing a noble job, may you get all the riches, love and happiness in the world
Thank you very much! :)
You are literally saving me from my statistics exam! Thank you so much, your explanations are brilliant and you are really making me focus with your simple and funny ways. Thank you again!
Good luck! :)
You're the statistics buddha! The analogies and the calming voice!
Thanks!
Omgg I finally understood this
Bam!!
bam! :)
Hi Josh, I am a long-time fan of your channel, and all of the shameless self-promotion has guilted me into buying one of your study guides (LDA). They seem great, a nice supplement to your videos. Though I think you would benefit greatly from stripping down the ordering process; for online purchases, almost no one asks for complete addresses, phone numbers, and names anymore. I fought very hard with myself to finish it and part with that information, and I succeeded simply because I'm so grateful for the videos you provide.
You'd probably sell a lot more of these if you just asked for an email, like most do. Cheers
Thanks for the advice! I'll see what I can do to simplify the ordering process.
Finally, I understand this. BAM!!!!
Bam!
Thank you SO MUCH for clarifying that AB = A and B, when A and B are events. I get confused because XY is literally multiplication when X and Y are random variables. It can be doubly confusing because both events and random variables (even though they are different types) can be passed into P(), the probability function. Function overloading is a thing in computer programming, but I'm shocked to see it in math. Also, the implied A and B in P(A | B) is a lifesaver, many times over. BTW, understanding the tables in this video is pivotal in grasping marginal distributions, Bayes' Theorem, and joint probability distributions - it feels like I'm getting the other three concepts for free!
Triple bam!!! :)
Bam!! , what a bam person to explain everything about statistics so bamly!!
Thanks!
Bamks!
This channel is the bible of " Making 'Understanding Statistics' Easy !"
bam! :)
BAM! You just got a new subscriber 😂
Thank you!
Today I discovered your channel and it is truly a gem. As an EEE student trying to make sense of probability and neural network stuff, I believe I will benefit a lot from your work and will be thankful for the rest of my life. Statistics is especially challenging: things get messy a lot, and scary/unintuitive equations with confusing notations start to appear. I haven't watched all of your videos yet, but I hope you have made statistics less scary for those like me. My best regards
Awesome! Here's a page that lists all of my videos (and playlists etc): statquest.org/video-index/
This is how every tutorial should be!
Thanks!
You're incredible. You make everything seem so easy
Wow, thank you!
Bam! I like the way you teach.....keep up!....Bam Bam!😅
Thank you very much! BAM! :)
Happy Guru Purnima Sir, So blessed that this channel exists. ❤️🙏
Thank you! And thank you for supporting StatQuest! BAM! :)
❤️❤️
The people demand a Montecarlo Simulations StatQuest!!!
That would be nice.
I'll keep that in mind. We may get to something like it when I cover bayesian statistics.
That could be nice
First, I'll admit I had "Do you love orange soda?" from Kenan and Kel in the back of my mind. That said, this video has been immensely helpful, thank you!
:)
Awesome explanation. Thanks to you and Triple Bam members
Glad you liked it! BAM! :)
This helped me understand conditional probability. I am taking a mandatory intro to stats class with no math experience or passion. Thanks
Good luck!
"I hate using abbreviations"
And I love you for that
bam!
"statsquatch" I love it 😂
Also, I guess I never put together that you could use the probabilities to calculate conditional probabilities; super helpful
BAM! :)
This is amazing way of explanation!
Thanks!
Your way of teaching ! Bam!
Thanks!
Very well explained as usual!! BTW, do you have any plans for making videos on stochastic calculus topics like Brownian motion, Ito integral, martingale measures etc. If so, I am looking forward to them.
I'll keep those topics in mind.
👍👍👍👍
amazing googolplex BAM! Man, you did it again. BRAVO
Wow, thanks!
And BAM! I subscribed….such a clear explanation
Awesome, thank you!
It's interesting notation that you keep the superfluous condition on the left side of the pipe. I.e. using p(c=0 and s=1 | s=1) instead of p(c=0 | s=1). Do you find students understand it better this way?
Yes. To me it is much clearer.
After 4:40, I cried "that's enough" because it was relentless!!
bam
Excellent video..again ! So good to be hit by the triple bam !!! 😁
Hooray! :)
Thank you! Great content always🤩
My pleasure!
Well explained. A bit long, but it makes the concept as clear as possible. A big THANK YOU Josh.
Thanks!
Conditional probability : Decoded. BAM !!! Tbh Some of your videos ( Very few actually. For example p-value 😁) were not very clear to me ( Well that's just me ! ) in the first instance but I digested through them somehow. But with little study & focus or just give a break for a while, the 2nd or 3rd look at your video makes the concept super clear to me and I don't need to look elsewhere. Thanks for making such videos and making statistics/probability easier.
I'm glad you are able to understand the concepts.
Good video that cleared up a lot of my questions. The only thing I'm confused about now, though, is at 9:20: let's say we are given all the probabilities. How do I know if a probability is p(not love c and loves s | loves s) or p(loves s | not love c and loves s)? Confused because we know both the numerator and denominator. Thanks
Well, the second one, p(loves s | not love c and loves s), really doesn't make sense to calculate at all. This is because the conditional part, "not love c and loves s" is only a subset of "loves s", and thus, the conditional part does not give us any useful information about the larger "loves s" population.
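To make the reply above concrete: if you do compute p(loves s | not love c and loves s), the condition is a subset of "loves s", so the answer is always 1. A quick sketch with hypothetical counts (not the exact numbers from the video):

```python
# Hypothetical counts, for illustration only.
total = 14
not_candy_and_soda = 4   # people who do NOT love candy but DO love soda

# The joint event "loves soda AND (not love candy AND loves soda)" is just
# "not love candy AND loves soda", because the condition already guarantees
# the person loves soda.
p_joint = not_candy_and_soda / total
p_condition = not_candy_and_soda / total

print(p_joint / p_condition)   # always 1.0: the condition adds no new information
```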
@@statquest Thank you!
Thanks for this explanation..... It really helped me grasp this topic....
Bam!
What is the probability of a video being great GIVEN that we know it is made by Josh? Pretty close to 100%, I'd say!
BAM! :)
Incredible lecture!! Bamm😀
Thank you!
Your channel is amazing, I wish I had discovered it earlier! Would you ever consider making the slides available for sale? I would absolutely buy them.
I have some of them for sale here: statquest.org/statquest-store/
This was amazing and fun. Thank you!
Glad you enjoyed it!
SUPERB explanation... BAAAM !!!!!
Thanks!
Super easy to understand! Thanks a lot
Glad it helped!
Thanks a lot. Clearly explained as you stated.
Thank you!
i love you this actually helped so much
Thanks!
BAMMM!!!!
Great work
Can you make video about bayesian classifier because it's depending on conditional probability
I've already done that. Here's the link: ua-cam.com/video/O2L2Uv9pdDA/v-deo.html
Wow. Great explanation with nice song ❤️❤️
Thanks a lot 😊!
@@statquest welcome sir 🙏🏻
I was looking for something like this recently, thanks for somehow reading my mind XD (I guess that's some kind of probability?) BAM!
BAM! :)
Although I calculated conditional probabilities during college, I never got the concept of them, leaving a gap in my stats knowledge...
I stumbled onto your video and now have a basic understanding of it. I just need more examples of conditional probability to fully master this concept...
Thanks Josh for your videos, and hoping for more topics about stats and data analysis.
I'm glad my video was helpful! :)
Best explanation ever. Thank you! ❤️
Thank you! :)
This might sound like a silly question but, shouldn’t one expect that P(A | ~A) = 0, and vice-versa? However this conditional notation doesn’t encompass that
Actually, it does. What you are asking is "The probability that someone loves candy AND does not love candy given that we know that they love candy". This results in the equation p(loves candy & does not love candy) / p(loves candy) = 0 / (6/14) = 0.
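Plugging the reply's numbers in (6 of 14 people love candy, per the example above) to check that the formula really does give 0:

```python
total = 14
loves_candy = 6

# "loves candy AND does not love candy" can never happen,
# so its count (and probability) is 0.
p_joint = 0 / total
p_loves_candy = loves_candy / total

print(p_joint / p_loves_candy)   # 0.0, matching P(A and not A | A) = 0
```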
@@statquest thanks a lot prof for your clarifications!
Because of you, I knew this Concept "BAM" 😹😸
bam!
Hi Josh! I've been working with DEseq2 a lot. Is dispersion shrinkage on your TODO list for statquest? I realize it's a little more of a niche topic for bioinformatics than broadly applicable tho
I'll keep it in mind.
Great video, perfect explanation, marvelous song, as always :D
I would love to see a video about Generalized Linear Models( other than logistic).
Thanks and I'll keep that topic in mind.
👍👍👍
Thank you so much... Now it's clear to me. 😁🥰🥰
bam!
That's a great one. Thank you 😊
Glad you liked it!
thank you, uncle bam!
My pleasure!
Josh is the best
Thanks!
Yeh!
Good one Josh. Any plans of making videos on "Time series Models" ?
One day! :)
What should I choose between B.tech Cse with specialization in ai/ml engineering and B.tech Cse with specialization in data science?
The answer to that question really depends on your interests.
@@statquest But which one has more scope, or whose scope is not going to end soon?
It totally depends on your interests, man
Quadra BAMMMM ! ITS now clearly cleared in my mind
It's not clear? What time point, minutes and seconds, was confusing?
@@statquest Sorry sir, I meant to write "now" but by mistake it became "not".
Great!
Which software did you use to make this video (especially the animation)?
I give away all of my secrets in this video: ua-cam.com/video/crLXJG-EAhk/v-deo.html
If we take
Event A:Not loves candy
Event B:Loves soda
Then general formula is
P(A∩B|B)=P(A∩B)/P(B)
Is the above formula same as
P(A|B)=P(A∩B)/P(B)
(This is the formula I was taught)
Yes, they are the same. The formula that you were taught is harder to read than the one I use, so I don't use it.
@@statquest Ok thanks a lot :)
Yeah. What does upside-down-U mean?
I'm gonna go out on a limb and say that at 8:14, the formula on the bottom is the long form of the notation you used, and he's showing in this frame how the results are the same.
Yes
Hi ! I really like yours graphics and pictures ! What tool do you use ?
Keynote.
@@statquest thank you :)
Josh , Please start a series for mathematics concepts for machine learning. Such as probability, linear algebra, statistics, calculus etc.. It'll great help for us.
I have a bunch of statistics videos here: statquest.org/video-index/
So bloody good!
Thanks!
Hundred Bam!! Great explanation as always
Could you put the playlist link in the description? It would help us a lot, thanks.
Which playlist are you asking about?
Great job josh you are my favorite teacher !! , can you think about making a video about Multicollinearity? bam
I'll keep that in mind.
Any chance you could post a video on when to run regular linear regression vs. time series ARIMA?
I'll keep that in mind.
Thank you so much for this video! It saves my life hahaha
Thanks!
At 5:08, is it legitimate to ask "What is the probability that someone loves candy given that we know they love soda"? Without having the "and soda" condition. Or is the fact that it's "given/knowing they love soda" implies that it has to be "loves candy and soda"?
best video ever !
:)
hmmm, if I have x to predict y, how can I separate the internal variation and randomness in y from the part of y that x predicts?
It really depends on what type of thing "y" is. Is "y" continuous? Consider linear regression: ua-cam.com/play/PLblh5JKOoLUIzaEkCLIUxQFjPIlapw8nU.html If "y" is "true/false", then consider using logistic regression: ua-cam.com/play/PLblh5JKOoLUKxzEP5HA2d-Li7IJkHfXSe.html
@@statquest Thanks. My confusion comes after linear regression: how do I analyze X -> Y to tell whether Y can be predicted by X, or whether there is internal randomness in the variables?
@@portiseremacunix You can calculate R^2 between the two variables. For details, see: ua-cam.com/video/2AQKmw14mHM/v-deo.html