Guys, just a small change in the formula: p(b|a) = p(a∩b)/p(a). Please consider this change.
Yes
Nice video. ...
yeah.. Saw that in the video... was about to tell... kudos!
Well explained....👍
Sir, are you from the south?
Sir, does Bayes' theorem use marginal probability?
At 4:05, it should be the probability of event A as 2/5 (because you have defined A to be the event where you get a black marble).
At 6:15, in the formula, the denominator should be P(A).
Also, one small thing that should be mentioned: in Bayes' theorem the events must be mutually exclusive and exhaustive (that is, they partition the whole sample space).
Yes, you are right.
@@krishnaik06 👍, thank you sir, you are really doing tremendous work.
P(B|A) = P(A ∩ B) / P(A)
@@krishnaik06 please flash the text in the video, mentioning it as a correction. Thank you.
Could you please explain what mutually exhaustive means? The explanations on the internet are very confusing.
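Since several of us are cross-checking the corrected formula, here is a small Python sketch. It assumes the video's bag holds 2 black and 3 white marbles, which matches the P(A) = 2/5 and P(B|A) = 1/4 figures discussed in this thread:

```python
from fractions import Fraction
from itertools import permutations

# Assumed bag from the video's numbers: 2 black ("B"), 3 white ("W") marbles.
bag = ["B", "B", "W", "W", "W"]

# All ordered draws of two marbles without replacement (20 equally likely).
draws = list(permutations(bag, 2))

p_a = Fraction(sum(d[0] == "B" for d in draws), len(draws))            # P(A): first draw black
p_a_and_b = Fraction(sum(d == ("B", "B") for d in draws), len(draws))  # P(A ∩ B): both black

# The corrected conditional-probability formula: P(B|A) = P(A ∩ B) / P(A)
p_b_given_a = p_a_and_b / p_a

print(p_a, p_a_and_b, p_b_given_a)  # 2/5 1/10 1/4
```

Brute-force enumeration like this is a handy way to sanity-check which of two candidate formulas is the right one.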
I have seen a lot of your lectures, sir, but I am writing for the first time. Sorry it took me time to comment, but I thought people like you must be appreciated for your efforts in helping data science enthusiasts at no cost. You make things very simple to understand and to the point. I appreciate it, sir. Really, thank you.
You are the man, you are such a gem on YouTube. Thanks, brother, for the video.
INTRO was dope bro....👍👍☢️
You really deserve to be called a good teacher.
Thanks, it's a good explanation, but I think the reference to P(Event A) and P(Black) is slightly confusing.
Bro, literally I was searching for this kind of comment, to see whether it is only me who is confused by this or anyone else 😂. The explanation is good, but I am fighting with my mind to understand the difference between P(Black) and P(Event).
Thank you. I only understood this today.
This is my first time seeing this video.
I like the way you teach, sir.
I love you, seriously. You are the best.
Thank you so much, sir. Your teaching is so clean; I am so satisfied watching this.
Incredible teaching
How did you know that I was searching for this? I was just searching for this topic on youtube, and at the same time, youtube notified me that you have uploaded a video on it.
Glitch in the matrix, I guess?😂
The same thing has happened with me also, a lot of times.
Good Video. Bayes theorem simplified
Very well explained. Thank you.
Very informative. I am very thankful to you. You are a source of inspiration for students and working professionals. I have been following your channel for quite a long time. Please make a video on the math intuition behind Gradient Boost and XGBoost; your ML playlist has a video on AdaBoost but not on the former two. Thanks again for your selfless efforts 🙏🏻🙏🏻
Preeti Bhatt, please share your email id.
Thank you sir!
It's clearly visible that you are really teaching by heart!
What is that supposed to mean?
excellent explanation.
Excellent way of teaching Sir
Everything got messed up because of the naming convention. Point: A and B are events; try different names so event B won't get mixed up with P(B), the probability of getting a black ball.
PS: the rest is very smooth. Thank you so much, sir, for the video xD.
Great explanation, Krish. Thanks!
At 6:46 you took P(B|A) = P(A∩B)/P(B), but later at 7:30, while deriving, you take P(A|B) = P(A∩B)/P(B). That's a mistake; you are actually changing the formula. You are wrong there.
This video is amazing, incredibly helpful!!! Thank youuuuuuu!
Hello sir. All your lectures are very helpful and easy to understand. Thank you for making such tutorials.
excellent explanation
Great explanation, sir. It is very helpful to me.
Well explained, bro.
Sir, please make a video on gradient checking and the Adam optimizer in the deep learning playlist. Most of my friends and I are waiting for that.
Adam Optimizer! I agree
Very informative ❣️... I will recommend your videos to our students as well.
Excellent...
Really helpful! Thank you❤
Thanks a lot, dear sir. Love you.
Thank you very much, sir 🙏🏼
Hi Krish,
It is a good explanation.
But I think, just as you showed one example of conditional probability, an example could also be added on Bayes' theorem (which is also called inverse probability).
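Seconding this request; for anyone who wants one right away, here is a minimal Python sketch of a Bayes' theorem ("inverse probability") example. The two-bag setup and its numbers are made up for illustration, not taken from the video:

```python
from fractions import Fraction

# Hypothetical setup: a bag is picked uniformly at random, then one marble is drawn.
# Bag 1 holds 2 black and 3 white; Bag 2 holds 1 black and 4 white.
prior = {"bag1": Fraction(1, 2), "bag2": Fraction(1, 2)}
likelihood_black = {"bag1": Fraction(2, 5), "bag2": Fraction(1, 5)}

# Total probability of drawing black (the denominator in Bayes' theorem).
p_black = sum(prior[b] * likelihood_black[b] for b in prior)

# Bayes' theorem: P(bag | black) = P(black | bag) * P(bag) / P(black)
posterior = {b: prior[b] * likelihood_black[b] / p_black for b in prior}

print(posterior["bag1"], posterior["bag2"])  # 2/3 1/3
```

The "reverse" part is visible here: the likelihoods go forward (bag to color) while the posterior answers the backward question (color to bag).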
Hi Krishna, I have been following you since last year. Your videos are very informative, concise, and helpful. My comment is not related to this particular video but to the channel overall. I have a request for a video answering the following question: how does the cost function for logistic regression differ from the cost function used for typical linear regression? In both cases, how does the calculation of parameter values depend upon minimizing the cost function?
Awesome sir
This guy's intro is FIREEEEEE
Thank you so much
Super sirrr😎
Thanks for this video, Krish.... Can you please make one video explaining terms like maximum likelihood estimation, log of odds, and the logit function? Please!
Bro, can you make a playlist for all the tutorials and for mathematics for ML/DS?
Thank you sir
Great video, Krish! Explained much better than by my well paid lecturers :)
Lovely explanation bro ! Thank you !
Oh man, you are really awesome. I just came across your videos and found them very easy to learn from. I like that you have created short videos for each and every topic, which makes it much easier to learn. I really appreciate it, sir. Amazing work. Subscribed and liked... will continue to do so, and I hope you will keep teaching us. Thank you, Sir Naik.
Eagerly waiting for next one..
Good explanation, though there are some mistakes, like in the formula you give first, where you read the intersection as 'and' and later as 'or'.
Hey please upload videos on SVM & Hinge loss.
New intro 👍
Coolest Intro I have seen in a while!👍🏻🙏🏻
It ripped my ears
Sir, I am also waiting for the SVM video. Please upload it.
The probability of black, P(B), is getting confused with the probability of event B, also P(B).
But both events are dependent, so why did you compute P(A ∩ B) as a plain product? Events A and B multiply as P(A)·P(B) only if both are independent, right?
Tqq ❤️ a lot ❤️
Love the new intro! Did you make it yourself?
Big fan
It's 1/5
Krish, thanks for sharing. Can I get some information about how I should choose the best algorithm before starting machine learning? What sorts of things do I have to evaluate before working with an algorithm?
Can you also please explain Bayesian regression models?
Sir conditional expectation also please
Awesome job Krish.... Nice to see your videos... Great Work...
Remember me from PUC , Philos?? :D
:) how have u been
@@krishnaik06 great bro... very nice to see your videos dude.. nice work 👏
Thanks @faaran
There is no such thing as "dependent events" merely because one thing depends on another, unless that has a precise meaning. Even to be called independent, there should be something to depend on; I mean, you cannot draw a clean demarcation between dependent and independent. Even dependent events use the same multiplication theory as independent ones. For example, a man of eighty has less chance of surviving than a healthy young man: that probability depends on his age. But the probability of his infatuation with a woman is likely the same, even though in principle it could depend on his age; while computing that probability you use the independent formula for the "dependent" one. For linguistic purposes, yes, you can say it. People have a misunderstanding about this: "independent events" is a mathematical notion, but "dependent" is just a word in English. I have never seen any book defining a "dependent event".
The problem I find in this lecture is that the black B and the event B confuse us.
Watch my video on Bayes' theorem to apply Bayes' theorem without formulae.
When can you teach Bayesian neural networks, please?
really goooodddddddddddd
In fact, be very careful with the explanation. It is a good one, but Krish unfortunately used the same letters for "event B" and the probability of taking a black, P(B). Therefore, in his explanation he sometimes uses P(A) as the probability of event A occurring (at 6:25), while P(B) = 2/5 is for him the probability of the first event occurring, taking a black (B) = event A.
nice
Intro 🔥 🔥
What is the difference between conditional probability and Bayes' theorem?
It's tricky how you define P(A|B)..... we know what P(B|A) is, but we do not know what P(A|B) means with respect to picking a black marble.
Whatever has already happened goes in the denominator: P(B|A) = P(A∩B)/P(A). Here P(B|A) means the probability of B given that A has already happened.
I am waiting for SVM for classification and regression...
Sir, can a mechanical engineer do data science?
What is the best data science course or certificate out right now? Nice video btw
Sir, please help me understand: P(B) is 1/4 when there is 1 black marble out of 4, and P(B|A) is the same scenario, where event A has occurred and we have 1 black marble out of 4 marbles. Then how will we differentiate between the two? Since we have done P(A)·P(B) = 2/5 · 1/4 = 1/10.
P(A ∩ B) = P(A)·P(B) only when the events are independent.
Here the events are dependent.
@@asn9329 But here P(A ∩ B) = 2/5 × 1/4 = 1/10 is what was done.
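A small sketch of what is going on in this thread (marble counts assumed from the video's numbers): 2/5 · 1/4 is P(A)·P(B|A), the multiplication rule for dependent events, and not P(A)·P(B):

```python
from fractions import Fraction

# Assumed from the video: 2 black out of 5 marbles; after one black is
# removed, 1 black remains out of 4.
p_a = Fraction(2, 5)          # P(A): first draw is black
p_b_given_a = Fraction(1, 4)  # P(B|A): second draw black, given the first was

# Multiplication rule for dependent events: P(A ∩ B) = P(A) * P(B|A)
p_a_and_b = p_a * p_b_given_a
print(p_a_and_b)  # 1/10

# The unconditional P(B) (second draw black, averaged over both cases of the
# first draw) is P(A)*P(B|A) + P(not A)*P(B|not A) = 2/5*1/4 + 3/5*2/4 = 2/5.
p_b = p_a * p_b_given_a + Fraction(3, 5) * Fraction(2, 4)
print(p_b)                     # 2/5
print(p_a * p_b == p_a_and_b)  # False: A and B are dependent
```

So the 1/4 in the video is the conditional probability P(B|A), not the unconditional P(B); that is exactly why 2/5 · 1/4 is the right product even though the events are dependent.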
Hi, Krish. Have you removed the NLP playlist?
It is getting revamped; it will be uploaded in a week.
Please upload support vector machines.
At 5:00, did you mean the B in P(B|A) as the event, or as the probability of picking up a black ball?
Sir, the SVM Kernel Intuition video is not available on your YouTube channel.
Taking A and B as both events and variables is TIGHT.
6:26 is wrong. It should be P(B|A) = P(B ∩ A)/P(A), not P(B|A) = P(A ∩ B)/P(B).
This is confusing. He says that P(B|A) is 1/4, but for him event A is taking a black and B is taking another black. Shouldn't it be P(A|A) = 0.25? Very confusing.
I have some confusion while solving an example: since we were using bag of words, we get some probabilities for each word. Sir, please clarify for me, with an example, how it works. Please.
Watch my video on Bayes' theorem to apply Bayes' theorem without formulae.
Hey Krish, I have a problem.
I am using a large dataset and it takes hours on my laptop.
When I use Google Colab, it keeps disconnecting after some time.
Any suggestions?
Use R
Can I say that conditional probability is always the probability of dependent events?
Yes
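Not quite: conditional probability is defined for independent events too; it just collapses to P(B|A) = P(B), so conditioning tells you nothing new. A tiny sketch with two fair coin flips (a hypothetical example, not from the video):

```python
from fractions import Fraction
from itertools import product

# Two fair coin flips: the flips are independent events.
outcomes = list(product("HT", repeat=2))  # 4 equally likely outcomes

p_a = Fraction(sum(o[0] == "H" for o in outcomes), len(outcomes))  # first flip heads
p_b = Fraction(sum(o[1] == "H" for o in outcomes), len(outcomes))  # second flip heads
p_a_and_b = Fraction(sum(o == ("H", "H") for o in outcomes), len(outcomes))

p_b_given_a = p_a_and_b / p_a
print(p_b_given_a == p_b)  # True: conditioning on A did not change P(B)
```

Dependence is exactly the case where P(B|A) differs from P(B), as with the marbles drawn without replacement in the video.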
You did nothing but confuse the theory.
It's always good to learn something new. However, when will I really use statistics? Simple pieces of math such as addition, subtraction, multiplication, division, etc. I use almost every day.
It will probably never be used in daily life, but this particular video is part of a machine learning video series, where you have to work with large data.
You should edit the video where it is actually wrong. It is confusing.
dont get nervous baby
Please reduce the volume of the intro BGM.
At 7:10 you are saying P(B|A) = P(A ∩ B)/P(B), but at 7:25 you have written just the opposite. Which one is correct?
P(B|A) = P(A ∩ B) / P(A)
Picking black two times: the first time without a condition (P(A)), the second time with a condition (P(B|A)).
Engineering?????
What's the name of the intro music?
Your math is WRONG!
You have started teaching it wrong.
Useless.
Thank you sir