Tutorial 47 - Bayes' Theorem | Conditional Probability - Machine Learning

  • Published 22 Aug 2024
  • In probability theory and statistics, Bayes' theorem describes the probability of an event based on prior knowledge of conditions that might be related to the event.
    Please join my channel as a member to get additional benefits like Data Science materials, live streaming for members, and more
    / @krishnaik06
    Please do subscribe to my other channel too
    / @krishnaikhindi
    If you want to give a donation to support my channel, the GPay ID is below
    GPay: krishnaik06@okicici
    Connect with me here:
    Twitter: / krishnaik06
    Facebook: / krishnaik06
    Instagram: / krishnaik06

COMMENTS • 131

  • @krishnaik06  4 years ago +131

    Guys, just a small change in the formula: P(B|A) = P(A∩B)/P(A). Please consider this change.
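
    A quick way to sanity-check the corrected formula, as a minimal Python sketch. It assumes (from the numbers quoted in the comments below) a bag of 5 marbles of which 2 are black, with A = "first draw is black" and B = "second draw is black", drawn without replacement:

        # Check P(B|A) = P(A∩B)/P(A) on the assumed marble example.
        from fractions import Fraction

        p_A = Fraction(2, 5)                         # P(A): 2 black out of 5
        p_A_and_B = Fraction(2, 5) * Fraction(1, 4)  # chain rule: first black, then 1 black of the 4 left
        p_B_given_A = p_A_and_B / p_A                # corrected formula

        print(p_A_and_B)    # 1/10
        print(p_B_given_A)  # 1/4, matching the conditional probability in the video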

  • @rajhiranandani1007 4 years ago +84

    At 4:05, it should be the probability of event A as 2/5 (because you defined A to be the event of getting a black marble).
    At 6:15, the denominator of the formula should be P(A).
    Also, one small thing that should be mentioned: in Bayes' theorem the events must be mutually exclusive and exhaustive (that is, the whole sample space is partitioned into the events).

    • @krishnaik06  4 years ago +15

      Yes, you are right.

    • @rajhiranandani1007 4 years ago

      @krishnaik06 👍 Thank you, sir, you are really doing tremendous work.

    • @arunkumaracharya9641 4 years ago

      P(B|A) = P(A∩B) / P(A)

    • @ankbala 3 years ago +1

      @krishnaik06 Please flash the correction as text in the video. Thank you.

    • @Vinay1272 1 year ago

      Please explain what mutually exhaustive means? The explanations on the internet are very confusing.
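
      For reference, "mutually exclusive and exhaustive" means the events form a partition: no two of them can happen together, and together they cover the whole sample space, so their probabilities sum to 1. A minimal Python sketch of the law of total probability over the partition {A, not A}, under the same assumed bag of 2 black and 3 white marbles drawn without replacement:

          # P(B) = P(B|A)P(A) + P(B|not A)P(not A) over a partition.
          from fractions import Fraction

          p_A = Fraction(2, 5)              # first draw black
          p_not_A = 1 - p_A                 # first draw white; together with A this covers everything
          p_B_given_A = Fraction(1, 4)      # 1 black left among the remaining 4
          p_B_given_not_A = Fraction(2, 4)  # both blacks still among the remaining 4

          p_B = p_B_given_A * p_A + p_B_given_not_A * p_not_A
          print(p_B)  # 2/5, the unconditional chance that the second draw is black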

  • @belimmohsin 3 years ago +7

    I have seen a lot of your lectures, sir, but I am writing for the first time. Sorry it took me this long to comment, but I thought people like you must be appreciated for your effort in helping data science enthusiasts at no cost. You make things very simple to understand and to the point. I appreciate it, sir. Really... thank you.

  • @ebewhitecaspian3402 3 years ago +7

    At 6:46 you took P(B|A) = P(A∩B)/P(B), but later at 7:30, while deriving, you take P(A|B) = P(A∩B)/P(B). That's a mistake; you are actually changing the formula. You're wrong there.

  • @amardeepsingh9001 3 years ago +13

    Thanks, it's a good explanation, but I think the switching between P(event A) and P(black) is slightly confusing.

    • @zufrankhan7793 1 year ago

      Bro, I was literally searching for this kind of comment to see whether I'm the only one confused 😂. The explanation is good, but I'm fighting with my mind to understand the difference between P(black) and P(event).

  • @jaheerkalanthar816 2 years ago +2

    You are the man, such a gem on YouTube. Thanks, brother, for the video.

  • @abhishek-shrm 4 years ago +2

    How did you know that I was searching for this? I was just searching for this topic on YouTube, and at the same time YouTube notified me that you had uploaded a video on it.

    • @parthjain7822 3 years ago

      Glitch in the matrix, I guess? 😂

    • @parthjain7822 3 years ago +1

      The same thing has happened to me too, a lot of times.

  • @vishwajithbarad 4 years ago +2

    Intro was dope, bro 👍👍☢️

  • @preetibhatt5085 4 years ago +3

    Very informative. I am very thankful to you. You are a source of inspiration for students and working professionals. I have been following your channel for quite a long time. Please make a video on the maths intuition behind gradient boosting and XGBoost; your ML playlist has a video on AdaBoost but not on the former two. Thanks again for your selfless efforts 🙏🏻🙏🏻

  • @nadirbelkebir7219 1 year ago

    This guy's intro is FIREEEEEE

  • @YashrajNigam 4 years ago +5

    Sir, please make a video on gradient checking and the Adam optimizer in the deep learning playlist. Most of my friends and I are waiting for that.

  • @shantakumariymr8601 3 years ago

    You really deserve to be called a good teacher.

  • @KenJee_ds 4 years ago +4

    Love the new intro! Did you make it yourself?

  • @devanshmesson2777 3 years ago

    Thank you, sir!
    It's clearly visible that you are really teaching from the heart!

    • @nad1ax2 3 years ago

      What is that supposed to mean?

  • @mandarvengurlekar4118 4 years ago +1

    Good video. Bayes' theorem simplified.

  • @nocode659 4 years ago +1

    I love you, seriously. You are the best.

  • @chiragcontractor3141 3 years ago +2

    But both events are dependent, so why did you compute P(A∩B) that way? Don't P(A) and P(B) multiply only if both are independent?
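
    For anyone with the same doubt: the general multiplication rule is the chain rule P(A∩B) = P(A)·P(B|A), which holds for any two events; P(A∩B) = P(A)·P(B) is only the special case where the events are independent. A minimal sketch with the assumed marble numbers (2 black, 3 white, two draws without replacement):

        # Chain rule vs. the independence shortcut.
        from fractions import Fraction

        p_A = Fraction(2, 5)          # first draw black
        p_B_given_A = Fraction(1, 4)  # second black given first black
        p_B = Fraction(2, 5)          # second draw black unconditionally (total probability)

        print(p_A * p_B_given_A)  # 1/10: correct P(A∩B) for these dependent draws
        print(p_A * p_B)          # 4/25: what independence would give; the mismatch shows dependence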

  • @shashpeiris6008 2 years ago

    Thank you. I only understood this today.

  • @rajanadhikari3741 4 years ago +1

    Hi Krishna, I have been following you since last year. Your videos are very informative, concise, and helpful. My comment is not related to this particular video but to your work overall. I have a request for a video answering the following question: how does the cost function for logistic regression differ from the cost function used for typical linear regression? In both cases, how does the calculation of the parameter values depend on minimizing the cost function?

  • @MrHawrociik 3 years ago +1

    Great video, Krish! Explained much better than by my well-paid lecturers :)

  • @youngers1498 2 years ago

    I'm new here, just seeing this video.
    I like the way you teach, sir.

  • @dhanyashreen9311 4 years ago

    Thank you so much, sir. Your teaching is so clean; I am very satisfied watching this.

  • @Anjali-wz7yt 1 year ago

    Incredible teaching....

  • @michellechen3054 3 years ago +1

    This video is amazing, incredibly helpful! Thank youuuuuuu!

  • @Neuraldata 4 years ago +1

    Very informative ❣️ We will recommend your videos to our students as well.

  • @NewITWorld 3 years ago

    Excellent way of teaching, sir.

  • @aratitodkar2055 3 years ago

    Hello sir. All your lectures are very helpful and easy to understand. Thank you for making such tutorials.

  • @mejavedjaved 3 years ago

    Oh man, you are really awesome. I just came across your videos and found them very easy to learn from. I like that you have created short videos for each and every topic, which makes learning much easier. I really appreciate it, sir. Amazing work. Subscribed and liked... I will continue to do so, and I hope you will keep teaching us. Thank you, Sir Naik.

  • @RajKumar-mv6om 3 years ago +2

    Bro, can you make a playlist of all the tutorials, and one for mathematics for ML/DS?

  • @pradnyavk9673 1 year ago

    Very well explained. Thank you.

  • @niranjansaha5135 1 year ago

    Everything got messed up because of the naming convention. Point: A and B are events; try different names so the event B doesn't get mixed up with P(B), the probability of getting a black ball.
    PS: everything else is very smooth. Thank you so much, sir, for the video xD

  • @sachink9102 2 years ago

    Hi Krish,
    It is a good explanation.
    But I think that, just as you showed one example of conditional probability, an example could also be added for Bayes' theorem (which is also called reverse probability).

  • @alphaadil4028 4 years ago +2

    Thanks for this video, Krish. Can you please make one video explaining terms like maximum likelihood estimation, log of odds, and the logit function? Please!

  • @sandipansarkar9211 3 years ago

    Great explanation, Krish. Thanks.

  • @Jawad_Ali876 2 years ago

    Thanks a lot, dear sir. Love you.

  • @abinashkumarsinha8958 2 years ago

    Excellent explanation.

  • @kumarssss5 4 years ago +1

    The probability of black, P(B), is getting confused with the probability of event B, which is also written P(B).

  • @amitdebnath2207 5 months ago

    Thank you so much

  • @sultanismail4970 3 years ago

    Great explanation, sir. It is very helpful to me.

  • @amitkhandelwal8030 4 years ago +2

    Sir, I am also waiting for the SVM video. Please upload it.

  • @techinicalocean8733 1 month ago

    Thank you sir

  • @anandhuded3647 4 years ago +1

    Eagerly waiting for the next one...

  • @kallamindrasenareddy8710 4 years ago +1

    Krish, thanks for sharing. Can I get some information about how I should choose the best algorithm before starting machine learning? What sort of things do I have to evaluate before working with an algorithm?

  • @Vinay1272 1 year ago

    Really helpful! Thank you❤

  • @vamshikrishna3376 3 years ago

    Excellent explanation.

  • @SayedAhmed-ic2hm 4 years ago +1

    Sir, can a mechanical engineer do data science?

  • @manishkatiyar2403 4 years ago +2

    New intro 👍

  • @chintukhan7873 2 years ago

    Well explained, bro.

  • @gulsanbor 4 years ago +1

    Hey, please upload videos on SVM and hinge loss.

  • @faaranmohd 4 years ago

    Awesome job, Krish... Nice to see your videos... Great work...
    Remember me from PUC, Philos?? :D

    • @krishnaik06  4 years ago

      :) How have you been?

    • @faaranmohd 4 years ago

      @krishnaik06 Great, bro... very nice to see your videos, dude. Nice work 👏

    • @krishnaik06  4 years ago

      Thanks @faaran

  • @noname-gh8pf 3 years ago

    There is no such thing as dependent events merely because one thing depends on another, unless that dependence has a precise meaning. Even to be called independent, there should be something to depend on; I mean you cannot always put a demarcation between dependent and independent. Even dependent events use the multiplication theorem. For example, a man of eighty has less chance of surviving than a healthy young man, so that probability depends on his age. But the probability of his infatuation with a woman is likely to be the same, even though one could say infatuation depends on his age. While computing that probability (infatuation), you use the independence formula for the dependent case. For linguistic purposes, yes, you can say "dependent"; people have misunderstandings about it. "Independent events" is a mathematical term, but "dependent" is just an English word; I have never seen any book define a "dependent event".

  • @datascience1019 4 years ago +1

    Coolest Intro I have seen in a while!👍🏻🙏🏻

  • @amanchaudhary8817 2 years ago

    Thank you very much, sir 🙏🏼

  • @anthonyraj100 4 years ago

    Lovely explanation, bro! Thank you!

  • @mukundkomati 4 years ago +1

    Can you also please explain Bayesian regression models?

  • @revantreddy4532 4 years ago +1

    What is the best data science course or certificate out right now? Nice video btw

  • @chil_vibezz8727 2 years ago

    When can you teach Bayesian neural networks, please?

  • @teetanrobotics5363 4 years ago +1

    Sir, conditional expectation also, please.

  • @ajithkumar0219 4 years ago +2

    Hi, Krish. Have you removed the NLP playlist?

    • @krishnaik06  4 years ago +2

      It is getting revamped; it will be uploaded in a week.

  • @shivamkala4105 4 years ago +2

    Hey Krish, I have a problem.
    I am using a large dataset and it takes hours on my laptop.
    When I use Google Colab, it keeps disconnecting after some time.
    Any suggestions?

  • @arjunregmi3486 1 year ago

    What is the difference between conditional probability and Bayes' theorem?
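
    Roughly: conditional probability is the definition P(A|B) = P(A∩B)/P(B), while Bayes' theorem applies that definition twice to invert a conditional, giving P(A|B) = P(B|A)·P(A)/P(B). A minimal sketch with the assumed marble numbers (A = first draw black, B = second draw black):

        # Bayes' theorem inverts a known conditional probability.
        from fractions import Fraction

        p_A = Fraction(2, 5)          # prior: first draw black
        p_B = Fraction(2, 5)          # evidence: second draw black (by total probability)
        p_B_given_A = Fraction(1, 4)  # likelihood: second black given first black

        p_A_given_B = p_B_given_A * p_A / p_B
        print(p_A_given_B)  # 1/4: chance the first draw was black, given the second was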

  • @lazytocook 1 year ago

    6:26 is wrong. It should be P(B|A) = P(B∩A)/P(A), not P(B|A) = P(A∩B)/P(B).

  • @anandhuded3647 4 years ago +1

    I have some confusion: while solving an example, since we were using bag of words, we get probabilities for each word. Sir, please clarify with an example how that works.

    • @sarth2668 3 years ago

      Watch my video on Bayes' theorem to learn how to apply it without formulae.

  • @JoseLuisNaranjoVillotaEc 2 years ago

    In fact, be very careful with this explanation. It is a good one, but Krish unfortunately used the same letters for "event B" and the probability of taking a black marble, P(B). Therefore, in his explanation he sometimes uses P(A) = probability that event A occurs (min 6:25), while P(B) = 2/5 is for him the probability of the first event occurring, taking a black marble, which is event A.

  • @ujjwalchettri1018 2 years ago

    Good explanation, though there are some mistakes, like in the formula you give first: reading the intersection as "and" at one point and later as "or".

  • @shilpask7990 2 years ago

    Super sirrr😎

  • @MrPrincemohanty 3 years ago

    Excellent...

  • @mathematics_infinity_pi2779 4 years ago

    Awesome sir

  • @dude5697 3 years ago

    At 5:00, did you mean the B in P(B|A) as the event B or as the probability of picking up a black ball?

  • @Nidhsgoyal 3 years ago

    Sir, please help me understand this: P(B) is 1/4 when there is 1 black marble out of 4, and P(B|A) describes the same scenario, where event A has occurred and we have 1 black marble out of 4. Then how do we differentiate between the two, given that we computed P(A)*P(B) = 2/5 * 1/4 = 1/10?

    • @asn9329 3 years ago

      P(A∩B) = P(A)·P(B) only when the events are independent.
      Here the events are dependent.

    • @Nidhsgoyal 3 years ago

      @asn9329 But here P(A∩B) = 2/5 * 1/4 = 1/10 is exactly what was done.
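
      The resolution of this thread: the 1/4 in the product is P(B|A), not P(B), so 2/5 · 1/4 is the chain rule P(A)·P(B|A) for dependent draws, not an independence product. A small simulation sketch under the same assumed bag (2 black, 3 white):

          # Monte Carlo check of P(A∩B) and P(B|A) for draws without replacement.
          import random

          random.seed(0)
          bag = ["B", "B", "W", "W", "W"]  # assumed: 2 black, 3 white
          trials = 200_000
          first_black = both_black = 0
          for _ in range(trials):
              first, second = random.sample(bag, 2)  # two draws without replacement
              if first == "B":
                  first_black += 1
                  if second == "B":
                      both_black += 1

          print(both_black / trials)       # about 0.10 = P(A∩B) = P(A)·P(B|A)
          print(both_black / first_black)  # about 0.25 = P(B|A); this is the 1/4, not P(B)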

  • @shelsiadaniel32 2 years ago

    really goooodddddddddddd

  • @goutamsamal1720 2 years ago

    You should edit the video where it is actually wrong. It is confusing.

  • @mks7846 4 years ago +1

    Intro 🔥 🔥

  • @SayedAhmed-ic2hm 4 years ago +1

    Engineering?????

  • @Syllerud 1 year ago

    This is confusing. He says that P(B|A) is 1/4, but for him event A is taking a black, and B is taking another black. Shouldn't it be P(A|A) = 0.25? Very confusing.

  • @ExplooreWithManoj 4 years ago

    Sir, the SVM kernel intuition video is not available on your YouTube channel.

  • @MrPrincemohanty 3 years ago

    I am waiting for SVM for classification and regression...

  • @arunkumaracharya9641 4 years ago

    It's tricky how you define P(A|B). We know what P(B|A) means, but we do not know what P(A|B) means with respect to picking a black marble.

    • @BehindTheLogics 3 years ago +1

      Whatever has already happened goes in the denominator: P(B|A) = P(A∩B)/P(A). Here P(B|A) means the probability of B given that A has already happened. ua-cam.com/video/v938yj5r3pA/v-deo.html

  • @tejavathnagaraju3403 3 years ago

    Thank you ❤️ a lot ❤️

  • @Miles2Achieve 4 years ago

    At 7:10 you say P(B|A) = P(A∩B)/P(B), but at 7:25 you have written just the opposite. Which one is correct?

  • @kishankunwar7409 4 years ago

    The problem I find in this lecture is that the black B and the event B confuse us.

    • @sarth2668 3 years ago

      Watch my video on Bayes' theorem to learn how to apply it without formulae.

  • @theshishir24 4 years ago

    Can I say that conditional probability is always the probability of dependent events?

  • @shawnfrost1220 4 years ago

    Please upload support vector machines.

  • @tusharjeena7081 1 year ago

    It's 1/5

  • @ManuPresannakumar 3 years ago

    Please reduce the volume of the intro BGM.

  • @dheerajsharma5492 3 years ago

    nice

  • @sathyavel8046 4 years ago

    Don't get nervous, baby.

  • @human-011 4 years ago

    Intro music name?

  • @KendraGraceT 3 years ago

    It's always good to learn something new. However, when will I really use statistics? Simple areas of math such as addition, subtraction, multiplication, division, etc., I use almost every day.

    • @sensei249 3 years ago +2

      It will probably never be used in daily life, but this particular video is part of a machine learning series, where you have to work with large amounts of data.

  • @chandreyeemukherjee4500 3 years ago

    You did nothing but confuse the theory.

  • @nackyding 2 years ago

    Your math is WRONG!

  • @adivenkimarya 2 years ago

    You have started teaching it wrong.

  • @diochen9199 3 years ago

    Thank you sir