The Math Behind Bayesian Classifiers Clearly Explained!

  • Published 22 Nov 2024

COMMENTS • 92

  • @hayleyH997
    @hayleyH997 6 months ago +7

    How did he manage to explain something that a 1-hr lecture couldn't! Thanks mate

  • @pradyumnabada5118
    @pradyumnabada5118 1 year ago +11

    Dude.. I lost count of the videos I watched trying to understand this, but after seeing your video the struggle finally ended. Thank you so much!

  • @BrianAmedee
    @BrianAmedee 4 years ago +56

    'Clearly Explained' - and it actually was. Thanks man

  • @bluestar2253
    @bluestar2253 3 years ago +20

    One of the best explanations I've ever seen!

  • @jaster_mereel7657
    @jaster_mereel7657 3 years ago +29

    This was a very clear explanation indeed. Thank you!

  • @uncaged3076
    @uncaged3076 1 month ago

    Been struggling to grasp this topic but I finally hit that Eureka moment with this video. Thank you so much

  • @sye9522
    @sye9522 7 months ago

    HUGE thanks for perfectly delivering the whole concept in one video bro!!

  • @noname-anonymous-v7c
    @noname-anonymous-v7c 9 months ago +1

    9:37 you made the conclusion based on P(X=[0,2] | Y); I think the correct way is to calculate P(Y | X=[0,2]). In case P(Y=1) is very small, the answer can be Y=0.
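The commenter's point can be sketched as a toy calculation (all numbers below are made up for illustration, not taken from the video): deciding by the likelihood P(X | Y) alone can disagree with the posterior P(Y | X) ∝ P(X | Y) · P(Y) when one class is rare.

```python
# Hypothetical likelihoods and priors; Y=1 is deliberately rare.
likelihood = {0: 0.10, 1: 0.30}   # P(X=[0,2] | Y=y), made up
prior      = {0: 0.95, 1: 0.05}   # P(Y=y)

# Deciding by the likelihood alone picks Y=1 ...
by_likelihood = max(likelihood, key=likelihood.get)

# ... but the (unnormalized) posterior P(X|Y) * P(Y) picks Y=0,
# because the small prior P(Y=1) outweighs the larger likelihood.
posterior = {y: likelihood[y] * prior[y] for y in (0, 1)}
by_posterior = max(posterior, key=posterior.get)

print(by_likelihood, by_posterior)  # 1 0
```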

  • @radoyapanic998
    @radoyapanic998 2 years ago +1

    In the last part of the video you said we can fit a known distribution to a continuous set of data. However, you then wrote that the probabilities can be calculated by taking the product of the pdf evaluated at different values of the feature and label. The pdf does not provide probabilities, however; it needs to be integrated to give the probability of an event. This part of the video seems imprecise.
    However, the video in general was great. Thanks.
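To make the pdf point concrete: Gaussian naive Bayes does evaluate the fitted pdf at the observed value and uses those density values as likelihood stand-ins; they are not probabilities, but comparing them across classes still gives a valid decision rule. A minimal sketch with made-up means and standard deviations:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x -- a density value, not a probability."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical per-class fits for one continuous feature.
params = {0: (1.0, 0.5), 1: (3.0, 0.5)}   # class -> (mean, std)
prior = {0: 0.5, 1: 0.5}

x = 2.8  # observed feature value
scores = {y: prior[y] * normal_pdf(x, mu, s) for y, (mu, s) in params.items()}
pred = max(scores, key=scores.get)  # class whose fitted density is higher at x
```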

  • @hussamcheema
    @hussamcheema 1 month ago

    one of the best explanations of this topic. Thanks man

  • @miusukamadoto6805
    @miusukamadoto6805 2 years ago

    Thank you very much for the video. Clearly explained indeed, the only part I couldn't get completely was the discretization.

  • @high_fly_bird
    @high_fly_bird 1 year ago

    The explanation is so cool! But it would be even cooler if you added some examples with continuous features and fitting a distribution; this part wasn't so clear...

  • @aakashjuseja
    @aakashjuseja 2 months ago

    I just love this explanation

  • @lakshuperiakaruppan6777
    @lakshuperiakaruppan6777 3 months ago

    Good work with the visuals!!

  • @guangruli4486
    @guangruli4486 3 years ago +4

    Very clearly explained, thank you!

  • @jefersondavidgalloaristiza3410

    Very nice explanation and perfect illustrations!!

  • @RayRay-yt5pe
    @RayRay-yt5pe 3 months ago

    You did good my friend. I'm glad I came across this video

  • @lucasqwert1
    @lucasqwert1 1 year ago

    In the last part at minute 11: what is the function f to fit a known distribution? Thank you for answering!

  • @PritishMishra
    @PritishMishra 3 years ago +1

    If I search for any ML algorithm, I first check your channel to see if you have created a video on it... You are my first preference for ML/DL algorithm explanations. Just a request: please make videos on deep learning algorithms too, like CNN, RNN & LSTM "from scratch". It will really help people who want to become practitioners in AI, like me.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      Thank you so much ❤
      Writing CNNs and RNNs from scratch is pretty hectic...maybe some day I'll try.

    • @PritishMishra
      @PritishMishra 3 years ago

      @@NormalizedNerd Waiting... you are our only hope who can teach us the mathematics of ML with cool animation. That's why I requested you! Thanks.

  • @Nazmul-4u
    @Nazmul-4u 2 years ago +2

    LOVED IT!!!
    Awesome Explanation! Can't thank you enough...

  • @sobana653
    @sobana653 1 year ago

    Nicely explained!

  • @sayonsom
    @sayonsom 1 year ago +1

    Great explanation :)

  • @daniilsukhovv
    @daniilsukhovv 3 years ago

    bro, best explanation I could find

  • @sopegue
    @sopegue 2 years ago

    It was clearly explained, as mentioned in the title. Thanks a bunch !!!

  • @SihatAfnan-y6o
    @SihatAfnan-y6o 26 days ago

    Best Explanation

  • @leolei9352
    @leolei9352 2 years ago

    Very clear explanation!

  • @swethanandyala
    @swethanandyala 3 years ago

    very nice explanation thank you so much

  • @vojinivkovic9533
    @vojinivkovic9533 2 years ago

    great explanation

  • @arielalvarez88
    @arielalvarez88 4 years ago +1

    Really good work, congrats

  • @abduljeleelajibona2401
    @abduljeleelajibona2401 1 month ago

    Nice video! Thank you.

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 2 years ago

    This is really well explained.

  • @dpaul3447
    @dpaul3447 1 year ago +1

    Thank you so much man!!

  • @muhammadzubairbaloch3224
    @muhammadzubairbaloch3224 4 years ago +3

    Sir, please make more lectures.
    I keep coming back every few days to watch your lectures.
    Please make some advanced NLP and CV lectures, or AI lectures. Thanks

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +1

      I will try my best to upload more frequently.

  • @parisaghanad8042
    @parisaghanad8042 2 years ago

    That was great! I'm really glad that I found your channel. Thanks a lot 👍👍

  • @sayantansadhu6380
    @sayantansadhu6380 4 years ago +2

    It was like a revision for class 12 probability 😁😁

  • @imadeit6587
    @imadeit6587 3 years ago +1

    I appreciate your work

  • @dannysammy8972
    @dannysammy8972 2 years ago +2

    Yes, this was actually well explained. Thank you :)

  • @aurorasart9458
    @aurorasart9458 3 years ago +1

    Thank you very much for your work! Nice explanation!

  • @aymericalixe1310
    @aymericalixe1310 4 years ago +4

    Maybe I'm wrong, but I think the hypothesis is not that X1 and X2 are independent but that X1 and X2 are conditionally independent. It was very clear otherwise, thank you!

    • @NormalizedNerd
      @NormalizedNerd  4 years ago

      In naive Bayes every feature is treated as an independent feature; that's why it's called naive.

    • @chitranghosal879
      @chitranghosal879 1 year ago +1

      I think the hypothesis is that you assume each feature to be (w.r.t. other features)
      1) globally independent (in the global sample space)
      2) conditionally independent w.r.t. the occurrence of each class label (under the subset sample space where the particular class event has occurred)
      If these assumptions are not met, then it does not seem possible to build the mathematics, because as far as I see,
      if events A and B are independent, that does not naturally imply conditional independence between events (A|C) and (B|C)
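The distinction in this thread can be checked numerically. In the toy model below (made-up numbers), X1 and X2 are conditionally independent given Y, yet not marginally independent, so the two assumptions really are different:

```python
# Y ~ Bernoulli(0.5); given Y=y, X1 and X2 are i.i.d. Bernoulli(p[y]).
p = {0: 0.1, 1: 0.9}
prior = {0: 0.5, 1: 0.5}

def bern(x, py):
    return py if x == 1 else 1 - py

def joint(x1, x2):
    # P(X1=x1, X2=x2): sum over Y, using conditional independence given Y.
    return sum(prior[y] * bern(x1, p[y]) * bern(x2, p[y]) for y in (0, 1))

p_x1 = joint(1, 0) + joint(1, 1)   # P(X1=1) = 0.5
p_x2 = joint(0, 1) + joint(1, 1)   # P(X2=1) = 0.5

# Marginally dependent: P(X1=1, X2=1) = 0.41, but P(X1=1) * P(X2=1) = 0.25.
print(joint(1, 1), p_x1 * p_x2)
```

So the classifier only needs the conditional factorization P(X1, X2 | Y) = P(X1 | Y) · P(X2 | Y); global independence is neither assumed nor implied by it.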

  • @joaomatheusnascimentogonca7633
    @joaomatheusnascimentogonca7633 4 months ago

    10:51 How does this work? wouldn't the probability that Xi = xi be zero, given we're using a continuous distribution? Because of the "=" sign
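A standard way to resolve this question (sketched below with made-up parameters): P(Xi = xi) is indeed 0 for a continuous feature, so the expression is read as the probability of a tiny interval around xi, P(xi − ε < Xi < xi + ε | Y) ≈ 2ε · f(xi). The 2ε factor is the same for every class, so it cancels when classes are compared and only the density values matter.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

x, eps = 2.0, 1e-6
means = {0: 0.0, 1: 3.0}  # hypothetical class-conditional means, std = 1

# Interval probabilities P(x-eps < X < x+eps | Y=y) ~ 2 * eps * pdf(x).
approx = {y: 2 * eps * normal_pdf(x, mu, 1.0) for y, mu in means.items()}

# The class ratio is the same whether we use interval probabilities
# or raw density values -- the 2*eps factor cancels.
interval_ratio = approx[0] / approx[1]
density_ratio = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 3.0, 1.0)
```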

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago

    Great explanation.

  • @nikolai228
    @nikolai228 8 months ago

    Amazing video. thanks.

  • @adityaprasad3356
    @adityaprasad3356 2 years ago +1

    very helpful🥺🥺

  • @fmt2586
    @fmt2586 2 years ago

    hey, thanks man, very clear explanation.😀😀

  • @mehditavakoli2492
    @mehditavakoli2492 1 year ago

    Thank you!

  • @DANstudiosable
    @DANstudiosable 4 years ago +1

    Well explained, a quick revision of Naive Bayes. I forgot why it was called Naive until I watched this video 😂😂

  • @signature445
    @signature445 3 years ago +1

    Sir, is this like:
    Bayesian classifier deals with conditional probability?
    Naïve Bayes classifier deals with joint probability?
    Thanks in advance.....

  • @dzmitryk9658
    @dzmitryk9658 3 years ago

    Awesome! Thank you.

  • @hasben0
    @hasben0 1 year ago

    Well done👊👊

  • @AnasHawasli
    @AnasHawasli 9 months ago

    Great video man, great
    here is a sub

  • @prar_shah
    @prar_shah 3 months ago

    Love this

  • @telusukondifirstuu9221
    @telusukondifirstuu9221 3 years ago

    I love this explanation 😍🥰😘
    Thanks a lot ❤

  • @sumedha1051
    @sumedha1051 2 years ago

    love this!

  • @Fuktron13
    @Fuktron13 3 years ago

    I wish you were my professor

  • @nickgannon7466
    @nickgannon7466 2 years ago

    well done

  • @SarahGhiyasi
    @SarahGhiyasi 1 year ago

    Thank you, it was great.

  • @MrDaniel560
    @MrDaniel560 1 year ago

    HELPFUL!!!!

  • @quanghuynh1570
    @quanghuynh1570 1 year ago

    you saved me

  • @aditya.singh9
    @aditya.singh9 3 years ago

    truly amazing

  • @mahirjain8898
    @mahirjain8898 1 year ago +1

    so goood

  • @zouhir2010
    @zouhir2010 3 years ago

    thumbs up
    thanks

  • @Ilham-lj3me
    @Ilham-lj3me 1 year ago

    and how about Gaussian NB?

  • @plumSlayer
    @plumSlayer 1 year ago

    You areee Amazing. I love your Indian Bengali accent (just a guess hehe, make me a voice analyzer if I am right XD)

  • @10xGarden
    @10xGarden 4 years ago +1

    3b1b's bro is here

  • @kunalsoni7681
    @kunalsoni7681 1 year ago

    Nice ⭐⭐⭐⭐⭐

  • @harshitdtu7479
    @harshitdtu7479 7 months ago

    10:37

  • @pushandeb187
    @pushandeb187 1 year ago

    liked that

  • @abdulkarim.jamal.kanaan
    @abdulkarim.jamal.kanaan 3 years ago

    Hello people from the future! :D

  • @anon_148
    @anon_148 3 years ago

    independent moment

  • @mahedihassanrafin7493
    @mahedihassanrafin7493 1 year ago

    just quit confusing people

  • @davidmurphy563
    @davidmurphy563 1 year ago

    Ok, I've given up on the video after 45 secs. You said "stated clearly", if you hadn't I'd have kept watching.
    You point to an array of features called X. What are they? Are they features of the array itself (its size / rank / dimension?), are they features of the thing the array is describing (measurements in a house?), or a list of possible attributes (the ingredients on a pizza?) Then you introduce a label. So what, is this like a Python dictionary?
    Plus, I've no idea what sort of issue we're supposed to be tackling? Is it probability? Is it rationality with limited knowledge? I only guess that because I've heard of Bayes before.
    Instead you launch into calculations when I have not the first idea what you're calculating. Why would I listen to that?
    Tell you what, I'll give it another 30 secs. If there's no illustrative example / clear explanation of what the hell you're covering I'm gone.

    • @davidmurphy563
      @davidmurphy563 1 year ago

      Nope, 30 secs later and it's absolute horseshit.

  • @atulyadav9712
    @atulyadav9712 2 years ago

    Great explanation