Bayesian Network - Exact Inference Example (With Numbers, FULL Walk-Through)

  • Published 24 Jul 2024
  • Timestamps
    Relevant Equations - 0:12
    Brief Aside - 1:52
    Example Problem - 2:35
    Solution - 3:41

COMMENTS • 25

  • @Fakevomet
    @Fakevomet 2 years ago +4

    This video is saving my life right now

  • @chiaral8516
    @chiaral8516 3 years ago +11

    Thank you! This is very helpful having all the exact calculations, and going over all the relevant formulas up front is super. The calculations of the probabilities across the probability tables went a bit fast for me, but I'll break it down and study it tomorrow. This is the only full example I found after scouring the internet, looking in textbooks, and reviewing the lecture videos for the course I'm in.

  • @mhrumi8994
    @mhrumi8994 5 months ago

    Thanks a lot, I was looking for this all over the Internet for weeks. ❤

  • @harshsason3725
    @harshsason3725 8 months ago

    You are a life saver. Thanks

  • @mattanova
    @mattanova 2 years ago

    FINALLY! Thank you

  • @mohamedalaqel4570
    @mohamedalaqel4570 3 years ago +1

    Thanks! This is very useful

  • @CowboyRocksteady
    @CowboyRocksteady 1 year ago

    Excellent video on Bayes nets. Thank you

  • @prachimehta9839
    @prachimehta9839 2 years ago

    Amazing Video !

  • @CamelEnjoyer
    @CamelEnjoyer 4 years ago

    Wow, cool video John, very cool!

  • @janenie51
    @janenie51 1 year ago

    Thank you so much🎉🎉🎉

  • @OpeLeke
    @OpeLeke 2 years ago

    This video is loaded. Do you have another one where you go into details?

  • @tianlongwang7238
    @tianlongwang7238 1 year ago

    Hi there, I like your matrix solution so much; it makes the computation so easy. I also calculated the same question with a Naive Bayes classifier, which gives me an astonishingly close result of 0.28461. I want to investigate why an easier architecture can achieve a similar accuracy to a more sophisticated architecture. Do you, or anyone who sees this comment, have any ideas?

  • @reallyidrathernot.134
    @reallyidrathernot.134 2 years ago

    Mates! What do the capital letters instead of lowercase letters mean? E.g., P(a) turning into P(A)?

    • @dfreeze371
      @dfreeze371 1 year ago

      For anyone still confused by this: P(A) is the total probability distribution of A, which can then turn into two distinct outcomes P(a) and P(¬a)
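To make the notation concrete, a distribution P(A) over a Boolean variable can be sketched as a two-entry table; the numbers below are illustrative, not from the video:

```python
# P(A): the full distribution over the Boolean variable A,
# stored as a dict mapping each outcome to its probability.
# Illustrative numbers, not from the video.
P_A = {"a": 0.3, "not_a": 0.7}

# P(a) and P(not a) are the individual entries of that distribution:
p_a = P_A["a"]          # P(a)
p_not_a = P_A["not_a"]  # P(¬a)

# A valid distribution sums to 1 over all outcomes.
assert abs(sum(P_A.values()) - 1.0) < 1e-9
```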

  • @sakibahmed4317
    @sakibahmed4317 1 year ago

    why don't you have other videos

  • @bmyalbara2
    @bmyalbara2 3 years ago +2

    But how to define the values of probabilities used in this example or in any other example?

    • @froylanm.wbariomartinez470
      @froylanm.wbariomartinez470 3 years ago +1

      Excuse me, do you know how to define it now? Thanks

    • @bmyalbara2
      @bmyalbara2 3 years ago +2

      @@froylanm.wbariomartinez470 Either you have historical data from which you can extract the probabilities, by dividing the number of times a specific event happens by the total number of events, or, in case you do not have data, you can ask experts in a specific field to provide their assessment of the probabilities. Read about Expert Judgement.
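The frequency-counting approach described above can be sketched as follows; the event records are hypothetical, not data from the video:

```python
# Estimate P(event) from historical data by relative frequency:
# (times the event occurred) / (total number of observations).
# Hypothetical records, purely for illustration.
records = ["rain", "sun", "rain", "sun", "sun", "rain", "sun", "sun"]

def estimate_probability(records, event):
    """Relative-frequency estimate of P(event)."""
    return records.count(event) / len(records)

# 3 occurrences of "rain" out of 8 records -> 0.375
p_rain = estimate_probability(records, "rain")
```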

    • @trouvaille5965
      @trouvaille5965 3 years ago

      @@bmyalbara2 Thank you so much

  • @niharamzan382
    @niharamzan382 2 years ago

    Why have you ignored alpha in your computation?

    • @adityachawla7523
      @adityachawla7523 2 years ago +1

      You can just normalize in the end to account for alpha
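The normalize-at-the-end trick can be sketched like this: compute the unnormalized entries alpha·P(Q, e), then divide each by their sum, so alpha never has to be computed explicitly. The numbers are illustrative, not from the video:

```python
# alpha is just 1 / (sum of the unnormalized entries), so instead of
# computing it up front we normalize at the end.
# Unnormalized entries proportional to P(Q, e): illustrative numbers.
unnormalized = [0.059, 0.221]

total = sum(unnormalized)                    # equals 1 / alpha
posterior = [v / total for v in unnormalized]

# The normalized posterior sums to 1, as any distribution must.
assert abs(sum(posterior) - 1.0) < 1e-9
```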

  • @OLink9
    @OLink9 1 year ago +1

    Woah, is this Brad Pitt?

  • @battlemode
    @battlemode 2 years ago +1

    Really helpful, thank you