Maximum A Posteriori Estimate (MAP) for Bernoulli | Derivation & TensorFlow Probability

  • Published Oct 6, 2024

COMMENTS • 10

  • @Mrt00mm
    @Mrt00mm 2 years ago +6

    This was the perfect explanation that I needed. Thank you infinitely for the derivation :D

  • @edizferitkula9920
    @edizferitkula9920 2 years ago +1

    Awesome video, thank you very much. I have been watching videos about this topic, and this one was the best explanation by far.

  • @josht7238
    @josht7238 1 year ago +1

    Thank you so much, great explanation!!

  • @vipingautam9501
    @vipingautam9501 2 years ago +1

    Hi, what does theta raised to the power "i" imply here? It would be helpful if you could explain. Sorry for this silly question.

    • @MachineLearningSimulation
      @MachineLearningSimulation  2 years ago +3

      Hey, thanks for the question :)
      Do you have a particular point in the video?
      If you are referring to "raising a variable to [i]", then this means, in the nomenclature of the video, using the i-th sample. It's an index rather than an exponent.
      Imagine a dataset with 100 observations of the weather; then w^{[i]} refers to how the weather was on day number i. For example, w^{[13]} = good.
      Hope that helped :)
      Let me know if it is still unclear.
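      The indexing described above can be sketched in Python (a minimal sketch; the variable names and toy data are assumptions, not from the video):

      ```python
      # Toy dataset of 100 daily weather observations (hypothetical data,
      # not from the video). In the video's notation, w^{[i]} denotes the
      # i-th sample, so this is list indexing, not exponentiation.
      weather = ["good" if day % 3 else "bad" for day in range(1, 101)]

      # w^{[13]}: the observation on day number 13 (1-based, as in the video)
      w_13 = weather[13 - 1]
      print(w_13)  # → good
      ```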

    • @vipingautam9501
      @vipingautam9501 2 years ago +2

      @@MachineLearningSimulation Thank you so much for such a prompt response. I had the same doubt, which you have now explained. All good now, thanks once again.
      Please keep uploading such videos, you've done amazing work. Very few videos out there give such beautiful insight.

    • @MachineLearningSimulation
      @MachineLearningSimulation  2 years ago

      @@vipingautam9501 Thanks so much :)