Monte Carlo Methods - VISUALLY EXPLAINED!

  • Published 23 Jul 2024
  • In this tutorial, I provide all the necessary background on how to use sampling methods to estimate distributions and compute expected values.
    I also provide an overview of 3 sampling methods -
    a) Inverse CDF Transform
    b) Rejection Sampling
    c) Importance Sampling
    I have dedicated tutorials on each of these methods, so if you need more in-depth explanations and the various proofs, watch the stand-alone dedicated tutorials. (A minimal sampler sketch follows below the description.)
  • Science & Technology
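
    A minimal sketch of the Inverse CDF Transform mentioned above (my own illustration, not code from the video; the exponential target and all names are assumed for the example), followed by a plain Monte Carlo estimate of an expected value:

      # Inverse CDF transform: push u ~ Uniform(0, 1) through F^{-1} to get
      # samples from the target distribution, here Exp(rate) with
      # F(x) = 1 - exp(-rate * x)  =>  F^{-1}(u) = -ln(1 - u) / rate.
      import numpy as np

      rng = np.random.default_rng(0)

      def sample_exponential(rate, n):
          u = rng.uniform(size=n)          # uniform draws on [0, 1)
          return -np.log(1.0 - u) / rate   # inverse CDF of Exp(rate)

      # Monte Carlo estimate of E[X] for X ~ Exp(rate=2); true value is 1/rate = 0.5.
      samples = sample_exponential(rate=2.0, n=100_000)
      print(samples.mean())  # approximately 0.5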

COMMENTS • 20

  • @mathewjones8891
    @mathewjones8891 4 months ago +2

    Thank you, again. I use Markov chains and Monte Carlo methods to model ion channel function and other things. I was foolish enough to believe that I had independently figured out what you describe as the Inverse CDF Transform. It is actually a relief to learn that there are already established principled theorems about this. So, maybe some of my publications were not barking up the wrong tree after all :-).

  • @Matteo-uq7gc
    @Matteo-uq7gc 3 years ago +2

    Great video! I subscribed, please keep them coming!!!

  • @ssshukla26
    @ssshukla26 3 years ago +2

    Another great video. 👍

  • @bikinibottom2100
    @bikinibottom2100 1 year ago +1

    This Bayesian regression series is incredible; the way it starts with a polynomial regression and then builds up to more general probabilistic modelling is brilliant and reminds me of Rasmussen's book on Gaussian processes. Are you going to make a video about kernels? That would nicely generalize the cosine example from part 1.

  • @bikinibottom2100
    @bikinibottom2100 1 year ago +1

    10:50 you wouldn't be wasting our time; this video about generating random numbers seems very interesting!

  • @bharadwajreddy7840
    @bharadwajreddy7840 3 years ago +1

    Waiting for kernels and Gaussian processes. All of your videos are amazing :)

    • @KapilSachdeva
      @KapilSachdeva  3 years ago +1

      Either you have hacked my account or you can read my mind :) ... I am planning to do those at some point, but I'm not sure if I will do it as part of this series.
      Jokes aside, thanks for your comment, and if possible share these videos with people who may be interested in this subject.

    • @bharadwajreddy7840
      @bharadwajreddy7840 3 years ago

      @@KapilSachdeva I'm currently working with Gaussian processes, so I've kind of been anticipating that video for a while :) and waiting to see your take on them; just being a fan-boy of this series and all your videos in general.
      I do share your videos; it's kind of sad that gold is being ignored by YouTube's algorithm :(

    • @KapilSachdeva
      @KapilSachdeva  3 years ago +1

      You will be happy to know that I was using GPs yesterday for my own project, hence the comment about you hacking my computer :) ... I think GPs deserve their own mini-series, so I will do them at some point.

    • @bharadwajreddy7840
      @bharadwajreddy7840 3 years ago

      @@KapilSachdeva yes, Non-parametrics are amazing :p

  • @dhimanbhowmick9558
    @dhimanbhowmick9558 1 month ago

    ❤❤🙏🏽🙏🏽🙏🏽 thanks and subscribed

  • @rileskebaili6145
    @rileskebaili6145 1 year ago

    When will there be a video about the theoretical aspects of score-based diffusion models?

  • @paedrufernando2351
    @paedrufernando2351 2 years ago

    I think importance sampling is used in Reinforcement Learning, and this large number of samples represents the large number of trials we train the Actors and Critics for. Is that so, or am I mistakenly confusing topics?

    • @KapilSachdeva
      @KapilSachdeva  2 years ago +1

      I don't know much about RL, so I can't really give any feedback on your statement.
      Think of it this way: wherever you need to compute expected values, you may end up using importance sampling.
      See this article (I have only given it a cursory look):
      jonathan-hui.medium.com/rl-importance-sampling-ebfb28b4a8c6
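
      To make that point concrete, here is a minimal sketch (my own illustration, not the author's or the article's code; the Gaussian target/proposal and the function f(x) = x**2 are assumed for the example) of estimating an expected value with importance sampling:

        # Importance sampling estimate of E_p[f(X)] using samples from a
        # proposal q, reweighted by w(x) = p(x) / q(x).
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # Target p = N(0, 1), proposal q = N(1, 2); f(x) = x**2, so E_p[f(X)] = 1.
        x = rng.normal(loc=1.0, scale=2.0, size=100_000)  # draws from q
        w = norm.pdf(x, loc=0.0, scale=1.0) / norm.pdf(x, loc=1.0, scale=2.0)
        print(np.mean(w * x**2))  # approximately 1.0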