Over 9 years on, this video is still helping. Thank you so much, Jarad!
Great stuff, the notation helped so much versus numerous books/materials that do not even bother to mention that x_j should be sampled from the density f(x) and then plugged into h(x). 5 stars for that!
Thanks a lot, that was on point as it was needed to end my Variational Autoencoder derivation
Excellent, excellent video.
Thanks! Very instructive video!!
This was very helpful!
This helped a lot!
Thank you Dr. J
excellent!
Thanks, Mr. Niemi, great content! Does anybody know how to plot that in Python? Much appreciated.
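One way to sketch this in Python (a minimal sketch assuming the Normal example tracks the running Monte Carlo estimate of E[h(X)] with h(x) = x and X ~ N(0, 1), true value 0; swap in your own h and sampler — the plot itself needs matplotlib):

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed setup: estimate E[h(X)] with h(x) = x, X ~ N(0, 1) (true value 0).
n = 10_000
x = rng.normal(loc=0.0, scale=1.0, size=n)  # draw x_j from the density f
h = x                                       # plug the draws into h

# Running Monte Carlo estimate after j = 1, ..., n samples.
running_estimate = np.cumsum(h) / np.arange(1, n + 1)

print(f"final estimate after {n} draws: {running_estimate[-1]:.4f}")

# Optional: plot the estimate versus sample size, like the video's figure.
try:
    import matplotlib.pyplot as plt
    plt.plot(running_estimate, label="MC estimate")
    plt.axhline(0.0, color="red", linestyle="--", label="truth")
    plt.xlabel("number of samples j")
    plt.ylabel("estimate")
    plt.legend()
    plt.savefig("mc_estimate.png")
except ImportError:
    pass  # plotting is optional; the estimate above is already computed
```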
thanks for the great video!
The video was very useful... but how did you calculate the true value?
You can use complex analysis (residue theorem).
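If you'd rather check numerically than work through the residue theorem, the "true value" can come from deterministic quadrature, which you can then compare against the Monte Carlo answer. The video's exact integral isn't shown here, so this sketch uses a stand-in integrand, ∫ from -1 to 1 of 1/(1+x²) dx = π/2, purely for illustration:

```python
import math
import numpy as np

def integrand(x):
    # Stand-in integrand for illustration; replace with the one from the video.
    return 1.0 / (1.0 + x**2)

# "True" value via a dense trapezoidal rule on [-1, 1].
grid = np.linspace(-1.0, 1.0, 100_001)
y = integrand(grid)
true_value = np.sum((y[:-1] + y[1:]) / 2 * np.diff(grid))  # ~ pi/2

# Monte Carlo check: x_j ~ Uniform(-1, 1), h(x) = (b - a) * integrand(x).
rng = np.random.default_rng(0)
draws = rng.uniform(-1.0, 1.0, size=100_000)
mc_estimate = 2.0 * integrand(draws).mean()

print(f"quadrature: {true_value:.5f}  MC: {mc_estimate:.5f}  exact: {math.pi/2:.5f}")
```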
Very helpful video
I wish there were a Coursera course or something about this.
lost me at "Hi"
very good video.
Hi,
Thank you very much for the video.
I have not understood why the estimate is so far from the truth in the Normal example (min 7:40). Can anyone give me a hint?
Thank you!
"The estimate being so far from the truth" is a matter of perspective. From that plot, I could also say the estimate is close to the truth. If you continue on in the video, there is a discussion of determining the Monte Carlo error which quantifies "how far" you are. But since this is a Monte Carlo approach, there will always be some probability that you end up "far" from the truth, but that probability decreases with sample size.
You saved my life! Haha
Thank you so much
why not divide by j-1 in the variance estimation at 3:05?
Jonnemanne, by dividing by j-1 you get an unbiased estimator, but dividing by j yields the MLE estimator of the variance. As stated in the video, it's just one of the many possible variance estimators you can use.
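The two estimators differ only by a factor of j/(j-1), which vanishes as j grows. A quick check (a sketch with made-up draws standing in for the h(x_j) values):

```python
import numpy as np

rng = np.random.default_rng(0)
j = 1_000
h = rng.normal(size=j)  # stand-in for the h(x_j) values

var_mle = h.var(ddof=0)       # divide by j   (MLE, as used in the video)
var_unbiased = h.var(ddof=1)  # divide by j-1 (unbiased)

# Exact relationship between the two estimators:
print(var_unbiased, var_mle * j / (j - 1))
```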
Thank uuuuuuuuuuuuuuuuuuuuu
What is the book?
Well, this isn't really based on a book. But I do use Robert & Casella's Monte Carlo Statistical Methods which is pretty technical. www.springer.com/gp/book/9780387212395 (see Section 3.2)
While the video was OK for understanding the formula, there was no intuition/explanation of why MC integration even works. :/
ua-cam.com/video/8276ZswRw7M/v-deo.html
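For the "why it works" question above: since ∫ h(x) f(x) dx = E[h(X)], the law of large numbers says the sample mean of h(x_j), with x_j drawn from f, converges to that expectation. A minimal sketch assuming h(x) = x² and f = N(0, 1), so the true value is E[X²] = 1:

```python
import numpy as np

rng = np.random.default_rng(7)

# Draw x_j from f (standard normal), then plug into h(x) = x^2.
x = rng.normal(size=100_000)
h = x**2

# LLN: the running mean of h(x_j) converges to E[h(X)] = Var(X) = 1.
running_mean = np.cumsum(h) / np.arange(1, h.size + 1)
print(f"estimate of E[X^2] after {h.size} draws: {running_mean[-1]:.4f}")
```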
IT'S...
Monte Carlo's integration method