Introducing Bayes factors and marginal likelihoods
- Published 16 May 2018
- Provides an introduction to Bayes factors, which are often used to do model comparison. In using Bayes factors, it is necessary to calculate the marginal likelihood, another term for the denominator of Bayes' rule. This video explains that marginal likelihoods are notoriously difficult to calculate and are sensitive to the choice of priors, even when changes to priors do not affect the posterior distribution (a small numerical sketch of this point follows the links below).
This video is part of a lecture course which closely follows the material covered in the book, "A Student's Guide to Bayesian Statistics", published by Sage, which is available to order on Amazon here: www.amazon.co.uk/Students-Gui...
For more information on all things Bayesian, have a look at: ben-lambert.com/bayesian/. The playlist for the lecture course is here: • A Student's Guide to B...
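A small numerical sketch of the prior-sensitivity point from the description, in Python (my own illustration, not code from the video; the Normal-Normal example and all variable names are assumptions): with a vague Normal(0, tau^2) prior on a mean, widening tau barely moves the posterior, yet the marginal likelihood keeps dropping.

    import numpy as np
    from scipy import stats

    # Model: x_i = theta + noise, noise ~ N(0, 1), prior theta ~ N(0, tau^2).
    rng = np.random.default_rng(0)
    data = rng.normal(loc=0.5, scale=1.0, size=20)   # simulated data
    n, xbar = len(data), data.mean()

    for tau in [10.0, 100.0, 1000.0]:
        # Conjugate Normal-Normal posterior for theta (known noise sd = 1)
        post_var = 1.0 / (n + 1.0 / tau**2)
        post_mean = post_var * n * xbar
        # Marginal likelihood: integrating theta out makes the data jointly
        # Normal with mean 0 and covariance I + tau^2 * (all-ones matrix)
        cov = np.eye(n) + tau**2 * np.ones((n, n))
        log_ml = stats.multivariate_normal(np.zeros(n), cov).logpdf(data)
        print(f"tau={tau:6.0f}  post mean={post_mean:.3f}  "
              f"post sd={post_var**0.5:.3f}  log marginal lik={log_ml:.2f}")

Running this, the posterior mean and sd are essentially identical across the three priors, while the log marginal likelihood falls by roughly log(10) each time tau is multiplied by 10, which is exactly the sensitivity the video warns about.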
I think I have watched this video more than 10 times. I come back to it every time I get lost in the formulas and explanations in other sources. Thanks Ben for such amazing and nice tutorials.
I cannot believe you are back. I have been watching your videos for 4 years; you are incredible. Thank you for all the stuff you make, it has always been really helpful.
Thanks Dr. Lambert for all your hard work and videos on Bayesian analysis. I'm currently taking a Bayesian statistics course this semester at UCF in Orlando, Fl. I wish my professor would teach this course the way you do. I use your videos to study and fill in knowledge gaps. Your videos have been very helpful. I plan on buying your book over the summer and reading it cover to cover.
One of the best videos I have watched!! Your videos are helping me get through my grad courses (Statistical ML)!! Thank you
I am taking a statistics course, but your video is much better than my university's. It really works.
Thank you so much for making these videos. They help a lot and you answered questions I didn't know I had :)
Great introduction to the Bayes factor
Thank you very much for your help!
Btw, I didn't mean to seem like a troll in my last comment. I should have prefaced it with the fact that I think your video is excellent. But I am interested in seeing/hearing your further discussion of WAIC and LOO-CV. Thanks for the video. Very helpful.
Which tool did you use to do the blackboarding? It is really awesome
What happens when M1 and M2 are not mutually disjoint? Or if they're not exhaustive of the data? Wondering if there is an analogue for a residue term in P(data). Thanks!
Please let us know the playlist and the sequence number in which this video would/should appear. Thanks
Hi, thanks for your message. The playlist can be found here: m.ua-cam.com/play/PLwJRxp3blEvZ8AKMXOy0fc0cqT61GsKCG.html Best, Ben
Thanks Ben
What is the difference between theta and a model? Aren't they supposed to be the same?
What is the distinction between M1, M2 and theta?
What is theta here? Is it a vector of the parameters in model M1 or M2?
I'm answering myself here lol. It is.
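For anyone else puzzled by this thread, the standard definition (not a quote from the video) makes the roles explicit: M1 and M2 label whole models, theta is the parameter vector inside whichever model you are in, and the marginal likelihood averages the likelihood over theta using that model's prior:

    \[
      p(\text{data} \mid M_i) = \int p(\text{data} \mid \theta, M_i)\, p(\theta \mid M_i)\, \mathrm{d}\theta
    \]

So theta never appears in the model-comparison expression itself; it has been integrated out.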
Why would you ascribe a lower prior to more complex models?
Because of principles like Occam's razor, which favour simpler models for a variety of reasons, such as utility (imagine a selection model for employees: utility goes down as model complexity and cost go up). The prior is a potential way of constraining/penalising complexity, as the worked odds below show.
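A standard identity (again, not a quote from the video) shows where that prior penalty enters:

    \[
      \underbrace{\frac{p(M_1 \mid \text{data})}{p(M_2 \mid \text{data})}}_{\text{posterior odds}}
      = \underbrace{\frac{p(\text{data} \mid M_1)}{p(\text{data} \mid M_2)}}_{\text{Bayes factor}}
      \times \underbrace{\frac{p(M_1)}{p(M_2)}}_{\text{prior odds}}
    \]

Giving the more complex model a lower prior probability shrinks the prior odds, and hence the posterior odds, on top of whatever penalty the Bayes factor itself applies.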
You are assuming, by means of p(M1)=1-p(M2), that these two models are the only ones possible. Perhaps an oversimplification for didactic purposes?
Hi, thanks for your comment and, no worries, most trolls don’t comment on the intricacies of Bayesian inference! Yes, I am considering two models here, sorry if that wasn’t clear. Best, Ben