I think I've watched this video more than 10 times. I come back to it every time I get lost in the formulas and explanations in other sources. Thanks, Ben, for such amazing tutorials.
I can't believe you are back. I have been watching your videos for 4 years; you are incredible. Thank you for all the stuff you make, it has always been really helpful.
Thanks, Dr. Lambert, for all your hard work and videos on Bayesian analysis. I'm currently taking a Bayesian statistics course this semester at UCF in Orlando, FL. I wish my professor would teach this course the way you do. I use your videos to study and fill in knowledge gaps. Your videos have been very helpful. I plan on buying your book over the summer and reading it cover to cover.
I am taking a statistics course, but your video is much better than my university's. It really works.
One of the best videos I have watched! Your videos are helping me get through my grad courses (Statistical ML). Thank you!
Thank you so much for making these videos. They help a lot and you answered questions I didn't know I had :)
Btw, I didn't mean to seem like a troll in my last comment. I should have prefaced it with the fact that I think your video is excellent. But I am interested in seeing/hearing your further discussion of WAIC and LOO-CV. Thanks for the video. Very helpful.
Which tool did you use for the blackboarding? It is really awesome.
What is the difference between theta and a model? Aren't they supposed to be the same?
What is theta here? Is it a vector of the parameters in model M1 or M2?
I'm answering myself here lol. It is.
Thank you very much for your help!
What is the distinction between M1, M2, and theta?
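For what it's worth, the standard textbook relationship (which I believe matches the video's notation, though that's my assumption) is that theta denotes the parameters *within* a given model, and each model's evidence is obtained by integrating theta out:

```latex
p(\text{data} \mid M_i) = \int p(\text{data} \mid \theta, M_i)\, p(\theta \mid M_i)\, d\theta
```

So M1 and M2 are hypotheses about the data-generating structure, while theta is the parameter vector whose meaning (and dimension) can differ between the two models.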
Great introduction to the Bayes factor.
What happens when M1 and M2 are not mutually disjoint? Or if they're not exhaustive of the data? Wondering if there is an analogue for a residue term in P(data). Thanks!
Please let us know the playlist and the sequence number in which this video would/should appear. Thanks
Hi, thanks for your message. The playlist can be found here: m.ua-cam.com/play/PLwJRxp3blEvZ8AKMXOy0fc0cqT61GsKCG.html Best, Ben
Thanks Ben
Why would you ascribe a lower prior to more complex models?
Because of principles like Occam's razor, which favor simpler models for a variety of reasons, such as utility (imagine a selection model for employees: utility goes down as model complexity and cost go up), the prior is a potential way of constraining/penalizing complexity.
You are assuming, by means of p(M1) = 1 - p(M2), that these two models are the only possible ones. Perhaps an oversimplification for didactic purposes?
Hi, thanks for your comment and, no worries, most trolls don’t comment on the intricacies of Bayesian inference! Yes, I am considering two models here, sorry if that wasn’t clear. Best, Ben
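For anyone following the two-model discussion above, here is a minimal numerical sketch of the idea. It compares a fair-coin model against a biased-coin model with a uniform prior on the bias, computes each model's evidence, the Bayes factor, and the posterior model probabilities under p(M1) = 1 - p(M2) = 0.5. The data values (n, k) are made up purely for illustration.

```python
from math import factorial

# Hypothetical data: n coin flips with k heads (illustrative numbers only).
n, k = 20, 15

# M1: fair coin, theta fixed at 0.5 -> no free parameters.
# (We use the order-specific likelihood; the binomial coefficient would
# appear in both evidences and cancel in the Bayes factor anyway.)
evidence_m1 = 0.5 ** n

# M2: unknown bias theta with a uniform Beta(1, 1) prior.
# Marginal likelihood integrates theta out analytically:
#   p(data | M2) = ∫ theta^k (1 - theta)^(n-k) dtheta = k! (n-k)! / (n+1)!
evidence_m2 = factorial(k) * factorial(n - k) / factorial(n + 1)

# Bayes factor in favour of M2 over M1.
bayes_factor = evidence_m2 / evidence_m1

# Posterior model probabilities, assuming the two models exhaust the space.
prior_m1, prior_m2 = 0.5, 0.5
post_m1 = prior_m1 * evidence_m1 / (prior_m1 * evidence_m1 + prior_m2 * evidence_m2)
post_m2 = 1.0 - post_m1

print(f"Bayes factor (M2 vs M1): {bayes_factor:.2f}")
print(f"p(M1 | data) = {post_m1:.3f}, p(M2 | data) = {post_m2:.3f}")
```

Note that the prior over models is exactly where a preference for the simpler model (per the Occam's razor point above) could be encoded, e.g. by setting prior_m1 > prior_m2.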