Introduction to Bayesian statistics, part 1: The basic concepts
- Published 29 Sep 2024
- An introduction to the concepts of Bayesian analysis using Stata 14. We use a coin toss experiment to demonstrate the idea of prior probability, likelihood functions, posterior probabilities, posterior means and probabilities and credible intervals.
www.stata.com
Copyright 2011-2019 StataCorp LLC. All rights reserved.
It was the most comprehensive video with the amazing explanations about prior, likelihood, and posterior. Thank you so much for this wonderful video.
Hello how are you?
That was an excellent explanation of the interaction between the parameters, thanks a lot for putting in the time and effort to do the animations
Amazing! Thank you so so much! :)
Awesome, thank you! Animations are really helpful.
great vid! so informative
I shouldn't be saying this out loud, but I dunno about you, I find this prior distribution and the Ledoit-Wolf shrinkage method for accrued efficiency very difficult to picture, and don't get me started on these affecting eigenvalues instead of eigenvectors... it's a mess in my head right now. I really need to pull myself together.
Ok, so how has the Bayesian model been tested and demonstrated to be superior to other statistical methods? I'm always skeptical without hard evidence.
Hi,
What does the type of likelihood distribution depend on?
Thanks,
There's no information about what the Y in the graph is/refers to. This is unacceptable.
What the BLEEP did he just say?
This is a very bad introduction. You jumped from the absolute basics to straight up prior and posterior.
I'm really tired of these videos that are advanced videos disguised as "beginner videos". They really spam all of UA-cam but don't provide any value.
Please explain it more simply next time, and please elaborate on what each concept means instead of introducing it within a few seconds. Sorry for being this critical, but I'm here to learn, not to waste my time.
Wow, my understanding acquired from this video is more than from dozens of hours of classes.
Same
I understand nothing
.75x speed
2x
0.25x
Thanks
He was going too fast
he might sound like a regular human at .825 speed
This is awesome. So intuitive and interesting. Why did we ever use null hypothesis testing? With the computational power we have now, this should be the norm.
Maybe the video creator intended to explain Bayesian statistics, but did not.
The concepts start to be explained, then there is a stepwise jump into mentioning prior and posterior probability, with the introduction of on screen equations but no further explanations - it's like it was read out of a technical manual that only 'insiders' know about. This then quickly turns into how to use the software/which buttons to press, which seems applicable to those who already know about Bayes and want to use the software - and not for those who want an introduction.
So I'm sorry to say this video was not useful to introduce Bayesian statistics and I would recommend giving it a miss.
It was a really bad video if you’re actually trying to understand Bayesian statistics
The posterior is proportional to the likelihood × the prior, not equal to it.
That's true in a certain sense: he should have written "proportional to" instead of "equal to", since he omitted the marginal distribution that scales the product.
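For reference, the relation this exchange is debating can be written out in full; p(y) is the marginal likelihood, a constant with respect to θ, which is why "proportional to" is the safe wording once it is dropped:

```latex
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)} \;\propto\; p(y \mid \theta)\, p(\theta)
```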
Haha, “I’m going to give a relatively non-technical explanation…” then proceeds to speak entirely in words that have definitions specific to statistics. Most people who remember the definitions of all the words used probably also remember what Bayesian is. People who don’t remember, or never did know, the vocabulary used have no hope of learning here what Bayesian is.
This is the best introduction to this that I've found online! Thanks!
Really bad video for a newbie trying to learn Bayesian statistics
Excellent explanation. I had been surfing the internet for clarity.
What I don't understand is how multiplying the likelihood and the prior distribution gives us what we call the posterior distribution. If anything, the product just seems like an arbitrary function.
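A small numeric sketch of why the product is not arbitrary (in Python rather than Stata, with made-up coin-flip counts): if you evaluate prior × likelihood on a fine grid of θ values and renormalize, you recover exactly the Beta posterior that conjugacy predicts. The renormalizing division plays the role of the p(y) denominator.

```python
import math

# Hypothetical data: 7 heads in 10 tosses, with a Beta(2, 2) prior on theta.
heads, n = 7, 10
a, b = 2.0, 2.0

def beta_pdf(x, a, b):
    """Density of the Beta(a, b) distribution at x."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * x ** (a - 1) * (1 - x) ** (b - 1)

# Evaluate prior * likelihood on a grid of theta values in (0, 1).
grid = [(i + 0.5) / 10000 for i in range(10000)]
unnorm = [beta_pdf(t, a, b) * t ** heads * (1 - t) ** (n - heads) for t in grid]

# Normalize so the curve integrates to 1. This division is what
# stands in for the p(y) denominator of Bayes' rule.
total = sum(unnorm) / len(grid)
posterior = [u / total for u in unnorm]

# Conjugacy says the exact posterior is Beta(a + heads, b + n - heads).
exact = [beta_pdf(t, a + heads, b + n - heads) for t in grid]
max_err = max(abs(p - e) for p, e in zip(posterior, exact))
print(max_err)  # should be tiny
```

So the product is not a random function: once scaled to integrate to 1, it is a proper probability distribution, and for a Beta prior with a coin-toss likelihood it is again a Beta distribution.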
I have the same version of Stata as yours. However, my Bayesmh window doesn't have the "univariate distribution" option. What could be the reason? Can you give me a hint?
Thank you Sir, the best explanation I found on YouTube.
How can we specify the strength of belief in the prior? In this example alpha = beta = 30, but we could just as well assign 250 to both; there is no boundary preventing us from assigning 250 instead of 30. With real-life data, if you assign a powerful prior, this means you have a bias and may have put pressure on the information coming from the data; otherwise you come close to the no-prior case.
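To make the commenter's point concrete, here is a quick Python sketch (the toss counts are invented for illustration): for a Beta(a, b) prior on a coin-toss probability, a + b acts like a phantom sample size, and the posterior mean (a + heads) / (a + b + n) is pulled toward the prior's 0.5 much harder by Beta(250, 250) than by Beta(30, 30).

```python
# Hypothetical data: 70 heads in 100 tosses (sample proportion 0.7).
heads, n = 70, 100

def posterior_mean(a, b):
    """Posterior mean of theta under a Beta(a, b) prior with a
    coin-toss (Bernoulli/binomial) likelihood."""
    return (a + heads) / (a + b + n)

for a in (1, 30, 250):
    print(a, round(posterior_mean(a, a), 3))
# Beta(1, 1):     0.696 -- close to the data's 0.7
# Beta(30, 30):   0.625 -- pulled partway toward the prior's 0.5
# Beta(250, 250): 0.533 -- the prior dominates 100 tosses
```

This is exactly the commenter's worry: a Beta(250, 250) prior carries as much weight as 500 imaginary tosses, so it takes a lot of real data to move the posterior away from 0.5.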
@4:30 What's the difference between a credible interval and a confidence interval? Reading about it made me even more confused...
How is it that you are able to neglect the probability of y in the posterior distribution function, which is normally in the denominator?
Chuck, the new Stata 17.1 has a different command structure. Can you please redo the video for version 17.1?
The coin could land on its edge, neither heads nor tails. Forgot about that potential event, didn't you?
If the coin is held with heads facing up, what is the likelihood it will yield heads when it is tossed?
If the coin is held with heads facing up, what is the likelihood it will yield tails when it is tossed?
If the coin is held with tails facing up, what is the likelihood it will yield tails when it is tossed?
If the coin is held with tails facing up, what is the likelihood it will yield heads when it is tossed?
One question I would have on this is: how can you be sure you are not biasing your result by using these informative priors? I believe the most conservative approach is indeed the uniform (equivalent to "I don't know anything, so everything is equally possible to me"), but when I start getting "clever" and choosing appropriate priors, I can't run a real hypothesis test with that, because I have already told the coin to be 50:50 (while someone could have potentially given me a magic coin of 10:90).
I believe the point of the prior is to introduce bias responsibly. That is, it should probably only be used if the prior was decided on from previous experience and expertise, and creating a posterior distribution could be helpful in cases that you believe will generate results similar to previous experiments but where you only have a limited sample size.
Hello how are you? I need some help
"Non technical"
3:07
Right.
Thank you for making this video. I took a statistics class before, but my knowledge is limited. Please add descriptive details so I can understand your video.
Your animations were based on a binomial likelihood, but in Stata you chose a Bernoulli likelihood.
Are they the same if we remove the binomial factor, choose(N, X)?
No they are not the same, but a single stochastic variable with a binomial distribution can be described by several stochastic variables with Bernoulli distributions.
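A quick Python check of the point above (the toss sequence is made up): the product of the individual Bernoulli likelihoods differs from the binomial likelihood only by the choose(N, X) factor, a constant that does not depend on θ, so the two lead to the same posterior.

```python
from math import comb

# Hypothetical sequence of 10 tosses (1 = heads): 7 heads, 3 tails.
tosses = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]
n, x = len(tosses), sum(tosses)

def bernoulli_product(theta):
    """Likelihood of the ordered sequence: product of Bernoulli terms."""
    out = 1.0
    for t in tosses:
        out *= theta if t == 1 else (1 - theta)
    return out

def binomial(theta):
    """Likelihood of 'x heads in n tosses', order ignored."""
    return comb(n, x) * theta ** x * (1 - theta) ** (n - x)

# The ratio is the constant comb(n, x) for every theta, so the two
# likelihoods are proportional, and proportional likelihoods give
# identical posteriors after normalization.
for theta in (0.2, 0.5, 0.7):
    print(binomial(theta) / bernoulli_product(theta))
```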
At 1:40, shouldn't the area under the graph be equal to 1? What does the y-axis represent?
Hi, thanks for the video. What I wonder is, what are " default priors" when it comes to bayesian inference? As I understand, the priors are specific to each hypothesis or data, so how come some packages include these defaults? What do these priors entail?
Hi, can someone explain why this form of probability is important?
Proving the non-existence of God was harder than I thought.
Finally I understand this thing. Thank you.
Isn't there an error at 5:18?
Shouldn't the beta distribution's a and b be 86 and 84, NOT 106 and 114? The mean of Beta(86, 84) matches the mean on the screen (0.506),
whereas the mean of Beta(106, 114) is 0.482.
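The arithmetic in this comment is easy to check with the Beta mean formula a / (a + b); a two-line Python sketch:

```python
def beta_mean(a, b):
    """Mean of the Beta(a, b) distribution: a / (a + b)."""
    return a / (a + b)

print(round(beta_mean(86, 84), 3))    # 0.506
print(round(beta_mean(106, 114), 3))  # 0.482
```

So the commenter's numbers do follow from the formula; whether the on-screen parameters or the on-screen mean is the typo would have to be checked against the video itself.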
Your teaching style is very effective. Explanation and pacing is very good and your voice maintains attention very well. Thank you for making this video, it was quite informative.
How do I calculate the odds ratio in Bayesian ordered logistic regression? Please tell me.
Thank you for this video, it's clear to me.
Thank you very much for the explanations of non-informative prior and informative prior. Very helpful for my research.
would someone please tell me what is he saying at 0:28 ? thank you
I think he says: "Many of us were trained using a frequentist approach to statistics..."
1:25 What does this mean? Prior = Beta(1.0, 1.0)
The Beta distribution with parameters (1.0, 1.0) is the Uniform distribution. He says that he will assume he has no information about the probability of getting heads or tails, and for that he will use a prior with a uniform distribution: Beta(1.0, 1.0) = Uniform, so the probability of getting heads is uniform from 0 to 1.
@@Magnuomoliticus But how do you know how to accurately increase the parameters of the prior distribution? The only thing I don't understand here is how he decided that Beta(30, 30) was a more accurate depiction of what he knows about the coin. Why 30? And thanks for your previous answer.
@@12rooted Well, that's a great question that I don't know the answer to. My first guess is that it's somewhat arbitrary which distribution you use. But let's wait and see if someone else can clarify that!
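To see the first point in this thread numerically, here is a small Python sketch: the Beta(1, 1) density is exactly 1 everywhere on (0, 1), i.e. the Uniform distribution, while Beta(30, 30) piles its mass up near 0.5. (As for "why 30": the larger a + b is, the more concentrated the prior, so the choice encodes how strongly you believe the coin is fair.)

```python
import math

def beta_pdf(x, a, b):
    """Density of the Beta(a, b) distribution at x."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * x ** (a - 1) * (1 - x) ** (b - 1)

# Beta(1, 1) is flat: density 1.0 at every point, i.e. Uniform(0, 1).
for x in (0.1, 0.5, 0.9):
    print(beta_pdf(x, 1, 1))  # 1.0 each time

# Beta(30, 30) concentrates near 0.5: tall in the middle, negligible tails.
print(beta_pdf(0.5, 30, 30))   # a peak of about 6
print(beta_pdf(0.1, 30, 30))   # essentially zero
```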
This is an advanced "basic" concept.
Thanks. I love statistics.
great explanation
Brilliant video thank you a lot
that was so so helpful. thank you.
excellent explanation sir.....
so fucking fast..
Thank you for your kind help.
why is the posterior narrower at 5:15?
amazing! thanks!
excellent video
Please, could you send us the video transcript?
Woo
Thank you. That was very clear and helpful.
What to say, an excellent explanation of Bayesian updating, long life to Stata and its People!
Thanks. Perhaps you could do another video and call it part 0, as the building blocks for this part 1. An introduction to the introduction, that is :)
excellent sir
Thank you. The first video that makes me understand this reasoning in one go.
Please could you indicate some friendly material about bayesian inference?
It doesn’t exist. This stuff is taught horrendously everywhere
@@bigfishartwire4696 100% Agree
Bayesian statistics is an extension of the classical approach. Various decision rules are established. They also use sampling data. I learned about this when I was still in high school at Ateneo de Zamboanga University; my grades in algebra were high.
Too many basic errors: a claim like "distribution closer to .5" is not even formally defined.
"With a mean closer to 0.5".