What is meant by entropy in statistics?
- Published 14 May 2018
- Describes how entropy - in statistics - is a measure of information content as well as uncertainty, and uses an example to illustrate its use.
This video is part of a lecture course which closely follows the material covered in the book, "A Student's Guide to Bayesian Statistics", published by Sage, which is available to order on Amazon here: www.amazon.co.uk/Students-Gui...
For more information on all things Bayesian, have a look at: ben-lambert.com/bayesian/. The playlist for the lecture course is here: • A Student's Guide to B...
You surely know this already, but let me say it once more: you are a genius at explaining things.
Excellent explanations as always.
Excellent Video. Using log based 2, the entropy for theta = 0.75 is 0.81 not 0.56 as described in the video. You get 0.56 if you use the natural log.
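The commenter's arithmetic checks out. A quick sketch (Python; the function name is my own) reproduces both numbers for theta = 0.75:

```python
import math

def entropy(theta, base=2):
    """Shannon entropy of a Bernoulli(theta) variable in the given log base."""
    if theta in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -(theta * math.log(theta, base)
             + (1 - theta) * math.log(1 - theta, base))

h_bits = entropy(0.75, base=2)        # log base 2 -> bits
h_nats = entropy(0.75, base=math.e)   # natural log -> nats
print(round(h_bits, 2), round(h_nats, 2))  # 0.81 0.56
```

So 0.81 bits with log base 2, and 0.56 nats with the natural log, exactly as the comment says; the video's 0.56 is correct only in nats.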
thanks
Omg! Thank you for this.
I like this explanation, well done. I'm surprised people don't adopt an intuition of Entropy based on the degree of "even mixture" or "even population or occupation" of the possible states of the system after the process is repeated (ie across multiple instances).
Thank you!
Superb explanation, thank you
Sir, very nice explanation. A small doubt: I have two series of data points, X and Y. X is independent of Y, but Y depends on X. To see the dependency of Y on X, I calculated Pearson's correlation coefficient. But one of my friends suggested that I calculate entropy to better capture the dependence between the distributions. Can you guide me on how to do that?
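The entropy-based dependence measure the friend most likely has in mind is mutual information, I(X; Y) = H(X) + H(Y) - H(X, Y), which, unlike Pearson's coefficient, also picks up nonlinear dependence. A minimal plug-in estimate for discrete samples (a sketch; names and helper are my own, not from the video):

```python
import math
from collections import Counter

def mutual_information_bits(xs, ys):
    """Plug-in estimate of I(X; Y) in bits from paired discrete samples."""
    n = len(xs)
    px = Counter(xs)                 # marginal counts of X
    py = Counter(ys)                 # marginal counts of Y
    pxy = Counter(zip(xs, ys))       # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        p_indep = (px[x] / n) * (py[y] / n)
        mi += p_joint * math.log2(p_joint / p_indep)
    return mi

# When Y is fully determined by X, I(X; Y) equals H(X): here 1 bit.
print(mutual_information_bits([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0
# When X and Y are independent, the mutual information is 0.
print(mutual_information_bits([0, 1, 0, 1], [0, 0, 1, 1]))  # 0.0
```

Continuous data would first need binning (or a dedicated estimator), since this plug-in formula only applies to discrete outcomes.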
Good explanation thanks.
Brilliant!
Thank you for providing your explanation! But some subheaders, and not cramming it all onto one slide, would make it so much more digestible.
I agree 100%. It gets confusing
Most of your videos are very good but I have watched this one five times and still have little idea what the value of entropy is. I can do the math but I am confused as to the purpose. What does the recipient know before hand? How does telling someone the outcome of a coin toss "reduce uncertainty"? What exactly is it they were uncertain about? Any insights would be appreciated. Thank you.
On my sixth watching of this I now think I have it. Please verify if correct. The information recipient knows the distributional form and the value of theta ex ante. A coin is then flipped but the outcome hidden. The entropy is how much uncertainty about the outcome of the flip is removed by showing the coin to the information recipient.
I have the same doubt. If someone can verify, it'd be greatly appreciated.
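That reading matches the standard setup: before the reveal, the recipient's uncertainty about the flip is the entropy of a Bernoulli(theta) variable; after seeing the coin it drops to zero, so the information gained equals that entropy. A small illustration (Python; variable names are my own):

```python
import math

def bernoulli_entropy_bits(theta):
    """Entropy, in bits, of a coin that lands heads with probability theta."""
    if theta in (0.0, 1.0):
        return 0.0
    return -(theta * math.log2(theta) + (1 - theta) * math.log2(1 - theta))

theta = 0.5
before = bernoulli_entropy_bits(theta)  # uncertainty before the coin is shown
after = 0.0                             # once revealed, the outcome is certain
print(before - after)  # information gained: 1.0 bit for a fair coin
```

For a biased coin (say theta = 0.9) the pre-reveal entropy is smaller, so the reveal conveys less information, which is the intuition the video is after.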
Thank you
thank you😊
Cool !!
When in doubt, taking a derivative and setting it equal to 0 is your friend
so true lol
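For the entropy of a coin that trick does work neatly: differentiating H(theta) = -theta log2(theta) - (1-theta) log2(1-theta) gives dH/dtheta = log2((1-theta)/theta), which is zero exactly at theta = 0.5, the maximum-entropy (fair) coin. A quick numerical check (a sketch; function names are mine):

```python
import math

def entropy_bits(theta):
    """Bernoulli entropy in bits."""
    return -(theta * math.log2(theta) + (1 - theta) * math.log2(1 - theta))

def d_entropy(theta):
    # Closed-form derivative of the Bernoulli entropy: log2((1 - theta) / theta)
    return math.log2((1 - theta) / theta)

print(d_entropy(0.5))     # 0.0 -> critical point at the fair coin
print(entropy_bits(0.5))  # 1.0 bit, the maximum possible for one flip
```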
It should be the expected value of negative log p(x), i.e. H = E[-log p(X)].
Pretty unintuitive. There is a much better intuitive interpretation in terms of the depth of a binary tree.