What is meant by entropy in statistics?

  • Published 14 May 2018
  • Describes how entropy, in statistics, is a measure of information content as well as uncertainty, and uses an example to illustrate its use.
    This video is part of a lecture course that closely follows the material covered in the book "A Student's Guide to Bayesian Statistics", published by Sage and available to order on Amazon here: www.amazon.co.uk/Students-Gui...
    For more information on all things Bayesian, have a look at: ben-lambert.com/bayesian/. The playlist for the lecture course is here: • A Student's Guide to B...
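
    A minimal sketch of the quantity discussed in the video, assuming the running example is a coin toss with probability theta of heads (the function name and the particular theta values here are illustrative, not taken from the video):

        import numpy as np

        def bernoulli_entropy(theta, base=np.e):
            """Shannon entropy of a coin with P(heads) = theta, in the given log base."""
            p = np.array([theta, 1.0 - theta])
            p = p[p > 0]                       # treat 0 * log(0) as 0, so theta = 0 or 1 gives entropy 0
            return float(-(p * np.log(p)).sum() / np.log(base))

        print(bernoulli_entropy(0.5))          # ~0.693 nats for a fair coin (1 bit with base=2)
        print(bernoulli_entropy(0.9))          # ~0.325 nats: a heavily biased coin is more predictable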

COMMENTS • 23

  • @sarrae100 • 5 years ago • +6

    You surely know this already, but let me say it once more: you are a genius at explaining things.

  • @joed3325 • 6 years ago • +1

    Excellent explanations as always.

  • @ernsthenle8431 • 3 years ago • +3

    Excellent video. Using log base 2, the entropy for theta = 0.75 is 0.81, not 0.56 as described in the video. You get 0.56 if you use the natural log.
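
    A quick check of the numbers in this comment, assuming the quantity in question is the entropy of a Bernoulli distribution with theta = 0.75:

        import math

        theta = 0.75
        h_nats = -(theta * math.log(theta) + (1 - theta) * math.log(1 - theta))
        print(h_nats)                  # ~0.562 (natural log, i.e. nats)
        print(h_nats / math.log(2))    # ~0.811 (log base 2, i.e. bits)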

  • @CGKittenz • 3 years ago • +1

    Omg! Thank you for this.

  • @cerioscha • 1 year ago

    I like this explanation, well done. I'm surprised people don't adopt an intuition of entropy based on the degree of "even mixture" or "even population or occupation" of the possible states of the system after the process is repeated (i.e. across multiple instances).
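
    A tiny sketch of that evenness reading (illustrative numbers, not taken from the comment): the more evenly the probability mass is spread across the possible states, the higher the entropy.

        import numpy as np

        def entropy_bits(p):
            """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        print(entropy_bits([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: nearly all mass in one state
        print(entropy_bits([0.40, 0.30, 0.20, 0.10]))  # ~1.85 bits: unevenly mixed
        print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.00 bits: perfectly even occupation of 4 states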

  • @awr359 • 5 years ago • +1

    Thank you!

  • @jayalekshmis4977 • 5 years ago

    Superb explanation, thank you!

  • @charankumark8457 • 3 years ago • +1

    Sir, very nice explanation. A small doubt: I have two series of data points, X and Y. X is independent of Y, but Y depends on X. To see the dependency of Y on X, I calculated Pearson's correlation coefficient, but one of my friends suggested that I calculate entropy instead to better capture the dependence between the distributions. Can you guide me on how to do that?

  • @rembautimes8808 • 3 years ago

    Good explanation, thanks.

  • @gunjansethi2896 • 3 years ago

    Brilliant!

  • @chalize1 • 4 years ago • +1

    Thank you for providing your explanation! But some subheaders, and not cramming it all onto one slide, would make it so much more digestible.

  • @AK-fj4tb • 5 years ago • +1

    Most of your videos are very good, but I have watched this one five times and still have little idea what the value of entropy is. I can do the math, but I am confused as to the purpose. What does the recipient know beforehand? How does telling someone the outcome of a coin toss "reduce uncertainty"? What exactly were they uncertain about? Any insights would be appreciated. Thank you.

    • @AK-fj4tb • 5 years ago • +1

      On my sixth watching of this I now think I have it. Please verify if correct. The information recipient knows the distributional form and the value of theta ex ante. A coin is then flipped but the outcome is hidden. The entropy is how much uncertainty about the outcome of the flip is removed by showing the coin to the information recipient (a short numerical sketch of this reading follows this thread).

    • @ziliestarrive • 3 years ago

      I have the same doubt. If someone can verify, it'd be greatly appreciated.
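
    A small numerical sketch of that reading (an illustration under the assumptions stated in the reply above, not something stated in the video): before the coin is shown, the recipient's uncertainty about the outcome is the entropy of the Bernoulli(theta) distribution; once the coin is shown, that uncertainty drops to zero, so the entropy is the average amount of uncertainty removed.

        import math

        def coin_entropy_bits(theta):
            """Uncertainty (in bits) about a single flip with P(heads) = theta."""
            return -sum(p * math.log2(p) for p in (theta, 1 - theta) if p > 0)

        # Uncertainty removed by revealing the outcome = entropy before minus entropy after (zero).
        print(coin_entropy_bits(0.5))    # 1.0 bit: revealing a fair coin is maximally informative
        print(coin_entropy_bits(0.99))   # ~0.08 bits: you already almost knew the outcome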

  • @indurthi • 4 years ago

    Thank you

  • @happygrace8065 • 8 months ago

    thank you😊

  • @zoozolplexOne • 2 years ago

    Cool !!

  • @justinw8370 • 2 years ago • +1

    When in doubt, taking a derivative and setting it equal to 0 is your friend
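
    Presumably this refers to finding the theta that maximizes the entropy of the coin-toss distribution (my reading of the comment); a symbolic sketch:

        import sympy as sp

        theta = sp.symbols('theta', positive=True)
        H = -(theta * sp.log(theta) + (1 - theta) * sp.log(1 - theta))
        print(sp.solve(sp.Eq(sp.diff(H, theta), 0), theta))  # [1/2]: a fair coin maximizes the entropy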

  • @shijingsi8288 • 4 years ago

    It should be the expected value of negative log p(x).
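
    That is, H = E[-log p(X)]. A quick Monte Carlo sketch of that expectation, using theta = 0.75 to match the example discussed in the other comments (the sample size is arbitrary):

        import numpy as np

        rng = np.random.default_rng(0)
        theta = 0.75
        x = rng.random(200_000) < theta          # Bernoulli(theta) samples
        p_x = np.where(x, theta, 1 - theta)      # probability of each observed outcome
        print((-np.log(p_x)).mean())             # ~0.562 nats: the sample mean of -log p(x)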

  • @vlaaady • 3 years ago

    Pretty unintuitive. There is a much more intuitive interpretation based on the depth of a binary tree.
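
    For what it's worth, a minimal sketch of that binary-tree reading (an illustration, not the commenter's own example): for a uniform distribution over 2^k outcomes, the entropy in bits equals the depth of the balanced yes/no decision tree needed to pin down an outcome.

        import math

        n = 8                                      # 8 equally likely outcomes
        entropy_bits = -sum((1 / n) * math.log2(1 / n) for _ in range(n))
        tree_depth = math.log2(n)                  # depth of a balanced binary (yes/no) question tree
        print(entropy_bits, tree_depth)            # 3.0 3.0: three yes/no questions identify the outcome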