(IC08) Fisher Information and Efficiency

  • Published Oct 21, 2024

COMMENTS • 4

  • @RAyLV17
    @RAyLV17 3 months ago

    With the amount of effort you put, I really wish your channel gets more popular!

    • @LetsLearnNemo
      @LetsLearnNemo  3 months ago +1

      Thank you! Maybe one day =) For now, I just enjoy making content, even if it helps just one person at a time

  • @chenyuzhu2792
    @chenyuzhu2792 3 months ago

    Same as usual, amazing video quality. I just want to add (correct me if I am wrong): the Fisher information can also be defined as the expectation of the square of the first derivative of the log-likelihood function, which I find more intuitive. The first derivative captures the change in probability; summing and averaging essentially tells you how drastically the probability changes with the random variable for a given parameter (which is something I found relatable to Shannon). The squaring prevents negative changes from offsetting positive ones.

    • @LetsLearnNemo
      @LetsLearnNemo  3 months ago

      Thank you for the kind words =) and yes, some will define Fisher information in terms of what you mention, which becomes intuitive once one develops a geometric sense of what we are doing. Shannon entropy (and other measures like KL divergence) closely relates to these mechanisms as well, but I hope to investigate those at a later time when I get the chance. Hope you are well!
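The equivalence discussed in the thread above can be checked numerically. A minimal sketch for a Bernoulli(p) model (my own worked example, not from the video): the expected squared score and the negative expected curvature of the log-likelihood both equal the closed form 1 / (p(1 - p)).

```python
# Bernoulli(p): log-likelihood l(p; x) = x*log(p) + (1 - x)*log(1 - p)
# Two equivalent definitions of Fisher information:
#   I(p) = E[(dl/dp)^2]     (expected squared score, as in the comment above)
#   I(p) = -E[d^2l/dp^2]    (negative expected curvature)
# Both should match the closed form 1 / (p * (1 - p)).

def fisher_squared_score(p):
    # Score: dl/dp = x/p - (1 - x)/(1 - p), averaged over x in {0, 1}
    weights = {0: 1 - p, 1: p}
    score = {x: x / p - (1 - x) / (1 - p) for x in (0, 1)}
    return sum(weights[x] * score[x] ** 2 for x in (0, 1))

def fisher_neg_curvature(p):
    # Curvature: d^2l/dp^2 = -x/p^2 - (1 - x)/(1 - p)^2
    weights = {0: 1 - p, 1: p}
    curv = {x: -x / p**2 - (1 - x) / (1 - p) ** 2 for x in (0, 1)}
    return -sum(weights[x] * curv[x] for x in (0, 1))

p = 0.3
print(fisher_squared_score(p))   # expected squared score
print(fisher_neg_curvature(p))   # negative expected curvature
print(1 / (p * (1 - p)))         # closed form
```

All three printed values agree, illustrating why either definition can serve as the starting point.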