Fisher's Information: Examples

  • Published 22 Sep 2024

COMMENTS • 21

  • @syz911
    @syz911 A year ago

    Your videos are so precise and accurate. What I appreciate is that you spend so much time writing all of this out without compromising accuracy. Many people over-simplify or skip important assumptions when presenting definitions or theorems. I have a request: could you please discuss the Fisher information and the Cramér-Rao bound for multiple parameters? Thanks!

    • @statisticsmatt
      @statisticsmatt  A year ago

      Thanks for your kind words. I'll add these topics to my to-do list, but to be honest, I don't know when I'll be able to get to them. I will eventually. Please be very patient (many thanks). Don't forget to subscribe and let others know about this channel.

  • @nickmillican22
    @nickmillican22 3 years ago +3

    Love your videos, Matt!
    Question: do you have any videos (or is there a brief explanation) explaining why the 'variance of the partial derivative of the log-likelihood' AND the 'negative expected value of the second derivative of the log-likelihood' are equivalent (and thus both define the information matrix)? I can understand the intuition behind the second-derivative definition, but I don't understand the variance definition.
    Keep up the great work!

    • @statisticsmatt
      @statisticsmatt  3 years ago +1

      Many thanks for saying that you love these videos. Much appreciated. First, note this formula (stated without proof): Var(x) = E(x^2) - [E(x)]^2. Let x = "partial derivative of the log-likelihood". About 3 minutes into this video, ua-cam.com/video/xkMstee5gQ0/v-deo.html, it is shown that E(x) = E(partial derivative of the log-likelihood) = 0. Thus, Var(x) = E(x^2). This shows that the first and third equations are equal. To show that the second equals the first and third, I recommend watching at least the first 7 minutes of the above-mentioned video; a sketch of the argument is also given below.
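
      A minimal sketch of that identity, assuming the usual regularity conditions (the density f(x; θ) can be differentiated twice under the integral sign); the notation here is generic and need not match the video exactly:

          \begin{align*}
          \int f(x;\theta)\,dx = 1
            &\;\Rightarrow\; \int \frac{\partial \log f}{\partial\theta}\, f \,dx = 0
            \;\Rightarrow\; E\!\left[\frac{\partial \log f}{\partial\theta}\right] = 0, \\
          0 &= \frac{\partial}{\partial\theta}\int \frac{\partial \log f}{\partial\theta}\, f \,dx
             = \int \frac{\partial^{2} \log f}{\partial\theta^{2}}\, f \,dx
             + \int \left(\frac{\partial \log f}{\partial\theta}\right)^{\!2} f \,dx, \\
          -\,E\!\left[\frac{\partial^{2} \log f}{\partial\theta^{2}}\right]
            &= E\!\left[\left(\frac{\partial \log f}{\partial\theta}\right)^{\!2}\right]
             = \operatorname{Var}\!\left(\frac{\partial \log f}{\partial\theta}\right),
          \end{align*}

      where the last equality uses E[score] = 0, so the variance equals the second raw moment.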

  • @erguancho6186
    @erguancho6186 3 years ago +3

    Very useful! Thanks :)

  • @bhawikajain4022
    @bhawikajain4022 4 months ago

    Hi Matt, can you explain why you have written (n*lambda) to the power {sum xi} in Example 1, around 1:31?
    Thank you for such helpful videos!!

    • @statisticsmatt
      @statisticsmatt  4 months ago

      Many thanks for watching! You have found an error in the video, which I have highlighted in its description; the corrected term is sketched below.
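
      A sketch of the corrected term, assuming the likelihood is written for an i.i.d. sample x_1, ..., x_n from a Poisson(λ) distribution: the base raised to the power Σ x_i should be λ, not nλ, since

          L(\lambda)
            \;=\; \prod_{i=1}^{n} \frac{e^{-\lambda}\,\lambda^{x_i}}{x_i!}
            \;=\; \frac{e^{-n\lambda}\,\lambda^{\sum_{i=1}^{n} x_i}}{\prod_{i=1}^{n} x_i!},

      so n enters the likelihood only through the factor e^{-nλ}.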

  • @csaa7659
    @csaa7659 3 years ago

    Why is the term n*lambda used instead of lambda in the likelihood of the Poisson distribution in Example 1?

    • @statisticsmatt
      @statisticsmatt  3 years ago

      You have pointed out an error. Many, many thanks. It should be just lambda. I'm going to put this error in the description and give you credit; the corresponding Fisher information calculation is sketched below. Also, don't forget to subscribe to the channel.
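
      A short sketch of the corrected calculation, showing where the (n/λ) cited elsewhere in this thread comes from (notation chosen here, not necessarily matching the video):

          \begin{gather*}
          \ell(\lambda) = \log L(\lambda)
            = -n\lambda + \Big(\sum_{i=1}^{n} x_i\Big)\log\lambda - \sum_{i=1}^{n}\log(x_i!), \\
          \frac{\partial \ell}{\partial\lambda} = -n + \frac{\sum_i x_i}{\lambda}, \qquad
          \frac{\partial^{2} \ell}{\partial\lambda^{2}} = -\frac{\sum_i x_i}{\lambda^{2}}, \\
          I(\lambda) = -E\!\left[\frac{\partial^{2} \ell}{\partial\lambda^{2}}\right]
            = \frac{E\!\left[\sum_i X_i\right]}{\lambda^{2}}
            = \frac{n\lambda}{\lambda^{2}}
            = \frac{n}{\lambda}.
          \end{gather*}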

  • @xiaowei8546
    @xiaowei8546 2 years ago

    For the Poisson example, for two samples, say 10, 10, 10, 10, 10 and 0, 5, 10, 15, 20, is the Fisher information always the same for different lambdas?

    • @statisticsmatt
      @statisticsmatt  2 years ago

      Fisher's information for a Poisson distribution is (n/lambda). If you know that the two samples came from the same distribution, then the Fisher information is the same. However, if you use the sample data to estimate lambda and then plug that estimate into the formula (n/lambda), you would generally get different values; a small numerical sketch is given below. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
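
      A minimal numerical sketch of the plug-in idea (not from the video), using the two samples from the question, illustrative labels of my own, and the Poisson MLE lambda_hat = sample mean:

          # Plug-in estimate of the Poisson Fisher information I(lambda) = n / lambda,
          # using the MLE lambda_hat = sample mean for each sample.
          samples = {
              "constant": [10, 10, 10, 10, 10],
              "spread":   [0, 5, 10, 15, 20],
          }

          for name, xs in samples.items():
              n = len(xs)
              lam_hat = sum(xs) / n     # MLE of lambda for an i.i.d. Poisson sample
              info_hat = n / lam_hat    # plug-in estimate of n / lambda
              print(f"{name}: lambda_hat = {lam_hat}, estimated information = {info_hat}")

          # Both samples above happen to have mean 10, so both print n/lambda_hat = 5/10 = 0.5;
          # samples with different means would give different plug-in values.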

    • @xiaowei8546
      @xiaowei8546 2 years ago

      @@statisticsmatt many thanks

  • @minerva646
    @minerva646 3 years ago

    Hi, is it possible for you to share your notes?

    • @statisticsmatt
      @statisticsmatt  3 years ago

      In the past, I've posted my notes on Gumroad. What videos do you want notes from?

    • @minerva646
      @minerva646 3 years ago

      @@statisticsmatt Thanks, this one please "Fisher's Information: Examples"

    • @statisticsmatt
      @statisticsmatt  3 years ago

      Go to this site for a copy of the video notes:
      gumroad.com/statisticsmatt
      Use "Fisher's Information" to search for the notes.

  • @nathanemil10
    @nathanemil10 3 years ago

    Hey Mr. StatisticsMatt, I have a course test on MLE coming up soon. Do you think you could help me out during it?

    • @statisticsmatt
      @statisticsmatt  3 years ago

      You may always ask questions on these videos. I'm not sure how much I'd be able to help. Please don't forget to subscribe.

  • @lalapanda4216
    @lalapanda4216 A year ago

    wtf

    • @statisticsmatt
      @statisticsmatt  A year ago

      I'm not sure if this is a positive or negative comment. Many thanks for watching. Don't forget to subscribe and let others know about this channel.