What is Fisher Information?

  • Published 12 Jun 2022
  • Explains the concept of Fisher Information in relation to statistical estimation of parameters based on random measurements. Gives an example of parameter estimation in Gaussian noise, and shows the component functions to help build intuition. (A numerical sketch of this example follows the video list below.)
    Related videos: (see iaincollings.com)
    • What is Least Squares Estimation?
    • What is a Random Variable?
    • What is a Probability Density Function (pdf)?
    • What is a Multivariate Probability Density Function (PDF)?
    • What is the Kalman Filter?
    • What is a Cumulative Distribution Function (CDF) of a Random Variable?
    • What is a Moment Generating Function (MGF)?
    • What is a Random Process?
    • Expectation of a Random Variable Equation Explained
    • What is a Gaussian Distribution?
    • How are Matched Filter (MF), Zero Forcing (ZF), and MMSE Related?
    For a full list of Videos and Summary Sheets, go to: iaincollings.com
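
As a quick numerical check of the Gaussian-noise example described above (a minimal sketch; the model y = theta + noise and all variable names here are our assumptions, not taken from the video), the Fisher Information can be estimated as the expected squared derivative of the log-pdf:

```python
import numpy as np

# Assumed model for illustration: y = theta + n, with n ~ N(0, sigma^2).
# For this model the score is d/dtheta log f(y; theta) = (y - theta) / sigma^2,
# so the Fisher Information should come out close to 1 / sigma^2.
rng = np.random.default_rng(0)
theta = 2.0        # true parameter (the mean being estimated)
sigma = 0.5        # noise standard deviation
n_samples = 200_000

y = theta + sigma * rng.standard_normal(n_samples)
score = (y - theta) / sigma**2           # derivative of the log-pdf w.r.t. theta
fisher_info = np.mean(score**2)          # Monte Carlo estimate of E[score^2]

print(f"Monte Carlo estimate:  {fisher_info:.3f}")   # close to 4.0
print(f"Theoretical 1/sigma^2: {1 / sigma**2:.3f}")  # exactly 4.0
```

With sigma = 0.5 the estimate lands near 1/0.25 = 4, matching the inverse-variance behaviour discussed in the video: less noise means more information.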

COMMENTS • 81

  • @akaakaakaak5779 · 1 year ago · +11

    Love the format of the video. Emphasising the intuition just makes everything else clearer

  • @NONAME_G_R_I_D_ · 1 year ago · +10

    Great video! I have been following your content for quite a while now and you really always try to give the intuition behind each process. This really helps to understand the material :) Much appreciated!!!

    • @iain_explains · 1 year ago · +1

      I'm so glad to hear that you like the videos, and find the intuition helpful.

  • @sdsa007 · 11 months ago · +1

    I'm a visual learner, so the graphs really helped! I almost gave up halfway through the video, but I'm glad I hung on!

    • @iain_explains · 11 months ago · +1

      I'm glad you found the graphs helpful.

  • @ImranMoezKhan · 1 year ago · +2

    What a wonderful coincidence! Here I am deriving a CRB for a noise variance model I'm researching, and running MLE simulations to verify it, and your video with this great explanation of FI comes up :-). I've read that the multivariate FIM can be considered a metric in the parameter space, and with your explanation of how the derivative takes into account the variation of the PDF wrt the parameter, I can almost visualize it for an intuitive understanding - fascinating concepts. Thanks Iain! (The standard FIM definition is written out after this thread.)

    • @iain_explains · 1 year ago · +1

      I'm so glad it was helpful. The concept is not particularly intuitive, especially when considering the multivariate case with the FIM. Perhaps the FIM can be the topic of a future video.
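
For readers curious about the multivariate Fisher Information Matrix mentioned in this thread, a standard definition (a sketch for context; not taken from the video) has entries

```latex
\left[\mathbf{I}(\boldsymbol{\theta})\right]_{ij}
  = \mathbb{E}\!\left[
      \frac{\partial \ln f(\mathbf{y};\boldsymbol{\theta})}{\partial \theta_i}\,
      \frac{\partial \ln f(\mathbf{y};\boldsymbol{\theta})}{\partial \theta_j}
    \right]
```

which is what lets the FIM act as a metric on the parameter space: each entry measures how strongly the log-pdf responds to movement along a pair of parameter directions.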

  • @sharp8710 · 1 month ago

    Thank you for the video. Love the way you used simple examples to explain the theory intuitively and decomposed the expression explaining the meaning of each part!

  • @user-cl7vh1tz3t · 6 months ago · +1

    This is really a great explanation. You made a difficult concept (at least for me) very easy to understand. I’ve been watching other videos with animations and all, but I only understood this well after watching your explanation. Thank you very much.

    • @iain_explains · 6 months ago

      That's so great to hear. I'm glad it was helpful!

  • @sintumavuya7495 · 7 months ago

    Thank you for explaining the logic behind that formula. Knowing the why helps me remember easily and just makes it all make sense.

    • @iain_explains · 7 months ago · +1

      That's great to hear. I'm glad you found the video helpful.

  • @chengshen7833 · 1 year ago

    Thanks a lot. This really provides an excellent complementary explanation to S. Kay's book. In the book, Fisher Information is interpreted as the 'curvature of the log-likelihood function', where the expectation of the squared 1st derivative can be converted to the negative of the expectation of the 2nd derivative, and f is viewed as a function of theta with Y fixed. The meaning of the natural log becomes more subtle when it comes to the derivation of the CRLB. (This identity is written out after this thread.)

    • @iain_explains · 1 year ago

      Glad it was helpful! It's a concept that took me a long time to get intuition on, when I first learned about it.
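
The identity from Kay's book referred to in this thread holds under the usual regularity conditions (a sketch for context):

```latex
I(\theta)
  = \mathbb{E}\!\left[\left(\frac{\partial \ln f(y;\theta)}{\partial \theta}\right)^{\!2}\right]
  = -\,\mathbb{E}\!\left[\frac{\partial^{2} \ln f(y;\theta)}{\partial \theta^{2}}\right]
```

so the Fisher Information can equally be read as the expected curvature of the log-likelihood near the true parameter value: sharper curvature means the data pin the parameter down more tightly.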

  • @SavasErdim-ly8xo · 1 year ago

    Great video to understand the Fisher Information intuitively. Thank you Prof. Iain.

  • @menglu5776 · 8 months ago

    Thank you so much, I was doing literature research for a novel Cramer-Rao lower bound application. Your video helped me a lot!

  • @mikewang4626 · 1 year ago

    Thanks a lot for your intuitive explanation with diagrams. The explanation about why Fisher Information looks like that is quite useful to understand the definition!

    • @iain_explains · 1 year ago

      That's great to hear. I'm glad you liked the video.

  • @oO_toOomy_Oo · 5 months ago

    I appreciate your work Mr. Iain, it is very helpful, it gives a great sense of signals and systems.

  • @aliosmancetin9542 · 1 year ago

    Awesome video! You concentrate on giving the intuition, thanks!

  • @marirsg · 1 year ago

    Beautifully explained! Thank you

  • @SignalProcessingWithPaul · 9 months ago

    Hey Iain, great content. Have you considered doing a video on the Cramer-Rao bound (and how it relates to Fisher information)? I was thinking of doing some statistical signal processing videos on my channel but you've covered so much already haha.

    • @iain_explains · 9 months ago

      Thanks for the suggestion. Yes, I've thought about it. It's the inverse of the Fisher information, but there's really not much intuition as to why this is the case - except that the maths turns out that way.
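
For context, the relationship described in this reply is the scalar Cramer-Rao lower bound: for any unbiased estimator of theta built from the measurement,

```latex
\operatorname{var}(\hat{\theta}) \;\geq\; \frac{1}{I(\theta)}
```

so large Fisher Information permits a small estimation variance, and small Fisher Information forces a large one.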

  • @mustaphasadok3172 · 2 years ago

    Amazing... Thank you professor. In the literature there are rarely clear books on the subject, besides Prof. Steven Kay's collection.

  • @rohansinghthelord · 3 months ago

    I'm a little confused as to why we take the log. Specifically, wouldn't we want the part of the function that changes the most to have more weight in the expectation? Aren't small changes not that notable in comparison?

  • @stillwalking78 · 7 months ago

    The most informative video on Fisher information I have seen, pun intended! 😄

  • @qudratullahazimy4037 · 1 year ago

    Absolutely great explanation! Made my life easy.

  • @ZardoshtHodaie · 1 year ago

    The beauty of math becomes evident when a good teacher teaches it :) ... thank you! thank you!

  • @niveditashrivastava8374 · 1 year ago

    Very informative video. The normal distribution is plotted for a particular value of the mean. How can we perform differentiation wrt the mean? Am I missing something here?

    • @iain_explains · 1 year ago

      In the definition of Fisher Information, there is a log function. This cancels out the exponential function in the function f(y;theta).
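
To make this reply concrete for the Gaussian example in the video (mean theta, variance sigma^2), the log turns the exponential into a quadratic that is easy to differentiate with respect to theta, even though each plotted pdf uses one particular value of the mean:

```latex
\ln f(y;\theta) = -\tfrac{1}{2}\ln\!\left(2\pi\sigma^{2}\right) - \frac{(y-\theta)^{2}}{2\sigma^{2}},
\qquad
\frac{\partial \ln f(y;\theta)}{\partial \theta} = \frac{y-\theta}{\sigma^{2}}
```

and squaring and taking the expectation gives \(I(\theta) = \mathbb{E}[(y-\theta)^{2}]/\sigma^{4} = 1/\sigma^{2}\).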

  • @user-fk2pc9zp3t · 1 year ago

    Really well explained, thank you!

  • @bobcoolmen1571 · 1 month ago

    Excellent video thank you sir.

  • @pitmaler4439 · 2 years ago

    Thanks. Is the FI only useful when you compare situations with identical PDFs?
    There doesn't seem to be a unit for the FI, or can you compare parameters with different PDFs?

    • @iain_explains · 2 years ago · +2

      Just to clarify, the FI is not "comparing situations". It is a measure of the information in a random variable drawn from a (single) PDF. Of course the FI measure for different random variables (PDFs) can be compared.

    • @pitmaler4439 · 2 years ago · +1

      @@iain_explains Yes, I just thought: you get a number for the FI, but for what purpose? Without a unit you cannot put the number in relation to anything. Now I read that the value is used for the Cramer-Rao bound.

    • @khalifi2100 · 2 years ago · +2

      Example: for the uniform distribution the FI is zero, which is why the uniform PDF is called an "uninformative PDF". So FI is a helpful measure even outside of Cramer-Rao bound calculations.
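
A sketch of why the uniform case behaves this way, for a location parameter (informal, since the usual regularity conditions fail at the support boundary):

```latex
f(y;\theta) = \frac{1}{w} \ \text{for } y \in [\theta,\,\theta+w]
\quad\Longrightarrow\quad
\frac{\partial \ln f(y;\theta)}{\partial \theta} = 0 \ \text{for } y \text{ inside the support}
```

The score is zero at every interior point, so the expected squared score is zero: small shifts of theta do not change the pdf value at points inside the support.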

  • @cerioscha · 1 year ago

    Great video thanks !

  • @nzambabignoumba445 · 6 months ago

    Wonderful!!!!

  • @natsuaya2859 · 1 year ago

    Great video, thank you! What about Fisher information and the CRLB?

    • @iain_explains · 1 year ago

      Yes, thanks for the suggestion. It's on my "to do" list.

  • @lostmylaundrylist9997 · 1 year ago · +1

    Excellent!

  • @vedantnipane2268 · 6 months ago

    🎯 Key Takeaways for quick navigation:
    00:01 🌡️ *Fisher Information measures information content in measurements about a parameter. An example is given with various methods to measure stomach temperature.*
    03:23 📊 *The Fisher Information formula is explained, involving the expected value of the squared derivative of the log of the probability density function (pdf) of a random variable.*
    07:22 📉 *Fisher Information is inversely proportional to the variance of noise in measurements. Small noise leads to high information, while large noise results in low information.*
    14:22 🔄 *The log function in the Fisher Information formula enhances small values in the pdf, ensuring contributions from all parts of the distribution, giving a comprehensive measure.*
    16:50 📈 *Fisher Information decreases as noise (standard deviation) increases, illustrated with visualizations of pdf changes and their impact on the information measure.*
    Made with HARPA AI

  • @ujjwaltyagi3030 · 1 year ago

    Also, in the video we never talked about which measurement gives the most info about theta. Is it the pills, the coffee, or the arm?

  • @govindbadgoti8678 · 1 year ago

    I am from India. Your video is so informative!

  • @arjunsnair4986 · 2 years ago

    Thank you sir

  • @a.nelprober4971 · 1 year ago

    Am I a dummy? For the Fisher info of theta I have computed theta/(population variance). (I have theta instead of 1.)

  • @xinpeiwu6086 · 1 year ago

    Absolutely made everything understandable, better than my college professor 😁

  • @gamingandmusic9217 · 1 year ago

    Sir, can you please tell the difference between
    1. Maximum likelihood (ML)
    2. Maximum a posteriori (MAP)
    3. Least squares (LS)
    4. Minimum mean square error (MMSE)
    5. Zero forcing (ZF)?
    Moreover, are the equalizer and receiver the same?
    If possible, please post a video on this topic, sir. Thank you so much for inspiring us, sir.

    • @iain_explains · 1 year ago · +2

      Have you checked out my webpage? iaincollings.com I've already got videos on all the topics you ask about. There are lots of them, but the three most relevant would be: "What are Maximum Likelihood (ML) and Maximum a posteriori (MAP)?" ua-cam.com/video/9Ahdh_8xAEI/v-deo.html and "How are Matched Filter (MF), Zero Forcing (ZF), and MMSE Related?" ua-cam.com/video/U3qjVgX2poM/v-deo.html and "What is Least Squares Estimation?" ua-cam.com/video/BZ9VlmmuotM/v-deo.html

    • @gamingandmusic9217 · 1 year ago · +1

      @@iain_explains Thank you so much sir. You have taken the time to give me a reply and all the links. Thank you so much again sir.

  • @tuongnguyen9391 · 1 year ago

    Could you kindly explain "what is polarization" in polar codes? :)

    • @iain_explains · 1 year ago · +1

      I'll have to give that one some more thought. I don't really have a good intuitive explanation right now.

    • @tuongnguyen9391 · 1 year ago · +1

      @@iain_explains Thank you professor

  • @musaadalruwaili5772 · 2 years ago

    Hi, I really enjoy your videos, and I have learned a lot. I tried to find your email to contact you, and I only found your University email. I am a Ph.D. student working on D2D based on NOMA systems. So, could you please explain the D2D system and how it works? Thank you.

    • @iain_explains · 1 year ago

      Thanks for the suggestion. I'll put it on my "to do" list.

  • @InquilineKea · 1 month ago

    12:00

  • @thomascook7948 · 7 months ago

    I wish you were my professor.

    • @iain_explains · 7 months ago

      That's nice of you to say. I'm glad you like the videos.

  • @ujjwaltyagi3030 · 1 year ago

    Mistake at 15:07: Fisher information would be 16, as 1/(0.25)^2 = 16.

    • @iain_explains · 1 year ago

      No, you're mistaken. I said the variance is 0.25. The symbol for variance is sigma^2.

    • @ujjwaltyagi3030 · 1 year ago · +1

      @@iain_explains OK, thanks, my bad.
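
For anyone following this thread, the arithmetic turns on whether 0.25 is the variance or the standard deviation in the Gaussian example:

```latex
\sigma^{2} = 0.25 \;\Rightarrow\; I(\theta) = \frac{1}{\sigma^{2}} = 4,
\qquad
\sigma = 0.25 \;\Rightarrow\; I(\theta) = \frac{1}{\sigma^{2}} = 16
```

The video states the variance, so 4 is the value that matches.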