SURE estimator derivation - part 1

  • Published 15 Oct 2024

COMMENTS • 6

  • @NnRNoAh 3 years ago

    I am using NLSUR for my research, and it is sometimes difficult for me to explain the benefits of using NLSUR for a simultaneous system of equations over an IV approach in which I recover a valid IV from another equation. My explanation centres on the need to estimate multiple parameters from multiple equations at the same time, as my structural model suggests. This explanation, however, is incomplete. Not only does my simultaneous system of equations need to be estimated with NLSUR to guarantee theoretically consistent parameters (hence unbiasedness), but I also need to deal with the heteroskedasticity and autocorrelation of my system (hence consistency). I am not an econometrician, so I am happy to say that your video has been really helpful in complementing my answer.

  • @c0bhc666 7 years ago

    Your videos are fantastic, thank you.

  • @SC-hq9mk 6 years ago +1

    While finding the expression for the transformation matrix, why did you set P.Sigma.P' = I and not sigma^2 * I, in line with the assumption of no heteroscedasticity?

    • @嘻嘻大好人 5 years ago +2

      The purpose here is to eliminate the heteroscedasticity and serial correlation. Setting P.Sigma.P' = I is enough to guarantee this, because sigma^2 is just a constant. Hope this explanation helps a bit.
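
      A minimal sketch of this point, assuming Sigma denotes the error variance-covariance matrix and sigma^2 > 0 is a scalar:

      % If some P achieves P Sigma P' = sigma^2 I, rescaling P by 1/sigma already
      % gives the identity, so nothing is lost by targeting I directly.
      \[
        P \Sigma P' = \sigma^{2} I
        \;\Longrightarrow\;
        \Bigl(\tfrac{1}{\sigma} P\Bigr) \, \Sigma \, \Bigl(\tfrac{1}{\sigma} P\Bigr)' = I .
      \]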

    • @mdenayethossain6194 4 years ago

      @@嘻嘻大好人 So, do you mean that sigma^2 has been left out here in order to also take serial correlation into account?

    • @kottelkannim4919 3 years ago

      ​@@mdenayethossain6194 In short, No.
      1. To begin with, there are N different sigmas, one for each participant/individual, so one wouldn't know which sigma to pick.
      2. To be able to use the OLS algorithm, one aims at a variance-covariance matrix that is diagonal* and has identical elements on its main diagonal**.
      * The diagonality restriction arises from the Gauss-Markov no-serial-correlation assumption.
      ** The identical-elements-on-the-diagonal restriction arises from another Gauss-Markov assumption, homoscedasticity (the same sigma/variance for every data point).
      Now, you have a symmetric, positive semi-definite variance-covariance matrix SIGMA,
      SIGMA = var(EPSILON).
      The problem is that it is NOT diagonal, and one CANNOT use the OLS algorithm when the variance-covariance matrix is non-diagonal, as SIGMA is here.
      Since one wishes to use the OLS algorithm, one wishes for some variable transformation, a P matrix, applied to the EPSILON vector such that the transformed vector, namely P*EPSILON, has a DIAGONAL variance-covariance matrix.
      The variance matrix of any transformed_EPSILON = P*EPSILON is
      var(transformed_EPSILON) = P*SIGMA*P'.
      The fact that a P achieving a diagonal var(transformed_EPSILON) exists is a virtue of SIGMA's symmetry (look up, for example, the LDLT or Cholesky decompositions).
      This brings me back to your original question. If one uses
      P*SIGMA*P' = constant*I
      instead of
      P*SIGMA*P' = I,
      then P and P' can easily be rescaled so that the constant multiplying I is 'absorbed' into them, preserving the diagonality (and the identical elements on the diagonal) of var(transformed_EPSILON).
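
A minimal numerical sketch of the transformation discussed in this thread (not from the video): it uses an arbitrary illustrative 3x3 SIGMA and the Cholesky factorization SIGMA = L*L', taking P = L^-1 so that P*SIGMA*P' = I.

    import numpy as np

    # Illustrative, made-up variance-covariance matrix: symmetric and positive
    # definite, with a non-constant diagonal (heteroscedasticity) and non-zero
    # off-diagonal elements (serial correlation).
    Sigma = np.array([[2.0, 0.5, 0.2],
                      [0.5, 1.0, 0.3],
                      [0.2, 0.3, 1.5]])

    # Cholesky factorization: Sigma = L @ L.T with L lower triangular.
    L = np.linalg.cholesky(Sigma)

    # Take P = inverse of L. Then P @ Sigma @ P.T = I, i.e. the transformed
    # errors P @ epsilon are homoscedastic and serially uncorrelated.
    P = np.linalg.inv(L)

    print(np.round(P @ Sigma @ P.T, 10))   # 3x3 identity matrix

In practice SIGMA is unknown and has to be estimated first (as in feasible GLS/SUR), but the algebra of the transformation is the same.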