Gauss-Markov assumptions part 2

  • Published 16 Oct 2024
  • This video details the second half of the Gauss-Markov assumptions, which are necessary for OLS estimators to be BLUE.
    Hi, thanks for joining me. Today we are going to be talking about the second half of the Gauss-Markov assumptions. If you missed the first half, you may want to have a look at the previous video, which works through assumptions one to three. Just to reiterate: the Gauss-Markov assumptions are the set of conditions which, if they are upheld, mean that least-squares estimators are BLUE; that is, they are the best linear unbiased estimators which are possible.

    The fourth Gauss-Markov assumption is something which we refer to as no perfect collinearity. This refers to our particular sample, but by extension it also refers to the population. So what does it actually mean? No perfect collinearity in the regressors means that if I have some model, y equals alpha, plus 'beta one' times 'x one', plus 'beta two' times 'x two', plus some error, then there cannot be an exact relationship between 'x one' and 'x two'. That is, I cannot write down an equation stating that 'x one' is equal to 'delta nought' plus 'delta one' times 'x two'. If I could, then knowing 'x two' would tell me 'x one' exactly; in a sense, 'x one' and 'x two' would carry exactly the same information. An example of this might be: suppose I was trying to determine which attributes affect the price of a given house, and I included in my regression both the square meterage of that house and also its square footage. Obviously, if I know the square meterage, I actually know the square footage; they are both essentially the same thing, since square footage is roughly 10.8 times the square meterage of the house. So within a regression, I am going to have a hard time unpicking square footage from square meterage, because they carry exactly the same information. The assumption of no perfect collinearity among regressors means that I cannot include both of these variables in my regression.
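The house-price example can be sketched numerically. This is a minimal simulation of my own (the data, the conversion factor, and all variable names are illustrative, not from the video): when one regressor is an exact linear function of another, the matrix X'X is rank-deficient, so the OLS normal equations have no unique solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

x2 = rng.normal(50.0, 10.0, size=n)   # e.g. square meterage of each house
x1 = 10.76 * x2                       # square footage: an exact linear function of x2
y = 30.0 + 2.0 * x1 + rng.normal(size=n)

# Design matrix: intercept plus the two perfectly collinear regressors
X = np.column_stack([np.ones(n), x1, x2])

# OLS needs X'X to be invertible, but perfect collinearity makes it rank-deficient
rank = int(np.linalg.matrix_rank(X.T @ X))
print(rank)   # 2 rather than 3: the normal equations have no unique solution
```

Statistical packages typically refuse to fit (or silently drop a column from) such a regression, which is the practical symptom of violating this assumption.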
Assumption five is called 'homoskedastic errors'. Suppose I draw a process: say I have the number of years of education on one axis and the wage rate on the other, and this again refers to the population rather than to the sample. If, when I draw the population line, the errors look something like this, whereby the spread of the errors away from the line remains relatively constant, lying between the error bands which I draw here, with no increase or decrease of the errors along the education variable, then the errors are homoskedastic. Mathematically, that just means that the variance of the error in the population process is equal to some constant, 'sigma squared'; or, writing it a little bit more completely, the variance of 'u i' given 'x i' is equal to 'sigma squared'. In other words, the variance (how far the points are away from the line) does not vary systematically with x.

The last Gauss-Markov assumption is called 'no serial correlation'. Mathematically, this means that the covariance between a given error 'u i' and another error 'u j' must be equal to zero unless i equals j, in which case we are considering the covariance of the error with itself, which is its variance, and that is covered by assumption five. So this last assumption of 'no serial correlation' means that the errors essentially have to be uncorrelated with one another: knowing one of the errors doesn't help me (linearly) predict another error. In other words, if I know this error here in my diagram, this doesn't help me predict the error here, for a higher level of education. Strictly speaking, zero covariance is a weaker condition than full independence. This concludes my video summarising the Gauss-Markov assumptions. I'm going to examine each of these assumptions in detail in the next few videos. I'll see you then. Check out ben-lambert.co... for course materials, and information regarding updates on each of the courses.
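Both assumptions can be checked in a small simulation of the education-wage example (a sketch with made-up numbers, not from the video): draw i.i.d. errors with constant variance, then verify that the error spread does not change with education and that neighbouring errors are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

educ = rng.uniform(0.0, 20.0, size=n)   # years of education
u = rng.normal(0.0, 2.0, size=n)        # homoskedastic errors: Var(u | x) = sigma^2 = 4
wage = 5.0 + 1.5 * educ + u             # population process

# Homoskedasticity: the error variance is the same for low and high education
var_low = u[educ < 10].var()
var_high = u[educ >= 10].var()
print(round(var_low, 2), round(var_high, 2))   # both close to 4

# No serial correlation: Cov(u_i, u_j) should be ~0 for i != j
# (checked here for adjacent pairs of errors)
cov_adjacent = np.cov(u[:-1], u[1:])[0, 1]
print(round(cov_adjacent, 3))                  # close to 0
```

Heteroskedasticity would show up here as var_low and var_high drifting apart, e.g. if the error scale grew with education.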
Quite excitingly (for me at least), I am about to publish a whole series of new videos on Bayesian statistics on youtube. See here for information: ben-lambert.co... Accompanying this series, there will be a book: www.amazon.co....

COMMENTS • 28

  • @christianilkjr6771 · 8 months ago

    Thank you so much for these brilliant videos. I am now taking econometrics 2 (panel data), but I didn't have the proper foundation from the first course since I was a bit too lazy at the time. Going through econometrics 1 and 2 at the same time is a bit of a mouthful, but these videos are a great help. Concise, and the many short videos somehow seem less daunting than sitting through a 2-hour lecture.

  • @danielatatarlieva1854 · 11 years ago +4

    I've been preparing for an exam this whole week, and watching your videos is really helpful! Especially to understand why something is done this or that way which is actually the most important question! So thanks :)
    I wish my professor was like that...

  • @zumasuma5489 · 6 years ago +10

    Fantastic job! Thank you. Greetings from Germany

  • @sandeepmandrawadkar9133 · 3 years ago +1

    Thanks for sharing highly simplified yet very much informative videos 👌👌👌
    Your efforts to share knowledge is highly appreciated. Thank you again.

  • @spart230 · 10 years ago +8

    Oh God, thank you so much! I have had so much trouble figuring out what "constant variance of error" actually means - your explanation was CLEAR and I mean CLEAR - this is after reading several econometrics textbooks!!!

  • @sangheehwang9679 · 6 years ago +1

    Thank you so much! I had been looking for the G-M assumptions for a while and finally got it! Really appreciate your video.

  • @hansmoleman777 · 11 years ago +1

    Your videos are very helpful to figure out econometrics. Thanks for your great work!

  • @ryandeng2954 · 4 years ago +1

    Thank you, this explanation of homoskedasticity was really helpful!

  • @zuzuzutiga · 3 years ago +1

    Hello Professor, I have a question: Cov(ui, uj) = 0 doesn't mean that ui and uj are independent. At the end of the video, why did you say that they are essentially independent?

    • @kottelkannim4919 · 3 years ago

      Hear, hear.
      He may have had in mind another assumption, used later on for hypothesis testing, which requires {ui} to be jointly normally distributed.

  • @sweatergod4755 · 4 years ago

    Only video that made sense about this. I appreciate this a lot

  • @mengwang6936 · 5 years ago +2

    I'm here in 2019 to thank you for making this video

  • @jaimereyes9808 · 6 years ago

    Your videos are great! Very clear and sound. Thanks a lot!

  • @thomasmattelaer200 · 5 years ago

    Thanks a lot Ben, we are going to pass our exam!

  • @joaoluistbarroso6917 · 5 years ago

    Fantastic!!!
    Ben Lambert for the Education Nobel Award!

  • @khaledmustafa7341 · 4 years ago

    Thank you very much, dear teacher
    I have a question
    If we have more than two independent variables in this case, how can I find the significance test for the estimated parameters?

  • @equalitydisabilityandhealt4680 · 4 years ago

    helping me through my exams! thanks

  • @krabbypatty6896 · 3 years ago

    Superb. Thank you so much! :D

  • @hounamao7140 · 7 years ago +1

    I don't get how we can make the homoskedastic assumption in practice; how could the variance of the error always be the same?

      • @ruchiyadav3633 · 7 years ago +1

      It is not necessary, but as we know the regression line goes through the best possible place, minimising the disturbance terms, which means homoscedasticity can occur.

  • @shirubana · 8 years ago

    Great, great explanation. Thanks!

  • @duanjiaxin1240 · 6 years ago

    Cheers! Your video really solved my problems

  • @uhdang · 3 years ago

    Thank you !!

  • @tomarkhelpalma138 · 5 years ago

    Thank you!!!