Simple Linear Regression MLE are the same as LSE
- Published Jul 8, 2024
- In this video I show that, under the normality assumption for the model error, the Simple Linear Regression Maximum Likelihood Estimators are the same as the Least Squares Estimators
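To illustrate the claim numerically, here is a minimal Python sketch (not from the video; the simulated data and the perturbation check are my own): it computes the closed-form least squares estimates and then verifies that the normal log-likelihood is higher at those estimates than at nearby perturbed coefficients, as the MLE = LSE result predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 1.5, n)  # true model: normal errors

# Least squares estimates (closed form)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

def log_lik(beta0, beta1, sigma2):
    """Normal log-likelihood of the simple linear regression model."""
    resid = y - beta0 - beta1 * x
    return -0.5 * n * np.log(2 * np.pi * sigma2) - np.sum(resid ** 2) / (2 * sigma2)

sigma2_hat = np.mean((y - b0 - b1 * x) ** 2)  # MLE of sigma^2

# Because the LSE minimizes the sum of squared errors, the log-likelihood
# (with sigma^2 fixed) is maximized there: perturbing either coefficient lowers it.
ll_at_lse = log_lik(b0, b1, sigma2_hat)
for d in (-0.1, 0.1):
    assert ll_at_lse > log_lik(b0 + d, b1, sigma2_hat)
    assert ll_at_lse > log_lik(b0, b1 + d, sigma2_hat)
```

The perturbation check stands in for a full numerical optimization: since the sum of squared errors is strictly convex in (beta0, beta1), beating all nearby points is enough to show the LSE sits at the likelihood's unique peak.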
Very, very helpful... thanks. I was just unable to keep up with the professor's pace in linear models... Your playlist really helped me a lot. Thanks again ❤️
So glad to hear that you found this playlist to be helpful!! :-)
Thanks a lot, life saver before the stats exam.
Thank you very much, ma'am, for this video. You made my day.
Thank you so much, ma'am...
Thank you very much for this video. However, I think the Yi's are independent but not identically distributed, since they have different expected values. Even though we view the Xi's as constants, those values change from one y value to another. What do you think about this?
You are correct. The error term, epsilon, is identically distributed. The y's depend on the subscript i: for each i, y has a different expected value.
Can you please explain why y will have normal distribution?
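A short note that may help with both questions above (my own sketch under the standard model assumptions, not from the video): each Y_i is a constant plus a normal error term, and an affine function of a normal variable is normal; its mean shifts with x_i, which is exactly why the Y_i are independent but not identically distributed.

```latex
% Under the standard assumptions, Y_i is an affine function of a normal error:
Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i,
\qquad \varepsilon_i \overset{iid}{\sim} N(0, \sigma^2)
% so, treating x_i as a constant,
Y_i \sim N(\beta_0 + \beta_1 x_i,\; \sigma^2)
% independent, but with means that change with x_i
% (hence not identically distributed).
```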
Thank you very much for this video. I just had one doubt: don't we take ln of L in MLE? Or is it not required here?
No need in this case for the linear model of y; since ln is strictly increasing, maximizing L and maximizing ln L give the same estimators, so the log is a convenience rather than a requirement.
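For reference, a sketch of why the log is optional here (my own summary, consistent with the reply above): the likelihood and log-likelihood share the same argmax, and for the normal model that argmax is the least squares minimizer.

```latex
L(\beta_0, \beta_1, \sigma^2)
  = (2\pi\sigma^2)^{-n/2}
    \exp\!\Big( -\tfrac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2 \Big)
% Taking the log (strictly increasing, so the argmax is unchanged):
\ln L = -\tfrac{n}{2} \ln(2\pi\sigma^2)
        - \tfrac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2
% For fixed \sigma^2, maximizing either L or \ln L over (\beta_0, \beta_1)
% means minimizing \sum_i (y_i - \beta_0 - \beta_1 x_i)^2,
% which is the least squares criterion.
```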
Hey, isn't it wrong to assume that the x_i are non-random? My understanding is that they ARE random, and we can only say that y_i is unconditionally normal if x_i and y_i are jointly normally distributed.
Can you explain why you said 'these are residuals' at 7:50, when they are epsilon/the error terms?
In this video of yours: ua-cam.com/video/RZVXf50-HWE/v-deo.html, the residual is what is being minimised, i.e. (Yi - Yi_hat)^2,
where Yi = beta_0 + beta_1 * Xi + Error
and Yi_hat = beta0_hat + beta1_hat * Xi.
Just asking; a bit confused.
Why are there ads every 10 seconds?
I'm sorry you had that experience! I do not usually have mid-roll ads on my videos. I mistakenly had them enabled on this video; I went ahead and corrected this, so you should not see any mid-roll ads on this video now.
@Stats4Everyone Thank you, that is much appreciated.