This video is really great. I have seen a lot of youtube videos about linear regression using matrices, but none of them was that clear and pedagogic. Thank you.
The best I have seen on this topic so far. Thank you so much. I wish you could illustrate in another video why we assume E(e_i) = 0, covering the assumptions with an example like you did here. Best,
Very nice video, thanks. Just one question: if beta_hat = [ b m ] with one explanatory variable, does it become beta_hat = [ b m1 m2 m3 ] with three explanatory variables?
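Yes, that's the idea: with an intercept plus three explanatory variables, beta_hat has four entries. A minimal numpy sketch with made-up data (all values here are hypothetical, just to show the shapes):

```python
import numpy as np

# Hypothetical data: 5 observations, 3 explanatory variables.
X_raw = np.array([[1.0, 2.0, 3.0],
                  [2.0, 1.0, 0.0],
                  [3.0, 4.0, 1.0],
                  [4.0, 2.0, 2.0],
                  [5.0, 5.0, 4.0]])
y = np.array([10.0, 6.0, 14.0, 13.0, 21.0])

# Prepend a column of ones for the intercept b, so X = [1 | x1 | x2 | x3].
X = np.column_stack([np.ones(len(y)), X_raw])

# beta_hat = (X^T X)^{-1} X^T y = [b, m1, m2, m3]
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat.shape)  # one intercept plus three slopes
```

The column of ones is what turns the intercept b into just another coefficient in the same vector.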
Solving the system X^T X A = X^T Y for A by other methods (Gaussian elimination, matrix decomposition, the conjugate gradient method, etc.) is faster and more numerically stable than computing the inverse of X^T X, especially for large matrices.
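To make this concrete, here is a rough numpy sketch (with synthetic data) comparing the explicit inverse against a direct solve and a least-squares routine; on well-conditioned problems all three agree, but the latter two are the numerically safer choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic design matrix (intercept + 3 predictors) and response.
X = np.column_stack([np.ones(100), rng.normal(size=(100, 3))])
y = rng.normal(size=100)

# Fragile: explicitly invert X^T X.
A_inv = np.linalg.inv(X.T @ X) @ (X.T @ y)

# Better: solve the normal equations directly (LU factorization).
A_solve = np.linalg.solve(X.T @ X, X.T @ y)

# Best in practice: least squares via SVD, never forming X^T X at all.
A_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(A_inv, A_solve), np.allclose(A_solve, A_lstsq))
```

Avoiding X^T X entirely (as `lstsq` does) matters because forming it squares the condition number of the problem.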
Finally - someone who can "teach" the concepts vs. just reading slides and equations - You Rule!
Great explanation. As a tutor at PC, I had a student with this kind of problem and didn't know how to help him. Now I do.
One of the best explanations of linear regression in matrix form.
I looked at this in several books, but this helped the most!
most helpful video i've seen in a while! very in-depth calculation-wise and was good review for my exam!
A wonderful overview. Very clear explanation and great pacing. Thank you.
Oh, there is a little mistake. In the slide at 5:27 you corrected the value "25367". I'm sorry for the inconvenience. Thank you.
Went through tons of material including Coursera and Udacity, but your tutorial is the best!!
Thank you! This was SO informative. Best content on this subject I've seen so far!
You explained this so much more clearly, omg.
I wish I could give this video 1000 thumbs up.
Thank you!
I agree!
Same
Man I was struggling until you came along. Thank you so much!
Thank you very much. A video on the derivation of OLS using matrix notation would be very much appreciated!
Thanks a lot! This deserves my immediate subscription! Hope you make a playlist that contains all kind of estimator methods!
This has helped me a lot. Hope this comes up in my end-of-semester exam.
6:24 Looking for confirmation; should the formula for e_i have y_i instead of y_1?
thank you, this has been of great help in my assignment
You actually rescued me...Many thanks... Love the quote too
This was really good. I was surprised - Please post more
Thanks you so much for all your videos
you really help us with all your jobs, thanks again
This is so helpful to understand regression analysis. Thank you so much.
New subscriber! I really enjoyed this video! thanks for this great explanation!!!!
Highly efficient and effective. Thank you!
Thanks. This is the best explanation of linear algebra I have ever seen. Do you have a statistics or data science playlist?
Wow, thank you for such a complimentary comment! I appreciate it.
Watching in 2018. Very helpful, sir 👍🏻👍🏻
At t=2:28, where did the expression for A come from?
What a great video! But please suggest where I can gain an understanding of the equation at 2:20. Thanks in advance.
So great! I need more examples for ridge regression! Thank you.
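For anyone else curious about ridge: it only changes the normal equations by adding lambda*I inside the inverse. A rough numpy sketch with synthetic data (the penalty value here is arbitrary, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=50)

lam = 0.5  # ridge penalty lambda (chosen arbitrarily here)

# Ridge estimate: beta = (X^T X + lam*I)^{-1} X^T y.
# The added lam*I keeps the matrix invertible even when X^T X is singular.
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Ordinary least squares for comparison (lam = 0).
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge shrinks the coefficient vector toward zero.
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))
```

The shrinkage is the whole point: for any lam > 0 the ridge coefficients have a smaller norm than the OLS ones.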
Thanks man you saved my day!
What if I want to solve with a constraint, like b = 0? (i.e., my regression has to pass through the origin)
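One common way to get b = 0 is simply to drop the column of ones from X, so the model becomes y = m*x with no intercept. A quick numpy sketch with made-up points:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Unconstrained fit: X has an intercept column, beta_hat = [b, m].
X = np.column_stack([np.ones_like(x), x])
b, m = np.linalg.lstsq(X, y, rcond=None)[0]

# Through the origin (b = 0): drop the column of ones, so the model
# is y = m*x and the closed form is m = (x.y) / (x.x).
m0 = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
print(m0, x @ y / (x @ x))  # the two computations agree
```

Note that forcing the line through the origin changes the slope estimate unless the unconstrained intercept was already near zero.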
thank you, you help me for my exam
Thanks so much! Explained very well and helpful!
Nice thank you. I have an off-the-track question - Can you please let me know how you have recorded this video? What tool did you use?
Thanks for the use cases
Great explanation, I wish you could share the slides
Thanks! This rescued me
Wow, this is awesome man, thanks
Really well explained, thank you :)
how did you get 25 365?
Where can I find the playlist?
Wow, super useful, thanks a million!
great video, thank you!
you're a hero
Thank you so very much for the video.
Awesome video!
Every point well explained. Thanks for uploading this video :-)
Could you please upload one for logistic regression too? Please :-)
Hello Sir, watching in 2018. First, thank you for the explanation, and second: what software are you using in this video? Thank you & have a nice day
Thanks. Awesome video.
very helpful thank you very much
Great video!
Thanks a million 👍
How is 211 found?
Helpful. Thanks.
thx G u a real one.
This is great! Some presenters have a bland droning voice. Not this guy!
Thank you so much!
cheers mate big help
super helpful
thank you bro really good :)
really good :)
Thank you for the comment.
THANK U. THANK U. THANK U.
Who can tell me how to get SSE=E'E in matrices?
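The residual vector is e = Y - X*beta_hat, and SSE = sum of e_i^2, which is exactly the product e'e (a 1x1 matrix, i.e., a scalar). A tiny numpy check with made-up data:

```python
import numpy as np

# Hypothetical small data set.
X = np.column_stack([np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])])
y = np.array([1.1, 1.9, 3.2, 3.9])

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta_hat      # residual vector e

# SSE as a matrix product e'e equals the elementwise sum of squares.
sse_matrix = e @ e        # e'e
sse_sum = np.sum(e**2)    # sum_i e_i^2
print(np.isclose(sse_matrix, sse_sum))
```

The equality holds simply because multiplying a row vector e' by the column vector e sums the squares of its entries.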
Thanks very much.
It is so interesting.
I was interested in the explanation of the formulas. The video was worthless for me.
Thank you, senpai.
Fantastic. If you were here I would hug and kiss you. Thanks so much, dude!
I found 25.367 and not 25.365 =(