This was a life saver…..finally someone explained the entire derivation steps! Was stuck at the 3rd last step for so long
MARVELOUS video. It saved my life and blew my mind!!!
Great video, thanks 👍🏻.
Very good explanation! Thank you!!!!
This is so helpful, thank you so much
Thank you
All I did after watching is just "hit the subscribe button"!
excellent, thank you! :)
Glad it helped!
thank you
Perfect explanation... Can you also explain maximum likelihood estimation for the logistic regression coefficients?
Thank you for this comment. Yes, this is a good idea. I will make a video on this either later today, or tomorrow. I will respond with a link as soon as it is ready.
New video on MLE for Logistic Regression: ua-cam.com/video/1YYi3pVbhwE/v-deo.html
Like it, thanks!
Thanks for the feedback! I'm happy to hear that you found this video to be helpful!
In almost all videos on YouTube, the explanation for the derivative of b'X'Xb is "ah, just treat it as if it were b^2", but that's actually wrong, whereas your solution is right.
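A quick numeric sanity check of the point above (a NumPy sketch with made-up data; the matrix X and vector b here are arbitrary examples): the gradient of b'X'Xb with respect to b is 2X'Xb (using the symmetry of X'X), and a central finite-difference approximation agrees with it.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # arbitrary example data
b = rng.normal(size=3)

A = X.T @ X
f = lambda v: v @ A @ v        # f(b) = b' X'X b

# Analytic gradient: 2 X'X b (valid because X'X is symmetric)
grad_analytic = 2 * A @ b

# Central finite-difference gradient, one coordinate at a time
eps = 1e-6
grad_fd = np.array([
    (f(b + eps * e) - f(b - eps * e)) / (2 * eps)
    for e in np.eye(3)
])

print(np.allclose(grad_analytic, grad_fd, atol=1e-4))  # True
```

Since f is quadratic in b, the central difference is exact up to floating-point rounding, so the two gradients match closely.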
How does taking the derivative with respect to b ensure that we get the minimum of the squared residuals and not the maximum?
Good question. To answer it, take the second derivative, which gives 2[X^tX]; this matrix is positive definite whenever the columns of X are linearly independent (in the scalar case, whenever X is a nonzero real number), so b is a local minimum, and in fact the global one.
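The second-derivative argument can be checked numerically (a NumPy sketch with simulated data; the coefficients and noise here are illustrative assumptions): the Hessian of the sum of squared residuals is 2X^tX, all of its eigenvalues are positive when X has full column rank, and perturbing the critical point b_hat = (X^tX)^-1 X^ty increases the residual sum of squares.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))              # full-column-rank design matrix
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=50)

# Critical point from setting the gradient to zero: b_hat = (X'X)^-1 X'y
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Hessian of the RSS is 2 X'X; all eigenvalues positive => minimum
eigvals = np.linalg.eigvalsh(2 * X.T @ X)
print(np.all(eigvals > 0))                # True

# Any perturbation away from b_hat increases the residual sum of squares
rss = lambda b: np.sum((y - X @ b) ** 2)
perturbed = b_hat + rng.normal(scale=0.1, size=3)
print(rss(perturbed) > rss(b_hat))        # True
```

Because the Hessian does not depend on b, positive definiteness at the critical point makes b_hat not just a local but the global minimizer.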