Great video!!!
This makes so much sense now. I was having trouble in my ML course, and you just blew my mind. Math can be so interesting, and the whole journey so exciting, if we try to understand it like this!!!!!
Thank you so much!!!
Such a great teacher, Khan
Thank you so much! These proof videos are making my life easier.
Good video!! I can't believe professors don't explain the whole picture and the fundamentals...
this is great Sal, thanks for your work!
Can anyone recommend a YouTube video or other source that helps to better visualize and gain intuition for this 3D concept of optimizing for m & b?
Excellent and simple explanation! Only now do I understand.
This has been veeerrrryyyy helpful 😍
Thank you so much.😌😌😌
This video is misplaced in the playlist; it should actually be video no. 53. Please correct it if possible.
At 7:00, how does he decide that the surface is going to be 3D? And how do you know which surface it will be? How did you predict the surface would be a 3D parabola? Why not some other surface, say a sphere?
The 3D surface is there because the squared error depends on the regression line we choose. The regression line is a function of two variables (m, b), so the squared error is a dependent variable that depends on two independent variables (m, b). In such cases we get a 3D surface, much like z = f(x, y).
Since the squared error is quadratic in m and b, the surface is a bowl-shaped paraboloid, so there is a single optimum point where the squared error is minimal.
Hope this helps!!
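The reply above can be checked numerically: with a small made-up dataset (the numbers below are invented purely for illustration), the squared error really is a function of the two variables (m, b), and sampling it over a grid reveals the bowl shape with a single minimum. A minimal sketch:

```python
import numpy as np

# Made-up example data, just to illustrate the reply above
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

def sse(m, b):
    """Sum of squared errors for the candidate line y = m*x + b."""
    return float(np.sum((y - (m * x + b)) ** 2))

# Evaluate the squared error over a grid of (m, b) values:
# this is exactly the bowl-shaped 3D surface z = f(m, b)
ms = np.linspace(0.0, 4.0, 81)
bs = np.linspace(-2.0, 2.0, 81)
Z = np.array([[sse(m, b) for b in bs] for m in ms])

# The lowest grid point sits near the true least-squares line
i, j = np.unravel_index(np.argmin(Z), Z.shape)
m_best, b_best = np.polyfit(x, y, 1)
print(ms[i], bs[j])      # grid minimum (close to the fit below)
print(m_best, b_best)    # exact least-squares slope and intercept
```

The grid minimum lands within one grid step of the exact least-squares solution, which is the bottom of the bowl.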
Hi all, I am very new to statistics.
Why did we treat m and b as axes, when m is the slope and b is the intercept? Please explain.
Also, why is our goal to minimize the cup-like surface? (at 8:40)
The 3D surface is there because the squared error depends on the regression line we choose. The regression line is a function of two variables (m, b), so the squared error is a dependent variable that depends on two independent variables (m, b). In such cases we get a 3D surface, much like z = f(x, y).
We minimize the squared error because we want the best approximation for the regression line.
Hope this helps!!
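To connect this to the formulas in the video: the (m, b) that minimize the squared error come out in terms of sample means, with b = ȳ − m·x̄. A quick check with invented numbers (the data below are made up for illustration), cross-checked against numpy's own least-squares line fit:

```python
import numpy as np

# Invented data points, for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Minimizing the squared error gives m and b in terms of sample
# means, as derived in the video series
m = (x.mean() * y.mean() - (x * y).mean()) / (x.mean() ** 2 - (x ** 2).mean())
b = y.mean() - m * x.mean()

# numpy's degree-1 polynomial fit minimizes the same squared error
m_np, b_np = np.polyfit(x, y, 1)
print(m, b)    # matches (m_np, b_np) up to floating-point error
```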
Amazing. I didn't know we would end up with 3D equations from trying to minimize the least-squares error of a 2D plot. I wonder what happens if we want to optimize a 3D plot, then.
Do you ever stop working????? LOL. And how do you know so much about everything??????
I understood everything until that 3D figure popped up.
The 3D surface is there because the squared error depends on the regression line we choose. The regression line is a function of two variables (m, b), so the squared error depends on two independent variables (m, b). In such cases we get a 3D surface, much like z = f(x, y). Hope this helps!!
We can visualize functions of two independent variables using 3D surface plots or contour plots.
Great video series! However, I don't quite agree that y1^2 + y2^2 + ... + yn^2 = n*(y_bar)^2; shouldn't it be (y1 + y2 + ... + yn)^2 = (n*y_bar)^2? It doesn't matter for the end result, because this term is a constant. But still :D
m^2 * (mean x^2 / mean x^2) should cancel out, leaving m^2. Similarly, the next term in the equation should be 2mb * (mean x / mean x), leaving 2mb!!
+Praveen Kambhampati I suppose you are talking about the picture at 5:30. Those "mean x^2" and "mean x" at the bottom of your fractions are not actually parts of any fractions in the video (and therefore can't be canceled out). They are just captions labeling new, shorter names for the long sums in brackets.
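For anyone following this sub-thread: if I'm reading the video right, the identity being used is y1^2 + ... + yn^2 = n * (mean of y^2), where "mean of y^2" is the average of the squared values, not the square of the average. A quick numerical check with invented numbers:

```python
import numpy as np

# Invented values, just to check the identity
y = np.array([2.1, 3.9, 6.2, 7.8])
n = len(y)

sum_of_squares = np.sum(y ** 2)       # y1^2 + ... + yn^2
n_mean_of_sq = n * np.mean(y ** 2)    # n * (mean of y^2): equal to the above
n_ybar_sq = n * np.mean(y) ** 2       # n * (y_bar)^2: generally different
print(sum_of_squares, n_mean_of_sq, n_ybar_sq)
```

So the video's term is correct as written once "mean of y^2" is read as the mean of the squares.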
WOW..
Thanks, bro!