Proof (part 2) minimizing squared error to regression line | Khan Academy

  • Published 21 Dec 2024

COMMENTS • 24

  • @meme12389
    @meme12389 3 months ago +1

    Great video!!!
    This makes so much sense now. I was having trouble in my ML course. You just blew my mind. Math can be so interesting, and the whole journey can be so exciting, if we try to understand it like this!!!!!
    Thank you so much!!!
    Such a great teacher, Khan

  • @madinabonuturdalieva8299
    @madinabonuturdalieva8299 1 year ago +1

    Thank you so much! These proof videos are making my life easier.

  • @smrekojow
    @smrekojow 9 years ago +6

    Good video!! I can't believe professors don't explain the whole picture and the fundamentals...

  • @mialmastaposeia
    @mialmastaposeia 2 years ago

    This is great, Sal, thanks for your work!

  • @SportsManVegetal
    @SportsManVegetal 10 months ago

    Can anyone recommend a YouTube video or other source that helps to better visualize and gain intuition for this 3D concept of optimizing for m & b?

  • @maheswarisrinivasan8391
    @maheswarisrinivasan8391 4 years ago

    Excellent and simple explanation! Only now do I understand.

  • @anweshadutta8782
    @anweshadutta8782 5 years ago

    This has been very, very helpful 😍

  • @anweshadutta8782
    @anweshadutta8782 5 years ago

    Thank you so much.😌😌😌

  • @ishangarg2227
    @ishangarg2227 2 years ago +1

    This video is misplaced in the playlist. It should actually be video no 53; please correct it if possible.

  • @niteshmethani9884
    @niteshmethani9884 7 years ago +1

    At 7:00, how does he decide that the surface is going to be 3D? And how do you know which surface it will be? How did you predict that the surface would be a 3D parabola? Why not some other surface, say a sphere?

    • @amitjoshi-eb5vh
      @amitjoshi-eb5vh 3 years ago +1

      The 3D surface is there because the squared error depends on the regression line we choose. The regression line is a function of two variables (m, b), so the squared error is a dependent variable that depends on two independent variables (m, b). In such cases we get a 3D surface, much like z = f(x, y).
      Because the squared error is quadratic in both m and b, that surface is an upward-opening paraboloid, so there is a single optimum point where the squared error is minimized.
      Hope this helps!!
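A minimal sketch of this picture, with hypothetical data and my own variable names (`sse`, etc.): for every candidate pair (m, b), evaluate the squared error and plot it, which traces out the upward-opening paraboloid described above.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data points; any (x, y) sample will do.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# Grid of candidate slopes m and intercepts b.
m, b = np.meshgrid(np.linspace(-1, 3, 100), np.linspace(-3, 5, 100))

# Squared error of the line y = m*x + b, summed over all points,
# evaluated at every (m, b) on the grid.
sse = sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))

# The surface z = SSE(m, b) is quadratic in both m and b: an
# upward-opening paraboloid with a single minimum.
ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(m, b, sse, cmap="viridis")
ax.set_xlabel("m (slope)")
ax.set_ylabel("b (intercept)")
ax.set_zlabel("SSE")
plt.show()
```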

  • @91vasanth
    @91vasanth 6 years ago

    Hi all, I am very new to statistics.
    Why do we treat m and b as axes, when m is the slope and b is the intercept? Please explain.
    Also, why is our goal to find the minimum of that cup-like surface? (at 8:40)

    • @amitjoshi-eb5vh
      @amitjoshi-eb5vh 3 years ago

      The 3D surface is there because the squared error depends on the regression line we choose. The regression line is a function of two variables (m, b), so the squared error is a dependent variable that depends on two independent variables (m, b). In such cases we get a 3D surface, much like z = f(x, y).
      We minimize the squared error because we want the best approximation for the regression line.
      Hope this helps!!
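As a concrete illustration of "minimize to get the best line", here is a rough sketch with hypothetical data and my own helper `sse`: a brute-force search over (m, b) lands on roughly the same line that `np.polyfit` computes exactly.

```python
import numpy as np

# Hypothetical data (same style as the surface-plot sketch above).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

def sse(m, b):
    """Sum of squared vertical distances from the points to y = m*x + b."""
    return np.sum((y - (m * x + b)) ** 2)

# Brute-force search over a grid of (m, b) candidates: the lowest
# point of the "cup" is the least-squares line.
ms = np.linspace(-1, 3, 401)
bs = np.linspace(-3, 5, 401)
errors = np.array([[sse(m, b) for b in bs] for m in ms])
i, j = np.unravel_index(np.argmin(errors), errors.shape)
print("grid search: m ≈ %.3f, b ≈ %.3f" % (ms[i], bs[j]))

# np.polyfit solves the same minimization exactly.
m_star, b_star = np.polyfit(x, y, 1)
print("polyfit:     m ≈ %.3f, b ≈ %.3f" % (m_star, b_star))
```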

  • @gnzeu4tpns91
    @gnzeu4tpns91 6 years ago

    Amazing. I didn't know we would end up with 3D equations from trying to minimize the least-squares error of a 2D plot. I wonder what happens if we want to optimize a 3D plot, then.
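For the 3D case the comment wonders about, one common setup (a sketch under my own assumptions, not something from the video) is fitting a plane z = m1*x + m2*y + b: the squared error then depends on three unknowns, so the error "surface" lives in four dimensions, but it is still quadratic with a single minimum.

```python
import numpy as np

# Hypothetical 3D data: fit a plane z = m1*x + m2*y + b.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = rng.uniform(0, 10, 50)
z = 2.0 * x - 1.0 * y + 3.0 + rng.normal(0, 0.5, 50)

# The squared error now depends on three unknowns (m1, m2, b), so we
# can't draw it, but least squares still finds the unique minimum.
A = np.column_stack([x, y, np.ones_like(x)])
(m1, m2, b), *_ = np.linalg.lstsq(A, z, rcond=None)
print(f"m1 ≈ {m1:.2f}, m2 ≈ {m2:.2f}, b ≈ {b:.2f}")
```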

  • @wookiemaster73
    @wookiemaster73 14 years ago +15

    Do you ever stop working????? LOL. And how do you know so much about everything??????

  • @smaranrai6689
    @smaranrai6689 4 years ago +2

    I understood everything until that 3D figure popped up.

    • @amitjoshi-eb5vh
      @amitjoshi-eb5vh 3 years ago +1

      The 3D surface is there because the squared error depends on the regression line we choose. The regression line is a function of two variables (m, b), so the squared error is a dependent variable that depends on two independent variables (m, b). In such cases we get a 3D surface, much like z = f(x, y). Hope this helps!!

    • @Ivan_Penkov
      @Ivan_Penkov 1 year ago

      We can visualize functions of two independent variables using 3D surface plots or contour plots.
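A quick sketch of the contour-plot alternative, reusing the hypothetical data from the surface-plot sketch above: each level curve is a set of (m, b) pairs with equal squared error, and the rings close in on the minimum.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data, as in the surface-plot sketch above.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

m, b = np.meshgrid(np.linspace(-1, 3, 200), np.linspace(-3, 5, 200))
sse = sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))

# Contour plot: each ring is a set of (m, b) pairs with equal squared
# error; the rings shrink toward the single minimum of the paraboloid.
plt.contour(m, b, sse, levels=30)
plt.xlabel("m (slope)")
plt.ylabel("b (intercept)")
plt.title("Level curves of SSE(m, b)")
plt.show()
```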

  • @xiemins
    @xiemins 5 years ago +1

    Great video series. However, I don't quite agree that y1^2 + y2^2 + ... + yn^2 = n(y_bar)^2; shouldn't it be (y1 + y2 + ... + yn)^2 = (n*y_bar)^2? It doesn't matter for the end result, because this term is a constant. But still :D
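For what it's worth, assuming the video's overline denotes a mean (as elsewhere in this series), the term in question is the mean of the squares, not the square of the mean:

```latex
y_1^2 + y_2^2 + \dots + y_n^2 = n\,\overline{y^2},
\qquad \overline{y^2} := \frac{1}{n}\sum_{i=1}^{n} y_i^2,

\text{which differs in general from }
n\,(\bar{y})^2 = \frac{1}{n}\Bigl(\sum_{i=1}^{n} y_i\Bigr)^{2}.
```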

  • @PraveenKambhampatiMyVideos
    @PraveenKambhampatiMyVideos 8 years ago

    m^2 * (mean x^2 / mean x^2) should cancel out, leaving m^2. Similarly, the next term in the equation should be 2mb * (mean x / mean x), leaving 2mb!!

    • @gyermolenko
      @gyermolenko 8 years ago

      +Praveen Kambhampati I suppose you are talking about the picture at 5:30. Those "mean x^2" and "mean x" at the bottom of your fractions are not actually parts of any fractions in the video (and therefore can't be canceled out). They are just labels for the new, shorter versions of the long bracketed sums.
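A sketch of what those labels stand for, assuming I am reconstructing the video's overline-for-mean notation correctly: each long bracketed sum is rewritten as n times its mean, so nothing is being divided and nothing cancels.

```latex
\text{SSE}
 = \sum_{i=1}^{n}\bigl(y_i - (m x_i + b)\bigr)^{2}
 = n\,\overline{y^2} - 2mn\,\overline{xy} - 2bn\,\bar{y}
   + m^2 n\,\overline{x^2} + 2mbn\,\bar{x} + n b^2,

\text{where, e.g., } \overline{x^2}
 = \frac{x_1^2 + x_2^2 + \dots + x_n^2}{n}
\text{ is shorthand for the bracketed sum divided by } n.
```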

  • @andregomes2629
    @andregomes2629 5 years ago

    WOW..

  • @dollyfacegirl
    @dollyfacegirl 13 years ago

    Thanx bro

  • @powerToYourself36
    @powerToYourself36 6 years ago +2

    8:44 bootie