Finding Polynomials of Best Fit Using Least Squares

  • Published 7 Aug 2020
  • This video explains how to find a polynomial of degree at most d that comes as close as possible to passing through a given set of points.
    This video is part of a series of video lectures for the linear algebra class I am teaching.
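
A minimal sketch of the method the description refers to, assuming NumPy; the sample points and the degree bound are illustrative, not taken from the video:

    import numpy as np

    def least_squares_poly(xs, ys, d):
        """Coefficients c_0..c_d of the degree-at-most-d polynomial
        minimizing the sum of squared residuals at the given points."""
        A = np.vander(xs, d + 1, increasing=True)  # columns: 1, x, ..., x^d
        # lstsq solves the least-squares problem min ||A c - y||^2
        c, *_ = np.linalg.lstsq(A, ys, rcond=None)
        return c

    xs = np.array([0.0, 1.0, 2.0, 3.0])  # illustrative points
    ys = np.array([1.0, 2.0, 0.0, 5.0])
    print(least_squares_poly(xs, ys, 2))  # best fit of degree at most 2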

COMMENTS • 5

  • @egonwombat8234 • 18 days ago

    Amazingly insightful - you finally opened my eyes! Thanks!

  • @ironhammersgaming1391 • 2 years ago +4

    Great explanation!

  • @naturelover4148 • 9 months ago +1

    Thanks a lot, but could you please explain why you multiplied both sides by the transpose matrix and how that leads to the best-fitting least-squares polynomial... (a derivation sketch follows after this thread)

  • @omkarparishwad5410 • 10 months ago

    Loved your session! Could you please explain why you say at the end that the equation from this method has the lowest sum of squares? What if a higher-degree polynomial or some other ML method gives a better curve fit?

    • @drjakepotter • 10 months ago

      The problem statement itself limits the scope of answers we want to consider. There is a cubic polynomial that exactly fits the 4 points, which would make the sum of the squares 0. If you allow any polynomial, you can exactly fit any (finite) number of points (as long as the x-values are distinct). The least squares method lets us do the "best" we can under a limit on the degree of the desired polynomial. In this case, the problem statement limits us to polynomials of degree at most 2, so higher-degree polynomials are not considered.
      The answer we get yields the lowest possible sum of squares because the result of the least-squares process is a projection. We are projecting the original point in \R^4 (thought of as the vector of 4 y-values) onto the 3-dimensional subspace of \R^4 consisting of the vectors that correspond to polynomials of degree at most 2. Whenever we project a point onto a subspace, the result is the point in the subspace closest to the original point. Closeness (distance) here means the square root of the sum of the squares of the coordinate differences, and that is minimized if and only if the sum of the squares (without the square root) is minimized. For why projection minimizes this distance, see Khan Academy's video here: ua-cam.com/video/b269qpILOpk/v-deo.htmlsi=SnjOnOt7dy_KCJ78.
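
On @naturelover4148's question about multiplying both sides by the transpose: the following is the standard derivation of the normal equations, sketched in the notation of the reply above (A is the Vandermonde matrix built from the x-values, y the vector of y-values, c the coefficient vector); it is not quoted from the video.

    % Minimize the squared residual \|Ac - y\|^2 over all coefficient vectors c.
    % Setting the gradient with respect to c to zero gives the normal equations:
    \nabla_c \|Ac - y\|^2 = 2A^{\top}(Ac - y) = 0
    \quad\Longrightarrow\quad A^{\top}A\,c = A^{\top}y
    % So multiplying Ac = y on both sides by A^{\top} yields the system whose
    % solution minimizes the sum of squared residuals.

Geometrically, A^{\top}(Ac - y) = 0 says the residual vector is orthogonal to the column space of A, which is exactly the projection condition described in the reply above.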
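
A small numeric check of the reply's two claims, assuming NumPy and four illustrative points (not the ones from the video): the cubic interpolant has residual 0, and no polynomial of degree at most 2 beats the least-squares fit.

    import numpy as np

    xs = np.array([0.0, 1.0, 2.0, 3.0])  # distinct x-values (illustrative)
    ys = np.array([1.0, 2.0, 0.0, 5.0])  # illustrative y-values

    # Degree 3: the 4x4 Vandermonde matrix is invertible for distinct
    # x-values, so the cubic passes through all 4 points exactly.
    A3 = np.vander(xs, 4, increasing=True)
    c3 = np.linalg.solve(A3, ys)
    print(np.linalg.norm(A3 @ c3 - ys))  # ~0

    # Degree at most 2: project ys onto the 3-dimensional column space
    # of the 4x3 Vandermonde matrix.
    A2 = np.vander(xs, 3, increasing=True)
    c2, *_ = np.linalg.lstsq(A2, ys, rcond=None)
    best = np.linalg.norm(A2 @ c2 - ys)

    # Randomly perturbed coefficient vectors never do better than the
    # projection, illustrating the minimality of the least-squares fit.
    rng = np.random.default_rng(0)
    for _ in range(1000):
        trial = c2 + rng.normal(scale=0.5, size=3)
        assert np.linalg.norm(A2 @ trial - ys) >= best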