What is Least Squares?

  • Published 1 Jul 2024
  • A quick introduction to Least Squares, a method for fitting a model, curve, or function to a set of data.
    TRANSCRIPT
    Hello, and welcome to Introduction to Optimization. This video provides a basic answer to the question, what is Least Squares?
    Least squares is a technique for fitting an equation, line, curve, function, or model to a set of data. This simple technique has applications in many fields, from medicine to finance to chemistry to astronomy. Least squares helps us represent the real world using mathematical models.
    Let’s take a closer look at how this works. Imagine that you have a set of data points in x and y, and you want to find the line that best fits the data. This is also called regression. For a given x, you have the y value from your data and the y value predicted by the line. The difference between these values is called the error, or the residual.
    A residual is calculated between each of the data points and the line. To give only positive error values, each residual is usually squared. Next, the squared residuals are summed to give the total error between the data and the line: the sum of squared errors.
    One way to think of least squares is as an optimization problem. The sum of squared errors is the objective function, and the optimizer tries to find the slope and intercept of the line that minimize the error.
    Here we’re trying to fit a line, which makes this a linear least-squares problem. Linear least squares has a closed-form solution, found by solving a system of linear equations. It’s worth noting that other equations, such as parabolas and higher-order polynomials, can also be fit using linear least squares, as long as the model is linear in the parameters being optimized. [A code sketch of the linear case follows this transcript.]
    Least squares can also be used to fit nonlinear curves and functions to more complex data. In this case, the problem can be solved using a dedicated nonlinear least-squares solver or a general-purpose optimization solver. [A nonlinear sketch follows the transcript as well.]
    To summarize, least squares is a method often used for fitting a model to data. Residuals express the error between the current model fit and the data. The objective of least squares is to minimize the sum of squared errors across all the data points to find the best fit for a given model.
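  • A minimal code sketch of the linear fit described in the transcript (not from the video; it assumes NumPy, and the data points are made up). It builds the line's design matrix, solves for the slope and intercept in closed form, and reports the residuals and sum of squared errors:

        # Fit y = m*x + b by linear least squares (closed-form solution).
        import numpy as np

        # Made-up data points, for illustration only
        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

        # Design matrix: one column for the slope, one for the intercept
        A = np.column_stack([x, np.ones_like(x)])

        # Solve min ||A @ [m, b] - y||^2; lstsq gives the closed-form answer
        (m, b), *_ = np.linalg.lstsq(A, y, rcond=None)

        # Residuals and the sum of squared errors (the objective minimized)
        residuals = y - (m * x + b)
        sse = np.sum(residuals ** 2)
        print(f"slope={m:.3f}, intercept={b:.3f}, SSE={sse:.4f}")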
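  • For the nonlinear case, a dedicated solver can be used instead. A minimal sketch assuming SciPy's curve_fit, with a hypothetical exponential model and made-up data:

        # Fit a nonlinear model with a dedicated nonlinear least-squares solver.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, a, k):
            # Hypothetical nonlinear model: y = a * exp(k * x)
            return a * np.exp(k * x)

        # Made-up data points, for illustration only
        x = np.linspace(0.0, 2.0, 8)
        y = np.array([1.0, 1.3, 1.7, 2.3, 3.0, 3.9, 5.2, 6.9])

        # curve_fit minimizes the sum of squared residuals between model and data
        (a, k), _ = curve_fit(model, x, y, p0=[1.0, 1.0])
        print(f"a={a:.3f}, k={k:.3f}")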
  • Science & Technology

COMMENTS • 29

  • @randolfshemhusain2298 · 4 months ago · +7

    Visual learning makes things so much better.

  • @mymanager3662 · 2 years ago · +14

    excellent animation and explanation, simple to follow and understand!

  • @SwanPrncss · 1 year ago · +5

    Omg, your explanation is better than other youtube videos and my teacher because I'm a visual learner.

  • @MultiJx2 · 1 year ago · +2

    compact and thorough at the same time. thanks !

  • @mhd112211 · 4 months ago

    Thanks a lot, I have the curse of being a visual learner and this was amazing.

  • @plep-m555ww · 7 months ago · +1

    Great video and helpful channel! Khan academy and the organic chemistry guy are getting old and less helpful as school curriculums develop. Super grateful for these simple, direct explanations

  • @raminbohlouli1969 · 10 months ago · +3

    Simple yet extremely informative👍

  • @mesfindelelegn1165 · 1 month ago

    clear and brief idea

  • @tonmoysharma5758 · 1 year ago · +2

    Excellent video and also quite easy to understand

  • @anshisingh1915 · 1 year ago

    lovely brooo, such good animation, now i have the concept in my head.

  • @funfair-bs7wf · 1 year ago

    This is a great little video !

  • @VictoriaOtunsha · 1 year ago

    Thank you for simplifying this

  • @ernstuzhansky · 1 year ago

    Excellent! Thank you.

  • @rohiniharidas · 2 years ago

    All videos are excellent

  • @tomerweinbach4059 · 1 year ago

    great explanation!

  • @fabianb.7429 · 1 year ago

    Just perfect. Thanks

  • @yoshitha12 · 1 year ago

    Thank you... ❤

  • @ahmedshalaby9343 · 8 months ago

    in 2 mins you just explained everything

  • @Keyakina · 1 year ago

    But residual != error?

  • @Jkauppa · 2 years ago

    fit a function f(x) to data with normal noise, f(x) can be a line, or a polynomial, etc, includes outlier handling, least squares is very sus

    • @Jkauppa · 2 years ago

      line with normal noise is a better answer than just a line

    • @Jkauppa · 2 years ago

      constant std additive normal noise assumed

    • @Jkauppa · 1 year ago

      think like you are removing a baseline function from the data points, either linear (like PCA, f(x)=kx+c) or polynomial (nonlinear PCA, f(x)=...+ax^2+bx+c), then checking whether the noise is from a normal distribution, i.e. trying to make the noise after removing the baseline as normal as possible; if you do linear, the noise might not be normal, so you get only a partial PCA component fit, kinda

  • @alice20001 · 1 year ago · +1

    Why use the squares instead of the absolute values?

    • @laraelnourr · 1 year ago · +1

      because they are easier to compute and deal with mathematically. But we can use absolute values too!

    • @faheemrasheed9967 · 1 year ago

      because it gives a clearer picture: if we have an error of 0.1, squaring it gives 0.01, which is kind of scaled.

    • @AchiragChiragg · 6 months ago

      @faheemrasheed9967 Actually, it's the other way around: it can be better to use absolute values instead of squares, since squaring amplifies the outliers and lets them influence the final fit. [See the comparison sketch below.]
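    • A minimal sketch of the tradeoff discussed in this thread (not from the video; assumes NumPy and SciPy, with made-up data containing one outlier). Squared error (L2) is pulled toward the outlier, while absolute error (L1) stays closer to the bulk of the data:

          # Compare least squares (L2) with least absolute deviations (L1)
          import numpy as np
          from scipy.optimize import minimize

          x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
          y = np.array([0.1, 1.0, 2.1, 2.9, 9.0])  # last point is an outlier

          def line(params):
              m, b = params
              return m * x + b

          # L2 objective: sum of squared residuals (classic least squares)
          l2 = minimize(lambda p: np.sum((y - line(p)) ** 2),
                        x0=[1.0, 0.0], method="Nelder-Mead")

          # L1 objective: sum of absolute residuals (robust alternative)
          l1 = minimize(lambda p: np.sum(np.abs(y - line(p))),
                        x0=[1.0, 0.0], method="Nelder-Mead")

          print("L2 slope, intercept:", l2.x)  # dragged up by the outlier
          print("L1 slope, intercept:", l1.x)  # closer to the bulk of the data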

  • @rudypieplenbosch6752 · 1 year ago

    no example, pretty useless