Linear regression

  • Published 6 Jan 2025

COMMENTS • 54

  • @drpeyam
    @drpeyam  5 years ago +8

    Note: There's a typo at the end of the video: You also have to premultiply b by your matrix with 1/2

  • @ffggddss
    @ffggddss 5 years ago +1

    That isn't the way I've always done it, but I like this treatment.
    And thanks for treating the weighted-data case; that's too often omitted.
    This is a topic that goes a lot further than the fitting of a straight line to coordinate pairs - you can fit all kinds of (families of) functions to a set of points in n dimensions. The end goal is to choose the particular function from the given family that "best fits" the data.
    The process is simplest when the chosen family is linear in the parameters (note that it can be totally nonlinear in the data variables, without disturbing that simplicity).
    When it isn't, you can sometimes linearize it with a transformation, which usually requires applying weights to the points, just because of the "distortion" imposed by that transformation - that is, the transformation will generally favor some points and disfavor others, which can be compensated for with weights.
    The stated goal of minimizing the sum of squared deviations can be shown to be equivalent to your matrix method; I think that would make a nice follow-up to this.
    As I conceive of it, it's a matter of writing down the sum of squared deviations, setting the partial derivative of that sum with respect to each fitting parameter (here those were a and b) to 0, and solving for a and b.
    Of course, things get really interesting (read, "hairy!") when the fit family is nonlinear in the parameters.
    A question I sometimes get when explaining the method of least-squares fitting is,
    "Why choose the squares of the deviation to minimize? Why not the absolute value or some other function of the deviations?"
    So that's another thing it might be helpful to go into.
    Fred
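
A minimal numerical sketch of the equivalence Fred describes, assuming NumPy and a small made-up data set (the points and weights below are hypothetical, not from the video): the line is fitted once via the normal equations from the video and once by minimizing the sum of squared deviations directly with numpy.polyfit, and the coefficients agree; the weighted case just inserts a diagonal weight matrix.

    import numpy as np

    # Hypothetical data points (x_i, y_i) -- for illustration only.
    x = np.array([1.0, 2.0, 4.0, 5.0, 6.0])
    y = np.array([1.2, 1.9, 2.0, 2.5, 3.1])

    # Matrix approach: design matrix with columns [1, x],
    # then solve the normal equations A^T A c = A^T y.
    A = np.column_stack([np.ones_like(x), x])
    a_n, b_n = np.linalg.solve(A.T @ A, A.T @ y)      # intercept, slope

    # Direct minimization of sum((y - (a + b*x))^2).
    b_p, a_p = np.polyfit(x, y, deg=1)                # polyfit returns the slope first
    print((a_n, b_n), (a_p, b_p))                     # the two pairs match

    # Weighted least squares: hypothetical weights on the squared deviations.
    W = np.diag([1.0, 2.0, 1.0, 1.0, 3.0])
    a_w, b_w = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    print(a_w, b_w)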

  • @frozenmoon998
    @frozenmoon998 5 years ago +4

    I love how the lights randomly turned off and you kept on explaining what was on the board. For a moment I thought that my screen went full black mode, as you were so consistent, even with the lights off. I guess you don't need to see what you wanna say - you gotta visualize it.

  • @alexulloa174
    @alexulloa174 5 years ago +2

    I did this in AP stats earlier this year but not deriving it by Linear algebra. This is amazing! Thanks for the awesome video!

  • @federicopagano6590
    @federicopagano6590 5 years ago +1

    To verify the line is correct, the mean value of x must give back the mean value of y 👉 f(3.6) = 2.2 ✔✔ excellent as always
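
A side note on why that check works: an ordinary least-squares line always passes through the point of means (x̄, ȳ), so plugging the mean of x into the fitted line must return the mean of y. A tiny sketch with made-up data (the values 3.6 and 2.2 above are from the video and are not reproduced here):

    import numpy as np

    # Hypothetical data, only to illustrate the mean-point property.
    x = np.array([1.0, 2.0, 4.0, 5.0, 6.0])
    y = np.array([1.2, 1.9, 2.0, 2.5, 3.1])

    slope, intercept = np.polyfit(x, y, deg=1)
    print(intercept + slope * x.mean())   # equals y.mean() up to rounding
    print(y.mean())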

  • @firstave33
    @firstave33 5 years ago +2

    Literally just covered this in my linear algebra class! Great video, thank you for your content!

  • @aBetterHumanBeing
    @aBetterHumanBeing 5 years ago

    I really like this approach through linear algebra; I'd like to see more about linear regression and statistics in general with this approach. I'd also like to know exactly why it works this way.

  • @jorgeeduardopereztasso6134
    @jorgeeduardopereztasso6134 5 years ago +1

    I dreamed of this video for so long!!! Thank you so much Dr. Peyamtastic!!

  • @beatoriche7301
    @beatoriche7301 5 years ago +2

    I’ve only ever calculated regression polynomials using calculus and partial derivatives - this is certainly an interesting way to do it; thanks for sharing it! I think I’ll stick with the calculus way because I find it more elegant, though.

    • @emperorpingusmathchannel5365
      @emperorpingusmathchannel5365 5 years ago

      Can you explain your method?

    • @beatoriche7301
      @beatoriche7301 5 years ago

      @@emperorpingusmathchannel5365
      It’s pretty straightforward: Set up an explicit formula for the sum of the square areas, take the partial derivatives with respect to the coefficients of your regression polynomial, and solve the resulting linear system of equations.

    • @emperorpingusmathchannel5365
      @emperorpingusmathchannel5365 5 years ago

      @@beatoriche7301 sum of square areas of what?

    • @beatoriche7301
      @beatoriche7301 5 years ago

      @@emperorpingusmathchannel5365 The method generally used for regression is called the least-squares method, and the goal is to minimize the total sum of the areas of the squares whose side length equals the difference between the y-coordinate of each point and the value of the polynomial at the same input.
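
A minimal symbolic sketch of the calculus route described in this thread, assuming SymPy and a tiny made-up data set: write the sum of squared vertical deviations S(a, b), set both partial derivatives to zero, and solve the resulting linear system for the intercept a and slope b.

    import sympy as sp

    a, b = sp.symbols('a b')

    # Hypothetical data points -- for illustration only.
    data = [(1, 1.2), (2, 1.9), (4, 2.0), (5, 2.5), (6, 3.1)]

    # S(a, b) = sum over points of (y_i - (a + b*x_i))^2
    S = sum((y - (a + b * x))**2 for x, y in data)

    # dS/da = 0 and dS/db = 0 form a linear system in a and b.
    solution = sp.solve([sp.diff(S, a), sp.diff(S, b)], [a, b])
    print(solution)   # {a: ..., b: ...} -- the least-squares intercept and slope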

  • @MrCigarro50
    @MrCigarro50 5 years ago +1

    Thank you, on behalf of statisticians. It's a presentation somewhat different from what's in statistics books, but it's very interesting. Thanks again.

  • @rubenmendoza8829
    @rubenmendoza8829 5 years ago +2

    What brilliance. It's for tricks like this that I keep the notification bell turned on.

  • @RalphDratman
    @RalphDratman 5 years ago +2

    Why does multiplying both sides by A-transpose give the least-squares fit? I'm not asking for a proof, just a suggestion of why that might be.

    • @drpeyam
      @drpeyam  5 years ago +2

      I think I explain that in a video called Least Squares
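
For readers who want the one-line reason before watching that video, here is a rough sketch (the standard least-squares argument, not a transcript of Dr. Peyam's): the minimizer of ||Ax - b||^2 is the x̂ whose residual Ax̂ - b is orthogonal to the column space of A, and that orthogonality condition is exactly what multiplying by A-transpose encodes. In LaTeX:

    A^{\mathsf{T}}(A\hat{x} - b) = 0
    \quad\Longleftrightarrow\quad
    A^{\mathsf{T}}A\,\hat{x} = A^{\mathsf{T}}b.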

  • @historybuff0393
    @historybuff0393 5 years ago +7

    Lights out seems to be a common feature of your videos now...

  • @elgazeta
    @elgazeta 3 years ago

    Sublime the first time, and sublime today when I watched it again.

  • @FunctionalIntegral
    @FunctionalIntegral 5 years ago +1

    You reminded me of my lab reports in experimental physics many years ago, where I frequently did this linear regression :D.

    • @orenrosenman5242
      @orenrosenman5242 5 years ago

      I'm doing this course right now!

    • @FunctionalIntegral
      @FunctionalIntegral 5 years ago

      lol have fun. I really hated it at that time as a theoretical physicist. :D

    • @orenrosenman5242
      @orenrosenman5242 5 years ago

      @@FunctionalIntegral I get you, but I personally really enjoy it, mostly because it gives me an opportunity to brush up on my explanatory physics skills

  • @emperorpingusmathchannel5365
    @emperorpingusmathchannel5365 5 years ago +2

    Given that line, find the correlation coefficient of the data by hand.
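
For anyone taking up that exercise, the standard Pearson formula (the general definition, not a value from the video) is, in LaTeX:

    r = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}
             {\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}},
    \qquad b = r\,\frac{s_y}{s_x},

where b is the fitted slope and s_x, s_y are the sample standard deviations, so r can also be read off from the slope already computed.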

  • @shreeganesh441
    @shreeganesh441 5 years ago

    I would love to see topics reflecting mathematical beauty from you. (Not saying you should stop what you are doing now)

  • @qingyangzhang887
    @qingyangzhang887 5 years ago +2

    What is the motivation behind multiplying by A-transpose?

    • @drpeyam
      @drpeyam  5 years ago +2

      Because Q^T Q = I (not sure if I mentioned that, but if I didn’t, check out my least-squares video)
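
A brief sketch of how that remark connects to the normal equations, assuming the Q here is the orthogonal factor in a (thin) QR factorization A = QR with independent columns (an assumption, since the comment does not spell it out). In LaTeX:

    A = QR,\ Q^{\mathsf{T}}Q = I
    \;\Longrightarrow\;
    A^{\mathsf{T}}A\,\hat{x} = A^{\mathsf{T}}b
    \;\Longleftrightarrow\;
    R^{\mathsf{T}}R\,\hat{x} = R^{\mathsf{T}}Q^{\mathsf{T}}b
    \;\Longleftrightarrow\;
    R\,\hat{x} = Q^{\mathsf{T}}b,

where the last step uses that R is invertible when the columns of A are independent.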

  • @MegaTRIANGULUM
    @MegaTRIANGULUM 5 years ago +1

    loved it!

  • @SuperMtheory
    @SuperMtheory 5 years ago

    Great video. Thanks.

  • @volcanic3104
    @volcanic3104 5 years ago

    Hey Dr. Peyam, why do we find the line that minimizes the vertical distance from the points to the line? Isn't it better to find a line that minimizes the actual distance to the line (the normal line from the regression to the point)?

    • @drpeyam
      @drpeyam  5 years ago

      It’s equivalent; if one is minimized, then so is the other, and vice-versa

  • @scottalder2374
    @scottalder2374 5 years ago

    Dr Peyam do you consider yourself to be a frequentist or a Bayesian?

    • @drpeyam
      @drpeyam  5 years ago +2

      I’m a Peyamian

  • @raulyazbeck7425
    @raulyazbeck7425 5 years ago +4

    Hi, I really love your videos, but why don't you move the camera and place it right in front of the board? It's way better for the eye, and really more enjoyable. I really hope you take this into account. Thank you.

    • @sugarfrosted2005
      @sugarfrosted2005 5 years ago

      That means the view is blocked by him as he writes, which is a really big issue for a leftie.

    • @drpeyam
      @drpeyam  5 years ago

      Exactly what sugarfrosted said!

    • @raulyazbeck7425
      @raulyazbeck7425 5 years ago +1

      Dr Peyam I mean, no? Blackpenredpen does it really well, Papa Flammy does it really well... you recently made a video on calculating a volume with y=x and it was great!...

  • @GreenMeansGOF
    @GreenMeansGOF 5 years ago

    2:22 That’s weird. My phone locked but I can still hear him.

  • @__donez__
    @__donez__ 5 years ago

    Thanks!

  • @mohammedal-haddad2652
    @mohammedal-haddad2652 5 years ago

    But how do we know A^T A has an inverse? Thanks for the nice example.

    • @chongli297
      @chongli297 5 years ago +1

      A^T A does not necessarily have an inverse. If A is skew-symmetric (A^T = -A), then the determinant of A is 0, so the determinant of A^T A is also 0, thus A^T A is not invertible.

    • @mohammedal-haddad2652
      @mohammedal-haddad2652 5 years ago

      @@chongli297 Is this the only situation, or is it just a counter-example?

    • @chongli297
      @chongli297 5 years ago

      @@mohammedal-haddad2652 It's just a counter-example. The determinant is defined for square matrices and A could be non-square. At any rate, the fact that A^T A is a symmetric matrix does not preclude it having a determinant of 0, as evidenced by the counter-example.

    • @mohammedal-haddad2652
      @mohammedal-haddad2652 5 years ago

      @@chongli297 Thanks, that was helpful.

    • @drpeyam
      @drpeyam  5 years ago +2

      It has an inverse iff the columns of A are linearly independent, which is usually the case for those kinds of problems
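
A small NumPy illustration of that criterion, with made-up matrices (hypothetical, not from the video): when the columns of A are independent, A^T A has full rank and is invertible; when one column is a multiple of another, A^T A is singular.

    import numpy as np

    # Independent columns: the usual regression setup, a column of ones and the x-values.
    A_good = np.array([[1.0, 1.0],
                       [1.0, 2.0],
                       [1.0, 4.0]])
    print(np.linalg.matrix_rank(A_good.T @ A_good))   # 2 -> invertible

    # Dependent columns: the second column is twice the first.
    A_bad = np.array([[1.0, 2.0],
                      [3.0, 6.0],
                      [5.0, 10.0]])
    print(np.linalg.matrix_rank(A_bad.T @ A_bad))     # 1 -> singular, no inverse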

  • @kaIawin
    @kaIawin 5 years ago

    a + bx is what we use in our stats class, what is the difference?

    • @EnterSkitarii
      @EnterSkitarii 5 years ago

      There's no difference; a and b are arbitrary, and since addition is commutative (the order the terms come in doesn't matter) you can attach your x to either of them. In linear algebra, I was taught with 'y = mx + c'.

  • @sugarfrosted2005
    @sugarfrosted2005 5 years ago

    I somehow forgot about being able to weight points.

  • @newtonnewtonnewton1587
    @newtonnewtonnewton1587 5 years ago

    Thanks a lot Dr Peyam السلام عليكم (peace be upon you)

    • @newtonnewtonnewton1587
      @newtonnewtonnewton1587 5 years ago

      @Yo Ming it means peace for you in Arabic

    • @newtonnewtonnewton1587
      @newtonnewtonnewton1587 5 years ago

      @@OtiumAbscondita yes, there is an Arabic language option in Google

    • @newtonnewtonnewton1587
      @newtonnewtonnewton1587 5 years ago +1

      @Yo Ming God bless you. I love Algeria and Algerians. I am Jamal, a Palestinian mathematics teacher living in Jordan. My greetings to you, brother Youssef.