Linear Regression - Least Squares Criterion Part 1

  • Published Sep 5, 2024
  • Thanks to all of you who support me on Patreon. You da real mvps! $1 per month helps!! :) / patrickjmt !! Linear Regression - Least Squares Criterion. In this video I just give a quick overview of linear regression and what the 'least squares criterion' actually means. In the second video, I will actually use my data points to find the linear regression model.
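
To make the criterion concrete in code: a minimal sketch, assuming made-up data points (not the ones in the video), with numpy's polyfit standing in for the fitting step covered in Part 2.

```python
import numpy as np

# Hypothetical data points, invented for this example.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.8, 5.1])

def sum_of_squares(m, b):
    """Least-squares criterion D: the sum of squared vertical
    distances between each point and the line y = m*x + b."""
    d = y - (m * x + b)      # vertical distances d1, d2, ..., dn
    return np.sum(d ** 2)    # D = d1^2 + d2^2 + ... + dn^2

# The regression line is the (m, b) pair that makes D as small as possible.
m_best, b_best = np.polyfit(x, y, deg=1)
print(sum_of_squares(m_best, b_best))        # the minimum D
print(sum_of_squares(m_best + 0.5, b_best))  # any other line gives a larger D
```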

COMMENTS • 158

  • @zhd911
    @zhd911 10 years ago +35

    I used to watch your videos when I was doing my math courses (in engineering), but it's been a while since I watched your videos since I got done with all my math courses two semesters ago. However, when I YouTube'd "Linear Regression" for a statistics class, I was so happy to see your name at the top of the list. You were the BEST YouTube tutor I had and I definitely missed watching your videos.


  • @meedan1
    @meedan1 11 years ago +4

    Dear Patrick! I've been using your videos for about 3 years, from my last year of IB up to this point (2nd year electrical engineering). I would just like to thank you tons for all these helpful videos you're sharing. You have a very good pedagogical approach. The best I've ever seen :))) Thank you!!!

  • @mugume
    @mugume 4 years ago +2

    Finally someone that speaks in the language I understand. Thank you for not taking anything for granted. You are an amazing teacher! Kudos bro!

  • @031219400
    @031219400 7 years ago +1

    Great video Patrick. I'm a CPA, venturing into big data and machine learning (high-level knowledge), and this really helped. It's a challenge explaining this stuff to execs, and this is great.

    • @patrickjmt
      @patrickjmt  7 years ago

      +Blair Elliott good luck! Happy to come in as a consultant :)

  • @Musicllya
    @Musicllya 8 years ago +82

    This helped me so much, because my teacher honestly taught me nothing.

  • @johnnywayne3443
    @johnnywayne3443 10 years ago +3

    The textbook makes this 20 times more complicated than it actually is. This is fantastic, thank you.

  • @2018paulrobbinx
    @2018paulrobbinx 4 years ago

    I sat through a 1.5 hr lecture, didn't understand a thing. Watched this and now I understand at least the concept. Thank you!

  • @gn341ram8
    @gn341ram8 9 years ago

    I think the problem with the other stuff I was reading was they were trying to pretend that the concept was much more complicated than it really was. Like, they were trying to fit too much stuff in too early. This was a perfect, casual explanation that gave me a good idea of what you were talking about before moving on. Thank you!

  • @kaylacumming1609
    @kaylacumming1609 5 years ago +2

    I really like how you described Least Squares and Linear Regression at length, because I was able to get all of the notes down from this video :) Thanks @patrickJMT

  • @kazaakas
    @kazaakas 11 years ago +2

    You got it right already.
    Squaring makes small numbers smaller and large numbers larger. A large error is much worse than a small error, so by squaring all errors, the larger errors are indeed overemphasised
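
A quick numeric check of this point (the error values here are arbitrary, chosen only for the illustration):

```python
# Arbitrary example: three small errors and one large one.
errors = [0.5, 0.5, 0.5, 3.0]

total_abs = sum(abs(e) for e in errors)  # 4.5  -> the large error is ~67% of the total
total_sq = sum(e ** 2 for e in errors)   # 9.75 -> the large error is ~92% of the total
print(total_abs, total_sq)
```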

  • @21CenturyBreakdownX
    @21CenturyBreakdownX 4 years ago

    You helped me through college and now you're helping me at work! Much love!

  • @hiendelong1831
    @hiendelong1831 8 years ago +2

    Thank you Patrick for making this so easy to understand. You're a good teacher.

    • @padmabagri7010
      @padmabagri7010 8 years ago

      Hi ma'am, can you please explain the red line concept in Patrick sir's video part 1?

  • @jenzo42
    @jenzo42 11 years ago +4

    Dammit I just had my exam with this in it today...!
    Good luck to all the future generations, when PatrickJMT has uploaded explanations to all the maths that ever was :P

  • @CNsongs
    @CNsongs 7 years ago

    Thank you so much. I literally learn pretty much everything we do in our numerical method class from your videos.

  • @mahasish
    @mahasish 10 years ago +4

    Nice job!!! Made me understand it in simple language. Thanks a lot

  • @kimberleytaylor8139
    @kimberleytaylor8139 9 years ago

    Thank you, Patrick, for explaining it in such an easy-to-follow format.

  • @chutsu_io
    @chutsu_io 11 years ago

    Excellent explanation, was looking at Wikipedia and didn't quite get it as fast as watching your graph demonstration. Good job!

  • @mch006
    @mch006 10 years ago

    Would have never thought I would have needed to know this again after college. 10yrs later it is making my work life miserable. Hahahahahaha. Thanks for the info buddy! Cheers

  • @Highlander0689
    @Highlander0689 11 years ago

    It is because the error ('D' here) will be zero if you don't square them. All the points under the line are negative and the ones above the line are positive. A basic property of OLS is that these will always sum to zero. That is the definition of the best-fit line. Therefore, in order to get the error, you have to square them.
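
This zero-sum property is easy to verify numerically. A minimal sketch, assuming made-up data and using numpy's polyfit as the least-squares fit (the fit must include an intercept for the residuals to cancel):

```python
import numpy as np

# Made-up data points.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

m, b = np.polyfit(x, y, deg=1)   # ordinary least-squares line
residuals = y - (m * x + b)      # signed vertical distances

print(residuals.sum())           # ~0: positives and negatives cancel
print((residuals ** 2).sum())    # the squared sum is what the fit minimizes
```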

  • @mildaonadoronenkovaite2149
    @mildaonadoronenkovaite2149 3 years ago

    omg my whole lecture makes so much more sense now. thank you for this

  • @nyashanyakuchena786
    @nyashanyakuchena786 5 years ago +1

    It was a great lecture. Cleared things up a bit. Thank you!!

  • @brendathompson3758
    @brendathompson3758 4 years ago

    Great video, you are very clear and hit the nail on the head in terms of communicating conceptual information. Thanks

  • @amritajayakrishnan2308
    @amritajayakrishnan2308 7 years ago

    Principle of least squares - well explained!

  • @smfry010
    @smfry010 9 years ago +1

    Very clear and concise, thank you for the tutorial.

  • @sanjithramanmohan8971
    @sanjithramanmohan8971 4 years ago

    Woahh mann it was superbly explained by youu! Great work !

  • @Dataisthetruth
    @Dataisthetruth 9 years ago +4

    I remember taking regression in college a few semesters back. It was challenging, but I definitely learned a lot.

  • @Ahlawy507
    @Ahlawy507 9 years ago +31

    thank you so much >>> big kiss

  • @febinjose21
    @febinjose21 5 years ago

    Very good explanation. Thanks

  • @trinity8675309
    @trinity8675309 7 years ago

    This was outstanding! Very helpful and informative. Great work!

  • @brannonperez9278
    @brannonperez9278 8 years ago +7

    Being a lefty myself, I think using a permanent marker is genius.

  • @IdelleIn
    @IdelleIn 7 years ago

    you are an angel, Patrick

  • @samahwanas4165
    @samahwanas4165 3 years ago

    So simple!
    Thank you

  • @amruthavenkata3586
    @amruthavenkata3586 5 years ago

    Thank you very much. The video helped me understand the concept better and also solved my doubts. :)

  • @sknganga3633
    @sknganga3633 3 years ago

    this guy saved my ass in calculus!

  • @Terry2020
    @Terry2020 4 years ago

    thanks so much! Excellent explanation.

  • @jstephen1
    @jstephen1 8 years ago

    Patrick, you are the man. Thank you.

  • @abd-elrahmanmohamed9839
    @abd-elrahmanmohamed9839 6 years ago

    Thanks Patrick. Nice explanation!

  • @nickseifert4164
    @nickseifert4164 9 years ago

    Thank you so much, doing this stuff in 7th grade is really hard at times

  • @accessoriesgirl
    @accessoriesgirl 11 years ago

    Wow! I subscribe to your channel, and linear regression is what we're covering now.

  • @sdlfjlsdkf
    @sdlfjlsdkf 11 years ago

    You can. The least squares method is just one of many methods. However, OLS is mainly introduced because it's the most widely used & the easiest.

  • @bluevalley82
    @bluevalley82 3 years ago

    So clear! Thank you!

  • @somsabayTO
    @somsabayTO 10 years ago

    Hi Patrick, this is great. I'm taking Econometrics as a course for my Masters. Never studied Economics, Statistics or maths in my life. Your video is very intuitive, the way I want to grasp things visually. Subscribed to your channel, but couldn't find the rest of the Econometrics videos. Can you send me the links please? Thanks a bunch!

  • @ntdenniswong
    @ntdenniswong 11 years ago

    Very clear explanation! Thanks Patrick!!!

  • @pratikshitole391
    @pratikshitole391 7 years ago

    Thanks, it's very clear & easy to get.

  • @avinashimmaneni
    @avinashimmaneni 6 years ago

    Great explanation Patrick! Cheers

  • @ucheogbede
    @ucheogbede 6 years ago

    What a fantastic explanation! Thank you...

  • @ludwigmarcello8915
    @ludwigmarcello8915 10 years ago

    So well explained! thank you! Excellent video!

  • @joyofliving5352
    @joyofliving5352 2 years ago

    Hi Patrick, great, could you please also do a non-linear least squares video?

  • @dawnkosoris572
    @dawnkosoris572 10 years ago +1

    Thanks - this was very helpful

  • @allyhuang6103
    @allyhuang6103 5 years ago

    your videos are super helpful but i can't help but crack up whenever you say LEEEEENGTH

  • @xunit94
    @xunit94 9 years ago

    Patrick to the rescue again!

  • @AmiyaSarkar
    @AmiyaSarkar 4 years ago

    very well explained

  • @Lasheen419
    @Lasheen419 4 years ago

    Great explanation

  • @alvinng4028
    @alvinng4028 10 years ago

    Good job explaining it in simple terms without all the Greek symbols!

  • @Zeppelinpuppy
    @Zeppelinpuppy 9 years ago

    Patrick is the man! Thanks mate!

  • @tavisking3326
    @tavisking3326 5 years ago

    Mixing centimetres with pounds? Metric and Imperial?!? Science Licence revoked! ;-) Great video.

  • @erjonaselimaj5767
    @erjonaselimaj5767 9 years ago

    When we use the OLS method, why do we take into consideration the vertical deviation of a point from the regression line instead of the horizontal deviation??

  • @williamaiweilun
    @williamaiweilun 10 years ago

    Man, I like your videos. Simple and easy to understand. To explain things not in a fashion that people can understand is a form of cowardice, and I am really glad you are absolutely not!

  • @jkid1134
    @jkid1134 11 years ago

    If this is what I think it is, I'm EXCITED

  • @johndevittire
    @johndevittire 10 years ago +1

    Why look at the vertical distance? Does it not make more sense to look at the perpendicular distance?

    • @mightyworker
      @mightyworker 10 years ago

      This seems to also be done, according to my current lecturer, but he just said "hey, it's called Orthogonal Distance Regression" and then kept going with this method.

    • @przs33
      @przs33 10 years ago +1

      No, because it would correspond to another X-value; you use this method to predict a Y (explained) value based on every X (explanatory) value that you have.

  • @tonycatman
    @tonycatman 11 years ago

    I think that the reason we use squares is that we can then take the square root. We always end up with a positive number which we want to minimize.

  • @Cyphlix
    @Cyphlix 10 years ago

    What's the point of squaring here? Surely you could take abs() instead and find some minimum function, or are we differentiating later for the minimum...

  • @jamesburke9865
    @jamesburke9865 8 years ago +2

    Could we also take the sum of the absolute values of the distances, instead of squares?

    • @stubill9298
      @stubill9298 7 years ago

      Yes, absolutely; you'd get the Least Absolute Deviations estimator rather than least squares. For many reasons, though, LS is much more common than LAD.
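
For anyone curious how the two estimators differ in practice: a minimal sketch, assuming made-up data with one outlier. numpy's polyfit gives the least-squares line, and scipy's general-purpose minimizer is used here to fit LAD (the data and names are illustrative, not from the video).

```python
import numpy as np
from scipy.optimize import minimize

# Made-up data: four points near y = x, plus one outlier.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.0, 2.9, 4.1, 15.0])   # last point is the outlier

# Least squares: minimize the sum of squared residuals.
m_ls, b_ls = np.polyfit(x, y, deg=1)

# Least absolute deviations: minimize the sum of |residuals|.
# Nelder-Mead copes with the non-differentiable objective.
res = minimize(lambda p: np.sum(np.abs(y - (p[0] * x + p[1]))),
               x0=[m_ls, b_ls], method="Nelder-Mead")
m_lad, b_lad = res.x

print(m_ls, b_ls)     # slope dragged upward by the outlier (~3.0)
print(m_lad, b_lad)   # slope stays near 1, following the other four points
```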

  • @tarunsankhla2328
    @tarunsankhla2328 4 years ago

    I had a question: what is the need to square the distance? If we take the absolute distance and add, can't we still get the minimum value for D???

  • @wiscatbijles
    @wiscatbijles 9 years ago

    Is there also a method where you don't take the squares, but take the sum of the absolute differences (and what is this called)? I guess least squares would seem sensitive to outliers, because (outlier - model) squared would contribute greatly to the sum of squares (D), I believe.

  • @hazemsaeed6372
    @hazemsaeed6372 4 years ago

    That's a great explanation and it helped me a lot... I tried to search for what kernel least squares is, but I didn't find anything that explains it; all I found is kernel regularized least squares... care to explain it for me?

  • @CommanderHulio
    @CommanderHulio 9 years ago +1

    You mentioned taking the squared value so you wouldn't have to deal with positives and negatives in your least squares equation. Could you simply use the absolute value of a number instead of squaring it?

  • @antoineleblanc2509
    @antoineleblanc2509 6 years ago

    Why would you take the vertical distance for your calculations? Wouldn't the shortest distance between the line and the point make more sense?

    • @patrickjmt
      @patrickjmt  6 years ago

      why would that make more sense?

  • @PC-pe3tg
    @PC-pe3tg 5 years ago

    Finally got it.. thank u so much 😘

  • @chutsu_io
    @chutsu_io 11 years ago

    Additionally, could it not also be used to emphasize those distances that are relatively larger than the other, smaller distances? Just a guess

  • @blakehelms3566
    @blakehelms3566 7 years ago

    If I had to interpret least squares in the context of a problem (like a free-response question), how would I do that?

  • @efflorithe
    @efflorithe 11 years ago

    I know this is from months back, but just in case...
    I am assuming we use squares because they'll give us values that are easier to work with, rather than using larger even-numbered powers, which would make us work with extremely large values.

  • @sarahhope8516
    @sarahhope8516 6 years ago

    What if y is a function of a lot of variables, say 4 or 5, and each time we combine values of these variables we get a new value of y... In this case, how can I proceed to get an approximate function of y?? THANK YOU SO MUCH :-)

  • @barnoegamberdiyeva8733
    @barnoegamberdiyeva8733 2 years ago

    Dear Patrick, shouldn't the d1 distance be perpendicular to the linear model?

  • @antonnyagolov4332
    @antonnyagolov4332 6 years ago

    Hey guys.
    If we have an exercise, or let's say an exam, and the question asks you to "Plot the data points and graph the regression line", how do we estimate the line? Like, where do we draw the line, or is it just a random line that we need to draw? Or do we first find the mean of X and Y and then draw the line through the mean of those two points?

    • @MZBENNE
      @MZBENNE 6 years ago

      Draw the line randomly on the graph where the data is, but not necessarily passing through the data points.

  • @arpitagarwal016
    @arpitagarwal016 7 years ago

    patrickJMT So, how does the algorithm draw the first line? Does it choose the slope and intercept values at random?

  • @rupali4197
    @rupali4197 4 years ago

    Omg!! It made me think... I'm sooooo late here. Some comments are from 7 years ago, some are 6 or 5 or 3... the video was uploaded on 16 Jan 2013.

  • @italoarrue5229
    @italoarrue5229 7 years ago

    It can't be clearer. Thank you!

  • @muthonimutonga7673
    @muthonimutonga7673 7 years ago

    Great explanation! Thank you!

  • @080rohitdatta2
    @080rohitdatta2 5 years ago

    Thanks man

  • @stevewaltz8076
    @stevewaltz8076 8 years ago

    You sound like David Cross a little lol, thanks for the explanation

  • @samueloluwapelumi1077
    @samueloluwapelumi1077 6 years ago

    Hi @patrickJMT, I just realized that you are left-handed too. Yaay!

  • @IMadeOfClay
    @IMadeOfClay 11 years ago

    Ok thanks. But why not raise them to the fourth power for example? That would achieve the same result i.e. the sum wouldn't be zero. What's so special about squaring them?
    Thanks in advance.

  • @state439
    @state439 7 years ago +1

    Sounds exactly like the guy at the end of Pulp Fiction

  • @padmabagri7010
    @padmabagri7010 8 years ago

    Hi Patrick sir... how you make the red line, and on what basis, is what I am unable to understand.

  • @KASANITEJ
    @KASANITEJ 5 years ago

    What is the point of squaring the distance? Why can't it be the sum of distances?

  • @ozgeatalanloomis1717
    @ozgeatalanloomis1717 6 years ago

    I enjoyed your explanation, but don't your points have to be perpendicular to the line in order to measure the distance of your data points? You aren't really measuring the same distance with your d1 and so on.

  • @NguyenHoa-er1ff
    @NguyenHoa-er1ff 6 years ago

    just love your voice so much :D

  • @rohith1510
    @rohith1510 9 years ago

    Thank you!

  • @netmadefamous
    @netmadefamous 10 years ago +4

    Why square? To not have any negative values, that's it...
    so the model works more accurately...

    • @peymanmohsenikiasari8564
      @peymanmohsenikiasari8564 5 years ago +1

      This is my question too. Why not minimize d1 + d2 + d3 + ...?

    • @itnhalam
      @itnhalam 5 years ago

      @@peymanmohsenikiasari8564 the smallest is a positive number, so we just square it!

  • @pvtests8248
    @pvtests8248 4 years ago +1

    Thanks :)

  • @d-shiri
    @d-shiri 6 years ago

    thank YOU

  • @unev
    @unev 7 years ago +7

    Why do we prefer squared values over absolute values?

    • @justinnunez7318
      @justinnunez7318 7 years ago +4

      Squaring turns negatives into positive values, and it's easier to differentiate. Remember, absolute value, although continuous, is not differentiable at 0. [See the worked derivation after this thread.]

    • @zaidthunder1
      @zaidthunder1 6 years ago

      From my understanding, we square so that deviations that are outliers are emphasized more than the values which deviate very little.

    • @dammekes6406
      @dammekes6406 6 years ago +1

      Indeed. Squaring gives more weight to more widely scattered residuals. I think that's a little mistake in this video.

    • @dammekes6406
      @dammekes6406 6 years ago

      In short, it makes the regression BLUE (the Best Linear Unbiased Estimator).
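
The differentiability point in this thread is what gives least squares its closed form. A standard-calculus sketch of the derivation (not taken from the video, where the fitting itself is left for Part 2):

```latex
% Minimizing D(m,b) = \sum_i \big(y_i - (m x_i + b)\big)^2 by setting both
% partial derivatives to zero gives the normal equations:
\[
\frac{\partial D}{\partial m} = -2\sum_i x_i \big(y_i - m x_i - b\big) = 0,
\qquad
\frac{\partial D}{\partial b} = -2\sum_i \big(y_i - m x_i - b\big) = 0,
\]
% whose solution is the familiar slope and intercept:
\[
m = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
\qquad
b = \bar{y} - m\bar{x}.
\]
% The second equation also explains the earlier thread: the residuals of the
% fitted line sum to zero. |d_i| has no derivative at 0, hence no such closed form.
```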

  • @TheAk74us
    @TheAk74us 5 years ago

    starts at 2:44

  • @ahmedelamineboukemoune3
    @ahmedelamineboukemoune3 7 years ago

    THANK YOU SO MUCH

  • @morgyn8241
    @morgyn8241 6 years ago

    thanks

  • @saad7417
    @saad7417 5 years ago

    Couldn't we have used absolute value instead of squaring the values?

    • @johnb1391
      @johnb1391 5 years ago +1

      You should always assume your data is subject to error. Taking the absolute value will make a negative number positive but it won't help showcase the margin of error easily. Squaring a value will make a small number smaller and a big number bigger so squaring will put much more weight on erroneous measures.

  • @scottrobinsonmusic
    @scottrobinsonmusic 11 years ago

    Why can't you just take the absolute value rather than squaring? Surely squaring will over-emphasise outliers? I don't mean this is wrong, it's just something I never understood about least squares.

  • @TheZakev
    @TheZakev 8 years ago +1

    Can somebody tell me the equation for the regression line, please?

    • @armaankapila7788
      @armaankapila7788 8 years ago +1

      +Chavdar Zakev Y (with the little 'hat' triangle on top) = bx + c
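
In code, the slope and intercept for that y-hat = bx + c form come from the standard least-squares formulas. A minimal sketch with made-up data (variable names follow the reply above):

```python
import numpy as np

# Made-up data points.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.8, 5.1])

# Closed-form least-squares estimates for y-hat = b*x + c.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
c = y.mean() - b * x.mean()

print(b, c)                      # slope and intercept
print(np.polyfit(x, y, deg=1))   # numpy's built-in fit gives the same pair
```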

  • @jessymarie211
    @jessymarie211 9 years ago

    Legend, Thanks :D