Linear Regression - Least Squares Criterion Part 1
- Published 5 Sep 2024
- Thanks to all of you who support me on Patreon. You da real mvps! $1 per month helps!! :) / patrickjmt !! Linear Regression - Least Squares Criterion. In this video I just give a quick overview of linear regression and what the 'least squares criterion' actually means. In the second video, I will actually use my data points to find the linear regression model.
I used to watch your videos when I was doing my math courses (in engineering), but it's been a while since I watched them, since I got done with all my math courses two semesters ago. However, when I searched YouTube for "Linear Regression" for a statistics class, I was so happy to see your name at the top of the list. You were the BEST YouTube tutor I had and I definitely missed watching your videos.
Dear Patrick! I've been using your videos for about 3 years, from my last year of IB up to this point (2nd year electrical engineering). I would just like to thank you tons for all these helpful videos you're sharing. You have a very good pedagogical approach. The best I've ever seen :))) Thank you!!
Finally someone that speaks in the language I understand. Thank you for not taking anything for granted. You are an amazing teacher! Kudos bro!
Great video Patrick. I'm a CPA, venturing into big data and machine learning (high-level knowledge), and this really helped. It's a challenge explaining this stuff to execs, and this is great.
+Blair Elliott good luck! Happy to come in as a consultant :)
This helped me so much, because my teacher honestly taught me nothing.
+Trumpdog16 come back any time :)
Musicllya aha same!!!
The text book makes this 20 times more complicated than it actually is. This is fantastic, thank you.
Indeed 😀
I sat through a 1.5 hr lecture, didn't understand a thing. Watched this and now I understand at least the concept. Thank you!
I think the problem with the other stuff I was reading was they were trying to pretend that the concept was much more complicated than it really was. Like, they were trying to fit too much stuff in too early. This was a perfect, casual explanation that gave me a good idea of what you were talking about before moving on. Thank you!
I really like how you described Least Squares and Linear Regression in a long way because I was able to get all of the notes down from this video :) Thanks @patrickJMT
You got it right already.
Squaring makes small numbers smaller and large numbers larger. A large error is much worse than a small error, so by squaring all errors, the larger errors are indeed overemphasised
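To make that concrete, here is a tiny Python sketch (toy numbers, nothing from the video):

```python
# Toy residuals: squaring makes the large error dominate the total.
residuals = [1, 4]
abs_errors = [abs(r) for r in residuals]   # large error counts 4x the small one
sq_errors = [r ** 2 for r in residuals]    # large error counts 16x the small one
print(abs_errors)  # [1, 4]
print(sq_errors)   # [1, 16]
```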
You helped me through college and now you're helping me at work! Much love!
Thank you Patrick for making this so easy to understand. You're a good teacher.
Hi ma'am, can you please explain the red line concept in Patrick sir's video part 1?
Dammit I just had my exam with this in it today...!
Good luck to all the future generations, when PatrickJMT has uploaded explanations to all the maths that ever was :P
Thank you so much. I literally learn pretty much everything we do in our numerical method class from your videos.
Nice job!!! Made me understand in simple language. Thank a lot
Thank-you Patrick for explaining it in such an easy to follow format.
Excellent explanation, was looking at Wikipedia and didn't quite get it as fast as watching your graph demonstration. Good job!
Would have never thought I would have needed to know this again after college. 10yrs later it is making my work life miserable. Hahahahahaha. Thanks for the info buddy! Cheers
It is because the error ('D' here) would be zero if you didn't square them. All the points under the line have negative distances and the ones above the line have positive ones. A basic assumption of OLS is that these will always sum to zero; that is the definition of the best-fit line. Therefore, in order to get the error, you have to square them.
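A small Python sketch of this point, with made-up data (the line y = 1.1x + 0.75 happens to be the least-squares fit for these four points):

```python
# Signed vertical distances from each point to the least-squares line.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 2.5, 4.5, 5.0]
d = [y - (1.1 * x + 0.75) for x, y in zip(xs, ys)]
print([round(v, 2) for v in d])   # [0.15, -0.45, 0.45, -0.15]
print(round(sum(d), 10) == 0.0)   # True: the signed distances cancel out
```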
omg my whole lecture makes so much more sense now. thank you for this
It was a great lecture. Cleared a bit. Thank you !!
great video, you are very clear and hit the nail on the head, in terms of communicating conceptual information. thanks
Principle of least squares - well explained!
Very clear and concise, thank you for the tutorial.
Woahh mann it was superbly explained by youu! Great work !
I remember taking regression in college a few semesters back. It was challenging, but I definitely learned a lot.
thank you so much >>> big kiss
Saeed Alahlawy word
+BBBDubb
No he didnt ༼͡◕ ͜ ʖ ͡◕༽
Very good explanation. Thanks!
This was outstanding! Very helpful and informative. Great work!
being a lefty myself, I think using a permanent marker is genius.
you are an angel,Patrick
So simple !
Thank you
Thank you very much. The video helped me understand the concept better and also solved my doubts. :)
this guy saved my ass in calculus!
thanks so much! Excellent explanation.
Patrick, you are the man. Thank you.
Thanks Patrick . Nice explanation!
Thank you so much. Doing this stuff in 7th grade is really hard at times.
Wow! I subscribe to your channel, and linear regression is what we're covering now.
You can. The least squares method is just one of many methods. However, OLS is mainly introduced because it's the most widely used and the easiest.
So clear! Thank you!
Hi Patrick, this is great. I'm taking Econometrics as a course for my Masters. I've never studied Economics, Statistics or maths in my life. Your video is very intuitive, the way I want to grasp things visually. Subscribed to your channel. But I couldn't find the rest of the Econometrics videos. Can you send me the links please? Thanks a bunch!
Very clear explanation! Thanks Patrick!!!
Thanks, it's very clear & easy to get.
Great explanation patrick! Cheers
What a fantastic explanation! Thank you...
So well explained! thank you! Excellent video!
Hi Patrick, great, could you please also do a non-linear least square video?
Thanks - this was very helpful
your videos are super helpful but i can't help but crack up whenever you say LEEEEENGTH
LEEEEEEEENGTH!
Patrick to the rescue again!
very well explained
Great explanation
Good job explaining it in simple terms without all the Greek symbols!
Patrick is the man! Thanks mate!
Mixing centimetres with pounds? Metric and Imperial?!? Science Licence revoked! ;-) Great video.
When we use the OLS method, why do we take in consideration the vertical deviation of a point from the line of regression instead of the horizontal deviation ??
Man, I like your videos. Simple and easy to understand. Explaining things in a way people can't understand is a form of cowardice, and I am really glad you absolutely don't!
If this is what I think it is, I'm EXCITED
Why look at the vertical distance, does it not make more sense to look at perpendicular distance?
this seems to also be done according to my current lecturer, but he only just said "hey it's called Orthogonal Distance Regression" and then kept going with this method.
No, because it would correspond to another X-value. You use this method to predict a Y (explained) value based on every X (explanatory) value that you have.
I think that the reason we use squares is that we can then take the square root. We always end up with a positive number which we want to minimize.
Whats the point of squaring here? Surely you could take abs() instead and find some minimum function, or are we differentiating later for minimum..
Could we also take the sum of the absolute values of the distances, instead of squares?
Yes absolutely, you'd get the Least Absolute Deviations estimator rather than least squares. For many reasons though, LS is much more common than LAD
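A quick Python sketch of the difference, using made-up points and one candidate line y = x: the single outlier at (5, 12) dominates the squared criterion far more than the absolute one.

```python
# Compare the least-squares and least-absolute-deviations criteria
# for the candidate line y = x on toy data with one outlier.
points = [(1, 1.2), (2, 1.9), (3, 3.1), (5, 12.0)]
residuals = [y - x for x, y in points]
sse = sum(r ** 2 for r in residuals)   # squared criterion: ~99.9% from the outlier
sad = sum(abs(r) for r in residuals)   # absolute criterion: ~94.6% from the outlier
print(round(sse, 2), round(sad, 2))    # 49.06 7.4
```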
I had a question: what is the need to square the distances? If we take the absolute distances and add them, can't we still get the minimum value for D?
Is there also a method where you don't take the squares, but take the sum of the absolute differences (and what is this called)? Because, I guess least squares would seem sensitive to outliers. Because (outliers - model) squared would contribute greatly to the sum of squares (D) I believe.
That's a great explanation and it helped me a lot. I tried to search for what kernel least squares is, but I didn't find anything that explains it; all I found is kernel regularized least squares... care to explain it for me?
You mentioned taking the squared value so you wouldn't have to deal with positives and negatives in your least squares equation. Could you simply use the absolute value of a number instead of squaring it?
No, because it is not differentiable.
why would you take the vertical distance for your calculations? wouldn't the shortest distance between the line and the point make more sense?
why would that make more sense?
Finally got it.. thank u so much 😘
Additionally, could it not also be used to emphasize those distances that are relatively larger than the other, smaller distances? Just a guess.
If I had to interpret least squares in the context of a problem (like a free response question), how would I do that
I know this is from months back, but just in case...
I am assuming we use squares because they give us values that are easier to work with, rather than larger even powers, which would make us work with extremely large values.
What if y is a function of a lot of variables, say 4 or 5, and each combination of values of these variables gives a new value of y? In this case, how can I proceed to get an approximate function of y? THANK YOU SO MUCH :-)
Dear Patrick, the d1 distance should be perpendicular to the linear model, shouldn't it?
Hey guys.
If we have an exercise, or let's say an exam, and the question asks you to "Plot the data points and graph the regression line," how do we estimate the line? Like, where do we draw the line, or is it just a random line that we need to draw? Or do we first find the means of X and Y and then draw the line through those two means?
Draw the line randomly on the graph, where the data is but not necessarily passing through the data points.
patrickJMT So, how does the algorithm draw the first line? Does it choose the slope and intercept values at random?
Omg!! It made me think... I'm sooooo late here. Some comments are from 7 years ago, some are 6 or 5 or 3... The video was uploaded on 16 Jan 2013.
It can't be clearer. Thank you!
Great explanation! Thank you!
Thanks man
You sound like David Cross a little, lol. Thanks for the explanation.
Hi @patrickJMT, I just realized that you are left-handed too. Yaay!
Ok thanks. But why not raise them to the fourth power for example? That would achieve the same result i.e. the sum wouldn't be zero. What's so special about squaring them?
Thanks in advance.
Sounds exactly like the guy at the end of pulp fiction
Hii Patrick sir.. on what basis do you make the red line? That's what I am unable to understand.
what is the point of squaring the distance? why can't it be the sum of distances?
I enjoyed your explanation, but don't your points have to be perpendicular to the line in order to measure the distance of your data points? You aren't really measuring the same distance with your d1 and so on.
just love your voice so much :D
;)
Thank you!
Why square? To not have any negative values, that's it...
so the model works more accurately..
This is my question too. Why not minimize d1 + d2 + d3 + ...?
@@peymanmohsenikiasari8564 the smallest is a positive number, so we just square it!
Thank you! :)
thank YOU
Why do we prefer squared values over absolute values?
Squaring turns negatives into positive values, and it's easier to differentiate. Remember that absolute value, although continuous, is not differentiable at 0.
From my understanding, we square so that deviations that are outliers are emphasized more than the values which deviate very little.
Indeed. Squaring gives more weight to more widely scattered residuals. I think that's a little mistake in this video.
In short, it makes the regression BLUE (the best linear unbiased estimator).
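A numeric sketch of the differentiability point raised above (toy step size, not from the video): the one-sided slopes of |x| at 0 disagree, while those of x² agree, which is why the squared loss is easier to minimize with calculus.

```python
# One-sided difference quotients at x = 0.
h = 1e-6
right_abs = (abs(0 + h) - abs(0)) / h     # slope from the right:  1.0
left_abs = (abs(0 - h) - abs(0)) / (-h)   # slope from the left:  -1.0
right_sq = ((0 + h) ** 2 - 0 ** 2) / h    # slope of x**2: ~0 from both sides
print(right_abs, left_abs)  # 1.0 -1.0
```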
starts at 2:44
THANK YOU SO MUCH
thanks
Couldn't we have used absolute values instead of squaring the values?
You should always assume your data is subject to error. Taking the absolute value will make a negative number positive, but it won't help showcase the margin of error easily. Squaring shrinks a small residual and blows up a big one, so squaring puts much more weight on erroneous measures.
Why can't you just take the absolute value rather than squaring? Surely squaring will over-emphasise outliers? I don't mean this is wrong, it's just something I never understood about least squares.
Can somebody tell me the equation for the regression line, please?
+Chavdar Zakev Y (with the little 'hat' triangle on top) = bx + c
Legend, Thanks :D
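For anyone wanting the actual numbers behind ŷ = bx + c: the least-squares slope and intercept have a closed form. A minimal Python sketch with made-up data:

```python
# Closed-form least-squares fit: b = S_xy / S_xx, c = mean(y) - b * mean(x).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 2.5, 4.5, 5.0]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
c = my - b * mx
print(round(b, 2), round(c, 2))  # 1.1 0.75
y_hat = b * 2.5 + c              # predicted y for a new x = 2.5
```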