Support StatQuest by buying my books The StatQuest Illustrated Guide to Machine Learning, The StatQuest Illustrated Guide to Neural Networks and AI, or a Study Guide or Merch!!! statquest.org/statquest-store/
Everyone should buy this book if you want to learn machine learning. It is the greatest 20 bucks that I have ever spent in my entire life.
@@weixiangzhao561 BAM! Thank you very much! :)
This video isn't available in Nepal. Can you please make it available sir? I really love your content. 🙏
@@dipenpandit684 I'm working on it. It's some strange thing with youtube and I've contacted them.
I love this book! I have taken Stats so many times but I still learned so much. Thank you for writing this book!
I have been enrolled in a graduate machine learning course for about a month now and you have just demystified so many details around Linear Regression. Please do more ML videos! They are so clear and helpful. If you can, please do one on Regularization and Decision Forests.
You have no idea how much I appreciate the clarity and simplicity of this explanation, you deserve a medal
Thank you!
You deserve more subscribers; the quality of your videos is so intuitive that even a high school student can understand them. Please keep uploading videos. I really think you're going to get more subscribers in the future.
You were right
Can’t agree more
Definitely! I'm glad that as of March 2024, he's got over one million subs (1.13 million to be exact)!
I actually took machine learning as the elective subject for my final year of engineering, and I'm pretty sure this channel is going to teach me everything!
Hooray!!! :)
@@statquest your vids are so good I watch your videos for entertainment lol
How'd the machine learning class turn out?
After weeks of research and frustration, I have finally understood the concept of least squares so well! You explained the concepts so simply and logically!! Thank you so much for this amazing video. Much appreciated.
Thank you very much!!! :)
I must admit sir, you are one of the best teachers I've ever had. Thank you for being so awesome!
Wow, thank you!
Thank you!! After somehow passing 2 PhD quantitative methods modules and not really understanding why we did any of it, your channel has finally cleared a lot of stuff up!
Great to hear!
How the hell did you pass
@@frankchen4229 do what you're told and don't ask why
@@maggiechen1141 huh. Maybe PhD isn't so bad.
@@maggiechen1141 best advice so far
There are so many videos on this subject that say what the least squares method is and how to use it to find the line of best fit, but you are the only one who explained the concept behind this method. Thank you.
Thanks!
I keep coming back for more! THANK YOU SO MUCH!!! This is more clearly explained than any other tutorial/video/in-class session I've ever listened to! You are the best!
Wow! Thank you very much!
The quality of these videos seems to have improved greatly over the years, but the simplicity was always there. Amazing!
Some of the early videos are still the best.
English is not my first language, but I can clearly understand your explanation. Thank you sir!
Thank you!
I always wanted to know why the formula squares the distances and then takes the root instead of using the absolute value. You're the first one to explain this. Thank you!
Thanks!
So can you explain in your own words in short then? Thanks
This is the best UA-cam channel ever! Thank you Josh for all your work, you're doing awesome!!
Wow, thanks!
I don't think I have ever clicked on the subscribe button that fast. Absolutely amazing
Awesome! :)
Me too
Your presentation style is way ahead of anything else on this platform. You don't have the inefficient habit of deriving everything from first principles but allow a holistic intuition to develop. Absolute magic!
Thank you! :)
Josh your songs and teaching are excellent, you are doing something no one else has done in my life: inspiring me to become a Data scientist as well as a composer
Wow! That is awesome! BAM! :)
Done thanks
We take the square to make all the errors positive (we want to find the total error of the points from the line)
We want to find the optimal values for a and b in the equation of a line that minimize the sum of errors squared. We can express the sum of squares as a function of a and b and take the derivative to optimize it
5:45
We find the slope that minimizes the error by finding the minimum of the multivariable function (the variables are a and b)
7:30
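For anyone who wants to see those notes in action, here is a minimal Python sketch (the data and variable names are made up for illustration): setting the derivatives of the sum of squared errors with respect to a and b to zero gives the familiar closed-form slope and intercept.

```python
# Minimal sketch of the notes above (hypothetical data; numpy assumed).
# SSR(a, b) = sum of (y_i - (a*x_i + b))^2; setting dSSR/da = dSSR/db = 0
# gives the closed-form least-squares estimates below.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # hypothetical predictor values
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])  # hypothetical responses

def ssr(a, b):
    """Sum of squared residuals for the line y = a*x + b."""
    return np.sum((y - (a * x + b)) ** 2)

# Closed-form solution from setting both partial derivatives to zero:
a_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b_hat = y.mean() - a_hat * x.mean()

print(a_hat, b_hat, ssr(a_hat, b_hat))  # optimal slope, intercept, minimum SSR
```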
I've read a few undergrad texts and none of them actually explain the origins of the idea behind least squares, at least not in a simplified visual form. You're usually just shown the problem, slapped with the simple linear model, and then the generalized version... This gives some insight into the motivation behind this technique. Thank you for donating your time to such an altruistic cause. You a real one!
Thank you! :)
This video made me stop crying from stress of barely understanding anything in my class. Thank you
Glad it helped!
Right?! This channel is so underrated, we have to change that!
@@bigvinweasel1050 Wow! Thank you!
@@statquest I'm serious! If you ever need anything done on Python for content - I would be more than happy to write it out as clearly and as elegantly as I can so you can use it for content.
This is absolutely fantastic! I am so glad that I found this channel on UA-cam while doing data science self-study. I now understand the concept of OLS, which stressed me out for a week before I found this video. Big thanks!
I'm glad it was helpful! :)
Bro, I appreciate every single video that you've made. I just want to say thanks a ton by not skipping the ads in your videos. Love from Indonesia!
I appreciate that!
It literally only took me 15 seconds to subscribe because of his unique 15 second intro
That's awesome! :)
@@statquest i think you should add fitting a curve to data in this playlist
I have watched and read so many articles, but this video explains the use of the sum of squared errors and why it's important. Thank you!!
Hooray! :)
Never been more excited than after watching this video. Truly intuitive and amazing intro to linear regression.
Thank you! :)
Your way to explain these abstract concepts is simply AMAZING!!!! Thank you so much for these incredible videos!
Thank you very much! :)
I just wanted to say that you were born to teach.
The book? Are you kidding me? Perfection.
I would advise to give the first chapter as free content so everybody can have a taste of your abilities.
Favorite quote so far: "The Binomial Distribution makes me want to run away and hide. :) "
Thank you very much! That's a good idea. I wonder how I can do that.
Favorite new quote just dropped: "the Normal distribution is awesome, and, to be honest, it sort of looks like you..." lol
@@franciscoicarocs Haha! I'm glad you're enjoying it. I just started (in earnest) to write my next book on neural networks (from simple to state of the art)
Josh, seriously, thanks for all these videos. My ML journey is smoother thanks to them.
Glad you like them!
I never understood why we needed to plot a line. Now I do. It is amazing
OMG!! You really are the best at explaining this concept, while most books just seem to want to show off how good their English is.
Thanks! :)
This is a nice video, but looking at the more recent videos and these old ones, it is very clear how much you have improved what was already awesome. Congrats!
Thanks!
StatQuest Team - Thank you so much for all your efforts. For the last few months, I felt like that mouse stuck in a wheel going round and round with concepts as I got deeper into ML. A definite recommendation to everyone and anyone irrespective of their ML proficiency.
Awesome! Good luck with your ML studies! :)
Your videos are helping me write my thesis because I don't have a stats background and went into a science heavy masters. this is just the best!!!!
Good luck! :)
Hello Josh, Thank you for sharing. Every time I sign in, I learn something new from you. Fantastic presentation . Stay blessed.
Thank you! You too!
What a legend you are! No words to express my gratitude. You are a blessing to everyone wanting to learn these concepts! Wish you good health and loads of happiness. :)
Thank you!
I'm new to data science, you just nailed it ....amazing explanation...after so many videos...finally understood what the heck Linear regression is ...Thank you so much...
Hooray! :)
Oh my God this is just the best video I've ever seen about Linear Regression! Thank you very much! I subscribed just after the video, please do not stop!
Awesome! Thank you!
This is the real teaching. Respect
Awesome. I am screaming with happiness. Thanks StatQuest. The intuition you conveyed to us is priceless.
Thanks!
Thanks very much for this. I watched it last year when I was looking to change careers. Re-watching now that I'm enrolled in some real training. And wow!
BAM! Good luck with your course.
What a teaching style... 200% what I was searching for...
Excellent work!
Thank you very much! :)
I think that the Blue Lake in New Zealand has some competition for ~most clarity~. This channel is amazing!!
BAM! :)
I've taken Andrew's course and bought books, but that intro and vibe is the real intro to ML--I learned that the hard way. Thanks for your songs.
:)
@@statquest Hey man, I'd like to request you to kindly make a video on how to become an ML engineer from scratch! I am a self-taught aspirant. Please make the roadmap for people like us.
@@supriyamanna715 To be honest, you could just start at the top of this and work your way down, through the webinars: statquest.org/video-index/
The best teaching video I have ever seen. What a great work!
Thank you! :)
AMAZING VIDEO. MAKES STATS AND MATHS MORE LOGICAL AND REAL
Thanks!
I've seen many of your videos, they are amazingly good stuff!
I just wanted to point out something in this one: you calculate "b" for the first horizontal line, and then you start rotating it to find the best slope.
But you never explain WHERE THE ROTATION POINT IS! This is crucial!
I mean, to give the intuition that you can assume any intercept and then rotate using it as the rotation point, and it won't change the result.
Noted.
Wowie kabawie wowie ZOWIE! I'm still in high school! And I understand most of the fundamentals in the video!😊
double bam!
You have the best teaching skill in the universe.
Wow! Thanks!
The slope explanation gives a good intuitive sense of how to find the best-fit slope and y-intercept for least squares, especially if you have a background in calculus. In contrast, the linear algebra solution of OLS is just that much more shocking/amazing; that the same result can be calculated algebraically for n-space >without< any geometric intuition, without any search in the solution space for slope and y-intercept. Your visual explanation is more intuitive and memorable. The linear algebra approach feels more magical, as I find it harder to remember the derivation.
Noted
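For readers curious about the linear-algebra route that comment describes, a rough sketch on assumed data might look like this: build the design matrix and solve the normal equations, with no search over slopes at all.

```python
# Rough sketch of the normal-equations (OLS) solution: solve
# (X^T X) beta = X^T y for beta = [intercept, slope]. Data is made up.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

X = np.column_stack([np.ones_like(x), x])  # design matrix: [1, x] per row
beta = np.linalg.solve(X.T @ X, X.T @ y)   # algebraic solution, no search

print(beta)  # [intercept, slope] -- matches the calculus-based answer
```

In practice `np.linalg.lstsq(X, y, rcond=None)` is the more numerically stable call, but the two-line version above shows the algebra the comment is marveling at.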
I just "discovered" by myself that lines generated by linear regression always passes (M(X), M(Y)) point (which, when thinked about it is quite intuitive) and thus, we can add constriction, that
b=M(Y)-a*(M(X), which allows to solve the equation for just single varible (slope), instead of two (slope and intercept). Thats for sure basic, but im neverthless proud of myself :D Great channel BTW.
bam!
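That observation is easy to check numerically; here is a tiny sketch (made-up data) confirming that the fitted line passes through the point of means once b = M(Y) - a*M(X) is imposed.

```python
# Tiny check of the observation above (hypothetical data): with the
# constraint b = mean(y) - a*mean(x), only the slope remains to solve,
# and the fitted line passes through (mean(x), mean(y)).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

a = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b = y.mean() - a * x.mean()  # the single-variable constraint from the comment

assert np.isclose(a * x.mean() + b, y.mean())  # line hits the point of means
```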
Thank you for the vid. It was so easy to understand the concept of least squares using visualization. I will use this as my reference for my demo teaching in stats. Hoping for more stat videos and data analysis tricks & tips...
Glad it was helpful!
Superb explanation; crystal-clear graphics are a good way to comprehend. Thank you for this.
Hooray!
BAM 😅😅😅🎉 I reached the bottom of the _STACK_ Quest (finally) 🎉❤ Wow 😮 I am on this quest to find out where it will stop, since I just learned that I must watch the video *Fitting a Line to Data*, also known as *Linear Regression*, before I can watch *Gradient Descent Step-by-Step!!!*, so that I can watch the video related to *Neural Networks part 2*, which I must watch before *The StatQuest Introduction To PyTorch...*, which comes before the *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand)
I am genuinely so happy to learn about that stuff with you Josh❤ I will go watch the other videos first and then I will back propagate to that first video... And now I feel like it took me 6 years to find this wonderful video. I can't wait to see it😊
You finally made it!
Josh, I've seen this explanation for nearly 40 years, always re-watching for someone to explain the approach as well as the instructor who introduced it to me in 1980... yours is the best so far.
But you got away with some simplification by starting with a 0-slope line, and the calculation of the line's value was sort of 'lost' when you leapt to non-zero slope lines...
Jussayin.
Cheers
DocV
Glad it was helpful!
Yes this is exactly what I've been looking for. Great video. It has made my life a lot easier.
Thanks!
I actually LOVE your video style!
Thank you! :)
This is an amazing video on the intuition behind fitting lines to data. Loved this video; it gave me a recap of some concepts I learned years ago and had forgotten.
Hooray! :)
Hello, you have done a wonderful explanation. I loved it.
And I am requesting you to do a video on the assumptions made in linear regression.
This will help us a lot.
your method of teaching is awesome, thank you!
Thank you! :)
maths is so crazy, the explanation was amazing - the people who figure this stuff out are geniuses
bam! :)
Hey Josh, I absolutely love your explanations! They've given me a completely different perspective on how I think about machine learning and made the topics intuitive. I wonder if you could compile some notes for all of the content that can be reviewed in under a day, and a mind-map that can be used to put all the pieces together. That would be really AWESOME!
Something like this? app.learney.me/maps/StatQuest
Sir, your explanations are crystal clear. Thank you
Thank you! :)
Please do more of these, I think I will be able to pass my econometrics test thanks to you.
Good luck!
This is the best tutorial channel ever!!
Thank you very much! :)
Great video! The effort you put into teaching is admirable...
Just a suggestion: it would be great if you could put the links to the complementary videos you mention in each video. That way, it is easier to keep on track.
Thanks for the tip. I generally try to do that (add the links in the description), but sometimes I forget. If you have time, it would be great if you could post which videos need links to complementary videos in a comment and then I'll take care of the rest.
I think another reason to take the squared error is that it creates an actual geometric square with sides the size of the error (area of the square, A = L*W). Add them all up to get a 2D representation of the error, whereas adding all the absolute error lines is only 1D. Sometimes shapes are easier to visualize. Amazing videos and pedagogy style. Props.
Noted.
Sorry for commenting several times throughout the series, but I would like to point out you should probably move this video + "Linear Regression, Clearly Explained!!!" before ROC/AUC in your ML playlist since you suggest understanding these basics beforehand.
Ok thanks!
Thanks for this content!!! I am very happy to understand these concepts watching this awesome explanation!
Glad it was helpful!
Simply explained...just makes it beautiful to watch! Thanks!! :-)
Thank you! :)
This is a brilliant video. You're a few minutes from explaining how to derive the LR formula. I seem to recall from my high school days - that you work out two partial derivatives (dSum/da - treating 'b' as a constant and then dSum/db treating 'a' as a constant) and equate these to 0 (to find minimum). You should be left with a couple of equations for intercept and slope.
That's exactly right (and I show how at least part of this works in my video on the Chain Rule ua-cam.com/video/wl1myxrtQHQ/v-deo.html ). However, what I don't like about the LR formula is that it only works in this specific situation. In contrast, Gradient Descent gives us pretty good parameter estimates and works in a million other situations. For more details on Gradient Descent, see: ua-cam.com/video/sDv4f4s2SB8/v-deo.html
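For anyone who wants to see the two-partial-derivative derivation from that comment carried out, here is a small sketch using sympy on a made-up three-point dataset (the data and names are illustrative only):

```python
# Sketch of the derivation described above: take dSSR/da and dSSR/db,
# set both to zero, and solve the resulting pair of linear equations.
import sympy as sp

a, b = sp.symbols('a b')
data = [(1, 1.2), (2, 1.9), (3, 3.2)]  # hypothetical (x, y) pairs
SSR = sum((yi - (a * xi + b)) ** 2 for xi, yi in data)

# Two equations (one per parameter), both linear in a and b:
solution = sp.solve([sp.diff(SSR, a), sp.diff(SSR, b)], [a, b])
print(solution)  # least-squares slope and intercept for these points
```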
Bro, you are a genius. The way you explain, it just sinks into the mind. Kindly guide me to some YouTube channels that explain other math concepts, like calculus, in a similar way, or I request you to create them as well from the perspective of DS.
Thanks! Lots of other people like 3Blue1Brown for math.
Best and simplest explanation ever !!
Glad you liked it!
Concise and precise, well done Sir.
Thank you! :)
You are an incredible teacher.
Thank you! :)
Thank you, this is great. Easy to understand
Thank you! :)
Just the right amount of theory and math. You should consider teaching stats to students in health and biological studies.
Your method of explanation is great. Please keep uploading tutorials. I would like to see tutorials about Deep Learning and Boosting (XGBoost, Catboost, etc.) algorithms which are popular lately. Thanks.
I already have videos on deep learning here: ua-cam.com/video/CqOfi41LfDw/v-deo.html and XGBoost here: ua-cam.com/video/OtD8wVaFm6E/v-deo.html All of my videos are organized here: app.learney.me/maps/StatQuest
Great video.I love the way how intuitive your videos are! 👍🏻
Thank you! :)
Awesome explanation, the best ever; made it look so easy...
Thanks!
Thanks for the video. It was a great refresher.
Thanks!
Hi, Josh. You are an amazing person. Your videos are very helpful to me. Your talent for explaining complicated things simply is magnificent! I hope you will go on and help a lot of people like me.
But I am a bit confused about some moments; I hope you can help me through this.
It's about the graph where we plot the sum of squared residuals against the different rotations.
If the derivative = 0, it means that our function is horizontal, doesn't it? In my head we just have the horizontal line as the optimal line, but it cannot be so. Please clear it up.
Thank you very much!
The point of rotating the line and showing different sums of the squared residuals was simply to help people understand the concept of the goal of finding the optimal line. However, in practice, we just take the derivative of the function and set it to 0 (just like you said). BAM! :)
@@statquest Thank you very much! Such a quick response!
I am a bit in the middle between "simple explanation" and "complex explanation", so it confused me a bit). You are a great human being! Good luck)
@@АлексейШаков-ь4и Thank you! Now, even though we can solve for the optimal line by setting the derivative = 0 and solving for the slope and intercept, a more general solution that works in a lot of different situations is called Gradient Descent. Gradient Descent is the backbone of Machine Learning and is used in this situation, as well as for Deep Learning and all that fancy stuff. For details on how Gradient Descent works, see: ua-cam.com/video/sDv4f4s2SB8/v-deo.html
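To make the Gradient Descent idea in that reply concrete, here is a minimal sketch (made-up data; the learning rate and iteration count are arbitrary choices, not anything from the video):

```python
# Minimal gradient descent for the line y = a*x + b on hypothetical data.
# Each step nudges a and b downhill along the gradient of the SSR.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

a, b = 0.0, 0.0  # start with a flat line through the origin
lr = 0.01        # learning rate (step size); chosen arbitrarily
for _ in range(10_000):
    residuals = y - (a * x + b)
    grad_a = -2 * np.sum(residuals * x)  # dSSR/da
    grad_b = -2 * np.sum(residuals)      # dSSR/db
    a -= lr * grad_a
    b -= lr * grad_b

print(a, b)  # converges to (approximately) the analytical solution
```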
Great video on least square method.
Thanks!
Can never find a clearer explanation than your videos!! I have a question, and I hope it's not stupid. At 8:44, is the derivative of the function the derivative with respect to (a, b), or the derivative with respect to a plus the derivative with respect to b? I don't understand how to turn them into a function. Thank you so much for your work!
We take the derivatives with respect to each variable and set them to zero. However, a more flexible method is Gradient Descent, which can be used in a lot more situations. I show how to do it here: ua-cam.com/video/sDv4f4s2SB8/v-deo.html
@@statquest omg thank you!!!!!
Hi Josh. Love your videos!! They give the best intuitive explanation. Can't thank you enough :).
Please make a video on curve fitting for linear equations using normal equations vs using gradient descent. Thank You!
My video on Gradient Descent compares that method to the analytical solution: ua-cam.com/video/sDv4f4s2SB8/v-deo.html
Great video! I believe a correction is needed at 8:09; it should be "Taking the derivatives of SSR with respect to both slope and intercept..."
Sure, I should have been a little more careful with my words there.
I have calculated using small values of difference from the line, like 1, 2, 3, 4 (the 4 was supposed to be negative). Using the modulus, the answer comes to 10... Now when I square them, the answer comes to 30. So how is the square better than the modulus? Where is the complexity, since the modulus does its job of making the numbers positive?
The square is used because it has a derivative defined at all points. This makes the math relatively easy when finding the best parameter values that minimize the residuals.
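A quick illustration of that point, as a sketch: the derivative of the squared residual is defined everywhere (and shrinks to zero at the minimum), while the derivative of the absolute residual jumps from -1 to +1 and is undefined exactly where the residual is zero.

```python
# Why the square beats the modulus for calculus: d(r^2)/dr = 2r is smooth,
# while d|r|/dr = sign(r) is undefined at r = 0 (the kink in the V shape).
def d_square(r):
    return 2 * r  # defined everywhere, exactly zero at the minimum

def d_abs(r):
    if r == 0:
        raise ValueError("derivative of |r| is undefined at r = 0")
    return 1 if r > 0 else -1

for r in (-0.5, -0.001, 0.001, 0.5):
    print(r, d_square(r), d_abs(r))  # d_square tapers; d_abs just flips sign
```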
Really good explanation, thank you !
Thanks!
When a clip about stats starts off with singing that makes you laugh before you've even started, it's a very good thing.
bam!
Great explanation. Thank you. God bless you!
Thank you!
Pretty spot on. Happy I found this channel.
I have a question: shouldn't the distance between the line and a point be the perpendicular distance between the point and the line? I believe that would be an even more accurate model.
For regression we use the vertical distance between the data and the line in order to preserve the relationship that the variable on the x-axis is supposed to be predicting the variable on the y-axis. By measuring the vertical distance (instead of the perpendicular distance), we can measure how good or bad that prediction is.
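For readers who want to see the difference, here is a small sketch (the line and point are made up): the vertical residual is the prediction error that regression minimizes, while the perpendicular distance is what a method like total least squares would use instead.

```python
# Vertical residual vs. perpendicular distance for a point and a line.
import math

a, b = 2.0, 1.0    # hypothetical line y = 2x + 1
px, py = 3.0, 5.0  # hypothetical data point

vertical = py - (a * px + b)                           # prediction error
perpendicular = abs(a * px - py + b) / math.sqrt(a**2 + 1)

print(vertical)       # -2.0: the line over-predicts y at x = 3
print(perpendicular)  # ~0.894: shorter, but not a prediction error
```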
Clearly explained indeed. Thank you.
Glad it was helpful!
clearly explained! lots of thanks
Glad it was helpful!
thanks a lot for your useful content , ❤ from Iran
Hello Iran!!! Thank you! :)
God bless you and these videos. These are so helpful
Thanks!
This is so awesome. I finally get it! Have you written a book?
Yes, I have written a book and it should come out this spring. Subscribe to stay in the loop! :)
Thank you so much for this incredible video!
Glad you enjoyed it!
The best explanation to date!
Thanks!
Really useful video, thank you! Quick question: when you mention estimating the best line of fit, can you assume the trough of the y=a*x+b vs sum of squared residuals plot to be the best line of fit, or is it essential to calculate the derivative for that function in order to find the best fit?
I'm not sure I understand your question, however, in practice, we solve for the derivatives and set them equal to 0 and solve for the optimal values. That said, we can also use gradient descent ua-cam.com/video/sDv4f4s2SB8/v-deo.html to solve this, and I like that method better because it's a general solution that works in a lot of situations where analytical methods fail.
Hi! First of all, congratulations on this awesome youtube channel.
I was wondering if it would be possible to re-upload this video in a higher quality? If I am not wrong, the max quality is 480p, which is not the best these days.
Thanks!
Miguel
I'm glad you like the video! That's a good idea to upload a higher quality version.
Now that's a good explanation.
Thanks!