The Main Ideas of Fitting a Line to Data (The Main Ideas of Least Squares and Linear Regression.)

  • Published Feb 3, 2025

COMMENTS • 696

  • @statquest
    @statquest  2 years ago +21

    Support StatQuest by buying my books The StatQuest Illustrated Guide to Machine Learning, The StatQuest Illustrated Guide to Neural Networks and AI, or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @weixiangzhao561
      @weixiangzhao561 2 years ago +3

      Everyone should buy this book if you want to learn machine learning. It is the greatest 20 bucks that I have ever spent in my entire life.

    • @statquest
      @statquest  2 years ago +1

      @@weixiangzhao561 BAM! Thank you very much! :)

    • @dipenpandit684
      @dipenpandit684 2 years ago

      This video isn't available in Nepal. Can you please make it available sir? I really love your content. 🙏

    • @statquest
      @statquest  2 years ago +1

      @@dipenpandit684 I'm working on it. It's some strange thing with youtube and I've contacted them.

    • @alicefriedman7607
      @alicefriedman7607 1 year ago +1

      I love this book! I have taken Stats so many times but I still learned so much. Thank you for writing this book!

  • @rmenchoachupicachu
    @rmenchoachupicachu 7 years ago +115

    I have been enrolled in a graduate machine learning course for about a month now and you have just demystified so many details around Linear Regression. Please do more ML videos! They are so clear and helpful. If you can, please do one on Regularization and Decision Forests.

  • @toandef3109
    @toandef3109 1 year ago +7

    You have no idea how much I appreciate the clarity and simplicity of this explanation, you deserve a medal

  • @majeedhussain3276
    @majeedhussain3276 6 years ago +198

    You deserve more subscribers; the quality of your videos is so intuitive that even a high school student understands. Please do upload videos, keep going. I really think ur gonna get more subscribers in future.

    • @adambis337
      @adambis337 1 year ago +3

      You were right

    • @wirotep.1210
      @wirotep.1210 11 months ago +1

      Can’t agree more

    • @PunmasterSTP
      @PunmasterSTP 10 months ago

      Definitely! I'm glad that as of March 2024, he's got over one million subs (1.13 to be exact)!

  • @kenway346
    @kenway346 5 years ago +55

    I actually took machine learning as the elective subject for my final year of engineering and I pretty much guess that this channel's going to teach me everything!

    • @statquest
      @statquest  5 years ago +9

      Hooray!!! :)

    • @User-l3u6m
      @User-l3u6m 2 years ago +5

      @@statquest your vids are so good I watch your videos for entertainment lol

    • @PunmasterSTP
      @PunmasterSTP 10 months ago

      How'd the machine learning class turn out?

  • @rishabh2892
    @rishabh2892 5 years ago +53

    After weeks of research and frustration, I have finally understood the concept of least squares so well! You explained the concepts so simply and logically!! Thank you so much for this amazing video. Much appreciated.

    • @statquest
      @statquest  5 years ago +4

      Thank you very much!!! :)

  • @homeboy6668
    @homeboy6668 2 years ago +25

    I must admit sir, you are one of the best teachers I've ever had. Thank you for being so awesome!

  • @maggiechen1141
    @maggiechen1141 3 years ago +46

    Thank you!! After somehow passing 2 PhD quantitative methods modules and not really understanding why we did any of it, your channel has finally cleared a lot of stuff up!

    • @statquest
      @statquest  3 years ago +4

      Great to hear!

    • @frankchen4229
      @frankchen4229 2 years ago +4

      How the hell did you pass

    • @maggiechen1141
      @maggiechen1141 2 years ago +11

      @@frankchen4229 do what you're told and don't ask why

    • @frankchen4229
      @frankchen4229 2 years ago +2

      @@maggiechen1141 huh. Maybe PhD isn't so bad.

    • @kapricun
      @kapricun 2 years ago

      @@maggiechen1141 best advice so far

  • @benparasa
    @benparasa 1 year ago +1

    There are so many videos on this subject where they say what the least squares method is and how to use it to find the line of best fit, but you are the only one who explained the concept behind this method. Thank you.

  • @blessinglotus
    @blessinglotus 4 years ago +9

    I keep coming back for more! THANK YOU SO MUCH!!! This is more clearly explained than any other tutorial/video/in-class session I've ever listened to! You are the best!

    • @statquest
      @statquest  4 years ago

      Wow! Thank you very much!

  • @Jef-ur7zv
    @Jef-ur7zv 2 years ago +3

    The quality of these videos seems to have improved greatly over the years, but the simplicity was always there. Amazing!

    • @statquest
      @statquest  2 years ago

      Some of the early videos are still the best.

  • @doge6154
    @doge6154 1 year ago +6

    english is not my first language, but i can clearly understand your explanation. Thank you sir!

  • @MrCentrax
    @MrCentrax 2 years ago +5

    I always wanted to know why the formula squares the distance and then takes the root instead of using the absolute value. You're the first one to explain this. Thank you!

    • @statquest
      @statquest  2 years ago +1

      Thanks!

    • @frt_x
      @frt_x 9 months ago

      So can you explain in your own words in short then? Thanks

  • @guillermoalvarezbacame5909
    @guillermoalvarezbacame5909 8 months ago +2

    This is the best UA-cam channel ever, thank you Josh for all your work, you're doing awesome!!

  • @Neoclassicalmaese
    @Neoclassicalmaese 5 years ago +38

    I don't think I have ever clicked on the subscribe button that fast. Absolutely amazing

  • @odins_claw
    @odins_claw 3 years ago +7

    Your presentation style is way ahead of anything else on this platform. You don't have the inefficient habit of deriving everything from first principles but allow a holistic intuition to develop. Absolute magic!

  • @datasciencewithjazz5854
    @datasciencewithjazz5854 3 years ago +7

    Josh your songs and teaching are excellent, you are doing something no one else has done in my life: inspiring me to become a Data scientist as well as a composer

    • @statquest
      @statquest  3 years ago +3

      Wow! That is awesome! BAM! :)

  • @mostinho7
    @mostinho7 4 years ago +5

    Done thanks
    We take the square to make all the errors positive (we want to find the total error of the points from the line)
    We want to find the optimal values for a and b in the equation of a line that minimize the sum of errors squared. We can express the sum of squares as a function of a and b and take the derivative to optimize it
    5:45
    We find the slope that minimizes the error by finding the minima of the multi variable function (variables are a,b)
    7:30
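The steps in the note above can be sketched in a few lines of Python. This is a minimal illustration, not code from the video: the data points and variable names are made up, and it uses the textbook closed-form solution you get by setting both partial derivatives of the sum of squared residuals to zero.

```python
# Made-up data for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Setting d(SSR)/da = 0 and d(SSR)/db = 0 for y = a*x + b
# reduces to these two formulas:
a = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
b = y_bar - a * x_bar

# Sum of squared residuals at the optimum.
ssr = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
print(a, b)  # slope ≈ 0.97, intercept ≈ 0.13
```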

  • @moneyman2200
    @moneyman2200 5 years ago +1

    I've read a few undergrad texts and none of them actually explain the origins of the idea behind least squares. At least not in a simplified visual form; you're usually just shown the problem and slapped with the simple linear model and then the generalized version... This gives some insight into the motivation behind this technique. Thank you for donating your time to such an altruistic cause. You a real one!

  • @reginango9009
    @reginango9009 4 years ago +6

    This video made me stop crying from stress of barely understanding anything in my class. Thank you

    • @statquest
      @statquest  4 years ago

      Glad it helped!

    • @bigvinweasel1050
      @bigvinweasel1050 3 years ago +1

      Right?! This channel is so underrated, we have to change that!

    • @statquest
      @statquest  3 years ago

      @@bigvinweasel1050 Wow! Thank you!

    • @bigvinweasel1050
      @bigvinweasel1050 3 years ago

      @@statquest I'm serious! If you ever need anything done on Python for content - I would be more than happy to write it out as clearly and as elegantly as I can so you can use it for content.

  • @munny0607
    @munny0607 3 years ago +4

    This is absolutely fantastic! I am so glad that I found this channel on UA-cam while doing my data science self-study. I now understand the concept of OLS, which had stressed me out for a week before I found this video. Big thanks!

    • @statquest
      @statquest  3 years ago +1

      I'm glad it was helpful! :)

  • @vincent3542
    @vincent3542 3 years ago +2

    bro I appreciate every single video that you've made, I just want to say thanks a ton by not skipping the ads in your videos, love from Indonesia

  • @_aft
    @_aft 5 years ago +57

    It literally only took me 15 seconds to subscribe because of his unique 15 second intro

  • @uchuhikari9292
    @uchuhikari9292 4 years ago +1

    I have watched and read so many articles but this video explains the use of sum of squared errors and why its important. Thank you!!!

  • @ramkotha4726
    @ramkotha4726 4 years ago +4

    Never been more excited than after watching this video. Truly intuitive and amazing intro to linear regression.

  • @daiannehofig3311
    @daiannehofig3311 5 years ago +6

    Your way to explain these abstract concepts is simply AMAZING!!!! Thank you so much for these incredible videos!

    • @statquest
      @statquest  5 years ago

      Thank you very much! :)

  • @franciscoicarocs
    @franciscoicarocs 10 months ago +1

    I just wanted to say that you were born to teach.
    The book? Are you kidding me? Perfection.
    I would advise to give the first chapter as free content so everybody can have a taste of your abilities.
    Favorite quote so far: "The Binomial Distribution makes me want to run away and hide. :) "

    • @statquest
      @statquest  10 months ago

      Thank you very much! That's a good idea. I wonder how I can do that.

    • @franciscoicarocs
      @franciscoicarocs 10 months ago +1

      Favorite new quote just dropped: "the Normal distribution is awesome, and, to be honest, it sort of looks like you..." lol

    • @statquest
      @statquest  10 months ago

      @@franciscoicarocs Haha! I'm glad you're enjoying it. I just started (in earnest) to write my next book on neural networks (from simple to state of the art)

  • @ludelaire
    @ludelaire 1 year ago +1

    Josh, seriously, thanks for all these videos. My ML journey is smoother thanks to them.

  • @avishekdey1943
    @avishekdey1943 4 years ago +13

    I never understood why we needed to plot a line. Now I do. It is amazing

  • @akeithkira
    @akeithkira 4 years ago +2

    OMG!! You really are the best at explaining this concept, while most books just want to show how good their English is.

  • @StackhouseBK
    @StackhouseBK 1 month ago +1

    This is a nice video, but looking at the more recent videos and these old ones, it's very clear how much you have improved what was already awesome, congrats

  • @pravinsaraswatula5263
    @pravinsaraswatula5263 5 years ago +7

    StatQuest Team - Thank you so much for all your efforts. For the last few months, I felt like that mouse stuck in a wheel going round and round with concepts as I got deeper into ML. A definite recommendation to everyone and anyone irrespective of their ML proficiency.

    • @statquest
      @statquest  5 years ago

      Awesome! Good luck with your ML studies! :)

  • @kawaiihellen2285
    @kawaiihellen2285 2 years ago +1

    Your videos are helping me write my thesis because I don't have a stats background and went into a science heavy masters. this is just the best!!!!

  • @asghaznavi
    @asghaznavi 2 years ago +1

    Hello Josh, Thank you for sharing. Every time I sign in, I learn something new from you. Fantastic presentation . Stay blessed.

  • @avadhsavsani1148
    @avadhsavsani1148 3 years ago +2

    What a legend you are! No words to express my gratitude. You are a blessing to everyone wanting to learn these concepts! Wish you good health and loads of happiness. :)

  • @gothams1195
    @gothams1195 3 years ago +1

    I'm new to data science, you just nailed it ....amazing explanation...after so many videos...finally understood what the heck Linear regression is ...Thank you so much...

  • @danielkhromov9724
    @danielkhromov9724 4 years ago +2

    Oh my God this is just the best video I've ever seen about Linear Regression! Thank you very much! I subscribed just after the video, please do not stop!

  • @divinity1170
    @divinity1170 5 years ago +11

    This is the real teaching. Respect

  • @halilibrahimcetin9448
    @halilibrahimcetin9448 2 years ago +1

    Awesome. I am screaming with happiness. Thanks Statquest. The intuition you conveyed to us is priceless.

  • @zukofire6424
    @zukofire6424 3 years ago +1

    Thanks very much for this. I watched it last year when I was looking to change careers. Rewatching now that I'm enrolled in some real training. And wow!

    • @statquest
      @statquest  3 years ago +1

      BAM! Good luck with your course.

  • @imranullah7355
    @imranullah7355 3 years ago +1

    What a teaching style... 200% what I was searching for...
    Excellent work

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @gingerl6081
    @gingerl6081 4 years ago +2

    I think that the Blue Lake in New Zealand has some competition for ~most clarity~. This channel is amazing!!

  • @supriyamanna715
    @supriyamanna715 3 years ago +1

    taken Andrew's course and bought books, but this intro and vibe is the real intro to ML--I learned that the hard way. Thanks for your songs.

    • @statquest
      @statquest  3 years ago

      :)

    • @supriyamanna715
      @supriyamanna715 3 years ago +1

      @@statquest hey man, I'd like to request you to kindly make a video on how to become an ML engineer from scratch! I am a self taught aspirant. please make the roadmap for people like us

    • @statquest
      @statquest  3 years ago

      @@supriyamanna715 To be honest, you could just start at the top of this and work your way down, through the webinars: statquest.org/video-index/

  • @aldikroos6290
    @aldikroos6290 5 years ago +2

    The best teaching video I have ever seen. What a great work!

  • @kushbhomawat8722
    @kushbhomawat8722 2 years ago +1

    AMAZING VIDEO. MAKES STATS AND MATHS MORE LOGICAL AND REAL

  • @MrFredazo
    @MrFredazo 4 years ago +2

    I've seen many of your videos, they are amazing good stuff!
    I just wanted to point out something in this one: you calculate "b" for the first horizontal line, and then you start rotating it to find the best slope.
    But you never explain WHERE IS THE ROTATION POINT! This is crucial!

    • @MrFredazo
      @MrFredazo 4 years ago

      I mean, to give the intuition that you can assume any intercept and then rotate using it as the rotation point, and it won't change the result.

    • @statquest
      @statquest  4 years ago

      Noted.

  • @alhello_game_of_everything
    @alhello_game_of_everything 10 months ago +2

    Wowie kabawie wowie ZOWIE! I'm still in high school! And I understand most fundamentals in the video!😊

  • @ian.ambrose
    @ian.ambrose 3 years ago +1

    You have the best teaching skill in the universe.

  • @Nakameguro97
    @Nakameguro97 3 years ago

    The slope explanation gives a good intuitive sense of how to find the best-fit slope and y-intercept for least squares, especially if you have a background in calculus. In contrast, the linear algebra solution of OLS is just that much more shocking/amazing; that the same result can be calculated algebraically for n-space >without< any geometric intuition, without any search in the solution space for slope and y-intercept. Your visual explanation is more intuitive and memorable. The linear algebra approach feels more magical, as I find it harder to remember the derivation.

  • @lampkanocna6
    @lampkanocna6 2 years ago

    I just "discovered" by myself that lines generated by linear regression always pass through the point (M(X), M(Y)) (which, when you think about it, is quite intuitive) and thus we can add the constraint that
    b = M(Y) - a*M(X), which allows us to solve the equation for just a single variable (slope) instead of two (slope and intercept). That's for sure basic, but I'm nevertheless proud of myself :D Great channel BTW.
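The observation in the comment above is easy to verify numerically. A minimal sketch, with made-up data, using the standard closed-form least-squares fit:

```python
# Made-up data for illustration.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 3.5, 3.0, 5.5]

x_bar = sum(xs) / len(xs)
y_bar = sum(ys) / len(ys)

# Standard closed-form least-squares slope...
a = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
# ...and the intercept, which is exactly the commenter's constraint
# b = M(Y) - a*M(X).
b = y_bar - a * x_bar

# The fitted line evaluated at mean(x) returns mean(y),
# i.e. the line passes through (M(X), M(Y)).
print(a * x_bar + b, y_bar)
```

With the intercept pinned down this way, the slope really is the only free variable left to optimize.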

  • @kensarmoreto1863
    @kensarmoreto1863 3 years ago +1

    Thank you for the vid. It was so easy to understand the concept of Least Squares using visualization. I will use this as my reference for my demo teaching in stat. Hoping for more stat videos and data analysis tricks & tips...

  • @abhi9029
    @abhi9029 3 years ago +1

    superb explanation, crystal clear; graphics are a good way to comprehend. Thank you for this.

  • @Luxcium
    @Luxcium 1 year ago

    BAM 😅😅😅🎉 I reached the bottom of the _STACK_ Quest (finally) 🎉❤ Wow 😮 I am on this quest to find out where this will stop since I just learned that I must watch the video *Fitting a Line to Data* also known as *Linear Regression* before I can watch *Gradient Descent Step-by-Step!!!* so that I can watch the video related to *Neural Networks part 2* that I must watch before I can watch *The StatQuest Introduction To PyTorch...* before I can watch the *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud I understand)
    I am genuinely so happy to learn about that stuff with you Josh❤ I will go watch the other videos first and then I will back propagate to that first video... And now I feel like it took me 6 years to find out about this wonderful video, I can't wait to see it😊

  • @michaelvogt7787
    @michaelvogt7787 7 months ago

    Josh, I've seen this explanation for nearly 40 years, always re-watching for someone to explain the approach as well as the instructor who introduced it to me in 1980... yours is the best so far.
    But you got away with some simplification by starting with a 0-slope line, and the calculation of the line's value was sort of 'lost' when you leapt to non-zero slope lines...
    Jussayin.
    Cheers
    DocV

    • @statquest
      @statquest  7 months ago

      Glad it was helpful!

  • @ian_owuor
    @ian_owuor 2 years ago +1

    Yes this is exactly what I've been looking for. Great video. It has made my life a lot easier.

  • @TheSheekeyScienceShow
    @TheSheekeyScienceShow 3 years ago +2

    I actually LOVE your video style!

  • @nayaksrigovind
    @nayaksrigovind 4 years ago +4

    This is an amazing video on the intuition behind Fitting lines to data. Loved this video, it gave me a recap on some of the concepts I've learnt years ago and have forgotten.

  • @aakshay8440
    @aakshay8440 5 years ago +3

    hello, you have given a wonderful explanation. I loved it.
    And I am requesting you to do a video on the assumptions made in linear regression.
    This will help us a lot.

  • @iktiar
    @iktiar 4 years ago +3

    your method of teaching is awesome, thank you!

  • @TheEternalDao
    @TheEternalDao 2 years ago +1

    maths is so crazy, the explanation was amazing - the people who figure this stuff out are geniuses

  • @MONAD_0
    @MONAD_0 3 years ago +1

    Hey josh, I absolutely love your explanations! It's given me a completely different perspective on how i think about machine learning and made the topics intuitive. I wonder if you can compile some notes for all of the content that can be reviewed in under a day and a mind-map that can be used to put all the pieces together. that would be really AWESOME!

    • @statquest
      @statquest  3 years ago

      Something like this? app.learney.me/maps/StatQuest

  • @maheshgopalakrishnan5072
    @maheshgopalakrishnan5072 4 years ago +1

    Sir, your explanations are crystal clear. Thank you

  • @dariakrupnova6245
    @dariakrupnova6245 3 years ago +2

    Please do more of these, I think I will be able to pass my econometrics test thanks to you.

  • @aakashgarg66
    @aakashgarg66 3 years ago +1

    This is the best tutorial channel ever!!

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @lucasbatista8169
    @lucasbatista8169 5 years ago +1

    Great video! The effort you put into teaching is admirable...
    Just a suggestion: it would be great if you could put links to the complementary videos you describe in each video. That way, it is easier to keep on track.

    • @statquest
      @statquest  5 years ago +2

      Thanks for the tip. I generally try to do that (add the links in the description), but sometimes I forget. If you have time, it would be great if you could post which videos need links to complementary videos in a comment and then I'll take care of the rest.

  • @jasafraga
    @jasafraga 4 years ago

    I think another reason to take the squared error is that it creates an actual geometric square whose side is the error (the area of the square, A = L*W). Add them all up together to get a 2D representation of the error, whereas adding all the absolute error lines is only 1D. Sometimes shapes are easier to visualize. Amazing videos and pedagogy style. Props.

  • @Hevletica
    @Hevletica 2 years ago +2

    Sorry for commenting several times throughout the series, but I would like to point out you should probably move this video + "Linear Regression, Clearly Explained!!!" before ROC/AUC in your ML playlist since you suggest understanding these basics beforehand.

  • @angelmotta
    @angelmotta 2 years ago +1

    Thanks for this content!!! I am very happy to understand these concepts watching this awesome explanation!

  • @may4081
    @may4081 4 years ago +2

    Simply explained...just makes it beautiful to watch! Thanks!! :-)

  • @johnwilson3918
    @johnwilson3918 4 years ago

    This is a brilliant video. You're a few minutes from explaining how to derive the LR formula. I seem to recall from my high school days - that you work out two partial derivatives (dSum/da - treating 'b' as a constant and then dSum/db treating 'a' as a constant) and equate these to 0 (to find minimum). You should be left with a couple of equations for intercept and slope.

    • @statquest
      @statquest  4 years ago

      That's exactly right (and I show how at least part of this works in my video on the Chain Rule ua-cam.com/video/wl1myxrtQHQ/v-deo.html ). However, what I don't like about the LR formula is that it only works in this specific situation. In contrast, Gradient Descent gives us pretty good parameter estimates and works in a million other situations. For more details on Gradient Descent, see: ua-cam.com/video/sDv4f4s2SB8/v-deo.html
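As a rough sketch of the Gradient Descent idea mentioned in that reply (made-up data; the learning rate and step count are arbitrary choices, not values from any video):

```python
# Made-up data for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]

a, b = 0.0, 0.0   # start with a flat line through the origin
lr = 0.01         # learning rate (arbitrary but small enough to converge)

for _ in range(20000):
    # Partial derivatives of SSR = sum((y - (a*x + b))**2)
    # with respect to the slope a and the intercept b.
    grad_a = sum(-2 * x * (y - (a * x + b)) for x, y in zip(xs, ys))
    grad_b = sum(-2 * (y - (a * x + b)) for x, y in zip(xs, ys))
    a -= lr * grad_a
    b -= lr * grad_b

print(a, b)  # approaches the analytical solution (≈ 0.97, ≈ 0.13)
```

For this tiny problem the closed-form answer is cheaper, but the same loop shape works when no closed form exists.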

  • @techproductowner
    @techproductowner 4 years ago +1

    Bro you are a genius. It just sinks into the mind, the way you are explaining. Kindly guide me to some YouTube channels which explain other math concepts like calculus etc. in a similar way, or I request you to create them as well from the perspective of DS

    • @statquest
      @statquest  4 years ago

      Thanks! Lots of other people like 3Blue1Brown for math.

  • @user-cb7it9cv9l
    @user-cb7it9cv9l 3 years ago +1

    Best and simplest explanation ever !!

  • @woodynani4394
    @woodynani4394 4 years ago +1

    Concise and precise, well done Sir.

  • @amribrahim7850
    @amribrahim7850 3 years ago +1

    You are an incredible teacher.

  • @mukossa4650
    @mukossa4650 5 years ago +4

    Thank you, this is great. Easy to understand

  • @RoRight
    @RoRight 7 years ago

    Just the right amount of theory and math. You should consider teaching stat for students in health and biological studies.

  • @mustafacakir__
    @mustafacakir__ 3 years ago +1

    Your method of explanation is great. Please keep uploading tutorials. I would like to see tutorials about Deep Learning and Boosting (XGBoost, Catboost, etc.) algorithms which are popular lately. Thanks.

    • @statquest
      @statquest  3 years ago +1

      I already have videos on deep learning here: ua-cam.com/video/CqOfi41LfDw/v-deo.html and XGBoost here: ua-cam.com/video/OtD8wVaFm6E/v-deo.html All of my videos are organized here: app.learney.me/maps/StatQuest

  • @nikhilsetty3451
    @nikhilsetty3451 6 years ago +1

    Great video.I love the way how intuitive your videos are! 👍🏻

  • @muksmart1
    @muksmart1 4 years ago +1

    awesome explanation, best ever explanation, made it look so easy.....

  • @tseckwr3783
    @tseckwr3783 1 year ago +1

    Thanks for the video. It was a great refresher.

  • @АлексейШаков-ь4и
    @АлексейШаков-ь4и 3 years ago +1

    Hi, Josh. You are an amazing person. Your videos are very helpful to me. Your talent for explaining complicated things simply is magnificent! I hope you will go on and help a lot of people like me.
    But I am a bit confused about some moments; I hope you can help me through this.
    It's about the graph where we plot the sum of squared residuals against different rotations.
    If the derivative = 0, it means that our function is horizontal, isn't it? In my head we just have the horizontal line as the optimal line, but it cannot be so. Please clear it up.
    Thank you very much!

    • @statquest
      @statquest  3 years ago

      The point of rotating the line and showing different sums of the squared residuals was simply to help people understand the concept of the goal of finding the optimal line. However, in practice, we just take the derivative of the function and set it to 0 (just like you said). BAM! :)

    • @АлексейШаков-ь4и
      @АлексейШаков-ь4и 3 years ago +1

      @@statquest thank you very much! Such a quick response!
      I am a bit in the middle between “simple explanation” and “complex explanation” so it confused me a bit). You are a great human being! Good luck)

    • @statquest
      @statquest  2 years ago +1

      @@АлексейШаков-ь4и Thank you! Now, even though we can solve for optimal line by setting the derivative = 0 and solving for the slope and intercept, a more general solution, that works in a lot of different situations, is called Gradient Descent. Gradient Descent is the backbone of Machine Learning and is used in this situation, as well as for Deep Learning and all that fancy stuff. For details on how Gradient Descent works, see: ua-cam.com/video/sDv4f4s2SB8/v-deo.html

  • @ankuragarwala7558
    @ankuragarwala7558 3 years ago +1

    Great video on least square method.

  • @dearyawen
    @dearyawen 3 years ago

    Can never find a clearer explanation than your videos!! I have a question, hope this is not stupid. At 8:44, is the derivative of the function the derivative with respect to (a,b), or the derivative with respect to a plus the derivative with respect to b? I don't understand how to functionize them. Thank you so much for your work!

    • @statquest
      @statquest  3 years ago

      We take the derivatives with respect to each variable and set them to zero. However, a more flexible method is Gradient Descent, which can be used in a lot more situations. I show how to do it here: ua-cam.com/video/sDv4f4s2SB8/v-deo.html

    • @dearyawen
      @dearyawen 3 years ago

      @@statquest omg thank you!!!!!

  • @abhinav9561
    @abhinav9561 4 years ago

    Hi Josh. Love your videos!! They give the best intuitive explanation. Can't thank you enough :).
    Please make a video on curve fitting for linear equations using normal equations vs using gradient descent. Thank You!

    • @statquest
      @statquest  4 years ago

      My video on Gradient Descent compares that method to the analytical solution: ua-cam.com/video/sDv4f4s2SB8/v-deo.html

  • @golamchowdhury4165
    @golamchowdhury4165 2 years ago

    Great Video! I believe a correction is needed at 8:09; it should be "Taking the derivatives of SSR with respect to both slope and intercept ... "

    • @statquest
      @statquest  2 years ago

      Sure, I should have been a little more careful with my words there.
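The wording in that exchange can be checked numerically: at the least-squares solution, the partial derivatives of SSR with respect to BOTH the slope and the intercept are zero. A sketch with made-up data, approximating the derivatives with central differences:

```python
# Made-up data for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]

def ssr(a, b):
    """Sum of squared residuals for the line y = a*x + b."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

# Closed-form least-squares fit.
x_bar = sum(xs) / len(xs)
y_bar = sum(ys) / len(ys)
a = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
b = y_bar - a * x_bar

# Central-difference approximations of the two partial derivatives.
h = 1e-6
d_ssr_da = (ssr(a + h, b) - ssr(a - h, b)) / (2 * h)
d_ssr_db = (ssr(a, b + h) - ssr(a, b - h)) / (2 * h)
print(d_ssr_da, d_ssr_db)  # both approximately 0
```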

  • @Vrocks_
    @Vrocks_ 3 years ago

    I have calculated taking small values of difference from the line like 1, 2, 3, 4 (the 4 was supposed to be negative). Using the modulus the answer comes to 10... Now when I square them the answer comes to 30, so how is the square better than the modulus? Where is the complexity, since the modulus does its job of making the number positive?

    • @statquest
      @statquest  3 years ago

      The square is used because it has a derivative defined at all points. This makes the math relatively easy when finding the best parameter values that minimize the residuals.
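Beyond differentiability, the choice between squaring and the absolute value is not just cosmetic. A small sketch (made-up data, minimizers found by brute-force grid search): for a single constant prediction, minimizing the sum of SQUARED errors picks the mean, while minimizing the sum of ABSOLUTE errors picks the median.

```python
# Made-up data with one large value, so the mean and median differ.
data = [1.0, 2.0, 3.0, 4.0, 10.0]

# Candidate constant predictions 0.00, 0.01, ..., 11.00.
grid = [i / 100 for i in range(0, 1101)]

best_sq = min(grid, key=lambda c: sum((x - c) ** 2 for x in data))
best_abs = min(grid, key=lambda c: sum(abs(x - c) for x in data))

print(best_sq, best_abs)  # 4.0 (the mean) and 3.0 (the median)
```

So the two loss functions can genuinely disagree about what the "best" fit is, which is a second reason the distinction matters.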

  • @MichaelKlimenko
    @MichaelKlimenko 10 months ago +1

    Really good explanation, thank you !

  • @christinagiannaros9817
    @christinagiannaros9817 2 years ago +1

    When a clip about stats starts off with singing that makes you laugh before you start, it's a very good thing.

  • @jeanlanz2344
    @jeanlanz2344 1 year ago +1

    Great explanation. Thank you. God bless you!

  • @ajayravishankar7797
    @ajayravishankar7797 4 years ago

    Pretty spot on. Happy I found this channel.
    I have a question. The distance between the line and a point, shouldn't it be the perpendicular distance between the point and the line? I believe that is an even more accurate model.

    • @statquest
      @statquest  4 years ago +1

      For regression we use the vertical distance between the data and the line in order to preserve the relationship that the variable on the x-axis is supposed to be predicting the variable on the y-axis. By measuring the vertical distance (instead of the perpendicular distance), we can measure how good or bad that prediction is.
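The distinction in that reply can be written down directly. For a line y = a*x + b and a point (x0, y0), the perpendicular distance is just the vertical residual scaled by 1/sqrt(1 + a²); the numbers below are made up:

```python
# A made-up line and data point.
a, b = 2.0, 1.0      # line: y = 2x + 1
x0, y0 = 3.0, 10.0   # an observed data point

# Vertical residual: how far the observed y is from the PREDICTED y.
vertical = abs(y0 - (a * x0 + b))                # |10 - 7| = 3.0

# Perpendicular (orthogonal) distance from the point to the line.
perpendicular = vertical / (1 + a ** 2) ** 0.5   # 3 / sqrt(5)

print(vertical, perpendicular)
```

Minimizing perpendicular distances is its own method (orthogonal, or "total", least squares); ordinary least squares uses the vertical residual precisely because it measures prediction error in y.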

  • @ling6701
    @ling6701 2 years ago +1

    Clearly explained indeed. Thank you.

  • @ayselceferzade8587
    @ayselceferzade8587 2 years ago +1

    clearly explained! lots of thanks

    • @statquest
      @statquest  2 years ago +1

      Glad it was helpful!

  • @Mohammadreza-Dolati
    @Mohammadreza-Dolati 1 year ago +1

    thanks a lot for your useful content , ❤ from Iran

    • @statquest
      @statquest  1 year ago

      Hello Iran!!! Thank you! :)

  • @hardouthereforaph.d.2525
    @hardouthereforaph.d.2525 3 years ago +1

    God bless you and these videos. These are so helpful

  • @sweetmaths4213
    @sweetmaths4213 3 years ago +1

    This is so awesome. I finally get it! Have you written a book?

    • @statquest
      @statquest  3 years ago

      Yes, I have written a book and it should come out this spring. Subscribe to stay in the loop! :)

  • @parisaayazi8886
    @parisaayazi8886 3 years ago +1

    Thank you so much for this incredible video!

  • @palakagrawal8856
    @palakagrawal8856 3 years ago +1

    The best explanation to date!

  • @asharasif3849
    @asharasif3849 11 months ago +1

    Really useful video, thank you! Quick question: when you mention estimating the best line of fit, can you assume the trough of the y=a*x+b vs sum of squared residuals plot to be the best line of fit, or is it essential to calculate the derivative for that function in order to find the best fit?

    • @statquest
      @statquest  11 months ago

      I'm not sure I understand your question, however, in practice, we solve for the derivatives and set them equal to 0 and solve for the optimal values. That said, we can also use gradient descent ua-cam.com/video/sDv4f4s2SB8/v-deo.html to solve this, and I like that method better because it's a general solution that works in a lot of situations where analytical methods fail.
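For what it's worth, reading the trough off the SSR-vs-slope plot and using the derivative both give the same answer on a toy example. A sketch (made-up data; the intercept is constrained so the line passes through the means, mirroring the rotation picture in the video):

```python
# Made-up data for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]
x_bar = sum(xs) / len(xs)
y_bar = sum(ys) / len(ys)

def ssr(a):
    # Keep the line through (mean x, mean y) while rotating the slope.
    b = y_bar - a * x_bar
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

# Read the trough off a grid of candidate slopes...
grid = [i / 1000 for i in range(0, 2001)]   # slopes 0.000 .. 2.000
trough = min(grid, key=ssr)

# ...and compare with the slope calculus gives (derivative set to 0).
a_exact = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
          sum((x - x_bar) ** 2 for x in xs)
print(trough, a_exact)  # both ≈ 0.97
```

The grid read-off is only as precise as the grid, which is why, in practice, the derivative (or gradient descent) is used instead of eyeballing the trough.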

  • @MiguelAngelMartinezFernandez
    @MiguelAngelMartinezFernandez 5 years ago +3

    Hi! First of all, congratulations for this awesome youtube channel.
    I was wondering if it would be possible to re-upload this video in a higher quality? If I am not wrong, the max quality is 480p, which is not the best these days.
    Thanks!
    Miguel

    • @statquest
      @statquest  5 years ago +2

      I'm glad you like the video! That's a good idea to upload a higher quality version.

  • @yourfriend988
    @yourfriend988 4 years ago +2

    Now that's a good explanation.