Regularization Part 1: Ridge (L2) Regression

  • Published 22 Nov 2024

COMMENTS • 1.5K

  • @statquest
    @statquest  5 years ago +147

    Correction:
    13:39 I meant to put "Negative Log-Likelihood" instead of "Likelihood".
    A lot of people ask about 15:34 and how we are supposed to do Cross Validation with only one data point. At this point I was just trying to keep the example simple and if, in practice, you don't have enough data for cross validation then you can't fit a line with ridge regression. However, much more common is that you might have 500 variables but only 400 observations - in this case you have enough data for cross validation and can fit a line with Ridge Regression, but since there are more variables than observations, you can't do ordinary least squares.
    ALSO, a lot of people ask why lambda can't be negative. Remember, the goal of lambda is not to give us the optimal fit, but to prevent overfitting. If a positive value for lambda does not improve the situation, then the optimal value for lambda (discovered via cross validation) will be 0, and the line will fit no worse than the Ordinary Least Squares Line.
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
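
    A minimal sketch of the p > n point in the correction above, assuming Python with scikit-learn (which calls lambda "alpha"): with 500 variables and only 400 observations, ordinary least squares has no unique solution, but ridge regression with a cross-validated lambda fits without trouble.

      import numpy as np
      from sklearn.linear_model import RidgeCV

      rng = np.random.default_rng(0)
      n_obs, n_vars = 400, 500                # more variables than observations
      X = rng.normal(size=(n_obs, n_vars))
      true_coefs = np.zeros(n_vars)
      true_coefs[:10] = 3.0                   # only a few variables actually matter
      y = X @ true_coefs + rng.normal(scale=2.0, size=n_obs)

      # Cross-validate over a grid of penalties and keep the best one.
      model = RidgeCV(alphas=np.logspace(-3, 3, 13)).fit(X, y)
      print("lambda chosen by cross validation:", model.alpha_)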

    • @statquest
      @statquest  4 years ago +2

      @VINAY MALLU To repeat what I wrote in the comment you replied to: A lot of people ask why lambda can't be negative. Remember, the goal of lambda is not to give us the optimal fit, but to prevent overfitting. If a positive value for lambda does not improve the situation, then the optimal value for lambda (discovered via cross validation) will be 0, and the line will fit no worse than the Ordinary Least Squares Line.

    • @statquest
      @statquest  4 years ago +5

      @VINAY MALLU The larger the dataset, the less likely you are to overfit the data. So in some sense, regularization becomes less important. However, Lasso (L1) regularization is still helpful for removing extra variables regardless of the size of the dataset. And even with very large datasets, ML algorithms that depend on weak learners benefit from regularization.

    • @sfz82
      @sfz82 3 years ago

      @@statquest Coming back to Vinay's question: In the counterexample he gives, a negative lambda would not achieve a better fit to the training data, but would prevent overfitting (in that case, overfitting to a more shallow slope). I really liked the video and found most of it very intuitive, but the fact that ridge regression favours a more shallow slope is not. With a large set of predictors, it's easy to see that enforcing sparsity may provide better out-of-sample predictions in practice. But with a single predictor the prior assumption that 'the observed data tend to overestimate the influence of the predictor' seems no more justified than its opposite would be. In other words: under OLS assumptions the distribution of OLS fitted slopes will be symmetrically centered on the 'true' slope. But the example was really helpful to understand that ridge regression doesn't work that way and instead biases the fit towards the intercept.

    • @cosworthpower5147
      @cosworthpower5147 3 years ago +1

      Is there an intuitive explanation for why the intercept beta 0 is not included in the regularization process?

    • @statquest
      @statquest  3 years ago +1

      @@cosworthpower5147 The goal is to reduce sensitivity to the parameters. The y-axis intercept does not depend on any of the parameters, so there's no reason to shrink it. Instead, as the other parameters go to 0, the intercept goes to the mean y-axis value.
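
      A quick way to see the last claim above (a sketch of the math, not from the video): once the slope is shrunk all the way to 0, the intercept is the only parameter left, and minimizing the sum of squared residuals over it yields the mean y-axis value:

        \frac{d}{d\beta_0}\sum_i (y_i - \beta_0)^2 = -2\sum_i (y_i - \beta_0) = 0
        \quad\Longrightarrow\quad
        \hat{\beta}_0 = \frac{1}{n}\sum_i y_i = \bar{y}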

  • @scubashar
    @scubashar 3 years ago +525

    I am a machine learning engineer at a large, global tech company with a decade of experience in industry and a computer science graduate student. Your channel has helped me immensely in learning new concepts for work and job interviews, and your videos are so enjoyable to watch. They make learning feel effortless! Thank you so much!!

    • @statquest
      @statquest  3 years ago +21

      Wow! Thank you very much! :)

    • @VainCape
      @VainCape 3 years ago +12

      can you give me a job plz?

    • @LucasPossatti
      @LucasPossatti 2 years ago +11

      @Son Of Rabat , some people (like me) might have skipped the "simple stuff" to jump right into the complex stuff because it gives better results. For example, I was introduced to ML by working with image classification and object detection right away, where deep learning is king. I studied backpropagation, gradient descent, etc, but never heard of Ridge Regression, for example, until recently. Now I'm trying to collect the pieces I left behind.
      (I also always sucked with the theoretical parts. As long as the evaluation metrics were good, it was fine... And it kind of worked for me, for some time. I'm now trying to change that, and deepen my theoretical knowledge.)

    • @LucasPossatti
      @LucasPossatti 2 years ago +5

      Today, I also work for a global tech company (as a Data Scientist). Not for a decade though. 😅

    • @joshsherfey
      @joshsherfey 2 years ago +3

      @@LucasPossatti same for me. I work as a DS at a large tech company, but still learn a lot from SQ

  • @PolitePolice563
    @PolitePolice563 9 months ago +21

    This channel is by far the best at explaining mathematical concepts related to machine learning. I'm in a machine learning class at my university and go to every class lecture. I leave not having understood an hour and fifteen minutes of lecture. I immediately pull up this channel and watch a video on the same concept and "BAM". It makes sense.

  • @ryzary
    @ryzary 4 years ago +240

    After watching dozens of StatQuest videos, I finally know when to say 'BAM!'

  • @ardakosar3826
    @ardakosar3826 3 years ago +91

    Explaining things of this complexity with this level of simplicity is a real skill! Awesome channel!

  • @Gebev
    @Gebev 11 months ago +10

    I have no words to express how good this lecture is.

  • @lucaspenna6009
    @lucaspenna6009 4 years ago +66

    Professors in general teach Ridge Regression with many complicated equations and notations. You made this topic very clear and easy to understand. Thank u very much again.

  • @pritamck13
    @pritamck13 6 months ago +3

    Only StatQuest can make someone emotional while learning statistics. The ease with which the concepts are flowing flawlessly into my brain makes me teary. Thank you so much 🥺❣

  • @elliotyip9844
    @elliotyip9844 1 year ago +2

    The way you go through the logic step by step makes you a good teacher. On many of my research occasions people just say "adjust your alpha higher or lower until you don't overfit / underfit" but I don't even know what I am looking at. Bless you.

  • @Nicole-se7zj
    @Nicole-se7zj 2 years ago +10

    I've spent so much time trying to read and understand what EXACTLY is ridge regression. This video made it much easier to understand. Thank you so much for simplifying this complex concept!

  • @NaggieNag
    @NaggieNag 4 years ago +7

    I don't know how my stats teacher can make something this easy to understand so complicated. Every time I can't understand what he's talking about in class, I know that I have to turn to StatQuest. Thank you for what you're doing.

  • @SpL-mu5zu
    @SpL-mu5zu 5 years ago +19

    YOU ARE THOUSANDS OF TIMES BETTER THAN MY PROF...CLEAR & SIMPLE. THANKSSSSS

  • @the40yearpuzzle
    @the40yearpuzzle 1 year ago +5

    I am brand-new to statistics, and I'm in school to be a data scientist. so many times, I lose the plot watching lectures from my professors who have the Curse of Knowledge. I end up spending hours watching your videos and they help so much, I just don't even have words! I've recommended your channel to all my classmates--and I mentioned it so much, my professor is considering adding your channel to recommended materials for next semester! you are a shining light of joy in a jargon-filled sea of confusion.

    • @statquest
      @statquest  1 year ago +1

      Thank you so much and good luck with your coursework! :)

    • @Dreadheadezz
      @Dreadheadezz 1 year ago +1

      I study data science too at a uni and his videos are helping me stay afloat in my statistical learning course. Not all heroes wear capes and he's truly one of them!

    • @shivanit148
      @shivanit148 1 year ago

      @Linda Wallberg @Josh Sherfey @Lucas Possatti I don't see why we even use lambda, it doesn't seem to change anything 🤔. I'd understand if it were a value between 0-1 but not anything >= 0. Can someone please explain? Multiplying lambda (a scalar) by slope² should only scale it in a parallel direction, right? We basically just take any smaller arbitrary slope (introduce bias) and that's all.

    • @statquest
      @statquest  1 year ago

      @@shivanit148 No, we don't take an arbitrary smaller slope. We find the one slope that minimizes the SSR + penalty.
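
      A tiny numeric check of this reply, assuming Python with NumPy (the data points are made up, and the intercept is dropped to keep it short): for each fixed lambda, scan candidate slopes and keep the one that minimizes SSR + lambda * slope^2. The winning slope changes with lambda, so lambda does more than just rescale the penalty term.

        import numpy as np

        x = np.array([0.5, 1.0, 1.5, 2.0])    # made-up training data
        y = np.array([1.1, 2.1, 2.8, 4.2])
        slopes = np.linspace(0.0, 3.0, 3001)  # candidate slopes

        for lam in [0, 1, 10]:
            cost = [np.sum((y - s * x) ** 2) + lam * s ** 2 for s in slopes]
            print(f"lambda = {lam:>2} -> best slope = {slopes[np.argmin(cost)]:.3f}")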

  • @TheGoldenFluzzleBuff
    @TheGoldenFluzzleBuff 5 years ago +32

    I have a big data economics exam tomorrow and you literally just saved my life. I don't always understand what my professor is trying to explain, but you did it super clearly. Actual life saver

  • @malini76
    @malini76 2 years ago +2

    Whenever I feel some concept in ML or DS is not easily understood, I come to this channel, because you explain things in a simple way with good examples.

  • @SomeOfOthers
    @SomeOfOthers 5 years ago +2

    I've taken 4 machine learning courses and always wondered what ridge regression was, because I've heard it several times, but I was never taught it. I never realized it was just adding the regularization parameter! Awesome! Thank you so much.

    • @statquest
      @statquest  5 years ago +1

      Hooray! I'm glad the video helped clear up a long-standing mystery. As you've noticed, a lot of machine learning is about giving old things new names - which makes it a lot easier to understand than we might think at first.

  • @anamfatima5489
    @anamfatima5489 4 years ago +17

    I came to know about this channel 2 hours ago. Simple and Outstanding explanation. My aim is to watch each and every video.
    Loving your style of teaching.
    From India.

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @OttoFazzl
    @OttoFazzl 5 years ago +1

    I came here to learn about ridge regression only to realize it's L2 regularization. Aside from this, StatQuest is simply amazing. I use it to brush up on theory before interviews.

    • @statquest
      @statquest  5 years ago

      It's true - I'm not sure why we call it Ridge Regression and not L2. Or the other way around. And, on top of that, why not pick a name that is easy to remember, like "Squared Regularization".

  • @DragomirJtac
    @DragomirJtac 6 years ago +4

    Incredibly clear explanation. I'm using your Machine Learning videos to study for my midterm for sure. It's so nice to know that these concepts aren't above my head after all.

    • @statquest
      @statquest  6 years ago +2

      Nice!! Good luck on your midterms!

  • @RaviYadav-nj8zh
    @RaviYadav-nj8zh 4 years ago +2

    Level of simplicity on this channel is just BAM!!!

  • @monazaizan947
    @monazaizan947 1 year ago +3

    You made learning this complicated topic (for me) a lot more fun than reading a textbook or listening to my own lecturer. Very entertaining too... Well done!

  • @daliakamal5621
    @daliakamal5621 3 years ago +2

    Amazing video. I have read many articles and watched many videos trying to understand the idea behind Ridge & Lasso Regression, and you finally explained it in the simplest way. Many thanks for your effort.

  • @republic2033
    @republic2033 4 years ago +6

    You have that ability to explain difficult topics in a very simple way, this is amazing! Thank you so much

  • @yassersayed6109
    @yassersayed6109 6 years ago +1

    People like you make the world a better place … thanks for being you ...

  • @charissapoh1159
    @charissapoh1159 3 years ago +7

    your explanations are insane... they're so easy to understand and literally capture the essence of the topic without being overly complicated! i've bingewatched so many of your videos ever since chancing upon your channel last night - i specially love the little jingles you add in at the start of your videos, they really add such a fun and personal touch~ thank you so so soo much, your channel has really helped me immensely!!!

  • @Tapsthequant
    @Tapsthequant 9 months ago +1

    Clearly explained is an understatement, it is the saturated BAM!!!

    • @statquest
      @statquest  9 months ago

      Thank you very much! :)

  • @andersonarroyo7238
    @andersonarroyo7238 4 years ago +9

    This is my first video and I am so impressed by how you explain things!!! It is like a buddy from college explaining it to me in plain words. You rock StatQuest, I am a follower from now on!! Thank you

  • @lampuiking4140
    @lampuiking4140 2 years ago +1

    I don’t like to make comments often, but dude, what a waste of talent with this level of gift for statistics. You should have been making millions of dollars working in an ibank or whatever. Thank you very much for your video. For a guy like me who just wants to enter the data science field, you help us achieve more than you expect.

    • @statquest
      @statquest  2 years ago

      Thank you so much 😀!

  • @petax004
    @petax004 5 years ago +27

    You just spoon-fed my brain with your clear explanation, thanks man!

  • @youknowwhatlol6628
    @youknowwhatlol6628 10 months ago +1

    Greetings from Ukraine, Josh! I'd like to say thanks to you: even though we are in a difficult situation here, your videos on machine learning techniques always help me comprehend topics in this field.... I am grateful to you! Thank you so much!!!

    • @statquest
      @statquest  10 months ago +1

      Wow! I can't imagine trying to learn ML in your situation, but I'm happy that I can help in some way.

  • @JT2751257
    @JT2751257 4 years ago +21

    Josh, I have been practicing data science for the last 4 years and have used Ridge regression as well. But now I am feeling embarrassed after watching this explanation because before the video I only had half-baked knowledge. You deserve a lot of accolades my friend :)

    • @statquest
      @statquest  4 years ago +1

      Awesome! I'm glad the videos are helpful. :)

  • @snehabag4820
    @snehabag4820 1 year ago +1

    I looked at 3-4 videos before this, but this one was the best in terms of explanation and the most easily understood. Thanks!

  • @meichendong3434
    @meichendong3434 5 years ago +3

    I love your videos. They make complicated concepts and procedures so easy to follow and understand! Thanks for sharing all of the brilliant ideas!

    • @statquest
      @statquest  5 years ago

      Awesome! Thank you! :)

  • @mohamad5005
    @mohamad5005 1 year ago +1

    You are a real man: when you say it is clearly explained, it is clearly explained.
    Mohamed from Syria

  • @kaimueric9390
    @kaimueric9390 4 years ago +3

    BAM! The concepts are presented in the clearest way ever.

  • @winghho9
    @winghho9 6 years ago +2

    Didn't even realize this StatQuest video is super long until you mentioned it; truly enjoy the way you explain, thanks))))))))

    • @statquest
      @statquest  6 years ago

      Hooray! I'm glad you liked it. :)

  • @hrushikeshkulkarni7353
    @hrushikeshkulkarni7353 1 year ago +3

    The lecture was at a whole different level.....thank you for such amazing content dear Josh

  • @skylarj720
    @skylarj720 2 years ago +2

    Thank you, Josh, you made ML and stats easy and enjoyable. Hands down better than most stats profs.

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @jobandeepsingh1929
    @jobandeepsingh1929 5 years ago +4

    Your channel deserves more recognition. Keep up the good work!

  • @shanmukhasaratponugupati6308
    @shanmukhasaratponugupati6308 3 years ago +1

    If there's a Nobel prize for good stats teachers on YT... give this guy one...

  • @tommcnally3231
    @tommcnally3231 4 years ago +5

    My lecturer explained this by just putting the equation in front of us on the slides. The maths is easy but I didn't understand the point or intuition behind adding a penalty. Now I do. Thank you.

    • @statquest
      @statquest  4 years ago

      I'm glad the video was helpful. :)

  • @tejbirsinghbhatia3090
    @tejbirsinghbhatia3090 4 years ago +2

    Man, love the sarcasm in your voice and the concise / crisp explanation of your concepts! DOUBLE BAMMMM!

  • @tymothylim6550
    @tymothylim6550 3 years ago +6

    Thank you, Josh, for another fun StatQuest! I really enjoyed learning the use and benefits of Ridge Regression!

  • @runxingjiao1979
    @runxingjiao1979 2 years ago +1

    Can't say how much I love you!! God please make sure this channel is always here❤

  • @tusharpatil96
    @tusharpatil96 4 years ago +4

    Probably the most sensible explanation available on youtube..and yes...BAM!! ;)

  • @JustTal631
    @JustTal631 4 years ago +2

    I love you StatQuest. Your videos are better than any other stats resource I have come across, and I am actually understanding things now, which will help me do my job better. Please never stop making these excellent videos...

    • @statquest
      @statquest  4 years ago +1

      Thank you so much! And thank you for your support! I hope to make videos for the rest of my days (which I hope are many!). :)

  • @juhipathak8433
    @juhipathak8433 6 years ago +150

    Your channel is a godsend!

  • @ArinzeDavid
    @ArinzeDavid 2 years ago +2

    I study Financial Technology at Imperial College Business School; I must say your content made the "Big Data in Finance" module so much easier to understand

    • @statquest
      @statquest  2 years ago

      Hooray! I'm glad my videos are helpful! :)

  • @aliciachen9750
    @aliciachen9750 5 years ago +4

    wow. seriously better explained than lectures from my professor in the data science department

  • @MrFalingdown
    @MrFalingdown 2 years ago +1

    Wow, you are my personal lifesaver. Didn't understand the concepts of Ridge Regression from any other source

  • @macilguiddir3680
    @macilguiddir3680 6 years ago +9

    Josh, even though I have just started Machine Learning and Data Science in my French Engineering "Grande Ecole", watching your videos just replaced most of the teachers I had met in my life. Great BAM my friend and thank you, just keep it up! You got a rare gift

    • @statquest
      @statquest  6 years ago +1

      Thank you so much! I'm so happy to hear that my videos are helpful! :)

    • @macilguiddir3680
      @macilguiddir3680 6 years ago +1

      StatQuest with Josh Starmer Even French people rely on you and are looking forward to studying your next videos ;)

    • @statquest
      @statquest  6 years ago

      Hooray!

    • @luisakrawczyk8319
      @luisakrawczyk8319 5 years ago

      lol you must have really bad professors then, which school is it?

  • @yousufali_28
    @yousufali_28 6 years ago +2

    Thanks for this awesome explanation. This is the first time I really understood how ridge regression works.

  • @seetarajpara7626
    @seetarajpara7626 3 years ago +4

    This is incredibly helpful!! I will be watching many of your videos to supplement my stats/data science studies :) Thank you!

  • @Anthestudios
    @Anthestudios 2 years ago +1

    I just keep coming back to you Josh! Thanks for your clear explanation.

  • @spencerprice1676
    @spencerprice1676 6 years ago +4

    Thank you so much. You made this so much easier to understand than my professor. Really appreciate it

    • @statquest
      @statquest  6 years ago

      You're welcome! I'm glad to hear that the video was helpful. :)

  • @abinsharaf8305
    @abinsharaf8305 3 years ago +1

    Dear Josh, when I get a job, I'll buy an entire album. Thanks for all these videos, they are super helpful for me to understand. I was not able to understand the purpose of regularization until I watched this video; I was always confused about why we add a penalty to the error. Got a load off my mind, again thanks a lot!

  • @sam271183
    @sam271183 5 years ago +4

    Just Brilliant!! Josh Starmer - You are a genius!

  • @dafni5674
    @dafni5674 4 years ago +1

    I am an aspiring data scientist. Right now I feel lost with all the math, stats, machine learning and programming... I have been watching a lot of YouTube videos and I came across your channel! I simply love it! I plan to watch all the videos. And let me just say I love the jokes and the silly songs

    • @statquest
      @statquest  4 years ago +1

      Thank you so much and good luck on your journey to learning Data Science! BAM! :)

    • @dafni5674
      @dafni5674 4 years ago +1

      @@statquest Thank you! :D

  • @PedroRibeiro-zs5go
    @PedroRibeiro-zs5go 4 years ago +3

    Thanks Josh! You’re absolutely the best 💪🏻

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @leanneZzz08
    @leanneZzz08 4 years ago +1

    Really appreciate your videos. They are valuable for beginners. Easy to understand and easy to learn. Thanks for your good work. Greetings from a new PhD student.

    • @statquest
      @statquest  4 years ago

      Thanks and good luck with your PhD! :)

  • @monicakulkarni3319
    @monicakulkarni3319 5 years ago +7

    I really appreciate your videos! Keep up the good work.

  • @dennismikolaj2541
    @dennismikolaj2541 3 years ago +1

    Your tutorial is worth much more than my university ML course, which is 5000 dollars a semester. Must donate, keep going.

  • @shashankupadhyay821
    @shashankupadhyay821 4 years ago +3

    This is so cool, it's almost like magic.

  • @viniths7683
    @viniths7683 5 years ago +1

    So far the best video I ever saw on regression... thanks Josh !!

  • @longkhuong8382
    @longkhuong8382 6 years ago +3

    Mega BAM!!!! Thank you
    I can't wait to learn the next lesson

    • @statquest
      @statquest  6 years ago +1

      Hooray!!!! :) The next one, on Lasso Regression, should come out in the next week or so.

    • @longkhuong8382
      @longkhuong8382 6 years ago +1

      Yeah!, It's great. Thank you

  • @scottzeta3067
    @scottzeta3067 2 years ago +1

    4:58 "I usually try to avoid using Greek characters as much as possible" You are too kind and it is very true, lots of students start shaking once they saw Greek letter in an equation!🥶

  • @kslm2687
    @kslm2687 6 years ago +8

    Thank you for this video, it's so helpful! I can't believe it has only 500 views. Please consider a Patreon account so that people could thank you for your work!

    • @statquest
      @statquest  6 years ago +4

      Thank you! I'll look into the patreon account. In the meantime you can support my channel through my bandcamp site - even if you don't like the songs, you can buy an album and that will support me. joshuastarmer.bandcamp.com/

  • @balajiadithya1292
    @balajiadithya1292 2 years ago +1

    Wow! Such a simple yet detailed exposition!

  • @aliozcankures7864
    @aliozcankures7864 2 years ago +3

    absolutely amazing, thank you sir!

  • @trmohr
    @trmohr 5 years ago +2

    StatQuest - you are awesome! You’re my go-to source to learn stats when my textbooks fail me.

  • @kadhirn4792
    @kadhirn4792 4 years ago +15

    Love from India. Wish me good luck, interview in less than days.

    • @statquest
      @statquest  4 years ago +5

      Thank you and good luck with your interviews. Let me know how they go. :)

    • @Whoasked777
      @Whoasked777 3 years ago

      @@statquest narrator: they never did let StatQuest know...

    • @statquest
      @statquest  3 years ago

      @@Whoasked777 Totally! I hope they went well.

  • @thecontroller6786
    @thecontroller6786 4 years ago +2

    You know what? Your video is so.... PERFECT.

  • @vspecky6681
    @vspecky6681 4 years ago +32

    I was listening with extreme focus and you suddenly threw "Airspeed of Swallow" at me. I died XDDDDDDDDDDDD

    • @statquest
      @statquest  4 years ago

      Awesome! :)

    • @oldcowbb
      @oldcowbb 3 years ago +1

      What do you mean, African or European Swallow?

  • @nibinkarayi
    @nibinkarayi 4 years ago +1

    Josh, you are the best, and you know this by now. Please help us with a video on why ridge regression works for datasets with lots of parameters and few data points

    • @statquest
      @statquest  4 years ago +1

      I'll keep it in mind.

  • @programminginterviewprep1808
    @programminginterviewprep1808 5 years ago +21

    These videos are awesome!
    Somehow, listening to the video, I feel it comes from someone with a background in stats, rather than being a typical computer science machine learning video.

    • @statquest
      @statquest  5 years ago +8

      Interesting. My background is both computer science and statistics - but I did biostatistics for years before I did machine learning, so that might explain it.

  • @adisetiawan-du7fl
    @adisetiawan-du7fl 29 days ago +1

    Now I'm fluent in using BAM, even the triple BAM. Thank you, legend!

  • @akashdesarda5787
    @akashdesarda5787 6 years ago +6

    Quadruple bam!!!! For your explanation

    • @statquest
      @statquest  6 years ago

      Hooray! I'm glad you like it! :)

  • @arpitmishra8439
    @arpitmishra8439 5 years ago +1

    Never stop teaching sir... U r the best

  • @iefe65
    @iefe65 5 years ago +6

    Small question: Does ridge regression only decrease sensitivity? What if, instead of this example, our test set was above the red line? Normally we'd need to increase sensitivity?

    • @vishaltyagi2983
      @vishaltyagi2983 10 months ago

      This will be taken care of... if you are taking a random sample ... don't worry

    • @Niglnws
      @Niglnws 3 months ago

      Did you understand why?

    • @Niglnws
      @Niglnws 3 months ago

      @@vishaltyagi2983 can you explain more? I've been trying for an hour to prove it myself and concluded that the random sample has less variance, but that doesn't matter, because it doesn't differ. Then I found your reply.

  • @habeshadigitalnomad137
    @habeshadigitalnomad137 8 months ago +1

    It's insane, I keep coming back to this channel to brush up on material. I am finally graduating this summer, but I know for sure I will keep coming back here just to hear "small Bam!" and "Bamm" lol

    • @statquest
      @statquest  8 months ago

      Congratulations! BAM! :)

  • @fmetaller
    @fmetaller 6 years ago +3

    Great explanation as always. There is something that's not convincing me about this type of regression. Ridge regression assumes that the training data always overestimate the slope. Isn't it possible that the training data underestimate the slope instead?

    • @statquest
      @statquest  6 years ago +2

      If the training data underestimate the slope, then shrinking it will not improve the fit during cross validation. In this case the best value for lambda will be zero. So ridge regression can’t make things worse. Does this make sense?

    • @fmetaller
      @fmetaller 6 years ago

      @@statquest yes it's clear. Thank you for your explanation.

    • @akhilmahajan1417
      @akhilmahajan1417 5 years ago +3

      I also had the same question. Thankfully, I found your comment!
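
      The "ridge can't make things worse" guarantee in this thread can be seen mechanically; a minimal sketch assuming Python with scikit-learn (made-up data): because lambda = 0, i.e. plain least squares, is always on the candidate grid, the cross-validated choice can never score worse in cross validation than least squares itself.

        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(40, 1))
        y = 2.0 * X[:, 0] + rng.normal(size=40)

        # Mean R^2 across 5 folds for each candidate lambda (alpha in scikit-learn).
        scores = {lam: cross_val_score(Ridge(alpha=lam), X, y, cv=5).mean()
                  for lam in [0.0, 0.1, 1.0, 10.0]}
        best = max(scores, key=scores.get)  # grid includes 0, so the pick does at least as well as OLS in CV
        print(scores, "-> chosen lambda:", best)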

  • @Shubhamkumar-ng1pm
    @Shubhamkumar-ng1pm 5 years ago +2

    This is the best content I have ever seen on machine learning. Triple baam!

    • @statquest
      @statquest  5 years ago

      Thank you very much! :)

  • @1pompeya170
    @1pompeya170 4 years ago +6

    you are my sunshine, my only sunshine, you make me happy when f**king math puzzled me!

  • @senzhuang9408
    @senzhuang9408 6 years ago +1

    You are absolutely amazing and the videos are so insanely useful! If these videos were available 5 years ago, I would have skipped all my stat classes! : )

    • @statquest
      @statquest  6 years ago

      Thank you so much! :)

  • @lazypunk794
    @lazypunk794 5 years ago +3

    So from what I understand, ridge regression keeps the slope from getting big, right? This adds bias but reduces variance a lot, so overall it's better.
    But what if my true model has a slope that is actually bigger (steeper) than what I got using my training data? In that case wouldn't you be making the model worse by using regularization?
    In other words, why are we "desensitizing" when we don't know what the underlying model is? What if sensitivity in the actual model is higher?

    • @sidsr
      @sidsr 5 years ago

      I have this exact same doubt! I guess we use trial and error and see whether the model improves; if it doesn't, the only way is to either use a more complex function or get more training data.

    • @lazypunk794
      @lazypunk794 5 years ago

      @@sidsr oh okay.. but still, regularization works pretty much every time, right?

    • @meinizizheng9867
      @meinizizheng9867 5 years ago

      I think once you test all possible values of lambda, the one that gives you the smallest test error will be the best one. So if the true model is steeper (and assuming the test error gives you an approximation of the true error), the lambda will reduce to zero.

    • @-long-
      @-long- 5 years ago

      by trial and error, your model will get the best performance when lambda=0, which means "no regularizer used".

  • @ameliaschricker2527
    @ameliaschricker2527 2 years ago +1

    You are literally a LIFE SAVER!! Thank you sosososo much

  • @nathanx.675
    @nathanx.675 4 years ago +9

    Who's watching this the day before their machine learning finals?

  • @codewithsid2063
    @codewithsid2063 6 years ago +2

    Your videos are so underrated. Please have a Patreon account so that the community can help you keep making these high-quality videos.

    • @statquest
      @statquest  6 years ago +2

      Thank you! I'll look into the patreon account. In the meantime you can support my channel through my bandcamp site - even if you don't like the songs, you can buy an album and that will support me. joshuastarmer.bandcamp.com/

  • @zeerot
    @zeerot 6 years ago +6

    Josh, you're a true hero with your explanations. Thanks a bunch!
    I have one question though. In the video (in the graph at 19:20 for example) you show that a ridge regression would fit real world data better, as it shrinks the beta (the graph shows that in the real world this beta is also smaller, due to most green points (=real world data) being positioned below the red line (=training data)).
    However, would ridge regression still be better if, for example, most of the green dots were above the red line? Because with ridge regression we would shrink the beta, while the real-world beta actually has an even higher slope than the slope of the red line (thus in this case ridge would lead to an increase in both variance and bias for real-world data?)

    • @statquest
      @statquest  6 years ago +3

      This is a great question - the key is that when lambda = 0, then you get the exact same result as least squares - so Ridge Regression cannot do worse than Least Squares, it can only do better. In the case you mention, sure, if all of the green dots are above the red dots, neither Least Squares nor Ridge Regression will do well - but Ridge Regression will do no worse than Least Squares.

    • @CyberSinke
      @CyberSinke 3 years ago +1

      Thank you for posting this question. One thousand comments on this video, all well deserved praise as this video and the whole channel are awesome. Yet only you asked this obvious question. Makes me wonder how many people actually bothered to understand the whole point of Ridge Regression.

    • @Niglnws
      @Niglnws 3 months ago

      @@CyberSinke Exactly what shocked me too. I've been trying for an hour to understand it by assuming the sample variance underestimates the population's, but that doesn't matter; it is just a sample that was picked randomly.

    • @Niglnws
      @Niglnws 3 months ago

      @@statquest Why will it not do worse? It will make the slope flatter, which is further from the real relationship, which is more vertical, or steeper.

    • @statquest
      @statquest  3 months ago

      @@Niglnws It will do no worse because we will compare it to the simple least squares fit. If it performs worse, we won't use it.

  • @luizelias2560
    @luizelias2560 4 years ago +1

    best video about Ridge ever !!!!! very clear and precise!

  • @Tntpker
    @Tntpker 6 years ago +4

    How would you do cross validation for the example @ 10:16 to determine lambda? For example, would you then take 10 random samples of 2 (out of 8) data points and try different lambdas (for example lambda 1-20) for each _individual_ sample? And then determine which value of lambda across all those 10 samples gives the lowest variance?

    • @statquest
      @statquest  6 years ago +1

      That's the idea. In practice, there are usually many more samples, so you're not just picking 2 samples at a time, but that's the idea.

    • @Tntpker
      @Tntpker 6 years ago

      @@statquest Thanks!

    • @dadipsaus332
      @dadipsaus332 5 years ago

      How to calculate that variance then?
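
      A concrete version of the procedure discussed in this thread, as a minimal sketch assuming Python with scikit-learn (the fold count and lambda grid are arbitrary): for every train/validation split, fit ridge at each candidate lambda, record the validation error, and keep the lambda with the lowest average error. The spread of the per-fold errors is also where a variance estimate would come from.

        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import KFold

        rng = np.random.default_rng(2)
        X = rng.uniform(0, 3, size=(8, 1))                 # 8 points, as in the question
        y = 1.5 * X[:, 0] + rng.normal(scale=0.5, size=8)

        lambdas = range(0, 21)                             # try lambda = 0, 1, ..., 20
        errors = {lam: [] for lam in lambdas}

        # 4 folds of 8 points leaves 2 validation points per fold.
        for train_idx, val_idx in KFold(n_splits=4, shuffle=True, random_state=0).split(X):
            for lam in lambdas:
                fit = Ridge(alpha=lam).fit(X[train_idx], y[train_idx])
                residuals = y[val_idx] - fit.predict(X[val_idx])
                errors[lam].append(np.sum(residuals ** 2)) # validation SSR for this fold

        best = min(lambdas, key=lambda lam: np.mean(errors[lam]))
        print("lambda with the lowest mean validation error:", best)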

  • @omprakash007
    @omprakash007 5 years ago +1

    Firstly, I'd like to thank you for explaining these concepts in such a crystal clear manner; this is one of the best videos I have ever watched. Secondly, I request you to please make some videos on backpropagation and some of the more tedious concepts of ML.
    Once again, thank you.

  • @hzyTMU
    @hzyTMU 5 years ago +4

    How do you prove that the slope goes to 0 as lambda increases, at 9:42?

    • @badoiuecristian
      @badoiuecristian 4 years ago

      I have the same question

    • @chandankumar-jo7rf
      @chandankumar-jo7rf 4 years ago +1

      When lambda tends to infinity, the SSR will be negligible compared to lambda * slope^2, hence the slope has to go to 0
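
      The same argument in closed form, for the simplest case of one centered predictor and no intercept (a sketch of the math, not from the video): the ridge objective and its minimizer are

        J(\beta) = \sum_i (y_i - \beta x_i)^2 + \lambda \beta^2,
        \qquad
        \frac{dJ}{d\beta} = -2\sum_i x_i (y_i - \beta x_i) + 2\lambda\beta = 0
        \;\Longrightarrow\;
        \hat{\beta}_\lambda = \frac{\sum_i x_i y_i}{\sum_i x_i^2 + \lambda},

      so as \lambda \to \infty the denominator grows without bound and \hat{\beta}_\lambda \to 0, while \lambda = 0 recovers the least squares slope.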

  • @vinodr9655
    @vinodr9655 4 years ago +2

    Hello sir, I just wanted to tell you that you are the teacher! Thank you for your diamond-cut clarification

    • @statquest
      @statquest  4 years ago +1

      Thank you very much! :)

  • @herp_derpingson
    @herp_derpingson 6 years ago +3

    This reminds me of L2 regularization of weights in neural networks.

    • @statquest
      @statquest  6 years ago

      Yes! This is the exact same thing, only applied to Regression. I think it appeared first in the regression context, but I'm not sure.
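
      For the neural-network version of the same idea, a minimal sketch in plain NumPy (all numbers made up): adding lambda * sum(w^2) to the loss contributes 2 * lambda * w to the gradient, which is why this shows up in deep learning as "weight decay" that shrinks every weight a little on each update.

        import numpy as np

        w = np.array([0.5, -1.2, 3.0])                # some network weights
        grad_data_loss = np.array([0.1, -0.2, 0.05])  # gradient of the unpenalized loss
        lr, lam = 0.1, 0.01

        # Gradient of lam * sum(w**2) is 2 * lam * w; take one gradient-descent step.
        w = w - lr * (grad_data_loss + 2 * lam * w)
        print(w)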

  • @VVV-wx3ui
    @VVV-wx3ui 4 years ago +2

    this is one super explanation of the Regularization concept of Ridge Regression. Great work.

  • @usamanavid2044
    @usamanavid2044 4 years ago +3

    Love from 🇵🇰 Pakistan.

  • @matgg8207
    @matgg8207 5 years ago +1

    you are the GOLD to the DS

  • @MrChryssy1
    @MrChryssy1 5 years ago +11

    How do we get the new line at 3:40? We calculated 1.69 and 0.74, what did we do with them to get the new line?

    • @statquest
      @statquest  5 years ago +17

      In practice, ridge regression starts with the least squares estimates for the slope and intercept. Then it changes the slope a little bit to see if the sum of the squared residuals plus lambda times the squared slope gets smaller. If so, keep the new value. Then make the slope a little smaller and see if the sum of squared residuals plus lambda times the squared slope gets smaller. If so, keep the new value. Repeat those steps over and over again until the sum of the squared residuals plus lambda times the squared slope no longer gets smaller. Does that make sense? (See the code sketch at the end of this thread.)

    • @utkarshkulshrestha2026
      @utkarshkulshrestha2026 5 years ago +3

      @@statquest Hi Josh, the slope that you are referring to is just one of our parameters that we want to minimize right? For a higher order fitting, can it be any other parameter apart from slope?

    • @statquest
      @statquest  5 years ago +5

      @@utkarshkulshrestha2026 Least Squares will work to minimize the sum of the squared residuals using all of the parameters and the ridge regression will be applied to all parameters except for the intercept. Thus, for all parameters other than the intercept, we try to minimize the sum of the squared residuals plus the ridge regression penalty. Usually reducing the parameter values will increase the sum of the squared residuals a little bit and decrease the ridge regression penalty a lot. Does that make sense?

    • @utkarshkulshrestha2026
      @utkarshkulshrestha2026 5 years ago

      @@statquest Yes, this was pretty much clear. Thank you..!!

    • @MrChryssy1
      @MrChryssy1 5 years ago +1

      @@statquest I mean the calculation ^^ That is what I am not quite sure about
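
      For the calculation itself, here is a minimal sketch of the search described at the top of this thread (Python; the data, lambda, and step size are made up, not the values from the video):

        import numpy as np

        x = np.array([0.4, 1.2, 2.0, 2.9])  # made-up training data
        y = np.array([1.2, 1.9, 3.1, 4.0])
        lam, step = 1.0, 0.001

        def cost(slope):
            intercept = y.mean() - slope * x.mean()  # best intercept for a given slope
            residuals = y - (intercept + slope * x)
            return np.sum(residuals ** 2) + lam * slope ** 2

        slope = np.polyfit(x, y, 1)[0]      # start from the least squares slope
        while cost(slope - step) < cost(slope):
            slope -= step                   # keep shrinking while the penalized cost drops
        print("ridge slope:", round(slope, 3),
              "intercept:", round(float(y.mean() - slope * x.mean()), 3))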

  • @999Stergios
    @999Stergios 5 years ago +1

    This is not StatQuest.. this is Machine learning slayer! Damn! Another awesome video. Bravo bravo!

    • @statquest
      @statquest  5 years ago

      Thank you so much! I really appreciate it. :)