Ridge Regression

  • Published 19 Aug 2015
  • My Patreon : www.patreon.com/user?u=49277905
  • Entertainment

COMMENTS • 177

  • @xavierfournat8264 · 3 years ago +51

    This shows that the quality and value of a video depend not on how fancy the animations are, but on how expert and pedagogical the speaker is. Really brilliant! I assume you spent a lot of time designing that course, so thank you for this!

    • @ritvikmath · 3 years ago +1

      Wow, thanks!

    • @backstroke0810 · 2 years ago

      Totally agree. I learn a lot from his short videos. Precise, concise, enough math, enough ludic examples. True professor mind.

  • @tzu-chunchen5139 · 7 months ago +4

    This is the best explanation of Ridge regression that I have ever heard! Fantastic! Hats off!

  • @rez_daddy · 4 years ago +45

    "Now that we understand the REASON we're doing this, let's get into the math."
    The world would be a better place if more abstract math concepts were approached this way, thank you.

  • @GreenEyesVids · 3 years ago +3

    Watched these 5 years ago to understand the concept and I passed an exam. Coming back to it now to refresh my memory, still very well explained!

  • @siddharthkshirsagar2545 · 4 years ago +4

    I was searching the whole internet for ridge regression and stumbled upon this video, which is by far the best explanation you can find anywhere. Thanks.

  • @nadekang8198 · 5 years ago +2

    This is awesome! Lots of machine learning books and online courses don't bother explaining the reasoning behind Ridge regression; you helped me a lot by pulling out the algebra and linear algebra proofs to show WHY IT IS LIKE THIS! Thanks!

  • @bettychiu7375 · 4 years ago

    This really helps me! Definitely the best ridge and lasso regression explanation videos on UA-cam. Thanks for sharing! :D

  • @zgbjnnw9306 · 2 years ago +2

    It's so inspiring to see how you get rid of the c^2! I learned Ridge but didn't know why! Thank you for making this video!

  • @abhichels1 · 7 years ago +12

    This is gold. Thank you so much!

  • @taareshtaneja7523 · 5 years ago +8

    This is, by far, the best explanation of Ridge Regression that I could find on UA-cam. Thanks a lot!

  • @yxs8495 · 7 years ago +38

    This really is gold, amazing!

  • @RobertWF42 · 5 months ago

    Excellent video! One more thing to add - if you're primarily interested in causal inference, like estimating the effect of daily exercise on blood pressure while controlling for other variables, then you want an unbiased estimate of the exercise coefficient and standard OLS is appropriate. If you're more interested in minimizing error on blood pressure predictions and aren't concerned with coefficients, then ridge regression is better.
    Also left out is how we choose the optimal value of lambda by cross-validation over a selection of lambda values (I don't think there's a closed-form expression for solving for lambda; correct me if I'm wrong).
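
As the comment notes, lambda is typically chosen by cross-validation rather than a closed form. A minimal numpy sketch of that procedure (the function names, fold count, and lambda grid below are illustrative, not from the video): fit ridge via its closed-form solution, then pick the lambda with the lowest mean validation error.

```python
import numpy as np

def ridge_fit(A, y, lam):
    # Closed-form ridge solution: beta = (A^T A + lam * I)^(-1) A^T y
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)

def pick_lambda(A, y, lambdas, k=5, seed=0):
    # k-fold cross-validation: return the lambda with the lowest mean validation MSE
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    mean_mse = []
    for lam in lambdas:
        errs = []
        for fold in folds:
            train = np.setdiff1d(idx, fold)          # all indices not in this fold
            beta = ridge_fit(A[train], y[train], lam)
            errs.append(np.mean((A[fold] @ beta - y[fold]) ** 2))
        mean_mse.append(np.mean(errs))
    return lambdas[int(np.argmin(mean_mse))]
```

With lam=0 this reduces to ordinary least squares, so the same helper covers the OLS-vs-ridge comparison the comment draws.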

  • @murraystaff568 · 8 years ago +2

    Brilliant! Just found your channel and can't wait to watch them all!!!

  • @theoharischaritidis4173 · 6 years ago

    This really helped a lot. A big thanks to you Ritvik!

  • @akino.3192 · 6 years ago

    You, Ritvik, are simply amazing. Thank you!

  • @Lisa-bp3ec · 7 years ago

    Thank you soooo much!!! You explain everything so clearly! There is no way I couldn't understand!

  • @alecvan7143 · 4 years ago

    Amazing video, you really explained why we do things which is what really helps me!

  • @TahaMVP · 6 years ago

    Best explanation of any topic I've ever watched. Respect to you, sir.

  • @SarahPourmolamohammadi · 1 year ago

    You are the best of all... you explained all the things, so nobody is gonna have problems understanding them.

  • @mortezaabdipour5584 · 5 years ago

    It's just awesome. Thanks for this amazing explanation. Settled in mind forever.

  • @nickb6811 · 7 years ago +1

    So so so very helpful! Thanks so much for this genuinely insightful explanation.

  • @q0x · 8 years ago +14

    I think it's explained very fast, but still very clearly; for my level of understanding it's just perfect!

  • @cu7695 · 6 years ago +2

    I subscribed just after watching this. Great foundation for ML basics

  • @nikunjgattani999 · 2 years ago

    Thanks a lot. I watched many videos and read blogs before this, but none of them clarified things at this depth.

  • @surajshivakumar5124 · 2 years ago

    This is literally the best video on ridge regression

  • @yanlinwang5703 · 2 years ago

    The explanation is so clear!! Thank you so much!!

  • @BhuvaneshSrivastava · 4 years ago +3

    Your data science videos are the best I have seen on UA-cam till now. :)
    Waiting to see more

  • @jhhh0619 · 8 years ago +1

    Your explanation is extremely good!

  • @aDifferentHandle · 6 years ago

    The best ridge regression lecture ever.

  • @soudipsanyal · 6 years ago

    Superb. Thanks for such a concise video. It saved a lot of time for me. Also, the subject was discussed fluently and was clearly understandable.

  • @Krishna-me8ly · 8 years ago

    Very good explanation in an easy way!

  • @sasanosia6558 · 5 years ago

    Amazingly helpful. Thank you.

  • @teegnas · 4 years ago +1

    These explanations are by far the best ones I have seen so far on youtube... would really love to watch more videos on the intuitions behind more complicated regression models.

  • @youyangcao3837 · 7 years ago +1

    great video, the explanation is really clear!

  • @shiva6016 · 6 years ago

    simple and effective video, thank you!

  • @JC-dl1qr · 7 years ago

    great video, brief and clear.

  • @kamesh7818 · 6 years ago

    Excellent explanation, thanks!

  • @Thaifunn1 · 8 years ago +1

    excellent video! Keep up the great work!

  • @sanketchavan8 · 6 years ago

    Best explanation of ridge regression so far.

  • @charlesity · 4 years ago +2

    Stunning! Absolute gold!

  • @babakparvizi2425 · 6 years ago

    Fantastic! It's like getting the Cliff's Notes for Machine Learning. These videos are a great supplement/refresher for concepts I need to knock the rust off of. I think he takes about 4 shots of espresso before each recording though :)

  • @Viewfrommassada · 4 years ago +1

    I'm impressed by your explanation. Great job!

  • @Hazit90 · 7 years ago +2

    excellent video, thanks.

  • @xwcao1991 · 3 years ago +1

    Thank you. I'm making this comment because I know I will never need to watch it again! Clearly explained.

  • @myazdani2997 · 7 years ago

    I love this video, really informative! Thanks a lot

  • @TURBOKNUL666 · 7 years ago +1

    great video! thank you very much.

  • @abhijeetsingh5049 · 8 years ago +1

    Stunning!! Need more access to your coursework

  • @adityakothari193 · 7 years ago

    Excellent explanation.

  • @vishnu2avv · 6 years ago

    Awesome, thanks a million for the great video! Searching whether you have done a video on LASSO regression :-)

  • @zhilingpan2486 · 6 years ago

    Very clear. Thank you!

  • @mohamedgaal5340 · 1 year ago

    I was looking for the math behind the algorithm. Thank you for explaining it.

  • @andremajchrzak8680 · 6 years ago

    you are the man, keep doing what you're doing

  • @tamoghnamaitra9901 · 6 years ago

    Beautiful explanation

  • @jamiewilliams9271 · 6 years ago

    Thank you so much!!!!

  • @aarshsachdeva5785 · 7 years ago

    You should add that all the variables (dependent and independent) need to be normalized prior to doing a ridge regression. This is because the betas in regular OLS can vary with the scale of the predictors, and ridge regression would penalize predictors that must take on a large beta simply because of their scale. Once you normalize the variables, your A^t*A matrix becomes a correlation matrix of the predictors. The regression is called "ridge" regression because forming (lambda*I + A^t*A) adds the lambda value down the diagonal of the correlation matrix, which looks like a ridge. Great video overall, though, to start understanding this regression.
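
The normalization point above can be made concrete. A small numpy sketch (the simulated predictors, correlation level, and lambda value are illustrative, not from the video): after standardizing the columns of A, A^T A / n is the sample correlation matrix, and ridge adds lambda down its diagonal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + np.sqrt(1 - 0.9**2) * rng.normal(size=n)  # corr(x1, x2) ~ 0.9
A = np.column_stack([x1, x2])

# Standardize each column to mean 0, variance 1
A_std = (A - A.mean(axis=0)) / A.std(axis=0)

# For standardized columns, (A^T A) / n is the sample correlation matrix
corr = A_std.T @ A_std / n

# Ridge forms A^T A + lambda * I: lambda runs down the diagonal -- the "ridge"
lam = 5.0
ridged = A_std.T @ A_std + lam * np.eye(A_std.shape[1])
```

Printing `corr` shows ones on the diagonal and roughly 0.9 off-diagonal; `ridged` differs from `A_std.T @ A_std` only by lambda added to each diagonal entry.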

  • @ethanxia1288 · 8 years ago +6

    Excellent explanation! Could you please do a similar video for Elastic-net?

  • @hunarahmad · 7 years ago

    thanks for the nice explanation

  • @sendydowneyjr · 7 years ago

    This is great, thank you!

  • @abeaumont10 · 5 years ago

    Great videos, thanks for making them!

  • @wi8shad0w · 3 years ago

    THIS IS ONE HELL OF A VIDEO !!!!

  • @samie3000 · 7 years ago

    Thank you!

  • @garbour456 · 2 years ago

    great video - thanks

  • @adrianfischbach9496 · 1 year ago

    Huge thanks!

  • @mikeperez4222 · 3 years ago

    Anyone else get anxiety when he wrote with the marker?? Just me?
    Felt like he was going to run out of space 😂
    Thank you so much though, very helpful :)

  • @lucyli8770 · 6 years ago

    very helpful, thanks

  • @intom1639 · 6 years ago

    Brilliant! Could you make more videos about cross-validation, RIC, BIC, and model selection?

  • @meysamsojoudi3947 · 3 years ago

    It is a brilliant video. Great

  • @janaosea6020 · 4 years ago

    bless this is amazing

  • @zw7453 · 2 years ago

    best explanation ever!

  • @e555t66 · 1 year ago

    I don't have money to pay him, so I'm leaving a comment for the algorithm instead. He is the best.

  • @yassersaeid3424 · 7 years ago +1

    a big thanks

  • @happy_labs · 7 years ago

    Thanks for this one!

  • @HeduAI · 7 years ago

    I would trade diamonds for this explanation (well, allegorically! :) ) Thank you!!

  • @faeritaaf · 7 years ago

    Thank you! Your explaining is really good, sir. Do you have time to make a video explaining the adaptive lasso too?

  • @qiulanable · 6 years ago

    excellent video!!!!

  • @canernm · 3 years ago +2

    Hi, and thanks for the video. Can you explain briefly why, when the m_i and t_i variables are highly correlated, the estimators β0 and β1 are going to have very big variance? Thanks a lot in advance!

    • @lanag873 · 2 years ago

      Hi, same question here 😶‍🌫
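
One way to see the answer to the question above is a quick simulation: with two nearly collinear predictors, the OLS coefficient estimates swing wildly from sample to sample, while ridge shrinkage tames that variance. A minimal numpy sketch (the data-generating numbers and lambda value are illustrative, not from the video):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, lam = 50, 500, 10.0
ols_b1, ridge_b1 = [], []
for _ in range(trials):
    x1 = rng.normal(size=n)
    x2 = x1 + 0.05 * rng.normal(size=n)        # nearly collinear with x1
    y = 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)
    A = np.column_stack([x1, x2])
    # OLS: coefficient on x1 from least squares
    ols_b1.append(np.linalg.lstsq(A, y, rcond=None)[0][0])
    # Ridge: same coefficient from (A^T A + lam * I)^(-1) A^T y
    ridge_b1.append(np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ y)[0])

# OLS splits credit between x1 and x2 unstably; ridge stabilizes the estimate
print("OLS  std of beta1:", np.std(ols_b1))
print("ridge std of beta1:", np.std(ridge_b1))
```

Because x1 and x2 carry almost the same information, OLS can trade a large positive coefficient on one against a large negative one on the other with little change in fit, which is exactly the high-variance behavior the video's penalty term is designed to suppress.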

  • @brendachirata2283 · 5 years ago

    hey, great video and excellent job

  • @kxdy8yg8 · 6 years ago

    This is gold indeed!

  • @kartikkamboj295 · 4 years ago

    Dude ! Hats off 🙏🏻

  • @justinm1307 · 6 years ago +2

    this is great stuff

  • @nicolasmanelli7393 · 1 year ago

    I think it's the best video ever made

  • @divyarthprakash1541 · 6 years ago

    Very well explained :)

  • @jakobforslin6301 · 1 year ago

    You are awesome!

  • @zhongshanhu7376 · 8 years ago +1

    Very good explanation in an easy way!!

  • @Sytch · 6 years ago

    Finally, someone who talks quickly.

  • @lauraarmbrust1639 · 5 years ago

    Thanks for this really helpful video!
    Could you explain why the independent variables in A should be standardized for Ridge and Lasso Regression?

  • @yingbinjiang133 · 7 years ago +1

    a very nice video

  • @sergioperezmelo3090 · 5 years ago

    Super clear

  • @ilkerkarapanca · 6 years ago

    Awesome, thanks man

  • @mnwepple · 8 years ago

    Awesome video! Very intuitive and easy to understand. Are you going to make a video using the probit link?

  • @RAJIBLOCHANDAS · 2 years ago

    Excellent approach to discussing Lasso and Ridge regression. It could have been better if you had discussed how Lasso yields sparse solutions! Anyway, nice discussion.

  • @SUBHRASANKHADEY · 5 years ago +2

    Shouldn't the radius of the circle be c instead of c^2 (at around 7:00)?

  • @shashankparameswaran2336 · 2 years ago

    Amazing!!!

  • @Tyokok · 5 years ago

    Thanks for the video! Silly question: where is your L2 norm video? Can you provide a link? (Subscribed.)

  • @OmerBoehm · 2 years ago

    Brilliant simplification of this topic. No need for fancy presentation to explain the essence of an idea!!

  • @prabhuthomas8770 · 5 years ago

    SUPER !!! You have to become a professor and replace all those other ones !!

  • @tsrevo1 · 6 years ago

    Sir, a question about 4:54: I understand that in the tax/income example the VARIANCE of the beta0/beta1 estimates is high, since there's an additional beta2 affecting things. However, the MEAN in the population should be the same, even with high variance, isn't it so? Thanks in advance!

  • @sagarsitap3540 · 4 years ago

    Thanks! Why can't lambda be negative? What if, to improve variance, the slope needs to be increased rather than decreased?

  • @sachinrathi7814 · 3 years ago

    Can anyone explain the statement "The efficiency property of any estimator says that the estimator is the minimum variance unbiased estimator"? What does minimum variance denote here?

  • @adarshnamdev5834 · 3 years ago

    @ritvik When you said that the estimated coefficients have small variance, does that imply a tendency to obtain different estimate values of those coefficients? I tend to confuse this term 'variance' with the statistic variance (spread of the data!).

    • @benxneo · 3 years ago

      Variance is the change in prediction accuracy of an ML model between training data and test data.
      Simply put, if an ML model predicts with accuracy "x" on training data and "y" on test data, then
      Variance = x - y
      A smaller variance would thus mean the model is fitting less noise on the training data, reducing overfitting.
      This definition was taken from: datascience.stackexchange.com/questions/37345/what-is-the-meaning-of-term-variance-in-machine-learning-model
      Hope this helps.

    • @adarshnamdev5834 · 3 years ago

      @@benxneo thanks mate!