Lasso Regression

  • Published 5 Oct 2024
  • My Patreon : www.patreon.co...

COMMENTS • 93

  • @rishabhsharma329
    @rishabhsharma329 2 years ago +14

    This question was asked in my interview. 7 mins of this video changed my life, 5 years ago! Thank you

  • @laIlI129
    @laIlI129 8 years ago +2

    This is the first time I'm exploring the meaning of LASSO regression, and I have no confusion after watching this video. Very helpful. Thanks Ritvik Kharkar.

  • @kristjan2838
    @kristjan2838 7 years ago +13

    Took a convex optimization course last year. You explained clearly in 3 videos what took days of digging previously. Papa Bless

  • @junjiema4613
    @junjiema4613 8 years ago +30

    Very helpful! I like the speed you speak at

    • @michaelfresco2769
      @michaelfresco2769 6 years ago

      also exceptionally clear

    • @glaswasser
      @glaswasser 4 years ago

      I first thought I still had YouTube on 1.5x speed haha

  • @luqiyao5896
    @luqiyao5896 8 years ago +1

    The best introduction to LASSO, very easy to understand! Thanks!

  • @qwqsimonade3580
    @qwqsimonade3580 3 years ago +1

    Thanks so much for the video. I've been confused by my dumb prof for almost one year, and it wasn't until this video that I understood ridge and the L1/L2 penalty. Thank you

  • @perrysellers9198
    @perrysellers9198 6 years ago

    Excellent job explaining Ridge and Lasso. Your equations/functions AND visuals close the loop nicely!

  • @ariani86
    @ariani86 6 years ago +1

    Your videos are extremely helpful and make it easy to understand the math behind ML! Thanks a ton!

  • @glaswasser
    @glaswasser 4 years ago +2

    Finally! I stumbled upon that figure in the ISLR book but did not understand what was going on; you made it clear to me now, thanks!

  • @Dhruvbala
    @Dhruvbala 2 years ago +1

    solid video. saved my interest in the subject, so thank you very much!

  • @meichendong3434
    @meichendong3434 5 years ago

    Very clear explanation of the contour!

  • @ahmetcihan8025
    @ahmetcihan8025 3 years ago

    This is insane, man. Thank you so much.

  • @krishnapathi572
    @krishnapathi572 6 years ago

    Amazing clarity of idea...and perfect speed for the explanations :D

  • @hcgaron
    @hcgaron 6 years ago

    You are awesome. Thank you for your passion to teach this topic!

  • @Michael-uc3fx
    @Michael-uc3fx 8 years ago

    Awesome introduction, thanks! Keep posting videos!

  • @secondtonone2284
    @secondtonone2284 2 years ago

    Thank you so much for sharing this brilliant video! If you can find the time, I hope you cover the unique feature of the adaptive lasso (oracle properties) too.

  • @SaintRudi85
    @SaintRudi85 6 years ago

    Excellent video. Very clear. Thank you.

  • @robertc2121
    @robertc2121 6 years ago

    Brilliantly explained - Brandon Foltz, I love your video series!!!!

  • @iVergilchiou
    @iVergilchiou 7 years ago

    It's so clear and very helpful!! Thank you so much!

  • @deepakravishankar169
    @deepakravishankar169 6 years ago

    really succinct and to the point. good explanation

  • @spencerlee5500
    @spencerlee5500 4 years ago

    So great! Thank YOU!

  • @aliteshnizi672
    @aliteshnizi672 6 years ago

    Falling in love with your videos

  • @mikx55
    @mikx55 7 years ago

    Super precise and incredibly helpful!!

  • @GotUpLateWithMoon
    @GotUpLateWithMoon 7 years ago

    very helpful, thanks very much Ritvik!

  • @faithkalos7745
    @faithkalos7745 6 years ago

    Very good explanation, thanks a lot !

  • @akhileshpandey8457
    @akhileshpandey8457 7 years ago

    The most perfect video on this stuff.. Even the pace was something I could keep up with :)

  • @morumotto
    @morumotto 4 years ago

    Thank you!!!

  • @ronithsinha5702
    @ronithsinha5702 6 years ago +2

    Can you please explain again why exactly the coefficients of the B vector hit the corners of the diamond-shaped constraint region in the case of Lasso regression, but do not hit the circumference in the case of Ridge regression? This is the only concept I am unable to grasp: how does Lasso lead to the elimination of coefficients, while Ridge only causes shrinkage of coefficients and not outright deletion?
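
For readers with the same question, a minimal sketch assuming scikit-learn and NumPy (not code from the video): on the same data, the L1 penalty (lasso) drives some coefficients exactly to zero, while the L2 penalty (ridge) only shrinks them. Geometrically, a coefficient of exactly zero sits at a corner of the diamond-shaped L1 constraint region, and the expanding RSS contours tend to touch the region at such a corner first; the circular L2 region has no corners, so its touching point generally has no zero coordinate.

    # Hedged illustration, not the video's code: lasso vs ridge on the same data.
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))                              # 10 candidate features
    beta_true = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0, 0, 0])   # only the first two matter
    y = X @ beta_true + rng.normal(scale=0.5, size=200)

    lasso = Lasso(alpha=0.5).fit(X, y)   # alpha plays the role of lambda
    ridge = Ridge(alpha=0.5).fit(X, y)

    print("lasso:", np.round(lasso.coef_, 3))   # several entries are exactly 0.0
    print("ridge:", np.round(ridge.coef_, 3))   # small but nonzero entries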

  • @jianishen5656
    @jianishen5656 7 years ago

    Thank you so much!! The explanation is so good!

  • @aliteshnizi672
    @aliteshnizi672 6 years ago

    Incredibly good.

  • @BhuvaneshSrivastava
    @BhuvaneshSrivastava 4 years ago +1

    Great videos as expected 😊..
    Also please find time to make videos on:
    - A/B testing
    - Survival Modelling
    - Types of errors
    - GBM

    • @ritvikmath
      @ritvikmath 4 years ago +1

      Thanks! And I will look into those suggestions

  • @urmumsfrend
    @urmumsfrend 6 months ago

    thank you!

  • @chillwinternight
    @chillwinternight 6 years ago

    Thank you! A very helpful video. Please consider making a video on coordinate descent. :)

  • @miliyajindal
    @miliyajindal 6 years ago

    I have been learning data science for the last 6 months, but there are no articles or videos that are better than yours.

  • @rasikai521
    @rasikai521 6 years ago

    Perfectly explained. Thank you so much

  • @shakedg2956
    @shakedg2956 6 years ago

    really good explanation!

  • @preeyank5
    @preeyank5 5 years ago

    thanks a lot...God Bless!!

  • @alimuqaibel7619
    @alimuqaibel7619 6 years ago

    Thanks, very informative

  • @vishnu2avv
    @vishnu2avv 7 years ago

    Awesome video. Thanks a million for the upload :-)

  • @abomad2011
    @abomad2011 7 years ago

    good explanation

  • @lluisgasso
    @lluisgasso 6 years ago

    Awesome Job!

  • @antonisstellas741
    @antonisstellas741 2 years ago

    very nice video!

  • @robertjonka1238
    @robertjonka1238 7 years ago +2

    outstanding

  • @skan121
    @skan121 8 years ago

    Brilliant!!

  • @jackjiang7617
    @jackjiang7617 6 years ago

    great explanation!

  • @mech_builder7998
    @mech_builder7998 3 years ago

    This intuitive explanation made lasso regression "click" for me, so a big thanks! Were you inspired by / did you get the ideas and diagrams from a book, or did you come up with them yourself?

  • @tomaspablofermandois4690
    @tomaspablofermandois4690 6 years ago

    Thanks for the Video!

  • @sikun7894
    @sikun7894 6 years ago

    Thanks!

  • @pranukvs
    @pranukvs 7 years ago

    Great stuff man, you should put up a course on Udacity or something!!

  • @Chris-is9fm
    @Chris-is9fm 7 years ago

    Thanks, cheers!

  • @randomforrest9251
    @randomforrest9251 4 years ago +1

    Great explanation, but it's a little bit misleading, since we do not regularize beta0, only beta1..m.

    • @ritvikmath
      @ritvikmath 4 years ago +1

      You have a good point, thank you!
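
On the point in this exchange, for reference: the lasso objective as it is usually written (for example in ISLR) penalizes only the slope coefficients, leaving the intercept out of the L1 term,

    \min_{\beta_0,\beta_1,\dots,\beta_m}\ \sum_{i=1}^{n}\Big(y_i-\beta_0-\sum_{j=1}^{m}\beta_j x_{ij}\Big)^2\ +\ \lambda\sum_{j=1}^{m}|\beta_j|

so the penalty sum starts at j = 1 and the intercept is never shrunk; as other comments note, including beta_0 in the two-coefficient picture is a simplification for the sake of the drawing.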

  • @batosato
    @batosato 5 years ago

    Hey there,
    Thanks for all the explanations. Could you make a video on the Non-Linear Least Squares (NLS) estimator and how it differs from OLS?
    Thanks

  • @ravivijayk1840
    @ravivijayk1840 8 years ago

    Thanks for doing this video, intuitively helpful!
    A couple of questions: 1) In lasso, are the resulting coefficients always positive or zero?
    2) Do we still interpret the coefficients after they get penalized by whatever lambda value we pass?

  • @yxs8495
    @yxs8495 7 years ago

    excellent

  • @marcofumagalli8147
    @marcofumagalli8147 7 years ago

    Good job !

  • @qiulanable
    @qiulanable 7 years ago

    awesome video !!!

  • @tonix1993
    @tonix1993 7 years ago

    beast lecturer !

  • @jererox
    @jererox 7 years ago

    Thanks it really helped.

  • @sidk5919
    @sidk5919 8 years ago

    Awesome!

  • @kautukkaushik7587
    @kautukkaushik7587 7 years ago

    Thanks for the video. The explanation is really great. But I have a question: what if the curve passes through the line between (c,0) and (0,c) and also the line between (c,0) and (0,-c)? Then which point would be better?

    • @harminderpuri1243
      @harminderpuri1243 6 years ago

      That curve would not be the smallest curve.. there will be curves with a lower (y - Bs)^2.. plot it and visualize

  • @bitadet3935
    @bitadet3935 6 years ago

    GREAT VIDEO! :D

  • @AhmedAbdelrahmanAtbara
    @AhmedAbdelrahmanAtbara 6 years ago

    I just don't agree with you on the feature selection argument. If beta comes out with many zeros, that doesn't mean the model is conducting any feature selection process there; it will automatically ignore the zeros. Probably feature selection is something different.

  • @nicholasdi1529
    @nicholasdi1529 2 years ago

    Hello! So will C be 25 in that case? (around 5 mins)

  • @manikandantv3015
    @manikandantv3015 7 years ago

    Could you please explain how some of the coefficients become ZERO in LASSO? I would like to know the internals.
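
One hedged sketch of those internals (coordinate descent with soft-thresholding, the approach used by common lasso solvers such as glmnet and scikit-learn, not necessarily the video's derivation): each coefficient is updated in turn, and the soft-threshold operator sets any coefficient whose correlation with the partial residual is smaller in magnitude than lambda to exactly zero.

    import numpy as np

    def soft_threshold(rho, lam):
        """Shrink rho toward 0; anything inside [-lam, lam] becomes exactly 0."""
        if rho > lam:
            return rho - lam
        if rho < -lam:
            return rho + lam
        return 0.0

    def lasso_coordinate_descent(X, y, lam, n_iters=100):
        """Illustrative solver for (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
        n, p = X.shape
        beta = np.zeros(p)
        for _ in range(n_iters):
            for j in range(p):
                # partial residual: the current fit with feature j taken out
                r_j = y - X @ beta + X[:, j] * beta[j]
                rho = X[:, j] @ r_j / n
                z = (X[:, j] ** 2).sum() / n
                beta[j] = soft_threshold(rho, lam) / z
        return beta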

  • @Jack20032008
    @Jack20032008 6 years ago

    thanks. it's helpful

  • @puifais
    @puifais 7 years ago

    This is a great video. I suggest you NOT touch or move the piece of paper so much. It'll be less distracting and help the audience look at the equations and compare the information.

  • @nikhilnambiar7160
    @nikhilnambiar7160 5 years ago

    So is finding the new beta done by taking the derivative of the lasso formula with respect to beta, and subtracting it from the old beta?
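
On the question above: the squared-error part of the lasso objective is differentiable, but the |beta| penalty has no derivative at zero, so "take the derivative and subtract" is only part of the update. A hedged sketch of one standard scheme (a proximal-gradient / ISTA step, not necessarily the one the video has in mind): take a gradient step on the squared-error part only, then soft-threshold the result, which is what produces exact zeros.

    import numpy as np

    def ista_step(beta, X, y, lam, step):
        """One proximal-gradient update for (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
        n = len(y)
        grad = -X.T @ (y - X @ beta) / n   # gradient of the smooth (squared-error) part only
        beta_tmp = beta - step * grad      # the usual "subtract the derivative" step
        # soft-thresholding handles the |beta| penalty and snaps small entries to 0
        return np.sign(beta_tmp) * np.maximum(np.abs(beta_tmp) - step * lam, 0.0)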

  • @victorcrspo
    @victorcrspo 6 years ago

    Hello! I have a question related to this video and to the Ridge Regression video. Why should I not use these methods if I have one variable (Y = beta_0 + beta_1*X)? What would happen if I used one of these methods in that situation? Thank you!

  • @dodg3r123
    @dodg3r123 8 years ago

    Thank you so much! So how do you come up with a suitable value for c?

  • @thelastcipher9135
    @thelastcipher9135 2 years ago

    How do you pick the constraint c?
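
On the questions above about choosing c: the constrained form (sum of |beta_j| <= c) and the penalized form (adding lambda * sum of |beta_j| to the loss) are equivalent, with a smaller budget c corresponding to a larger lambda, and in practice the value is usually chosen by cross-validation. A minimal sketch assuming scikit-learn (not the video's code):

    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)

    model = LassoCV(cv=5).fit(X, y)   # tries a grid of lambdas, keeps the one with the best CV error
    print("chosen lambda (alpha):", model.alpha_)
    print("implied budget c = sum of |beta_j|:", np.abs(model.coef_).sum())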

  • @joshespinoza8645
    @joshespinoza8645 9 years ago

    Awesome, so what exactly are the "betas"?

    • @thestyxx
      @thestyxx 8 years ago

      +Josh Espinoza The regression coefficients, in other words, the estimated effects of your parameters.

  • @manikandantv3015
    @manikandantv3015 7 years ago

    If LASSO is for feature selection, how is it different from PCA? Please clarify.

  • @tableauvizwithvineet148
    @tableauvizwithvineet148 6 years ago

    What is the meaning of the green level curves, and why are they used?
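
A short note on the question above, in the two-coefficient picture the video appears to use: each green level curve is the set of coefficient values that give the same residual sum of squares,

    \{(\beta_0,\beta_1):\ \sum_{i=1}^{n}(y_i-\beta_0-\beta_1 x_i)^2 = k\},

an ellipse centered at the ordinary least-squares solution that grows as k increases. The constrained solution is the point where the smallest such ellipse first touches the constraint region.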

  • @edlarmore5958
    @edlarmore5958 5 years ago

    Great explanations. Just wish you would talk a tad bit slower.

  • @Theateist
    @Theateist 6 years ago

    Why do the corners get hit a lot more than other points?

  • @phuccoiinkorea3341
    @phuccoiinkorea3341 6 years ago

    What is its optimization formula?
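
For the question above, the lasso is usually stated in one of two equivalent forms, a constrained form and a penalized (Lagrangian) form:

    \min_{\beta}\ \sum_{i=1}^{n}\big(y_i - x_i^{\top}\beta\big)^2 \quad\text{subject to}\quad \sum_{j}|\beta_j|\le c

    \min_{\beta}\ \sum_{i=1}^{n}\big(y_i - x_i^{\top}\beta\big)^2\ +\ \lambda\sum_{j}|\beta_j|

Each budget c corresponds to some penalty weight lambda; the video works with the constrained form and its diamond-shaped feasible region.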

  • @zhenqiangsu8231
    @zhenqiangsu8231 7 years ago

    Great!

  • @fredrious
    @fredrious 8 years ago

    it's very good, but too fast!!!

  • @jianfengxu7889
    @jianfengxu7889 7 years ago

    beta_0 should not be in the regularization term.

    • @akhileshpandey8457
      @akhileshpandey8457 7 years ago

      He is just using it as an example.. he explained that in the Ridge video

  • @youyangcao3837
    @youyangcao3837 8 years ago

    Thank you!!!

  • @alvaroneuenfeldtjunior9157
    @alvaroneuenfeldtjunior9157 7 years ago

    outstanding