Regularization in machine learning | L1 and L2 Regularization | Lasso and Ridge Regression
- Published 24 Jun 2021
Hello,
My name is Aman and I am a Data Scientist.
About this video:
In this video, I explain regularization in machine learning: why it is needed, the different ways to regularize models, and the mathematical intuition behind Lasso and Ridge regression.
The following topics are discussed in this video:
1. What is regularization in machine learning
2. Bias-variance trade-off
3. What are L1 and L2 regularization
4. What are Lasso and Ridge regression
5. What is the use of model regularization
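The topics above can be sketched with a minimal scikit-learn example. This is an illustration with made-up synthetic data, not code from the video: Ridge applies an L2 penalty that shrinks coefficients, while Lasso applies an L1 penalty that can zero some of them out entirely.

```python
# Minimal sketch: Ridge (L2) vs Lasso (L1) on synthetic data.
# Data and alpha values are illustrative assumptions, not from the video.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
# Only the first two features matter; the other three are pure noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty: shrinks coefficients
lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty: can zero them out

print("Ridge:", np.round(ridge.coef_, 3))
print("Lasso:", np.round(lasso.coef_, 3))
```

With these settings, Lasso typically drives the three noise coefficients to (or very near) zero, while Ridge keeps all five small but nonzero.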
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature, so viewers from different backgrounds can grasp them easily.
If you need data science training from scratch, please fill in this form (please note: training is chargeable):
docs.google.com/forms/d/1Acua...
Book recommendation for Data Science:
Category 1 - Must Read For Every Data Scientist:
The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
Python Data Science Handbook - amzn.to/31UCScm
Business Statistics By Ken Black - amzn.to/2LObAA5
Hands-On Machine Learning with Scikit Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
Category 2 - Overall Data Science:
The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
Category 3 - Statistics and Mathematics:
Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
Practical Statistics for Data Scientist By Peter Bruce - amzn.to/37wL9Y5
Category 4 - Machine Learning:
Introduction to machine learning by Andreas C Muller - amzn.to/3oZ3X7T
The Hundred Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
Category 5 - Programming:
The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
Clean Code by Robert C. Martin - amzn.to/3oYOdlt
My Studio Setup:
My Camera : amzn.to/3mwXI9I
My Mic : amzn.to/34phfD0
My Tripod : amzn.to/3r4HeJA
My Ring Light : amzn.to/3gZz00F
Join Facebook group :
groups/41022...
Follow on medium : / amanrai77
Follow on quora: www.quora.com/profile/Aman-Ku...
Follow on twitter : @unfoldds
Get connected on LinkedIn : / aman-kumar-b4881440
Follow on Instagram : unfolddatascience
Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
Watch python for data science playlist here:
• Python Basics For Data...
Watch statistics and mathematics playlist here :
• Measures of Central Te...
Watch End to End Implementation of a simple machine learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging and Boosting here:
• Introduction to Ensemb...
Build Career in Data Science Playlist:
• Channel updates - Unfo...
Artificial Neural Network and Deep Learning Playlist:
• Intuition behind neura...
Natural Language Processing playlist:
• Natural Language Proce...
Understanding and building recommendation system:
• Recommendation System ...
Access all my codes here:
drive.google.com/drive/folder...
Have a different question for me? Ask me here : docs.google.com/forms/d/1ccgl...
My Music: www.bensound.com/royalty-free...
After my data science classes I used to watch the concepts through your videos and it helped me a lot in understanding... 😃😃
Thanks Mrutyunjaya.
Beautiful explanation, sir. I regret not watching this video before my interview, but I am glad I got to know it now.
Thanks a lot Shanmukh 😊
Loved the way you teach, and your voice is amazing. I wish for the growth of this channel.
Thanks! All doubts cleared!
The phrase "sweet spot" can actually impress the interviewer, I guess :)
Everyone can understand your explanation... neat and clear 👍
Thanks a lot Brahmadanna.
Good explanation, keeping the audience's understanding in mind.
Amazing, your teaching skills are really awesome sir! Thanks for this great work
Welcome Sudhanshu.
Amazing explanation, Sir!!!
THANK YOU MR. AMAN SIR
Awesome description
Very Well explained.
Excellent teacher. Thank you sir for such a wonderful explanation. :)
Welcome :)
Hi Aman, can I request you to make a video on the best approach to dealing with complex data in the real world? As we know, real-world data is very unstructured and most of the time does not exist in CSV form, yet unfortunately much of the learning available on UA-cam analyzes data that is already in CSV form. Can you please cover these points in your upcoming videos, including the best and most practical approach? For example, how to work with JSON data in a data science project, how to work with XML files, etc.?
Regards
Sanyam
Awesome sir
One of the best explanations I have seen for this topic; good work.
Thanks Gaurav.
Thank you sir .
You just made tough topics so easy.🙏
Thanks Inderjeet.
Really learned a lot, Sir. Your teaching skills are amazing. Super!
Thanks a lot Kirandeep.
Great...
Finished watching.
Very nicely explained 💯💯
Please explain the maths behind feature selection using lasso and not ridge.
Yes Navodit, that is exactly what is coming in the next video.
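In the meantime, a one-feature illustration (an assumed example, not from the video) captures the maths behind why Lasso selects features and Ridge does not. For a standardized feature, Ridge shrinks the least-squares coefficient proportionally, so it never reaches exactly zero, while Lasso applies soft-thresholding, which sets the coefficient exactly to zero whenever the penalty exceeds its magnitude.

```python
# One-feature illustration of Lasso vs Ridge shrinkage.
# For a standardized feature (x.x = 1) with OLS coefficient z = x.y:
#   Ridge: b = z / (1 + lam)                 -> proportional shrinkage, never 0
#   Lasso: b = sign(z) * max(|z| - lam, 0)   -> soft-thresholding, exactly 0
import numpy as np

def ridge_1d(z, lam):
    return z / (1.0 + lam)

def lasso_1d(z, lam):
    return np.sign(z) * max(abs(z) - lam, 0.0)

z = 0.3      # a small OLS coefficient
lam = 0.5    # penalty strength
print(ridge_1d(z, lam))  # small but nonzero
print(lasso_1d(z, lam))  # exactly zero, because |z| - lam < 0
```

This is why Lasso performs feature selection: any coefficient whose least-squares value falls below the threshold is removed outright, whereas Ridge only ever scales it down.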
Thanks, brother 😊
Hi sir, thanks a lot for such valuable videos and crisp information.
Can you please tell me why exactly a high coefficient value is a problem in regression models? Also, are very low coefficient values a problem too?
Thanks in advance.
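One way to see the problem with large coefficients is sensitivity to noise. This is an assumed toy illustration, not from the video: with two nearly duplicate features, a model with huge opposing coefficients can match a modest model on clean data yet swing wildly under tiny input perturbations. (Very low coefficients are a different issue: they tend to indicate underfitting rather than instability.)

```python
# Assumed illustration: two collinear features, two coefficient vectors
# that give the SAME prediction on clean input. Large opposing
# coefficients make the model hyper-sensitive to tiny input noise.
import numpy as np

x = np.array([1.0, 1.0])            # two nearly duplicate features
small = np.array([1.0, 1.0])        # prediction 2.0 with modest weights
large = np.array([1001.0, -999.0])  # prediction 2.0 with huge weights

x_noisy = x + np.array([0.001, -0.001])  # tiny measurement noise
print(small @ x, small @ x_noisy)   # 2.0 vs 2.0   (stable)
print(large @ x, large @ x_noisy)   # 2.0 vs 4.0   (doubles the output)
```

Regularization penalizes the second solution and pushes the optimizer toward the first, more stable one.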
Thank you for the explanation. Would have been useful to see how this would work in practice using an example in Excel using a small dataset or in Stata.
Glad it was helpful!
Feels like getting a lecture from one of my friends the night before the exam.
Which is the better regularization technique, and which one is used for variable selection?
Thank you for the informative video. Why is accuracy lower in the overfitting scenario?
Sweet Spot ❌ Technical word - Balanced Fit
Sir, since we already have a learning rate to arrive at the optimum coefficients, why do we need regularization? Aren't both of them serving the same purpose?
You said that L1 and L2 are only available for regression, but I have seen them used for feature selection on textual datasets (although in textual data the features are transformed into vector form and have numerical values). So please clarify whether they can be used for feature selection as well.
finished watching
Excellent explanation. Subscribed!
Thanks Keemster
i wish this channel reached 100K very soon
Thank you so much. If you guys keep liking and sharing, anything is possible. Your feedback is highly appreciated!
Very clear explanation 👍👍👍
Thanks Christy.
Amazing Teaching Sir.. Thank You....
Welcome Adarsh
Excellent explanation 👍🏻
Thanks Sourav.
Great... Helped a lot..
Thanks Shubhajit.
Wouldn't cubing the slope (instead of squaring) in the ridge regression penalty decrease the loss function even more? If yes, why don't we do that?
Cubing would make the penalty negative for negative slopes, so it would reward large negative coefficients instead of penalizing them. Squaring keeps the penalty non-negative and differentiable everywhere.
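A quick numeric check (assumed numbers, just to illustrate the point above) shows why a cubed penalty fails:

```python
# Why the ridge penalty uses b**2, not b**3: a cubed penalty is
# negative for negative coefficients, so minimizing the loss would
# push the slope toward large negative values instead of toward zero.
b = -5.0
print(b ** 2)  # 25.0   -> positive: a large slope is penalized
print(b ** 3)  # -125.0 -> negative: the loss DECREASES as |b| grows
```

Using b**4 or |b| would also keep the penalty non-negative; squaring is the standard choice because it is smooth and leads to a simple closed-form solution.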
1:09 that laugh 🤣🤣
I understand the struggle.
Bro, how do you find the equation (b0 + b1) after finding that the first cost function is high?
nice explanation
Thanks and welcome
What do you mean by "L1 and L2 regularization works only with linear regression"? Do decision-tree-based algorithms have other ways of regularizing, or are you saying L1 and L2 are not used in tree-based algorithms at all? L1 and L2 are also used in decision-tree-based algorithms; for example, CatBoost regression has an L2 (l2_leaf_reg) regularization parameter.
Sir, can you please make computer vision and CNN videos?
Sir, can we use Lasso and Ridge for feature selection in multi-class classification, say for the Iris data? Or is it only for binary problems? Please reply.
Multi-class also, you can use it.
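To back that up, here is a sketch (the penalty strength and solver are assumptions, not from the video) of L1-penalized multinomial logistic regression on the Iris data; the same lasso-style penalty applies per class and can zero out coefficients in a multi-class setting.

```python
# Sketch: L1-penalized multi-class logistic regression on Iris.
# C and solver are assumed illustrative choices.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)  # 3 classes, 4 features
clf = LogisticRegression(penalty="l1", solver="saga", C=0.5,
                         max_iter=5000).fit(X, y)

print(clf.coef_.shape)                  # one coefficient row per class
print((clf.coef_ == 0).sum(), "coefficients zeroed out")
```

Each of the three rows of `coef_` is regularized independently, so a feature can be dropped for one class while kept for another.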
Sir, can you make more videos on deep learning?
I am not sure, but we do use L2 in neural networks; I saw it in Andrew Ng's lecture.
Bro, please include an exercise that uses all of these.
Thanks Aaron for watching. Will do.
Is it possible to use ridge regression to impute univariate time series? Thanks
Yes, we can do that.
Linear Regression => b = (XᵀX)⁻¹Xᵀy
Ridge Regression => b = (XᵀX + pI)⁻¹Xᵀy, where p is the penalty and I is the identity matrix
Lasso Regression => please mention
Elastic Net => please mention
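The two closed forms above can be checked numerically (the data below is an assumed synthetic example). For Lasso and Elastic Net there is no general closed form to fill in: the L1 term is not differentiable at zero, so those models are fit iteratively, for example by coordinate descent.

```python
# Numerical check of the closed-form solutions:
#   OLS:   b = (X^T X)^(-1) X^T y
#   Ridge: b = (X^T X + p*I)^(-1) X^T y,  p = penalty
# Lasso / Elastic Net have no general closed form (L1 is not
# differentiable at 0) and are solved iteratively instead.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=50)

p = 2.0
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
b_ridge = np.linalg.solve(X.T @ X + p * np.eye(3), X.T @ y)

print(np.round(b_ols, 3))
print(np.round(b_ridge, 3))  # shrunk toward zero relative to OLS
```

Note the use of `np.linalg.solve` rather than explicitly inverting the matrix; it is the numerically safer way to evaluate these formulas.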
If my slope is coming out very low but I want my model's slope to be higher, what are your thoughts on that?
The slope comes from the data; why do you want to change it?
Hi Aman, BIG FAN OF YOUR WORK!! I noticed you give DS training and filled out the Google form right away! Sadly, I didn't receive any email. Can you help me with my issue? Should I receive an email? I'm super interested.
Thank you!
Hi, I'm not getting enough bandwidth for training right now; however, I am working on a course and will share an update soon. Many thanks for watching the videos and staying connected.
@@UnfoldDataScience Thanks for the quick response! Already turned on the notification bell!
My lasso regression is getting wrong results. It is giving all coefficients as zero except the constant, and an R2 score of -0.001825328970232576. Someone please help.
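That symptom (all coefficients zero, R2 near zero) usually means the L1 penalty `alpha` is too strong for the data, sometimes combined with unscaled features. Here is an assumed reproduction, not the commenter's actual data:

```python
# Assumed reproduction: when Lasso's alpha is too large, every
# coefficient is shrunk to exactly zero and the model predicts only
# the intercept, giving an R^2 around 0 (slightly negative on new data).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
y = X @ np.array([2.0, -1.0, 0.0, 0.0]) + rng.normal(scale=0.1, size=100)

too_strong = Lasso(alpha=10.0).fit(X, y)  # alpha far too large
tuned = Lasso(alpha=0.05).fit(X, y)       # a more reasonable alpha

print(too_strong.coef_)          # all zeros -> intercept-only model
print(np.round(tuned.coef_, 2))  # recovers the true structure
```

Standardizing the features and choosing `alpha` by cross-validation (e.g. scikit-learn's `LassoCV`) is the usual fix.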
Exactly what I wanted, Aman.
Thanks Vishal.
Why does L1 regularization create sparsity???
Hi Mukesh, does it?
@@UnfoldDataScience yes Aman it creates sparsity
Didn't understand it, brother. Where should L1 be used and where L2? Try explaining it on a DNN model.
Not understood.
Feedback taken, thanks 🙂
Brother, if you live in India, then explain it in Hindi as well.
It's not about staying in India or anywhere else.
Unfortunately we need to speak in English in office and interviews and this channel is completely in English so that everyone can understand.