Polynomial Regression | Machine Learning
- Published 22 Jul 2024
- We simplify the concept, making it easy to understand how polynomial terms can enhance your regression models. Learn to apply Polynomial Regression in Python and elevate your skills in capturing complex relationships in your data.
Code used: github.com/campusx-official/1...
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:10 - What is Polynomial Regression
07:25 - Code Demo
Great video bro, excellent, perfect, no words to express my gratitude. You covered all the doubts I had w.r.t. polynomial regression.
If anyone wants to learn ML, then this is the best channel.. the explanations are amazing for all the topics.
Perfect explanation as always.
Best of the best!
Awesome Explanation.
Best, simple & easy explanation
Your explanation is wonderful. I request you to kindly prepare a video to explain the code.
Excellent. Very well explained. You should use real world data instead of random numerical values
Thanks for the Video Sirjiiii
Your way of teaching is very, very interesting.
6:03, the PolynomialFeatures class, for x1 and x2 with degree 3, also adds the columns (x1)(x2), (x1^2)(x2), and (x1)(x2^2). In general, for PolynomialFeatures with degree = d and n original features, we get a total of (n+d)! / (d! n!) features (including the bias column). Just thought this'd be useful to know.
Thanks! I had the same doubt, because at 5:52 sir didn't add the additional term x1*x2, so I got confused. Please correct me if I'm wrong.
@@ruchiagrawal6432 it generates the features' higher-order and interaction terms. For example, (x1, x2) becomes (1, x1, x2, x1^2, x1x2, x2^2), and so on.
Where/how does the term (x1)(x2) come from? Kindly elaborate.
@@subhajitdey4483 it's like the formula (a+b)^n, where n is the degree of the polynomial — expanding it gives you the cross terms.
Can we apply the polynomial transformation to the whole dataframe (say df) first, and only then split into X_train and X_test? If yes, then why does everyone split first and then transform X_train and X_test separately?
Thank u nitish 🎉
Infinite thanks to you, sir
Thank you sir
Sir, you generated the data as X and y. What if we have a real-life dataset — how do we plot the data distribution then?
1:35 --> 7:25 What is polynomial linear regression?
finished watching
nice
I understood the explanation well.. thanks... but Bhai, one doubt: in a real dataset, just by looking at the data without plotting a graph, will I be able to tell that we need polynomial regression? Also, if we add features like x^2 and x^3, won't it end up with multicollinearity, since a dependency between input features exists now?
Yes, it adds multicollinearity to your model by introducing x^2 and x^3 features. To overcome that, you can use orthogonal polynomial regression, which introduces polynomials that are orthogonal to each other.
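The multicollinearity point above is easy to verify numerically: on a positive range, x and x² are almost perfectly correlated. A minimal sketch (the range and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 1000)  # positive range, as is common for raw features

# Pearson correlation between the raw feature and its square.
# For x on [0, 10] this comes out well above 0.9 -- that near-linear
# dependency between "input" columns is exactly the multicollinearity
# introduced by plain polynomial features.
corr = np.corrcoef(x, x**2)[0, 1]
print(corr)
```

Centering x before squaring, or using an orthogonal polynomial basis, reduces this correlation substantially.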
Wow
❤️
Why is polynomial regression with degree 2 not able to capture the bowl shape in the training dataset? I think the curve shown was for a higher-degree polynomial.
Yes, even I was confused, because the polynomial features should have fit better for degree 2.
One question....
10:37 Why are we not using fit_transform() for X_test_trans, like we did for X_train_trans?
fit() learns parameters from the data (for a scaler that means the mean and variance; for PolynomialFeatures it is the number of input features), while transform() applies those learned parameters to transform the data. We call fit_transform() on the training set, so the transformer learns only from the 80% training split, which is more reliable than the 20% test split. Then we apply the same fitted transformer to the test set, keeping the transformation consistent across the whole dataset and avoiding data leakage. That's why we only use transform() for X_test_trans.
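The fit-on-train / transform-on-test pattern described above can be sketched end to end (a minimal example with synthetic quadratic data; the degree, noise level, and split ratio are arbitrary choices, not from the video):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures

# Synthetic data with a quadratic relationship plus noise
rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, (200, 1))
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + rng.normal(0, 0.2, 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=1
)

poly = PolynomialFeatures(degree=2, include_bias=False)
X_train_trans = poly.fit_transform(X_train)  # fit on the training split only
X_test_trans = poly.transform(X_test)        # reuse the already-fitted transformer

lr = LinearRegression().fit(X_train_trans, y_train)
print(lr.score(X_test_trans, y_test))  # R^2 on the held-out test set
```

Calling fit_transform() again on the test set would refit the transformer using test data, which leaks information from the evaluation set into the preprocessing step.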
Hi Nitish bro, this audio is not clear.. please make a new video on the same topic if possible. Also, please cover feature selection and XGBoost.
You can use earphones; it will be understandable.
Why do we create X_new and y_new when we already have X_test_trans and y_pred?
Same doubt
Watch this video with the subtitles provided by YouTube. You will find another story in the subtitles 😅
We might have to use `print("Input", poly.n_features_in_)` instead of `print("Input", poly.n_input_features_)`, since `n_input_features_` has been removed from newer scikit-learn versions.
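The attribute rename mentioned above can be verified in recent scikit-learn versions (a small sketch; the fitted values are arbitrary):

```python
from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(degree=2)
poly.fit([[1.0, 2.0]])  # one sample with 2 features

# n_input_features_ was deprecated and later removed;
# n_features_in_ is the current attribute for the input feature count.
print("Input", poly.n_features_in_)        # 2
print("Output", poly.n_output_features_)   # 6: 1, x1, x2, x1^2, x1*x2, x2^2
```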
The audio is not clear
❤