I should say this is the best video on UA-cam on assumptions.
thank you sir
So far the best video on Linear Regression Assumptions.
Watched so many videos on the assumptions, but after watching this video I understand them better.
This is such a great video
Amazing Explanation
This is what I was looking for. Please suggest more videos on the combo of data visualization and statistics. Thank you so much.
best video on the assumptions of linear regression, thanks...
Thanks bro, you make this topic more understandable. Please make more videos like this, excellent job.
Thank you, I will
Awesome explanation bro
Glad you liked it
Thank you very much, your way of teaching is really fantastic. Please make a video on the gradient descent algorithm in a simple way. Again thanks.
sure
Please do a video on testing of hypothesis
Shall be thankful..
Very well explained
Thanks
Bro please make your next video on testing of hypothesis
Sure, it's a great suggestion, I will work on it...
excellent !!
@16:11 Why don't we want duplicate information? What happens if we do?
kindly reply
When you check the coefficients of your OLS fit you will find that it has given high values to both of the independent variables that are highly correlated with each other, and a very low coefficient to the other independent variable which contains relevant information but has slightly lower correlation compared to the multicollinear variables. Multicollinearity is a problem because it undermines the statistical significance of an independent variable.
Hope this has helped you out; there is also a small check sketched below. If not, please let me know and I will try to explain it in some other way...
Thanks for giving so much attention to the video...
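A minimal sketch (my own illustration, not from the video) of how to spot this with variance inflation factors, assuming statsmodels is installed and using made-up data:

import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.05, size=200)  # nearly a duplicate of x1
x3 = rng.normal(size=200)                   # unrelated to x1/x2 but still informative
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

for i, col in enumerate(X.columns):
    print(col, variance_inflation_factor(X.values, i))  # x1 and x2 get very large VIFs

A VIF well above 5-10 for x1 and x2 signals that their individual coefficients and p-values cannot be trusted.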
What is the effect of heteroskedasticity on regression?
Why don't you make videos on ML algorithms like
1. Logistic regression
2. Naive Bayes
3. Decision Trees
4. Random Forest
5. Boosting
6. PCA
7. K-means
8. KNN
9. Time series
And topics like regularization, activation functions
All these videos are on their way... Thanks for the support...
@@whenmathsmeetcoding1836 all the best bro
I didn't understand why we have to calculate the autocorrelation.
Autocorrelation reveals whether the independent variable has some hidden pattern.
For an even more detailed mathematical explanation please review the link below:
www.homepages.ucl.ac.uk/~uctpsc0/Teaching/GR03/Heter&Autocorr.pdf
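A minimal sketch (my own addition, not from the linked notes) of one common autocorrelation check, the Durbin-Watson statistic on the OLS residuals, assuming statsmodels is installed and using made-up data:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=200)  # independent errors

model = sm.OLS(y, sm.add_constant(x)).fit()
print(durbin_watson(model.resid))  # close to 2 means no autocorrelation in the residuals

Values far below 2 suggest positive autocorrelation, values far above 2 suggest negative autocorrelation.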
Please provide the Python code for all of these.
Here is a start (m and c being the usual slope and intercept initialisation for y = m*x + c):
m = 0.1  # initial guess for the slope
c = 0.1  # initial guess for the intercept
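A rough sketch of how that could continue into a full gradient descent fit of y = m*x + c (my own continuation; the data, learning rate, and iteration count are made up for illustration):

import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 5, 100)
y = 3.0 * x + 2.0 + rng.normal(scale=0.5, size=100)  # toy data around y = 3x + 2

m, c = 0.1, 0.1   # initial guesses, as in the snippet above
lr = 0.05         # learning rate
for _ in range(2000):
    y_pred = m * x + c
    dm = (-2.0 / len(x)) * np.sum(x * (y - y_pred))  # d(MSE)/dm
    dc = (-2.0 / len(x)) * np.sum(y - y_pred)        # d(MSE)/dc
    m -= lr * dm
    c -= lr * dc
print(m, c)  # should end up close to 3 and 2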