This is exactly what I was looking for. Thank you very much!
Btw, I liked how you admitted that you didn't know why the coefficients weren't the same when setting lambda = 0 and genuinely tried to figure it out.
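A quick way to check this for yourself, assuming the video used MASS::lm.ridge (the data frame and variables below are made up just so the sketch runs): at lambda = 0 the ridge coefficients should match OLS up to numerical precision.

```r
library(MASS)

# Hypothetical data, only to make the comparison runnable.
set.seed(1)
dat <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
dat$y <- 1 + 2 * dat$x1 - 3 * dat$x2 + rnorm(50)

fit_ols   <- lm(y ~ x1 + x2, data = dat)
fit_ridge <- lm.ridge(y ~ x1 + x2, data = dat, lambda = 0)

coef(fit_ols)    # OLS coefficients
coef(fit_ridge)  # should agree with OLS up to numerical precision
```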
Great content! I was looking for ridge regression explanations for my master's thesis, as I have highly multicollinear data. Your lecture on ridge regression as well as this practical application really helped me understand the advantages. Now I only have to figure out how to implement ridge regression in Python. Many thanks and regards from Germany.
Thank you for sharing this video. Is there any video or link that discusses using ridge regression for panel data sets in Stata?
Is this applicable to linear mixed models, e.g., when the predictor variables in the LMM are all fixed effects?
It's very helpful, thank you so much, Mr. Chris.
Hi Chris, your videos are amazing. Thanks!
Hi Dr. Mack, I really enjoy your videos. Thank you. I have two continuous variables, rcs(Age, 5) and rcs(GRE_score, 6), that I relaxed the cubic splines on, and now I am getting huge VIF values for each of those variables. Does VIF work with variables that have relaxed cubic splines, please? Thank you for your important work.
I've never worked with relaxed cubic splines, so I don't know for sure.
Error: unexpected symbol in "model = lm(concrete compressive" — Sir, I am facing this problem. How can I move forward? Please help me out.
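That "unexpected symbol" error usually means R hit a space inside a variable name: lm() reads "concrete compressive" as two separate symbols. A minimal sketch of two common fixes, assuming the data frame is called concrete and the offending column is "concrete compressive strength" (both names are guesses here):

```r
# Option 1: backtick-quote a column name that contains spaces.
model <- lm(`concrete compressive strength` ~ ., data = concrete)

# Option 2: make all column names syntactically valid first.
names(concrete) <- make.names(names(concrete))   # spaces become dots
model <- lm(concrete.compressive.strength ~ ., data = concrete)
```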
This is great! But I have a question. How does one obtain the standard errors of the parameter estimates of ridge regression?
For ridge regression, we introduce bias into the parameter estimates to reduce the variance caused by the collinearity (and thus the standard errors). Often the bias becomes larger than the resulting standard errors, so calculating a standard error (for example, using bootstrapping) is not very useful.
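For reference, this bias-variance trade-off has a standard closed form for the usual linear model y = Xβ + ε (textbook material, not something specific to the video):

```latex
\hat{\beta}_\lambda = (X^\top X + \lambda I)^{-1} X^\top y,
\qquad
\operatorname{Bias}\bigl(\hat{\beta}_\lambda\bigr)
  = \mathbb{E}\bigl[\hat{\beta}_\lambda\bigr] - \beta
  = -\lambda \, (X^\top X + \lambda I)^{-1} \beta .
```

At λ = 0 this reduces to ordinary least squares and the bias vanishes; as λ grows, the bias grows while the variance σ²(XᵀX + λI)⁻¹XᵀX(XᵀX + λI)⁻¹ shrinks. A bootstrap standard error captures only the variance part, which is why it understates the total error of a ridge estimate.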
What if my eigenvalue is less than a hundred? In my case it is 26.468. What does that mean?
It means you probably don't have a multicollinearity problem. See Lecture 50.
Isn't the condition number the square root of the ratio of the maximum and minimum eigenvalues?
See Slide 8 of Lecture 50. Some people use the ratio of the max and min eigenvalues, and some people use the square root of this ratio. You need to check which definition someone is using when they tell you what the condition number is.
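For anyone who wants to compute both versions, a minimal R sketch (X here is a placeholder for your numeric predictor matrix):

```r
X <- scale(X)                      # column-standardize the predictors first
ev <- eigen(crossprod(X))$values   # eigenvalues of X'X

max(ev) / min(ev)         # definition 1: the eigenvalue ratio
sqrt(max(ev) / min(ev))   # definition 2: square root of that ratio

# Base R's kappa(X, exact = TRUE) returns the singular-value ratio,
# which matches the square-root version.
```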
What if I want to add weights?
Yes, it is possible to perform a weighted ridge regression.
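As far as I know MASS::lm.ridge has no weights argument, but glmnet accepts observation weights. A minimal sketch, with X, y, and w as placeholders for your design matrix, response, and weights:

```r
library(glmnet)

# alpha = 0 selects the ridge penalty; 'weights' are per-observation weights.
fit <- glmnet(X, y, alpha = 0, weights = w)
coef(fit, s = 0.1)   # coefficients at a chosen value of lambda
```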
This was awesome, thank you
Also, at 14:00 I think you don't get the same numbers because of rounding error when R runs the ridge regression again. If you re-run the entire code, you'll keep ending up within 0.1 of the correct coefficients.
I love it...
13:40 typical R
Thank you.