Lecture52 (Data2Decision) Detecting Multicollinearity in R

  • Published 9 Jul 2024
  • Using R to detect multicollinearity (eigenvalues, variance inflation factors), and using ridge regression to deal with multicollinearity.
    Course Website: www.lithoguru.com/scientist/st...
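The detection workflow the description mentions (VIFs plus eigenvalues of the predictor correlation matrix) can be sketched in R as follows. The data are simulated, and the `car` package for `vif()` is an assumption; the lecture may compute VIFs differently.

```r
# Sketch: detecting multicollinearity via VIFs and eigenvalues.
# Simulated data; the car package is one common source of vif().
library(car)

set.seed(1)
x1 <- rnorm(100)
x2 <- x1 + rnorm(100, sd = 0.01)   # nearly collinear with x1
x3 <- rnorm(100)
y  <- 2 * x1 + x3 + rnorm(100)
df <- data.frame(y, x1, x2, x3)

model <- lm(y ~ x1 + x2 + x3, data = df)
vif(model)            # VIFs far above 10 flag a collinearity problem

# Eigenvalues of the predictor correlation matrix
e <- eigen(cor(df[, c("x1", "x2", "x3")]))$values
max(e) / min(e)       # large ratio => ill-conditioned design
```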

COMMENTS • 22

  • @rons4345
    @rons4345 3 years ago +2

    This is exactly what I was looking for. Thank you very much!
    Btw I liked how you accepted that you did not know why the coefficients weren't the same when putting lambda = 0 and genuinely tried to figure it out.

  • @wieland369
    @wieland369 1 year ago

    Great content! I was looking for ridge regression explanations for my master's thesis, as I have high multicollinearity. Your lecture on ridge regression as well as this practical application really helped me to understand the advantages. Now I only have to figure out how to implement ridge regression in Python. Many thanks and regards from Germany.

  • @bevansmith3210
    @bevansmith3210 5 years ago +1

    Hi Chris, your videos are amazing. Thanks!

  • @abdullahmohammed8521
    @abdullahmohammed8521 6 years ago +1

    It's very helpful, thank you so much Mr. Chris

  • @KnorpelDelux
    @KnorpelDelux 6 years ago

    I love it...

  • @mohammadkhayatsarkar9402
    @mohammadkhayatsarkar9402 1 year ago

    Thank you for sharing this video. Is there any video or link to discuss using Ridge Regression for panel data sets in Stata software?

  • @yankindeez
    @yankindeez 7 years ago +1

    This was awesome, thank you

    • @yankindeez
      @yankindeez 7 years ago +2

      Also, at 14:00 I think you don't get the same numbers because of rounding error when R runs ridge again. If you re-run the entire code, you'll keep ending up within 0.1 of the correct coefficients.
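The lambda = 0 observation above is easy to check directly: with no penalty, ridge regression should reproduce the ordinary least-squares coefficients up to numerical and rounding differences. A minimal sketch with simulated data, assuming `MASS::lm.ridge` is the ridge function in use:

```r
# With lambda = 0, lm.ridge should match lm() up to rounding.
# Simulated data for illustration.
library(MASS)

set.seed(42)
x1 <- rnorm(50)
x2 <- rnorm(50)
y  <- 1 + 2 * x1 - x2 + rnorm(50)

coef(lm(y ~ x1 + x2))                    # OLS coefficients
coef(lm.ridge(y ~ x1 + x2, lambda = 0))  # ridge with no penalty
# The two sets of coefficients should agree to several decimal places.
```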

  • @tmuffly1
    @tmuffly1 5 years ago

    Hi Dr. Mack, I really enjoy your videos. Thank you. I have two continuous variables, rcs(Age, 5) and rcs(GRE_score, 6), that I relaxed the cubic splines on, and now I am getting huge VIF values for each of those variables. Does VIF work with variables that have relaxed cubic splines, please? Thank you for your important work.

    • @ChrisMack
      @ChrisMack 5 years ago

      I've never worked with relaxed cubic splines, so I don't know for sure.

  • @abhukk
    @abhukk 6 years ago

    thank you.

  • @mufutaubello3883
    @mufutaubello3883 7 years ago

    This is great! But I have a question. How does one obtain the standard errors of the parameter estimates of ridge regression?

    • @chrismack783
      @chrismack783 7 years ago

      For a ridge regression, we introduce bias into the parameter values to reduce the collinearity (and thus the standard errors). Often the bias becomes larger than the resulting standard errors, so that calculating a standard error (for example, using bootstrapping) is not very useful.

  • @avnistar2703
    @avnistar2703 2 years ago

    Is this applicable to a linear mixed model, e.g. where the predictor variables for the LMM are all fixed effects?

  • @myknowledgeyourwisdom-bydi5649
    @myknowledgeyourwisdom-bydi5649 4 years ago

    Error: unexpected symbol in "model = lm(concrete compressive". Sir, I am facing this problem; how can I move forward? Please help me out.
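That "unexpected symbol" error typically means the column name contains a space, which R's formula parser cannot handle unquoted. A sketch of two common fixes; the column name `concrete compressive strength` and the toy data are assumptions based on the error message:

```r
# "unexpected symbol" in a formula usually means a name with spaces.
# Toy data; check.names = FALSE preserves the space in the name.
df <- data.frame(check.names = FALSE,
                 `concrete compressive strength` = c(30, 35, 40),
                 cement = c(300, 350, 400))

# Fix 1: wrap the name in backticks inside the formula.
model <- lm(`concrete compressive strength` ~ cement, data = df)

# Fix 2: make the names syntactically valid first.
names(df) <- make.names(names(df))   # spaces become dots
model <- lm(concrete.compressive.strength ~ cement, data = df)
```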

  • @Hongnguyen-zv5sj
    @Hongnguyen-zv5sj 6 years ago

    Isn't the condition number the square root of the ratio of the maximum and minimum eigenvalues?

    • @chrismack783
      @chrismack783 6 years ago

      See Slide 8 of Lecture 50. Some people use the ratio of the max and min eigenvalues, and some people use the square root of this ratio. You need to check which definition someone is using when they tell you what the condition number is.
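The two conventions mentioned in the reply can be computed side by side. A short sketch with simulated collinear data (the 100 and 30 thresholds quoted in the comment are common rules of thumb, not fixed standards):

```r
# Two common condition-number conventions: the eigenvalue ratio,
# and its square root. Simulated collinear predictors.
set.seed(7)
X <- matrix(rnorm(200), ncol = 4)
X[, 2] <- X[, 1] + rnorm(50, sd = 0.05)   # induce collinearity

e <- eigen(cor(X))$values
ratio <- max(e) / min(e)
c(ratio = ratio, sqrt_ratio = sqrt(ratio))
# Check which convention a source uses before applying a threshold
# such as "condition number > 100" (ratio) vs "> 30" (square root).
```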

  • @EranM
    @EranM 7 years ago

    What if I want to add weights?

    • @chrismack783
      @chrismack783 7 years ago

      Yes, it is possible to perform a weighted ridge regression.
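One way to carry out the weighted ridge regression mentioned in the reply is with the `glmnet` package, a swapped-in choice not used in the lecture: `alpha = 0` selects the ridge penalty and `weights` supplies observation weights. Data and weights below are simulated for illustration.

```r
# Weighted ridge regression via glmnet (alpha = 0 = ridge penalty).
# glmnet and the weight vector are illustrative assumptions.
library(glmnet)

set.seed(3)
X <- matrix(rnorm(300), ncol = 3)
y <- X %*% c(1, -2, 0.5) + rnorm(100)
w <- runif(100, 0.5, 1.5)        # example observation weights

fit <- glmnet(X, y, alpha = 0, weights = w)
coef(fit, s = 0.1)               # coefficients at penalty lambda = 0.1
```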

  • @radhikamalhotra5650
    @radhikamalhotra5650 4 years ago

    What if my eigenvalue is less than a hundred? In my case it is 26.468; what does that mean?

    • @ChrisMack
      @ChrisMack 4 years ago +1

      It means you probably don't have a multicollinearity problem. See lecture 50.

  • @Ingannoakagurzone
    @Ingannoakagurzone 5 years ago

    13:40 typical R