Multiple Regression Backward Elimination

  • Published 15 Sep 2024

COMMENTS • 13

  • @fengli9945 · 11 months ago

    Very clearly explained! Thank you.

  • @Parna_pushpshala · 2 years ago

    Thank you so much for uploading this video... it's the most useful video I came across on multiple regression... 💯💯

  • @chaitaliacharya7790 · 5 years ago +1

    Very Simple yet an effective way of learning backward elimination

  • @ravish6280 · 5 years ago +8

    The click sounds are SOOOO annoying... I have to quit this video.

  • @afkghost9758 · 5 years ago +8

    That is the most annoying mouse click I have ever heard.

  • @babanadopu · 6 years ago

    This is really helpful for my Machine Learning study. Thank you!

  • @omololagbade-oladiran4819 · 1 year ago

    Thank you

  • @gokuafrica · 6 years ago

    Can someone explain to me how the p-value is calculated in this context? I would appreciate the formula.
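    For the p-value question: in standard OLS output, each coefficient's p-value comes from a two-sided t-test of that coefficient against zero, using its standard error and the residual degrees of freedom. A minimal sketch of the arithmetic with made-up data (the variables and numbers here are illustrative, not taken from the video):

    ```python
    # Sketch: how OLS coefficient p-values are computed (illustrative data)
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 50
    # Design matrix: intercept column plus two predictors
    X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
    # True coefficients: intercept 1.0, x1 = 2.0, x2 = 0.0 (noise predictor)
    y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=n)

    # OLS estimate: beta = (X'X)^-1 X'y
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y

    # Residual variance with n - k degrees of freedom
    resid = y - X @ beta
    df = n - X.shape[1]
    sigma2 = resid @ resid / df

    # Standard error of each coefficient, then t statistic
    se = np.sqrt(sigma2 * np.diag(XtX_inv))
    t = beta / se

    # Two-sided p-value from the t distribution with df degrees of freedom
    p = 2 * stats.t.sf(np.abs(t), df)
    print(p)  # p for x1 should be tiny; p for the noise predictor is not
    ```

    The formula in one line: p = 2 · P(T > |b / SE(b)|), where T follows a t distribution with n − k degrees of freedom.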

  • @plualumanevko2194 · 7 years ago

    At 10:25, why are you highlighting a p-value of 2.9 and stating it is significant? I am new to this, but I thought p-values were only significant below 0.05?

  • @rameshthamizhselvan2458 · 5 years ago

    I still can't understand why she is eliminating the columns...

    • @afkghost9758 · 5 years ago

      High p-values support the null (business-as-usual) hypothesis; low p-values support the impact of a variable on the dependent variable. So she's pulling the non-significant (high p-value) variables out in an attempt to explain what is driving the R-squared (what is causing that 84% explanation).
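      The elimination logic in that reply can be sketched as a loop: refit, find the predictor with the highest p-value, drop it if it exceeds the significance threshold, repeat. A hedged sketch with invented data (the helper names and the data are illustrative, not from the video):

      ```python
      # Sketch of backward elimination: repeatedly drop the predictor with the
      # highest p-value until every remaining p-value is below alpha.
      import numpy as np
      from scipy import stats

      def ols_pvalues(X, y):
          """Two-sided t-test p-values for each OLS coefficient."""
          XtX_inv = np.linalg.inv(X.T @ X)
          beta = XtX_inv @ X.T @ y
          resid = y - X @ beta
          df = len(y) - X.shape[1]
          sigma2 = resid @ resid / df
          se = np.sqrt(sigma2 * np.diag(XtX_inv))
          return 2 * stats.t.sf(np.abs(beta / se), df)

      def backward_eliminate(X, y, alpha=0.05):
          """Return indices of the columns kept after backward elimination.
          Column 0 is assumed to be the intercept and is never dropped."""
          keep = list(range(X.shape[1]))
          while len(keep) > 1:
              p = ols_pvalues(X[:, keep], y)
              worst = int(np.argmax(p[1:])) + 1  # skip the intercept
              if p[worst] <= alpha:
                  break                          # everything left is significant
              keep.pop(worst)                    # drop the worst predictor; refit
          return keep

      # Illustrative data: x1 drives y, x2 is pure noise
      rng = np.random.default_rng(1)
      n = 80
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      X = np.column_stack([np.ones(n), x1, x2])
      y = 3.0 * x1 + rng.normal(size=n)
      print(backward_eliminate(X, y))  # typically keeps the intercept and x1
      ```

      Note this is one common stopping rule (all p-values below alpha); other implementations stop on adjusted R-squared or an information criterion instead.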

  • @BlackOps-Ent · 3 months ago

    Hi! Great presentation. Please lose the mouse-click sound. IMHO, it is more distracting than helpful.
    $0.02