Assumptions of Linear Regression | Explained in the Simplest Way

  • Published 10 Jan 2025

COMMENTS • 29

  • @goldenstatmodels8106
    @goldenstatmodels8106 5 years ago +6

    I should say this is the best video on UA-cam on assumptions.
    Thank you, sir.

  • @foundev
    @foundev 3 years ago

    So far the best video on linear regression assumptions.

  • @reethiashok9961
    @reethiashok9961 3 years ago

    I've watched so many videos on assumptions, but after watching this one I understand them better.

  • @rewindpraveen
    @rewindpraveen 10 months ago

    This is such a great video

  • @ranjanpal7217
    @ranjanpal7217 2 years ago

    Amazing Explanation

  • @prajjwaljaiswal3419
    @prajjwaljaiswal3419 2 years ago

    This is what I was looking for. Please suggest more videos on the combination of data visualization and statistics. Thank you so much.

  • @vijaypalmanit
    @vijaypalmanit 3 years ago

    Best video on the assumptions of linear regression, thanks...

  • @summitjoshi7901
    @summitjoshi7901 3 years ago +1

    Thanks bro, you made this topic much more understandable. Please make more videos like this, excellent job.

  • @pradeepaparameshwaran7445
    @pradeepaparameshwaran7445 4 years ago +1

    Awesome explanation bro

  • @arvindkumar-ug1zf
    @arvindkumar-ug1zf 5 years ago +1

    Thank you very much, your way of teaching is really fantastic. Please make a video on the gradient descent algorithm in a simple way. Thanks again.

  • @ravikiran1284
    @ravikiran1284 5 years ago +1

    Please do a video on hypothesis testing.
    I shall be thankful.

  • @ravikiran1284
    @ravikiran1284 5 years ago +1

    Very well explained

  • @ravikiran1284
    @ravikiran1284 5 years ago +2

    Bro, please make your next video on hypothesis testing.

  • @utsavaggarwal_ds
    @utsavaggarwal_ds 3 years ago

    excellent !!

  • @ganeshkharad
    @ganeshkharad 4 years ago +1

    @16:11 Why don't we want duplicate information? What happens if we do?
    Kindly reply.

    • @whenmathsmeetcoding1836
      @whenmathsmeetcoding1836  4 years ago

      When you check the coefficients of your OLS fit, you will find that it has given high values to both of the independent variables that are highly correlated with each other, and a very low coefficient to the other independent variable that contains relevant information but has slightly lower correlation compared to the multicollinear variables. Multicollinearity is a problem because it undermines the statistical significance of an independent variable.
      Hope this has helped you out; if not, please let me know and I will try to explain it another way.
      Thanks for giving so much attention to the video...
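
      A minimal sketch of the effect described above (not from the video; all data and variable names are illustrative), using the variance inflation factor to flag near-duplicate predictors:

```python
# Sketch: detecting multicollinearity with the variance inflation factor (VIF).
# Synthetic data: x2 is a near-duplicate of x1, x3 is independent.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # almost duplicate information
x3 = rng.normal(size=n)                    # independent predictor

X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF of column j: 1 / (1 - R^2) from regressing it on the other columns."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])  # add intercept
    beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ beta
    r2 = 1 - resid.var() / X[:, j].var()
    return 1.0 / (1.0 - r2)

for j in range(3):
    print(f"VIF(x{j+1}) = {vif(X, j):.1f}")
# x1 and x2 get very large VIFs (duplicate information); x3 stays near 1.
```

A common rule of thumb is to investigate any predictor whose VIF exceeds 5 or 10.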

  • @zulfiquarshaikh5921
    @zulfiquarshaikh5921 3 years ago

    What is the effect of heteroskedasticity on regression?

  • @ravikiran1284
    @ravikiran1284 5 years ago +2

    Why don't you make videos on ML algorithms like:
    1. Logistic regression
    2. Naive Bayes
    3. Decision trees
    4. Random forest
    5. Boosting
    6. PCA
    7. k-means
    8. KNN
    9. Time series
    And topics like regularization and activation functions?

  • @chiragagrawal7104
    @chiragagrawal7104 4 years ago +1

    I didn't understand why we have to calculate the autocorrelation.

    • @whenmathsmeetcoding1836
      @whenmathsmeetcoding1836  4 years ago +1

      Autocorrelation reveals whether the error terms have some hidden pattern.
      For an even more detailed mathematical explanation, please review the link below:
      www.homepages.ucl.ac.uk/~uctpsc0/Teaching/GR03/Heter&Autocorr.pdf
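
      A small sketch of this check (not from the video; the data is synthetic and purely illustrative), using the Durbin-Watson statistic on two sets of residuals:

```python
# Sketch: checking residuals for autocorrelation with the Durbin-Watson statistic.
# DW near 2 suggests no autocorrelation; well below 2 suggests a positive pattern.
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson statistic: sum of squared successive differences over sum of squares."""
    diff = np.diff(resid)
    return np.sum(diff**2) / np.sum(resid**2)

rng = np.random.default_rng(1)
white = rng.normal(size=500)          # independent errors (assumption satisfied)
ar1 = np.zeros(500)                   # positively autocorrelated errors (violation)
for t in range(1, 500):
    ar1[t] = 0.8 * ar1[t - 1] + white[t]

print(durbin_watson(white))  # close to 2
print(durbin_watson(ar1))    # well below 2
```

In practice you would run this on the residuals of your fitted regression, not on raw data.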

  • @dev5289
    @dev5289 3 years ago

    Please provide the Python code for all of these.

  • @prajjwaljaiswal3419
    @prajjwaljaiswal3419 2 years ago

    Here is a start:
    m = 0.1  # initial slope
    c = 0.1  # initial intercept
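
    Assuming the m and c above are meant as initial slope and intercept for gradient descent on simple linear regression, a minimal sketch (synthetic data, illustrative learning rate and iteration count, not the video's code):

```python
# Sketch: fitting y = m*x + c by gradient descent on the mean squared error.
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 5.0 + rng.normal(scale=0.5, size=100)  # true m = 3, c = 5

m, c = 0.1, 0.1   # initial guesses, as in the comment above
lr = 0.01         # learning rate (illustrative choice)
n = len(x)

for _ in range(5000):
    y_hat = m * x + c
    # Gradients of the mean squared error with respect to m and c
    dm = (-2 / n) * np.sum(x * (y - y_hat))
    dc = (-2 / n) * np.sum(y - y_hat)
    m -= lr * dm
    c -= lr * dc

print(m, c)  # should approach the true values 3 and 5
```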