What are the main Assumptions of Linear Regression? | Top 5 Assumptions of Linear Regression

  • Published 22 Jul 2024
  • In this video, we discuss the assumptions of linear regression in detail. We first cover all the assumptions in theory and then write Python code to check each one. We'll explore the key assumptions that underlie linear regression.
    🧑‍💻Code - github.com/campusx-official/l...
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at : learnwith.campusx.in
    ============================
    📱 Grow with us:
    CampusX on LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!
    💭Share your thoughts, experiences, or questions in the comments below. I love hearing from you!
    ✨ Hashtags✨
    #LinearRegression #StatisticsExplained #DataScience101
    ⌚Time Stamps⌚
    00:00 - Intro
    00:32 - Main Assumptions of Linear Regression
    01:57 - Linear Relationship
    04:25 - Multicollinearity
    09:56 - Normal Residual
    13:14 - Homoscedasticity
    15:36 - No Autocorrelation of Error
    17:25 - Outro
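    For reference, below is a minimal sketch (not the notebook from the GitHub link above) of how these five checks are commonly run in Python; the dataset is synthetic and the column names and thresholds in the comments are illustrative assumptions.

```python
# A minimal sketch (not the video's exact code) of checking the five assumptions.
# The dataset is synthetic and the column names are made up for illustration.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy import stats
from sklearn.linear_model import LinearRegression
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(42)
X = pd.DataFrame(rng.normal(size=(500, 3)), columns=["x1", "x2", "x3"])
y = 3 * X["x1"] - 2 * X["x2"] + 0.5 * X["x3"] + rng.normal(scale=1.0, size=500)

model = LinearRegression().fit(X, y)
y_pred = model.predict(X)
residuals = y - y_pred

# 1) Linear relationship: each feature plotted against the target should look roughly linear.
pd.plotting.scatter_matrix(pd.concat([X, y.rename("y")], axis=1), figsize=(8, 8))

# 2) No multicollinearity: VIF close to 1 is good; values above ~5-10 flag collinearity.
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
print(vif)

# 3) Normality of residuals: the Q-Q plot should hug the diagonal.
plt.figure()
stats.probplot(residuals, dist="norm", plot=plt)

# 4) Homoscedasticity: residuals vs. predictions should show no funnel shape.
plt.figure()
plt.scatter(y_pred, residuals)
plt.axhline(0, color="red")

# 5) No autocorrelation of errors: Durbin-Watson close to 2 means none.
print(durbin_watson(residuals))
plt.show()
```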

COMMENTS • 97

  • @msgupta07
    @msgupta07 1 year ago +15

    Timeline of
    the assumptions of linear regression
    0) 00:51 Introduction
    1) 01:58 Linear relationship (between all the independent and dependent features)
    2) 04:25 No multicollinearity (between independent features)
    3) 11:37 Normality of residuals (Distribution of residuals should be normal)
    4) 13:15 Homoscedasticity (residuals vs. predicted values should not show any pattern)
    5) 15:32 No autocorrelation of residuals

  • @tanjulgohar5
    @tanjulgohar5 2 years ago +6

    Sir, no one can explain better than you ❤️

  • @laxminarayangaidhane7116
    @laxminarayangaidhane7116 2 years ago +18

    Sir, if possible please make a video on the AUC-ROC curve... and thank you for making this video

  • @kunikakhobragade6953
    @kunikakhobragade6953 2 years ago +2

    Sir, you taught this really well... after watching a lot of videos I hadn't found one like this... now I finally understand the concept, thanks to you... ty

  • @rajshekharrakshit9058
    @rajshekharrakshit9058 2 years ago +1

    This is what content should be. I hope you will give a deep understanding of other topics too

  • @prateeksrivas89
    @prateeksrivas89 8 months ago

    Very helpful video for people who grasp a concept from the fundamentals. Very intuitive, with a practical implementation.

  • @ishuraj7407
    @ishuraj7407 1 year ago

    The only video on YT that explains the assumptions of an algorithm. Thank you so much sir, this video was a great help. Sir, can you please make videos like this for other algorithms too?

  • @biswasshubendu4
    @biswasshubendu4 2 years ago

    ON THE POINT!!!!! VERY IMPORTANT INFORMATION REGARDING INTERVIEWS

  • @anujsinghkushwah2712
    @anujsinghkushwah2712 1 year ago

    Thank you, brother,
    the most "to the point" and most clearly explained video on YouTube.

  • @jashneetkaur3176
    @jashneetkaur3176 1 year ago

    Your videos are very well explained. Thank you so much, Sir, for sharing the knowledge. You are the best teacher ever

  • @shreepalpawar9437
    @shreepalpawar9437 2 years ago +1

    Sir, your teaching skills are awesome ❤️ It was very helpful for me, thank you 💐🎉

  • @nidhisingh9303
    @nidhisingh9303 3 months ago

    One of the best and quickest videos I have seen

  • @deepakalur5603
    @deepakalur5603 1 month ago

    This question was asked in a Turing data scientist interview. Sir, thank you so much.

  • @vikaskadam9842
    @vikaskadam9842 2 years ago +1

    Great explanation sir, simply illustrated by example

  • @sachin2725
    @sachin2725 1 year ago

    Dude, you are really a genius...... excellent explanation

  • @arfapathan1832
    @arfapathan1832 10 months ago

    You simplified the concept.. Thank youuuuu

  • @sourabhagarwal4852
    @sourabhagarwal4852 2 years ago

    Good Video on Assumptions of Linear Regression🙂

  • @lothalopolis
    @lothalopolis 1 year ago +3

    1:44 During train_test_split, the rows of the data are randomly reordered (unless you set a parameter not to shuffle, which is not set here). Because of this, the residuals at 16:08 will always show no autocorrelation even if there is some, as the order is jumbled up.
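    (As a side note on this comment, a minimal sketch with made-up, strongly autocorrelated errors is shown below: with train_test_split(shuffle=False) the Durbin-Watson statistic drops well below 2, while the default shuffle=True pushes it back toward 2 and hides the pattern.)

```python
# A sketch of this comment's point, on made-up synthetic data with strongly
# autocorrelated errors: the default shuffle=True in train_test_split jumbles the
# row order and hides the autocorrelation, while shuffle=False preserves it.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 2))
noise = np.zeros(n)
for t in range(1, n):
    noise[t] = 0.9 * noise[t - 1] + rng.normal()   # AR(1) errors
y = X @ np.array([2.0, -1.0]) + noise

for shuffle in (True, False):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=shuffle)
    resid = y_te - LinearRegression().fit(X_tr, y_tr).predict(X_te)
    # Durbin-Watson ~2 means no autocorrelation; values near 0 mean strong positive
    # autocorrelation. Shuffling pushes the statistic toward 2 even for these errors.
    print(f"shuffle={shuffle}: Durbin-Watson = {durbin_watson(resid):.2f}")
```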

  • @unitedpakistan8516
    @unitedpakistan8516 3 months ago

    Thank You Sir, the way you make us understand is really great... Love From Pakistan 💖

  • @abhaykumaramanofficial
    @abhaykumaramanofficial 2 years ago

    Thank you for the simple and great explanation

  • @nilkantgudpale1959
    @nilkantgudpale1959 5 months ago

    Thank you, you clearly explained the concepts

  • @NishaSharma-se1js
    @NishaSharma-se1js 10 months ago +1

    Thank you for this wonderful information🎉

  • @shalinigoud802
    @shalinigoud802 1 year ago

    Thanks for the clear explanation, it was quite informative

  • @ParthivShah
    @ParthivShah 3 months ago +1

    Amazing Knowledge Sir.

  • @ge5850
    @ge5850 1 year ago

    It's a very, very good lecture sir, thanks a lot

  • @619vijay
    @619vijay 2 months ago

    Thank you. Very helpful

  • @rachitsingh4913
    @rachitsingh4913 2 years ago +4

    Hello Sir, as always this video is amazing, no doubt about that. But in this video you only explained how to check the assumptions. What if an assumption does not hold, then how do we tackle it? What are the processes to transform the data so that it satisfies all the assumptions? Please make a video on that and explain it.. Thank you Sir

  • @learnwithajmal8829
    @learnwithajmal8829 3 months ago

    Sir, the video is very good. Sir, we need a video on what remedies we can apply in Python when an assumption is not satisfied

  • @datamatrix20
    @datamatrix20 2 years ago +5

    Please make a video on how to overcome each assumption if it is violated

  • @shivarajnavalba5042
    @shivarajnavalba5042 1 year ago

    great explanation... thank you! 😇

  • @srkandekar
    @srkandekar 1 year ago

    Thank you Sir.
    I will reference this content.

  • @monalishasahu1276
    @monalishasahu1276 1 year ago

    Very well sir, thank you so much 😊

  • @balrajprajesh6473
    @balrajprajesh6473 2 years ago

    Best teacher ever!

  • @sumjakar
    @sumjakar 2 years ago

    Nice video sir
    Thank You So Much

  • @ParthivShah
    @ParthivShah 3 months ago +1

    Thank You Sir.

  • @divyanshusharma4576
    @divyanshusharma4576 5 months ago

    Hi Nitish, there is one more assumption: the response variable should be normally distributed. Please explain; that's why we use GLMs

  • @namanmodi7536
    @namanmodi7536 2 years ago

    deep learning video sir!

  • @Keep_Laughfing
    @Keep_Laughfing 2 years ago +1

    I was asked the same question.
    Can you explain all the assumptions of all the algorithms??
    Please sir, that would be very helpful for us. 🙏🙏🙏🙏

  • @yagnikposhiya7019
    @yagnikposhiya7019 2 years ago

    Great explanation.. But can you please make a video with a detailed explanation of autocorrelation and homoscedasticity?.. Thank you

  • @diwakargupta0
    @diwakargupta0 1 year ago

    Sir, for autocorrelation of residuals, if we sort the data then it will also follow some pattern. This plot depends on the order of the input, and we can pass the input in any order.
    Btw great video. Thanks

  • @shahilgourisaria2336
    @shahilgourisaria2336 1 year ago

    Very nice explanation sir

  • @harshithasuri663
    @harshithasuri663 1 year ago

    Conceptually, what does autocorrelation of residuals represent? You explained nicely why there should not be correlation b/w independent variables, but I didn't understand the significance of the no-autocorrelation assumption for residuals

  • @ashvinibhuskade6250
    @ashvinibhuskade6250 2 years ago

    Nice video sir.. is the no-autocorrelation assumption only for linear regression, or is it applicable to other algorithms too?

  • @DataScienceWithAkesh
    @DataScienceWithAkesh 6 months ago

    Sir, I am facing a bimodal residual issue or problem, I don't know what to call it. Even my teacher didn't help me with it. Can you give some pointers or anything?

  • @pankajbhatt8315
    @pankajbhatt8315 2 years ago

    Nice explanation

  • @xploramit
    @xploramit 8 months ago +1

    Hi sir,
    The 1st assumption of linear regression is that the equation should be linear in the parameters, and there is no restriction on how x and y are related. But you showed in your video that if there is a non-linear relationship between x and y then the assumption doesn't hold, which I think is not right.

  • @jyotsanagour850
    @jyotsanagour850 1 year ago

    Well Explained

  • @because2022
    @because2022 5 months ago

    Very nice explanation❤

  • @anirbansen9285
    @anirbansen9285 1 year ago

    Excellent content Sir, but I have a doubt: the residuals should be bell-shaped, so how do they not have any autocorrelation?

  • @nikhilbansal855
    @nikhilbansal855 8 months ago

    In both assumption 3 (normal residuals) and assumption 5 (no autocorrelation) we are plotting residuals. How come assumption 3 says they are normally distributed but assumption 5 says there is no relation?

  • @ShubhamVerma-wf3vc
    @ShubhamVerma-wf3vc 2 years ago

    Thanks, Jitu bhaiya.

  • @reshubathla8138
    @reshubathla8138 1 year ago

    Very nice ...

  • @shadiyapp5552
    @shadiyapp5552 1 year ago

    Thank you sir ♥️

  • @ashutoshthokare2127
    @ashutoshthokare2127 5 months ago

    Thank you sir

  • @pratikghodke7983
    @pratikghodke7983 2 years ago

    good one sir

  • @nikhilgupta4859
    @nikhilgupta4859 3 months ago

    Sir, you explained how to check linearity, but you didn't explain what to do if it is non-linear

  • @ishandandekar1808
    @ishandandekar1808 2 years ago

    Sir, please keep making deep learning videos for the 100 Days of ML playlist

  • @arpanpal9860
    @arpanpal9860 9 months ago

    Thank you sir❤❤

  • @debatradas1597
    @debatradas1597 8 months ago

    thank you so much sir

  • @sumansamantaray4886
    @sumansamantaray4886 2 years ago +1

    Sir, you had said the NLP playlist would be finished by January! Now June is about to end 😞

  • @navtojsingh
    @navtojsingh 4 months ago

    bravo!

  • @gauravsharma-sd2mg
    @gauravsharma-sd2mg 2 years ago

    Awesome 👏

  • @viral_video_ayana
    @viral_video_ayana 2 years ago

    Thank you sir 🙇

  • @Compact18
    @Compact18 1 year ago

    What if these assumptions get violated?

  • @debjitsarkar2651
    @debjitsarkar2651 2 years ago

    Sir, how can I join your online 6-month ML & AI course? Please reply sir. Thank you🙏🏻

  • @rafibasha4145
    @rafibasha4145 2 years ago +1

    Thanks bro

  • @pranjalmeshram3961
    @pranjalmeshram3961 9 months ago

    Isn't the linearity assumption in the sense that the model should be linear in the parameters, not the variables? That is to say, the assumption is fine with non-linearity in the X variables as long as the model is linear in the coefficients (β) of X.

  • @thethreemusketeers4500
    @thethreemusketeers4500 2 years ago

    Sir, please complete the Deep Learning and NLP playlists.

  • @rahulaher3874
    @rahulaher3874 1 year ago

    Thank you sir, if you had given the GitHub link for this it would have saved us time...

  • @Vipulghadi
    @Vipulghadi 2 years ago +1

    thanks sir

  • @shrinathjagtap6703
    @shrinathjagtap6703 1 year ago

    Make a video on
    what to do if these assumptions get violated

  • @iftikhar58
    @iftikhar58 1 year ago

    Thanks man

  • @krishnabhadke6161
    @krishnabhadke6161 2 months ago

    Nicely explained

  • @vijaylaxmilendale3399
    @vijaylaxmilendale3399 1 year ago

    1. Linear relationship between input and output
    2. No multicollinearity
    3. Normality of residuals
    4. Homoscedasticity
    5. No autocorrelation in residuals

  • @SaranRavali
    @SaranRavali 2 months ago

    It would have been a better video if the reasons behind these assumptions were well explained. The reasons behind normal residuals, homoscedasticity, and no autocorrelation of errors are not explained, nor is how these assumptions impact the model. Thanks for explaining the meanings of these errors with examples.

  • @ajaychinni3148
    @ajaychinni3148 6 months ago +1

    The only missing thing was the "why": why do we need these assumptions for linear regression? You only explained it for multicollinearity; it would have been perfect if explained for all.

  • @harshmankodiya9397
    @harshmankodiya9397 2 years ago

    Hello there.
    As you said, these are the assumptions in LR, and a candidate who is not aware of them is judged on that. But the thing is, where can one read about such concepts? Can you please suggest some books with solid ML fundamentals, as there is a lot of ambiguity about concepts in ML books and not every book talks in depth about these algorithms.

    • @campusx-official
      @campusx-official  2 years ago

      www.amazon.in/Elements-Statistical-Learning-Prediction-Statistics-ebook/dp/B00475AS2E

  • @vivekpawar1854
    @vivekpawar1854 2 years ago

    Sir, how do we handle multicollinearity??? Should we drop one of the columns???

  • @jams6279
    @jams6279 1 year ago

    🎉🎉🎉

  • @DarkLord79799
    @DarkLord79799 1 year ago

    nice

  • @rajatchauhan4410
    @rajatchauhan4410 23 days ago

    but why these assumptions??

  • @ParasProgramming123
    @ParasProgramming123 2 years ago

    Can you make a tutorial on deep learning?

    • @campusx-official
      @campusx-official  2 years ago

      100 Days of Deep Learning: ua-cam.com/play/PLKnIA16_RmvYuZauWaPlRTC54KxSNLtNn.html

    • @ParasProgramming123
      @ParasProgramming123 2 years ago

      @@campusx-official thank you sir.
      Sir, is a MacBook Air M1 good for work such as ML and DL? I am watching your 100 Days of Machine Learning and have reached day 3, since it has only been 3 days since I started this new journey

  • @tanmaythaker2905
    @tanmaythaker2905 1 year ago

    Linear relationship
    No multicollinearity
    Normality of residuals
    Errors should have constant variance
    No autocorrelation of errors

  • @vishnupsharma50
    @vishnupsharma50 1 month ago

    You should actually correct yourself. The linearity assumption is never about a straight line... it is about being linear in the estimated parameters. y = k·x² is also linear regression. Please correct it.
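    (A tiny sketch of this "linear in parameters" point, on made-up data, is shown below: y = k·x² can be fit with ordinary LinearRegression by regressing y on the transformed feature x², since the model is still linear in the coefficient k.)

```python
# Sketch on synthetic data: y = k * x**2 is still linear regression, because the
# model is linear in the coefficient k; we simply regress y on the feature x**2.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=300)
y = 4.0 * x**2 + rng.normal(scale=0.5, size=300)        # true k = 4

model = LinearRegression().fit((x**2).reshape(-1, 1), y)
print(model.coef_, model.intercept_)                    # coef_ should be close to 4.0
```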

  • @ninderjoshi7384
    @ninderjoshi7384 2 years ago

    I would suggest you explain why linear regression assumes normal residuals, homoscedasticity, and no correlation between residuals / independent variables.
    That will help make your channel different from others, because it will help your audience understand the concepts better.
    The things you have explained, anyone can explain; only a handful of people explain the "why".

    • @arun5351
      @arun5351 2 years ago

      You can explain the reasoning behind it. Others can also chip in. Nitish can correct our understanding if there are any gaps.

  • @Dyslexic_Neuron
    @Dyslexic_Neuron 1 year ago

    Wasted 19 minutes. You should explain the reason for having these assumptions

  • @pratiknaikwade95
    @pratiknaikwade95 11 months ago

    If the result comes out between 1 and 5, is there "multicollinearity or not"???????🤨🙄

  • @ritujawale10
    @ritujawale10 2 years ago

    Thank you sir... 👍

  • @teenagepanda8972
    @teenagepanda8972 2 years ago

    Thank you sir

  • @partharora6023
    @partharora6023 2 years ago

    amazing sir