Linear Regression, Clearly Explained!!!

  • Published 25 Nov 2024

COMMENTS • 1.6K

  • @statquest
    @statquest  2 роки тому +43

    Correction:
    25:39 I should have (Pfit - Pmean) instead of the other way around.
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

  • @udaychopra4802
    @udaychopra4802 5 років тому +952

    I have seen almost all the playlists.
    BAM!
    I have told my many friends about this channel.
    Double BAM!!
    I come back again and again to revise my concepts.
    Triple BAMM!!!
    There will be generations who are gonna come back and learn here. More power to you. (so many BAMS)

    • @udaychopra4802
      @udaychopra4802 5 років тому +54

      we talk in terms of BAMs now. little BAM.

    • @statquest
      @statquest  5 років тому +82

      This is awesome! So many BAMS!!! :) Even a little BAM. :)

    • @ah2522
      @ah2522 4 роки тому +11

      @@statquest hooray!

    • @statquest
      @statquest  4 роки тому +22

      @@ah2522 Yes! Hooray!!! :)

    • @vaibhavdani4470
      @vaibhavdani4470 4 роки тому +10

      @@statquest BAM and Hooray are on the tip of my tongue nowadays!!

  • @ycao6
    @ycao6 4 роки тому +663

    Every time I watch your videos, I feel like you're probably the only person who doesn't want to use math to confuse people or destroy their confidence. I appreciate that.

    • @statquest
      @statquest  4 роки тому +26

      Thanks! :)

    • @Glock_50
      @Glock_50 2 роки тому +3

      You are awesome at explaining!

    • @armangilang9654
      @armangilang9654 2 роки тому +6

      Yeah, you broke the legs of the math mystery

    • @amanjang7531
      @amanjang7531 2 роки тому +7

      30 years old, and for the first time I am not afraid of mathematics, in large part thanks to finding this channel in graduate school.

    • @tinacole1450
      @tinacole1450 Рік тому

      ha ha.. great comment. Josh does a great job at making complex stuff seem okay to understand.

  • @AnuragTK
    @AnuragTK 4 роки тому +725

    God bless the friendly folks in the genetics department at the University of North Carolina at Chapel Hill

  • @lazarus8011
    @lazarus8011 3 роки тому +80

    The clarity of your speech, with no unnecessary details in your visuals, makes learning these methods so much easier. Thank you.

    • @statquest
      @statquest  3 роки тому +4

      Glad it was helpful!

  • @tonywang7933
    @tonywang7933 Рік тому +73

    Thank you. Throughout years of undergrad courses, I have never seen an instructor willing to take the time to explain all these basic yet extremely important concepts.

  • @zulucharlie5244
    @zulucharlie5244 Рік тому +81

    I've published high-impact scientific papers in several different life science fields, and this is my go-to source to refresh my memory for all basic statistics knowledge. Most universities should abandon all attempts at teaching statistics to students in lectures, vector them to this channel, and then provide one-on-one tutoring to help students get across the line and master the material. Seriously great stuff here; you have made the world a better place - thank you.

    • @statquest
      @statquest  Рік тому +9

      Wow! Thank you very much! :)

  • @iarasouza8029
    @iarasouza8029 4 роки тому +94

    This is by far the best resource for someone who does not have any background in math (or even for someone who does) to learn about key concepts of statistics. Thank you!

    • @statquest
      @statquest  4 роки тому +2

      Wow, thanks!

    • @confidenceinterval4849
      @confidenceinterval4849 2 роки тому

      You can learn more about linear regression in simple words here ua-cam.com/video/QQ9MAnX963M/v-deo.html

  • @amitlavon1647
    @amitlavon1647 3 роки тому +4

    I am a 33 year old computer science PhD student and I still need statistics explained to me as if I'm 6. Thank you for that.

  • @bu8291
    @bu8291 7 років тому +182

    More people should discover your channel, you make it very easy to understand statistics :)

  • @rohitamalnerkar2152
    @rohitamalnerkar2152 5 років тому +21

    Mr. Josh
    My friends and I thoroughly watch all of your videos related to data science, and we all think you are doing an amazing job. The amount of effort you put in makes our ideas clearer and clearer. I can't thank you enough for your generous work. Keep it up. Love from India.

    • @statquest
      @statquest  5 років тому +2

      Thank you very much!!! I really appreciate your feedback.

  • @statquest
    @statquest  5 років тому +120

    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @ramnewton
      @ramnewton 4 роки тому

      Had the same doubt; this clears things up, thanks! Maybe consider pinning this so it shows up on top always?

    • @sureshkm
      @sureshkm 4 роки тому

      @StatQuest I think, we need the same correction in the study guide 'statquest_linear_regression_study_guide' Page 5 of 6 as well.

    • @ssriv47
      @ssriv47 3 роки тому

      The StatQuest website is not working. It is showing a database error. Please look into it.

    • @statquest
      @statquest  3 роки тому +1

      @@ssriv47 Thanks! The site is back up.

    • @jbtechcon7434
      @jbtechcon7434 3 роки тому +2

      Is that a typo at 19:25? You said the numerators are the same, but they're not. Are you missing parentheses?

  • @monicakulkarni3319
    @monicakulkarni3319 4 роки тому +95

    You are statistically one of my favourite UA-cam channels!

    • @statquest
      @statquest  4 роки тому +5

      I love it! Thank you. :)

    • @felixwhise4165
      @felixwhise4165 4 роки тому

      Which are the others? :)

    • @TheEverydayAnalyst
      @TheEverydayAnalyst 4 роки тому +2

      @@felixwhise4165 For me Brandon Foltz is another awesome channel!

  • @kakusniper
    @kakusniper 7 років тому +17

    I never understood statistics in my life till now, you make it look so easy. Thanks again.

  • @ericstefko4852
    @ericstefko4852 9 місяців тому +4

    Without a doubt this is the best explanation of Linear Regression to be found anywhere.

    • @statquest
      @statquest  9 місяців тому

      Thank you very much!

  • @ali57555
    @ali57555 2 роки тому +7

    I really like how you explain things in the most simple way. I thought I knew what Adj R^2 is, but watching your video made me really understand it. Thank you a lot. Please don't stop making our world simpler to understand.

    • @statquest
      @statquest  2 роки тому

      Thank you very much! :)

  • @prayaglehana7187
    @prayaglehana7187 6 років тому +59

    I have gone through all of his videos... and believe me, he is the one who will make your concepts clear!

    • @statquest
      @statquest  6 років тому +5

      Thanks! :)

    • @malamals
      @malamals 5 років тому +1

      couldn't have agreed more on this.

  • @dominicj7977
    @dominicj7977 6 років тому +10

    This is the most intuitive explanation I have seen of these terms. Although I have known them for a couple of years, I have never seen or read a similar kind of explanation anywhere else.

  • @ssriv47
    @ssriv47 3 роки тому +1

    I had been searching for a few days for how to learn stats for ML.
    BAM! I stumbled upon this channel and everything is so bright and clear, like a pleasant sunny day.
    It seems we only have to learn how to pronounce BAM correctly, and everything else is transferred to our minds between a BAM and a double BAM.
    God bless Josh Starmer!

  • @swathisukumar2219
    @swathisukumar2219 4 роки тому +22

    Hi Josh, your videos are so explicit that learning statistics seems less challenging. I can't express my gratitude to you in mere words; nonetheless, "Thank you so much". The concepts are so clearly explained. You are such an amazing teacher, and with your generosity and passion, you are helping so many of us out here across the world. May God bless you always. More power and love to you from India.

  • @Privacy-LOST
    @Privacy-LOST 5 років тому +3

    Mind blown at 7:37. The most obvious things are sometimes the most striking.
    I had these concepts hammered into me at college with barely any context, and now it all makes sense thanks to you, sir.

    • @statquest
      @statquest  5 років тому +3

      Hooray!! Yes, that's the big moment for me, too. I realized that R^2 made much more sense if I could "see it". Once I made that realization, I knew I had to make this video. I'm glad you appreciate it too!

  • @silentsuicide4544
    @silentsuicide4544 2 роки тому +3

    I'm learning machine learning concepts, and besides courses and books I try to come back to this channel on a daily basis since I find the content very straightforward. For example, I find statistics and probability the hardest to learn, and I really struggle with feature selection because I don't get all those statistical tests for determining significance; I couldn't understand why, for example, we use the F-distribution to determine the R^2 p-value. And here we go, plain and simple: it took a few minutes and I get the concept, whereas learning from Google can sometimes be really challenging. Thanks again!

    • @statquest
      @statquest  2 роки тому +1

      Glad the video was helpful! :)

  • @italiandarthvader
    @italiandarthvader 2 місяці тому +1

    Thanks to this video I can say I built my first (working) linear regression model FROM SCRATCH. What was most amazing was that I got approximately the same answer as the one in the book I'm working through. I'm really ecstatic.
    I know it might sound childish to build something this simple, but I found the mathematics fun and enjoyable during the process. You really have a talent for explaining; thank you.

  • @rezat.ashtiani1338
    @rezat.ashtiani1338 2 роки тому +6

    The most clear and simple explanation of multiple linear regression I’ve ever seen, and I’ve seen MANY! Thank you for this video!

  • @AI_ML_DL_LLM
    @AI_ML_DL_LLM 3 роки тому +3

    Even with a PhD in engineering from a top-100 university in the world and lots of publications, this is the first time the F-test clicked for me. Thank you, Josh.

  • @AB-iq9tl
    @AB-iq9tl 2 роки тому +3

    Your channel is THE BEST at explaining core concepts of stats and ML. I would have been lost without it. Your way of explaining is just amazing; it's basically what I like: short, crisp, clear and precise. So many people are indebted to you for making these videos. Please keep up the good work, and I can't THANK YOU enough for your hard work.

    • @statquest
      @statquest  2 роки тому

      Thank you very much! :)

  • @ikechang1967
    @ikechang1967 3 роки тому +2

    You folks are tearing down the barriers that have prevented me from understanding statistics for so many years. Thank you and thank you again!!

  • @navjotsingh2251
    @navjotsingh2251 3 роки тому +3

    Honestly, you are saving people all around the world! Currently doing a data analysis course in the UK and these videos are helping me understand statistics better!!!

    • @statquest
      @statquest  3 роки тому +1

      BAM! And good luck with your course.

  • @PunmasterSTP
    @PunmasterSTP 8 місяців тому +1

    I've used regression lines relatively frequently while tutoring students, but it's always good to review the logic and concepts. Thanks for another awesome vid!

    • @statquest
      @statquest  8 місяців тому +1

      Glad it was helpful!

  • @sudhakarmadakaful
    @sudhakarmadakaful 4 роки тому +19

    Linear Regression is well-explained in this video. Love the humor in between too!!!!

  • @nguyentuongvy2110
    @nguyentuongvy2110 2 роки тому +1

    This is the first time I have watched an intelligible video about R. Thank you very much for sharing. Love you and wish you good health.

  • @vijaypawar5003
    @vijaypawar5003 4 роки тому +3

    Your videos are so simple and easy to grasp with all those conceptual explanations and graphics you use. They make learning fun. Thanks to UA-cam for getting this to us and a bigger thanks to you for sharing it with the world.

  • @ZowieFaerieYeah
    @ZowieFaerieYeah 3 роки тому +2

    These videos are the best! Thank you soooooo much! Big smiles from a recent psych graduate revising stats concepts for a job interview.

    • @statquest
      @statquest  3 роки тому +1

      Good luck with the interview! BAM! :)

  • @pietronickl8779
    @pietronickl8779 3 роки тому +12

    better than several stats courses I took 🙃 thanks a lot for your work! really appreciate the focus on getting an intuition for the formulae. Too many courses just state them without communicating that..

    • @statquest
      @statquest  3 роки тому +2

      Thank you very much! :)

    • @confidenceinterval4849
      @confidenceinterval4849 2 роки тому

      You can learn more about intuition of linear regression here ua-cam.com/video/QQ9MAnX963M/v-deo.html

  • @ramnewton
    @ramnewton 4 роки тому +1

    Using size and transparency of ink to represent the 3rd dimension is a brilliant idea. I have struggled with visualizing 3D graphs, this is by far one of the best ways I've seen. Kudos!

  • @derrickc.8486
    @derrickc.8486 Рік тому +3

    Wow, I took Statistics I and II at the doctoral level and struggled through Stats I, but II was a bit easier. I wish my Statistics professor had made it as interesting, meaningful, and easy to learn and understand as you did in this video. I have a much better understanding of what linear regression is about and I've a renewed interest in learning more about the hows and whys of statistics so I can apply it in my profession. Thank you!

    • @statquest
      @statquest  Рік тому

      Hooray! I'm glad my video was helpful! :)

  • @alecvan7143
    @alecvan7143 5 років тому +3

    Incredibly clear, watching your videos not only do I learn, but I clear up things I hadn't very well understood in the past!

    • @statquest
      @statquest  5 років тому +1

      You're totally binging! And that's awesome. TRIPLE BAM!!! I'm so glad that you like my videos and they are helping you understand statistics etc.

    • @alecvan7143
      @alecvan7143 5 років тому +1

      @@statquest Definitely! I'm even watching them now to study for my upcoming actuarial exam! :D

    • @statquest
      @statquest  5 років тому +1

      @@alecvan7143 Good luck and let me know how the exam goes.

    • @alecvan7143
      @alecvan7143 5 років тому +1

      @@statquest Will do, thank you :)

    • @alecvan7143
      @alecvan7143 4 роки тому +1

      @@statquest Exam went quite well thanks to your videos ;).
      Cheers!

  • @sharvarisosale2780
    @sharvarisosale2780 3 роки тому +7

    I love this channel and your explanations are amazing! It's so lucid and I actually feel like making an effort into understanding things. Your channel is one of the main reasons I have a job now. I've been recommending your channel and courses to every person I know!
    Triple BAM! 🌟

    • @statquest
      @statquest  3 роки тому +1

      Awesome and thank you!

  • @MrMarias1234
    @MrMarias1234 Рік тому +1

    Thanks! It is the most precise explanation I've ever seen. I don't know why professors love to make this topic so complicated and full of fancy notation.

  • @wz5445
    @wz5445 4 роки тому +6

    Thank you so much ;-; this is so helpful!!! Our school literally threw us course notes without much of an explanation, and this video is so useful!!! Triple kudos!!!

  • @monicasethuraman5772
    @monicasethuraman5772 Рік тому +1

    I like the way you review the previous concept before heading to the next one. It is really helpful. Your voice goes very well, increasing my interest to listen and learn more. Thank you so much for sharing your knowledge😇😇

  • @yondchang
    @yondchang 6 років тому +7

    Great video. Just note that R-squared can actually be negative; it happens when you fit the data worse than a horizontal line.
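
A minimal sketch of the point above, assuming R^2 is computed as (SS(mean) - SS(fit)) / SS(mean), i.e. 1 - SS(fit)/SS(mean). The observed values and the "bad fit" predictions below are made up so that the fit is worse than the horizontal line at the mean, which drives R^2 negative:

```python
# Illustration (made-up numbers): R^2 goes negative when a model's predictions
# are worse than the horizontal line at the mean of the observed data.
y_observed = [1.0, 2.0, 3.0, 4.0, 5.0]
y_bad_fit  = [5.0, 1.0, 4.0, 2.0, 6.0]   # predictions worse than just predicting the mean

y_mean = sum(y_observed) / len(y_observed)

ss_mean = sum((y - y_mean) ** 2 for y in y_observed)                # SS(mean)
ss_fit  = sum((y - p) ** 2 for y, p in zip(y_observed, y_bad_fit))  # SS(fit)

r_squared = (ss_mean - ss_fit) / ss_mean   # same as 1 - SS(fit)/SS(mean)
print(r_squared)                           # negative here, because SS(fit) > SS(mean)
```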

  • @cuberootme
    @cuberootme Рік тому +2

    You are the best instructor. I will watch all your videos. It really helps me in ML. Hope I will find a good job!

  • @ayush612
    @ayush612 6 років тому +3

    Thank you so much for the videos you upload... I am actually wasting my time trying to describe in words how helpful and intuitive your videos are in understanding pillar concepts of statistics... Thank you so much!

    • @statquest
      @statquest  6 років тому +1

      It's really nice to hear how much you like the videos! I'm glad I can help. :)

    • @ayush612
      @ayush612 6 років тому +1

      Sorry josh, can't stop myself from appreciating you.. I guess I qualify to be called a fan of your channel and your voice.. Will definitely buy your tracks from my first income ! 😎

    • @statquest
      @statquest  6 років тому

      Thanks!!! :)

  • @Relative0
    @Relative0 6 років тому +2

    Because the variation (sum of squares) around the line that takes mouse weight into account is less than the variation around the average mouse size alone: "some of the variation in mouse size is explained by taking weight into account". DOUBLE BAM! Thanks a lot Josh, you really help make sense of this stuff.
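
For reference, the quantity the comment above is describing is R^2; written in the SS(mean)/SS(fit) notation used elsewhere in these comments, it is the fraction of the variation around the mean that the fitted line explains:

```latex
R^2 \;=\; \frac{SS(\text{mean}) - SS(\text{fit})}{SS(\text{mean})}
```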

  • @anshulnarela9888
    @anshulnarela9888 Рік тому +4

    More power to you Josh!! Nobody explains these basic concepts like you do. Thanks a lot!

  • @arishali9248
    @arishali9248 4 роки тому +2

    This is amazing !!! Thank you !! I have been using linear regression for years and this is the first time I understood R squared so clearly! BAM !

  • @matthewdong9368
    @matthewdong9368 5 років тому +4

    This is literally the BEST materials on stats fundamentals I could ever find, thank you so so much, Josh!!! I wish I watched your vids way earlier!

    • @statquest
      @statquest  5 років тому +1

      Hooray!!!! Thank you so much! I'm so glad to hear that you like my videos. :)

  • @autogenes
    @autogenes 3 роки тому +2

    I love teachers who can induce interest in their subjects

  • @rutu1011
    @rutu1011 2 роки тому +1

    All these exams I took and tortured myself over... where were you? This is the first and only channel that makes stats look interesting!!!

  • @hazel6330
    @hazel6330 3 роки тому +3

    This is the best explanation that I have ever heard. Truly appreciate your efforts .. true genius👍👍👍

  •  6 років тому +2

    Your videos are absolutely fantastic. You present the information and terminology in a consistent and easy to follow way, and you never make any big leaps between steps (which is unfortunately too common with math teachers who just assume intuition which may not be immediately obvious to everyone).

    • @statquest
      @statquest  6 років тому +1

      Thanks so much! Yes, I try to keep the leaps as small as possible and I try to make sure I can "draw" the math. My rule is this - if I can draw a picture of what an equation is doing, then I understand it.

    •  6 років тому +1

      Great way to teach; humans are very visual creatures... I am wondering if there should be a bracket around ss(mean)-ss(fit) in the formula for F @ 19:15? Also looking forward to your video on degrees of freedom!

    • @statquest
      @statquest  6 років тому

      Yep, a bracket would make that clearer. You divide [ss(mean) - ss(fit)] by the degrees of freedom. Thanks for pointing that out! :)

  • @aarondorcas
    @aarondorcas 5 років тому +4

    You make statistics that easy to understand, thanks for your extensive work!

  • @malamals
    @malamals 5 років тому +1

    Your videos make it super easy to build intuition about what we are studying. I'm truly grateful for that. Please make videos on neural network concepts - they are much needed in a rapidly developing technological world.

    • @statquest
      @statquest  5 років тому +1

      My plan is to make a neural network video later in the fall.

  • @smdsouza8
    @smdsouza8 5 років тому +6

    I love your videos. It’s a true talent to be able to explain these concepts so clearly

    • @statquest
      @statquest  5 років тому

      Thank you so much! :)

  • @false_binary
    @false_binary Рік тому +1

    Holy crud StatQuest w/Josh Starmer, this vid just catapulted me into my first grad school course I started this pm! OLS is primary focus and this vid was the perfect primer for our assigned readings ahead. TY!!! 🤯

  • @shekharrudrabhatla8594
    @shekharrudrabhatla8594 6 років тому +14

    Awesome.. never looked at R-squared this way. Thank you :)

    • @statquest
      @statquest  6 років тому

      Hooray!! I'm glad you like the video :)

  • @elalaelasariuinjakarta7548
    @elalaelasariuinjakarta7548 Рік тому +1

    My background is in public health and I just started to learn PCR and Sequencing. I watch your video step-by-step.

  • @jamesshin4901
    @jamesshin4901 4 роки тому +3

    Love your presentation!! What a great job you have been doing to educate people and lessen the confusion!! Many thanks!!!

  • @adancastro2220
    @adancastro2220 3 роки тому +1

    I found that this Video explains about 60% of my current understanding of linear regression... Thank you friendly people ;)

  • @merryjoy48
    @merryjoy48 5 років тому +3

    Waiting eagerly for the video on degrees of freedom a topic that has eluded me for a very long time. Can't wait for the BAMS on that one.

  • @aureliendaviet5290
    @aureliendaviet5290 Рік тому +2

    Normally I don't write comments, but I really wanted to thank you because your videos are amazing! You do a tremendous job of explaining things that should be explained by professors (who are unfortunately sometimes bad at explaining them in a simple way). Thank you again (BAM !)

  • @imad_uddin
    @imad_uddin 3 роки тому +3

    What an amazing explanation of R squared. Can't appreciate it enough! Subscribed

  • @tinacole1450
    @tinacole1450 Рік тому +1

    Your videos are helping to explain what my instructor assumes we know. Thanks!

  • @Privacy-LOST
    @Privacy-LOST 5 років тому +5

    3:00 I've been doing this for 10+ years without having ever seen it that way. Brilliant.

  • @sweetmaths4213
    @sweetmaths4213 2 роки тому +1

    You're the first person I have found who makes stats make sense. Ty ❤

  • @sdfsdf5313
    @sdfsdf5313 4 роки тому +17

    Really thankful for all the mesmerizing videos. Thoroughly enjoyed the content (purchased a few study guides - quite helpful). I am still struggling to find a good video on the linear regression assumptions, with code (all 10 assumptions) and a full explanation of how to detect when the assumptions are violated and what to do next. Just a suggestion :)

    • @statquest
      @statquest  4 роки тому +3

      It's a good suggestion. I will put that on the to-do list. And thank you very much for supporting StatQuest!!! BAM!

  • @xondiego
    @xondiego 3 роки тому +2

    Geez, such a beautiful explanation. God bless you. None of the teachers I had ever explained the computation as SS(mean) minus SS(fit) over SS(mean), but that made heaps of sense... Lovely.

  • @ЕвгенияКазакова-ъ3д

    Oh, I wish I had more hours in a day to watch all of your videos 🤤
    I'm crying over how good they are! 🤩🤩

    • @statquest
      @statquest  3 роки тому +1

      Thank you so much 😀

  • @esperanzazagal7241
    @esperanzazagal7241 4 роки тому +1

    I just want to say 14:50 is a brilliant example of what matters to least squares :)
    thanks!

    • @statquest
      @statquest  4 роки тому

      Thank you very much! :)

  • @michaeld6854
    @michaeld6854 4 роки тому +4

    Explanation so simple, and more importantly fun - thanks!

  • @Brandon-oc8lr
    @Brandon-oc8lr 5 років тому +2

    Probably one of the most informative videos on these concepts I've ever seen. Thank you!

    • @statquest
      @statquest  5 років тому

      Hooray!! I'm so glad you like my video. :)

  • @meenasirishaallamsetty713
    @meenasirishaallamsetty713 5 років тому +4

    OMG!!! You are so helpful; before I saw your videos, all these topics didn't make sense. Oh, by the way, please bring the BAM back!!! Kind of liked it

    • @statquest
      @statquest  5 років тому +1

      Thank you very much! The BAM is back!!!! :)

  • @jamiyana4969
    @jamiyana4969 Рік тому +1

    Literally the best explanation you will ever find on UA-cam. Thanks a lot!

  • @joseangelmartinez308
    @joseangelmartinez308 3 роки тому +4

    How can someone explain so well?? HUGE BAM!

  • @edwardgrigoryan3982
    @edwardgrigoryan3982 2 роки тому +1

    This video is lim as BAM! approaches infinity. I found this to be an outstanding lecture that covers a huge cross section of important topics.

  • @kesavkumar7710
    @kesavkumar7710 4 роки тому +5

    I never miss the intro song of the video. This is the real BAM..!! The concepts are going to be clear anyway.

  • @somethingnotspecified
    @somethingnotspecified Рік тому +1

    Don't know how I wrote a statistics test that had a question about this without really understanding the intuition behind the whole thing. Still, I need some extra work on the F-measure to get it 100 % right. Hats off to you as always, Josh! Amazingly well done! :)

  • @salnaud8898
    @salnaud8898 5 років тому +3

    Thank you very much. Good on ya mate.
    I finally got the answers to the questions I had been looking into for a long time.
    Cheers,

  • @shubham2488
    @shubham2488 2 місяці тому

    Interesting!! At 7:39 I pondered two questions based on the two graphs:
    1. In the first graph, ignore the x-axis and imagine a scenario where only mouse sizes are observed.
    Q1. The only question we can ask here is: can we predict the next mouse size?
    Answer: Very intuitive. Take the average (here, the mean). Now we may be wrong by some amount, and we can quantify that using the variance.
    2. In the second graph, consider the case where your data has additional readings, the weights. The following can be asked:
    Q1. What is the next prediction for the weight and size readings?
    Answer: Again simple; just take the average on both axes. The variances and covariance give some performance guarantee. But this question is pathological.
    Q2. Given the weight, can we predict the size, or vice versa? Why does the fitted curve, which is best in some metric, lie roughly in the middle of the points?
    Answer: Theoretically speaking, treating the data as random variables, one can show that the answer lies in the expected value of the conditional distribution of Y given X whenever squared loss is considered.
    Typically in practice, we take a Gaussian assumption on Y given X, justified by the linear model: y = f(x) + Gaussian noise. The best f is then the one that maximizes the likelihood of y given x, which amounts to minimizing the squared loss. And thus, because of the Gaussian assumption, the fitted curve lies roughly in the middle of the points, as the errors can be both positive and negative.
    Further, the hope is that the conditional expectation is simply a straight line, i.e. a linear combination of x with certain coefficients, which is indeed true for jointly Gaussian rvs (X, Y).
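
A compact restatement of the argument in the comment above, using standard results rather than anything quoted from the video: under squared loss the best predictor is the conditional mean, maximizing the Gaussian likelihood is equivalent to least squares, and for jointly Gaussian (X, Y) the conditional mean is a straight line:

```latex
% Under squared loss, the best predictor of Y from X is the conditional mean:
f^*(x) \;=\; \operatorname*{arg\,min}_{f}\; \mathbb{E}\big[(Y - f(X))^2\big]
\quad\Longrightarrow\quad f^*(x) \;=\; \mathbb{E}[Y \mid X = x]

% Linear model with Gaussian noise:
y \;=\; \beta_0 + \beta_1 x + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma^2)

% Maximizing the likelihood of y given x is equivalent to least squares:
\hat{\beta} \;=\; \operatorname*{arg\,max}_{\beta}\; \prod_i \mathcal{N}\big(y_i \mid \beta_0 + \beta_1 x_i,\; \sigma^2\big)
\;=\; \operatorname*{arg\,min}_{\beta}\; \sum_i \big(y_i - \beta_0 - \beta_1 x_i\big)^2

% For jointly Gaussian (X, Y), the conditional mean is itself a straight line:
\mathbb{E}[Y \mid X = x] \;=\; \mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}\,(x - \mu_X)
```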

  • @Dynamite_mohit
    @Dynamite_mohit 4 роки тому +4

    This is my 7th+ time on this video;
    I just love to revise. :)
    In case anyone is wondering why we use the mean here:
    for a single variable, the best-fit line is the mean.
    Example: if someone asks the average height of your class,
    you calculate the mean, right?
    That's why the mean.
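
One way to state the point in the comment above (a standard fact, not quoted from the video): the mean is the constant c that minimizes the sum of squared residuals, so the least-squares "fit" for a single variable is the horizontal line at the mean:

```latex
\bar{y} \;=\; \operatorname*{arg\,min}_{c}\; \sum_{i=1}^{n} (y_i - c)^2
```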

  • @jingxinzhao878
    @jingxinzhao878 3 роки тому +1

    This is brilliant! I finally figured out how to explain linear regression to that XXX client without any math terms!

  • @seauxmean4074
    @seauxmean4074 5 років тому +3

    the opening theme is awesome

  • @nachiketpensalwar
    @nachiketpensalwar 10 місяців тому +1

    A lot better than other videos... it's easy to understand concepts this way. Thank you for the free content.

    • @statquest
      @statquest  10 місяців тому

      Glad it was helpful!

  • @TTortrix
    @TTortrix 4 роки тому +6

    I wish I had found this when I was in my statistics class. Then I would have been like "Bam!", I understand p-values, R^2, and SS.

  • @ankitjha3480
    @ankitjha3480 6 років тому

    I search for your videos whenever I get stuck on any statistics concept. I was searching for AIC & BIC videos as model validation techniques. I wish you would make more and more videos and cover as many concepts as possible. Thank you so much for your excellent videos - making life easy for many.

  • @jackignatev
    @jackignatev 5 років тому +3

    Infinity BAM!
    Your videos are awesome!

    • @statquest
      @statquest  5 років тому

      Thank you very much! :)

  • @yrrep27
    @yrrep27 2 роки тому +2

    "This is the frowny face of bad times" got a hearty laugh out of me.

  • @BrianRisk
    @BrianRisk 6 років тому +6

    “Womp womp. Here’s the frowny face of sad times.” CRACKED ME UP 😂

    • @statquest
      @statquest  6 років тому +4

      Here's the happy face! :)

  • @anan7b66
    @anan7b66 7 років тому

    Thank you so much for dissecting the terminology and highlighting the equations as you explain. It's a great way to understand and build the right intuition. The confusing usage of terminology and equations is the hardest and most time-consuming part, and it delays the learning process. This is the best mix of theory and equations I have come across so far. Cheers.

  • @nathanx.675
    @nathanx.675 4 роки тому +3

    Who’s watching this two days before their midterm?
    Baaaaaaam!

    • @statquest
      @statquest  4 роки тому

      Yes! It's that time of year. Good luck!!!! :)

  • @marwolaeth111
    @marwolaeth111 3 роки тому +1

    My goodness, StatQuest is exactly the approach I've been seeking for reinforcing my statistical knowledge! Not as shallow as DataCamp or other R courses ('The math behind this is beyond the scope of this course'). Not as crazy as math lectures. I am a humanities person. And Salman Khan… well, he is great but way too thorough for a busy man to follow along. I'll resort to him for linear algebra and calculus. Thanks, Josh! StatQuest is marvellous.

  • @xzhou6686
    @xzhou6686 6 років тому +7

    Hey Josh, thanks for the great videos! One quick question: at 19:13, where the equation for F turns up, should the numerator be (SS(mean)-SS(fit))/(pfit-pmean) or SS(mean)-SS(fit)/(pfit-pmean)? I wondered if there is a pair of brackets missing there.

    • @statquest
      @statquest  6 років тому +5

      You are correct, a pair of brackets would have made this clearer! Sorry for the confusion. It should be: [SS(mean)-SS(fit)] / (pfit-pmean)
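
Putting the exchange above together, the F statistic with the missing brackets added reads as below. Note that the denominator, SS(fit) over its degrees of freedom, follows the standard definition and is included here for completeness rather than quoted from the thread:

```latex
F \;=\; \frac{\big(SS(\text{mean}) - SS(\text{fit})\big) \,/\, \big(p_{\text{fit}} - p_{\text{mean}}\big)}
             {SS(\text{fit}) \,/\, \big(n - p_{\text{fit}}\big)}
```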

  • @elatior
    @elatior 6 років тому

    I wish I could have seen your video when I was in university. You have made it so clear and easy to understand. Thanks for making these videos!

  • @joerich10
    @joerich10 6 років тому +3

    Another fantastic vid. Just to clarify, the p value of the F statistic is to be interpreted as the probability that the R2 was obtained by random chance alone. If it is

  • @saicharangarrepalli9590
    @saicharangarrepalli9590 4 роки тому +2

    That was a really cool intro. That got me hooked. Then the quality of the lecture was top notch.

  • @statsmama
    @statsmama 4 роки тому +3

    When will the video on degrees of freedom be coming out? :D :D :D

    • @statquest
      @statquest  4 роки тому +1

      Good question. One day! :)

  • @oriol-borismonjofarre6114
    @oriol-borismonjofarre6114 2 роки тому +1

    I Love you man! I'm in my current position thanks to all your classes! you made my life waaaay better than it was! Thanks thanks thanks

  • @pi2019
    @pi2019 5 років тому +4

    Hi, I have 2 questions:
    1) When we are creating the histogram of different F-values, where are these random datasets coming from? Are they related to our original dataset in some way? I mean, are we comparing the same features (e.g. predicting mouse height from mouse weight)?
    2) I didn't get the part about the p-value being defined as the number of more extreme values divided by all values. In your lecture on p-values, a p-value was defined as the sum of the probability of the current event + anything with an equal probability + rarer events.

    • @statquest
      @statquest  5 років тому +1

      1) Those random datasets are related to the original dataset in that they have the same number of points, the same range along the y-axis, and the same domain along the x-axis. Other than that, they are just random values. The idea is to compare what you observed to noise. If the noise generated a lot of F-values that were larger than the F-value we got from the observed data, then we wouldn't think that the observed data is all that special (and thus, the observed data would have a large p-value).
      2) That's a little bit of a typo - I meant to say, "the number of randomly generated F-values equal to or more extreme divided by all of the values". The "division" is what turns the values into a probability. Does that make sense?

    • @pi2019
      @pi2019 5 років тому +2

      @@statquest Thank you! Can you please confirm whether my thought process is correct or not for the 2 points above:
      1) So these random datasets have the same range and domain but can be unrelated to our dataset in terms of the list of features. We use these datasets to create our histogram (and then the curve) to see how our observation fares. If there are a lot of F-values greater than ours in the curve, then our observation is not good?
      2) For the p-value we take the probability of the part of the distribution that is equal to or rarer than our point on the distribution, and this point is determined by our calculated F-value?

    • @statquest
      @statquest  5 років тому +1

      @@pi2019 You are 100% correct for point #1. For point #2 you are 99% correct. The only thing is that the F-distribution isn't symmetric (like the normal distribution) and F-values can only be positive. Thus, we don't have to look at values "less than" the observed F-value when calculating the p-value. Does that make sense?
      One thing I would like to make sure is clear is that the randomization in the video is only intended to illustrate how the p-value is calculated. In practice, we just refer to the F-distribution and don't do the randomization.

    • @pi2019
      @pi2019 5 років тому +1

      @@statquest Thank you it makes perfect sense!!
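
A minimal sketch of what the replies above describe, namely that in practice the p-value is read straight off the F-distribution rather than computed by randomization. This assumes scipy is available; the sums of squares, parameter counts, and sample size are made-up values for illustration only:

```python
from scipy.stats import f  # the F-distribution

# Made-up values for illustration only
ss_mean, ss_fit = 100.0, 40.0   # sum of squares around the mean / around the fitted line
p_mean, p_fit = 1, 2            # parameters in the mean-only model / the fitted model
n = 20                          # number of data points

df1 = p_fit - p_mean            # numerator degrees of freedom
df2 = n - p_fit                 # denominator degrees of freedom

F = ((ss_mean - ss_fit) / df1) / (ss_fit / df2)

# F-values can only be positive, so "more extreme" means larger:
# the p-value is the upper-tail area of the F-distribution beyond the observed F.
p_value = f.sf(F, df1, df2)
print(F, p_value)
```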

  • @naomyduarteg
    @naomyduarteg Рік тому +1

    You are the best! And you certainly got even better through the years (I just watched the "R^2 explained" video from 8 years ago).