Deriving the least squares estimators of the slope and intercept (simple linear regression)

  • Published Mar 21, 2019
  • I derive the least squares estimators of the slope and intercept in simple linear regression (using summation notation, and no matrices). I assume that the viewer has already been introduced to the linear regression model, but I do provide a brief review in the first few minutes. I assume that you have a basic knowledge of differential calculus, including the power rule and the chain rule.
    If you are already familiar with the problem, and you are just looking for help with the mathematics of the derivation, the derivation starts at 3:26.
    At the end of the video, I illustrate that sum (X_i - X bar)(Y_i - Y bar) = sum X_i(Y_i - Y bar) = sum Y_i(X_i - X bar), and that sum (X_i - X bar)^2 = sum X_i(X_i - X bar).
    There are, of course, a number of ways of expressing the formula for the slope estimator, and I make no attempt to list them all in this video.
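
For reference, here are the minimization problem and the resulting estimators the video derives, summarized in standard notation (the summary is mine, not part of the original description):

```latex
(\hat{\beta}_0, \hat{\beta}_1)
  = \arg\min_{\beta_0,\,\beta_1} \sum_{i=1}^{n} \left( Y_i - \beta_0 - \beta_1 X_i \right)^2,
\qquad
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2},
\qquad
\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}.
```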

COMMENTS • 195

  • @AnDr3s0
    @AnDr3s0 4 years ago +150

    Finally, someone who made it simple to understand! Thank you!

    • @victoriabrimm5014
      @victoriabrimm5014 3 years ago +3

      Right! I went through like a million videos trying to understand this one segment and this was the first to do it.

    • @agustinlawtaro
      @agustinlawtaro 3 years ago

      True.

    • @cmacompilation4649
      @cmacompilation4649 5 months ago

      Please, how is it possible to consider Beta a variable (when taking derivatives) and then consider Beta a constant (to take it out of the sum)???

    • @cleisonarmandomanriqueagui9176
      @cleisonarmandomanriqueagui9176 26 days ago

      Best video. I was looking for something like this.

  • @anuraghavsai5575
    @anuraghavsai5575 3 years ago +41

    This is an underrated video.

  • @sajibmannan
    @sajibmannan 3 years ago +36

    My god, you explained this so easily. It took me hours trying to understand this before watching this video but still couldn’t understand it properly. After watching this video, it's crystal-clear now. ❤️

    • @DHDH_DH
      @DHDH_DH 4 months ago

      Me too. I spent a whole morning figuring this out. He is a savior.

  • @amberxv4777
    @amberxv4777 2 years ago +6

    I don't usually comment on teaching videos. But this really deserves thanks for how clearly and simply you explained everything. The lecture I had at the university left much to be desired

  • @aaskyboi
    @aaskyboi 4 years ago +17

    I can't thank you enough for this brilliant explanation!

  • @augustinejunior3361
    @augustinejunior3361 2 years ago +2

    I never thought that I could understand simple linear regression using this approach. Thank you

  • @JFstriderXgaming
    @JFstriderXgaming 3 years ago +4

    My Physical Chemistry teacher spent ~1.5 hrs showing this derivation and I got completely lost. Watching your video, it's so clear now. Thank you for your phenomenal work.

  • @alimortadahoumani1134
    @alimortadahoumani1134 7 months ago +1

    Unbelievably perfect video, one of the best videos I have watched in the statistics field. So rare to find high-quality content in this field, idk why.

  • @monojitchatterjee3185
    @monojitchatterjee3185 4 years ago +11

    Absolutely beautiful derivation!
    Crystal clear!
    Thanks very much.

  • @user-jk2tp6gs2g
    @user-jk2tp6gs2g 2 years ago +5

    I have gone through tons of materials on this topic and they either skip the derivation process or go directly into some esoteric matrix arithmetic. This video explains everything I need to know. Thanks.

  • @nak6608
    @nak6608 2 years ago +4

    Phenomenal video. Thank you for taking the time to explain each step of the derivation, such as the sum rule for differentiation. Thank you for helping me learn.

  • @Murraythis
    @Murraythis 2 years ago

    Amazing video! Slight bumps where my own knowledge was patchy but you provided enough steps for me to work those gaps out.

  • @pkeric2626
    @pkeric2626 4 years ago +3

    Thanks so much, this was so easy to follow and comprehend!

  • @process6996
    @process6996 5 years ago +6

    Glad you're back!

    • @jbstatistics
      @jbstatistics  5 years ago +3

      Thanks! Glad to be back! Just recording and editing as I type this!

  • @yixuanliu8368
    @yixuanliu8368 1 year ago

    You have no idea how you saved my life. I was struggling so hard to find out why xi(xi - xbar) = (xi - xbar)^2 etc., and you are the first one I found who explained that.

  • @ghunter958
    @ghunter958 4 years ago +2

    Really, really good explanation!! Thank you!!

  • @danverzhao9912
    @danverzhao9912 2 years ago +6

    Thank you so much! This explanation is literally perfect, helped me so much!

    • @jbstatistics
      @jbstatistics  2 years ago +1

      Thanks for the kind words! I'm glad to be of help!

  • @TheProblembaer2
    @TheProblembaer2 1 year ago

    This is FANTASTIC. THANK YOU!

  • @muhammaddzakyrafliansyah3587

    You made it simpler than my lecturer does. Thank you!

  • @martijnbos9873
    @martijnbos9873 4 years ago

    Absolutely brilliant video!!! Thanks so much

  • @daniyal98
    @daniyal98 4 years ago +1

    One video on YouTube that actually explains something properly.

  • @TheMatthyssen
    @TheMatthyssen 7 months ago

    Thank you for actually explaining it, most videos are just like "hi, if you want to solve this, plug in this awesome formula and that's it, thank you for watching :)"

  • @valeriereid2337
    @valeriereid2337 1 year ago +1

    The best part of this video is finally figuring out where that "n" came from in the equation for beta-naught-hat. Thank you so very much for making this available.

  • @jingyiwang5113
    @jingyiwang5113 6 months ago

    Thank you so much for such a clear explanation! It helps me a lot in preparing for my upcoming final exam.

  • @tastypie2276
    @tastypie2276 3 years ago

    Thank you very much! Very clear and interesting explanation!

  • @asad_rez
    @asad_rez 2 years ago

    Thank you for explaining in such detail ❤️

  • @raitup00
    @raitup00 4 years ago

    Great explanation! Step by step...

  • @satyarath7723
    @satyarath7723 2 years ago

    Thanks a lot sir, I really got what I needed. 🙏🙏🙏🙏🙏🙏🙏🙏 There are no words to express appreciation for your efforts.

  • @aron4317
    @aron4317 2 months ago

    Beautiful video, good explanation

  • @eltuneyvazzade8845
    @eltuneyvazzade8845 1 year ago

    Very helpful video to understand. Many thanks!

  • @jackhasfun4752
    @jackhasfun4752 7 months ago

    Thank you so much, this video has cleared up all my confusion, cuz the book I'm reading just says 'by doing some simple calculus'.

  • @Zydres_Impaler
    @Zydres_Impaler 1 year ago

    thanks a lot for simplifying the derivation

  • @ericricky808
    @ericricky808 3 years ago

    This is talent. Thank you so much 😊

  • @AJ-et3vf
    @AJ-et3vf 2 years ago

    Awesome video sir! Thank you!

  • @mangaart3366
    @mangaart3366 3 years ago

    Exactly what I was looking for. Thank you so much!

    • @jbstatistics
      @jbstatistics  3 years ago +1

      You are very welcome!

    • @herodmoonga4799
      @herodmoonga4799 2 years ago

      Grateful, you are wonderful Sir, you have made me understand economics.

  • @arjoy4942
    @arjoy4942 5 years ago

    Great sir, very helpful!

  • @DHDH_DH
    @DHDH_DH 4 months ago

    You are awesome! I am not a native speaker and am still struggling with the master's program courses in the US, but your instruction is so helpful. I appreciate your great help.

    • @jbstatistics
      @jbstatistics  4 months ago

      Thanks! I'm happy to be of help!

  • @preethidonthu2433
    @preethidonthu2433 1 year ago

    Thank u soooo much for explaining this! You made my day.

  • @wuttinanlerkmangkorn7009
    @wuttinanlerkmangkorn7009 2 years ago

    Best explanation, thank you so much

  • @soryegetun529
    @soryegetun529 2 years ago +1

    Finally, I've understood this bloody thing. Thank u sooooo much m8.

  • @loden5677
    @loden5677 7 months ago

    This was really helpful thanks!

  • @louism.4980
    @louism.4980 5 months ago +1

    This is incredible, thank you so much! :)

  • @CaptainCalculus
    @CaptainCalculus 3 years ago

    Excellent video!

  • @frequencyspectra1
    @frequencyspectra1 1 year ago

    Excellent video

  • @renata8938
    @renata8938 3 years ago

    Thank you. This is very clear

  • @fatemehmaroufkhani7104
    @fatemehmaroufkhani7104 2 years ago

    Thank you very much. This video helped me a lot.

  • @joansome951
    @joansome951 9 months ago

    Great job! Thank you sir!

  • @omarzodiac
    @omarzodiac 5 years ago

    Yes! New stuff 👍🏼👍🏼

  • @mpandesyambayi5898
    @mpandesyambayi5898 2 years ago

    Greatly explained

  • @olympusexothermic1752
    @olympusexothermic1752 1 year ago

    Thank you so much, am really enjoying and understanding what you're teaching.

  • @tezike
    @tezike 3 years ago +7

    holy hell I wish you were my econometrics professor. mine is useless

  • @tnorton314
    @tnorton314 3 years ago

    Thanks, so helpful!

  • @mohe4ever514
    @mohe4ever514 2 years ago

    Very well explained

  • @sajibmannan
    @sajibmannan 3 years ago +2

    Question: 6:24 Why and how is beta zero hat multiplied by n? Does n mean the sample size? What's the reasoning behind n appearing alongside beta zero hat?
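
A note on this question (my own sketch, not an official reply): yes, n is the sample size. When the partial derivative is set to zero, the sum runs over all n observations, and beta zero hat carries no i subscript, so summing it over i simply repeats it n times:

```latex
\sum_{i=1}^{n} \hat{\beta}_0
  = \underbrace{\hat{\beta}_0 + \hat{\beta}_0 + \cdots + \hat{\beta}_0}_{n \text{ terms}}
  = n \hat{\beta}_0 .
```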

  • @SiddharthPrabhu1983
    @SiddharthPrabhu1983 5 years ago

    Thanks Dr. Balka! Is it computationally expensive to estimate the parameters in this manner for models with many independent variables or very large datasets? Is that the reason why iterative methods such as gradient descent are sometimes employed instead?

    • @Tusharchitrakar
      @Tusharchitrakar 5 months ago

      The matrix inversion operation in OLS is computationally expensive, hence numerical methods like gradient descent are useful.
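
To make that comparison concrete, here is a minimal sketch of gradient descent on the squared-residual objective for the one-predictor case, checked against the closed-form estimators from the video. The function name, learning rate, and iteration count are illustrative choices, not anything from the video:

```python
import numpy as np

def ols_gradient_descent(x, y, lr=0.01, n_iter=10_000):
    """Fit y ~ b0 + b1*x by gradient descent on the mean squared residual."""
    b0, b1 = 0.0, 0.0
    for _ in range(n_iter):
        resid = y - (b0 + b1 * x)
        b0 += lr * 2 * resid.mean()        # -d(MSE)/d(b0) = 2 * mean(resid)
        b1 += lr * 2 * (x * resid).mean()  # -d(MSE)/d(b1) = 2 * mean(x * resid)
    return b0, b1

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x + rng.normal(size=100)

# Closed-form least squares estimators derived in the video
b1_hat = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
b0_hat = y.mean() - b1_hat * x.mean()

print(ols_gradient_descent(x, y))  # should closely match (b0_hat, b1_hat)
print((b0_hat, b1_hat))
```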

  • @promasterliuss
    @promasterliuss 1 year ago

    good explanation!

  • @girlstrends8204
    @girlstrends8204 5 months ago

    you make it sooo easy

  • @poojachhachhiya1574
    @poojachhachhiya1574 3 years ago

    Wow. This is great. Thank u so much.

  • @umedina98
    @umedina98 2 years ago

    Really nice derivation!

  • @deborahsakazhila4068
    @deborahsakazhila4068 4 years ago +8

    Hi, do you have a video on deriving coefficients in multiple regression?

    • @mattstats399
      @mattstats399 11 months ago

      That is a fun derivation using linear algebra and calculus. First step is the same here which is taking the first derivative and setting it equal to zero. The book "The Elements of Statistical Learning" has a good proof. I'd say one needs a calc 1 and linear algebra background first though.
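
For readers of this thread, the multiple-regression analogue of that same "set the first derivative to zero" step, sketched in matrix form (a standard result, assuming X'X is invertible):

```latex
\min_{\beta}\; (y - X\beta)^{\top} (y - X\beta)
\;\Longrightarrow\;
X^{\top} X \hat{\beta} = X^{\top} y
\;\Longrightarrow\;
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y .
```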

  • @natachajouonang2710
    @natachajouonang2710 2 years ago +1

    Amazing and super helpful video! Extremely simple and easy to follow! But please, quick question: Why did you switch the Xi and Xbar at 7:51? This drastically changes the ending solution.

    • @malolanbalaji98
      @malolanbalaji98 2 years ago

      When he removes the inner parentheses, the term Xi becomes negative and Xbar becomes positive. So when you multiply by (-ve) Beta, the signs of both terms reverse.
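
In symbols, my reading of that step: distributing the minus sign simply swaps the order of a subtraction, since -(a - b) = b - a, so for example

```latex
-\hat{\beta}_1 \sum_{i=1}^{n} (\bar{X} - X_i)
  \;=\; \hat{\beta}_1 \sum_{i=1}^{n} (X_i - \bar{X}) .
```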

  • @stretch8390
    @stretch8390 1 year ago

    Easiest subscribe of my life.

  • @sasirekhamsvl9504
    @sasirekhamsvl9504 3 years ago

    Great video Brother

  • @MudahnyaFizik
    @MudahnyaFizik 3 years ago

    Thanks so much!

  • @user-fn6lz8pp6j
    @user-fn6lz8pp6j 4 years ago

    THANK GOD, FINALLY SOMEONE TRIED TO DERIVE THE FORMULA;
    INSANE THAT NEARLY ALL OTHER RESOURCES OMIT THIS SHIT

  • @fisher4651
    @fisher4651 2 years ago

    LEGEND, HAVE TO SAY YOU ARE BETTER THAN A PROFESSOR

  • @robin-bird
    @robin-bird 2 years ago

    definitely the best video out there on this topic;
    makes me wonder why it's not the top recommendation / search result,
    maybe because of the title

    • @jbstatistics
      @jbstatistics  2 years ago

      Thanks! How about "Finding the formulas for the slope and intercept the EASY WAY! (When I got to step 8, my jaw DROPPED!)" :)

    • @robin-bird
      @robin-bird 2 years ago

      @@jbstatistics no idea, I'm not good at making up titles. Maybe something like this?
      Simple Linear Regression | Derivation

    • @jbstatistics
      @jbstatistics  2 years ago

      @@robin-bird I was just joking, but thanks for the input :)

    • @robin-bird
      @robin-bird 2 years ago

      @@jbstatistics I was wondering - thanks for clearing that up ^^

  • @AnuarPhysics
    @AnuarPhysics 2 years ago

    Nice trick! Adding an intelligent zero huh?
    Thanks for this video!

  • @mautasimsiddiqui8618
    @mautasimsiddiqui8618 1 year ago

    Thank you so much.

  • @LucyMburu605
    @LucyMburu605 7 days ago

    Thanks a lot, it really helped.

  • @ethanchung1541
    @ethanchung1541 3 years ago

    u are a life saver

  • @zumichetiapator
    @zumichetiapator 26 days ago

    finally got my doubt resolved.😊

  • @jesabumaras4758
    @jesabumaras4758 2 years ago

    Thank you very much for the clear explanation ❤

  • @dalkeiththomas9352
    @dalkeiththomas9352 2 years ago

    You sir are amazing.

  • @anonymousAI-pr2wq
    @anonymousAI-pr2wq 2 years ago

    Thanks for the video. Just wondering why x and y can be considered constants when differentiating with respect to B0 or B1? Is it because of partial differentiation, or because X and Y are known numbers?

    • @yaweli2968
      @yaweli2968 1 year ago

      I think you are asking why B0 hat and B1 hat should be considered constants for the sample. They are clearly not going to change for that sample.
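
A sketch of the point at issue in this thread, as I read it: it is partial differentiation with the data treated as fixed, observed numbers. The betas are the unknowns being solved for, while every X_i and Y_i is a constant during the differentiation, e.g.

```latex
\frac{\partial}{\partial \beta_0} \sum_{i=1}^{n} \left( Y_i - \beta_0 - \beta_1 X_i \right)^2
  = -2 \sum_{i=1}^{n} \left( Y_i - \beta_0 - \beta_1 X_i \right),
```

and the minimizing values are then written with hats as beta zero hat and beta one hat.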

  • @atomu9663
    @atomu9663 1 year ago

    Good video Thanks!

  • @virgilhawkins4575
    @virgilhawkins4575 4 years ago

    Awesome!

  • @hellothere9298
    @hellothere9298 5 months ago +1

    Thank you very much sir !

  • @ezozakamolova885
    @ezozakamolova885 7 months ago

    Thank you so much

  • @cmacompilation4649
    @cmacompilation4649 5 months ago

    Please, how is it possible to consider Beta a variable (when taking derivatives) and then consider Beta a constant (to take it out of the sum)???

  • @Sulegibongtoo176
    @Sulegibongtoo176 2 years ago

    Wow many university lecturers can’t explain it this well!

  • @menghongpor2667
    @menghongpor2667 4 years ago +2

    At the 10:52 mark, how can we switch the roles of X sub i and Y sub i? Could you help explain how this happens?

    • @harveywilliams7013
      @harveywilliams7013 3 years ago +1

      In the first step, we choose to expand (Xi - Xbar) but we could have chosen to expand (Yi - Ybar) and it would follow a similar route.

  • @egnos
    @egnos 4 years ago +3

    2:24, where did you discuss why it makes sense to minimize the sum of squared residuals?

    • @aakarshan01
      @aakarshan01 3 years ago +2

      Squaring makes it more sensitive to bigger errors, and it's differentiable at all points. The mod (absolute value) function is not differentiable at the point where it pivots up.

    • @SuperYtc1
      @SuperYtc1 3 years ago

      @@aakarshan01 But why not to the power 1.5? Why not to the power 4? Why is it exactly the power 2?

    • @aakarshan01
      @aakarshan01 3 years ago +1

      @@SuperYtc1 You can, but there is no need to. Differentiability is already achieved with the square. Why compute a bigger number that could cause problems, since the 4th power of a decimal number is more likely to break a float's minimum limit than a square? But in theory, you can.
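
One more reason worth noting here (my own addition to the thread): squaring keeps the first-order conditions linear in the parameters, which is exactly why closed-form formulas exist, while the absolute value is not differentiable at zero and higher even powers lead to nonlinear equations with no simple closed form:

```latex
\frac{d}{dr}\, r^2 = 2r \quad \text{(defined everywhere, linear in } r\text{)},
\qquad
\frac{d}{dr}\, |r| = \operatorname{sign}(r) \quad \text{(undefined at } r = 0\text{)} .
```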

  • @singsongsou1865
    @singsongsou1865 3 years ago +1

    Thanks a lot!!!

  • @anangelsdiaries
    @anangelsdiaries 1 month ago

    The result represents the minimum, since the function we were minimizing is convex and opens upwards, so the only way for a critical value to exist is for it to be a minimum.
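
That second-order check sketched out, writing S(beta_0, beta_1) for the sum of squared residuals:

```latex
H = \begin{pmatrix}
      \dfrac{\partial^2 S}{\partial \beta_0^2} & \dfrac{\partial^2 S}{\partial \beta_0 \, \partial \beta_1} \\[8pt]
      \dfrac{\partial^2 S}{\partial \beta_1 \, \partial \beta_0} & \dfrac{\partial^2 S}{\partial \beta_1^2}
    \end{pmatrix}
  = \begin{pmatrix}
      2n & 2\sum_i X_i \\[4pt]
      2\sum_i X_i & 2\sum_i X_i^2
    \end{pmatrix},
\qquad
\det H = 4n \sum_{i=1}^{n} (X_i - \bar{X})^2 \;\ge\; 0,
```

so the Hessian is positive definite whenever the X_i are not all equal, and the critical point is indeed a minimum.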

  • @Harshavardhan-no7ri
    @Harshavardhan-no7ri 2 years ago +1

    Did we consider beta naught hat and beta one hat as variables for the partial derivatives in this problem? Usually they are constants in a straight line, right? Why did we take them as variables? If anyone knows the answer, please do reply.

  • @Harchit23
    @Harchit23 3 years ago

    this is great

  • @katerinagk2681
    @katerinagk2681 1 year ago

    THANK YOU

  • @badrftouh9322
    @badrftouh9322 1 year ago

    Nailed it

  • @reinier5355
    @reinier5355 3 years ago

    Great video. My summary just gave the formula with the text 'just remember this', hate that.

  • @hawasadiki5292
    @hawasadiki5292 2 months ago

    Thank you

  • @DHDH_DH
    @DHDH_DH 4 months ago

    At 10:43, can you please tell me why we can easily swap the roles of x and y? Is it based on any properties or formulas?

    • @jbstatistics
      @jbstatistics  4 months ago

      The initial term is sum (X_i - X bar)(Y_i - Y bar). While in the video I split up the (Y_i - Y bar) term, leaving (X_i - X bar) intact, I could have just as easily split up the (X_i - X bar) term instead, and using the same steps as I did in the video, end up with sum (Y_i - Y bar)X_i.
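
That reply spelled out in symbols (my rendering of it), using the fact that deviations from a mean sum to zero:

```latex
\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})
  = \sum_{i=1}^{n} (X_i - \bar{X})\, Y_i \;-\; \bar{Y} \underbrace{\sum_{i=1}^{n} (X_i - \bar{X})}_{=\,0}
  = \sum_{i=1}^{n} Y_i (X_i - \bar{X}),
```

and splitting the (X_i - X bar) factor instead gives sum X_i(Y_i - Y bar) by the same steps.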

  • @ciasuryani5422
    @ciasuryani5422 4 years ago

    Thanks so much

  • @IS-xm8bc
    @IS-xm8bc 3 years ago

    Why do we take the sum of squared residuals, and not just the residuals, and take their partial derivatives wrt alpha and beta?

  • @mobileentertainment212
    @mobileentertainment212 1 year ago

    In which video does he discuss why we use squared residuals?

  • @IvanJacobsvelocity
    @IvanJacobsvelocity 2 years ago

    thank you 100^100 times

  • @kaanaltug455
    @kaanaltug455 4 years ago +3

    Wait, at 6:45, how do you divide the summations by n and get (y) itself? y-sub-i isn't a constant, so how does the division even work?

    • @kaanaltug455
      @kaanaltug455 4 years ago +3

      OOHHH NOOOO, IT'S THE MEAN. NOW I GOT IT. JUST GONNA LEAVE THIS HERE JUST TO SHOW HOW STUPID I CAN BE

    • @noopyx3414
      @noopyx3414 4 years ago

      I'm new to this formula and the big data field, what mathematical knowledge should I learn prior to watching this video? Thank you

    • @kaanaltug455
      @kaanaltug455 4 years ago +1

      @@noopyx3414 Oh man, you're lucky. I just logged in to YouTube. Prior to this formula, I'd really suggest you check out Brandon Foltz's Statistics 101: Linear Regression series. There he explains what this formula, and other stuff regarding the topic, are all about.

    • @noopyx3414
      @noopyx3414 4 years ago

      @@kaanaltug455 Thank you very much!
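
For anyone else puzzled by the 6:45 step discussed above, here it is in symbols (my reading): dividing a sum of n terms by n is exactly what a sample mean is, so dividing the first normal equation through by n gives

```latex
\sum_{i=1}^{n} Y_i = n \hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} X_i
\;\;\xrightarrow{\;\div\, n\;}\;\;
\bar{Y} = \hat{\beta}_0 + \hat{\beta}_1 \bar{X}
\;\;\Longrightarrow\;\;
\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X} .
```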

  • @MahnoorNaveed-
    @MahnoorNaveed- 2 years ago

    thank u so much.

  • @1UniverseGames
    @1UniverseGames 3 years ago

    How can we find the intercept and slope values B0 and B1?

  • @andreawhitson7451
    @andreawhitson7451 9 months ago

    He says that he describes why we square it elsewhere. Does anyone know which video that is?