The Chain Rule

  • Published Dec 26, 2024

COMMENTS • 504

  • @statquest
    @statquest  2 years ago +24

    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
    Corrections:
    13:05 When the residual is negative, the pink circle should be on the left side of the y-axis. And when the residual is positive, the pink circle should be on the right side.

    • @adolfocarrillo248
      @adolfocarrillo248 2 years ago +1

      This is an amazing explanation!!! Thanks

    • @statquest
      @statquest  2 years ago

      @@adolfocarrillo248 Thank you very much! :)

    • @anushreesaran
      @anushreesaran 2 years ago +2

      Got my copy of The StatQuest Illustrated Guide to Machine Learning today! Quadruple BAM!!!!

    • @statquest
      @statquest  2 years ago +1

      @@anushreesaran Hooray! Thank you very much! :)

    • @tremaineification
      @tremaineification 1 year ago

      @@statquest what do you mean by the last term not containing the intercept?

  • @moetasimrady8876
    @moetasimrady8876 1 year ago +76

    I started my machine learning journey a month ago and stumbled onto a myriad of resources that explain linear models using the RSS function, but no one, and I mean no one, managed to explain it with as much clarity and elegance as you have in just under 20 minutes. You, sir, are a boon to the world.

  • @diyanair158
    @diyanair158 2 years ago +21

    Did I just UNDERSTAND the CHAIN RULE? SURREAL, thank you!

  • @revolution77N
    @revolution77N 4 years ago +206

    Man, you are amazing. You should get a Nobel prize!

  • @pperez1224
    @pperez1224 3 years ago +122

    Amazing pedagogy. Slow pace, short sentences, visuals consistent with the talk. Great job ;-) Thanks

  • @Ruostesieni
    @Ruostesieni 2 years ago +11

    As someone who is doing medical research and needs to learn little-by-little about statistics, neural networks and machine learning as my project goes on, your channel is a literal life-saver! It has been so hard to keep my M.D. stuff together with my PhD research all the while learning statistics, programming, neural network structures and machine learning. Trying to arrange courses from my uni to fit in with all the other stuff is simply impossible, so I've been left to my own devices to find a way to gain knowledge about said subjects, and your channel has done just that.
    Your teaching is great and down-to-earth enough to be easily grasped, but you also delve deep into the subject after the initial baby steps, so the person watching isn't just left with "nice to know"-infobits. Love it! Keep up the great work!

  • @amanrastogi603
    @amanrastogi603 9 months ago +3

    I am a biostatistician, proclaiming that you are a really good teacher.

    • @statquest
      @statquest  9 months ago

      Thank you very much!

  • @ayushbatra2471
    @ayushbatra2471 1 year ago +17

    Over the past three years, I have been studying neural networks and delving into the world of coding. However, despite my best efforts, I struggled to grasp the true essence of this complex subject. That is until I stumbled upon your enlightening video.
    I cannot emphasize enough how much your video has helped me. It has shed light on the intricate aspects of neural networks, allowing me to comprehend the subject matter with greater clarity and depth. The way you presented the material was truly remarkable, and it made a profound impact on my understanding.
    What astounds me even more is that you provide such valuable content for free. It is a testament to your passion for educating and empowering individuals like myself. Your dedication to spreading knowledge and fostering learning is truly commendable.
    Thanks to your channel, I have been able to unlock the true essence of mathematics and its relationship with neural networks. The confidence and clarity I now have in this subject are invaluable to my personal and professional growth.
    Your video has been a game-changer for me, and I am grateful beyond words. Please continue your fantastic work and know that your efforts are deeply appreciated.

    • @statquest
      @statquest  1 year ago +7

      Thank you very much! BAM! :)

  • @dc_amp8843
    @dc_amp8843 1 year ago +6

    The way you link equations to visuals and show how everything is working along with the math at the SAME time. Beautiful, elegant, easy to follow.

  • @jbboyne
    @jbboyne 2 years ago +4

    Your videos are fantastic, even without the sound effects... but the sound effects really bring them over the top.

    • @statquest
      @statquest  2 years ago

      Thank you! And thank you so much for supporting StatQuest!!! BAM! :)

  • @TheGreatFilterPodcast
    @TheGreatFilterPodcast 3 years ago +35

    BY FAR the best explanation of the chain rule I have ever seen (and trust me - I've seen A LOT)
    You, sir, just earned yourself yet another well-deserved subscriber.
    F'n brilliant!!!

    • @statquest
      @statquest  3 years ago +1

      Thank you very much!!! BAM! :)

  • @RahulVerma-Jordan
    @RahulVerma-Jordan 7 months ago +3

    If I had watched your videos during college, my career trajectory would have been totally different. BIG BAM!!!!

  • @vnaveenkumar982
    @vnaveenkumar982 3 years ago +3

    Take my word, Josh: you are the best teacher of statistics on the internet........ and the chain rule used to drive me crazy.......... until your explanation.

  • @markbordelon1601
    @markbordelon1601 7 months ago +1

    We could have had a "dreaded terminology alert": "decomposition of functions". But even without it, this was a perfect explanation of the chain rule, with great practical examples. Bravo, Josh!

  • @user-ul2mw6fu2e
    @user-ul2mw6fu2e 2 years ago +4

    Best chain rule explanation I have ever seen.

  • @putririzqiyah6294
    @putririzqiyah6294 3 years ago +8

    This channel was suggested by my professor, and I always watch the videos while doing machine learning tasks. Big thanks to you :D

  • @Alchemist10241
    @Alchemist10241 3 years ago +19

    Awesome!! None of my math teachers in high school or college ever explained to me WHY the chain rule works this way, but you explained it with a very simple example. I'm certain that from now on I'll never forget the chain rule formula. Thanks a million. 👌✔

  • @ivanferreira5042
    @ivanferreira5042 4 years ago +69

    Nobody:
    The demon in my room at 3am: 7:56

  • @mr.shroom4280
    @mr.shroom4280 1 year ago +3

    Bro, yours is the only tutorial that actually helped me grasp this concept, thank you so much.

    • @statquest
      @statquest  1 year ago +1

      Glad it helped!

    • @mr.shroom4280
      @mr.shroom4280 1 year ago

      @@statquest I know this isn't related to this video, I just want you to help me because you replied to this comment.
      With gradient descent, how am I supposed to get the derivative for each weight and bias in a loss function dynamically? Surely for networks with more than 100 neurons there is a way; I know there is, I just don't know it.
      When I am calculating the derivative for one variable in the loss function, to optimize it, I get some overly complicated function, but I see papers on it and it isn't complicated. (A sketch follows at the end of this thread.)

    • @statquest
      @statquest  1 year ago

      @@mr.shroom4280 See: ua-cam.com/video/IN2XmBhILt4/v-deo.html ua-cam.com/video/iyn2zdALii8/v-deo.html and ua-cam.com/video/GKZoOHXGcLo/v-deo.html

    • @mr.shroom4280
      @mr.shroom4280 1 year ago +1

      @@statquest thank you so much, I watched those but I totally forgot about the chain rule lol
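
      A minimal sketch of the idea in this thread, assuming a tiny model with one weight and one bias and a squared-error loss (all names and numbers here are made up for illustration). You never derive one giant formula per parameter: you chain together the derivatives of the simple pieces, and backpropagation is just this bookkeeping automated for every weight and bias in a big network.

        # predicted = w * x + b, loss = (observed - predicted)^2
        x, y = 0.5, 1.2          # one hypothetical data point
        w, b = 0.3, 0.1          # initial parameter guesses

        predicted = w * x + b
        residual = y - predicted           # observed - predicted
        loss = residual ** 2               # squared residual

        # The chain rule builds each parameter's derivative from simple pieces:
        dloss_dresidual = 2 * residual     # d(residual^2)/d(residual)
        dresidual_dpredicted = -1.0        # d(y - predicted)/d(predicted)
        dpredicted_dw = x                  # d(w*x + b)/d(w)
        dpredicted_db = 1.0                # d(w*x + b)/d(b)

        dloss_dw = dloss_dresidual * dresidual_dpredicted * dpredicted_dw
        dloss_db = dloss_dresidual * dresidual_dpredicted * dpredicted_db

        # One gradient descent step (0.1 is a made-up learning rate):
        w -= 0.1 * dloss_dw
        b -= 0.1 * dloss_db
        print(w, b, loss)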

  • @varunparuchuri9544
    @varunparuchuri9544 3 years ago +8

    Dear @statquest, you must have come from heaven to save students from suffering.
    Just an unbelievable explanation.

  • @rigobertomartell5029
    @rigobertomartell5029 2 years ago +6

    Josh, you are a master of teaching; you make difficult topics so easy to understand, which is really amazing. My mother tongue is not English, but you explain so well and so clearly that I can understand everything. Congratulations, Sir, please keep doing this job.

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @aydnndurmaz
    @aydnndurmaz 2 days ago +1

    Explaining u-substitution alongside the chain rule is brilliant

  • @juanp.lievanok.3737
    @juanp.lievanok.3737 1 month ago +1

    You are a genius at this. I can't believe I hadn't heard of this channel before.

  • @RealSlimShady7
    @RealSlimShady7 4 years ago +25

    Guess I will not be afraid of ***THE CHAAAAAINNNN RULE***
    Thank you, Josh! Always waiting for your videos!

  • @Maskedlapis64
    @Maskedlapis64 5 months ago +1

    I've watched videos like this for work; yours is the best. I fully grasp what a derivative is!

    • @statquest
      @statquest  5 months ago

      Glad you liked it!

  • @alexg7082
    @alexg7082 3 months ago +1

    As always, clear and in simple language. Thank you!

    • @statquest
      @statquest  3 months ago

      Glad it was helpful!

  • @prydt
    @prydt 1 year ago +1

    These seriously are some of my favorite videos on YouTube!

  • @louco2
    @louco2 1 year ago +1

    This is probably the best video about this on the internet!! Thank you so much for taking the time to make it!!

  • @meow-mi333
    @meow-mi333 10 months ago +1

    This dude explains things clearly. Huge thanks!

  • @dhakalsandeep3452
    @dhakalsandeep3452 1 year ago +3

    One of the best videos I have ever watched. Thank you guys for providing such wonderful content for free.

  • @nick_g
    @nick_g 2 years ago +1

    I love StatQuest! I got my SQ mug in the morning and just got the Illustrated Guide to Machine Learning. Super excited to start! Thank you for all the great content!

    • @statquest
      @statquest  2 years ago

      That is awesome! TRIPLE BAM!!!! :)

  • @behrampatel3563
    @behrampatel3563 8 months ago +1

    This one outdoes all the best videos on the topic.

  • @amerjabar7825
    @amerjabar7825 2 years ago +1

    The best video on the internet about the Chain Rule!

  • @darshuetube
    @darshuetube 2 years ago +3

    You have great videos that help explain a lot of concepts very clearly, step by step. You have helped a lot of students, for sure.

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @ShermanSitter
    @ShermanSitter 4 years ago +1

    I would insert a BAM at 5:25. :) ...also, I realized the thing I like about your videos is you explain things, not only in a clear way, but in a different way. It adds to the depth of our understanding. Thank you!

    • @statquest
      @statquest  4 years ago

      That is definitely a BAM moment! And thank you. One of my goals is to always explain things in a different way, so I'm glad you noticed! :)

  • @suneel8480
    @suneel8480 4 years ago +4

    You have made my machine learning path easy!

  • @edphi
    @edphi 3 years ago +1

    Genius, serious, sincere.
    I'm a mathematician and am convinced you are a born sage

  • @harisjoseph117
    @harisjoseph117 3 years ago +3

    Dear Josh Starmer, thank you so much. May God bless you with more knowledge so that you can energize learners like me. ❤. Thank you again.

  • @joeyshias
    @joeyshias 1 year ago +1

    I'm so moved to finally understand this, thank you!

  • @salah6160
    @salah6160 1 year ago +1

    Teaching is an art. Thank you, StatQuest

  • @Vinyl-vv3pz
    @Vinyl-vv3pz 1 year ago

    Best reference for learning statistics. Btw, would just like to point out that in 6:16, there appears to be a minor mistake. Actually for every 1 unit increase in Weight, there is a 2 unit increase in Shoe Size, because the equation would be Size = (1/2)*Weight, or 2*Size = 1*Weight

    • @statquest
      @statquest  1 year ago

      This video is actually correct. For every one unit increase in Weight, there is only a 1/2 unit increase in Shoe Size. What your equation shows is that for every unit increase in Size, there is a 2 unit increase in Weight. That's not the same thing as "for every unit increase in Weight, there is a 2 unit increase in Size". (A quick symbolic check follows this thread.)

    • @Vinyl-vv3pz
      @Vinyl-vv3pz 1 year ago +1

      @@statquest I calculated through the equation, and you are correct. Thanks for the verification!
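
      A quick symbolic check of this exchange, as a sketch using sympy and the equation from the comment above, Size = (1/2) * Weight:

        import sympy as sp

        Weight, Size = sp.symbols('Weight Size')

        # For every 1 unit increase in Weight, Size only goes up by 1/2 a unit:
        print(sp.diff(Weight / 2, Weight))                        # 1/2

        # Solving the same equation for Weight gives Weight = 2 * Size, i.e. a
        # 1 unit increase in Size corresponds to a 2 unit increase in Weight:
        weight_from_size = sp.solve(sp.Eq(Size, Weight / 2), Weight)[0]
        print(weight_from_size, sp.diff(weight_from_size, Size))  # 2*Size, 2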

  • @anashaat95
    @anashaat95 2 years ago +1

    Very clear explanation. I have seen different people explain this topic, but you are the best.
    Thank you so much.

  • @taiman9423
    @taiman9423 3 years ago +2

    Top notch visualization.

  • @jhfoleiss
    @jhfoleiss 4 years ago +7

    Awesome explanation, Mr. Starmer! I wish your videos had existed back when I was taking calculus at university!!! (which was a long time ago =) )

  • @dkutagulla
    @dkutagulla 1 year ago +1

    Simply the best explanation of the chain rule!
    Now I understand the chain rule well enough to teach my kid when she needs it...
    Thank you!!!
    Do you have a book on calculus? I would love to buy it!

    • @statquest
      @statquest  1 year ago

      Thanks! I don't have a book on calculus, but I have one on machine learning: statquest.org/statquest-store/

  • @gabrielcournelle3055
    @gabrielcournelle3055 4 years ago +17

    Now I can't read "the chain rule" without hearing your voice!

  • @chelsie292
    @chelsie292 4 years ago +3

    This is epic, simple, and shows the chain rule being applicable in real life too - we need more videos like this, damn

  • @Vanadium404
    @Vanadium404 1 year ago +1

    Such beautiful intuition; that weight-to-height, then height-to-shoe-size example was just commendable

  • @georgetzimas6882
    @georgetzimas6882 3 years ago +3

    13:15 Is the residual (squared) graph mirrored? Since residual = (observed - predicted), wouldn't that mean that when the intercept on the original graph is zero, the residual would be positive (2 - 1 = 1), so the position on the residual (squared) graph should be on the positive x-axis (x = 1), as opposed to the negative side in the video, and vice versa? (A numeric check follows this thread.)

    • @statquest
      @statquest  3 years ago +3

      Yes! You are correct. Oops!
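
      A small numeric check of the question above, as a sketch that assumes the line predicted = intercept + 1 * x with the comment's numbers (observed = 2, x = 1):

        observed, slope, x = 2.0, 1.0, 1.0
        for intercept in (0.0, 0.5, 1.0, 1.5):
            predicted = intercept + slope * x
            residual = observed - predicted            # observed - predicted
            # With intercept = 0 the residual is +1, so the point belongs at
            # x = +1 (the positive side) on the Residual vs Residual^2 graph:
            print(intercept, residual, residual ** 2)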

  • @tagoreji2143
    @tagoreji2143 2 years ago +1

    Thank you, Sir, for the amazing tutorial.

  • @sheilawang3847
    @sheilawang3847 4 years ago +1

    Such a clean and simple explanation! Can't wait for more math and statistics videos. You are the awesomeness on YouTube!

  • @irischin6165
    @irischin6165 2 years ago +1

    I graduated with stats degrees from college 10+ years ago and never touched them since. Now I feel I've re-learned everything overnight!!!!!

  • @RumayzaNorova
    @RumayzaNorova 3 months ago +1

    After this awesome statquest, I will hear 'The Chain Rule' with the echo playing in my head

  • @MrXiiaoSky
    @MrXiiaoSky 3 years ago +1

    Thanks for clearing up the confusion I had with the chain rule!

  • @syedmustahsan4888
    @syedmustahsan4888 5 months ago +2

    Another concept well explained ❤

    • @statquest
      @statquest  5 months ago +1

      Thanks a lot 😊!

  • @robelbelay4065
    @robelbelay4065 4 years ago +1

    An epically clear explanation. Thank you so much!

  • @janscheuring2642
    @janscheuring2642 1 year ago +1

    Hi, I think I found a mistake. (?) The pink ball in the graph from 13:08 should be on the other side of the y-axis. It doesn't change the educational value of the whole video, but it caught my eye.

  • @39_ganesh_ghodke98
    @39_ganesh_ghodke98 3 months ago +1

    You are an amazing teacher!

  • @aswink112
    @aswink112 3 years ago +1

    Great teaching, Josh Starmer!

  • @saifqawasmeh9664
    @saifqawasmeh9664 1 year ago +2

    I read about loss in neural networks and optimization from 20+ sources and could not understand it until watching this video. Big BAM!

  • @Infinitesap
    @Infinitesap 2 years ago +4

    I think you must be an alien! This is the best, simplest and most complete explanation I have seen - ever. Fantastic job you did ❤️ thanks

  • @amirhossientakeh5540
    @amirhossientakeh5540 2 years ago +1

    You deserve a Nobel prize, Nobel man

  • @tanubist7721
    @tanubist7721 4 years ago +1

    This is the first time I've laughed while learning stats 🤣 Thanks a lot!

  • @jialushen6248
    @jialushen6248 4 years ago +1

    Oh boy, that's a teaser for neural nets. Been looking forward to this!!

    • @statquest
      @statquest  4 years ago +1

      YES!!! This is the first video in my series on Neural Nets!!!!!!! The next one should be out soon (hopefully late July, but I always run behind so maybe early August).

  • @KUMAWANI
    @KUMAWANI 3 years ago +1

    I would like to thank you from the bottom of my heart for such wonderful videos.
    Such a difficult topic made simple; you are awesome, man, keep rocking!!!!

    • @KUMAWANI
      @KUMAWANI 3 years ago +1

      And Triple BAM!!!!

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @lightxx06
    @lightxx06 2 years ago +1

    BAM! Best explanation so far

  • @pooravkadiyan
    @pooravkadiyan 2 years ago +1

    Your explanation is awesome. Make more videos.

  • @Dy-yg9wq
    @Dy-yg9wq 3 years ago +2

    Please add more ads so we can watch them and actually give back to you

    • @statquest
      @statquest  3 years ago +1

      Ha! I wish I could remove all the ads. But even then, YouTube would add them.

  • @alinadi9427
    @alinadi9427 9 months ago +1

    Your videos are fantastic

    • @statquest
      @statquest  9 months ago

      Glad you like them!

  • @lin1450
    @lin1450 3 years ago +1

    Thank you so much for your videos! I got a StatQuest Shirt for my Birthday... hurray! :)

  • @syedmustahsan4888
    @syedmustahsan4888 5 months ago +1

    Thanks a lot, Sir Josh. JazakAllah. 😊 Emotional

    • @statquest
      @statquest  5 months ago +1

      Thank you very much! :)

  • @Metryk
    @Metryk 1 year ago +1

    13:27 When the residual is negative, the pink circle is shown on the right side of the y-axis, but shouldn't it be on the left side?
    Aside from that, great content! Cheers from Germany

    • @statquest
      @statquest  1 year ago +1

      Yep. Thanks for catching that! I've added a correction to the pinned comment.

  • @vaishnavi4354
    @vaishnavi4354 4 years ago +1

    Awesome Statquest...
    Initially played Song and concept too!!😎😎😎

  • @krishj8011
    @krishj8011 5 months ago +1

    Awesome Tutorial...

  • @hassanalimohammadi4553
    @hassanalimohammadi4553 2 years ago +1

    Thanks for all your amazing videos. I'm still learning from you :)

  • @studgaming6160
    @studgaming6160 1 year ago +1

    Thanks for the informative video.

  • @luis96xd
    @luis96xd 2 years ago +1

    Amazing video! Back to basics 😄👍

  • @elielberra2867
    @elielberra2867 2 years ago +1

    Amazing video, thanks!

  • @mattaaron79
    @mattaaron79 2 years ago +1

    I'm getting strong MST3K and Star Control II vibes from this guy and that's pretty cool

  • @alfcnz
    @alfcnz 4 years ago

    6:52 that's not an exponential line (2^x), it's just a parabola (x^2). Anyhow, you're awesome! BAM! Just subscribed!

    • @statquest
      @statquest  4 years ago +1

      Thanks for catching that. :)

  • @jonathangallant-mills6434
    @jonathangallant-mills6434 2 years ago +1

    Hey, can someone help me understand why at 14:55 we set Observed and Weight to 0 because they do not contain the intercept? I thought I understood until this point. Now I'm a bit confused and discouraged! Thank you!

    • @statquest
      @statquest  2 years ago +2

      When we change the value of the intercept, the Observed values do not change (because they are what we observed; they don't ever change). Since there is 0 change in the observed values when we change the intercept, the derivative of the observed values with respect to the intercept is 0. (A symbolic check follows this thread.)

    • @jonathangallant-mills6434
      @jonathangallant-mills6434 2 years ago +1

      @@statquest Thank you!!!😄
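
      A symbolic version of the reply above, as a sketch with made-up data values: the observed value and the input enter the loss as constants, so their derivatives with respect to the intercept are 0, and only the intercept-dependent terms survive.

        import sympy as sp

        intercept = sp.Symbol('intercept')
        observed, weight, x = 2.0, 0.5, 1.5  # fixed data: constants, not functions of the intercept

        predicted = intercept + weight * x
        residual_sq = (observed - predicted) ** 2

        print(sp.diff(observed, intercept))  # 0: changing the intercept never changes the data
        print(sp.diff(x, intercept))         # 0: same for the inputs
        print(sp.expand(sp.diff(residual_sq, intercept)))  # only the intercept part survives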

  • @rosmontis06
    @rosmontis06 4 months ago

    I have a couple of questions...
    At 6:54, what's the time^2 + 1/2 formula supposed to be representing? 🤔 And is that 1/2 supposed to be the intercept? Why do we plug it in - is that just a set formula you've got to learn?

    • @statquest
      @statquest  4 months ago

      The formula is for the curve that fits our data. What you do is you get some data and then fit a line (or curve, in this case) to it - so the line, and the equation for it, depend on the data. (A small curve-fitting sketch follows this thread.)
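
      A sketch of what "fitting a curve to the data" means in practice, with made-up (Time, Hunger) measurements that happen to follow Hunger = Time^2 + 1/2 exactly; the 1/2 is the constant term (the intercept) that the fit recovers from the data, not something plugged in by hand:

        import numpy as np

        time = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
        hunger = time ** 2 + 0.5                  # made-up, noise-free measurements

        # Find the quadratic that best fits the points:
        coeffs = np.polyfit(time, hunger, deg=2)  # coefficients of time^2, time, constant
        print(coeffs)                             # ~[1, 0, 0.5] -> hunger = time^2 + 1/2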

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 4 years ago +1

    A video on the probability chain rule would also be awesome

  • @dhruvsharma7992
    @dhruvsharma7992 2 years ago

    This video is not just an explanation of "The Chain Rule"; it also explains the intuition behind the various loss functions.

    • @statquest
      @statquest  2 years ago +1

      That's right. The chain rule is used a lot in machine learning so I tried to explain it from that perspective.

    • @dhruvsharma7992
      @dhruvsharma7992 2 years ago +1

      @@statquest Thanks for all of the videos, they all really help a lot

    • @statquest
      @statquest  2 years ago

      @@dhruvsharma7992 Thanks!

  • @wrjazziel
    @wrjazziel 4 years ago +2

    LMAO, the song at the beginning xD, just for that I'm giving it a like.

  • @andersk
    @andersk 3 years ago

    Is 13:06 a slight error? The residual vs intercept graph shows the point in the negative part of the residual's axis (negative y), yet the residual vs squared-residual graph shows the point on the positive side of the residual's axis (positive x)

    • @statquest
      @statquest  3 years ago +1

      You are correct! The x-axis on the Residual vs Residual^2 graph is backwards.

    • @andersk
      @andersk 3 years ago +1

      @@statquest thanks for clarifying - and amazing video again, looking forward to your illustrated guide!

  • @KayYesYouTuber
    @KayYesYouTuber 2 years ago +1

    Simply beautiful. You are the best.

  • @joeyshias
    @joeyshias 1 year ago +1

    Thank you!

    • @statquest
      @statquest  1 year ago

      Hooray!!! Thank you so much for supporting StatQuest!!! BAM! :)

  • @siddean5048
    @siddean5048 2 years ago +1

    Beautiful. Just beautiful.

  • @Klyaa
    @Klyaa 4 months ago

    At 6:54 you said that you fit an exponential line to the graph and got hunger = time^2 + 1/2. I have a few questions about that.
    1. I've never heard the phrase 'exponential line' before. Do you just mean an exponential 'line' of best fit?
    2. You said that the equation is exponential, but that looks quadratic to me. Am I missing something?
    I really like the way you explained this. Once you think about problems in the 'real world' like this it really starts to make sense how changing one function affects and changes the other and then why you need the chain rule to find the rate of change.

    • @statquest
      @statquest  4 months ago

      1. I just mean that we fit a curve defined by the function hunger = time^2 + 1/2
      2. I should have said quadratic instead of exponential. I apologize for any confusion that this may have caused.

    • @Klyaa
      @Klyaa 4 months ago +1

      @@statquest Thanks for replying so quickly on an older video like this! I'm making some math videos of my own right now and I can't believe how easy it is to misspeak or write something wrong. You've done an amazing job with all your videos. This is the only video I've found that attempts to explain the chain rule in an intuitive way without using the limit definition.

  • @motherisape
    @motherisape 2 years ago +1

    Awesomeness = like statquest squared 😆 🤣

  • @nishantgoel3065
    @nishantgoel3065 3 years ago +1

    Awesome work, man!!!! You have created the best content...... I wish you were teaching us at our college 🥺

    • @statquest
      @statquest  3 years ago +1

      Thank you so much 😀

  • @bozok1903
    @bozok1903 1 year ago +1

    Bam! You are awesome. Thanks a lot.

  • @aleksandrpakhomov4154
    @aleksandrpakhomov4154 1 year ago +1

    Thank you, you're doing a great job!

  • @valor36az
    @valor36az 4 years ago +1

    Awesome. You made my day!

  • @warrenb7450
    @warrenb7450 4 years ago +3

    The best Chain Rule tutorial! Do you have any for ReLU? Thank you!!

  • @igorg4129
    @igorg4129 1 year ago +1

    In the 1st example, both initial relationships (height to weight and shoe size to height) are given as linear. Thus the derivative multiplication gives me not only the derivative but also the slope of the model predicting shoe size from weight (the final model).
    What I am missing are 2 things:
    1) In some non-linear final model, what is the use of knowing the slope equation? It is not a model equation, so it cannot be used for predictions... what am I missing?
    2) Another thing that confuses me is that here, at least in the shoe size example, you use the chain rule to get the final model. But later, in backpropagation, the use is different: in each iteration the chain rule is used to update the weights, which in turn give the "final model".
    Could you please formulate this difference better than I am trying to? (A sketch follows this thread.)
    Thank you so much.

    • @statquest
      @statquest  1 year ago

      I'm not sure I understand your questions. The idea is that we want to establish a relationship among variables - and how much one changes when we change another. This works for linear and non-linear equations. It also sounds like you are interested in how derivatives are used for backpropagation. For details, see: ua-cam.com/video/sDv4f4s2SB8/v-deo.html and ua-cam.com/video/IN2XmBhILt4/v-deo.html
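
      A sketch for question 1 above, with assumed linear relationships in the spirit of the video (height = 2 * weight, size = height / 4 - illustrative numbers, not necessarily the video's exact ones). Multiplying the two derivatives gives the slope of the composed model; with a non-linear link, the same product gives the slope at each point, and that pointwise slope is what gradient descent uses to update parameters - it isn't a prediction equation itself.

        import sympy as sp

        weight = sp.Symbol('weight')

        # Assumed linear relationships:
        height = 2 * weight              # d(height)/d(weight) = 2
        size = height / 4                # d(size)/d(height)  = 1/4

        print(sp.simplify(size))         # weight/2: the composed "final model"
        print(sp.diff(size, weight))     # 1/2 = 2 * (1/4): the chain rule product

        # With a non-linear first step, the product is still the slope, point by point:
        height_nl = weight ** 2
        size_nl = height_nl / 4
        print(sp.diff(size_nl, weight))  # weight/2: the slope depends on where you are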

  • @honghur8977
    @honghur8977 3 years ago

    13:39 How come the slope at a point on the squared residual curve can be written in terms of the derivative of the squared residual with respect to the intercept, and not the derivative of the squared residual with respect to the residual? Why do we set the derivative of the squared residual with respect to the intercept to zero, when the slope of the squared residual curve should be written with respect to the residual? Shouldn't we set the latter to zero and solve for the intercept? Is it because the residual is a function of the intercept itself? (A symbolic sketch follows this thread.)

    • @statquest
      @statquest  3 years ago

      I'm not sure I understand your questions because they all seem to be answered immediately following that time point in the video. The goal is to find the optimal intercept for the graph of "weight vs height". So we use the chain rule to tell us the derivative of the residual^2 with respect to the intercept. This derivative has two parts, the residual^2 with respect to the residual and the residual with respect to the intercept.
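
      A symbolic sketch of the two pieces described in the reply above, for a single data point; because the residual is itself a function of the intercept, it is the full chain-rule derivative with respect to the intercept that gets set to 0 and solved:

        import sympy as sp

        intercept, observed, slope, x = sp.symbols('intercept observed slope x')
        r = sp.Symbol('r')                             # the residual as its own variable

        residual = observed - (intercept + slope * x)  # the residual depends on the intercept
        piece1 = sp.diff(r ** 2, r).subs(r, residual)  # d(residual^2)/d(residual) = 2*residual
        piece2 = sp.diff(residual, intercept)          # d(residual)/d(intercept)  = -1

        chain = piece1 * piece2                        # d(residual^2)/d(intercept)
        print(sp.simplify(chain - sp.diff(residual ** 2, intercept)))  # 0: they agree

        # Setting the full derivative to 0 finds the intercept at the bottom of the curve:
        print(sp.solve(sp.Eq(chain, 0), intercept))    # [observed - slope*x]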

  • @supernenechi
    @supernenechi 1 year ago

    Despite how good you are at explaining, I'm still having a hard time with it all. My confidence isn't exactly helped by the fact that all the other people in the comments seem to somehow be doing PhDs and stuff, but okay...
    How can I try to understand it even better?

    • @statquest
      @statquest  1 year ago

      Can you tell me at what time point (minutes and seconds) you first got confused?

  • @faezeabdolinejad731
    @faezeabdolinejad731 3 years ago +1

    Thank you ❤❤❤❤