Tutorial 6-Chain Rule of Differentiation with BackPropagation

  • Published 3 Dec 2024

COMMENTS • 247

  • @debtanudatta6398
    @debtanudatta6398 3 years ago +200

    Hello Sir, I think there is a mistake in this video for backpropagation. Basically, to find out (del L)/(del w11^2), we don't need the PLUS part, since O22 doesn't depend on w11^2. Please look into that. The PLUS part will be needed while calculating (del L)/(del w11^1): there, O21 and O22 both depend on O11, and O11 depends on w11^1. (See the derivatives written out after this thread.)

    • @alinawaz8147
      @alinawaz8147 2 years ago +2

      Yes brother, there is a mistake; what he said is correct.

    • @prakharagrawal4011
      @prakharagrawal4011 2 years ago +3

      Yes, this is correct. Thank you for pointing this out.

    • @aaryankangte6734
      @aaryankangte6734 2 years ago +2

      true that

    • @vegeta171
      @vegeta171 2 years ago +1

      You are correct about that, but I think he wanted to take the derivative w.r.t. O11, since it is present in both nodes f21 and f22; if we replace w11^2 in the equation with O11, the equation becomes correct.

    • @byiringirooscar321
      @byiringirooscar321 2 years ago +1

      It took me time to understand it, but now I've got the point. Thanks, man. I can assure you that @krish naik is the first professor I have.
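
To make the thread's correction concrete, here are both derivatives written out in the video's notation (O_ij is the output of neuron j in layer i, w_ij^k is a weight feeding layer k, and L is the loss):

    \[
    \frac{\partial L}{\partial w_{11}^{(2)}}
      = \frac{\partial L}{\partial O_{31}}
        \cdot \frac{\partial O_{31}}{\partial O_{21}}
        \cdot \frac{\partial O_{21}}{\partial w_{11}^{(2)}}
    \]

    \[
    \frac{\partial L}{\partial w_{11}^{(1)}}
      = \frac{\partial L}{\partial O_{31}}
        \cdot \left( \frac{\partial O_{31}}{\partial O_{21}} \, \frac{\partial O_{21}}{\partial O_{11}}
                   + \frac{\partial O_{31}}{\partial O_{22}} \, \frac{\partial O_{22}}{\partial O_{11}} \right)
        \cdot \frac{\partial O_{11}}{\partial w_{11}^{(1)}}
    \]

Only the first-layer weight picks up the PLUS part, because O11 feeds both O21 and O22; w11^2 reaches the loss through O21 alone.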

  • @ksoftqatutorials9251
    @ksoftqatutorials9251 5 years ago +6

    I don't need to calculate a loss function for your videos, and there's no need to propagate the video back and forward; i.e., you explained it in the easiest way I have ever seen anyone do. Keep doing more; I'm looking forward to learning more from you. Thanks a ton.

  • @tarun4705
    @tarun4705 1 year ago +3

    This is the clearest mathematical explanation I have seen till now.

    • @moksh5743
      @moksh5743 1 year ago

      ua-cam.com/video/Ixl3nykKG9M/v-deo.html

  • @OMPRAKASH-uz8jw
    @OMPRAKASH-uz8jw 1 year ago +2

    You are nothing short of the perfect teacher; keep adding to the playlist.

  • @AmitYadav-ig8yt
    @AmitYadav-ig8yt 5 years ago +32

    It has been years since I solved a mathematics question paper or looked at a mathematics book, but the way you explained this was better than the Ph.D. professors at the university. I did not feel away from mathematics at all. LoL, I do not understand my professors but understand you perfectly.

  • @RomeshBorawake
    @RomeshBorawake 3 years ago +20

    Thank you for the perfect DL playlist to learn from. I wanted to highlight one change to make it 100% useful (already at 99.99%):
    13:04 - for every epoch, the loss decreases, adjusting toward the global minimum.

    • @vishnukce
      @vishnukce 1 year ago

      But for negative slopes, the loss has to increase to reach the global minimum, no?

    • @being_aadarsh
      @being_aadarsh 2 months ago +1

      @@vishnukce For negative slopes the weights need to be increased, not the loss (see the update rule below).
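
For reference, the update rule behind this exchange, with learning rate \eta (the form used in the video's gradient descent step):

    \[
    w_{\text{new}} = w_{\text{old}} - \eta \, \frac{\partial L}{\partial w}\Big|_{w = w_{\text{old}}}
    \]

When the slope \partial L / \partial w is negative, the subtracted term is negative, so the weight itself increases; in both cases the step moves the weight toward the minimum, which is why the loss keeps decreasing across epochs.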

  • @VVV-wx3ui
    @VVV-wx3ui 5 years ago

    This is simply yet superbly explained. When I learned this earlier, it stopped at backpropagation. Now I've learned what it is in backpropagation that makes the weight updates work appropriately, i.e., the chain rule. Thanks much for giving clarity that is easy to understand. Superb.

  • @aj_actuarial_ca
    @aj_actuarial_ca 1 year ago +1

    Your videos are really helping me learn machine learning as an actuarial student from a pure commerce/finance background.

  • @rajeeevranjan6991
    @rajeeevranjan6991 5 years ago +6

    simply one word "Great"

  • @namyashah3173
    @namyashah3173 4 months ago

    No one has ever explained it like you did. Hats off!!

  • @ganeshvhatkar9040
    @ganeshvhatkar9040 9 months ago +1

    One of the best videos I have seen in my life!!

  • @manateluguabbaiinuk-mahanu761
    @manateluguabbaiinuk-mahanu761 2 years ago +2

    The Deep Learning playlist's concepts are very clear, and anyone can understand them easily. Really have to appreciate your efforts 👏🙏

  • @abhishek-shrm
    @abhishek-shrm 4 years ago +1

    This video explained everything I needed to know about backpropagation. Great video sir.

  • @mranaljadhav8259
    @mranaljadhav8259 4 years ago +1

    Well explained, sir! Before starting deep learning, I decided to start learning from your videos. You explain in a very simple way... anyone can understand from your videos. Keep it up, sir :)

  • @VIKASPATEL-of2sy
    @VIKASPATEL-of2sy 5 years ago +36

    I guess the differentiation done at 11:26 is a bit wrong, are you sure about it? I mean, why do we have to add an extra term of del loss by del w12?

    • @debasispatra8368
      @debasispatra8368 4 years ago +10

      Yes, correct. It seems to be a mistake. The addition part will come when we calculate the derivative of w11 for layer 1, not the derivative of w11 for layer 2.

    • @RajatSharma-ct6ie
      @RajatSharma-ct6ie 4 years ago +1

      Yes you are correct !!

    • @bhavyaparikh6933
      @bhavyaparikh6933 4 years ago +2

      @@debasispatra8368 But why don't we have to add for layer 2, yet we do add for layer 1?

    • @mranaljadhav8259
      @mranaljadhav8259 4 years ago

      @@bhavyaparikh6933 Same question here... if you got it, can you explain? I have just started deep learning.

    • @nikitlune9526
      @nikitlune9526 4 years ago

      @@debasispatra8368 Hi, can you just tell how the weights are initially assigned, and how many hidden layers and how many neurons per layer there should be?

  • @chartinger
    @chartinger 5 years ago +2

    OP... Nice Teaching... Why don't we get teachers like u in every institute and college??

  • @TheMainClip-t1h
    @TheMainClip-t1h 3 years ago

    You have saved my life, I owe you everything.

  • @shaan2522
    @shaan2522 3 months ago

    Great explanation of the chain rule in backpropagation... all my doubts are cleared!!
    Thanks!

  • @shrutiiyer68
    @shrutiiyer68 3 years ago +1

    Thank you so much for all your efforts to give such an easy explanation🙏

  • @nishitnishikant8548
    @nishitnishikant8548 3 years ago +45

    Of the two connections from f11 to the second hidden layer, w11^2 affects only f21 and not f22 (which is affected by w21^2). So dL/dw11^2 will only have one term instead of two.
    Anyone, please correct me if I am wrong.

    • @sahilvohra8892
      @sahilvohra8892 3 years ago +3

      I agree. I don't know why others didn't notice this same mistake!!!

    • @mustaphaelammari1128
      @mustaphaelammari1128 3 years ago +3

      I agree, I was looking for someone with the same remark :)

    • @ismailhossain5114
      @ismailhossain5114 3 years ago +3

      That's the point I was actually looking for.

    • @saqueebabdullah9142
      @saqueebabdullah9142 3 years ago +4

      Exactly, 'cause if I keep both terms, the equation reads dL/dw11^2 = dL/dw11^2 + dL/dw12^2, which is wrong.

    • @RUBAYATKHAN89
      @RUBAYATKHAN89 3 years ago +3

      Absolutely.

  • @someshanand1799
    @someshanand1799 4 years ago +1

    Great video, especially since you give the concept behind it; love it. Thank you for sharing it with us.

  • @deepaktiwari9854
    @deepaktiwari9854 3 years ago +12

    Nice informative video. It helped me understand the concept. But I think at the end there is a mistake: you should not add the other path when calculating the derivative for w11^2. The addition should be done when we are calculating the derivative for O11 (the numeric check after this thread confirms this):
    dL/dw11^2 = dL/dO31 * dO31/dO21 * dO21/dw11^2

    • @grownupgaming
      @grownupgaming 3 years ago

      Yes Deepak, I noticed the same thing. There's a mistake around 12:21; no addition is needed.

    • @anupampurkait6066
      @anupampurkait6066 3 years ago

      Yes Deepak, you are correct. I also think the same.

    • @albertmichaelofficial8144
      @albertmichaelofficial8144 1 year ago

      Is that because we are calculating based on O31, and O31 depends on both outputs from the second layer?
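
A finite-difference check settles threads like this one numerically. Below is a minimal sketch in Python of a slice of the video's network (sigmoid activations, squared loss); the weight values are made up and the variable names only loosely mirror the video's notation. The single-path formula for dL/dw11^2 matches the numerical gradient:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # fixed outputs of layer 1 (inputs to layer 2) and the target
    O11, O12 = 0.6, 0.9
    y = 1.0

    # illustrative weight values (hypothetical, not from the video)
    w11_2, w21_2 = 0.3, -0.5   # feed neuron O21
    w12_2, w22_2 = 0.8,  0.2   # feed neuron O22
    w11_3, w21_3 = 0.7, -0.4   # feed the output neuron O31

    def forward(w):
        """Forward pass with w standing in for w11^2."""
        O21 = sigmoid(w * O11 + w21_2 * O12)
        O22 = sigmoid(w12_2 * O11 + w22_2 * O12)   # w11^2 appears nowhere here
        O31 = sigmoid(w11_3 * O21 + w21_3 * O22)
        return O21, O22, O31

    O21, O22, O31 = forward(w11_2)

    # analytic gradient, single path only (no PLUS part):
    # dL/dw11^2 = dL/dO31 * dO31/dO21 * dO21/dw11^2
    dL_dO31   = -2.0 * (y - O31)              # from L = (y - O31)^2
    dO31_dO21 = O31 * (1.0 - O31) * w11_3     # sigmoid'(z31) * w11^3
    dO21_dw   = O21 * (1.0 - O21) * O11       # sigmoid'(z21) * O11
    analytic  = dL_dO31 * dO31_dO21 * dO21_dw

    # numerical gradient by central difference
    eps = 1e-6
    Lp = (y - forward(w11_2 + eps)[2]) ** 2
    Lm = (y - forward(w11_2 - eps)[2]) ** 2
    numeric = (Lp - Lm) / (2.0 * eps)

    print(f"analytic: {analytic:.10f}")
    print(f"numeric:  {numeric:.10f}")   # the two agree: no PLUS part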

  • @varunsharma1331
    @varunsharma1331 1 year ago

    Great explanation. I had been looking for this clarity for a long time...

  • @adityashewale7983
    @adityashewale7983 1 year ago

    Hats off to you, sir. Your explanation is top level. Thank you so much for guiding us...

  • @manikosuru5712
    @manikosuru5712 5 years ago +1

    Amazing videos... only one word to say: "Fan".

  • @aditideepak8033
    @aditideepak8033 4 years ago +1

    You have explained it very well. Thanks a lot!

  • @MrityunjayD
    @MrityunjayD 4 years ago

    Really appreciate the way you taught the chain rule... awesome.

  • @punyanaik52
    @punyanaik52 5 years ago +15

    Bro, there is a correction needed in this video... watch the last 3 minutes and correct the mistake. Thanks for your efforts.

  • @hashimhafeez21
    @hashimhafeez21 3 years ago

    For the first time, I understand it very well, thanks to your explanation.

  • @gunjanagrawal8626
    @gunjanagrawal8626 2 years ago +1

    Could you please recheck the video at around 11:00? The W11 weight update should be independent of W12.

  • @saritagautam9328
    @saritagautam9328 4 years ago

    This is really cool. The first time it actually made sense. Hats off, man.

  • @ruchikalalit1304
    @ruchikalalit1304 5 years ago +8

    @10:28-11:22 Krish, do we need both of the paths to be added, since w11 suffix 2 is not affected by the lower path, i.e., w12 suffix 2? Please tell.

    • @amit_sinha
      @amit_sinha 5 years ago +2

      The second part of the summation should not come into the picture, as it will only appear when we calculate dL/dw12 with suffix 2.

    • @SiMsIMs-1
      @SiMsIMs-1 4 years ago

      @@amit_sinha I think that is correct.

    • @niteshhebbare3339
      @niteshhebbare3339 4 years ago

      @@amit_sinha
      Yes I have the same doubt!

    • @vishaldas6346
      @vishaldas6346 4 years ago +1

      Not required; it's not correct, as w11^2 is not affected by the lower weights. The first part is correct, and the summation is required only when we are thinking about w11^1.

    • @grownupgaming
      @grownupgaming 3 years ago

      @@vishaldas6346 Yes!

  • @mohammedsaif3922
    @mohammedsaif3922 4 years ago

    Krish, you're awesome. I finally understood the chain rule from you. Thanks again, Krish.

  • @uddalakmitra1084
    @uddalakmitra1084 2 years ago

    Excellent presentation, Krish Sir... You are great.

  • @kamranshabbir2734
    @kamranshabbir2734 5 years ago +14

    Is the last partial derivative of the loss we calculated w.r.t. w11^2 correct? How can it be shown there to depend on two paths, one through w11^2 and the other through w12^2? Please make it clear; I am confused about it.

    • @wakeupps
      @wakeupps 5 years ago +13

      I think this is wrong! Maybe he wanted to discuss w11^1? In that case, though, a fourth term should be added to the sum. Idk.

    • @imranuddin5526
      @imranuddin5526 5 years ago +1

      @@wakeupps Yes, I think he got confused and it was w11^1.

    • @Ip_man22
      @Ip_man22 4 years ago +4

      Assume he is explaining w11^1 and you'll understand everything. From the diagram itself, you can see the connections and can clearly see which weights depend on each other.
      Hope this helps.

    • @akrsrivastava
      @akrsrivastava 4 years ago +4

      Yes, he should not have added the second term in the summation.

    • @gouravdidwania1070
      @gouravdidwania1070 3 years ago

      @@akrsrivastava Correct, no second term is needed for w11^2.

  • @sekharpink
    @sekharpink 5 years ago +2

    Very, very good explanation, very much understandable. Can I ask in how many days you're planning to complete this entire playlist?

  • @arpitdas2530
    @arpitdas2530 4 years ago +2

    Your teaching is great, sir. But can we also get some videos on how to apply this practically in Python?

  • @SiMsIMs-1
    @SiMsIMs-1 4 years ago +3

    Awesome, mate. However, I think you got carried away with the second part being added. Read the comments below and correct it, please; W12 may not need to be added. But it all makes sense. A very good explanation.

  • @aminzaiwardak6750
    @aminzaiwardak6750 5 years ago +1

    Thank you, sir; you explain very well. Keep it up.

  • @dipankarrahuldey6249
    @dipankarrahuldey6249 4 years ago +4

    I think the part dL/dw11^2 should be (dL/dO31 * dO31/dO21 * dO21/dw11^2). If we are taking the derivative of L w.r.t. w11^2, then w12^2 doesn't come into play. In that case, dL/dw12^2 = (dL/dO31 * dO31/dO22 * dO22/dw12^2).

    • @raj4624
      @raj4624 3 years ago

      Agree... dL/dw11^2 should be (dL/dO31 * dO31/dO21 * dO21/dw11^2), with no extra addition.

  • @manjunath.c2944
    @manjunath.c2944 5 years ago +1

    Clearly understood; your effort is very much appreciated :)

  • @channel8048
    @channel8048 1 year ago

    Thank you so much for this! You are a good teacher

  • @devgak7367
    @devgak7367 4 years ago

    Just an awesome explanation of gradient descent.

  • @skviknesh
    @skviknesh 3 years ago +1

    Thanks! That was really awesome.

  • @grownupgaming
    @grownupgaming 3 years ago

    Isn't dL/dw2-11 independent of dL/dw2-12? At 12:21, why is dL/dw2-11 those two terms added up? dL/dw2-11 is the first line of additions, and dL/dw2-12 is the second line.

  • @maheshvardhan1851
    @maheshvardhan1851 5 years ago +2

    Great effort...

  • @siddharthdedhia11
    @siddharthdedhia11 4 years ago

    Skip to 3:50 if you've watched the previous videos.

  • @hope2251
    @hope2251 3 years ago +2

    10:30 I don't think w11^2 is affecting O22, so the plus part should not come in.

  • @camilogonzalezcabrales2227
    @camilogonzalezcabrales2227 4 years ago +2

    Excellent video. I'm new to the field; could someone explain to me how the O's are obtained? Are the O's the result of each neuron's computation? Are the O's numbers or equations?
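
On the question above: each O is a single number, the output of one neuron, computed by applying the activation function to the weighted sum of the neuron's inputs plus a bias. A minimal sketch, assuming a sigmoid activation and made-up values:

    import numpy as np

    def neuron_output(inputs, weights, bias):
        # O = sigmoid(w . x + b): one number per neuron, not an equation
        z = np.dot(weights, inputs) + bias
        return 1.0 / (1.0 + np.exp(-z))

    x = np.array([0.5, 0.1])     # inputs reaching the neuron
    w = np.array([0.4, -0.7])    # illustrative weights
    O11 = neuron_output(x, w, bias=0.2)
    print(O11)                   # a single float, roughly 0.58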

  • @amitjajoo9510
    @amitjajoo9510 4 years ago +1

    Best video on backpropagation on the internet.

  • @meanuj1
    @meanuj1 5 years ago +1

    Nice; requesting you to please add some videos on optimizers...

  • @tanvirantu6623
    @tanvirantu6623 4 years ago

    Love you, sir; love your effort. Love from Bangladesh.

  • @good114
    @good114 2 years ago +1

    Thank you Sir 🙏🙏🙏🙏♥️☺️♥️

  • @sundara2557
    @sundara2557 4 years ago

    I am going through your videos. You are rocking, bro.

  • @viveksm863
    @viveksm863 3 years ago +1

    I'm able to understand the concepts you're explaining, but I don't know where we get the values for the weights in forward propagation. Could you brief us on that, if possible?
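
On the recurring question of where the forward-pass weights come from: they are initialized randomly before training starts, and backpropagation then refines them epoch by epoch. A common sketch, assuming NumPy; the 1/sqrt(n_inputs) scale is one popular heuristic, not something the video prescribes:

    import numpy as np

    rng = np.random.default_rng(42)   # seeded so the run is reproducible
    n_inputs, n_hidden = 2, 3
    # small random starting values around zero
    W1 = rng.normal(0.0, 1.0 / np.sqrt(n_inputs), size=(n_hidden, n_inputs))
    b1 = np.zeros(n_hidden)           # biases are often started at zero
    print(W1)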

  • @ga43ga54
    @ga43ga54 5 years ago +2

    Can you please do a live Q&A session!? Great video... Thank you.

    • @krishnaik06
      @krishnaik06 5 years ago +3

      Let me upload some more videos, then I will do a Live Q&A session.

  • @vishalshukla2happy
    @vishalshukla2happy 5 years ago +1

    Great way to explain, man... keep going.

  • @ZaChaudhry
    @ZaChaudhry 1 year ago

    ❤. God bless you, Sir.

  • @hokapokas
    @hokapokas 5 years ago +1

    Loved it, man... great effort in explaining the maths behind it and the chain rule. Please make a video on its implementation soon. As usual, great work. Looking forward to the videos. Cheers!

    • @shivamjalotra7919
      @shivamjalotra7919 5 years ago +1

      Hello Sunny, I myself have put together an absolutely brilliant repository explaining all the implementation details behind an ANN. See this: github.com/jalotra/Neural_Network_From_Scratch

    • @kshitijzutshi
      @kshitijzutshi 3 years ago

      @@shivamjalotra7919 Great effort. Starred it. ⭐👍🏼

    • @shivamjalotra7919
      @shivamjalotra7919 3 years ago +1

      @@kshitijzutshi Try to implement it yourself from scratch. See George Hotz's Twitch stream for this.

    • @kshitijzutshi
      @kshitijzutshi 3 years ago

      @@shivamjalotra7919 Any recommendations for understanding the image segmentation problem using CNNs? Resources?

  • @dnakhawa
    @dnakhawa 4 years ago

    You are too good, Krish; nice data science content.

  • @sekharpink
    @sekharpink 5 years ago +1

    Hi Krish,
    Please upload videos on a regular basis. I'm eagerly waiting for your videos.
    Thanks in advance.

    • @krishnaik06
      @krishnaik06 5 years ago +2

      Uploaded, please check tutorial 7.

    • @sekharpink
      @sekharpink 5 years ago

      @@krishnaik06 Thank you. Please keep posting more videos; I'm really waiting to watch them. I really liked your way of explaining.

  • @aswinthviswakumar64
    @aswinthviswakumar64 3 years ago

    Great video and a great initiative, sir.
    From 12:07, if we use the same method to calculate dL/dW12^2, it will be the same as dL/dW11^2.
    Is this the correct way, or am I getting it wrong?
    Thank you!

  • @utkarshdadhich771
    @utkarshdadhich771 3 years ago

    @krish naik Correction at 13:05... I guess the loss should be decreasing, not increasing, with every epoch.

  • @chandanbp
    @chandanbp 4 years ago

    Great stuff for free. Kudos to you and your channel.

  • @yedukondaluannangi7351
    @yedukondaluannangi7351 4 years ago

    Thanks a lot for the videos; they helped me a lot.

  • @ThachDo
    @ThachDo 5 years ago +1

    10:44 you are pointing to w1_11, but why is the formula on the board the derivative w.r.t. w2_11?

    • @winviki123
      @winviki123 5 years ago

      That's correct.
      Even I was wondering the same thing.

  • @pranjalgupta9427
    @pranjalgupta9427 3 years ago +1

    Nice 👍👏🥰

  • @mdmuqtadirfuad
    @mdmuqtadirfuad 9 months ago

    I can't understand (11:09) dL/dw^2_11 = 1st term + 2nd term... We are updating w11, but how does w12 make an impact (the 2nd term)?

  • @sandeepganage9717
    @sandeepganage9717 5 years ago

    Brilliant explanation!

  • @aravindvarma5679
    @aravindvarma5679 4 years ago

    Thanks Krish...

  • @cynthiamoricordova5099
    @cynthiamoricordova5099 3 years ago

    Thank you so much for all your videos. I have a question with respect to the value assigned to the bias: is this value random? I would appreciate your answer.

  • @sivaveeramallu3645
    @sivaveeramallu3645 4 years ago

    Excellent, Krish.

  • @mikelrecacoechea8730
    @mikelrecacoechea8730 3 years ago

    Hey Krish, good explanation.
    I think there is one correction: at the end, you explained it for w11^2, but what I feel is that it is for w11^1.

  • @omkarpatil2854
    @omkarpatil2854 5 years ago +3

    Thank you for the great explanation.
    I have a question: the formula generated for (diff(L) / diff(W11)) looks completely the same as for (diff(L) / diff(W12)), am I right? Do both weights get the same change during backpropagation (though the old W values will be different)? (See the comparison after this thread.)

    • @SunnyKumar-tj2cy
      @SunnyKumar-tj2cy 5 years ago

      Same question.
      What I think is that since we are finding the new weights, W11 and W12 for HL2 should both be different and should not be added, or I am missing something.

    • @abhinaspadhi8351
      @abhinaspadhi8351 5 years ago

      @@SunnyKumar-tj2cy Yeah, both should not be added, as they are different...

    • @spurthygopal1239
      @spurthygopal1239 5 years ago

      Yes, I have the same question too!

    • @varunmanjunath6204
      @varunmanjunath6204 3 years ago

      @@abhinaspadhi8351 It's wrong.
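
For the question opening this thread: the two gradients have the same shape but are not the same quantity, because w11^2 reaches the loss through O21 while w12^2 reaches it through O22 (in the thread's notation). Only the dL/dO31 factor is shared:

    \[
    \frac{\partial L}{\partial w_{11}^{(2)}}
      = \frac{\partial L}{\partial O_{31}} \, \frac{\partial O_{31}}{\partial O_{21}} \, \frac{\partial O_{21}}{\partial w_{11}^{(2)}},
    \qquad
    \frac{\partial L}{\partial w_{12}^{(2)}}
      = \frac{\partial L}{\partial O_{31}} \, \frac{\partial O_{31}}{\partial O_{22}} \, \frac{\partial O_{22}}{\partial w_{12}^{(2)}}
    \]

Since O21 and O22 generally differ, the two updates differ as well, even when the old weight values happen to coincide.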

  • @kasimidrisi7602
    @kasimidrisi7602 4 years ago +1

    Hi sir, I think there is something wrong, because w11 with suffix 2 is not impacted by w12 with suffix 2! But this playlist is really helpful to me. Thank you, sir... :)

  • @rede_neural
    @rede_neural 8 months ago

    11:17 Are you sure we have to sum them? It doesn't seem like the two sides are equal when we "cancel" the chain.

  • @saygnileri1571
    @saygnileri1571 3 years ago

    Nice one, thanks a lot!

  • @pratikgudsurkar8892
    @pratikgudsurkar8892 4 years ago +2

    We are solving a supervised learning problem; that's why we have the loss as actual minus predicted. What about the unsupervised case, where we don't have the actual y: how is the loss calculated, and how does the update happen?

    • @benvelloor
      @benvelloor 4 years ago

      I don't think there will be backpropagation in unsupervised learning!

  • @kavinvignesh2832
    @kavinvignesh2832 4 months ago +1

    For dL/dw11^3, it should be dL/dw11^3 = (dL/dO31 * dO31/dO31(before activation) * dO31(before activation)/dw11^3), right?
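
Yes; writing z31 for "O31 before activation" makes this explicit. Assuming a sigmoid activation, so that O31 = sigma(z31) with z31 = w11^3 * O21 + w21^3 * O22 in this notation, the full chain is:

    \[
    \frac{\partial L}{\partial w_{11}^{(3)}}
      = \frac{\partial L}{\partial O_{31}}
        \cdot \frac{\partial O_{31}}{\partial z_{31}}
        \cdot \frac{\partial z_{31}}{\partial w_{11}^{(3)}}
      = \frac{\partial L}{\partial O_{31}} \cdot \sigma'(z_{31}) \cdot O_{21}
    \]

The shorter form used on the board simply folds the \sigma'(z_{31}) factor into dO31/dw11^3.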

  • @sandipansarkar9211
    @sandipansarkar9211 4 years ago

    Yeah, I did understand the chain rule, but being a fresher, please provide some easy-to-study articles on the chain rule so that I can increase my understanding before proceeding further.

  • @pranjalbahore6983
    @pranjalbahore6983 3 years ago

    So insightful, @krish.

  • @bibhutiswain175
    @bibhutiswain175 5 years ago

    Really helpful for me.

  • @tintintintin576
    @tintintintin576 4 years ago

    Such a helpful video :)
    Thanks!

  • @shindepratibha31
    @shindepratibha31 4 years ago

    Hey Krish, your way of explaining is good.
    I think there is one correction: at the end, you explained it for w11^2, but what I feel is that it is for w11^1. It would be really helpful if you corrected it, because many are getting confused by it.

    • @aneeshkalita7452
      @aneeshkalita7452 2 years ago

      I think the same. But a great method of teaching; there is no doubting that.

  • @saitejakandra5640
    @saitejakandra5640 5 years ago +3

    Please upload ROC AUC related concepts.

  • @shashireddy7371
    @shashireddy7371 5 years ago

    Well explained video

  • @rajshekharrakshit9058
    @rajshekharrakshit9058 4 years ago +1

    Sir, I think one thing you are doing is wrong.
    As w11^(3) impacts O31, there is an activation part in between.
    So dL/dw11^(3) = dL/dO31 . dO31/df1 . df1/dw11^(3).
    I might be wrong; can you please clear up my query?

  • @vishaljhaveri6176
    @vishaljhaveri6176 3 years ago

    Thank you sir.

  • @vishalgupta3175
    @vishalgupta3175 4 years ago

    Hi sir, sorry to ask, but which degree did you complete? You are awesome!

  • @quranicscience9631
    @quranicscience9631 5 years ago

    very good content

  • @mrunalwaghmare
    @mrunalwaghmare 4 months ago

    3:49 If you have watched the previous video before this, he is just revising.

  • @tabilyst
    @tabilyst 4 years ago

    Hi Krish, can you please let me know: if we are calculating the derivative of the W11^2 weight, then why are we adding the derivative of the W12^2 weight to it? Please clarify.

  • @enquiryadmin8326
    @enquiryadmin8326 5 years ago

    In backpropagation, when calculating the gradients using the chain rule for w11^1, I think we need to consider 6 paths. Please kindly clarify.

  • @bsivarahulreddy
    @bsivarahulreddy 3 years ago

    Sir, O31 is also impacted by weight W11^(3), right? Why are we not taking that derivative in the chain rule?

  • @sapito169
    @sapito169 2 years ago

    Finally, I understand it.

  • @jerryys
    @jerryys 3 years ago +1

    Great job! Does the last derivative need the second part? I do not get it.

    • @kartikesood8242
      @kartikesood8242 3 years ago

      d(O22) will also be differentiated, but with respect to w11 it comes out to be zero. Hence, whether you include it or not, the result will be the same.

  • @parthicle
    @parthicle 1 year ago

    Thank you, sir.

  • @waynewu7763
    @waynewu7763 5 months ago

    How do you take the derivative dO31/dO21? What kind of equations are those?
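
These derivatives come from ordinary calculus applied to the layer equations. If O31 = sigma(z31) with z31 = w11^3 * O21 + w21^3 * O22 (sigmoid assumed here, matching the video's activations), then:

    \[
    \frac{\partial O_{31}}{\partial O_{21}} = \sigma'(z_{31}) \cdot w_{11}^{(3)},
    \qquad
    \sigma'(z_{31}) = O_{31} \, (1 - O_{31})
    \]

so every factor in the chain rule can be computed from values already available after the forward pass.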

  • @chaitanyakumarsomagani592
    @chaitanyakumarsomagani592 4 years ago

    Krish sir, is it that w12^2 depends on w11^2, and only then can we do the differentiation? w12^2 goes one way and w11^2 goes another way.