Optimization in Deep Learning | All Major Optimizers Explained in Detail

  • Published 25 Aug 2024

COMMENTS • 27

  • @user-lj3bo8it4r
    @user-lj3bo8it4r 11 months ago +4

    Dude, for the longest time I felt like my understanding of moving averages and RMSProp was missing something, and I found it in this video. You have no idea how grateful I am to your channel. Thank you; teachers tend to jump over important concepts without explaining them 🎉

    • @CodingLane
      @CodingLane  11 months ago

      Hehe… thank you so much for expressing this 😇

  • @azharhussian4326
    @azharhussian4326 2 years ago +3

    Explained it best. After many years, I finally got it.

  • @daniapy
    @daniapy 2 years ago +3

    A life saver! Thank you so much for sharing this with us.

  • @igorg4129
    @igorg4129 11 months ago +1

    Your explanations are, as always, clear and very useful. One of the best on YouTube, and again I say it as a teacher myself (not in the AI field).
    Yet there is one issue that is mis-explained in literally all available explanations on YouTube, and for some reason you are among them.
    The issue is that you forget to mention that the loss surface is unique and different for EVERY observation, and might potentially have minima in different places for different observations. This is extremely important to understand, especially in the context of stochastic gradient descent.

    • @CodingLane
      @CodingLane  11 months ago

      Hi, thanks for the suggestion. Yes, it's true.

    • @igorg4129
      @igorg4129 11 months ago

      @@CodingLane I hope I haven't insulted you; if I did, my apologies. I'm just trying to supply you with constructive feedback.
      As I said, I think you are one of the best on YouTube.
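    The point raised above is easy to see in a toy example. A minimal sketch (not from the video; model and numbers are made up for illustration): a one-parameter model y = w·x with squared loss, where each observation (xᵢ, yᵢ) defines its own loss surface Lᵢ(w) = (w·xᵢ − yᵢ)², with its own minimum at w = yᵢ/xᵢ. SGD hops between these per-sample surfaces, while the full-batch loss has a single minimum that blends them.

    ```python
    import numpy as np

    # Each observation (x_i, y_i) has its OWN loss surface
    # L_i(w) = (w * x_i - y_i)^2, minimized at w = y_i / x_i,
    # so different observations pull w toward different places.
    rng = np.random.default_rng(0)
    x = rng.uniform(1.0, 2.0, size=5)
    y = 3.0 * x + rng.normal(0.0, 0.5, size=5)   # noisy targets around w = 3

    per_sample_minima = y / x                # argmin of each per-sample loss
    full_batch_minimum = (x @ y) / (x @ x)   # argmin of the summed loss

    print("per-sample minima:", per_sample_minima)
    print("full-batch minimum:", full_batch_minimum)
    ```

    The full-batch minimum is a weighted average of the per-sample minima (weights xᵢ²/Σxⱼ²), so it always lies between their extremes, even though no single observation's surface needs to have its minimum there.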

  • @2daymatters
    @2daymatters 2 years ago +1

    You explained it very smoothly and clearly, thank you!

    • @CodingLane
      @CodingLane  2 years ago +1

      Thank you! Happy to help!

  • @ekleanthony7735
    @ekleanthony7735 1 year ago +1

    This is the best explanation so far. Thanks for the great work

    • @CodingLane
      @CodingLane  1 year ago

      Thank you so much! Glad it helped! 🙂

  • @stylish37
    @stylish37 7 months ago +1

    Best explanation out there. Thanks a lot!

    • @CodingLane
      @CodingLane  6 months ago

      Thanks a lot for this!

  • @areegfahad5968
    @areegfahad5968 1 year ago

    Thank you very much for the awesome explanation.

  • @DelightDomain_DB
    @DelightDomain_DB 2 years ago +1

    You made it super easy. Thanks for sharing

  • @shubhamsinghal2756
    @shubhamsinghal2756 1 year ago +1

    A small doubt: in RMSprop, since W and b are both scaled by the squared exponentially weighted moving average (SEWMA), the b direction is also affected by db, so the steps in the b direction will also get checked. Won't that decrease the learning rate there too and defy our whole purpose?
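    For what it's worth, the per-parameter nature of RMSprop is easy to see in code. A minimal sketch in the standard textbook form (hyperparameters and gradient values assumed, not from the video): each parameter keeps its own moving average of its own squared gradient, so b's step is damped only by the history of db, and W's only by dW. The effect is to equalize step magnitudes across parameters, not simply to shrink the learning rate everywhere.

    ```python
    import numpy as np

    # Minimal RMSprop step (textbook form; lr, beta, eps are assumed values).
    # Each parameter carries its OWN accumulator s, so a large dW history
    # damps only the W steps, while b is scaled by the history of db alone.
    def rmsprop_step(param, grad, s, lr=0.01, beta=0.9, eps=1e-8):
        s = beta * s + (1 - beta) * grad**2              # EWMA of squared grad
        param = param - lr * grad / (np.sqrt(s) + eps)   # per-parameter scaling
        return param, s

    w, b = 0.0, 0.0
    s_w, s_b = 0.0, 0.0
    dw, db = 10.0, 0.1   # very different gradient scales, for illustration

    w, s_w = rmsprop_step(w, dw, s_w)
    b, s_b = rmsprop_step(b, db, s_b)
    print(w, b)  # steps of nearly identical magnitude despite dw >> db
    ```

    So yes, b's steps are also normalized, but by b's own gradient history: a parameter with consistently small gradients keeps taking reasonably sized steps rather than being slowed down by another parameter's large gradients.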

  • @rohithdasari4888
    @rohithdasari4888 2 months ago

    I still don't understand one thing: what does B mean here? Is it the direction or the bias?

  • @antonykahuro8349
    @antonykahuro8349 2 years ago +1

    Very nice explanation. Keep up the good work!

    • @CodingLane
      @CodingLane  2 years ago

      Thank you so much. I appreciate it!

  • @nileshsahu6786
    @nileshsahu6786 2 years ago +1

    Nice explanation, keep it up.

  • @pavankongara792
    @pavankongara792 5 months ago

    Amazing!

  • @vijayakumarr.k.2469
    @vijayakumarr.k.2469 1 year ago

    thank you

  • @mugomuiruri2313
    @mugomuiruri2313 8 months ago

    good