Gradient Descent Explained

  • Published 21 Dec 2024

COMMENTS • 32

  • @Msyoutube38
    @Msyoutube38 2 years ago +11

    Very nice explanation of the concept, brief and understandable. Awesome!

  • @vt1454
    @vt1454 2 years ago +12

    As always, great video from IBM

  • @handsanitizer2457
    @handsanitizer2457 1 year ago +1

    Wow best explanation ever 👏

  • @Akanniafelumo
    @Akanniafelumo 2 months ago

    The best explanation I have ever had, to this day

  • @krishnakeshav23
    @krishnakeshav23 1 year ago +3

    Good explanation. It is also important to note that the curve should be differentiable.

  • @davidrempel433
    @davidrempel433 1 year ago +28

    The most confusing part of this video is how he managed to write everything backwards on the glass so flawlessly

    • @sanataeeb969
      @sanataeeb969 1 year ago +7

      Can't they write on their normal side and then flip the video?

    • @sirpsychosexy
      @sirpsychosexy 1 year ago +2

      @@sanataeeb969 no that would be way too easy

    • @waliyudin86
      @waliyudin86 1 year ago +6

      Bro just focus on the gradient descent topic

    • @P4INKiller
      @P4INKiller 1 year ago +1

      @@sanataeeb969 Oh shit, you're clever.

    • @smritibasnet9782
      @smritibasnet9782 4 months ago

      Nope, he isn't writing backwards. You can observe he seems to be writing with his left hand, but in reality his right hand was being used

  • @krissatish87
    @krissatish87 9 months ago

    The best video I could find. Thank you.

  • @cyrcesarkore
    @cyrcesarkore 2 months ago

    Very simple and clear explanation. Thank you!

  • @57-tycm-ii-karanshardul28
    @57-tycm-ii-karanshardul28 10 days ago

    Thank you, sir.

  • @Adnanuni
    @Adnanuni 2 months ago

    Thank you for such an amazing explanation, Martin. Thanks a lot, team IBM

  • @hugaexpl0it
    @hugaexpl0it 1 year ago +1

    Very good explanation of high-level concept on GD.

  • @Shrimant-ub4ul
    @Shrimant-ub4ul 6 months ago

    Thank you, Martin, really helpful for my uni exam

  • @sotirismoschos775
    @sotirismoschos775 1 year ago +4

    didn't know Steve Kerr works at IBM

  • @harshsonar9346
    @harshsonar9346 1 year ago +1

    I'm always confused by these screens or boards, whatever they are.
    Like, how do you write on them? Do you have to write backwards, or do you write normally and it kinda mirrors it?

  • @FaberLSH
    @FaberLSH 5 months ago

    Thank you so much!

  • @s.m.rakibhasan5525
    @s.m.rakibhasan5525 1 year ago

    great lecture

  • @_alekss
    @_alekss 2 years ago +2

    Nice, I learned more from this 7-min video than a 1-hour-long boring lecture

  • @SAZlearn_AI
    @SAZlearn_AI 2 months ago

    Let me clarify the concept of learning rate and step size in gradient descent:
    Learning rate:
    The learning rate is a hyperparameter that we set before starting the optimization process. It's a fixed value that determines how large our steps will be in general.
    Step size:
    The actual size of each step is determined by both the learning rate and the gradient at that point. Specifically:
    step_size = learning_rate * magnitude_of_gradient
    So:
    The learning rate itself is not the size of the steps from point to point.
    The learning rate is a constant that helps determine how big those steps will be.
    The actual size of each step can vary, even with a constant learning rate, because it also depends on the gradient at each point.
    To visualize this:
    In steep areas of the loss function (large gradient), the steps will be larger.
    In flatter areas (small gradient), the steps will be smaller.
    The learning rate acts as a general "scaling factor" for all these steps.
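    The relationship this comment describes can be sketched in a few lines of Python. This is an illustrative toy example (the function, values, and names are not from the video): the learning rate stays fixed, yet the actual step sizes shrink as the gradient flattens near the minimum.

    ```python
    # Toy gradient descent on f(x) = x^2, whose gradient is 2x.
    # Illustrates: step_size = learning_rate * magnitude_of_gradient,
    # so steps vary even though the learning rate is constant.

    def gradient_descent(grad, x0, learning_rate=0.1, steps=50):
        """Follow the negative gradient from x0; return the visited points."""
        x = x0
        path = [x]
        for _ in range(steps):
            g = grad(x)
            x = x - learning_rate * g  # step size = learning_rate * |g|
            path.append(x)
        return path

    path = gradient_descent(lambda x: 2 * x, x0=5.0)
    step_sizes = [abs(b - a) for a, b in zip(path, path[1:])]
    # Steps are large where the curve is steep (far from 0) and
    # shrink as the gradient flattens near the minimum at x = 0.
    ```
    
    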

  • @velo1337
    @velo1337 2 years ago

    ibm: "how to make a neural network for the stock market?"

  • @Justme-dk7vm
    @Justme-dk7vm 8 months ago +1

    ANY CHANCE TO GIVE 1000 LIKES???😩

  • @John-wx3zn
    @John-wx3zn 8 months ago +2

    Your neural network is wrong.

    • @slimeminem7402
      @slimeminem7402 3 months ago

      Yeah, the neurons are not fully connected 1:43

  • @Rajivrocks-Ltd.
    @Rajivrocks-Ltd. 1 year ago

    I was expecting a mathematical explanation :(

  • @abdulhamidabdullahimagama9334
    @abdulhamidabdullahimagama9334 2 years ago

    I couldn't visualise, I saw nothing on the screen...

    • @yt-sh
      @yt-sh 2 years ago

      can see it

  • @Theodorus5
    @Theodorus5 4 months ago

    Too many words