07 - Classification, an energy perspective - PyTorch 5-step training code

  • Published 17 Dec 2024
  • Course website: bit.ly/DLFL22-...
    Playlist: bit.ly/DLFL22-Y...
    Speaker: Alfredo Canziani

COMMENTS • 6

  • @Francis-gg4rn • 1 year ago

    Thank you for making this!

    • @alfcnz • 1 year ago

      Anytime 😇😇😇

  • @kasparhidayat9293 • 1 year ago

    Hi Alfredo - thanks for your videos. Just as a note on the step to zero gradients: I felt a simpler way to think about it is that PyTorch stores the results of past computations, and these past results need to be cleared before future training batches. I found it confusing when you argue that zeroing + L.backward() are conceptually linked, when I don't think they are.

    • @alfcnz • 1 year ago +1

      There’s a reason why these previous gradients are stored; I have an entire section about it. To perform backpropagation in PyTorch one needs to execute two commands: zeroing + backward. Backward alone does two things: it computes and accumulates the gradient. So, if it is preceded by zeroing the previous grads, it just computes the new grads. That’s why I insist that ‘zeroing + backward’ constitutes a single statement, i.e. backpropagation.
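
      A minimal sketch of the five-step loop and the zeroing + backward pattern described above (the toy data, the model/criterion/optimiser names and the SGD settings are illustrative assumptions, not taken from the lecture notebook):

          import torch
          from torch import nn

          # Toy data and model, purely for illustration
          x = torch.randn(32, 10)          # batch of 32 samples, 10 features
          y = torch.randint(0, 3, (32,))   # targets for a 3-class problem
          model = nn.Linear(10, 3)
          criterion = nn.CrossEntropyLoss()
          optimiser = torch.optim.SGD(model.parameters(), lr=0.1)

          for epoch in range(5):
              logits = model(x)         # 1. forward pass: compute the model output
              L = criterion(logits, y)  # 2. compute the loss L
              optimiser.zero_grad()     # 3. zero the gradients accumulated previously
              L.backward()              # 4. compute and *accumulate* dL/dθ in .grad
              optimiser.step()          # 5. update the parameters

      Steps 3 and 4 together are the single conceptual operation (“compute fresh gradients”) referred to in the reply, since backward() on its own adds to whatever is already stored in each parameter’s .grad.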

  • @НиколайНовичков-е1э

    Hello, Alfredo. Thank you for the video! It's nice to spend a Saturday morning watching a lecture. One question: will your book be available for public sale?

    • @alfcnz • 1 year ago +2

      For sale in print, and free in the digital version.