Backpropagation In Depth

  • Published Oct 1, 2024

COMMENTS • 4

  • @vikasparuchuri
    @vikasparuchuri 1 year ago +1

    You can see the code and written explanations for this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/comp_graph.ipynb . And the full course is here - github.com/VikParuchuri/zero_to_gpt .

  • @anfedoro
    @anfedoro 1 year ago

    I actually found Andrej Karpathy's 2-hour video on his micrograd library (part of the NN: Zero to Hero course), and it is an absolutely perfect explanation for anyone who wishes to understand how a NN behaves, and specifically how backpropagation works and how all the gradients are calculated on the backward pass through the computation graph. With all respect to Vik, I found Andrej's explanation a bit clearer for newbies, along with a bunch of general Python coding techniques I was not familiar with.

    • @Dataquestio
      @Dataquestio 1 year ago +2

      Of course, Andrej Karpathy makes excellent tutorials, and I'm glad you found it useful.
      I'll think about how I can improve the backpropagation explanations in this video (maybe I'll make a second video, since Karpathy focuses more on each element of the tensors, whereas I focus more on the operations). I made this video because the knowledge in it is useful later in the Zero to GPT series.
      Thanks for the pointer.
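
      For readers following this thread: the idea both commenters describe (computing gradients on the backward pass through a computation graph) can be sketched in a few lines of Python. This is a hypothetical, minimal scalar-autodiff sketch in the spirit of what's discussed above, not Karpathy's actual micrograd code or the notebook linked in the first comment.

      ```python
      class Value:
          """A scalar node in a computation graph that records how it was made."""

          def __init__(self, data, parents=()):
              self.data = data
              self.grad = 0.0
              self._parents = parents
              self._backward = lambda: None  # filled in by the op that created this node

          def __add__(self, other):
              out = Value(self.data + other.data, (self, other))
              def _backward():
                  # d(out)/d(self) = 1 and d(out)/d(other) = 1, so just pass the gradient through
                  self.grad += out.grad
                  other.grad += out.grad
              out._backward = _backward
              return out

          def __mul__(self, other):
              out = Value(self.data * other.data, (self, other))
              def _backward():
                  # product rule: each input's gradient is scaled by the other input's value
                  self.grad += other.data * out.grad
                  other.grad += self.data * out.grad
              out._backward = _backward
              return out

          def backward(self):
              # topological order guarantees a node's grad is complete before its parents' are updated
              order, seen = [], set()
              def visit(v):
                  if v not in seen:
                      seen.add(v)
                      for p in v._parents:
                          visit(p)
                      order.append(v)
              visit(self)
              self.grad = 1.0  # d(self)/d(self)
              for v in reversed(order):
                  v._backward()

      a, b = Value(2.0), Value(3.0)
      c = a * b + a      # c = a*b + a
      c.backward()
      print(a.grad)      # dc/da = b + 1 = 4.0
      print(b.grad)      # dc/db = a = 2.0
      ```

      The `+=` in each `_backward` is the key detail: when a node feeds into several operations (here, `a` is used twice), gradients from every path through the graph accumulate rather than overwrite each other.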

  • @hussainsalih3520
    @hussainsalih3520 9 months ago

    Keep it up :)