Backpropagation in Neural Network with an Example By hand - TensorFlow Tutorial

  • Published 23 Nov 2022
  • Forward pass and Backpropagation in Neural Network with an Example By hand - TensorFlow Tutorial
    In this video, we cover a step-by-step process, using an example, to calculate forward and backward propagation by hand (that is, we compute the derivatives and everything else needed to update the weights and biases during the feed-forward pass and backpropagation in a neural network); a small worked sketch of this kind of calculation follows the description below. You need to understand this topic to be prepared for TensorFlow coding (or other libraries such as Keras and PyTorch) in the Python programming language.
    ==================================
    Deep Learning with TensorFlow and Keras Playlist
    ==================================
    • TensorFlow 2 Beginner ...
    #deep_learning #tensorflow #keras #neuralnetworks
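
    A minimal sketch of the kind of by-hand calculation described above: one linear neuron, a squared-error loss, and a gradient-descent update of both the weight and the bias. The numbers and the single-neuron setup are illustrative assumptions, not the exact example from the video.

    x, y_true = 1.5, 2.0               # assumed input and target value
    w, b = 0.8, 0.2                    # assumed initial weight and bias
    lr = 0.1                           # learning rate

    # forward pass (linear neuron, no activation)
    y_pred = w * x + b
    loss = (y_pred - y_true) ** 2      # squared error

    # backward pass: chain rule, one derivative at a time
    dloss_dy = 2 * (y_pred - y_true)   # dL/dy_pred
    dloss_dw = dloss_dy * x            # dy_pred/dw = x
    dloss_db = dloss_dy * 1.0          # dy_pred/db = 1

    # gradient-descent update of both the weight and the bias
    w -= lr * dloss_dw
    b -= lr * dloss_db
    print(loss, w, b)

    The same chain-rule pattern repeats neuron by neuron once hidden layers and activations are added.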

COMMENTS • 20

  • @torgath5088
    @torgath5088 6 months ago +3

    You are one of the very few who actually show the bias update; most videos are purely about the weight update.

    • @Koolac
      @Koolac  6 months ago +1

      So happy to hear that. Glad you liked the video.

  • @Ziv-rv9xb
    @Ziv-rv9xb 22 days ago +1

    The only explanation I understood, thanks!

    • @Koolac
      @Koolac  16 days ago +1

      Glad to hear that.
      Many thanks for your feedback.

  • @tejasvinnarayan2887
    @tejasvinnarayan2887 1 year ago +3

    Thank you so much, Koolac, for such a clear and simple explanation of a complex problem!

    • @Koolac
      @Koolac  1 year ago

      Happy to hear that. Thank you so much for your comment and support.

  • @ArturFejklowicz
    @ArturFejklowicz 2 months ago +3

    Thank you, after hours of searching this is the first video that properly explains backpropagation.

    • @Koolac
      @Koolac  15 days ago

      You're welcome.
      Glad you liked the video.

    • @mohammadosman4102
      @mohammadosman4102 13 hours ago

      Think about it this way: the bias is also a weight. Just add an additional input neuron whose value is always equal to 1, and voilà! The bias is just another weight.
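
      A tiny sketch of the point above: folding the bias into the weight vector by appending a constant input of 1 gives exactly the same pre-activation (the numbers are made up for illustration).

      x = [0.5, -1.2]                 # inputs
      w = [0.4, 0.7]                  # weights
      b = 0.3                         # bias

      # usual form: w . x + b
      z1 = sum(wi * xi for wi, xi in zip(w, x)) + b

      # same thing with the bias treated as a weight on a fixed input of 1
      x_aug = x + [1.0]
      w_aug = w + [b]
      z2 = sum(wi * xi for wi, xi in zip(w_aug, x_aug))

      print(z1, z2)                   # prints the same value twice: the bias is just another weight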

  • @ivanmounde5046
    @ivanmounde5046 1 year ago +3

    There is a small issue with the input of neuron 2: it has taken w3 instead of w2. Otherwise everything is in order.

  • @emreyaln7780
    @emreyaln7780 4 months ago

    The best explanation I have ever seen so far, thanks.

  • @stephanedibo8167
    @stephanedibo8167 8 months ago

    This course is a true blessing

  • @khameelmustapha
    @khameelmustapha 8 months ago

    This is such a brilliant explanation.

  • @guillermosainzzarate5110
    @guillermosainzzarate5110 4 months ago

    Thank you!!

  • @manthiwjs
    @manthiwjs 1 year ago +1

    Well explained. Thank you. :)

    • @Koolac
      @Koolac  1 year ago

      You're welcome. That's very kind of you. Many thanks for your feedback and support.

  • @madankhatri7727
    @madankhatri7727 4 months ago

    Nice explanation.

  • @evanhadi6395
    @evanhadi6395 3 months ago

    Isn't the true value supposed to be 3 and not 2? Or maybe I'm wrong?

  • @tejasvinnarayan2887
    @tejasvinnarayan2887 1 year ago +1

    How do you include activation functions in backward propagation?

    • @Koolac
      @Koolac  1 year ago

      I've done so in the video as well. I even calculated ReLU as an example.
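
      A small sketch of how an activation function enters the backward pass, using ReLU as mentioned in the reply above; the numbers and the one-neuron setup are illustrative assumptions, not the exact example from the video.

      x, y_true = 1.5, 2.0
      w, b = 0.8, 0.2
      lr = 0.1

      z = w * x + b                  # pre-activation
      a = max(0.0, z)                # ReLU activation
      loss = (a - y_true) ** 2       # squared error

      # backward pass: the activation adds one extra factor to the chain rule
      dloss_da = 2 * (a - y_true)
      da_dz = 1.0 if z > 0 else 0.0  # derivative of ReLU
      dloss_dz = dloss_da * da_dz

      w -= lr * (dloss_dz * x)       # dz/dw = x
      b -= lr * (dloss_dz * 1.0)     # dz/db = 1
      print(loss, w, b)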