Backpropagation in Neural Network with an Example By hand - TensorFlow Tutorial
- Published 23 Nov 2022
In this video, we walk through a step-by-step example of calculating forward and backward propagation by hand: we compute the derivatives and everything else needed to update the weights and biases in the feed-forward pass and in backpropagation through a neural network. Understanding this topic will prepare you for coding in TensorFlow (or other frameworks such as Keras and PyTorch) in Python.
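The by-hand calculation described above can be sketched in a few lines of plain Python. The numbers here are assumptions for illustration (they are not the exact values used in the video): a single neuron `y = w*x + b` trained on one example with squared-error loss, with the chain rule applied to get the gradients for both the weight and the bias.

```python
# Minimal sketch (assumed values, not the video's exact numbers):
# one forward and one backward pass by hand for y = w*x + b with MSE loss.
x, y_true = 1.5, 2.0        # one training example
w, b = 0.8, 0.2             # initial weight and bias
lr = 0.1                    # learning rate

# Forward pass
y_pred = w * x + b              # 0.8*1.5 + 0.2 = 1.4
loss = (y_pred - y_true) ** 2   # (1.4 - 2.0)^2 = 0.36

# Backward pass (chain rule)
dL_dy = 2 * (y_pred - y_true)   # dL/dy_pred = 2*(1.4 - 2.0) = -1.2
dL_dw = dL_dy * x               # dy_pred/dw = x  -> -1.8
dL_db = dL_dy * 1               # dy_pred/db = 1  -> -1.2

# Gradient-descent update for BOTH the weight and the bias
w -= lr * dL_dw             # 0.8 - 0.1*(-1.8) = 0.98
b -= lr * dL_db             # 0.2 - 0.1*(-1.2) = 0.32
```

The same chain-rule pattern extends to multi-layer networks: each layer multiplies in one more derivative factor on the way back.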
==================================
Deep Learning with TensorFlow and Keras Playlist
==================================
• TensorFlow 2 Beginner ...
#deep_learning #tensorflow #keras #neuralnetworks
You are one of the very few who actually show the bias update; most videos are purely about the weight update.
So happy to hear that. Glad you liked the video.
The only explanation I understood, Thanks
Glad to hear that.
Many thanks for your feedback.
Thank you Koolac so much for a such a clear and simple explanation of a complex problem!
Happy to hear that. Thank you so much for your comment and support.
Thank you, after hours of searching this is the first video that explains properly the back propagation
You're welcome.
Glad you liked the video.
Think about it this way: the bias is also a weight. Just add an extra neuron whose value is always equal to 1, and voilà! The bias is just another weight.
There is a small issue with the input of neuron 2: it uses w3 instead of w2. Otherwise everything is in order.
The best explanation I have seen so far, thanks!
This course is a true blessing
This is such a brilliant explanation.
Thank you!!
well explained. Thank you.:)
You're welcome. It's nice of you. Many thanks for your feedback and support.
nice explanation
Isn't the true value supposed to be 3 and not 2? Or maybe I'm wrong?
How do you include activation functions in backward propagation?
I've done so in the video as well. I even calculated ReLU as an example.
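To illustrate the answer above: including an activation in the backward pass just adds one more chain-rule factor, the activation's derivative evaluated at the pre-activation. A sketch with ReLU and assumed values (not the video's exact numbers):

```python
# Sketch: backprop through a ReLU activation, with assumed values.
def relu(z):
    return max(z, 0.0)

def relu_grad(z):
    # Derivative of ReLU: 1 where z > 0, else 0
    return 1.0 if z > 0 else 0.0

x, y_true, w, b = 2.0, 1.0, 0.5, -0.2

# Forward pass
z = w * x + b                 # pre-activation: 0.5*2.0 - 0.2 = 0.8
a = relu(z)                   # activation: 0.8
loss = (a - y_true) ** 2      # MSE: (0.8 - 1.0)^2 = 0.04

# Backward pass: the activation contributes one extra factor
dL_da = 2 * (a - y_true)      # -0.4
dL_dz = dL_da * relu_grad(z)  # extra chain-rule factor: -0.4 * 1 = -0.4
dL_dw = dL_dz * x             # -0.8
dL_db = dL_dz                 # -0.4
```

Note that if `z` had been negative, `relu_grad(z)` would be 0 and no gradient would flow back through this neuron for that example.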