I build all sorts of models for a living. I didn't get lost in the math, and unless you're optimizing your network you can squint at it. What's required is to learn how backprop works, how weights and biases are updated, and how activation affects the model's learning. Gradient descent is another thing you can squint at. "Seeing" it as a 3D terrain is very simple and powerful. Your videos were my education, 3b1b! They got me incredibly far.
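The "3D terrain" intuition above maps directly onto code: the loss is a surface over the parameters, and each update steps downhill along the negative gradient. A minimal sketch (the bowl-shaped loss and the learning rate here are just illustrative choices, not from any particular course):

```python
# Gradient descent on a 2-parameter "terrain": the loss
# f(w, b) = (w - 3)**2 + (b + 1)**2 is a bowl with its minimum
# at (3, -1); each step moves downhill along the negative gradient.

def loss(w, b):
    return (w - 3) ** 2 + (b + 1) ** 2

def grad(w, b):
    # Analytic partial derivatives of the loss above
    # (backprop computes these automatically for real networks).
    return 2 * (w - 3), 2 * (b + 1)

w, b = 0.0, 0.0  # start somewhere on the terrain
lr = 0.1         # learning rate: how big a step downhill to take
for _ in range(100):
    dw, db = grad(w, b)
    w -= lr * dw  # the weight update rule
    b -= lr * db  # the bias update rule

print(round(w, 3), round(b, 3))  # converges toward (3, -1)
```

Each iteration shrinks the distance to the minimum by a constant factor, which is why even this tiny loop lands essentially on the optimum.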
Building up from a foundation of simple linear and logistic regression with gradient descent, and explaining how introducing the one nonlinearity in ReLU allows piecewise linear approximation of any function by an NN layer, is exactly what Andrew Ng's excellent Coursera course does.
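That piecewise-linear claim is easy to see concretely: a one-hidden-layer ReLU network is a sum of hinge functions max(0, x - k), so its output is a piecewise linear curve whose slope changes at each "knot". A hedged sketch approximating f(x) = x² on [0, 2] with hand-picked knots and slopes (training would find such weights; this just shows the representation):

```python
# A sum of ReLU hinges is piecewise linear: the slope increases by
# slopes[i] at each knot knots[i]. These values are hand-chosen so the
# segments interpolate x**2 at the knot points, not learned by training.

def relu(x):
    return max(0.0, x)

knots = [0.0, 0.5, 1.0, 1.5]   # where the slope is allowed to change
slopes = [0.5, 1.0, 1.0, 1.0]  # slope increments at each knot

def nn(x):
    # Output of a 4-unit hidden layer with ReLU activations.
    return sum(s * relu(x - k) for s, k in zip(slopes, knots))

for x in [0.5, 1.0, 1.5]:
    print(x, nn(x), x * x)  # exact at the knots: 0.25, 1.0, 2.25
```

Adding more units (more knots) shrinks the gap between the linear segments and the true curve, which is the intuition behind the universal approximation argument the comment alludes to.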
Grant Sanderson is going to have such a sick legacy.
I bet! He's the man who motivated me to get into ML :)
I'm truly glad to live at the same time as Grant Sanderson.
How cool, now I know. Beautiful, man. Thanks for all the years on your channel, learning shit that a high school dropout never learned until I needed it.
I hope your work benefits all kinds of people, especially poor ones like me... yes, your videos have already helped me so much. Thank you.
This is pure gold, thank you for this!
Love this podcast! ❤
Good podcast!
Is it real 😂❤