Recurrent Neural Networks - EXPLAINED!
- Published 23 Nov 2018
- Understand exactly how RNNs work on the inside and why they are so versatile (NLP applications, Time Series Analysis, etc.). We are also going to construct a recurrent neural net from scratch using only numpy! (A minimal sketch of that forward pass follows the references below.)
Follow me on M E D I U M: towardsdatascience.com/likeli...
Code: github.com/dennybritz/rnn-tut...
(Give this dude a star for his hard work!)
Patreon: patreon.com/codeemporium
Quora: www.quora.com/profile/Ajay-Ha...
REFERENCES
[1] Slides from the Deep Learning book for RNNs: www.deeplearningbook.org/slid...
[2] Andrej Karpathy's blog + code (you can probably understand more from this now!): karpathy.github.io/2015/05/21/...
[3] The Deep Learning book on sequence modeling: www.deeplearningbook.org/cont...
[4] Colah's blog on LSTMs: colah.github.io/posts/2015-08-...
[5] WildML for coding RNNs: www.wildml.com/2015/10/recurre...
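Below is a minimal numpy sketch of the kind of forward pass the video describes, using the tanh recurrence from the Deep Learning book slides. All sizes and names here are illustrative assumptions, not taken from the linked repo:

```python
import numpy as np

# Vanilla RNN forward pass in the Goodfellow notation the video uses:
#   a_t = b + W h_{t-1} + U x_t,   h_t = tanh(a_t),   o_t = c + V h_t
# Dimensions and variable names are illustrative, not from the video's code.

hidden_size, input_size, output_size = 8, 4, 3

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (shared across time)
V = rng.normal(scale=0.1, size=(output_size, hidden_size))  # hidden -> output
b = np.zeros(hidden_size)
c = np.zeros(output_size)

def forward(xs):
    """Run the RNN over a sequence xs of shape (T, input_size)."""
    h = np.zeros(hidden_size)          # h_0: initial state
    outputs = []
    for x in xs:                       # the same U, W, V are reused at every step
        a = b + W @ h + U @ x          # pre-activation a_t
        h = np.tanh(a)                 # new hidden state h_t
        outputs.append(c + V @ h)      # output o_t
    return np.array(outputs), h

xs = rng.normal(size=(5, input_size))  # a toy sequence of length 5
outputs, h_final = forward(xs)
print(outputs.shape)                   # (5, 3): one output per timestep
```

Note how the weight matrices are created once and reused inside the loop: that weight sharing across timesteps is the defining feature of an RNN, and it comes up repeatedly in the comments below.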
This is by far the best explanation of RNNs. A lot of effort clearly went into making it. Thank you for this content. 🙏
Thank you for recognizing my talent and glad this helps :)
I love that you use the images from Ian’s book. I’m currently reading that one, and this keeps some nice consistency between the book and your videos
Subscribed, this is quality content!
Excellent tutorial, my friend! Subscribed, please keep it going!
Amen
Why is this so similar to state-space systems in control theory? Is there any chance that they are related?
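They are closely related: a vanilla RNN has the same shape as a discrete-time state-space model, just with a nonlinearity and learned matrices. A rough side-by-side, using standard textbook forms rather than anything from the video:

```latex
\begin{aligned}
\text{state-space:}\quad x_t &= A\,x_{t-1} + B\,u_t, & y_t &= C\,x_t \\
\text{vanilla RNN:}\quad h_t &= \tanh\!\left(W\,h_{t-1} + U\,x_t + b\right), & o_t &= V\,h_t + c
\end{aligned}
```

In that sense an RNN is a nonlinear state-space model whose A, B, C (here W, U, V) are learned from data instead of derived from physics.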
HELP!!!!! In an RNN we have only 3 unique weight matrices, so during backprop there will be only 3 parameters to update. Why, then, does backprop in an RNN go all the way back to the 1st input and create long-term dependencies, thereby causing the vanishing gradient problem?
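A sketch of why, assuming the standard backprop-through-time derivation (notation follows the tanh recurrence in the sketch above, not the video's exact slides): the three weight matrices are shared, but the loss at step t depends on W through every earlier hidden state, so its gradient sums a contribution from each timestep:

```latex
\frac{\partial L_t}{\partial W}
  = \sum_{k=1}^{t}
    \frac{\partial L_t}{\partial h_t}
    \left( \prod_{j=k+1}^{t} \frac{\partial h_j}{\partial h_{j-1}} \right)
    \frac{\partial^{+} h_k}{\partial W},
\qquad
\frac{\partial h_j}{\partial h_{j-1}}
  = \operatorname{diag}\!\left(1 - h_j^{2}\right) W
```

(∂⁺ denotes the "immediate" partial at step k.) Each extra timestep multiplies in another factor diag(1 − h²) W, whose norm is typically below 1, so contributions from distant timesteps shrink exponentially: that is the vanishing gradient. There are only three matrices to update, but their gradients still flow through the entire unrolled sequence.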
Hello, you said at 6:36 that all the h's are actually the same. Are they really? As far as I knew, only the weights (V, W, and U) are kept constant while unfolding, not the states themselves.
Unless what the video author meant is that the function h(t) = tanh(a(t)) is the same at every step, the value of h(t) itself will be different at each stage.
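That reading is right: the update function and its weights are identical at every step, while the state it produces changes. A tiny numpy demonstration (toy sizes, illustrative names):

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(scale=0.5, size=(3, 3))  # same W at every step
U = rng.normal(scale=0.5, size=(3, 2))  # same U at every step

h = np.zeros(3)                          # h_0
for t, x in enumerate(rng.normal(size=(4, 2)), start=1):
    h = np.tanh(W @ h + U @ x)           # identical update rule each step...
    print(f"h_{t} =", np.round(h, 3))    # ...but a different state each step
```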
One of the most underrated channels on YouTube!
Hi man, I love your work! It would be great if you did a tutorial about Image Captioning.
I was thinking the same. Probably a future video. And thanks for watching!
PLEASE! Please make a little image showing which variable represents what in your mathematical notation. I am a beginner with only a little experience, and it would be extremely helpful if you did this!
Great Video
It is confusing that you use "a" not for the activation but for the weighted sum... correct me if I am wrong.
Puppy in a cup - classic. Great video
Thank youu
Omg! How do you know exactly what videos I want in the moment!??? Thanks!
I can read minds online. It's a thing you develop over time.
Theoretically, very good. However, understanding the code is difficult. Please make videos as hands-on sessions so we can follow along better.
Great video! One small correction: it's pronounced "Eeee-pock", not "eh pick". Epic describes something awesome; an epoch is one forward-and-backward pass over all the training examples.
Thanks for the compliments! About the pronunciation of epoch, I've looked at a number of sources. If you check out Merriam-Webster's pronunciation, it can be either "ee-pock" or "eh pick" depending on whether you use American or British English. I have a bad habit of combining both ;)
@@CodeEmporium Yeah I've read that as well. The "eh pick" has greater semantic ambiguity (~ higher Shannon entropy) since there are multiple meanings (more possible states), whereas "ee-pock" is much less ambiguous, and therefore it makes more sense to go with that one, if the goal is to be as clear as possible when communicating. But minor detail; really, I love your videos. Have learned so much from you! Keep up the great work
@@RedShipsofSpainAgain lol dude! Geek alert!!! peace out :) Great video indeed!
@@aakashagarwal2602 lol yup. Once you learn about Shannon entropy, you see its applications EVERYWHERE! it really is a cool theory
Halfway through the video, do you want to marry me because I'd say yes any day? Beautifully made and I'm not sure if I have more questions or less than before but I can confidently say that you've helped me a ton.