To learn more about Lightning: github.com/PyTorchLightning/pytorch-lightning
To learn more about Grid: www.grid.ai/
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Does the book include LSTMs and RNNs for bivariate time series input data as well?
@@rathnakumarv3956 Nope, just basic neural networks. My next book on deep learning will have more stuff about fancy neural networks.
@@statquest
Please include CNNs, RNNs, and LSTMs for problems with multivariate time series as input and a continuous variable as output; these are very useful in climate change studies.
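To make the request above concrete, here is a minimal numpy sketch (illustrative only, not from the video) of an RNN that takes a multivariate time series as input and emits one continuous output. Each timestep's input is a feature vector (e.g. temperature, CO2, humidity); all weight names and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, seq_len = 3, 4, 5

W_x = rng.normal(size=(n_hidden, n_features))  # input-to-hidden weights
W_h = rng.normal(size=(n_hidden, n_hidden))    # hidden-to-hidden weights
w_out = rng.normal(size=n_hidden)              # hidden-to-output weights

x = rng.normal(size=(seq_len, n_features))     # multivariate time series
h = np.zeros(n_hidden)                         # initial hidden state
for t in range(seq_len):
    h = np.tanh(W_x @ x[t] + W_h @ h)          # recurrent update per timestep
y = w_out @ h                                  # single continuous prediction
print(float(y))
```

The only change from a univariate RNN is that `W_x` becomes a matrix mapping a feature vector, rather than a single scalar weight.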
@Tech What time point, minutes and seconds, is confusing? And have you watched the entire series on Neural Networks before watching this one?
@statquest I have a question about the vanishing gradient problem. I understand that input1 from the first timestep has less and less impact on the output the more steps we take, but isn't the gradient also relying on the new inputs at every timestep? I don't understand why the gradient is vanishing if the new inputs aren't discounted as heavily as the older timesteps. I imagined it's more like the old inputs are less impactful and the network is more focused on the newer inputs, but can still train normally. Is there something I'm missing?
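The intuition in the question above is essentially right, and a tiny worked example makes it precise. In a hypothetical linear RNN with a scalar hidden state, h_t = w * h_{t-1} + u * x_t, the gradient of the final hidden state with respect to input x_k is u * w^(T−k): recent inputs pass through few recurrent multiplications and train fine, while early inputs' gradients shrink geometrically when |w| < 1, which is exactly why long-range dependencies are hard to learn.

```python
# Sketch of gradient magnitudes in the simple linear RNN
# h_t = w * h_{t-1} + u * x_t (hypothetical scalar model).

def rnn_input_gradients(w, u, T):
    """Return d h_T / d x_k for k = 1..T, which equals u * w**(T - k)."""
    return [u * w ** (T - k) for k in range(1, T + 1)]

grads = rnn_input_gradients(w=0.5, u=1.0, T=10)
print(grads[0])   # gradient w.r.t. x_1: 0.5**9, nearly vanished
print(grads[-1])  # gradient w.r.t. x_10: 1.0, trains normally
```

So the network can still train on recent inputs; "vanishing" refers to the gradient signal that would teach it about the earliest inputs.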
The only place on the internet where you can actually grasp a complex topic before diving deeper into the topic. I am so grateful people like you exist. Thank you!
Thanks!
@@statquest np
I'm in a deep learning class right now and the amount of straight math that my teacher throws at me is overwhelming. Videos like yours are incredible and I'm so thankful for the help and the color coding and the fun that makes it worth watching! It is super helpful as I'm studying for my midterm and just want to get a more definite grasp of what all this math actually means without reading someone's Mathematics PhD dissertation or something
Good luck on your midterm! :)
Every time I watch one of your lessons, I become so happy, because you make all the subjects easy to understand in a magical way. Thank you for your effort
Wow, thank you!
One of the most underrated channels. Never once have I had trouble understanding the intuition of whatever you explain. I'd donate money to you if I weren't a broke college student.
Thanks!
OMG, Finally I understand Vanishing Exploding Gradient, Thank you StatQuest!
HOORAY!!! Thanks for supporting StatQuest!!! TRIPLE BAM! :)
This is the best video I came across that explains RNNs in detail without leaving out a single thing.
If my teachers had explained like this, I would have understood better.
Thanks a lot
Glad you liked it!
Thanks to your series of videos on neural networks, I was able to pass the entrance exam for PhD program at St. Petersburg State University.
Congratulations!!! TRIPLE BAM!!! :)
To learn more about Lightning: github.com/PyTorchLightning/pytorch-lightning
To learn more about Grid: www.grid.ai/
Support StatQuest by buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF - statquest.gumroad.com/l/wvtmc
Paperback - www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - www.amazon.com/dp/B09ZG79HXC
English
This video has been dubbed using an artificial voice via aloud.area120.google.com to increase accessibility. You can change the audio track language in the Settings menu.
Spanish
This video has been dubbed into Spanish using an artificial voice via aloud.area120.google.com to increase accessibility. You can change the audio track language in the Settings menu.
Portuguese
This video has been dubbed into Portuguese using an artificial voice via aloud.area120.google.com to improve its accessibility. You can change the audio language in the Settings menu.
Josh!!!! I love u!!! I can't wait to learn about the Transformers!! thank you very much for your content
Thank you!
Josh teaching about transformers would be a blessing
@@capyk5455 I'm looking forward to it!
Is the Transformers video out yet, or is there an ETA to expect?
@@shaiguitar The LSTM video comes out in the next week or so. Then I'll start working on transformers.
You can't understand how good this is. I've spent all of yesterday trying to understand these concepts but I couldn't grasp them. THANK YOU!!!
Glad it helped!
This is the CLEAREST explanation of RNNs.
Thank you!
Never quite understood RNNs until I watched this video, thank you! A hand-calculated example of a one-to-one RNN is extremely hard to find online, so this was perfect. The only one out there, I believe.
Thanks!
wow, just wow! 2 days of headache solved by a 17 min video! thank you for existing.😊
bam!
Josh, I found your channel yesterday and have been binge watching. Incredible work in democratizing knowledge. Thankful for your work.
Thank you!
democratizing knowledge -- exactly !!!
With this level of simplicity in teaching, even a high schooler could grasp these concepts, probably quicker than me!
Scared of the future now....
:)
You're gonna carry me through my neural networks class, what a godsend
You can do it!
I was looking for one small thing about RNNs, but your way of explaining forced me to keep watching the entire video! And I subscribed to your channel!!
Hooray! Thank you very much! :)
First time commenting on youtube video. You are a living legend. It has been two days I am trying to understand RNNs and came across your video and Baam. RNNs concept got cleared.
Glad I could help!
Our lecturer at the uni recommended this video to us. I am amazed at how simply it is put. Great job! Both funny and informative ❤
Thank you!
Thank you for making these videos! They are very helpful.
TRIPLE BAM!!! Thank you so much for supporting StatQuest!!! :)
This explanation covers some very important points that were missed in several other lectures I've watched on this subject. Thank you for clearing things up for me.
For example, the note at 10:31
Thank you! Yes, that little detail is important, but often glossed over.
I come to listen to the "peep poop poop"
bam! :)
Wom wom ! 😂😅
This is not fair, I literally am addicted to your style of teaching and find it quite hard to learn from other sources now.
Hopefully, in the long term, what you learn from these videos will make it easier to understand other sources. At least, that's starting to happen for me. It's still hard, though.
honestly your channel is one of if not the best channel on all youtube, thank you so much for this!
Wow, thank you!
Josh, you are the person who makes ML theory so understandable!
Thank you!
Your channel should be mandatory for all universities teaching AI 💖
Maybe one day!
It will be for sure 😊
I'm just in love with your content. I've watched your neural network series and it was just so easy to understand. You really deserve more subs and views Josh!
Thank you very much! :)
Thank you so much 😭
People like you are the real MVPs of humanity!
Thanks!
Really looking forward to your LSTM video.. You are a very good teacher !!
Thank you!
@@statquest When will you make the next vid? I have an exam in two weeks and I need your LSTM video
@@WonPeace94 :)
I'm a simple man: I see StatQuest, I click like. Can't wait for the videos on transformers.
bam! :)
Amazing video as always, professor! I cant wait for the video on LSTM
You and me both!
@@statquest When it comes to applications of RNNs, the LSTM is sometimes a must-have :) That's why it would be great to have a clear explanation of LSTMs. But these are little things. In any case, thank you very much for the valuable knowledge that we can get here.
@@usamsersultanov689 I hope to have a video on LSTMs soon.
@@statquest One on Transformers and their variations would be even greater :D
@@james199879 That's the master plan, but we'll get there one step at a time.
OH MY GOD THIS IS EXACTLY WHAT I WAS LOOKING FOR THANK YOU SO MUCH
bam! :)
StatQuest's stylized scalar-based numerical examples are amazing even for learning beyond the introductory level. To get the full vectorized-matrix version of the algorithms, I just mentally swap x, w, b, etc. with the corresponding vectors and matrices, and then it's golden!
Bam! :)
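The "mental swap" described above can be sketched in a few lines: the same RNN update, first with scalars, then with vectors and matrices (function and parameter names here are illustrative, not from the video).

```python
import numpy as np

def rnn_step_scalar(x, h, w_x, w_h, b):
    # h_new = tanh(w_x * x + w_h * h + b), everything a single number
    return np.tanh(w_x * x + w_h * h + b)

def rnn_step_vector(x, h, W_x, W_h, b):
    # Identical formula; the scalars are replaced by an input vector x,
    # hidden vector h, weight matrices W_x and W_h, and bias vector b
    return np.tanh(W_x @ x + W_h @ h + b)

# With 1x1 matrices, the vectorized step reproduces the scalar step exactly.
s = rnn_step_scalar(2.0, 0.5, w_x=0.3, w_h=-0.1, b=0.2)
v = rnn_step_vector(np.array([2.0]), np.array([0.5]),
                    W_x=np.array([[0.3]]), W_h=np.array([[-0.1]]),
                    b=np.array([0.2]))
print(s, v[0])  # the two values agree
```

The 1x1 case makes the correspondence explicit: the scalar formulas are the vectorized formulas with every dimension set to one.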
I like the way how clearly and easily you explain concepts. Thank you very much!
You're very welcome!
you definitely are the best teacher for machine learning and deep learning
Thank you!
I literally was having a mental breakdown because I was unable to understand things. Your video helped me a lot and also brought a smile to my face :))
Glad I could help!
Dude, that DOUBLE BAM and TRIPLE BAM kills me. Actually a fun way to get info. Also a great video, very easy to understand.
Thank you! :)
Very high-level explanation. Waiting for the next video on Long Short-Term Memory networks. Thank you so much.
Thanks! :)
True Hero. I have an exam on 29th about rnns, lstms and transformers.
Good luck! :)
12:22 is probably the cutest bam I've heard
Also thank you for your videos! They have definitely been helping me get through my Bioinformatics grad course. You are AWESOME
Thank you so much and good luck with your course! :)
Hey Josh...the way you made this so easy is mind blowing, I love you man, keep being awesome 😊
Thank you so much 😀!
Excellent project! I didn't think it would be so entertaining and informative with drawings. Definitely a very good video to start with!
Thank you very much! :)
The way you explained RNNs made me so excited for LSTMs. Can't wait to see it!
bam!
Oh my gosh that was an amazing explanation. I'm quite literally flabbergasted. Thanks, mate!!
Glad you liked it!
Hi Josh, You are the best. Nobody has explained exploding gradient like you have, Thank you
Wow, thanks!
This was the best explanation I've heard for RNNs!
Thank you! :)
Oh man! This has been super tough for me to wrap my head around. I knew this was going to be a great weekend! Thank you for the drop! :D
BAM! :)
Literally, before I even watched your video, I gave it a like; that's how much I trust your information and knowledge. Thank you for your time and effort in explaining this to us.
Thanks!
Great video, you must become the President of the ClearlyExplainedLand
:)
Clearly explained the difference between an RNN and a normal network, and gradient vanishing/exploding! Looking forward to the LSTM and Transformer videos!!!
Thanks!
Love the little embarrassing singing during the videos. Subscribed. Great videos!
Thanks!
You graced upon us as a stats saviour :))) Send love from Australia
Awesome! Thank you!
That intro was sick. I smashed the like button immediately :D
bam! :)
Why are you a master of everything???? I have been watching your videos for two years, throughout my university course.
Ha! Thank you! :)
Oh man, I literally watch his videos like a web series; it's very fun and very easy to understand. Thank you very much, sir!!!!😭😭
Thanks!
Man, this is awesome. I wasn't understanding anything about RNNs in my course but thanks to this video is all clear now.
Thank you Josh Starmer :D
BAM! :)
Thank you
DOUBLE BAM!!! Wow! I really appreciate the support!
You are a very, very, very brilliant teacher! You are my low-variance and low-bias position.
bam! :)
Best explanation I've seen from RNNs. Thanks.
Thank you!
Thank you brother, this was really intuitive and easy to understand
Thanks!
This dude is funny and a great teacher. Triple BAM fosure
Thanks!
Anxious for the LSTM, Transformers and Attention StatQuests!
It's already available to early access patreon supporters and will be available to everyone else soon.
Hello guys, let's always make this man happy, as he did for us!!!!!!!!!!!!! Nothing to say, just thanks a lot.
BAM! Thanks!
I saw a light turning on my head! great video
bam! :)
Amazing explanation with a simple, easy-to-absorb, engaging method that still keeps the concepts clear. 🙂 Kabaam... nice job
Thank you! :)
KA-BAAAM! Thank you for all these amazing videos. I wish you had different series about CNNs and RNNs separately.
If you want to learn about CNNs, see: ua-cam.com/video/HGwBXDKFk9I/v-deo.html
@@statquest How could I not have seen this?! By different series, I mean it would be great if you could create more videos covering each of these topics in more detail. But of course, you've already done so much, and I'm so grateful to you for sharing your knowledge in such a good way.
Your professorial ability is only exceeded by your singing!!
Thanks!
@@statquest When are you going to release your, surely to be a best-selling, book on singing lessons?
@@exxzxxe Just as soon as I win Eurovision.
You have an amazing way of explaining with ad-libs. Loved it, and thank you so much; I was not able to understand it at all, but now it is very clear.
Glad I could help!
Those tones, won won, bam, double bam, kaboom, and the fun way of learning open up the mind for grasping things really quickly, and we can think freely without becoming nervous. You legend 🙌
Thanks!
Thanks!
Hooray!!! Thank you so much for supporting StatQuest! TRIPLE BAM! :)
Any darn fool can make something complex; it takes a genius to make something simple ("Albert Einstein"), and you made it very, very simple. Thanks!
Thank you very much! :)
PENTA BAM!!! The best pre-course !
Thanks!
Great video!! I can't wait for LSTM and transformer videos!
Coming soon!
Thank you for sharing 🙂 super excited for the transformers statsquest!
Thanks!
Thanks a lot Josh. Every concept explained by you is a BAM!!!!!!!!!!
Glad you think so!
Amazing, This is one best and coolest learning tutorial i have watched ever, great work Josh, keep it up. Thanks
Thanks, will do!
Been waiting and waiting, the waiting is BAM!!!
:)
Thank YOUUU, clearly explained!! I have been struggling with it!
BAM! :)
Beautiful and succinct explanations!! So glad I found your channel....lots of love
Thanks!
Clear as day!!! Hooray!!!! Thank you Josh
Thank you!
Thank you for this amazing explanation! Waiting for the video on LSTM! :)
Coming soon!
Thank you! (谢谢!)
Thank you so much for supporting StatQuest! TRIPLE BAM! :)
Thanks
TRIPLE BAM!!! Thank you very much for supporting StatQuest!!! :)
Very hyped for the video transformers ! Keep up the good work, it's amazing how good it is!
Thanks!
Such a great explanation!! Will be watching as many of your other videos as I can while we wait for the next one :))
Thank you! 😃
Josh, you are amazing! Thank God you exist
Thank you!
Thanks!! It was excellent ✨✨✨✨ Blessings
Thank you very much!
Hey, hopefully this will safe my Deep Learning exam. And... love the sound effects.
Best of luck!
I wish you were my math teacher! The whole class would have sang like you while calculating🤣
That would be awesome!
You are insane, man; a very clear and understandable explanation!! Thanks a ton 🎉
Happy to help!
... you sir are a timeless legend!
Thanks!
Thank you so much. You make great videos... just great teaching. Thanks a lot.
Glad you like them!
You are a king, my friend
Perfect explanation with a simple example
Thanks!
All love. Looking forward to LSTM video.
Thanks! The LSTM video should be out on Monday.
OMG I love you for your teaching style
Thank you! 😃
The video was very well explained; I understood it easily. Thanks
Thank you very much! :)
Omg, that intro jingle is gold!
bam!
Great video & explanation 👏🌟. Thank you so much.
Glad it was helpful!
Really liked the video. Quite creative and straight to the point!
Thanks!