Batch Normalization (“batch norm”) explained
- Published 4 Jun 2024
- Let's discuss batch normalization, otherwise known as batch norm, and show how it applies to training artificial neural networks. We also briefly review general normalization and standardization techniques, and then see how to implement batch norm in code with Keras.
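As a rough sketch of what the video covers, a batch norm layer in Keras can be placed between a hidden layer and the output layer like this (the layer sizes and data here are illustrative, not from the video):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A tiny model with a batch norm layer between the hidden and output layers.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(16, activation='relu'),
    layers.BatchNormalization(),           # normalizes activations per batch
    layers.Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

# Sanity check on random data: 8 samples in, 8 class distributions out.
x = np.random.rand(8, 4).astype('float32')
out = model.predict(x, verbose=0)
print(out.shape)  # (8, 2)
```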
🕒🦎 VIDEO SECTIONS 🦎🕒
00:00 Welcome to DEEPLIZARD - Go to deeplizard.com for learning resources
00:30 Help deeplizard add video timestamps - See example in the description
07:02 Collective Intelligence and the DEEPLIZARD HIVEMIND
💥🦎 DEEPLIZARD COMMUNITY RESOURCES 🦎💥
👋 Hey, we're Chris and Mandy, the creators of deeplizard!
👉 Check out the website for more learning material:
🔗 deeplizard.com
💻 ENROLL TO GET DOWNLOAD ACCESS TO CODE FILES
🔗 deeplizard.com/resources
🧠 Support collective intelligence, join the deeplizard hivemind:
🔗 deeplizard.com/hivemind
🧠 Use code DEEPLIZARD at checkout to receive 15% off your first Neurohacker order
👉 Use your receipt from Neurohacker to get a discount on deeplizard courses
🔗 neurohacker.com/shop?rfsn=648...
👀 CHECK OUT OUR VLOG:
🔗 / deeplizardvlog
❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
Tammy
Mano Prime
Ling Li
🚀 Boost collective intelligence by sharing this video on social media!
👀 Follow deeplizard:
Our vlog: / deeplizardvlog
Facebook: / deeplizard
Instagram: / deeplizard
Twitter: / deeplizard
Patreon: / deeplizard
YouTube: / deeplizard
🎓 Deep Learning with deeplizard:
Deep Learning Dictionary - deeplizard.com/course/ddcpailzrd
Deep Learning Fundamentals - deeplizard.com/course/dlcpailzrd
Learn TensorFlow - deeplizard.com/course/tfcpailzrd
Learn PyTorch - deeplizard.com/course/ptcpailzrd
Natural Language Processing - deeplizard.com/course/txtcpai...
Reinforcement Learning - deeplizard.com/course/rlcpailzrd
Generative Adversarial Networks - deeplizard.com/course/gacpailzrd
🎓 Other Courses:
DL Fundamentals Classic - deeplizard.com/learn/video/gZ...
Deep Learning Deployment - deeplizard.com/learn/video/SI...
Data Science - deeplizard.com/learn/video/d1...
Trading - deeplizard.com/learn/video/Zp...
🛒 Check out products deeplizard recommends on Amazon:
🔗 amazon.com/shop/deeplizard
🎵 deeplizard uses music by Kevin MacLeod
🔗 / @incompetech_kmac
❤️ Please use the knowledge gained from deeplizard content for good, not evil.
Machine Learning / Deep Learning Tutorials for Programmers playlist: ua-cam.com/play/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU.html
Keras Machine Learning / Deep Learning Tutorial playlist: ua-cam.com/play/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL.html
I already asked this on another video, but just to cover as much ground as possible:
Could I possibly normalize the weights to have mean 0 and variance 1 at weight initialization?
I am indebted to you for teaching me so much in one day. I would have kissed your hand in gratitude if you were in front of me. NNs are such a convoluted mess, but you have made things easier.
Can we make a game where ai have their own life and we live as their family and social system with our friends
I'm deeply impressed by the quality of your videos. Allow me to say that these, by far, are the most helpful video tutorials on Neural Networks. I seriously appreciate the time you spend researching such information and then putting it in such a concise pleasant way, that's also easy to comprehend. Trust me without you, I wouldn't have been able to understand what changes these parameters make in the network. That's why, thank you very very much for both the time and the effort you put into this! And please, please, keep making more tutorials.
Also, I'd like to remark that the topics of these videos are so sequential that if you're following the playlist from the very beginning, you'd absolutely be able to make sense of everything noted in the videos, regardless of your prior knowledge of Neural Networks. Besides, the Keras playlist is complementary and adds a lot to the learning experience.
All in all, this is - in one word - "professional work".
Wow kareem, thank you so much for leaving such a thoughtful comment! I'm very happy to hear the value you're getting from this series, and we're really glad to have you here!
i don't allow you to say..!!
That was the purpose of these *deep learning* videos: to be *deeply* impressed by the *learning* you get.
God bless you, my dear Teacher. I saw in every lesson that you put the whole ocean in a small jar. This is a unique quality, and very few teachers have it.
Thank you, Hafiz!
For several days I read article after article trying to understand what Batch Norm really does, and then I found your video. Perfectly explained, thanks a lot!
One of the few YouTube series I have completed in my life. Instead of beating around the bush, you kept it to the point with tons of info in just a few minutes. Hope to see more such series.
THANK YOU SO MUCH FOR THIS AMAZING PLAYLIST! One of the best channels for learning deep learning. Absolutely loved your content. It was explained in the easiest possible way and awesome graphical illustrations. You really worked hard on the editing! Thanks again!
Literally watched all 38 videos in one go. Thank you so much!
Finally completed the deep learning series. Thank you for such amazing videos and blogs, given away for free on YouTube. It's great quality!!!
Thanks, I'm writing my thesis thanks to your explanations!
Completed the whole playlist. Now I am confident about the basics of neural networks. Thanks a lot for the great series!!
I love your tutorial. The illustration is just so concise and easy to understand. Thank you for all your effort in making these videos!
Top-notch, I finished it all. Kudos to the deeplizard team, love you all, love you Mandy; your sweet voice keeps us going.
This is the best intro to deep learning I have seen anywhere, be it a textbook or a video lecture series. You have definitely put serious effort and thought into breaking down this dense topic into bite-size tutorials packed with a logical chain of thought that is easy to follow. Thanks a lot :)
finding this channel has been a great help for my studies!
worth watching all the videos because of the content delivery and quality. big thumbs up for the entire team
These tutorial videos are one of the best ones I could find. The explanations are extremely lucid and so easy to understand. I really hope you expand your pool of videos to include other topics such as RNNs. You could also dedicate some videos to hyper-plane classifiers, SVMs, RL, even some optimization methods. All in all the set of videos is just amazing!
Thanks for all of your hard work in putting this series together. I just finished this last video & I can say that with your help I am much further ahead in understanding deep learning. God bless!
I am really fascinated by the hard work that brings such quality to your videos! I would be really happy if you could make as much more content as possible. Channels like yours keep the spirits of students like us really high! Just one word to sum it up....... OUTSTANDING !!
Great video! But from my understanding, only g and b are trainable. At 4:23, it is mentioned that the mean and std are parameters as well ("these four parameters ... are all trainable").
Thanks Fernando, you’re right! The blog for this video has the correction :)
deeplizard.com/learn/video/dXB-KQYkzNU
Came looking for this comment! Thanks for stopping me from losing my mind trying to reconcile this explanation with the paper.
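For readers following this thread: in batch norm, only the scale g (gamma) and shift b (beta) are learned by gradient descent; the mean and std are statistics computed from each batch. A minimal NumPy sketch of the forward pass (variable names and values are my own illustration, not from the video):

```python
import numpy as np

def batch_norm_forward(z, gamma, beta, eps=1e-5):
    # The mean and std are computed from the current batch;
    # they are statistics, not trainable parameters.
    mean = z.mean(axis=0)
    std = z.std(axis=0)
    z_hat = (z - mean) / (std + eps)
    # gamma (scale) and beta (shift) are the only trainable parameters.
    return gamma * z_hat + beta

z = np.array([[1.0, 2.0],
              [3.0, 6.0]])
out = batch_norm_forward(z, gamma=np.ones(2), beta=np.zeros(2))
print(out.mean(axis=0))  # each column now has mean ~0
```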
Wow, thanks for putting this up. You deserve every like and every subscribe. Great job.
This is one of the most comprehensive videos I have ever watched.....
Really, thank you, and I am looking forward to more advanced concepts.
Great playlist!! I went through the entire Deep Learning playlist, and have to say it's probably one of the best at explaining deep learning in a simplistic way. Thanks for sharing your knowledge!! 👍
Clearly explained, good animation, covered most areas. Thanks
Thank you very much for this whole series! It was really enjoyable to watch and I learnt a lot!
Thanks for the amazing series! I really enjoyed your videos! Keep up the good work! Hope to see more complex networks made simple by you!
just woaaa ..! Please keep making these videos, it's by far the best explanation I got here
Very Excellent, I hope you continue this series. Your explanation is so clear.
Thank you, deeplizard!
The Machine Learning & Deep Learning Fundamentals playlist helped me understand the concepts of ML super easily.
Thank you so much :D
Thanks for the amazing explanation!! By far the best tutorial video I've seen!
Wow, this is awesome. Kudos to you! Perfect explanation. Was trying to understand batchnorm from some websites and articles, this was much better than any of them. Thanks!
The video I was searching the internet for like a beggar, to help me understand steps 2 and 3 of batch norm. Here it was, finally! Thank you so much for doing great work. I really, really appreciate it. Such a simple, calm, and informative explanation of a very important topic.
Oh brother, found the 💰💰💰 treasure 💰💰💰💰💰💰💰💰💰
I have successfully binged (across 2 weeks) this playlist and found them really helpful! Thank you for all you do and keep up the good work. Hope to watch more vids getting added here or elsewhere on the channel. Lots of love:)
Thank you, and great work! Check out the homepage of deeplizard.com to see all other DL courses and the order in which to take them after this one!
Thank you so much for your explanations! I'm writing my PhD thesis, and your tutorial helped me a lot :)
Wonderful work. Thank you for setting up all this content.
This is a gem! Thank you very much!!!
I think every machine learning specialist, even an experienced one, will find something new for themselves in your course :) Great course, thanks a lot!
Hurray, completed the series (the only series on YouTube that I have watched from the first video to the last without skipping a second). Amazing job, deeplizard team. Highly appreciated!
Now I am going to watch the Keras playlist, then the PyTorch series, and then Reinforcement Learning.
Congratulations! 🎉 Keep up the great work as you progress to the next courses!
Nice tutorial, clear, professional voice and animations !
Looking forward more deep learning videos :)
(I'm aware of your Keras tutorial series and I'm going to watch it right now !)
Thank you, Jonathan! I'm glad you're liking the videos so far!
Simple and lucid explanation. loved it. Thanks
great video. precise and concise. Thanks!
gentle and to the point. Thank you.
This online tutorial is very useful and helped me understand the batch normalization concept in detail, which had confused me for a long time. Thanks very much for sharing.
You are welcome!
Great content. Like many others have said, one of the best series on ML out there.
Just like all the other comments: I have just finished your video series and I am impressed by the quality of the explanations. Many videos go into tiny details way too fast, before making sure that everyone at least understands the terms. Kudos! I hope you make many more.
Thank you Robin! Much more content available on deeplizard.com :)
nice short video and great way of explaining!
I will follow this channel and watch more videos!
Keep up the great work
As always, very well done and clear, thank you!!
These videos are SO helpful, thank you
Cleared the concept. Thnx
The best tutorial that I've ever seen. Thanks!
Thank you for the amazing explanation.
Amazing explanation!
I found pure gold...! Great video! I understood it perfectly!
Beautiful !! super clear !
Excellent series!
Ohhh what a wonderful narrative. I really like the way you explained it. Thank you and I’ve just Subscribed to your channel👍🏻
great series
amazing teaching skills you have got madam
thank you
Thank you so much for your great work ❤
Loved your video. I am going to complete this series. Can you include RNNs, LSTMs and GRUs, and also complete the video series? I am looking forward to this as I start and complete this series.
Wow. Such a nice explanation. Thank you!
Amazing and concise video, thank you!
Very nice tutorial, thank you
I completed this series of videos; can't wait to watch more of your playlists!
Awesome job! See all of our deep learning content on deeplizard.com :)
Very well explained!
this was an amazing explanation. thank you.
Thanks, Nika!
AMAZING SERIES
Wonderful explanation
Thank you very, very much. I'm posting this comment in 2020, under house quarantine. I needed to learn about deep learning for my internship, and thanks to this playlist, I now have a good knowledge of the fundamental theories of neural networks.
Wonderful!
Great quality content, subscribed ️🔥
Very well explained
Just WoW! Amazing content. Please make series on Explainig research papers
That was very helpful, thanks
0:10 intro
0:30 normalize and standardize
1:25 why normalize
3:05 problem of large weights, and batch normalization
5:46 Keras
Thank you for your contribution of the timestamps for several videos! Will review soon for publishing :)
This is a really good set of videos on neural networks. I really liked it a lot and enjoyed watching it. Great work. But there is just one thing I would like to suggest: you guys have explained backpropagation really well, better than most that I have seen, but it would really help in understanding backpropagation if you could add a small numerical problem for the backpropagation calculation.
I spotted a slight issue in the article for this video.
At the end of the article, it says "I’ll see ya in the next one!", with a link to the Zero Padding article, but by that point that article has already been covered.
I really enjoy your courses so far, by the way. I've stopped and started a few times with studying ML in the past, but this has been a pleasure to go through.
Fixed, thanks Chris! :D
I've rearranged the course order since the initial posting of these videos/blogs, so I removed the hyperlink.
This video was amazing
Thank you so much Mandy... I have gone through all the videos... 😍😍😍
Thanks for the video. So do we have to normalize the data before feeding it to the model, or does batch normalization do it itself inside the model?
Just wanted to say kudos and thanks so much for your awesome series :D I have learned so much! Now I'm off to your Keras w/TF series :)
Great job getting through this course!
@@deeplizard Thanks! moving to your Deep Learning and Keras series next :)
The best explanation I've ever watched.
@deeplizard could you please explain how "g" and "b" get updated during backpropagation in "(z*g)+b"? Is the derivative taken, or is there another method?
Brilliant !!
Thanks. How exactly are the mean and std for a specific neuron in the dense layer calculated? Is it just adding up all the values in a specific batch and then dividing by the batch size? And each time a new batch gets fed in, this repeats? Thanks
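To the question above: yes, for each neuron the batch mean is that neuron's activations summed over the batch and divided by the batch size, recomputed for every new batch (a running average is also typically kept for inference). A small NumPy illustration with made-up numbers:

```python
import numpy as np

# Activations of 2 neurons (columns) across a batch of 3 samples (rows)
batch = np.array([[2.0, 10.0],
                  [4.0, 20.0],
                  [6.0, 30.0]])

# Per-neuron mean: sum that neuron's values over the batch / batch size
mean = batch.sum(axis=0) / batch.shape[0]          # [4.0, 20.0]
# Per-neuron std: root of the mean squared deviation within the batch
std = np.sqrt(((batch - mean) ** 2).mean(axis=0))
print(mean, std)
```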
Good stuff, thank you
Thanks a lot.
Hey I have a question! It is sometimes preferred to have a batchnorm layer after a convolutional layer and after the activation layer. Does anyone know why?
Batch norm, according to the paper, is actually applied before the activation function, not after. For this reason, they even recommend dropping the bias parameter of the layer itself, because batch norm comes with a learnable bias term. The output of batch norm then goes to the activation function.
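A plain-NumPy sketch of the ordering described in the reply above: linear layer (no bias, since batch norm's beta plays that role), then batch norm, then activation. All values here are illustrative:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))        # a batch of 8 samples
W = rng.normal(size=(4, 3))        # weights for a 3-unit layer

# 1. Linear layer WITHOUT a bias (batch norm's beta acts as the bias)
z = x @ W
# 2. Batch norm on the pre-activations
z_hat = (z - z.mean(axis=0)) / (z.std(axis=0) + 1e-5)
bn_out = 1.0 * z_hat + 0.0         # gamma=1, beta=0 at initialization
# 3. The activation function comes last
a = relu(bn_out)
print(a.shape)  # (8, 3)
```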
very helpful tut
awesome...I am going to watch the playlist....
i started to fall in love with the voice
@deeplizard please do a series on transfer learning, or more in-depth teaching on NLP/CV :)
Do I need to add batch normalization after each hidden layer, or use it once just before the output layer?
Very good explanation. Watched this whole playlist. Thanks for making understanding DL so easy and fun. Moreover, your funny stuff made me laugh.
Great video, very clear and understandable. However, I want to point out some mistakes. In batch norm, only b and g are trainable, not the m and s. Moreover, batch norm is applied after fully connected/convolutional layers but before activation functions. Therefore, it doesn't normalize the output of the activation function; it normalizes the input to the activation function.
In the example you mentioned about miles driven in 5 years: why did you say the data isn't necessarily on the same scale? I didn't get that. Can you elaborate? 1:48
so helpful!
thank you really you are the best teacher in the world. I appreciate your efforts
Happy to hear the value you're getting from the content, qusay!
@@deeplizard I am so happy for your reply to my comment ^_^
I have a question about the slide around 4:00. Why do we need to multiply by and add some parameter values after normalizing? That step will transform the value range. In terms of the original paper, they call it the identity transform. In fact, I wonder why we use an 'identity transform', which essentially makes no change to the input.
Do I need to normalize my input data 'manually', or is it OK to instead use a BatchNormalization layer as the very first layer in my model? Something tells me the data should first be normalized as a whole, whereas the layer would normalize per batch, and therefore each batch would be normed differently.
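The concern in the comment above is reasonable: standardizing the whole dataset uses one fixed mean/std, while per-batch normalization uses each batch's own statistics, so the same raw value can map to different normalized values. A small illustration with made-up numbers:

```python
import numpy as np

data = np.array([1.0, 2.0, 3.0, 100.0])

# Whole-dataset standardization: one fixed mean/std for every sample
z_whole = (data - data.mean()) / data.std()

# Per-batch normalization: each batch uses its own statistics
batch1, batch2 = data[:2], data[2:]
z_b1 = (batch1 - batch1.mean()) / batch1.std()
z_b2 = (batch2 - batch2.mean()) / batch2.std()

# The same raw values land in different places under the two schemes
print(z_whole[:2], z_b1)
```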
I really enjoyed learning with your videos. Can you please create videos on RNNs?!
beautiful vid