Batch Normalization (“batch norm”) explained

  • Published Jun 4, 2024
  • Let's discuss batch normalization, otherwise known as batch norm, and show how it applies to training artificial neural networks. We also briefly review general normalization and standardization techniques, and we then see how to implement batch norm in code with Keras.
    🕒🦎 VIDEO SECTIONS 🦎🕒
    00:00 Welcome to DEEPLIZARD - Go to deeplizard.com for learning resources
    00:30 Help deeplizard add video timestamps - See example in the description
    07:02 Collective Intelligence and the DEEPLIZARD HIVEMIND
    💥🦎 DEEPLIZARD COMMUNITY RESOURCES 🦎💥
    👋 Hey, we're Chris and Mandy, the creators of deeplizard!
    👉 Check out the website for more learning material:
    🔗 deeplizard.com
    💻 ENROLL TO GET DOWNLOAD ACCESS TO CODE FILES
    🔗 deeplizard.com/resources
    🧠 Support collective intelligence, join the deeplizard hivemind:
    🔗 deeplizard.com/hivemind
    🧠 Use code DEEPLIZARD at checkout to receive 15% off your first Neurohacker order
    👉 Use your receipt from Neurohacker to get a discount on deeplizard courses
    🔗 neurohacker.com/shop?rfsn=648...
    👀 CHECK OUT OUR VLOG:
    🔗 / deeplizardvlog
    ❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
    Tammy
    Mano Prime
    Ling Li
    🚀 Boost collective intelligence by sharing this video on social media!
    👀 Follow deeplizard:
    Our vlog: / deeplizardvlog
    Facebook: / deeplizard
    Instagram: / deeplizard
    Twitter: / deeplizard
    Patreon: / deeplizard
    UA-cam: / deeplizard
    🎓 Deep Learning with deeplizard:
    Deep Learning Dictionary - deeplizard.com/course/ddcpailzrd
    Deep Learning Fundamentals - deeplizard.com/course/dlcpailzrd
    Learn TensorFlow - deeplizard.com/course/tfcpailzrd
    Learn PyTorch - deeplizard.com/course/ptcpailzrd
    Natural Language Processing - deeplizard.com/course/txtcpai...
    Reinforcement Learning - deeplizard.com/course/rlcpailzrd
    Generative Adversarial Networks - deeplizard.com/course/gacpailzrd
    🎓 Other Courses:
    DL Fundamentals Classic - deeplizard.com/learn/video/gZ...
    Deep Learning Deployment - deeplizard.com/learn/video/SI...
    Data Science - deeplizard.com/learn/video/d1...
    Trading - deeplizard.com/learn/video/Zp...
    🛒 Check out products deeplizard recommends on Amazon:
    🔗 amazon.com/shop/deeplizard
    🎵 deeplizard uses music by Kevin MacLeod
    🔗 / @incompetech_kmac
    ❤️ Please use the knowledge gained from deeplizard content for good, not evil.
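The description mentions implementing batch norm in code with Keras. A minimal sketch of what that can look like (the layer sizes, activations, and input width here are illustrative, not taken from the video):

```python
# A minimal Keras model using batch norm.
# Layer sizes and activations are illustrative only.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(4,)),            # 4 input features (arbitrary)
    layers.Dense(16, activation="relu"),
    layers.BatchNormalization(),        # normalizes the previous Dense layer's outputs per batch
    layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

The BatchNormalization layer creates four per-feature weights when the model is built: a trainable scale and shift, plus running mean and variance statistics used at inference.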

COMMENTS • 261

  • @deeplizard
    @deeplizard  6 years ago +18

    Machine Learning / Deep Learning Tutorials for Programmers playlist: ua-cam.com/play/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU.html
    Keras Machine Learning / Deep Learning Tutorial playlist: ua-cam.com/play/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL.html

    • @rey1242
      @rey1242 5 years ago

      I already asked this on another video, but just to cover as much area as possible:
      Could I normalize the weights to have mean 0 and variance 1 at weight initialization?
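As an illustration of what this question describes, standardizing a weight matrix at initialization might look like the following (a sketch, not a recommendation: common initializers such as Glorot/He instead scale the variance by the layer's fan-in/fan-out rather than fixing it at 1):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a weight matrix, then rescale it to mean 0 and variance 1.
w = rng.normal(size=(64, 32))
w = (w - w.mean()) / w.std()
```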

    • @coolbits2235
      @coolbits2235 5 years ago +4

      I am indebted to you for teaching me so much in one day. I would have kissed your hand in gratitude if you were in front of me. NNs are such a convoluted mess, but you have made things easier.

    • @Itsme-wt2gu
      @Itsme-wt2gu 1 year ago

      Can we make a game where ai have their own life and we live as their family and social system with our friends

  • @kareemjeiroudi1964
    @kareemjeiroudi1964 5 years ago +94

    I'm deeply impressed by the quality of your videos. Allow me to say that these, by far, are the most helpful video tutorials on Neural Networks. I seriously appreciate the time you spend researching such information and then putting it in such a concise pleasant way, that's also easy to comprehend. Trust me without you, I wouldn't have been able to understand what changes these parameters make in the network. That's why, thank you very very much for both the time and the effort you put into this! And please, please, keep making more tutorials.
    Also, I'd like to remark that the topics of these videos are so sequential that if you're following the playlist from the very beginning, you'll absolutely be able to make sense of everything noted in the videos, regardless of what your prior knowledge of neural networks is. Besides, the Keras playlist is complementary and adds a lot to the learning experience.
    All in all, this is - in one word - "professional work".

    • @deeplizard
      @deeplizard  5 years ago +16

      Wow kareem, thank you so much for leaving such a thoughtful comment! I'm very happy to hear the value you're getting from this series, and we're really glad to have you here!

    • @vdev6797
      @vdev6797 3 years ago

      i don't allow you to say..!!

    • @WahranRai
      @WahranRai 2 years ago +1

      That was the purpose of these *deep learning* videos: for you to be *deeply* impressed by the *learning* you get

  • @dr.hafizurrahman9374
    @dr.hafizurrahman9374 5 years ago +24

    God bless you, my dear Teacher. In every lesson, I saw that you put the whole ocean into a small jar. This is a unique quality, and very few teachers have it.

  • @PatriceFERLET
    @PatriceFERLET 4 years ago +23

    I spent several days reading articles trying to understand what Batch Norm really does, and then I found your video. Perfectly explained, thanks a lot!

  • @tamoorkhan3262
    @tamoorkhan3262 3 years ago +5

    One of the few YouTube series I have completed in my life. Instead of beating around the bush, you kept it to the point, with tons of info in just a few minutes. Hope to see more such series.

  • @nasiksami2351
    @nasiksami2351 3 years ago +4

    THANK YOU SO MUCH FOR THIS AMAZING PLAYLIST! One of the best channels for learning deep learning. Absolutely loved your content. It was explained in the easiest possible way and awesome graphical illustrations. You really worked hard on the editing! Thanks again!

  • @woudjee2
    @woudjee2 1 year ago +1

    Literally watched all 38 videos in one go. Thank you so much!

  • @deepcodes
    @deepcodes 4 years ago +3

    Finally completed the deep learning series. Thank you for such amazing videos and blogs, given away free on UA-cam. It's great quality!!!

  • @linknero1
    @linknero1 4 years ago +9

    Thanks, I'm writing my thesis thanks to your explanations!

  • @aashwinsharma1859
    @aashwinsharma1859 3 years ago +1

    Completed the whole playlist. Now I am confident about the basics of neural networks. Thanks a lot for the great series!!

  • @lingjiefeng3196
    @lingjiefeng3196 5 years ago +2

    I love your tutorial. The illustration is just so concise and easy to understand. Thank you for all your effort in making these videos!

  • @tanfortyfive
    @tanfortyfive 3 years ago +10

    Top-notch, I finished it all. Kudos to the deeplizard team, love you all. Love you Mandy, your sweet voice kept us going.

  • @smartguy3043
    @smartguy3043 3 years ago +2

    This is the best intro to deep learning I have seen anywhere, be it a textbook or a video lecture series. You have definitely put serious effort and thought into breaking down this dense topic into bite-size tutorials with a logical chain of thought that is easy to follow. Thanks a lot :)

  • @karelhof319
    @karelhof319 5 years ago +3

    finding this channel has been a great help for my studies!

  • @vikasshetty6725
    @vikasshetty6725 4 years ago +1

    Worth watching all the videos because of the content delivery and quality. Big thumbs up for the entire team.

  • @aniketbhatia1163
    @aniketbhatia1163 5 years ago +4

    These tutorial videos are one of the best ones I could find. The explanations are extremely lucid and so easy to understand. I really hope you expand your pool of videos to include other topics such as RNNs. You could also dedicate some videos to hyper-plane classifiers, SVMs, RL, even some optimization methods. All in all the set of videos is just amazing!

  • @rob21dog
    @rob21dog 4 years ago +1

    Thanks for all of your hard work in putting this series together. I just finished this last video & I can say that with your help I am much further ahead in understanding deep learning. God bless!

  • @parthbhardwaj2262
    @parthbhardwaj2262 3 years ago +2

    I am really fascinated by the hard work that brings such quality to your videos! I would be really happy if you could make as much more content as possible. Channels like yours keep the spirits of students like us really high! Just one word to sum it up....... OUTSTANDING !!

  • @FernandoWittmann
    @FernandoWittmann 4 years ago +26

    Great video! But from my understanding, only g and b are trainable. At 4:23, it is mentioned that the mean and std are trainable parameters as well ("these four parameters ... are all trainable").

    • @deeplizard
      @deeplizard  4 years ago +4

      Thanks Fernando, you’re right! The blog for this video has the correction :)
      deeplizard.com/learn/video/dXB-KQYkzNU

    • @davidireland724
      @davidireland724 4 years ago +5

      came looking for this comment! thanks for stopping me losing my mind trying to reconcile this explanation to the paper
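To see the correction from this thread directly in Keras: of batch norm's four per-feature parameters, only the scale (gamma, the video's g) and shift (beta, the video's b) are trainable, while the mean and variance are tracked statistics. A quick check (the feature count of 8 is arbitrary):

```python
from tensorflow.keras import layers

bn = layers.BatchNormalization()
bn.build((None, 8))  # 8 input features, so each weight has shape (8,)

# gamma and beta are learned by gradient descent;
# moving_mean and moving_variance are updated as running statistics.
n_trainable = len(bn.trainable_weights)
n_non_trainable = len(bn.non_trainable_weights)
```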

  • @pallavbakshi612
    @pallavbakshi612 5 years ago +3

    Wow, thanks for putting this up. You deserve every like and every subscribe. Great job.

  • @abdelrahmansalem6233
    @abdelrahmansalem6233 1 year ago +1

    This is one of the most comprehensive videos I have ever watched.....
    really thank you, and I am looking forward to more advanced concepts

  • @JoeSmith-kn5wo
    @JoeSmith-kn5wo 11 months ago +1

    Great playlist!! I went through the entire Deep Learning playlist, and have to say it's probably one of the best at explaining deep learning in a simplistic way. Thanks for sharing your knowledge!! 👍

  • @gaurav_gandhi
    @gaurav_gandhi 5 years ago +1

    Clearly explained, good animation, covered most areas. Thanks

  • @tymothylim6550
    @tymothylim6550 3 years ago +1

    Thank you very much for this whole series! It was really enjoyable to watch and I learnt a lot!

  • @jerseymec
    @jerseymec 4 years ago +1

    Thanks for the amazing series! I really enjoyed your videos! Keep up the good work! Hope to see more complex networks made simple by you!

  • @simonbernard4216
    @simonbernard4216 5 years ago +1

    just woaaa ..! Please keep making these videos, it's by far the best explanation I got here

  • @hamzawi2752
    @hamzawi2752 4 years ago +1

    Very Excellent, I hope you continue this series. Your explanation is so clear.

  • @stwk8
    @stwk8 2 years ago +1

    Thank you Deeplizard!
    The Machine Learning & Deep Learning Fundamentals playlist helped me understand the concepts of ML super easily.
    Thank you so much :D

  • @julianarotsen6521
    @julianarotsen6521 4 years ago +1

    Thanks for the amazing explanation!! By far the best tutorial video I've seen!

  • @silentai3826
    @silentai3826 3 years ago +1

    Wow, this is awesome. Kudos to you! Perfect explanation. Was trying to understand batchnorm from some websites and articles, this was much better than any of them. Thanks!

  • @pranaysingh3950
    @pranaysingh3950 2 years ago +2

    This is the video I had been searching the internet for like a beggar, to help me understand steps 2 and 3 of batch norm. Here it was, finally! Thank you so much for doing great work. I really, really appreciate it. Such a simple, calm, and informative explanation of a very important topic.

    • @shraddhadevi8964
      @shraddhadevi8964 2 years ago

      Oh brother, we found the treasure 💰💰💰💰💰💰

  • @aravindvenkateswaran5294
    @aravindvenkateswaran5294 2 years ago

    I have successfully binged (across 2 weeks) this playlist and found them really helpful! Thank you for all you do and keep up the good work. Hope to watch more vids getting added here or elsewhere on the channel. Lots of love:)

    • @deeplizard
      @deeplizard  2 years ago +1

      Thank you, and great work! Check out the homepage of deeplizard.com to see all other DL courses and the order in which to take them after this one!

  • @rowidaalharbi6861
    @rowidaalharbi6861 2 years ago +1

    Thank you so much for your explanations!. I'm writing my phd thesis and your tutorial helped me a lot :)

  • @diogo9610
    @diogo9610 4 years ago

    Wonderful work. Thank you for setting up all this content.

  • @nikhillahoti7628
    @nikhillahoti7628 5 years ago +1

    This is a gem! Thank you very much!!!

  • @al-farabinagashbayev5403
    @al-farabinagashbayev5403 4 years ago +1

    I think every machine learning specialist, even an experienced one, will find something new for themselves in this course. :) Great course, thanks a lot!

  • @PritishMishra
    @PritishMishra 3 years ago

    Hurray, completed the series (the only series on UA-cam which I have watched from the first video to the last without skipping a second). Amazing job, deeplizard team. Highly appreciated!
    Now I am going to watch the Keras playlist, then the PyTorch series, and then reinforcement learning

    • @deeplizard
      @deeplizard  3 years ago +1

      Congratulations! 🎉 Keep up the great work as you progress to the next courses!

  • @jonathanmeyer4842
    @jonathanmeyer4842 6 years ago +13

    Nice tutorial, clear, professional voice and animations !
    Looking forward more deep learning videos :)
    (I'm aware of your Keras tutorial series and I'm going to watch it right now !)

    • @deeplizard
      @deeplizard  6 years ago +2

      Thank you, Jonathan! I'm glad you're liking the videos so far!

  • @adwaitnaik4003
    @adwaitnaik4003 3 years ago +1

    Simple and lucid explanation. loved it. Thanks

  • @Yadunandankini
    @Yadunandankini 5 years ago +3

    great video. precise and concise. Thanks!

  • @fanusgebrehiwet6286
    @fanusgebrehiwet6286 4 years ago

    gentle and to the point. Thank you.

  • @yelchinyang148
    @yelchinyang148 5 years ago +2

    This online tutorial is very useful and helped me understand the batch normalization concept in detail, which had confused me for a long time. Thanks very much for sharing.

  • @ejkitchen
    @ejkitchen 3 years ago +1

    Great content. Like many others have said, one of the best series on ML out there.

  • @robinkerstens516
    @robinkerstens516 3 years ago +2

    Just like all the other comments: I have just finished your video series and I am impressed by the quality of the explanations. Many videos go into tiny details way too fast, before making sure that everyone at least understands the terms. Kudos! I hope you make many more.

    • @deeplizard
      @deeplizard  3 years ago +1

      Thank you Robin! Much more content available on deeplizard.com :)

  • @khalilturki8187
    @khalilturki8187 2 years ago +1

    nice short video and great way of explaining!
    I will follow this channel and watch more videos!
    Keep up the great work

  • @betterbrained
    @betterbrained 2 years ago

    As always, very well done and clear, thank you!!

  • @rogeriogouvea7278
    @rogeriogouvea7278 1 year ago

    These videos are SO helpful, thank you

  • @ranitbarman6471
    @ranitbarman6471 1 year ago +1

    Cleared the concept. Thnx

  • @farzadimanpour2751
    @farzadimanpour2751 3 years ago +1

    the best tutorial that I've ever seen.thanks

  • @akhtarzaman7864
    @akhtarzaman7864 5 years ago +1

    thankyou for amazing explanation

  • @adityagupta-gf7pl
    @adityagupta-gf7pl 5 years ago +1

    Amazing explanation!

  • @smithflores6968
    @smithflores6968 2 years ago +1

    I found pure gold...! Great video! I understood it perfectly!

  • @baqirhusain5652
    @baqirhusain5652 2 years ago +1

    Beautiful !! super clear !

  • @GS-kj5pc
    @GS-kj5pc 1 year ago

    Excellent series!

  • @robertc6343
    @robertc6343 2 years ago +1

    Ohhh what a wonderful narrative. I really like the way you explained it. Thank you and I’ve just Subscribed to your channel👍🏻

  • @kavithavinoth4557
    @kavithavinoth4557 4 years ago +1

    great series
    amazing teaching skills you have got madam
    thank you

  • @amirraad4437
    @amirraad4437 1 year ago +1

    Thank you so much for your great work ❤

  • @rupjitchakraborty8012
    @rupjitchakraborty8012 3 years ago +3

    Loved your video. I am going to complete this series. Can you include RNNs, LSTMs and GRUs, and also complete the video series? I am looking forward to this as I start and complete this series.

  • @tss109
    @tss109 2 years ago +1

    Wow. Such a nice explanation. Thank you!

  • @UtaShirokage
    @UtaShirokage 4 years ago

    Amazing and concise video, thank you!

  • @mkulkhanna
    @mkulkhanna 5 years ago +1

    Very nice tutorial, thank you

  • @sciences_rainbow6292
    @sciences_rainbow6292 3 years ago +1

    I completed this video series, can't wait to watch more of your playlists!

    • @deeplizard
      @deeplizard  3 years ago

      Awesome job! See all of our deep learning content on deeplizard.com :)

  • @punitdesai4779
    @punitdesai4779 3 years ago +1

    Very well explained!

  • @FuryOnStage
    @FuryOnStage 6 years ago +1

    this was an amazing explanation. thank you.

  • @ericdu6576
    @ericdu6576 11 months ago +1

    AMAZING SERIES

  • @thepresistence5935
    @thepresistence5935 2 years ago +1

    Wonderful explanation

  • @pamodadilranga
    @pamodadilranga 3 years ago

    Thank you very, very much. I'm posting this comment in 2020, under house quarantine. I needed to learn about deep learning for my internship, and thanks to this playlist, I now have a good knowledge of the fundamental theories of neural networks.

  • @ahmadnurokhim4168
    @ahmadnurokhim4168 1 year ago +1

    Great quality content, subscribed 🔥

  • @yashgupta417
    @yashgupta417 4 years ago +1

    Very well explained

  • @shaelanderchauhan1963
    @shaelanderchauhan1963 2 years ago +1

    Just WOW! Amazing content. Please make a series explaining research papers.

  • @alphadiallo9324
    @alphadiallo9324 2 years ago +1

    That was very helpful, thanks

  • @CosmiaNebula
    @CosmiaNebula 3 years ago +2

    0:10 intro
    0:30 normalize and standardize
    1:25 why normalize
    3:05 problem of large weights, and batch normalization
    5:46 Keras

    • @deeplizard
      @deeplizard  3 years ago

      Thank you for your contribution of the timestamps for several videos! Will review soon for publishing :)

  • @prasaddalvi3017
    @prasaddalvi3017 4 years ago

    These are a really good set of videos on neural networks. I liked them a lot and enjoyed watching them. Great work. Just one suggestion: you explained backpropagation really well, better than most I have seen, but it would help understanding even more if you could add a small numerical example for the backpropagation calculation.

  • @fritz-c
    @fritz-c 4 years ago +1

    I spotted a slight issue in the article for this video.
    At the end of the article, it says "I’ll see ya in the next one!", with a link to the Zero Padding article, but by that point that article has already been covered.
    I really enjoy your courses so far, by the way. I've stopped and started a few times with studying ML in the past, but this has been a pleasure to go through.

    • @deeplizard
      @deeplizard  4 years ago

      Fixed, thanks Chris! :D
      I've rearranged the course order since the initial posting of these videos/blogs, so I removed the hyperlink.

  • @RandomShowerThoughts
    @RandomShowerThoughts 5 years ago +1

    This video was amazing

  • @senduranravikumar3554
    @senduranravikumar3554 3 years ago +1

    Thank you so much mandy... i have gone through all the videos... 😍😍😍 .

  • @yuriihalychanskyi8764
    @yuriihalychanskyi8764 4 years ago +1

    Thanks for the video. So do we have to normalize the data before feeding it to the model, or does batch normalization do it itself inside the model?

  • @richarda1630
    @richarda1630 3 years ago +1

    Just wanted to say kudos and thanks so much for your awesome series :D I have learned so much! Now I'm off to your Keras w/TF series :)

    • @deeplizard
      @deeplizard  3 years ago

      Great job getting through this course!

    • @richarda1630
      @richarda1630 3 years ago

      @@deeplizard Thanks! moving to your Deep Learning and Keras series next :)

  • @HasanKarakus
    @HasanKarakus 9 months ago +1

    The best explanation I have ever watched

  • @abhishekp4818
    @abhishekp4818 4 years ago

    @deeplizard Could you please explain how "g" and "b" get updated during backpropagation in "(z*g)+b"? Is the derivative taken, or is there another method?
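For what it's worth, g and b are updated like any other weights: the derivative of the loss with respect to each is taken, and gradient descent applies it. For y = z*g + b, the chain rule gives dL/dg as the batch sum of (dL/dy · z) and dL/db as the batch sum of dL/dy. A sketch (all names here are mine, not from the video):

```python
import numpy as np

def batch_norm_param_grads(z_hat, dL_dy):
    """Gradients for y = g * z_hat + b, given upstream gradient dL_dy."""
    dL_dg = (dL_dy * z_hat).sum(axis=0)  # dy/dg = z_hat
    dL_db = dL_dy.sum(axis=0)            # dy/db = 1
    return dL_dg, dL_db

rng = np.random.default_rng(2)
z_hat = rng.normal(size=(32, 4))   # normalized activations, one row per sample
dL_dy = rng.normal(size=(32, 4))   # upstream gradient from the next layer
dg, db = batch_norm_param_grads(z_hat, dL_dy)
```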

  • @mhdalkadri9228
    @mhdalkadri9228 6 years ago +1

    Brilliant !!

  • @DanielWeikert
    @DanielWeikert 4 years ago

    Thanks. How exactly are the mean and std for a specific neuron in the dense layer calculated? Is it just adding up all the values in a specific batch and then dividing by the batch size? And does this repeat each time a new batch is fed in? Thanks
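Yes, that is essentially it: for each neuron, the batch mean is the sum of that neuron's values over the batch divided by the batch size, and both statistics are recomputed for every new batch that is fed through. A NumPy sketch (the batch size of 64 and 10 neurons are arbitrary):

```python
import numpy as np

# One batch of activations for a dense layer: shape (batch_size, n_neurons).
rng = np.random.default_rng(3)
batch = rng.normal(loc=2.0, scale=4.0, size=(64, 10))

# Per-neuron batch statistics: sum over the batch axis divided by the
# batch size, recomputed from scratch for every new batch.
mean = batch.sum(axis=0) / batch.shape[0]   # same as batch.mean(axis=0)
std = np.sqrt(((batch - mean) ** 2).sum(axis=0) / batch.shape[0])

normalized = (batch - mean) / std
```

(At inference time, Keras instead uses moving averages of these statistics accumulated during training, so single samples can be processed.)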

  • @user-qt3jo9tw6m
    @user-qt3jo9tw6m 5 years ago +1

    Good stuff, thank you

  • @arohawrami8132
    @arohawrami8132 5 months ago +1

    Thanks a lot.

  • @lucaslucassino
    @lucaslucassino 3 years ago +1

    Hey I have a question! It is sometimes preferred to have a batchnorm layer after a convolutional layer and after the activation layer. Does anyone know why?

  • @OKJazzBro
    @OKJazzBro 1 year ago

    According to the paper, batch norm is actually applied before the activation function, not after. For this reason, the authors even recommend dropping the bias parameter of the layer itself, because batch norm comes with a learnable bias term. The output of batch norm then goes to the activation function.
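In Keras, the ordering this comment describes might be written as follows (a sketch; layer sizes are arbitrary, and since both orderings are used in practice, this shows the paper's variant rather than the only option):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Batch norm placed before the activation, as in the original paper.
# use_bias=False drops the Dense layer's own bias, since batch norm's
# learnable shift (beta) already provides one.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(16, use_bias=False),
    layers.BatchNormalization(),
    layers.Activation("relu"),
    layers.Dense(2, activation="softmax"),
])
```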

  • @rajuthapa9005
    @rajuthapa9005 6 years ago +1

    very helpful tutorial

  • @pranavdhage691
    @pranavdhage691 4 years ago

    awesome...I am going to watch the playlist....

  • @rapunziao2929
    @rapunziao2929 5 years ago

    i started to fall in love with the voice

  • @bl7395
    @bl7395 3 years ago

    @deeplizard please do a series on transfer learning, or more in-depth teaching on NLP/CV :)

  • @mdyounusahamed6668
    @mdyounusahamed6668 1 year ago

    Do I need to add the batch normalization after each hidden layers or use it once just before the output layer?

  • @mukulverma8404
    @mukulverma8404 4 years ago +1

    Very good Explanation..watched this whole playlist.Thanks for making understanding DL so easy and fun.Moreover your funny stuff made me laugh.

  • @orcuncetintas2258
    @orcuncetintas2258 4 years ago +1

    Great video, very clear and understandable. However, I want to point out some mistakes: in batch norm, only b and g are trainable, not m and s. Moreover, batch norm is applied after fully connected/convolutional layers but before activation functions. Therefore, it doesn't normalize the output of the activation function; it normalizes the input to the activation function.

  • @kartikpodugu
    @kartikpodugu 1 year ago

    In the example you mentioned about miles driven in 5 years, why did you say that the data isn't necessarily on the same scale? I didn't get that. Can you elaborate? 1:48

  • @roxanamoghbel9147
    @roxanamoghbel9147 3 years ago +1

    so helpful!

  • @qusayhamad7243
    @qusayhamad7243 3 years ago +1

    thank you, really, you are the best teacher in the world. I appreciate your efforts

    • @deeplizard
      @deeplizard  3 years ago +1

      Happy to hear the value you're getting from the content, qusay!

    • @qusayhamad7243
      @qusayhamad7243 3 years ago

      @@deeplizard I am so happy for your reply to my comment ^_^

  • @heejuneAhn
    @heejuneAhn 5 years ago

    I have a question about the slide around 4:00. Why do we need to multiply by and add some parameter values after normalizing? That step transforms the value range again. The original paper calls this an identity transform; in fact, I wonder why we would use an 'identity transform' that essentially makes no change to the input.
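On the identity-transform question: the multiply (g) and add (b) exist so the network is not locked into mean 0 / variance 1. If training drives g to sqrt(variance + epsilon) and b to the batch mean, the layer exactly undoes its own normalization and becomes an identity; anywhere in between, it can learn whatever scale and shift work best. A quick numeric check (sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(loc=3.0, scale=2.0, size=(100, 5))
eps = 1e-5

mu = x.mean(axis=0)
var = x.var(axis=0)
x_hat = (x - mu) / np.sqrt(var + eps)

# Choosing g = sqrt(var + eps) and b = mu exactly undoes the
# normalization, so the layer CAN be an identity, but it doesn't have to be.
g = np.sqrt(var + eps)
b = mu
y = g * x_hat + b
```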

  • @YumanoidPontifex
    @YumanoidPontifex 3 years ago

    Do I need to normalize my input data 'manually', or is it OK to instead use a BatchNormalization layer as the very first layer in my model? Something tells me the data should first be normalized as a whole, whereas the layer would normalize per batch, and therefore each batch would be normed differently.
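One option that matches this intuition, assuming a recent Keras version where layers.Normalization is available: the Normalization preprocessing layer computes the mean and variance once over the whole dataset via adapt(), then applies those fixed statistics to every batch, unlike BatchNormalization, which uses per-batch statistics during training. A sketch:

```python
import numpy as np
from tensorflow.keras import layers

rng = np.random.default_rng(5)
data = rng.normal(loc=10.0, scale=5.0, size=(1000, 3)).astype("float32")

# adapt() computes mean/variance over the WHOLE dataset once;
# the same fixed statistics are then applied to every batch.
norm = layers.Normalization()
norm.adapt(data)

out = norm(data).numpy()
```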

  • @travel_with_rahullanannd
    @travel_with_rahullanannd 4 years ago +1

    I really enjoyed learning with your videos. Can you please create videos on RNN.!!

  • @blackraider777
    @blackraider777 5 years ago +1

    beautiful vid