That's a masterpiece. Not only have I learned in detail how convolutional neural networks work, but I've also learned how I should explain hard subjects to others. Thank you.
This is by far the best video I've seen on CNN. Thanks a lot!
You're an amazing teacher. Just the right speed. The right structure. Well done.
For those who come from the shorter video by Brandon, the new stuff starts at 15:13.
I tend to get intimidated by videos longer than an hour, but I'm so incredibly glad I watched this one! Super clear explanation, I feel like I actually understand what happens now. No one else has been able to explain it so clearly. :) Thank you!!
That's so good to hear. I'm really happy that it clicked.
@@BrandonRohrer A little suggestion: it would have been a lot better as a playlist of 10-minute videos. That would really help someone with a low attention span like me.
@@opto3539 Thanks Opto, I like this suggestion. I tried this on some later content and I like the result.
Although it's 5 years old, this is the simplest and the AWESOMEST video on YouTube for someone getting started with Computer Vision.
This lecture, along with the 3Blue1Brown neural network playlist, and you're good to go exploring.
Thank you!!
That is a huge compliment. Thanks!
One of the best videos I’ve ever seen on the topic: super clear explanation + truly in depth, all without being boring. The only thing I didn’t understand is how to determine the values in the matrices for the convolution.
Thank you Daniele!
The short answer: they start random and get adjusted during training by backpropagation ( e2eml.school/backpropagation )
The long answer: A two-course sequence walks through how to implement this in Python for 1-D ( e2eml.school/321 ) and 2-D ( e2eml.school/322 ) convolutional neural networks.
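A minimal sketch of that starting point (the numbers and sizes here are purely illustrative, not taken from the video or the courses): the kernel begins as small random values, and each training step nudges it along the error gradient that backpropagation supplies.

```python
import numpy as np

# Illustrative only: a 3x3 convolution kernel starts out as small random numbers...
rng = np.random.default_rng(0)
kernel = 0.1 * rng.standard_normal((3, 3))

# ...and every training step nudges it downhill along the error gradient
# that backpropagation supplies.
learning_rate = 0.01
kernel_gradient = rng.standard_normal((3, 3))  # stand-in for d(error)/d(kernel)
kernel -= learning_rate * kernel_gradient      # one gradient-descent update
print(kernel)
```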
Thanks so much for spending time preparing these videos.
Watching takes 1 hour; preparing this video probably took 100 times that :)
3:20 - Filtering
8:10 - Pooling
10:30 - Normalisation (ReLU)
12:16 - Deep stacking
13:11 - Fully connected layers
17:00 - Receptive fields
18:00 - Create a neuron, create weights, and squash the results (sigmoid function)
26:50 - Optimisation
and then he died
Here is a lecturer who knows what he's teaching his students. Well explained, thank you.
I appreciate that.
I can't stress enough how great your videos and explanations are. I get overwhelmed by lots of text and missing visual examples, so it's great I found your videos. Watched 2 already and will definitely watch the rest too!
Don't let the duration of this video intimidate you from enjoying this masterpiece of a presentation, just press play and begin, you'll freaking love every second of it.
Thank you so much for sharing this and so much other information for free!
I had a difficult time understanding the convolution layer. This course is the best among all the courses I've seen on YouTube. Keep up the good work, you saved me. I was struggling to understand, and now I'm completely clear. Thanks a lot.
I am a visual learner with no background in computer science, and this video is a gem! Thank you very much. Subscribed :)
Thank you! I'm pleased to hear it.
This is the BEST video explanation EVER! Animation, simplicity, voice, oh god, you deserve an award in the machine learning world!
Thanks :) Made my day
insanely good explanation, never seen anything like this. thanks a lot
Probably one of the best intuitive explanations I have ever seen of why we like to use gradient descent in neural networks.
Brandon, you explain the most difficult concepts in simple understandable language. Nice visualizations create a mind map which we cannot forget. Thank you for all your efforts on these videos!
Thank you so much Ishwar
This is by far the best explanation of convolutional neural networks; it gets into the theory and the details of things. The presentation of everything is superb. I now know precisely what CNNs are all about. I would never spend a full hour watching an explanation on YouTube unless it was a full course. This hour-long explanation of CNNs is well worth it. Thanks.
Great presentation, Brandon. I prefer your simple graphics and pace over the highly distracting, animated videos from other educators.
Thanks! I appreciate that
This is the only explanation on YouTube, and the internet, that has finally helped quench my thirst for understanding CNNs!
Thank you!
Wow, this tutorial is packed with information. I had to rewind 100 times to grasp the part about weights & errors; nobody ever explains this part for mere mortals like myself.
Thanks! I'm really happy to hear it.
Your explanation is amazing; from your video I can understand neural networks. Thanks.
I'm so glad to finally find the videos about NN explained by somebody whose English I can understand.
An hour well spent... in my life...
You are simply the best at explaining this complex topic. Thank you.
Thanks!
This is the best video I have ever watched about machine learning. You have more than just a talent.
First time replying to any tutorial in 7 years. You really know how to make others understand. I would love to work with you if I ever get the chance.
Detailed and concise at the same time. Perfect video.
Marvelous explanation, made simple and concise, yet not oversimplified to a level that would render it pointless. I could not have imagined a better way to bring the loose pieces in my head together. Thanks a lot for this.
I'm only halfway through but really, you're amazing at teaching and explaining concepts. Thank you
one of the best videos about this topic I have ever watched. It is 1 in a thousand! Thank you for sharing it
Wow thanks!
It was nice of you to simplify the understanding, as most YouTube videos just present neural networks in an entertaining way with a vague explanation.
Just 10 mins into the video, I got a clear overall picture of CNN that I have searched for weeks. Thanks Brandon.
Wow! All your perfect presentations combined in a better presentation! I'm bookmarking this one and also sharing it with my colleagues.
A remarkably intuitive video for beginners. Thank you
Thank you so much, Mr. Brandon Rohrer, sir, for your good teaching on convolutional neural networks.
I can't imagine how hard it was to make this cool video! Many thanks to the author!
I'm glad you enjoyed it!
Explanation is on point!!!
damn, the most unexpected comment I have ever seen
Patec??
Very intuitive way of explaining Convolutional Neural Networks. Great job!
This really makes me understand CNNs more intuitively; lucky to have come across your video 😄
Very clearly spoken and illustrated. It's great to have well-articulated and easy-to-follow tutorials like this.
This is the first video of yours I have watched. It was so good that I subscribed to your channel.
BTW, your voice is a lot like Brian Greene's. This is good because it is a good lecture and documentary voice.
Thanks thomas, those are huge compliments. I'm really happy it was helpful.
Thank you so much for your explanation. It really helps me understand what CNNs are about.
WoW! This is by far the best tutorial out there for CNNs! Thank you...
THANK YOU THANK YOU THANK YOU. Finally I understood what Convolutional NN is. Great vid bro.
I'm so happy to hear it!
Jesus this was a fantastic tutorial I imagine you spent many months working on!
I'm going to create a new account just to give this man two thumbs up. This lecture is soooo good.
Awesome! I just didn't expect you to actually talk about backpropagation and linear layers but I'm not complaining.
This is the video I needed the most. Thank you
Super, Sir. Finally I got what I expected.
Great teaching! 1 hour of Brandon = 15 hours of Stanford lectures...
Thank you so much SarahK
@@BrandonRohrer I can't thank you enough for your efforts; your examples make the subject so much more digestible!
What a great tutorial. Easily the best on CNN.
Thank you so much! I didn't have to pause once to understand anything. You explained it so perfectly.
Very good tutorial. Learned so many things.
Best explanation of how Neural Networks work I have watched so far! Well explained and really intuitive
Really good explanations. Just the right level of detail for my understanding. Thanks.
Besides knowledge, Brandon also has a nice narrative ability. For me, definitely the best 1 hour of time spent...
As usual, it's an excellent video for building intuition about what CNNs are doing under the hood. Thank you, Brandon. More power to you.
Underrated video! views should be at least E6.
Thanks Lee :)
Best explanation of backpropagation I've seen fr. Thank you SO much!
Thank you Berhane! I appreciate it.
amazing explanation with great examples
This is what a tutorial video should be!
That was AWESOME. The only minor issue was that there was no pointer. (We could overlook that, given the great explanation.)
Many thanks :) And I agree. After this video I changed my workflow so that I could record a pointer too.
Quite a good explanation, Brandon!
Now i feel like sending CV to Tesla
Super! Crisp clear explanation with breaking down complex concepts into easily understandable steps.
Amazing... helps a lot to understand the foundation... 👌
I'm happy to hear it
I don't usually comment on YouTube videos. All I can say is: thank you, Sir!!!
An incredible video!! Thanks, Brandon, for such a good explanation for understanding CNNs. Please don't stop making more material. Greetings from Germany.
Thank you Victor! I appreciate it.
Perfect !!!
Such a great video.
Thanks a lot Brandon
Thanks Omkar!
I loved your detailed explanation of the steps, but could you please make another video explaining the REASON for each of the steps in detail?
Thanks! If you want to go one level deeper, I recommend walking through e2eml.school/321 and e2eml.school/322 . They walk through the Python implementation and give a deeper understanding of how and why.
Fantastic video. The conclusion really summed up everything nicely.
22:15 is wrong; the bottom of the 4 outputs is 'upside down'... great video, though.
Great work. Thank you so much. This has been the most useful video I have seen on NNs!
I'm so happy to hear it.
Sir, just finished watching and you explained this very well especially the second half with gradient descent and backpropagation. Thank you so much, have liked and Subscribed!
Man, thank you so much!
This is incredible work!
Beautifully explained Brandon and so clear - thank you !
In 2020, the sigmoid function is almost always replaced by the ReLU function for the activation of neurons.
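For reference, here are the two activation functions side by side (a plain NumPy sketch, just for illustration):

```python
import numpy as np

def sigmoid(x):
    # Squashes any input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Keeps positive values, clips negatives to zero
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x))  # smooth S-curve values between 0 and 1
print(relu(x))     # [0.  0.  0.  0.5 2. ]
```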
OMG, you are an amazing teacher. Thank you a million times
Thank you tran!
Thanks a lot !! You are one of the best teachers ever!!
@brandonrohrer Sir, at 14:27, are we evaluating the final confidence scores by taking the average of either the X or the O scores?
In this simplified example yes, but just a heads up that in practice it's often done just like the other layers - summing up all the inputs and passing them through an activation function, such as the logistic function.
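A minimal sketch of that for a single output neuron (the inputs and weights here are made up for illustration, not the values from the video):

```python
import numpy as np

def logistic(x):
    # Squashes the weighted sum into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical outputs of the previous layer and one output neuron's weights
inputs = np.array([0.9, 0.65, 0.45, 0.87])
weights = np.array([1.0, 1.0, -1.0, 1.0])
bias = 0.0

# Weighted sum of all the inputs, then squashed by the activation function
x_score = logistic(np.dot(weights, inputs) + bias)
print(x_score)
```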
Wow !!!! Great tutorial, my knowledge expanded 10 fold
Masterpiece!
One question: Is convolution the same or a kind of filtering?
Thanks! Here's a bit more on convolution that might help clarify: ua-cam.com/video/B-M5q51U8SM/v-deo.html
And if you want to go really deep , there are courses here: end-to-end-machine-learning.teachable.com/p/321-convolutional-neural-networks
and here: end-to-end-machine-learning.teachable.com/p/322-convolutional-neural-networks-in-two-dimensions/
Wow, the explanation is easy to understand. Thanks for your work; it helps me a lot.
The interaction with the audience feels so personal.
I've learnt so much from these videos thanks a lot!!
Thank you for this amazing video! It definitely helped clear a lot of stuff about CNNs for me. On a very random note, you have a great voice! I feel like you'd make an awesome audiobook narrator!
Aw thanks! That's a really nice thing to say.
Magnificently explained sir, well done.
I am not able to understand how the gradient will be calculated for the convolutions. How will each of the convolution filter's parameters update mathematically? Can someone please explain?
Hi Sana. Your confusion is justifiable - we didn't talk about that here. If you'd like to dig down to the next level, you can find the answer to this in End to End Machine Learning Course 321
end-to-end-machine-learning.teachable.com/p/321-convolutional-neural-networks
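To give a flavor of the idea: for a 1-D convolution, the filter's gradient turns out to be another correlation, this time between the input and the gradient at the output. A rough NumPy sketch (made-up numbers, not code from the course):

```python
import numpy as np

def conv1d_valid(x, k):
    # "Valid" cross-correlation: y[i] = sum_j k[j] * x[i + j]
    n = len(x) - len(k) + 1
    return np.array([np.dot(k, x[i:i + len(k)]) for i in range(n)])

# Hypothetical input signal and kernel
x = np.array([1.0, 2.0, -1.0, 0.5, 3.0])
k = np.array([0.2, -0.4, 0.1])
y = conv1d_valid(x, k)

# Suppose backprop has already handed us dL/dy for each output element
dL_dy = np.array([0.3, -0.2, 0.5])

# The kernel gradient is itself a correlation: dL/dk[j] = sum_i dL/dy[i] * x[i + j]
dL_dk = np.array([np.dot(dL_dy, x[j:j + len(dL_dy)]) for j in range(len(k))])
print(dL_dk)  # one gradient value per kernel element, used for the update step
```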
Nice tutorial.
Do you use any specific platform, such as Keras or PyTorch?
I've seen some tutorials and examples using a convolutional layer like this:
Conv2D(filters=32)
which is supposed to tell Keras to use 32 convolutional filters.
But it doesn't specify which filters to use; it seems to be something automatic.
How does Keras come up with those 32 filters? Which filters is it really using? (I know horizontal, vertical, cross, Sobel...)
Thanks! If you'd like to take this to the next level, here's a course on CNNs. It's not PyTorch or Keras, but it walks you through how to implement a layer full of kernels.
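To make the Keras side of the question concrete, here's a rough sketch (my own illustrative example, not code from the video or the course): Conv2D(filters=32) means "learn 32 separate kernels"; their values aren't picked from a list of named filters like Sobel, they start as small random numbers and get adjusted by backpropagation during training.

```python
import tensorflow as tf

# Hypothetical tiny model, just to show where the 32 filters live
conv = tf.keras.layers.Conv2D(filters=32, kernel_size=(3, 3), activation="relu")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),   # e.g. 28x28 grayscale images
    conv,
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# The 32 learned kernels are the layer's weights: one (3, 3, 1) kernel per filter
print(conv.get_weights()[0].shape)  # (3, 3, 1, 32)
```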
The best explanation I ever heard !!!!
Wow! Very well done :) Perfect pace, content, and explanations.
Great teacher! Big thank for your sharing to every body!
Thank you Sir for this crystal clear explanation
Thank you, it gave great clarity.
Thank you. I was in need for such a video. Well done.
It would have been nice to see the in-depth breakdown of convolution layers instead of the regular neural network starting at 15:00. Do pieces of the image take the place of the pixels?
54:58
Does anyone know where I can find a detailed breakdown of how backpropagation works for non-fully-connected layers (convolutional, ReLU, Pooling)? Brandon's excellent breakdown of how it works for fully connected layers is what ultimately made classical neural networks click for me, and I would love to see a similar break down for the parts exclusive to CNNs.
You can go through the optional exercises in Andrew Ng's Convolutional Neural Network course to understand the math behind it. In general you don't have to do the backprop calculations for CNNs yourself; it's pretty complex to hard-code, and modern frameworks like PyTorch/TF do it automatically.
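For example, in PyTorch a few lines are enough to see the gradients for a conv layer show up automatically (a rough sketch with made-up shapes, just for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical layers: conv -> ReLU -> pooling
conv = nn.Conv2d(in_channels=1, out_channels=4, kernel_size=3)
relu = nn.ReLU()
pool = nn.MaxPool2d(2)

x = torch.randn(1, 1, 8, 8)        # one fake 8x8 grayscale image
target = torch.randn(1, 4, 3, 3)   # fake target, just to have a loss to minimize

out = pool(relu(conv(x)))
loss = ((out - target) ** 2).mean()
loss.backward()                    # autograd walks back through pooling, ReLU, and conv

print(conv.weight.grad.shape)      # torch.Size([4, 1, 3, 3]): a gradient for every kernel value
```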
@@iegormykhailov8934 I see, I will check him out. And I understand it is already supported by modern frameworks, but I am of the mindset that if I can't do it myself then I don't fully understand it. In fact, I am writing a NN library in Go, which has very limited ML support currently, so these tools aren't available to me anyway.
This is great! Thank you Brandon.
You are very welcome Julie!
This is such a clear explanation, thank you!!
You are a great teacher.