Convolutional Neural Network from Scratch | Mathematics & Python Code
- Published 19 Jun 2024
- In this video we'll create a Convolutional Neural Network (or CNN) from scratch in Python. We'll work fully through the mathematics of the convolutional layer and then implement it. We'll also implement the Reshape Layer, the Binary Cross Entropy Loss, and the Sigmoid Activation. Finally, we'll use all these objects to make a neural network capable of classifying handwritten digits from the MNIST dataset.
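The convolutional layer the description mentions can be sketched in a few lines. This is a hedged sketch, not necessarily the video's exact code: the class and attribute names (`Convolutional`, `kernels`, `biases`) are assumptions, and `scipy.signal.correlate2d` stands in for the valid cross-correlation the chapters discuss.

```python
import numpy as np
from scipy import signal

class Convolutional:
    """Valid cross-correlation layer: one kernel per (output, input) channel pair."""
    def __init__(self, input_shape, kernel_size, depth):
        input_depth, input_height, input_width = input_shape
        self.depth = depth                # number of output channels
        self.input_depth = input_depth
        self.output_shape = (depth,
                             input_height - kernel_size + 1,
                             input_width - kernel_size + 1)
        self.kernels = np.random.randn(depth, input_depth, kernel_size, kernel_size)
        self.biases = np.random.randn(*self.output_shape)

    def forward(self, input):
        self.input = input
        # start from the biases, then accumulate one correlation per channel pair
        self.output = np.copy(self.biases)
        for i in range(self.depth):
            for j in range(self.input_depth):
                self.output[i] += signal.correlate2d(input[j], self.kernels[i, j], "valid")
        return self.output
```

For a 28x28 grayscale MNIST image and five 3x3 kernels, the output is five 26x26 feature maps.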
😺 GitHub: github.com/TheIndependentCode...
🐦 Twitter: / omar_aflak
Chapters:
00:00 Intro
00:33 Video Content
01:26 Convolution & Correlation
03:24 Valid Correlation
03:43 Full Correlation
04:35 Convolutional Layer - Forward
13:04 Convolutional Layer - Backward Overview
13:53 Convolutional Layer - Backward Kernel
18:14 Convolutional Layer - Backward Bias
20:06 Convolutional Layer - Backward Input
27:27 Reshape Layer
27:54 Binary Cross Entropy Loss
29:50 Sigmoid Activation
30:37 MNIST
====
Corrections:
23:45 The sum should go from 1 to d
====
Animation framework from @3Blue1Brown: github.com/3b1b/manim
This is one of the best explanations of CNNs on the internet for me, and the 3b1b video format is the cherry on top. Please keep making these videos.
Yeah, for sure!
Honestly, this is barely an explanation. He just showed you the steps to build a CNN from scratch. He did not explain why we do some of the things we do, like the cross-correlation.
@mysticlunala8020 That's what other videos out there already do. The focus of this video was how to actually put all the concepts into code, concisely and intuitively.
Cannot believe that tutorials like this exist. Thank you so much. I have been looking for a tutorial for a long time and I finally found it. This is definitely one of the best tutorials out there!
This is for real one of the best videos related to any type of NN I've ever seen. Most videos just scratch the surface of how these NNs work, but you went deeper and in an understandable way. Congratulations and keep the good work!
Thank you so much! I've been searching for this kind of explanation of CNN, especially the backprop process. I'll for sure cite this video in my thesis. Thank you!
I love your teaching! This is perfect for me and exactly what I have been looking for. Thank you for your contribution. These videos are gold!
After going through many blogs, this helped me just fully understand these networks. Such a great teacher you are!!!
Thank you!! I'm making a machine learning library from scratch for fun and I've been confused with some details that thanks to you now I understand. It's my favorite explanation of CNNs on youtube
This is the best and calmest explanation of NNs that I have ever seen on the internet! Amazing work, definitely sharing it with my colleagues.
please please please keep making more videos. This is so insightful and relaxing to watch.
the "from scratch" series you made is pure gold!!
Please produce more high quality videos like this. Your 30-minute video explains CNN better than my 1-semester AI class in college
This is so underrated.
Keep doing the good work. I really appreciate your contribution.
The best explanation of CNN I have ever seen. Thank you!
I am really so fortunate to have found this amazing video. I will reference this video in multiple places. Thanks for the hard work :)
I am making a CNN from scratch and I was a little bit stuck on how to find the gradients of convolutional layers but that little digression about how the equation of a convolutional layer is really just a more general version of the equation of the dense layer output really made it clear for me! This video is gold
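The gradients that comment refers to follow a compact pattern: the kernel gradient is a valid cross-correlation of the input with the output gradient, and the input gradient is a full convolution of the output gradient with the kernel. A hedged sketch (function and variable names are illustrative, not the video's code):

```python
import numpy as np
from scipy import signal

def conv_backward(input, kernels, output_gradient, learning_rate):
    """Backward pass for a valid cross-correlation layer.
    kernels: (depth, input_depth, k, k); output_gradient matches the layer output."""
    depth, input_depth = kernels.shape[:2]
    kernels_gradient = np.zeros_like(kernels)
    input_gradient = np.zeros_like(input)
    for i in range(depth):
        for j in range(input_depth):
            # dE/dK_ij = X_j valid-correlated with dE/dY_i
            kernels_gradient[i, j] = signal.correlate2d(input[j], output_gradient[i], "valid")
            # dE/dX_j = sum over i of dE/dY_i full-convolved with K_ij
            input_gradient[j] += signal.convolve2d(output_gradient[i], kernels[i, j], "full")
    kernels -= learning_rate * kernels_gradient  # gradient-descent update in place
    return input_gradient
```

Setting the kernel size to 1x1 and the spatial dimensions to 1 recovers exactly the dense-layer gradient formulas, which is the generalization the comment describes.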
I really love this explanation of CNNs. It almost makes them look easy
You are a gifted teacher bro. I can't believe you've only got 50k views. But then again with how esoteric the content you're teaching is, it's impressive that your videos are so popular! Thank you
Man, I hope your channel becomes huge. Thanks, this is one of the best videos on YouTube, and not just on this topic; it's one of the best videos on YouTube in general.
True that. After reading so many blogs on Medium, none could solve all my doubts. You did it. Kudos to you.
I know it’s a bit late, but I thought I should mention how well this video is paced and structured. The listing and crossing out of what topics are to be covered makes the video very clear, concise and easy to follow.
Thank you :)
Finally understood backprop of conv. Thank you for the great video!
You are one of the best who has ever explained this topic. Keep up your easy and succinct style. Thumbs up!
Came across this while trying to code ResNet in pure CUDA.
The best explanation on the topic!
Great Thanks!!
I'm an undergrad student studying CS at Georgia Tech. This video explained the backprop in CNNs better than my professors did. A true gem.
this is definitely one of the better videos on the topic, surprised it doesn't have more views (:
This video is amazing, it really helped me understand the math behind CNNs - thank you!
GREATEST LECTURE EVER ON CORE DEEP LEARNING.... THANK YOU MATE
One of the best tutorials I have gone through. Thank you so much.
I don't know if it's the music, but this video is incredibly calming. Thanks for this!
It's really the best explanation I have ever seen of convolutional neural networks.
Thank you pro.
Thank you so much for these tutorials. They are really clear and well explained. Please do this in every domain you know, even if it's plumbing.
I've been reading about CNNs and image recognition for a while to build my own for a project idea, but I never found anything that shed light on how to actually implement a CNN, because I want to do it from scratch, with the maths and all the stuff.
You have taught me a lot in 33 minutes of video. Now I know how I can make my own CNN, and also that I need to go over derivatives UwU
Thanks a lot!!!
Thank you for the kind message, I'm really glad if it helped :)
Another piece of art! Thank you
Thank you for your time and effort, this is the best so far for me
Great, thanks! This is the clearest video about CNNs on the whole internet! 😀
Your lessons are works of art!!!
Finally, a tutorial where I got to know how a 3-channel RGB image is mathematically mapped into features. It is surprising to watch so many tutorials where none mentions that for every channel there is a corresponding kernel, and the summation of the correlations is used to get the result for the next step. They all show h*w*3 and then a single 3*3 kernel. For example, this video: C4W1L08 Simple Convolutional Network Example.
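The per-channel summation that comment describes can be checked numerically in a few lines; a hedged sketch with illustrative shapes:

```python
import numpy as np
from scipy import signal

# 3-channel RGB input: one output feature map uses THREE 3x3 kernels,
# one per channel, and the three correlations are summed into a single 2D map.
rgb = np.random.randn(3, 5, 5)          # (channels, height, width)
kernels = np.random.randn(3, 3, 3)      # one 3x3 kernel per input channel

feature_map = sum(
    signal.correlate2d(rgb[c], kernels[c], "valid") for c in range(3)
)
print(feature_map.shape)   # (3, 3): a single 2D map, not three
```

So an h*w*3 input with a "3x3 kernel" really means a 3x3x3 stack of weights per output feature map.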
Thank you for explaining very clearly!
The best-ever tutorial. Thank you.
Man , I am mind blown by your content . Such a great video 👌
It really helped me understand the whole concept of convolutional networks, especially the backpropagation. Please make some videos on RNN and LSTM. Thank you.
Thanks for making this. You're really good at doing what you do.
One of the best explanations of CNNs on the internet! Your channel will be big in the future.
Thanks a lot for this video. Couldn't be more grateful!
This is amazing man. Very informative!
Perfect, perfect, I like this channel. Bravo, I found what I was looking for. Really, thank you Sir. 💚
Thank you so much of those two excellent videos !!!
Your "from scratch" videos are great! I was able to convert them into C++/CUDA neural net classes, and they work better than my old code. Thank you!
Also, is there any way you can do one for an "Unconvolutional" layer? I would love to mess around with different types of autoencoders for images. :)
Deconvolution is just a convolution: you take your convolved image, use a bigger filter, and add some padding.
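One way to see the reply's point in code: a map produced by a valid 3x3 correlation on an 8x8 input can be brought back to 8x8 with a full convolution, where "full" mode implicitly adds the padding the reply mentions. The sizes here are illustrative:

```python
import numpy as np
from scipy import signal

small = np.random.randn(6, 6)    # e.g. the output of a valid 3x3 correlation on an 8x8 input
kernel = np.random.randn(3, 3)

# "full" pads the small map by k-1 on each side, so the result grows back
upsampled = signal.convolve2d(small, kernel, "full")
print(upsampled.shape)   # (8, 8): back to the original spatial size
```

This is the same full-mode operation the video already uses for the input gradient, which is why a transposed-convolution layer for autoencoders would reuse most of the existing machinery.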
thank you very much for this masterclass
Best video on the convolutional layer. Good job!
Great, sir!! I've been searching for the backward kernel derivation for so long. Thank you!
Great video - the first I've seen that really shows how to think about the implementation.
This is a very good tutorial to learn about CNNs. Thank you so much.
3b1b video format & amazing calming voice
OMG, you are a treasure
Awesome, thank you for the tutorial. It really helped me understand CNNs easily.
This is amazing work, thank you so much :)
Thanks a lot!
very high quality video and amazing explanation!
Exceptional explanation, thank you for sharing this.
You could implement the hypermatrix operation you talked about at the beginning of the video in order to simplify the forward and backward functions of the Convolutional layer, removing all the loops.
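For illustration, here is one hedged way to remove the loops with NumPy alone, using `sliding_window_view` plus `einsum`; the function name and shapes are assumptions, not code from the video:

```python
import numpy as np

def correlate_valid(input, kernels):
    """Loop-free valid cross-correlation.
    input: (input_depth, H, W); kernels: (depth, input_depth, k, k)."""
    k = kernels.shape[-1]
    # windows[j, x, y, u, v] == input[j, x+u, y+v], shape (input_depth, H-k+1, W-k+1, k, k)
    windows = np.lib.stride_tricks.sliding_window_view(input, (k, k), axis=(1, 2))
    # contract over input channels and both kernel axes in one step
    return np.einsum("jxyuv,djuv->dxy", windows, kernels)
```

The stride-tricks view costs no extra memory, and the single `einsum` replaces the double loop over output and input channels.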
Incredible video!
BEST video! Thanks a ton!
Really found this video very interesting and informative. I really appreciate it a lot. Thanks!
Best ever video on CNN, hats off!
wonderful, fantastic, thanks a million
Just amazing work ❤
Excellent video!!! Thanks for the inspiration.
Great video! Helped me a lot!!!
Great explanations! For completeness, it would have been nice to include an implementation of pooling layers, but those are quite easy in comparison to convolutional layers. And they might destroy too much information in the small 28x28 images from MNIST.
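As that comment notes, a pooling layer's forward pass really is short compared to the convolutional layer. A hedged sketch of 2x2 max pooling (the function name and the stride-2, even-size restriction are assumptions):

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling with stride 2 on a (depth, H, W) tensor; H and W must be even."""
    d, h, w = x.shape
    # reshape groups each non-overlapping 2x2 block, then take its maximum
    return x.reshape(d, h // 2, 2, w // 2, 2).max(axis=(2, 4))
```

Each output pixel keeps only the largest of four inputs, which is exactly the information loss the comment worries about on small 28x28 MNIST images.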
Best explanation... Love your teaching style and animations.
NumPy needs to add this operation and give it a name, for real 8:46
Edit: Ammmazing video btw!!
Wow, amazing explanation. Thank you!
Best video on CNNs ever, thanks buddy. Do more videos like this!!
Interesting and well-organized demo!!!!
Heavy logical equations read like poetry. The only channel where I activate the bell :D Thanks for everything.
Great explanation! It has really helped my weary mind, but I want to point out that what depth signifies in the forward-propagation explanation is a bit tricky: depth is used to represent both the number of input matrices and the number of kernels in each filter, so it would be nice to draw a clear distinction. Thanks for the video.
Thanks, this video was perfect and also showed some of the beauty of mathematics.
LOVE IT!!!
Your videos are amazing 🙃❤
Wow, your content is super awesome!
It would be super cool if you would also code RNN, LSTM, GRU and all that. 🙂
Your animation reminded me of 3Blue1Brown videos. Awesome stuff! :)
Thanks for sharing!
It's because I'm using his library :)
github.com/3b1b/manim
I started my AI journey a month back and had lots of confusion about how these CNNs get their parameters, how data passes through the layers, why reshaping is needed, and many more queries. I give this video full stars for clearing all the doubts. It is a saviour for me in my AI journey.
Really a great video ❤️❤️ I also liked the neural network video very much because it cleared all my doubts, and I was really amazed to see how easily we can implement a network by building each layer. There are many implementations on the internet, but this one is the best and easiest, and this CNN video is the most amazing because I haven't found any article where CNNs are explained in such depth. It was a great video 🔥🔥🔥 The animations are also really helpful, especially the 3b1b style, which I liked very much. I have a request: please also make a video on RNNs, because after watching this video I think there is a lot of deep understanding in RNNs which I don't have, and the implementation would be really helpful.
I had so many aha moments here! this is awesome
Excellent!
Thank you for this! Great explanation.
I will request you to do "Attention" next if possible.
I feel like I don't deserve to get such content for free.. Amazing job!
Hey, thanks for your knowledge, please share more❤
This is extremely in-depth and just what I needed... All the best for future videos..
Great Video!
Wonderful explanation, love from India 🇮🇳
Thank you so much
Simply beautiful videos; my days of struggling to understand backprop have just ended. Would you be willing to make a video on transformers?
It was a great video with the best-organised code. Can you also make a video on pooling as an extension of this, which would make it complete?
Fantastic, brother!
I really appreciate what you are doing.
Thanks 🎉🎉
Best video of its kind. Please do RNNs!
Everything well explained, thank you. I think all the code can still be simplified and made faster by using numpy ndarray functionality. Instead of using the full dataset in each epoch, use batches of data, so that you can train on the whole set without restricting yourself to only two classes. Using numpy ndarray functions will remove almost all loops, and hence your code will be faster.
You're just hiding the loops in the function calls.
@polyfoxgames9006 No, not really. Most numpy functions are optimized C functions.
@polyfoxgames9006 Vectorized Python code can be as fast as C or Julia.
@michaelpieters1844 No way haha. PyTorch is fast because it's CUDA being called from Python.
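The batching idea from this thread can be sketched as a small helper; the name `iterate_minibatches` and the seeding scheme are illustrative assumptions, not part of the video's code:

```python
import numpy as np

def iterate_minibatches(x, y, batch_size, seed=0):
    """Yield shuffled mini-batches instead of one sample at a time."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(len(x))   # new shuffle each call via the seed
    for start in range(0, len(x), batch_size):
        batch = indices[start:start + batch_size]
        yield x[batch], y[batch]
```

With batches like these, the whole MNIST training set can be used per epoch instead of the two-class subset, at the cost of averaging gradients over each batch.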
please keep posting videos, thanks
Great video