The Most Important Algorithm in Machine Learning
- Published 1 May 2024
- Shortform link:
shortform.com/artem
In this video we will talk about backpropagation, the algorithm powering the entire field of machine learning, and try to derive it from first principles.
OUTLINE:
00:00 Introduction
01:28 Historical background
02:50 Curve Fitting problem
06:26 Random vs guided adjustments
09:43 Derivatives
14:34 Gradient Descent
16:23 Higher dimensions
21:36 Chain Rule Intuition
27:01 Computational Graph and Autodiff
36:24 Summary
38:16 Shortform
39:20 Outro
USEFUL RESOURCES:
Andrej Karpathy's playlist: • Neural Networks: Zero ...
Jürgen Schmidhuber's blog on the history of backprop:
people.idsia.ch/~juergen/who-...
CREDITS:
Icons by www.freepik.com/
Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem
Can you talk about liquid neural networks? I'm interested to know whether that's revolutionary work that deserves more recognition and a following.
arxiv.org/pdf/2006.04439.pdf
Back prop is a hard, heavy thing to explain, and this video does it extremely well. I mean, that section 'Computational Graph and Autodiff' might be the best explanation of that subject on the internet. I'm very impressed - well done!
You two are the best channels I have found in the SoME episodes. It's great to see this interaction between you guys.
Love your videos
If there is no mention of sine waves in neural networks, then it won't be complete.
Funnily enough, the calculus portion of the video is probably one of the best explained I've seen
Why would that be 'funnily enough'? What a diss lmao.
@@George70220 I don't think CuriousLad meant it as a diss; it's just that when Artem made the video, he explained the calculus section as background information. The partial derivatives and gradient descent weren't the main topic of the vid, yet you could show this to a Calculus I student and they would be thanking him for the explanation, even if they have no interest in learning backpropagation! That's why "funnily enough": while the intro calc topics weren't the main part of the video, that portion would be very helpful to anyone starting out in calc!
I don't agree. For example, the act of minimizing the loss function and gradient descent were not properly linked; they were just two pieces of unprocessed information dumped in series.
"Wait, It's all derivatives?"
"Always has been"
Great work pal. Provides excellent clarity.
Looking forward to the second part.
It makes sense that you would cover both computational neuroscience AND machine learning since they both play a significant role in AI research. The sort of content you're making is definitely 3Blue1Brown level. Keep up the good work!
It's probably the best explanation of backward propagation. Hats off to your hard work and for sharing such valuable content.
By far the best ML explanation I have seen on the internet.
I just have to say this goes way beyond the quality of the many chain-rule videos I've seen so far. Good job man, you've got some impressive skills to keep me watching a math video and taking notes past my usual bedtime.
you take notes?
This is the best ML explanation I have seen on YT
This just might be the most underrated video on Back Propagation that I've ever seen! I hope more people come across this
The visuals in this video are from another planet. So good!!!
31 years old now; I had about 13 years of math in school and another 5 at university, and this is the first time I really understood how derivatives work, because of visualization instead of "you calculate it this way and derive it that way, now memorize".
Damn, I was wondering where you'd been for over half a year, whilst I was stuck on backpropagation 😂 and here you came back like a true mind reader. Glad to see you back ❤
He was calculating your backward step so you can make your next forward step (sorry, couldn't resist) XD
This is by far the clearest explanation and simplification of backpropagation I have watched.
There could not have been a better explanation. Hats off to you
This is one of, if not the, best videos I've seen that thoroughly explains backpropagation. It will definitely help me to better explain the algorithm to others, so thank you for creating it.
So clear and concise! Thank you for creating this.
Most Comprehensive Explanation EVER
My opinion: better than 3Blue1Brown. No offence to 3b1b, he's great at it and one of the pioneers of these kinds of visual explanations.
But I like your explanation, as it is slow-paced and comprehensive.
This has to be the best explanation of the chain rule ever! Thanks
This is incredibly well done and helped me visualize derivatives comprehensively. Thank you.
This is up there with 3Blue1Brown for mathematical explanation, animation quality and overall elegance. Well done.
This is a visual masterpiece! Well done!
Much of this was a review for me, as I took the time to go through all this last year. I did an implementation of the MNIST handwritten-digit neural network and had to learn all the calculus covered here to work out the backpropagation math. You really do have to dig into it to get a good handle on it, but it's fun stuff.
He is back! Greetings from Brazil, we've all been waiting for this release!
It's very, very nice to see that you are uploading again.
This is the best ever explanation I have seen. Thanks for taking the time and doing something extraordinary.
I have been doing ML research for a few years now, but somehow I was drawn to this video. I am glad to say that it did not disappoint! You have done an amazing job putting things in perspective and showing respect to calculus where it is due. We forget how simple derivatives power all of ML. Thank you for the reminder!
This is the best youtube channel in my feed, and I have many.
criminally underrated
Always impressive! Looking forward to the second one.
Excellent visualization! Keep posting like this! 😃😃
I've already watched the whole video about 5 times over these past weeks; this topic fascinates me.
Animation is great, but more and more people are doing it now. What makes this special is the story; the complexity build-up is perfect and efficient. One needs a deep understanding of the subject and strong teaching skills to produce this.
Best graphical experience with clear information. Really enjoyed it throughout the video!!!
Excellent video, thank you. I'm already looking forward to the synaptic plasticity video!
Wow, hats off to you! Can't even imagine how long it takes to make something like this
Thank you so much! The clearest explanation of the topic I've seen so far; amazing job! I wish I had had these kinds of videos during my school education.
Thank you for the illustration!
Wow wow wow wow! From what I gather here, the key to understanding ML predictions is that we are looking to fit the function f(x) = b + k1*x + k2*x^2 + k3*x^3 + k4*x^4 + k5*x^5. The machine just turns the dials until it finds the best fit, using a loss function such as MAE or MSE. So this is why ML needs so much GPU power! I'm mind blown, in case you didn't notice the wows earlier. :) Thank you so much for this.
Well, kind of. In ML in general we are not fitting that exact function. We can fit any function, and the functions in real deep learning models are very complex.
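For readers who want to see the "dial turning" concretely, here is a minimal sketch (all data, names, and constants are made up for illustration, not taken from the video) that fits a small polynomial's coefficients by gradient descent on the mean squared error:

```python
# Toy illustration of "turning the dials": fit the coefficients of a
# polynomial f(x) = k0 + k1*x + k2*x^2 to sample points by minimizing
# mean squared error (MSE) with gradient descent.

def predict(ks, x):
    # Evaluate the polynomial with coefficients ks at x.
    return sum(k * x**i for i, k in enumerate(ks))

def grad(ks, data):
    # Analytic gradient of MSE: dMSE/dk_i = (2/N) * sum (f(x) - y) * x^i
    g = [0.0] * len(ks)
    for x, y in data:
        err = predict(ks, x) - y
        for i in range(len(ks)):
            g[i] += 2 * err * x**i / len(data)
    return g

# Synthetic data sampled from the "true" curve 1 + 2x - 3x^2 on [-1, 1].
data = [(x / 10, 1 + 2 * (x / 10) - 3 * (x / 10) ** 2) for x in range(-10, 11)]

ks = [0.0, 0.0, 0.0]   # start with all dials at zero
lr = 0.1               # learning rate (step size)
for _ in range(5000):
    g = grad(ks, data)
    ks = [k - lr * gi for k, gi in zip(ks, g)]

print([round(k, 2) for k in ks])  # → [1.0, 2.0, -3.0]
```

Here the gradients are written out by hand; real frameworks compute them automatically via the computational-graph autodiff covered later in the video.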
All these basic concepts, such as derivatives and the least squares method, I'm learning in my college. Watching these kinds of machine learning videos has made me understand the practical applications of these theoretical concepts a bit better now 😌
I just made that in Python for a simple quadratic equation... THANK YOU!!! I just learned Python and machine learning!
Using a desired y = 0, I could also find one solution of the equation... wow, I love this so much!!
The only difference is that I made x the weight and not the coefficients, which I wanted to be fixed inputs.
What you helped me realise is that any system that can be put into a computational graph like the one at 30:04 can have backpropagation embedded in it, regardless.
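That observation — anything you can express as a computational graph supports backpropagation — can be sketched with a tiny scalar autodiff class, in the spirit of the micrograd-style examples in Karpathy's playlist linked above (a toy illustration, not code from the video):

```python
# Minimal scalar autodiff: build a computational graph as operations run,
# then backpropagate through it in reverse topological order.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # how to push grad to parents

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule node by node.
        order, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x = Value(3.0)
y = x * x + x          # f(x) = x^2 + x
y.backward()
print(y.data, x.grad)  # 12.0 and df/dx = 2x + 1 = 7.0
```

Note that gradients are accumulated with `+=`, so a node used in several places (like `x` here) correctly sums the contributions from every path through the graph.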
THANK YOU, I'm out of words.
Also, when the next loss was bigger than or equal to the previous loss after one iteration, I divided the learning rate by a factor of 2 or 10 for more accuracy; and when the next loss was smaller than the previous one, I multiplied the learning rate by a factor of 1.1 to 1.5 to speed up the process, thus getting results in hundreds or even thousands fewer generations/iterations and in less time!
I can use this for optimizing my desired outputs in any system !!! JUST WOW!!
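The learning-rate heuristic described above can be sketched roughly like this (a toy variant with made-up constants that also rejects the step when the loss gets worse, similar to a backtracking line search):

```python
# Adaptive learning rate in the spirit of the comment above:
# halve the step when the loss stops improving, grow it by 1.1x when it does.
# Minimizes f(w) = (w - 4)^2; the target and constants are illustrative.

def loss(w):
    return (w - 4.0) ** 2

def grad(w):
    return 2.0 * (w - 4.0)

w, lr = 0.0, 1.0
prev = loss(w)
for _ in range(200):
    cand = w - lr * grad(w)   # propose a gradient step
    cur = loss(cand)
    if cur >= prev:
        lr /= 2.0             # overshot: retry with smaller steps
    else:
        w, prev = cand, cur   # accept the step
        lr *= 1.1             # improving: cautiously speed up
print(round(w, 4))  # → 4.0
```

Compared with a fixed learning rate, this lets the initial step size be set aggressively, since bad steps shrink it automatically.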
Great video! Very elegant explanation of back propagation, and I’m super excited to see the different mechanics of biological neural networks! Keep up the good work.
I think I just found my favourite channel of all times.
I've been on YT since 2011 and never had a crush on a YT channel before today é.è
This is insane. I loved the video, keep it up!
A million dollar explanation. Thank you @Artem
You are the best source of understanding computation that is biological and organic (all ml stuff), thank you.
Thank you for these excellent explanations!
Artem back with another masterclass!
Very insightful video. Can't wait to see the second part. I would really love to see a video from you on spiking neural networks too!
Make more videos like this. I learned so much. Thank you for making these great videos.
Absolutely brilliant
Some people just want to see the world learning. Great Video Artem!
Outstanding explanation. Thanks
Excellent explanation - I already understood this conceptually but this video gives a very good intuition for the repeated chain rule application
Man this is such a great channel.
I loved this content. You rock it! Congratulations! ❤
I cannot imagine just how much effort and work this took to make.
Wonderful video, many thanks!
Amazing explanation!
This is just superb, thank you Artem! The timing couldn't be any better, as the gradient descent algorithm was mentioned in Dehaene's "How We Learn", which I'm currently reading.
Fantastic explanation and animations!
I enjoy watching your videos, thank you .
omg, what an explanation. You legend, more power to you !!!
This was amazing and mind blowing 🤩
amazing video!!!!
I am currently doing AI by Hand and was stuck on the back-propagation concept.
This really helped deepen my understanding of neural networks and back-propagation.
Really nice work! Congrats.
Amazing, enjoying very much!
great explanation!
This is the only thing I never understood; I hope to finally understand it. It's weird how this video got recommended just as I wanted to google backpropagation.
Phenomenal video
Excellent explanation
This video explains the mathematical basis of neural networks in a way I understood the first time, well enough to be able to explain it to somebody else. Thank you for that. I can't even imagine how much work you put into the animations. A masterpiece!
Glad to see an ML-related video from you! As you have a neuroscience background, I would love to see a video comparing current state-of-the-art architectures in ML with some of the inner workings of the brain. For example, whether there are any structures in the brain with some resemblance to GPT/transformer architectures; even though the brain is light-years away, I think that could be interesting :)
Top notch visuals man
Thanks Artem
Yo, I'm hyped for the next video
Aha! I get it now. Impressive effort to explain, thanks
Thank you sir.
I think this video alone made all my Calculus I and II classes make sense now
That is a very good explanation
Another gem of a video, well done Artem!! This channel deserves 1M+ subscribers; there's nothing else like it on YouTube.
This is beautiful!
I need the next video yesterday please!
Wow! This is a masterpiece.
The legend is back!
Mindblowing. Just the video I was looking for. TBH, initially I was a bit put off by your English, as I am not a native speaker myself. However, your knowledge, competence, and the hard work and research behind this video got me hooked. Liked and subscribed. And I will be watching this video many times. Well done!
Good Work, Congrats
Most important are the people who can both know and teach at the same time.
Amazing video ❤
That is very cool, man! Thanks. :)
I have to subscribe to this great teacher.
As always, magnificent!
When the stuff on YouTube is better, more intuitive, and better explained than in school 🤣
Wow. Wow. Wow. Thank you so much. This is instrumental for my study. Makes AI math a lot more approachable.
I swear I commented yesterday that I really hope to see another one of your videos.
The backpropagation topic is definitely important. Nevertheless, it is a neural network's implementation of feedback, a concept known from cybernetics. A neural network's simulation of the source data, based on interpolation of that data with subsequent extrapolation, is an equilibrium of the neural network. Such interpolation is widely described in mathematics, for example in the Monte Carlo method.
Thank you for considering this topic.
Great job, as always! I'm glad you don't forget about this channel and about us, your fans ^_^
Amazing !!
Excellent presentation. You made it feel like, from basic calculus, machine learning is just one simple step away. What would be interesting is: what are the theoretical underpinnings of this method? When do we say learning is successful? What is the computational complexity of neural networks?
subscribed! I just loved it
Nice colors in the equations ❤