Beautiful application of semi-abstract math to a whole class of problems. I've never seen such an elegant presentation of gradient descent - usually it gets lost in the clutter of "multivariable calculus". The whole section on the Frechet derivative was also excellent. Great long-form style that's getting harder to find on YT these days.
Give this man a nobel. Half of the master's students in AI struggle in understanding when seeing too many indices. It's better to demonstrate indices right on the code rather than in notations
This makes so much more sense. For example, for x^2 the derivative is 2x, and even if you understand the limit definition, why it is 2x doesn't make intuitive sense. But now it's essentially that the approximation to a nearby point on the curve x^2 is based on the straight line with slope 2x.
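To make the comment above concrete (my own quick numerical sketch, not code from the video): near a point a, the curve x^2 really is tracked by the straight line with slope 2a, and the leftover error shrinks faster than the step h itself.

```python
# Sketch: near a point a, f(x) = x**2 is approximated by the straight
# line through (a, f(a)) with slope 2a; the error is exactly h**2,
# which vanishes faster than |h| -- i.e. it is o(h).
def f(x):
    return x ** 2

a = 3.0
slope = 2 * a  # the claimed derivative at a

for h in (0.1, 0.01, 0.001):
    linear = f(a) + slope * h       # straight-line prediction at a + h
    error = abs(f(a + h) - linear)  # actual value minus prediction
    print(f"h={h}: error={error:.6g}, error/h={error / h:.6g}")
```

The ratio error/h printed in the last column keeps shrinking as h does, which is exactly the little-o condition.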
Thank you for the explanation. The graph where you show how to approximate scalar changes with calculus and little-o really helps.
I've been waiting for this material since I met you Charles!!! So excited.
Hope I lived up to expectations 😉
Loved it dude. From a fellow researcher in ML and Econ.
Thanks!
Excellent, this is what I wanted for a long time.
Just found your channel and thank you so much for contents, they are super helpful!
I'm going to repeat myself, this is extremely cool!
Thanks Charles
Hey, nice to meet you! I just found your channel and subscribed, love what you're doing!
I like how clear and detailed your explanations are as well as the depth of knowledge you have surrounding the topic! Since I run a tech education channel as well, I love to see fellow Content Creators sharing, educating, and inspiring a large global audience. I wish you the best of luck on your YouTube Journey, can't wait to see you succeed! Your content really stands out and you've put so much thought into your videos!
Cheers, happy holidays, and keep up the great work ;)
Thanks! I'm sure you know the struggle of posting videos and getting minimal feedback, so your message means a lot!
@@charles_irl Yeah, I totally understand - it is a grind whenever you first start a new channel. But keep going, your content is great and I love your teaching style! I think that more people should see these videos too, they are incredibly high-quality and refined! Keep going :)
How do you use calculus for ML? I mean, are we talking derivatives, or the whole of calculus including integrals, etc.?
In this video, I focus on just derivatives, because they come up more often in ML. In the view of this series, ML is programming by optimization, and derivative-based methods for optimization (gradient descent, adaptive gradient methods, second-order methods) predominate.
We'll touch briefly on integrals in the last video, on Probability, but in my opinion they're more trouble than they're worth (both trouble and worth have large magnitude!) unless you're specializing in a subdivision of ML that uses them.
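As a tiny illustration of the "programming by optimization" idea in the reply above (my own toy sketch, not code from the video), here is gradient descent in one variable:

```python
# Toy gradient descent: minimize f(w) = (w - 4)**2, whose minimum is at w = 4.
def grad(w):
    return 2 * (w - 4)  # derivative of (w - 4)**2

w = 0.0   # starting guess
lr = 0.1  # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)  # step in the direction opposite the gradient

print(w)  # converges toward 4
```

The same pattern, with vectors of parameters and a loss measured on data instead of (w - 4)**2, is essentially how neural networks are trained.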
@@charles_irl So you recommend focusing on derivatives more than integrals for ML?
Can you recommend any book or post where I can study the video material in detail, in simple words?
Thank you for sharing your knowledge 👍🙏
What the actual f, this derivative definition needs to be as standard as the limit ones.
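For anyone finding this thread later: the definition being praised is, as best I can tell, the little-o / best-linear-approximation form, shown here side by side with the usual limit form:

```latex
% Usual limit definition:
\[
  f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
\]
% Equivalent little-o (Frechet-style) form: f is differentiable at x,
% with derivative f'(x), if
\[
  f(x + h) \;=\; f(x) + f'(x)\,h + o(h) \qquad \text{as } h \to 0.
\]
```

The second form is the one that generalizes cleanly to vectors and matrices, where dividing by h no longer makes sense.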
hey, @charles_irl it would be great if you made a course like this but only for getting programmers ready for mathematics.
Call it "Intro to Mathematics for Programmers"?