The Most Important Algorithm in Machine Learning

  • Published Nov 21, 2024

COMMENTS • 518

  • @ArtemKirsanov
    @ArtemKirsanov  7 months ago +36

    Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem

    • @TNTsundar
      @TNTsundar 7 months ago

      Can you talk about liquid neural networks? I'm interested to know if it's revolutionary work that deserves more recognition and a following.
      arxiv.org/pdf/2006.04439.pdf

    • @webgpu
      @webgpu 2 months ago +2

      Artem, I want to thank you, not only for publishing excellent material (of the 100's of DL/ML videos I've saved, yours is top 5 - really), but also for your great intonation, which helps A LOT in capturing attention in this day and age of constant distractions. THANK YOU 🙌

  • @Mutual_Information
    @Mutual_Information 7 months ago +538

    Back prop is a hard, heavy thing to explain, and this video does it extremely well. I mean, that section 'Computational Graph and Autodiff' might be the best explanation of that subject on the internet. I'm very impressed - well done!

    • @33gbm
      @33gbm 7 months ago +4

      You two are the best channels I have found through the SoME episodes. It's great to see this interaction between you guys.

    • @dprophecyguy
      @dprophecyguy 7 months ago +3

      Love your videos

    • @michaelcharlesthearchangel
      @michaelcharlesthearchangel 7 months ago +2

      If there is no mention of sine waves in neural networks, then it won't be complete.

    • @ExtantFrodo2
      @ExtantFrodo2 5 months ago +1

      Where is that section 'Computational Graph and Autodiff'?

    • @entivreality
      @entivreality 4 months ago +2

      Yeah really helped me get the significance of autodiff

  • @CuriousLad
    @CuriousLad 7 months ago +292

    Funnily enough, the calculus portion of the video is probably one of the best-explained I've seen.

    • @George70220
      @George70220 7 months ago +3

      Why would that be 'funnily enough'? What a diss lmao.

    • @balu6923
      @balu6923 7 months ago +30

      @@George70220 I don't think CuriousLad meant it as a diss; it's just that when Artem made the video, he explained the calculus section as background information. Partial derivatives and gradient descent weren't the main topic of the video, yet you could show it to a Calculus I student and they would thank him for the explanation, even if they had no interest in learning backpropagation! That's why, funnily enough, while the intro calc topics weren't the main part of the video, that portion would be very helpful to anyone starting out in calc!

    • @veritas7010
      @veritas7010 7 months ago +2

      I don't agree; for example, the act of minimizing the loss function and gradient descent were not properly linked. They were just two pieces of information dumped in series, unprocessed.

    • @tonic4120
      @tonic4120 4 months ago +1

      I found it unnecessary. Anyone who clicks on a video about back propagation likely already knows calculus, and if they don’t, that short primer is not going to be enough foundation for the rest of the content.

  • @undertheshadow
    @undertheshadow 7 months ago +178

    "Wait, It's all derivatives?"
    "Always has been"
    Great work pal. Provides excellent clarity.
    Looking forward to the second part.

    • @rad6626
      @rad6626 6 months ago +5

      😂 Turns out back propagation isn’t just magic

  • @aminebouramada
    @aminebouramada 7 months ago +12

    This is by far the clearest explanation and simplification of backpropagation I have watched.

  • @shikhargairola5815
    @shikhargairola5815 7 months ago +27

    It's probably the best explanation of backpropagation. Hats off to your hard work. Saving this very valuable content.

  • @vastabyss6496
    @vastabyss6496 7 months ago +107

    It makes sense that you would cover both computational neuroscience AND machine learning since they both play a significant role in AI research. The sort of content you're making is definitely 3Blue1Brown level. Keep up the good work!

    • @nickwissler6811
      @nickwissler6811 5 months ago +8

      He also managed to squeeze an entire Calc 1 course into this single video. It's amazing.

  • @priteshtadvi4946
    @priteshtadvi4946 5 months ago +37

    I knew that calculus is important for machine learning, but I never knew that 12th-grade derivatives were this important.
    When you talked about the chain rule, it brought me back to my school days; I never thought that derivatives, integration, and probability would be used this way in the future.
    Well-explained video.
    Thanks for sharing this knowledge and conveying the process so simply.

  • @matheusmendonca1332
    @matheusmendonca1332 7 months ago +23

    By far the best ML explanation I have seen on the internet.

  • @keithwallace5277
    @keithwallace5277 6 months ago +41

    This has to be one of the greatest explanations of the inner workings of learning in ML. I love it!

  • @ReighKnight
    @ReighKnight 7 months ago +13

    The visuals in this video are from another planet. So good!!!!!!!!

  • @black_crest
    @black_crest 7 months ago +5

    This just might be the most underrated video on backpropagation that I've ever seen! I hope more people come across it.

  • @maheshwaransivagnanam6452
    @maheshwaransivagnanam6452 5 months ago +13

    I've been trying to get into ML for quite a while now. This is by far the best explanation of gradient descent and backpropagation, hands down!!!
    Amazing work!!!

  • @pavanmamidi7705
    @pavanmamidi7705 16 days ago

    Understanding something complex requires high intelligence. Explaining it simply requires even higher intelligence. You are one of the best teachers that I have encountered in my life! I'm grateful!

  • @aabiddd
    @aabiddd 7 months ago +4

    All these basic concepts, such as derivatives and the least squares method, I'm learning in college. Watching these kinds of machine learning videos has made me understand the practical applications of these theoretical concepts a bit better now 😌

  • @bungerwow7963
    @bungerwow7963 6 months ago +3

    I've seen probably 20 videos on this, and your explanation of the derivatives for someone not in calculus was really helpful. Thanks.

  • @Alwaysiamcaesar
    @Alwaysiamcaesar 6 months ago +3

    I actually pictured this all in my head successfully the other day, when I thought I had everything in a canonical deep neural network figured out. It's one thing to hold it in your head; it's another to do the detailed, gritty work of explaining it in video format. Very well done.

  • @Anonymous-fr2op
    @Anonymous-fr2op 7 months ago +40

    Damn, I was wondering where you've been for over half a year while I was stuck on backpropagation 😂 and here you come back like a true mind reader. Glad to see you back ❤

    • @highchiller
      @highchiller 7 months ago +3

      He was calculating your backward step so you can make your next forward step (sorry, couldn't resist) XD

    • @David123456789345
      @David123456789345 2 months ago

      @@highchiller He just gave you the right gradient so that you can optimize your learning loss function 😂

  • @gianlucanordio7200
    @gianlucanordio7200 7 months ago +8

    I just have to say this goes way beyond the quality of the many chain rule videos I've seen so far. Good job, man; you've got some impressive skills to keep me watching a math video and taking notes past my usual bedtime.

    • @marc_frank
      @marc_frank 6 months ago

      you take notes?

    • @kevinscales
      @kevinscales 4 months ago +2

      @@marc_frank It's generally a good idea if you are trying to learn. Don't be passive if you want it to stick.

    • @ZavierBanerjea
      @ZavierBanerjea 3 months ago +1

      Taking notes, making sketches of the ideas, doing the math are excellent learning techniques. Old timers like me always do that 👍

  • @wrtcookiedelta7560
    @wrtcookiedelta7560 1 month ago

    I should be watching gameplays, but here I am procrastinating. Jokes aside, I'm 13 minutes into the video and astonished by your crystal-clear explanations and the quality of your material. This is gold.

  • @Redant1Redant
    @Redant1Redant 5 months ago +5

    That was an outstanding explanation. Your ability to explain higher mathematical concepts in such simple terms is really an amazing service to the rest of us who wanna understand these subjects but don’t have a mathematics degree. Thank you.

  • @techshivanik
    @techshivanik 1 month ago

    This is the best resource I have found to learn backpropagation. The visualization of each concept made this very clear. I can't even imagine the amount of effort you have put into this video.

  • @K9Megahertz
    @K9Megahertz 7 months ago +9

    This is a visual masterpiece! Well done!
    Much of this was a review for me, as I took the time to go through all this last year. I did an implementation of the MNIST handwritten digit neural network and had to learn all the calculus covered here to work out the backpropagation math. You really do have to dig into it to get a good handle on it, but it's fun stuff.

  • @pradhumnkanase8381
    @pradhumnkanase8381 7 months ago +5

    There could not have been a better explanation. Hats off to you

  • @ChaseGartner
    @ChaseGartner 4 months ago +1

    Absolutely one of the best videos explaining data points and regression formulas I have ever seen. Amazing work

  • @HimanshuPakhale-n3i
    @HimanshuPakhale-n3i 1 month ago +1

    In simple words, backpropagation is the method for computing the gradients that gradient descent needs. To minimize the loss, you need to find the correct value of each parameter, and you get this via the chain rule, which is based on derivatives and calculus. The computation runs in the backward direction, hence the name backpropagation.
    There is still so much confusion in my mind regarding this process.
    The video is very useful and the editing is extraordinary.
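
A minimal sketch of what this comment describes, assuming a toy model y = w*x with a squared-error loss (the numbers and names here are hypothetical, not from the video): the chain rule carries the loss gradient backward to the weight, and gradient descent then nudges the weight against it.

```python
# Toy backpropagation: L = (y - target)^2 with y = w * x.
# The chain rule gives dL/dw = dL/dy * dy/dw, computed backward.

x, target = 2.0, 10.0   # fixed input and desired output
w = 1.0                 # the parameter to learn
lr = 0.05               # learning rate

for _ in range(20):
    y = w * x                    # forward pass
    dL_dy = 2 * (y - target)     # backward: derivative of the loss w.r.t. y
    dy_dw = x                    # backward: derivative of y w.r.t. w
    dL_dw = dL_dy * dy_dw        # chain rule
    w -= lr * dL_dw              # gradient-descent step

print(f"learned w = {w:.4f} (ideal: {target / x})")
```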

  • @kltr007
    @kltr007 7 months ago +7

    This video explains the mathematical basis of neural networks in a way I understood, for the first time, well enough to be able to explain it to somebody else. Thank you for that. I can't even imagine how much work you put into the animations. A masterpiece!

  • @sparkle2575
    @sparkle2575 3 months ago +3

    Excellent explanation!! You have done a selfless service to humanity.

  • @naveen_malla
    @naveen_malla 6 months ago +3

    Dude, this is the most beautiful ML video I've ever seen. Highly informative, yes, but also beautifully made. Thank you for your work.

  • @ajay0909
    @ajay0909 1 month ago

    This video would have saved me so many days that I spent researching backpropagation 2 years ago.

  • @MaitreJedi19
    @MaitreJedi19 7 months ago +2

    Animation is great, but more and more people are doing it now. What makes this special is the story; the complexity build-up is perfect and efficient. One needs a deep understanding of the subject and strong teaching skills to produce this.

  • @ahmeterdonmez9195
    @ahmeterdonmez9195 2 months ago +1

    The best and most understandable explanation I have ever seen. You explained the essential basis of Artificial Neural Networks so beautifully. I really congratulate you

  • @ram-my6fl
    @ram-my6fl 7 months ago +4

    Most Comprehensive Explanation EVER
    My opinion: better than 3b1b. No offence to 3b1b; he's great at it and one of the pioneers of these kinds of visual explanations.
    But I like your explanation, as it is slow-paced & comprehensive.

    • @domorobotics6172
      @domorobotics6172 5 months ago

      Yeah, 3b1b definitely deserves respect from me, but I think he too would recognize that this video is very carefully done.
      I like that these people just care about the truth and perfection, and, even with a little bit of envy, care about the best product being made.

  • @hasanrants
    @hasanrants 1 month ago

    I just watched this video after completing my first Deep Learning lecture on backpropagation and gradient descent.
    Thanks, man! Appreciated. Really solid content.

  • @ZavierBanerjea
    @ZavierBanerjea 3 months ago

    Guru of fundamentals. I can't resist subscribing to your channel and watching all of your videos. The way you explained the chain rule, the logic behind it, is awesome. I am trying to visualize the quotient rule of derivatives in your way. A good teacher always makes you THINK 🙏

  • @Martin-cz4zy
    @Martin-cz4zy 1 month ago

    Protect this guy at all cost please

  • @cachegrk
    @cachegrk 7 months ago +2

    This is the best ever explanation I have seen. Thanks for taking the time and doing something extraordinary.

  • @ForTheOmnissiah
    @ForTheOmnissiah 2 months ago

    I wish the Chain Rule was explained in this manner when I was in university. I understood how to do it on paper just fine, but this explanation makes the reasoning behind it make complete sense.

  • @MrRhainer
    @MrRhainer 5 months ago +3

    The best explanation of Deep Learning. Grateful.

  • @The-Martian73
    @The-Martian73 6 months ago +1

    If you couldn't understand this explanation, this visualization, this clarity... nothing else will work for you, I swear.

  • @asdasd-yr7wi
    @asdasd-yr7wi 7 months ago +55

    31 years old now, with like 13 years of math in school and another 5 years at university, and this is the first time I really understood how derivatives work, because of visualization instead of "you calculate it this way and derive it that way, now memorize".

    • @ArnaudMEURET
      @ArnaudMEURET 5 months ago +3

      May I ask which university you went to?

    • @WsciekleMleko
      @WsciekleMleko 4 months ago +3

      @@ArnaudMEURET Sorry, but I don't believe that anybody who has no idea what the tangent line of a function at a point x looks like, and what it means (despite a million exercises teaching its meaning, and dozens of graphs in literally every workbook), could actually get through 5 years of a math-related subject. This guy is straight up lying or trolling.

    • @abyssmage6979
      @abyssmage6979 3 months ago +1

      @@WsciekleMleko Exactly. People shit on school because "muh education system bad" and forget about all of the interesting stuff they actually teach.
      No, you're not heroes fighting against the big bad. You're just lazy and want an excuse to keep being so.

    • @Dr_Larken
      @Dr_Larken 1 day ago

      Um, how long did it take you to get through grade school? 13 years of math! Even if you did start learning maths in pre-K or K-12, it's basic mathematics until about grade 4, when you start building on that foundation with algebra, geometry, etc.
      In other words, the maths ain't mathing!

  • @BijouBakson
    @BijouBakson 7 months ago +4

    Wow wow wow wow! From what I gather here, the key to understanding ML predictions is that we are looking to fit the function f(x) = b + k1x + k2x^2 + k3x^3 + k4x^4 + k5x^5. The machine just turns the dials until it finds the best fit, using a loss function such as MAE or MSE. So this is why ML needs so much GPU power then! I'm mind-blown, in case you didn't notice the wows earlier. :) Thank you so much for this.

    • @szef_fabryki_azbestu
      @szef_fabryki_azbestu 6 months ago

      Well, kind of. In ML in general we are not fitting that exact function. We can fit any function, and the functions in real deep learning models are very complex.

    • @BijouBakson
      @BijouBakson 6 months ago

      @@szef_fabryki_azbestu The function above is the only function that is gradually adjusted by stochastic gradient descent (SGD). Watch the course again. The weights and biases that SGD is attempting to determine are those of the above function. They are used to make predictions in deep learning. You're confusing concepts here. Think again, please.

    • @szef_fabryki_azbestu
      @szef_fabryki_azbestu 6 months ago

      @@BijouBakson Unfortunately, you are the one confusing concepts. That's not how this works. Sure, in that particular example we are optimizing to get parameters for that particular function, but that's just a simple example.

    • @BijouBakson
      @BijouBakson 6 months ago

      @@szef_fabryki_azbestu It seems like we are stuck going back and forth, accusing each other of confusion. Maybe you are right! I mean, you could be Einstein for all I know. So... please help then: in summary, what do they refer to when they say weights and biases? How do you understand it?

    • @szef_fabryki_azbestu
      @szef_fabryki_azbestu 6 months ago

      @@BijouBakson Weights and biases in NNs are the parameters a and b of linear functions:
      y = a*x + b
      That's for 1 neuron. For one layer of neurons we can write it as a matrix multiplication and vector addition:
      Y = A*X + B
      On top of those functions we usually apply some non-linearity like ReLU, tanh, sigmoid, and so on. In classical multilayer feedforward networks we stack those layers on top of each other, e.g. f(g(h(x))). An example fully connected network with 3 layers and tanh as the activation function can be written as:
      Y = A_3*(tanh(A_2*(tanh(A_1*X + B_1)) + B_2)) + B_3. Here the weights are in the matrices A_1, A_2, A_3 and the biases in the vectors B_1, B_2, B_3. So no, in the general case when we train a NN we do not fit the function f(x) = b + k1x + k2x^2 + k3x^3 + k4x^4 + k5x^5. Of course you could say that you just meant a polynomial approximation of functions (Taylor expansion), but you explicitly mentioned only that particular f(x), a polynomial of order 5. What's more, neural networks are universal function approximators, so they can approximate any function to any degree of accuracy, but only if the network's activation function is not a polynomial.
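
As a sketch, the reply's 3-layer formula can be transcribed directly into code; the layer sizes and the random, untrained parameters below are arbitrary choices, just to make the shapes concrete.

```python
import numpy as np

# Y = A_3*(tanh(A_2*(tanh(A_1*X + B_1)) + B_2)) + B_3, as written above.
rng = np.random.default_rng(0)

def layer(n_in, n_out):
    # One fully connected layer: weight matrix A and bias vector B
    return rng.normal(size=(n_out, n_in)), rng.normal(size=(n_out, 1))

A1, B1 = layer(3, 4)          # 3 inputs -> 4 hidden units
A2, B2 = layer(4, 4)          # 4 -> 4
A3, B3 = layer(4, 1)          # 4 -> 1 output

X = rng.normal(size=(3, 1))   # one input column vector
H1 = np.tanh(A1 @ X + B1)     # first layer + non-linearity
H2 = np.tanh(A2 @ H1 + B2)    # second layer + non-linearity
Y = A3 @ H2 + B3              # final layer left linear, as in the formula
print(Y)                      # the (untrained) network's output
```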

  • @yordanyordanov6719
    @yordanyordanov6719 4 months ago +1

    Very good video, very well explained. But there is one problem you didn't mention: when training very deep neural networks with a sigmoid or tanh as the activation function, backpropagation loses its "powers". The learning process becomes extremely slow and the results are suboptimal. One of many solutions is to use a ReLU or ELU function in the hidden layers instead of sigmoid or tanh, along with how we initialize our weights at the beginning, for example He initialization...
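
A small sketch of the vanishing-gradient effect this comment refers to, with toy numbers of my own choosing (depth 20, layer width 256): the sigmoid's derivative is at most 0.25, so the chain-rule product across many layers shrinks toward zero, while ReLU's derivative is 1 for positive inputs; He initialization draws weights with variance 2/n_in to keep signal magnitudes stable in ReLU layers.

```python
import numpy as np

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)          # peaks at 0.25 when x = 0

depth = 20
# Worst-case chain-rule product of per-layer activation derivatives:
print(np.prod(sigmoid_grad(np.zeros(depth))))  # ~9.1e-13: gradient vanishes
print(np.prod(np.ones(depth)))                 # ReLU on positive inputs: 1.0

# He initialization for a ReLU layer: std = sqrt(2 / n_in)
rng = np.random.default_rng(0)
n_in, n_out = 256, 128
W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_out, n_in))
```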

  • @winterknight1159
    @winterknight1159 7 months ago

    I have been doing ML research for a few years now, but somehow I was drawn to this video. I am glad to say that it did not disappoint! You have done an amazing job putting things in perspective and showing respect to calculus where it is due. We forget how simple derivatives power all of ML. Thank you for the reminder!

    • @ArtemKirsanov
      @ArtemKirsanov  6 months ago

      Thank you! That’s really nice to hear!

  • @kakandeemmanuel7410
    @kakandeemmanuel7410 5 months ago

    I cannot tell you how excited this video got me once I realized I was understanding every single step effortlessly. 😂😂😂
    Thanks so much for the explanation.
    God bless you! 🙏🙏🙏🙏🙏🙏

  • @pankajgoikar4158
    @pankajgoikar4158 3 months ago

    Where were you all this time... I've been trying to understand this concept for the past 2 years, and now it's clear after watching this video. Honestly, my math was not up to the mark. After seeing your video, so many important concepts have become clear.
    I don't have enough words to thank you... God bless you.
    I'll share your videos with my friends. Please keep it up... 🙏🙏🙏🙏🙏🙏🙏🙏🙏

  • @TruthOfZ0
    @TruthOfZ0 7 months ago +3

    I just implemented this in Python for a simple quadratic equation... THANK YOU!!!! I just learned Python and machine learning!!!!!!!!!!
    Using a desired y = 0, I could also find one solution of the equation... wow, I love this so much!!
    The only difference is that I made x the weight instead of the coefficients, which I wanted to be fixed inputs.
    What you helped me realize is that any system that can be put into a computational graph like that 30:04 can have backpropagation embedded in it, regardless.
    THANK YOU, I'm out of words.
    Also, when the next loss was bigger than or equal to the previous loss after one iteration, I divided the learning rate by a factor of 2 or 10 for more accuracy, and if the next loss was smaller than the previous one, I multiplied the learning rate by a factor of 1.1 to 1.5 to speed up the process, thus getting results in hundreds or even thousands fewer generations/iterations and in less time!!!!!
    I can use this for optimizing my desired outputs in any system!!! JUST WOW!!
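
A rough reconstruction of the scheme this comment describes, not the commenter's actual code: x is treated as the weight, the loss pushes f(x) toward the desired output 0, and the learning rate shrinks when the loss worsens and grows when it improves. The quadratic is my own, and the factors 2 and 1.2 are picked from within the ranges the comment mentions.

```python
def f(x):
    return x**2 - 5*x + 6        # roots at x = 2 and x = 3

x, lr = 0.0, 0.01
prev_loss = float("inf")

for _ in range(200):
    loss = f(x) ** 2                 # squared error against desired y = 0
    grad = 2 * f(x) * (2*x - 5)      # chain rule: d(loss)/dx
    if loss >= prev_loss:
        lr /= 2.0                    # overshot: be more careful
    else:
        lr *= 1.2                    # still improving: speed up
    x -= lr * grad
    prev_loss = loss

print(f"x = {x:.6f}, f(x) = {f(x):.2e}")   # settles on one root
```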

  • @slopesmonte
    @slopesmonte 27 days ago

    Finally, a solid explanation of backpropagation. Thank you!!

  • @omarbadr9469
    @omarbadr9469 4 months ago

    Man, you really nailed it, especially the Computational Graph and Autodiff part. I have heard about these so many times in lectures from Stanford and elsewhere. However, this was impressive.

  • @dwinsemius
    @dwinsemius 5 months ago +2

    In traditional statistics (which preceded machine learning by many decades), the "loss function" was called the "deviance" or "the variance".

  • @Vinsce_lives
    @Vinsce_lives 7 months ago +2

    This is incredibly well done and helped me visualize derivatives comprehensively. Thank you.

  • @halilzabun
    @halilzabun 5 months ago

    One of the best visual explanations of the backpropagation algorithm I've seen! The animations are really good.

    • @javastream5015
      @javastream5015 5 months ago

      Are you sure that it was the backpropagation algorithm?

  • @RoodJood
    @RoodJood 1 month ago +1

    Simply the best presentation on the subject.

  • @Master_of_Chess_Shorts
    @Master_of_Chess_Shorts 7 months ago +1

    This has to be the best explanation of the chain rule ever! Thanks

  • @simaitools
    @simaitools 7 months ago +1

    Watching this video was like a breath of fresh air after some heavy math calculations! The visual explanations really helped solidify my understanding of backpropagation. I appreciate how clear and easy to follow the graphs were. Keep up the fantastic work! Can't wait for more graphic doses like this.

  • @krishnagupta31
    @krishnagupta31 4 months ago

    This is the most intuitive video I have ever come across. Amazing work!!!!!

  • @tonsetz
    @tonsetz 7 months ago +1

    He is back! Greetings from Brazil, we've all been waiting for this release!

  • @chilledpepsi
    @chilledpepsi 6 months ago +1

    Hands down the best explanation of backprop there is.

  • @moralboundaries1
    @moralboundaries1 7 months ago +6

    So clear and concise! Thank you for creating this.

  • @balajimarisetti4245
    @balajimarisetti4245 1 month ago

    Excellent explanation of back-propagation, the building block of machine learning. Thanks a lot.

  • @isaac10231
    @isaac10231 7 months ago

    I cannot imagine just how much effort and work this took to make.

  • @XxIgnirFirexX
    @XxIgnirFirexX 7 months ago

    I think I just found my favourite channel of all time.
    I've been on YT since 2011 and never had a crush on a YT channel before today é.è

  • @Binue-n5g
    @Binue-n5g 6 months ago +1

    The world needs more of you bro

  • @brahmatejachilumula2668
    @brahmatejachilumula2668 7 months ago +1

    Best graphical experience with clear information. Really enjoyed it throughout the video!!!

  • @eurob12
    @eurob12 5 months ago

    Very well explained how backpropagation works and how the loss function helps in determining the optimal minimum using calculus; great detail, which helps newbies like me understand this complex topic much better.

  • @nayanahgowda3219
    @nayanahgowda3219 5 months ago

    Hands down the best explanation I have seen so far! So clear and easy to understand!!

  • @OmarElghamry1
    @OmarElghamry1 2 months ago

    So much effort in this video; the quality of the content is at the same level as 3B1B. Keep it going, man.

  • @4th_wall511
    @4th_wall511 5 months ago

    Bro, I'm 2 minutes in and your graphics are insanely good; I can already tell this is going to be a treat. Holy smokes, man, I'm having a graphicgasm.

  • @HozanKano
    @HozanKano 4 months ago

    The best explanation of machine learning I have ever seen on YouTube. Amazing work, thank you 👍

  • @arjunrao9978
    @arjunrao9978 4 months ago

    This video has an amazing and easy-to-understand explanation of the basics of Calculus. Many Thanks to the Creator 🙏🏼

  • @eradubbo
    @eradubbo 2 months ago

    Best description of the topic on the internet!

  • @andrewshort6440
    @andrewshort6440 5 months ago

    Magnificent work, from the beautiful, creative, elegant design, to the mastery in teaching. Thank you!

  • @ElSenorEls
    @ElSenorEls 3 months ago

    This is the best video about this topic. Learned a lot of things. Took me 2 or more hours but I understand it now. Thank you!

  • @ndungikyalo
    @ndungikyalo 16 days ago

    Amazing how you can explain it so well, so simply. You have a subscriber!

  • @danielgsfb
    @danielgsfb 6 months ago +2

    What an amazing video. I hope one day they come up with some world prize for 'free education heroes'. 173k views for a video like this is simply disgusting. This guy deserves maybe 2 billion views. God damn it, that makes me mad.

    • @noth606
      @noth606 4 months ago

      Ehm, you do realize this flies over the heads of most people; you'd have to stack up thousands to find one person who is interested and can understand this properly. It is also not really necessary for a plumber or a bakery cashier to understand ML improvement/approach velocity, which is what I'd call this in a sense, or a visual way to pick a good method for it.

  • @shizzm1990
    @shizzm1990 7 months ago +1

    Some people just want to see the world learning. Great Video Artem!

  • @MultiMojo
    @MultiMojo 7 months ago +4

    Another gem of a video, well done Artem!! This channel deserves 1M+ subscribers; there's nothing else like it on YouTube.

  • @Shadow-g8r6s
    @Shadow-g8r6s 2 months ago +2

    Subscribed after just watching five mins.. 😊

  • @AlexKelleyD
    @AlexKelleyD 7 months ago +9

    This is one of, if not the, best videos I've seen that thoroughly explains backpropagation. It will definitely help me to better explain the algorithm to others, so thank you for creating it.

  • @Bhuvan_D
    @Bhuvan_D 3 months ago

    That was fire, bro! Gonna have to rewatch it to understand the backward step, but it's a lot clearer than most videos.

  • @haritadepalli959
    @haritadepalli959 7 months ago

    Excellent presentation. You made it feel like machine learning is just one simple step beyond basic calculus. What would be interesting is: what are the theoretical underpinnings of this method? When do we say learning is successful? What is the computational complexity of neural networks?

  • @developersteve1658
    @developersteve1658 3 months ago

    27:27. It clicked here.
    Seriously amazing video. Honestly, all your videos are.
    Thank you so much.

  • @DudeWhoSaysDeez
    @DudeWhoSaysDeez 7 months ago +4

    I'd love to see more videos relating to any relationships between artificial neural networks and biological neural networks

    • @egor.okhterov
      @egor.okhterov 7 months ago

      The only relationship is that ANNs store something and brain NNs also store something. That's it. The analogy ends there. Everything else is completely different =)

  • @Тима-щ2ю
    @Тима-щ2ю 6 months ago

    WOW!!! The amount of animation you have made is just incredible. I would really like to un-know backprop in order to fully appreciate this video!

  • @AkshayKumar-sg8qm
    @AkshayKumar-sg8qm 3 months ago

    That's the most amazing way of explaining such hard-to-understand things.

  • @fosowl
    @fosowl 7 months ago +2

    Glad to see an ML-related video from you! As you have a neuroscience background, I would love to see a video that compares the current state-of-the-art architectures in ML with some of the inner workings of the brain. For example, whether there are any structures in the brain with some resemblance to the GPT/transformer architecture. Even though the brain is light-years away, I think that could be interesting :)

  • @1ProsperousPlanet
    @1ProsperousPlanet 12 days ago

    Wow, amazing, thank you. I've read and watched many videos on this topic, and this is the one where I finally "got it".

  • @delete7316
    @delete7316 7 months ago +1

    As soon as I saw this video, I knew it was going to be the best of this kind on the Internet. And it was. Fantastic video!

  • @mou8842
    @mou8842 7 months ago

    I think this video alone made all my Calculus I and II classes finally make sense.

  • @philipm3173
    @philipm3173 7 months ago

    This is just superb, thank you Artem! The timing couldn't be any better, as the gradient descent algorithm was mentioned in Dehaene's "How We Learn", which I'm currently reading.

  • @enricoginelli3405
    @enricoginelli3405 2 months ago

    Some parts were really hard, can't deny that. Thank you for your work; it is amazing. How are you able to be so confident with these concepts while still at the PhD level?

  • @stewie__69
    @stewie__69 7 months ago +1

    I'm curious: are your video editing skills superior, or do your tech skills take the lead? Your expertise is remarkable! I'd love to see a video on the Transformer from you.

  • @teamredstudio7012
    @teamredstudio7012 7 months ago

    This is the only thing I never understood; I hope to finally understand it. It's weird how this video got recommended just as I wanted to google backpropagation.

  • @benmuller6103
    @benmuller6103 7 months ago

    Excellent explanation - I already understood this conceptually but this video gives a very good intuition for the repeated chain rule application

  • @obsidianSt6761
    @obsidianSt6761 3 months ago

    This video has amazing animations. You/your team clearly have very high attention to detail.

  • @ks0ni
    @ks0ni 7 months ago

    Wow, hats off to you! Can't even imagine how long it takes to make something like this

  • @TurinBeats
    @TurinBeats 5 months ago

    Waiting patiently for the second video 🫰♥️. Much love from Kenya; thank you for making me understand backpropagation. Started watching your channel because of Obsidian, stayed for the AI lessons 🫰.

  • @kalyanasundaramsubramanian2775
    @kalyanasundaramsubramanian2775 1 month ago

    Lucid explanation... I have yet to get my head fully around all of it, but if I review this a couple more times I am sure I will... Thanks for this; it has rekindled some interest in basic math...
    I was reminded of the 3Blue1Brown channel while watching this...