Backpropagation calculus | DL4

  • Published 18 Nov 2024

COMMENTS • 2.1K

  • @3blue1brown
    @3blue1brown  7 years ago +1920

    Two things worth adding here:
    1) In other resources and in implementations, you'd typically see these formulas in some more compact vectorized form, which carries with it the extra mental burden to parse the Hadamard product and to think through why the transpose of the weight matrix is used, but the underlying substance is all the same.
    2) Backpropagation is really one instance of a more general technique called "reverse mode differentiation" to compute derivatives of functions represented in some kind of directed graph form.
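
    Point (1) in concrete form: a minimal NumPy sketch of the vectorized formulas, where * is the Hadamard (elementwise) product and W.T is the transposed weight matrix. The layer sizes, random seed, sigmoid activation, and the cost C = ½‖a − y‖² are illustrative assumptions, not anything fixed by the video.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Toy network: 3 inputs -> 4 hidden -> 2 outputs (sizes are arbitrary)
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), np.zeros((4, 1))
W2, b2 = rng.standard_normal((2, 4)), np.zeros((2, 1))
x = rng.standard_normal((3, 1))
y = np.array([[1.0], [0.0]])

# Forward pass
z1 = W1 @ x + b1
a1 = sigmoid(z1)
z2 = W2 @ a1 + b2
a2 = sigmoid(z2)

# Backward pass, cost C = 0.5 * ||a2 - y||^2.
# '*' below is the Hadamard product; W2.T routes each error term back
# along the same weights the activations came through on the way up.
delta2 = (a2 - y) * sigmoid_prime(z2)         # output-layer error
delta1 = (W2.T @ delta2) * sigmoid_prime(z1)  # hidden-layer error

grad_W2 = delta2 @ a1.T  # dC/dW2, same shape as W2
grad_W1 = delta1 @ x.T   # dC/dW1, same shape as W1
grad_b2, grad_b1 = delta2, delta1
```

    A gradient-descent step would then just be `W2 -= lr * grad_W2`, and likewise for the other parameters.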

    • @iamunknownperiod3355
      @iamunknownperiod3355 7 years ago +18

      You should probably change the thumbnail. The snapshot of the variables with indices (which I didn't know were indices at the time) and subscripts almost deterred me from watching this, although it really wasn't that complicated.

    • @TalathRhunen
      @TalathRhunen 7 years ago +26

      I will probably be a TA for a lecture course on Deep Neural Networks again next semester and I will recommend this series to the students (we did it in a very math-heavy way this year and it was a bit too much for some of them, even though it's a lecture for master's students)

    • @sebaitor
      @sebaitor 7 years ago +8

      I was hoping you'd explain in either of these 2 vids on backprop why the Hadamard product and transposing are used, what a waste :(

    • @polychats5990
      @polychats5990 7 years ago +5

      Amazing video, I think you did a really good job of making it as easy to understand as possible while also not simplifying things too much.

    • @bgoggin88
      @bgoggin88 7 years ago +2

      Sirius Black what you could do is download the R package "deepnet" and poke around at the source code. It's written in base R, so you can follow it around. This is how I learned, and IMHO the best way to learn.

  • @thomasclark8922
    @thomasclark8922 1 year ago +2228

    This series was my first introduction to Machine Learning 3 years ago. I now work full-time as an AIML Scientist, my life is forever changed. Thank you.

    • @envadeh
      @envadeh 1 year ago +38

      How hard was it? I'm trying to code my own neural network from scratch; there are so few resources for that, it seems. And how do I even make myself unique?

    • @thomasclark8922
      @thomasclark8922 1 year ago +396

      ​@@envadeh Learn everything, use the feynman technique; if you can't explain how a machine learns to someone who knows nothing about it, keep filling in the gaps. Formal education is great, but honestly more of a waste of time than not. Teach yourself, learn how to learn, and then keep learning.
      I audited Andrew Ng's Deep Learning Specialization from Coursera, had some formal education, and taught myself everything I could get my hands on, from theory to application, from the underlying math to the practical programming. Understand the importance of data; DATA IS KING. Watch interviews with industry leaders, and understand the big turning points and continued development within the last two decades of AIML (you'll figure out what they are with time).
      It takes 10,000 hours to become an expert. I'm about 4,500 in, but all it took was a little bit of work every single day. Make learning a habit. Trust yourself; believe in your ability to become who you want to be.
      "It doesn't matter if you read two research papers in a week; what matters is if you read two research papers a week for a year, now you've read 100 papers" - Andrew Ng
      (Don't 'read' research papers, watch synopses! Smarter, not harder! There's so much free information, you could probably use a GPT model to teach you what you don't know!)
      Good luck, and I believe in you! :)

    • @nczioox1116
      @nczioox1116 1 year ago +7

      Did you need a CS or math degree to get into the field?

    • @thomasclark8922
      @thomasclark8922 1 year ago +72

      @@nczioox1116 "Need" is a strong word; it just depends on what kind of work you want to do and who your employer is. People used to go to college because that was the only place you could learn these difficult subjects, but now it's just an archaic way of putting you in debt, since you can learn these things online for free, UNLESS you want to work for an employer where you need the degree to be recognized.
      If you are self-motivated and can teach yourself these subjects, seriously consider your options before assuming that spending 4 years of your life and $100k+ is necessary.
      I have an Electrical Engineering degree, but out of the 40+ classes I had to take for it, only 2 have had any sort of impact on my daily job now. It all depends on the context.
      Good luck, and I believe in you! :)

    • @nczioox1116
      @nczioox1116 1 year ago +14

      @@thomasclark8922 Thank you! I have a mechanical engineering degree. I'm in the process of teaching myself machine learning concepts and doing some projects. Lots of job postings I've seen in the field seem to require a bachelor's or master's in CS, math, or neuroscience. Of course, these seem to be for larger companies, so maybe smaller companies take a more holistic approach.

  • @cineblazer
    @cineblazer 3 years ago +1540

    Dear Grant,
    A year ago, I decided I wanted to learn Machine Learning and how to use it to make cool stuff. I was struggling with some of the concepts, so I went to YouTube and re-discovered this series on your channel.
    Out of all the courses I've tried and all the hours of other content I've sat through, your videos stand out like a ray of sunshine. I just got my first full-time job as a Machine Learning Engineer, and I can confidently say it would never have happened without this series.
    Your channel may have affected the course of my life more than almost any other. Thanks for all your hard work!

    • @maruferieldelcarmen9573
      @maruferieldelcarmen9573 2 years ago +306

      You could say that this channel had the largest nudge to your activation value

    • @cineblazer
      @cineblazer 2 years ago +117

      @@maruferieldelcarmen9573 The partial derivative of Grant's videos with respect to my career is off the charts!

    • @souls.7033
      @souls.7033 2 years ago +12

      @@maruferieldelcarmen9573 get out 😂

    • @souls.7033
      @souls.7033 2 years ago +6

      @@cineblazer Also, I just saw your comment from 11 months ago; it's amazing to see your development! Keep it up!!!

    • @khai7151
      @khai7151 2 years ago +6

      Congrats on your job. I was wondering: when you finished Andrew Ng’s ML course, what additional steps did you take, and how long did they take, to become a full-fledged ML engineer?
      Thanks in advance

  • @kslm2687
    @kslm2687 6 years ago +3067

    “The definition of genius is taking the complex and making it simple.”
    - Albert Einstein
    You are a genius.

    • @jean-francoiskener6036
      @jean-francoiskener6036 5 years ago +48

      I thought he said "You don't understand something well until you can explain it in a simple way"

    • @fractal5764
      @fractal5764 4 years ago +10

      That's not the definition of genius

    • @ericayllon7497
      @ericayllon7497 4 years ago +3

      @@jean-francoiskener6036 Yes, it is a quote that appeared on this YouTube channel

    • @Djorgal
      @Djorgal 4 years ago +103

      "More quotes are attributed to me than I could possibly have said during my entire life." - Albert Einstein

    • @shawnjames3242
      @shawnjames3242 4 years ago +3

      @@Djorgal Did he actually say that?

  • @noahkupinsky1418
    @noahkupinsky1418 5 years ago +1064

    Hey, for all of you getting discouraged because you don't understand this: that was me last year. I went and taught myself derivatives and came back to try again, and suddenly I understood everything. It's such an amazing feeling to see that kind of work pay off. Don't give up, kiddos.

    • @kg3217
      @kg3217 3 years ago +6

      Thanks for the nice words 🙂

    • @angelbythewings
      @angelbythewings 3 years ago +15

      I studied this 3 years ago in college and it all makes sense to me now

    • @xbutterguy4x
      @xbutterguy4x 3 years ago +9

      Yup. I tried to watch this series a year ago and make my own neural network which turned out to be disappointing. A semester into college and some passion for calculus is all it took for me to mostly understand this series!

    • @sukhresswarun
      @sukhresswarun 2 years ago +2

      Same here, man. I saw this video a year ago, but only now do I understand it fully. Keep commenting!

    • @oskarwallberg4566
      @oskarwallberg4566 2 years ago +7

      I would say it's recommended to have taken calculus 2 (for partial derivatives and the Jacobian) and linear algebra (for matrix and vector multiplication). Otherwise, just looking up the mentioned things is also fine. But it might take time to build up intuition for the math.

  • @hutc22222222
    @hutc22222222 1 year ago +222

    Your work making high-level math accessible to anyone wishing to learn is no small feat to me. You succeed in explaining everything so clearly, making me want to start learning math again, reminding me of and introducing me to beautiful aspects of it. You deserve more than a 'thank you' :)

  • @hiqwertyhi
    @hiqwertyhi 7 years ago +946

    It's not that no-one else makes top-notch math/cs videos, it's that this guy makes it CLICK.

    • @ravenn2631
      @ravenn2631 5 years ago +15

      hiqwertyhi It rivals even the website BetterExplained. People like this teach me how to teach.

    • @vgdevi5167
      @vgdevi5167 1 year ago

      Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great quality resources, youtube channels, books on deep learning, and also math and comp science in general, what do you recommend?

  • @Mrrajender2801
    @Mrrajender2801 5 years ago +144

    Many guys claim to know. Some guys actually know. But only one guy actually knows and can also explain it to his grandma, with very beautiful animations. You are that ONE!!!

    • @Seff2
      @Seff2 3 years ago +4

      I think my grandma would understand ;D Maybe at a very, very abstract level.

    • @puppergump4117
      @puppergump4117 2 years ago +4

      @@Seff2 Python grandma

  • @suharsh96
    @suharsh96 7 years ago +414

    This is the longest 10 minute video I have ever watched. Literally took me half an hour, but the feeling of the idea behind this completely settling in makes it totally worth it!

    • @son_go_ma
      @son_go_ma 4 years ago +19

      I think it's easier if you take the time to re-write all of it yourself on a scratchpad, work through writing the formulas, etc. Then you can play with these objects in your mind more fluently. It takes longer initially, but I feel you get more out of it. And that's a good way to begin "translating" into vector algebra by taking on "simple" papers on DL (haven't got there myself yet).

    • @anjelpatel36
      @anjelpatel36 4 years ago +4

      @@son_go_ma This is so important. The math really doesn't click unless you write down where each derivative comes from, and the fact that you need more than one partial derivative for each layer.

    • @danielcampelo2
      @danielcampelo2 4 years ago +4

      @@son_go_ma Exactly what I had to do. I was getting the concepts, but only really understood once I started taking notes and performing the calculations myself (took even more time, as I had to re-learn derivatives...). The most interesting thing is that, now that I understand it, I'm even more appreciative of the way it's explained in the video.

    • @arnavrawat9864
      @arnavrawat9864 4 years ago +3

      Some memorisation is required.
      The way to understanding is easily recalling the different pieces and how they fit together.

    • @kevinsommerfield6341
      @kevinsommerfield6341 4 years ago +1

      I hear you. I rewatched videos 1 and 2 in this series earlier, and will be rewatching videos 3 and 4 later.

  • @yashjindal9822
    @yashjindal9822 1 year ago +94

    I just started out with my ML career. This entire series made me feel as if I knew it all along. Thank you Grant
    I will return to this comment to share my professional progress😊

    • @Mayank-lf2ym
      @Mayank-lf2ym 10 months ago +13

      Now it's time to return and tell us your progress

    • @azibekk
      @azibekk 7 months ago

      Any update?

    • @rmmaccount
      @rmmaccount 6 months ago

      we are waiting

  • @SaifUlIslam-db1nu
    @SaifUlIslam-db1nu 5 years ago +340

    It has taken me about 3-4 days' worth of time to understand these 4 lectures, which are, in total, no longer than 1 hour and 30 minutes.
    And I feel proud.

    • @debajyotimajumder2656
      @debajyotimajumder2656 4 years ago +12

      You should get the t-shirt merch from the site in 3b1b's description; the shirt says "pause and ponder"

    • @danielcampelo2
      @danielcampelo2 4 years ago +12

      Same here. Took my time to hear all the explanations. This last video is by far more complex than the previous ones, yet still very well explained.

    • @retrocodequest
      @retrocodequest 4 years ago +18

      I'm attempting to implement it from scratch in C# with no matrix math library or anything so I can get a feel for the nuts and bolts. This is the boss level!

    • @retrocodequest
      @retrocodequest 4 years ago +1

      @@berkebayraktar3556 Yeah, I'd love to once I can get it to train properly! So finicky.

    • @vibaj16
      @vibaj16 4 years ago

      Daniel McKinnon, me too! I'm working on the backpropagation; this math is hard.

  • @PhilippeCarphin
    @PhilippeCarphin 7 years ago +417

    This series is insanely good. As a teacher, I feel like Salieri watching Mozart play and being like "It's so beautiful, how is he so good!"

    • @stanislawgalas
      @stanislawgalas 6 years ago +11

      As a former mathematician I feel the same way :).

    • @hansdieter9911
      @hansdieter9911 4 years ago +4

      I like this analogy.

    • @sashwatdas5482
      @sashwatdas5482 4 years ago +1

      This is the first time I have ever liked a comment, because I could not agree more.

    • @seppl5372
      @seppl5372 4 years ago

      @Stanisław Galas I don't get why we want the derivative of C with respect to w^L. Can you explain, please? It isn't a division, right?

    • @alonsorobots
      @alonsorobots 4 years ago

      Too many notes, I do not understand!!

  • @shofada
    @shofada 6 years ago +324

    This is how 21st-century teaching should look. It feels like your work should be made a "human right". Thank you.

    • @fakecubed
      @fakecubed 7 months ago +1

      No human has the right to another human's labor. That's called slavery.

  • @SaintKhaled
    @SaintKhaled 1 year ago +97

    The quality of this education is top-tier. I absolutely am speechless that you make it freely accessible. Thank you so much!

  • @alexdebate7081
    @alexdebate7081 1 year ago +20

    Grant, I've come back to this series many times over the last five years. Every time I do, I pick up more and more pieces of the puzzle. I think I've finally got it now, but we shall see! Thank you!

  • @snf303
    @snf303 2 years ago +60

    When I finished university, I could not have imagined that one chilly Sunday evening, almost 15 years after graduation, I would sit with a bottle of beer, watch math videos, and have so much fun! Thank you!

  • @Ensorcle
    @Ensorcle 7 years ago +72

    I cannot tell you how much I appreciate these videos. I don't have a strong math background (english undergrad) but I'm teaching myself data science. It REALLY helps to have the equations explained rather than just presented and to tie the components of the equation back to intuitions. Thank you thank you thank you.

  • @bradleydennis210
    @bradleydennis210 4 years ago +45

    I just finished up calc III this semester, and I have never felt happier with myself for being able to apply my new knowledge than in this episode. I also don't think I have ever been more excited to hear calc III topics brought up in a field I am currently trying to teach myself. Thank you for making such a simple-to-understand series!

  • @vedant7090
    @vedant7090 3 years ago +29

    Man, you deserve a Nobel Prize for teaching Machine Learning with this simplicity.

  • @cineblazer
    @cineblazer 3 years ago +174

    I'm taking Machine Learning by Andrew Ng on Coursera right now, and just got stuck on backpropagation. Thank you thank you thank you thank you Grant, you have no idea how incredibly helpful your videos are and how much your channel has inspired me through the years.

    • @rembautimes8808
      @rembautimes8808 3 years ago +8

      I was in the same position 2 years back. This video does clarify the topic, tremendously.

    • @mrflyswat
      @mrflyswat 2 years ago +2

      Here I am. Same situation. Andrew Ng course and backpropagation is rough. This video in particular really helped to clear things up. Breaking it down to a single neuron is enormously helpful.

    • @barditheweird
      @barditheweird 2 years ago +5

      Same here!) I was somewhat disappointed when Andrew Ng's course just threw the formulas at me, so I tried to derive backpropagation myself and got stuck in all the little details. Thankfully, 3b1b rode in like a knight in shining armor and now I am really damn happy))))

    • @mo_l9993
      @mo_l9993 2 years ago +3

      I think the wheel gets reinvented with every newcomer!

  • @borg286
    @borg286 7 years ago +56

    The point where you addressed the concern that the example you were using was too simple, having only 1 edge, was spot on as you were leading me down this merry garden path. I appreciate how much you watch your own videos and predict where the watcher would mentally say, "but what about..."

  • @antoniobernardo9884
    @antoniobernardo9884 7 years ago +537

    This is easily the best channel on YouTube today! Once I get a job I will be more than glad to support you!

    • @utsavprabhakar5072
      @utsavprabhakar5072 6 years ago +22

      Exactly what i was thinking!

    • @chinmayrath8494
      @chinmayrath8494 5 years ago +23

      It has been two years. Have you supported yet??

    • @pranaysingh3702
      @pranaysingh3702 5 years ago +21

      Did you get a job ?

    • @fitokay
      @fitokay 5 years ago +7

      Two years ago, could you get an AI job?

    • @hozelda
      @hozelda 4 years ago +42

      @@fitokay I think AI won and got his job.

  • @vladimirfokow6420
    @vladimirfokow6420 2 years ago +173

    Thank you a lot for this series! It has really helped me get into this topic, and changed my life. Your intuitions have been immensely helpful in my efforts to understand backpropagation. I just can't overstate how great your channel is!

  • @thiyagutenysen8058
    @thiyagutenysen8058 4 years ago +295

    I came here after Andrew Ng's week 5 on Coursera, and you blew my mind

  • @mooglefan
    @mooglefan 4 years ago +14

    I've worked with AI for 2 years now. I have never seen anyone explain this as succinctly and aptly as you have. This video is legitimate gold. Going to show this to anyone who needs an explanation in future!

  • @vectozavr
    @vectozavr 6 years ago +329

    That is the reason for learning the math: to understand such beautiful things! That is awesome! Thanks a lot!!!

    • @anthead7405
      @anthead7405 4 years ago +12

      Math on its own is also a reason for learning math.

    • @vvii3250
      @vvii3250 3 years ago

      Funny to run into you here. :)

    • @owaisfarooqui6485
      @owaisfarooqui6485 3 years ago +1

      And when I asked my math teacher, that person told me I needed this to pass the test. That didn't make a lot of sense back then.

    • @krenciak
      @krenciak 3 years ago +3

      @@vvii3250 Yeah, it feels like a sort of mini-club where a small crew has gathered to hang out))

    • @bocik2854
      @bocik2854 3 years ago

      @@krenciak Heheheh

  • @jaysoaring6318
    @jaysoaring6318 7 years ago +9

    If there is an award for educational video series on advanced scientific matters, please give it to 3b1b. Love it!

  • @sainandandesetti3268
    @sainandandesetti3268 4 years ago +6

    Stunningly beautiful...
    The best part of the series (for me, obviously) is that the beauty of this series does NOT make it very easy to understand.
    No. Each video may need multiple views. But these videos are so beautifully made that you'd want to watch them again and again, not with the frustration of getting your head over a concept but with the thrill of unravelling a mystery...
    So for creating such excitement in me, thank you.

  • @vil9386
    @vil9386 2 years ago +1

    How easy it is to understand this through your lectures in just 10 minutes. THANK YOU.

  • @rohitdatla724
    @rohitdatla724 4 years ago +2

    You're not just teaching NN concepts but how to think, break down, understand, and digest any complex problem. YOU ARE AWESOME!!!!!

  • @elizfikret7489
    @elizfikret7489 2 years ago +17

    Thank you so much! I have understood more math from this channel than from all the teachers I have had in high school and university combined.

  • @LimitedWard
    @LimitedWard 4 years ago +22

    Absolutely brilliant explanation! I took a course on deep learning in college, but ended up auditing it in the end because I couldn't grasp the concepts well enough to pass the tests. You just took the entire first unit of the course, which took several weeks, and condensed it into 4 easily digestible videos that anyone can understand!

  • @Kevin-cy2dr
    @Kevin-cy2dr 4 years ago +4

    Honestly this channel doesn't deserve a dislike button. It took me days to figure out one video (at the beginning), but the concepts still remain in my head. This channel taught us that math is not just changing numbers; it's conceptual and intuitive, just like science. Grant, if you ever read this, please know that you are one of the very few people who change the world. I just don't have words for you, man; "great job" is an understatement. I promise once I earn enough I will contribute to your channel.

  • @nomasan
    @nomasan 4 months ago +1

    Rewatching this over and over and over again... that really does help with understanding it.
    It builds connections in the brain

  • @meeradad
    @meeradad 1 year ago +2

    These videos are the best way to make a high schooler fall in love with calculus instead of hating or fearing it, and to open their mind to the joy of creativity rooted in mathematical insights.

  • @AnshulKanakia
    @AnshulKanakia 6 years ago +4

    I can't tell you how long I've been trying to visualize all this in my head to get a solid mental picture of backpropagation... Well, I guess I can - it was the duration of a flight from Frankfurt to Seattle (about 9 hours) and it involved one terribly lit backside of an airplane menu and a shitty pencil. I am so grateful for the work you put into animating this algorithm. It has literally brought tears to my eyes and a smile on my face. Thank you.

  • @thfreakinacage
    @thfreakinacage 7 years ago +46

    My god! A basic machine learning video series that actually makes sense to complete beginners!
    Subscribed, and waiting in great anticipation for the next one! :D

  • @Redrumy0
    @Redrumy0 6 years ago +21

    Literally the only YouTube channel that makes studying 2 hours of math go by in the blink of an eye

  • @ashkankiafard4493
    @ashkankiafard4493 2 years ago

    The fact that I can understand what you're talking about shows that your teaching is flawless!

  • @ElNachoMacho
    @ElNachoMacho 1 month ago

    The visualization of the chain rule is pure gold. Thank you for putting together such informative videos. You have a gift in making complex concepts easier to digest through rich yet clear visualizations.

  • @samuelreed5481
    @samuelreed5481 6 years ago +20

    These videos are unbelievably well produced. Thank you so much for your effort. You've made this topic incredibly clear, and I cannot overstate how much I appreciate the amount of effort you put into these. You have incredible talent as a teacher.

  • @kangChihLun
    @kangChihLun 7 years ago +298

    This is the best and clearest explanation of all the backprop courses I could find, bar none!

    • @songlinyang8614
      @songlinyang8614 6 years ago +1

      Bar none, haha

    • @OGLOCK69
      @OGLOCK69 6 years ago +15

      Are you going to use this knowledge to help develop the social credit system?

    • @dysonsun3306
      @dysonsun3306 6 years ago

      kang Chih Lun Hahahaha

    • @shengchuangfeng227
      @shengchuangfeng227 5 years ago

      Good! Bar none!

    • @皖辉徐-r2j
      @皖辉徐-r2j 5 years ago

      You are right! Bar none, haha

  • @sergiokorochinsky49
    @sergiokorochinsky49 7 years ago +273

    I just unsubscribed from this channel, so I can have the pleasure of subscribing again.

    • @pranavnigam11
      @pranavnigam11 4 years ago +26

      I just unliked your comment...
      So I can have the pleasure of liking it again

    • @ifusubtomepewdiepiewillgiv1569
      @ifusubtomepewdiepiewillgiv1569 4 years ago +7

      i just liked and then unliked ur comment bc i realized it was at 69

    • @etsagcoenagpur1200
      @etsagcoenagpur1200 4 years ago +1

      @@ifusubtomepewdiepiewillgiv1569 nice

    • @neillunavat
      @neillunavat 4 years ago +5

      "Not gonna lie, you got me in the first half"

    • @neillunavat
      @neillunavat 4 years ago +1

      @@ifusubtomepewdiepiewillgiv1569 .

  • @DavidUgarteChacon
    @DavidUgarteChacon 5 months ago +1

    It is just absolutely crazy how well this guy teaches

  • @zilongzhao3274
    @zilongzhao3274 3 years ago +3

    Your videos should be shown in every university lesson; the animations make the calculations just so easy to understand.

  • @Erioch
    @Erioch 7 years ago +4

    Honestly, this is one of the best (if not the best) channels on mathematics/science education I have seen. Intuitive but not oversimplified. Thank you so much for offering your spectacular work and helping so many people understand these concepts.

  • @samarthsingla1082
    @samarthsingla1082 5 years ago +8

    The amount of help you are providing is nothing short of amazing.

  • @micahsheller101
    @micahsheller101 7 years ago +4

    Beautiful work! Reminds me of my late father, who was a math professor: he had the same gentle, happy style, and believed heartily in making math a safe place for everyone to learn and have fun. Gonna make me tear up :)

  • @santhoshnamballa
    @santhoshnamballa 3 months ago

    I am just blown away by the clarity he has over the concepts and the ability to put it out just as clearly.

  • @grigorioschatziandreou2558
    @grigorioschatziandreou2558 11 months ago +1

    THANK YOU - MSc student here. I've taken a module on machine learning. My university is world-class and the module was very well taught, so I already had good knowledge of neural networks. But now I am doing research and need to dive deep into them, and I realised how much I lacked deep understanding. This has helped A LOT.

  • @Jabrils
    @Jabrils 7 years ago +748

    You're a deity, Grant

    • @Jabrils
      @Jabrils 7 years ago +11

      Haha, why hello Bill. Nice to find you on 3B1B's channel :D

    • @bevel1702
      @bevel1702 6 years ago +10

      wut

    • @anjelpatel36
      @anjelpatel36 4 years ago +1

      wut

    • @gumbo64
      @gumbo64 4 years ago +1

      wut

    • @idr7789
      @idr7789 4 years ago +1

      wut

  • @giron716
    @giron716 7 years ago +6

    I seriously have a hard time explaining how much I appreciate this video. I am far and away a symbolic thinker, as opposed to a geometric one, and while I love all of your videos and how intuitive you make the concepts, it's sometimes hard for me to think about the geometry. I am much more comfortable working with symbols and that's why I treasure videos like this. Thank you :)

  • @matthewhaythornthwaite754
    @matthewhaythornthwaite754 4 years ago +32

    If anyone is interested, I worked through the chain rule for the derivative of the cost function w.r.t. the weight in the second layer down. Two additional terms are added, and everything cancels as it should. It shows how, as you progress down the layers, more partial derivatives are added to the equation from all the variables above, making it more unstable and hence more susceptible to the exploding or vanishing gradient problem.
    dC/dw(L-1) = dz(L-1)/dw(L-1) * da(L-1)/dz(L-1) * dz(L)/da(L-1) * da(L)/dz(L) * dC/da(L)
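
    That five-factor chain can be checked numerically. A small sketch with single-neuron layers, sigmoid activations, cost C = (a(L) − y)², and made-up parameter values (all of these are illustrative assumptions), comparing the analytic product against a central finite difference:

```python
import math

def sig(z):
    return 1.0 / (1.0 + math.exp(-z))

def dsig(z):
    return sig(z) * (1.0 - sig(z))

# Two single-neuron layers: a(L-1) = sig(w1*a0 + b1), a(L) = sig(w2*a(L-1) + b2)
w1, b1, w2, b2, a0, y = 0.5, 0.1, -0.3, 0.2, 0.8, 1.0

def cost(w1_):
    a1 = sig(w1_ * a0 + b1)
    a2 = sig(w2 * a1 + b2)
    return (a2 - y) ** 2

# The five factors from the comment, evaluated at w1 (here L-1 is layer 1):
z1 = w1 * a0 + b1; a1 = sig(z1)
z2 = w2 * a1 + b2; a2 = sig(z2)
analytic = (a0              # dz(L-1)/dw(L-1)
            * dsig(z1)      # da(L-1)/dz(L-1)
            * w2            # dz(L)/da(L-1)
            * dsig(z2)      # da(L)/dz(L)
            * 2 * (a2 - y)) # dC/da(L)

# Central finite difference for comparison
eps = 1e-6
numeric = (cost(w1 + eps) - cost(w1 - eps)) / (2 * eps)
```

    The two numbers agree to many decimal places; this kind of finite-difference check is the standard sanity test when implementing backprop by hand.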

    • @galileofrank5779
      @galileofrank5779 4 years ago

      what's the vanishing gradient problem?

    • @MrJmayz
      @MrJmayz 3 years ago +1

      I'm looking for this in the video. I'd appreciate it if you could share your work.

  • @n9537
    @n9537 4 years ago

    This 10 min video is pure gold. It lays down the math in an easy-to-understand, intuitive manner.

  • @jamesjin1668
    @jamesjin1668 4 years ago +1

    All of the animations you used are so simple yet at the same time so illuminating. I bet people would appreciate your videos even more if, at the end of the video, you zoomed out to show all of your visual aids in a nicely summarized flow chart / spatial diagram.

  • @saptarshimitra1267
    @saptarshimitra1267 7 years ago +623

    Amazing man..... I say 3gold1platinum

    • @General12th
      @General12th 7 years ago +59

      More like 3plutonium1antimatter

    • @TwentySeventhLetter
      @TwentySeventhLetter 7 years ago +37

      Yeah, cause he nukes the brains of laymen like myself.

    • @General12th
      @General12th 6 years ago +1

      @@ishwarkarthik327 It's not.

  • @sjgmc
    @sjgmc 7 years ago +6

    As a hobbyist programmer, I can't thank you enough! Once I finish my studies I will donate to you. :)

  • @nairanvac79
    @nairanvac79 5 years ago +38

    Thank you for starting your indices at 0.

  • @michaelarmstrong2251
    @michaelarmstrong2251 1 year ago

    Just watched this series of videos. I'm a mechanical engineer with no prior experience of machine learning - now I feel like I understand quite a few concepts that were hard to wrap my head around when learning from other sources. Absolutely awesome videos - well done!

  • @afsdsasasf
    @afsdsasasf 1 year ago

    Love your teaching style: teaching us like we are neural networks, showing us things step by step to absorb, instead of showing us everything in one go like presenters usually do.

  • @13thxenos
    @13thxenos 7 years ago +13

    Nicely done video.
    I learned backpropagation before, but it was hard, and I didn't use it manually (I used frameworks like TensorFlow, which use computational graphs and backpropagate automatically), so I'd forgotten how it actually worked.
    But this video is a great resource for newcomers to ANNs and for people like me who have forgotten the theory behind it all. Thank you.

  • @MeriaDuck
    @MeriaDuck 7 років тому +4

    After seeing a few pieces of books and descriptions on the internet about backpropagation, with this video I finally reached some kind of enlightenment (especially at about 4:50 into this video). Thank you so much for that!
    Just as a hobby, I was trying to implement a neural network from scratch in Java: plain objects for neurons, connections and layers. I wanted to really visualize how a neural network WORKS. (Visualize either as computer code, or maybe I even want to create some visual for it...) This will certainly help me on my way!

  • @ehsanmon
    @ehsanmon 7 років тому +6

    Thank you so much, Grant. I finally learned back prop, and I have become a patron. I wish I could do more.

  • @bigbluetunafish4997
    @bigbluetunafish4997 Рік тому +2

    Finally I finished these 4 chapters on neural networks, and some of your linear algebra and calculus stuff. I feel much better that now I have a deeper understanding of how neural networks work and have built up the base for further exploration of machine learning. Thanks very much for your effort in creating all these great videos.

  • @dteja92
    @dteja92 5 років тому +1

    God Bless You man! I have tried watching many videos about backpropagation but this series made my conceptual understanding and intuition super clear. Thanks a lot. You have no idea how happy I am right now to have understood the concept of backpropagation.

  • @prashamsht
    @prashamsht 5 років тому +6

    One of the best lectures I have ever heard. Great explanation of NN, cost functions, activation functions etc. Now I understand NN far far better...(P.S. I saw previous videos Part 1, 2,3 as well)

  • @GaborGyebnar
    @GaborGyebnar 7 років тому +80

    Awesome material. :) Please make a video about convolutional neural networks, too.

  • @Abstruct
    @Abstruct 7 років тому +50

    This stuff is an amazing supplement to Andrew Ng's courses; it gives a lot more intuition and visual understanding of the formulas.

    • @claytonharting9899
      @claytonharting9899 6 років тому +1

      It certainly is a huge help for backprop. Just the tree visual is a huge help. Btw, what do you think of 3b1b’s use of a bias value vs Ng’s use of a weighted bias node? I think 3b1b’s may be more clear, but the node version is more computationally efficient.

    • @Viplexify
      @Viplexify 6 років тому +13

      ... in which Ng mentioned that he still doesn't fully understand backprop. I wondered if it was true or just a consolation for beginners.

    • @ab452
      @ab452 6 років тому

      Consolation; it is just to soothe your frustration. But he could also be referring to the fact that you can understand how to compute it for a simple case, but in a large instance you simply lose track of it. Without a computer it would be a hopeless task.

    • @tanmaybhayani
      @tanmaybhayani 5 років тому +1

      Andrew should link this series in his course, because this is just beautiful!

    • @hayden.A0
      @hayden.A0 4 роки тому

      I'm actually here in the middle of Andrew Ng's course on machine learning. There were a few concepts I didn't completely understand, but they are quite clear now.

  • @catchingphotons
    @catchingphotons 3 роки тому +1

    Unarguably one of the best "tutorial" videos of all times! The carefully taken logical steps of understanding, the animations, the visualizations, the tempo, the examples... boggles my mind! This is a masterpiece!
    Greetings
    -Chris

  • @JinruWu
    @JinruWu Рік тому

    This is backpropagation so brilliantly explained; I can now feel how efficient this way of calculating is, and how it enabled computation that was not possible in the past. Teaching a Deep Learning course next week and I have been scratching my head trying to understand this concept - now I'm confident I will be able to explain it to my students with clarity. Thank you so much! You have touched my life as well as theirs :)

  • @notbobbobby
    @notbobbobby 7 років тому +7

    Right now, I am so thankful for having taken vector calculus and several numerical approximation courses. This was an AWESOME video to watch. Thanks! :)

  • @lagduck2209
    @lagduck2209 7 років тому +194

    *looking at thumbnail
    oh sh~, I'm never going to understand that complex stuff. Probably should watch it anyway
    *ten minutes later
    whoa! that's actually quite clear now!

    • @kairostimeYT
      @kairostimeYT 7 років тому +2

      same for me.

    • @lagduck2209
      @lagduck2209 7 років тому +6

      I just love math and educational videos on this topic) especially neural networks

    • @sage5296
      @sage5296 7 років тому +5

      I think that’s kinda the point of much of this channel lol. Same experience though! It’s amazing how a UA-cam video can do so much teaching in like 10 minutes

    • @tommysawyer9680
      @tommysawyer9680 6 років тому +4

      I had the same reaction. Beautifully broken down into calculus us mere mortals can understand!

    • @jessedampare1379
      @jessedampare1379 6 років тому

      Facts!

  • @aravindkannan9490
    @aravindkannan9490 7 років тому +26

    This is by far the best video I have ever seen in Neural Networks. Thanks for this! :)

    • @tisajokt7676
      @tisajokt7676 7 років тому

      I also suggest the video series on neural networks by Welch Labs, or if you've seen it already, I'd be interested to hear your comparison between it and 3Blue1Brown's series.

    • @aravindkannan9490
      @aravindkannan9490 7 років тому +3

      Just completed their playlist! Equally good :) I like the application-oriented explanation.
      However, I would still recommend 3B1B for an absolute beginner because of the in-depth explanation and for the help in visualizing the math behind it.

  • @bean217
    @bean217 3 роки тому +2

    I am currently going through Michael Nielsen's "Neural Networks and Deep Learning" book. This video helps to clear up and visualize the chapter on backpropagation a lot. Thank you for making this video series.

  • @thomasschwarz1973
    @thomasschwarz1973 Рік тому +5

    This is truly awesome, as pedagogy and as math and as programming and as machine learning. Thank you! ...one comment about layers: key in your presentation is the one-neuron-per-layer, four-layer setup. And key to the whole description of the cost-to-weight/cost-to-bias ratio analysis is your L (layer) and L - 1 notation. Problem: your rightmost neuron is the output layer, or "y" in your notation. So one cleanup in the description is to make some decisions: the rightmost layer is Y, the output (no L value), because dC[0]/dA[L] equals 2(A[L] - y). So the rightmost three neurons, from right to left, should be Y (output), then L, then L minus one, and then all the math works. Yes?

  • @cowcannon8883
    @cowcannon8883 6 років тому +664

    Neural networks have layers, ogres have layers
    Shrek is an AI confirmed

    • @icegod4849
      @icegod4849 5 років тому +4

      Goddamn nice reference would like it a thousand times over if I could

    • @minecraftermad
      @minecraftermad 5 років тому +4

      Shrek is our AI overlord

    • @Ammothief41
      @Ammothief41 5 років тому +6

      My AI overlord has decided they're both onions.

    • @inlandish
      @inlandish 5 років тому +2

      ~onions have layers too~

    • @shiveshramlall2809
      @shiveshramlall2809 5 років тому +1

      Big. Brain.

  • @patelnirmal4726
    @patelnirmal4726 7 років тому +288

    Awesome channel

  • @drstrangeluv1680
    @drstrangeluv1680 8 місяців тому

    Grant I was able to go through my entire PhD at University of Maryland by watching and learning Math from your videos. Now I am trying to get a full time position as an ML engineer and I come back to your channel again for understanding.
    I hope you know you are changing lives- you certainly changed mine. Thank you!

  • @homieboi5352
    @homieboi5352 6 місяців тому

    I’m gonna need to rewatch this a few times to grasp it all, but wow, what a thorough explanation of back propagation! I adore how you referenced the entire equation earlier in the series and it made no sense, but now you’ve broken it down entirely. Phenomenal work!

  • @kirilllosik7054
    @kirilllosik7054 Рік тому +7

    Thanks a lot for creating such fantastic content! Looking forward to seeing more videos about AI, ML, and Deep Learning!

  • @securemax
    @securemax Рік тому +3

    For everyone wanting to implement backprop from scratch: don't use dC/da = 2*(a-y). Instead use dC/da = a-y. This is because the cost function would actually be defined with a factor 1/2 in front which is missing here. Hence, the derivative changes. All other derivatives are good :)

    • @carloscortes2391
      @carloscortes2391 10 місяців тому

      Why would there be a 1/2 factor? To average the squared error, since there are 2 outputs?
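For anyone checking this numerically, here is a minimal sketch (plain Python; the toy values are my own choosing, not from the video) showing that the 1/2 convention only rescales the gradient:

```python
# Compare the two common cost conventions at a toy point.
# With C = (a - y)^2 the derivative is 2*(a - y); with the textbook
# C = 0.5*(a - y)^2 the 1/2 cancels the 2, leaving (a - y).
a, y = 0.8, 1.0
eps = 1e-6

def cost_video(a):            # C = (a - y)^2, as in the video
    return (a - y) ** 2

def cost_halved(a):           # C = 0.5*(a - y)^2, the textbook convention
    return 0.5 * (a - y) ** 2

# Central-difference numerical derivatives confirm the analytic forms.
d_video = (cost_video(a + eps) - cost_video(a - eps)) / (2 * eps)
d_halved = (cost_halved(a + eps) - cost_halved(a - eps)) / (2 * eps)

print(round(d_video, 6))   # -0.4, i.e. 2*(a - y)
print(round(d_halved, 6))  # -0.2, i.e. a - y
```

Either convention works as long as the gradient code matches the cost definition; the 1/2 just gives a tidier derivative.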

  • @sacation6057
    @sacation6057 5 років тому +4

    Awesome series! Even though I already had quite an intuitive feeling for the concepts of deep learning, your videos just always make complex subjects click in my mind; it sort of forms the right connections between the neurons in my mind, I suppose ;)
    Even without any advanced math knowledge I was able to follow your math, so thanks for choosing to keep your examples as simple as possible!
    I'm gonna make my own network from scratch in code some time, to see if I truly understand it thoroughly.

  • @ganjarulez009
    @ganjarulez009 3 роки тому +2

    I don't understand why the books can't explain this stuff this simply yet precisely. Great job!

    • @andrewvalenzuela1790
      @andrewvalenzuela1790 Рік тому

      I despised math books. It always felt as if the examples were not "showing their work"

  • @ssachdev1
    @ssachdev1 2 роки тому

    "So pat yourself on the back! If all of this makes sense, you have now looked deep into the heart of backpropagation, the work horse behind how neural networks learn." felt soooo goooooood

  • @AbhishekKumar-bo1yi
    @AbhishekKumar-bo1yi 7 років тому +8

    I always feel that if you have a mentor who can break complex things into simple stuff so beautifully, even a dumb guy can grasp the concept. Keep doing the good stuff. Admirer++

  • @bilalsedef9545
    @bilalsedef9545 2 роки тому +3

    This is a great and very educational video. But I think it needs one more part to show how the weights are updated.
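On that point, the missing step is plain gradient descent: once backprop gives dC/dw and dC/db, each parameter takes a small step against its gradient. A minimal single-neuron sketch (plain Python; the toy input, target, initial parameters and learning rate are all made up, using the 2(a - y) convention from the video):

```python
import math

# One neuron with one input: a = sigmoid(w*x + b), cost C = (a - y)^2.
x, y = 1.5, 0.0          # toy input and target
w, b = 0.8, 0.2          # toy initial parameters
lr = 0.5                 # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(w, b):
    return (sigmoid(w * x + b) - y) ** 2

c_before = cost(w, b)
for _ in range(100):
    z = w * x + b
    a = sigmoid(z)
    # Chain rule, exactly as in the video:
    # dC/dw = dz/dw * da/dz * dC/da = x * sigma'(z) * 2(a - y)
    dC_da = 2 * (a - y)
    da_dz = a * (1 - a)          # sigmoid'(z) in terms of a
    dC_dw = x * da_dz * dC_da
    dC_db = 1 * da_dz * dC_da    # dz/db = 1
    # The update step: move each parameter against its gradient.
    w -= lr * dC_dw
    b -= lr * dC_db

print(cost(w, b) < c_before)  # True: the cost went down
```

Real implementations do the same update for every weight and bias in every layer, usually averaged over a mini-batch of training examples.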

  • @NitinNataraj-gf3vx
    @NitinNataraj-gf3vx 7 років тому +251

    Netflix can show these rather than some other questionable material.

    • @4.0.4
      @4.0.4 7 років тому +26

      Nitin they would probably make it about the gender/race of different numbers, or draw some number in a way that fans of that number don't like.

    • @NitinNataraj-gf3vx
      @NitinNataraj-gf3vx 7 років тому +31

      A new phrase would emerge, "Backprop and chill"!

    • @sohailape
      @sohailape 6 років тому +5

      Netflix shows what people want. Don't blame them; blame the people around you, your friends, family and YOU.

    • @vineetk1998
      @vineetk1998 5 років тому +1

      then it won't be free (inaccessible to those who can't afford it), and Netflix doesn't care; they get their hands on whatever gets them money

    • @vineetk1998
      @vineetk1998 5 років тому +2

      Btw, it would be great if they could make education addictive. Lol, that would be really great. Imagine, haha

  • @youngsoochoy5592
    @youngsoochoy5592 4 роки тому

    This is the best mathematical explanation of the backpropagation of a neural network. I've watched other Coursera courses twice, but nothing compares to this well-visualized and easy-to-understand explanation.

  • @amulya1284
    @amulya1284 Рік тому

    Never in my life have I come across a calculus video explained so beautifully! In awe of this intuition.

  • @4AneR
    @4AneR 7 років тому +82

    What an art of math, Jesus Christ

  • @蘇志雄-h6n
    @蘇志雄-h6n 6 років тому +5

    A truly masterful work, explained extremely clearly. Amazing! Awesome!!

  • @qwert-cj4ld
    @qwert-cj4ld 5 років тому +13

    9:28 sums up the whole thing

  • @atiehhisham
    @atiehhisham 5 місяців тому

    This series explained all of this to me 100x better than the courses I paid for. You are a genius, keep up the great work !

  • @zhexiangxd
    @zhexiangxd 4 роки тому

    Sir, I can't thank you enough for how simply and clearly you explained this. Makes university professors look bad, tbh. Thank you so much!

  • @sharkk2979
    @sharkk2979 4 роки тому +3

    Thanks, lord, for the info
    Unparalleled!!!!

  • @TheZenytram
    @TheZenytram 7 років тому +4

    WOW, I thought that the math behind it would be waaayy more complicated. I know that this video has a lot of information to digest, but it's not complicated.

  • @youngnam1175
    @youngnam1175 7 років тому +6

    Thanks 3B1B. I'm understanding machine learning much better, and following your video while note-taking was the easiest method for learning.
    I'm a little confused about the change in C_0 with respect to a change in a_(L-1, k) for the k-th activation in layer L-1 (I just changed to this notation because I feel more comfortable writing like this in one line of text). That's the 8:40 part of the video, I guess. It doesn't make intuitive sense to me why you need the summation of the impact of a_(L-1, k) on a_(L, 0~n), say without any multiplier or something.
    Trying to understand the meaning of `dC_0/da_(L-1, k)`, I thought of a neural network with only two layers, input and output, the input layer having 1 neuron and the output layer having 2 neurons.
    Does it ever make sense for a_(L-1, k) to be an activation (or neuron?) in an input layer? If so, I think it makes sense to add the 'impact' all up, especially when the weights all have the same 'direction' or sign, because then summing them all up would result in a greater number, and this would mean changing the input has the biggest impact in this scenario.
    If not, I'm still confused about what `dC_0/da_(L-1, k)` is and why it has the summation.

    • @danielniels22
      @danielniels22 3 роки тому

      Hello, how are you? I know it's been 3 years since you made your comment. But for me, it's my first few weeks of learning Deep Learning, and now I'm trying to build a neural network from scratch before I learn frameworks like Tensorflow, Keras, etc.
      Do you happen to know now why we sum over the activations? Before watching this video, I thought of square-rooting each of the squared values, like we determine the magnitude of a vector. It turned out I was wrong after watching this video. I'm really looking forward to an explanation or any source of information from you about why we sum the changes dC/da(L-1).
      Thank you 😊🙏
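One way to see why the terms are summed: the same activation feeds several neurons in the next layer, so nudging it perturbs the cost through every one of those paths, and the chain rule adds the paths up. A toy check (plain Python; the weights, biases and targets are hypothetical) comparing the summed analytic gradient against a numerical derivative:

```python
import math

# One activation a feeds two output neurons, each contributing to C.
w = [1.5, -2.0]          # weights from a to the two output neurons
b = [0.1, 0.3]           # biases of the two output neurons
y = [1.0, 0.0]           # targets

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(a):
    # C = sum_j (sigmoid(w_j * a + b_j) - y_j)^2
    return sum((sigmoid(w[j] * a + b[j]) - y[j]) ** 2 for j in range(2))

a = 0.6
# Analytic gradient: sum over both paths of
# w_j * sigma'(z_j) * 2*(a_j - y_j)   (chain rule per path, then add).
total = 0.0
for j in range(2):
    z = w[j] * a + b[j]
    aj = sigmoid(z)
    total += w[j] * aj * (1 - aj) * 2 * (aj - y[j])

# The numerical derivative agrees only if BOTH path terms are included.
eps = 1e-6
numeric = (cost(a + eps) - cost(a - eps)) / (2 * eps)
print(abs(total - numeric) < 1e-6)  # True
```

Dropping either term of the sum leaves a gradient that disagrees with the numerical derivative, which is the quickest way to convince yourself the summation belongs there.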

  • @omerfarukozturk9720
    @omerfarukozturk9720 2 роки тому

    Literally, thank you. I learned, in a 10-minute video, the information that I could not learn at school in 5 weeks. The animations in the video are absolutely magnificent. Thank you, thank you, thank you.

  • @mukundholo6019
    @mukundholo6019 5 років тому +2

    This is math at its best, and art too at its highest: the grace of the animation, the subtle music, the perfectly paced narration and the wonderful colour scheme! Math and art, or let's say math is art!

  • @alpinstef3566
    @alpinstef3566 3 роки тому +4

    Great presentation! Very helpful. Although, could you clarify whether the indices at 9:30 are consistent? If you sum over j in the 'next' layer, you would have k in the current layer. Instead of w_jk^(l+1), would it be w_ij^(l+1), summing over i for that next layer? Thanks!

    • @marlanivanovich1828
      @marlanivanovich1828 2 роки тому

      It seems that the second formula (at 9:30) is inconsistent with the formulas from Andrew Ng's ML course, and to me it looks very confusing because of inaccurate use of indices. Let's simplify the vectorized formulas from the above course and keep using the derivative of the activation function sigma'(z(L)) in delta(L), as suggested in the video. With index notation we get, for d_C / d_a_j(l): sum_over(q=0 -> n(l+1)-1) [ w_qj(l+1) * sigma'(z_q(l+1)) * d_C / d_a_q(l+1) ]
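To check that the indices at 9:30 do line up, here is a toy sketch (plain Python, hand-picked numbers) computing the sum over j with w_jk both ways: as the index sum from the video and as the transpose-matrix product mentioned in the pinned comment:

```python
# w[j][k] = weight from neuron k in layer l to neuron j in layer l+1.
w = [[0.5, -1.0, 2.0],
     [1.5, 0.25, -0.75]]          # 2 neurons in layer l+1, 3 in layer l
sig_prime = [0.25, 0.5]           # sigma'(z_j^{(l+1)}), toy values
dC_da_next = [-0.5, 0.75]         # dC/da_j^{(l+1)}, toy values

n_next, n_l = len(w), len(w[0])

# Index form from the video: dC/da_k = sum_j w[j][k] * sigma'(z_j) * dC/da_j
index_form = [
    sum(w[j][k] * sig_prime[j] * dC_da_next[j] for j in range(n_next))
    for k in range(n_l)
]

# Vectorized form from the pinned comment: W^T (sigma'(z) ⊙ dC/da).
hadamard = [sig_prime[j] * dC_da_next[j] for j in range(n_next)]
vector_form = [
    sum(w[j][k] * hadamard[j] for j in range(n_next))  # k-th row of W^T
    for k in range(n_l)
]

print(index_form == vector_form)  # True: same numbers, two notations
```

The first index of w_jk names the destination neuron (summed over), the second names the current-layer neuron, which is why the vectorized version needs the transpose of W.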