Understanding Backpropagation In Neural Networks with Basic Calculus

  • Published Jan 6, 2025

COMMENTS • 34

  • @Slim 3 months ago +5

    Spent the past few weeks trying to grasp backpropagation mathematically. Spent countless days on videos and textbooks. This is by far the most intuitive explanation. Just subbed. Thank you.

    • @DrDataScience 3 months ago

      I am glad this was helpful and thank you for subscribing.

  • @dxlorean2938 2 months ago +2

    Hands down the best video about backpropagation out there. Subbed.

  • @hieuluc8888 9 months ago +3

    I couldn't really understand what a computational graph was until I watched this video. Thank you very much.

  • @cgqqqq 11 days ago +1

    EXACTLY the video I'm looking for, and 2:58 is the GEM. Everybody tries to explain backprop for each single weight of each layer (w11, w12, w13), and then you drown. Imagine there are 100 layers: how do you calculate w11, w12, ..., w100? Even a supercomputer would blow up. You really need to look at all the weights in a single layer as a MATRIX; that is the magic and the HACK of linear algebra. Only by doing this, no matter how deep the NN is and how many nodes there are in a layer, will you not be afraid of it.
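
The matrix view this comment describes can be sketched in a few lines of NumPy (an illustrative sketch, not code from the video; the layer sizes, sigmoid activation, and upstream gradient are arbitrary choices for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

# One dense layer: instead of tracking each weight w11, w12, ...
# individually, keep them all in a single matrix W (n_out x n_in).
n_in, n_out = 3, 2
W = rng.standard_normal((n_out, n_in))
x = rng.standard_normal((n_in, 1))   # input column vector

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass for the whole layer in one matrix product.
z = W @ x                            # shape (n_out, 1)
a = sigmoid(z)

# Backward pass: given dL/da from the layer above, one matrix
# expression yields the gradient of *every* weight at once.
dL_da = np.ones((n_out, 1))          # placeholder upstream gradient
delta = dL_da * a * (1 - a)          # dL/dz, elementwise sigmoid derivative
dW = delta @ x.T                     # dL/dW, same shape as W
dx = W.T @ delta                     # gradient passed to the layer below

assert dW.shape == W.shape and dx.shape == x.shape
```

The same three matrix expressions repeat per layer, so the code stays the same whether the network has 2 layers or 100.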

  • @leonmark2416 1 year ago +5

    The best session that I can recommend to anyone. I studied computer science and I am not good at math, and this was really a saver to continue on.

  • @asfandiyar5829 1 year ago +5

    Thank you again! This was incredibly simple to understand and exactly what I was looking for.

  • @omnnnooy3267 2 years ago +2

    I cannot express how thankful I am!

  • @parisahajibabaee2893 3 years ago +2

    You can present complex information in a simplified manner. Many thanks!

  • @esper5429 11 months ago +2

    The only video that explained everything to me.

  • @patrikszepesi2903 1 year ago +2

    At 12:18, in the bottom right corner, I think the matrix with the x's should be a 1x3 matrix and not a 3x1 matrix; otherwise the matrix multiplication won't work.

    • @DrDataScience 1 year ago +2

      Both vectors x and w are 3x1. Note that we are using the transpose of w, which means that the result would be (1 x 3) x (3 x 1), which is a scalar.
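
The shape bookkeeping in this reply can be checked in a couple of lines (an illustrative sketch; the numeric values are arbitrary, not taken from the video):

```python
import numpy as np

# Both x and w are 3x1 column vectors, as stated in the reply.
x = np.array([[1.0], [2.0], [3.0]])
w = np.array([[0.1], [0.2], [0.3]])

# w.T has shape (1, 3), so w.T @ x is (1 x 3) x (3 x 1) = (1, 1),
# i.e. effectively a scalar.
result = w.T @ x
assert w.T.shape == (1, 3)
assert result.shape == (1, 1)
print(float(result[0, 0]))  # 0.1*1 + 0.2*2 + 0.3*3 = 1.4
```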

  • @user-db2zx3ox7u 10 months ago +2

    Very good, thanks.

  • @Fantastics_Beats 2 years ago +2

    I love your teaching, brother.

  • @sparklexscrewyachivements813 1 year ago +2

    Thanks for the meaningful description.

  • @williammartin4416 10 months ago +2

    Very helpful

  • @sudheeshe1384 3 years ago +3

    Can you explain optimizer state?

  • @polycarpnalela2297 2 years ago +2

    Thanks!! It was very useful

  • @karthikb.s.k.4486 3 years ago +2

    Nice session

  • @karthikb.s.k.4486 3 years ago +4

    Can you please do a video on different cross-validation techniques in machine learning?

  • @kuldeepkaur-do2ex 9 months ago

    What is the phi with a1? Is it the activation function?

  • @shahulrahman2516 6 months ago +1

    Great video

  • @llothar68 9 months ago

    Do I need to understand all this to use neural networks? I'm scared shitless, but my curriculum says I have to learn it.

  • @ChristopherKennedy-fm2lo 8 months ago +1

    You're the goat

  • @feramuzalacal8998 5 months ago

    Nice video. Brother, I could tell from your accent.