04.2 - Recurrent neural networks, vanilla and gated (LSTM)

  • Published 4 Jan 2025

COMMENTS • 86

  • @ginodelferraro4240 · 3 years ago +27

    Nice lecture and impressive animations, mate. Your dedication to teaching is remarkable. Well done!

    • @alfcnz · 3 years ago +3

      Yay! 🥳🥳🥳

  • @hamedgholami261 · 2 years ago

    I just wanted to thank you. I really struggled to learn these concepts before, but when you explain them they seem easy, even though they aren't. Thanks!

    • @alfcnz · 2 years ago

      🙂🙂🙂

  • @DeepFindr · 3 years ago +3

    Wow! High quality content and super entertaining to watch!

    • @alfcnz · 3 years ago

      I'm glad you found it enjoyable. 😊

  • @Aswin255 · 2 years ago

    Love the animations at 7:20 professor! Thank you for the materials.

    • @alfcnz · 2 years ago

      🤓🤓🤓

  • @st0ox · 3 years ago +1

    The production quality is amazing

    • @alfcnz · 3 years ago +1

      I've put some hours of work towards a few new skills 😇😇😇

  • @НиколайНовичков-е1э

    Very impressive animations, Alfredo! I think this is the best explanation of ANNs I have seen. Thank you for your work!

    • @alfcnz · 3 years ago

      ❤️❤️❤️

  • @hueynguyen · 3 years ago

    Best lecture with visualization I've found on YouTube. Thanks a lot!

    • @alfcnz · 3 years ago

      You're very welcome! 😊

  • @jonathansum9084 · 3 years ago +1

    Thank you for your hard work in editing and recording, and for the effort of making new effects 🎇✨.

    • @alfcnz · 3 years ago

      Hehe, I'm having quite some fun! I'm also working on brand-new lessons.

  • @MaraLearns · 3 years ago +1

    These animations in the video are just amazing for understanding what's going on.
    P.S. I made 10,000 woooows while watching.

    • @alfcnz · 3 years ago

      Yay! 🥳🥳🥳

  • @НиколайНовичков-е1э

    Alfredo made an absolutely excellent explanation. There is also the paper "A Gentle Tutorial of Recurrent Neural Network with Error Backpropagation" by Gang Chen; maybe it will be useful to someone for a better understanding of BPTT.

    • @alfcnz · 3 years ago

      I'll look it up, thanks!

  • @SaonCrispimVieira · 2 years ago

    Thank you so much, Prof. Canziani! I think that when one loads zero into the abstract-concept state vector, it's not only loading an arbitrary value to initialize allocated memory, but defining zero as the absence of information, so the distance from the origin somehow represents the level of information you have. Just thinking a little about this part of the class. Thank you!

    • @alfcnz · 2 years ago +1

      You're welcome. You need to add a timestamp if you refer to any part of the video.

    • @SaonCrispimVieira · 2 years ago

      @@alfcnz 50:20

    • @alfcnz · 2 years ago +1

      Not necessarily. There's a bias there as well, which can counteract any kind of initial value you pick (as long as it's constant).
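Alfredo's point above — that the bias can counteract any constant initial state — can be sketched in a few lines of PyTorch. All dimensions and weights below are arbitrary, made up purely for illustration:

```python
import torch

torch.manual_seed(0)

d_h, d_x = 4, 3
W = torch.randn(d_h, d_h)   # hidden-to-hidden weights
U = torch.randn(d_h, d_x)   # input-to-hidden weights
x = torch.randn(d_x)        # first input

def step(h0, b):
    # one vanilla RNN step: h1 = tanh(W h0 + U x + b)
    return torch.tanh(W @ h0 + U @ x + b)

# Zero initial state with some bias b
b = torch.randn(d_h)
h1_zero = step(torch.zeros(d_h), b)

# Any constant initial state c is absorbed by shifting the bias: b' = b - W c
c = torch.full((d_h,), 2.0)
h1_const = step(c, b - W @ c)

print(torch.allclose(h1_zero, h1_const))  # True: same first hidden state
```

So the choice of a constant initial value (zero or otherwise) is not something the trained bias cannot compensate for.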

  • @gonzalomanuelbeade3651 · 2 years ago

    Amazing content.

    • @alfcnz · 2 years ago

      🥳🥳🥳

  • @kieranmcauliffe4821 · 2 years ago

    Are the answers to the homework assignments available anywhere? I'm having trouble getting my alphabetic-signal echo RNN to perform well without significant overfitting.

    • @alfcnz · 2 years ago

      No answers available (we may have similar exercises during future semesters). Feel free to ask in the Discord channel.

    • @kieranmcauliffe4821 · 2 years ago

      @@alfcnz oh I forgot there was a discord channel! That's a great idea, thanks! And thanks for uploading the course on YT!

  • @xXxBladeStormxXx · 3 years ago

    In your animation (it's awesome, btw!) starting at 6:35, how do you get a smooth, continuous morphing of the space? Since you have 5 hidden layers, shouldn't the animation just be 5 discrete images of the space as it appears after each layer?
    Edit: wait, I think you literally answer that right after the animation… asked too soon.

    • @alfcnz · 3 years ago

      Yup. Also, in the next lesson I explain that further.

  • @pavlosn · 3 years ago

    amazing class!

    • @alfcnz · 3 years ago +1

      I'm glad you're enjoying it!

  • @Anujkumar-my1wi · 3 years ago

    Why do deep neural nets require fewer neurons than shallow ones to approximate a function, in terms of function combination and composition?

    • @alfcnz · 3 years ago

      You can consider a single layer of 1,000 neurons to be roughly equivalent to three 10-neuron layers stacked on top of each other.

    • @Anujkumar-my1wi · 3 years ago

      @@alfcnz But why? And also, can we write a composition of functions as some combination of functions?

    • @alfcnz · 3 years ago

      @@Anujkumar-my1wi What does "combination of functions" mean?

    • @Anujkumar-my1wi · 3 years ago

      @@alfcnz I meant that when we use a neural net with one hidden layer, the output neuron takes the nonlinear functions learned by the neurons of the first hidden layer and computes a weighted sum of those functions to approximate a nonlinear function. That's what I meant by "combination of functions".

    • @alfcnz · 3 years ago

      That's called composition.
      So, your question asks «how do we write a composition of functions as a composition of functions?», which does not make sense.
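The rough width-vs-depth comparison in the thread above can be made concrete by counting parameters. The input/output dimensions below are arbitrary choices for illustration, not from the lecture:

```python
# Parameter count of a fully connected net: weights (a*b) plus biases (b)
# for each consecutive pair of layer sizes.
d_in, d_out = 10, 1

def mlp_params(dims):
    return sum(a * b + b for a, b in zip(dims[:-1], dims[1:]))

wide = mlp_params([d_in, 1000, d_out])        # one hidden layer of 1000 units
deep = mlp_params([d_in, 10, 10, 10, d_out])  # three hidden layers of 10 units

print(wide)  # 12001
print(deep)  # 341
```

The deep net uses orders of magnitude fewer parameters, which is the flavor of the width-vs-depth trade-off Alfredo mentions; the exact equivalence depends on the function being approximated.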

  • @chsafouane · 3 years ago

    You're a legend!

  • @mahdiamrollahi8456 · 3 years ago

    It is so worth watching 05:05 one thousand times… How did you make it so interesting? (I know that you used a mouse 🐭😉)

  • @Adityakumar-wc3ec · 3 years ago

    Hi Alfredo,
    Your videos are really wonderful and I want to thank you for uploading such quality material. However, I am unable to understand the data flow in RNN networks. For instance, say I have a sentence which is 4 words long. If I am using a fully connected network of 12 neurons, each of my 4 words will be fed into the 12 neurons. I want to understand what the data flow would be if I used an LSTM with 12 hidden states.
    Thanks.

    • @alfcnz · 3 years ago

      Perhaps you may want to check once again the «Training example» section, at 47:52.
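For the data-flow question above, a minimal shape check with `torch.nn.LSTM` may help. Only the hidden size of 12 and the 4-word sentence come from the question; the vocabulary size, embedding dimension, and token indices are arbitrary assumptions:

```python
import torch
import torch.nn as nn

vocab, d_emb, d_hidden = 100, 8, 12
emb = nn.Embedding(vocab, d_emb)
lstm = nn.LSTM(input_size=d_emb, hidden_size=d_hidden, batch_first=True)

# A 4-word sentence as a batch of one sequence of token indices
sentence = torch.tensor([[5, 17, 42, 3]])  # shape (1, 4)
x = emb(sentence)                          # (1, 4, 8): one vector per word
out, (h_n, c_n) = lstm(x)

print(out.shape)  # (1, 4, 12): a 12-dim hidden state emitted per word
print(h_n.shape)  # (1, 1, 12): final hidden state
print(c_n.shape)  # (1, 1, 12): final cell state
```

Unlike a fully connected layer, the 12 hidden units are applied once per time step, each step consuming one word vector plus the previous hidden and cell states.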

  • @roughr4044 · 3 years ago

    Amazing... Thank you 😊

    • @alfcnz · 3 years ago

      You're most welcome ☺️

  • @dr.mikeybee · 3 years ago

    Alf, Thank you for making all these interesting lectures and materials available. I'm just loving this. BTW, you keep mentioning Yann's lectures. Can I view those too? Were they recorded?

    • @alfcnz · 3 years ago

      Just follow the playlist. Every lecture from Yann has one of mine. The link is in the video description.
      Let me know if you cannot find it.

    • @dr.mikeybee · 3 years ago

      @@alfcnz Thank you, Alf. I don't know why I didn't see all the other lectures. I'm glad I asked. Even though I'm getting the material from your lectures, I'm a firm believer in reinforcing work through repetition. Cheers!

  • @xXxBladeStormxXx · 3 years ago

    These videos keep getting better! Excellent explanations and animations! Btw, are you familiar with the book 'Mathematics for Machine Learning' by Deisenroth et. al.? I'm really enjoying reading it. Might be a good additional resource for your students.

    • @alfcnz · 3 years ago

      Thank you 😌
      And yes, we've been recommending Marc's book to our students already (especially the chapter on gradients).

  • @noble-sword · 3 years ago

    Hi brother. Could you please make a video on text-to-video using GANs? Specifically for videos which have annotations.

    • @alfcnz · 3 years ago

      Interesting. What paper are you talking about?

    • @noble-sword · 3 years ago

      @@alfcnz "Zero-Shot Anticipation for Instructional Activities". But I want to create a GAN which will take in a recipe text and create a video for it, using the Tasty dataset.

  • @hamedgholami261 · 2 years ago

    Hey Alf, how is it going? I have a question from this week's homework that I worked super hard to answer (I spent about a month trying) but couldn't find the answer, so I thought maybe I could ask you. However, I hate wasting an important person's time like yours, so if there's any place I can ask these questions without taking your valuable time, please point me to it.
    That being said, let me state my question. In the theory part of the homework, Question 1.1, part C, subpart (ii), you ask for the derivative of F with respect to the weight matrix W. First of all, I worked out the dimension of the function F to be 1 × 1 × 2. There my problem begins: I know how to differentiate a vector with respect to a matrix, but those two extra dimensions (the first and second dimensions, both 1) are giving me a hard time figuring out the dimension of the derivative. I know that the dimension of the derivative should be the number of elements in the output times the weight-matrix dimensions transposed, and I worked it out to be 1 × 1 × 2 × 1 × 3 × 2, but it really seems wrong to me. Am I missing something?

    • @alfcnz · 2 years ago

      We have a Discord channel for homework discussions. 😀

    • @hamedgholami261 · 2 years ago

      @@alfcnz oh thank you, that's such a relief!
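The general shape rule touched on in the thread above — the Jacobian's shape is the output's shape followed by the input's shape — can be checked numerically with `torch.autograd.functional.jacobian`. The toy map below is made up for illustration and is not the homework's actual F:

```python
import torch
from torch.autograd.functional import jacobian

torch.manual_seed(0)
x = torch.randn(3)

def F(W):
    # Toy map from a 3×2 weight matrix to an output of shape (1, 1, 2)
    return (x @ W).reshape(1, 1, 2)

W = torch.randn(3, 2)
J = jacobian(F, W)
print(J.shape)  # torch.Size([1, 1, 2, 3, 2]): output shape, then input shape
```

Singleton output dimensions simply carry through as extra size-1 axes of the Jacobian; they do not change which entries are nonzero.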

  • @username42 · 3 years ago +1

    Do we get a certificate after watching those videos? :D

    • @alfcnz · 3 years ago +1

      Haha, only if you can write the notebook from scratch 😉

  • @mostechroom9780 · 3 years ago

    What do you use for your animations?

    • @alfcnz · 3 years ago

      Matplotlib, PowerPoint, Adobe After Effects. It depends on what animations you're talking about.

    • @mostechroom9780 · 3 years ago

      @@alfcnz Particularly ones that are similar to 3Blue1Brown's, and ones where you can animate what happens with data structures. (Isn't Matplotlib a Python library?)

    • @alfcnz · 3 years ago

      Can you add minutes:seconds so I understand what you're talking about?

    • @mostechroom9780 · 3 years ago

      @@alfcnz At 6:48, the animation of the plane?

    • @alfcnz · 3 years ago

      Yeah, matplotlib.

  • @teegnas · 3 years ago

    Nicely explained… thanks a lot for uploading these to YouTube. Quick questions though: any idea when practica #1 will be available? And Lecture #4 is missing from the 2020 playlist (ua-cam.com/play/PLLHTzKZzVU9eaEyErdV26ikyolxOsz6mq.html).

    • @alfcnz · 3 years ago

      There was no Lecture 4 in 2020 (it was Presidents' Day). Practica 1 is basically me going over the syllabus and Yann giving an intro on the history. Not sure if it's worth adding it.

  • @EdeYOlorDSZs · 2 years ago

    Great lecture, loved the examples. You could be even better if you worked on your articulation and accent!

    • @alfcnz · 2 years ago +1

      Hahaha 😸😸😸 My Italian is perfect! 😛😛😛

  • @paolomontesel · 3 years ago

    GG, and thanks for the effort. Maybe show an anonymized chat in future streams? Not necessary, of course.

    • @alfcnz · 3 years ago

      The students' chat, you mean?

    • @paolomontesel · 3 years ago

      @@alfcnz Yep. You sometimes reply to the chat and it breaks the flow a bit, since we don't see it. That being said, it's nothing major. This is amazing content and it's free, so I'm not complaining (: Just wanted to suggest a possible improvement.

    • @alfcnz · 3 years ago

      @@paolomontesel these are my live lessons, where my NYU students ask anything during class. I read their questions out loud if they are relevant. So, I'm not 100% sure I understand why I should have the text on screen. Can you expand a little? I'm really interested in understanding your opinion.

  • @XX-vu5jo · 3 years ago

    NYU SHOULD GIVE YOU A FUCKING RAISE OR PROMOTION FOR THIS!!!

    • @alfcnz · 3 years ago +1

      Hahaha 😅 I agree 😅 I don't have a stable job nor a comfortable salary 🤣 but at least I do what I love!

    • @XX-vu5jo · 3 years ago

      @@alfcnz we need you here!!!

  • @pasquale7226 · 3 years ago

    that sounded like the intro of a scifi erotic novel lol

    • @alfcnz · 3 years ago

      😮😮😮

  • @timarbatis640 · 3 years ago

    He's got 99 likes but a dislike ain't one.

    • @alfcnz · 3 years ago +2

      Haters haven't come over just yet 🤣