Let's Unfold RNN | Recurrent neural network explained | Recurrent neural network complete tutorial

  • Published 5 Sep 2024

COMMENTS • 32

  • @suganyaramu4929 4 days ago

    I referred to so many videos regarding RNN, but only yours is clear and in depth. A true mentor. I salute you, sir.

  • @piusranjan 1 month ago

    Amazing explanation. I am really surprised there are so few likes here!!! Please keep it up.

  • @yosupa 6 months ago +2

    Amazing explanation, boss. The way you have peeled apart the concept! After listening to 5 videos from experts, I could finally understand it.

  • @jsridhar72 8 months ago

    Excellently presented. Even StatQuest failed to teach it better. Kudos!!!

  • @user-tc4yb3sf4w 18 days ago

    Sir, good morning. It's an excellent contribution for all categories of people related to this field. It's wonderful, thanks a lot. Just a small doubt, sir: in this video you mentioned adding the bias term (at 17:25) in the formula. My doubt is, in the network, where is the bias added: at the end, or at the O2 level? Please clear my doubt. Thank you once again.
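    For readers with the same doubt: in the usual formulation of a simple RNN (the symbols may differ from the video's O2 notation), there are typically two biases, one added inside the hidden-state update at every time step and one added at the output layer. A minimal NumPy sketch of that common convention, with made-up sizes:

```python
import numpy as np

# One RNN time step, following the common convention: the hidden bias b_h
# is added inside the recurrent update at every step, and a separate bias
# b_y is added at the output layer.
n_in, n_hidden, n_out = 3, 4, 2
rng = np.random.default_rng(0)

W_xh = rng.normal(size=(n_hidden, n_in))      # input  -> hidden
W_hh = rng.normal(size=(n_hidden, n_hidden))  # hidden -> hidden (recurrent)
W_hy = rng.normal(size=(n_out, n_hidden))     # hidden -> output
b_h = np.zeros(n_hidden)                      # bias inside the recurrent update
b_y = np.zeros(n_out)                         # bias at the output layer

x_t = rng.normal(size=n_in)    # input at time t
h_prev = np.zeros(n_hidden)    # hidden state from time t-1

h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)  # b_h added here, every step
y_t = W_hy @ h_t + b_y                           # b_y added once per output
```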

  • @user-uz5dt7ch7n 11 months ago

    You're so good, Aman, and you explain the basics so well. Thank you very much for sharing these videos.

  • @charlesfon7398 2 months ago

    Amazing explanation. Thanks, sir

  • @swathiangamuthu2226 9 months ago

    Very nice Aman sir... Thank you for your help...

  • @deepakdodeja4663 8 months ago

    Wonderful way of initiating the video.

  • @RinkiSingh-ph6oo 1 year ago

    Very very informative session

  • @geekyprogrammer4831 1 year ago +1

    Fantastic job, Aman. Please create a video on LSTM also.

  • @user-jt4vn2dn8g 8 months ago

    Best Explanation

  • @SelfBuiltWealth 1 day ago

    Sir, please help: the part at 14:45 where the output of the recurrent node is passed to the next time step of the same node BUT ALSO passed to the other nodes in the hidden layer. I didn't understand that part; please explain to me intuitively/mathematically how it works ❤
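    What such diagrams usually express (a sketch of the standard simple-RNN update, which may differ in detail from the figure at 14:45) is that the whole hidden-state vector at time t is fed back through a square recurrent weight matrix, so each hidden unit's output reaches every hidden unit, itself included, at time t+1.

```python
import numpy as np

# Sketch: because W_hh has shape (n_hidden, n_hidden), unit j's output at
# time t contributes to every unit i at time t+1 through the entry W_hh[i, j].
n_hidden = 3
rng = np.random.default_rng(1)
W_hh = rng.normal(size=(n_hidden, n_hidden))

h_t = np.array([0.5, -0.2, 0.8])   # hidden-unit outputs at time t
recurrent_input = W_hh @ h_t       # what every unit receives at time t+1

# Contribution of unit 0's output alone to each unit at the next step:
print(W_hh[:, 0] * h_t[0])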

  • @keshav6930 5 months ago

    Nice explanation. I have one question:
    1) Will the data/output from the neurons of one layer be passed to another neuron in the same layer?
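    One way to check this in a framework (a sketch assuming TensorFlow/Keras is installed; the video itself may not use Keras): a SimpleRNN layer has a square recurrent kernel, so the outputs of a layer's neurons are passed to every neuron of the same layer at the next time step, rather than laterally within a single step.

```python
import tensorflow as tf

# Build a small SimpleRNN layer by running one dummy batch through it:
# 1 sequence, 5 time steps, 3 input features.
layer = tf.keras.layers.SimpleRNN(units=4)
_ = layer(tf.zeros((1, 5, 3)))

kernel, recurrent_kernel, bias = layer.get_weights()
print(kernel.shape)            # (3, 4): input features -> this layer's units
print(recurrent_kernel.shape)  # (4, 4): previous step's unit outputs -> all units
print(bias.shape)              # (4,)
```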

  • @salomishiny6997 11 months ago

    great session

  • @himayaperera4758 3 months ago

    Thank You Sir🙏

  • @KrishnaSatishReddy 10 months ago

    Good info

  • @mridulgupta4536 1 year ago

    Excellent Teaching Sir!!

  • @user-vr8xq7lw7v 1 year ago

    Superb

  • @veenajain 8 months ago

    Awesome videos Aman :)

  • @meysamjavadzadeh 1 year ago

    nice👌👌👌

  • @abusufiyanmansuri5675 1 year ago

    Damn! You looking sharp.

  • @karanmehta3675 1 year ago

    Please do upload the previous video, the 39-minute-long one.

  • @prabhakergautam9204 5 months ago

    nice video

  • @tejkiran1836 1 year ago +1

    Hi Aman, thanks for the video.
    There will be three weights, in common notation:
    Waa for the previous activation,
    Wax for the input word,
    Wya for the output...
    Correct me if I am wrong.
    Thanks. Can we expect the derivations also in the next video? 😊
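    For reference, in that commonly used notation (a sketch of the standard convention; the video's own derivation may use different symbols), W_aa multiplies the previous activation, W_ax multiplies the current input, and W_ya produces the output:

```latex
% Simple-RNN forward pass in the W_{aa}, W_{ax}, W_{ya} notation (common convention)
\begin{aligned}
a^{\langle t \rangle} &= \tanh\left( W_{aa}\, a^{\langle t-1 \rangle} + W_{ax}\, x^{\langle t \rangle} + b_a \right) \\
\hat{y}^{\langle t \rangle} &= g\left( W_{ya}\, a^{\langle t \rangle} + b_y \right)
\end{aligned}
```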

  • @gopinathsrimatthirumala3092

    Hi Aman, I need one suggestion. I need to convert xaml files to atmx files. Is it possible? How do I develop a model? Which model do I need to use, and how do I build the dataset? Kindly guide me on this.

  • @ClipsforQalb 1 month ago

    Here, you didn't explain how each node of the hidden layer processes its input (we know that part) -> passes it on (to the other hidden nodes of the same layer and to other layers) -> how it stores the output hidden state of each node,
    how it processes the next timestep,
    and how, finally, the multiple hidden states are given to the Dense layer,
    how it uses those hidden states and finally gives the output,
    and also that RNNs have multiple types of architecture (for many-to-one): when does the output layer work?
    Please explain these doubts with the logic and sample code, along with a sample calculation (we want the process only, that's enough, not exact numbers).
    Even if it takes a long time in a video, please upload it as a single video.
    Every source on the internet gives the outline of the RNN process, not the depth.
    Can you please?
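    A minimal sketch of what such a walkthrough could look like for the many-to-one case (all sizes and values below are made up for illustration; this is the generic forward pass, not the video's exact example): every time step reuses the same weights, the hidden state is the only thing carried from one step to the next, and the output (Dense) layer is applied only once, after the last time step.

```python
import numpy as np

# Many-to-one simple RNN forward pass with toy numbers.
# The same W_xh, W_hh, b_h are reused at every time step; only the final
# hidden state is passed to the output (Dense) layer.
n_in, n_hidden, n_out, T = 2, 3, 1, 4
rng = np.random.default_rng(42)

W_xh = rng.normal(scale=0.5, size=(n_hidden, n_in))
W_hh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.5, size=(n_out, n_hidden))
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

x_seq = rng.normal(size=(T, n_in))   # a toy input sequence of T time steps
h = np.zeros(n_hidden)               # initial hidden state

for t in range(T):
    # hidden state: combines the current input with the previous hidden state
    h = np.tanh(W_xh @ x_seq[t] + W_hh @ h + b_h)
    print(f"h after step {t}: {np.round(h, 3)}")

# many-to-one: the output layer runs once, on the last hidden state
y = W_hy @ h + b_y
print("final output:", np.round(y, 3))
```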