I referred to so many videos regarding RNNs, but only yours is clear and in depth. A true mentor. I salute you, sir.
All the best
Amazing explanation. I am really surprised there are so few likes here!!! Please keep it up.
Amazing explanation, boss. The way you have peeled back the concept! After listening to 5 videos from experts, I could finally understand it.
Excellently presented. Even StatQuest couldn't teach it better. Kudos!!
Sir, good morning. This is an excellent contribution for everyone related to this field; it's wonderful, thanks a lot. Just a small doubt, sir: in this video you mentioned adding the bias term (at 17:25) in the formula. My doubt is: where in the network is the bias added, at the end or at the O2 level? Please clear my doubt. Thank you once again.
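For anyone with the same doubt: in the standard (Elman-style) vanilla RNN, the bias usually appears in two places, not one. A hidden-state bias is added inside the activation at every time step, and a separate output bias is added at the output layer. A minimal numpy sketch under that assumption (all sizes and names are illustrative, not necessarily the video's exact notation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_a, n_y = 3, 4, 2                 # input, hidden, and output sizes (made up)

Wax = rng.standard_normal((n_a, n_x))   # input  -> hidden
Waa = rng.standard_normal((n_a, n_a))   # hidden -> hidden (recurrent)
Wya = rng.standard_normal((n_y, n_a))   # hidden -> output
b_a = np.zeros(n_a)                     # hidden bias, added inside the tanh
b_y = np.zeros(n_y)                     # output bias, added at the output layer

x_t = rng.standard_normal(n_x)          # one input step
a_prev = np.zeros(n_a)                  # previous hidden state

a_t = np.tanh(Waa @ a_prev + Wax @ x_t + b_a)  # bias added at the hidden state
y_t = Wya @ a_t + b_y                          # bias added again at the output
```

So the answer would be "both": one bias per hidden update and one at the output (the O2 level in the video's drawing, if that node is the output).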
You're so good, Aman, and you cover the basics. Thank you very much for sharing these videos.
Welcome
Amazing explanation. Thanks, sir
Very nice Aman sir... Thank you for your help...
Wonderful way of initiating the video.
Very very informative session
Fantastic job Aman. Please create video on LSTM also.
Sure
Best Explanation
Sir, please help with the part at 14:45, where the output of the recurrent node is passed to the next time step of the same node BUT is ALSO passed to the other node in the hidden layer. I didn't understand that part. Please explain to me intuitively/mathematically how it works ❤
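One way to see that 14:45 step mathematically (assuming standard vanilla-RNN notation, not taken from the video itself): the recurrent weight matrix Waa has one row and one column per hidden unit, so each unit's previous output feeds both itself (via the diagonal entries) and every other unit in the layer (via the off-diagonal entries) at the next time step. A tiny two-unit example with made-up weights:

```python
import numpy as np

# 2 hidden units; Waa[i, j] connects unit j's previous output to unit i
Waa = np.array([[0.5, 0.8],    # unit 0 gets 0.5 * its own past + 0.8 * unit 1's past
                [0.3, 0.6]])   # unit 1 gets 0.3 * unit 0's past + 0.6 * its own past
a_prev = np.array([1.0, 2.0])  # hidden state from the previous time step

# ignoring the input and bias terms to isolate the recurrent part:
recurrent_part = Waa @ a_prev
print(recurrent_part)          # [0.5*1 + 0.8*2, 0.3*1 + 0.6*2] = [2.1, 1.5]
```

So "passed to the same node AND to the other nodes" is just one matrix multiply: the off-diagonal weights are the cross-node connections.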
Nice explanation. I have one question:
1) Will the data/output from neurons of one layer be passed to another neuron in the same layer?
great session
Thank You Sir🙏
Good info
Excellent Teaching Sir!!
Keep watching
Superb
Awesome videos Aman :)
nice👌👌👌
Damn! You looking sharp.
Thanks a lot.
Please do upload the previous video, the 39-minute-long one.
nice video
Hi Aman, thanks for the video.
There will be three weights, in common notation:
Waa for the previous word,
Wax for the input word,
Wya for the output...
Correct me if I am wrong.
Thanks. Can we expect the derivations in the next video too? 😊
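That matches the common three-weight notation, with one small correction: Waa multiplies the previous hidden state (activation), not the previous word itself. A quick shape check under that assumption (sizes are made up for illustration):

```python
import numpy as np

n_x, n_a, n_y = 5, 3, 2     # illustrative sizes: input, hidden, output

Wax = np.zeros((n_a, n_x))  # acts on the current input word x_t
Waa = np.zeros((n_a, n_a))  # acts on the previous hidden state a_{t-1}, not the word
Wya = np.zeros((n_y, n_a))  # maps the hidden state to the output y_t

x_t = np.zeros(n_x)
a_prev = np.zeros(n_a)

a_t = np.tanh(Wax @ x_t + Waa @ a_prev)  # shapes line up: (n_a,)
y_t = Wya @ a_t                          # shapes line up: (n_y,)
```

Note Waa is square (hidden-by-hidden), which is how you can tell it connects hidden state to hidden state rather than word to hidden state.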
Hi Aman, I need one suggestion. I need to convert xaml files to atmx files. Is it possible? How do I develop the model, which model should I use, and how do I build the dataset? Kindly guide me on this.
Here, you didn't explain:
- how each node of the hidden layer processes its input (we know that part), passes it on (to other hidden nodes of the same layer and to other layers), and stores its output hidden state;
- how it processes the next timestamp;
- how the previous dense layer ends up giving the multiple hidden states;
- how it uses those hidden states and finally gives the output;
- and RNNs have multiple types of architecture: for many-to-one, when does the output layer come into play?
Please explain these doubts with the logic and sample code, along with a sample calculation (we want the process only, that's enough, not exact numbers). Even if it takes a long video, please upload it as a single video. Every source on the internet gives the outline of the RNN process, not the depth. Can you please?
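Until such a video exists, here is a rough self-contained sketch of the many-to-one process this comment asks about (standard vanilla-RNN equations with illustrative random numbers, not the video's exact values): the same weights are reused at every time step, the hidden state carries information forward and each step's state is stored, and the output layer runs only once, on the final hidden state.

```python
import numpy as np

rng = np.random.default_rng(42)
n_x, n_a, n_y, T = 3, 4, 2, 5          # input size, hidden size, output size, steps

Wax = rng.standard_normal((n_a, n_x)) * 0.1
Waa = rng.standard_normal((n_a, n_a)) * 0.1
Wya = rng.standard_normal((n_y, n_a)) * 0.1
b_a, b_y = np.zeros(n_a), np.zeros(n_y)

xs = rng.standard_normal((T, n_x))     # a toy input sequence of T steps

a_t = np.zeros(n_a)                    # initial hidden state
hidden_states = []
for x_t in xs:                         # the SAME weights are reused every step
    a_t = np.tanh(Wax @ x_t + Waa @ a_t + b_a)
    hidden_states.append(a_t)          # each step's hidden state is stored

# many-to-one: the output layer fires only on the final hidden state
logits = Wya @ hidden_states[-1] + b_y
probs = np.exp(logits) / np.exp(logits).sum()   # softmax over the output classes
```

For a many-to-many architecture, the only change would be applying the `Wya @ a_t + b_y` line inside the loop at every step instead of once at the end.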