What is LSTM (Long Short Term Memory)?
- Published 14 Jun 2024
- Learn about watsonx → ibm.biz/BdvxRB
Long Short-Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network (RNN) architecture capable of learning long-term dependencies, as well as a solution to the vanishing gradient problem that can occur when training traditional RNNs.
In this lightboard video, Martin Keen of IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates help transfer relevant information through a sequence chain, and a few key LSTM use cases.
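The cell-state and gate mechanics mentioned above can be sketched in a few lines of NumPy. The weight shapes, initialization, and names below are illustrative assumptions, not code from the video:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step: the gates decide what the cell state keeps."""
    z = W @ np.concatenate([x, h_prev]) + b   # all four gates computed at once
    H = h_prev.size
    f = sigmoid(z[0:H])        # forget gate: what to drop from the cell state
    i = sigmoid(z[H:2*H])      # input gate: what new information to admit
    g = np.tanh(z[2*H:3*H])    # candidate values for the cell state
    o = sigmoid(z[3*H:4*H])    # output gate: what to expose as the hidden state
    c = f * c_prev + i * g     # cell state: the long-term memory
    h = o * np.tanh(c)         # hidden state: the short-term output
    return h, c

rng = np.random.default_rng(0)
X_DIM, H_DIM = 3, 4
W = rng.standard_normal((4 * H_DIM, X_DIM + H_DIM)) * 0.1
b = np.zeros(4 * H_DIM)
h, c = np.zeros(H_DIM), np.zeros(H_DIM)
for t in range(10):                     # run over a short random sequence
    h, c = lstm_step(rng.standard_normal(X_DIM), h, c, W, b)
print(h.shape, c.shape)  # (4,) (4,)
```

Because the cell state `c` is updated additively (gated by `f`), gradients can flow through many time steps without vanishing the way they do in a plain RNN.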
#LSTM #RNN #AI - Science & Technology
After watching Martin Keen's explanations, you don't need another short explanation
Martin you are a wonderful teacher! Thank you very much for the explanations.
Very useful and helpful. This is a hard topic to grasp readily, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!
Great advancement in time. Glad to have a better understanding. Thank you folks
So are we going to ignore the fact he wrote everything backwards on a clear glass wall?
That's likely the #1 comment! The presenter writes on the glass and we flip the image in post-production. Martin is actually right-handed. Some married presenters will switch their wedding ring to the other hand or not wear it so as not to confuse their spouse.
@@IBMTechnology Smart! I was wondering how you did this.
@@IBMTechnology I don't understand how you guys made it look like he's writing backwards on glass, could you explain it to me?
@@abhishekrao6198 The presenter writes with his right hand. The camera is in front of him, so the original footage has all the sentences reversed. Flip the video left to right and you get the video you're watching now.
@@IBMTechnology 😂😂…the most obvious observation. People who didn’t even figure that out 🤦♂️should not be watching LSTM/RNN or other such videos 🕺
I'm in love with his way of teaching!
Very helpful lecture. Keep up the good work!
Fantastic explanation. Please keep making more.
thank you martin and team. great work.
Great video and nice visual effects!
Good lecture ! Thank you very much for the explanations.
Very useful, plain, and concise 😀
Clear and concise explanation.
👍
Thanks for the clear explanation.
Very good teaching, thank you!
such an informative lecture, thank you so much
Really good video. Thank you!
Many thanks, your lecture is very helpful. Could you please explain all the LSTM gates (Forget Gate, Learn Gate, Remember Gate & Use Gate (Output))?
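Not the presenters, but those four names map onto the standard LSTM gates: Forget = forget gate, Learn = input gate, Remember = the cell-state update, Use = output gate. A toy sketch with hand-set gate values (constants, not learned weights) showing how the forget gate controls the cell state:

```python
import numpy as np

# Toy demonstration of the gates' roles on a 1-D cell state.
# Gate activations are hand-set constants here, not learned weights.
def lstm_gates(c_prev, forget, learn, candidate, use):
    c = forget * c_prev + learn * candidate   # Forget + Learn -> Remember (new cell state)
    h = use * np.tanh(c)                      # Use (output) gate
    return h, c

c = 2.0  # some remembered value in the cell state
# Forget gate near 1 keeps the memory; Learn gate 0 admits nothing new.
_, c_kept = lstm_gates(c, forget=0.99, learn=0.0, candidate=0.5, use=1.0)
# Forget gate near 0 wipes the memory regardless of its value.
_, c_wiped = lstm_gates(c, forget=0.01, learn=0.0, candidate=0.5, use=1.0)
print(round(c_kept, 2), round(c_wiped, 2))  # 1.98 0.02
```

In a trained LSTM, the network learns to set these gate values per time step from the current input and previous hidden state.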
Useful video. Thanks a lot
really good video, thank you!
Thank you for the lecture
Super good lecture
Good lecture! Please make a video on transformer-based models too,
it would be very helpful.
Thanks for the suggestion, Akash! We'll see what we can do! 🙂
Great lecture, thank you.
You're welcome and thanks for watching, Shilpa!
Sir, can you do a video with an RNN example using numerical values?
Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus and cluster it; the LSTM checks which topic the current input probably belongs to and loads information about that topic into the state, or as an extra input?
Thank you !
That was great!
Hi Martin, regarding your example of Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If at some point in the future a question is asked about Martin, is that still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?
Tell me please, how is it regulated what we need to remember in a sequence? How is it determined? That is, how does the model determine what matters for the prediction?
thank you
Please teach us more content like this.
Great video - do you write on glass and then transpose/flip the video?
See ibm.biz/write-backwards
Wao, Very helpful
Thanks.
Can we use RNNs for CTR prediction?
Are there any new algorithms that are more powerful and more efficient than traditional neural networks?
Wait the Homebrew Challenge dude does Deep Learning too?!
LOL yes, Homebrew Challenge dude has a day job :)
How does this trick work, you record normally and then mirror the video?
Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up
See ibm.biz/write-backwards
Thanks! How can you apply an LSTM to time series?
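A common first step, sketched here under the assumption of a supervised next-step-prediction setup: slice the series into fixed-length windows, and each window becomes one input sequence for the LSTM (the network itself is omitted):

```python
import numpy as np

# Frame a time series as (window, next-value) pairs for sequence learning.
def make_windows(series, window):
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]   # target: the value right after each window
    return X, y

series = np.sin(np.linspace(0, 6 * np.pi, 100))   # toy signal
X, y = make_windows(series, window=10)
print(X.shape, y.shape)  # (90, 10) (90,)
```

Each row of `X` would then be fed to the LSTM one value per time step, with the matching entry of `y` as the training target.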
awesome
how can you write backwards?
magnificent
The video is a mirror image of the actual footage. The writing appears straight to the audience.
Can someone please tell me what screen he wrote on? It is so cool!🤩
See ibm.biz/write-backwards
How are you writing?
what if Jennifer is they?
why echo though?
saved my exam
After first 10 seconds of this video
me: Woah now I know the origins of knives out movie😂
❤❤
5:00 5:42
Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣
who would have guessed that some algorithm can beat current state-of-the-art memory (e.g. me)
The voice is too low
Yes needs to work on audio 🔉
Subtitle is a solution
Use earphones. I did and had no problem!
It was the butler.
I see: LSTM = low volume. This is the 3rd guide with almost muted sound.
Sound is just fine mate
"always butler" is high bias :D
mirror writing🙃🙃🙃🙃🙃
Are you using a mirror, or can you actually write backwards, sir? 🤐🤐
anyone here to know the murderer, its the butler🤣
I am cooked
martin are you married or nah
VOICE too low!