What is LSTM (Long Short Term Memory)?

  • Published 14 Jun 2024
  • Learn about watsonx → ibm.biz/BdvxRB
    Long Short Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network (RNN) architecture capable of learning long-term dependencies, and they offer a solution to the vanishing gradient problem that can occur when training traditional RNNs.
    In this lightboard video, Martin Keen with IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates help transfer relevant information along a sequence chain, and a few key LSTM use cases.
    #LSTM #RNN #AI
  • Science & Technology
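The description mentions the cell state and its gates. As a rough sketch of that mechanism (illustrative NumPy code with hypothetical names, not taken from the video), a single LSTM time step might look like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x: input vector; h_prev, c_prev: previous hidden and cell state.
    W: weights of shape (4*hidden, input+hidden); b: bias of shape (4*hidden,).
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * hidden:1 * hidden])   # forget gate: what to drop from c_prev
    i = sigmoid(z[1 * hidden:2 * hidden])   # input gate: how much new info to admit
    g = np.tanh(z[2 * hidden:3 * hidden])   # candidate values for the cell state
    o = sigmoid(z[3 * hidden:4 * hidden])   # output gate: what to expose as h
    c = f * c_prev + i * g                  # cell state carries the long-term memory
    h = o * np.tanh(c)                      # hidden state is this step's output
    return h, c

# Tiny example with random weights (illustrative only, not a trained model)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):    # run a short input sequence
    h, c = lstm_step(x, h, c, W, b)
```

Because the forget gate multiplies the previous cell state rather than repeatedly squashing it through an activation, gradients can flow along `c` over many steps, which is the intuition behind LSTMs avoiding the vanishing gradient problem.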

COMMENTS • 85

  • @IsxaaqAcademy · A month ago · +2

    After watching Martin Keen's explanations, you don't need another short explanation

  • @simonbax2002 · A year ago · +29

    Martin you are a wonderful teacher! Thank you very much for the explanations.

  • @engr.inigoe.silvagalvan1161 · A year ago · +8

    Very useful and helpful. This is a hard topic to understand readily, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!

  • @toenytv7946 · 2 years ago · +6

    Great advancement in time. Glad to have a better understanding. Thank you folks

  • @channel-xj7rp · 2 years ago · +85

    So are we going to ignore the fact he wrote everything backwards on a clear glass wall?

    • @IBMTechnology · 2 years ago · +114

      That's likely the #1 comment! The presenter writes on the glass and we flip the image in post-production. Martin is actually right-handed. Some married presenters will switch their wedding ring to the other hand or not wear it so as not to confuse their spouse.

    • @gehallak660 · 2 years ago · +6

      @@IBMTechnology Smart! I was wondering how you did this.

    • @abhishekrao6198 · A year ago · +5

      @@IBMTechnology I don't understand how you guys made it look like he's writing backwards on the glass. Could you explain it to me?

    • @chaoma8228 · A year ago · +5

      @@abhishekrao6198 The presenter writes with his right hand. The camera is in front of him, so the original footage has all the writing reversed. Flip the video left to right and you get the video you're watching now.

    • @nitinkapoor4752 · A year ago · +7

      @@IBMTechnology 😂😂…the most obvious observation. People who didn’t even figure that out 🤦‍♂️should not be watching LSTM/RNN or other such videos 🕺

  • @amalkumar256 · 6 months ago · +4

    I'm in love with his way of teaching!

  • @DieLazergurken · 2 years ago · +36

    Very helpful lecture. Keep up the good work!

  • @vivekpujaravp · A year ago

    Fantastic explanation. Please keep making more.

  • @RishabKapadia · A year ago

    thank you martin and team. great work.

  • @theneumann7 · A year ago

    Great video and nice visual effects!

  • @WangY-ip3sb · 11 months ago

    Good lecture ! Thank you very much for the explanations.

  • @athmaneghidouche7746 · A year ago

    Very useful, plain, and concise 😀

  • @fouziafathima6460 · 10 months ago · +1

    Clear and concise explanation.
    👍

  • @phonethiriyadana · 2 years ago

    Thanks for the clear explanation.

  • @Gabi_09 · A year ago

    Very good teaching, thank you!

  • @waleedt_trz · 2 years ago · +6

    such an informative lecture, thank you so much

  • @tammofrancksen5186 · 2 years ago

    Really good video. Thank you!

  • @balenkamal182 · A year ago · +1

    Many thanks, your lecture is very helpful. Could you please explain all the LSTM gates (Forget Gate, Learn Gate, Remember Gate & Use Gate (Output))?

  • @kikichung2955 · A year ago

    Useful video. Thanks a lot

  • @siddhesh119369 · A year ago

    Really good video, thank you!

  • @mastajigga13 · 2 years ago

    Thank you for the lecture

  • @thomasplum9868 · 2 years ago · +1

    Super good lecture

  • @akashthoriya · 2 years ago · +10

    Good lecture. Please make a video on "transformer based models" too;
    it would be very helpful.

    • @IBMTechnology · 2 years ago · +2

      Thanks for the suggestion, Akash! We'll see what we can do! 🙂

  • @shilpadas1311 · 2 years ago

    Great lecture, thank you.

    • @IBMTechnology · 2 years ago

      You're welcome and thanks for watching, Shilpa!

  • @jayasreechaganti9382 · A year ago · +1

    Sir, can you do a video with an RNN example using numerical values?

  • @massimothormann272 · A year ago

    Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus and cluster it; the LSTM checks which topic the current input probably belongs to and loads information about that topic into the state, or as an extra input?

  • @kedarnandiwdekar20 · A year ago

    Thank you !

  • @mohammadrezarazavian9305 · A year ago

    That was great!

  • @user-wq2sn4zb1d · 7 months ago

    Hi Martin, in your example with Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If a question about Martin comes up later, is that still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?

  • @bysedova · A year ago

    Tell me please, how is it regulated what needs to be remembered in a sequence? How is it determined? How does the model decide what is meaningful to keep?

  • @aimatters5600 · A year ago

    thank you

  • @user-nz5tk9wm7f · 3 months ago

    Please teach us more content like this.

  • @rollopost · A year ago

    Great video - do you write on glass and then transpose/flip the video?

  • @colabwork1910 · A year ago

    Wow, very helpful

  • @codeschool3964 · 6 months ago

    Thanks.

  • @LakshmiDevi_jul31 · 11 months ago

    Can we use an RNN for CTR prediction?

  • @sathiraful · 24 days ago

    Are there any new algorithms that are more powerful and more efficient than traditional neural networks?

  • @ashleygillman3104 · 2 years ago · +6

    Wait the Homebrew Challenge dude does Deep Learning too?!

  • @Grzeroli1 · 2 months ago

    How does this trick work, you record normally and then mirror the video?

  • @TaylorDeiaco · 7 months ago

    Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up

  • @maloukemallouke9735 · 7 months ago

    Thanks. How can you apply an LSTM to time series?

  • @matthewg7702 · 7 months ago

    awesome

  • @gupsekobas2209 · 4 months ago · +1

    how can you write backwards?

  • @fatmamahmoud9148 · A year ago

    magnificent

  • @trinity98767 · 3 months ago

    The video is a mirrored image of the actual recording. The writing appears the right way around to the audience.

  • @wmxoae1237 · A year ago

    Can anyone please tell me what screen he wrote on? It is so cool!🤩

  • @elevated_existence · A year ago

    How are you writing ?

  • @ivant_true · 10 months ago

    what if Jennifer is they?

  • @anirvinkandarpa5544 · A month ago

    why echo though?

  • @IconOfSeas · 11 months ago

    saved my exam

  • @abdullahshaikh7409 · A year ago

    After the first 10 seconds of this video,
    me: Whoa, now I know the origins of the Knives Out movie 😂

  • @BinodLaKshitha · 9 months ago

    ❤❤

  • @willw4096 · 9 months ago

    5:00 5:42

  • @willdrunkenstein5367 · 2 months ago

    Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣

  • @amirarshiamirzaei710 · 2 months ago

    Who would have guessed that some algorithm can beat the current state-of-the-art memory (e.g. me)?

  • @petchpaitoon · 2 years ago · +6

    The voice is too low

  • @NewNerdInTown · 10 days ago

    It was the butler.

  • @cybrhckr · 2 years ago · +3

    I see LSTM = low volume. This is the 3rd guide with almost-muted sound.

    • @jeverydk · 2 years ago · +1

      Sound is just fine mate

  • @eyupozturk8586 · A year ago

    "always butler" is high bias :D

  • @simplexination9837 · A year ago

    mirror writing🙃🙃🙃🙃🙃

  • @domnic7431 · A year ago

    Are you using a mirror, or can you actually write backwards, sir? 🤐🤐

  • @user-fp8jx7gr7v · 6 months ago · +1

    Anyone else here just to find out the murderer? It's the butler 🤣

  • @blastinnn · 3 days ago

    I am cooked

  • @HITNUT · A year ago

    martin are you married or nah

  • @manmohanmahapatra6040 · A month ago

    VOICE too low!