165 - An introduction to RNN and LSTM

  • Published 12 Dec 2024

COMMENTS • 88

  • @elisavasta2684
    @elisavasta2684 2 years ago +3

    One of the best explanations ever on LSTM! Greetings from Politecnico di Milano!

  • @AshishBamania95
    @AshishBamania95 2 years ago

    Can't believe that this is free. Thanks a lot. You are building a community of future researchers and innovators here!

  • @sonhdang
    @sonhdang 3 years ago +17

    I've watched dozens of videos on LSTM and this is the best one so far. Thank you so much sir. Greetings from UCLA!

  • @Balakrish-cl9kq
    @Balakrish-cl9kq 3 years ago

    I feel very gifted that I got the suggestion from YouTube, the right video....

  • @christophbrand9015
    @christophbrand9015 3 years ago +2

    The first YouTube tutorial I saw which explains an LSTM in detail, e.g. why a sigmoid or a tanh is used within the cell. Great!

  • @pfever
    @pfever 3 years ago +14

    Best LSTM explanation I have watched! All your videos are superb! I want to watch them all from beginning to end! Thank you for such detailed and intuitive explanations! :D

  • @edwardbowora
    @edwardbowora 2 years ago +4

    Best teacher ever.

  • @mehdisdikiene8752
    @mehdisdikiene8752 3 years ago

    I've watched many videos and read a lot about LSTM, but this is the first time I really understand how LSTM works. Thumbs up, thank you!

  • @dizhang947
    @dizhang947 1 year ago +1

    amazing work, thank you so much!

  • @MercyBedada
    @MercyBedada 8 months ago

    I gained a valuable understanding. I really appreciate the way you explain.

  • @learn2know79
    @learn2know79 3 years ago +1

    I was struggling to understand the basic concept of LSTM and watched dozens of videos, and finally found the best one so far. Thank you so much for helping us understand. Greetings from GIST!

  • @dantec.dagandanan3732
    @dantec.dagandanan3732 2 years ago

    Thanks!

    • @dantec.dagandanan3732
      @dantec.dagandanan3732 2 years ago

      I know this little amount of money is not enough to say thank you. Keep up the good work, sir 🥰

    • @DigitalSreeni
      @DigitalSreeni 2 years ago +1

      Thank you very much. No amount of money is little. Every penny counts :)
      The bulk of the money goes to charities that help with cancer research and eye surgeries for poor people, so society benefits from any amount that is contributed. Thanks again.

  • @RAHUDAS
    @RAHUDAS 2 years ago +1

    At 19:31, he mentions how many units of LSTM there are. The units parameter is not the number of LSTM cells in a layer; it is the hidden-state dimension.
    The number of unrolled LSTM cells depends on input shape[0] (the number of timesteps).

    • @Droiduxx
      @Droiduxx 1 year ago

      So if I understand well, if we consider the input to be a sequence of x elements, each "LSTM" unit contains x states and returns a list of x vectors passed to the LSTM units of the next hidden layer. Am I right?

    • @RAHUDAS
      @RAHUDAS 1 year ago

      @@Droiduxx Yes, but also consider the return_sequences and return_state arguments; their default values are False. To see the full picture, turn on return_sequences.
      Example:
      import tensorflow as tf
      x = tf.range(30, dtype=tf.float32)
      x = tf.reshape(x, (5, 3, 2))
      # shape: (batch, time, num_features)
      lstm = tf.keras.layers.LSTM(7, return_sequences=True)
      output = lstm(x)
      print(output.shape)
      # answer: (5, 3, 7)
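      With return_sequences left at its default of False, the same layer returns only the final hidden state, i.e. one vector of length units per sample. A minimal follow-up sketch, assuming the same x as above:
      lstm_last = tf.keras.layers.LSTM(7)  # return_sequences defaults to False
      last_state = lstm_last(x)
      print(last_state.shape)
      # answer: (5, 7) - one hidden-state vector of size "units" per sample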

  • @yangfarhana3660
    @yangfarhana3660 3 years ago

    I've viewed several vids on LSTM but this breakdown is the best!!

  • @gadisaadamuofficial2946
    @gadisaadamuofficial2946 1 year ago

    Really, thank you for the additional clarification!

  • @lh2738
    @lh2738 2 years ago +1

    Nice video, so well explained and not too long, along with a full tutorial. Probably one of the best ones about LSTM. Thanks and please keep up the good work! Greetings from France!

  • @claybowlproductions
    @claybowlproductions 1 year ago

    Sir you are a gem!

  • @aristideirakoze8098
    @aristideirakoze8098 2 years ago

    We are infinitely grateful

  • @sivadasanet7966
    @sivadasanet7966 1 month ago

    Wonderful explanation on LSTM

  • @jahanzaibasgher1275
    @jahanzaibasgher1275 2 years ago

    Thank you so much :)
    Subscribed after watching your first video.

  • @davidomarparedesparedes8718
    @davidomarparedesparedes8718 7 months ago

    Great explanation! Thank you so much!! : )

  • @omniscienceisdead8837
    @omniscienceisdead8837 2 years ago

    Best explanation out there. I understood what is happening both conceptually and mathematically.

  • @zhiyili6707
    @zhiyili6707 2 years ago

    Thank you very much! It is well explained!

  • @alteshaus3149
    @alteshaus3149 3 years ago +1

    Thank you very much for this video sir!

  • @saaddahmani1870
    @saaddahmani1870 2 years ago +1

    Good, thanks a lot.

  • @kukuhiksanmusyahada7615
    @kukuhiksanmusyahada7615 2 years ago

    Great presentation sir! thank you so much!

  • @karamjeetsinghmakkar3323
    @karamjeetsinghmakkar3323 1 year ago

    Dear Dr. S. Sreeni,
    Thank you for your informative videos regarding CNNs.
    Kindly make a video on LSTM for image classification tasks.
    Thank you.

    • @DigitalSreeni
      @DigitalSreeni 1 year ago

      LSTM is primarily used for processing sequential data. While it is possible to use LSTM for image classification tasks, it is generally not the best choice as it is designed to model sequential dependencies in data, whereas images are inherently spatial and do not have an obvious sequential structure. Images are typically processed using CNNs, which are specifically designed to handle spatial data and can effectively extract features from images using convolutions.
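
      As a rough sketch of the CNN route described above (illustrative only; the 32x32 RGB input size and 10 classes are assumptions, not taken from the video):
      import tensorflow as tf
      # Minimal CNN image classifier: convolutions extract spatial features, then a dense layer classifies.
      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(32, 32, 3)),
          tf.keras.layers.Conv2D(32, 3, activation='relu'),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation='relu'),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(10, activation='softmax'),
      ])
      model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')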

  • @-mle566
    @-mle566 3 years ago

    Thank you, a nice video for new LSTM learners :)

  • @indranisen5877
    @indranisen5877 2 years ago

    Thank you Sir, Nice explanations.

  • @AhmedFazals
    @AhmedFazals 8 months ago

    Awesome! Thanks sir.

  • @rolandosantos7755
    @rolandosantos7755 3 years ago

    I love your videos... I am just starting to learn machine learning and it's very useful.

  • @jolittevillaruz5234
    @jolittevillaruz5234 3 years ago

    Very intuitive video!

  • @nisa_ssa
    @nisa_ssa 3 years ago

    Thank you so much for this video...

  • @gakhappy
    @gakhappy 3 years ago

    Great work sir. Keep on doing a great job.

  • @alex-beamslightchanal8743
    @alex-beamslightchanal8743 2 years ago

    Nice tutorial! Thank you!

  • @Toss3geek
    @Toss3geek 1 year ago

    Thank you, teacher!

  • @cryptodude5359
    @cryptodude5359 2 years ago +1

    Amazing tutorial! I got a question:
    At 14:59 you explain the forget gate.
    In the lower-left corner, the cell gets ht-1 (the last timestep) as input. Is it possible to have a sequence of past days as input?
    For example ht-1 & ht-2 & ht-3, etc., to spot potential trends in the data, maybe with multiple variables, giving every single timestep an additional weight.

  • @ajithkannan522
    @ajithkannan522 2 months ago

    Very well explained. Please ask questions to engage the audience and give answers with explanations.

  • @s.e.7268
    @s.e.7268 3 years ago +1

    I am so happy to discover this channel! :)

  • @aminasgharisooreh9243
    @aminasgharisooreh9243 4 years ago +1

    Thank you, it is really helpful

  • @VCodes
    @VCodes 3 years ago +1

    great. thx a lot

  • @awesome-ai1714
    @awesome-ai1714 1 year ago

    11:40 What is going on with the arrows? The signal from the previous cell merges with the current Xt, but there is no operator. A signal from the left and a signal from the bottom (Xt), and they both go to the 3 gates?
    Edit: ok I see, it's explained later

  • @aomo5293
    @aomo5293 1 year ago

    Thank you, honestly it's very clear.
    I am looking for a tutorial on image classification using a local image dataset.
    Have you made one before?
    Thank you again

  • @rahuliron1635
    @rahuliron1635 3 years ago

    Awesome explanation, thank you very much.

  • @sherrlynrasdas8387
    @sherrlynrasdas8387 2 years ago

    Can you teach us how to use LSTM and ARIMA in ensemble learning for forecasting time series data?

  • @sadafmehdi2991
    @sadafmehdi2991 3 years ago

    Nice Explanation Sir!

  • @nicolamenga8943
    @nicolamenga8943 2 years ago

    Thank you for the video.
    I have a question.
    Is the number of units (50) the number of so-called "hidden units", also known as the "hidden size"?

  • @manideepgupta2433
    @manideepgupta2433 3 years ago

    Amazing Sir.

  • @vzinko
    @vzinko 1 year ago

    Why is there a dropout after the final LSTM layer?

  • @bobaktadjalli
    @bobaktadjalli 1 year ago

    Hi, well explained! Could I have your slides?

  • @AveRegina_
    @AveRegina_ 2 years ago

    I'm using an RNN for my PG thesis work and have a query. Do we have to run a stationarity test on our time series data before feeding it into the neural network model, or is this step only required for traditional time series models like ARIMA?

    • @DigitalSreeni
      @DigitalSreeni 2 years ago

      RNNs are capable of learning nonlinearities (compared to ARIMA) and therefore should be able to learn from the input data without any stationarity pre-processing. This is especially true if you use LSTMs. Also, please note that you need a lot more training data for RNNs compared to ARIMA. You may find this blog useful to understand the effectiveness of RNNs: karpathy.github.io/2015/05/21/rnn-effectiveness/
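
      If you still want to check stationarity (e.g. for an ARIMA baseline), here is a minimal sketch using statsmodels; the series below is a synthetic placeholder, not data from the video:
      import numpy as np
      from statsmodels.tsa.stattools import adfuller
      # Placeholder series: a noisy upward trend.
      series = np.arange(100, dtype=float) + np.random.default_rng(0).normal(0.0, 1.0, 100)
      # Augmented Dickey-Fuller test: a small p-value (e.g. < 0.05) suggests stationarity.
      stat, p_value, *rest = adfuller(series)
      print(f"ADF statistic: {stat:.3f}, p-value: {p_value:.3f}")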

  • @JJGhostHunters
    @JJGhostHunters 1 year ago

    Hi DigitalSreeni...I am a PhD candidate investigating applications of MLPs, CNNs and LSTMs. I see that you have amazing graphics for these model types in your videos.
    Would you be willing to share these graphics for the model architectures with me so that I may use them in my dissertation and defense presentation? I certainly would give you credit for them.
    Thank you for your time!

  • @ziyuelu1734
    @ziyuelu1734 2 years ago

    thanks!

  • @kanui3618
    @kanui3618 4 years ago

    nice explanation!

  • @stevenzhou7358
    @stevenzhou7358 3 years ago

    Thanks for your videos! They're really helpful. I have a small question: could you explain a little more about the meaning of units? Does it mean the number of hidden layers or the number of neurons in a layer?

    • @DigitalSreeni
      @DigitalSreeni 3 years ago

      Maybe this helps... stats.stackexchange.com/questions/241985/understanding-lstm-units-vs-cells (see also the short sketch at the end of this thread)

    • @stevenzhou7358
      @stevenzhou7358 3 years ago +1

      @@DigitalSreeni Thanks a lot! It's very helpful.
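
      In short (a hedged summary sketch, not taken from the linked answer): units is the size of each hidden/cell state vector inside a single LSTM layer, i.e. the number of neurons per layer; you get more layers by stacking LSTM layers. A minimal Keras sketch, assuming a placeholder input of 10 timesteps with 4 features:
      import tensorflow as tf
      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(10, 4)),             # (timesteps, features)
          tf.keras.layers.LSTM(50, return_sequences=True),  # units=50: hidden-state size of layer 1
          tf.keras.layers.LSTM(50),                         # a second LSTM layer, not "more units"
          tf.keras.layers.Dense(1),
      ])
      model.summary()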

  • @hudaankara5616
    @hudaankara5616 4 years ago +1

    Hi sir, thank you so much for all your videos. Could you provide us with a tutorial to implement LSTM & RNN in Python, please?

    • @DigitalSreeni
      @DigitalSreeni 4 years ago +1

      Yes... they should be out this week.

  • @aminasgharisooreh9243
    @aminasgharisooreh9243 4 years ago

    please make a video about attention in images

  • @tchintchie
    @tchintchie 4 years ago +2

    I can't help but find this channel incredibly undersubscribed!!!

    • @DigitalSreeni
      @DigitalSreeni 4 years ago

      I’m glad you like the content. I rely on you guys to spread the word :)

  • @ramchandracheke
    @ramchandracheke 4 years ago

    First like the video, then watch it!

    • @DigitalSreeni
      @DigitalSreeni 4 years ago

      Thanks for your blind confidence in the video, I hope your opinion doesn’t change after watching the video :)

  • @XX-vu5jo
    @XX-vu5jo 4 years ago

    Lol ever heard of transformers???

    • @DigitalSreeni
      @DigitalSreeni 4 years ago

      Not sure what you meant by your comment. Was that a question?

  • @pattiknuth4822
    @pattiknuth4822 3 years ago

    His continuing use of "ok?" "ok?" "ok?" "ok?" is incredibly annoying.

    • @adhoc3018
      @adhoc3018 3 years ago +2

      And you are not annoying at all.

    • @DigitalSreeni
      @DigitalSreeni 3 years ago +6

      Poor choice to comment on personal trait rather than content of the tutorial, ok?