LSTM explained simply | LSTM explained with an example.

  • Published 9 Feb 2025
  • #lstm #machinelearning #deeplearning #ai
    Hello,
    My name is Aman and I am a Data Scientist.
    All amazing data science courses at the most affordable price here: www.unfolddata...
    Book a one-on-one session here (note: these support sessions are chargeable): docs.google.co...
    Follow on Instagram: unfold_data_science
    Topics for this video:
    LSTM explained simply,
    LSTM explained,
    LSTM explanation,
    LSTM explained with example,
    LSTM explained medium,
    lstm for stock prediction,
    lstm time series prediction,
    lstm pytorch,
    lstm tensorflow,
    lstm vs transformer,
    lstm neural network tensorflow,
    lstm model for time series prediction,
    long short term memory,
    lstm for anomaly detection,
    lstm for chatbot,
    unfold data science deep learning,
    unfold data science neural network,
    About Unfold Data Science: this channel helps people understand the basics of data science through simple examples. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical, so viewers from different backgrounds can grasp them easily.
    Book recommendation for Data Science:
    Category 1 - Must Read For Every Data Scientist:
    The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
    Python Data Science Handbook - amzn.to/31UCScm
    Business Statistics By Ken Black - amzn.to/2LObAA5
    Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow by Aurélien Géron - amzn.to/3gV8sO9
    Category 2 - Overall Data Science:
    The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
    Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
    Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
    Category 3 - Statistics and Mathematics:
    Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
    Practical Statistics for Data Scientist By Peter Bruce - amzn.to/37wL9Y5
    Category 4 - Machine Learning:
    Introduction to machine learning by Andreas C Muller - amzn.to/3oZ3X7T
    The Hundred Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
    Category 5 - Programming:
    The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
    Clean Code by Robert C. Martin - amzn.to/3oYOdlt
    My Studio Setup:
    My Camera: amzn.to/3mwXI9I
    My Mic: amzn.to/34phfD0
    My Tripod: amzn.to/3r4HeJA
    My Ring Light: amzn.to/3gZz00F
    Join the Facebook group :
    www.facebook.c...
    Follow on medium: / amanrai77
    Follow on quora: www.quora.com/...
    Follow on Twitter: @unfoldds
    Watch the Introduction to Data Science full playlist here: • Data Science In 15 Min...
    Watch python for data science playlist here:
    • Python Basics For Data...
    Watch the statistics and mathematics playlist here:
    • Measures of Central Te...
    Watch End to End Implementation of a simple machine-learning model in Python here:
    • How Does Machine Learn...
    Learn Ensemble Model, Bagging, and Boosting here:
    • Introduction to Ensemb...
    Build Career in Data Science Playlist:
    • Channel updates - Unfo...
    Artificial Neural Network and Deep Learning Playlist:
    • Intuition behind neura...
    Natural language Processing playlist:
    • Natural Language Proce...
    Understanding and building a recommendation system:
    • Recommendation System ...
    Access all my codes here:
    drive.google.c...
    Have a different question for me? Ask me here: docs.google.co...
    My Music: www.bensound.c...

COMMENTS • 124

  • @rajendrabhise4427 · 9 months ago +4

    This is the first video I encountered on the LSTM subject, and I don't think I need to watch any other videos to understand LSTM. What a clear and straight-to-the-point lecture.

  • @deepakdodeja4663 · 1 year ago +6

    This is a well-researched, well-designed, well-explained video I would say. Thanks a lot for efficiently explaining LSTM in just 30 minutes.

  • @msbrdmr · 1 year ago +3

    This was the most understandable tutorial I've ever seen on LSTM. Keep going. You are awesome.

  • @shanjidabduhalim6146 · 1 year ago +3

    What a clear and straight to the point lecture. Thanks Prof!

  • @vetrivelmurugan4937 · 2 months ago +1

    Super, excellent! I got a very clear idea about LSTM 🎉😊

  • @Mimimoon2024 · 1 year ago +1

    Thank you prof, now I am confident enough about what is happening in LSTM cells. Even the math behind it is very clear and well explained. Thank you prof for this course.

  • @dheerajp3024 · 5 months ago +1

    For the question asked at 18:38:
    The range of the sigmoid function is (0, 1) and that of tanh is (-1, 1).
    During backpropagation, the partial derivative of the sigmoid function is much closer to zero than that of tanh (their maxima are 0.25 and 1, respectively). Over long sequences the repeated product of sigmoid derivatives shrinks toward zero, causing the vanishing-gradient problem, whereas tanh's derivative can stay close to one, which mitigates it. But keep in mind that an LSTM can still suffer from exploding gradients, which is why techniques such as gradient clipping and batch/layer normalization are used.
    I hope this answers the question.
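The derivative ranges cited in this answer can be checked numerically. A minimal NumPy sketch (not code from the video):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Evaluate both derivatives on a grid that includes x = 0,
# where each derivative attains its maximum.
x = np.linspace(-5, 5, 1001)
sig_grad = sigmoid(x) * (1 - sigmoid(x))  # sigmoid'(x), peaks at 0.25
tanh_grad = 1 - np.tanh(x) ** 2           # tanh'(x), peaks at 1.0

print(sig_grad.max())   # ~0.25
print(tanh_grad.max())  # ~1.0

# Multiplying gradients over many time steps: the sigmoid path
# shrinks toward zero far faster than the tanh path.
print(sig_grad.max() ** 10)   # ~9.5e-07
print(tanh_grad.max() ** 10)  # ~1.0
```

This is why the small per-step derivative of the sigmoid compounds into vanishing gradients over long sequences, while tanh's near-unit derivative does not.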

  • @nagarajtrivedi610 · 4 months ago

    Very well explained, Aman. All these days I was not clear about how it retains information for short and long durations. I also wondered how an LSTM predicts new words in a sequence. Today it has become clear to me. Thank you again.

  • @youssefbechara8809 · 5 months ago

    I have to say, Mr. Aman, you've become my favourite deep learning teacher. Your ability to teach hard and complex maths through simple real-life examples is amazing! Other YouTubers teach us like robots, using difficult words, but you really make it feel like a conversation!! It's really apparent how professional and passionate you are about your work. Thank you so much ❤!

  • @SelfBuiltWealth · 5 months ago +1

    No bullshit, this is the best explanation on YouTube ❤❤ Please keep helping us.

  • @barwalgayatri4655 · 1 month ago

    Too good Aman, very, very easy to understand.

  • @samadhanpawar6554 · 1 year ago +4

    Amazing explanation, very clear and easy to understand. Keep up the good work.
    Make a video on GRU, Transformers, the attention mechanism, and encoders and decoders as well 😊

  • @omarabubakarosman2791 · 1 year ago +1

    Very explicit presentation. We are very grateful for your breaking down this complex concept of deep learning.

  • @Okive-green · 1 year ago +1

    Thank you sir for this brief but detailed video. It really helped me get the idea behind LSTM.

  • @renus4898 · 7 months ago

    Hi Aman, no words to say... simply superb! Excellent topic selection, explanation, and presentation. Please continue your journey; it is so helpful.

  • @nimeshraijada5844 · 1 year ago +2

    You have excellent teaching skills 👍

  • @ShivaniChauhan-g8t · 4 months ago

    Thank you sir for this explanation of LSTM, made easy and understandable in a few minutes.

  • @vimu-frm-slm · 7 months ago +1

    Hi Aman, thanks for your video. I understand the vanishing-gradient problem, where small gradients are backpropagated to update the weights: if the gradient is small, the weight update is even smaller, which slows the model's learning and leads to poor performance.
    I also understand the LSTM model, with its long-term memory, short-term memory, forget gate, input gate, and output gate. What I don't understand is how the LSTM fixes the vanishing-gradient problem during backpropagation. The gradient can still be small even when using an LSTM, and when it is backpropagated, the weight updates will still be affected.
    I understand how the LSTM helps in forward propagation; how does it help in backpropagation? Please make a video explaining that. Your help is much appreciated. Thanks again.
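The usual answer to this question can be illustrated numerically. The sketch below is a hypothetical toy example (not the video's code; the constants `rnn_factor` and `lstm_factor` are illustrative assumptions): the LSTM cell state is updated additively, so the gradient flowing along the cell path is scaled per step only by the forget-gate value, which the network can learn to keep near 1, rather than by a repeated `tanh'(·) * weight` product as in a plain RNN.

```python
# Toy illustration with assumed constants:
# In a plain RNN, backpropagation through time multiplies the gradient
# at every step by tanh'(pre_activation) * weight, which is often well
# below 1, so the product shrinks toward zero over many steps.
# In an LSTM, the cell state updates additively
# (c_t = f_t * c_{t-1} + i_t * g_t), so the gradient along the cell
# path is scaled per step only by the forget gate f_t.
steps = 50
rnn_factor = 0.25 * 1.5   # an assumed tanh' value times an assumed weight
lstm_factor = 0.97        # an assumed learned forget-gate value near 1

rnn_grad = rnn_factor ** steps    # vanishes after 50 steps
lstm_grad = lstm_factor ** steps  # remains a usable magnitude
print(rnn_grad, lstm_grad)
```

So the gradient can still shrink in an LSTM, but the network has a learnable knob (the forget gate) that lets it preserve gradient flow when the task requires long memory.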

  • @TheMato1112 · 1 year ago

    Thank you very much. Finally, I understand. Well done.

  • @vimalashekar-c8c · 4 months ago

    I like your style of teaching. Great job!

  • @swapnil2881 · 1 year ago

    Excellent. I watched so many videos without the concept becoming clear, but today's video was very helpful.

  • @Tatanajafi · 10 months ago

    Best description. Thank you.

  • @slickjunt · 5 months ago

    Keep up the great work!

  • @samiashah7914 · 1 year ago

    Zabardast (fantastic), very well explained. ❤ Thank you sir.

  • @houssam0017 · 2 months ago

    Thanks a lot. Well explained.

  • @sunilkumarsam150 · 1 year ago +2

    Best explanation ever!👍

  • @zenithmacwan · 1 year ago +1

    Excellent explanation! Thank you sir!

  • @rosie-5h · 1 year ago

    Wow, such a hard topic still felt really simple. Thank you sir for explaining it so nicely.

  • @agbershimaemmanuel-ci6mz · 2 months ago

    Thank you Aman for this interesting lecture.

  • @prateekbhadauria7004 · 1 year ago

    Nice, brief, and superb explanation. Thank you for spreading your knowledge.

  • @revanthisbnimmagadda · 5 months ago

    Well explained, Aman. I like the video on LSTM.

  • @shine_through_darkness · 9 months ago

    Thank you brother, your channel is really good.

  • @adityarajiv6346 · 10 months ago

    Thanks, it was really helpful!

  • @vincentdey4313 · 8 months ago

    Well explained. Great job.

  • @Gezahegnt2000 · 8 months ago +1

    Nice tutorial, thanks.

  • @NileshPatil-pl2fj · 6 months ago

    I liked the video... very nice explanation.

  • @danielfiadjoe9312 · 1 year ago

    Thank you. This is very clear.

  • @divyaharshad9985 · 11 months ago

    Very good content! Explained lucidly.

  • @raheemasghar2383 · 1 year ago

    Great effort Aman

  • @kavyanagesh8304 · 1 year ago

    You're THE BEST! Thank you.

  • @veenajain · 1 year ago

    Awesome content Aman

  • @princekhunt1 · 7 months ago

    Nice explanation 👍

  • @TheSerbes · 4 months ago

    I want to do parameter selection in an LSTM and remove unnecessary parameters. Do you have a video on how I can do this?

  • @prateekkumar2740 · 1 year ago +2

    Great content, Aman. Could you please cover time-series problem solving using LSTM? Thanks.

  • @sunilkumarsam150 · 1 year ago +3

    Sir, can you make a video on implementing and writing research papers effectively in the fields of NLP, ML, and DL?

  • @AIandSunil · 1 year ago

    Excellent explanation sir ❤ thank you.

  • @tehreemqasim2204 · 9 months ago

    Excellent tutorial, thank you.

  • @PriyaBeram13 · 7 months ago

    Great explanation, thank you sir! Could you explain the use of tanh in the input gate?

  • @spoc.mnmjecspringboardmnmjec · 3 months ago

    Good sir

  • @aiddee-p2n · 1 year ago

    Simply amazing

  • @casuresh7081 · 1 year ago

    Good explanation. Thanks.

  • @selvimurali-o2s · 1 year ago

    Nice explanation, thank you.

  • @RafaelRivetti · 7 months ago

    In an MLP, data from the independent variables at time t are used to predict a future value at t+n. In an LSTM, instead of using only time t of the independent variables, the inputs cover times t, t-1, t-2, ..., t-n, as chosen by the programmer, and from those it generates the prediction for a future time t+n. Is this reasoning correct? Thank you very much!
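The windowing described in this comment can be sketched as follows; `make_windows` is a hypothetical helper (not from the video), assuming a univariate series and a one-step horizon by default:

```python
import numpy as np

def make_windows(series, n_lags, horizon=1):
    """Turn a 1-D series into (samples, n_lags) inputs and scalar targets:
    each input is the window [t - n_lags, ..., t - 1], and the target is
    the value `horizon` steps after the window ends."""
    X, y = [], []
    for t in range(n_lags, len(series) - horizon + 1):
        X.append(series[t - n_lags:t])     # past n_lags observations
        y.append(series[t + horizon - 1])  # future value to predict
    return np.array(X), np.array(y)

series = np.arange(10, dtype=float)  # toy series 0, 1, ..., 9
X, y = make_windows(series, n_lags=3)
print(X.shape, y.shape)  # (7, 3) (7,)
print(X[0], y[0])        # [0. 1. 2.] 3.0
```

An MLP would see only the flattened feature vector at one time, whereas an LSTM consumes each window as an ordered sequence of `n_lags` steps, which matches the reasoning in the comment.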

  • @milindankur · 1 year ago

    Great explanation! Thank you!

  • @red_righthand-o4x · 9 months ago

    great brother

  • @aliyuabubakarmusa946 · 8 months ago

    The video is very interesting

  • @IT-pz8yz · 1 year ago

    Great explanation, thank you.

  • @MayankArya-jp7dq · 7 months ago

    Thanks Sir :)

  • @KhairulMia-tr2jv · 5 months ago

    Good tutorial on LSTM, which gave me a good idea of it.

  • @akshitacharak4385 · 1 year ago

    Thank you so much 👍🏼

  • @preethibaligar6766 · 7 months ago

    And just like that, the magic of understanding happened!

  • @parsasakhi1505 · 1 year ago

    Awesome! Thank you.

  • @malaykhare1006 · 4 months ago

    amazing

  • @Amin-ue2dz · 8 months ago

    You are amazing.

  • @muhammedthayyib9202 · 1 year ago

    GREAT work

  • @Hunger_Minds · 11 months ago

    Aman sir, I have a query regarding the LSTM architecture:
    in how many iterations will the model learn which words are important and which are not?

  • @afn8370 · 1 year ago

    Thank you, man.

  • @BhanusriAndhavarapu-xc1xb · 9 months ago

    Could you please explain time-series data?

  • @ashwinijalla3504 · 1 year ago

    Please do a series covering Generative AI concepts in full.

  • @shubhrashrivastava9510 · 10 months ago

    What is the difference between the final output o_t and the hidden state h_t? Please explain in detail... thanks in advance.
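In the standard LSTM formulation, o_t is the output gate's sigmoid activation (a value between 0 and 1 deciding how much of the cell state to expose), while h_t is the hidden state actually emitted: h_t = o_t * tanh(c_t). A minimal sketch with toy scalar values (illustrative assumptions, not the video's code or notation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy scalar values for one time step:
c_t = 2.0                 # assumed cell-state value
o_t = sigmoid(1.0)        # output gate from an assumed pre-activation of 1.0
h_t = o_t * np.tanh(c_t)  # hidden state passed to the next step / output layer
print(o_t, h_t)           # o_t ~ 0.731, h_t ~ 0.705
```

So o_t is only a gating factor, while h_t is the gated, squashed cell state that the rest of the network actually sees.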

  • @jawaherabdulwahabfadhil2329

    Thank you sir.

  • @Hiisbnxkskkwl · 1 month ago +1

    Sir, excellent! You made me master it in a snap 😂

  • @spoc.mnmjecspringboardmnmjec

    Nice sir

  • @tishachhabra3369 · 1 year ago

    Best 👍

  • @chirumadderla8129 · 1 year ago

    Super stuff

  • @lathusree · 1 year ago

    Hi Aman, can you explain bus arrival-time prediction using LSTM?

  • @chirumadderla8129 · 1 year ago

    Can you please consider the use case of weather forecasting?

  • @sg28011 · 11 months ago

    How will the forget gate know that a particular word is irrelevant or less important?

    • @sg28011 · 11 months ago

      Got the answer as the video progressed

  • @bakyt_yrysov · 4 months ago

    🔥🔥🔥

  • @rahulraizada7001 · 1 year ago +1

    Sigmoid lies between 0 and 1, whereas tanh lies between -1 and 1.

  • @dhirajpatil6776 · 9 months ago

    Sir, can you please provide written notes for this video, i.e., everything you explained in it, written out in words? Thank you.

  • @rahulraizada7001 · 1 year ago

    What is the difference between tanh and ReLU?

  • @mohitchouksey6707 · 1 year ago

    Sir, how many people got jobs from your course?

  • @hassanarshad2687 · 1 year ago

    🤯🤯🤯🤯🤯🤯

  • @KumR · 1 year ago

    Why sigmoid, and why tanh?

  • @Mimimoon2024 · 1 year ago

    You know, prof, I bought a bunch of courses related to this; believe me, they were not as clear as your course.

  • @siddheshmhatre2811 · 1 year ago

    Finally found god.

  • @allinteli · 10 months ago

    Sir, please make videos in Hindi.

  • @_Channel_X · 11 months ago

    This is what a "simple explanation" means in the real sense.