Ali Ghodsi, Deep Learning, Attention mechanism, self-attention, S2S, Fall 2023, Lecture 9
- Published 27 Sep 2024
- Attention mechanism, self-attention, and sequence-to-sequence models
This video provides an in-depth exploration of the attention mechanism and self-attention, concepts that have revolutionized the field of Natural Language Processing (NLP). Transformers, the game-changers in NLP, rely heavily on self-attention. Join us as we unravel the fundamentals of attention and self-attention in the context of NLP, and gain a brief insight into their application in image processing.
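As background for the lecture's topic, here is a minimal sketch of scaled dot-product self-attention, the operation the description says Transformers rely on. The projection matrices Wq, Wk, Wv, the dimensions, and the function name are illustrative assumptions for this sketch, not the lecture's own notation.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings X."""
    Q = X @ Wq  # queries (assumed learned projection)
    K = X @ Wk  # keys
    V = X @ Wv  # values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity between tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # each output is a weighted mix of all value vectors

# Toy usage: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```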
Thank you
Detailed mathematical formula explanation starts @47:00
Prof., you used to be very verbose and expansive on the board. Why this hybrid mode of PPT and some board? Love from Pakistan!
Thanks for the class, sir.