Ali Ghodsi, Deep Learning, Attention mechanism, self-attention, S2S, Fall 2023, Lecture 9

  • Published 27 Sep 2024
  • Attention mechanism and self-attention,
    Sequence-to-sequence models
    This video provides an in-depth exploration of the attention mechanism and self-attention, crucial concepts that have revolutionized natural language processing (NLP). Transformers, the game-changers in NLP, rely heavily on self-attention. Join us as we unravel the fundamentals of attention and self-attention in the context of NLP, and gain brief insight into their application in image processing. A minimal sketch of the core computation follows below.
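
  For readers skimming before watching: the self-attention the description refers to is usually the standard scaled dot-product form. Below is a minimal NumPy sketch of that generic technique, not code from the lecture; the function and weight names (self_attention, Wq, Wk, Wv) are ours for illustration.

      import numpy as np

      def self_attention(X, Wq, Wk, Wv):
          """Minimal single-head scaled dot-product self-attention.

          X:          (n, d)   sequence of n token embeddings
          Wq, Wk, Wv: (d, d_k) learned projection matrices
          """
          # Project every token to a query, key, and value vector
          Q, K, V = X @ Wq, X @ Wk, X @ Wv
          # Pairwise query-key similarities, scaled by sqrt(d_k)
          scores = Q @ K.T / np.sqrt(K.shape[-1])
          # Row-wise softmax turns scores into attention weights
          weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
          weights /= weights.sum(axis=-1, keepdims=True)
          # Each output token is a weighted mix of all value vectors
          return weights @ V

      # Toy usage with random embeddings and weights
      rng = np.random.default_rng(0)
      n, d, d_k = 4, 8, 8
      X = rng.normal(size=(n, d))
      out = self_attention(X, *(rng.normal(size=(d, d_k)) for _ in range(3)))
      print(out.shape)  # (4, 8)

  The scaling by sqrt(d_k) keeps the dot products from growing with dimension, which would otherwise push the softmax into a near-one-hot regime with vanishing gradients.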

COMMENTS • 4

  • @nguyenple, 11 months ago, +1

    Thank you

  • @huwenhan, 9 months ago

    The detailed mathematical formula explanation starts at 47:00.

  • @bsementmath6750, 8 months ago

    Prof, you used to be very verbose and expansive on the board. Why this hybrid mode of PPT and some board? Love from Pakistan!

  • @olabintanibraheem8111, 11 months ago

    Thanks for the class, sir.