How did the Attention Mechanism start an AI frenzy? | LM3

  • Published 20 Jan 2025

COMMENTS • 29

  • @vcubingx
    @vcubingx 9 months ago +10

    With that, these are the three videos I had planned out. Do check out the previous ones if you missed them!
    What kind of videos would you guys like to see next?

    • @VisibilityO2
      @VisibilityO2 9 months ago

      Hey, I think vcubingx should explain sparse attention: it lets models handle large inputs more efficiently by attending to only a subset of elements. For long sequences this is a computational advantage (it requires less calculation than full softmax attention).
      I'd recommend reading 'research.google/blog/rethinking-attention-with-performers/?m=1'
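
A minimal sketch of the sparse-attention idea the comment above describes: local (windowed) attention, where each position attends only to a fixed-size neighborhood of keys instead of the whole sequence, so cost scales with seq_len * window rather than seq_len**2. All names, dimensions, and the window size are illustrative assumptions, not taken from the video. (The linked Performers post actually takes a different route, approximating softmax attention with kernels, but the motivation is the same.)

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def local_sparse_attention(Q, K, V, window=2):
    # Each query attends only to keys within `window` positions of
    # itself, instead of every key as in full attention.
    seq_len, d = Q.shape
    out = np.zeros_like(V)
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        scores = Q[i] @ K[lo:hi].T / np.sqrt(d)  # only a slice of the keys
        out[i] = softmax(scores) @ V[lo:hi]      # weighted mix of local values
    return out

# Toy usage: 6 tokens with 4-dimensional projections.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(6, 4)) for _ in range(3))
print(local_sparse_attention(Q, K, V).shape)  # (6, 4)
```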

  • @scottmcevoy9252
    @scottmcevoy9252 9 months ago +8

    This is one of the best explanations of attention I have seen so far. The bottleneck motivation, right around 3:15, really makes it clear.

  • @blackveganarchist
    @blackveganarchist 9 months ago +4

    you’re doing god’s work brother, thank you for the series

  • @j.domenig418
    @j.domenig418 9 months ago +2

    Thanks!

  • @nikkatalnikov
    @nikkatalnikov 9 months ago +1

    great explanation

  • @antoineberkani9747
    @antoineberkani9747 9 months ago +2

    I really like how easy you make it to understand the why of things. I think you've accomplished your goal of making it seem like I could come up with this!
    Please cover multi-headed self-attention next! :)
    I am worried that this simple approach skips important pieces of the puzzle, though. Transformers do seem to have a lot of moving parts. But it looks like you're only getting started!

  • @lolatomroflsinnlos
    @lolatomroflsinnlos 9 months ago +1

    Thanks for this series :)

  • @kevindave277
    @kevindave277 8 months ago

    Thank you, Vivek. Absolutely love your content. Please also keep adding math content, though. Maybe create a playlist about different functions, limits, etc.? Whatever suits you.

  • @YeabsiraTesfaye-ry6yc
    @YeabsiraTesfaye-ry6yc 9 days ago

    Just wow! Subscribed.

  • @FlyingHenroxx
    @FlyingHenroxx 5 months ago

    Thank you for your work! Your videos were very helpful for understanding the evolution of transformers 👍

  • @shukurullomeliboyev2004
    @shukurullomeliboyev2004 8 months ago

    Best explanation I have found so far, thank you

  • @calix-tang
    @calix-tang 9 months ago +5

    What a great video mfv, I paid attention the whole time

  • @TheRoganExperienceJoe
    @TheRoganExperienceJoe 4 months ago +1

    Nice, time to boost this video in the algorithm by typing out a comment

  • @FabioDBB
    @FabioDBB 7 months ago

    Truly amazing explanation, thx!

  • @artmiss-x8o
    @artmiss-x8o 4 months ago

    it was really good. thank you

  • @hafizulislam364
    @hafizulislam364 2 months ago

    What's the name of the piano tune that appears at the beginning of the video?

  • @Fussfackel
    @Fussfackel 9 months ago

    Great material and presentation, thanks a lot for your work! I'd like to see a deep dive into how embeddings work. We can get embeddings from decoder-only models like GPTs, Llamas, etc., and they use some form of embeddings for their internal representations, right? But there are also encoder-only models like BERT and others (OpenAI's text-embedding models) which are actually used for embeddings instead. What is the difference, and why does one work better than the other? Is it just because of compute differences, or are there some inherent differences?
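
A hedged sketch of what the comment above asks about, not something from the video: both model families expose hidden states you can pool into embeddings, and a key practical difference is that an encoder like BERT reads the whole sentence bidirectionally, while a decoder like GPT-2 only sees the left context at each token. The model names and the mean-pooling choice here are illustrative assumptions; it uses the Hugging Face `transformers` library.

```python
import torch
from transformers import AutoModel, AutoTokenizer

def embed(model_name, text):
    # Mean-pool the last hidden layer into one sentence vector.
    tok = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    inputs = tok(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, n_tokens, dim)
    return hidden.mean(dim=1).squeeze(0)

text = "Attention is all you need."
bert_vec = embed("bert-base-uncased", text)  # encoder-only: bidirectional context
gpt_vec = embed("gpt2", text)                # decoder-only: causal (left-only) context
print(bert_vec.shape, gpt_vec.shape)
```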

  • @balasubramaniana9541
    @balasubramaniana9541 9 months ago +1

    awesome

  • @maurogdilalla
    @maurogdilalla 9 months ago

    What about the meaning of the Q, K, and V matrices?
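
A worked sketch for the question above, under the standard scaled dot-product formulation (dimensions and variable names are chosen for illustration): Q, K, and V are three learned linear projections of the same token embeddings. Informally, a query is what a token is looking for, a key is what a token offers to be matched on, and a value is the content it contributes once matched.

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, d_model, d_k = 4, 8, 8

X = rng.normal(size=(seq_len, d_model))  # token embeddings
W_Q, W_K, W_V = (rng.normal(size=(d_model, d_k)) for _ in range(3))

Q = X @ W_Q  # queries: what each token is looking for
K = X @ W_K  # keys:    what each token can be matched on
V = X @ W_V  # values:  what each token contributes when matched

scores = Q @ K.T / np.sqrt(d_k)  # query-key similarity, scaled
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
output = weights @ V  # each row is a weighted mix of all values
print(output.shape)   # (4, 8)
```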

  • @posttoska
    @posttoska 9 months ago

    nice vid

  • @Rouxles
    @Rouxles 9 months ago

    tyfs

  • @lacmacmclean
    @lacmacmclean 1 month ago

    epic

  • @varunmohanraj5031
    @varunmohanraj5031 9 months ago

    ❤❤❤

  • @aidanthompson5053
    @aidanthompson5053 9 months ago

    If you hate others, you're really just hating yourself, because we are all one with god source

  • @OBGynKenobi
    @OBGynKenobi 9 months ago +8

    Weird, 3b1b has the same series going on now.

    • @korigamik
      @korigamik 9 months ago +3

      He works for 3b1b

  • @Rami_Zaki-k2b
    @Rami_Zaki-k2b 21 days ago

    Every video on planet earth explains attention with "translation", when every individual on planet earth uses ChatGPT NOT FOR TRANSLATION. We use it to CHAT ... So why use translation to explain it? It is so weird ...

  • @rosschristopherross
    @rosschristopherross 9 months ago

    Thanks!