How did the Attention Mechanism start an AI frenzy? | LM3

  • Published Nov 29, 2024

COMMENTS • 27

  • @vcubingx
    @vcubingx  7 months ago +10

    With that, these are the three videos I had planned out. Do check out the previous ones if you missed them!
    What kind of videos would you guys like to see next?

    • @VisibilityO2
      @VisibilityO2 7 months ago

      Hey, I think vcubingx should explain sparse attention: it lets models handle large inputs more efficiently by attending only to a subset of elements. For long sequences this is a computational advantage (it requires less computation than full softmax attention).
      I'd recommend reading this: 'research.google/blog/rethinking-attention-with-performers/?m=1'
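The sparse-attention idea in the comment above can be sketched directly. Below is a minimal NumPy illustration, assuming a simple local (sliding-window) sparsity pattern; the function names and the `window` parameter are illustrative choices, not the Performer method from the linked post.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def full_attention(Q, K, V):
    # Standard scaled dot-product attention: every query scores every key,
    # so the score matrix is n x n (quadratic in sequence length).
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

def local_sparse_attention(Q, K, V, window=2):
    # Sparse variant: each query attends only to keys within +/- `window`
    # positions, so only a narrow band of scores is ever computed.
    n, d = Q.shape
    out = np.zeros_like(V)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = Q[i] @ K[lo:hi].T / np.sqrt(d)
        out[i] = softmax(scores) @ V[lo:hi]
    return out

rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = rng.standard_normal((3, n, d))
print(local_sparse_attention(Q, K, V).shape)  # (8, 4)
```

With a window that covers the whole sequence, the sparse version reduces to full attention; shrinking the window trades a little accuracy for per-query cost that no longer grows with sequence length.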

  • @scottmcevoy9252
    @scottmcevoy9252 7 months ago +8

    This is one of the best explanations of attention I have seen so far. Understanding the bottleneck motivation really makes this clear right around 3:15.

  • @blackveganarchist
    @blackveganarchist 7 months ago +4

    you’re doing god’s work brother, thank you for the series

  • @TheRoganExperienceJoe
    @TheRoganExperienceJoe 2 months ago +1

    Nice, time to boost this video in the algorithm by typing out a comment

  • @antoineberkani9747
    @antoineberkani9747 7 months ago +2

    I really like how easy you make it to understand the why of things. I think you've accomplished your goal of making it seem like I could come up with this!
    Please cover multi headed self attention next! :)
    I am worried that this simple approach skips important pieces of the puzzle though. Transformers do have a lot of moving parts it seems. But it seems like you're only getting started!

  • @calix-tang
    @calix-tang 7 months ago +5

    What a great video mfv I paid attention the whole time

  • @shukurullomeliboyev2004
    @shukurullomeliboyev2004 6 months ago

    Best explanation I have found so far, thank you

  • @lolatomroflsinnlos
    @lolatomroflsinnlos 7 months ago +1

    Thanks for this series :)

  • @kevindave277
    @kevindave277 6 months ago

    Thank you, Vivek. Absolutely love your content. Please also keep adding Math content, though. Maybe create a playlist about different functions, limits etc? Whatever suits you.

  • @FlyingHenroxx
    @FlyingHenroxx 3 months ago

    Thank you for your work! Your videos were very helpful for understanding the evolution of transformers 👍

  • @FabioDBB
    @FabioDBB 5 months ago

    Truly amazing explanation, thx!

  • @nikkatalnikov
    @nikkatalnikov 7 months ago +1

    great explanation

  • @artmiss-x8o
    @artmiss-x8o 3 months ago

    it was really good. thank you

  • @j.domenig418
    @j.domenig418 7 months ago +2

    Thanks!

  • @Fussfackel
    @Fussfackel 7 months ago

    Great material and presentation, thanks a lot for your work! I'd like to see a deep dive into how embeddings work. We can get embeddings from decoder-only models like GPTs, Llamas, etc., and they use some form of embeddings for their internal representations, right? But there are also encoder-only models like BERT and others (OpenAI's text-embedding models) which are actually used instead. What is their difference, and why does one work better than the other? Is it just a matter of compute, or are there some inherent differences?

  • @balasubramaniana9541
    @balasubramaniana9541 7 months ago +1

    awesome

  • @hafizulislam364
    @hafizulislam364 15 days ago

    What's the name of the piano tune that appears at the beginning of the video?

  • @lacmacmclean
    @lacmacmclean 6 days ago

    epic

  • @maurogdilalla
    @maurogdilalla 7 months ago

    What about the meaning of the Q, K and V matrices?

  • @posttoska
    @posttoska 7 months ago

    nice vid

  • @Rouxles
    @Rouxles 7 months ago

    tyfs

  • @aidanthompson5053
    @aidanthompson5053 7 months ago

    If you hate others, you're really just hating yourself, because we are all one with god source

  • @varunmohanraj5031
    @varunmohanraj5031 7 months ago

    ❤❤❤

  • @OBGynKenobi
    @OBGynKenobi 7 months ago +8

    Weird, 3b1b has the same series going on now.

    • @korigamik
      @korigamik 7 months ago +3

      He works for 3b1b

  • @rosschristopherross
    @rosschristopherross 7 months ago

    Thanks!