Word2Vec, GloVe, FastText - EXPLAINED!

  • Published 31 May 2024
  • Let's talk about word2vec architectures (CBOW, Skip-gram, GloVe, FastText). A short illustrative code sketch follows the course links below.
    SPONSOR
    Get 20% off and be a part of a Premium Software Engineering Community for career advice and guidance: www.jointaro.com/r/ajayh486/
    ABOUT ME
    ⭕ Subscribe: ua-cam.com/users/CodeEmporiu...
    📚 Medium Blog: / dataemporium
    💻 Github: github.com/ajhalthor
    👔 LinkedIn: / ajay-halthor-477974bb
    RESOURCES
    [1 🔎] A Neural Probabilistic Language Model (Bengio et al., 2003): www.jmlr.org/papers/volume3/b...
    [2 🔎] Word2Vec (original paper): arxiv.org/pdf/1301.3781.pdf
    [3 🔎] GloVe paper: nlp.stanford.edu/pubs/glove.pdf
    [4 🔎] FastText: arxiv.org/pdf/1607.04606.pdf
    [5 🔎] LSTM Video: • LSTM Networks - EXPLAI...
    [6 🔎] Transformer Video: • Transformer Neural Net...
    [7 🔎] BERT video: • BERT Neural Network - ...
    [8 🔎] ChatGPT: openai.com/blog/chatgpt
    PLAYLISTS FROM MY CHANNEL
    ⭕ Transformers from scratch playlist: • Self Attention in Tran...
    ⭕ ChatGPT Playlist of all other videos: • ChatGPT
    ⭕ Transformer Neural Networks: • Natural Language Proce...
    ⭕ Convolutional Neural Networks: • Convolution Neural Net...
    ⭕ The Math You Should Know : • The Math You Should Know
    ⭕ Probability Theory for Machine Learning: • Probability Theory for...
    ⭕ Coding Machine Learning: • Code Machine Learning
    MATH COURSES (7 day free trial)
    📕 Mathematics for Machine Learning: imp.i384100.net/MathML
    📕 Calculus: imp.i384100.net/Calculus
    📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
    📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
    📕 Linear Algebra: imp.i384100.net/LinearAlgebra
    📕 Probability: imp.i384100.net/Probability
    OTHER RELATED COURSES (7 day free trial)
    📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
    📕 Python for Everybody: imp.i384100.net/python
    📕 MLOps Course: imp.i384100.net/MLOps
    📕 Natural Language Processing (NLP): imp.i384100.net/NLP
    📕 Machine Learning in Production: imp.i384100.net/MLProduction
    📕 Data Science Specialization: imp.i384100.net/DataScience
    📕 Tensorflow: imp.i384100.net/Tensorflow
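    CODE SKETCH
    To make the four approaches concrete, here is a minimal training sketch using the gensim library. Gensim, the toy corpus, and every hyperparameter below are illustrative choices of this write-up, not something shown in the video.

    # Word2Vec (CBOW and Skip-gram) and FastText via gensim -- an illustrative sketch.
    # Install with: pip install gensim
    from gensim.models import Word2Vec, FastText

    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["a", "man", "walks", "in", "the", "city"],
        ["a", "woman", "walks", "in", "the", "city"],
    ]

    # CBOW (sg=0): predict the center word from its surrounding context words.
    cbow = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=0)

    # Skip-gram (sg=1): predict the context words from the center word.
    skipgram = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1)

    # FastText: skip-gram plus character n-grams (min_n..max_n), so even
    # words never seen in training still get a vector from their subwords.
    ft = FastText(corpus, vector_size=50, window=2, min_count=1, min_n=3, max_n=5)

    print(cbow.wv["king"].shape)             # (50,) -- a learned dense vector
    print(skipgram.wv.most_similar("king", topn=2))
    print(ft.wv["kingly"].shape)             # works although "kingly" is out-of-vocabulary

    GloVe has no trainer in gensim: it fits global co-occurrence counts rather than sliding local windows, and in practice pretrained GloVe vectors are downloaded and loaded as KeyedVectors.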

COMMENTS • 22

  • @uk_with_jatin3512
    @uk_with_jatin3512 11 months ago +2

    Just love your videos, man! Thanks for helping! I was scared of learning attention mechanisms at first, but now everything is clear. It just helps a lot while working on projects when you know what is going on behind the scenes.

  • @RivenOmg
    @RivenOmg 11 months ago +1

    That’s such a great explanation, thank you!

  • @rohitchamp
    @rohitchamp 10 months ago +1

    Your explanation is super, brother, one of the best.

  • @joudjoud1947
    @joudjoud1947 11 months ago

    Thank you for all the effort you put in to make these subjects easy and accessible. As a newbie, could you please tell me where I should start?

  • @Patapom3
    @Patapom3 11 months ago

    Great video!

  • @samson6707
    @samson6707 1 month ago

    Quality video. Thanks 👍

  • @phlip00
    @phlip00 10 months ago

    Thanks, man!

  • @krzysztofjarek6476
    @krzysztofjarek6476 11 months ago

    Thank you for this video :)

    • @CodeEmporium
      @CodeEmporium  11 months ago

      My pleasure :)

    • @krzysztofjarek6476
      @krzysztofjarek6476 11 months ago

      @CodeEmporium Your new, more historical series is a great contribution to YouTube :D

  • @BromaniJones
    @BromaniJones 8 months ago

    When you say “n-gram vector”, do you mean “bag-of-words vector”? I always thought it was the latter and haven’t heard the former.
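    In the FastText paper [4], the "n-grams" are character n-grams inside a single word, not a bag-of-words document vector: each word is wrapped in boundary markers and split into all substrings of length min_n to max_n. A minimal sketch (the helper name char_ngrams is made up for illustration):

    def char_ngrams(word: str, min_n: int = 3, max_n: int = 6) -> list[str]:
        # Wrap the word in boundary markers, as the FastText paper does,
        # then enumerate every substring of length min_n..max_n.
        wrapped = f"<{word}>"
        return [
            wrapped[i : i + n]
            for n in range(min_n, max_n + 1)
            for i in range(len(wrapped) - n + 1)
        ]

    print(char_ngrams("where", 3, 3))
    # ['<wh', 'whe', 'her', 'ere', 're>'] -- the paper's own example for n = 3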

  • @kinanradaideh5479
    @kinanradaideh5479 11 months ago

    Didn't this guy have a Discord? Does anyone know if it's still up, or could anyone send it? I'd love to be a part of this wonderful YouTuber's community.

  • @AnakarParida
    @AnakarParida 9 months ago

    I am a newbie to ML, so forgive me if I am wrong. I was going through your video and something didn't make sense to me. At 2:09 you said it is a 100 x 1 vector, but later, at 2:38, you mentioned that it's a 1 x 100 vector. I think the latter is correct, right @CodeEmporium?
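    The mismatch this comment points at is a row- versus column-vector convention: a one-hot vector of vocabulary size V times a V x d embedding matrix selects the same d numbers whether you write them as 1 x d or d x 1. A toy numpy sketch (V = 5 and d = 3 are made-up sizes, not the video's 100):

    import numpy as np

    V, d = 5, 3
    E = np.random.rand(V, d)            # embedding matrix: one d-dim row per word

    one_hot_row = np.zeros((1, V))      # 1 x V row vector selecting word index 2
    one_hot_row[0, 2] = 1.0
    print((one_hot_row @ E).shape)      # (1, 3): the embedding as a 1 x d row

    one_hot_col = one_hot_row.T         # V x 1 column vector, same selection
    print((E.T @ one_hot_col).shape)    # (3, 1): the same numbers as a d x 1 column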

  • @s8x.
    @s8x. 27 days ago

    What makes these different from tokenizers?
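    One way to see the difference, sketched below with a made-up three-word vocabulary: a tokenizer only maps text to integer ids, while the embeddings that Word2Vec/GloVe/FastText learn map those ids to dense vectors that encode similarity.

    import numpy as np

    vocab = {"the": 0, "king": 1, "queen": 2}    # the tokenizer's output space: text -> ids
    embeddings = np.random.rand(len(vocab), 50)  # the embedding table: ids -> dense vectors

    tokens = "the king".split()                  # a trivial whitespace tokenizer
    ids = [vocab[t] for t in tokens]             # [0, 1] -- still just symbols
    vectors = embeddings[ids]                    # shape (2, 50): one vector per token
    print(ids, vectors.shape)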

  • @ghostrider9084
    @ghostrider9084 1 month ago

    Sir, are you a Kannadiga? Wow! Do you work in the USA, sir, or are you pursuing a degree?

  • @XpLoeRe
    @XpLoeRe 11 months ago

    My saviour.

  • @roastmaker1233
    @roastmaker1233 9 months ago +1

    ❤ from ಬೆಂಗಳೂರು (Bengaluru)

    • @CodeEmporium
      @CodeEmporium  9 months ago

      Hearts from me too! Thanks for commenting, fellow Kannadiga :)

  • @prithvi1138
    @prithvi1138 6 months ago

    Sir, are you from Karnataka?