The Illustrated Word2vec - A Gentle Intro to Word Embeddings in Machine Learning

  • Published 31 May 2024
  • The concept of word embeddings is a central one in natural language processing (NLP). It's a method of representing words numerically -- as lists of numbers that capture their meaning. Word2vec is an algorithm (a couple of algorithms, actually) for creating word vectors, and it helped popularize this concept. In this video, Jay takes you on a guided tour of The Illustrated Word2Vec, an article explaining the method and how it came to be developed. (A short code sketch of the idea follows the video links below.)
    The article: jalammar.github.io/illustrate...
    The talk: • Intuition & Use-Cases ...
    Word2vec paper: proceedings.neurips.cc/paper/...
    By Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean
    ---
    Twitter: / jayalammar
    Blog: jalammar.github.io/
    Mailing List: jayalammar.substack.com/
    ---
    More videos by Jay:
    Language Processing with BERT: The 3 Minute Intro (Deep learning for NLP)
    • Language Processing wi...
    Explainable AI Cheat Sheet - Five Key Categories
    • Explainable AI Cheat S...
    The Narrated Transformer Language Model
    • The Narrated Transform...
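
    To make the "words as lists of numbers" idea concrete, here is a minimal sketch. It is not from the video; it assumes the gensim library and its downloader for pretrained GloVe vectors (a close cousin of word2vec embeddings):

        # Word embeddings in practice: each word is a short list of numbers.
        import gensim.downloader as api

        # Pretrained 50-dimensional GloVe vectors (small download via gensim).
        vectors = api.load("glove-wiki-gigaword-50")

        print(vectors["king"][:5])                   # first 5 of the 50 numbers
        print(vectors.most_similar("king", topn=3))  # nearby vectors = related words

        # The famous analogy: king - man + woman ~ queen
        print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))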

COMMENTS • 18

  • @debbs_io • 1 month ago

    I've watched a lot of videos on YouTube, so many with animations etc. I nearly lost hope, thinking I would never be able to grasp this concept. This is the only one that truly explains what a word embedding is and how it's derived, in a simple manner. Thank you so much

  • @andrestellez84 • 1 year ago +1

    Thanks for these videos and your blog, I've learned so much from you. I always read your blog entries before diving into the original paper.

  • @kaninlunaire3106 • 1 year ago +2

    Thank you for making this, Sir. It's very helpful!

  • @MannyBernabe • 3 months ago

    Personality scores are a great example!
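
    For anyone curious about that analogy from the article: people are represented as small vectors of trait scores and compared with cosine similarity. A minimal sketch with made-up scores (assumes only numpy; the numbers are illustrative, not from the video):

        import numpy as np

        def cosine_similarity(a, b):
            # cos(theta) = (a . b) / (|a| * |b|); 1.0 means same direction.
            return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

        # Hypothetical personality scores on a -1..1 scale (two traits each).
        jay     = np.array([-0.4, 0.8])
        person1 = np.array([-0.3, 0.2])
        person2 = np.array([ 0.9, -0.5])

        print(cosine_similarity(jay, person1))  # higher: more similar to Jay
        print(cosine_similarity(jay, person2))  # lower: less similar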

  • @nelsonpullella9977 • 1 year ago

    Great job! I very much enjoy your channel and blog! Thanks!

  • @bagamanocnon • 10 months ago

    Jay, how does training LLMs differ from training text embedding models? Or is an embedding model a byproduct of training an LLM? Like in transformers, where text is converted to embeddings first before being fed to the transformer blocks. Thanks!
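
    For context on this question: in a transformer LLM the token-embedding table is itself a trainable layer inside the model, so an LLM does learn embeddings as part of its training (dedicated text-embedding models are usually trained further, often with contrastive similarity objectives, so that a single vector per text is useful on its own). A minimal sketch of that first layer (assumes PyTorch; illustrative, not the video's code):

        import torch
        import torch.nn as nn

        vocab_size, embed_dim = 10_000, 512

        # First layer of a transformer LM: token ids -> vectors,
        # before anything reaches the transformer blocks.
        embedding = nn.Embedding(vocab_size, embed_dim)

        token_ids = torch.tensor([[3, 17, 256]])  # a tiny "sentence" of token ids
        vectors = embedding(token_ids)            # shape: (1, 3, 512)
        print(vectors.shape)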

  • @lemoniall6553 • 1 year ago

    Very good explanation. One more thing: does word2vec use dimensionality reduction too? We can choose 50, 100, or 200 dimensions, but how does that work? Thanks
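
    On the dimensionality question: word2vec is not reducing some larger representation; the number you choose (50, 100, 200, ...) is simply the size of the vectors the shallow network learns directly. A minimal sketch (assumes gensim 4.x, where the parameter is vector_size):

        from gensim.models import Word2Vec

        sentences = [["king", "rules", "the", "kingdom"],
                     ["queen", "rules", "the", "kingdom"]]

        # vector_size fixes the embedding dimensionality up front;
        # the model learns vectors of exactly that size, no reduction step.
        model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=10)
        print(model.wv["king"].shape)  # (50,)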

  • @priyam66 • 1 year ago +7

    One unsolicited piece of advice: you have profound knowledge of AI, and you should share it by making more videos on several AI topics. I hope every AI aspirant gets a chance to watch your videos.
    Keep it up..:)

  • @sakaar-lok9109 • 1 year ago

    You are great, please never stop

  • @abdikadermohamed5288 • 1 year ago

    Thank you so much, it's a great explanation, clear and easy to understand

  • @RoccoSwat • 1 year ago

    This guy is the best. He is a good guy.

  • @Shubham-su7sm • 1 year ago +1

    Yoo Flying Beast!!

  • @user-uf6ym4qx4c • 5 months ago

    Why does the person keep getting bigger and smaller throughout the video?

  • @sershsershsersh • 7 months ago +1

    3:32 "...Jay is 38 on the 0 to 100 scale... so -.4 on the -1 to 1 scale...": How is that? I get -.24. If it's -.4 on the -1 to 1 scale, that's 30 on the 0 to 100 scale. Please fix my math.
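
    For what it's worth, the commenter's arithmetic checks out: mapping x from 0..100 onto -1..1 is x/100 * 2 - 1, which sends 38 to -0.24, while -0.4 corresponds to 30. A quick check in Python:

        def rescale(x, lo=0.0, hi=100.0, new_lo=-1.0, new_hi=1.0):
            # Linear map from [lo, hi] onto [new_lo, new_hi].
            return (x - lo) / (hi - lo) * (new_hi - new_lo) + new_lo

        print(rescale(38))  # -0.24
        print(rescale(30))  # -0.4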

  • @Udayanverma • 7 months ago +1

    Instead of explaining, you went scrolling through pages. It would have been better to keep it short and maybe make other videos for the subsequent sections.