Word Embeddings: Word2Vec

  • Published Feb 9, 2025
  • Word2Vec is a groundbreaking technique that transforms words into numerical vectors, capturing semantic relationships in language. (A small code sketch of this idea follows after the links below.)
    This video explores:
    • How Word2Vec works to create meaningful word representations
    • Practical applications in NLP and machine learning
    • Some limitations of Word2Vec
    Hex is a collaborative workspace for exploratory analytics and data science. With Hex, teams can quickly reach insights in AI-powered notebooks using SQL, Python, & no-code, and instantly share their work with anyone.
    Links
    ------------------
    Visit our website: hex.tech/
    Stay connected on Twitter: / _hex_tech
    Stay connected on LinkedIn: / mycompany
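
A minimal sketch of the core idea described above (an illustration added here, not taken from the video; the gensim library, the toy corpus, and all parameters are assumptions for the example):

```python
# Word2Vec in a few lines: words become vectors, and words that appear in
# similar contexts end up with similar vectors. Corpus and settings are toy
# assumptions for illustration only.
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "drinks", "milk"],
    ["the", "dog", "drinks", "water"],
    ["a", "cat", "chases", "a", "mouse"],
    ["a", "dog", "chases", "a", "ball"],
]

# sg=1 selects the skip-gram architecture; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=200, seed=1)

print(model.wv["cat"].shape)          # each word is now a 50-dimensional vector
print(model.wv.most_similar("cat"))   # words used in similar contexts should rank high
```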

COMMENTS • 21

  • @belovedsoul99 • 6 days ago

    One of the best explanations of word2vec!

  • @nayanradadiya3288 • 4 months ago • +8

    Top notch explanation with amazing animations!!

    • @_hex_tech • 4 months ago

      Appreciate it 🙏🏾

  • @amirmasoud_iravani • 11 days ago

    It was brilliant! I enjoyed and deeply understood the word2vec concept from your content. Please keep up the brilliant work 🥰🥰🥰

  • @infomaax_yt • 4 months ago • +4

    New Achievement Unlocked: Found another awesome channel to subscribe and watch grow 🌟🌟

  • @geforex28 • 4 months ago • +1

    This really was a high-quality video, thank you

  • @tmjthabethe • 4 months ago • +1

    Loving the motion graphics!

  • @harrydawitch • 4 months ago • +1

    Keep doing what you are doing, my friend. I'll always be here supporting you.

  • @sankhuz • 4 months ago

    What a great video, loved it ❤

  • @IbrahimSowunmi • 4 months ago

    Fantastic breakdown

  • @gorangagrawal • 4 months ago • +1

    To the point and simple. Thanks a lot.
    Do you mind sharing the tools used to make this beautiful piece of art? Looking to learn how to make videos and share them with students.

    • @_hex_tech • 4 months ago

      🙏🏾. My tools are just Adobe Premiere, Hex, and Notion

  • @billbond2682 • 4 months ago

    What the fug, did this awesome video just pop up in my algorithm?

  • @crystalmuaz • 4 months ago

    Subbing, commenting, and liking to boost the algorithm

  • @pablosanzo5584 • 2 months ago

    Cool vid! What is the tool used for the word analogies and visualizations?

    • @_hex_tech • 2 months ago

      It's all done in Hex: hex.tech/
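
For anyone curious about the analogy computation itself (independent of the Hex tooling shown in the video), it is plain vector arithmetic over the embeddings. A hedged sketch, assuming gensim's downloader and a small pretrained GloVe model rather than the video's own setup:

```python
# "king" - "man" + "woman" should land near "queen" in the embedding space.
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-50")  # small pretrained vectors (downloaded once)

result = wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # typically [('queen', <similarity score>)]
```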

  • @MatheoDampfer-nl3no • 3 months ago

    But how does the loss function work if the model doesn't know what is correct? And we humans could not judge the loss factually.

    • @MatheoDampfer-nl3no • 3 months ago

      I think I understood: the model compares the probability of these words showing up together in other texts. Am I right? Thanks for this great video.

    • @_hex_tech • 3 months ago • +1

      The loss function learns from how words naturally appear together in text. It doesn't need an absolute "correct" answer - instead, it measures how well the model predicts actual word co-occurrences in the training data. If words like "cat" and "drinks" frequently appear near each other, the model learns to expect this pattern, and gets penalized when it predicts unrelated words.
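
To make that reply concrete, here is a minimal numpy sketch of one common Word2Vec objective, skip-gram with negative sampling (which specific variant the video uses is an assumption here): the loss is small when a word pair that actually co-occurs scores high and randomly sampled "negative" words score low.

```python
# Skip-gram with negative sampling, for a single (center, context) pair.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(center, context, negatives):
    """center/context: vectors of a word pair that really co-occurs in the corpus.
    negatives: vectors of randomly sampled words, assumed unrelated.
    The model is rewarded for scoring the real pair high and penalized for
    scoring the unrelated words high, which is the "penalty" described above."""
    loss = -np.log(sigmoid(center @ context))                        # true pair
    loss -= sum(np.log(sigmoid(-(center @ n))) for n in negatives)   # sampled negatives
    return loss

rng = np.random.default_rng(0)
cat, drinks = rng.normal(size=50), rng.normal(size=50)   # toy embedding vectors
negatives = [rng.normal(size=50) for _ in range(5)]
print(sgns_loss(cat, drinks, negatives))  # training adjusts the vectors to shrink this
```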