A Complete Overview of Word Embeddings

  • Published 27 May 2024
  • NLP has seen some big leaps over the last couple of years thanks to word embeddings, but what are they? How are they made, and how can you use them too?
    Let's answer those questions in this video!
    Get your Free Token for AssemblyAI Speech-To-Text API 👇www.assemblyai.com/?...
    ▬▬▬▬▬▬▬▬▬▬▬▬ CONNECT ▬▬▬▬▬▬▬▬▬▬▬▬
    🖥️ Website: www.assemblyai.com
    🐦 Twitter: / assemblyai
    🦾 Discord: / discord
    ▶️ Subscribe: ua-cam.com/users/AssemblyAI?...
    🔥 We're hiring! Check our open roles: www.assemblyai.com/careers
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    #MachineLearning #DeepLearning

COMMENTS • 162

  • @ozgurak1840
    @ozgurak1840 1 year ago +73

    Thank you. It is very clear and informative, though I really think you (AssemblyAI) should lose the music in the background; it is distracting and gives the whole thing an infomercial feeling.

    • @nirash8018
      @nirash8018 11 months ago +8

      Somehow the music had a motivational influence for me. I caught myself vibing to it a few times

  • @impracticaldev
    @impracticaldev 1 year ago +10

    Would love a further video on ELMo. Thanks for all this!

  • @originalmianos
    @originalmianos 1 year ago +3

    There are maybe 30 videos on this topic, and this is the only one that does not suddenly make a massive jump across whole concepts that the presenter knows but the viewer does not.

  • @marten9334
    @marten9334 4 months ago +4

    Amazing video. Perfectly clear speech, good explanations, logical visualisations, and the background music makes it a lot easier to focus. Thank you!!

  • @TuhinBhattacharya
    @TuhinBhattacharya 2 years ago +5

    Awesome overview... Loved it... Waiting for videos explaining GloVe and ELMo.

    • @AssemblyAI
      @AssemblyAI  2 years ago

      Great to hear you liked it!

  • @hadiloghman1572
    @hadiloghman1572 1 year ago +8

    Great explanation. Please explain ELMo and GloVe. It was really great.

    • @AssemblyAI
      @AssemblyAI  1 year ago +3

      Thank you for the suggestions!

    • @cimmik
      @cimmik 4 months ago

      @@AssemblyAI I'd love to see those videos too

  • @jeremymarkson1423
    @jeremymarkson1423 1 year ago +2

    Would be great to see a video on ELMo!

    • @AssemblyAI
      @AssemblyAI  1 year ago

      Thank you for the suggestion, noted!

  • @KidistAmde
    @KidistAmde 5 months ago +1

    Excellent! Thank you so much for making an absolutely clear explanation.

  • @sajjaddehghani8735
    @sajjaddehghani8735 2 years ago +10

    Great explanation. Please explain ELMo and other approaches. Also please make a video about efficient ways of clustering the embeddings 👍

    • @AssemblyAI
      @AssemblyAI  2 years ago

      Thank you Sajjad for the suggestion!

  • @lahiru954
    @lahiru954 1 year ago

    Great explanation! I went through the topics for hours and hours, but this channel saved my time. And right on target.

  • @manojjoshi1102
    @manojjoshi1102 1 year ago +16

    Excellent explanation. I did some study on this topic before coming here because so many terms and concepts were quite overwhelming. I generally understood them but still missed the fine-tuned clarity. After watching this video, most of what I read before started making a lot of sense. I highly recommend this video. Thank you so much.

    • @AssemblyAI
      @AssemblyAI  1 year ago +2

      This is great to hear! You are very welcome!

  • @idrissnguepi7842
    @idrissnguepi7842 1 year ago +2

    Very nice explanation of the embedding concept. Would love to see pre-trained word embeddings for sentiment analysis.

  • @augurelite
    @augurelite 1 year ago +1

    Wow, such a good presenter. I really like the examples; super clear. This stuff is amazing.

  • @tommyhuffman7499
    @tommyhuffman7499 4 months ago

    The absolute best video I've seen on this topic!!

  • @deepaksingh9318
    @deepaksingh9318 3 months ago

    Amazing content. Exactly what a learner wants: all the concepts in a single video, explained in an easy-to-understand way in minimum time.

  • @Kmmc2011
    @Kmmc2011 1 year ago +2

    Thanks for taking the time to break this down and share!

    • @AssemblyAI
      @AssemblyAI  1 year ago

      You are very welcome! - Mısra

  • @whifflingtove
    @whifflingtove 1 year ago +2

    Very interested in an in-depth explanation of ELMo

  • @lavanyaseetharaman
    @lavanyaseetharaman 1 year ago

    Simple and clear explanation. Please explain ELMo, thanks

  • @kfirgollan
    @kfirgollan 1 year ago

    Great explanation! Thanks for sharing

  • @UkiDLucas
    @UkiDLucas 11 months ago

    Very good explanation, thank you!

  • @estelitaribeiro4196
    @estelitaribeiro4196 1 month ago

    Thanks! Great information in a very objective way!

  • @smudgepost
    @smudgepost 1 year ago +1

    Yes - to all videos you suggest making! Great guide, thank you. I was struggling to see value in lemmatization and was concerned about a loss of coherence. Seeing several worked examples is great. Interested how the final results were all different but all had similarly high percentage matches. How do you tackle this?

  • @shubhamdas5192
    @shubhamdas5192 1 year ago

    Great explanation in a short amount of time. Really liked the video.

  • @berkk1993
    @berkk1993 1 year ago

    Thank you very much; no other video explains this so well.

  • @HashimWarren
    @HashimWarren 1 month ago

    Very clear, thank you

  • @bibhutibaibhavbora8770
    @bibhutibaibhavbora8770 10 months ago

    Great and very illustrative video

  • @yusufkemaldemir9393
    @yusufkemaldemir9393 1 year ago

    Interested in “Creating your own embedding before doing binary or multi label classification prediction”! Thanks for the clarity.

  • @nikitamalviya692
    @nikitamalviya692 1 year ago

    Very well explained!! Thank you so much

  • @vigneshpadmanabhan
    @vigneshpadmanabhan 1 year ago

    Well explained ! Thanks a lot

  • @draziraphale
    @draziraphale 1 year ago

    Excellent presentation. I will be teaching this topic to students shortly and will recommend this material.

  • @user-lq7rh4it7c
    @user-lq7rh4it7c 1 year ago +4

    Brilliant video, as always, thanks so much. Would love to see your suggested follow-on using pre-trained word embeddings for sentiment analysis if you ever have time 🙂

  • @michaelng3126
    @michaelng3126 5 months ago

    This was awesome. Would love to see the ELMo video and the sentiment analysis video you mentioned possibly making!

  • @yuanjunren5220
    @yuanjunren5220 3 months ago

    Amazing video!!! ❣❣❣ Thanks for sharing

  • @ehichamu
    @ehichamu 1 year ago +1

    Very good video. I second the other comments: PLEASE drop the music completely. It would increase the quality of the experience by at least 70%. I had a hard time finishing the video because of the music.

  • @glowwell4292
    @glowwell4292 11 months ago

    Thanks, dear. Nicely paced intro. Good for a recap.

  • @diegovnoble
    @diegovnoble 2 months ago

    Thanks for the video! I enjoyed watching and liked the format and pace. I'd add the retrowave background track to my playlist if I knew its name. I guess people would notice it less if the volume were lower.

  • @automatster
    @automatster 1 year ago

    Great video. Thanks for sharing it. It would be great if you did a task like training a sentiment analysis model with word embeddings and shared it with us.

  • @Harduex
    @Harduex 4 months ago

    Great videos there, thank you for your content and keep up the good work!

  • @investime247
    @investime247 1 year ago

    Thank you, very clear. I need to know how to use word embeddings for text classification.

  • @AhmedKhaliet
    @AhmedKhaliet 1 year ago

    Thank you! It's my first video here, but I guess I should make your videos my priority since I'm into NLP. Thanks a lot ❤

  • @nogur9
    @nogur9 11 months ago

    It's a really good explanation, thank you very much :)

  • @hileamlakyitayew9450
    @hileamlakyitayew9450 1 year ago

    Awesome video!!

  • @captainmustard1
    @captainmustard1 10 months ago

    Top video for an introduction to embeddings

  • @toshyamg
    @toshyamg 1 year ago

    Great job 👍

  • @emandiab9524
    @emandiab9524 11 months ago

    Thanks that helped a lot.

  • @hamitguner
    @hamitguner 4 months ago

    Thank you

  • @JayTheMachine
    @JayTheMachine 8 months ago

    Thank you so much, amazing explanation, and you're beautiful

  • @dalehu5606
    @dalehu5606 1 year ago

    Clear explanation! 👍

  • @shubham-pp4cw
    @shubham-pp4cw 2 years ago +1

    Nice video on word embeddings, keep it up!

  • @mariussame9357
    @mariussame9357 1 year ago

    Thanks for the video. I do have a question: when you said that, for instance, in CBOW there is only one layer, it means the output of this layer should be a vector with the dimension of the embedding. But in order to train the model, we need to compare this output with the word in the middle, which is actually a one-hot encoded vector with the dimension of the vocabulary, so it might have another layer and a softmax.
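Editor's note: the commenter's reading is correct. In the usual CBOW formulation the hidden layer has the embedding dimension, and training adds an output projection back to vocabulary size followed by a softmax, compared against the one-hot center word. A toy pure-Python sketch (illustrative numbers, not the video's code):

```python
import math
import random

VOCAB_SIZE = 8        # toy vocabulary
EMBED_DIM = 3         # "size of the embedding" = neurons in the hidden layer

random.seed(0)
# Input weights: one EMBED_DIM vector per vocabulary word (the embeddings we keep).
W_in = [[random.uniform(-0.5, 0.5) for _ in range(EMBED_DIM)] for _ in range(VOCAB_SIZE)]
# Output weights: project the hidden vector back to vocabulary size for the softmax.
W_out = [[random.uniform(-0.5, 0.5) for _ in range(VOCAB_SIZE)] for _ in range(EMBED_DIM)]

def cbow_forward(context_ids):
    # Hidden layer: average the context words' embeddings (length EMBED_DIM).
    h = [sum(W_in[i][d] for i in context_ids) / len(context_ids) for d in range(EMBED_DIM)]
    # Extra projection + softmax: scores over the whole vocabulary,
    # compared against the one-hot center word during training.
    scores = [sum(h[d] * W_out[d][j] for d in range(EMBED_DIM)) for j in range(VOCAB_SIZE)]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return h, [e / total for e in exps]

hidden, probs = cbow_forward([1, 3, 4, 6])   # four context word ids
print(len(hidden))            # 3  -> embedding-sized hidden layer
print(len(probs))             # 8  -> vocabulary-sized softmax output
print(round(sum(probs), 6))   # 1.0
```

After training, the softmax layer is discarded and only the rows of `W_in` are kept as the word embeddings.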

  • @danielcanedo5240
    @danielcanedo5240 1 year ago

    I'm your fan already; please make an ELMo video!

  • @r.walid2323
    @r.walid2323 1 year ago

    Great explanation

  • @lbognini
    @lbognini 1 year ago

    From the embeddings of your name, I removed those of "work", added "great" and "relationship" and I came up with the embeddings of my own name? How come? Mere coincidence? 🤔🤔
    Great video, btw!

  • @dessiabdelkerym5612
    @dessiabdelkerym5612 9 months ago

    Thanks for the explanation. Please try to make a video about how ELMo works

  • @davidheilbron
    @davidheilbron 1 year ago

    Great! Thanks

  • @RiccardoCarlessoGoogle
    @RiccardoCarlessoGoogle 5 months ago

    This is amazing. Can you share the python notebook you show at 12m33s?

  • @ShaikRaasikha21
    @ShaikRaasikha21 4 months ago

    A video on training a sentiment analysis model, please

  • @AliZaki1401
    @AliZaki1401 11 months ago

    Great work!
    Can you make a video on ELMo and Transformer-based word embeddings?

  • @MohamedElGhazi-ek6vp
    @MohamedElGhazi-ek6vp 5 months ago

    Excellent explanation. I have one question, please: how could I fit my model with these embedding vectors? For example, in one of my projects for extracting information from files, instead of using texts for training my models I thought of using embeddings, but I don't know the best way to represent them to my model. I hope you understand my question, and thank you.

  • @MartinJohannesNilsen
    @MartinJohannesNilsen 1 year ago

    Great video! As for your analogy, I would guess that changing cocktail to bar would indeed give you cocktail. The analogy of having dinner at a restaurant does not match "having bar at cocktail".
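Editor's note: for readers wondering what this analogy arithmetic looks like concretely, here is a toy sketch with hand-made, purely illustrative 3-dimensional vectors (real embeddings come from a trained model; the numbers below are assumptions chosen to make the analogy work):

```python
import math

# Hypothetical, hand-made embeddings; real ones come from a trained model.
vecs = {
    "restaurant": [0.9, 0.1, 0.2],
    "dinner":     [0.8, 0.2, 0.7],
    "bar":        [0.1, 0.9, 0.2],
    "cocktail":   [0.0, 1.0, 0.7],
    "bread":      [0.5, 0.5, 0.0],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Analogy: restaurant is to dinner as bar is to ?
# Vector-offset trick: dinner - restaurant + bar, then nearest neighbour by cosine.
query = [d - r + b for r, d, b in zip(vecs["restaurant"], vecs["dinner"], vecs["bar"])]
best = max((w for w in vecs if w not in ("restaurant", "dinner", "bar")),
           key=lambda w: cosine(query, vecs[w]))
print(best)  # cocktail
```

The input words themselves are excluded from the candidates, as analogy solvers typically do.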

  • @jtauber
    @jtauber 1 year ago +1

    I love that all your examples are Lord of the Rings quotes because I run the Digital Tolkien Project which applies computational text analysis techniques to the works of Tolkien :-)

    • @AssemblyAI
      @AssemblyAI  1 year ago +2

      That's amazing! Nice to meet you! Huge Tolkien fan here. :)

    • @jtauber
      @jtauber 1 year ago

      @@AssemblyAI you should join the Digital Tolkien Project!

    • @sidindian1982
      @sidindian1982 6 months ago +1

      @@AssemblyAI Please provide the notebook code.
      Thanks

  • @altantoksoz5999
    @altantoksoz5999 11 months ago

    Great tutorial. She speaks like a native speaker. She looks like a Turkish girl, a beautiful one :)

  • @cimmik
    @cimmik 1 year ago

    Would it be possible to use word embeddings to ask whether a text is about a certain topic (or rather, to what degree a text is about a topic)?
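Editor's note: one common way to do what this comment asks is to average the text's word vectors and take the cosine similarity to the topic word's vector; the score is a degree rather than a yes/no. A toy sketch with made-up 2-dimensional vectors (pure illustration; real systems load pre-trained embeddings):

```python
import math

# Hypothetical toy embeddings; in practice you'd load pre-trained vectors.
emb = {
    "sports":   [0.9, 0.1],
    "football": [0.8, 0.2],
    "goal":     [0.7, 0.3],
    "finance":  [0.1, 0.9],
    "market":   [0.2, 0.8],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def topic_score(words, topic):
    # Average the text's word vectors, then compare to the topic vector.
    known = [emb[w] for w in words if w in emb]
    avg = [sum(col) / len(known) for col in zip(*known)]
    return cosine(avg, emb[topic])

text = ["football", "goal", "market"]
print(round(topic_score(text, "sports"), 3))
print(round(topic_score(text, "finance"), 3))
```

The text scores higher against "sports" than "finance", giving a graded topic-relatedness signal.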

  • @praveenbehara
    @praveenbehara 1 year ago +1

    Hi, thank you for the video. Great introduction and also a practical example. One request is to drop or reduce the intensity of the music; it was distracting.

    • @AssemblyAI
      @AssemblyAI  1 year ago

      Noted! Thank you for the feedback Praveen

    • @javidjamae
      @javidjamae 8 months ago

      Yes, great video but music is definitely too loud and distracting! It's really hard to concentrate on what you're saying.

  • @josephvanname3377
    @josephvanname3377 1 year ago

    I have just created my own word embedding algorithm (no neural networks). I am now training it. Let's see what kind of gibberish sentences it produces after I use it to produce new sentences (I will try to produce sentences without neural networks).

    • @josephvanname3377
      @josephvanname3377 1 year ago

      Here are some of the sentences produced by my word embedding (and just a word embedding without much on top of the word embedding).
      "How nevertheless she had credited yourself."
      "I released the chimneys were committed."-Well, this is two sentences "I released the chimneys" and "The chimneys were committed." The word embedding is not full NLP so this sort of word embedding cannot remember "released" when we get to "were".
      "Luke fills the intruder boxed the interval."-Same issue here.
      "I shall be linked unarmed."
      "He was afraid I know."

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    Would be great to see a video on ELMo.

  • @jenot7164
    @jenot7164 1 year ago

    How large should the data for a custom embedding be, and is it possible to utilize a GPU for the creation of a word embedding vector space?

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    Can the embeddings from a Transformer be used elsewhere, like those from Word2Vec?

  • @peymanhashemi3827
    @peymanhashemi3827 1 year ago

    Great!

  • @j0nrages851
    @j0nrages851 1 year ago

    Is there a video on training a sentiment model that builds on this? I'm trying to build a recommendation system based on candidate sentences and a job description.

    • @AssemblyAI
      @AssemblyAI  1 year ago

      We don't have that video yet but thank you for the suggestion!

  • @ali75988
    @ali75988 6 months ago

    8:15 I am having a problem with the sentence "number of neurons in hidden layer = size of embedding".
    I am confused: what is the size of the embedding?
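Editor's note: the "size of the embedding" is the number of dimensions in each word's vector, which is why it equals the number of neurons in the hidden layer. A toy illustration with arbitrary, made-up shapes:

```python
# Hypothetical shapes: a 5-word vocabulary, 3-dimensional embeddings.
vocab = ["the", "ring", "was", "quite", "precious"]
EMBED_DIM = 3  # "size of the embedding" = hidden-layer width

# The embedding table has one EMBED_DIM-length row per vocabulary word.
embedding_table = [[0.0] * EMBED_DIM for _ in vocab]

# Looking a word up returns its EMBED_DIM-dimensional vector,
# i.e. the hidden layer's activations for that word's one-hot input.
vector = embedding_table[vocab.index("ring")]
print(len(embedding_table), len(vector))  # 5 3
```

Real models use embedding sizes like 100 or 300; the principle is the same.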

  • @soheiltehrani3792
    @soheiltehrani3792 1 year ago +1

    Great visuals, great voice, good pace of presentation. Everything is awesome in this video.
    Thanks for sharing :D

    • @AssemblyAI
      @AssemblyAI  1 year ago

      Thank you for the nice words Soheil! Glad it was helpful!

  • @abdelazizkhalid4231
    @abdelazizkhalid4231 1 year ago

    Is there a video about sentiment analysis yet?

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    I'd be interested in seeing a Python example of Word2Vec.

  • @yigalirani308
    @yigalirani308 1 year ago

    Super helpful, but is there a version of this without the music?

    • @AssemblyAI
      @AssemblyAI  1 year ago +2

      Sorry about that! We got a lot of feedback on this. Let me see if we can upload a version without the music. :D

  • @lemoniall6553
    @lemoniall6553 1 year ago

    If we have a sentence "vishy eat bread" and then vectorize the word "eaat" (a misspelled word), why does fastText see that the word "eaat" is more similar to the word "eat"? What does the architecture look like? Is it possible for fastText to classify words without using skip-gram? Thanks
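Editor's note: fastText represents a word as the sum of vectors for its character n-grams, so the misspelling "eaat" shares many subword units with "eat" and lands near it in embedding space. A rough pure-Python illustration of that overlap (this sketch measures n-gram overlap only; it is not fastText itself):

```python
def char_ngrams(word, n_min=3, n_max=4):
    # fastText wraps words in boundary markers before extracting n-grams.
    w = f"<{word}>"
    grams = set()
    for n in range(n_min, n_max + 1):
        for i in range(len(w) - n + 1):
            grams.add(w[i:i + n])
    return grams

def jaccard(a, b):
    # Overlap of two n-gram sets: 0 = disjoint, 1 = identical.
    return len(a & b) / len(a | b)

eat, eaat, bread = char_ngrams("eat"), char_ngrams("eaat"), char_ngrams("bread")
# The misspelling shares far more subword units with "eat" than "bread" does,
# so its summed n-gram vector lands near "eat" in embedding space.
print(round(jaccard(eat, eaat), 2))   # 0.2
print(round(jaccard(eat, bread), 2))  # 0.0
```

In the real model each shared n-gram contributes the same learned vector to both words, which is what pulls "eat" and "eaat" together.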

  • @tamoghnamaitra9901
    @tamoghnamaitra9901 8 months ago

    New crush added to life

  • @joergbieri9701
    @joergbieri9701 9 months ago

    Great content, thanks. Due to a hearing problem I would appreciate it if you could remove the background music. OK? Thanks

  • @TuhinBhattacharya
    @TuhinBhattacharya 2 years ago

    How do I know which embedding will be the best choice for a specific use case? How do I know which distance measure will be best?

    • @pathikghugare
      @pathikghugare 2 years ago

      It depends on your use case. If your use case contains more general words like tea, king, actor, etc., then you can try different embeddings and see for yourself which ones work well for particular examples from your use case.
      OR
      If your use case is quite specific (something like representing skills as vectors), then you may need to train your own word2vec model on your data, since pretrained embeddings may not cover what you need.

  • @__________________________6910
    @__________________________6910 2 years ago +2

    Noice !

  • @HappyDataScience
    @HappyDataScience 10 months ago

    I would surely like to learn about ELMo; I'm guessing that ChatGPT used the same. Correct me if I'm wrong 🙇🏻‍♂️

  • @zaratushtra21
    @zaratushtra21 1 year ago

    You should also add the name of the speaker to the videos. She says "I" in the video and we don't even know who she is :)

  • @brunam7908
    @brunam7908 10 months ago

    Waiting for the ELMo video.

  • @lexflow2319
    @lexflow2319 1 year ago +1

    Do transformers from scratch. I heard they can be written in 50 lines. I would like to understand how BERT encodes words.

  • @davidswearingen7571
    @davidswearingen7571 11 months ago

    Another good video marred by the inclusion of unnecessarily loud music.

  • @iravkr
    @iravkr 1 year ago

    Your pretty face holds my concentration, and thus I understand anything taught by you, especially transformers, more than from any other YouTube video. Thank you so much for such videos; indebted!

  • @mimori.com_
    @mimori.com_ 1 year ago

    Thank you. Easy to understand. But I don't need the music at all; I catch myself listening to the music instead of your talk.

  • @flaashmindstudio1468
    @flaashmindstudio1468 1 year ago

    Can you make a video about ELMo?

  • @EkShunya
    @EkShunya 1 year ago +1

    Nice and crisp. Just one suggestion: please remove the background music; it detracts from the viewer's experience :)

  • @moeal5110
    @moeal5110 8 months ago

    Awesome content, but the background music is slightly distracting, especially when you play the video at 1.5x speed.

  • @Nice-po4xg
    @Nice-po4xg 11 months ago

    Cosine similarity is not the same as distance, bro; it just tells you the direction of that word's vector.
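Editor's note: as the comment points out, cosine similarity compares direction only; scaling a vector changes its Euclidean distance to another vector but not their cosine similarity. A quick pure-Python sketch:

```python
import math

def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

v = [1.0, 2.0, 3.0]
w = [2.0, 4.0, 6.0]   # same direction, twice the magnitude

print(round(cosine_sim(v, w), 6))  # 1.0   -> identical direction
print(round(euclidean(v, w), 3))   # 3.742 -> still different points
```

This magnitude-invariance is why cosine similarity is the usual choice for comparing word embeddings.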

  • @YuraZavadenko
    @YuraZavadenko 9 months ago

    What about BoW?

  • @princegoyal1843
    @princegoyal1843 4 months ago

    Hi, can you please tell us your name? Looking forward to learning more from you.

  • @josephvanname3377
    @josephvanname3377 1 year ago

    So when are we going to construct word embeddings from good old fashioned pictographs?

  • @HikmetYolcusu
    @HikmetYolcusu 6 months ago

    Why is there a background soundtrack during the lecture? Does it help with learning or focus? I find it kind of distracting and feel rushed.

  • @uvurgun
    @uvurgun 1 year ago

    Thanks for sharing! It would have been great to remove the background music.

  • @DrLouMusic
    @DrLouMusic 1 year ago

    Stoppppppp the awwwwwfffffuuuuulllll music!!!!! It's beyond distracting 😊

  • @sdsunjay
    @sdsunjay 10 months ago

    I was hoping this video would cover BERT as it can be used to generate embeddings. Bidirectional Encoder Representations from Transformers (BERT) is a family of language models introduced in 2018 by researchers at Google. However I do see there is another video about BERT: ua-cam.com/video/6ahxPTLZxU8/v-deo.html

  • @mehmetaliozer2403
    @mehmetaliozer2403 1 year ago

    Excellent tutor, but the music distracted me so much 😄

    • @AssemblyAI
      @AssemblyAI  1 year ago +2

      Sorry about that! Wish we could take it back 🤷‍♀️

    • @DanielTorres-gd2uf
      @DanielTorres-gd2uf 1 year ago

      @@AssemblyAI I personally like both

  • @caiyu538
    @caiyu538 11 months ago

    From this lecture, ua-cam.com/video/OATCgQtNX2o/v-deo.html at 0:23, my understanding is that the embedding is produced by the transformer encoder. But from your lecture, the embedding comes before the transformer, at 6:45. I also noticed that in your lecture the embedding is for a single word, while in the HF lecture it seems to be for a whole sentence. I am still confused. Sorry if my question is naive.