Word2Vec, GloVe, FastText- EXPLAINED!
- Published 31 May 2024
- Let's talk about word2vec architectures (CBOW, Skip-gram, GloVe, FastText)
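For readers skimming the description: the two original word2vec training setups differ in what they predict. CBOW predicts a center word from its surrounding context words, while skip-gram predicts each context word from the center word. A minimal sketch of how the training pairs are generated (toy sentence and window size, not code from the video):

```python
def training_pairs(tokens, window=2):
    """Generate (context, target) examples for CBOW and
    (center, context) pairs for skip-gram from one sentence."""
    cbow, skipgram = [], []
    for i, center in enumerate(tokens):
        # Context = words within `window` positions of the center word.
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        cbow.append((context, center))   # CBOW: predict center from context
        for c in context:                # Skip-gram: one pair per context word
            skipgram.append((center, c))
    return cbow, skipgram

sentence = "the quick brown fox".split()
cbow, sg = training_pairs(sentence, window=1)
```

Note how skip-gram produces several training examples per position, which is one reason it tends to do better on rare words at the cost of slower training.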
SPONSOR
Get 20% off and be a part of a Premium Software Engineering Community for career advice and guidance: www.jointaro.com/r/ajayh486/
ABOUT ME
⭕ Subscribe: ua-cam.com/users/CodeEmporiu...
📚 Medium Blog: / dataemporium
💻 Github: github.com/ajhalthor
👔 LinkedIn: / ajay-halthor-477974bb
RESOURCES
[1 🔎] A Neural Probabilistic Language Model (Bengio et al., 2003): www.jmlr.org/papers/volume3/b...
[2 🔎] Word2Vec (original paper): arxiv.org/pdf/1301.3781.pdf
[3 🔎] GloVe paper: nlp.stanford.edu/pubs/glove.pdf
[4 🔎] FastText: arxiv.org/pdf/1607.04606.pdf
[5 🔎] LSTM Video: • LSTM Networks - EXPLAI...
[6 🔎] Transformer Video: • Transformer Neural Net...
[7 🔎] BERT video: • BERT Neural Network - ...
[8 🔎] ChatGPT: openai.com/blog/chatgpt
PLAYLISTS FROM MY CHANNEL
⭕ Transformers from scratch playlist: • Self Attention in Tran...
⭕ ChatGPT Playlist of all other videos: • ChatGPT
⭕ Transformer Neural Networks: • Natural Language Proce...
⭕ Convolutional Neural Networks: • Convolution Neural Net...
⭕ The Math You Should Know: • The Math You Should Know
⭕ Probability Theory for Machine Learning: • Probability Theory for...
⭕ Coding Machine Learning: • Code Machine Learning
MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: imp.i384100.net/MathML
📕 Calculus: imp.i384100.net/Calculus
📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
📕 Linear Algebra: imp.i384100.net/LinearAlgebra
📕 Probability: imp.i384100.net/Probability
OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
📕 Python for Everybody: imp.i384100.net/python
📕 MLOps Course: imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): imp.i384100.net/NLP
📕 Machine Learning in Production: imp.i384100.net/MLProduction
📕 Data Science Specialization: imp.i384100.net/DataScience
📕 Tensorflow: imp.i384100.net/Tensorflow
Just love your videos, man! Thanks for helping! I was scared of learning attention mechanisms at first, but now everything is clear. It just helps a lot while working on projects when you know what is going on behind the scenes.
That’s such a great explanation, thank you!
Your explanation is superb, brother, one of the best.
Thank you for all the effort you put in to make these subjects easy and accessible. As a newbie, could you please tell me where I should start?
Great video!
quality video. thanks 👍
Thanks, man!
Thank you for this video :)
My pleasure :)
@CodeEmporium Your new, more historical series is a great contribution to YouTube :D
When you say “n-gram vector”, do you mean “bag-of-words vector”? I always thought it was the latter and haven’t heard the former.
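On the distinction raised in this comment: a bag-of-words vector counts individual words with order ignored, while an n-gram vector counts contiguous sequences, either of words or, as in FastText's subword model, of characters within a word. A toy illustration (hypothetical example strings, not from the video):

```python
from collections import Counter

def ngrams(seq, n):
    """Contiguous n-grams of any sequence (a word list or a string)."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

tokens = "to be or not to be".split()
bow = Counter(tokens)                      # bag of words: unigram counts, order ignored
word_bigrams = Counter(ngrams(tokens, 2))  # word bigrams keep local word order

# FastText-style character trigrams of one word, with boundary markers < and >:
char_trigrams = ["".join(g) for g in ngrams("<where>", 3)]
```

The character n-grams are what let FastText build vectors for words it never saw during training, by summing the vectors of their subword pieces.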
Didn't this guy have a Discord? Does anyone know if it's still up, or could anyone send it? I'd love to be a part of this wonderful YouTuber's community.
I am a newbie to ML, so forgive me if I am wrong. I was going through your video and something didn't make sense to me. At 2:09 you said it is a 100 x 1 vector, but later at 2:38 you mentioned it's a 1 x 100 vector. I think the latter is correct, right @CodeEmporium?
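On the shape question in this comment: a 100 x 1 column vector and a 1 x 100 row vector hold the same numbers; which one you write depends on whether the one-hot vector multiplies the embedding matrix from the left or the right. A toy NumPy sketch (small made-up sizes, not the video's 100 dimensions):

```python
import numpy as np

V, d = 5, 3                                        # toy vocab size, embedding dim
E = np.arange(V * d, dtype=float).reshape(V, d)    # embedding matrix: one row per word

one_hot = np.zeros((1, V))                         # row vector, shape (1, V)
one_hot[0, 2] = 1.0                                # select word with index 2

row_result = one_hot @ E                           # (1, V) @ (V, d) -> (1, d)
col = one_hot.T                                    # the same data as a (V, 1) column vector
```

Either convention works as long as the matrix multiplication shapes line up; the lookup just picks out one row of the embedding matrix.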
What makes these different from tokenizers?
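On this question: a tokenizer only maps text to integer ids, while an embedding is a separate lookup from those ids to dense vectors; word2vec, GloVe, and FastText are ways of *learning* that lookup table. A hypothetical toy sketch (made-up vocabulary and vectors):

```python
# Tokenizer step: text -> integer ids (no meaning attached to the numbers).
vocab = {"the": 0, "cat": 1, "sat": 2}
ids = [vocab[w] for w in "the cat sat".split()]

# Embedding step: id -> dense vector (these are the values training learns).
embeddings = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
vectors = [embeddings[i] for i in ids]
```

So a tokenizer changes the representation without learning anything about word similarity; the embedding table is where "king is close to queen" lives.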
Sir, you are a Kannadiga? Wow! Do you work in the USA, sir? Or are you pursuing a degree?
my saviour.
Anytime haha
❤ from Bengaluru
Hearts from me too! Thanks for commenting fellow Kannadiga :)
Sir, are you from Karnataka?
Yep :)