20 papers to master Language modeling?
- Published 11 Jun 2024
- Here are 20 research papers you should read to master Language modeling.
ABOUT ME
⭕ Subscribe: ua-cam.com/users/CodeEmporiu...
📚 Medium Blog: / dataemporium
💻 Github: github.com/ajhalthor
👔 LinkedIn: / ajay-halthor-477974bb
PAPERS
[1 📚] A Mathematical Theory of Communication: people.math.harvard.edu/~ctm/...
[2 📚] A Neural Probabilistic Language Model: www.jmlr.org/papers/volume3/b...
[3 📚] NLP (Almost) from scratch: www.jmlr.org/papers/volume12/...
[4 📚] Phoneme Recognition using Time Delay Neural Networks: www.cs.toronto.edu/~fritz/abs...
[5 📚] Efficient Estimation of Word Representations in Vector Space: arxiv.org/pdf/1301.3781.pdf
[6 📚] GloVe: nlp.stanford.edu/pubs/glove.pdf
[7 📚] Enriching word vectors with subword information: arxiv.org/pdf/1607.04606.pdf
[8 📚] A Convolutional Neural Network for Modelling Sentences: arxiv.org/pdf/1404.2188.pdf
[9 📚] Learning Internal Representations by error propagation: apps.dtic.mil/dtic/tr/fulltex...
[10 📚] Sequence Modeling (from the deep learning book): www.deeplearningbook.org/cont...
[11 📚] Long Short-Term Memory: www.bioinf.jku.at/publications...
[12 📚] Colah's blog post on understanding LSTMs: colah.github.io/posts/2015-08...
[13 📚] Training Recurrent Neural Networks (PhD Thesis): www.cs.utoronto.ca/~ilya/pubs...
[14 📚] Deep contextualized word representations: arxiv.org/pdf/1802.05365.pdf
[15 📚] Attention is all you need: arxiv.org/pdf/1706.03762.pdf
[16 📚] BERT: arxiv.org/pdf/1810.04805.pdf
[17 📚] Improving language understanding by generative pretraining: s3-us-west-2.amazonaws.com/op...
[18 📚] Language Models are Unsupervised Multitask Learners: d4mucfpksywv.cloudfront.net/b...
[19 📚] Language models are few shot learners: arxiv.org/pdf/2005.14165.pdf
[20 📚] Sentence BERT: arxiv.org/pdf/1908.10084.pdf
[21 📚] ChatGPT blog: openai.com/blog/chatgpt
[22 📚] Llama-2: scontent-lax3-1.xx.fbcdn.net/...
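Several of the later papers in this list (notably [15] Attention Is All You Need and everything after it) build on one core equation: scaled dot-product attention, softmax(QKᵀ/√d_k)V. As a minimal illustrative sketch in NumPy (not code from any of the papers; variable names and shapes are my own choices):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns the attended output and the attention weights.
    """
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled to keep
    # softmax gradients well-behaved as d_k grows.
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V, weights

# Tiny example: 3 queries attending over 5 key/value pairs.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 2))
out, attn = scaled_dot_product_attention(Q, K, V)
```

Each row of `attn` is a probability distribution over the keys, so every output row is a convex combination of the rows of `V` — that is the whole mechanism the Transformer, BERT, and GPT papers stack and scale.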
MY VIDEOS ON THESE TOPICS
💻 Convolution in NLP: • Convolution in NLP
💻 LSTM: • LSTM Networks - EXPLAI...
💻 Transformers: • Transformer Neural Net...
💻 Coding Transformers (playlist): • Transformers from scratch
💻 BERT: • BERT Neural Network - ...
💻 GPT: • GPT - Explained!
💻 ChatGPT: • ChatGPT - Explained!
💻 Llama: • Llama - EXPLAINED!
MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: imp.i384100.net/MathML
📕 Calculus: imp.i384100.net/Calculus
📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
📕 Linear Algebra: imp.i384100.net/LinearAlgebra
📕 Probability: imp.i384100.net/Probability
OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
📕 Python for Everybody: imp.i384100.net/python
📕 MLOps Course: imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): imp.i384100.net/NLP
📕 Machine Learning in Production: imp.i384100.net/MLProduction
📕 Data Science Specialization: imp.i384100.net/DataScience
📕 Tensorflow: imp.i384100.net/Tensorflow
#chatgpt #deeplearning #machinelearning #bert #gpt
This is exactly what I want in a video.
A concise list of the timeline of the thing in question.
This is the best way to learn most significant knowledge realms.
Everyone out there needs to make one of these for their own discipline.
Thanks for the kind words. And yes I agree :)
@CodeEmporium Then how do we get all the other channels to sum up their own topics this way, so I can be a more effective autodidact?
Amazing work!
Awesome content mate!!!!
Thank you for providing these papers
Amazing that AFAICS nobody has yet done this. Understanding the historical development is a really good way to get into the headspace of the SotA. +1
Thanks so much! And yes I completely agree. :)
Nice sharing👍🏻
My pleasure! Thanks for watching and commenting
Thanks a lot! This is exactly what I was looking for.
Super glad you found what you need! Thanks so much for watching!
I really like this kind of content. Waiting for 20 papers for other ML topic from you
Definitely! Thank you so much for commenting and watching !
Amazing content!
Thanks so much for commenting
Thanks, this was helpful. However, if you had mentioned each paper's publication year, it would be even more helpful.
Could you please make the same for computer vision?
You have great videos on Transformers from scratch — please make a video on building an LM from scratch.
Thank you so much for watching that “Transformers from scratch “ playlist!
If Amazon will deliver an LLM Assembly Kit, what will I receive and how much will I pay?
wtf dude