20 papers to master Language modeling?

  • Published Jun 11, 2024
  • Here are 20 research papers you should read to master Language modeling.
    ABOUT ME
    ⭕ Subscribe: ua-cam.com/users/CodeEmporiu...
    📚 Medium Blog: / dataemporium
    💻 Github: github.com/ajhalthor
    👔 LinkedIn: / ajay-halthor-477974bb
    PAPERS
    [1 📚] A Mathematical Theory of Communication: people.math.harvard.edu/~ctm/...
    [2 📚] A Neural Probabilistic Language Model: www.jmlr.org/papers/volume3/b...
    [3 📚] Natural Language Processing (Almost) from Scratch: www.jmlr.org/papers/volume12/...
    [4 📚] Phoneme Recognition using Time Delay Neural Networks: www.cs.toronto.edu/~fritz/abs...
    [5 📚] Efficient Estimation of Word Representations in Vector Space: arxiv.org/pdf/1301.3781.pdf
    [6 📚] GloVe: nlp.stanford.edu/pubs/glove.pdf
    [7 📚] Enriching word vectors with subword information: arxiv.org/pdf/1607.04606.pdf
    [8 📚] A Convolutional Neural Network for Modelling Sentences: arxiv.org/pdf/1404.2188.pdf
    [9 📚] Learning Internal Representations by error propagation: apps.dtic.mil/dtic/tr/fulltex...
    [10 📚] Sequence Modeling (from the deep learning book): www.deeplearningbook.org/cont...
    [11 📚] Long Short-Term Memory: www.bioinf.jku.at/publications...
    [12 📚] Colah's blog on understanding LSTMs: colah.github.io/posts/2015-08...
    [13 📚] Training Recurrent Neural Networks (PhD Thesis): www.cs.utoronto.ca/~ilya/pubs...
    [14 📚] Deep contextualized word representations: arxiv.org/pdf/1802.05365.pdf
    [15 📚] Attention is all you need: arxiv.org/pdf/1706.03762.pdf
    [16 📚] BERT: arxiv.org/pdf/1810.04805.pdf
    [17 📚] Improving Language Understanding by Generative Pre-Training: s3-us-west-2.amazonaws.com/op...
    [18 📚] Language Models are Unsupervised Multitask Learners: d4mucfpksywv.cloudfront.net/b...
    [19 📚] Language Models are Few-Shot Learners: arxiv.org/pdf/2005.14165.pdf
    [20 📚] Sentence BERT: arxiv.org/pdf/1908.10084.pdf
    [21 📚] ChatGPT blog: openai.com/blog/chatgpt
    [22 📚] Llama-2: scontent-lax3-1.xx.fbcdn.net/...
    MY VIDEOS ON THESE TOPICS
    💻 Convolution in NLP: • Convolution in NLP
    💻 LSTM: • LSTM Networks - EXPLAI...
    💻 Transformers: • Transformer Neural Net...
    💻 Coding Transformers (playlist): • Transformers from scratch
    💻 BERT: • BERT Neural Network - ...
    💻 GPT: • GPT - Explained!
    💻 ChatGPT: • ChatGPT - Explained!
    💻 Llama: • Llama - EXPLAINED!
    MATH COURSES (7 day free trial)
    📕 Mathematics for Machine Learning: imp.i384100.net/MathML
    📕 Calculus: imp.i384100.net/Calculus
    📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
    📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
    📕 Linear Algebra: imp.i384100.net/LinearAlgebra
    📕 Probability: imp.i384100.net/Probability
    OTHER RELATED COURSES (7 day free trial)
    📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
    📕 Python for Everybody: imp.i384100.net/python
    📕 MLOps Course: imp.i384100.net/MLOps
    📕 Natural Language Processing (NLP): imp.i384100.net/NLP
    📕 Machine Learning in Production: imp.i384100.net/MLProduction
    📕 Data Science Specialization: imp.i384100.net/DataScience
    📕 Tensorflow: imp.i384100.net/Tensorflow
    #chatgpt #deeplearning #machinelearning #bert #gpt

COMMENTS • 23

  • @KAZVorpal
    @KAZVorpal 9 months ago +9

    This is exactly what I want in a video.
    A concise list of the timeline of the thing in question.
    This is the best way to learn the most significant knowledge realms.
    Everyone out there needs to make one of these for their own discipline.

    • @CodeEmporium
      @CodeEmporium 9 months ago +1

      Thanks for the kind words. And yes I agree :)

    • @KAZVorpal
      @KAZVorpal 9 months ago +1

      @@CodeEmporium Then how do we get all the other channels to sum up their own topics this way, so I can be a more effective autodidact?

  • @debjyotimukherjee8275
    @debjyotimukherjee8275 9 months ago +2

    Amazing work!

  • @deiro04
    @deiro04 9 months ago

    Awesome content mate!!!!

  • @mansoorsoomro8585
    @mansoorsoomro8585 1 month ago

    Thank you for providing these papers

  • @pi5549
    @pi5549 9 months ago +2

    Amazing that AFAICS nobody has yet done this. Understanding the historical development is a really good way to get into the headspace of the SotA. +1

    • @CodeEmporium
      @CodeEmporium 9 months ago

      Thanks so much! And yes I completely agree. :)

  • @umisyifaschoolvlog
    @umisyifaschoolvlog 9 months ago +2

    Nice sharing👍🏻

    • @CodeEmporium
      @CodeEmporium 9 months ago

      My pleasure! Thanks for watching and commenting

  • @Joy_jester
    @Joy_jester 9 months ago +1

    Thanks a lot, this is exactly what I was looking for.

    • @CodeEmporium
      @CodeEmporium 9 months ago

      Super glad you found what you need! Thanks so much for watching!

  • @UNTITLED-ex1wd
    @UNTITLED-ex1wd 9 months ago +1

    I really like this kind of content. Waiting for 20 papers on other ML topics from you.

    • @CodeEmporium
      @CodeEmporium 9 months ago

      Definitely! Thank you so much for commenting and watching!

  • @SW-nx4jz
    @SW-nx4jz 9 months ago

    Amazing content!

    • @CodeEmporium
      @CodeEmporium 9 months ago

      Thanks so much for commenting

  • @devasp009
    @devasp009 5 months ago

    Thanks, this was helpful. However, mentioning each paper's publication year would have made it even more helpful.

  • @user-ig8di6qc3p
    @user-ig8di6qc3p 9 months ago

    Could you please make the same for computer vision?

  • @aregpetrosyan465
    @aregpetrosyan465 9 months ago +2

    You have a great video series on Transformers from scratch; please make a video on language models from scratch.

    • @CodeEmporium
      @CodeEmporium 9 months ago

      Thank you so much for watching that "Transformers from scratch" playlist!

  • @amparoconsuelo9451
    @amparoconsuelo9451 9 months ago

    If Amazon will deliver an LLM Assembly Kit, what will I receive and how much will I pay?

  • @nguyenchitoanddww
    @nguyenchitoanddww 9 months ago

    Hùng Akikiki