How AI (like ChatGPT) understands word sequences.

  • Published 5 Jun 2024
  • Medium Article for this video: / ais-sentence-embedding...
    SPONSOR
    Get 20% off and be a part of a Premium Software Engineering Community for career advice and guidance: www.jointaro.com/r/ajayh486/
    ABOUT ME
    ⭕ Subscribe: ua-cam.com/users/CodeEmporiu...
    📚 Medium Blog: / dataemporium
    💻 Github: github.com/ajhalthor
    👔 LinkedIn: / ajay-halthor-477974bb
    DETAILED VIDEOS ON TOPICS DISCUSSED
    [1 🔴] Curse of Dimensionality: • Curse of Dimensionalit...
    [2 🔴] Time Delay Neural Networks + Dynamic Convolution Neural Networks: • Convolution in NLP
    [3 🔴] Word2Vec video: • Word2Vec, GloVe, FastT...
    [4 🔴] LSTM Video: • LSTM Networks - EXPLAI...
    [5 🔴] Transformer Video: • Transformer Neural Net...
    [6 🔴] BERT video: • BERT Neural Network - ...
    [7 🔴] Sentence Transformers video: • Sentence Transformers ...
    [8 🔴] ChatGPT video: • ChatGPT - Explained!
    PLAYLISTS FROM MY CHANNEL
    ⭕ Transformers from scratch playlist: • Self Attention in Tran...
    ⭕ ChatGPT Playlist of all other videos: • ChatGPT
    ⭕ Transformer Neural Networks: • Natural Language Proce...
    ⭕ Convolutional Neural Networks: • Convolution Neural Net...
    ⭕ The Math You Should Know: • The Math You Should Know
    ⭕ Probability Theory for Machine Learning: • Probability Theory for...
    ⭕ Coding Machine Learning: • Code Machine Learning
    MATH COURSES (7 day free trial)
    📕 Mathematics for Machine Learning: imp.i384100.net/MathML
    📕 Calculus: imp.i384100.net/Calculus
    📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
    📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
    📕 Linear Algebra: imp.i384100.net/LinearAlgebra
    📕 Probability: imp.i384100.net/Probability
    OTHER RELATED COURSES (7 day free trial)
    📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
    📕 Python for Everybody: imp.i384100.net/python
    📕 MLOps Course: imp.i384100.net/MLOps
    📕 Natural Language Processing (NLP): imp.i384100.net/NLP
    📕 Machine Learning in Production: imp.i384100.net/MLProduction
    📕 Data Science Specialization: imp.i384100.net/DataScience
    📕 TensorFlow: imp.i384100.net/Tensorflow

COMMENTS • 9

  • @pw7225 10 months ago +8

    I teach AI. And you do a great job at explaining these concepts well. Kudos.

    • @CodeEmporium 10 months ago

      Thanks a ton for the compliments

  • @NeoShameMan 10 months ago +1

    IMHO the next step WILL be to detach the "database" function of the feed-forward layer of the GPT architecture, but that means being able to retrieve and distinguish the syntactic vs. the ontological aspects of the data. Syntactic knowledge is what gives the reasoning power; the ontology is the facts. Given that the power of the system is to "tag" words with hidden classes (or types), the difficulty is that both syntax and ontology are expressed the same way, as a "bag of words": noun -> class/type of word that refers to objects (syntactic); capital -> class/type of place with a specific meaning (ontology).

  • @softwine91 9 months ago +1

    Amazing content as always. You mentioned that BERT provides word embeddings. However, it also provides sentence embeddings via the [CLS] token, and this has been shown to work very well on text classification tasks (see the sketch after the comments).

  • @paimeg 9 months ago +1

    Great stuff as usual. Add that to the history section of your upcoming "Survival manual from our AI overlord" compendium.

    • @CodeEmporium 9 months ago

      Hahaha! Yesss! Thanks so much for commenting :)

  • @MrMehrd 10 months ago +2

    You make these concepts so easy to understand.

    • @CodeEmporium 10 months ago +1

      Thank you for the compliments!
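
To illustrate the [CLS]-token point from @softwine91's comment above: a minimal sketch of pulling a sentence embedding out of BERT with the Hugging Face transformers library. The model name, example sentence, and library choice are assumptions for illustration, not from the video.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed checkpoint for illustration; any BERT model works the same way.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "Sentence embeddings capture meaning beyond individual words."
inputs = tokenizer(sentence, return_tensors="pt")  # adds [CLS] and [SEP]

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size).
# Position 0 is the [CLS] token; its vector is commonly used as a
# fixed-size sentence embedding for classification tasks.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768])
```

For classification, this 768-dimensional [CLS] vector is typically fed to a small linear head and fine-tuned end to end; dedicated sentence-embedding models (e.g. Sentence Transformers, covered in video [7] above) often use mean pooling over all token vectors instead.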