Sequencing - Turning sentences into data (NLP Zero to Hero - Part 2)

  • Published 24 Feb 2020
  • Welcome to Zero to Hero for Natural Language Processing using TensorFlow! If you’re not an expert on AI or ML, don’t worry -- we’re taking the concepts of NLP and teaching them from first principles with our host Laurence Moroney (@lmoroney).
    In the last video you learned about how to tokenize words using TensorFlow’s tools. In this video you’ll take that to the next step -- creating sequences of numbers from your sentences, and using tools to process them to make them ready for teaching neural networks.
    Codelab → goo.gle/tfw-nlp2
    NLP Zero to Hero playlist → goo.gle/nlp-z2h
    Subscribe to the TensorFlow channel → goo.gle/TensorFlow
  • Science & Technology
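The workflow the description refers to — building a word index, turning sentences into integer sequences, and padding them to a uniform length — can be sketched without TensorFlow. The video itself uses `tf.keras.preprocessing.text.Tokenizer` and `pad_sequences`; the helper names and example sentences below are illustrative assumptions, not taken from the codelab.

```python
# Dependency-free sketch of tokenize -> sequence -> pad.
# Mimics the behavior of Keras' Tokenizer (oov_token) and pad_sequences.

def fit_word_index(sentences, oov_token="<OOV>"):
    """Assign each word an integer, reserving index 1 for out-of-vocabulary."""
    index = {oov_token: 1}
    for sentence in sentences:
        for word in sentence.lower().split():
            if word not in index:
                index[word] = len(index) + 1
    return index

def texts_to_sequences(sentences, index):
    """Map each sentence to a list of word indices; unseen words map to OOV."""
    return [[index.get(w, index["<OOV>"]) for w in s.lower().split()]
            for s in sentences]

def pad_sequences(seqs, padding="post"):
    """Zero-pad every sequence to the length of the longest one."""
    maxlen = max(len(s) for s in seqs)
    if padding == "post":
        return [s + [0] * (maxlen - len(s)) for s in seqs]
    return [[0] * (maxlen - len(s)) + s for s in seqs]

sentences = ["I love my dog", "I love my cat", "Do you think my dog is amazing"]
index = fit_word_index(sentences)
padded = pad_sequences(texts_to_sequences(sentences, index))
```

Padding (and the OOV token) is what makes the sequences rectangular and robust to unseen words, so they can be fed to a neural network as a single tensor.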

COMMENTS • 55

  • @SudeepDasguptaiginition • 4 years ago • +16

    best TensorFlow basic tutorials ever.

  • @Cdaprod • 9 months ago • +11

    As a self-taught {whatever I am} who believes in open source, I appreciate these videos more than you can imagine. Life-changing knowledge right here, folks.

    • @TensorFlow • 9 months ago • +1

      Glad you enjoyed the video! Here is the playlist to catch up on the whole series → goo.gle/nlp-z2h

  • @streetjesus6846 • 2 years ago • +5

    I love the way he teaches this, so intriguing

  • @dc3itlearn471 • 1 year ago • +2

    I loved the edx course you taught and hope you will teach another one on NLP and tensorflow js. Thx!

  • @krishnachauhan2822 • 4 years ago • +3

    You are amazing sir, absolutely amazing

  • @Continentalky • 2 years ago • +3

    Super helpful series of videos. I have read so many articles, but this has been the clearest and easiest to understand.

  • @rajansaharaju1427 • 3 years ago

    Loved the trick! Awesome!

  • @oktoniuschian9831 • 3 years ago • +2

    very clear, awesome sir

  • @Tracks777 • 4 years ago • +8

    lovely stuff

  • @narendrapratapsinghparmar91 • 6 months ago

    Thanks for your efforts.

  • @vishalsaha2341 • 2 years ago

    Wow, great explanation!

  • @prabhatupadhyay7526 • 3 years ago

    Thanks for teaching; it helped me a lot.

  • @mithunchandrasaha403 • 3 years ago

    Nice explanation, sir.

  • @emrahmete6507 • 4 years ago • +5

    What a simple explanation. Looking forward to next video and really want to see word2vec and BERT as simple as possible.

  • @codderrrr606 • 1 year ago • +2

    Will those who don't have any knowledge of deep learning and ML also be able to understand this playlist to a great extent?

  • @rasoulnorouzi3657 • 4 years ago • +9

    waiting and excited for seq2seq models ❤

  • @neginhadisadegh6232 • 1 year ago

    The best thing ever🤩🤩

  • @rishabhanand4270 • 4 years ago

    dang, can't wait

  • @Tracks777 • 4 years ago • +11

    amazing stuff

  • @sabarieswaran7403 • 11 months ago • +1

    nice job

  • @ashokbiswas6390 • 8 days ago

    great, I wish I had seen this earlier

  • @sprucesunday4536 • 6 months ago

    I love this course but I will need an exercise on NLP part of speech tagging

  • @bryancc2012 • 4 years ago

    Which book should I buy for the most up-to-date information on this part? NLP with TF 2? Thanks.

  • @nishalk781 • 4 years ago

    Thanks for the video

  • @ege5785 • 2 months ago

    I wish that one day I could have a chance to work with Mr. Moroney.

  • @VibhootiKishor • 4 years ago • +3

    Great!!! Google is playing a vital role in developing the entire globe 👍

  • @Ricocase • 3 years ago

    Can I simply import a dictionary api of some kind that identifies all English words beforehand?

  • @aravindravindranatha4260 • 3 years ago

    Can we find text similarity using this?

  • @muhammadzubair2109 • 4 years ago

    ❤️

  • @balakrishnakumar1588 • 4 years ago • +21

    Waiting for word embedding

  • @yashasvibhatt1951 • 3 years ago • +2

    If this is how Google teaches things, then I am ready to pay them the money I might have used to buy a PS5. Simply awesome.

  • @sheikzaidh9321 • 4 years ago

    Is word2vec better than the tokenizer? Which one is good?

  • @aravindravindranatha4260 • 4 years ago

    Can anyone help me? I need to know how to find text similarity.

  • @deepakdakhore • 4 years ago

    Nice explanation

  • @neginfazlialishah5771 • 3 years ago

    I can't understand why sequencing is needed. Can't we train the model just by tokenizing the words, without using the sequencing code?

  • @ayushigupta5420 • 4 years ago

    waiting for the next video in the series.

  • @dalehu5606 • 3 years ago • +2

    like me doing my hearing test

  • @kelvinsmith4894 • 4 years ago • +1

    lol, why does this look like a series of linear equations in a matrix where the missing variables are represented by 0s?

  • @sanjaykrish8719 • 4 years ago • +2

    import tensorflow as tf

  • @nambeoriviu • 4 years ago • +1

    Really?

  • @amarjeetkushwaha4258 • 4 years ago

    121 views