TensorFlow Tutorial 11 - Text Classification - NLP Tutorial
- Published 11 Oct 2024
- Implement a Sentiment Classification algorithm in TensorFlow and analyze Twitter data! Learn how to use NLP (Natural Language Processing) techniques like a Tokenizer and Word Embeddings to preprocess text data, and then create an RNN model with Keras to classify the tweets.
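The pipeline described above can be sketched roughly like this. All hyperparameters (vocabulary size, embedding dimension, LSTM units, tweet length) are made-up placeholders, not the values from the video; the real code is in the GitHub repo linked below.

```python
# Rough sketch: Embedding + LSTM + sigmoid output for binary tweet classification.
import tensorflow as tf
from tensorflow.keras import layers

num_unique_words = 10000   # vocabulary size (assumption)
max_length = 20            # padded tweet length (assumption)

model = tf.keras.Sequential([
    layers.Embedding(num_unique_words, 32),   # word index -> 32-dim vector
    layers.LSTM(64, dropout=0.1),             # read the tweet as a sequence
    layers.Dense(1, activation="sigmoid"),    # binary sentiment score in [0, 1]
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# A batch of 2 padded tweets of length 20 maps to 2 probabilities.
out = model(tf.zeros((2, max_length), dtype="int32"))
print(out.shape)  # (2, 1)
```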
~~~~~~~~~~~~~~ GREAT PLUGINS FOR YOUR CODE EDITOR ~~~~~~~~~~~~~~
✅ Write cleaner code with Sourcery: sourcery.ai/?u... *
Get my Free NumPy Handbook:
www.python-eng...
⭐ Join Our Discord : / discord
📓 ML Notebooks available on Patreon:
/ patrickloeber
If you enjoyed this video, please subscribe to the channel:
▶️ : / @patloeber
Course material is available on GitHub:
github.com/pat...
RNN in TensorFlow:
• TensorFlow Tutorial 10...
RNN in Depth:
• PyTorch RNN Tutorial -...
Data:
www.kaggle.com...
Regex Tutorial:
• Regular Expressions in...
Links:
www.tensorflow...
www.tensorflow...
~~~~~~~~~~~~~~~ CONNECT ~~~~~~~~~~~~~~~
🖥️ Website: www.python-eng...
🐦 Twitter - / patloeber
✉️ Newsletter - www.python-eng...
📸 Instagram - / patloeber
🦾 Discord: / discord
▶️ Subscribe: / @patloeber
~~~~~~~~~~~~~~ SUPPORT ME ~~~~~~~~~~~~~~
🅿 Patreon - / patrickloeber
Music: www.bensound.com/
Photo by Sara Kurfeß on Unsplash: unsplash.com
Python
Course Parts:
01 TensorFlow Installation
02 TensorFlow Tensor Basics
03 TensorFlow Neural Net
04 TensorFlow Linear Regression
05 TensorFlow CNN (Convolutional Neural Nets)
06 TensorFlow Save & Load Models
07 TensorFlow Functional API
08 TensorFlow Multi-output Project
09 TensorFlow Transfer Learning
10 TensorFlow RNN / LSTM / GRU
11 TensorFlow NLP
TensorFlow 2, Keras, Deep Learning, TensorFlow Course, TensorFlow Beginner Course, TensorFlow Tutorial
----------------------------------------------------------------------------------------------------------
This is a sponsored or an affiliate link. By clicking on it you will not have any additional costs, instead you will support me and my project. Thank you so much for the support! 🙏
I feel much more confident going into the TF cert exam after finishing your playlist. Danke Patrick!
omg, your TensorFlow series is very good for beginners to understand how to begin training their models. I hope you can make some development tutorials. Thank you so much
Thanks for your awesome videos, some GAN videos would be helpful.
Will try to do this in the future
Many thanks! Very clear explanation, I like it
Thanks 🙏🏻
thanks for this video, with it I can learn both NLP and English
Thanks for your valuable content. Kindly do some NLP tasks like NER and a BERT implementation; that would be highly useful.
Yes very interesting topics
Hi, thanks for the video. I just have one question: what is your recommendation to fix the overfitting in the model?
just amazing
:)
How can you get the prediction and validation to those numbers? What is the formula to get those numbers?
I recognize different lengths in train_sentences and train_sequences (at 12:xx). The length of sentence 3 and sentence 5 do not match with their sequence length. Can you please explain this?
Another great video. Just a question: in the real world, when processing natural language, do we always convert training words into numbers first before feeding them to the model? Like in this example, you convert "flood bago myanmar arrived bago" into [99, 3742, 612, 1451, 3742]. Basically, we can't use real words in the model?
No, you always somehow have to map the words to numbers so that the model can understand them. There are different ways of doing this...
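As a rough illustration of that mapping (the two example sentences are made up, and this sketch uses Keras' TextVectorization layer rather than the Tokenizer utility used in the video, but the idea is the same):

```python
# Map each word to an integer index based on a small, made-up corpus.
import tensorflow as tf

sentences = [
    "flood bago myanmar arrived bago",
    "forest fire near la ronge",
]

vectorizer = tf.keras.layers.TextVectorization(output_mode="int")
vectorizer.adapt(sentences)        # build the word -> index vocabulary
sequences = vectorizer(sentences)  # replace each word by its index

print(vectorizer.get_vocabulary()[:3])  # ['', '[UNK]', 'bago', ...]
print(sequences.numpy())                # same word -> same integer everywhere
```

Note how "bago" appears twice in the first sentence, so the same integer shows up twice in its sequence.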
Please can you do a video on tweet sentiment analysis to determine suicidal classification using NLP
I'll add it to my list :)
Notification Gang 🔥🔥🔥
Yeah
Hey man, I am getting this error (NotImplementedError: Cannot convert a symbolic Tensor (lstm_11/strided_slice:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported). Can anyone help me out or do you mind sharing which versions of tensor and numpy you used while coding this exercise?
how can I export your model to use in another application?
Question: why did we use padding to fix the sequence length? LSTMs/RNNs can deal with variable sequence lengths... am I missing something?
Or is the reason that the embedding layer input expects a matrix with batch size and max input length? ====>>> model.add(layers.Embedding(num_unique_words, 32, input_length=max_length))
# The layer will take as input an integer matrix of size (batch, input_length),
# and the largest integer (i.e. word index) in the input should be no larger than num_words (vocabulary size).
We should use masking or padding for an RNN. In this case I used padding explicitly. And yes, if input_length is used then all inputs must be of the same size.
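A minimal sketch of that padding step (the sequences and max_length here are made up for illustration):

```python
# Pad variable-length integer sequences to one fixed length so they can be
# stacked into a single (batch, max_length) matrix for the Embedding layer.
import tensorflow as tf

sequences = [[99, 3742, 612], [5, 7, 12, 3, 44, 8, 21]]  # made-up word indices
max_length = 6

padded = tf.keras.utils.pad_sequences(
    sequences, maxlen=max_length, padding="post", truncating="post"
)
print(padded)
# Row 0 is filled with trailing zeros, row 1 is cut down to 6 entries,
# so every row now has the same length.
```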
When you say helper functions, next time please also explain how they work!
How about categories on a document or the title of a paragraph? What method do we use?
edit:
What I saw was only 2 categories this whole time; how about 3 or more categories?
Text classification using tensorflow
ua-cam.com/play/PL-N0_7SF7nTqOQdTzLRIRvyGJW-msR3Q4.html&feature=shared
Hi thank you for your nice work, can I ask for the code?
Thank you I had found it on the link to github
yep almost all the code to my videos is on github
At 7:09 this no longer works; one of the functions, maybe. Fixed version:
from collections import Counter
def counter_word(text_col):
    # use the column passed in, not the global df
    count = Counter()
    text_col.str.lower().str.split().apply(count.update)
    return count
counter = counter_word(df.text)
Why didn't we use test sentences in the tutorial to check the prediction?
My mistake. I should have used the test data in the end...
@@patloeber that's okay. Just wanted to check if my understanding was correct. And thanks for your videos. They are amazing brother