Thank you for this video, I came from Udemy... I would like to know how to implement RNNs and LSTMs for multilabel models. I'm working with NLP; I've seen multilabel classification with CNNs on images, but I would like to see it explained for text data before jumping to the advanced videos.
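A minimal sketch of what a multilabel LSTM text classifier could look like in Keras, not taken from the video. It assumes the text is already integer-encoded and padded; VOCAB_SIZE, MAX_LEN, and NUM_LABELS are hypothetical placeholders, and the labels are multi-hot vectors (a sample can have several labels at once).

```python
# Hypothetical multilabel text classification sketch with an LSTM in Keras.
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 10000   # assumed vocabulary size
MAX_LEN = 100        # assumed padded sequence length
NUM_LABELS = 5       # assumed number of labels per sample (multi-hot targets)

model = Sequential([
    Input(shape=(MAX_LEN,)),
    Embedding(VOCAB_SIZE, 128),
    LSTM(64),
    # Sigmoid (not softmax) so each label is predicted independently of the others.
    Dense(NUM_LABELS, activation="sigmoid"),
])
# binary_crossentropy treats every label as its own yes/no decision,
# which is what multilabel classification needs.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

The only real difference from the single-label case is the sigmoid output with one unit per label and the binary cross-entropy loss.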
Sir, can you please help me with how to add multiple LSTM layers for a classification task? Just a code snippet would be very helpful. Thank you.
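A minimal sketch of stacking LSTM layers for a binary classifier, under the same assumptions as above (VOCAB_SIZE and MAX_LEN are placeholders). The key detail is return_sequences=True on every LSTM except the last, so each layer passes the full sequence of hidden states to the next one.

```python
# Hypothetical stacked-LSTM classifier sketch in Keras.
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 10000
MAX_LEN = 100

model = Sequential([
    Input(shape=(MAX_LEN,)),
    Embedding(VOCAB_SIZE, 128),
    LSTM(64, return_sequences=True),   # passes the whole sequence to the next LSTM
    LSTM(32),                          # last LSTM returns only its final hidden state
    Dense(1, activation="sigmoid"),    # binary classification head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```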
In your understanding, what does input_shape = (X_train.shape[1],) do? Does it set up X_train.shape[1] neurons in the first input layer?
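Not quite: input_shape=(X_train.shape[1],) only declares the shape of one input sample, i.e. a flat vector with X_train.shape[1] features. The number of neurons in the first layer comes from the layer's units argument, not from input_shape. A small illustration with assumed toy data:

```python
# Hypothetical example: input_shape declares the feature count, units sets the neuron count.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

X_train = np.random.rand(1000, 20)  # assumed data: 1000 samples, 20 features each

model = Sequential([
    # Accepts 20-feature vectors (from input_shape) and has 32 neurons (from units=32).
    Dense(32, activation="relu", input_shape=(X_train.shape[1],)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```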
If I have already vectorized the text data into 300-dimensional vectors (Google's Word2Vec 300), do I still need the embedding layer?
No, in that case you don't need it.
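A minimal sketch of skipping the Embedding layer when the word vectors are precomputed (e.g. Google's 300-dim Word2Vec). It assumes each document is already a padded sequence of word vectors with shape (MAX_LEN, 300); MAX_LEN is a placeholder. An alternative is to keep an Embedding layer initialized with the Word2Vec matrix and set trainable=False.

```python
# Hypothetical sketch: feed precomputed 300-dim word vectors straight into an LSTM.
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

MAX_LEN = 100  # assumed number of tokens per (padded) document

model = Sequential([
    Input(shape=(MAX_LEN, 300)),      # sequences of 300-dim Word2Vec vectors, no Embedding layer
    LSTM(64),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```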
Thank you... Can you do a document classification (long text) video using BERT?
Thanks for watching. I will try for sure.
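In the meantime, a minimal sketch (an assumption, not the video's code) of loading a BERT classifier with Hugging Face Transformers. BERT itself is capped at 512 tokens, so long documents are truncated here; sliding windows or a long-input model such as Longformer are common alternatives. The checkpoint name and documents are placeholders, and the classification head is untrained until you fine-tune it.

```python
# Hypothetical BERT document-classification setup with Hugging Face Transformers (TF backend).
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"   # assumed checkpoint; any BERT variant works the same way
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = TFAutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

docs = ["a very long document ...", "another long document ..."]  # hypothetical documents
# Long texts are cut off at BERT's 512-token limit.
enc = tokenizer(docs, truncation=True, max_length=512, padding=True, return_tensors="tf")

logits = model(dict(enc)).logits            # shape: (num_docs, num_labels)
probs = tf.nn.softmax(logits, axis=-1)      # class probabilities (head is untrained here)
print(probs.numpy())
```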
What if I have 3 classes: positive, negative, and neutral?
How do I modify it for multi-class classification?
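A minimal sketch of adapting the binary LSTM classifier to 3 mutually exclusive classes (positive, negative, neutral); VOCAB_SIZE and MAX_LEN are placeholders. The two changes are a softmax output with one unit per class and a categorical loss.

```python
# Hypothetical 3-class version of the LSTM text classifier.
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 10000
MAX_LEN = 100
NUM_CLASSES = 3  # positive, negative, neutral

model = Sequential([
    Input(shape=(MAX_LEN,)),
    Embedding(VOCAB_SIZE, 128),
    LSTM(64),
    # softmax over 3 mutually exclusive classes instead of a single sigmoid unit
    Dense(NUM_CLASSES, activation="softmax"),
])
# sparse_categorical_crossentropy expects integer labels 0, 1, 2;
# use categorical_crossentropy instead if the labels are one-hot encoded.
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```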
Bro, please make a full video going from data preprocessing to hyperparameter tuning of our RNN-LSTM model.
Bro, do videos on churn prediction ML.
As a beginner, can you guide me on where I can start so I can follow the teaching more easily?
Please start from lesson one.