This is amazing! You literally made the topic interesting, I never got bored throughout the entire video!
I'll definitely stick to your channel in preparing for my thesis!
I'm literally binge-watching your videos like Netflix!
This is the first subscription I have ever made on YouTube... and that's only because of your teaching skills. You make any topic very interesting and provide excellent information on research and its techniques. Please upload videos on autoencoders.
Same here! 😊😊
Thanks!
you are 200% better than my professor in explaining Word Embedding
Really a great explanation. I played just one video, but after that I watched 4 more videos continuously, which helped me a lot to prepare for the interview I was going to have tomorrow. Share your divine knowledge with us, sir!
I thought I would never learn deep learning because it is too complicated, but you have explained it so effectively. Within a week, LSTM will be added as a skill on my resume!
I think I found a gem! Thank you Krish! You really make it so easy to understand.
I don't usually subscribe to YouTube channels... but this first video I watched from you got me.
His channel and StatQuest are 2 of the best resources for ML and data science on YouTube.
Word Embedding is such a masterpiece!
Keep posting great content. It's worth sharing your content with everyone! Thank you!!!
Awesome video!! You are an excellent teacher. I wish I had a teacher like you in my Master's program right now!
Sir your explanations are fantastic
Amazing way of teaching sir !! Great work. Thanks alot
Can't wait for the next video.....
Thanks A Lot .....
Super great! Thanks very much, Krish. It is a very good learning video; instructor Krish is great, passionate, and exciting. The lesson is very interesting.
Nice, clean description. Well done, Krish!!!
Krish, you save my life every time
These videos are adding to my knowledge. Thank you, @krish naik sir!
The hottest topic in NLP. I learned it after putting in a lot of time.
I am sharing what I learned:
For embedding we use
the one-hot encoding technique,
the Word2vec technique, or
the embedding layer of Keras (after pad sequencing).
You can use it for a recurrent neural network.
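The pipeline this comment lists can be sketched in plain Python (a toy example; the corpus and vocabulary are made up, and a real project would use a tokenizer and Keras utilities instead): integer-encode the words, then pad the sequences to one length so they are ready for an embedding layer feeding an RNN.

```python
# Toy sketch of the preprocessing pipeline described above:
# integer-encode a tiny made-up corpus, then pad the sequences
# so they can feed an embedding layer of an RNN.

corpus = ["the king is royal", "the queen is royal too"]

# Build a word -> integer index (0 is reserved for padding)
vocab = {}
for sentence in corpus:
    for word in sentence.split():
        if word not in vocab:
            vocab[word] = len(vocab) + 1

def encode(sentence):
    """Replace each word with its integer index."""
    return [vocab[w] for w in sentence.split()]

def pad(seq, max_length):
    """Right-pad a sequence with zeros up to max_length."""
    return seq + [0] * (max_length - len(seq))

max_length = max(len(s.split()) for s in corpus)
padded = [pad(encode(s), max_length) for s in corpus]
print(padded)  # [[1, 2, 3, 4, 0], [1, 5, 3, 4, 6]]
```

In Keras the same two steps are usually done with `Tokenizer` and `pad_sequences`; the padded integer matrix is what an `Embedding` layer consumes.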
Thanks krish
Excellent! Love from USA
you are a gem man , love your style
Your speciality is teaching from scratch sir
Great explanation ❤
@Krish - How did you get these features (like gender, royal, age, etc.)? Are these features mentioned anywhere in word embedding?
You are simply amazing sir....hats off to you💯💯
Subscribed !
Thanks a lot. Hopefully your videos will be helpful for my thesis project!
Thank you for sharing this wealth of knowledge with us. TBH I’ve been having issues grasping how word embeddings work. It’s so clear that they are not as straightforward as basic vectorization approaches like BoW and TF-IDF. Nevertheless, I gained something from this video.
Sir that's amazing 😍😊
What an awesome explanation. Thank you @Krish
Super, sir, I enjoyed it before the exam.
applauding for you !! thank you again!
Thank you, Sir. Please make a video explaining how a sentence is translated with a neural system, but with an example. Thank you again, you are amazing.
Perfect explanation... thanks a lot
You have done a great job. There are students like me who really need such an explanation to get a rough image... at least. Thanks a lot.
By rough image I mean I got an idea, and now I can study more on it and try to clear up the image created by your wonderful explanation.
Good explanation krish sir
Your explanations are great, thank you
You are amazing❤️
These days I'm watching at least 5-6 videos every day.
amazing brother thank you
good explanation sir
Good content, keep doing videos like this!
Nicely explained. Thanks for making things clear.
Very well explained
Try showing the embedding projector. It’s an interesting way to visualise embeddings and it sparks interest.
Excellent tutorials .
Great explanation
Thanks for the tutorial on word embedding models. I wonder how features are selected in these models? I think in some particular cases, having control over customizing these features might improve the chance of getting more similar words than just using the pretrained ones.
Wonderful Video Krish
Sir, kindly guide: how can I use pretrained word embedding models for local languages (or languages written in Roman script) that are not available/trained in the pretrained model? Do I have to use a (non-pretrained) embedding layer to create embedding matrices for a local language? How can I benefit from pretrained models for a local language?
thanks this is awesome
excellent
no words for you sir.. 👌👌
Would you please make a video on the gradient boosting and XGBoost ML algorithms with all the maths...
Will upload soon
Thanks krish for the video.
thank you so much sir
Amazing explanation thank you!
Please make a video on dimensionality reduction techniques to reduce data from higher dimensions down to 2 dimensions (with coding).
Nice explanation, sir; always waiting for new videos. We are looking forward to your book publication.
Thank you sir
very good, thank you sir !
Sir, can you please tell me how to choose the number of embedding dimensions?
E.g., vocab_size = 10000, max_length = 120, and embedding_dim = ??
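There is no single correct answer to this question. One widely cited rule of thumb (an assumption to validate per task, not something from the video) is to take roughly the fourth root of the vocabulary size:

```python
# Rule-of-thumb sketch (a heuristic, not a fixed rule):
# embedding_dim is often chosen near vocab_size ** 0.25,
# then tuned up or down depending on the task and data size.
vocab_size = 10000   # from the question above
max_length = 120     # sequence length does not affect embedding_dim

embedding_dim = round(vocab_size ** 0.25)
print(embedding_dim)  # 10 for a 10,000-word vocabulary
```

In practice values like 50, 100, or 300 are also common because pretrained vectors (GloVe, word2vec) ship in those sizes; the heuristic just gives a starting point for training from scratch.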
Many thanks
superb ... Is the word embedding fixed or generated for every dataset given for training ?
How do we decide features (gender, age...) in this technique
Simply awesome.
An LSTM practical session is also required, sir.
Sir please make a video on sentiment analysis using VADER.
It was really helpful. Can you make videos on grammar correction using rule-based methods, language models & classifiers?
It's really hard to understand otherwise.
@Krish Naik, at 7:08 I would like to know if you can share some material on the technique we use to relate features such as gender to words like boy and girl, but not to apples and mangoes. I have doubts about what technique makes it possible for a machine to learn the relation between features and words. Thanks.
Good day, may I ask how to define the specific feature dimensions (for example, I want to extract linguistic features such as part-of-speech tags, word density, and word frequency) that are going to be vectorized?
Sir, kindly make a video on how we can embed source code into vectors and use them for training a DL model.
Sir, how are these parameters decided the way you did it, like gender, royal, age, food, etc.?
Sir, please make a video on the data scientist career: is a degree required, or is certification enough for a job?
Hi Krish, First of all thanks for making this content.
I have some doubts:
1. In predicting an analogy (king -> queen), if we take up some other feature instead of gender, then the result may not necessarily be true (or does it select the best feature to give results?).
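The analogy doubt above can be played with on toy vectors. In this sketch the two dimensions ("gender" and "royal") and all values are hand-crafted for illustration, not learned: with these features, king - man + woman lands exactly on queen, and choosing different features would indeed change the geometry.

```python
# Toy analogy sketch with hand-crafted 2-d embeddings.
# Dimensions are [gender, royal]; the values are illustrative only.
words = {
    "king":  (-1.0, 1.0),
    "queen": ( 1.0, 1.0),
    "man":   (-1.0, 0.0),
    "woman": ( 1.0, 0.0),
}

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def nearest(target, exclude):
    """Closest word by squared Euclidean distance, skipping query words."""
    def dist(w):
        return sum((x - y) ** 2 for x, y in zip(words[w], target))
    return min((w for w in words if w not in exclude), key=dist)

# king - man + woman ≈ ?
target = add(sub(words["king"], words["man"]), words["woman"])
result = nearest(target, exclude={"king", "man", "woman"})
print(result)  # queen
```

Real models like word2vec learn these dimensions from co-occurrence statistics rather than from labelled features, which is why the learned axes rarely correspond to a single nameable property like "gender".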
I would like to know how those coefficients in the vertical columns can be obtained automatically, and what scientific assumptions and premises should be used for doing it. As I see it now, it can be done manually by logical consideration. Thanks!
Are you able to use NLP to build a Siri-type app in your own foreign language? Thanks for the tutorial.
Is word embedding similar to a pandas pivot table, except that we provide features here?
Hey, in word embedding, how are features defined? Are they extracted from the document itself, or is there a predefined feature set available for the related domain or the whole language?
Waiting for your facenet embedding... and clustering process
You haven't made any video on gradient boosting yet... Actually, the boosting series is incomplete.
Please make video on GB technique
What kind of mathematics and statistics is required for a data science career? Sir, please make a video on this topic.
Basic level
What is the difference between the word embedding layer and the Word2Vec class?
Can you make a video on balancing an imbalanced text dataset?
Sir, when you said that we related gender to boy, is this done by the machine, or is it predefined? I mean to say, on what basis does it all come about that gender is related to boy and girl, and royal is related to queen and king??? Thank you, sir.
Sir, you left out the most important part, Word2vec; please do cover it.
And also cover the maximum-300-dimensions idea; it's really difficult for me to get.
Can you explain the last video in this "Introduction to Word Embeddings" series, "Embeddings matrix"?
Hi krish,
What is the parameter update equation in SVM and logistic regression?
This is really, really amazing. Thank you for your efforts. Can you make a video on sentence embedding as well?
There is no such thing as sentence embedding, as far as I know.
Can you please make a video on a 'Hybrid Deep Learning Model for Sentiment Classification', that is, an implementation of CNN and LSTM together for sentiment classification?
Pytorch or Tensorflow 2.0 which one is better for beginners?
Keras
Hello, suppose we need to add more features to our X which are not text; i.e., suppose we get a sparse matrix after CountVectorizer, and now we have one more feature, length, and we want both features. How do we combine both?
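One common way to do what this question asks (an illustrative sketch, not necessarily what the video covers; the count values and the "length" column are made up) is to horizontally stack the sparse count matrix with the extra numeric column using `scipy.sparse.hstack`, so the result stays sparse:

```python
# Sketch: combine a sparse count matrix with an extra numeric feature.
import numpy as np
from scipy.sparse import csr_matrix, hstack

# Pretend output of CountVectorizer: 2 documents x 3 vocabulary terms
counts = csr_matrix(np.array([[1, 0, 2],
                              [0, 3, 1]]))

# One extra non-text feature per document, e.g. document length
lengths = np.array([[4], [9]])

# Stack column-wise; keeping csr format preserves sparse efficiency
X = hstack([counts, csr_matrix(lengths)], format="csr")
print(X.shape)  # (2, 4)
```

Scaling the dense feature before stacking (e.g. with a standard scaler) is usually worthwhile, since raw lengths can dwarf the count values.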
👏👏
I don’t really get word embedding. Does it work outside mainstream English? For example, medical language is different. If I am studying medical literature, a lot of my core vocabulary is medical terms. What is your opinion on this?
Hey, can you please explain on what basis you are deciding these vector values, like 0.01, 0.03, etc.?
Refer to the word2vec video on the ritvikmath YouTube channel. Thank me later 😇😆
Sir, kindly make some basic videos on Pandas.
Check my complete ML playlist
I am facing an issue trying to install nltk and spacy: it's asking to downgrade TensorFlow from version 2 to version 1.x. What can be done to install them without downgrading TF?
How do you get features for a different domain?
I have one doubt, can you clear it up: how does it assign the values for opposite genders, like -1 for boy and 1 for girl?