Word Embedding - Natural Language Processing| Deep Learning

  • Published Feb 3, 2025

COMMENTS • 134

  • @zahraaberjawi2541
    @zahraaberjawi2541 4 years ago +19

    This is amazing! You literally made the topic interesting, I never got bored throughout the entire video!
    I'll definitely stick to your channel in preparing for my thesis!

  • @ruchitagarde4642
    @ruchitagarde4642 4 years ago +42

    I'm literally binge watching your videos like Netflix!

  • @bharathiramkumar166
    @bharathiramkumar166 4 years ago +33

    This is the first subscription I have made on YouTube... that's only because of your teaching skills... you make any topic very interesting and provide excellent information on research and its techniques... Please upload videos on autoencoders.

  • @deepaktanna
    @deepaktanna 3 years ago

    Thanks!

  • @1217Yangli
    @1217Yangli 3 years ago +1

    you are 200% better than my professor in explaining Word Embedding

  • @dhanushsabbisetty9379
    @dhanushsabbisetty9379 4 months ago

    Really a great explanation. I played just one video, but after that I watched 4 more videos in a row, which helped me a lot to prepare for the interview I was going to have tomorrow. Share your divine knowledge with us, sir!

  • @rajathslr
    @rajathslr 3 years ago

    I thought I would not learn Deep Learning because it is too complicated, but you have explained it so effectively. Within a week, LSTM will be added as a skill on my resume!

  • @josephselwan1652
    @josephselwan1652 3 years ago +1

    I think I found a gem! Thank you Krish! You really make it so easy to understand.

  • @patrickadjei9676
    @patrickadjei9676 4 years ago

    I don't usually subscribe to YouTube channels... but this first video I watched from you got me.

  • @Utkarshkharb
    @Utkarshkharb 3 years ago

    His channel and StatQuest are 2 of the best resources for ML and Data Science on YouTube.

  • @premranjan4440
    @premranjan4440 3 years ago

    Word Embedding is such a masterpiece!

  • @anjalia8792
    @anjalia8792 3 years ago +1

    Keep posting great content. It's worth sharing your content with everyone! Thank you!!!

  • @cesaranzola4828
    @cesaranzola4828 2 years ago +1

    Awesome video!! You are an excellent teacher. I wish I had a teacher like you in my Master's program right now!

  • @usashichatterjee6489
    @usashichatterjee6489 3 years ago +1

    Sir your explanations are fantastic

  • @puneetbansal8567
    @puneetbansal8567 2 years ago +1

    Amazing way of teaching, sir!! Great work. Thanks a lot

  • @informetica7394
    @informetica7394 4 years ago +2

    Can't wait for the next video.....
    Thanks A Lot .....

  • @WoodyDataAI
    @WoodyDataAI 2 years ago

    Super great! Thanks very much, Krish. It is a very good learning video; instructor Krish is great, passionate, exciting. The lesson is very interesting.

  • @DS_AIML
    @DS_AIML 4 years ago +1

    Nice, clean description. Well done, Krish!!!

  • @GhizlaneBOUSKRI
    @GhizlaneBOUSKRI 3 years ago

    Krish, you save my life every time

  • @PriyaM-og6ji
    @PriyaM-og6ji 3 years ago

    These videos are adding up my knowledge. Thank you, @krish naik sir!

  • @shivamkumar-qp1jm
    @shivamkumar-qp1jm 4 years ago

    The hottest topic of NLP; I learnt it after putting in a lot of time.
    I am sharing what I learnt.
    For embedding we use:
    the one-hot encoding technique,
    the Word2vec technique, and
    the embedding layer of Keras, after pad sequencing.
    You can use it for a recurrent neural network.
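The pipeline the comment above lists (integer/one-hot encoding, padding, then an embedding lookup) can be sketched in plain Python. This is not Keras itself; the corpus, padding length, and random embedding values below are illustrative assumptions.

```python
# Sketch of: word index -> integer sequences -> pre-padding -> embedding lookup.
import random

corpus = ["the glass of milk", "the glass of juice", "the cup of tea"]

# 1. Integer ("one-hot index") encoding: each word gets a unique id; 0 = padding.
vocab = {}
for sentence in corpus:
    for word in sentence.split():
        vocab.setdefault(word, len(vocab) + 1)

sequences = [[vocab[w] for w in s.split()] for s in corpus]

# 2. Pad every sequence to the same length (pre-padding with 0s, Keras-style).
max_len = 6
padded = [[0] * (max_len - len(seq)) + seq for seq in sequences]

# 3. An embedding layer is essentially a trainable lookup table: id -> dense vector.
embedding_dim = 4
random.seed(0)
table = [[random.uniform(-1, 1) for _ in range(embedding_dim)]
         for _ in range(len(vocab) + 1)]
embedded = [[table[idx] for idx in seq] for seq in padded]

print(padded[0])            # [0, 0, 1, 2, 3, 4]
print(len(embedded[0][0]))  # 4 -- each word is now a 4-dimensional dense vector
```

In a real model, the lookup table would be trained by backpropagation rather than left random.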

  • @Trouble.drouble
    @Trouble.drouble 4 years ago +3

    Thanks, Krish

  • @kumarpurantripathy1145
    @kumarpurantripathy1145 2 years ago

    Excellent! Love from USA

  • @gautammalik1911
    @gautammalik1911 2 years ago

    You are a gem, man. Love your style

  • @Trouble.drouble
    @Trouble.drouble 4 years ago

    Your speciality is teaching from scratch, sir

  • @attitudekillersaqib
    @attitudekillersaqib 1 month ago

    Great explanation ❤

  • @DS_AIML
    @DS_AIML 4 years ago +10

    @Krish - How did you get these features (like Gender, Royal, Age, etc.)? Are these features mentioned anywhere in word embedding?
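On the question above: axes like Gender/Royal/Age are a teaching device; in a trained model the dimensions are learned automatically and carry no human-readable labels. A toy sketch with hand-made 3-d vectors on those invented axes (all values below are made up for illustration) shows why the intuition still works.

```python
# Hand-crafted "feature" vectors on invented [gender, royal, age] axes,
# used only to illustrate the king - man + woman ~ queen analogy.
import math

vec = {
    "king":  [-1.0, 0.95, 0.7],
    "queen": [ 1.0, 0.95, 0.7],
    "man":   [-1.0, 0.02, 0.5],
    "woman": [ 1.0, 0.02, 0.5],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# king - man + woman should land closest to queen.
target = [k - m + w for k, m, w in zip(vec["king"], vec["man"], vec["woman"])]
best = max(vec, key=lambda w: cosine(target, vec[w]))
print(best)  # queen
```

Real systems like word2vec learn hundreds of such dimensions from co-occurrence statistics; none of them is literally named "gender".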

  • @prakharsingh9279
    @prakharsingh9279 4 years ago +8

    You are simply amazing, sir... hats off to you💯💯

  • @abidtaqi3842
    @abidtaqi3842 4 years ago +1

    Subscribed!
    Thanks a lot. Hopefully your videos will be helpful for my thesis project!

  • @Ogunbiyi_Ibrahim
    @Ogunbiyi_Ibrahim 8 months ago

    Thank you for sharing this wealth of knowledge with us. TBH I've been having issues grasping how word embeddings work. It's clear that they are not as straightforward as basic vectorization approaches like BoW and TF-IDF. Nevertheless, I gained something from this video

  • @FunFusionTrajectory
    @FunFusionTrajectory 1 year ago

    Sir that's amazing 😍😊

  • @nagesh866
    @nagesh866 4 years ago

    What an awesome explanation. Thank you @Krish

  • @machaindrakumar3464
    @machaindrakumar3464 8 months ago

    Super, sir. I enjoyed it before the exam

  • @xiaoweidu4667
    @xiaoweidu4667 4 years ago

    Applauding for you!! Thank you again!

  • @christinavalavani5242
    @christinavalavani5242 2 years ago

    Thank you, Sir. Please make a video explaining how a sentence is being translated with the neural system, but with an example. Thank you again; you are amazing.

  • @mahikdm
    @mahikdm 1 year ago

    Perfect explanation... thanks a lot

  • @dharmendrathakur1487
    @dharmendrathakur1487 4 years ago

    You have done a great job. There are students like me who really need such an explanation to get a rough image... at least. Thanks a lot

    • @dharmendrathakur1487
      @dharmendrathakur1487 4 years ago

      Rough image... means I got an idea, and now I can study more on it and try to clear up the image created by your wonderful explanation

  • @ltrahul1016
    @ltrahul1016 2 years ago

    Good explanation, Krish sir

  • @Fatima-kj9ws
    @Fatima-kj9ws 3 years ago

    Your explanations are great, thank you

  • @shjsjssjksksa6452
    @shjsjssjksksa6452 3 years ago

    You are amazing❤️

  • @Rajesh-xk5kv
    @Rajesh-xk5kv 3 years ago

    Every day I'm watching at least 5-6 videos these days

  • @kin_1997
    @kin_1997 2 years ago

    Amazing, brother. Thank you

  • @rohit_mondal__
    @rohit_mondal__ 3 years ago

    Good explanation, sir

  • @giopez
    @giopez 3 years ago

    Good content, keep doing videos like this!

  • @thesistersrock2538
    @thesistersrock2538 4 years ago

    Nicely explained. Thanks for making things clear

  • @vipulritwik
    @vipulritwik 4 years ago

    Very well explained

  • @vakarsh
    @vakarsh 4 years ago +8

    Try showing the embedding projector. It's an interesting way to visualise embeddings and sparks interest.

  • @muhammadnomankhanassistant3793
    @muhammadnomankhanassistant3793 3 years ago

    Excellent tutorials.

  • @BuddhiChathurangaTheAmigo
    @BuddhiChathurangaTheAmigo 4 years ago

    Great explanation

  • @mohammadkazemibeydokhti5768
    @mohammadkazemibeydokhti5768 3 years ago +3

    Thanks for the tutorial on word embedding models. I wonder how features are selected in these models? I think in some particular cases, having control over customizing these features might increase the chance of getting more similar words than just using the pretrained ones.

  • @ravindarmadishetty736
    @ravindarmadishetty736 4 years ago

    Wonderful Video Krish

  • @Lotof_Mazey
    @Lotof_Mazey 2 years ago +2

    Sir, kindly guide: how can I use pre-trained word embedding models for local languages (or languages written in Roman script) that are not available/trained in the pretrained model? Do I have to use an embedding layer (not pre-trained) to create embedding matrices for any local language? How can I benefit from pretrained models for a local language?

  • @dominikdominik5767
    @dominikdominik5767 3 years ago

    Thanks, this is awesome

  • @ibrahimahmethan586
    @ibrahimahmethan586 2 years ago

    excellent

  • @yukeshnepal4885
    @yukeshnepal4885 4 years ago +2

    No words for you, sir.. 👌👌
    Would you please make a video on the gradient boosting and XGBoost ML algorithms with all the math...

  • @sandipansarkar9211
    @sandipansarkar9211 4 years ago

    Thanks, Krish, for the video.

  • @tagoreji2143
    @tagoreji2143 2 years ago

    thank you so much sir

  • @mavaamusicmachine2241
    @mavaamusicmachine2241 3 years ago

    Amazing explanation thank you!

  • @eswarank3127
    @eswarank3127 4 years ago +1

    Please make a video on a dimensionality reduction technique to reduce from many dimensions down to 2 dimensions (coding)

  • @ukc2704
    @ukc2704 4 years ago

    Nice explanation, sir; always waiting for new videos. We are looking forward to your book publication.

  • @suvarnadeore8810
    @suvarnadeore8810 3 years ago

    Thank you sir

  • @xiaoweidu4667
    @xiaoweidu4667 4 years ago

    very good, thank you sir !

  • @SameerSk
    @SameerSk 4 years ago +4

    Sir, can you please tell me how to choose the number of embedding dimensions?
    E.g., vocab_size = 10000, max_length = 120, and embedding_dim = ??
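There is no single correct answer to the question above. One commonly cited rule of thumb (a heuristic, not a guarantee) sets the embedding dimension near the fourth root of the vocabulary size; in practice people also just try round values like 50, 100, 200, or 300 and tune on validation performance. A minimal sketch under that assumption:

```python
# Heuristic: embedding_dim ~ vocab_size ** 0.25 (an assumption, not a hard rule).
vocab_size = 10000
max_length = 120  # padding length; note it does not influence embedding_dim

embedding_dim = round(vocab_size ** 0.25)
print(embedding_dim)  # 10
```

Larger dimensions can capture more nuance but cost more parameters and risk overfitting on small corpora.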

  • @sandejai
    @sandejai 4 years ago

    Many thanks

  • @parukadli
    @parukadli 2 years ago

    Superb... Is the word embedding fixed, or generated for every dataset given for training?

  • @ArshdeepSingh..
    @ArshdeepSingh.. 2 years ago +1

    How do we decide the features (gender, age, ...) in this technique?

  • @amitmodi7882
    @amitmodi7882 4 years ago

    Simply awesome.

  • @professorbhumika
    @professorbhumika 4 years ago +6

    An LSTM practical session is also required, Sir

  • @shaktisharma5529
    @shaktisharma5529 4 years ago +1

    Sir please make a video on sentiment analysis using VADER.

  • @harikrishnanm5109
    @harikrishnanm5109 4 years ago

    It was really helpful. Can you make videos on grammar correction using a rule-based method, language models, and classifiers?
    It's really hard to understand otherwise

  • @sandeepkumawat4982
    @sandeepkumawat4982 4 years ago

    @Krish Naik, at 7:08 I would like to know if you can share some material on the technique we use to relate features such as gender to words like boy and girl, and not to apple and mango. I have doubts about what technique makes it possible for a machine to learn the relation between features and words. Thanks

  • @arkimanago5284
    @arkimanago5284 3 years ago +2

    Good day, may I ask how to define the specific feature dimensions (for example, I want to extract linguistic features such as part-of-speech tags, word density, and word frequency) that are going to be vectorized?

  • @santoshsaklani5019
    @santoshsaklani5019 2 years ago

    Sir, kindly make a video on how we can embed source code into vectors and use them for training a DL model

  • @DhananjaySharma-p1y
    @DhananjaySharma-p1y 1 year ago

    Sir, how will these parameters (like gender, royal, age, food, etc.) be decided, as you did?

  • @mashakpatel4962
    @mashakpatel4962 4 years ago +1

    Sir, please make a video on the data scientist career: is a degree required, or is a certification enough for a job?

  • @karamvirsingh5670
    @karamvirsingh5670 3 years ago +1

    Hi Krish, first of all, thanks for making this content.
    I have some doubts:
    1. In predicting an analogy (king -> queen), if we take some other feature instead of gender, then the result may not necessarily be true (or does it select the best feature to give results?)

  • @borysprydalnyy8106
    @borysprydalnyy8106 3 years ago

    I would like to know how those coefficients in the vertical columns can be obtained automatically, and what scientific assumptions and premises should be used to do it. As I see it now, it can be done manually by logical consideration. Thanks!

  • @danielmafileo4078
    @danielmafileo4078 4 years ago

    Are you able to use NLP to build a Siri-type app in your own foreign language? Thanks for the tutorial.

  • @shreyaskulkarni5823
    @shreyaskulkarni5823 3 years ago

    Is word embedding similar to a pandas pivot table, except that we provide the features here?

  • @akilabandara137
    @akilabandara137 2 years ago

    Hey, in word embedding, how are the features defined? Are they extracted from the document itself, or is a predefined feature set available for the related domain or the whole language?

  • @DeepakKumar-uz4xy
    @DeepakKumar-uz4xy 4 years ago +1

    Waiting for your FaceNet embedding... and clustering process

  • @priyabratamohanty3472
    @priyabratamohanty3472 4 years ago +3

    You haven't made any video on gradient boosting yet... actually, the boosting series is incomplete.
    Please make a video on the GB technique

  • @mashakpatel4962
    @mashakpatel4962 4 years ago +1

    What kind of mathematics and statistics are required for a data science career? Sir, please make a video on this topic

  • @fitnflex_Anuja
    @fitnflex_Anuja 2 years ago

    What is the difference between the word embedding layer and the Word2Vec class?

  • @naivedhshah2980
    @naivedhshah2980 4 years ago

    Can you make a video on balancing an imbalanced text dataset?

  • @gurdeepsinghbhatia2875
    @gurdeepsinghbhatia2875 4 years ago

    Sir, when you said that we related gender to boy: is this done by the machine, or is it predefined? I mean, on what basis does it all come about that gender is related to boy and girl, and royal to queen and king??? Thank you, sir

  • @chetantanwar8561
    @chetantanwar8561 4 years ago +3

    Sir, you left out the most important part, word2vec. Please do cover it

    • @chetantanwar8561
      @chetantanwar8561 4 years ago +3

      And also do cover the maximum-300-dimensions concept; it's really difficult for me to get

  • @Kmysiak1
    @Kmysiak1 4 years ago

    Can you explain the last video in this "Introduction to Word Embeddings" series, "Embeddings matrix"?

  • @masterstats8064
    @masterstats8064 4 years ago +1

    Hi krish,
    What is the parameter update equation in SVM and logistic regression?

  • @muhammadroman2334
    @muhammadroman2334 4 years ago

    This is really, really amazing. Thank you for your efforts. Can you make a video on sentence embedding as well?

    • @pratyushharsh7186
      @pratyushharsh7186 3 years ago

      There is no such thing as sentence embedding, to my knowledge

  • @ShirishSonvane
    @ShirishSonvane 4 years ago

    Can you please make a video on a 'Hybrid Deep Learning Model for Sentiment Classification', that is, an implementation of CNN and LSTM together for sentiment classification?

  • @torque2123
    @torque2123 4 years ago +1

    PyTorch or TensorFlow 2.0: which one is better for beginners?

  • @tanishbothra5044
    @tanishbothra5044 4 years ago

    Hello. Suppose we need to add more features to our X which are not text. I.e., suppose we get a sparse matrix after CountVectorizer, and now we have one more feature, length, and we want both. How do we combine them?
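One common answer to the question above is to stack the extra numeric column alongside the count-vector columns (with a real sparse matrix, scipy.sparse.hstack serves this purpose). A dense pure-Python sketch, with an illustrative corpus and vocabulary assumed for the example:

```python
# Append an extra numeric feature (document length) as one more column
# next to the count-vector columns. Dense lists stand in for the sparse matrix.
corpus = ["good movie", "bad bad movie", "good good good film"]
vocab = ["bad", "film", "good", "movie"]

# Count-vectorizer-style matrix: one row per document, one column per word.
counts = [[doc.split().count(w) for w in vocab] for doc in corpus]
lengths = [[len(doc.split())] for doc in corpus]  # the extra feature, one column

# Column-wise stack: each row gains the length feature at the end.
X = [row + extra for row, extra in zip(counts, lengths)]
print(X[1])  # [2, 0, 0, 1, 3]
```

If the extra feature is on a very different scale than the counts, scaling it before stacking usually helps downstream models.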

  • @naderbouchnag3
    @naderbouchnag3 2 years ago

    👏👏

  • @r_pydatascience
    @r_pydatascience 3 years ago

    I don't really get word embedding. Does it work outside mainstream English? For example, medical language is different. If I am studying medical literature, a lot of my main vocabulary consists of medical words. What is your opinion on this?

  • @gunjansingh1971
    @gunjansingh1971 4 years ago

    Hey, can you please explain on what basis you are deciding these vector values, like 0.01, 0.03, etc.?

    • @shilashm5691
      @shilashm5691 2 years ago

      Refer to the word2vec video on the ritvikmath YouTube channel. Thank me later😇😆

  • @subhamacharya7084
    @subhamacharya7084 4 years ago +1

    Sir, kindly make some basic videos on Pandas.

    • @krishnaik06
      @krishnaik06  4 years ago +1

      Check my complete ML playlist

  • @Arjun147gtk
    @Arjun147gtk 4 years ago

    I am facing an issue trying to install nltk and spacy: it's asking to downgrade TensorFlow from version 2 to version 1.x. What can be done to install them without downgrading TF?

  • @umesh1032
    @umesh1032 3 years ago

    How do you get the features for different domains?

  • @dharmendrathakur1487
    @dharmendrathakur1487 4 years ago

    I have one doubt; can you clarify? My doubt is how it assigns the values for opposite genders, like -1 for boy and 1 for girl