Conditional Random Fields : Data Science Concepts

  • Published 6 Sep 2024

COMMENTS • 42

  • @spiderkent · 1 year ago · +7

    I like your narrative description of the topic. It is good that you have written everything down beforehand so that you can refer to any part of the formulas to emphasize their relationship. Thank you for the effort, job well done!

  • @jordanblatter1595 · 2 years ago · +3

    I have an assignment on segmenting Chinese words with CRFs due tonight. Perfect timing!

  • @karunesharora3302 · 2 years ago · +7

    It is a wonderful explanation of HMMs and CRFs. It would be great if you could post a separate video dedicated to generative vs. discriminative models, as this becomes the basis for various NLP models.

  • @huyhoannguyen9913 · 2 years ago · +1

    Your explanation is much easier to understand than the course I attended. Keep up the great work, RitVikMath!

  • @johnathancorgan3994 · 2 years ago · +2

    I like the whiteboard presentation style, and your audio was fine.

  • @erickleuro6159 · 2 years ago · +4

    Thank you, great video! I used your other Time-Series video series (no pun intended) to help me with my final project, and they were super helpful!

  • @prodbyryshy · 6 months ago

    This is the best video I've seen on this topic (for beginners) so far.

  • @uansholanbayev5670 · 4 months ago

    Thanks man, finally got it clear.

  • @dragolov · 9 days ago

    Bravo, Master!

  • @user-lx1fc6ii9o · 2 years ago

    Thank you for the great explanations! I have watched several videos in different languages trying to get an intuitive idea of CRF, but unfortunately they all focused on symbolic maths. I do understand the maths, but I just couldn't reach an intuitive understanding from the maths. The comparison with HMM you make helped me a lot, and I have a much clearer picture of what CRF is doing after watching this video. Thanks a lot!

  • @blairt8101 · 2 months ago

    saved my life again!

  • @allendark2982 · 1 year ago

    The best video about CRFs ever!

  • @bilalbayrakdar7100 · 2 years ago

    You are a true pioneer of data science; you make everything understandable. Keep it up!

  • @muhammadal-qurishi7110 · 2 years ago

    Thank you for this video. I have to add something here and correct me if I am wrong: HMM is a general form of Naive Bayes whereas CRF is a general form of Logistic Regression.
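
    A minimal numeric sketch of that correspondence (toy scores invented here, not taken from the video): for a length-1 "sequence", the CRF conditional probability is a softmax over per-label scores, and with two labels that softmax reduces exactly to logistic regression on the score difference.

```python
import math

def crf_prob(scores):
    # P(y | x) = exp(score_y) / Z, where the partition function Z
    # normalizes over all candidate labels
    Z = sum(math.exp(s) for s in scores)
    return [math.exp(s) / Z for s in scores]

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

s0, s1 = 0.3, 1.7              # hypothetical per-label scores w . f(x, y)
p = crf_prob([s0, s1])
# with two labels, the softmax probability of label 1 equals the
# logistic function of the score difference
assert abs(p[1] - logistic(s1 - s0)) < 1e-12
```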

  • @nikhildharap4514 · 2 years ago

    Superb! Just can't thank you enough for these videos. You make the concepts so easy to understand.

  • @user-zn3on9hn6n · 9 months ago

    Thanks Ritvik! The video is so clear, and I've learned a lot!

  • @BiggestITDisasters-br4jy · 4 months ago

    Thanks for this video!

  • @anoop8753 · 2 years ago

    Brilliantly explained

  • @nisharathod2945 · 2 years ago

    You make it sound so easy! Thanks dude

  • @zhenwang5872 · 1 year ago

    Really good work! I found it inspiring.

  • @lexisense · 1 year ago

    Awesome, though it is a bit too technical for a linguist. Could you make it easier, perhaps by adding some examples from an English corpus? Thank you in advance.

  • @CarlosSoto-rn7jc · 1 year ago

    Truly amazing explanation! Thanks!

  • @n1984ster · 2 years ago

    This video talks a lot about feature functions in CRFs, but the HMM video doesn't elaborate on how the feature-function concept relates to HMMs, e.g. what feature functions could be used in an HMM. The HMM video talks about probabilities, but I couldn't find any mention of feature functions. @ritvik

  • @sergioserino1823 · 1 year ago

    At last, I get it! Thank you!

  • @DPCoder · 8 months ago

    That was an awesome explanation. Thanks a lot.

  • @namratanath7564 · 1 year ago

    Why would you want to use CRFs instead of LSTMs?

  • @mikewood8175 · 2 years ago

    Hey, you are great at explaining.
    I'd just like you to make a video on the intuition for why LSTM backprop mitigates the vanishing gradient, and also on backprop in CNN models! I have a really hard time understanding gradient flow in both of these models. Just the intuition would work too.

  • @cherryfan9987 · 3 months ago

    thanks

  • @Giovanni-em7ny · 2 years ago

    You are truly amazing!

  • @zhiyili6707 · 2 years ago

    Thank you for the video. It is really helpful.

  • @kunalnarang1912 · 2 years ago · +1

    Hey Ritvik, great stuff! I have a question: how exactly does one define a different feature function for each timestep in the sequence? Let's say that X, Y_{i-1}, and Y_i are the same, and the only difference is i. Does that mean we have to define a different feature function every time we see that combination in the sequence? Is there an easier way to do this? Is that something we have to define before training the CRF?
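
    A hedged sketch of the usual answer (all names, tags, and weights below are made up for illustration): in a linear-chain CRF one typically does not write a new function per timestep. Each feature is a position-aware template f_k(y_prev, y, x, i) reused at every position with a single shared weight, so the same combination appearing at a different i still goes through the same template.

```python
def f_transition(y_prev, y, x, i):
    # fires whenever NOUN follows DET, at any position i
    return 1.0 if (y_prev, y) == ("DET", "NOUN") else 0.0

def f_emission(y_prev, y, x, i):
    # fires whenever the current word is capitalized and labeled NOUN
    return 1.0 if y == "NOUN" and x[i][0].isupper() else 0.0

templates = [f_transition, f_emission]
weights = [1.2, 0.8]           # one learned weight per template, shared across i

def sequence_score(x, y):
    # unnormalized CRF score: weighted template values summed over positions
    total = 0.0
    for i in range(len(x)):
        y_prev = y[i - 1] if i > 0 else "START"
        total += sum(w * f(y_prev, y[i], x, i)
                     for w, f in zip(weights, templates))
    return total

print(sequence_score(["The", "Dog", "barks"], ["DET", "NOUN", "VERB"]))
```

    Because the templates take i and the whole x as arguments, they can still behave differently at different positions without any per-timestep definitions; the weights are what training learns.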

  • @SEOTADEO · 2 years ago

    Thanks a lot! Helps so much.

  • @j.b.7237 · 2 years ago

    Hi Ritvik, what a great video; in my opinion the most understandable one on YouTube. I still have a question: are the observed states X_i the segmented elements of our data (e.g. words or characters for textual data), or are they already the features? In the paper "An Introduction to Conditional Random Fields" by McCallum (a co-inventor of CRFs), I found a graph example of a CRF where each Y_i had three connections to observations, but each observation had only the connection to Y_i at its timestep.

  • @raghavamorusupalli7557 · 8 months ago

    Hi Ritvik, what about Z?
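
    For readers with the same question, a brute-force sketch (the scoring rule here is a toy invention, not the video's): Z(x), the partition function, is the sum of exp(score) over every possible label sequence, and it is exactly what turns raw CRF scores into a proper probability distribution.

```python
import itertools
import math

labels = ["A", "B"]

def score(y):
    # made-up score: +1 for every adjacent pair of equal labels
    return sum(1.0 for a, b in zip(y, y[1:]) if a == b)

def partition(n):
    # Z = sum of exp(score) over ALL label sequences of length n
    return sum(math.exp(score(y))
               for y in itertools.product(labels, repeat=n))

def prob(y):
    return math.exp(score(y)) / partition(len(y))

# thanks to Z, probabilities over all length-3 sequences sum to 1
total = sum(prob(y) for y in itertools.product(labels, repeat=3))
assert abs(total - 1.0) < 1e-12
```

    Enumerating all sequences is exponential in the sequence length; real CRF implementations compute Z with the forward algorithm instead.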

  • @Hermioneswand1 · 2 years ago

    Thank you for this video, it really helped me out!!
    The audio could be a little louder, though.

  • @n1984ster · 2 years ago

    I couldn't understand very well the drawback of HMMs having static transition and emission probabilities. Could someone please elaborate a bit more?
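
    A toy illustration of what "static" means (all numbers invented): an HMM reuses the same transition and emission tables at every timestep, so a word's contribution to the score depends only on its own label and the previous label, never on the position i or on the rest of the sentence. A CRF's feature functions f(y_prev, y, x, i) can look at both.

```python
# fixed tables, reused identically at every timestep
start = {"DET": 0.6, "NOUN": 0.4}
trans = {"DET": {"DET": 0.1, "NOUN": 0.9},
         "NOUN": {"DET": 0.7, "NOUN": 0.3}}
emit = {"DET": {"the": 0.8, "dog": 0.2},
        "NOUN": {"the": 0.1, "dog": 0.9}}

def hmm_joint(x, y):
    # P(x, y) = p(y1) p(x1|y1) * prod_i p(yi|y_{i-1}) p(xi|yi)
    p = start[y[0]] * emit[y[0]][x[0]]
    for i in range(1, len(x)):
        p *= trans[y[i - 1]][y[i]] * emit[y[i]][x[i]]
    return p

print(hmm_joint(["the", "dog"], ["DET", "NOUN"]))
```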

  • @ChocolateMilkCultLeader · 2 years ago · +1

    When you talk about generative vs. discriminative models, please make sure to include a section on how these models can be combined. The idea that they are mutually exclusive is a huge misunderstanding in machine learning, and something I've covered in my videos and articles. Hope you can cover that idea too!

  • @lipe5331 · 1 year ago

    I love you

  • @antrasen77 · 2 months ago

    why this accent though?😑