The best explanation of GMMs on YouTube
agreed
I really love that you talked about the big picture, what the variables mean, and how they relate to each other. The last 2 minutes of the video really explain everything to me. My professor in school spends most of the time showing us the technicalities of the math, like how to take the derivatives, which seems to totally miss the important points and comes across as showing off.
WOW you are honestly an incredible teacher. My normal experience learning some ~"deep" concept like this is comprehending 30% of the material and needing to search at length to find answers to the questions I have about the remaining 70%. Watching this video, despite the fact that I had almost no background, you paused to answer 95% of the questions I would've otherwise had (e.g. "what is this big pi notation?"). I wish I had teachers like you in college - you truly are making me believe in the value of a competent professor over merely reading textbooks (which is the only way I got any learning done in college).
Whenever I need to understand a new concept, I search your channel. Thank God, you have this!
The best way to explain a concept is by keeping it simple. Hands down the best explanation ever!
Way better than my professor, who couldn't explain it in one semester. You are a great teacher. Thanks very much.
Happy to help!
I love that you talk about its assumptions, the reason for using it vs K means, etc. All the important questions answered.
Thanks!
Bro, I just turned on notifications for this channel. You are a blessing sent from the heavens!
Wow, thanks!
The way you explained it is so practical as well as theoretical, and on top of that easy to understand. Respect to you, sir. Thanks a lot!
You are most welcome
Man, I SWEAR, you are a legend. I watched videos from faculty at very famous universities, and they didn't explain it half as clearly and simply as you did.
THANK YOU, THANK YOU, THANK YOU :-)
*Probably the best explanation out there. Taught better than my professor.*
You are the most underrated machine learning conceptual lecturer I can find on youtube...
Couldn't understand GMM in my expensive six-month-long semester classes, but understood it now in 7.5 minutes by watching this free video at 2x! Thanks Ritvik! Weird world we live in lol
I really, really appreciate the effort in making this video. It truly helped me understand the Gaussian mixture model.
thank you for the kind words :)
I came here looking for clarity on an adjacent topic that is quite different from what you were teaching, but now I have a good idea of how my own troublesome matter is supposed to work. Thanks for this awesome explanation!
Really gives a simple but clear explanation of the model! Better than the book's explanation!
I came from the Deep Learning book (Goodfellow, Bengio et al.), and I must say you have done a phenomenal job explaining GMM as compared to that literature.
The best resource for understanding this topic. I've been searching for two days and this is the most useful explanation I've encountered so far. Thank you.
By far the best explanation I found!
Thanks!!!!!!
Dude, you broke it down so clearly even I could understand it! Well done!
Keep it going! Your handwritten notes with that color scheme could be even more impactful if you used OneNote. You can get a writing tablet for little money, and it really is a game-changer. Try it if you fancy! Anyhow, please keep doing these videos.
Thank you so much! Watching your EM algorithm video before watching this one helped me understand things even better.
Glad it helped!
Wow! God bless you for explaining it so beautifully!
Purely logical and very smooth explanation!
Wow, you helped me on my journey toward understanding the spectral mixture kernel, which requires knowledge of Gaussian mixtures. Thank you!
No problem !
The most easy-to-understand video about GMMs on YouTube
Thanks a lot for explaining this concept so much more simply. Hats off to you!
You are most welcome
One of the best explanations on this topic
Very clear introduction. I have understood Gaussian mixture models now.
Thanks a ton! I'm obsessed with this channel. Would love to watch you explain 'Deep Learning' topics.
You should be our professor at our university.
My new favorite stats channel! Can you make a video on multivariate normal distribution? You referenced it after all 😀
I wish I had you as one of my profs, someone who could make the concept easy and teach it, rather than complicating it with all the math.
Thanks a lot, I have my exam tomorrow, and I am saved.
Incredible explanation, thank you.
You're very welcome!
Thank you for another clear explanation
This was a fantastic explanation!
Best simple explanation👏
I wish I found your channel earlier. Great content !
Hello,
Would it be possible for you to make videos on belief propagation algorithms?
Your videos are of great quality, and considering there aren't really any good-quality explanations out there, it would be really helpful.
Thanks sir, outstanding explanation!
Great video! Can you please consider making one on Gaussian Processes?
Thank you very much! That was a very clear explanation!
The punch line is, "That's the Gaussian Mixture Model in a nutshell."
Anyway, great video. Thank you!
Very well done ! Thanks for your explanation !
Great video, outstanding job. It got me 99% of the way to understanding the concept. I have one question:
During Expectation-Maximization, you are recomputing mu_k, Sigma_k, and pi_k.
For Sigma_k, what does the variable T denote? Transpose?
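For anyone with the same question: yes, that T is a transpose. Each term (x_i − mu_k)(x_i − mu_k)^T is an outer product, which is what makes Sigma_k come out as a D×D matrix. A minimal NumPy sketch of those M-step updates (hypothetical variable names; `gamma` is the N×K responsibility matrix from the E-step):

```python
import numpy as np

def m_step(X, gamma):
    """M-step of EM for a Gaussian mixture.
    X: (N, D) data; gamma: (N, K) responsibilities from the E-step."""
    N, D = X.shape
    K = gamma.shape[1]
    Nk = gamma.sum(axis=0)                       # effective count per component
    pi = Nk / N                                  # mixing weights pi_k
    mu = (gamma.T @ X) / Nk[:, None]             # (K, D) means mu_k
    Sigma = np.zeros((K, D, D))
    for k in range(K):
        diff = X - mu[k]                         # (N, D) deviations from mu_k
        # the ".T" below is the transpose from the formula:
        # Sigma_k = (1/N_k) * sum_i gamma_ik (x_i - mu_k)(x_i - mu_k)^T
        Sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k]
    return pi, mu, Sigma
```

With uniform responsibilities, every component simply recovers the overall sample mean and covariance, which is a handy sanity check.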
Amazing video to learn about GMMs. Do you have any video related to the use of GMMs in sensor fusion?
best explanation ever
Brilliant explanation!
Does capital sigma have to be a 2x2 matrix? Width, length and angle can also be represented by 3 values.
Incredible !! Thanks gigantically
Great explanation thanks
Ritvik, thanks for explaining this. Using the mclust package in R, what are the different model parameterizations (EEV, etc.)? And what is BIC in the context of mixture models? This is an awesome explanation. I'm using mixture models in mclust to classify small groups of clusters that are distinct from the population and that carry significant meaning. Could you please also introduce the Dirichlet concept as an extension to this class? Thank you.
In the multivariate Gaussian mixture, how can I specify the mixing proportions in R?
Great Explanation
The clearest explanation, ever.
Thanks a lot, I really understand it now!!!
Please provide R code for modelling the dependence between trials and successes (the herons case by J. Zhu), including the EM algorithm for the beta-binomial-Poisson mixture model. Please help me.
Does this work for other distributions too, like t-distributions, etc.?
What a great way to use those Crayolas!
Could you please share the source for the math, especially for the derivatives part?
Wow .... keep it up
When you describe the value of P(x) as a "probability that we see x," did you mean "likelihood" instead of "probability"? Because the pdf doesn't give the probability, but rather the likelihood that point x belongs to the distribution.
Thanks for the video though. Good stuff!
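The distinction this comment raises is real: a pdf value is a density, not a probability, and for a narrow Gaussian it can even exceed 1. A quick illustration (assuming SciPy is available; the numbers are just for demonstration):

```python
from scipy.stats import norm

# Density of a N(0, 0.1^2) Gaussian at its mean:
# 1 / (0.1 * sqrt(2*pi)) is about 3.99, clearly not a probability.
d = norm.pdf(0.0, loc=0.0, scale=0.1)
print(d > 1)  # True: pdf values are densities, not probabilities
```

Probabilities only come from integrating the density over an interval, which is why P(x) for a continuous variable at an exact point is better read as a likelihood.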
Does GMM with EM mean that, in the end, we classify each point by the highest value of P(x_i)?
Wow, thank you so much, sir!
the fucking 🐐
Good explanation, but I would have liked you to go over the math a bit more.
This was amazing
Hi Ritvik, great video, but why does gamma have a summation from 1 to K? I think it should be without the summation, since you said it's the probability of being in class k given an observation. Please correct me if I understood that correctly.
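For anyone puzzling over the same formula: the summation over classes appears only in the denominator of gamma. The numerator is the single class-k term, and dividing by the sum over all classes is what normalizes the K responsibilities for one observation to 1. A small sketch of the E-step (hypothetical names; assumes SciPy):

```python
import numpy as np
from scipy.stats import multivariate_normal

def responsibilities(X, pi, mu, Sigma):
    """E-step: gamma[i, k] = pi_k * N(x_i | mu_k, Sigma_k)
    divided by sum_j pi_j * N(x_i | mu_j, Sigma_j)."""
    K = len(pi)
    num = np.column_stack([
        pi[k] * multivariate_normal.pdf(X, mean=mu[k], cov=Sigma[k])
        for k in range(K)
    ])                                            # (N, K): no summation here
    return num / num.sum(axis=1, keepdims=True)   # summation only in the denominator
```

Each row of the result sums to 1, which is exactly what makes gamma a probability of class membership given the observation.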
It's just a hyperparameter, like in the case of K-means.
I came here after breaking my brain reading books for 2 days... seems like I only needed 15 minutes.
So is EM like maximum likelihood?
Are salmon longer than tuna?
Thank you
Welcome!
So helpful!
Thanks!
terrific
ur awesome!
You rock!
Simplified explanation.
Thank you!!!!!!!!!!!!
What is capital N sub k?
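For anyone else wondering: N_k typically denotes the effective number of points assigned to component k, i.e. the responsibilities for component k summed over all observations, so the N_k values add up to N. A tiny sketch with made-up numbers:

```python
import numpy as np

# gamma: (N, K) responsibilities from the E-step (made-up values)
gamma = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.5, 0.5]])
Nk = gamma.sum(axis=0)   # N_k = sum_i gamma(z_ik), one per component
print(Nk)                # approximately [1.6, 1.4]; they total N = 3
```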
pls look at 10:16
Thank you!
Please make a post-pruning video. Thanks!
brilliant
What a boss
Woow
nice :))