The best explanation of GMMs on YouTube
agreed
I really love that you talked about the big picture, what the variables mean, and how they relate to each other. The last 2 minutes of the video really explains everything to me. My professor in school spends most of the time showing us the technicalities in the math part, like how to take the derivatives, which seems to totally miss the important points but is just a big showoff.
WOW you are honestly an incredible teacher. My normal experience learning some ~"deep" concept like this is comprehending 30% of the material and needing to search at length to find answers to the questions I have about the remaining 70%. Watching this video, despite the fact that I had almost no background, you paused to answer 95% of the questions I would've otherwise had (e.g. "what is this big pi notation?"). I wish I had teachers like you in college - you truly are making me believe in the value of a competent professor over merely reading textbooks (which is the only way I got any learning done in college).
The best way to explain a concept is by keeping it simple. Hands down, the best explanation ever!
I love that you talk about its assumptions, the reason for using it vs K means, etc. All the important questions answered.
Thanks!
Whenever I need to understand a new concept, I search your channel. Thank God, you have this!
Way better than my professor, who couldn't explain it in one semester. You are a great teacher. Thanks very much.
Happy to help!
The way you explained it is so practical as well as theoretical, and on top of that easy to understand. Respect to you, sir. Thanks a lot!
You are most welcome
I came here looking for clarity on an adjacent topic that is quite different from what you were teaching, but I have a good idea of how my own troublesome matter is supposed to work. Thanks for this awesome explanation!
Really gives a simple but clear explanation of the model! Better than the book's explanation!
Bro, I just turned on notifications for this channel. You are a blessing sent from the heavens!
Wow, thanks!
You are the most underrated machine learning conceptual lecturer I can find on YouTube...
Man, I SWEAR, you are a legend. I watched videos from faculty at very famous universities, and they didn't explain it half as clearly and simply as you did.
THANK YOU, THANK YOU, THANK YOU :-)
Couldn't understand GMM in my expensive six-month-long semester classes, but understood it now in 7.5 minutes by watching this free video at 2x! Thanks Ritvik! Weird world we live in, lol.
*Probably the best explanation out there. Taught better than my professor.*
I came from the Deep Learning book (Goodfellow, Bengio et al.), and I must say you have done a phenomenal job explaining GMM as compared to that literature.
I really really appreciate the effort in making this video. Truly helped in understanding the GM model
thank you for the kind words :)
Thank you so much, watching your Em model video before watching this one helped me to understand things even better.
Glad it helped!
Wow you help me in my journey on understanding spectral mixture kernel which requires the knowledge in mixture of Gaussian. Thank you
No problem !
By far the best explanation I found!
Thanks!!!!!!
One of the best explanations on this topic.
The best resource for understanding this topic. I've been searching for two days and this is the most useful explanation I've encountered so far. Thank you.
The easiest-to-understand video about GMM on YouTube.
Dude, you broke it down so clearly even I could understand it! Well done!
Keep it going! Your handwritten notes with that color scheme could be even more impactful if you used OneNote. Get a writing tablet for little money; it really is a game-changer. Try it if you fancy! Anyhow, please keep doing these videos.
Purely logical and very smooth explanation!
Wow! God bless you for explaining it so beautifully!
Very clear introduction; I now understand Gaussian mixture models.
Thanks a lot for explaining this concept so simply. Hats off to you!
You are most welcome
I wish I had you as one of my professors, someone who could make the concept easy and teach it, rather than complicating it with all the math.
Thanks a ton ! I'm obsessed with this channel. Would love to watch you explaining - 'Deep Learning' topics.
My new favorite stats channel! Can you make a video on multivariate normal distribution? You referenced it after all 😀
You should be our professor at our university.
Thanks a lot, I have my exam tomorrow, and I am saved.
Incredible explanation, thank you.
You're very welcome!
Thank you for another clear explanation
I wish I found your channel earlier. Great content !
Thanks sir, outstanding explanation!
The punch line is, "That's the Gaussian mixture model in a nutshell."
Anyway.. great video.. thank you...
best explanation ever
This was a fantastic explanation!
Great video! Can you please consider making one on Gaussian Processes?
Best simple explanation👏
Brilliant explanation!
Very well done ! Thanks for your explanation !
Great explanation thanks
Clearest explanation ever.
Amazing one to learn about GMM. Do you have any video related to the use of GMM in sensor fusion?
Great Explanation
What a great way to use those Crayolas!
Great video, outstanding job. Got me 99% of the way to understanding the concept. I have one question:
During expectation-maximization, you are recomputing mu_k, Sigma_k, and pi_k.
For Sigma_k, what does the variable T denote? Transpose?
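Yes, the T in the Sigma_k update is a transpose: each point contributes a weighted outer product (x_i − mu_k)(x_i − mu_k)^T, which is what makes Sigma_k a D×D matrix. A minimal NumPy sketch with made-up data (the variable names are my own, not from the video):

```python
import numpy as np

# Hypothetical data: N points in D dimensions, with gamma[i] the
# responsibility of one component k for point i.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))   # N=100 points, D=2
gamma = rng.random(100)         # responsibilities for component k
N_k = gamma.sum()               # effective number of points in component k

# M-step mean update: responsibility-weighted average of the points.
mu_k = (gamma[:, None] * X).sum(axis=0) / N_k

# M-step covariance update: the T denotes a transpose, turning each
# deviation (x_i - mu_k) into a DxD outer product (x_i - mu_k)(x_i - mu_k)^T.
diff = X - mu_k                                  # shape (N, D)
outer = diff[:, :, None] * diff[:, None, :]      # shape (N, D, D)
Sigma_k = (gamma[:, None, None] * outer).sum(axis=0) / N_k

print(Sigma_k.shape)  # (2, 2): a DxD covariance matrix
```

Because each outer product is symmetric, the resulting Sigma_k is symmetric and positive semidefinite, as a covariance matrix must be.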
Thank you very much! That was a very clear explanation!
Thanks a lot, I really understand it now!!!
Incredible!! Thanks gigantically!
Wow, thank you so much, sir!
Wow... keep it up!
Ritvik, thanks for explaining this. Using the mclust package in R, what are the different parameters, EEV etc.? What is BIC in the context of mixture models? This is an awesome explanation. I'm using mixture models in mclust to classify small groups of clusters that are distinct from the population and carry significant meaning. Could you please also introduce the Dirichlet concept as an extension to this class? Thank you.
This was amazing
Thank you!!!!!!!!!!!!
the fucking 🐐
Does capital sigma have to be a 2x2 matrix? Width, length and angle can also be represented by 3 values.
So helpful!
Thanks!
terrific
In the multivariate Gaussian mixture, how can I insert the proportion in R?
Thank you
Welcome!
Does GMM with EM mean that in the end we classify each point by the highest value of P(x_i)?
Does this work for other distributions, like t-distributions, etc.?
You're awesome!
You rock!
brilliant
Could you please introduce the source for the math, especially for the derivatives part?
What a boss
Thank you!
When you describe the value of P(x) as a "probability that we see x," did you mean "likelihood" instead of "probability"? Because the pdf doesn't give the probability but rather the density (likelihood) of the distribution at point x.
Thanks for the video though. Good stuff!
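The distinction raised above is real: a pdf returns a density, not a probability, and a density can exceed 1; probabilities only arise from integrating the density over an interval. A small sketch (the helper function and numbers are my own, not from the video):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a univariate normal distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# With a small standard deviation, the density at the mean exceeds 1,
# which a probability never could:
print(normal_pdf(0.0, 0.0, 0.1))  # ~3.989

# Probabilities come from integrating the density; here a crude Riemann
# sum approximates P(-0.1 <= X <= 0.1), about one sigma each side:
step = 1e-4
prob = sum(normal_pdf(x * step, 0.0, 0.1) * step
           for x in range(-1000, 1000))
print(prob)  # ~0.68, always between 0 and 1
```

So "probability of seeing x" is loose shorthand for "density at x"; for a continuous variable, the probability of any exact point is zero.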
Is salmon longer than tuna?
Please provide R code for modelling the dependence between trials and successes (the herons case by J. Zhu), including the EM algorithm for the beta-binomial Poisson mixture model. Please help me!
Simplified explanation.
So is EM like maximum likelihood?
Hi Ritvik, great video, but why does gamma have a summation from 1 to K? I think it should be without the summation, since you said it's the probability of being in class k given an observation. Please correct me if I understood that correctly.
It's just a hyperparameter, just like in the case of K-means.
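On the summation question above: the sum over j = 1..K appears only in the denominator of gamma, as a normalizer, so that one point's responsibilities across all K components sum to 1; the numerator for component k has no sum. A sketch with made-up mixture parameters (all names mine):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a univariate normal distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical 1-D mixture with K=2 components:
pis = [0.6, 0.4]      # mixing weights pi_k
mus = [0.0, 5.0]      # component means
sigmas = [1.0, 1.0]   # component standard deviations

x = 1.0  # one observation

# gamma_k = pi_k * N(x | mu_k, sigma_k) / sum_j pi_j * N(x | mu_j, sigma_j)
# The summation over j lives only in the denominator, normalizing the
# responsibilities so they form a probability distribution over components.
weighted = [p * normal_pdf(x, m, s) for p, m, s in zip(pis, mus, sigmas)]
gammas = [w / sum(weighted) for w in weighted]

print(gammas)                      # responsibilities, summing to 1
print(gammas.index(max(gammas)))   # hard assignment: most responsible component
```

So the commenter is right that gamma_k itself is P(class k | x); the summation is just what makes all K of those probabilities add up to 1.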
Woow
nice :))
I came here after breaking my brain reading books for 2 days... seems like I only needed 15 minutes.
Please make a post-pruning video. Thanks!
What is capital N sub k?
Please look at 10:16.