EM algorithm: how it works

  • Published 27 Aug 2024
  • Full lecture: bit.ly/EM-alg
    Mixture models are a probabilistically sound way to do soft clustering. We assume our data is sampled from K different sources (probability distributions). The expectation maximisation (EM) algorithm allows us to discover the parameters of these distributions and, at the same time, figure out which source each point comes from.
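
    Below is a minimal sketch of the idea in Python, an illustration under simple assumptions (two 1-D Gaussian sources, a fixed number of iterations) rather than code from the lecture. The E-step computes each point's membership probabilities via Bayes' rule; the M-step re-estimates means, variances, and mixing weights from those soft assignments.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy data: two 1-D Gaussian sources, labels unknown to the algorithm.
        x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 200)])

        # Initial guesses for the two sources and their mixing weights.
        mu = np.array([-1.0, 1.0])    # means
        var = np.array([1.0, 1.0])    # variances
        pi = np.array([0.5, 0.5])     # mixing weights: P(a) and P(b)

        def gaussian_pdf(x, mu, var):
            return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

        for _ in range(100):
            # E-step: responsibilities r[i, k] = P(source k | x_i), by Bayes' rule.
            lik = np.stack([pi[k] * gaussian_pdf(x, mu[k], var[k]) for k in range(2)], axis=1)
            r = lik / lik.sum(axis=1, keepdims=True)

            # M-step: re-estimate parameters as responsibility-weighted statistics.
            nk = r.sum(axis=0)                                   # effective count per source
            mu = (r * x[:, None]).sum(axis=0) / nk               # weighted means
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk  # weighted variances
            pi = nk / len(x)                                     # updated mixing weights

        print("means:", mu, "std devs:", np.sqrt(var), "weights:", pi)

    Each iteration is guaranteed not to decrease the likelihood of the data, so the loop converges to a (local) maximum.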

COMMENTS • 215

  • @biancaluedeker
    @biancaluedeker 3 years ago +35

    I am a PhD student. I have seen the material in three classes and had no idea what was going on. You made it crystal clear in 8 minutes.

    • @snackmaster35
      @snackmaster35 1 year ago

      You must have had comp stat with Tony too lol

    • @Beowulf245
      @Beowulf245 1 year ago

      I don't mean to be rude, but what area of research do you specialize in? Because that's some pretty basic math here that they teach in your freshman or sophomore year in college.

    • @TorSTARAGARIO
      @TorSTARAGARIO 10 months ago +10

      @@Beowulf245 Very intentionally rude. I've had the same issue, and without solid intuition or clear explanations like this video, the math looks like a confusing mess that would go right over a lot of undergraduates' heads.

    • @ChadieRahimian
      @ChadieRahimian 7 months ago

      @@Beowulf245 I have two master's degrees, in Astrophysics and Computer Science, and I'm currently a PhD student in Computer Science. I needed a refresher on EM and was failing to find anything useful online; the couple of book chapters I went through were not enough for me to grasp the concept at a practical, implementation level for my current problem, and this video helped. So…

    • @fbng
      @fbng 3 months ago

      I'm doing this shit in my Bachelor's and it's kicking my ass. I guess we're all united by the bad teaching of this topic, regardless of our level.

  • @omidmo7554
    @omidmo7554 8 years ago +262

    I was struggling with EM for a long time. You explained it very simply. I believe it is a kind of art when someone explains something hard in an easy way. That makes the difference. I appreciate your help.

    • @joshwolff4592
      @joshwolff4592 3 years ago +21

      It's also an art when someone explains something quite easy in a very hard way. I know a few Picassos in this genre.

    • @michaelle1229
      @michaelle1229 2 years ago

      This helped me a lot during my exam for Modern Applied Statistics! I got a bit lost computing the derivations and their parameters with the Gaussian.

  • @aprilsun2572
    @aprilsun2572 4 years ago +15

    Omg. You don't know how long I have struggled with this algo. Such a nice explanation!!!

  • @leeris19
    @leeris19 7 days ago

    Now this is how you teach. Teaching should be an adventure filled with teeny tiny realizations to produce an outcome.

  • @impzhu3088
    @impzhu3088 3 years ago +3

    Omg, now I know the intuition behind the EM algo. This is art and you are fantastic! They say if you can't explain a concept in an easy way, you don't know it well enough. I guess that's why you can explain this so clearly and why many teachers can't. Thank you so much!

  • @andersonbessa9044
    @andersonbessa9044 5 years ago +5

    It's great when someone really knows what they are explaining. You made it look simple. Thanks!

  • @jeremyborg1365
    @jeremyborg1365 7 years ago +2

    The best introductory video to EM by far. Thanks!

  • @mr6462
    @mr6462 4 years ago +1

    I like how you explained the causation, parameters -> which group or which group -> parameters, when in fact we have neither. It is truly a beauty to recognize this nuance.

  • @xiayisun8570
    @xiayisun8570 6 years ago +2

    Sir, you are my hero! You always set things up so intuitively.

  • @daattali
    @daattali 8 years ago +7

    What a great and clear explanation! I watched a few videos on EM/GMM and didn't quite get it as well as I do now. Your explanation of the chicken-and-egg problem, and the intro before it, really makes it so much more intuitive. Thanks!

  • @adawang9147
    @adawang9147 7 years ago +1

    I'll have to say, I looove your lectures. I watched your Decision Tree lecture 2 days ago, today I was looking for an EM lecture, and you explain them all so thoroughly. Thank you so much for sharing!

  • @Vivekagrawal5800
    @Vivekagrawal5800 2 years ago

    This is the best video on GMM/EM. I tried tens of other videos but couldn't understand anything. Now I am crystal clear.

  • @KoLMiW
    @KoLMiW 3 years ago +1

    In our lectures they jumped straight into 2-D examples and it was very hard to comprehend the formulas; it helped me a lot that you explained it with a 1-D example. Thank you very much!

  • @comatosetorpor3602
    @comatosetorpor3602 3 years ago

    YouTube has opened doors for so many people who would not otherwise have gotten such good lectures.

  • @pereeia9048
    @pereeia9048 1 year ago +1

    Amazing video, perfectly explained the concepts without getting bogged down in the math/technical details.

  • @ks34199
    @ks34199 7 years ago +3

    Hello Prof,
    The breakdown of complex algorithms into simple steps is excellent.

  • @sandlinjames
    @sandlinjames 1 year ago

    Wonderful explanation. I've been watching my class presentations over and over with no result. Now I get it. :)

  • @gogopie64
    @gogopie64 9 years ago

    This explains mixture modeling so much better than Wikipedia. The Wikipedia article required such a large math vocabulary that I had no idea what it was talking about.

  • @maxweera7897
    @maxweera7897 2 years ago

    The best lecture on EM on YouTube. Well done!

  • @xentox5016
    @xentox5016 2 years ago

    My prof talked about this for 30 minutes and I didn't get it at all. This video is 8 minutes and it's very clear, thanks!

  • @nvsabhishek7356
    @nvsabhishek7356 2 years ago

    Beautiful. My teacher just showed a bunch of equations and I was lost. A great example of how good teaching goes a long way.

  • @emmettmcdow9916
    @emmettmcdow9916 4 years ago

    I rarely comment, but this was a fantastic video. Very few YouTube videos on a subject as narrow as EM are this informative.

  • @roman5932
    @roman5932 2 years ago

    I'm a Russian student, 2nd year. I've seen three lectures in Russian, but only now do I get this algorithm. This is professionalism, I'm sure.
    Thank you, sir.

  • @shm2157
    @shm2157 7 years ago

    A superb and simplified way to explain the essence of GMM and EM!

  • @bionh
    @bionh 10 years ago +3

    Thanks Victor, you really have a great educational style. Please keep making videos explaining important topics in data science in a clear way--I will keep watching them :)

    • @vlavrenko
      @vlavrenko  9 years ago

      Bion Howard Thank you for the kind words. Really happy you find these videos helpful.

  • @1982Dibya
    @1982Dibya 8 years ago

    Awesome video... now my concept of EM is finally clear. Nobody else could make me understand it the way you did. So many thanks! Waiting for more interesting videos on machine learning. You are simply awesome, hats off!

  • @ape1eat
    @ape1eat 10 years ago

    I love this kind of explanation. A simple practical example that anyone can understand, without tons of abstract mathematical expressions. It helped me understand the concept, and then it's much easier to go deeper. I wish more teachers taught this way.

  • @sarthak8786
    @sarthak8786 6 years ago

    This is by far the best explanation of this topic.

  • @TheLyue
    @TheLyue 5 years ago +2

    The best explanation I've ever seen.

  • @ishitaraj7723
    @ishitaraj7723 1 month ago

    Very smooth explanation. Loved it!

  • @TankNSSpank
    @TankNSSpank 9 years ago +5

    We need more of these, sir.

    • @vlavrenko
      @vlavrenko  9 years ago +2

      Thanks, I'm working on getting more uploaded.

  • @annlee8239
    @annlee8239 3 years ago

    Impressive! You made it sound so clear, and I appreciate how you compared it with k-means.

  • @nkapila6
    @nkapila6 5 months ago

    Thanks for this. Your video helped bring clarity to the problem statement.

  • @lijun2031
    @lijun2031 8 years ago

    This is my first time leaving a comment!!! You are awesome!! I come from NYU, and I am struggling with my final project, which uses EM. You really saved me time!!! This video is wonderful!!!!

  • @mahdishafiei7230
    @mahdishafiei7230 8 years ago +1

    Very, very eloquent.
    Thanks for the time you spend teaching.

  • @karinalatsko4908
    @karinalatsko4908 2 years ago

    Great explanation! So clear even when showing the more "difficult" math pieces.

  • @jameshighland6769
    @jameshighland6769 9 years ago

    Very clear and simple explanation. Fantastic. Thanks for this. Please upload more videos like these.

    • @vlavrenko
      @vlavrenko  9 years ago

      Thank you for the kind words!

  • @IvolineNgong
    @IvolineNgong 5 years ago

    What an awesome explanation of the EM algorithm. You've made it so, so easy for me now. Thank you, Sir.

  • @Kruuppe
    @Kruuppe 5 years ago

    Really, really good explanation. Easy to understand, and you even referenced the k-means clustering algorithm!

  • @hakimazman488
    @hakimazman488 3 years ago

    The comparison with K-means made it click for me. Thank you!!

  • @supacopper4790
    @supacopper4790 7 years ago

    Very helpful guide! I spent some time reading the paper and tutorials online and was having trouble understanding the logic, and you just led me to the point of understanding it in just 8 minutes!

  • @ruoyuguo9134
    @ruoyuguo9134 3 years ago

    Great EM algorithm explanation!

  • @beatlekim
    @beatlekim 3 years ago

    BEST VIDEO ON EM ALGORITHM

  • @ct528
    @ct528 3 years ago

    Seriously. I have a very rough time understanding my professor over Zoom, and his lecturing style doesn't help. This was easier to understand. Thank you.

  • @nv3796
    @nv3796 2 years ago

    In the last minute of this video, I understood it! THANKS

  • @danielalfonsetti6602
    @danielalfonsetti6602 3 years ago

    This is by far the best and clearest explanation of the intuition of EM that I've heard. This is amazing. Thank you so much.

  • @Kruuppe
    @Kruuppe 5 years ago +1

    My god, this was an amazing video. I think I've already commented on this before, but watching it again was really helpful! Cheers

  • @SuperJJAlexander
    @SuperJJAlexander 5 years ago

    I can surely pass my Machine Learning exam thanks to this video. Thanks!!!

  • @vikalpmehta6019
    @vikalpmehta6019 4 years ago

    I was struggling to get the big picture of the EM algorithm. What a great explanation you provided. Thank you. 😁

  • @44r0n-9
    @44r0n-9 2 years ago

    Absolutely brilliantly explained.

  • @cycman98
    @cycman98 2 years ago

    The comparison with k-means opened my eyes. Thank you

  • @steveshank9674
    @steveshank9674 9 years ago +2

    Subbed and will be going through your entire collection of videos. This video helped me take a step forward with my research. Thank you for your efforts!

  • @lordnicholasbuzanthefearle2155
    @lordnicholasbuzanthefearle2155 9 years ago +22

    You just earned my sub

  • @lucianotarsia9985
    @lucianotarsia9985 4 years ago

    Great video, Victor. Very simple to understand. Thank you for the help.

  • @abhishekagnihotri9233
    @abhishekagnihotri9233 5 years ago

    Your way of explaining is very good.

  • @bigears8296
    @bigears8296 5 years ago

    Crazy, dude, you explain this 1000 times better than the professor.

  • @beng.7708
    @beng.7708 8 years ago

    A good lesson that conveys the essence of GMM and why we need the latent Z variable.

  • @smawtan
    @smawtan 9 years ago +1

    Victor, your videos are just fantastic for someone like myself who is new to the area of informatics and machine learning. I have a background in physics but my current PhD work is in medical image analysis; I'm excited to take my research down a more computer science-ey route and your videos provide brilliant overviews of key areas.
    Thanks a lot! :)

  • @desitravellers2023
    @desitravellers2023 5 years ago

    Clear, concise and insightful. Thank you.

  • @sreejadevisetti
    @sreejadevisetti 1 year ago +1

    Awesome explanation!!

  • @MuhammadGhufran5
    @MuhammadGhufran5 5 years ago

    Thanks very much for the lucid explanation, Prof!

  • @xianda9648
    @xianda9648 3 years ago +1

    Hello, how do you update an existing GMM when new data comes in?

  • @ericarnaud5062
    @ericarnaud5062 10 years ago

    Victor, you are so amazing...
    I would like to be your student; your lectures are simple and we learn more than we expect.
    Thank you so much!

    • @vlavrenko
      @vlavrenko  10 years ago +1

      Eric Arnaud Thank you for the kind words. Very happy to know you find these lectures useful.

  • @musicspinner
    @musicspinner 9 years ago

    Great lecturer. Thanks for posting these videos.

  • @jessicas2978
    @jessicas2978 8 years ago

    Great video! Very clearly explained and easy to understand.

  • @siyuanxiang1636
    @siyuanxiang1636 2 years ago

    Thanks for this amazing video! It clarifies EM and it's really helpful! Thanks for making it!

  • @raoufkeskes7965
    @raoufkeskes7965 6 months ago

    At 3:08 the variance estimator should be divided by (n_b - 1), the corrected estimate, not by n_b... that's what we call Bessel's correction.
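
    In symbols, the two estimators this comment contrasts are (standard notation, with n_b the number of points assigned to source b):

      \hat{\sigma}_b^2 = \frac{1}{n_b} \sum_{i=1}^{n_b} (x_i - \hat{\mu}_b)^2
      \qquad \text{vs.} \qquad
      s_b^2 = \frac{1}{n_b - 1} \sum_{i=1}^{n_b} (x_i - \hat{\mu}_b)^2 .

    The 1/n_b version is the maximum-likelihood estimate; it is slightly biased because \hat{\mu}_b is estimated from the same data, and dividing by n_b - 1 (Bessel's correction) removes that bias. Since EM is itself a maximum-likelihood procedure, the 1/n_b form is also standard within the algorithm.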

  • @200415670
    @200415670 5 years ago

    This is crazy. You explain it so clearly!! Thanks a lot!!

  • @dmitryzabavin319
    @dmitryzabavin319 5 years ago +3

    Thank you so much. This is the clearest explanation of EM I have found. But the only question remaining is: how do you calculate P(a) and P(b) in the P(b | xi) formula?

  • @cageybee777
    @cageybee777 3 years ago +1

    At 5:37, how do we know P(b) and P(a)?
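
    A note on the two questions above: P(a) and P(b) are the mixing weights of the mixture, i.e. the prior probability that a random point came from each source. In the standard EM recipe (stated here as a general note, not a claim about this video's exact treatment), they are initialized like any other parameter (e.g. both to 0.5) and re-estimated in every M-step as the average responsibility:

      P(b) \leftarrow \frac{1}{n} \sum_{i=1}^{n} P(b \mid x_i), \qquad P(a) \leftarrow 1 - P(b).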

  • @Jessica-dt9by
    @Jessica-dt9by 8 years ago

    Thank you so much! This explains EM's basic idea so clearly!

  • @martijnhuijnen1
    @martijnhuijnen1 9 months ago

    Thanks so much! I will refer my students to your webpage!

  • @samuelcoromandel7392
    @samuelcoromandel7392 7 years ago +1

    How do you calculate priors?

  • @nahidakhter8646
    @nahidakhter8646 3 years ago

    Well explained and illustrated, thanks!

  • @randalllionelkharkrang4047
    @randalllionelkharkrang4047 2 years ago

    I'm just confused about how we estimate the probability, i.e. P(b|x_i). There is a P(b) term on the right because of Bayes' rule. How would we know that probability, i.e. P(b), in this equation? b is also distributed according to some distribution, and we don't know its parameter.
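
    For reference, the full Bayes' rule expansion behind the E-step is

      P(b \mid x_i) = \frac{P(x_i \mid b)\, P(b)}{P(x_i \mid a)\, P(a) + P(x_i \mid b)\, P(b)},

    and P(b) here is just the current estimate of the mixing weight: it starts from an initial guess and is refined on every iteration along with the means and variances, so it never needs to be known in advance.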

  • @ericrudolph6615
    @ericrudolph6615 4 years ago

    What is the difference between soft clustering and fuzzy clustering?

  • @phuongdinh5836
    @phuongdinh5836 7 years ago

    Beautifully explained.

  • @adityapandey5264
    @adityapandey5264 5 years ago

    This is so easy to understand. Thank you.

  • @havayastik
    @havayastik 9 years ago

    Thanks man, this video saved me so much time in understanding it.

    • @vlavrenko
      @vlavrenko  9 years ago +2

      Thanks, glad this was helpful.

  • @dilettachiaro5322
    @dilettachiaro5322 4 years ago

    So grateful to you, man, for this explanation!

  • @mavaamusicmachine2241
    @mavaamusicmachine2241 2 years ago

    Thank you for this lecture, extremely well explained.

  • @erictao8396
    @erictao8396 21 days ago

    Great explanation, thanks!

  • @yooneylee6694
    @yooneylee6694 4 years ago

    In EM, do we know how many sources there are? Or do we have to guess that as well?

  • @krystaljinluma
    @krystaljinluma 4 years ago

    How do you calculate the new parameters in EM? In k-means you would compute the means for each feature, but I'm confused about how you do that in EM.

    • @krystaljinluma
      @krystaljinluma 4 years ago

      Never mind, I found your next video, which explains exactly what I was asking. Great videos!
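
    For completeness, the EM analogue of the k-means mean update is a responsibility-weighted average (the standard GMM M-step, stated here for a source b):

      \mu_b = \frac{\sum_i P(b \mid x_i)\, x_i}{\sum_i P(b \mid x_i)}, \qquad
      \sigma_b^2 = \frac{\sum_i P(b \mid x_i)\, (x_i - \mu_b)^2}{\sum_i P(b \mid x_i)}.

    Every point contributes to every source, weighted by how strongly it belongs there; with hard 0/1 responsibilities these reduce exactly to the per-cluster means and variances.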

  • @EliBiomedEng
    @EliBiomedEng 3 years ago

    AWESOME EXPLANATION, THANK YOU

  • @ensargunesdogdu
    @ensargunesdogdu 7 years ago

    The best way to explain it, thank you so much. Liked and subscribed.

  • @AdrienneTranR
    @AdrienneTranR 8 years ago

    Excellent explanation.

  • @CT99999
    @CT99999 1 year ago

    Great explanation!

  • @micahmayanja7439
    @micahmayanja7439 4 years ago

    Brilliantly done.

  • @stephenyau3146
    @stephenyau3146 8 years ago

    Thank you so much!! You have explained it much better than my professor haha!

  • @Code-09
    @Code-09 5 years ago

    Best explanation. Thank you very much.

  • @kevinsong1056
    @kevinsong1056 8 years ago

    Very, very good explanation! Thank you!

  • @billbentley3
    @billbentley3 4 years ago

    Is it possible to use the EM technique with categorical variables, and if so, how does the technique change?

  • @cp3shadow
    @cp3shadow 4 years ago

    At 5:25, P(x_i|b) equals a PDF value, as stipulated by your equation. However, since the random variables are continuous, shouldn't the likelihood equal a density function f, not a probability? You'd need to integrate over some interval in order to claim the Gaussian PDF equals a probability.
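
    The objection above is sound: for continuous data, P(x_i | b) is really a density value, not a probability. The usual resolution is that the responsibility is a ratio, so the interval width that would turn a density into a probability cancels out:

      P(b \mid x_i) = \lim_{\epsilon \to 0}
        \frac{f_b(x_i)\, \epsilon\, P(b)}{f_a(x_i)\, \epsilon\, P(a) + f_b(x_i)\, \epsilon\, P(b)}
      = \frac{f_b(x_i)\, P(b)}{f_a(x_i)\, P(a) + f_b(x_i)\, P(b)},

    which is why plugging the Gaussian PDF in for the likelihood is harmless in this formula.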

  • @rlobo2535
    @rlobo2535 5 years ago +22

    Your voice is like Gale Boetticher's, the lab guy from Breaking Bad. Good video though.

  • @jimmorrisshen
    @jimmorrisshen 5 years ago

    A super nice explanation.

  • @camilledingam8210
    @camilledingam8210 3 years ago

    Hello, please can you show me a reference where I can find the cluster-membership expression for expectation maximization? I will be very grateful for your help, thank you.

  • @vaibhav81
    @vaibhav81 6 years ago

    Super explanation. Loved it.

  • @iamconnected5991
    @iamconnected5991 3 years ago

    Thank you so much for this great video!

  • @sabbirneplumpstein334
    @sabbirneplumpstein334 1 month ago +1

    You're amazing