How I Understand Flow Matching

  • Published 7 Feb 2025
  • Flow matching is a new generative modeling method that combines the advantages of Continuous Normalizing Flows (CNFs) and Diffusion Models (DMs).
    In this tutorial, I share my understanding of the basics of flow matching and provide an overview of how these ideas evolved over time.
    Check out the resources below to learn more about this topic.
    ===== Slides =====
    Introduction: www.dropbox.co...
    Normalizing Flows: www.dropbox.co...
    Continuous Normalizing Flows: www.dropbox.co...
    Flow Matching: www.dropbox.co...
    ===== Paper/blog survey =====
    [Papamakarios et al. 2021] Normalizing flows for probabilistic modeling and inference arxiv.org/abs/...
    [Kobyzev et al. 2020] Normalizing Flows: An Introduction and Review of Current Methods arxiv.org/abs/...
    [Tor Fjelde et al. 2024] An Introduction to Flow Matching
    mlg.eng.cam.ac...
    [Jakub Tomczak] Flow Matching: Matching flows instead of scores
    Blog: jmtomczak.gith...
    Code example: github.com/jmt...
    ===== Research talks =====
    [Yaron Lipman] Flow Matching: Simplifying and Generalizing Diffusion Models
    • Flow Matching: Simplif...
    [Michael S Albergo] Building Normalizing Flows with Stochastic Interpolants
    • Building Normalizing F...
    [Alex Tong] Conditional Flow Matching
    • TransferLab Seminar: C...
    Thumbnail background image credit: unsplash.com/p...

COMMENTS • 82

  • @plcrodrigues
    @plcrodrigues 7 months ago +13

    This is gold. Congrats on making such a fun, pedagogical, and informative video on a topic that can often be quite dry in the literature.

    • @jbhuang0604
      @jbhuang0604  7 months ago

      Thanks for your kind words!

  • @xuanluo5807
    @xuanluo5807 1 month ago +1

    Really well-made video! Love how you put all these concepts in the same framework and explain all the math intuitively!

    • @jbhuang0604
      @jbhuang0604  1 month ago

      Thanks for the kind words, Xuan!

  • @GapLoser42
    @GapLoser42 2 months ago +2

    Thank you for your excellent video! This is the clearest video I have ever watched explaining Flow Matching in such an interesting way!

    • @jbhuang0604
      @jbhuang0604  2 months ago

      Thank you so much for your kind words!

  • @manuellecha
    @manuellecha 8 months ago +4

    Thank you very much! You did a really nice job! The video is clear, visual, and informative. It is consistent with the timeline and evolution of the field, and it effectively conveys the information along with the motivation for the development of these models.

    • @jbhuang0604
      @jbhuang0604  8 months ago

      Glad you liked it! It was a lot of fun making this video!

  • @chakery3
    @chakery3 1 month ago +1

    This video explains the maths of Flow Matching very well!! Especially when you mentioned that Flow Matching is a generalised version of the diffusion model, it all suddenly made sense. Looking forward to your next video!!

  • @AnujZore-pe9mg
    @AnujZore-pe9mg 8 months ago +5

    Thanks once again for the easy-to-understand explanation! Gonna miss CSMC 733 lectures :(

  • @因幡の黒うさぎ-i1p
    @因幡の黒うさぎ-i1p 4 months ago

    Your lecture was truly inspiring and made complex concepts so easy to understand. Thank you for your incredible clarity and passion - I’m deeply grateful!

    • @jbhuang0604
      @jbhuang0604  4 months ago

      You're very welcome! Glad that you like it!

  • @catherineyang5199
    @catherineyang5199 8 months ago +1

    Thank you for the video! This is the clearest explanation of flow matching on the internet ❤

    • @jbhuang0604
      @jbhuang0604  8 months ago

      Thank you so much for your kind words!

  • @ChenLiu-nc5tg
    @ChenLiu-nc5tg 1 month ago +1

    I really, really love this video! Thanks for making those complex papers interesting and understandable for me. Now, I have the courage and enthusiasm to read the original papers. Haha!

    • @jbhuang0604
      @jbhuang0604  1 month ago +1

      That's great to hear! Glad you found it helpful.

  • @r00t257
    @r00t257 8 months ago +1

    Legend comeback 🙇! Your educational video is worth more than gold.💓🙏

    • @jbhuang0604
      @jbhuang0604  8 months ago

      Thanks a lot! Glad you like it!

  • @nathan_ca
    @nathan_ca 8 months ago +1

    Thanks! This is an amazing video for getting students like me to re-engage with topics that I haven't had a chance to explore more ❤

    • @jbhuang0604
      @jbhuang0604  8 months ago

      Thanks! Glad that this is helpful.

  • @DimitrivonRutte
    @DimitrivonRutte 8 months ago +3

    Awesome to see easy-to-understand explanations of current research topics, keep up the great work!

  • @JQXU-z3s
    @JQXU-z3s 6 months ago +1

    Thank you for your excellent work! Absolutely clear and informative.

  • @fuhodev9548
    @fuhodev9548 1 month ago +1

    I don't know if there's anybody like me: the video is easy to understand, but I need to watch it more. So far I've watched it 10 times but still don't fully get the formulas. Thank you so much!

    • @jbhuang0604
      @jbhuang0604  1 month ago

      Yup, it’s not easy to understand these math equations. But I hope the video provides some intuition on why and how it works.

  • @kimchi_taco
    @kimchi_taco 3 months ago +1

    This is gold mine! Thank you!

  • @amirhosseinraffiee8270
    @amirhosseinraffiee8270 8 months ago +1

    Thanks for the video. Great way to explain a complex concept

    • @jbhuang0604
      @jbhuang0604  8 months ago +1

      Appreciate your comment! Thanks for watching the video. Hope you enjoyed it.

  • @pravin1390
    @pravin1390 5 months ago +3

    Thanks for the explanation. I feel it makes sense to use the notation p(x_t, t). Then, it is clear to prove that ∂/∂t p(x_t, t) = -div(p(x_t, t) u(x_t, t)) is equivalent to
    d/dt p(x_t, t) = -p(x_t, t) div(u(x_t, t)), which is indeed what is required here. Please let me know if my comments make sense. As I understand it, the first equation is a partial derivative and the second is a total derivative.

    • @NirGoren-k2k
      @NirGoren-k2k 3 months ago

      Thank you, was struggling to make sense of this part.

  • @yeon6761
    @yeon6761 7 months ago +1

    Perfect video! Thank you for your work.

  • @ahsentahir4473
    @ahsentahir4473 3 months ago +1

    That was so good man!

  • @yoshiyuki1732ify
    @yoshiyuki1732ify 1 month ago +2

    Great content! Just a small tip: avoid extraneous load; some of the sounds seem unnecessary.

  • @julienblanchon6082
    @julienblanchon6082 8 months ago +1

    This is brilliant !

    • @jbhuang0604
      @jbhuang0604  8 months ago +1

      Glad that you enjoyed the video!

  • @adrienforbu5165
    @adrienforbu5165 8 months ago +1

    nice visuals, good job

  • @jackshi7613
    @jackshi7613 8 months ago +2

    excellent video!

  • @ruoshiliu6024
    @ruoshiliu6024 8 months ago +1

    Amazing work Jia-Bin!!
    P.S. you should create a bibtex for this video so it can be cited in literature :P

    • @jbhuang0604
      @jbhuang0604  8 months ago +1

      Haha! Thanks! Too bad Google Scholar doesn't include views of YouTube videos.

  • @sayakpaul3152
    @sayakpaul3152 6 months ago +1

    Do you have the next video already? It's so good!

    • @jbhuang0604
      @jbhuang0604  6 months ago

      Thanks! I am working on it. :-)

  • @nghiapham1632
    @nghiapham1632 9 days ago +1

    Thank you so much

  • @dreadfulbodyguard7288
    @dreadfulbodyguard7288 4 months ago +4

    Why is everyone saying this is so simple? I can't understand anything after 2:30

    • @jbhuang0604
      @jbhuang0604  4 months ago

      Sorry about that! I probably should have covered a bit more of the probability basics in the video as well.

    • @dreadfulbodyguard7288
      @dreadfulbodyguard7288 4 months ago

      @@jbhuang0604 Thanks for the response. The problem is that I don't have a maths background, and your ideas appear very abstract. So if you could add some Python code to explain the various mathematical expressions you are using, I think that would be very helpful for people like me.
      I am using ChatGPT to do that currently.

  • @ruilongxing2619
    @ruilongxing2619 1 day ago +1

    Thank you for the great video! Your explanations are always super helpful. 🙌 I just noticed a small typo in the equation: I believe it should be τ⁻¹(x_i, c_i(z_1:i-1)) at 6:25 instead. Let me know if I misunderstood. Keep up the amazing work!

    • @jbhuang0604
      @jbhuang0604  3 hours ago

      Thanks so much for your comment! And yes, I should correct that! Thanks!

  • @silver_argenta
    @silver_argenta 8 months ago +2

    @10:55 With that local outgoingness on the left, why is there an additional term p_t(x_t) inside the d/dx bracket? This term seems to disappear at @11:18. Thanks.

    • @karnikram
      @karnikram 3 months ago

      The p_t(x_t) term on the right means that the temporal change in p_t(x_t) is also proportional to the current likelihood value, not just the vector field. However, it cancels out when you take the log-likelihood of p_t(x_t).

    • @NirGoren-k2k
      @NirGoren-k2k 3 months ago

      @@karnikram Can you explain how exactly it gets canceled out?

    • @karnikram
      @karnikram 3 months ago

      @@NirGoren-k2k d/dt log(p_t(x_t)) = 1/p_t(x_t) * d/dt p_t(x_t). We know the total derivative d/dt p_t(x_t) = -p_t(x_t) * div (u_t(x_t)) => d/dt log(p_t(x_t)) = -div (u_t(x_t))
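      The identity in the reply above can be sanity-checked numerically. Below is a minimal sketch (not from the video; the 1D flow field u(x) = x is a hypothetical choice for illustration): with u(x) = x we have div(u) = 1, the flow is x_t = x0·e^t, and the pushforward density is p_t(x) = p_0(x·e^{-t})·e^{-t}, so the identity predicts d/dt log p_t(x_t) = -1 along every particle path.

```python
import math

# Hypothetical 1D check of d/dt log p_t(x_t) = -div(u_t(x_t)).
# Flow field u(x) = x, so x_t = x0 * e^t and div(u) = 1 everywhere.

def log_p0(x):
    # log-density of the standard normal base distribution
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

def log_pt(x, t):
    # pushforward density: p_t(x) = p_0(x * e^{-t}) * e^{-t}
    return log_p0(x * math.exp(-t)) - t

def log_p_along_path(x0, t):
    # evaluate log p_t at the moving particle position x_t = x0 * e^t
    return log_pt(x0 * math.exp(t), t)

# central finite difference of log p_t(x_t) in t
x0, t, h = 0.7, 0.3, 1e-5
slope = (log_p_along_path(x0, t + h) - log_p_along_path(x0, t - h)) / (2.0 * h)
print(round(slope, 6))  # -1.0, matching -div(u)
```

      The finite-difference slope matches -div(u) to floating-point precision, which is exactly the cancellation asked about above: the 1/p_t(x_t) factor from the log absorbs the p_t(x_t) in the total-derivative form of the continuity equation.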

  • @kvu207
    @kvu207 8 months ago +2

    beautiful! May I ask how you made the animations for this video?

    • @jbhuang0604
      @jbhuang0604  8 months ago +1

      Most of the animations are from the “morph transition” in PowerPoint slides. The rest are from Adobe Premiere Pro.

  • @xulin730
    @xulin730 4 months ago

    Nice Job

  • @tauhidkhan453
    @tauhidkhan453 3 months ago

    Question:
    How is CFM different from Rectified Flows?

  • @emilbogomolov5709
    @emilbogomolov5709 3 months ago

    How did the nabla become a divergence at 10:59?

  • @rexton123
    @rexton123 6 months ago

    Amazing

  • @morrisfan2004
    @morrisfan2004 5 months ago

    I'm wondering whether there are any pressing potential applications of flow matching in industry?

    • @jbhuang0604
      @jbhuang0604  4 months ago

      I think many of the recent text-to-image generation models are now trained with flow matching. There are also many other applications beyond image generation.

  • @rexton123
    @rexton123 6 months ago

    What are the advantages of flow matching compared to diffusion models?

    • @jbhuang0604
      @jbhuang0604  4 months ago

      You can view it as a generalization of diffusion models. The training can converge faster, and you avoid the difficulty that a pure Gaussian distribution cannot be reached in a finite number of diffusion steps.

  • @shuchangzhou
    @shuchangzhou 4 months ago

    At 7:39, by "constrative map", do you really mean "Contraction mapping"?

    • @jbhuang0604
      @jbhuang0604  4 months ago

      Yes, contraction mapping. en.m.wikipedia.org/wiki/Contraction_mapping

  • @tian_chen4816
    @tian_chen4816 1 month ago +1

    It's amazing to find such a high-quality video on such a new technique. A cyber living Buddha, saving us clueless undergrads 😭

  • @JingHe-q6p
    @JingHe-q6p 3 months ago +1

    I can't understand why z_* = u(z_*), z_{k+1} = x - u(z_k), and x_{k+1} = x_k + \delta u(x_k) after 7:38

    • @mateuszwyszynski4331
      @mateuszwyszynski4331 2 months ago

      I'm not sure whether you're asking about the Banach fixed-point theorem or the later explanation. The best intuition I know of behind the theorem is the following. Imagine that you're holding a map of the town you are currently in. Now throw it on the ground. There will be exactly one point which falls "on itself". The city in this case is your domain (i.e. the z's) and the points on the map are the image (i.e. u(z)).
      Concerning the next argument: if u(z) is a contraction, then so is h(z) = x - u(z). It's rather simple to check that |h(z1) - h(z2)| < |z1 - z2| for any z1 and z2. Since h is a contraction, there exists a unique fixed point s.t. h(z*) = z*, which is the same as x - u(z*) = z*.
      Finally, concerning the intuition behind the iterative algorithm: the definition of a contraction h is that |h(y) - h(x)| < |y - x| for all x and y. Now you can see that |h(y) - z*| = |h(y) - h(z*)| < |y - z*|. So every time you apply h to some point y different from the fixed point, you get closer to the fixed point. Hence you can do it over and over again, and in the limit you arrive at the fixed point.

  • @thienthuoan1081
    @thienthuoan1081 4 months ago

    Feels a bit like Prof. Hung-yi Lee's style, hahaha