Thank you for your excellent video! This is the clearest video I have ever watched explaining Flow Matching in such an interesting way!
Thank you so much for your kind words!
This is gold. Congrats on making such a fun, pedagogical, and informative video on a topic that can often be quite dry in the literature.
Thanks for your kind words!
Thank you very much! You did a really nice job! The video is clear, visual, and informative. It is consistent with the timeline and evolution of the field, and it effectively conveys the information along with the motivation for the development of these models.
Glad you liked it! It was a lot of fun making this video!
Thanks for the explanation. I feel it makes sense to use the notation p(x_t, t). Then it is clear to prove that ∂/∂t p(x_t, t) = -div(p(x_t, t) u(x_t, t)) is equivalent to d/dt p(x_t, t) = -p(x_t, t) div(u(x_t, t)), which is indeed required here. Please let me know if my comments make sense. My understanding is that the first equation uses the partial derivative while the second uses the total derivative.
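For anyone stuck on the same step, the equivalence is just the chain rule; here is the standard argument spelled out in the notation above. By the product rule, div(p u) = p div(u) + u · ∇p, so the continuity equation ∂/∂t p(x, t) = -div(p(x, t) u(x, t)) at a fixed point x can be written as ∂/∂t p = -p div(u) - u · ∇p. Along a trajectory x_t with dx_t/dt = u(x_t, t), the total derivative adds a transport term:
d/dt p(x_t, t) = ∂/∂t p + u · ∇p = -p div(u) - u · ∇p + u · ∇p = -p(x_t, t) div(u(x_t, t)).
So the two equations express the same fact, viewed at a fixed point in space versus along the flow.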
Thank you, was struggling to make sense of this part.
Thanks once again for the easy-to-understand explanation! Gonna miss CSMC 733 lectures :(
Glad you like them!
Thank you for the video! This is the most clear explanation of flow matching on the internet ❤
Thank you so much for your kind words!
Your lecture was truly inspiring and made complex concepts so easy to understand. Thank you for your incredible clarity and passion - I’m deeply grateful!
You're very welcome! Glad that you like it!
Thanks! This is an amazing video for getting students like me to re-engage with these topics that I haven't had a chance to explore ❤
Thanks! Glad that this is helpful.
Awesome to see easy-to-understand explanations of current research topics, keep up the great work!
Glad you liked it!
Legend comeback 🙇! Your educational video is worth more than gold.💓🙏
Thanks a lot! Glad you like it!
This is a gold mine! Thank you!
Glad to hear it!
Thank you for your excellent work! Absolutely clear and informative.
Glad it was helpful!
Thanks for the video. Great way to explain a complex concept
Appreciate your comment! Thanks for watching the video. Hope you enjoyed it.
Do you have the next video already? It's so good!
Thanks! I am working on it. :-)
beautiful! May I ask how you made the animations for this video?
Most of the animations are from the “morph transition” in PowerPoint slides. The rest are from Adobe Premiere Pro.
Perfect video! Thank you for your work.
You're very welcome!
excellent video!
Thanks for watching!
@10:55 With the local outgoingness on the left, why is there an additional term p_t(x_t) inside the d/dx bracket? This term seems to disappear at @11:18. Thanks.
The p_t(x_t) term on the right means that the temporal change in p_t(x_t) is also proportional to the current likelihood value, not just the vector field. However, it gets cancelled out when you take the log-likelihood of p_t(x_t).
@@karnikram Can you explain how exactly does it get canceled out?
@@NirGoren-k2k By the chain rule, d/dt log(p_t(x_t)) = (1 / p_t(x_t)) · d/dt p_t(x_t). We know the total derivative is d/dt p_t(x_t) = -p_t(x_t) · div(u_t(x_t)), so the p_t(x_t) factors cancel and d/dt log(p_t(x_t)) = -div(u_t(x_t)).
That was so good man!
Thank you for the comment!
Thank you!
This is brilliant !
Glad that you enjoyed the video!
nice visuals, good job
Thanks a lot!
Question:
How is CFM different from Rectified Flows?
Amazing work Jia-Bin!!
P.S. you should create a bibtex for this video so it can be cited in literature :P
Haha! Thanks! Too bad Google Scholar doesn't include views of YouTube videos.
I'm wondering: are there any pressing potential applications of flow matching in industry?
I think many of the recent text-to-image generation models are now trained with flow matching. There are also many other applications beyond image generations.
How did the nabla become a divergence at 10:59?
Nice Job
Thanks!
I can't understand why z_* = u(z_*), z_{k+1} = x - u(z_k), and x_{k+1} = x_k + \delta u(x_k) after 7:38.
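In case it helps anyone with the same question: z_* = u(z_*) defines a fixed point, and the iteration z_{k+1} = x - u(z_k) converges to the solution of z = x - u(z) when the map is a contraction, per the contraction-mapping discussion at 7:38. The last update, x_{k+1} = x_k + \delta u(x_k), is just explicit Euler integration of dx/dt = u(x, t). A minimal Python sketch of that Euler step, with a made-up toy vector field u (an illustration, not code from the video):

import numpy as np

def euler_integrate(u, x0, num_steps=100):
    # Explicit Euler: x_{k+1} = x_k + delta * u(x_k, t_k), integrating t from 0 to 1.
    delta = 1.0 / num_steps
    x = np.asarray(x0, dtype=float)
    for k in range(num_steps):
        x = x + delta * u(x, k * delta)
    return x

# Toy vector field that pushes every point toward (1, 1).
u = lambda x, t: np.array([1.0, 1.0]) - x
print(euler_integrate(u, x0=[0.0, 0.0]))  # ~[0.634 0.634], approximating the exact 1 - e^{-1} ≈ 0.632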
What are the advantage of flow matching compared to diffusion models?
You can view it as a generalization of diffusion models. Training can converge faster, and you avoid the problem that you can never reach a pure Gaussian distribution with a finite number of diffusion steps.
Amazing
Thanks!
At 7:39, by "constrative map", do you really mean "Contraction mapping"?
Yes, contraction mapping. en.m.wikipedia.org/wiki/Contraction_mapping
Why is everyone saying this is so simple? I can't understand anything after 2:30
Sorry about that! Probably should cover a bit more probability basics in the video as well.
@@jbhuang0604 Thanks for the response. The problem is that I don't have a maths background, and the ideas appear very abstract to me. So if you could add some Python code to explain the various mathematical expressions you are using, I think that would be very helpful for people like me. I am using ChatGPT to do that currently.
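For readers in the same position, here is a minimal, self-contained PyTorch sketch of the core training idea: conditional flow matching with a straight-line path between noise and data. The names (VelocityNet, sample_data_batch) and the toy data distribution are my own, not from the video:

import torch
import torch.nn as nn

def sample_data_batch(n=256):
    # Toy target distribution: a mixture of two 2-D Gaussians (illustration only).
    centers = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
    idx = torch.randint(0, 2, (n,))
    return centers[idx] + 0.3 * torch.randn(n, 2)

class VelocityNet(nn.Module):
    # A tiny MLP standing in for the learned vector field u_theta(x_t, t).
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

model = VelocityNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x1 = sample_data_batch()            # data samples
    x0 = torch.randn_like(x1)           # noise samples from N(0, I)
    t = torch.rand(x1.shape[0], 1)      # random time in [0, 1]
    xt = (1 - t) * x0 + t * x1          # straight-line path between noise and data
    target = x1 - x0                    # velocity of that path, d/dt x_t
    loss = ((model(xt, t) - target) ** 2).mean()  # regress u_theta onto the target velocity
    opt.zero_grad()
    loss.backward()
    opt.step()

Sampling then just integrates dx/dt = u_theta(x, t) from t = 0 to 1 starting from Gaussian noise, e.g. with an explicit Euler loop like the one sketched in the 7:38 thread above.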
This feels a bit like Prof. 李宏毅's style hahaha
李宏毅 is THE BEST!