One of the most amazing lectures. I've never seen a lecture on generative models so well connected as this one: from simple autoregressive models, to latent-variable models, to GANs, to energy-based models, Langevin dynamics, and finally diffusion models, all of them connected! The connectedness and storytelling are amazing. Thank you, Prof!
Amazing lectures! Can't wait to watch the next diffusion lecture! Awesome!
Presenting the unified view of energy-based, score-based, and diffusion models is invaluable. My coursework didn't cover generative methods beyond VAEs and GANs, but this presentation has been a great surrogate. Thanks!
A small correction to the proof on slide 14: I think there's a rogue squared s_theta(x) in the third term.
Why is the score function graph at around 21:00 positive after x ≈ 5? As soon as you pass the point corresponding to the mode at around 5, shouldn't it point left (i.e., be negative)? Same question for the other mode around -4.
I have the same doubt.
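A quick numerical check supports the intuition in the question above. The density below is a hypothetical stand-in for the one plotted in the lecture (an equal-weight mixture of two Gaussians with modes near -4 and 5); the score s(x) = d/dx log p(x) should indeed turn negative just past each mode, pointing back toward it:

```python
import numpy as np

def mixture_pdf(x, mus=(-4.0, 5.0), sigma=1.0):
    # Equal-weight two-Gaussian mixture (hypothetical stand-in for the
    # density shown around 21:00 in the lecture)
    return sum(np.exp(-(x - m) ** 2 / (2 * sigma ** 2)) for m in mus) / (
        2 * sigma * np.sqrt(2 * np.pi)
    )

def score(x, h=1e-5):
    # score(x) = d/dx log p(x), estimated by central finite differences
    return (np.log(mixture_pdf(x + h)) - np.log(mixture_pdf(x - h))) / (2 * h)

# Just past the right-most mode the score is negative (points left, back
# toward the mode); just before it, the score is positive (points right).
print(score(6.0))  # negative
print(score(4.0))  # positive
```

So if the plotted score is positive past x ≈ 5, that does look inconsistent with a mode there, unless the density in the plot differs from this toy mixture.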
@34:34 - In the last integral on the RHS, why is there an s_theta(x) squared term? Shouldn't it just be s_theta(x)?
Yeah, the square should not be there on s_theta(x).
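For reference, here is the standard score-matching identity (Hyvärinen, 2005), which the derivation on the slide presumably follows. After integrating the cross term by parts (assuming p(x) decays at the boundary), s_theta(x) appears squared only in the norm term and linearly in the divergence term:

```latex
% Cross term: -E[s_theta . grad log p] = -\int s_theta . grad p \, dx
%            = +\int (div s_theta) p \, dx  (integration by parts),
% so the last term is linear in s_theta, not squared:
\mathbb{E}_{p(x)}\!\left[\tfrac{1}{2}\,\|s_\theta(x)\|^2
  - s_\theta(x)^\top \nabla_x \log p(x)\right]
= \mathbb{E}_{p(x)}\!\left[\tfrac{1}{2}\,\|s_\theta(x)\|^2
  + \nabla_x \cdot s_\theta(x)\right]
```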