Cornell CS 6785: Deep Generative Models. Lecture 12: Score-Based Generative Models

  • Published 28 Nov 2024

COMMENTS • 7

  • @Geraltofrivia12gdhdbruwj 8 months ago +4

    One of the most amazing lectures. I've never seen a lecture on generative models that is this well connected: from simple autoregressive models, to latent-variable models, to GANs, to energy-based models, Langevin dynamics, and finally to diffusion models, everything is connected! The connectedness and storytelling are so amazing! Thank you, Prof!

  • @chenweilong2505 1 year ago +5

    Amazing Lectures! Can't wait to watch the next diffusion lecture! Awesome!

  • @DrumsBah 11 months ago +2

    Presenting the unified view of energy-based, score-based, and diffusion models is invaluable. My coursework didn't cover generative methods beyond VAEs and GANs, but this presentation has been a great surrogate. Thanks!
    A small correction to the proof on slide 14: I think there's possibly a rogue squared s_theta(x) in the third term.
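
For context, here is a sketch (in LaTeX) of the standard 1D score matching identity this correction refers to, assuming the boundary term from integration by parts vanishes; the notation follows the comment rather than the slide itself:

    \begin{aligned}
    J(\theta)
      &= \tfrac{1}{2}\int p(x)\,\bigl(s_\theta(x) - \nabla_x \log p(x)\bigr)^2\,dx \\
      &= \tfrac{1}{2}\int p(x)\,s_\theta(x)^2\,dx
         - \int p(x)\,s_\theta(x)\,\nabla_x \log p(x)\,dx + \text{const} \\
      &= \tfrac{1}{2}\int p(x)\,s_\theta(x)^2\,dx
         + \int p(x)\,\nabla_x s_\theta(x)\,dx + \text{const}
    \end{aligned}

The cross term turns into the last integral by writing it as \int s_\theta(x)\,\nabla_x p(x)\,dx and integrating by parts, so only the first term is squared; the final integral involves \nabla_x s_\theta(x), not s_\theta(x)^2.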

  • @micahdelaurentis6551 9 months ago +1

    Why is the score function graph at around 21:00 positive after 5-ish? As soon as you pass the point corresponding to the mode at around 5, shouldn't it point left (be negative)? Same question for the other mode around -4.

    • @nitind9786 2 months ago

      Even I have the same doubt.
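
One quick way to check the expected sign is to evaluate the score of a toy two-mode mixture numerically. The sketch below (in Python) uses assumed means at -4 and 5 with unit variances and equal weights; these parameters are illustrative and not taken from the slide:

    # Score of a two-component Gaussian mixture with modes near -4 and 5
    # (assumed, illustrative parameters -- not the lecture's exact density).
    import numpy as np
    from scipy.stats import norm

    means = np.array([-4.0, 5.0])
    stds = np.array([1.0, 1.0])
    weights = np.array([0.5, 0.5])

    def log_p(x):
        # log-density of the mixture, evaluated pointwise
        return np.log((weights * norm.pdf(x[:, None], means, stds)).sum(axis=1))

    def score(x, eps=1e-4):
        # central-difference estimate of d/dx log p(x), i.e. the score
        return (log_p(x + eps) - log_p(x - eps)) / (2 * eps)

    xs = np.array([4.5, 5.0, 5.5, 6.0])
    print(dict(zip(xs.tolist(), score(xs).round(3).tolist())))

Just past the rightmost mode (x > 5) the score comes out negative, i.e. it points back toward the mode, which matches the intuition in the question above; the same holds immediately to the right of the mode near -4, until roughly the midpoint between the two modes, where the score turns positive toward the other mode.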

  • @nitind9786 2 months ago

    @34:34 - The last integral on the RHS: why is there an s_theta(x) "square" term? Shouldn't it just be s_theta(x)?

    • @haihaibaba 2 months ago

      Yeah, the square should not be there on s_theta(x).
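
As a sanity check on this thread's point, the cross term can also be verified numerically for a simple case. The sketch below (in Python) uses a 1D standard normal p(x) and an arbitrary smooth test function s(x), both illustrative choices rather than the lecture's model, to confirm E_p[s(x) d/dx log p(x)] = -E_p[s'(x)], i.e. the term is linear in s_theta(x), not squared:

    # Monte Carlo check of the integration-by-parts identity behind the
    # comments above:  E_p[ s(x) * d/dx log p(x) ] = -E_p[ s'(x) ].
    # p(x) is a standard normal and s(x) is an arbitrary smooth function.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(1_000_000)            # samples from p(x) = N(0, 1)

    s = lambda t: np.tanh(t) + 0.3 * t**2         # stand-in "model score"
    ds = lambda t: 1.0 / np.cosh(t)**2 + 0.6 * t  # its derivative
    dlogp = lambda t: -t                          # d/dx log N(t; 0, 1)

    lhs = np.mean(s(x) * dlogp(x))
    rhs = -np.mean(ds(x))
    print(lhs, rhs)  # the two estimates agree up to Monte Carlo error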