Stanford CS236: Deep Generative Models I 2023 I Lecture 9 - Normalizing Flows

  • Published May 5, 2024
  • For more information about Stanford's Artificial Intelligence programs visit: stanford.io/ai
    To follow along with the course, visit the course website:
    deepgenerativemodels.github.io/
    Stefano Ermon
    Associate Professor of Computer Science, Stanford University
    cs.stanford.edu/~ermon/
    Learn more about the online course and how to enroll: online.stanford.edu/courses/c...
    To view all online courses and programs offered by Stanford, visit: online.stanford.edu/

COMMENTS • 2

  • @user-zr4ns3hu6y 20 days ago +5

    I think the titles of lecture 8 and lecture 9 have been switched.

  • @CPTSMONSTER 18 days ago

    15:15 High likelihood but bad samples: the garbage component is a constant in the log-likelihood
    40:00? Expectation over p_data and p_theta: how was this chosen?
    46:35 Note the optimization of phi (discriminator) and theta (generator of fake samples)
    50:45 Likelihood model inside the discriminator, but GANs can avoid likelihoods
    1:00:15? Expectation over p_data and p_theta, added?
    1:06:50 Minimax training objective
    1:15:00 GANs are no longer state of the art: very hard to train, mode collapse, and no clean loss function to evaluate
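
For reference, the minimax training objective mentioned in the note at 1:06:50 is the standard GAN objective (Goodfellow et al., 2014), written here using the same symbols as the note at 46:35 — phi for the discriminator parameters and theta for the generator parameters:

```latex
\min_{\theta} \max_{\phi} \;
\mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D_\phi(x)\right]
+ \mathbb{E}_{z \sim p(z)}\!\left[\log\!\left(1 - D_\phi\!\left(G_\theta(z)\right)\right)\right]
```

The discriminator D_phi is pushed to assign high probability to real data and low probability to generated samples, while the generator G_theta is pushed to make its samples indistinguishable from real data; at the optimum this corresponds to minimizing the Jensen-Shannon divergence between p_data and p_theta.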