Vector Quantized VAEs

  • Published 28 May 2024
  • Vector Quantized VAEs are the first variational auto-encoders to be competitive with GANs in the quality of the generated images.

COMMENTS • 9

  • @K3pukk4 • 7 months ago

    what a legend!

  • @jonathanyang2359 • 3 years ago +1

    Thanks! I don't attend this institution, but this was an extremely clear lecture :)

  • @LyoshaZebra • 3 years ago

    Thanks for explaining that! Great job. Subscribed!

  • @sdfrtyhfds • 3 years ago +1

    Do you train the PixelCNN on the same data and just not update the VAE weights while training?

    • @davidmcallester4973 • 3 years ago +3

      Yes, the vector quantization is held constant as the PixelCNN is trained.
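
The two-stage setup described in this exchange can be sketched in a few lines. This is a minimal illustration, not the lecture's code: `codebook` and `quantize` are hypothetical stand-ins for the frozen VQ-VAE's codebook and nearest-neighbour lookup, and the resulting `symbols` are the discrete codes the PixelCNN prior would then be trained on.

```python
import numpy as np

rng = np.random.default_rng(0)
K, D = 8, 4                          # codebook size and code dimension (toy values)
codebook = rng.normal(size=(K, D))   # frozen VQ codebook: not updated in stage two

def quantize(z):
    """Map each latent vector to the index of its nearest codebook entry."""
    # (N, K) matrix of squared distances via broadcasting
    d2 = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)

z = rng.normal(size=(32, D))   # stand-in for frozen encoder outputs
symbols = quantize(z)          # discrete symbols; the PixelCNN models p(symbols)
```

Since both the encoder and the codebook are held constant, the symbols for the whole training set can be precomputed once and the prior trained on them directly.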

  • @sdfrtyhfds • 3 years ago

    Also, what if you skip the quantization during inference? Would you still get images that make sense?

    • @davidmcallester4973 • 3 years ago +3

      Do you mean "during generation"? During generation you can't skip the quantization because the PixelCNN is defined to generate the quantized vectors (the symbols).

    • @sdfrtyhfds • 3 years ago +1

      @davidmcallester4973 I guess that during generation it wouldn't make much sense; I was thinking more in the direction of interpolating smoothly between two different symbols.
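
The interpolation idea in this reply can be sketched as follows. Because the decoder consumes continuous vectors, nothing stops you from feeding it convex combinations of two codebook entries instead of an exact quantized vector; whether the decoded images remain sensible off the codebook is an empirical question not settled in this thread. The names (`codebook`, `interpolate`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
K, D = 8, 4
codebook = rng.normal(size=(K, D))   # frozen VQ codebook (toy values)

def interpolate(i, j, steps=5):
    """Blend codebook entries i and j with weights t in [0, 1]."""
    t = np.linspace(0.0, 1.0, steps)[:, None]   # column of mixing weights
    return (1 - t) * codebook[i] + t * codebook[j]

# A straight-line path in latent space; the endpoints are exact codebook
# vectors, the interior points are "between two symbols".
path = interpolate(0, 1)
```

Each row of `path` could then be passed to the decoder to render one frame of the interpolation.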

  • @bernhardbermeitinger8617 • 3 years ago +2

    Thank you for this video; however, please don't call your variable ŝ 😆 (or at least don't say it out loud).