Ian Goodfellow: Generative Adversarial Networks (NIPS 2016 tutorial)

  • Published Jan 5, 2025

COMMENTS • 116

  • @RATMCU 4 years ago +270

    even the lecture on GAN had an Adversary

  • @Prithviization 6 years ago +335

    Schmidhuber vs Goodfellow @ 1:03:00

  • @christophschmidl 6 years ago +69

    Jürgen Schmidhuber starts talking at 1:02:59.

  • @TheAntrikooos 5 years ago +96

    An adversarial interaction about adversarial methods.

  • @dasayan05 4 years ago +67

    Schmidhuber: Can I ask a question ?
    GoodFellow: Oh, You_Again ?

  • @mranonymous8815 5 years ago +69

    Watch from 1:03:00 to 1:05:04 in double speed. The art of schmidhubering vs the art of patience while being schmidhubered.

    • @RATMCU 4 years ago +5

      He took the time to answer the question for a bigger audience than just Schmidhuber, directing people to the places where it matters, and saving time in his presentation.

  • @TapabrataGhosh 5 years ago +170

    1:03:00 is what you're all here for

    • @eugenedsky3264 5 years ago

      Learning Finite Automaton Simplifier: drive _ google _ com/file/d/1GSv89tiQmPDcnFEu4n4CqfaJcUJxVmL5KrSCJ047g4o/edit

    • @y.o.2478 3 years ago +1

      So you come to a video about an amazing technology just for petty drama?

    • @olarenee208 2 years ago +7

      @@y.o.2478 yes

    • @adamtran5747 2 years ago +2

      @@y.o.2478 1000%

    • @carlossegura403 2 years ago +1

      @@y.o.2478 Indeed.

  • @ZulfiqarZaidi-m4e 4 months ago +4

    Schmidhuber putting the ADVERSARIAL in GANs @ 1:03:00

  • @ruchirjain1163 3 years ago +26

    I came here just to understand GANs a bit better; didn't realize I would strike gold at 1:03:00

    • @kanishktantia7899 2 years ago +1

      Can you share your study plan, i.e. how did you learn this?

    • @ruchirjain1163 2 years ago +3

      @@kanishktantia7899 just watch a couple of videos on this, derive the min-max loss at least once on paper, and look at a basic implementation of a GAN from scratch. I was doing it just to make a presentation on a research paper (StyleGAN2), so this was more than enough for me. The same goes for any other topic in DL. I was just dipping my feet in the deep sea of DL; I ain't touching it again.

    • @kanishktantia7899 2 years ago

      @@ruchirjain1163 Sure, that might help. I'm looking for graduate study opportunities abroad in this space. Can you help me in any capacity?

    • @ruchirjain1163 2 years ago

      @@kanishktantia7899 it depends, mate. I would say you should try going for a thesis if your university offers that.

    • @kanishktantia7899 2 years ago +1

      @@ruchirjain1163 I graduated last year and am working these days; I'm not currently enrolled at any university. That's exactly what I'm looking for: any lab or university in the US, or maybe somewhere else.
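
For reference, the min-max loss mentioned in this thread is the standard GAN objective covered in the tutorial (Goodfellow et al., 2014), which can be written as:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}}\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z}\left[\log\left(1 - D(G(z))\right)\right]
```

Here D is the discriminator, G the generator, p_data the data distribution, and p_z the prior over the latent code z. The discriminator maximizes V while the generator minimizes it.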

  • @svenjaaunes2507 5 years ago +88

    So many people in the wild came up with this idea of training two networks against each other. Some lacked the deeper knowledge to continue; some tackled specific problems instead of generalizing. Ian Goodfellow is just another person who came up with this exact idea (reportedly in a casual setting, though the details may be off). Schmidhuber's paper is based on this exact idea too. Goodfellow should acknowledge the overwhelming overlap, let alone the similarity, but he doesn't, because if he did, the attention would shift away from him, since the earlier work is essentially the same. He probably wasn't even aware of Schmidhuber's paper, since it dates back more than 20 years earlier. That doesn't justify withholding proper credit, though. Goodfellow popularized GANs; he certainly didn't invent them. In fact, I bet there were others before Schmidhuber who came up with this core idea but just didn't keep at it.

    • @miraculixxs 21 days ago

      Agree. Goodfellow could and should have just cited Schmidhuber and explained the similarities and differences. Schmidhuber wants recognition, not fame. Goodfellow wanted fame and got it.

  • @cueva_mc 4 years ago +6

    Can someone explain in plain English the reason for the confrontation?

    • @Karl_Squell 4 years ago +2

      stats.stackexchange.com/questions/251460/were-generative-adversarial-networks-introduced-by-j%c3%bcrgen-schmidhuber/301280#301280

  • @firojpaudel 15 days ago

    Thanks, Schmidhuber. I had the same questions while reading the paper...

    • @firojpaudel 15 days ago

      then I read the final copy 😂

  • @architjain6749 5 years ago +52

    I don't know if it's just me, but I enjoyed and understood Sir Jürgen Schmidhuber more than Sir Ian Goodfellow. Will definitely go check out his contributions.

    • @kanishktantia7899 2 years ago +1

      How do you study? Can you share your plan with me?

  • @neuron8186 3 years ago +10

    Really good fellow

  • @raxirex8646 22 days ago

    Has anyone kept track of the instances of plagiarism Jürgen mentioned, and whether they are actually based on facts? He does make good points. We should not allow plagiarism, even accidental, and corrections have to be made.

  • @yoloswaggins2161 5 years ago +62

    If only Schmidhuber had more friends in the field he wouldn't be outmaneuvered in this manner.

    • @hummingbird7579 2 years ago +5

      It should not matter if someone has friends or not within the field!!!

    • @yoloswaggins2161 2 years ago +7

      @@hummingbird7579 It shouldn't but it does!

    • @hummingbird7579 2 years ago +1

      @@yoloswaggins2161 History has shown time and time again... you are right.
      It's really a shame.

  • @MunkyChunk 2 years ago

    Very well organised lecture, thank you Ian!

  • @backnforth8401 4 years ago +16

    1:03:00 what a way of getting called out

  • @OttoFazzl 5 years ago +8

    I think he misspoke at 31:39: he said that we want to make sure that x has a higher dimension than z. Instead, z should have a higher dimension than x, to provide full support over the space of x and avoid learning a lower-dimensional manifold.

    • @busTedOaS 4 years ago +2

      In practice it's almost never that way: z typically has about 100 entries while x is 500x500x3 or something similar. If it were the other way around, the noise input would have superfluous entries, which is what he means by "learning lower-dimensional manifolds", I believe. However, in order to perfectly reconstruct the training distribution, I agree that it makes sense to have it exactly that way, and I'm confused by how he worded that whole part.

    • @akshayshrivastava97 4 years ago

      Yeah, that confused me too. Good to see someone agrees it should be the other way round.
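
To make the dimension mismatch in this thread concrete, here is a minimal NumPy sketch. The shapes are hypothetical, chosen only for illustration; a single random linear layer stands in for a trained generator.

```python
import numpy as np

# In typical GANs the latent code z is far smaller than the data x.
rng = np.random.default_rng(0)

z_dim = 100                     # latent code size, e.g. z ~ N(0, I)
x_shape = (64, 64, 3)           # generated "image"
x_dim = int(np.prod(x_shape))   # 12288

# One random linear map standing in for a generator, just to show
# the mapping from a small z to a much larger x.
W = rng.standard_normal((x_dim, z_dim)) * 0.01
z = rng.standard_normal(z_dim)
x = (W @ z).reshape(x_shape)

print(z.shape, x.shape)         # (100,) (64, 64, 3)
# Because dim(z) << dim(x), every output of this generator lies on an
# (at most) 100-dimensional manifold inside the 12288-dimensional x space.
```

This is the geometric point behind the discussion: with dim(z) < dim(x) the generator cannot cover all of x-space, which is why the commenters argue about which dimension should be larger.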

  • @sairocks128 6 years ago +1

    Thank you very much for uploading.

  • @masisgroupmarinesoftintell3299 3 years ago

    Thank you for uploading the wonderful video! The explanations were really clear.

  • @AvielLivay 2 years ago

    13:24: when searching for the best theta, Ian sums over the logs of the probabilities rather than the probabilities themselves. Why?

    • @piclkesthedrummer6439 1 year ago +1

      If it's still relevant, I'll try to help. This is maximum likelihood estimation, which maximizes the product of the estimated probabilities over the training dataset. Since the derivative of a product is awkward to work with, we take the log, which turns the product into a sum of logs. Because log is a monotonic function, it doesn't change the location of the optimum, and it's easier to differentiate. Hope it helped.
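
A quick numerical check of the log-likelihood point discussed in this thread (the probabilities below are made up for illustration): since log is monotonic, the product of probabilities and the sum of their logs peak at the same parameters, and the sum form also avoids numerical underflow.

```python
import numpy as np

# Hypothetical per-example model probabilities p(x_i; theta)
p = np.array([0.2, 0.9, 0.5, 0.7])

likelihood = np.prod(p)             # product over the dataset
log_likelihood = np.sum(np.log(p))  # sum of logs: what is maximized in practice

# log of the product equals the sum of the logs
assert np.isclose(np.log(likelihood), log_likelihood)

# For large datasets the raw product underflows to zero, while the
# sum of logs stays finite:
p_big = np.full(10_000, 0.5)
print(np.prod(p_big))               # 0.0 (underflow)
print(np.sum(np.log(p_big)))        # about -6931.47, i.e. 10000 * log(0.5)
```

The underflow example is the practical reason every ML library optimizes log-likelihoods rather than raw likelihoods.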

  • @svenjaaunes2507 5 years ago +38

    predictability minimization is almost literally the same thing as GANs

    • @busTedOaS 4 years ago +14

      almost literally kind of exactly vaguely the same thing.

  • @harshinisewani5095 4 years ago

    Can someone give more insight into the exercises?

  • @g.l.5072 2 years ago +1

    I don't think people realize this is one of the most important lectures of the past 100 years... GANs... they will be everywhere soon.

  • @Peace_in_you 5 years ago +1

    I have a question: is this kind of network only good for data similar to the training data?

    • @bitbyte8177 5 years ago

      No

    • @busTedOaS 4 years ago +8

      The generator, if successful, will recover the training distribution, and nothing else, if that answers the question.

  • @wenboma4398 6 years ago

    Good speech, thx

  • @dibyaranjanmishra4272 5 years ago +5

    THERE IS NO BETTER LECTURE ON INTRODUCTION TO GAN. PERIOD.

  • @Marcos10PT 4 years ago +32

    Schmidhuber has a point

    • @hummingbird7579 2 years ago +4

      I feel bad for him. He deserves more recognition.

  • @nikhilmuthukrishnan7222 6 years ago +1

    Pure genius. Is it available in R?

  • @miraculixxs 21 days ago

    That's just sampling with a new name. The innovation is in the naming, not the method.

  • @user-pz7sl4qq9v 2 years ago

    Well well well, another prodigy from Stanford who has made AI more complicated and sophisticated for the greater good of humanity 😔... Isn't a GAN what's CONDUCTING the war now lol 😂. I remember the movie WARGAMES.

  • @akshayshrivastava97 4 years ago +7

    OK, people like Dr. Schmidhuber need to understand that a tutorial is not the place for this kind of discussion. It could easily have been taken offline. Also, once the presenter says they have no desire to discuss it at that moment, back off. It's conceited and self-important to think your argument with the presenter is more important than everyone else, who paid for NIPS and had been looking forward to this tutorial, getting their time and money's worth.
    That said, I'm not downplaying what Dr. Schmidhuber was trying to point out, simply saying that it could have been discussed differently and elsewhere.

    • @MrMSS22 3 years ago +13

      If Schmidhuber had raised his concern only offline, it would not have gotten the academic publicity it did. It's reasonable to assume that publicity was his intention, so it didn't matter that Goodfellow's presentation was a tutorial.

  • @youtubeadventurer1881 5 years ago +40

    Why is he obfuscating everything with needless mathematical jargon that most ML researchers won't understand? This stuff actually isn't so complicated that you need a degree in mathematics to understand it. You can understand it on a deep level with only high school mathematics if it is explained properly.

    • @teckyify 5 years ago +48

      Oh sorry, what would you like to talk about instead? VS Code themes, or the latest JavaScript framework? Any university ML course is heavy on math, Einstein.

    • @OttoFazzl 5 years ago +17

      The engineering details are straightforward; however, this is a NIPS lecture, so they have to give theoretical justification for how it works. If you just want to engineer a working system you don't need all of this, I agree with that.

    • @robbiedozier2840 4 years ago +12

      Man, it’s almost like Computer Science is a subset of Mathematics...

    • @robbiedozier2840 3 years ago

      @@sZlMu2vrIDQZBNke8ENmEKvzoZ lmao

    • @judedavis92 3 years ago +1

      Go write your HTML code, kid.

  • @pranav7471 5 years ago +34

    Such a fake. Schmidhuber is the true creator of GANs, and Goodfellow had the nerve to shut him up.

    • @busTedOaS 4 years ago +7

      Two people inventing more or less the same thing independently of each other has happened many times in history, for example Newton's method or Darwin's theory of natural selection. Calling either of them fake is rather presumptuous.

    • @pranav7471 4 years ago +12

      @@busTedOaS bro, Goodfellow came decades after this guy; that's not called simultaneous invention

    • @busTedOaS 4 years ago +4

      @@pranav7471 That's why I said independent, not simultaneous. Bro.

    • @pranav7471 4 years ago +11

      @@busTedOaS I completely understand that, but you need to credit the first creator too; that's the only problem I have. As far as I know, Goodfellow was the first one to make adversarial networks actually work. Schmidhuber worked with 1000x worse hardware and wasn't able to get any real results, so it was just a cool idea with no solid results backing it up. Thus Goodfellow deserves credit, but not all of it. Even after all these arguments, Goodfellow refused to acknowledge the clear similarity between the works and cite the earlier one, which is blatantly unethical from an academic standpoint.

    • @busTedOaS 4 years ago +2

      @@pranav7471 Schmidhuber had modern hardware in 2014, plus, presumably, years of experience with adversarial models ahead of Goodfellow. I don't see any disadvantage for Schmidhuber there.
      I agree that one should cite related work, and Goodfellow does this consistently; in fact, that's what he did right before the confrontation. Why would he specifically ignore Schmidhuber's work while citing many other works with even stronger similarities? The reasonable explanations I can come up with are (1) personal spite, or (2) he honestly thinks the techniques are sufficiently different.

  • @EB3103 3 years ago +11

    This Ian kid is rude and not so goodfellow. Schmidhuber politely asked a question and got attacked.

    • @EB3103 3 years ago +1

      Also, Schmidhuber could be his father; he should show a little more respect.

    • @NavinF 3 years ago +3

      @@EB3103 Ok boomer

    • @Nickyreaper2008 3 years ago +4

      We're talking about a guy who says he invented "generative adversarial networks" when the paper clearly mentions 7 other people, university staff, and more working on the project. Of course he's gonna talk like that.

    • @floydamide 8 months ago

      @@Nickyreaper2008 He also likes to call himself "The industry lead"

  • @TobiSemester 7 months ago

    My role model. We are science 🔭🧪