Continual Learning and Catastrophic Forgetting

  • Published Dec 22, 2024

COMMENTS • 18

  • @weinansun9321 • 4 years ago • +10

    this is a gem on the internet...more people should know about these videos!

  • @ahmedbahaaeldin750 • 4 years ago • +6

    This video is genius, why isn't it famous???

    • @lorenzoleongutierrez7927 • 5 months ago

      Man, here in 2024 and I just discovered these concepts and this vid. Amazing!

  • @prof.laurenzwiskott • 4 years ago • +8

    Very nice lecture giving a good overview. Thanks a lot.
    One remark: there are actually regions in the brain that continuously generate new neurons, in particular the dentate gyrus in the hippocampus, and we have actually built a model for avoiding catastrophic interference based on that; see Wiskott, L., Rasch, M. & Kempermann, G. A functional hypothesis for adult hippocampal neurogenesis: avoidance of catastrophic interference in the dentate gyrus. Hippocampus, 2006, 16, 329-343.

  • @3koozy • 3 years ago • +1

    Thank you very much, Paul, for this brilliant summary of the "Continual Learning" topic, you saved my day!

  • @juandiego2045 • 3 years ago • +1

    Great approach to the problem, best explanation I've found.

  • @niraj5582 • 3 years ago • +1

    Excellent content to get a quick overview.

  • @lukepeterson868 • 4 years ago • +4

    Paul, do you have a Patreon? Your videos are awesome.

  • @piupunia6373 • 4 years ago • +2

    This video was really informative and well described.

  • @boburniyozov62 • 6 months ago

    Indeed a very good video, well and easily explained.

  • @silviasanmartindeporres7033 • 2 years ago

    Brilliant lecture!

  • @abbaskhaliil4717 • 4 years ago • +2

    Hi, nice explanation. Do you have Python code for this task?

  • @reihanehmirjalili7467 • 8 months ago

    Amazing video!

  • @PuMpKinSpIcE666 • 4 years ago • +2

    god-tier video

  • @prasenjitgiri919 • 3 years ago

    Paul, a very good explanation, but dude, c'mon, why is the volume so low!

  • @yeyerrd • 2 years ago

    Great talk! Thank you for the video.
    Just two comments regarding the notation at ua-cam.com/video/vjaq03IYgSk/v-deo.html:
    1. During initialization, shouldn't Y_o be Y^hat_o, since that is the output of the network?
    2. In the argmin formula, isn't Y_o the same as Y_n?

  • @kil98q • 4 months ago

    I don't know much about this subject, but what if you didn't need the old training data and instead generated it with the current neural network and its knowledge of which objects or things were related? You could dream up the old skill and include the new situation. Nvm, this has been covered in the video.
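The idea in this last comment — regenerating old training data from the network itself rather than storing it — is known in the continual-learning literature as pseudo-rehearsal (or, with a learned generator, generative replay). A rough toy sketch of the mechanism, using a hypothetical linear model in NumPy (not the method from the video):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Old" network: a linear map assumed to have been trained on task A.
W_old = rng.normal(size=(4, 2))

def predict(W, X):
    """Forward pass of the toy linear model."""
    return X @ W

# Pseudo-rehearsal: instead of storing task-A data, sample random
# inputs and label them with the old network's own outputs.
X_pseudo = rng.normal(size=(200, 4))
Y_pseudo = predict(W_old, X_pseudo)

# New task-B data (hypothetical, just random here).
X_new = rng.normal(size=(50, 4))
Y_new = rng.normal(size=(50, 2))

# Fit the new weights on the union: the pseudo-samples anchor the model
# to the old input-output behaviour while it also fits the new task.
X = np.vstack([X_pseudo, X_new])
Y = np.vstack([Y_pseudo, Y_new])
W_new, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Measure how far the new model drifts from the old one on fresh inputs.
X_test = rng.normal(size=(100, 4))
drift = np.abs(predict(W_new, X_test) - predict(W_old, X_test)).mean()
print(f"mean drift from old model: {drift:.3f}")
```

Because the 200 pseudo-samples outnumber the 50 new points, fitting task B cannot fully overwrite the old mapping — which is the intuition behind replaying "dreamed" data.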