Very nice lecture giving a good overview. Thanks a lot. One remark: there are actually regions in the brain that continuously generate new neurons, in particular the dentate gyrus in the hippocampus, and we have actually built a model for avoiding catastrophic interference based on that; see Wiskott, L., Rasch, M. & Kempermann, G., "A functional hypothesis for adult hippocampal neurogenesis: avoidance of catastrophic interference in the dentate gyrus," Hippocampus, 2006, 16, 329-343.
Great talk! Thank you for the video. Just two comments regarding the notation at ua-cam.com/video/vjaq03IYgSk/v-deo.html: 1. During initialization, shouldn't Y_o be Y^hat_o, since that is the output of the network? 2. In the argmin formula, isn't Y_o the same as Y_n?
Don't know much about this subject, but what if you didn't need the old training data and instead generated it with the current neural network, using its knowledge of which objects or things were related? You could dream up the old skill and include the new situation. Never mind, this has been covered in the video.
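The idea in this comment ("dream up" old data from the current model and rehearse it alongside new data) can be sketched in a few lines. This is a hypothetical toy illustration, not code from the video: the "network" here is just a 1-D nearest-centroid classifier, and all names and numbers are made up for the sketch.

```python
import random

class Centroids:
    """Toy classifier: each label is represented by the mean of its inputs."""
    def __init__(self):
        self.centroids = {}  # label -> mean of training inputs

    def fit(self, xs, ys):
        sums, counts = {}, {}
        for x, y in zip(xs, ys):
            sums[y] = sums.get(y, 0.0) + x
            counts[y] = counts.get(y, 0) + 1
        self.centroids = {y: sums[y] / counts[y] for y in sums}

    def predict(self, x):
        return min(self.centroids, key=lambda y: abs(self.centroids[y] - x))

random.seed(0)
old_xs = [random.gauss(0.0, 0.1) for _ in range(50)]  # task A inputs near 0
new_xs = [random.gauss(5.0, 0.1) for _ in range(50)]  # task B inputs near 5

model = Centroids()
model.fit(old_xs, ["A"] * len(old_xs))  # learn task A; old data then discarded

# "Dream up" the old skill: sample random inputs, label them with the
# current model, and rehearse those pseudo-pairs together with the new task.
pseudo_xs = [random.uniform(-1.0, 1.0) for _ in range(50)]
pseudo_ys = [model.predict(x) for x in pseudo_xs]

model.fit(new_xs + pseudo_xs, ["B"] * len(new_xs) + pseudo_ys)

print(model.predict(0.0))  # "A": old skill retained without stored old data
print(model.predict(5.0))  # "B": new skill learned
```

Without the pseudo-pairs, the second `fit` would erase class "A" entirely; mixing in self-generated rehearsal data is the essence of the pseudo-rehearsal trick the commenter is describing.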
this is a gem on the internet...more people should know about these videos!
Agree !
This video is genius. Why isn't it famous???
Man, here in 2024 and I just discovered these concepts and this video. Amazing!
Thank you very much, Paul, for this brilliant summary of the "Continual Learning" topic. You saved my day!
Great approach to the problem; best explanation I've found.
Excellent content to get a quick overview.
Paul, Do you have a patreon? Your videos are awesome.
This video was really informative and well presented.
Indeed, a very good video. Clearly and simply explained.
Brilliant lecture!
Hi, nice explanation. Do you have Python code for this task?
Amazing video!
god-tier video
Paul, it's a very good explanation, but dude, c'mon, why is the volume so low?!