Hi! Could you please elaborate on task transfer? You gave an example of classification of dogs and cats as task 1 and task 2 of horses and elephants. How does knowledge transfer work here?
Thanks for your comment. To elaborate: appearance-wise, all of them are four-legged creatures, so the knowledge that the classes horses and elephants are visually similar to the classes dogs and cats should transfer from task 1 to task 2. I spoke more from the perspective of meta-learning, where we train in episodes and each episode is a task. Hope it makes sense now. Or perhaps the example wasn't the best.
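To make the episodic setup above concrete, here is a minimal sketch of how tasks might be sampled in meta-learning. The class names and the `sample_episode` helper are hypothetical illustrations, not code from the video: each episode draws a small subset of classes (a "task"), so one episode might be {dog, cat} and another {horse, elephant}, and features learned on one task (e.g. "four-legged animal") can transfer to the next.

```python
import random

# Hypothetical illustration: each meta-learning episode is a small
# N-way classification task sampled from a larger pool of classes.
classes = ["dog", "cat", "horse", "elephant", "car", "truck"]

def sample_episode(all_classes, n_way=2, seed=None):
    """Sample n_way classes to form one episode (one task)."""
    rng = random.Random(seed)
    return rng.sample(all_classes, n_way)

task_1 = sample_episode(classes, seed=0)  # e.g. a dogs-vs-cats style task
task_2 = sample_episode(classes, seed=1)  # e.g. a horses-vs-elephants style task
```

A model trained across many such episodes is encouraged to learn features that generalise across tasks rather than to any single class pair.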
Once again, a great way of illustrating recent work. Thanks a lot.
Nice explanation
Thank you Vikash! 😊
Excellent explanation. I love it.
this is a very good presentation
Thank you very much!
Thanks for the great video! Do you mean BERT instead of BIRT when you mention the class token?
Yes, that's a good spot, Lisa. I meant BERT! :)
Great video! It would be nice if you could also post one about DETR and Deformable DETR :)
Thanks, Godwin! I have a video on DETR. Will do those at some point :)
How do you do regression with a Vision Transformer?
Thanks for your question. It's the same as with any neural network or CNN architecture: instead of training your output with a softmax cross-entropy loss, you can train with an L1 or L2 loss.
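As a minimal sketch of that idea (assuming PyTorch, with the ViT backbone stubbed out by a plain linear layer for brevity): the classification head is replaced by a single-output regression head, and the model is trained with an L2 (MSE) loss instead of softmax cross-entropy.

```python
import torch
import torch.nn as nn

class ViTRegressor(nn.Module):
    """Hypothetical sketch: a ViT-style model with a regression head.

    The `backbone` here is a stand-in for a real (pretrained) ViT that
    would return, e.g., the class-token embedding of shape
    (batch, backbone_dim).
    """
    def __init__(self, backbone_dim=768):
        super().__init__()
        self.backbone = nn.Linear(3 * 224 * 224, backbone_dim)  # stub backbone
        self.head = nn.Linear(backbone_dim, 1)  # single continuous output

    def forward(self, x):
        feats = self.backbone(x.flatten(1))
        return self.head(feats).squeeze(-1)  # shape: (batch,)

model = ViTRegressor()
criterion = nn.MSELoss()  # L2 loss; use nn.L1Loss() for L1
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

x = torch.randn(4, 3, 224, 224)  # dummy batch of images
y = torch.randn(4)               # continuous regression targets

pred = model(x)
loss = criterion(pred, y)
loss.backward()
optimizer.step()
```

Everything else (patch embedding, attention blocks, optimisation loop) stays exactly as in the classification setup; only the head and the loss change.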
Very cool
Thank you very much!
Great video! Can you make a coding video?
Yeah, sure, in future videos! ☺