Honestly this is one of the best introductions to the topic. A lot of lectures dive straight into group theory, and people without the relevant background immediately lose interest when they see the mathematical concepts and axioms. This channel deserves more subs and views ❤❤
You have a real knack for explaining difficult subjects simply. There are tutorial videos on Geometric Deep Learning from Bronstein (who organized the GDL tutorial school with Cohen, Bruna, etc.) that cover this concept, but their tutorial needs more math background. Excellent.
Thank you!!
I love your content, and often come back to recall important concepts. Thank you very much, and I hope that soon I will be able to afford to buy you a coffee.
Thank you, happy that it is useful :)
I've been looking into this topic lately; super helpful for wrapping my head around it better!
Awesome :)
Hi, you mention the lower complexity of the network as an advantage. Complex in what sense? To me a complex network means there are more filters/neurons/weights to learn, so how would equivariance reduce connections? Aren't those fixed by our hand-crafted layer topology?
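As a rough way to see what "lower complexity" could mean here, the sketch below (not from the video; the image size, channel counts, and filter size are made-up numbers) counts learnable parameters for a fully-connected layer versus a translation-equivariant convolution mapping the same input shape to the same output shape. The point is that equivariance does not change the hand-crafted topology so much as it ties weights together: the same small filter is reused at every position, so far fewer independent weights have to be learned.

```python
# Minimal sketch, assuming a 32x32 RGB input and 16 output feature maps
# (hypothetical numbers, chosen only for illustration).
H = W = 32           # input height and width
C_in, C_out = 3, 16  # input and output channels
k = 3                # filter size of the convolution

# Fully-connected layer from the whole image to a same-sized feature map:
# every output unit has its own weight to every input unit.
fc_params = (H * W * C_in) * (H * W * C_out) + (H * W * C_out)  # weights + biases

# Translation-equivariant convolution: one k x k filter per
# (in-channel, out-channel) pair, shared across all spatial positions.
conv_params = (k * k * C_in) * C_out + C_out  # weights + biases

print(f"fully connected: {fc_params:,} parameters")   # ~50 million
print(f"convolution:     {conv_params:,} parameters") # 448

# Same input/output shapes, but the equivariance (weight-sharing) constraint
# collapses millions of free weights into a few hundred.
```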
Wonderful video, the examples are incredibly intuitive.
Is it possible to share the slides as well?
Sure! Please send an email to deepfindr@gmail.com and I will attach them :)
@DeepFindr Done. Hope to hear back from you soon :)
Really interesting content.
Awesome as always
Fantastic video thank you so much
Clean as fuck. Thank you very much. Nice visuals, nice explanations. Easy to follow.
Awesome video!!
Fantastic video, thx
I love this video! thank you!
Nice video! Could you please also create a video on capsule networks? With CapsNets you can also achieve equivariance in images.
Yes I'll also include them :)
Thx