Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains (10min talk)
- Published 5 Jul 2024
- NeurIPS 2020 Spotlight. This is the 10 minute talk video accompanying the paper at the virtual NeurIPS conference.
Project Page: bmild.github.io/fourfeat
Paper: arxiv.org/abs/2006.10739
Code: github.com/tancik/fourier-fea...
Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
Matthew Tancik*, Pratul P. Srinivasan*, Ben Mildenhall*, Sara Fridovich-Keil, Nithin Raghavan, Utkarsh Singhal, Ravi Ramamoorthi, Jonathan T. Barron, Ren Ng
*denotes equal contribution - Science & Technology
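For context, the paper's core idea is to pass low-dimensional input coordinates through a random Fourier feature mapping γ(v) = [cos(2πBv), sin(2πBv)] before the MLP, where B is a matrix of frequencies sampled from a Gaussian. A minimal NumPy sketch (the feature count, input dimension, and scale value here are illustrative, not the paper's exact settings):

```python
import numpy as np

def fourier_feature_mapping(v, B):
    """Map input coordinates v of shape (N, d) to features of shape (N, 2m).

    gamma(v) = [cos(2*pi*B v), sin(2*pi*B v)], with B an (m, d) matrix of
    frequencies sampled from N(0, sigma^2).
    """
    proj = 2 * np.pi * v @ B.T                     # (N, m) projections
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
sigma = 10.0                                       # frequency scale; the "scale" discussed in the talk
B = sigma * rng.standard_normal((256, 2))          # 256 random frequencies for 2-D inputs
coords = rng.uniform(size=(4, 2))                  # e.g. pixel coordinates in [0, 1]^2
features = fourier_feature_mapping(coords, B)      # shape (4, 512)
```

The network is then trained on `features` instead of the raw coordinates; the scale sigma controls how much high-frequency content the network can fit.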
Great video, especially the part with the scale is well explained
Thank you so much for this wonderful video!
A+ in my book.
Hi, thank you for the great work. I just wonder what software you used to make this video, which so vividly shows the iterations, the Fourier features and their std, the frequencies, and the reconstruction.
Would it be feasible to somehow incorporate the Fourier features into the activation functions, so that the entire model becomes sensitive to high frequencies instead of just the input?
You can just use the Discrete Cosine Transform to do it. It's much simpler: there's no need for the Fourier transform, which introduces complex values. We have a paper: www.cse.scu.edu/~yliu1/papers/ISCAS2020Yifei.pdf You can write the 2D DCT as a one-dimensional representation for the activation function; see another of our papers: www.cse.scu.edu/~yliu1/papers/ISCAS2021YifeiPei.pdf
However, these transforms only work on fully connected neural networks; they give poor results on CNNs.
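A deterministic cosine-feature mapping in the spirit of this suggestion can be sketched as follows (a hedged illustration of the general idea, not the exact construction from the linked papers): fixed integer cosine frequencies replace the random Gaussian matrix, and everything stays real-valued.

```python
import numpy as np

def dct_feature_mapping(v, m):
    """Deterministic cosine features cos(pi * k * v) for k = 0..m-1,
    applied per input dimension and flattened to shape (N, d*m).
    Real-valued throughout, unlike the complex Fourier transform.
    """
    k = np.arange(m)                          # fixed integer frequencies
    feats = np.cos(np.pi * v[..., None] * k)  # (N, d, m) via broadcasting
    return feats.reshape(v.shape[0], -1)

coords = np.linspace(0, 1, 5)[:, None]        # 5 one-dimensional coordinates
features = dct_feature_mapping(coords, 8)     # shape (5, 8)
```

Because the frequencies are fixed rather than sampled, there is no scale hyperparameter to tune; the trade-off is that the frequency band is chosen by hand via `m`.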
😮