Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
- Published Oct 20, 2020
- NeurIPS 2020 Spotlight. This is the 3-minute talk video accompanying the paper at the virtual NeurIPS conference.
Project Page: bmild.github.io/fourfeat
Paper: arxiv.org/abs/2006.10739
Code: github.com/tancik/fourier-fea...
Matthew Tancik*, Pratul P. Srinivasan*, Ben Mildenhall*, Sara Fridovich-Keil, Nithin Raghavan, Utkarsh Singhal, Ravi Ramamoorthi, Jonathan T. Barron, Ren Ng
*denotes equal contribution
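The paper's core technique maps low-dimensional input coordinates v through a random Fourier feature embedding, γ(v) = [cos(2πBv), sin(2πBv)], before passing them to the MLP, where B is a matrix of frequencies sampled from a Gaussian whose scale σ controls the bandwidth. A minimal NumPy sketch of that mapping (the matrix size, σ, and variable names here are illustrative choices, not values from the paper):

```python
import numpy as np

def fourier_feature_mapping(v, B):
    """Map inputs v of shape (n, d) to features of shape (n, 2m)
    via gamma(v) = [cos(2*pi*B v), sin(2*pi*B v)] with B of shape (m, d)."""
    proj = 2.0 * np.pi * v @ B.T  # (n, m) projected coordinates
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
scale = 10.0  # sigma: larger values let the downstream MLP fit higher frequencies
B = scale * rng.standard_normal((256, 2))  # 256 random frequencies for 2-D inputs

coords = rng.uniform(0.0, 1.0, size=(1024, 2))  # e.g. normalized pixel coordinates
features = fourier_feature_mapping(coords, B)
print(features.shape)  # (1024, 512)
```

The embedded `features` (rather than the raw coordinates) are then fed to a standard MLP; the paper shows this simple change is what lets the network recover high-frequency detail.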
Thanks a lot! I needed to learn this for a job interview ❤
This is extremely impressive
Hi, thank you for the great work. I just wonder what software you used to make this video, which so vividly shows the iterations, the Fourier features and their std, the frequencies, and the reconstruction.
Great work. Could you please share how you created such a nice presentation?
I found a new idea for anomaly detection based on this paper.
Why would the data points you consider anomalous actually be anomalies? What if the trained network just isn't good enough to discover a subspace that would make those data points normal?
@huytruonguic You can try designing a conditional signal into the MLP, perhaps regularizing it in a variational or sparse-coding manner. You'll see something amazing happen. I built this idea into medical anomaly detection and it works well. There are many properties of Fourier features not explored in this paper.
Plugin for SketchUp?
This looks amazing, why is it not being used in GANs?
Could you explain the reason, if you happen to have gained any insights?