Ref-NeRF: Structured View-Dependent Appearance for Neural Radiance Fields
- Published Mar 1, 2022
- We introduce Ref-NeRF, which fixes NeRF's failure to produce realistic specular highlights, and enables scene editing.
Project page: dorverbin.github.io/refnerf - Science & Technology
"Hold onto your papers"
"What a time to be alive!"
This looks very impressive. I installed the NVIDIA NeRF system a few weeks ago and found that very interesting too. I hope you release some code in the future, as this looks very promising for us 3D artists, especially if you can export geometry plus diffuse/color maps at a quality similar to photogrammetry.
Beautiful work!
OMG Brilliant work, well done!
This is incredible! keep it going!
Whoah, it's so pretty!
Very cool! 🔥
good work
neat !
Hi, this is great. I would like to ask a dumb question: what is the relationship between this and Plenoxels, also known as "NeRF without neural networks"? Thanks.
Very nice work! However, one flaw I see is that you model the reflection-direction distribution as a circularly symmetric Gaussian, which corresponds to the Phong shading model, and that is not how reflection directions are distributed in reality. Just look at reflections of streetlights on a rainy day: they look almost like vertical lines, not like discs.
A shading model that captures the real shape of reflections much better is the Blinn-Phong model, which instead approximately corresponds to modelling the surface normal as a circularly symmetric Gaussian, and it does indeed produce street-light reflections that look almost vertical. I derived a reflection model very similar to Blinn-Phong in Appendix B of my master's thesis, "Wave Model and Watercraft Model for Simulation of Sea State."
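The difference between the two models mentioned above can be sketched numerically. This is a minimal, illustrative comparison (not code from the paper or the thesis): Phong raises the dot product between the mirror-reflected light direction and the view direction to a shininess power, while Blinn-Phong uses the half-vector between light and view, which broadens and elongates highlights at grazing angles. The geometry and function names here are assumptions for demonstration only.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def phong_specular(light, view, normal, shininess):
    # Reflect the light direction about the surface normal,
    # then measure alignment with the view direction.
    r = 2.0 * np.dot(normal, light) * normal - light
    return max(0.0, float(np.dot(r, view))) ** shininess

def blinn_phong_specular(light, view, normal, shininess):
    # Use the half-vector between the light and view directions;
    # at grazing angles this gives a wider, stretched highlight.
    h = normalize(light + view)
    return max(0.0, float(np.dot(normal, h))) ** shininess

# Illustrative grazing geometry: light comes in low over the surface,
# the viewer looks from an off-mirror direction.
n = np.array([0.0, 0.0, 1.0])
l = normalize(np.array([1.0, 0.0, 0.2]))
v = normalize(np.array([-0.8, 0.0, 0.6]))

print(phong_specular(l, v, n, 32))
print(blinn_phong_specular(l, v, n, 32))
```

At this off-mirror, near-grazing configuration the Blinn-Phong term stays noticeably larger than the Phong term, which is the mechanism behind the stretched streetlight reflections described in the comment.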
Is there a more detailed video about the paper?
What does the car from 6:47 look like from the bottom? I can't sleep because of it -,-
How does this technique compare on an actually foggy scene?