I'd appreciate a tutorial on Nerfstudio itself as well. Essentially a basic beginner's guide would be awesome. 🙏😊
I’ll make it so.
@thenerfguru that would be amazing
Your excitement and enthusiasm are palpable, so much so that to fully bask in the glory, and to allow my brain to try and comprehend how this will affect my world, I have had to set my playback speed to 0.75 🙂
So go slower, I hear! Glad you find the content inspiring. That was my goal! I'm not the MOST creative person, but if I show this to everyone, people will make interesting things once they learn how to use it.
Thank you for those videos and for publishing them so fast. Hopefully we will see 360 photos soon 😊
Probably the next one.
subbed. nerf is really intimidating to approach from just some text files and a few energy drinks. this video series is gold!
Glad to hear! I promise you that the more of these projects you dive into, the easier it gets. You will find that there is a ton of overlap with 3D Gaussian Splatting, Nerfstudio, Instant NeRFs, and other 3D reconstruction from imagery techniques.
@thenerfguru I saw you using VR in a NeRF? Is there a VR viewer? Or possibly a way to export NeRFs into a mesh? I heard it asked a few times here but didn't see a reply.
Thanks for your videos, Jonathan! I made sure to give you a shout-out in my most recent video talking about Gaussian Splatting. I used old 2D and 360 video/photos to see what was possible and had some interesting results! I'm fascinated by how this kind of tech will reshape the entertainment capture industry. 🤗
Keep going, great content on your channel. A Nerfstudio windows Install tutorial would be fantastic.
I'll get it done this week. It will cover installation and how to train your first scene with the nerfacto method.
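In the meantime, the usual Nerfstudio flow boils down to roughly two commands. Here is a minimal sketch that just shells them out from Python (paths are placeholders, and exact flags can vary between Nerfstudio versions):

```python
# Rough sketch of the typical Nerfstudio workflow (placeholder paths).
import subprocess

# 1) Turn a video into a Nerfstudio dataset (frame extraction + COLMAP poses).
subprocess.run([
    "ns-process-data", "video",
    "--data", "my_video.mp4",         # placeholder: your input video
    "--output-dir", "data/my_scene",  # placeholder: where the dataset goes
], check=True)

# 2) Train the scene with the nerfacto method; the viewer URL prints to the console.
subprocess.run([
    "ns-train", "nerfacto",
    "--data", "data/my_scene",
], check=True)
```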
Awesome, thanks Jonathan!
You’re welcome!
Hello there, I recently followed your instructions, but I encountered an issue with Nerfstudio. It requires a YML file using the "--load-config" option, but as you know, the 3DGS process doesn't produce a YML file. As a result, it's difficult to follow along with your video now. Is there any way to open the 3DGS PLY file using Nerfstudio? I'm quite new to this field, and thank you for your great video regardless!
Great educational format diving right in.
Yaaaaas!
awesome video! Thank you so much,
I was wondering if it's possible to use the instant-ngp viewer with splats? I love the ability to control the crop box with extents and the gumball.
Technically, not out of the box. I believe Infinite Realities built their own in-house viewer based on Instant NGP. We'll have to wait and see if something comes out publicly.
Nice tutorial, thank you. I guess NerfStudio will also allow saving mesh as .OBJ format, right?
Those tools are for the original Nerfstudio implementation. They do not work for this project.
To be clear: using Gaussian Splatting, we cannot export meshes/point clouds in Nerfstudio, we would have to feed it NeRFs initially? I see a comment below that there could be an export using splats; can you go over that?
Gaussian Splatting is not really a technology focused on creating accurate geometry. If that's what you're after, I would suggest looking into something like Neuralangelo. I plan to cover that technology in the coming week or so.
A Nerfstudio install video, like everyone is saying, would be great. Can I suggest doing the video on a brand-new install of Win 10 and then installing all the dependencies such as VS2019 (or VS2022)?
That's exactly how I would do it!
Great job mate! Thanks
Thanks!
I really like your video!
This tech seems like it would revolutionize computer graphics. It seems to cut out the need to render polygons and all the intensive rendering pipelines traditionally required to reproduce 3d environments. You wouldn't even need a graphics rendering engine so to speak, or am I way off base?
4:19 Just in case someone is having issues at the same point (there is a video cut there) when installing the submodules: being in the correct folder didn't fix the issue for me. I had to install PyTorch first, and then it worked.
Ah, I already had PyTorch installed in the conda environment. It's a dependency. Thanks for pointing that out.
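If anyone wants a quick sanity check before building the CUDA submodules, something like this (run inside the activated conda environment) confirms that a CUDA-enabled PyTorch is actually present:

```python
# Sanity check before building the diff-gaussian-rasterization / simple-knn
# submodules: they are CUDA extensions and need a CUDA-enabled PyTorch first.
import torch

print("PyTorch version:", torch.__version__)
print("Built against CUDA:", torch.version.cuda)   # None means a CPU-only build
print("CUDA available:", torch.cuda.is_available())
```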
Does this technology allow one to create virtual environments in 3D software? I work with matchmove, and the point clouds got me super interested.
Usually matchmove departments get lidar scans to work with. I wonder if this tech could work or help in some way with VFX workflows for set extensions, etc.
Not sure. I am not an expert in this exact realm of technology. My hunch is no for now.
Can we export a 3D mesh, for example .obj, from 3D Gaussian Splatting?
Technically possible, but not with the code I am using. Also, the output will be vertex colored, not nicely photo-textured.
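For anyone curious what "vertex colored" means here: the trained splat file is itself a point cloud you can read directly. A rough sketch, assuming the standard point_cloud.ply layout from the official repo and the plyfile package (the path is a placeholder):

```python
# Read a trained Gaussian Splatting PLY and recover approximate per-point RGB
# from the DC spherical-harmonics coefficients (vertex colors only, no textures).
import numpy as np
from plyfile import PlyData

C0 = 0.28209479177387814  # zeroth-order SH basis constant

ply = PlyData.read("point_cloud/iteration_30000/point_cloud.ply")  # placeholder path
v = ply["vertex"]

xyz = np.stack([v["x"], v["y"], v["z"]], axis=-1)
f_dc = np.stack([v["f_dc_0"], v["f_dc_1"], v["f_dc_2"]], axis=-1)
rgb = np.clip(0.5 + C0 * f_dc, 0.0, 1.0)  # SH DC term -> approximate [0, 1] RGB

print(xyz.shape, rgb.min(), rgb.max())
```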
you are the best!
Is it possible to take real dimensions using Gaussian Splatting? For example, room dimensions, etc.?
I wouldn't measure on a Gaussian Splat scene. It's based on a point cloud, so you could sample measurements on that. Still, I wouldn't trust the accuracy.
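If you do want to sample a measurement off the point cloud, here is a minimal sketch, assuming you have a point-cloud .ply (e.g. the sparse COLMAP cloud) and Open3D installed. Keep in mind the reconstruction has arbitrary scale, so you still need a known real-world distance in the scene to convert the units:

```python
# Measure the distance between two hand-picked points in a reconstructed point cloud.
# NOTE: COLMAP / splat scenes have arbitrary scale; divide by a known reference
# length in the scene to turn these units into meters.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("sparse_points.ply")  # placeholder path
pts = np.asarray(pcd.points)

p1, p2 = pts[100], pts[2000]  # indices of two points you want to measure between
print("distance (scene units):", np.linalg.norm(p1 - p2))
```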
Thank you for your videos, I really appreciate them. I have seen that you still cannot export a mesh or a point cloud from the Gaussian Splatting viewer, but can it be done from the Nerfstudio viewer?
I don't believe so.
Thanks for the reply! We'll have to wait, then. @thenerfguru
By the way, could you make a video on how to install Neuralangelo? It looks very interesting 😉 @thenerfguru
You are the best! 😊
It would be awesome if you made a tutorial to install Nerfstudio + Gaussian Splatting.
Basically it was this video plus a couple commands. Haha
@thenerfguru hahahaha
@aestendrela Literally, I just skipped the portion on how to install the Nerfstudio dependencies. It's two commands at the command prompt. The trick is matching the correct command with your version of the CUDA toolkit.
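If it helps, the mismatch that usually bites people is the CUDA toolkit version versus the CUDA version PyTorch was built for. A small sketch for comparing the two, assuming nvcc is on your PATH:

```python
# Compare the CUDA version PyTorch was built against with the installed toolkit.
# If these disagree (e.g. 11.8 vs 12.1), the CUDA extensions often fail to build.
import subprocess
import torch

print("PyTorch built for CUDA:", torch.version.cuda)
print(subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout)
```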
Hey, what's the difference between Nerfstudio and Instant NGP?
Both are used to generate NeRFs. Instant NGP is NVIDIA's project to produce NeRFs fast! It has a Windows binary and you need to license it to use it commercially. Nerfstudio is an open-source collaborative project where you can train NeRFs using an assortment of methods. It is a little bit more code-heavy to run, but not by much.
Is it possible to export an OBJ or FBX for use in Unreal Engine 5?
No, but there is a plugin now on the marketplace to view the data natively in UE5.
I followed your tutorials but somehow my generated clouds are full of artifacts.
Nice.
Thanks!
Would be pretty awesome if we could export these into Blender and view them in VR.
People are working on that. How about Unity in VR? That may be a video I make soon.
no one has made a video on how to splat a ply file ahhhh can you make one pleaseeee
How to splat a ply file? like a random one?
Is there any possibility to grab the model and put it in Unity or Unreal Engine?
I saw a proof of concept in UE5 today. x.com/kenjiasaba/status/1698508499691467256?s=46&t=LkxosscvTGfFgbDlP6AL7Q
Does it support dynamic scenes?
Am I never going to be able to watch my results? :D :D
The precompiled viewer wants me to buy a new GPU.
If I want to compile it, CMake won't see CUDA no matter what I do.
The browser viewer almost works.
And this one says:
"fatal: Remote branch gaussian-splatting not found in upstream origin"
but it is there, in the same folder as you showed...
(I had no issues running Nerfstudio or instant-ngp before.)
Hmm, I suggest asking questions on the original GitHub project's discussion page.
@thenerfguru Sorry, thank you :)
What if we shoot the surroundings first and then record a video of a character doing some action? Is there a way to generate an output where we can see the surroundings and the moving object in it?
Like an output which can showcase the change of state of an object in the content (adding a timeline).
You could get this all done somehow using a Unity, UE5, or Blender integration. Those are all coming.
@thenerfguru That's good to hear. Looking forward to watching your videos on these...
Hi, is it available in Colab the same way?
thanks for sticking with Unity
My pleasure! I’ll be curious how they respond to the creator backlash.
How to use the Nerfstudio viewer with 3D Gaussian Splatting in Google Colab?