Love that as I'm watching your other g-splat videos, a new one emerges! Incredibly helpful as always, saving us a lot of time and headaches. You are indispensable to this ecosystem, so keep it up and know that we're grateful! Also, props to the Polycam team for shipping this so quickly.
Thank you! Polycam did a great job! This is a no-brainer, especially if you're already in the Polycam ecosystem.
You should do a Luma AI Gaussian Splat vs Polycam Gaussian Splat comparison too!
THIS
And KIRI Engine!
8:45 actually looks really artistic, like brush strokes! Interesting result.
Great comparison dude as per usual!! Thanks for this beautiful video
Thank you! Since I make these in my spare time, I can't quite put the effort or production value into them that I would like to. Keep blazing the trail on engaging content around this!
Nah dude, they are great. People would not be so informed on this topic if it weren't for you! @@thenerfguru
Luma also added GS; you should test and compare that one too.
Probably my next video! I recorded this before the Luma AI announcement.
That's an amazing splat of the cat!! And the drone shot!
You're being very charitable. The difference in results is night and day! It's great to see a free, easy version, but the results aren't equal at all. Looking forward to watching your how-to video on the GitHub Gaussian splatting version.
Good points comparing the pros and cons of both of these GS creation methods. Have you already checked how you can open a point cloud created in Polycam in the SIBR viewer? It is a bit of a hack where you need that small cfg_args file in the directory you open, but it works. The same trick also works for Luma AI's new Gaussian point clouds; they can be opened in the SIBR viewer.
I hadn't actually tried using the cfg_args. I noticed that neither Polycam nor Luma AI includes one. Does any cfg_args file work? Or does it have to be one that comes from the original project's training?
It is a file that you can copy from an original project, i.e. from one of those output folders where you have GS models that you trained with the Inria code. Then the PLY file must be placed in the point_cloud folder and renamed to point_cloud.ply. After that, the SIBR viewer is able to open the file.
@@OlliHuttunen78 I figured. So the downside is if you don't run the original project you don't have the config file. That's a bummer for all of those people who don't have the correct hardware...for example Mac users.
@@thenerfguru Hi Jonathan! I just released a tutorial video about this: how you can open splat PLY files exported from Luma AI or Polycam in the SIBR viewer. Check it out: ua-cam.com/video/xxQr61ifoqM/v-deo.htmlsi=bGCVI9VJnAG8W3LT
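For anyone following along, the folder trick described above can be scripted. This is only a sketch based on the comments here, not an official workflow: it assumes the Inria output convention of a point_cloud/iteration_30000/ subfolder, and all file paths are placeholders.

```python
# Sketch: arrange an exported Polycam/Luma PLY so the SIBR viewer can load it.
# Assumes the Inria output layout (point_cloud/iteration_30000/point_cloud.ply)
# plus a cfg_args file borrowed from a previously trained project.
import shutil
from pathlib import Path

def prepare_sibr_model(exported_ply: str, donor_cfg_args: str, model_dir: str) -> None:
    model = Path(model_dir)
    pc_dir = model / "point_cloud" / "iteration_30000"
    pc_dir.mkdir(parents=True, exist_ok=True)
    # The viewer expects a file literally named point_cloud.ply
    shutil.copy(exported_ply, pc_dir / "point_cloud.ply")
    # cfg_args copied from any earlier Inria training run
    shutil.copy(donor_cfg_args, model / "cfg_args")

# Example with placeholder paths:
# prepare_sibr_model("polycam_export.ply", "old_project/cfg_args", "sibr_model")
```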
Thank you for the good information. I'm studying!
Let me know if you have any questions!
@@thenerfguru How do you edit a point cloud?
Great comparison!! It makes it clear how to decide whether to opt for one or the other.
Would love to see a night scene with lights!! I wonder how it would turn out...
Your tutorials have been super helpful. Do you have any plans to put out a video on web hosting your own splats or editing them? I'm struggling to figure out how to re-orient the ground plane of a splat so that when I'm ready to host, it isn't initially rotated off at strange, off-kilter angles.
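One possible approach (not covered in the video) is to rotate the exported PLY itself before hosting. The sketch below is only an illustration: it assumes the Inria PLY field layout (x/y/z positions, rot_0..rot_3 quaternions stored as w, x, y, z) plus the third-party plyfile and SciPy packages, and it leaves higher-order SH coefficients untouched, which is a simplification.

```python
# Sketch: level a splat's ground plane by rotating positions and per-Gaussian
# orientations in the exported PLY (Inria field layout assumed).
import numpy as np
from plyfile import PlyData
from scipy.spatial.transform import Rotation as R

def level_splat(in_path: str, out_path: str, degrees_about_x: float) -> None:
    fix = R.from_euler("x", degrees_about_x, degrees=True)  # corrective rotation
    ply = PlyData.read(in_path)
    v = ply["vertex"].data

    # Rotate the Gaussian centers
    xyz = np.stack([v["x"], v["y"], v["z"]], axis=1) @ fix.as_matrix().T
    v["x"], v["y"], v["z"] = xyz[:, 0], xyz[:, 1], xyz[:, 2]

    # Rotate each Gaussian's orientation (PLY stores w,x,y,z; SciPy expects x,y,z,w)
    q_wxyz = np.stack([v["rot_0"], v["rot_1"], v["rot_2"], v["rot_3"]], axis=1)
    q_new = (fix * R.from_quat(q_wxyz[:, [1, 2, 3, 0]])).as_quat()
    v["rot_0"], v["rot_1"], v["rot_2"], v["rot_3"] = (
        q_new[:, 3], q_new[:, 0], q_new[:, 1], q_new[:, 2]
    )
    ply.write(out_path)

# Example with a made-up angle: tilt the scene 90 degrees about the X axis
# level_splat("point_cloud.ply", "point_cloud_level.ply", 90.0)
```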
How do these compare to splatfacto-big?
The gaussian-splatting-Windows repo (which I assume you're using?) mentions needing 24 GB of VRAM, and I only have 12 GB, so I'm curious to know if it's worth renting a GPU for this.
In Luma AI you can now export Gaussian splats.
Great comparison! I'm looking forward to several obvious features in the GS pipeline: the ability to mark up "meaningful", "skybox", and "vacuum" volumes in the scene. Given that, I'm sure it would be possible to hide mid-air blobs and skybox splats.
You could get the splats into Unity or Unreal Engine and edit the splats that are floating mid-scene.
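Until that kind of mark-up exists in the pipeline, one blunt workaround is to crop the exported PLY so only Gaussians inside a hand-picked bounding box survive. A minimal sketch, assuming the Inria PLY layout and the third-party plyfile package; the box limits are placeholders you would tune per scene.

```python
# Sketch: drop "mid-air" floater Gaussians outside an axis-aligned bounding box.
import numpy as np
from plyfile import PlyData, PlyElement

def crop_splat(in_path: str, out_path: str, mins, maxs) -> None:
    ply = PlyData.read(in_path)
    verts = ply["vertex"].data  # structured array: x, y, z, opacity, ...
    xyz = np.stack([verts["x"], verts["y"], verts["z"]], axis=1)
    keep = np.all((xyz >= np.asarray(mins)) & (xyz <= np.asarray(maxs)), axis=1)
    PlyData([PlyElement.describe(verts[keep], "vertex")]).write(out_path)

# Example: keep everything within +/- 5 units of the origin (placeholder values)
# crop_splat("point_cloud.ply", "point_cloud_cropped.ply", [-5, -5, -5], [5, 5, 5])
```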
Hi. What software were you using to compare to Polycam? Sorry if I missed this in the intro.
It's the original GitHub project: repo-sam.inria.fr/fungraph/3d-gaussian-splatting/
What are indie game developers waiting for to dive into this, even if it means single-scene games? It's really hard to understand why they haven't.
There are no shadows or real-time lighting yet, and the surface is not ideal for interaction.
I saw that 4D Gaussian splats are now out; could you do a video on this? It's the video version of G-splats, and you don't need a camera dome to use it!
Hey, quick question: could you share your settings for the training? When I do it, I don't get that much detail, even when I do 30k steps.
To be honest, it probably has more to do with the input imagery than with the settings I chose.
Can I upload the desktop one to a server and view it on WebGL?
Great video as always
Yes! Maybe that should be a video… However, performance can be slower on smartphones and lower-end PCs. I believe Polycam has tuned their output to be fast in the browser, not necessarily going for top quality.
@@thenerfguru Thank you for taking time to respond!
I'll be eagerly waiting for your video covering this
@@caedicoes Great!
Hi Jonathan, my name is Alex and I am currently working on a thesis that has to do with NeRF, 3D content generation and video generation. I was wondering if there is a way to get in contact with you as I would love to ask you a few questions!
Hi, Guru. Do you have your splat files hosted somewhere? I want to try them with the @antimatter15 viewer.
I don’t have these scenes publicly hosted.
Been thinking that splats could be translated to vertices and used with AI to generate moving meshes. Just a thought: no static mesh, but mesh vertices based on splats.
Would be curious to see a comparison with Luma AI!
Great comparison! Thanks, the original is far better! It will be great when we can open and use it in Blender.
Yeah, it can be a cloud of planes, with a material that reacts to the angle and position of the camera.
I believe someone is working on a Blender plugin. Also, the original is better, but I think that's partly because they are balancing quality with performance in the browser.
The 'washed out' look might be due to Polycam using a light backdrop color instead of black. (11:11)
Good point!
Does anyone know what the best settings are for a camera like the A6400? I tried to find info but haven't found anything so far.
Thank you for your tutorials. I made my first test with Gaussian splatting. Thanks for sharing your knowledge.
Those floaters might be caused by small lens flares in the images.
Does it work faster or is it more detailed with a monochrome input or raw data?
Can I measure relative widths in the Gaussian splatting result? Thank you!
Hey Jonathan, where is the best place to reach out with business inquiries? I'm building a team of 3D experts for a startup in LA.
Can it run on 8 GB of VRAM, like a 3070 Ti? Or 11 GB of VRAM on a 2080 Ti?
Would you teach us a way to generate a mesh out of my Gaussian splats, using Nerfstudio for example? Very educational videos!
Probably, but it's not going to be good. The topology will be horrible, and there is no way it could be used in anything professional like film/TV CGI; everything would have to be remodeled/retopologized, so in the end it doesn't matter. It depends on what you need it for.
Yeah, it's not going to be better than if you used actual photogrammetry.
Please test it in Colab too.
When is the rebranding to gsplat guru lol /jk
Ha! Just you wait, I have non-splat and non-NeRF content coming too. I do need to rebrand to something more 3D-reconstruction-specific.
The original requires a high-end NVIDIA GPU; no AMD.
I have an expensive GPU, but it's not NVIDIA.
In your case, Polycam or Luma AI is your best choice!