She splat on my Gaussian till I render
I hate how that works, now take my like and get out /pos
Oh look it's me >///
Its also me
I've been using that scan to show people the power of Gaussian Splatting. It came out so good! IT'S SO FLUFFY!
Ouppy!
the floof :0
This thing is super dang interesting!! Imma look into it more, thank you for talking about this subject!
Saw Gaussian splats and NeRFs in some recent Corridor videos and wanted to mess with them in Resonite, they would be super neat
I really love how easy it is to upload assets directly into the game, see them right away, and be able to interact with them. Probably time to check out Resonite again xD
It's neat what you can do in Resonite, just spawning a video window or objects. Wet dreams for VRChat. But they are getting there slowly.
Glad to see 3DGS could be integrated in Resonite; I was pretty sure you were working on it because it's so good for VR and the web. I made some videos about Gaussian splatting on my YT channel, with links in the description if you want, and a podcast in English about the tech and its future.
FYI the metaverse platform framevr implemented GS this week
8:48 Wait, so each splat has an array of information based on the angle the camera is viewing it from? So could you essentially make visual interactions per eye, creating depth per angle? But then the resolution of angle points would have to increase the closer you get and decrease as you get further away. So the limiting factor is what shape, what angle, and what colour they are per angle.
So could you have a bunch of "preset particles" to, say, paint with? And create a group of splats that react in specific ways to light and the camera? Essentially making shader-based volumes?
So what's the resolution of the angle reference for the splats? Can you increase them? And decrease them with an LOD reference?
@Jack_Wolfe From what I have seen, the game Dreams on PS4 does something similar.
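For what it's worth, the "angle resolution" in 3DGS isn't a discrete array of per-angle samples: each splat stores spherical-harmonic (SH) coefficients, and its color is evaluated continuously for whatever direction the camera looks from (so in VR each eye really does get a slightly different answer). A minimal sketch of that evaluation, using only the first two SH bands for brevity (real implementations typically go up to degree 3, i.e. 16 coefficients per color channel):

```python
import numpy as np

# Normalization constants for the first two real spherical-harmonic bands
C0 = 0.28209479177387814
C1 = 0.4886025119029199

def splat_color(sh, view_dir):
    """Evaluate a splat's view-dependent color from its SH coefficients.

    sh: (4, 3) array of degree-1 SH coefficients per RGB channel
        (a simplification; 3DGS usually stores degree-3, 16 per channel).
    view_dir: unit vector from the camera toward the splat center.
    """
    x, y, z = view_dir
    color = (C0 * sh[0]
             - C1 * y * sh[1]
             + C1 * z * sh[2]
             - C1 * x * sh[3])
    # The 0.5 offset recenters the signal into displayable range
    return np.clip(color + 0.5, 0.0, 1.0)

# The same splat seen from two directions yields slightly different colors:
sh = np.zeros((4, 3))
sh[0] = [1.0, 0.5, 0.2]   # base (view-independent) color term
sh[3] = [0.3, 0.0, 0.0]   # red component shifts as the view swings along x
front = splat_color(sh, np.array([0.0, 0.0, 1.0]))
side  = splat_color(sh, np.array([1.0, 0.0, 0.0]))
```

So the "resolution" knob is the SH degree, not a count of sampled angles, and it smoothly interpolates for every direction; `splat_color` is an illustrative name, not anything from the Resonite or 3DGS codebases.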
EXCITING! It'll be a first for a VR world to incorporate Gaussian Splatting
I could listen to you talk tech all day
Gracia AI has a gaussian splatting viewer working in SteamVR. It is technically possible.
Oh yeah, there's no reason it wouldn't be possible, it's just a matter of putting time into it. There are existing solutions to render these in VR.
16:26 So is there a way to combine both methods? Taking the mesh data from the first one and guiding the Gaussian splats to the right depth?
1:49 - Yes, objects look great, but they aren't interactable at all. No physics, no lighting, no dynamics or animation. The only thing that fits the description above is walls, but there's no splatting needed for those. Any other things on the map, like props or foliage, should be "alive" in VR (you touch the leaves on a plant and they bend, for example), and that isn't a fit for Gaussian splatting.
Actually, I hate static worlds in VRChat. There are a lot of props and NONE of them can be picked up!
I'm just curious: can we deform the location of points in space using something like a lattice or a classic armature rig? If this were implemented, the static problem could be solved, because the collider could be made from a low-poly silhouette of the object. The complexity remains in the shading of the points.
These are the exact same things I thought about while watching the video. I don't see why we wouldn't be able to deform the splats with different techniques (I could be wrong, since I don't know how they're stored and used behind the scenes), but I think another issue is that while it's amazing how Gaussian splats preserve the environmental lighting and reflections, that's probably also the downside to making them look good when animated.
I imagine it would end up like putting a typical polygon character in a specific scene; maybe it's lit with all different kinds of colored light if it's on a disco floor or something. Then imagine baking those lights onto the character and its textures. If you animate the character, whether it's now in a different scene or even within the same scene, the lighting will no longer make any sense.
So now you'd need some way to remove most if not all ambient lighting from the model or splats, and still have it look good.
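On the deformation question above: splat centers are ordinary 3D points, so in principle they can be skinned to an armature exactly like mesh vertices. A minimal linear-blend-skinning sketch (the function name and shapes are illustrative, not any real engine's API; a complete solution would also have to rotate each splat's covariance/orientation, and the baked-lighting problem discussed above remains either way):

```python
import numpy as np

def skin_splat_centers(centers, weights, bone_transforms):
    """Linear-blend-skin Gaussian splat centers like mesh vertices.

    centers:         (N, 3) splat positions
    weights:         (N, B) per-splat bone weights (each row sums to 1)
    bone_transforms: (B, 4, 4) homogeneous bone matrices
    """
    homo = np.concatenate([centers, np.ones((len(centers), 1))], axis=1)
    # Position of every splat under every bone: (B, N, 4)
    per_bone = np.einsum('bij,nj->bni', bone_transforms, homo)
    # Weighted blend across bones: (N, 4)
    blended = np.einsum('nb,bni->ni', weights, per_bone)
    return blended[:, :3]

centers = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
weights = np.array([[1.0, 0.0], [0.5, 0.5]])
bones = np.stack([np.eye(4), np.eye(4)])
bones[1][0, 3] = 2.0  # second bone translates +2 on x
moved = skin_splat_centers(centers, weights, bones)
# first splat follows bone 0 and stays put; second blends halfway to x = 1.0
```

This only moves the centers; making the shading stay plausible while the splats move is the genuinely hard part.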
So mesh-based photogrammetry tries to recreate the _shape_ of a scene, but Gaussian Splatting tries to recreate the _experience of observing_ it from all angles? So you don't get an actual mesh, but a volume you can look through from different perspectives and see the object as interpolated from the different camera perspectives?
Sounds useful (they sure look gorgeous when they work) but restrictive, in that everything's "baked in" and you can't easily make them respond to outside influences (lighting, etc.)
are u from czech republic?
Yes
😎👍