You say it is a point cloud shader... does this mean you’re *not* using VFX graph for this? Is it custom code in a compute shader, then? Great work. I think the sound design of this series is fantastic.
Thanks! Yes, I meant it's rendered with a custom regular shader for the built-in pipeline, no VFX graph. I'm calculating lights and all the point effects directly from the shader. Just recently I ported it to URP so I'll move everything to the new pipelines next.
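For anyone curious what "calculating lights directly from the shader" could look like in principle: below is a minimal Python sketch (not the author's actual HLSL) of the per-point Lambert diffuse term such a point-cloud shader might evaluate for every vertex. The light setup and names are illustrative assumptions, not taken from the video.

```python
import math

def normalize(v):
    """Return the unit-length version of a 3D vector."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(point_normal, light_dir, light_color, albedo):
    """Per-point Lambert diffuse: max(0, N.L) * lightColor * albedo.
    This is the kind of per-vertex lighting a point-cloud shader
    can compute without any mesh surface at all."""
    n = normalize(point_normal)
    l = normalize(light_dir)
    ndotl = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(ndotl * lc * ac for lc, ac in zip(light_color, albedo))

# A point facing the light head-on receives the full light color:
print(lambert((0, 1, 0), (0, 1, 0), (1.0, 0.9, 0.8), (1.0, 1.0, 1.0)))
# A back-facing point receives nothing (the max() clamp):
print(lambert((0, -1, 0), (0, 1, 0), (1.0, 0.9, 0.8), (1.0, 1.0, 1.0)))
```

In a real shader this math runs per point on the GPU; the sketch just shows the arithmetic.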
RubenFro Great. What is your motivation for using URP? To be able to render on mobile devices? Or just a way to "start from scratch" with all the rendering and build your own implementations? EDIT: sorry, I assumed you meant from HDRP to URP... but now I read that you said you were using the old standard pipeline. Cheers
@@matt42hughes Yeah it was mainly to understand how to write custom shaders on the new pipelines, but also, there's waaay too much stuff in an HDRP shader that I don't need. I'm also running tons of points (usually 20-30 million, sometimes up to 90 million for offline rendering) so I prefer to keep it light.
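A quick back-of-the-envelope on why "keeping it light" matters at those point counts. Assuming a common compact layout of 16 bytes per point (float3 position plus a packed 32-bit color; the actual layout used in the video is not stated), the raw buffer sizes are already substantial:

```python
def point_cloud_mb(point_count, bytes_per_point=16):
    """Estimate raw buffer size in MiB for a point cloud,
    assuming float3 position (12 B) + packed RGBA color (4 B)."""
    return point_count * bytes_per_point / (1024 * 1024)

for n in (30_000_000, 90_000_000):
    print(f"{n:>11,} points -> {point_cloud_mb(n):,.0f} MiB")
# 30 million points is roughly 458 MiB; 90 million roughly 1,373 MiB.
```

At those sizes, per-point work in a lean custom shader (rather than a heavyweight HDRP material) makes a real difference.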
This is one of the best demoscene works I have seen
Teach us how to do that! Transcendental work, really a gem!
Damn. This is so cool, great work :)
The Legend!
I'd really love a tutorial. How do you go about creating something like this?
Gotta love when they just like your comment and don't reply lol
me too
Lidar scanning and Keijiro Takahashi's particle effects plugin for Unity.
@@georgepanicker61916 Not the FM Points plugin?
Excellent work. Powerful. Inspiring. Thank you.
Thanks to you, very glad you enjoyed it!
This is stunning! Great work!
Thank you!
Did you have the volumetric capture itself running in real time?
Yes, I'm using Unity for all my work. I usually post some behind-the-scenes material on Twitter
Nice job
Hi - I would really love for you to show how you did this!
Wow, crazy good
appreciate it!
Please tell us a bit more: is that using a Kinect and a laptop, or a very expensive scanner?
Neither! It’s photogrammetry using 360 cameras
@@RubenFro Wow, that is encouraging, thank you for the tip! I do happen to have a Nikon, and I've been researching the topic a bit in the meantime. I've been using TouchDesigner for a few years, but I think it's time to jump on the Unity wagon. I've been mesmerized by that Aphex Twin clip T69 Collapse for a while. You made the best clip on all of YouTube, and I have seen a lot of them. Congrats!
@@RubenFro What software do you use to create the point clouds?
@@HamsterRocketeer Software itself is metashape
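Metashape can export its dense point clouds in formats like PLY, which makes it easy to bring them into a custom renderer. As a hedged illustration (not tied to the author's actual workflow), here is a minimal parser for an ASCII PLY header that reads the vertex count and per-point property names; it only handles the simple single-`element vertex` case common in photogrammetry exports:

```python
def parse_ply_header(lines):
    """Parse a PLY header: return (vertex_count, property_names).
    A sketch for the common photogrammetry case, not a full PLY reader."""
    assert lines[0].strip() == "ply", "not a PLY file"
    vertex_count, props = 0, []
    for line in lines[1:]:
        parts = line.split()
        if parts[:2] == ["element", "vertex"]:
            vertex_count = int(parts[2])
        elif parts and parts[0] == "property":
            props.append(parts[-1])  # the property name is the last token
        elif parts == ["end_header"]:
            break
    return vertex_count, props

# A header shaped like a typical photogrammetry export:
header = """ply
format ascii 1.0
element vertex 30000000
property float x
property float y
property float z
property uchar red
property uchar green
property uchar blue
end_header""".splitlines()

print(parse_ply_header(header))
# -> (30000000, ['x', 'y', 'z', 'red', 'green', 'blue'])
```

The property list tells you the per-point layout you need to reproduce on the GPU side.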
I need to take point cloud data from a Kinect and use it in VRChat
Look up Phace in VRChat or visit the MCSA!
Wow! Have a tutorial?
How did you do that? Can you refer me to something, please? Any tutorial?