FINALLY. A well-explained, No-Nonsense explanation!
finally a good explanation of the blender compositor settings
Exactly what I was looking for! What's the View Transform in Blender exr files? AGX, filmic or sRGB?
Best video I have found on the subject! Thank you!
what's the depth pass for then?
How can I export a Mist pass as a 3D model (OBJ) rather than an OpenEXR (which doesn't give me the option to move it around in 3D space inside Element 3D)?
A mist pass is by design a 2D file: it's a projection of depth from the camera at one exact angle, so it cannot be translated into 3D space (because moving around in 3D means the camera changes position).
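For context on what the pass actually stores: with Blender's Linear falloff, the mist pass maps each pixel's camera distance into the 0-1 range between the world Mist Start and Start + Depth settings. A minimal sketch of that mapping (the function and variable names here are illustrative, not Blender API):

```python
import numpy as np

def mist_value(distance, mist_start, mist_depth):
    """Linear mist falloff: 0 at mist_start, 1 at mist_start + mist_depth."""
    return np.clip((distance - mist_start) / mist_depth, 0.0, 1.0)

# Per-pixel camera distances for a tiny 2x2 "render"
dist = np.array([[1.0, 5.0],
                 [10.0, 25.0]])
print(mist_value(dist, mist_start=5.0, mist_depth=20.0))
# -> [[0.   0.  ]
#     [0.25 1.  ]]
```

Note how everything nearer than Start clamps to 0 and everything past Start + Depth clamps to 1: the pass only knows distance along the view ray, which is why it can't be re-projected for a different camera.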
Do all professionals use depth passes instead of rendering with DoF right from Blender, for example? I keep hearing about separate passes etc., but I'm struggling to get my head around why they're so powerful or why they're necessary!
I think it's because you have so much control over it and can change it without doing another render. Also, I just tried the one in Blender for one of my projects; I need such a strong DoF that even with denoising and a lot of samples I have artifacts in my render that won't go away. I'm going to try this technique to get around it.
ua-cam.com/video/9p_iwjU5_Bg/v-deo.html&ab_channel=DylanNeill
The main reason I'm looking into it is to (a) reduce render time and (b) dramatically reduce noise in the image, along with the artifacts caused by the denoiser.
DoF and volumes really screw with the denoiser, but by doing those in post I can get nice clean images that end up taking 1/3 the time to render, at better quality.
@@OGPatriot03 I've always wondered about the reduced render time: why would it be quicker to render the exact same thing, just separately?
@@dukevera4216 Firstly, it's not the exact same thing; it's actually rather different, but good results in post are possible.
Secondly, it renders faster because there's dramatically less noise in the image. That means instead of having to render 4-8k samples to get a clean image (and probably way more than that for animations with heavy DoF), I can get the same thing with a fraction of the sample count, and in animation it will be WAY more temporally stable.
Basically, if I try to denoise a 256-sample render with regular DoF/fog, the results will be an unusable mess even if I denoise the passes separately, because the noise in the render also obscures details in the individual passes (a noisy splotch in a render is a noisy splotch even if you separate the render passes). Alternatively, if I render without all that defocus from the DoF, with just the noise from the fog, I can hand the denoiser crisp, clean images that it'll do a great job on, then add all those things back in post, albeit with some quality tradeoffs that we mitigate with more work in post.
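The post-DoF idea described above can be sketched very roughly: blur the sharp beauty render, then mix sharp and blurred per pixel according to how far the depth pass says each pixel is from the focal plane. This is a toy illustration only (real compositors like Blender's Defocus node use a proper bokeh kernel and depth layering, not a box blur); all names here are made up for the sketch:

```python
import numpy as np

def box_blur(img, radius):
    """Naive separable-ish box blur on a 2D grayscale image (edge-padded)."""
    if radius <= 0:
        return img.copy()
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    # Sum every offset inside the k x k window, then average
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def depth_dof(image, depth, focus, strength=1.0, radius=2):
    """Fake DoF: blend sharp and blurred pixels by distance from the focal depth."""
    blurred = box_blur(image, radius)
    # "Circle of confusion" stand-in: 0 at the focal plane, 1 far from it
    coc = np.clip(np.abs(depth - focus) * strength, 0.0, 1.0)
    return image * (1.0 - coc) + blurred * coc
```

The key point the comment makes survives even in this toy: the blur is driven by a clean depth pass instead of being path-traced, so the renderer never has to resolve defocus noise at all.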
This was helpful and worked, but coming from other 3D software, this process of making a Z-depth pass is very convoluted. There has to be another way! For instance, in 3ds Max I can make a Z-depth pass (or any other pass you can imagine) in just a couple of clicks, all from a single menu. No nodes, no multiple locations to set an output, no hidden boxes to check, no separate workspaces to set up.
He made this tutorial more complicated than it needs to be. Rendering a mist pass is literally 2 clicks, plus there's no need for an EXR file.
this was great thanks
WHY... I set the gamma to 10.000 and it's still gray....
Hi! Thanks so much for the video, it was really, really helpful! Question: if I'm rendering at 16 bpc, should I put the gamma at half of what you put it at (like 5, maybe), or is 10 still the value I should input?
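On this gamma question: a gamma adjustment is applied to normalized 0-1 float values, so it doesn't scale with bit depth; 8, 16, or 32 bpc all normalize to the same 0-1 range before the exponent is applied, which suggests the same gamma value works either way. A quick sketch of why (illustrative function, not a Blender node):

```python
def apply_gamma(value_normalized, gamma):
    """A gamma boost operates on normalized 0-1 values, so bit depth never enters into it."""
    return value_normalized ** (1.0 / gamma)

# The same gray stored at two bit depths normalizes to the same 0-1 value:
v8 = 51 / 255.0        # 8 bpc code value 51
v16 = 13107 / 65535.0  # 16 bpc code value 13107
print(v8, v16)                                          # both 0.2
print(apply_gamma(v8, 10.0) == apply_gamma(v16, 10.0))  # True
```

Halving the gamma for 16 bpc would only make sense if the adjustment operated on raw integer code values, which compositing math doesn't.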
Thank You!
beast! ~
Why not do everything in Blender? AE is kind of mainstream nowadays
great!!!
Finding the config file is a pain in the ass every time... especially for unorganized people like me XD
You can just render out your beauty pass as a regular .png file and your z-depth as a separate .exr file; then there's no need for any of that color-matching stuff.
This is overcomplicated, you don't need any scripts
I think bokeh is better
Very helpful, thank you.