Image-based Lighting (IBL) of PBR Materials [Shaders Monthly #11]

  • Published 1 Aug 2024
  • In Episode #11 of Shaders Monthly, we talk about image-based lighting of the Cook-Torrance microfacet BRDF. This will allow us to implement image-based lighting for basic PBR materials in GLSL.
    For real-time performance, we follow closely the Siggraph 2013 tutorial "Real Shading in Unreal Engine 4" by Brian Karis:
    cdn2.unrealengine.com/Resourc...
    In the practical part, we implement three shaders in GLSL:
    1) PrefilterSpecular
    GSN Composer: www.gsn-lib.org/index.html#pr...
    C++: www.mathematik.uni-marburg.de...
    Java: www.mathematik.uni-marburg.de...
    2) BRDF Integration Map
    GSN Composer: www.gsn-lib.org/index.html#pr...
    C++: www.mathematik.uni-marburg.de...
    Java: www.mathematik.uni-marburg.de...
    3) IBLSpecular
    GSN Composer: www.gsn-lib.org/index.html#pr...
    C++: www.mathematik.uni-marburg.de...
    Java: www.mathematik.uni-marburg.de...
    Documentation for the shader plugin node of the GSN Composer:
    gsn-lib.org/docs/nodes/Shader...
    Additional lecture slides:
    www.uni-marburg.de/en/fb12/re...
    00:00 Introduction and Refreshers
    04:16 Image-based Lighting
    07:37 Importance Sampling of the GGX Normal Distribution Function
    15:01 Image-based Lighting (Ground Truth Solution)
    22:00 Split Sum Approximation
    27:17 PrefilterSpecular (Shader 1)
    34:44 BRDF Integration Map (Shader 2)
    39:33 IBLSpecular (Shader 3)
    Source for all HDR environment maps that are used in this video:
    polyhaven.com/hdris
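[Editor's note] The chapter at 07:37 covers importance sampling of the GGX normal distribution function. As a rough illustration of the inverse-CDF mapping described in Karis's "Real Shading in Unreal Engine 4" course notes (written in Python rather than GLSL, and not code from the video), the half-vector sampling can be sketched as:

```python
import math

def importance_sample_ggx(xi1, xi2, alpha):
    """Map two uniform random numbers in [0, 1) to a half-vector
    direction distributed according to the GGX NDF, in tangent space
    with the surface normal along +z (sketch, not the video's shader)."""
    phi = 2.0 * math.pi * xi1
    # inverse CDF of the GGX distribution over cos(theta_h)
    cos_theta = math.sqrt((1.0 - xi2) / (1.0 + (alpha * alpha - 1.0) * xi2))
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    return (sin_theta * math.cos(phi),
            sin_theta * math.sin(phi),
            cos_theta)
```

For xi2 = 0 the sample lies exactly along the normal; larger roughness alpha spreads the samples further from it.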

COMMENTS • 18

  • @thebirdhasbeencharged
    @thebirdhasbeencharged 1 year ago +7

    This is so good. Whenever I come across content like this I make sure to grab a copy just in case these amazing resources vanish.

  • @roozbubu
    @roozbubu 7 months ago +2

    This series is a truly phenomenal resource. Thank you!

  • @oSteMo
    @oSteMo 4 months ago +1

    Excellent explanation, thanks for all the effort you put into these videos :)

  • @doriancorr
    @doriancorr 7 months ago +1

    Truly great work, thank you for sharing it with us. I have learned so much from your series.

  • @GameEngineSeries
    @GameEngineSeries 1 year ago +4

    Such an excellent video, as always! Going to recommend these to my viewers as soon as I get to PBR and IBL! Thanks for making the videos and keep up the great work!

    • @gsn-composer
      @gsn-composer  1 year ago +1

      Thank you for the encouraging comment. I noticed that you are currently implementing directional lights in your engine, so PBR materials and IBL are probably coming soon!

  • @unveil7762
    @unveil7762 9 months ago +2

    I was thinking of precomputing the BRDF integration map as an attribute. This could save GPU resources, since the texture would then be an attribute at, I guess, point level! 🎉 Thanks for all these lessons. Whenever I have time, I come here to become a better artist. Little pearls each time. Thanks!!

    • @gsn-composer
      @gsn-composer  9 months ago

      Thank you very much. The result of performing shading (or parts of shading) in the vertex shader depends on the number of vertices in your mesh. Of course, if you have control over the tessellation of your input mesh, you can use it for low-frequency shading components, but I cannot recommend it as a general solution, especially in combination with high-frequency (detailed) textures. The texture coordinates are interpolated by the rasterizer when they are passed from the vertex to the fragment shader. This means that the texture coordinates and material properties read from textures, such as roughness (see the last example at 39:33), may change per pixel, not per vertex.

  • @jjy7384
    @jjy7384 1 year ago +1

    life saver

  • @HaonanHe-ep5vs
    @HaonanHe-ep5vs 4 months ago +1

    Excellent video! But how do you render shadows when using image-based lighting in physically based rendering?

    • @gsn-composer
      @gsn-composer  4 months ago

      When we do rasterization and perform LOCAL shading, shadows are always difficult because they are a GLOBAL effect that we cannot decide locally. We need to know if the path to the light source is occluded or not.
      This is why ray tracing approaches are so popular: they solve this problem by shooting a ray towards the light source (or environment map) and checking whether the light is occluded by some other object.
      Here is an image-based lighting example for ray tracing. It has shadows and other "global" effects:
      gsn-lib.org/apps/raytracing/index.php?name=example_transmission
      I have some slides about raytracing here:
      www.uni-marburg.de/en/fb12/research-groups/grafikmultimedia/lectures/graphics2
      In a rasterization approach, we need to cheat somehow. For example, we could look for the brightest location in the environment map and assume that this direction is the only one that matters for shadows (a very rough approximation, of course). Then we could add a cascaded shadow map for that direction:
      developer.download.nvidia.com/SDK/10.5/opengl/src/cascaded_shadow_maps/doc/cascaded_shadow_maps.pdf
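[Editor's note] The "brightest location" cheat described in the reply above could be sketched as follows; this is a Python illustration assuming a hypothetical lat-long (equirectangular) environment map stored as a flat list of RGB tuples, not code from the video:

```python
import math

def brightest_direction(envmap, width, height):
    """Find the texel with the highest luminance in a lat-long
    environment map and convert its coordinates to a unit direction
    (y up). Illustration only; layout conventions vary."""
    best_lum, best_idx = -1.0, 0
    for i, (r, g, b) in enumerate(envmap):
        lum = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luminance
        if lum > best_lum:
            best_lum, best_idx = lum, i
    x_px, y_px = best_idx % width, best_idx // width
    # texel center -> spherical angles
    phi = 2.0 * math.pi * ((x_px + 0.5) / width)
    theta = math.pi * ((y_px + 0.5) / height)
    return (math.sin(theta) * math.cos(phi),
            math.cos(theta),
            math.sin(theta) * math.sin(phi))
```

The returned direction would then serve as the light direction for the shadow map.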

  • @emperorpalpatine6080
    @emperorpalpatine6080 1 year ago +1

    Hello!
    I have a question: how do you deal with aliasing in envmaps that have very bright and small lights, like the sun?
    Using the split sum approximation, I need an extremely high number of samples to only minimally improve the prefiltered cubemap, and when using shaders you quickly reach the point where the GPU crashes and the driver resets.
    In this situation, I'm about to bake the environment map entirely offline on the CPU side. Any advice?

    • @gsn-composer
      @gsn-composer  1 year ago

      In this case, you can perform importance sampling based on the envmap (incoming radiance), not based on the material. I am currently working on the next episode, which will cover exactly this topic. In the meantime, here are some slides: www.mathematik.uni-marburg.de/~thormae/lectures/graphics2/graphics_3_4_eng_web.html
      The other option is to compute the prefiltered envmap incrementally, doing a few hundred samples per render pass.
      You can just compute an incremental average: www.mathematik.uni-marburg.de/~thormae/lectures/graphics2/graphics_2_1_eng_web.html#42
      This way the watchdog timer is not triggered and the GPU driver does not crash/reset.
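[Editor's note] The incremental average mentioned in the reply above, avg_n = avg_{n-1} + (x_n - avg_{n-1}) / n, can be sketched in Python (illustration only, not code from the slides):

```python
def incremental_average(samples):
    """Running mean without storing all samples: each new sample
    nudges the current average by (x - avg) / n. This is how a
    prefiltered envmap can be accumulated over many short render
    passes instead of one long GPU dispatch."""
    avg = 0.0
    for n, x in enumerate(samples, start=1):
        avg += (x - avg) / n
    return avg
```

Because each pass only needs the previous average and the sample count, no single dispatch runs long enough to trigger the driver's watchdog timer.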

    • @emperorpalpatine6080
      @emperorpalpatine6080 1 year ago

      @@gsn-composer Thank you so much!

  • @maxlykos940
    @maxlykos940 1 year ago

    What's the reason behind choosing that PDF for the GGX normal distribution function? Where do cosθh and sinθh come from?

    • @maxlykos940
      @maxlykos940 1 year ago

      9:27

    • @gsn-composer
      @gsn-composer  1 year ago +1

      Theoretically, you can choose it freely. The only constraint is that the integral of the PDF for the entire domain must be one. At 10:03 min:sec in the video, we check that the normalization to one is fulfilled for our particular choice. This is not a coincidence. Walter et al. constructed the GGX normal distribution function in this way, as described in their paper "Microfacet Models for Refraction through Rough Surfaces", section "3.1. Microfacet distribution function"; Equation 4:
      www.cs.cornell.edu/~srm/publications/EGSR07-btdf.pdf
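[Editor's note] The normalization check mentioned at 10:03 (the PDF D(h)·cosθh·sinθh integrates to one over the hemisphere) can be reproduced numerically; a Python sketch, not code from the video:

```python
import math

def ggx_pdf_integral(alpha, steps=100000):
    """Numerically integrate p(theta, phi) = D(theta) * cos(theta) * sin(theta)
    over the upper hemisphere with the midpoint rule; the phi integral
    contributes a constant factor of 2*pi. The result should be close
    to 1 for any roughness alpha, matching the GGX normalization."""
    total, d_theta = 0.0, (math.pi / 2.0) / steps
    a2 = alpha * alpha
    for i in range(steps):
        theta = (i + 0.5) * d_theta  # midpoint of the i-th interval
        c = math.cos(theta)
        d = a2 / (math.pi * ((a2 - 1.0) * c * c + 1.0) ** 2)  # GGX NDF
        total += d * c * math.sin(theta) * d_theta
    return 2.0 * math.pi * total
```

For alpha = 1 the NDF reduces to the constant 1/π, and the integral is easy to verify by hand as well.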