Radiance Caching for Real-Time Global Illumination

  • Published Jun 29, 2024
  • This talk will present an efficient, high-quality Final Gather for fully dynamic Global Illumination with ray tracing, targeted at next-generation consoles and shipping in Unreal Engine 5. Part of the SIGGRAPH 2021 Advances in Real-Time Rendering in Games course (advances.realtimerendering.com/).
    Hardware Ray Tracing provides a new and powerful tool for real-time graphics, but current hardware can barely afford 1 ray per pixel for diffuse indirect lighting, while Global Illumination needs hundreds of effective samples for high-quality indoor lighting. Existing approaches that rely on Irradiance Fields cannot scale up in quality, while approaches relying on a Screen Space denoiser have exorbitant costs at high resolutions. This talk will present practical applications of Radiance Caching along with effective techniques to reduce noise and leaking. A minimal code sketch of the core idea appears below the bio.
    Bio:
    Daniel Wright is an Engineering Fellow in graphics at Epic Games, and Technical Director of the 'Lumen' dynamic Global Illumination and Reflections system in Unreal Engine 5. Prior to that, he developed lighting and shadowing techniques for Unreal Engine 3 and 4 which shipped in Gears of War, Fortnite and a multitude of games licensing Unreal Engine technology. Daniel's main passion is real-time Global Illumination.
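
To make the abstract's core idea concrete, here is a minimal CPU-side sketch of radiance caching as a final gather, assuming a sparse grid of screen-space probes: each probe traces a few rays and caches the result, and every pixel then interpolates the cached radiance instead of tracing its own rays. The probe spacing, ray count, and the TraceRay stand-in are illustrative assumptions, not Lumen's actual implementation.

// Minimal sketch of a screen-space radiance cache (hypothetical; Lumen's
// real pipeline adds probe placement, importance sampling, and filtering).
#include <cmath>
#include <cstdio>
#include <vector>

constexpr int kWidth = 64, kHeight = 64;  // toy render-target size
constexpr int kProbeSpacing = 8;          // assumed: one probe per 8x8 pixels
constexpr int kRaysPerProbe = 16;         // a few rays per probe, not per pixel

// Stand-in for a hardware ray trace returning incoming radiance.
float TraceRay(float x, float y, int ray) {
    return 0.5f + 0.5f * std::sin(0.05f * (x + y) + 0.4f * ray);
}

int main() {
    const int probesX = kWidth / kProbeSpacing + 1;
    const int probesY = kHeight / kProbeSpacing + 1;

    // 1) Gather: each probe averages a small number of traced rays.
    std::vector<float> cache(probesX * probesY);
    for (int py = 0; py < probesY; ++py)
        for (int px = 0; px < probesX; ++px) {
            float sum = 0.0f;
            for (int r = 0; r < kRaysPerProbe; ++r)
                sum += TraceRay(px * kProbeSpacing, py * kProbeSpacing, r);
            cache[py * probesX + px] = sum / kRaysPerProbe;
        }

    // 2) Shade: each pixel bilinearly interpolates the four nearest probes,
    //    so per-pixel cost is a cache lookup rather than hundreds of rays.
    float total = 0.0f;
    for (int y = 0; y < kHeight; ++y)
        for (int x = 0; x < kWidth; ++x) {
            float fx = float(x) / kProbeSpacing, fy = float(y) / kProbeSpacing;
            int ix = int(fx), iy = int(fy);
            float tx = fx - ix, ty = fy - iy;
            float a = cache[iy * probesX + ix];
            float b = cache[iy * probesX + ix + 1];
            float c = cache[(iy + 1) * probesX + ix];
            float d = cache[(iy + 1) * probesX + ix + 1];
            total += (a * (1 - tx) + b * tx) * (1 - ty)
                   + (c * (1 - tx) + d * tx) * ty;
        }
    std::printf("mean cached irradiance: %f\n", total / (kWidth * kHeight));
}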

COMMENTS • 45

  • @rallokkcaz
    @rallokkcaz 1 year ago +16

    As a computer scientist, this stuff is almost pure magic. Congratulations to the team at Unreal for designing and implementing these algorithms and tools! This is amazing work.

  • @10minuteartist87
    @10minuteartist87 2 years ago +23

    Remembering the old days of "Mental Ray", when I saw GI for the first time and said wow... ☺️

  • @Alexander_Sannikov
    @Alexander_Sannikov 2 years ago +29

    I see that many more people are interested in Nanite than in Lumen (judging by the view counts). But 90% of Nanite's ideas were already implemented in the 2008 paper it's based on. Radiance caching in screen space, however, looks novel, and even though the idea is somewhat obvious once you name it, I have actually never seen it implemented anywhere.
    This is, in my opinion, the core idea of the entire approach, and it can be used for a much wider set of GI algorithms based on different acceleration structures (a sketch of what that decoupling could look like follows this thread).

    • @djayjp
      @djayjp 2 years ago +4

      Check out the video by EA about their Surfel GI technique used in Frostbite. I think it's fundamentally the same method? Idk

    • @635574
      @635574 2 years ago +3

      @@djayjp It might achieve similar results, but we have no direct behind-the-scenes comparison for real-time motion through a scene or for bidirectional dynamic-object GI. There isn't as much temporal accumulation in Lumen, from what we have seen.

    • @wsqdawsdawdwad
      @wsqdawsdawdwad 2 years ago +3

      "But 90% of lumen ideas have already been implemented in the 2008 paper that it's based on", is this a typo? From the context I think you mean Nanite.

    • @Alexander_Sannikov
      @Alexander_Sannikov 2 years ago +2

      @@wsqdawsdawdwad Correct, thanks for that. I edited the post.

    • @Revoker1221
      @Revoker1221 2 years ago

      @@Alexander_Sannikov Would you happen to have a link or name of the 2008 paper on hand? I'd love to take a deeper dive.
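
To illustrate @Alexander_Sannikov's point about reusing the caching idea across acceleration structures, here is a minimal hypothetical sketch: the shading code only asks a cache for incoming radiance, so the backing structure (a screen-space probe grid, surfels, or anything else) can be swapped. The interface and both toy backends are assumptions for illustration, not code from Lumen or Frostbite.

#include <array>
#include <cmath>
#include <cstdio>
#include <memory>

using Vec3 = std::array<float, 3>;

// A GI algorithm only needs "radiance arriving at this point from this
// direction"; where and how that value is cached is a backend detail.
struct RadianceCache {
    virtual ~RadianceCache() = default;
    virtual float Lookup(const Vec3& position, const Vec3& direction) const = 0;
};

// Backend 1: probes anchored to screen pixels (the talk's approach).
struct ScreenSpaceProbeCache : RadianceCache {
    float Lookup(const Vec3& p, const Vec3& d) const override {
        return 0.5f + 0.25f * std::sin(p[0] + d[0]);  // toy stand-in value
    }
};

// Backend 2: surfels lying on the geometry (as in Frostbite's GIBS).
struct SurfelCache : RadianceCache {
    float Lookup(const Vec3& p, const Vec3& d) const override {
        return 0.5f + 0.25f * std::cos(p[1] + d[1]);  // toy stand-in value
    }
};

int main() {
    // The same shading code works regardless of the cache backend.
    std::unique_ptr<RadianceCache> cache =
        std::make_unique<ScreenSpaceProbeCache>();
    std::printf("indirect = %f\n", cache->Lookup({0, 1, 0}, {0, 0, 1}));
}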

  • @prithvib8662
    @prithvib8662 2 years ago +13

    Really well explained, good stuff!

  • @DominikMorse
    @DominikMorse 2 years ago +7

    This is just pure genius.

  • @arifahmed6610
    @arifahmed6610 2 years ago +1

    Beautiful video 😍

  • @raanoctive6092
    @raanoctive6092 2 years ago +2

    Excellent job👏👍

  • @jimbanana3071
    @jimbanana3071 11 months ago

    Awesome job!!

  • @caelanread3199
    @caelanread3199 2 years ago +16

    It looks as though, by the end, real-life graphics are achieved with 2 rays per pixel. I'm excited for the future.

    • @GeneralKenobi69420
      @GeneralKenobi69420 2 years ago +7

      People have been saying that for the last 20 years lol. It does look good, but trust me, there's still a lot that can be improved before it's actually indistinguishable from reality. Just compare it with any Disney or Pixar movie made in the last 3 years. Real-time graphics are still many years away from that, and movies will probably have gotten even better by then.

    • @TehIdiotOne
      @TehIdiotOne 2 years ago +6

      @@GeneralKenobi69420 I mean, modern graphics aren't even remotely comparable to the graphics of 20 years ago, dude...

    • @GeneralKenobi69420
      @GeneralKenobi69420 2 years ago +7

      @@TehIdiotOne I know, and yet you should have seen the reactions when Sony revealed that PS2 tech demo. "Omg so lifelike! It looks just like a movie!!"

    • @mnomadvfx
      @mnomadvfx 2 years ago +5

      @@GeneralKenobi69420 20 years ago, real-time graphics were not even close to this level of advancement.
      As for feature CGI animation, at this point there is little left to make it LOOK better.
      The real advancements for offline rendering will come from:
      1) Faster rendering of what they can already do, making 4K VFX targets for all films, regardless of shot quantity, a serious possibility.
      2) Further improvements to FX simulations (water, fire, smoke) and tissue/muscle simulations for creatures and digital humans.
      For me, what is currently lacking is a believable volume-conserving model for facial animation that captures everything down to the small-scale tics and other unconscious movement constantly happening on the average human face.
      For example, it took the VFX team a year to create the Blade Runner 2049 scene with the faux Rachael.
      Despite extensive reference footage from the original film and skull scans from Sean Young to guide them, her face still looks fundamentally off when it moves, as if parts of it were asleep or affected by Botox. (IMHO, they also overdid the subsurface scattering: her facial skin looked less like a fleshy volume over a skull than like a light-diffusing film covering another translucent volume.)

    • @delphicdescant
      @delphicdescant 2 years ago +4

      @@GeneralKenobi69420 People have been saying it for 20 years, and every time it's been true. That's a great thing, imo.
      When someone looked at Mario 64 when it came out and said "wow, that looks like real life," they weren't wrong.
      You can either look at the progression of graphics technology and think "we have never achieved 100% photorealism, and never will, because there will always be something better 10 years later," or you can recognize that human perception is a fuzzy and relative thing and say "we have achieved photorealism many times, and every time it stays new and exciting."

  • @diligencehumility6971
    @diligencehumility6971 2 years ago +9

    So what am I supposed to do with my 10 years of Unity experience now?

  • @eclairesrhapsodos5496
    @eclairesrhapsodos5496 2 years ago +11

    I wonder if it's possible to have semi-dynamic GI: lightmaps with data that allows them to interact/blend with real-time GI, so that only changes and characters get real-time GI.

    • @doltBmB
      @doltBmB 1 year ago

      That would basically be a modern version of Quake's lighting system, so yeah, it should be possible.

    • @clonkex
      @clonkex 1 year ago

      Totally. I'd say they just didn't do that because it would be a pretty big chunk of extra work, and their target audience (console game devs) would prefer fully dynamic lighting with no baking time.

    • @theneonbop
      @theneonbop 28 days ago

      How would you determine which parts of the lightmap are valid?

  • @robbie_
    @robbie_ 2 years ago +2

    I didn't understand a single word of this. I'm still working on my Lambert shading algorithm.

    • @UserX03
      @UserX03 2 years ago +7

      That’s alright bud everyone starts at knowing nothing about a topic

  • @635574
    @635574 2 years ago +1

    But does it even work on dynamic objects, or only on how the map interacts with them? EA's surfels can handle rigged meshes, but so far with only one sample per bone.

  • @djayjp
    @djayjp 2 years ago +5

    Anyone know how this differs from Frostbite's Surfel GI technique, in a nutshell?

    • @sporefergieboy10
      @sporefergieboy10 2 years ago +14

      One uses wizard magic the other one uses alien technology

    • @Sergeeeek
      @Sergeeeek 2 years ago +10

      They are similar, but Lumen uses grids for its screen-space cache, while GIBS uses surfels (surface elements), which lie directly on the geometry. Also, I don't think GIBS has static world-space probes like Lumen; they just place surfels far away and make them bigger, so in a way they always use world-space probes, but place and size them dynamically.
      They do use a grid to accelerate searching for surfels, and the grid changes based on distance from the camera. Close up it is uniformly sized, but far away it becomes a view-aligned frustum structure, where each grid cell is stretched to align with the view (see the sketch after this thread). From the camera's perspective it looks the same, but in world space the cells stretch a lot, which makes it easier to cover a huge open-world area.
      Overall they seem to achieve very similar results, but I personally think surfels are neater.

    • @djayjp
      @djayjp 2 years ago +1

      @@Sergeeeek Ah, I had watched part of their respective presentations but couldn't quite determine how they differ. Your explanation helps a lot, thanks! I have to say that, based simply on the renders shown in each of their SIGGRAPH videos, Lumen appears to give more realistic results, closer to unbiased path tracing.

    • @Sergeeeek
      @Sergeeeek 2 years ago +3

      @@djayjp Watch the GIBS presentation fully too; I really liked the demo where a skinned character with a fully emissive material illuminated the scene while animating through it. Very impressive, imo.

    • @djayjp
      @djayjp 2 years ago

      @@Sergeeeek Yeah, that's a good point. I didn't see anything quite like that with Lumen. I also haven't seen a scene with many dynamic lights in Lumen just yet.
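
To make @Sergeeeek's description of the distance-dependent grid concrete, here is a small hypothetical sketch: cells are indexed uniformly in perspective-divided screen space and sliced logarithmically in depth, so world-space cell size grows with distance from the camera. All constants and names are illustrative guesses, not Frostbite's actual code.

#include <cmath>
#include <cstdio>

// Map a view-space position to a frustum-aligned grid cell: x/y are divided
// by depth (uniform on screen), z uses equal-ratio slices (uniform in log z).
struct Cell { int x, y, z; };

constexpr float kNear = 0.5f, kFar = 1000.0f;  // assumed clip range
constexpr int kCellsXY = 32, kSlicesZ = 64;    // assumed grid resolution

Cell FrustumCell(float vx, float vy, float vz) {
    float sx = vx / vz, sy = vy / vz;  // perspective divide, roughly [-1, 1]
    int cx = int((sx * 0.5f + 0.5f) * kCellsXY);
    int cy = int((sy * 0.5f + 0.5f) * kCellsXY);
    float t = std::log(vz / kNear) / std::log(kFar / kNear);
    int cz = int(t * kSlicesZ);
    return {cx, cy, cz};
}

int main() {
    // A near pair 0.5 world units apart lands in different cells, while a
    // far pair 10 units apart shares one: cell size grows with distance.
    Cell n1 = FrustumCell(1.0f, 0.0f, 2.0f);
    Cell n2 = FrustumCell(1.5f, 0.0f, 2.0f);
    Cell f1 = FrustumCell(100.0f, 0.0f, 400.0f);
    Cell f2 = FrustumCell(110.0f, 0.0f, 400.0f);
    std::printf("near: %d vs %d   far: %d vs %d\n", n1.x, n2.x, f1.x, f2.x);
}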

  • @yizhang7027
    @yizhang7027 1 year ago +1

    It'd be much clearer if you stated the problem before each explanation.

  • @thecrypt6482
    @thecrypt6482 2 years ago +2

    It's creepy that both Epic Games' and Electronic Arts' new GI systems are based on a 20-year-old method from Mental Ray, wow...

  • @LotmineRu
    @LotmineRu 1 year ago

    unusable

  • @prltqdf9
    @prltqdf9 2 years ago +4

    This presenter's narration suffers from slurred speech. It's often hard to understand what he is talking about.

    • @djayjp
      @djayjp 2 years ago +2

      Turn on the auto captions.