Global Illumination Based on Surfels

  • Published 23 Jan 2025

COMMENTS • 82

  • @cube2fox · 1 month ago · +2

    They released their first game with this system recently: EA Sports College Football 25. It has some impressive lighting. The second game will be "skate.", the upcoming successor to Skate 3.

  • @nowherebrain · 3 years ago · +21

    Including skinned meshes, this is impressive. Can't you also save surfels by ignoring geometry with direct lighting? That is, not applying surfels to directly lit surfaces?

    • @NullPointer · 3 years ago · +5

      I thought the same, but then the surfaces close to those areas won't receive that bounce.

    • @nowherebrain · 3 years ago · +2

      @@NullPointer I get that; I'm not clever enough to have a creative solution for it. Besides, it's kind of arrogant of me to have assumed that during (ongoing) development you hadn't thought of this. I love this btw, good work.
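
The tradeoff in this thread can be sketched with a toy example (all names and numbers below are invented for illustration, not the talk's actual algorithm): surfels on directly lit surfaces cache the radiance that nearby shadowed surfaces gather as their first bounce, so skipping them loses the bounce entirely.

```python
# Toy illustration: why directly lit surfaces still need surfels.
# A surfel on a lit surface acts as a secondary emitter for its neighbours.

def shade(point, surfels):
    direct = point["direct"]  # direct light reaching this point
    # first bounce: gather radiance cached by neighbouring surfels
    # (0.5 is a made-up geometric coupling factor)
    indirect = sum(s["radiance"] * 0.5 for s in surfels if s["id"] != point["id"])
    return direct + indirect

lit      = {"id": 0, "direct": 1.0}
shadowed = {"id": 1, "direct": 0.0}

# With a surfel on the lit surface, the shadowed point receives bounce light.
surfels_full = [{"id": 0, "radiance": 1.0}]
# If we "save" surfels by skipping directly lit geometry, the bounce is lost.
surfels_skip = []

print(shade(shadowed, surfels_full))  # 0.5 — bounce arrives
print(shade(shadowed, surfels_skip))  # 0.0 — no bounce at all
```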

  • @davinsaputraartandgamedev9453 · 3 years ago · +12

    I'm curious how this compares to Lumen. Anyone willing to share their thoughts on comparing the two?

    • @Xodroc · 3 years ago

      If it supports VR, it beats Lumen. Otherwise, it's a nice alternative.

    • @eclairesrhapsodos5496 · 3 years ago · +7

      Irradiance cache is better; not sure about this one, though. But Lumen does reflections too, and it's not ray tracing. In my opinion, VXGI / lightmaps / SVOGI / brute-force RTGI / photon-mapped diffuse GI are the best options for now. PS: Unreal Engine will soon get realtime caustics via photon mapping on the GPU with very good denoising/approximation. I'm really excited about that irradiance cache method; it should sit in a nice middle tier (a balance of quality/speed/production time).

    • @brainz80 · 3 years ago

      I had the exact same thought

    • @edinev5766 · 3 years ago · +4

      In my testing (and since it's linked to my job, that testing has been extensive), Lumen is slow for big exteriors. Unusable for most professional applications.
      This doesn't seem to be. But there's no way to know unless it becomes available to the general public.

    • @halthewise3995 · 3 years ago · +9

      I'm not an expert, but you're right to point out that Lumen and this are trying to solve roughly the same problem, and the high-level approach is somewhat similar as well. Both combine local probe points stuck to the surface of objects with a global grid of sample points, and both are using roughly similar approaches for ray steering.
      The biggest difference in approach that I see is that Lumen's "local" sampling points are re-created from scratch each frame because they are strictly placed on a screen-space grid, while surfels stay alive as long as the camera hasn't moved too dramatically. That means Lumen needs to do temporal smoothing in screen space at the end of the pipeline, while surfels can do it earlier (and a little more flexibly). In theory, that means the best-case performance of surfels when the scene _isn't_ changing and the camera's not moving is significantly better, especially for high-resolution rendering. On the other hand, when the camera is moving, surfels need to do a lot more bookkeeping to move and update themselves, so that case is likely more expensive.
      In practice, the big difference is that Lumen is much farther in development, and actually exists today, including lots of work hammering out edge cases and adding all the little tweaks required to get good real-world performance. Surfel-based GI is clearly earlier stage right now, so it's hard to say how good it will be when it's "done".
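
A minimal sketch of the persistence point above, with an invented blending constant (this is not Frostbite's or Lumen's actual code): a persistent surfel keeps accumulating across frames, while a screen-space probe restarts whenever the camera invalidates its history.

```python
def accumulate(history, new_sample, alpha=0.1):
    """Blend a new noisy sample into the running estimate (exponential moving average)."""
    return history * (1.0 - alpha) + new_sample * alpha

# Persistent surfel: the estimate survives camera motion.
surfel = 0.0
for _ in range(50):
    surfel = accumulate(surfel, 1.0)   # noisy samples of true value 1.0

# Screen-space probe: history discarded mid-way (camera moved at frame 25).
probe = 0.0
for frame in range(50):
    if frame == 25:
        probe = 0.0                    # reprojection failed, restart
    probe = accumulate(probe, 1.0)

print(round(surfel, 3))  # 0.995 — well converged
print(round(probe, 3))   # 0.928 — still catching up
```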

  • @Great.Milenko · 3 years ago · +9

    So, am I right in thinking it's kind of like NVIDIA's hardware-raytracing-based global illumination, but instead of single-pixel samples with an AI noise filter, it's a softer, blobbier sample radius with far better performance?

    • @clonkex · 1 year ago · +5

      RTX simply provides hardware acceleration of tracing the rays. That is, it makes it really fast to say "fire some rays from this point in this direction and tell me what they hit, how they bounce and what colour information they gather along the way". That's literally all it does. It's up to you to decide how to use that information and incorporate it into your GI shading.
      This is basically another version of "fire as many rays as we can afford and accumulate the results over time until it looks realistic". Hardware raytracing could totally be used in this algorithm to make it "look good faster" by firing a lot more rays. The trick with this sort of solution (well, one of many, many tricks) is that you don't want to waste any work you've already done, but you also have limited memory.
      I also don't think there's any AI noise filtering going on here. It's just regular procedural denoising unless I missed something.
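
The "fire as many rays as we can afford and accumulate" idea can be sketched as a progressive Monte Carlo estimator (the ray budget and irradiance values are my own toy numbers): each frame adds a few noisy samples to an incremental mean, so the estimate converges without storing per-ray history.

```python
import random

random.seed(1)
TRUE_IRRADIANCE = 0.6

estimate, count = 0.0, 0
for frame in range(200):
    for _ in range(4):                           # ray budget per frame
        sample = random.uniform(0.0, 1.2)        # unbiased noisy sample (mean 0.6)
        count += 1
        estimate += (sample - estimate) / count  # incremental mean, no stored history

print(estimate)  # converges close to 0.6 as rays accumulate
```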

  • @erikm9768 · 3 years ago · +8

    Isn't this essentially just photon mapping? Is there a difference, apart from using surfels with depth functions instead of spheres? Photon mapping goes back several decades.

    • @clonkex · 1 year ago · +1

      In realtime though?

  • @cube2fox · 1 year ago

    So did they end up using this approach as a default for GI? Or do they use something else for new EA/Frostbite games?

  • @sampruden6684 · 3 years ago · +25

    There are some cool ideas here, but after watching this just once I'm not seeing an obvious advantage vs. DDGI. This has very slow convergence times, and even the converged renders sometimes look a little blotchy in the demo. There's a lot of complexity in handling skinned meshes etc. (and it doesn't handle procedural geometry) that DDGI avoids by storing all of the information in the probe volume.
    At the start they mention that they think it's better to calculate the GI on the surface, because that's where it's needed. That sounds sensible in theory, but I wouldn't say that anything here stood out as being visually better than DDGI in practice.
    Is there something in the "pro" column that I've missed? I guess it doesn't suffer from DDGI's corner case when all eight surrounding probes are unreachable.
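
The corner case mentioned at the end can be illustrated with a toy version of DDGI-style probe blending (the weights and values here are hypothetical): each shaded point interpolates the eight probes of its grid cell, with a visibility term that zeroes out occluded probes; when all eight are unreachable, the total weight collapses and there is nothing valid to interpolate.

```python
def interpolate(probes):
    """Blend a cell's probes by trilinear weight times visibility."""
    total_w, total_irr = 0.0, 0.0
    for p in probes:
        w = p["trilinear"] * p["visibility"]   # visibility: 0 if a wall is in between
        total_w += w
        total_irr += w * p["irradiance"]
    return total_irr / total_w if total_w > 1e-6 else None  # None = fallback needed

cell_ok  = [{"trilinear": 0.125, "visibility": 1.0, "irradiance": 0.8}] * 8
cell_bad = [{"trilinear": 0.125, "visibility": 0.0, "irradiance": 0.8}] * 8

print(round(interpolate(cell_ok), 3))  # 0.8
print(interpolate(cell_bad))           # None — all eight probes unreachable
```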

    • @williamxie5901 · 2 years ago · +10

      It's good for large open-world games. With DDGI, far objects fall back to a low-res probe grid due to its clipmap structure, whereas GIBS spawns its surfels from screen space, so their density stays almost constant.
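
The screen-space spawning argument can be made concrete with some back-of-the-envelope numbers (the FOV, resolution, and tile size below are assumptions, not GIBS's actual parameters): one surfel per fixed-size screen tile means the world-space surfel radius grows linearly with distance, so coverage never degrades to a coarse grid.

```python
import math

FOV_SCALE = 2.0 * math.tan(math.radians(60) / 2.0)  # assumed vertical FOV of 60°
SCREEN_H  = 1080                                     # assumed resolution
TILE_PX   = 16                                       # assumed: one surfel per 16px tile

def surfel_world_radius(depth):
    # half-height of one screen tile projected into the world at this depth
    return (TILE_PX / SCREEN_H) * FOV_SCALE * depth / 2.0

near = surfel_world_radius(1.0)
far  = surfel_world_radius(100.0)

print(round(far / near, 1))  # 100.0 — radius scales linearly with distance,
                             # while screen-space density stays fixed
```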

  • @ch3dsmaxuser · 1 year ago · +1

    That is awesome!

  • @635574 · 3 years ago · +1

    I can thank Coretex for telling me about surfels

  • @cptairwolf · 1 year ago

    Interesting solution but I'll take path tracing with radiance caching over this anyway.

  • @Zi7ar21 · 3 years ago · +45

    Neat! This was made by EA though, so we have to troll them with jokes about how they're going to start charging $0.01 per surfel.

  • @lohphat · 3 years ago · +4

    OH!
    I thought you said "squirrels". Worst clickbait EVAR!
    (I still enjoyed the video.)

  • @ThePrimeTech · 3 years ago

    Wow

  • @charoleawood · 2 years ago · +2

    I think that "surface circle" is a better description of what these are versus "surface element"

    • @nielsbishere · 2 years ago · +2

      Surficle

    • @inxiveneoy · 1 year ago · +1

      @@nielsbishere Surcle

    • @nielsbishere · 1 year ago

      @@inxiveneoy sule

    • @endavidg · 11 months ago · +1

      Since it’s also something that has to do with sinuses, “Snot on a wall”.

  • @dragonslayerornstein387 · 3 years ago · +1

    Oh god this is so jank. But it works!

  • @Jkauppa · 3 years ago · +4

    You could shoot rays from all light sources, bounce them around, then keep the average in check; then you get automatic global illumination. Just keep track of the real-time light maps, as if it's accumulated real ray tracing, i.e. real-time light baking.
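
A toy version of that idea (not a real renderer; the map size and ray energy are invented): trace rays from the light, splat each hit into a small lightmap, and keep a per-texel running average so the map converges over frames like accumulated real-time baking.

```python
import random

random.seed(7)
LIGHTMAP_SIZE = 4               # tiny 4-texel "lightmap"
hits  = [0.0] * LIGHTMAP_SIZE   # running-average energy per texel
count = [0]   * LIGHTMAP_SIZE   # samples received per texel

def trace_from_light():
    texel = random.randrange(LIGHTMAP_SIZE)   # where the bounced ray lands
    energy = 1.0                              # carried flux (toy constant)
    count[texel] += 1
    # incremental running average keeps the map stable as samples accumulate
    hits[texel] += (energy - hits[texel]) / count[texel]

for _ in range(1000):
    trace_from_light()

print(all(abs(h - 1.0) < 1e-9 for h in hits))  # True: every texel converged
```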

    • @Jkauppa · 3 years ago

      paint the textures with light

    • @Jkauppa · 3 years ago

      you only need to fill the image pixels and no more

    • @Jkauppa · 3 years ago

      hows the light outside screen space

    • @Jkauppa · 3 years ago

      importance sample all objects

    • @Jkauppa · 3 years ago

      send more rays

  • @TheIqCook · 3 years ago · +2

    Pixar introduced this kind of rendering technique 15 years ago for offline rendering.

    • @clonkex · 1 year ago · +2

      Did they? Wasn't it just regular offline raytracing?

    • @art-creator · 1 year ago · +1

      @@clonkex No. It was point-cloud/brickmap-based, with harmonic filtering etc.

  • @diligencehumility6971 · 3 years ago · +3

    Quite beautiful results...
    But Frostbite is an EA engine... and EA is not a nice company, at all. Pay-to-win and microtransactions, "surprise mechanics", taking advantage of kids, etc. So not really interesting.

    • @miurasrpnt_v2 · 3 years ago · +6

      Company and Engine have to be separated imo.

    • @clonkex · 1 year ago · +5

      Who cares? They're also spending some of that money on advancing GI technology. We can benefit greatly from their research and still never touch Frostbite.