Senua's Saga: Hellblade to me demonstrated the best use of lighting and Unreal graphics, the closest to realism I've ever played. I stopped the game with photo mode so many times, thinking they'd actually mixed in real footage, just to see it was still the same scene. It's how I always dreamed games would be, where you don't drop from pre-rendered to in-game graphics after cutscenes.
Lol, that forest scene was not so good I think. The lighting was always shifting and popping in; the blend of screen-space and accumulated info plus the low-res path trace all comes together in an unstable image. It's not plug & play just yet. Very temporally unstable, smeary, and noisy. Bleh.
It's a shame that most demos are too high-contrast, which makes the final result not very appealing! An artist should supervise the videos before they go online!
It was done in an auditorium with a huge screen and lots of diffuse lighting, so it's standard to bump the contrast for the presentation as everything gets washed out anyway. They should've kept the original for the YouTube upload, but I guess it's always undecided whether it'll be put on the web or not.
It's really amazing, but to run this you need upscaling, frame generation, and a $2500 GPU that consumes 600+ watts of power. Rasterization is king for the next 10 years as games fly in native 4K. Hail to the king.
The fact you think UE4 invented needing to compile shaders shows how little you know. I've used UE3 game builds that have shader stutter because they don't precompile them. It's a developer issue, not an engine issue.
Who would have thought the day would come. This might be the reason I get a high-end GPU: not ray tracing or DLSS 3.5 or whatever, but actual path tracing in real time. What a time to be alive. Edit: now that I remember, Two Minute Papers showcased this a month or so ago, and it was running on a GTX 680 or something quite old, so I hope Nvidia doesn't come up with the stupid idea to lock it exclusively to the 5000 series.
@@computron5824 Dude, this is REAL-TIME PATH TRACING! And guys like you will still find reasons to whine that it's not as good as offline path tracing. Like, really?
Good showcase, but please also talk about the caveats. We can see how this looks in motion. Show some bad cases, show the ghosting, the smeariness. It comes off as a bit shill-y and dishonest to not mention the negatives of these techniques. It's not only a performance hit. The image stability and motion clarity take a huge nosedive in many cases. This is like 2015-era TAA all over again 😂
The new tech is impressive and all, but can you PLEASE help devs eliminate the horrendous traversal stuttering which seems to be ubiquitous in UE5 games. Having all these new super expensive features that reduce performance even more is pretty pointless when the frametime graphs look like a mountain range. It's giving the engine a bad reputation.
I remember there was a 3DMark test displaying a rotating carousel with light sources. When it turned on 8 light sources, the frame rate dropped to something like 5 fps.
I love all of this, and I love Richard's presentations (been watching them for a long while). The only barrier to entry here is needing to have the separate branch. In the past I'd downloaded the branches and compiled, but it's a bit limiting as far as other aspects of Unreal that are easily injected into the vanilla code from Epic Games. I wish Nvidia and them could work together so everything could be a plugin, similar to how they now have DLSS available as a directly-injected plugin. Obviously that would be vastly more complex than I myself could understand, but it would be an 'ideal' situation. I HORRIBLY miss all of Nvidia's GAMEWORKS library not being supported anymore (for a long, long time). :( Love the tech, Nvidia!!
32:19 My project is dependent on Ray Reconstruction. I am waiting for it to be implemented, but the DLSS plugin download website for UE5 only says "coming soon." It was mentioned as part of DLSS 3.5, but you already shipped UE5 DLSS 3.7 without it. In this video you emphasize how important it is, but there is still no download available. It has been "coming soon" for a year now.
People on the Nvidia dev forums have been asking for it for a long time, and Nvidia just doesn't respond. Even though it was announced over a year ago, it's most likely not ready for public release.
Silly question here, but is this feature already in the released 5.5? I can see that they listed it as a new improvement, but I can't find how to enable and test it. Or is it just how the engine calculates lights now? Thanks!
@@YouSacOfWine This is real time, so more than likely that limitation doesn't apply here. He emphasized that we can even mix different things, raster and path tracing together, so whatever works with Lumen would naturally work with real-time path tracing too.
I can't wait till the default engine officially adopts proper emissive-material lighting support as shown in this. Another thing I'm anxious to receive in the default engine is proper world position offset support with ray-traced shadows; I'm tired of having black areas appear, shift, and disappear on my foliage.
Yes man, 👍 good job, keep going 💪. Unreal, Epic, and Fab are a fortune for any programmer, artist, and game developer who uses Unreal. I used Godot before and still have some little projects to finish; I don't like Godot 4, I still use 3.6, and the difference is only visible to eagle eyes lol. Now I'm fascinated: I must work like crazy, and on Unreal the programming just goes like crazy, with maximum effort! 💪 Just keep going guys, you are amazing.
Now... it would be interesting what exactly is considered high-end or low-end hardware. But it looks very promising! What is a bit concerning, though, is the fact that companies might rely too much on things like DLSS for performance instead of optimizing their games properly. Actually, that's already happening. On the other hand, the more streamlined and optimized the engine is out of the box, and the less you can do to break performance (like with the light sources), the better.
Is there a difference between "full ray tracing" and path tracing? I thought the terms were synonymous, but now that you're distinguishing between path-traced effects and ray-traced effects within path tracing, I'm starting to wonder...
The talk doesn't seem to take the new MegaLights feature into account, but it should mean the legacy fallback to Lumen could look even closer to the fully path-traced version when it comes to emissives. The reduced disparity will be great for devs, but terrible for hardware companies trying to convince the average gamer of the difference 😅
I've been using RTXDI in my work for over a year, and when MegaLights was announced, I was fully convinced that it was finally RTXDI integrated directly into Unreal. Do you know what the difference is, in brief?
@СергейШавлюга-з2ч I'm not really sure about the underlying implementation of MegaLights beyond what Epic mentioned in their presentation a few weeks back, although I think they did say it might require hardware RT. Not sure if it's the same as ReSTIR or not.
It is probably what's under the hood; MegaLights is maybe just a fancy word for ReSTIR with ergonomic, user-friendly packaging. I would be very surprised if it's otherwise.
@@nareshkumar3526 Many, many years. Nanite, for example, was released years ago, and to this day the Blender Foundation has made no mention of this tech being implemented in some form in the new Blender roadmap, despite it being arguably the biggest tech breakthrough to happen to 3D art. The same can be said for Eevee Next, which, despite being brand new, is still inferior to Lumen in many ways.
I gave the talk, if you have any questions please just ask them. Or feel free to hit me with your thoughts. I read every post, even though I probably shouldn't. =)
great talk
Thanks for the presentation, very interesting stuff
Great presentation and many thanks for the information!!!
Awesome presentation - thanks - possibly this will go to history for the milestone presentation for real time path tracing for masses.
@@Vemrah This is the start of making it more mainstream, everyone can expect more advancements from here.
We appreciate this presentation by this man, who is not too comfortable doing it. Thanks for the effort ;)
Right? I'd much rather get the info directly from the source, not some spokesperson. Respect 🔥
This man is a genius. Straight to the point, showing us the best of what's coming, without all the résumé padding.
❤ He's good, congrats man; he just didn't have the time or mood to do it perfectly.
you can barely notice the discomfort at the start
@@oosmosmoo I think he is the source; most coders/engineers are introverts. Working with computers makes them less sociable, but they're otherwise good people.
Real time PathTracing is a dream for years now I can't wait for this to be out in unreal.
We will wait quite a while yet. He said it's 50% DLSS upscaling, so his "60" fps is actually like 15 fps, and I'm betting it's running on a 4090 or something.
@@googleslocik Yes, he said it's on a 4090.
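For what it's worth, the arithmetic behind that estimate is just the pixel ratio. A quick sketch, assuming "50%" means each axis is rendered at half resolution (as in DLSS Performance mode) and per-pixel cost is roughly linear:

```python
# Hypothetical helper: pixels actually rendered before upscaling.
def internal_pixels(out_w, out_h, axis_scale):
    return int(out_w * axis_scale) * int(out_h * axis_scale)

native = internal_pixels(3840, 2160, 1.0)    # 8,294,400 px
upscaled = internal_pixels(3840, 2160, 0.5)  # 1920 x 1080 = 2,073,600 px

ratio = native // upscaled
print(ratio)       # 4: a quarter of the pixels are traced
print(60 / ratio)  # 15.0: "60 fps" is roughly 15 fps worth of native shading
```

This is only a rough upper bound, since shading cost doesn't scale perfectly linearly with pixel count.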
I think the problems Unreal has with poor efficiency and stuttering should be taken into account as well, before deciding to throw on more ray tracing.
@@hulejul9748 This is just an option for developers, same as with ray/path tracing, and Nvidia's software and research are nearly flawless every year. Unreal Engine is not their problem.
@@googleslocik When the result is indistinguishable from native res, what's the difference? Image reconstruction isn't going anywhere, and it will only improve across all platforms, so waiting for this to be viable without it doesn't make sense.
Not just games but virtual production: having path tracing for independent filmmakers. A 2-minute scene that would take 3 days to render on a reasonably powerful machine (say, a 4070 Super with an i71400) could render in minutes. The time saved would then go into enhancing the creative process and allow more time for just... thinking and working as a human, not getting demotivated, etc. That time is precious.
Yeah, this is a whole subject I didn't get to touch on, and it deserves a separate talk by itself. There are cases where you can get very close to the offline render in terms of quality, 98%+ similar. And maybe you use this technology in a pseudo-realtime fashion to reduce render times: instead of 5 minutes per frame, maybe it's 1-3 seconds per frame. Obviously a huge time savings. It's just important to note the things that aren't path traced yet, but even then this could still be useful for rapid prototyping.
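A back-of-envelope version of that saving, using the 5 min/frame vs. 3 s/frame figures above and an assumed 24 fps for a 2-minute scene:

```python
frames = 2 * 60 * 24                      # 2-minute scene at 24 fps = 2880 frames
offline_hours = frames * 5 * 60 / 3600    # 5 minutes per frame
fast_hours = frames * 3 / 3600            # worst case of 1-3 s: 3 s per frame

print(round(offline_hours))               # 240 hours, i.e. 10 days of rendering
print(round(fast_hours, 1))               # 2.4 hours
print(round(offline_hours / fast_hours))  # roughly 100x faster
```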
Always exciting to see better light-tracing at lower cost. Light is THE factor that determines the realism of a game more than anything. And it helps artists to properly capture a mood in the scene.
This isn't just better light at lower cost. This is holy grail of computer graphics.
Fix the UE stutter, it should be highest priority.
Absolutely it makes me crazy
what's that? can you elaborate?
@@khurramvirani3643 It's freezing for a short time but very frequently
Realtime Stutter Elimination should be priority and I think I speak for many PC Enthusiasts.
No, it's called Unreal Engine 5.4.
lol
Already exists. Plenty of UE5 games that don't stutter.
It's a developer issue at this point.
@@whatistruth_1 Name some please.
@@DanteBellin The one that comes directly to my mind is The Finals
Lighting an entire scene only with emissive materials is mind-blowing. It changes almost everything in the workflow. If only it can work with animated materials...
You can already do this with Lumen afaik
@@drinkwwwaterrr Yeah, but it looks garbage and noisy.
@@vid828 25:00 6 rays per pixel, 720p resolution (upscaled), and 60 fps on an RTX 4090. Not sure if it can actually be useful at this cost in performance.
@@Morimea Well, like he said, this part isn't quite ready. The goal is 1 ray per pixel, but 2 is probably as low as it'll work for now. This is still-improving tech.
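For a sense of scale, the ray budget implied by the figures above (6 rays per pixel at an internal 720p, 60 fps; this counts per-pixel samples only, not extra bounce rays):

```python
# Figures quoted in the thread: internal 720p, 6 rays per pixel, 60 fps.
rays_per_frame = 1280 * 720 * 6
rays_per_second = rays_per_frame * 60

print(rays_per_frame)   # 5529600 rays per frame
print(rays_per_second)  # 331776000, about 332 million per second
```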
It shows an example of an animated stained-glass material in the vid.
To demonstrate how things have moved on: in 1992, during my CGI MA, it would take c. 15 mins to render one 320x256px image on an Apollo workstation. And they were considered cutting edge. No raytracing, no GI; reflections and shadows were faked. I'm now using Unreal and every day it blows me away. I can't wait for the 5090 cards mixed with this tech...
I used to work in Imagine on the Amiga back in the late 80s and early 90s. What we did was amazing given the technology at the time. Used everything from Amiga 500 to 4000 with 68060 processors. It was a world of hurt compared to Blender on a couple of 4090s today.
In '92 at Autodesk we were happy to get our hands on Weitek coprocessors for 3D Studio. They helped and were better than the x87s, but still glacial. Great memories of those early days.
Most of folks here probably have no idea how huge this is lol !
Yeah, I got my start at MicroProse, so I had access to SGI O2 hardware & software... This is MAGIC compared to those $20K "state of the art" dedicated graphics powerhouses of past years.
@@Oakbeast Yeah, I remember SGI. We had access to a couple of Octane IIs and Softimage that were used in a production studio. I fondly remember sitting there on weekends, learning and working with Softimage in the hope of making it as a 3D modeler and render artist.
In retrospect it was a better career option to become a programmer, but it would have been so nice to work with PIXAR in those days.
This is great. More efficient use of resources is what we like to see.
Already super impressed by the demo being presented
Devs: *This is an area where we can make improvements on*
this seems like an absolute game changer! Congrats on the DEV for this, amazing!
WHOOP WHOOP MY ZPD Cop made it in the presentation ❤
Very cool model. Would play the game it's in.
😂 i subbed to you just for that dude
My man showing the future of real-time rendering and the crowd is like "what he says?"
Path tracing _is_ a "real-life" element I cannot wait to see more of in games.
I saw your comment in my déjà vu 😅. Yep, it's a dream that's become true.
Same for me! Love Path Tracing❤
Performance wise, which of these 5 run better? Real-time Pathtracing, offline Pathtracing, software Lumen, hardware Lumen, or just dynamic lighting only?
@alyasVictorio "Run better" doesn't mean "it's better". To me it's better to have path tracing near 60 fps and realistic graphics instead of low quality with bad illumination at very high FPS.
Finally, the light in the Unreal Engine viewport doesn't lag like when Lumen was first introduced!
I'm glad that we are in the final stretch of light-simulation development for games, because we really need to move on to the thing that most kills the immersion: interaction between characters and with the world. That is way more important and has been waiting for decades to be nailed.
That is amazing! 60 FPS is an awesome target at the moment, but even if I get 10 FPS, it's a game changer for my product-visualization animations. Cheers to an amazing team!
Beautiful. Every month there's exciting news for 3D.
I wish there was more for stereoscopic 3D 😢 RIP nvidia 3D vision
Ikr
@@kelownatechkid It still works; I use it every day on a 110-inch 3D projector. You just need to do a few things to make it work.
I love this, it makes the game feel so much more real, grounded and tactile, even if it's highly stylized. It's truly the future.
Great news! Love it! The presentation was awesome!
34:48 Thank you, nvidia, for the path-tracing. And thank you, Epic, for the textures failing to load on time. Soon this bug will celebrate its 20th birthday.
🤣😂🤣
Great, but how about fixing shader compilation/traversal stutter that plagues pretty much every UE5 game, especially on PC?
Neither of these are inherent UE problems. Shader compilation stuttering happens in any game that doesn't pre-cache shaders; traversal stutter happens in any game that handles its level streaming poorly. Both are developer issues; it's not the engine's fault that the devs are incompetent.
@@LewdSCP1471A While stutter can indeed happen in other engines, it has been near universal in Unreal Engine games since UE3, it's definitely inherent to the engine.
@@CrypticWritings89 A shader compilation pass can be done as a loading screen, where every mesh, texture, and animation is compiled while a different screen is shown. Level streaming can be done various ways, and distance vs. screen-space culling can be used to make sure things are loaded and unloaded gradually rather than all at once. All the tools are there, but too many developers release games before spending the time to optimize them. While the engine could be designed to try to make the best choice on its own, that would require an ungodly amount of development, as the engine would have to optimize on the fly, given a seemingly endless number of possible scenarios for all levels and their overlaps. Meanwhile, a team could move from outside of a building to inside, change the streaming and culling methods used, and try again to see which method provides the best performance. This is a developer issue, not an engine issue.
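The loading-screen precompile idea described above, as a minimal engine-agnostic sketch in Python; the `Shader` class and `compile()` method are hypothetical stand-ins, not a real engine API:

```python
# Hypothetical Shader stand-in; a real engine would invoke the driver's
# shader compiler and cache the resulting pipeline state object to disk.
class Shader:
    def __init__(self, name):
        self.name = name
        self.compiled = False

    def compile(self):
        self.compiled = True

def precompile_pass(shaders, on_progress):
    """Compile every shader up front, reporting progress for the load screen."""
    for i, shader in enumerate(shaders, start=1):
        shader.compile()
        on_progress(i / len(shaders))

shaders = [Shader(n) for n in ("opaque", "foliage", "water", "ui")]
progress = []
precompile_pass(shaders, progress.append)

print(all(s.compiled for s in shaders))  # True: no first-use hitches later
print(progress[-1])                      # 1.0
```

The point of the sketch is only the ordering: every compile happens behind the loading screen, so nothing hitches at first use during gameplay.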
Forget this for games, I mean.
Imagine how frigging fast you can render out an animation now for a client.
Exactly, maybe not for a full-length industry-standard movie, but for shorts and tech demos.
@@nemureru_tanuki don't forget shows like the Mandalorian that use "the volume" for actual production. That already runs real-time on UE. Those shows will directly benefit from this technology. So I see this as completely viable for full feature movies. Will probably be a while before the technology is adopted but still.
It could be possible in real time using nDisplay, especially for virtual production.
Does Megalights use the same approach?
I would love to see the comparison of this technology with Megalights.
Thanks for your work!
No, MegaLights is nothing compared to this.
@@gn2727 What's the difference between them?
@@gn2727 In a good way or not?
It looks pretty good but I'm really just not a fan of screenspace temporal solutions for anything because you will invariably get all kinds of temporal artifacting - the light shifting as the camera moves, light/dark trails as surfaces are unoccluded, etc...
Yeah, just because it's realtime doesn't mean it's accurate. There is no way to make millions of calculations per pixel in 0 seconds with our current technology. That's what quantum computers are for.
All of which are fully on display here too. I appreciate the work on this tech, but with the smearing and disocclusion I feel it actually looks worse in scenes with high motion. Too many comments talking about how beautiful the flashy demo is and too few talking about the disocclusion artifacts on the wires on the swing ride and the ghosting on the string lights.
I may be in the minority here, but I feel the trade-off for path-traced direct and indirect lighting is very worth it. "Temporal" simply means looking at previous frames to predict future frames. It's the most logical way of achieving such technologies, and it doesn't necessarily mean artifacts just because it is temporal. These artifacts may very well be coming from the denoiser that's being used and not necessarily from ReSTIR.
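A minimal sketch of that temporal idea: blend each new noisy frame into a history buffer so variance falls over time. The blend weight and noise model here are illustrative assumptions, not ReSTIR or any shipping denoiser:

```python
import random

def accumulate(true_value, frames, alpha=0.1, noise=0.5, seed=0):
    """Exponential moving average over noisy per-frame samples."""
    rng = random.Random(seed)
    history = true_value + rng.uniform(-noise, noise)  # first frame
    for _ in range(frames - 1):
        sample = true_value + rng.uniform(-noise, noise)
        # Keep most of the history, blend in a little of the new frame.
        history = (1 - alpha) * history + alpha * sample
    return history

# After many frames the accumulated value sits near the true radiance.
# This is why a still camera looks clean, and why history rejection
# under fast motion is where the smearing/disocclusion artifacts appear.
print(abs(accumulate(1.0, 300) - 1.0) < 0.3)  # True
```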
Since it's not path tracing everything, nvidia should call this hybrid path tracing. It is defaulting to ray trace and raster in quite a lot of cases. In general, when someone mentions path tracing, we expect a modern offline path tracer that can handle refraction, translucency, high quality sss, layered materials, hair, vdb, and everything else at multiple bounces.
The point is that MOST things are path traced: direct and indirect lighting, which makes up most use cases. Saying it is defaulting to ray tracing in a lot of cases is a disservice to the 80% of the scene that is being path traced. The main things that don't get path traced are low roughness, translucency, fog, and DoF, which generally only make up 30% of a scene. At the end of the day, almost every scene you have is going to look SIGNIFICANTLY better with this technology. Furthermore, I really don't think anyone should click on this video expecting everything to be path traced and for this to be fully featured out of nowhere. The fact that they made this compatible with unsupported use cases is a big deal on its own. Again, please let's not do a disservice to such an incredible leap in graphical fidelity.
Who is "we", nerd?
don't worry they can come up with new words later
these demos look absolutely incredible!
Most important one so far
Bro how are they doing this 🔥🤯🤯 ,these dudes are wizards
Very exciting and impressive! 👏👏👏
The fact they have a way to make path tracing, an inherently more demanding process than ray tracing, perform better is insane. This could really push games to insane new levels. Now hopefully UE5 devs can find a nice balance, as a lot of games are very GPU-demanding and don't have the best performance.
Unbelievable, can't wait to test it
At last. Can't wait to make my 4090 scream with real-time path tracing. Amazing. 😍
This is UNREAL!
This is it !!!! and great presentation as well!
😍 I can't wait.
Great talk, cheers!
Super Exciting! I am so grateful :) this will help me so much
We need adaptive samplers for shadows, because there's a lot of noise around movable objects.
Very excited to see this shown off even if it's still not technically done or fully production ready! I imagine in another 3-4 years how much more polished this tech will finally be for production with the next-gen consoles releasing!! Very exciting times!!! :DDD
looks way better than lumen, no ghosting at all.
Probably has more to do with ray reconstruction than anything else.
This is awesome, because Lumen can certainly struggle in some cases, and offline path tracing just takes too long (RTX 3060).
I might be the only one, but I can't say I'm impressed, especially on the Meerkat demo. If we compare to the original here: ua-cam.com/video/SB4nnhJv3IU/v-deo.html (compare 34:55 here with 0:08 in the link), there is far, far less GI on the rocks in general, making the render more video-game-like and unpleasant. The other thing I don't like is that I see a lot of the problems Lumen currently has in the RT path-traced render, like GI popping, specular glitching, and defocus being a little unstable.
I am of course very interested in the technology and hope it will get better with time, but I don't feel any hype from what I see. I see no gain in real-time path tracing at 60fps if you have to sacrifice everything that makes an image good.
Honestly (working in animation right now), I see more value in fixing Lumen with MRQ & Temporal than in developing real-time path tracing.
You're not the only one who noticed the issues. I hope someone asked a question about that at the talk, because compared to the original Meerkat demo, the shadows in the NVIDIA demo go almost to black. In all fairness, GPUs are probably not fast enough to handle full path tracing in real time, which is why they're only path tracing certain things here. They're also using a lower resolution, incredibly low sample counts, strong denoising, etc.
@computron5824 The simple fact that you can see LOD popping at the beginning says that Nanite doesn't work with this yet.
is the tech demo scene at the end available as a download somewhere?
9:36 "Cop lights, flashlights, spotlights
Strobe lights, street lights (All of the lights, all of the lights)" 🎵
😂 I had the same thought... UE5.5 is going extra good and want us to see everything... All of the light... Shout out to kid Cudi for that song btw
truly incredible
can't wait for more stutter
Uhmm
Senua's Saga: Hellblade, to me, demonstrated the best use of lighting and Unreal graphics, the closest to realism I've ever played. I stopped the game in photo mode so many times thinking they had actually mixed in real captures, just to see it was still the same scene. It's how I always dreamed games would be, where you don't drop from pre-rendered cutscenes to in-game graphics.
Wow! Amazing!
That forest scene is awesome!
Lol, that forest scene was not so good, I think. The lighting was always shifting and popping in; the blend of screen-space and accumulated info plus the low-res path trace all comes together in an unstable image. It's not plug & play just yet. Very temporally unstable, smeary, and noisy. Bleh.
It's a shame that most demos are too high-contrast, which makes the final result not very appealing! An artist should supervise the videos before they go online!
It was done in an auditorium on a huge screen with lots of diffuse lighting, so it's standard to bump the contrast for the presentation, since everything gets washed out anyway. They should have kept the original for the UA-cam upload, but I guess it's always undecided whether it'll be put on the web or not.
@@me-ry9ee Well, I can't even decide which pancake I should order, so...
Path tracing is the future: much better, more precise, complete, realistic, and accurate than normal RT.
Holy! Loving it.
It's really amazing, but to run this you need upscaling, frame generation, and a $2500 GPU that consumes 600+ watts of power. Rasterization is king for the next 10 years as games fly in native 4K. Hail to the king.
Great stuff 🎉
Looks amazing I wonder if NVIDIA will ever pursue bringing GameWorks back in that branch
Cool.
But what about fixing the well-documented stuttering issues, which have been a thing since UE4?
They won't get marketing buzz from that, so they don't mention it ;) they're also incapable of fixing it.
@@willianjohnam I'm afraid it's a lost cause at this point.
The fact you think UE4 invented needing to compile shaders shows how little you know. I've used UE3 game builds that have shader stutter because they don't precompile them. It's a developer issue, not an engine issue.
Who would have thought the day would come. This might be the reason I get a high-end GPU: not ray tracing or DLSS 3.5 or whatever, but actual path tracing in real time. What a time to be alive. Edit: now that I remember, Two Minute Papers showcased this a month or so ago, running on a GTX 680 or something quite old, so I hope NVIDIA doesn't come up with the stupid idea of locking it exclusively to the 5000 series.
Download the NVRTX branch and try it. Real time path tracing is not the same as offline path tracing.
@@computron5824 Dude, this is REAL TIME PATH TRACING ! And guys like you will still find reasons to whine that it's not as good as offline path tracing.. Like really ?
@@gn2727 Dude, calm down, it's marketing. I've been testing the NVRTX branches for years. Marketing this as path tracing right now is a stretch.
You guys are freakin' geniuses! I'm wondering, are ReSTIR and the MegaLights feature related, or are they different technologies?
How about some Real-Time-No-Stutter? Or maybe Real-Time-Stable-Framerate? Now that would be game changing!
unity is just a name now
Unless your hardware can't handle Unreal Engine, of course.
excellent, I am waiting for this feature.
Good showcase, but please also talk about the caveats. We can see how this looks in motion: show some bad cases, show the ghosting, the smeariness. It comes off as a bit shill-y and dishonest not to mention the negatives of these techniques. It's not only a performance hit; image stability and motion clarity take a huge nosedive in many cases. This is like 2015-era TAA all over again 😂
6090... xd
31 years... it only took 31 years since Doom in 1993 for the dream to be real!
The new tech is impressive and all, but can you PLEASE help devs eliminate the horrendous traversal stuttering which seems to be ubiquitous in UE5 games. Having all these new super expensive features that reduce performance even more is pretty pointless when the frametime graphs look like a mountain range. It's giving the engine a bad reputation.
GREAT!
I thought shader execution order was hardcoded into the hardware?
can't wait for real time path tracing to be at the level of current day animation
This guy is pretty casual for someone showing off the newest, limit-breaking gaming tech in the world.
Amazing
Illuminating :D
I remember a 3DMark test that displayed a rotating carousel with light sources. When it turned on 8 light sources, the frame rate dropped to something like 5 fps.
Pretty cool to see RTGI running at >1 "uhps"
I love all of this, and I love Richard's presentations (I've been watching them for a long while). The only barrier to entry here is needing the separate branch. In the past I've downloaded the branches and compiled them, but it's a bit limiting compared to the other aspects of Unreal that are easily injected into the vanilla code from Epic Games. I wish NVIDIA and Epic could work together so everything could be a plugin, similar to how DLSS is now available as a directly-injected plugin. Obviously that would be vastly more complex than I myself could understand, but it would be an ideal situation. I HORRIBLY miss Nvidia's GameWorks library, which hasn't been supported for a long, long time. :( Love the tech, Nvidia!!
32:19 My project depends on Ray Reconstruction. I am waiting for it to be implemented, but the DLSS plugin download page for UE5 only says "coming soon." It was announced as part of DLSS 3.5, but you already shipped DLSS 3.7 for UE5 without it. In this video you emphasize how important it is, yet no download is available. It has been "coming soon" for a year now.
People on the NVIDIA dev forums have been asking for it for a long time, and NVIDIA just doesn't respond. Even though it was announced over a year ago, it's most likely not ready for public release.
Very exciting technology. I just rendered my 12-second scene in 18 hours with path tracing.
omg---- I REALLY need to learn Unreal...
Really cool stuff, can't wait to try it out
Imagine what a 5090 is going to accomplish with this
Wonders
Welcome to the comment section, my friends.
Silly question here, but is this feature already in the released 5.5? I can see they listed it as a new improvement, but I can't find how to enable and test it. Or is it just how the engine calculates lights now? Thanks!
When will this be available in the final UE5 releases on the Epic Games launcher?
Cool, but how does it do with Nanite? Because ray-traced shadows don't work with Nanite objects.
It's not ray traced, so it's different.
Path tracing does not need Nanite at all. It can handle an insane amount of geometry better than Nanite.
@@gn2727 But Unreal can't; insane amounts of geometry bog down more than just lighting calculations.
@@that3dguy119 And yet, can it work with Nanite or not?
Current path tracing does not work with nanite displacement. Will this be?
It does with ray tracing Nanite mode on, I think: r.raytracing.nanite=1 or something like that.
@@John-ee4qr Yeah, the cvar is r.raytracing.nanite.mode=1, but it doesn't work with displacement/tessellation specifically.
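For anyone following along, that cvar can also be set persistently in a config file rather than typed at the console each run; a sketch assuming a recent UE5 build (cvar names and behavior can change between engine versions, so verify against your release notes):

```ini
; DefaultEngine.ini
[SystemSettings]
; Trace against real Nanite geometry instead of the coarse fallback mesh.
; Per the thread above, Nanite displacement/tessellation is still unsupported.
r.RayTracing.Nanite.Mode=1
```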
@@YouSacOfWine OK, I don't use path tracing much, so I wasn't sure.
@@YouSacOfWine This is real time, so more than likely that limitation doesn't apply here. He emphasized that we can even mix different things, raster and path tracing together, so whatever works with Lumen would naturally work with real-time path tracing too.
@@pawprinting that’ll be interesting to test out
Otherwise, performance and quality are awesome ❤.
Damn, can't wait to make some scenes with that.
I used to do this (offline, of course) on a Commodore Amiga 500; luckily the resolution was only 640x512.
so impressive
How do we get Lumen ReSTIR? Looks great.
Thank you for providing the link to the branch in the video description. Oh, wait....... 😒
I can't wait till the default engine officially adopts proper emissive-material lighting support as shown here.
Another thing I'm anxious to get in the default engine is world position offset support compatible with ray-traced shadows. I'm tired of black areas appearing, shifting, and disappearing on my foliage.
Thank you CDPR 🎉
Is this that thing they showed off in the Ramen shop demo and cyberpunk years ago? Is there a way to get it?
cyberpunk orion gonna be crazy!
Yes man, 👍 good job, keep going 💪. Unreal, Epic, and Fab are a fortune for any programmer, artist, and game developer who uses Unreal. I used Godot before and still have a few little projects to finish, but I don't like Godot 4, so I'm still on 3.6, and it doesn't feel that different. Now I'm fascinated and working like crazy; in Unreal everything just goes, with maximum effort! 💪 Keep going guys, you are amazing.
Now... it would be interesting to know what exactly counts as high-end or low-end hardware. But it looks very promising!
What is a bit concerning, though, is that companies might rely too much on things like DLSS for performance instead of optimizing their games properly. Actually, that's already happening. On the other hand, the more streamlined and optimized the engine is out of the box, and the less you can do to break performance (like with the light sources), the better.
Were the Q&As at the end recorded?
So when will it happen, or is it already available, and how do we use it? 🎉
Is there a difference between "full ray tracing" and path tracing? I thought the terms were synonymous, but now that you're distinguishing between path-traced effects and ray-traced effects within path tracing, I'm starting to wonder...
The talk doesn't seem to be taking the new MegaLights feature into account, but it should mean the legacy fallback to Lumen could look even closer to the fully pathtraced version when it comes to emissives.
The reduced disparity will be great for devs, but terrible for hardware companies trying to convince the average gamer of the difference 😅
I've been using RTXDI in my work for over a year, and when MegaLights was announced, I was fully convinced that it was finally RTXDI integrated directly into Unreal. Do you know what the difference is, in brief?
@СергейШавлюга-з2ч I'm not really sure about the underlying implementation of MegaLights, other than what Epic mentioned in their presentation a few weeks back, although I think they did say it might require hardware RT. Not sure if it's the same as ReSTIR or not.
@@xephyrxero Thank you
It's probably what's under the hood; MegaLights may just be a fancy name for ReSTIR in ergonomic, user-friendly packaging. I would be very surprised if it's otherwise.
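For what it's worth, the published core of ReSTIR is weighted reservoir sampling: pick one light out of a large set with probability proportional to an estimated contribution, in a single streaming pass, so the cost barely grows with light count. A toy Python sketch of just that sampling step (illustrative names and weights, not Epic's or NVIDIA's actual code, and omitting the spatial/temporal resampling that gives ReSTIR its name):

```python
import random

def reservoir_sample(candidates, weight_fn, rng=random):
    """Pick one candidate with probability proportional to weight_fn(c)
    in a single streaming pass (weighted reservoir sampling, the step at
    the heart of ReSTIR-style direct light selection)."""
    chosen = None
    w_sum = 0.0
    for c in candidates:
        w = weight_fn(c)
        w_sum += w
        # Keep the new candidate with probability w / w_sum
        if w_sum > 0 and rng.random() < w / w_sum:
            chosen = c
    return chosen, w_sum

# Toy scene: 1000 "lights" where light 7 contributes 100x more than the rest.
rng = random.Random(1)
lights = list(range(1000))
weights = {i: (100.0 if i == 7 else 1.0) for i in lights}
picks = [reservoir_sample(lights, weights.get, rng)[0] for _ in range(2000)]
# Light 7 should be picked roughly 100/1099 of the time (~9%), while each
# pass stores only a single sample regardless of light count.
print(picks.count(7) / len(picks))
```

The real technique then merges these reservoirs across neighboring pixels and previous frames, which is why it scales to thousands of lights per pixel.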
This thing should come to Blender too
It will definitely happen, but will take many years.
@@nareshkumar3526 Many, many years. Nanite, for example, was released years ago, and to this day the Blender Foundation has made no mention of this tech being implemented in some form in the new roadmap, despite it being arguably the biggest tech breakthrough to happen to 3D art. The same can be said for Eevee Next, which, despite being brand new, is still inferior to Lumen in many ways.
You can say it again!
@@mrlightwriter This thing should come to Blender too
@@GenesisSoon :D