Man, I feel so lucky to be one of the few able to experience fully path traced Cyberpunk 2077 in all its glory right now. The 4090 is the first GPU I've picked up since the 1080 Ti at launch where the performance leap genuinely blew me away. Coming from a 3080 Ti, the jump was huge, and now getting to experience stuff like this is just insane. Can't wait for the next few years and what they bring to gaming, especially with AI coming in hot... I foresee a future where you can have full-length dynamic conversations with NPCs in video games, fully dynamic side quests, and to a lesser degree full story missions as well... so insane.
Nice, great tech showcase! I'm running overdrive at 4k with dlss ultra performance on an rtx 3080 10gb. Locked 40fps in the benchmark. Very playable overall, but I need the expansion to pick the game up again. Already finished it 4 times (love it).
It's sad to see CDPR ditch their REDengine. All that technical knowledge now moves over to UE5, an engine they'll have far less freedom to modify.
@@NigraXXL and I'm sure Nvidia's next move is to incorporate their real-time path tracing tech into UE5, seeing how popular the engine is going to be this gen. Maybe there will be an option to switch out hardware Lumen for path tracing in future UE5 games on PC.
Uh, what? CDPR and Nvidia were very clear about what this technical preview was. For Nvidia it was a showcase not just to consumers, but to the industry at large, that realtime pathtracing is fucking here and it's 100% viable. That if it can run on a Triple A monstrosity like Cyberpunk, it can also run on much smaller, lighter indie fare. For CDPR, they flat out said it's them testing pathtracing ahead of its implementation in Witcher 4 and the next Cyberpunk game (both of which are apparently in simultaneous development). So while their REDengine is no longer in use, their knowledge and experience with pathtracing will be that much better going into the development of those new games.
Fantastic video! It's great to learn about a lot of the work, and complexity, that goes into making a technique like path-tracing performant in realtime.
Hey Alex, great video! Just a quick note: SIMD stands for "Single Instruction/Multiple Data". This really defines the work that can be done by GPGPUs. While a GPU strives to be inherently concurrent by design, GPU programs are usually bound to working on independent data using the same instruction on every sample. Therefore it's a single instruction working on multiple data! Thanks for your great work - keep it up! Kind regards :)
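To make "single instruction, multiple data" concrete, here's a tiny CPU-side sketch using AVX intrinsics; the GPU version of the same idea is a warp or wavefront running one instruction across 32 or 64 lanes at once. This is purely illustrative and has nothing to do with the game's actual code:

```cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    // Eight independent data elements per operand...
    alignas(32) float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    alignas(32) float c[8];

    // ...but a SINGLE add instruction processes all eight lanes at once.
    __m256 va = _mm256_load_ps(a);
    __m256 vb = _mm256_load_ps(b);
    __m256 vc = _mm256_add_ps(va, vb);   // one instruction, multiple data
    _mm256_store_ps(c, vc);

    for (int i = 0; i < 8; ++i) std::printf("%.0f ", c[i]);  // 11 22 33 ... 88
    std::printf("\n");
    return 0;
}
```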
RT Overdrive and path tracing look like technologies actually worth using, unlike plain RT where you take a substantial performance hit for little visual improvement.
We will definitely be able to run Cyberpunk natively at 60 FPS with overdrive mode when the RTX 6090 is released. Nvidia and CDPR have showcased an amazing demo, providing us a glimpse of what types of games we can expect in the future.
I love how we get a great video like this from DF. It's informative and professional. Only for about a quarter of the comments to be either: 1. iTS SinGLe iNstRucti0n, MultiPle dAta, or 2. 'nvidia shills'.
The biggest bottleneck to hardware raytracing would be the very shallow acceleration structure. We only get *two* hierarchy levels in our BVH tree - and we're forced to use axis-aligned bounding boxes to boot. At least let us have a tree as shallow or deep as we want, and then let us use other parametric, fast-to-intersect-rays-against shapes like spheres to bound our geometry and clusters of geometry in.

The fact is that when a ray intersects the AABB of a piece of geometry, for instance, the ray then must be tested against *every* triangle in the model. This is a very very very slow way to do things. I remember when Nvidia announced the RTX 2000 series GPUs - I thought "no way! they must've come up with some genius spatial indexing structure to eliminate the need to test rays against thousands upon thousands of triangles!" Boy was I wrong. They're just brute forcing it! Imagine if rasterization had to search for the triangles that overlapped each pixel, where 99.9999% of the triangles it tests simply don't overlap the pixel at all. It's effing insane. It's downright stupid. If you have thousands of rays intersecting a model's AABB then each one is going to be testing against the same list of triangles over and over, missing almost all of them. IT'S CRAZY.

If we can have deeper hierarchies then we can group together sections of a model, a few dozen triangles per bounding volume, and speed things up by orders of magnitude. This two-level hierarchy BS just ain't cutting it.
There's a Chips and Cheese article that looks at the approaches both AMD and Nvidia take to raytracing. Nvidia's shallow BVH tree approach allows for greater parallelism by being less sensitive to memory latency. I'd suggest you take a look.
I feel like you should read the Ada Lovelace architecture white paper. Not for the BVH hierarchy itself but for the new primitive (that no game uses - yet): Displacement Micro-Mesh, or DMM. At face value it seems Nvidia has been making strides towards solving this issue, but it will require a paradigm shift to implement it in actual games.
What you said is simply wrong and not the case for hardware raytracing. Each BLAS contains a full BVH down to the triangles themselves. Once a ray intersects a bounding box in the TLAS it runs through the tree in the BLAS and gets down to around 4-8 triangles per leaf node. There are limits to this two-level hierarchy, but they're not what you're describing.
@@chengcao418 The tree is two-levels. I haven't found any documentation that shows otherwise. I was hoping that what you are saying was true - that you can have multiple recursions in there and the TLAS/BLAS is just an abstraction for data structures, but everything I've seen has always been that BLAS is the actual geometry instance and TLAS is the collection of those instances, and there's no BLAS with child BLAS' or the like. Yes, you *can* create thousands of BLAS for handfuls of triangles for a scene like CP2077 has but then you're just performing thousands of ray/hit tests against all of those BLASes in the scene TLAS instead, which isn't much better because now you've got the overhead of managing all of these BLAS instances from the CPU. I really was hoping that I was just reading the API documentation wrong, and that there was something in there that allowed a TLAS to have child TLAS instances, or anything to that effect, but I haven't seen it. I'd love if you could show me where it says that it's not just a 2-level tree.
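For context on this exchange: in the DXR/Vulkan APIs the two levels are the instance level (TLAS) and the geometry level (BLAS); how the driver organizes triangles inside a BLAS is up to the implementation, but it is generally a full tree down to small leaf ranges, which is what this rough, hypothetical C++ sketch assumes (illustrative types only, not the actual API or how the RT cores literally work):

```cpp
#include <cfloat>
#include <vector>

struct Ray  { float origin[3], dir[3]; };
struct Tri  { float v0[3], v1[3], v2[3]; };
struct AABB { float lo[3], hi[3]; };

struct Node {                          // one BVH node (interior or leaf)
    AABB bounds;
    int  left = -1, right = -1;        // children, if interior
    int  firstTri = 0, triCount = 0;   // triangle range, if leaf
    bool isLeaf() const { return triCount > 0; }
};

struct BLAS { std::vector<Node> nodes; std::vector<Tri> tris; };

// Intersection math omitted for brevity.
bool hitAABB(const Ray&, const AABB&)          { return true; }
bool hitTri (const Ray&, const Tri&, float& t) { t = FLT_MAX; return false; }

// A ray that enters a BLAS is NOT tested against every triangle in the model;
// it walks the tree and only ever reaches a handful of leaf triangles.
float traceBLAS(const BLAS& blas, int nodeIdx, const Ray& ray) {
    const Node& n = blas.nodes[nodeIdx];
    if (!hitAABB(ray, n.bounds)) return FLT_MAX;   // prune the whole subtree
    if (n.isLeaf()) {
        float best = FLT_MAX, t;
        for (int i = 0; i < n.triCount; ++i)
            if (hitTri(ray, blas.tris[n.firstTri + i], t) && t < best) best = t;
        return best;
    }
    float a = traceBLAS(blas, n.left, ray);
    float b = traceBLAS(blas, n.right, ray);
    return a < b ? a : b;
}
// TLAS traversal looks the same, except its leaves point at instances, whose
// rays are transformed into object space and handed off to traceBLAS().
```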
IGN's video seems to imply that the look is "subjective", down to each viewer's preference, and that there are scenes where the older tech, Ray Tracing Ultra, can look better in certain situations than the more modern path tracing.
@@RicochetForce 4 is usually adequate for diffuse lighting in offline renders, unless we are talking about extreme examples like Alex showed. I am slightly disappointed that it's only 2 by default and not 3. (Imo GI is the biggest wow factor in path tracing)
@@hastesoldat Oh, agreed. I think 4R/4B struck a nice middle ground between fidelity, performance and avoiding the worst of the artifacts. From there it would be better to focus on better denoising, more efficient ray/triangle intersection calculation, extending the propagation range, working on a solution for transparent surface caustics that doesn't murder performance, etc...
I studied Computer Science 25 years ago and back then lightmaps were the big thing. Raytracing was just a dream. Now, as a game dev, I feel this graphics improvement has been amazing. We've been waiting a long time for this quality to finally reach realtime.
Maybe I’m nitpicking, but I assume you mean real time ray tracing was a dream 25 years ago. Ray tracing itself definitely was a thing, it’s just that a single frame would take hours to render so forget using it in a video game.
But even then, it was used in stills and movies (where time wasn’t a factor) and the results at the time were absolutely breathtaking, maybe even more so than today given other technologies couldn’t even come close.
@@jmanig76 But that is the beast to solve. The quality isn't really what's impressive; the time in which it's done is.
With enough time and physical considerations you would be able to create a photorealistic image - for example, compare the first Avatar movie with the second.
But to do it in a feasible timeframe, or even in real time, is much harder, since you have to approximate, especially as exponential performance increases at the same price seem to be ending in our lifetimes.
Didn't Legends of Valour use ray tracing?
@@jmanig76 yes real-time raytracing.
Yup real time RT. I also studied CS just over 10 years ago and even then we were taught it was so prohibitively expensive that you wouldn't even countenance it for real time use. Nuts where we're at... but that's always the way in this field!
The fact that we're getting a Triple-A game to implement path tracing and be remotely playable (let alone 60 FPS+, albeit with upscaling) is just nothing short of a miracle.
Bingo. A lot of people were slagging the ever living hell out of Nvidia and Quake back then. But it's clear that Nvidia is playing the long game. We have one of the most visually complex games of all time running realtime pathtracing and it's completely playable. It's borderline surreal to me.
AAA game? We’ve must have played something totally different. Outside of *Graphics, nothing is impressive or innovate with CyberPunk.
Especially the dumb founded artificial intelligence and lack of a true sandbox world/game.
This is simple a tech demo for graphics, one of the worst games I’ve ever played with such an immense 100+ million dollar budget
@@tears2040 Dude, can you chill the fuck out already? We are talking about rendering and graphics here, and in those terms Cyberpunk is ahead of anything we have today. Also, it's not a sandbox game, it's an RPG. It's not GTA like you expected for some reason. As an RPG it's a solid game. So stick to the topic. This video was about graphics...
@@tears2040 Bro, "AAA" is just about budget, nothing else. Nobody asked what you thought of the game.
@@konga382 right, but have you heard that cyberpunk bad?
Thanks for covering this. Seeing this paradigm shift in real-time rendering is nothing short of amazing.
It’s almost as if it hasn’t taken 50 fucking years….
@@Rebelscum264 To be fair, prior to 2018 I don't remember real-time path tracing to be even on the agenda for gaming in the coming years. Maybe I was out-of-the-loop.
@@Rebelscum264 Bruh.
@@dominicdibagio7166 You aren't really that out of the loop, you're mostly correct. Before Nvidia's ray tracing acceleration I remember there were only one or two path traced engines (one of which was called Brigade 3), which were pretty obscure, with only one game under development (with lots of noise), plus an unreleased Quake Wars demo (from Intel, I think). I remember the Titan Black being able to render a scene at 640x480 30fps (I think it was CUDA accelerated, too). Almost everybody thought path tracing was a pipe dream, and without Nvidia's push for dedicated RT cores it would have remained such. The boost in development has been significant, and probably wouldn't have happened with pure brute-forcing through CUDA and the like.
Except for the overreliance on temporal accumulation - yeah, it's quite amazing.
Videos like these are greatly appreciated! There are so many technologies at work to produce the results you're talking about. I thought I understood what Cyberpunk 2077 was doing with its overdrive mode, but I had not heard about ReSTIR whatsoever until now. I love learning what's going on under the hood!
Maybe the next game to implement RT overdrive could be Marvel's Spider-Man: Miles Morales
@@perpetualprocrastinator Regular Spider-Man too
@@perpetualprocrastinator Though it would be awesome, I doubt Sony would allow it. That would place the visuals too far ahead of their flagship console, which couldn't come close to running it at a playable frame-rate.
I don't think the PS5 can do this, otherwise I would go and buy it now.
DF: You're probably the ONLY channel on the Tube that does things like this. Others mention and regurgitate what's already been said or known. YOU BRING FRESH fruit to the table and that's why I subscribe and give you a thumbs up - EVERY TIME - when I watch your videos.
Damn, ride harder...
NX_Gamer also does excellent technical analyses.
They are just regurgitating Nvidia marketing material, word for word...
"You're probably the ONLY channel on the Tube that does things like this." no, plenty of other channels do game tech and graphics stuff, and also just regurgitate PR, and fanboyistic BS, because making actual content is hard and you have to do it weekly, and also most of these people wouldn't be that good at it even if they only had to do it once per month.
For someone who doesn't know much about the deeper details of how this technology works, the analysis is just as impressive as the actual showcase! Great job presenting all this, I can't imagine it's easy to condense and organize all this info.
Impressive? What? A defective game and a choppy story? Oh, you're talking about the light effects and the mirror effects in the water. Uhh, who the hell cares about these fake things? The game is still bad...
@@gaborszabo7765 The whole video that this person is commenting on is about path tracing, a technical aspect of Cyberpunk. Nothing about this video, nor the person's comments on it, is about the merits of Cyberpunk the game. You would know all of this if you watched the video, or even just read its title.
Stop being stupid. Surely you're better than this.
@@gaborszabo7765 Clearly you don't understand my comment, so I'll just leave it at that.
@@gaborszabo7765 Homie you gotta learn English before you try to comprehend a sentence and then make a rebuttal in that language.
@@gaborszabo7765 Do you always struggle with reading, or is this a one-off thing where you see someone mention something positive about a video related to Cyberpunk and it turns your brain off?
Going from Quake 2 RTX to Cyberpunk 2077 Overdrive in only 4 short years tells me in 10 years path tracing will be close to the norm. That really is pretty fascinating knowing that milestone is achievable in 4-5 more GPU generations.
Yeah, that rate of growth is nuts. By the 60 series it will likely be fully playable on midrange graphics cards. By the time the next gen consoles arrive pathtracing will be common place and by the time we hit 2033 I'm certain rasterized lighting will be dead as a doornail.
I'm calling it now: There's going to be a TON of remakes using pathtracing as their selling point. Example:
Final Fantasy VII Remake: Pathtraced Edition/Illumination Edition
The effect is that transformative
Here is how I see it…
I believe RT can be a standard and maybe replace rasterization as the default lighting system in the PS6 generation,
Path tracing tho? Could be a standard by the PS7 generation…
games are usually designed with consoles as the baseline, and console RT is pretty weak right now.
PS6’s RT should be light years beyond the PS5’s!
@@RicochetForce man I wish FF7 Remake had ray-tracing, let alone path tracing!
Game would look so beautiful with it!
@@RicochetForce On the other hand, prerendered videos are moving AWAY from path tracing because it's inefficient, and you can make an animation in UE today without any RT that looks unbelievable.
This stuff is already playable on 30 series.
I'm glad DF makes high quality videos like these, talking about the many technologies that are used to make path tracing work in a triple A game like Cyberpunk 2077 without cutting much of the technical parts. I might not understand 100% of what was said, but I'm glad I know at least names and overview of techs used
Alex's tech focus videos are absolutely underrated.
But we are still waiting for the Hogwarts Legacy PC video from Alex…
they are not underrated at all, stop being a bot
@@Mrbigkamkam Doubt that's ever gonna happen 😔
@@Mrbigkamkam Nobody has thought about that game since a week after it came out tbh
No.
Compared to the views other channels get, this channel is quite possibly very underrated. I didn't know about it and I absolutely regret that. It's the best among the lot of channels which only provide surface facts without giving a proper feel. For example, I was actually able to visualize how ReSTIR works, or how increased L2 cache would affect frame latency and fps in ray traced games because of how random rays are traced and textures are interpreted per frame. I can't possibly subscribe to you more. I would have loved to even pay to support your channel if I weren't a student and had a working job.
I loooove the in depth technical details in this video! Thx Alex and DF for making me and fellow nerds happy! 😊
Alex, you can turn SER and OMM on and off in Portal RTX from the developer menu. It's possible that it can be switched off in Cyberpunk via a mod. There are multiple mods that deal with ray counts as well.
Yeah, I actually want to see Alex cover the mod specifically. Seeing people crank it up to 6 rays/6 bounces leads to eerily realistic scenes.
@@RicochetForce I found visual bugs with anything more than 2 bounces - things started to glow after a while - and some other settings might need adjustments as well. I'm playing with 2 bounces and 6 rays per pixel and the noise is significantly reduced, although I lost 40 fps on my 4090. I'm still able to get around 80 fps though, so it's all good.
@@cpt.tombstone The eye adaptation in the game’s rendering pipeline likely needs to be adjusted to scale properly with the number of bounces. Just like how we see things look too dark with only 1 bounce with that mod.
They made that video a week ago.
@@void-2b they've made a video about reducing bounces and rays per pixel to improve performance, I'm talking about the opposite.
Another important factor is how BVH traversal inherently works. It has logarithmic time complexity, which means that if the number of triangles increases by 1000x, the BVH traversal won't take 1000x as long; it might only increase by 2-3x (rough numbers are sketched at the end of this thread).
True, which is why sometimes in games like Dying Light 2, using RTGI halves your framerate, but adding additional effects on top like ambient occlusion and reflections may decrease performance by just 5 additional fps.
Unfortunately, we also increase triangle counts exponentially, so it evens out to linear over time 🤷
@@SimonBuchanNz Luckily there is an upper limit to how many triangles it's sensible to have, so at some point it'll start plateauing. E.g. triangles so small that you have tens of them in a single pixel is seriously overkill. Microtriangles are a thing and used in offline rendering, but even there there's a "limit" of diminishing returns..
@@phizc I wonder. We are already seeing things like Nanite targeting a triangle per pixel, but I don't know if you can apply that to ray traversal (sounds interesting though!)
Very long term, triangle per atom? 😄
That's from the graphics card's perspective. But the CPU will have 1000x more work to compute the BVH, and the BVH itself will have 1000x more data - so it will need MUCH more VRAM. At that dataset size you also start to get more cache misses and need a bigger cache. So yeah, in theory 1000x more triangles isn't much more work to trace, but it's still a factor - one we won't see soon (and probably won't need, looking at Nanite).
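To put rough numbers on the logarithmic-traversal point at the top of this thread (assuming an idealized, perfectly balanced binary BVH, which real builders only approximate):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Depth of an idealized balanced binary BVH is roughly log2(triangle count).
    const long long counts[] = {1000000LL, 1000000000LL};   // 1M vs 1000x as many
    for (long long n : counts)
        std::printf("%lld triangles -> ~%.0f levels deep\n",
                    n, std::ceil(std::log2((double)n)));
    // Prints ~20 vs ~30 levels: 1000x the geometry, only ~1.5x the traversal depth
    // per ray (ignoring the CPU-side build cost and memory footprint noted above).
    return 0;
}
```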
Love these insightful dives into tech. Thank you Alex!
Great video Alex, I did not expect you to give an introduction to ReSTIR and NRC. That was a pleasant surprise. I also did not know the game doesn't use OMM. Very exciting times we're in, I must say.
Alex, you're the only one who pronounces the name of CD PROJEKT RED correctly! Respect and greetings from Poland!
Great educational video. Thank you.
Just a small nitpick: at 8:29 ReSTIR stands for Reservoir (based) SpatioTemporal Importance Resampling, hence why it's "Re"servoir based "S"patio"T"emporal "I"mportance "R"esampling, i.e. ReSTIR. Maybe y'all left that out because it's a little more complicated, but it makes the acronym make sense (there's a small sketch of the reservoir idea at the end of this thread).
This acronym itself gets different descriptions in different papers so... No one really cares about what it expands to anymore
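For anyone wondering what the "reservoir" part of the name actually buys you: at its core it's weighted reservoir sampling, i.e. streaming through lots of candidate light samples per pixel while storing only one, kept with probability proportional to its weight. A stripped-down, purely illustrative sketch (real ReSTIR adds spatial and temporal reuse of these reservoirs on top, which is where the big wins come from):

```cpp
#include <cstdlib>

struct LightSample { int lightIndex; float weight; }; // weight ~ estimated contribution

struct Reservoir {
    LightSample chosen{};        // the single survivor
    float weightSum = 0.0f;      // total weight of everything streamed through
    int   seen      = 0;

    // Stream in one candidate; keep it with probability weight / weightSum.
    void update(const LightSample& s, float rand01) {
        weightSum += s.weight;
        ++seen;
        if (weightSum > 0.0f && rand01 < s.weight / weightSum)
            chosen = s;
    }
};

// Per pixel: look at many lights cheaply, then shadow-trace and shade only the survivor.
Reservoir pickLightForPixel(int numCandidates) {
    Reservoir r;
    for (int i = 0; i < numCandidates; ++i) {
        LightSample s{ std::rand() % 1000, (std::rand() % 100 + 1) / 100.0f };
        r.update(s, (std::rand() % 1000) / 1000.0f);
    }
    return r; // r.chosen is the importance-sampled light; weightSum/seen help keep the estimate unbiased
}
```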
Most of my time playing this game is spent looking at lighting scenarios on walls. Sometimes I'll hear a radio, seek out the radio, assault the group of people standing around the radio, turn it off and go back to looking at walls.
God tier coverage. It's my favorite channel to see an update.
Wow, this is a proper research video. Thanks for highlighting these research innovations and their significance in modern games. No other channel does stuff like this.
Wow, this kind of deep dive is truly awesome, I learned a lot! Here's hoping there are more titles soon that push the envelope further in path tracing so we can see more; I feel like we're on the cusp of really incredible lighting. Everything in Overdrive has this really nice weight and place to it.
This is *fascinating.* Thanks for the lesson, Alex!
I sense John’s presence heavily in these gorgeous b-roll shots throughout the video. I see you hiding behind that curtain, John!
Yeah, John is like the Kubrick of DF. You can't mistake his cinematic style for anyone else's.
At 5:29, SIMD is listed as "Simultaneous Instruction / Multiple Data" but that is incorrect. SIMD stands for single instruction, multiple data, as one instruction is carried out across, for example, 32 lanes in modern GPU architectures.
As a software engineer, thanks for correcting and clarifying this for others.
Thank you for pointing that out. If it wasn't for your post I would have made one.
Already had a hunch that there's something off, but didn't come to mind. Thanks for the correction. :)
I was wondering why you said RT Overdrive is "mostly path traced". I'm glad to finally know that you were referring to limitations in transparent reflections. I wonder if these will be fixed in future updates. Refractions and caustics would also be really cool.
Interesting video, as always.
They will try to switch forwards to path tracing but it's a really tricky challenge.
Yeah, I wouldn't be surprised when the final version of OD is released (as they made it clear this is a work in progress) it'll also include those elements as well.
As far as I can work out, it's still primarily a raster render, i.e. unlit textures are pasted into the G-buffer triangle by triangle, and then lit, shadowed and reflected using rays. I think they are calling this "full raytracing" because, compared to previous titles with RTX features (Shadow of the Tomb Raider, e.g., only did shadows with raytracing, not lighting or reflections), it uses all three techniques instead of just one or two. IIRC, the original Cyberpunk 2077 did reflections with RT but not shadows or GI.
TLDR; it's not actually "full pathtracing", because real pathtracing wouldn't have any problem with the glass (other than not enough bounces causing dark spots). There's a rough outline of this hybrid setup sketched at the end of this thread.
@@Oldman_Gamer2 Good point. Would be cool if they could completely drop rasterization for RT Overdrive at some point.
@@puddingtopf Eventually all game development will move in that direction. Unfortunately that will require even entry level graphics cards having the ability to render pathtraced titles properly. Which is some ways off.
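A rough outline of the hybrid setup described above, in case it helps to picture it. This is a hypothetical sketch with stubbed-out stages, not CDPR's actual code: geometry still gets rasterized into a G-buffer, and the ray tracing then starts from those surface points instead of tracing primary rays from the camera.

```cpp
#include <vector>

struct SurfacePoint { float albedo[3], normal[3], position[3], roughness; };
struct Color        { float r = 0, g = 0, b = 0; };

// Stubs standing in for the real engine stages.
std::vector<SurfacePoint> rasterizeGBuffer()               { return std::vector<SurfacePoint>(1920 * 1080); }
Color traceDirectLighting(const SurfacePoint&)             { return {}; } // light sampling + shadow rays
Color traceIndirectBounces(const SurfacePoint&)            { return {}; } // GI / reflection bounces
Color denoiseAndUpscale(const std::vector<Color>&, size_t) { return {}; } // denoisers + upscaler

std::vector<Color> renderFrame() {
    // 1. Visibility and material data still come from plain rasterization,
    //    which is why no primary "camera rays" ever need to be traced.
    std::vector<SurfacePoint> gbuffer = rasterizeGBuffer();

    // 2. Lighting, shadows, GI and reflections are then gathered with rays
    //    that START at the G-buffer surface points.
    std::vector<Color> noisy(gbuffer.size());
    for (size_t i = 0; i < gbuffer.size(); ++i) {
        Color d = traceDirectLighting(gbuffer[i]);
        Color g = traceIndirectBounces(gbuffer[i]);
        noisy[i] = { d.r + g.r, d.g + g.g, d.b + g.b };
    }

    // 3. The sparse, noisy result is cleaned up and upscaled to the output image.
    std::vector<Color> image(noisy.size());
    for (size_t i = 0; i < noisy.size(); ++i) image[i] = denoiseAndUpscale(noisy, i);
    return image;
}
```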
Just want to show my appreciation for this outstanding video, thanks for explaining path tracing in such depth.
Thank you for the video, Alex!
Having grown up seeing a new rendering or game design paradigm occur every year or two in gaming, I recognized Cyberpunk 2077's Overdrive mode for exactly what it was immediately.
I'm happy for all the people that were too young to have experienced those leaps to experience one of their own. Pathtracing is a massive step forward, and one that was considered a fever dream for realtime rendering for years.
The references at the end really make me respect the academic ethic of this channel.
To come to this in such a short amount of time? This is truly incredible. Kudos to all the graphical engineers.
16:46 Respect for going with the original pronunciation of CD Projekt RED!
Hell yeah Alex! These Tech Focus videos are some of my DF favorites.
I LOVE these in-depth videos on the technology! Please do more of these!
Large caches are useful for ray tracing because of the huge acceleration structures (and geometry) that have to be consulted during ray traversal by the aforementioned incoherent rays, and it is expensive to wait on fetches from VRAM. While massively parallel GPUs do a great job at hiding these types of latency, they have their limits.
Caustic (later purchased by Imagination) had a solution for this, which could be implemented in software to improve cache efficiency: batch rays that are travelling in roughly the same direction, and compute all rays in these batches in one go. This works because there is a high likelihood of traversing the same structural nodes and intersecting with the same set of geometry, improving cache hits dramatically.
Another solution is to utilize imposters, reducing the size of both structures and geometry while providing high quality results. I believe UE's Lumen does this (as some SDF amalgam), at least in its software mode.
This is actually a big use case for the shader execution reordering (SER) mentioned in the video. Without going into too much detail, SER lets you reorder and compact threads in a grid according to some arbitrary condition. You could, for instance, reorder threads after tracing indirect rays so that you have groups of threads all executing the same material shader. As for reordering the rays themselves, this would be under the hood, and there's no particular reason to think that Nvidia doesn't already do it. The ray tracing hardware is something of a black box, where shader threads request a ray to be traced, go to sleep, and at some point wake up again when the intersection results are ready.
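A software analogue of both ideas (Caustic's direction batching and SER's material regrouping) is simply a sort by key before doing the expensive work, so that neighbouring threads touch the same code and data. A hypothetical sketch of the principle, not what the hardware does internally:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct Hit {
    uint32_t materialId;  // or a quantized ray-direction "octant" for direction batching
    float    t;
    uint32_t pixel;
};

void shadeCoherently(std::vector<Hit>& hits) {
    // Reorder so that hits needing the same material shader (and thus the same
    // textures and code paths) are processed back-to-back: far fewer cache misses
    // and far less divergence than shading in plain pixel order.
    std::sort(hits.begin(), hits.end(),
              [](const Hit& a, const Hit& b) { return a.materialId < b.materialId; });

    for (const Hit& h : hits) {
        // shadeMaterial(h);  // every consecutive hit now runs the same shader
        (void)h;
    }
}
```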
I love tech focus. Alex truly has a gift for explaining the most complex of modern rendering tech succinctly and comprehensively.
People acting like this is just a marketing gimmick kinda piss me off. As a 3D artist I understand this is a huge technical milestone.
I'm in the same boat. Path tracing means that photorealism is just a matter of the quality of the assets themselves, which is mainly down to skill and time.
Once hardware is more performant and the sample counts increase, that'll be realtime lighting solved, just like that. Alongside virtualized geometry and more advanced simulations, graphics will be pretty much perfect
@@existentialselkath1264 it's not even that. With how quickly AI is evolving, once RT is mature and widespread enough, AIs will be able to produce high quality assets good enough for production
@@NigraXXL it'll probably play a big role eventually, but AI (more accurately, machine learning) has yet to produce any quality 3d work or temporally stable video, nevermind accurate lighting, forget about at a performance level comparable to even full path tracing
Thing is it's like all the other great leaps forward. At this exact point in time for the vast majority it is little more than a gimmick because the cost of using these features is too great, better to sacrifice them for better performance.
However in a few years time that attitude will be changing, just as has happened with RT generally, as accessibility and adoption increase. It's a bit gimmicky, but marks the beginning of a new era
Stooge, it's simple.
They don't understand.
Laymen didn't understand the need for more hard drive space in the 90s.
Laymen didn't understand the need for hardware accelerated graphics in the 90s.
Laymen didn't understand the move to deferred rendering.
Laymen didn't understand the move to HD and the development challenges and asset creation pipeline changes that needed to be made.
Laymen didn't understand what bump mapping actually was.
Laymen didn't understand the importance of PBR.
Laymen didn't understand how momentous the development of dynamic lighting was.
Laymen didn't understand the ramifications of the move to raytracing.
And now laymen don't understand that pathtracing is the goddamn future. It's a technical milestone, a sea change in how lighting (and development at large) will be handled going forward, etc...
Consistently, in every era, they get angry, loud and belligerent. They claim it's a scam to get them to buy X product, that developers are lazy and should focus on fixing the games instead of making them pretty, and similar garbage, etc...
There's nothing wrong with not understanding something, but the difference here is that they are proudly ignorant and refuse to hear otherwise. They'll spread negative word of mouth to people who are unaware of these advances. That's where it pisses me off.
I've been replaying the game with path tracing on, and while the performance has been a bit rough in some spots even on a 4080, the way it transforms the look of the game has been amazing.
Just use frame GEN
@@FeroxX_Gosu I've been using dlss3. It helps but it's still got its issues.
@@FeroxX_Gosu Frame gen can add latency if the base fps is low.
We've come so far, yet we're basically still in the infancy of real time ray tracing.
That already makes me excited to one day see your "10 Years of Ray Tracing" retrospective.
It's crazy how many tricks they can use to make this one feature work. From architecture changes to ReSTIR to deep learning super sampling and upscaling.
You're only rendering a fraction of the final image. It's so clever. And it's a glimpse of what the next generation will be: infinite detail with Nanite and real simulated lighting with path tracing.
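"Only rendering a fraction of the final image" is easy to put numbers on. The commonly cited per-axis render scales for DLSS are roughly 0.67 for Quality, 0.58 for Balanced, 0.5 for Performance and 0.33 for Ultra Performance; treat the exact values here as approximate:

```cpp
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;                  // 4K output
    struct Mode { const char* name; double scale; };     // approximate per-axis render scale
    const Mode modes[] = {
        {"Quality", 2.0 / 3.0}, {"Balanced", 0.58},
        {"Performance", 0.5},   {"Ultra Performance", 1.0 / 3.0},
    };
    for (const Mode& m : modes) {
        int w = (int)(outW * m.scale), h = (int)(outH * m.scale);
        double pct = 100.0 * (double)(w * h) / (outW * (double)outH);
        std::printf("%-17s -> %dx%d internal (~%.0f%% of the output pixels)\n", m.name, w, h, pct);
    }
    return 0;
}
```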
Great vid. Highlights how quick things are advancing. People just don't realize because they only pay attention to rasterization.
Moore's Law for gaming! Great video as always!
The game with full RT looks amazing. I never got hooked into finishing the game during my first time playing it, but the way it looks now has me playing a new game from the start.
Amazing work. Love the technical review regarding RT-RT graphics. Thanks for sharing!!
Cyberpunk 2077 Overdrive looks absolutely amazing. It might not be available for every graphics card, but as with everything, eventually it will get to the mainstream. And it's worth the wait. Some of the scenes just couldn't look any better.
True, I have a 3080 and a 4K OLED, so obviously I will wait until I have a newer setup to play it. But it's progress and looks amazing. Wish every 4090 owner a great time with it xD
@@maxmusterman6030 Yeah... I tried it on my 3080 with the mod for a bit of extra performance. Still isn't worth using imo. Can't enjoy the visuals when it either doesn't run smooth, or you're compromising too much resolution. Just a lack of clarity on the whole. Not sure why people with cards below 3080 are even bothering lol
Oh and I got a C2 as well
@@Brandywine92 I have a GTX 1650 laptop, and the reason I still care about this is that tech tends to get cheaper within just a few years. I am still a student, but when I get my first job I can't wait to play new titles on my future PC. The second reason is that I'm a game developer as well - I recently interned at an XR company and loved using Unreal Engine. Videos with as much technical depth as these are worth more than gifts for graphics nerds like me xD
It does not! You can barely notice the difference in a real game unless you take a screenshot.
I absolutely love these ray tracing / path tracing tech-focused videos. Being on the cutting edge of technology is so exciting.
Huh? What's exciting about it? It's just a simple effect... like the glorious TressFX in Tomb Raider. Lol, how many games used it? Let me help: 1, Tomb Raider 😂🤣
@@gaborszabo7765 It's not a simple effect, it's a whole different way of architecting a graphics rendering pipeline. TressFX is an effect for simulating hair (from AMD I might add)
Great video Alex. As usual, I understood maybe 10% of it but I still find it extremely interesting!
An excellent follow-up video from DF on CP2077's Overdrive mode. Personally, I would like to see another near/full path-tracing title in the near future, but on Unreal 5 with high quality Nanite assets. Currently, the Nvidia branch of UE5 already uses RTXGI and RTXDI but has some compatibility issues with newer features such as Nanite trees, so Lumen is still more practical in most cases but has its limitations versus proper path tracing (i.e. reliance on the surface cache, light leaking, noise, limited bounces, etc.).
I was actually wondering what UE5 can bring together with Nvidia's tech. So you're saying that Nvidia has a dedicated team to implement their tech into UE as well as possible?? That sounds really promising actually.
I love y'all for putting out videos like this. Just an absolute master class of explanation and exploration.
Appreciate your knowledge on the subject Alex.
Just got my RTX 4090 yesterday, upgrading from a 2080 Ti. Of course I had to load up CP2077 again. I am blown away by the visuals with Overdrive mode! Wow. Great video.
I got the same feeling seeing Overdrive CP2077 as I did playing Mario 64 for the first time, playing GTA 3 for the first time, playing Gears of War for the first time, playing Crysis for the first time. The only way I could describe it is that it feels like I'm seeing the future, now.
Another game I can recommend is metro exodus enhanced. It is also fully raytraced and you can run it maxed out at 4k 120fps with dlss quality on a 4090.
@@gavinderulo12 Metro Exodus Enhanced Edition is nowhere near fully ray traced. Not even close; there are tons of shortcuts and fallbacks, and it's very slow to update the lighting.
It's a good example of where we were just a couple years ago but it's not anywhere near as impressive as Cyberpunk 2077 RT Overdrive.
@@TheDravic No it's not; its lighting is ray traced only, though I think it still needs a path tracing patch. I don't know why you say it's nowhere near when it literally uses ray-traced lighting only, with no raster lighting.
@@WrathInteractive There's a lot of raster stuff in ME:EE, you're incorrect.
these in depth technology analysis videos are my favorite!
Apparently CP2077 has become the ultimate tech demo.
I love your videos on ray tracing, they are my favorite DF videos and why I subscribe.
Playing 2.0 with path tracing at over 100 fps feels like the future. I can never go back to other games.
1:06 That was Kubrick levels of editing right there. 10/10 edit
i love alex and his obsession with RT
Wow! Great job! So cool to learn about all this stuff!
Yes, I did read somewhere that CDPR said they will implement the OMM tech in Cyberpunk Overdrive, so when the patch releases the FPS will get a lot better in vegetation-heavy areas.
Great! More of these videos please. Hope more games will take advantage of this.
Loved the game and can't wait to see what a sequel looks like.
Amazing video!
I've NEVER cared for Raytracing...but DAMN this is wonderful
It's because before, it was mostly a bolted-on addition in most other games. This shows what can truly be achieved when it's more fully implemented.
Ray tracing is absolutely the future.
@@TheScyy it makes Cyberpunk look like a brand new game. Is overdrive available on PS5??
@@RetroCrisis Nope, and it never will be. The PS5 is about the same power as an RTX 2060 Ti, and it's AMD hardware, which means crappy ray tracing performance and no DLSS.
@@RetroCrisis Unfortunately no; you basically need at the very least a 3070 on PC to even get playable 1080p.
@@ForceInEvHorizon Without ray tracing it has 2070 Super to 2080-level power. With RT it's closer to a 2060 Super, because AMD lags behind Nvidia in RT.
What a time to be alive!
Cyberpunk 2077 is literally a modern graphics benchmark tool.
Ever since I was a kid, I have always had the need to know what's inside everything. This is really amazing. We have come a long way in graphics technology.
The biggest difference is in character lighting and the first-person cutscenes; they look so much better!
Absolutely loved this 'tech focus' piece. Very informative. Hope to see more for different technologies and games, maybe something on Nanite one day.
There was recently a mod for Control that updates the ray tracing and adds DLSS 3 support, as well as ultrawide and HDR. Would love to see a review of that.
Been playing that on my 3080ti and it's fantastic so far.
Yes!
But a quick clarification:
Not DLSS frame generation. Just DLSS 2.X to 3.1.1 afaik.
@@MaxxPlay99 oh so the mod doesn't add Frame gen?
Still an improvement over the old versions at least.
HDR always seemed like the one thing missing from that game! I'll have to give it another look.
@@SimonBuchanNz it really helps with the banding issues that game has.
Also makes the hallways look less washed out from the reflections and distance fog.
@@alephnole7009 Oh, absolutely. Right now I'd advise anyone to drop the 3.1.1 DLSS file into any game they know runs on DLSS 2. The performance gain and improvement to image quality are nuts.
What a great video and what a great guy. Truly you care about games.
Love ray tracing. My only gripe is that in some games, especially Hogwarts Legacy, things are too shiny and reflective, like they went too far and everything looks wet or clear-coated.
Perfect explanation and summary, as always 🙏
Thanks a lot Alex. You are the best!
Man, I feel so lucky to be one of the few able to experience fully path-traced Cyberpunk 2077 in all its glory right now. The 4090 is the first GPU I've picked up since the 1080 Ti at launch where I was genuinely blown away by the performance leap; coming from a 3080 Ti, the jump when I bought the 4090 at release was huge. And now getting to experience stuff like this is just insane. Can't wait for the next few years and what they bring to gaming, especially with AI coming in hot. I foresee a future where you can have full-length dynamic conversations with NPCs in video games, fully dynamic side quests, and to a lesser degree full story missions as well... so insane.
Nice, great tech showcase! I'm running Overdrive at 4K with DLSS Ultra Performance on an RTX 3080 10GB. Locked 40fps in the benchmark. Very playable overall, but I need the expansion to pick the game up again. Already finished it 4 times (love it).
It's sad to see CDPR ditch their REDengine. All this technical knowledge will be carried over to UE5, where they'll have less freedom to modify the engine.
Devs have access to Unreal's source code, so they can heavily modify it. That said, Nanite and Lumen are pretty good as they are.
@@NigraXXL And I'm sure Nvidia's next move is to incorporate their realtime path tracing tech into UE5, seeing how popular the engine is going to be this gen. Maybe future UE5 games on PC will have an option to swap hardware Lumen out for path tracing.
Uh, what? CDPR and Nvidia were very clear about what this technical preview was. For Nvidia, it was a showcase not just to consumers but to the industry at large that realtime path tracing is fucking here and 100% viable: if it can run on a triple-A monstrosity like Cyberpunk, it can also run on much smaller, lighter indie fare. For CDPR, they flat out said it's them testing path tracing ahead of its implementation in Witcher 4 and the next Cyberpunk game (both of which are apparently in simultaneous development).
So while their REDengine is no longer in use, their knowledge and experience with path tracing will be that much better going into the development of those new games.
I wish I had more time to play all of these freaking games. xD
Excellent vid as always Alex. 👏
It's been years since I learned anything new from DF. This video broke that trend majestically. Awesome job.
We are only years away from fully path-traced games; this is so exciting!
22 minutes of joy thank you Alex N DF team!
Recently built a new PC after 2 years away from PC gaming. Really enjoyed this, and it's made me excited to try it out. Thanks Alex.
Fantastic video! It's great to learn about a lot of the work, and complexity, that goes into making a technique like path-tracing performant in realtime.
My god, your videos are so thorough. You guys made my day, even though it's 9:27 PM here xD
Can we all take a second to appreciate the transition at 1:06?
Much appreciated, thank you!
My first playthrough of this will be glorious.
Hey Alex, great video!
Just a quick note: SIMD stands for "Single Instruction, Multiple Data". This really defines the kind of work GPGPUs can do. While a GPU is inherently concurrent by design, GPU programs are usually bound to working on independent data, applying the same instruction to every sample; therefore it's a single instruction working on multiple data! Thanks for your great work, keep it up! Best regards :)
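To make that concrete, here's a minimal CPU-side sketch using SSE intrinsics (my own illustration, not from the video or the comment above): a single `_mm_add_ps` instruction adds four floats at once, which is the same "single instruction, multiple data" pattern a GPU scales up to thousands of lanes.

```cpp
// Minimal SIMD illustration: one instruction operates on four floats at a
// time, the same "single instruction, multiple data" idea a GPU applies
// across thousands of lanes.
#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

int main() {
    alignas(16) float a[4]   = {1.0f, 2.0f, 3.0f, 4.0f};
    alignas(16) float b[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    alignas(16) float out[4];

    __m128 va   = _mm_load_ps(a);       // load 4 floats into one register
    __m128 vb   = _mm_load_ps(b);
    __m128 vsum = _mm_add_ps(va, vb);   // ONE add instruction, FOUR results
    _mm_store_ps(out, vsum);

    for (int i = 0; i < 4; ++i)
        std::printf("%.1f ", out[i]);   // prints: 11.0 22.0 33.0 44.0
    std::printf("\n");
    return 0;
}
```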
I always love these tech dives Alex. Nicely done!
PS I’m also excited for what future hardware will do when faced with PT heavy titles like Cyberpunk!
Great analysis and insights! The visuals are amazing. Awesome to see CP2077 as a tech showcase.
I understand some of these words
RT Overdrive and path tracing look like actually worthwhile technologies, unlike plain RT where you sacrifice substantial performance for little visual improvement.
We will definitely be able to run Cyberpunk natively at 60 FPS with overdrive mode when the RTX 6090 is released. Nvidia and CDPR have showcased an amazing demo, providing us a glimpse of what types of games we can expect in the future.
it will only cost you $3500 to own one!
Great to see another Tech Focus!
I love how we get a great video like this from DF. It's informative and professional. Only for about a quarter of the comments to be:
1. iTS SInLle iNstRucti0n, MultiPle dAta
Or
2. 'nvidia shills'
In 2077, we'll finally be able to run Cyberpunk 2077 at 120 fps with full Path Tracing at 8K using an RTX6090Ti 128GB.
What's still curious to me is that this machine learning stuff ends up cheaper per frame than a natively rendered image.
Tensor cores can be 10x faster than CUDA cores for the specific task. That helps offset a lot.
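A rough back-of-the-envelope sketch of why that can work out (the numbers below are purely illustrative assumptions, not measurements): with DLSS Performance-style upscaling, only about a quarter of the final pixels go through the expensive path-tracing work, so as long as the fixed tensor-core upscaling pass costs less than the shading it replaces, the whole frame comes out cheaper than native.

```cpp
// Back-of-the-envelope frame cost comparison (illustrative numbers only):
// native 4K path tracing vs. 1080p internal rendering plus ML upscaling.
#include <cstdio>

int main() {
    const double native4kPixels = 3840.0 * 2160.0;   // ~8.3 million pixels
    const double internal1080p  = 1920.0 * 1080.0;   // ~2.1 million pixels

    // Assumed cost per path-traced pixel and an assumed fixed upscaling pass
    // that is comparatively cheap because it runs on dedicated tensor hardware.
    const double costPerPtPixel  = 1.0;              // arbitrary unit
    const double upscalePassCost = 1.5e6;            // assumed fixed cost

    const double nativeCost   = native4kPixels * costPerPtPixel;
    const double upscaledCost = internal1080p * costPerPtPixel + upscalePassCost;

    std::printf("native 4K cost:  %.2fM units\n", nativeCost / 1e6);
    std::printf("1080p + upscale: %.2fM units\n", upscaledCost / 1e6);
    std::printf("speedup:         %.2fx\n", nativeCost / upscaledCost);
    return 0;
}
```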
Phenomenal video with some excellent ELI5 technical explanations. Can't wait to see PT becoming even more efficient and widely used!
The biggest bottleneck for hardware raytracing would be the very shallow acceleration structure. We only get *two* hierarchy levels in our BVH tree, and we're forced to use axis-aligned bounding boxes to boot. At least let us have a tree as shallow or deep as we want, and then let us use other parametric, fast-to-intersect-rays-against shapes like spheres to bound our geometry and clusters of geometry. The fact is that when a ray intersects the AABB of a piece of geometry, for instance, the ray then must be tested against *every* triangle in the model. This is a very, very slow way to do things. I remember when Nvidia announced the RTX 2000 series GPUs; I thought "no way! they must've come up with some genius spatial indexing structure to eliminate the need to test rays against thousands upon thousands of triangles!" Boy was I wrong. They're just brute forcing it! Imagine if rasterization had to search for the triangles that overlapped each pixel, where 99.9999% of the triangles it tests simply don't overlap the pixel at all. It's effing insane. It's downright stupid. If you have thousands of rays intersecting a model's AABB, then each one is going to be testing against the same list of triangles over and over, missing almost all of them. IT'S CRAZY. If we can have deeper hierarchies, then we can group together sections of a model, a few dozen triangles per bounding volume, and speed things up by orders of magnitude. This two-level hierarchy BS just ain't cutting it.
There's a Chips and Cheese article that looks at the approaches both AMD and Nvidia take to ray tracing. Nvidia's shallow BVH tree approach allows for greater parallelism by being less sensitive to memory latency. I'd suggest you take a look.
I feel like you should read the Ada Lovelace architecture whitepaper. Not for the BVH hierarchy itself, but for the new primitive (that no game uses yet): Displaced Micro-Mesh, or DMM.
At face value it seems that Nvidia has been making strides in solving this issue but it will require a paradigm shift to implement it in actual games.
You're kidding! I thought they were using octrees. You mean they're using traditional BVHs (as in, one BV per model)?!
What you said is simply wrong and not the case for hardware raytracing. Each level contains a BVH down to the triangles themselves: once a ray intersects a bounding box in the TLAS, it runs through the tree in the BLAS and gets down to around 4-8 triangles in each leaf node.
There are limits to this two-level hierarchy, but they are not what you are talking about.
@@chengcao418 The tree is two levels. I haven't found any documentation that shows otherwise. I was hoping that what you're saying was true: that you can have multiple recursions in there and the TLAS/BLAS is just an abstraction over the data structures. But everything I've seen says the BLAS is the actual geometry instance and the TLAS is the collection of those instances, and there's no BLAS with child BLASes or the like. Yes, you *can* create thousands of BLASes for handfuls of triangles for a scene like CP2077 has, but then you're just performing thousands of ray hit tests against all of those BLASes in the scene TLAS instead, which isn't much better, because now you've got the overhead of managing all of those BLAS instances from the CPU. I really was hoping that I was just reading the API documentation wrong, and that there was something in there that allowed a TLAS to have child TLAS instances, or anything to that effect, but I haven't seen it. I'd love it if you could show me where it says that it's not just a 2-level tree.
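For anyone following this thread, here's a rough traversal sketch in the spirit of how DXR-style two-level setups are usually described; the node layout, leaf sizes and intersection helpers below are simplified assumptions for illustration, not any vendor's actual implementation.

```cpp
// Sketch of two-level acceleration-structure traversal (TLAS -> BLAS).
// Structures, leaf sizes and helper functions are illustrative assumptions.
#include <vector>
#include <cmath>
#include <limits>

struct Ray      { float origin[3], dir[3]; };
struct AABB     { float lo[3], hi[3]; };
struct Triangle { float v0[3], v1[3], v2[3]; };

// Placeholder intersection tests; real implementations use slab tests and
// Moller-Trumbore, which the RT cores run in fixed-function hardware.
bool hitAABB(const Ray&, const AABB&)                    { return true; }
bool hitTriangle(const Ray&, const Triangle&, float& t)  { t = 1.0f; return true; }

// Bottom-level AS: a per-mesh BVH whose leaves hold a handful of triangles,
// not "every triangle in the model".
struct BlasNode {
    AABB bounds;
    int left = -1, right = -1;        // child node indices, -1 at leaves
    std::vector<Triangle> leafTris;   // ~4-8 triangles when this is a leaf
};
struct Blas { std::vector<BlasNode> nodes; };

// Top-level AS: instances that each point at a BLAS (plus a transform in reality).
struct Instance { AABB worldBounds; const Blas* blas; };
struct Tlas     { std::vector<Instance> instances; };

float traverseBlas(const Blas& blas, int nodeIdx, const Ray& ray) {
    const BlasNode& node = blas.nodes[nodeIdx];
    if (!hitAABB(ray, node.bounds)) return std::numeric_limits<float>::max();

    float closest = std::numeric_limits<float>::max();
    if (node.left < 0) {                              // leaf: test its few triangles
        for (const Triangle& tri : node.leafTris) {
            float t;
            if (hitTriangle(ray, tri, t)) closest = std::fmin(closest, t);
        }
    } else {                                          // inner node: recurse
        closest = std::fmin(traverseBlas(blas, node.left, ray),
                            traverseBlas(blas, node.right, ray));
    }
    return closest;
}

float traceRay(const Tlas& tlas, const Ray& ray) {
    float closest = std::numeric_limits<float>::max();
    for (const Instance& inst : tlas.instances)       // TLAS: cull whole instances
        if (hitAABB(ray, inst.worldBounds) && inst.blas && !inst.blas->nodes.empty())
            closest = std::fmin(closest, traverseBlas(*inst.blas, 0, ray));
    return closest;
}

int main() { Tlas empty; Ray r{}; traceRay(empty, r); return 0; }
```

The point of the sketch is just that once a ray enters an instance's bounding box, it still descends a per-mesh tree down to a handful of triangles; the disagreement above is really about how much of that internal layout the APIs let you customize.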
the future of realtime rendering is so bright
IGN's video seems to imply that the look is subjective, down to each viewer's preference, and that there are scenes where the older tech, Ray Tracing Ultra, can actually look better than the more modern path tracing in certain situations.
Wow, that looks mind-blowing.
You actually can increase the amount of bounces and rays with mods 😈
There's definitely a point of diminishing returns, but my goodness 6 Rays/6 Bounces looks WILD.
@@RicochetForce where can we see this?
@@RicochetForce thank you, I found the vid! :)
@@RicochetForce 4 is usually adequate for diffuse lighting in offline renders. Unless we are talking about extreme examples like Alex showed.
I am slightly disappointed that it's only 2 by default and not 3. (Imo GI is the biggest wow factor in path tracing.)
@@hastesoldat Oh, agreed. I think 4R/4B struck a nice medium between fidelity, performance, and avoidance of the worst of the artifacts. From there it would be better to focus on better denoising, more efficient ray/triangle intersection calculation, extending the propagation range, working on a solution for transparent surface caustics that doesn't murder performance, etc...
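For readers wondering what the "rays" and "bounces" numbers actually control, here is a generic, textbook-style path-tracing loop (my own sketch, not CDPR's code); `raysPerPixel` and `maxBounces` are the two knobs being discussed, and every extra bounce means another round of intersection tests and shading for every ray.

```cpp
// Generic path-tracing loop sketch: the two numbers people tweak map onto
// raysPerPixel and maxBounces here. Scene intersection and sampling are
// stubbed out; a real renderer or the hardware RT pipeline does that work.
struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };
struct Hit  { bool valid; Vec3 position, normal, emission, albedo; };

Hit  intersectScene(const Ray&)         { return Hit{}; }   // placeholder
Ray  sampleBounceDirection(const Hit&)  { return Ray{}; }   // placeholder
Vec3 mul(const Vec3& a, const Vec3& b)  { return {a.x * b.x, a.y * b.y, a.z * b.z}; }
Vec3 add(const Vec3& a, const Vec3& b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

// Estimate the light arriving along one camera ray.
Vec3 tracePath(Ray ray, int maxBounces) {
    Vec3 radiance{0, 0, 0};          // accumulated light
    Vec3 throughput{1, 1, 1};        // how much each bounce still contributes
    for (int bounce = 0; bounce <= maxBounces; ++bounce) {
        Hit hit = intersectScene(ray);
        if (!hit.valid) break;                       // ray escaped the scene
        radiance   = add(radiance, mul(throughput, hit.emission));
        throughput = mul(throughput, hit.albedo);    // dimmer with every bounce
        ray = sampleBounceDirection(hit);            // continue the path
    }
    return radiance;
}

// Average several paths per pixel: more rays = less noise,
// more bounces = more indirect light, and both multiply the per-frame cost.
Vec3 shadePixel(const Ray& cameraRay, int raysPerPixel, int maxBounces) {
    Vec3 sum{0, 0, 0};
    for (int i = 0; i < raysPerPixel; ++i)
        sum = add(sum, tracePath(cameraRay, maxBounces));
    const float inv = 1.0f / static_cast<float>(raysPerPixel);
    return {sum.x * inv, sum.y * inv, sum.z * inv};
}

int main() { shadePixel(Ray{}, 2, 2); return 0; }   // 2 rays / 2 bounces, as discussed
```

With the 2 ray / 2 bounce defaults mentioned above, each pixel traces two short paths per frame; raising either number scales the work roughly linearly, which is why 6/6 looks wild and costs accordingly.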
Great video and explanation! I really like these deep dives into tech.