Wouldn't it be relatively straightforward to solve for shadows with explicit path tracing without a whole lot more compute, since you're already doing the first direct-illumination step with BRDF sampling + light sampling via multiple importance sampling in real time? All that's left is a fast algorithm to cull occluded shadow rays (the irony) to stop shadows from flickering. Curious, tbh. I love that Alex does these deep dives into rendering from an industry use case in games. I dabbled a lot in physically based rendering for a potential PhD thesis in ML but eventually lost interest; Alex makes me wanna change that, lol. Wish I had time to go through the full GTC and GDC videos.
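(For anyone wondering what "BRDF sampling + light sampling using multiple importance sampling" in the comment above actually computes, here is a minimal Python sketch of Veach's balance heuristic. The helper names `mis_weight` and `direct_light_estimate` and the pdf inputs are my own illustration, not anything from the video.)

```python
# Minimal sketch of multiple importance sampling (balance heuristic).
# pdf_light / pdf_brdf are assumed to come from the renderer's sampling routines.
def mis_weight(pdf_this: float, pdf_other: float) -> float:
    """Balance heuristic: weight one estimator by its pdf relative to both."""
    return pdf_this / (pdf_this + pdf_other)

def direct_light_estimate(f_light, li_light, pdf_light,
                          f_brdf, li_brdf, pdf_brdf):
    """Combine one light sample and one BRDF sample without double-counting.
    f_* = BRDF value, li_* = incoming radiance, pdf_* = sample density."""
    contrib = 0.0
    if pdf_light > 0.0:
        contrib += mis_weight(pdf_light, pdf_brdf) * f_light * li_light / pdf_light
    if pdf_brdf > 0.0:
        contrib += mis_weight(pdf_brdf, pdf_light) * f_brdf * li_brdf / pdf_brdf
    return contrib
```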
OK, this is actually good. Really good, big ups. Yes, that's the way it's going to go now: brains over power. Look at Spider-Man 2's RT reflections; the reason those work is that they used a lot of cheeky tricks like that (sampling, using the temporal data correctly, inheritance). This is probably the only DF content that's slightly technical, from someone who knows what they're talking about; nice change. We had the CPU, then the GPU, and now we have these new units for which we haven't yet figured out the equivalent of what deferred rendering was for rasterization. I've only ever worked with RT, not a pure path-traced environment, so I can just say: yeah, I think you're right.
I just want us to reach a point where we eliminate loading screens and increase the amount of particles/effects/geometry in our games. Basically, increase IMMERSION. WE NEED IMMERSION NOW. Gaming is genuinely getting boring and repetitive.
Maybe, but the unfortunate reality is all gaming will still be held back by the pathetic specs of this generation of consoles. Imagine where gaming would be if consoles didn’t exist and everyone had capable gaming PCs instead…
@@mattspeer01"Capable gaming PCs" would just have had last gen components that held you back anyway. The gtx 1650 is still the second most popular gpu on steam, 1060 3rd. Remember when Alan Wake II launched and people complained their 7 year old GPU's ran terribly. You think the people buying cheap consoles would instead now buy 50 series entry-level GPUs worth as much as their consoles?
I suspect that the industry transition to ARM from x86 will be the harbinger of universal RTX which is one among a number of reasons Nvidia wanted to acquire them. Nvidia's new AI chip uses some ARM architecture and they're supposedly going to be making ARM CPUs for Microsoft products. The chiplet bid that Intel and AMD are going for is simply prolonging the inevitable. Future low end gaming rigs will have an ARM CPU and some motherboard coprocessors (like an NPU) and high end will be an ARM CPU with a discrete GPU. The bandgap breakthrough with graphene will likely be implemented and iterated on quickly, but any advantage it affords x86 can be applied to ARM with greater scalability and efficiency. My forte is analog electronics so take this with a heaping helping of salt.
The direction in which real time graphics will evolve is set... But I don't see the necessary hardware to actually run all that sufficiently well become mainstream this decade with the budget console hardware always being the common denominator.
Devs used to come up with all kinds of cool workarounds to get games running on consoles, and none of them do that anymore. I knew there was a better solution for getting high-quality ray tracing in a game than just pure GPU performance.
The current well-done RT implementations are pretty far from just using "pure GPU performance". There's a ton of research work that has been done to optimize these techniques to work in realtime at all.
It doesn't matter. We are talking about determining which point of which object you see in a specific pixel. We can do that with either rasterization or ray tracing, and it's not where the hard work is. The hard work is determining what color that pixel should show.
I don't think RT is cost-effective right now. It will be when mid-range hardware can run RT at 120fps. Until then, I'd rather have 120 or 144fps than RT.
Path tracing is the first time turning on RT is actually worth it, even though the perf hit is brutal, because the ReSTIR algorithm is simply so much more efficient than naive ray tracing.
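(Some context for the ReSTIR mention: at its core it is weighted reservoir sampling of light candidates, plus spatial/temporal reuse across pixels and frames. A minimal Python sketch of just the reservoir part; the `Reservoir` class and the stand-in candidate weights are my own illustration, not the paper's actual GPU code.)

```python
import random

# Minimal weighted reservoir, the building block of ReSTIR: stream many
# candidate light samples through, keep exactly one, chosen with probability
# proportional to its weight. The reuse that makes ReSTIR fast is merging
# these reservoirs across neighboring pixels and frames (not shown here).
class Reservoir:
    def __init__(self):
        self.sample = None   # the one surviving candidate
        self.w_sum = 0.0     # running sum of all candidate weights
        self.count = 0       # how many candidates were seen

    def update(self, candidate, weight: float) -> None:
        self.w_sum += weight
        self.count += 1
        if self.w_sum > 0.0 and random.random() < weight / self.w_sum:
            self.sample = candidate

# Per-pixel usage: test 32 candidate lights cheaply, shade only the survivor.
r = Reservoir()
for light_id in range(32):
    est = random.random()  # stand-in for an unshadowed contribution estimate
    r.update(light_id, est)
print(f"kept light {r.sample} out of {r.count} candidates")
```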
Question here: I tried to run RTX Remix games (NFS Underground, Max Payne). I copied the GitHub RTX Remix files and the game mod files into the game directory, but when I run the game exe it crashes and won't start. I have a 7800 XT and a 7800X3D.
Are you on the latest drivers? They broke RTX Remix mods except Portal. It also needs Linux drivers to run well on AMD hardware (if you run these games a 7800XT gets like 10FPS on Windows).
Is it really that hard to get both path and ray tracing in, versus only getting one? 60fps should already be the next-gen norm, but in a lot of cases it isn't implemented until later patches. I think having full ray-traced reflections and path tracing would give the full gaming experience, whether on PC or console.
Personally, I think that raytracing has taken the industry backwards. There was such a push for 4K when 4K televisions came out, which made graphics that much sharper; even games on the Xbox One X were pulling off 4K/60. Then raytracing was introduced and became bigger, and graphics cards and consoles could not handle it, so resolution started slipping backwards to the point where things are now just blurry. So what's the point of high-resolution textures if you're just going to end up with lower resolutions to display them? It's like two steps forward and one step back.

Yet games like Metro: Exodus are pulling off 4K/60 with raytraced bounce lighting and shadows with ease, and no other developer can do it? It makes no sense. I'm tired of seeing games running with raytracing at lower resolutions, looking as if I'm seeing them through fogged-up goggles, simply because they want to add the newest buzzword in the industry. If these features can't be done with modern consoles and graphics cards, then they shouldn't be attempted. The thing is, once again, the developers of Metro proved that it can be done efficiently, which makes me wonder what is going on with the rest of the industry.

It makes me believe that developers are just being stretched too thin, and that developing for all of these PCs and consoles is bringing everybody down. Since the consoles are essentially PCs now, wouldn't it make the most sense to develop for the highest-end PC and then scale back for consoles? That way everybody gets the best of everything, depending on their platform. I understand that there's some custom hardware in the consoles, but by now developers should be able to code to the metal. It's been 3 1/2 years and we still have not seen 50% of what these systems can do, which makes me believe this generation was just a wash. I understand it started under extenuating circumstances, but it seems like many developers are just doing the bare minimum to get these games up and running, which is proven by almost all of them needing patch after patch after launch.

But this is what happens when you cater more to your investors than to the gamers playing your games. We are essentially at the spot where we were when the video game crash happened back in the 80s: too many games overpopulating the landscape, making it tough for any one developer to break through. You can't trust what any developer says about their games, because so much comes out broken on release. So many promises fall through. Every once in a while we get a great game, and then it is followed up by 100 that aren't.

I think developers need to start focusing on what's most important: gameplay, story and sharp visuals that draw us into the game, not pushing technology that the hardware isn't ready for. I love raytraced bounce lighting; lighting is one of the most important aspects of making graphics look lifelike. Reflections are nice, but many times they are overdone. Shadows are just as important, as they make things pop. But it's very obvious that only the highest-end graphics cards can achieve any of this, and even those can only do it at 1440p with decent frame rates. So, next gen, maybe?

"If these features can't be done with modern graphics cards and consoles, they shouldn't be done." That's not how we got here... If people had this attitude towards 3D rasterization in the '90s and '00s, gaming would look very different today, and likely not in a good way.

"Wouldn't it make the most sense to develop for the highest-end PC and then scale back for consoles? That way everybody gets the best of everything, depending on their platform?"

It is largely a CPU bottleneck for the lower-end systems and consoles, due to the apparent difficulty in keeping CPU requirements in check that we see in a lot of the newer releases. You develop for the hardware most people are going to have, thus servicing the most customers you can.

I'm fairly familiar with 4A Games (and publisher Deep Silver), their development history, and how Metro Exodus Enhanced Edition was made. They had to tone down the real-time lighting, but were able to do it efficiently by baking everything else as accurately as they could, so that turning on the extra RT settings wasn't so impactful to visuals and performance; that's how they made it work on consoles.

Also, food for thought: current-gen consoles are not running Metro Exodus at 4K, full stop; they run mostly around 1800p, which is about 87% scale. A 4070 Ti struggles to maintain a 60fps target at 4K with a 7800X3D. The game isn't all sunshine and rainbows in terms of performance; they just handled their console launch very well, as they always have, and were able to offer a 60fps update two years after launch, not publishing much if anything else in the meantime. The PC port was pretty rough at launch in terms of graphics settings not sticking or downright not applying. Nothing too gamebreaking, but hardly faultless. They also are not very diligent in updating to the newest versions of DLSS, but very few developers are.
@@crestofhonor2349 I respect you for your reply. So often, people can't agree to disagree; they just go on a rampage because people don't think like they do. I understand it from both aspects. I've been playing games since the 2600, but I think we are at a crossroads in graphics right now. I understand pushing things forward, but it doesn't do gamers any good if games are released broken or with bugs simply because developers wanted to push the latest features.

When 4K gaming became a huge thing, obviously just a few years ago, that sharp and crisp quality to the visuals was amazing, especially in the games that ran at 60 FPS on top of it. We only had a few years of that before the new consoles, and now a lot of games are kicked back to 1440p to include the newest graphics features. Upscaling, at least on consoles right now, just isn't a great option, because it causes too many artifacts: one of them being blurring, the other being ghosting on moving characters. On PC it's a different story, especially with RTX, because its upscaling is more advanced. So we will see.

I always keep my mind open and I really want to see things advance, especially with physics, because that could mean a lot to gaming if it's used for gameplay: giving players and NPCs the ability to use their environment to their advantage, with a multitude of ways to dispatch those enemies, and them, you. Not to mention how much fun it would be in multiplayer to shoot the corner of the top of a building and have the rubble fall onto your opponents.

I'm remaining optimistic and hoping the next generation will be an incredible leap, as Microsoft says. I want to see games taking advantage of it out of the gate, something we really didn't get this generation and still really haven't seen, which is why I'm concerned. We are 3 1/2 years into the generation and there's already talk of Microsoft releasing a new system in two years, when the current system has barely been tapped for its power. It doesn't seem like developers are taking advantage of the hardware's features to get the best out of it. And I understand they have a lot of systems to code for, and you can only do so much without destroying your budget, but we haven't even seen much greatness from the first-party developers.

Like I said in my original post, Metro: Exodus proved that raytracing could be done, essentially full-scene with bounce lighting, at 4K/60. How they got there, who knows? I'm sure there's some upscaling of some sort, but you would never know, simply because it still looks incredibly sharp and the lighting is just gorgeous, which I think is one of the most important aspects of attaining realistic graphics. I wish they would share their secrets with the rest of the industry, because I think a lot of developers could use them right now.

In any case, thank you for the reply. I wish more people could discuss why they disagree and come to an understanding of each other's opinions, instead of going off the deep end and acting like the entire world is coming down around them because somebody doesn't agree with them. Have a great day.
Unfortunately I am not really excited about this, despite having a new PC (RTX 4070 Ti). There are a couple of issues that plague modern graphics, and for me personally the worst is TAA. This is more than a matter of preference: I have epilepsy, and the motion blur that TAA introduces makes me nauseous, so I can't play a game with forced TAA for more than 30 minutes before I feel sick. I also don't like blurry visuals in general and would prefer the tradeoff of blocky-looking edges, but that is just my preference and not an accessibility problem.

Another thing that needs more attention is shader compilation stutter, and unoptimized games on PC in general. I paid almost €2000 for my PC and there are new games that still stutter regardless of the settings. I notice this mostly as micro-stutters rather than classic fps drops; it feels more like a lag or freeze. It feels like hardware manufacturers keep making hardware to keep up with ever more ridiculously demanding features like real-time path tracing, instead of the other way around, where developers work within the limitations of existing hardware and come up with creative solutions. If a game can't run properly on my brand-new high-end €2000 PC, then I don't care how good it looks (and I don't even think it looks good, with TAA blurring the whole image): it has bad graphics, period. For me, "graphics" are the technical foundation of a video game, the end result of what is rendered on your screen, and if the result is something that stutters and that I can't use properly, then it's just bad.

Developers need to focus on high performance first, take more care of people who suffer from motion blur, and give us more graphics options and ways to customize our experience. That is what makes PC gaming better than console gaming: the ability to fine-tune a game's visuals to your liking, without being stuck with what the developer thinks looks best. Most of these new features are just marketing gags that hardware manufacturers like Nvidia want you to believe you need. Game publishers gladly advertise RTX and path tracing because they look good in screenshots, but they are getting played by the hardware manufacturers, in my opinion.

Just take a look at the best-selling games of all time, the most played games on Steam, and the most successful gaming consoles and hardware that have existed. Do you know what they all have in common? Accessibility. This is the key to success, and I can't stress it enough. All the best-selling consoles of all time, like the Game Boy, PS2, Switch, DS, etc., had a low cost compared to their competition and also a lower cost of operation (there is a video on YT about how the original Game Boy, running on AA batteries, cost only 17 cents per hour to run compared to €2.50 for its competition). Hardware power or graphics alone never sold games or hardware!

PS2 = weaker than the first Xbox, but sold better.
Nintendo DS = weaker than the PSP, but sold better.
Nintendo Switch = weaker than current- and last-gen consoles, but sold better.

The first Game Boy didn't even have a backlit screen and was monochrome, at a time when the competing handhelds had amazing graphics, bright screens and, of course, color. But again, guess what sold better? Same with software: why are mobile games such successful money-printing machines? Well, apart from microtransactions, everyone can run them on their low-end devices.
Why are free-to-play games so insanely successful? Because they have well-optimized graphics and scalability, so most potential customers can run them on their devices. Why are old games still so successful? Because, apart from nostalgia, many people with lower-end hardware can install them and just play, without worrying about hardware restrictions.

For me graphics really are important; they draw you into a world and its atmosphere. But graphics are so much more than texture resolution or raytracing! Image clarity, performance, art style and accessibility options matter more than having the most advanced technical features in your game. Games, just like movies, are a form of art that has to work with limited resources (be they computational or budget limits), and I find it far more impressive how developers made game worlds come to life with clever 2D pre-rendered graphics, animations, attention to detail, and every single bit of available performance. Unfortunately, many modern devs don't seem to care about this, and we are left with broken, stuttering, bugged releases that are overpriced and bloated. A game doesn't need 1000+ developers working on it to be good, and neither does it need a budget of 200 million dollars.
TAA is here for a reason, mostly because MSAA doesn't work well at all in modern games. DLAA could improve things, as it often offers a sharper picture and less ghosting than TAA. But even then, TAA needs more tuning, because the default TAA in UE4 and UE5 is bad.

I do agree power has never won a generation, but you also have to factor in cost. Expensive consoles always sell in low numbers. I completely understand why the PS2, Wii, Switch, DS and GB sold so many units: those consoles were cheaper to get, allowing a wider range of people to purchase them, just as you said. People like a low cost of entry because it means more people can buy in.

Developers don't really need to aim for anything but stable performance. Whether they want high or low performance will change on a per-game basis; stable performance is more important than high performance. Given the choice between a stuttery 60fps and a stable 30fps, I'd take the stable 30fps.

Devs absolutely do care. I think you're mixing them up with the publishers who knowingly ship a game unfinished; the devs just do what they can to make the game in the time they're given.
@@crestofhonor2349 SMAA does work well when properly done. TAA is overused because it barely impacts performance, since it's post-processing anyway, but it will always have problems. SMAA & MSAA are not done in post-processing.
@@kevinerbs2778 Many effects rely on the use of TAA, and SMAA and MSAA would leave many things in games untouched by anti-aliasing. I've seen this happen already. TAA can and does look fine with the right parameters.
@@crestofhonor2349 Go watch the Crossout video here on YouTube on TAA vs SMAA. What you said is wrong. SMAA does not miss anything on screen; it's superior to both, since it doesn't just hit edges. It is, however, performance-heavy (25-40%), since it applies to everything on screen in proper real-time rendering; it literally applies to each pixel and is sampled at a higher resolution than the screen's. TAA is only being used because its performance hit is minimal, like 3-10%, that's it. I did research on TAA, and its other problems were already known, since it's post-processed. It's actually terrible at most AA duties other than transparent textures.
PC looks more accurate, but for some reason the PS5 quality mode is easier on the eyes. I'm specifically referring to the Alan Wake 2 comparison in the video; many of the stills look better on the console. Inexcusable for such expensive hardware.
Yeah, because your fanboyism is blinding you. Path tracing, which is used on the PC, brings an improvement to the graphics, especially the reflections, if you observe carefully. And if you prefer console graphics (optimised raster settings), you can run PS5-equivalent graphics settings on the PC as well, and the PC will push a much higher frame rate and resolution.
@@dhaumya23gango75 That's not fanboyism, that's the truth. I'm playing the game after work to relax. If I have to CONCENTRATE to see the difference on $4K-plus hardware, then it's a moot point. The graphics are just too close.
@@bigde131 They're only close because that's what DF told you; the reality is different. This site is Sony-sponsored. They'll compare a little 2019 Zen 2 APU in a $500 box against a high-end 2024 PC CPU and GPU, and set the PC's anti-aliasing levels to "make believe"... you don't even know whether their pictures really come from a PlayStation. If you're smart enough, you don't trust that BS. If they were honest, they would have compared this 2019 APU with a 2017 mid-range PC. The PS3 wasn't native Full HD, and the PS4 and PS5 aren't native 4K; the PS4 Pro was the first Full HD console. PSSR 1.0 on the PS5 Pro will bring hybrid 1080p/1440p output, automatically scaled. Note that you're stuck with an old Zen 2 (something like an i5-8500 with a GTX 1660 or RTX 2050) while AMD is about to release Zen 5... 😉 Note also that the DCS recommended PC specs for low graphics at 30 fps are the same as for the PS version (GTX 1660/1070).
@@exotikification I play on a 120Hz Sony Bravia A80, 55 inch. The picture quality is so good that literally EVERY game I play looks beautiful, it's unreal, from Overwatch 2 to Stellar Blade. With balanced mode or resolution mode enabled it's like looking at a dream, all while sprawled on my bed smoking a joint. Sorry, but PC performance just doesn't justify the price point. My best friend has a PC, yet he comes over to my place to game all the time 🤣. What does that tell you?
@@bigde131 ...that 2019 Zen 2 console graphics have stayed the same since the Xbox One/PS4, and that it's enough; consoles have historically been designed to play on a TV for as long as little console systems have existed :) Personally, I play on an Acer SpatialLabs display (a TV is a has-been in my case; I'm enriching your culture, by the way, lol) or a 3D Vision video projector, and it's stellar! Better than smoking. 😂 And yes, $80 for a AAA game and $80 for a controller, on a console with low upscaled graphics: playing on console is not cheap anymore.
That's great and all, but beautiful visuals hit their absolute peak in the 7th gen, with a mix of tech and artistry. This technology is impressive, but not really necessary to enjoy a game.
When will ray tracing be playable for peeps who have budget cards? I have a 6700 XT and even I can turn on some of the effects, like reflections in Control, at over 60 FPS.
I think the solution to RT with Nanite is that it will all just be replaced by some neural rendering tech that isn't bound by the same issues as current triangle + RT methods.
Good for PC gamers with a very big budget. Maybe in 10 years for the majority of pc-gamers. Console gamers won't see this even in a possible new generation.
For us thrifty console users, how about a comparison of last-gen-compatible engines, like CDPR's RED Engine vs the Elden Ring engine? These beautiful engines didn't have to be left behind, underdeveloped and never used to their full potential. It would be great for those of us less enthused by UE5 and more appreciative of the matured bespoke engines behind ER and CDPR's games. I'd love that, as a less technically advanced, thrifty console gamer.
The engine in Cyberpunk 2077 was hell to work with, because it was built under crunch on a dated baseline. Because of that, they weren't able to implement a refined multiplayer mode... Elden Ring is also on a really old and clunky engine. id Software is going to share the engine tools that were used to make the last Doom games. Also, Source 2 from Valve is a few years away from being ready to share; for now, only S&box is using it.
Why don't devs take all the CPU cores, or one of the basically unused cores, for ray tracing alone? That should be enough power to calculate the lighting without any performance hit. "Work smarter, not harder" is what everyone always says: offload the ray tracing to the CPU and have the GPU do all the other graphics processing. Or maybe dedicate 1 gig of video memory just for lighting effects.
1. CPU cores aren't "basically unused"; they're being hammered just as much as the GPU the entire time the game is running. Whether it's gameplay logic, NPC simulation, or recording and submitting rendering commands, the CPU is doing a *lot* of work, to the point where engines are often architected specifically around managing all of it (look up things like entity component architecture, task scheduling, multithreaded rendering, etc.). Putting more work onto the CPU will not help.

2. The CPU is "too far away" from the GPU in systems with discrete graphics (i.e. your average gaming PC) to be of any use here. There's a *lot* of latency in sending data or commands between the CPU and GPU, and that latency gets *far* worse when you try to send data from the GPU back to the CPU, because the GPU runs behind the CPU in 99% of cases (the ideal case for modern games is that while the CPU is busy simulating frame N, the GPU is busy rendering frame N-1). Best-case scenario, where the CPU is fully responsible for lighting the frame: the GPU has to send the in-progress frame back to the CPU (latency), stop while the CPU lights the frame (latency), then wait for the CPU to finish transferring the lit frame back (latency). Worst-case scenario, where the CPU is responsible only for checking whether a ray intersects any objects and traversing the acceleration structure (this is what RT cores do on the GPU): the GPU has to send each trace query back to the CPU (latency), stop while the CPU traces the ray (latency), wait for the intersection result to come back (latency), then repeat for however many trace queries it needs to offload. Performance would be *abysmal* under this scheme, so this is a Very Bad Idea (TM).

3. "Dedicate 1 gig of video memory" doesn't make sense either. Data alone is meaningless; there has to be code that dictates what to do with that data, and that code is what contributes to poor performance in 99% of cases (the exception is when you put too much data into RAM/VRAM and either cause cache misses on the CPU/GPU or cause data to spill out of RAM/VRAM).
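(A back-of-the-envelope illustration of point 2, with made-up stage costs: the moment you force a GPU-to-CPU round trip every frame, the pipeline serializes and frame time becomes the sum of the stages instead of the slowest stage.)

```python
# Toy model with assumed per-frame costs in milliseconds; not measured numbers.
CPU_MS, GPU_MS, READBACK_MS = 8.0, 12.0, 6.0
FRAMES = 100

# Pipelined: CPU simulates frame N while the GPU renders frame N-1,
# so steady-state frame time is bounded by the slowest stage.
pipelined_ms = (CPU_MS + FRAMES * max(CPU_MS, GPU_MS)) / FRAMES

# Serialized by a per-frame readback: each frame pays every stage in sequence.
serialized_ms = CPU_MS + GPU_MS + READBACK_MS

print(f"pipelined:  ~{pipelined_ms:.1f} ms/frame")   # ~12 ms
print(f"serialized: ~{serialized_ms:.1f} ms/frame")  # ~26 ms
```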
CataFeral's intense stare is because he was in the middle of a chess match against ChatGPT. He was trying his hardest to give himself a handicap so ChatGPT had a chance. In the end, he won. His record, 312-0. A living legend!
Next gen consoles will do path tracing just fine. Meanwhile you can be one of the handful of PC gamers that can experience path tracing if you have a spare $3000 available.
@@Coxy-b34 They will not. Please don't try to claim a $500 console will do real-time path tracing when the current gen can't do ray tracing at any reasonable resolution. We are several generations away from that.
@@a1racer441 I sense your PC elitist rage but history suggests a next-gen console will have a GPU at least on par with a 4080 and thus will be capable of pathtracing.
@@crestofhonor2349 If it depends, then I'm not interested, my dude. A 4070 plays Alan Wake 2 at 30 fps with medium ray tracing!!! It's a technology we are not ready for. When a $300 GPU can run a ray-traced game at 60 fps, then I'll say it's worth it.
I'm curious to see how AMD and Nvidia will segment their GPUs now. I'm sure they're going to rip off customers even more, using RT as an excuse to do so.
Why do you guys, the consistent viewers, still listen to this MARKETING MAGAZINE?!!! RTX sucks performance-wise and will never replace typical game development. Wasted processing with ALWAYS limited processing power. I only saw this video title because YouTube auto-played this damn channel's video in my feed. I DON'T WATCH THESE GUYS BECAUSE THEY DON'T REPORT THINGS HONESTLY. It's all an advertisement. Do they do new flagship phone content? Because that's mainstream tech influencing, I'm sure they're looking into it.
They talked about this seeming overhaul of RT Overdrive, yet literally every lighting bug from the old system still exists (plus more popping up)? Very odd. It really always seems that every Nvidia project is just a shell, or partially completed work, that piggybacks on marketing promises that never come to fruition before being abandoned to waste away.
Everybody in the public space puts their foot in their mouth at some point and licks their toenails. I don't care what he said about the Eve character's body being problematic; that was just an opinion he gave in passing. Alex is talking in depth about the technical aspects and challenges of some amazing things in this video.
@@jcdenton41100 You are just trolling, right? There are rarely topics where one can say there is only one correct answer, but in this case there is. PCs are faster than the Xbox. The Xbox is about 20% faster than the PS5.
Raytracing for me is not such a big deal yet. Even now, in actually demanding RT titles, you need a 4090 to properly do it without upscaling, even at 1440p. Personally, that makes it not worth it. When you can path trace at 1440p at 90-100 fps without upscaling on a $500 graphics card, then I'll feel like it's worth it. Eventually it will get there, but until then... I honestly don't care. Raster is more important. And let's be honest, even visuals without RT are getting better and better. Which is why I went for a 6950 XT instead of a 4070 Ti (which was also more expensive; screw Nvidia's greediness), because raster is still king.
You keep saying "without upscaling"; why is DLSS such a big deal for you? DLSS looks phenomenal these days; in some cases it rivals native rasterization. This seems like an "I can't afford it so it's not worth it" comment, tbh.
@@iamgates7679 I literally play on AMD, if you'd read my message; I don't even use DLSS. I keep saying "without upscaling" because you want the base power of the card to be good. If I said "with upscaling", it would mean the base perf of the card is pretty bad if it needs upscaling to perform decently at 1440p, since you're not even playing at 1440p then. Makes sense, no? People like you make no sense to me. It's almost like you want them to give you a bad product and go "but with this AI upscaling you now get better perf!!!" Unless it's not in the game you play, then sucks for you; or the game doesn't have the upscaler you use, sucks for you. Like, are you seriously that blind? Also, an "I can't afford it so it's not worth it" comment? Really? That just sounds like pure stupidity to me. Of course I could buy a stupid 4090; that doesn't mean I will buy one. No one should support the outrageous price increases that companies like Nvidia have been making. And of course path tracing will be worth it in the future, or are you incapable of reading? I literally said "yet".
@@Pand0rasAct0r_ Literally everything about this statement tells me you have no idea wtf you are talking about. I said DLSS but could've said FSR as well. Go be dumb somewhere else :)
I just swapped my RX 7800 XT for an RTX 4070 Super and I am so happy. My RX 7800 XT couldn't do path tracing even with the FSR frame gen mod at 1440p, BUT the 4070 Super can get 70-80fps with path tracing and DLSS frame gen at 1440p. It's so beautiful 😍
Just dropping in before watching with a thumbs down, because of the shitty thumbnail. YOU SHOULD NOT DO YOUR CREW DIRTY WITH DOPEY THUMBNAILS TO CATCH THE CORPO-ALGO.
All this impressive technological progress, and we still struggle to get AAA games ported to PC without traversal stutter.
That's not the only stutter we have to deal with.
The tech is very good, but to get the most out of it you need high-level programmers, and very few devs want to pay for them, so you get less talented people working on the programming side of the game. All of the latest problems have been down to poor use of the CPU.
@@stephenmeinhold5452 Nothing is impossible with overtime pay
@stephenmeinhold5452 The coders that work for AAA companies are extremely skilled, because there are so many coders who want to make games. There are just too few of them, with too little time, working on massively complex games across several platforms.
Stutters exist on consoles as well. It's bad optimization in general, not just PC ports.
I'm sure they're tired of it, but I would love to see a Cyberpunk video covering the current and "final" state of the game and what optimised settings look like today versus years ago.
We're slowly moving in a direction where tracing rays from light sources instead of from the camera becomes more viable, because rays traced from a static light source can be cached *much* more effectively than rays coming from a moving camera, if only because it saves a lot of linear algebra, lol.
I love linear algebra
All tracing starts from the camera, even in case of bi-directional path tracing.
@@Megalomaniakaal Yes. But that's not a natural law. This was done as an optimization because it means you don't have to light millions of pixels that your camera can't even perceive.
That also means however that your scene is very camera dependent, and if your camera is dynamic, that makes caching stuff difficult. If you trace from the light source however you'll have neatly cacheable, persistent lighting values for all static objects that are in the scene, so you only have to dynamically trace dynamic objects.
This is basically how baked lighting works, but if we move far enough along it may become more economical to have a static tracer like this that caches values, to take load off the dynamic tracer.
In other words: a scene lighting tracer that does multiple bounces and a subject tracer that does a single ray with no bounces, unless it intersects with a dynamic object
@@insu_na You misunderstood my point. Even if you 'trace from the light source' you start from the camera first and then further trace from the hit lights before returning results to output. You don't just trace all lights in real time to cache their results. You only do that to lights that are relevant.
Now if we are talking about pre-baking caches pre-publishing, in an offline fashion, that's a different story, of course. But then we aren't really talking strictly about online/realtime rendering anymore.
@@Megalomaniakaal You're the one misunderstanding me here...
You don't start from the camera when you're tracing from the light source, because there is no camera. You cast rays from the light source in all directions and then note down every intersection with the objects in the scene.
That's not how it's currently done, because it's exceptionally expensive computationally, but it's also possible to cache the results of these rays regardless of where a camera is eventually placed in the scene. It wouldn't exactly be pre-baking, because pre-baking is done before you ship the game. This would be done while loading the game and then constantly while the game is active to improve the cached data.
Please understand that I ***EXPLICITLY*** said this would be a different way from the current way of doing this, in the very first comment.
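(For what it's worth, what this thread describes is close to classic photon mapping: rays leave the light, and the hits form a camera-independent cache. A minimal Python sketch of that idea; the `scene_intersect` helper is an assumed stand-in for whatever intersection routine an engine would provide.)

```python
import math
import random

def random_direction():
    """Uniformly sample a direction on the unit sphere."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def build_light_cache(light_pos, light_power, scene_intersect, n_rays=100_000):
    """Cast rays outward from a static light and record every hit.
    scene_intersect(origin, direction) -> hit point or None (assumed helper).
    The resulting cache is valid for ANY camera position, so it could be
    built at load time and refined in the background while the game runs,
    which is the scheme the thread above is speculating about."""
    cache = []
    for _ in range(n_rays):
        d = random_direction()
        hit = scene_intersect(light_pos, d)
        if hit is not None:
            cache.append((hit, light_power / n_rays))  # (point, energy)
    return cache
```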
It wasn't too long ago that the prospect of doing ANY raytracing seemed like it was 20 years away, then the RTX 20 series dropped in 2018, and six short years later we have a handful of titles that are doing almost completely path traced graphics.
PowerVR had UE4 raytracing demos running on mobile chips with RT cores YEARS before the RTX 20 series. The only reason it wasn't released to the market is that at the time no OEM was interested in licensing it, but the tech worked.
Yeah, at first they were just trying to get it to work. Now they are coming up with all kinds of optimization techniques. Soon enough they will find techniques that give 2-3x performance advantages, like Insomniac has, and eventually 10-15x. Caching and lower update rates at lower resolutions are going to be an important part.
It's still a bit too flawed. Too many compromises here and there that create distractions/artifacts. Still love it though; I've been on the RT/PT wagon since day one. I think they should prioritize eliminating distractions in DLSS and RT, because otherwise it defeats the purpose a bit, in my opinion. It's like having G-Sync but getting flicker and overshoot from it; at that point, why have G-Sync at all?
@@kazioo2 Crytek had ray tracing running on a Vega56 at 4K 30fps.
@kazioo2 sorry but it was vastly inferior and not ready for mass adoption.
I love how Alex explains in detail how much needs to be faked to run a game in real time, when people are losing their heads about "fake this" and "fake that" nowadays, unaware of how the sausage is made.
Oh, that's essentially real-time 3D rendering in a nutshell. "Fake" as much as possible, but at least we finally seem to be getting to that stage where the lighting can update in real-time as well, without needing to compromise on certain aspects of the lighting itself.
That’s usually just AMD shills spewing their copium
Yuuuuup.
@@MLWJ1993 Yep, and probably just a decade away from 3-bounce, fully global RT that isn't noisy as hell becoming the standard. I wonder what the next beacon on the graphical hill will be after that: accurate material & physics simulation that isn't 'faking' it the way raster 'fakes' lighting? AI swarms? I can't help feeling that each step closer to photorealism matters less and less for games, because there's basically no technological limitation to executing on most art styles now, unless we want to use AI to interpret 3D models as particular styles of brush painting or something.
It shouldn't have required this much explaining... Unless people thought that their monitors/TVs function as a portal into an actual real world.
In that case, telling them that it's *ALL smoke and mirrors* must be devastating. So silly...🤦🏽♂️
Alex really kept me listening and I didn't want it to stop
My head feels like it bounced 4 times after this lesson...
Indubitably
As Alex clearly stated in this video, this would not be an issue if you were a shark!
Technical term for 3 bounce lighting:
“Boo boo boooop” 😅👍
There is a command line parameter in higher UE5 versions that enables tracing against full nanite meshes: "r.RayTracing.Nanite.Mode 1". Works for ray tracing and path tracing.
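(If you want that cvar applied at startup rather than typed into the console every run, the usual place is ConsoleVariables.ini; a minimal example, assuming the Engine/Config location.)

```ini
; Engine/Config/ConsoleVariables.ini
[Startup]
r.RayTracing.Nanite.Mode=1
```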
To be able to condense all that info into a 20 min clip is extraordinary, well done Alex
I can listen to Alex talk about PC stuff all day, its so interesting!
I'm gonna pretend I understood everything this man has said
Northlight was one of the engines prepared for the future... that's Remedy, that's their legacy as a former demoscene group.
This is so cool. Path tracing for me is the biggest uplift in games now; it makes the image so much more believable. Cyberpunk, for example, really comes to life with PT: even if the environment and characters feel like a comic book, it fools me into believing it's more realistic.
Who really cares about that? They are just games, not reality.
@@BastianRosenmüller I care! So nice.
@@Fumes74 But is that really what makes games great? Slightly improved lighting? Games are meant to be played, not to stand there watching shadows and light bounce off surfaces. The best games out there have the worst graphics.
@@BastianRosenmüller One thing does not rule out the other. There are many games with less impressive graphics that are really good. But this tech bumps up the immersive feeling a good bit for me and many others. It's the most drastic (graphical) improvement in Cyberpunk 2077, for example; it's night and day. If you don't care about graphics, of course this does nothing for you. For me the story comes first, but graphics take second place. And it is not "slightly" better, it's a major difference. When it really picks up momentum and devs use this as their primary target, it will be fantastic. And wow, it must be nice not to be restricted to a small number of fixed lights in the scene, as a gfx artist I mean.
As a CG artist who primarily works in film and TV VFX, this was an excellent technical talk, giving me the distinctions between the different methods of light transport used in games. Like many in the VFX industry, I am primarily looking to speed up my offline renders using Unreal, and with that come compromises, of course. It's very interesting to hear it from the other side, with game devs wanting fewer compromises! As with many things in life, the truth lies somewhere in the middle, I think. But make no mistake about it: if I could have realtime with full offline path-tracing render quality, I would! Excellent talk, and it instantly earned a Subscribe from me!
The temporal lag is still spoiling my opinion of realtime raytracing. I saw it in Cyberpunk, with the shadows of footsteps slowly fading in and out of existence. Hogwarts Legacy had indoor rooms completely dark on the first frame, taking some time to fully saturate the scene with light. This feels way worse than the traditional hacky approaches.
Have you played since the latest RTX updates? The new denoiser almost eliminates the temporal slowness of lights and shadows that move or pop in with quick lighting changes. Haven't re-tested HW lately, but CP2077 on a 4070 Ti looks darn good these days at 1440p with path tracing, or screams at high fps with normal RT.
I have an issue with CP2077: when I turn on RT/PT I get a glitch where trees become black as I approach them, and it's even worse with Ray Reconstruction. Do you guys get the same problem?
Cyberpunk and HW both look fantastic at max settings on my 4070ti Super. When 2.0 dropped for Cyberpunk I noticed too much ghosting, but that seems to have been minimized after a few patches.
@@MDxGano It doesn't "almost eliminate" the light ghosting. Ray reconstruction improves the light responsiveness a lot, but there's still significant light ghosting compared to rasterized lighting. That's likely something that will slowly be improved over the years.
@@faultier1158 you mean fake lighting
Fascinating! pretty excited for Alex's UE5 video!
Path tracing is the way to go, I'm not denying it. How they can make this tech cheaper, and the chips to run it on, is the real question. You see, consoles run on a tight budget. As of now, only the RTX 4070 Super and above are meant to run it at 4K with upscaling.
4K is an absolute pipe dream, even for rasterized games; I don't understand how it has become an assumed standard. 4K is so many pixels, there will basically never be a reason not to use DLSS to reach that resolution.
@@lgolem09l the OP _did_ say "with upscaling" though, so... yeah.
@@lgolem09l you can thank techtubers for that; it has become the metric for optimisation as well. Most people watch one video of benchmarking and come away with '4080 can't do max settings at native 4K with ray tracing, game is unoptimised trash.'
Only facts!
@@grantusthighs9417 Tech tubers & influencers are shyt at doing real unbiased reviews, and that has now leaked into everything else. Benchmarking computers used to be about testing all the performance in different ways. Now it's all about which games are "the most played" that "have to be benchmarked" because the players want it. There's a reason why older reviews never did crap like that, and it becoming the standard has turned all reviews into pure dog shyt.
*SHRC RT?*
Does the Frostbite Engine use that? Or is it a secret?
Well, cool video anyway. Got me hooked.
Just when real-time raytracing is becoming viable the industry is starting to flirt with neural rendering that could make this decades old dream irrelevant.
@11:17 Wow, the version without RT has the correct reflection, as the TV is the Samsung S95D! 😂😂😂 You can see the reflection on the glossy TV screen.
The optimizations in AW2 for the BVH based on distance are reminiscent of LOD based on distance, only in this case it's the update frequency that changes instead of the geometry detail. It's amazing how many concessions developers have to make to simulate reality, which makes you wonder how good the hardware is that simulates our own world 😂😂😂
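That comparison can be made concrete. A hypothetical sketch of the idea (names and thresholds invented here; this is not Remedy's actual code): instead of swapping mesh detail by distance like classic LOD, you refit each instance's acceleration structure less often the farther it sits from the camera, trading update latency you can't see for frame time you can.

```cpp
#include <vector>

struct Instance {
    float distanceToCamera = 0.0f;
    int   framesSinceBvhUpdate = 0;
};

// Far objects tolerate a stale acceleration structure for longer.
int updateIntervalForDistance(float d) {
    if (d < 25.0f)  return 1;  // near: refit every frame
    if (d < 100.0f) return 4;  // mid: every 4th frame
    return 16;                 // far: every 16th frame
}

void scheduleBvhRefits(std::vector<Instance>& instances) {
    for (auto& inst : instances) {
        if (++inst.framesSinceBvhUpdate >= updateIntervalForDistance(inst.distanceToCamera)) {
            inst.framesSinceBvhUpdate = 0;
            // refitBvh(inst); // hypothetical call into the engine's RT backend
        }
    }
}
```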
I am fed up with the graphics of modern games; they're already at a decent level, and it still looks like a game anyway, not like the real world. I prefer some progress in gameplay, like AI NPCs/companions, advanced physics, deeper mechanics, etc.
Alex might want to read up on Reyes rendering, micro poly displacement and Catmull-Clark subdivision surfaces.
The realtime graphics industry is by and large just reinventing the wheel. Which is fine. But worth noting and keeping in mind.
I think we are a few hardware generations away, and getting close on the engine level, from real-time path tracing being a mainstream thing. The real implementation will come when the affordable 60- and 70-series cards can do it. But path tracing will be the last real leap in games, as we already have photorealistic textures and crazy realistic models and meshes.
nah it won't, the next thing will be physics
@@pliat we already have physics in games.
@@a1racer441 I’m talking about actually advanced physics, full building destruction, etc.
@@a1racer441 yeah like in starfield 🤣
Physics/ ai next
Love the show! Lifelong high-end gamer
I'll be excited for new technology when companies actually start to use their massive budgets to produce decent games instead of aiming for unplayable eye candy.
I think there's a big bottleneck: technology develops fast but cannot be thoroughly learned and implemented, because of a lack of training and educational programs for devs, who are just rushed to push the latest sponsored thing.
With all the politics, division and strife in the world, it’s always refreshing/calming, when a Digital Foundry vid is uploaded. Thank you guys for the shot of sanity.
Isn't he the guy that hates pretty women and wears a dress?
No?
Still want to see more games use RT Static. Having baked lighting of path tracing quality with no performance hit would be insane for VR and really any game.
all that GFX but ingame AI is still in the pre-2K era. Gameplay for braindead zombies as well. But it's RTX!!!!
Not just NPC/enemy AI, but things like physics and interactive animations, which have actually become worse compared to earlier generations. Look at the foliage and such in Ubisoft's latest Far Crys vs Far Cry 2. Ubisoft is not alone in this, but it's one of the most egregious.
@@NatrajChaturvedi yes. Nowadays it's basically ALL looks, because graphics sell. It's like boobs: we get boobs but no brains. Great. For someone who started gaming in the pre-PC era, it's a big letdown. I care about gameplay and still dream of a game where the resources would be directed at deep gameplay, including AI, physics, world, sound (!) and not just GFX... When it's GFX-focused, it's like being in a beautiful place where you can't go anywhere and can't touch anything. Or like living with a brain-dead Barbie doll that looks fantastic... both suck
@@HybOj Stop playing AAA games and start playing indie games. There are a lot of games where those are the focus.
@@ArchOfficial I don't play AAA much; I haven't played much of anything in the last year. I only enjoyed the Tomb Raider remake :X
Wouldn't it be straightforward to solve for shadows using explicit path tracing without a whole lot more compute, since you are still doing the first direct-illumination step with BRDF sampling + light sampling using multiple importance sampling in real time? All that's left is a fast algorithm to cull occluded shadows (the irony) to stop shadows from flickering. Curious tbh. I love that Alex does these deep dives into rendering from an industry use case in games. I dabbled a lot in physically based rendering for a potential PhD thesis in ML but lost interest eventually; Alex makes me wanna change that lol. Wish I had time to go through the full GTC and GDC videos.
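For anyone following along, the MIS mentioned above boils down to one small function. A minimal sketch with toy numbers (assumed, not from any engine): each strategy's sample is weighted by the balance heuristic, which is why a small bright light stops being a noise bomb once light sampling is in the mix.

```cpp
#include <cstdio>

// Balance heuristic: weight for a sample drawn from strategy A when
// strategy B could also have produced the same direction.
float balanceHeuristic(float pdfA, float pdfB) {
    return pdfA / (pdfA + pdfB);
}

int main() {
    // Toy numbers for one direction toward a small bright light:
    float pdfLight = 4.0f;  // light sampling concentrates its pdf on the light
    float pdfBrdf  = 0.1f;  // a rough BRDF rarely samples that exact direction
    // The light-sampled estimate gets almost all the weight here, so the
    // high-variance BRDF strategy can't blow up the estimator.
    std::printf("light-sample weight: %.3f\n", balanceHeuristic(pdfLight, pdfBrdf));
    std::printf("brdf-sample weight:  %.3f\n", balanceHeuristic(pdfBrdf, pdfLight));
}
```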
OK, this is actually good. Really good, big ups. Yes, that's the way it's gonna be going now: brains over power. Look at Spider-Man 2's RT reflections; the reason that works is because they used a lot of cheeky tricks like that (sampling, using the temporal data correctly, inheritance). This is probably the only comment section on DF that's slightly technical, with someone who knows what they're talking about, nice change. We had the CPU, then the GPU, and now we have these new units for which we haven't yet figured out the equivalent of what deferred rendering was for raster. I've only ever worked with RT, and not in a pure path-traced environment, so I can just say yeah, I think you're right.
Loving these videos. Keep 'em coming!
Can that GDC deep dive about Alan Wake be streamed somewhere?
Might pop up on the GDC YouTube channel in a couple of months; otherwise it's stuck in the GDC "Vault", which requires a paid subscription
I just want us to reach a point where we eliminate loading screens. Increase the amount of particles/effects/geometry in our games. Basically, increase IMMERSION. WE NEED IMMERSION NOW.
Gaming is genuinely getting boring and repetitive
The RTX 50 series launch will probably get more games to start adding it.
Maybe, but the unfortunate reality is all gaming will still be held back by the pathetic specs of this generation of consoles. Imagine where gaming would be if consoles didn’t exist and everyone had capable gaming PCs instead…
Doubt it. Especially with the prices ngreedia will ask
@@mattspeer01 In this world where the majority of PC users have something similar to a PS5/SEX or even worse? Keep imagining, buddy.
@@mattspeer01"Capable gaming PCs" would just have had last gen components that held you back anyway. The gtx 1650 is still the second most popular gpu on steam, 1060 3rd. Remember when Alan Wake II launched and people complained their 7 year old GPU's ran terribly. You think the people buying cheap consoles would instead now buy 50 series entry-level GPUs worth as much as their consoles?
I suspect that the industry transition to ARM from x86 will be the harbinger of universal RTX which is one among a number of reasons Nvidia wanted to acquire them. Nvidia's new AI chip uses some ARM architecture and they're supposedly going to be making ARM CPUs for Microsoft products. The chiplet bid that Intel and AMD are going for is simply prolonging the inevitable. Future low end gaming rigs will have an ARM CPU and some motherboard coprocessors (like an NPU) and high end will be an ARM CPU with a discrete GPU.
The bandgap breakthrough with graphene will likely be implemented and iterated on quickly, but any advantage it affords x86 can be applied to ARM with greater scalability and efficiency.
My forte is analog electronics so take this with a heaping helping of salt.
The direction in which real time graphics will evolve is set... But I don't see the necessary hardware to actually run all that sufficiently well become mainstream this decade with the budget console hardware always being the common denominator.
Devs used to come up with all kinds of cool workarounds to get games running on console, and none of them do that anymore. I knew there was a better solution for getting high-quality ray tracing in a game than just pure GPU performance.
The current well-done RT implementations are pretty far from just using "pure GPU performance". There's been tons of research work to optimize these techniques to even work in realtime at all.
Immortals of Aveum has insane stuttering
I never noticed the Bond villain scar on Alex's face.
Good thing he’s handsome. If anything it makes him look more interesting.
I wonder what happened to him though
I got a bit lost there sometimes whether at any given point "fully path traced" meant just the lighting or if it meant "replaces the rasterizer".
It doesn't matter. We are talking about determining which point of which object you see in a specific pixel. We can do either rasterizing or ray tracing for that, and it's not where the hard work is. The hard work is determining what color that pixel should show.
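To illustrate how cheap the visibility half is, here's a complete toy example (a made-up one-sphere scene, nothing from the video): one camera ray per pixel answers "what do I see", and the entire hard part, shading, is faked with a single character.

```cpp
#include <cmath>
#include <cstdio>

// Visibility only: does a camera ray hit the one sphere in this toy scene?
struct Vec { float x, y, z; };

bool hitSphere(Vec origin, Vec dir, Vec center, float radius) {
    Vec oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
    float b = oc.x * dir.x + oc.y * dir.y + oc.z * dir.z;
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - radius * radius;
    return b * b - c >= 0.0f; // discriminant of the ray/sphere quadratic
}

int main() {
    const int w = 40, h = 20;
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            // Camera at the origin looking down -z; one primary ray per pixel.
            Vec d = { (x - w / 2) / float(w), (y - h / 2) / float(h), -1.0f };
            float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
            d = { d.x / len, d.y / len, d.z / len };
            bool hit = hitSphere({0, 0, 0}, d, {0, 0, -3}, 0.8f);
            std::putchar(hit ? '#' : '.'); // "shading" is trivially faked here
        }
        std::putchar('\n');
    }
}
```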
@@disconnection9502 HDR is a much better "wow factor" than Raytracing honestly.
That was one word before 11:56
I don't think RT is cost-effective now. It will be when medium-range hardware can run RT and get 120 fps. Until then, I prefer to have 120 or 144 fps rather than RT.
fascinating stuff, but let's be honest, we're not going to see many fully fledged games including all this stuff for... a very, very long time.
Don't wreck your voice, Alex. Take a break from speaking to let it heal. A few silent episodes aren't worth risking your voice for good.
Then treat yourself to some sparkling water first
Path tracing is the first time turning on RT is actually worth it, even though the perf hit is brutal, because the ReSTIR algorithm is simply so much more efficient than naive ray tracing.
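The core trick of ReSTIR is small enough to sketch (weighted reservoir sampling only; the spatial/temporal reuse that gives ReSTIR its name is left as comments, and all numbers are made up): a pixel can stream through dozens of candidate lights while storing just one survivor plus a running weight sum.

```cpp
#include <cstdio>
#include <random>

// Weighted reservoir sampling: consider many candidates, keep ONE.
// In ReSTIR, whole reservoirs can then be merged with a neighbor's or with
// last frame's, which is where the efficiency win over naive RT comes from.
struct Reservoir {
    int   lightIndex = -1;   // the single surviving candidate
    float weightSum  = 0.0f; // sum of all candidate weights seen so far
    int   count      = 0;    // how many candidates were streamed through

    void update(int candidate, float weight, std::mt19937& rng) {
        weightSum += weight;
        ++count;
        std::uniform_real_distribution<float> u(0.0f, 1.0f);
        if (u(rng) < weight / weightSum)  // keep candidate with prob w / W
            lightIndex = candidate;
    }
};

int main() {
    std::mt19937 rng(42);
    Reservoir r;
    // Stream 32 candidate lights; weight = how much light i would contribute
    // to this pixel (made-up numbers; light 7 dominates).
    for (int i = 0; i < 32; ++i)
        r.update(i, (i == 7) ? 10.0f : 0.1f, rng);
    std::printf("survivor: light %d (from %d candidates)\n", r.lightIndex, r.count);
}
```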
Question: Would running a RAM disk on DDR4 improve UE5 shader-cache problems? DDR4 is getting a bit cheaper, so it's more feasible.
It won't, because it's all supposed to go to the VRAM on the card.
Direct Storage
Question here: I tried to run RTX Remix games (NFS Underground, Max Payne). I copied the GitHub RTX Remix files and the game mod files to the game directory, but when I run the game exe it crashes and won't start. I have a 7800 XT and a 7800X3D.
Are you on the latest drivers? They broke RTX Remix mods except Portal. It also needs Linux drivers to run well on AMD hardware (if you run these games, a 7800 XT gets like 10 FPS on Windows).
Don't you need RTX card for this tho? Lol
@@kajmak64bit76 only for using the kit to mod games. The mods themselves can run on any hardware albeit terribly on non-RT cards.
@@grantusthighs9417 dam
Is it really hard to get both path and ray tracing in, versus only getting one? 60 fps should already be the next-gen norm, but in a lot of cases it isn't implemented until the following patches. I think having the full set of reflection, ray and path tracing would give the full gaming experience, whether on PC or console.
Personally, I think that raytracing has taken the industry backwards. There was such a push for 4K, when 4K televisions came out. Which made graphics that much sharper and even games on the Xbox One X were pulling off 4K/60. Then, raytracing was introduced and became bigger… And graphics cards and consoles could not handle it, so resolution started slipping backwards to the point now where things are just blurry. So what’s the point of high resolution textures, if you’re just going to end up with lower resolutions to display them?
It's like two steps forward and one step back. Games like Metro: Exodus are pulling off 4K/60 with raytraced bounce lighting and shadows with ease… yet no other developer can do it? It makes no sense. I'm tired of seeing games at lower resolutions running with raytracing that seemingly nobody can do except the developers of Metro, looking as if I am seeing them through fogged-up goggles. Simply because they want to add the newest buzzword in the industry.
If these features can't be done with modern consoles and graphics cards, then they shouldn't be attempted. The thing is, once again, the developers of Metro proved that it can be done efficiently… which makes me wonder what is going on with the rest of the industry. It makes me believe that developers are just being stretched too thin, and developing for all of these PCs and consoles is bringing everybody down. Since the consoles are essentially PCs now… wouldn't it make the most sense to develop for the highest-end PC and then scale back for consoles? That way everybody gets the best of everything, depending on their platform?
I understand that there's some custom hardware in the consoles… But by now, developers should be able to code to the metal. It's been 3 1/2 years and we still have not seen 50% of what these systems can do, which makes me believe that this generation was just a wash. I understand it started under extenuating circumstances… but it seems like many developers are just doing the bare minimum to get these games up and running, which is proven by almost all of them needing patch after patch after launch.
But this is what happens when you cater more to your Investors more than you do the gamers playing your games. We are essentially at the spot where we were when the video game crash happened back in the 80s. Too many games, which are overpopulating the landscape and making it tough for any one developer to break through. you can’t trust what any developer says about the games, because there is so much coming out that is broken on release. So many promises that fall through. Every once in a while, we get a great game and then it is followed up by 100 that aren’t.
I think developers need to start focusing on what's most important: gameplay, story and sharp visuals that draw us into the game. Not pushing technology that the hardware isn't ready for. I love raytraced bounce lighting. Lighting is one of the most important aspects of making graphics look lifelike. Reflections are nice, but many times they are overdone. Shadows are just as important, as they make things pop. But it's very obvious that only the highest-end graphics cards can achieve any of this, and even those can only do it at 1440p with decent framerates. So, next gen, maybe?
I understand your point but I disagree
"If these features can't be done with modern graphics cards and consoles, they shouldn't be done"
That's not how we got here...
If people had this attitude towards 3d rasterization in the 90's and 00's, gaming would look very different today, likely not in a good way
"wouldn’t it make the most sense to develop for the highest end PC and then scale back for consoles? That way everybody gets the best of everything, depending on their platform?"
It is largely a CPU bottleneck for the lower-end systems and consoles, due to the apparent difficulty in keeping CPU requirements in check, as we see in a lot of the newer releases. You develop for the hardware most people are going to have, thus servicing the most customers you can. I'm fairly familiar with Deep Silver and their development history and experience making Metro Exodus Enhanced. They had to tone down the real-time lighting, but were able to do it efficiently by basically baking everything else as accurately as they could, so that turning on the extra RT settings wasn't so impactful to visuals and performance, in order to make it work on consoles.
Also, food for thought: current-gen consoles are not running Metro Exodus at 4K, full stop; they run mostly around 1800p, which is about 87% scale. A 4070 Ti struggles to maintain a 60 fps target at 4K with a 7800X3D. The game isn't as sunshine-and-rainbows as you think in terms of performance; they just handled their console launch very well, as they always have, and were able to offer a 60 fps update two years after launch, not publishing much if anything else in the meantime. The PC port was pretty rough at launch in terms of graphics settings sticking or downright not applying as well. Nothing too gamebreaking, but hardly faultless. They also are not very diligent in updating to the newest versions of DLSS, but very few developers are.
@@crestofhonor2349 I respect you for your reply. So often, people can’t agree to disagree. They just go on a rampage because people don’t think like they do. I understand it from both aspects. I’ve been playing games since the 2600... but, I think we are at a crossroads in graphics right now. I understand, pushing things forward, but it doesn’t do gamers any good if the games are released broken or with bugs, simply because developers wanted to push the latest features. I think when 4K gaming became a huge thing, which was obviously just a few years ago… that sharp and crisp quality to the visuals was amazing. Especially with the games that ran at 60 FPS on top of it.
We only had a few years of that before the new consoles, and a lot of games now are kicked back to 1440p to include the newest graphics features. Upscaling, at least on consoles right now, just isn't a great viable option, because it causes too many artifacts: one of them being blurring, the other being ghosting on moving characters. On PC it's a different story, especially with RTX, because its upscaling is more advanced.
So we will see. I always keep my mind open and I really want to see things advance forward. Especially with physics. Because I think that could mean a lot to gaming if it’s used for gameplay. Essentially giving players and NPCs the ability to use their environment to their advantage and give a multitude of ways to dispatch those enemies and them, you. Not to mention how much fun it would be in multiplayer to be able to shoot the corner of the top of a building and have that rubble fall into your opponents. I’m remaining Optimistic and hoping the next generation will be an incredible leap, as Microsoft says. I want to see games out of the gate taking advantage. Something we really didn’t get this generation.
Something we still really haven't seen yet. Which is why I'm concerned. Because we are 3 1/2 years into the generation and there's already talk of Microsoft releasing a new system in two years, when the current system has barely been tapped for its power. It doesn't seem like developers are taking advantage of the features of the hardware in order to get the best out of it. And I understand, they have a lot of systems to code for and you can only do so much without destroying your budget. But we haven't even seen much in the way of greatness from the first-party developers. Like I said in my original post, Metro: Exodus proved that raytracing could be done, essentially full scene with bounce lighting… at 4K/60.
How they got there… who knows? I'm sure there's some upscaling of some sort, but you would never know… simply because it still looks incredibly sharp and the lighting is just gorgeous. Which I think is one of the most important aspects of obtaining realistic graphics. Wish they would share their secrets with the rest of the industry, because I think a lot of developers could use them right now. In any case, thank you for the reply. I wish more people in the world could discuss why they disagree and come to an understanding of each other's opinions, instead of simply going off the deep end and making it seem like the entire world is coming down around them because somebody doesn't agree with them. Have a great day.
👍👍👍
Unfortunately I am not really excited about this, despite having a new PC (RTX 4070 Ti). There are a couple of issues that plague modern graphics, and for me personally the worst is TAA. This is more than a matter of preference for me: I have epilepsy, and the motion blur that TAA introduces makes me nauseous, so I can't play a game that has forced TAA for more than 30 minutes before I feel sick.
Also, I don't like blurry visuals in general and would prefer the tradeoff with blocky-looking edges, but that is just my preference and not an accessibility problem.
Another thing that needs more attention is shader-compilation stutters and unoptimized games on PC in general. I paid almost 2000€ for my PC and there are new games that will still stutter regardless of the settings. I notice this mostly as micro stutters, not the classical fps drops; it feels more like a lag or freeze.
It feels like hardware manufacturers keep making hardware to keep up with more and more ridiculously demanding features like real-time path tracing and other nonsense, instead of the other way around, where developers work within the limitations of the existing hardware and come up with creative solutions.
If a game can't run properly on my brand-new high-end 2000€ PC, then I don't care how good it looks (and I don't even think it looks good, because of TAA blurring the whole image); it has bad graphics, period. Because for me "graphics" are the technical foundation of a videogame, the end result of what is rendered for you on the screen. And if the result is something that stutters and that I can't use properly, then it's just bad.
Developers need to focus on high performance first, take more care about people who suffer from motion blur, and give us more graphics options and ways to customize our experience. Because this is what makes PC gaming better than console gaming: the ability to fine-tune your game's visuals to your liking, without being stuck with what the developer thinks looks best.
Most of these new features are just marketing gimmicks that hardware manufacturers like Nvidia want you to believe you need. Videogame publishers gladly advertise with RTX and path tracing because they look good in screenshots, but they are getting played by the hardware manufacturers, in my opinion.
Just take a look at the best-selling games of all time, the most-played games on Steam, and the most successful gaming consoles and hardware that ever existed. Do you know what they all have in common? Accessibility. This is the key to success that I can't stress enough.
All the best-selling consoles of all time, like the Game Boy, PS2, Switch, DS etc., had a low cost compared to their competition, and also a lower cost of operation (there is a video on YT about the original Game Boy that used AA batteries, and how it only cost 17 cents per hour to run compared to 2.50€ for its competition). Hardware power or graphics alone never sold games or hardware!
PS2 = weaker than the original Xbox but sold better
Nintendo DS = weaker than the PSP but sold better
Nintendo Switch = weaker than current- and last-gen consoles but sold better
The first Game Boy didn't even have a backlit screen and was monochrome, at a time when the competing handhelds had amazing graphics, bright screens and of course colors.
But again, guess what sold better?
Same with software:
Why are mobile games such successful money-printing machines? Well, apart from microtransactions, everyone can run them on their low-end devices.
Why are free-to-play games so insanely successful? Because they have well-optimized graphics and scalability, so most potential customers can run them on their devices.
Why are old games still so successful? Because apart from nostalgia, many people with lower-end hardware can just install and play them without worrying about hardware restrictions.
For me graphics are really important; they draw you into a world and its atmosphere. But graphics are so much more than just texture resolution or raytracing!
Image clarity, performance, art style and accessibility options are much more important than just having the most advanced technical features in your game.
Games, just like movies, are a form of art that has to work with limited resources (be it computational or budget limits), and I find it way more impressive how developers made game worlds come to life with clever 2D prerendered graphics, animations, attention to detail, and using every single bit of performance available.
Unfortunately, modern devs don't seem to care about this, and we are left with broken, stuttering and bugged releases that are overpriced and overbloated.
A game doesn't need 1000+ developers working on it to make it good, and neither does it need a budget of 200 million dollars.
TAA is here for a reason, mostly because MSAA doesn't work well at all in modern games. DLAA could improve things, as it often offers a sharper picture and less ghosting than TAA. But even then, TAA needs to be tuned more, because stuff like the default TAA present in UE4 and UE5 is bad. I do agree power has never won a generation, but that's also because you have to factor in cost. Expensive consoles always sell in low numbers. I completely understand why the PS2, Wii, Switch, DS, and GB sold so many units: those consoles were cheaper to get, allowing a wider range of people to purchase them, just as you said. People like a low cost of entry because it means more people can buy the thing. Developers don't really need to aim for anything but stable performance. Whether they want high or low performance will change on a per-game basis. Stable performance is more important than high performance: given the choice between a stuttery 60fps and a stable 30fps, I'd take the stable 30fps.
Devs do absolutely care. I think you're mixing it up with the publishers who knowingly ship a game unfinished. The devs just do what they can to make the game in the allotted time given
👌👌👌👌👌
@@crestofhonor2349 SMAA does, when properly done. There is overuse of TAA because it barely impacts performance, since it's post-processing anyway, but it will always have problems. SMAA & MSAA are not done in post-processing.
@@kevinerbs2778 Many effects rely on the use of TAA, and SMAA and MSAA would leave many things in games untouched by anti-aliasing. I've seen this happen already. TAA can and does look fine with the right parameters.
@@crestofhonor2349 Go watch the video on TAA vs SMAA for Crossout here on YouTube. What you said is wrong. SMAA does not miss anything on screen; it's superior to both, since it doesn't just hit edges. It is, however, performance heavy (25-40%), since it applies to everything on screen in proper real-time rendering: it literally applies to each pixel and is sampled at a higher resolution than the screen's. TAA is only being used because its performance hit is minimal, like 3-10%, that's it. I literally did research on TAA, and its other problems were already known since it's post-processed. It's actually terrible for most AA uses other than transparent textures.
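To ground the back-and-forth above, here's a generic textbook-style sketch of a TAA resolve for one pixel (not any specific engine's implementation, and the blend weight is assumed): the history blend is where ghosting comes from, and neighborhood clamping is the standard mitigation that purely spatial techniques like SMAA simply don't need.

```cpp
#include <algorithm>
#include <cstdio>

// Resolve one pixel: blend reprojected history toward the current frame.
// neighborhoodMin/Max come from the current frame's 3x3 neighborhood.
float taaResolve(float current, float history,
                 float neighborhoodMin, float neighborhoodMax) {
    // Clamp rejects stale history that no longer resembles the scene;
    // without it, a bright object leaving a pixel smears (ghosts) for frames.
    history = std::clamp(history, neighborhoodMin, neighborhoodMax);
    const float alpha = 0.1f;                     // typical-ish blend weight (assumed)
    return history + alpha * (current - history); // exponential history blend
}

int main() {
    // A light just turned on: current = 1, stale history = 0.
    // Clamping to the bright neighborhood [0.8, 1.0] recovers most of the
    // change in one frame instead of dragging it out over dozens.
    std::printf("clamped:   %.2f\n", taaResolve(1.0f, 0.0f, 0.8f, 1.0f)); // 0.82
    std::printf("unclamped: %.2f\n", taaResolve(1.0f, 0.0f, 0.0f, 1.0f)); // 0.10
}
```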
PC looks more accurate, but for some reason the PS5 quality mode looks better to the eyes. I'm specifically referring to the comparison in the video with Alan Wake 2. Many of the stills look better on the console. Inexcusable for such expensive hardware.
Yeah, because your fanboyism is blinding you. Path tracing, which is used on the PC, brings an improvement to the graphics, especially the reflections, if you observe carefully. And if you prefer console graphics (optimised raster settings), you can run PS5 graphics settings on the PC as well, and the PC will push a much higher frame rate and resolution.
@@dhaumya23gango75 That's not fanboyism, that's the truth. I'm playing the game after work to relax. If I have to CONCENTRATE to see the difference from 4K-plus hardware, then it's a moot point. The graphics are just too close.
@@bigde131 They are close because that's what DF told you. The reality is different. This channel is Sony-sponsored. They would compare a little 2019 Zen 2 APU in a $500 box with a high-end 2024 PC CPU and GPU, and set up the PC anti-aliasing levels to "make believe"... you don't even know if their pictures really come from a PlayStation. If you are smart enough, you don't trust that BS. If they were honest, they would have compared this 2019 APU with a 2017 mid-range PC. The PS3 is not native Full HD, the PS4 and PS5 are not native 4K... the PS4 Pro is the first Full HD console. PSSR for the PS5 Pro will bring hybrid 1080/1440p output, automatically scalable. Note that you are stuck with an old Zen 2 (something like an i5-8500 + GTX 1660 or RTX 2050) while AMD is about to release Zen 5... 😉 Note that the DCS gaming-PC recommendations for low PC graphics at 30 fps are the same as for PS (GTX 1660/1070).
@@exotikification I play on a 120Hz Sony Bravia A80, 55 inch. The picture quality is so good that literally EVERY game I play looks beautiful, it's unreal, from Overwatch 2 to Stellar Blade. With balanced mode or resolution mode enabled, it's like looking at a dream. All the while sprawled on my bed smoking a joint. Sorry, but PC performance just doesn't justify the price point. My best friend has a PC, yet he comes over to my place to game all the time 🤣. What does that tell you?
@@bigde131 ...that 2019 console Zen 2 graphics have been the same since the One/PS4, that it's enough, and that consoles have historically been designed to play on a TV for as long as little console systems have existed :) Personally, I play on an Acer SpatialLabs display (a TV is has-been in my case, I enrich your culture by the way lol) or a 3D Vision videoprojector; this is stellar! Better than smoking. 😂 And yes, at 80 dollars for a AAA game and 80 dollars for a controller, on a console with low upscaled graphics, playing on console is not cheap anymore.
That's great and all, but beautiful visuals hit their absolute peak in the 7th gen, with a mix of tech and artistry.
This technology is impressive, but not really necessary to enjoy a game.
Ooh? Yes please.
When will ray tracing be playable for peeps who have budget cards? I have a 6700 XT and even I can turn on some of the effects, like reflections in Control, and still get over 60 FPS.
I think the solution to solving RT with nanite is that it will all just be replaced with some neural rendering tech which isn’t bound by the same issues as current triangle + RT methods.
Oh hey, where in the heck is RTX Racer!!?
Exactly. It was supposed to come out in 2022, and they just never released it.
@@computron5824 I hate it when they do this
Nanite aged like milk.
Dachsjaeger's scar is fucking badass
Good for PC gamers with a very big budget. Maybe in 10 years for the majority of PC gamers.
Console gamers won't see this even in a possible new generation.
Babe wake up its Alex time
For us thrifty console users, how about a comparison of last-gen-compatible engines, like CDPR's RED Engine vs. the Elden Ring engine? These beautiful engines didn't have to be left behind just because their potential was never fully used. It would be great for those of us less enthused by UE5 and more appreciative of matured bespoke engines like ER's and CDPR's. Would love that, as a not-as-advanced, thrifty console gamer.
The engine in Cyberpunk 2077 was hell to work with, because it was built under crunch on a dated baseline. Because of that, they were not able to implement a refined multiplayer mode...
Elden Ring is also on a really old and clunky engine.
id Software is going to share the engine tools that were used to make the last Doom games.
Also, Source 2 from Valve is a few years away from being ready to share. For now, only S&box is using it.
Why don't devs take all the CPU cores, or one of the basically unused cores, for ray tracing alone? That should be enough power to calculate the lighting without any performance hit. Work smarter, not harder, is what everyone always says: offload the ray tracing to the CPU and have the GPU do all the other graphics processes. Or maybe dedicate 1 GB of video memory just for lighting effects.
Software ray tracing is very slow compared to doing it on the GPU. Additionally the CPU already has to deal with constructing the BVH for tracing.
1. CPU cores aren't "basically unused", they're being hammered just as much as the GPU is the entire time the game is running. Whether it be gameplay logic, NPC simulation or recording and submitting rendering commands, the CPU is doing a *lot* of work to the point where often engines are architected specifically around managing all of this work (look up things like entity component architecture, task scheduling, multithreaded rendering, etc). Putting more work onto the CPU will not help.
2. The CPU is "too far away" from the GPU in systems with discrete graphics (ie your average gaming PC) to be of any use here. There's a *lot* of latency/delay present in sending data or commands between the CPU and GPU, and said latency gets *far* worse when you try to send data from the GPU back to the CPU because the GPU is behind the CPU in 99% of cases (the ideal case for modern games is that while the CPU is busy simulating frame N, the GPU is busy rendering frame N-1). Best case scenario where the CPU is fully responsible for lighting the frame, the GPU would have to send the in-progress frame back to the CPU (latency), stop while the CPU lights the frame (latency), then wait for the CPU to finish transferring the lit frame back to the GPU (latency). Worst case scenario where the CPU is responsible only for checking if a ray is intersecting any objects and traversing the acceleration structure (this is what RT cores do on the GPU), the GPU would have to send the trace query back to the CPU (latency), stop while the CPU traces the ray (latency), wait for the CPU to finish transferring the intersection state back to the GPU (latency), then repeat the process for however many trace queries the GPU needs to offload to the CPU. Performance would be *abysmal* under this scheme, so this is a Very Bad Idea (TM).
3. "Dedicate 1gig of video game" doesn't make any sense. Data alone is meaningless, there has to be some code to dictate what to do with that data and this code is what contributes to poor performance in 99% of cases (the exception is when you put too much data into RAM/VRAM and either cause cache misses on the CPU/GPU, or cause data to spill from RAM/VRAM).
@@jcm2606 None of what you said here is true; if it were, that would mean drivers are completely pointless.
@@kevinerbs2778 No...? Why would they be pointless?
CataFeral's intense stare is because he was in the middle of a chess match against ChatGPT. He was trying his hardest to give himself a handicap so ChatGPT had a chance. In the end, he won. His record, 312-0. A living legend!
That's Thomas Morgan to you, mortal.
Console gets left in the dust 😔
Next gen consoles will do path tracing just fine. Meanwhile you can be one of the handful of PC gamers that can experience path tracing if you have a spare $3000 available.
@@Coxy-b34 They will not. Please don't try to claim a 500-dollar console will do real-time path tracing when the current gen can't do ray tracing at any reasonable resolution. We are several generations away from that.
@@a1racer441 I sense your PC elitist rage but history suggests a next-gen console will have a GPU at least on par with a 4080 and thus will be capable of pathtracing.
@@Coxy-b34 Yet the current consoles can barely hold 60fps in just a few games.
@@jbscotchman Am I talking about next gen consoles or not?
What are the chances Nvidia starts offering increased memory on their new cards so that there's enough room for this caching?
what about the hardware that can run it?
do I still need to sell a kidney to use it? YES, so it's useless
It'll probably depend on the game, like usual, and on how much geometry there is and how many rays are sent into the scene
@@crestofhonor2349 if it depends, then I'm not interested
my dude, a 4070 plays Alan Wake 2 at 30 fps with medium ray tracing!!! It's a technology that we are not ready for
when a $300 GPU can run a ray-traced game at 60 fps, then I'll say it's worth it
I'm curious to see how AMD and Nvidia will segment their GPUs now. I'm sure they're going to rip off customers even more, just using RT as an excuse to do so.
ah yes, we're now where raster was back in the Doom days. It's gonna get good, but it just ain't it yet
Best result I got was turning off all ray tracing in Cyberpunk. SSR on ultra. The game looks sumptuous. Won’t go back.
So, good news: Alex became Nicolas Cage in Con Air.
Path tracing getting better while video games get worse
Why do you guys, the consistent viewers, still listen to this MARKETING MAGAZINE!!!! RTX sucks performance-wise and will never replace typical game development. Wasted processing, with ALWAYS limited processing power.
I only saw this video title because YouTube auto-nexted this damn channel's video in my feed. I DON'T WATCH THESE GUYS BECAUSE THEY DON'T REPORT THINGS HONESTLY. It's all an advertisement. Do they do new flagship phone content? Because that's mainstream tech influencing; I'm sure they're looking into it.
Would it be possible to use frame gen only on reflections, to make them run at full speed?
No
Yes
Nope, that's not how frame gen works. It can only take information from the final rendered scene, not from one component it's being run on
I don't think there's a technical reason why it couldn't be used on a specific (part of the) scene, e.g. the reflection.
That would've been possible if we went back to multi-card setups. Tile-based rendering would actually be useful for that.
Aaahhhm 40 aaahhhmm fps aahhhhmm for aaahhhhm 400 aahhhhm dollar aahhhhhmm that's aahhhhmmm today's aaaahhhhmm gaming aaahhhhmmmm huhuhuhu 😭 aaahhhmm
Special ed
We need to go back to the old ways of gaming, when games were made with spit and gristle or spristle as I call it 😎
That isn't fixing anything, it's just extending development time even longer
When you got 20 frames a second, you were happy that you got that.
I couldn't care less when games are releasing in such a shoddy state on PC. Ray tracing is a buzz word for sales, even if it is impressive tech.
They talked about this seeming overhaul of RT Overdrive, yet literally every lighting bug from the old system still exists (plus more popping up)? Very odd. It really always seems that every Nvidia project is just a shell or partially completed work that piggybacks off marketing promises that never come to fruition, then gets abandoned and wastes away.
Scarface. ??
Regular RT is not that noticeable, but Path tracing is a difference maker!
a shill and his dreams.
Will we _FINALLY_ have full Path Tracing with the RTX 5090 @ 200 fps+!!! 4K!! Max setting!!
Very possible
Alex “ I’m scared of women breast” is at it again.
?
@@blubblurbThe commenter is having a fit because Alex commented years ago on some forum how he didn't like the design of the MC from Stellar Blade.
@@ZalvaTionZ Thanks for the clarification.
Everybody in the public space puts their foot in their mouth at some point and licks their toenails. I don't care what he said about the Eve character's body being problematic, because that was just an opinion he gave in passing. Alex is talking in depth about the technical aspects and challenges of some amazing things in this video.
Thanks to Xbox we have stuff like path tracing and only Xbox can handle true path tracing
There isn't a single game on consoles that do path tracing. If the Xbox could do path tracing then the PS5 can do path tracing
The Xbox is only a little bit faster than the PS5.
@@blubblurb Xbox is faster than any other console or any pc. Only you console warriors think pc or ps5 is more powerful than Xbox.
@@jcdenton41100 You are just trolling, right? There are rarely topics where one can say there is only one correct answer, but in this case there is: PCs are faster than the Xbox. The Xbox is about 20% faster than the PS5.
@@blubblurb Consoles warriors are annoying just accept that no other console or pc can outperform Xbox instead of lying.
Could you repeat that part about raytracing?
Raytracing for me is not such a big deal yet. Even now, in actually demanding RT titles, you need a 4090 to do it properly without upscaling, and even then at 1440p. Personally, that makes it not worth it.
When you can path trace at 1440p with a $500 graphics card at 90-100 fps without upscaling, then I'll feel it's worth it. Eventually it will get to that point, but until then... I honestly don't care. Raster is more important. And let's be honest, even visuals without RT are getting better and better, which is why I went for a 6950 XT instead of a 4070 Ti (which was also more expensive; screw the greediness of Nvidia). Raster is still king.
You keep saying "without upscaling"; why is DLSS such a big deal for you? DLSS looks phenomenal these days; in some cases it rivals what you get with native rasterization. This seems like an "I can't afford it so it's not worth it" comment, tbh
Raster my rash
@@iamgates7679 summed up perfectly lmao.
@@iamgates7679 I literally play on AMD, if you'd read my message; I don't even use DLSS. I keep saying "without upscaling" because you want the base power of the card to be good. If I said "with upscaling", it would mean the base perf of the card is pretty bad if it needs upscaling to perform decently at 1440p, as you're not even really playing at 1440p then. Makes sense, no? People like you make no sense to me. It's almost like you want them to give you a bad product and go "but with this AI upscaling you now get better perf!!! Unless it's not in the game you play, then sucks for you. Or the game doesn't have the upscaler you support, sucks for you." Like, are you seriously that blind?
Also, an "I can't afford it so it's not worth it" comment? Really? That just sounds like pure stupidity to me. Of course I could buy a stupid 4090; that doesn't mean I will buy one. No one should support such outrageous price increases as companies like Nvidia have been making. And of course path tracing will be worth it in the future, or are you incapable of reading? I literally said "yet."
@@Pand0rasAct0r_ literally everything about this statement tells me you have no idea wtf you are talking about. I said DLSS but , could’ve said FSR as well. Go be dumb somewhere else :)
it's Alex the gay dude who fears attractive women!
As impressive as this all is, it's still just diminishing returns at this point.
As long as it's exclusive to Nvidia, I will never care
I just swapped my RX 7800 XT for an RTX 4070 Super and I am so happy. My RX 7800 XT couldn't do path tracing even with the FSR frame-gen mod at 1440p, BUT the 4070 Super can get 70-80 fps with path tracing and DLSS frame gen at 1440p. It's so beautiful 😍
GayTracing and PT destroy game studios' creativity and only give us more mirror shit
Richard 😴
Tom 😪
Alex 🙂
Me 😴
Just dropping in before watching, with a thumbs down, because of the shitty thumbnail. YOU SHOULD NOT DO YOUR CREW DIRTY WITH DOPEY THUMBNAILS TO CATCH THE CORPO-ALGO
Speak in English, please!
Cringe
@@mikeuk666 no, u!