“Should games mandate hardware rendering? Questions for 1998 and beyond.”
So, we're back in the mid-to-late 90s, having a discussion about whether game devs should mandate OpenGL, DirectX, or 3dfx (Glide) hardware, or even software mode...
I think the problem is that it's been a long time since we last had one of those "your hardware isn't just too slow, it's simply obsolete" moments. Anyone who started PC gaming after DX11 became the baseline has been lulled into a false sense of stability, but for those of us who have been around since CGA was the cutting edge of PC graphics hardware, this is just business as usual, and I'm kind of annoyed that it's taking so long to make the transition.
A full mandatory switch will happen, but hardware across the whole price spectrum needs to be capable of supporting RT. If you buy an RTX XX50, it should do RT at the traditional XX50 resolution; otherwise it's going to price people out of PC gaming.
Agreed. While there have always been "prestige" PC titles that buck the trend and push new hardware, they used to be PC-exclusive titles, which are seemingly rare these days. If you want to be profitable, you want the largest number of players to be able to play your game. PC-exclusive games can have the attitude of "time to upgrade", but most titles are going to base specs on the consoles and scale upward. (Correctly, in my opinion.) But aside from that, mandating RT is fine if you can guarantee your frame time budget. It's great we have a couple of examples of games that did that! But let's see a trend of that before we raise the bar. The stutter struggle is not over.
The challenge with RT is you do need geometry to trace against. Current gen has pretty high poly counts, but I think mega geometry will go hand-in-hand with RT in the future and enable more noticeable detail improvements.
You don't need excessive geometry for ray tracing; a normal map can work just fine for surface detail.
@@MrJamergamer sure RT is workable without crazy geometry, but you do have some limitations and additional work for that. If mega geometry works as it has been teased so far, it should bring up quality while greatly reducing the complexity for the artists/developing studios.
I think this will be a breakthrough for RT-only titles (or even path-trace-only) over the next 5 years. The combination of superior graphics and easier development will soon become irresistible for studios, especially once PT-capable cards are widespread among players.
@@T33K3SS3LCH3N Are you suggesting we straight up import our high-poly meshes into the engine and let mega geometry handle it? Because no matter how many tris they can render at runtime, a sculpted mesh of a dragon can have more surface detail than Manhattan. Baking detail to normal maps will always be a thing. I do, however, agree that more actual geometric info can benefit ray-traced rendering.
I've always preferred better lighting over better geometry.
In fact, low-poly games using modern rendering techniques seem to be a trend these days, and I am all for it.
@@MrJamergamer certainly not EVERYTHING, but some especially important focal elements.
Ideally, that's still enough to need fewer different workaround techniques for good ray tracing.
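A minimal sketch of the normal-map point in this exchange, in Python with purely illustrative numbers: lighting is computed against a per-pixel normal decoded from a texture, so a flat surface can shade like a bumpy one. The flip side, and why RT still wants real geometry, is that rays intersect actual triangles, so a normal map changes shading but not what a ray can hit.

```python
# Why a normal map adds "surface geo" without triangles: the diffuse term
# only sees the per-pixel normal, not the underlying mesh. Texel values
# below are made up for illustration.
import numpy as np

def decode_normal(rgb):
    """Map a normal-map texel from [0,1] RGB to a unit vector in [-1,1]."""
    n = rgb * 2.0 - 1.0
    return n / np.linalg.norm(n)

def lambert(normal, light_dir):
    """Diffuse lighting: the only 'geometry' the light sees is the normal."""
    return max(0.0, float(np.dot(normal, light_dir)))

light = np.array([0.0, 0.0, 1.0])         # light shining straight at the surface
flat_texel = np.array([0.5, 0.5, 1.0])    # encodes (0, 0, 1): no detail
bumpy_texel = np.array([0.75, 0.5, 0.9])  # encodes a tilted 'sculpted' normal

print(lambert(decode_normal(flat_texel), light))   # 1.0  - lit head-on
print(lambert(decode_normal(bumpy_texel), light))  # ~0.85 - reads as a bump
```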
This wouldn't mean that games would be harder to run; I think it'd lead to the opposite. If games are built from the ground up with RT in mind, it leads to better integration and therefore better performance. Just look at Metro Exodus Enhanced Edition: it's RT-exclusive, looks amazing, and even runs on my Deck with no issues. Let's start to think of the future and not get hung up on the past.
GPU manufacturers have already been improving RT performance. Mandatory RT requirements only upset gamers who don't have good enough hardware and feel it's unfair and forced. Once consoles hit at least 60fps with mid-to-high quality RTAO, RTGI, RT shadows, and RT reflections, those same PC gamers won't have excuses anymore, and by then the RTX 6000 series will be here and will be close to raster performance with RT enabled.
Consoles are the platform that can mandate certain tech going forward, because they act as the least common denominator, just like how SSDs only became mandatory starting with the PS5 generation. Many games still can't run at 60fps with RT on, so not yet.
SSDs never became mandatory. You can play whatever you want from an SSD or hard drive.
@AdamMi1 There are so many games nowadays that stutter and will not run properly from a hard drive. You're just ill-informed.
Now if they'd mandate 60fps over ray tracing, we'd all be better off. The last thing I want to hear out of next gen is that 30fps is still a thing because developers chose ridiculously unoptimized engines like a future Unreal Engine 6 or some other over-demanding engine, only to have those developers say "well, we'll release a remake or remaster" instead of just fixing the damn way the game runs through a patch!
@@AdamMi1 I mean, technically, yes, you can run new games from a hard drive, but the performance is abysmal and there can be glitches when you try to do so.
Consoles have typically targeted 30fps. I believe we could see more high-quality RT features next gen, but it would absolutely kick consoles back to 30fps, with maybe a performance RT mode that has less features or lower quality where it's boosted to 60fps, followed by ultra performance with no RT and 120fps at middle of the road quality.
I thought Metro Exodus Enhanced Edition was the first game that required an RT-capable graphics card? Am I wrong?
Yeah, it is. I don't know how people forgot about it.
@@delayeedbms You can still play through Metro Exodus without running the Enhanced Edition, since you get both when you buy the game.
Yeah, but you could play the same game in a non-RT version. This you can't.
It's annoying that people keep forgetting about it, especially because it's a great example for the gains of RT exclusivity.
Yeah that was great.
AFAIK making shadows the old-fashioned way is time-consuming, and RT simplifies this aspect of game dev a bit. If it makes building a game significantly cheaper, devs will continue to mandate RT.
Not just that, but it also translates to great savings on disk space because you don't need baked lighting
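To put rough numbers on that disk-space point, here's a back-of-the-envelope sketch. Every figure in it (lightmap resolution, texel format, page count) is an assumption for illustration, not data from any shipped game.

```python
# Hypothetical baked-lighting footprint for a mid-size level set.
lightmap_res = 2048 * 2048   # texels per lightmap page
bytes_per_texel = 8          # e.g. RGBA16F baked irradiance
pages = 200                  # assumed page count across the levels

baked_bytes = lightmap_res * bytes_per_texel * pages
print(f"Baked lighting: ~{baked_bytes / 2**30:.1f} GiB before compression")
# ~6.2 GiB of lightmap data that an RT-only pipeline simply never ships,
# since lighting is computed at runtime instead of stored on disk.
```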
Honestly it would make more sense to make games playable on very low with stock integrated graphics on average computers than to keep making it harder and more expensive to actually play the game. We should be getting closer to every computer being able to play a game, not further away.
Totally. This tech race is making it more difficult for people to get into gaming. Also, developers struggle to properly optimize their games. I am way happier with how games from 10 years ago ran than with most UE5 titles today, which I also think look over-processed and outright blurry if you do not have a $1000 GPU. So, yeah, more optimization instead of high-end exclusives would be nice.
But how would they sell the latest high-end graphics cards? I think that is part of the whole point. We've seen plenty of games where AMD or Nvidia partner with the company making the game; what's stopping these companies from collaborating and releasing a game that is harder to run than it should be, just to sell more computer hardware? I've been thinking it for probably 15 years now, and I really think that happens on some level today.
@@craciunator99 I mean, it's not exactly a secret. Almost all RT-heavy games on the market right now have been developed in cooperation with Nvidia: Cyberpunk 2077, Alan Wake 2, Black Myth: Wukong, even Metro Exodus Enhanced Edition and Control back in the day. And somehow the games keep getting more expensive to run over the years, despite using the same tech.
Couldn't agree more with you on that one, but unfortunately these developers, along with hardware manufacturers like Nvidia, don't see it that way. They just see more and more people bend the knee and buy their overly expensive cards due to gimmicks like frame gen and TAA. Why have a game run at truer, raw fps when they can just keep releasing cards that use shoddy AI as a crutch to get there? As customers, we should at least demand hardware that outpaces the software, and does so without being gimped at a high cost.
Sad times :(
I got an RTX 3060; it technically can do RT, buuuut there's a price in performance.
If more games run worse, I would be very sad :(
DLSS.
@@philipcooper8297 DLSS at 1080p looks like crap
@ I use a 4070 Ti Super, and in Cyberpunk 2077, RDR2, System Shock... with DLSS set to Quality at 4K output, the game looks like native 4K 99% of the time on a 32'' monitor. In some games, such as Forza Motorsport, there is ghosting with DLSS, but it's not so bad that I would sacrifice sharpness or frame rate over it.
@@philipcooper8297 DLSS 4 with the transformer model is also insane; you can get the same quality from DLSS 4 Performance as you used to get from DLSS 3 Quality.
@gorillagroddgaming DLSS Quality at 1080p looks perfectly fine.
My 5700XT on its knees begging for mercy rn
The 2070 aged so much better it's not even funny (I had a 5700 XT too).
I think it is inevitable, but it's going to take another GPU cycle, maybe when Nvidia launches the 6000 lineup in a couple of years, which will also be around the same time we get the PS6. In the years between, though, we will see a gradual increase in games that mandate hardware RT, like we are already seeing. I think one of the biggest improvements with next-gen consoles will be their ability to handle hardware path tracing, and they're going to want to really push that to help sell the system as truly "next-gen." When it becomes ubiquitous in console games, it will become ubiquitous in all games.
They should mandate it for lighting and shadows at least; it makes development so much easier and faster.
sounds like lazy development to me
Needs to be mandatory for reflections. I saw screen space reflections the other day for the first time in forever and instantly disliked what I was seeing... it's so damn distracting watching reflections play peek-a-boo based on where you're looking.
@@smittyvanjagermanjenson182 Yes, reflections as well. Those three things could be a gamechanger for devs: half of dev time is spent baking lighting and other tricks to give a realistic feel, when ray tracing could eliminate all of that and let other aspects be the focus.
The future should be good, playable games with great stories again.
I'm sorry, but I don't really care whether I can see the skin details on the protagonist, whether the shadows look amazing, or whether the leaves fall off a swaying tree.
Control was a beautiful game, but the story was so intriguing that I couldn't have cared less if the graphics had looked like the 2013 Tomb Raider.
I think it's fine to require a hardware-RT-capable GPU at this point (coming from someone who does not own one), as long as the game can run on AMD's/Nvidia's lowest-end (gaming) GPUs that are RT capable (so RX 6xxx/RTX 2xxx cards)... If you can't get it to run on that hardware with RT, you should still implement raster options for now (in my opinion) and have optional RT...
Edit: Upscaling can totally be required for older RT cards, though (but not FG, unless the game can already hit 60FPS, in which case it wouldn't be required anyway)... And it should never require extreme upscaling just to get 30FPS (though maybe to get 60+)... And target res should be at least 1080p, ideally 1440p...
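For a sense of what "requiring upscaling" means in raw pixels, here's a small sketch. The per-axis scale factors are the commonly cited ones for Quality/Balanced/Performance/Ultra Performance modes; treat them as assumptions rather than vendor-guaranteed values.

```python
# Internal render resolution for common upscaler modes at a 1080p target.
modes = {"Quality": 0.667, "Balanced": 0.58,
         "Performance": 0.50, "Ultra Performance": 0.333}

def internal_res(out_w, out_h, scale):
    """Per-axis scale: the GPU renders this many pixels, the upscaler fills in the rest."""
    return int(out_w * scale), int(out_h * scale)

for name, s in modes.items():
    w, h = internal_res(1920, 1080, s)
    print(f"{name:>17}: renders {w}x{h} for a 1080p output")
# Quality at 1080p already renders ~1280x720, which is why "extreme
# upscaling" below a 1080p target starts from very few real pixels.
```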
If I were to guess, in maybe 5-8 years most AAA games will require ray-tracing-capable GPUs.
I don't need all that jazz. I need good games. All this tech-chasing is not for me. I played through RE5 recently and am amazed by how good it still looks. I'm happy with games looking like this for a long, long time.
Seriously, games from 10 years ago still look amazing with some slight modernization and higher resolutions. Look at some PS4 games, they look INCREDIBLE; we have AAA games today that look worse than games for PS4. Developers chase all this crap instead of making a good game that doesn't need high-end graphics.
@@craciunator99 Agreed. It's all marketing and no art. Well, not all of course. There are great games being made, but the industry as a whole is really annoyingly obsessed with terms like "fidelity" and "power" and all that stuff.
@@Mrs_Puffington Yeah, I think that's why indie games have exploded in recent years; there are some incredible games made by smaller companies that frankly have put a lot of AAA games to shame.
@@Mrs_Puffington It's diversity. The vast majority of game releases out there are not targeted at top-of-the-line hardware, but some studios have a particular focus on delivering the latest and greatest visual spectacle.
I didn't care about graphics for about 10 years either, mostly playing Slay the Spire and World of Warships on a GTX 1060. Then suddenly Cyberpunk released its Overdrive path tracing mode. It floored me so much that I got myself a 4090 for it, which naturally got me to explore other high-end titles since. You can enjoy gaming with or without top-tier graphics. Both are fun in their own way.
@@T33K3SS3LCH3N Yes. I don't have a problem with it being an option. I myself will perhaps upgrade for the next Resident Evil ;)
But honestly, I'd be fine with the next RE looking like RE2 Remake. I've got zero issues with that and I'd applaud a more scalable game engine that lets me choose what features I want to use.
No no no no no no no, you want to kill PC game sales?
I say give it another generation. Once we’ve got 2, 3, 4, 5, and 6000 series cards on the market I’d say there’s precedent for it. That being said, if Indiana Jones experiences success on PC in spite of this hardware mandate, good on them
The 2000 series is 6 years old at this point, that's basically a whole console generation right there. I think it's safe to start moving on from the 10 series
The next Doom is going to release on the same engine and mandatory RT. So the precedent will already become pretty strong this year.
The best part about it is that Indy ran really well on AMD and older RTX cards as well, rather than having performance limitations that made RT only viable on stronger Nvidia cards.
@@Alrhama Yeah, even plenty of laptop gamers have RTX GPUs now. It's been 3 generations of graphics cards, going on 4. It's time to move on. Developers spending less of their time implementing baked lighting might also mean that even budget cards will be able to run games with mandatory ray tracing fine.
The future should be all RT so devs don't need to waste time on mimicking anything that has to do with light physics. Also more powerful fluid/volumetric calculations for more complex "realistic" results.
With the Steam hardware survey showing that the most owned GPUs are RTX cards and such, I certainly wouldn't mind or be surprised that a good amount of games released in the future require support for it. Raytracing has been out for three generations of cards now so even budget and laptop gamers have cards that support it. That being said, developers should be sure that their games work on the most common hardware. If the game can't work on even the lowest budget RTX card, then I would not support that. Also if new games can't work on common/popular hardware, more people would be pushed into playing older titles and not even glance at any modern ones which I'm sure publishers and companies wouldn't want.
The Switch 2 is going to have it. That's how RT becomes our baseline. Every console will have RT, so no more excuses not to push more RT, even if the Series S and Switch 2 suck at RT.
Yes, they all should
Let's wait until all GPUs from both NVIDIA and AMD released in the last 10 years are RT capable, so at least another 5 years. By that point, if you don't have an RT-capable PC, too bad, it's time for an upgrade.
What we need is a Steam Machine by Valve, to set generational standards for PC gaming just like what Sony and Microsoft are doing for consoles.
If it runs well, yes. According to the Steam hardware survey, most people own RT-capable RTX or AMD video cards. You can see that Indiana Jones is an RT-only game, and on a 3060 at native 1080p on ultra settings you get 60-70fps, which to me is crazy, assuming you are not CPU-bottlenecked. Ultra settings are overkill; turn that down to low or medium and use DLSS on Balanced or Quality to upscale to 1440p, and you get even better performance for pretty much the same experience, depending on what trade-offs you want to make. The new Intel card is $200 and is better than a 4060 in many cases. Plenty of games will still support raster-only approaches to rendering, so playing on your ancient 1060 will still be viable for a while. If you want to play the new state-of-the-art cutting-edge games, you need capable hardware; a 3060 is the minimum IMO.
Just curious, because you mentioned Dead Space in the video: do you think it's possible DLSS 4 could help mitigate UE5 stutter?
If you’re talking about frame gen specifically, probably not. If I had to guess stutters would probably cause even more visual artifacts since the motion would be erratic, which the AI algorithm would have difficulty interpreting.
@@MerryBlind Thank you. I was wondering more about the new transformer model for DLSS 4, not frame gen specifically (I have a 40 series card and don't plan on upgrading this gen). My tech knowledge is on the limited side, but as I understood it, the Dead Space stutter came from frame time issues, and I was wondering if the new method might help reduce it / bring more frame stability.
Yes. Non-RT GPUs are 8+ years old at this point. Just ensure the lowest settings still run OK on the most shit-tier RT-capable GPU, i.e. a 2060 and the lowest AMD RT stuff (probably the Steam Deck...).
rx 5800 is from 2019.
RDNA1 GPUs are only 5 years old and can't do RT
@@BC-lg7hf There is no such thing as an RX 5800; the highest was the RX 5700 XT.
@@AzaiaMonota They are also a tiny fraction of the market. And 5 year old stuff is already ripe for replacement. Rest is a business decision from the game developer. Do you do two lighting routines just to support 5+ year old AMD cards exclusively, knowing they have like 10% marketshare?
@@Jarnis-v1c RT-capable GPUs in general didn't even exist 7 years ago; it's still too early to make it mandatory. I'd say give it 5 more years. At that point every GPU from AMD and NVIDIA in the last 10 years will be RT capable, and that's when we can say: if you don't have an RT-capable system, that's too bad, time to upgrade.
They should be able to do what they want and take the risk that comes with it.
Yes. Creating lighting ONLY with RT improves performance, like the RT version of Metro Exodus.
Thank you! People forget about that. Metro Exodus Enhanced Edition looks amazing and even runs perfectly fine on the Deck.
AFAIK, Spider-Man 2 is also RT-only, and it runs pretty decently on PS5. It seems that RT-only games actually turn out better than raster/RT hybrids.
@ Exactly, that's why we should switch to RT-only games. The first GPU series with RT is 6.5 years old now, so it's not "new" tech anymore.
Until AMD has RTX I don’t think this should ever be a mandated requirement
No one uses AMD though
AMD already has RT support (Nvidia's "RTX" is just a name for cards which support hardware RT).
It's just that AMD's RT support is nowhere near Nvidia.
@ Then why can't AMD cards even launch RTX Remix games like Portal RTX, Half-Life RTX, etc.? There must be something about RTX that literally cannot function on an AMD card, because if you try to launch those games there will be several graphical errors and glitches. Video games aside, AMD cards cannot use any type of ray tracing acceleration for workstation rendering, like in Blender; they just render at standard rasterization speeds and take no advantage of RT.
@@WigWoo1 Blender has supported RT acceleration on AMD for a while now.
Of course games that have RT on should support all RT accelerated cards by default.
@ How? Only OptiX and CUDA use the RT cores. HIP is the only rendering method available in Blender with AMD, and it doesn't use any RT acceleration.
Would like to see a PSVR2 version of Indy at the PS5 launch, just for giggles.
Definitely. I remember when people were opting to let the CPU do all the work instead of that Voodoo tech; thank goodness Voodoo did its job.
It's time to move on. All vendors are at the point where there should be no issue moving on and focusing on this and other new tech.
Amazing video, thank you Digital Foundry.
Indiana Jones, FF7 Rebirth, and Doom: The Dark Ages. 3 games that require RT.
Next Doom game now too.
It should when most can do it.
Why is everyone acting like this is the first time new rendering tech was added to video cards and then slowly made mandatory as time went by? I have to assume that all these commenters and posters are like 13 or 16 years old and aren't aware that this has always happened with video cards. I remember when they first added hardware T&L; in a matter of months, cards that released that same year were completely obsolete. The same thing happened with pixel shader tech. And don't get me started on the VRAM amounts doubling every couple of months and games requiring the higher memory. At least these days we get multiple years of use from our GPUs. Back then a video card became unsupported in a year, sometimes less. PC game development and hardware advancement were very fast and fluid, so game developers implemented new tech during development.
Yes, they should. However, these gpu manufacturers aren’t making it easy with their pricey enthusiast options and their shitty “budget” options. People are keeping their 8+ year old cards for a reason.
We need more than 8gb vram.
If you have a 10 series GeForce and want to play today's PC games, you shouldn't be gaming on PC; get a PS5.
People forget how old the 10 series is. It makes sense that developers don't consider it anymore.
@@AdamMi1 imagine trying to play games from 2005 on a pc from 1996 😁🤣
No. Get a series x. Much better performance and gamepass. ☮️
Lmfao, consoles are trash value; get a decent GPU with 12 or 16GB VRAM. An RTX 4070 costing less than $500 destroys a $700 console, and a B580, if you can find it at MSRP, easily matches the base PS5. Not to mention all the long-term benefits of PC, like cheaper games and no paying to play online.
@@ninetendopesaitama2107 Lmfao, the Series X doesn't have any better performance than the PS5. And Game Pass is cheaper on PC lol. You console fanboys are so ignorant 🤣🤣🤣
The visual gain of ray traced lighting is not worth the large performance hit for most people. However, if AI were to improve ray tracing performance by a factor of 100 or more, I would not mind a mandate for tensor cores in the future eventually.
If developers focused purely on RT, the performance impact could be mitigated through optimization. Look at the Metro Exodus PC Enhanced Edition: it's got multi-bounce ray-traced GI and runs like a dream, thanks to being RT-exclusive.
Not until we can get past the poor performance era.
It's about DirectX 12 Ultimate, not RT.
no
They have the right to demand it.
I have the right not to buy it.
It won't happen until next gen consoles, it's actually insane to think that a 2070 Super is still what most games are designed around in 2025.
yet somehow you need a 4070 ti+ to run them lmao.
I'm quite tired of the tech race aspect; it feels like the graphics get more expensive and the gameplay stays the same. I wish stuff got more dynamic and interesting instead of chasing graphical accuracy. There's a reason DF shows off graphics with cinematic shots; how much of it do you get to see while actually playing...
I do love to see RT in games; what I played the most was path-traced Minecraft, but other games aren't dynamic and interactive enough.
Let's offload even more of the production costs to the end user!
New RTX new Alex
Couldn't disagree more with Olie.
Doom: The Dark Ages also mandates ray tracing capable gpus.
I think, why not? But I'd give it a few more years, maybe during the next console cycle. GPUs aside… most people still rock mid-level CPUs. My Ryzen 2600 bottlenecks my 4070. Most people will need new builds.
Yup, I have an RTX 3060 12GB, a pretty decent card... one issue: it's paired with an i5 8600F. So... yeah, not going to get THAT much out of it.
The next generation of consoles is starting this year with the Switch 2.
Only makes sense if GPU manufacturers would design independent RT core architecture and sell dedicated hardware for it. Otherwise it would only drive the graphics card prices unnecessarily high and restrict the generational uplift in traditional raster performance.
So, when shaders on hardware T&L were introduced, did you also think that they should have sold separate cards or hardware? That would end up being more expensive and cumbersome for everyone. How would it make sense to implement graphics features in separate hardware instead of adding the tech in the video cards themselves that is meant to handle graphics?
BS. You want an RT-specific extra PCIe card? That would be stupid, expensive, and worse-performing than what we have today.
No. Because as of 2024 the gaming industry still has no official standard of acceptable performance. Native 1080p 60fps somehow isn't the baseline, and the vast majority of games struggle to hit that.
Great example: the God of War reboot was released in 2018. The 1080 Ti released in 2017. The vast majority of games, from indie to AAA, don't look half as good, yet we want RT and AI to do the optimization work for us?
If you have to buy a graphics card to get RT, it's not mainstream enough. When integrated graphics can do ray tracing at 1080p 60fps, then pushing RT as a baseline makes sense.
Sure, just make sure the game can actually run at a stable and acceptable 60fps on a reasonably priced GPU without any upscaling or fake-frames gimmick. Which means we're still far from this 😂
The issue is that as seen on the hardware unboxed channel a couple months ago: too many games still look WORSE with RT on, a ton of games look very similar but with a hit to performance and a lot of games may look nicer but at a high performance hit.
This is different from the late 90s where if you had a 3D acceleration card like a 3dfx voodoo it completely transformed your gaming experience. Suddenly you were running at seriously high frame rates, with transparency effects, lighting etc. it was just an upgrade in every way.
@@paulszki But the games on HUB's list where RT was a focus point were all on the good side. Which in my opinion is the important part.
Also, why did they use the not-so-great RT games for their RT benchmarks, I wonder? AMDunboxed at their best.
@ I honestly can't take you seriously with that conspiratorial implication.
What are the "great" RT games that were missing?
Saying "ray tracing is often not worth enabling" isn't some AMD fanboy hot take. I have a 3090 myself and frequently check out ray tracing in games, and often it's just a visual side-grade with a hit to performance.
@@paulszki It's not conspiratorial. HUB did make a list ranging from bad RT to good, correct? And HUB did use games from the bad side in their RT benchmarks. Just like when they did rasterized benchmarks, they added CoD twice. It's not up for debate; that is objectively true.
Also, RT in games where it was clearly a development target is objectively better than rasterized graphics. Again, even HUB said that.
I would definitely not mandate it yet. AMD got into RT just 4 years ago, which means there are plenty of people who, at the time, bought into the previous generation at a bargain price, like the 5700 XT. I happen to have that card, and it performs very well even at 1440p in today's AAA games at medium-low settings; I would prefer not to have to replace it.
AMD's fault for not jumping in earlier. Besides, the RX 5000 series has a nearly non-existent market share, so just leave it behind.
As soon as something better exists, mandate it. Sales will increase and prices will drop.
The only reason LCD TVs went down in price was because the Xbox 360 went to game stores on demo displays. Immediately the whole world was able to use one in real time, and games needed that display hardware to play.
TVs were forced to drop in price because of video games.
No, because some people prioritize rasterization and real framerates over RT and frame gen.
Nobody cares about the poors who refuse to move on
@@squirrelsinjacket1804Nobody cares about the poors who refus- ACK!
🤣
@@squirrelsinjacket1804 It's not about price, it's about consumers having options
Stay stuck in the past then :)
No because the cards are overpriced and underpowered for RayTracing. It will be in the future, but not now.
Based. People who shill for RT suck digger nick.
@gorillagroddgaming Oof. I mean, it's the future of games. You will always get a better product the longer you wait.
Just wait 6 more years and get an 8000 series card if the 5000 series isn't good enough.
I'm going to be upgrading from a 1080 Ti to a 5090.
Brother, the RTX 20 series is capable of basic ray tracing, and the games that have this requirement can run on those cards for the moment. You can't just stop technology and advancements in graphics; they are not going to wait for you, sadly.
@@javierarroyo6600 At what framerate, brother?
Before I even start the video: if the game's visuals don't improve, then NO, it should not require more and more advanced and expensive hardware.
The visuals do improve, immensely. Fortnite has some of the most optimized RT performance around. When I max out Nanite and Lumen and set everything else to High, the game runs 4K 60fps and is jaw-dropping. Setting it back to raster for the first time in months made me dislike the prebaked crap... raster lighting and fake reflections were nice 7 years ago; now it just looks soulless AF...
Only when entry level gaming graphics cards can properly run RT features with acceptable performance at native res.
Native res is not relevant anymore.
@@zacthegamer6145 That's sad. It should be.
@@charmingpeasant9834 Mate, quality upscaling is 30-40% more performant and provides the same visual quality.
@@zacthegamer6145 I don't care about your AI smear filters.
@charmingpeasant9834 It's not about what you care about; this is how the industry is moving forward. I would rather play at higher settings/FPS than at a lower frame rate and lower settings at native.
Lmao, no. There are still a lot of people playing on pre-20/6000 series cards, and even those playing on lower-end 30/40/6000/7000 series cards are not going to be able to run new games with proper ray tracing. That really limits your market when only the top third or so of the player base can run the game. The only hope is that upscalers get good enough to run games at 50% or even lower base resolution, to hit the 60 frames needed for a good experience without terrible artifacting.
The majority of PC gamers have RTX-branded cards. With the new DLSS I can play CP2077 maxed out with Psycho RT at 3440x1440 and get 60fps on a 3070, or even path tracing and get a good bit above 30. And the image quality is great.
It's time for at least some games to mandate RT.
What a question! 😂 OFC not! Not everyone can afford to upgrade often enough for this to be mandatory in 2025. Devs should keep giving options to run games without ray tracing. I don't know how the RTX 5060 is going to perform, but I don't think it will be miles better than an RTX 4060. Ray tracing remains an option for people with deep pockets, and by deep pockets I mean people with at least a $650 graphics card for the next 2 years.
Yah! We also shouldn't have luxury versions of anything! Not everyone can afford it so it shouldn't exist!!
Obvious sarcasm
While I get the point, and agree to some extent, this causes many issues during the game’s development. Lighting is one of the hardest things to get right and if you not only have to design 2 lighting options, but the raster one is way harder to implement properly, I can clearly see why studios are moving away from it.
@@AnEyeRacky That's why most games make RT optional not mandatory
Rendering the geometry in a frame costs only a few ms, ~5 ms. The rest of the ~33 ms budget at 30 fps goes to shadows, GI, and lights, with the upscaler serving as the AA solution. When players talk about optimization: there are not many ways to optimize RT; even low quality is actually too demanding for the hardware.
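Worked out, the budget that comment describes looks something like this; the 5 ms geometry figure is the commenter's own estimate, and the rest is simple arithmetic.

```python
# Frame-time budget at a 30 fps target, using the ~5 ms geometry estimate above.
target_fps = 30
frame_budget = 1000.0 / target_fps   # ~33.3 ms per frame at 30 fps
geometry_ms = 5.0                    # rasterizing/tracing the geometry itself

print(f"Frame budget: {frame_budget:.1f} ms")
print(f"Left for shadows/GI/lights: {frame_budget - geometry_ms:.1f} ms")

# At 60 fps the same 5 ms of geometry leaves only ~11.7 ms for everything
# else, which is why heavy RT tends to push consoles back toward 30 fps modes.
print(f"At 60 fps: {1000.0 / 60 - geometry_ms:.1f} ms left")
```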
Current games are just soulless products of profit. AC Shadows' system requirements are ridiculous while offering nothing new or groundbreaking. Art design is more important than raw graphics (take a look at Arkham Knight).
Simple answer: no... Developers don't optimize... The only generation where PC games were optimized was when consoles had weak single-threaded performance, which forced developers to optimize. Now games are 10 times more taxing for 5% better graphics, and terrible games are good for nothing more than hours of tinkering on the settings screen.
Optimisation is no worse now than it was in the past. You just don't remember how poorly games ran back then.
@zachb1706 Whenever a console generation has strong CPUs, optimization is bad. Best example: the 2 eras where optimization was bad coincide with the Xbox 360/PS3 era and the PS5/Xbox Series X era.
@@soulsbourne The XBONE/PS4 era had loads of poorly optimised games. Off the top of my head: Arkham Knight, RDR2, Watch Dogs, Far Cry 4, AC: Unity (Ubisoft games ran terribly on PC).
Games developed for PC first run well; games that aren't run poorly. It's been the same forever.
@@soulsbourne Also the PS2/GC/Xbox era. See how bad the PC ports of the GTA games were.
I would say the PS4/Xbox One era had the best optimized (relatively speaking) PC ports of console games.
The game industry can try, but I've already given up on AAA slop and just play retro and indie games.
Isn't it great how the goalposts are constantly being moved by those who reject RT? First it was 60fps, then 120fps, then 4K, then 144fps, and now it's "this 0.2ms of latency is killing muh K/D ratio". Cards can easily do RT; it's path tracing that's the next hurdle.
"What do you mean RT has been optimized and not brute-forced? FAKE, we say!!!" /s
I don't want to be forced to buy an Nvidia RTX 6070 for $3000 with 8 gigs of VRAM.
Yes, they should. It's just too much work for developers to tune the baked lighting for all the old GPUs from eight years ago.
No
Just get a PS5 Pro. Better experience.
The Switch 2 supporting RT? On such weak mobile hardware that NEEDS the crutch of AI to just barely hit 1080p60!? I'm laughing my ass off.
100%. Apple said NO MORE 3.5mm and, lo and behold, the _entire industry_ followed suit without so much as a little resistance. Going forward, it's RT or DIE. This is a very good thing indeed.
>This is a very good thing indeed.
No it’s not.
so cucked
Apple took away the 3.5mm to sell more air pods. Every decision apple has made with the iPhone has been based on profit margins. Please stop.
Sorry bud, you can't listen to this new music because you're trying to listen to it through your 3.5mm jack and we arbitrarily made Bluetooth the only way you can listen.
Why would you use a bad thing everyone hates as an example