Even at 1440p you can DLDSR to 4k and DLSS down for the render res. The point is 4k is a perfectly viable end resolution, it's just not a viable render resolution. Not that you couldn't make it work, but it's just wasteful to reserve performance for it.
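For anyone wondering what that combo actually renders at, here's a minimal sketch of the arithmetic, assuming DLDSR's 2.25x factor and the usual 2/3-per-axis DLSS Quality scale (both are assumed default factors, not measurements):

```python
# DLDSR + DLSS arithmetic on a 2560x1440 panel.
# Assumed factors: DLDSR 2.25x pixel multiplier (1.5x per axis),
# DLSS Quality = 2/3 render scale per axis.
panel = (2560, 1440)

# DLDSR 2.25x turns the 1440p panel into a 4K output target.
dldsr_target = (int(panel[0] * 1.5), int(panel[1] * 1.5))                # (3840, 2160)

# DLSS Quality then renders internally at 2/3 of that target per axis.
internal = (int(dldsr_target[0] * 2 / 3), int(dldsr_target[1] * 2 / 3))  # (2560, 1440)

print(dldsr_target, internal)
# You render roughly a native-1440p pixel load, but get 4K-grade
# reconstruction downsampled back to the panel.
```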
High frame rates at 4K are easily achievable and look miles better than 1440p on a display over 32". The problem is when you turn on high/ultra/extreme settings, which add a massive overhead.
Then just turn off ray tracing? No one buys a 4090 and plays at upscaled 1080p for nothing. It's a choice every single person playing like that makes in order to get features that are otherwise unavailable.
You're spending that much to have more performance than the other GPUs. You don't see those of us with lower cards complain about using a 720p render resolution. That still looks plenty fine. 4k DLSS Performance looks insane for most people; it's nowhere close to actually being stuck on a 1080p screen like MOST PEOPLE HAVE.
You don't need to. The 4090 is for a minority who don't think remotely like you. In fact, you are not the demographic for that card. The 4060-4070 is more than enough for everybody else... or you can buy a used card and save even more money.
Worth noting the way that raytracing scales with the number of pixels being rendered. Going from 1080p to 4K in a pathtraced game will be more demanding than going from 1080p to 4K in a traditional game. Performance falls off a cliff once you hit a certain number of pixels/rays.
Pathtracing also pretty much solves scene complexity: you can throw as much shit on screen as you want and it will run just fine. Portal RTX and Cyberpunk Overdrive don't run all that differently despite one being a corridor puzzle game and the other an open-world mega city. All it takes is the shader grunt to push the pixel count you're looking for. Also, in the last 2 generations Nvidia has managed "only" +60% in raster per generation but +125% in path tracing per generation. We haven't seen generational scaling that good since the late 90s / early 2000s. It's fertile ground for growth like we haven't seen in decades. If they keep that up another 2 generations, the 6090 will be path tracing at native 4K easily north of 75fps.
@@gameguy301 Scene complexity is never an issue with PT, it's always the pixels; even movies render path-traced scenes at lower resolutions because of it.
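Both claims in this exchange are easy to sanity-check with back-of-the-envelope numbers; the per-generation multiplier comes from the comment above and the starting frame rate is a made-up figure purely for illustration:

```python
# With roughly one primary ray per pixel, the ray budget scales linearly
# with pixel count, so resolution is the big lever.
pixels_1080p = 1920 * 1080      # ~2.07M
pixels_1440p = 2560 * 1440      # ~3.69M
pixels_4k    = 3840 * 2160      # ~8.29M
print(pixels_4k / pixels_1080p) # ~4.0x the rays going 1080p -> 4K
print(pixels_4k / pixels_1440p) # ~2.25x going 1440p -> 4K

# Compounding the claimed +125% path-tracing gain per generation:
fps_now = 20                    # hypothetical native-4K path-traced fps today
per_gen = 2.25                  # +125% per generation, as claimed above
print(fps_now * per_gen ** 2)   # ~101 fps two generations out, i.e. "north of 75",
                                # but only if that scaling actually holds
```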
People are just mad they got suckered into paying 1600$ for a GPU. There have always been games that could not be played at max settings on contemporary hardware. Just turn down the settings and learn from your mistake when the 5090 gets announced.
Some people think that if you pay more you get a card made out of unobtanium or something. You're only getting +46% performance over the half price 4070 ti super.
Maybe but a lot of the folks who bought a 4090 aren't exactly struggling and are not the people who are willing to compromise. They'll turn on DLSS quality because it looks as good as native but won't compromise much beyond that. And I'd guess a large portion can absorb a $1600 expense every 2 years. And if you factor in the resale of the old card, it's less than that. So it was very easy to know exactly what one was getting into when they bought it.
@albert2006xp To be fair the highest end of GPUs have always been top dollar for not much in extra value but people are willing to pay for the best of the best at the time.
Believe me I NEVER ONCE felt suckered into getting my 4090. I just stick with 1440p and have a whole lot of extra headroom. I can play anything at max settings max raytracing at 1440p 144fps
4K 4090 gamer here. You can definitely forget about native 4K with "extreme" settings on newer, intensive games. People overestimate how powerful the 4090 is. :) DLSS is a life saver here. I personally dislike frame generation, something about it feels off to me. Maybe it will improve in the future. But, yeah, some compromises are necessary even to hit the 60 FPS "minimum", and I still rely on optimization guides (like those made by DigitalFoundry).
The 60fps brainworm is the worst thing to ever happen to pc gaming. The 4090 is perfectly capable of 4k ultra in almost every title if you go by how it feels to play rather than whether you can hit a completely arbitrary number. Only times I've needed DLSS at all was for path tracing.
@@thomasjames5757 Nothing "arbitrary" here. 30 FPS looks bad to me, and to a lot of other people, too. We'd rather remove some bells and whistles instead.
@@EmblemParade Yeah I just tried a little Wukong at 30fps. Feeling genuinely physically sick after that. Been loving my 40 series for high fps at 4K - quite happy lowering settings to achieve that.
@@EmblemParade 30 fps definitely feels much worse but the difference between 55 and 60 isn't that impactful. As long as the game stays in your monitor's VRR window. I think what he means is you don't have to get EXACTLY 60+.
@@albert2006xp You're splitting hairs here. Sure, nothing magical about the number 60. But if I'm honest, even 60 is a compromise. Once you go above 120 it's hard to unsee. :) Still, 30 is way too low for my tastes.
Even then, the RTXDI setting on Outlaws zaps frames down to 65-70fps from 120-130 with frame gen; too low for me, that's for sure. That's with DLSS Quality. Which, tbf, you'd expect more from a $1600 GPU paired with a 7800X3D. I guess, like they said, they designed that setting for the future; it's just a shame today's enthusiast-level GPU isn't enough.
How could they be happy with a weird screen that looks like a TV cut in half? Games look really bad on them; why would you want a screen with a huge horizontal viewing range but a teeny tiny vertical one? No games look good like that lol
1440p is like the perfect res for bigger monitors. 4k already goes into diminishing returns territory and is 2.25x the pixels to render. 4k can be nice for creative applications, but for gaming with lots of motion, might as well put that slight sharpness increase into better graphics.
"Recalibrate your expectations": if I'm a person spending $1600-2000 on the card alone, then I'm going to tell you there's really not much room for recalibration. I was there for Crysis, I've been there for Quake. I've been there for Windows 3.1 and the 486. But I wasn't spending $2000 (or the inflation-adjusted equivalent) back when 720p 30fps at medium-high details was the pain point for Crysis.
For me it is not even the 4K issue, it is instead games coming out where high end cards are struggling with 1080p native, with visuals not being that much improved compared to older titles.
@@mondodimotori Check out benchmarks for Black Myth Wukong at 1080p native with max settings (max RT) and no frame generation. The RTX 4080 Super struggles to maintain 50FPS; the only card hitting 60+ FPS is the RTX 4090. This is why you see most benchmarks on review sites using 67% scaling, which at a 1440p output is an internal render resolution a little under 1080p. PS, benchmarks that use upscaling with a 1080p output are doing a disservice, considering that DLSS and FSR offer horrible results at such low internal render resolutions. Too many distant details drop to a level where the upscaler does not have enough info to go on. Trying it with the free benchmark tool, it looks horrible in that game.
@@Razor2048 Because Wukong has settings that are normally disabled in UE5 games and meant to be used for tech demos only. The cinematic settings are enabled so the devs can't be accused of downgrading the graphics and so the game can scale graphically on future hardware. The game looks and plays perfectly fine on an RTX 3070 from 3.5 years ago at 1080p/60FPS with high settings (no upscaler, or 1440p with 75% DLSS). The same goes for Alan Wake and SW:O.
@@chimpantv It's awesome: you play the game when they are actually finished with patches, and the hardware you have by then is designed to run what was considered futuristic on your current mid-range hardware.
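Quick arithmetic behind the "67% scaling" point a few replies up, assuming the common upscaler scale factors of roughly 0.67 (Quality) and 0.5 (Performance) per axis:

```python
# Internal render resolution for common output / scale-factor combos.
def internal_res(output, scale):
    w, h = output
    return int(w * scale), int(h * scale)

outputs = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, res in outputs.items():
    q = internal_res(res, 0.67)   # "Quality" / 67% scaling
    p = internal_res(res, 0.50)   # "Performance" / 50% scaling
    print(f"{name}: 67% -> {q}, 50% -> {p}")

# A 1440p output at 67% renders about 1715x964, a little under 1080p as stated;
# a 1080p output at 50% drops to 960x540, which is why upscaled 1080p
# benchmarks look so rough.
```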
You know what's funny? Seeing people crap on $400 consoles for not being able to run games at 4k@60fps, and then seeing a discussion here about a $2000 video card also not being able to run the new games at 4k@60fps, with the comments going "oh but we went too quick to 4k" or "that's why I play in 1080p/1440p". The irony.
If a game needs a 4090 with DLSS upscaling for 4K, it really shouldn't exist as anything other than a tech demo. If it only "needs" a 4090 because of path tracing or ultra settings, that is a completely different thing. Frame generation should NEVER be used when showing frame rates in reviews.
I disagree, I think technology will continue to move and priorities will change. I'm willing to deal with the occasional stumble in a game if it means progress in graphics and immersion over time. Otherwise we would still be using directx 9
@@imlegend8108 I understand, but I'm thinking of Dx14, or even 15. We get there by learning from everything we do now. Fingers crossed that future leadership in these companies understand that
1:40 This is something big to consider and it's true. Once hex and octa core processors became the norm, GPU capability also went through the roof. A $500 CPU from just a couple of years ago is struggling to keep up with the single-core performance of budget chips today. It's much like the late 90's into the early 2000's: a whole generation of hardware became obsolete in the blink of an eye.
It's games that are still relying mainly on single-core performance. Engines are not optimized, and UE5 is the prime example: it still has issues with multithreading up to v5.4, and Nanite basically only cuts dev time by making everything less optimized. Games are getting demanding because there is less and less work required on the tech side to make them run properly. Devs just tick the DLSS and frame gen boxes and call it a day... then come traversal/shader comp stutters, bad memory/VRAM usage, bad CPU usage. People might disagree all they want, but it's just the reality of things; they might not like it, doesn't change a thing.
It's not really that accurate. I have a 9900k from 2018 and it still performs perfectly well in anything I throw at it and usually in the 120fps+ range. It's mainly the cheaper 6 core chips (like Ryzen 3600) that are starting to struggle but that was a given when the consoles shipped with 8 cores of the same generation.
@@enricod.7198 You have no idea how hard multithreading is. It's not "lazy devs relying on cheats". Even professional software that costs thousands of dollars per year struggles to take advantage of all the cores of a current CPU. Look at AutoCAD. This "lazy devs don't optimize" argument is dumb.
1440p on an ultra wide has always been great for me on PC with a 4090. I can run anything at ultra settings with decent frame rates, and I’ve never once wished I had a 4k display. It’s just always seemed like overkill, especially when sitting so close to the display anyway.
Good point. That viewing distance is why I just switched to this 4K 32" - because the majority of gaming time I like to grab the controller and sit back. Been great fun so far.
I would have agreed with this statement until every game from 2016 onwards decided using TAA with a blur filter was a good idea. MW22 looks blurry and gross even at 4k, but most games look great, and I always found games to look blurry at 1440p, kind of like I needed glasses, but then 4k is like wearing those glasses. For games like Battlefield, PUBG and Warzone it makes it significantly easier to see at a distance; it feels like cheating imo.
Same! 3440x1440p OLED is beautiful and plenty of eye candy. I don't really care to up resolution as I cannot tell the difference unless it's literally side by side. I prefer a high frame rate
If the card I spent a mortgage payment on cannot handle native 4K then we've fucking lost the plot as consumers, and the fact that y'all are defending regression in performance is insane and bewildering. $1,600 on a GPU means high expectations, and it is not unreasonable for us to expect NATIVE 4K. Instead they shove graphs at us using god-awful smeary frame gen and FSR/DLSS at potato internal resolutions with bad latency and tell us to focus on "FPS number go up!!!", but the image quality is butchered in the end, at which point we're paying $1.6k for blurry gaming at 1080p? Thanks? I was playing native 1080p games 12 years ago on my $215 GPU, as seen in my oldest videos, and I'm being told that 4K is a waste and blurry 1080p upscaled to artifact hell is somehow better? It's like I'm being gaslit lol. Get real guys, seriously.
This is thanks to Nvidia trying to push ray tracing when the tech just isn't developed enough to reach a level comparable to rasterization, along with new technologies and optimizations that demand 5 times the performance for a marginal increase in graphical fidelity. There have always been unoptimized ports, but the problem isn't optimization; it's that we are trying to use technologies that are not ready yet and that are not worth using at the moment. Look at how the graphics settings in games have changed: 5 to 10 years ago playing on low looked horrible, medium looked acceptable but far worse than high, high looked very good and ultra looked amazing. Now low looks acceptable and medium, high and ultra sometimes look almost exactly the same, even though high and ultra lower performance by a lot.
Why would anyone leave half the performance on the table for you to waste on 4k native? Use DLSS Quality and double your fps. Stop wasting performance. 4k native is not a resolution that is supported anymore, it's basically as pointless as 8k.
@@islu6425 His point is trying to equate how much money Nvidia makes him pay with how much resolution games are tuned around. Resolution will never go any higher than this; games will get more graphical fidelity instead. Resolution with upscaling is good enough as it is, and 60 fps is around the right balance point on the fps side too. Resolution and fps are always going to be a balance of how low we can go so that the performance can be used for the game itself. Upscaling simply shifted that balance, because it made lower render resolutions actually look good. So now they're acceptable, and games can be pushed further.
@@christophermullins7163 Not necessarily, since even today many cards still struggle with 4k. Cards of today support 8k natively, but that doesn't mean they can run modern demanding games at that, especially maxed. Something like 20-30 years(!) ago, cards supported a max of 2048x1536, and then, from memory, 2560x1600. Then eventually, where we are today (probably some more steps in between). If you were to look at an actual timeline of resolution support, you would see just how slow progression has been.
I understand that you guys are really excited about frame gen and DLSS and equivalent technology. It is neat. However, I am sorry but it looks terrible. Outlaws on anything less than DLSS Quality looks awful: the artifacting is really bad, it's really blurry. Ray tracing cannot be turned off. So to run it on reasonable hardware that doesn't cost the same as a car, you need to turn it down low. And frankly, ray tracing looks rubbish on lower settings. The weird blurriness of it because it is low resolution, added to the constant flickering as each ray is processed. It looks pants. Lighting was better in games 6-7 years ago. I think with you guys trying all the latest tech you may have completely lost touch with the average gamer who is not running a 4090 and the latest greatest CPU and RAM.
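Rough pixel math behind the "double your fps" claim above, assuming the usual DLSS per-axis scales (Quality 2/3, Balanced ~0.58, Performance 1/2) and a purely GPU-bound scene:

```python
# Fraction of native pixels actually rendered per DLSS mode (scale squared).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
for name, scale in modes.items():
    print(name, round(scale ** 2, 2))
# Quality ~0.44, Balanced ~0.34, Performance 0.25 of the native pixel count.
# In a fully GPU-bound scene, fps scales very roughly with 1/pixels, so
# Quality landing near 2x native fps is plausible, minus the upscaler's own cost.
```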
Stop trying to run max settings or using features with less than adequate hardware. Games like Star Wars Outlaws are using technology which won't be fully enjoyed until future cards arrive and that's a good thing as it's exciting and pushing the technology. The graphics are beautiful if you use settings that are within the reach of your GPU.
3:37 - Finally someone says it! The fact that many modern GPUs have been able to run games at high framerates on the highest possible settings is a relatively new phenomenon. It's been quite common throughout the past that games push graphics technology so aggressively that hardware is forced to catch up, and we're starting to see it happen yet again. This part isn't new. Honestly, I think people are just upset they drank the Nvidia koolaid and dumped nearly two thousand dollars on an RTX 4090 when it only has a 20-40% performance advantage over cards that're less than half the price. I'd feel pretty sour too if I let my brain disregard the comparison charts because "more money = more better" and I still didn't get the performance I'd hoped for after buying a GPU that was the most expensive on the market by such an extreme margin.
You don't even need native!! I use my 4090 on a 65-inch OLED TV and I've compared native 4k so many times to 4k + DLSS, and the difference was always so damn little to my eyes, so I always use DLSS Quality for the better efficiency! I'd also use DLSS even if I'd get the same FPS as native 4k!!!
It's insane that on a channel based on video game graphical fidelity you keep recommending upscalers in performance mode. There's a noticeable dip in clarity, especially in motion, anywhere below Quality (or Balanced if I'm generous), even with DLSS at 4k. You guys love upscaling so much.
Other than Tom, who actually recommended the 40 FPS visual mode on Star Wars: Outlaws, everyone else, especially Olivier, has somehow convinced themselves that abysmally small internal resolutions upscaled to 4K are somehow okay and acceptable, and it's so antithetical to the channel that I've honestly lost respect. It shows a fundamental misunderstanding of what they're talking about sometimes. DLSS, even though it's come a long way, STILL has noticeable shimmering/flickering and ghosting issues, and they don't really acknowledge that very well.
There's a noticeable dip in image clarity, yes, but it often comes with the benefit of headroom that, if turned into advanced rendering features, can pay off. They never pretend 4k Performance "looks like native 4k"; they just think hyper-pristine visual clarity is not the be-all end-all of graphics.
Consider the amount of graphical fidelity you can ADD on screen with the performance you save, though. Most people play on a 1080p monitor and are fine. 4k DLSS Performance looks massively better than even DLDSR+DLSS on a 1080p monitor, which already looks insane. The performance cost of going to 4k native is like going back 3 series of GPUs.
I mean, did people forget completely about Crysis? In which generation was that game finally completely playable maxed out at 60 fps? Certainly not on the highest-end GPU when it released, and neither on the gen after that. The more important question is how much more power we get with the 5000 series. If the 5080 is on par with or faster than the 4090, ok. But then how many % of an uptick does the 5090 have over the 4090? 40%? Because in my mind it would need double the performance to justify the probable exorbitant price tag. But we'll see.
Only information we have are leaks, and they say 5090 is about 30-40% more powerful than 4090 and that it will use 600W. Rumors also say that 5080 will be 0-10% more powerful than 4090 and use 450W. But those are just rumors / leaks / speculation, and I would take them with a grain of salt. And as a 4090 owner with some money to burn, I would take 5090 into consideration if it was 60%+ better (and that i can sell my gpu for half of the cost of 5090). also a note, we don't know if that "more powerful" is raster or RT performance.
@@captainshiner42 I wouldn't be so sure about that. I could see 24 or 20 GB being the new norm for the XX80 series going forward. A 5080 with 16GB of VRAM would be weird, but I also wouldn't put it past Nvidia. Thing for me is, having seen games that already demand more than 16GB of VRAM at 4K, I'll be going for the 5090 if the 5080 doesn't have at least 20GB of VRAM. Still stuck on the 3080 10GB, which is fine but it's showing its age.
Having a 4090 myself, it seems like 100% of the time my CPU can provide frames at a higher rate than my GPU. I haven't played anything where my GPU isn't at 99-100%. Hardware Unboxed always does CPU testing, and at 1080p ultra many modern CPUs tend to provide way higher fps than I'm going to get with my 4090 at 4k. Even Tom's Hardware has said they don't do CPU tests at 4k due to an almost-always GPU bottleneck. I wish it was mentioned which games are CPU bottlenecked, and why it is thought to be the CPU. I can see it at lower resolutions, but generally if you are upscaling you are likely to be GPU bottlenecked already, unless you are fine lowering graphics settings to hit max CPU fps.
It's more of an issue for people that adjust their DLSS for 90 fps or so. Though if you have a 7800X3D only a couple of games will have problems before 90 fps. See: Hogwarts Legacy as an example that comes to mind.
Off the top of my head, Dragon's Dogma 2 and Starfield are two titles that get very CPU-bottlenecked in urban areas. But the most common CPU issue would be stuttering, most notably shader compilation stutter.
With modern games on my 4090 at native 4K I'm pretty much always GPU bound, and if I set DLSS Quality I'm still mostly GPU bound. Only in a few select, horribly CPU-optimised games am I CPU bottlenecked, but not many.
No need whatsoever. People will spend themselves into a hole if they keep striving for maxed-out settings at the highest possible resolutions. These games look and run just fine on machines that cost half the price of a 4090.
The issue is that game developers should prioritize optimizing for console-level graphic fidelity when preparing a PC release. This entails building the game on PC hardware that closely mirrors the specifications of the PlayStation 5 and Xbox Series X, ensuring that the PC version is fine-tuned to extract the maximum possible performance. The goal should be a minimum of 60 FPS on mid-range PC configurations with no image upscaling. This way, players using higher-end hardware, such as an NVIDIA RTX 4070 or better, would experience exceptional scalability, offering 4K resolution at 60 FPS (or higher) on max settings for every game. Proper optimization ensures that the PC build benefits from increased resources without compromising performance across different hardware tiers.
People still praise and want Unreal Engine 5, which is actually a step back from a lot of optimizations that were developed through the years. Yes, less work for devs, but it ends up giving worse-optimized games and it shows. UE5 started barely better-looking than late UE4 games while running 3x worse. Turns out most of the new tech like Nanite wastes a lot of GPU power for nothing, which used to be avoided using LODs. The future should have been AI automatically generating LODs for reference meshes, not Nanite.
@@enricod.7198 Yeah, agreed. You can kind of tell when performance drops around 20% or so and the general file size grows dramatically just because a project switches from UE4 to UE5.
Even the goat 1080ti fell behind at 4k very quickly. I'm sure the cynics were ranting about how Nvidia was purposely weakening the card in order to sell their next flagship while ignoring the fact that graphics are improving. I guess one could say that things are different now, because UE5 was designed with upscaling in mind, so now even the 4090 needs to use these new features at ultra settings. Once nanite completely eliminates lod pop-in in games, the visuals will be way better, and the next gen cards will be able to handle the features with better or no upscaling at all. All Nvidia needs to do is not price people out of these features, and provide enough vram.
@@kainairsoft2331 Less DRM locks, dramatically superior multitasking, the ability to game on something that isn't a controller, guaranteed backwards compatibility, more scalable in-game settings, better VRR, SteamVR, and the overall freedom to personalise your hardware and software alike? Consoles are affordable and reliable gaming machines, but being on PC is about so much more than boasting about having a graphical edge over console with a $2000 rig.
@@kainairsoft2331 Customization? Modding? Uses outside of gaming? Upgradability? Keyboard and mouse controls? Better-than-console experiences even if they aren't 4k or max settings? The promise of PC was never "buy whatever and get 4k max settings", it never even was "buy the best and get 4k at max settings". It was, the better hardware you buy, the more advanced features you have, but wanting these advanced features to always be "max" is just being hyper fixated on naming instead of experiences.
@@iurigrang All these pale in comparison to just pressing a button and everything working. There are lots of problems surrounding using a PC for gaming, and I would know since I started building PCs in 2004. I have a 14900KS right now, an Asus ROG Strix OC 4090, a Z790 Asus ROG Maximus Extreme mobo and 128 GB of DDR5 Corsair Dominator Titanium. I still prefer my PS5. My PC already feels like a scrap of outdated crap, while on the PS5 everything just works. You have another set of expectations if you spend 5-8k every two years on your PC. You expect the best. You open the fps overlay to see the performance, usage and temps. You try to get the best out of your PC. Consoles don't give me such anxiety. And that's if the port is actually good and not some unoptimized bs. I won't speak for usage outside of gaming, as the PC master race is aimed specifically at high-end gaming settings and fps. You can't laugh at a console if you need DLSS from 900p for a 4090 to run UE5.
I got 22-24 FPS in Black Myth: Wukong at native 4K, which is unplayable, but when you turn on frame generation and pull up the in-game menu with stuff in it you can see some glitches around the text. And I have to use 85% resolution scale and frame generation to play at 60FPS, otherwise it's unplayable.
Some people really underestimate how heavy 4k alone is, let alone when coupled with high-quality ray tracing. The more pixels you render, the more rays you need.
One issue to discuss is how games downscale textures, shadows, draw distance and lots of other things when you drop the render resolution. If those were independent of resolution (Unreal Engine is a prime candidate to try this on), using DLSS Performance wouldn't look like using low settings anymore.
"If you're CPU limited then there's not really a lot you can do to solve that, which is why frame gen is so important." INCORRECT. That's why *optimization* is important. We need to stop pretending like game developers are powerless and the software just is what it is. Upscaling and frame gen are crutches that lazy developers use to save money by not optimizing their code to run well. There is absolutely no excuse for your game not running fantastically well on thousands of dollars worth of hardware.
@@fcukugimmeausername This is a well-known phenomenon: as hardware gets better, optimization gets worse. Look at the new Capcom games, which struggle to achieve even 60 fps on a 4090, while visually and environmentally equivalent games from just a few years before do it with ease.
im currently testing the 7090 gpu which will come out in 10 years time and im able to play stardew valley at 16k resolution locked at 120fps, just a sneak look into the future for you guys
There is absolutely no reason games shouldn't be running easily at 4K 60fps on an RTX 4090. The GTX 1080 Ti came out 8 years ago now, and nearly every game could be played at 4K 60fps at high or even highest settings on it when it released. An RTX 4090 is massively more powerful than a 1080 Ti, and game visuals haven't improved that massively since 2017. The fact that someone goes out and buys a 4090 for $1500 and has to upscale 1080p to 4K, and sometimes even use frame generation, to get great frame rates is just insane. I remember Nvidia saying 8k 60fps with the RTX 3090, and technically speaking, yes, the card SHOULD be capable of that. The reality is that the $1500 that was supposed to get you 8k 60fps 4 YEARS ago isn't going to, and instead of DLSS and frame gen getting you even more FPS on a GPU already capable of more-than-playable frame rates, they're being used to make your game reach playable frame rates when it otherwise couldn't. I hate taking Nvidia's side, but their claims were mostly correct, and yet everyone is blaming their $1500 and 400W+ of immense GPU power for being unable to hit 4K 60fps when running a poorly optimized game with graphics and visuals that should easily run far better. Also, I hate to say it, but RTX is by far the WORST "technological" upgrade to games ever. Games made over 10 years ago that looked really good for their time didn't manage it just by doing anything new and fancy tech-wise, but by making smart decisions about the look, effect and direction of their texture and lighting work to give the game a beautiful look.
I think the technology is awesome; it just needs optimizing, or some sort of massive hardware overhaul to run it better, rather than having to literally buy the most expensive graphics card just to run it at 4k.
Devs just need to spend more time on optimization and actually learning the limits and tricks of the game engine they're using, but most studios don't get to because publishers and shareholders keep rushing games out.
And better sound and 16 million colours if on, say, a Gigabyte Z270 XMP i7-7700K, as 192kHz sound is sent quad using a Sony chip; HDMI can't send both graphics and quad, it's using stereo at half quality, and today's Wi-Fi PCs waste lanes and are 42kHz, 4096 colours. We've gone backwards since the M1 and the 80 Ti mono-vcore quads on 4-layer, 16-lane motherboards.
Funny how Nvidia is pushing the 5090/80 when honestly the current-gen GPUs are overpowering most games and the issue is just optimization… what's the point of next-gen GPUs when their job is just to overpower bad optimization rather than running more advanced graphics?
Optimization is already comical and will continue to get worse. Ubisoft's Avatar game uses more VRAM than Cyberpunk does, and many games now struggle to hit 60 fps even on a 4090. The optimization in the industry now is comically bad.
4K was a standard that TV manufacturers wanted for increasing TV sizes. The gaming industry was never ready for it, but they jumped onto the marketing bandwagon.
I got a 4k monitor and it's sooooo much better than 1440p. Also my 6800 XT plays every game out at 4k; sure, it needs upscaling in some games, but upscaled 4k is better than native 1440p, a lot better.
LMAO and there's definitely a point to it. I have a 55" TCL C805 4K 120hz Mini-LED TV and it's amazing. 1080p looks terrible by comparison, even on a much smaller TV.
We excuse developers far too much for not targeting a sustainable frame rate at 4K. It is likewise appalling that upscaling and frame generation are seen as crutch solutions when graphics cards are more expensive than I can ever remember in my adult life. Do better or people just won't bother.
Because hardware used to determine res and we were native. Games from 10 years ago run at different res, but I can play at 4k UHD because I have a GTX card, a motherboard sound chip and lanes at lossless, because my CPU isn't determining my fps or game. But today it's low-tech hardware, CPU AI controlled, and that's why they are multitasking the PC OS in the background in control, hence not gaming or music 3D FX PCs. The 80 Ti quads ran ISO 16 to the GPU, not back and forth; native meant the Windows CPU isn't needed, it's running sound on the motherboard and graphics on the GPU, and SLI 16 x 4 = 64 on the GPU in real time with no latency. Today it's sticking half-binned parts together, AI controlling at less than CD-quality stereo over HDMI, fewer colours, and it can't do native real-time 3D rendering; they load in images and pixels 3 to 1, hence E-cores, laggy, upscaled, with Wi-Fi wasting lanes and speed to memory. We had smooth 4k native in 2016; now Windows controls CPUs to fake one high-end, fast, path-traced, lossless studio 16 sound and graphics.
More important, I think, is to generate enough graphical power with a not-too-beefy GPU while considering power efficiency. If it just comes down to raw graphical power, I think it's not an issue for manufacturers to make such a GPU. All things considered, there is only so much a desktop-PC-friendly card can be; real estate on the board itself can be the issue in upcoming hardware if we do need hardware that can handle ray tracing like it's nothing.
When I dock my steam deck I try to target 40fps in games as much as I can, because most games just won't reach 60 when you try to get to 900p or higher. Not fully analogous but just made me think of it
I was playing Cyberpunk at 1440p during the launch period. Upgraded my rig and went back to Cyberpunk. Able to play in 4k with ray tracing above 60fps. It's very noticeable. Add in DLSS and FG and you can add on path tracing, and it is jaw dropping... completely unplayable, but it looks AMAZING lol
I played cyberpunk with pathtracing on 4k and it's doable but nothing close to native. Dlss looks great though and it's worth using low settings, path tracing, performance upscaling and framegen. That's my preferred way to play and get 80+ fps on 4070 ti super.
@@christophermullins7163 Which DLSS setting do you use? Even at Performance upscaled to 4k it looks good. What CPU are you paired with? I'm also on a 4070 Ti Super. I'm actually playing around with some mods now. I have found a few that, believe it or not, make the game look better (admittedly not in all areas) and perform better too. In most areas with DLSS, frame gen, and a few key settings turned down (I mostly used DF's optimized settings), I'm able to path trace upscaled to 4k and average 70-80 fps in most areas, but stay above 60 everywhere I've played so far. I switched to controller to negate some of the extra input lag added by FG. It's actually an incredibly good experience. Unless you are super sensitive to input lag, playing the way I have it set up is actually crazy; I can get some pretty lifelike-looking stills like you see online everywhere. Path tracing is the future for sure, it is a game changer.
All of these are optimized to run on consoles, so a 4090 should never have an issue with this generation of games. If you're spending $1500-$2000 you should expect to be able to play the current generation of games without compromise.
It's frustrating that the older I get, the more I am willing to turn settings down, but the less of an impact to performance it makes. I look at the newer games I play, like COD MW3 or ARK Survival Ascended, where I might want more performance and lower the settings, but they look so piss poor at lower-medium and don't run much better than at ultra. The Black Ops 6 Beta is out and I can get 150fps at Ultra or completely butcher the visuals and get 180fps. I look back at Battlefield 1 from 2016 and not only does it still look better than most new games, but you turn one or two settings down a notch and I gain like 40fps and can't even see the difference in the visuals. Another game from 2016, Mafia 3, had the problem where if you couldn't run it at max settings you likely couldn't run it at all because turning down to Low barely improved performance, albeit not with a huge quality loss. But most games now feel the same, Ultra runs bad, so does everything below it. I think it really is just bad optimization.
I think it's mostly a CPU limited problem you have. I think so because I have the same. I can turn the settings in a COD game up or down with no change to performance, and it's because I'm quite CPU limited despite having a good CPU from a few years ago.
CPU limited at 4K with new titles? It's hard to find 4K CPU benchmark data because reviewers don't bother with it, since at 4K it doesn't tell you anything about the CPU.
1080p CPU benchmark data is the same as 4k CPU benchmark data (if GPUs were able to push the CPU that hard)... CPUs start to become the limit in the 60-90 fps range. Even the 7800X3D has a couple of games where it struggles to hit 90.
@@vindeiatrix That's because it's done at 4k native so ofc GPUs won't hit that. The CPU problem is when you turn DLSS down to hit the high fps. Like I said, you need to push to 60-90 to see issues.
@@vindeiatrix Look at recent CPU benchmarks maybe. Hardware Unboxed's big Starfield CPU benchmark showed that a 5800X is just shy of 60 fps at 1080p Ultra, so it would be the same at other resolutions with ultra settings. It only slightly improves to 66 fps by turning settings to medium, as most settings don't affect the CPU. In Hogwarts Legacy a 7950X3D caps out at 97 fps and a 7800X3D at 94. A 7700X caps out at 80. A 5800X3D at 67... Those aren't old CPUs. Literally the best AM4 CPU capping out at 67? In Elden Ring I've seen 45 fps zones on the CPU with a 3600X. Unlocked framerate caps out around 80 in less intensive areas.
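The bottleneck logic in this thread boils down to a min() of two rates; here's a toy model of it, with all the numbers invented purely for illustration:

```python
# The CPU can prepare frames at a rate that's basically resolution-independent,
# the GPU rate scales with render resolution, and you see whichever is lower.
def effective_fps(cpu_cap, gpu_fps_at_render_res):
    return min(cpu_cap, gpu_fps_at_render_res)

cpu_cap = 90   # hypothetical CPU limit, taken from a 1080p CPU benchmark
gpu_fps = {"4K native": 55, "4K DLSS Quality": 80, "4K DLSS Performance": 110}

for mode, fps in gpu_fps.items():
    print(mode, effective_fps(cpu_cap, fps))
# Native and Quality stay GPU-bound; Performance slams into the 90 fps CPU cap.
# That's why 1080p CPU data transfers to 4K + upscaling once you chase high fps.
```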
4k Dlss on performance looks better than native 1080p. So 4k is not an issue. For me it's Raytracing. Especially Ubisoft starting this new trend of forced Raytracing.
As mentioned, back in the 2000s games could not run at full tilt on release with commercially available hardware. Crysis took it to the extreme, but in general you probably needed the next gen to max out everything. Also, in CRT days resolution choice was much more open. I remember fiddling about with different settings for shadows and ambient occlusion to get the right performance.
If you are forced to spend $1000+, let alone $1500+, on one single GPU from team green or team red, you should be guaranteed to be able to play the latest games at a minimum of 60fps at 4k with no upscaling or frame gen. Even when Cyberpunk came out, the 4090 was not able to max it out, and that was 2 years later... so saying that this is an evolution of gaming is highly inaccurate. Team green/red need to make better GPUs, simple! And make them able to handle the latest and greatest until the next cycle of GPUs comes out, then rinse and repeat. A 2-year-old game was unplayable at max settings without fake frames and downsampling. Also, if you buy a top-tier card like a 4080/4090, you only buy it for the visuals and maxed-out settings. If you cannot max out every single setting in a game, then what's the point of having the card?? There is none, so the point about turning down settings is also invalid. If I buy a 4k display, I refuse and should not have to go down to 1440p and use upscaling or frame gen; in that case I might as well buy a 1080p monitor. None of these points are valid.
This has been common for generations; only recently have GPUs been so performant that you could max out new AAA games at high resolutions and frame rates. Remember Half-Life 2 at launch? CRYSIS???
4k DLSS Performance mode is the optimal way to play right now. I have been using 1440p for years, and it's not as good. We are close to a 5090 at this point anyway.
I think a lot of friction comes from people who think their money entitles them to click "ultra" on everything and get 60fps+. They don't seem to care that they literally couldn't identify "ultra" vs "very high" in a side-by-side comparison. It's sad that we're forgetting that the important thing is how good it looks (and runs), not what settings or techniques were employed to get there.
These games were legendary for being unoptimised great games. Lords of the Fallen (the 1st game) and Watch Dogs 1 could only hit 30fps with a 980, and could only get above that at 4k maxed once the 10-series high-end cards came. Control couldn't run well at 4K native high settings until the 30-series high-end cards came. The man with glasses said the same thing: games that are very demanding aren't made for the GPUs of the moment. That's why I named the games I did below. Alan Wake still can't run well without drops, even at 1440. Black Faith, a souls clone, can't run 1440 without dropping FPS; it stutters badly and drops FPS using a 4070 Ti at 4K. I still can't play Remnant even at native 1440 high settings; I'm stuck in the 40s and get drops. Hellblade is very demanding, so are Tarkov and Cyberpunk, and they still don't run like they should. I'm thinking all of these games and others will finally run like they should've when the 50 series comes.
I think it's nice that games cannot run with max features at launch. You can go back a few years later and replay the game with even higher settings. It just adds lifetime. With Control it was not perfect on a 3090 at launch, but it is super smooth on a 4090.
I wonder what percentage of 4070Ti Supers have been sold to gamers vs the 4090 since the introduction of that card. For $800, you get 67% of the 4090's performance. You're going to pay $1,200 extra for the other 33% of extra performance. Even at the launch price of $1,600, the 4090 is still charging double for every frame rate over the 4070. Now it's charging triple.
The 4070 Ti Super has 51% of the 4090's CUDA cores, ROPs, TMUs etc. Also don't forget that the 4090 isn't even the full die. Back in the day the GTX 770 had 53% of the full die. So the 4070 Ti Super should be named RTX 4060, or 4060 Ti at best.
I agree with the sentiment, but just FYI if the 4070ti super is 67% of a 4090 then you are paying an extra $1200 for an extra 50% performance not 33% extra performance.
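A minimal sketch of the percentage point being corrected here, using the prices and the 67% figure quoted in this thread rather than any new benchmarks:

```python
# Value arithmetic for 4070 Ti Super vs 4090, as quoted above.
price_4070tis, price_4090 = 800, 1600
rel_perf_4070tis = 0.67             # "67% of the 4090's performance"

# Measured upward from the 4070 Ti Super, the 4090 is a 1/0.67 uplift,
# i.e. about +49% performance; the 33% figure is the gap measured
# downward from the 4090.
uplift = 1 / rel_perf_4070tis - 1
print(f"uplift: {uplift:.0%}")      # ~49%

# Cost per unit of performance, normalizing the 4070 Ti Super to 1.0:
print(price_4070tis / 1.0)                   # $800 per unit
print(price_4090 / (1 / rel_perf_4070tis))   # ~$1072 per unit at the $1600 launch price
```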
I'm a little more upset that it feels like everyone is setting 4k60 as the goal. I feel like games might be scaling better in resolution than in fps these days, with 120fps at lower resolutions being harder to hit than you'd expect.
People should do better with their money, especially those who work for wages. The only way things will change is when overpriced goods are rotting in a warehouse without a single sale.
Imagine telling people in 2014 that Nvidia flagship GPU from 10 years into future would still be rendering at 1080p just to achieve optimal performance💀
Paying $2K for a GPU, we should be able at this point in time to run games at native 4K; we shouldn't have to use upscaled 1080p rendering. DLSS and frame gen are just a bandaid to get by.
4090 owner, and I play games in 1440p on an LG 45-inch ultrawide monitor at 240hz and also have a 32-inch 4k 240hz Samsung G8. I used to think 4k was the best way to game until I discovered 1440p. I'm waiting with bated breath for that 5090!
I’ve yet to play a game that my RTX 4090/7800x 3d combo couldn’t easily handle maxed out at native 4K without frame generation enabled. Of course, once you throw ray tracing into the mix then all bets are off.
I don't agree with the user question. The 4090 is now 2 years old and is due a totally new generation. But the reality is the 4090 is still able, even if it is via DLSS and frame generation, to put out high enough fps, and that's at maxed-out settings. This means it still has headroom with tweaks etc. And playing with VRR, it means that even if FPS isn't 120, most people are fine playing single-player games with fidelity at 60+ fps, which you can get with DLSS. I am skipping the 5000 series; I own a 4090. I can afford to buy a 5090, but I don't see any reason for it. I am fine even if I have to tweak a setting here and there and use DLSS Performance and/or frame generation at times. It's just another 2 years, and likely no more than 5-10 games I will play during that time, and I bet most will be fine at 60fps no problem. I like maxed settings, but I am also no FPS snob demanding 120fps at all times. Not when I have VRR. My gripe was always the struggle to get 60fps when I didn't have VRR, so Vsync would work without stuttering. With VRR I don't have to worry if FPS dips to 50-55 fps, even if it stays above 60 most of the time. Playing an RPG I can easily live with 50+ fps. Also, when I upgrade to the 6000 series the jump will be very big again.
For years, newly launched games have seemed to aim their optimization at delivering ~60 fps in native 4K on flagship Nvidia cards, not counting ray tracing showcases where upscalers' help is needed. It's quite a visible pattern, bordering on conspiracy theory, but it's not that unbelievable that this is how Nvidia and their partners, the devs, sell flagship GPUs to those seeking 4K gaming with maxed-out settings. Slightly lower settings not showing a difference but bringing a noticeable fps uplift also fits the pattern.
4k is an absolute waste of pixels unless your screen is massive, like movie theater massive relative to your viewing distance. Running that natively is just wasting energy :)
Strong disagree, as someone who went from 2k 27-inch to 4k 27-inch and is waiting to upgrade to 4k 32-inch. Massive difference, and my old 2k monitor looks bad relative to the 4k one I currently use.
@@eliadbu There are many factors in image quality, not just the pixels. It's likely your new panel has other features that impact image quality beyond raw pixel count. But the reality is that for desktop use at 27" you are going to run everything scaled up 125% to 150%, otherwise things like text will be absolutely tiny. So most of those pixels you see are just a waste. Yes, 4k is "sharper" through pixel density, but with that scaling a lot of the time several pixels are effectively showing you the same thing. For gaming you are pushing 2.25x the pixel count of 1440p per frame, and unless you are running a movie theater, it's an absolute waste of energy for the returns. Yes, you can see the difference in relative sharpness, but no, it's not worth the energy to push 8.3 million pixels per frame vs. 3.7. I'm running a 49" 32:9 screen at 5120x1440, and I find this a much better use of my 4090 than a 16:9 4k screen, as there's actually more on the screen horizontally.
It depends on the anti-aliasing implementation in each game; there are a few games with such extremely 💩 anti-aliasing that they still look like a pixelated mess even on a tiny 15-inch laptop screen.
@@Sipu79 Text scales with resolution; there is a clear, noticeable quality difference from 2k to 4k in text, so the extra pixels are not "wasted". Same with games: I can clearly see the difference in PPI - going from 108 PPI to 163 PPI is still very noticeable at the distance I'm sitting from the screen, about 60-80 cm away.
@@eliadbu I didn't say there's no noticeable quality difference; you can tell a 1440p and a 4k screen apart due to the pure pixel density. But the quality difference in workspace use is 100% not worth the extra processing power you need to render games. If we talk about gaming applications (which this channel is 100% about), 4k is a waste of pixels and energy.
I do think that devs could make truly high-end games, but they are not doing it because they know that only a handful of people could play them, which means they would flop.
There has been an insane decline in logically selecting and optimizing the most important graphical aspects for any given game, in favour of universal solutions that tend to destroy performance. In-house graphics engines tend to be great performers, while Unreal is just a blob of infinite technologies that end up poorly applied. If a 4090 cannot run a game at 4k, developers need to rethink their concept of what a game is and what the most satisfying balance between fidelity and fluidity is. FSR, DLSS or XeSS should not be used to ensure performance levels; they should only be added on top of smart and logical use of the resources available.
Native 4k + ultra maxed-out settings is child's play for the 4090. 4k + ultra maxed-out settings + some light RT is a challenge, but it can still deliver 60fps in many of those. 4k + ultra maxed-out settings + a heavy ray tracing load is when the ask gets too big, but with DLSS Quality (so 1440p internally, and a basically perfect reconstruction result) it can still handle it in many games. But PATH TRACING went from being a tech preview in Cyberpunk, like "hey, this is possible, but it's not even an official setting, it's just a tech preview to try and benchmark", to games shipping with path tracing in less than a year.
The problem lies in the poor-optimization vicious circle taking place in PC releases over the last few years. Devs rely more and more on DLSS and FG to deliver decent FPS and no longer make that much of an effort to optimize performance. For Nvidia that's fine, since this way people are more prone to buy newer cards with the latest technologies.
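The pixel-density numbers being thrown around in this exchange check out; here's the arithmetic, with the screen sizes taken from the comments above:

```python
# Pixel count and PPI arithmetic for the displays mentioned in this thread.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))   # ~109 PPI (the "108" figure above)
print(round(ppi(3840, 2160, 27)))   # ~163 PPI
print(round(ppi(3840, 2160, 32)))   # ~138 PPI at 32"

# Pixels the GPU has to fill per frame:
print(2560 * 1440)                  # ~3.7M
print(3840 * 2160)                  # ~8.3M (2.25x the 1440p load)
print(5120 * 1440)                  # ~7.4M for the 49" 32:9 ultrawide, close to 4K
```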
4k gaming is only a thing because people use TVs as monitors, so then they had to have 4k on monitors too. Unless you like low frame rates, 4k is ridiculous. 1440p is where you should be, or 1080p if you have a midrange or lower GPU.
It's a $2000 GPU and you defend it for not managing native 4k???? They're scamming gamers all over; the 40 series is weak af. I hope the 50 series will be powerful, but I highly doubt it at the same time.
It's an expensive GPU but it is considerably better than any other GPUs. 4k native is not a resolution that has reserved performance left over for it, it's as pointless as 8k. 4k DLSS Quality is the highest resolution considered.
@albert2006xp It's not the hardware, it's the absurdity of modern auto-tuned games that add nothing to visual quality over older games. Absurd to pay $1700 to take a hit in visual quality at 4K, which is what all the scalers do. I game at 6K because it looks awesome with HDR on my monitor. A game that won't let me do that is a game not worth owning. We need to stop accepting TAA and UE5 and RT as a crutch. Reflections are fine, but you're getting lazier and lazier development and artistry.
Show me a stronger GPU than a 4090 on current tech. Nvidia aren't miracle makers; they can't suddenly conjure up future hardware now. They can only progress as much as current tech allows them to. The real issue isn't GPUs being underpowered, it's that game devs have gotten far too ambitious, are aiming their sights far too high, and as per usual are trying to outclass each other in the graphics department. Cyberpunk 2077 was one of the catalysts for the current route they've taken. They simply saw it in all its graphical splendour at launch and, not wanting to be outclassed, decided to do exactly the same or better in their games. The issue was that Cyberpunk was an extremely demanding game on the GPU (a new Crysis in that regard). So the end result was AAA studios all upping the graphics in their games, which in turn upped GPU requirements across the board. The recommended GPU requirements of games were then raised far too high, far higher than what the average gamer owned. Devs simply need to dial things back a fair amount and aim lower. We don't need games that can only run fairly well on expensive high-end GPUs or future GPUs (or future consoles for that matter); we instead need games that run reasonably well on the current "middling" GPUs that people own (including current consoles). And if that means reducing the graphical eye candy, not aiming for often unreachable goals like 4K, and dialling back on things like the amount of RT, then so be it. Games are simply outpacing the hardware; that's the real problem.
@@robertmyers6488 TAA doesn't exist with DLSS. They're two different things and are exclusive with each other. Most sane people are absolutely fine taking a hit in rendering resolution given upscalers make that hit much less impactful than it used to be just to render more advanced games. You live in entitled land. Most people game at 1080p with DLSS and enable all the graphical candy you refuse to accept. Deal with that.
Native 4K? No problem, but you must compromise elsewhere. In Wukong at that setting I'd need RT Off, settings to High preset, and Frame Gen to get a decent ~110fps. If you're less of a framerate snob, you can surely get away with more.
Black Myth Wukong players on a Steam Deck are loving life when they have a stable 30 on the lowest settings. Meanwhile, 4090 and i9 players will lose their minds if they drop below 60 at 4k. Get your expectations in line with what you have and we'll all make it out of this generation happier with our gaming experiences. That, or decide whether you're here for the game or for the tech demo.
I would worry more about making stuttering in AAA titles a thing of the past than about GPUs being the bottleneck in a newer game at "native" 4K. Doesn't matter how good a game looks if it runs like trash.
Honestly the late ps4 gen games looked almost as good but the performance and optimization were much better. Look at games like Red Dead 2. We need games that focus on having good gameplay and being more fun instead of getting slightly better graphics
4K games are not and never have been broadly played natively. At this point, DLSS Quality is often a better AA solution than TAA. 5 years ago, DLSS Performance looked worse than the internal resolution at which it was rendered. Now, DLSS Quality, Balanced and Performance look very similar. I hate that the industry is leaning on upscaling when making their games, but I don't think we can put that genie back in the bottle.
I think the problem is that the developers are using the RTX 4090 as a benchmark to target 1080p at 60fps. If it runs at that resolution and framerate, they consider it 'optimized' for release. They should be targeting 1080p at 60fps on an RTX 4060 instead.
Most cards can run 4k, especially the higher-end cards. What's killing performance for a small visual upgrade is the baked-in ray tracing that these games are coming out with. I think this is not the right generation for games to be pushing ray tracing, especially with current consoles using AMD chips.
I got a 4K OLED TV, but I still game at 1440p, as it still looks great. 4K isn't worth sacrificing performance. And you can gain a lot from running games on high instead of ultra, as you can barely tell the difference in most games. And I don't like having to use frame gen, due to the lag. 5-10 years from now we'll still be having problems running AAA games in native 4K, no matter how high-end the PC is, due to games becoming more and more demanding and/or badly optimised.
I was crazy (or stupid) enough to buy a 4090, and I'd say 95% of the time native 4k is viable; it's just the odd game here and there where DLSS becomes necessary because of harder-to-drive features like full path tracing and ray reconstruction. Star Wars Outlaws: I turn off ray reconstruction and I can run the game at native 4k in the 80fps range. Same thing with Cyberpunk: turn off path tracing and ray reconstruction, leave ray tracing on, and I am in the mid-50s at native 4k. At that point I am good to use native 4k with frame gen to get over that 60fps hump without feeling any input lag. Full path tracing and ray reconstruction are nice, but both can be hit and miss, as both are still in their first-gen iterations. I see these options, I try them, and I turn them off, because you never notice them unless you stand still and look for them. Still, tech like this is great and is exactly what PC gaming is all about: pushing new features that ride the bleeding edge, or just plain jump off the edge. I may not turn on those bleeding-edge features regularly, but I will always turn them on to try them out.
Nope, definitely not. When it released it did, but this has happened every generation since the 1080 Ti: after every 2 years, which is usually a generation of GPUs, the flagship drops from being a 4k card to a 1440p card. The upcoming 5090 will be the only viable 4K GPU to play with, and anything else is 1080p or 1440p in the latest games.
I feel like 4k native gaming is never going to settle in. If you’re in the 4k game it’s like no GPU is ever going to be much for long if at all. Future proofing I really don’t think exists at 4k. That’s why I feel like 1440p is the best resolution. IMO 4k isn’t THAT much sharper than 1440p. Like imo it’s close but the performance you get at 1440 obliterates what you get at 4k. So like I just feel like as a whole 1440p pros really outweigh 4k. You’re not gaining much if anything that’s worth that performance hit.
I’m sorry but no, 4K has been a thing for over 10 years now, GPUs have gotten like 10x more powerful since then and games don’t look 10x better than something like battlefield 4 from 2013. Like how can you look at an RTX 4090’s specs and think “yeah, this GPU is too weak it needs upscaling”. The only time I can justify that thinking is if the game has full path tracing, which only a few do.
4k screens and Blu rays, sure. Real time rendering? Not even remotely. The entire reason xess, fsr and dlss have become essential for all but those with the highest end gear is because hitting 4k natively on typical gaming pc's, let alone consoles, would require significant scaling back of visual quality, just to bump resolution. And no, gpu's have not become 10x more powerful. New features, and designs have added to gpu abilities, but in raw rasterization a 4090 is roughly 4x a 1080ti.
@@thelonejedi538 The GTX 780 Ti, a 2013 GPU, supports 4K and could even do it in a new game at the time like Battlefield 4 at high settings, 60fps. The 4090 is about 8x more powerful than a 780 Ti and about 10x more powerful than a regular GTX 780, all in raw rasterization. You can ignore the 10x figure and just pay attention to the 8x part: games don't look 10x better, and neither do they look 8x better. So yeah, upscaling on a 4090 shouldn't be needed unless path tracing is in the conversation.
@@yancgc5098 This is a weird example. You could reasonably expect drops below 60 on the 780 Ti running BF4 at 1080p (Digital Foundry has a video showing just that). It's normal for new games to struggle maxing out current high-end hardware. Those settings, if anything, are more for future-proofing for new hardware. Most of these games coming out look just fine on current consoles, which are far from running games at max quality and resolution.
You are not taking into account how many graphical cuts the game would need to make for native 4K to happen. Games were more restricted when all we had was native. Now they can use more power because we all run upscaling.
It's OK that sometimes the software steps further than the hardware; I just don't see enough improvement in graphics to justify the fact that a 4090 struggles at native 4K. Even with ray tracing, path tracing, egg-yolk tracing, etc., etc., the flagship card suffers under all these effects and ends up being insufficient, and it comes with a really small graphical improvement.
It's more like developers are cutting corners and abusing upscaling techniques and frame gen instead of optimizing their games, just for a paycheck. They're the ones who are holding the hardware back. The point of FG and upscaling should only be to get better performance for ray tracing; it shouldn't be a major focus for rasterization.
This is a take only accepted by people who don't know anything about game development. Don't get me wrong, it's true sometimes. But the idea that devs have gotten lazier or less competent is just silly.
@@DanKaschel I'm not saying all developers are lazy. There are games out there that are optimized, but games like Black Myth: Wukong and FF16 are both poorly optimized on PC and PS5, and those games abuse upscaling techniques. The fact that you're defending upscaling techniques is insane. I would rather have native 1080p than 1080p upscaled to 1440p with so much shimmering, ghosting, and other visual artifacts, like FF16 on PS5. Btw, we're talking about the 4090 here (DLSS is an exception because the quality is good), but no way are you advocating for developers to rely on upscaling techniques; at this rate we will never achieve 4K gaming.
I know DLSS is a great thing for lower-end hardware, but I feel it's used as a crutch by developers. I upgraded a year ago from a 1650 Super to a 4070 and a 1440p monitor, and I hate how in some games DLSS is necessary. I know it looks fine, but still.
Native 4K will never be achievable long-term on any GPU, not even a 5090 or a 6090. Developers will just keep increasing visuals and not really optimizing anymore, since that allows them to pump out more games faster, and you really can't sell an RTX 5090 if a 4090 can do just as well, so GPU makers incentivize developers to use more and more features: RT, more polygons, more detail, and so on. The only way to keep playing at 4K forever is to lower settings and use upscalers.
Yep, bring out unoptimised games that barely look better than those from 2-3 years ago but are twice as demanding, so you feel like you have to upgrade to the 5000 series.
@@PCZONE1 It's funny how the latest games are always *just barely* too much for Nvidia's highest-end GPU, right before they release their next line of GPUs.
@@PCZONE1 Try 17(!) years ago (Crysis). RDR2 is a nice recent example as well of a great-looking game without RT/PT. The whole RT/PT/upscaling thing was rotten from the start.
Frame generation only applies for those who want to run their games at over 60fps. Otherwise it's just not nice to watch. DLSS and FSR will be doing the heavy lifting, unless the game is CPU limited, so it's not necessarily the GPU that needs to be improved. Native 4K isn't necessary these days.
Actually, that's exactly what they want you to believe. The 4090 is a fucking beast in reality and you know it. Just play old-gen games and you'll see what a massive difference there is in performance, but not in graphics. We could have another 3 generations of graphics innovation if gaming corporations put in the effort on current-gen hardware.
4K was never really that viable. The industry tried going from 1080p straight to 4K, that's insane.
1440p is the sweet spot.
Even at 1440p you can DLDSR to 4k and DLSS down for the render res. The point is 4k is a perfectly viable end resolution, it's just not a viable render resolution. Not that you couldn't make it work, but it's just wasteful to reserve performance for it.
No they didn't. 1440p is a thing.
High frame rates at 4K is easily achievable and looks miles better than 1440p on a display over 32". The problem is when you turn on high/ultra/extreme settings which add a massive overhead.
@@harlemstruggle Yeah, find me a 1440p TV then.
I’m sorry but I am not spending 2 grand on a GPU to play 1080P upscaled.
And no one in their right mind should. It's fucking ridiculous lol.
Then just turn off ray tracing?
No one buys a 4090 and plays 1080p upscaled for free. It's an option every single person playing like that makes for otherwise unavailable features.
You're spending that much to have more performance than the other GPUs. You don't see those of us will lower cards complain about using 720p render resolution. That still looks plenty fine. 4k DLSS Performance looks insane for most people, it's not anywhere close to actually having a 1080p like MOST PEOPLE HAVE.
You don't need to. The 4090 is for a minority who don't think remotely close to you. In fact you are not the demographic for that card. The 4060-4070 is more than enough for everybody else....or you can buy a used card and save even more money.
@@KhromTX you are poor but not everyone is. If you have the money and refuse to spend that much on a card you are still poor.
Worth noting the way that raytracing scales with the number of pixels being rendered. Going from 1080p to 4K in a pathtraced game will be more demanding than going from 1080p to 4K in a traditional game. Performance falls off a cliff once you hit a certain number of pixels/rays.
Pathtracing also pretty much solves scene complexity, you can throw as much shit on screen as you want and it will run just fine portal RTX and Cyberpunk Overdeive don’t run all that different despite one being a corridor puzzle game and the other being an open world mega city, All it takes is the shader grunt to push the pixel count you are looking for. Also in the last 2 generations Nvidia has managed “only” +60% in raster per generations but +125% in path tracing each generation we haven’t seen generational scaling that good since the late 90s / early 2000s. It’s fertile ground for growth like we haven’t seen in decades. If they keep that up another 2 generations the 6090 will be pathtracing at native 4K easily north of 75fps.
That's why they invented dlss, to calculate those things BEFORE upscaling..
Yeah, games made in 2024. In a couple of years the 6090 will get hammered anyway. Unreal Engine 6, anyone? @@gameguy301
@@gameguy301 They can't keep making obese GPUs at the top end though; you shouldn't need a new case for a 4090.
@@gameguy301 scene complexity is never an issue with PT, its always the pixels, even movies render path traced scenes at lower resolutions cause of it.
People are just mad they got suckered into paying 1600$ for a GPU. There have always been games that could not be played at max settings on contemporary hardware. Just turn down the settings and learn from your mistake when the 5090 gets announced.
Some people think that if you pay more you get a card made out of unobtanium or something. You're only getting +46% performance over the half price 4070 ti super.
Maybe but a lot of the folks who bought a 4090 aren't exactly struggling and are not the people who are willing to compromise. They'll turn on DLSS quality because it looks as good as native but won't compromise much beyond that. And I'd guess a large portion can absorb a $1600 expense every 2 years. And if you factor in the resale of the old card, it's less than that. So it was very easy to know exactly what one was getting into when they bought it.
@albert2006xp To be fair the highest end of GPUs have always been top dollar for not much in extra value but people are willing to pay for the best of the best at the time.
@@Bargate Which is fine, when they understand that and don't think they bought a spaceship.
Believe me I NEVER ONCE felt suckered into getting my 4090. I just stick with 1440p and have a whole lot of extra headroom. I can play anything at max settings max raytracing at 1440p 144fps
4K 4090 gamer here. You can definitely forget about native 4K with "extreme" settings on newer, intensive games. People overestimate how powerful the 4090 is. :) DLSS is a life saver here. I personally dislike frame generation, something about it feels off to me. Maybe it will improve in the future. But, yeah, some compromises are necessary even to hit the 60 FPS "minimum", and I still rely on optimization guides (like those made by DigitalFoundry).
The 60fps brainworm is the worst thing to ever happen to pc gaming. The 4090 is perfectly capable of 4k ultra in almost every title if you go by how it feels to play rather than whether you can hit a completely arbitrary number. Only times I've needed DLSS at all was for path tracing.
@@thomasjames5757 Nothing "arbitrary" here. 30 FPS looks bad to me, and to a lot of other people, too. We'd rather remove some bells and whistles instead.
@@EmblemParade Yeah I just tried a little Wukong at 30fps. Feeling genuinely physically sick after that. Been loving my 40 series for high fps at 4K - quite happy lowering settings to achieve that.
@@EmblemParade 30 fps definitely feels much worse but the difference between 55 and 60 isn't that impactful. As long as the game stays in your monitor's VRR window. I think what he means is you don't have to get EXACTLY 60+.
@@albert2006xp You're splitting hairs here. Sure, nothing magical about the number 60. But if I'm honest, even 60 is a compromise. Once you go above 120 it's hard to unsee. :) Still, 30 is way too low for my tastes.
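To put rough numbers on the 30/60/120 debate above, here is a quick frame-time sketch in plain Python (nothing game-specific; the 40 and 55 fps rows are only included because those targets come up elsewhere in the thread):

```python
# Frame time in milliseconds for common frame-rate targets.
for fps in (30, 40, 55, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

# 30 -> 33.3 ms, 60 -> 16.7 ms: going to 60 saves 16.7 ms per frame.
# 60 -> 16.7 ms, 120 -> 8.3 ms: doubling again only saves 8.3 ms, hence diminishing returns.
# 40 fps (25.0 ms) sits exactly halfway between 30 and 60 in frame time,
# and 55 vs 60 differs by only ~1.5 ms, which is why it's hard to feel inside a VRR window.
```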
3440x1440 gamers happy they never went to 4k.
Even then, the RTXDI setting in Outlaws zaps frames down to 65-70fps from 120-130, even with frame gen; too low for me, that's for sure. That's with DLSS Quality. Tbf you expect more from a $1600 GPU paired with a 7800X3D. I guess, like they said, they designed that setting for the future; it's just a shame today's enthusiast-level GPU isn't enough.
How could they be happy with a weird screen that looks like a TV cut in half? Games look really bad on them. Why would you want a screen with a huge horizontal viewing range but a teeny tiny vertical one? No games look good like that lol
@@Dempig 21:9 monitors look great. You have no idea lmao. Same height as 2560x1440 just extra width for more details, better field of view.
@@Dempig Cap. You have deffo never used an ultrawide.
@@VirulentPip or just get a 65"+ tv for a much much better view. First person games especially look ridiculous on ultrawide
People sleep on 1440p like it's the middle child between 1080p and 4K. XD
Sounds like what a broke boy would play on.
@@kolz4ever1980 It's so sad that anybody would use the term "broke" as an insult.
@@DanKaschel sounds like another broke boy living off my taxes just replied...
@@kolz4ever1980 I doubt a blue collar trump supporter pays enough in taxes to support me.
See? I can make fun of people for stupid reasons too.
1440p is like the perfect res for bigger monitors. 4K already goes into diminishing-returns territory and is 2.25x the pixels to render. 4K can be nice for creative applications, but for gaming with lots of motion, might as well put that slight sharpness increase into better graphics.
"Recalibrate your expectations", if I'm a person spending $1600-2000 on the card alone then I'm going to tell you there's really not much room for recalibration. I was there for Crysis, I've been there for Quake. I've been there for Windows 3.1 and the 486. But I wasn't spending $2000 (or the equivalent with inflation) where the 720p 30fps medium-high details at the time for the former was a pain point.
For me it is not even the 4K issue, it is instead games coming out where high end cards are struggling with 1080p native, with visuals not being that much improved compared to older titles.
I wonder if they've messed with the scaling so that increasing the resolution isn't as hard as increasing framerate past a point.
What high end card is struggling at 1080p??
@@mondodimotori Check out benchmarks for Black Myth: Wukong at 1080p native with max settings (max RT) and no frame generation. The RTX 4080 Super struggles to maintain 50FPS; the only card hitting 60+ FPS is the RTX 4090. This is why you see most review sites benchmarking at 67% scaling, which at a 1440p output is an internal render resolution a little under 1080p.
PS: benchmarks that use upscaling with a 1080p output are doing a disservice, considering that DLSS and FSR offer horrible results at such low internal render resolutions. Too many distant details drop to a level where the upscaler does not have enough info to go on. Trying it with the free benchmark tool, it looks horrible in that game.
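For anyone wondering where the "a little under 1080p" figure comes from, here is a minimal sketch of the arithmetic, assuming the 67% scale applies per axis (as most in-game resolution-scale sliders do):

```python
# Internal render resolution at a 67% per-axis scale with a 1440p output.
out_w, out_h = 2560, 1440
scale = 0.67
in_w, in_h = int(out_w * scale), int(out_h * scale)
print(f"{in_w}x{in_h}")  # 1715x964, just under 1920x1080 in each dimension
print(f"{in_w * in_h:,} px vs {1920 * 1080:,} px for native 1080p")  # ~1.65M vs ~2.07M pixels
```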
@@Razor2048 because Wukong has settings that are normally disabled in UE5 games and meant to be used for tech demos only. The cinematic settings are enabled, so the devs weren't accused of downgrading the graphics and the game can scale graphically on future hardware. The game looks and plays perfectly fine on an RTX3070 from 3.5 years ago at 1080p/60FPS with high settings (no upscaler, or 1440p with 75% dlss). The same goes for Alan Wake, and SW:O.
@@gabber_ I'm using a 3070 Ti. It looks terrible with those exact settings, bro. Blurry as hell, and inconsistent frame rates between 60-80.
I always play games from previous gen of hardware or older so everything can run like a dream maxed out with very high framerate.
I've started doing that; playing Metro: Last Light at max settings at 170fps is so good.
@@chimpantv It's awesome; you play games when they're actually finished with patches, and what was considered futuristic back then runs on your current mid-range hardware.
@@bjarnis Totally agree, it's hard to choose which game to max out next xD
@@chimpantv Have fun buddy, never fall for their FOMO propaganda 😉
Not a bad idea.
You know what's funny? Seeing people crap on $400 consoles for not being able to run games at 4K@60fps, and then seeing a discussion here about a $2000 video card also not able to run new games at 4K@60fps, and the comments going "oh but we went too quick to 4k" or "that's why I play in 1080p/1440p".
the irony
The consoles were $750 at launch though.
@interstate_78 preach
@@juanme555 no they weren't. At launch the PS5 was $499 with a disc drive and $399 without one
If a game needs a 4090 with DLSS upscaling for 4K, it really shouldn't exist as anything other than a tech demo.
If it only "needs" a 4090 because of path tracing or ultra settings, that is a completely different thing.
Frame generation should NEVER be used when showing frame rates in reviews.
1080p or 1440p, if path tracing is maxed without modifying the config/settings file. 🙂
I disagree, I think technology will continue to move and priorities will change. I'm willing to deal with the occasional stumble in a game if it means progress in graphics and immersion over time. Otherwise we would still be using directx 9
How dare a game exist and use the current technology if you can't do something dumb with it like using 4k native. /s
@@tpate123345 Tbf DX9 was goated. And while DX11 did a fine job as well, DX12 is terrible. So in that case the "progress" was actually not that great 😔
@@imlegend8108 I understand, but I'm thinking of Dx14, or even 15. We get there by learning from everything we do now. Fingers crossed that future leadership in these companies understand that
1:40 This is something big to consider, and it's true. Once hex- and octa-core processors became the norm, GPU capability also went through the roof. A $500 CPU from just a couple of years ago is struggling to keep up with the single-core performance of budget chips today. It's much like the late 90s into the early 2000s: a whole generation of hardware became obsolete in the blink of an eye.
Not an accurate depiction of CPUs from the 1990s to today 😮.
It's games that still mainly use single-core performance. Engines are not optimized, and UE5 is the prime example: it still has issues with multithreading up to v5.4, and Nanite basically only cuts dev time by making everything less optimized. Games are getting demanding because there is less and less work required on the tech side to make them run properly. Devs just tick the DLSS and frame-gen boxes and call it a day; then come traversal/shader-comp stutters, bad memory/VRAM usage, bad CPU usage. People might disagree all they want, but it's just the reality of things; they might not like it, but that doesn't change a thing.
It's not really that accurate. I have a 9900k from 2018 and it still performs perfectly well in anything I throw at it and usually in the 120fps+ range. It's mainly the cheaper 6 core chips (like Ryzen 3600) that are starting to struggle but that was a given when the consoles shipped with 8 cores of the same generation.
@@enricod.7198 You have no idea how hard multithreading is.
It's not "lazy devs relying on cheats". Even professional software that costs thousands of dollars per year struggles to take advantage of all the cores of a current CPU. Look at AutoCAD.
This "lazy devs don't optimize" argument is dumb.
No, hyperthreading being hard doesn't negate how lazy developers have gotten.
1440p on an ultra wide has always been great for me on PC with a 4090. I can run anything at ultra settings with decent frame rates, and I’ve never once wished I had a 4k display. It’s just always seemed like overkill, especially when sitting so close to the display anyway.
Good point. That viewing distance is why I just switched to this 4K 32" - because the majority of gaming time I like to grab the controller and sit back. Been great fun so far.
Totally agree, I can’t imagine needing more than 3840x1600 - it looks amazing paired with a 4090
I would have agreed with this statement until every game from 2016 onwards decided using TAA with a blur filter was a good idea. MW22 looks blurry and gross even at 4K, but most games look great. I always found games to look blurry at 1440p, kind of like I needed glasses, and then 4K is like wearing those glasses. For games like Battlefield, PUBG and Warzone it makes it significantly easier to see at a distance; it feels like cheating imo.
Same! 3440x1440p OLED is beautiful and plenty of eye candy. I don't really care to up resolution as I cannot tell the difference unless it's literally side by side. I prefer a high frame rate
@@skywalkies77 I use that same resolution on my lg c1 oled and it looks amazing
If the card I spent a mortgage payment on cannot handle native 4K, then we've fucking lost the plot as consumers, and the fact that y'all are defending regression in performance is insane and bewildering. $1,600 on a GPU means high expectations, and it is not unreasonable for us to expect NATIVE 4K. Instead they shove graphs at us using god-awful smeary frame gen and FSR/DLSS at potato internal resolutions with bad latency, and tell us to focus on "FPS number go up!!!", but the image quality is butchered in the end. At which point we're paying $1.6k for blurry gaming at 1080p? Thanks? I was playing native 1080p games 12 years ago on my $215 GPU, as seen in my oldest videos, and I'm being told that 4K is a waste and blurry 1080p upscaled to artifact hell is somehow better? It's like I'm being gaslit lol. Get real guys, seriously.
The problem isn't the card, but lazy game devs. They rely on you turning on DLSS so they don't bother with optimization.
This is thanks to Nvidia trying to push ray tracing when the tech just isn't developed enough to reach a level comparable to rasterization, along with new technologies and "optimizations" that demand 5 times the performance for a marginal increase in graphical fidelity.
There have always been unoptimized ports, but the problem isn't optimization; it's that we are trying to use technologies that are not ready yet and that are not worth using at the moment.
Look at how the graphics settings in games have changed. 5 to 10 years ago, playing on low looked horrible, medium looked acceptable but far worse than high, high looked very good, and ultra looked amazing. Now low looks acceptable, and medium, high and ultra sometimes look almost exactly the same, even though high and ultra lower performance by a lot.
Why would anyone leave half the performance on the table for you to waste on 4k native? Use DLSS Quality and double your fps. Stop wasting performance. 4k native is not a resolution that is supported anymore, it's basically as pointless as 8k.
@@albert2006xp you are both proving and failing to see his point
@@islu6425 His point is trying to equate how much money Nvidia makes him pay with how much resolution games are tuned around. Resolution will never go any higher than this; games will get more graphical fidelity instead. Resolution with upscaling is good enough as it is, and 60 fps is around the right balance point for framerate, too. Resolution and fps are always going to be a balance of how low we can go so the performance can be spent on the game itself. Upscaling simply shifted that balance, because it made lower render resolutions actually look good. So now they're acceptable, and games can be pushed further.
Resolution increases were much more incremental back in the days of CRT. Even early LCD days you would see jumps from 1050p to 1080p.
but that was going from 16:10 to 16:9
@@Koozwad His point stands.
@@christophermullins7163 not necessarily, since even today many cards still struggle with 4k
cards of today support 8k native, but that doesn't mean they can run modern demanding games at that, especially maxed
something like 20-30 years(!) ago, cards supported a max. of 2048x1536, and then, from memory, 2560x1600. Then eventually, where we are today(probably some more in between).
if you were to look at an actual timeline of resolution support, you would see just how slow progression has been
@@Koozwad You're arguing one point with something different altogether. This is an impossible conversation because of it.
@@christophermullins7163 why do you think so? how long have we been stuck at 4k now? 10 years? even longer when it comes to cards supporting it?
Here I am happy with 1440p/60fps. Don't fall for the 4k hype fellas.
Black Myth: Wukong's latest patch added like 10+ fps. My first playthrough I was around 55-60fps; now I'm around 65-75fps.
I understand that you guys are really excited about frame gen and DLSS and equivalent technology. It is neat. However, I am sorry, but it looks terrible. Outlaws on anything less than DLSS Quality looks awful; the artifacting is really bad, and it's really blurry. Ray tracing cannot be turned off, so to run it on reasonable hardware that doesn't cost the same as a car, you need to turn it down low. And frankly, ray tracing looks rubbish on lower settings: the weird blurriness of it because it is low resolution, added to the constant flickering as each ray is processed. It looks pants. Lighting was better in games 6-7 years ago. I think with you guys trying all the latest tech you may have completely lost touch with the average gamer who is not running a 4090 and the latest greatest CPU and RAM.
Stop trying to run max settings or using features with less than adequate hardware. Games like Star Wars Outlaws are using technology which won't be fully enjoyed until future cards arrive and that's a good thing as it's exciting and pushing the technology. The graphics are beautiful if you use settings that are within the reach of your GPU.
@@grahamt19781 You can't polish a turd.
@@grahamt19781 You can't make poop shiny no matter how much you rub it.
@@Moonmonkian no, but you can roll it in glitter
@@grahamt19781 Is that what we're calling DLSS these days? It's just as irritating to the eyes. Fabulous.
Oliver is lit like a Crysis character. Very fitting
3:37 - Finally someone says it! The fact that many modern GPUs have been able to run games at high framerates on the highest possible settings is a relatively new phenomenon.
It's been quite common throughout the past that games push graphics technology so aggressively that it creates a situation where hardware is forced to catch up, and we're starting to see it happen yet again. This part isn't new.
Honestly, I think people are just upset they drank the Nvidia koolaid and dumped nearly two-thousand dollars on an RTX 4090 when it only has a 20-40% performance advantage over cards that're less than half the price. I'd feel pretty sour too if I let my brain disregard the comparison charts because "more money = more better" and I still didn't get the performance I'd hoped for after buying a GPU that was the most expensive in the market by such an extreme measure.
I believe that the 4090 would've been a great deal at $1300.
@@keymo9359 For a flagship/halo product - sure. I agree that'd have been a reasonable launch MSRP.
You don't even need native!! I use my 4090 on a 65 inch OLED TV and I've compared native 4K so many times to 4K + DLSS, and the difference was always so damn little to my eyes, so I always use DLSS Quality for the better efficiency! I'd also use DLSS even if I got the same FPS as native 4K!!!
@@Serandi1987 Honestly, for 2160p, even DLSS Balanced looks shockingly good. That upscaler is practically magic.
It's insane that on a channel based on video game graphical fidelity you keep recommending upscalers in Performance mode. There's a noticeable dip in clarity, especially in motion, when you go anywhere below Quality (or Balanced if I'm generous), even with DLSS at 4K.
You guys love upscaling so much
Other than Tom, who actually recommended the 40 FPS visual mode on Star Wars: Outlaws, everyone else, especially Olivier, has somehow convinced themselves that abysmally small internal resolutions upscaled to 4K are somehow okay and acceptable, and it's so antithetical to the channel that I've honestly lost respect. It sometimes shows a fundamental misunderstanding of what they're talking about. DLSS, even though it's come a long way, STILL has noticeable shimmering/flickering issues and ghosting, and they don't really acknowledge that very well.
There's a noticeable dip in image clarity, yes, but it often comes with the benefit of headroom that, if turned into advanced rendering features, can pay off.
They never pretend 4K Performance "looks like native 4K"; they just think hyper-pristine visual clarity is not the be-all end-all of graphics.
@@KhromTX Uh? Have others recommended Performance mode?
Dlss quality or bust!
Consider the amount of graphical fidelity you can ADD on screen with the performance you save, though. Most people play on a 1080p monitor and are fine. 4K DLSS Performance looks massively better than even DLDSR+DLSS on a 1080p monitor, which already looks insane. The performance cost of going to 4K native is like going back 3 series of GPUs...
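A small sketch of the render-resolution math behind that comparison. The per-axis scale factors below are the commonly quoted DLSS defaults (Quality ~0.67, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.33); individual games can override them, so treat the exact numbers as approximate:

```python
# Shaded pixels per frame at a 4K output for each DLSS preset vs native.
modes = {"Native": 1.0, "Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Perf": 1 / 3}
out_w, out_h = 3840, 2160
native_px = out_w * out_h
for name, s in modes.items():
    w, h = int(out_w * s), int(out_h * s)
    print(f"{name:<11} {w}x{h}  ({w * h / native_px:.0%} of native pixel count)")
# Performance mode renders 1920x1080 internally: a quarter of the pixels of native 4K,
# which is the kind of gap the comment above is pointing at.
```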
I mean, did people completely forget about Crysis? In which generation was that game finally completely playable maxed out at 60 fps? Certainly not on the highest-end GPU when it released, and not on the gen after that either.
The more important question is how much more power do we get with the 5000 series. If the 5080 is on par or faster than the 4090, ok. But then how many % of an uptick does the 5090 have over the 4090? 40%? Because in my mind it would need double the performance to justify the probable exorbitant price tag. But we’ll see.
Not to mention the fact that the 5080 will have significantly less VRAM than the 4090.
@@captainshiner42 not 16 GB?
The only information we have is leaks, and they say the 5090 is about 30-40% more powerful than the 4090 and that it will use 600W.
Rumors also say that the 5080 will be 0-10% more powerful than the 4090 and use 450W.
But those are just rumors / leaks / speculation, and I would take them with a grain of salt.
And as a 4090 owner with some money to burn, I would take the 5090 into consideration if it were 60%+ better (and if I could sell my GPU for half the cost of the 5090).
Also a note: we don't know if that "more powerful" is raster or RT performance.
@@captainshiner42 I wouldn't be so sure about that.
I could see 24 or 20 gb being the new norm for the XX80 series going forward. A 5080 with 16gb of vram would be weird but I also wouldn’t put it past nvidia.
Thing for me is, having seen games that already demand more than 16GB of VRAM at 4K, I'll be going for the 5090 if the 5080 doesn't have at least 20GB of VRAM. Still stuck on the 3080 10GB, which is fine, but it's showing its age.
Having a 4090 myself, it seems like 100% of the time my CPU can provide frames at a higher rate than my GPU. I haven't played anything where my GPU isn't at 99%-100%. Hardware Unboxed always does CPU testing, and at 1080p ultra many modern CPUs tend to provide way higher fps than I'm going to get with my 4090 at 4K. Even Tom's Hardware has said they don't do CPU tests at 4K due to an almost guaranteed GPU bottleneck. I wish it was mentioned which games are CPU bottlenecks, and why it is thought to be the CPU. I can see it at lower resolutions, but generally if you are upscaling you are likely to be GPU bottlenecked already, unless you are fine lowering graphics settings to hit max CPU fps.
I have an 11700K CPU. Is it worthwhile for me to buy the 4090, or is this CPU going to hold me back at 3440x1440 and/or 4K?
It's more of an issue for people that adjust their DLSS for 90 fps or so. Though if you have a 7800X3D only a couple of games will have problems before 90 fps. See: Hogwarts Legacy as an example that comes to mind.
From the top of my head, Dragon's Dogma 2 and Starfield are two titles that get very CPU-bottlenecked in urban areas. But the most common CPU issue would be the stuttering, most notably shader compilation stutter.
With modern games on my 4090 at native 4K I'm pretty much always GPU-bound, and if I set DLSS Quality I'm still mostly GPU-bound. Only in a few select, horribly CPU-optimised games am I CPU-bottlenecked, but not many.
@@djpep94 Get what you like, and if your CPU can't keep up, upgrade it later.
Don’t need to play at max settings for 4k.
Yeah, so many unnecessary options that hit performance.
No need whatsoever. People will spend themselves into a hole if they keep striving for maxed-out settings at the highest possible resolutions. These games look and run just fine on machines that cost half the price of a 4090 build.
@@Alex-kn7cb The upgrade pit.
That’s because people want to run native 4K, max settings with raytracing on. Also triple A games are getting more demanding and unoptimized.
The issue is that game developers should prioritize optimizing for console-level graphic fidelity when preparing a PC release. This entails building the game on PC hardware that closely mirrors the specifications of the PlayStation 5 and Xbox Series X, ensuring that the PC version is fine-tuned to extract the maximum possible performance. The goal should be a minimum of 60 FPS on mid-range PC configurations with no image upscaling. This way, players using higher-end hardware, such as an NVIDIA RTX 4070 or better, would experience exceptional scalability, offering 4K resolution at 60 FPS (or higher) on max settings for every game. Proper optimization ensures that the PC build benefits from increased resources without compromising performance across different hardware tiers.
People still praise and want Unreal Engine 5, which is actually a step back from a lot of optimizations that were developed through the years. Yes, less work for devs, but it ends up giving worse-optimized games, and it shows. UE5 started out barely better-looking than late UE4 games while running 3x worse. It turns out much of the new tech, like Nanite, wastes a lot of GPU power for nothing, which used to be avoided using LODs. The future should have been AI to automatically generate LODs for reference meshes, not Nanite.
@@enricod.7198 Yeah, agreed. You can kind of tell when performance drops around 20% or so and the general file size grows dramatically just because a project switches from UE4 to UE5.
@@enricod.7198 lods have pop in, nanite is smoother. It's definitely more costly to use but it will age well.
What? Just play on lower settings; console games don't even run at max graphics.
No high-end card was ever able to keep up at max settings; nothing new here.
Even the goat 1080ti fell behind at 4k very quickly. I'm sure the cynics were ranting about how Nvidia was purposely weakening the card in order to sell their next flagship while ignoring the fact that graphics are improving.
I guess one could say that things are different now, because UE5 was designed with upscaling in mind, so now even the 4090 needs to use these new features at ultra settings. Once nanite completely eliminates lod pop-in in games, the visuals will be way better, and the next gen cards will be able to handle the features with better or no upscaling at all.
All Nvidia needs to do is not price people out of these features, and provide enough vram.
Native 4k and Max settings just aren't necessary, it's a nice to have if you have expendable money, don't feel like you have to.
Then what's the point of the PC master race?
If you're buying a 4090 you have expendable money lol
@@kainairsoft2331 Less DRM locks, dramatically superior multitasking, the ability to game on something that isn't a controller, guaranteed backwards compatibility, more scalable in-game settings, better VRR, SteamVR, and the overall freedom to personalise your hardware and software alike? Consoles are affordable and reliable gaming machines, but being on PC is about so much more than boasting about having a graphical edge over console with a $2000 rig.
@@kainairsoft2331 Customization? Modding? Uses outside of gaming? Upgradability? Keyboard and mouse controls? Better-than-console experiences even if they aren't 4k or max settings?
The promise of PC was never "buy whatever and get 4k max settings", it never even was "buy the best and get 4k at max settings". It was, the better hardware you buy, the more advanced features you have, but wanting these advanced features to always be "max" is just being hyper fixated on naming instead of experiences.
@@iurigrang All of these pale in comparison to just pressing a button and everything working. There are lots of problems associated with using a PC for gaming, and I would know, since I started building PCs in 2004.
I have a 14900ks right now, asus rog strix OC 4090, Z790 asus rog maximus extreme mobo and 128 GB DDR5 corsair dominator titanium.
I still prefer my PS5. My PC already feels like a scrap of outdated crap, while on the PS5 everything just works.
You have another set of expectations if you spend 5-8k every two years on your PC. You expect the best. You pull up the fps monitor to see the performance, usage and temps. You try to get the best out of your PC. Consoles don't give me such anxiety. And that's only if the port is actually good and not some unoptimized BS.
I won't speak to usage outside of gaming, as the PC master race is aimed specifically at high-end settings and fps. You can't laugh at a console if you need DLSS from 900p for a 4090 to run UE5.
I got 22-24 FPS in Black Myth: Wukong at native 4K, which is unplayable. When you turn on frame generation and pull up the in-game menu with stuff in it, you can see some glitches around the text. And I have to use 85% resolution scale plus frame generation to play at 60FPS; otherwise it's unplayable.
Some people really underestimate how heavy 4K alone is, let alone when coupled with high-quality ray tracing. The more pixels you render, the more rays you need.
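A very simplified sketch of why that is: primary-ray work scales with the number of pixels shaded, so the resolution multiplier applies directly to the ray budget before bounces even enter the picture. The rays-per-pixel value below is purely illustrative, not taken from any real game:

```python
# Pixel counts and a toy primary-ray budget at common resolutions.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
rays_per_pixel = 2  # illustrative assumption
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} px, ~{px * rays_per_pixel:,} primary rays/frame, {px / base:.2f}x the 1080p load")
```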
One issue to discuss is how games downscale textures, shadows, draw distance and lots of other things when you drop the render resolution. If those were independent of resolution (Unreal Engine is a prime subject for a trial), using DLSS on Performance wouldn't look like using low settings anymore.
"If you're CPU limited then there's not really a lot you can do to solve that, which is why frame gen is so important."
INCORRECT. That's why *optimization* is important. We need to stop pretending like game developers are powerless and the software just is what it is. Upscaling and frame gen are crutches that lazy developers use to save money by not optimizing their code to run well. There is absolutely no excuse for your game not running fantastically well on thousands of dollars worth of hardware.
Unless you have a Degree in Computer Science, equivalent qualification or industry experience please stop using the word 'optimization'.
@@fcukugimmeausername It's a well-known phenomenon that as hardware gets better, optimization gets worse. Look at the new Capcom games, which struggle to hit even 60 fps on a 4090, while visually and environmentally equivalent games from just a few years before do it with ease.
im currently testing the 7090 gpu which will come out in 10 years time and im able to play stardew valley at 16k resolution locked at 120fps, just a sneak look into the future for you guys
There is absolutely no reason games shouldn't be running easily at 4K 60fps on an RTX 4090. The GTX 1080 Ti came out back in 2017, and nearly every game at the time could be played on it at 4K 60fps at high or even max settings when it released. An RTX 4090 is massively more powerful than a 1080 Ti, and game visuals haven't improved that massively since 2017. The fact that someone goes out and buys a 4090 for $1500 and has to upscale 1080p to 4K, and sometimes even use frame generation, to get great frame rates is just insane. I remember Nvidia promising 8K 60fps with the RTX 3090, and technically speaking, yes, the card SHOULD be capable of that. The reality is that the $1500 that was supposed to get you 8K 60fps 4 YEARS ago isn't going to, and instead of DLSS and frame gen getting you even more FPS on a GPU already capable of more-than-playable frame rates, they're being used to make your game reach playable frame rates when it otherwise couldn't. I hate taking Nvidia's side, but their claims were mostly correct, and yet everyone is blaming their $1500 and 400W+ of immense GPU power for being unable to hit 4K 60fps when running a poorly optimized game whose graphics and visuals should run far better. Also, I hate to say it, but RTX is by far the WORST "technological" upgrade to games ever. Games made over 10 years ago that looked really good for their time didn't manage it just by doing anything new and fancy tech-wise, but by making smart decisions about the look, effect and direction of their texture and lighting work to give the game a beautiful look.
Well said, yeah RT/PT including its reliance on fake resolutions/fake frames really is a disaster for the industry. Clarity is just gone.
I think the technology is awesome; it just needs optimizing, or some sort of massive hardware overhaul to run it better than having to buy literally the most expensive graphics card just to run it at 4K.
Devs just need to spend more time on optimization and actually learning the limits and tricks of the game engine they are using, but most studios don't get to, because publishers and shareholders keep rushing games out.
And better sound and 16 million colours if on, say, a Gigabyte Z270 XMP i7-7700K, as 192kHz sound is sent quad using a Sony chip; HDMI can't send both gfx and quad, it's using stereo at half quality, and today's WiFi PCs waste lanes and are 42kHz, 4096 colours. We've gone backwards since the M1 and 80 Ti mono-vcore quads on 4-layer, 16-lane motherboards.
Funny how Nvidia is pushing the 5090/80 when honestly current-gen GPUs are overpowering most games and the issue is just optimization… What's the point of next-gen GPUs when their job is just to overpower bad optimization rather than running more advanced graphics?
Optimization is already comical and will continue to get worse. Ubisoft's Avatar game uses more VRAM than Cyberpunk does, and many games now struggle to hit 60 fps even on a 4090. The optimization in the industry now is comically bad.
4K was a standard that TV manufacturers wanted for increasing TV sizes. The gaming industry was never ready for it, but it jumped on the marketing bandwagon.
Yeah I kinda regret getting a 4K TV rather than a 1440p monitor. I coulda saved a lot of money in powering what runs on my gaming screen.
@@user-2x5s-r5x6 Not true.
I got a 4K monitor and it's sooooo much better than 1440p. Also, my 6800 XT plays every game out at 4K; sure, it needs upscaling in some games, but upscaled 4K is better than native 1440p. A lot better.
LMAO, and there's definitely a point to it. I have a 55" TCL C805 4K 120Hz Mini-LED TV and it's amazing. 1080p looks terrible by comparison, even on a much smaller TV.
@pantsgaming759 That's the part everyone seems to not understand. DLSS Quality looks a lot better than native 2K with TAA applied.
It used to take up to TWO GPU GENERATIONS to completely max out a PC game back in the day (pre-2010).
We excuse developers far too much for not targeting a sustainable frame-rate at 4K. It is likewise appalling that upscaling and frame generation are seen as clutch solutions when graphics cards are more expensive than I can ever remember in my adult life. Do better or people just won't bother.
Because hardware used to determine res and we were native. Games from 10 years ago run at different res, but I can play at 4K UHD because I have a GTX card, mb sound chip and lanes at lossless, because my CPU isn't determining my fps or game. But today it's low-tech hardware, CPU AI-controlled, and that's why they are multitasking the PC OS in the background in control, hence not gaming or music 3D FX PCs. 80 Ti quads ISO 16 ran to the GPU, not back and forth, as native meant the Windows CPU isn't needed; it's running sound on the mboard and gfx on the GPU and SLI 16 x 4 = 64 on the GPU in realtime, no latency. Today it's sticking half-binned hardware together, AI controlling at less than CD-quality stereo over HDMI, fewer colours, and it can't do native real-time 3D gfx rendering; they load in images and pixels 3 to 1, hence E-cores, laggy, upscaled, with WiFi wasting lanes and speed to memory. We had smooth native 4K in 2016; now Windows controls CPUs to fake one high-end fast path-traced lossless studio 16 sound and gfx.
More important, I think, is generating enough graphical power without a too-beefy GPU, while considering power efficiency. If it just came down to raw graphical power, I don't think it would be an issue for manufacturers to make such a GPU. All things considered, there is only so much a desktop-PC-friendly card can be; real estate on the board itself can become the issue in upcoming hardware if we want hardware that can handle ray tracing like it's nothing.
I hate upscaling; the image gets blurry, and it's simply not for FPS games. That's why I still think 1440p is the sweet spot for high quality at high frame rates.
How many games have you tried with upscaling? Not all game devs use upscaling tech properly.
@@mrbobgamingmemes9558 Warzone and Battlefield 2042 look much worse upscaled imo.
Is this a case of hitting the limit, or of a reduction in optimization because of frame gen and FSR/DLSS?
How does RDR2 look so stunning and still run on a potato?
When I dock my steam deck I try to target 40fps in games as much as I can, because most games just won't reach 60 when you try to get to 900p or higher. Not fully analogous but just made me think of it
I was playing Cyberpunk at 1440p during the launch period. Upgraded my rig and went back to Cyberpunk. Able to play at 4K with ray tracing above 60fps. It's very noticeable. Add in DLSS and FG and you can add on path tracing, and it is jaw-dropping... completely unplayable, but it looks AMAZING lol
I played cyberpunk with pathtracing on 4k and it's doable but nothing close to native. Dlss looks great though and it's worth using low settings, path tracing, performance upscaling and framegen. That's my preferred way to play and get 80+ fps on 4070 ti super.
@@christophermullins7163 which dlss setting do you use? Even at performance upscaled to 4k that looks good. What cpu you paired with? I'm also on a 4070ti super.
I'm actually playing around with some mods now. I have found a few that believe it or not make the game look better (admittedly not in all areas) and perform better too. In most areas with dlss, frame gen, and a few key settings turned down, (mostly used DF optimized settings) I'm able to path trace upscaled to 4k and average 70-80 fps in most areas, but stay above 60 everywhere I've played so far. I switched to controller to negate some of the extra input lag added from FG. It's actually an incredibly good experience.
Unless you are super sensitive to input lag, playing the way I have it set up is actually crazy; I can get some pretty lifelike-looking stills like you see online everywhere. Path tracing is the future for sure, it is a game changer.
All of these are optimized to run on consoles. 4090 for this generation of games should never have an issue.
If you're spending $1500-$2000, you should expect to be able to play the current generation of games without compromise.
It's frustrating that the older I get, the more I am willing to turn settings down, but the less of an impact to performance it makes. I look at the newer games I play, like COD MW3 or ARK Survival Ascended, where I might want more performance and lower the settings, but they look so piss poor at lower-medium and don't run much better than at ultra.
The Black Ops 6 Beta is out and I can get 150fps at Ultra or completely butcher the visuals and get 180fps. I look back at Battlefield 1 from 2016 and not only does it still look better than most new games, but you turn one or two settings down a notch and I gain like 40fps and can't even see the difference in the visuals.
Another game from 2016, Mafia 3, had the problem where if you couldn't run it at max settings you likely couldn't run it at all because turning down to Low barely improved performance, albeit not with a huge quality loss. But most games now feel the same, Ultra runs bad, so does everything below it. I think it really is just bad optimization.
I think it's mostly a CPU limited problem you have. I think so because I have the same. I can turn the settings in a COD game up or down with no change to performance, and it's because I'm quite CPU limited despite having a good CPU from a few years ago.
I think we are just now ready for 1440p, I hung on to 1080p until a year ago and now running a 1440p OLED. 4k isn’t worth the huge FPS loss.
CPU limited at 4K with new titles? It's hard to find 4K CPU benchmark data; reviewers don't bother with it because it doesn't tell you anything about the CPU.
1080p CPU benchmark data is the same as 4k CPU benchmark data (if GPUs would be able to push the CPU that hard)... CPUs will start to get limited in the 60-90 range. Even the 7800X3D has a couple of games it struggles to hit 90.
@@albert2006xp I have yet to see a 4K benchmark where there's more than 1-2% difference in avg fps between mid/high range CPUs
@@vindeiatrix That's because it's done at 4k native so ofc GPUs won't hit that. The CPU problem is when you turn DLSS down to hit the high fps. Like I said, you need to push to 60-90 to see issues.
@@albert2006xp A CPU limiting in the 60-90s? idk about that. maybe a really old one.
@@vindeiatrix Look at recent CPU benchmark maybe. Hardware Unboxed big Starfield CPU benchmark showed that a 5800X is just shy of 60 fps on 1080p Ultra, so it would be the same other resolutions ultra settings. It only slightly improves to 66 fps by turning settings to medium as most settings don't affect CPU. Hogwarts legacy 7950X3D caps out at 97 fps and 7800X3D at 94. A 7700X caps out at 80. A 5800X3D at 67... Those aren't old CPUs. Literally the best AM4 CPU capping out at 67? Elden Ring I've seen 45 fps zones on the CPU on a 3600X. Unlocked framerate caps out around 80 in less intensive areas.
4K DLSS on Performance looks better than native 1080p, so 4K is not an issue. For me it's ray tracing, especially Ubisoft starting this new trend of forced ray tracing.
The 4090 would last my 720p ass the next 10 years
As mentioned, back in the 2000s games could not run at full tilt on release with commercially available hardware. Crysis took it to the extreme, but in general you probably needed the next gen to max everything out.
Also in crt days resolution choice was much more open.
I remember fiddling about with different settings for shadows and ambient occlusion to get the right performance.
If you are forced to spend $1000+, let alone $1500+, on a single GPU from team green or team red, you should be guaranteed to be able to play the latest games at a minimum of 60fps at 4K with no upscaling or frame gen. Even two years after Cyberpunk came out, the 4090 was not able to max it out, so calling this an evolution of gaming is highly inaccurate. Team green/red need to make better GPUs, simple! And make them able to handle the latest and greatest until the next cycle of GPUs comes out, then rinse and repeat. A 2-year-old game being unplayable at max settings without fake frames and downsampling says it all. Also, if you buy a top-tier card like a 4080/4090, you buy it for the visuals and maxed-out settings. If you cannot max out every single setting in a game, then what's the point of having the card? There is none, so the point about turning down settings is also invalid…
If I buy a 4K display, I refuse to, and should not have to, drop to 1440p and use upscaling or frame gen. In that case I might as well buy a 1080p monitor... none of these points are valid.
This has been common for generations; only recently have GPUs been so performant that you could max out new AAA games at high resolutions and frame rates.
Remember Half Life 2 at launch? CRYSIS???
4K DLSS Performance mode is the optimal way to play right now. I have been using 1440p for years, and it's not as good. We are close to a 5090 at this point anyway.
I think a lot of friction comes from people who think their money entitles them to click "ultra" on everything and get 60fps+. They don't seem to care that they literally couldn't identify "ultra" vs "very high" in a side-by-side comparison.
It's sad that we're forgetting that the important thing is how good it looks (and runs), not what settings or techniques were employed to get there.
These were legendary for being unoptimised great games: Lords of the Fallen (the first game) and Watch Dogs 1 could only hit 30fps with a 980, and only got above that at 4K maxed once the 10-series high-end cards came out.
Control couldn't run well at native 4K high settings until the 30-series high-end cards came out.
The man with the glasses said the same thing: games that are very demanding aren't made for the GPUs of the moment. That's why I named the games I did down below.
Alan Wake still can't run well without drops, even at 1440p. Black Faith, a souls clone, can't run 1440p without dropping FPS, and it stutters badly and drops FPS at 4K on a 4070 Ti.
I still can't play Remnant even at native 1440p high settings. I'm stuck in the 40s and get drops.
Hellblade, Tarkov and Cyberpunk are very demanding and still don't run like they should.
I'm thinking all of these games and others will finally run the way they should when the 50 series comes.
I think it's nice that games cannot run with max features at launch. You can go back a few years later and replay the game at even higher settings. It just adds lifetime. Control was not perfect with a 3090 at launch, but it is super smooth with a 4090.
I wonder what percentage of 4070Ti Supers have been sold to gamers vs the 4090 since the introduction of that card.
For $800, you get 67% of the 4090's performance.
You're going to pay $1,200 extra for the other 33% of extra performance. Even at the launch price of $1,600, the 4090 is still charging double for every frame rate over the 4070. Now it's charging triple.
The 4070 Ti Super has 51% of the 4090's CUDA cores, ROPs, TMUs, etc. Also don't forget that the 4090 isn't even the full die. Back in the day the GTX 770 had 53% of the full die, so the 4070 Ti Super should be named RTX 4060, or 4060 Ti at best.
@@UltimateLF We're talking actual practical performance here. 4090 is really fat, but that doesn't give proportional performance.
@@UltimateLF Percentages of chip don't translate into performance. The performance is closer than the difference in specs.
I agree with the sentiment, but just FYI: if the 4070 Ti Super is 67% of a 4090, then you are paying an extra $1200 for an extra ~50% performance, not 33% extra performance.
@@DethronedCrown It depends which you apply the 33% to 🙂
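To make the percentage point above concrete, here is the arithmetic worked through, using the 67% performance figure and the prices quoted in this thread (treat both as rough numbers from the comments, not benchmarks):

```python
# If the 4070 Ti Super delivers 67% of a 4090's performance, the 4090's uplift
# relative to the cheaper card is 1/0.67 - 1, i.e. about +49%, not +33%.
cheaper_perf = 0.67
print(f"4090 uplift over 4070 Ti Super: +{1 / cheaper_perf - 1:.0%}")

# Cost per unit of relative performance at the prices mentioned above.
for name, price, perf in [("4070 Ti Super", 800, 0.67), ("4090", 2000, 1.0)]:
    print(f"{name}: ${price / perf:,.0f} per unit of performance")
```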
I'm a little more upset that it feels like everyone is setting 4K60 as the goal. I feel like games might be scaling better with resolution than with fps these days, with 120fps at lower resolutions being harder to hit than you'd expect.
4k gamer here with a 4080. DLSS gets used religiously. Anyone with a RTX card who doesn't use its features should just throw it in the trash.
People should do better with their money, especially those who work for wages. The only way things will change is when overpriced goods are rotting in a warehouse without a single sale.
Absolutely. I want the 3050 6gb for my project dell optiplex and am waiting for a price drop.
Imagine telling people in 2014 that Nvidia flagship GPU from 10 years into future would still be rendering at 1080p just to achieve optimal performance💀
Paying 2K for a GPU, we should be able at this point in time to run games at native 4K; we shouldn't have to use 1080p rendering upscaled. DLSS and frame gen are just a band-aid to get by.
4090 owner here, and I play games at 1440p on an LG 45 inch ultrawide monitor at 240Hz; I also have a 32 inch 4K 240Hz Samsung G8. I used to think 4K was the best way to game until I discovered 1440p. I'm waiting with bated breath for that 5090!
I have my PC connected to a 65 inch LG OLED; I cannot run my games at 1440p, they look horrible.
I’ve yet to play a game that my RTX 4090/7800x 3d combo couldn’t easily handle maxed out at native 4K without frame generation enabled. Of course, once you throw ray tracing into the mix then all bets are off.
I don't agree with the user question.
The 4090 is now 2 years old and a totally new generation is due. But the reality is the 4090 is still able, even if it's via DLSS and frame generation, to put out high enough fps, and that's at maxed-out settings. This means it still has headroom with tweaks etc. And playing with VRR means that even if FPS isn't 120, most people are fine playing single-player games with high fidelity at 60+ fps, which you can get with DLSS. I am skipping the 5000 series; I own a 4090. I could afford to buy a 5090, but I don't see any reason for it. I am fine even if I have to tweak a setting here and there and use DLSS Performance and/or frame generation at times. It's just another 2 years, and likely no more than 5-10 games I will play during that time, and I bet most will run at 60fps no problem.
I like maxed settings. But I am also no FPS snob demanding 120fps at all times, not when I have VRR. My gripe was always getting to 60fps back when I didn't have VRR, so Vsync would work without stuttering.
With VRR I don't have to worry if FPS dips to 50-55 fps even if it stays above 60 most of the time. Playing an RPG I can easily live with 50+ fps.
Also, when I upgrade to 6000 series the jump will be very big again.
For years, newly launched games have seemed to aim their optimization at delivering ~60 fps in native 4K on flagship Nvidia cards, not counting ray tracing showcases where upscalers' help is needed. It's quite a visible pattern, bordering on conspiracy theory, but it's not that unbelievable that this is how Nvidia and their partners, the devs, sell flagship GPUs to those seeking 4K gaming with maxed-out settings. Slightly lower settings not showing a visible difference but bringing a noticeable fps uplift also fits here.
4k is an absolute waste of pixels unless your screen is massive, like movie theater massive relative to your viewing distance. Running that natively is just wasting energy :)
Strong disagree as someone who went from 2K 27 inch to 4K 27 inch and is waiting to upgrade to 4K 32 inch. Massive difference, and my old 2K looks bad relative to the 4K monitor I currently use.
@@eliadbu There are many factors in image quality, not just the pixels. It's likely your new panel has other features that impact image quality beyond raw pixel count. But the reality is that for desktop use at 27" you are going to run everything scaled up to 125% or 150%, otherwise things like text will be absolutely tiny; so many of those pixels are just a waste. Yes, 4K is "sharper" through pixel density, but at native scaling much of the time you are looking at several physical pixels showing you the same logical pixel. For gaming you are pushing 4x the pixel count of 1080p per frame (and ~2.25x 1440p), and unless you are running a movie theater it's an absolute waste of energy for the returns. Yes, you can see the difference in relative sharpness, but no, it's not worth the energy to push 8.3 million pixels per frame vs. 3.7. I'm running a 49" 32:9 screen at 5120x1440, and I find that a much better use of my 4090 than a 16:9 4K screen, as there's actually more on the screen horizontally.
It depends on the anti-aliasing implementation in each game; there are a few games with such extremely 💩 anti-aliasing that they still look like a pixelated mess even on a tiny 15 inch laptop screen.
@@Sipu79 Text scales with resolution; there is a clear, noticeable quality difference from 2K to 4K in text, so the extra pixels are not 'waste'. Same with games: I can clearly see the difference in PPI. Going from 108 PPI to 163 PPI is still very noticeable at the distance I'm sitting from the screen, about 60-80 cm away.
@@eliadbu I didn't say there's no noticeable quality difference. You can tell a 1440p and a 4K screen apart due to the pure pixel density. But the quality difference in workspace use is 100% not worth the extra processing power you need to render games. If we talk about gaming applications (which this channel is 100% about), 4K is a waste of pixels and energy.
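(For reference, the raw numbers behind the pixel-count and PPI figures traded in this thread; a minimal sketch, using the resolutions and screen sizes the commenters mention:)

# Megapixels per frame and PPI for the setups discussed above.
import math

def megapixels(w, h):
    return w * h / 1e6

def ppi(w, h, diagonal_inches):
    return math.hypot(w, h) / diagonal_inches

print(f"1440p: {megapixels(2560, 1440):.1f} MP/frame")                  # ~3.7 MP
print(f"4K: {megapixels(3840, 2160):.1f} MP/frame")                     # ~8.3 MP (~2.25x 1440p)
print(f"49-inch 32:9 (5120x1440): {megapixels(5120, 1440):.1f} MP/frame")  # ~7.4 MP
print(f"27-inch 1440p: {ppi(2560, 1440, 27):.0f} PPI")                  # ~109 (the ~108 quoted above)
print(f"27-inch 4K: {ppi(3840, 2160, 27):.0f} PPI")                     # ~163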
I do think that devs could make a truly high-end type of game, but they are not doing it because they know only a handful of people could play it, which means it would flop.
I have zero problem gaming at 4k... 7600x 4080
If you want to max everything, you can't do that even with a 4090 *at native
There has been an insane decline in logically selecting and optimizing the most important graphical aspects for any given game; instead, devs reach for universal solutions that tend to destroy performance. In-house graphics engines tend to be great performers, while Unreal is just a blob of infinite technologies that end up poorly applied. If a 4090 cannot run a game at 4K, developers need to rethink their concept of what a game is and what the most satisfying balance between fidelity and fluidity looks like. FSR, DLSS or XeSS should not be used to ensure performance levels; they should only be added on top of smart and logical use of the resources available.
1080p ultra is plenty good visually, but at 1080p I get pissed off that the CPU usage is very high. High CPU usage = sluggish game/frame drops.
Native 4K + ultra maxed out settings is child's play for the 4090.
4K + ultra maxed out settings + some light RT is a challenge, but it can still deliver 60fps in many of those games.
4K + ultra maxed out settings + a heavy ray tracing load is when the ask gets too big, but with DLSS Quality (so 1440p internally, and a completely perfect reconstruction result) it can still handle it in many games.
But PATH TRACING went from being a tech preview in Cyberpunk, like "hey, this is possible, but it's not even an official setting, it's just a tech preview to try and benchmark,"
to games shipping with path tracing in less than a year.
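(For reference on what those DLSS modes render internally at a 4K output; a small sketch using the standard per-axis scale factors, though games that expose a resolution slider may differ:)

# Internal render resolution per DLSS mode at a 3840x2160 output.
# Standard per-axis scale factors; some games let you override them.
modes = {
    "Quality": 2 / 3,            # 2560x1440 internal (the 1440p mentioned above)
    "Balanced": 0.58,            # ~2227x1253 internal
    "Performance": 0.5,          # 1920x1080 internal
    "Ultra Performance": 1 / 3,  # 1280x720 internal
}
out_w, out_h = 3840, 2160
for name, scale in modes.items():
    print(f"{name}: {round(out_w * scale)}x{round(out_h * scale)}")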
The problem lies in the vicious circle of poor optimization in PC releases over the last few years. Devs rely more and more on DLSS and FG to deliver decent fps and don't make much of an effort to optimize performance anymore. For Nvidia that's fine, since this way people are more prone to buy newer cards with the latest technologies.
4K gaming is only a thing because people use TVs as monitors, so then they had to have 4K on monitors too. Unless you like low frame rates, 4K is ridiculous. 1440p is where you should be, or 1080p if you have a midrange or lower GPU.
It's a $2,000 GPU and you defend it for not doing native 4K???? They're scamming gamers all over; the 40 series is weak af. I hope the 50 series will be powerful, but at the same time I highly doubt it.
It's not the 4090, it's crappy visuals masquerading as something decent. It's mushy TAA and UE5.
It's an expensive GPU, but it is considerably better than any other GPU. 4K native isn't a resolution there's spare performance left over for; it's as pointless as 8K. 4K DLSS Quality is the highest resolution worth considering.
@albert2006xp It's not the hardware, it's the absurdity of modern autotuned games that add nothing to visual quality over older games. It's absurd to pay $1700 to take a hit in visual quality at 4K, which is what all the scalers do. I game at 6K because it looks awesome with HDR on my monitor. A game that won't let me do that is a game not worth owning. We need to stop accepting TAA, UE5 and RT as a crutch. Reflections are fine, but you're getting lazier and lazier development and artistry.
Show me a stronger GPU than a 4090 on current tech. Nvidia aren't miracle makers, they can't suddenly conjure up future hardware now. They can only progress as much as current tech allows them to progress.
The real issue isn't GPUs being underpowered, it's that game devs have just gotten far too ambitious, are aiming their sights far too high and, as per usual, are trying to outclass each other in the graphics department.
Cyberpunk 2077 was one of the catalysts for them taking the route they have. They saw it in all its graphical splendor at launch and, not wanting to be outclassed, decided to do exactly the same or better in their own games. The issue is that Cyberpunk was an extremely demanding game on the GPU (a new Crysis in that regard). So the end result was that AAA studios all upped the graphics in their games, which in turn upped GPU requirements across the board. The recommended GPU requirements of games were then raised far too high, far higher than what the average gamer owned.
Devs simply need to dial things back a fair amount and aim lower. We don't need games that can only run fairly well on expensive high-end GPUs or future GPUs (or future consoles, for that matter); we instead need games that run reasonably well on the current "middling" GPUs that people actually own (including current consoles). And if that means reducing the graphical eye candy, not aiming for often unreachable goals like 4K, and dialing back on things like the amount of RT, then so be it.
Games are simply outpacing the hardware; that's the real problem.
@@robertmyers6488 TAA doesn't exist alongside DLSS; they're two different things and mutually exclusive. Most sane people are absolutely fine taking a hit in rendering resolution to render more advanced games, given upscalers make that hit much less impactful than it used to be. You live in entitled land. Most people game at 1080p with DLSS and enable all the graphical candy you refuse to accept. Deal with that.
Native 4K? No problem, but you must compromise elsewhere. In Wukong at that setting I'd need RT Off, settings to High preset, and Frame Gen to get a decent ~110fps. If you're less of a framerate snob, you can surely get away with more.
Black Myth Wukong players on a Steam Deck are loving life when they have a stable 30 on the lowest settings.
Meanwhile 4090 and i9 players will lose their minds if they are dropping below 60 at 4k
Get your expectations in line with what you have and we'll all make it out of this generation happier with our gaming experiences. That, or decide whether you're here for the game or for the tech demo.
I would worry more about making stuttering in AAA titles a thing of the past than about GPUs being the bottleneck in a newer game at "native" 4K. Doesn't matter how good a game looks if it runs like trash.
Honestly, late PS4-gen games looked almost as good, but the performance and optimization were much better. Look at games like Red Dead Redemption 2. We need games that focus on having good gameplay and being more fun instead of chasing slightly better graphics.
4K games are not and never have been broadly played natively. At this point, DLSS Quality is often a better AA solution than TAA.
5 years ago, DLSS Performance looked worse than the internal resolution at which it was rendered. Now, DLSS Quality, Balanced and Performance look very similar.
I hate that the industry is leaning on upscaling when making their games, but I don't think we can put that genie back in the bottle.
I think the problem is that the developers are using the RTX 4090 as a benchmark to target 1080p at 60fps. If it runs at that resolution and framerate, they consider it 'optimized' for release. They should be targeting 1080p at 60fps on an RTX 4060 instead.
It should be based on the most popular GPU among Steam users, which is the RTX 3060, at least at high settings.
If $1,600 is not enough, try $2,000 with the 5090; then you've got to pay the premium prices.
They glance past the claim of 8K with "optimal performance" on the 4090.
Most cards can run 4K, especially the higher-end cards. What's killing performance for a small visual upgrade is the baked-in ray tracing these games are shipping with. I think this is not the right generation for games to be pushing ray tracing, especially with current consoles using AMD chips.
I got a 4K OLED TV, but I still game at 1440p, as it still looks great. 4K isn't worth sacrificing performance.
And you can gain a lot from running games on high instead of ultra, as you can barely tell the difference in most games.
And I don't like having to use frame gen, due to the lag.
5-10 years from now, we'll still have problems running AAA games at native 4K no matter how high-end the PC is, due to games becoming more and more demanding and/or badly optimised.
I was crazy (or stupid) enough to buy a 4090, and I'd say 95% of the time native 4K is viable; it's just the odd game here and there where DLSS becomes necessary because of harder-to-drive features like full path tracing and ray reconstruction. In Star Wars Outlaws, I turn off ray reconstruction and I can run the game at native 4K in the 80fps range. Same thing with Cyberpunk: turn off path tracing and ray reconstruction, leave ray tracing on, and I'm in the mid 50s at native 4K. At that point I'm good to use native 4K with frame gen to get me over that 60fps hump without feeling any input lag.
Full path tracing and ray reconstruction are nice, but both can be hit and miss as both are still in their first-gen iterations. I see these options, I try them, and I turn them off, because you never notice them unless you stand still and look for them.
Still, tech like this is great and is exactly what PC gaming is all about: pushing new features that ride the bleeding edge, or just plain jump off the edge. I may not turn those bleeding-edge features on regularly, but I will always turn them on to try them out.
Game engines.. MGSV runs maxed out on a 1080, and that game still looks good.
Nope, definitely not. It did when it released, but this has happened every generation since the 1080 Ti: after every 2 years, which is usually one GPU generation, the flagship drops from a 4K card to a 1440p card.
The upcoming 5090 will be the only viable 4K GPU to play with; anything else is 1080p or 1440p in the latest games.
I feel like 4K native gaming is never going to settle in. If you're in the 4K game, no GPU is ever going to be enough for long, if at all. Future-proofing really doesn't exist at 4K. That's why I feel like 1440p is the best resolution. IMO 4K isn't THAT much sharper than 1440p; it's close, but the performance you get at 1440p obliterates what you get at 4K. So I just feel like, as a whole, the 1440p pros really outweigh 4K. You're not gaining much, if anything, that's worth that performance hit.
I'm sorry, but no. 4K has been a thing for over 10 years now, GPUs have gotten like 10x more powerful since then, and games don't look 10x better than something like Battlefield 4 from 2013. Like, how can you look at an RTX 4090's specs and think "yeah, this GPU is too weak, it needs upscaling"? The only time I can justify that thinking is if the game has full path tracing, which only a few do.
4K screens and Blu-rays, sure. Real-time rendering? Not even remotely. The entire reason XeSS, FSR and DLSS have become essential for all but those with the highest-end gear is that hitting 4K natively on typical gaming PCs, let alone consoles, would require significantly scaling back visual quality just to bump resolution.
And no, GPUs have not become 10x more powerful. New features and designs have added to GPU abilities, but in raw rasterization a 4090 is roughly 4x a 1080 Ti.
@@thelonejedi538 The GTX 780 Ti, a 2013 GPU, supports 4K and could even do it in a new game at the time like Battlefield 4 at high settings, 60fps. The 4090 is about 8x more powerful than a 780 Ti and about 10x more powerful than a regular GTX 780, all in raw rasterization. You can ignore the 10x figure and just pay attention to the 8x part: games don't look 10x better, and neither do they look 8x better. So yeah, upscaling on a 4090 shouldn't be needed unless path tracing is in the conversation.
Which games "need" upscaling to be played on a 4090 without using ridiculously advanced rendering, at least comparable to path tracing?
@@yancgc5098 This is a weird example. You could reasonably expect drops below 60 on the 780 Ti running BF4 at 1080p (Digital Foundry has a video showing just that). It's normal for new games to struggle to max out current high-end hardware; those settings, if anything, are there to future-proof for new hardware. Most of these games coming out look just fine on current consoles, which are far from running games at max quality and resolution.
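(For what it's worth, part of the gap between the "~4x" and "8-10x" figures above comes down to which metric you pick; a rough sketch using published FP32 spec-sheet throughput, with the caveat that paper TFLOPS, especially Ampere/Ada's dual-issue FP32, overstate measured in-game rasterization gains:)

# Approximate spec-sheet FP32 throughput (TFLOPS) for the cards debated above.
# Paper ratios run higher than measured in-game rasterization scaling.
tflops = {
    "GTX 780": 4.0,
    "GTX 780 Ti": 5.0,
    "GTX 1080 Ti": 11.3,
    "RTX 4090": 82.6,
}
baseline = tflops["RTX 4090"]
for name, t in tflops.items():
    if name != "RTX 4090":
        print(f"RTX 4090 vs {name}: {baseline / t:.1f}x on paper")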
You are not taking into account how much graphical cuts the game needs to take to make native 4k happen. Games were more restricted when all we had was native. Now they can use more power because we all run upscaling.
It's OK that sometimes the software steps further than the hardware; I just don't see enough improvement in graphics to justify a 4090 struggling at native 4K. Even with ray tracing, path tracing, egg yolk tracing, etc. etc., fine, the flagship card suffers under all these filters and ends up insufficient, but it comes with a really poor graphical improvement.
It's more like developers are cutting corners and abusing upscaling techniques and frame gen to "optimize" their games, just for a paycheck. They're the ones holding the hardware back. The point of FG and upscaling techniques should only be to get better performance for ray tracing; they shouldn't be a major crutch for rasterization.
This is a take only accepted by people who don't know anything about game development.
Don't get me wrong, it's true sometimes. But the idea that devs have gotten lazier or less competent is just silly.
@@DanKaschel I'm not saying all developers are lazy. I mean, there are games out there that are optimized, but games like Black Myth Wukong and FF16 are both poorly optimized on PC and PS5, and those games abuse upscaling techniques. The fact that you're defending upscaling techniques is insane. I would rather have native 1080p than 1080p upscaled to 1440p with so much shimmering, ghosting, and other visual artifacts, like FF16 on PS5. Btw, we're talking about the 4090 (DLSS is an exception because the quality is good), but no way you're advocating for developers to rely on upscaling techniques; at this rate we will never achieve 4K gaming.
@@Filthyfrank_17 honestly when I defend upscaling I almost exclusively mean DLSS, everything else drives me crazy
@@DanKaschel oh ok, that's fair
I know DLSS is a great thing for lower-end hardware, but I feel it's used as a crutch by developers. I upgraded a year ago from a 1650 Super to a 4070 and a 1440p monitor, and I hate how in some games DLSS is necessary.. I know it looks fine, but still.
4K native will never be achieved for good by any GPU, not even a 5090 or a 6090. Developers will just keep increasing visuals and not really optimizing anymore, since that allows them to pump out more games faster, and you really can't sell an RTX 5090 if a 4090 can do just as well, so GPU makers incentivize developers to use more and more features: RT, more polygons, more detail and many other things. The only way to keep playing at 4K forever is to lower settings and use upscalers.
Of course. It's well over 2yrs old. We need a 5090 already.
Of course they are! Gotta sell those 5090s somehow amirite?
Yep, bring out unoptimised games that barely look better than 2-3 years ago but are twice as demanding, so you feel like you have to upgrade to the 5000 series.
@@PCZONE1 It's funny how the latest games are always *just barely* too much for Nvidia's highest-end GPU, right before they release their next line of GPUs.
@@PCZONE1 Try 17(!) years ago (Crysis)
RDR2 is a nice recent example as well of a great-looking game without RT/PT
the whole RT/PT/upscaling thing was rotten from the start
Frame generation only makes sense for those who want to run their games at over 60fps; otherwise it's just not nice to watch. DLSS and FSR will be doing the heavy lifting, and unless the game is CPU limited it's not necessarily the GPU that needs to be improved. Native 4K isn't necessary these days.
Yeah. The RTX 4090 is underpowered right now. We are in great need of RTX 5090 😭
That's not gonna keep up if game devs don't give a f**k about performance.
Actually, that's exactly what they want you to believe. The 4090 is a fucking beast in reality and you know it. Just play last-gen games and you see what a massive difference there is in performance, but not in graphics. We could have another 3 generations of graphics innovation if gaming corporations put in the effort on current-gen hardware.