@@hankb27 It's not, and that's why there are different kinds of settings. You are not supposed to be able to play ultra quality at 4K at 120fps; you can lower and tweak the settings to achieve that goal.
@firstsonofthesea7938 My point was that it's not a requirement. 4K is a very hard resolution to run; kids these days expect the 4090 to run 4K at 140fps in every game on ultra.
The very high preset in CS2 uses 8x MSAA by default, which at 4K is not only way overkill but also extremely demanding. I play with 2x MSAA at 1440p and it looks nice and sharp; at 4K I imagine you could get away with no MSAA. The biggest reason you saw so much improvement dropping down to lower and lower presets is that the MSAA setting was being lowered.
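To give a rough sense of why that preset hurts so much, here's a back-of-the-envelope sketch (my own illustrative numbers, not from the video): MSAA multiplies the number of coverage samples the GPU has to resolve per pixel, so 8x MSAA at 4K is roughly 9x the sample work of 2x MSAA at 1440p.

```python
# Rough back-of-the-envelope: coverage samples resolved per frame at
# different resolutions and MSAA levels. Illustrative only; real MSAA
# cost also depends on bandwidth, compression, and the engine.

def msaa_samples(width, height, msaa):
    """Total coverage samples per frame for a given MSAA factor."""
    return width * height * msaa

s_1440p_2x = msaa_samples(2560, 1440, 2)   # 2x MSAA at 1440p
s_4k_8x    = msaa_samples(3840, 2160, 8)   # 8x MSAA at 4K (CS2 very high preset)

print(f"{s_1440p_2x:,}")                   # 7,372,800
print(f"{s_4k_8x:,}")                      # 66,355,200
print(s_4k_8x / s_1440p_2x)                # 9.0x the sample work
```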
Depending on what level of performance you're expecting, the RTX 4080 with a top-of-the-line CPU can crush 60fps in all games at respectable settings in 4K.
It absolutely is; I've only ever gamed in 4K on it. We can't just go by Alan Wake and Cyberpunk, when all you need to do is adjust two settings and Alan Wake is over 60fps. We also shouldn't judge 4K performance on ray tracing results; that doesn't make any sense at all, since that's an additional setting. So far, most games I play are 80+ fps in native 4K with normal ultra settings. For sim racers like myself, all the sim racing games I play, like ACC and iRacing, are over 130fps in native 4K. It's a 4K card, absolutely. I just hate the way mainstream media goes by one or two games in the worst use case ever, with poor optimization, and acts as if we can't just reduce a few settings to high and hit that 60fps mark lol. Even in the most intensive titles, the fact it can get close to 120 in many games is pretty insane at 4K.

4K should not be held to the same standard as 1080p or 1440p. Anything over 60fps in 4K is acceptable, because nobody is playing competitive shooters at 4K; it's made for amazing-looking single-player games. Then we have DLSS Quality, and man oh man, it's amazing. Just keep in mind that some games were not designed with 4K in mind and weren't well optimized for it.

Now we have devs throwing everything they can at our games and going, "yep, let your hardware take care of it." Crazy, because new games look amazing, but we all know they aren't as crispy as they used to be for some reason. Devs definitely don't go around making sure everything is clean like before; it's all based on anti-aliasing now, which sucks lol. Sometimes I'll play an older game, look at the road textures and details, and go, "wow, how is it so sharp and clean?" They want us to think graphics have improved so much, but only the people and certain things in sight have; many of the surrounding areas are actually worse. Buildings legit would look three-dimensional. But one day we will see it come back. To me, Spider-Man proves how good PC could be, because that was done based on a console!
Now imagine that game being built from the ground up purely for PC gaming, with the devs putting that much effort into it. We would have a truly groundbreaking experience.
Race sims are designed to run high fps on a potato, like competitive shooters. It's not optimization at all. He has a very fair selection of titles here, and they're hardware limited.
@@drewnewby His test wasn't bad, and this wasn't about his test. It's very clear it's a 4K card, as he showed here; I was just speaking in general about mainstream media.
@Dempig What? Lol, that's so false. Even you saw that in the most demanding games 58fps is playable lol, and again, why can't you reduce a few settings to high instead? It's still native 4K resolution, I don't get that, and it's still better than 1440p ultra doing it that way. Some of y'all's logic with mainstream showings blows my mind.
@Dempig Again, a few games it has a rough time with in 4K don't make it not a 4K card. That doesn't make any sense at all. It also comes down to optimization, but again, I've yet to hit a single game where I can't get over 60fps by reducing a few settings to high instead.
Great vid. I purchased a 4080 super off the back of your flightsim experience with it and I’m happy. Yes the card costs a fortune but the days of cheap fast GPUs are long gone.
@@zeroturn7091 Reminds me of the days when you could run a PC on very little power, like sub-450W for two cards, running a GTX 460 SLI setup to match GTX 580 performance. Those days are long gone now indeed.
@@davidnott_ I remember buying a 1070 FE for $350 from Best Buy, then upgrading a couple years later to a 2070 Strix card from Best Buy for $499, then a 3080 12GB for $800 in 2022. With every purchase, I thought to myself that I was totally insane to spend that kind of money on a GPU. Looking over at my new 4080S... I now remember the 1070 and 2070 purchases as the "good ole days" of cheap GPUs! LOL
Very cool vid, thanks for making it. Talking about Cyberpunk 2077: even the 4090 won't get you anywhere near 120 at the highest ray tracing preset in 4K. With DLSS and FG you're looking at 90-100fps, so it really is, as you said, time to redefine what we mean by "4K GPU," because under some of these metrics not even the best GPU on the market is a 4K GPU.
So true, graphics are currently in a huge "leap bubble." There was one of these leaps going on in 1999-2004, where the visual software tech (games) was quickly outpacing the hardware. The difference between now and back then is that back then, new, more capable hardware was coming out every year, and that hardware was significantly cheaper (don't give me this inflation bullshit; the economy of 1999-2004 was much better, and it was generally easier to afford a top-end card back then, as I did several times on a lower-wage job while supporting myself). Today cards take 3 years to develop and cost way more in work/time resources.

Back in 1999 I started my first job making $6.75 an hour, and I quickly moved up to $7.25 an hour. It was a 40-hour-a-week job; at that rate I made $1,160 every month, and after taxes I came in at around $900-$950 a month. The top-end GeForce 2 Ultra I got in 2000-2001 cost me $270, and even after buying it I had room for bills and food. Today that same job pays $12 an hour, or $1,920 before taxes; after taxes you are looking at around $1,500. Today's top-end video card (the 4090) costs more than what I'd make in a month doing the same job I did in 1999-2002, whereas before, the top-end card cost me only a third of what I made in a month.

Sorry, I didn't mean to get into a rant about our shit economy. My point was that right now graphics technology is hitting a "milestone" in advancement, and it'll be a couple more generations of hardware before 4K w/RT @ 60FPS becomes a standard, normal thing.
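That affordability comparison can be sanity-checked with a quick sketch. The wages, hours, and card prices are the figures from the comment above; the flat ~20% tax rate is my own assumption for illustration, so treat the result as a rough ratio, not an exact payroll calculation.

```python
# Months of take-home pay needed to buy a top-end GPU, then vs. now.
# Wage and price figures come from the comment above; the 20% tax
# rate is an assumed simplification.

TAX = 0.20
HOURS_PER_MONTH = 160  # 40 hr/week * 4 weeks

def months_of_pay(gpu_price, hourly_wage):
    take_home = hourly_wage * HOURS_PER_MONTH * (1 - TAX)
    return gpu_price / take_home

then = months_of_pay(270, 7.25)    # GeForce 2 Ultra on the 1999-era wage
now  = months_of_pay(1600, 12.00)  # ~RTX 4090 price on the same job today

print(f"{then:.2f} months")        # ~0.29: about a third of a month's pay
print(f"{now:.2f} months")         # ~1.04: more than a full month's pay
```

Under those assumptions the top card went from roughly a third of a month's take-home pay to slightly more than a whole month, which matches the comment's point.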
That said, if you ignore RT for the time being, several cards are technically "4K cards." Here's my list of 4K cards (no RT) from benches I've seen (both on YT and from my own hands-on benching):
Radeons: 6900 XT, 6950 XT, 6800 XT, 7900 XTX, 7900 XT, 7900 GRE, 7800 XT
GeForces: 3080, 3090, 3090 Ti, 4070 Super, 4070 Ti, 4070 Ti Super, 4080, 4080 Super, 4090
Honorable mentions (cards that can pull off slightly older games in 4K easily, or can manage 4K with some settings tinkering): 3070, 3070 Ti, 3060 Ti (12GB version), 6700 XT, 7700 XT, 4070, 4060 Ti (16GB version)
@@DenverStarkey Good rant. My current solution to this dilemma is also ignoring RT. I have the 4080 Super. Disabling RT makes it finally feel like the performance that a card in this category SHOULD have. Smooth 4k Ultra or 1440p at a locked 165hz. Even Frame Generation at such a base FPS is very nice and has minimum latency.
Really, the expectation should be 120fps minimum at 4K for the price of a 4090, especially when you look at what you used to get for much less money in the top-tier card. I get 105fps average in the Cyberpunk benchmark, which is still very playable, but the card was nearly 2 grand, which is shocking. In a year's time it will probably be a 1440p card, given the game engines being used in some cases.
@@chrissyboy7047 Nah, not in a year's time. Sure, we're getting a huge visual leap with real-time RT and path tracing, but it's still not as fast as things were moving between 1999-2004. Maybe in 4-5 years the 4090 will be a 1440p card, but even then only in the newest games at that point. For instance, my old 1070 (which resides in my mom's PC now) can still blast Doom 2016, Skyrim, the Resident Evil 2 remake, and a slew of other games at 4K 60FPS. It's not like the games of the current year will all of a sudden play like shit in 5 years. And with the way things are going, modern triple-A games are just getting worse and worse with the bullshit agenda pushing anyway; who knows, in 5 years there might not be any new games you care to play. These last couple of months of saving for the rest of my new PC (I already got the 4080 Super for it), I've been playing a lot of really old games, stuff from 20+ years ago. At times I wonder why I even care to finish getting the new PC... but well, I love building new systems and I already got the case, so yeah. After this one, though, I'm just not sure when my next build will be, or if I'll even be interested in the crop of games that are out 5+ years from now.
@@chrissyboy7047 Yeah, I know, these are weird times. Tbh the only one who wins in this sh*tshow is Nvidia. With new features like path tracing, the price per frame has increased massively: minimal visual improvement at a massive cost in performance. While we fight down here in YouTube comment sections and on Reddit, discussing whether we can even notice ray tracing in certain titles, they bag thousands of dollars per person each generation.
THANK YOU. You are one of the few that show settings changes in real time and give a complete walkthrough and explanation of performance results, rather than just posting numbers with footage of random scenes in the background. I really want to upgrade to a 4080 but prices are not worth it with 5000 series around the corner.
Thanks for the kind words! And yes, hard to argue with that assessment of the market! I certainly wouldn’t be getting a 4080 super at this point in the cycle.
The 4080 is more than enough to do that!!! I use a 4090 just like that and limit it so MUCH! I always use it with DLSS Q on, RT/PT off, and a 116 FPS cap on a 120Hz LG OLED screen. So I could also use a 4080 and wouldn't even notice a difference at those settings!!!
@@SiLva_SuRfa Well, he's got a basic Nvidia factory RTX 4080S or FE. Even there, there are notable differences versus aftermarket overclocked cards. And even among aftermarket overclocked cards, there are notable differences in cooling performance and the quality of the components/overclocks, which translates to more framerate in the end (up to a 10 fps difference, which is huge, and then you can overclock manually on top of that if you really want to squeeze even more frames out of the thing). For instance, the cheapest of all, the PNY slim form factor card, despite having three fans, is not that great compared to a solid Gigabyte model.
According to techpowerup's review, the 4080S averages 95 fps in 4K based on 25 games tested, and that's even without DLSS and FG. I play on a 1440p monitor, but I often use DLDSR downscaling with or without DLSS, and this GPU offers an amazingly high refresh rate experience even at these extremely demanding settings. Even path tracing games run great although at 1440p. I have bought several graphics cards over the years (TNT2, Geforce 2MX, Geforce 3, Geforce 8800Ultra, GTX680, GTX 1080ti, and GTX1080), but I have never been more impressed with a GPU, and the RTX4080S is not even the fastest GPU available (the RTX4090 is up to 50% faster in some games).
It's just us becoming unreasonable. 4K is such a power-hungry milestone to reach. I just went from 1080p (I used to play on an old and crappy 32" LCD TV with horrible latency, bad panel backlighting, and a basic 60Hz refresh rate; the image looked yucky) to a 34" 1440p QLED 21:9 UW monitor (Odyssey G9 OLED) using an RTX 3070 Ti, and I plan to buy an RTX 4080S during Black Friday. Frankly, my card is holding up well at this spot, but I don't reach the possible 175Hz in most modern, moderately demanding games, and UE5 games are really crushing it. I get 60 fps out of UE5 games, but with sacrifices in quality: no Lumen, textures down a notch, shadows a notch, some details, overall running a Frankenstein medium/high/ultra setup generally. Which still looks great, don't get me wrong. But my card is screaming! So to run the most demanding games, the 4080S it is. I'm still not sold on 4K; I'm more sold on the ultrawide aspect ratio for now.
@RonJeremy514 The 3070 Ti is still not a bad GPU, but the 8GB of VRAM limits this card for sure. If I were you, I would wait a little longer (2-3 months) and get the RTX 5080 instead of the 4080S. I only bought the RTX 4080S because it made no sense to pair the GTX 1080 with the Ryzen 7800X3D. When I had the GTX 1080, 4K was not easy to run. DLSS technology, however, makes it very easy to run games at 4K, especially if you use DLSS FG on top of that. On my RTX 4080S, almost every game in my library runs at 4K 120fps with the help of DLSS Quality, and with DLSS FG on top of that, even the very demanding Hellblade 2 (a UE5 game) runs at 120fps. Some people might say that this is just an AI-generated image, but I don't care, because it still looks like a real 4K image to my eyes on my 4K TV, and downscaled 4K on my 1440p monitor also looks awesome. Older games like the Resident Evil 3 remake don't even need DLSS to run at ultra-smooth fps; I get 120-170fps at 4K native in that game, and that's with RT.
I remember when 2x SLI GTX980 was recommended to play GTA5 at 4K, but I never heard people recommending single GTX970 to play at 4K. I guess this card could run X360 / PS3 era ports at 4K, but PS4 ports were just too demanding for sure. The first cards that were described (by reviewers) as capable of running 4K were the GTX1080 and GTX1080ti. I had both and was able to run a number of games at 4K at 60fps. Now my RTX4080S can run almost every game at 4K at high refreshrate except for some of the most demanding RT games, but thanks to DLSS even these games are perfectly playable.
@@PabloB888 Digital Foundry, for example, had a "series" called 4K On A Budget where they played different games at 4K and showed settings. I also remember back then I played the Lego games with my kids at 4K on an RX 480 :) This bar for "4K-capable" moves forward all the time as games get more and more complex.
To be fair, the 4090 can't run all games in 4K at max settings at high refresh rates either. IMO, if it hits 60fps in most games at max non-ray-traced settings at native, then I consider it a 4K card. We also need to take into account that, sadly, games are poorly optimized and developed with upscaling in mind, and that hurts performance at native. Seeing how the industry is right now, I'd be fine with 120 fps at very high with down to DLSS Balanced; the card seems able to do that, but for how long that statement will remain true, I don't know.
Hey, does anyone remember what a 4K card was when 4K was first introduced? By that standard, this card is doing very well. What about when the "Ultra" setting first came out? You're getting good frames on Ultra in 4K with most of your game suite. That said, expecting 144Hz is asking a little too much, but eventually that is basically what a 4K card will be, maybe even with ray tracing. The cards just aren't there yet, but I can see where you're coming from.
I have two monitors for my 4080: an LG 27GR95QE-B and, recently, an AW3225QF. I've actually put in a return request for the AW32, simply because after a weekend of testing 4K against 1440p I wasn't entirely happy with the performance. It wasn't terrible, and it looks amazing, but I'm worried about future-proofing my PC because I won't be upgrading anytime soon, and I know my setup will run pretty much anything maxed out at 1440p, whereas with 4K it's a gamble and reliant on DLSS.
Kinda thinking the same thing. I only have a 1080p TV and a 4k TV atm because everything else broke. I'm worried now I'm used to 4k that 1440p won't seem as sharp. Do you miss 4k or is the difference not that bad?
I bought last year March an RTX 4080 and I play in 1440p because I want those Ray Tracing effects and in 1440p the RTX 4080 is a very capable card of course with DLSS and Frame Generation. To be honest I've never really experienced frametime issues with Frame Generation. I am not a competitive gamer and for me 80-90 fps is "enough" and my monitor is capable of 165 hz but I have no intention to max out.
Should I buy a 4K monitor if I get the 4080 Super with a 7800X3D? I'm worried about having to drop to 1440p gaming if the fps is bad. I just want an average of 60-90 fps with ray tracing and high settings only, no ultra, and I play all game genres. Which monitor should I get, 1440p or 4K OLED, for this GPU if I want 70+ fps in all games at high settings?
You can just get a 1440p monitor and use DLDSR with DLSS for the 4k experience. From my experience the performance tank is not worth the higher pixel density.
I'm currently running a Ryzen 5 5600X and an RTX 3070; noticed you used to run with a 3070 before your upgrade! The best possible CPU my motherboard supports is the Ryzen 9 5950X, so I'll probably grab one on sale and wait for the 50 series cards. Good video!
Wasn't there something about the 40 series cards where not every single card is identical? I heard somewhere that the components were changed in later manufactured versions
Texture settings have vastly increased in fidelity for years, despite resolution remaining 4K. This is a good thing. Resolution is important, but there are many attributes to creating a good visual, and I think we have improved a bunch. Ray tracing will be good on a 5070, and AMD will have closed the gap some. After ray tracing, mesh shaders should start to become more common. Then, hopefully, that last resolution jump to 8K and 12K, as that is the max a very large TV will need in any living room.
I've been using a TUF 4080 Super since release day. The issue I've found is that Nvidia purposely used weaker VRMs and fewer power phases on the 4080 Super to cut costs, and this is the main reason it sometimes performs worse than the regular 4080. The remedy I found was to flash the vBIOS to the Strix 4080 Super one, which provides a 22% higher power limit. This made a difference even though I always undervolt to 900-950mV; now it doesn't hit the power limit and cause frame drops.
To address the stuttering: there are major problems with this card caused by lower-wattage PSUs and poor-quality power cables, but some drivers have also been causing issues.
Don't just tweak hardware (CPU, GPU, RAM, etc.); finding the right in-game settings is important too. I remember turning on SSR in The Witcher 3 next-gen update did nothing for fidelity, but it destroyed my fps. PCMR is not about hardware only.
IMO, panel and graphics tech has just far outpaced the hardware available to support it for the average consumer. 4K was far too demanding for consoles and PC rigs when it was introduced, and still is to a fair degree. The introduction of resolution scaling, checkerboard rendering, and DLSS helped mitigate some of the demand on hardware, but then ray tracing came along, and now they're trying to use frame generation to help with performance. Most average PC rigs, and certainly consoles, can't handle native 4K with medium-high graphics settings, much less ray tracing on top of it.
That's why my panel upgrade went to 1440p 21:9 UW, 175Hz, QLED with VRR and HDR, instead of 4K. I'd rather get the features that really matter to me and have a great experience than chase 4K alone; I'm not sold on making a lot more sacrifices just to meet the resolution's requirements. It's like trying to play Ace Combat or Battlefield's aircraft, with their arcade handling, on an expensive HOTAS setup; it doesn't really make any sense.
I was considering grabbing this card to my build but after reviewing some benchmarks similar to this one, I am reconsidering whether or not I should wait for the next gen. It is a little bit of a shame like you said for a card this expensive to have to rely on software features and tweaks.
In my humble opinion, a fair standard for a card of any resolution tier is being able to run that resolution at a STABLE 60fps at high settings. That's what makes it a true 4K or 1440p or 1080p card: can it run most games on medium-to-high settings at a stable 60? I think that's the fairest standard, since it ignores BOTH the snobs and the apologists and is realistic. High-end cards of that tier are valued on how far past that they can take you, and anything less than that is actually a lower-tier card that happens to punch above its weight on some things.
I recently acquired the 4080 Super from a friend who originally bought it but later upgraded to the 4090, for reasons. I paired it with my i9-13900KS and made an Elden Ring mod video. Someone pointed out that my CPU was a bit long in the tooth. Nowadays, there seems to be constant pressure to always have the latest and greatest, or else you're made to feel like you're already falling behind. The graphics card is good, as it should be, but when it comes to ray tracing, it seems like they exaggerated its capabilities. The only card that truly excels in ray tracing at high frames is the 4090, and even then, some games can bring it to its knees. It's wild to spend anywhere from $2,500 to over $5,000 and still not be able to play Black Myth: Wukong over 70 fps at max settings with ray tracing.
What a ridiculous standard. 60fps ultra settings in the most modern titles is something that's only happened in a couple generations. The 4090 can be easily crushed. The 5090 will be easily crushed.
Instead of using presets like extreme or ultra, I would recommend custom settings. Often there are a few settings which are very expensive but don't provide any real visual benefit.
Pretty much sums up modern games. Running everything at 100% ultra is now quite a luxury, even for the premium cards. Some ultra settings' performance sacrifices clearly outweigh the visual benefits. Some things will remain unchanged though, like having high-res parallax textures, particles, ambient occlusion, details, shadows, long-distance LOD rendering, and VFX, all while running at a good resolution. But ray tracing? I can live without it. 16x AA? DLSS is perfectly fine and is being improved at a fast pace to look better and tank performance a lot less.
I am trying to decide between the 4080S and the 7900 XTX. Both have custom options at or below $1,000; however, the XTX has 24GB of VRAM. I noticed from many of the benchmarks that they perform within a 10% margin in most titles. Is that because most games do not require more than 16GB of VRAM, or because the XTX doesn't have the cores to take advantage of the available memory? If you had to choose, which would you pick?
I've had absolutely no issues with my 7900XTX. DLSS Quality is decent if it doesn't affect the visuals. But tbh in some games it is noticeable. FSR is behind when it comes to sheer FPS numbers but I find it is a smoother render. Plus most games have FSR as it's open source. As for power consumption, if you're that worried, just undervolt it. But tbh 100Watts shouldn't scare anyone if you can truly afford this GPU. It's like buying a Ferrari or Lambo and worrying about the MPG.
I was deciding between the two back in May. Went with a ProArt 4080 Super for the aesthetics in the end. Edit: oh, and better resale value for Nvidia? Another point to consider. Maybe the XTX would have been just as good (or better?) but I’m happy with my Super. I play mmorpgs mostly.
Nice video! For future content regarding Counter strike. In the steam launch options for cs2 you can add this command to allow msi afterburner overlay: -allow_third_party_software -W
I would still say it's a 4K card. I have been enjoying RDR2, Horizon Forbidden West, and Spider-Man with ray tracing, all of them at ultra settings native, and it works super well for high-refresh-rate single-player games like Forza or Doom Eternal. But playing shooters in 4K is a bit overkill, man. Like, if you want to play the campaign in 4K, I get it, but I use a 240Hz 1440p screen for COD and I hit 300 fps with balanced settings in that game. I really love this card, honestly.
They're just trippin'! I got the same card with a 5700X3D and have no issues at all playing everything in 4K. People are complaining about a chip on a dead platform that still kicks behinds today, even compared to the 7000/9000 chips. Like you said, anything above 60 is just fine, and nothing less than 30. I have not seen one game on my rig that I can't run at 4K max graphics.
@charlessmith9369 I'm running my 4080 + 7800X3D at 1440p due to my 240Hz monitor, so the frames never get that low. My PS5 running at 30-60 would never stop me from playing on it. I'm new to PC gaming, but I'm thinking that when a PC's frames get that low, it becomes unstable. I don't know, I'm trying to learn.
Technically speaking, the 3080 12GB was a 4K card, and the 4070 has comparable performance. In most games you can definitely game at 4K with a 4070 or above, but there are a few outliers, of course.
I'm perfectly happy with max settings 1440p at 100+ fps average. 4k is completely overrated since the details are negligible. What's NOT negligible is the responsiveness that comes with high fps
Bad 0.1% lows? You definitely have an AMD CPU. The AMdip is incredibly annoying. And now that Intel also went with a chiplet design, we have no low latency options. 😔
Good video. I bought a 4090 at $2200 Cdn, this was before the 4080 Super release. The 4080 (non Super) was around $1700 Cdn. I figured if I was going to spend $1700 for a card then I might as well go all in and spend $2200. It's good that the 4080 Super is cheaper.
Even the RTX 4090 is "optimal" at Native 1440p, 1440p 240hz would be the ideal display to pair with an RTX 4070 to RTX 4090 - From 120hz to 240hz. At Native 1440p you can enjoy a high refresh rate experience ( 90 FPS + ) without the drawbacks of visual artifacts and increased input lag brought about via AI Upscaling and "Frame Gen". 4K is doable with an RTX 4090 in many cases but I wouldn't call 4K "optimal" that would be Native 1440p.
They classified the 3080 as a 4K gaming card, so I went out and bought one, and I can tell you now: no, it wasn't. It couldn't even do Shadow of the Tomb Raider maxed out in 4K with its crappy (for its time, compared to today's in-game ray tracing) ray tracing, though I only had a 5600X CPU. Your 4K gaming is only good because you have a good CPU as well; it's not just the GPU doing the 4K, and you won't get the same performance with a lesser CPU paired with it. So just buy a 4090, or wait for the new 5090 with the best CPU, and then you're OK for at least another 6 months before it starts to get useless with the next new tech around the corner lol
The CPU doesn't matter at 4K as much as the GPU, unless the game is also heavily CPU-dependent; 4K puts a huge strain on the GPU 99% of the time in most games. That doesn't mean you should cheap out on a CPU, either.
It's a $1000 card because Nvidia has been price fixing. There is absolutely no reason 2 year old tech should be sold at the same MSRP as release. All tech should depreciate in price over time.
Okay, you probably saved me a few hundred bucks. I have a similar system (5800X3D, 2x16GB RAM, etc.) and I was thinking about upgrading my 1440p TN monitor to a 4K OLED 240Hz panel and my RX 6800 XT to a 4080, but holy.... I didn't realize that 4K gaming is so taxing compared to 1440p that even the 4080 struggles to play some games at 4K natively without upscalers. Well, I will reconsider. I will either buy the 4K monitor now and wait for the RTX 5080, or buy an RTX 4080 + UW 1440p high-refresh-rate panel now.
Can someone explain to me why these cards can't play games at 4K native but cost $1k and more? Nvidia and AMD are relying on software and RT cores to get the FPS into the upper ranges in these games, but what happened to just having the fastest card, without all the gimmicks? If I were to compare it to cars, it would be comparing a V8 to a turbo 4 or 6! Is it so bad to want a card that costs half my rent payment to just be the baddest, "fastest" S.O.B. in the neighborhood (or in my computer, for this purpose), one that will play any game at 4K 60 and above without the 8- or 10-speed automatic and AWD to help it?
The 4080 S is a 1440p card in reality. If you play in 1440p, you can use ultra settings and DLSS and get 100+ FPS and this card will take you for the next 5 years. At 4k, even the 4090 can struggle.
I think having seen these tests, it's good on 4k depending on the game and within what you might expect in a lot of instances especially with the high needs of 2077 and Rift Apart. My only thing in testing, I know it's personal preference, but I find DLSS Quality to be a less practical choice than balanced, and I think that setting can make a decent amount of difference. It's almost like the effect raytracing has in some games where the visual uplift you get from RT in the game is marginal but the performance hit is quite a lot. I felt that particularly when you showed Diablo 4. I didn't really notice a difference with visuals but there was a decent performance hit with it on. When I've tested DLSS Quality in a game like 2077, the visuals look ever so slightly better if I take the time to sit around and nitpick, but the performance cost is a bit much. I think balanced is the way to go on that front.
For me, a 4K card is one where you don't need an upscaler, but that's long gone now tbh, because they force some kind of RT. But does it look good enough that you can ignore DLSS? Sure, probably, though I don't have one, so I can't say for sure.
My take is this: if it runs over 60 at 4K with DLSS at Performance, then yes. I say Performance because it's that good, but the 4080 does almost everything at DLSS Quality or native for the games I play. Also, if it can run PT, that's the ultimate test; two cards right now can, and one's the 4080. The difference is that you can't go below Quality with FSR, which is why the best AMD cards aren't as good as 4K cards. DLSS Performance is on par with, if not better than, FSR Quality from my tests.
Nope, there are no true 4K cards, not even the 4090. I own a 4080 S. When you can run native 4K above 120 in all games without "features," that will be a true 4K card.
I don't like statements like this. You might as well say there are no true 1080p cards, because there are moments in Cyberpunk 2077 that will fall below 60fps if you have everything maxed with path tracing on. And if you up the requirement to 120fps at all times, lots of games would not be able to accomplish this. I guess the 4090 is only a true 720p card according to your logic.
Great video, thanks a lot! IMHO, avg fps is only part of the story; a smooth experience is more important to me. E.g., my 4080 plus 5600X system runs The Last of Us Part 1 at ultra settings at 4K at ~80fps, but the 1% lows are below 40fps, which can spoil the entire experience with stuttering. BUT the reason here is the CPU bottlenecking by running close to 100%. So yes, the 4080 is a true 4K card in that game, but my system is not. To answer your question: my dream 4K system would be able to run 90fps 1% lows with RTX on and DLSS Quality on. The reason is that I love the eye candy, and my old eyes cannot resolve the difference between native and DLSS Quality anymore 😊. Btw, my system gets close to this target in probably 90% of the triple-A titles I play. I'm still thinking of upgrading because I enjoy pimping my system, so I am a very happy victim of the industry. Cheers!
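The avg-vs-1%-low distinction this comment leans on is easy to compute yourself from a frametime log (tools like MSI Afterburner or CapFrameX can export one). A minimal sketch, assuming frametimes in milliseconds and using one common definition of "1% low" (the fps of the slowest 1% of frames):

```python
# Compute average fps and "1% low" fps from a list of frametimes (ms).
# "1% low" here = fps computed over the slowest 1% of frames; other
# tools use slightly different definitions (e.g. 99th-percentile frametime).

def fps_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000 * n / sum(frametimes_ms)
    # Take the slowest 1% of frames (at least one frame).
    worst = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Mostly ~12.5 ms frames (80 fps) with an occasional 30 ms stutter (~33 fps):
frames = [12.5] * 99 + [30.0]
avg, low = fps_stats(frames)
print(round(avg), round(low))  # prints: 79 33
```

This is exactly the comment's situation in miniature: a healthy-looking average with lows that tell a very different smoothness story.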
That's how I look at it as well: either get the best value with the 70 Super tier or under, or else just buy a 90-tier card, because the middle ground doesn't make much sense anymore when it still natively underperforms so frequently at around $1,000+. I wouldn't be surprised if 80 / 70 Ti Super buyers have the most remorse, given how overpriced they are, while most often wishing for or expecting way better performance. I mean, when you spend around the $1,000 threshold or more, gamers really expect a clear separation of elite 4K performance, without having to tone things down and basically play like much cheaper cards do. I think the 5090 will be the first card where enthusiasts finally have just barely enough, and the 6080 will probably be the first 80-series card where, at that price point, it's enough as well, and then the 70-tier cards will finally feel like that when the 7000 series drops.. but who knows, because graphics engines keep getting more complex and taxing as well.
Have I been in a coma? When did 120+ fps become "minimum"?! Okay, I have a 48" 1440p monitor and I'm still taking my 4080 OC back to get a 4090. Out of pure shame. And then I'm coming back and playing Skyrim and Unreal Tournament as per usual. Because I'm old. Can't wait for my 9800X3D and 64GB of 6400MHz RAM for playing Solitaire and running the pipes screensaver with textures!
Quickly invalidated your own point by claiming the 0.1% low is 7fps when it was a loading screen, and it never dropped below 100fps in the actual game even on the highest preset. Get a better PC
I’d say a consistent 60fps would be a 4k card in my opinion
Wow 110 fps avg in 4K and extreme preset and this guy says theres a problem? People nowadays are totally spoiled
It’s not even the card struggling. It’s poor optimization by the devs
@@hankb27 that and maybe the 5800x3d bottle necking the 4080 ??
@@hankb27 It's not, that's why they are different kinds of settings. You are not supposed to be able to play ultra quality at 4k at 120fps; you can lower and tweak the settings to achieve that goal.
@@h1roshema 4k native gets me to 100fps
@firstsonofthesea7938 My point was that it's not a requirement. 4k is a very hard resolution to run; kids these days expect the 4090 to run 4k 140fps in every game on ultra.
The Very High preset in CS2 uses 8x MSAA by default, which at 4k is not only way overkill but also extremely demanding. I play 2x MSAA at 1440p and it looks nice and sharp; at 4k I imagine you could get away with no MSAA. The biggest reason you saw so much improvement dropping down to lower and lower presets is that the MSAA setting was being lowered.
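To put rough numbers on how heavy 8x MSAA gets at 4k, here is a back-of-envelope sketch of just the multisampled color + depth buffer sizes. This assumes 4 bytes of color and 4 bytes of depth/stencil per sample; real drivers compress these surfaces, so treat the results as upper bounds.

```python
# Rough MSAA framebuffer cost. Assumes 4-byte RGBA color plus
# 4-byte depth/stencil per sample (8 bytes total per sample).
def msaa_buffer_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for res, (w, h) in {"1440p": (2560, 1440), "4k": (3840, 2160)}.items():
    for s in (1, 2, 8):
        print(f"{res} {s}x MSAA: {msaa_buffer_mb(w, h, s):.0f} MiB")
```

By this estimate, 2x MSAA at 1440p is roughly 56 MiB of buffer, while 8x MSAA at 4k is over 500 MiB that must be written and resolved every frame, which lines up with the big gains from dropping the preset.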
Depending on what level of performance you are expecting, the RTX 4080 with a top-of-the-line CPU can crush 60fps in all games at respectable settings in 4k.
It absolutely is; I've only ever gamed in 4k on it. We can't just go by Alan Wake and Cyberpunk when all you need to do is adjust 2 settings and Alan Wake is over 60fps. Also, we shouldn't judge 4k performance on ray tracing results; that doesn't make any sense at all, it's an additional setting.
So far most games I play are 80+ fps in native 4k with normal ultra settings. For sim racers like myself, all the sim racing games I play like ACC and iRacing are over 130fps at native 4k. It's a 4k card, absolutely. I just hate the way mainstream media wants to go by 1 or 2 games in the worst use case ever, with poor optimization, and act as if we can't just reduce a few settings to high and get that 60fps mark, lol. Even in the most intensive titles. The fact it can get close to 120 in many games is pretty insane at 4k. 4k should not be held to the same standard as 1080p or 1440p. Anything over 60fps in 4k is acceptable because nobody is playing competitive shooters at 4k; it's made for amazing-looking single-player games. Then we have DLSS Quality, and man oh man, it's amazing.
Just always keep in mind that some games were not designed with 4k in mind and weren't well optimized for it. Now we have devs throwing everything they can at our games and going, yep, let your hardware take care of it. Crazy, because new games look amazing, but we all know they aren't as crispy as they used to be for some reason. Devs definitely don't go around making sure everything is clean like before; it's all based on anti-aliasing now, which sucks, lol.
Sometimes I'll play an older game, look at road textures and details, and go, wow, how is it so sharp and clean? They want us to think graphics have improved so much, but only the people and certain things in sight have. Many of the surrounding areas are actually worse. Buildings legit would look 3-dimensional.
But one day we will see it come back. To me, Spider-Man proves how good PC could be, because that was done based on a console! Now imagine that game being built from the ground up purely for gaming, with the devs putting that much effort into it? We would have a truly groundbreaking experience.
Race sims are designed to run high fps on a potato, like competitive shooters. It's not optimization at all. He has a very fair selection of titles here, and they're hardware limited.
@@drewnewby His test wasn't bad; this wasn't about his test. It's very clear it's a 4k card, as he showed here. I was just speaking in general about mainstream media.
Nah man, even the 4090 is not a true 4k card; the 4080 is far too weak, not even counting RT. UE5 games are not playable at native 4k with a 4080
@Dempig What? Lol, that's so false. Even you saw that in the most demanding games 58fps is playable, lol. And again, why can't you reduce a few settings to high instead? It's still native 4k resolution, I don't get that, and it's still better than 1440p ultra doing it that way. Some of y'all's logic with mainstream showings blows my mind.
@Dempig Again, a few games that it has a rough time with in 4k doesn't make it not a 4k card. That doesn't make any sense at all. It also comes down to optimization, but again, I've yet to find a single game I can't get over 60fps in by reducing a few settings to high instead.
Great vid. I purchased a 4080 super off the back of your flightsim experience with it and I’m happy. Yes the card costs a fortune but the days of cheap fast GPUs are long gone.
Very true. I remember thinking £330 was a lot for a 6800 Ultra back in the day
The cost of evolving from SLI and Crossfire.
@@zeroturn7091 Reminds me of the days when you could get a PC running very low on power, like sub 450W for two cards, having a GTX 460 SLI to match a GTX 580 performance. Those days are long gone now indeed.
@@davidnott_ I remember buying a 1070 FE for $350 from Best Buy and then upgrading a couple years later to a 2070 Strix card from Best Buy for $499. Then a 3080 12GB for $800 in 2022. Every purchase, thinking to myself that I was totally insane to spend that kind of money on a GPU. Looking over at my new 4080S... I now remember the 1070 and 2070 purchases as the "good ole days" of cheap GPU's! LOL
What I classify as a true 4K graphics card is one able to play modern games in 4K at 60 frames, and older games in 4K at 120, with high/extreme settings
agreed! saying 4K ULTRA at 90 FPS is bad boggles my mind!
Very cool vid thanks for making it. Talking about Cyberpunk 2077, even the 4090 won't get you anywhere near 120 at the highest ray tracing preset in 4k. With DLSS and FG then you're looking at 90-100fps so it really is as you said a time to redefine what we mean by "4k GPU" because under some of these metrics not even the best GPU on the market is a 4k GPU.
So true. Graphics are currently in a huge "leap bubble"; there was one of these leaps going on in 1999-2004, where the visual software tech (games) was quickly outpacing the hardware. The difference between now and back then is that back then, new, more capable hardware was coming out every year, and that hardware was significantly cheaper (don't give me this inflation bullshit; the economy of 1999-2004 was much better, and it was generally easier to afford a top-end card back then — I did so several times on a lower-wage job while supporting myself).
Today cards take 3 years to develop and cost way more in work/time resources. Back in 1999 I started my first job making $6.75 an hour, and I quickly moved up to $7.25 an hour. It was a 40-hour-a-week job; at that rate I made $1,160 every month, and after taxes I came in around $900-$950 a month. The top-end GeForce 2 Ultra I got in 2000-2001 cost me $270, and even after buying it I had room for bills and food. Today that same job pays $12 an hour, or $1,920 before taxes; after taxes you are looking at around $1,500. Today's top-end video card (the 4090) costs more than what I'd make in a month doing the same job I did in 1999-2002, whereas before, the top-end card cost me only a third of what I made in a month.
Sorry, didn't mean to get into a rant about our shit economy. My point was that right now graphics technology is hitting a "milestone" in advancement, and it'll be a couple more generations of hardware before 4k w/ RT @ 60FPS becomes a standard, normal thing. That said, if you ignore RT for the time being, several cards are technically "4k cards":
Here's my list of 4k cards (no RT) from benches I've seen (both on YT and with my own hands-on benching):
Radeons
6900xt
6950xt
6800xt
7900xtx
7900xt
7900gre
7800xt
Geforces
3080
3090
3090ti
4070 Super
4070ti
4070ti Super
4080
4080 Super
4090
Honorable mentions :
these are cards that can pull off slightly older games in 4k easily or can manage 4k with some settings tinkering
3070
3070ti
3060 ti (12 gb version)
6700xt
7700xt
4070
4060 ti (16GB version)
@@DenverStarkey Good rant. My current solution to this dilemma is also ignoring RT. I have the 4080 Super. Disabling RT finally makes it feel like the performance a card in this category SHOULD have: smooth 4k ultra, or 1440p at a locked 165Hz. Even frame generation at such a base FPS is very nice and has minimal latency.
Really the expectation should be 120fps minimum at 4k for the price of a 4090, especially when you look at what you used to get for much less money in the top-tier card.
I get 105fps average in Cyberpunk on the benchmark, which is still well playable, but the card was nearly 2 grand, which is shocking. In a year's time it will probably be a 1440p card, given the game engines being used in some cases.
@@chrissyboy7047 Nah, not in a year's time. Sure, we are getting a huge visual leap with real-time RT and path tracing, but it's still not as fast as things were moving between 1999-2004. Maybe in 4-5 years the 4090 will be a 1440p card, but even then only in the newest games. For instance, my old 1070 (which resides in my mom's PC now) can still blast Doom 2016, Skyrim, Resident Evil 2 Remake, and a slew of other games at 4k 60FPS.
It's not like the games of the current year will all of a sudden play like shit in 5 years.
And with the way things are going, modern triple-A games are just getting worse and worse with the bullshit agenda-pushing anyway. Who knows, in 5 years there might not be any new games you care to play. These last couple of months of saving for the rest of my new PC (I already got the 4080 Super for it) I've been playing a lot of really old games, like stuff from 20+ years ago. At times I wonder why I even care to finish getting the new PC... but well, I love building new systems and I already got the case, so yeah. But after this one, I'm just not sure when my next build will be, or if I'll even be interested in the crop of games that are out 5+ years from now.
@@chrissyboy7047 Yeah, I know, these are weird times. Tbh the only one who wins in this sh*tshow is Nvidia. With new features like path tracing, the price per frame has increased massively; there is minimal visual improvement at a massive cost in performance. While we fight down here in YouTube comment sections and on Reddit, discussing whether we can notice ray tracing in certain titles, they bag thousands of dollars per person each generation.
THANK YOU. You are one of the few that show settings changes in real time and give a complete walkthrough and explanation of performance results, rather than just posting numbers with footage of random scenes in the background. I really want to upgrade to a 4080 but prices are not worth it with 5000 series around the corner.
Thanks for the kind words! And yes, hard to argue with that assessment of the market! I certainly wouldn’t be getting a 4080 super at this point in the cycle.
120 fps to match the 120Hz refresh of the latest TV's.
What is your CPU?
The 4080 is more than enough to do that!!! I use a 4090 just like that and limit it so much! I always use it with DLSS Q on, RT/PT off, and a 116 FPS cap on a 120Hz LG OLED screen. So I could also use a 4080 and wouldn't even notice a difference with those settings!!!
@@Serandi1987 Why 116fps cap and not 120 fps cap?
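Not the original commenter, but the usual reason for caps like 116 at 120Hz is G-Sync/VRR headroom: capping a few fps below the refresh rate keeps frame times inside the VRR window so V-Sync (and its latency) never engages. A commonly cited approximation of what low-latency limiters such as Nvidia Reflex appear to use — not an official formula — is refresh − refresh²/3600:

```python
# Commonly cited VRR frame-cap heuristic (an approximation of what
# low-latency limiters use, not an official Nvidia formula).
def vrr_fps_cap(refresh_hz):
    # Stay just below the refresh rate so frames never queue
    # behind V-Sync inside the VRR window.
    return int(refresh_hz - refresh_hz * refresh_hz / 3600)

print(vrr_fps_cap(120))  # 116
print(vrr_fps_cap(144))  # 138
```

Which matches the 116 fps cap on a 120Hz panel exactly.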
I must have a golden sample 4080. I get 134-145 fps on Forza horizon 5 4k ultra.
with DLSS and/or frame gen? I get 137 with no DLSS or sampling in 4K
Yeah, there is something iffy about this dude's 4080 Super; it wasn't getting the correct amount of performance in any of the games
@@SiLva_SuRfa Well he's got a basic Nvidia factory RTX 4080S or FE. Even there, there are notable differences with aftermarket overclocked cards. And even among aftermarket overclocked cards, there are notable differences with the cooling performances and quality of the components/overclocks which translates to more framerate in the end (that can go up to 10 fps difference which is huge, and then you can overclock manually on top of that if you really want to squeeze even more frames out of the thing). For instance, the cheapest of all, the PNY slim form factor card, despite having three fans, is not that great compared to a solid Gigabyte model.
@@SiLva_SuRfa He is using an old CPU with DDR4; that could be the limiting factor.
According to techpowerup's review, the 4080S averages 95 fps in 4K based on 25 games tested, and that's even without DLSS and FG. I play on a 1440p monitor, but I often use DLDSR downscaling with or without DLSS, and this GPU offers an amazingly high refresh rate experience even at these extremely demanding settings. Even path tracing games run great although at 1440p. I have bought several graphics cards over the years (TNT2, Geforce 2MX, Geforce 3, Geforce 8800Ultra, GTX680, GTX 1080ti, and GTX1080), but I have never been more impressed with a GPU, and the RTX4080S is not even the fastest GPU available (the RTX4090 is up to 50% faster in some games).
It's just us becoming unreasonable. 4K is such a power-hungry milestone to reach. I just went from 1080p (I used to play on an old and crappy 32" LCD TV with horrible latency, bad panel backlighting, and a basic 60Hz refresh rate; the image looked yucky) to a 34" 1440p QLED 21:9 UW monitor (Odyssey G9 OLED) using an RTX 3070 Ti, and I plan to buy an RTX 4080S during Black Friday. Frankly my card is holding up well at this spot, but I don't reach the possible 175Hz in most modern and moderately demanding games, and UE5 games are really crushing it. I get 60 fps out of UE5 games but with sacrifices in quality: no Lumen, reducing textures a notch, shadows a notch, some details; overall running a frankenstein medium/high/ultra mix generally. Which still looks great, don't get me wrong. But my card is screaming! So to run the most demanding games, 4080S it is. I'm still not sold on 4K; I'm more sold on the ultrawide aspect ratio for now.
@RonJeremy514 The 3070ti is still not a bad GPU, but the 8GB VRAM limits this card for sure. If I were you, I would wait a little longer (2-3 months) with the upgrade and get the RTX5080 instead of the 4080S. I only bought the RTX4080S because it made no sense to pair the GTX1080 with the Ryzen 7800X3D.
When I had a GTX1080, 4K was not easy to run. DLSS technology, however, makes it very easy to run games at 4K, especially if you use DLSS FG on top of that. On my RTX4080S, almost every game in my library runs at 4K 120fps with the help of DLSS Quality, and with DLSS FG on top of that, even the very demanding Hellblade 2 (a UE5 game) runs at 120fps. Some people might say that this is just an AI-generated image, but I don't care, because it still looks like a real 4K image to my eyes on my 4K TV, and downscaled 4K on my 1440p monitor also looks awesome. Older games like the Resident Evil 3 remake don't even need DLSS to run at ultra-smooth fps; I get 120-170fps at 4K native in that game, and that's with RT.
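For context on why DLSS makes 4K so much cheaper: each preset renders internally at a fraction of the output resolution and then upscales. Using the commonly cited per-axis scale factors (Quality about 2/3, Balanced about 0.58, Performance 1/2 — approximate figures, and games can override them), 4K DLSS Quality costs roughly the same as native 1440p:

```python
# Commonly cited DLSS per-axis render-scale factors (approximate;
# individual games may override these).
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w, out_h, mode):
    # Internal render resolution before the upscale pass.
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "quality"))      # (2560, 1440)
print(internal_res(3840, 2160, "performance"))  # (1920, 1080)
```

So "4K DLSS Quality looks like 4K but runs like 1440p" is close to literally true in terms of shaded pixels.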
I remember when GTX970 was classified as a 4k card.
I remember when 2x SLI GTX980 was recommended to play GTA5 at 4K, but I never heard people recommending single GTX970 to play at 4K. I guess this card could run X360 / PS3 era ports at 4K, but PS4 ports were just too demanding for sure.
The first cards that were described (by reviewers) as capable of running 4K were the GTX1080 and GTX1080ti. I had both and was able to run a number of games at 4K at 60fps. Now my RTX4080S can run almost every game at 4K at high refreshrate except for some of the most demanding RT games, but thanks to DLSS even these games are perfectly playable.
@@PabloB888 Digital foundry for example had a "series" called 4K On A Budget where they played different games at 4k and showed settings. I also remember back then I played the Lego games with my kids at 4k on a RX480 :) This 4k-capable moves forward all the time, when the games get more and more complex.
@@AndrewTSq I watched this video and it seems you are right, although 30 fps on medium / low settings in Crysis 3 is a bit rough.
4k 15fps
4k, 12fps
Consistent 60 on native, and 120+ with the Nvidia sorcery turned on
A card needs to hold 60fps in most titles at native 4k at medium settings for me to think it's a 4k card.
And the 4080S does that just fine
A 4070 super can do that for nearly half the price of this card.
@@Freshstart1011 I have a 4080 super. It rarely holds native 4k 60fps. What are you talking about? Proof? Source?
@@Freshstart1011 what game at 4k native with no upscaling to avg 60fps? Something from 2018?
@@2Guys1ControllerShow Turn down a few settings and boom, native 4k, buddy
2:50. Average over 100 fps? And you think the performance is bad? You're crazy.
To be fair, the 4090 can't run all games in 4k at max settings at high refresh rates either. IMO, if it hits 60fps in most games at max non-ray-traced native, then I consider it a 4k card. We also need to take into account that, sadly, games are poorly optimized and developed with upscaling in mind, and that affects performance at native. Seeing how the industry is right now, I'd be fine with 120 fps at very high with down to DLSS Balanced; the card seems able to do that, but for how long that statement will remain true I don't know.
Hey, does anyone remember what a 4k card was when 4k was first introduced?
By that standard, that card is doing very well.
What about when the setting "Ultra" first came out?
You getting good frames on Ultra in 4k with most of your game suite.
That said, expecting 144hz is asking a little too much, but eventually, that is basically what a 4k card will be. Maybe even with Ray tracing.
The cards just aren't there yet, but I can see where you're coming from.
Ratchet and Clank needs a restart after changing settings. To be safe, you should restart most games after fiddling with settings
for me a 4k card need to do at least 4k 60fps native
for a good one need to do: 4k RT 120fps upscaled in quality mode
I mean the 4080s can actually do that
@@The_one_that_got_away Yes, for sure
@@The_one_that_got_away a 4090 cant to native 4k60 RT...
@@toivopro it can depends on the game
@@toivopro in an unoptimised title even a rtx 9090 won't be able to do native lol
Short answer: yes. Max settings 60-120fps. Love my gallardo 4080
I have two monitors for my 4080 -
LG 27GR95QE-B and recently a aw3225qf.
I have actually put in a return submission for the aw32 simply because after a weekend of testing 4k over 1440p I wasn’t entirely happy with performance.
It wasn't terrible, it looks amazing, but I am worried about future-proofing my PC because I won't be upgrading anytime soon, and I know my setup will run pretty much anything maxed out at 1440p, whereas with 4k it's a gamble and reliant on DLSS.
Kinda thinking the same thing. I only have a 1080p TV and a 4k TV atm because everything else broke. I'm worried now I'm used to 4k that 1440p won't seem as sharp. Do you miss 4k or is the difference not that bad?
@@Kelig You can tell the difference especially running in native 4k in games.
@vulcanjoe8258 I think for now I'll go 1440p and wait for the RTX5000 series, then upgrade to a 4k monitor. Think that's sensible?
People are judging cards off of tech demos. No one smart plays at ultra. High is usually a waste in a modern game.
Second best at 4k currently, easily a 4k card. I've got a QN90A 4k TV and it doesn't fail me, ever, even with PT.
The problem isn't the GPU, it's the game itself and those who optimized it; the low fps is because of bad optimization
My 3090 is a 4K gaming card because it can still play games today in 4K, so yes the 4080 Super is also "a 4K Card" 🤦🏽
😂 A 3090 a 4k card? You are one of those who like 30 fps on ultra. I have a 3090 Ti; it's a 1440p card
I bought last year March an RTX 4080 and I play in 1440p because I want those Ray Tracing effects and in 1440p the RTX 4080 is a very capable card of course with DLSS and Frame Generation. To be honest I've never really experienced frametime issues with Frame Generation. I am not a competitive gamer and for me 80-90 fps is "enough" and my monitor is capable of 165 hz but I have no intention to max out.
If you care more about resolution, RX 7900 XTX is probably a better choice with inferior RT. If RT is more important then 4080S.
Should I buy a 4k monitor if I get the 4080 super with 7800x3D?
Worry about having to downscale to 1440p gaming if the fps is bad. I just want average 60-90 fps with ray tracing and high settings only, no ultra. I play all game genres.
Which monitor should I get? 1440p vs 4k OLED for this gpu if I want 70+ fps on all games at high settings no ultra?
You can just get a 1440p monitor and use DLDSR with DLSS for the 4k experience. From my experience the performance tank is not worth the higher pixel density.
I got a 1440p 360Hz OLED for my 4080. When you play at 1440, the 4080 will last the next 5 years
Def is a 4k card. I only play in 4k on mine and am usually just setting things to ultra and enjoying.
I'm currently running a Ryzen 5 5600X and an RTX 3070; noticed you used to run with a 3070 before your upgrade! The best possible CPU my motherboard supports is the Ryzen 9 5950X, so I'll probably grab one on sale and wait for the 50 series cards. Good video!
Wasn't there something about the 40 series cards where not every single card is identical? I heard somewhere that the components were changed in later manufactured versions
Texture settings have vastly increased in fidelity for years, despite remaining 4k.
This is a good thing. Resolution is important, but there are many attributes to creating a good visual and i think we have improved a bunch.
Ray tracing will be good on a 5070, and AMD will have closed the gap some.
After Raytracing, mesh shaders should start to become more common. Then hopefully that last resolution jump to 8k and 12k as that is the max a very large TV will need in any living room.
Your content is so underrated. You need more subscribers! Keep it up my man.
I've been using a TUF 4080 Super since release day. The issue I've found is that Nvidia purposely used weaker VRMs and fewer power phases on the 4080 Super to cut cost; this is the main reason it sometimes performs worse than the regular 4080. The remedy I found was to flash the vBIOS to the Strix 4080 Super's, which provides a 22% higher power limit. This made a difference even though I always undervolt to 900-950mV; now it doesn't hit the power limit and cause frame drops.
Just got an MSI 4080 Super to pair with a 7950X3D. I use a 32:9 monitor and a 34-inch 1440p. I love it so far
My personal thought is that 90 fps gives you a very enjoyable experience in most games. Chasing 100+ fps is too costly and unnecessary.
To address the stuttering: there are major problems with this card caused by lower-wattage PSUs and poor-quality power cables, and some drivers have also been causing problems
It's a 1440p Ultrawide GPU 🤌🏾
Used to play on a 24-inch 1920x1200 LCD after buying a 1070, but now that I have a 4080 I play on a 24-inch 2560x1440 LCD
Don't just tweak hardware (CPU, GPU, RAM, etc); finding the perfect in-game settings is important too.
I remember turning on SSR in The Witcher 3 next-gen update did nothing for fidelity, but it destroyed my fps.
PCMR is not about hardware only.
New viewer here, I never heard someone so disappointed with a 4080S 😆
+1sub
Do you undervolt your gpu? I was pretty surprised at the wattage throughout the testing
I have a 4080 Super with an i9 13900K and I get way over 200 fps on ultra at 1440p
War Zone stuttering is due to GPU memory exceeded.
For Warzone, it's simply because the 16GB of the RTX 4080S was full; if the card had 18GB or 20GB, the 1% lows would be better, with less stuttering
Imo panel and graphics tech has just far outpaced the hardware available to support it for the average consumer. 4K was far too demanding for consoles and PC rigs when it was introduced, and still is to a fair degree. The introduction of resolution scaling, checkerboard rendering, and DLSS helped mitigate some of the demand on hardware, but then ray tracing came along, and now they're trying to use frame generation to help with performance. Most average PC rigs, and certainly consoles, can't handle native 4K with medium-high graphics settings, let alone ray tracing on top of it
That's why my panel upgrade went to 1440p 21:9 UW, 175Hz, QLED with VRR and HDR, instead of 4K. I'd rather get the features that really matter to me and have a great experience than chase 4K alone; I'm not sold on making that many sacrifices just to meet the resolution's requirements. It's like trying to play Ace Combat or Battlefield's aircraft with arcade gameplay but on an expensive HOTAS setup; it doesn't really make any sense.
4k 120, I can do that on my 3080 mobile card. At medium/high settings. Kicks ass…. Extreme?! That’s insanely good.
Trust the GeForce Experience settings. It usually sets DLSS to performance, but you can switch it to balanced.
The 0.1% lows are sooo bad. I wonder what was causing it. Could it be shader compilation?
Great video, too bad youtube thinks I should watch it at 720p on a 4K display.
This is the way 😅
You are funny, a UW display and black bars constantly for YouTube. I wish there was more UW content.
I was considering grabbing this card to my build but after reviewing some benchmarks similar to this one, I am reconsidering whether or not I should wait for the next gen. It is a little bit of a shame like you said for a card this expensive to have to rely on software features and tweaks.
In my humble opinion I think a fair standard of cards of any fidelity is being able to do that fidelity at a STABLE 60fps at high settings. That's what makes it a true 4k or 1440 or 1080p card. Can it run most games on medium to high settings at a stable 60. I think that's the fairest point that ignores BOTH the snobs and the apologists and is realistic. High end cards of that fidelity are valued on how far past that they can give you and anything less than that is actually a lower fidelity card that happens to be able to punch above its weight on somethings.
I recently acquired the 4080 Super from a friend who originally bought it but later upgraded to the 4090, for reasons. I paired it with my i9-13900KS and made an Elden Ring mod video. Someone pointed out that my CPU was a bit long in the tooth. Nowadays there seems to be constant pressure to always have the latest and greatest, or else you're made to feel like you're already falling behind. The graphics card is good, as it should be, but when it comes to ray tracing, it seems like they exaggerated its capabilities. The only card that truly excels at ray tracing at high frames is the 4090, and even that can be brought to its knees by some games. It's wild to spend anywhere from $2500 to over $5000 and still not be able to play Black Myth: Wukong over 70 fps at max settings with ray tracing.
True 4K card should be able to run any game Native at 60FPS with everything on High and RTX on
And what card does that...6090?? 4090 ain't doing that.
Hate this take, because in 2024 you're an idiot if you don't use DLSS; it's that good.
in your dreams...
What a ridiculous standard. 60fps ultra settings in the most modern titles is something that's only happened in a couple generations. The 4090 can be easily crushed. The 5090 will be easily crushed.
That extra 4 gigs this card is missing is what is keeping it from being 'the go-to' card
Instead of using presets like Extreme or Ultra, I would recommend custom settings. Often there are a few settings which are very expensive but don't provide any real visual benefit
Pretty much sums up modern games. Now running everything 100% ultra is quite luxurious even for the premium cards. Some ultra settings performance sacrifices outweigh the visual benefits clearly. Some things will remain unchanged though like having high res parallax textures, particles, Ambient Occlusion, details, shadows, long distance LOD rendering, VFX all while running a good resolution. But Ray Tracing? I can live without it. AA x16? DLSS is perfectly fine and improved upon at a fast pace to look better and tank a lot less performance.
I am trying to decide between the 4080S and the 7900 XTX. Both have custom options at or below $1000, but the XTX has 24GB of VRAM. I noticed from many benchmarks that they perform within a 10% margin of error in most titles. Is that because most games do not require more than 16GB of VRAM, or because the XTX doesn't have the cores to take advantage of the available memory? If you had to choose, which would you pick?
4080S. I had a 7900 XTX nightmare. The 4080S is also way more efficient on power
I've had absolutely no issues with my 7900XTX.
DLSS Quality is decent if it doesn't affect the visuals. But tbh in some games it is noticeable.
FSR is behind when it comes to sheer FPS numbers, but I find it a smoother render. Plus most games have FSR, as it's open source.
As for power consumption, if you're that worried, just undervolt it. But tbh 100Watts shouldn't scare anyone if you can truly afford this GPU.
It's like buying a Ferrari or Lambo and worrying about the MPG.
@@Franki3683 FSR is nowhere near DLSS when it comes to visual quality. And could you elaborate more on "smoother render"? Is it metaphor?
I was deciding between the two back in May. Went with a ProArt 4080 Super for the aesthetics in the end.
Edit: oh, and better resale value for Nvidia? Another point to consider. Maybe the XTX would have been just as good (or better?) but I’m happy with my Super. I play mmorpgs mostly.
Nice video!
For future content regarding Counter-Strike: in the Steam launch options for CS2 you can add this command to allow the MSI Afterburner overlay: -allow_third_party_software
-W
I would still say it's a 4k card. I have been enjoying RDR2, Horizon Forbidden West, and Spider-Man with ray tracing, all of them at ultra settings native, and it works super well for high-refresh-rate single-player games like Forza or Doom Eternal. But playing shooters in 4k is a bit overkill, man. Like, if you want to play the campaign in 4k, I get it, but I use a 240Hz 1440p screen for CoD and I hit 300 fps with balanced settings in that game. I really love this card, honestly.
It is also almost half the price of an RTX 4090, and you are not getting a crazy difference in performance.
4k and 100+ FPS is what I expect out of a card for 4k gaming
Why are 30 fps and 60 fps extremely playable on consoles, but everyone on computers says 60 fps is unplayable?
It's not! They're just trippin'! I got the same card with a 5700X3D and have no issues at all playing everything at 4k... people are complaining about a chip on a dead platform that still kicks behinds today, even compared to 7000/9000 chips. Like you said, anything above 60 is just fine, nothing less than 30! I have not seen one game on my rig that I can't run at 4k max graphics
@charlessmith9369 I'm running my 4080 + 7800X3D at 1440p due to my 240Hz monitor, so the frames never get that low. My PS5 running at 30-60 would never stop me from playing on it. I'm new to PC gaming, but I'm thinking when PC frame rates get that low it becomes unstable. I don't know, I'm trying to learn.
Warzone is terribly optimized, which is why you were getting those stutters. Try dropping VRAM scale down to 70
Tbh these days, I’d rather not play it at all haha
@@davidnott_ That's odd, I get an extremely smooth experience around 150fps maxed out at 1440 with my 7900 GRE
@@theofficialpolloco CoD prefers AMD cards; my RX 7900 XTX runs amazing
What's funny, I played my 4070 all day at 4k 😊 For me, anything better than a PS5 and I'm happy.
Technically speaking, the 3080 12GB was a 4K card, and the 4070 has comparable performance. In most games you can definitely game at 4K with a 4070 or above, but there are a few outliers of course.
I'm perfectly happy with max settings 1440p at 100+ fps average. 4k is completely overrated since the details are negligible. What's NOT negligible is the responsiveness that comes with high fps
Bad 0.1% lows? You definitely have an AMD CPU. The AMdip is incredibly annoying. And now that Intel also went with a chiplet design, we have no low latency options. 😔
Good video. I bought a 4090 at $2200 Cdn, this was before the 4080 Super release. The 4080 (non Super) was around $1700 Cdn. I figured if I was going to spend $1700 for a card then I might as well go all in and spend $2200. It's good that the 4080 Super is cheaper.
Yeah the price is what tempted me in the end. I was doing so well holding out prior to that 😅
I don't understand. Why does your 4080 Super use around 230 watts at 100% load when it is a 320-watt card?
It's efficient on power draw; I get 19W at idle.
Even the RTX 4090 is "optimal" at native 1440p; a 1440p 240Hz display would be the ideal pairing for anything from an RTX 4070 to an RTX 4090, from 120Hz up to 240Hz.
At Native 1440p you can enjoy a high refresh rate experience ( 90 FPS + ) without the drawbacks of visual artifacts and increased input lag brought about via AI Upscaling and "Frame Gen".
4K is doable with an RTX 4090 in many cases but I wouldn't call 4K "optimal" that would be Native 1440p.
4K is overkill. I think 2K is enough, especially with the new QD-OLED monitors!
They classified the 3080 as a 4K gaming card, so I went out and bought one, and I can tell you now, no it wasn't. It couldn't even do Shadow of the Tomb Raider maxed out in 4K with its crap ray tracing for its time (compared to today's in-game ray tracing), but I only had a 5600X CPU. Your 4K gaming is only good because you have a good CPU as well, so it's not just the GPU doing the 4K; you won't get the same performance paired with a lesser CPU. So just buy a 4090, or wait for the new 5090 with the best CPU, and then you're OK for at least another six months before it starts to get useless with the next new tech around the corner lol
The CPU doesn't matter at 4K as much as the GPU, unless the game is also heavily CPU-dependent. 4K puts a huge strain on the GPU 99% of the time in certain games, but that doesn't mean cheap out on a CPU either.
It's a $1000 card because Nvidia has been price fixing. There is absolutely no reason 2 year old tech should be sold at the same MSRP as release. All tech should depreciate in price over time.
Okay, you probably saved me a few hundred bucks. I have a similar system (5800X3D, 2x16GB RAM, etc.) and I was thinking about upgrading my 1440p TN monitor to a 4K OLED 240Hz panel and my RX 6800 XT to a 4080, but holy…. I didn't realize that 4K gaming is so taxing compared to 1440p that even the 4080 struggles to play some games at 4K natively without upscalers. Well, I will reconsider. I will either buy the 4K monitor now and wait for the RTX 5080, or buy the RTX 4080 + a UW 1440p high-refresh-rate panel now.
Can someone explain to me why these cards can't play games at 4K native but cost $1k and more? Nvidia and AMD are relying on software and RT cores to get the FPS into the upper ranges in these games, but what happened to just having the fastest card without all the gimmicks? If I were to compare it to cars, it would be comparing a V8 to a turbo 4 or 6! Is it so bad to want a card that costs half of my rent payment to just be the baddest, "fastest" S.O.B. in the neighborhood, or in my computer, one that will play any game at 4K 60 and above without the 8- or 10-speed automatic and AWD to help it?
The 4080 S is a 1440p card in reality. If you play in 1440p, you can use ultra settings and DLSS and get 100+ FPS and this card will take you for the next 5 years. At 4k, even the 4090 can struggle.
I honestly use mine for 3440x1440
Sounds like a good balance!
Welcome to the UW average enjoyers club.
Not tested msfs?
Having seen these tests, I think it's good at 4K depending on the game, and within what you might expect in a lot of instances, especially with the high demands of 2077 and Rift Apart.
My only thing in testing, I know it's personal preference, but I find DLSS Quality to be a less practical choice than balanced, and I think that setting can make a decent amount of difference. It's almost like the effect raytracing has in some games where the visual uplift you get from RT in the game is marginal but the performance hit is quite a lot. I felt that particularly when you showed Diablo 4. I didn't really notice a difference with visuals but there was a decent performance hit with it on. When I've tested DLSS Quality in a game like 2077, the visuals look ever so slightly better if I take the time to sit around and nitpick, but the performance cost is a bit much. I think balanced is the way to go on that front.
For me, 4K is when you don't need an upscaler, but that's long gone now tbh, 'cause they force some kind of RT. But does it look good enough that you can ignore DLSS? Sure, probably, 'cause I don't have it so I can't say for sure.
The 0.1% low is completely useless, since no one will actually be able to feel it, compared to how much the 1% low affects how a game feels.
Yes end of story 👍
When ya look for faults and frame chase, you'll find something wrong with even the 4090.
Consistent 60fps at 4K ultra graphics is a 4K card for me ;)
You’re airnott as well right?
Indeed!
Wasn't the 20 series supposed to be a 4K graphics card? It seems like the 50 series is finally going to grasp 4K.
Bro thinks average 150 fps is better than having ray tracing on 🤡
thank god I saw this video, I’ll go for the 4090 I guess
My take is this: if it runs over 60 at 4K with DLSS at Performance... yes. I say Performance because it's that good. But the 4080 does almost everything at Quality DLSS or native for the games I play. Also, if it can run PT, that's the ultimate test... two cards right now can, and one's the 4080.
The difference is you can't go below Quality with FSR, which is why the best AMD cards aren't as good as 4K cards. DLSS Performance is on par with, if not beats, FSR Quality from my tests.
Nope, there are no true 4K cards, not even the 4090. I own a 4080 S; when you can run native 4K above 120 in all games without "features", that will be a true 4K card.
I don't like sayings like this. You might as well say there are no true 1080p cards, because there are moments playing Cyberpunk 2077 that will fall below 60fps if you have everything maxed with path tracing on. If you want to up it to requiring 120fps at all times, lots of games would not be able to accomplish this. I guess the 4090 is only a true 720p card according to your logic.
@@michaelangst6078 1080p is CPU-dependent on a 4090, oops, try again.
A 4K card means no upscalers and max settings, without RT.
Without using RT I would say it is
Great video, thanks a lot! IMHO, average FPS is only part of the story; a smooth experience is more important to me. E.g., my 4080 plus 5600X system runs The Last of Us Part 1 at ultra settings at 4K at ~80fps, however the 1% lows are below 40fps, which can spoil the entire experience with stuttering. BUT the reason here is the CPU bottlenecking by running close to 100%. So yes, the 4080 is a true 4K card in that game, but my system is not. To answer your question: my dream 4K system would be able to run 90fps 1% lows with RT on and DLSS Quality on. The reason is that I love the eye candy and my old eyes cannot resolve the difference between native and DLSS Quality anymore 😊. Btw, my system gets close to this target, probably by 90%, in most triple-A titles I play. Still thinking of upgrading because I enjoy pimping my system, so I am a very happy victim of the industry. Cheers!
1440p 120 high settings is the way
Absolutely disgusting for the price. I wouldn't go past the 4070 Super, it's the best value. After that, if you're not going 4090, it's a waste of money.
That's how I look at it as well: either get the best value with 70 Super tier or under, or else just buy a 90-tier card, because the middle ground doesn't make much sense anymore when it still natively underperforms so frequently at around $1000+. I wouldn't be surprised if 80/70 Ti Super buyers have the most remorse, due to how overpriced they are, and most often find themselves wishing for or expecting way better performance. I mean, when you spend around the $1000 threshold or more, gamers really expect a clear separation of elite 4K performance without having to tone things down and basically play like much cheaper cards can. I think the 5090 will be the first true card where enthusiasts finally have just barely enough, and the 6080 will probably be the first real 80-series card where it's enough at that price point as well, and then the 70-tier cards will finally feel like that when the 7000 series drops... but who knows, because graphics engines keep getting more complex and taxing as well.
Have I been in a coma? When did 120+ fps become "minimum"?! Okay I have a 48" 1440 monitor and I'm still taking my 4080 OC back to get a 4090. Out of pure shame. And then I'm coming back and playing Skyrim and Unreal Tournament as per usual. Because I'm old. Can't wait for my 9800X 3D and 64GB of 6400Mhz RAM for playing Solitaire and running the pipes screensaver with textures!
If you want high fps and competitive gameplay, you probably wouldn't be at 4K unless you have a 4090 (or better, in the future).
120 fps the only acceptable frame rate? In a single player game that is perfect.
Quickly invalidated your own point by claiming the 0.1% low is 7fps when it was a loading screen, and it never dropped below 100fps in the actual game, even in the highest-FPS runs.
Get a better PC
Anyone else laugh at the CS:GO players completely smoking him off spawn lmao
With DLSS balanced, yes!