That thumbnail is amazing! 😅
it's the best 🤣😂 I had to look twice
That is Steve after Ryzen 5 benchmarks 😁
Stevey King 🐵
I remember listening to them trying to find the thumbnail.
@@HoneyTwee it's f** hilarious
I saw it and thought something was wrong, then I looked at it again and laughed 🤣🤣
"Black Myth Wukong is a game!"
Go on Steve.... you have my attention.
Thanks, Steve
Still better review than IGN
@@vuxpee Do not mix this up with Black Myth WuDONG... that's not a game.
@@c2ashman and its sequel Black Myth: WuKONG DONG !
At that point he has to tell this to himself to believe it is a game and not benchmark tool.
That thumbnail could easily boost game sales by 200%.
best thumbnail ever !
second
I laughed so hard when I saw that thumbnail. 😆
😂😂
I just noticed that after reading this comment lol
Note that there is currently a bug in the benchmark that does not apply your selected scaling percentage to DLSS. Rather, it uses the closest DLSS preset. As such, the path-tracing results are using 66.66% scaling for NVIDIA cards and 75% for AMD.
Subjectively, the DLSS image quality will still be higher despite rendering at a lower internal resolution, but that probably wasn’t the intention of the testing.
Also, given that this is really a path tracing implementation, 50% scaling makes sense at 4K, which would be equivalent to 1440p at 75% scaling. 1440p DLSS Quality only renders at 960p internally. No graphics card can do 4K at 66 or 75% scaling using path tracing at usable frame rates, even with frame generation.
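As a quick sanity check of the equivalence claimed above (just arithmetic on the internal render heights, assuming the stated scaling factors):

```python
# Internal render height for the combinations mentioned in the comment above.
print(2160 * 0.50)    # 4K at 50% scaling    -> 1080.0
print(1440 * 0.75)    # 1440p at 75% scaling -> 1080.0 (same internal height)
print(1440 * 2 / 3)   # 1440p DLSS Quality (~66.7%) -> 960.0
```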
You also have to restart the program after changing any settings just to make sure, because many of them don't actually apply otherwise
And it doesn't have the latest DLSS DLLs, does it?
Those scaling percentages are applied only if TSR is used, otherwise the default resolutions for render scale are applied.
I don't know if it's a bug or just questionably weird settings behavior, but I can confirm that there is no FPS difference between DLSS 67% and DLSS 89% (so it uses DLSS Quality preset). With FSR the performance scales based on the resolution scale you set (so there are differences between 67%, 75% and for example 90%). This is really important information, because it makes the benchmark result quite skewed.
@@mondzi Hopefully HUB will do a full dive with DLSS, FSR, and XeSS.
Gamers on their way to buy a 4090 so that they can play in full HD at barely 60 fps in exchange for literally subtle water reflections.
What about gamers getting the 4090 to play the game at HUB optimized settings, at 4k, dlss quality, high refresh rate?
I remember when people were buying 2080ti to play bf5 at similar fps, 1080p.
@@jacklowe7479 I remember when people were buying an ATI Rage to play at 640x480 at 30 fps at best =p
I play at 1440p with an RTX 4090 + 7950X3D and I get 140 FPS on average with DLSS enabled, cinematic settings, RTX off, 75% resolution.
Still have some stuttering, I think because of the game. My temps are 59°C for both CPU and GPU during the game. The GPU reaches 99% utilisation.
Sure, that's worth 1800 bucks 😂 The more I look at how much that GPU costs, the more a 4070 Super looks like a steal. You can literally build a whole gaming rig for the price of the 4090. Hell, you can build a 4080 Super gaming rig and get 80 percent of the performance @colemin2
RT should come with a warning it will kick your GPU in the balls.
It does, but people seem to forget every time a new title is released
In this game "RT" is always on and the "full RT" switch enables path tracing. That's why basically only RTX 4000 can somewhat handle it
RTX 3000 and 4000 GPUs can handle ray tracing fine with DLSS on. Even on the 4090, you shouldn't bother with ray tracing without DLSS, especially when using a game with DLSS 3.5 enabled.
Only the 4070 Ti and up can do ray tracing in this game, and no AMD card can
@@geetargato 3000? Cap. Maybe a 3080 or 3080 Ti, but my 3070 Ti can't handle shite even at the lowest possible RT with DLSS Q or B
43 GPUs and 522 data points, damn. That's a lot of work
Now the same with CPUs... :D
@@followtheori If you play at 4K it would be the same between i3 10100F and R7 7800X3D 😆
40 hours in the benchmark tool, this guy is not human...
Spent more than an hour myself with a single GPU... so 40 hours is quite realistic.
And taking into account the physical GPU swaps, it must be EXHAUSTING.
@@DimkaTsv still sounds like a fun job
@@saiyaman9000 yeah, with some good music or a philosophical background track playing 😂 that's paradise for a man
It's literally his job
Look at the thumbnail, dude. He abandoned humanity to return to monke
"But can it run Wukong?" I can see it when the 5000 series and 8000 series come out
I still remember when the 3080 was sold as 4k GPU 😢
that's what i use it for 4K60fps
Hey I'm right there, 6950xt 😂
It 100% still is.
@@coinsagE46m3 as long as you set everything to low, sure
The worst thing is that Turing was sold as a GPU for RT, and many years later $2000 GPUs still don't run it well.
"Cinematic" means movie-like as in 24fps. 😂
Ultra would be a better name
@@TheEgzi they were making a joke. The standard fps for movies is usually 24 fps.
@@dakbassett yeh I got it
@@TheEgzi - But did you? Lol
I don't know why you are laughing because that is literally what it means. "cinematic experience" means low fps but high quality
Monkey king❌
Benchmark king ✔️
New Crysis.
@@valok252 My GPU is going to cry sis
@@trinityforce9138 😂😂😂😂😂
1080p with only 75% resolution scale + frame generation and most GPUs still don't hit 60 fps. Jesus Christ. Can't wait for 480p 15fps gaming in 3 years.
Well to be honest I think none of these tests represent the combination of settings that 90% of people who play this game will use which would probably be FG/DLSS/FSR on + RT off.
Regarding the first 2 sets of tests, I don't know how many people will ever run RT at all when the results are so catastrophic for frame rates, unless you're using the literal top end of top end GPUs. Regarding the last 3 sets of tests, I don't know how many people will refuse to use frame generation when it practically gives you free frames.
@@ASadTeddyBear If you're not on a 60Hz screen there's zero reason to avoid FG; input latency is negligible at 144Hz, for instance.
Even at 60Hz there are things that can be done to improve latency a whole lot if you know what you're doing and depending on the game
Then don't play at absolute maximum settings and use the high preset instead, which gets you 250% of the fps with only a 10% graphics quality loss.
Nobody is forcing you to play maxed out graphics... maxed out graphics is only for the most powerful GPUs and next gen.
Btw, consoles will run this game on a mix of low and medium settings with high textures and it will still look very good.
@@D.Enniss That's exactly my point.
@@ASadTeddyBear Yeah, I am not getting this kind of test either. I like high refresh and usually play with no RT and DLSS enabled (on performance mode if the game looks ok), with a mix of different settings like textures/lighting/post-processing on max and lowered shadows/reflections. DLSS+FG+mixed settings can do pretty high FPS in demanding games even on mid-tier hardware, but yeah...
Sadly big channels usually only bench using presets, well at least they still bench at different resolutions -_-
"...but can it run Wukong?" That is going to be the catchphrase for the next decade I believe.
A bit like "Crysis".
@@tdunster2011 that's the joke
No, the game is not nearly important or impressive enough for that. Also, I think with Crysis we must have found out by now how stupid that line was anyway: EVERYTHING could RUN Crysis, few could MAX it.
@@MaaZeus until GTA 6 comes out.
Unfortunately every other AAA game nowadays is an unoptimized mess, so..... when every game runs like crap, there's no "gold standard" like Crysis in the past.
I absolutely hate how frame generation has become a thing. It is garbage below 70FPS so what’s the actual point of it.
Also, holy balls. A 4090 struggling outside of 1080p.
@@justwinclassic frame gen feels bad all the time to me. And it doesn't look as good to my eye. Frame gen data is next to useless to me, so I was kinda disappointed in this.
@@robred123s he tested without FG in this video as well. Also 43 different graphics cards. Useless? Lol
@@dakbassett To me? Yes. Because I like RT. I absolutely do not use FG. It's terrible. I'd rather play at 30fps. At least then it plays how it feels, and the dissonance in my head doesn't pop off. So, showing me FG where I have to guesstimate a 40fps increase, if that?
IDC if the game is hard to run. I have a 7800X3D and a 4090. If there's a game I can't play well, then that just means it's not gonna play well until next gen drops, if that. And I want to know what I can actually do maxed out without some BS stop-gap tech that only exists to make the numbers on a graph look better.
@@dakbassett I never said anything about the data points and the work put in, I was specifically talking about frame gen.
I also hate how we need to consider things like DLSS and FSR.
@@robred123s amen brother.
"Black Myth Wukong is a game!"
That's all I needed to know, see you in the next one
810p upscaled to 1080p with frame generation. And still only the 4070Ti and above are really a smooth experience. Damn.
@@HoneyTwee well that's why you don't put every setting on max. Daniel Owen has made optimized settings for this game.
Future proof game, very nice
@@quasii7 I guess. I think it would be useful in games that are as demanding as this to be given some context as to how the game actually looks.
If the game genuinely looks really amazing then sure. I'm fine with it not running well at native 1440p even on a RTX 4090. It's built to continue to look better as new GPUs come out.
But just looking at an FPS graph you cannot tell the difference between that and a game that doesn't look that good but is just poorly optimised. A quick rundown of whether the performance is justified by the visuals at any point in the video, a bit like what Digital Foundry do, would be nice. I get that's not what this video is about, and the optimization guide next week will cover this. But at the moment I think my reaction is kind of expected due to the lack of context surrounding pure FPS graphs.
@@HoneyTwee the game really isn't anything impressive.
RT doesn't seem to be worth it on the current generation of GPUs in this game unless you're on a 4090, to be honest. The graphical upgrade is mild and hard to notice, at least in YT clips. The game still looks nice on high, and fps is about 3 times higher.
There are some games where RT is like a generational upgrade (Metro, Dying Light 2, Cyberpunk), but Wukong doesn't seem to be one of these games, at least I think so far based on limited exposure to recordings of the game on YT.
Another thing I noticed is abnormally weak RT performance on the RTX 3080 / Ti. In all other games these cards are on par with the 4070 / Super or even 4070 Ti, but in Wukong they are way slower. The interesting part is that at 1440p high the 3080 gets a lot faster and starts to reach the 4070, so either the RT implementation in this game is bugged on RTX 3000, or there is a driver issue (or perhaps Wukong employs something the RTX 4000 architecture is better at?)
Now we know who the monkey king really is. I've always suspected.
Ok … 98% of us will just play something else
All these games are beyond unoptimized.
This game looks good, but not groundbreaking good.
The reality is that companies can't make a game look as good as Red Dead Redemption 2, yet they require you to have a 7900 XTX or RTX 4090 to be above 100 fps at 1080p. Which in 2024 is absolutely RIDICULOUS.
This is like Watch Dogs Legion barely hanging on with an RTX 4090 at 1080p.
Companies make an unoptimized mess and call it a "masterpiece", requiring you to upgrade more and more, which is bull of colossal proportions. I am saying this as a 13900K owner with an RTX 4090. Why does an average gamer need to have something as expensive as my setup to play a game that looks like it was made in 2018?
😅😂 RDR2 cost 5 times more money than this game
My 12900KS says RIP big bro to your 13900K 😂
Yep, and the solution is so simple: just don't buy the game when they do this garbage. Sales line go down, game companies start paying attention.
Denuvo and forced shadow RT are to blame here; if only those two could be turned off, average people could play the game
RDR2 is not on the level of this game. This is a proper UE5 game too. RDR2 was the best of the previous gen and now we are really getting to see what this current gen can do.
Funny how 4K isn't even considered lol
We're literally going backwards
More like 4K was way too early; the only GPUs that can run 4K are the high-end ones, and even then getting 60fps can be a struggle.
of course that game is totally marketing the new 5090 and 5080
More like it's 4K native + fully maxed out cinematic & very high ray tracing. At 4K you usually always want to use DLSS (or FSR) no matter what game & GPU you have.
Ray tracing chugs a ton of performance when just switching from medium to very high - it's like a 50% hit. Everything else is peanuts (though in-game the performance hit is bigger; the benchmark basically shows you if you can run the game, no real fps test here)
Thanks to Nvidia Gimpworks (raytracing), which doesn't even look that much better in 99% of scenarios.
@@TechnoGuille Devs sure might be using unreleased 5090.
@Hardware Unboxed
DLSS does not work like the game says it does. It does not use 75% render scale.
For some reason DLSS is not actually using the resolutions displayed on the slider. Instead it only uses the "Current DLSS Quality Mode" displayed under the Super Resolution description. These use the typical DLSS resolutions. The slider does allow you to change the quality mode, but it only uses the 5 discrete values for the different quality modes.
You can display DLSS's internal render resolution with the DLSS overlay enabled via regedit/DLSSTweaks, or show it with Special K's overlay.
As far as I can tell the other upscalers use the actual slider resolutions, so you are benchmarking DLSS 67% vs FSR 75%.
Additionally, there is a display bug that sometimes changes the displayed slider value. For example, if I set the slider to 89 and hit apply, then leave the menu, opening the menu up again the slider will show 88 instead. 88 will also be displayed in the benchmark result, but this is not correct. You can open the config file and see that the game is still set to 89% render resolution.
I really hope free benchmark tools become more common. That's such a good idea... Love it.
Preferably with a bit more realistic gameplay scenes.
You here? Gonna test on laptop hub? :D
@@princehariz7859 if it's gonna be a popular game and they offer a free Benchmark Tool, hell yeah 😅
it's an old idea... from 24 years ago, to be more precise
@@chillhour6155 you're talking about demo versions ;) I'd take them as well but was referring specifically to a benchmark tool so people can test if their system can actually run the game...
Three things I discovered with the Wukong benchmark tool:
1. Check whether the upscaling percentage decreased by 1 when you change graphics settings.
2. If you use a keyboard, you can change two settings at once by highlighting one setting with your keys and another setting with your mouse.
3. Left clicking while the benchmark runs hides the mouse.
Is this an RTX 4000 advertisement? Why did you run the RTX 4000 series with 66% DLSS while running all other GPUs with 75% FSR? I don't think a 26% resolution difference is negligible.
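For what it's worth, a quick back-of-the-envelope check of that figure, assuming DLSS really drops to ~66.6% while FSR stays at 75% (as the comments above claim) — the gap is in pixel count, not per-axis scale:

```python
# Pixel-count difference between ~66.6% (DLSS Quality) and 75% (FSR slider) at 4K.
dlss_scale, fsr_scale = 2 / 3, 0.75                        # per-axis render scales
dlss_pixels = (3840 * dlss_scale) * (2160 * dlss_scale)    # ~3.69 MP
fsr_pixels = (3840 * fsr_scale) * (2160 * fsr_scale)       # ~4.67 MP
print(f"FSR renders {fsr_pixels / dlss_pixels - 1:.0%} more pixels")   # -> 27%
```

So roughly 26-27% more pixels for the FSR cards, in line with the figure in the comment.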
so basically the game is unplayable for most people
good stuff
@@anasdabbar yeap. Thank you Nvidia sponsorship for making it have RT always on.
Not really. Change the settings to medium and most cards play it fine.
@@stormshadow5382 RT isn't always on, Lumen is. From what I have read from others, it seems there is a misunderstanding of what Full Ray Tracing actually is (which is what is being tested here). Full ray tracing is actually path tracing, which is why it is so demanding and also explains why even the 4090 struggles with it. This setting is not always on; it can be toggled on or off.
The Lumen aspect has nothing to do with Ray Tracing.
@@aflyingcowboy31 Lumen uses basic ray tracing for its lighting. That's why I said RT on, not full RT on
@@stormshadow5382 " lumen uses basic ray tracing for it's lighting."
No it doesn't, Lumen uses no ray tracing, Lumen is software based, meaning Nvidia cards that have RT cores do not get any improvement from games that use Lumen i.e. both AMD and Nvidia are on more of an equal playing field here.
Lumen, is Unreal Engine's real-time global illumination and reflections system, it does not use traditional ray tracing for its lighting. Instead, it employs a combination of screen space techniques and a voxel-based representation to achieve dynamic global illumination and reflections.
So no, it has nothing to do with Ray Tracing or Nvidia.
This feels like a whole class of students failed an exam and now the teacher is blaming us lol
lmao, only the 4090 passed this term and not on all subjects
tag yourself, I'm the 2080ti btw
4090 cheating in exams.
3050: It ain't much but it's honest work. 🤣
@@amnottabs deff 6500 XT material right here (I'm sorry...)
Software Lumen is still fully supported on DX11, just not hardware Lumen.
and nanite?
I don't think the benchmark is representative of the actual game. There is nothing happening. No particle effects, no action, no AI.
AI is a nothingburger buzzword.
Did the AI overlords pay you to say that?
@@sammiller6631
@@sammiller6631 He's talking about the AI in the game. How is that a buzzword? The CPU and GPU applications of AI and their effect on performance have been around forever. Look at The Last Guardian for one of the most intense AI systems developed around the PS3/early PS4 age.
You can't even disable upscaling/temporal stuff in the menu. "We built these high quality assets that can't be seen in all their glory"
Wish the Lumen/software RT thing was itself a toggleable setting instead of being tied to presets or the API or something
You can always change to DX11 by tweaking the files, it disables lumen 👀
It's a UE5 game. If you want to turn off basic UE5 features, don't buy the game.
You can disable it by using the DX11 version, but then the game looks crap
@@panzer3279 Lumen and Nanite, unlike PBR (physically based rendering), are not really basic UE5 features; they are special extra UE5 features and they are entirely optional, while PBR in UE4 and UE5 is not.
You can make a UE5 game without Nanite and Lumen and it will still be a UE5 game, plus you will take advantage of the improvements this newer engine brought to the tools and some of the workflow.
But yes, most recent UE5 devs now use those features, so technically they will become "basic" UE5 features going forward.
1080p Native Max Settings without RT requiring an RX 7900 XTX / RTX 4080 for at least 60 FPS is INSANE
The game always uses RT
@@mitsuhh still, $1,000 for 1080p gaming
We need next gen GPUs already 😅
@@mitsuhh wait, there is no option to turn off RT in this game?
@@v808 Lumen is on by default. The custom RT setting is for path tracing
GPU drivers optimizations incoming... Steve, prepare for more benchmarks! 😁
Sadly gaming is slowly turning into a hobby for rich people... If you want a smooth experience even at 1080p you need to upgrade every year or two :/
This game was designed with an RTX 3080 in mind to run at 4K back when it was first revealed, but now it can't even run at 1080p 60fps using the highest settings. What went wrong here? I did not buy a 3080 for it to not run the game well at all.
I think a lot of people will change their tune about this benchmark once they get the game and find out how much performance drops off in areas with a lot of NPCs and during intense combat with all the flashy lighting effects Asian developers love to (over)use.
This is a best case scenario benchmark that is leaving out all the CPU intensive parts of the game which makes me suspect that it will turn out to be another Dragon's Dogma 2
Updates will fix it, once they get real-world data.
DD2 was a unique case of RE Engine being wholly inadequate for large NPC counts - hence the massive CPU bottleneck that developers tried to somewhat mitigate with ridiculously close spawn distances for said NPCs.
Also, the recommendation is using FSR 😂
This has me scared. I'm getting 86 max and 76 avg at high, native 2160x1080, on my 6800 XT on Linux. With a lot of NPCs it's going to tank
Crysis 1 running on one thread on my 12700K, with the crazy physics, the AI on the hardest difficulty and destruction on buildings, still gives me 100+ fps.
I wonder how the heck a game like Dragon's Dogma 2 is that CPU intensive; it's not an RTS game like Troy Total War Saga.
It always surprises me the amount of content Steve can put out in one video.
I've really missed you doing this type of content. I used to love your day-one benchmark videos; I hope we'll see more of these
Please provide pure native (aka 100% render resolution, no upscaling, no frame generation) RT maxed-out results as well.
You are definitely getting a like for all the hard work. You've got bags under your eyes mate, the least that we could do is say thank you!
This is ridiculous. You need a 4090 minimum at 1080p to have a "good" RT experience? What happened to these devs?
The same thing that happens to every game Nvidia sponsors, they take money and in exchange put whatever unoptimized BS Nvidia wants in.
Nonsense. Did you even watch the video? You don't just leave it on max settings. With optimised settings you can easily get over 100fps at 4K
@@mojojojo6292 FG and DLSS are not "optimized settings".
yeah it's a pass from me, It doesn't even look like it needs that hardware
@@daztora this!! I can't see any impressive visuals that justify this RT performance cap!
Hardware unboxed showing off their inner warrior beast in that thumbnail.💪
You're everywhere
@@GewelReal Was about to say the same
RTX : ON
FPS: OFF
It's actually PTX so fps crashed
RTX: ON
AMD: LEFT THE ROOM
@@nossy232323 accurate... Considering the Radeon RX 7900 XTX can't even deliver a proper gaming experience at 1080p.
I will try this benchmark with my older Radeon GPUs from the GCN era.
@@nossy232323 * "nvidia sponsored game"
Steve
@@eternalwingspro4454 Huh ?
the 7900 XTX hits over 100 FPS, what the fuck are you talking about
reject ray tracing
return to *m o n k e*
Why does nobody mention the artifacting with any kind of upscaling, DLSS and FSR both? In the benchmark at least, my eyes nearly fell out from the amount of shimmering around leaves and branches. Even DLAA has artifacts, and once you see them you can't unsee them.
"7800xt a mid-tier 1080p card now" - Wukong
Always has been
@@iPpBG never was, it's a good card for 1440p ultra or even 4K gaming with compromises
@@rolandkovacs3 Yeah, only been able to play Raster games.
It's good for competitive games, nothing else.
Ignore that game bench. It is an Nvidia title and ALL AMD cards are bad with this. It's basically Nvidia paying game devs to make the game look good and run on heavy RT, knowing it will cuck AMD cards.
@@Frigobar_Ranamelonico dafuq are raster games
One thing to note: 75% upscaling is higher resolution than Quality preset, at least for DLSS. Quality is around 67%, so would PROBABLY be a more balanced pick.
77% is usually ultra quality.
AMDunboxed never goes below quality preset because FSR would be unusable on it despite DLSS performance being quite good.
This. Also, I found the lack of ray tracing + high really weird. Once you reach high, ray tracing has a lot better bang for buck in terms of image quality than this fixation with "ultra" settings.
Feels especially weird from an outlet so vocal about the stupidity of ultra settings.
In this case, 75% is the same as Quality. Here are the quality settings for specific slider positions:
if you set resolution slider to
90-100 then DLAA gets selected (100%)
62-89 then Quality mode is used (67%)
55-61 then Balanced mode is used (58%)
40-54 then Performance mode is used (50%)
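Purely as an illustration of the mapping described above (taken from this comment, not confirmed by the developers), a minimal sketch of how the slider value seems to translate into an actual DLSS mode and internal render resolution:

```python
# Sketch of the slider -> DLSS mode mapping described above (assumed, not official).
def dlss_mode(slider: int) -> tuple[str, float]:
    """Return (mode, per-axis render scale) the game reportedly picks for a slider value."""
    if slider >= 90:
        return "DLAA", 1.00
    if slider >= 62:
        return "Quality", 0.67
    if slider >= 55:
        return "Balanced", 0.58
    return "Performance", 0.50   # values below 40 aren't in the table; assumed here

def internal_resolution(width: int, height: int, slider: int) -> tuple[int, int]:
    _, scale = dlss_mode(slider)
    return round(width * scale), round(height * scale)

# A 75% slider at 1440p lands on Quality mode, i.e. roughly 1715x965,
# not the 1920x1080 the slider value implies.
print(dlss_mode(75), internal_resolution(2560, 1440, 75))
```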
@@iurigrang It's not that deep. He is just giving the benchmark results based on the GPUs he tested. As he said, the optimization guide is on the way.
3:39 4060ti 8GB beats 7900XTX with RT turned on LMFAO.
That is absolutely insane
They're using Nvidia drivers launched specifically for this game (560.87, which you can't get).
But they're not waiting the 2 days for AMD to have driver support for this game.
Yeah, you enjoy that 831p 51fps slideshow, son.
Meanwhile a 7900 XTX owner disables RTBS, selects 2160p on very high settings, FSR Quality for a 1440p render + FG, and averages 99fps. I don't use AMD but that's what I would do.
@@tomstech4390 Same story every time. Some nVidia fanboy brags about RT performance when in reality, they are getting a shit experience as well unless they have a 4090 maybe.
@johnc8327 People here are discussing the anecdote itself without realizing the bigger picture.
Is the comparison between the 4060 Ti and 7900 XTX stupid? Of course it is!
But considering the fact that, in any situation at all, the 7900 XTX gives you the same benefit as someone who purchased the 4060 Ti 8GB, is the 4070 Ti Super comparison out of the question? That's a big no. Nvidia is the market-leading behemoth that charges a premium; they should not win on value, and in this case they sort of do. This, and the future consequences, is "the big deal".
Thank you for doing the work to bring us these results.
0:05: "Black Myth: Wukong is a game." Crazy if true! 😮😂
😂😭😭
A controversial statement if ever there was one.
Truly one of the statements of all time.
It's crazy that something like a 3090 Ti could deliver "unplayable" performance.
I don't think I'll buy anything this decade.
Just don't run the game with ultra preset, bro
Back in the day you couldn't play the newest games at the highest settings with a 4 year old GPU either, to be honest. I like that we are finally getting some true next gen experiences at least
@@Nucleosynthese Graphics don't make a game next-gen, gameplay does. Pretty graphics don't do squat when your game is soulless (no pun intended). Just look at all the AAA games that have come out from ubisoft and EA over the years that are complete trash despite massive leaps in how good the games look.
@@giglioflex Next-gen has always been correlated with graphical and technical leaps, not so much gameplay; we couldn't simulate thousands of procedural things on old gen consoles. You're changing that definition for the sake of your argument. Gameplay is more important than graphics, but it doesn't contribute towards being next gen as much as the graphics/technical stuff does. There is so much technical stuff you can develop only on next gen hardware; Ghost of Tsushima used custom procedural generation to simulate high quality grass, which would have been nigh impossible on old gen consoles.
@@giglioflex That's just your opinion though. To me a giant leap in graphics is exciting and that's what makes this game next gen. It makes me think back at the original Crysis. A great example of a next gen title before its time.
Nobody will use 75% upscaling on Ampere, and not even on Ada. DLSS Quality uses 67% upscaling, not 75%! The performance will be way better with optimized settings and 67%, even on a 3080, and I dare say that most people won't use RT in this game at all. But this is a good quality video as always!
Honestly man, do you guys ever sleep? I mean seriously! Your output is in a league of its own. Thx!
When is the graphics settings optimization video going to come out? I reckon today is the last day of "early next week" since Wednesday is midweek. 😊
From what I can read, some reviewers using their RTX 4090 GPUs switched to medium quality to keep reasonable performance in the game.
So the benchmark tool appears to be very limited versus what the real game is doing, making all these benchmark results irrelevant.
Between this and the number of bugs (apparently this game competes with CB2077), we have to wait for a day 1 (or later) patch to do the benchmarks.
*This game made my 7800XT seem like a budget GPU* 🤣
Wait it isn't ?! 😮
@@lefourbe5596 is it??
From 2K king to 480p king, what a sad state of the industry
@@lefourbe5596 $500 is absolutely not budget. That should go without saying. Budget is the bottom bracket, 6500 XT for example.
It is
Idk how you can possibly look at these results and not come away thinking RT is still multiple gens away for the vast majority. Like holy shit, this game is basically completely unplayable with RT on.
Thumbnail is hilarious. Cheers
I think this benchmark tool has a problem.
I tested with an RTX 4060:
-> 1080p / FG: on / cinematic preset /
Upscaling: 75% DLSS
RT (off) --> Avg fps: 61
RT (medium) --> Avg fps: 64
And in your benchmark test with these graphics settings the avg fps for the 4060 is (53).
I tested several times and got the same results
I think setting DLSS to 75% does use DLSS Quality mode, which internally runs at 67%, while FSR at 75% should actually run at 75%.
Kudos and hats off to the sheer effort you guys put in for the gaming community
You're a lot of bloody champions. Thank you for your hard work to inform and protect the consumer!
If, with Valve's Source 2 and the global illumination it has, I can run Counter-Strike 2 dust2 deathmatch on a UHD 770 at 60+ fps with integer scaling at 1280x1024, then Unreal Engine 5 is a dumpster fire designed only for an RTX 4090 and i9-14900K.
How come the 1080 Ti beats the 3060?
Nice data as always. Do you actually see any significant visual difference between medium/high/cinematic, or RT medium vs very high?
The benchmark runs just fine at 4K (DLSS/TSR 1080p) + FG even on old GPUs; I'm not sure why we are always pushing for the max (future-proofed) presets.
A lot of games don't release with these to avoid optimization backlash, and years later they age much faster.
It is mind-boggling that you have to have frame gen/upscaling to even get a 1440p RT experience on a 4090!!
What the flying f ** are we even doing anymore lol
I wish you had not tested with frame generation. I don't see the point - it just makes it harder to see what the actual framerate and responsiveness is.
Tried the Benchmark tool for myself and with a few adjustments in settings got 100 FPS.
I got that on my 4070 TI Super and my RX 6700 XT @1440p.
The 6700 XT settings are High Preset, FSR + FG, Super Resolution set to 65, No RT.
The CPUs for both cards are Ryzen 5000 X3D parts.
Thanks.....phew.
4070 ti super already compromising settings? That's depressing.
@@australienski6687 It's the new engine UE5 that's giving our GPUs a hard time. They weren't as prolific when I bought my Ti Super. I needed to upgrade as the RX 6700 XT was starting to top out its 12GB VRAM in a few games
That's why I disliked the video. Hardware Unboxed benchmark videos are not worth watching anymore,
especially when I skip ahead and suddenly the bottom part of the list is cut off and doesn't show all the GPUs that were tested.
@@australienski6687 Was getting 80fps+ at 4K with a 4070ti, so no not really.
Send it back to the devs, it's clearly not done yet
How is optimization good if GPUs like the 7600 struggle to maintain 60 fps on native 1080p? Newer GPUs should be pushing us to a 1440p standard, but "optimized" games like this prevent that.
At what point does playability weigh more than looks....
The game is obviously optimized for Nvidia cards, when the 3080 and 3070 are doing much better than the 7900 XTX, lmfao. More Nvidia shenanigans.
43 GPUs? That is an insane amount of GPUs to test and would have required lots of time and effort. Thanks for doing this, Steve. Greatly appreciated.
This game is the new Crysis, I have a 4090 and I found the best settings for this game is DLSS 75%, Ray Tracing Medium, and Very high settings. It gives you the Elden Ring experience. 60fps most of the time but drops to like 55 on demanding scenes.
It's not. If it was, this performance would be reasonable. Crysis, when it came out, was much further ahead of everything other games offered at that time. This looks good but it's not that much better, and this is a very linear game with closed levels while Crysis was the complete opposite, which makes things even worse for this game.
Dammit, I just bought a 4070 Ti Super, it came with a copy of Wukong, and now you're telling me I can't run it at max settings at 1440p? 😢
UE5: let's evolve backwards to 1080p 60fps gaming...with RTX 4090.
🤦♂
Not UE5's fault that GPUs can't keep up; if it weren't for UE5 we'd still be using UE4 for everything.
@@scarletspidernz you know there are more options... like other good game engines. UE5 games are shite as I see it now with some games. I saw an RTX 4090 benchmark of Fortnite with its Barbie doll graphics and it was averaging 90fps with DLSS Quality... I mean, that's bad. Another example is Senua's Hellblade 2... it looks good when you walk around but nothing else, there's no interaction
"Cinematic quality" needs renamed "cinematic framerate"
that's mostly shadows and vegetation quality, though.
Looks like they actually put their ultra settings in for future GPUs, based on the jump between high and cinematic.
The 4090 barely manages 60fps at 4K at high settings, while entirely on rails with no sudden camera changes - 15:39
I just ran a few benchmarks, thanks for the info. I honestly do not see the appeal of RT. On max RT I think to myself "yea, looks good I guess" but for me its totally not worth such a gigantic performance decrease. My 7950X3D matched your 7800X3D btw, within 1-2 fps difference margin of error.
5:41 frame generation... there, that's it, it feels like I'm playing something out of the '90s with cinematics from the 2020s... in any game using frame generation. Maybe I'm just very sensitive to the lagginess, which surprises me because I'm older and do not play either competitive games or heavy-combat first-person games, except maybe Rocket League. Frame generation messes with my timing; I'm not talking about my actions, I'm talking about how everything looks. It just doesn't feel real, because it reacts wrong
I hate frame generation. It always feels kind of muddy... not the visuals, but how I'm getting information, and I can't put my finger on why I think so. It drives me bonkers. It has to be an extremely high framerate before I don't notice it, at which point why do I even have it... This is one feature I absolutely detest
Because they are fake frames. It's like that awful effect TVs all come with enabled that makes everything look like a soap opera.
Wahh what a thumbnail, amazing work 👏🏻
Owning a 7900 XTX........OOOOOOOOOFFFF
I have tried various UE5 tech demos maxed out @1440p and get around 60fps without upscaling, but this game is just another beast. Get low 50s in the benchmark with everything cranked and very high RT. Better wait for winter before I play this because the house is going to get hot.
Hey Steve, I think the testing may be flawed as DLSS gets set to 67% while XeSS and FSR does 75%. Can you see if your results change with extra testing?
I get 7fps on my 3080 Ti fully maxed out lol
I get 33fps on 4080 fully maxed at 1440p
@@mitsuhh
@h0stile420 I have a 7700 XT card; after seeing the results I just knew I won't be playing this game. It is off my list 🤣
@h0stile420 Sure, if you know nothing about rendering
@h0stile420 path tracing!
Why would you NOT test DLSS on, frame gen off?? Am I the only one who doesn't use frame gen?
Facts bro
I'm no conspiracy theorist, but it almost feels like they are just in cahoots with devs now to make a push for an RTX 4950 Ti or something, and we're not realising that would be like 2 Maxwell Titans in SLI back in the day x)
Pretty much no one actually went for that then, but now they have tricked the consumer into thinking it's the new norm...
Oh, the game runs badly, huh? Maybe you should have bought our SLI Tita... I mean 4090, huh?
1080p and 60fps. We're going backwards so much in today's gaming
If you keep yesterday's graphics you can keep yesterday's performance. Who's forcing people in the comment section to turn on hardware ray tracing? Lumen is just as good for a much lower performance cost
Only at maxed out settings. This is called progress. We want games pushing graphics to the max so we have forward progress there. Then you can scale down the graphics settings to fit your GPU. We don't want devs to get lazy and not try to push the graphical boundaries, because then no forward progress gets made.
True. As long as high settings still look great
@@TheRover87 progress? It's a shimmering, low quality, blurry mess.... wtf are you talking about
@@bingbong3084 7600 and 4060 are yesterday's graphics?
What does the Upscale 75% setting mean? Why not 100%? And if it is an upscaler, why use it at all?
Thx for the great work again Steve!
My 5600G + RX 6600 combo didn't make it on the max preset
You know you should give us a download link for the thumbnail image, right? Please?
One thing that reviewers often mention: minimum fps for a good experience. I play these types of games on a 4K TV with a controller. It's a night and day difference compared to playing on a desktop monitor with mouse & keyboard. With 60 fps + frame gen it's more than fine. In my 4080S testing, with frame gen and an average native 60 fps the game didn't have any issues, other than limited effect options to turn off depth of field or similar effects + no full screen mode for DLDSR. That is a must-have feature for this type of game.
You should have included a segment of fully maxed out settings. No DLSS, Cinematic presets, maxed RT. Brings the 4090 numbers to around 20 at 4K.
Up
The problem is that this isn't actually RT, from what I understand; it is path tracing (you know, what they showcased in Cyberpunk 2077 a while back), hence why it is so demanding.
@@aflyingcowboy31 I'm not understanding where the "problem" is.
@@ThisIsMeArnold Why on earth would you ever test a game with Path Tracing and no DLSS? Path Tracing is a completely different beast to regular Ray Tracing; the 4090 will not be capable of using it without DLSS, we already knew this from Cyberpunk.
Path Tracing is still only a showcase/demonstration of what Ray Tracing will look like in the future; the current cards are not ready for it, even Nvidia knows this.
Because that will actually showcase what kind of raw performance your 2000€ GPU will get you?
Surprise,
It's not a lot, at least not 2000€ worth
I was excited to play this at launch, but now I think I'll wait until RTX 6000 or RX 9000 series to get a native 1440p 60 experience with RT.
When the mighty 4090 struggles at 1080p max with RT, and your card isn't even half as fast, you just can't justify paying full price for the game.
Where can we download the Nvidia 560.87 driver mentioned in this video?
Nice new benchmark game for the charts
Setting DLSS to 75% will default to the standard DLSS Quality resolution, which is 67%, while FSR will be rendering at a real 75%.
This is causing a false comparison between Nvidia and AMD!
legit funniest thumbnail from you guys LMFAOOOOOOOOOOOOOOOOO
Would it be possible to add widescreen resolutions to GPU testing? Not just in this game, but in other games and general GPU testing overall
I only got 2 fps average and max, and 1 minimum, with a GTX 750 Ti 2GB Strix OC and an i5 3470 + 16 GB. All settings default, 1080p + cinematic preset.