Please give this video a thumbs up if you liked it, and feel free to comment below or ask me anything. It helps me get recognized by YouTube's algorithm. Thanks! :)
Timecodes:
○ 0:00 - RTX 4080 4K DLAA, DLSS FG Off, Very High Settings
○ 0:41 - RTX 4080 4K DLSS 3.7 Quality, DLSS FG Off, Very High Settings
○ 1:22 - RTX 4080 4K DLSS 3.7 Quality, DLSS FG On, Very High Settings
○ 1:58 - RTX 4080 1440p DLAA, DLSS FG Off, Very High Settings
○ 2:41 - RTX 4080 1440p DLAA, DLSS FG On, Very High Settings
○ 3:19 - RTX 4070 4K DLSS 3.7 Quality, DLSS FG Off, Very High Settings
○ 3:53 - RTX 4070 4K DLSS 3.7 Quality, DLSS FG On, Very High Settings
○ 4:33 - RTX 4070 1440p DLAA, DLSS FG Off, Very High Settings
○ 5:00 - RTX 4070 1440p DLSS 3.7 Quality, DLSS FG Off, Very High Settings
○ 5:37 - RTX 4070 1440p DLSS 3.7 Quality, DLSS FG On, Very High Settings
○ 6:12 - RTX 4060 1080p DLAA, DLSS FG Off, Very High Settings
○ 6:59 - RTX 4060 1080p DLSS 3.7 Quality, DLSS FG Off, Very High Settings
○ 7:35 - RTX 4060 1080p DLSS 3.7 Quality, DLSS FG On, Very High Settings
○ 8:18 - RTX 4060 1440p DLSS 3.7 Balanced, DLSS FG Off, Very High Settings
○ 8:57 - RTX 4060 1440p DLSS 3.7 Balanced, DLSS FG On, Very High Settings
○ 9:48 - RTX 3060 1080p DLAA, FSR 3.1 FG Off, Very High Settings
○ 10:29 - RTX 3060 1080p DLSS 3.7 Quality, FSR 3.1 FG Off, Very High Settings
○ 11:17 - RTX 3060 1080p DLSS 3.7 Quality, FSR 3.1 FG On, Very High Settings
○ 12:03 - RTX 3060 1440p DLSS 3.7 Balanced, FSR 3.1 FG Off, Very High Settings
○ 12:44 - RTX 3060 1440p DLSS 3.7 Balanced, FSR 3.1 FG On, Very High Settings
○ 13:26 - RTX 3080 4K DLSS 3.7 Quality, FSR 3.1 FG Off, Very High Settings
○ 14:12 - RTX 3080 1440p DLAA, FSR 3.1 FG Off, Very High Settings
○ 14:51 - RTX 3080 1440p DLSS 3.7 Quality, FSR 3.1 FG Off, Very High Settings
○ 15:34 - RTX 3080 1440p DLSS 3.7 Quality, FSR 3.1 FG On, Very High Settings
Hi, can you tell me please: is this a benchmark of the remaster or the original? Thanks
Way better graphics than the new Monster Hunter game, and it runs far better too.
Thanks for benchmarking an RTX 3080 as well, my card. If I stick with 1440p, or 1440p DLSS Quality if needed, maybe I can skip the RTX 50 series generation as well, what do you think?
The 3080 is far better than current consoles, so it should be enough for a while longer. What's your CPU? I would skip the 50 series, or buy a used 40 series at a lower price after the RDNA 4 launch. The 8800 XT is supposed to be at the 4080 performance level, so I guess a used 4080 should have a decent price.
@thecarl1013 Ryzen 7 7800X3D after I retired my Intel 9700K.
You'll be fine; the 3080 is at around the same power level as the PS5 Pro.
Thanks for benchmarking a 4070. Is there any noticeable lag with frame gen? I know it’s still not perfect but some games it’s not noticeable.
It's pretty noticeable, especially when using keyboard and mouse; with a controller it's probably fine.
@ Thanks. I use a controller exclusively, so hopefully it's fine. Thanks for the video.
Which driver version have you used for RTX 3060?
566.03
Did you set the 3080 to high textures instead of very high because of vram?
Yes.
@ I did that, thanks, and what a difference. With everything else still at the highest settings, the performance drop in the in-game benchmark is brutal: 84 FPS with high textures (4K DLSS Balanced) vs. a 45 FPS average before.
@Miguel-ou2py happy to help!
*do NOT use Frame Gen (with rare exceptions).. just FYI*
WHY?
For example, the RTX 4070 at around "68FPS" with FG means that it's interpolating every 2nd frame. So your RENDER rate (and thus lag/sluggishness) is based on the REAL frames generated. That's 34FPS, but due to how frame gen works the lag is slightly WORSE than native 34FPS, so roughly the same as 30FPS. The VISUAL smoothness improves but the lag gets worse... so what about a higher FPS?
Well, if you could get 100FPS after FG that would mean just below 50FPS for lag which isn't too bad. Great, right?
Not so fast. If you can get 100FPS after FG then you might have been able to get 70 or 80FPS (varies by hardware) normally, which is already pretty smooth and has FEWER visual artifacts from the "fake" frames. It also has less lag, so combat is more responsive, etc.
So, when does Frame Generation make sense?
Well, personal preference in part but I'd say for SLOWER moving games where the lag doesn't matter as much but you want things to look smooth on the monitor. So, say 80FPS+ after frame gen for example. In. My. Opinion.
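The arithmetic in the comment above can be sketched as a small calculation. This is just an illustration of the commenter's reasoning, not a measurement: the 2x interpolation factor matches how current frame generation works, but the extra-latency figure (`overhead_ms`) is an assumed rough fudge for the frame that FG has to hold back, not a published number.

```python
def frame_gen_latency(displayed_fps: float, interpolation_factor: int = 2,
                      overhead_ms: float = 4.0) -> dict:
    """Estimate the real render rate and approximate input latency when
    frame generation interpolates every Nth displayed frame.

    overhead_ms is an assumed extra delay added by frame generation
    (it must buffer a real frame to interpolate between two of them).
    """
    real_fps = displayed_fps / interpolation_factor
    # Base latency approximated as one real frame time, plus FG overhead.
    latency_ms = 1000.0 / real_fps + overhead_ms
    return {"real_fps": real_fps, "approx_latency_ms": round(latency_ms, 1)}

# A 4070 showing "68 FPS" with FG is really rendering ~34 FPS, so in
# responsiveness it feels closer to low-30s FPS, as the comment says.
print(frame_gen_latency(68))   # {'real_fps': 34.0, 'approx_latency_ms': 33.4}
print(frame_gen_latency(100))  # {'real_fps': 50.0, 'approx_latency_ms': 24.0}
```

Under these assumptions, 100 FPS after FG still feels like ~50 FPS, which matches the "not too bad" threshold the commenter suggests for slower games.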
Hi, the game runs fine on PC. But I notice the audio is a bit low even though I have it at maximum in the configuration. Is there any solution?
There are two options in the SOUND section you can change. One is for the sound source (i.e. "TV" etc.) and the other is next to it, but I forget the name. I only have a STEREO (two speakers only) setup, but I didn't have any issues.
The game seems to be CPU-heavy, as I have a 4080 too and get better FPS.
Same vram leak as in forbidden west?
Nope.
This is a big issue for 16GB RAM owners with 8GB VRAM. Thanks for bringing this up.
@@mrnerd766 can you notice it with 16gb vram?
@@mr.h5566 no 16gb is more than enough, you could do 4k with that
@@mrnerd766 What does the memory leak do? Does the game need 7.5GB at the start and then more after a while?
Why are you still using the 10700F? It's bottlenecking you hard.
At 4K using an RTX 4080 his CPU is at 90%; he's close but not bottlenecking. If anything, they're a perfect match, IMHO.
WTF is going on with the CPU? Looks like it compiles shaders on the go
He's got a 10th gen Intel CPU, so averaging 65% or so in normal gameplay WITHOUT compiling shaders seems about right.
Yeah, it slowly compiles shaders during gameplay.
@@photonboy999 Dude, it's an i7, a top-tier 10th gen CPU, which is not that old. I'm wondering how it runs on an 8700K.
@@NotReallyLaraCroft I have an i7 8700K with a 3080 Ti and it runs perfectly for me. 1440p DLAA, ultra settings, and it's always above 60 FPS.
@@NickT9330 Meridian? Other places with tons of NPCs?
The game already looked fantastic before, and the differences are so marginal that it's just not worth buying, especially since I've already played through it completely once.
$10 for the upgrade isn't a bad deal. I never completed it so it's worth it for me.
I see a big visual difference.
No benchmarks without upscaling shit like DLSS or FSR? No native benchmarks? That would be interesting. Devs are getting lazy about optimizing games these days.
All GPUs in this video were tested with and without upscaling.
DLSS works amazing well, actually. FSR not so great. But if you had an NVidia RTX card it really wouldn't make sense NOT to use DLSS.
@@MxBenchmarkPC
Technically correct, although "DLAA" is still based on DLSS AI and is more demanding than native resolution AFAIK (but then should be better visually as well). Honestly, not much point in testing something like just native resolution with TAA if using DLSS/DLAA... comparing TAA with FSR might be a different story and will vary by the game.
@@MxBenchmarkPC I only saw 4K native with the 4080, no native 4K for the other cards. If I'm wrong, tell me.
What a stupid comment. This has absolutely nothing to do with laziness, and we should be glad that these techniques and options exist, especially since not everyone has the money for a powerful graphics card but can still enjoy a smooth game. That's why it makes little sense to test everything without these features when they already exist.