Yeah, like I mentioned before, the RTX 2060 as a recommended GPU for High settings is a joke from the developers. The only card that matters there is the 5700 XT, which tells you everything you need to know. The 3060 runs the benchmark between 40-48 FPS at native 1080p/High, but mostly stays around 41 FPS in the benchmark's taxing moments. With DLSS Quality it runs from 50 to 60 FPS on the same settings. So a normal 2060, with upscaling or not, will have a very bad time with this game on these settings; you would have to go Medium to get a good enough framerate.
One thing to note though: going from High to Medium settings boosts the FPS by a lot. So there is definitely a setting or two that we will be able to lower for much better FPS with a very small reduction in image quality, just like in Avatar.
And yes, the 4000 series performs really well in this game, sort of like what you see with Horizon 5 on PC. A normal 4060 is about 13 frames per second faster than a 3060, something that is normally not the case in most games. I also expect the 4060 Ti, which is normally minimally behind a 3070, to be decently faster in this title too; I guess it's the much higher clock speeds on the GPU. Resident Evil 4 Remake also prefers the 4000 series, where it beats a 3070 by 5-6 more frames per second on average. Talking about 1080p of course; at 1440p the 3070 wins, cos of the anemic bus/lanes the 4060 Ti has :D.
Aside from that, the game is really heavy, so of course they are going to use upscaling as a mandatory feature. If not, they would have to use the 4070, which averages 77 FPS, as the base for their system requirements and recommend that for 1080p/High with no upscaling. At the end of the day, it's just a simple benchmark of a location where nothing is going on. If you check the recent trailer, they show a ton of fights, with terrain deformation, effects going crazy, things moving on the screen. So I expect it to run even worse in actual gameplay.
Lucky for me, I'm not interested in this game. But if it's gonna have a 60 fps mode on PS5, you will for sure get a better 60 fps experience on PC, as is the case with most multiplat games.
@@asahira7834 Well, we're yet to see a UE5 game on consoles that isn't either extremely low resolution with upscaling or plagued by other issues; UE5 is indeed just that heavy. Too bad Mark Cerny boasted about the amazing capabilities of the PS5 using one of the first UE5 demos ever shown. We've seen how stuff like the Matrix demo runs on PC and consoles, with Lumen both in Software and Hardware, and yeah... even though UE5 (is it 5.4 now? or 5.3) has vastly improved since that demo, it is still quite heavy in most games.
For that heavy load it puts on the hardware, we get nice VRAM usage and normally excellent, very detailed geometry thanks to Nanite. I still have my bag of mixed feelings about Lumen, since almost no game uses it in Hardware, and those which use it in Software tend to mix in some Screen Space stuff as well, like Robocop: Rogue City and others do. Virtual Shadow Maps are alright, but to be fair, shadows are normally the least of my concerns in most games; as long as they don't look awful, I prefer if they just blend in and don't bother me.
According to the game's descriptions, the Visual Effects, Shadows and Global Illumination settings are some of the heaviest, but there's not a lot to tweak in there, as some of these have many other settings embedded within. I'll wait for the full game to see, for example, how High vs Medium Shadows fare, because at High Shadows I can see pop-in in the benchmark, which is not great. For that reason, I'd also like to check if Very High Shadows fixes this, but I need to see that in movement, not in the same benchmark area.
On a side note, there's no driver profile for the game, on the Nvidia side at least, which is odd; they normally just add them automatically via OTA download or something, and considering the benchmark is out, I'm not sure why the profile doesn't exist yet. They will surely add it in the official driver, but I expected them to do it sooner, really. So I assume the game is also not using ReBAR by default, because that is normally enabled at the profile level by Nvidia, if they see it improves performance and causes no issues.
@@Sholvacri I think it was Alex who showed the difference between UE 5.4 and 5.3, and it was mostly CPU optimization, where his 7800X3D was pulling more frames on the 4090. But GPU optimization is still in the same place as it was before, so some comments on UA-cam proclaiming that UE 5.4 is gonna bring insane gains in new games are in for a rude awakening :D.
Of course I don't mind that UE5 is heavy; it does look amazing in new games, no one can deny it. I just wish they had waited before it went mainstream, at least till we get better mid-range cards and consoles. Though I do understand that this is not how the current industry works. That's why I do miss the old gaming days, when games were made with the hardware in mind. Well, at least on consoles :D, cos PC was still a Wild West when it came to game optimization.
And yeah, the current generation of consoles sure did fool a lot of people. I still remember the Xbox marketing: how 60 fps was gonna be the new standard, with many games also having 120 modes, and how 30 was mostly a thing of the past. Don't remember if Sony did it as well, but they also overhyped their hardware. So it's not a surprise that a lot of customers are disappointed when a game only runs at 30 fps, or has a very blurry image to get to 60.
Still, I love the idea of releasing a benchmark for your game a few days before release. It's the best thing, if you are not gonna provide a real demo, so you know if your hardware is enough to play it or not. On the PC of course.
Just did a test with RTX 2060 super at medium settings DLSS balanced (58%) 1440p and it averaged 62 fps, min was 42 and max was 72. Compared to many other games that are coming out this one is pretty decently optimized imo.
My GPU temp is much higher than yours, 80-85C with max fan speed 🤔 I have a used RTX 3060 Ti Gigabyte Eagle LHR; interestingly, when using vsync the temp goes down.
The Eagle series cooler has inferior cooling performance compared to the 3-fan cooler used on models such as mine, so having higher temps is not uncommon. The airflow of the PC case is also a big factor for GPU temps. But yes, limiting the FPS or the GPU load will also lower the temps and, as long as you can get 60 FPS and you are happy with how the game feels, it's a good way to keep it cooler and save some power.
Woah, that's insane, aren't you seeing mad ghosting trails on things like the tree leaves flying around close to the end part of the benchmark? I can clearly see ghosting with just regular FSR FG enabled, so I can imagine it gets worse with Lossless Scaling. The default ghosting in some of the upscalers should be addressed honestly, if they improve that, then I assume stuff like Lossless Scaling with its own frame gen would also look better, but I haven't tried using it so I don't know how or in what way bad sampling being fed to it affects the output, but I would guess it's not too good if the game already has temporal ghosting at its core.
@@Sholvacri It doesn't look that bad, but I can't imagine how it feels in-game. The camera makes smooth moves without angle changes in the benchmark; I think it looks pretty decent to me because of that. It's something curious, and I take it as something experimental. In the future, with the game, maybe I'll use DLSS and Lossless Scaling together for FG.
@@FranciscoRestivo96 Yeah, I agree that without being able to move the camera ourselves it's hard to gauge how much worse this could look. You should definitely look into manually adding DLSS 3.7 with Preset E, and then Lossless Scaling may look better, but I don't know. TSR definitely shows way less ghosting, so I assume the TAA samples are not too badly tuned; the issue is, as usual, with either FSR upscaling or FSR FG. Maybe they are not using the latest build, but the in-game files are encrypted or something, and FSR doesn't show its version, unlike XeSS or DLSS. But if you don't notice any extra or terrible ghosting with LS, that's promising to say the least.
@@FranciscoRestivo96 I have no issues understanding everything you said and trust me, English is not my native tongue either so it's fine, by looking at your user name I assume yours is Spanish, so that's the same as me, as I'm from Spain myself so I'm already doing real time mental translation every time I post here, anyways :P
Some useful timestamps:
00:00 - Intro and Brief Settings overview
03:17 - Graphics Settings
04:36 - [1080p] High Preset, Quality DLSS, RT & Frame Gen OFF Benchmark
07:51 - [1080p] High Settings, Quality DLSS, FG OFF, RT OFF vs Very High RT
09:15 - [1080p] High Settings, Quality DLSS, FG OFF, RT Presets Scaling
10:33 - [1080p] High Settings, Quality Upscaling (66%), Frame Gen OFF vs ON
13:01 - [1080p] High Preset, FG ON, TSR (66%) vs FSR Upscaling (66% and 90% FSR)
14:36 - [1440p] High, Balanced Upscaling (57%), FG OFF vs ON
17:26 - [1440p] Balanced DLSS, Graphics Presets Scaling (Cinematic/Very High/High)
19:44 - [1440p] Balanced DLSS, Graphics Presets Scaling (High/Medium/Low)
21:50 - [1920p DLDSR] High Preset, 50% FSR Upscaling, Frame Gen ON
24:34 - Some FSR (maybe FG) Ghosting / Trails on particles, comparison with TSR
26:03 - [4K DLDSR] High Preset, Performance DLSS (50%), Frame Gen OFF
Also, I watched the Stalker 2 deep dive and it looked pretty good overall; the gunplay in particular looked really good. I'm sure there are some rough edges people can point out, but one interesting thing I noticed is that most of the gameplay seemed to be at 60, while in the last minute of the video they showed what looked to be 30 fps footage, so not sure if that is a different preset or not.
It was obviously played with a controller and my guess would be on a series x vs a PC. Luckily they have a new demo for people at gamescom next week so looking forward to hearing updated impressions.
I have only watched it on my phone but damn do the visuals look good on it. Hoping that DF will talk about it in their weekly video next week.
Nice benchmarks man. Might be a good idea to make a video on overclocking your GPU because your performance is amazing! Your 3060Ti is outperforming my 3070 by 11fps on avg XD
Hmm, is it outperforming your 3070? Are you very CPU limited, or at low (1080p) resolution with upscaling? I could see it happening in such scenarios with a less powerful CPU, but at higher resolutions it should not happen, unless your 3070 is downclocking a bit due to temperature.
Which is also why I normally do not OC my GPU any more than what comes from the factory. The summer season here is brutal; I have to turn on the AC to keep the GPU below 69-70C at full load, and since it has 3 fans and the cooler is decent, it then stays below 65C. But yeah, an overclock would likely ruin that a bit, so I keep it at stock.
Could overclock it for the winter but, to be fair, winter does not last that long over here, I wish it was colder all year, honestly.
@@Sholvacri My 3070 is undervolted, but even with added OC, it performs around 3 frames worse than yours at 1440p High w/ DLSS at 57%. I'm not CPU bottlenecked, but my CPU is very old and weak now (i7 8700K @ 4.8GHz). The only good thing about my clocks is that my GPU doesn't run hotter than 65C even in the summer and doesn't pull more than 100W lol
@@xplodax Okay, then that makes sense. To be fair, with the i7 8700K you are in fact CPU limited, likely because its IPC is so far behind what modern CPUs can achieve. And this being Unreal Engine 5, which got huge CPU optimizations in version 5.4, I can see older CPUs such as yours falling behind in certain intensive scenarios where they cannot match how much more efficient newer architectures are.
But it's still great it can hold up relatively well on a card such as the 3070. You would definitely notice an improvement with a newer CPU in this type of game, though. As seen in my own metrics, the overall CPU "load" is not high at all, so I guess the "bottleneck" is not due to that, just how the engine is using the CPU: it maxes out at a certain load, which is different for each CPU model depending on its capabilities, I guess.
But yeah, I can even notice my 5700X holding back the 3060 Ti at 1080p when FG is on; the GPU usage drops a bit to the low 90s, but it's nothing to be very worried about.
@@Sholvacri Ah, that makes sense actually. Damn, this 8700K has served me well lol. Maybe I ought to think about upgrading
@@xplodax don't upgrade. Instead, heavily tighten your RAM subtimings; it will do 10-20% better. You can find some OC results on my channel. And it's nowhere near "weak": an 8700K at 5GHz is pretty much a stock 5600 in games. Certainly not as good as a 5700X though.
You could also try a nicely priced 9900KF, which is pretty close to a 5700X. Combined with tweaked RAM, your 3070 will go noticeably faster.
I think the best is the High preset + 90-100% scale (DLAA); maybe you can lower shadows or some other stuff to maintain 60fps. Talking about 1080p of course.
If you want to stick to 1080p yeah, it's best to use DLAA (if you can get 60 FPS or more) as it will remove all the upscaling blur from lower resolution scaling. That said, whenever possible it's always best to try downsampling a bit above 1080p, like 1440p DLDSR or so, at least, because this will make the Upscalers like DLSS, FSR, etc., have a higher resolution to work with, therefore potentially producing a cleaner result.
I'm not sure how Shadows scale visually with the settings, but I could notice some shadow pop-in at High settings for sure. So if they are going to get even worse at Medium, maybe it won't be worth it, but it's hard to know when we cannot even move the camera around (I saw people posting that it can be done with the Unreal Engine Unlocker, though).
@@Sholvacri Oh, so you're saying DLDSR + DLSS? That's a good idea. I think the Global Illumination is the heaviest thing.
@giuseppevattimo47 If you have the VRAM to spare and you can still get good FPS, then yes, combining downsampling like DLDSR with any upscaler gives great image quality results, better than native for sure if DLSS is used as the upscaler, and possibly the same with XeSS, since that one also uses AI like DLSS.
@@Sholvacri Yes, cause I'm using the 3060 12GB version.
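The DLDSR + upscaler combo discussed above comes down to simple resolution arithmetic; a quick sketch (the 66% Quality scale factor is the commonly cited one, so treat the exact numbers as assumptions):

```python
# DLDSR raises the game's output resolution above the display's native one;
# the upscaler then renders internally at a fraction of that higher output,
# so it gets more pixels to work with than plain native-res upscaling.

def internal_res(output_w: int, output_h: int, scale_pct: float) -> tuple[int, int]:
    """Internal render resolution for a given upscaler scale percentage."""
    return round(output_w * scale_pct / 100), round(output_h * scale_pct / 100)

# 1080p display -> 1.78x DLDSR output of 2560x1440 -> DLSS Quality (66%)
print(internal_res(2560, 1440, 66))   # (1690, 950): roughly native-1080p input,
                                      # reconstructed into a 1440p image
```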
Those Very High and Cinematic presets seem to be using Lumen and Nanite, looking at how taxing they are, but there's no way to know when it's not a setting in the graphics menu.
I played a game named Jusant that uses UE5 with those features on every preset; really simple textures, but with amazing illumination.
I wonder if there would be a mod enabling FG with DLSS. I would appreciate it if there was. Anyways, great video!
I think you can use DLSS + AMD's frame gen; at least you can in Ghost of Tsushima.
@@wayoming3453 Nixxes made that possible with GoT. Unfortunately, DLSS with AMD's FG is not possible in this game; it needs the RTX 40 series for that. Maybe the devs will officially add it in the future.
@@PanmaTG.Gaming I saw there is a mod on Nexus that tries to enable AMD FG + DLSS. Don't know how it performs though.
Any interest in covering Hi-Fi Rush? Hard to find benchmarks for that game.
I have played Hi-Fi Rush, and completed it already, but it was like I got the game one weekend and got really hooked on it, so it didn't occur to me that I could make a video, so I didn't, haha.
I can say that it ran very well on my system, played at 1620p DLDSR, which really made this game pretty clean, thanks to its art style, and frame rate was always pretty close to 120 with some small dips.
But yeah, it's just a game that runs pretty well, very easy to have it running above 100 FPS.
@@Sholvacri Thanks!
@@D-OS_II_Fan I'll 2nd that, that game runs very well. Probably one of the best optimized UE4 games that exist.
How are u liking the new monitor? Which one did u buy?
It's the 27GP850-B, couldn't find the one that's cheaper and has no USB ports and no Overclock, and also couldn't find the 240 Hz version, sadly, so yeah I got that one for 279€, supposedly discounted because days after it went up an extra 44€ or so, so I guess I got lucky.
Although, I kinda welcome having some extra USBs around, so I guess it's fine, as my old monitor also has a small USB hub.
It's working quite well; I don't see any major issues with the image quality, despite the "fears" around the net about IPS panels having terrible light bleed and looking bad in dark games. To be fair, I have tested many games where shadows and dark corners are important, and I can see it all quite well.
I think the issue with the default panel settings, though, is the Black Stabilizer setting. I believe LG tried to compensate for the poor contrast ratio of IPS panels by lowering the black level there; at 50, which is the default, it kinda crushes blacks in some games, like the Resident Evil 2, 3 and 4 remakes to name a few, with the adjusted settings I liked on my old monitor.
To make it look similar to the old one, I had to raise the Black Stabilizer to 60, which doesn't crush the blacks anymore, and I can do the Brightness Calibration in those games properly.
Having a slightly lower black level can help make colors pop, though. I am using the monitor on the "Gamer 1" preset, which already has slightly saturated colors, mainly red and green, but I manually run it at the Warm 2 color temperature, as the default looks too cool for my taste.
The sRGB preset the monitor has seems quite accurate color-wise, but for games I still prefer "Gamer 1" with the Gamma 2 setting, to be honest. It is nice to have a factory calibrated sRGB profile, too.
Overall, and running it at like 50 Brightness and default contrast, I think it looks great in the games I've tried.
Variable Refresh Rate also seems to work like a champ; even at low FPS below the 48 Hz threshold, Low Framerate Compensation kicks in, and I have a good impression of how that feels compared to my older screen, which actually has a G-Sync module in it.
I'm running it at 180 Hz and so far no major issues in the games I've tested.
I think this panel they use for these monitors is overall very solid with little to no issues in games, which is what matters the most to me, really, so I'm happy so far.
@@Sholvacri Hellz yeah, I got the same model but just slightly cheaper and at 165Hz instead. Love it. Hope u get your money's worth out of it. I think u also have the backlight strobing option, I am not sure, but if u do, try it out. It almost removes all the motion blur; feels odd but good at first.
@@rakhoo5236 It does have Motion Blur Reduction, AKA Black Frame Insertion, yeah, although it cannot be used with Variable Refresh, so it is not that useful to me. I will always prefer having G-Sync enabled, as I am too used to how it feels in games, coming from my previous monitor.
It could be useful for certain games, but yeah, I prefer G-Sync for sure in those games where the frame rate is quite variable and not locked at 120 FPS, which is perhaps what Black Frame Insertion is best suited for, to get that pristine "CRT-like" motion.
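For reference, the Low Framerate Compensation behavior mentioned above can be sketched like this. Simplified, and the window values (48-180 Hz) are taken from this monitor's stated range; actual driver behavior may differ:

```python
# Below the panel's minimum VRR rate, the driver repeats each frame an
# integer number of times so the effective refresh rate lands back inside
# the VRR window. This keeps adaptive sync working even at very low FPS.

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 180.0) -> float:
    """Effective panel refresh rate once LFC multiplies low frame rates."""
    if fps >= vrr_min:
        return min(fps, vrr_max)        # inside the window: refresh tracks FPS
    mult = 2
    while fps * mult < vrr_min:         # double/triple/... until back in range
        mult += 1
    return fps * mult

print(lfc_refresh(40))   # 80.0 -> each frame shown twice
print(lfc_refresh(20))   # 60.0 -> each frame shown three times
```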
@@Sholvacri Is your 1440p screen your main one now, with the 1080p one off to the side? In the Black Myth video it looks like your other one shows up first, but I don't know if that is because you clicked on it or if it picked it up first.
@autoxguy GPUs are quite dumb, and each model and brand internally designates an order to the ports. So if I want the 1440p screen to display my BIOS on boot-up, I had to plug it into the DisplayPort that this GPU considers its number 2, and it appears as screen 2 in Windows, but I made it the primary screen.
Sadly, doing that in Windows does not affect which monitor is used for the BIOS, lol, so this is why it appears as the second one in the game.
I hate it when the system requirements mention that they are with upscaling involved. To me, upscaling should be a bonus and not required. This tells me that the 2060 can probably only get like 30 at High settings with no upscaling, which is really bad. They should not have mentioned the 2060 at all, but probably did just to calm people down. Even a 3060 Ti without upscaling would probably be mostly in the 50s, with a bit over 60 possibly.
Until they optimize it more, as of now it looks to me as if this game only works very well with 40-series GPUs and up when no upscaling is involved.
I'm gonna guess, without looking at any other benchmarks out there, that the 2060 gets close to 60, or into the mid 50s, at 1080p High settings with Quality DLSS, which is fine honestly, considering how old it is. Knowing how much slower it is than the 3060 Ti, and that I get 84 FPS at 1080p High with DLSS, it's safe to say it could get close to 60.
Also, no one should be running 1080p native without DLSS if it is available, as it is the anti-aliasing that should be used; it will basically always be better than TAA at native scale. It's a bit blurry compared to 100% TSR, sure, but that comparison is honestly a bit unfair, because TSR is already pretty damn good and using it at 100% is close to using DLAA: not quite there in terms of maximum image and AA clarity, but close.
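Side note on those upscaler percentages thrown around in this thread: here's a quick sketch (my own illustration, not anything from the video) of what internal render resolution a given scale percentage works out to, assuming the percentage applies per axis, which is how DLSS/TSR scaling is normally expressed:

```python
# Hypothetical helper: internal render resolution for a given
# per-axis upscaler percentage (e.g. DLSS Quality is ~67%,
# or the "57%"/"58%" scales mentioned in the comments).
def internal_resolution(width, height, scale_percent):
    s = scale_percent / 100.0
    return round(width * s), round(height * s)

# 1080p with DLSS Quality (~67% per axis)
print(internal_resolution(1920, 1080, 67))   # -> (1286, 724)
# 1440p at a 57% scale
print(internal_resolution(2560, 1440, 57))   # -> (1459, 821)
```

So "1440p at 57%" is really rendering at roughly 1459x821 and reconstructing up from there, which is why the AA quality of the upscaler matters so much.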
Also, I'm still unsure if the devs intended 30 FPS to be the target, because Nvidia does list 30 FPS as the target for the Ray Tracing requirements on their website, lol: www.nvidia.com/en-us/geforce/campaigns/black-myth-wukong-bundle/
Yeah, recommending 30 for some and 60 for others is weird... it's easier to just say "don't bother enabling RT on the RTX 3060". It will surely dip below 30 quite often unless you go very low on the DLSS or global settings, and at that point the game will look poor and the Path Tracing won't do much, because it's still 1080p with upscaling, for basically up to 30 FPS at Full HD... nah.
Yeah, like I mentioned before, the RTX 2060 as a recommended GPU for High settings is a joke from the developers. The only card that matters there is the 5700 XT, which tells you everything you need to know. The 3060 runs the benchmark between 40-48 FPS at native 1080p/High, but mostly stays around 41 FPS in the benchmark's taxing moments. With DLSS Quality, it runs from 50 to 60 FPS on the same settings. So a normal 2060, with upscaling or not, will have a very bad time with this game on these settings. You would have to go Medium to have a good enough framerate.
One thing to note though: going from High to Medium settings boosts the FPS by a lot. So there is definitely a setting or two that we will be able to lower and get much better FPS with a very small reduction in image quality, just like in Avatar.
And yes, the 4000 series performs really well in this game, sort of like what you see with Horizon 5 on PC. A normal 4060 is about 13 frames per second faster than a 3060, something that is normally not the case in most games. I also expect the 4060 Ti, which is normally minimally behind a 3070, to be decently faster in this title as well. Guess it's the much higher clock speeds on the GPU. Resident Evil 4 Remake also prefers the 4000 series, where it beats a 3070 by 5-6 frames per second on average. Talking about 1080p of course; at 1440p the 3070 wins, cos of the anemic bus/lanes that the 4060 Ti has :D.
Aside from that, the game is really heavy, so of course they are gonna use upscaling as a mandatory feature. If not, they would have to use the 4070, which averages 77 FPS, as the base for their system requirements and recommend that for 1080p/High with no upscaling. At the end of the day, it's just a simple benchmark of a location where nothing is going on. If you check the recent trailer, they show a ton of fights, with terrain deformation, effects going crazy, things moving on the screen. So I expect it to run even worse in "actual" gameplay.
Lucky for me, I'm not interested in this game. But if it's gonna have a 60 FPS mode on PS5, you will for sure get a better 60 FPS experience on PC, as is the case with most multiplat games.
@@asahira7834 Well, we're yet to see a UE5 game on consoles that isn't either extremely low resolution with upscaling or burdened with other issues. UE5 is, indeed, just that heavy; too bad Mark Cerny boasted about the amazing capabilities of the PS5 using one of the first UE5 demos ever shown.
We've seen how stuff like the Matrix demo runs on PC and consoles, with Lumen in both Software and Hardware modes, and yeah... even though UE5 (is it 5.4 now? or 5.3?) has vastly improved since that demo, it is still quite damn heavy in most games.
For that heavy load it puts on the hardware, we do get nice VRAM usage and normally excellent, very detailed geometry thanks to Nanite. I still have mixed feelings about Lumen, since almost no game uses it in Hardware mode, and those which use it in Software mode tend to mix in some Screen Space stuff as well, like Robocop: Rogue City and others do.
Virtual Shadow Maps are alright, but to be fair, shadows are normally the least of my concerns in most games; as long as they don't look awful, I prefer that they just blend in and don't bother me.
According to the game's descriptions, the Visual Effects, Shadows and Global Illumination settings are some of the heaviest settings, but yeah, not a lot to tweak in there, as some of these have many other settings embedded within.
I'll wait for the full game to see, for example, how High vs Medium Shadows fare, because at High Shadows I can see pop-in on the Benchmark, which is not great. For that reason, I'd also like to check if Very High Shadows fixes this but yeah I need to see that in movement, not on the same benchmark area.
On a side note, there's no driver profile for the game, on the Nvidia side at least, which is odd; they normally just add them automatically via an OTA download or something, and considering the benchmark is out, I'm not sure why the profile does not exist yet.
They will surely add it in the official driver, but I expected them to do it sooner, really. I also assume the game is not using ReBAR by default, because that is normally enabled at the profile level by Nvidia, if they see it improves performance and causes no issues.
@@Sholvacri I think it was Alex who showed the difference between UE5 5.4 and 5.3, and it was mostly CPU optimization, where his 7800X3D was pulling more frames on the 4090. But GPU optimization is still in the same place as it was before. So some comments on YouTube proclaiming that UE5 5.4 is gonna bring some insane gains in new games are in for a rude awakening :D.
Of course I don't mind that UE5 is heavy; it does look amazing in new games, no one can deny it. I just wish they had waited before it went mainstream, at least till we get better mid-range cards and consoles. Though I do understand that this is not how the current industry works. That's why I miss the old gaming days, when games were made with the hardware in mind. Well, at least on consoles :D, cos PC was still a Wild West when it came to game optimization.
And yeah, the current generation of consoles sure did fool a lot of people. I still remember the Xbox marketing: how 60 FPS was gonna be the new standard, with many games also having 120 FPS modes, and how 30 was mostly a thing of the past. I don't remember if Sony did it as well, but they also overhyped their hardware. So it's not a surprise that a lot of customers are disappointed when a game only runs at 30 FPS, or has a very blurry image to get to 60.
Still, I love the idea of releasing a benchmark for your game, a few days before release. It's the best thing, if you are not gonna provide a real demo. So you know if your hardware is enough to play it or not. On the PC ofc.
Just did a test with an RTX 2060 Super at Medium settings, DLSS Balanced (58%), 1440p, and it averaged 62 FPS; the min was 42 and the max was 72. Compared to many other games that are coming out, this one is pretty decently optimized imo.
My GPU temp is much higher than yours, 80-85C with max fan speed 🤔 I got a used RTX 3060 Ti Gigabyte Eagle LHR; interestingly, when using VSync the temp goes down.
The Eagle series cooler has inferior cooling performance compared to the 3-fan cooler used on models such as mine, so higher temps are not uncommon.
The airflow of the PC case is also a big factor for GPU temps.
But yes, limiting the FPS or the GPU load will also lower the temps, and as long as you can get 60 FPS and you are happy with how the game feels, it's a good way to keep it cooler and save some power.
I play this benchmark with Lossless Scaling x4 on top of FSR3 FG. So I get FG x2 and then x4 from that base.
Woah, that's insane. Aren't you seeing mad ghosting trails on things like the tree leaves flying around close to the end of the benchmark? I can clearly see ghosting with just regular FSR FG enabled, so I can imagine it gets worse with Lossless Scaling.
The default ghosting in some of the upscalers should honestly be addressed; if they improve that, then I assume stuff like Lossless Scaling with its own frame gen would also look better. I haven't tried using it, so I don't know how or in what way bad samples being fed to it affect the output, but I would guess it's not too good if the game already has temporal ghosting at its core.
@@Sholvacri It doesn't look that bad, but I can't imagine how it feels in-game. The camera makes smooth moves without angle changes in the benchmark; I think it looks pretty decent to me because of that. It's something curious, and I take it as something experimental. In the future, with the full game, maybe I'll use DLSS and Lossless Scaling together for FG.
I'm not good with English, sorry.
@@FranciscoRestivo96 Yeah, I agree that without being able to move the camera ourselves it's hard to gauge how much worse this could look. It's definitely worth looking into manually adding DLSS 3.7 with preset E; then Lossless Scaling may look better, but I don't know.
TSR definitely shows way less ghosting, so I assume the TAA samples are not too badly tuned, and the issue is, as usual, with either FSR upscaling or FSR FG. Maybe they are not using the latest build, but the in-game files are encrypted or something, and FSR doesn't show its version, unlike XeSS or DLSS.
But if you don't notice any extra or terrible ghosting with LS, that's promising to say the least.
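For anyone confused by how those multipliers stack, here's a tiny sketch of my own reading of the setup described above (an assumption on my part: FSR 3 FG doubles the presented frame rate, and Lossless Scaling x4 then multiplies that already-doubled output), keeping in mind that input latency still tracks the real rendered frames, not the generated ones:

```python
# Rough sketch (my assumption of how the two frame generators
# stack): FSR 3 FG presents 2x the rendered frames, and
# Lossless Scaling x4 multiplies that output again.
def presented_fps(rendered_fps, fg_mult=2, ls_mult=4):
    return rendered_fps * fg_mult * ls_mult

# e.g. a 30 fps rendered base would be presented at 240 fps,
# but only 30 of those frames per second reflect real input.
print(presented_fps(30))                 # -> 240
print(presented_fps(30, ls_mult=1))      # FSR FG alone -> 60
```

So the smoothness is real, but responsiveness and artifact visibility are still tied to that low rendered base, which is probably why it feels fine on a benchmark camera and might not in actual gameplay.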
@@FranciscoRestivo96 I have no issues understanding everything you said, and trust me, English is not my native tongue either, so it's fine. Looking at your user name, I assume yours is Spanish, same as me; I'm from Spain myself, so I'm already doing real-time mental translation every time I post here. Anyways :P