For the best texture filtering, it's best to leave it off in-game and force AF in the Nvidia Control Panel so you get true 16x AF, because from what I've been reading, the in-game "Best" texture filtering setting does not equal 16x AF.
Nice, I've got a similar setup but with a 7900 XTX. You confirmed what I was thinking: the difference in framerate between 1080p and 4K wasn't massive, but that's because of the CPU bottleneck.
It's sad to see how badly it runs. My 7800X3D has stutters very often whenever there are players around, and some places in SotO run like ass. I can literally pan the camera and go from 120 to 40 fps.
I wonder if it's an AMD thing. When I recently upgraded to an Intel 14700KF, my GW2 went from running really nicely most of the time with mostly max settings (minus shadows and sky stuff I don't care about) to running 250 fps almost all the time, with few dips big enough to feel like a stutter. Finally zero issues. Crazy that you have to spend this kind of money to get it to run smoothly, though I did buy mostly open-box and used parts, minus the CPUs, and I got a killer deal buying the 3090s together, which made them so cheap I felt like I'd committed a crime. Though I have to say, for some reason ShadowPlay doesn't like my Intel PC and makes it stutter 30 seconds after turning on Instant Replay. That's annoying.
@@jdgoesham5381 You are obviously lying. The poor performance is due to engine limitations; the game will never run smoothly in all scenarios. You can literally look up your CPU + GW2 to see what the performance looks like, and as expected it's not great. I don't get what you're trying to achieve here.
Is this DXVK tool allowed by ArenaNet? Because I'd like to use it! I don't like DirectX 11. I remember when I used d912pxy, a tool that let you go from DirectX 9 to DirectX 12, and the change was huge, a big performance gain! With my Ryzen 7 5700X3D + RX 7800 XT I get micro-stuttering, and I hate it!
They need to either modernize this engine or just make GW3. The performance in this game was unacceptable when it launched, and it's only gotten more unacceptable over time.
I personally run this game on a Ryzen 9 7950X3D and a 4090. It runs pretty well, with very minimal stutter every now and then. You'd get a lot more fps if you turned Reflections off and set Character Model Limit to Low. It will pretty much stay at 60 fps even in the most demanding metas.
I run an undervolted and well-cooled 14700KF, an RTX 3090, and 32 GB of DDR5 these days, and I usually get 225-250 fps even in WvW zerg fights. Mostly max settings, but no supersampling, on an MSI MPG 321URX 4K monitor. I don't PvE, so I don't know what it would get in world bosses or meta events, but I'd wager not much less. The WvW screen vomit gets crazy with all the siege going off as well. I did like how it ran with my 7800X3D, but it would still sometimes stutter; now it's insanely smooth. Funny thing is, I don't play like I used to. Alliances in WvW kind of made it bleh for me.
Hah, nice to see that even with your CPU the game runs like ass during high player-count events. :D I have a 5700G + RX 6800 XT and I'm getting nearly the same framerates. On Linux.
Isn't the RTX 4090 extremely bottlenecked? I'd assume that switching renderers mostly affects the GPU, and the 4090 isn't being challenged at all; it's the CPU that can't keep up no matter the renderer.
DXVK operates as a translation layer to Vulkan, which can improve CPU usage in some games by offloading some work to the GPU; a good example is the DXVK implementation for GTA IV. You are correct, though: regardless of renderer, with my hardware the 4090 is pretty much chilling in the background with nothing to do.
@@Extreme_Narwhal I have the exact same hardware. I recommend reducing the two character model limit settings to Medium, as it greatly improves FPS when it counts. It really helps at the Soo-Won meta, for example.
@@Extreme_Narwhal Btw, the 7800X3D's memory controller handles 6400 MT/s on current BIOS versions, so having 6400 MT/s RAM and an Infinity Fabric clock to match the controller is ideal. I have the exact same RAM as you and I've been able to overclock it no problem; it's rock solid.
Set the RAM speed multiplier from 60 to 64 (for 6400 MT/s).
Set the following RAM timings: 30-38-38-80.
Overclock the Infinity Fabric from 2000 MHz to 2133 MHz (1/3 of the RAM speed).
This will give you nice performance gains, particularly in the 1% and 0.1% lows.
You can also use Curve Optimizer in the BIOS and set -20 mV on all cores. This will allow the 7800X3D to clock higher before hitting the voltage limit. You're not overclocking, you're undervolting, which lets it clock higher in all-core workloads. It's completely safe.
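The ratios quoted above can be sanity-checked with a bit of arithmetic (assuming DDR5-6400, as in the comment):

```python
# DDR5 transfers data twice per memory clock, so "6400" RAM
# actually runs a 3200 MHz memory clock (MCLK).
ddr_rate = 6400        # MT/s, the advertised DDR5 speed
mclk = ddr_rate // 2   # 3200 MHz memory clock

# The Infinity Fabric (FCLK) target quoted above is 1/3 of the DDR rate.
fclk = ddr_rate // 3   # 2133 MHz (6400 / 3, rounded down)

print(mclk, fclk)      # 3200 2133
```

So the suggested 2133 MHz FCLK is exactly the 1:3 ratio against the DDR transfer rate, not against the 3200 MHz memory clock.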
Bottlenecked in what way? The CPU seemed to be at the same level of utilization as the GPU, with both at around 40% in all tests. The 7800X3D is literally the best gaming CPU currently on the market. I think the game engine itself is incapable of running well with the model limit on High or above.
@@Svobone GW2 doesn't have great multi-threaded CPU utilization. The CPU may be running at 100% on one or two cores while the rest are nearly idle, which bottlenecks the GPU.
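To see why the ~40% overall figure from the earlier comment can hide a bottleneck, it helps to convert it into "cores' worth" of work (a quick sketch, assuming the 7800X3D's 8 cores):

```python
cores = 8               # physical cores on a 7800X3D
overall_util = 0.40     # the ~40% CPU utilization reported in the tests

# 40% overall on 8 cores is only ~3.2 cores' worth of work;
# the remaining cores are mostly idle.
busy_cores = overall_util * cores

# Conversely, a single core pegged at 100% reads as just 12.5% overall,
# which is how a hard single-thread bottleneck hides in the average.
one_core_pegged_pct = 100 / cores

print(busy_cores, one_core_pegged_pct)  # 3.2 12.5
```

So a "40% CPU / 40% GPU" readout is entirely consistent with one or two render threads being fully saturated while both averages look comfortable.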
Good video. Appreciate the results. I always enjoy videos like this.