For anyone like me coming back to this: it's mandatory for DLSS Frame Generation, but when NOT using that, I find it makes games stutter far more often and makes frame-time spikes and 1% lows a LOT worse, particularly in RDR2 and the RE4 remake. Use it wisely. Tested on a 3090/3090 Ti/4090 with a 12900K/13900K.
This setting wasn't made to improve performance but to improve input latency at lower refresh rates; this can be seen when gaming on a 60 Hz monitor, which is what the new systems are targeting.
What I am wondering is whether it impacts power draw at all, since it's a task that changes the demand on the hardware in question. There might be a small but interesting difference depending on the GPU's and CPU's mode of operation, or simply their type. And on that note, I have to ask whether the change is worth it or not as well. The same goes if one runs multiple pieces of hard- or soft-hitting software on a machine; that could also make a significant impact, worthy of a brisk test. (If a larger change is seen in wattage pull, it might tell you whether a game or combination was more or less efficient prior to the change, for worse or better. It would be either interesting or completely mundane, but therefore worth a checkup, to see if a game engine behaves differently.)
Wonder how much it could help for older generation CPUs, especially those with lower memory bandwidth, and slower/less cache. I expect the gains could be the highest there
This is what I was wondering about too. Older but still capable CPUs such as a 4770K paired up with something like an RTX 2060 to see if anything changes with HAGS being on or off.
I mean it boosted my performance on Dying Light 2 RTX maxed...had issues with gpu usage but this completely maxes it to 100% all the time now (= more fps). Also forces my cpu to be used more (extra fps)...and I'm on a 3080 12 gb/11700k OC
It's 2022 by now and I've recently tested on two notebooks, one with an RTX 3050 and another with a GTX 1650. This makes some games very unstable, like Final Fantasy XV for example, and others don't seem to have improved in any way... I've used the latest graphics drivers... I would recommend switching it off; it seems like it won't improve latency or FPS, and will mostly degrade performance...
@@amgh4743 Yeah, my problem with it is that it causes hiccups in some games that are buttery smooth without it; at least I've seen this happening in a few games. I would only use it in competitive FPS games... (My system is not high end but not bad either, so it may not be the problem: Dell G15, RTX 3050, i5-11400H, NVMe SSD.)
It does prevent stutter on one monitor from affecting both when the monitors have different refresh rates, because each monitor gets separate desktop composition. Also, if you ever play a game in windowed mode, 1 frame of input lag is added due to DWM's triple-buffered vsync (and previously this ran according to your lowest-refresh-rate monitor, instead of the monitor the game is on).
Some people report fixing stutters in VR by this, which is probably related (rendering on monitor + the VR headset). community.openmr.ai/t/windows-2004-hags-nvidia-pitool-261-catalyst-dcs/28905
@@bartholomew4487 no, no they do not. Nowadays people don't experience tearing if they have variable refresh rate displays... freesync and gsync will eliminate that as well. But real gamers do not use vsync on fixed refresh rate displays, it absolutely kills input latency. 15-20 years ago none of us used vsync whether we had crappy 60hz monitors or better CRT's that could do decently high resolutions at 75, 85, 100+hz. Never vsync you noob. On a slow single player campaign, non-fps, or cut scenes, yeah sure. No fast fps player EVER HAD VSYNC ON YOU UTTER NOOB GTFO!
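To put the "1 frame of input lag" from DWM's windowed-mode composition (mentioned a couple of comments up) into actual numbers: one buffered frame is simply one refresh interval, so as a rough sketch the added delay shrinks as refresh rate rises.

```python
def frame_time_ms(refresh_hz: float) -> float:
    """One refresh interval in milliseconds; DWM's composition vsync
    adds roughly this much latency per buffered frame in windowed mode."""
    return 1000.0 / refresh_hz

# 60 Hz -> +16.7 ms, 144 Hz -> +6.9 ms per buffered frame
for hz in (60, 75, 144, 240):
    print(f"{hz:>3} Hz -> +{frame_time_ms(hz):.1f} ms per buffered frame")
```

This is why the same one-frame penalty that is barely noticeable at 144 Hz feels much worse on a 60 Hz panel.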
The purpose of hardware-accelerated GPU scheduling is not to improve performance, latency, or even jitter in games. The main purpose is for when you have multiple applications using the GPU at the same time. Think of multiple viewports in Blender, or a game editor, a web browser, and a game all running at once. Similarly, it might help in virtualisation scenarios. In such situations the improvements might be visible. It might also help in some scenarios within a single application where many threads submit work for the GPU at the same time, but in reality such scenarios are already pretty well optimised on the game side, close to optimal (or as close as possible using known algorithms), so it will rarely help. But hardware scheduling does have the advantage of making new things possible in a more efficient way in the future, so that remains to be seen, though.
@@oxfordsparky How would that change anything? The results would be even less noticeable there. This offloads a process the CPU used to do to a dedicated chip on the GPU. Putting a bottleneck on the GPU side would reveal nothing.
@@WillisPtheone Actually, it would still show similar results. If the bottlenecked GPU can now manage its own memory, it may improve its performance enough to lessen the bottleneck. Testing on a super high-end GPU that isn't memory bottlenecked would show less of a change.
Just some feedback from my experience. Old CPU (i7 4770K) and a GTX 1080. I got 40-60 fps no matter the graphical settings in Squad. Turned hardware GPU scheduling ON and got a 15-20 fps boost, plus better lows; it rarely dips to 55 fps now. The boost is HUGE on my old CPU. Try it if you are in my situation.
3950X/2080 Super. Every game I've tested has gained a couple of frames in the same areas and has better 1% and 0.1% lows. Also, GPU usage actually reaches 100 percent now; before, it never EVER did.
@@user-zu1ix3yq2w Ah man, I can't remember the exact games anymore, but I mostly play old modded stuff lol. It's an X570 board with a 3950X and 32 GB 3200 MHz Samsung B-die with SUPER tight timings, and I've since upgraded to a 2080 Ti when the 30 series was announced and there were panic sales (snagged one new for 650 USD). HAGS improved performance notably and memorably for me in 2 games off the top of my head: heavily modded Skyrim SE (like 1k mods) and modded Stalker (Anomaly), although I can remember pretty much everything seeing an improvement except for No Man's Sky. I don't know if these statements hold true anymore, as there have been many Windows builds and driver updates since that 1-year-old comment.
I have just had to turn this off because it made Desktop Window Manager put 100% load on my GPU while gaming, causing an unplayable experience: fps drops and stuttering (RTX 3080). Where am I going wrong...
You're a goat, man; this fixed my CPU and GPU load. I have a 3700X running at 4.45 on all cores, but it was holding my GPU (2060 Super Founders Edition) back from performing at its max. Anyway, I turned this on, tested it myself, and saw something like a 200% FPS boost in a game I play... Insane.
What might be an interesting test: measure input latency in a few games (or settings combinations) with low, medium, and high framerates, specifically with buffering disabled. This is realistic because some people, especially in competitive games, want every latency advantage possible. Then enable this feature and see if you can achieve better performance at the same latency, better latency at the same performance, etc. If you can also use this with buffering, that's a candidate for equal latency with higher performance. It's almost assuredly going to be a wash, but I'm sure some are interested to see (myself included).
Artcore103 you can't go "too high" with fps. The game feels better, and for someone who spends thousands of hours in one single game, getting it to perform 3-5% more consistently is huge.
@@Artcore103 black desert online has animation speeds and skill speeds scaling off fps for whatever reason so in this particular case it unfortunately does matter. that's why you see people with 5000 dollar PCs run that game in the lowest potato graphics settings possible to squeeze out more FPS, it's just ridiculous.
My experience when enabling the feature:
1. Unigine Heaven benchmark: +10 FPS (220 to 230).
2. SOTTR benchmark: -20 FPS (120 to 100), with VERY weird frametime behaviour (please test for yourself to see): the CPU and GPU frametime graphs are EXACTLY the same, an exact overlay of the graphs. Looks so strange... I have no explanation for it.
3. CoD Warzone/Multiplayer: +XX FPS (seems higher, with better lows, but I haven't benchmarked it with the feature disabled yet).
My setup: Ryzen 5 3600 with a GTX 1080 Ti, 16 GB RAM at 3600 MHz.
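Since several comments here quote averages and 1% lows, here is a rough sketch of how those figures fall out of raw frametimes, using a simple "slowest 1% of frames" definition (capture tools like PresentMon or CapFrameX vary in the exact method, so treat this as illustrative only):

```python
def fps_stats(frametimes_ms):
    """Average FPS and '1% low' FPS from a list of per-frame times in ms.
    The 1% low here is the FPS over the slowest 1% of frames, a simple
    percentile-style definition; real tools differ in the details."""
    times = sorted(frametimes_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    worst = times[-max(1, len(times) // 100):]  # slowest 1% of frames
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# 99 smooth frames at 10 ms plus one 50 ms hitch: the average barely
# moves, but the 1% low collapses, which is why lows get quoted here.
avg, low = fps_stats([10.0] * 99 + [50.0])
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")  # avg 96.2, 1% low 20.0
```

The example shows why a feature can leave average FPS unchanged while still making a game feel smoother (or stutterier): the lows move independently of the average.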
You should take a look at this again. I recently got a pretty good performance bump by enabling this setting when trying to resolve the lower GPU usage I was getting in RDR2 using Vulkan with HDR. I updated my motherboard BIOS and vBIOS to enable Resizable BAR on my 9900KF and 3090 build, but I noticed in RDR2 I was getting less GPU usage, though only with Vulkan and HDR enabled. I went from 90-91% usage up to 97-98% by enabling this. I, of course, also tested the differences in the benchmark: I went from around 73 fps to 79 fps, doing multiple runs to come to an average. I also tested GR Breakpoint's Vulkan implementation and saw a consistent 2 fps average increase in the benchmark, from 94 up to 96 fps.
Let me save you 20 mins: hardware-accelerated GPU scheduling (the new setting in Windows) won't give any noticeable performance improvements. They mostly came at the problem from the CPU side.
After enabling this feature I noticed that the frame tearing issues in my games with vsync off have been solved. I tested The Witcher 3 (vsync off), Apex Legends with triple buffering, and Shadow of the Tomb Raider (vsync off). My rig: i5-9400F, RTX 2060, 2x8 = 16 GB RAM, B365 mobo, with a 76 Hz IPS panel display.
Sorry, the tearing wasn't gone, but you can hardly notice it, especially in Apex. I was having a huge issue when the fps wasn't capped at 144: I used to get major tearing while fps fluctuated between 150-220, but after enabling GPU Scheduling I can't notice any tearing (in Apex Legends at 150-220 fps).
Hey man, I have the exact same set up as yours. RTX 2060, i5 9400f and (8*2) 16Gb Ram. I was wondering if you play warzone, since whenever I play I always get these stutters and FPS drops every once in a while. I run everything on high and i have an avg FPS of 130, but I can't seem to figure out the problem.
I'll leave it off until there is a known difference in performance, knowing how Microsoft is lately, they'll push an update that breaks performance or games at some point.
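For anyone who wants to leave it off (or check what a Windows update did to it later), the toggle behind the Settings switch is, to my knowledge, a single registry value, `HwSchMode` under the GraphicsDrivers key (2 = on, 1 = off; a reboot is needed for changes to take effect). A sketch using the stock `reg` tool from an elevated prompt:

```shell
# Check whether hardware-accelerated GPU scheduling is currently enabled
# (HwSchMode: 2 = on, 1 = off; value may be absent on unsupported GPUs/drivers)
reg query "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v HwSchMode

# Force it off without touching the Settings app, then reboot
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v HwSchMode /t REG_DWORD /d 1 /f
```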
Hardware-accelerated GPU scheduling negates the need for frame buffering. So no, you will see no change in most cases. Watch the video again and pay attention to the words he is saying this time. The improvements you would see from disabling frame buffering ARE THE SAME ones you will get from hardware-accelerated GPU scheduling. Think of turning your PC off with the power button versus yanking the power cord: different methods for the same result. Pressing the power button after yanking the cord would be pointless.
This is a good question. I know that disabling future frame rendering in Battlefield 5 in DX11 will reduce frame rate, because the GPU and CPU have to pause while they wait on each other. Maybe GPU scheduling will improve frame times slightly in this scenario?
@@WillisPtheone It's so hard to understand what this guy is saying in the video, and I have no problem with other YouTubers. So I also did not clearly understand what he was saying the whole time...
It's this: steamcommunity.com/sharedfiles/filedetails/?id=1604225602 The app for animated wallpapers is called Wallpaper Engine on Steam; very cheap app. I feel like an advertising guy, answering every comment asking what the wallpaper is lol
Just saw that @AbelSlayer linked to the specific Firewatch wallpaper that was used in another comment. steamcommunity.com/sharedfiles/filedetails/?id=1604225602
I think the best use case for GPU scheduling can be observed when there are multiple programs trying to execute loads on the GPU, e.g. streaming with OBS while gaming, hardware-accelerated video playback while gaming, etc. Maybe something to look into ;)
Actually, people are reporting problems with multiple programs. I personally have problems with GPU scheduling and a game running with RTX Voice; others with OBS and a game running. I think it forcefully gives the active window all GPU resources, or something like that...
Good stuff GN. Would still love to see you benchmark anti-seize (the automotive product) as thermal compound with your testing setup; I've seen for a long time how people claim both the silver and copper versions of anti-seize work really well as thermal paste.
You need multiple processes competing for GPU resources in order for changes affecting scheduling of multiple processes to make any difference. Testing with one process should reveal no change. As has been stated below, an example of multiple processes utilizing the GPU would be streaming a game using GPU video encoding. I would suggest you run your tests while streaming and recording using GPU encoding at the same time. That might show an improvement, as it is a situation where there is actually some scheduling going on. Simply running a game by itself is practically no scheduling load. This is akin to benchmarking a six-core CPU versus an eight-core CPU, both at the same clock speed, with a single-threaded workload: of course they'll give results within the margin of error. You need to test multiple threads at the same time to reveal differences between six and eight cores. Test with one and two encoding jobs.
Clayton Macleod It seems that in those cases the effect is negative... It also means that it is almost impossible to get comparable results, so the difference from run to run would be much bigger than turning the option on or off, even in the cases where it does not crash one program or another...
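The point above, that scheduling only matters under contention, can be illustrated with a toy queue model (purely illustrative, not how WDDM's scheduler actually works): with a single submitter there is nothing to arbitrate, so any policy produces the same result, while with several submitters the chosen order changes how long work waits.

```python
import itertools

def total_wait(jobs, order):
    """Sum of completion times when jobs (durations) run back to back
    in the given order; lower means less aggregate waiting."""
    elapsed, total = 0, 0
    for i in order:
        elapsed += jobs[i]
        total += elapsed
    return total

# One submitter: every possible "schedule" is identical, so the
# scheduling policy cannot change the measured result at all.
single = [5]
print({total_wait(single, p) for p in itertools.permutations(range(1))})

# Three submitters competing: order now changes aggregate wait, which
# is where a better scheduler could actually show up in a benchmark.
jobs = [1, 5, 10]
waits = sorted(total_wait(jobs, p) for p in itertools.permutations(range(3)))
print(waits[0], waits[-1])  # best (shortest-job-first) vs worst ordering
```

That is exactly the six-core vs eight-core analogy from the comment above: test a single job and every policy looks the same.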
Finally, the explanation of scheduling I've wanted. Great video, and subbed. PS: what I've found is that with my GTX 1650 laptop this has certainly made a difference.
As soon as the driver dropped for AMD I've been using it on my 5700 XT. I've tested a few games: BL3, Destiny 2, and Dead by Daylight. As far as I can tell there wasn't much of a difference other than MAYBE a 2-4 fps increase in games (mainly frame stability and less stuttering). I haven't had any issues with it: no crashing, no game stability issues. I'm on a 2700X and 5700 XT.
Some media outlets have reported visible improvements with low-VRAM cards such as the GTX 1650. It makes sense, since a lower VRAM amount should require more management when it gets full.
I can say, after enabling hardware-accelerated scheduling, that my VR experience (with most games that were overloading the CPU) has improved. Some games not so much: it causes stuttering in Half-Life: Alyx, but improved others.
Everyone keeps talking about frame rates and latency, but I'm wondering if this could eventually help with frame pacing and some kinds of stutter (which, tbh, is more important to me). I hear about GPUs getting more control over draw calls and wonder, but that could be wishful thinking.
It would be interesting to see how this feature compares, input-latency-wise, to enabling low latency mode (aka the old max prerendered frames = 1 setting) in the GPU driver. I'd guess the negative scaling in some of the games comes from the fact that with hardware scheduling, batching is inherently disabled. Typically, enabling low latency mode in the driver has the same slightly negative effect on fps, but the input latency improvement is huge, to the point that I pretty much run every game with it enabled except maybe turn-based games. (Running a 6700K + GTX 1080, btw.)
@@BromTeque I checked out the video, I now remember that I saw it at the time. However I mostly play AAA games at High/Ultra settings where I can barely hit 60 fps and dip to 50 fps sometimes. I also cap the framerate at 60 in these scenarios for more consistent latency. I agree with Battle(non)sense's findings and felt it myself (as anecdotal evidence) that input lag is reduced while the GPU is not maxed out. But I feel like to be 100% sure that I stay below the FPS limit I need to give the GPU more headroom like 20-25% because graphics intensive games often have inconsistent performance. That would mean lowering the settings which I am unwilling to do. Basically I believe his solution applies to esports games.
@@ecvent0r Yeah, I rewatched the video just now to check if I remembered correctly. You'll have to assess the situation yourself. Low latency mode might be right for you, as you have assessed. I'm a fps/input latency snob and I'll gladly trade graphics quality for more fps/better input lag.
In my experience with Mount & Blade II: Bannerlord, Low Latency Mode considerably reduces the input lag on big FPS drops (but it's still there). GPU Scheduling seems to have removed the problem completely: no matter how much the FPS drops (and it seems to drop just as much), I don't "feel it" in the input at all. Never had such a smooth experience with that game in big battles before.
In Elite: Dangerous turning on HAGS literally HALVES my VRAM usage. Off I regularly see 9500 MB or more usage, but with HAGS on it's typically 4500 or less.
@@MichaelReznoR Games would load that data no matter what. Assuming the actual VRAM usage has been reported right in both cases - that would most likely suggest less redundant structures and less buffering necessary, which is still a decent improvement.
I had to turn hardware-accelerated GPU scheduling off, as it was lagging the stream I was watching on my second monitor whenever I started playing a game.
I had a similar experience while watching videos on YouTube. Even the Steam overlay had this lag, and I noticed my desktop felt laggy when Windows started and I moved my mouse around and clicked on things.
Just want to add my findings with an R9 3900X and 1080 Ti, tested by playing COD Warzone while livestreaming to Facebook using Streamlabs OBS. Enabling the feature improved minimum fps by about 10 and reduced CPU usage by 10% on average; however, for some reason the livestream was definitely not as smooth with the feature on versus off, despite showing 60 fps in the live stats in both cases. VRAM usage also seems to be reduced by 1 GB (10 GB vs 11 GB) with it on. GPU usage was 99% in both cases, with CPU usage averaging around 40-50% with it off and 30-40% with it on, which also led to cooler temps by about 3-5 degrees. Average fps was not hugely impacted in either case, hovering in the 80s with all settings maxed out except ray tracing, at 3440x1440. Minimum fps improved quite a bit, from around 62 fps with it off to about 74 fps with it on.

Tbh though, on a G-Sync display, which is what I am using, there is no distinguishable difference between on and off while gaming. Where you will notice a difference is streaming: for some reason the stream seems to run at around 45 fps when on instead of the usual 60 fps, without changing any settings in Streamlabs. This could be a bug, as the feature is new and Streamlabs might need an update to iron it out. As it stands right now, if you are planning to stream, you might need to play with the settings a bit to get it to run at a stable 60 fps; otherwise, just leave the feature off. If you are not streaming and just gaming, you might find some benefits to enabling the feature, if only for the lower CPU usage, temps, and better minimum framerate, though I suspect the differences will be even more negligible when not streaming, as CPU usage hovers around the single digits to low 10s anyway.
When I enabled this, it gave me constant micro stutters in WoW Classic; the problem resolved immediately once I turned it off. My system is an 8700K, 2080 Super, 16 GB 3000 MHz Team Group RAM, and a 970 EVO 1 TB Samsung boot drive.
For some reason this made my CPU OC very unstable and I started bluescreening in games. My OC was 4.4 GHz on a Ryzen 7 3700X, which I'd had no problems with, and when I turned this off everything went back to normal.
1 Year Update: it has significantly reduced the power draw of my i9-10900K. I immediately noticed my CPU temp is much, much more consistent. Using a Kraken X53 280 mm AIO with Arctic MX-4 paste and running 5.1 GHz on all cores, my temp varies between 30 and 31 degrees Celsius now; previously it would hover between 34 and 39.
There's noise coming from some in the DCS World flight sim community that scheduling on 2004 is raising the minimum FPS in DCS for VR users. DCS basically runs an entire complex simulation on a single thread. Needless to say, that puts a high burden on one CPU core, meaning high core clocks are needed to compensate. The single core struggles to push data to the GPU, and for VR users this results in poor performance, relying entirely on the VR driver's motion smoothing to make it even baseline playable. I haven't been able to test it myself because MS still hasn't made 2004 available for download; I'd prefer to do an update install. DCS World in theory might benefit from this feature, because even a little boost to minimum FPS would help.
@gamers nexus I installed 2004 today and was doing some benchmarks, comparing before and after: 1903, 2004, and 2004 with hardware-accelerated GPU scheduling. Framerates were identical. BUT while running the benchmarks I was monitoring CPU load, wattage, etc.: with hardware-accelerated GPU scheduling turned on, the CPU was at much lower utilization, wattage, and temperature; it would even clock down, with no discernible loss of performance. Without it, it would stay pegged at max all-core boost. It doesn't seem like a big deal, but the game's benchmark mode has little to do with in-game performance; it's just something repeatable I can use to compare changes. While actually in game, the CPU is much more heavily loaded, and the fact that some CPU load was eliminated helped increase the minimum frame rate and make the game feel significantly smoother. Completely unscientific results, I know, but I think it merits a closer look. (2070 8 GB Max-P mobile, i7 9750H.)
My experience so far with my i7 8700K @ 4.8 GHz, 16 GB DDR4, RTX 2080 Ti, on a 1080p 144 Hz G-Sync monitor, Windows 10 Pro 2004, latest Nvidia driver 451.48. All games are installed on a Samsung 860 EVO 1 TB SSD, with zero background programs, only Steam running. All games at 1080p 85 Hz ULMB with HAGS ON:
FF15: highest possible in-game settings including the Nvidia works; smooth, stutter-free, amazing result.
Batman: Arkham Knight: highest settings; stutter-free, an almost nonstop 85 fps ULMB experience.
The Division 2: smooth as butter, no more stutters or fps spikes; a whole new level of happiness.
Deus Ex: MD: highest settings except MSAA off; stutter-free and super smooth.
Call me crazy, but this HAGS thing seriously stopped every microstutter I ever had. I'm glad, and can't believe how smooth games run now without stuttering.
Steve should have tested this on older CPUs, especially since this helps the CPU more than the GPU. But no, he really thinks everyone is using the latest i3 and pairing it with a 2080 Ti... lol
Some people like me buy PC components once every couple of years to be future-proof. A CPU and GPU must play games for at least 3 to 5 years at maximum settings in the fps range you want. It's just insane and crazy to spend money every year on new builds. I think 80 to 90% of the people who buy or build a gaming PC for themselves do it to enjoy it for more than 1 year before building a new one. Yes, it sometimes pains me when people say you have an outdated PC if it's only 1 or 2 years old, hyped up by YouTube videos about the latest hardware some people can't even afford. I love people who understand that.
@@GamersNexus I have the same issue; I had to roll back the Nvidia drivers two releases and turn on the Windows thingy. What happened is that as soon as my GPU got load on it, the screens turned black but I could still hear sound. I've got an i7 8700K with a GTX 1080 Ti ASUS Strix.
I was running with this enabled for a year or more. Then, out of the blue, my video card would just blank out to a black screen, forcing a hard restart. Turned this option off and no more crashes. 😕
I had random reboots at idle about every other day until I turned HAGS off so it definitely isn't 100% stable. I'm leaving it off from now on until it's required which will probably be a long time from now.
Great video. Since they said this would get better in the future, I would love to see results today, 3 years later, with the top 10 multiplayer games...
Watch our New vs. Old R5 3600 Benchmark here: ua-cam.com/video/TJjqUyrWgwg/v-deo.html
Find our CPU testing methodology for 2020 over here: ua-cam.com/video/sg9WgwIkhvU/v-deo.html&feature=emb_title
Article version of CPU testing methodology: www.gamersnexus.net/guides/3577-cpu-test-methodology-unveil-for-2020-compile-gaming-more
Article for this GPU scheduling piece: www.gamersnexus.net/guides/3599-windows-10-hardware-accelerated-gpu-scheduling-benchmarks
Very Epic
You are a Big Brain.
Will you test GPU Scheduling for input latency as well?
bruh that valley wallpaper tho. where did you get that and how?
how would this affect sli?
I had it turned on the past few days and I've had some unexpected game crashes. Shadow of the tomb raider, wreckfest and Forza horizon 4. I turned it back off for now and everything is back to normal.
Great information. Thanks for letting everyone know!
@@GamersNexus Glad to help, GPU is evga 1080 ti SC2.
When does FH4 not crash on PC...
The problem might be that you don't have a supported GPU; if you check Nvidia's website, they say it's 20-series only.
@@gandalf6700 Pascal has support for the feature, Steve said this in the video. Also, the option would not appear if the GPU didn't have support.
Thanks so much for being so quick to cover topics like these! I saw this scheduling news yesterday and I knew that Gamers Nexus would be on it. Keep up the good work guys
@Nismo touché
I'm not going to use it till after a lengthy break-in period.
Bro you late asf 😭😭
So... remember to turn it on in like 3 years. Got it.
I'm sure a Windows Update will do it for you without warning at some point. ;)
Yeah like "Edge" whatever tf that is...
me too :(
@@RackBaLLZ The best browser ;)
Not even joking, this fixed my crashing issues in a few games with the new 3080 FTW3... Guess it became helpful sooner than we thought.
I was waiting on this!
Likewise. And I'm waiting for battle(non)sense channel to do a video on it
me also
Fuck yeah, for the longest Noooo one talked about this
I suggest analyzing the latency, to see if there is any improvement in the time from the click of a button until you see it on the screen.
It messed up my Streamlabs OBS, which kept dropping frames. I disabled it.
It would need an update... and that's with a game cap, right?
I don't know if you know, but running SLOBS in admin mode allows it to be first priority in windows, thus not dropping any frames.
@@korphiltag9364 I didn't have any issues before. I have a 3950x and 1080ti. 32gb RAM etc. More than capable hardware. Fiber optic internet too 500/500. It was a software based fuck up.
@@TheRisingMiles try it as a precautionary measure, as running in admin mode cant hurt anything.
Slick way to plug your stream lol.
Just paving the way for something that can be good in the future, when GPU companies and MS can work together on perfecting the tech.
Noticed something interesting when streaming and monitoring the stream on Chrome, it seems it improved performance on the Chrome tab running twitch while playing MHW, but other than that, gaming remained the same. MHW is terrible to stream and monitor on the same machine, causing discord and chrome to slowdown, typing was impossible on discord, enabling gpu scheduling fixed the typing and the frozen twitch player on Chrome.
Sounds like an outlier more than anything.
I feel HW scheduling freezes feeds in any other app if you're running a game in full screen or windowed mode alongside it.
Not especially for a specific game.
@@GamersNexus It might be the case; the game is poorly optimized. 4 friends have the game, and they experienced the same behavior while playing Monster Hunter World and typing on Discord: the input lag with each keystroke was really high, almost half a second. I'm just waiting to see if I can get those folks to test and report back.
@Nismo monitoring stream while streaming is something common.
Bizarre, I've had the complete opposite experience when enabling GPU scheduling: every game I've tried, including MHW, has made other windows incredibly stuttery and slow to update. I pretty much can't watch videos and play a game at all with this feature enabled.
We need to revisit this
Agreed. Would also like to see if enabling HAGS causes crashes.
@@prateekpoddar1890 I have definitely been using it for the past 4 months and I have to say I haven't encountered issues. On the other hand, I've found that running it with OBS, with OBS not in admin mode, has let me push my GPU and CPU further without dropping frames or causing any encoding issues, so I'm all in for running it 24/7. 5900X, 3080 10G, 64GB 3600MHz RAM.
@@RayneYoruka windows 10 or 11?
@@wrywndp Windows 10, 22H2.
It's been 3 years. Would be really interesting to see if there are any changes or benefits now since you last tested it.
Seems like Moonlight + Sunshine don't like that setting on; it looks like it causes a lot of stuttering.
I second this!
This setting basically paralyzed my computer. Turn this garbage setting off
Microsoft's vague description of the new feature sounds to me like it could be targeted at future MCM-based GPUs. Moving the scheduling onto the GPU allows it to "hide" some of its internal topology from the rest of the system. It could enable a kind of "GPU I/O die" taking care of the internal workflow of the GPU.
A future GPU I/O die could make sense. In order to replicate the new consoles' storage system, they need to bypass the CPU and load the game data directly into GPU memory. I can imagine future GPUs coming with an SSD connection.
@@Sunlight91 Considering that there are AMD GPUs _today_ with on board SSDs, yes, I would not be at all surprised to see that (they're currently only for their professional graphics line. Maybe we'll see it expand into consumer?)
@@Sunlight91 You must watch moore's law is dead. Let us hope this is the case and is coming soon!
@@mduckernz Yes but we're talking MUCH MUCH faster SSD/cache implementations.
@@Sunlight91 And now Microsoft has added the DirectStorage 1.1 API to Windows, with GPU asset decompression. Only one game out uses it (and I think 1.0, not 1.1), Forspoken (it seems crap). But that's not surprising with a new API. Looking forward to it becoming more commonly used. Good call!
20:04 minutes of video, Windows 2004, that makes sense.
Illuminati confirmed
20:03
For anyone like me coming back to this.
It's mandatory for DLSS 3 Frame Generation; however, when NOT using that, I find it makes games stutter far more often and makes frametime spikes and 1% lows a LOT worse.
Particularly on rdr2 and re4 remake.
Use it wisely.
Tested on 3090/3090ti/4090
12900k/13900k.
13:53 - scheduling disabled, enabled, disabled, enabled.
14:58 - scheduling disabled, enabled, enabled, disabled.
16:01 - scheduling enabled, disabled, enabled, disabled.
You guys gotta be consistent in your graphs.
Ikr? That threw me off...
People with lazy brain... pff
@@Gragagrogog a brony calling someone a lazy brain
They were all highest FPS to lowest FPS though...
@@UnOrigionalOne he knows, he's suggesting that was not the better way to go about it
Glad you guys looked into this. My lesson learned here, disable it for now until we see any improvements (if any) in the future.
Microsoft having it off by default (for now) does tell a tale, doesn't it? :P
It's more or less a beta feature right now
This setting wasn't made to improve performance but input latency at lower refresh rates. This can be seen when gaming on a 60Hz monitor, which is what the new systems are targeting.
Thank you for clearing things up Steve. Great work as always.
2:47 I like the quotations for graphics settings.
Yes joke go funny haha
What I am wondering is whether it impacts power draw at all, since it's a task that changes the demand on the hardware in question. There might be a small but interesting difference depending on the GPU's and CPU's mode of operation, or simply their type. And on that note, I have to ask whether the change is worth it when running multiple pieces of hard- or soft-hitting software on the same machine; that could also make a significant impact, worthy of a brisk test. (If a larger change shows up in wattage, it might tell whether a game or combination was more or less efficient prior to the change, for better or worse. I think it would be either interesting or completely mundane, but therefore worth a checkup to see if a game engine behaves differently.)
Is there any chance to revisit this option in 2023?
I am more curious how this plays with AMD drivers than anything else. Excellent explanation, well worded and easily understood. Thumbs up Steve!
20.5.1 is a trash can of a driver...
@@VentureNW For you...
Finally, I was waiting.
HAGS will change everything!
Yeah
can you guys revisit this?
Wonder how much it could help for older generation CPUs, especially those with lower memory bandwidth, and slower/less cache. I expect the gains could be the highest there
This is what I was wondering about too. Older but still capable CPUs such as a 4770K paired up with something like an RTX 2060 to see if anything changes with HAGS being on or off.
I mean it boosted my performance on Dying Light 2 RTX maxed...had issues with gpu usage but this completely maxes it to 100% all the time now (= more fps). Also forces my cpu to be used more (extra fps)...and I'm on a 3080 12 gb/11700k OC
Also nice video about update 2004 being 20:04 in length :D
it says 20:04 in the thumbnail, but it is actually 20:03 in the video
@@Patrick73787 For me, it's 20:04 in the video too.
YouTube has gone mad with adding and removing a second from the video length.
@@ToniML200 Are you on mobile? I see 2004 as well
I was on mobile, for those wondering
20:04 for me, mobile
It's 2021 now, did it improve anything? Should I turn it on or off?
It gives me like +1fps in most games, but causes crashes and stuttering in some games, so I turned it off.
It's 2022 by now and I've recently tested on two notebooks, one with an RTX 3050 and another with a GTX 1650. This makes some games very unstable, Final Fantasy XV for example, and others don't seem to have improved in any way... I've used the latest graphics drivers... I'd recommend switching it off; it seems like it won't improve latency or FPS, it will mostly degrade performance...
@@RISSITECH I tested it with Valorant... yes, it did improve the latency, but it just made the performance awful and reduced FPS...
@@amgh4743 yeah, my problem with it is that it causes hiccups in some games that are buttery smooth without it. At least I've seen this happening in a few games; I'd only use it in competitive FPS games... (My system is not high-end but not bad either, so it may not be the problem: Dell G15, RTX 3050, i5-11400H, NVMe SSD.)
@@RISSITECH yeah man, games like Valorant and Apex, the ones I play, the FPS stutters when I switch it on...
It does prevent stuttering on one monitor from affecting both when the monitors have different refresh rates, because each monitor gets separate desktop composition. Also, if you've ever played a game in windowed mode, 1 frame of input lag is added by DWM's triple-buffered vsync (and previously this was tied to your lowest-refresh-rate monitor, instead of the monitor the game is on).
Some people report fixing stutters in VR by this, which is probably related (rendering on monitor + the VR headset). community.openmr.ai/t/windows-2004-hags-nvidia-pitool-261-catalyst-dcs/28905
extra frame of latency? but what if you run vsync off like a normal person?
@@Artcore103 Most people run vsync on.
Should i use this feature if i want my GPU to be safe on the amazon mmo New world ?
@@bartholomew4487 no, no they do not. Nowadays people don't experience tearing if they have variable refresh rate displays... freesync and gsync will eliminate that as well.
But real gamers do not use vsync on fixed refresh rate displays; it absolutely kills input latency. 15-20 years ago none of us used vsync, whether we had crappy 60Hz monitors or better CRTs that could do decently high resolutions at 75, 85, 100+ Hz. Never vsync, you noob. In a slow single-player campaign, non-FPS, or cutscenes, yeah sure. No fast FPS player EVER HAD VSYNC ON YOU UTTER NOOB GTFO!
Could you check if those results are still valid today, with current GPUs and more recent games?
It would be super cool to see you guys revisit this setting for the ray tracing generation of games.
How about an update for these benchmarks 2 years later?
The purpose of hardware-accelerated GPU scheduling is not to improve performance, latency, or even jitter in games. The main purpose is for when you have multiple applications using the GPU at the same time: think of multiple viewports in Blender, or a game editor, a web browser, and a game all running at once. Similarly, it might help in virtualization scenarios. In such situations the improvements might be visible. It might also help in some scenarios within a single application where many threads submit work to the GPU at the same time, but in reality such scenarios are already pretty well optimized on the game side (close to optimal, or as close as known algorithms allow), so it will rarely help. But hardware scheduling does have the advantage of making new things possible in the future in a more efficient way, so that remains to be seen.
Nah, it noticeably decreases latency.
This is literally contradictory to what Microsoft tells you.
What about testing with an AMD GPU? Since it is about offloading to the hardware, it seems quite relevant to test with both manufacturers.
Also would be nice to see it tested on a lower-tier card rather than a $1200 GPU.
Steve is paid by Nvidia
@@oxfordsparky How would that change anything? The results would be even less noticeable there. This offloads a process the CPU used to do onto a dedicated chip on the GPU. Putting a bottleneck on the GPU side would reveal nothing.
@@WillisPtheone Actually it would still show similar results. If the bottlenecked GPU can now manage its own memory, it may improve its performance and lessen the bottleneck.
Testing on a super high-end GPU that isn't memory-bottlenecked would show less of a change.
In fact, users with AMD CPUs should see a noticeable improvement. Steve should've tested with a 3100 and a 3900x
2:42 love that desktop background!
yeah I like that wallpaper too
Wonder where it's from
Looks like from Among Trees game
This has been my background too for some time. Even Bitwit uses it.
Its an animated wallpaper through wallpaper engine, here's the steam workshop link:
steamcommunity.com/sharedfiles/filedetails/?id=1604225602
Is the 20:04 length of the video a coincidence? 👁️👁️
Yes
Its 2:04 am as I'm reading this
vaids nomiss i can tell bc ur grammar 😂 u must be tired
@@suniro lil guy needs a rest
Yobama nice name
Just feedback from my experience. Old CPU (i7 4770K) and GTX 1080. I got 40-60 FPS no matter the graphics settings in Squad. Turned hardware GPU scheduling ON and got a 15-20 FPS boost, plus a boost to the lows. It rarely dips to 55 FPS now. The boost is HUGE on my old CPU. Try it if you're in my situation.
Me too. Running an i5-6600K paired with GTX 1070...
Haven't tested yet, but of course, we should see good results.
Is it better now? did they improve it?
10:30 finally, FINALLY, a reviewer that gets this. you guys earned my sub for this. maybe I find something in your shop as well.
big ups!
I thought everyone realized frametimes were important 7-9y ago. I guess some people live under a rock or aren't old enough to have seen that.
@@rickross4337 Get out much? No wait, stay inside.
3950X / 2080 Super. Every game I've tested has gained a couple frames in the same areas, and has better 1% and 0.1% lows. Also, GPU usage actually reaches 100 percent now; before, it never EVER did.
Cichlid_Visuals Are you running any VR software? If so what platform and what resolution etc. :)
@@theegg-viator4707 no vr software im afraid, i actually have never tried it!
For whatever reason, AMD CPUs benefits more from this feature
I think you should've been more specific. Which games, what's your pc build (cpu, etc.)?
@@user-zu1ix3yq2w ah man, I can't remember the exact games anymore, but I mostly play old modded stuff lol. It's an X570 board with a 3950X, 32GB 3200MHz Samsung B-die with SUPER tight timings, and I've since upgraded to a 2080 Ti when the 30 series was announced and there were panic sales (snagged one new for 650 USD). HAGS improved performance notably and memorably for me in 2 games off the top of my head: heavily modded Skyrim SE (like 1k mods) and modded Stalker (Anomaly), although I can remember pretty much everything seeing an improvement except No Man's Sky. I don't know if these statements hold true anymore, as there have been many Windows builds and driver updates since that 1-year-old comment.
I have just had to turn this off because it made Desktop Window Manager put 100% load on my GPU while gaming and caused an unplayable experience: FPS drops and stuttering (RTX 3080). Where am I going wrong...
TLDR: turn it off/leave it off. For now.
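For anyone who'd rather script the toggle than click through Settings > System > Display > Graphics settings: that switch is backed by a registry value. A sketch of the .reg fragment; the key and the HwSchMode values (2 = on, 1 = off) are as documented for Windows 10 2004, but double-check on your build, and note a reboot is required:

```reg
Windows Registry Editor Version 5.00

; Hardware-accelerated GPU scheduling (HAGS)
; 2 = enabled, 1 = disabled; reboot to apply
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"HwSchMode"=dword:00000001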
You're the GOAT man, this fixed my CPU and GPU load. I have a 3700X running at 4.45 on all cores, but it was holding my GPU, a 2060 Super Founders Edition, back from performing at its max... anyways I turned this on and tested it myself; I saw like a 200% FPS boost in a game I play... Insane
What might be an interesting test:
Measure input latency in a few games (or settings combinations) with low, medium, and high framerates, specifically with buffering disabled. This is realistic because some people, especially in competitive games, want every latency advantage possible.
Then, enable this feature, and see if you can achieve better performance at the same latency, better latency at the same performance, etc.
If you can also use this with buffering, that's a candidate for equal latency with higher performance.
It's almost assuredly going to be a wash, but I'm sure some are interested to see (myself included)
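For anyone wanting to run that comparison themselves: a capture tool like PresentMon logs per-frame times to CSV (the `MsBetweenPresents` column name is PresentMon's; the helper below is otherwise a hypothetical sketch). It turns one log into average FPS and 1% low FPS so on/off runs can be compared:

```python
import csv
import io

def frametime_stats(csv_text, column="MsBetweenPresents"):
    """Average FPS and 1% low FPS from a frametime log (ms per frame)."""
    times = [float(row[column]) for row in csv.DictReader(io.StringIO(csv_text))]
    times.sort(reverse=True)                   # slowest frames first
    worst = times[:max(1, len(times) // 100)]  # the slowest 1% of frames
    avg_fps = 1000 * len(times) / sum(times)
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Synthetic example: 99 frames at 10 ms plus one 50 ms spike.
log = "MsBetweenPresents\n" + "10.0\n" * 99 + "50.0\n"
avg, low = frametime_stats(log)  # avg is ~96.2 FPS, 1% low is 20.0 FPS
```

Log identical runs with the feature on and off; as several comments here note, differences within run-to-run variance aren't meaningful.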
I noticed good improvements in high-FPS scenarios (400+ FPS); I jumped from 450 to 500 in Black Desert Online after enabling it.
utterly useless.
Artcore103 you can’t go “too high” with fps. The game feels better and for someone that spends thousands of hours in one single game, getting it to perform 3-5% more consistent is huge.
@@YTHandlesWereAMistake no one can feel 450 vs 500fps don't be absurd.
@@Artcore103 black desert online has animation speeds and skill speeds scaling off fps for whatever reason so in this particular case it unfortunately does matter. that's why you see people with 5000 dollar PCs run that game in the lowest potato graphics settings possible to squeeze out more FPS, it's just ridiculous.
@@crispycrisp134 ok well that's a special case due to weird coding.
My experience when enabling the feature:
1. Unigine Heaven benchmark: +10 FPS (220 to 230 new)
2. SOTTR benchmark: -20 FPS (120 to 100), with VERY weird frametime behaviour (please test for yourself to see): the CPU and GPU frametime graphs are EXACTLY the same!! It's an exact overlay of the graphs :)) Looks so strange... I have no explanation for it.
3. CoD Warzone/Multiplayer: +XX FPS (seems higher, with better lows, but I haven't benchmarked it with the feature disabled yet)
My setup: Ryzen 5 3600 with GTX 1080 Ti, 16GB RAM at 3600MHz
We have almost the same setup lmao, I have a Ryzen 5 3600, a GTX 1080, and 3600MHz RAM.
You should take a look at this again. I recently got a pretty good performance bump by enabling this setting while trying to resolve low GPU usage I was getting in RDR2 using Vulkan with HDR. I updated my motherboard BIOS and vBIOS to enable Resizable BAR on my 9900KF and 3090 build, but I noticed in RDR2 I was getting lower GPU usage, though only with Vulkan and HDR enabled. I went from 90-91% usage up to 97-98% by enabling this. I also tested the differences in the benchmark, of course: I went from around 73 FPS to 79 FPS, doing multiple runs to come to an average. I also tested GR Breakpoint's Vulkan implementation and saw a consistent 2 FPS average increase in the benchmark, from 94 up to 96 FPS.
Let me save you 20 mins: hardware-accelerated GPU scheduling (the new setting in Windows) won't give any noticeable performance improvements. They mostly came at the problem from the CPU side.
huh
not all heroes wear capes. thanks!
Dude what?
After enabling this feature I noticed that the frame tearing issues in my games with vsync off were solved. I tested Witcher 3 (vsync off), Apex Legends with triple buffering, and Shadow of the Tomb Raider (vsync off).
My rig: i5 9400F, RTX 2060, 2x8 = 16GB RAM, B365 mobo, with a 76Hz IPS panel display.
Sorry, the tearing wasn't gone, but you can hardly notice it. Especially in Apex, I was having a huge issue when the FPS wasn't capped at 144. I used to get major tearing while FPS fluctuated between 150-220, but after enabling GPU Scheduling I can't notice any tearing (in Apex Legends at 150-220 FPS).
Hey man, I have the exact same setup as yours: RTX 2060, i5 9400F, and (2x8) 16GB RAM. I was wondering if you play Warzone, since whenever I play I get these stutters and FPS drops every once in a while. I run everything on high and I have an avg FPS of 130, but I can't seem to figure out the problem.
Thanks for the tests! I can imagine that this feature will be improved and become a standard setting in the near future.
I'll leave it off until there is a known difference in performance, knowing how Microsoft is lately, they'll push an update that breaks performance or games at some point.
You come in clutch yet again, nicely done lads and lasses!
10% difference on the 1% lows for GTA V using a quad-core CPU looks interesting, and definitely more than negligible.
I come for the information but I stay for the wonderful shots of CPUs and GPUs. Props to the guy behind the camera!
18:00 So is there any significant difference in performance using hardware accelerated GPU scheduling with frame-buffering disabled?
Hardware-accelerated GPU scheduling negates the need for frame buffering, so no, you will see no change in most cases. Watch the video again and pay attention to the words he is saying this time. The improvements you would see from disabling frame buffering ARE THE SAME as you will get from hardware GPU scheduling. Think of turning your PC off with the power button vs. yanking the power cord: different methods for the same result. Pressing the power button after yanking the cord would be pointless.
This is a good question. I know that disabling future frame rendering in Battlefield 5 in DX11 will reduce frame rate, because the GPU and CPU have to pause while they wait on each other. Maybe GPU scheduling will improve frame times slightly in this scenario?
@@WillisPtheone Hm, if so, Radeon has had it since the Navi release, cuz Anti-Lag is basically it.
@@WillisPtheone It's so hard to understand what this guy is saying in the video.
And I have no problem with other YouTubers. So I also didn't clearly understand what he was saying, the whole time...
What is the animated wallpaper you had at 2:40 please?
its this: steamcommunity.com/sharedfiles/filedetails/?id=1604225602
the app for using animated wallpapers is called Wallpaper Engine on Steam. very cheap app
I feel like an advertising guy, answering every comment asking what the wallpaper is lol
Random question but.. where can I get that live wallpaper with the shooting star? Is this also a program or app on windows?
I believe that's using Wallpaper Engine which is available on Steam. Once you have it you can search for Firewatch wallpapers.
Just saw that @AbelSlayer linked to the specific Firewatch wallpaper that was used in another comment. steamcommunity.com/sharedfiles/filedetails/?id=1604225602
I think the best use case for graphics scheduling can be observed, when there are multiple programs trying to execute loads on the gpu, e.g. streaming with obs and gaming, hardware accelerated video playback and gaming, etc
Maybe something to look into ;)
Actually, people are reporting problems with multiple programs. I personally have problems with GPU scheduling when running a game with RTX Voice; others with OBS and a game running. I think it forcefully gives the active window all the GPU resources, or something like that...
Not breaking everything when you enable it is probably a good result :D
Was waiting for this. Thanks.
Your hair is majestic af. You should do hair products commercials
Good stuff GN. Would still love to see you benchmark anti-seize (the automotive product) as thermal compound with your testing setup. I've seen people claim for a long time that both the silver and copper versions of anti-seize work really well as thermal paste.
You need multiple processes competing for GPU resources in order for changes affecting scheduling of multiple processes to make any difference. Testing with one process should reveal no change. As has been stated below, an example of multiple processes utilizing the GPU would be streaming a game using GPU video encoding. I would suggest you run your tests while streaming and recording with GPU encoding at the same time. That might show an improvement, as it is a situation where there is actually some scheduling going on. Simply running a game by itself is practically no scheduling load. This is akin to benchmarking a six-core CPU versus an eight-core CPU at the same clock speed with a single-threaded workload: of course they'll give results within the margin of error. You need to run multiple threads at the same time to reveal differences between six and eight cores. Test with one and two encoding jobs.
Clayton Macleod it seems that in those cases the effect is negative...
It also means that it is almost impossible to get comparable results, so the difference from run to run would be much bigger than turning the option on or off, even in the cases where it does not crash one program or another...
Finally the explanation of scheduling I've wanted. Great video, and subbed.
PS: what I've found is that with my GTX 1650 laptop this has certainly made a difference.
4C/4T w/1060 3GB would be very interesting to see in games with a lot of texture streaming.
This only works on low or mid tier cards. If you have 3000 series this will not do much.
Are the results the same for AMD cards?
Some Navi results would be nice.
Or even AMD CPUs
no but black screen lockups are
As soon as the driver dropped for AMD I've been using it on my 5700 XT. I've tested a few games: BL3, Destiny 2, and Dead by Daylight. As far as I can tell there wasn't much of a difference other than MAYBE a 2-4 FPS increase (mainly better frame stability and less stuttering). I haven't had any issues with it, no crashing, no game stability issues. I'm on a 2700X and 5700 XT.
@@bezzieog3827 Thanks, good to know. I'm kinda curious to check on my Sapphire Pulse 5700 XT.
Some media outlets have reported visible improvements with cards with low VRAM such as the gtx 1650. It makes sense since lower vram amount should require more management when it gets full.
My exact thoughts, video is completely useless if you are running a card with 11GB of VRAM...
It doesn't improve anything on a GTX 1650... It makes some games unstable (at least on notebooks).
Would be nice if you did a 1-year update, to see if any of the newer games take more advantage of this tech.
I can say, after enabling hardware-accelerated scheduling, that my VR experience (with most games that were overloading the CPU) has improved. Not all games though: it causes stuttering in Half-Life: Alyx, but improved others.
Everyone keeps talking frame rates and latency, but I'm wondering if this could eventually help with frame pacing and some kinds of stutter (which tbh is more important to me). I hear GPUs getting more control over draw calls and wonder, but that could be wishful thinking.
thanks a lot . I was waiting for this
Can you guys make an update on this topic, Gamers Nexus team, pls?
Nice! Thank you I was wondering about this once I heard about it and didn't find much 👍
It would be interesting to see how this feature compares, input-latency-wise, to enabling low latency mode (aka the old max pre-rendered frames = 1 setting) in the GPU driver.
I'd guess the negative scaling in some of the games comes from the fact that with HW scheduling, batching is inherently disabled.
Typically, enabling low latency mode in the driver has the same slightly negative effect on FPS, but the input latency improvement is huge, to the point that I run pretty much any game with it enabled except maybe turn-based games. (Running a 6700K + GTX 1080 btw.)
Don't mind me, just dropping my specs in the comments: GTX 1080, Ryzen 3600, and 3600MHz RAM.
Battle(None)Sense did a video on low latency mode and recommended not using it.
@@BromTeque I checked out the video, I now remember that I saw it at the time.
However, I mostly play AAA games at High/Ultra settings where I can barely hit 60 FPS and sometimes dip to 50. I also cap the framerate at 60 in these scenarios for more consistent latency. I agree with Battle(non)sense's findings and have felt it myself (as anecdotal evidence) that input lag is reduced while the GPU is not maxed out.
But I feel like, to be 100% sure I stay below the FPS limit, I'd need to give the GPU more headroom, like 20-25%, because graphics-intensive games often have inconsistent performance. That would mean lowering the settings, which I am unwilling to do. Basically, I believe his solution applies to esports games.
@@ecvent0r Yeah, I rewatched the video just now to check if I remembered correctly. You'll have to assess the situation yourself. Low latency mode might be right for you, as you have assessed.
I'm a fps/input latency snob and I'll gladly trade graphics quality for more fps/better input lag.
In my experience with Mount & Blade II: Bannerlord, Low Latency Mode considerably reduces the input lag issue during big FPS drops (but it's still there). GPU Scheduling seems to have removed the problem completely: no matter how much the FPS drops (and it seems to drop just as much), I don't "feel it" in the input at all. I've never had such a smooth experience with that game in big battles.
Been waiting for this video
In Elite: Dangerous turning on HAGS literally HALVES my VRAM usage. Off I regularly see 9500 MB or more usage, but with HAGS on it's typically 4500 or less.
Does your ram usage increase?
@@SumitKumar-ce7ov I don't know. That one's a bit harder to check since I usually have Firefox open in the background.
Probably allocation now being more accurate & not actual VRAM usage since that'd be really weird.
Less VRAM usage doesn't always mean better. The more of the game's data is in video memory, the faster the PC can load and render the game world.
@@MichaelReznoR Games would load that data no matter what. Assuming the actual VRAM usage has been reported right in both cases - that would most likely suggest less redundant structures and less buffering necessary, which is still a decent improvement.
I had to turn hardware-accelerated GPU scheduling off, as it was lagging the stream I was watching on my second monitor whenever I started playing a game.
I had a similar experience while watching video or YouTube. Even the Steam overlay had this lag, and I noticed my desktop felt laggy when Windows started and I moved my mouse around and clicked on things.
Same
Yeah, same here. Couldn't watch vids while gaming with this crap on. Turned it off since I didn't notice a difference while gaming with it on or off.
Is your second monitor a different resolution/refresh rate?
Just want to add my findings with a Ryzen 9 3900X and 1080 Ti, tested by playing COD Warzone while livestreaming to Facebook using Streamlabs OBS.
Enabling the feature improved minimum FPS by about 10 and reduced CPU usage by 10% on average; however, for some reason the livestream was definitely not as smooth with the feature on vs. off, despite showing 60 FPS in the live stats in both cases.
VRAM usage also seems reduced by 1GB (10GB vs. 11GB) with it on. GPU usage was 99% in both cases, with CPU usage averaging around 40-50% with it off and 30-40% with it on. This led to cooler temps by about 3-5 degrees as well.
Average FPS was not hugely impacted in either case, hovering in the 80s with all settings maxed out except ray tracing, at 3440x1440 resolution. Minimum FPS improved quite a bit, from around 62 FPS with it off to about 74 FPS with it on. Tbh though, on a G-Sync display, which is what I am using, there is no distinguishable difference between on and off while gaming.
Where you will notice a difference is streaming. For some reason the stream seems to run at around 45 FPS when it's on instead of the usual 60, without changing any settings in Streamlabs. This could be a bug, as the feature is new and Streamlabs might need an update to iron it out. As it stands right now, if you are planning to stream, you might need to play with the settings a bit to get a stable 60 FPS; otherwise, just leave the feature off. If you are not streaming and just gaming, you might find some benefit to enabling it, if only for the lower CPU usage, temps, and better minimum framerate, though I suspect the differences will be even more negligible when not streaming, as CPU usage hovers around the single digits to low 10s anyway.
When I enabled this it gave me constant micro-stutters in WoW Classic; the problem resolved immediately once I turned it off. My system is an 8700K, 2080 Super, 16GB 3000MHz Team Group RAM, and a Samsung 970 Evo 1TB boot drive.
6700k, 16gb ddr4, 1070.
Same...
Knowing your specs are above mine kills the thought that it was all in my head.
You should overclock that ram or buy some 3600mhz at least. 3000 seems kinda slow since you have such a good GPU, and a solid CPU.
Been waiting for this video :)
This for some reason made my CPU OC very unstable and I started bluescreening in games. My OC was 4.4 on a Ryzen 7 3700X, which I'd had no problems with, and when I turned this off everything went back to normal.
Thanks GN. I just found out about this.
I'm keeping this feature off, the difference is not enough to keep it on.
1 Year Update: It has significantly reduced the power draw of my i9-10900K. I immediately noticed my CPU temp is much, much more consistent. Using a Kraken X53 280mm AIO with Arctic MX-4 paste and running 5.1GHz on all cores, my temp varies between 30 and 31 degrees Celsius now. Previously it would hover between 34 and 39.
what's your GPU?
@@alexandresaoghal1761 RTX 2080 Super
There's noise coming from some in the DCS World flight sim community that scheduling on 2004 is raising the minimum FPS in DCS for VR users. DCS basically runs an entire complex simulation on a single thread. Needless to say, that puts a high burden on one CPU core, meaning high core clocks are needed to compensate. The single core struggles to push data to the GPU, and for VR users this results in poor performance that relies entirely on the VR driver's motion smoothing to be even baseline playable. I haven't been able to test it myself because MS still hasn't made 2004 available for download; I'd prefer to do an update install. DCS World in theory might benefit from this feature, because even a little boost to minimum FPS would help.
So I would guess that the difference would come into play when streaming or multitasking?
THANK YOU FOR THIS VIDEO. VERY DETAILED :D
When I enabled, my idle GPU temp went up 5 degree so I turned it off
Is it running hotter because it's not being bottlenecked by CPU scheduling, so it runs faster?
@Oblivion Isn't amazing that three people in a row missed the IDLE part of your statement? :P
@@The_Noticer. 5 degrees hotter at idle is 31 to 36. That's a couple watts that it's using to manage the memory
@Shekelstein leaving the steam storepage or minimizing it shouldn't cause a draw of that many watts lol.... regardless if it's the GPU or CPU.
@@The_Noticer. I opened up the Steam store and left it there for a bit; net temperature was -1. It dropped one degree.
What did you use for the animated desktop wallpaper? :)
here bruh: steamcommunity.com/sharedfiles/filedetails/?id=1604225602
app is Wallpaper Engine on Steam
I would be curious to know if there is any impact on stability, as my 3700X and RTX 2080 Ti combo tends to crash in DirectX 12 games.
Have you tried streaming with OBS at the same time, giving it more than one job to do while playing a game?
"i3-10100 to create a CPU bottleneck on a modern system"
Me with an i3-2100 that I thought was still decently modern: 👁️ 👄 👁️
@gamers nexus I installed 2004 today and was doing some benchmarks, comparing before and after: 1903 / 2004 / 2004 with hardware-accelerated GPU scheduling. Framerates were identical. BUT while running the benchmarks I was monitoring CPU load, wattage, etc. With hardware-accelerated GPU scheduling turned on, the CPU was at much lower utilization, wattage, and temp; it would even clock down, with no discernible loss of performance. Without it, it would stay pegged at max all-core boost. It doesn't seem like a big deal, but the game's benchmark mode has little to do with in-game performance; it's just something repeatable I can use to compare changes. While actually in game, the CPU is much more heavily loaded, and the fact that some CPU load was eliminated helped increase minimum frame rate and made the game feel significantly smoother. Completely unscientific results, I know, but I think it merits a closer look. 2070 8GB Max-P mobile, i7-9750H.
Any 2021 update on this?
This is what I have been waiting for!
My experience so far with my setup: i7 8700K @ 4.8 GHz, 16 GB DDR4, RTX 2080 Ti, on a 1080p 144 Hz G-Sync monitor, Windows 10 Pro 2004, latest Nvidia driver 451.48; all games are installed on a Samsung 860 Evo 1 TB SSD; 0 background programs, only Steam running.
All games are at 1080p 85 Hz ULMB and HAGS is ON.
FF15: highest possible in-game settings with Nvidia works; smooth, stutter-free, amazing result.
Batman Arkham Knight: highest settings, stutter-free, an almost nonstop 85 fps ULMB experience.
The Division 2: smooth as butter, no more stutters or fps spikes, a whole new level of happiness.
Deus Ex MD: highest settings except MSAA off, stutter-free and super smooth.
Call me crazy, but this HAGS thing seriously stopped every microstutter I ever had.
I'm glad about this and can't believe how smoothly games run now without stuttering.
Steve is an idiot for not testing this on older CPUs, especially since this helps the CPU more than the GPU.
But no, he really thinks everyone is using the latest i3 and pairing it with a 2080 Ti... lol
Some people like me buy PC components only once every few years to be future-proof. A CPU and GPU must handle games for at least 3 to 5 years at maximum settings in the FPS range you want. It's just insane and crazy to spend money every year on new builds.
I think 80 to 90% of the people who buy or build a gaming PC for themselves do it to enjoy it for more than 1 year before building a new one.
Yes, it sometimes pains me when people say you have an outdated PC if it's only 1 or 2 years old, hyped up by videos about the latest hardware some people can't even afford.
I love it when people understand that.
The video I was waiting for!
For me it causes gpu crashes
Good info to have. Thanks. What GPU?
@Gamers Nexus I have been getting crashes with my 1080 Ti.
@@GamersNexus same issue for me, and my GPU is an AMD RX 5700 XT.
@@GamersNexus I have the same issue; I had to roll back the Nvidia drivers two releases and turn on the Windows thingy. What happened is that as soon as my GPU got load on it, the screens turned black, but I could still hear sound. I have an i7 8700K with a GTX 1080 Ti Asus Strix.
@@GamersNexus I have an EVGA 1650S and an EVGA 2070S; the 1650S crashes first, then the 2070S seconds later... and yes, it's a weird GPU pair.
One question: what kind of background are you using around timecode 2:44? That is awesome, and I would like to have something like that on my desktop!
I was running with this enabled for a year or more. Then, out of the blue, my video card would just blank out to a black screen, forcing a hard restart. Turned this option off and no more crashes.
😕
I had random reboots at idle about every other day until I turned HAGS off, so it definitely isn't 100% stable. I'm leaving it off from now on until it's required, which will probably be a long time from now.
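For anyone toggling this setting repeatedly while troubleshooting crashes like the ones above: besides the Settings UI (Graphics settings → Hardware-accelerated GPU scheduling), the switch is backed by the `HwSchMode` registry value, so you can check or flip it directly. A minimal .reg sketch, assuming Windows 10 2004 or later with a supported GPU and driver; 2 = enabled, 1 = disabled, and a reboot is required for the change to take effect:

```
Windows Registry Editor Version 5.00

; Hardware-accelerated GPU scheduling (HAGS) switch
; dword:00000002 = enabled, dword:00000001 = disabled (reboot to apply)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"HwSchMode"=dword:00000002
```

Handy if the Settings toggle is missing or you want to script the on/off comparison between crash tests.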
Great video. Since they said this will get better in the future, I would love to see results today, 3 years later, with the top 10 multiplayer games.