I'm gonna try it because FedEx in Portugal destroyed my PC and Fnac sent it back with the insurance paid: a 4070 Ti TUF OC and a new case, but a broken mobo and other parts damaged in the accident. I've got single-channel 3800MHz CL17 RAM, so yes, I've gotta check my stats. Thank you.
Well, I guess, for the CS:GO test, 1ms GPU time and 1.32ms CPU time is not a match. 1 vs 1.32, it's a big difference. So, huge CPU bottleneck, right? The GPU can produce 1000 frames but the CPU only about 758, huuuuuuge bottleneck.
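The arithmetic behind that sarcasm is just inverting the per-frame times. A throwaway sketch (the values come from the comment above; the helper name is mine):

```python
# Converting per-frame times into the theoretical FPS each component
# could sustain on its own, to size up the gap the comment describes.
def fps_from_frame_time_ms(ms: float) -> float:
    return 1000.0 / ms

gpu_fps = fps_from_frame_time_ms(1.0)   # GPU Busy of 1 ms -> 1000 FPS
cpu_fps = fps_from_frame_time_ms(1.32)  # frame time of 1.32 ms -> ~758 FPS
print(round(gpu_fps), round(cpu_fps))   # prints: 1000 758
```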
I'm waiting for the AAA devs' tweets about it. This is gonna expose so much bad optimization 🤣 Oh, and Fabio :) "cache" is pronounced with an E, not an A: "kesh", not "kash" :)
I have a weird question. Do you do this sign 👌 because you want to, or do you not really think about it? 👀 (I ask because I know some really big famous people do it because they are told to.) ❤
What the hell is that? I use that sign sometimes when talking, to say some things are like "top". Been doing that since I was a child; it's a common thing here.
@@AncientGameplays that's why I asked my guy. Top tier actors and basically all huge content creators are striking deals with bad ppl to do the 666 symbol.
So Intel just got off the 32-bit L1 cache feed? Finally. 64-bit came out over 10 years ago; by Moore's law, 128-bit L1 should have come about over 5 years ago, and 32-bit should have gone the way of Windows 95. Further, by Moore's law, GPU memory buses should be at 1-2Kbit, as a 512-bit memory bus was soooooo 2008, back with the Radeon 4800 and Nvidia GTX 280/285 GPUs.
I tested this, and it shows GPU time, which is very good! But not CPU time. I've only seen COD Warzone show both GPU time and CPU time. If an app could show CPU time, that would be great! This one just shows frame time, not CPU time. If they add CPU time, it's gonna be the greatest benchmark app!
I have to say, my gaming friends, I have an i9-10900K and I still rock the frames in most top titles. Perhaps a wee bottleneck in my CPU, but overall, my performance with this 7900 XTX is fantastic! A big step up from my 3080 TUF.
Okay, so in this regard I have a strange issue, and I hope you see my comment and can help me with it. I have a Core i7-12700F paired with an RX 6700 XT and 32 gigs of memory. Every time I go into the BIOS (Gigabyte Z690 UD) and activate the Gaming profile, or enable Enhanced Multi-Core Performance, I see a drop in GPU usage in games, which results in much lower FPS. Also, when I enable either of these options in the BIOS, I notice that CPU usage inside games increases to 30%~40%, vs 3%~7% when they are off. So I am not sure what's going on here.
The low FPS doesn't result from the low usage; the low usage appears because of those gimmicks you're trying to use. Leave the CPU at stock and use the RAM's XMP profile, period.
The Gaming profile disables the E-cores. E-cores smooth things out in general, so I recommend keeping them on. I have the same CPU but a B660 Gigabyte mobo. The CPU is great on stock settings with XMP. If you're on DDR4, set the RAM command rate to 1T.
I know I have a CPU bottleneck; I had to downclock and undervolt my R5 3600 xD. It's because I don't know what the "safe voltage" is, and it was running at high temps. Right now it's 4GHz @ 1.144V. I've heard of these CPUs killing themselves because of the ~1.5V boost they do at stock.
Can Someone help me? I'm encountering an unusual problem while gaming. Despite disabling notifications, I'm hearing the notification sound randomly. Strangely, when this happens, if I press any movement key like W, A, S, or D and then remove my finger from the button, my character continues to move on its own. How can I troubleshoot and resolve this perplexing issue?
I don't get the use of this; it's effectively GPU utilization crammed inside a frametime graph. If you look at the GPU utilization graph and GPU Busy, they are basically parallel to each other. If it gets implemented in RTSS I might try it; for now, meh.
You're not really getting the point here, at all. The GPU Busy feature can effectively tell you whether stutters, frame drops and so on are caused by your GPU or not, something that GPU usage can't show at all. That's the point.
@@AncientGameplays well, at least with the current PresentMon configuration you can't see stutters in GPU Busy either, because it shows the **AVG value**, not real-time per frame. As of now, you can only see big GPU bottlenecks and dips, which gives you the exact same information as GPU utilization. I get what you are saying, but for now it does not show what you are suggesting; even in your own video there are literally no stutters shown, unlike in the RTSS default frametime graph, for example.
@12:36 is the part where he shows the different frame time graphs when the CPU is bottlenecking the GPU. IMHO he seems to be mumbling through most of the video, or I'm just too dumb to figure out what he's saying. But the 12:36 mark illustrates CPU bottlenecking really well.
What we need is a "bottlenecks for dummies" tool to identify where the weakness is in our systems; most of us either don't have the time to dig deep into finding out or aren't capable of doing so. This Intel performance software could do the trick if it breaks things down into easy-to-understand language, and that could be a real help for many in seeing what part of their PCs needs upgrading. Because, let's be honest, a lot of us are not that good at building a balanced system: many of us just throw in what we think is a good match of CPU, GPU and memory, or throw in higher-end parts, but we're not good at seeing what the right balance of hardware is. Such a tool could help reduce cost while also showing us what needs upgrading to get a bigger bang for the buck. This Intel performance software could do just that once it matures. On another note, Intel has released the source code as open source; that's good news, because AMD, Nvidia and Intel could integrate this into their drivers to monitor performance and give us tips on where the bottlenecks are and which hardware upgrades would help.
I just got a 12700K, 1080 Ti, 64GB of RAM and a Samsung 990 SSD, and my FPS in GTA V was around 150 at 1080p. I just installed a 4080 and now I'm at 100 FPS or less. Any ideas?
I have a 7800X3D, and when I use Full HD resolution with DLSS Ultra Performance, my GPU is not at max. When you see the GPU not working at max in GPU-demanding games, there is a CPU bottleneck.
@@AncientGameplays FH5 (and previously FH4 and FH3) are my primary overlay stability stress tests, which I use during RTSS development and testing. They never banned anyone for overlays; that's false info.
I have a 3080 and bought an XFX RX 6800 XT Merc 319. I have to tell you that AMD drivers are absolute dogshit, because I was having weird GPU downclocking issues, so I had terrible stuttering and FPS drops. I tried so many things to fix it but nothing worked. Finally I fixed it by manually setting the GPU core frequency to 2700MHz and the minimum frequency to 2600MHz, and boom, everything works flawlessly; I'm getting insane FPS, no FPS drops whatsoever. Some people may say "my AMD GPU doesn't have this problem", blah blah blah; just open the MSI overlay and watch your GPU core frequency, it will drop like a monkey jumping from tree to tree. Test this in many games.
You most likely have a CPU or RAM bottleneck. Also, clean install the GPU drivers and install the chipset drivers. That's not something that usually happens; you have something wrong there.
So this just shows you there IS a bottleneck, but not WHERE. It could be software-sided: the game engine, driver overhead, slow RAM, not just the CPU. How is this different from low GPU usage?
Yes, it is much different, because you can see that it is not GPU-sided. With GPU usage it is relative, as sometimes you're getting bottlenecked even at 99%. A good example was TLOU before Smart Access Memory.
This might be a stupid suggestion, but why don't gpu makers make a graphics card that has not only a gpu and vram on it but also a special "cpu" type chip that can handle everything in-game that the system's cpu handles now just more efficiently and effectively? That way the system's cpu wouldn't be burdened with the task of running certain aspects of a game, since that would be handled by the "cpu" on the graphics board, and can instead focus on system stability and processes.
7800X3D + 7900 XT here. I'm having an issue where it polls wattage and VRAM data from the integrated GPU (Radeon Graphics) instead of the 7900 XT. Going into settings, it only shows "default adapter" as the driver source.
Wow, a new thing and you don't get it. In CS the relative difference is huge and you are CPU limited, as the CPU can't generate any more frames. You should raise settings a bit and you'd see the difference lower, until the point where it rises again when using RT, for example.
I said that YES, we were CPU bottlenecked and that the difference was there; percentage-wise it is big. BUT there are always variations, even in non-CPU-bound scenarios. But well, thanks for the cockiness.
Intel is doing something great for the PC community, unlike AMD who is blocking game developers from adopting upscaling technologies that are not Flickery Shimmering Resolution. AMD is simply pathetic beside Intel. RIP AMD🎉
Just took a look at Intel PresentMon, and my frame time = GPU Busy (I mean, they are near the same value all the time). There is no gap at all. 10900K and RTX 3080 Ti, tested in Cyberpunk 2077 at 1080p Ultra settings and in Diablo 4. Reduced my overclock to 5.1/5.0GHz as there is zero increase in FPS in games; 5.2/5.1GHz is just for benchmarks now. Love this program.

In the video he is CPU limited in CS2; that's why he is getting less than 99% GPU load, around ~80%, which is clear CPU bottlenecking (expected at such high FPS, but even so). Even in Jedi he is seeing CPU bottlenecks for periods, and there can be lots of reasons for that. With me it's 99% GPU load, and frame time and GPU Busy are the same all the time, in the games I have tested. This is bad, I guess, because a faster card likely won't do me much good.

I would say the CPU is not paired with the correct GPU in this video, but it's still up to the task and fine. The GPU is a little strong for the build, but it's acceptable. A likely upgrade is a 7800X3D; it will likely push more frames in CS2 at lower resolutions. Even so, this is a don't-touch build if there is no hitching or stuttering. He's still getting most of the GPU's performance. It's balanced enough.
@@AncientGameplays I have a big question: my game stutters when I watch gameplay that I recorded in Street Fighter 6, and I don't know how to fix it. It does it with everything I've tried to use to record my gameplay; Street Fighter 6 is the main one having issues.
I get the stutters you showed in Jedi Survivor in all games, running my 7600 with a 7800 XT and 32GB of 6000 CL30 RAM. What is causing this exactly, and how can I fix it?
Now we can expose lazy devs that still use 4 threads on CPUs and make games stupidly CPU-bound, where the GPU sits at like 50% for no real reason (since we haven't gotten any major improvements in CPU-related game workloads).
@@AncientGameplays usually, yes. But there are games like the TW3 remake that used fewer threads than the original when ported to DX12 at launch; it's a mess. Same goes for Hogwarts Legacy and a lot of Unreal Engine games.
What I love about Fabio is that he can explain how a system's GPU is bottlenecked by the CPU: how the graphics card is sleeping, not pulling its weight, running at 70% or far less instead of 98% to 100%. But I'm guilty as sin of doing that with most of my retro AGP builds, which is why they are for retro gaming only, at lower FPS (60fps max in some games, on 60Hz to 85Hz monitors), because the graphics are the crispest they will ever get. Yet the cards run at 50% or way lower in some games, with the builds running ultra silent. That way the bottlenecked graphics card is in bed sleeping, just dreaming; games on easy street will last forever as the capacitors hold up and look after the silicon, keeping it as good as new, with no fans blasting and sucking in dust to clog the cards. That is what it's all about: purely ornamental builds that can still run PC games up to DirectX 10 to the max if need be. These AGP builds are the crown jewels, antiques that hold a premium and should be looked after with the utmost respect... But they're not high end like what Fabio is bench testing with live gaming. All his viewers know he is one of the best techs, with zero need for liquid nitrogen overclocking to find that sweet spot in his systems, since 99.9999% of gamers will never use liquid nitrogen cooling; that is only for perfectionists like GN's Steve and JayzTwoCents chasing world records with golden-sample silicon. Anyhow, hats off to Fabio, amazing as always, nothing but total respect.
Downloaded Intel PresentMon yesterday. I was trying to find a way to make the background transparent like MSI's one; guess not yet, but Intel's GPU Busy OSD is wholesome! Thx AG ❤
Great video! I just wanted to let you know that just because there is a spike in frame time but not a spike in the GPU Busy metric, it doesn't necessarily mean it's a RAM or CPU bottleneck; it could be a bunch of things, such as the game itself hanging, the hard drive pulling assets, or another program using the CPU for another purpose.
It is a good metric for seeing whether it's your GPU that's causing stutters (flushing VRAM or not having enough, transient power spikes, unstable OCs or UVs) or holding your framerate back, but unfortunately, beyond that, it doesn't tell you what exactly is causing the stutter, just that it wasn't your GPU. LatencyMon can tell you in most cases, I believe.
I've watched you for a while and I'm certain you know this information, because you're smart and well versed in computer rendering, but I just wanted to clarify, just in case, and for anyone in the comments who may have gotten the wrong idea about what the metric is actually recording.
Yes, exactly. I said that it could be some other things, even as I stated that the CPU (and consequently RAM, HDD) was loading assets as well 💪
Basically, we know when it's not the GPU, which is a good thing
@@AncientGameplays Oh whoops! I def missed that.
LatencyMon is good for finding bad programs/drivers causing stutters.
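The triage this thread describes (a frame-time spike without a matching GPU Busy spike points away from the GPU, but not at any specific culprit) can be sketched as a quick log filter. This is a hypothetical helper, not anything PresentMon ships:

```python
# Flag frames whose frame time spikes while GPU Busy stays flat: those
# stutters came from somewhere other than the GPU (game hitch, asset
# streaming, another app hogging the CPU, etc.).
def non_gpu_spikes(frame_times_ms, gpu_busy_ms, spike_factor=2.0):
    """Indices of frames with a frame-time spike but no GPU Busy spike."""
    avg_ft = sum(frame_times_ms) / len(frame_times_ms)
    avg_gb = sum(gpu_busy_ms) / len(gpu_busy_ms)
    return [
        i
        for i, (ft, gb) in enumerate(zip(frame_times_ms, gpu_busy_ms))
        if ft > spike_factor * avg_ft and gb <= spike_factor * avg_gb
    ]
```

For example, `non_gpu_spikes([10, 10, 10, 40, 10], [8, 8, 8, 9, 8])` flags the fourth frame: its frame time quadrupled while the GPU's share barely moved.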
Two videos in a row today, DAMN son xD
You're on fire!!!
3 videos in a row when? :)
have to admit shes pretty good at it
Draw Rate is basically the FPS of the overlay; it is limited by the Sampling Period (in Settings > Data Processing). So by default, a 10 FPS draw rate is the max with the 100ms sampling period (how often it gets the data): 1 sec = 1000ms, so 10 frames per second = one frame every 100ms. If you want the graph smoother, you increase both.
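As a quick sanity check of the numbers above (a throwaway sketch; the helper name is mine, not from PresentMon):

```python
# The overlay can only redraw when a new sample arrives, so the sampling
# period puts a hard ceiling on the draw rate.
def max_draw_rate_fps(sampling_period_ms: float) -> float:
    return 1000.0 / sampling_period_ms

print(max_draw_rate_fps(100))  # default 100 ms period -> 10.0 FPS
print(max_draw_rate_fps(50))   # halving the period doubles it -> 20.0 FPS
```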
Thanks for the heads-up!
GPU: help, I'm being bottlenecked!
CPU: shut up, I'm busy. 😆😆😆
hahahah
Hey new AMD Chipset drivers dropped, I've been checking your channel for updates
I downloaded this when the Gamers Nexus video came out but hadn't had an opportunity to try it out. 7900 XTX and 7800X3D
lay an eye on this then :D
@@AncientGameplays So I was using it last night and it's pretty cool although a lot of the custom things don't work on my AMD card.
@@TyGamer125 what custom things?
@@AncientGameplays The preset section shown at 3:23 where there's basic, gpu focus, and then custom + the edit button. Also just figured out you can customize the default presets.
Yes, I made some of them myself as well @@TyGamer125
Thanks for doing this software overview! This seems like a highly useful app so I just downloaded it.
💪💪💪
Huge thanks for this incredible tip !!
God Bless you !!
Thank you as well!
It would be nice to see how much frame time is lost on a 5600X system in not-very-well-optimized games (or main-thread-dependent ones), but also to show how good it really is at higher resolutions, where not much time is lost to the game driver, DDR5 latency, etc... Frame time - GPU Busy time = rest-of-system time. We may find DDR4 and the 5800X3D (or the 32MB cache of the 5600X) not wasting time, thanks to latency being so good versus much more expensive "newer" parts; or use it to validate which GPU upgrades are good for keeping old system parts, or which games just lose too much time and available resources... 6% load on a 13900K is a too-often-seen scenario...
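The "Frame time - GPU Busy = rest of system" idea is easy to script against a PresentMon capture log. A rough sketch in Python; the CSV column names used here ("msBetweenPresents" for frame time, "msGPUActive" for GPU Busy) are assumptions that vary between PresentMon versions, so check your own capture file's header:

```python
import csv

def rest_of_system_ms(csv_path: str) -> list[float]:
    """Per-frame time spent outside the GPU: game logic, driver,
    RAM/storage latency, etc."""
    out = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            frame_time = float(row["msBetweenPresents"])  # full frame time
            gpu_busy = float(row["msGPUActive"])          # GPU Busy slice
            out.append(frame_time - gpu_busy)
    return out
```

A consistently large rest-of-system share would show exactly the kind of loss the comment wants to compare across CPU and RAM configurations.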
So if I'm seeing Frame time of 60.3ms and GPU Busy of 20.1ms, what does that tell me?
Can someone tell me why the app can't read my CPU temperature? HWiNFO64 and Afterburner can
It's still in beta; it's mostly for the GPU for now, it will get better
To my surprise, the newer chipset driver did drop the performance, so I'm gonna go back to the previous one
I didn't really run many tests, but it seems okay here
Installed it, but I can't get it as an overlay; I only have it outside of my games lol!
Maybe it's conflicting with Adrenalin (I'm on a 7900 XTX)?
Choose the app manually then
@@AncientGameplays I launch the app and set the hotkey, but in-game I get no overlay; it's like another window (and I didn't select it in a window)
My old ass body is the bottleneck 😂
Hahaha
Wow, hell froze over! Intel is doing something right, and for free 😮.
Exactly haha
Hey, unrelated, but I was trying to static overclock my 7700X to 5.4GHz. I set the voltage to 1.205V, and when I tried to run Prime95 the computer instantly rebooted. Is 1.205V too low?! I know you run at 1.18V and you're fine, and I know every system is different. Could you maybe help me?
Yeap, your CPU is just not that great. Try 5.3GHz
@@AncientGameplays now I'm just using PBO, but I noticed that in the AMD software the CPU had an "overclock CPU" preset. Could that have messed up the voltages I put in the BIOS?
With Lossless Scaling around, I don't mind a bottleneck. Saves me tons of power.
That makes no sense as you can simply lock frames
The very first example he showed was absolutely a CPU bottleneck... the GPU couldn't be fully utilized because the CPU couldn't prepare frames fast enough to keep it fed.
That's exactly what a bottleneck is... a CPU bottlenecking the GPU... you're "special", aren't you?
It's a really good tool and I like it, but it doesn't have the customizable aspect of RTSS (RivaTuner Statistics Server) yet.
So if someone knows how to import the GPU Busy data and add it to RTSS, that would be amazing for me. Thanks.
What is that frame rate graph you used in your testing?
The msi afterburner one?
@@AncientGameplays the FPS graph you showed in Jedi Survivor
@@Ghostlynotme445 dude, I am literally making a video about it...
They still haven’t fixed Star Wars. EA sucks
Yup
I think Jedi Survivor is a bad choice because it's still broken. Try Apex Legends, Call of Duty or some well-optimised story game
🤦 No, Intel did not solve the WHOLE CPU bottleneck dilemma.
Even a semi blind guy can see this shit now,lol
There is always a bottleneck; there is no way around it. The key is to build a balanced system.
Not the point, you usually want the GPU to be the limiting factor
The 7950X in GN testing was 20 FPS behind a 13900K, so there's a bottleneck; then the 13900K falls behind the 7800X3D by 20 FPS in some games, just like the 7950X in this testing. Same deal with the 5800X3D. I recently tested the 6900 XT on an FX system at 4K; the difference in some games was 10 FPS, in others 36, but that's a 50% difference. In this case it was vs the 5900X, and a 5600X3D will be going into another system, so more testing will be done there as well, since it is replacing a 2600X. I will test all resolutions, of course. Turn the FPS counter off and enjoy the game; if the game runs like crap, then troubleshoot and upgrade if needed. The word "bottleneck" is marketing at its best.
There is always a bottleneck, whether in the monitor, the drive system, the CPU or the GPU; there is no way to get rid of them. That's basic hardware knowledge. You know I push numbers, but did you see me upgrade this gen? No. Why pay 3k for an upgrade worth 30-ish FPS? Yeah, that's marketing.
FX vs the 4790K was about 15% in games; it would depend, sometimes a 10 FPS difference, sometimes 30. You would not notice the difference between them, because PC gaming is now like car racing: all marketing. I don't agree with this idea that you have to have the 14900K or the "8950X3D" and the "5090" to enjoy gaming, and that if you have a whatever CPU, you need a 4090 now, for xxx gains at xxxx cost.
Since this is open source, MSI Afterburner should include this in their graphs.
I love this. I usually understand bottlenecks, as I know my system and know a ton about games and how they work, but this is actually amazing and makes it much easier for people to find out what the bottleneck is. Heck, I will use it just to confirm whether I am correct or not. Also, allowing it on a second monitor is great; it will let me just slap it on my portrait monitor and keep an eye on things. Also... OMG... Bruccius's "thank you kind sir", ahahahaha, I have not seen that in a while now.
haha, You're welcome kind sir
Just gonna make sure to double-check, given that it's still a work in progress!
I assumed this was only for Intel Arc cards. It's cool that everyone can use it
Indeed
Which games are good for testing?
5900x and rx 7900 xtx.
Getting double the frame time vs GPU Busy in Valheim and Dishonored 1; BattleBit seems more stable. Could this also be the game engine? The CPU doesn't seem to go up in usage, so I don't understand why there is such a big difference.
Yeah, that's a classic and BIG CPU bottleneck you've got there, man. The usage doesn't mean anything at all; that CPU has low IPC and is most likely paired with a low-end RAM kit.
watch this: ua-cam.com/video/hAVlzEW8qgM/v-deo.html
@@AncientGameplays Thank you! Yeah, got 64GB at 2880MHz, because I couldn't get the frequencies higher.
@@MusicForHourss that's it then
@@AncientGameplays Any tips on which CPU and RAM to go for?
I have had memory frequency issues with every AMD CPU I have bought recently: the 3900X and the 5900X..
@@MusicForHourss get the 5800X3D and some 3600MHz RAM and the difference will be huge
If Intel really wants to impress the penguin crowd (penguinistas), they will port this to Linux, which desperately needs a GPU monitoring solution. There is GreenWithEnvy for Nvidia and CoreCtrl for AMD, but neither utility allows for real-time monitoring of CPU and GPU stats.
Mangohud?
@@XxXTMillzXxX MangoHud is great IF you can install it, configure it and modify your Steam launch options to support it. Not exactly a plug-and-play option like this, is it?
Just installed the Intel PresentMon Beta v0.5 and configured it as per your video in 2 minutes!
I am happy to report my RX6800 and 5800X are "Balanced" in DCS World Multithread. Very Nice!
Nice indeed!
Intel software works on AMD?
@@LexxDesign3D they have similar instruction sets, so it works for both
haven't you watched the video? @@LexxDesign3D
@@AncientGameplays I considered not watching it until I read this comment, since my system is AMD; I was assuming this Intel software was for Intel CPUs
GPU Bussy lmao.
Very good video man appreciate this info, didn't even know this app existed, thanks!!!
Also, AMD WHERE IS FSR3.
Thanks as well! FSR3 is supposed to be announced on the 25th
Can we expect a video on the new AMD chipset driver 5.08 ??
There's really not much to show there
@@AncientGameplays I totally understand. You're always on top of the new drivers, so I was just wondering.
Does anyone know why after I downloaded presentmon, I had high latency and audio issues that went away as soon as i deleted it?
Maybe a bug? It's still in beta
Maaan, Jedi Survivor is still a broken, stuttering mess, even on a high-end system. I will probably get it in 2 years, when it will basically be free lol
The biggest problem that causes stutter is Windows 😂
Noob "GPU busy" graph user vs Chad "I just know my CPU suck" enjoyer
In the case of Jedi survivor i think it is safe to say the game is the bottleneck not the pc 😂
It's a good example though haha
Hi! Could you please make a video about the new AMD Chipset driver 5.08.02.027 and its differences vs the oldest driver? Thanx!!!
They're fine; most differences are for X3D CPUs
This is a cool and everything, but all you really need to look at is GPU usage. If it is 98% or higher, you are GPU limited. Anything below means you are either CPU limited or limited by the game engine.
exactly, but that's not the point. Look at the Counter Strike 2 benchmark: even though the CPU was bottlenecking, the GPU and CPU had around the same times, meaning there was nothing loading in the background, while it was different with Jedi: Survivor, which is interesting to see
Even single-player games use some sort of connection to a server, but things become more interesting when you play multiplayer online competitive games. Even if the game itself didn't depend on data from the other side of the Earth, you still want your own input to draw the game scene.
Playing a movie is an almost optimal state of things, but you know there are stutters even there.
I have quite a responsive connection, 15 ms typically in my DX9 game, 30 ms in another DX11 game.
If your PC doesn't wait for incoming data, the image becomes disconnected from reality; if it waits, you already have at least 30 ms delays. Idk how devs solved the lack of continuous positioning and other data during compute; it looks like they use either the last known position or an approximation given the last position, speed and heading.
Anyway, I think we are doomed until we become psychically and physically fully connected all together with no lag.
I think I'd love to have some lag 😅.
You always have a bottleneck, else FPS would never stop rising. The first bottleneck (CPU / GPU / bus speed) will be the one that determines your FPS. The ideal condition is that the GPU is at 100% utilization; then you know your system is not limiting it from reaching its full potential.
Exactly
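The rule above (the first bottleneck determines your FPS) can be sketched with a tiny bit of made-up math; all numbers here are hypothetical, just to illustrate the idea:

```python
# Sketch: the slowest pipeline stage caps the frame rate. Numbers are hypothetical.
def fps_limit(cpu_ms: float, gpu_ms: float) -> float:
    """FPS implied by the slower of the CPU and GPU per-frame times."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical: GPU needs 4 ms per frame, CPU needs 10 ms per frame.
print(fps_limit(cpu_ms=10.0, gpu_ms=4.0))  # 100.0 -> CPU-limited to 100 FPS
```

The GPU could deliver 250 FPS here, but the CPU only feeds it frames every 10 ms, so 100 FPS is the ceiling.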
I just tried it and it doesn't show my CPU's temperature... and I have a 13600K lol
It's still in beta, it will get there
Developers: clearly you all have weak CPUs, nothing wrong with our optimization.
It's a mix in most scenarios xD
Great. Time to dust off the 486DX2-66.
❤
I'm sorry, but how is this monitoring utility solving a bottleneck?? BTW thx for the review!
Didn't say it solved the bottleneck; I said it solved the dilemma, since now you can literally watch it on graphs
Glasses 😳. I mean, it would be very interesting to see this on a 7800X3D with an RX 7900 XTX, also with ray tracing
I'm gonna try it cause FedEx in Portugal destroyed my PC, and Fnac sent my PC back with the insurance paid: a 4070 Ti TUF OC and a new case, but a broken mobo and other parts damaged in the accident. I got single channel 3800 MHz CL17, so yes, I gotta check my stats. Thank you.
well, I guess for the CS:GO test 1 ms GPU time and 1.32 ms CPU time is not a match. 1 vs 1.32 is a big difference. So, huge CPU bottleneck, right? The GPU can produce 1000 frames but the CPU only ~750, huuuuuuge bottleneck.
It will never be a perfect match though, not even with a non bottleneck scenario
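To put rough numbers on the comment above (illustrative arithmetic only, using the 1 ms / 1.32 ms figures quoted):

```python
# Illustrative arithmetic: converting frame times (ms) to theoretical FPS.
gpu_busy_ms = 1.00   # GPU alone could finish a frame in 1 ms -> 1000 FPS
frame_ms = 1.32      # the whole frame takes 1.32 ms -> ~758 FPS

gpu_fps = 1000 / gpu_busy_ms
actual_fps = 1000 / frame_ms
overhead_pct = (frame_ms - gpu_busy_ms) / gpu_busy_ms * 100  # non-GPU time vs GPU time

print(round(gpu_fps), round(actual_fps), round(overhead_pct))  # 1000 758 32
```

So a 0.32 ms gap is ~32% extra non-GPU time per frame, which only matters this much because the frames are so short to begin with.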
I just use one of those 3.5" system monitors from ali express. Works great 👌
Just to have things showing, interesting!
A 3.5 inch monitor??
I'm waiting for AAA devs to tweet about it.
This is gonna expose so much bad optimization
🤣
Oh and Fabio :)
Cache is pronounced with an E not an A
Kesh not Kash :)
I like Kash more hahaha
@@AncientGameplays English teacher OCD 😂
I have a weird question. Do you do this sign 👌 because you want too or that you don't really think about it? 👀 ( question is cause I know some really big famous ppl do it because they are being told.) ❤
what the hell is that? I use that sometimes when talking to say some things are like "top". Been doing that since I was a child, it's a common thing here
@@AncientGameplays that's why I asked my guy. Top tier actors and basically all huge content creators are striking deals with bad ppl to do the 666 symbol.
you have 2 top GPUs 7900 XT and 7900 XTX? Damn!! :P
XT was sent for testing only
So when are you going to get yourself a 7800X3D, sir?
Never. 8000 series most likely
so Intel just got off the 32-bit L1 cache feed? Finally. 64-bit came out over 10 years ago; Moore's law dictates 128-bit L1 should have come about over 5 years ago, and 32-bit should have gone the way of Windows 95.
Further, by Moore's law GPU memory buses should be at 1-2k bits, as a 512-bit memory bus was soooooo 2008 with the Radeon 6800 and Nvidia GTX 290/5 GPUs
What's the difference from MSI Afterburner?
Afterburner is much more complete as explained
I see, thank you
Don't you want a GPU bottleneck, not a CPU bottleneck? Just asking.
you want to NOT be limited by the CPU and RAM, yes that's the point usually
0:09 This happens right now in Starfield. 180w on a 3060 Ti, 99% GPU usage - other games 240w.
I tested this and it just shows GPU time, which is very good, but not CPU time. I've only seen COD Warzone show both GPU time and CPU time in-game. This just shows the frame time, not the CPU time! If they add CPU time it's gonna be the greatest benchmark app!
CPU time = Frame time...
but in Call of Duty Warzone the value is not the same as the in-game CPU time @@AncientGameplays
every other channel: it's good software, yeah
ancient gameplays: even talks about scaling slider. :Gigachad:
Haha
I have to say my gaming friends, I have an I9-10900k and I still rock the frames on most top Titles. Perhaps a wee bottleneck in my CPU but overall, my performance with this 7900XTX is fantastic! A big step-up from my 3080 TUF
Is this software free, and where do we download it? How do we get LatencyMon?
Free yes. Google my friend
Okay so in this regard I have a strange issue here and I hope you can see my comment to help me with it
I have a Core i7 12700f paired with RX 6700XT and 32 gigs of memory.
Every time I go into the BIOS (Gigabyte Z690 UD) and activate the Gaming profile or enable Enhanced Multi Core Performance, I see a drop in GPU usage in games, which results in much lower FPS.
Also, when I enable either of these options in the BIOS, I notice that the CPU usage inside games increases to 30%~40% vs 3%~7% when they are off.
So I am not sure what's going on here
the low FPS doesn't result from the low usage; the low usage appears due to those gimmicks you're trying to use. Leave the CPU at stock and use the RAM XMP, period
@@AncientGameplays Thank you for your help
The Gaming profile disables the E-cores. E-cores smooth stuff out in general; I recommend keeping them on. I have the same CPU but a B660 Gigabyte mobo. The CPU is great on stock settings and XMP. If you're on DDR4, set the RAM command rate to 1T.
@@multiicore_ hmmm I see, Thank you for sharing knowledge man!
I will wait until AMD adds this to Adrenalin... I already have too much monitoring software: HWiNFO64 running 24/7, plus the AMD and Gigabyte ones, which I have to keep
I know I have a CPU bottleneck; I had to downclock and undervolt my R5 3600 xD. It's because I don't know what the "safe voltage" is, and temps were high. Right now 4 GHz @ 1.144 V. I've heard of these CPUs killing themselves because of the 1.5 V boost they do at stock
Can Someone help me?
I'm encountering an unusual problem while gaming. Despite disabling notifications, I'm hearing the notification sound randomly. Strangely, when this happens, if I press any movement key like W, A, S, or D and then remove my finger from the button, my character continues to move on its own. How can I troubleshoot and resolve this perplexing issue?
Would like to see this in all new videos. This is great. Wonder how Ratchet & Clank with DirectStorage 1.2 will perform?
I cannot get Intel GPU Busy to work for me at all. Any idea why?
I have two RTX 2080 Tis, are they too old?
I don't get the use of this; it's effectively GPU utilization crammed inside a frametime graph. If you look at the GPU utilization graph and GPU busy, they are basically parallel to each other. If it gets implemented in RTSS I might try it; for now, meh.
you're not really getting the point here, at all. The GPU busy feature can tell you effectively if the issue of stutters, frame drops and so on is because of your GPU or not, something that the GPU usage can't show at all. That's the point
@@AncientGameplays well, at least with the current PresentMon configuration you can't see stutters in GPU busy either, because it's the **AVG value** shown, not realtime per frame. As of now, you can only see big GPU bottlenecks and dips, which leads to the exact same value as GPU utilization. I get what you are saying, but for now it does not show what you are suggesting; even in your own video there are literally no stutters shown, like in the RTSS default frametime graph for example.
It is already integrated into RTSS OverlayEditor
@@unwinder Really? I will look into Guru3D then. As always, thx for developing RTSS :)
@@h1tzzYT it is not publicly released yet, but PresentMon integration details are available in the MSI AB development thread in the forum.
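The averaging point raised above can be illustrated with made-up numbers: a single hitch barely moves a windowed average, which is why per-frame data (like an RTSS frametime graph) is needed to spot stutters.

```python
# Made-up frame times (ms): 99 smooth 8 ms frames plus one 40 ms hitch.
frame_times = [8.0] * 99 + [40.0]

avg = sum(frame_times) / len(frame_times)  # windowed average: ~8.3 ms
worst = max(frame_times)                   # per-frame view: the 40 ms stutter
print(round(avg, 2), worst)  # 8.32 40.0
```

The average shifts by a fraction of a millisecond, while the per-frame view shows a 5x spike the player would definitely feel.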
@12:36 this is the part where he shows the different frame time graphs of when the CPU is bottlenecking the GPU.
imho he seems to be mumbling through most of the video or I'm just too dumb to figure out what he's saying. But the 12:36 mark illustrates CPU bottlenecking really well.
mumbling? lol
What we need is a bottleneck for dummies software to identify where the weakness is in our systems, most of us either don't have the time to dig deep into finding out or are not capable of doing so.
This Intel performance software could do the trick if it breaks things down into easy-to-understand language. That could be a real help for many in seeing which parts of their PCs need upgrading, because let's be honest, a lot of us are not that good at building a balanced system. Many of us just throw in what we think is a good match of CPU, GPU and memory, or throw in higher-end parts, but a lot of us are not good at seeing what the right balance of hardware is. This could help reduce cost while also showing us what needs upgrading to get a bigger bang for the buck.
This Intel performance software could do just that once it's mature.
On another note, Intel has released the source code as open source. That's good news, because AMD, Nvidia and Intel could integrate this into their drivers to monitor performance and give us tips on where the bottlenecks are and what hardware upgrades would help.
Jedi doesn't look so great on those little systems
Nice now I'm going to see if I can find my bottleneck and overclock it out!!!
Check it out :D
Now the next step is adding CPU busy/waiting stats, i.e. whether it's held back by RAM bandwidth/latency or has reached its own max ST/MT potential.
Maybe
I just got a 12700K, 1080 Ti, 64GB RAM, Samsung 990 SSD, and my fps in GTAV was 150ish at 1080p. I just installed a 4080 and now I'm at 100fps or less. Any ideas?
It's an obvious CPU and RAM bottleneck, yes; the game is also shit and broken. The 4080 has bigger CPU overhead, that's why I guess
I have a 7800X3D, and when I use Full HD resolution with Ultra Performance DLSS my GPU is not at max. When you see the GPU not working at max in GPU-demanding games, there is a CPU bottleneck.
depends. What you have there happens because you're trying to render like 480P resolution, obviously the CPU would be the bottleneck
Can you be banned for using it in some games? Cuz Forza Horizon bans ppl using overlays.
Then the FH5 team are a bunch of idiots...
@@AncientGameplays not the 1st to say that. But I can't complain, cuz I could get banned as well for complaining hehehe! :X
I've used the rivatuner overlay with no issues and no ban.
@@megadeth8592 I have a friend that was banned exactly for using it. But they don't do it to everyone. Just to ppl using it too much. Idk why.
@@AncientGameplays FH5 (and previously FH4 and FH3) are my primary overlay stability stress tests, which I use during RTSS development and testing. They never banned anyone for overlays; it is false info.
This guy has no idea what he’s talking about lmao
I have a 3080 and bought an XFX RX 6800 XT Merc 319, and I have to say AMD drivers are absolute dogshit, because I was having weird GPU downclocking issues, so I had terrible stuttering and fps drops. I tried so many things to fix it but nothing worked. Finally I fixed it by manually setting the GPU core frequency to 2700 MHz and the minimum frequency to 2600 MHz, and boom, everything works flawlessly. I'm getting insane fps, no fps drops whatsoever. Some people may say "my AMD GPU doesn't have this problem" bla bla bla; just open the MSI overlay and watch your GPU core frequency, it will drop like a monkey jumping from tree to tree. Test this in many games.
You have a CPU or RAM bottleneck most likely. Also, clean install the drivers and install the chipset drivers. That's not a thing that usually happens; you have something wrong there
CPU was just loading..... SHIT....LOL
🤣🤣💪
great video. You are bottlenecked on the first CS:GO bench. A 7800X3D helps with those fluctuations
there are no "fluctuations", and a CPU bottleneck is there of course; we're running mostly 540P native res xD
so this just shows you there is a bottleneck, but not WHERE; it could be software-sided, game engine, driver overhead, slow RAM, not just the CPU
how is this different from low gpu usage?
Yes, it is much different, because you can see that it is not GPU-sided; with GPU usage it is relative, as sometimes even at 99% you're getting bottlenecked. A good example was TLOU before Smart Access Memory
This might be a stupid suggestion, but why don't gpu makers make a graphics card that has not only a gpu and vram on it but also a special "cpu" type chip that can handle everything in-game that the system's cpu handles now just more efficiently and effectively? That way the system's cpu wouldn't be burdened with the task of running certain aspects of a game, since that would be handled by the "cpu" on the graphics board, and can instead focus on system stability and processes.
What CPUs need are Data Processing Units that can offload and speed up memory bandwidth
Will there always be a CPU bottleneck for 1080p in new games?
Most likely as gpus keep getting stronger
7800X3D+7900XT here. I'm having an issue where it is polling wattage and VRAM data from the integrated GPU, Radeon TM instead of the 7900XT. Going in settings, it only says default adapter as the driver source.
same for me
There are still some issues as this is beta, but you can choose the source in the settings
How long will it take for RivaTuner Statistics Server to add this feature?
who knows haha
Wow, a new thing and you don't get it. In CS the relative difference is huge and you are CPU limited, as the CPU can't generate any more frames. You should raise the settings a bit and you would see the difference shrink, up to the point where it rises again, when using RTX for example
I said that YES, we were CPU bottlenecked and that the difference was there; percentage-wise it is big, BUT there are always variations, even in non-CPU-bound scenarios. But well, thanks for the cockiness
I took it for a quick spin on my 7600/6750 XT rig and it can't read GPU temp or memory. Latest drivers/BIOS etc.
It did here, but not the power draw. Needs updates
On mine it read power draw just fine. I really hope Intel keeps working on this.
the overlay won't show up for me?
Thanks, Geeklord.
💪💪
Intel is doing something great for the PC community, unlike AMD who is blocking game developers from adopting upscaling technologies that are not Flickery Shimmering Resolution. AMD is simply pathetic beside Intel. RIP AMD🎉
AMD doesn't need artificial crutches.
This is what happens when your parents drop you on your head when you're a toddler 😂😂
Yes of course, NVIDIA never did anything against the market. RIP brain
Just took a look at Intel PresentMon and my frame time = gpu busy (I mean they are near the same value all the time). There is no gap at all. 10900k and RTX 3080 ti. Tested in Cyberpunk 2077 1080p Ultra settings and Diablo 4. Reduced my overclock to 5.1/5.0GHz as there is zero increase in FPS in games. 5.2/5.1GHz is just for benchmarks now. Love this program.
In the video he is CPU limited in CS2. That's why he is getting less than 99% GPU load, around ~80%, which is clear CPU bottlenecking (expected given the high fps, but even so). Even in Jedi he is seeing CPU bottlenecks for periods; there can be lots of reasons for this. With me it's 99% GPU load and both frame time and GPU busy are the same (all the time), in the games I have tested. This is bad I guess, because a faster card likely won't do me much good.
I would say the CPU is not paired with the correct GPU in this video but still is up to the task and fine. The GPU is a little strong for the build but its acceptable. Likely upgrade is a 7800X3D. It will likely push more frames in CS2 at lower resolutions.
Even so this is a don't touch build if there is no hitching or stuttering. He's still getting most of the GPUs performance. Its balanced enough.
When I hovered the cursor over your video in the YouTube feed on mute, it looked like you started the video by saying good afternoon. lol
Hahah, coincidence
Still new and needs to brush up the app
Yes, but it's MUCH better than what they had before, and it now has a good UI
@@AncientGameplays I have a big question: my game stutters when I watch gameplay that I recorded in Street Fighter 6. I don't know how to fix it. It does it with all of them when I try to record my gameplay, but Street Fighter 6 is the main one having issues
@@Jason_Bover9000 so, when you're gaming + watching a video?
@@AncientGameplays I upload videos on YouTube after I record the footage; it stutters
maybe the issue is the "playing" and not the video itself @@Jason_Bover9000
I got these stutters you showed in jedi survivor in all games. Running my 7600 with a 7800xt and 32gb 6000 cl30 ram. What is causing this exactly and how can i fix this?
"I know my system, and I know my shit." 💪😎👊
hahah
Now we can expose lazy devs that still use 4 threads on CPUs and make games stupidly CPU bound, where the GPU is at like 50% for no real reason (since we haven't had any major improvement in CPU-related game workloads).
The 4-core workloads aren't the issue anymore. Poor distribution usually is
@@AncientGameplays usually, yes. But there are games like the TW3 remake that used fewer threads than the original when ported to DX12 on launch; it's a mess. Same goes for Hogwarts Legacy and a lot of Unreal Engine games.
What I love about Fabio is that he can explain how a system's GPU is bottlenecked by the CPU: how the graphics card is sleeping, not pulling its weight, running at 70% or far less instead of 98% to 100%. But I'm guilty as sin of doing that on purpose with most of my retro AGP builds. They are for retro gaming only, at lower FPS (60fps max in some games, on 60Hz to 85Hz monitors), because the graphics are as crisp as they will ever get while the cards run at 50% or way lower in some games. The builds run ultra silent that way, with the graphics card bottlenecked, in bed sleeping, dreaming the games on easy street; the capacitors last, the silicon stays as good as new, and there are no fans blasting and sucking in dust to clog the cards. That is what it is all about: purely ornamental builds that can still run PC games up to DirectX 10 to the max if need be. These AGP builds are crown jewels, antiques that hold a premium and should only be looked after with the utmost respect. But that's not high-end like what Fabio is bench testing with live gaming. All his viewers know he is one of the best techs, with zero need for liquid nitrogen overclocking to find that sweet spot in his systems, as 99.9999% of gamers will never use liquid nitrogen cooling; that is only for perfectionists like GN's Steve and JayzTwoCents chasing world records with golden-sample silicon. Anyhow, hats off to Fabio, amazing as always, nothing but total respect.
What does the presented fps graph (99%) mean?
Which one? 99% is usually the 1% lows; the closer to the averages, the better
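For anyone curious, here is a hedged sketch of how a "1% low" figure can be computed: one common definition is the average FPS of the slowest 1% of frames, though exact methods vary between tools. The data below is made up.

```python
# One common definition: "1% low" = average FPS of the slowest 1% of frames.
def one_percent_low(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # the slowest 1%
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                  # back to FPS

# Made-up data: mostly 10 ms (100 FPS) frames, one 25 ms (40 FPS) hitch.
times = [10.0] * 99 + [25.0]
print(round(one_percent_low(times)))  # 40
```

The closer this number is to the average FPS, the more consistent the frame delivery.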
Downloaded the Intel PresentMon yesterday. Was trying to find a way to make the background transparent like MSI's one; guess not yet, but the OSD with Intel GPU Busy is wholesome! thx AG❤
Congratz