THIS is what Bottlenecking REALLY looks like! AVOID THIS!

  • Published 24 Nov 2024

COMMENTS • 2.4K

  • @Ration999
    @Ration999 1 year ago +8161

    Personally my biggest bottleneck is my wallet, paired with GPU prices.

    • @demontekdigital1704
      @demontekdigital1704 1 year ago +68

      Same. For me neither is a bottleneck. More like trying to fit a softball in a garden hose, lol. My PC has tagged in my house, and I'm currently being choke-slammed by all the repairs, lol.

    • @camotech1314
      @camotech1314 1 year ago +121

      😂😂😂 poor people are always bottlenecked

    • @demontekdigital1704
      @demontekdigital1704 1 year ago +201

      @@camotech1314 LMAO! So painful, and so true. I'm so poor I can't even pay attention.

    • @RainyFoxUwU
      @RainyFoxUwU 1 year ago +20

      that comment killed me

    • @rchapman801
      @rchapman801 1 year ago +14

      This comment made me laugh. So true.

  • @luck9837
    @luck9837 7 months ago +99

    My job's been bottlenecking my life.

  • @ee2610
    @ee2610 1 year ago +499

    I upgraded from an Intel i5-7600K to a Ryzen 7 7700X but kept my GTX 1080. My framerates have increased 2 to 3x in CPU-bound games like Risk of Rain 2 and Rust. God bless Microcenter for that $400 bundle.

    • @Davincibeats
      @Davincibeats 9 months ago +31

      That's a huge upgrade. Mine is similar. I upgraded from the i5-8600K to the Ryzen 7 7800X3D ... I've only tested one game... But Kingdom Come went from 40 FPS at ultra to 60 FPS at ultra. Never knew my CPU could bottleneck my PC like that.
      I also massively upgraded the cooling, which helped keep temps insanely low. My GPU used to run at 90+ and my CPU at 90+ ... Now my CPU runs at 75 and my GPU at 75 max.

    • @Heckinwhatonearth
      @Heckinwhatonearth 9 months ago +9

      I just went from a 6700K (4.8 GHz) to a 14700K with a 3070, and from a Z170 with 2600 MHz RAM to a Z790 with 6000 MHz RAM. Yet to pick it up, stoked!
      A 7900 XT is planned next 😁
      I'm at 3440x1440 120 Hz so I didn't think my CPU was bottlenecking super hard until I got into Act 3 in BG3 😂

    • @Tiagocross
      @Tiagocross 8 months ago +2

      Yep! I'm building a new PC, and for the moment my GTX 1080 paired with a Ryzen 7 7800X3D will have to work.

    • @balin1
      @balin1 8 months ago +3

      Those Microcenter bundles are the real value.

    • @PhilipKerry
      @PhilipKerry 8 months ago

      @@balin1 In the UK Microcenter doesn't exist; we have Currys PC World, but the choice of peripherals is limited.

  • @hamzaalhassani4154
    @hamzaalhassani4154 8 months ago +171

    Out here playing Cyberpunk on an i3-9100F and a GTX 1660 with a CPU bottleneck at 40 fps. When Jay said "UGH! Feels like playing on a potato" it hit me hard in the feels, man.

    • @tentymes10x
      @tentymes10x 8 months ago +5

      I play Enshrouded on my i7 4790 and an RX 570 at 30 fps... I don't mind tho cuz I'm old.

    • @hamzaalhassani4154
      @hamzaalhassani4154 8 months ago +3

      @@tentymes10x I've been playing at 60 fps a lot lately. I can't handle 30 fps anymore XD

    • @YourMother6yrsago
      @YourMother6yrsago 7 months ago +1

      @@hamzaalhassani4154 Same. I upgraded to a 1650 Super build from a 1050, and now I can play every game at 60 fps where before I was struggling to get 30. I can't get used to 30 anymore.

    • @megapet777
      @megapet777 7 months ago +2

      That's not such a bad setup. Cyberpunk is just a really demanding game. I bet you would get 60 fps in Elden Ring, for example.

    • @hamzaalhassani4154
      @hamzaalhassani4154 7 months ago +1

      @@megapet777 I can get 60 fps, you're right. But I have to sacrifice details and use upscaling whenever possible.

  • @northwestrepair
    @northwestrepair 9 months ago +48

    Lowering graphics settings will allow the GPU to generate more FPS.
    The more frames the GPU can generate, the harder the CPU has to work to feed it those frames.
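
    A back-of-the-envelope way to see this: each frame is paced by whichever stage is slower, so lowering graphics settings only raises FPS until the CPU becomes the limit. A minimal sketch (the per-frame timings are made-up assumptions, not measurements):

    ```python
    # The slower pipeline stage paces every frame, so
    # FPS ~= 1000 / max(cpu_ms, gpu_ms).
    def fps(cpu_ms: float, gpu_ms: float) -> float:
        return 1000.0 / max(cpu_ms, gpu_ms)

    CPU_MS = 8.0  # assume the CPU needs 8 ms per frame -> it can feed at most 125 FPS

    for gpu_ms in (16.0, 10.0, 8.0, 4.0):  # lowering settings shrinks GPU time
        limiter = "GPU" if gpu_ms > CPU_MS else "CPU"
        print(f"gpu={gpu_ms:4.1f} ms -> {fps(CPU_MS, gpu_ms):5.1f} FPS ({limiter}-bound)")

    # Once gpu_ms drops below CPU_MS, FPS stops scaling: the CPU is now the
    # bottleneck no matter how much further the settings go down.
    ```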

    • @Shywer
      @Shywer 20 days ago +1

      And capping FPS saves a lot of energy. =)

  • @thigo94
    @thigo94 1 year ago +357

    The frametime graph in Afterburner would be a great visual aid, because as you said, it is hard to see jitter/stutter in the video. Adjusting the max and min properly gives a great indication of these issues.

    • @x0Fang0x
      @x0Fang0x 1 year ago +8

      Or just use Intel's GPU Busy graph to know what settings to use.

    • @Sulphur_67
      @Sulphur_67 1 year ago

      @@x0Fang0x Tried it and it didn't work at all with my RX 7600.

    • @disser3849
      @disser3849 1 year ago +2

      YES! Always show frametimes pls.

    • @squirrelattackspidy
      @squirrelattackspidy 2 months ago

      How do you use it?

    • @thigo94
      @thigo94 2 months ago +1

      @@squirrelattackspidy Just search for how to set up the MSI Afterburner/RivaTuner overlay. Once you have it set up you can turn on the OSD for the frametime with the "graph" option. The minimum for the graph can be set to 2 ms and the max to 20; with those values you can monitor from 50 to 500 fps. If you are at a perfect 500 fps your line will be flat at the bottom of the graph; the lower the fps, the higher the line will go. What you want is always a flat line without a lot of lumps, which means the experience of playing the game is smooth. Any spike means a big stutter.
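
      Since fps and frametime are just reciprocals (fps = 1000 / frametime in ms), those 2 ms and 20 ms bounds map directly to an FPS range. A quick sketch of the conversion, nothing Afterburner-specific:

      ```python
      def to_fps(frametime_ms: float) -> float:
          return 1000.0 / frametime_ms  # fps and frametime are reciprocals

      graph_min_ms, graph_max_ms = 2.0, 20.0
      print(f"{graph_min_ms} ms -> {to_fps(graph_min_ms):.0f} FPS (flat line at the bottom)")
      print(f"{graph_max_ms} ms -> {to_fps(graph_max_ms):.0f} FPS (top of the graph)")
      # An upward spike on the graph is a single long frame, i.e. a stutter
      # that an average-FPS counter can easily hide.
      ```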

  • @berserkslayer8638
    @berserkslayer8638 1 year ago +511

    Your video about bottlenecking (the one with the i3 and 2080Ti) was the one that started my journey in PC building and gaming. Before watching it I was so lost and afraid of all this stuff, but you kept it simple and easy to understand. So after 6 months of watching your videos and saving some money, I was able to build my own PC back in 2019. I have nothing to say but thank you!

  • @frostbite3820
    @frostbite3820 1 year ago +194

    My concern with bottlenecking isn't that I'll get stutters or jitters; my concern is that I'd pay money for an upgrade only to find out that it was entirely pointless because some other component is maxed out, keeping whatever I just bought and installed from working harder and giving me more performance.

    • @DragomirGaming
      @DragomirGaming 8 months ago +18

      Exactly. I upgraded my PC from a 3rd gen i5 to a 10th gen i5, and my GPU from a GTX 750 Ti to an RTX 4060, only to find out that my i5-10400 is bottlenecking 😢
      Now I need to spend more money on an i7 or i9 12th or 13th gen processor to get the max performance 😭😭

    • @masterkamen371
      @masterkamen371 8 months ago +6

      @@DragomirGaming I really hope I won't be running into the same issue after upgrading to an R5 5600X. As it stands, my Arc A750 is being bottlenecked quite hard by the Ryzen 3 2200G.
      According to my tests with another PC, the Ryzen 5 is 2.6x faster than my current CPU, so who knows.
      Edit: well, as it turns out, no, a 5600X is not a bottleneck. In fact, it's the best gaming CPU in the price range.

    • @fred114b
      @fred114b 8 months ago

      That CPU will absolutely not get bottlenecked by an A750 @@masterkamen371

    • @Vitek_1599
      @Vitek_1599 8 months ago +9

      I bought a 4070 for a computer that had a Ryzen 5 2600 cuz I didn't know what bottlenecking was.

    • @zoopa9988
      @zoopa9988 7 months ago

      @@Vitek_1599 You could buy an i3-12100F if you go with LGA 1700; if you want a faster CPU or want to go with AM5, then the R5 7600 should be plenty fast.

  • @wkeyser0024
    @wkeyser0024 10 months ago +29

    Thank you for all the information you and your team provide. It makes a world that seemed unattainable for the layman somewhere you can start. Haven't built a computer since a 486, but that changes this weekend. Thank you sir, be well.

  • @princexaine5340
    @princexaine5340 1 year ago +23

    Really good guide, Jay. I see this all the time and I try my hardest to explain bottlenecking in its simplest form wherever possible - but a lot of people seem to think bottlenecking happens "immediately" when you pair two components together. And that just isn't true. It depends on the application. Very well thought out. Thank you.
    I actually paired an X5470 from 2008 (at 4.5 GHz) with my 4090 just to prove that the card can still run close to specification.

    • @chrischaf
      @chrischaf 1 year ago +4

      A lot will depend on how the game itself balances demands between the CPU and GPU, *and* what you are doing in the background.
      (I came back up here to mention, this got *very long*, so tl;dr people should just skip this post and have a good day ;D and, also, I *don't* have big issues with any other games, *including* what little I've played of Cyberpunk -8.2 hours- So this is a story of an "exception to the rule" sort of thing. *One particular game* that brought my system to its knees due to bottlenecking. For reference, I run an i7 7700 at stock speed, and a Zotac 1080 Mini at stock speed. SSDs all around. It's a prebuilt -zotac mek1- except for the 1080, a beefier power supply, 32 gigs of RAM and being stuffed with more SSDs than Zotac ever intended lol and while it may sound very old by today's standards, to build *the same tier rig* in today's market would be about 3 times what I could afford. Plus, since I use a 65" TV running 1080p@120hz I'm basically limited to 1080p gaming, which the current system already handles quite well *in most games*. STRESS the *in most games* part lol hence this bottlenecking story)
      Like, I play a lot of 7 Days to Die, which actually relies *a lot* on the CPU when there are hordes (lots of zombies at once).
      Things worked pretty well on the base game, but when I started running a mod (Darkness Falls) that tended to increase the size of the hordes a bit, among various other things, it went from completely playable to completely UNplayable (12-14 fps while dozens of zombies are trying to pound on you, with occasional drops to 0 fps. Yes, *ZERO* fps lol!).
      So I started messing with game settings, trying to see what I could turn down in hopes of squeezing out a few more fps to get by...
      But...
      What I discovered was that I could actually *turn a whole lot of graphics settings UP* and it didn't really matter; I was actually already running the game at much lower settings than I really needed to, because it simply wasn't the graphics being too high that were killing me.
      There was a major game update right in the middle of my settings-testing, so I never got to learn if there were any particular settings worth turning down...
      But what I did manage to learn fairly quickly was that I was almost constantly pegging the CPU at 100% (in the Task Manager. I didn't have anything at the time to look at single-core usage, but if it's riding at 100% in the Task Manager a majority of the time, what's happening per core is probably less important lol).
      Now... I'll bring up one of the *big* issues that I hit on as a problem right away.
      I *stream* my games. So I always have OBS running in the background, and I always keep Chrome open to monitor my stream (just that one window, on a second monitor) and often I have Firefox open in the background, for this and that (I still have all the bookmarks from back to the early 2000s in my browser, cause I always move them forward to my next PC, so Firefox sort of serves as *alternative/extra* memory for my brain, which I rely on heavily since I have such a bad memory, *plus* I tend to always open things in new tabs, and my browser loads with all tabs from the previous sessions, so I can look back over my previous tabs to help keep track of time and when I did things, what I was looking up yesterday and the day before, etc. I rely on this so-much-so that I often end up with hundreds, or even *thousands* of tabs in my main browser. I believe the record number of tabs I've had in Firefox at one time was 3,172 or something like that. Somewhere over 3 thousand; I'd have to find the pic to know the exact number for sure. But, for reference, Chrome with my one Twitch page open tends to hog just-as-much-or-more CPU than Firefox with 2,000+ tabs).
      Chrome tended to want to use about 14% CPU most of the time (it liked to randomly use 60+% for no obvious good reason) and Firefox tended to want to use 7%-14% as well, so the first thing I had to do was stop streaming for testing.
      Aaaand, well I didn't get much further than that before the big game update changed too much to use what I'd already done, which for various reasons had taken a couple of days.
      And it's that part that was really showing me how CPU-limited/bottlenecked my system was with this particular game.
      The basic problem was, that getting REAL reliable AND *consistently* repeatable test results on that game was a heck of a lot harder and more time intensive than I expected. It was *literally* taking me *hours-per-setting* to get AND verify consistent/reliable/meaningful differences between runs.
      And the primary problem was that my video card was simply a bit *too* good for single changes to make an obvious impact.
      As in, the difference between low/off and ultra/on could be between 0-5 fps difference, while the variation between runs *with the same setting* could ALSO be between 0-5 fps.
      So I wasn't able to do just simple run/spin around tests and get real numbers. They were all over the place.
      I had to build a big square base with a particular place to stand, run around to all 4 corners and spin around to get everything to load, reset the time and weather to be the same, set my crosshairs on a particular point in the distance, then spawn a given number of zombies set to run towards me, and log the fps while they'd be beating on the base, and while I'd be shooting, etc. I had to do that multiple times for *every single setting* to have consistent, accurate results. And just running around in the city wouldn't have told me anything, because *that* part wasn't what was killing my fps; it was when my CPU was having to do this and that when there were large groups, 25-50 or more, trying to chase/attack at once.
      Which, ironically, was the absolute WORST time to have major fps drops.
      But, anyway, yeah. I *thought* that all these fancy green flamey effects that were added to some of the zombies (added by the mod) were going to be what was killing my fps, but no: spawning zombies that had lots of effects had almost no impact compared to ones that didn't.
      That was all within the realm of what the graphics card could handle, if it was being allowed to handle things *at all*, which it *wasn't* when there were large groups, due to the CPU bottlenecking.
      So, yeah, if you go to do some graphics troubleshooting/tuning, and find that turning a bunch of settings down doesn't seem to actually help, and/or actually seems to make the game run slower/worse, you may very well be CPU-limited/bottlenecked, because your GPU is stuck waiting around for the CPU to take care of some things before it can do its next job.
      And, since I still can't afford to upgrade to a similar-tier modern system, I'm probably looking at needing to upgrade from my 7700 to a 7700K, since the K model's base clock is as fast as the standard model's full/max "turbo" speed.
      My MEK1's proprietary BIOS may not allow me to actually overclock the K, but just the default speed bump would essentially be an overclock, so it's all good. And with that, I think this system will be about as maxed as it can get.
      Although... I actually have a factory-refurbished 2080 Super in the mail that I got for a decent price, which is probably the max video card I could use in this system. lol
      That might seem silly considering I've been talking about my system already being bottlenecked by the CPU, *but*, there are some caveats.
      1) My system is only this heavily CPU bottlenecked *on this one game*. Other games, which balance things more heavily on the graphics side, don't have such a game-breaking limitation.
      2) The new video card cost $276 (including tax and shipping). I couldn't buy anything near that tier from Nvidia for that price, and I will *never* touch another "no one would ever notice that it doesn't have a 256-bit memory interface during normal gaming" card again... I bought a lower-tier card that was recommended on that advice once, and had to suffer a year of seasickness-inducing micro-stutter before I could afford another 256-bit card to replace it. And since Nvidia is now even chopping down their xx70 series cards, that only leaves me xx80 series and higher, which cost... well, you know what they cost ;P
      3) The 2080 Super is a "blower card". My system was designed to use a 1070 Ti blower card, which sits in a little chamber, and the blower card blows all the heat out the back.
      I swapped it for my 1080 Mini, because I already had the 1080 and I didn't want to use a 70 Ti when I already had an 80, even though, in my quick round of testing, the 80 didn't really perform much differently than the 70 Ti.
      But, yeah, the temps with the Mini are *bad*. I even tried adding a fan to pull the hot air out, and it made literally no difference. AND I found during recent testing that my 1080 *will not throttle* even though it's supposed to, so it's a risk to leave it running in there unmonitored. And it's actually the heat issue that was my biggest reason to buy the newer card.
      Now it's time for lunch :) lol

    • @al3xb827
      @al3xb827 8 months ago

      Can you please help me? I've got a Ryzen 5 3600X and a 6750 XT and there is a significant bottleneck. What should I do?

    • @princexaine5340
      @princexaine5340 8 months ago +1

      @@al3xb827 Turn graphics settings up. You can probably play at better settings without losing FPS due to the CPU bottleneck. You can try to OC your CPU if you have thermal headroom. You can try limiting the framerate if you are experiencing large hitches in performance.
      ...Or just upgrade the CPU.

    • @al3xb827
      @al3xb827 8 months ago

      @@princexaine5340 Thanks. Somehow it looks like in some games there isn't such a big impact on fps if you change the graphics settings. But yeah, that was my idea, to OC the CPU at least. Since I just bought the GPU, I may upgrade the CPU sometime later this year.

    • @princexaine5340
      @princexaine5340 8 months ago +1

      @@al3xb827 Right, and I know hearing "upgrade" after you've already dropped money on a component isn't the answer we all want to hear, but in reality you can only do so much to mitigate the bottleneck your setup is experiencing in titles where a powerful CPU is beneficial.

  • @Liaret
    @Liaret 1 year ago +149

    A small detail/correction: in most game engines, such as Unreal, the Game Thread (the main engine thread running the world etc.) and the Render Thread(s) (the threads responsible for "talking" to the GPU and giving it instructions) are separate. The main thread (GT, Game Thread as it's called in Unreal) "ticks" the world, but does not send render instructions to the GPU or wait for the GPU (for the most part). That's what Render Threads do.
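
    To picture that split, here is a toy sketch using plain Python threads (an illustration of the idea, not Unreal's actual code; the queue size and frame count are arbitrary): the game thread ticks the world and hands results off, and the render thread is the one that issues GPU work.

    ```python
    import queue
    import threading

    draw_queue = queue.Queue(maxsize=2)  # small buffer keeps the threads loosely coupled

    def game_thread():
        for frame in range(5):
            world_state = f"frame {frame}"  # "tick" the world: physics, AI, gameplay
            draw_queue.put(world_state)     # hand off; the game thread doesn't wait on the GPU
        draw_queue.put(None)                # sentinel: no more frames

    def render_thread():
        while (state := draw_queue.get()) is not None:
            print(f"render thread submits GPU commands for {state}")

    gt = threading.Thread(target=game_thread)
    rt = threading.Thread(target=render_thread)
    gt.start(); rt.start(); gt.join(); rt.join()
    ```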

    • @rodiculous9464
      @rodiculous9464 7 months ago

      Is there a way to check exactly which are the game threads and which are the render threads? I don't see it in Afterburner/RTSS.

    • @noth606
      @noth606 5 months ago +1

      @@rodiculous9464 Depending on how it's done, to a degree, but in general no, since you don't see threads anywhere, only separate processes/exes. Also, there is often no reason for a game engine to have n-way threading enabled; it's unnecessary overhead to code support for. It's not impossible by any means, but it would bring a bunch of extra junk eating CPU time without enough gain in return. What I mean is, I can code a process to always spawn 4 threads, or just one, or sniff the load on the machine, take a count of cores and roll a pair of magic dice to decide how many threads to spawn, then check again every 10 sec to decide whether to spawn new threads or cull and queue old ones, etc.
      I'm an ex software dev; most of the time for small things I never spawned multiple threads even if the task could support it, because it's more overhead than gain. For major worker loops I usually set a max, 4 or 8 depending on what it is. I did very occasionally code something to spawn MANY threads at once, because I wanted a sort of snapshot to process coherently, meaning for example 40 different parameters of a changing dataset sampled at once, each spawning a worker thread to process it - once done I have the processed result of a "snapshot" of all the different parameters at one time.
      But all these things change faster than a user can perceive in many cases. But yeah, it depends on how it was coded more than anything else. The OS cannot decide to spawn new threads for something; the application can be single-threaded if it wants to. The application decides more than the OS does, since it has to be coded specifically to spawn threads and then manage them. There is no "automagical mode" to it: if it isn't coded for it, it won't and can't use multiple threads. And even if it is threaded in code, the developer(s) decide how many threads, when, and for what. The OS has no crystal ball to see what the application benefits from, thus has no positive control, and very little negative control.
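
      For what it's worth, the "set a sensible max" pattern described above looks roughly like this in Python (a sketch under assumptions, not the commenter's actual code): a fixed pool of workers processes the sampled parameters instead of one unbounded thread per task.

      ```python
      import os
      from concurrent.futures import ThreadPoolExecutor

      def process(sample: int) -> int:
          return sample * sample  # stand-in for real per-parameter work

      samples = range(40)  # e.g. 40 parameters of one dataset "snapshot"

      # Cap the worker count (the "max of 4 or 8" rule of thumb above).
      workers = min(8, os.cpu_count() or 1)
      with ThreadPoolExecutor(max_workers=workers) as pool:
          results = list(pool.map(process, samples))

      print(f"processed {len(results)} samples with {workers} worker threads")
      ```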

  • @Gravgon
    @Gravgon 1 year ago +156

    I just wanted to thank you for all the videos you make. You have helped me so much with building, upgrading and troubleshooting my PC.

  • @Gravgon
    @Gravgon 1 year ago +74

    HAHA! At 6:18 I thought his hand coming up to point was part of the animation after putting the sword away.

    • @lennyshoe
      @lennyshoe 7 months ago +2

      same 😂😂😂

    • @rezaimran98
      @rezaimran98 7 months ago +1

      RTX Overdrive

    • @SnarlyCharly
      @SnarlyCharly 2 months ago

      I was looking for this comment. I thought the exact same thing, "whoa how is he getting his character to point in those exact spots like that?"

  • @TheHoodSite
    @TheHoodSite 1 year ago +3

    The ad has gotta be my favorite part of the video; haven't seen something that funny/rad in a while.

  • @oistyer
    @oistyer 1 year ago +2

    Thanks so much for this, now I don't feel so nervous about my 3060 and my i5 12400

  • @micb3rd
    @micb3rd 1 year ago +59

    Also one more topic which is very interesting: not all bottlenecks feel equal. Jay is right, hitting a CPU limit gives large frametime spikes; it is very noticeable as stutters, and it feels horrible. When a GPU is at its limit the delivered frametimes are often quite stable, so it is a much, much nicer feeling. This is why for motion fluidity it really is often best to either load your GPU to 98% or set a framerate cap a little below where your CPU FPS limit is.
    This is also why Nvidia and AMD work on technologies like Reflex and Anti-Lag+ to allow a small buffer in the GPU rendering output (a few FPS below your set refresh rate) to ensure latency does not spike up when the current workload is saturating the GPU to its limit. The gameplay experience is much nicer.
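
    The capping idea is easy to picture in code. A toy sketch (real limiters such as Reflex are far more sophisticated; the cap and workload here are made-up assumptions):

    ```python
    import time

    CAP_FPS = 138              # assumed cap, a little below a 144 Hz refresh / CPU limit
    BUDGET = 1.0 / CAP_FPS     # seconds each frame is allowed to take

    for frame in range(3):
        start = time.perf_counter()
        time.sleep(0.004)      # stand-in for simulating and rendering the frame
        elapsed = time.perf_counter() - start
        if elapsed < BUDGET:               # frame finished early: idle out the
            time.sleep(BUDGET - elapsed)   # remainder instead of racing ahead
        total = time.perf_counter() - start
        print(f"frame {frame}: {total * 1000:.1f} ms (target {BUDGET * 1000:.1f} ms)")
    ```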

    • @raven4k998
      @raven4k998 1 year ago +1

      Don't you just love it when you've got an FPS sissy crying about not getting the same FPS as Jay or some other YouTuber's video on the high end with their GPU, even though it is still butter smooth to play the game? 🤣🤣🤣

    • @micb3rd
      @micb3rd 1 year ago +11

      @@raven4k998
      What looks and feels butter smooth to one person is not butter smooth to another person.
      Lots of people are happy with 60 FPS from a motion clarity and input lag perspective.
      There are lots of people who have a much more enjoyable experience when they are at 90 FPS to 144 FPS.
      It is down to their preference.
      I don't judge other people; I just help educate them to get a smoother, faster, better-looking gaming experience.

    • @raven4k998
      @raven4k998 1 year ago

      @@micb3rd Cry me a river, cause the point he made was that 20 fps is nothing at all.

    • @silverfoxvr8541
      @silverfoxvr8541 1 year ago +3

      This really messes up VR, where frametime is king.

  • @BlackHoleForge
    @BlackHoleForge 1 year ago +6

    Sometimes while tweaking my system, the numbers and specs just blur together from test after test.
    Thanks Jay for keeping it simple.

  • @ragetist
    @ragetist 1 year ago +53

    I see many people talk about bottlenecking by just comparing the GPU and CPU. If you run a graphically heavy game at 4K/144 Hz you're very unlikely to be bottlenecked by your CPU. CPU + GPU is like a company with a painter and a mathematician, and work being hindered by either is based on the work you give them: if you ask them to paint a fresco the math guy is gonna sit idle, and if you ask them to do tax returns it's the other way around.

    • @demontekdigital1704
      @demontekdigital1704 1 year ago +10

      That's an excellent analogy because that's basically exactly what happens. My only addition to this is while they're sitting around waiting for each other, the painter would be sipping some soy latte abomination while the mathematician would be sucking down double shot espressos, lol.

    • @Jason_Bover9000
      @Jason_Bover9000 1 year ago +2

      @@demontekdigital1704 It still uses it when hitting 144 fps.

    • @justinpatterson5291
      @justinpatterson5291 1 year ago +3

      That would mean your VRAM, SSD and RAM are like the accountant/warehouse manager who holds and keeps stock of what's being used/needed... Right?

    • @NickSteffen
      @NickSteffen 1 year ago +4

      Also, if you limit your fps to below your monitor's refresh rate then it's very hard to get CPU bound as well. There's no reason to have more frames than your monitor's refresh rate.
      You can just set this globally in Nvidia Control Panel or AMD's settings. (Don't use in-game settings for it (or v-sync) as they are often terrible and cause problems.)

    • @samuelsulaiman
      @samuelsulaiman 1 year ago +1

      It's like a restaurant operation where the GPU is the kitchen and the CPU is the front of house. When the kitchen is beefy enough and can send food out as fast as it can, and the FOH just can't keep up delivering the food, that's CPU bottlenecking. The kitchen will end up slowing down production, because why cook when no one is picking up the food anyway?

  • @o_Domo
    @o_Domo 1 year ago +30

    the iFixit ad never gets old

  • @wezleyjackson9918
    @wezleyjackson9918 4 months ago +4

    OMG - That advert for iFixit at 0:34 - I wish all ads were this funny - maybe I would watch them multiple times as I did this one - "..no, we interrupt this interruption with this interruption..." Jay you goofball!! Subscribed! 🤣

  • @zackzeed
    @zackzeed 1 year ago +19

    I love these 'refresh' videos. Also it seems like Jay is the only techtuber that does this nowadays... correct me if I'm wrong!
    Much appreciated guys!

  • @CaptToilet
    @CaptToilet 1 year ago +141

    Interested to see this test again when the 2.0 update lands. The devs have been saying the CPU can be hit hard due to better utilization across the board.

    • @HanCurunyr
      @HanCurunyr 1 year ago +4

      I don't see recommending a 7800X3D or a 12700K for 1080p High without RT as "better utilization"; there is nothing good in that.

    • @max16
      @max16 1 year ago +1

      Ohhhh, that would be why my 4770K and 3080 have been acting weird after the update. Frames are still there as normal but CPU utilization is like... 60% now.

    • @justcrap3703
      @justcrap3703 1 year ago +2

      @@max16 But if it's because of better utilization, shouldn't your fps increase along with the usage?

    • @DavidTMSN
      @DavidTMSN 1 year ago

      @@justcrap3703 Not necessarily.

    • @alexandruilea915
      @alexandruilea915 1 year ago +1

      @@justcrap3703 Maybe the GPU was already at its limits as well.

  • @Th3King0fHearts
    @Th3King0fHearts 1 year ago +9

    I will never not enjoy that iFixit ad 😂 Putting my build together this week so I have been binging your content! Hope your and your family's health are well!

  • @4N5W3R5
    @4N5W3R5 8 months ago +3

    Been running my RTX 4090 with my old i7 8700K and gaming in 4K for nearly a year and a half now and it's surprisingly good still (about 10-15 fps down on best-case benchmarks)... was just waiting for another massive jump in CPU performance before sinking more cash into my build.

  • @pixelcutterofficial
    @pixelcutterofficial 1 year ago +2

    I recently got bad advice for a build, and the bad pairing of an i5 13400F and a 4060 Ti caused dramatic drops in FPS and stuttering. Got my money back on the original CPU and upped to a 13700KF, solving the problem, but DAMN was the bottleneck noticeable.

  • @veraxis9961
    @veraxis9961 1 year ago +84

    I completely agree with the conclusions here. I have seen a trend over time towards this idea that a CPU and GPU need to be "matched" (i.e. an upper-mid tier GPU has to be used with an upper-mid tier CPU or else you will get a bottleneck) rather than the wisdom of 5-10 years ago that for gaming it tends to be most cost effective to buy a slightly better GPU even with a slightly worse CPU. I think that logic still holds. Aside from maybe specific examples of CPU-heavy games, the numbers seem to support that most modern CPUs should be able to handle a mid- or upper-mid tier GPU just fine without bottlenecking. You might get a small performance boost from a better CPU, but most of your base performance is still going to be scaled off of your GPU.

    • @Muppet-kz2nc
      @Muppet-kz2nc 1 year ago +7

      The beauty of computing, whether productivity, gaming, or other, is tailoring something to your use case. I see it all the time in subreddits where people throw out recommendations with very little information on the use case. As a professional, most of the stuff I read and watch on YouTube really misses the mark. Nvidia has iterated leaps and bounds with its GeForce Experience gaming optimization settings and slider. It used to be horrid but does a pretty bang-up job dialing in settings if you take the time to use it. Decide on a target FPS you want, then turn the slider until you find a breakpoint that you like.

    • @Mr.Morden
      @Mr.Morden 1 year ago +7

      I've got a 5600X with a 3060 Ti, living in Miami, FL. Even if I was willing to buy a top-end system, I can't tolerate the heat that dumps out of anything more powerful. The ventilation in this house isn't set up to handle a hot spot like that. Nvidia would need to give me a voucher for a central AC upgrade and more duct installation.

    • @jondonnelly3
      @jondonnelly3 1 year ago +3

      @@Mr.Morden A 5600X will handle a 4070 at 1440p, no issue.

    • @Yerinjibbang
      @Yerinjibbang 1 year ago

      Just got a 7800 XT but still on my R5 2600 lol, looks like a 5600X will do fine as an upgrade @@jondonnelly3

    • @zackwalkman8574
      @zackwalkman8574 1 year ago +3

      I have an RTX 3060 Ti in my old i7 3770K machine. It can play most newer games without issues. Even the newest i3 is better than that, but it's still going.

  • @liamcaroline448
    @liamcaroline448 1 year ago +9

    Meanwhile the FPS where the 4090 thinks it has no load is where my 1070 is at full tilt.

  • @SeaMonkeyMetals
    @SeaMonkeyMetals 1 year ago +137

    Rather than artificially slowing the CPU, you should have run this test on a legacy system. For example, drop that 4090 into a system with an Athlon X4 CPU and you will really see the effect.
    Before I understood bottlenecking, I paired an R9 390 with an 880K Athlon X4 and wondered why I didn't get much better performance over the HD 5860...
    Little performance increase and bad stability. I regret ever buying that R9 card.
    But at the time I didn't know about bottlenecking, and thought a new GPU would solve my problems.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago

      Yeah, did that with a 4670 a long time ago; absolutely no difference.

    • @fortigan2174
      @fortigan2174 1 year ago +5

      The issue with the test you suggest is that the mobo of that setup will not have sufficient PCIe lanes. So at that point you are bottlenecking on the motherboard before the CPU even comes into the picture. That renders the test inconclusive as to how much of the bottleneck would be from the CPU.

    • @SeaMonkeyMetals
      @SeaMonkeyMetals 1 year ago +2

      @@fortigan2174 The Crossblade Ranger has a 16-lane PCIe slot... cards only use 8. I'm not sure what the lane issue you speak of is; however, I do realize that older boards have older-gen lanes. Now, speed can have a huge impact, but you have to go way, way back in time to drop below PCIe 8x...
      My point was, gimping the CPU does not give an accurate reflection of a real-world scenario, where someone might be trying to use an overpowered graphics card in an old system that cannot keep up.

    • @SeaMonkeyMetals
      @SeaMonkeyMetals 1 year ago

      @@fortigan2174 I'm no expert, and if I am missing something in my previous comment, I am open to clarification. Thank you.

    • @anteep4900
      @anteep4900 1 year ago

      haha!

  • @saltysalt7339
    @saltysalt7339 1 year ago +16

    GPU VRAM and the speed of your RAM can also create those stutters. Something not often talked about at all. Having 8 GB or 16 GB can make a whole lot of difference even if the cards have the exact same performance values, especially in games where a lot of assets get loaded and unloaded many times, like with waves of enemies.

    • @christianmyhretrani8956
      @christianmyhretrani8956 11 months ago +2

      This probably explains why my Ryzen 5 3600 paired with a GeForce GT 710 makes my games go from 200 fps down to 60 fps in a second just by dragging my mouse left and right.

    • @xXLEGEND368Xx
      @xXLEGEND368Xx 11 months ago

      Something not often talked about... at all? Dude, literally everyone talked about this when the 4060 was released; for example, Hardware Unboxed did a comparison between the 3060 (or 4060, can't remember) and the 6700 XT in games with bigger VRAM needs.

    • @saltysalt7339
      @saltysalt7339 11 months ago

      @@xXLEGEND368Xx No they didn't, dude. They talked about how it fucks performance, but not that you can actually have 8 GB and 100 fps and still the game runs like shit with a million stutters, and all it ever was is YOUR VRAM. How many games do people complain about stutters in when their hardware was just too shit to even load all the assets in time?

  • @calebjit
    @calebjit 4 months ago +1

    I had been going insane trying to figure out where my bottleneck was, and this video singlehandedly corrected my PC. I definitely am glad I stumbled on this.

  • @ice.3000
    @ice.3000 1 year ago +129

    My biggest bottleneck is my wallet...

    • @camotech1314
      @camotech1314 1 year ago +8

      Just stop being poor 😂 bottleneck solved 😅

    • @ice.3000
      @ice.3000 1 year ago

      @@camotech1314 OK, I will stop being poor; from now on I will be a millionaire. Thanks for the lifehack!

    • @VioIetteMolotov
      @VioIetteMolotov 9 months ago +16

      Just stop being solved 😂 bottleneck poor 😅

    • @YA-mr9zx
      @YA-mr9zx 8 months ago +4

      Just stop being bottleneck 😂 poor solved 😅

    • @BumpkinBros
      @BumpkinBros 8 months ago +5

      Just bottleneck being poor, stop solved 🔥🔥

  • @TheModeRed
    @TheModeRed 1 year ago +14

    I think I speak for most when I ask for a video on what settings to use for Starfield that firmly put the load on the GPU but give you the max FPS without hurting graphical quality too much. Firmly on the GPU is key. I understand this completely depends on your specific PC hardware, but a tutorial on how to min/max would be great.

    • @alfredthibodeaux2414
      @alfredthibodeaux2414 1 year ago +3

      HUB has a couple of videos on this topic.

    • @insomniacjack729
      @insomniacjack729 1 year ago +6

      Starfield has me confused. According to their min specs I'm below or at min with a 1700X and a 1080, but it runs just fine at 1440p medium settings. Dips below 60 in the large cities, but I can deal with that. What I don't understand is why people with better CPUs and GPUs are having the same issues.

    • @xSkylar64
      @xSkylar64 1 year ago

      I used this on my 3070 Ti rig and it worked great. Highly recommend @@alfredthibodeaux2414

    • @fredEVOIX
      @fredEVOIX 1 year ago +4

      @@insomniacjack729 The game engine doesn't really like more than 8 threads, aka 4 cores. On 8 cores you didn't really see this, but now that we have 10-16 cores it has become relevant: games don't know what to do and jump between cores all the time, creating stutter and fps drops. Imagine talking on the phone but every word plays on a different one and you have 16 in front of you... that's the problem. But devs know it; a lot of recent games limit core usage by default now.

    • @jyubei_ichimonji
      @jyubei_ichimonji 1 year ago +4

      @insomniacjack729 Starfield is one of this year's worst offenders, though. It's very badly optimized.
      Someone discovered recently that the Bethesda developers have a bad GPU driver implementation.

  • @PindleofKujata
    @PindleofKujata 1 year ago +64

    I'm looking forward to seeing how Cyberpunk 2.0 will handle CPU core load. It's supposed to utilise them far more effectively instead of just parking 90% of your cores and using one or two of them.

    • @Ellis_B28
      @Ellis_B28 1 year ago +6

      Apparently 90% usage on an 8-core CPU, from what a developer said on Twitter.

    • @zagan1
      @zagan1 1 year ago +4

      Depends on what Windows does; you'd only see 90% on a 4-core/8-thread chip, for example.
      But with 32 threads etc. you'll be hard pressed to see the total go over 30 to 50% usage.
      Plus CPUs now do so much more, which also reduces usage.

    • @maolcogi
      @maolcogi 1 year ago +6

      I'm pretty excited for this personally. I have a 4090 and a 5800X3D and it already runs the game amazingly; with DLSS 3.5 and the CPU usage upgrades I feel like the game will be jaw-droppingly beautiful.

    • @durrik
      @durrik 1 year ago +2

      @@maolcogi I'm in the same boat but with a 13900K; extremely excited but concerned about temps lol

    • @Ellis_B28
      @Ellis_B28 1 year ago

      @@durrik The Cyberpunk dev did say to make sure your CPU cooling is up to the task 😅 We'll find out tomorrow.

  • @Evrae04
    @Evrae04 1 year ago +1

    I was using a Core i7 6700T with a 3080. The card was crying for a new computer. Awesome video, Jay.

  • @lckrgl
    @lckrgl 1 year ago +5

    I feel this one.
    Had a pretty old setup with a GTX 650 Ti and an FX-8320E and upgraded the GPU first, to an RX 6600.
    That upgrade made almost no difference besides allowing me to play BG3 (the intended goal) with very bad performance.

  • @blahorgaslisk7763
    @blahorgaslisk7763 1 year ago +23

    Bottlenecking can give pretty weird results. Some years back both I and a friend upgraded our graphics cards to the RTX 2070 Super. He had a machine with an Intel Core i9 9900K while my machine was running a Core i7 6700. There's a pretty huge difference in CPU performance between these two, and in games his machine tended to be a lot faster. But when benchmarking The Division 2 we got some pretty interesting results. Like I said, we both got 2070 Super cards, but they were different makes and models. And in The Division 2 my machine consistently benched one to two FPS higher. The difference was that my graphics card had a very slight factory overclock, which was just enough to make a difference in the benchmarking of the game. However, what wasn't obvious when looking at the average FPS was that on my machine the CPU was pegged at over 90% for the entire benchmark. I actually got a good laugh when I sat watching the CPU utilization as reported by the game and it at some point reported a utilization of 105%...
    Meanwhile my friend's machine never broke 35% CPU utilization in the benchmark.
    When playing the game this was a lot more obvious. Loading times were a lot faster, and there were a lot fewer cases where the game slowed down.

    • @EBMproductions1
      @EBMproductions1 1 year ago +4

      This is why I say if you have a CPU bottleneck you should instantly do a 15% overclock on the GPU, as the card makes up for the gap a tiny bit, but you still have to know there will be slowdowns and loading issues etc. Thing is, these issues really don't bother me much right now in my current combo, so I'm good.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago

      I'm alright; if my belongings are all returned they'll keep me busy for years.

    • @anhiirr
      @anhiirr 1 year ago +1

      I mean, some people adopt B-die RAM and OC the snot out of it, without realizing the price difference is close to that of grabbing a better CPU/board/chipset/RAM combination instead of buying "benchmarker flagship" RAM kits. And purely on price: if you sell what you have to fund an upgrade, the cost difference is almost negligible.

    • @MJSGamingSanctuary
      @MJSGamingSanctuary 1 year ago +1

      @@EBMproductions1 Yeah, OC'ing can improve things, but it could potentially also have lasting long-term effects on the CPU. I support OC'ing on test benchmarks, but for gaming it's kinda walking into a black hole a bit. Any bugs or glitches that occur are like a soup of worms XD. Most devs will just be like, WTAF are you playing on hardware from the late 90s for XD.

    • @EBMproductions1
      @EBMproductions1 1 year ago +1

      @@MJSGamingSanctuary Nah, my idea is a low-level overclock, enough to feel a difference but not enough to hurt anything. Plus my PC is doing fine post GPU upgrade, so now I'm saving up for an i5 13600K or Ryzen 5 7600X combo.

  • @leonbigio5499
    @leonbigio5499 1 year ago +4

    Brother, games have gotten so good that when Jay put his hand on the monitor at 6:23 I thought it was an animation from the game. No kidding, I searched how to point in Cyberpunk lol

    • @xXXEnderCraftXXx
      @xXXEnderCraftXXx 1 year ago +2

      The exact same thing I thought!

    • @daniell9834
      @daniell9834 1 year ago +2

      lol I was just going to comment that, I was like, how the fuck do you do that?

    • @xXXEnderCraftXXx
      @xXXEnderCraftXXx 1 year ago +2

      @@daniell9834 RTX being too real 😂

    • @RainyFoxUwU
      @RainyFoxUwU 1 year ago

      same here!

  • @l3lue7hunder12
    @l3lue7hunder12 1 year ago +49

    As Jayz just demonstrated, the issue with bottlenecking isn't really that your games don't run - if you throw something like an RTX 4090 at it, most computers of the last 6 years should do at least reasonably well.
    The real issue for most is the price, because price-conscious buyers seeking to upgrade their system tend to buy graphics cards that according to benchmarks should be enough for their needs, instead of emptying their bank account to go all overkill with the latest, fastest graphics card model available.
    The problem here is that you won't get those same benchmark results if your CPU can't keep up, which means you just wasted money.

    • @LSSMIRAK
      @LSSMIRAK 1 year ago +14

      Pretty much every benchmark video that compares GPUs is inaccurate due to this. They pair a 13900K with every GPU, misleading people by making them think that they'll get the same performance with their crappy 9-year-old CPU.

    • @DavidTMSN
      @DavidTMSN 1 year ago +2

      @@LSSMIRAK Exactly.
      You can have a bunch of cores, but if they're all clocked low and have an older IPC then the CPU is gonna be the bottleneck if paired with newer-architecture GPUs.
      I run a 10th gen i9 with a 3090 and it works, but the improvements since then are pretty substantial - especially now with the 14700K/7800X3D - seems like the best upgrade path for my situation.

    • @richardrassat614
      @richardrassat614 1 year ago +3

      I have a 1920X Threadripper that is 6 years old and would bottleneck my 2070 Super. I didn't realize it until I started my new build, dropped my 4090 in, and got the same fps as with the 2070 Super. The 1920X had a mild OC and never felt unplayable, but apparently it still has more headroom in it.

    • @MaddaxxxE
      @MaddaxxxE 1 year ago +2

      Just upgraded my 11700F to a 7800X3D and can confirm that I was not getting the most out of my 4080 for almost a year at 3440x1440. I was so mad when I didn't get the same performance that I saw in YouTube videos in new games and realized that I was bottlenecked. I was missing 30-40 frames in some games.

    • @jouniosmala9921
      @jouniosmala9921 11 months ago

      I upgraded my i7 920 with a 2070 Super and a 4K TV. That's what a real CPU bottleneck looks like in gaming. It proved to me that average FPS and 1% lows are often bullshit numbers; the real issue happens occasionally when there's a spike in CPU load, for instance an enemy appearing on my screen a second after the ambush because the CPU didn't have enough power left to decompress its model while it was rendering the frames. Oh, the upgrade wasn't for gaming; I needed a new GPU for a GPGPU programming project. I did use it like that for almost a year, until I could afford to, and really needed to, replace the rest of the parts.

  • @ThiagoMatuo
    @ThiagoMatuo 1 year ago +4

    Me with an R5 3600 / RTX 3070: I was playing at 1080p for a long time, and in some games that would reach above 120 fps I could feel a lot of bottleneck. My monitor was 75 Hz, so it was better to lock the fps to 75 than to try to go over it and have a lot of stuttering.
    Now I've upgraded my monitor to an UW 1440p and the bottleneck is better, but I still need a CPU upgrade.

  • @ericwhitney8277
    @ericwhitney8277 1 year ago +1

    I wish the whole gaming community would see this. I've had to explain this concept to so many people. Most people think that because a 7800X has higher FPS than a 5800X, the CPU is therefore bottlenecking their GPU.

  • @Altrop
    @Altrop 1 year ago +5

    People need to realize that a bottleneck will vary depending on the game (unless you have a serious hardware mismatch). In fact, there are often both CPU-bottlenecked moments and GPU-bottlenecked moments in the same game. My old 5600X frequently bottlenecked my old 6700 XT even though they are a good pair.
    A CPU bottleneck is usually the worst.

    • @jaket1520
      @jaket1520 1 year ago +2

      What did you upgrade to from the 5600X? I have an RX 7800 XT now with a 5600X, and it kind of feels like the 5600X is a little bit of a bottleneck in some situations.

    • @dragonclubracing8669
      @dragonclubracing8669 1 year ago

      What happens when the CPU bottlenecks the GPU? I can currently get a 4090 at a good price and am considering buying, but I currently have a 5800X3D CPU. If I bottleneck at 1440p will games stutter or drop frames etc.? Thanks

    • @NoDFX_
      @NoDFX_ 11 months ago +4

      @@dragonclubracing8669 A 5800X3D will work fine with a 4090; hell, you can get away with a 5600X.
      Some CPU-heavy games will have a little bottleneck on a 5600X, but no matter what, you have a bottleneck somewhere in your system, which was the entire point of this video.
      Any CPU that has come out in the past 5 years that isn't an i3 or Ryzen 3 will run a 4090 just fine. Even more fine if you're planning on playing at 1440p or 4K.

  • @SpacemanSpifff
    @SpacemanSpifff 1 year ago +171

    I'd love to see a video showing the same kind of thing with slightly older CPUs - the ones owned by people who may be thinking of a GPU upgrade.

    • @djBurgers
      @djBurgers 1 year ago +8

      Yeah, I have a 3600 and would like to see if that would bottleneck a 4070 hard.

    • @Bello..
      @Bello.. 1 year ago +3

      @@djBurgers Depends on your setup, but most likely it would be heavily bottlenecked. You should consider upgrading your CPU to something better on the AM4 platform if you don't want to spend much more.

    • @oneonone8855
      @oneonone8855 1 year ago

      Just turn on the graph in-game and you can see the load in % on the GPU and CPU. If you play a GPU-intense game like Cyberpunk at 1440p with high settings and your GPU is below 98% utilization, it's for sure bottlenecked. @@djBurgers
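
      That 98% rule of thumb is simple to express; a tiny sketch (the threshold is the commenter's heuristic, not an official figure):

      ```python
      def likely_bottleneck(gpu_util_pct: float, threshold: float = 98.0) -> str:
          # If the GPU isn't ~fully busy while gaming, something upstream
          # (usually the CPU) is failing to keep it fed.
          return "GPU-bound" if gpu_util_pct >= threshold else "likely CPU-bound"

      for util in (99.5, 97.0, 70.0):
          print(f"GPU at {util:.1f}% -> {likely_bottleneck(util)}")
      ```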

    • @Kmmlc
      @Kmmlc 1 year ago +4

      I have a new (bought 2 months ago) 6750 XT with a 9900K. I can tell you the bottleneck is real with that pairing.

    • @justinjesse2107
      @justinjesse2107 1 year ago +1

      @@djBurgers As a fellow 3600 owner: it would, but not by that much. It would also depend on what res you're playing at. I play at 1440p so my CPU doesn't work as hard.

  • @ivankong1065
    @ivankong1065 1 year ago +20

    Thank you for showing this video, Jay. As your video from a couple years ago pointed out, it is not that easy to bottleneck a GPU with modern tech these days unless you are really going bottom dollar to pair an i3 (e.g. a 13100) with a 4090. Thank you for demonstrating that here.

    • @SolarTara
      @SolarTara 11 months ago

      My god, thank you for mentioning that it's an i3 he mentioned. I was confused cause I thought he misspoke and meant an i7 13700 and was like... whoa... how would that be a bottleneck?

    • @10th_Doctor
      @10th_Doctor 10 months ago

      Or my antique i5-6600K paired with an RTX 3090. That situation is being resolved this month when all my new build parts arrive.

    • @Alex96194
      @Alex96194 10 months ago

      @@10th_Doctor I'm running an i5 9400F with an overclocked RTX 3070 Ti. Will also solve the issue soon.

    • @10th_Doctor
      @10th_Doctor 10 months ago

      @@Alex96194 Parts arriving by the end of the week, except a few like the Thermaltake CPU frame and AIO arriving next Monday.

    • @10th_Doctor
      @10th_Doctor 10 months ago

      @@Alex96194 I am going with an i7-13700K, not the 14700K, as the "refresh" doesn't really make it much better than the 13700 but does cost extra, plus 96 GB of DDR5-6400. I also bought a Seasonic 1600 W PSU with a 10-year warranty that will not only likely outlive my system, even if I have it for another 20 years, but also has plenty of headroom for higher-powered components down the line. I am waiting to see what the 5th gen Nvidia GPUs look like, so I'm not going with a 4th gen card, as my 3090 is still good enough for really any game at 5120x1440 super ultrawide.

  • @logo59alpha
    @logo59alpha 6 days ago

    This helped me so much! Thanks.
    Turns out I had something very similar to your 14:37 explanation:
    GPU limit: no load
    GPU frequently fluctuating
    CPU load 70% to 97%
    Game fps ~40
    Time to justify the 9800X3D... let's call it future-proofing and getting rid of Intel 8th gen.

  • @multiluxem2218
    @multiluxem2218 7 months ago

    Many years ago, the PC at my home had a Pentium E2200. The CPU would easily hit 100% when I opened any modern game. Initially it had a 9500 GT, IIRC. Then I upgraded it with a GT 440 and later a GTX 550 Ti, but the improvements were not that much; there was still lagging. Later I decided to get a new PC with an i7 3770K. I did still use the GTX 550 Ti, but the difference was night and day! I could play many mainstream games with FPS many times better than before lol

  • @dipakgosain
    @dipakgosain 1 year ago +8

    The CPU is sending frames to the GPU so slowly the GPU is like, bro, we're not doing anything 😂

  • @TheRealDlo
    @TheRealDlo 1 year ago +7

    It is important to have SYNERGY in your system! Thanks J

  • @MeeMoo220
    @MeeMoo220 1 year ago +27

    This makes me feel better about pairing a used 3090 FE with the Ryzen 5 3600 w/ Prism cooler in my secondary gaming rig. Thanks Jay!

    • @rusudan9631
      @rusudan9631 1 year ago +3

      I set my 3600 at 2.2 GHz all cores; still hitting a stable 60 fps in 1440p gaming in RDR2 and Cyberpunk with a 3060 Ti Eagle.

    • @vincentvega3093
      @vincentvega3093 1 year ago +6

      @@rusudan9631 Clearly GPU limited 😂

    • @rustler08
      @rustler08 1 year ago +6

      If you're using a 3600 in games like Starfield or Cyberpunk, you should feel bad. You are leaving so much performance on the table, and I can tell you this considering I had a superior 3700X and swapped to a 7600X.
      Unless you're running lighter games, a 5600X3D or a 5800X3D would massively improve your gaming performance in Cyberpunk and Starfield.

    • @MeeMoo220
      @MeeMoo220 1 year ago +1

      @@rustler08 You’re right. Thankfully, I’m only playing BG3 and LoL at 1440p 144Hz. If I wanted to run Starfield I’d need to sell my kidney. Not interested.

    • @__-fi6xg
      @__-fi6xg 1 year ago

      Yeah, I remember playing For Honor on a new 6700 XT with an R5 2600 thinking, looks nice but something is off; and when I moved to AM5 with the same GPU and a 7600X, the 1% lows and all the hiccups were gone.

  • @platinumgrit
    @platinumgrit 1 year ago +1

    LOL I've literally bought additional iFixit toolkits because of Jay's ads 😆 it's how ads should be done!

  • @Mr_Jimbo
    @Mr_Jimbo 1 year ago +12

    I thought I'd defined the term "bottleneck" by testing my new 4080 in my old Q9300 system (2 cores, OC'd to a whopping 3.3 GHz), as it was the only case/system I had that fit the monster cooler of the 4080, but in pure GPU benching it doesn't seem to be massively affected compared to my 9900K system - about 500-750 points in a Furmark bench, for example.

    • @ward7337
      @ward7337 10 months ago +3

      Now play a normal game

  • @BeeWhere
    @BeeWhere 1 year ago +5

    The biggest bottleneck for me has always been my monitors. Going to 1440p 144 Hz was a game changer, but 165 or 240 Hz doesn't have the same impact. And 4K 144 Hz is still a bit out of my budget.

    • @dmytrosoboliev935
      @dmytrosoboliev935 1 year ago +2

      Yeah, I went from 1080p 60 Hz to 1440p 165 Hz recently and it's mind-blowing. Of course, only if you have at least 100 fps in games. At 60 fps it's not a big difference, pretty much the same, just a higher resolution. But in competitive titles 144 Hz or above is a game changer.

  • @watercannonscollaboration2281
    @watercannonscollaboration2281 1 year ago +14

    "Bottlenecking", the only word on par in terror with the phrase "future-proofing"

  • @JoeStuffzAlt
    @JoeStuffzAlt 1 year ago +5

    I once upgraded my GPU and my CPU was bottlenecking. However, the bottlenecked GPU was still overall faster than before. Of course, this was part of a phased upgrade. This was also one of those eras where I had to go from DDR2 to DDR3, which added more to the CPU upgrade cost.

    • @anhiirr
      @anhiirr 1 year ago

      For me the bigger fear was the operating voltage of DDR3: 1.65 V RAM vs 1.5 V. In that era it grew to be quite troublesome for a lot of end users, since a newer GPU also required a relatively ideal range of 12 V rail operation, paired with aspects like vdroop when running 1.65 V RAM - scenarios which fried/killed a lot of people's builds/systems, especially as they were upgrading their build little by little, like GPU/SSD etc., without realizing how integral a tight system is from an operating-voltage standpoint. That's what building a PC really grew into, even to this day.

  • @patricklee8552
    @patricklee8552 5 months ago +1

    The calling card for a CPU upgrade is when you buy a new game and all that loads is a black screen, or it just doesn't load at all.

  • @brokenmailman
    @brokenmailman 1 year ago +1

    When I first started to stream, the PC I used was an i7-7700 (non-K) with an RX 6600. The bottleneck was INSANE! Used to watch the framerate go from 120 fps to 30 fps, just all over the place.

  • @Ntouk
    @Ntouk 1 year ago +5

    This is why I love your videos, along with Steve's: proper execution with proper info and more analysis. Because of your help I could run my 3950X in the release period back in 2020. Well done Jay, you are AWESOME.

  • @KimBoKastekniv47
    @KimBoKastekniv47 1 year ago +5

    Your other bottlenecking video from 2019 was much clearer to understand.

  • @LukeTheJoker
    @LukeTheJoker 1 year ago +13

    Awesome video, I didn't realise how far you would have to go to cause a real bottleneck.

  • @akuma_soul
    @akuma_soul 1 year ago +1

    Rocking an old Gen 1 Threadripper and a 3090. After using my PC a LOT for 3D rendering and work, I noticed the following: in Cyberpunk, enemies just pop in when I arrive somewhere and can't be attacked... guess it's time to go for the Ryzen 7950X rig for the new update 👌

  • @mattnielsen7639
    @mattnielsen7639 22 days ago

    One of the BIGGEST improvements I finally figured out: NVIDIA Control Panel > Manage 3D Settings > Global Settings, enable the use of more than one CPU core, which basically means the CPU and GPU can work together better.

  • @kr00tman
    @kr00tman 1 year ago +8

    This is a fantastic video, and I learned a lot about CPUs. I can't believe I never thought to disable cores to turn a stronger CPU into a weaker one for testing. Thanks for this one!

  • @williamscott6209
    @williamscott6209 11 місяців тому +5

    Just recently upgraded my RTX 3060 to a Radeon 6950 XT. I was quite concerned that I'd be severely CPU bottlenecked since I'm running a 5600X, and I was thinking "damn, am I gonna have to get a 5800X3D to run this thing adequately?", but this video makes me think the bottleneck won't be nearly as bad as I thought. Hopefully my CPU can hold out for another generation or two until I end up upgrading the whole motherboard.

    • @BansheeNornPhenex
      @BansheeNornPhenex 10 місяців тому

      That's a downgrade...

    • @williamscott6209
      @williamscott6209 10 місяців тому

      @@BansheeNornPhenex How is that a downgrade? The 6950 XT is twice as fast

    • @unnamed715
      @unnamed715 5 місяців тому

      I'm running a 5600X with a 4070 and having a pretty good experience. In worst case scenarios I just have to tweak the settings a bit.

  • @mr.gamerchannel2970
    @mr.gamerchannel2970 Рік тому +145

    In short, the gamer's worst nightmare

    • @fififiz
      @fififiz Рік тому +12

      aside from showers

    • @Justlivin00
      @Justlivin00 Рік тому +10

      @@fififiz I love showers

    • @LilTachanka
      @LilTachanka Рік тому +20

      @@fififiz that's a redditor's worst fear

    • @drinkintea1572
      @drinkintea1572 Рік тому +4

      @@LilTachanka nah, that should be "responsibilities"

    • @iHaveTheDocuments
      @iHaveTheDocuments Рік тому

      ​@@fififiz What is a shower?

  • @GriffinCorpOne
    @GriffinCorpOne Рік тому +1

    "Ten lane highway going down to a single lane dirt road" LOL - I love this channel

  • @DigitalRecollections
    @DigitalRecollections 9 місяців тому +2

    Yeah, I had a 5700 XT with an i7 8700K. Upgraded the GPU to a 7900 XT and only saw somewhat of an increase in 1440p gaming, so I ended up with a new AM5 build and all is well now lol.

    • @AlejandroMagnoRivas
      @AlejandroMagnoRivas 5 місяців тому

      Maybe because ultra settings, or the latest heavy games like Hellblade 2 or Avatar, or anything with ray tracing, take all the power of your GPU.
      That Radeon isn't so much more powerful that it would expose a bottleneck.
      I've seen an i7 7700K maintain the frames with your card.
      The i7 7700K doesn't bottleneck a GTX 1080 Ti, but it does bottleneck a 2080 Ti a little, I think.

  • @Shadow0fd3ath24
    @Shadow0fd3ath24 Рік тому +17

    Too many people worry about this, it seems. Yes, it's a concern, but you almost have to try to bottleneck a system in gaming, especially on any Ryzen, or any mainline Intel i5/i7 chip from the past 5+ years.

    • @TheRealDlo
      @TheRealDlo Рік тому +1

      I think the point is that Synergy matters 👍

    • @orangeapples
      @orangeapples Рік тому +5

      Yeah. As long as things are of similar age and similar price tiers you're okay.
      Don't pair a Celeron with an RX 7900 XT. Don't pair a 7800X3D with an RX 6400.
      And people need to know there is no magic set of hardware that REMOVES bottlenecks. They don't realize that different software will use hardware differently.

    • @neiltroppmann7773
      @neiltroppmann7773 Рік тому

      Not entirely true. If your CPU and mobo can only do PCIe Gen 3 (which is Intel 10th gen and earlier), and you buy a lower-end GPU like the RX 6600 that only has 8 PCIe lanes, you will see stutters and lower fps than if you had Gen 4 x8 or Gen 3 x16.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 Рік тому

      That's not due to CPU choice, that's just the 6600 XT itself, and it would be fps, not stuttering. Plus the 6600 XT isn't fast or powerful enough to benefit much from Gen 4 PCIe at all; you lose MAYBE 5% in the actual tests I've seen. If you're on a 6600 XT you're already going to have stutters and lower fps. Plus the 6600 XT is a HORRIBLE buy: a 2080 Ti is a solid 65-80% better and 150 bucks USD less, and a 3080 is like 50 bucks cheaper and even better @@neiltroppmann7773

    • @HiluxForEveryone
      @HiluxForEveryone Рік тому

      @@neiltroppmann7773 Valid point, although the stutter part is a bit far-fetched, as from what I know the only thing that'd actually change is the framerate.

  • @NSA-admin
    @NSA-admin Рік тому +4

    Your ifixit spots are still the best in the business. XD

  • @constitutionalright827
    @constitutionalright827 Рік тому +48

    Love the video, Jay. I would have liked it better done with a basic card like a 2070... Basically, in line with your PC industry destruct video, I'd like to see a discussion like this on bottlenecking with mainline, average components like those in 90% of the PCs out there.

    • @julianvera1098
      @julianvera1098 11 місяців тому +2

      Agree with this, also because I use a 2070 Super lol. I would like to see a Ryzen 5600 or so, which is the most popular for most gamers, paired with a 2070, 3060, 3070, or maybe a 3080, or different cards with different CPUs.

  • @curbthepain
    @curbthepain Рік тому +1

    Man, that's (at the beginning, running overclocked) the smoothest I've seen a game run since I visited my local Milwaukee PC and played Borderlands 2 on an Nvidia-branded PC like 11 years ago.

  • @MrDarckGhost
    @MrDarckGhost Місяць тому +1

    4 GHz and 6-8 cores seems like the sweet spot, with 32GB of RAM (because Windows is a memory hog...)

  • @mackan072
    @mackan072 Рік тому +18

    I (briefly) ran an RTX 3080 coupled with an i7 4790K OC'd to 4.8GHz, and at 3440x1440 I was significantly more GPU than CPU limited in most games. Sure, upgrading to the 5800X (once it arrived, mine got delayed...) did improve performance, but typically only in 1% lows rather than "general performance" in most games. All in all, my ancient i7 4790K fared significantly better than I expected, given how huge of a hardware mismatch it is.

    • @TheModeRed
      @TheModeRed Рік тому +2

      I agree with the sentiment of confusion here. It's not just about bottlenecks. It's: how do I maximize the potential of a 3080 or a 7900 XT so that my older CPU really isn't a problem? Are certain areas of a game really CPU intensive (bottlenecked) regardless of settings? 3440x1440 is a good use case. You'd think it'd stay GPU bound, but lower a setting and maybe now you're CPU bound. But if you have everything on ultra, you get less fps. I think most YouTubers are almost getting there, but it's still confusing. The best recommendation I have is to turn on metrics and test every setting yourself to see where the bottleneck is when min/maxing for fps vs quality.
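      If your overlay can export a frame-time log, the averages and "1% lows" mentioned around this thread are easy to compute yourself. A minimal sketch in Python, assuming a plain text file with one frame time per line in milliseconds; the file name is hypothetical, and "1% low" here means the average fps of the slowest 1% of frames (one common definition of the metric):

      ```python
      # Minimal sketch: average FPS and "1% lows" from a frame-time log.
      # Assumes one frame time per line, in milliseconds.
      def fps_stats(path):
          with open(path) as f:
              frametimes_ms = [float(line) for line in f if line.strip()]
          frametimes_ms.sort(reverse=True)                 # slowest frames first
          worst = frametimes_ms[: max(1, len(frametimes_ms) // 100)]
          avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
          low_fps = 1000.0 * len(worst) / sum(worst)       # avg fps of worst 1%
          return avg_fps, low_fps

      avg, low = fps_stats("frametimes.txt")               # hypothetical log file
      print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
      ```

      A big gap between the two numbers is exactly the stutter people describe when a CPU can't keep up.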

    • @Gastell0
      @Gastell0 Рік тому +2

      I have a 4790K and 4K as well, and just upgraded to a 7900 XTX; I still don't get CPU bottlenecks that would warrant a CPU upgrade.
      I might even modify my Z97 motherboard to support ReBAR (there's a GitHub project for that) to get more performance out of that ol' reliable CPU!

    • @mackan072
      @mackan072 Рік тому +2

      @@Gastell0 I still found it worthwhile to upgrade my CPU though. There are some games where the CPU performance was subpar, and the upgrade to a better CPU made stutters less frequent.
      Just because I tended to be more GPU than CPU limited doesn't mean I was always limited by the GPU. Better CPU performance still improved the general gaming experience overall, more so in some titles than in others though.

    • @TheModeRed
      @TheModeRed Рік тому

      @@Gastell0 So say we're playing Starfield on ultra settings. We want more fps but not much loss in visual quality. Does the CPU start bottlenecking in the city no matter what, or just when we lower settings trying to improve fps? There seems to be a fine line between min/maxing and shifting the load back onto the CPU, which I think we will just have to figure out for ourselves on a per game, per area, per rig basis.

    • @Gastell0
      @Gastell0 Рік тому +2

      @@mackan072 That's definitely true. I was debating what to upgrade, MB+CPU+RAM or GPU, and went with the GPU as it provides the most gains (I changed from a GTX 1080).

  • @ThickpropheT
    @ThickpropheT Рік тому +12

    Thanks for sharing this! I've been meaning to test my 4770K and 1070 combo to see how much bottlenecking there is, but didn't really have a good idea of what I'd be looking for; this clears it right up. Gonna check that out later.

    • @fredEVOIX
      @fredEVOIX Рік тому +3

      If you have Shadow of the Tomb Raider, run the built-in benchmark at 1080p and it will tell you. Check GPU BOUND: if it's 99%, your GPU is limiting the game; if it says 0%, it's your CPU. This will tell you the balance of your setup.
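      For games without a built-in benchmark, you can approximate the same check by watching utilization while you play. A rough sketch, assuming the third-party packages psutil and pynvml (pip install psutil nvidia-ml-py) and an NVIDIA card; the thresholds are rules of thumb, not hard facts:

      ```python
      # Rough sketch: sample CPU and GPU utilization to guess the limiter.
      import psutil
      import pynvml

      pynvml.nvmlInit()
      gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

      for _ in range(30):                                # ~30 one-second samples
          per_core = psutil.cpu_percent(interval=1.0, percpu=True)
          gpu_util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
          print(f"GPU {gpu_util:3d}%  busiest core {max(per_core):5.1f}%")
          # GPU pinned near 100% -> GPU-bound. GPU well below 100% while one
          # core sits near 100% -> likely CPU-bound (single-thread limited).

      pynvml.nvmlShutdown()
      ```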

    • @ThickpropheT
      @ThickpropheT Рік тому

      @@fredEVOIX Thanks for the tips. Cheers!

    • @m8x425
      @m8x425 Рік тому +1

      My brother had a system with a stock 3770k and upgraded to the GTX 1070 back in 2017. He got a couple years out of that setup and he had no complaints. He did upgrade to a 9900k, but not for the sake of gaming performance.

    • @ThickpropheT
      @ThickpropheT Рік тому

      @@friendlysloth hmm. I see. I guess I know what to expect now lol

    • @illustriousinc8608
      @illustriousinc8608 Рік тому +1

      @@ThickpropheT I'm still running a 3770k with a 1070 and there is nearly no bottlenecking. At best like 2-3 % depending on the game/task, so it's not noticeable.

  • @cncisrobot6354
    @cncisrobot6354 Рік тому +5

    CPU & GPU bottlenecking is not having the money to be able to buy either of them!

  • @kieferonline
    @kieferonline Рік тому +1

    Ha! I love the iFixit commercial!! So funny. It's clear someone had a good time making that.

  • @ANovaMaquinadoTempo
    @ANovaMaquinadoTempo 10 місяців тому +1

    I have an RTX 4080 on a 12400, with 32GB of RAM (2x16GB 3200MT/s). But I play mostly at 4K on a TV with a 120Hz refresh rate. Most of the time the GPU is at or close to 100%. It only drops when I use DLSS, and even then most games run at 70fps or higher.

  • @RysenKai
    @RysenKai Рік тому +4

    I don't have any bottlenecking, because I just put all the iFixit pieces into my case.

  • @Redsfanatic32
    @Redsfanatic32 Рік тому +4

    I was a little worried about an 11700K paired with a 4070, but it seems my worries were misplaced.
    My biggest mistake was making my secondary storage drive a hard disk instead of an SSD.

  • @AmritpalMoga
    @AmritpalMoga Рік тому +8

    How big of a bottleneck is the RAM speed? Would be interesting to test.

    • @Cravenfr
      @Cravenfr Рік тому

      imo, a lot of testing for not-huge differences, since there are tests out there where you see like +3 fps, but the RAM kit also costs 3 times the baseline kit 😂

    • @jordanlazarus7345
      @jordanlazarus7345 Рік тому

      It depends. It can make a massive difference, but that depends on how high the speeds already are and what CPU you're using. Sometimes a shitty RAM config can literally cut your performance in half, other times it's not the limiting factor.

  • @Lunar_Valkyrie
    @Lunar_Valkyrie Рік тому +1

    I'm still just as confused as I ever was. Maybe moreso now.

  • @ether49
    @ether49 10 місяців тому +1

    6:18 looks like you put away your sword and then pointed with your other hand

  • @CampamentoUL
    @CampamentoUL Рік тому +3

    Really good video, man. I've been interested in PCs since I was a kid; I built PCs at 16 and 19, and now I'm building my third at 25. When I built those I thought I knew everything, but now I have more perspective. Watching your videos I'm learning a lot, and I'm only a motherboard away from my next build. Really happy with my component decisions. Thank you for sharing your knowledge!

  • @TheGamefaq
    @TheGamefaq Рік тому +4

    Jay, do that with Starfield. It also requires a powerful CPU! The changes to the clock speed and/or number of cores would become noticeable much earlier and more strongly.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ Рік тому

      @@CarlosXPhone Starfield does require a powerful CPU. My 5800X3D sits at 50-80% in New Atlantis at 1440p max lol. ON ALL CORES.

    • @TremorX
      @TremorX Рік тому

      @@Pand0rasAct0r_ 5800X3D and 4090 here; it's not a hardware problem. It's 100% Starfield being a turd. Some of the INI file tweaks prove just how full of it Todd is about optimization. I get the feeling all they did was "make sure the game loads" and called it a day.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ Рік тому

      @@CarlosXPhone Mate, this game simulates thousands of objects in real time, even more than back in Skyrim or Fallout 4. Yes, it's Creation Engine jank as well, but damn did they improve its capabilities. It could be better optimized for sure, but it's simply wrong to say it's not a "powerful" game, because it very much is.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ Рік тому

      @spaghebbio It is doing that, though. Unless you are in a specific cell like a house, the whole area you landed in is loaded. All of it xD

    • @micb3rd
      @micb3rd Рік тому

      @TheGamefaq
      I guess you missed my post on Jay's "Starfield is the new Cyberpunk" video (it's at the top). I already tested this to a much deeper level, across multiple dimensions.
      I did significant testing/benchmarking to understand how this game scales with the CPU under different limits. Computer: Intel 13th Gen 13900K, 32 gigs of DDR5 @ 7200 MT/s, and an Nvidia 4090 MSI Suprim X on Windows 10. Below are the results.
      I found a good CPU-limited situation in New Atlantis near the tree just after you walk up the stairs (right near where Jay and Phil were testing). There may be a tougher area later in the game, but this worked well for testing.
      The game was set to 1080p with HUB optimized settings, FSR2, and 50% resolution scale, so the 4090 sat at 63%-73% utilization; that told me the run was CPU limited (the CPU was the bottleneck for fps).
      Test set 1 - DDR5 memory scaling (P cores fixed at 5.5 GHz):
      DDR5 @ 7200 MT/s = 115 FPS (baseline)
      DDR5 @ 5600 MT/s = 105 FPS (22% lower memory clock, 9% lower performance)
      DDR5 @ 4400 MT/s = 96 FPS (39% lower memory clock, 17% lower performance)
      DDR5 @ 2933 MT/s = 76 FPS (59% lower memory clock, 34% lower performance)
      This shows there is some scaling with memory, but not one to one. It also shows you can cripple a 13900K with slow memory; those hungry cores need feeding with good fast RAM.
      Test set 2 - CPU clock rate scaling (E cores off, DDR5 fixed at 7200 MT/s):
      P cores @ 5.7 GHz = 117 FPS
      P cores @ 5.5 GHz = 115 FPS (baseline)
      P cores @ 5.0 GHz = 105 FPS (9% lower clock, 9% lower performance)
      P cores @ 4.5 GHz = 100 FPS (18% lower clock, 13% lower performance)
      P cores @ 4.0 GHz = 90 FPS (27% lower clock, 22% lower performance)
      P cores @ 3.5 GHz = 80 FPS (36% lower clock, 30% lower performance)
      P cores @ 3.0 GHz = 70 FPS (45% lower clock, 39% lower performance)
      This shows good scaling with CPU clock rate: dropping the clock cost well over a third of the fps. It would be interesting to see whether 6.5 GHz continues to scale; I'll leave that to someone with LN2 (Gamers Nexus).
      Test set 3 - E core scaling (E cores on vs off):
      P cores @ 5.5 GHz + E cores @ 4.2 GHz, DDR5 @ 7200 MT/s = 125 FPS
      P cores @ 5.5 GHz, E cores off, DDR5 @ 7200 MT/s = 115 FPS (about 9% more performance with E cores on)
      I checked the E core load and the game was loading them over 55%, so good utilization, and they took some of the workload (15%-20%) off the P cores. They work well in this title.
      Note: turning off Hyper-Threading on the P cores actually lost me 10% performance.
      I hope you find this interesting!
      I'm not sure they can optimize the CPU load much further, as CPU core utilization is already very good in Starfield. Perhaps they can trim some specific workloads with some effort.
      Yes, the 13900K and 4090 are well matched. The funny thing is that the recent ReBAR profile pushed the game to be CPU limited on the 4090 (you get 5-10% more fps with ReBAR on), so the fix is to turn up graphical settings, like resolution scale to 68%, to load the GPU up to 98%.
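      As a sanity check on the data above, the scaling can be recomputed from the raw (clock, FPS) pairs; a ratio of 1.0 would mean fps falls exactly as fast as the clock. A small sketch using the memory-scaling numbers from this comment:

      ```python
      # How much of each memory-clock drop shows up as an fps drop.
      # (MT/s, FPS) pairs taken from the comment above; 7200 is the baseline.
      samples = [(7200, 115), (5600, 105), (4400, 96), (2933, 76)]
      base_clock, base_fps = samples[0]

      for clock, fps in samples[1:]:
          clock_drop = 1 - clock / base_clock
          fps_drop = 1 - fps / base_fps
          print(f"{clock} MT/s: clock -{clock_drop:.0%}, fps -{fps_drop:.0%}, "
                f"scaling {fps_drop / clock_drop:.2f}")
      ```

      The ratio climbing from about 0.4 to 0.6 as the clock falls matches the comment's conclusion: memory scaling is real but nowhere near one to one.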

  • @Amin_2k
    @Amin_2k Рік тому +6

    Exactly! I was pressured into changing my i7 4770K to a Ryzen 5 5600 (and I am still happy I did, because I can run Windows 11 now), but in terms of gaming the difference I noticed was barely noticeable. Gaming is mostly GPU bound, and CPU bottlenecks seem very rare, even with a 9-year-old CPU. I must add that if you play at 1080p you might notice a bigger difference, but at 1440p or 4K the older CPUs are still amazing.

    • @secondc0ming
      @secondc0ming Рік тому +9

      If you didn't see any difference in games going from a 4770k to a Ryzen 5600, you either play old games or you have an old GPU.

    • @HackedGlitch265
      @HackedGlitch265 Рік тому +4

      I think that stems from the fact that while GPUs have heavily increased in power, higher resolutions also require vastly more power to render.
      So for 1080p, which was once high end, GPUs can now scale that mountain, but CPUs are slower, so they struggle. Push up to 4K and, while the GPU is faster, it ends up scaling a sheer cliff while the CPU climbs merely a slope, and they reach the top at around the same time.

    • @bigdaisy19k
      @bigdaisy19k Рік тому

      @@HackedGlitch265 Best description ever. 👍

    • @lilpain1997
      @lilpain1997 Рік тому +4

      Yeah, I really want to know what GPU you were running to get barely any gains going from a 4770K to a 5600. I went from an i7 3770K (not far off the 4770K at all) to a 3600, then a 5800X3D, and the jumps were massive. I play at 3440x1440 and noticed the jump mostly in 1% and 0.1% lows. Averages really don't matter that much, as you can get decent averages out of older CPUs, but your 1% and 0.1% lows are much worse, and if you play anything CPU limited, like Satisfactory or games like that, you will notice a massive jump.

    • @Amin_2k
      @Amin_2k Рік тому +2

      @@secondc0ming I have a 2070 Super, and played newer games (Watch Dogs Legion, Forza Horizon 5, Metro Exodus). I have benchmark comparisons on my channel and the difference is not noticeable, at least not at 1440p. My question to you: did you ever pair an older CPU like the 4770K with a medium/high-end modern GPU, or are you just speaking from what you think should happen?

  • @lukasholly5691
    @lukasholly5691 Рік тому +1

    My i7 7700 + 3080 is still running fine at 2K. The CPU hits 100% sometimes, but the GPU rarely goes under 90%, usually running at 100% utilization.

  • @tbas8741
    @tbas8741 9 місяців тому +1

    Great information, very informative.
    Luckily Nvidia Fast Sync is AWESOME!
    Monitor refresh rate has no impact on performance.
    I run my IL-2 flight sim at 80-120fps with Fast Sync on a 60Hz screen with no tearing; Fast Sync happily lets the game render even 240fps on a 60Hz screen.
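    One caveat worth spelling out: a 60Hz panel still only displays 60 frames per second no matter how fast the game renders; what Fast Sync buys you is that the newest completed frame is the one shown, so a higher render rate mainly means fresher frames and no tearing. A back-of-the-envelope sketch of that trade-off (a simplified model that ignores scanout timing):

    ```python
    # Simplified model: what a 60Hz panel shows when the game renders faster.
    refresh_hz = 60
    for render_fps in (60, 120, 240):
        shown = min(render_fps, refresh_hz)       # panel caps displayed frames
        dropped = render_fps - shown              # rendered but never displayed
        frame_age_ms = 1000 / render_fps          # max age of the newest frame
        print(f"{render_fps} fps render: {shown} shown/s, {dropped} dropped/s, "
              f"newest frame at most {frame_age_ms:.1f} ms old at refresh")
    ```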

  • @sterilyte
    @sterilyte Рік тому +4

    Hey Jay, I love what you're doing here, but I think a better example would be doing this with real hardware, like putting a first-gen Ryzen (or even an FX chip) with the 4090, and doing the reverse, using something like a 1060 with a modern CPU. (I say Ryzen/FX because I don't know Intel chips that well.)

    • @nhf7170
      @nhf7170 Рік тому

      I had an FX chip with a 1070. Before that an HD 7950. I don't think either ever saw usage over 75%.

    • @sterilyte
      @sterilyte Рік тому

      @@nhf7170 I used an FX-8350 with an AMD 480 and it was a pretty good match, though the CPU was the bottleneck more often than the 480 was.

  • @UserNamesAreObsolete
    @UserNamesAreObsolete Рік тому +15

    Hello Jay,
    thanks for your video. People can now identify a bottleneck once they have built a PC, but how can people identify a bottleneck before buying new components?
    When my 1070 broke, it took me quite a lot of time to decide which GPU to buy next, since my system was already built, basically.
    I already had a 5600X and was waiting for better GPU prices before getting a new card.
    Well, after checking a lot of videos on pricing and performance, I decided to buy a 6800XT for 1440p gaming.
    Later on I asked myself if I had created a bottleneck for myself.
    Could you make a video on recognizing the potential for a bottleneck on paper, before you buy?
    Can you see which CPU / GPU are a good match just by comparing their technical data, and if so, how?
    I believe such a video would be a real help, which people could use as a rough guideline on what to look out for.
    Old rules like "the higher the frequency, the better the CPU" don't apply any more, and neither can you tell the best match just by looking at a CPU's core count, else any EPYC would always be the best CPU for everything.
    What GPU would you pair with an R5 7600X? A 7800XT? 7900XT? 4070?
    And if you paired this setup, which PSU would you pick? 650 W or 750 W?
    What would change in your selection if ray tracing was considered unnecessary?
    Thanks in advance.

    • @michisauer
      @michisauer Рік тому +4

      It's quite easy to sanity-check even before buying a card:
      Hard bottlenecks should not happen anymore as long as you have at least 8 cores.
      Even a 6-core CPU running at high clocks will seldom drop you to framerates that count as a bottleneck in the fullest sense.
      Best advice is:
      running a midrange CPU -> buy a midrange GPU.
      Pairing a 5600X with a 4090 is not a good choice.
      Funny thing is:
      any 8-core CPU nowadays (if it's 8 performance cores) should be safe from forcing you into any bottleneck.
      So, best advice in the end:
      if you have a CPU, read the reviews of how it performed with the GPUs that were out at the time. Then find a GPU test that includes the cards you want plus one of the cards from the CPU test.
      Check the performance difference between the cards.
      The trick is that the CPU test will usually include the best-performing card of that generation. If the CPU could run that card without limiting it, any card with equal or slightly higher performance won't cause the CPU to bottleneck, and most times even cards with much higher performance will only be slowed down a little.
      Real bottlenecks occur only when the CPU or the GPU can no longer handle the workload and they have to wait for each other.
      Therefore: buy accordingly and get your game settings right. Often, when you run into bad frame times, increasing the resolution or using GPU-intensive settings will lower the average fps a bit but smooth out frame consistency, which in the end gives you the better experience.
      I know this answer is long and hard to read, but a list of which GPU to buy for which CPU would take years to make, just because there are so many combinations out there.
      Plus, you would need to do it for every resolution and a long list of games to get consistent results.
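      The rule of thumb above boils down to a toy model: the frame rate you see is roughly the slower of what the CPU can prepare and what the GPU can render. A sketch with made-up illustrative numbers (not benchmarks):

      ```python
      # Toy model: delivered fps is roughly min(CPU-prepared, GPU-rendered).
      # All numbers below are invented for illustration, not measurements.
      pairings = {
          "midrange CPU + midrange GPU": (140, 130),   # balanced
          "midrange CPU + flagship GPU": (140, 300),   # flagship mostly wasted
          "flagship CPU + midrange GPU": (280, 130),   # fine; upgrade GPU later
      }
      for name, (cpu_fps, gpu_fps) in pairings.items():
          fps = min(cpu_fps, gpu_fps)
          limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
          print(f"{name}: ~{fps} fps, {limiter}-limited")
      ```

      It's crude (real games shift between the two from scene to scene), but it shows why pairing a 5600X with a 4090 buys far less than the price gap suggests.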

    • @tool46296
      @tool46296 Рік тому

      Google bottleneck calculator. There is a nifty website that lets you choose a CPU, GPU, what you’ll be using them for, and at what resolution.
      Check it out. 👍🏼

    • @UrBeastyBalz619
      @UrBeastyBalz619 Рік тому +2

      Honestly, you have a very good pairing with the 5600X and 6800XT. As for bottlenecking on paper, it really comes down to the individual game you're looking at playing. Some are CPU heavy and others GPU heavy.
      Hypothetically, in a GPU-heavy game you could get similar performance from your 6800XT whether it's paired with a 5950X or your 5600X. But the next game may play way better with the 5950X if it's CPU heavy.
      The argument could be made that in your previous pairing, the 1070 and 5600X, your CPU was held back by your GPU. That's where I'd want to be, btw, not the other way around. But I digress.
      I don't upgrade anything until I see that something in my system isn't providing the level of performance I'm looking for. Then I look at deals and wait for the best bang for buck, kinda like you did with the 6800XT.
      I just upgraded from my 3300X to a 5600X because I started playing games that needed more CPU to push frames to my GPU fast enough (GPU at 60% util). Before that I never needed a faster CPU, just a GPU.
      To touch on your PSU question: I'd go with the 750W, and I'd even push for a good 850W. Cards are getting more and more power hungry, so when you upgrade again in the future you don't want to be buying ANOTHER PSU just to power the next GPU. Buy one that's good now so you can slot in a new GPU later; your PSU is a component that can grow with your rig.
      And for your 7600X pairing question: I'd buy the best GPU for your budget, even if your CPU bottlenecks it a bit in specific titles. Your GPU's potential isn't going anywhere. AMD will be releasing another gen on AM5 (fingers crossed it's the same story as AM4), so you can upgrade the CPU later and unlock that potential when you need to.
      Buy what you can afford and have fun gaming!
      But I'm just a dude with his own opinions, and I'm sure someone will shoot me down 😂
      I hope this helps. Cheers.

    • @imo098765
      @imo098765 Рік тому

      Hardware Unboxed and other benchmarking tech channels post CPU scaling videos every few months; that's the best place to look.

    • @UserNamesAreObsolete
      @UserNamesAreObsolete Рік тому

      @@imo098765 Thanks for the answer, that's how I decided on my GPU. I got it for 509 (the average price for a 6800XT was 560 Euro, a 6950 is 640 Euro). But it took me some time to find a channel providing the necessary information. I could have used a video as a guide for it, including possible bottlenecks for certain CPU / GPU combos.

  • @dudenda757
    @dudenda757 Рік тому +14

    Awesome info here. Thanks Jay! I would love to see the same kind of testing done with an AMD CPU/GPU combo though.

  • @EmptyPokeball
    @EmptyPokeball 10 місяців тому +2

    These videos are so entertaining and informative

  • @reinhardswart753
    @reinhardswart753 4 місяці тому

    Recently upgraded my old i5 3rd gen to a i7 10th gen and gained anywhere from 40-50 fps on my GTX 1070 in more modern games. After seeing it in action, I finally understood what bottlenecking is.

  • @TheRogueWolf
    @TheRogueWolf Рік тому +5

    And, of course, bottlenecking is also going to vary depending on which game you're playing. So forget about building the perfect system that plays everything at 100% efficiency, because there's no such thing.

  • @nadkudo1798
    @nadkudo1798 Рік тому +4

    It'd be interesting to see how this translates to actual different, older CPUs: going to a real 4c/4t 3.2GHz part and seeing how it works with a bunch of newer GPUs, or the other way around, since there seems to be a misunderstanding about how much mixing generations affects performance...

    • @RobBCactive
      @RobBCactive Рік тому +1

      Jay's example showed clearly that single-thread performance matters a lot. I had to use an APU as an interim measure while waiting for a semi-sanely priced GPU, and while it wasn't ideal, gaming was a lot better with a real GPU, bottlenecks or not. The later CPU upgrade to Zen 3, after prices fell, delivered on stability and game performance, but it didn't make things playable that weren't before.
      So I would live with bottlenecking again when doing phased upgrades; it's far cheaper and easier than building a whole new system.

  • @Intrepid17011
    @Intrepid17011 Рік тому +5

    Currently bottlenecking my 7800X3D with a 6700 XT, upgraded from an i5 6600K.
    I chose to do it that way since I plan to get a new GPU anyway.
    Also, that way my CPU won't bottleneck my next GPU, which needs to be more powerful than my 6700 XT.
    Also, there are so many titles that are SO CPU dependent, like Star Citizen (which I play a lot).
    And I feel like a GPU bottleneck is not as bad as the other way around:
    if your CPU is too slow for a title you can't do much, but if your GPU is too slow you can turn down the settings.

    • @insertnamehere4419
      @insertnamehere4419 Рік тому +2

      lol stop using the word bottleneck. Your CPU is not "bottlenecked" by your GPU.

    • @jiboo6850
      @jiboo6850 Рік тому

      Huh? What?? I think your PC is a trash can that needs to be emptied. No way in a million years would your PC bottleneck in a normal "good" situation. Something is wrong with your PC.

    • @insertnamehere4419
      @insertnamehere4419 Рік тому

      @@jiboo6850 There is always a "bottleneck". You are either CPU limited or GPU limited. You can also be IO limited. People need to just stop using the word. Pair a good CPU with a good GPU and you'll be fine. You will benefit from a better CPU when CPU limited and a better GPU when GPU limited. You'd have to have a really low-end CPU to literally "bottleneck" your GPU.

    • @jiboo6850
      @jiboo6850 Рік тому

      @@insertnamehere4419 Yes, but there's a difference between a bottleneck that goes 3 lanes down to 1 vs 10 lanes down to 1. The first one is easier to deal with, just by finding balanced settings that ask an equal amount of work from both, and it's not difficult to figure out. What I meant in the first comment is that his config is high enough to pull Starfield easily; he just hasn't found the right settings to let it express itself at its best.

    • @HiluxForEveryone
      @HiluxForEveryone Рік тому

      GPUs do not hold CPUs back. Stop using that term for that instance.

  • @urmask7264
    @urmask7264 Рік тому +1

    You've said it before: there's always a bottleneck.
    This video is a good example of why one should not pay X00€ extra on a GPU, but divide that between CPU and GPU, and sometimes even RAM, to get better performance.
    I would've liked to see the graphics set to high, to put more work on the GPU and see how that affects fps. Maybe another video?

  • @Zemnexx
    @Zemnexx Рік тому +1

    I recently had to upgrade my entire setup because I was hitting a major bottleneck on a 4090 I bought. I had the same setup for years; the system was originally built with already-older parts in 2020, and the only things I upgraded for a few years were the GPU and some M.2 SSDs. I was running an i7 6700K, which was not terrible for its time, but definitely old and slow by today's standards. I didn't realize how bad the bottleneck was until I started looking at benchmarks for games I was playing, such as Baldur's Gate 3, and saw people hitting well above 100 FPS on a 4090 when I could only manage about 60-70 FPS (on an ultrawide monitor, max settings), even dipping into the 40-50s in complex scenes. At that point I knew it was time for a full system upgrade to keep up with my GPU. Bit the bullet and replaced everything with the latest stuff, now running an i7 13700KF. Man, what a difference: instantly 150-200 FPS at ultrawide with everything maxed. Just an example of how badly bottlenecking can affect your performance.

  • @biggil4
    @biggil4 Рік тому

    Your worst bottleneck scenario is my life with the Ryzen 7 2700X/2080 while playing Cyberpunk.
    This made a lot of sense. Thanks, bud!

  • @rexyoshimoto4278
    @rexyoshimoto4278 Рік тому

    I did my last overclock bump on the CPU this year. My rig has finally reached the age at which I'd hoped to replace it, and it's running hotter to keep up in newer, more fps-hungry games. It's an Intel i7 8700K, 32GB 3200 RAM, Nvidia RTX 3080 Ti. Had two other GPUs before (a Vega 64 and a 2080 S), two years each. But for the 6 years of pleasure I've had with this rig, it was a monster. I'm gonna retire it as my daily workhorse. Love this machine.

    • @vegg6408
      @vegg6408 Рік тому +1

      My PC died a week ago, and it was an i5 2400 and a 1050 Ti.

    • @rexyoshimoto4278
      @rexyoshimoto4278 Рік тому

      @@vegg6408 You kept it going that long? That's like grandma's car that she putts around in forever. That's great. 2012. I have an old Dell Studio XPS 435MT in the garage. It runs, but it's slow as a rug. Intel i7 920, a first-year i7, the smallest one. Came with an AMD HD 4850 2GB. The last GPU it had was an Nvidia GeForce GTX 970 SS; it's still in the PCIe slot. 😀

  • @son2soaringeagle
    @son2soaringeagle 11 місяців тому

    This was even more detailed than it needed to be lol. It's very simple: nowadays it has to be very dated hardware, or old hardware combined with new and/or high-performance hardware.

  • @persistentpally
    @persistentpally 21 день тому

    Skyrim Legendary Edition and Special Edition require a display mod to go over the 60fps limit. There are also ENB and graphics mods heavy enough to tank any RTX card with less than 12GB, possibly 16. It's always a labor of love trying to get the most out of the hardware-vs-game battle.

  • @Chris-techgamesfood
    @Chris-techgamesfood 6 місяців тому +1

    At 1080p in Starfield with an RTX 2080 and i7 9700F I don't see any bottleneck.
    The CPU is getting on now, but it's still viable; I can run a lot of games at 120fps and not notice any drops.