The RTX 4070 Ti CPU Bottleneck - How Much? Compared to the RTX 3090.
- Published 26 Mar 2023
- How much of a CPU Bottleneck is there? Do you need a new CPU to get the most FPS from your RTX 4070 Ti? What "old" CPUs are still good? What about the RTX 3090? Will a newer CPU get you more FPS?
#4070tiCPU #CPUBottleneck #RTX3090CPU - Science & Technology
5 years ago?! Oh my poor i9 9900k.
It's hard to believe some people build new PCs every five years.
I had a 3600x and 2060 for 1080p gaming and everything was smooth. I then upgraded to a 3080 for 1440p and some games gave me bad frame drops. I ended up upgrading to a 5700x and my 1% lows and averages went up. I will stick with this combo for a while and wait for the next gen gpus and cpus to release
Tyvm for the vid. Been running a 2070 & 9900K in my HTPC and I wanted to upgrade the GPU to a 4070 Ti but was slightly concerned about a CPU bottleneck. Nice to know that the bottleneck is negligible for as old as the 9900K is. I never run ultra settings, typically a mix of High & Med at 1440p, and my LG display is capped at 120fps anyway. I just don't bother w/ 4K because the PPI on the LG displays is so good I don't feel it's necessary for the cost of power and heat. So a nice smooth 1440p 120fps w/ low power consumption is a nice balance for my use case.
The 9900K still can get it done in 2023. I agree, 1440p/120fps is the best place to be since chasing 4k will cost big money going forward.
I needed this video a couple of weeks ago. When I got my 4070ti I had a 5700G and noticed that whilst in most games I got a huge jump in performance, I noticed that in Cyberpunk and The Witcher with heavy RT it wasn't performing as expected despite CPU usage showing around 40%. I picked up the 5800X3D and was blown away as the Cyberpunk benchmark at 1440p RT Ultra DLSS Auto went from 76fps average (97 max) to 100 fps average and 127 max, in many like for like scenes I was gaining over 50% performance and the same for The Witcher 3
But what else does it tell you?
Buying a GPU is an easy install and it doubles the framerate.
Buying a whole system for about $680 costs you between 3 hours and a weekend and delivers only 20-30 frames more, not even 50%+.
@@memoli801 Pretty sure it was just an easy CPU upgrade for him as well; as far as I know you can use the same motherboard and RAM with both CPUs, and the GPU costs $800-900 while the CPU is way cheaper. In the end you need the whole PC at around the same performance level, and you've got to know what you're doing with your PC: which games are played, are they heavy on the CPU or on the GPU, and so on.
How much FPS do you get in Cyberpunk with RT enabled?
Learn to overclock your ram.
@@abdullahdanze2061 for? 2% gains?
I just built my new PC and skipped over the 7950X3D in favor of the normal 7950X instead. Not too crazy about the way you seem to have to jump through hoops to "sometimes" see extra performance in "some games". I have my 7950X paired with a 4070Ti and couldn't be happier with my system performance.
I might have to get that cpu over the 7800x3d :/
The X3D will perform better
The focus of these graphics cards is on ray tracing. I think you should consider that. The 60 RT cores should also be used to get a decent result.
Recently bought a 5800X3D and want to pair it with a 4070 Ti if its price comes down. Great video.
Is the 5800X3D the best for the 4070 Ti? I'm also thinking of getting that.
@@ramprasad9277 It's really good. I would recommend it if you have AM4 or want to save a bit of money over the 7800X3D.
@@ramprasad9277 just don't forget that this CPU is very hot, I had to upgrade my cooler
What did you upgrade from and to, and what were the temperature drops, please? @@makiroll6815
Just bought a 5800x 3D as well. I'm waiting for prices for GPU's to drop as well.
As pointed out by Hardware Unboxed, the 7950X3D is slower because Windows doesn't properly select the correct chiplet to use. Instead of selecting the 3D V-Cache chiplet, it selects the other one. For this reason alone, the 7800X3D is a better choice, offering more performance at a lower price.
Is it bad to buy 4070ti for 1080p gaming?
@@Zelchinho depends on the rest of your setup. If you have a high end cpu and a monitor with 165hz or higher you should be good
My build is 4070ti, potentially 7800x3d, 32gb 6000mhz CL30-36-36-76 1.35v, b650 aorus pro elite ax ice edition
cpu will be water cooled with a kraken x53
It's the same scheduling problem Intel 12th gen had with its then-new P&E-core split architecture. It's only some games, and it'll be fixed with a Windows scheduler update.
4080 coming down in price, but I still lean towards the 7900 XTX.
i3 13100F. I wouldn't worry about GPU scaling on an old game. 100+ FPS is plenty. You could certainly use a 4090 with the i3 and have zero CPU bottlenecks at 8K.
I'd like to know how +3 fps equals a 27% GPU bottleneck out of 160+ fps?
Even in all the other FPS figures listed, it's nowhere near 10%.
Just got a Gigabyte Aorus WaterForce 3090 and it's a beast. Obviously I paid a lot less for it as it's an ex-mining card, but it's still performing great 👍
Thank you for your work
It's the same problem Intel had when they did their big.LITTLE design: scheduling is the problem.
Turn off the die in the BIOS that has no V-Cache and see your FPS fly.
I see that it tends to vary depending on the game and how well its engine can utilize the CPU cores. Some games can spread the load over 8 cores while other games will only fully utilize 1, 2, or 4 cores. So in the instance that the game engine isn't spreading the workload over all the cores, it requires a CPU with a faster single-core speed to get the best FPS. One really easy example of this is RuneScape on the low graphics preset. It's a very old game engine that can't take advantage of all the CPU cores. The 8-core 16-thread 11900K would get hundreds more FPS than the 9900K even though they both had the same core count that the game engine couldn't fully use. The faster single-core speed is what gave the biggest performance improvement. To be clear, I'm not saying all games behave this way, just that it's something I notice a lot in the games I tend to play. The single-core speed of my old 5900X was a bottleneck at times, and with my current 12900KS system I don't see the same CPU bottleneck anymore.
I still have my older 9900k system and it performs very respectably in games that can use all the cores. I am curious to try the new 3d v-cache CPU's but I'd like to wait until the single CCD 7800x3d comes out because I think that one will be better for gaming because it shouldn't have the latency and scheduling problems like 7950x3d with its 2 CCD design. Just my thoughts.
There is no latency. The damn game will only run off of one CCD if it only uses 1 CCD, and 95% of all games use 8 cores or less!
@@cdiff07 Up to 16 threads to be correct, cause my 5600X was lacking in new games like Control.
@@kevinerbs2778 Hyperthreading isn't really a big factor in gaming. The best example is actually the 9700K vs the 9900K in gaming: both 8 cores, around the same clock speed, same architecture, same performance in gaming. In some games the 9700K was even a tiny bit better because some games seem to struggle with Hyperthreading. So your 5600X was struggling because it's only 6 cores, not because it has only 12 threads.
@@kevinerbs2778 I have an i5-11600K paired with an RTX 3090. I can definitely tell that the 6 cores are sometimes causing an issue, but usually I just leave my games at 60fps. Eventually I'm going to upgrade to an i9-11900K once the prices drop more.
I have a 5950X with a 7900XT, which can be considered equivalent to or a little better than a 4070 Ti in rasterization performance. After testing with more recent and demanding games such as Dead Space Remake, Far Cry 6, Sons of the Forest, Kingdom Come: Deliverance, Doom Eternal, the Resident Evil 4 Remake demo, and Hunt: Showdown, I consistently get 95-100% GPU usage at 1440p max settings, so I would imagine it to be not far off with the 4070 Ti. 50-60% GPU usage is rather extreme; either it's just the way this game is or there is something not quite right with your setup. Update: also just tested the 5950X/7900XT with SOT and again got 100% GPU utilization, so I'm convinced it's something to do with the configuration of your rig.
Radeon GPUs in general have better GPU utilization compared to Nvidia.
I have an I7-8700k and a 4070ti.
With max settings at 1440p with no DLSS on, I get 27% GPU bound as the results.
Do you have any issues with 1% lows on your 8700K?
Is that good?
Because I want to buy an i7 11700K with a 4070. Does it have a bottleneck?
@@doler2570 i would recommend going to at least a 12th gen cpu if you are building new. More of an upgrade path and prices have dropped a lot lately
I just got a 7900XTX for my 3700X. Funny thing: in Hunt: Showdown on Low settings I have almost the same FPS as with my old ROG 3060Ti. And I also get the same FPS in Hunt on Low and on Ultra settings; the difference is minimal. So, okay, now I can play every game on Ultra but I'm not benefiting from "any" FPS boost.
Can you confirm my results here? Does it make sense? Because I was wondering if my ASRock 7900XTX with its super loud coil whine is a bad unit.
Also my results in SOTR: 3700x + 7900xtx + 32GB 3600Mhz CL16 + 3440x1440p 21:9.
- approx. 18,569 points, 118 avg. fps, 0% GPU Bound.
I don't want to invest in AM4 anymore :(. I'm waiting for next-gen CPUs :(
The 3700x which is based on Zen 2 Ryzen architecture released in July 2019 and will bottleneck that XTX. The SOTR benchmark is the big indicator with 0% GPU bound. I also would not invest in AM4 at this point as even Zen 3 can bottleneck at less than 4k resolutions. I don't think you need to wait for next gen CPUs. You can get an AM5 motherboard today and it will be good for a couple of years including Zen 5 coming next year. If I were in your position, I would wait for the holiday Black Friday sales and look for a CPU/MB/RAM combo deal. Even a Ryzen 5 7600 would be much better. Then upgrade to the next gen CPU I really want next year and sell the 7600. There could be some bottlenecking in games that use a lot of cores with that 7600 however, at least in MOST games, that 7600 would provide much higher frame rates with that XTX than the 3700x ever could. In the meantime, I would sell that AM4 MB/CPU/RAM while it's still worth something.
OR, return/sell that XTX until you are ready to upgrade your system. Hardware only gets cheaper over time. Good Luck!
Thank you, good to see a different point of view, but, the question is why?
Nice vid. What software do you use to show the GPU bound figure? I have an RTX 3090 and AMD 5900X; can I stay with it for the next 2-3 years?
That is the built-in benchmark for Shadow of the Tomb Raider.
Yes. This combo will not be outdated anytime soon; those are last-generation enthusiast PC components. I'm using a 3090 with a 5800X and will use it for years to come. I'll skip the next generation but will consider upgrading to the one that comes after the RTX 5000/RX 8000 and Ryzen 9000/Intel 15th-gen CPUs.
Sounds like Shadow of the Tomb Raider might not be all that reliable of a test at this point. I'd say try several other CPU-bound games to get a better picture of how much those CPUs' performance deviates from one another.
I have a wicked fast setup with a 4070 Ti, there are still some scenarios where there is a bottleneck. It 'seems' like it's the Cpu at times, but I think it's more likely Nvidia's garbage driver overhead, API issues, and general bad optimization of games at the higher tier graphics settings. Kinda sucks we have to throw expensive hardware at these problems that could at least in part be fixed by Nvidia and Microsoft.
Do I risk a bottleneck if I make a build with a Ryzen 5 7600 and RTX 4070, playing at 2K resolution?
I've got the 10700K@5.1GHz and the 4070 (non-Ti) is being bottlenecked by my CPU (even at 1440p).
Heard that the 40 series get bottlenecked by almost all CPUs.
I guess Nvidia did something wrong with the drivers.
@@mr.electronx9036 Try 13th gen.
@@mr.electronx9036 You want your GPU to be the bottleneck though; that means you are getting all of its performance and it's not being held back by the CPU. You want to see 99% GPU usage as often as possible. But when you go down in resolution, like to 1080p, it is so easy for your GPU to pump out frames that it's not even trying, and the maximum FPS your CPU can deliver for its part of the equation becomes the cap: you are CPU-bottlenecked. There will always be a "bottleneck", though, or else you would be at infinite FPS. When you go up to 4K, however, the resolution is much more taxing on the GPU, so it pumps out far fewer frames, well below the CPU's max frame rate, and the GPU is now the bottleneck.
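The reply above can be sketched as a toy model (the numbers are illustrative assumptions, not figures from the video): the delivered frame rate is simply the lower of what the CPU and the GPU can each sustain, so one of the two is always the limit.

```python
# Toy bottleneck model: whichever stage is slower caps the frame rate.

def delivered_fps(cpu_max_fps: float, gpu_max_fps: float) -> float:
    """The slower of the two stages determines the delivered FPS."""
    return min(cpu_max_fps, gpu_max_fps)

# Hypothetical CPU that can prepare 160 frames per second regardless of resolution.
cpu_cap = 160.0

# Hypothetical GPU caps: easy at 1080p, heavily loaded at 4K.
for res, gpu_cap in [("1080p", 300.0), ("1440p", 180.0), ("4K", 90.0)]:
    fps = delivered_fps(cpu_cap, gpu_cap)
    limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
    print(f"{res}: {fps:.0f} fps ({limiter}-bound)")
```

At the made-up 1080p and 1440p caps the CPU is the limit; only at 4K does the GPU become the bottleneck, which is the pattern the comment describes.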
@@eternalabysx we will see what 14th gen brings to the table.
I got a 4070 ti "coming soon" and i have a ryzen 3700x. wondering if i should stay with it for now / any thoughts appreciated
I have similar setup. 3700x as well. Check my comment here and let me know if you have similar behaviour.
its not always a CPU bottleneck, but memory, and memory access (higher cache alleviates )
Good point! It's the combination with the platform.
No it doesn't; the math doesn't support that. The benchmark itself is in fact incorrect. You can't mathematically get 27% out of the numbers for either the GPU render or CPU render FPS in any calculation.
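For what it's worth, the "27% GPU bound" figure probably isn't derived from average FPS at all, which would explain why no ratio of the reported frame rates reproduces it. One plausible interpretation (an assumption about the benchmark, not something it documents) is that it counts, frame by frame, how often the GPU rather than the CPU was the slower stage:

```python
# Assumed interpretation of "% GPU bound": the fraction of individual frames
# where the GPU's frame time exceeded the CPU's, i.e. the GPU was the limiter.

def percent_gpu_bound(cpu_times_ms, gpu_times_ms):
    """Percentage of frames where the GPU was the slower (limiting) stage."""
    frames = list(zip(cpu_times_ms, gpu_times_ms))
    gpu_limited = sum(1 for cpu_t, gpu_t in frames if gpu_t > cpu_t)
    return 100.0 * gpu_limited / len(frames)

# Made-up per-frame times (ms): mostly CPU-limited, with a few GPU-limited frames.
cpu = [6.0, 6.1, 6.0, 6.2, 6.0, 6.1, 6.0, 6.1, 6.0, 6.0, 6.1]
gpu = [5.5, 5.6, 6.5, 5.4, 5.5, 6.4, 5.6, 6.6, 5.5, 5.4, 5.6]

print(f"{percent_gpu_bound(cpu, gpu):.0f}% GPU bound")  # 3 of 11 frames -> 27%
```

Under that reading, a run can average 160+ fps and still report "27% GPU bound" without any contradiction, because the metric is about which stage limited each frame, not about an fps delta.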
At 1440p ultra settings it's mainly the GPU being used, so even an i7-8700 non-K can work fine with a 4070.
I did some benchmarking in SOTTR over the last two years. I dont know/think it's of any use, but I'll throw some of the results at you, and if it helps in any way, then great. If not, well I guess I just wasted a bit of your time :D I did two upgrades in total between now and december 2021. CPU and GPU. I really wish I had tested my new gpu with the old cpu(4790k), as that would've been more of what you asked about at the end of the video.
And the video was about the 4070 Ti, which I can't help or contribute any sort of info towards. But I figured I would throw some other numbers at you anyway. Feel free to ignore lol ;)
Anywho, I have 4K, 1440p, and 1080p results.
1440p: (4790k + gtx1080 = 65fps. 99% gpu bound ) -- (12700k + gtx1080 = 65fps(!) 100% gpu bound) -- (12700k + 3080Ti = 165fps. 85% bound)
1080p: (4790k + gtx1080 = 89fps. 50% bound) -- (12700k + gtx1080 = 96fps. 99% bound) -- (12700k + 3080Ti = 198fps. 42% bound)
4K: (sadly no test with 4790k) (12700k + gtx1080 = 34fps 100%bound) -- (12700k + 3080Ti = 93fps. 100% bound)
Probably zero interesting things to gather from this. But meh, at least you have more numbers to look at ;-) I'm by no means an expert, nor do I know much. But I always found the difference at 1440p between the 4790K and 12700K to be "funny". Exact same fps. Sure, 1440p is more GPU-heavy than, say, 1080p of course. But still, zero increase going from that CPU to the 'new' one. Would've really loved it if I had tested 4K with the 4790K too, just for the sake of it. But way too lazy to remove my 12700K and replace it with the 4790K just for a test ;-) I always do these kinds of benchmarks before I upgrade, as it's fun to see the difference between then and now. SOTTR is just one of several games I use for that. Guardians of the Galaxy and Forza 5 are used as well.
Just as in SOTTR, Guardians also saw zero change at 1440p between the 4790K and 12700K. And as mentioned, 1440p is, from my understanding, more GPU-heavy than CPU-dependent. But I still find it weird that there wasn't even a slight improvement with a 'better' CPU. Even at 1080p, I felt the improvement was abysmal: the 4790K gave me 89fps, while the 12700K pushed it to 96fps. Not a big jump IMO. In the Guardians of the Galaxy built-in benchmark, there was at least a bit more improvement between the 4790K and 12700K; it went from 75fps to 95fps (and at 1440p there was a 1fps difference between the two CPUs, using the same GPU of course).
Oh and also, the only thing that changed between tests was which CPU/GPU was installed. In other words: same mobo, same RAM, same everything other than the CPU/GPU.
Anyways, like I said, this might've been somewhat interesting, or a complete waste of time of a comment lol.
I love more data. I still have my 4790K but didn't think of testing with this generation of GPU hardware... but maybe I should reconsider? Thank you very much for sharing!
Unfortunately this is missing the 4790K + 3080Ti test; that would give the best idea of how much FPS the processor upgrade actually gave.
Also, some games really push the CPU side, like Total War games or Warhammer 3, much more than your usual FPS titles.
As this data is missing from everywhere: how does the CPU scale from a 3700X/4790K or 5800X(3D) to these last-gen monsters with a mid/upper-mid tier GPU? I don't care about the 4090; it's not something 99.5% of people are going to have.
I somehow managed to build my very first computer earlier this year with an i9-9900K, 32 GB of 3200 MHz RAM, and a 4070 Ti for 1000 bucks. I literally had no idea what I was doing lol, but I can confirm it's just enough: the GPU can still hit 100% usage, especially if you overclock the CPU and undervolt the GPU. I wouldn't go any lower than that CPU though lol.
I'm thinking of a 3090 Ti with my i9 9900KF but not sure if it's a good match at 1440p.
Is the 9900k with a 4070ti fine for 1080p? I wanna upgrade from my 2060
@@veqr7266 Way overkill; just get a 4070 for 1080p. It has plenty of headroom for 1080p gaming for the next few years unless you're going for something crazy like 240 fps.
@StuffIThink would a 4070 super still be fine because its the same price?
@@veqr7266 They lowered the price of the 4070 since releasing the Super series. The 4070's lowest price new is $540 while the Super is $610, and the 4070 is already kind of overkill for 1080p right now, so I would just stick with the 4070.
I have an i7 8700K@4.8GHz and play at 3440x1440. The only game where I had a CPU bottleneck with a 4070 Ti was Spider-Man Remastered. It got even worse when I used DLSS because the GPU load was at 50% or something. Even with ray tracing and everything maxed out my GPU load is around 65-70%. In all other games my GPU load is between 90-100%.
I don't understand the point you're getting at.
What are you saying?
Does the 4070 Ti need any current CPU, or the fastest one that doesn't even exist yet?
Does it perform or underperform?
Can you test the ryzen 7 3700x with the 4070 ti? How much bottleneck? Thanks
It has to do with CCDs: the 7950X3D does not perform properly due to having to deal with both CCDs and the latency between them. You should see better performance with only 1 CCD enabled. I believe Hardware Unboxed simulated these results in their gaming benchmarks, with a simulated 7800X3D.
Makes sense, there should be that latency present. Switching off a CCD is not really optimal though; hoping for the people that own these CPUs that they get new microcode.
@@lemonhaze715 It's not that bad. If you're only gaming, then a chip that is effectively a simulated 7800X3D with 8 cores will be just fine.
That's true, but the main reason other dual-CCD chips perform well (e.g. 5950X, 7950X) is that they have symmetrical CCDs, while the 7950X3D/7900X3D do not, and scheduling and managing is a complete nightmare with those chips.
Again, it has NOTHING to do with latency. If the right CCD is chosen and said game utilizes 8 cores or less, there is no god damn latency! By the way, 95% of all games use 8 cores or less; it may even be higher than that!
@@cdiff07 It will still use both; there is a reason why the simulated results from Hardware Unboxed showed very good performance gains when only 1 CCD is used.
I am using 5600x with rtx 2060. I am thinking of upgrading to rtx 4070 non ti version. Will there be any bottleneck? My targeted resolution is 1080p and only gaming.
My mindset is to get 60fps maxed out in singleplayer and 120fps in multiplayer, end of story. I was literally using a 2500K for 11 years and recently bought a 5950X and mobo (real cheap, from a guy moving to a 7800X3D) since I needed it for HandBrake and Blender; good for another X years. Never understood chasing FPS and getting the best of the best every 1-2 years. Great test.
Depends on what you are doing and what parts are in your PC, right? If all you're playing is Minecraft and some story games, you're fine with 10-year-old CPUs and GPUs. If you want to play the latest and greatest games with decent quality settings to get the full experience, you might need a better GPU and CPU, right?
That is a way to save power draw and keep temps lower too... If you get a VRR display then you don't necessarily need a fixed frame rate either.
Intel didn't make decent improvements until 8th gen Core series where they upped the core counts. Good job riding out the years of BS 5% improvements from intel.
The 4070 Ti with a 7800X3D is a BEAST in gaming... you should try it ;)
so what is best choice?
this video is exactly what i needed, i recently got a 4070ti and need to see what cpus are better suited than the 5950x
Glad it helped!
Just because the GPU isn't 100% utilized doesn't mean it's bottlenecking; it can be the game engine, the game itself, there are a lot of different things to account for. JayzTwoCents made a vid on it; it was pretty helpful.
Nicely done and cool seeing these various configs. Ignore the lost souls defending the “scam” 7950x3d. You proved it just doesn’t work and is not the best at anything without senseless tweaking. You also know how the 7800x3d will perform (properly without BS software and driver settings) at far less cost.
Would love to see your take on the VRAM limit hitting some games now. Basically 8-10GB cards are starting to show limits with higher settings sooner than most thought. Cheers.
Mac, you need to turn the Xbox Game Bar on for V-Cache to work. This is something they will already have turned on in Windows 12. So turn that on and rerun the test with the 7950X3D...
Thanks for the feedback, but it was turned on. Doing a clean install of Windows 10 on a new SSD and following AMD's reviewer's guide gets it to work... most of the time.
so would a i7 11700f be okay with a 4070 ti super?
In your opinion what GPU bottlenecks even 5800X3D in 1080p games ?
Hi. I have the 2700X and the 4070. Is there any bottleneck?
In Valheim at 1440p my fps dips below 60.
In Enshrouded the same.
Will the 5800x3d end this problem?
Greetings from Germany
The 2700X isn't powerful enough to utilise a 4070. The 5800X3D will be great. Make sure you're using the latest BIOS, install the latest motherboard chipset drivers, and use 3600MHz RAM in dual channel.
As long as everything is smooth, not choppy, with no frame drops, then the CPU is technically not bottlenecking. Yes, some CPUs add or take a few frames, but nothing really noticeable unless you're looking at stats. In the grand scheme of things, unless you're having frame drops or seriously low 1% lows, you're fine.
So if you want high frames stay in 1080p?
Hi all! Can u pls say will 7600x bottleneck 4070 Ti in 1080p and 1440p?
Exactly my question. I have an 8700K and I play at 1440p ultrawide or on a 4K OLED TV. Thinking of buying a 7900XT to get a significant in-between upgrade (from an RTX 2080) instead of going full load and changing everything. But I don't know exactly how much the 8700K will bottleneck.
Unfortunately a lot; your 2080 is already getting close to being CPU-bottlenecked with that i7. I owned the 8700K myself up until the end of 2021, when I upgraded to an i7 12700K and saw a bump of 40% in avg frames and an even bigger gain in the lows. I'd suggest holding out a little longer and saving up to get yourself a completely new, well-balanced system. The i5 12600K, being the bang for buck right now, might be something you'd like to look into.
@@GioCTRL Actually I just ended up on a video that shows it starts to bottleneck at 1440p, but only in some games like Cyberpunk:
ua-cam.com/video/81c5wMZTNfI/v-deo.html
So yes, maybe worth to change everything. Thanks for the reply
Higher graphics settings + higher resolutions = less chance of a CPU bottleneck. Ask yourself how often you are pushing high frame rates at lower resolutions. Also, how high of a frame rate do you actually need?
Is 200+ fps required to play a game like Shadow of the Tomb Raider? Definitely not. There's a difference between pushing FPS for testing purposes and playing a game with realistic settings for personal enjoyment.
@@WickedRibbon Yes, exactly; that's why I'm wondering whether I should really change everything now, or change the GPU first and then the rest.
Right now I'm playing on ultrawide 3440x1440 @144Hz or 4K @120Hz. Maybe the 8700K OC can still hold on a bit at these resolutions.
Why I really need to change the GPU is that there isn't even HDMI 2.1 on the 2080, so I'm limited in bandwidth for gaming on my TV (4K @60Hz 10-bit, or more Hz but 8-bit, if I'm not mistaken). And the 7900XT or 4070 Ti = 2x more fps than the 2080...
I would say, get the GPU and see how you go. Assess the games you're playing and you'll be able to tell quite easily if your CPU is an immediate problem - especially at those high resolutions.
There's nothing stopping you from doing a full platform upgrade later on.
You said people should not buy a 4070 ti for 1080p. But what if someone just wants to max out every game and play at 1080p and never dip below 60 fps? I'm seeing a lot of games drop below 60 on 1440p, which would really irk me if I did have a 1440p monitor and bought a 4070 ti. Hell, with RT enabled on a few games it drops below 60 even in 1080p. I thought I figured out the whole "bottleneck" thing a few years back, but now that games come out unoptimized I am completely lost about it yet again.
Agree, not everyone wants to play at 1440p or 4K, because 1080p with the right monitor can still look very good. I have a 4070 Ti with a 1080p monitor and it's a cool experience.
For 1080p 60fps a 3060 Ti will suffice. A 4070 is for those with high-refresh monitors, you know, competitive gaming. If your problem is lack of optimization, well, Frame Generation solves it. I wouldn't pay more than a 4060 or 3060 Ti for 1080p.
@@gerardotejada2531 I don't play online games, only singleplayer, so the 4070 Ti is still a better choice over the 3060 Ti. I already have a 4070 Ti playing single-player games at 1080p and it's good.
@@JacoB-wp4ws What CPU?
4090 drops below 60 in RT overdrive maxed out settings 😂
does anyone think that the 5800x3d will be a good combo with the upcoming 4070 ti super?
So... the 4070 Ti is better on a 5800X3D than a 7800X3D because of bottleneck issues? And if paired with a 7600X, is it the same as the 5800X3D in fps, or did I miss the point? Trying to understand, noob here.
It depends also on the game and its architecture; those numbers are specific to that particular game, and in others, at certain resolutions, the bottlenecks would appear differently. Doing that for 50 games would take a lot of time, and would potentially provide clearer or more conclusive results, although I doubt it would really mean much all things considered. Either way, thx for the vid; you always pose interesting questions.
A lot of Bottleneck will depend on Specific Game you are playing and how much it leverages CPU..
Could this be a game engine issue?
I will wait until ryzen 8000 3D cpus release before I make the jump to AM5. For now my 5700x is more than enough for my gaming needs
Is there a bottleneck with a 5600x and a 4070 or 4070 ti?
I have an i9 11900; is it enough for a 4070 Ti?
Great video, but some visual number comparisons on screen to accompany the commentary would make things so much clearer to understand. Admittedly my focus wasn't 100% on your video; if it were I would be able to follow, but in a world full of distractions and multitasking my suggestion would make your video clearer to understand. I see that you're at 8k subs, so maybe this is the next step in video quality. All the best.
Thank you for the feedback, very helpful!
RAM speed?
I have an i9 9900K and a 4070 Ti in the rig I am typing this on! My CPU is running a 5.0GHz all-core overclock and I feel this really helps push it up to modern standards, and when overclocked to 5.2GHz it can even hold its own with the likes of the 5800X3D, beating it in synthetic benchmarks. So it's a good chip despite its age, and I would say most games are optimized for 4-8 core systems now.
I just don't like the 45-50 degree idle temps with the 5.2GHz OC applied, as I need to give it about 1.45v, whereas I can get away with 1.28v at 5.0GHz stable. I will consider upgrading to the next-gen Intel platform next year (maybe when they have fixed the DDR5 issues), but for now the 9900K is still a beast for gaming!
It would take a lot of convincing to get me to switch over to AMD anytime soon even with all the recent improvements
What kind of cooler you have ? I have mine at 5.2, 1.4V, i'm getting 35-40 at idle or doing random browser stuff on a NH-D15.
@@Taldirok Corsair H115i Elite
Guys stop bothering with core clocks, it needs fast ram, not 5.2ghz
Please, which CPU is better: the 13600K or 7900X, if they have a similar price?
In gaming and work, etc.
7800x3d only for gaming
Just got a 4070 Ti Super and I'm still on AM4 Ryzen 9 5900x. I'm perfectly happy with my frame rates in gaming, but my PC is also a workstation. The Cuda performance is more important.
Can my 5800X handle the 4090? Upgrading from a 1050 Ti.
You need to test more than one old game.
True... as time allows. The nice thing about the old Shadow of the Tomb Raider benchmark is how well it still scales with hardware, and that it does not get continual patch updates, so you can always compare results.
I have the Ryzen 7 5800X, and next week I will buy the 4070 Ti. Should it work fine? Or do I need to change the CPU because of a possible bottleneck? Thank you!
Your 5800x will perform very much like the 5950x in the video.
@@ImaMac-PC Thanks! I work in editing and 3D, so that's cool; a new CPU is very expensive!
Imma pair my I7 6950x (heavily overclocked) with a 3090 for the lols, I'll upgrade to sapphire rapids one day
Hi Mac, I have an i9 11900KF and I want to buy an RTX 4070 Ti; my monitor resolution is 3840x1080 (Samsung 49"). Will I have a significant bottleneck with this configuration?
Yes it will work; that res is barely more intensive than normal 1440p: 3.6 million pixels vs 4.1 million, only a 460,000-pixel difference. The card won't break a sweat.
I have a 5600x, B550 Tomahawk, 32GB 3600Mhz CL16, and a 4070ti and im having pretty good performance, i just intend to change the 5600x to a 5800x3D this year.
That will be a very nice and noticeable upgrade 🙂
I have a 10900k with a 3090 and I see no need to upgrade unitl 14th or 15th gen intel.
Will 7800x3D bottleneck the 4070 Ti or should I pair it with 4070? Can anyone help pls.
Hi. I get 245fps with a stock 4070 Ti ASUS TUF (non-OC model, with silent BIOS) at 1440p with a 7800X3D on the highest preset (Tomb Raider benchmark demo from Steam).
Does it make any problems if you play at 1080p?
You need to test about 10-15 games at the minimum. I have a 5800X3D and a 7950X3D, and the new chip is a good bit faster in gaming; people who get CPUs like this like to play around and tune. Most people are not following the steps for installing a new X3D, like updating Windows, chipset drivers, BIOS and so on, plus running Windows in Balanced, not Performance. You can't make an informed decision off one game; there are so many variations in games, you need many to test different engines and APIs: DX11 and DX12, or Vulkan.
Do you think the 4070 Ti will be bottlenecked by a Ryzen 7 5800X at 1440p 16:9 or 3440x1440 ultrawide 21:9, with 32GB of DDR4 3200MHz RAM? Please help.
I have a 3090 with an i9 10900K and play a few games, but Warzone the most, and I'm losing fps quickly at 1440p; I used to get 240 and now I'm lucky if I'm getting 140fps. If I drop to 1080p I get a little bit more.
I'm using a 4070 Ti at 1080p and it's amazing. Not everyone should follow the rule to play only at 1440p or 4K; 1080p, when you buy a good monitor, can still look very good.
i have a 3080 and 5600x, i will get a 5800x3d. and my next upgrade will be a new build with intel
What about a 4080 with a gold sample 5800x @ 1440p? My situation is a bit complex though. 1440p multimonitor desktop. I also use triple 27-1080's for sim racing and a Vive Pro VR headset for Sim racing and VRChat. With no need for VR, I might have bought a 4070ti, but tbh i think that small memory bus and 12GB will not age well regardless. But VR is super sensitive to memory bus width and VRAM amount and frametime is KING in VR. I'd love a 5800x3d because it crushes sim racing and VR, but it's hard to justify since I already have a good 5800x :(.
You should be fine with a 5800X and 4080 at 1440p; don't waste your $$$ on a 5800X3D.
5800x3d is a beast can't wait to buy it :D
13600k is way better tbh
@@Alex-bl8uh it really isn't
@@justinsuerbier4301 OK, tell me why? Same price but better gaming performance on average, plus far better productivity performance.
@@Alex-bl8uh Platform price: B550 + DDR4 are much cheaper. Also, if you go Intel now, you need to change the mobo anyway once 14th gen hits. I'm personally waiting for the 14400F as well; it should be the same uArch as the 14600K but with fewer e-cores and lower clocks.
I am waiting for the price to drop more before changing my 5600X to the 5800X3D. I have no problems with it on my 4070 Ti build, but to use its full potential for the next 3 years, I'll need the 5800X3D.
I'm on an i5 and one day I got more than the monitor allowed, up past 240 fps in an older title.
I have a 3800X with the 4070 Ti and I've been on the fence about a 5800X3D vs 5950X vs 7950X3D upgrade.
I have similar setup. Can you check my comment here and say if you have similar behaviour?
@@MrJirkaCZ no, I'm well into the hundreds of fps on ultra ray tracing in almost everything I play. The only reason I want to upgrade the CPU is some CPU-intensive games I want to play while streaming, and that will be a bit much for my current 3800X.
What res was this on?
1440
Still using i5 9600k for 1440p ultra gaming :(
I think it's the best card for the i9-9900K; for anything stronger, a new CPU is going to be required 😅
This is great research. I still have the i5 6600k clocked to 4.8ghz in my system. I’ve been trying to figure out if I could just slap a 4060 or 4060ti in there and call it a day. I would probably go back to playing some old games like StarCraft 2 and destiny 2. Diablo 4 catches my interest as well
You would definitely be losing performance but the question is how much. It may or may not matter.
@@pf100andahalf I have 2 gtx 680’s currently so as long as it’s better than that.
My wife had an i5-6600K and I installed a 2080 Super, which was bottlenecking; she's playing Hogwarts Legacy. I just upgraded her to a 5600X (Micro Center is having a combo CPU/mobo sale) and she's able to run everything at ultra now at almost 60 FPS. I just played the last two weekends of the D4 beta; all is well on my 5900X/4070 Ti...
@@ginzero that's good information. I've seen somewhere that CPU-intensive games take a performance hit in fps. I suppose it won't matter much if I'm playing games that are already a few years old.
My daughter is still running an i5 8600K with an Asus Z370 MB and a new 4070 Ti. Honestly it works pretty well at 1440p, and some games at 4K at >70 FPS, but it's definitely being pushed to its limit.
Not all games can take advantage of the 3D V-Cache.
Maybe you should have tried a couple more games
Anyway, the 7800X3D is impressive because its performance is great and it sips power at just 120W TDP.
I have a 7700X with a 4070 Ti Suprim X and I get way better results than this: 1440p 197 fps, 95% GPU bound; 1080p 252 fps, 40% GPU bound. Budget CPU for the win?
This is heavily dependent on the game and resolution; old games will run blazingly fast with new cards. Test with an Unreal Engine 5 game firing on all cylinders, with features like Nanite, Lumen, virtual shadow maps, etc.
Is a 4070 Ti and a Ryzen 5 5600X a good combo, or a bottleneck for 1440p?
What games are you playing? What settings are you running at 1440p?
@@ImaMac-PC I only use my PC for sim racing; my game is ACC (Assetto Corsa Competizione). Right now, with my 3070, I play at 144 fps and mostly on high. In league racing I tweak everything down to medium.
I have to admit that I'm not the kind of player who needs all settings on Epic or Ultra High.
@@ImaMac-PC I'm wondering the same answer. I got the same system. I usually play single player games with max detail.
I wondered why my 11700K/3090 combo worked well at 4K. Anything more demanding on that CPU and Nvidia would really have a reason to sell you DLSS 3 with Frame Generation. Made for CPU bottlenecks.
DLSS 3 is the reason why I invested in a 4070 Ti despite having a 11700f for 1440p
@@cks2020693 Exactly, I just got my 4070 Ti in the mail a couple of days ago and only had to pay 730 USD for it! The only difference is that I have it paired with an overclocked 13600K.
I have a 5900x with the rtx 4070ti. They seem a perfect match
See, I had this combo at first as well, but I've noticed bottlenecks that weren't there until I swapped to the 4070 Ti.
I have a 5900X with a 4070 Ti, but I do have a bit of a bottleneck! About 80% GPU bound.
I have a 9900K and a 4070 Ti, and when playing Warzone at 1080p low settings I'm only getting 100 fps, and it drops to 50. My CPU usage is at 100 percent while my GPU is only at 20 percent. Is there anything I can do to improve this, or should I upgrade?
At 1080p-Low, the 4070 Ti is bored and just waiting on the 9900K, even if you have the i9 OC'd at 5GHz all-core with good memory (3600MHz or better). If you plan on playing at 1080p-Low, you'll need a current-gen CPU to get the most performance from that 4070 Ti.
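The "CPU at 100%, GPU at 20%" pattern described above is the classic signature of a CPU bottleneck. As a rough illustration (not any official tool — the thresholds and sample values here are made up, as if read from an overlay like MSI Afterburner), the rule of thumb can be sketched like this:

```python
def likely_bottleneck(cpu_util, gpu_util, high=90.0, low=60.0):
    """Very rough per-sample heuristic; utilizations in percent."""
    if gpu_util >= high:
        # GPU is saturated: you are GPU-bound, a faster CPU won't add fps.
        return "GPU-bound"
    if cpu_util >= high and gpu_util <= low:
        # GPU sits idle waiting on the CPU, like the 9900K/Warzone case above.
        return "CPU-bound"
    return "balanced/other"

# Hypothetical (cpu%, gpu%) samples for illustration only.
samples = [(100, 20), (45, 99), (80, 75)]
print([likely_bottleneck(c, g) for c, g in samples])
```

In practice, per-core usage matters too: a game can be CPU-bound on one maxed-out thread while the overall CPU figure reads only 40%, so treat any single-number readout with suspicion.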
@@ImaMac-PC I figured that would be the case. What would be the best CPU to pair with a 4070 Ti (preferably Intel)?
I had the same problem with my 8700K, but it's not a processor problem at all; it seems like a game problem. What worked for me was changing the maximum and minimum processor state in the Windows power plan: I set the minimum to 85% and the maximum to 95%. With that, the processor was kept at a maximum of about 90% usage, and the fps don't have such serious drops.
Sorry, English is not my native language.
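The power-plan tweak described in that comment can be applied from an elevated command prompt. This is a sketch only, assuming the standard Windows `powercfg` setting aliases; the 85/95 values are the ones the commenter used, not a general recommendation:

```shell
:: Cap the processor minimum/maximum state on the active power plan (Windows).
:: PROCTHROTTLEMIN / PROCTHROTTLEMAX are the documented powercfg aliases
:: for "Minimum processor state" / "Maximum processor state".
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 85
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 95
:: Re-apply the scheme so the new values take effect immediately.
powercfg /setactive SCHEME_CURRENT
```

The same settings are reachable via Control Panel → Power Options → Advanced settings → Processor power management, which is likely what the commenter actually used.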
@@ImaMac-PC what about an i7 13700K with a 4070 Ti or 3080 Ti at 1080p? (I don't mind spending money on one of these to get the maximum performance and fps.)
What gpu would you recommend with a 9900k then for 1080p?
How much will a 5700G bottleneck a 4070 Super?
I'm running a 4070 with a Ryzen 5 3600; what CPU should I upgrade to?
5800x3d
I have a 3900X + 32 GB of memory and a 4070 Ti, and no similar problems are observed at either 1440p or 2160p 🤷
The 3900x or any Zen 2 based CPU will hold back the 4070ti in CPU demanding situations. Even if it holds it back, it may not be significant enough to matter. It just depends on what games you play, settings you use and FPS you are ok with. To put it another way, if the game plays fine for you, then it just doesn't matter...Just enjoy!
My 9900k still rock in 1440p
i7 13700K and 4070 Ti: no bottlenecking in the 13 most popular games. DDR5 5600 36.
I mean, the 9900K is still fine, but it can bottleneck games like Battlefield 2042 or Warzone 2, heavy multiplayer games. In single-player games, I dare say even an 8700K is enough for 1440p with a 4070 Ti.
That's a really good distinction between types of games.
So you are saying the 9900K is not enough for a 3090 at 1440p? Jumping to 13th gen isn't worth it imo.
@@fiece4767 well, we are discussing bottlenecks, and a 9900K OC'd will bottleneck a 4070 Ti in those heavyweight multiplayer games for sure.
I’d be returning the 7950X3D. What a PITA.
I wish I knew the percentage on the R5 5600X with the 4070 Ti.
53.5% at 1080p, 41% at 1440p, according to GPU bottleneck calculators.
I don't know how accurate the bottleneck calculators are, but a 3070 Ti and a 5950X should not have a bottleneck at 1440p 21:9 aspect ratio.
5800x3d has been rock solid with the 4070Ti at 3440x1440
This is why I got a 5900X; I don't play at 1080p for anything.