@@ONLY_BHARATVAASI Yes, in productivity. But the Intel CPUs are also consuming much more power for that, even the 285K which is supposedly considered efficient and yet nowhere close to Ryzen level of efficiency. Ironically, Intel has come to the point of only coming on top in productivity.
It’s absolutely crazy how AMD is getting this much of a performance gain with their 3D V-Cache technology. Welcome the new gaming king CPU of the market, the Ryzen 7 9800X3D
@@jigjigzacharias5038 Someone mentioned that I’d need a better GPU (like the 4090, which is really expensive for me) if I go with the 9800X3D. I’m new to PCs, and I’m asking because I’m planning to build my first one.
So I have the R7 5800X3D and when the 7800X3D came out the lead of the new CPU was greater than it is now between the 9800X3D and the 7800X3D. But this CPU can also be overclocked and shows good performance.
While the 9000 series is the 7000 series with improvements and a better node, the 7000 series is completely different from the 5000 series CPUs. Also, let's not forget the 7000 series uses DDR5 while the 5000 series only supports DDR4. Now you know why. A 10% improvement over the previous gen out of the box is nice on the same socket; Intel couldn't even achieve that with their refresh...
@@igorrafael7429 My 5800X3D is undervolted. So it is more efficient and also a little faster. My CPU never clocks below 3.6 GHz (Okay, I bought it at the beginning of January 2024, so newer ID) and my DDR4 has 4000 CL 18 in a ratio of 1:1. This means it has a lot of performance and I am completely satisfied with it.
I want to see that as well because even though 1440p and 4K are more GPU bound, I feel like those resolutions would be more relevant to gamers who have a 4090+7800X3D or 4090+9800X3D. It is very unlikely that anyone games at 1080p with a top of the line gaming PC such as the 4090 and 9800X3D.
@@JosephKarthic Yeah, that's what I am curious to see. The difference between the 7800X3D and 9800X3D would obviously be smaller at 1440p and 4K, but it could be bigger than one might think. The 4090 is so fast that a lot of CPUs bottleneck it even at native 4K. If the 5090 ends up being around 75% faster than the 4090 at 4K, then even CPUs like the 7800X3D could bottleneck the 5090 at native 4K max settings in some games.
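The bottleneck reasoning above can be sketched with back-of-the-envelope arithmetic. All numbers here are made up for illustration (the 80/130 FPS figures and the 75% uplift are assumptions, not benchmark results):

```python
# Hypothetical sketch: suppose a game runs at 80 FPS on a 4090 at native 4K
# (GPU-bound) and the CPU can feed at most 130 FPS (roughly its 1080p result).
# A GPU ~75% faster would "want" 140 FPS, so the CPU becomes the new limit.

def projected_fps(gpu_fps_now, gpu_speedup, cpu_fps_cap):
    """FPS after a GPU upgrade: the GPU's new potential, capped by the CPU."""
    gpu_potential = gpu_fps_now * (1 + gpu_speedup)
    return min(gpu_potential, cpu_fps_cap)

print(projected_fps(80, 0.75, 130))  # 130 -> CPU-bound on the faster GPU
print(projected_fps(80, 0.75, 200))  # 140.0 -> still GPU-bound
```

This is only a rough model; real games shift between CPU- and GPU-bound scenes, so the cap varies across a benchmark run.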
Best 1% lows of any processor on the market, AMD takes the crown. The 14900KS has only SLIGHTLY better 1% lows after over two months of tweaking, so that doesn't count, Intel.
Well, of course the CPU is faster. I understand that you are using a lower resolution to show the CPU performance. But at higher resolutions and graphics quality it is even tied with a 5700X3D in GPU-bound games.
@@nilesmorris9132 Yes, I killed the comment, realized I was being a bit too heavy-handed. I agree that it is a poor benchmark too other than comparing the two processors. They should have done 1080p native across the board...
Finally a chip that’s more optimized and configured for modern day components. Currently utilizing the 7800 with an RTX4080, just from the numbers I could tell the CPU was holding back GPU performance.
it's a good jump, but it's more like Nehalem to Sandy Bridge. The P4 to C2D leap is probably something we'll never see again.... but mostly because P4 was a disaster from day one. The P4 was created because people & the media were obsessed with Clock Frequencies. Clock for Clock P4 is/was worse than PIII.
Nice game review! What it's missing is that you've got to test the performance during combat and other really demanding scenes. Other than that, keep up the good work!
Really itching to go from a Ryzen 5 3600 to the 9800X3D, to pair with my current GPU, a 4080S. I wonder what kind of performance gain there is. Any chance you could do a comparison of these side by side?
The blood that runs in Lisa Su's family is simply genius: first you have Jensen, who founded the world's most valuable company, then you have Su, who saved AMD from the brink of bankruptcy. Now I am getting excited for Zen 6.
People need to understand that 10% more power on AMD and Intel is not the same, because 10% more power on AMD is 10% of 60 watts, which is 6 watts, while 10% more power on Intel is 10% of 100 watts, or 10 watts. My numbers are just examples, but my point is that 10% of a higher number gives an even higher power consumption increase on Intel. Percentages can cheat you. Right now the Ultra 9 285K is bad and AMD is the best. Maybe in the future an Ultra 9 385K could be better, with low power usage but better performance. Edit: For better context, the base power of previous AMD CPUs is lower than Intel's, and that's why the comparisons are not the same.
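The percentage point in the comment above comes down to trivial arithmetic. The 60 W and 100 W baselines are the commenter's own example numbers, not measured figures:

```python
# The same relative increase means a different absolute wattage
# depending on the baseline it is applied to.

def absolute_increase(base_watts, percent):
    """Watts added by a given percentage increase over a baseline."""
    return base_watts * percent / 100

print(absolute_increase(60, 10))   # 6.0 W on a 60 W baseline
print(absolute_increase(100, 10))  # 10.0 W on a 100 W baseline
```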
The temperatures are amazing on the 9800X3D; it runs an average of 4-5 degrees lower than the 7800X3D, which is especially impressive considering it consumes more watts.
The CPU is so fast that even an RTX 4090 at 1080p with DLSS is not able to fully reveal the potential of the new processor in many games.
Games :
CYBERPUNK 2077 - 0:00
Microsoft Flight Simulator - 1:01
Hogwarts Legacy - 2:00
Forza Horizon 5 - 3:02
Ghost of Tsushima - 4:12
Starfield - 5:15
God of War Ragnarok - 6:05
CS2 - 7:09
The Witcher 3 - 8:15
Horizon Forbidden West - 9:24
System:
Windows 11
Ryzen 7 9800X3D - bit.ly/3NZh3Rb
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON - bit.ly/3YWS871
RAM 32GB DDR5 6000MHz CL30 - bit.ly/4e3MqEG
CPU Cooler - be quiet! DARK ROCK 5 - bit.ly/3AnAOin
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR HX Series HX1200 1200W - bit.ly/3EZWtNj
This is sick, what a big jump between ryzen 7800x3d and 9800x3d
Are all the benchmarks taken on a 1080p display?
@@Trampoukosss 5090 owners MUST buy this CPU once that GPU comes out. Nothing else will take advantage of it
which win 11 ?
Would love to see the 9800x3d vs 5800x3d in 2k to see if it’s worth it for me to upgrade
Intel: Takes 10% performance
AMD: Gives 10% performance
Yep
Yeah, I'm not buying Intel any time soon... unless it sells cheaper and uses less power... got pissed off with the Devil's Canyon... 😅😅😅
Cooler but still more frequency and performance, holy fvck. Lisa Su is destroying Intel
Meanwhile in France: 600€
but how biased is userbenchmark, these disgusting little rats
9800x3d uses more power than the 7800x3d but still has a lower temp.
Great product 🎉
I've heard they changed the layout, so the cores are on top of the cache now, making them closer to your cooler
@@maxibooi820 That is true. Some people reported a 20-30°C difference in temps.
@@maxibooi820 yes you are correct
Can confirm
Consuming more while producing more is natural. In some exceptional cases you can produce more while consuming less, but that is the exception
Ladies and gentlemen here is your new Undisputed Gaming Heavyweight Champion AMD RYZEN 7-9800X3D 🏆
Hail to the king baby.
7 not 9
@@kearmau5 my bad
why I am reading this like the announcer in boxing???????
Not for long, cue the 9950x3d!
9800X3D > ALL
AMD
Advanced
Monster
Device
It's not a device, it's just a processor 😅
BURT
Black
Urban
Ripped
Trainer
What did the AMD name stand for back in the Bulldozer days? I wonder what % improvement AMD has made from Bulldozer to the 9800X3D.
Q
@@Gattberserki386 vs 9800x3d I think 9800x3d might be the winner
amazed that at a lower power draw the 7800X3D performance is not very far off from the 9800X3D.
Possibly due to the games tested being GPU bound rather than CPU bound so it doesn't really matter if you have a 9800X3D or 7800X3D.
9800X3D is a 7800X3D without thermal problems. 9800X3D can have a higher power consumption with less heat.
@@chireaionut8473 yeah no clue wtf you are on about my man. My 7800x3d has literally zero thermal problems and run stable with a -28 all core offset.... I see 60C max in games....
@@pwn3426 he's just saying the changes to 9800X3D putting the cache on bottom has allowed the CPU core to pull more power, a bigger power budget is on hand for a given TDP. So since it now has the cooling capacity to sacrifice a bit of efficiency for performance. Whereas the previous generation X3D parts would simply overheat and throttle under the same power use.
@@mikeweatherford5312i had to buy a 280mm aio to keep my 5800x3d in check as my noctual nh-u12a couldn't cool it enough, id imagine this new 9800x3d could be cooled with an air cooler
The 1% low improvements are really impressive, something that we tend to forget that is very important
Can you explain to me in dummy terms why the 1% is important? I never took the time to grasp what that even means. Less stutters?
@ I’m not an expert either but a simple way to describe it is the microstutters during the game, in a perfect world you want the 1% low frame rate to be as close to actual frame rate as possible for a buttery smooth experience
I use i7 14700k myself, but AMD did a great job, I can't deny that the best gaming processor is 9800x3d. Both the temperature and the power are really amazing.
9900K here, no reason to upgrade yet.
@@Johan-rm6ec lol
@@Johan-rm6ec LMAO, an 11th gen i5 is faster than the 9900
Dump the 14700K and get the 9800X3D. I switched from the 14700K to the 9800X3D and oh my god, it's a dream.
@@micaiahchapman1599 you'd need to get a new Motherboard, new expensive cpu, and possibly new ram if you used ddr4 with the intel. 14700k is fine for now
I see no need for people on 7800x3d to upgrade another 2-3 years...
5 yrs.
@@wingman-1977 7 years
Depends what games you are playing.
I personally find all newer games these days to be eh, so I just replay games that I already own. It’ll last forever for those lol.
Even those that still have a 5800X3d in their systems. They still have a solid chip.
low 1% improvements for cyberpunk 2077, ghost of tsushima, starfield, witcher 3 are crazy
even in Forza, while average FPS is only a little higher, the 1% lows are 10-15% higher
@@Definedd Why are higher 1% lows better? Can you explain?
@@justrandomguy8002 There is less stuttering with higher 1% lows. The higher the 1% is, the less frame loss there is, which means that there is more fluidity during the game.
@@justrandomguy8002 1% indicates microstutters, 0.1% indicates bigger stutters. Basically you want both values to get close to your averages to depict a smooth gameplay experience. 🤷🏻♀️🥴
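For anyone wondering how those 1% / 0.1% figures in the thread above are typically produced: benchmark overlays record per-frame times, average the slowest slice, and convert back to FPS. A minimal sketch, assuming that common definition (real tools differ in the exact method):

```python
# Illustrative only: the function name and the 8.3 ms / 50 ms sample data
# are made up, but the math mirrors the usual "average of the slowest N%"
# definition of percentile lows.

def percent_low_fps(frame_times_ms, fraction):
    """Average the slowest `fraction` of frames, returned as FPS."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))        # at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at ~8.3 ms (~120 FPS) plus one 50 ms hitch:
times = [8.3] * 99 + [50.0]
print(round(1000.0 / (sum(times) / len(times))))  # 115 -> average FPS hides it
print(round(percent_low_fps(times, 0.01)))        # 20  -> 1% low exposes it
```

This is why a single hitch barely moves the average FPS but craters the 1% low, and why lows closer to the average mean smoother gameplay.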
those 1% lows in witcher 3 are insane
First one! Good tests as always!
The king is dead - long live the new king!
For Witcher the 4090 GPU is the bottleneck for 1080p resolution.
who the f plays 1080p on a 4090? ROFL
@@pwn3426who tf plays 1080p on r(t)x 4(0)60
I was studying for my final year semester but this seems to be more important
1% lows are the real reason to upgrade even with just XMP enabled and you can even overclock the CPU and memory this time, hands down the best cpu on the market.
not for CS2
It's hands down not the best. If you're only interested in games maybe it's the best.
What's even better is that it uses the same amount of wattage, produces higher frequencies, and runs COOLER than its predecessor.
I'm still on the 5800X3D and haven't even thought about switching to something more recent, but AMD has already released the 9800X3D, have mercy, give Intel and my wallet a chance :D
itll be a good upgrade
Don't worry. The next 11800X3D processor, which you will replace your 5800X3D with, will be better than 9800X3D. Save up money.
I have the 7800X3D and I’m thinking of switching
@@hairybizrat2078 For what reason, bro? Even I have a 7800X3D; I built the PC just 2 months ago
@@_ReverseSide_😂😂😂
AMDomination over intel is insane.
AMD deserves it. Without them we would still be using 8-core hyperthreaded Intel CPUs for gaming... 😂😂😂
@@peterpanini96 Uh.... But the 7800X3D and the 9800X3D are actually 8 cores hyperthreaded CPUs 🤣
I think what you mean is that we'd be paying a grand for those 8 cores hyperthreaded.
@@Hardcore_Remixer😂
@@Hardcore_Remixer Hello. Technically the i9-14900K is an 8-core gaming CPU: in games the 8 performance cores are used, and the other 16 efficiency cores have nothing to do with gaming.
Also, the R9 9950X is a 16-performance-core CPU, but games don't utilize anything above 8 performance cores.
the 1% low is godly on the 9800x3d
only at 1080p
"I thought the 1% low referred to the lowest FPS, and the 0.1% low referred to the most extreme frame drops. However, since CS
never dropped below 600 FPS, shouldn't the 1% low for CS
also be 600+ FPS?"
I know right?! Best 1% lows I have LITERALLY ever seen on screen in my entire life!
@@Anon1370 Well, it's obvious, isn't it? The GPU bottlenecks at 4K, which won't show us the full potential of the 9800X3D
Fight of giants. Two great Kings of games.
Wow, the 4090 is really bottlenecking the 9800x3d
This cpu is right in time for the 5090 launching soon
its 1080p thats why.
@@elpato3190 you meant the 8800 xt? (I'm very retired btw)
@@ramyissadlittledog7258 8800xt will be slower than the 4090, so no, it'll still bottleneck the 9800X3D
9950x3D is going to be a monster.
It's crazy for a game (I'm talking about you Hogwarts legacy) to barely hit 100 FPS at 1080p with the fastest gaming CPU on the planet right now. Developers really did their best to make that game avoid fully utilizing your hardware.
It's because the game is really demanding; it's an open world with ray tracing and full effects. The Witcher 3 with the RT patch is similar
@@iikatinggangsengii2471 You would get the same FPS in 4K even with ray tracing off. You can see the GPU is chilling just above 60% usage. So the limit has to be on the CPU, but the usage is just around 35% meaning it's just badly coded. The dogshit optimization cannot be excused in any way and I wouldn't pay a cent for that game. Cyberpunk is also openworld and is clearly GPU bottlenecked even at 1080p so I don't see how HL can be more demanding.
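The utilization reasoning in the comment above (GPU around 60%, CPU around 35%, therefore the limit is elsewhere) can be sketched as a rough heuristic. The thresholds and function name are illustrative assumptions, not from any real monitoring tool:

```python
# Hedged sketch: classifying a bottleneck from overall utilization numbers.
# Caveat: total CPU % can hide one saturated main/render thread, which is
# exactly the case the comment describes as poor optimization.

def likely_bottleneck(gpu_util, cpu_util, busy=0.90):
    """Very rough read of utilization fractions (0.0 to 1.0)."""
    if gpu_util >= busy:
        return "GPU-bound"
    if cpu_util >= busy:
        return "CPU-bound (all cores)"
    # Neither component is fully loaded: often a saturated main thread,
    # an engine-side cap, or poor threading rather than raw hardware limits.
    return "likely main-thread / engine limited"

print(likely_bottleneck(0.60, 0.35))  # the Hogwarts Legacy numbers cited above
print(likely_bottleneck(0.99, 0.40))  # a typical GPU-bound 4K scenario
```

Per-thread CPU usage (rather than the total) is the more reliable signal for the third case, since 1 of 16 threads pegged at 100% shows up as only ~6% overall.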
The 9800x3D has taken the crown 👑.
9800x3d with 5090 💀
The 4090 is CPU-limited even at 4K in some games. I think that even with the 9800X3D, the 5090 won't reach its full potential in some games.
@@ivofixzone6410 I think with RT and everything enabled at 4K we will be able to push the 5090 to its limits, maybe not in Flight Simulator though.
They’ve been scalped to $550 rn
@@happybuggy1582Yeah it's extremely popular, I suspect a lot of people are moving away from their old Intel systems towards the 9800X3D in time for the 5090
@@AgentSmith911 it’s still worth it if it costs $2000.
This cpu is really good, but I have a 7800x3d, I see no need in paying that price right now for an upgrade, but good to see an improvement in the cpu, I'd rather wait for the next generation of cpu's
Yeah, zero point in going from a 7800X3D unless you play at 1080p on a 4090, but who tf actually does that but a certain few esports gamers, lol.
I agree. I did the upgrade because I had a 2-3 year old CPU, so it'll be a huge difference, but at 1440p there isn't much you get out of it. Let's see how overclocking goes; I think we've yet to see what it can do
Def would not upgrade from a 7800X3D to a 9800X3D. I usually wait at least 2 full generations and/or a 30-50% improvement. You'll go broke chasing 10-15% per year. I'm upgrading from a 5800X3D and it's a solid 30-45% faster
Really? You don't upgrade from a high-end CPU that's crushing 99.9% of CPUs in gaming? You're kidding, right?
I play at 4k. There's basically going to be zero improvement from my 7950X3D. But we'll see what the 9950X3D does, its supposedly going to kill the 9800X3D in gaming.
The 9800X3D gives more performance while consuming slightly more power but it's way easier to cool, that's a win for me.
It's quite significant, but maybe just a refined 7800X3D; if the 7800X3D is a lot cheaper, it's a no-brainer
I think if AMD increased the 3D cache to 128 MB and added a bit more MHz, it would indeed be the best CPU, maybe as a 9900X3D
The 9800X3D is far more efficient because it uses a smaller, more efficient process node; the only reason it's using more power is that AMD is "pushing it further"
The 1% lows are INSANE
As a long time Intel user (gaming), I am absolutely amazed. For real. An overclocked 9800X3D also performs like it is another release!
Can't wait to build my new 9800x3d next week!
And Intel is being left completely in the dust.
Intel still good.... not that bad you get more cores... it's like 2017 for intel...
@@peterpanini96 But Intel is in real trouble now if they don't do something.
@@igm1571 I remember when I bought a new PC in 2020 and they said AMD was taking over then too. It ended up with Intel still leading. It will most likely just push Intel to improve more and be a back and forth
These comments are so brain dead. Y’all are seeing games at 1080p with DLSS getting a 20 FPS boost and think it’s the greatest thing y’all have ever seen. Nobody is gaming at 1080p on a 4080/4090.
@@NotFromHere. But this is the correct way to benchmark new CPUs. In any case, at 1440p it's still the best CPU on the market rn.
Can't wait for the benchmarks pairing this beast with the upcoming 5090
9800x3d king
My 9800X3D is coming tomorrow at 8am, I'm so excited
Got mine today you will love it.
Thanks! Can you do this again but with the 5800X3D? Many of us still use AM4
Honestly I was expecting more from 9800X3D's power efficiency. Sometimes it's almost 20W higher with just a few more FPS. Still a beast though.
EDIT: Thanks for the explanations guys
Yet it's 5°C cooler on average and rarely goes over 100W in games. It's a marvelous CPU, there's nothing to complain about apart from the prices overseas.
There is a +400 MHz higher clock speed over the 7800X3D, acknowledge that!
You want a miracle then. And there are no miracles; it's physics. The improved layout enables it to run higher clocks on a very similar architecture and the same lithography. The performance gains come from the improved capacity to remove heat from the die, and that requires higher power. There will be efficiency gains at lower loads, since the processor will almost always run cooler and therefore need less power. This is a refinement product, and goddamn, it's very refined and elegant. Go see der8auer's delid OC results, which showcase to a larger degree what is possible. But again, there are no miracles.
You're probably clueless, but that's fine. It draws a few more watts while being 5-6 degrees Celsius cooler
You expected more? Okay dude
And here it is. The new X3D Champ. Well Done AMD 😎
9800x3d is so fast it can almost hit 100% utilization for the 4090 in 1080p 💀
Great. Now who wants to buy a 2K EUR GPU and 500 EUR CPU to play at 1080p?
@0w784g professional esports players
We talk about benchmark rn why you so negative lol@@0w784g
@@0w784g me
@@0w784g A lot of people, actually, judging by the sales...
gains in hogwarts legacy are huge!
R7 9800X3D - more wattage, lower temperature, more FPS. It's the GOAT.
This is how CPUs are supposed to improve every generation. Take notes Intel
Their X3D is an improvement over time, but that can't change the fact that the 9000 series is not a big jump over the 7000 series...
@@igm1571 what about Intel ultra series? Just better power draw?
@@trexxtechs2857 maybe
@@igm1571 that's all for intel
@@trexxtechs2857 Also, let's not forget the Intel 14th gen CPUs are literally the same as the 13th gen CPUs. They were not prepared to support 3 generations on the same socket; they only did that refresh because of AMD. Mediocre performance gains at the cost of power consumption... The 9800X3D is a nice upgrade over the 7800X3D, but that's for those without an X3D chip or still on AM4, I would say. Let's see what's coming next with the 11800X3D or whatever they will call it; another 10% is good enough for next gen on the same socket.
The New King of Gaming 🔥 🔥
The 9800X3D is definitely the new gaming king. Not as impressive as the jump from the 5800X3D to the 7800X3D but still a great gaming CPU overall, despite the higher power consumption. AMD did a great job.
The difference between zen 3 and zen 4 was huge, Ryzen 5 7600x alone performs the same as the Ryzen 7 5800x3d
@@El-Leon-VLLC Good point... I went from a 5900X to a 7800X3D and the performance of Zen 4 is INSANE...
Cant wait to get this in about 4 months when scalpers are all too busy with the rtx5090
9800X3D, best processor of 2024 easy! Intel is now a literal joke, and its sad to see.
I take back everything negative I ever said about AMD for the past 20 years as an intel customer....
Bro, the 7800X3D is unstoppable, but the 9800X3D 🔥
As a Very Proud owner of 7800X3D, I have to say, I'm Truly Impressed 😊
I'm a Intel user myself but oh boy what a job AMD did
Finally they addressed the low % numbers. Now it's truly like an i9 for gaming.
Runs cooler while drawing more power. AMD is doing their homework
Looking for some perspective without any hate because I genuinely would like to understand it.
I totally understand only testing at low resolution to get the most accurate results for a specific CPU; you're basically putting maximum pressure on the CPU workload, and I get why the majority do that. Personally, though, I want to see BOTH types of benchmarking, because I do see interesting results at higher resolutions, sometimes results that don't quite make sense if you're in the "only benchmark at low resolution" camp.

I have a 4090 and a 7900X3D (I only have the 7900X3D because my 7800X3D died and I had to send it in for replacement, but they only had 7900X3Ds in stock...). I'm considering upgrading to a 9800X3D IF the real-world results (I play at 4K) actually warrant it. I understand playing at 4K is going to limit basically every game because they will be GPU-bound.

But I am trying to understand why some results show, for example, my 7900X3D getting better results at higher resolution than the 9800X3D or 7800X3D. Silicon lottery? I have no idea. I would genuinely like to know from someone truly knowledgeable, because I don't want to spend a lot of money on a 9800X3D if it's not going to help in my particular situation playing at 4K. Why, in certain scenarios, is a 7900X3D outperforming a 9800X3D at higher resolutions in the same game?
Waiting for that 9800X3D vs. 13900K comparison video.
Why? Obviously AMD will win by 15+ fps on everything.
Such a comparison doesn't make sense
Might be a good time to upgrade my current CPU, i7-7700K. Cheers Kaby Lake, you did alright.
Luckily for me, I bought the 7800X3D plus a motherboard for less than a 9800X3D
I think 7800x3d is still enough, for now
Maybe I'm going to get it. I could still stay on my current Intel, but games are using more and more shaders at launch, so I need to get it
The 9800X3D becomes interesting for the next Nvidia/AMD GPUs
Much wiser choice!
Well, I absolutely love your comparisons. Best channel for that ever! Thanks so much. This is the one I was waiting for the last weeks...
But on the other hand I have to admit: I guess we are at a turning point. What's the point in having, say, 704 fps with a 9800X3D instead of 633 fps with a 7800X3D? LOL. Does any monitor show that? 😂 Do any eyes see these insane frame rates?
For the first time ever I feel I have a CPU-GPU combo (7800X3D + RTX 4090) that always gives enough performance for everything, every game... And I'm running flight simulation on three HD projectors here with a 220° FOV...
Nice to see the better 0.1% lows with the 9800X3D, but I don't think I'm upgrading anymore. The unbelievable power efficiency of the 7800X3D combined with its more-than-sufficient performance is still the champ.
intel's future is uncertain
hopefully it gets cheaper
Intel has to split its CPUs into productivity and gaming lines like AMD does; then it might have a chance.
super impressive. I won't be upgrading from my 7800x3d tho since I don't game at 1080p.
R.I.P. Intel
Nah, Intel without 3D V-Cache gives similar performance...
Intel's prices are a little high right now, but they are good in productivity tasks overall; the 245K beats Ryzen 5, and in some ways Ryzen 7 too, in overall productivity workloads
@@nicolasgarcia253 What do you mean similar? The 7800X3D was already better in gaming than the 14900K and the new 285K is worse than the 14900K. Now with 9800X3D Intel really is in the dust in terms of gaming and they can't even hope to reach that efficiency without entirely changing the architecture.
@@ONLY_BHARATVAASI Yes, in productivity. But the Intel CPUs are also consuming much more power for that, even the 285K which is supposedly considered efficient and yet nowhere close to Ryzen level of efficiency.
Ironically, Intel has reached the point of only coming out on top in productivity.
@@nicolasgarcia253 The 9800X3D at 1080P is on average 35% faster in games than the 285K according to Hardware Unboxed. How is that similar?
It’s absolutely crazy how AMD is getting this much of a performance gain with their 3D V-Cache technology.
Welcome the new king of gaming CPUs on the market, the Ryzen 7 9800X3D
Which would be the best pair with a 4070 Ti Super the 9800X3D or the 7800X3D?
Stupid question... it's always better to get the better-performing part
@@jigjigzacharias5038 Someone mentioned that I’d need a better GPU (like the 4090, which is really expensive for me) if I go with the 9800X3D. I’m new to PCs, and I’m asking because I’m planning to build my first one.
@@jigjigzacharias5038 Nooooooooooo
That CPU only matters at 1080p.
If you have a 1440p screen and an RTX 40-series GPU,
a 5600X3D will be king for you
@@BananaBeach519 Yeah, also I'm going to play at 1440p
@@jackie2-g8l Playing at 1440p isn't as demanding on the CPU, so get a cheaper one
9800X3D + RTX 5090 will be a monster combo!!!
9800X3D can be overclocked
7800X3D is still worth it, second best cpu in gaming
Lower wattage, and a lower price too
I love how most people who recently bought a 7800x3d are downplaying this uplift
I'm happy I waited and got the new one
Patience wins!
Yeah, the 1% lows and frame times are marvelous. Not to mention the overclocking. I'm a 7800X3D owner fighting internally... 😂
I feel you; the Ryzen 7 7800X3D being out of stock saved my money. But it's time for this baby
7800x3d was selling around $330. Best Gaming CPU at that price. No competition
So I have the R7 5800X3D and when the 7800X3D came out the lead of the new CPU was greater than it is now between the 9800X3D and the 7800X3D. But this CPU can also be overclocked and shows good performance.
5800x3d is still the best am4 cpu
My R7 5800X3D with an RTX 4070 is still good enough.
While the 9000 series is the 7000 series with improvements and a better node, the 7000 series is completely different from the 5000 series; also, let's not forget 7000 uses DDR5 while 5000 only supports DDR4. Now you know why. A 10% improvement over the previous gen, out of the box, on the same socket is nice; Intel couldn't even achieve that with their refresh...
@@igorrafael7429 My 5800X3D is undervolted. So it is more efficient and also a little faster. My CPU never clocks below 3.6 GHz (Okay, I bought it at the beginning of January 2024, so newer ID) and my DDR4 has 4000 CL 18 in a ratio of 1:1. This means it has a lot of performance and I am completely satisfied with it.
Compared to this, Intel's 285K is simply garbage
9800x3d vs intel 285 👍🏻
Seeing a good new CPU launch is refreshing, if you don't count the 7600X3D, which was limited supply
9800x3d rtx 4090 4k next
So there's only some improvement at 1080p? At 1440p it makes no difference?
Hope you will do this in 1440p and 4k too.
There will be no difference 😂
I want to see that as well because even though 1440p and 4K are more GPU bound, I feel like those resolutions would be more relevant to gamers who have a 4090+7800X3D or 4090+9800X3D. It is very unlikely that anyone games at 1080p with a top of the line gaming PC such as the 4090 and 9800X3D.
@@Gamer-q7v Exactly, why can't they benchmark at 1440p and 4K too? I'm sure the fps will be different, and that's what everyone wanted to see..
@@lycherasbry I wouldn't mind if so
@@JosephKarthic Yeah, that's what I am curious to see. The difference between the 7800X3D and 9800X3D would obviously be smaller at 1440p and 4K, but it could be bigger than one might think. The 4090 is so fast that a lot of CPUs bottleneck it even at native 4K. If the 5090 ends up being around 75% faster than the 4090 at 4K, then even CPUs like the 7800X3D could bottleneck the 5090 at native 4K max settings in some games.
I can't wait to get my pre-built with the 9800x3d for Christmas
Fortnite test where ?
People who play Fortnite should be jailed 🤮
Best 1% lows of any processor on the market; AMD takes the crown. The 14900KS gets only SLIGHTLY better 1% lows after over two months of tweaking, so that doesn't really count for Intel.
Its so good
Well, of course the CPU is faster. I understand that you are using a lower resolution to show the CPU performance. But at higher resolutions and graphics quality it's even tied with a 5700X3D in GPU-bound games.
Let's be realistic: no one is gaming at 1080p with a 4090 and either a 7800X3D or 9800X3D. This doesn't measure day-to-day use
@dkindig I know why it's done at 1080p. Just saying it's not realistic and doesn't represent everyday usage for that kind of hardware.
@@nilesmorris9132 Yes, I killed the comment, realized I was being a bit too heavy-handed. I agree that it is a poor benchmark too other than comparing the two processors. They should have done 1080p native across the board...
I will test it next week :) Ordered a 9800X3D today. I'm going to compare it with the 13700K/7800X3D at 1080p/1440p/4K
But this shows the processor's performance headroom for the future
I will be moving from 8th gen i5, looking forward to being > 🤯
It would also be nice to overclock the new processor; is that actually possible, unlike with the previous one?
Finally a chip that's more optimized and configured for modern-day components. Currently running the 7800X3D with an RTX 4080; just from the numbers I could tell the CPU was holding back GPU performance.
500 bucks to play at 1080p lets go !!!!
Now this is a "generational leap" I haven't seen since Pentium to Core 2 Duo/Quad...
it's a good jump, but it's more like Nehalem to Sandy Bridge. The P4 to C2D leap is probably something we'll never see again.... but mostly because P4 was a disaster from day one. The P4 was created because people & the media were obsessed with Clock Frequencies. Clock for Clock P4 is/was worse than PIII.
Nice game review!
What it's missing is testing performance during combat or in really demanding scenes
Other than that, keep up the good work!
No need to upgrade from a 7800X3D, yet the 9800X3D is still sold out everywhere xD
Great comparison, I like it :)
Really itching to go from a Ryzen 5 3600 to the 9800X3D, to pair with my current GPU, a 4080S. I wonder what kind of performance gain there is. Any chance you could do a side-by-side comparison of these?
The blood that runs in Lisa Su's family is simply genius: first you have Jensen, who founded the world's most valuable company, then you have Su, who saved AMD from the brink of bankruptcy.
Now i am getting excited for Zen 6.
Wow the 1% low improvement is brutal. 9800x3D FTW! Will be my next upgrade when the price normalizes.
People need to understand that 10% more power on AMD and on Intel is not the same thing, because 10% more power on AMD is 10% of 60 watts, which is 6 watts, while 10% more power on Intel is 10% of 100 watts, or 10 watts. My numbers are just examples, but my point is that 10% of a higher number gives a bigger increase in power consumption on Intel. Percentages can deceive you. Right now the Ultra 9 285K is bad and AMD is the best. Maybe in the future an Ultra 9 385K will be better, with low power usage but better performance.
Edit: For better context, the base power of previous AMD CPUs is lower than Intel's CPUs, and that's why the percentages aren't directly comparable.
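The commenter's point, that equal percentage increases on different baselines give different absolute wattage deltas, can be sketched in a few lines (the 60 W and 100 W baselines are the commenter's illustrative examples, not measured values):

```python
def absolute_delta(base_watts: float, pct_increase: float) -> float:
    """Absolute watts added by a percentage increase over a baseline."""
    return base_watts * pct_increase / 100

# Same 10% increase, different baselines (illustrative numbers only):
amd_delta = absolute_delta(60, 10)     # 6.0 W extra
intel_delta = absolute_delta(100, 10)  # 10.0 W extra
print(amd_delta, intel_delta)
```

So identical "+10%" headlines hide a larger absolute increase on the chip that already draws more.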
That's insane. Remember, you can still overclock this CPU!
what program are you using to show these stats?
The temperatures on the 9800X3D are amazing; it's even 4-5 degrees lower on average than the 7800X3D, especially impressive considering it draws more watts
Believe it or not, people that can afford this cpu probably won’t be playing at 1080p
Thank you for the test! Are there any plans to do the same at 1440p?
For a cpu update this performance is unbelievable❤
Does the 9800X3D also improve the lows in higher resolutions? I'm mostly playing in 1440p...
Damn, what's up with Hogwarts? That shit's hungry.
Even at 1080p the difference is not that big, and if you compare at 1440p or even 4K the difference is way smaller.
Ha that’s funny. $480 for the 9800x3d when scalpers have completely obliterated stock and are now selling for $800