Excellent review. I just ordered a new 9800X3D, an Asus TUF Gaming X870-Plus board, a Corsair 3-fan liquid cooler, and G.Skill (2x32GB) DDR5-6000 CL30 RAM. It'll be my 1st AMD in ~20 years.
Great review guys. I'm with you on the 9950X3D; I'm really excited and hopeful for it, because I make great use of my 7950X3D and it would be really nice to not be dependent on any software any longer for gaming perf. Although I've mostly had zero issues with games sticking to the correct CCD and working perfectly. That reminds me, it's nice to see that you guys didn't fudge the data on the dual-CCD X3D parts, which seems all too common nowadays; multiple other reviewers had them (7950X3D/7900X3D) running about as fast as a 5600X/5700X in gaming and left the data in their charts.
No, in a vast majority of games you'll barely notice an FPS difference in 4K if you already have a 7800x3d. I would not bother to upgrade unless you can get a good price selling your 7800x3d.
That'll be an amazing upgrade for you. I still have my 13700 so i'll wait at least a year or two before upgrading. My 4090 will go several years if it lasts that long
I have a 3700x too. I very much want to upgrade 😅 I think I might get a Zen 4 or Zen 5 Ryzen 9 over the 9800X3D though since I care more about productivity performance when it comes to upgrading my CPU.
@Manysdugjohn there's more to processors than just gaming performance. Plus it's not true that X3D is completely useless at 2K and 4K. At the very least, 1% and .1% lows tend to be a lot better. Also if you are going to stay in AM4, the 5700X3D is probably a better value at this point than the 5800X3D.
Not so fast! Run the tests at 1440p and 4K and it's a much tighter comparison. This chip is only good for one thing, and it's not even that good at it. Don't believe everything you hear from these tech heads, because they don't always tell the truth.
@@rodneyp9590 Exactly. In real-world scenarios (for me that's a 1440p/175Hz ultrawide or 4K/120 screens) you won't see much difference. If you're into competitive games like Valorant, Overwatch, or any MOBA, I reckon you'd see decent improvements.
Coming from a 5600x3D on my gaming rig, having a 5900X for my workstation, getting an ASRock X870 Pro RS WiFi for $129 on a price glitch at Micro Center and having no CPU in it, I am excited.
The deal was the 2nd week that the X870 motherboards were out on the market. They just got it in and it was supposed to be listed at $209 but for some reason it was $129. I took a detour to the store and got it. Not even the employees working knew if it was a quick sale or a glitch. All motherboards are listed online that are in stock for your store only but you have to buy in store.
8:16 FYI Jay/Phil, when you redo your stuff: that highlight needs to go all the way to the end, especially when comparing the numbers. It makes it easier for the audience at a glance; the fade-off comes too early.
I was hoping reviewers wouldn't miss the fact that the MSRP is basically what the street price of the 7800X3D was selling for. Yes, it's a slightly higher MSRP and we'd all like cheaper, but it is what people have shown they are willing to pay. I wonder if anyone besides Jay will point this out in their reviews?
They strategized that and did it well, curtailing production of the 5800X3D and 7800X3D at just the right time to ensure they both end up at or near the price of the 9800X3D at launch to avoid competing with themselves. What a lot of people don't realize is that a ton of 5800X3D dies were dumped on the 5700X3D production line. I got a 5700X3D about a month ago and it boosts to 4.2GHz all core. The 5800X3D typically has an all core boost frequency between 4.1 and 4.3 GHz. So I basically got a 5800X3D for half price. And, when they launch the 7700X3D in a couple of months, there will be a lot of 7800X3D binned silicon going into those. I think they originally planned to wait until next year to release the 9800X3D but the Core Ultra CPUs launched and AMD decided to move up the launch so they could mop the floor with intel during the holiday season.
@Lurch-Bot I would be surprised if we get a 7700X3D, unless they end up with a ton of dies they can't use in servers or for the RMA stockpile. I've heard the 7600X3D is intended to replace AM4 for budget gaming along with cheap B650 boards. However, as long as Zen 3 Epyc continues to sell, we could still get a "new" release on AM4 or continued stock of some AM4 CPUs; as long as AMD has to keep making Zen 3 to support server RMAs, they are bound to have some yields they can't use for servers and will have to sell to us. 😋
Thermals. Pure and simple... I bought myself a 7600X when it was first released, out of desperation since my old PC was struggling with everything, and thought to myself I don't need a 7800X3D. Recently I've been having trouble with heat; I checked to make sure the thermal paste was good, and my AIO fans were still working and cooling. I knew this chip was a few weeks away and decided then and there that if it showed it ran cooler, I'd be getting one ASAP. Watching multiple reviewers all saying it runs cooler than all the other CPUs has cemented my decision.
So far, out of the videos I have watched about the 9800X3D, Jay has been the only one to truly look at the bigger picture. In some of the games, OK, the averages may not have increased much, but the lows absolutely did. Considering how the processor was holding higher max clocks more steadily than anything else on the chart, that is a HUGE deal to me. Yes, there is something to be said for big increases in averages, but it's also a HUGE deal to have that kind of clock stability to help keep frame stability. I, for one, am stunned.
After 20 years of using Intel, I'm going to be a first-time AMD user. Honestly, I'm a little scared since I've never used an AMD CPU. Hopefully I'll get my hands on a 9800X3D soon, since the other parts are on order.
Nothing to be afraid of. I don't know why people say that. It's an x86-64 CPU, not like going to RISC and an entirely different OS. You don't need to recompile the kernel, you don't need to rewrite any code. You reinstall Windows and play your games. That's it, not rocket science.
I made the swap back in May and honestly, AMD is so much better, and the transition was flawless. And I didn't even get a 'fancy' chip (though I wish now I had, just because I do get a bit of a bottleneck on the CPU in some games with what I went with).
Right, I'm on a 5700X3D and this looks nice, but shoot, I spent so much money just upgrading from my 3600/2060S to a 5700X3D/4080S, and I got an OLED 4K monitor now too, so my pockets are saying ENOUGH 😭 I raise the flag. I'm still able to game maxed out at an enjoyable rate; at this point it's just a pixel chase, and at that cost I can't justify it atm. But it definitely didn't disappoint!
@@MrAnimescrazy Thanks bro, it took over 5 years to get here! I'm definitely grateful, and it surprises me how many people don't appreciate the current tech at our disposal. This stuff is powerful!
@@Brxndz_ Yep! Once you're on 4K you're pretty much GPU-bound, so at 4K you'd be at like a 6% bottleneck, which is more than fine! You'd still be using about 94% of your GPU. You'd still see a small uplift going to a 5800X3D or 5700X3D to get that closer to zero, but honestly you're just fine. Nice 4090! Especially having one in the current market is pretty sweet! If I could have afforded the difference I would have, but a 4090 is a different beast. I play pretty much everything maxed out at 4K and don't see any current games go below 60 once DLSS and frame gen are on, but with your card you can run just about any game native and get close to 60 FPS no matter the title. On average you've got to be in the 90-180 FPS range, and if you're talking those easy-to-run shooters you've got to be flying; I play OW2 maxed out on my 4K panel at 240Hz. Your GPU will still be a valuable player for at least 8 years, maybe way longer. Congrats on the purchase, dude!
Jay, your elaboration on the "funny looking graphs" was great. Though some people might compare one channel to another, I enjoy watching both yours and GN's content cause you're both different and it's a conversation where different perspectives might be valuable in their own ways. Thank you. Keep it up. I'm finally leaving Intel for my new build. Looking forward to this.
In 4K resolution, the 9800X3D is marginally faster than the 7800X3D, but when you pair it with the upcoming RTX 5090 the improvement should be more noticeable, even at 4K ultra settings. Also, you can overclock the 9800X3D to gain a few more FPS.
About what I expected. Great upgrade from a non-X3D 7th gen or even more so from a AM4 CPU or Intel 10th/11th/12th gen. Obviously people with a 7800X3D shouldn't get it.
Depends; the average-FPS slide shows a 100 FPS gain out of 450 FPS, meaning about a 20-25% gain, so the price-to-performance kinda makes sense, even if the true price of the 7800X3D was around $350 earlier this year. (13:25)
@@brtcobra Threadrippers smash Intel on the enthusiast side, and the 9950X is barely ±5% vs the 285K. Except for Quick Sync, Intel doesn't really have a big standout feature. Maybe the NPU is useful, but a decent dGPU is still better for AI tasks.
Eh, I built a 7800X3D system before the 9800X3D was announced. Spent $400 on the 7800X3D, and I de-lidded it. I have no reason to upgrade; I run a 1440p ultrawide, so I probably wouldn't see much of a boost.
You won't at 1440p; the results are almost identical. And even at 1080p the gains aren't that crazy, probably an 8-10% improvement on average. The 7800X3D will still eat up games for years to come.
9950X3D All-Core Productivity/MultiThread Estimate: take the 7950X3D vs 7800X3D difference and apply it to the 9800X3D. A real 9950X3D could perform better or worse. It could perform better if we get dual X3D CCDs, or if core scheduling is fixed vs the 7950X3D. It could perform worse if there is a power limit threshold being hit that the 7950X3D stays below. Of course there could be other X factors. Anyhow, this is a very crude basic estimate:

Cinebench R24 Multi: 7950X3D 2023 / 7800X3D 1088 = 1.86x. 9800X3D 1346 x 1.86 = 2503.
Estimated 9950X3D result: 2503. 4.7% below 285K +8400 CUDIMM +OC; 0.6% below 285K +8400 CUDIMM. Beats 285K, 9950X, Raptor Lake.

Cinebench R23 Multi: 7950X3D 35235 / 7800X3D 18055 = 1.95x. 9800X3D 22986 x 1.95 = 44858.
Estimated 9950X3D result: 44858. Beats 285K (all modes), 9950X, Raptor Lake.

Blender CPU (Monster): 7950X3D 234.1 / 7800X3D 118.4 = 1.98x. 9800X3D 146.13 x 1.98 = 288.9.
Estimated 9950X3D result: 288.9. Beats 285K (all modes), 9950X, Raptor Lake.

Blender CPU (Junk Shop): 7950X3D 166.72 / 7800X3D 83.16 = 2.00x. 9800X3D 103.73 x 2.00 = 207.96.
Estimated 9950X3D result: 207.96. *Crushes* 285K (all modes), 9950X, Raptor Lake.

Blender CPU (Classroom): 7950X3D 115.7 / 7800X3D 57.22 = 2.02x. 9800X3D 72.36 x 2.02 = 146.31.
Estimated 9950X3D result: 146.31. Beats 285K (all modes), 9950X, Raptor Lake.

Geekbench Multi: 7950X3D 19841 / 7800X3D 15194 = 1.31x. 9800X3D 18243 x 1.31 = 23823.
Estimated 9950X3D result: 23823. 4.6% below 285K +8400 CUDIMM +OC; 3.3% below 285K +8400 CUDIMM. Beats 285K, 9950X, Raptor Lake.

Time Spy Extreme: 7950X3D 11589 / 7800X3D 5918 = 1.96x. 9800X3D 7450 x 1.96 = 14589.
Estimated 9950X3D result: 14589. Near tie: 0.75% below 285K +8400 CUDIMM +OC. Beats 285K +8400 CUDIMM, 285K, 9950X, Raptor Lake.
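The estimate above boils down to simple ratio scaling; a minimal sketch of the same arithmetic in Python (using the comment's own Cinebench R24 numbers, which come from assorted reviews, not official figures):

```python
# Crude 9950X3D estimate: scale the 9800X3D score by the
# 7950X3D / 7800X3D ratio observed in the same benchmark.
def estimate_9950x3d(score_7950x3d: float, score_7800x3d: float,
                     score_9800x3d: float) -> float:
    scaling = score_7950x3d / score_7800x3d  # last-gen 16-core vs 8-core uplift
    return score_9800x3d * scaling

# Cinebench R24 multi-core numbers quoted in the comment above.
est = estimate_9950x3d(2023, 1088, 1346)
print(round(est))  # 2503 with these inputs
```

As the commenter notes, this ignores scheduling fixes, power limits, and any dual-CCD V-Cache changes, so treat it as a ballpark only.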
Not comparing to the 5800X3D feels like a miss to me. A lot of us are still on AM4; I would have loved to get a direct comparison instead of now needing to find a 5800X3D-to-7800X3D comparison and then come back to this video.
You don't need that. They still kick so much ass it's hilarious and they'll keep doing so for years. For $199 the 5700X3D is the deal of the decade for anyone still rocking an early gen Ryzen rig.
Would love to see scores comparing 1440p and 4K at max settings. I know these are the tests they tell you to run, but it'd be nice to see max-settings tests to really see what I'm paying for.
Buying a 5700X3D recently was one of the best PC component buys I've seen in the 30 years since I built my first PC. The fact I got one that boosts to 4.2GHz all core was a nice bonus. AMD recently dumped a ton of 5800X3D binned silicon onto the 5700X3D production line because they had to artificially curtail 5800X3D and 7800X3D production to get prices up so they wouldn't be competing with themselves with the 9800X3D launch, which I'm fairly certain they originally planned to launch next year. But with Core Ultra taking a nosedive, they saw an opportunity to clean house over the holiday season. I wouldn't expect the picture to be as bleak as you suggest. I think demand is just that high and they are probably churning out 9800X3Ds as fast as possible. It is unlikely they would move the launch up if they weren't prepared to meet demand.
I REALLY wish reviewers would also include real world scenarios. What difference does it make at 4K ultra settings? Compared to the 7800x3D and the competition? I get the test and its purpose but I couldn’t give a shit how many frames I’m getting at 1080p medium.
Because if you cared, you'd already know that there hasn't been a cpu in a decade with meaningful difference at 4k. Also it's not just the frames, sim times are on the graphs for Stellaris. Other sim games will show similar sim time benefits.
Gaming data at 1440p WS or 4k resolutions would be beneficial to determine the benefit of upgrading. I know this is a CPU test but at the higher resolutions does it even matter if you upgrade from a 7800X3D to a 9800X3D? I know a lot of companies rely on impulse buying but I need to spend wisely.
I am quite excited to see the 9950X3D results. My 7950X3D is a beast and I already have it overclocked (Linux is saying its max is now 5.9GHz, lol); I really want to see how they compare.
AMD claims that 9000 series supports up to 8000MT/s. However you have to switch the IMC to 1:2 mode, which introduces additional latency. Better to stick with the sweet spot, low latency DDR5-6000 RAM, maybe 6200-6400, as the IMC can run with a 1:1 ratio to the RAM. Similar performance to 8000 RAM, but much less expensive.
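The 1:1 vs 1:2 point above can be sketched with a little arithmetic. This assumes the often-quoted ~3000 MHz practical ceiling for the Zen 4/5 memory controller clock (a community rule of thumb, not an AMD spec):

```python
UCLK_CEILING_MHZ = 3000  # assumed practical limit for the memory controller clock

def imc_mode(ddr5_mt_s: int):
    """Return (mode, uclk_mhz) for a given DDR5 speed in MT/s.

    DDR transfers twice per clock, so MCLK is half the MT/s rate.
    The IMC runs 1:1 with MCLK if it can keep up; otherwise it
    drops to a 1:2 ratio, which adds latency.
    """
    mclk = ddr5_mt_s / 2
    if mclk <= UCLK_CEILING_MHZ:
        return "1:1", mclk
    return "1:2", mclk / 2

print(imc_mode(6000))  # ('1:1', 3000.0) -> the sweet spot
print(imc_mode(8000))  # ('1:2', 2000.0) -> faster RAM, slower IMC clock
```

This is why DDR5-6000 to 6400 tends to match or beat much more expensive 8000 kits on these CPUs: the raw bandwidth gain at 1:2 is offset by the added latency.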
One has to wonder how long it's going to be before software companies start actually taking advantage of the large caches these architectures are starting to ship with.
They literally do; you get about 5-10% higher productivity IPC with V-Cache. It's just that games are extremely latency-bound, while productivity is normally bound by the compute power of the core itself. Once again though, make no mistake: V-CACHE DOES HELP IN PRODUCTIVITY.
@@Frozoken It depends on which kind of productivity. If I remember correctly, fluid dynamics simulations heavily profited from the cache on the 5800X3D, so I wonder what kind of performance jumps the 9800X3D can achieve there with higher clocks and big cache.
When you write performance software you try and use tight code and data blocks that fit into L1 caches. Deliberately using super-sized stuff just to use more cache is called software bloat. Cache is a CPU feature that mitigates the relatively slow access to memory.
@@RobBCactive I understand what cache is. But just like game developers working to utilize more than one core/thread on multi-core CPU's, it's time for companies to try and utilize the full range of cache memory that's available.
With the second generation 3D V-Cache I'm actually more excited for a 9700X3D. I have a 3900X right now and have been wanting an upgrade, but the finances just aren't there for a $600 CPU, which is what it costs here in Norway, especially as I would also need a new motherboard and RAM.
These tests are all 1080p though... how many people actually game at 1080p by today's standards? I'd like to see the comparisons at 1440p and 4K. Let's see how it performs.
@@teatea7684 I have an i9-14900K, and all I hear from AMD nerds is how bad it is, when it crushes everything at 4K and 1440p. And I have a 32-inch 4K LG monitor and a 4090. Yeah, I'm not getting a POS AMD chip lmao
Those deciding between 9800X3D and 9950X3D take note that most games are written with the console architecture in mind i.e 8 core 16 thread. Its rare that you'll need more than that (for games) and frequency will be more important than core count as game cpu workloads are inherently serial (certain tasks can be broken out onto multiple threads though). If you do productivity stuff as well then it sounds like the 9950X3D is a good option otherwise the 9800X3D will be best.
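The "inherently serial" point above is essentially Amdahl's law. A quick sketch (the 70% parallel fraction is an illustrative assumption, not a measured number for any real game):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If (hypothetically) 70% of a frame's CPU work parallelizes, doubling
# 8 cores to 16 barely helps, which is why frequency and cache matter
# more than core count for games.
print(round(amdahl_speedup(0.7, 8), 2))   # 2.58x over one core
print(round(amdahl_speedup(0.7, 16), 2))  # 2.91x -- small gain for 2x cores
```

The serial fraction dominates quickly, which matches the advice: past 8 cores/16 threads, extra cores mostly pay off in productivity workloads, not games.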
The 9900X3D probably won't be too exciting; they should drop it altogether, really. It drops two X3D cores, so, much like last gen, it'll probably run slower than the 9800X3D in games but be a little better in productivity. The 9950X3D, I'll bet, will match or exceed the 9800X3D in games and crush it in productivity.
Honestly, those aren't worth it lol; just get the standard 9900X or 9950X for content creation/workflow. If I had to guess, they'll be a repeat of the 7950X3D.
@@itsprod.472 People don't seem to get that others exist who play games that very much like the cache, on top of having uses for the extra threads. It's irrelevant that the 8-core is faster just for games, when the 16-core gets you most of the way there on top of being better for many other things.
It's like Intel's 14th gen vs 13th gen, but AMD actually did the work in terms of performance and, more importantly, temps. Intel just turned up the dials and called it a day. AMD boosted power and performance but improved cooling vs the previous gen. Great work, AMD.
I just bought the 14700k because at that price bracket, upgrading from lga1200 for CAD$1000 ish I really wanted the best cpu at that price, what I found was that the 14700k was the best. Also I have a 3060, so I'm not pushing it very hard when I play cod, but I mostly play Fortnite, Valorant, and Minecraft anyways
As you talked about the poor design of the 3D cache from a heat transfer perspective, I wondered yet again if anyone has explored a design that would allow cooling the CPU from BOTH sides rather than just the “top.” There are challenges but sandwiching the CPU and socket between two cooling systems would be a giant leap forward.
You need to run benchmarks with a 4090 at 4K on ultra or extreme settings. No one with that card is going to run it at 1080p on medium. I know this shows the raw extra power, but I've got a 14900K and would like to know, in a real-world scenario, how much more FPS I would get if I switched to a 9800X3D platform using my PC the normal way, because I bet it isn't 20% higher; it's probably much less.
lol anti intel is cringe. I don't need a cpu that can be a gpu; 120fps and idgaf about 14700k vs whatever. As long as my 4090 isn't being held back from 120fps; idc.
It's because of gaming, which is arguably the most common purpose for computer users now. The new 9800X3D is going to be the best for that, but for video editing it's not the best, which explains why the 285K is at the top.
The reason for the 1080p tests is to minimize the GPU load and place as much load as possible on the CPU. The reviewers know 1440p or 4K are the go to for resolution. They are testing the performance though and need the GPU to not interfere with the test as much as possible otherwise all the CPUs will have the same performance.
@@AwesomesMan It's not irrelevant. First and foremost, the most popular resolution for gaming (more than 55% of gamers on steam) still use 1080p and for the test methodology, the way you test a product is by not letting any other component limit it. At the end of the day you can buy whatever part you prefer or brand you want to support, but saying these tests are irrelevant is like saying a bicycle is just as fast as a Ferrari while watching them glide around a school zone. Silly logic.
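The methodology argument above can be captured with a toy bottleneck model: delivered FPS is roughly capped by whichever component is slower, so only a light GPU load (low resolution) separates the CPUs. The FPS numbers here are made up for illustration:

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    # Each frame must be both simulated (CPU) and rendered (GPU);
    # the slower of the two sets the pace.
    return min(cpu_fps, gpu_fps)

# Two hypothetical CPUs that can simulate 200 vs 260 frames per second:
for res, gpu_fps in [("1080p", 500), ("4K", 120)]:
    slow_cpu = delivered_fps(200, gpu_fps)
    fast_cpu = delivered_fps(260, gpu_fps)
    print(res, slow_cpu, fast_cpu)  # 1080p separates them; at 4K both hit 120
```

That's the whole trade-off: 1080p-medium numbers measure the CPU's headroom, which you cash in later when a faster GPU (or a CPU-heavy game) raises the GPU-side cap.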
There is almost no performance increase from the 5800X3D to 9800X3D for 1440p High / RT games according to LTT's benchmarks. So, unless you are gaming at 1080p low, consider saving your money!
I'm loving the glimpse of the future. But I'll wait it out til prices drop and keep abusing my 7700X as I'm seeing 5.7 ghz (Heat doesn't bother me) and not even close to any bottle necking with my 4080 Super
Waste of time, these gaming charts... Who would run games at medium quality and 1080p with an RTX 4090 (or 4080 Super)? 😂 Do a test at 4K and quality settings, i.e. REAL WORLD tests... Thanks 👍😃
Glad to see the advancements in tech with this change. Flipping the chip made a ton of sense when this first came out, and I'm glad to see they've done it, and done it well. Looking at the R23 numbers though, I'll keep my old 5950X (obvious core-count difference) for the time being, since it's in 7900X-to-9900X territory anyway and I have yet to notice it being the bottleneck in any gaming I do. When the generational uptick is high enough that my system will no longer keep up with the mid-tier, I'll take a harder look. I think that's what most people budget for anyway.
I'm in the "I need a gaming/workstation CPU" so the 9950X3D is looking like it'll be the go to. BUT the leaked(?) Threadripper X3D has me more excited. CCs, streamers (and me) could really benefit from a "9960X3D" that has more PCIe lanes for things like running a second GPU and Capture Card(s) or something else, with fully populated SSD slots at the same time with no compromises.
If their lowest-end Threadripper is "affordable" enough, doesn't have weird scheduling/latency issues (I had these when I was on first-gen Threadripper), and also has a significant increase in I/O over what the 9950X3D offers... I might go back to the Threadripper platform.
“It’s like an entire tier of CPU jump in just generational performance”
As it should be.
It's what we want and demand, but it's an unrealistic expectation.
Is he comparing the ryzen to intel chip or to the 7800?
With the limitation in energy required, and the fact that some PC parts are the size of a few atoms, there isn't an infinite way of getting a massive boost from what we already have ( at least without redoing everything from the ground up and make your CPU incompatible with any other hardware ).
@@crash42modder63 When he made that claim? It was vs the 2nd performer on the list, i.e. the 7800X3D, i.e. the same-tier processor from the previous generation. That's why he's equating that generational gap to a tier gap. The Intel processors were like 2-5 tiers down in performance.
Man, 400+ FPS gaming is not a dream anymore, especially when the 5000 GPU series drops. We've reached new heights.
Remember in 2017 how Intel mocked AMD for gluing their Chips
oh how the turn table'd
And Intel just pulled both Bulldozer and Piledriver moments with both Meteor Lake and Arrow Lake.
Well, the "glued" CPUs were pretty bad. The 5800, 7800, 9800 and X3D variants etc. are all excellent, but they are 8-core CPUs, not 4+4 like the underperforming "glued" 3000-gen had. The newer "glued" 5000-7000 series 8+4 and 8+8 CPUs have had a fair number of issues with cache (the 9000 series finally fixes this with cache for all clusters, so the >8-core CPUs will likely be problem-free from now on). So the tables turned, but the other way from how you think :)
@@saricubra2867 I'll defend Bulldozer from the standpoint that a 9590 cost me 90 dollars and ran at 5.5GHz for 6 years without issue :D at a mere 225 watts.
Glue logic is circuitry that connects two integrated circuits.
You're too ignorant to know that, so you assume it's a "slam".
I'm not sure how 240 people equally as ignorant gathered in one place.
You know what's even funnier? Back in 2005, AMD released the Athlon 64 X2 CPU, a true dual-core design. Intel, with its Core 2 processor still a year away, had nothing to offer, so what did they do? They "GLUED" 2 Pentium 4 dies onto a single substrate and called it the Pentium D. Then, when AMD was gearing up for quad-core processors, Intel again had nothing, so what did they do? "GLUED" 2 Core 2 Duo dies onto a single substrate and called it the Core 2 Quad.
Intel had been "GLUING" CPUs together long before AMD gave us Ryzen, but Intel likes to not talk about that hypocrisy, and especially their fangirls don't.
7:42 Jay: “They may not be Gamer’s Nexus levels of charts…”
Jay is always acting like GN is the guy we told him not to worry about 😂
It is prophesized that the end of all other benchmarking channels would come when one combines the power of GN Steve and HUB Steve.
@@moldyshishkabob They need a 3rd Steve for the triangle.
I mean, GN is the only channel I really watch for reviews and comparisons, they're the best in my opinion, the other tech channels I follow, is just for fun.
Poor Jay.. Chart-envy is rough.
I like J. GN comes off as extremely smug. I've definitely had my own disagreements with their "accurate" benchmarks. At least if J is wrong, he's humble about it. GN went out of their way to try and smear LTT. For what? Just to be dicks.
if they fixed core parking then 9900x3d and 9950x3d might be huge for gaming
Indeed. Because both will have some non-3D cores they may also improve non-gaming performance over the all-3D core 9800x3d.
The need for core parking is a windows issue. Windows tries to load balance light loads across all the cores in a way that only makes sense for monolithic CPUs. The idea is to spread heat and wear evenly.
if both CCDs have additional VCache there won't need to be much more core parking than the non-X3D counterparts.
I am anxiously awaiting testing results on the 9950x3d. If they've fixed the issues from the 7950x3d then that's my next cpu without question.
@@Karlston The 9900X3D and 9950X3D will have V-cache on both CCXs.
I am really looking forward to the big X3D CPUs like you. I do imaging work, CAD/CAM, and gaming. I put it all into one box these days because I've moved to a smaller home, reducing my footprint in computers from 8 desks to 3. Some of that has to be dedicated to older systems supporting old work that's still active, so the remaining space has to be a "Do it all" personal system.
I want to see the 9950X3D with dual X3D cache dies and running cooler just like this 9800X3D! Hopefully AMD is listening, and stocking up good chips before the release of the 9900X3D and 9950X3D even now toward a before the end of the year launch date!
I think I finally found the replacement for my 8700K.......
I had an 8700K, but early October I replaced it with a 7700X. I'm happy with my upgrade, especially at the prime day deal price I got the 7700X for.
@@volticarchwing5123 The 7700X is a beast CPU.
heeyyyyyy a fellow 8700k here, and yeah gonna be looking to upgrade over the holidays
Same 😂😂
I upgraded my 8700k earlier this year to a 14700k oof
first CPU that manages to motivate me into an upgrade to AM5
Getting into AM5 at a good price would be the 7500F or 7600X; the 9800X3D is like a bonus if you have that much money to spend lol
that was the 7800x3d for me. intel is just broken
@@space.raider.2271 same here. Got all the parts on the way, and waiting for tomorrow. Intel has just dropped the ball, and this is a no brainer for me coming from a 9700k
@@mikecheckmike i had a 9900K that unfortunately broke, so I had to downgrade to a 9600K on an outdated platform. Been waiting for a good upgrade and this looks very promising.
@@CyberBeep_kenshi funny my intel works just fine
1% lows are so much better, so big winner here.
Only thing I care about tbh. The 1% and 0.1% lows matter more to me than averages, since I cap my frames at 137 for G-Sync (4K/144Hz monitor).
@Munky332 truth!
Thank AMD for clockspeed stability and bypass of the terrible IO die.
5800X3D was heavily flawed by clockspeed jitter (still better than base Zen 3 but worse than DDR5 Alder Lake), the 7800X3D was a huge improvement with higher stable clocks (still jittery at times but way better), and the 9800X3D is just higher clocks in general.
yeh 1% lows and 0.1% lows are really not talked about enough and its worth it just for them to be higher
FPS gaming especially demands high frame rates, but the 0.1% and 1% lows are much more important than the average FPS.
Fluidity and stable gaming are what you feel with good 0.1% lows, not the averages.
excited for 9950X3D!!!!
I'm less excited about that one, tbh. But I'm just that kinda fella that likes his gaming CPUs to be scheduled correctly in Windows. Everything above the _800X3D in past gens seems to struggle and it's never been a major uplift. I'd love to be proven wrong, though. It would mean that AMD finally figured out a decent solution to the problem.
@@pirojfmifhghek566 I'm not having any problems with scheduling in Windows 11 with the 7950X3D, Updates do happen. Also anyone buying a 7950X3D probably isn't gaming.
thats exactly what I'm waiting on as well, its nearly time for my 5900x to retire
@@Orginal_Sinner I have a 5900x too, and besides the performance being amazing, its caused so many headaches with weird temps and updates and compatibility issues. Definitely going to get the 9800X3D
If 3D cache is on all chiplets, it's an INSTA buy! If not, I'll get the 9800X3D
I'm getting ready to build (or have built for me) my final gaming computer after the first of the year (I'm 72 ). It looks like the 9900x3d may be my first AMD processor since the K7. Thanks for the info, Jay!
Cool
enjoy
That K7 is a crazy blast from the past. Isn't that like a 30 year old cpu?
I remember the K7, slot A right?
Build your own its easy and rewarding to say you built it! 😁
I just got a 5800X3D; it's just hard to keep up with new stuff, at least for me. I just hope it serves me well for at least 4-5 more years
😂🫵🏻
Same for me. Got the 5700X3D. I'll upgrade to AM5 in a couple years
5800X3D is still a killer chip, especially for the price. Plus side: you do NOT have to pair it with a 4080 Super or higher.
I'm still using a 4790k lol. I'm thinking about upgrading finally after 10 years.
It will :)
Never commented on a YT video, I had to for the flipped it and reversed it part, 5:00 kudos to the editor!
Feel sad for all those who will not fully grasp the joke. Mostly people without gray hairs xD (see: Missy Elliott - Work It)
@@josestefan listen I’m in my 30’s and got it lmao
No, for real. The attention to detail, yet simple edit to deliver the joke was perfect lol
You did it again Jay... even though you called out AMD's typo, there's one in the charts @20:16 ... What's a "DUATION"? 😜
😂😂😂😂
You gotta appreciate that he's humble enough to know that he frequently has errors in stuff too. 😂
@@getinthespace7715 Totally! After he said it, I just couldn't resist it. The troll in me was too stronk!
I'm so glad that Jayz brought in the 7950X3D benchmark in the graph. First I've seen among all the benchmark videos. Thanks!
Excellent review. I just ordered a new 9800X3D, an Asus TUF Gaming X870-Plus board, a Corsair 3-fan liquid cooler, and G.Skill (2x32GB) DDR5-6000 CL30 RAM. It'll be my 1st AMD in ~20 years
Same here, I haven't used an AMD in around 20ish years. I have mine up and running and it's doing very well in games.
Boom, dozen new 9800X3D videos in a second 😅 time to get some popcorn and enjoy the show. Starting with Jay of course.
Why? Starting from the bottom of the barrel?
Kind of ironic. Lol
@@Ktmzqw Yet here you are at jays channel. Hypocrisy is strong with you.
@thunderhawk51 Micro Center bundle price i'm guessing? lol
oops i meant @WoodakaDiddy, lol
I remember Falcon Northwest in the PC mags in the '90s; always a dream computer.
The flipped and inverted bit at the start was great. Thanks Phill 😂
If you know the reference...
Great review guys. I'm with you on the 9950X3D; I'm really excited and hopeful for it, because I really do make great use of my 7950X3D and it would be really nice to not depend on any software any longer for gaming perf, although I've mostly had zero issues with games sticking to the correct CCD and working perfectly. That reminds me, it's nice to see that you guys didn't fudge the data on the dual-CCD X3D parts, which seems all too common nowadays; multiple other reviewers did and had them (7950X3D/7900X3D) running about as fast as a 5600X/5700X in gaming, and left the data in their charts.
I’m only interested in 4K gaming. Is the fps improved enough to swap it from the 7800x3D?
No, in a vast majority of games you'll barely notice an FPS difference in 4K if you already have a 7800x3d. I would not bother to upgrade unless you can get a good price selling your 7800x3d.
I'm gonna miss my 6700k. it's been a faithful old horse but the time has finally come
It took you 10 years to make up your mind...
So long to Intel, ey?
bruh, that is like putting Joe Biden 2 years from now in an athletics competition. i mean, if money's an issue i get it, but that is dinosaur tech.
Similar finally upgrading from my 8700 not k
That'll be an amazing upgrade for you. I still have my 13700 so i'll wait at least a year or two before upgrading. My 4090 will go several years if it lasts that long
5:00 jayz going missy elliot...
Didn't think I'd see that one on my bingo card... 😅
I'm so pumped! 3700x served me right for 5 years but I'm excited to feel the x3d for the first time and it will be with the new GOAT
lol I recently got a 7800X3D, and I had a 2700X before that. Feels good to have a real meaningful upgrade!
I have a 3700x too. I very much want to upgrade 😅 I think I might get a Zen 4 or Zen 5 Ryzen 9 over the 9800X3D though since I care more about productivity performance when it comes to upgrading my CPU.
do your wallet a favor and just upgrade to a 5800X3D and skip the whole DDR5 line up.
It's pointless in 2K or 4K gaming to get those chips.
@Manysdugjohn there's more to processors than just gaming performance. Plus it's not true that X3D is completely useless at 2K and 4K. At the very least, 1% and .1% lows tend to be a lot better. Also if you are going to stay in AM4, the 5700X3D is probably a better value at this point than the 5800X3D.
3700x is still amazing! I'm a 1080p gamer so I won't need anything for a very long time.
EVGA "Not" 4090 in the background. LOL.
Great observation 🤣
AMD Ryzen 9800X3D the NEW KING of GAMING!
Not! Run the tests in 1440p and 4K and it's a much tighter comparison. This chip is only good for one thing, and it's not even that good. Don't believe everything you hear from these tech heads, because they don't always tell the truth.
That FPS jump from the previous generation is insane. The new king of gaming CPU has arrived.
They fudged the numbers by running it on medium settings. You don’t play 4090 on medium settings. Linus said almost no improvement at 4k
if you are playing at 1080p certainly. At 4k, there's basically zero gain.
@@rodneyp9590 they are testing the cpu not the gpu, of course at higher resolutions it will be a gpu bound situation
@@rodneyp9590 exactly, in real-world scenarios (for me that is a 1440p/175Hz ultrawide or a 4K/120Hz screen) you won't see much difference. If you're into competitive games like Valorant, Overwatch or any MOBA, I reckon you'd see decent improvements.
@@Abc6-abc so it’s not an improvement for its intended use, at least until we see the next generation GPUs
Coming from a 5600x3D on my gaming rig, having a 5900X for my workstation, getting an ASRock X870 Pro RS WiFi for $129 on a price glitch at Micro Center and having no CPU in it, I am excited.
Did you get that motherboard online? Was it at your Microcenter only? Can’t find that board at mine.
What's a price glitch? They messed up, but because you saw it you can screenshot it and they have to honor it, or something else?
Damn good deal 👍
The deal was the 2nd week that the X870 motherboards were out on the market. They just got it in and it was supposed to be listed at $209 but for some reason it was $129. I took a detour to the store and got it. Not even the employees working knew if it was a quick sale or a glitch. All motherboards are listed online that are in stock for your store only but you have to buy in store.
Just pre ordered the 9800x3d for £453 all in(UK) for delivery between the 11th-13th of November. 😁😁😁😁😁
Same here in the US. Bought at 8:50am, and they were sold out by 9:00 which was the release time on Newegg
This makes me tempted 😮👀 I mostly use my current pc for video production but this makes me want to build a SFF build with the 9800X3D for gaming.
Go ahead.
8:16 FYI Jay/Phil, when you redo your stuff, that highlight needs to go all the way to the end, especially when comparing the numbers; it makes it easier for the audience at a glance. The fade-off is too early
I was hoping reviewers wouldn't miss the fact that the MSRP is basically what the street price of the 7800X3D was selling for. Yes, it's a slightly higher MSRP and we would all like cheaper, but it is what people have shown they are willing to pay. I wonder if anyone besides Jay will point this out in their reviews?
They strategized that and did it well, curtailing production of the 5800X3D and 7800X3D at just the right time to ensure they both end up at or near the price of the 9800X3D at launch to avoid competing with themselves.
What a lot of people don't realize is that a ton of 5800X3D dies were dumped on the 5700X3D production line. I got a 5700X3D about a month ago and it boosts to 4.2GHz all core. The 5800X3D typically has an all core boost frequency between 4.1 and 4.3 GHz. So I basically got a 5800X3D for half price.
And, when they launch the 7700X3D in a couple of months, there will be a lot of 7800X3D binned silicon going into those. I think they originally planned to wait until next year to release the 9800X3D but the Core Ultra CPUs launched and AMD decided to move up the launch so they could mop the floor with intel during the holiday season.
@Lurch-Bot I would be surprised if we get a 7700X3D, unless they end up with a ton of dies they can't use in servers or for the RMA stockpile. I've heard the 7600X3D is intended to replace AM4 for budget gaming along with cheap B650 boards. However, as long as Zen 3 Epyc continues to sell, we could still get a "new" release on AM4 or continued stock of some AM4 CPUs; as long as AMD has to keep making Zen 3 to support server RMAs, they are bound to have some yields they can't use for servers and will have to sell to us. 😋
Honestly didn't expect you to drop a missy Elliott bar, kudos!
Thermals. Pure and simple... I bought myself a 7600X when it was first released out of desperation since my old PC was struggling with everything, and thought to myself I don't need a 7800X3D. Recently I've been having trouble with heat; I checked to make sure the thermal paste was good and my AIO fans were still working and cooling. I knew this launch was a few weeks away and decided then and there that if it showed to run cooler I would be getting one asap. Watching multiple reviewers all saying it runs cooler than all the other CPUs has cemented my decision.
true
So far, out of the videos I have watched about the 9800X3D, Jay has been the only one to truly look at the bigger picture. In some of the games the averages may not have increased much, but the lows absolutely did. Considering how well the processor held higher max clocks more steadily than anything else on the chart, that is a HUGE deal to me. Yes, there is something to be said for big increases in averages, but to me it is a huge deal to have that kind of clock stability to help keep frame stability. I, for one, am stunned.
1080p “ medium “ settings lol …….
After 20 years of using Intel, I'm going to be a first-time AMD user. Honestly a little scared since I've never used an AMD CPU. Hopefully I'll get my hands on a 9800X3D soon since the other parts are on order.
What are you scared of? This is not Ryzen 1000 or 2000 anymore. It's a very mature architecture now.
Nothing to be afraid of. I don't know why people say that. It's a x86/64 cpu, not like going to RISC and an entirely different OS. You don't need to recompile the kernel, you don't need to rewrite any code. You reinstall Windows and play your games. That's it, not rocket science
Amd is awesome, nothing to be afraid of
I made the swap back in May, and honestly, AMD is so much better, and the transition was flawless. And I didn't even get a 'fancy' chip (tho I wish now I had just because I do get a bit of a bottle neck on the CPU in some games with what I went with)
I have mostly used AMD. :)
I still think I have at least 4 more years on my 5800X3D...no real reason to upgrade just yet.
Right, I'm on a 5700X3D and this looks nice, but shoot, I spent so much money just upgrading from my 3600/2060S to a 5700X3D/4080S, and I got an OLED 4K monitor now too, so my pockets are saying ENOUGH 😭 I raise the flag. I'm still able to game maxed out at an enjoyable rate; at this point it's just a pixel chase at a cost I can't justify atm, but it definitely didn't disappoint!
@@charlessmith9369 nice upgrade
@@MrAnimescrazy thanks bro, it took over 5 years to get here! I'm definitely grateful, and it surprises me how many people don't appreciate the current tech at our disposal. This stuff is powerful!
@@charlessmith9369 Just got a 4090, have a 5800X; how good is your combo? I'm assuming at higher resolutions it'll become GPU bound and I'll be fine?
@@Brxndz_ Yep! Once you're at 4K you're pretty much GPU bound, so you'd be at like a 6% bottleneck, which is more than fine! You'll still be using about 94% of your GPU. You'd still see a small uplift going to a 5800X3D or 5700X3D to get that close to zero, but honestly you're fine. NICE 4090! Especially having one in the current market is pretty sweet! lol If I could have afforded the difference I would have, but a 4090 is a different beast! I play everything pretty much maxed out at 4K and don't see any current games go below 60 once DLSS and frame gen are on, and with your card you can run just about any game native and get close to 60 fps no matter the title. On average you'd probably be in the 90-180 fps range, and if you're talking those easy-to-run shooters you'll be flying; I play OW2 maxed out on my 4K panel at 240Hz. Your GPU will still be a valuable player for at least 8 years, but could go way longer. Congrats on the purchase, dude!
broke embargo by 2 min ;)
Brazil broke embargo by an hour. They apparently got the wrong embargo time from AMD.
@@realnzall they don't do daylight saving time
The new king, god emperor trump
Embargo police
The new King is TrumpX2TERMs
Jay, your elaboration on the "funny looking graphs" was great. Though some people might compare one channel to another, I enjoy watching both yours and GN's content cause you're both different and it's a conversation where different perspectives might be valuable in their own ways. Thank you. Keep it up. I'm finally leaving Intel for my new build. Looking forward to this.
At 4K resolution, the 9800X3D is marginally faster than the 7800X3D, but when you pair it with the upcoming RTX 5090, the improvement should be more noticeable, even at 4K ultra settings... Also, you can overclock the 9800X3D to gain a few more FPS.
About what I expected. Great upgrade from a non-X3D 7000-series chip, or even more so from an AM4 CPU or Intel 10th/11th/12th gen. Obviously people with a 7800X3D shouldn't get it.
Depends; the average fps slide shows a 100 fps gain out of 450 fps, meaning about a 20-25% gain, so the price to performance kinda makes sense, even if the true price of the 7800X3D was around $350 earlier this year (13:25).
"what, an intel CPU that does suck?" we've come a long way since the launch of ryzen💀💀💀💀
to be fair, they are made for productivity and that is where they kill everything. intel is smashing that.
@@brtcobra Threadrippers smash Intel on the enthusiast side, and the 9950X is barely ±5% vs the 285K. Aside from Quick Sync, Intel doesn't really have a big standout feature. Maybe the NPU is useful, but a decent dGPU is still better for AI tasks.
intel remains fine in laptops because of those E cores, for when you buy an ultra-thin laptop where you want long battery life
@@Riyozsu threadrippers cost an arm and a leg and have worse single core performance
Or not if you’re Intel
eh, I built a 7800X3D system before the 9800X3D was announced. spent $400 on the 7800X3D. and I de-lidded it. I had no reason to upgrade, i run a 1440p ultrawide, I probably wouldn't see much of a boost.
@@REAVER781 go to Linus tech tips his review touches on the 1440p and 4k aspect
That depends on how strong your GPU is and what you run.
you won't at 1440p; the results are almost identical. And even at 1080p the gains aren't that crazy, probably 8-10% improvement on average. The 7800X3D will still eat up games for years to come.
Absolutely no reason to upgrade. In fact it's more efficient
5:10 ~ As an Australian, I can confirm, things usually look better if you flip them. [thumbs up!]
0:22... NO!! An ACTUAL CPU..." that's jacked lol!^^
Been waiting for this review
5:00 Nice Missy Elliot reference^^
Can't wait to see the 9950X3D
9950X3D All-Core Productivity/MultiThread Estimate:
Take 7800X3D vs 7950X3D difference, apply to 9800X3D
A real 9950X3D could perform better or worse.
It could perform better if we get dual X3D CCDs
It could perform better if core scheduling is fixed VS 7950X3D
It could perform worse if there is a power limit threshold being hit that the 7950X3D stays below
Of course there could be other X factors. Anyhow, this is a very crude basic estimate:
Cinebench R24 Multi:
7950X3D 2023 / 7800X3D 1088 = 1.86X
9800X3D 1346 X 1.86 = 2503 Estimated 9950X3D
Result: 2503
4.7% below 285K +8400 CUDIMM +OC
0.6% below 285K +8400 CUDIMM
Beats 285K, 9950X, Raptor Lake
Cinebench R23 Multi:
7950X3D 35235 / 7800X3D 18055 = 1.95 X
9800X3D 22986 X 1.95 = 44858 Estimated 9950X3D
Result: 44858
Beats 285K (All Modes), 9950X, Raptor Lake
Blender CPU: Monster
7950X3D 234.1 / 7800X3D 118.4 = 1.98 X
9800X3D 146.13 X 1.98 = 288.9 Estimated 9950X3D
Result: 288.9
Beats 285K (All Modes), 9950X, Raptor Lake
Blender CPU: Junk Shop
7950X3D 166.72 / 7800X3D 83.16 = 2.00 X
9800X3D 103.73 X 2.00 = 207.96 Estimated 9950X3D
Result: 207.96
*Crushes* 285K (All Modes), 9950X, Raptor Lake
Blender CPU: Classroom
7950X3D 115.7 / 7800X3D 57.22 = 2.02 X
9800X3D 72.36 X 2.02 = 146.31 Estimated 9950X3D
Result: 146.31
Beats 285K (All Modes), 9950X, Raptor Lake
Geekbench Multi
7950X3D 19841 / 7800X3D 15194 = 1.31 X
9800X3D 18243 X 1.31 = 23823 Estimated 9950X3D
Result: 23823
4.6% below 285K + 8400 CUDIMM +OC
3.3% below 285K + 8400 CUDIMM
Beats 285K, 9950X, Raptor Lake
Timespy Extreme
7950X3D 11589 / 7800X3D 5918 = 1.96 X
9800X3D 7450 X 1.96 = 14589 Estimated 9950X3D
Result: 14589
Near Tie: 0.75% below 285K +8400 CUDIMM +OC
Beats 285K +8400 CUDIMM, 285K, 9950X, Raptor Lake
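The scaling method used in the estimates above can be sketched in a few lines of Python: take last gen's 16-core vs 8-core ratio per benchmark and apply it to this gen's 8-core score. The figures are the ones quoted in this thread; as the comment itself notes, a real 9950X3D could land higher or lower.

```python
# Sketch of the estimation method described above: scale the 9800X3D's
# score by the 7950X3D / 7800X3D ratio from the same benchmark.
# All scores are the ones quoted in the comment, not new measurements.

def estimate_9950x3d(score_7950x3d: float, score_7800x3d: float,
                     score_9800x3d: float) -> float:
    """Apply last gen's 16-core vs 8-core scaling to this gen's 8-core score."""
    scaling = score_7950x3d / score_7800x3d
    return score_9800x3d * scaling

# Cinebench R24 Multi figures from the comment
print(round(estimate_9950x3d(2023, 1088, 1346)))     # ~2503
# Cinebench R23 Multi figures from the comment
print(round(estimate_9950x3d(35235, 18055, 22986)))  # ~44858
```

The same helper reproduces each line of the estimate table when fed that benchmark's three scores.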
I just watched a video on why you switched back to Intel from AMD X3D. So, the first thing I want to know is about that in this video.
Really cool that AMD is putting out top notch products lately. Great vid thanks.
Also I was hoping to see 1440 or 4k charts for gaming
The new Ryzen 7 9800X3D CPU is making a huge impact in gaming
Not that long ago people were making a big deal of Intel having a 5% gaming advantage.
Intel fan today: "Who would run games, at low quality and 1080p"...there are always excuses!
@@pedroferrr1412 it is true, but the facts are not that bothered by that.
Best gaming CPU by a large margin.
This may be the first time I switch to AMD CPUs. Time to go Red.
Great and fun reviews as always Jayz! Oh boi. Now I just have to wait. Because the processor sold out in 4 minutes here in Sweden!
Thanks Jay & team for this great video.
I watched other channels cover it but yours was the most down to earth and really easy to understand!
9950x3d is going to be insane.
Got tired of the stairs 😂
Not comparing to the 5800X3D feels like a miss to me. A lot of us are still on AM4 and would have loved a direct comparison, instead of needing to find a 5800X3D to 7800X3D comparison and then come back to this video.
We have been redoing all of our tests and there hasn't been enough time to get to Zen 3.5 prior to this launch
Throw the 5700X3D in there too!
5800x3d is still great in comparison. If you can find it, and if you can find it for less than $400
Linus did
You don't need that. They still kick so much ass it's hilarious and they'll keep doing so for years. For $199 the 5700X3D is the deal of the decade for anyone still rocking an early gen Ryzen rig.
Would love to see the scores at 1440p and 4K at max settings... like, I know these are the tests they tell you to run, but it'd be nice to see max settings tests to really see what I'm paying for
I love that he mentioned the 7700x. Got mine with a MicroCenter bundle and its been more than enough for anything ive put it through
The 7700X is the best bang for the buck paired with the 4090; that's what I have
I can't wait until the processor is barely available to buy for another year and prices are 50% higher
Buying a 5700X3D recently was one of the best PC component buys I've seen in the 30 years since I built my first PC. The fact I got one that boosts to 4.2GHz all core was a nice bonus. AMD recently dumped a ton of 5800X3D binned silicon onto the 5700X3D production line because they had to artificially curtail 5800X3D and 7800X3D production to get prices up so they wouldn't be competing with themselves with the 9800X3D launch, which I'm fairly certain they originally planned to launch next year. But with Core Ultra taking a nosedive, they saw an opportunity to clean house over the holiday season.
I wouldn't expect the picture to be as bleak as you suggest. I think demand is just that high and they are probably churning out 9800X3Ds as fast as possible. It is unlikely they would move the launch up if they weren't prepared to meet demand.
Mmmm I think someone jumped the gun by two minutes!!! 😂😂😂😂
I REALLY wish reviewers would also include real world scenarios.
What difference does it make at 4K ultra settings? Compared to the 7800x3D and the competition?
I get the test and its purpose but I couldn’t give a shit how many frames I’m getting at 1080p medium.
Yes, which is especially funny since people buying the latest CPUs are even more likely to game at 4K. At least the small review channels have 4K charts
Because if you cared, you'd already know that there hasn't been a cpu in a decade with meaningful difference at 4k. Also it's not just the frames, sim times are on the graphs for Stellaris. Other sim games will show similar sim time benefits.
4k is a total scam anyway
@@SC.KINGDOM care to elaborate?
@@xModerax you can't run next gen AAA at 60fps on max settings. Big scam
Gaming data at 1440p WS or 4k resolutions would be beneficial to determine the benefit of upgrading. I know this is a CPU test but at the higher resolutions does it even matter if you upgrade from a 7800X3D to a 9800X3D? I know a lot of companies rely on impulse buying but I need to spend wisely.
I am quite excited to see the 9950X3D results, my 7950X3D is a beast and I already have it overclocked (Linux is saying it's max is now 5.9ghz..lol), really want to see how they compare.
Video starts at 00:58
Video starts at 27:44
What are the speed limits when using DDR5 on the 9800X3D? Considering it was 6000-6400 MT/s on the 7800X3D
Looks like it’s the same, from other reviews. DDR6000 cl30 or 28
@@jessiethedude No, that's the sweet spot for RAM speed, not the max speed.
AMD claims that 9000 series supports up to 8000MT/s. However you have to switch the IMC to 1:2 mode, which introduces additional latency.
Better to stick with the sweet spot, low latency DDR5-6000 RAM, maybe 6200-6400, as the IMC can run with a 1:1 ratio to the RAM. Similar performance to 8000 RAM, but much less expensive.
@@johnscaramis2515 ohhh whoops
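The 1:1 vs 1:2 trade-off described in the replies above comes down to the memory controller clock (UCLK): DDR5's transfer rate is twice the memory clock (MCLK), and in 1:2 mode the controller runs at half of MCLK, which is where the extra latency comes from. A rough sketch (the helper function is just for illustration):

```python
# Rough sketch of the clock relationships discussed above (values in MHz).
# DDR5 transfer rate (MT/s) is 2x the memory clock (MCLK); the memory
# controller clock (UCLK) runs at MCLK in 1:1 mode, or MCLK/2 in 1:2 mode.

def clocks(transfer_rate_mts: int, ratio_1_to_1: bool) -> tuple:
    mclk = transfer_rate_mts // 2
    uclk = mclk if ratio_1_to_1 else mclk // 2
    return mclk, uclk

print(clocks(6000, True))   # DDR5-6000, 1:1 -> (3000, 3000)
print(clocks(8000, False))  # DDR5-8000, 1:2 -> (4000, 2000)
```

Note that with DDR5-8000 in 1:2 mode the controller actually runs slower (2000 vs 3000 MHz) than with DDR5-6000 in 1:1 mode, which is why the cheaper sweet-spot kit performs similarly.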
AMD, stop kicking Intel, they're already in the ground
like you said, i really wanna build a 7950x3d build for games and work but im really excited to see what happens with the 9950x3d
Good luck getting it.
One has to wonder how long it's going to be before these software companies start utilizing large cache system that these architectures are starting to utilize.
They literally do; you get about 5-10% higher productivity IPC with V-Cache. It's just that games are extremely latency bound, while productivity is normally bound by the compute power of the core itself
once again though make no mistake, VCACHE DOES HELP IN PRODUCTIVITY.
@@Frozoken It depends on which kind of productivity. If I remember correctly, fluid dynamics simulations heavily profited from the cache on the 5800X3D, so I wonder what kind of performance jumps the 9800X3D can achieve there with higher clocks and big cache.
When you write performance software you try and use tight code and data blocks that fit into L1 caches.
Deliberately using super-sized stuff just to use more cache is called software bloat.
Cache is a CPU feature that mitigates the relatively slow access to memory.
@@Frozoken Apparently not as much as these benchmarks would show.
@@RobBCactive I understand what cache is. But just like game developers working to utilize more than one core/thread on multi-core CPU's, it's time for companies to try and utilize the full range of cache memory that's available.
Every single time you say "T.J.Max"... a little voice in my head goes: "shoe store?"
I'm Bri'ish and it's called T.K. Max here so every time I hear it a little voice in my head goes 'you mean T.K. Max?' 😂😂
@@DystopianOverture LOL
That's brutal... We haven't seen Intel have their ass handed to them this hard since the Athlon Thunderbird. And that was a long, long time ago
Not everything is about video games.
@@Ryzen9800-c2y Everything is about video games for X3d. Workstation they got 99xx already, even then AMD in the lead with their efficiency.
REALLY excited about the 9950X3D now after seeing AI workload boosted by the extra L3 cache. Especially if it's both CCDs getting Vcache!
With the second generation 3D V-Cache I'm actually more excited for a 9700X3D. I have a 3900X right now and have been wanting an upgrade, but the finances just aren't there for a $600 CPU, which is what it costs here in Norway, especially as I would also need a new motherboard and RAM.
These tests are all 1080p tho… how many people actually game at 1080p by today's standards? I'd like to see the comparisons at 1440p and 4K. Let's see how it performs
Lol these guys know that and they know Intel is still on top. The parrots will parrot that AMD is the best when they’re not
i don't think you know how CPU testing works..
@teatea7684 he's not wrong; a lot of gamers play at higher resolutions, so these scores really don't mean much to gamers at 1440p or 4K
@@teatea7684 I have an i9 14900K and all I hear from AMD nerds is how bad it is, when it crushes everything at 4K and 1440p. And I have a 32-inch 4K LG monitor and a 4090. Yeah, I'm not getting a POS AMD chip lmao
Fact, I have the Ryzen 9 7950X and it's still top of the charts.
Those deciding between the 9800X3D and 9950X3D, take note that most games are written with the console architecture in mind, i.e. 8 cores / 16 threads. It's rare that you'll need more than that (for games), and frequency will be more important than core count, as game CPU workloads are inherently serial (though certain tasks can be broken out onto multiple threads). If you do productivity stuff as well, then the 9950X3D sounds like a good option; otherwise the 9800X3D will be best.
I am gonna see what the 9900X3D & 9950X3D is gonna put out.
The 9900X3D probably won't be too exciting; they should drop it altogether, really. It drops 2 X3D cores, so much like last gen it'll probably run slower than the 9800X3D in games but be a little better in productivity. The 9950X3D, I'll bet, will match or exceed the 9800X3D in games and crush it in productivity.
Honestly those aren’t worth it lol, just get the standard variant 9900x or 9950x for content creation/workflow. If I had to guess they will be a repeat of the 7950x3d
@@kendil22 You are right. 9950X3D would be the one with 2 X3D cores.
not worth, will be slower in gaming. Proven with the prior gen. Losing those cores is critical to performance.
@@itsprod.472 People don't seem to get that there are others who play games that very much like the cache, on top of having uses for the extra threads. It's irrelevant that the 8-core is faster just for games when the 16-core gets you most of the way there, on top of being better for many other things.
It's like Intel's 14th gen vs 13th gen, but AMD actually did the work in terms of performance and, more importantly, temps. Intel just turned up the dials and called it a day. AMD boosted power and performance while also improving thermals vs the previous gen. Great work, AMD.
I just bought the 14700K because, upgrading from LGA1200 for CAD $1000-ish, I really wanted the best CPU at that price bracket, and what I found was that the 14700K was the best. Also I have a 3060, so I'm not pushing it very hard when I play CoD, but I mostly play Fortnite, Valorant, and Minecraft anyway
1:32 After all the work and money spent, you're telling us that the stairs are the reason you guys moved back to the old studio? 🤨
Yeah that was weird, maybe just being too sarcastic?
He is obviously joking. They likely just filmed this before they moved and felt like making a joke.
As you talked about the poor design of the 3D cache from a heat transfer perspective, I wondered yet again if anyone has explored a design that would allow cooling the CPU from BOTH sides rather than just the “top.” There are challenges but sandwiching the CPU and socket between two cooling systems would be a giant leap forward.
"We're talking about an actual CPU..." - Ouch, Intel just got burned!!!
You need to run benchmarks with a 4090 at 4K with ultra or extreme settings. No one with that card is going to run it at 1080p on medium. I know this shows the raw extra power but I've got a 14900K and would like to know in a real world scenario how much more FPS I would get if I switched to a 9800x3d platform using my PC in the normal way because I bet it isn't 20% higher it's probably much less.
lol anti intel is cringe. I don't need a cpu that can be a gpu; 120fps and idgaf about 14700k vs whatever. As long as my 4090 isn't being held back from 120fps; idc.
The 285K being at the top of most charts except in 1080p gaming (who does that?) I wonder why it gets so much hate.
I'd honestly want to know too
@JayzTwoCents
it's because for gaming purposes, which is arguably the more common use for computer users now, the new 9800X3D is going to be the best, but in terms of video editing it's not the best, which explains why the 285K is at the top.
The synthetic benchmarks are useless.
Doing these tests in 1080p is silly. Nobody buying a 500 dollar cpu will be playing anything at 1080p. looks promising tho.
The reason for the 1080p tests is to minimize the GPU load and place as much load as possible on the CPU. The reviewers know 1440p or 4K are the go to for resolution. They are testing the performance though and need the GPU to not interfere with the test as much as possible otherwise all the CPUs will have the same performance.
Agreed. They should do realistic tests with 4060ti at 4K. Performance differences at 1080p is irrelevant.
@@AwesomesMan It's not irrelevant. First and foremost, 1080p is still the most popular gaming resolution (more than 55% of gamers on Steam use it), and as for test methodology, the way you test a product is by not letting any other component limit it.
At the end of the day you can buy whatever part you prefer or brand you want to support, but saying these tests are irrelevant is like saying a bicycle is just as fast as a Ferrari while watching them glide around a school zone. Silly logic.
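The bottleneck argument in the replies above can be sketched with a toy model (all numbers made up for illustration, not real benchmark data): the frame rate you actually see is roughly the minimum of what the CPU and the GPU can each deliver, which is why 4K hides CPU differences and 1080p exposes them.

```python
# Toy frame-rate bottleneck model. The system can't render faster than
# its slowest stage, so displayed FPS = min(CPU limit, GPU limit).
# All figures below are hypothetical, purely to illustrate the point.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Displayed frame rate is capped by the slower component."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU limits: frames each CPU can prepare per second.
cpus = {"faster_cpu": 240.0, "slower_cpu": 180.0}

# Hypothetical GPU limits: the GPU renders far fewer frames at 4K.
gpu_limits = {"1080p": 400.0, "4k": 120.0}

for res, gpu_fps in gpu_limits.items():
    for name, cpu_fps in cpus.items():
        print(f"{res} {name}: {effective_fps(cpu_fps, gpu_fps):.0f} FPS")

# At 1080p the two CPUs show 240 vs 180 FPS (the gap is visible);
# at 4K both show 120 FPS, so the GPU masks the CPU difference.
```

In this sketch, testing at 4K would make both hypothetical CPUs score identically, which is exactly why reviewers drop to 1080p when the component under test is the CPU.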
1 min is unreal
Damnnnnnnnnnnnnnn the chart at 22:43 is awesome to see! Constant clock speeds and lower temps at the same time is huge for gaming!
Now what we need is 6 core Ryzen 5 9600X3D as a sweet spot for gaming.
Now I know for sure that the 9800X3D is my new CPU. I wanted to see comparisons with the 7800X3D to decide whether it makes sense to buy the 9800X3D.
There is almost no performance increase from the 5800X3D to 9800X3D for 1440p High / RT games according to LTT's benchmarks.
So, unless you are gaming at 1080p low, consider saving your money!
Should I upgrade my 7800X3D to the new 9800X3D?
@@Edgar-u8v9m if I owned a 7800x3d, I wouldn't upgrade. But I have a 5800x
@reigovahtre8418 yeah I have the 7800x3d and I'm thinking of getting the 9800x3d
None of us have watched the whole thing yet
1080p? ... Zzz zzz zzz. How about a part 2 showing real-world use at 2K or 4K? Please and thank you.
At 4K the difference would be less. There's no real gaming-CPU upgrade if you play at 4K, I think, since the GPU plays a bigger role there.
I'm loving the glimpse of the future, but I'll wait it out till prices drop and keep abusing my 7700X, since I'm seeing 5.7 GHz (heat doesn't bother me) and nothing close to bottlenecking with my 4080 Super.
"10 Minute Duation" @ 18:18 😅 You weren't kidding, Jay. I'm not seeing anything bad here, and might finally be inclined to adopt AM5.
Too bad it ends up being comparable to Intel chips, since people don't game at 1080p with a 4090. In other words, X3D chips aren't necessary.
Waste of time @ gaming charts...
Who would run games at medium quality and 1080p with an RTX 4090 (or 4080 Super)? 😂
Do a test at 4K and quality settings, i.e.
REAL WORLD tests...
Thanks 👍😃
Do you want to bench the GPU or the CPU? Because at 4K you are GPU-bottlenecked and benching the GPU, NOT the CPU 😂.
Stop embarrassing yourself 😂
It's to take the graphics card out of the equation as much as possible and just compare processor performance.
I ❤ my 7950X3D. All the quirks were worked out a while ago... and it runs very, very cool all the time.
Glad to see the advancements in tech with this change. Flipping the chip made a ton of sense when the idea first came out, and I'm glad to see they've done it, and done it well. Looking at the R23 numbers, though, I'll keep my old 5950X (obvious core-count difference) for the time being, since it's in 7900X-to-9900X territory anyway and I have yet to notice it being the bottleneck in any gaming I do. When the generational uptick is high enough that my system no longer keeps up with the mid tier, I'll take a harder look. I think that's what most people budget for anyway.
I'm in the "I need a gaming/workstation CPU" camp, so the 9950X3D is looking like the go-to.
BUT the leaked(?) Threadripper X3D has me more excited. Content creators, streamers (and me) could really benefit from a "9960X3D" with more PCIe lanes for things like running a second GPU and capture card(s), or something else, with fully populated SSD slots at the same time and no compromises.
If their lowest-end Threadripper is "affordable" enough, has no weird scheduling/latency issues (I had these back on first-gen Threadripper), and offers a significant increase in I/O over what the 9950X3D offers, I might go back to the Threadripper platform.