@@TheTechTank I would like to see Re-bar testing with properly tuned memory settings; almost nobody has done any testing on this, and I suspect the highly random memory access of Re-bar would benefit from tRRD/tFAW tweaks, among others.
Retested the chips recently and got 87 for the 5800X and 90 for the X3D; not sure what happened with the temps for this video. Did you get that 73°C after a 15 min+ stress test to make sure your AIO was running at steady state?
I'm a gamer and have the 5800X currently, and it's a great chip. However, with dropped frames in MSFS and other titles, I recently ordered the 5800X3D, as the extra smoothness the V-Cache offers in VR sold me. Besides, the spare chip will be perfect for the living room PC.
I think the same results can be achieved on any mobo with PBO2 Tuner. I managed 15094 in R23 with an old Gigabyte X470 mobo using that software and a good AIO (Arctic Freezer 280) in a closed Fractal case at 24°C ambient.
Something is not right with your tests, man. The Ryzen 7 3800XT beating the 5800X in Far Cry? A franchise that used to hate Ryzen 3000. The 5800X3D being only 3% faster, while in most reviews we've seen it consistently beating the 5800X by 10-20%, and in the same games you tested.
With this MSI Kombo feature, the 120 W 5800X3D beats the 240 W 12900K in gaming while being cheaper and cooler. Only someone who needs a workstation for professional use would need more threads.
I upgraded from the 3700X to the 5800X3D and have loved it the entire time of ownership; got it day 1. But I also think I got a "Platinum" level chip: it boosts to 4.35-4.45 GHz stock with nothing changed in the BIOS other than setting XMP. It's paired with a 2080 Ti, and boy was I happy to see the performance I was missing. I'm seriously thinking about using the 2080 Ti as my main GPU for possibly even 1-2 more years, skipping this current generation as well. The jury's still out on that one, though, till I see the performance of this latest gen.
I've only got a 2060. But, I've seen the kind of performance my 2600 is making unavailable to me. A 5800X3D (paired with a 6700/6800) would flog what I got.
I undervolted mine and it's fantastic. It allows it to boost to 4.5 GHz with the lower temps. I also paired it with a CL14-14-14-14 B-die 16 GB dual-rank kit, which worked better than 32 GB.
Truth be told I am out of this price bracket. My system is a 3200G using the embedded graphics. But you are a tech channel, you get a like. I am feeding that algorithm. As this comment does, showing engagement with a tech video. Thus encouraging the tube to place more tech videos into recommended. Thus ends the lesson.
Here in Italy these are the prices on Amazon (all prices include 22% VAT):
5800X - 325€ (the 5700X is only 10€ cheaper)
5900X - 460€
5800X3D - 550€
5950X - 745€
while the prices for the Intel counterparts are:
12600K - 340€ (same for the KF)
12700KF - 430€ (the 12700 is only 10€ cheaper)
12700K - 480€
12900K - 700€
12900KS - 800€
Here the X3D fills a gap in the market, but to me it seems too little, too late, and too expensive. At that price, the 5900X, and even more so the 5800X, let you save money for a better GPU. For builds with a 300€ CPU budget AMD remains the best choice, but at 550€ it's better to buy the i7 + an MSI PRO Z690-A DDR4 (240€, the cheapest Z690 you can find), even considering the usually cheaper AMD mobos.
Those prices are wild, man. It's even worse here in SA. Even if the X3D does fill a gap in the market, its overall improvement isn't high enough to justify the price.
Interesting; some older, lightly threaded games and strategy titles, which can't easily use the same data-flow paradigm as action games, have shown good benefits in other large benchmark tests. What I like about this review is that it gives a realistic assessment: upgrades are tempting but not always cost-effective.
Totally 💯 You can save $300 and go for the 5600 instead, and use that money for the Zen 5 platform (8600X) in 2 years. Even midrange will wipe the floor with the 5800X3D in 1-2 years. The Zen 5 midrange will probably also have more features and cores.
Well, try PBO offsets and turn off the MSI junk; same effect. As a 5800X3D enjoyer, my 1% lows are beyond anything else I've had. That's really what counts, imo. I'd rather have a constant 80 fps than 60 fps 1% lows.
Once the 7000 series launches, I will look at everything available and decide if I want to build a new system. I am open to the 5000 series, intel, and the 7000 series.
Yeah... not really. I'm pretty sure you're GPU-bound in some of the titles, and the chip really shines in MMO and online shooter titles, which no one wants to test because of the inconsistencies.
My hypothesis is that a game with very low 1% lows would benefit from the X3D's large cache, as the 1% low comes from either loading from disk to RAM or loading from RAM to cache. Since we test games with 32 GB of RAM nowadays, it's not disk loading anymore; it's cache starvation causing the stutters.
If that were the case, we would expect to see big 1% low improvements across the board, but that doesn't seem to be the case in my benchmarks or the ones I've read/watched.
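Since this exchange hinges on the 1% low metric, here's a minimal sketch of how it's typically derived from a frame-time log. Conventions differ between capture tools; this assumes the common variant that converts the 99th-percentile frame time to FPS:

```python
def fps_stats(frame_times_ms):
    """Average FPS and '1% low' FPS from per-frame times in milliseconds."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 99th-percentile frame time: the threshold only the slowest 1% exceed.
    worst = sorted(frame_times_ms)[min(n - 1, int(0.99 * n))]
    return avg_fps, 1000.0 / worst

# Example: steady 10 ms frames (100 fps) with occasional 30 ms stutters.
times = [10.0] * 990 + [30.0] * 10
avg, low = fps_stats(times)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")  # ~98 fps avg, ~33 fps 1% low
```

Note how a handful of stutters barely moves the average but drags the 1% low far down, which is why the metric tracks perceived smoothness.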
I just installed a 5800X3D. It's running MW2 at the 1440p Ultra presets, sitting around 65 degrees. Frames only average 90, but that's because I have a piece of dog shit 3050 that I will be swapping out. Guess I'll leave the volts for now. Running fine as ass.
AM4 and the RTX 3000 series are probably going to be the best-value platform till AM5, DDR5, Gen 5 M.2, and the RTX 4000 series are mainstream. I'm skipping next gen and maybe getting a 4000-series card down the line. The X3D is more for a focused gaming build. The 5900X/5950X are never going to beat the 12900, but the 12900 = too much $$ & energy.
Got my X3D installed the other day. Had the same findings with Kombo Strike. Mine boosts to 4450 MHz in games at 2K res, but runs 7-8°C cooler with Kombo Strike 3. The gaming experience seems more stable; I don't get the typical Ryzen dips that I'm used to. Coming from a 5800X with a 4800 MHz all-core OC. I got my X3D for $329 USD.
Free temperature drops are always nice to have, and less stuttering too. Definitely worth turning Kombo on for this chip. I'm actually planning a follow-up video to this one, because my numbers for the 5800X3D here were significantly lower than I'm getting in more recent testing. It's an amazing little chip.
I'm going to try this tonight! I saw the setting, so my MSI B550 has it. I noticed 65 to 77°C while gaming to be a bit excessive compared to what I'm used to: a 5600X sitting at 32 degrees idle, lol. My 5800X3D sits around 43... Have you found this setting to lower idles, or just stress-test and gaming workloads? I mainly care because I now have to mess around with my fan curves again, and I'm not used to them ramping up and down so often.
Just an update: temps dropped significantly, idle and most games by about 8 to 10 degrees. Some games still hit the mid-70s; I assume they're CPU-heavy. I'm looking at you, Subnautica...
@@Soj1337 The right Kombo level should lower temps across the board, including idle temps. That being said, the temps you're getting really aren't bad at all. If the fan noise is an issue, try a more gradual fan curve with lower values for all steps.
Good afternoon. Thanks for the interesting video. I want to share my own experience, and you can draw your own conclusions. My system was 16 GB DDR4 3733 MHz, a Ryzen 5 3600X, and a 3070 Ti Aorus Master. I thought my processor was already outdated and decided I needed to change it, so I bought a 5800X3D for $400. But what a surprise when, after testing in games at high graphics settings, it turned out that the difference in FPS was minimal! Yes, I understand that the processor should be tested at minimum graphics settings so that the video card doesn't limit the processor, but I usually play on high-ultra settings. And the question is: wouldn't it have been better to sell the 3070 Ti, add that $400, and buy a 4070 Ti or 7900 XT?
I just got a 5800X3D for $319 USD + sales tax, still way cheaper than MSRP. Once I get a motherboard for it, it's gonna be my upgrade from an i7 4790K. I was gonna skip it, but the value of the whole platform as of now was hard to beat. I also grabbed a 32 GB kit of DDR4 3200 MHz Corsair Vengeance LPX for, at the time, $79 USD, which I also bought for my streaming rig. That rig has a Ryzen 5 3600, itself an upgrade from an FX 8350, and was gonna run 64 GB of quad-channel DDR4, but I decided it was time to upgrade my gaming rig instead. With it being a massive discount from when you uploaded, there was no way I was passing up the opportunity, and I weighed my options and settled on a 5800X3D instead of going AM5 because of discount deals I couldn't refuse.
You definitely made all the right choices there, man. Sounds like you're going to get a HUGE upgrade for super cheap, especially compared to the new platforms. I'm even doing a follow-up to this video where the 5800X3D proves that it dominates way more than it did in this video.
I sent mine back after about 3 weeks of trying to get it to run at its peak performance without hitting its thermal limit and pulling itself back. I thought my X570 Aorus Ultra was the issue due to it having RAM issues and blue screens if any DDR4 kit is run past 2800 (it's a rev 1.0 🤮). So I ordered an MSI X570S Gaming Pro Carbon (not the EK version) and was extremely disappointed when I ended up hitting thermal limits again under basic benchmark loads. I started with a Corsair 360 mm, then bought an Enthoo Pro 2 and slapped the Arctic 420 mm AIO on it, and when that didn't work I got the EK Elite 360 mm with the 6 RGB Vardar fans. That one did the best, but at that point I was tweaking the BIOS and spending way too much time on trying to simply get to the 5800X3D's average score. So I finally said screw it and sent it back, and managed to get a golden-sample 3900X for $200 locally, and am super happy with it!
Yeah, I think I had the same issue. It screwed up the MB and possibly the CPU. I haven't retested it. That's an AGESA issue, but I think you can make manual settings that will cap temps or frequency, though that sucks. I was using an MSI X570S Edge. It's possible this could be a combo of things, like pushing memory up to 3600 CL14 for instance, which already drives up temps a few degrees and uses more power; I don't know. I know HDWU got very little difference between 3200 and 3600 memory. I'm not sending mine back though unless it fails on another MB after I cap it. On that system I lost the Ethernet controller, so you'd think that was the MB that went bad. Any diagram I look at shows Ethernet off the chipset. I sent the MB back. So I don't know. What I know is it happened after I stress tested it and it didn't throttle itself correctly and instead stayed in the red zone for probably too long.
Is Kombo Strike 3 the best option you can enable for your 5800X3D? Like lower temps + a performance boost? I'm running it right now without problems and I have low temperatures. Dunno if 1 or 2 give a better boost or not?
Only if you're unhappy with the performance you're getting out of the 3900X. You'll get a noticeable performance boost in a lot of games, but in others it won't be nearly as big. I'd stick with the 3900X until I can upgrade to the next-gen stuff like the 7000/X3D series.
Kombo Strike is just PBO2 Tuner, which anyone can download. Great CPU for single-player games, but I dislike the 0.1% fps lows in multiplayer games; especially in gunfights, my 5700G can average 30 fps higher lows for a smoother experience. Just wish they would unlock the BIOS for the 5800X3D and allow an all-core OC :/
Would've loved if AMD left the chip unlocked, but I understand why they had to lock it down. With the chip-stacking tech being brand new there's probably a higher chance of inexperienced overclockers bricking something.
Just because you can get a Lambo for less than MSRP doesn't mean it's a good idea. Not clickbait, I deliver on the title twice in the video. Just because it changes nothing for you doesn't make it any less true.
Here's a list from the MSI forums of boards that are supposedly supported, and yes the Tomahawk is on it: forum-en.msi.com/index.php?threads/kombo-strike-beta-bios-for-r7-5800x3d-cpu.376532/
@@TheTechTank Thank you! I'm planning to get the 5800x3d for the build I'm making in the next few months, I suspect by then Kombo Strike may be out of Beta!
The 5950X is WAY overkill for purely gaming. So many cores you'll never really use unless you're also running professional applications, lol. Regardless I'd stick with the 3900X for another month or two to see what the market looks like after the launch of the 7000-series.
There are reasons why specific games are benchmarked more than others, and it depends on what's being benchmarked, but personally I benchmark games I already own. Can't afford to buy games just for benchmarks just yet, lol.
Mechwarrior 5 would be another good one. But yeah, Space Engineers is a great CPU benchmarking game due to its physics engine. It's also very cheap compared to most other titles. I have to agree with odizzido. Seeing some different titles benchmarked rather than the same ones repeatedly would be rather refreshing.
I'm running a 3600 and a Zotac RTX 3090 Trinity on an X570 Aorus Ultra with a 750 W PSU and 16 GB of 3200 RAM. Does getting the 5800X3D require a new PSU, like 850 W and above? Or do you think the 5700X is the better choice here?
The 5800X3D has a 105 W TDP and the 3600's TDP is 65 W; you don't need a new PSU. Yes, I know TDP is heat, not power, but in this case the headroom is large, so I'm pretty sure you won't have any problem.
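To put rough numbers on that headroom, here's a quick budget sketch. The wattages are public ballpark figures (the 3090's rated board power, and the 142 W PPT package limit AMD uses for 105 W TDP AM4 parts), not measurements of this particular system:

```python
# Rough PSU budget with illustrative (not measured) draw estimates.
# Transient spikes can briefly exceed these, which is why headroom matters.
est_draw_watts = {
    "RTX 3090 (rated board power)": 350,
    "5800X3D (142 W PPT package limit)": 142,
    "motherboard / RAM / SSDs / fans": 60,
}
total = sum(est_draw_watts.values())
psu = 750
print(f"estimated load: {total} W of {psu} W ({100 * total / psu:.0f}%)")
# -> estimated load: 552 W of 750 W (74%), comfortable for a quality unit.
```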
The R5 7600X (Zen 4) will be faster in gaming, and probably in productivity too, vs the 5800X3D, despite the 7600X being only a 6c/12t CPU, and the price will be $300-330, which is $100+ less than the 5800X3D. But the problem will be the more expensive DDR5 and pricey AM5 mobos.
Not necessarily in simracing, but overall most probably, looking at the last couple of years' generational improvements and the leaks. And again, like you said, the new platform is looking to be considerably more expensive than the 5800X3D.
I hate how techtubers speak about AM4 motherboards and CPUs as being dead. Like that's it, it will die and people will bury it with roses and trumpet songs. What they forget to mention is that AMD's AM5 and Intel's LGA 1700 CPUs and motherboards still cost (and will keep costing) way too much. Let's not forget that DDR5 costs an arm and a leg. AM4 will still be relevant for at least 2 years from now. If you have a 5800X and pair it with a 3070 or above, you're not playing at 1080p, you're playing at 1440p, and we all know the higher the resolution, the less strain on the CPU and the more on the GPU.
Yeah... knew that was going to be a sore spot for some people. Even made air quotes in the A-roll but then covered it with B-roll absentmindedly, though at least you can still hear the exaggerated tone on "dead". Anywho, no tech YouTuber is saying that either the boards or chips will just disappear from existence, or that they'll all be made irrelevant as soon as next-gen drops. That has never been true for pretty much any platform. The current platform is going to stick around for a heck of a lot longer than another two years. Further into the video, I do actually "remember" to mention the additional cost of next-gen motherboards and DDR5, and that sticking to current-gen is absolutely the way to go if you're already invested in the AM4 platform and want to save a lot of cash. However, for new system builders starting from scratch, it would be a huge disservice not to remind them that there is no further upgrade path on AM4 beyond what we have right now, and that waiting to see what pricing, features, and performance are like on next-gen is the best course of action if possible. Or even just because the new launch will almost undoubtedly drive pricing for current-gen even lower than it already is, making the choice much easier to make.
@@TheTechTank I'm not talking about you; I'm talking about others and the way they use the phrasing "dead". I saw the video and saw that you made those remarks.
@@Teksers Ah, I see. I was confused because you said "what you forget to mention" and I assumed it was directed at me. If not, then I apologize for the misunderstanding 🙂
Can you tell me the difference in Warzone 2.0 on competitive settings at 1080p? Because with my 5600X and 3060 Ti I can achieve 130 fps max, and I've seen some videos where a 5800X3D with the same 3060 Ti and the same settings easily gets 200+ FPS...
It's hard to compare your system's performance to someone else's, even with similar specs. The people in those vids might have a better RAM configuration, factory overclock on the graphics card etc. You'll get a good FPS improvement with the 5800X3D, but probably not quite that much without any other upgrades too.
@@TheTechTank I'm not the only one who disagrees; in the comments there are at least 50 people who did the upgrade and gained a 60 to 70% uplift in fps. And today, when I install it, I will make a video comparison so you can see it with your own eyes 🤭
@@TheTechTank Bro, I bought the 5800X3D today over the 5600X, and just for your information: 127 fps average with the 5600X with everything on low in Warzone 2.0; now, with medium settings, 186 fps average! Day and night, bro, you don't have a clue and are speaking without proof. Another one ☝️: on Fortnite with the 5600X, DX12 on competitive settings, 223 fps average; now on the same settings, 480 average, and in some places 550 fps. Mind-blowing. And most importantly, the 1% lows now "sleep like a baby". All this from upgrading only the CPU from the 5600X to the 5800X3D. Beast!!!
@@PsYVla Wow, that's almost suspiciously amazing! Congrats man, that's definitely a massive upgrade! Planning a revisit to my review of the 5800X3D soon because of stuff like this. The chip deserves way more praise than I gave it in this video.
@@TheTechTank I'm telling you honestly, and if you want I can send you videos, screenshots, etc... This CPU is a beast, buuut it definitely needs a good cooler for the max 4450 MHz boost. Right now mine boosts to 4300 in game, and with a better cooler I will gain more fps, hahaha. Massive upgrade for just 150 USD because I sold my old 5600X :) Cheers
Someday it'll get to $199 once AM5 becomes the norm, just like how I got my 3600 + B450 for $140. Crazy that last year the CPU alone was $200+ just because of crypto 😂
I guess the reviewer was just late to reviewing this CPU; everyone else made this very same review about 3 months ago. I still like the video anyway! Hopefully he gets lots of views!
Being based in South Africa means problems with availability, unfortunately. However I don't think it's "the very same review" as Kombo Strike was not a thing back then 😂 Appreciate the well-wishes!
Not clickbait, delivered on the title in the video. A bit sensational, sure, but until YouTube stops favoring sensationalized titles and thumbnails, they're not going away.
@@NetNeelsie Lol, I was trying to see how many people I could make anxious by manhandling it the whole time 😂 then I dropped it and played myself, haha
@@TheTechTank AMD already confirmed that there will be a 7800X3D, 7900X3D, and 7950X3D, but that they won't launch until MUCH later (at least 6 months, but could be up to a year) because AMD wants to get rid of their overstock of AM4 CPUs (in particular the 5800X3D) first. This of course means they will have to lower the price, because being much more expensive than the 12-core 5900X with 64 MB cache and higher clock speeds makes the 5900X a much better choice, and being 50% more expensive than the 5800X it's based on also makes it bad value in comparison.
That's incorrect. AMD has not yet confirmed X3D versions of the 7000-series (though it doesn't take a genius to assume that they are indeed coming, regardless of the rumors and leaks). I doubt delaying the launch has anything to do with current overstock, and more to do with being able to "refresh" the 7000-series lineup down the road.
Why does nobody ever look at where this CPU is useful?!... You did the exact same base tests as all the others, and got the same results... but you have no idea why the CPU can be useful... The main reason your benchmark fails is that you tested the CPU with shitty games that use the GPU to generate post. The use case for this CPU is someone playing games built on engines like the Quake engine, Unreal Engine, or Unity 3D... Just test one well-known Unity 3D game, RUST... The difference between the 5800X and 5800X3D is around 28% more efficiency in post, dpi, and latency... because this game uses cache to generate and store maps and graphical elements. So the best benchmark you could provide would be to compare games using these specific engines... Have fun and test these: en.wikipedia.org/wiki/List_of_Unity_games
You seem to be under the impression that reviewers exist to paint the products they review in the best light. That's not how it works. The X3D is a gaming CPU, at one point the "fastest gaming CPU", and that claim needs to be tested across a wide range of games, not just in games where it performs particularly well. It doesn't matter if it, according to you, performs better in "games that are handled with Creation/Game Engines like Quake engine, Unreal engine, Unity 3D," all that matters is its gaming performance in general, which is how AMD marketed the X3D.
@@TheTechTank So you do exactly what I say... you use the advertising, or the name, or the color of a product to compare it with the others that state the same, which are the same color, etc... thanks... bye
@@jean-pierremichaud7330 Are you new here? That's how reviews work, how they've always worked, you're literally just describing the process. Cheers, you won't be missed.
Before getting to the end of this video, I have to address the testing methodology. If you want to see what a CPU can do, then you have to remove the GPU as a bottleneck. As unrealistic as it may sound, this requires either using the best GPUs for rasterization, or, if you want to use something like a 3070 or 6700 XT, turning down quality settings to the point where you see a clear difference between the CPUs. Otherwise, you're simply showing the limitations of the GPU.
When testing is done with the GPU removed as any form of bottleneck, the difference between the V-Cache part and the non-V-Cache part tends to be a bit larger.
You also can't test it with synthetic benchmarks, since they make poor use of L3 cache, and explaining this means really delving into the details of how a game divides work between the CPU and GPU, and I just don't want to type it all out. The gist of it is: when you play a game, you will turn around, move one way, come back, etc. A benchmark like Time Spy doesn't do that; it's a continual progression through a scene. So benchmarks in no way represent REAL LIFE gameplay, even though they're testing graphics engines. What lives in cache is data about the things next to you that you can interact with, but typically only if you interacted with THAT object. If the CPU never needed to fetch attributes about some object you can interact with, it probably won't be cached. But the more objects around you that you can interact with, the more likely the larger cache will benefit you. This is why in older fast-paced games that extra cache does very little, and the slower clock speed usually means a minor drop in fps.
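To make the working-set idea concrete, here's a minimal sketch (assuming Python with NumPy; my illustration, not the commenter's test) that does random lookups into progressively larger tables. On most machines the lookup rate drops once the table outgrows the L3 cache, which is the same mechanism by which a bigger in-game working set rewards a bigger cache:

```python
import time
import numpy as np

# Random lookups into tables of increasing size. Once a table no longer
# fits in L3, most lookups miss cache and stall on DRAM, so throughput drops.
rng = np.random.default_rng(0)
for mib in (4, 16, 64, 256):
    table = np.ones(mib * 1024 * 1024 // 8, dtype=np.int64)  # mib MiB of int64s
    idx = rng.integers(0, table.size, size=2_000_000)
    start = time.perf_counter()
    _ = table[idx].sum()  # gather: one effectively random access per index
    elapsed = time.perf_counter() - start
    print(f"{mib:4d} MiB table: {2.0 / elapsed:.0f} M lookups/s")
```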
But this can't be confused with data the GPU needs. The GPU has to paint the images, so it needs all the graphical data.
In Cyberpunk you really didn't test the CPUs, you tested your graphics card, because you used the Ultra preset. While you do get minor differences between the CPUs, mainly due to the latency issues that come along with slower clock speeds, or in the case of the 3800 XT its split CCX caches, so core-to-core comms along with the caching scheme are worse, it was really limited by the GPU.
A person can make the argument of "this is what a typical person would use for a GPU", but the problem is you're making claims about the CPU, and in the case of GPU limitations you need to be clear about what you're showing or you give the wrong impression. Remove the GPU as a bottleneck when CPU testing; otherwise the ONLY thing you can say is, "These are the results when using these CPUs and THIS GPU with THIS memory kit". And that's not very useful for people who don't have that GPU.
The 5800X3D actually gives a bit better performance than what you showed, and it REALLY increases 1% lows, but once again you have to remove the GPU as a bottleneck to see it.
First of all, I appreciate that your argument is well put together and thought out, and that you kept it respectful. It's something 99% of negative commenters seem unable to do, unfortunately.
I've been working with PC hardware for the better part of a decade now (via both this channel, and various other outlets), and I'm well aware of the importance of eliminating bottlenecks as much as possible in order to get the most valid results possible. I can't afford to buy the best of the best to eliminate every bottleneck as this is a tiny channel, so I did the best I could with the best I could afford. I did in fact, as you explained, test all of the games at lower settings. I have pages and pages of results for various graphical presets. I chose the settings I did because at lower settings there was practically no difference (percentage-wise) to the results I presented.
I'm also aware that synthetic benchmarks have their shortcomings; however, they're quick-and-dirty ways to compare performance, which is what a lot of people want to see, thus I included them. As for the actual games, I've phased out automated benchmarks almost entirely at this point; I think I only used those for two of the games on the list. The rest were benchmarked while running through a pre-determined, repeatable, but still human-controlled playthrough of a section of whatever game's being tested, thus in fact "real-world performance".
I believe I did evaluate the CPUs in Cyberpunk rather than the GPU, because I tested at lower settings and ran real-world benchmarks.
As for the rest of your argument, you and I both know that no single reviewer was using the same exact system, or the exact same settings while evaluating the X3D, and that there are variations in performance (however slight) between each and every X3D out there. I never claimed that my numbers were perfect, only that they were the numbers I got while accounting for every variable I could with the system I had available, and that's fundamentally the same as any other reviewer on any other system.
I believe the data I presented is accurate to what the X3D is capable of, and I stand by it. Could I have made mistakes in the methodology? It's possible. Could there still be a GPU bottleneck even though I've done my damndest to account for it? Sure. But all I can do is to keep working at it, and improving and refining whatever I can, and that's what I'm going to do.
@@TheTechTank Furthermore, showing plausible settings gives a better perspective than running the fastest possible GPU at low resolutions.
I'm experimenting with FSR/RSR to figure out if I'd be happy with lower resolutions, where the improved lows with an X3D are likely a greater benefit than average fps.
For less graphics-dependent games like strategy titles, which model a lot of things in-game, the cache can speed key things up, improving the late-game experience that's prone to lag.
Your figures suggest I need to see the price fall considerably for it to be worthwhile. People who bought cheap 2700Xs and B450 boards will see a much bigger improvement.
@@TheTechTank I've been working with PC hardware for 35 years. I built my first system with a 486DX2 around 1995. I was also a computer tech. I also worked IT. I also worked with mainframe systems from the late 60s and 70s, so I had to learn digital logic, most of a computer engineering degree, and Basic Electricity and Electronics (BEE), along with different systems. That meant learning the 1s and 0s inside a computer; the equivalent now would be understanding all the logic inside a CPU. So: binary, octal, and hex math, and the machine language of each machine, for which I've even coded using machine language. I also became a micro-miniature repair tech, which also required miniature repair (replacing any type of component on a PCB, basically). In all, about 3 years of schooling, 5 days a week, 7.5 hours a day, to learn everything I did, for working on multiple computer and electronic systems in the USN for 20 years, and along the way I learned PCs and networking. I've followed this industry since 1981.
You never know who's going to drop a comment on you, and throwing around qualifications isn't a valid point. You probably don't care about anything I just said. It doesn't change the data that I've seen over and over again, though.
HDWU got the inverse of the results you did, and he's been testing for a REALLY long time. While I don't always agree with every test decision he makes, he did a good job with the 5800X3D review test setup to show you what that CPU is CAPABLE of. It doesn't mean that's what people will see with THEIR system.
He used a 3090 Ti with high settings @1080p. You used a 3070 with ultra settings @1080p. The results you showed for Zen 3 tracked clock speed, then a little drop down to the 3800 XT. Basically you were seeing a bottleneck right there, even though it was different from the Intel CPUs, because they can bottleneck at different points, which I've seen many times in graphs over the 6 years I've been watching CPU testing on YouTube. In his testing the 5800X3D easily beat the 5800X.
We'll agree to disagree about what happened in Cyberpunk. It's a GPU intensive game. You got framerates in the 90s, he got framerates in the 140ish range. So, which system do you think had more work being done by the CPU? That's a rhetorical question.
Now, maybe it's a difference in testing. He doesn't use built-in benchmarking. And I can tell you that if you use built-in tests, the exact thing can happen that I described in my OP for Time Spy. Simply moving through an area doesn't really benchmark the entire gaming system; you have to do different movements, turn around, interact with objects, etc. You can use benchmarks for voltage testing, frequencies, etc., but they can be misleading for actual gameplay.
Or maybe this is a better way to understand what can happen, and why it looks like you were CPU limited in that situation: you were limited by both the CPU and GPU. A limitation with one can create a limitation for the other; it really doesn't matter which part creates the first limitation. I do understand that many GPU bottlenecks create basically a straight cutoff point, but normally when that happens you tend to find all CPUs tested being limited at about the same point. And often when you're CPU limited, one CPU architecture gets limited at one point and another architecture gets limited around another point. So it does look like you're CPU limited right there. And then what happens is you pick another part of the game, change what you're doing, interact with it, and then the results look different. So I'm not going to say I'm positively right, but I am going to say that because both the CPU and GPU have to perform at a certain level, they can affect each other, and both limits can be in play at the same time.
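A toy model of that interaction (my illustration, with made-up per-frame costs, not measurements): each frame takes roughly the longer of the CPU's and GPU's per-frame work, so the limiter can flip from scene to scene within the same run:

```python
# Toy model: frame time ~= max(CPU time, GPU time), so the bottleneck
# can flip between scenes even within a single benchmark pass.
scenes = {
    # (cpu_ms, gpu_ms) per frame -- illustrative numbers only
    "quiet corridor": (4.0, 9.0),
    "crowded market": (12.0, 9.0),
    "heavy combat":   (11.0, 11.5),
}
for name, (cpu_ms, gpu_ms) in scenes.items():
    frame_ms = max(cpu_ms, gpu_ms)
    limiter = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{name}: {1000 / frame_ms:.0f} fps ({limiter}-limited)")
```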
Steve at GN got something that looks like a complete GPU bottleneck, but it still wasn't. He ran 1080p Medium, and the 5800X3D beat all the other parts, including the 12900KS, except in 1% lows. All the high-end parts beat out the 5800X, and in 1% lows it was a bigger win.
Now, NO one on YouTube does a great job of testing GPUs or CPUs, but at least Steve from HDWU does a really good job when using games to test, because he does more than run built-in benchmarks and he eliminates bottlenecks as much as possible, with the exception of memory kits SOMETIMES, and my near 30 years of building systems and 20 years of gaming say he's right most of the time. Steve from Gamers Nexus also does a really good job, partially because he does a lot of validation of what the MBs are doing, verifies boost behavior so he knows he's testing based on the specs for the CPU, tests power at the rails, etc.
Hey, peace dude, it's your channel. We're going to agree to disagree about your response on why you did what you did...
Good luck with it.
If it came off like I was "throwing around qualifications" then I miscommunicated. I'm a nerd who plays with computer hardware and now makes videos about it. I'm an idiot. The only reason I brought it up was because trying to explain what a GPU bottleneck is to anyone who's in the PC Hardware space is nigh condescending, and I believe a person of your experience probably knows that, as I'm sure you've experienced similar things countless times. The number of times I've had comments trying to explain fairly elementary concepts to me is in the hundreds at this point, so I really shouldn't take offense at this point anymore, but it still irks me a little.
I'd be a fool to try to challenge Hardware Unboxed, Gamers Nexus or any of the old guard on their results. Their methodology, testing, and equipment are on an entirely different level compared to 95% of reviewers. I'm still learning from them and others with every new video they drop. But that doesn't change the fact that I ran multiple tests at various graphical quality settings for the titles I presented, and saw a negligible difference in most of them. In the few cases where there was a real difference, I shifted to using those settings and presented that data instead. I did my due diligence and presented the data I had (unlike some channels which I've seen "tweak" their results to more closely match the big channels'). But, as we both know, there's always the possibility of errors during testing, errors that could be caused by various issues (including a perceived GPU bottleneck). Even though I always try my best to control for those as much as possible, they can still and do still slip through.
Essentially what I'm trying and probably failing to say is that you could absolutely be correct in your argument, and even if you're only partially correct, I'm still going to work on improving every aspect of my hardware testing as I do with each video. I try my best, and sometimes that's not good enough, and I accept that. I still believe in the numbers I got, and even if they WERE more similar to what the big channels got, it doesn't change the overall sentiment or conclusion of the video, so I don't believe I misled anyone or misrepresented anything. Although anyone's obviously free to feel differently.
Anyway, while it may not have come off that way, I do legitimately appreciate your input on this, regardless of what I do and don't agree with. Ultimately it's the comments that challenge rather than praise my work that will help me improve and that's what this is all about. So, thanks! And if you do ever happen to come across another of my videos at some point, I hope you'll challenge me again. Peace!
The X3D is worth the money for gaming-centric builds. You're already burning the money for a 3080 Ti or 3090 Ti; is $100 going to matter for a gaming build? Money is still saved by not going the 12900K route.
I notice more than anything else a major reduction in stutters in VR. Was not in a position to wait for next gen, so happy with my purchase.
It's a great CPU no doubt. What did you upgrade from?
What level did you use?
I look at the 5800X3D as my last upgrade from my 3600. Once the price drops when the 7000s come out, I will get one and run it for another 3 years.
Definitely not a bad plan.
That's what I'm waiting for xD
So now, 1 year later, have you got your 7000-series CPU? I just dropped 300 bucks on the 5800X3D to replace my 5600X, which seems stupid, but in most games it was a 40 fps uplift. This setup (32 GB RAM and a 4080 Super) will last me at least another 5-6 years, and then I'll do a complete upgrade. Considering how bad the new Zen 5 is.
@@Chanharp Yeah, I am still hanging on to my 5800X3D. The main reason the new 9000 chips are not as performant is that AMD dropped the power profile down to 65 W. I am hoping that with the 9800X3D they give it the power, so I will just wait and see how that goes. I am not in a hurry to upgrade; I love my 5800X3D.
I jumped on the X3D craze recently with a new Ryzen 7 5800X3D. Coming from the 3000 series, I'm happy with my purchase. I wasn't excited by the price at launch, but with current pricing I got it for $279.00. It plays my older single-threaded games smoothly. Worth the purchase at a good price. Thanks, Tech Tank, for the late review!
The value for gaming really can't be beat right now, tbh.
You will only know the proper difference with an RTX 3090 Ti.
I have 2 systems: one with a 5800X3D, 3070, and 64 GB RAM, the other with 32 GB and a 5800X with a 4080. It's crazy how in many games the 5800X3D beats out the otherwise way more powerful system. In games like Tarkov and PlanetSide 2 (which are both RAM- and CPU-hungry) I see a 30-50% frame increase at the exact same settings and resolution. I can't wait to see what the 7000 X3D CPUs put out.
I swapped my 5950X for a 5800X3D with my 4090 (the CPU became the bottleneck) at 3440x1440, and in ~90% of games it's a HUGE boost in performance, like +30-40 fps. Sons of the Forest: 5950X 80-100 fps, 5800X3D 120-140 fps in the same spot. All Ubisoft games like AC +10-15 fps, FC 6 +40 fps. MMOs scale very well with 3D cache as well.
The 4090 broke the rules: even at high res the CPU impact is significant, so testing CPUs at 720p, not limited by a GPU, to show real CPU performance is very important.
Thanks for your input! Yip, since this video launched I went back to the drawing board and found that the X3D was in fact being bottlenecked by my GPU and the in-game settings for certain titles, so my next video is a follow-up to this one where I correct the numbers and explain where I went wrong so others can avoid the same mistakes.
When I upgraded my AM4 platform I went with the 5900X. $60 cheaper than the 5800X3D, with more cores and very little gaming difference. Dollar for dollar, the 5900X at $360 is the play to go with for a more balanced work/game split.
I'd much rather have a 5900X than the X3D any day, though to be fair I use my system for work too, lol.
Unless you’re a gamer
3D V-Cache eats any CPU-bound game for breakfast.
I've changed from the 5800X to the 5800X3D in my ASUS ROG Crosshair VIII Dark Hero X570 board and cool it with a Liquid Freezer II 420mm. At -30 it beats my 5800X B2, which hit 4.85 GHz, even in Cinebench R23!
The Ryzen 7 5800X3D is a great last hurrah for an otherwise good system with an older Zen/+/2 CPU.
Just got my 5800X3D yesterday. At stock settings it was getting pretty hot, but with Kombo Strike 3 my Cinebench R23 score is 500+ points higher and it runs almost 10 degrees cooler. Pretty nice from my point of view!
I got an MSI B550M Mortar, updated the BIOS to the latest one, and turned Kombo Strike 3 on ([Disabled] -> [3]). After a few minutes I got a reboot; I tried again, and random reboots are the result of Kombo Strike 3 for me. On the other hand, Kombo Strike 1 seems to be working fine, but no cooler temps for me.
What I think is going on, after months of research, is that CPU-limited games are the ones that get the most boost from the extra cache. So if you play a lot of modded Skyrim, Total War games, and the games towards the end of this test suite that are known to be CPU-limited, the 5800X3D will definitely give you a nice boost, especially if you're coming from the Ryzen 3000 series.
The 5800X3D makes a huge difference in older CPU-limited games. A nice upgrade for gaming coming from Zen 2.
@@Obie327 I actually dropped one in and am not disappointed! Used a negative offset in the BIOS and I'm getting very low temps and still good performance. Definitely an upgrade over my 3800 XT in most games. Even when there isn't a huge boost in max FPS, I usually see a big boost in 1% lows and average FPS. Definitely a great choice for anyone already on AM4.
Games aren't specifically optimized for increased cache; it's really just incidental that the repetitive data required by some games can be larger than the more common cache sizes. If anything, needing more resources is a potential sign of a lack of optimization. The productivity apps show no benefit because they don't need the extra cache to do their work.
Right, well-optimised code works with blocks of memory without unpredictable branching, so the pipeline of instructions doesn't stall.
It's easy to query processor cache sizes so the tightest work doesn't spill out of the very fastest cache.
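(To make that concrete, here's a minimal sketch of querying the cache hierarchy, assuming a Linux system where the kernel exposes it under sysfs; other platforms need their own APIs, and nothing below comes from the video itself.)

# Minimal sketch: read the CPU cache hierarchy from Linux sysfs.
from pathlib import Path

def print_cache_info(cpu=0):
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cache")
    for index in sorted(base.glob("index*")):
        level = (index / "level").read_text().strip()
        ctype = (index / "type").read_text().strip()  # Data / Instruction / Unified
        size = (index / "size").read_text().strip()   # e.g. "32K", "512K"
        print(f"L{level} {ctype}: {size}")

print_cache_info()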
Except for escape from Tarkov lol
Very cool video mate. Really is fun to watch your videos. Keep well friend.
Not late, just early. Love it. Love your review. Thanks.
Thanks, man!
You know what's impressive? The performance it gives to people who are still running even 3000-series Ryzen. No motherboard upgrade or anything. I doubled my FPS from a 3700X in most games and didn't have to spend $1000+ on AM5.
I just changed from a 5900X to a 5800X3D today, paired with my 3090 Ti Suprim X. I play at 3440x1440. I didn't see too much difference in overall FPS, but what I did notice is that games seem a lot smoother and the 1% lows increased nicely :) Maybe I lose a couple of FPS in some titles, but I guess game creators may start optimising new games to take advantage of the 3D cache (hopefully).
I would like to have seen benchmarks with Resizable BAR on and off, to see how that affects games now with V-Cache.
Most benchmarks I've seen lately run with re-bar on, so I stuck with it, but I too would like to see if there would be any difference with the 5800X3D.
@@TheTechTank I would like to see Re-bar testing with properly tuned memory settings, almost nobody has done any testing on this and I suspect the highly random memory access of Re-bar would benefit from tRRD/tFAW tweaks, among others.
Runs cooler than the 5800X?? I had a 5800X and its max temp in Cinebench was 73C. Upgraded to the 5800X3D and the max temp in Cinebench is 89C.
Retested the chips recently and got 87 for the 5800X and 90 for the X3D, not sure what happened with the temps for this video. Did you get that 73C after a 15 min+ stress test to make sure your AIO was running steady state?
I'm a gamer and currently have the 5800X, and it's a great chip. However, with dropped frames in MSFS and other titles, I recently ordered the 5800X3D, as the extra smoothness the V-Cache offers in VR sold me. Besides, the spare chip will be perfect for the living room PC.
The 5800X3D dominates in MSFS from what I've seen, so that's a very nice upgrade, especially if you'll be moving the regular chip to an HTPC.
I think the same results can be achieved on any mobo with PBO2 Tuner. I managed 15094 in R23 with an old Gigabyte X470 mobo using that software and a good AIO (Arctic Freezer 280), with a closed Fractal case and a 24 °C ambient.
Absolutely, I just think proper PBO tuning might be a little intimidating for the average user, whereas Kombo simplifies things a bit.
Something is not right with your tests, man. The Ryzen 7 3800XT beating the 5800X in Far Cry? A franchise that used to hate Ryzen 3000. The 5800X3D being only 3% faster, while in most reviews we've seen it consistently beating the 5800X by 10-20%, and in the same games you tested.
With this MSI Kombo feature, the 120 W 5800X3D is beating the 240 W 12900K in gaming while being cheaper and cooler. Only whoever needs a workstation for professional use would need more threads.
Great video as always. Love the content.
Thanks man, always appreciate you dropping by! Been a while 😅
I upgraded from the 3700X to the 5800X3D and have loved it the entire time of ownership; got it day 1. But I also think I got a "Platinum"-level chip: it boosts to 4.35-4.45 GHz stock with nothing changed in the BIOS other than setting XMP. It's paired with a 2080 Ti, and boy was I happy to see the performance I was missing. I'm seriously thinking about using the 2080 Ti as my main GPU for possibly even 1-2 more years, skipping this current generation as well. The jury's out on that one, though, till I see the performance of this latest gen.
I've only got a 2060, but I've seen the kind of performance my 2600 is making unavailable to me. A 5800X3D (paired with a 6700/6800) would flog what I've got.
I undervolted mine and it's fantastic. It allows it to boost to 4.5 with the lower temps. I also paired it with CL14 14-14-14 B-die, 16GB dual rank. Worked better than 32GB.
I went from 3700x to 5800x to 5800x 3d. Each time I was able to notice a huge jump in performance on the same motherboard. That’s crazy…
@@carloscruz7317 I'm running CL14, but it's 32GB of RAM total, however I also stream, video edit and etc so I needed more than 16GB.
@@bradyb2233 That's AMD's constant push for more performance for you, they're really putting the hurting on Intel
Truth be told I am out of this price bracket. My system is a 3200G using the embedded graphics. But you are a tech channel, you get a like. I am feeding that algorithm. As this comment does, showing engagement with a tech video. Thus encouraging the tube to place more tech videos into recommended. Thus ends the lesson.
here in Italy these are the prices on Amazon (all prices include 22% VAT):
5800x - 325€ (5700x is only 10€ cheaper)
5900x - 460€
5800x3d - 550€
5950x - 745€
while the prices for the Intel counterparts are:
12600K - 340€ (same for the KF)
12700KF - 430€ (12700 is only 10€ cheaper)
12700K - 480€
12900K - 700€
12900KS - 800€
here the X3D fills a gap in the market, but to me it seems too little, too late, and too expensive
at that price the 5900X, and even more so the 5800X, lets you save money to get a better GPU
for builds with a €300 budget for the CPU, AMD remains the best choice, but at €550 it's better to buy the i7 + MSI PRO Z690-A DDR4 (€240, the cheapest Z690 you can find), even considering the usually cheaper AMD mobos
Those prices are wild, man. It's even worse here in SA. Even if the X3D does fill a gap in the market, its overall improvement isn't high enough to justify the price.
@@TheTechTank I'll be happy if the next gen holds the current prices, but until now every gen has come with a bump in price.
Interesting. Some older lightly threaded games, and strategy ones that can't easily use the same data-flow paradigm as action games, have shown good benefits in other large benchmark tests.
What I like about this review is it gives a realistic assessment, upgrades are tempting but not always cost effective.
Totally 💯
You can save $300 and go for the 5600 instead, and use that money for the Zen 5 platform (8600X) in 2 years.
Even midrange will wipe the floor with the 5800X3D in 1-2 years.
Probably the Zen 5 midrange will also have more features and cores.
@@KryssN1 or just sticking with what I have, those with bargain 2700x might benefit far more than I would.
Well, try PBO offsets and turn off the MSI junk. Same effect. As a 5800X3D enjoyer, my 1% lows are beyond anything else I've had. That's really what counts, IMO. I'd rather have a constant 80 FPS vs a 60 FPS 1% low.
Just curious, which motherboard are you using with the X3D?
Other games may also benefit, but they'd probably require an even bigger cache.
Other games definitely benefit from it from what I've seen, but I don't think cache size is that much of a bottleneck yet.
There are two upgrades I want to get. One is the 5800X3D; the other is either a 6700 XT or a 6800.
I'd personally go with the 5800X and use the cash saved to go with that 6800 or XT. You'll see a bigger FPS bump from the GPU.
I had a 5800X/6800XT build getting 180-190 FPS on the docks/train tracks run in Warzone, 1080p low, no RT.
With the 5800X3D I get 210-220 on that same run.
What's with the single-core performance with Kombo Strike?
The flipping of the CPU in your hands XD
I hope it made a couple of people a little anxious 😂
Great review!
Appreciate it!
Once the 7000 series launches, I will look at everything available and decide if I want to build a new system. I am open to the 5000 series, intel, and the 7000 series.
That's the best plan for anyone right now. I'd hold off another month or two before even considering a new build or upgrade.
Are the gaming benchmarks in 1080p?
Yes, all at 1080p.
Yeah... Not really.
I'm pretty sure you're GPU-bound in some of the titles, and the chip really shines in MMO and online shooter titles, which no one wants to test because of the inconsistencies.
My hypothesis is that a game that has very low 1% lows would benefit from the X3D's larger cache, as the 1% low is either:
loading from disk to RAM,
or loading from RAM to cache.
So, since we test games with 32GB of RAM nowadays, it's not disk loading anymore; it's cache starvation causing the stutters.
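(An editorial aside: this hypothesis is easy to probe with a toy micro-benchmark that times random reads over working sets sized around the 5800X's 32 MB and the X3D's 96 MB of L3. A rough NumPy sketch, purely illustrative; absolute numbers depend on the machine.)

# Toy probe of "cache starvation": random gathers over a growing working set.
# Per-access cost jumps once the array no longer fits in the L3 cache.
import time
import numpy as np

def ns_per_access(array_mb, n_access=5_000_000):
    data = np.zeros(array_mb * 2**20 // 8, dtype=np.int64)
    idx = np.random.randint(0, len(data), size=n_access)
    t0 = time.perf_counter()
    data[idx].sum()  # each access lands on a random cache line
    return (time.perf_counter() - t0) / n_access * 1e9

for mb in (8, 32, 96, 256):  # below/at 5800X L3, at X3D L3, well past both
    print(f"{mb:3d} MB working set: ~{ns_per_access(mb):.1f} ns per access")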
If that were the case, we would expect to see big 1% low improvements across the board, but that doesn't seem to be the case in my benchmarks or the ones I've read/watched.
@@TheTechTank I need to look at a lot more reviews with many games to see if that could be one of the things.
Thanks for the idea.
A deep dive like what Hardware Unboxed does every now and then would be great to see, lol. "100+ game CPU cache testing!"
Interesting idea as in star citizen the 5900x has 1% lows of 27 fps and the 5800x3d as 1% lows of 35 fps. ua-cam.com/video/XLVy7AXf83E/v-deo.html
I just installed a 5800X3D. It's running MW2 at 1440p Ultra presets, sitting around 65 degrees. Frames only average 90, but that's because I have a piece of dog shit 3050 that I'll be swapping out. Guess I'll leave the volts for now. Running fine as ass.
What do you think of the Ryzen 7 7800X3D?
It looks like the only 7000-series chip worth getting if you're a gamer.
@@TheTechTank I was deciding between the 7900X or 7800X3D. Also was considering the 7700X
AM4 and the RTX 3000 series are probably going to be the best-value platform until AM5, DDR5, Gen 5 M.2, and the RTX 4000 series are mainstream. I'm skipping next gen and maybe getting a 4000-series card down the line. The X3D is more for a focused gaming build. The 5900X/5950X are never going to beat the 12900, but the 12900 = too much $$ & energy.
Got my X3D installed the other day. Had the same findings with Kombo Strike. Mine boosts to 4450 MHz in games at 2K res, but runs 7-8 °C cooler with Kombo Strike 3. The gaming experience seems more stable; I don't get the typical Ryzen dips I'm used to. Coming from a 5800X with a 4800 MHz all-core OC. I got my X3D for $329 USD.
Free temperature drops are always nice to have, and less stuttering too. Definitely worth turning Kombo on for this chip. I'm actually planning a follow-up video to this one, because my numbers for the 5800X3D here were significantly lower than I'm getting in more recent testing. It's an amazing little chip.
I'm going to try this tonight! I saw the setting, so my MSI B550 has it. I noticed 65 to 77 degree temps while gaming, which seems a bit excessive compared to what I'm used to; my 5600X sat at 32 degrees idle, lol. My 5800X3D sits around 43... Have you found this setting lowers idle temps, or just stress-test and gaming workloads?
I mainly care because I now have to mess around with my fan curves again, and I'm not used to them ramping up and down so often.
Just an update: temps dropped significantly, at idle and in most games, by about 8 to 10 degrees. Some games still hit the mid-70s; I'd assume they're CPU-heavy. I'm looking at you, Subnautica...
@@Soj1337 The right Kombo level should lower temps across the board, including idle temps. That being said, the temps you're getting really aren't bad at all. If the fan noise is an issue, try a more gradual fan curve with lower values for all steps.
Good afternoon. Thanks for the interesting video. I want to share my own experience and you can draw your own conclusions. My system was 16GB DDR4 3733MHz, a Ryzen 5 3600X, and a 3070 Ti Aorus Master. I thought my processor was already outdated and decided I needed to change it, so I bought a 5800X3D for $400. But what a surprise when, after testing in games at high graphics settings, it turned out the difference in FPS was minimal! Yes, I understand that a processor should be tested at minimum graphics settings so the video card doesn't limit it. But I usually play at high-ultra settings. So the question is: wouldn't it have been better to sell the 3070 Ti, add that $400, and buy a 4070 Ti or 7900 XT?
I'll give you a short answer: if you're playing at 1440p, no; if you're playing at 4K, then yes.
I just got a 5800X3D for $319 USD plus sales tax, still way cheaper than MSRP. Once I get a motherboard for it, it's gonna be my upgrade from an i7 4790K. I was gonna skip it, but the value of the whole platform right now was hard to beat. I also grabbed a 32GB kit of DDR4 3200MHz Corsair Vengeance LPX for, at the time, $79 USD, which I'd also bought for my streaming rig, which has a Ryzen 5 3600 (itself an upgrade from an FX 8350) and was gonna run 64GB of quad-channel DDR4. But I decided it was time to upgrade my gaming rig, so with it being a massive discount from when you uploaded, there was no way I was passing up the opportunity. I weighed my options and settled on a 5800X3D instead of going AM5, because of discount deals I couldn't refuse.
You definitely made all the right choices there, man. Sounds like you're going to get a HUGE upgrade for super cheap, especially compared to the new platforms. I'm even doing a follow-up to this video where the 5800X3D proves that it dominates way more than it did in this video.
I sent mine back after about 3 weeks of trying to get it to run at its peak performance without it hitting its thermal limit and pulling itself back. I thought my X570 Aorus Ultra was the issue, due to it having RAM issues and blue screens if any DDR4 kit is run past 2800 (it's a rev 1.0 🤮). So I ordered an MSI X570S Gaming Pro Carbon (not the EK version) and was extremely disappointed when I ended up hitting thermal limits again under basic benchmark loads. I started with a Corsair 360mm, then bought an Enthoo Pro 2 and slapped the Arctic 420mm AIO on it, and when that didn't work I got the EK Elite 360mm with the 6 RGB Vardar fans. That one did the best, but at that point I was tweaking the BIOS and spending way too much time on simply trying to reach the 5800X3D's average score.
So I finally said screw it and sent it back and managed to get a golden sample 3900X for $200 locally and am super happy with it!
Yeah I think I had the same issue. It screwed up the MB and possibly the CPU. I haven't retested it. That's an AGESA issue but I think you can make manual settings that will cap temps or frequency but that sucks. I was using an MSI X570S Edge.
It's possible this could be a combo of things, like pushing memory up to 3600 CL14 for instance which already drives up temps a few degrees and uses more power, I don't know. I know HDWU got very little difference between 3200 and 3600 memory.
I'm not sending mine back though unless it fails on another MB after I cap it. On that system I lost the ethernet controller so you'd think that was the MB that went bad. Any diagram I look at shows ethernet off the chipset. I sent the MB back.
So I don't know. What I know is that it happened after I stress-tested it and it didn't throttle itself correctly, instead staying in the red zone for probably too long.
Yeah, mine is under a full custom loop with two 360mm radiators and it still almost hits 80 °C. It's a very hot chip.
@@bradyb2233 Nothing wrong here. My chip runs about 76-80 degrees Celsius on average, and 84 degrees Celsius MAX.
I think 80 °C is normal on this chip. It won't throttle until 90 °C, I believe.
Eyyyyy new video
They thought I was gone for good, but the haters don't know me as well as you do, Waseem 😂
Is Kombo Strike 3 the best option you can enable for your 5800X3D? Like lower temps + a performance boost? I'm running it right now without problems and I have low temperatures. Dunno if 1 or 2 give a better boost, or no?
I found Kombo 3 to be the best all around.
PBO2 Tuner probably does the same trick, but at OS level.
Sure, to some extent, however the process is significantly more involved than a toggle in the BIOS.
I currently have a 3900X. Is it worth the upgrade?
Only if you're unhappy with the performance you're getting out of the 3900X. You'll get a noticeable performance boost in a lot of games, but in others it won't be nearly as big. I'd stick with the 3900X until I can upgrade to the next-gen stuff like the 7000/X3D series.
You should have tried the game Control; it slaps.
Control's great. I use it for testing GPUs as it's super graphically demanding. It's not quite so great for CPU testing.
Kombo Strike is just PBO2 Tuner, which anyone can download. Great CPU for single-player games, but I dislike the 0.1% FPS lows in multiplayer games, especially in gun fights; my 5700G can average 30 FPS higher lows for a smoother experience. Just wish they would unlock the BIOS for the 5800X3D and allow an all-core OC :/
Would've loved if AMD left the chip unlocked, but I understand why they had to lock it down. With the chip-stacking tech being brand new there's probably a higher chance of inexperienced overclockers bricking something.
The click bait title was definitely disappointing. And you can get the 5800X3D for less than MSRP.
Just because you can get a Lambo for less than MSRP doesn't mean it's a good idea. Not clickbait, I deliver on the title twice in the video. Just because it changes nothing for you doesn't make it any less true.
Which boards offer Kombo Strike? I'm leaning towards the B550 Tomahawk for my build
Here's a list from the MSI forums of boards that are supposedly supported, and yes the Tomahawk is on it: forum-en.msi.com/index.php?threads/kombo-strike-beta-bios-for-r7-5800x3d-cpu.376532/
@@TheTechTank Thank you! I'm planning to get the 5800x3d for the build I'm making in the next few months, I suspect by then Kombo Strike may be out of Beta!
I hope they get all of the kinks out by then!
Is Kombo strike only for the 5800x3d? I was thinking about upgrading to the beta bios but I have a 5600x.
Yip, only for the 5800X3D since the other chips can all be overclocked manually.
Is it worth an upgrade from my 3900X for gaming? I was thinking about the 5950X instead of the 5800X3D...
The 5950X is WAY overkill for purely gaming. So many cores you'll never really use unless you're also running professional applications, lol. Regardless I'd stick with the 3900X for another month or two to see what the market looks like after the launch of the 7000-series.
@@TheTechTank thank you.
I bought the 5800X3D because of DCS and Microsoft Flight Sim. That said, I think the 5900X is the best value from AMD, over the 5950X and 5800X3D.
Which version of Kombo Strike is better to use (1, 2, 3)?
In my case it was Kombo 3, but you could give Kombo 1 a try too.
Everyone always benches the same games. I would love to see some stellaris, space engineers, or factorio benches some time.
There are reasons why specific games are benchmarked more than others, and it depends on what's being benchmarked, but personally I benchmark games I already own. Can't afford to buy games just for benchmarks just yet, lol.
@@TheTechTank Yeah, if I had unlimited money I would buy a 5800X3D and bench it myself. We are each missing what the other person has.
Ah, capitalism.
Mechwarrior 5 would be another good one.
But yeah, Space Engineers is a great CPU benchmarking game due to its physics engine. It's also very cheap compared to most other titles.
I have to agree with odizzido.
Seeing some different titles benchmarked rather than the same ones repeatedly would be rather refreshing.
Does anyone know if the 5800x3D works wonders with Bethesda games?
I saw one post claiming that it's good for certain mods in Fallout 4, but I can't imagine it'll "work wonders" in Bethesda games specifically.
Should have used an RTX 3090 Ti for testing.
Send me one.
Am running a 3600 and a Zotac RTX 3090 Trinity on an X570 Aorus Ultra with a 750W PSU and 16GB of 3200 RAM. Does getting a 5800X3D require a new PSU, like 850W and above?? Or do you think the 5700X is the better choice here?
The 5800X3D has a 105 W TDP; the 3600's TDP is 65 W. You don't need a new PSU. Yes, I know TDP is heat, not power, but in this case the margin is big, so I'm pretty sure you won't have any problem.
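(A rough worked budget backs this up, assuming stock power limits; these are ballpark figures, not from the video: a 3090 draws around 350 W, the 5800X3D's package power limit is about 142 W, and the rest of the system might take ~100 W. 350 + 142 + 100 ≈ 590 W, which leaves comfortable headroom on a 750 W unit.)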
@@shayeladshayelad2416 THNX
Just to back Shayelad up, 750W is totally fine for that system, especially since you won't be overclocking the 5800X3D.
@@TheTechTank Thanks man. Or I'd rather go for the 5700X; it's 24K INR and the 5800X3D is 42K INR. Will mostly be getting a Neo G7/G8 4K monitor.
@@QUANTUM4KGAMER I'd personally get the 5700X and spend the extra cash on the monitor or to put towards a future upgrade.
The R5 7600X (Zen 4) will be faster in gaming, and probably in productivity too, vs the 5800X3D, despite the 7600X being only a 6c/12t CPU. The price will be $300-330, which is $100+ less than the 5800X3D, but the problem will be the more expensive DDR5 and expensive AM5 mobos.
Not necessarily in simracing, but overall most probably, looking at the last couple of years' generational improvements and the leaks.
Again, like you said, the new platform is looking to be comparably expensive to the 5800X3D.
Not too sure about productivity (physical cores still count for a lot), but sure it had better be faster for gaming, otherwise what's the point?
@@TheTechTank Vs the 5800X3D it will be better, and vs the 5800X it will be tight, depending on the task.
@@Mr11ESSE111 Won't have to wait too much longer to find out for sure.
@@TheTechTank You will see it will be like I said, because otherwise Zen 4 could be the new Faildozer.
I hate how techtubers speak about AM4 motherboards and CPUs as being dead. Like that's it, it will die and people will bury it with roses and trumpet songs.
What they forget to mention is that AMD's AM5 and Intel's 1700/1800-socket CPUs and motherboards still cost, and will keep costing, wayyyy too much. Let's not forget that DDR5 costs an arm and a leg. AM4 will still be relevant for at least 2 years from now. If you have a 5800X and pair it with a 3070 or above, you're not playing at 1080p, you're playing at 1440p, and we all know the higher the resolution, the less strain on the CPU and the more on the GPU.
Yeah... knew that was going to be a sore spot for some people. I even made air quotes in the A-roll but then covered it with B-roll absentmindedly, though at least you can still hear the exaggerated tone on "dead".
Anywho, no tech YouTuber is saying that either the boards or chips will just disappear from existence, or that they'll all be made irrelevant as soon as next-gen drops. That has never been true for pretty much any platform. The current platform is going to stick around for a heck of a lot longer than another two years.
Further into the video, I do actually "remember" to mention the additional cost of next-gen motherboards and DDR5, and that sticking to current-gen is absolutely the way to go if you're already invested in the AM4 platform and want to save a lot of cash. However, for new system builders starting from scratch, it would be a huge disservice to not remind them that there is no further upgrade path on AM4 beyond what we have right now, and that waiting to see what pricing, features, and performance is like on next-gen is the best course of action if possible. Or even just because the new launch will almost undoubtedly drive pricing for current-gen even lower than it already is, making the choice much easier to make.
@@TheTechTank I'm not talking about you, I'm talking about others and the way they use the phrasing "dead". I saw the video and saw that you made those remarks.
@@Teksers Ah, I see. I was confused because you said "what you forget to mention" and I assumed it was directed at me. If not, then I apologize for the misunderstanding 🙂
@@TheTechTank corrected myself 👍😁
Can you tell me the difference in Warzone 2.0 at competitive settings, 1080p? Because with my 5600X and 3060 Ti I can achieve 130 FPS max, and I've seen some videos where a 5800X3D with the same 3060 Ti and the same settings easily gets 200+ FPS...
It's hard to compare your system's performance to someone else's, even with similar specs. The people in those vids might have a better RAM configuration, factory overclock on the graphics card etc. You'll get a good FPS improvement with the 5800X3D, but probably not quite that much without any other upgrades too.
@@TheTechTank I'm not agreeing; in the comments there are at least 50 people who did the upgrade and gained a 60 to 70% uplift in FPS. And today, when I install it, I'll make a video comparison so you can see with your own eyes 🤭
@@TheTechTank Bro, I bought the 5800X3D today over the 5600X, and just for your information: 127 FPS average with the 5600X, everything on low, in Warzone 2.0; now, with medium settings, 186 FPS average! Day and night, bro; you don't have a clue and are speaking without proof. Another one ☝️: in Fortnite with the 5600X, DX12 on competitive settings, 223 FPS average; now on the same settings, 480 average, in some places 550 FPS. Mind-blowing. And most important, the 1% lows now "sleep like a baby". All this from upgrading only the CPU from a 5600X to a 5800X3D. Beast!!!
@@PsYVla Wow, that's almost suspiciously amazing! Congrats man, that's definitely a massive upgrade! Planning a revisit to my review of the 5800X3D soon because of stuff like this. The chip deserves way more praise than I gave it in this video.
@@TheTechTank I'm telling you honestly, and if you want I can send you videos, screenshots, etc... This CPU is a beast, buuut it definitely needs a good cooler for the max 4450 MHz boost. Right now mine boosts to 4300 in-game, and with a better cooler I'll gain more FPS, hahaha. Massive upgrade for just 150 USD because I sold my old 5600X :) Cheers
Got busy... Didn't watch the whole video... I will be back (terminator voice)
No worries! Looking forward to what you think about it. Upped the editing level here and there and tried some new things, lol.
5800x3d is too expensive..
I don't see a big change in csgo and apex...
For the price it's not worth it.
Someday it'll get to $199 once AM5 becomes the norm
Just like how I got my 3600 + B450 for $140.
Crazy that last year the CPU alone was $200+ just because of crypto 😂
Lol, the deals will always be had by those with patience.
I guess the reviewer just got to the review for this CPU late. Everyone else made this very same review about 3 months ago. I still like the video anyway! Hopefully he gets lots of views!
Being based in South Africa means problems with availability, unfortunately. However I don't think it's "the very same review" as Kombo Strike was not a thing back then 😂 Appreciate the well-wishes!
good commentary
Appreciate it.
If your clockspeed APPEARS to be higher, but your CPU is drawing less power under load, you can be nearly CERTAIN that performance has not increased.
Undervolting is a quite common way to increase performance under temperature-based bottlenecks.
Stock processors are thermally regulated, so undervolting allows a higher boost due to temperatures being lower.
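(To illustrate the mechanism with a toy model, and this is only a sketch of the physics, not AMD's actual boost algorithm: dynamic power scales roughly with C x V^2 x f, so under a fixed package-power limit, lowering the voltage leaves headroom for a higher sustained frequency. The c_eff constant below is made up purely for illustration.)

# Toy model: highest sustainable frequency under a fixed power limit,
# using the rough relation P = c_eff * V^2 * f (not AMD's real algorithm).
def max_freq_ghz(power_limit_w, voltage_v, c_eff=25.0):
    return power_limit_w / (c_eff * voltage_v ** 2)

ppt = 142.0  # stock package power limit of the 5800X3D, in watts
for v in (1.20, 1.15, 1.10):  # roughly what a negative curve offset achieves
    print(f"{v:.2f} V -> ~{max_freq_ghz(ppt, v):.2f} GHz sustainable")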
Ryzen 7 5800X3D🤔🤔🤔🤔🤔🤔🤔
So tired of click bait titles
Not clickbait, delivered on the title in the video. A bit sensational, sure, but until YouTube stops favoring sensationalized titles and thumbnails, they're not going away.
Alright then, thank you.
Hahaha! He dropped it and I hope he bent no pins!
@@NetNeelsie Lol, I was trying to see how many people I could make anxious by manhandling it the whole time 😂 then I dropped it and played myself, haha
@@TheTechTank my 2700x was getting motion sickness 🤪, thanks for the honest review
😂 My humblest apologies to that poor 2700X, and thanks man!
@@TheTechTank I have built 3 Ryzen systems and those pins make me real anxious.
What a Waste of Sand.....INTEL FOREVER ....
Fanboyism will always lead to disappointment.
@@TheTechTank No Problem it's my opinion.
78003D
If it happens it'll probably be the 7800X3D.
@@TheTechTank Escape from tarkov loves v cache
@@chrisblandford8971 I really need to add that game to my testing library.
@@TheTechTank AMD already confirmed that there will be a 7800X3D, 7900X3D and 7950X3D but that they won't launch until MUCH later (at least 6 months but could be up to a year) because AMD wants to get rid of their overstock of AM4 CPU's (in particular the 5800X3D) first.
This of course means they'll have to lower the price, because being much more expensive than the 12-core 5900X, with its 64 MB of cache and higher clock speeds, makes the 5900X a much better choice, and being 50% more expensive than the 5800X it's based on also makes it bad value in comparison.
That's incorrect. AMD has not yet confirmed X3D versions of the 7000-series (though it doesn't take a genius to assume that they are indeed coming, regardless of the rumors and leaks). I doubt delaying the launch has anything to do with current overstock, and more to do with being able to "refresh" the 7000-series lineup down the road.
Common sense Information, thanks a bunch.
Unnecessarily passive aggressive comment, thanks a bunch!
Why does nobody ever look at where this CPU is actually useful?! You did the exact same basic tests as all the others and got the same results... but you have no idea why the CPU can be useful...
The main reason your benchmark fails is because you tested the CPU with shitty games that use the GPU for post-processing.
The use case for this CPU is someone playing games built on engines like the Quake engine, Unreal Engine, or Unity 3D...
Just test one well-known Unity 3D game, RUST... the difference between the 5800X and 5800X3D is around 28% more efficiency in post-processing, DPI, and latency... because this game uses the cache to generate and store maps and graphical elements.
So the best benchmark you could provide would be to compare games using these specific engines... have fun and test these: en.wikipedia.org/wiki/List_of_Unity_games
You seem to be under the impression that reviewers exist to paint the products they review in the best light. That's not how it works. The X3D is a gaming CPU, at one point the "fastest gaming CPU", and that claim needs to be tested across a wide range of games, not just in games where it performs particularly well. It doesn't matter if it, according to you, performs better in "games that are handled with Creation/Game Engines like Quake engine, Unreal engine, Unity 3D," all that matters is its gaming performance in general, which is how AMD marketed the X3D.
@@TheTechTank So you do exactly what I say... you use the advertising, or the name, or the color of a product to compare it with the others that state the same, which are the same color, etc... Thanks... bye.
@@jean-pierremichaud7330 Are you new here? That's how reviews work, how they've always worked, you're literally just describing the process. Cheers, you won't be missed.