@@nathanwhite617 Best I can explain it: a few times per day your system would just freeze and start sputtering for a few seconds, cutting off audio and dropping your fps to 2. It didn't matter what you were doing: gaming, movies, or just idling on the desktop. You can see examples if you just google amd tpm stutter. AM5 will probably be free of that too, much like the USB issue, but all the firmware/AGESA issues I personally had with AM4 make me appreciate the stability of the Intel platform. Lacking the ability to upgrade the CPU down the line is of course the major downside. I did upgrade on AM4, 2700X-3900X-5900X, but by the time I hit the last one I already wanted a new motherboard with better features, so by the time we hit the last supported CPU on X670 it might be the same thing. It really didn't matter that much for me, but YMMV.
@@thefreemonk6938 The shoe is pretty much on the other foot right now, and the current situation is getting beyond silly. Personally, the only bad thing I've heard about AM5 as a whole is the boot/memory-training issues, and as a whole AMD seems to be a much better choice than Intel at the moment. Intel not communicating about the stability is just adding insult to injury.
This video is actually truly unbiased, though I suspect not everyone will pick up the true message. Buy a 7800X3D if you are NOT interested in overclocking/tinkering. I myself am in this camp; I just want it to work out of the box and game. However, if you are in the tinkering camp, in certain situations/games you can optimise a 13900K to outperform it. In some games memory bandwidth is important, in others it's raw single-thread performance, and in those scenarios maybe the 13900K is for you. In life, one size does not fit all, and that is why it is important to educate yourself with information. That information needs to be evidence-based, and you should always cross-reference multiple sources and add weighting based on the evidence provided in those videos. Rarely do videos on the same subject provide apples-to-apples comparisons (if they do, then you need to diversify your source material). Videos like this truly give you an insightful view; other valuable sources will give you a stock-for-stock comparison. Armed with all the information, and based on your own personal requirements, you can then make an informed choice. Don't rely on anyone else telling you what to buy, they will always introduce their own personal biases; take the time to educate yourself if you want to get the most for your money.
Also, for the 14900k to run as well as it does here you need a REALLY high-binned Hynix A-die DDR5 kit, which is not cheap. Another cost factor at play.
That's what some people are trying to explain: games don't push modern CPUs anymore. In this video, Tomb Raider is using 44% of the cores, so that's actually just 3 cores, or 4 or 5 threads. And that game is only pushing the 7950X3D to 44% usage on those 3 cores. If you actually look at the game running in this video at 5:27, it's showing 390 to 445 fps in Tomb Raider. And no, I don't play COD.
Would the 5800X3D dip more than the 7800X3D, or would they be equal? Since the 5800X3D is probably still the cheapest platform to adopt in 2023, with maybe 10-15% slower fps.
If you want to save money and just need something "good enough", the 5800X3D is a good and lazy option without much hassle. Buy any cheap B550 100$ board, some random Samsung B-Die for 150$ or even cheaper Micron E-Die for 100$, or if you already have anything just reuse it.
@@marceldiezasch6192 I know this. I currently run a 10700 non-K. I'm pretty sure I would get massive gains (up to 50%?) with the 5800X3D. But I am concerned about the dip. I'd rather invest more into the 7800X3D, since I'll have to upgrade the platform either way, and avoid the worse dips, if you know what I mean.
@@Boogeymanjw Can't be arsed learning to overclock my RAM and spending time in BIOS. Unless somebody volunteers to tell me exactly what numbers I need to punch in, I am really not into wasting time learning the whole thing and stress-testing for hours. I like simplicity :P plug and play, basically.
Yep, yep, I was contemplating getting an X3D chip when they came out, but I'm not impressed with the performance imo. Guess the 5800X3D was the one-hit wonder for them; I was hoping for that kind of uplift. Gonna stick with my trusty old 12600K.
Heavy graphics here; I'm a painter, plus use it for work myself. So yeah, this actually sounds like a good decision--and efficient energy usage, comparatively speaking, plus it's not like they won't improve the tasking over what they've got the first week.
@@geeesejeeeze6055 Honestly, if you even notice the one minute compile difference on truly gigantic projects, you must seriously never leave your seat. Learn to do so--deep vein thrombosis is no joke. Every painter stands and sits occasionally to change position.
@@BronzeDragon133 when I'm doing my work and I'm zoned in I don't want there to be any delays. There's a reason why most people that do any productivity work stick with Intel and Nvidia. It just works. AMD is people who don't value their time and like to waste money.
@@geeesejeeeze6055 That seems overly judgmental, like saying that Intel is for people who don't value electricity and like to waste resources. But you do you, Booboo.
As a 13700K owner I’m surprised how good that 7800X3D is. I play at 4K 120 Hz on my LG OLED C1. But I got an Asus Tuf RX 6800 XT for $320 and my gaming experience has never been better. I was mostly looking for 12900K performance but yeah the 7800X3D would be my CPU if I had an AM5 build.
how did u get for 320?
If you want i9-12900K performance, the i7-12700K and 13600K are very similar (in gaming).
@@UrCheckMate he got it from your sister for a special price 😅 just kidding
What size please?
Thanks to the supporters on the Twitch streams for their massive donations & Jufes for the content.
BRUH TY FOR TIP 😭😭❤️❤️🥰
There's no 7950X3D in here, only 7800X3D
@@unlimiteduploads2971 he's trolling buddy....
i want to thank everyone who supports this channel. i really appreciate the reviews this channel provides.
@@FrameChasers great video love to see the scam lol
Just bought the 7950X3D, oh man it's crazy strong in WoW and other MMO games! WoW raiding has never been so good... 3D V-Cache to rule them all =) Should be a banger for Diablo 4 too.
How does the CPU run in Valdrakken?
Might have to do an upgrade for the mmos.
Which CPU did you have before?
Do you play Classic or retail? What other CPU did you have before? How much fps are you getting during action in raids?
What did you have before? I'm on a 3900x and this cpu is pretty shit with MMOs.
In Warzone I notice that if you disable SMT the game runs more stable with better 1% lows. It might be an optimization issue with AMD CPUs. My 5800X3D's 1% lows are horrible with SMT enabled; disable it and the 1% lows are close to the average.
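(If you want to test the SMT-off effect for one game without a BIOS round-trip, here is a rough sketch using Python's psutil. The process name is hypothetical, and it assumes Windows enumerates SMT siblings as adjacent logical CPU pairs, which is typical but worth verifying on your own system; a real SMT-off toggle in BIOS is still the cleaner comparison.)

```python
# Sketch: pin a running game to one logical CPU per physical core,
# approximating "SMT off" for that process only. Assumes SMT siblings
# are adjacent logical CPUs (0/1, 2/3, ...), which is the usual layout.
import psutil

GAME_EXE = "warzone.exe"  # hypothetical process name - change to your game

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        one_per_core = list(range(0, psutil.cpu_count(logical=True), 2))
        proc.cpu_affinity(one_per_core)  # e.g. [0, 2, 4, ...] on a 5800X3D
        print(f"Pinned PID {proc.pid} to {one_per_core}")
```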
does your avg fps stay the same? or is it slightly lower?
My FPS in warzone with the current settings at 1440p are insane in the 3D chip. 180-240 FPS in most areas. I haven't even turned on PBO. I have everything on Ultra also. 7900XTX.
Are you disabling it in the BIOS or using Process Lasso to disable SMT? I don't play Warzone, just curious.
Maybe you need more physical cores instead of disabling SMT. Warzone runs pretty smooth on a Ryzen 9 5900X, at least from watching a friend playing it live with a 3080Ti, Ryzen 9 5900X and i7-12700K are pretty similar overall at least on 3D mark scores.
One of the few games that scales pretty well with physical cores.
Great video. The only thing I'd have liked to see is games with RT on and off (for both x3d and 13900k), so we could see the impact cache has on RT.
This is a win for both Intel and Amd. Extremely good cpus!
AMD is never a win, because you can't beat the Intel monster, but we all already know the new 14th gen Meteor Lake is more powerful, it's going to destroy all the AMD X3D chips, if it's even worth it
@@lightward9487 You speak neither English nor Spanish. Nobody can understand you XD
Finally, it was kinda needed. It's really not a great feeling to play on a 7900 XTX or 4090 and drop way below 60 fps pretty often just because CPUs were not able to handle these cards, especially at 1440p or ultrawide 1440p; it's kinda weird. Can't wait for the 7800X3D. For me it's a no-brainer: I have a 7900 XTX, so I wanted something good within my limited budget, and the XTX + 7800X3D together cost less than a 4090 alone (of course the 4090 is the best card, it's just way too expensive for me), so I chose the price/performance ratio instead.
My grandmother couldn't tell the difference. 😂 But yeah, the 7800X3D sounds like the best option. Though as some other channels pointed out, AMD has better options if productivity is important as well. I don't understand the rant about scam marketing though, because most of them overpromise anyway. 😂 For example, DLSS 3.0 is only supported in 17 games, with 33 games and 5 applications in progress. Plus the fact there is no DLSS 3.0 for the older series, while FSR works on all GPUs. If FSR 3.0 will work for all GPU makers (I'm skeptical that it will), a 3090 versus 4070 Ti would be interesting to compare. The 3070 had 256-bit memory; EVGA made a 3070 Ti 12 GB model. $250 more for one generation. The 7900 XT is overpriced as well; besides, it should maybe have been the 7800 XT, but I guess with the 9 in the name they can ask for more. In general I'm not sold on AMD or Nvidia GPUs and not jumping on the AM5 platform yet. I hope the next generation won't require a 2000 Watt PSU 😂🙈
@@markusbronst7942 Don't be afraid, it's not possible for them to release such energy-inefficient components, at least in Europe, especially during a crisis :), probably not even legally in the future. Actually there have already been some debates about GPUs; it's very possible there will be a limit of 400 watts per unit, which is less than some current GPUs can consume. It kinda makes sense: the cost of electricity in Europe is sometimes real madness. I live in Czechia, where 1 MWh literally costs more than double its inflated price in the USA, and not even a year ago it was 300% more expensive compared to the USA :D.
Why would you say the 7950X3D is a scam? The 7800X3D would be great for gaming, but as a content creator I still need those cores for video/photo editing.
The lower power consumption of the X3D is the main draw for me vs the 7950X or the 13900K. I have gotten rid of equipment that ran too hot and used too much power for a 24/7 system in an office that became a sauna.
Seeing nearly 50w difference in consumption is a huge win IMO.
Rock solid test with useful info about framerate dips in recent multiplayer titles. What else could you ask for?
So I guess if money is no object and you want to overclock for max performance, then you might as well go for the Intel 13900K(S), right?
Thank you to ALL Frame Chasers supporters, including myself 😅😂🤣💪👍. The 7800X3D is definitely the CPU to get from this 7000-series lineup for gamers 🥰✊💪
If you are a 4K gamer then X3D is not useful
@@ji3200 How, at 4K specifically, would you not end up in a GPU-bound load scenario with any of these?
None of the CPUs matter in a comparison at 4K.
@@SKHYJINX It will matter in the near future; people upgrade CPUs less often than GPUs
@@darkfire3691 I'm a noob, could you explain why the CPU won't matter much at 4K in the near future? That's my main goal for a CPU.
@GlasgowIsBlue Because most of the work is being done by the GPU. That's why, when you compare 4K benchmarks, fps is similar in the majority of games.
What about for gaming + streaming on same pc with minimal FAFO? Still 13900k?
Clowns. Clowns everywhere!
Damn, those lows are way lower than I expected, but it's like you said: want to save money? Go for the 7800X3D and cheap 6000 CL30 and you're good to go.
If AMD actually marketed this strategy well, it would be a whole different story
Dude thank you so much for the tip 😭😭😭😭
@@FrameChasers I know its nothing vs your usuals subs but you know, atleast its something 😅
@@tomorpedreiro3032 1$ or 100$ man, I show gratitude and appreciation all the same 🙏
@@QuaK3RRR You can buy a 7200 CL34 or 7600-7800 CL36 kit with a decent board and just turn on XMP and still have better lows; it's more about the money difference :D Lows are important and AMD struggles with that in Warzone, but for less money it's decent
@@QuaK3RRR I know it's hard to get insane RAM speeds, but it's not hard to run 7200-7600 with decent timings, come on man :D And yes, you will still get better lows; even a 13900K with only 6000 offers decent performance in Warzone, beating the current non-X3D AMD chips like it's nothing
why not show the 13900k stock?
you are just showing us bin lottery with the OC results.
Bias much?
More like I was awake for 3 days straight and work a day job on top and couldn’t physically do any more benchmarks, if you can do better go start your own channel
@@FrameChasers Even more interesting: show us the frequencies you used for the 13900K OC.
Man I have seen this video several times and the opening ace ventura based theme is hilarious! Rolling every time! Seriously awesome!!!
Agree, it's all about the 1% lows in 2023. We don't want just max fps anymore; we want the 1% lows to stay as close to the highs as possible.
True, but sometimes worst-case fps isn't the hardware's fault
Finding the lowest/heaviest areas in games also requires effort
That intro, lol! 🤣
Thank you Jufes and your supporters! Life is good with you letting us know how it is.
I think it will be very interesting when the scheduler is able to understand when a game benefits more from running on the CCD with the higher clocks, instead of just pushing each and every game onto the CCD with the added cache. Or when it is able to use both CCDs in conjunction correctly more often, like I could see in some games with the 7900X3D and 7950X3D, sometimes giving up to a 5-10% boost over a 7950X3D with its CCD2 disabled.
"When"? Seems like a big "if" to me...
Something about the Factorio results is wrong, because the extra cache does help with the tick times. Maybe it's not loading the map properly, maybe it's loading another map entirely. 50 seconds is way too little for a map that size; maybe it's crapping out because it's 100,000 ticks. People usually benchmark 10k. Either way, the results the program is giving back don't make sense. Nothing against you, but it would be nice if you re-ran it.
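(A quick sanity check on why 50 seconds looks wrong; the typical UPS range for large late-game maps is my rough assumption, not a figure from the video.)

```python
# If 100,000 ticks really finished in ~50 s, the implied update rate would be
# about 2000 UPS (0.5 ms/tick). Large late-game maps typically manage far
# less (often well under a few hundred UPS), which is why the reported
# result smells like the wrong map or a broken run.
ticks = 100_000
elapsed_s = 50
ups = ticks / elapsed_s
print(f"{ups:.0f} UPS, {1000 / ups:.2f} ms per tick")  # 2000 UPS, 0.50 ms per tick
```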
Better performance for less than half the power draw. Seems good to me.
Sometimes better performance*. It sometimes loses, but that doesn't matter since, as you said, it's half the power draw AND much cheaper. WIN!
@@TheJonazas He's using a max OC on the Intel. That's not the average user. Without running the highest memory frequency and the same OC on the Intel, the difference would be larger.
Can't overstate how important this channel is for the community. Thanks for all your efforts!
Yeah, no reason to buy a 7950X3D if you just game. It was always meant for those who do more with their computer. The cache should also help more when you are running 128GB of RAM, as you dramatically limit how fast you can clock it (most motherboards list 4800 max).
If I didn't already build a new Intel 13th gen PC in Oct/Nov alongside my 4090, I would've waited for the 7800x3d. But I'm happy with my performance so I'm not going to get worked up for an extra 10% or so.
Also, I'm just sad that Jufe's community is so small. If only most people knew how to properly run benchmarks, there wouldn't be so much misinformation circulating in the fucking PC space.
🤣 I'm older and playing Hogwarts right now with my stock 12900K, XMP DDR5 6200 and a 4090 at 4K 120 Hz. I'm enjoying it so much. The stress these fps gamers must feel, wanting to tune stuff up for a few fps more; I couldn't be arsed with any of that. Frame gen is more useful to single-player gamers than any BIOS screen. Cool to see some theoretical fps though; a non-issue for me at 120 Hz.
Amen brother, I wish there were more gamers in the hardware space and not so many bored Reddit warriors
@@FrameChasers Amen for AMD, RIP, it's dead, it's not going to compete with Intel anymore hahaha. Just amen for AMD, RIP
@@lightward9487 I think you missed the point of his comment
My favorite techtuber at it again! Curious, do you still fresh install Win10 for your systems?
Yep
"13900ks costs about the same and looses about 20-30% in gaming" 😂
That comment was pure gold 😅😂🤣😅😂🤣
You want Intel just for the sake of the latency between the cores, which reduces input lag. If you use emulators, the lower the latency, the less input lag and the more accuracy you get. Also, the 6 GHz mark is way faster and useful in many older games and emulators. Why bother with AMD when you can go Intel, which we know just works and is scalable.
20-30% is the amount of BS they use in advertising
Which cpu would you buy for video editing and gaming ?
@@gurugamer8632 For gaming, an Intel CPU, no question, if you know how to tune your RAM and adjust the voltages for your CPU/rails. For video editing, I would prefer the 16 cores of the AMD 7950X
Controlling affinity on the 7950X3D is easy when you turn Game Bar and Game Mode off, set up an Ultimate Power Plan, turn SMT off, use PBO, and set the affinities in Process Lasso. Once you do that, the 7950X3D is the king. The 7800X3D is great, but you simply lose the ability to assign processes to the non-V-cache CCD.
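(A minimal sketch of the same idea without Process Lasso, using Python's psutil. It assumes SMT is off as described above, and that logical CPUs 0-7 are CCD0, the V-cache die, which is how retail 7950X3D chips are usually enumerated; verify with HWiNFO before relying on it. The PIDs are hypothetical.)

```python
# Pin the game to the V-cache CCD and leave the frequency CCD for
# background apps, mirroring the Process Lasso setup described above.
import psutil

VCACHE_CCD = list(range(8))     # CCD0: the V-cache die (verify on your chip)
FREQ_CCD = list(range(8, 16))   # CCD1: the higher-clocked die

game = psutil.Process(12345)    # hypothetical PID of your game
game.cpu_affinity(VCACHE_CCD)   # keep the game on the cache cores

# Background apps (OBS, browser, Discord...) can go to the other CCD:
# psutil.Process(obs_pid).cpu_affinity(FREQ_CCD)
```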
Frame chasers - what motherboard would you suggest to get for the 7800x3d? Something sorta future proof kinda?
Maybe x670e aorus Master
I don't agree with the idea that a 7950X3D with the non-3D-cache cores disabled will be the same as the 7800X3D. I see so many channels claiming this, but people seem to ignore that the 7800X3D has a 700 MHz lower max clock speed and less cache. We will have to wait for it to come out before judging performance.
They have the same clockspeed on the v-cache CCD. The extra 700MHz are on the non-v-cache CCD only, so irrelevant for gaming.
And the extra cache is because they just add the cache on both CCD's together and print that total number on the box. Since the non-v-cache CCD doesn't get used while gaming, the actual usable cache will be the same.
@@isakh8565 So the 7950X3D doesn't boost to 5.7 GHz on the non-V-cache cores, but boosts to 5 GHz? And the 7800X3D gets 5 GHz on all cores, as they all have V-cache?
@@raptordna The 7950X3D does boost to 5.7 on the CCD without v-cache, and to around 5 on the v-cache CCD. The 7800X3D only has the v-cache CCD, and boosts to around 5GHz, yes.
The 7950X3D turns off the CCD without v-cache when you run most games, effectively turning itself into a 7800X3D.
That's why some people call it a scam, as this behaviour isn't at all clear in the marketing. It's mostly a 5GHz 8 core while gaming, and only acts as a 16 core in productivity (with only half the cores running at 5.7GHz, and the other half at 5). But it's marketed as just a straight up 5.7GHz 16 core without clearly informing people of these big caveats.
@@isakh8565 I see, thx for explaining. I stand corrected. In that case I should be really close to the 7800x3d.
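(The cache arithmetic behind that thread, using AMD's published L3 figures:)

```python
# 7950X3D box spec: 128 MB L3 total. That is 32 MB base L3 per CCD plus
# 64 MB of stacked V-cache on one CCD only.
vcache_ccd_l3 = 32 + 64   # 96 MB, identical to the 7800X3D's L3
plain_ccd_l3 = 32
print(vcache_ccd_l3 + plain_ccd_l3)  # 128 MB, the number on the box
print(vcache_ccd_l3)                 # 96 MB a game can actually sit in
```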
Awesome! Love how you showed all the resolutions for Single player games. Can't wait to see the DDR4 vs 13900K/KS DDR5 VS 7950X3D(7800X3D). Maybe throw in the cheaper 12900K @ $350 in there ;)
12600k is barely slower, do you even have a 4090?^^
Can't take a reviewer seriously if he only benchmarks 7 games
I play a ton of BF2042 at 4k native with DLSS off/Ray tracing AO off. I have noticed from UA-cam benchmarkers that I get a lot more fps at these same 4k native settings than the 7950x3D does. 7950x3D was 115-135 fps and my 13900ks with 7200mhz ram was at 135-165 fps. I currently have a 13900ks/4090/msi Meg z790 Ace 4-DIMM motherboard with a stable 7200mhz 32gb DDR5 kit. Because of this video I ordered an Asus ROG Maximus Z790 APEX 2-DIMM motherboard and GSkill 7800mhz 32gb ram kit. 8000mhz was out of stock, so I’ll just push this 7800 kit to 8000mhz probably. Also trying to decide whether to upgrade my Corsair hx1200i(1200 watt) PSU to a Corsair hx1500i (1500watt) PSU for more headroom when running the 13900ks with unlimited turbo wattages and a 4090 at up to 450watts during gaming. Then you have whatever the wattage requirements for the motherboard and ram are, then fans I guess.
Seems like a lot of work for not a lot of gains. You already have 7200mhz ram. How much gains are you gonna get with 8000 bro?
@@ilovehotdogs125790 yeah, not a ton of gains, but some. Mostly in the 1% lows. I ended up getting a G.Skill 2x24 48GB 8000 CL40 kit, but I overclocked it to 8200mhz 36-48-48-86 timings. So I’m essentially running the timings you would get with a 7600mhz kit at 8200mhz. My cpu is overclocked to 6Ghz across all 8 P Cores and 4.5Ghz on all 16 E cores.
@@hadleys.4869 Hella fast system you have there. Congrats
I appreciate a real review on all the hardware.
Grandma complaining about performance 😂
My grandma doesn't even have a computer. I tried giving her one, but she refused it.
Wish we had something like this for VR games, like VRChat to choose the best hardware for VR.
For these top-tier CPUs, streamers need all the cores to run the game, OBS, mixing, and uploading to multiple sources. Losing 8 cores versus Intel's monolithic i9 is what needs to be tested. People buying this are often streamers trying to get by with 1 PC instead of 2.
Should have no problem with a software encoder; yeah, most likely no fps lost
That intro was gold.
BTW thanks for the work, we appreciate it for sure :)
So best "bang for the buck" is going to be the 7800x3d (since you don´t need expensive RAM or do any overclocking)?
And the downsides compared to I9-13900k being mainly lower 1% & 0.1% FPS in some games like Call Of Duty, BF 2042 and MSFS for example?
I wonder how these CPU´s would compare in a 100 person COD match with lots of action in a crowded city area. Or a 100 person Arma 3 King Of The Hill match. Anyone tried?
Summary (please correct me if I got it wrong):
*** AMD 7800x3d ***
Pros:
*Cheaper? (probably)
*Higher average FPS in some games (ex: Horizon Zero Dawn, Shadow Of The Tomb Raider)
*Doesn't need expensive high-end RAM for good performance
*Consumes less power
*Can upgrade CPU until 2025 using the same socket (AM5)
Cons:
*Lower average FPS in some games (ex: BF 2042, MSFS)
*Lower 1% & 0.1% FPS in some games (ex: MSFS, COD)
*** Intel i9-13900K ***
Pros:
*Higher 1% & 0.1% FPS in some games (ex: MSFS, COD, BF 2042)
*Higher average FPS in some games (ex: BF 2042, Spider-Man)
Cons:
*Lower average FPS in some games (ex: Horizon Zero Dawn, Shadow Of The Tomb Raider)
*Needs expensive high-end RAM for the best performance
*Consumes more power
*End-of-life socket (LGA 1700) (for i9 CPUs)
Good summary! If you add price/performance ratio for people needing good performance but don't want to pay too much, you need a whole Flipchart right now! I decided for the 7600x instead of the 13600k to be able to swiftly throw a 7800x3D on sale into my mobo in 2 years without the need to re-install anything. Or a 8600X3D (lol I'd wish). Bought "cheap" 2x16 Kingston 5600 cl36 that can easily run 6000CL30 with timings from igor's lab. 8800X3D shouldn't need faster ram to perform great, so I won't need a new mobo + ram.
@@Habixus Ty. Sounds smart. I'm using an i9-9900K with an RTX 3080 (being bottlenecked) so I want to upgrade. Will go either 7800X3D or 13900K. Very tempted by AM5 for the upgradeability, just wish it didn't have those 0.1% dips in COD (and probably Arma 3) too.
@@Habixus Why not buy 6000MHz CL28 and see if it can run at CL26?
@@mikaelandersson1288 Because it was 80€ more expensive in November. Got my 2 sticks in a black Friday sale and according to Buildzoid, "all" Kingston 5600+ sticks can run the same speeds. I didn't spend much time with tuning. Just used Igor's lab's 6000CL30 "sweetspot" AM5 settings and that was it. I only needed more CPU power for simracing and now I have plenty enough even at the 5600CL36 expo profile :)
True. The best "plug and play" will likely be Ryzen 7800X3D. This channel isn't about stock though.
7950X3D is the gaming king. Admit it. 😁
Watch the video first ...
I don't think you understood the review. It is fast, but it has 2 CCDs, which increases problems with the allocation of work on the CPU, so it's better to wait for the 7800X3D. The i9 results were without E-cores too.
@@nkk8339 Yes, he said another video.
7950X3D is a stupid purchase for gaming, at best when everything works it performs like the much cheaper 7800X3D and at worst it's buggy and uses the wrong CCD. If you want to buy a Ryzen 7000 CPU for gaming, it's the 7800X3D.
Everyone knows Pentium 4 is THE real gaming king. YOU just need to admit it. 😁
Disabling a CCD doesn't scale in all games and causes regressions in others. Why not just run a 7950X3D OC benchmark instead?
Such good testing. Nobody else is as dedicated to the craft as you. Kudos
Is it going to be worth swapping out a 7700x for a 7800x3d? Use case is 4k 120hz.
No. Not for 4k. Only for 1440p high refresh IMO. It may help a slight bit. Depending on the game. But probably not worth it.
Great video, love the honesty over it all. I'm still happy with my 5950X. Keep at it, my friend
Jufes the safe zone destroyer
so for apex legends to maintain stable 240fps with a 4090 (1% lows higher than 240), is it recommended to go with the 13900K over the 7800x3d?
Here if someone responds
I appreciate your commitment to maxed out tuning!
maxxed out
THANKS SO MUCH PWNY
My biggest regret is that I bought a 7900X3D w/o finding this channel first.
That said, after a month of researching my CPU's features and functions, learning about the known issue of using 2 vs 4 DIMM slots of DDR5 (and not being able to use the EXPO profile), and switching the CPPC setting in BIOS from AUTO -> DRIVER:
Yeah I regret nothing!
Intel IS awesome; used to buy nothing but team blue.
While the first month of going team red was a bit frustrating, I'm actually really enjoying my new CPU now more than ever, playing games like Starfield with only 20-30% CPU utilization lol
AMD is really tough for ADHD man-children who can't spend an hour in BIOS properly overclocking memory and setting PBO. I kinda get it, I was 14 years old once too. You will get through it bro
All I have to say is thank you for all the information
Jufes does it again, bringing the heat! I'm looking at doing a summertime build. I'm seriously leaning towards the 7800X3D right now. (I'm mainly an emulator / single-player, high-graphics-and-resolution gamer.) We'll see how it goes.
Then the 7800X3D will be a waste of money for you. The difference in its performance will not matter at 4K, and if you're only doing emulators, any modern CPU will get the job done. You'd also need a 4090, which is yet again extremely overkill for your use case.
@@ZackSNetwork Lol. No man. I'm not the regular tech viewer. I know exactly what I need for my use case. "Any CPU is fine for emulation." No. Just no. For the latest emulators like RPCS3, Xenia, Yuzu, or even some games in Dolphin and PCSX2, you need a really beefy CPU to get the best performance: 7000 series or 12th/13th gen specifically. Also, I never said I was going for a 4090. I have my eyes set on the 4080, waiting for it to go down a bit, but if not, that's fine. I don't plan on upgrading my CPU for at least 2 generations, or my GPU, depending on how well DLSS and Frame Generation work in future titles. So I'm good. Good info for a casual, but I know what I'm looking for and why. Have a nice day.
@@NBWDOUGHBOY My i7-12700K is right on par with a non-3D Ryzen 9 for emulators and only losing against high end raptor lake. The iGPU can handle the emulators no problem, still faster than the Switch.
I have Corsair DDR5 4800, no XMP needed. Watch the floating point numbers.
@@NBWDOUGHBOY Alder Lake and Raptor Lake have 6% higher IPC than Zen 4. That 6% difference in IPC comes at a big cost in clockspeed and thermals; it's not worth buying Zen 4 or Raptor Lake for that. Alder Lake Golden Cove is enough (as an option, find an AVX-512 batch).
@@saricubra2867 Ironically, a 12700K or 12900K with AVX-512 tuned is still the fastest chip available for the RPCS3 emulator. It's still good. But I want a set-it-and-forget-it system for the next 4 to 6 years. The X3D should be decent for my purposes; if it isn't, I'll return it and get something else. Not a big deal.
As MW2 ranked play is becoming a lot more popular compared to Warzone, it would be nice to see these comparisons at 1080p and see which CPU wins.
I think MW2 is optimized for AMD, if I am not wrong. So AMD should win in almost any case (again, if I'm not wrong)
@@kharaf9920 you are thinking of GPUs
@@nathanwhite617 you're probably right, it may have been the 7900 XTX
Any info on the power draw on the MAX OC cpus? I honestly want to keep my power as low as possible on my next pc
Amd uses less power. Check out the gamer nexus vid
"Low power" lol 🤡
@@redclaw72666 No one used the term "low power" so I really don't get the clown face for someone asking about power draw for each chip at MAX OC...
Thanks for highlighting the 0.1% FPS dips on the x3d in COD in the city since this is where much of the multiplayer action happens
Please explain why, if you're a gamer, you'd buy a 16-core CPU and then expect all 16 cores while GAMING. Why is this just being ignored?
Streaming while gaming. Funny enough, this is to keep the 1% lows constant.
10900K has 10 cores and you can game on them
@@Gamebro321 It uses what are basically overclocked E-cores with more cache: 2015 Skylake IPC on dinosaur 14nm.
The i7-12700K, Ryzen 9 5900X, and i5-13600K are way better.
@@saricubra2867 I will prove you wrong by playing all my games on the 16 e-cores of the 13900K.
@@Gamebro321 It would be 1 P-core and 16 e-cores. One P-core from 12th gen destroys the 10900K's Skylake core.
Great video, but kind of boring results. Worth buying if you're already looking to build a new PC, but nothing to get excited about. The whole 3D cache thing is starting to seem like a dead end.
AMD has a pretty solid platform with the 7800x3D. Looking forward to the price per fps video. Your max oc 13900k platform is probably double the price of the amd platform between the asus apex, custom loop, ram and ram cooling.
During gaming loads, the custom loop is only helping a little. In heavy workloads, the custom loop keeps it from throttling. You don't have to worry about throttling as much in most games.
Most games don't push the 13900k beyond a 130w load. A stock 13900k would be performing basically the same, maybe just a few percent slower.
@@03chrisv Meanwhile the AMD CPU is using half that power in gaming. The power bill isn't that concerning, but the extra heat is noticeable in a room, which could mean running the AC more often, and that definitely does start to hit the electricity bill.
@Malinkadink Intel CPUs don't really use a lot of power while gaming. They average 70w to 90w, and don't tend to go over 130w even in super demanding scenarios. The extra 10w of heat that the Intel produces over the AMD processor is negligible in terms of heating up a room. We're splitting hairs here.
@@Malinkadink Different nodes lol. If you're gaming, you shouldn't be bothered with how much electricity you're using; unless you're poor, then that's a different story. If Intel was on the same node as AMD, you'd switch sides lol
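(For anyone who wants to put numbers on that argument, a rough sketch; the wattage delta, hours, and electricity price are assumptions, plug in your own.)

```python
# Back-of-the-envelope cost of an extra 50 W of CPU draw while gaming.
delta_w = 50        # assumed extra draw while gaming (W)
hours_per_day = 4   # assumed gaming time per day
price_kwh = 0.30    # assumed electricity price ($/kWh, EU-ish; US often ~0.15)

kwh_per_year = delta_w / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/yr -> ${kwh_per_year * price_kwh:.2f}/yr")
# 73 kWh/yr -> $21.90/yr: small on the bill; the heat dumped into the
# room (and any extra AC) is the bigger practical factor.
```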
Appreciate you and the supporters ❤
Thank u all supporters, you are big brain !
I really like Jufes’s objectiveness.
He gives AMD shit for their shady marketing and false claims but at the same time recommends them for certain cases i.e. people who don’t know how to OC, don’t care about max performance or just wanna plug and play. 👏🏻
Only hardware reviewer I trust 👍🏻
is the second CCD disabled on the 7950x3d "stock" benchies too?
Yes
But, but Jufes! Why is your arm bigger than my leg?!?!
Did you FAFO with X3D memory tuning yourself and/or have you tried Hynix A/M-die timings published by Buildzoid?
That intro. My sides
I'm so sorry, I'm a bit confused - the 7800x3d isn't even available until tomorrow, how were you able to get one for overclocking and benchmarking?
So thankful I don’t have to spend 700 bucks on a CPU to enjoy PC gaming
Yo, where are the stock Intel numbers? :/
I've seen plenty of reviews, as I am gaming in 4K and I have a 4090 TUF OC with the 4090 Strix OC BIOS flashed.
I'm still running the older productivity AM4 CPU, the 5950X, so I wanted to see if the 7950X was a good upgrade, but found that it wasn't for 4K.
Then I looked at the 7950X3D, but I find that the motherboards + CPU + RAM will cost too much for it to be worth it for 4K gaming.
Still, I feel that Intel is a bit cheaper, mainly due to the motherboards, but a 13900K will cost almost the same as a 7950X3D.
However, it is also possible to buy a 7950X. Or maybe I will just wait. I think prices will drop at both AMD and Intel; motherboard prices especially must come down for AM5.
I am by no means married to Intel or AMD. I had been using Intel forever, but then I switched to AMD after I felt Intel stagnated for several generations.
However, at the moment there is finally a fight between Intel and AMD. Productivity-wise the 5950X is still fast enough for me as a developer. But I do of course miss out on the "minimum 1% fps" with my 4090 in 4K. However, it is still so fast that I don't feel any games lag at the highest settings. So I think I will just wait and see what happens with prices or bundles on either the AMD or Intel side.
What are the benefits of flashing a different BIOS to the GPU? And do you lose your warranty when you do this?
@@shumba_the_don it's mainly the stock speed and turbo speed. The gains are small
Appreciated!
Thank you so much for this, 13900ks it is.
I think the 5% lows are way more important than the 1% lows. If I'm only seeing a drop 1% of the time with a high-refresh-rate monitor setup with one of these, I'll likely never know. Seems a little overkill to harp on it, but maybe not.
If your game stutters 1% of the time that means every second at 100fps you get a stutter. Or you play for 99 seconds and then your game fucking dies for a second. I would notice that.
Frametimes matter most, with 1% and 0.1% lows after that. If you're getting steady frametimes 99% of the time, that's significantly more important than some in-game event occasionally causing a spike in the lows.
@@christianprice9832 that's true. I was just trying to work out a frame of reference for what 1% lows mean while gaming. Kinda came off a bit aggressive.
@@YOGURT1 I guess I don't notice. It doesn't die for a second, it merely drops from X fps to Y fps, and "1%" doesn't really mean 1% of the time in these cases. The 1% low is basically the average of your slowest 1% of frames, i.e. the lowest it drops, not how often it's that low. Just as his video states, it really is semantics: just as some people sell high average frames, he's doing the same with 1% lows.
@@jeffiedean That makes sense. My examples were on the extreme end. In your case you would see the avg fps and the lows close to each other, which would be good since at least it's consistent. Frametimes might be the best way to see stutters.
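For anyone curious how these numbers actually get computed: here's a minimal sketch, assuming you have per-frame render times in milliseconds (e.g. exported from a PresentMon or CapFrameX capture). The "average of the slowest 1% of frames" definition used here is one common convention; some tools report the 99th-percentile frametime instead.

```python
# Minimal sketch: average FPS and "1% low" from a list of frame times (ms).
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)

    # Sort slowest-first and average the worst 1% of frames.
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    one_percent_low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, one_percent_low_fps

# Example: 990 smooth frames at ~8.3 ms (~120 fps) plus 10 hitches at 50 ms.
frames = [8.3] * 990 + [50.0] * 10
avg, low = fps_stats(frames)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")  # avg ~115 fps, 1% low ~20 fps
```

Note how just ten 50 ms hitches barely move the average but crater the 1% low, which is exactly why reviewers harp on it.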
That intro was hawt!
I’m a cod bro, thank you for calling me out
thank you for this video and thank u supporters, im a new guy here just found your channel not too long ago and im happy i did.
Care to explain how this is any different from turning the E-cores off on 12th and 13th gen for maximum fps? The thing is, we are at the very beginning of heterogeneous CPUs in x86, and there are bound to be issues. Games not launching because DRM freaked out over P- and E-cores at 12th gen's launch comes to mind. It's going to take time for game and software developers to properly code for CPUs with different cores under the same IHS.
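And to be fair, the manual workaround is the same on both sides: instead of disabling cores in the BIOS, you can pin the game to the cores you want. A minimal sketch assuming the psutil package; the process name and CPU indices here are made up, and the actual P-core / V-cache CCD layout varies per system, so check your own topology first:

```python
# Minimal sketch (assuming psutil): pin a game to a preferred set of
# logical CPUs instead of disabling E-cores / the second CCD in BIOS.
import psutil

GAME_EXE = "game.exe"             # hypothetical process name
PREFERRED_CPUS = list(range(16))  # assumed P-core / cache-CCD threads

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(PREFERRED_CPUS)  # restrict the game to those threads
        print(f"Pinned {GAME_EXE} (pid {proc.pid}) to CPUs {PREFERRED_CPUS}")
```

This is essentially what the Xbox Game Bar / driver scheduling is trying to do automatically on the dual-CCD X3D parts.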
Grand! What a stand-up review
can you redo the benchmark soon since we have 8000 RAM on AMD now
A delidded, overclocked 7000-series X3D video would be great; I'd definitely watch that. From what I've researched you need the Thermal Grizzly gear to delid, plus a contact frame and the short backplate.
That is cool, a grandma running a 7950X for her gaming rig. 😂
Thanks to the people of the community ☻
That intro was fire! Lol
Thanks supporters. Dude you are so passionate, don't let it make you sick. Everyone isn't drinking the flavor-ade. I bought the 7950x cuz it was a great deal for me. This may not be my niche of a channel but your content it thought provoking. Keep up the good work!
That intro made my day lol😂
I just bought this after ASUS ruined my 13900K, and I've had the 💯 opposite experience! Cyberpunk scores went up, and micro stutters in games like Guild Wars 2 and even old SWTOR at the fleet are now gone. All my games are so much smoother. My hunch is that Game Bar plus chipset driver and firmware updates in the 13 months since this was made fixed it. This feels like an upgrade from my crappy 13900K. The scheduler works perfectly in my testing, and it's within 2% of my 13900K in CB23.
Thanks so much man. This helps me so much
This intro LMFAOOOOOOOOOOOOOOOO
Ok first off, love the honesty on here as well as on Twitter ! Fuckin need more honesty in this world tbh man
Besides the AMDip, there's the platform stability: USB dropouts, TPM stutters, or being at the mercy of the Xbox Game Bar for core allocation. I went Intel this time because of that. Even if it needs more tuning to reach the max, I'd much rather tune than be left in the shit.
Game Bar, yes, but I don't think the USB stuff happens on AM5. The TPM stutters I have never heard of, tell me more? Still a solid decision. I am still debating mine. It is really the whole dead-end platform thing, and to a much lesser extent the lack of PCIe 5.0 M.2, that is holding me off Intel. Also general cost if you want DDR5.
@@nathanwhite617 Best I can explain it: a few times per day your system would just freeze and start sputtering for a few seconds, cutting off audio and dropping your fps to 2. It didn't matter what you were doing: gaming, movies, or just idling on the desktop. You can see examples if you google "amd tpm stutter". AM5 will probably be free of that too, much like the USB issue, but all the firmware/AGESA issues I personally had with AM4 make me appreciate the stability of the Intel platform. Lacking the ability to upgrade the CPU down the line is of course the major downside. I did upgrade on AM4 (2700X → 3900X → 5900X), but by the time I hit the last one I already wanted a new motherboard with better features, so by the time we hit the last supported CPU on X670 it might be the same thing. It really didn't matter that much for me, but YMMV.
How the tables have turned! What would you say about intel 13th and 14th gen instability fiasco?
@@thefreemonk6938 The shoe is pretty much on the other foot right now, and the current situation is getting beyond silly. Personally the only bad thing I've heard about AM5 as a whole is the boot/memory-training issue, and overall AMD seems to be a much better choice than Intel at the moment. Intel not communicating about the instability is just adding insult to injury.
And I'm here sitting in my chair, planning my upgrade from a 1600X to a 5800X3D :D
This video is actually truly unbiased, though I suspect not everyone will pick up the true message: buy a 7800X3D if you are NOT interested in overclocking/tinkering. I myself am in this camp; I just want it to work out of the box and game.
However, if you are in the tinkering camp, in certain situations/games you can optimise a 13900K to outperform it. In some games memory bandwidth is important, in others it's raw single-thread performance, and in those scenarios maybe the 13900K is for you.
In life, one size does not fit all, and that is why it is important to educate yourself. That information needs to be evidence-based, and you should always cross-reference multiple sources and weight them based on the evidence provided. Rarely do videos on the same subject provide apples-to-apples comparisons (and if they do, then you need to diversify your source material).
Videos like this give you a truly insightful view; other valuable sources will give you a stock-for-stock comparison. Armed with all the information, and based on your own personal requirements, you can then make an informed choice. Don't rely on anyone else telling you what to buy; they will always introduce their own personal biases. Take the time to educate yourself if you want to get the most for your money.
Also, for the 14900K to run as well as it does here you need a REALLY highly binned Hynix A-die DDR5 kit, which is not cheap. Another cost factor at play.
I still feel the R9 7950X3D is a halfway decent deal for a power-consumption-minded build, compared to the thirsty i9s.
The intro to this video is fucking gold.
That's what some people are trying to explain: games don't really push modern CPUs anymore.
In this video, Tomb Raider is using about 44% of the cores, which is only roughly 3-4 cores' worth of work, a handful of threads.
And even at that 44% usage on those cores, if you look at the game running in this video at 5:27, the 7950X3D is showing 390 to 445 fps in Tomb Raider.
And no I don't play COD.
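If you want to sanity-check this on your own system, per-core load is easy to sample while a game runs. A minimal sketch assuming the psutil package; the 50% threshold is an arbitrary assumption for counting a core as "busy":

```python
# Minimal sketch (assuming psutil): sample per-core utilization over a
# 5-second window to see how many logical CPUs a game actually loads.
import psutil

per_core = psutil.cpu_percent(interval=5.0, percpu=True)  # one % per logical CPU
busy = [pct for pct in per_core if pct > 50]              # assumed threshold
print(f"per-core %: {per_core}")
print(f"{len(busy)} of {len(per_core)} logical CPUs over 50% load")
```

A single overall "44% usage" figure hides whether the load is spread thin or concentrated on a few heavy threads, which is why the per-core view is more telling.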
would the 5800x3d dip more than the 7800x3d or would they be equal? since the 5800x3d is probably still the cheapest platform to adopt in 2023 with maybe 10-15% slower fps
If you want to save money and just need something "good enough", the 5800X3D is a good and lazy option without much hassle.
Buy any cheap $100 B550 board and some Samsung B-die for $150, or even cheaper Micron E-die for $100; or if you already have RAM, just reuse it.
@@marceldiezasch6192 I know this. I currently run a 10700 non-K, and I'm pretty sure I would get massive gains (up to 50%?) with a 5800X3D. But I'm concerned about the dips. I'd rather invest more into the 7800X3D, since I'll have to upgrade the platform either way, and avoid the worse dips, if you know what I mean.
@@kharaf9920 why dont u get intel dude.
@@Boogeymanjw cba learning to overclock my RAM and spending time in the BIOS. Unless somebody volunteers to tell me exactly what numbers I need to punch in, I'm really not into wasting time learning the whole thing and stress testing for hours. I like simplicity :P plug and play, basically.
The 5800X3D will dip more, if I understand it correctly.
"dont buy this but if you do use my link ill take the money." -frame chasers
That intro though
Yep, was contemplating getting an X3D chip when they came out, but I'm not impressed with the performance imo. Guess the 5800X3D was the one-hit wonder for them; I was hoping for that kind of uplift. Gonna stick with my trusty old 12600K.
The best part of this video is the intro! Content is good too!
I've got a 7950X3D and it was a good decision. It lets me forget about a CPU upgrade for the next 5 years. I'm not only gaming but also programming.
Heavy graphics here; I'm a painter, plus I use it for work myself. So yeah, this actually sounds like a good decision, and efficient energy usage comparatively speaking. Plus it's not like they won't improve the task scheduling beyond what they've got in the first week.
If you're doing any sort of productivity and you're using AMD, then imo you must not respect your time.
@@geeesejeeeze6055 Honestly, if you even notice the one-minute compile difference on truly gigantic projects, you must seriously never leave your seat. Learn to do so; deep vein thrombosis is no joke.
Every painter stands and sits occasionally to change position.
@@BronzeDragon133 when I'm doing my work and I'm zoned in I don't want there to be any delays. There's a reason why most people that do any productivity work stick with Intel and Nvidia. It just works. AMD is people who don't value their time and like to waste money.
@@geeesejeeeze6055 That seems overly judgmental, like saying that Intel is for people who don't value electricity and like to waste resources. But you do you, Booboo.
So, does the Radeon VII in the background still work or did the interposer blow?