True, however a point that should have been taken into account is the power use of the systems. If the Intel system requires another 100W of power to supply near the same performance then it wouldn't require running the AMD system for long to make up for the initial difference in cost
@@jrhowrey lmao im sure Micron walked him through the process and properly validated that it works, but the idea of having a piece of hardware that you had a hand in putting together is cool
Both these CPUs have vastly different architectures, signal paths, core counts, caches, clock speeds, and instruction sets. They were designed and manufactured independantly by entirely different teams with next to no direct communication between them. Yet when they were pitted against each other, the difference is negligible at the abstraction levels of our minds. It just blows my mind just thinking about it .
In the early days they were radically different however because of the standardisation of operating systems and architectures and new technologies etc they are for all intents and purposes identical apart from the label.
Bruh that's only one component. Take Formula 1 teams from different countries with different engines, different chassis, different drivers, lots of different aero and mechanical parts, different electronic and computational components, and they can be within 1 one-thousandth of a second over a 100 second lap, i.e. 99.999% identical performance. That's in the real macro world too in variable atmospheric and track conditions, not in a lab
If they're this close on performance, you should have put an energy meter on the end of the PC plugs so we could see how much energy they used over the course to see if there was a difference there.
yeah but that similarity is only in the enthusiast end.... they have the same thread count, but the 420$ 13700k is arround the 550$ 7900x... and dont even talk about the 13600k being better in every possible way than the 7700x for 150$ less, and mb cost...
@@AdrusFTS Not that it matters much to most if you’re building a new system, but isn’t the 13th gen platform dead after this generation? The AM5 motherboards should be able to be carried forward when 8000 (or whatever) series is out in a couple years. Doesn’t matter to most of course, but that was one lovely thing about my AM4 2600 build, going to a 5600X later on
@@giaopx amd owner here, dont think the 7900xtx will beat the 4090. might be close, but youre hyping yourself up too much. it would be really cool, but amd has a lot of performance to make up to get up there.
@@dylanneff8338 To be fair, the price-performance difference will likely make the 7900xtx the better option for gaming, but, for users that want to do streaming, or 3d modeling, or any other task that requires a GPU that isnt gaming, Nvidia has been the better choice for years, this is related to encoding, drivers and architecture. All in all, it comes down to what the buyer wants to do with the GPU itself. And about the CPUs, they forgot to take one little detail into consideration, and thats upgrade path, AMD is known to make their platforms last many generations, while Intel changes the platform every other generation. Meaning that 14th gen will be in a new platform, while 8th 9th and 10th gen will still be AM5 for AMD (if things go as expected). So in the long run, AMD is the more logical option for someone that expects to possibly upgrade in the future. But, we are talking flagship CPUs, both 13900k and 7950X will be incredible CPUs 2-3 years from now anyways.
I ended up deciding to go AMD simply for one reason: Longer socket support. They just started doing AM5, when Intel has switched sockets every couple years now to force mobo upgrades. 7950X for me.
Not just that but they give new features with each new chipset and honestly my 13900kf will last me the next 5-6 years no problem by time I upgrade I will just get a whole new rig
Simpler organization, too. I was on AM3+ until about a year and a half ago, and I went straight to AM4. It was as simple as pouring milk into a bowl of cereal. With Intel? I'd have to first get around their confusing AF naming scheme for their processors, the find out what socket and motherboard goes with said processor... nah. Just nah.
@@theengineer2017 That is cool and all, but I like to stretch 10+ years between buying new rigs. Had intel for awhile, but doing just mostly gaming AMD is the clear choice here.
@@EzrealLux I went amd since I watched the 7800x3d video. I saw the benchmarks and the prices and noticed a clear winner. Gonna step up from a xeon 1231 and im already very excited. I bought a aio, trident z ram @6400 16x2 and msi b650 tomahawk. IDK about the ram speed because everyone is using 6000. I hope it goes well. Any thoughts or do I need to buy anything else?
3:45 Yes, someone said it!! Back in college I had a gaming rig with an ATI Rage 128 for the 2D card, Voodo2 SLI and a sound card, filling up all my PCI slots. I had one ISA slot which I put a modem into. It was beautiful.
@@Incommensurabilities If he had a full size atx board and the ISA slot is the most outer slot it is not overlapping. I had a similar board back in the day and also used all slots 🙂
Ah, the memories... My first PC I put together: AMD K6-2 CPU, RIVA TNT (2, I think) AGP graphics card, a Soundblaster soundcard, 14" CRT monitor, Firewire card... I don't remember which board, but I do remember it was quite expensive (used for a review for regional the BUG - PC magazine).
As mentioned, almost the same at idle and gaming loads. And for productivity you would need a more nuanced look at power versus perf (and even then if you’re making money through your computer you probably don’t care about power and just want raw perf since even in high energy cost places the difference in yearly energy bills is not immensely significant)
It's funny because you used a clip from your pretty old $2500 rgb pc build, and it was nice to see that. That was probably the first video of yours that I watched and has kept me around for a while. Love the content, always love it
I'm no kid, but I am about to build my first PC and that is basically exactly what I've been doing. Now I'm just trying to cut back to a level I'd still be happy with while saving a bit of cash.
When it comes to CPU usage in gaming, I think you should consider testing Planet Coaster, which even if it's from 2014, because of the Cobra Engine's limitation it's actually a Core-hog when having huge parks and especially when fully filled with all the guests the park can permit at max. Just sayin' to take it into consideration! This why even those channels that present created parks in Planet Coaster actually limit the number of guests or close the parks all together to get that framerate running right.
Given that amd kept their word on the lifecycle of the AM4 platform, going AM5 means having a platform that will be around for a while with easier upgrading in the future.
They didn't on threadripper unfortunately, but the market dynamic for that kind of explains why. They literally can't make the TR Pro fast enough to fill the backlog so why would they make a lower tier part.
It's a 7950x what are you going to upgrade too?? "Yeah I really need the 9950x in 18 months though, its like 10% faster and with the same amount of cores." All I see is AMD fanboys talking about how "eco friendly" AMD is because you use less power and the same board for upgrades You wanna be eco friendly, stop upgrading your CPU every 20 months, that and if you are dropping about £2000 on a new PC, saving £100 now or £200 in 3 years means nothing.
You can always flip motherboards if you decide to upgrade, and intel has more room for ram upgrades in the future unlike AMD (which does give benefits, and RAM is quickly getting cheeper/faster). Further more most people don’t upgrade every 3 years, even if substantial performance gains are achieved in that time. Not saying longer platform support isn’t a nice feature, and intel should adopt at least one more year for their platforms. But in general I would always buy the better CPU for the cost at the time rather than planning for a theorized future. Until recently that was Intel, although with the price cuts on AMD it’s probably winning currently! Competition is nice
@@Daeyae I've upgraded 2 times in the AM4 cycle, I am also unbelievably lazy so personally id take the AM5 so I don't have to take the board out and sell it, which I would also take a long time to do.
This needs to be brought up way more often than it does. Being able to swap in a Zen 5 and even Zen 6 chip later down the road alone gives AMD the win even if AMD was more expensive.
Should also consider the new AM5 socket will go on so you will be able to upgrade your AMD cpu in the future without having to change motherboards, while intel is changing sockets after the 13900K so you’ll need a new mobo for the next gen intel cpus
This. 1st gen AM5 is on par with Intel's best and yet Intel's socket is already EOL. In a couple of years time Linus can simply drop in a 9950X3D after a BIOS update for no doubt huge gains all whilst keeping the motherboard and ram in place.
realistically, almost no one I know who buys the latest and greatest CPU even cares about keeping the same motherboard. The CPU upgrade is almost always tied to getting better feature sets that newer motherboards offer.
Honestly, I'd love if something similar became a labs project. Getting performance numbers for as many steam games as possible on intel/amd cpus and nvidia/amd gpus would allow for gamers to figure out what hardware would work best for them!
I would love to see some single thread limited/unoptimized games benchmarked. I've never seen anyone really do this before. Unity games like rust on a map with tons of buildings or far cry 3 for example. Gta online is a good example as well
@@UKCougar if you looked at cpu performance over the past 10 years, AMD hasn't been even CLOSE to intel in performance when you compare their top end chips. Previously, in a money is no object build, AMD wouldn't even be CONSIDERED because of how trash it was.
@@FaZeredemption3 power consumption is an interesting point when it comes to a cooling solution. All the dude said is that it would be interesting, chill the fuck out lmfao
Something else to take into account is longevity. If you are buying top of the line processors then you probably like to stay on the cutting edge, which means when the AMD R9 8950x and Intel I9 14900 come out you are probably going to buy them. But the difference is that the 8950,9950 and 10950 or whatever they will be called probably wont need a new motherboard every time and intel will. Food for thought price wise.
I bought a top of the line processor so I wouldn't have to build a whole new computer for several years. I'll max out my processor and MB on quality, then swap the other pieces as needed for 5-10 years, then rebuild a new monster and start it again. This time I went with the 13900K and the 4090, so we'll see if it lasts like my last build did.
No, I pick high end CPUs and keep them for ~5 years. Hate working on CPU, mainboard, cooler and cable management all the time. I have 14900k + 4090 + 64GB DDR5 + 10TB PCIe 4.0 NVMe in my system and the only thing I will change in the next 5 years is the GPU.
The performance was so close. It was within the margin of error of measurement and the margin of variation of the silicon lottery. Would have been fun if they swapped GPUs between the machines to validate
the power differences betwen the computers is literally nothing compared to the power usage of any other high powered device like your air conditioner or washer/dryer, dishwasher, etc
I did the math for my CPU(5950x) and based on 5 hrs a day at load (0.11c/kWh) I could save nearly 50% on my energy bill compared to a pig like the 13700. That's based off a single component.
when linus pulled that first 4090 out of the box, i genuinely thought it was a toy for like a joke about how huge it is. i've already seen it on video a bit, but i did not remember it being this big
@@h.m.chuang0224 yeah the M2 is a Great CPU for completely different reasons, but hey the current x86 architecture is probably gonna hit its limit soon with CPUs getting small enough that Quantum tunneling is a real issue and apart from some architecture upgrades theres not alot left for x86 which is already 44years old now, probably about time to start looking for a better solution, something similar to the much more modern ARM architecture?
@@FacialVomitTurtleFights Snapdragon 8cx and 870 have already made their way onto laptops. I'm guessing they don't have partners and the PCIE lanes aren't enough, so don't expect it on desktop anytime soon.
@@avixs1543 looking at how big Apple Silicon dies are, they will hit the same limit as x86. By that time, the only deciding factor is going to be architectural benefits of each arch. I don't believe that there's magic in this world, so the best case scenario is ARM chips being about the same performance as x86.
As a ex-BIOS engineer who worked closely with Microsoft kernel (ACPI) team, then had a long career in mobile architecture (particular in kernel and power management), it's sad that how they just didn't address power properly. I've worked closely with Microsoft on power optimization back in windows mobile/windows phone days, so they clearly have the knowledge and we've developed techniques that Windows can work just like what mobile device can... I guess that knowledge is lost in the sea of Microsoft engineers...
Microsoft simply doesn't believe in different limbs of the company being able to communicate with each other. They actually encourage the silo'ing of departments in their organization products(AD, etc).
@@shabadabadoo4326 That's how they do it in the CIA, FBI etc. This way people have no idea what's actually happening in the grand scheme as everyone is working on smaller projects that contribute to the whole. Easier to get away with shady practices this way.
@@MTGeomancer How about people that run those SETI@Home and Folding@Home programs? Or leave their computer running with torrent uploads going 24/7? Or crypto miners. Or people who's photo and video exports take >1 hour. Or people that run website/media servers on their main computers. Might be valuable to those groups.
Seasonic kicks ass. I still have my original power supply, which has outlived 2 motherboards, 3 CPU's and more than 10 years use with no sign of letting up.
I have one Seasonic that has been going for about 10 years now. :P It outlasted a few newer Corsairs, though the older ones were also made by Seasonic as the OEM, back when they also lasted forever.
9:59 Actually the 6800 MT/s kit has a bit lower latency, because memory latency is in clock cycles. The numbers may be higher on the 6800 MT/s kit but due to the higher clockspeed the effective latency in nanoseconds is a tiny bit lower for some of the memory timings.
would be nice if you would compare the electricity usage for boith systems. In some parts of the world (f.e. germany) electricity prices can get over 0.5€/kwh, so that could be a major factor.
ok, but then what else can you do at 0.5€ an hour? lie in bed? And remember that those wattages they talk about are max draw, when under load. When web browsing they can easily draw an order of magnitude less.
Power usage is such an underrated metric. If I can use way less electricity for near-same price and performance, I'll absolutely do it. There are cost of ownership and eco-friendliness implications here. And the upgradability factor favors AMD's motherboard which has additional cost-savings and eco-friendliness implications.
They spent $3100 each on just the GPUs and SSDs, and you're wondering about electricity prices? This just isn't that kind of video. Gamers Nexus has all the numbers you want for each component if electricity draw is that important to you - these guys are just fucking around with expensive stuff for fun.
Just goes 2 show you how being rich can just make someone so damn careless. no one in my house would even dream of setting that equipment down with such FORCE before anyone calls me weird for making it a big deal, that card costs more than the RENT DOES HERE. not even a newborn baby would be set down as gentle as that damn card if I could only afford that
@@iLegionaire3755chipset drivers are somewhat irrelevant, both are good are you carrying over the stigma that comes from using an AMD gpu and "bAd DrIvErS", i mean sure, in the past, their driver support was awful (i.e on Terrascale and/or GCN), nowadays it's just as good as nvidia but for midrange modern gaming rigs, get a 7600 and pair it with a 7700XT
AMD will always be Team Teal to me and I'm glad it's showing up more in their in presentation slides and, blessedly, it will show up on their products more too. :)
I'd love to see AMD only for the reason for a follow up video to see how accurate AMDs claims are on for further support "till 2025" and possibly more. Also to see if keeping first gen AM5 boards will keep up (possibly feature wise) to newer boards in the future.
@@AngieYonaga Of course, AMD has proved themselves in keeping their promises but not without some troubles along the way. Remember the whole bios flash size debacle? Although yes, with this experience I doubt AMD would try something like that again that would get them flak.
@@NootNoot. Hopefully that's the case! No matter what it's good to remember that neither Intel or AMD wants the best for you, all they want is money and the only way for us consumers to not get screwed over is by making sure they deliver and riot if they don't
@@AngieYonaga That's true as well. Honestly, in a recession, competition couldn't have been timed any better. This video proves this and finally consumers may have an edge on the market unlike last year.
I'm glad someone else pointed this out. The failure to address the fact that AM5 will be around for a few years where as 13th gen's LGA1700 is likely to be the _last_ CPUs on this socket; feels like a massive oversight. If you are building a system from scratch right now, there is almost no reason to go Intel, you'd be shooting yourself in the foot.
Honestly at this point when both the AMD and the INTEL builds are so close to eachother in terms of performance… it’s just all about how much you can/wanna pay because other than that, it’s pretty much impossible to spot significant differences between them
@@n646n You ain't supposed to pick anything by the brand. Pick the one that performs the best and costs the least. And if that's not possible, just pick the one that looks better. Same thing with dems from team blue and reps from team red in "The American Sh*tshow: Russian Interference Edition" But somehow people seem to take everything way too personally. The same people who call others "snowflakes" tend to melt down at the slightest hint of fire.
Just got a 7950x, and it is an absolute beast. 32 thread rendering is just out of this world. And gaming is a piece of cake, even maxed out triple a games
@@badbutton5869 Finally someone says it - going for intel 13th because you already have DDR5 seems to be stupid because you will have to upgrade just everything and cannot upgrade the CPU. Unfortunately it wasnt mentioned in the video at all.
not much, 20 cpu pcie lanes from intel vs 28 cpu pcie lanes from amd. I.e. nobody of them able to provide 2x16 pcie slots. If you need tons of io in the rack - you getting xeon or epyc
OK I have to ask, why would you be looking at either of these for something you would be putting into a server rack? Just curious because If I was putting something into a server rack, it would be a server and that is a different class then what I would use for a pc. I just don't get the server rack reference. For perspective, the last node I put in a server rack had 120 cores, 2TB of RAM and was connected to a 64TB SAN using 64GB adapters. I like the the AMD design but that doesn't come close to server class hardware from AMD or Intel.
The ultimate performance test for a cpu is going into After Effects, making a very complex project, with a ton of layers, motion blur, animations, movements and scales of all kinds and a ton of different effects added and pressing spacebar to see how the cpu handles the live preview. I barely ever see this kind of test. :(
Also, with AMD you can probably upgrade your CPU after two-three generations with the same MOBO. This is going to be my first all AMD build ever, if the 7900 XTX can deliver what was promised.
Honestly an advantage for people who are buying entry level or midrange processors, but not at the top end. I bought Intel because I only upgrade my processor when I want new motherboard features like NVMe or the latest DDR memory version. High core count, fast processors should have a pretty long lifespan
@@Mournful3ch0 Sure, if you don't upgrade your rig that often it's not an advantage. I'm upgrading from DDR4 to DDR5 now, but I'm not going to wait 8 years for the next upgrade, and i don't think many top end builders are. Upgrading MOBO within 4 years rarely yields much performance gains. From the first high end CPU to the last on the AM4 platform on the same MOBO and DDR4, there is a 179 % increase, that's not insignificant.
I can't believe you didn't mention upgradability between the two platforms, next year you can just stick a 8000/9000 (whatever it will be) series AMD CPU into that AM5 slot and not have to upgrade anything else. You can't do that on intel. Not to mention the new 3D cache chips that are coming out in the new year, gaming performance will increase massively.
do we know that next gen amd will use the same socket and chipset? for all we know, amd could do an intel and force you to get a new motherboard every generation or 2 and intel could start doing an amd and support sockets and chipsets for more generations of cpu. I'd be surprised if either did change but we never know
@@om8414 That had less to do with the socket itself The problem then was the memory size of the BIOS chip or something of the sorts, and they couldn't support 1xxx to 5xxx CPUs on x370 boards. They ended up making it so that updating the BIOS would only support the latest 3xxx and 5xxx CPUs and remove 1xxx and 2xxx supports depending on the motherboards chip storage space It's not like Intel who basically used the same socket pins (different layout) and pretty much purposely designed it to not work well with newer chips They should've created a new socket but the heat from early Ryzen claiming support for years made it tough, so they kept the same socket and decided to call any future proof issues "for our own good"
@@om8414they promised to only support up to Ryzen 3000 on some boards and yet STILL updated it when people whined about 5000. Don’t act like they aren’t going well beyond what they should have to. Intel doesn’t even come close to that. Please.
Good thing at last end with 7950x fast forward 1 year to now that i9 13900k probably keep crashing the pc due to the voltage issue and the manufacturing issue 😅
@@rabbychan 7950X, ASRock X670E Pro RS, Samsung 980 Pro 2TB, Seasonic 1000 Gold, GSkill Trident Z5 32gb 6000, MSI Mag 240 AIO, Fractal Pop Air, Windows 11. Still running my old RX 580 as a placeholder until December, plan to get a 7900 XTX. Want to add some storage as well but debating between another m.2 or a big HDD.
@@HardEarnedBacon I'm very excited about the release as well, I'll certainly keep an eye out, I'd like to have a high end full amd build in the future if there are enough GOOD game releases that justify it.
@@rabbychan i build my first ever AMD Rig (comming from 9700K). My specs : 7900x / 32 GB G.Skill 6000 RAM / 3070 Rtx strix / Asus B650 A Gaming Wifi Mainboard / Asus Rog Thor 1200 watts / Noctua NH-U2A Chromax black cooler
I am actually so happy that both AMD and Intel are viable now for both gaming and working with barely any difference. I hope this trend continues and in the future we will see the same happening to the GPU market. Healthy competition is when both products are practically the same, because in the end the customers win the most
The history of Alt+F4 is quite interesting. I still fail to understand why anyone would remove this from a game or any software in fact. These days it's more hassle than it's worth 90% of the time as it's automatically assigned in most major game engines (Unreal Engine, Unity, CryENGINE etc). Heck even Game Maker has it by default!
@@JsGarage if you didn't know about it then somone telling you and you falling for it is doing you a favour because now you know about it and its not really something you fall for twice. also it hardly counts as a 'troll' I'd argue system32 is more trollolol and even that is kind of passe today
When a game stops working it sometimes send data and error report back to developers in the background. Alt f4 stop that one from doing. This one of that shity anti consumer things they do.
I've searched all comments with many different keywords, and nobody is talking about the Wizard of Oz reference at the end. Anyways, loved your Scarecrow impression Linus!
Lol, I just built my first new rig in years (2nd gen to 12th gen) I remember how careful I used to be building pcs even down to applying thermal paste back in the day. Now I just built up my most expensive computer so fast sticking it all in. Too many years watching Linus man handle pc parts 🤣
@@jrbudoybudoy "Their PC " ? Does she suffer from multiple personalities or something similar or you was looking for "her PC" and couldn't remember the proper word?
@@jrbudoybudoy You said "my GF's rig" not our rig or my family's rig. Girlfriend is singular so it does not make sense to use their. My PC can be use by my whole family as well but in the end is my PC not our family's PC because I am the main user.
Hey Linus, about that Corsair AIO, there's a known issue that causes it to freak out and tank your lighting with a false Pump Failure Notice that eventually just causes your system to safety off or some shit. I'm currently having mine RMAd. Beware my friend.
@@gizConAsh because we pay for electricity? It doesn't mater if it's a server or desktop... In fact, desktops today draws way more energy then anything else, and is not that electricity is getting cheaper...🤑 You save 50$ on the build, to pay 100$ on electricity. That's why we need a power draw comparison. I personally don't care about 5FPS more. Not gonna make you the winner anyways if you suck. Take Linus* as an example. 😂
One BIG! difference between amd and intel aswell is that am5 is a new plattform, so you can keep youre Motherboard for coming CPU Generations, on the other side is intel Rapterlake with socket 1700, that is probably the last cpu that runs on this Motherboards. (sorry for my englisch im from germany... haha)
I’ve watched a lot of Linus and the gang, a lot. This is the 1st I’ve been able to say “I have that.” I use the same Corsair Elite Hi150 AIO cooler for my i7 9700k. Good to know it comes recommended for newer, and more powerful CPUs.😁
Yeah unfortunately the H150i doesn't handle my oc 7950x terribly well, even with thermal grizzly kryo extreme. It sits at 85C on full load sucking about 200 watts at about 6 GHz. I would hate to see how hot the CPU stays with the 13900k. Tbh tho I debate doing a delid or liq metal to lower temps.
@@chrispersinger5422 No that's actually quite ok. Zen 4 is designed to thermally limited, so it'll boost as much as possible and stay at 95C. If you want better temperatures, you need better IHS or direct die cooling with a good waterblock.
The Voodoo 5 6000 was well ahead of its time, which is why it needed an external power supply plugged into it. That and it never ended up reaching the consumer market due to 3DFX going bankrupt :(.
I've seen comments that Zen 4 feels crazy responsive in windows (J2C I think) - so it would be really interesting so see how the two compare in everyday responsiveness.
Having just built a 7900X system it's my opinion that's not the case. Just upgraded from my ancient intel 4700K Windows 10 to the 7900X on Windows 11 and I really can't tell the difference in general windows use, I'm even running m.2 drives now and starting up browsers don't feel any snappier, which is what J2C was saying.
@@TangoMikeOscar The reason may be the Windows 11 and its stupid security quirks. My friend bought a laptop with Ryzen 5000 for his wife and he had like hilarious performance on W11 like waiting for start menu to open for 2 seconds each time. Just installing W10 made so huge difference that we even did not believe our eyes at first.
back when i bought my mobo i was like oh sweet look at all them pcie slots theres 6 of them and support for 3 way sli yeah! its been six years and ive only ever used 2 of them, one of them for a sound card i havent even used in at least 3 years.
You guys should do a 13th Gen comparison between Windows 10 and Windows 11. I’m planning a new high end system, and would really prefer to run Windows 10. What has me leaning towards AMD is the fact that supposedly Windows 11 is needed for much better utilization of Intel’s E cores, and it’ll be interesting to see how the second generation E cores react. Gaming may still be a tie between the two platforms, but if I’m paying for all the cores, gosh darn it I want to use all the cores.
You are right in your thinking. Windows 11, if you are team blue, Windows 10 for team Red. Windows 11 is still crap on AMD. Weird jitters and hangups on AMD for Windows 11. Or just run Linux and choose whichever you want.
Don't know why you'd prefer Win 10. Tabbed explorer is a godsend in Win 11. If you really want to gimp yourself hard just so you can use Windows 10... go AMD I guess.
Considering how close the performance and price were before sales/pricecuts the choice doesn't really seem to matter. Unless you're planning to upgrade the cpu again soon, assuming intel changes the socket again for 14th gen, amd would be the easier choice
I just found out how to determine APIs within games. Have separate hardware. GCN 3 is good at DirectX but bad at OpenGL,and vice versa for Nvidia's Fermi architecture. NVS 5200m (1080p) performed better on Sonic Utopia than the A12 9800e did (900p). Project 06 ran better on the AMD APU (720p) than the NVS 5200m (1024x768). I found this to be interesting as no one should make the mistake of buying the wrong gpu for the wrong API. Depending on what API your games run on you should buy your gpus that excel in those areas.
Or you know, just open the game's installation directory and find the DLLs that it is using, which should be named something something vk for vulkan or dx9/11/12.
You've got a lot more variables when comparing an APU to a dedicated graphics card though. Memory bandwidth is probably the most significant in that regard, as pretty much every APU will be memory bottlenecked since it has to fight the cpu for every bit of bandwidth there is. How much it will be bottlenecked depends on the game or application you're running, but it's not necessarily related to the API. Drivers are another variable: an APU will probably get game optimized drivers, while I'd imagine an NVS probably gets workstation optimized drivers. Different architectures can absolutely perform differently on various APIs even when the gpus are generally similar in performance, but I'd say you really have to look for a more apples to apples comparison to know for sure. Ideally you'd want them to have a very similar memory configuration as well (both having very similar bandwidth and either the same amount of vRAM, or making sure none of the games you're testing exceed the vRAM limitation of the card with less vRAM)
Dunno how long ago you recorded this but according to nvidia (see gn deep dive) the issue is not fully engaged connector rather than bending the cables. They still covering them under warranty, but it seems its quite easy to not fully click some in and then when cable managing it can work its way looser without being very noticeable. With the reduced contact area on the pins, it increases the resistance thus the meltage.. (and the telltale marks that show a connector that was used without full engagement) Might be worth an editors note.
Man I love filling all the slots. Especially the old PCI that has to split speed between them. It was not enough though so I put in two proprietary PCI splitters from old Optiplexes
"the Ryzen's IHS looks cooler" well... in the end, i have never seen a Computer where the CPU is visible _because that would mean you're not cooling it_
FYI... In USA / Canada our Breakers are 15 amp rated at 120v nominal for common outlets. That works out to 1800w maximum continuous load per circuit. (there are many other circuits at higher amperage and loads... I'm only discussing a typical residential outlet.)
@@bassplaya69er How many volts are in your plug? I'm around 126-127... You could go 20 A circuit if you ran #12 for like a kitchen or something special...
@@bassplaya69er Linus was worried about 2 gaming computers on a standard outlet and was advised a maximum of 1500 watts. Residential breaker panels are 240v with 120v and 240v circuits for appropriate loads. Air conditioning, dryers, stoves and other high load items run at 240v and appropriate amperage breakers based on the wiring. Unfortunately we do not have fused based appliance cords.
@@trevorallen8514 Yeah this is weird to me. Basically every professional CPU comparison test is done on low resolutions to actually test the CPUs. Often at 720p even. At 4k you could even run a 5800x3d or something and it would probably get the same FPS as those cpus. At 4k it's pretty much only your GPU that's working its ass off. Testing CPUs at 4k res on a "tech tips" channel makes me really question Linus now lol
@@DonDadda45 don't question it alot of videos they make are purely for views and alot of times for people who dabble in this and that . Stuff that catches there eye or sounds exciting . But most def after the architecture or world and people and all the polygons are rendered up which mainly relys on CPUs everything else is enhancing the graphics or adding more texcture or lighting effects etc that mainly relies on the gpu it is weired tho a game in 720 or 4 k will get the relativily same fps ( give or take a few ) when you have the same gpu and start swapping cpus at high end atleast. even at high end almost every GPU thats a step above the one you currently have makes a noticable difference when swapped out while running same cpu.
@@DonDadda45 because this is how average consumer will use his/her computer. For other tests you can see raw benchmark numbers, but running game on 720p is not real world usage. So I actually appreciated that I saw few fps difference, because it tells how whole platform is doing, not just cpu
@@Drvo3 You don't understand. At the resolution he tested it at, the CPU barely plays a difference at all, thus making this whole review almost useless. You'd be 100% right if this was a "system vs system" video but it's not, it's a CPU vs CPU video. The review fails in that regard. If you want to talk about real use: If you play on 4k it hardly matters at all which CPU you have as long as it's not a bottleneck. AMD, Intel, 3d.. doesn't matter, your graphics card is doing 99% of the work.
You'll also be able to upgrade to a 7800 3D within the next 6 months, or the 8000 series in a year or two, and the 3dvcache version of that architecture.
@@xTheUnderscorex I agree for normies like myself, but Linus is a multi-millionaire tech reviewer, so it's possible he'll get the new CPU when it comes out.
@@ArbitraryFilmings true, if you seriously multi task you want if not need more cores, but if like me you don't then 8c x3d was better. But now temps are high in AMD 7000s, did they turn down its mhz to reduce heat like they did with the 5000s?
@@tommyrotton9468 Iirc AMD will boost it's clock until it hit's the T-target and then slowly drops back down towards the stock clock until it finds the sweet spot that keeps it at the t-target.
Did ghost spectre fix the activation bypass yet on windows 11? Have been running windows 10 version for over a year now and it’s an easy 10% boost in fps in all games.
I have an 5900x for 1 1/2 yrs and i can tell you that i have no regrets of doing that. Obviusly it depends on what you want for but, at least for my use (high quality (not extreme) gaming and regular use) its really great
@@jensfosbk1601 I honestly never took any notice of the x3d, i just assumed it was a slightly tweaked 5700 ect. I only really realised how good it is when they showed it slapping the Intel 12th gens on a lot of benches ect. Held out upgrading for a while, nothing i really wanna play games wise to be honest where i feel i really want to see this maxed out ect but when those 7800x3ds drop i might just be tempted :)
The other obvious benefit to going AMD which I'm disappointed they didnt point out is that AM5 has a long life ahead of them where the current Intel socket does not.
@@Jerry-zz2eu I got a 1600x, thinking that I could always replace it with a later generation in a couple years. Never bothered, and never will lol. I'm looking at getting a decade or more out of this system before building another.
Upgraded from a 3600 to a 7600x last week, mainly because the cheapest motherboard for 13th gen intel with ddr5 was $200 more than the cheapest AM5 board I think the i5 13600kf was a bit more than the 7600x too.. Anyway very happy with the upgrade, was significantly more of a performance boost than I had expected.
The cheapest X670 motherboard for AMD is $260 while the Intel z690 ddr5 version of that same motherboard is $160. If you couldn't find an Intel motherboard that is cheaper than any AM5 board, you didn't even try to look.
@@omnihein9322 Just seems really really odd being that pretty much every reviewer from different parts of the world keep talking about the higher cost of the AM5 platform because of the motherboards are more expensive. So it makes me wonder whats the deal when a vast amount of information including what is easily available with a quick search online contradicts a comment about motherboard costs.
4:00 i definitely remember Linus remarking years ago how "it wasn't lost on him how filling all your expansion slots was the mark of a baller machine" or something to that extent
Man, when the deciding factor is sale pricing and stock you know that the competition is fierce
True, however a point that should have been taken into account is the power use of the systems. If the Intel system requires another 100W of power to supply near the same performance then it wouldn't require running the AMD system for long to make up for the initial difference in cost
@@nickryan3417 Yes, especially in those countires where electriicty prices have skyrocketed.
I miss the days when AMD regularly whipped the pants off Intel and was still massively cheaper.
These days it's almost a fair competition.
Yeah, but the one that is out of stock kindof already won ;-)
@@ShieTar_ depends on the amount of stock
Okay can we just hold up for a second to talk about how he got to build his own RAM? That's actually really sick!
I would much rather have a professional build my RAM I would most certainly mess it up
@@jrhowrey lmao im sure Micron walked him through the process and properly validated that it works, but the idea of having a piece of hardware that you had a hand in putting together is cool
DDR9-9000 cl69 🤣
@@ScottYarosh I wonder what the actual specs are
@@SoranoGuardias And to see how he dropped it and had to re-build the ram lmao
Both these CPUs have vastly different architectures, signal paths, core counts, caches, clock speeds, and instruction sets. They were designed and manufactured independantly by entirely different teams with next to no direct communication between them. Yet when they were pitted against each other, the difference is negligible at the abstraction levels of our minds. It just blows my mind just thinking about it
.
In the early days they were radically different however because of the standardisation of operating systems and architectures and new technologies etc they are for all intents and purposes identical apart from the label.
Bruh that's only one component. Take Formula 1 teams from different countries with different engines, different chassis, different drivers, lots of different aero and mechanical parts, different electronic and computational components, and they can be within 1 one-thousandth of a second over a 100 second lap, i.e. 99.999% identical performance. That's in the real macro world too in variable atmospheric and track conditions, not in a lab
@@Pax_Veritas Wow.. now that is a whole new level
Wait till homie learns what x86 is...
@@Pax_Veritas But F1 is artificially limited in many ways so you would expect things to be more similar.
There was no untitled goose game I’m disappointed
If they're this close on performance, you should have put an energy meter on the end of the PC plugs so we could see how much energy they used over the course to see if there was a difference there.
yeah but that similarity is only in the enthusiast end.... they have the same thread count, but the 420$ 13700k is arround the 550$ 7900x... and dont even talk about the 13600k being better in every possible way than the 7700x for 150$ less, and mb cost...
@@AdrusFTS Not that it matters much to most if you’re building a new system, but isn’t the 13th gen platform dead after this generation? The AM5 motherboards should be able to be carried forward when 8000 (or whatever) series is out in a couple years. Doesn’t matter to most of course, but that was one lovely thing about my AM4 2600 build, going to a 5600X later on
@@Girvo747 I recall seeing rumour 14th Gen may work in the same socket, so maybe not.
@@Girvo747 yeah a x570 is only costing $100 dollar, so just to save that $100 motherboard cost in future? Not worth it
@@tissueoflies2780 it’s not, but 14 gen has a brand new structure may beat the sht out of amd
Linus: *I need a new CPU.*
AMD and Intel looking at each other: *This building ain’t big enough for the two of us.*
I mean, for this video it literally was big enough for both of them
Until the 7900xtx beat the 4090 and Intel i9 back in stock soon
@@giaopx amd owner here, dont think the 7900xtx will beat the 4090. might be close, but youre hyping yourself up too much. it would be really cool, but amd has a lot of performance to make up to get up there.
Well ltt has 2 buildings now so they can just be civil and keep out of each others way ☺️
@@dylanneff8338 To be fair, the price-performance difference will likely make the 7900xtx the better option for gaming, but, for users that want to do streaming, or 3d modeling, or any other task that requires a GPU that isnt gaming, Nvidia has been the better choice for years, this is related to encoding, drivers and architecture. All in all, it comes down to what the buyer wants to do with the GPU itself.
And about the CPUs, they forgot to take one little detail into consideration, and thats upgrade path, AMD is known to make their platforms last many generations, while Intel changes the platform every other generation. Meaning that 14th gen will be in a new platform, while 8th 9th and 10th gen will still be AM5 for AMD (if things go as expected). So in the long run, AMD is the more logical option for someone that expects to possibly upgrade in the future. But, we are talking flagship CPUs, both 13900k and 7950X will be incredible CPUs 2-3 years from now anyways.
The fact that the 4090 alone draws more watts than my entire computer is just nuts
Lol
@@SK-uk3qh same my CPU draws 236 watts and my GPU draws 204, whole system never uses more than 500w under full load
My cpu barely draws 100 watts
Why dont they try the AMD Threadripper💀
bro i got a i5 650 and a gtx 650 lol chill
I ended up deciding to go AMD simply for one reason: Longer socket support. They just started doing AM5, when Intel has switched sockets every couple years now to force mobo upgrades.
7950X for me.
Not just that but they give new features with each new chipset and honestly my 13900kf will last me the next 5-6 years no problem by time I upgrade I will just get a whole new rig
Simpler organization, too.
I was on AM3+ until about a year and a half ago, and I went straight to AM4. It was as simple as pouring milk into a bowl of cereal.
With Intel? I'd have to first get around their confusing AF naming scheme for their processors, the find out what socket and motherboard goes with said processor... nah. Just nah.
@@theengineer2017 That is cool and all, but I like to stretch 10+ years between buying new rigs. Had intel for awhile, but doing just mostly gaming AMD is the clear choice here.
i went intel since the i9-13900 was on sale for 360 bucks
@@EzrealLux I went amd since I watched the 7800x3d video. I saw the benchmarks and the prices and noticed a clear winner. Gonna step up from a xeon 1231 and im already very excited. I bought a aio, trident z ram @6400 16x2 and msi b650 tomahawk. IDK about the ram speed because everyone is using 6000. I hope it goes well. Any thoughts or do I need to buy anything else?
3:45 Yes, someone said it!! Back in college I had a gaming rig with an ATI Rage 128 for the 2D card, Voodo2 SLI and a sound card, filling up all my PCI slots. I had one ISA slot which I put a modem into. It was beautiful.
This is the way, you paid for all the slots, you want to use all the slots! 🤣
How did you run the ISA card with the PCI slot that uses the same hole in the case?
@@Incommensurabilities If he had a full size atx board and the ISA slot is the most outer slot it is not overlapping. I had a similar board back in the day and also used all slots 🙂
I have also acheived the dream filling all the slots on my trx40, also my second dream of filling all the front slots 😁 (akiwa ghs-2000)
Ah, the memories... My first PC I put together: AMD K6-2 CPU, RIVA TNT (2, I think) AGP graphics card, a Soundblaster soundcard, 14" CRT monitor, Firewire card... I don't remember which board, but I do remember it was quite expensive (used for a review for regional the BUG - PC magazine).
Would love to see the power consumption compared in these equally performing pc's
As far as these architectures, Intel wins with idle power draw, AMD at load.
@@tardvandecluntproductions1278 full core loads. given that he's gaming and not full time cinebenching, they are almost identical.
@@tardvandecluntproductions1278 Now I gotta calculate if I'm gaming more or an idle boi
As mentioned, almost the same at idle and gaming loads. And for productivity you would need a more nuanced look at power versus perf (and even then if you’re making money through your computer you probably don’t care about power and just want raw perf since even in high energy cost places the difference in yearly energy bills is not immensely significant)
@@Jordan-ru8yf "since even in high energy cost places the difference in yearly energy bills is not immensely significant"
you kidding right?
The editing and camera work in this episode is AWESOME!!! Having two cameras made this feel so cool and interesting compared to other episodes.
It's funny because you used a clip from your pretty old $2500 rgb pc build, and it was nice to see that. That was probably the first video of yours that I watched and has kept me around for a while. Love the content, always love it
Linus lying on the ground incredulously reminding us that time is a flat circle is a genuine moment
I was listening to the video not watching can you post a time stamp I gotta see this 😂
9:20
@@loYolVibes
9:25
I gotch'u
He said floor time ^ I did not hear flat circle on time stamps ^ up above 🤣
@@digitaltactics9234 said floor time and then he lied down on the floor and said 'time is a flat circle' bozo
These builds basically just epitomise "Kid goes onto online pc builder and chooses all of the most expensive options" and I love it.
I'm no kid, but I am about to build my first PC and that is basically exactly what I've been doing. Now I'm just trying to cut back to a level I'd still be happy with while saving a bit of cash.
I’m not sure a kid gets to build his own RAM 🤣
When I was young I used to think that having a $70k/yr job would mean you could build at least a $60k PC every year. Hahahahaha. i cry.
@@quoththekraven5911 Good for you
@@astronichols1900 Hahaha more moneu more problems
The 4090 seems soo thick in a real build... As if someone accidentally extruded it 1 [inch] instead of 1 [cm] in CAD...
😱
Ja, szinte mint egy Halas konzerv
That's what she said.
I always roasted the 4090 for its size, then I got a 7900 xtx red devil, without looking at the dimensions prior to getting it... its bigger
@@xspt5019amd loves the chonks
When it comes to CPU usage in gaming, I think you should consider testing Planet Coaster, which even if it's from 2014, because of the Cobra Engine's limitation it's actually a Core-hog when having huge parks and especially when fully filled with all the guests the park can permit at max. Just sayin' to take it into consideration! This why even those channels that present created parks in Planet Coaster actually limit the number of guests or close the parks all together to get that framerate running right.
correction, 2016
they didn't even fps_max 0 in cs go and you're telling them to download a freaking cpu intensive map on that game? 😂
- Says they need a new CPU
- Builds an entire new PC
And yeah, this further shows how *MASSIVE* the 4090 is.
every PC build video for the foreseeable future will forever be overshadowed by how large the 4090 is lmao
My favorite 4090 meme is the one where it's the actual radiator outside for a house lol
Massive, cool and silent 4090. I'll take it!
4090: "Motherboard, look at me. I am the motherboard now."
The air cooled 4090 is bigger than that Xbox series S that's just insane
Given that amd kept their word on the lifecycle of the AM4 platform, going AM5 means having a platform that will be around for a while with easier upgrading in the future.
They didn't on threadripper unfortunately, but the market dynamic for that kind of explains why. They literally can't make the TR Pro fast enough to fill the backlog so why would they make a lower tier part.
It's a 7950x what are you going to upgrade too??
"Yeah I really need the 9950x in 18 months though, its like 10% faster and with the same amount of cores."
All I see is AMD fanboys talking about how "eco friendly" AMD is because you use less power and the same board for upgrades
You wanna be eco friendly, stop upgrading your CPU every 20 months, that and if you are dropping about £2000 on a new PC, saving £100 now or £200 in 3 years means nothing.
You can always flip motherboards if you decide to upgrade, and intel has more room for ram upgrades in the future unlike AMD (which does give benefits, and RAM is quickly getting cheeper/faster). Further more most people don’t upgrade every 3 years, even if substantial performance gains are achieved in that time. Not saying longer platform support isn’t a nice feature, and intel should adopt at least one more year for their platforms. But in general I would always buy the better CPU for the cost at the time rather than planning for a theorized future. Until recently that was Intel, although with the price cuts on AMD it’s probably winning currently! Competition is nice
@@Daeyae Why would 2 generations from now be only a 10% improvement? Not even including clock increases zen3->zen4 got a >10% IPC increase.
@@Daeyae I've upgraded 2 times in the AM4 cycle, I am also unbelievably lazy so personally id take the AM5 so I don't have to take the board out and sell it, which I would also take a long time to do.
The longevity we enjoyed with AM4 would push me to AM5. I went from a 2200G to a 3600 to a 5900X on the same MSI B450 ITX board.
I went from a 3400G to a 5600X on the same A320 asrock board
i staying with AM4 for a good long time i just built at AM4 computer before the AM5 was out
2700x to 5800x. Asus b450f, just surpassed 4 years. No complaints here!
This needs to be brought up way more often than it does. Being able to swap in a Zen 5 and even Zen 6 chip later down the road alone gives AMD the win even if AMD was more expensive.
@@BeautifulAngelBlossom Yeah me too, My 5950X + rtx3080 should serve me well for years to come.
Should also consider the new AM5 socket will go on so you will be able to upgrade your AMD cpu in the future without having to change motherboards, while intel is changing sockets after the 13900K so you’ll need a new mobo for the next gen intel cpus
This. 1st gen AM5 is on par with Intel's best and yet Intel's socket is already EOL. In a couple of years time Linus can simply drop in a 9950X3D after a BIOS update for no doubt huge gains all whilst keeping the motherboard and ram in place.
realistically, almost no one I know who buys the latest and greatest CPU even cares about keeping the same motherboard. The CPU upgrade is almost always tied to getting better feature sets that newer motherboards offer.
@@scythelord I got a 13900K and by the time I want to upgrade it, I'll upgrade the entire system. CPUs can last years without issues.
@@oktusprime3637 which cooler do you have ?
That only really comes into play if you're buying mid range now. 13900K will easily carry you for years.
Honestly, I'd love if something similar became a labs project. Getting performance numbers for as many steam games as possible on intel/amd cpus and nvidia/amd gpus would allow for gamers to figure out what hardware would work best for them!
Would also go well with the MarkBench program!
I hope they see your comment
I would love to see some single thread limited/unoptimized games benchmarked. I've never seen anyone really do this before. Unity games like rust on a map with tons of buildings or far cry 3 for example. Gta online is a good example as well
To the top with you so the LTT team sees this!
just go to hardware unboxed dude. nothing against ltt labs but stop wasting ur time waiting.
It's crazy to think that at this high of a level, it can be a toss up either way in performance and price.
I dunno... on a money's no object build, dropping four figures on storage because who even knows why, I'd expect them to be pretty close.
@@UKCougar if you looked at cpu performance over the past 10 years, AMD hasn't been even CLOSE to intel in performance when you compare their top end chips. Previously, in a money is no object build, AMD wouldn't even be CONSIDERED because of how trash it was.
@@kasuraga Perhaps, but in performance vs cost AMD has been mullering Intel for ages.
@@UKCougar I wouldn’t say for ages, unless we ignore Bulldozer
Yeah.. one of these is a certified space heater though.
A power consumption comparison would have been interesting 🤔
Idle Intel, Load AMD
@@derspielographdsg7435 I like graphs :)
who the fuck cares about power? yall will use any mynute talking point to save your brand loyaty
@@FaZeredemption3 power consumption is an interesting point when it comes to a cooling solution.
All the dude said is that it would be interesting, chill the fuck out lmfao
@@happybobyou they both run hot as fuck. get a good cooler. solved
Something else to take into account is longevity. If you are buying top of the line processors then you probably like to stay on the cutting edge, which means when the AMD R9 8950x and Intel I9 14900 come out you are probably going to buy them. But the difference is that the 8950,9950 and 10950 or whatever they will be called probably wont need a new motherboard every time and intel will. Food for thought price wise.
I bought a top of the line processor so I wouldn't have to build a whole new computer for several years. I'll max out my processor and MB on quality, then swap the other pieces as needed for 5-10 years, then rebuild a new monster and start it again. This time I went with the 13900K and the 4090, so we'll see if it lasts like my last build did.
@@Rush2201You literally have the CPU and GPU shown in the video? You must be living the dream.
No, I pick high end CPUs and keep them for ~5 years. Hate working on CPU, mainboard, cooler and cable management all the time. I have 14900k + 4090 + 64GB DDR5 + 10TB PCIe 4.0 NVMe in my system and the only thing I will change in the next 5 years is the GPU.
You should've compared both systems power consumption. It might make a difference in a long run in power bill savings!
Anyone using these high end pc's don't care about the power bill because it's bad no matter what.
The performance was so close. It was within the margin of error of measurement and the margin of variation of the silicon lottery. Would have been fun if they swapped GPUs between the machines to validate
the power differences betwen the computers is literally nothing compared to the power usage of any other high powered device like your air conditioner or washer/dryer, dishwasher, etc
I really want to know the power consumption of high end cpu
I did the math for my CPU(5950x) and based on 5 hrs a day at load (0.11c/kWh) I could save nearly 50% on my energy bill compared to a pig like the 13700. That's based off a single component.
when linus pulled that first 4090 out of the box, i genuinely thought it was a toy for like a joke about how huge it is. i've already seen it on video a bit, but i did not remember it being this big
thats what she said
@@rustler08 lol no.. where do you get that
The water-cooled versions are smaller, I guess the time has come for water-cooled GPUs only in the future.
Now they're both too good, We need a third player to keep them in the lines of sense 😄
Technically, there is Apple M2.
But yeah, it's uncomparable at this point still.
@@h.m.chuang0224 yeah the M2 is a Great CPU for completely different reasons, but hey the current x86 architecture is probably gonna hit its limit soon with CPUs getting small enough that Quantum tunneling is a real issue and apart from some architecture upgrades theres not alot left for x86 which is already 44years old now, probably about time to start looking for a better solution, something similar to the much more modern ARM architecture?
Be cool if snapdragon made its way to desktop format
@@FacialVomitTurtleFights Snapdragon 8cx and 870 have already made their way onto laptops. I'm guessing they don't have partners and the PCIE lanes aren't enough, so don't expect it on desktop anytime soon.
@@avixs1543 looking at how big Apple Silicon dies are, they will hit the same limit as x86. By that time, the only deciding factor is going to be architectural benefits of each arch. I don't believe that there's magic in this world, so the best case scenario is ARM chips being about the same performance as x86.
Holy crap, I didn't even know 8tb m.2s were out yet. That's more storage than my entire machine. lol
Now I want to see a video about filling all possible motherboard slots/connections and see how the device performs
Get a sound card maybe?
I think the hardware from the "crazyiest PCI cards on AliExpress" are in a box in Linus' warehouse
Gotta be a full ATX then
@@yerttttt No because a 4090 takes 3 Slots and probably gets in the Way of the Fourth. It has to be a Graphics Card for two Slots.
As a ex-BIOS engineer who worked closely with Microsoft kernel (ACPI) team, then had a long career in mobile architecture (particular in kernel and power management), it's sad that how they just didn't address power properly. I've worked closely with Microsoft on power optimization back in windows mobile/windows phone days, so they clearly have the knowledge and we've developed techniques that Windows can work just like what mobile device can... I guess that knowledge is lost in the sea of Microsoft engineers...
Microsoft simply doesn't believe in different limbs of the company being able to communicate with each other. They actually encourage the silo'ing of departments in their organization products(AD, etc).
as a former MS engineer, can confirm
@@shabadabadoo4326 DoD simping is so common for the pre-y2k mega-corps
@@shabadabadoo4326 That's how they do it in the CIA, FBI etc. This way people have no idea what's actually happening in the grand scheme as everyone is working on smaller projects that contribute to the whole. Easier to get away with shady practices this way.
im not surprised 10,000+ "professional" engineers could f something up. sometimes you go to work just to chill and eat, and post teaktoaks
Would have liked to see power usage as well. If you look at initial investment costs, you might as well look at the monthly running costs.
They're more or less the same in actual workloads. Max draw numbers are meaningless, no one is running synthetic benchmarks 24/7.
@@MTGeomancer How about people that run those SETI@Home and Folding@Home programs? Or leave their computer running with torrent uploads going 24/7? Or crypto miners. Or people who's photo and video exports take >1 hour. Or people that run website/media servers on their main computers. Might be valuable to those groups.
@@MitchJT Intel is going to be better for steady background workloads using the E cores.
@@MitchJT who bothers folding or mining on a cpu though?
It's so small (especially assuming you're not using it 24/7) that it really doesn't matter
Seasonic kicks ass. I still have my original power supply, which has outlived 2 motherboards, 3 CPU's and more than 10 years use with no sign of letting up.
I have one Seasonic that has been going for about 10 years now. :P It outlasted a few newer Corsairs, though the older ones were also made by Seasonic as the OEM, back when they also lasted forever.
9:59 Actually the 6800 MT/s kit has a bit lower latency, because memory latency is in clock cycles. The numbers may be higher on the 6800 MT/s kit but due to the higher clockspeed the effective latency in nanoseconds is a tiny bit lower for some of the memory timings.
interesting
would be nice if you would compare the electricity usage for boith systems. In some parts of the world (f.e. germany) electricity prices can get over 0.5€/kwh, so that could be a major factor.
ok, but then what else can you do at 0.5€ an hour? lie in bed?
And remember that those wattages they talk about are max draw, when under load. When web browsing they can easily draw an order of magnitude less.
That should be a default part in a benchmarking for sure.
Power usage is such an underrated metric. If I can use way less electricity for near-same price and performance, I'll absolutely do it. There are cost of ownership and eco-friendliness implications here. And the upgradability factor favors AMD's motherboard which has additional cost-savings and eco-friendliness implications.
They spent $3100 each on just the GPUs and SSDs, and you're wondering about electricity prices? This just isn't that kind of video. Gamers Nexus has all the numbers you want for each component if electricity draw is that important to you - these guys are just fucking around with expensive stuff for fun.
Sitting happily at 0.04€/kwh
Watching Linus put that 4090 down so firmly at 7:54 raised my heart rate a little 😅
It was "old" anyways.
Just goes 2 show you how being rich can just make someone so damn careless.
no one in my house would even dream of setting that equipment down with such FORCE
before anyone calls me weird for making it a big deal, that card costs more than the RENT DOES HERE. not even a newborn baby would
be set down as gentle as that damn card if I could only afford that
that card costs more than me and he set it down like that
this aint even about the card anymore just give me a second
im crying
Ryzen cpu looks cooler so it wins
tru
even though its gonna be covered by the cooler this is actually a way i would choose a cpu 💀
It's literally cooler
Intel CPU has real drivers not AMDip, so Intel wins.
@@iLegionaire3755chipset drivers are somewhat irrelevant, both are good
are you carrying over the stigma that comes from using an AMD gpu and "bAd DrIvErS", i mean sure, in the past, their driver support was awful (i.e on Terrascale and/or GCN), nowadays it's just as good as nvidia
but for midrange modern gaming rigs, get a 7600 and pair it with a 7700XT
I agree there is something very satisfying about having all the slots in your machine filled. You have exactly what you need and no excess.
Except for the excessively adorable cat in your pfp
That's what she said
I love filling my slots 👩
i too like filling all my holes
@@lavi688 🤨📸
AMD will always be Team Teal to me and I'm glad it's showing up more in their in presentation slides and, blessedly, it will show up on their products more too. :)
Bring back Team Teal!
Their logo is Light Green has been since the 386/486/Pentium days probably even longer.
I'd love to see AMD only for the reason for a follow up video to see how accurate AMDs claims are on for further support "till 2025" and possibly more. Also to see if keeping first gen AM5 boards will keep up (possibly feature wise) to newer boards in the future.
With the longevity of AM4, there's no reason to doubt that claim tbh
@@AngieYonaga Of course, AMD has proved themselves in keeping their promises but not without some troubles along the way. Remember the whole bios flash size debacle? Although yes, with this experience I doubt AMD would try something like that again that would get them flak.
@@NootNoot. Hopefully that's the case! No matter what it's good to remember that neither Intel or AMD wants the best for you, all they want is money and the only way for us consumers to not get screwed over is by making sure they deliver and riot if they don't
@@AngieYonaga That's true as well. Honestly, in a recession, competition couldn't have been timed any better. This video proves this and finally consumers may have an edge on the market unlike last year.
I'm glad someone else pointed this out. The failure to address the fact that AM5 will be around for a few years where as 13th gen's LGA1700 is likely to be the _last_ CPUs on this socket; feels like a massive oversight. If you are building a system from scratch right now, there is almost no reason to go Intel, you'd be shooting yourself in the foot.
Honestly at this point when both the AMD and the INTEL builds are so close to eachother in terms of performance… it’s just all about how much you can/wanna pay because other than that, it’s pretty much impossible to spot significant differences between them
You could also pick your favourite based on aesthetics i guess...
@@Patriot-Eaglehead It's covered by a cooler...
@@n646n Then pick the sexiest cooler.
@@Patriot-Eaglehead Then what does that have to do with intel vs. amd...
@@n646n You ain't supposed to pick anything by the brand. Pick the one that performs the best and costs the least. And if that's not possible, just pick the one that looks better.
Same thing with dems from team blue and reps from team red in "The American Sh*tshow: Russian Interference Edition"
But somehow people seem to take everything way too personally. The same people who call others "snowflakes" tend to melt down at the slightest hint of fire.
linus seems genuinely happy to be doing this with this person specifically.
he acts like an annoying brat. not very professional of him.
Who's the guy
@@millyyeasmin7904 i heard he was from Sugon
@@millyyeasmin7904 careful its a trap ^
@@Bebolife12345 I’m gonna spring it
Just got a 7950x, and it is an absolute beast. 32 thread rendering is just out of this world. And gaming is a piece of cake, even maxed out triple a games
Plus for socket support till 2025. Yeah, no brainer.
I wanted that chip too, but went for an i9 13900K instead.
@@thealien_ali3382 I do CAD work so a 7950x makes sense. But, yes, you will be absolutely blazing with 7900x.
Don't cry too much ,, you can upgrade to the 13900k later :)
@@badbutton5869 Finally someone says it - going for intel 13th because you already have DDR5 seems to be stupid because you will have to upgrade just everything and cannot upgrade the CPU. Unfortunately it wasnt mentioned in the video at all.
Also AMD has IO- and platform-advantage. Both things I'd look for in a PC I'm putting into a server rack
Really??
You mean 7950X have higher io than 13900k?
not much, 20 cpu pcie lanes from intel vs 28 cpu pcie lanes from amd. I.e. nobody of them able to provide 2x16 pcie slots. If you need tons of io in the rack - you getting xeon or epyc
@@casparhughey5651 the funny man has arrived
@@s.i.m.c.a That's 40%. Also I was specifically referring to Linus' usecase
OK I have to ask, why would you be looking at either of these for something you would be putting into a server rack? Just curious because If I was putting something into a server rack, it would be a server and that is a different class then what I would use for a pc. I just don't get the server rack reference.
For perspective, the last node I put in a server rack had 120 cores, 2TB of RAM and was connected to a 64TB SAN using 64GB adapters. I like the the AMD design but that doesn't come close to server class hardware from AMD or Intel.
The ultimate performance test for a cpu is going into After Effects, making a very complex project, with a ton of layers, motion blur, animations, movements and scales of all kinds and a ton of different effects added and pressing spacebar to see how the cpu handles the live preview. I barely ever see this kind of test. :(
They didn't use the AMD Threadripper tho
Also, with AMD you can probably upgrade your CPU after two-three generations with the same MOBO. This is going to be my first all AMD build ever, if the 7900 XTX can deliver what was promised.
yeap... thats the good part of amd... well, at least on AM4
Honestly an advantage for people who are buying entry level or midrange processors, but not at the top end. I bought Intel because I only upgrade my processor when I want new motherboard features like NVMe or the latest DDR memory version. High core count, fast processors should have a pretty long lifespan
Indeed, bought Intel 6700k 5 years ago and it is still good. Work and games sometimes some rts
@@Mournful3ch0 Sure, if you don't upgrade your rig that often it's not an advantage. I'm upgrading from DDR4 to DDR5 now, but I'm not going to wait 8 years for the next upgrade, and i don't think many top end builders are. Upgrading MOBO within 4 years rarely yields much performance gains. From the first high end CPU to the last on the AM4 platform on the same MOBO and DDR4, there is a 179 % increase, that's not insignificant.
I can't believe you didn't mention upgradability between the two platforms, next year you can just stick a 8000/9000 (whatever it will be) series AMD CPU into that AM5 slot and not have to upgrade anything else. You can't do that on intel. Not to mention the new 3D cache chips that are coming out in the new year, gaming performance will increase massively.
do we know that next gen amd will use the same socket and chipset? for all we know, amd could do an intel and force you to get a new motherboard every generation or 2 and intel could start doing an amd and support sockets and chipsets for more generations of cpu. I'd be surprised if either did change but we never know
@@nmills3 AMD stated that socket am5 will be supported until 2025. Granted not as long as am4 socket support, but still.
@@LifeOnTheSaddle417 AMD is previously gone back on its promise to continue support for socket
@@om8414 That had less to do with the socket itself
The problem then was the memory size of the BIOS chip or something of the sorts, and they couldn't support 1xxx to 5xxx CPUs on x370 boards. They ended up making it so that updating the BIOS would only support the latest 3xxx and 5xxx CPUs and remove 1xxx and 2xxx supports depending on the motherboards chip storage space
It's not like Intel who basically used the same socket pins (different layout) and pretty much purposely designed it to not work well with newer chips
They should've created a new socket but the heat from early Ryzen claiming support for years made it tough, so they kept the same socket and decided to call any future proof issues "for our own good"
@@om8414they promised to only support up to Ryzen 3000 on some boards and yet STILL updated it when people whined about 5000. Don’t act like they aren’t going well beyond what they should have to.
Intel doesn’t even come close to that. Please.
This video is great! I love the authentic just 2 dorks building crazy PCs together and gaming vibe of the whole thing.
Looking back at this, very good desicion with instability
That moment when you realise the SSD alone is enough for a normal setup
Linus, filling one's PCI slots is genuinely up there. Only once in my life have I had seven slots used by something useful, and it felt incredible.
That’s how I felt when tri SLI was a thing
I really like Nick Plouffe, he really is a great presenter and has good chemistry with Linus in the few videos they're both in. This is fun!
Good thing at last end with 7950x fast forward 1 year to now that i9 13900k probably keep crashing the pc due to the voltage issue and the manufacturing issue 😅
Built my 7950X system a few weeks ago. Absolutely loving it.
what are the specs if I may ask?
@@rabbychan 7950X, ASRock X670E Pro RS, Samsung 980 Pro 2TB, Seasonic 1000 Gold, GSkill Trident Z5 32gb 6000, MSI Mag 240 AIO, Fractal Pop Air, Windows 11. Still running my old RX 580 as a placeholder until December, plan to get a 7900 XTX. Want to add some storage as well but debating between another m.2 or a big HDD.
@@HardEarnedBacon Amazing!👌
@@HardEarnedBacon I'm very excited about the release as well, I'll certainly keep an eye out, I'd like to have a high end full amd build in the future if there are enough GOOD game releases that justify it.
@@rabbychan i build my first ever AMD Rig (comming from 9700K). My specs : 7900x / 32 GB G.Skill 6000 RAM / 3070 Rtx strix / Asus B650 A Gaming Wifi Mainboard
/ Asus Rog Thor 1200 watts / Noctua NH-U2A Chromax black cooler
I am actually so happy that both AMD and Intel are viable now for both gaming and working with barely any difference. I hope this trend continues and in the future we will see the same happening to the GPU market. Healthy competition is when both products are practically the same, because in the end the customers win the most
The history of Alt+F4 is quite interesting. I still fail to understand why anyone would remove this from a game or any software in fact. These days it's more hassle than it's worth 90% of the time as it's automatically assigned in most major game engines (Unreal Engine, Unity, CryENGINE etc). Heck even Game Maker has it by default!
Maybe because the troll thing to do was convince people to alt F4 lol.
@@JsGarage if you didn't know about it then somone telling you and you falling for it is doing you a favour because now you know about it and its not really something you fall for twice.
also it hardly counts as a 'troll' I'd argue system32 is more trollolol and even that is kind of passe today
When a game stops working it sometimes send data and error report back to developers in the background. Alt f4 stop that one from doing.
This one of that shity anti consumer things they do.
I've searched all comments with many different keywords, and nobody is talking about the Wizard of Oz reference at the end. Anyways, loved your Scarecrow impression Linus!
Lol, I just built my first new rig in years (2nd gen to 12th gen) I remember how careful I used to be building pcs even down to applying thermal paste back in the day. Now I just built up my most expensive computer so fast sticking it all in. Too many years watching Linus man handle pc parts 🤣
That sounds like my GF's rig. Their PC went from 2nd to 11th Gen.
@@jrbudoybudoy "Their PC " ? Does she suffer from multiple personalities or something similar or you was looking for "her PC" and couldn't remember the proper word?
@@alexandruilea915 it's a family PC. Ofc I'm using Their.
@@jrbudoybudoy You said "my GF's rig" not our rig or my family's rig. Girlfriend is singular so it does not make sense to use their. My PC can be use by my whole family as well but in the end is my PC not our family's PC because I am the main user.
@@alexandruilea915 'their' needn't always be plural. English isn't as rigid as you think.
Hey Linus, about that Corsair AIO, there's a known issue that causes it to freak out and tank your lighting with a false Pump Failure Notice that eventually just causes your system to safety off or some shit. I'm currently having mine RMAd. Beware my friend.
Yup, built a few systems with that issue. Just another reason to use tower coolers.
A power draw comparison would be nice =)
But why? It's a desktop.
@@gizConAsh because we pay for electricity? It doesn't mater if it's a server or desktop... In fact, desktops today draws way more energy then anything else, and is not that electricity is getting cheaper...🤑
You save 50$ on the build, to pay 100$ on electricity. That's why we need a power draw comparison.
I personally don't care about 5FPS more. Not gonna make you the winner anyways if you suck. Take Linus* as an example. 😂
For power draw I would go amd
@@gizConAsh ah yes. Desktops don't face thermal throttling or overheating.
@@gizConAsh Mom still pays your bills, eh? lol
18:44 Linus going complete Penguin. 🤣
17:10 Gamer mouth movements
18:11 that surprise announcement LOL.
One BIG! difference between amd and intel aswell is that am5 is a new plattform, so you can keep youre Motherboard for coming CPU Generations, on the other side is intel Rapterlake with socket 1700, that is probably the last cpu that runs on this Motherboards.
(sorry for my englisch im from germany... haha)
Linus cackling, while lying down on the floor screaming "TIMESAFLATCIRCLE" really buttered my chops.
I’ve watched a lot of Linus and the gang, a lot.
This is the 1st I’ve been able to say “I have that.”
I use the same Corsair Elite Hi150 AIO cooler for my i7 9700k.
Good to know it comes recommended for newer, and more powerful CPUs.😁
I have an N64, they had one in one of the gameshow videos
I can say it as well... "I have that 4090FE."...
Yeah unfortunately the H150i doesn't handle my oc 7950x terribly well, even with thermal grizzly kryo extreme. It sits at 85C on full load sucking about 200 watts at about 6 GHz. I would hate to see how hot the CPU stays with the 13900k. Tbh tho I debate doing a delid or liq metal to lower temps.
@@chrispersinger5422 No that's actually quite ok. Zen 4 is designed to thermally limited, so it'll boost as much as possible and stay at 95C. If you want better temperatures, you need better IHS or direct die cooling with a good waterblock.
@@Chopper153 yeah true, that extra thick heat spreader is kinda killing it's amazing perf there
Been craving a Linus build video for a while now.
Another thing to consider would be power consumption, was really hoping to see a power consumption comparison between the 2 systems
If you cranking 4090s power use isn't even a thought, I understand your point though
One year later, yep, AMD was the safe choice.
Would be cool if you did the power draw at idle, at gaming and just browsing the internet.
The Voodoo 5 6000 was well ahead of its time, which is why it needed an external power supply plugged into it. That and it never ended up reaching the consumer market due to 3DFX going bankrupt :(.
@@sirsneakybeaky Just great for consumers with more competition.
I've seen comments that Zen 4 feels crazy responsive in windows (J2C I think) - so it would be really interesting so see how the two compare in everyday responsiveness.
quite a few reviewers said that
Having just built a 7900X system it's my opinion that's not the case. Just upgraded from my ancient intel 4700K Windows 10 to the 7900X on Windows 11 and I really can't tell the difference in general windows use, I'm even running m.2 drives now and starting up browsers don't feel any snappier, which is what J2C was saying.
@@TangoMikeOscar The reason may be the Windows 11 and its stupid security quirks. My friend bought a laptop with Ryzen 5000 for his wife and he had like hilarious performance on W11 like waiting for start menu to open for 2 seconds each time. Just installing W10 made so huge difference that we even did not believe our eyes at first.
I have the same dreams bro. On my old mobo i slapped in a bunch of pcie to sata adapters and made a nice NAS & home server with that. Feels good man
i do agree here, using every slot on the motherboard kind of is the dream
back when i bought my mobo i was like oh sweet look at all them pcie slots theres 6 of them and support for 3 way sli yeah! its been six years and ive only ever used 2 of them, one of them for a sound card i havent even used in at least 3 years.
I am using all my PCIE slots..... on my ITX board 😂
You guys should do a 13th Gen comparison between Windows 10 and Windows 11. I’m planning a new high end system, and would really prefer to run Windows 10. What has me leaning towards AMD is the fact that supposedly Windows 11 is needed for much better utilization of Intel’s E cores, and it’ll be interesting to see how the second generation E cores react. Gaming may still be a tie between the two platforms, but if I’m paying for all the cores, gosh darn it I want to use all the cores.
You are right in your thinking. Windows 11, if you are team blue, Windows 10 for team Red. Windows 11 is still crap on AMD. Weird jitters and hangups on AMD for Windows 11.
Or just run Linux and choose whichever you want.
@@slash2bot if you play games Linux should just be out of option
@@mrpiratefox4497 you can game on linux lol. it just depends if the games they play.
If you love a particular company, you will always find a way to justify the purchase, regardless of the facts.
Don't know why you'd prefer Win 10. Tabbed explorer is a godsend in Win 11.
If you really want to gimp yourself hard just so you can use Windows 10... go AMD I guess.
to be fair when your spending around 3.5-4K on a computer 50$ is basically nothing
Yea, and efficiency of cpu also dosent matter when you are using rtx 4090 😂
18:30 It feels like Linus just woke up at 5am after hearing the price cut
Mad respect for keeping the competition fair
This was really entertaining, I like seeing Computer Drag Races, you should make that a THING. 😁
Considering how close the performance and price were before sales/pricecuts the choice doesn't really seem to matter. Unless you're planning to upgrade the cpu again soon, assuming intel changes the socket again for 14th gen, amd would be the easier choice
Tbh not when you plan on using the cpu for the upcoming 6 years or so. Then it wouldn't matter bc both platforms will have new sockets by then
I just found out how to determine APIs within games. Have separate hardware. GCN 3 is good at DirectX but bad at OpenGL,and vice versa for Nvidia's Fermi architecture. NVS 5200m (1080p) performed better on Sonic Utopia than the A12 9800e did (900p). Project 06 ran better on the AMD APU (720p) than the NVS 5200m (1024x768). I found this to be interesting as no one should make the mistake of buying the wrong gpu for the wrong API. Depending on what API your games run on you should buy your gpus that excel in those areas.
Yeah but things get a lot less serious on that front when we’re talking about modern mid to high end cards and modern APIs
The dream is everything runs on Vulkan
Or you know, just open the game's installation directory and find the DLLs that it is using, which should be named something something vk for vulkan or dx9/11/12.
MSI Afterburner: am I a joke to you?
You've got a lot more variables when comparing an APU to a dedicated graphics card though. Memory bandwidth is probably the most significant in that regard, as pretty much every APU will be memory bottlenecked since it has to fight the cpu for every bit of bandwidth there is. How much it will be bottlenecked depends on the game or application you're running, but it's not necessarily related to the API. Drivers are another variable: an APU will probably get game optimized drivers, while I'd imagine an NVS probably gets workstation optimized drivers. Different architectures can absolutely perform differently on various APIs even when the gpus are generally similar in performance, but I'd say you really have to look for a more apples to apples comparison to know for sure. Ideally you'd want them to have a very similar memory configuration as well (both having very similar bandwidth and either the same amount of vRAM, or making sure none of the games you're testing exceed the vRAM limitation of the card with less vRAM)
great video guys! lol " TIMES A FLAT CIRCLE!" loool
Dunno how long ago you recorded this but according to nvidia (see gn deep dive) the issue is not fully engaged connector rather than bending the cables.
They still covering them under warranty, but it seems its quite easy to not fully click some in and then when cable managing it can work its way looser without being very noticeable.
With the reduced contact area on the pins, it increases the resistance thus the meltage.. (and the telltale marks that show a connector that was used without full engagement)
Might be worth an editors note.
That's quite possibly the worst excuse I've heard from a billion dollar company. That's bad connector design If that's the case
Man I love filling all the slots. Especially the old PCI that has to split speed between them. It was not enough though so I put in two proprietary PCI splitters from old Optiplexes
You sure like your slots filled
I would love to see more of this content but at different price points/ tiers of the build
"the Ryzen's IHS looks cooler"
well... in the end, i have never seen a Computer where the CPU is visible _because that would mean you're not cooling it_
FYI... In USA / Canada our Breakers are 15 amp rated at 120v nominal for common outlets. That works out to 1800w maximum continuous load per circuit. (there are many other circuits at higher amperage and loads... I'm only discussing a typical residential outlet.)
Not enough volts.
Over 3KW from a standard residential socket here. Even more in Europe ~5KW? IIRC.
@@bassplaya69er How many volts are in your plug? I'm around 126-127... You could go 20 A circuit if you ran #12 for like a kitchen or something special...
@@kleetus92 230v - 240v, on 32A circuits is standard for domestic here, (UK) however every plug has a fuse inside up to 13A so just over 3kw per plug.
@@kleetus92 The spec calls for 120v. The voltage can range 5% before it's considered a problem. 127v is higher than usual.
@@bassplaya69er Linus was worried about 2 gaming computers on a standard outlet and was advised a maximum of 1500 watts. Residential breaker panels are 240v with 120v and 240v circuits for appropriate loads. Air conditioning, dryers, stoves and other high load items run at 240v and appropriate amperage breakers based on the wiring. Unfortunately we do not have fused based appliance cords.
98-99% GPU utilization pretty clearly tells the story, it's not the CPUs that are the limiting factor, at least for the fps.
100 percent true playing on 4k max settings is mainly reliant on GPU if they wanted to really test CPU they should play on 1080 p or 1440 .
@@trevorallen8514 Yeah this is weird to me. Basically every professional CPU comparison test is done on low resolutions to actually test the CPUs. Often at 720p even. At 4k you could even run a 5800x3d or something and it would probably get the same FPS as those cpus. At 4k it's pretty much only your GPU that's working its ass off.
Testing CPUs at 4k res on a "tech tips" channel makes me really question Linus now lol
@@DonDadda45 don't question it alot of videos they make are purely for views and alot of times for people who dabble in this and that . Stuff that catches there eye or sounds exciting . But most def after the architecture or world and people and all the polygons are rendered up which mainly relys on CPUs everything else is enhancing the graphics or adding more texcture or lighting effects etc that mainly relies on the gpu it is weired tho a game in 720 or 4 k will get the relativily same fps ( give or take a few ) when you have the same gpu and start swapping cpus at high end atleast. even at high end almost every GPU thats a step above the one you currently have makes a noticable difference when swapped out while running same cpu.
@@DonDadda45 because this is how average consumer will use his/her computer. For other tests you can see raw benchmark numbers, but running game on 720p is not real world usage. So I actually appreciated that I saw few fps difference, because it tells how whole platform is doing, not just cpu
@@Drvo3 You don't understand. At the resolution he tested it at, the CPU barely plays a difference at all, thus making this whole review almost useless.
You'd be 100% right if this was a "system vs system" video but it's not, it's a CPU vs CPU video. The review fails in that regard.
If you want to talk about real use: If you play on 4k it hardly matters at all which CPU you have as long as it's not a bottleneck. AMD, Intel, 3d.. doesn't matter, your graphics card is doing 99% of the work.
You'll also be able to upgrade to a 7800 3D within the next 6 months, or the 8000 series in a year or two, and the 3dvcache version of that architecture.
Definitely the better choice for a gaming PC. Although if that's the plan then you wouldn't get a 7950X in the first place you'd get like, a 7600X
@@xTheUnderscorex I agree for normies like myself, but Linus is a multi-millionaire tech reviewer, so it's possible he'll get the new CPU when it comes out.
@@ArbitraryFilmings true, if you seriously multi task you want if not need more cores, but if like me you don't then 8c x3d was better. But now temps are high in AMD 7000s, did they turn down its mhz to reduce heat like they did with the 5000s?
9000 series, remember amd ues the next number for the mobile chips and apu to avoid confusion. 6000 series being the current mobile zen3+ product.
@@tommyrotton9468 Iirc AMD will boost it's clock until it hit's the T-target and then slowly drops back down towards the stock clock until it finds the sweet spot that keeps it at the t-target.
I would've loved to see 7900xtx paired with 7950x with SAM vs 13900k paired with 4090 with ReBar
I would really enjoy a performance comparison when running Windows 11 Ghost Spectre with these machines.
Did ghost spectre fix the activation bypass yet on windows 11? Have been running windows 10 version for over a year now and it’s an easy 10% boost in fps in all games.
I am pretty sure they wouldn't do that because of security concerns.
This was really a really good one guys. I haven't owned a AMD build since I was a kid like 20+ years ago. Maybe its time.
I have an 5900x for 1 1/2 yrs and i can tell you that i have no regrets of doing that. Obviusly it depends on what you want for but, at least for my use (high quality (not extreme) gaming and regular use) its really great
@@fortunaf3 I did consider doing a AMD build, but a shop here did a really really good deal on intel, so I to jump on that. Next time, AMD for sure.
@@Skinnypuff really good for u!
Im pretty sure i would go for the pest price as well, since both of them are great right now!
yes it is
I love the guy that come in to close the door at 14:51 as if to say "You kids are being too loud" and closes the door :P
19:37 - Made me chuckle
So hyped to see the 7800x3D.
Honestly still can't believe the performance of the 5800x3D
Swapped my 3700x to a 5800x3d. The difference is night and day, even with a 5700xt GPU
@@jensfosbk1601 I honestly never took any notice of the x3d, i just assumed it was a slightly tweaked 5700 ect. I only really realised how good it is when they showed it slapping the Intel 12th gens on a lot of benches ect.
Held out upgrading for a while, nothing i really wanna play games wise to be honest where i feel i really want to see this maxed out ect but when those 7800x3ds drop i might just be tempted :)
hilarious it will be limited to 8 cores...
It will available from scalpers only (for at least the first two years).
@@bigdaz7272 well it would be a tuned 5800x then ;-) 5700 is even lower :D but the 3D cache is massive
The other obvious benefit to going AMD which I'm disappointed they didnt point out is that AM5 has a long life ahead of them where the current Intel socket does not.
THIS. This is why I decided on AMD for my new build - I don't want to buy another motherboard in a few years....
a long life AND 3d v-cache cpus ahead
@@frederickmiller5492 Do you really upgrade your cpu every few years? I'm still rocking a first gen ryzen, lol.
@@CigsInABlanket I'm rocking a 4790k and a 6700k still. Gets the job done, haha.
@@Jerry-zz2eu I got a 1600x, thinking that I could always replace it with a later generation in a couple years. Never bothered, and never will lol.
I'm looking at getting a decade or more out of this system before building another.
And here we are a year later. Now, it would be an easy choice: the winner and the one you keep is the one that works. And that ain’t Intel, sadly.
7:54 that hurt…
Yep, jesus Linus. Why you gotta slam GPUs that cost as much as my entire PC
Now to test both CPUs with the RX 7900 XTX when it comes out. I wonder if that would make any significant difference favoring the AMD side.
I assume it wouldn’t.
should favor AMD cause of SAM
They didn’t mention if resizable bar was even on, and most mobos don’t enable by default!
Upgraded from a 3600 to a 7600x last week, mainly because the cheapest motherboard for 13th gen intel with ddr5 was $200 more than the cheapest AM5 board I think the i5 13600kf was a bit more than the 7600x too.. Anyway very happy with the upgrade, was significantly more of a performance boost than I had expected.
Nice man. I'm in a similar position. 3700X thinking either 5800X3D or big spend on a full new setup
The cheapest X670 motherboard for AMD is $260 while the Intel z690 ddr5 version of that same motherboard is $160. If you couldn't find an Intel motherboard that is cheaper than any AM5 board, you didn't even try to look.
@@mattgarside7181 5800x3d will get you great performance for games. I would buy one but I already have a 5800x and I use it for more than gaming
@@dracer35 wow imagine, every country in the world has the same prices.
@@omnihein9322 Just seems really really odd being that pretty much every reviewer from different parts of the world keep talking about the higher cost of the AM5 platform because of the motherboards are more expensive. So it makes me wonder whats the deal when a vast amount of information including what is easily available with a quick search online contradicts a comment about motherboard costs.
4:00 i definitely remember Linus remarking years ago how "it wasn't lost on him how filling all your expansion slots was the mark of a baller machine" or something to that extent
9:01 Linus said "power supply", I heard "sponsor"
Bro I was searching for this comment😂