Click this link sponsr.is/bootdev_GamerMeld and use my code GAMERMELD to get 25% off your first payment for boot.dev. That’s 25% off your first month or your first year, depending on the subscription you choose.
Thanks for your love of AMD... show people AMD is the no. 1 optimal CPU and GPU at a lower price.
I’m still using the 5900x and it’s still a very solid CPU.
Same, but on the 5600X. Definitely upgrading to AM5 when they have a good combo deal on a board/RAM/CPU bundle.
@@Dr.RichardBanks Same; AMD does what I need it to do, and the 9000 series looks interesting.
Newer CPUs don't make your old CPU obsolete. Remember, it's obsolete when it's cooked. The whole upgrade-path hype train is an absolute waste of money if we're talking about clever consumer decisions.
My husband still games on a 5900x and it works perfectly. I have the 7950x3d because I leapfrogged from an ancient Intel and I'm perfectly happy with mine as well.
I'm still on a 4790K. Upgraded from an A10-7850K, my board was limited to 2 RAM slots, so I took the 4790 when someone gifted it to me since RAM was the limiting factor and those DDR3 modules are so expensive, if available at all. My newest part is still the GTX 970, though. Still works for most games. Will upgrade soonish, though. Time for another 10 year investment.
At this point, no one really cares about what NVIDIA is doing unless you're an enthusiast with tons of money or it's something the market is looking forward to.
Nobody should care or buy, because they don't make them for consumers anymore.
Also, the power plugs were catching fire and silicon was snapping, on the high end.
The "mid-end" has crippled performance and is very short-lived for playing relatively new games maxed out.
The low end works decently, unless they lie and give a worse-performing card the same name, or gimp it with a narrow bus or too little VRAM.
@@bionicseaserpent yep. It's a prosumer, enthusiast, and enterprise-only brand now.
For me as an SFF PC enjoyer, Nvidia and undervolting go together like bread and butter, because Nvidia overvolts way too much.
Their power efficiency is insane, and they're the only viable higher-end option for an ITX case. Props to whoever engineered the Founders Edition cooler: not only are those cards smaller, quieter, and cooler, they're also sold at MSRP, unlike the similar MSI Expert.
AMD cards are way harder to find, and RDNA 3 is known for being inefficient and scaling very badly.
You sound broke AF
Calm down @GamerMeld, the reason for two 12V connectors is to mitigate the power connector burning and melting issues. They know that the 12V connectors can only sustain 300 to 400 watts at best; they just can't admit they were wrong.
So why not go back to the good old reliable PCIe 8-pin connector, if they're going to split the load anyway? They're inventing a less reliable solution when a very reliable one has existed for decades...
Then why did MSI design each of the 12V power connectors to deliver 600W? Would MSI just leave money on the table and give gamers a freebie? I don't think so.
@@tringuyen7519 I would think that this PSU is not aimed at gamers but at workstations that use multiple GPUs. Meanwhile, Seasonic announced a new PSU that could power four RTX 5090s..
You're so wrong; my 4090 has regularly done 550 watts over the past year and a half. Let's not lie and spread misinformation trying to act hard and pressure the YouTuber lmao
You're never supposed to run a connector at 100% of its power rating for a sustained duration. You should always design a system to use 80% of its rating, to allow for aging and oxidation of the connectors. So if it's a 600-watt connector, you should use two connectors for anything over 480 watts sustained.
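A quick sketch of that derating rule in Python; the 600W rating and the 80% margin are the commenter's figures, not an official spec:

```python
import math

# Minimal sketch of the 80% connector-derating rule described above.
CONNECTOR_RATING_W = 600   # nominal 12VHPWR rating (commenter's figure)
DERATING_FACTOR = 0.80     # design margin for aging and oxidation

def connectors_needed(sustained_load_w: float) -> int:
    """How many connectors keep each one at or below 80% of its rating."""
    safe_per_connector_w = CONNECTOR_RATING_W * DERATING_FACTOR  # 480 W
    return math.ceil(sustained_load_w / safe_per_connector_w)

print(connectors_needed(450))  # 1 -- under the 480 W threshold
print(connectors_needed(550))  # 2 -- over 480 W, so split across two
```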
Not touching the 90 series cards because of that stupid connector.
That's why there are two, so the power can be split.
Good. More stock for the rest of us then.
The only reason I'm not touching the 90 cards is one thing alone: that's double my rent.
@@OfficialDachia The 4090 is 4x my rent, if I think about it that way 😂
The problem is money😂 be honest.
The battery test comparison sucks. The Mac has the smallest screen at 13 inches, and as we should all know, screen size matters: the display is the main variable in power usage. They should all be tested with the same external monitor.
Copium when intel starts waking up 😂
I get what you mean, but ARM chips are not power hungry, and x86 chips have some improving to do in that area.
I mean, sure, but the difference in power usage between a Snapdragon and an M3 is enough to run a 27" display. I don't think 0.7 inches is making that big of a difference. Doing some quick math, that 0.7" probably equates to about half a watt. Hardly a game changer.
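For what it's worth, here's a rough sketch of that back-of-envelope math; every number in it is an assumption for illustration (panel power taken to scale with area, the smaller panel assumed to draw about 4W):

```python
# Rough sketch: extra backlight power of a slightly larger panel,
# assuming panel power scales with screen area.
SMALL_DIAGONAL_IN = 13.3      # assumed smaller laptop panel
LARGE_DIAGONAL_IN = 14.0      # assumed larger laptop panel
SMALL_PANEL_POWER_W = 4.0     # assumed power draw of the smaller panel

# At the same aspect ratio, area scales with the square of the diagonal.
area_ratio = (LARGE_DIAGONAL_IN / SMALL_DIAGONAL_IN) ** 2
extra_w = SMALL_PANEL_POWER_W * (area_ratio - 1)
print(f"~{extra_w:.2f} W extra")  # ~0.43 W, i.e. roughly half a watt
```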
Looks like I'm skipping the 7800X3D; I'll wait for the 9900X3D/9950X3D. My 5800X3D will be enough until then.
As a 7950X3D owner, I agree with this statement.
Same here. When the 7800x3d went up in price and Microcenter ended the bundle, I knew I would either cave for the 7600x3d or wait. Glad I waited.
Got a 5800X3D for like 120 bucks... I'll now wait for the 9900X3D/9950X3D as well.
Upgrading from a 5800X3D to a 7800X3D isn't even worth it, and tbh, I'd guess the 9000-series X3Ds won't bring a worthy upgrade either; the 5800X3D is just too amazing a value.
@@lassebrustad Going from a 5600X to a 7700X brought me a nice bit of breathing room.
I will upgrade to the 9900X3D once it comes down in price. A few more threads should help a great deal.
Bro, your favorite word is "BUUUT!"😂
Everyone loves a good but
Especially when copium is involved 😂
buuuette
It's crazy that graphics cards are using so much more power, yet still their performance improvements aren't all that impressive.
future proof?
@@narachi- No, it's exactly the other way around. There is no more power headroom in the current architecture. They're simply raising voltages and making bigger chips across the board.
The real performance advance now is in further improving the architecture's design, or moving to an even smaller process. Then you'll see performance per watt take another little leap. Then total power gradually increases again, and the cycle repeats.
This used to be less obvious because the best-performing GPUs in the mid-2000s only drew about 50W.
@@juliusfucik4011 That's what I mean by future-proofing: it's kinda hard to design a new architecture that will need more power if you don't already have the power headroom.
They've hit the limits of the silicon. Die shrinks and pumping absurd voltage into them don't work any more. More power just means more waste heat these days.
@@patrickweaver1105 Quantum 7000 series GPUs.
The battery usage benchmarks are bogus, or at least very misleading. The biggest power draw in light workloads is the display, and no display specs were given. However, the size of the display is telling: the AMD chips, with the largest display (16" vs 13"/14"), have the worst battery life. Go figure.
On top of that, the AMD Strix machine uses an OLED panel, which consumes more power than IPS.
LOL, the 5090 draws more power than your washer and dryer.
Nope, it wouldn't be possible, because ever since the 4090 the 12VHPWR connector has just sucked and forced Nvidia to change the connector's design twice. The way I understand it, Nvidia couldn't deal with "one" 12VHPWR connector anymore, so they took the easy way out and put in two of the same power connectors. I hope they take the hard way and reduce GPU power draw. Otherwise what you said could become true 💀
You need a direct line to the power plant, with a separate fuse in the electrical panel.. :))
I can't even afford an RTX 4080; I'd never be able to afford a 5090 plus a new PSU. I'll be passing. Also, I never overclock anything, so I'm not worried about whether the 9800X3D can be overclocked.
I honestly believe the 9800X3D is going to be the one that shines. All the other ones are just not worth it. Anyone with the 7800X3D right now will still have a beast of a chip in 3-5 years. Always skip a generation, I say.
To be honest, I would approve the new AMD CPU boxes if they were like the thumbnail of this video. That black and white is pretty slick.
The two connectors are for a total of 600W (the RTX 5090 is rumored to be 600W TDP) and not 1200W.
Doubt it. 5090 will dissipate more than 600W for 8K gaming. Nvidia fandom is stricken by FOMO.
@@tringuyen7519 good luck with 8k gaming loool ngreedia fanboys are hilarious.....
@@tringuyen7519 The RTX 5090 (or whatever it's going to be) is not going to hit that advertised 600W TDP, due to being CPU-limited in most games.
@@tringuyen7519 do 8K gaming monitors even exist? What would be the point of even running one over a 4K monitor? You won't be able to tell the difference, since the screen will be too small.
@@Fr00stee 8K monitors exist.
I have had many screens, 1/2/4/5K and different sizes.
4K at 28 inch is perfection at 100% scaling. Main game screen is 3440×1440 at 34 inches which is great.
Having a 5K ultra wide at 29-30 inch with 125% scaling would be the ultimate but for now they are not going over 60Hz.
Let's not forget that the Ryzen AI Asus S16 has a 16" screen. That's got to drain more power.
Two 16-pins? The barrier to entry for the Nvidia brand is now becoming a problem, and it's because of that connector. Most PSUs don't use that connector, and shouldn't.
I couldn't care less about overclocking the X3D chips; they'd better make those dual-CCD X3D or I'm not interested. I'll take the clock hit if it means not having to rely on software to figure out whether I'm playing a game or not.
Yup, the 7950X3D is great only when the software uses the right cores, or when you set it up manually.
I have the 7950X3D; the issues were pretty much fixed. I got mine last month and I'm definitely digging it over the 7700K I upgraded from.
I reckon the new features are that both CCDs have the 3D cache so you don't have to worry about core parking nightmares.
The MacBook will throttle during heavy load, making it more power efficient.
I will not install anything with a 12VHPWR connector, much less multiple. The risk is too big regardless of the performance.
Well I guess you won't be installing anything this generation. AMD is also using the 12v power connectors.
Whilst it's inexcusable, there are far more of these 600w plugs in use than have failed. All the enterprise and server GPUs that Nvidia produce use these plugs too.
I think the dual connector could simply be there so the card isn't trying to pull all of the power through one cable, and to help eliminate the burning we saw when the 40 series launched.
Say, 300W on each cable rather than 600W on one.
I like how the sponsor aims to battle boredom, but programming was the absolutely most boring and mind-numbing thing I've ever tried to learn in my life.
I personally feel like they went the route of separating the power draw across two connectors so that the load isn't all on just one of them. This is the same route that the overclocker Kingpin would have taken with his version of the 4090, I believe, as he states in the Gamers Nexus video. Seems like a great idea to me!
Love the point before the crazy hands. Let's do the point at the beginning first; it's a really great opener.
I expect that the dies with 3D cache will not be OC'able, but the other 6/8-core ones will be.
Two 12VHPWR connectors are there because a single connector just wasn't cutting it... all those 600W on tiny wires put unnecessary strain on it... two connectors seem enough to split the load 300W/300W and prevent it from melting.
It's for upcoming GPUs for many years to come; it didn't have to be the 5000 series.
Even if the 5090 can hit 600W, it will still be "OK", since that's 600W on the cable plus 75W from the PCIe slot. The downside: the cable might get a little warm.
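As a rough illustration of why splitting helps at the pin level, here's a minimal sketch; the perfectly even current split is an assumption, and the six 12V supply pins follow the 12VHPWR layout:

```python
# Minimal sketch of per-pin current on a 12VHPWR connector,
# which carries power over six 12 V supply pins.
VOLTAGE_V = 12.0
SUPPLY_PINS = 6

def amps_per_pin(load_w: float) -> float:
    """Current through each supply pin, assuming a perfectly even split."""
    return (load_w / VOLTAGE_V) / SUPPLY_PINS

print(f"{amps_per_pin(600):.1f} A/pin on one connector")  # ~8.3 A
print(f"{amps_per_pin(300):.1f} A/pin split across two")  # ~4.2 A
```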
Power requirements for GPUs are getting out of hand. They really need to focus on power efficiency; we're starting to run into the scenario where a typical North American 115V plug is not enough!
Prices are just nuts. I'm sticking with my 5800x3d and 6900 toxic until the rig falls apart!
Man, at this point I'm gonna wait till my parts die before upgrading.
YouTube does not stream true 4K; their compression won't allow it.
The AMD 9800X3D, 9900X3D, and 9950X3D will kill the market if they bring the same increased performance not just in games but also in productivity work.
Wow a GPU that consumes triple the power of my entire system.
@@hansolo8225 more like 5x at most.
Screen size makes a big difference in power draw. I would like to see some test with the exact same hardware.
hard to believe the 9800X3D will release in 1 month
FORGET Ryzen 5000! Ryzen 7000 is coming on an entire new socket!
FORGET Ryzen 7000! Ryzen 9000 is going to have a huge architecture leap!
FORGET Ryzen 9000! Ryzen 9000X3D is going to have a completely new 3D-VCache system!
Not excited, because I bought the 9950X chip instead of waiting. So far it's still an upgrade from my i9 9900K.
I'm quitting my 3700X and waiting for the 9800X3D. I don't think the 9000 series sucks that much; it only sucks if you're an enthusiast.
5:00
Have fun upgrading your Apple battery.
9800X3D no overclocking support is very disappointing if true. I hope it has more performance and 3D-VCache than the 7800X3D, or it looks like the 9950X, 9950X3D and 9900X3D are the better choices in 2025.
Very disappointing. It's common knowledge that the 6 core parts are inferior to the 8 core parts. If the 9900x3d can be overclocked on the vcache chip, then there is no reason that the 9800x3d could not overclock. No reason except profit.
The 9800X3D can't be overclocked.
After living with an i7 6900K that can't overclock beyond stock boost, I don't care much about OC anymore.
Current tech is already pushed to the limit by manufacturers anyway.
I don't need more than 16 threads, so the 9800X3D is still my next PC build.
Those small laptops are difficult to compare; basically, they're all effectively the same. The criterion for choosing one over another comes down to aesthetics.
I'd love to see one of these hardware reviewers do a show on what you realistically need to game. I know everyone is waiting for the next 'most powerful' hardware to drop, but let's get real here... Do you really need the latest CPU and GPU and NVMe and RAM and motherboard and so on to game?
I will buy the 5090 no matter what, and definitely not for gaming. If it launches with 2x 16-pin connectors it will be safer (550W / 2 = 275W per connector = safe).
I hope the Z2 extreme chip would be enough to convince Valve to make the Steam Deck 2.
1200W? I could heat my apartment with that :) that's just nuts...
Well, I figured AMD would want to get the most popular X3D chip out there quickly, which would be the 8-core version. As for new features, honestly my gut was, and still is, telling me it's likely to have some form of NPU/AI acceleration feature set that the regular 9000 series doesn't have (especially given the over-hype around the buzzword (*YAWN*) AI).
It'll be to split the power between two connectors and stop them getting overloaded.
I'm an enthusiast with loads of money; even my toilet paper is made of gently massaged, scented, perfectly fluffed and creased, freshly minted hundeds an fittys (misspelled on purpose). But jokes aside, I'm staying on the 4090 lol. It's powerful enough that I only run it at 2/3 of the power and it still puts out about 90% of the performance.
So I really don't care if the 5090 is better. It could be two, three, or four times better. It's not even the money; it's just that I, as an enthusiast with money who buys crazy GPUs like a 4090, feel content.
In fact, I'm so content I bought a weaker 4080 laptop so I can kick back whenever I feel like it. Mind you, the 4080 laptop GPU is closer in performance to a 4070 desktop part, or a 3080 from the previous gen. It's enough for me... I still have my old 3080, actually.
Nobody "needs" a 5090, no, not even you. Yeah, you, the one using it for renders and work. You'll be fine with a 4090.
It's also simpler to run the 4090 cable- and PSU-wise, less complicated imo.
Now, OK, some may disagree with me, and that's fine. I'm happy as a pig in sh**. I'm the guy that buys the new ARK Survival Ascended and then still plays the older ARK Survival Evolved instead, because I want to, despite having the gear to run the newer software. Lol.
Stop the copium for a device that hasn't even been released.
That’s a bit ridiculous…….no……..rigoddamndiculous!!!
How much do you expect the 5090 to cost?
If you're surprised by the RTX 50xx taking 600W, then you haven't been paying attention. The Nvidia H100 was already an over-700W part and the new H200 will take more; the trend is the same on the consumer side, no surprises there. This will be a problem in NA, or anywhere else that doesn't natively have 220V/240V power at the outlet, since about 1,800W is the theoretical max from a 120V, 15A circuit. Let's do some math: 600W for the GPU, 300W for the CPU, 50W for RAM, 25W for the NIC, 25W for the sound card, 50W for storage = 1,050W. Don't plug anything else in there. At least you won't need that spare space heater in the winter.
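A small sketch of that budget checked against a North American 120V/15A circuit; the component figures are the commenter's estimates, and the 90% PSU efficiency is an added assumption:

```python
# Minimal sketch: DC-side power budget vs. a 120 V / 15 A branch circuit.
components_w = {
    "GPU": 600, "CPU": 300, "RAM": 50,
    "NIC": 25, "sound card": 25, "storage": 50,
}

total_dc_w = sum(components_w.values())   # 1050 W on the DC side
wall_w = total_dc_w / 0.90                # ~1167 W at the wall (assumed 90% PSU)
circuit_max_w = 120 * 15                  # 1800 W theoretical maximum
continuous_w = circuit_max_w * 0.80       # 1440 W NEC continuous-load limit

print(f"Wall draw ~{wall_w:.0f} W of a {continuous_w:.0f} W continuous budget")
```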
The H100 and H200 GPU dies are humongous, roughly 35% bigger in die area than the 4090, and they also have way more VRAM. Their high power consumption is absolutely justified.
I forgot Ryzen 9000 last month.
GM, You are blocking the graph that you are trying to show on screen. It's very hard to follow along.
Why the hell would you need a 1200W GPU for 12K high refresh rate gaming?
MacBook Air screen size: 13", Intel Zenbook: 14". Ratio 13/14 = 0.9286; 0.9286 × 15:25 = 14:29. The MacBook's 13" is smaller than the Zenbook's 14".
On GPUs: extrapolating and interpreting the info, that is why AMD has stated it's not going to compete with Nvidia at the top end and will concentrate on the middle of the market. A sensible move, unless one hooks the GPU up to the mains or has a dedicated solar battery. AMD's move seems justified by looking at the values of second-hand GPUs, the true market-position indicator, not by the loudness of those paid to boost sales by whatever means they choose, like pretending the cards dropped from heaven off a god's workbench.
I've been talking about the 9800x3d paired with the 5090 all year
Pulling that much power isn't impressive at all... Efficiency is the next true technological breakthrough we need...
the 5090 will not be a consumer product for sure.
Kids these days lack the attention span and memory of a simple goldfish, ffs. It has always been like this, generation after generation: one gen is a power hog, and the next gen is power efficient. The 4000 series was power efficient; now the 5000 series is gonna be power hogs. They just refine the product every second gen and sell it to you as "new" cards. Wait till the 6000 series for power-efficient cards.
The 9800X3D will be the new Zen 5 "Faildozer"!! The difference will probably be the same as ordinary Zen 5 vs its Zen 4 counterparts in gaming, so 2% more.
On the next Max Tech video: "AMD and Intel got a lot more points than Apple, but Apple uses 1/3 the power, so let's multiply Apple's results by 3 aaaaand, wow!!! Apple has the fastest chip ever!!"
Not long from now, I'll be connecting a 400V outlet to my PSU to power a single GPU card.
What if AMD lets you OC the non-X3D chiplet on a Ryzen 9900X3D/9950X3D? Could that be the new feature? or is that possible already?
Not pumped, as I was waiting on getting a 9800X3D to replace Intel entirely, but now I may just have to stay with Intel's instability and my GPU spiking to 100%.
5090 using that much power is ridiculous. I hope the 5080 is not like that.
that's like a fuckin AC unit power draw
Finally, no forced overclock.
1200W for a GPU, when wall outlets are, what, 1500W max?
*Intel's "Ultra" will be their next joke !!*
1200W has the potential for EPIC gaming
I really doubt the 5090 will need two 600W connections... I think it's probably more for AI, dual-GPU setups, or even a Titan card.
Whoever pays the companies (Cinebench) more, that one has the better processor.
The only reason I'd ever get a 5080 is for the new VR headsets with 8K resolution.
I predict that the 9800X3D is going to be very expensive and disappointing. Hopefully I am wrong.
After seeing the weak performance of the 9000-series chips a few months ago, the 9800X3D releasing this early scares me. It feels like it's gonna disappoint, and the fact that it can't be overclocked is an even bigger L. I guess we're not gonna get next-gen performance out of it, like the 3080 to the 4080; I still remember going from 300fps to 700fps. I guess this gen is gonna be like 50-75 fps per upgrade on the CPU, then another 15fps from the 5080.
Why does the same spec taking less power scare you?
Don't worry, 9800x3d can be undervolted😂
If the new X3D CPUs are only marginally better than the 7800X3D, AMD will need a serious kick in the butt!!
The 8-core chips are higher quality than the 6-core chips. If the 6-core chip with 3D cache on the 9900X3D is overclockable, but the 8-core 9800X3D is not, I would be very disappointed. Very scummy, AMD.
I wonder whether Nvidia is aware that in the USA, as well as other low-voltage regions with 120V AC power, there's a safety limit of about 1.8kW per outlet, because outlets are typically rated for 15 amps?
Exceeding that limit is really bad for fire safety. At least according to stereotypes, suing rich companies is a popular hobby in America.
Suing Hong is next up today
Are you pumped to sue NGREEDIA??
@@pacspecific I can’t even if I wanted to. I live in Europe, we use 230V AC and the power limits are higher, often 2.3 kW or 3.6 kW.
@@soonts 13 amps (about 3kW) is the maximum in my country. Interestingly, all plugs here have 3 pin contacts (2-pin ones are reserved for Europe), and an adapter shouldn't draw power unless all 3 pin contacts are in use. Some imported electric toothbrush chargers have the EU-style plug, but then we have to buy the UK version 🤔
The battery life chart looks a bit off. It should be normalized, at least by battery capacity (e.g., runtime per Wh). The Ryzen machine has a larger screen... Apple is still the winner here...
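A minimal sketch of the suggested normalization, runtime per Wh; the figures below are made-up placeholders, not the video's actual results:

```python
# Minimal sketch: normalize battery runtime by battery capacity.
laptops = [
    # (name, runtime in hours, battery capacity in Wh) -- placeholder values
    ("MacBook Air M3",     15.0, 52.6),
    ("Snapdragon laptop",  14.0, 70.0),
    ("Ryzen AI laptop",    10.0, 78.0),
]

for name, hours, wh in laptops:
    print(f"{name:18s} {hours * 60 / wh:5.1f} min/Wh")
```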
The 5090 and 5080 will be out in November, not 2025.
Either that, or they are going to give SLI another chance?
I got a 7600X fairly cheap; I'm gonna pair it with a 5080 soon and see how much performance is left on the table before I upgrade.
My 7900XTX will pull 500W from time to time, so 600W doesn't seem all that crazy.
The 7900XTX, and RDNA 3 overall, is known for being inefficient. That's why there are little to no AMD discrete laptop GPUs: not only do they not sell well, they're a worse option than Nvidia.
Even Intel's latest Arc iGPU in Lunar Lake is more efficient than AMD's latest iGPU counterparts.
@@dqskatt And? Do you think that you, somehow, invalidated my point?
@@miken966 That just proves your point even more in comparison with the other brand, but the power consumption is way overblown.
AMD rates the RX 7900 XTX at a 355W TDP, but it usually runs at 280-300W in most games, even in CPU-limited scenarios like CS2 or OW2 (280-300W).
The RTX 4080 Super costs just 50 bucks more and can actually do RT. Despite being rated at 320W, it consumes less, 80-280W depending on usage, and only 80-150W in CS2 or OW, compared to AMD's 280-300W.
Maybe Apple would have won with a bigger battery, but do they offer a bigger one? No, they don't.
Maybe they'll bring back SLI? Or ray tracing on a second card??
Ray tracing would be a blast if Crossfire and SLI got revived.
We've got:
Resizable BAR
PCIe Gen 5
HAGS
4x the VRAM of the common cards back in the day
Vulkan/DX12
GPU makers, who wanted to sell more cards like before the mining boom, are resting on their laurels, because if people really got double the performance, they couldn't sell the next gen on a 5% performance gain.
@@MelioUmbraBelmont you forgot DirectStorage 😁 but that's what I mean: now would be the time... Vulkan/DX12 technically could support multi-GPU just fine. Imagine AMD/Nvidia setups!
@@MelioUmbraBelmont that's just an article from 2020 about native SLI support:
"...DirectX 12: Shadow of the Tomb Raider, Civilization VI, Sniper Elite 4, Gears of War 4, Ashes of the Singularity: Escalation, Strange Brigade, Rise of the Tomb Raider, Zombie Army 4: Dead War, Hitman, Deus Ex: Mankind Divided, Battlefield 1 and Halo Wars 2
Vulkan: Red Dead Redemption 2, Quake 2 RTX, Ashes of the Singularity: Escalation, Strange Brigade and Zombie Army 4: Dead War..."
@@MelioUmbraBelmont funny how they can combine many GPUs in their servers but can't figure out how to do it for games... which aren't working on random things, just a limited set of calculations... Games just need to adopt this.
@@OK75 they blame developers and don't want to dedicate a team to work on it; also, on AMD's side, the documentation for Crossfire SUCKS.
So... we went full circle??
Ooohhh, two melty connectors. Solving a problem that never existed, then doing it again.
The connector is the new standard. Nvidia isn't using it for no reason.
Forget about Freeman
My PSU already has 2 12vhpwr outs?
AMD is going to try to rain on the 285K launch.
I think I have a chance this time to convince my family members to buy a 2-in-1 GPU/radiator instead of two separate devices. Think of the money we'll be saving.
Heavy load... the M3 with 2000 points wins; the others with 8000 points lose...
If AMD locks the overclocking of X3D chips to only the higher tier, Intel needs to step up their game.
I don't work for AMD, but I don't see that as really likely. The real issue is that currently you're just not getting much out of traditional overclocking; where you see performance gains that are humanly noticeable (and even then not huge) is in fine-tuning RAM and syncing it with the Infinity Fabric.
@@OmniMontel it's the freedom to do what we want with the things we buy that matters, instead of artificial locks... but I'll refrain from commenting until I see whether it materializes or not.
I will buy AMD next... Nvidia has lost it.
thanks
ANYONE KNOW WHAT IS THE INTRO TRACK?
My new Blender 3D workstation has a 1600W PSU due to needing multiple GPUs. I had to wire a 240V line into the office to power it, because the 1600W PSU was too much for a standard US 120V electrical line. Get ready, people: tech has officially out-spec'ed our power infrastructure's maximum limits. And it is not cheap to upgrade the power lines in a house or office.
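A minimal sketch of the circuit math behind that rewiring; the 92% PSU efficiency figure is an assumption for illustration:

```python
# Minimal sketch: wall-side current of a fully loaded 1600 W PSU
# on 120 V vs. 240 V circuits.
PSU_RATED_W = 1600
EFFICIENCY = 0.92                      # assumed ~Platinum efficiency at load

wall_w = PSU_RATED_W / EFFICIENCY      # ~1739 W drawn at the wall
amps_120v = wall_w / 120               # ~14.5 A
amps_240v = wall_w / 240               # ~7.2 A
continuous_limit_a = 15 * 0.80         # 12 A continuous limit on a 15 A breaker

print(f"120 V: {amps_120v:.1f} A vs {continuous_limit_a:.0f} A continuous limit")
print(f"240 V: {amps_240v:.1f} A -- comfortable headroom")
```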