Download HitPaw VikPea for free now: bit.ly/40tFpZ0
This is from a September 2024 interview with Jack Huynh of the Radeon group:
Don’t worry. We will have a great strategy for the enthusiasts on the PC side, but we just haven’t disclosed it. We'll be using chiplets, which doesn't impact what I want to do on scale, but it still takes care of enthusiasts. [...] Don't worry, we won’t forget the Threadrippers and the Ryzen 9’s.
Why are you assuming AMD is cancelling its next gen and avoiding high-end GPUs by choosing a monolithic design? The RTX 5000 series covers the high end and ultra high end and still uses monolithic dies. You just need to shrink the lithography and create a new architecture with more efficient registers, instructions, and pipelines, and add dedicated units for specialized instructions (tensor/WMMA, RT units, BVH, ray intersections, etc.). An MCM dual-chip design does not always mean better.
@@mightyhadi6132 Yes, it does mean better: you can have dual compute dies.
The 5090 like the 4090 doesn't scale as one would expect, so even if such a product came out, it would have no purpose on the consumer market.
RTX 5090 TITAN with 96GB is a possibility but will it generate 10 frames for every real frame?
it's for creators not for gamers
It could potentially be slower in gaming than the 5090.
the AI on it will hallucinate the entirety of the game, you need only type in the name of a game to get started
It's meant for AI developers; even the 5090 is. No one needs 32 GB of VRAM, and the extra performance for the price is a joke.
It will AI all the nonsense in the world in a few seconds and start pondering the meaning of life!
😂
UDNA Halo Cancelled!? But they haven't even finished failing this launch yet!
Like failing your second marriage while divorce papers for your first haven't gone through yet...
🤣🤣🤣🤣🤣, fam that was delicious😂😂😂
@@angrygoldfish So many people just seem to want AMD to fail before they have even released their cards. I really have no interest in seeing them fail, so I don't understand it. I just want good, cheap cards.
Two possible reasons: #1, the current model's performance is so good that making a different model would cost more; or #2, there are too many problems to fix and they're better off starting over.
@angrygoldfish Diabolical work 😂
Sadly not a month, but two ...
It's the worst outcome for RDNA4
Yeah he said a month from now, but it's 2 months from now.
Honestly, the 50 series hasn't been reviewed very well, especially because the uplift in performance wasn't as impressive as anticipated. If the 9070 and XT variant are as good as claimed and they don't jack up the price, then they should do well. AMD being who they are, though, they will probably price it too high and screw up their already botched launch.
AMD is probably already kicking themselves hard for cancelling RDNA4 Halo, aka N4C, after the third-party 5090 benchmarks came out and it's only ~25% faster than the 4090. Cancelling high-end UDNA would be suicide for AMD.
I doubt it even existed
Even if it existed, people would still buy Nvidia. That is why AMD abandoned the high end.
@@haukikannel Because AMD still made slower, inferior products on the high end.
It was MCM but had major issues and would have been super expensive... Plus NV fanboys wouldn't have bought it; they would wait for cheaper NV cards.
@@ZackSNetwork Even if AMD made a card 2x faster, Nvidia fanboys wouldn't buy it. The proof: the RX 6600 is 30% cheaper and 40% faster than the RTX 3050, but fanboys bought the 3050 by the millions, and the RTX 3050 has 10x more market share. Why? Think about it: Nvidia fanboys only buy Nvidia. It's not AMD's fault for abandoning the high end; people are too brainwashed into buying only Nvidia. Too bad.
And don't say Nvidia is better at RT and DLSS; it's minor and shouldn't be a deciding factor. RT is only in 5% of games, and DLSS is only slightly better.
That shouldn't turn into 88% market share for one company. It shows how irrational people are: braindead logo loyalty.
The highest-listed RDNA3 card on the Steam survey is the 7900XTX. A LOT of people (like myself) who are educated and don't support anti-competitive and anti-consumer practices prefer not to buy Nvidia whenever possible.
Not bothering with halo cards doesn’t mean you have to exit higher-performing segments entirely. The 7900XTX never needed to outperform a $1600 Nvidia halo card. Just make it better per dollar than whatever Nvidia has around the price range…which it was.
AMD is missing 2 critical products with RDNA4 that they were perfectly able to give us: a 24GB, 96CU performance card, and a 12GB, 48CU midrange card. The 8 and 16GB variants have been finished seemingly forever, so they've obviously had plenty of time to draw up one or both of these chips.
The 7900XTX being the top-listed RDNA3 card on the Steam survey proves the demand for high-end 24GB AMD cards is there. We wanted a competitor in that price range. It doesn’t have to beat Nvidia’s doubly-expensive flagship, but we DO want AMD to have some presence in the $800-1K price range.
But the absence of a 12GB, 192-bit bus card is even more baffling. They've been spending all this time claiming to be focusing on the midrange for RDNA4. Well that's great! Sooo…why isn't there a 12GB, 192-bit bus card? The 6700XT was one of the stars of RDNA2. It's the second-highest listed RDNA2 card on the Steam survey, not far behind the 6600. The 7700XT (which is made using salvaged 7800XT chips) still ends up second behind the 7900XTX for RDNA3. Reviewers have been hammering on about 8GB not being enough anymore. So why not have a 192-bit, 12GB offering? And I don't mean cut-down chips, but a dedicated chip like Navi 22 was? How are you focusing on the midrange with this glaringly obvious omission in your product line?
All this is to say, RDNA5 or UDNA or whatever they decide to call it NEEDS a high-end variant. TSMC’s yields for the chips used in the 9070XT are apparently really good, and a 50% larger chip for a performance model would have been fine. And they DO need a midrange, 12GB variant to round out their lineup.
The RX 9060 XT is rumoured to have 12GB on a 192-bit bus.
@@x3ko777 if they’re reallocating N44 chips into that format then hopefully it’s performant enough that it won’t be unbalanced. Since for most of the last couple years it was rumored the N44 chip would be 128-bit 8GB. And granted, AMD absolutely *should* have a 128-bit, 8GB model available as well. Maybe as a (hopefully slightly) cut-down variant. We want the market to move past 8GB but as far as what mainstream buyers are actually buying it’s pretty clear that’s still the largest-selling part of the market by volume.
The 7900XTX is already a little 4090: 33.5k Time Spy graphics score with an overclocked Nitro+ versus 36.5k for an average 4090. The max scores for both are very close, 42k for the 7900XTX and 44k for the 4090, but you need a high-end professional CPU.
Idk what you're smoking dude, but I want some too. Claiming we need a 12GB, and even *an 8GB*, card is beyond crazy lol. People only buy 8GB because that's what Nvidia made the 60 class to be, and everybody knows that you *ABSOLUTELY HAVE TO BUY NVIDIA NO MATTER WHAT*.
@@peterkolesar4020 I’m not a fan but the reality is 8GB cards still sell the most. The RX6600 has been AMD’s best-selling card the last several months and is their highest-listed card on the Steam Survey. The RX7600 and RX6600XT are also up there on both counts. There are actually still a LOT of PC gamers who are not buying into the massively increased prices a lot of us have gotten used to since the pandemic.
If they stop doing high-end products and let Nvidia get away with awful VRAM configurations uncontested, I'm gonna cry.
Buy Intel. Give them signal to continue their GPU products.
Why make GPUs that people don't buy?
That is AMDs problem…
@ Intel will stop making GPUs unless they start to make a profit, aka increase prices a lot! Would you buy a B580 at $550, the same price as a 5070? They are about as big, so about as expensive to produce.
@@GreyDeathVaccine nobody wants an Intel card that still sits at the bottom of the stack.
Same I love my 7900xtx
If they cut UDNA when it's basically the same as their server stuff that's bad news.
Just let the top tier be stupid expensive and let people eat.
Can AMD just stop kicking the can down the road, do high end, and compete on all fronts so Nvidia is forced to compete more heavily?
Yeah, I don't want Nvidia in my computer. But I game at 4K and the RX 6900 XT is not cutting it anymore. AMD, let me buy a high-end GPU again.
@justbob8294 same bro😢
@justbob8294 Yeah, and that's when AMD shot for the moon, and they need to keep doing it. Stopping is slowly giving up on Radeon as a brand.
No, next thing they'll announce will be cutting down VRAM and cancelling budget tiers too.
For me, it sounds like you just want NVIDIA cheaper. NVIDIA is not going to compete harder against AMD if you're going to buy an NVIDIA GPU anyway, even when they are on par. AMD is the one that needs to compete harder.
That's weird. Multiple leaks suggested that the PS6 is expected to use a UDNA GPU along with a Zen 5 CPU from AMD. But yeah, who knows.
It would never use the top UDNA die, just as the PS5 didn't use the 6950 XT's die, so it may use the xx70 die again.
They usually use a custom version that's focused on what they want. Sort of a .5 better than the last generation, so rdna 3.5 or so.
Please don't make me wait almost ALL of March, dammit!
You can always buy an Nvidia card; no one is stopping you. I'm going to wait and see whether it's worth upgrading to the 90xx cards, or skip a gen. What's an extra month? That's nothing in the grand scheme of things.
@grumpyratt2163 Buying Nvidia just validates their garbage marketing and lackluster generational uplift.
@@grumpyratt2163 Yeah, but the problem is every time I look at Nvidia's offerings and seriously consider them, it's the same old story all over again: not enough VRAM, too much money for what it offers, and the experience _I know_ is NOT going to be jaw-dropping or make me feel like it was worth the price tag I paid. Fk Jensen and his premiums, I'm not paying more for the same FPS and in some cases (the 5070, ahem) _less VRAM._ I'd grab the 5070 Ti any day, or a 4070 Ti Super, if Jensen would charge a reasonable price of no more than $600. Jensen doesn't believe in offering 16GB of VRAM for the midrange under $700. So yeah, I'll just have to wait even longer for the 9070 XT.
@@grumpyratt2163 Extra month? 24.01 to 23.03 is 2 full months.
@@grumpyratt2163 An extra two months matters because a later selling date implies a later embargo date, which means that people deciding between the 70 class and the 70 (XT) might lose the opportunity to pre-order the 70 class if RDNA4 proves to be underwhelming...
23rd of March, it's now January 24, and you say a month from now :D You mean 2 months from now.
I don't care if it's chiplet or monolithic, as long as we see a 24GB GDDR7 SKU with either 35Gbps+ ICs on a 256-bit bus or 28Gbps on a 384-bit bus. In either case, please AMD, don't go smaller than 6144 shading units / unified stream processors; another gen of just 4096 would simply be too little. Also bring the L3 cache back up to the 2MB/CU that RDNA2 had (I want to see 192MB of L3), or bump the L2 cache per shader array up to an equivalent total amount of cache, or add specialized caches for mesh and BVH shading like Intel did.
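For reference, the two memory configurations in that wish list land in a similar bandwidth ballpark. Here is a quick back-of-the-envelope sketch; the bus widths and per-pin speeds are just the numbers from the comment above, not confirmed specs:

```python
# Back-of-the-envelope GDDR bandwidth: GB/s = bus width (bits) / 8 * per-pin speed (Gbps)
def gddr_bandwidth(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * pin_speed_gbps

# The two hypothetical 24GB configurations from the comment above:
for name, (width, speed) in {
    "256-bit @ 35 Gbps": (256, 35.0),
    "384-bit @ 28 Gbps": (384, 28.0),
}.items():
    print(f"{name}: {gddr_bandwidth(width, speed):.0f} GB/s")
# 256-bit @ 35 Gbps: 1120 GB/s
# 384-bit @ 28 Gbps: 1344 GB/s
```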
March 23rd is 2 months from now
Dear AMD please give me a reason to buy a high end card from you guys again!
RTX 5090 TITAN AI has 96 GB of GDDR7?!?!? NVIDIA has no equal in 2025 as expected.
God damn it AMD, as of now you deserve to fail.
AMD couldn't slither in just under Ngreedia on pricing the 9070 and decided to delay the release.
Please dig more on this one.
If AMD just abandons competing in the high end Nvidia can charge whatever they like.
I hope this is not true at all.
Considering how much AMD make on Instinct, my bet is UDNA MCM is Instinct and GPGPU/datacentre cards. Consumers get the monolithic dregs.
At this point, each MCM consumer card would be a loss for them. I mean, an MI300 probably goes for $5,000 minimum, while a 9190 XT would probably go for $1,200. So every consumer card using TSMC CoWoS is lost potential profit, unless TSMC magically gets 100x more capacity FOR THEM. Remember, Strix Halo started using CoWoS, and that's a huge market compared to the ~10% that is discrete gaming.
Going by the old patent: 360 CU cancelled, 240 CU cancelled, and the 120 CU monolith goes to PC and the PS6. I wonder if AMD would sell it as an iGPU. And of course, who knows how far off we are from those very old rumours.
You're not getting 120 CU in a PS6, stop the nonsense.
@ZackSNetwork It will be cheap soon and will be mid-range. The initial plan for RDNA5 was 13 chiplets, 9 of those with CUs.
RTX 5090:
Asus ROG Astral OC: $2799.99
MSI GAMING TRIO OC: $2349.99
MSI SUPRIM LIQUID SOC: $2499.99
MSI SUPRIM SOC: $2399.99
Asus TUF GAMING: $2449.99
MSI VENTUS 3X OC: $2199.99
MSI VANGUARD SOC LAUNCH EDITION: $2379.99
MSI VANGUARD SOC: $2379.99
RTX 5080:
Asus TUF GAMING OC: $1699.99
Asus ROG Astral OC: $1899.99
MSI GAMING TRIO OC: $1199.99
MSI Inspire 3X OC (Gold) - $1169.99
MSI GAMING TRIO OC (white): $1199.99
Asus PRIME: $1399.99
Asus PRIME OC: $1499.99
MSI VENTUS 3X OC PLUS: $1139.99
Gigabyte GAMING OC: $1199.99
Gigabyte WINDFORCE SFF: $1369.99
MSI SUPRIM SOC: $1249.99
MSI SUPRIM LIQUID SOC: $1299.99
MSI VANGUARD SOC: $1229.99
the future of PC gaming looks grim
I honestly don't see UDNA being midrange-only again; AMD has never done two midrange-only generations in a row.
I have a suspicion that instead of separate GPU chips on the board they will use an approach similar to the 12- and 16-core Zen 5 CPUs: a single package on the board made up of multiple chiplets, probably with a ton of 3D-stacked Infinity Cache. I believe this is what they were going to do anyway, and it's different from the traditional approach of separate chips, where scheduling of tasks becomes a big issue for graphics. This 'monolithic' approach of chiplets on one package is still multi-chip by definition, but it should scale almost perfectly. It also makes board and cooling designs much simpler.
As for their compute variants, they would utilise the same cores, just minus some of the other on-chip stuff. Compute workloads are much easier to run in parallel, so multiple monolithic chips (each containing multiple cores) wouldn't be an issue for scheduling. That's what I believe anyway.
Why are you assuming AMD is cancelling its next gen and avoiding high-end GPUs by choosing a monolithic design? The RTX 5000 series covers the high end and ultra high end and still uses monolithic dies. You just need to shrink the lithography and create a new architecture with more efficient registers, instructions, and pipelines, and add dedicated units for specialized instructions (tensor/WMMA, RT units, BVH, ray intersections, etc.). An MCM dual-chip design does not always mean better.
Yeah, that's how you get a 25% uplift each new generation. Monolithic is hitting a wall. It costs more for more transistors, so prices keep rising.
@tstager1978 It's the failure rate. It's much worse the bigger the die size. Chiplets mean a much, much higher yield, with each process matched to its requirements (the I/O die doesn't need the newest fab).
The ONLY upside to chiplets is yields. So unless you're throwing away a ton of large chips because of defects, it makes no sense.
Look at how much silicon you get when you buy AMD; that's why they can't sell them as cheaply as people want. If you have a good yield rate, then the only thing that really matters is the price per wafer.
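To illustrate the yield argument being made in these two comments, here is a rough sketch using a simple Poisson defect model; the defect density and die areas are assumed, illustrative numbers, not actual TSMC figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson defect model."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.001        # assumed defect density: 0.1 defects/cm^2 = 0.001 defects/mm^2 (illustrative)
big_die = 600.0   # one large monolithic die in mm^2 (assumed size)
chiplet = 150.0   # one of four chiplets covering roughly the same total area (assumed)

print(f"600 mm^2 monolithic die yield: {poisson_yield(big_die, D0):.1%}")   # ~54.9%
print(f"150 mm^2 chiplet yield:        {poisson_yield(chiplet, D0):.1%}")   # ~86.1%
# A single defect scraps the whole 600 mm^2 die but only one of the four 150 mm^2
# chiplets, so usable silicon per wafer goes up, at the cost of extra packaging.
```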
Isn’t UDNA next gen? Why would they cancel it? It doesn’t make sense.
Because it never really existed; it was just so they wouldn't look as pathetic as they did at CES.
The price/perf for chiplet MCM is not worth it... right now the packaging costs actually bring the total cost higher. Maybe in two generations, but who knows what will be happening at that point.
4:20 That's actually 2 months from now. We're still in january 😕
We haven't seen a Titan since the $6,000 Titan V CEO Edition. I doubt there will be one as there's no money in it for Nvidia when they can sell that to data centers.
I certainly hope that AMD is planning to keep forging ahead with uDNA -- and the Halo line. Especially since nVidia is getting into the APU biz; we need more competition in the GPU market, not less!
The 96GB GB202 card is most likely a GB202 Quadro with 3GB modules, as those modules will be going to server cards.
Well said.
I bet it would be called the B6000 too. But 96 GB seems like too much. Maybe an H100 replacement? A B100?
Who wants to bet they will change the naming scheme again next gen because 10080 simply looks stupid?
BTW, there is no reason why Nvidia would not go with the same 'fake' chiplet design AMD used. They would get a lot more space for CUs, tensor cores, etc. while putting I/O, memory controllers, etc. on a different die.
At this stage they should just change the naming convention to have 4 numbers, with the first two being the year of release and the last two being the generational class. E.g. 9070 becomes 2570.
It just makes complete sense to me and will make it easier for people to identify the most recently released product from them.
And then never deviate from this for the next 60 years (which by then will they even be around to care?)
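As a toy illustration of the scheme this comment proposes: the function below is just my own sketch of the idea (two-digit release year followed by a two-digit class), not anything AMD has announced.

```python
def model_number(release_year: int, tier: int) -> str:
    """Hypothetical 4-digit scheme from the comment above: two-digit year + two-digit class."""
    return f"{release_year % 100:02d}{tier:02d}"

print(model_number(2025, 70))  # "2570" - what the RX 9070 would become under this scheme
print(model_number(2027, 80))  # "2780"
```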
@@Rhapses Just because it makes sense to you and to the majority of intelligent life means AMD will never do it. And their 'logic' will be that, for example, 2770 looks weak to a consumer in comparison to 6070.
Fine with me, so long as there's a great midrange option like the 7900 GRE when UDNA is out.
AMD: midrange. Intel: low end. Nvidia: high end. It's a secret agreement between the three companies.
AMD can't figure out chiplets for GPUs. NVDA would probably nail it on the first try. They probably already did, but monolithic is more performant.
Nvidia already got "chiplets" working by "gluing" two chips together. Just a different approach to solving the same issue.
@ I didn’t know an interposer counts as chiplet. Apple has been offering it for years already then. Interesting
@@nick_g It's about how to solve the problem that it will be hard and expensive to go below 3nm. What is a chiplet? The idea is to separate different parts of a die into smaller pieces and "glue" them together. Nvidia's interposer is just a different approach to solving the same issue, connecting whole dies together. Different ideas to solve the same problem. MCM means multi-chip module design, so yes, the interposer solution is exactly that.
Wrong, nvidia sucks at making hardware.
@ they’re the biggest company now in the entire world. Literally the most successful hardware company.
Debate me but i think it's AMD who doesn't care about PC Gamers and not Nvidia.
Auntie said we are going to get a flagship, the question is how that happens. A process shrink could make a monolithic design alright, in theory, I agree. AMD needs the GDDR7, too. They still have like a year and a half to make it work. If AMD can't, then expect a $3000 card from Nvidia's next generation. I think a 50 series Titan is going to need a second power connector after seeing some of the thermal performance of the 5090's connector when they overclock. That was unsettling. Thumbs up!
2 months, I think you will find!
Never thought I’d say I miss Raja.
Chiplets suck. The inherent latency can only be overcome with brute force clockspeeds - look at early Zen chips, Arrow Lake etc... Ryzen only became decent once they moved slowly back to semi-monolithic with the 8 cores per CCD.
The main drawback of RDNA3 is probably the 2.5D, HBM-like MCDs. AMD could make it like the MI300 series, where the GCD sits on top of the MCD, and use a smaller number of MCDs. Navi 31 has 6/5 MCDs, Navi 32 has 4 MCDs, and the performance improvement was there: 72 CU + 128MB IC RDNA2 (6800 XT) vs 60 CU + 4x16MB IC RDNA3 (7800 XT), and they basically have the same performance stock vs stock.
@@noobgamer4709 So 72 CU + 128MB IC RDNA2 (6800 XT) vs 60 CU + 4x16MB IC RDNA3 (7800 XT) clearly shows that even with such an aggressive cache reduction, RDNA3 performed well.
What da frick is AMD doin 💀 💀
If this is the case, AMD sadly ain't surviving till 2036, since Nvidia is also going into CPUs now. The PS8 will probably be the first to use Nvidia hardware.
It's just the same as Intel's B700: they did the design but never taped it out.
If AMD cancel their next flagship then I truly see no reason to stay with them from this point forward.
I was waiting for the AMD 9000 series flagship since there is no 8000 series flagship, but if canned then no point hanging around.
Crap like this is why Nvidia has complete market control.
Uhm Polaris was midrange and so was RDNA1 and they both sold great?
Monolithic dies are more expensive to produce, whereas chiplets can be cherry-picked, so you get more processing power in the long run, provided they can produce bus speeds and configurations that allow all the cache to work with each other and with each processor.
Been on the nay-train for chiplet GPUs from BOTH companies EVER coming, and each day I'm vindicated, just like Titan. It makes zero sense
Ah the real 90. Not the disingenuous class designations from the last 2 gens.
Omg Amarta she's back 😊
RTX 50 Titan 96GB for cooking purposes! You can bet you can make your lunch/dinner on it very fast 🤔
Sony and MS are gonna help develop UDNA with AMD and use it for their next consoles, like they did with RDNA2.
Trust
They clearly stated it years ago...
High end, canceled.
Early launch of RX 8000 cards (in order to gain market share), canceled.
RX 8000 branding, canceled.
Announcement (at CES), canceled.
Release date, canceled.
UDNA Halo, canceled.
🤔
What is going on?
Is this some kind of (corporate) "cancel culture"?
What is going on?
I suspect a 5090 Ti, as they skipped one for the 4000 series but did a 3090 Ti, back when they either could not keep up with demand or purposefully limited supply to inflate the price of their cards.
The reason they didn't do a 4090 Ti is because the 5090 basically is one.
I think AMD is canning the high end because they feel like they can't hit feature parity in software and would rather use the money to try to catch up in the software realm. This would make sense, since high-end buyers want the best, and compromising for a couple hundred dollars of savings does not seem like a good proposal.
Clickbait
Is that Amy???? It’s been like four years, welcome back!!
I don't think the Halo products are cancelled. Instead, they might do an Nvidia Digits kind of thing to sell at premium pricing.
I feel these 5-series Nvidia and 9-series AMD GPUs are a waste of time and a marketing scheme, just to put something out there for a year before they actually do new-gen stuff. Probably better to wait for the next-gen 6 series and UDNA AMD cards.
Gdamn y’all’s ad read is almost a 1/3 of the video
AMD is probably having problems with the GPU chiplet design.
Latency doesn't like playing Chinese whispers.
More the gaming GPU chiplet design. But MI300 should provide enough information for them on how to improve the latency issue when using a GPU for gaming instead of HPC. The MI300X is basically an HPC GPU, with the MI300A being the APU.
So, Nvidia is going to make a Titan for 50 series.
I remember the rumors of them making it for the 40 and 30 and 20 series.
Oh, they never made one for ANY of those series?
They're not going to make one for the 50, 60, 70, etc....... series either. It's TOO NICHE OF A PRODUCT, JUST STOP IT ALREADY!!!!!!!!!!!!!!!!
Nvidia WILL MAKE GAMING AND PRO GPUs!!!!!!!!!!!!!!!!
That's IT! If you want PRO-level performance you BUY a pro GPU, get it?? GOT IT?? That's the way it's going to be. Until Nvidia has been passed by a couple of other companies and hasn't had the performance crown for a few years, they have no reason to make a cross between a gaming GPU and a PRO model, since their top-end gaming GPU, along with ANY gaming GPU, has work-oriented drivers. And if you need support with software, etc., then you need to buy a PRO model to get the wonderful support Nvidia will offer.
Look at everyone selling their RTX3000 cards, no one is selling their Navi 2 cards. 3060Ti, 3070, 3070Ti, 3080 + 12GB, 3080Ti.....
All selling like mad on eBay right now, but no 6800 + XT, 6900XT, 6950XT. VRAM!! Navi2 worked out very nicely, especially the 6800 with 16GB vs the 3060Ti - 3070Ti with 8GB, or the 3080Ti with only 12GB. All for a small price of bad HEAVY RT perf.
I bought my 6950XT because 1) consoles have an 8-year lifespan and 16GB of RAM, and 2) lithography is going nowhere for the next 10 years. In 8 years when the PS6 comes out it will still be nicely mid-range, unless Sony FINALLY fixes their RAM issue and stops blowing money on crazy unnecessary crap like the most expensive storage solution ever invented. But if the PS6 has 32GB of RAM then it will shake up EVERYTHING. I doubt it will exceed 24GB total, unfortunately.
Are you saying that just because it might be monolithic it won’t be powerful?
There's reason for optimism in Radeon's long term? Because of their relationship with Sony? What has that relationship ever done for them besides being a rock around their neck? If AMD didn't have their console relationships, they would have probably spun off RTG long ago, forcing Radeon to actually compete, which is something I would imagine those guys actually want to do, but they are too micromanaged by AMD's accountants who can't see past the next quarter's profits.
Spinning off AMDs graphics division sounds like the dumbest idea I've ever heard anyone say
@LITTLEgiiant staying as a part of AMD is the dumbest thing I've ever seen, unless the goal is to fail miserably over and over and over again. I'm sure those engineers could make good products if they were allowed to/if their existence depended on it.
@@The_Chad_ They wouldn't have the financial resources to support themselves. Not to mention APUs are major revenue sources, and separating those divisions would make them lose out on that. Not to mention Microsoft and Sony help get developers on board to optimize for their GPU architectures and push them to add more features (improved AI hardware/software, for example); for AMD, that is a result of joint collaboration with the console manufacturers. The Radeon division being its own thing would lead to its demise for sure, and any sane person would see that.
@@LITTLEgiiant They're already in demise. If they were independent, there's nothing keeping them from having agreements with third parties. All of those products could, and probably would, still have Radeon tech in them. Independence would necessitate they actually start making good products people want to buy, negating the need for MS and Sony to prop up their dev support. No, they probably couldn't survive just being cut loose with their current capital, but that's where loans and licensing agreements would benefit them. And even if that isn't the answer, what AMD is doing with Radeon now is a massive disservice to the name, the technology, the people that work there, gamers, and professionals. They basically use it as an R&D department for APU and SoC graphics. Selfishly, as someone who uses a lot of products with 3D graphics accelerators, I want better. If I were just an AMD investor, I'd probably be fairly satisfied with the current arrangement, but not thrilled seeing how well Nvidia's stock is doing and knowing RTG could do that.
@@The_Chad_ I just think it's too much of a risk to spin off Radeon, but yeah, I absolutely agree AMD should put some serious work into Radeon. Hopefully UDNA will be AMD's "Ryzen moment" for GPUs.
You ok Paul? It's been a minute
march 23 is two months from now :P
Considering Nvidia's "uplifts" and prices, AMD has no need to go chiplet in the next decade to compete.
terrific more market stagnation
Wth! Is that Amy doing the ad?
Lovely Mass Effect background!
It's getting old.
RTX 50 is an FP4 card, but there is no FP4 workload in games, so they developed DLSS and all the AI features as an afterthought; all the AI features are there so the hardware can "DO SOMETHING".
FP8 was already highly destructive to neural network quality.
Nobody likes to use FP4 as it is. It breaks down something useful into something mediocre (there might be some exceptions, I admit).
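As a rough illustration of why lower precision hurts, here is a toy sketch comparing the round-trip error of 8-bit vs 4-bit quantization of some random weights. This is a simple uniform-quantization toy, not how the actual FP8/FP4 floating-point formats encode values:

```python
import numpy as np

def uniform_quantize_roundtrip(x: np.ndarray, bits: int) -> np.ndarray:
    """Quantize x to 2**bits evenly spaced levels over its own range, then dequantize."""
    levels = 2 ** bits - 1
    lo, hi = x.min(), x.max()
    step = (hi - lo) / levels
    return np.round((x - lo) / step) * step + lo

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 1.0, size=10_000)   # toy "network weights"

for bits in (8, 4):
    err = np.abs(weights - uniform_quantize_roundtrip(weights, bits)).mean()
    print(f"{bits}-bit: {2**bits} levels, mean abs round-trip error {err:.4f}")
# Going from 8 to 4 bits cuts the representable levels from 256 to 16, so the
# rounding error per weight grows by roughly 16x in this toy setup.
```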
That would be great; then Nvidia can charge even more for their VRAM-starved cards. People deserve that. They want AMD to compete just so they can buy Nvidia cards cheaper.
Someone needs to fire the guy in charge of AMD's GPU division. First a delayed, overpriced RDNA 4, now this.
Well, Nvidia seems to be about 30% or so improved this generation. At a huge cost LOL. The ball is in Intel's court now? 😱
Consumer GPUs with more RAM than most modern PCs 😆
Is Amarta back?!
What's next? EDNA?
This is unreal. They are going to make all progress in the GPU market come to a stall. Nvidia will just become the Intel of GPUs with 10% gains every generation. If AMD doesn't provide a high-end card, I guess I'm finally switching to Nvidia.
Lol...wow a GPU with as much system ram as I have..that'd be impressive
Nvidia already got "chiplets" working by "gluing" two chips together.
I think this will be Nvidia's path, gluing smaller dies together, but the 60xx will still be monolithic on 3nm.
The 60xx series will probably be on 2nm. The 3nm node is being skipped by a lot of companies, since the 2nm node is basically ready and doesn't really cost more than 3nm.
@@03chrisv 2nm has a hard reticle limit of around 500mm². Current Blackwell is on TSMC N4P (supposedly), and the high-end Blackwell die is currently 750mm². It would take some heavy engineering work to shrink that by 250mm² when only the logic is scaling and analog basically hasn't shrunk since 7nm.
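A toy estimate of the shrink problem that comment describes: a ~750 mm² die where only the logic portion scales to the next node. The logic/analog split and the scaling factor below are assumed, illustrative numbers, not real Blackwell figures:

```python
# Toy shrink estimate: only the logic part of the die scales, the rest barely does.
die_area = 750.0        # mm^2, current high-end die size (from the comment)
logic_fraction = 0.60   # assumed share of the die that is logic
logic_scale = 0.70      # assumed logic area scaling to a 2nm-class node

shrunk = die_area * logic_fraction * logic_scale + die_area * (1 - logic_fraction)
print(f"Estimated shrunk die: {shrunk:.0f} mm^2 vs a ~500 mm^2 reticle limit")
# ~615 mm^2 with these assumptions: still over the limit, which is the
# commenter's point about why a straight monolithic shrink is hard.
```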
@@noobgamer4709 Even more reason to go with a chiplet-based approach for 2nm, then.
That new AMD Radeon executive needs to get Fired. Dude's a disaster
I'm probably not going to buy AMD GPUs anymore.
Bro, way too early to assume this lmao. Honestly, you shouldn't be starting this rumor based on one comment that had nothing to do with anything being cancelled.
If AMD wants to help gamers they need to pull out of GPUs, and Intel should too; that way Nvidia will be split up as a monopoly and everything will be fixed.
RDNA 5 still being monolithic is both good news and bad news. You don't want to be the beta tester for a radical architecture change and the extra latency introduced between different stream processors.
It also means AMD will rely on catch-up gimmicks instead of huge, innovative architecture changes.
I do!
23rd of March is "a month from now"??? Paul, we are still in January! Check your videos before posting them mate.
AMD pls
4:15 That's two months, big boi :)
A top GPU from AMD is pointless at this moment; they don't care about top cards, there is no money in that for them. Mid-level is the market they should go for at reasonable prices; the low end will be APUs.
They're failing there as well; the 9070/XT got delayed until after everyone will already have a 5070/Ti, lmfao.
@ Time will tell. I don't think the 5070 will be that good for the price at all, and most people have time; just a few Nvidia lovers want to have the bigger-dick experience...
@@anitaremenarova6662 THIS! Exactly!!!
@@anitaremenarova6662 They are probably rushing to do multi-frame gen.
Who needs UDNA anyway?
Gamers are foolish for needing something better than a 7800XT 😂
Just do Medusa Point with RDNA5, 60 CU and 8C/8T, to get a better iGPU.
23rd of March is in a month? I don't think so.
Really, there is no one who can solve MCM design? I call BS. We are being gimped and pimped.
If AMD does not launch their Strix Halo, I'm done with them forever.
No more AMD or NVIDIA for me from now.
Those money vultures should be .... sh*t !
So instead of waiting for UDNA I should just go and buy this gen? If all this is true, that is.
Wen Radeon RX 9090 XTX news? 🤓
The Titan vista & Titan VISTA?
Nvidia Titan, the fastest, .... oh the new Titan is the .... er the new new Titan ....
Nvidia really desperately needs a massive desktop GPU that costs many thousands of dollars and sells like hot cakes, because they just lost like $500 billion in market value recently due to Chinese AI.
If Jensen opened a community sale of all his good-looking full leather jackets, I bet he could bring that $500 billion of share value back (literally, who doesn't want a Jensen leather jacket...) 🤣
When will the review of the 9950X3D come??
Amata is back?