Imagine making a video about another youtuber whose content is rumor leaks that everyone disregards because of how often they are wrong. Makes you wonder, though, if AMD is just using Moore's Law Is Dead as a soft marketing channel by leaking fake stuff to him all the time.
AMD would never sacrifice its "Nvidia minus $100" premium pricing and release before Nvidia. What customers see as an opportunity for AMD, AMD sees as a loss of profit.
@@be0wulfmarshallz I came to this conclusion... at this point it's my assumption. He definitely has communication with them, as we can see, but I doubt he has many real leaks from Nvidia, because he was saying it was definitely on 3nm before Nvidia's datacenter Blackwell announcement.
@@rigf1997 The best marketing for AMD is NVIDIA's malpractice. :P I got burned three times in a row by NVIDIA; I'm done with them... My first purchase from AMD, ever... I haven't experienced such gaming bliss since over a decade ago, when Nvidia was still releasing bangers and supporting SLI. Do you have any idea how many of my setups NVIDIA has nullified, just because they were "moving on to the next thing"? NVIDIA is just the "trendy" company; if you enjoy spending money buying a new graphics card every month, that's the brand for you, lol... But if you want your purchase to actually last a few years, AMD is the way to go. The straw that broke the camel's back for me was when NVIDIA sold me a $1200 GPU that was rendered obsolete a month later when a cheaper, higher-performance GPU was released... I am never buying another NVIDIA product after that; it's the biggest insult I've ever experienced from a company. Because of that, I suddenly started looking up AMD products, lol... So yeah, AMD doesn't need better marketing, they only need to let NVIDIA dig its own grave.
"Nvidia defeats AMD" "AMD defeats Nvidia" "The AMD and Nvidia situation is crazy" "AMD/Nvidia is finally finished" "It's over AMD" "It's finally over Nvidia"
RX 8800 XT will probably be like a 4070 Ti Super in raster and a 4070 Super in RT, at $600. That makes sense, and it's not that big of a jump in performance, so it's clearly possible AMD will have a good generation with RDNA4.
@@ajjames1960 Nvidia at least delivers what it promises and has huge performance boosts with each generation. The RTX 4070 Super is basically an RTX 3090 or 3080 Ti in terms of performance, and Nvidia's machine learning and raytracing capabilities are unmatched, with AMD and Intel being 2-4 times slower or just not supporting some models.
This generation of cards has taught us that achieving generational uplift is one thing, but we are completely at the mercy of how companies give it to us. Remember Nvidia achieved a 60%+ architectural performance increase from 30 to 40 series but decided we don't deserve that power so knocked dies down a tier and bumped prices up a tier across the board. We went from "Unfortunately the 1060 doesn't quite match the 980 Ti" to "Unfortunately the 4060 Ti doesn't quite match the 3060 Ti". These 8800 XT rumours being true would constitute a normal GPU generation pre-2020.
Yea, I needed an upgrade and thought about waiting on the 50 series, but honestly, why did I think anything was gonna be different 😐 I just got my 4080S $50 off with a free game; I'm not about to deal with the OP SHINY mess and scalping BS!
But then, to be fair, the 40-series launch was ALSO pretty good. I mean, like how the 3070 at $499 beat the 2080 Ti at $1199, the RTX 4070 Super at $599 beats the RTX 3090 Ti at $1899. Charging $50-100 more for that after 3 years seems okay-ish.
@@The_one_that_got_away What are you smoking? The 3090 Ti is 21% faster than the 4070 Super. It's even 4% faster than the 4070 Ti Super at $800. On top of that, the 3090 Ti has 50% more VRAM (although 16GB is all you need for now in regards to gaming; 12GB is already borderline). The 4070 Super doesn't even beat the 3080 Ti. The 3070 at $499 beats a 2080 Ti at $999 by 4%, but with almost 30% less VRAM (remind me again how 8GB of VRAM works out in 2024). I'll take a 2080 Ti over a 3070 any day of the week.
@@Audiosan79 I was using Hardware Unboxed and Tom's Hardware as sources, and their chart puts the RTX 4070 5% above the 3090 Ti. I guess that may be because they used DLSS 3 and RT, which gives the 4070 the edge? And no, the 3090 Ti AIN'T stronger than a 4070 Ti Super; what are your sources? (Remind me how 8GB works in 2024.) Brilliantly: games like Hellblade 2, Black Myth: Wukong, any well-OPTIMISED game has no problem running 1440p or even 4K with that VRAM. "The 4070 doesn't beat the 3080 Ti"? What the hell are YOU smoking? A 6800 XT beat the 3080, and the 7800 XT, which is 8-9% better, beats the 3080 Ti. Are you telling me the 4070 Super is slower than a 7800 XT?
They actually made cards like a 7950 XT, a 7950 XTX beating the 4090, a 7990 XTX which who knows where it would have landed, and maybe a 7990 XT, but they never released them... which is a shame :(
@@yamsbeans Any sane person would bet that AMD made designs for RDNA3 GPUs that would compete with the RTX 4090. Their architects would have to be stupid to not at least consider such designs, as creating a basic outline for a design costs almost nothing, and ideas raised when coming up with a design have applications to similar future designs, even if the original version of the design never gets manufactured. AMD would inevitably have made a few prototypes for GPU SKUs which they never released as well; though it's very unlikely that those prototypes include any RDNA3 GPU which would actually outperform the RTX 4090 in practice (if they had any such working prototype, they probably would have released it, even if it ended up ridiculously expensive). AMD would have then decided that such designs/prototypes either wouldn't be commercially viable, or that they wouldn't work properly. That's how product design works. You create a load of ideas and prototypes, and you select the ones you think are best, refine them, and release them as products. The designs that were never released still exist though.
Okay, AMD has a genius plan. I have no doubt they probably do. If you were born any time before say, yesterday, you know that AMD always has a genius plan to sabotage their genius plan. If AMD drops something that can legit stomp my 7900XTX, I'll consider an upgrade. It'll have to clobber, though.
Be ready for a big disappointment on that front. AMD ain't beating RDNA3 with RDNA4; it may even be slower due to the lower number of CUs, even if they are slightly faster. The best we could hope for is an RTX 4070 Ti Super equivalent. We'll know if the part about the faster RT performance is true once the PS5 Pro gets announced and released, as that apparently has a back-ported RT engine from RDNA4.
Can people please stop coping? It won't compete on performance, and we are talking about AMD here, so they won't price it right: AMD will price them too high, get bad reviews, and slowly drop the price.
Man, imagine if AMD was actually sandbagging and trying to hit nvidia in the midrange, where they're clearly asleep at the wheel. Wouldn't that be nice.
@@steven7297 I mean, it'd be at the same price point, but I'd just pass my 7900 XT to my brother if I upgraded, and then take my 6700 XT back and sell it.
You and 43 other people do not understand how this industry works. AMD absolutely would not "tear the market up" with a $600 GPU that matches the RTX 4080. That would be a below-average and unimpressive improvement to value compared to most previous generations. Almost all previous GPU generations have provided much greater performance uplifts than RTX 4000-series and RX 7000-series. The previous generation was unusually bad and we should expect better, just compare it to the generation before that. Remember that the RX 6700 XT matched the RTX 2080 Ti at less than half the price. It was reasonably popular, but was not treated as if it was revolutionary, because it _wasn't_ revolutionary. It didn't "tear the market up" by any reasonable metric, it has more users than the RTX 2080 Ti, but still has fewer than the RX 580, RTX 3080, or RTX 4090. If the RX 8800 XT costs more than $480 and doesn't at least match the RTX 4080 Super, it's *worse* than the RX 6700 XT and will not take significant market share.
@@nathangamble125 I don't get that, y'all. The 6700 XT came out way after the 2080, which contradicts your point: this 8000 series will not be much uplift, but enough at a way lower price, the same way the 6700 XT was compared to a 2080.
@@HurricaneSparky Where would you sell your GPU? I don't want to sell the 7900 GRE, but if there will be better cards at the same price, I wouldn't see why not.
did everyone forget about what happened with zen 5 that quickly? *you will be disappointed.* Edit: this comment aged like milk. zen5 x3d chips are impressive.
Honestly though, the entire 7000 lineup from AMD had the wrong names. The 7900 XTX should have been called, at best, the 7800 XT, and this goes down their entire lineup. If they had called them what they should have, and of course priced them accordingly, the 7000 series would have been pretty damn good. Considering my views on that, I honestly hope they will not call it the 8800 XT but the 8700 XT, and price it around $450.
For those who haven't noticed: when we went from GDDR5 (1GB chips) on the RX 580 (256GB/s) to GDDR6 (2GB chips) on the 6600 XT (256GB/s), you had the same capacity and bandwidth on both cards, the main difference being the bus width halving. So even though the 6600 XT was twice as fast as the RX 580, it could have been made with GDDR5, given the respective bandwidths.

So 16GB of 20Gbps GDDR6 on a 256-bit bus (2GB chips, 640GB/s) would be roughly equal in bandwidth to 12GB of 28Gbps GDDR7 on a 192-bit bus (2GB chips, 672GB/s). Low-end 28Gbps GDDR7 would be similar to the first iteration of GDDR6 (14Gbps, 1GB chips) as seen on the 5700 XT, which had a massive bandwidth advantage of 448GB/s (256-bit bus) vs 256GB/s (128-bit bus) and 25% more shaders than the 6600 XT, yet the two cards were roughly equal in performance. Not having GDDR7 isn't a big issue at this stage.

When we see 32Gbps and 4GB chips, having 4 of those on a 128-bit bus delivering similar bandwidth to 16Gbps chips on a 256-bit bus will mean the end of 12GB-and-under cards for good, except maybe at the very low end if they're still making it (64-96 bit bus). Right now capacity is more important than bandwidth, and 16GB is the new 8GB in the latest AAA titles. The 6600 XT showed that bandwidth limitations don't mean a wall on performance.
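All the bandwidth figures in that comment follow from one formula: peak bandwidth (GB/s) = bus width in bits ÷ 8 × per-pin speed in Gbps. Here's a quick Python sketch checking the numbers as quoted above (the card/speed pairings are just the ones cited in the comment, not official spec confirmations):

```python
def bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * speed_gbps

# Configurations as stated in the comment above:
configs = {
    "RX 580  (GDDR5,  8 Gbps, 256-bit)": bandwidth_gbs(256, 8),   # 256 GB/s
    "6600 XT (GDDR6, 16 Gbps, 128-bit)": bandwidth_gbs(128, 16),  # 256 GB/s
    "GDDR6 option (20 Gbps, 256-bit)":   bandwidth_gbs(256, 20),  # 640 GB/s
    "GDDR7 option (28 Gbps, 192-bit)":   bandwidth_gbs(192, 28),  # 672 GB/s
    "5700 XT (GDDR6, 14 Gbps, 256-bit)": bandwidth_gbs(256, 14),  # 448 GB/s
}
for name, bw in configs.items():
    print(f"{name}: {bw:.0f} GB/s")
```

The 640 vs 672 GB/s comparison is why the comment argues a 192-bit GDDR7 card and a 256-bit GDDR6 card would land in roughly the same place.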
Never underestimate AMD marketing and product management's ability to put their foot in their mouth! It is not just plausible, but likely they will give it a higher in-generation tier number than it deserves, and a price so high that they will have to drop it within weeks again.
Either we get to upgrade our 6800xt and 6950xt cards to a 4080-performance card or we wait for rdna 5 like everyone has been planning for the last year or so. Guess which one it will be
RDNA 4 will be a disaster. AMD can't compete with Nvidia in the high end, and Intel will take over the lower-end market. RDNA 4 will be dead on arrival, unless AMD really releases an amazing midrange product like the 8800 XT. But they won't, as they are not able to read the room.
honestly, unless the 8800XT has absolutely insane performance and RT improvements, i'm gonna be perfectly happy with my GRE. but yea, AMD better strike now while the market is wide open (tho there's still damn 6000 cards out there, supply might be an issue)
People should stop hyping things up. The RTX 4080 is probably what the RTX 5070 is going to be, so USD 500-550 for a 70-class card is hardly anything to get excited about, and Nvidia is probably going to price the RTX 5070 about the same as the RTX 4070 Super, or increase it another USD 50 (a quick check on Newegg: the RTX 4070 Super is ~USD 600 for the most basic models). So USD 650 for an RTX 5070, with a USD 100 Nvidia tax against the AMD counterpart, seems just about the usual. The only positive thing about AMD is that they are more willing to lower prices later in the product lifecycle.
Exactly, the 5070 is likely to be on par with the 4080 if trends continue. 2070 >= 1080, 3070 >= 2080, the 4070 >= 3080, the 5070 should be >= 4080, but I suppose time will tell... either way this bump in RT performance isn't enough to catch up and be competitive in that area.
@@Yuber898 You are missing something. There were never Ti Super models like the RTX 4070 Ti Super before. That is why the RTX 5070 will have the performance of an RTX 4070 Ti Super, not an RTX 4080.
@@Yuber898 I agree with you, and to be completely fair, the RTX 4070 in terms of performance is actually more comparable to an RTX 3090. Considering that, I would like to think that if Nvidia is serious about competition, the RTX 5070 would be more like 4080 Super / 4090 D performance, and an RTX 5070 Super more like RTX 4090 performance. To me RDNA 4 sounds like the worst generation of all, after the great comeback they had with RX 6000, and even 7000. I mean, the 7900 XTX wasn't a bad card, actually the absolute best ever from AMD, even if slower than the 4090. But now coming out with a new generation and claiming that performance like a 2-year-old card's is astonishing sounds baffling to me. If all of this is true, AMD is already bloody dead and they don't know it yet.
@@bellicsson4171 I highly doubt that there is going to be RTX50X0 Super this generation especially if AMD and Nvidia are both releasing in Q1 2025. Unless RDNA4 is really just a bug fix generation and launch to satisfy shareholders and RDNA5 release in 2026, only then we would see a RTX50X0 Super to compete in 2026 and RTX60 Series in 2027. Nvidia did mention that they are going to stretch the product lifecycle to ~3 years.
@@lucazani2730 Time will tell. I don't think Intel is in a good spot atm :D Most likely, everyone will let us down, we'll get a $1500 5080 to "choose" from, and I'll stay on my old 2nd-hand 3080 forever.
@@HybOj The Ryzen moment was only possible due to Intel's issues with both architecture and the 14nm process. Nvidia would have to screw up badly to give AMD an opportunity.
Btw, something I want to mention is that we don't know what GPU the leaks are for; we just know it's most likely an 8000-series card. For all we know it could be something like an RX 8100, and/or an early engineering sample. Also, it would be better for AMD if they released their GPUs after the 5080 and the 5090, since if Nvidia thinks they have no competition, they will massively overprice the 5080 and 5090, and then AMD can swoop in and release GPUs of a similar level, if not more powerful, at a far lower price. After something like that, no one would want to buy a 5080 or a 5090, which could help quite a lot with the bias most people have for Nvidia.
considering their recent 1st party testing is misinformation, i'll keep going with used and known perf/£ versus the marketing nonsense from green and red.
I was going to pull the trigger on a 7800x3d + 7900gre to upgrade my aging 10850k + 3070 system when I can afford it. Now I'll be waiting to see if 9000x3d + 8000 series is worth it.
Same. I'm hoping for a 9800x3D or 9900x3D if they can do the cache on 8 cores. I currently have a 6600xt and a 9700k and both my parts are starting to struggle especially if I want 1440p for my screen lmao
OK, but we do need "efficiency" in the 8000 series / RDNA 4 cards. 40W+ at video playback (YouTube, Netflix, VLC, etc.) on all models from the RX 6800 and up is RIDICULOUS... Did you hear that, AMD? 🤔
They are. It's a VRAM problem with the RX 6800 and up: these cards run their VRAM at full frequency at idle. It happens when you have 2 monitors, or monitors above 60Hz, but it doesn't seem to affect everyone. I have it. When I drop the refresh rate in Windows to 60Hz instead of my 165/180, idle power draw, or with YT on, is below 10W. It is wild. They are quite efficient, but this is a software problem. I have an RX 6600, RX 5700 XT, RTX 2070S, GTX 1070, RX 6800, and some older cards like an R9 280X or a 1050 (mobile) around, and the RX 6800 is the only card that does it. Great card, though. :D VRR seems to help some people, but not in my rig. (3440x1440@165 + 1920x1080@60)
Truly a shame. I decided to sell my 7900gre just because of that, 95% of the time that my PC is on it is just "idling" and I see 30W on the desktop and 45-55W with video playback even with stock settings. On a 60hz 4K monitor!! This is pure insanity. I had this issue with the 6950xt too. And just to put 50W into perspective, my whole setup (with monitor) running rtx3070 used to draw an avg 51W (from the wall) during video playback, this card instead draws an avg of 48W alone!!
Yep. Multimonitor too. As soon as I plug anything more than a 1440p monitor to my 7800xt, the memory clock shoots up to 2425MHz and stays there the entire time. It's not as bad as previous gens where anything over 1080p 120Hz would make my 5700xt stuck at high mem clocks, but there's still a lot of work to do for AMD.
N48 means it was designed later than 40-47. Navi 40 to 43 were supposedly the rumored chiplet designs, but they didn't work out, so AMD skipped them and designed a monolithic die for the midrange, while N44 might have been monolithic from the start at the low-end tier.
Hype train -> overhype -> disappointment -> "just wait for next gen" -> repeat. How can the 8800 XT, with 7% more CUs, the same bus, and the same memory speed, suddenly be over 45% faster than the 7800 XT and match the 4080? Let's be real, at least for once.
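To make that skepticism concrete: assuming the commonly cited 60 CUs for the 7800 XT and the rumored 64 for the 8800 XT (rumor-mill figures, not confirmed specs), the back-of-envelope arithmetic looks like this:

```python
# Hypothetical/rumored specs -- not confirmed by AMD.
cus_7800xt = 60       # RDNA3 RX 7800 XT compute units
cus_8800xt = 64       # rumored RDNA4 CU count (the "7% more CUs" above)
target_uplift = 1.45  # ~45% faster than the 7800 XT, i.e. roughly 4080-class

cu_ratio = cus_8800xt / cus_7800xt      # ~1.067x from extra CUs alone
per_cu_gain = target_uplift / cu_ratio  # gain that must come from arch + clocks

print(f"CU count ratio: {cu_ratio:.3f}")
print(f"Required per-CU gain: {per_cu_gain:.2f}x")
```

In other words, roughly 36% more performance per CU would have to come purely from architectural and clock improvements, which is the gap the comment is calling implausible.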
I hope they make a properly engineered GPU cooler like the 4080 had for the 8800xt (which I doubt that they are)... Those low temps were to die for... I almost picked up a 4080 super because I heard that the 5080 and the 5090 are going back to having "smaller coolers again"... Like the 30 series cards... I don't want to run super high temps again. I also hope that they make ray tracing better for the 8800xt... If those 2 things are good I'll switch to AMD no problem
As much as I wish AMD would finally PROPERLY compete, not act like the "same, but 5% cheaper" brand, just to knock Nvidia down a peg for once... this was an almost cringe-worthy level of copium.
AMD is the only one that can sabotage this launch. NVIDIA knows this and will follow the rule of "Never interrupt your enemy when he's making mistakes".
If Nvidia or AMD can't bring slightly better performance than the 4080/7900 XTX at around or under $700, I won't be upgrading from my 3080. I'd be most interested to see if AMD can make a true budget card on par with the 5500 XT: something with 24-28 CUs and 8-12 gigs of VRAM on a 128-96 bit bus at $150 to $170. I think that's what the market truly needs to balance itself out. The budget tier has never been great, but it's gotten very unhealthy these past 2 generations. Nvidia charging $250 for the 3050, then charging $300 for what's essentially a 4050 Ti, really damaged the budget tier, and the 6500 XT didn't help either. The only bone this tier got thrown was whenever the RX 6600 went for $180.
If these performance rumours about the 8800 XT really are accurate, then there's no way it's going to cost $500-600; with that kind of performance, it's going to cost at least $750, if not higher.
Developers are forcing ray tracing into games now. Rumors for the ps5 pro showing it’ll have more than double the RT performance and also DLSS/Frame gen-like features. Developers will just start making it standardized. Don’t make the mistake of buying an expensive GPU with bad RT performance, you’ll regret it in less than 2 years time.
Frankly, AMD are on such shaky ground with their reputation, and with lying to hardware enthusiasts about performance increases, that I won't believe their RDNA 4 results until I can see them. They talk up a ~5% improvement as 15%.
Yeah, my dude, the new Ryzen 5700 (I think that's the one, anyway) was supposed to beat a 7800X3D, lol. AMD are lying dipshits of the highest calibre. Still not an Intel, though.
The long reason, imo, that it's 44 and 48: they have been wanting RDNA to be an MCM GPU like their CPUs, but were having latency issues (or something similar) doing so. I imagine that at this point in the AMD/Jim Keller comeback plan, unlike the CPUs, the GPUs (and APUs, for the record) did not go to plan. From what I understand, they also wanted RDNA 1, 2 (RDNA 1's woes plus node woes were enough for them to change course early and squeeze monolithic RDNA 2 hard), 3 (to a lesser degree), and 4 to be MCM designs, but depending on how realistic each was (complexity vs node woes vs money woes vs awareness of earlier issues and timing), the way the lineup was announced changed. They definitely would have preferred to admit RDNA 3 would be an RDNA 1-style gen; that's why they didn't repeat the mistake now.

I believe Navi 44 was the only original RDNA4 design left that could actually be put into mass production: 41 was a dual-die design; 42 a smaller, slower, cooler dual die; 43 a single-die monolithic mid-tier; 44 a relatively lower-mid-tier chip; 45-47 were scrapped points above and in between while figuring out new plans; and 48 is the pseudo mid-high part, sitting relatively higher than the 5700 XT did at launch, probably where it should have been if it also hadn't had issues with heat and power.

As Ryzen rose, Radeon raged red, leaving APUs nerfed in two ways with no reason to ever get buffed... until now (Strix Halo). Halo, when tuned, will be peak computer efficiency: being RDNA 3.5, we will get some of the important fixes, and with 40 CUs, lowered wattage, and better (though not great) memory bandwidth and latency, it can find a good balance for mobile 1080p gaming and mobile workstations.
My hope is that they hit the performance target (4080/XTX equivalent), and if it costs more than $500, then I'll just wait until it's discounted to $500 and buy it.
@@leonard8766 Picked up a used A770 from a guy who bought it in China; the AV1 encoding on Intel Quick Sync is a godsend for converting my media library with Handbrake. The only problems I have weirdly seem to happen in Unreal Engine games, where textures go missing at max settings but show up again when I lower the texture setting to low.
@@leonard8766 Because of the issues at launch, the Arc cards are underrated, but with the updates they are most likely the best-value cards around at the moment.
Also consider that NVIDIA and AMD are basically "family" rival companies, since the two CEOs are cousins. So even though they may be preparing things to beat each other, one theory is that they are just having fun competing, since they are from the same family :)
As we all know, the 7800 XT should have been a 7700 XT if we go by the 6000 generation's naming. When we see a 7700 XT beating the 6800 XT by 5%, we should expect the 8800 XT to beat the 7900 XT by 5%. So it will perform more like a 7900 XT than the XTX; don't expect it to be close to the XTX version. The price will be $599, not $499. If it is $499, then we should be very happy.
Well, ray tracing is now evolving from RTGI to RTXDI (RTX Direct Illumination) and Ray Reconstruction. Unfortunately, it's expensive enough to cause frame rate and frame timing issues on a 4090 in Star Wars Outlaws. Yes, it looks nice, and I got a glimpse of a bit of RTGI in Star Wars Jedi: Survivor's quality mode on Coruscant, but it literally turns your 4090 into a 4070/4060. And Star Wars Outlaws seems to be using SSR for water and glass; if you turned RT on for those surfaces, I'm sure it would bring the 4090 to its knees.
I'm beginning to believe that AMD is popular with a certain type of gamer because these are the same people who get addicted to the manic/depressive cycle of real-life dysfunctional relationships. Think of the women who date abusive men. AMD is like an abusive partner who lies all the time and makes promises they won't keep.
Guys, Ryzen 9000 was also overhyped, and look how it turned out... Don't expect too much from RDNA4. Also, the leak from MLID is just dumb: 64 CUs, but apparently the 8800 XT will "trade blows" with a 4080? Something doesn't seem right there; AMD would have to make a technological breakthrough to match a 4080 with just 64 CUs.
I've been involved in the MLID community for nearly 2 years, seen every podcast and every leak video, and interacted with people far smarter than I am in his Patreon Discord. From personal experience I can tell you his leaks of specs, pricing, and release dates are almost always on point, but his performance estimates are often hit or miss. Spec, pricing, and release-date details on RDNA 3, RTX 40, Intel 13th & 14th gen, and Ryzen 7000/9000 hit the nail right on the head, but performance estimates for 14th gen, Ryzen 9000, and RDNA 4 were pretty far off. Which, in his defense: Intel and AMD are basically on fire when it comes to communication between marketing and the engineers. The left hand doesn't know what the right is doing; marketing says +30% and the engineers say "no way".
AMD cycle: -rumors of insane new Nvidia flagship -rumors of AMD flagship that challenges Nvidia’s for half the price -AMDs card ends up being in the Nvidia 70 series card.
Stop lying. The 8800 XT will be an $800 card at launch, because the 4070 Ti Super is an $800 card now. AMD is not gonna give away 4080 performance for $500. Moore's Law Is Dead is the most biased AMD-fanboy channel; most of his predictions end up wrong.
It's always the same thing every time AMD is about to release new hardware. People overhype the new products based on any questionable rumor (typically from MLID and the like) and then proceed to bash the new products for not reaching their unreasonable expectations. Don't get me wrong: I'm going to criticize those companies if they release some miserable product like any more 8GB GPUs, but expecting Navi 48 to have RTX 4080 / RX 7900 XTX performance for $500 is a bit too much. Just look at the current generations and ponder whether that's really probable.
Honestly, AMD really needs what they were best known for: similar competing performance and features while UNDERCUTTING the competition's price (meaning value). AMD has kept shooting itself in the foot with its weird pricing and other marketing-team decisions. Unless they have better features (software-wise), I think they should prioritize value first, so customers can see it. I have a 7800 XT, but I got it because in my country in particular it was so much cheaper than the Nvidia competition at a similar level.
I think all these tech channels miss something in their videos. We love high-end cards because we are enthusiasts, but the vast majority of people buy mid-level cards; that's where the business is, and AMD knows it.
7:09 AMD numbers their chips by design date. They start with the biggest die, then make smaller ones for different markets and price points. So either they are numbering differently now, or they went through a lot of redesigns and are now at chip design 7: Navi 48.
The really really really juicy stuff, as you put it, is the FP8 and matrix HW capabilities and 2x RT. This will absolutely PUMP frame generation, RT and general AI capabilities of the chips, and I think flooding the market with those capabilities is what they are going for here, before they introduce some new software that depends on those.
7:55 It could be a generational skip. Maybe 48 was supposed to be the internal name for RDNA 4.5, where they were planning to fix RT performance to test it for the Navi 51 chip, but for some reason or another it was pushed up to be the RDNA 4 flagship. Navi 44 would have had a lot of focus on efficiency for the PS5 Pro / Halo stuff.
They are pushing this because the PS5 Pro and the next Xbox are right around the corner. Sony might've forced them to develop the new chip, and they're just throwing everything they have at the market.
@@suinsarbayev2191 Yeah, my thinking as well. It might also be the reason to go for a monolithic die: bugfix the fully redesigned "core" chip with RT and efficiency in mind, and then weave them in with Infinity [Fabric] mark 2 for Navi 52 or some shit.
I'm not super knowledgeable about the topic, but wouldn't it make sense to just drop the price of the 7900 XTX to around $500, considering it already surpasses the 4080 in performance? I know the 8800 XT will probably have other benefits, such as better ray tracing and maybe better power efficiency and temps, but it makes me wonder what will happen to the 7900 XTX. Will AMD simply discontinue it? What about the 7900 XT? It would need to see a price drop as well. Even the 7800 XT would have to be priced at $350-300 or below to make sense. If that's the case, then the 7800 XT could become the best-value GPU.
Yeah, AMD tends to cannibalize their product value lineup. They might drop the price of the 7000 series pre-launch to move out stock, but all signs point to the 8000 series being entirely midrange, so it's hard to say what they'll do, as it seems to be a bad situation all around for them.
Older AMD cards always stay in the marketplace at reduced prices and get sold off fairly quickly, although the RX 580 went from a high-end-type card to a midrange card and stuck it out for quite a while, as AMD didn't produce anything that performed at a similar level for a few years; they simply produced high-end wannabes (Vega only came in two varieties, and the less said about the Radeon VII, the better). Remember that the RX 5700 XT was also that gen's top card, and it was also only intended to compete with the 2070, not counting RT.
7900XTX is expensive to make. Lots of highly binned silicon, lots of memory, big heavy coolers. Just doesn't make sense to sell them at $500 new. It's much better financially to design a smaller card, with smaller and more efficient chip.
Not really. They need to protect their business unit's margin; after all, we are talking about a large company with a global presence. To do so, you either create a low-price, high-volume scenario (risky) or create a product that is cheaper for you to manufacture, with some new features (look, they propose better RT) and a feel of being fresh. Of course, you can be Nvidia and just pump margins with higher prices. But Nvidia is no longer gaming-oriented, so they don't give a F about gamers. Niche PCMR buyers will still buy high-end cards from them, but nowadays the money comes from B2B segments (AI, servers, and others).
No, because the RX 7900 XTX costs more to manufacture than the RX 8800 XT will. Navi 48 is expected to be about the same size as Navi 31's GCD and is built on a similarly-priced node (N4 vs N5), but doesn't require separate MCDs, which have their own manufacturing cost in addition to an assembly step to combine the GCD and MCDs together. The RX 8800 XT will also have 8GB less VRAM, and will likely use smaller coolers, VRMs, and PCBs.
I don't usually comment on things like this often, but I will note for any viewers that MLID is a very unreliable source for leaked information. Just a few weeks ago he was hyping up Zen 5 as up to 20% faster than Zen 4, and he has a track record of just getting things wrong. With that being said, we do know that RDNA4 GPUs will likely be quite a bit better at RT, and the RX 8800 XT should land somewhere between the 7900 XT and 7900 XTX in rasterization performance.
"Just a few weeks ago, he was hyping up Zen5 being up to 20% faster than Zen4" Yeah, and so were AMD themselves at Zen 5's announcement. MLID has sources within AMD, but when AMD doesn't even know (or if you're more cynical, lies about) how good their own CPU is, the numbers reported by leakers will inevitably also be wrong. MLID's performance data isn't very reliable, but it isn't significantly _less_ reliable than AMD's.
@@nathangamble125 A very sarcastic person might say that since Robert Hallock left AMD for Intel, Lisa has replaced him with MLID, and this YouTuber has been getting a monthly paycheck since then as their (outsourced) marketing guy.
The second number on the GPU die is when that GPU was designed: Navi 31 was finalized before Navi 32, which was finalized before Navi 33. Navi 41 and 42 were supposed to be a big deal with MCM but proved to be too much for AMD and got cancelled. Navi 44 was a work in progress as well; AMD needed a bigger GPU than Navi 44 and made Navi 48 as a plan B. It was the last GPU to be finalized, so it got the highest number.
Unlikely. The more likely situation here is a card that sits around the 4070 Ti / 4070 Ti Super while costing $550 or so as an MSRP, with a later price drop to $500, which is still insane; like, that's value to kill for. But only in gaming (with a 12-pin power connector as standard, AMD adopting Nvidia's trend-setting silliness), and then really only as good as a 4060 Ti 16GB in professional workloads, with Nvidia being better there, and with ray tracing on par with RTX 3000-series parts running their compatible software, at their rough quality. Anyone who goes around talking about RTX 4080 levels of performance is most likely living in a dream world, and ray tracing on par with Nvidia is also a dream. If it has a 20GB variant, it would be an excellent card for 3K gaming.
This! The guy has almost zero credibility and his "rumors and leaks" have been proven wrong 95% of the time. Let's be realistic: AMD's GPU department has in the past overpromised and underdelivered, and price/performance was just a little bit better than Nvidia's. Lisa Su has also prioritized CPU development and cut costs in R&D for their GPU lineup.
- RDNA1's performance was low to mid tier and prices were somewhat "okayish". Marketing also set expectations right.
- RDNA2 had a lot of potential, but production costs and price increases prevented AMD from gaining market share.
- RDNA3 underperformed/underdelivered and didn't offer good value at the beginning of the product cycle. Maybe at the launch of RDNA4, at the end of RDNA3's cycle, prices might come down to an acceptable/good value for customers.
We can expect RDNA4 to perform like RDNA1 did (low to midrange compared to Nvidia's lineup), so only the price/performance ratio will tell whether the upgrade is worth it. Also, rumors that their top-end chip will be a 4nm monolithic design are highly doubtful, because TSMC produces them, and production costs and wafer yield dictate consumer prices at the end of the day. That's why AMD went to chiplets with RDNA3 and took performance hits and high latency. RDNA4 will either be chiplets for their top lineup/halo product, or (if monolithic turns out to be true) prices will skyrocket. AMD simply can't make a monolithic GPU die on TSMC's 4nm node with the die size required to outperform the RTX 4080 at a cost of $500-600 per card. Simple physics!
Sounds way too good to be true. Even if it matched the 4080, with FSR and the lack of Ray Reconstruction it's still not a 4080. A lot of people are going to be disappointed when we see the actual performance... but let's see @@laszlozsurka8991
Good views on the topic. AMD usually gets the hand-me-downs from TSMC, which is why they're always a node behind Nvidia. I think they probably chose not to use GDDR6X because of SAM. AMD is the whole package, so they planned around the fact that they get the hand-me-downs for GPUs and created a workaround using their CPUs. The ability to still use GDDR6 is a huge cost saving. They will adopt GDDR7 maybe the release after this one. They'll never be the fastest, because they would have to outbid Nvidia, but I'm looking forward to this generation's launch too, because I also think Nvidia is overpriced.
I’m planning on getting a 7900xtx around end November. I’ll be following the release to see how it goes in terms of performance and price in comparison to the 7900xtx
I think that the reasons AMD won't use GDDR7 on this generation are twofold: 1) I'm sure that GDDR7 will be HORRIFICALLY expensive in the beginning (everything is). 2) I think that GDDR7 would be too fast for the card to properly use, like the R9 Fury with HBM1.
The only people AMD's marketing is tricking lately is themselves. You can buy the GRE for the equivalent of $399 in Poland; the 6700 XT was $479, so this needs to be $449. The AI boom may be ending, and Nvidia will hit the gaming market hard this time.
Given how Radeon used to be run, hyping things up way more than they could deliver, it would be extremely intelligent for AMD to downplay their upcoming product and surprise people. But I believe them when they say they're focused on the midrange market, even though I believe it's a mistake not to have a halo GPU of their own.
MLID exaggerates everything. It won't reach RTX 4080 performance if it only has 64 CUs at 2.9-3.2 GHz.

My calculations assume you run at stock clock frequencies, not auto boost (by auto boost I mean the frequency Adrenalin tunes in even without asking it to OC, which doesn't reflect the real/official frequency caps), those 2.6-2.7 GHz, or even the 2.8 GHz reached by those with lower energy costs / bigger pockets and luck with their silicon. From what I've noticed, most people have their RDNA3 GPUs running at 2.6 or 2.7 GHz (I personally run mine at 2.65 GHz).

For RDNA4, if the shading unit count per CU is the same as with RDNA3:
- 4096 shading units (64 CUs) @ 2.9 GHz is only 47.5 TFLOPS FP32
- 4096 shading units (64 CUs) @ 3.2 GHz equates to 52 TFLOPS FP32, which would put it right in front of the RX 7900 XT @ 2.4 GHz

To put this into perspective: the RX 7900 XT @ 2.4 GHz gets 51.8 TFLOPS, and at that clock frequency it doesn't beat the RTX 4080 (roughly 49 TFLOPS) and is on par with the RTX 4070 Ti Super, which comes in at roughly 44 TFLOPS. That would be lower than Navi 48's, but in games AMD's theoretically higher compute resources sadly don't translate as well into real-world performance. Real-world performance would be somewhere between an RX 7900 GRE and an RTX 4070 Ti Super (hopefully with a little lower power consumption than those two, as well as greater memory bandwidth).

The only way they could turn things around would be if A: their drivers make gigantic leaps forward in translating that theoretical performance into real-world performance, and B: those new dedicated RT cores actually work better, and as a drop-in replacement when running "older" games (older meaning they haven't been coded to specifically target these new RT accelerators; RDNA 2 and 3 used the TMUs for a big part of their RT pipeline).
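The back-of-the-envelope TFLOPS math in that comment can be sketched in a few lines. This is only a sketch under the commenter's stated assumptions (RDNA3's 64 shader units per CU and dual-issue FP32, i.e. 4 FP32 ops per shader per clock); peak TFLOPS don't map directly to game performance, as the comment itself notes.

```python
# Hypothetical peak-FP32 estimator, assuming RDNA3-style CUs:
# 64 shader units per CU, each doing 2 FMA ops/clock, dual-issued (x2 = 4).
def fp32_tflops(cus: int, clock_ghz: float,
                shaders_per_cu: int = 64, ops_per_clock: int = 4) -> float:
    """Peak FP32 throughput in TFLOPS = shaders * ops/clock * clock."""
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000

print(fp32_tflops(64, 2.9))  # rumored Navi 48 config at 2.9 GHz -> ~47.5
print(fp32_tflops(64, 3.2))  # at 3.2 GHz -> ~52.4
print(fp32_tflops(84, 2.4))  # RX 7900 XT at ~2.4 GHz -> ~51.6
```

These reproduce the 47.5 / 52 / 51.8 TFLOPS figures quoted in the comment (the last differs slightly because the exact boost clock is rounded).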
Haven't been excited for a GPU generation for quite a while? Did you miss the point earlier in your video where the 3070 matched performance of the 2080ti? (Granted that was 4 years ago...)
7:30 This follows the development cycle. The first chip to be made is the biggest one, so it gets the smallest number; then you start trimming, giving sequential numbers with each trim. The current numbers at AMD suggest they MAYBE did have 42 & 44, but scrapped 42 and remade it, and to avoid any internal confusion it got the next free number, so 48. I assume they use even numbers only: 2/4/6 was the initial release, where 2 would be the quad-chip design going as big as those crazy rumors went, 4 was their highest-volume design, and 6 was maybe the 8300 version. Odd numbers will most likely follow later when they fill in the gaps with mid-life products.
16GB isn't enough for 4K maxed out with frame gen on in some games. I've seen it happen with my RX 6800, mostly with Ubisoft games, and I did see it happen with some PlayStation ports. The new Star Wars Outlaws will even automatically put you in a lower texture streaming mode if it doesn't detect over 16GB of VRAM; anything under that and you will see pop-in and muddy textures no matter what your settings are. I say the more VRAM the better, but I can see 16GB becoming the minimum in the next few years.
AMD uses the slower VRAM. I've got a 4080 and never had issues at 4K. The 7900 XTX has 24GB and loses to a 4060 Ti in Black Myth: Wukong, and has no chance at 4K, let alone 1440p, with path tracing. 24GB of VRAM should be enough for 4K and 1440p, right??
@c523jw7 It loses due to optimizations favoring Nvidia. I have a 7900 XT, and without full ray tracing it was running fine at 1440p. Everybody knows Nvidia has better tech, but tbh most people want price to performance, and hopefully something comes out to fill in that gap and change the market.
@@cbz21 I agree, price to performance is great without RT; the 7900 XT is a beast. The challenge with VRAM is that RT increases its usage, so you need RT cores and base strength. Look at the 3090 as well: being a 24GB card doesn't make it a 4K card today. So it becomes a 1440p card, and 24GB is overkill there. But having more doesn't hurt, and I do think Nvidia is still scummy with their VRAM.
@nicane-9966 I wouldn't consider it mid tier but I did get one for under mid tier prices. Got mine for 500. You shouldn't need to pay 1k for a top tier card.
I think AMD may release it at that price to flood the mid-tier market with AMD cards, especially because I am sure Nvidia is going to keep getting more expensive. Basically, AMD has ceded the top tier to Nvidia, but they will flood the mid tier, which is generally the segment that makes up the majority. That could eventually force Nvidia to lower prices, or to give up on the mid tier entirely and just cater to the high end.
Yeahhhhhh, I'm not holding my breath like I have year after year anymore. Waiting all year long for a more expensive piece of hardware and a 5-10 percent boost in performance. I'm waiting to see what Black Friday offers and ordering 👊
I'm excited to see improvements in things like Ray Tracing or maybe see where AMD can close the gap in production programs like 3D modeling and video programs.
There is no shot AMD is gonna be dropping a $500 GPU with similar performance to the 7900 XTX. That would be one of the greatest generational performance leaps ever and given their recent track record of overpromising with and underdelivering... I wouldn't hold my breath for anything that exciting. Them dropping out of the high-end market with RDNA4 deeply concerns me though, that is simply going to open the door for Nvidia to mark up their own cards in those segments even more than they already have with RTX 40 series.
The latest rumour is that it'll be slower than the 7900XTX and much closer to the 7900XT. That puts it around 15-20% faster than the 7800XT, a $500 card. 20% more performance for the same money after 1 year. There's nothing crazy about this.
"That would be one of the greatest generational performance leaps ever" It objectively wouldn't. RX 6700 XT matched RTX 2080 Ti at less than half the price Vega 56 beat 980 Ti at less than 2/3 the price. R9 390 was only about 10% slower than 780 Ti at less than half the price, and then a year later the RX 480 matched it at 1/3 the price of the 780 Ti. It's completely normal for AMD's GPUs to provide 2x the performance per $ of Nvidia's high-end GPUs from the previous generation. You are simply ignorant of history if you think otherwise. A $500 RX 8800 XT which matches the RTX 4080 Super and RX 7900 XTX would not be unusual or impressive, especially considering that the RTX 4080 Super doesn't even provide better performance per $ at MSRP than the RTX 3080 did.
It should be at MINIMUM better than the 7900 XT based on past trends with AMD. It basically has to be at 7900 XTX level with better ray tracing to be a good deal at around $600-650, and even that's pushing it; it would have to be cheaper than that to be a good deal, since you can find a 7900 XTX for under $800 in some sales if you are lucky, and two years on you should get that performance for much cheaper. If AMD cannot do this, then the GPU market is finished, Nvidia will hike prices like crazy, and the future for GPUs won't be looking good.
RDNA4 had an overly ambitious chiplet-based top-end chip that got cancelled, but they kept the low end alive because they thought they could make it work on time. RDNA5 got the people from the cancelled project and is supposed to be the successor to that overly ambitious chiplet-based high-end chip. RDNA4 is the successor to the low-end chips of RDNA3, because the high end was too ambitious to get working on time. Nvidia doubled their ray-tracing triangle intersection cores in the 3000 series.
I agree, because the last Nvidia GPU without RT was the GTX 1660 Ti/1660S, and if priced right this can be the new midrange banger. This strategy might be very successful, because cutting out RT units results in a smaller die size and less cost to produce on the latest node tech.
I just hope we see the release in time for xmas sales rather than a friggen CES-launch. I can barely keep it in my pants as it is now. Also if it turns out to be a disappointment I'd prefer to know as soon as possible.
RDNA 3 finally, instead of the RDNA 2.5 type of uplift that we got with the 7000 cards. I wish they had higher-end cards to compete with the 5090 and 5080; in fact they should name Navi 48 as an 8700 XT and potentially launch 8800/8900 cards. You never know what Nvidia has prepared; the 4050 (named 4060) was powerful for its small die size, but overpriced.
This is AMD we're talking about. They never miss the opportunity to miss an opportunity
Imagine making a video on another YouTuber whose content is rumor leaks that are disregarded by everyone because of how many times they are wrong. Makes you wonder, though, if AMD is just using Moore's Law Is Dead as a soft marketing channel by leaking fake stuff to him all the time.
@@be0wulfmarshallz For no one else's sanity but my own, I'm putting this one in the 'AMD troll comments' category.
AMD would never sacrifice the -$100 Nvidia premium prices and release before Nvidia.
What customers see as an opportunity for AMD, AMD sees as a loss to profits.
@@be0wulfmarshallz I came to this conclusion... at this point it's my assumption. He definitely has communication with them, as we can see... but I doubt he has many real leaks from Nvidia, because he was saying it's definitely on 3nm before Nvidia did the datacenter Blackwell announcement.
🤣
Based on previous amd history, 900IQ moves are not likely.
Zen 5 launched like a Ferrari that came with flat tyres as standard
@@Cantatos I would say cpu market too looking at Zen 5
I really want AMD to do good and get their fcking marketing together so we get some actual competition. But they just keep failing at it.
@@rigf1997 The best marketing for AMD, is NVIDIA's malpractices. :P
I got burned three times in a row by NVIDIA, I'm done with them... My first purchase from AMD, ever... I haven't experienced such gaming bliss since over a decade ago, when Nvidia was still releasing bangers and supporting SLI.
Do you have any idea how many setups of mine NVIDIA has nullified, just because they were "moving on to the next thing"? NVIDIA is just the "trendy" company; if you enjoy spending money buying a new graphics card every month, that's the brand for you, lol... But if you want your purchase to actually last a few years, AMD is the way to go.
The straw that broke the camel's back for me was when NVIDIA sold me a $1200 GPU that was rendered obsolete after one month, when a cheaper, higher-performance GPU was released... I am never buying another Nvidia product after that; it's the biggest insult I've ever experienced from a company, ever.
Because of that, i suddenly started looking up AMD products, lol... So yeah, AMD doesn't need better marketing, they only need to allow nvidia to dig its own grave.
@@Cantatos Someone missed RDNA 2, and even RDNA 1 and the rx5XX series were pretty cool. Biggest issue I'd say is inconsistency
"Nvidia defeats AMD"
"AMD defeats Nvidia"
"The AMD and Nvidia situation is crazy"
"AMD/Nvidia is finally finished"
"It's over AMD"
"It's finally over Nvidia"
What ?
Tech Channels when AMD/Nvidia/Intel makes an oopsie:
@@AnonymousAnonymous-zp6lu 😂 Kinda like that YouTuber who made like 20 videos about MrBeast
@@AnonymousAnonymous-zp6lu
Gamer Meld titles be like:
" Nvidia empire is finally crumbling"
" It got worse for Amd"
"Nivida, is finally done"
Every year, the same story. The same high amounts of copium and then the same extreme disappointment. Under deliver, over price and over hype.
Sure but every now and then amd makes the right moves , like with early ryzen cpus and the rx 5700xt , which was the best value gpu at the time
they can't overprice something in a market led by Nvidia.
RX 8800 XT will probably be like 4070 ti super in raster and 4070 super in RT, at 600$. This makes sense and is not that big of a jump in perf so it's clearly possible AMD will have a good generation with RDNA4.
that's more like Nvidia lol, overpriced wannabe with 8GB of VRAM, then they add AI features but limit them to 8GB, lol, scam
@@ajjames1960 Nvidia at least delivers what they promise and has huge performance boosts with each generation. The RTX 4070 Super is basically like an RTX 3090 or 3080 Ti in terms of performance, and their machine learning and ray-tracing capabilities are unmatched, with AMD and Intel being 2-4 times slower or just not supporting some models.
This generation of cards has taught us that achieving generational uplift is one thing, but we are completely at the mercy of how companies give it to us. Remember Nvidia achieved a 60%+ architectural performance increase from 30 to 40 series but decided we don't deserve that power so knocked dies down a tier and bumped prices up a tier across the board. We went from "Unfortunately the 1060 doesn't quite match the 980 Ti" to "Unfortunately the 4060 Ti doesn't quite match the 3060 Ti". These 8800 XT rumours being true would constitute a normal GPU generation pre-2020.
So true 😢
Yeah, I needed an upgrade and thought about waiting on the 50 series, but honestly, why did I think anything was gonna be different 😐 I just got my 4080S for $50 off with a free game; I'm not about to deal with the OP SHINY mess and scalping BS!
But then, to be fair, the 40 series launch WAS ALSO pretty good.
I mean like how 3070 at 499 beat 2080ti at 1199
The RTX 4070 Super at $599 beats the RTX 3090 Ti at $1899.
After 3 years, charging $50-100 more for that tier seems okayish.
@@The_one_that_got_away What are you smoking? The 3090 Ti is 21% faster than the 4070 Super. It's even 4% faster than the 4070 Ti Super at $800. Also, the 3090 Ti has 50% more VRAM (although 16GB is all you need for now in regards to gaming; 12GB is borderline already). The 4070 Super doesn't even beat the 3080 Ti.
The 3070 at $499 beats a 2080 Ti at $999 by 4%, but with almost 30% less VRAM (remind me again how 8GB of VRAM works in 2024). I'll take a 2080 Ti over a 3070 any day of the week.
@@Audiosan79 I was using Hardware Unboxed and Tom's Hardware as sources, and their chart puts the RTX 4070 5% above the 3090 Ti.
I guess that may be because they used DLSS 3 and RT, which gives the 4070 the edge??
And no, the 3090 AIN'T stronger than a 4070 Ti Super. What are your sources?
(Remind me how 8GB works in 2024.) Brilliantly. Games like Hellblade 2, Black Myth: Wukong, any well-OPTIMISED game has no problem running 1440p or even 4K with that VRAM.
"4070 doesn't best 3080ti" what the hell are YOU smokin? A 6800xt beat the 3080, and 7800xt which is 8-9% better beats the 3080 ti, are you telling me the 4070 super is slower than a 7800xt?
They proved that 4080 performance is possible with the 7900xtx. They just need to price their cards at launch more competitively.
They actually made cards like a 7950 XT and 7950 XTX beating the 4090, a 7990 XTX which who knows where it would go, and maybe a 7990 XT, but they didn't release them... which is a shame :(
@@Silax1992 cuz they aren't real 😂
@@yamsbeans Any sane person would bet that AMD made designs for RDNA3 GPUs that would compete with the RTX 4090. Their architects would have to be stupid to not at least consider such designs, as creating a basic outline for a design costs almost nothing, and ideas raised when coming up with a design have applications to similar future designs, even if the original version of the design never gets manufactured. AMD would inevitably have made a few prototypes for GPU SKUs which they never released as well; though it's very unlikely that those prototypes include any RDNA3 GPU which would actually outperform the RTX 4090 in practice (if they had any such working prototype, they probably would have released it, even if it ended up ridiculously expensive).
AMD would have then decided that such designs/prototypes either wouldn't be commercially viable, or that they wouldn't work properly.
That's how product design works. You create a load of ideas and prototypes, and you select the ones you think are best, refine them, and release them as products. The designs that were never released still exist though.
@@yamsbeans Some of them have actually been benchmarked by some leakers. Of course it's understandable not to release a product because of its cost or so...
@@nathangamble125 Unreleased products don't matter; all that matters is what they put on the market.
Okay, AMD has a genius plan. I have no doubt they probably do. If you were born any time before say, yesterday, you know that AMD always has a genius plan to sabotage their genius plan. If AMD drops something that can legit stomp my 7900XTX, I'll consider an upgrade. It'll have to clobber, though.
But they don't.
yeah, they need to replace everyone on their marketing division.. :)
yep, AMDs greatest enemy is AMDs marketing division lol
If the rumors about the 8800 XT are real and it is below 600€, they are getting my money, cuz I have a 6800 XT so it's a nice upgrade.
Be ready for a big disappointment on that front. AMD ain't beating RDNA3 with RDNA4; it may even be slower due to the lower number of CUs, even if they are slightly faster. The best we could hope for is an RTX 4070 Ti Super equivalent. We will know if the part about faster RT performance is true once the PS5 Pro gets announced and released, as that apparently has a back-ported RT engine from RDNA4.
can't wait to comment "Amd never misses an opportunity when it comes to missing opportunities"
You will. And I will too. Let's just wait until Q1 2025 when they'll launch some overpriced crap.
Hopefully the replacement of many top executives at amd will make this a different story.
Can people please stop coping? It won't compete on performance, and we are talking about AMD here, so they won't price it right: AMD will price them too high, get bad reviews, and slowly drop the price.
Yeah probably
I'd probably bet on this lol, that's classic AMD
AMD's biggest enemy is sadly AMD
Unfortunately that's the most likely case; look at the 7900 XT/X.
Little column A little column B
Man, imagine if AMD was actually sandbagging and trying to hit nvidia in the midrange, where they're clearly asleep at the wheel. Wouldn't that be nice.
AMD dropping a $500-600 card on par with the 4080? I'm very doubtful, but they'd tear the market up if they did that.
*sad 7900 XT noises*
That's good, it keeps the 7000 series still good... I would be mad if they released something so much better than my 7900 GRE.
@@steven7297 I mean, it'd be at the same price point, but I'd just pass my 7900 XT to my brother if I upgraded, and then take my 6700 XT back and sell it.
You and 43 other people do not understand how this industry works. AMD absolutely would not "tear the market up" with a $600 GPU that matches the RTX 4080. That would be a below-average and unimpressive improvement to value compared to most previous generations. Almost all previous GPU generations have provided much greater performance uplifts than RTX 4000-series and RX 7000-series. The previous generation was unusually bad and we should expect better, just compare it to the generation before that.
Remember that the RX 6700 XT matched the RTX 2080 Ti at less than half the price. It was reasonably popular, but was not treated as if it was revolutionary, because it _wasn't_ revolutionary. It didn't "tear the market up" by any reasonable metric, it has more users than the RTX 2080 Ti, but still has fewer than the RX 580, RTX 3080, or RTX 4090. If the RX 8800 XT costs more than $480 and doesn't at least match the RTX 4080 Super, it's *worse* than the RX 6700 XT and will not take significant market share.
@@nathangamble125 I don't get that at all. The 6700 XT came out way after the 2080 Ti; that contradicts your point, as this 8000 series will not be much of an uplift, but enough at a way lower price, the same way the 6700 XT was compared to the 2080 Ti.
@@HurricaneSparky Where would you sell your GPU at? I don't want to sell the 7900 GRE, but if there'll be better cards at the same price, I wouldn't see why not.
It's AMD we are talking about; the only insane part possible is the disappointment level.
😬
AMD never fails to miss an opportunity. At least they are competitive.
go pay for Nvidia cards and not eat
did everyone forget about what happened with zen 5 that quickly? *you will be disappointed.*
Edit: this comment aged like milk. zen5 x3d chips are impressive.
The 7800X3D was the champ of Zen 4; doesn't look like the champ of Zen 5 has been announced yet.
@@jamesbael6255 there won't be, these chips are just bad
so go pay for Nvidia cards.
This is graphics cards, not Ryzen.
So, like, the 9800X3D is out and now Zen 5 looks much better.
Wouldn’t it be funny if the big companies watch your videos for more info
Honestly though, the entire 7000 lineup from AMD had the wrong names. The 7900 XTX should have been called, at best, a 7800 XT, and this goes down their entire lineup. If they had called them what they should have, and of course priced them accordingly, the 7000 series would have been pretty damn good. Considering my views on that, I honestly hope they will not call it the 8800 XT but the 8700 XT, and price it around $450.
That would be good but I expect $500. We'll see months from now.
For those who haven't noticed: when we went from GDDR5 (1GB chips) on the RX 580 (256GB/s) to GDDR6 (2GB chips) on the 6600 XT (256GB/s), you had the same capacity and bandwidth on both cards, the main difference being the bus width halving.
So even though the 6600XT was twice as fast as the RX580, it could have been made with GDDR5 given the respective bandwidths.
So 16GB of 20Gbps GDDR6 on a 256 bit bus (2GB chips) (640GB/s) would be roughly equal in bandwidth to 12GB (2GB chips) of GDDR7 @28Gbps on a 192 bit bus (672GB/s).
The low-end 28Gbps GDDR7 would be similar to the first iteration of GDDR6 (14Gbps, 1GB chips) as seen on the 5700 XT, which had a massive bandwidth advantage of 448GB/s (256-bit bus) vs 256GB/s (128-bit bus) and 25% more shaders over the 6600 XT, yet the two cards were roughly equal in performance.
Not having GDDR7 isn't a big issue at this stage. When we see 32Gbps and 4GB chips, then having 4 of those on a 128 bit bus delivering similar bandwidth to 16Gbps on a 256 bit bus, will see the end of 12GB and less cards for good, except maybe at the very low end if they're still making them (64-96 bit bus).
Right now capacity is more important than bandwidth and 16GB is the new 8GB in the latest AAA titles. The 6600XT showed that bandwidth limitations doesn't mean a wall on performance.
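All the bandwidth figures in this comment come from one formula: GB/s = (Gbps per pin) × (bus width in bits) / 8. A minimal sketch reproducing the numbers cited above:

```python
# Memory bandwidth from per-pin data rate and bus width.
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(20, 256))  # 640.0 GB/s - 20Gbps GDDR6 on a 256-bit bus
print(bandwidth_gbs(28, 192))  # 672.0 GB/s - 28Gbps GDDR7 on a 192-bit bus
print(bandwidth_gbs(14, 256))  # 448.0 GB/s - 5700 XT's first-gen GDDR6
print(bandwidth_gbs(32, 128))  # 512.0 GB/s - hypothetical 32Gbps GDDR7, narrow bus
print(bandwidth_gbs(16, 256))  # 512.0 GB/s - same as 16Gbps on a 256-bit bus
```

The last two lines show the comment's point: 32Gbps chips on a 128-bit bus match 16Gbps on 256-bit, so narrow buses stop being a capacity/bandwidth penalty once fast 4GB chips arrive.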
Never underestimate AMD marketing and product management's ability to put their foot in their mouth! It is not just plausible, but likely they will give it a higher in-generation tier number than it deserves, and a price so high that they will have to drop it within weeks again.
Either we get to upgrade our 6800xt and 6950xt cards to a 4080-performance card or we wait for rdna 5 like everyone has been planning for the last year or so. Guess which one it will be
RDNA4 will be a disaster. AMD can't compete with Nvidia at the high end, and Intel will take over the lower-end market. RDNA4 will be dead on arrival, unless AMD really releases an amazing midrange product like the 8800 XT. But they won't, as they are not able to read the room.
@@lucazani2730 i think this has to be the most braindead take yet which is seriously saying something lol
Hope they release a GPU that is around 499€ but faster than the 4070 Ti and 7900 XT.
Yeah I hope this is the case(might just match 7900 XT though).
Honestly, unless the 8800 XT has absolutely insane performance and RT improvements, I'm gonna be perfectly happy with my GRE. But yeah, AMD had better strike now while the market is wide open (though there are still damn 6000 cards out there, so supply might be an issue).
Idk I’ll end up probably waiting because I have a 6800xt since launch and also 5800x3d
I highly doubt it will even beat the 7900xt by that much. Its gonna be a huge disappointment like always.
I love my GRE!
@@AzSurenoyou have a great setup!
@@Manylimes GRE is great :) what I’m probably going to do is upgrade my wife’s pc she’s got a 3070 fe
People should stop hyping things up. The RTX 4080's performance is probably going to be the RTX 5070's, so USD 500-550 for a 70-class card is hardly anything to get excited about. Nvidia will probably price the RTX 5070 about the same as the RTX 4070 Super, or add another USD 50 (a quick check on Newegg: the RTX 4070 Super is ~USD 600 for the most basic models), so USD 650 for an RTX 5070 and a USD 100 Nvidia tax against the AMD counterpart seems about usual. The only positive thing about AMD is that they are more willing to lower prices later in the product lifecycle.
Thank you, I was literally going to type this up. Why is a 4080 2+ years later at 500 to 550 good?
Exactly, the 5070 is likely to be on par with the 4080 if trends continue. 2070 >= 1080, 3070 >= 2080, the 4070 >= 3080, the 5070 should be >= 4080, but I suppose time will tell... either way this bump in RT performance isn't enough to catch up and be competitive in that area.
@@Yuber898
You are missing something. There were never Ti Super models before, like the RTX 4070 Ti Super. That is why the RTX 5070 will have the performance of an RTX 4070 Ti Super, not an RTX 4080.
@@Yuber898 I agree with you, and to be completely fair the RTX 4070 in terms of performance is actually more comparable to an RTX 3090. Considering that, I would like to think that if Nvidia is serious about competition, the RTX 5070 would be more like RTX 4080 Super / 4090 D performance, and an RTX 5070 Super like RTX 4090 performance. To me, RDNA4 sounds like the worst generation of all, coming after the great comeback they had with RX 6000 and even RX 7000. I mean, the 7900 XTX wasn't a bad card; it was actually the absolute best ever from AMD, even if slower than the 4090. But now coming up with a new generation and claiming that performance like a two-year-old card's is astonishing sounds baffling to me. If all of this is true, AMD is already bloody dead and they don't know it yet.
@@bellicsson4171 I highly doubt that there is going to be an RTX 50X0 Super this generation, especially if AMD and Nvidia are both releasing in Q1 2025. Unless RDNA4 is really just a bug-fix generation launched to satisfy shareholders, with RDNA5 releasing in 2026; only then would we see an RTX 50X0 Super to compete in 2026 and an RTX 60 series in 2027. Nvidia did mention that they are going to stretch the product lifecycle to ~3 years.
Always the same. „AMD could be insane“, „nVidia is down. AMD just needs to act now“
And then AMD always drops the ball 😒
agreed but maybe they will have their Ryzen moment in GPUs at some moment, they are NOT totally incompetent :)
@@HybOj As an AMD user, I find it way more likely for Intel to shake the market with Battlemage.
@@lucazani2730 Time will tell. I don't think Intel is in a good spot atm :D Most likely all of them will let us down, we'll get a $1500 5080 to "choose" from, and I will stay on my ol' 2nd-hand 3080 forever.
@HybOj 1st gen Navi was supposed to be a ryzen moment, but it was just competitive. I say that owning 5700xt and 7800xt
@@HybOj The Ryzen moment was only possible due to Intel's issues with both architecture and the 14nm process. Nvidia would have to screw up badly to give AMD an opportunity.
Btw, something I want to mention is that we don't know which GPU the leaks are for; we just know it's most likely an 8000-series card. For all we know it could be an RX 8100 and/or an early engineering sample. Also, it would be better for AMD to release their GPUs after the 5080 and 5090, since if Nvidia thinks they have no competition, they will massively overprice the 5080 and 5090, and then AMD can swoop in and release GPUs of a similar level, if not more powerful, at a far lower price. After something like that, no one would want to buy a 5080 or a 5090, which could help quite a lot with the bias most people have for Nvidia.
Considering their recent first-party testing is misinformation, I'll keep going with used cards and known perf/£ versus the marketing nonsense from green and red.
AMD need to take the bottom and middle of the markets, because NVidia is going to price itself out of the market.
I was going to pull the trigger on a 7800x3d + 7900gre to upgrade my aging 10850k + 3070 system when I can afford it.
Now I'll be waiting to see if 9000x3d + 8000 series is worth it.
Same. I'm hoping for a 9800x3D or 9900x3D if they can do the cache on 8 cores.
I currently have a 6600 XT and a 9700K, and both of my parts are starting to struggle, especially if I want 1440p on my screen lmao
I wouldn't even call your current setup aging lol, it's better than what most people already have
Seeing the 8800xt vs 7900gre numbers... yeah I kinda doubt it.
@@todorsamardzhiev144 where have you seen these numbers?
what do you mean "aging" ?
I'm upgrading from i7-4790k to 9800x3D (got one msrp)
and a gtx 1080 to TBD.
:)
OK, but we do need "efficiency" in the 8000 series / RDNA4 cards.
40W+ during "video playback" (YouTube, Netflix/VLC etc., all models RX 6800 & up) is RIDICULOUS... Did you hear that, AMD? 🤔
Try turning on FreeSync, I heard it fixes that problem
They are. It's a VRAM problem with the RX 6800 and up, because these cards run their VRAM at full frequency at idle. It happens when you have 2 monitors or monitors above 60Hz, though it doesn't seem to affect everyone. I have it: when I drop the refresh rate in Windows to 60Hz instead of my 165/180, idle power draw, even with YT on, is below 10W.
It is wild. The cards are otherwise quite efficient; this is a software problem. I have an RX6600, RX5700XT, RTX2070S, GTX1070, RX6800 and some older cards like the R9 280X or 1050 (m) around, and the RX6800 is the only card that does it. Great card, though. :D
VRR seems to help some people, but not in my rig. (3440x1440@165 + 1920x1080@60)
Truly a shame. I decided to sell my 7900gre just because of that, 95% of the time that my PC is on it is just "idling" and I see 30W on the desktop and 45-55W with video playback even with stock settings. On a 60hz 4K monitor!! This is pure insanity. I had this issue with the 6950xt too.
And just to put 50W into perspective, my whole setup (with monitor) running rtx3070 used to draw an avg 51W (from the wall) during video playback, this card instead draws an avg of 48W alone!!
Yep. Multimonitor too. As soon as I plug anything more than a 1440p monitor to my 7800xt, the memory clock shoots up to 2425MHz and stays there the entire time. It's not as bad as previous gens where anything over 1080p 120Hz would make my 5700xt stuck at high mem clocks, but there's still a lot of work to do for AMD.
N48 being the highest number means it was designed later than 40-47. Navi 40 to 43 were probably the rumored chiplet designs that didn't work out, so they skipped them and designed a monolith for the mid range, while N44 might have been a monolith from the start for the low-end tier.
I REALLY like how AMD is still in the race, and still trying to evolve their cards! :)
It's good to see there's still competition!
Hypetrain -> overhype -> disappointment -> just wait for the next-gen -> repeat.
How can the 8800xt, with 7% more CUs, the same bus, and the same memory speed, suddenly be over 45% faster than the 7800xt and match the 4080?
Let's be real, at least for once.
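The skepticism checks out with some back-of-envelope arithmetic. A minimal sketch, with assumptions flagged: the 7800xt's ~2.43 GHz boost clock and the rumored ~3.0 GHz for the 8800xt are taken from the rumor discussion, not confirmed specs, and real performance never scales perfectly with CUs times clock.

```python
# Naive scaling estimate: performance ~ CU count * clock frequency.
# Clock figures are assumptions (7800xt boost ~2.43 GHz, rumored 8800xt ~3.0 GHz).
cu_7800xt, clk_7800xt = 60, 2.43
cu_8800xt, clk_8800xt = 64, 3.0   # rumored

naive_gain = (cu_8800xt / cu_7800xt) * (clk_8800xt / clk_7800xt) - 1
print(f"naive uplift: {naive_gain:.0%}")  # ~32%, well short of the claimed 45%
```

Even granting perfect scaling, 7% more CUs plus a generous clock bump lands around +32%, so a 45%+ gain would require a substantial per-CU architectural improvement on top.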
Imagine a 7900xtx with RT performance superior to the current gen of Intel/AMD cards.
I hope they make a properly engineered GPU cooler for the 8800xt like the 4080 had (which I doubt they will)... Those low temps were to die for... I almost picked up a 4080 Super because I heard that the 5080 and 5090 are going back to "smaller coolers again", like the 30-series cards... I don't want to run super high temps again.
I also hope they make ray tracing better on the 8800xt... If those 2 things are good I'll switch to AMD no problem
As much as I wish AMD would finally PROPERLY compete instead of acting like the "same, but 5% cheaper" brother, just to knock Nvidia down a peg for once... this was almost cringe levels of copium.
AMD is the only one that can sabotage this launch.
NVIDIA knows this and will follow the rule of "Never interrupt your enemy when he's making mistakes".
If Nvidia or AMD can't bring slightly better performance than the 4080/7900xtx around or under $700, I won't be upgrading from my 3080. I'd be most interested to see if AMD can make a true budget card on par with the 5500xt: something with 24-28 CUs and 8-12 gigs of VRAM on a 128- or 96-bit bus at $150 to $170. I think that's what the market truly needs to balance itself out. The budget tier has never been great, but it's gotten very unhealthy these past 2 generations. Nvidia charging $250 for the 3050, then charging $300 for what's essentially a 4050 Ti, really damaged the budget tier, and the 6500xt didn't help either. The only bone this tier got thrown was whenever the RX 6600 went for $180.
If these performance rumours about the 8800xt really are accurate, then there's no way it's going to cost $500-600. With that kind of performance it's going to cost at least $750 if not higher.
I dont care about raytracing.
Give me a GTX 4080 16GB at $500 and I am happy.
ryzen 4070 is better
@@Vewsnah bro intel 7900xt is superior 😂😂😂
There is no going back, your future is ray-traced, and you will be made to care. Have a nice day :)
Developers are forcing ray tracing into games now. Rumors for the ps5 pro showing it’ll have more than double the RT performance and also DLSS/Frame gen-like features. Developers will just start making it standardized. Don’t make the mistake of buying an expensive GPU with bad RT performance, you’ll regret it in less than 2 years time.
@@xblur17 so you're saying AMD just made RT and image upscaling a must-have feature by creating the PS5 Pro chip?
Frankly, AMD are on such shaky ground with their reputation, lying to hardware enthusiasts about performance increases, that I won't believe their RDNA 4 results until I can see them. They talk up a ~5% improvement as 15%.
Yeah my dude, the new Ryzen 5700 (I think that's the one anyway) was supposed to beat a 7800x3d lol, AMD are lying dipshits of the highest calibre. Still not an Intel though.
Zen 5. More like Zen 5%
@@lucazani2730 yeah Intel did the same from gen 3 to 7, AMD can flop once, we can respect it. and would be half a flop if 9000 X3D are good.
@@nicane-9966 not denying that, absolutely. I'm actually an amd user
that's why it's called a rumor...
The best part of this video is that it informed me a MLID video somehow didn't make my feed and I'd need to go and deliberately look for it.
The long reason imo it's 44 and 48 is that they've wanted RDNA to be an MCM GPU like their CPUs, but were having issues doing so without latency or similar problems. I imagine that at this point in the AMD/Jim Keller comeback plan, unlike the CPUs, the GPUs (and APUs for the record) did not go to plan. From what I understand they also wanted RDNA 1, 2 (RDNA 1 plus node woes was enough for them to change course early and squeeze monolithic RDNA 2 hard), 3 (to a lesser degree), and 4 to be MCM designs, but how realistic each was (complexity vs node woes vs money woes vs timely awareness of earlier issues) changed how the lineup was announced. They definitely would have preferred to say RDNA 3 would be an RDNA 1-style gen; that's why they didn't miss it now. I believe Navi 44 was the only original RDNA 4 design left to actually be put into mass production. 41 was the dual-die part; 42 a smaller, slower, cooler dual-die part; 43 a single monolithic mid-tier die; 44 a relatively lower mid-tier chip; 45-47 were scrapped points above and in between while they figured out new plans; and 48 is the pseudo mid-high part, sitting relatively higher than the 5700xt did at launch, probably where it should've been if it also didn't have issues with heat and power. As Ryzen rose, Radeon raged red, leaving APUs nerfed in two ways with no reason to ever get buffed... until now (Strix Halo). Halo, when tuned, will be peak computer efficiency. Being RDNA 3.5 it will get some of the important fixes, and with 40 CUs, lowered wattage, and better (but not great) memory bandwidth and latency, it can find a good balance for mobile 1080p gaming and mobile workstations.
Expecting the RX 8800 XT to perform like a 4080 is insane; at best it will be close to a 4070 Ti Super.
For $500 even that's really good.
@@laszlozsurka8991 Yeah I agree and it would encourage me to buy one.
My hope is that they hit the performance target (4080/XTX equivalent), and if it costs more than $500, then I'll just wait until it's discounted to $500 and buy it.
Literally just got my 7900xt GPU on sale for $650. Upgraded from a 6700xt. I think I’m good
Intel's battlemage might also be interesting, especially if it is released this year.
Picked up the 750 on eBay for £120. Honestly it's flipping amazing at that price. RE3 4K 90+ atm. Like what the hell. 😅
@@leonard8766 Picked up a used A770 from a guy that bought it in China. The AV1 encoding on Intel Quick Sync is a godsend for converting my media library using Handbrake. The only problems I have weirdly seem to happen in Unreal Engine games, where textures go missing on max settings but show up again when I lower the texture setting to low.
@@leonard8766 Because of the issues at launch the Arc cards are underrated, but with the updates they are most likely the best value cards around at the moment.
Intel battlemage will take over the lower end market, since both green and red companies are ignoring that segment
@@lucazani2730 similar at the high end to a 4080 apparently. In any case it will be interesting.
25:38 Dude. I feel ya. You're a trooper for doing it again anyway. This was a great vid.
Also consider that NVIDIA and AMD are basically "family" rival companies, since the CEOs of the two companies are cousins. So even though they may be preparing things to beat each other, a theory would be that they are just having fun competing since they are from the same family :)
and to top that off, Jensen "Leather Jacketman" Huang used to work at AMD 🤣
I heard they actually despise each other.
As we all know, the 7800xt should have been a 7700xt if we look at the 6000 generation.
When we see the 7700xt beating the 6800xt by 5%, we should expect the 8800xt to beat the 7900xt by 5%.
So it will perform more like a 7900xt than the xtx.
Don't expect it to be close to the xtx version.
Price will be $599, not $499. If it's gonna be $499, then we should be very happy.
Well, ray tracing is now evolving from RTGI to RTXDI (direct illumination) and Ray Reconstruction. Unfortunately, it's expensive enough to cause frame rate and frame timing issues on a 4090 in Star Wars Outlaws.
Yes, it looks nice and I got a glimpse of a bit of RTGI in Star Wars Survivor Quality Mode on Coruscant but it literally transforms your 4090 into a 4070/4060.
And Star Wars Outlaws seems to be using SSR for water and glass. If you turn it on for those features, I'm sure it would bring the 4090 to its knees.
I'm beginning to believe that AMD is popular with a certain type of gamer because these are the same people who get addicted to the manic/depressive cycle of real-life dysfunctional relationships. Think of the women who date abusive men. AMD is like an abusive partner who lies all the time and makes promises they won't keep.
Guys, Ryzen 9000 was also over hyped and look how it turned out... Don't expect too much from RDNA4. Also the leak from MLID is just dumb, 64 CUs but apparently the 8800 XT will "trade blows" with a 4080? Something doesn't seem right there, AMD would have to make a technological innovation to match a 4080 with just 64 CUs.
I've been involved in the MLID community for nearly 2 years, seen every podcast, every leak video, and interacted with ppl far smarter than I in his patreon discord.
From personal experience I can tell you his leaks of specs, pricing, and release date are almost always on point
But his performance estimates are often hit or miss
Specs, pricing, and release date details on RDNA 3, RTX 40, Intel 13th & 14th, & Ryzen 7000/9000 hit the nail right on the head
But performance estimates for 14th gen, R9000, & RDNA 4 were pretty far off
Which in his defense Intel and AMD are basically on fire when it comes to communication between marketing and the engineers
The left hand doesn't know what the right is doing
Marketing says +30% and engineers say "no way"
AMD cycle:
-rumors of insane new Nvidia flagship
-rumors of AMD flagship that challenges Nvidia’s for half the price
-AMD's card ends up competing with Nvidia's 70-series card.
stop lying. 8800XT will be an $800 card at launch because 4070Ti Super is an $800 card now. AMD not gonna give away 4080 performance for $500, mooreslawisdead is the most biased AMD fanboi channel. most of his predictions end up wrong
It's always the same thing everytime AMD is about to release new hardware.
People overhype the new products based on any questionable rumor (typically from MLID and the likes) and then proceed to bash the new products for not reaching their unreasonable expectations.
Don't get me wrong, I'm going to criticize those companies if they release some miserable product like more 8GB GPUs, but expecting Navi 48 to have RTX 4080/RX 7900 XTX performance for $500 is a bit too much. Just look at the current generations and ponder whether that's really probable.
Is that zoom in effect you do on webpages edited into the video or are you using some zoom in program?
I just bought a 7800xt, coming from a 2080ti. Somehow I'm not super worried about being super outclassed next year, but who knows
you're not slick with that Risk of Rain music vex, man has taste
Honestly, AMD really needs to do what they were best known for: similar competing performance and features while UNDERCUTTING the competition's price (meaning value). AMD kept shooting itself in the foot with weird pricing and other marketing team decisions. Unless they have better features (software-wise), I think they should prioritize value first for customers to see.
I have 7800 XT, but I got that because in my country in particular, it was so much cheaper than the nVidia competition (on the similar level).
I think all these tech people miss something in their videos. We love high-end cards because we are enthusiasts, but the vast majority of people buy mid-level cards; that's where the business is, and AMD knows it.
7:09 AMD numbers their chips by design date. They start with the biggest die, then make smaller ones for different markets and price points. So either they are numbering differently now, or they went through a lot of redesigns and are now at chip design 7: Navi 48.
The really really really juicy stuff, as you put it, is the FP8 and matrix HW capabilities and 2x RT. This will absolutely PUMP frame generation, RT and general AI capabilities of the chips, and I think flooding the market with those capabilities is what they are going for here, before they introduce some new software that depends on those.
after the 9000 release, I will believe them when they say don't expect something super powerful.
7:55 it could be a generational skip. Maybe 48 was supposed to be the internal name for RDNA 4.5, where they were planning to fix RT performance to test it for the 51 chip, but for some reason or another it got pushed up to be the RDNA 4 flagship. The 44 would have a lot of focus on efficiency for the PS Pro/Halo BS.
They are pushing this because the PS5 Pro and the next Xbox is right around the corner, Sony might've forced them to develop the new chip and they are just throwing everything they have to the market
@@suinsarbayev2191 ye, my thinking as well. It also might be the reason to go for a monolithic die: bugfix the fully redesigned "core" chip with RT and efficiency in mind, and then weave them in with Infinity mark 2 for Navi 52 or some shit
My goat vex back at it again with the reporting. Thanks for making this entertaining as always.
I’m not super knowledgeable about the topic but wouldn’t it make sense to just drop the price of the 7900XTX to around $500? Considering it already surpasses the 4080 in performance
I know that the 8800XT will probably have other benefits such as better ray tracing and maybe better power efficiency and temps, but it makes me wonder what will happen to the 7900XTX. Will AMD simply discontinue it? What about the 7900XT? It would need to see a price drop as well. Even the 7800XT would have to be priced at $350-300 or below to make sense. If that's the case then the 7800XT could become the best value GPU
Yeah, AMD tends to cannibalize their product lineup's value. They might drop the price of the 7000 series pre-launch to move out stock, but all signs point to the 8000 series being entirely midrange, so it's hard to say what they'll do; it seems to be a bad situation all around for them.
Older AMD cards always stay in the marketplace at reduced prices and get sold off fairly quickly, although the RX580 went from a high-end-type card to a mid-range card and stuck it out for quite a while, since AMD didn't produce anything at a similar performance level for a few years; they simply produced high-end wannabes (Vega only came in two varieties, and the less said about the Radeon VII the better). Remember the RX5700XT was also that gen's top card and was also only intended to compete with the 2070, not counting RT.
7900XTX is expensive to make. Lots of highly binned silicon, lots of memory, big heavy coolers. Just doesn't make sense to sell them at $500 new. It's much better financially to design a smaller card, with smaller and more efficient chip.
Not really. They need to protect their business unit margin - after all we are talking about a large company with a global presence. So to do so you need to either create low price - high volume scenario (risky) or create cheaper product (for you to manufacture) with some new features (look they propose better RT) and a feel of being fresh.
Of course you can be Nvidia and just pump margins with higher prices. But Nvidia is no longer gaming-oriented, so they give a F about gamers. Niche buyers from PCMR will still buy high-end cards from them, but nowadays the money comes from B2B segments (AI, servers, and others).
No, because the RX 7900 XTX costs more to manufacture than the RX 8800 XT will.
Navi 48 is expected to be about the same size as Navi 31's GCD and is built on a similarly-priced node (N4 vs N5), but doesn't require separate MCDs, which have their own manufacturing cost in addition to an assembly step to combine the GCD and MCDs together. The RX 8800 XT will also have 8GB less VRAM, and will likely use smaller coolers, VRMs, and PCBs.
As the saying goes, there are no bad products, just bad pricing.
If they can get rdna 3 performance at significantly less price, I'll call that a win.
I don't usually comment on things like this, but I will note to any viewers that MLID is a very unreliable source of leaked information. Just a few weeks ago he was hyping up Zen5 as being up to 20% faster than Zen4, and he has a track record of just getting things wrong. With that being said, we do know that RDNA4 GPUs will likely be quite a bit better at RT, and the RX 8800XT should land somewhere between the 7900XT and 7900XTX in rasterization performance.
"Just a few weeks ago, he was hyping up Zen5 being up to 20% faster than Zen4"
Yeah, and so were AMD themselves at Zen 5's announcement.
MLID has sources within AMD, but when AMD doesn't even know (or if you're more cynical, lies about) how good their own CPU is, the numbers reported by leakers will inevitably also be wrong.
MLID's performance data isn't very reliable, but it isn't significantly _less_ reliable than AMD's.
@@nathangamble125 A very sarcastic person might say that since Robert Hallock left AMD for Intel, Lisa has replaced him with MLID, and this YouTuber has been getting a monthly paycheck since then as their (outsourced) marketing guy.
the second number on the GPU die is when that GPU was designed.
NAVI31 was finalized before NAVI32, which was finalized before NAVI33.
Navi 41 and 42 were supposed to be a big deal with MCM but proved to be too much for AMD and got canceled. NAVI44 was a work in progress as well, but AMD needed a bigger GPU than NAVI44 and made NAVI48 as a plan B. It was the last GPU to be finalized, so it got the highest number.
Watching this with my newly purchased RX 7900 XT in my hand (just pulled out of the box) 🍿
unlikely. The more likely situation here is a card that sits around the 4070ti/4070ti Super while costing $550 or so as an MSRP, with a later price drop to $500, which is still insane; like, that's value to kill for. But only in gaming, and with a 12-pin power connector as standard, AMD adopting Nvidia's trend-setting silliness.
And then it would really only be as good as a 4060ti 16gb in professional workloads, with Nvidia being better there, and ray tracing on par with RTX 3000-series parts running their compatible software, at their rough quality.
Anyone who goes around talking about RTX 4080 levels of performance is most likely living in a dream world, and ray tracing on par with Nvidia is also a dream. If it gets a 20gb variant it would be an excellent card for 3K gaming.
Moore's Law is Dead has never hit the mark with his AMD speculations
This!
The guy has almost zero credibility and his "rumors and leaks" were proven wrong 95% of the time.
Let's be realistic:
The GPU department at AMD has in the past overpromised, underdelivered, and offered price/performance just a little bit better than Nvidia's. Lisa Su has also prioritized CPU development and cut costs in R&D for their GPU lineup.
- Performance of RDNA1 was low to mid tier and prices were somewhat "okayish". Marketing also set expectations right.
- RDNA2 had a lot of potential, but production costs and price increases prevented AMD from gaining market share.
- RDNA3 underperformed/underdelivered and didn't give good value at the beginning of the product cycle.
Maybe at the launch of RDNA4 and the end of RDNA3 the prices might come down to an acceptable/good value for customers.
We can expect RDNA4 to perform like RDNA1 (low to midrange in comparison to Nvidia's lineup), and therefore only the price/performance ratio will tell if the upgrade is worth it.
Also, rumors that their top-end chip will be a 4nm monolithic design are highly doubtful, because TSMC produces them, and production costs and wafer yields dictate consumer prices at the end of the day. That's why AMD went to chiplets with RDNA3 and took performance hits and high latency.
RDNA4 will either be chiplets for their top lineup/halo product, or (if monolithic turns out to be true) the prices will skyrocket.
AMD simply can't make a monolithic GPU die on TSMC's 4nm node, with the die size required to outperform the RTX 4080, at a cost of $500-600 per card. Simple physics!
What clickbait. More than 4080 performance for $500? In what universe, please.
Also with only 64 CUs... I call BS
Sounds way too good to be true. Even if it matched the 4080
...FSR and the lack of ray reconstruction still make it not a 4080. A lot of people are gonna be disappointed when we see the actual performance... but let's see @@laszlozsurka8991
Yeah I expect around 7900 XT performance.
How do AMD fans feel about AMD giving some RT performance?
I thought AMD users don't care about ray tracing or upscalers??
I personally do like Ray tracing very much and I would like to try it but Nvidia cards are too expensive. It's as simple as that
Good views on the topic. AMD usually gets the hand-me-downs from TSMC, which is why they're always a node behind Nvidia. I think they probably chose not to use GDDR6X because of SAM: AMD sells the whole package, so they planned around getting the hand-me-downs for GPUs and created a workaround using their CPUs. The ability to keep using GDDR6 is a huge cost saving. They'll adopt GDDR7 maybe the release after this one. They'll never be the fastest, because they would have to outbid Nvidia, but I'm looking forward to this generation's launch too, because I also think Nvidia is overpriced.
I’m planning on getting a 7900xtx around end November. I’ll be following the release to see how it goes in terms of performance and price in comparison to the 7900xtx
I think that the reasons AMD won't use GDDR7 on this generation are twofold:
1) I'm sure that GDDR7 will be HORRIFICALLY expensive in the beginning (everything is).
2) I think that GDDR7 will be too fast for the card to properly use, like an R9 Fury with HBM1.
The only ones AMD marketing is tricking lately are themselves. You can buy the GRE for the equivalent of $399 in Poland (the 6700xt was $479), so this needs to be $449. The AI boom may be ending, and Nvidia will hit the gaming market hard this time.
Where, Allegro? I don't know much about other Polish marketplaces.
A 7900 GRE for $399?? tell me where xD
@@bezly8867 It was 2200 zł on Pepper; converting at the ridiculous exchange rate, that comes out to exactly $399.
Given how Radeon used to be run, hyping things up way more than they could deliver, it would be extremely intelligent for AMD to downplay their upcoming product and surprise people. But I believe them when they say they're focused on the mid-range market, even though I think it's a mistake not to have a halo GPU of their own.
This Nvidia beats AMD, AMD beats Nvidia stuff looks like
I drink water, I piss
I drink water, I piss
This place is a prison
The natural evolution of hardware.
Nothing too unexpected.
The 7800XT traded blows with the 4070 Super, so why wouldn't its successor trade blows with the 4080 two years later?
You should do another video on Lossless Scaling, it's been updated to go up to 4x; also I'm pretty sure the quality of 2x has improved
MLID exaggerates everything.
It won't reach RTX 4080 performance if it only has 64 CUs at 2.9-3.2 GHz.
My calculations assume you run at stock clock frequencies, not the auto boost (by auto boost I mean the frequency Adrenalin tunes in even without asking it to OC, which doesn't reflect the real/official frequency caps), nor the 2.6-2.7 GHz most people run, or even the 2.8 GHz that those with lower energy costs / bigger pockets and lucky silicon can hit.
From what I've noticed, most people have their RDNA3 GPUs running at 2.6 or 2.7 GHz (I personally run mine at 2.65 GHz).
RDNA4:
If the shading unit count per CU is the same as with RDNA3:
4096 shading units (64 CUs) @ 2.9 GHz is only 47.5 TFLOPS FP32.
4096 shading units (64 CUs) @ 3.2 GHz equates to 52 TFLOPS FP32, which would put it right in front of the RX 7900XT @ 2.4 GHz.
To put this into perspective:
With the RX 7900XT @ 2.4 GHz you get 51.8 TFLOPS, and at that clock frequency it doesn't beat the RTX 4080 (roughly 49 TFLOPS); it's on par with the RTX 4070 Ti Super, which comes in at roughly 44 TFLOPS. That would be lower than Navi 48's, but in games AMD's theoretically higher compute resources sadly don't translate as well into real-world performance. Real-world performance would be somewhere between an RX 7900GRE and an RTX 4070 Ti Super (hopefully with a little lower power consumption than those two, as well as greater memory bandwidth).
The only way they could turn things around would be if A: their drivers make gigantic leaps forward in translating that theoretical performance into real-world performance, and B: those new dedicated RT cores actually work better and act as a drop-in replacement when running "older" games (older meaning they haven't been coded to specifically target these new RT accelerators; RDNA 2 and 3 used the TMUs for a big part of their RT pipeline).
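The TFLOPS figures in that comment can be reproduced with a few lines of Python. A sketch, with assumptions flagged: the Navi 48 shader counts and clocks are the rumored values being discussed, RDNA3-style dual-issue FP32 is counted at its full theoretical rate (which games rarely achieve), and the Nvidia clocks are the official boost clocks from memory.

```python
# Theoretical FP32 throughput: shaders * ops per shader-clock * clock (GHz) / 1000.
# 2 ops/clock for a plain FMA pipe; 4 for RDNA3's dual-issue (counted at full rate).
def tflops_fp32(shaders: int, ghz: float, ops_per_clock: int = 2) -> float:
    return shaders * ops_per_clock * ghz / 1000

# RDNA3 (and presumably RDNA4) dual-issue FP32 -> 4 ops per shader per clock.
print(tflops_fp32(4096, 2.9, 4))  # rumored Navi 48, 64 CU      -> ~47.5
print(tflops_fp32(4096, 3.2, 4))  # rumored Navi 48, boosted    -> ~52.4
print(tflops_fp32(5376, 2.4, 4))  # RX 7900 XT, 84 CU @ 2.4 GHz -> ~51.6
# Ada has no dual-issue -> 2 ops per shader per clock.
print(tflops_fp32(9728, 2.51))    # RTX 4080                    -> ~48.8
print(tflops_fp32(8448, 2.61))    # RTX 4070 Ti Super           -> ~44.1
```

The numbers match the comment's figures, which supports its core point: on paper the rumored chip sits near the 7900XT, and paper TFLOPS have historically flattered AMD versus measured game performance.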
Battlemage finna pop off
Haven't been excited for a GPU generation for quite a while? Did you miss the point earlier in your video where the 3070 matched performance of the 2080ti? (Granted that was 4 years ago...)
7:30 this follows development cycle.
first to be made is biggest one, so smallest number.
then you start trimming, making sequential numbers with each trim.
current numbering at AMD suggests they MAYBE did have 42 & 44 but scrapped 42 and remade it; to avoid any confusion internally, the remake got the next number, so 48.
I assume they initially use even numbers only: 2/4/6 for the initial release, where 2 would be the quad-chip design going as big as those crazy rumors went, 4 their highest-volume design, and 6 maybe the 8300 version.
Odd numbers will most likely follow later when they fill in the gaps with mid-life products.
16GB isn't enough for 4K maxed out with frame gen on in some games. I've seen it happen with my RX 6800, mostly with Ubisoft games, though I did see it with some PlayStation ports too. The new Star Wars Outlaws will even automatically put you in a lower texture streaming mode if it doesn't detect over 16GB of VRAM; anything under that and you'll see muddy pop-in textures no matter what your settings are. I say the more VRAM the better, and I can see 16GB becoming the minimum within the next few years.
AMD uses the slower VRAM. I've got a 4080 and never had issues at 4K. The 7900xtx has 24GB and loses to a 4060ti in Black Myth: Wukong, with no chance at 4K, let alone 1440p, with PT. 24GB of VRAM should be enough for 4K and 1440p, right??
@c523jw7 It loses due to optimizations favoring Nvidia. I have a 7900xt, and without full ray tracing it was running fine at 1440p. Everybody knows Nvidia has better tech, but tbh most people want price to performance, and hopefully something comes out to fill that gap and change the market.
@@cbz21 I agree price to performance is great without rt. 7900xt is a beast.
The challenge with VRAM is that RT increases usage, so you need RT cores and base strength too. Look at the 3090: being a 24GB card doesn't make it a 4K card today, so it becomes a 1440p card, and 24GB is overkill there. But having more doesn't hurt, and I do think Nvidia are still scummy with their VRAM.
this is a mid-range card, not 4K...
@nicane-9966 I wouldn't consider it mid tier but I did get one for under mid tier prices. Got mine for 500. You shouldn't need to pay 1k for a top tier card.
i think AMD may release it at that price to flood the mid-tier market with AMD cards, especially because I'm sure Nvidia is going to keep getting more expensive.
Basically AMD has given Nvidia the top tier, but they will flood the mid-tier market, which is generally the one that makes up the majority of sales.
Meaning it could eventually force Nvidia to lower prices, or give up on the mid tier entirely and just cater to the high tier.
Yeahhhhhh I’m not holding my breath like I have year after year anymore. All year long for a more expensive piece of hardware and a 5-10 percent boost in performance. Waiting to see what Black Friday offers and ordering 👊
They actually have a chance to gut NVidia by delivering value midrange cards that you can cool cheaply. I hope they don’t screw it up.
I'm excited to see improvements in things like Ray Tracing or maybe see where AMD can close the gap in production programs like 3D modeling and video programs.
There is no shot AMD is dropping a $500 GPU with performance similar to the 7900 XTX. That would be one of the greatest generational performance leaps ever, and given their recent track record of overpromising and underdelivering... I wouldn't hold my breath for anything that exciting.
Them dropping out of the high-end market with RDNA4 deeply concerns me though, that is simply going to open the door for Nvidia to mark up their own cards in those segments even more than they already have with RTX 40 series.
The latest rumour is that it'll be slower than the 7900XTX and much closer to the 7900XT. That puts it around 15-20% faster than the 7800XT, a $500 card.
20% more performance for the same money after 1 year. There's nothing crazy about this.
"That would be one of the greatest generational performance leaps ever"
It objectively wouldn't.
RX 6700 XT matched RTX 2080 Ti at less than half the price
Vega 56 beat 980 Ti at less than 2/3 the price.
R9 390 was only about 10% slower than 780 Ti at less than half the price, and then a year later the RX 480 matched it at 1/3 the price of the 780 Ti.
It's completely normal for AMD's GPUs to provide 2x the performance per $ of Nvidia's high-end GPUs from the previous generation. You are simply ignorant of history if you think otherwise. A $500 RX 8800 XT which matches the RTX 4080 Super and RX 7900 XTX would not be unusual or impressive, especially considering that the RTX 4080 Super doesn't even provide better performance per $ at MSRP than the RTX 3080 did.
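The "2x the performance per dollar" claim in that reply can be checked with quick arithmetic. A sketch under assumptions: the launch MSRPs below are recalled figures (treat the exact dollar amounts as approximate), "matched" is taken to mean equal performance, and the 8800 XT entry uses the rumored $500 price against the 4080 Super's $999 MSRP.

```python
# Perf-per-dollar ratio of the newer card vs the older high-end card it matched.
# With equal performance assumed, the ratio reduces to old_price / new_price.
# Prices are assumed launch MSRPs, not verified figures.
examples = {
    "RX 6700 XT vs RTX 2080 Ti": (479, 999),
    "Vega 56 vs GTX 980 Ti":     (399, 649),
    "RX 480 vs GTX 780 Ti":      (239, 699),
    "RX 8800 XT vs 4080 Super":  (500, 999),  # rumored price vs MSRP
}
for name, (new_price, old_price) in examples.items():
    print(f"{name}: {old_price / new_price:.2f}x perf per dollar")
```

Under these assumptions the historical examples land between roughly 1.6x and 2.9x, so a $500 card matching a $999 one (about 2.0x) would indeed sit inside the normal historical range rather than being unprecedented.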
Based off past trends with AMD, it should at MINIMUM be better than the 7900xt. It basically has to be 7900xtx level with better ray tracing to be a good deal at around $600-650, and even that's pushing it; it would have to be cheaper than that to be truly good value, as you can find a 7900xtx for under $800 in some sales if you're lucky, and 2 years on you should get that performance for much cheaper. If AMD cannot do this then the GPU market is finished, Nvidia will hike prices like crazy, and the future for GPUs won't look good.
RDNA 4 had an overly ambitious chiplet-based top-end chip that got cancelled, but they kept the low end alive because they thought they could make it work on time. RDNA 5 got the people from the cancelled project, and it's supposed to be the successor to that overly ambitious chiplet-based high-end chip. RDNA 4 is the successor to RDNA 3's low-end chips, because the high end was too ambitious to get working on time.
Nvidia doubled their ray-tracing triangle intersection cores in 3000 series.
They will surely keep trying with chiplets, but they might skip a generation until they perfect it.
What would be nice is a GPU without RT cores to keep it cheaper, but at 4080 power.
I agree, because the last Nvidia GPU without RT was the GTX 1660Ti/1660S, and if priced right this could be the new midrange banger. The strategy might be very successful, because cutting out the RT units results in a smaller die and less cost to produce on the latest node tech.
I just hope we see the release in time for xmas sales rather than a friggen CES-launch. I can barely keep it in my pants as it is now. Also if it turns out to be a disappointment I'd prefer to know as soon as possible.
I love how there's a hollow knight song in every video lol.
RDNA 2 was such a great architecture. I used to regret buying Radeon. Now I really don't see a reason to get anything else.
A real RDNA 3 finally, instead of the RDNA 2.5-type uplift we got with the 7000 cards
I wish they had higher-end cards to compete with the 5090 and 5080.
In fact they should name NAVI 48 an 8700XT and potentially launch 8800/8900 cards.
You never know what Nvidia has prepared; the 4050 (named the 4060) was powerful for its small die size, but overpriced.
My local retailer has brand new 7800XTs going at $480 usd after tax right now.