[SPON: Use "brokensilicon“ at CDKeyOffer’s Black Friday Sale to get Win 11 Pro for $23: www.cdkeyoffer.com/cko/Moore11 ] [SPON: Use "brokensilicon" for 6% OFF at Silver Knight PCs: www.silverknightpcs.com/ ]
@@ruseruser2227 they SAY the 4060 had historically low sales, but it's at the top of the Steam hardware survey with the 2060 and 3060. Honestly, when it was selling for 280$ with game bundles, it wasn't a bad deal. People were so overwhelmingly pissed off about the 8gb of vram that they failed to recognize it was 10% cheaper and 20% faster than the outgoing card. That's not "tHe wOrSt CaRd eVEr", it's just disappointing considering the 50% perf/watt uplift we saw in the 3080 to 4080 upgrade. They definitely should have offered a 16gb version of the 4060 for 350$, though, instead of the 500$ 4060ti we got 🤦🏻‍♂️.
I'd be interested in how something like Zen 5c with V-Cache would perform. If it can be much cheaper, since dense chiplets are smaller, while giving 90% of the performance of the standard cores, it would be a great sell.
Imagine in 2035 we might have high end systems with all Nvidia, all AMD, or all Intel product lines and they all compete with each other and drive down prices.
High end cards are ridiculous. It's not the price but the energy consumption: I'm not buying a GPU that uses more than 250 watts. Also, a GPU today should have 16GB of VRAM. AMD seems to hit that sweet spot better, while Nvidia's lower-wattage cards are crippled by low VRAM amounts.
If AMD is serious about competing at the low-mid end, they need to make a sub-$300 card with 12GB or more VRAM that beats Nvidia's $400 card with 8GB, if Nvidia makes one.
I find it strange how many of the guests you bring on disagree with your pricing-strategy arguments for AMD. They tend to back up AMD, saying it's so hard to figure out the right pricing and nobody can know except in hindsight. We have all seen the data and trends, release after release, on what people will buy and when. It seems pretty easy for AMD to figure out attractive pricing based on the performance a product delivers. Them charging as much as possible and then backpedaling is very irritating. I completely agree with your opinion of things, Tom.
Nvidia’s CUDA platform is losing its luster with AI companies. That’s why OpenAI invented Triton, which lets legacy LLM code treat AMD GPUs like NVDA GPUs.
I hope RDNA 4 tries to dominate in value at the 200 to 500 dollar price range. Clearly there's a huge market for 3060 price-class cards. Most people are holding out for cards that are a significant enough upgrade over their old card in that price range. AMD also has to recognize that power efficiency, CUDA, DLSS, and NVENC are real value adds that actually can justify a 50 dollar price jump to Nvidia. I've recommended the 6600 and 6600 XT to people because that's as much as a lot of new PC buyers are willing to spend. And a lot of those new buyers are afraid to go secondhand.
While I agree with your sentiment, the embarrassing fact is that the market was willing to pay way over the odds for the RTX 3060, despite it selling above 3080 MSRP and being two tiers slower than the 6700 XT. There are a lot of people who believe the fanboi FUD put about, even when the RTX option was objectively worse on all metrics.
I'm eagerly waiting to see what the next gen has for us. I'm still on a 1070 Ti, and this generation is the limit for me to upgrade, as some games just won't run anymore on my graphics card.
Buy a 7600X3D. I did the same because, yeah, 7800X3D price keeps climbing, 9800X3D is out of stock and pre-order price is climbing. Meanwhile, the 7600X3D is super efficient and sits below that price point. I was able to get the 7600X3D + new AM5 mobo for less than the price of the 9800X3D, and just a little more than the 7800X3D currently. I game in 4k, and all 3 of them are within range of each other here, except that the 7600X3D is 11% more efficient than the 7800X3D.
@@Rampagee75 That is just false. You can buy it from Mindfactory and from Proshop, some others, too. I know because I bought mine from Proshop, as Mindfactory only delivers within Germany. Just gotta check the listings, the stock changes all the time.
I feel your point on GPU prices; I have no wish to buy a GPU for the same price as a used motorbike. Just not paying 2k for the latest pixel pusher 5000.
If RDNA4 had dropped in October, I think the 4070 GDDR6 (non-X) existed so Nvidia could counter it with a $449 or even $399 4070 if they had to. That would have really hurt RDNA4 sales, so AMD sidestepped that scenario by delaying. The alternate timeline would have had the cheap 4070 run through the first half of 2025 with the 5070 delayed. Instead, Nvidia is committed to an as-scheduled early-2025 5070 with eye-watering prices, and AMD will be in its own segment on price and value. The alternate scenario would have made for cheaper midrange GPUs this holiday season, but AMD would have been further weakened. What is about to play out could see Nvidia lose a lot of market share as people abandon $600+ GPUs for gaming.
Think you guys are rationalizing AMD's midrange strategy. If what you suggest about investment were true, AMD CPUs would still be midrange, behind Intel. To me, it's all about CUDA... use antitrust action to free up the CUDA language and the market will be open again at the high end.
Hard to use antitrust against an in-house feature built ages ago; it's not something they acquired or bought into recently to cut out an existing market. And CUDA is being opened up anyway: earlier this year CUDA was opened to work with non-Nvidia GPUs, so any attempt to reach for antitrust legislation is going to be an absurdly uphill battle.
@@Real_MisterSir Where did you read that CUDA is opening up? Just because projects like ZLUDA exist doesn't mean CUDA is opening up. On the contrary, NV tries to lock CUDA down even more. Personally, I really hope NV gets pushed to truly open CUDA by the EU and the French government, to the benefit of Intel, AMD, Qualcomm, and Apple. But who knows what happens.
@@Real_MisterSir It is a computer language... sure, they have hardware geared toward executing it, but how is it different from instruction sets for processors? Nvidia actively opposes competition like emulation and code translation for CUDA. I am no expert, but for the sake of competition, they have a monopoly there, do they not? Plenty of businesses lose ownership of IP for competitive reasons after a time. Generic drugs, for example... even trademarked product names have been lost or are no longer protected by the government. Even Apple lost the ability to control the term "App Store".
@@Real_MisterSir I should have added... the point is to get it to open up, so if that happens, no need for action. If AMD or Intel can then work on making their tech execute CUDA code, there may be more competition in the markets Nvidia dominates, and more ability to earn the revenues needed to compete in the future.
Most likely, but it will also depend on how much worse the RT perf and FSR4 quality are. Since we know they will offer +4GB of VRAM, it should generate enough hype to push back against Nvidia's pricing on low-VRAM offers.
Unlikely. AMD is releasing a midrange card while Nvidia is releasing high-end cards. Unless Nvidia drops their whole stack at once, it's likely that AMD will be setting the price for midrange this generation. Hopefully the 8800xt is much stronger than the 5070, and they don't get greedy and try to sell it for 600-700$. Although with incoming tariffs they might not have a choice. Last time tariffs hit, the 3080 went from 750$ AIB cards to 950$ AIB cards overnight.
Yep. Just low enough to look, on paper, like the better price/performance option, but not good enough to actually make a dent in Nvidia's brand name and superior software stack appeal. And crucially, not so low that they can't hold on to their sweet, sweet margins.
Nothing AMD or Nvidia is doing interests me. Why? Because they have both "deemed" that an "entry level" product now costs at least US$300! In this economic environment???!! They BOTH have zero products that serve the sub-$300 market, so I'm not interested in their products - they are simply WAY too expensive to consider!
@@liberteus not talking about the used market. What has either company offered, good or bad, since the 3000/RX 6000 series, which will now be 2 generations back? Answer: nothing. Nothing since the absolute crap GTX 1630, RX 6400, and RX 6500; both AMD parts were mobile GPUs gimped to the point they could not compete with the RX 580 they were replacing in some tasks, due to certain parts being on the CPU side of the die. And being an x8 card, it was a disaster on PCIe 3 boards, which most people who had RX 580s and GTX 1650s were using at the time. Last gen's bottom tiers were the RTX 4060 at $400 and the RX 7600 XT at $300; the non-XT version came much later and was still $270. I live in Australia and you cannot get an RTX 4060 or RX 7600 for less than A$500!! And as I type, you cannot buy any of the RX 7600 line; there is no stock. What's the bottom tier going to cost this time? $500? $400? $300? Or something under $250 that normal people can afford without it being absolute garbage? I'm hoping AMD remembers who kept them in the GPU business, that Intel's Battlemage offers a compelling choice cost- and feature-wise, and that they understand there is a huge untapped market the other 2 have forgotten about.
I agree, the GPU market is not consumer friendly. If you're not against it, the best way to get great value is to check the used market. There are some killer deals to be had. If you're looking to upgrade your GPU under $300 and buy new, hold off just a little longer. As soon as the new GPUs are announced, you'll see more deals. You can pick up a 16GB 7600XT right now for $300. I'd expect the 7800XT to creep down towards the $300 mark in the first half of 2025.
@@davidmccarthy6390 you're right. I'm in Canada and it's the same. Our $ is worth nothing and our standard of living has been chewed away by inflation and it's impossible to get anything below 500CAD (and add 15%taxes on top). For such a price fairly sure I could grab a 3080 though, or a 6800XT... But Canada is like AUS a vast country and except if you live in metropolitan areas the 2nd hand market is kinda limited to certain areas.
@@liberteus We have the GST at 10%. The only difference is that in Australia the retail price includes GST by law. For reference, I bought the RX 580 8GB in early 2019 for A$250 and have been waiting since for a cost-effective replacement. AMD's offerings were inferior to what I had; Nvidia's carried a roughly 50% price premium for similar-tier performance and to this day cost ridiculous amounts here. Prices start at A$500 for the 8GB 4060; add $150 for the 16GB version. That kind of money bought you at least the 80 series in 2019, which now costs over A$1600 here. Not realistic: I can buy an entire PS5 with all the bells and whistles for less than that, and a GPU is only a single component of a bigger build. Don't even ask what the RTX 4090 costs here. It goes to well over A$5000, but generally sits in the A$4000-$4500 range. I can buy a GOOD used car for that sort of money.
IMO, what the GPU market really needs is a proper successor to Polaris.

The RX 460 4GB was about as fast as a GTX 950 but could be used with PSUs with no PCIe cables, was about 20% faster than the GTX 750 Ti, had twice as much VRAM as both 50-class GPUs, and cost only $120. A successor would be as fast as an RTX 3050 8GB, have at least 12GB VRAM, use 75W or less, and cost less than $150 ($120 plus inflation, or slightly undercutting the 3050 6GB).

The RX 480 8GB was about 5% slower than a GTX 980, but had twice as much VRAM, and cost less than half as much. A successor would be as fast or slightly slower than an RTX 4080, have 32GB VRAM, and cost $600 or less.

The RX 470 4GB was about 5% slower than a GTX 970 at just over half the price, and used less power. A successor would _either_ be about as fast as an RTX 4070, have 12GB VRAM, use less than 200W, and cost about $320; or be about as fast as an RTX 4070 Ti, have 16GB VRAM, and cost about $400.

Giving Navi 44 12GB probably isn't realistic (if it's a 128-bit bus or less, it would require clamshelling, because 3GB GDDR chips aren't likely to be available until 2026, and clamshelling costs too much to be viable for an entry-level card. It might happen on some workstation configs of N44, but not on a
Threadripper was great to start. Now it's purely a workstation product with lite and pro versions. I have no desire whatsoever for 16 cores if it means the limitations of the consumer socket and I also have no desire to pay as much for just a 24 core CPU as it used to run to get the HEDT CPU, board and memory combined. We really need a TR4 successor. Something with 12 to 32 cores, quad channel non registered (aka traditional desktop) memory and 48 pcie lanes that doesn't cost an insane amount.
One thing that is becoming increasingly popular is Mini PCs. One of the ironies of this market is that units like the EliteDesk 705 G4 have an expansion connector for a discrete GPU, and it seems a significant number of buyers of refurbished 705 G4s are upgrading with a Radeon 560 dGPU. On the surface this makes little sense: the 705 G4 uses a Ryzen 2400G(E) APU, and the Radeon 560 only offers about 40% higher performance than its iGPU. But this is enough to change an almost playable Fortnite experience into an actually playable one.

It is a shame these dGPUs can only be used with AMD Ryzen based office PCs; the older Intel based office PCs have a more powerful (for gaming) APU in them. It is also a shame that the Radeon 560 is the most powerful dGPU available for these refurbished office PCs. Right now it would be great if a Navi 2 dGPU could be installed in one of these low-end gaming PCs.

But the real potential is in the future. It appears that Mini PCs, like the BMAX Mini PCs, have a bright future, and it could be even brighter if low-end Mini PCs were upgradeable with dGPUs like the Radeon 560, and with future versions of them, like one based on Navi. Ideally, both AMD and Intel would produce low-end dGPUs based on their high-end iGPUs that could be used to upgrade refurbished office PCs and low-end Mini PCs. Imagine if, after AMD discontinues support for Vega, EliteDesk 705 G4 and other Mini PC users could purchase a low-end Navi dGPU as an upgrade. Better yet, a low-end UDNA dGPU based on a high-end UDNA iGPU. And Intel could follow suit with dGPUs based on their highest-end Arc iGPUs.

Oh yes: I think AMD cancelled high-end Navi 4 to focus on the transition to UDNA.
Intel GPUs did one thing excellently: they make amazing encoding accelerators for Plex servers, especially since they all have the same encoding capabilities, so there's no reason to spend the money on the top-tier models. An A310 works perfectly.
Regarding the RX 8800XT (or whatever AMD's highest card will be)... Nvidia isn't stupid, imo. You can see that when you know the RTX 5070 is only 6400 cores, compared to 5888 on the RTX 4070, so relative to the flagship the die will be even smaller this time around. The RTX 4070 was 36% of the RTX 4090; the RTX 5070 is 30% of the RTX 5090. So it's another 6% smaller than the already bad RTX 4060, I mean 4070. It will use more power, and GDDR7 will be used as well, to make it around 40% better, which will mean more or less 15% behind the RTX 4080 (a bit better than the RX 7900XT). AMD will have something around the same performance, maybe a bit better, but who knows about RT...? Thing is, Nvidia CAN price this garbage below 600 dollars EASILY if they want to and still make their normal margins. It's not hard for them. If they want to be assholes, they could even go to 550 and tell everyone how they understand that people are mad 😢 because of the old 40-series prices, but this time... Well, we had this BS happen before with the RTX 3070. Point is, the RX 8800XT needs to be REALLY good and REALLY cheap, or else I fear Nvidia will be the one who laughs last again. The RTX 5070 could also sell at 700 dollars, sure. But Nvidia isn't dumb. They know what they can do, and people just don't care enough about AMD if the price is almost the same.
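For reference, a quick sketch of the arithmetic behind that comment's tier-shrink claim. The 5070/5090 core counts are the rumored figures from the comment, not confirmed specs:

```python
# Sketch of the tier-shrink argument above: the 70-class card's core count
# as a share of the flagship's. The 50-series numbers are rumors, not specs.
lineups = {
    "RTX 4070 vs RTX 4090": (5888, 16384),
    "RTX 5070 vs RTX 5090 (rumored)": (6400, 21760),
}
for name, (mid_cores, flagship_cores) in lineups.items():
    share = mid_cores / flagship_cores
    print(f"{name}: {share:.0%} of the flagship's cores")
# -> ~36% last gen vs ~29% this gen, i.e. the "5070" is a smaller cut of the stack.
```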
I want 4K 240Hz; that's why I will buy a 5090 and, after that, a 6090 and 7090, if they make it possible in really intensive games like Black Myth: Wukong, for example, which is a heavily GPU-limited title atm even with a high-end CPU like the 9800X3D. So no: if you want to play high-refresh-rate 4K SINGLE PLAYER games that are really graphically intensive, you don't want 4090 performance, you want more than that. Even a 5090 won't be enough for what I want. Features like path tracing can look amazing; for that alone we will need far more powerful GPUs than we have now. If I buy ultra high end, I want to play my games in the best-looking way possible. That's why I also bought a 32-inch 4K 240Hz OLED monitor. Driving that in intensive games will require a better CPU than the 9800X3D (which I have) and a better GPU than the upcoming 5090 (which I will have until the 6090 is out). So I already know I will upgrade again as soon as there is another big step up.
Jon Peddie's Q3 GPU report tells a grim tale for AMD's GPU division. They don't have a mountain to climb; they have to climb to the moon. Nvidia is outselling them 10 to 1, sometimes 12 or 15 to 1.
I have VERY low expectations for RDNA4 but will be VERY happy to be proven wrong. I have a 4090 atm so I just doubt AMD can beat that for a very long time. Time will tell.
I'm kinda in the same boat, although I wouldn't phrase my opinion in quite a pessimistic way. We'll see - the tech is there...but I'm not willing to bet anything on RADEON actually going for it.
I think that point about Radeon sticks to the wall more than any other: until AMD really invests in the software side of things on a level that creates confidence in their ecosystem, there will never be an Nvidia killer at the high end. Adrenalin software is amazing, and it's probably the first time in a long time that Nvidia is playing catch-up, in this case with the Nvidia App. But with FSR they're always way behind schedule, which erodes confidence. Raytracing seems to finally be on their radar, which is a good thing. AI is where they could really make headway if they started enabling the layman user to experiment with AI, since very few consumers are even aware they can utilize their GPU in that manner. Above all else, AMD needs to stop chasing Nvidia and start leveraging their CPU mindshare by enabling certain advantages for Radeon GPUs only. Nvidia has been doing this since buying PhysX, locking out the competition at a hardware level. If they want to compete in the GPU space with Nvidia, they need to play dirty like Nvidia does with the tools at hand, Ryzen being a giant hammer they can start swinging.
AMD needs a home run with RDNA 4. If they don't increase market share, they will simply allocate less and less UDNA resources to gaming cards; that's probably why they have changed gear to UDNA, so if the worst happens they can keep making Instinct and console chips and offer whatever isn't too much hassle to the discrete market.
RDNA4 looks better by the day. The B580 from Intel is huge, with a big power draw, but has decent performance at a cheap price. But can Intel make any money selling such a big die for such a low price, on an expensive 3-fan card? I doubt it, and thus I see why the B770 and above may be cancelled. By contrast, Navi 44 (8600XT) has a very small die, at 130-152mm^2, and most likely a very low power draw. It should outperform the B580 but be much, much cheaper to make. If AMD is out for blood, they can put the 8600XT at $249 or less, and the more expensive to make 8700XT and 8800XT at prices like $400 and $500. Few people need more power than the 8800XT will provide, if it indeed matches a 4080, so AMD could indeed regain market share that they haven't had for a decade.
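A rough way to see that die-cost point is to estimate dies per wafer for each die size. This is a minimal sketch using the classic first-order formula; the ~272 mm² B580 figure, the ~150 mm² Navi 44 figure (the rumor range is 130-152 mm²), and the $12,000 wafer price are all assumptions, since real wafer pricing is not public:

```python
import math

def dies_per_wafer(die_mm2: float, wafer_d_mm: float = 300.0) -> int:
    """First-order estimate of gross dies per wafer, ignoring defect yield."""
    r = wafer_d_mm / 2
    return int(math.pi * r**2 / die_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

WAFER_COST = 12_000  # hypothetical wafer price, for illustration only
for name, area in [("B580 (~272 mm^2)", 272), ("Navi 44 (~150 mm^2, rumored)", 150)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} gross dies/wafer, ~${WAFER_COST / n:.0f} of silicon per die")
# A ~150 mm^2 die yields nearly twice as many chips per wafer as a ~272 mm^2 one,
# which is the cost edge the comment is pointing at.
```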
If you look at the Steam Hardware Survey, AMD needs to make a card that is as good as a 4070 and decently cheap, maybe $425, as the 4070 is about $500 and the 4060 and 3060 are about $300 and $280. The top cards there are the 3060, 4060, 1650, 2060, 4060 Ti, and 3060 Ti. Also throw in a good amount of VRAM; that would help a bit.
It seems likely to me that there are some tradeoffs to having UDNA. As I understand it, nvidia has a unified structure, and the money is coming from AI, so the gaming just isn't getting the emphasis anymore. My guess is that that explains why the 40 series was such an insignificant improvement over the 30 series. Will 50 be better? We'll have to wait and see.
Strix halo is the product I'm after, I've been excited for it ever since it was first leaked on this channel, no more vram limitations and only one gpu to worry about. Should be a good upgrade from my Zephyrus g14 2022
I know that I am going to be building a full AMD build soon; I will be going from a Ryzen 5 3500 with an RTX 2060. I am excited to see what increase I will get in gaming performance.
Ahh AMD... never fails to disappoint! The 7900 XT/XTX should've launched as the 7800/7800 XT for $650 and $800. Imagine the 7800 XT as the successor to the 6800 XT: around 50% more perf for 20% more price with 8GB extra VRAM, a 4080 competitor for 33% less cost... what a missed opportunity that was! Now, if they stop fucking around with the names... release the top end as an 8750 XT (5070 Ti / 4080 perf) for $650 with 20GB VRAM and an 8700 XT (5070 perf) for $500 with 16GB VRAM, and they will really shake up the midrange and restore some consumer confidence... But honestly I have a feeling they'll promise 5080 competition, name it 8900XTXX so consumers think it challenges Nvidia's 90 class, charge $800, and then deliver 5070 Ti performance with 4070-ish ray tracing and slap on an unnecessary amount of VRAM.
RDNA4 is only going to sell well if top model is really as fast as 4080 and 500$. If it's 600$ people will just buy 5070 instead. 90% of normal people don't understand enough about GPUs to care about 12gb vram.
I feel like they need more VRAM and a wider memory bus. The 5080 getting a 256bit bus is ridiculous. It basically negates the move to GDDR7. It's got essentially the same bandwidth as a NON OC 7900XTX. With AMD out of the high end this gen, I'd consider an NVIDIA card just for one gen if I didn't have to take a step back in several areas and pay way more to do it.
I don't know about you all, but I am getting the distinct feeling Jensen thinks we are all a bunch of knuckle-dragging idiots. Sorry bub, I haven't bought one of your GPUs since the original GTX 1080. AMD does just fine for me.
I've got my fingers crossed that the rx 8700xt (whatever it's called) takes a page out of the rx 5700xt launch and comes in as powerful as Nvidia 's previous gens top end card for $400.
@Vriess123 lol, the rx 5700xt was as fast as the gtx 1080 ti. Now, I seriously doubt it will touch the rtx 4090, but I'm hoping it trades blows with the rtx 4080.
Trading blows with the 4080 is what the presumably 8800XT is going to do on the optimistic end (pessimistic is 7900 XT). And that is gonna be 500-600USD for sure. Hopefully 500USD though, that's half the price of the 4080 Super.
@Hayden2447 i hope they don't release it for $600. That would be a huge missed opportunity. It needs to be $400-$450. They will dominate the pc gaming market at that price.
Nvidia is probably downplaying the fact that 80 and 90 are gaming cards because of the crypto lawsuit. They don't want to get sued again so they're probably going to market the high end as not gaming product so the next time they get sued - they can say 'but we said those aren't gaming cards'. Well, and to try to con people into thinking that GPUs aren't as insanely priced as they actually are.
The way I see it, Radeon isn't making much money atm anyway, as they need to sell N31 and N32 basically at cost just to move them. RDNA4's dies are small, simple, and cheap. Price them aggressively regardless of what Nvidia does and just earn the goodwill. Chances are they will make money anyway.
This podcast and its focus is usually too long for me. However the questions had decent answers from the guest, and maybe the editing cut out some fat. Ended up listening to the whole thing. :)
Nvidia will be calling the 5080 & 5090 professional cards for only one reason: a way not to sell them to China due to restrictions. Or rather, a way to placate the Chinese market: look, we can sell you a gamer card! Yay.
Isn't it obvious that anything that requires advanced packaging is currently being squeezed out by the big bucks that are being paid for the AI GPUs? There's no mystery.
Unifying architectures should make designing GPUs a lot more cost-efficient for R&D than having one architecture that caters to consumers while a separate one caters to professional and enterprise. If AMD then sticks to maybe 2 or 3 simpler dies for gaming, where they only cater to the section of the market where market share is generated (which is entry and midrange), I feel like they can produce decent competitive products while also rerouting resources into the software side: AI-based upscalers (like it or not, it's a sales argument), an NVENC equivalent, and their drivers, etc. It would also make more widespread use of ROCm easier, I think? Also, AMD needs to improve their marketing.
[SPON: Use "brokensilicon“ at CDKeyOffer’s Black Friday Sale to get Win 11 Pro for $23: www.cdkeyoffer.com/cko/Moore11 ]
[SPON: Use "brokensilicon" for 6% OFF at Silver Knight PCs: www.silverknightpcs.com/ ]
And it isn't going to compete. They will be $50 less than NVIDIA at semi-comparable performance.
AMD doesn't have to make a faster GPU than Nvidia, they just have to make a modern competent GPU that people can afford. I applaud AMD for focusing all their resources on low and mid-range as that's what the industry needs right now, no more space heaters that cost $1600+...
They will botch it. The days of the 580 are gone; they want to match Nvidia minus 100 dollars, but trying to do that will kill Radeon. FSR4 had better be really good, or people are never going to trust it.
@@Navi_xoo I feel the perception is already set. DLSS has been ML for years, and FSR is just starting to train its new model. It will be obviously worse than Nvidia's for a couple of years; DLSS is just unbelievable compared to FSR 3. I'd upgrade to the 8800xt if its RT and upscaling were up to Nvidia's level, but they won't be. RT will not be better than the 40 series, and FSR 4 will kick off the same way PSSR did: some games are OK, but some actually look like shit.
@@christophermullins7163 Yep, it's the unfortunate truth. Even PSSR seems to be a big step down from DLSS, and some games are downright broken; Silent Hill has already been patched to revert from PSSR back to TSR.
@@Navi_xoo I don't trust anyone's AI solution. Games always look better without that crap.
AMD will fuck it up. Can't wait to buy two 5090's on launch day.
Quality guests bring out the best in you.
Quality begets quality ;)
With how much weaker the cards below xx90 are getting relatively, the next lineup looks like 5090, 5060, 5050, 5030 and 5010
Lol. Let's be real.. the names mean nothing. So long as the new 60 is not slower than the old 60.. Nvidia can name them whatever they'd like to. The Idiocracy of gamers that feel they have the right to have a certain number of memory chips on a certain tier GPU is pathetic. Gpus are expensive now.. tough shit
Yeah if we go by shader count :
what's called a 5090 = 88% of uncut GB202 (170/192SM) = more like a 5080Ti
what's called a 5080 = 44% of uncut GB202 (84/192SM) = more like a 5060Ti
what's called a 5070ti = 36% of uncut GB202 (70/192SM) = more like a 5060
what's called a 5070 = 26% of uncut GB202 (50/192SM) = more like a 5050ti if we're being generous
what's called a 5060ti, assuming it's based on GB206, can at best have 36/192 SMs, 19% of uncut GB202 = more like a 5050
The gap between GB202 and GB203 is absolutely wild, you could easily slot two SKUs in between. We don't know what the relative performance is yet, so we can't go by that - but it wouldn't surprise me if the gap between the 5090 and 5060 is as much as 4x
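The percentages above follow directly from the leaked SM counts; here's a minimal sketch of the math (all SM figures are leaks/assumptions, not confirmed specs):

```python
# Each rumored Blackwell SKU's SM count as a fraction of a full GB202 (192 SMs).
FULL_GB202_SMS = 192
rumored_skus = {"5090": 170, "5080": 84, "5070 Ti": 70, "5070": 50, "5060 Ti (GB206?)": 36}
for name, sms in rumored_skus.items():
    print(f"{name}: {sms}/{FULL_GB202_SMS} SMs = {sms / FULL_GB202_SMS:.0%} of the full die")
# -> ~89%, 44%, 36%, 26%, 19% (the comment rounds the first down to 88%).
```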
@@christophermullins7163 Found the NVidia shareholder account
@@christophermullins7163 A good example of how *not* to use the word "idiocracy."
They don't care; their fanbois will pay what they demand regardless.
AMD has the opportunity to kill every offering from NVDA below the 5090 if they want to...
But as we know, they never miss an opportunity to miss an opportunity.
They can, but they can't price it too low or it's gonna cost them their margins.
AMD cards could be offered with a 50% discount compared to nvidia and people would still not buy it.
@@ghoulbuster1 patently untrue
at most, retailers would not use it with prebuilts, in no small part due to Nvidia's anti-competitive practices
@@martinkrauser4029sadly it's true. Morons are in the majority and just love getting milked. Every overpriced brand is proof of that.
People say this every single generation; the 7800 XT is 150 less than a 4070 Super and no one buys it.
I can’t wait for Strix Halo to be exclusively paired with RTX 4050’s for no reason
Luckily that won't happen, since Strix Halo has a 256-bit memory bus, so it requires 8 memory chips, and with Strix Halo being around 300mm^2 of silicon by itself, most laptops won't have space for a 4050/4060 even if they wanted one. Strix Halo won't be cheap, though; I would expect a $1500 starting price with 32GB of shared RAM.
@@frowningboat8039 what were you drinking? 256-bit? Even the RTX 4070 doesn't have that.
@@frowningboat8039 If Strix Halo provides laptop 4070 performance with the ability to partition 12GB of system memory to VRAM, I'd pay 1500$ for a laptop or 1200$ for a mini PC with 32GB. I need a new travel partner; my Gigabyte Aero is tired.
@@allxtend4005 it's LPDDR5X RAM rather than the GDDR6 of the 4070, so the bandwidth will be similar; it's also double that of Strix Point, which uses a 128-bit bus.
Edit: Apple's M2 Pro also uses a 256-bit LPDDR memory bus, so it's nothing revolutionary; it's just the first time it's being done on Windows.
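That bandwidth claim checks out on a napkin. A quick sketch; the LPDDR5X-8533 speed grade for Strix Halo and the 16 Gbps GDDR6 on the laptop 4070 are assumptions:

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate).
def bandwidth_gb_s(bus_bits: int, mtps: int) -> float:
    return bus_bits / 8 * mtps / 1000  # GB/s

print(f"Strix Halo (256-bit LPDDR5X-8533):   ~{bandwidth_gb_s(256, 8533):.0f} GB/s")   # ~273
print(f"Laptop 4070 (128-bit 16 Gbps GDDR6): ~{bandwidth_gb_s(128, 16000):.0f} GB/s")  # ~256
print(f"Strix Point (128-bit LPDDR5X-8533):  ~{bandwidth_gb_s(128, 8533):.0f} GB/s")   # ~137
```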
@@MechAdv rumors definitely suggest that's possible, but it will probably start at $1800 for a laptop with the full 4070-like GPU.
Missed an opportunity to call this Broken Silicon Intel Ultra 285 😔
I REFUSE to spend more than 1000 dollars on a GPU. In 2023 I had a 3070 Ti and a 4K monitor. I wanted a new card, and I had 2 options: a crappy 12GB 4070 Ti for 800 dollars, or an AMD 7900XTX with 24GB for 1000. Guess what I did! Went team red for the first time EVER. The ONLY regret with AMD was FSR and missing out on DLSS. BUT now FSR 4 is coming, I love FSR 3 and FSR 3.1, and I'm playing Uncharted: Legacy of Thieves with FSR 2 right now and it looks great. I hope FSR 4 will compete so I can stay on AMD. The 8800XT is so exciting; can't wait to see what it's like!
it aint going to be faster than an XTX
@@nicolasgkr11 Probably not. But the ray-tracing should be _leagues_ better, and the efficiency should be improved as well.
Sell your XTX to mitigate the purchase, and enjoy the upgrade.
Don't sell your 7900XTX for the 8800XT. I think the 8800XT will be a really good card with great value, but you're going to give up 8GB of VRAM (which I need because I game at 4K Ultra on my OC/UV 7900XTX) and you'd also be giving up pure raw performance on top of that. Being that it's a slower card with less VRAM, how much will the raytracing (not even close to being worth the performance hit on any card ever made, IMO) really matter when you're VRAM bottlenecking at 4K and getting less FPS in non raytraced titles?
Again, I'm not saying it'll be a bad card and I think a lot of people will buy it, but it would be a step down overall for 7900XTX owners. I'm waiting for the RDNA5/UDNA5 flagship.
@@benknapp3787 I will see what the perf and specs are on the 8800xt before I do anything, I am just excited to see the next products from AMD and what they have come up with.
I was considering upgrading in the new year, but I will not be paying 1k for a GPU; not that kind of rich.
Really enjoyed listening to James Prior, such a good guest.
Mid range cards used to cost about 150$
And houses used to be cheaper, inflation is a thing.
Top-end cards used to be
Worth mentioning that the GPUs of today cost a lot more to build, too. Fab costs per unit have gone up as the nanometers go down, and power draw has generally gone up, making the cards more expensive in a dozen ways. Consider how much modern midrange GPUs weigh compared to when they were $150; they are like 3x the mass. A lot of this is driven by display technology: wanting 1440p high refresh or 4K @ 60+fps creates the market for 250W $600 'midrange' GPUs.
@@kleinbottled79 I suppose midrange is a band from $350-700.
And they used to be the size of your hand too.. these days they're over double the size and double the price; thank Moore's cracking law and quadratic thermodynamics for that.
Nah. I could believe the "above a 4070 it is for professionals" line if they came with more than 12 GB and 16 GB. Games are requiring more and more VRAM, and 12 GB in 2025 is ridiculous, especially at the price points they are looking at.
The series should be: 5090 32 GB, 5080 24 GB, 5070 Ti 20 GB, 5070 16 GB, 5060 12 GB.
That's why I went for the Ti Super. Looking at the supposed 5070 Ti with 16GB at $899... I have a feeling this GPU will age well and retain value.
Yep, I've noticed >12GB of VRAM being used in a not particularly graphics-intensive OpenGL application. Having 16GB is noticeably smoother; even with the same mitigations allowing lossy texture compression, >12GB is easily exceeded.
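If you want to reproduce that kind of observation, one way is to poll VRAM usage while the application runs. A minimal sketch using NVML via the nvidia-ml-py package; it assumes an Nvidia GPU at index 0:

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used/.total are bytes
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(5)  # sample every 5 s while the app under test runs
finally:
    pynvml.nvmlShutdown()
```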
Considering the consoles have 16GB, they need to decide what tier of discrete GPU should match the consoles and go from there. For example, if the 70 tier were to match the consoles, then it should have 16GB, the 60 tier 12GB, the 50 tier 8GB, the 80 tier 20GB, and the 90 tier 24GB.
Yeah "professional" use now seems more like a 36GB+ card. People buy MacBook pros now because of the unified memory for GPU AI development. Professionals at the high end now of Nvidia probably would want like 64GB VRAM
nvidia doesn’t care
There's a really terrible stagnation going on in the GPU market from a price/performance perspective. Almost all innovation now comes in the form of more watts pushed into bigger chip configurations. It feels like the Core 2 Duo days, when for 10 years we basically didn't get any compelling upgrades. On one hand, I guess this lets you splurge and hold on to your hardware longer (I'm still using a 1080 Ti and it works fine), but on the other hand this is not healthy for the market long term.
A single local retailer in Brisbane (QLD, AU) sold 219 Ryzen 7 9800X3Ds in 7 days. Probably not a lot by US standards, but this is a small market after all.
Best show ever!! I was fully immersed in his every word as he offered his visions and an almost behind the scenes take on all these topics
It's a pleasure to listen to James Prior .
Fr.
I'm generally quite negative about content, looking for issues.
But I have to agree. This was the best episode in a long time. I'm always appreciative of all guests. But you get far more depth listening to people with a track record from within the industry itself. Also, he is really good at explaining his points (again based on the requirements of his professional experience)
Well done to MLiD for getting us this interview
Just wait to listen to his evil twin John Posterior.
Honestly, if 5090 is a "professional" card, then I naturally expect commitments with regards to raising priority support tickets, including regarding firmware and a guaranteed non-evasive response over a reasonably short time period.
Or am I being gaslit?
Professional in gaming and flexing /s
they can just chalk it up to user error
"Professional card" simply means that professionals will buy it over AXXXX pro cards in less vram heavy tasks -just like how many visualization studios started buying 4090 cards instead of A6000 cards simply because you could get twice as many for the same cost and reach better performance for things like Unreal 5 visualization etc, which is now a fairly standard part of many studio software suites.
5090 is likely going to eat into small AI studio needs and increasing 3d visualization demand as well. Probably would sell like hot cake even at 2500 bucks regardless of criticism from gamers.
@@Real_MisterSir I do agree that the 90 ones in particular, with large VRAM sizes, do have a niche, as explained in your comment in more detail. The 80 seems to be a different story.
I feel like recent advancements (and new markets, i.e. AI/ML, not just compute) created quite a large overlap between professional and gaming hardware. Some years ago the GeForce EULA was changed to forbid usage in datacenters because it was really cheap to just use gaming cards. Nvidia was really smart for ~15 years in allowing CUDA, etc. on their gaming GPUs: multiple generations of students finish university with multi-year experience of CUDA and everything else related to Nvidia's software stack. Nvidia's 80 and 90 class cards meet the requirements for today's workstation/development machines. That's not good for Nvidia, as they can make way more money on the professional and datacenter side, so Nvidia needs to figure out how to distance/segment products to avoid losing money. I don't know anyone using a 4090 primarily for gaming.
AMD may be out of blood, but I am out of goodwill for NVIDIA. I will not buy the 5090, and everything below it looks like a scam.
Please more product managers and owners 🙏 what an awesome episode
"I don't remember writing that" same pfp 😂
I was hooked on this guest's every word. What a guest and fountain of knowledge!
I'm glad I picked up an Arc A380, not because it's a great gaming card, but because it's perfect for my Plex server, and when it inevitably gets replaced several years from now, it'll be a cool piece for my PC history shelf.
Yeah I agree. I want Arc for a mini PC myself. They dominate the price to performance there and the price is low enough that I'm willing to dabble regardless of Arc's future.
Challenger Arc A380 Club! I bought it for the AV1 encoders.
I was contemplating getting an intel GPU for my unraid server, but will probably end up just getting a new MB and Intel CPU with an iGPU.
Is there an advantage to having a dedicated GPU?
Ah, you're a collector. Makes sense. Those blue ARC cards really are pretty.
@@adamwest1138 the A380 is cheaper in the end. Going to AM5 is better for CPU/platform support and upgradability; plus consistent core performance is a good thing in servers (VMs especially), and AMD is better price to performance.
Overall I'd pay the $100 for an A380 to not be on an Intel platform. (Actually just did this exercise, but haven't bought the Arc, as I had a 1080 sitting in the closet... it's good enough for now.)
Plus you can do something with the iGPU in the AM5 processor.
Would be happy with 7900 xtx performance with better ray tracing and some AI cores.
At less than $600, yes please.
Tom's leaks have consistently assured us we'll get that.
The price is the final question.
@@benjaminoechsli1941No I'm pretty sure he's always said ~7900 *XT* performance for b/w ~$500 - $600 and NOT 7900 *XTX* performance.
That's a pretty important distinction to keep in mind to keep our expectations in check.
@@Aspire705 he said 4080 perf, so around 7900 XTX
@ I've always heard him say 7900 XT-ish performance so if he really said 4080 performance then that's news to me.
But I *highly* doubt he "always" said 4080 performance, so if you can post a link to where he said that I'd appreciate it.
I hope they actually price it well instead of just slightly cheaper than nvidia
This. However it’s AMD. NVIDIA can leave a planet sized target for AMD to hit, that would allow AMD to win market share and “win” the generation…and AMD would still miss.
@@ptung88 bullshit, everyone was screeching the same thing in 2017 and you were very very wrong
@@victorkreig6089 and look what they did in 2023 with RDNA3. Regression and tripping over their own success following RDNA2
I'm in the 'no important things in the cloud' camp. We learned the hard way when our cloud provider ff-ed up and we lost a lot of money and customers. Went back to our own servers in a datacenter with a cloud upscale ability for peak demand only.
Cloud is just another way of saying not your property, not under your control.
As for ryzen vs threadripper. I would be perfectly happy with a 16 core CPU with quad channel memory and one extra fully connected x16 slot compared to what we get now.
That said, Threadripper still allows us to do things cheaper than before, without relying on cloudproviders.
"I'm in the 'no important things in the cloud' camp. We learned the hard way when our cloud provider ff-ed up and we lost a lot of money and customers. Went back to our own servers in a datacenter with a cloud upscale ability for peak demand only."
We work a lot with cloud, whenever they have outages, they never cover our losses. I'm also in the 'no important things in the cloud' camp.
"It's not the cloud, it's someone elses computer"
Just sold my 4090 for more than I paid for it two years ago and now patiently waiting for the 5090 launch.
Coreteks said $1800 - $2000
Well done! My best ROI was a $360 1080ti I sold for $700 during the mining craze. That gave me the justification to afford a $1000 6800xt.
you retards are just as bad as sony ponies
I just want 8 more PCIe lanes straight from the CPU. And then maybe another 8 from the chipset.
That's all I want HEDT for, so if that was available in a pro-sumer AM5/AM6 chipset I'd be set.
Same, I want more lanes. I would give up PCIe 5.0 if I could have more lanes at 4.0 or even 3.0.
James is a great guest. Like, we're lucky he's an enthusiast.
Indeed - he clearly is very good at marketing and business...AND actually games lol. That combination makes his input so insightful...
Excellent content. Thank you gentlemen. 🔥🔥🔥
A €800 9800X3D, a €500 out-of-stock 7800X3D vs. its €360 summer price… AMD's European prices are deadly serious...
Fantastic guest, thank you
@36:40 When speaking about pricing, there are always the early adopters that want it right now. They want to be the first kid on the block to have the bleeding-edge technology. This is why AMD can charge more in the beginning. I really don't see anything wrong with this. If they changed it, their shareholders would be going, "Hey... what are you doing?!"
Good podcast, good guest.
What an awesome guest.
RDNA 4, my expectations are simple: superior performance to the previous generation, and improvements in thermal and power efficiency.
"will the win be handed to them by Nvidia"
Honestly, the 5000 series seems to have been notched down in terms of specs, i.e. the 5080 should really be a 5070, and it's probably going to be more expensive than the current 4080S. At some point people will find it difficult to accept such prices and switch. Hopefully...
Nvidia will keep the profits though. In years past, when GeForce was just for gaming, a die-shrink generation would have great pricing from passing the cost savings on to the MSRP. This time around they are a trillion-dollar AI company. GeForce is for professionals and data centers that will pay a 400% markup. The 5080 is massively cut down, but it'll still be $1000-1200. Pricing down the stack will depend on what AMD does with their MSRPs. Nvidia knows they can be 20% more expensive for the same performance, or 20% weaker at the same price, and not lose market share.
The 40 series "handed the win to AMD" already, and AMD did jack all with it. Sure a bit lower prices but only barely enough to seem like a slightly better raster performance/price option, but without the brand name and without rtx features and with inferior software stack. There is zero reason to expect them to do more vs 50 series than they did vs 40 series, especially now that they've dropped out of the top end of GPU segmentation all together.
@@Real_MisterSir Also, we can't forget that Radeon went with an experimental chiplet design, which, like any major change, was going to have problems, and it did. To be perfectly honest, I'm amazed how well they nailed it, even if it had high idle power draw for a long time. Overall, the additional latency from chiplets didn't affect the products that much, and the 1% and 0.1% lows are much tighter on Radeon cards. They might suffer in RT scenarios, but good-looking RT is barely usable even on the better 40-series cards.
I don't think a switch is nearly as likely as consumers simply opting for older cards, since the newer cards offer so little. This did happen to an extent with the 4060, which iirc had record-low sales and insane headlines, like the one claiming just a single person in Japan showed up for its launch.
I think the average customer's brand loyalty to NVIDIA is too strong, and not undeservedly: NVIDIA's software suite is miles ahead of AMD's, and honestly it should have been a major investment point for AMD from the 7000 series onwards. But customers do still see these ridiculous prices and, like with the 40 series, the numbers actually opting to buy these overpriced new generations might start dwindling in favor of an older one.
@@ruseruser2227 They SAY the 4060 had historically low sales, but it's at the top of the Steam hardware survey with the 2060 and 3060. Honestly, when it was selling for $280 with game bundles, it wasn't a bad deal. People so overwhelmingly pissed off about the 8GB of VRAM failed to recognize that it was 10% cheaper and 20% faster than the outgoing card. That's not "tHe wOrSt CaRd eVEr", it's just disappointing considering the 50% perf/watt uplift we saw in the 3080 to 4080 upgrade. They definitely should have offered a 16GB version of the 4060 for $350 though, instead of the $500 4060 Ti we got 🤦🏻♂️.
Good conversation! bring James back again!
If 20 different review sites could all find the performance issues with Ryzen 9000 in 5 days of reviewing, AMD sure as hell has no excuse.
I'd be interested in how something like Zen 5c with V-Cache would perform. If it can be much cheaper (dense chiplets are smaller) while giving 90% of the perf of the standard cores, it would be a great sell.
Imagine in 2035 we might have high end systems with all Nvidia, all AMD, or all Intel product lines and they all compete with each other and drive down prices.
High end cards are ridiculous.
It is not the price, but energy consumption. I'm not buying a GPU that uses more than 250 watts. Also, today a GPU should have 16GB of VRAM. AMD seems to hit that sweet spot better, while Nvidia's lower-wattage cards are crippled by low VRAM amounts.
Greedy Nvidia upsell technique
Nvidia doesn't want to cannibalize the pro market. It's just about VRAM: 24GB and above is pro. That's why only the 90 will have 24GB or more.
250W is my absolute limit as well.
Anything over that should only be considered for pro applications, as far as I am concerned.
If AMD is serious about competing in the low-to-mid range, they need to make a sub-$300 card with 12GB or more VRAM that beats Nvidia's $400 card with 8GB, if Nvidia makes that.
MIPS is great-ish; I wrote so much MIPS assembly back in uni. Please never forget the university pipeline!
I find it strange how many of the guests you bring on disagree with your pricing-strategy arguments about AMD. They tend to back AMD up, saying it's so hard to figure out the right pricing and nobody can know except in hindsight. But we have all seen the data and trends, release after release, on what people will buy and when. It seems pretty easy for AMD to figure out attractive pricing based on the performance a product delivers. Them charging as much as possible and then back-pedaling is very irritating. I completely agree with your opinion of things, Tom.
34:08 That's the reason I hate public companies; wholly owned private ones are just so much better.
VALVe is a brilliant example of private companies being the Way. You can get rich without f***ing the consumer over.
@benjaminoechsli1941 i hope it stays like that forever.
Well done Tom for improved interview style here
5:00 Nvidia isn't asleep at the wheel like Intel was.
My thoughts exactly
They aren't, but Nvidia's pricing right now is abusive.
Nvidia's CUDA platform is losing its luster with AI companies. That's why OpenAI created Triton, which lets AI code target AMD GPUs just as well as NVDA GPUs instead of being locked to CUDA.
@@tringuyen7519 I forgot about this, you're absolutely right.
@@tringuyen7519 I think there was also an open-source translation project called ZLUDA: CUDA for non-NVIDIA cards.
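For context on why Triton weakens the CUDA lock-in these comments are debating: Triton kernels are written in Python and compiled for whatever backend is available, NVIDIA or, in ROCm builds, AMD. A minimal vector-add sketch, assuming the triton and torch packages are installed (illustrative only, not something from the episode):

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                       # which block this program instance handles
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                       # guard the ragged tail
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = x.numel()
    grid = (triton.cdiv(n, 1024),)                    # one program instance per 1024 elements
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```

The same source compiles on either vendor's hardware, which is the same lock-in ZLUDA attacks from the other direction (by translating already-compiled CUDA).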
I hope RDNA 4 tries to dominate in value in the $200 to $500 price range. Clearly there's a huge market for 3060 price-class cards. Most people are holding out for a card that is a significant enough upgrade over their old card in that price range. AMD also has to recognize that power efficiency, CUDA, DLSS, and NVENC are real value adds that actually can justify a $50 price jump to Nvidia. I've recommended the 6600 and 6600 XT to people because that's as much as a lot of new PC buyers are willing to spend. And a lot of those new buyers are afraid to go secondhand.
The $200 GPU market is over, unfortunately.
While I agree with your sentiment, the embarrassing fact is that the market was willing to pay way over the odds for the RTX 3060, despite it being sold above the 3080's MSRP and being 2 tiers slower than the 6700 XT. There are a lot of people who believe the fanboi FUD put about, even when the RTX option was objectively worse on all metrics.
@@letsplay1097 How so? Maybe for Nvidia, but it seems fairly lively for AMD and to a lesser extent Intel by just doing a quick amazon search.
I'm eagerly waiting to see what the next gen has got for us. I'm still on a 1070 Ti, and this generation is the limit for me to upgrade, as some games just won't run anymore on my graphics card.
Why do you think AMD's encoders aren't on par with NVENC? If anything, the best at media acceleration is Intel; that's who they should be beating.
Tnx for the perspective
This dude is a true professional. Doesn't he have his own podcast, by any chance?
The 7800X3D has gone up in price by over 50% in Germany in like a month.
I love tech.
Buy a 7600X3D. I did the same because, yeah, the 7800X3D price keeps climbing, the 9800X3D is out of stock, and its pre-order price is climbing. Meanwhile, the 7600X3D is super efficient and sits below that price point. I was able to get the 7600X3D + a new AM5 mobo for less than the price of the 9800X3D, and just a little more than the 7800X3D currently costs. I game in 4K, and all 3 of them are within range of each other there, except that the 7600X3D is 11% more efficient than the 7800X3D.
That one is 460€ and only two sellers have it.. so even worse.. @@BaumisMagicalWorld
@@BaumisMagicalWorld The 7600X3D is a Micro Center exclusive... we don't all live in the USA.
@@Rampagee75 That is just false. You can buy it from Mindfactory and from Proshop, and some others too. I know because I bought mine from Proshop, as Mindfactory only delivers within Germany. You just gotta check the listings; the stock changes all the time.
@@Rampagee75 And I'm also not from the USA, lol.
They scrapped it & put all GPU resources into RDNA 5 that's supposed to be cheaper to produce & a lot more efficient & competitive with Nvidi0t GPUs
I don't see them selling 4080 performance for $499. It's gotta be at least $699; if it's $499, it's not beating the 4080.
I could see $599 with 350W power draw maybe, but yeah, $499 for 4080 performance that is relatively efficient? Hard to believe.
That's MSRP. We all know cards won't actually sell for that, especially partner models. It'll start 100 bucks more and climb for the crazier variants.
I love how he just puts RTX 5090 in the title and it's barely a discussion point lmao. The grind is real I guess.
The rest is far more interesting. We all know the 5090 is a powerhouse, not much more to say.
I feel your statement on GPU prices; I have no wish to buy a GPU for the same price as a used motorbike. Just not paying 2k for the latest pixel pusher 5000.
If RDNA4 had dropped in October, I think the reason the 4070 GDDR6 (non-X) exists was so Nvidia could counter RDNA4 with a $449 or even $399 4070 if they had to. That would have really hurt RDNA4 sales, so AMD sidestepped that scenario by delaying. The alternate timeline would have had the cheap 4070 run through the first half of 2025 with the 5070 delayed. Instead, Nvidia is committed to an as-scheduled early-2025 5070 with eye-watering prices. AMD will be in its own segment now on price and value. The alternate scenario would have made for cheaper midrange GPUs this holiday season, but AMD would have been further weakened. What is about to play out could result in Nvidia losing a lot of market share as people abandon $600+ GPUs for gaming.
Think you guys are rationalizing AMD's midrange strategy. If what you suggest about investment were true, AMD CPUs would still be midrange behind Intel. To me, it's all about CUDA... use antitrust action to free up the CUDA language and the market will be open again at the high end.
Hard to use antitrust against an in-house-built feature from ages ago; it's not something they acquired or bought into recently to cut out an existing market. And CUDA is being opened up anyways: earlier this year CUDA was opened to work with non-Nvidia GPUs, so any attempt to reach for antitrust legislation is going to be an absurdly uphill battle.
@@Real_MisterSir Where did you read that CUDA is opening up? Just because projects like ZLUDA exist doesn't mean CUDA is opening up now. On the contrary, NV tries to lock CUDA down even more. Personally, I really hope NV gets pushed to truly open CUDA by the EU and the French government, to Intel's, AMD's, Qualcomm's, and Apple's benefit. But who knows what happens.
@@Real_MisterSir It is a computer language... sure, they have hardware geared toward executing it, but how is it different from instruction sets for processors? Nvidia actively opposes competition like emulation and code translation for CUDA. I am no expert, but for the sake of competition, they have a monopoly there, do they not? Plenty of businesses lose ownership of IP for competitive reasons after a time. Generic drugs, for example... even trademarked product names have been lost or are no longer protected by the government. Even Apple lost the ability to control the term "App Store".
@@Real_MisterSir I should have added: the point is to get it to open up, so if that happens, no need for action. If AMD or Intel can then work on making their tech execute CUDA stuff, there may be more competition in the markets Nvidia dominates, and more ability to earn the revenues needed to compete in the future.
AMD is gonna look at Nvidia's pricing and just put theirs slightly lower, with worse RT performance and upscaling quality.
Trust
Smile
Huff the hopium with me friend. 4080 killer in raster for $400!!!!
Most likely, but it will also depend on how much worse the RT perf and FSR4 quality are. Since we know they will offer 4GB more VRAM, it should generate enough hype to counter Nvidia's pricing on low-VRAM offerings.
Unlikely. AMD is releasing a midrange card while Nvidia is releasing high-end cards. Unless Nvidia drops their whole stack at once, it's likely that AMD will be setting the price for midrange this generation. Hopefully the 8800XT is much stronger than the 5070, and they don't get greedy and try to sell it for $600-700. Although with incoming tariffs, they might not have a choice. Last time the tariffs hit, the 3080 went from $750 AIB cards to $950 AIB cards overnight.
Yep. Just low enough to look, on paper, like the better price/performance option, but not good enough to actually make a dent in Nvidia's brand name and superior software stack appeal. And crucially, not low enough that they can't hold on to their sweet, sweet margins.
Sadly you are right
AMD 6708V
6 = Zen gen
7 = tier (3, 5, 7, 9 as before)
08 = cores
Letter = product feature (V for V-Cache, X for unlocked, E for Eco, etc.)
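To make the proposed scheme concrete, here's a toy decoder for it. To be clear, the "6708V" format is this commenter's hypothetical, not a real AMD SKU convention:

```python
import re

def decode(model: str) -> dict:
    # Hypothetical scheme from the comment above: gen, tier, cores, feature letter
    m = re.fullmatch(r"(\d)(\d)(\d{2})([A-Z]?)", model)
    if not m:
        raise ValueError(f"not a valid model string: {model}")
    gen, tier, cores, feat = m.groups()
    features = {"V": "V-Cache", "X": "unlocked", "E": "Eco", "": "base"}
    return {
        "zen_generation": int(gen),   # first digit
        "tier": int(tier),            # 3/5/7/9 as before
        "cores": int(cores),          # two-digit core count
        "feature": features.get(feat, feat),
    }

print(decode("6708V"))
# -> {'zen_generation': 6, 'tier': 7, 'cores': 8, 'feature': 'V-Cache'}
```

A nice property of the scheme: the name is fully machine-decodable, unlike the current numbering where core count and cache have to be looked up.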
Nothing AMD or NVidia is doing interests me.
Why? Because they have both "deemed" that an "entry level" product now costs at least US$300!
In this economic environment???!!
They BOTH have zero products that serve the sub-$300 market, so I'm not interested in their products - they are simply WAY too expensive to consider!
Many good 2nd hand cards under 300.
@@liberteus Not talking about the used market. What has either company offered, good or bad, since the 3000/RX 6000 series, which will soon be 2 generations back?
A: Nothing.
Nothing since the absolute crap GTX 1630, RX 6400, and RX 6500. Both AMD parts were mobile GPUs gimped to the point that they could not compete with the RX 580 they were replacing in some tasks, due to certain parts living on the CPU side of the die. And being x4 cards, they were a disaster on PCIe 3 boards, which most people who had RX 580s and GTX 1650s were using at the time.
Last gen's bottom tiers were the RTX 4060 at $300 and the RX 7600 at $270; the 16GB 7600 XT came later at $330.
I live in Australia, and you cannot get an RTX 4060 or RX 7600 for less than A$500!! And as I type, you cannot buy any of the RX 7600 line; there is no stock.
What's the bottom tier going to cost this time? $500? $400? $300? Or something under $250 that normal people can afford without it being absolute garbage?
I'm hoping AMD remembers who kept them in the GPU business, that Intel's Battlemage offers a compelling choice cost- and feature-wise, and that they understand there is a huge untapped market the other 2 have forgotten about.
I agree, the GPU market is not consumer friendly. If you're not against it, the best way to get great value is to check the used market. There are some killer deals to be had. If you're looking to upgrade your GPU under $300 and buy new, hold off just a little longer. As soon as the new GPUs are announced, you'll see more deals. You can pick up a 16GB 7600XT right now for $300. I'd expect the 7800XT to creep down towards the $300 mark in the first half of 2025.
@@davidmccarthy6390 You're right. I'm in Canada and it's the same. Our dollar is worth nothing, our standard of living has been chewed away by inflation, and it's impossible to get anything below CAD$500 (and add 15% taxes on top). For such a price I'm fairly sure I could grab a 3080 though, or a 6800 XT... But Canada, like AUS, is a vast country, and unless you live in a metropolitan area the 2nd-hand market is kinda limited.
@@liberteus We have the GST at 10%. The only difference is that in Australia the retail price has the GST included by law.
For reference, I bought the RX 580 8GB in early 2019 for A$250.
I have been waiting ever since for a cost-effective replacement. AMD's offerings were inferior to what I had, NVidia's carried a roughly 50% price premium for similar-tier performance, and to this day they cost ridiculous amounts here. Prices start at A$500 for the 8GB 4060; add $150 for the 16GB version.
That kind of money bought you at least the 80 series in 2019; now that costs over A$1600 here.
Not realistic, when I can buy an entire PS5 with all the bells and whistles for less than that,
and that's only a single component of a bigger build...
Don't even ask what the RTX 4090 costs here. It goes well over A$5000, but generally sits in the A$4000-4500 range. I can buy a GOOD used car for that sort of money.
IMO, what the GPU market really needs is a proper successor to Polaris.
The RX 460 4GB was about as fast as a GTX 950 but could be used with PSUs with no PCIe cables, was about 20% faster than the GTX 750 Ti, had twice as much VRAM as both 50-class GPUs, and cost only $120. A successor would be as fast as an RTX 3050 8GB, have at least 12GB VRAM, use 75W or less, and cost less than $150 ($120 plus inflation, or slightly undercutting the 3050 6GB).
The RX 480 8GB was about 5% slower than a GTX 980, but had twice as much VRAM, and cost less than half as much. A successor would be as fast or slightly slower than an RTX 4080, have 32GB VRAM, and cost $600 or less.
The RX 470 4GB was about 5% slower than a GTX 970 at just over half the price, and used less power. A successor would _either_ be about as fast as an RTX 4070, have 12GB VRAM, use less than 200W, and cost about $320; or be about as fast as an RTX 4070 Ti, have 16GB VRAM, and cost about $400.
Giving Navi 44 12GB probably isn't realistic: if it's a 128-bit bus or less, it would require clamshelling, because 3GB GDDR chips aren't likely to be available until 2026, and clamshelling costs too much to be viable for an entry-level card. It might happen on some workstation configs of N44, but not on a
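Whatever the cut-off tail of that comment said, the memory math it rests on is easy to sanity-check. A minimal sketch: each GDDR chip exposes a 32-bit interface, so chip count follows from bus width, and clamshell mode doubles the chip count by hanging two chips off each 32-bit channel.

```python
def vram_gb(bus_width_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32   # one chip per 32-bit channel
    if clamshell:
        chips *= 2                 # two chips share each channel
    return chips * chip_gb

print(vram_gb(128, 2))                  # 8GB  -> plain 128-bit bus with 2GB chips
print(vram_gb(128, 2, clamshell=True))  # 16GB -> clamshell, but costly
print(vram_gb(128, 3))                  # 12GB -> needs 3GB chips (~2026)
```

So on a 128-bit bus, 12GB really is stuck behind 3GB chip availability, exactly as the comment argues.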
AMD has a lower budget to build a CPU compared to Intel, and yet they are able to come out with a better design and execute better as well.
Threadripper was great at the start. Now it's purely a workstation product with lite and pro versions. I have no desire whatsoever for 16 cores if it means the limitations of the consumer socket, and I also have no desire to pay as much for just a 24-core CPU as it used to cost to get the HEDT CPU, board, and memory combined. We really need a TR4 successor: something with 12 to 32 cores, quad-channel non-registered (aka traditional desktop) memory, and 48 PCIe lanes, that doesn't cost an insane amount.
One thing that is becoming increasingly popular is Mini PCs. One of the ironies of this market is that units like the EliteDesk 705 G4 have an expansion connector for a discrete GPU. It seems that a significant number of buyers of refurbished EliteDesk 705 G4s are upgrading with a Radeon 560 dGPU. On the surface this makes little sense: the 705 G4 uses a Ryzen 2400G(E) APU, and the dGPU only offers about 40% higher performance than the iGPU. But this is enough to change an almost-playable Fortnite experience into an actually playable Fortnite experience.
It is a shame these dGPUs can only be used with AMD Ryzen based office PCs. The older Intel based office PCs have a more powerful (for gaming) APU in them. It is also a shame that the Radeon 560 is the most powerful dGPU available for these refurbished office PCs.
Currently it would be great if a Navi 2 dGPU could be installed in one of these low end gaming PCs. But the real potential is for the future.
It appears that Mini PCs, like the BMAX Mini PCs, have a bright future. This future could be even brighter if low-end Mini PCs were upgradeable with dGPUs like the Radeon 560, and future versions of them, like one based on Navi.
Ideally, both AMD and Intel would produce low-end dGPUs based on their high-end iGPUs that could be used to upgrade refurbished office PCs and low-end Mini PCs. Imagine if, after AMD discontinues support for Vega, EliteDesk 705 G4 and other Mini PC users could purchase a low-end Navi dGPU to upgrade. Better yet, buy a low-end UDNA dGPU based on a high-end UDNA iGPU. And Intel could follow suit with dGPUs based on their highest-end Arc iGPUs.
Oh yes. I think AMD cancelled high end Navi 4 to focus on the transition to UDNA.
I wanna see AMD do 5080 performance at 600 to 800 bucks. $1000+ GPUs are BS.
awesome as always
AMD should be at the forefront: go full-on path tracing and beyond. PUSHHHHH further, more!
Thanks
Intel GPUs did one excellent thing: they make amazing encoding accelerators for Plex servers, especially since they all have the same encoding capabilities, so there's no reason to spend the money on the top-tier devices. An A310 works perfectly.
So if TSMC is forcing Intel to buy 3nm and nobody is stocking up on Arrow Lake, what's happening to the extra capacity?
Regarding the RX 8800XT (or whatever AMD's highest card will be)… Nvidia isn't stupid, imo. You can see that when you know the RTX 5070 is only 6400 cores compared to 5888 on the RTX 4070, so relative to the flagship the die will be even smaller this time around. Meaning: the RTX 4070 was 36% of the RTX 4090, while the RTX 5070 is about 30% of the RTX 5090, another ~6% smaller share than the already bad RTX 4060, I mean 4070. It will use more power, and GDDR7 will be used as well, to make it around 40% better, which will mean… more or less 15% behind the RTX 4080 (a bit better than the RX 7900XT). AMD will have something around the same performance, maybe a bit better, but who knows about RT…? Thing is, Nvidia CAN price this garbage below 600 dollars EASILY if they want to and still make their normal margins. It's not hard for them. If they want to be assholes, they can even go to 550 and tell everyone how they understand that people are mad 😢 because of the old 40-series prices, but this time…
Well, we had this BS happen before with the RTX 3070. Point is, the RX 8800XT needs to be REALLY good and REALLY cheap, or else I fear Nvidia will be the one who laughs last again. The RTX 5070 could also sell at 700 dollars, sure. But Nvidia isn't dumb. They know what they can do, and that people just don't care enough about AMD if the price is almost the same.
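The relative-size claim in that comment checks out if you run the numbers. The 4070 and 4090 core counts are official; the 50-series figures were still rumors when this was written:

```python
# CUDA core counts: 40-series official, 50-series as rumored at the time
cores = {"4070": 5888, "4090": 16384, "5070": 6400, "5090": 21760}

print(f"4070 / 4090: {cores['4070'] / cores['4090']:.1%}")  # ~35.9%
print(f"5070 / 5090: {cores['5070'] / cores['5090']:.1%}")  # ~29.4%
```

Roughly 36% versus roughly 29%: the x70 part shrinks again relative to the flagship, just as the comment says.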
I want 4K 240Hz; that's why I will buy a 5090, and after that a 6090 and a 7090, if they make it possible in really intensive games like Black Myth: Wukong, for example, which is a heavily GPU-limited title atm if you have a high-end CPU like the 9800X3D.
So no, if you want to play high-refresh-rate 4K SINGLE PLAYER games that are really graphically intensive, you don't want 4090 performance, you want more than that. Even a 5090 won't be enough for what I want. Features like path tracing can look amazing; for that alone we will need far more powerful GPUs than we have now. If I buy the ultra-high/highest end, I want to play my games in the best-looking way possible. That's why I also bought a 32-inch 4K 240Hz OLED monitor. Driving that in intensive games will require a better CPU than the 9800X3D (which I have) and a better GPU than the upcoming 5090 (which I will have until the 6090 is out). So I already know that I will upgrade again as soon as there is another big step up.
Jon Peddie's Q3 GPU report tells a grim tale for AMD's GPU division. They don't have a mountain to climb; they have to climb to the moon. Nvidia is outselling them 10 to 1, sometimes 12 or 15 to 1.
ouch
thx for stats
I have VERY low expectations for RDNA4 but will be VERY happy to be proven wrong. I have a 4090 atm so I just doubt AMD can beat that for a very long time. Time will tell.
I'm kinda in the same boat, although I wouldn't phrase my opinion in quite a pessimistic way. We'll see - the tech is there...but I'm not willing to bet anything on RADEON actually going for it.
I think that point about Radeon sticks to the wall more than any other: until AMD really invests in the software side of things on a level that creates confidence in their ecosystem, there will never be an Nvidia killer at the high end. Adrenalin software is amazing, and is probably the first time in a long while that Nvidia is playing catch-up, in this case with the Nvidia App. But with FSR they're always way behind schedule, which erodes confidence. Ray tracing seems to finally be on their radar, which is a good thing. AI is where they could really make headway if they started enabling the layman user to experiment with AI, since very few consumers are even aware they can use their GPU that way. Above all else, AMD needs to stop chasing Nvidia and start leveraging their CPU mindshare by enabling certain advantages for Radeon GPUs only. Nvidia has been doing this since buying PhysX, locking out the competition at a hardware level. If AMD wants to compete in the GPU space with Nvidia, they need to play dirty like Nvidia does with the tools at hand, Ryzen being a giant hammer they can start swinging.
AMD needs a home run with RDNA 4. If they don't increase market share, they will simply allocate less and less UDNA resources to gaming cards, and that's probably why they have changed gear to UDNA: if the worst happens, they can keep making Instinct and console chips and offer whatever isn't too much hassle to the discrete market.
RDNA4 looks better by the day. The B580 from Intel is huge, with a big power draw, but has decent performance at a cheap price. But can Intel make any money selling such a big die for such a low price, on an expensive 3-fan card? I doubt it, and thus I see why the B770 and above may be cancelled. By contrast, Navi 44 (8600XT) has a very small die, at 130-152mm^2, and most likely a very low power draw. It should outperform the B580 but be much, much cheaper to make. If AMD is out for blood, they can put the 8600XT at $249 or less, and the more expensive to make 8700XT and 8800XT at prices like $400 and $500. Few people need more power than the 8800XT will provide, if it indeed matches a 4080, so AMD could indeed regain market share they haven't had for a decade.
I/O could be increased with PCIe switches on the motherboard, but if you want PCIe lanes connected directly to the CPU, you would need different CPUs.
If you look at the Steam Hardware Survey, AMD needs to make a card that is as good as a 4070 and decently cheap, maybe $425, as the 4070 is about $500 and the 4060 and 3060 are about $300 and $280.
The top cards there are the 3060, 4060, 1650, 2060, 4060 Ti, and 3060 Ti. Also throw in a good amount of VRAM; that would help a bit.
MIPS is still around? The 90s were a golden age of hardware.
It seems likely to me that there are some tradeoffs to having UDNA. As I understand it, Nvidia has a unified structure, and the money is coming from AI, so gaming just isn't getting the emphasis anymore. My guess is that that explains why the 40 series was such an insignificant improvement over the 30 series. Will 50 be better? We'll have to wait and see.
Absolutely, AMD should just concentrate on X3D. If every CPU they made were X3D, Intel would feel the pinch.
Is it Broken Silicon Episode Ultra 9 285k?
That's actually a good one
@@mexicanopdb Broken Silicon to Intel: "Look who's broken now...!"
Strix Halo is the product I'm after. I've been excited for it ever since it was first leaked on this channel: no more VRAM limitations and only one GPU to worry about. Should be a good upgrade from my Zephyrus G14 2022.
Everything launches not quite ready. You can't perfect everything in the time you have, but sometimes you miss some important things.
AMD would shake up the market with an easy 3080-class GPU with 32GB of VRAM for AI hobbyists. Yes, that's the new thing.
I know that I am going to be building a full AMD build soon. I will be going from a Ryzen 5 3500 with an RTX 2060, and I am excited to see what increase I will get in gaming performance.
Supply constraints, like GDDR7 on Blackwell, may be a good reason to market the high end as professional graphics cards.
AMD should focus on affordability with maybe a single top-of-the-line GPU
Most of us are sold on DLSS
Ahh AMD... never fails to disappoint! The 7900 XT/XTX should've been launched as the 7800/7800 XT for $650 and $800. Imagine the 7800 XT as the successor to the 6800 XT: around 50% more perf for 20% more price, with 8GB of extra VRAM. A 4080 competitor for 33% less cost... what a missed opportunity that was!
Now, if they stop fucking around with the name... release the top end as an 8750 XT (5070 Ti / 4080 perf) for $650 with 20GB VRAM and an 8700 XT (5070 perf) for $500 with 16GB VRAM, and they will really shake up the mid range and restore some consumer confidence...
But honestly I have a feeling they'll promise 5080 competition, name it 8900 XTXX so consumers think it challenges Nvidia's 90 class, charge $800, and then deliver 5070 Ti performance with 4070-ish ray tracing and slap on an unnecessary amount of VRAM.
RDNA4 is only going to sell well if the top model is really as fast as a 4080 and $500. If it's $600, people will just buy the 5070 instead. 90% of normal people don't understand enough about GPUs to care about 12GB of VRAM.
Honestly... I just wish Nvidia gave us more VRAM lol. The 80-tier cards are just suffering, especially when you have to pay $1000+ for them.
I feel like they need more VRAM and a wider memory bus. The 5080 getting a 256-bit bus is ridiculous; it basically negates the move to GDDR7. It's got essentially the same bandwidth as a non-OC 7900XTX. With AMD out of the high end this gen, I'd consider an NVIDIA card for just one gen if I didn't have to take a step back in several areas and pay way more to do it.
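That bandwidth claim is simple arithmetic: bandwidth = bus width x per-pin data rate / 8. Assuming the widely rumored 30 Gbps GDDR7 for the 5080, against the 7900 XTX's stock 20 Gbps GDDR6:

```python
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # bits per transfer -> bytes per second

print(bandwidth_gbs(256, 30))  # 960.0 GB/s -- rumored RTX 5080 config
print(bandwidth_gbs(384, 20))  # 960.0 GB/s -- stock RX 7900 XTX
```

Identical 960 GB/s, which is exactly the commenter's point: the faster memory is spent entirely on compensating for the narrower bus.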
I don't know about you all, but I am getting the distinct feeling Jensen thinks we are all a bunch of knuckle-dragging idiots. Sorry bub, I haven't bought one of your GPUs since the original GTX 1080. AMD does just fine for me.
I've got my fingers crossed that the RX 8700 XT (or whatever it's called) takes a page out of the RX 5700 XT launch and comes in as powerful as Nvidia's previous gen's top-end card for $400.
No way in hell it's as fast as a 4090 for 400 dollars, I'd love to be wrong
@Vriess123 lol, the RX 5700 XT was as fast as the GTX 1080 Ti. Now, I seriously doubt it will touch the RTX 4090, but I'm hoping it trades blows with the RTX 4080.
Trading blows with the 4080 is what the presumed 8800XT is going to do on the optimistic end (pessimistic is 7900 XT). And that is gonna be $500-600 for sure. Hopefully $500 though; that's half the price of the 4080 Super.
@Hayden2447 I hope they don't release it for $600. That would be a huge missed opportunity. It needs to be $400-450. They would dominate the PC gaming market at that price.
9957XTX ULTRA PRO! IS IT A CPU, IS IT A GPU? WHO KNOWS! IT'S SO EXCITING 😂
Nvidia is probably downplaying the fact that the 80 and 90 are gaming cards because of the crypto lawsuit. They don't want to get sued again, so they're probably going to market the high end as a non-gaming product, so the next time they get sued they can say, 'but we said those aren't gaming cards'.
Well, and to try to con people into thinking that GPUs aren't as insanely priced as they actually are.
The way I see it, Radeon isn't making much money atm anyway, as they need to sell N31 and N32 basically at cost just to move them. RDNA4's dies are small, simple, and cheap. Price them aggressively regardless of what Nvidia does, and just earn the goodwill. Chances are they will make money anyway.
This podcast and its focus is usually too long for me. However the questions had decent answers from the guest, and maybe the editing cut out some fat. Ended up listening to the whole thing. :)
Nvidia will be calling the 5080 & 5090 professional cards for only one reason: a way not to sell them to China due to restrictions. Or rather, a way to placate the Chinese market. Look, we can sell you a gamer card! Yay.
Isn't it obvious that anything that requires advanced packaging is currently being squeezed out by the big bucks that are being paid for the AI GPUs? There's no mystery.
Is James using that AI eye tracking technology to make it seem like he's always looking at the camera? I can't unnotice it now.
Yes, I did. I wanted to try it out and see how noticeable it was. It's a setting in Nvidia Broadcast.
Unifying architectures should make designing GPUs a lot more cost-efficient for R&D than having one architecture that caters to consumers while a separate one caters to professional and enterprise. If AMD then sticks to maybe 2 or 3 simpler dies for gaming, catering only to the section of the market where market share is actually generated (entry and mid range), I feel like they can produce decent, competitive products while also rerouting resources into the software side: improving AI-based upscalers (like it or not, it's a sales argument), an NVENC equivalent, their drivers, etc. It should also make more widespread use of ROCm easier, I think? Also, AMD needs to improve their marketing.