@B.Ch3rry Oh yeah, for sure. I'm just hoping that, since Zen 6 will come out anyway and OEMs will implement it in laptops and desktops, it won't suck in comparison to the Arm offerings when it comes down to performance and efficiency 😂 WOA still has a long way to go, but I think the next generation of X Elite will be a big boost for the industry if it builds on what we saw with the Snapdragon 8 Elite. For a couple of years now Qualcomm has already beaten Apple in performance on smartphone GPUs; this year it seems they beat them in efficiency too, and the architecture of the GPU itself is finally similar to what we have on laptops, so I think they have the potential to bring a killer APU for Windows in mid 2025 this time around, really competing with Apple. But the problem is still the same: app and game compatibility, and how long developers will take to optimize for it. When Nvidia announces their Arm APU for laptops things will move faster, for sure, but I'm pretty sure it will take at least two years before all the games and professional apps are Arm native and perfectly optimized for ARM/Nvidia. In the meanwhile, "we" need to stick to x86 on Windows and Linux for the best experience, but unfortunately not the best efficiency 🫠
We are entering the 5K world now... 4K is becoming the new 1440p, the new standard for PC monitors, while 5K is the luxury class until 6K comes out
You're delusional. AMD already said they're not making halo products for the 8000 series; there will be no 4080 equal, they already have one. Stop listening to uninformed youtubers
You also need a monitor or TV over 144Hz to really take advantage of it. If you're limited to 144Hz, then there's no point. I do know some 144 or 120Hz TVs allow 240Hz at 1080p; in that case I can see the point.
You should rephrase this to say that as long as you're 100% GPU bound you're OK. Sometimes 4K can see a difference from the CPU, but not as much as 720p or 1080p low
Ray tracing makes you really CPU bound even at 4K, and besides, better 0.1% and 1% lows equal a much smoother and more future-proof experience. My 5800X3D will serve me well until the 11800X3D while suckers can keep buying Shintel 😂
Whokeys Black Friday Sale 25% code: RGT
Windows 11 Pro $23.2: biitt.ly/581eD
Windows 10 Pro $17.6: biitt.ly/9f0ie
Windows 10 Home $15: biitt.ly/XYm9o
Office 2016 $28.8: biitt.ly/cy7Vb
Office 2019 $48.6: biitt.ly/YInvw
#blackfriday #Windows11pro #whokeys
whokeys.com
Yes, Paul. I am buying this 9800X3D and the 5090. Right after I go to the moon with my Lamborghini, just like in Fast and Furious.
Well, I shall be getting the 9950X3D and a 5090 as soon as they are available. 🙂
The chip price.. pricey but tolerable.....that 5090 price that is probably out of my league.
Would be a heck of a combination though...not gonna lie. 😊
@@metalikmike1 Why not buy an array of 5090's and build a supercomputer? I mean, after orbiting Uranus with your Bugatti, just like in Fast & Furious
@@tinmanslickgreasy999 Well, even if you can afford it, I don't think it's worth paying that kind of money to a company that is greedy and already the most valuable in the world
can I be the passenger?
I can tell you, whatever Nvidia decides to do, even if it's cheaper for them (like an MCM design over a monolith), it will ALWAYS be more expensive for the customer. The price jump from the 30 series to the 40 series was absolute highway robbery
The price jump from the 10 series to the 20 series was robbery!!
@@sacamentobob Yeah, that 20 series 'RTX' price jump was pretty sickening. What's infinitely more sickening is that today, after 2 more generations, RT is STILL no more than shiny chrome and puddles. Even more sickening is that Nvidia and AMD are clearly working together to price fix while raising the price bar. Today's focus is not on hardware innovation... it's on software upscalers and fake frames, all because those two tactics let Nvidia and AMD avoid expensive hardware R&D.
@@ChrisM541 Nvidia doesn't need AMD's help to price fix when AMD is less than 10% of the market
@@NadeemAhmed-nv2br Nvidia doesn't need a high percentage of the market to price fix with a competitor ;)
MCM does not make things cheaper; that's a misconception. If anything, monolithic is cheaper than MCM. MCM is about yield, not about making things cheaper. Sure, those small dies might be cheaper than a bigger monolithic die, but the advanced packaging and the more complicated software needed for those separate dies to work as if they were a single die make MCM as a whole much more expensive. AMD already encountered this issue with the RX 7800 XT, and that is only GCD + MCD, not the much more complicated GCD + GCD.
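The yield-versus-packaging trade-off described above can be sketched with a toy Poisson yield model. All numbers here (defect density, cost per mm², packaging premium) are illustrative assumptions, not real foundry figures:

```python
import math

def die_yield(area_mm2, defects_per_mm2=0.001):
    """Poisson yield model: probability a die has zero defects."""
    return math.exp(-area_mm2 * defects_per_mm2)

def cost_per_good_die(area_mm2, cost_per_mm2=0.10, defects_per_mm2=0.001):
    """Silicon cost of one working die, amortizing the defective dies."""
    return (area_mm2 * cost_per_mm2) / die_yield(area_mm2, defects_per_mm2)

# One 600 mm^2 monolithic die vs three 200 mm^2 chiplets
mono = cost_per_good_die(600)
chiplets = 3 * cost_per_good_die(200)
packaging = 25.0  # assumed advanced-packaging premium for the MCM part

print(f"monolithic: ${mono:.2f}")
print(f"chiplets:   ${chiplets:.2f} silicon + ${packaging:.2f} packaging")
```

On these made-up numbers the chiplet silicon is cheaper (roughly $73 vs $109), and it is the packaging premium, plus the engineering to make separate dies behave as one, that decides whether MCM wins overall, which is exactly the commenter's point.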
Imagine if the 8800XT had: 4080 performance (raster), 4070Ti performance (RT), Tensor cores for FSR4 (so AI based FSR) and 24GB VRAM instead of 16GB for $499. Not gonna happen, sure, but IMAGINE man. That card would sell so well lmao.
It needs to happen if they want to win back some market share. Taking the hit in consumer GPU profit will help them in the long run.
it would. not. be sold for $499.
50% more vram? That's just unnecessary lol the extra cost will only get passed onto you. 16gb is still plenty
@@Shieftain it won't. They tried it last time and went bankrupt and got broken up.
Had they survived, they'd have a foundry and still own and dominate, instead of Qualcomm, which was partially built from old AMD's ashes, etc.
Make it $699 and it's a realistic deal.
Can't wait to see a 5700X3D/8800XT as the budget PC build, assuming the GPU releases at $500-550 and performs like the 7900XT with better RT. Of course, this build will set you back $1100+, which is cheap because the performance is going to be a lot
9800X3D is a great CPU for gaming..... If you can find one 💀
Just get the 7600X3D, it's good enough, cheap, and at 4K it's mostly all GPU anyway
I really can't see AMD going lower than $599 if the card is stronger than the 7900XT, not least because they have loads of 7900s and 7800s to shift.
if AMD wants to gain market share they need to make the video card worth it over Nvidia on cost
@@ZaberfangX I think $599 would still work for a 16GB card with AI-based FSR and much improved RT vs a 12GB 5070 at $799-899???
But yeah, if the 5070 is $599-699 then AMD would have to go to $499 or lower.
From what MLID is mostly saying, the 50 series and RDNA 4 have been ready for some time now; they could have released both product stacks by now, but stock of the old gen is what keeps them at a ceasefire and waiting.
If RDNA 4 ends up as the leaks suggest, then high-end RDNA 3 would become unwanted e-waste even if RDNA 4 is $600, since it's a good buy; at $550 it would be great and $500 would be insanely good for this economy... a 4080 Super at half off. So the delay is mostly a move to prevent too many losses on old hardware.
And if they want to get aggressive with market share, which they claimed, going $550 would probably get them to 20%, since a lot of buyers have been waiting a long time now
the 5070 will be around $599, they kinda have to
@@ZaberfangX But AMD still isn't stupid enough to use aggressive pricing to do that. Gaining market share against Nvidia is not as simple as heavily undercutting them. AMD knows that best after doing exactly that for the last 15 years and still losing market share against Nvidia.
There needs to be an 8 core, 40 CU Strix Halo for gaming laptops, ideally with 3D V-Cache.
“Thanks very much for watching the ad.”
That was when I knew I could stop pressing the forward 10 secs button.
😂
Until AMD gets significant mindshare they will never get much penetration into the GPU market. AMD either needs to blow out performance vs Nvidia (not likely anytime soon) or take a hit on pricing for a generation or two; otherwise you just keep the status quo. AMD can either keep doing the SOS or challenge Nvidia on price to the point customers will take a chance on an AMD GPU.
AMD is done going into price wars with Nvidia. It only hurts them rather than giving them more market share.
AMD needs a 'Zen' moment in their GPU division. That moment can only come with multi-GPU chiplets. Unfortunately, AMD, despite their significant lead in this area, are a long way from doing it with GPUs. It's infinitely easier to connect 6-16 CPU cores than the countless equivalent connections needed with GPUs. Until AMD can do that, Nvidia will continue to grow and dominate.
@@ChrisM541 nvidia probably will be ahead of AMD to make MCM GPU work in games.
@@arenzricodexd4409 There's no doubt Nvidia are hard at work on this too because if AMD cracks the secret first, they can instantly overturn Nvidia's dominance.
Getting my 9800X3D today, can't wait 😃
Fire Range X3D will be awesome (if it is available in more than laptops this time, that is)
I'd rather get a good product for a good price than an Nvidia product for an Nvidia price. Right now I'm running my first AMD card in some time and see no reason at all to pay what Nvidia wants for its products. And once path tracing becomes a common thing in about 5 years, all the cards will support it very well. But since everything needs to work at console level and barely gets optimised for PC capabilities, there is still no point expecting that for the next 2 generations. The next cash-grab acronym will now be AI, and ray tracing will slowly become a mainstream thing... just like shadows a couple of decades ago
Will they have 40% performance uplift?
Got a 9800x3d on the way actually. Can't wait.
Thinking of upgrading my 5800X3D to a 9800X3D. Not sure if it's worth it for 1440p.
@@merrydoc1051 Nah it isn't, maybe only if you have a 4090, then it could get bottlenecked by the CPU
@@merrydoc1051 It only seems reasonable if you play at 1080p. Otherwise you'd be GPU bottlenecked. You should watch some videos before taking action, I guess.
Stop with the hopium. If the “8800XT” trades blows with the 4080 then AMD can go $700 and still call it “aggressive pricing”
Honestly I think it will be on par in raster with an RX 7900XT at best (at reference speeds and models), and in ray tracing it might be on par with a 4070 Ti (but with better 1% lows/minimums due to the larger memory bandwidth and capacity compared to the 4070, 4070 Super and 4070 Ti). Anything higher I can't really imagine with just 64 CUs at ~3GHz. To match the RX 7900XT in real-world performance it would probably need to clock at 3.3-3.4GHz, since the RX 7900XT typically reaches 2.5-2.6GHz while gaming (without overclocking, just normal boost).
But a larger and faster L3 compared to N32 (hopefully 96MB) might mitigate the need for such high clocks, and hopefully is able to counteract the lack of memory bandwidth (800GB/s vs 640GB/s)
Yes, Zen 6 X3D with a better IO die and with CCDs that have more cores, say 16 each, would blow the world away
Yes but neither Amd or Intel want to spoils us so that wont ever happen. Even at the risk of going bankrupt Intel sticks to 8 cores.
@@impuls60 Intel sticks to 8 P cores because any more would skyrocket power consumption to the moon.
@@riven4121 Which is weird, because the 10900K was fine. It has less to do with power and more to do with the architecture trying to mimic Apple. E cores are great for mobile devices.
been waiting a long time to see that
Ryzen with more cores, like 24 or 32, is exactly what I am waiting for in my next PC build
still no info about how many stream processors or how many TFLOPS RDNA4 has
I've bought several keys through your Whokeys sponsorships in the past but I still get surprised by the ad reels and hear it as "Hookies" and think of hookups...
Thank you Paul for the nice AMD CPU/GPU plans leak 🤓
That's the ticket...go on...'fall' for those leaks 🤓
Advertisement way too long
I have my 9800X3D, now I need a 6800XT replacement, AMD =)
The AI 370 is what I'm intrigued to go for, to upgrade from my 5600 AM4 rig. The 9800X3D is very heartwarming but unobtainable, and anyhow I would probably go for the 9900 because of the 12c/24t it offers. Curious about end-of-year/holiday sales on components, and the new video cards: Intel Arc 2nd gen maybe in December, January CES range for the 5000 series Nvidia and 8000 series AMD, which should be out in the first quarter of next year maybe... So depending on what's available it looks promising; if not, then either a mobile workstation with the 370 AI, or a NUC-like device with it
AMD won't release until Nvidia does so they can slot in, but if performance is similar price-wise I'm going AMD. I have a 6950XT and my son has a 6750XT and we haven't experienced any driver issues whatsoever
Thanks for your very nicely done video as always.
If AMD/Intel play their cards right I can see a big shift away from Nvidia, mainly due to cost and the state of the world right now. Even with Trump in office it's going to take time for the price of goods to come down, and people have been hurting for a while now. Nvidia doesn't play well in anything but the very high-end market, so if AMD/Intel can make a card close to Nvidia's for half the cost, I think we will see some Nvidia fanboys jump ship.
Yes, but don't forget that NVIDIA has the best brand GPUs and tech.
And don't forget they're coming with their new APU.
Someone I know has already talked to me about it and it's incredible. He works in.😊
I already bought a 4070 Ti Super, and having over $2k in my PC, I'll wait a year until all the prices drop a lot; plus scalpers will buy a lot of the new stuff anyway.
Will CrossFire and SLI make a comeback?
Why does AMD still use CUs for APUs? Wouldn't WGPs be more efficient?
Sounds like you need to go into the CPU/GPU business ;)
It will be between the 7900 GRE and 7900 XT.
It will not be any faster.
Couldn't agree more with you.
I think it will be on par in raster with an RX 7900XT at best (at reference speeds and models), and in ray tracing it might be on par with a 4070 Ti (but with better 1% lows/minimums due to the larger memory bandwidth and capacity compared to the 4070, 4070 Super and 4070 Ti). Anything higher I can't really imagine with just 64 CUs at ~3GHz. To match the RX 7900XT in real-world performance it would probably need to clock at 3.3-3.4GHz, since the RX 7900XT typically reaches 2.5-2.6GHz while gaming (without overclocking, just normal boost).
But a larger and faster L3 compared to N32 (hopefully 96MB, as stated in some of the leaks; I can't fathom why they went from 128MB on RDNA2 to just 64MB on the 256-bit bus models, while N31 has 80-96MB) might mitigate the need for such high clocks, and hopefully is able to counteract the lack of memory bandwidth (800GB/s vs 640GB/s).
Considering AMD was already 300-400MHz off the leaks with RDNA3, and later even off their own advertisements,
realistically it's only 2.9-3.1GHz, since MLID claimed "anywhere from mid to low 3GHz". 2.9-3.1 was already reached by RDNA3.5 APUs, giving me hope that RDNA4 on N4P can reach those clocks and higher on those big 64 CU GPUs.
In theoretical compute resources, when clocked at 3GHz, it will be right in between a 7900GRE at ~2.4GHz and a 7900XT at ~2.5GHz.
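The back-of-the-envelope scaling in this thread can be written out explicitly as a CU-count-times-clock comparison. This is a deliberately crude proxy that ignores IPC, cache and bandwidth differences; the 64 CU / ~3.0 GHz figures are the rumored specs discussed above, and the GRE/XT clocks are the typical gaming clocks the commenter cites:

```python
def cu_ghz(cus, clock_ghz):
    """Crude throughput proxy: CU count times clock.
    Ignores IPC, cache and memory bandwidth differences."""
    return cus * clock_ghz

navi48 = cu_ghz(64, 3.0)  # rumored 64 CU part at ~3.0 GHz
gre = cu_ghz(80, 2.4)     # 7900 GRE at typical gaming clocks
xt = cu_ghz(84, 2.5)      # 7900 XT at typical gaming clocks

print(round(navi48, 2), round(gre, 2), round(xt, 2))
# Clock the 64 CU part would need to match the 7900 XT on this metric:
print(f"{xt / 64:.2f} GHz")
```

On this metric a 64 CU part at 3.0 GHz lands right at the 7900 GRE (192 CU·GHz each), and matching the 7900 XT's 210 CU·GHz takes about 3.28 GHz, consistent with the 3.3-3.4 GHz estimate above.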
Gears 5 on desktop. Are you ok ......mentally......
?
Is there any news on whether the AMD 8000 series will use ATX 3.0?
Don't think you can call it a roadmap if it's literally Navi "4x" shaded in for the ENTIRE year of 2025.
What about the desktop APUs?
I went out on a limb when I predicted an AMD mega APU, and about... 2... 3 years later they came out with the socketable MI300A (when AMD first announced their IO die I predicted there'd be a mega APU).
So I'm going to go out on a limb again and say AMD may be making a medium APU that is 3D stacked:
A 200-240mm² compute die with just cores, CUs and L2/L3 cache: 8-16 cores, 24-48 CUs (I don't know how much extra space is needed for SMDs under the heatsink).
Then an IO die underneath containing:
3D cache for both the CPU and GPU.
The full PCIe complement, but add CXL so some of those current-gen PCIe lanes can be used for additional memory channels if desired.
Maintain the god-tier memory controller present on AMD APUs, and improve the FCLK/UCLK now that IO can more easily reach the CPU/GPU (the 5700G has better UCLK than my 8700G, but my 8700G can go to DDR5-8000 in 1:1 with MCLK while UCLK is stuck at 2200, whereas my 5700G can only do 5000 in 1:1:1:1).
Some of those Alveo video engines.
(So far the Alveo video engine seems almost as good as CPU transcoding, making it better than even the Apple video engine and far better than the existing video engine, but it's only present in the $1600 MA35D transcode card, which can do something like 10 simultaneous 8K streams.)
I had requested something like this when they brought out the 5700G, because I was upset with how little cache the CPU had and with the PCIe being so cut down compared to the 3700X/5700X.
hopefully people don't get hurt in the knees again running their games at 1080p low settings vs Nvidia's 4K ultra PC master race
Can't wait for the highest 8000 GPU; not really interested in any other product from Nvidia or AMD's current lineup
Ayyye fredripper😅 haven't watched a vid in a min
I got the 9800X3D just for Microsoft Flight Simulator for now until I find out if the 9950X3D is better. I will never touch an AMD video card though. My only disappointment is AMD can't properly use 8000+ MT/s RAM without taking a performance hit. Just my opinion.
RAM performance always seems to be a combination of CL timings/latency and MT rate, so I'd look at benchmarks, since 6000MT/s CL28 can be better than 6800 CL36.
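That trade-off is easy to compute: first-word latency in nanoseconds is the CAS cycle count divided by the real clock, which is half the MT/s rate since DDR transfers twice per clock cycle. A quick sketch comparing the two kits mentioned:

```python
def first_word_latency_ns(mt_per_s, cas_latency):
    """Approximate CAS latency in ns: CL cycles divided by the real clock
    in MHz (half the MT/s rate, since DDR transfers twice per clock)."""
    real_clock_mhz = mt_per_s / 2
    return cas_latency / real_clock_mhz * 1000

for name, (mt, cl) in {
    "DDR5-6000 CL28": (6000, 28),
    "DDR5-6800 CL36": (6800, 36),
}.items():
    print(f"{name}: {first_word_latency_ns(mt, cl):.2f} ns")
```

The 6000 CL28 kit comes out around 9.33 ns against 10.59 ns for 6800 CL36, so the nominally slower kit has the lower absolute latency, which is the commenter's point (bandwidth still favors the 6800 kit, so benchmarks remain the real arbiter).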
@scuffedgodcxcx253 That's why I went with 6400 MT/s CL28, which was hard as hell to find in 64 GB
Sounds like you SHOULD have stayed with Intel ;)
@@ChrisM541 Yes and no.
Thanks Paul
brush your hair dude
🤣
Are you gay?
AMD hasn't even released all the 9000 series CPUs and he's already talking about next gen. Hilarious
Do you have a voice changer on?
Keep it 4K, or else the PS5 just proves it runs 1800p better than our PC 1080p low master race... The only way to beat Sony is to do 4K-only benchmarks at 100+ fps
They need to bust up Nvidia and allow other companies to grow and be competitive. I myself have had nothing but bad experiences with Nvidia
I switched to Macs because of the slow progression with AMD and Intel. But they still have a long way to go when you need CPU coolers bigger than a Mac mini. I would have thought that after 20 years the size of desktops would shrink, but the exact opposite happened. There aren't even any good games out, people are obsessed with FPS, and all the good games worth playing run perfectly fine on 5-year-old hardware.
AMD Ryzen AI Max (Strix Halo), hold my beer, that's the APU I'm waiting on. I'm an APU geek on my 5700G. Waiting, waiting, waiting.
Intel needs to STEP UP or SHUT UP!! ARC Alchemist was introduced too late and didn't put any pressure on Nvidia or even AMD. What's going on with Battlemage? 2025 will be too late - again!
Clickbait
Avoid this video
Unsubscribing
AMD had better have an 8-core X3D Fire Range; the previous 7945HX3D was so bad that it used a lot of power with little to no improvement in gaming. Seriously, the 7800X3D uses a crazy low amount of power, less than 60 W in gaming. Couldn't they have just put that chip in laptops? There is no need for 16-core X3D chips in laptops; an 8-core X3D with high performance and low power is the key to dominating the laptop market.
Wasn't it a 16-core CPU?
@@CSC8393 Which one? The 7945HX3D? Yes, that's a 16-core CPU, and that's why it's bad: it draws a lot of power and has little to no improvement in gaming compared to the non-X3D variant. That's why Fire Range needs an 8-core X3D version.
Will Paul get a sponsor from a hair brush company before RDNA 4 releases or not?
If AMD can release a 4080 equivalent for $1000 AUD, I'll be getting it.
They already have - the 7900XT (or XTX if you need) ;)
@@ChrisM541 The 7900 XTX costs like $1450 for a cheapo model, but hopefully it will drop in price when the new cards come out, only 2-3 months away.
I really hope that AMD (especially) and Intel, together with Windows developers, will find a way to make x86 more efficient for those who need, or want, a powerful yet efficient laptop that gives you the best compatibility and performance even when unplugged. I guess Strix Halo will be the first "test" of how good AMD chiplets can be on mobile, but Zen 6 needs to be at least on par with what Qualcomm and Nvidia will have out on ARM by the time it hits the market. I don't think they'll be able to compete with Apple, unfortunately, for a number of reasons, but given how hard everyone is pushing AI, Windows might find in AMD its "best player" for laptops and PCs on both CPU and GPU, and might be "tempted" to optimize x86 as much as possible before ARM eventually takes over. We'll see; I just hope for the best for us end users, and couldn't care less about brand loyalty.
That's a pipe dream... With Qualcomm, and now Nvidia and AMD, blazing the trail with ARM SoCs/APUs, x86-64 is aging out.
@B.Ch3rry Oh yeah, for sure. I'm just hoping that, since Zen 6 will come out anyway and OEMs will put it in laptops and desktops, it won't suck compared to the ARM offerings in performance and efficiency 😂 Windows on ARM still has a long way to go, but I think the next generation of X Elite will be a big boost for the industry if it builds on what we saw with the Snapdragon 8 Elite. For a couple of years now Qualcomm has beaten Apple in smartphone GPU performance; this year it seems they beat them in efficiency too, and the GPU architecture itself is finally similar to what we have on laptops, so I think they have the potential to bring a killer APU for Windows in mid-2025 this time around, really competing with Apple. But the problem is still the same: app and game compatibility, and how long developers will take to optimize for it. When Nvidia announces their ARM APU for laptops things will move faster, for sure, but I'm pretty sure it will take at least two years before all the games and professional apps are ARM-native and perfectly optimized for ARM/Nvidia. In the meanwhile, "we" need to stick to x86 for the best Windows and Linux experience, but unfortunately not the best efficiency 🫠
We're entering the 5K world now... 4K is becoming the new 1440p, the standard for PC monitors, while 5K is the luxury class until 6K comes out.
You're delusional. AMD already said they're not making halo products for 8000; there will be no 4080 equal, they already have one. Stop listening to uninformed YouTubers.
The 4080 isn't a halo product; the 4090 or better is.
@@26Guenter lol
Processors worth buying: Intel i5-12400F and AMD Ryzen 5 5500.
GPUs worth buying: Nvidia RTX 3050 8 GB GDDR6, AMD RX 6600 8 GB GDDR6, and Intel Arc A580 8 GB GDDR6.
Above that, it's worthless.
If you still play at 1080p, then the 9800X3D is for you. If you're at 4K, X3D doesn't really benefit you.
You also need a monitor or TV over 144 Hz to really take advantage of it too.
If you're limited to 144 Hz, then there is no point. I do know some 144 or 120 Hz TVs allow 240 Hz at 1080p; then I can see the point.
That's a real cope.
Cope harder.
You should rephrase this to say that as long as you're 100% GPU-bound, you're OK. Sometimes 4K can see a difference from the CPU, just not as much as 720p or 1080p low.
Ray tracing makes you really CPU-bound even at 4K, and besides, better 0.1% and 1% lows equal a much smoother and more future-proof experience. My 5800X3D will serve me well until the 11800X3D, while suckers keep buying Shintel 😂
First
99th first
AMD GPU is not a choice now😂
Keep drinking Jensen's milk ;)
So, Nvidia will keep destroying AMD.... 🤔🙄
Yes, they're gonna run 5K at higher FPS... while AMD fanboys are still stuck at 1080p.
AMD should sell the Radeon division to Qualcomm; nobody buys AMD Radeon GPUs.
I scored a 7900 XT for $500 USD and it's amazing.
Keep drinking Jensen's milk ;)
Your hair is distracting please comb it.
😂
🤣🤣
Are you gay?
When is GoW 6 coming out?