After last gen's 'leaks', I say we shouldn't believe it. It might be -25% power at the same performance, or +45% in a few games that performed really badly.
I hope I'm wrong, but...
Just like with Nvidia and Intel, first-party chart performance should be taken with a grain of salt.
My GPU is 2 gens behind; as long as it's reasonably priced, I'm in.
@@miyagiryota9238 Well the numbers given aren't even first party numbers, but numbers some guy on a Chinese forum put out there. So take these with a whole truckload of salt.
Yeah, Radeon 7000 was so hyped, and eventually the 7800 XT performs almost the same as the 6800 XT 🫠
ladies and gentlemen, his name is Paul!
And he's hoping you're having an AMAZING day
I'll believe it when I see it. We haven't had real competition for a while now, and AMD's abandonment of the high-end is very bad for the consumer. The idea that AMD may bring some competition to the low-mid range is small consolation.
Could care less about RT, it’s useless in any card below the 4070 ti super anyway which is a £799 card.
Give us a 30 - 40% increase in raster compared to 7800 xt at that £499 price tag and people will be happy.
We need to go back to getting value for money. PC hardware and gaming in general are in a bad spot. Things getting too expensive and literally no games coming out that warrant upgrades.
I plan on upgrading every 2 generations, and only when there's a game I'm interested in that my current hardware would struggle to run. Not into paying £1000 to play the same games for the third time but with a few extra frames.
What’s coming out soon? Nothing, save your money, keep what you’ve got.
this
Proper english is "could NOT care less".
I don't know what you are into but for some of us 2025 will be a huge year. Biggest year in a while I think.
I wasn't interested in gaming from 2019 to 2023 but in 2025 I'll have to upgrade my GTX 1070 as we are getting:
Gothic (been 20 years), GTA 6 (12 years), a new HoMM (10 years), a new DOOM, KCD2, Subnautica 2...
I mean, sure.. but even budget cards are ridiculously good at raster right now. 1440p 120fps is commonplace even with 4-year-old cards.
Sad that we are 6 years into RT & there is so little to show for it even at $800. You've got to start somewhere, but it's been so much wasted $ & silicon. Even launching today is probably still too early to be useful.
Maybe we see some good or meaningful implementations after a console has halfway competent RT.
if the drivers are solid intel has a winner!
If the Battlemage 580 dropped with 16GB and the 570 with 12GB, I might have paid attention. As is, this limited memory will not cut it with modded games from the last decade, never mind new titles. The new standard resolution is 1440p, and higher refresh rates are also becoming standard. When you add that to Intel's other offerings in CPUs, I really do wonder what they hope to achieve.

People get giddy about AMD, but they forget that Nvidia and Intel are still the giants in the market. However, this kind of showing is just not good enough. Way too much pandering to the shareholders looking for their short-term fix is putting them in a poor position. If they don't start addressing modern consumer demands they will be left behind, and so much for their share price then. Paul was right when he said these cards are niche. How is that going to help them?
I can't wait, already built my PC.
Mobo: X870E Nova
CPU: 9800X3D
RAM: Trident Z5 Royal Neo (6400MHz/CL28)
SSD: SK Hynix P41 Platinum 2TB
Case: darkFlash DY470
*Only 45% increase in ray tracing is not impressive. Should be at least 2x faster.*
Literally 10 years and we only have around a 50fps increase!!! TF!!!
If the 8800XT can deliver RTX 4080 performance with better efficiency for around $549, then it's a slam dunk for RDNA 4. Hopefully their AI-infused upscaling tech will be ready at launch too.
It is impossible with just 64 CU. RDNA3 would have to have a huge bottleneck that they just unclogged.
Coming from a guy that said the 7900XTX is the 4090's equivalent in gaming.
LOL. Dude is a joke
Ah yes the usual leaks claiming miracles from AMD. Are we really supposed to believe that AMD making relatively small changes to their RT pipeline suddenly results in far better proportional performance than Nvidia even though they're currently still far behind Nvidia's original implementation in the 20 series? Especially after the PS5 Pro's RT improvements (which are supposedly the same as RDNA4's) turned out to amount to pretty much nothing?
Doubling the RT cores per CU isn't exactly a "small pipeline change." The only reason AMD has been behind Nvidia all this time is that Nvidia simply dedicates a lot more silicon to RT cores and Tensor cores. AMD's RT cores are significantly smaller, hence why their performance is so poor. Even Intel Arc Alchemist has larger RT cores, which is why its RT is miles ahead of AMD's, and it has Tensor cores, which is why its hardware-accelerated XeSS is also far superior to AMD's software-accelerated FSR.
It's all about what you're doing with the silicon, and AMD simply chose not to include any Tensor cores and to skimp on the RT cores. It allowed them to make cheaper GPUs and still make fat margins.
@@hasnihossainsami8375 Yes, you're exactly right about why AMD's RT support has been so poor. However, from the PS5 Pro info, they didn't double the RT cores; they just doubled the throughput of one part of the RT pipeline, a part that does not seem to make all that much difference in overall performance. I saw a chart a while back showing that RDNA3 was actually ahead of Turing in this particular spec, yet Turing's RT architecture is still far more performant. So unless RDNA4 completely overhauls the RT implementation, in a way completely different from the PS5 Pro's changes and which nobody has suggested as of yet, I don't see how these wild performance rumors are even remotely possible.
@@nimbulan2020 Very sad news.
N44 $250, N48 $400 best case scenario
5900 $650, 5800 $500 best case also
Comparing the 8800XT to the 7900XTX is going to be a lot like comparing a Radeon RX-5700XT to a Vega64, or an R9-390X to an RX-480.
Ray tracing at a level where it actually is a visible quality improvement is such a performance loss that it is only playable on a 4090. Anyone with a GPU below that simply turns it off.
If 2 RT engines per unit still aren't good enough, AMD should put in 3. But RT is useless on an 8GB card anyway.
With AMD, I think they are aiming midrange. Best case scenario, Intel gains a solid foothold in the lower-end GPU market while AMD takes the midrange. I have a 12GB card, and ray tracing in newer games is becoming more and more unavoidable.
Intel needs a Radeon 9700 Pro.
The end: darkness as black as tar.
They need to fix their driver issues before i go back to team red
Darn it, these would have been great GPUs to pair with a Core i5 7400 or an i7 7700, if only they didn't have a broken memory controller that requires ReBAR as a crutch; without it, the controller stalls and stops running at peak performance.
Are they releasing a $300-350 GPU with 4070 performance in both raster and ray tracing next year?
This ain't going to be less than $650-700, especially if Nvidia continues current pricing. RT in today's games is kind of a joke if you need a high-end GPU just to hit 60fps at native 1440p without RT. I'm not sure I want to butcher the resolution and insert fake frames just to see my reflection in a pond of water.
I play zero games that have ray tracing beyond Cyberpunk. In most games I use XeSS over FSR 1 or 2, and honestly it looks absolutely fine on my GTX 1080 Ti. I'm interested in raster, power draw, and price-to-performance.
Thank god for 1.25x to take the edge off of your voice
Can't wait for them to sell the 8800XT for only $100 less than whatever Nshittification prices their 5080 at, so as to offset any reason to want to purchase AMD's GPUs, just like they did last time.
/s
-_-
4080 performance will not happen. Don't get your hopes up because it will only be a disappointment.
It is impossible with just 64 CU. RDNA3 would have to have a huge bottleneck that they just unclogged.
The 8800 XT and 8700 XT have 16GB VRAM, which is what I want since I keep the same GPU for 4-5 years, because I like to mod games too. Now what's the price of the custom models vs. the 5070 Ti 16GB version?
Pay €300-350 more for how much of an FPS and ray tracing win? Maybe I'll buy an 8700 XT and replace it in 2 years with an RDNA 5 midrange 9700 XT.
Go AMD!!!
Many reviewers are already negative on the new Intel Arc GPUs, starting with their negative stink-face thumbnails.
Bullshit, Paul, I bet it will be far behind Ngreedia's 5k series, just as usual.
And RE4 is a very low load for an RT game.
The real question is how it will stack up in path tracing and AI workloads, because that is the future if you want it to last.
That would only be so for a select group of people; most gamers don't give a damn about that and would rather have a €600 card than a €1600 card. Just like with RT: nobody I know uses it or even sees it as a need-to-have.
@Mcorpmike Yes 90% of the market sure is a "Select group of people" LOL
@Mcorpmike I am talking about the Ngreedia market share and the massive crowd that actually cares about new features. I know you are a bit slow, so I am spelling it out for you.
So AMD is reselling you the 7900XTX/XT? Why not just call it an 8700XT and release the actual 8800XT later, one that matches the 4090? And I'm sorry if you think that's impossible; Nvidia has done it constantly.
We're in for Nvidia being the only one with performance gains, and that means more Jensen saying Moore's Law is dead, more ridiculous pricing for actual gen-over-gen performance gains, and less for your dollar.
LOL AMD GPUs.
Typical, that they would compete with the prior Nvidia GPU line. Seems to have been the norm since AMD took over the Radeon division. Whereas? ATI competed with Nvidia, model-for-model...even winning a few generations. Just barely, albeit, but a win's a win!
Then comes along AMD and they're like, "Our Vega64 flagship will compete against the GTX 1080! Oh? It's having a hard time surpassing the 1070? Alrighty then."
Then? They're like, "Our Radeon VII will do absolutely nothing!" Which was true.
Then? "Our 5700XT will be competitive with the 2070 Super! Just you see! Scratch that...let's just nix the 'S' and call it, '2070,' yeah?"
Then? "Our 6900 XT will square off against that 3080! Oh, nuts! It's having a hard time against even the 3070, especially in RT! Well? Shoot!" Next up!
"Our 7900XTX will be competitive with the 4080...well, crap! It's losing to the 4070 in many benchmarks, especially in RT."
This is why nobody trusts them. They cried "Nvidia Killer!" So often? That now people just look at them like the Miata: "Awww...but it's soo cute...just look at those wittle, wittle puppy dog eyes."
More realistically? "Hey, it's not a bad little GPU, if you don't care about RT and upscaling."
Meanwhile? This just in: The latest Steam survey suggests the Nvidia user base is at 75%, while Intel and AMD are left picking up the scraps.
I just read a lot of lies from you. Is this your normal behaviour?
@stephenwerner1662 Hey, case in point? AMD GPUs typically target a non-flagship Nvidia model and often fail to match even that. Thanks for the feedback. You made some compelling arguments to the contrary.
RX 8900 - RTX 5090
RX 8800 - RTX 5080
RX 8700 - RTX 5070
RX 8600 - RTX 5060
Who actually cares about ray tracing dude
Well, if you want to play Indiana Jones, apparently it's required for everyone. I'm sure it will not be the last game to require ray tracing as a way to gatekeep AMD.
Nobody.
Especially if these mid range card cost $500
@@sargentpayne The requirements mean the majority of PC owners can't play it. The majority have a 4060, and we all know its RT performance is basically useless.
Path tracing is really the better technology, but current hardware is still too undeveloped and unready for it.
Me !
Nobody cares about RT.
So an A770, but half off?
If Intel Battlemage does meet expectations, AMD should price RDNA4 accordingly or it will be DOA. The RX 8800 shouldn't cost more than $699.
* $599
$549. 😂😂😂
@@strangedotmachine9281
I want 2 for a little over a grand 😂😂😂
$499
299
How the hell did we sink this low, being impressed with this garbage excuse for a GPU?
I know! I guess we should give Intel a participation trophy for releasing "cutting edge" GPUs whose performance sits between last gen's entry-level cards and the soon-to-be-released Strix Halo APU. 😂