The leaks have put the 9070xt no lower than the 7900gre with the ray-tracing performance of the 4070ti and a max price of $600… That’s the worst case scenario, not even including FSR4
@@haz1101 if you look purely at rasterization performance at native res, then yeah, $500 or lower, but you're forgetting that the RT performance will be at least 4070 Ti level, as well as better AI performance, which will probably make FSR4 perform just as well as DLSS on an Nvidia card. Overall I think it will be a decent card on launch, and then drop in price
I would assume at least SOME level of per-clock performance increase. There are multiple metrics to improve performance with, so if I'm being realistic I'd guess somewhere between the 7900 XT and XTX for like $550-ish
@@tomaspavka2014 Timespy doesn't really mean that much; games scale very differently. With that said, I do think the 7900 XT will be faster. If it comes in at a cheap price, then it's a winner.
@@varcaic9170 based on these leaked benchmarks, it's not looking to be anywhere near that. We'll have to wait and see! Gamers Nexus and Hardware Unboxed will surely let us know :)
@@ilbro7874 what is this thing with the 7900xtx being a room heater? I have one and it doesn’t run hot at all, even in the most demanding games it doesn’t go above 70°c
@@akathenugget I agree. These are very odd comments. I've been running one for a while. Cool temps; I've not even had to play with the fan settings. The 7900 XTX is the best-value card on the market at the £750 I got it for. And that makes it very difficult to upgrade: £1500+ for a 4090 or 5080 for not much more performance in raster, or £2500 for a 5090. I'm willing to upgrade, but only if it makes sense vs my XTX, which, if the rumours are to be believed, it won't.
Expect the RTX 5080 Ti to get 24 GB VRAM, as it would make sense, but the price might be higher than the RTX 4090, with better performance because of GDDR7
There is still the thing with the core count. Having faster memory won't really matter for games unless the core count of the 80 Ti goes way higher than the 80's.
@lucaskp16 that's true, but isn't the 5080 supposed to perform about the same as the 4090? It would be really bad news for gamers if Nvidia chose to treat entry-level as a series of GPUs where upgrades are "needed" every generation, and mid-range every 2nd or 3rd generation, making high-end literally the best value for money, but overpriced, as if it should be dedicated to successful content creators and companies
Not going to happen. Intel is new in the graphics card market, and Nvidia is never going to lower the prices of their GPUs. The 5090, for example: some will cost almost $3k.
I wouldn't be surprised if, by the D or E series cards, Intel is matching XX70-80 tier Nvidia cards; they have some incredible engineers working on the project
They can barely compete with AMD in CPUs, what makes you think they can beat Nvidia if not even AMD can beat Nvidia? The only thing they'll do is give tons of vram for now, lol.
I just built a new PC this June. I have an AMD Ryzen 9 7950X3D 16-core CPU and an MSI GeForce RTX 4080 Super 16GB Slim. There is no reason for me to get a 5090. Always remember the golden rule with computers I made two decades ago: "A computer is only as good for what you want to use it for."
@@carolebaskin138 well, my RX 5600 XT broke last year and I had to pick something fast. I had the budget for a 4060/3060 Ti, so I took the 4060. It was decent, but in the newest games it's unbearable because of VRAM, even at 1080p. Now I'm selling it, and I picked a 4070 Ti Super. Hopefully it will be enough for at least 4 years at 1440p. The 2070 Super is pretty much the same thing as a 4060, so I assume you want to upgrade as well
I own a 3090, bought it at launch. I refused to buy the 4090 at launch, and that was a mistake on my part given the performance for $100 more. I looked at the prices on Newegg today, and the cheapest was $2800
744 mm squared means fewer dies per wafer. Accounting for other material costs, including the new GDDR7, someone broke it down in a video, and it would be in the price range of a little over $1900 if the 4090 is the baseline of how much it costs to make that card. This card is easily going to be over $2k. Sadly, I hate it, and I hate even more that the 5080 is basically a 70-class card because it has half the cores of the 90.
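The die-size math behind that comment can be sanity-checked with the standard dies-per-wafer approximation. A minimal sketch, assuming a 300 mm wafer, a hypothetical $17,000 wafer price (real foundry pricing is not public), the rumored 744 mm² die, and the 4090's roughly 609 mm² AD102 as the baseline:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    # Classic approximation: gross wafer area divided by die area,
    # minus an edge-loss term for partial dies at the wafer rim.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Rumored ~744 mm^2 flagship die vs the 4090's ~609 mm^2 AD102
big = dies_per_wafer(744)
small = dies_per_wafer(609)

WAFER_COST = 17_000  # hypothetical wafer price, purely illustrative

print(big, small)                            # candidate dies per wafer
print(WAFER_COST / big, WAFER_COST / small)  # raw cost per die, ignoring yield
```

With these made-up inputs the bigger die yields noticeably fewer candidates per wafer, and larger dies also lose more to defects, so the real per-die cost gap would be wider than this raw ratio.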
The 5090 probably will cost that much, but the 5080 simply cannot sell past $1000. It's a boosted 5070 Ti... and the 5070 Ti will probably be trading blows with a 4080 Super. Unless they have the gall to try and charge more for the 5070 Ti, which is comparable to a card you can buy for $900 on the used market... there's NO WAY a $1200 5080 would sell. Even the densest Nvidia fanboy can see to either get the 5070 Ti or the 5090.
No, it's an 80-series card. 90-series cards were dual-GPU cards, so it makes sense to use this nomenclature for a card that has roughly twice of everything. The difference is that, unlike in the past, companies can actually produce massive dies somewhat economically, so there's no need for using 2 GPUs or any of the associated downsides like extra latency from the interconnect or any of the issues of Crossfire/SLI.
That additional 50W of power that the 5070 is going to suck down could power an M4 Mac running at full load, CPU, RAM, SSD *and* GPU with 10W still left over to power external devices.
@@rev3489 You're right, I'll actually get some work done. But feel free to pay $2,000 for a sand brick that eats 600W for a few extra FPS. Best wallet-opening device ever!
Well, even the M4 Max is stuck at 40-50 FPS at medium settings in 1080p gaming for the few games that run on it. GPU performance is a whole different thing than CPU performance, and to this day no one has created a low-power competitive GPU, not even Apple, so your comment is invalid.
@@Max-9871 Well thanks for making that up on the spot, Max-9871, but I think we all prefer actual facts. The M4's GPU cores are on the same die as the CPU, along with the neural engine cores. All 3 share the same RAM. So how did Apple manage this in under 40 watts while Intel and nVidia are pushing systems closer to 1,000W?
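For scale on the wattage claims in this thread, the numbers translate into electricity cost like this; the 4 hours/day of full load and $0.15/kWh rate are made-up inputs, and the 600 W and 40 W figures are just the round numbers being argued about above:

```python
def yearly_cost(watts: float, hours_per_day: float = 4,
                price_per_kwh: float = 0.15) -> float:
    # watts -> kWh per year -> dollars per year
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

for label, watts in [("600 W GPU under load", 600), ("~40 W M4 under load", 40)]:
    print(f"{label}: ${yearly_cost(watts):.2f}/year")
```

The gap is real money but modest next to the cards' purchase prices, which is why both sides of this argument can claim to be right.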
It's like Nvidia is relying on RayTracing and DLSS to sell 5070 and below. They're not putting much effort into improving those cards compared to previous generations. Memory clocks go up with the addition of GDDR7, but they're not increasing the amount of memory or the cores (meaningfully). And all this with a 25% increased power draw? For what? A little bit of base clock?
There are already games that consume more memory than they're giving cards like the 5070. And they continue to add throughput bonuses to cards above it. They've not hit a plateau. They're just selling a new generation of software that comes with replacement hardware.
The 4060 was carried by frame generation, DLSS 3, and ray reconstruction. It's as good a card as my 3090, if that says anything, only because of heavy AI. It beat my 3090 at 1080p and 1440p, but at 4K my 3090 slams the 4060 into the ground by a landslide
@venataciamoon2789 that's nuts. Nvidia is getting out of control. AMD or Intel need to step up their high powered cards, the race to the middle is kinda screwing us.
Nah I'm just trying to get a solid 120 fps with max settings in the newest games on my 4k 120hz monitor without having to use upscaling or frame generation. Do you think the 5090 will be capable of that? I bet I would still have to use DLSS.
Not going to happen; rumor is the 5080 is coming out first because of issues in manufacturing. Scalpers will get 50% of them. And lots will be randoms who don't game and just want to upsell them like douchebags. Needs to be illegal, period. Good luck though, it's gonna be pricey
I really don't think scalping will last long. If you look at the 4090 release, scalpers didn't fare well. The 4090 Strix was going for stupid money for a short while. Within a month or two, 4090s were selling for like $100 over MSRP in my local area.
I built my first PC in 2018 with a Ryzen 7 3700X and an RTX 2070, switched to a 5800X3D with a 7900 XT in 2023, and I am honestly happy to see Intel's relative success with the B580 covering entry-level systems. I will probably wait this generation out to strike a deal, since my system is doing just fine at 3440x1440.
I would have loved to see the 9950X3D have the 3D V-Cache on both dies. Based on videos I've seen, it's a pain to jump through the hoops to get the CPU to park the proper cores. I'd pay extra to just have the whole chip stacked up.
I was going to buy a 5090, but then all the pricing talk... So I bought an OLED TV and a new Pellet Smoker Grill instead. My 3090Ti is still strong enough.
That's hard to answer. If you can find a 4070 Ti Super, then I would get that. I doubt the 5070 would be much better, and it'll be a while before you can even buy one, because I have little doubt it will be scalped to oblivion. Just my 2 cents
As someone with a 4070 Ti, I likely won't upgrade anytime soon, but with the trend of smaller upgrades for more stable prices I am gonna be really invested in this generation of AMD and NVIDIA
I just bought an RX 7900 XT today with my Christmas cash (and savings). Should I return it and wait, or will it still be worth the $660 I spent? I've been running an RX 580 for 5 years now and wanted an upgrade because I'm primarily on VRChat (and need the VRAM). Any help would be appreciated.
It's really difficult to predict what will happen after the release of the new generation of graphics cards. Prices for older generations might drop a bit, but those reductions are likely to be minor, and you'll probably have to wait a few months for them. It's also hard to say how well the new cards will perform and whether there will be any issues with them. If you're curious, you could return your current card and wait those few days, but in my humble opinion, if you're satisfied with the performance and the price you paid, stick with your choice. Your time and peace of mind are worth more than what you might gain by waiting for the new release.
Keep it! Only because we have NO CLUE, despite all the 'supposed' leaker info, WHEN AMD is going to release these new cards. Sure, they will show them at CES 2025, but still no date on arrival to the public for purchase. It may be a month, or two, or three. And 'supposedly', the 9070XT, will be on par with the 7900XT.
Reminds me of the races I didn't know I was running when I had a 4 MHz x86 and upgraded to an 8 MHz one. So my mate got a 16 MHz, but I somehow ended up with a 24 MHz, so he got a 33 MHz. Mine's bigger than yours and all that.
The one takeaway I noticed here is that if those numbers are accurate, that puts Intel's B580 in a very competitive spot with the 5070 right off the bat.
I have a 7900 XTX on order for $800, and I'm not feeling any reason from AMD or Nvidia not to just stick with that, considering I'm not a super gamer and don't care as much about ray tracing. I'm kind of thinking the 24 gigs of VRAM is the better future-proofing
XD XD XDXDXDXDXDXDXD The 5070 is gonna be like 15% more expensive for a 4% gain. Coming from the company that wants us to believe Moore's Law is dead, while other manufacturers don't seem to have reached the same conclusion. At this point Nvidia isn't just betting on their fans being idiots; they've already accepted it to be the case.
@@trentongardner2106 lol. Probably. Best case scenario whatever card they offer will be a repeat of the 7900XTX vs 4090. So a tier down from Nvidia's best card
This is my first GPU cycle from the outside looking in, and if you want the top-of-the-line cards, the strategy should be to buy the 5090 when it first comes out, use it for 22/23 months, and then sell it right before the new one comes out. Lose 300-400 bucks and you always have the best card. lol
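That rolling-upgrade strategy boils down to a cost-per-month figure; a tiny sketch with guessed numbers (a $2000 purchase and a $1650 resale are assumptions, not quotes):

```python
def monthly_cost(purchase: float, resale: float, months: int) -> float:
    # net depreciation spread over the ownership window
    return (purchase - resale) / months

# Guessed numbers: $2000 flagship at launch, sold for $1650 after 23 months
print(round(monthly_cost(2000, 1650, 23), 2))
```

The catch is that the resale value right before a new launch is the strategy's biggest unknown; the 30-series crash after the 40-series announcement shows how fast it can move.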
So much conflicting information. The last leak said the 9070 XT was close to the 7900 GRE in performance, though to be fair that was Time Spy and games always perform differently, but this huge a gap would be surprising; we'll see. Now this card would possibly make at least a little sense at $650 (still hoping for closer to $550), but if this is true it could mean there's a $500 card that's close to the 7900 GRE and 4070 Ti in performance.
7:24 Classic one... just throw more power at it so it will be more powerful... it's just a shame to ignore efficiency that hard, or at least that's how it feels. 9:28 Well, neither of them xD My hope is for the next AMD GPU generation to maybe replace my 6600 xD
Looking for an A100 40GB+ or similar for local LLMs, but I'll never buy another cutting-edge GPU for gaming. What's the point? I have a 10GB RTX 3080 and I haven't had any issues with any games to date.
So either Nvidia charges $900 for the 5070 Ti (where it won't sell well against just getting a used 4080/Super), or they charge $1200 for the 5080, which would then be DOA because it's just a boosted 5070 Ti. Save $300 and sacrifice 15%, which you can cut in half with an OC.
These insane component prices and limited stock remind me of the pandemic days... it's unfortunate that having a "high-end" PC has now grown out of reach for most.
I'd say the 5070 vs 4070 hardware comparison is a bit misleading. The 4070 was already an alright card, despite the price. The undisclosed clock speed increase, paired with the much faster memory for texture and physics calculations, should have a somewhat noticeable effect on FPS
I am debating if I should buy the 5090 on release. Atm I have the TUF OC 4090; the only reason for the 5090 is to play Ark Ascended better in 4K, as it is my most favourite game of all time.
BIIIIIIG IF: IF the 9070 XT is close to the 4080, day-1 purchase. If it's below the 7900 XT, I might explore Nvidia. 4080 performance would be incredible if it's priced well.
@@trentongardner2106 Yeah, I've seen those, but here we are with another video claiming near-4080 performance. I expect a GRE+, but 4080-ish performance would be a day-1 buy if it's in the $600 range.
I'm going to reserve judgement on this RX 9070 XT until it's released and some independent reviews and gaming benchmarks are out, but if this is true and it's priced right at $500-$600, then I think it will do well.
Considering the 5080 is looking like it will be around $1500 USD, I'm guessing the 5070 Ti will be around $1100... the 5070 $1000 and the 5060 Ti around the $800 mark. Just wondering if it's going to be a significant enough jump to justify these prices.
Sorry for some of the issues with the audio. It somehow got switched to the wrong framerate, and there were some errors I didn't notice when changing it back.
I am more excited for the price drops on the cards below it. Unless the 50-series prices are in the range of 800 euros.
Ask them why their memory controllers on the IO die are so bad and why they don't support faster RAM frequencies, which seem to make quite a difference, especially in gaming and AI. Having chiplets is no excuse: Intel has chiplet-design CPUs which support higher-frequency RAM. And ask if they plan to include a better memory controller in the next generation or refresh.
Edit: also ask them why they're not using two X3D chips on the 9950X (and any X3D CPU with more than one CCD). "Extra cache not making a difference" is a BS excuse. Tell them, because that's what they will respond with, that it doesn't have to be extra cache: they could just reduce the size of each X3D chip by about half and add one to each CCD (so again e.g. 128 MB total, but instead of one CCD having 96 MB and the other 32, make it so that both have 64 MB, or god forbid 76 MB each) so that there is no governor issue when gaming over which thread goes to which CCD...
From your point of view, how likely is it that RX 9070 XT models will come out that do not exceed 305 mm in length and no more than two slots in width?
PLEASE PLEASE PLEASE ASK AMD WHY THEIR PRODUCT NAMING IS SO CONVOLUTED, CHANGES SO MUCH, AND IS NEARLY IDENTICAL TO THEIR COMPETITORS' PRODUCT NAMES?
(or something along those lines that doesn't get you kicked out of CES lol.... but you HAVE TO bring up them changing their names SO MUCH 🙏🙏 PLEASE!!)
Yeah... my audio is in Italian, and I can't change it, so I have to read the closed captions instead...
DONT BUY ANY RTX CARDS UNTIL THEY FIX THE PRICING!!!!!!!!!
I'm buying one day one
@@buzzkill8214 me too.....5090 baby!!!!!!!
nice try I’m getting one as well
Sorry, I gotta have it 😂
Haha get ur money up. The people who can afford it don't mind bro.
if that 9070xt is priced correctly then nvidia is in trouble.
I bet it will be more expensive than the GRE: same raster, more ray-tracing performance, otherwise they wouldn't discontinue the GRE. So like $600-$700 at least, so it will be bad at best lol.
@@cin2110 It's said to be as fast as the 4080. The GRE did not compete with the 4080. It'll be at around 7900 XT performance. $600-700 sounds like the right area. Even then, the rumor is that it'll be at a good price point. They are aiming for more market share, and in theory the only way to do that is by offering higher performance for lower prices. If the 9070 XT drops at around $500-600, then Nvidia is in trouble. That's if they hold true to what they have said. Can't trust big corps; only time will tell on the price point.
@@moldetaco2281 the Timespy score was leaked and it was barely faster than a 7900 GRE
@@cin2110 absolutely
From your lips to God's ears.
That Intel B580 is starting to look better and better right now for $250. Hope AMD or Nvidia has a better option for the price once the new gen hits.
Def will
and the possible B770 will darn sure be a better choice for less money
THEY DO NOT EXIST AT $250. Stop praising a fake price, and channel that energy into being pissed it no longer exists.
The performance of the B580 is pretty bad in most games. Thought it was supposed to be a 4060 Ti competitor, but it's nowhere close. A used 3090 is your best bang for the buck atm
@@Giljrg the B580 competes with the 7600 XT
“We want to distance ourselves from nvidia “
*uses the same naming scheme as nvidia
Lol okay
That leak sounds fake as fuck though. Seriously? 9070? Who's begging for their leak to get attention here?
It's to help the consumer find their way by taking the known names of the market leader, and honestly it makes things easier for me, like with processors
KOPITE7KIMI
It makes it easy to compare and find out which GPUs from the two companies are in the same tier. It's good for the consumer IMO. This 9070 XT corresponds to the 5070 or 5070 Ti.
they can say what they want but if they feel it can make them more money they will do it
3:10 I think you got this a little wrong. AMD did not have a competitor for the 4090 and the 7900 XTX was a competitor for the 4080, if even that.
They just didn't match their 7X00 numbers to Nvidia's 40X0 numbers, and honestly, why should they?
Imo you have to look at performance at a similar price range and nothing else.
The 7900 XTX is between a 4090 and 4080 in performance. The ray tracing probably isn't as good as the 4080's, but it's only 20% slower in raw rasterization than a 4090.
And a gen later they still don't have a real competitor for the 4080. That is pathetic. They are literally almost 2 generations behind, and I only see that gap widening over the next few years. Intel will likely catch up to AMD next generation and then surpass them. I see AMD tanking like 3dfx did
@TheJackelantern ragebait used to be believable lmao
7900 xtx gets real close to the 4090 in raster with a good overclock
I mean... $1000 for the 7900 XTX, and double that amount for the 4090. The Nvidia pricing is atrocious.
@@TheJackelantern you sound like one of those idiots that’s actually gaming on like a 4060 or some shit
Yk what else is massive??
your mom
Don’t
MY MOM
Ahem, JUNGLIST MASSIVE
THIS LOW TAPER FADE
"We might get regression at the XX60 series next gen"
We already did; the RTX 3060 matched and sometimes beat the RTX 4060.
5060 will match the 4060ti.
AMD figured that an "8800xt" would be compared to a 5080 and suffer in comparative performance. AMD's "Navi 48" card may be such a better value in terms of memory & price/performance than a 5070 that AMD changed the name at the last minute to "9070" to invite that comparison instead.
A gamble that will totally work now that we have confirmation of a garbage 12GB 5070.
People are saying that if the 9070 XT is priced well, then Nvidia is in "trouble".
Look... Even if it sells well, that's still far from "trouble" for the likes of Nvidia. It's teetering on the edge of delusion to suggest Nvidia is anywhere near "trouble" in 2025/26
True. As Warren Buffett said, the market can stay irrational longer than you can stay solvent. Nvidia is the new Apple. A cult. The fact that those gimped, three-generation-old features on new iPhones are still selling says it all.
I would assume they are talking about in the gaming GPU market. They would probably lose some market share but they do not care about their gaming revenue at all because it’s tiny compared to their AI revenue. That’s why they are always priced ridiculously high. They know they are the Supreme of PC gaming and it doesn’t even matter if people aren’t willing to pay that premium. Every company in the world is getting on their knees to buy their AI cards and software.
Nah, even if the 9070 XT does phenomenally well as far as GPU sales go, it could capture what, maybe 15% market share at the absolute most? That would also represent a LOT more people buying in this price range than normal. So yeah, that's like a best-case scenario. More realistic would be between 5 and 8% again... if it does really well. But hey, it would be a move in the right direction for a change.
Obvious, man, the Nvidia market isn't gamers XDDD, that's 20% or less hahahaha
It's about which card Internet Cafes choose...
See, I don't get how the new GPU comes with fewer cores than my 6900 XT.
If I'm not mistaken, I have 5120 cores: Toxic Sapphire 6900 XT, liquid cooled. While I'm at it, even the 7900 XTX has 6144 or something close to that, so why go from 6K to 4K? I don't get it
Less cores, higher speeds is my guess.
Same or better performance with less cores. Better efficiency, optimizations, etc. More cores does not equal better performance.
@kirilbarbov6949 no but more cores mean more processing
cores is probably not what you should be focusing on.
You were slightly mistaken at 1:04 stating 148MB but the screenshot lists 128MB of L3 Cache. Just in case someone was only listening and not watching.
You just saved my life bro, thanks
what if that someone is only listening, not watching, not reading ?🤣
Yea i dont think anyone cares lol
@@adisyoyeenterprisers care very much
The price is also gonna be MASSIVE!! 😆
And you know What Else is massive?
@razzor7861 LOOOOOOOOW TAPER FADE
And?
Please ask AMD, "When are you going to fire your Marketing department?"
Why do they need to fire anyone when they expected Nvidia to pull this BS? Anyone who buys a 5070 for $1000 has no right to an opinion lol. You damn well know the 5070 will be over $1000 and the 5060 will be around $800. AMD will sell just fine now. I saved up $2800, and now I know I don't have enough for a 5090 when that will be 4 grand.
@@jessiestarr4600 My bet is they price the 5070 at $900 but when the cards actually show up, there won't be one for sale for less than $1000 and the average price will be higher than that. The 5080 will creep up to 4090 prices.
@@jessiestarr4600 Because their marketing department has been the biggest villain since acquiring ATi. It seems like AMD fired their marketing team and just replaced them with ATi's team. They cannot keep consistent naming of things, which confuses the normies or makes them think something is new. It is hard for the regular smegular person to distinguish and decipher the marketing speak. I say this as a GNU/Linux user and a long-time AMD/ATi shill. I know the difference, but the regular normie does not. And that is why they overspend on Nvidia instead of buying AMD. The NV naming scheme has been around a long time and thus does not look like something "new". There is a reason why Nvidia still has a large market share even though they charge highway-robbery prices, even when AMD has competitive products.
Of course Ngreedie is trying to upsell their higher-end models. Why pay for a GPU that has a marginal performance increase when you can spend 5x the money and get 10x the performance? So glad I got a 7900 XT.
I mean that would be amazing value if 5x the cost got 10x performance. With higher tiers it's usually 100 percent price increase for 50 percent more performance.
AMD cards are trash though; I would use a 4070 over a 7900 XT lol. FSR makes games look worse than console games. And yes, the 7900 XT requires upscaling even at 1440p for current-gen UE5 games. It's garbage.
@@Dempig Look guys, I found an Ngreedia shill! Enjoy paying $1k for a GPU when I can do the same thing for $600
Sooo, that was wrong as fuck.... How surprising that the coping babies were exaggerating...
5070 - $549
5070 ti - $749
5080 - $999
5090 - $1999
@ have fun with your 16GB cards 😂😂😂😂😂
I've never been an AMD GPU fan; I witnessed a lot of bugs from friends, bad drivers, and huge frame drops over the years.
I've been glad to have my Nvidia GPUs ever since. I love RTX and DLSS.
But nowadays? Pricing is hugely odd, VRAM is lacking at WQHD, and performance-wise you only get good numbers with DLSS and frame gen in some cases.
Looking forward to the new RX 9xxx; maybe it's the new meta after this point.
What do y'all think Nvidia is gonna price their flagship card at? $2.5k? $3.5k??? lmao
$1.8k for the Founders Edition, $2k and up to $3k for custom 5090s
@@YueZhuang-pt6ff So similar to 4090 at launch price.
They could make it $7,000 and I'm still buying it
It will only cost one. One Bitcoin.
@@djofulll buy a fkn server rack at that point dude. Just saying stuff like that is the reason the rest of us have to suffer. Besides ur playing CS, you can do that with a 4070
From all the leaks so far, the 9070 XT looks highly disappointing. It's no good comparing it with current-gen cards; it's supposed to be the next-gen card. It's looking barely better than a 7800 XT (which is what I already have) so far. Nvidia is going to absolutely dominate this generation in terms of performance. And that's tragic for everyone's wallets!
The leaks have put the 9070xt no lower than the 7900gre with the ray-tracing performance of the 4070ti and a max price of $600… That’s the worst case scenario, not even including FSR4
@@akathenugget That's not the worst-case scenario, as the leaked price range went all the way up to $650 USD.
If the 9070XT is under $500 it’s a buy even it performs similar to the 7900GRE.
If it’s over $500 then it's DOA and a skip
@@haz1101 if you look at purely rasterization performance at native res, then yeah $500 or lower, but you’re forgetting that the RT performance will be at least 4070ti level, as well as better ai performance, which will make FSR4 probably perform just as well as DLSS on an Nvidia card. Overall I think it will be a decent card on launch, and then drop in price
I think whoever says they'll buy an overpriced GPU is a bot created by these companies. Don't buy overpriced products.
I’m buying a 5090. I’m from Ohio, not a bot. I’m 20 years old, turning 21 in May.
I'm buying the 5080 day one
"Overpriced" is a nonexistent concept. I have tons of money. For me, they're cheap.
What about KOPITE7KIMI?
@@tnh.tenshi With current AI, giving that response is not hard, bot.
If you keep telling consumers not to buy Nvidia, to tell them these prices are not acceptable, then you need to walk the walk and not hype Nvidia....
more like boycott ...
It's career suicide to ignore NVIDIA this January.
GPU companies are playing Limbo, trying to see how low they can get by with
5080 - leaked price is above $1,500 MSRP. Well, it doesn't look good now.
The 7900 XT has 5,376 shaders. So unless there is a performance boost via some other kind of change, it'll be slower than the 7900 XT.
25,300 Time Spy graphics in OC mode, less than a 7900 XT. 23,000 in normal mode, around a 7900 GRE or an overclocked 7800 XT. A really weak high-end AMD card.
I would assume at least SOME level of performance-per-clock increase. There are multiple metrics to improve performance with, so if I'm being realistic I'd guess somewhere between the 7900 XT and XTX for like $550-ish.
@@tomaspavka2014 Timespy doesn't really mean that much. games scale very differently.
With that said, I do think the 7900 xt will be faster.
If it comes in at a cheap price, then it's a winner.
@@varcaic9170 Based on these leaked benchmarks, it's not looking to be anywhere near that.
We'll have to wait and see! gamer nexus and hardware unboxed will surely let us know :)
It’s like CPUs, you know: having the same core count doesn’t mean the same performance, as each individual core can be faster.
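The point above can be sketched with a toy throughput model. Every number below is made up purely for illustration; the only claim is that equal core counts don't imply equal performance when clocks and IPC differ:

```python
# Toy first-order model: throughput ~ cores * clock (GHz) * IPC.
# All figures are hypothetical placeholders, not real chip specs.
def throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    """Rough relative throughput in billions of instructions per second."""
    return cores * clock_ghz * ipc

chip_a = throughput(cores=8, clock_ghz=4.0, ipc=4.0)  # 128.0
chip_b = throughput(cores=8, clock_ghz=5.0, ipc=5.0)  # 200.0

# Same core count, yet chip B has ~56% more theoretical throughput.
print(chip_a, chip_b, chip_b / chip_a)
```

Real-world scaling is messier (memory bandwidth, cache, workload mix), but the core-count-only comparison fails for exactly this reason.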
6:13 Of course, this comparison totally ignores the 4070 Ti Super and 4070 Super cards we are buying now when making those claimed improvements.
So, people might as well forego the newest card and just go for the 7900XTX.
Disappointing.
7900xtx is trash for Ray Tracing and that terrible FSR
Nah fam, that shit will make my room so fucking hot. Plus i do want a lil RT yea
@@ilbro7874 what is this thing with the 7900xtx being a room heater? I have one and it doesn’t run hot at all, even in the most demanding games it doesn’t go above 70°c
@@TTx04xCOBRA **for those of us who don't care about RT.
@@akathenugget I agree, those are very odd comments. I've been running one for a while. Cool temps; I've not even had to play with the fan settings. The 7900 XTX is the best-value card on the market at the £750 I got it for, and that makes it very difficult to upgrade. £1,500+ for a 4090 or 5080 for not much more performance in raster, or £2,500 for a 5090.
I'm willing to upgrade but only if it makes sense Vs my xtx. Which if the rumours are to be believed it won't be.
12gb for a 70 series, TI or not, is insulting.
Expect the RTX 5080 Ti to get 24GB of VRAM, as that would make sense, but the price might be higher than the RTX 4090, with better performance because of GDDR7
There is still the thing with the core count. Having faster memory won't really matter for games unless the core count of the 80 Ti goes way higher than the 80's.
@lucaskp16 That's true, but isn't the 5080 supposed to perform about the same as the 4090? It would be really bad news for gamers if Nvidia chose to leave entry level as the series of GPUs where upgrades are "needed" every generation, mid-range as every 2nd or 3rd generation, making high-end literally the best value for money, but overpriced as if it should be dedicated to successful content creators and companies
@@lucaskp16 The overall architecture can make a difference.
It’s actually getting 32GB I believe
Say it with me, kids!! We don't need top-of-the-stack cards to play video games!!!!
I saved 2800 dollars in my gpu budget to still not have enough money for the 5090 😂 the fuck is wrong with pricing 😂
Bro if you are just playing games you won't need a 5090
Your first 3 seconds are just banging, how do you do that intro sequence?!
Let's hope Intel comes in and gives Nvidia some much needed competition to hopefully drive down prices again.
Not going to happen. Intel is new in the graphics card market, and Nvidia is not going to lower the prices of their GPUs ever. The 5090, for example, some will cost almost $3k.
I wouldn't be surprised if by D or E series cards Intel are matching XX70-80 tier Nvidia cards, they have some incredible engineers working on the project
You have to pray to the KOPITE7KIMI. He can fix this! KOPITE7KIMI is a hero!
They can barely compete with AMD in CPUs, what makes you think they can beat Nvidia if not even AMD can beat Nvidia?
The only thing they'll do is give tons of vram for now, lol.
U wish 🤣
I just built a new PC this June. I have an AMD Ryzen 9 7950X3D 16-core CPU and an MSI GeForce RTX 4080 Super 16GB Slim. There is no reason for me to get a 5090. Always remember the golden rule with computers I made two decades ago: "A computer is only as good for what you want to use it for."
It’s also $2500 plus. Total insanity. I’m done with Nvidia.
I'm still hoping for a 24GB 5080. I at least hope a 5070 ti super or a 5080 ti will come with that amount of VRAM.
Dont buy the new cards guys! Let them take a hit so we can all get a better market
Oh they will buy, tons of people are waiting on 2060/2070 etc to upgrade. So they wont wait for 6xxx
@@jordaniansniper934 I'm in the 2070 super boat unfortunately.
@@carolebaskin138 Well, my RX 5600 XT broke last year and I had to pick something fast. I had the budget for a 4060/3060 Ti, so I took the 4060. It was decent, but in the newest games it's unbearable cuz of VRAM, even at 1080p. Now I'm selling it and I picked a 4070 Ti Super. Hopefully it will be enough for at least 4 years at 1440p. The 2070 Super is pretty much the same thing as a 4060, so I assume you want to upgrade as well.
@jordaniansniper934 yeah I do
I own a 3090, bought it at launch. I refused to buy the 4090 at launch, and that was a mistake on my part given the performance for $100 more. I looked at the prices on Newegg today, and the cheapest was $2,800.
why was it a mistake? what games are you failing to get the performance you need from the 3090? 100 dollars more than the price of a 3090 is obscene.
744mm² means fewer dies per wafer. Accounting for other material costs, including the new GDDR7, someone broke it down in a video and it would be in the price range of a little over $1,900 if the 4090 is the baseline for how much it costs to make that card. This card is easily going to be over $2k. Sadly, I hate it, and hate even more that the 5080 is basically a 70-class card because it has half the cores of the 90.
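For a rough sense of why a 744mm² die is so costly, here's a back-of-the-envelope dies-per-wafer estimate using the standard approximation formula. The 744mm² figure is the rumor from this comment, not a confirmed spec; 609mm² for AD102 (the 4090's die) is used for comparison; yield losses and scribe lines are ignored:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic dies-per-wafer approximation: wafer area over die area,
    minus a correction for partial dies lost at the wafer edge."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(744))  # ~70 candidate dies for the rumored 744mm² chip
print(dies_per_wafer(609))  # ~89 for a 609mm² die, so roughly 20% fewer per wafer
```

Since a wafer costs a fixed amount, ~20% fewer candidate dies (before yield, which also drops as dies get bigger) feeds directly into per-chip cost.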
Anything below the 4090/5090 is basically one tier lower than what they claim: the 4070 is actually a 4060, the 4060 is a 4050 renamed, etc.
@@cin2110 Did I miss the part where they told people that those cards are supposed to have hardware that makes you feel good?
The 5090 probably will cost that much but the 5080 simply cannot sell past $1000.
It's a boosted 5070 Ti... and the 5070 Ti will probably be trading blows with a 4080 Super.
Unless they have the gall to try and charge more for the 5070 Ti, which is comparable with a card you can buy for $900 on the used market... there's NO WAY a $1,200 5080 would sell. Even the most dense Nvidia fanboy can see to either get the 5070 Ti or the 5090.
No, it's an 80 series card. The 90 series cards used to be dual-GPU cards, so it makes sense to use this nomenclature for a card that roughly has twice of everything.
The difference is that unlike in the past, companies can actually produce massive dies somewhat economically so there's no need for using 2 GPUs and any of the associated downsides like extra latency from the interconnect or any of the issues of Crossfire/SLI.
Why are the 50 series cards not compared to Nvidia's latest Super versions but the release versions?
That additional 50W of power that the 5070 is going to suck down could power an M4 Mac running at full load, CPU, RAM, SSD *and* GPU with 10W still left over to power external devices.
Mac and nvidia are for idiots
and what u gonna do with it? i doubt the same
@@rev3489 You're right, I'll actually get some work done.
But feel free to pay $2,000 for a sand brick that eats 600W for a few extra FPS. Best wallet-opening device ever!
Well, even the M4 Max is stuck at 40-50 FPS at medium settings in 1080p gaming for the few games that run on it. GPU performance is a whole different thing from CPU performance, and to this day no one has created a low-power competitive GPU, not even Apple, so your comment is invalid.
@@Max-9871 Well thanks for making that up on the spot, Max-9871, but I think we all prefer actual facts.
The M4's GPU cores are on the same die as the CPU, along with the neural engine cores. All 3 share the same RAM.
So how did Apple manage this in under 40 watts while Intel and nVidia are pushing systems closer to 1,000W?
It's like Nvidia is relying on RayTracing and DLSS to sell 5070 and below. They're not putting much effort into improving those cards compared to previous generations. Memory clocks go up with the addition of GDDR7, but they're not increasing the amount of memory or the cores (meaningfully). And all this with a 25% increased power draw? For what? A little bit of base clock?
Rasterization is hitting a limit. The only way to go is all the other technologies.
There are already games that consume more memory than they're giving cards like the 5070. And they continue to add throughput bonuses to cards above it. They've not hit a plateau. They're just selling a new generation of software that comes with replacement hardware.
The 4060 was carried by frame generation, DLSS 3, and ray reconstruction. It's as good a card as my 3090, if that says anything, only because of heavy AI. It beat my 3090 at 1080p and 1440p, but at 4K my 3090 slams the 4060 into the ground by a landslide.
Can’t wait for the 9070XT to be an idiotic $599 and be DOA
Oh, 9070XT will be DOA if it's more than $450.
@@muaddib7356 and 90% sure it will be more than $450
Why do you think only one chiplet has access? Both chiplets could have access to ONE 3D V-Cache, which wouldn't need to be double the size.
The 9070 is gonna suck arse. AMD cares less about their discrete cards now than they ever have. They've still got stock of their last generation to sell.
Did you mean to say 128MB L3 Cache?
The X3D CCD has 96MB of L3 in total: 64MB of V-Cache on top of the regular cache (32MB per CCD)
L3 same as RDNA3
So, are we all hoping AMD might surprise us with a 9090xt??
Nope, no high end cards this gen
Maybe they can get their high end udna out early
Nope, it's maxed out at 4,096 cores per chip, and more than one chip doesn't get the right ratio of performance to watts
nope, we are hoping for the 9070xt to be not too expensive
*9900XT.
Have there been any pricing leaks on the 50 series cards?
Vex found an Australian price, works out at $1744 or £1386 for the 5080.
@venataciamoon2789 that's nuts. Nvidia is getting out of control. AMD or Intel need to step up their high powered cards, the race to the middle is kinda screwing us.
All you kids can't wait to spend $2k so you get 400 fps on your 144Hz 1440p monitors.. 😂
VR gaming is much more demanding, especially with high res VR headsets. 5090 can easily be tapped out with this.
Nah I'm just trying to get a solid 120 fps with max settings in the newest games on my 4k 120hz monitor without having to use upscaling or frame generation. Do you think the 5090 will be capable of that? I bet I would still have to use DLSS.
360hz QD-OLED*
A 5070 for 1440p 120fps path tracing is fine by me
Current triple A games are so unoptimised on PC these days. Even the best hardware can struggle.
More than $1,000 for a GPU is insane. Sorry, I won't spend my hard-earned money on a GPU anymore
Massive you say 🤨
Big typo on that article, a "9550X3D" is not 16 cores lol.
Hoping I will actually be able to buy a 5090 on launch without all the scalpers ruining it.
Just order it from a store; no need for the scalpers to get money.
Not going to happen; the rumor is the 5080 is coming out first because of issues in manufacturing. Scalpers will get 50% of them, and lots will be randoms who don't game and just want to upsell them like douchebags. It needs to be illegal, period. Good luck though, it's gonna be pricey.
I really don't think scalping will last long. If you look at the 4090 release, scalpers didn't fare well. The 4090 Strix was going for stupid money for a short while, but within a month or two 4090s were selling for like $100 over MSRP in my local area.
I built my first PC in 2018 w/ a Ryzen 3700X and an RTX 2070, switched to a 5800X3D w/ a 7900 XT in 2023, and I'm honestly happy to see Intel's relative success w/ the B580 covering entry-level systems. I will probably wait this generation out to strike a deal, since my system is doing just fine at 3440×1440.
So how high is AMD gonna go? Will there be an RX 9080 XT or not?
260w TBP? It's basically the same as the 7900GRE...
Come on, AMD, bring down this damn consumption
I would have loved to see the 9950x3D have the 3D vcache on both dies. Based on videos I've seen it's a pain to jump through the hoops to get the CPU to park the proper cores. I'd pay extra to just have the whole chip stacked up.
I was going to buy a 5090, but then all the pricing talk... So I bought an OLED TV and a new Pellet Smoker Grill instead.
My 3090Ti is still strong enough.
It is gonna be massive, but my budget might be low so I hope the price will fade
I want to build my first pc and i need help, should i build it right now and buy an rtx 4070s asus dual or should i wait for the next gen?
That's hard to answer. If you can find a 4070ti super, then I would get that. I doubt the 5070 would be much better and it'll be a while before you can even buy because I have little doubt it will be scalped to oblivion. Just my 2 cents
9070 XT has the same specs as a 6800 XT basically, it will rival a 5060 Ti at 399-499 most likely
so in other words you are saying the 9070XT is the same as a 7700XT 🤣
As someone with a 4070 Ti, I likely won't upgrade anytime soon, but with the trend of smaller upgrades for more stable prices I'm gonna be really invested in this generation of AMD and Nvidia
I just bought a RX 7900XT today with my Christmas cash(and savings). Should I return it and wait, or will it still be worth the $660 I spent? I've been running a RX 580 for 5 years now and wanted an upgrade because I'm primarily on VRChat (and need the vram). Any help would be appreciated.
It's really difficult to predict what will happen after the release of the new generation of graphics cards. Prices for older generations might drop a bit, but those reductions are likely to be minor, and you'll probably have to wait a few months for them. It's also hard to say how well the new cards will perform and whether there will be any issues with them.
If you're curious, you could return your current card and wait those few days, but in my humble opinion, if you're satisfied with the performance and the price you paid, stick with your choice. Your time and peace of mind are worth more than what you might gain by waiting for the new release.
Keep it if you need the VRAM; upgrading only really matters if you want better ray tracing.
Keep it! Only because we have NO CLUE, despite all the 'supposed' leaker info, WHEN AMD is going to release these new cards. Sure, they will show them at CES 2025, but still no date on arrival to the public for purchase. It may be a month, or two, or three. And 'supposedly', the 9070XT, will be on par with the 7900XT.
You think that’ll make Nvidia lower their prices a bit? Really tryna get a 5080 with my new 9800x3d
People need to stop buying nvidia cards so they can fix their pricing.
They have full market control.
So they are overpricing everything.
V-cache on 9950X3D would guarantee no bottleneck due to the IO die. That would actually have been quite helpful!
Waiting for pricing before deciding whether to jump on a 7800xt or the 9070xt
Anyone else notice the 1.38v on the 9950x3d core voltage?
Thank you for listing sources. Solid news work.
4:45 if it’s true, I’ll get that over XTX. might as well have better Ray tracing performance. But I’m not confident about this performance
Will there be 9600X3D??
love your content keep it up!
Both chiplets having 3d vcache would prevent the scheduling issues
It's funny to see some commenters who own a 4080 want to upgrade to 5080
Reminds me of the races I didn't know I was running when I had a 4MHz x86 and upgraded to an 8MHz. Then my mate got a 16MHz, but I somehow ended up with a 24MHz, so he got a 33MHz. Mine's bigger than yours and all that.
great news for the second hand market
The one takeaway I noticed here is that if those numbers are accurate, that puts Intel's B580 in a very competitive spot with the 5070 right off the bat.
I have a 7900 XTX on order for $800, and I'm not feeling any reason from AMD or Nvidia not to just stick with that. Considering I'm not a super gamer and don't care as much about ray tracing, I'm kind of thinking the 24GB of VRAM is the better future-proofing.
XD XD XDXDXDXDXDXDXD The 5070 is gonna be like 15% more expensive for a 4% gain, coming from the company that wants us to believe Moore's law is dead while other manufacturers don't seem to reach the same conclusion. At this point Nvidia isn't just betting on their fans being idiots; they've already accepted it to be the case.
I hope the 9070 XT has a driver option to set the base clock so it runs at 260W. I'd rather choose lower watts than higher performance.
Can you please ask amd if they are going to compete with higher end gpus the generation after 9070s?
Yes, they're making UDNA out of datacenter chiplets
They've already said they will. But that's 2 years off
@@Silver-h4m2 years till they can compete with the 4090 lol. Nvidia will be selling portals to the moon by then man.
@@trentongardner2106 lol. Probably. Best case scenario whatever card they offer will be a repeat of the 7900XTX vs 4090. So a tier down from Nvidia's best card
7:47 not to foreshadow the 4060 getting less cores than 3060
fewer cores*
This is my first GPU cycle from the outside looking in, and if you want the top-of-the-line cards, the strategy should be to buy the 5090 when it first comes out, use it for 22/23 months, and then sell it right before the new one comes out. Lose 300-400 bucks and you always have the best card. lol
Wonder what the RX 9090 XT would be like?
So much conflicting information. The last leak said the 9070xt was close to the 7900gre in performance though to be fair that was time spy and games always perform differently, but this huge of a gap would be surprising, we'll see. Now this card would possibly make at least a little sense at $650, still hoping for closer to $550, but if this is true it could mean there's a $500 card that's close to the 7900gre and 4070ti in performance.
If the 9070 turns out to be so close to Nvidia's counterpart, I'll be even more pissed that they don't release the high-end cards as well....
really makes the hanging question that much heavier. how much will they cost?
7:24 Classic one... just throw more power at it so it will be more powerful... It's just a shame to ignore efficiency that hard, or at least that's how it feels
9:28 well neither of them xD my hope is in the next amd gpu generation to maybe replace my 6600 xD
1.368V is a bummer for me tho, guess those new CPU will run with this in mind (higher voltage)
Looking for an A100 40GB+ or similar for local LLMs, but I'll never buy another cutting-edge GPU for gaming. What's the point? I have a 10GB RTX 3080 and I've not had any issues with any games to date.
So either Nvidia will charge $900 for the 5070 Ti (where it won't sell well against just getting a used 4080/Super),
or they charge $1,200 for the 5080, which would then be DOA because it's just a boosted 5070 Ti. Save $300 and sacrifice 15%, which you can cut in half with an OC.
These insane component prices and limited stock remind me of the pandemic days.... unfortunate what it means to have a "High end" PC has now grown out of reach for most.
Thanks for the info, Gamer Meld! 👍🏾🔥
I usually upgrade every two generations, but this time it would be a downgrade from my 6950 XT. Maybe it's about time to switch to team green again?
I'd say the 5070 vs 4070 hardware comparison is a bit misleading.
The 4070 was already an alright card, despite the price.
The undisclosed clock speed increase, paired with the much faster memory for texture and physics calculations, should have a somewhat noticeable effect on FPS
I have a full tower case; size doesn't scare me at all.
So, will the RX 7900 XTX be better in performance than the 9070 (except for ray tracing)?
Why exactly do you need so much power for such insane prices?
I would like you to ask AMD why their production schedule for the 9800X3D is so out of kilter with demand. Have they really dropped the ball here?
Will there be more memory?
I am debating if I should buy the 5090 on release. Atm I have the TUF OC 4090; the only reason for the 5090 is to play Ark Ascended better at 4K, as it is my most favourite game of all time.
BIIIIIIG IF: IF the 9070 XT is close to the 4080, day 1 purchase. If it's below the 7900 XT, I might explore Nvidia. 4080 performance would be incredible if it's priced well.
All the other sources I've seen are pointing at sub-7900 XT performance.
There are raster leaks for it. It's a 7900 GRE with better ray tracing. That's why they pulled the 7900 GRE from production.
@@mrg2039 I know, and this channel peddles hype, but the dream is some secret switcharoo where its actually competitive with a 4080.
@@trentongardner2106 Yea I've seen those but here we are with another video claiming near 4080 performance. I expect a GRE+ but a 4080-ish performance would be a day-1 buy if its in the $600 range.
@balthorpayne I can tell you why that is. They're getting paid to say that.
Nvidia be like: "We don' trade our cards in cash no mo, we trade em in Organs"
I’m going to reserve judgement on this RX 9070 XT until it's released and some independent reviews and gaming benchmarks are out, but if this is true and it's priced right at $500-$600, then I think it will do well.
Considering the 5080 is looking like it will be around $1,500 USD, I'm guessing the 5070 Ti will be around $1,100, the 5070 $1,000, and the 5060 Ti around the $800 mark. Just wondering if it's going to be a significant enough jump to justify these prices.