Hi Vex, the 2 slots are not there for anything to be soldered onto them. Memory chips can work in 32- or 16-bit modes... and so on.
320 (bit) / 20 (GB RAM) = 16
256 / 8 = 32
256 / 16 = 16
192 / 6 = 32
192 / 12 = 16
...and so on. That gives you a picture of what is possible. You can cheat a bit, but then you get a situation like the GTX 970, where some of the VRAM works much slower than it should. (192 / 16 = 12, so this is not viable.) Hope that helps!
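A rough sketch of the divisions above (the function name and the "ratio must be 32 or 16" rule of thumb are my restatement of the comment, not anything official):

```python
# Sketch of the bus-width arithmetic: bus width (bits) divided by
# capacity (GB). Rule of thumb from the comment: the ratio has to come
# out to 32 or 16 (one or two GB behind each 32-bit channel); anything
# else needs GTX 970-style tricks.
def bits_per_gb(bus_width_bits, capacity_gb):
    ratio = bus_width_bits / capacity_gb
    return ratio if ratio in (32, 16) else None

for bus, cap in [(320, 20), (256, 8), (256, 16), (192, 6), (192, 12), (192, 16)]:
    print(bus, cap, bits_per_gb(bus, cap))  # the last combo prints None
```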
Amazing video like always. Please make a video about Nvidia's gen-to-gen performance and price differences, because the RTX 4070 Ti should have been named RTX 4070 and priced at $550-600 from the beginning. Like, the RTX 3070 was faster than the RTX 2080 by 20%, and now the RTX 4070 has the same performance as the RTX 3080, while the RTX 4070 Ti is better than the RTX 3080 by 20%. Nvidia is misleading people by using wrong names and higher prices. Even the power draw efficiency is a lie, because the RTX 3070 has a 220 W TDP and the RTX 4070 Ti (the real RTX 4070) has the same power draw, and the same goes for the rest of the GPUs.
Anything looks good compared to previous 40-series cards. They are all overpriced; comparing a bad price to a bad price is still a bad price. Also, your channel has come a long way from 1k subs! Grats on your growth, you're great, keep it up, more subs will come!
@@dontsupportrats4089 Yeah, let's hope they get back on track with the next RTX 5000, and I wish AMD would wake up and pull off a reboot like they did with Ryzen CPUs, and I hope Intel makes good GPUs too, so we see real competition that brings better performance for less money. Thx bro, I just dropped my channel; I just did it for fun when I was playing battle royale hhhh
Yes, they are both trying to be just slightly faster than the other so they can say they're better and then charge the higher price. Nvidia could have sold the 4080 for $750 and instantly won the generation, but since it was predicted to be faster than AMD's flagship, they decided they may as well charge more than what AMD would charge for theirs. They're just trying to squeeze more money out of fewer and fewer consumers. The reason they often show big improvements over last gen (especially Nvidia, by counting FG as real FPS) is to say to people who haven't upgraded, "look how bad your GPU is now, you need to upgrade!". Jensen Huang himself directly addressed GTX 1000 owners, saying "it's safe to upgrade" at the launch of Ampere, which is the exact moment I realised Nvidia went corrupt. AMD isn't doing much better, just copying some of Nvidia's homework, switching it up, and making it marginally better. Hopefully Intel Battlemage can give Huang and Dr. Su a reality check.
@@Crazicali Hope Battlemage is insane value and rocks the market. We need the market to reset in a good way for consumers. We'll see, though. There's a ton of potential in the mid-range market, as that's where the majority of PC gamers hang.
@@ElJewPacabrah It won't be, because they'll only be releasing the base model due to it being too 'expensive' to have multiple cards. At least that's what they're saying for now.
AMD wants to get rid of RDNA2 stock, and from now until RDNA4 they will want to get rid of RDNA3 stock. My prediction is massive cuts when RDNA4 is released, and RDNA4 will bring current high-end performance for mid-range prices. Then when RDNA5 is out, the 8800XT/8700XT will become "budget"/low mid-range cards, and the 9xxx cards will be a new tier of performance at Nvidia prices. Meanwhile there will be APUs as powerful as the 6600/6650XT with Zen 6 / DDR6, which will have 64-128GB/s of bandwidth per DIMM.
@@Crazical I understand why Nvidia behaves this way, as they have 80%+ of the dedicated GPU market share and they can get away with it. It just doesn't make sense why AMD refuses to put out better value cards and start moving units to gain some ground on Nvidia. They aren't even trying to compete in this space. They seem content to let Nvidia dominate them and at this rate even Intel could eventually surpass them. We need some of that innovation from the CPU side
But if you are potentially about to spend $600-700 (or £) on a graphics card, shouldn't spending 40 or 50 minutes (at least) researching the product be the very thing you SHOULD be doing?
@@defnotatroll Yep, this kind of content is stealing viewers for sure, and it's lower effort too: just take results from others and summarize them. Just like those ChatGPT AIs that take people's ideas and writing and summarize all of it.
Nah, I have to agree with TheMicj38; you really need to spend time on stuff like this. If you are planning on buying something at this price point and only do the bare minimum of research, it's on you if you get burned.
The contact pads for extra RAM chips are there so that the same PCBs can be used for GPUs based on AD103 (RTX 4080s and 4070 Ti Supers). AD103 uses the same substrate package layout as AD104 (the chip used for the RTX 4070, 4070 Super, and 4070 Ti), and that substrate includes connections for 2 extra RAM chips. But AD104 only has a 192-bit memory bus, so it can't use the extra connections; the other 6 chips use up the whole 192 bits. So the extra connections on the PCB are only usable when an AD103 GPU is installed. It's not that the chip is "cut down"; AD104 is a completely different chip which physically doesn't have any way to use the extra connections.
Exactly, this is why I don't like it when people talk about things they don't know anything about and jump to assumptions and made-up conclusions... it just looks bad for them, and he fell for this trap, and not for the first time... but it's good to see that I'm not the only one pointing that out. Cheers. :)
Bytesizetech explained in a recent video that in order to use more than 12GB with the 192-bit bus of these GPUs, the memory would have to run at lower speeds, or you'd have to double the 12GB straight to 24GB, or something like that (don't remember).
@@Adri9570 You can't exceed 12GB, as each chip needs 32 bits, making it 6 chips at 2GB each on a 192-bit interface. The only way would be to use bigger, more expensive modules, which pretty much don't exist yet... as we only got the 2-gig ones a while ago. Well, yes, you could also try double-stacking on the backside, but that brings its own problems and doesn't work very well if the controllers and co. aren't intentionally designed for it. Not to mention the additional power, voltage, phases, and generated heat again... The 3+ gig modules will probably only come with GDDR7 and RTX 5000.
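A tiny sketch of why 12GB is the ceiling described above (the 32-bit-per-channel and 2GB-per-module figures are the comment's assumptions, restated as arithmetic):

```python
# 192-bit bus with one 32-bit channel per chip and 2GB modules:
bus_bits = 192
channels = bus_bits // 32        # 6 channels -> 6 chips normally
module_gb = 2

normal = channels * module_gb          # 6 x 2GB = 12GB
clamshell = 2 * channels * module_gb   # double-stacked backside: 24GB
print(normal, clamshell)
```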
RAM and a gimped bus, and I bet next gen will have another feature that makes the current one obsolete; at least, if you are stupid enough to buy Ngreedia in the first place. I have both Ngreedia and AMD and wish all were AMD....
And AMD uses more VRAM than Nvidia, even on GPUs with the same amount of VRAM. Perhaps you should ask why. In some scenarios the difference can be well over 1GB on an 8GB Nvidia vs an 8GB AMD GPU. Why is this? Is AMD's compression just that bad? Nvidia doesn't seem to need as much VRAM. Regardless, if you're a muppet that has to play everything on ULTRA (no reason to do so), then you should reconsider. Also, 3 years from now, textures on Medium will be what today's Ultra is, so it doesn't matter, unless you're a pixel-peeper at 4K. People rave about VRAM a lot, but in 99% of scenarios you won't notice a difference between texture qualities unless you literally stand still and pixel-peep. From a normal viewing distance of 60-80cm you won't notice a lick of difference between today's High and Ultra. All you're doing is getting upset about something you're not really missing out on. Anyway, 3 years from now, judging by today's games, the current 7800 XT and 4070 S will be long obsolete in raster anyway.
@@haewymetal I agree with this, but it's a little aggressive lol. I have an 8GB VRAM card and I haven't had any issues with 1440p gaming, even with high textures. The only game where I have been limited by VRAM is the RE4 remake, which is notoriously VRAM hungry; even then, it just meant I could only push the textures most of the way to maximum instead of max. I think unless you're doing path tracing at 4K, 12GB will be plenty for years to come. And if you ARE doing path tracing at 4K, you wouldn't buy a 4070 in the first place.
I upgraded to a 7800xt 2 months ago and I'm very happy with it. Waiting for the 4070 Super was an option, but at the time there was nothing about price or performance. Still surprised that the 4070 Super is only 7% faster, though.
@@tringuyen7519 That 192-bit memory bus is the killer at 1440p or 4K. Everyone talks about the 12GB of VRAM, and frankly, yeah, it's a little low, especially for the cost, but it would still be OK for a while; it's the 192-bit memory bus that really makes me question why it's marketed as a 1440p card, because it isn't going to age well and is already struggling on newer titles at those higher resolutions. A 1440p card needs a 256-bit bus at minimum. Frankly, it's obvious the 4060 was supposed to be a new 50-class card, the 4070 was supposed to be a 60-class card, and the 4070 Ti Super should have been the 4070 from the start, except with the original core count.
Apart from jayz2cents, it seems the RT benchmarks are pretty lightweight and make AMD look better than it actually is in RT. Appreciate the meta-analysis! Would be curious to see more of this to identify whether any discrepancies exist between the techtubers.
A very good 1080p gaming GPU, and 1440p looks good as well. 12 gigs feels way better than 8. I could see this being good for longevity if the buyer of these cards stays at the lower resolutions in the future. What I'm thinking is that 4K has been out for a few years and many of us want the muscle to push those pixels going forward. I feel Nvidia has missed the opportunity, soaring too high to the sun with AI (singed its wings?). Enjoyed the video Vex. Peace!
I agree with some of your points, but about the extra VRAM spots on the board: they aren't for this die. The PCB is most likely shared with upper-range models like the 4080 and the upcoming 4070 Ti Super and 4080 Super. Those cards use the AD103 die on a 256-bit bus, which supports 16GB of VRAM. The 4070 Super physically can't support 16GB, as it's an AD104 die on a 192-bit bus; 6 or 12GB is all that GPU can support.
I think the card is alright, but the price jump is a bit insane for all GPUs, honestly. And if the 7800 XT drops down to 450, then it becomes a card that cannot be ignored, because it gives solid GPU performance for 1440p ultra gaming. Also, if the 7900 XT drops to 650 or even 680... people are gonna start recommending AMD for raw performance, unless you simply want DLSS.
Thing is, the 4070S will drop to around 550 also, and I think having RT options and frame gen puts it ahead. Games are released with bad performance lately; DLSS is important tbh.
@@IncredibleLyrics Yeah, DLSS is a good feature to have, but it's not gonna break any boundaries yet. I still don't think DLSS will become a make-or-break feature; having it is a plus, but not having it won't change that much, honestly. If I really want something for DLSS, I'm getting a 4080 or 4090; that's where the best DLSS performance is at. But still, price for performance, the 7800 XT is too much of a powerhouse at $450 if we're just talking raw performance without DLSS. I just don't see the incentive to buy a 4070 Super for $100 more to get a 10 percent increase in performance plus DLSS. Also, if the 4070 Super doesn't drop to $550 and stays at $600, with the Ti Super releasing and the 7900 XT at $700 now... when the 4070 Ti Super is released it may drop to $670 on holiday sales, and for $70 more you could get a 7900 XT and a huge... and I mean a huge performance increase. Because from what I know, Nvidia hardly ever drops prices on their GPUs.
The reason there are plenty of 4070 Super cards in stock at MSRP is because people are not willing to buy a 12GB VRAM GPU in the $600 price range. Nvidia skimped on VRAM and it's biting them in the ass, as it should. Only fools buy the 4070 Super as an upgrade from a 3080 or 6700 XT or 6800.
I'm not sure how much it's hitting them. All the people who need 20+ GB for work-related stuff are forced to buy 3090s/4090s, when otherwise they might need a lot less performance.
*I would take the 7800XT. Yes, the 4070S might be better now, but with more games using more VRAM as time goes on, the 7800XT with 16GB would be better value. Along with that, AMD will continue to improve on the software side, FSR etc. Long story short, the 7800XT will improve over its life cycle, whereas the 4070S will have more issues with limited VRAM*
To be fair, Nvidia also constantly improves DLSS, FG, and RR quality. And they also announced Neural Texture Compression, which may help with that 12GB buffer in the future. We can't be sure what the future will bring; that is the point. And by the time the 7800 XT can take advantage of those additional 4 GB, the card may simply have become outdated due to a lack of raw GPU power. 🤔
@@AKMcF Yes, if they ship games with RT already built in, then it will be an issue. For the second point, I'm talking in the context of the video, so 1440p gaming.
@@stangamer1151 Yes, both will continue to improve; I meant to say AMD has more room for improvement in the FSR department. DLSS already looks close to native, and AMD is far behind in upscaling. If they can also make it look close to native and remove those shimmers etc., then AMD will look more appealing in terms of value.
Used 3080s are going for a very good price where I live, for around 400 USD. But the 10gb of vram and the ridiculous power consumption have kept me away. I'm just gonna wait for the 4070 non-Super and 7800 XT to hit the used market.
How often do you buy a high end card? If you want the bells and whistles for years to come, stick a pry-bar in your wallet and get them, or run the risk of regretting it for years and deal with the extra heat generated by AMD cards.
I buy a new card/build every 5-8 years. Once I start struggling with games on medium settings, then it's time for a new build. If you're obsessed with ultra maxed-out graphics, you'll be disappointed every year and have to upgrade a card; if you can handle balancing your graphics settings between visual fidelity and FPS, you'll be happy. A decent PC build will last you at least 5 years minimum, unless you're a real nazi for graphics... and if that's your problem, you might as well buy a console Pro version so you can game at 4K and not worry about price. For example, a 1080ti still holds its value to this very day, and personally I still have a 1070, though I've upgraded to a 4070s.
I guess $600 is the new $300 and $1,600 is the new $900. FYI, Nvidia's overall profit margins are 26%, that's Apple territory levels of monopolistic greed.
@@srobeck77 Both are less than 30% COMBINED. NVIDIA knows that they control 70% of the market share, so through brand recognition alone they can sell more even if a product is marginally worse than its competitors.
Power is not only an issue in your area. It is also an issue in your case, which may need a bigger PSU. Oh, and because you are using more energy, you need more cooling. So it may be worth paying more for something that uses less power as you don't have to do the Temperature and power supply dance.
6:10 I don't know how this works, but from watching other reviewers I'm pretty sure the chip the 4070, 4070 Super, and 4070 Ti use CAN'T have 16GB of RAM. That's why the 4070 Ti Super uses the same chip as the 4080. Maybe it has to be a multiple of 6 or 3 or something; perhaps it could have 18GB or 24GB. Just guessing, haven't got a clue; I would have thought a tech reviewer would know. I'm pretty sure I've heard other "PC techtubers" mention it.
I could search up ANY GPU and find numerous sites ripping on it. I have a 4070S, so I'm biased, but man, I got it for $549 and it sure feels like that price was worth it.
Power consumption is not just about price. 100 watts more also means 100 watts more to cool, and that demands a better cooling solution; and if the coolers aren't better, they're louder.
IMO RT is still not a feature worth pining for. Currently the 4090 is the only card I'd consider to have competent RT abilities; otherwise you HAVE to use an upscaler to get anything resembling a good frame rate. Upscalers are great, but if you're going for max visual fidelity then they're a no-go. If RT is something you're really passionate about, it's prolly still worth waiting for the next gen of GPUs.
Great analysis. I don't think the 7800XT needs a price cut though, for a couple of reasons. First, my Phantom Gaming card is probably the strongest-performing of all the 7800XTs out there. It's like a runaway train when it comes to clocks; it just goes up and up, resulting in performance numbers above what you're seeing in the content creator videos. My 7800XT fps numbers are on par with a 4070Ti. I think it could be because I've undervolted it. I've seen it boost to 2766MHz with no overclock, and I've seen TBP as high as 424W without undervolting as well. I don't really care about FSR or RT; I care about the raw rasterization. I've got a long memory. The second reason the price should remain the same: I remember I bought my 1070Ti back in the day for $500. How much more powerful is a 7800XT than a 1070Ti? So yeah, AMD gave us a more than fair price imo, unlike greedy Nvidia. I will likely never buy Nvidia again; they've shown us they operate by a culture of greed, and that's disgusting. I've bought 4 AMD cards in the past 2 years, and I'm itching to pull the trigger on a 7900XT right now and will probably do so. The 7800XT is worth every penny as is.
Yeah, unfortunately when you put it all down on paper the arguments in favor of the 7800XT really aren’t affected much even given how much of a performance bump the 4070 Super makes over the OG 4070. Not too much incentive for AMD to drop prices on that card, or even the 7900XT as those were already in the mid-$700 range. I was thinking about this just prior to ordering a 7800XT a few days ago, I don’t see AMD being that compelled to make any long-standing price drop on it. Maybe incremental changes at most. And I want a reference model anyway, usually the slowest to get any long-standing price drops. Nvidia is likely aiming to raise prices on the 4070 Super to $700+ anyway. Nvidia fanboys will probably pitch in. It’s what they do every generation, every refresh.
I really think Nvidia should start offering upgrade services. Like, if you bought a normal 4070, then you should be able to trade it in for a Super or Ti model, and the cost would be the current price difference, or what the price difference would have been at the time of the first purchase.
Every GPU is $100 over MSRP for me in Romania. The prices haven't budged at all; I hope they will once the 4070 Ti Super and 4080 Super get released, but I doubt it, at least not right away, and it could take several weeks.
The GTX 1070 came in at USD 400 with as much VRAM as a GTX 1080 and outstanding performance for its time. We should understand that GreedVidia only listens as a result of a boycott.
I am hoping I won't have to upgrade until the next generation of consoles is released and we see what architecture games will be optimized for. The next-gen consoles might come with 20 or 24 gigs of RAM, and developers will start to utilize that extra memory. Or in a few years frame generation software might be so good that you won't even need a dedicated GPU. Hard to say.
AMD has an amazing opportunity to earn great community feedback. They should reduce the price of the 7800xt to $450, the 7900xt to $650, and the 7900xtx to $750.
of course it's *that* bad when they call a '60-class' card a "70"-class card, then charge twice what they should for it. It's insanity. The only legitimate '70-class' card is the 4070 Ti Super, but should have been called the 4070. The so-called 40"80" should be called 4070 Super, and the 40"80 Super" should be called 4070 Ti. The so-named 40"90" is the only legitimate '80-class' card and should have been $700.
@@BaldKiwi117 yeah. I just thought it was funny. Even though recent NVIDIA series have slowed in raster progress, somehow AMD is worse. Let's keep it real here
I had bought a 4070 before Christmas, but now that the 4070 Super has released, I'm just gonna return it to Amazon. Already ordered a 4070 Super; pairing it with a 7800X3D.
@@lifemocker85 I just feel Nvidia cards will encounter fewer problems. Maybe I'm wrong, but as a dumb first-time buyer this is the impression I'm getting, so I'll spend a bit more on the card for that.
I'm stupid. Does the 3080 really have good price-to-performance given that you'll be pulling more power from the wall? Do we need to factor in the running cost of the 30-series cards when gaming at least 3-4 hrs a day?
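For a ballpark on that running-cost question, here's a rough sketch. All three numbers are placeholders I picked for illustration (100 W of extra draw, $0.15/kWh, 3.5 hours a day), not measured figures; plug in your own card's draw and local rate:

```python
# Rough running-cost sketch for the extra power draw of an older card.
extra_watts = 100        # assumed extra draw vs a newer card
price_per_kwh = 0.15     # assumed electricity price in $/kWh
hours_per_day = 3.5      # assumed daily gaming time

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(round(kwh_per_year, 1), "kWh/yr, $", round(cost_per_year, 2))
```

With those assumptions it comes out to a couple of dollars a month, so the purchase price usually dominates.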
Honestly, I just had to return my Red Devil 7800xt due to a manufacturing defect and was trying to decide between another one or the 4070/4070 Super. I've only had an Intel Arc A770 till I got the 7800xt, so I'm pretty neutral when it comes to brands.
Hi from Europe. In regards to pricing: the more expensive the card is in USD, the higher the euro prices climb for us because of VAT (sales tax). Germany has 19%, Spain 21% (I think), here in Ireland it's 23%, and it goes even higher in some other countries. So 23% of 600 euro is more than 23% of 450. So really, we could do with prices dropping more here to combat the high VAT. I'm interested in a 7800xt or 4070s, but it's minimum 550 euro for the one, or almost 700 euro for the 4070s.

Plus, the cost of living is higher in Europe, so people often have less discretionary spending money available. For example, our car fuel is much more expensive, and I travel to work; luckily not as far as I used to, but I took a pay cut to vastly reduce my commute. Unfortunately, in Ireland wages can often be much higher in our capital, but then you have to deal with awful traffic and much higher rents or house prices. And because of our mostly terrible summer weather, we go to another country for our summer vacation; you can't just hop in a car, we fly to Spain, Italy, or the Canary Islands. Anyway, our health system has pretty much collapsed and homelessness is at record high levels, partially because of open borders, so I guess buying GPUs isn't overly important.
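A quick sketch of the VAT point above, using the rates the comment quotes (applied to a net price for illustration; real EU shelf prices already include VAT, so the exact split differs):

```python
# VAT scales with price: the same percentage adds more euros to a
# pricier card. Rates below are the ones quoted in the comment.
vat_rates = {"Germany": 0.19, "Spain": 0.21, "Ireland": 0.23}

for net_price in (450, 600):
    for country, rate in vat_rates.items():
        print(country, net_price, "->", round(net_price * rate, 2), "VAT")
```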
If AMD wanted to really hit NVIDIA where it hurts, I think the following would shake up the market:
7800XT - Drop MSRP to $450 - Why buy a 4060Ti 16GB?
7700XT - Drop MSRP to roughly $380 - Why buy a 4060 or 4060Ti 8GB?
7900XT - Drop MSRP to $700
7900XTX - Drop MSRP to $800-$850
Release the 7900GRE globally as a 7850XT or 7800XTX for $550-$600 (the 7900 GRE is a China-exclusive launch, but it's the card I would buy now if I could)
Keeping the 3080Ti for now, I think at least another year or two; in fact, I just got another prebuilt Alienware with a 3080Ti for my kiddo that's almost exactly the same setup, for $700.
For some reason in Oz there's always a launch/early-adopter tax, so currently the 4070ti is the better value over the 4070S as far as frames/$ is concerned. Once prices normalise, the 4070S is reasonable value, if you ignore the fact that GPUs in general continue to be overpriced.
People still feel that it is not enough VRAM for the money, and the cards haven't sold well despite not having a "big" supply, which Moore's Law Is Dead reported on his channel recently in his newest video.
The 3080 launched at $700 in Sep 2020; adjusted for inflation, that's around $1200 in Dec 2023 according to BLS. So if Nvidia could sell the 3080 like hotcakes before the pandemic pricing settled, I don't think they will reduce the price of a 4070 Super below $600. In their eyes, a 4070 Super is cheaper than a 3080 (adjusted for inflation) yet performs 8% better (according to TechPowerUp), so from a business perspective I don't see why they would need to reduce it more.
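For what it's worth, that inflation adjustment is just a ratio of CPI index values. A sketch with placeholder index numbers (100.0 and 118.0 are made up for illustration, not actual BLS data — look up the real CPI-U values for the two months you're comparing):

```python
# Inflation adjustment: real_price = nominal_price * (cpi_target / cpi_base).
def adjust_for_inflation(price, cpi_base, cpi_target):
    return price * cpi_target / cpi_base

# Placeholder indices: an 18% cumulative rise turns $700 into $826.
print(adjust_for_inflation(700, 100.0, 118.0))  # 826.0
```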
12GB of VRAM isn't really enough anymore for 1440p high/ultra or RT, so if you are going to be stuck at 1080p, you can get a much better deal for less money.
Thanks, it helped a lot, but I REALLY want to see the same comparison for the 4070ti Super. I am stuck between it and the 7900xt; I could stretch my budget just a bit, but only knowing that it is worth it.
How come no one talks about the pathetic less-than-5% performance boost between a 6800 XT and a 7800 XT? It's every bit as bad as the 3060 to 4060, yet no one talks about THAT.
I have both a 7800xt and a 6800xt; over the last couple of months I've seen the gap go more toward 10%, as well as the 7800xt being more stable in frame drops. But you're right, driver updates are constantly making the 7800xt better at a pretty fast rate; honestly, a year from now it might not be 2-3% from a 4070 Super.
Chilling with my 1080ti 11GB since 2018... lol, such a future-proof card, especially at 1080p ultrawide 75Hz. I might get a 4070 Ti Super, or wait till next year and get a 5070 16GB for $649.
I bought the RX 7600 (bundled w/ Resident Evil 4) last June @ $257. I don't have buyer's remorse, even though the 7700xt, 7800xt, and now the 7600xt are out. I upgraded from an RX 580/Ryzen 5 2600 to an RX 7600/R5 5600.
If rasterization has anything to do with texture quality, I feel like that is where my 4070 lacks. I had to do a lot of fiddling to get sharp-ish images, and my old RX 580 still looked better as far as texture/image quality goes, just with way fewer bells and whistles.
Using 1440p res for a xx70-tier card feels like a kick in the guts when you know in just less than a year newer games will run below 60fps @1440p with this card... and I'm talking about non-RT rasterized performance. It's like there's no future proofing in GPUs anymore. Nvidia is so dominant they're manipulating their releases so that even a generation refresh (like with this Super lineup) makes the last year's releases obsolete.
If you can't handle customizing your graphics settings, then you really should just settle for a console. It will cause you a lot less headache and trouble than worrying about maxing out every dog-shit, poorly optimized, mediocre new "TRIPLE A" release. Once you stop playing "keeping up with the Joneses" on graphics cards, you'll finally be satisfied with a decent card and a little bit of tweaking.
Dies have a certain amount of memory bandwidth; they cannot just add more memory. When they do add more VRAM, it is double the amount, because they solder it on the back using the same channels but with twice the RAM. That bottlenecks data transfers in certain applications, but it shows what hacks manufacturers have to do to add more VRAM. Something like the RTX 4070 Ti Super went to the RTX 4080 die; this is why it has 16 GB of VRAM in the first place. The RTX 4070S has empty spots because it shares a board with different models. This is done for cost reasons.
And now that you can find a 7900xt Nitro+ for 700 euro, GL Ngreedia ahah. Only an idiot would pick a 12GB GPU when there's a 20GB one at that price from a top brand.
Nah, it's still a 12GB GPU for $600+. Not to mention the AIB models that are close to or at MSRP tend to be of worse quality than the reference model. So no ty, Nvidia.
Honestly, FSR in newer games is implemented really well. I'm not sure DLSS will be a major selling point for much longer. I'd rather have the better raster performance, especially at 1440p. 1440p upscaling isn't going to look great anyway, DLSS or not.
@@bryanbowling1857 interesting. Thanks for reminding me about fair use thing. Lately, reaction material has been a talk of the town and it makes me question whether I should be doing that myself. I suppose it'll depend on the material I'm making and what content I'm using to make whatever point I wanna put out....if there is a point. Hahaha
Avatar is a full RT title, and it is very demanding. The 7900xt at 2160p, FSR 3 ultra, high/ultra custom settings, is getting 80+ fps average; with frame gen, 140+ fps. I have to say I can't tell that I have FSR and frame gen enabled! Still disappointing that one has to use such tech with such a high-end card, but games are so demanding now. My 6800xt wouldn't make 40fps in Avatar without FSR and frame gen.
I'm still more interested in RT and upscaling performance. It'd be worse to spend more than $500 on a card that struggles running next-gen games and other tasks. Yep, Nvidia is more expensive, but the hard work they've put into AI and efficiency shows. AMD has to try harder than just cutting prices.
Given Nvidia loves to play the game of cutting down cards while keeping the same name and have discontinued the original 4080 anyway, it kinda surprises me they didn’t skip the awkward naming of the 4070ti Super and just called it the 4080 hoping no one noticed. They’ve done it over and over and their buyers never cared.
Wish Nvidia weren't so predatory and had put 16GB in for 600 USD; then it might have been worth it, although $100+ more for DLSS is still a bit too much.
7900 XT it is then, with eyes closed and without regrets. As always, Nvidia let us down again with this stingy 12GB of VRAM; it could have been the one card to rule them all in its range if it had 16GB of VRAM. But Nvidia being Nvidia, as always.
Imma bite into the sour apple and support AMD. I have extremely high hopes for FSR (it's already made great progress since going open source) and want AMD to be real competition for Nvidia. I would never recommend a product based on "it will be good in the future", but I can take the risk myself. It's like how I'm super happy when somebody buys Intel GPUs; supporting another competitor is good for the consumer, but I'm not willing to deal with that myself yet.
The point why this Super update is "bad": it's only for the few PC gamers who are able or willing to pay 600 bucks or more for a graphics card. How much is that? 5% of the market? 10%? Maybe 15%? It's a nice price drop for the few people who usually pay A LOOOOT of money; for all those people this update and price drop is nice.

For all the folks who still face the not very interesting 7600 or 4060, or are even facing the overpriced 4060 Ti, there is nothing better. At 400/450 you can think about buying the 7700xt, but below that it's quite underwhelming and NOTHING changed there. For me this directly tells us: Nvidia and AMD don't give a shit about the average PC gamer anymore. THIS is what this "update" directly tells me. As one of those average gamers who went up to a rather high 400 bucks (RX 6800 non-XT)... I have the feeling that Nvidia directly spat in my face and laughed at me. Like they have done for years now.
Just check the last video chapter, it's wild
GREAT IDEA for this video. Congrats man!
i feel like both companies are competitively trying to not over-compete with each other, honestly
Yes, they are both trying to be just slightly faster than the other so they can say they're better and then charge the higher price. Nvidia could have sold the 4080 for $750 and instantly won the generation, but since it was predicted to be faster than AMD's flagship, they decided they may as well charge more than what AMD would charge for their flagship. They're just trying to squeeze more money out of fewer and fewer consumers. The reason they often show big improvements over last gen (especially Nvidia, counting FG as real FPS) is to say to people who haven't upgraded: "look how bad your GPU is now, you need to upgrade!". Jensen Huang himself directly addressed GTX 1000 owners, saying "it's safe to upgrade" at the launch of Ampere, which is the exact moment I realised Nvidia went corrupt. AMD isn't doing much better, just copying some of Nvidia's homework, switching it up, and making it marginally better. Hopefully Intel Battlemage can give Huang and Dr Su a reality check.
@@Crazicali hope the battlemage is insane value and rocks the market. We need the market to reset in a good way for consumers. Well see though. Theres a ton of potential in the mid range market as thats where a majority of pc gamers hang.
@@ElJewPacabrah It wont be because they'll only be releasing the base model due to it being too 'expensive' to have multiple cards. At least that's what they're saying for now.
AMD wants to get rid of RDNA2 stock, and from now untill rdna4 they will want to get rid of rdna3 stock. My prediction is massive cuts when rdna4 is released, and rdna4 will bring current highend performance for midrange prices. Then when RDNA5 is out, the 8800xt/8700xt will become "budget"/ low midrange cards, and the 9xxx cards will be a new tier of performance at nvidia prices.
Meanwhile there will be apus as powerful as the 6600/6650xt with zen 6 / ddr6 which will have 64-128GBps bandwith per dimm
@@Crazical I understand why Nvidia behaves this way, as they have 80%+ of the dedicated GPU market share and they can get away with it. It just doesn't make sense why AMD refuses to put out better value cards and start moving units to gain some ground on Nvidia. They aren't even trying to compete in this space. They seem content to let Nvidia dominate them and at this rate even Intel could eventually surpass them. We need some of that innovation from the CPU side
You know what, this format is great for summarizing reviews, because most people don't have to watch 40-50 min of content on a single component.
But if you are potentially about to spend $600-700 (or £) on a graphics card, shouldn't spending 40 or 50 minutes (at least) researching the product be the very thing you SHOULD be doing?
@@TheMicj38 If I am not buying right now, maybe I am just evaluating or comparing.
it is fantastic, until the original creators get pissed off at Vex for stealing their viewer numbers
@@defnotatroll Yep, this kind of content is stealing viewers for sure, and it's lower effort too: just take results from others and summarize them. Just like those ChatGPT-style AIs that take people's ideas and writing and summarize all of it.
Nah, I have to agree with TheMicj38, you really need to spend time on stuff like this. If you are planning on buying something at this price point and only do the bare minimum of research, it's on you if you get burned.
The contact pads for extra RAM chips are intended so that the same PCBs can be used for GPUs based on AD103 (RTX 4080s and 4070 Ti Supers).
AD103 uses the same substrate package layout as AD104 (the chip used for the RTX 4070, 4070 Super, and 4070 Ti), and that substrate includes connections for 2 extra RAM chips. But AD104 only has a 192-bit memory bus, so it can't use the extra connections; the other 6 chips use up the whole 192 bits.
So the extra connections on the PCB are only usable when an AD103 GPU is used.
It's not that the chip is "cut down", AD104 is a completely different chip which physically doesn't have any way to use the extra connections.
Exactly, this is why I don't like it when people talk about things they don't know anything about and jump to assumptions and made-up conclusions... it just looks bad for them, and he fell for this trap, and not for the first time...
But it's good to see that I'm not the only one pointing that out.
cheers. :)
Bytesizetech explained in a recent video that in order to use more than 12GB with the 192-bit bus of these GPUs, the GPU would have to run at lower speeds, or just double the 12GB to 24GB, or something like that (don't remember exactly).
@@Adri9570
You can't exceed 12GB, as each chip needs 32 bits, making it 6 chips at 2GB each on a 192-bit interface.
The only way would be to use bigger and more expensive modules, which pretty much don't exist yet... as we only got the 2GB-sized ones a while ago.
Well yes, you could also try double-stacking on the backside as well, but that brings its own problems and doesn't work very well unless the controllers and co. are intentionally designed for it.
Not to mention the additional power, voltage, phases, and generated heat again...
The 3GB+ ones will probably only come with GDDR7 and the RTX 5000 series.
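The chip math in this thread can be written down directly. A sketch only: the one-chip-per-32-bit-channel and 2 GB-density assumptions come from the comments above, backside double-stacking ("clamshell") is the doubling trick mentioned earlier in the thread, and 3 GB modules arriving with GDDR7 are the commenter's speculation:

```python
def max_vram_gb(bus_width_bits, chip_density_gb=2, bits_per_chip=32):
    """Max VRAM for a given bus, assuming one chip per 32-bit channel."""
    chips = bus_width_bits // bits_per_chip
    return chips * chip_density_gb

print(max_vram_gb(192))      # 6 chips x 2 GB = 12 GB (4070 Super's 192-bit bus)
print(max_vram_gb(256))      # 8 chips x 2 GB = 16 GB (AD103-class cards)
print(max_vram_gb(192, 3))   # 18 GB, if 3 GB modules ever arrive
print(max_vram_gb(192) * 2)  # 24 GB with backside double-stacking
```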
Nvidia is STILL trying to nickel-and-dime us with RAM
RAM and a gimped bus, and I bet next gen will have another feature that makes the current one obsolete, at least if you are stupid enough to buy Ngreedia in the first place. I have both Ngreedia and AMD and wish all of them were AMD...
That's by design, hoping that you upgrade every generation.
@@thepro08man I wish AMD made their own AI upscaler that games used. My only way to get high FPS in the new ARK is with DLSS.
And AMD uses more VRAM than Nvidia, even on GPUs with the same amount of VRAM. Perhaps you should ask why. In some scenarios the difference can be well over 1GB on an 8GB Nvidia vs an 8GB AMD GPU. Why is this? Is AMD's compression just that bad? Nvidia doesn't seem to need as much VRAM. Regardless, if you're a muppet who has to play everything on ULTRA (no reason to do so), then you should reconsider. Also, 3 years from now, textures on Medium will be what today's Ultra is, so it doesn't matter, unless you're a pixel-peeper at 4K. People rave about VRAM a lot, but in 99% of scenarios you won't notice a difference between texture qualities unless you literally stand still and pixel-peep. From a normal viewing distance of 60-80cm you won't notice a lick of difference between today's High and Ultra. All you're doing is getting upset about something you're not really missing out on. Anyway, 3 years from now, judging by today's games, the current 7800xt and 4070 S will be long obsolete in raster anyway.
@@haewymetal I agree with this, but it's a little aggressive lol. I have an 8GB VRAM card and I haven't had any issues with 1440p gaming, even with high textures. The only game where I have been limited by VRAM is the RE4 remake, which is notoriously VRAM hungry. Even then, it just meant I could only have the textures most of the way to maximum instead of max.
I think unless you’re doing path tracing at 4k, 12gb will be plenty for years to come. And if you ARE doing path tracing at 4k, you wouldn’t buy a 4070 in the first place.
I upgraded to a 7800xt 2 months ago and I'm very happy with it. Waiting for the 4070 Super was an option, but at the time there was nothing about price or performance. Still surprised that the 4070 Super is only 7% faster though.
best choice you have made
Only 7% faster because the memory bus is only 192 bits. The 7800XT has 4GB more VRAM on a 256-bit memory bus.
@@tringuyen7519 That 192-bit memory bus is a killer at 1440p or 4K. Everyone talks about the 12GB of VRAM, and frankly, yeah, it's a little low, especially for the cost, but it would still be OK for a while. It's the 192-bit memory bus that really makes me question why it's marketed as a 1440p card, because it isn't going to age well and is already struggling on newer titles at those higher resolutions. A 1440p card needs a 256-bit bus at minimum. Frankly, it's obvious the 4060 was supposed to be a 50-series card, the 4070 was supposed to be a 60-series card, and the 4070 Ti Super should have been the 4070 from the start, except with the original core count.
@@KraziAgent-cj8cx Why did you buy 2 cards in such close classes?
@@RogueRyzen A better upgrade would be an RTX 4090 at 4K.
Because the 7800xt can handle all games at 1440p.
Apart from JayzTwoCents, it seems the RT benchmarks are pretty lightweight and make AMD look better than it actually is in RT. Appreciate the meta-analysis! Would be curious to see more of this to identify whether any discrepancies exist between the techtubers.
A very good 1080p gaming GPU, and 1440p looks good as well. 12 gigs feels way better than 8. Could see this being good for longevity if the buyer of these cards stays at the lower resolutions in the future. What I'm thinking is that 4K has been out for a few years and many of us want the muscle to push those pixels going forward. I feel Nvidia missed the opportunity, soaring too high to the sun with AI (singed its wings?). Enjoyed the video Vex. Peace!
I agree with some of your points, but about the extra VRAM spots on the board: they aren't for this die. The PCB is most likely used for upper-range models as well, like the 4080 and the upcoming 4070 Ti Super and 4080 Super. Those cards use the AD103 die on a 256-bit bus, which supports 16GB of VRAM. The 4070 Super physically can't support 16GB, as it's an AD104 die on a 192-bit bus; 6 or 12GB is all that GPU can support.
I like these deep analyses, really interesting for potential buyers. Keep it up!
I think the card is alright, but the price jump is a bit insane for all GPUs honestly, and if the 7800xt drops down to 450, then it becomes a card that cannot be ignored, because it gives solid GPU performance for 1440p ultra gaming. Also, if the 7900xt drops to 650 or even 680... people are going to start recommending AMD for raw performance, unless you simply want DLSS.
Thing is, the 4070S will drop to around 550 too, and I think having RT options and frame gen puts it ahead. Games are being released with bad performance lately; DLSS is important tbh.
@@IncredibleLyrics Yeah, DLSS is a good feature to have, but it's not going to break any boundaries yet. I still don't think DLSS has become a must-have feature; having it is a plus, but not having it won't change much honestly. If I really wanted something for DLSS, I'd get a 4080 or 4090, that's where the best DLSS performance is. But on price-to-performance, the 7800xt is too much of a powerhouse at $450 if we're just talking raw performance without DLSS. I just don't see the incentive to buy a 4070 Super for $100 more just to get a 10 percent increase in performance plus DLSS. Also, if the 4070 Super doesn't drop to 550 and stays at 600, with the Ti releasing and the 7900xt at 700 dollars now... when the 4070 Ti Super releases it may drop to 670 on holiday, and for 70 dollars more you can get a 7900xt and a huge... and I mean huge, performance increase. Because from what I know, Nvidia hardly ever drops prices on their GPUs.
glad to see daniel owen getting recognition, i subbed to him when he had less than a thousand subscribers
The reason there are plenty of 4070 Super cards in stock at MSRP is that people are not willing to buy a 12GB VRAM GPU in the $600 price range.
Nvidia skimped on VRAM and it's biting them in the ass, as it should. Only fools buy the 4070 Super as an upgrade from a 3080, 6700xt, or 6800.
I'm not sure how much it's hurting them. All the people who need 20+ GB for work-related stuff are forced to buy 3090s/4090s, when otherwise they might need a lot less performance.
@@Alice_Fumo Aye, it's an upsell product.
In stock where? I haven't seen any since launch
@@nnightkingj Bruh what? Tons on Amazon, Newegg, Walmart, etc.
But there'll be plenty buying 4080 Super cards out of stock.
*I would take the 7800XT. Yes, the 4070S might be better now, but with more games using more VRAM as time goes on, the 7800XT with 16GB will be better value. Along with that, AMD will continue to improve on the software side, FSR etc. Long story short, the 7800XT will improve over its life cycle, whereas the 4070S will have more issues with its limited VRAM*
Unless games continue to get more RT heavy... which they are... no point in having the VRAM if you're at 1080p with medium settings😀
To be fair, Nvidia also constantly improves DLSS, FG, and RR quality. And they also announced Neural Compressed Textures, which may help with that 12GB buffer in the future. We cannot be sure what the future will bring us; that is the point.
And by the time the 7800 XT can take advantage of those additional 4 GB, the card may simply be outdated due to a lack of raw GPU power. 🤔
@@AKMcF Yes, if they ship games with RT already built in, then it will be an issue. On the second point, I'm talking in the context of the video, so 1440p gaming.
@@stangamer1151 Neural compressed textures: 50-series only, and only if you pay a $10/month subscription.
@@stangamer1151 Yes, both will continue to improve. I meant to say AMD has more room for improvement in the FSR department. DLSS already looks close to native; AMD is far behind in upscaling. If they can also make it look close to native and remove those shimmers etc., then AMD will look more appealing in terms of value.
Used 3080s are going for a very good price where I live, around 400 USD. But the 10GB of VRAM and the ridiculous power consumption have kept me away. I'm just gonna wait for the 4070 non-Super and 7800 XT to hit the used market.
Reviewing the reviewers is actually a good roundabout video for us to see. All in one spot.
Awesome video. I got a 7800xt instead of waiting for the 4070 Super, but I don't really regret it, as AMD seems good so far.
How often do you buy a high end card? If you want the bells and whistles for years to come, stick a pry-bar in your wallet and get them, or run the risk of regretting it for years and deal with the extra heat generated by AMD cards.
I buy a new card/build every 5-8 years. Once I start struggling with games on medium settings, then it's time for a new build. If you're obsessed with ultra maxed-out graphics, you'll be disappointed every year you have to upgrade a card.
If you can handle customizing your graphics settings to balance visual fidelity and FPS, you'll be happy. A decent PC build will last you at least 5 years minimum unless you're a real nazi for graphics... and if that's your problem, you might as well buy a console PRO version so you can game at 4K and not worry about price.
For example, a 1080ti still holds its value to this very day, and personally I still have a 1070 and I've upgraded to a 4070s.
Moore's Law Is Dead put out a video saying how bad the sales are
Yeah, saw that. I did not expect this card to sell so poorly.
why tho? the reviews are positive
@@defnotatroll 600 dollars for 12GB of VRAM in 2024? Just get the 7800xt, or add 100 dollars and get the 7900xt.
@@defnotatroll Retailers are charging more than for the 4070 Ti. Shows you who the real enemies are here.
I guess $600 is the new $300 and $1,600 is the new $900. FYI, Nvidia's overall profit margins are 26%, that's Apple territory levels of monopolistic greed.
Doesn't a monopoly mean just one, so no competitors? AMD is waving: hello friend, I'm not dead over here. Intel is also waving, but is perhaps dead.
@@srobeck77 Go on Twitch and check out ten random streamers. See what GPU they have, Nvidia or AMD.
@@srobeck77 Both are less than 30% COMBINED. Nvidia knows that they control 70% of the market share, so through brand recognition alone they can sell more even if it's marginally worse than the competition.
@@CompressedReassurance Monopoly = 1, not 70%. Do we need a linear chart as a visual aid?
@@CompressedReassurance More like less than 20% if we're talking just discrete GPUs.
If you're planning to build a PC now, I'd suggest waiting to see AMD's response to Nvidia, as pricing may change soon.
Power draw is not only an issue for your electricity bill.
It is also an issue in your case, which may need a bigger PSU.
And because you are using more energy, you need more cooling.
So it may be worth paying more for something that uses less power, so you don't have to do the temperature and power supply dance.
6:10 I don't know exactly how this works, but from watching other reviewers I'm pretty sure the chip the 4070, 4070 Super, and 4070 Ti use CAN'T have 16GB of RAM. That's why the 4070 Ti Super uses the same chip as the 4080. Maybe it has to be a multiple of 6 or 3 or something; perhaps it could have 18GB or 24GB. Just guessing, haven't got a clue. I would have thought a tech reviewer would know; I'm pretty sure I've heard other PC techtubers mention it.
I could search up ANY GPU and find numerous sites ripping on it. I have a 4070S so I'm biased, but man, I got it for $549 and it sure feels like that price was worth it.
Can we stop calling upscaling software a reason to buy a $600+ graphics card?
the GPU renders an image...a better image is the entire point...
Power consumption is not just about price. 100 watts more also means 100 watts more to cool, and that demands a better cooling solution; if the coolers aren't better, they are louder.
Amazing review. Been watching you since summer, can't believe you only got 54k subs. Keep up the good work
i always appreciate this meta-analysis content, it's what i subscribed for ngl it's super valuable
IMO RT is still not a feature worth pining for. Currently the 4090 is the only card I'd consider to have competent RT abilities; otherwise you HAVE to use an upscaler to get anything resembling a good frame rate. Upscalers are great, but if you're going for max visual fidelity then it's a no-go.
If RT is something you're really passionate about, it's probably still worth waiting for the next gen of GPUs.
I got a 4070S and I am very happy with it! 7900XT prices aren't very good here so I'm confident in this purchase.
Great analysis. I don't think the 7800XT needs a price cut though, for a couple of reasons. My Phantom Gaming card is probably the strongest-performing card of all the 7800XTs out there. It's like a runaway train when it comes to clocks: it just goes up and up, resulting in performance numbers above what you're seeing in the content creator videos. My 7800XT fps numbers are on par with a 4070Ti. I think it could be because I've undervolted it. I've seen it boost to 2766MHz with no overclock, and I've seen TBP as high as 424W without undervolting as well. I don't really care about FSR or RT; I care about raw rasterization. I've got a long memory. The second reason the price should remain the same: I remember I bought my 1070Ti back in the day for $500. How much more powerful is a 7800XT than a 1070Ti? So yeah, AMD gave us a more than fair price imo, unlike greedy Nvidia. I will likely never buy Nvidia again; they've shown us they operate by a culture of greed. That's disgusting. I've bought 4 AMD cards in the past 2 years. I'm itching to pull the trigger on a 7900XT right now and will probably do so. The 7800XT is worth every penny as is.
Yeah, unfortunately when you put it all down on paper the arguments in favor of the 7800XT really aren’t affected much even given how much of a performance bump the 4070 Super makes over the OG 4070. Not too much incentive for AMD to drop prices on that card, or even the 7900XT as those were already in the mid-$700 range.
I was thinking about this just prior to ordering a 7800XT a few days ago, I don’t see AMD being that compelled to make any long-standing price drop on it. Maybe incremental changes at most. And I want a reference model anyway, usually the slowest to get any long-standing price drops.
Nvidia is likely aiming to raise prices on the 4070 Super to $700+ anyway. Nvidia fanboys will probably pitch in. It’s what they do every generation, every refresh.
I really think Nvidia should start offering upgrade services: if you bought a normal 4070, you should be able to trade it for a Super or Ti model, and the cost would be the current price difference, or what the price difference would have been at the time of the first purchase.
Every GPU is $100 over MSRP for me in Romania. The prices haven't budged at all. I hope they will once the 4070 Ti Super and 4080 Super get released, but I doubt it, at least not right away; it could take several weeks.
I appreciate the meta-review. I see this format being very useful for viewers for whom time is a huge concern.
RT performance should always be considered now, considering it's baked into lots of new games without any way of disabling it.
The GTX 1070 came at USD 400 with as much VRAM as a GTX 1080 and outstanding performance for its time.
We should understand that GreedVidia only listens as a result of boycotts.
I am hoping I won't have to upgrade until the next generation of consoles is released and we see what architecture games will be optimized for. The next-gen consoles might come with 20 or 24 gigs of RAM, and developers will start to utilize that extra memory. Or in a few years frame generation software might be so good that you won't even need a dedicated GPU. Hard to say.
AMD has an amazing opportunity to get great community feedback.
They should reduce the price of the 7800xt to $450, the 7900xt to $650, and the 7900xtx to $750.
That's a good video idea (reviewing the other youtubers' data), please do it more often
Of course it's *that* bad when they call a '60-class' card a "70"-class card, then charge twice what they should for it. It's insanity. The only legitimate '70-class' card is the 4070 Ti Super, but it should have been called the 4070. The so-called 40"80" should be called the 4070 Super, and the 40"80 Super" should be called the 4070 Ti. The so-named 40"90" is the only legitimate '80-class' card and should have been $700.
And yet it is faster than AMD's 80 series card 😂
@@ehenningsenI mean it's $100 more so...
@@BaldKiwi117 yeah. I just thought it was funny. Even though recent NVIDIA series have slowed in raster progress, somehow AMD is worse.
Let's keep it real here
I had bought a 4070 before Christmas, but now that the 4070 Super has released I'm just gonna return it to Amazon. Already ordered a 4070 Super; pairing it with a 7800X3D.
Still a stupid choice
@@lifemocker85 I just feel Nvidia cards will encounter fewer problems. Maybe I'm wrong, but as a dumb first-time buyer this is the impression I'm getting, so I'll spend a bit more on the card for that.
600 will be better spent upgrading my cyclocross bike to a wireless groupset!
Wise choice, good rides.
Nice overview, good ideas in this video!
I'm getting the 4070 Super tomorrow. Upgrading from a 3050. So excited.
I'm stupid. Does the 3080 really have good price-to-performance given that you'll be pulling more power from the wall? Do we need to factor in the running cost of the 30-series cards when gaming at least 3-4 hrs a day?
30-series cards are all stupid
So the 7800xt is 20% cheaper but only 5% slower?
That’s a no brainer.
You can see the 20% savings but you can’t feel the 5% fps while gaming
I know this is a late comment, but is the 4070 fine for 1080p, and will it be worth it?
Honestly, I just had to return my Red Devil 7800xt due to a manufacturing error and was trying to decide between another one or the 4070/4070 Super. I'd only had an Intel Arc A770 till I got the 7800xt, so I'm pretty neutral when it comes to brands.
It's becoming hard to get the 7900xt below $750; it looks like AMD is not afraid to hold the line on prices anymore.
Hi from Europe.
In regards to pricing: the more expensive the card in USD, the higher the euro prices climb for us because of VAT (sales tax). Germany has 19%, Spain 21% (I think), here in Ireland it's 23%, and it goes even higher in some other countries. 23% of 600 euro is more than 23% of 450. So really, we could do with prices dropping more here to combat the high VAT added. I'm interested in a 7800xt or 4070S, but it's minimum 550 or almost 700 euro for the 4070S. Plus, the cost of living is higher in Europe, so people often have less discretionary money available. For example, our car fuel is much more expensive and I commute to work; luckily not as far as I used to, but I took a pay cut to vastly reduce my commute. Unfortunately in Ireland, wages are often much higher in our capital, but then you have to deal with awful traffic and much higher rents or house prices.
And because of our mostly terrible summer weather, we go to another country for our summer vacation. You can't just hop in a car; we fly to Spain, Italy, or the Canary Islands.
Anyway, our health system has pretty much collapsed and homelessness is at record high levels, partially because of open borders, so I guess buying GPUs isn't overly important.
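The VAT point above is easy to put in numbers (an illustration only, using the rates quoted in the comment): the same percentage adds a larger absolute amount on a pricier card.

```python
def with_vat(base_eur, rate):
    """Final price after VAT, rounded to cents."""
    return round(base_eur * (1 + rate), 2)

print(with_vat(600, 0.23))  # Ireland, 23%: 738.0 EUR
print(with_vat(450, 0.23))  # 553.5 EUR -- the 600 card carries 34.5 EUR more tax
print(with_vat(600, 0.19))  # Germany, 19%: 714.0 EUR
```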
Europe sucks because of the VAT.
@@torchbearer3784 Yes, mine is 4% higher than Germany's. I think Hungary is 25%.
Being in the euro hasn't done much for competition or prices
If AMD wanted to really hit NVIDIA where it hurts, I think the following would shake up the market:
7800XT - Drop MSRP to $450 - Why buy a 4060Ti 16GB
7700XT - Drop MSRP to roughly $380 - Why buy a 4060 or 4060ti 8GB
7900XT - Drop MSRP to $700
7900XTX - Drop MSRP to $800-$850
Release the 7900 GRE globally as a 7850XT or 7800XTX for $550-$600 (the 7900 GRE is a China-exclusive launch, but it's the card I would buy now if I could)
Keeping the 3080Ti for now, I think, for at least another year or two. In fact, I just got another prebuilt Alienware with a 3080Ti for my kiddo that's almost exactly the same setup, for $700.
The 3080 series was on a Samsung node. Nvidia doubled down on crypto and went the cheaper Samsung route instead of TSMC; that's why they ran hot.
That particular card in the teardown is also skimping on the power stages, which is probably worse for the card's longevity than not having 16GB.
For some reason in Oz there's always a launch/early-adopter tax, so currently the 4070 Ti is better value than the 4070S as far as frames/$ is concerned. Once prices normalise, the 4070S is reasonable value, if you ignore the fact that GPUs in general continue to be overpriced.
People still feel that it is not enough VRAM for the money, and the cards haven't sold well despite not having "big" supply, which Moore's Law Is Dead reported recently on his channel in his newest video.
The 3080 launched at $700 in Sep 2020; adjusted for inflation, that's around $1200 in Dec 2023 according to the BLS. So if Nvidia could sell the 3080 like hotcakes before the pandemic, I don't think they will reduce the price below $600 for a 4070 Super. In their eyes, a 4070 Super is cheaper than a 3080 (adjusted for inflation) yet performs 8% better (according to TechPowerUp), so from a business perspective, I don't see why they need to reduce it more.
Being slightly better value than a four-year-old card is not something to be proud of; AMD is still offering better value for money.
@@gintozlato1880 Last-gen high-tier performance for half the price. I would argue that's more than slightly better value than a 3080, as you said.
just found this channel and im staying for the music
I'd say good news. Wouldn't this mean from now on anything costing $600 or above will always have at least 3090 performance?
At 1080p the 4070 is more than enough, plus 8-pin vs the new 12V connector... easy choice.
12GB of VRAM isn't really enough anymore for 1440p high/ultra or RT, so if you are going to be stuck at 1080p you can get a much better deal for less money.
Thanks, it helped a lot, but I REALLY want the same comparison for the 4070 Ti Super. I am stuck between it and the 7900xt, so I could stretch my budget just a bit, knowing whether it is worth it.
and this video is why I subscribe to your channel.... nicely, nicely done..!
How come no one talks about the pathetic less-than-5% performance boost between a 6800 XT and a 7800 XT? It's every bit as bad as the 3060 to 4060, yet no one talks about THAT.
I have both a 7800xt and a 6800xt; over the last couple of months I've seen the gap go more toward 10%, as well as the 7800xt being more stable in frame drops. But you're right, driver updates are constantly making the 7800xt better at a pretty fast rate. Honestly, a year from now it might not be 2-3% from a 4070 Super.
Daniel Owen is underrated IMO….
You are too, man... keep it up, dude!
Chilling with my 1080ti 11GB since 2018... lol, such a future-proof card, especially at 1080p ultrawide 75Hz. I might get a 4070 Ti Super, or wait till next year and get a 5070 16GB for $649.
I bought the RX 7600 (bundled with Resident Evil 4) last June at $257. I don't have buyer's remorse, even though the 7700xt, 7800xt, and now the 7600xt are out. I upgraded from an RX 580/Ryzen 5 2600 to an RX 7600/R5 5600.
If rasterization has anything to do with texture quality, I feel like that is where my 4070 lacks. I had to do a lot of fiddling to get sharp-ish images, and my old RX 580 still looked better as far as texture/image quality, just with way fewer bells and whistles.
Using 1440p as the target res for a xx70-tier card feels like a kick in the guts when you know that in less than a year newer games will run below 60fps at 1440p on this card... and I'm talking about non-RT rasterized performance.
It's like there's no future-proofing in GPUs anymore. Nvidia is so dominant they're manipulating their releases so that even a generation refresh (like this Super lineup) makes last year's releases obsolete.
If you can't handle customizing your graphics settings, then you really should just settle for a console. It will cause you a lot less headache and trouble than worrying about maxing out every poorly optimized mediocre "TRIPLE A" game that gets released. Once you stop playing keeping-up-with-the-Joneses on graphics cards, you'll finally be satisfied with a decent card and a little bit of customization.
900 CAD lol. Mind you, the 4080 is still averaging 1800-2000 for AIB models depending on brand. Effing outrageous.
Dies have a certain amount of memory bandwidth; they cannot just add more memory. When they do add more VRAM, it is double the amount, because they solder it on the back, using the same channels but with twice the RAM. It bottlenecks data transfer in certain applications, but it shows what hacks manufacturers have to resort to to add more VRAM.
Something like the RTX 4070 Ti Super moved to the RTX 4080 die. This is why it has 16 GB of VRAM in the first place.
The RTX 4070S has empty spots because it shares a board with different models. This is done for cost reasons.
And now that you can find a 7900xt Nitro+ for 700 euro, GL Ngreedia ahah. Only an idiot would pick a 12GB GPU when there's a 20GB one at that price from a top brand.
Nah, it's still a 12GB GPU for $600+. Not to mention the AIB models that are close to or at MSRP tend to be of worse quality than the reference model. So no ty, Nvidia.
I think a lot of people are sleeping on the OG 4070 for when it hits 475-500.
Honestly, FSR in newer games is implemented really well. I'm not sure DLSS will be a major selling point for much longer. I'd rather have the better raster performance, especially at 1440p. 1440p upscaling isn't going to look great anyway, DLSS or not.
Great job Vex, math teacher Owen must be proud
Should I review the reviewed review of reviewed rtx 4070 super?
That 6800XT GamersNexus has isn't an anomaly; it's a Sapphire 😈
When you use clips like these do you email the creators ahead of time? Asking for a friend.
There is really no need. They put themselves out there; it is fair use.
@@bryanbowling1857 Interesting. Thanks for reminding me about the fair use thing. Lately, reaction content has been the talk of the town, and it makes me question whether I should be doing that myself. I suppose it'll depend on the material I'm making and what content I'm using to make whatever point I wanna put out... if there is a point. Hahaha
Avatar is a full RT title; it is very demanding. The 7900xt at 2160p, FSR 3 ultra, high/ultra custom settings, gets 80fps+ average; with frame gen, 140fps+. I have to say I can't tell that I have FSR and frame gen enabled! Still disappointing that one has to use such tech with such a high-end card, but games are so demanding now. My 6800xt wouldn't make 40fps without FSR and frame gen in Avatar.
I've been watching your videos for the last two months, idk why I didn't sub sooner, my fault bro 🙏
I wonder how many people were sitting in front of their PC one day and thinking: "Man, I sure wish buying a graphics card became really confusing."
Nvidia is claiming the 4070 Ti Super is 2.5x the 3070 Ti in performance. Is that a real 2.5x or a Jensen 2.5x? Probably a Jensen 2.5x.
I'm still more interested in RT and upscaling performance. It'd be worse to spend more than $500 on a card that struggles to run next-gen games and other tasks.
Yep, Nvidia is more expensive, but the hard work they've put into AI and efficiency shows. AMD has to try harder than just cutting prices.
Something's off with the sound; I hear some crackling. It's more noticeable at 1.75x speed.
Starting to actually appreciate my 7900 XT. I was worried about it underperforming, but found after a bit of tinkering that it's better than a 4080!
😂 reviewing reviews! Now that's something I didn't think I'd see and like.
Given that Nvidia loves to play the game of cutting down cards while keeping the same name, and has discontinued the original 4080 anyway, it kind of surprises me they didn't skip the awkward "4070 Ti Super" name and just call it the 4080, hoping no one noticed. They've done it over and over, and their buyers never cared.
I wish Nvidia weren't so predatory and had put 16 GB on the $600 card; then it might have been worth it, although $100+ more for DLSS is still a bit too much.
The 4070 Super's chip only supports 12 GB or 24 GB. Since you're not getting 24 GB at this price, it has to be 12... sad story.
I've got a 12 GB 3080, so I'm thinking maybe the 4070 Ti Super, but it's not super compelling atm.
So I'll just wait for the reviews of the RTX 4070 Ti Super or the RTX 4080 Super.
A whole lot in stock at Micro Center Massachusetts.
7900 XT straight into my veins, eyes closed, no regrets. As always, Nvidia let us down again with this stingy 12 GB of VRAM. It could have been the one card to rule them all in its range if it had 16 GB of VRAM, but Nvidia being Nvidia, as always.
Hope someone answers: if I'm building a brand-new PC and want to future-proof for 1080p gaming, is it worth it?
Going to be my next card to replace my current 3060ti
Should I upgrade from a 3060 Ti to a 4070 Super?
I'm gonna bite the bullet and support AMD. I have extremely high hopes for FSR (it's already made great progress since going open source), and I want AMD to be real competition for Nvidia.
I would never recommend a product based on "it will be good in the future," but I can take that risk myself.
It's like how I'm super happy when somebody buys an Intel GPU: supporting another competitor is good for the consumer, but I'm not willing to deal with that myself yet.
$20 says the 4070 doesn't drop in price and this just ends up slotting between the 4070 and the 4070 Ti Super at like $650-$700.
The reason this Super update is "bad": it's only for the few PC gamers able or willing to pay 600 bucks or more for a graphics card.
How much is that? 5% of the market? 10%? Maybe 15%?
It's a nice price cut for the few people who usually pay A LOT of money; for them, this update is nice.
For everyone still stuck looking at the uninspiring 7600 or 4060, or even the overpriced 4060 Ti, there is nothing better. At $400-450 you can think about the 7700 XT, but below that everything is quite underwhelming, and NOTHING has changed there.
To me this says one thing: Nvidia and AMD don't give a shit about the average PC gamer anymore.
THIS is what this "update" tells me. As one of those average gamers who stretched to a rather high 400 bucks (RX 6800 non-XT), I feel like Nvidia spat in my face and laughed at me.
Like they've been doing for years now.