I've been a 70 guy for a while now, i upgraded from a 1070 to a 3070 this past holiday season. The reason I have been a 70 guy is because it was always a nice balance between price and performance for me. Now I'm not even a 70 guy anymore, this is despicable. Edit: This gained more traction than I thought it would. I wanna clarify that seeing these prices go up is quite frustrating because with previous generations of nvidia cards the 60s and 70s were in the mid range where there was a good sweet spot for price to performance. Seeing these kinds of prices makes the consumer expect the same kind of price to performance ratio, just a higher pricetag. But especially with the newest 4070, this is no longer the case and the lower priced options aren't any better for value, just a lower price. It feels like there is no longer a great cost-efficient option from team green unless you want to miss out on higher vram options which somewhat makes your card outdated at launch.
If you go by pascal’s scale of cards the 1070 was amazing, if you put the performance ratios of the 40 series into the naming of the 10 series, the 4070 should be called a 4050 even though they gave it the price of what should be the 4080.
@@DashVandle Not 4050. I would go with 4060Ti at maximum. But yes - with 4080 price. In our country you can get new RX 6800 for 499USD or 2nd hand for 349USD. Today I slightly regret that I didn't wait longer and bought (2nd hand) 6700XT. I'd rather have the 6800 16GB card. But even the 6700XT is pretty good for 2K gaming. Nitro+, undervolted and quiet as a mouse. RTX 4070 is perfect card for today's 2K gaming, but not for 599USD. Should have been at least 100USD cheaper.
Why do people keep grouping themselves by model numbers... If you used to be an 80s then get the 70 since they are about equal. All this stuff is just a mess... People need to quit just looking at model numbers and compare the performance they are getting for the price instead.
I've never been a high-end build guy, always budget. But this next build was supposed to be the one were I finally went high-end (Zen 4 + ~4070). And then the last few years happened where prices for everything went crazy. I was hoping things would return to normal, but I hate the idea of spending more money for something than I would normally have to. I am seriously considering just sticking to low-midrange for my next build now.
yep I feel like the best bang for your buck atm is still a used 3060 12gb if you are still having something older. I still play on my 1070ti 8gb and I really dont know what to upgrade to, I actually want atleast 16gb for my next card but the prices are insane.. I am actually looking at the Intel Arc A770 because it has 16gb for only 400 Bucks which is unmatched, but they still seem to have driver issues with dx11 and dx9 if I heard correctly. Ill probably still keep the waiting game and see what happens, but if things dont get better I have to make a decision at some point...
Depending what you're looking at, your price range if you want to go high end, and where you live consider amd's graphic card offerings. Last gen's Radeon 6950xtx is punching well above the 4070's weight class at the same price (in some regions in my part of the US it's the same or cheaper) and more vram plus the driver support has been great. Through the communities dedicated support for upgrading outdated cards. It will likely have a long life span. The 7900xt and xtx both punch equal if not well above the 4080 for pricing similar to the 4070ti's price as well as as having an insane amount vram. Just do be aware of the risk of those two card right now as some do have physical issues but if you get an ok bored from them the driver support should keep it relevant for a long time to come. Unless you feel like you need path tracing specifically, amd is in a good spot right now surprisingly. And honestly I don't think path tracing is gonna be decent or widely supported until another architect upgrade possible 2. I'd say consider used but considering the miner boom there's no way of knowing how heavily over used and under cooled some cards were versus won't. So I can't in good conscious recommend used this time around. And it feels really weird to say that.
stick with budget builds. that's what I've always done, and I've done many builds in my time (since the 1990s). you don't need some GPU that will cost you as much as a complete system should cost you unless you want to play games that were released the last year at very high framerates or at very high settings. charts made by reviewers tend to use pointlessly high settings. just use "high" or "medium" and you'll be able to play most games just fine even if your GPU is "budget".
Buy ARC! If not this gen, then hold off until the next generation of ARC, which - if driver updates continue to see 3-5% improvements to performance EVERY MONTH it's possible that even the second generation ARC GPU will be swinging up with AMD, if not Nvidia!
Funny with the economy the way it is you'd think they'd go for lower budget cards that normal people could shop for and that still compete since they have higher end cards already for the enthusiasts or people who have the money to buy higher end. Guess if your an average perrson who just likes gaming your assed out unless you save for a year lol
I, personally, would not call this a 'higher end'. Mid tier maybe. But it is the new high, mid, low era we're in... it ain't gonna change when they keep selling these cards.
If I'm not wrong for 30 and 40 series they've started with the 90 series and worked their way back, it's not necessarily them focusing on the "high end" gpus it's just what they planned to be release next and shouldn't be unexpected
I bought a 4070fe. Honestly if I’m going to get similar performance of a 3080 for 150w less I’ll take it. I have a 3080 now and it runs 15-20c hotter and only gives maybe 10% better performance on certain titles? If you have the money, this ain’t bad and you can splurge on a higher end card later lol
@@Wheel-os I originally bought a 4070 ti, and it was way more than I needed tbh. I don't play games, just surf the web and try and mess around in fruity loops. but running msi kombustor and watching the fps 😂😂🤣🤣. it's like a drag race, you always wanna beat your score somehow. so I decided to do a whole new build for a 2nd computer, all white td500 mesh 13700k 4080 aero white Asia horse and cablemod cables... white 280mm white nzxt x63 there's some black on the mobo and psu that give a decent contrast for the casing. I'm terribly new to cable management though 😩. everything is running but I really wish I bought a bigger case to start with. and could get my wire runs a lot neater.
70 was always the sweet spot for me, I always felt it was the right balance between price and power. And I know that's how many people were feeling as well. They really s**t the bed with that one, and Nvidia appears less and less like a good buy. It looks more and more like I'll be looking into an AMD card to upgrade my old 1070, which WAS a great card.
Which card you thinking? I need a new gpu before counter strike 2 comes put. Just built a 13700k build but my 1070 remains in it due to my inability to figure out what to buy. Just seems like an awful time to need a new gpu.
@@hobosnake1 Same spot here with the exception that I have 6900k instead of 13700k. I'm thinking maybe either rx 6950 xt or rx 7900 xt for longer term. Originally looked at RTX 4070 and 4070 ti, but they seem to be on drugs.
@Dezmont I was thinking the 6950xt as well. I just have never used amd. I don't know in practice what the differences will be. I like editing video and recording gameplay in CS, I need to research the differences a bit more before I pull the trigger, but the 6950xt seems like a better idea if you're like me and prefer not to swap out hardware until a complete new build is in order.
@@hobosnake1 I have a 6800xt, (used to have a 3070 but sold it for a profit) so I can tell you this: the H264 encoder is bad, but the HEVC (265) works fine, and streaming is waaaay better than geforce experience bloatware. Open source everything is a nice touch, so if you use linux on the side, it's a no brainer, just plug and play. If you get the 7900xt, you will have AV1 encoding however, if that matter to you. I think you can get it for $700 open box, not sure. Edit: 7900xt $680 open box.
Thank you for explaining the charts, making them understandable AND showcasing the items in comparison with a little white overlay on top to help us put our eyes on the things you're talking about. I would still recommend doing the results reversed. I understand you put above as the resolution went up, but seeing as the values are lower, I'd say lower resolution on top would be better.
Truthfully, I'm still in 10 series. Been watching all these cards, how they are doing and watching price points. I would love some ray tracing, but I'm not gonna die without it. I think once evga backed out of the market I ended up dropping my plans to upgrade. Now I just keep watching and waiting to see something that catches my eye. But really, even at 1440p my 1080ti still does what I want it to for the most part. I would love to updlgrade at some point but the pricing has been pretty nuts for awhile.
Agreed. My 1080 still plays every game I own at high to maxed out completely at 1080p which is what I game at (For now, until something compelling comes along. Wouldn't mind a 3080ti) but I'm just not happy with the prices for the 40 series, and seeing how they fucked the 70 line, up I don't even want one of those now either. I'm usually an 80 or higher, but man. $1200 for a 4080 is slightly more than double what I paid for my 1080
@@artvandelay9131 I just sold my 1080 with 150 euros. Masterful product imo. I bought it new for 450€. I updated to used 3080 and paid 600 of it. I don't play a lot but I guess I can go few years with this..
I'm the same. I'm just willing to get a 3080ti because my 1080ti shows its age when I plug it into a 4K tv. But the price is still holding me back. I just don't get why people simply hit the buy button and that is it. Don't they think it sends a message that those prices are ok?
@@BrunoRodrigoPintodaSilva As someone that has a 4080, I just want to play games? I work hard, make good ass money at my job, want to come home to a peak gaming experience of 100 fps 1440p ultrawide max settings. I'm not gonna not buy something I want and can afford just to send some company a message that isn't going to be heard anyway. It's fucking video games, it's not that serious. Sorry, I guess? lol
@@ForeverMasterless I really get you point of view. It just gets on my nerves that we don't get what we feel we should get. When you buy a car with a higher power/mass ratio you expect it to be more powerful, in the end it's just a car. OK, you can buy a 100k USD car that is focused on building materials and not power itself, but you know what you will get. What do you want with a VGA? I guess we all want our fps, and it seems we are paying for god know what, because the price is going up much faster than graphics and fps. I really hope you get what I wanted to express. Anyway, great weekend for all of us.
These cards just make me glad that I switched over to AMD. Never used AMD before, first time buying an RX 6700 for $330 and I think the value proposition on AMD is just insane compared to Nvidia. I can see how ray tracing and certain productivity software could push people towards Nvidia, but overall I think AMD takes the cake for value gaming cards.
Same here... Man, Nvidia is great, but AMD is for gamers.Nvidias main target are the professionals that need Nvidia productivity software and enthusiasts.. if you're not one of them, THEN A M D IS F O R Y O U .
i'd also keep a eye on Intel's graphic cards, they got a 2nd generation in the works that hopefully is them punching through and comparing well to AMD/Nvidia cards.
4090: 20 GB VRAM, $1600 7900 XTX: 24 GB VRAM, $1000 Need I say more. Sure the 4090 is faster, but for $600 more, on what planet is that worth it. And less VR so overtime it will be worse than the XTX? What a joke
@@thisislame2207 Intel is awesome for sure! Their drivers have been maturing slowly. I think the only place they fall behind is legacy game support. DX9 and earlier titles especially have very poor performance with Intel GPUs according to almost every reviewer on UA-cam.
Thanks Jay and thanks for continuing to hammer Nvidia on their pricing. Basically I am done with them at this point until they come back to their senses. Every video card that I have bought since 2006 has been an Nvidia based product but that will end when I swap out my 1070 Ti. I will either go back to Radeon, or the 2nd generation Intel ARC cards after they release if I decide to switch platforms. If people decide to cave in and buy 40 series cards from Nvidia at the prices they are charging, they can expect to see price increases every generation that go well beyond the rate of inflation. To those folks, I wish you luck.
I agree wholeheartedly. I have a 3080 and it's the last NVidia card that I will buy. I will definitely go with an AMD GPU on my next build in a few years. Intel's ARC cards are picking up some steam with their driver updates
GPU prices, and those from Nvidia in particular, won't go down as long as people keep buying their products. And they do. I don't know who, spending more than $300 on a GPU - and that price used to give you a high end GPU - seems ludicrous to me. In case anyone's wondering for some reason, I'm still using a RX 570. I've thought about getting a RX 6600, but I'm not convinced it's worth buying AMD either since they seem to have a mindset of "we'll just put stuff out there that's barely got a better price/performance ratio than nvidia since consumers have the choice of buying that or nvidias slightly more overpriced cards"
Just built a new super rig around my 1660 specifically with the intent to build around it to support up to most of the RTX series GPUs, and didn't even have to debate with myself about whether I remain loyal to Nvidia like I have for well over a decade now. I went back to Radeon and while I'm excited for the change, I'm still slightly bummed that it came to a point where Nvidia has gotten so delusional with their pricing that I bought a competitor card without even a second thought. All because it posts the same or better specs as some of the RTX cards at like... I think I paid almost $1,000 less 🙄
This is how hardware review videos should be done. Quick answers in the title for those of us with little/short attention spans, lots of details in the video for those who are actually concerned about that kind of stuff. Both are perfectly fine, but honestly I'm only really interested in if I'm getting a good product or not at the price it's being offered.
Man, who knew that buying a 2070 Super 3,5 years ago would be such an amazing financial decision. With prices going insane on GPUs right after and to this day, I'm probably gonna stick with it for some years to come.
We NEED more of this format, Jay. I think it works better for everyone involved and I gotta say, I've been watching your videos for a long time and this discussion was what I really needed to know If i should go out and buy or keep sitting on my 1660 Ti and wait. Thank you so much!
Price per performance is basically on the level of previous generation which means 4+ years of stagnation altogether... At least the power consumption is lower thanks to move from Samsung to TSMC.
Great breakdown of the charts! Clear and informative, with fantastic comparison notes. Thanks for doing this for us all, Jay! Much prefer this over the rock music shotgun charts with discussion after.
I feel like you'd be better off looking to get a 6950xt, considering that they're going for about the same price point and perform typically better as long as Ray Tracing isn't involved. On another note, I'm a big fan of the new way to go over these charts. Talking over them and explaining them with highlights like Gamers Nexus does is a solid move. It feels much easier to follow along!
i prefer the 6900xt over the 6950xt only because the 6950 tends uses a lot more watts for pretty much no difference in performance and its usually about the same price.
Ray tracing is getting more important, and you're losing out on NVIDIA features there. DLSS is just superior to FSR, and NVIDIA has graphic upscaling for videos and livestreams. Forget the branding for it but its **amazing**. Streams look freaking amazing with it turned on. If you're willing to pay for the power the card is using.
@@deathhimself1653 12GB vram not enough, especially with RT on. Those ray acceleration structures need memory too. There will be games where this card loses in RT benchmarks because it doesn't have enough memory for RT mode.
@@SomeFrenchDude not to mention turning your room into a sauna lmao. It it’s exactly why I think the 6900xt is better than the 6950xt, way less power consumption with basically same performance.
I'm hanging on to what I've got for now. $600 for a "70" series card is too rich. Pascal was the pinnacle IMO, when you could get an "80" series card for $600 and sip the same amount of power as the 4070 many years ago. The pricing of these things are ridiculous coming from a company that reported record revenue and claiming that its to offset the development cost of these cards, and that they treat board partners so badly that one of the best ones on the market pulls out of GPUs altogether.
I would stick with team red but their video encoders are quite shit compared to team green. When using a VR headset like a Quest or a Pico it's noticeably worse. Wish AMD would get that straight before the end of the century.
I have the Sapphire Nitro 7900XT one... This card is a beast! I have beat with them recently the spytime extreme world record (with fan cooling, no liquid nitrogen or so).
Going to echo what everyone else has said-I really dig this format and the explanations slide by slide. Makes it much easier to digest a lot of numbers. I’m using a 2070 Super in my current rig and have been trying to figure out what to upgrade to. The power efficiency of this card would make it perfect for a small form factor build, but like you said… it’s difficult to stomach at this price point. I’ll wait for now to see if I can get one on sale later in the year.
I must say, the iFixit promo bit is my favorite, World of Warships is a very close second place. The informative and very helpful episode? Great, as always, of course! Thanks Jay
the iFixit stuff is always my favorite opener to any youtube video ever these days, always makes me smile, and I'm pretty sure I'm gonna end up buying a set or two anyways because it's so good
THANK YOU for mentioning how the 80ti used to be the top and the new 70 series used to beat the old 80ti! And that we used to get that TOP card for this price! It's insane!
We got great price to performance uplifts with new shifts in architecture culminating in the amazing value of Pascal and the amazing value of the 30-series cards. But the generation after those always to seems horrible. 20-series, now 40-series has bother been super anti-consumer priced cards that were out of touch with all but enthusiasts.
Been on the fence with what to buy for my new system in a few months and after this I am 100% going with an XT or XTX depending on sale prices at the time.
Jay, you should include a game of the RE engine (maybe 4 remake) in your suite. They are great at showing the vram bottlenecks, like the 3060 totally destroying the 3070 because of it. So it would be interesting to see the 4070 12gigs vs the original 3080 10gigs.
Great video as always, but I do have a question about your results. Why do you only include the 7900XTX in one chart but then proceed with the 7900XT in the remainder?
Glad i opted for a used 3090 after the 40 series launched. No way im paying into the overpriced market for new cards. My 3090 will be good for years hopefully (i upgraded from a 1080ti which i have given to a friend).
Im keeping my 1070. It still does 90% of what I want and cost a ton less than what we have now. Ive been wanting to upgrade the last few generations, but the cards just cost stupid amounts now.
Still using my 1070 as well. I'm so surprised how well it plays some games, Cyberpunk 2077 is running great for me. Granted if that game didn't use FSR 2.0 I probably wouldn't be having such a good time.
As far as I know, this is the first 2-slot card (40mm thick) from the NVIDIA 4xxx series. Might be worth mentioning, this is ideal for some unique builds and small cases.
Love the look of the new chats (colors, layout, etc). Might suggest using red/green/blue colors for the card name so it's easier to see what is AMD/Nvidia/Intel); or an icon. Also like the explanation (rock music was never my fav - prob because I'm do dumb/lazy to interpret them fully). I personally don't mind the 4K on top, 1080p on bottom.
3:30 no actually the new xx70 cards have usually beat (or been very close to) the 3080 Ti or 3090. It's the xx60 cards that used to beat or compete with the previous generation xx80 cards. This 4070 performs the way one would have expected the 4060 to perform. Check techpowerup's GPU comparison.
complete scam indeed, i mean it's been how many years and you get a 3080 for 100 bucks less and less power? sure it's improvement but this is way less than what it should be
Well, I have to say that I'm impressed by the efficiency and the level of performance with the RTX 4070. Since RTX 4060 and TI aren't great and that the 4080/4090 are so expensive, the 4070 could be the best option for most people. The fact that it can perform as well as the 3080 is a very good thing too. But then again, coming from a GTX 1070, I guess I'm easily impressed :D
Maybe at me, small content creator who wants GPU to work without problems while playing and recording like I had with AMD, but the price is acceptable if I do some overtime at work.
Man... i'm so happy i changed from my 1070 to a 6800 a couple of months ago. Now i don't have to deal with all that for at least 2 generations. By then all the new FABs around the world will be working and and the rippoffs will flatten out - hopefully...
I hate them adding more models (and therfor confusion) to the mix but part of me hopes they do add a super in there, like Jay pointed out.. I think I'm aiming to do an efficiency build next (lower tdp over highest performance), as the cards are all capable of what I want, for the most part but having a mid ground between a 4070 and 4070ti would be ideal for me, I think.
I still use my 1080ti and it is still sufficient for my 1440p ultrawide monitor. Of course I can't play on the maximum graphics setting, but current titles still look extremely good on medium. I hope that the graphics cards will be cheaper again at some point.
@@naapsuvaimne740 truth. I'd love to upgrade my 1070 but with the way pricing is it's not worth it to me when nothing interesting has been releasing anyway.
It apparently is difficult to impossible to change 4070 clock speeds, which would suggest that your theory of it originally being a 4070 Ti likely holds true. Also, a very nice presentation. 👍
Bought a 1070 in spring 2017 for around $450. Amazing value for the price, and just a year after it first launched. Up until ray-tracing made its way into games, I could run ANYTHING maxed out at 1440p or even 4K. Even Cyberpunk was playable at almost maxed-out non-raytraced settings, with two or three settings bumped down one notch below maximum. It's only this year that games like Hogwarts and Last of Us have finally beaten the heck out of my long-time companion. I didn't upgrade to the 2070 because it seemed a bit too expensive and ray-tracing seemed to make every 20-series card chug except the 2090. I didn't upgrade to the 3070 because it was even more expensive and it came out during the GPU shortage. And now I'm not going to upgrade to the 4070 because it's a damn scam - the equivalent of a 1060, labeled as a 1070, and sold at a 1080 TI price point. Go kick rocks, NVIDIA. I have no interest in blowing a huge chunk of my savings on your ripoff card just so I can brute force my way through the ridiculously unoptimized games coming out these days. I'm going to keep my 1070 and explore my huge backlog of great games that are already out. Try not to screw up again with the 5070.
Historically speaking an _eh_ 70-class card matches the previous top consumer card, often surpassing it. It typically beats the 2nd fastes card from the previous gen, or the 3090 in this case, by 25% in a great year, 15% in a poor one, and around 20% in aggregate. Instead, at best, this matches the 3080 10GB, or the *5th* card in Nvidia's Ampere consumer stack while increasing the price by $100 USD. Now you can counter by saying _"But, those other cards the previous 70's matched were only $600-$720 USD, the new top end consumer cards at $1600! The ones they beat were only $500-$700."_ at which point I'd say _"Don't really care because those 70 class cards were $275-$380."_ When was the last time a 70 class card came out with the exact same number of CUDA cores? It's on a better, more dense node, literally having the ability to double transistor density and we get...58xx CUDA? For an extra $100 bucks? Absurd. We have gone essentially nowhere in resolution improvements over the last 8 years, having cards that could game pretty well in games from their era at 4K way back in 2015, with 2016 being the real breakout year for 4K. Nvidia says _"Well our stats show that only 29% of people use 4K displays so having a 60-class BUS on a 70-class card works fine since the card is coming in t the low, low price of $600 USD which is nothing when you look at the $1600 needed for strong 4K gaming!"_ leading to exactly why we are stagnating. Because _most_ people can't afford to buy cards that are capable of decent 5K+ gaming. How long has 1080P been the standard on PC now, with over 60% of PC gamers still using it daily? But 4K displays have been over 50% of the console market since 2016. Well that certainly flipped. I understand, the chips themselves are more expensive, but shipping costs are down 75% since the height of the _pandemic._ GDDR6X 24GBs is coming in under $4/GB, and is less expensive than GDDR6 back in 2020/2021. GDDR6X 21GBs is literally half the price of standard GDDR6 modules from 2020. This is easily verifiable anywhere you can buy VRAM in bulk (heck, AliExpress for instance, you don't even need to go to a huge wholesaler to se the vast pricing difference). Unless they are using 15 layer PCB's or something ridiculous, the cost on 7-9 layers is also down. Check any custom PCB manufacturer and you can basically build your own 4090 board minus the Nvidia Die for $30 bucks. Printed circuitry and all. And that's 1 unit. If silicon were really that much more expensive when everything was taken into account; design, masking, cost of the actual wafer, yields and defect rates etc, why haven't CPU's doubled in price over the last 6 years? I get it, we buy budget cars at a 100% markup over cost. We buy our Nike T-shirts and sneakers at a 500%+ markup. We buy consoles at a -5% to 5% markup the first couple of...wait, bad example. I also understand real life inflation. I get the CPI index, but electronics are not eggs. They aren't ground beef, chicken or milk (though it's obvious that the 4070/Ti, much like the 3070/Ti are going to age like the above mentioned dairy product due to the lack of VRAM for their target resolution along with the BUS width of a 60-class card, which the 4070 essentially is, down to the actual size of the die. 294mm vs 285mm on the 3060 with that speedy 192bit BUS). Inflation has hit electronics far different than things like food, gas and shelter when it comes to cost to produce. The only thing up is the margins on units sold. _"Then why are Nvidia's margins down to 22%?"_ Because their assets have been sitting at 98% for the last 5 months. They normally sit at 30% of their $660 Billion market cap. Anyone who understands how to read a companies basic financials will tell you that Nvidia is sitting on somewhere between $400 and $550 Billion in unsold merchandise. Jensen had miners to rely on, then saved the stocks from falling off the cliff they should have when Ethereum died and profits without mining were front and center again, but he got the investors all glassy eyed when he said _"AI. We are an AI software company. AI. AI..."_ He then went on to talk about Hopper and it being 400x faster at AI design than last gens pro cards and people hooted and hollered. Except Microsoft doesn't need 100K Hopper chips right now. Google, Amazon and Apple all have their own. Facebook/Meta should have their own soon. Tesla is buying 10K though, so that's something. Now if they can just find another customer to buy $35 Billion in chips by the end of Q2 and they'll be caught up for the year. If 70% of that is profit. Something has to give. When ASP goes through the roof but profits are down 70% year on year, how can anyone see that as a good thing? They'll be fine at a 22% gross margin, bu they should be so much higher due to the amount of datacenter and professional products they do sell. If they cut their GPU margins from 50%-100% down to 10%-50% they would start selling cards like nuts again. It's not that people don't want GPUs, to teh contrary, demand is probably about as high as it has ever been for non-professional use. People just aren't seeing a value there. Whereas $800 USD would always get you the best of the best, over a 20 year+ span, that then shot up to $1200, then $1700, and now back to $1600 (though the 4090 obviously is not the best consumer chip they could make, and is a substantial price hike over the 3090)...a lot of people who normally wouldn't be $700+ dollar purchasers were able to save enough to be able to afford what they thought would be a great card. A generational leap over the already strong Ampere cards, and this nonsense happened. So many, if not most of those people chose not to buy cards. Then you have the lower-high end buyers who expected a $500 4070 that beat a 3090 by 10-20% without DLSS 3 while having at least 60% of the VRAM. That obviously didn't happen. What do 60 class buyers have to look forward to? This *_"Pay more, get less performance day 1 and don't expect the card to remain viable as long as previous generations at their target res either"_* isn't exactly inspiring much confidence in giving companies our money. If I was not so into modding/mods in the 90% single player games I spend the majority of my PC time on, I'd have switched to a console 2 years ago. It doesn't matter if I can _afford_ a 4090 or not, it's whether or not I feel like I'm getting screwed. I can afford a $500 hamburger, even if it is the best burger on the planet with chili-cheese fries that make a Tex-Mex joint in El' Paso feel seem like a burger chain, chances are most of us are still sticking to Red Robin or Krystal Burgers in a pinch. If they all of a sudden doubled their prices, and they would actually be warranted to because of inflation regarding food, rent/leases going through the roof etc, they'd lose a ton of business even though it is justified. With GPUs it simply isn't. Pft on Nvidia, and I typically don't root for *people* to lose money, but with around 8 or 9 Billion USD of Jensen's net worth tied up in Nvidia shares, well I'd have no problems seeing his net worth cut in half as for all of their innovations, the main reason PC gaming is not progressing like it had been for 20 fricken years is almost all on Nvidia. Up through Pascal everything from resolutions to the quality of graphics continued on a steady upward trajectory. Now we are at a point where we can just continuously slot the new cards into the old lineups and see next to no price to performance gains except with the 4090, and that simply isn't acceptable. Not to me at least, and is a great way to kill an industry along with the sub-economy that has sprung up around it.
Honestly going to keep hold of my 1080ti .. it's performing well enough and isn't costing me anymore money. Was kinda hoping the 4070 would be the one to finally replace it but I just don't feel there yet. I *may* look at the used market tho since I heard that's where the real price drops are happening.
Yeah but you are missing very sweet stuff like RT + DLSS. Go used man, I recently bought 2060 Super (upgrade from 1060 3gb, also used, 3 years and still kicking) and even though this card is I think on par or maybe less powerful than 1080ti on paper, once you turn on DLSS 2, you jump even in 1080p so much higher in fps it's not even funny. CP2077 (1080p) with RT on = 30fps. With RT on + DLSS = 70fps. Of course you don't have to play with RT, but once you try it you kinda never want to go back. Also this used card has Samsung memory so it can be very nicely overclocked to 2070 basically. The card was selling in it's prime for like 800$, I bought it for 170$ (still with warranty). So even if it went bad I can replace it. There are some good deals out there, just be aware of scammers or cards used for extensive mining (techpowerup bios check can help). Oh and with the money I saved from buying like 700$ new graphics card, I was able to upgrade my whole PC (CPU, mobo/RAM and PSU)
@@chemeister you mean wattage? as when it does use half the volts, it doesn't say anything without the current, so the wattage could be very similar (U*I=P)
@@venataciamoon2789 Because the RTX 4070 has 226mm^2 die size out of a 295mm^2 full chip, which is basically the RTX 3050 which has a 198mm^2 die size cut from a 276mm^2 chip. The RTX 4070 is literally the successor to the RTX 3050...
Sounds like we've reached the stage where 50 series cards are now going to be around 300-400 which is kind of depressing. I remember buying my 780TI for £380 way back in 2013
@Delorean911Turbo yea these humans really need a slap of reality and truly appreciate what all goes into computer components in general and all they have to do is spend the money and get pleasure like wow man
Great review and great presentation of the details. Personally, with so many PC games basically being poor ports of the console game, I may make my current 6950xt the last card I own and go back to a console in the future.
To be honest. I have a PS5. I just got a new PC. Which I am now gonna be $2,000 into it by the time its all said and done. And TBH. If you are JUST focused on gaming… and not into any other PC specific type shit. The PS5 is much simpler to use. Easy to just jump into 4k/60 games. I love the PS5. The advantage for PC is if you are into playing competitive shooters at 1080/1440 for High FPS. If you are into RPG’s ect…. The PS5 is the way to go
Given vram limits hampering RT performance on Geforce cards - the 4070 the 12 GB may not age well for higher resolutions with RT. The RT results of the 7900 XT at 13:13, 15:23, 15:49 are quite noteworthy imo.
This chart discussion format is much MUCH better than the previous "Throw all charts at your face first" format. I do have to ask, Jay, can we get an updated video on what some of the graphics settings are and what they do nowadays in more modern titles such as Cyberpunk, RDR2, etc? I'm sure there's many people that have no idea what some of those settings do in terms of performance and visuals (i.e. DLSS)
I bought my 3080 12gb strix for 699 off Newegg in September 2022 and I thought I would regret it since I assumed the 4070 was gonna be much faster and cheaper but now seeing these numbers I am very glad with my purchase and the strix looks amazing in my build
@@thomasb282 DLSS3, 4k120, lower temperature, lower power consumption, cheaper. I am a BFG gamer in an ITX box. It is even better without ray-tracing turning on. In my opinion, without RT, one might consider switching to AMD card
I would have bought the 4070 today had it not been for it only having 12GB of VRAM. That's close to the minimum req for new games it seems, or it will be the new minimum very soon, so now I'm only considering 16GB to start with, but will likely get a 20GB/24GB card when I can. Really wish Nvidia had moved to 16GB for all their 4070/4070ti cards like AMD does for their 6800 and higher cards from last gen.
I've said on a few reviews now that this feels like a $500-550 US card that is priced at $600. It's still not good enough value for me to bite. Personally, I've just been skipping buying newer games because I'm not going to get good performance out of them without spending more on a GPU than I'm willing to. I've actually been enjoying discovering older titles that still play well and/or titles that provide compelling gameplay without ridiculous hardware requirements (ie non-AAA FPS-style games). Still having fun! 🙂
I have been critical of your reviews, so its only fair to say the positive as well. I appreciate the time you take to explain the marketing hype and pricing strategies of these cards. As a hobbiestt PC builder, meaning I only build when my current PC gets old or crashes, the marketing/performance hype vs actual value/ cost per dollar when comparing current to previous models is awesome. Super informative. Thank you.
Not intrested at this price point & not purchasing a card with less than 12 GB VRAM. Which will >probably< be the 4060 series. I want a card that will last longer & 8GB cards are showing their limits today.
Why would you want a 40 series when you have a 30 series card anyway? Still rocking a 1080Ti and it's serving me absolutely fine, only recently started to even think about upgrading, I can still run all the modern games (albeit not at the highest graphic settings).
The 3080 is good for a few more years yet. I got mine to replace a 980, which was only just starting to struggle with VR. We have a 1060 in the house which the kids use and it's still a 60 fps machine at 1080p on almost anything we throw at it ( the 1060 was similar performance to the previous gen 980). Unless you're a 4K gamer and are desperate for the full RT experience, the old cards still hold up well. The lower power draw is impressive though, but not $600 impressive
@@crossfire4902 It's new tech, and I have never seen textures missing, or any added lag. DLSS 3 is amazing, getting 2x the framerate with no downside is great.
@@drunkhusband6257 its a fps gimmick generator to support those low end nvidia cards. if you're not going to buy a 4090 or 4080 then just get the 6950xt. real raw performance no gimmick at all.
I felt I overpaid in 2020 when I got an evga ftw 3060ti for $440. I typically stick with a midrange build but if I were to build something similar today it'd nearly $3k mostly because of the gpu.
I agree with what Jay has stated in the past, that if you don't care about ray tracing, then go AMD. That's why I picked up a Sapphire 7900XT, and I couldn't be happier.
and if you care about - then go to amd aswell :) If Last of us or Hogwarts already crash in Full HD at 8GB, the 4070 wont run them much more stable in higher resolutions. No RAM - no RT. Or stay at Full HD.
Reality is ray tracing makes very little difference and most of these Nvidia users switch it on a few times to admire the shine and then switch it off when actually playing games to double or quadruple the frame rate......
What's exciting about this card is the low power consumption and possibility of having short cards, making it the most powerful low power consumption sub 280mm long card currently available - good for people with weak power supplies and small cases. Hopefully AMD matches it with a better performing 7800 XT at a competitive price with their AIBs producing some short cards. The reference 7900 XT is under 280mm long I think, so there's hope.
I'm really impressed with the TGP of that card, considering that the 3070 was a 220W TGP (I have one) and I use it at 1080p with a power limit of 46%, and that thing uses at AAA Titles around 70-100W, conserving a good amount of FPS, but I'll have to update that card because NVIDIA put 8GB of VRAM, with RE4 Remake, I noticed that I wasn't able to put it at max settings due VRAM, because it was able to run that game at 75FPS (My monitor is 75Hz, so I don't care having more FPS) and the game crashed due VRAM, being at the max settings, because the GPU wasn't at it's max %, was about 70%, so I might have the 4070 in the sight if I find it at a good price and a model that I like
Basically from what I saw with other reviews, it is 3080 level, RX6800XT level in raster for Radeon. For 600USD it gets "better than rest of the lineup" award. However MSRP is not only thing one should look at with pricing, because MSRP is what it was and depending on where you live RX6800XT or maybe even slightly slower RX6800 could be decently cheaper. Sure you lose on raytracing performance but do gain with extra VRAM making card more durable. Also that is if price will stay 600USD. As for gap, I think it is mainly there because they saw outrage from gamers on leaks of it being slightly below 4070Ti in price, plus the fact that new cards are just selling poorly. So they had to make it cheaper and I am sure they wanted nice gap for upselling purposes. Still nevertheless, it is at least worthy of consideration. Though 12GB of VRAM is becoming bare bones minimum you want, since games like The Last Of Us, Hogrwarts Legacy, Resident Evil 4 remaster, Forspoken, Plague Tale Requiem,... already need you to lower settings to accommodate 8GB VRAM. And I can assume developers will eventually want to push things further. Sure it is not a problem today, but it also isn't exactly 500 or 400USD card where I would find it more acceptable to take bit of potential planned obsolescence. And P.S. As far as I am concerned, naming scheme kind of died a while ago. So personally I don't care what they call it, naming can be anything nVidia wants it to be or whatever they can make up. In the end, what matters is price. Naming in the end can be marketing tool once people start attaching meaning and can be used to manipulate people.
@@2leggedpirate265 We are talking 600USD cards, so dunno for you, but I am not paying 600USD to reduce settings like I got 1650, fir which you understandably lower settings since you paid considerably less. And 3060 12GB here is irrelevant since 12GB was more than 8GB last time I checked. Try something more comparable like 3060Ti or 3070 where you do need to compromise despite latter being 70 tier. Sure we are talking max/ultra settings, but especially 3070 wasn't exactly 300USD card and we are talking 1080p, not even 1440p.
I bought my GTX 1660 Super for around $200, just before the Corona hit back in 2020. It's now 3 years later, and it costs about $230 now, and there's no upgrades anywhere near that prices either. It's the first time in my lifetime, I could not upgrade my graphics card for about the same price after 3 years. We are living in "interesting" times, for sure - also known as "pain in the butt" times.
Can you imagine how good this card could have been if NVIDIA didn't shift the entire product stack, and it was the 4060 at $350 (as it should de based on sku)? Now that would have been properly impressive!
@@drunkhusband6257 that fucking blows man… prices all go to shit when i’m finally an adult with a job. i have to worry about bills and all that other BS. so sad
@@drunkhusband6257lmfaooo okay… try living in new york, you pay to breathe here. a majority of my money goes to savings, gas, food, car insurance, rent and etc. don’t come at me with that manage your money better shit
I jumped from a 1070 to a 3090. The entire 40 series is a hard pass right now for me personally. I'm going to wait for 5000 series to come out, and then purchase a 4090TI on sale.
@@Spherecaster no? 3070 came out in october 2020. and the card was hard to get, most ppl got that card mid 2021. stop lying and justyfing sketchy tactics from nvidia
@@ImotekhtheStormlord-tx2it uhuh, that's why the steam survey shows that over half of the userbase of both the 3070 and 3060 has only started using them in just the past month, also where am I justifying anything? I just said that most people are buying it now because it was a good but overpriced card then (mostly due to scalping and miners ) that is a great value now that 4k series are being sold for even more ludicrous prices
I would check out the RX6950XT if you're in the US cost the same but the RX6950XT is more powerful if you just want a gpu for just gameing and don't use the Nvidia professional software & hardware just go AMD you get more raw power at a cheaper price
@@MrAnony07 Thanks for the info! I have never used an AMD product before so my brand knowledge is zero. I'm not a brand loyalist so I don't mind trying something new.
@@MrAnony07 If a person cares at all about ray tracing, then they wouldn't go with AMD at this point. You can't even run CP2077's new Overdrive mode with anything from AMD.
@@ssreeser95 @Samuel Reeser yea if your budget is at the $600 mark then the RX6950XT is the best choice if you care about RT it'll be more powerful than a 3080 and slightly less powerful than a 3080ti which cost a whole lot more in the US and at that point just get the 4080 it cost about the same as the 3080ti that would be a smart choice but for raw gaming performance of the RX6000 & RTX3000 cards the RX6950xt is perfect at that $600 mark
I got a 2070 for $400 new back before everything went crazy. The 1:1 performance per dollar increases makes it really hard to justify moving to a new GPU. Lets hope the next generation intel GPU puts some pressure on both AMD and NVidia to adjust pricing.
Compared to previous generations, the MSRP is not at all appealing. But compared to the MSRP of everything else in this generation, then it looks like fantastic value for money. If it does actually release near its MSRP and available in the UK at £500, then it may be worth picking up. Especially compared to the £850 4070 Ti or £1200 4080, or even £700 3080s still available. Sadly I expect retailers will be scalping these from day 1.
@@johannliebert2870 Its not looking good. The exchange rate would suggest a UK retail price for £480 + some extra shipping costs would maybe mean £500 per unit. They're actually listed as £590+ for base models and £650 for overclocked AIB models. And no FE's available over here.
Based upon your charts and commentary, the 4070Ti TUF seems to me to be the best "bang for buck" over the 4070 FE. ie performance closer to the high-end cards at 1080p (which is mostly what I game at) but at a far better price point. Of course given that I'm (still) on a GeForce GTX 560Ti, *any* modern Nvidia card is going to be a major jump in video processing capability for me.
@@somerandomdudeOG Yea at 1080p a 3060 is more than enough in 99% of games. I'd only get a 70 or 80 series if you wanna play everthing at max settings + 1440p/4k
Scott, what platform are you using atm? I was using a 5GHz 2700K for the longest time with a 1080 Ti, upgraded to a B450/5600X, performance doubled for some games. In other words, even if you switched to something newer, but not the latest, the degree of gain may be curtailed if the base platform is sufficiently old (heavily depends on the game & settings though).
70 class is still a 70 class GPU. A reasonable price GPU that gets you high->ultra settings. 60 class is still 60 class. A cheap GPU that's powerful enough to play the latest games. The only difference is prices have gone up some. But that's more to blame on the economy than GPU manufacturers. Computer parts aren't the only thing getting more expensive.
I'm just not sure how much longer I can hold on to my GTX 1070. I think I could upgrade to the 4070 and keep using my 650 watt psu. That's saving 100 buck or more right there.
Same here, but at that price.... I'm considering an AMD card and will wait a while for their mid-range 7000 cards. 12 GB VRam just won't last very long. I had upgraded from a 960 4GB to a 1070 8GB before that. I'm expecting 16 GB VRam, not this lousy nickel-dimeing.
Until prices do not drop on this card, it's only worth looking at it if you build a completely new pc. An upgrade from a 10series at the moment is either AMD or 30series that has very nice offerings here and there.
Yeah same story, wonder if it would run on 550w corsair gold paired with 5600, since entire system usage was like 320w on the charts I saw, that would actually be quite impressive
I would love to see Jay run a test to see how these cards do against Star Citizen, especially in the starting zone of Orison cause in that area it can drop a lot of cards to their knees
I'm well past the point of caring about having the most powerful GPU. I make video games, not to produce the best possible graphics, but to make something fun and mentally stimulating. My GTX 1080 is definitely enough for that. I want people to be able to play my games on laptop graphics, old mid-range GPUs like a GTX 560, stuff like that. The more, the merrier.
I doubt you will see a lot of the $600 models leftover once they sell out. It's been pretty clear that board Partners were hoping for the 700 to 750 dollar price point, and I expect that's where a lot of board partner cards will end up.
@@Aefweard yea 6950xt is so good almost close to 4070 Ti in performance, only issue is that is 350W+ but u can easy undervolt without loosing much performance.
@@yeshuayahushua4338 it's pretty common to do in order to get like 2-3% less performance at like 10% less power, it just makes it a bit better in terms of performance per watt
I actually bought a 6700XT. It was on sale, on the shelf at Best Buy for $349. If the 4070 was $449 I might have bought that instead, but with real world prices climbing above $800 for a 4070 it's completely insane. Instead, the 3060Ti is still $500 for some reason. If the prices of the 30 series won't go down, nVidia really has no way to offer the 40 series at low prices.
800 dollars? I'll buy second hand upside downs and custom yokes for my motorcycle to upgrade the suspension from conventional to upside downs instead 😂
At this point, at least for me, 600 bucks is too much. I don’t game nearly as much as I used to. Will be looking for a deal on a 2070 or 3060 on the used market, but e all know they are still way too expensive, but maybe I will get lucky. 😂 thanks for the awesome video Jay! Keep up the awesome work. Curious, when was the last time you were excited for a GPU launch?
Dude get the a770. Its great and comes with 16gb vram. Or like dude said, a ps5 also comes with 16gb vram. Ive been building computers for 25 years. So sad nvidia stopped caring about gamers
@@andrewlockhart841 Yeah I hear that dude. I just don't want to pay an ai tax when I'm not going to use it. They could make a 4070ti without the ai cores for 300 bucks and shove their dlss where the sun don't shine you know what I mena lol. But no they're gooing to force gamers to pay that tax
At this point I have totally lost interest in the GPU market. The comparisons make no sense whatsoever: We're now comparing value at insane pricing versus new cards at equally insane pricing. Even if the new price is a smidge less insane, it still is insane. Still rocking my 1080Ti, which runs everything I play fine at 1440p. If that ever breaks, I will probably just make more use of my Steam deck.
This card should have 16GB of VRAM. I wish more reviewers covered these cards for content creators. 4070 should support AV1 encoding. Need to see those benchmarks.
Still waiting for the 4080ti. Just hoping my tired 1080ti can hold on for just a few more months. It's served me a good hard life for gaming and media server transcoding.
I've been a 70 guy for a while now, i upgraded from a 1070 to a 3070 this past holiday season. The reason I have been a 70 guy is because it was always a nice balance between price and performance for me. Now I'm not even a 70 guy anymore, this is despicable.
Edit: This gained more traction than I thought it would. I wanna clarify that seeing these prices go up is quite frustrating because with previous generations of nvidia cards the 60s and 70s were in the mid range where there was a good sweet spot for price to performance. Seeing these kinds of prices makes the consumer expect the same kind of price to performance ratio, just a higher pricetag. But especially with the newest 4070, this is no longer the case and the lower priced options aren't any better for value, just a lower price. It feels like there is no longer a great cost-efficient option from team green unless you want to miss out on higher vram options which somewhat makes your card outdated at launch.
If you go by pascal’s scale of cards the 1070 was amazing, if you put the performance ratios of the 40 series into the naming of the 10 series, the 4070 should be called a 4050 even though they gave it the price of what should be the 4080.
@@DashVandle Not 4050. I would go with 4060Ti at maximum. But yes - with 4080 price. In our country you can get new RX 6800 for 499USD or 2nd hand for 349USD. Today I slightly regret that I didn't wait longer and bought (2nd hand) 6700XT. I'd rather have the 6800 16GB card. But even the 6700XT is pretty good for 2K gaming. Nitro+, undervolted and quiet as a mouse.
RTX 4070 is perfect card for today's 2K gaming, but not for 599USD. Should have been at least 100USD cheaper.
Join us 60 an 60ti guys we're still afloat
I’m a “x80” guy. Luckily I’m also an “every other gen” guy, so the 8800XT (lol) should be pretty good
Why do people keep grouping themselves by model numbers... If you used to be an 80s then get the 70 since they are about equal. All this stuff is just a mess... People need to quit just looking at model numbers and compare the performance they are getting for the price instead.
Hey this was really helpful. I like this format. The chart-by-chart discussion was perfect. Wasn’t too rushed, but wasn’t too long on each slide.
You guys like this but he didn't have time to do the AMD cards? Sounds fishy
the content doesn´t match the title though, i got scammed
agreed
I've never been a high-end build guy, always budget. But this next build was supposed to be the one were I finally went high-end (Zen 4 + ~4070). And then the last few years happened where prices for everything went crazy. I was hoping things would return to normal, but I hate the idea of spending more money for something than I would normally have to. I am seriously considering just sticking to low-midrange for my next build now.
yep I feel like the best bang for your buck atm is still a used 3060 12gb if you are still having something older. I still play on my 1070ti 8gb and I really dont know what to upgrade to, I actually want atleast 16gb for my next card but the prices are insane.. I am actually looking at the Intel Arc A770 because it has 16gb for only 400 Bucks which is unmatched, but they still seem to have driver issues with dx11 and dx9 if I heard correctly. Ill probably still keep the waiting game and see what happens, but if things dont get better I have to make a decision at some point...
Depending what you're looking at, your price range if you want to go high end, and where you live consider amd's graphic card offerings. Last gen's Radeon 6950xtx is punching well above the 4070's weight class at the same price (in some regions in my part of the US it's the same or cheaper) and more vram plus the driver support has been great. Through the communities dedicated support for upgrading outdated cards. It will likely have a long life span. The 7900xt and xtx both punch equal if not well above the 4080 for pricing similar to the 4070ti's price as well as as having an insane amount vram. Just do be aware of the risk of those two card right now as some do have physical issues but if you get an ok bored from them the driver support should keep it relevant for a long time to come.
Unless you feel like you need path tracing specifically, amd is in a good spot right now surprisingly. And honestly I don't think path tracing is gonna be decent or widely supported until another architect upgrade possible 2. I'd say consider used but considering the miner boom there's no way of knowing how heavily over used and under cooled some cards were versus won't. So I can't in good conscious recommend used this time around. And it feels really weird to say that.
stick with budget builds. that's what I've always done, and I've done many builds in my time (since the 1990s). you don't need some GPU that will cost you as much as a complete system should cost you unless you want to play games that were released the last year at very high framerates or at very high settings.
charts made by reviewers tend to use pointlessly high settings. just use "high" or "medium" and you'll be able to play most games just fine even if your GPU is "budget".
Look into an 6800xt. Amazing 1440p ultra card, amazing price to performance!
Buy ARC! If not this gen, then hold off until the next generation of ARC, which - if driver updates continue to see 3-5% improvements to performance EVERY MONTH it's possible that even the second generation ARC GPU will be swinging up with AMD, if not Nvidia!
Here we go again with companies focusing on higher end cards instead of more budget friendly ones. Money money moneyyyy
Funny with the economy the way it is you'd think they'd go for lower budget cards that normal people could shop for and that still compete since they have higher end cards already for the enthusiasts or people who have the money to buy higher end. Guess if your an average perrson who just likes gaming your assed out unless you save for a year lol
I, personally, would not call this a 'higher end'. Mid tier maybe.
But it is the new high, mid, low era we're in... it ain't gonna change when they keep selling these cards.
…must be funny! In the rich man’s world!
Ahaaaaaaiaaaah!
If I'm not wrong for 30 and 40 series they've started with the 90 series and worked their way back, it's not necessarily them focusing on the "high end" gpus it's just what they planned to be release next and shouldn't be unexpected
I hope they collect dust on the shelves like the 4080. I'm hanging onto my 6950xt for now.
dang,, I bought a 4080. 😭😭.
making me feel like a dumby
I bought a 4070fe. Honestly if I’m going to get similar performance of a 3080 for 150w less I’ll take it. I have a 3080 now and it runs 15-20c hotter and only gives maybe 10% better performance on certain titles? If you have the money, this ain’t bad and you can splurge on a higher end card later lol
@@Wheel-os I originally bought a 4070 ti, and it was way more than I needed tbh. I don't play games, just surf the web and try and mess around in fruity loops.
but running msi kombustor and watching the fps 😂😂🤣🤣. it's like a drag race, you always wanna beat your score somehow.
so I decided to do a whole new build for a 2nd computer,
all white td500 mesh
13700k
4080 aero white
Asia horse and cablemod cables... white
280mm white nzxt x63
there's some black on the mobo and psu that give a decent contrast for the casing.
I'm terribly new to cable management though 😩. everything is running but I really wish I bought a bigger case to start with. and could get my wire runs a lot neater.
With 6950xt, you can still afford wait for a while until nvidia crashes and burns on their current catalog pricing..
I don’t know why I want a good gpu I don’t even play high performance demanding games.
70 was always the sweet spot for me, I always felt it was the right balance between price and power. And I know that's how many people were feeling as well. They really s**t the bed with that one, and Nvidia appears less and less like a good buy. It looks more and more like I'll be looking into an AMD card to upgrade my old 1070, which WAS a great card.
Which card you thinking? I need a new gpu before counter strike 2 comes put. Just built a 13700k build but my 1070 remains in it due to my inability to figure out what to buy. Just seems like an awful time to need a new gpu.
@@hobosnake1 Same spot here with the exception that I have 6900k instead of 13700k. I'm thinking maybe either rx 6950 xt or rx 7900 xt for longer term. Originally looked at RTX 4070 and 4070 ti, but they seem to be on drugs.
@Dezmont I was thinking the 6950xt as well. I just have never used amd. I don't know in practice what the differences will be. I like editing video and recording gameplay in CS, I need to research the differences a bit more before I pull the trigger, but the 6950xt seems like a better idea if you're like me and prefer not to swap out hardware until a complete new build is in order.
6950xt is 599 at microcenter, still. Way cheaper than a 4070 and a bit faster. Plus more VRAM, though 12gb is probably not as bad as 8gb like 3070.
@@hobosnake1 I have a 6800xt, (used to have a 3070 but sold it for a profit) so I can tell you this: the H264 encoder is bad, but the HEVC (265) works fine, and streaming is waaaay better than geforce experience bloatware. Open source everything is a nice touch, so if you use linux on the side, it's a no brainer, just plug and play. If you get the 7900xt, you will have AV1 encoding however, if that matter to you. I think you can get it for $700 open box, not sure.
Edit: 7900xt $680 open box.
Thank you for explaining the charts, making them understandable AND showcasing the items in comparison with a little white overlay on top to help us put our eyes on the things you're talking about.
I would still recommend doing the results reversed. I understand you put above as the resolution went up, but seeing as the values are lower, I'd say lower resolution on top would be better.
Thanks for the suggestion, Ill play with the charts a bit.
contrasting colours would also help visually
I also liked this format much more than with some (rock) music over it.
Truthfully, I'm still in 10 series. Been watching all these cards, how they are doing and watching price points. I would love some ray tracing, but I'm not gonna die without it. I think once evga backed out of the market I ended up dropping my plans to upgrade. Now I just keep watching and waiting to see something that catches my eye. But really, even at 1440p my 1080ti still does what I want it to for the most part. I would love to updlgrade at some point but the pricing has been pretty nuts for awhile.
Agreed. My 1080 still plays every game I own at high to maxed out completely at 1080p which is what I game at (For now, until something compelling comes along. Wouldn't mind a 3080ti) but I'm just not happy with the prices for the 40 series, and seeing how they fucked the 70 line, up I don't even want one of those now either. I'm usually an 80 or higher, but man. $1200 for a 4080 is slightly more than double what I paid for my 1080
@@artvandelay9131 I just sold my 1080 with 150 euros. Masterful product imo. I bought it new for 450€. I updated to used 3080 and paid 600 of it. I don't play a lot but I guess I can go few years with this..
I'm the same. I'm just willing to get a 3080ti because my 1080ti shows its age when I plug it into a 4K tv. But the price is still holding me back. I just don't get why people simply hit the buy button and that is it. Don't they think it sends a message that those prices are ok?
@@BrunoRodrigoPintodaSilva As someone that has a 4080, I just want to play games? I work hard, make good ass money at my job, want to come home to a peak gaming experience of 100 fps 1440p ultrawide max settings. I'm not gonna not buy something I want and can afford just to send some company a message that isn't going to be heard anyway. It's fucking video games, it's not that serious. Sorry, I guess? lol
@@ForeverMasterless I really get you point of view. It just gets on my nerves that we don't get what we feel we should get. When you buy a car with a higher power/mass ratio you expect it to be more powerful, in the end it's just a car. OK, you can buy a 100k USD car that is focused on building materials and not power itself, but you know what you will get. What do you want with a VGA? I guess we all want our fps, and it seems we are paying for god know what, because the price is going up much faster than graphics and fps. I really hope you get what I wanted to express. Anyway, great weekend for all of us.
These cards just make me glad that I switched over to AMD. Never used AMD before, first time buying an RX 6700 for $330 and I think the value proposition on AMD is just insane compared to Nvidia. I can see how ray tracing and certain productivity software could push people towards Nvidia, but overall I think AMD takes the cake for value gaming cards.
Same here... Man, Nvidia is great, but AMD is for gamers.Nvidias main target are the professionals that need Nvidia productivity software and enthusiasts.. if you're not one of them, THEN A M D IS F O R Y O U .
I nearly went for Nvidia but stuck to AMD, I was AMD back in early 2000s and glad I stuck by them
i'd also keep a eye on Intel's graphic cards, they got a 2nd generation in the works that hopefully is them punching through and comparing well to AMD/Nvidia cards.
4090: 20 GB VRAM, $1600
7900 XTX: 24 GB VRAM, $1000
Need I say more. Sure the 4090 is faster, but for $600 more, on what planet is that worth it. And less VR so overtime it will be worse than the XTX? What a joke
@@thisislame2207 Intel is awesome for sure! Their drivers have been maturing slowly. I think the only place they fall behind is legacy game support. DX9 and earlier titles especially have very poor performance with Intel GPUs according to almost every reviewer on UA-cam.
Thanks Jay and thanks for continuing to hammer Nvidia on their pricing. Basically I am done with them at this point until they come back to their senses. Every video card that I have bought since 2006 has been an Nvidia based product but that will end when I swap out my 1070 Ti. I will either go back to Radeon, or the 2nd generation Intel ARC cards after they release if I decide to switch platforms. If people decide to cave in and buy 40 series cards from Nvidia at the prices they are charging, they can expect to see price increases every generation that go well beyond the rate of inflation. To those folks, I wish you luck.
Agree with almost everything here. I'd amend the last sentence though: "To those folks, you're stupid cunts."
you are subsidizing an amazing journey of AI medicine which will cure cancer and save lifes🤣so that is great
I agree wholeheartedly. I have a 3080 and it's the last NVidia card that I will buy. I will definitely go with an AMD GPU on my next build in a few years. Intel's ARC cards are picking up some steam with their driver updates
GPU prices, and those from Nvidia in particular, won't go down as long as people keep buying their products. And they do. I don't know who, spending more than $300 on a GPU - and that price used to give you a high end GPU - seems ludicrous to me.
In case anyone's wondering for some reason, I'm still using a RX 570. I've thought about getting a RX 6600, but I'm not convinced it's worth buying AMD either since they seem to have a mindset of "we'll just put stuff out there that's barely got a better price/performance ratio than nvidia since consumers have the choice of buying that or nvidias slightly more overpriced cards"
Just built a new super rig around my 1660 specifically with the intent to build around it to support up to most of the RTX series GPUs, and didn't even have to debate with myself about whether I remain loyal to Nvidia like I have for well over a decade now. I went back to Radeon and while I'm excited for the change, I'm still slightly bummed that it came to a point where Nvidia has gotten so delusional with their pricing that I bought a competitor card without even a second thought. All because it posts the same or better specs as some of the RTX cards at like... I think I paid almost $1,000 less 🙄
This is how hardware review videos should be done. Quick answers in the title for those of us with little/short attention spans, lots of details in the video for those who are actually concerned about that kind of stuff. Both are perfectly fine, but honestly I'm only really interested in if I'm getting a good product or not at the price it's being offered.
Man, who knew that buying a 2070 Super 3,5 years ago would be such an amazing financial decision. With prices going insane on GPUs right after and to this day, I'm probably gonna stick with it for some years to come.
Same. Got a 2080s which was a "bad value" at the time.... lol
@@thegobstopper323 still rocking my 1080
still using my 2060s
Same, 2070 Super still chugging away and I have no plans to upgrade
It’s a good card, I have the FE sitting on a shelf on display. Nicest card, build wise, I’ve ever had.
We NEED more of this format, Jay. I think it works better for everyone involved and I gotta say, I've been watching your videos for a long time and this discussion was what I really needed to know If i should go out and buy or keep sitting on my 1660 Ti and wait. Thank you so much!
agreed much prefer this over the 20 minutes of charts and music
I agree price makes a big difference on whether or not to be happy with performance versus excited about it's performance
Price per performance is basically on the level of previous generation which means 4+ years of stagnation altogether... At least the power consumption is lower thanks to move from Samsung to TSMC.
eh im more annoyed by the fact that it doesnt perform as good as last gen top tier. as it should do.
@@Pand0rasAct0r_ Yes, but all cards should now have at least 12gb so this card is, in reality, what should have been the 4060 (or 4060ti).
@@generalawareness101 nah, it just should have had more vram. A 4060 performing this well would have been pretty crazy.
@@Blissy1175 Not for a 4060ti they are planning on releasing with effing 8gb. DOA.
Great breakdown of the charts! Clear and informative, with fantastic comparison notes. Thanks for doing this for us all, Jay! Much prefer this over the rock music shotgun charts with discussion after.
I loved this format where the graphs were commentated. Great job Jay and Phil. Do this again!
Yes, but perhaps next time try it with the rocker music at low volume behind it!
@@HFRG-zq1qm Haha. To be fair, I have always enjoyed the rocker music.
@@seth4321 So have I, so give us the best of both!
I feel like you'd be better off looking to get a 6950xt, considering that they're going for about the same price point and perform typically better as long as Ray Tracing isn't involved.
On another note, I'm a big fan of the new way to go over these charts. Talking over them and explaining them with highlights like Gamers Nexus does is a solid move. It feels much easier to follow along!
i prefer the 6900xt over the 6950xt only because the 6950 tends uses a lot more watts for pretty much no difference in performance and its usually about the same price.
Ray tracing is getting more important, and you're losing out on NVIDIA features there. DLSS is just superior to FSR, and NVIDIA has graphic upscaling for videos and livestreams. Forget the branding for it but its **amazing**. Streams look freaking amazing with it turned on. If you're willing to pay for the power the card is using.
@@deathhimself1653 12GB vram not enough, especially with RT on. Those ray acceleration structures need memory too. There will be games where this card loses in RT benchmarks because it doesn't have enough memory for RT mode.
2.5x power consumption difference. Stop saying ''the same price point'', unless you're not the one paying the electrical bill.
@@SomeFrenchDude not to mention turning your room into a sauna lmao. It it’s exactly why I think the 6900xt is better than the 6950xt, way less power consumption with basically same performance.
I'm hanging on to what I've got for now. $600 for a "70" series card is too rich. Pascal was the pinnacle IMO, when you could get an "80" series card for $600 and sip the same amount of power as the 4070 many years ago. The pricing of these things are ridiculous coming from a company that reported record revenue and claiming that its to offset the development cost of these cards, and that they treat board partners so badly that one of the best ones on the market pulls out of GPUs altogether.
Pulled trigger on XFX 7900XT because their 5700XT was so good to me for so long - Very happy with it
RADEON! 👏RADEON! 👏 RADEON! 👏
niceee i got a 6700xt recently prices arent to bad!
Don't have to worry about running out of vram anytime soon
I would stick with team red but their video encoders are quite shit compared to team green. When using a VR headset like a Quest or a Pico it's noticeably worse. Wish AMD would get that straight before the end of the century.
I have the Sapphire Nitro 7900XT one... This card is a beast! I have beat with them recently the spytime extreme world record (with fan cooling, no liquid nitrogen or so).
Going to echo what everyone else has said-I really dig this format and the explanations slide by slide. Makes it much easier to digest a lot of numbers. I’m using a 2070 Super in my current rig and have been trying to figure out what to upgrade to. The power efficiency of this card would make it perfect for a small form factor build, but like you said… it’s difficult to stomach at this price point. I’ll wait for now to see if I can get one on sale later in the year.
ditto mate same card to.. was hoping to be a convert to AMD but they missed the opportunity
Enjoying the 1080Ti box lurking in the background watching over everything like the stacked grandpa who could still kick your ass.
I like this way of talking about results more then just showing performance flash cards. I'd love to see this format more often
I must say, the iFixit promo bit is my favorite, World of Warships is a very close second place. The informative and very helpful episode? Great, as always, of course! Thanks Jay
Jaaaaaaaayyyyyyyyyy
The iFixit promo makes me thank God for sponsorblock
the iFixit stuff is always my favorite opener to any youtube video ever these days, always makes me smile, and I'm pretty sure I'm gonna end up buying a set or two anyways because it's so good
the world of warships promo is WAY better, just because we get to see Jay in full admiral dress uniform which looks dope as hell
@@iamamish WHAT!?!? I'M PLAYING WORLD OF WARSHIPS!!!
THANK YOU for mentioning how the 80ti used to be the top and the new 70 series used to beat the old 80ti! And that we used to get that TOP card for this price! It's insane!
We got great price to performance uplifts with new shifts in architecture culminating in the amazing value of Pascal and the amazing value of the 30-series cards. But the generation after those always to seems horrible. 20-series, now 40-series has bother been super anti-consumer priced cards that were out of touch with all but enthusiasts.
Been on the fence with what to buy for my new system in a few months and after this I am 100% going with an XT or XTX depending on sale prices at the time.
Just wait a little bit and if you want 4k gaming atleast pick the card with 16 gb vram atleast
Yeah I'm thinking the same. I don't upgrade yearly so I agree with getting the most vram you can get
The XTX is a much better value than the XT if they're going by msrp
@@chilpeeps 4k is trash, ultrawide or nothing.
@@drunkhusband6257 on big oled 4k is necessary.
Jay, you should include a game of the RE engine (maybe 4 remake) in your suite. They are great at showing the vram bottlenecks, like the 3060 totally destroying the 3070 because of it. So it would be interesting to see the 4070 12gigs vs the original 3080 10gigs.
Great video as always, but I do have a question about your results. Why do you only include the 7900XTX in one chart but then proceed with the 7900XT in the remainder?
prolly a typo?
Glad i opted for a used 3090 after the 40 series launched. No way im paying into the overpriced market for new cards. My 3090 will be good for years hopefully (i upgraded from a 1080ti which i have given to a friend).
Yea same. I just a 3090 a few months ago for dumb cheap, less than $800 lol
What PSU do you have?
I did the same bought white strix oc version of rtx 3090 for 740eur even had the ring with and I am super happy with it
Did the same thing just last month. Found a TUF 3090 that was less than a third of the price of a 4090 in mint condition.
@@BlessedNoobwhy do that when there 4070 ti is less than 800 and is cheaper to run and outperforms the 3090 ti
Im keeping my 1070. It still does 90% of what I want and cost a ton less than what we have now.
Ive been wanting to upgrade the last few generations, but the cards just cost stupid amounts now.
Still using my 1070 as well. I'm so surprised how well it plays some games, Cyberpunk 2077 is running great for me. Granted if that game didn't use FSR 2.0 I probably wouldn't be having such a good time.
@@TheInternetwatcher In the same boat here, price to performance it doesn't make sense to jump ship when most of my games run ultra 1080p no issue
As far as I know, this is the first 2-slot card (40mm thick) from the NVIDIA 4xxx series. Might be worth mentioning, this is ideal for some unique builds and small cases.
Love the look of the new chats (colors, layout, etc). Might suggest using red/green/blue colors for the card name so it's easier to see what is AMD/Nvidia/Intel); or an icon. Also like the explanation (rock music was never my fav - prob because I'm do dumb/lazy to interpret them fully). I personally don't mind the 4K on top, 1080p on bottom.
Description: "RTX 4070 is here, but should you care?"
Tbh I dont really care, I just love your chaotically wholesome vibe and commentary
5:28 780 Ti was $700, and 3 months later the Titan Black edition came out which was 20% better, but it was $1000
Thank you for the charts and explanations! There are so many products on the stack. I get lost on what’s good and bad
big number good, small number bad, duh
3:30 no actually the new xx70 cards have usually beat (or been very close to) the 3080 Ti or 3090. It's the xx60 cards that used to beat or compete with the previous generation xx80 cards. This 4070 performs the way one would have expected the 4060 to perform. Check techpowerup's GPU comparison.
Yea lots of people seem to have forgotten this.
complete scam indeed, i mean it's been how many years and you get a 3080 for 100 bucks less and less power? sure it's improvement but this is way less than what it should be
All these reviewers seem to attempt to soften the reality....
Well, I have to say that I'm impressed by the efficiency and the level of performance with the RTX 4070. Since RTX 4060 and TI aren't great and that the 4080/4090 are so expensive, the 4070 could be the best option for most people. The fact that it can perform as well as the 3080 is a very good thing too. But then again, coming from a GTX 1070, I guess I'm easily impressed :D
Who is this card even aimed for? 6800xt performs better, has more vram and launched late 2020 (over 2 years ago) at 650$.
Maybe at me, small content creator who wants GPU to work without problems while playing and recording like I had with AMD, but the price is acceptable if I do some overtime at work.
@@Nub85204 I know, that why I won't buy it new from the store, there's always some other way, I've never bought one new.
Man... i'm so happy i changed from my 1070 to a 6800 a couple of months ago. Now i don't have to deal with all that for at least 2 generations. By then all the new FABs around the world will be working and and the rippoffs will flatten out - hopefully...
I hate them adding more models (and therfor confusion) to the mix but part of me hopes they do add a super in there, like Jay pointed out.. I think I'm aiming to do an efficiency build next (lower tdp over highest performance), as the cards are all capable of what I want, for the most part but having a mid ground between a 4070 and 4070ti would be ideal for me, I think.
Hooked on the nvidia train? Why the are abusing all you.
I still use my 1080ti and it is still sufficient for my 1440p ultrawide monitor. Of course I can't play on the maximum graphics setting, but current titles still look extremely good on medium. I hope that the graphics cards will be cheaper again at some point.
90-95%of new games are boring
@@naapsuvaimne740 truth. I'd love to upgrade my 1070 but with the way pricing is it's not worth it to me when nothing interesting has been releasing anyway.
Outrage isn’t useful. Not buying Nvidia cards is. And that’s not happening.
Still loving my MSI 1080Ti, holds up really well ^^
It apparently is difficult to impossible to change 4070 clock speeds, which would suggest that your theory of it originally being a 4070 Ti likely holds true.
Also, a very nice presentation. 👍
Bought a 1070 in spring 2017 for around $450. Amazing value for the price, and just a year after it first launched. Up until ray-tracing made its way into games, I could run ANYTHING maxed out at 1440p or even 4K. Even Cyberpunk was playable at almost maxed-out non-raytraced settings, with two or three settings bumped down one notch below maximum. It's only this year that games like Hogwarts and Last of Us have finally beaten the heck out of my long-time companion.
I didn't upgrade to the 2070 because it seemed a bit too expensive and ray-tracing seemed to make every 20-series card chug except the 2090. I didn't upgrade to the 3070 because it was even more expensive and it came out during the GPU shortage. And now I'm not going to upgrade to the 4070 because it's a damn scam - the equivalent of a 1060, labeled as a 1070, and sold at a 1080 TI price point. Go kick rocks, NVIDIA. I have no interest in blowing a huge chunk of my savings on your ripoff card just so I can brute force my way through the ridiculously unoptimized games coming out these days. I'm going to keep my 1070 and explore my huge backlog of great games that are already out. Try not to screw up again with the 5070.
Still loving my RX 6800 16GB VRAM ♥ Thank you aunt AMD ♥
Still rocking mine too even though it's nearly 6 months old
@@Nub85204 when i got 6800 i sold my old 1060 for a little less than what i paid for it new
''Aunt AMD'', these posts are so cringe.
I like how Jay talked through the results instead of just slides of the fps
Historically speaking an _eh_ 70-class card matches the previous top consumer card, often surpassing it. It typically beats the 2nd fastes card from the previous gen, or the 3090 in this case, by 25% in a great year, 15% in a poor one, and around 20% in aggregate. Instead, at best, this matches the 3080 10GB, or the *5th* card in Nvidia's Ampere consumer stack while increasing the price by $100 USD.
Now you can counter by saying _"But, those other cards the previous 70's matched were only $600-$720 USD, the new top end consumer cards at $1600! The ones they beat were only $500-$700."_ at which point I'd say _"Don't really care because those 70 class cards were $275-$380."_ When was the last time a 70 class card came out with the exact same number of CUDA cores? It's on a better, more dense node, literally having the ability to double transistor density and we get...58xx CUDA? For an extra $100 bucks? Absurd.
We have gone essentially nowhere in resolution improvements over the last 8 years, having cards that could game pretty well in games from their era at 4K way back in 2015, with 2016 being the real breakout year for 4K. Nvidia says _"Well our stats show that only 29% of people use 4K displays so having a 60-class BUS on a 70-class card works fine since the card is coming in t the low, low price of $600 USD which is nothing when you look at the $1600 needed for strong 4K gaming!"_ leading to exactly why we are stagnating. Because _most_ people can't afford to buy cards that are capable of decent 5K+ gaming. How long has 1080P been the standard on PC now, with over 60% of PC gamers still using it daily? But 4K displays have been over 50% of the console market since 2016. Well that certainly flipped.
I understand, the chips themselves are more expensive, but shipping costs are down 75% since the height of the _pandemic._ GDDR6X 24GBs is coming in under $4/GB, and is less expensive than GDDR6 back in 2020/2021. GDDR6X 21GBs is literally half the price of standard GDDR6 modules from 2020. This is easily verifiable anywhere you can buy VRAM in bulk (heck, AliExpress for instance, you don't even need to go to a huge wholesaler to se the vast pricing difference). Unless they are using 15 layer PCB's or something ridiculous, the cost on 7-9 layers is also down. Check any custom PCB manufacturer and you can basically build your own 4090 board minus the Nvidia Die for $30 bucks. Printed circuitry and all. And that's 1 unit.
If silicon were really that much more expensive when everything was taken into account; design, masking, cost of the actual wafer, yields and defect rates etc, why haven't CPU's doubled in price over the last 6 years?
I get it, we buy budget cars at a 100% markup over cost. We buy our Nike T-shirts and sneakers at a 500%+ markup. We buy consoles at a -5% to 5% markup the first couple of...wait, bad example. I also understand real life inflation. I get the CPI index, but electronics are not eggs. They aren't ground beef, chicken or milk (though it's obvious that the 4070/Ti, much like the 3070/Ti are going to age like the above mentioned dairy product due to the lack of VRAM for their target resolution along with the BUS width of a 60-class card, which the 4070 essentially is, down to the actual size of the die. 294mm vs 285mm on the 3060 with that speedy 192bit BUS). Inflation has hit electronics far different than things like food, gas and shelter when it comes to cost to produce. The only thing up is the margins on units sold.
_"Then why are Nvidia's margins down to 22%?"_ Because their assets have been sitting at 98% for the last 5 months. They normally sit at 30% of their $660 Billion market cap. Anyone who understands how to read a companies basic financials will tell you that Nvidia is sitting on somewhere between $400 and $550 Billion in unsold merchandise. Jensen had miners to rely on, then saved the stocks from falling off the cliff they should have when Ethereum died and profits without mining were front and center again, but he got the investors all glassy eyed when he said _"AI. We are an AI software company. AI. AI..."_ He then went on to talk about Hopper and it being 400x faster at AI design than last gens pro cards and people hooted and hollered. Except Microsoft doesn't need 100K Hopper chips right now. Google, Amazon and Apple all have their own. Facebook/Meta should have their own soon. Tesla is buying 10K though, so that's something. Now if they can just find another customer to buy $35 Billion in chips by the end of Q2 and they'll be caught up for the year. If 70% of that is profit.
Something has to give. When ASP goes through the roof but profits are down 70% year on year, how can anyone see that as a good thing? They'll be fine at a 22% gross margin, bu they should be so much higher due to the amount of datacenter and professional products they do sell. If they cut their GPU margins from 50%-100% down to 10%-50% they would start selling cards like nuts again. It's not that people don't want GPUs, to teh contrary, demand is probably about as high as it has ever been for non-professional use. People just aren't seeing a value there. Whereas $800 USD would always get you the best of the best, over a 20 year+ span, that then shot up to $1200, then $1700, and now back to $1600 (though the 4090 obviously is not the best consumer chip they could make, and is a substantial price hike over the 3090)...a lot of people who normally wouldn't be $700+ dollar purchasers were able to save enough to be able to afford what they thought would be a great card. A generational leap over the already strong Ampere cards, and this nonsense happened. So many, if not most of those people chose not to buy cards.
Then you have the lower-high end buyers who expected a $500 4070 that beat a 3090 by 10-20% without DLSS 3 while having at least 60% of the VRAM. That obviously didn't happen. What do 60 class buyers have to look forward to? This *_"Pay more, get less performance day 1 and don't expect the card to remain viable as long as previous generations at their target res either"_* isn't exactly inspiring much confidence in giving companies our money.
If I was not so into modding/mods in the 90% single player games I spend the majority of my PC time on, I'd have switched to a console 2 years ago. It doesn't matter if I can _afford_ a 4090 or not, it's whether or not I feel like I'm getting screwed. I can afford a $500 hamburger, even if it is the best burger on the planet with chili-cheese fries that make a Tex-Mex joint in El' Paso feel seem like a burger chain, chances are most of us are still sticking to Red Robin or Krystal Burgers in a pinch. If they all of a sudden doubled their prices, and they would actually be warranted to because of inflation regarding food, rent/leases going through the roof etc, they'd lose a ton of business even though it is justified. With GPUs it simply isn't.
Pft on Nvidia, and I typically don't root for *people* to lose money, but with around 8 or 9 Billion USD of Jensen's net worth tied up in Nvidia shares, well I'd have no problems seeing his net worth cut in half as for all of their innovations, the main reason PC gaming is not progressing like it had been for 20 fricken years is almost all on Nvidia. Up through Pascal everything from resolutions to the quality of graphics continued on a steady upward trajectory. Now we are at a point where we can just continuously slot the new cards into the old lineups and see next to no price to performance gains except with the 4090, and that simply isn't acceptable. Not to me at least, and is a great way to kill an industry along with the sub-economy that has sprung up around it.
Honestly going to keep hold of my 1080ti .. it's performing well enough and isn't costing me anymore money. Was kinda hoping the 4070 would be the one to finally replace it but I just don't feel there yet. I *may* look at the used market tho since I heard that's where the real price drops are happening.
Yeah but you are missing very sweet stuff like RT + DLSS. Go used man, I recently bought 2060 Super (upgrade from 1060 3gb, also used, 3 years and still kicking) and even though this card is I think on par or maybe less powerful than 1080ti on paper, once you turn on DLSS 2, you jump even in 1080p so much higher in fps it's not even funny. CP2077 (1080p) with RT on = 30fps. With RT on + DLSS = 70fps. Of course you don't have to play with RT, but once you try it you kinda never want to go back. Also this used card has Samsung memory so it can be very nicely overclocked to 2070 basically. The card was selling in it's prime for like 800$, I bought it for 170$ (still with warranty). So even if it went bad I can replace it. There are some good deals out there, just be aware of scammers or cards used for extensive mining (techpowerup bios check can help). Oh and with the money I saved from buying like 700$ new graphics card, I was able to upgrade my whole PC (CPU, mobo/RAM and PSU)
My first gen 2070 out performed my 1080ti and used half the volts. 🤷🏼♂️
@@chemeister that's weird, UserBenchmark says 1080ti should be faster than 2060S and also 2070/2070S ..maybe driver issue?
@@chemeister you mean wattage? as when it does use half the volts, it doesn't say anything without the current, so the wattage could be very similar (U*I=P)
Used 6800 XT will get you anywhere between 60-90% performance uplift with less power draw.
Interesting… am I the only one wondering what happened to the 4070 Ti on Gears 5?
Remember when mid graphics cards were actually reasonably priced at midrange? Ah those were the days...
Coming from a 2070s standpoint I paid 500 bucks for it 4 years ago so 100 bucks more for the performance gains is pretty good for me
Not really, considering that the rtx 4070 is a 60 class card and not a 70 class card.
Same here ill be getting a 4070
4GB more over the 2070s, not exactly going to last long. I don't see the 4070 a upgrade over my 2070s.
you ngot scammed and you want to get scammed again?
@@venataciamoon2789 Because the RTX 4070 has 226mm^2 die size out of a 295mm^2 full chip, which is basically the RTX 3050 which has a 198mm^2 die size cut from a 276mm^2 chip.
The RTX 4070 is literally the successor to the RTX 3050...
Sounds like we've reached the stage where 50 series cards are now going to be around 300-400 which is kind of depressing. I remember buying my 780TI for £380 way back in 2013
380 Pounds is ~$620, or the price of this card, that offers 4 times the performance. Sounds about right for 4 years.... wait, it's been 10 years?
@@taekwoncrawfish9418 Most people don't, lol.
@Delorean911Turbo yea these humans really need a slap of reality and truly appreciate what all goes into computer components in general and all they have to do is spend the money and get pleasure like wow man
@@Delorean911Turbo Except the 780 TI was one of the top tier products with only the titan being higher at the time.
Great review and great presentation of the details. Personally, with so many PC games basically being poor ports of the console game, I may make my current 6950xt the last card I own and go back to a console in the future.
To be honest. I have a PS5. I just got a new PC. Which I am now gonna be $2,000 into it by the time its all said and done. And TBH. If you are JUST focused on gaming… and not into any other PC specific type shit. The PS5 is much simpler to use. Easy to just jump into 4k/60 games. I love the PS5. The advantage for PC is if you are into playing competitive shooters at 1080/1440 for High FPS. If you are into RPG’s ect…. The PS5 is the way to go
If Nvidia and AMD keeps the pricing trends as is it’ll make the barrier to PC gaming too high and it’ll have horrible ripple affects.
The ripples have already started. My friends who have had PCs for the last 30 years are moving to a steam deck.
Given vram limits hampering RT performance on Geforce cards - the 4070 the 12 GB may not age well for higher resolutions with RT. The RT results of the 7900 XT at 13:13, 15:23, 15:49 are quite noteworthy imo.
By noteworthy I hope you mean laughable a $780 card matching the raytracing performance of a $600 card.
@@courier3567 RT is cringe, rasterization is da way imo
A 4070 isn't meant for high resolutions, it's a 1080p card
@@drunkhusband6257 so for only 600 dollars you can get yourself a 1080p card in 2023. sounds like a deal
@@drunkhusband6257 1440p
This chart discussion format is much MUCH better than the previous "Throw all charts at your face first" format. I do have to ask, Jay, can we get an updated video on what some of the graphics settings are and what they do nowadays in more modern titles such as Cyberpunk, RDR2, etc? I'm sure there's many people that have no idea what some of those settings do in terms of performance and visuals (i.e. DLSS)
I bought my 3080 12gb strix for 699 off Newegg in September 2022 and I thought I would regret it since I assumed the 4070 was gonna be much faster and cheaper but now seeing these numbers I am very glad with my purchase and the strix looks amazing in my build
30s don't have DLSS3...
@@ksk622 DLSS2 is more than enough, what are you even saying?
@@thomasb282 DLSS3, 4k120, lower temperature, lower power consumption, cheaper. I am a BFG gamer in an ITX box. It is even better without ray-tracing turning on. In my opinion, without RT, one might consider switching to AMD card
I would have bought the 4070 today had it not been for it only having 12GB of VRAM. That's close to the minimum req for new games it seems, or it will be the new minimum very soon, so now I'm only considering 16GB to start with, but will likely get a 20GB/24GB card when I can. Really wish Nvidia had moved to 16GB for all their 4070/4070ti cards like AMD does for their 6800 and higher cards from last gen.
I've said on a few reviews now that this feels like a $500-550 US card that is priced at $600. It's still not good enough value for me to bite. Personally, I've just been skipping buying newer games because I'm not going to get good performance out of them without spending more on a GPU than I'm willing to. I've actually been enjoying discovering older titles that still play well and/or titles that provide compelling gameplay without ridiculous hardware requirements (ie non-AAA FPS-style games). Still having fun! 🙂
You're not missing anything at all.
I'm perfectly satisfied with my 6700xt for now. With components pricing these days I think there will be more console gaming.
I have been critical of your reviews, so its only fair to say the positive as well. I appreciate the time you take to explain the marketing hype and pricing strategies of these cards. As a hobbiestt PC builder, meaning I only build when my current PC gets old or crashes, the marketing/performance hype vs actual value/ cost per dollar when comparing current to previous models is awesome. Super informative. Thank you.
*
Same situation. I usually build another one every 5-7 years depending. I will also buy new GPU to support them. These prices are pretty insane.
Not intrested at this price point & not purchasing a card with less than 12 GB VRAM. Which will >probably< be the 4060 series. I want a card that will last longer & 8GB cards are showing their limits today.
Will be sticking with my 3080 threw this generation it seems. Makes me even happier I got a MSRP one from Evga.
Same
Why would you want a 40 series when you have a 30 series card anyway? Still rocking a 1080Ti and it's serving me absolutely fine, only recently started to even think about upgrading, I can still run all the modern games (albeit not at the highest graphic settings).
same, mine is ASUS TUF, got it launch day at MSRP
If they continue their shenanigans with the 50 series I may finally jump to AMD next time
The 3080 is good for a few more years yet. I got mine to replace a 980, which was only just starting to struggle with VR. We have a 1060 in the house which the kids use and it's still a 60 fps machine at 1080p on almost anything we throw at it ( the 1060 was similar performance to the previous gen 980). Unless you're a 4K gamer and are desperate for the full RT experience, the old cards still hold up well.
The lower power draw is impressive though, but not $600 impressive
Me too. Going to safe for rtx 5090
I recommend ignoring the Nvidia cards around the $600 price and instead look at the AMD 6800XT or the 6950XT - more performance per $.
Disagree completely. DLSS 3 is a complete game changer for fps.
@@drunkhusband6257 yea…fake frames with textures missing and adding lag… and only 6 games supported…it is a game changer 😂
@@crossfire4902 It's new tech, and I have never seen textures missing, or any added lag. DLSS 3 is amazing, getting 2x the framerate with no downside is great.
@@drunkhusband6257 its a fps gimmick generator to support those low end nvidia cards. if you're not going to buy a 4090 or 4080 then just get the 6950xt. real raw performance no gimmick at all.
@@irbvek almost 2x the fps isn't a gimmick.
I felt I overpaid in 2020 when I got an evga ftw 3060ti for $440. I typically stick with a midrange build but if I were to build something similar today it'd nearly $3k mostly because of the gpu.
I agree with what Jay has stated in the past, that if you don't care about ray tracing, then go AMD. That's why I picked up a Sapphire 7900XT, and I couldn't be happier.
I have same... Is the most beautiful card ever and it's a real beast
The 7900XT is a great RT card.
and if you care about - then go to amd aswell :) If Last of us or Hogwarts already crash in Full HD at 8GB, the 4070 wont run them much more stable in higher resolutions. No RAM - no RT. Or stay at Full HD.
Reality is ray tracing makes very little difference and most of these Nvidia users switch it on a few times to admire the shine and then switch it off when actually playing games to double or quadruple the frame rate......
@@jasonking1284 honestly i can't see that big difference with RT on or off 😅
yet another display of why I'm likely to go with team red on gpus once I have the cash.
What's exciting about this card is the low power consumption and possibility of having short cards, making it the most powerful low power consumption sub 280mm long card currently available - good for people with weak power supplies and small cases. Hopefully AMD matches it with a better performing 7800 XT at a competitive price with their AIBs producing some short cards. The reference 7900 XT is under 280mm long I think, so there's hope.
$499 would have made this a great value. AMD last gen seems a better buy compared to the new gen 4070
If your in the US the RX6950XT cost the same if not $50 cheaper
I'm really impressed with the TGP of that card, considering that the 3070 was a 220W TGP (I have one) and I use it at 1080p with a power limit of 46%, and that thing uses at AAA Titles around 70-100W, conserving a good amount of FPS, but I'll have to update that card because NVIDIA put 8GB of VRAM, with RE4 Remake, I noticed that I wasn't able to put it at max settings due VRAM, because it was able to run that game at 75FPS (My monitor is 75Hz, so I don't care having more FPS) and the game crashed due VRAM, being at the max settings, because the GPU wasn't at it's max %, was about 70%, so I might have the 4070 in the sight if I find it at a good price and a model that I like
It shows how much Samsung lags behind TSMC...
The 4070 is over $1100 Australian dollars which is just crazy.
I paid $650 *for a 980ti,* when they were still new and still the best you could get. What the eff is with this pricing, man?! 🤬
NVIDIA be having Dollar signs in their eyes. 😂
Your point being? In adjusted dollars, $650 2014 dollars is $825 2023 dollars...
@@awebuser5914 my income didn't jump that much either, dingus.
@@ZeroHourProductions407 Well, you didn't have a decent job, obviously! CPI is CPI, your job should have kept pace, unless it is minimum-wage...
Basically from what I saw with other reviews, it is 3080 level, RX6800XT level in raster for Radeon. For 600USD it gets "better than rest of the lineup" award. However MSRP is not only thing one should look at with pricing, because MSRP is what it was and depending on where you live RX6800XT or maybe even slightly slower RX6800 could be decently cheaper. Sure you lose on raytracing performance but do gain with extra VRAM making card more durable. Also that is if price will stay 600USD. As for gap, I think it is mainly there because they saw outrage from gamers on leaks of it being slightly below 4070Ti in price, plus the fact that new cards are just selling poorly. So they had to make it cheaper and I am sure they wanted nice gap for upselling purposes. Still nevertheless, it is at least worthy of consideration. Though 12GB of VRAM is becoming bare bones minimum you want, since games like The Last Of Us, Hogrwarts Legacy, Resident Evil 4 remaster, Forspoken, Plague Tale Requiem,... already need you to lower settings to accommodate 8GB VRAM. And I can assume developers will eventually want to push things further. Sure it is not a problem today, but it also isn't exactly 500 or 400USD card where I would find it more acceptable to take bit of potential planned obsolescence.
And P.S. As far as I am concerned, naming scheme kind of died a while ago. So personally I don't care what they call it, naming can be anything nVidia wants it to be or whatever they can make up. In the end, what matters is price. Naming in the end can be marketing tool once people start attaching meaning and can be used to manipulate people.
Not Plague Tale Requiem
@@2leggedpirate265 *bare bones minimum for a good experience. Running games on low settings is not always a good experience.
@@2leggedpirate265 You're right, I mean 8 Gigs for 1080p, looking down the barrel of tomorrow.
@@2leggedpirate265 We are talking 600USD cards, so dunno for you, but I am not paying 600USD to reduce settings like I got 1650, fir which you understandably lower settings since you paid considerably less. And 3060 12GB here is irrelevant since 12GB was more than 8GB last time I checked. Try something more comparable like 3060Ti or 3070 where you do need to compromise despite latter being 70 tier. Sure we are talking max/ultra settings, but especially 3070 wasn't exactly 300USD card and we are talking 1080p, not even 1440p.
6950XT is around $620 and looks like the better buy currently
I bought my GTX 1660 Super for around $200, just before the Corona hit back in 2020.
It's now 3 years later, and it costs about $230 now, and there's no upgrades anywhere near that prices either.
It's the first time in my lifetime, I could not upgrade my graphics card for about the same price after 3 years.
We are living in "interesting" times, for sure - also known as "pain in the butt" times.
Can you imagine how good this card could have been if NVIDIA didn't shift the entire product stack, and it was the 4060 at $350 (as it should de based on sku)?
Now that would have been properly impressive!
There is no way you will ever even see a $400 card again, you're living in the past bud.
@@drunkhusband6257 that fucking blows man… prices all go to shit when i’m finally an adult with a job. i have to worry about bills and all that other BS. so sad
@@DgkYogi PC gaming is a cheap hobby in comparison to a lot of them out there. Learn to manage your money better....
@@drunkhusband6257lmfaooo okay… try living in new york, you pay to breathe here. a majority of my money goes to savings, gas, food, car insurance, rent and etc. don’t come at me with that manage your money better shit
I know they lost all touch. I just purchased a new GPU five months ago as part of my five year upgrade cycle. Absolutely insane.
Then why did you buy it???
@@armyofninjas9055 "as part of my five year upgrade cycle"
I jumped from a 1070 to a 3090. The entire 40 series is a hard pass right now for me personally. I'm going to wait for 5000 series to come out, and then purchase a 4090TI on sale.
Any 40 series should be a hard pass for you. That 3090 should easily last you into the 60 - 70 series cards lol
That huge vram buffer is going to serve you well. It's resale value is going to be way higher than the 4070 series that's for sure.
I went from a 2080 non super to a 3090 Ti (got it on sale from EVGA). I don't plan on looking again until the 60 series lol.
This looks like a fantastic card to buy...used in 3 years when the 5k series launches.
So essentially what most people are doing with the 3070 right now xd
@@Spherecaster no? 3070 came out in october 2020. and the card was hard to get, most ppl got that card mid 2021. stop lying and justyfing sketchy tactics from nvidia
@@ImotekhtheStormlord-tx2it uhuh, that's why the steam survey shows that over half of the userbase of both the 3070 and 3060 has only started using them in just the past month, also where am I justifying anything? I just said that most people are buying it now because it was a good but overpriced card then (mostly due to scalping and miners ) that is a great value now that 4k series are being sold for even more ludicrous prices
This 4070 is pretty tempting, especially since I'm still running a 1080 TI.
I just dropped the cash to go from a 1080 to a 4080, and the difference is massive. Really unbelievable.
I would check out the RX6950XT if you're in the US cost the same but the RX6950XT is more powerful if you just want a gpu for just gameing and don't use the Nvidia professional software & hardware just go AMD you get more raw power at a cheaper price
@@MrAnony07 Thanks for the info! I have never used an AMD product before so my brand knowledge is zero. I'm not a brand loyalist so I don't mind trying something new.
@@MrAnony07 If a person cares at all about ray tracing, then they wouldn't go with AMD at this point. You can't even run CP2077's new Overdrive mode with anything from AMD.
@@ssreeser95 @Samuel Reeser yea if your budget is at the $600 mark then the RX6950XT is the best choice if you care about RT it'll be more powerful than a 3080 and slightly less powerful than a 3080ti which cost a whole lot more in the US and at that point just get the 4080 it cost about the same as the 3080ti that would be a smart choice but for raw gaming performance of the RX6000 & RTX3000 cards the RX6950xt is perfect at that $600 mark
Why was the 4070Ti TUF so much slower than the RTX 4070 FE in Gears 5 at 13:52 ... driver bugs? Seems like a weird result.
I got a 2070 for $400 new back before everything went crazy. The 1:1 performance per dollar increases makes it really hard to justify moving to a new GPU. Lets hope the next generation intel GPU puts some pressure on both AMD and NVidia to adjust pricing.
Compared to previous generations, the MSRP is not at all appealing. But compared to the MSRP of everything else in this generation, then it looks like fantastic value for money.
If it does actually release near its MSRP and available in the UK at £500, then it may be worth picking up. Especially compared to the £850 4070 Ti or £1200 4080, or even £700 3080s still available.
Sadly I expect retailers will be scalping these from day 1.
How is it looking over in the UK? In Canada the card actually dropped for its proper MSRP and is looking a bit tempting.
@@johannliebert2870 Its not looking good. The exchange rate would suggest a UK retail price for £480 + some extra shipping costs would maybe mean £500 per unit. They're actually listed as £590+ for base models and £650 for overclocked AIB models. And no FE's available over here.
@@richardwinstanley8219 dang that sucks
I switched to AMD after seeing the 4000 series prices...no regrets and a great price for a great card. Picked up a 6900XT end of last year for £650!
Based upon your charts and commentary, the 4070Ti TUF seems to me to be the best "bang for buck" over the 4070 FE. ie performance closer to the high-end cards at 1080p (which is mostly what I game at) but at a far better price point. Of course given that I'm (still) on a GeForce GTX 560Ti, *any* modern Nvidia card is going to be a major jump in video processing capability for me.
All these new cards are moronic. Check out used RDNA 2 and 30 series first. Depending on location, some great deals out there.
At 1080p unless you want to upgrade to 1440p the 4070ti is a bit overkill
@@somerandomdudeOG Yea at 1080p a 3060 is more than enough in 99% of games. I'd only get a 70 or 80 series if you wanna play everthing at max settings + 1440p/4k
Love my 4070ti Tuf but $200 more gets you a ROG version some will pay extra for looks
Scott, what platform are you using atm? I was using a 5GHz 2700K for the longest time with a 1080 Ti, upgraded to a B450/5600X, performance doubled for some games. In other words, even if you switched to something newer, but not the latest, the degree of gain may be curtailed if the base platform is sufficiently old (heavily depends on the game & settings though).
Wonder if the 70 class is the new 60?
More like its the new 1080 but then again i game on 1080p with my trustee gtx1080
Pretty much. This is awful for a 70 card.
yeah and the **60 is the new **garbage lmao 8gb 4060 ti??????????? axaxaxqxax1x1x1x1xaxaxaxaxa
70 class is still a 70 class GPU. A reasonable price GPU that gets you high->ultra settings. 60 class is still 60 class. A cheap GPU that's powerful enough to play the latest games. The only difference is prices have gone up some. But that's more to blame on the economy than GPU manufacturers. Computer parts aren't the only thing getting more expensive.
@@johnc8327 cheap lmao, brainwashed nvidia fanboy right here, get help
I'm still hanging with 970 as I would need to change almost everything to get new card and prices aren't the best currently 😅
I'm just not sure how much longer I can hold on to my GTX 1070. I think I could upgrade to the 4070 and keep using my 650 watt psu. That's saving 100 buck or more right there.
Still stuck with my GTX 1070 and i will wait till the price go down. Also i hope intel release new card and force those greedy to drop the price
Same here, but at that price.... I'm considering an AMD card and will wait a while for their mid-range 7000 cards. 12 GB VRam just won't last very long. I had upgraded from a 960 4GB to a 1070 8GB before that. I'm expecting 16 GB VRam, not this lousy nickel-dimeing.
tell me about it, I'm in the same boat with my 1080ti and 750psu
Until prices do not drop on this card, it's only worth looking at it if you build a completely new pc. An upgrade from a 10series at the moment is either AMD or 30series that has very nice offerings here and there.
Yeah same story, wonder if it would run on 550w corsair gold paired with 5600, since entire system usage was like 320w on the charts I saw, that would actually be quite impressive
Great job Jay! Going through the charts with explanation is a way to go! Really like it better!
This is why I'm STILL running a 5th Gen i5, two GTX 970's and a DDR3 MoBo.
I would love to see Jay run a test to see how these cards do against Star Citizen, especially in the starting zone of Orison cause in that area it can drop a lot of cards to their knees
$600 for a 1080p gaming card. L O L
if you want more of a joke, go to pc part picker, and on the price tab, sort it by most expensive to least. theres a 1080Ti for an asking price of $9k
I'm well past the point of caring about having the most powerful GPU. I make video games, not to produce the best possible graphics, but to make something fun and mentally stimulating. My GTX 1080 is definitely enough for that. I want people to be able to play my games on laptop graphics, old mid-range GPUs like a GTX 560, stuff like that. The more, the merrier.
I doubt you will see a lot of the $600 models leftover once they sell out. It's been pretty clear that board Partners were hoping for the 700 to 750 dollar price point, and I expect that's where a lot of board partner cards will end up.
Probably, and more insane as you can pretty regularly get a 6950xt for about 650
@@Aefweard yea 6950xt is so good almost close to 4070 Ti in performance, only issue is that is 350W+ but u can easy undervolt without loosing much performance.
@@ZondaRbg why would I want that bro?
@@yeshuayahushua4338 it's pretty common to do in order to get like 2-3% less performance at like 10% less power, it just makes it a bit better in terms of performance per watt
@@yeshuayahushua4338 Lower bills and temperatures.
I actually bought a 6700XT. It was on sale, on the shelf at Best Buy for $349. If the 4070 was $449 I might have bought that instead, but with real world prices climbing above $800 for a 4070 it's completely insane.
Instead, the 3060Ti is still $500 for some reason. If the prices of the 30 series won't go down, nVidia really has no way to offer the 40 series at low prices.
Noice what model
800 dollars? I'll buy second hand upside downs and custom yokes for my motorcycle to upgrade the suspension from conventional to upside downs instead 😂
At this point, at least for me, 600 bucks is too much. I don’t game nearly as much as I used to. Will be looking for a deal on a 2070 or 3060 on the used market, but e all know they are still way too expensive, but maybe I will get lucky. 😂 thanks for the awesome video Jay! Keep up the awesome work. Curious, when was the last time you were excited for a GPU launch?
Just get a PS5, and a PSN account, and you'll never have to worry about upgrades again - or at least, until PS6 arrives in 4 or 5 years.
Dude get the a770. Its great and comes with 16gb vram. Or like dude said, a ps5 also comes with 16gb vram. Ive been building computers for 25 years. So sad nvidia stopped caring about gamers
@@MlnscBoo Machine Learning pays better nowadays.
@@andrewlockhart841 Yeah I hear that dude. I just don't want to pay an ai tax when I'm not going to use it. They could make a 4070ti without the ai cores for 300 bucks and shove their dlss where the sun don't shine you know what I mena lol. But no they're gooing to force gamers to pay that tax
Get an RX 6600, more mature than Intel and better value than AMD. right on the tail of the 3060 while being only $200 brand new.
At this point I have totally lost interest in the GPU market. The comparisons make no sense whatsoever: We're now comparing value at insane pricing versus new cards at equally insane pricing. Even if the new price is a smidge less insane, it still is insane. Still rocking my 1080Ti, which runs everything I play fine at 1440p. If that ever breaks, I will probably just make more use of my Steam deck.
Never get rid of the iFixit ads. Pure gold.
Sponsor block extension is pure gold
@@V1CT1MIZED amen brother
This card should have 16GB of VRAM. I wish more reviewers covered these cards for content creators. 4070 should support AV1 encoding. Need to see those benchmarks.
Should have 24 or 32 bro
Still waiting for the 4080ti.
Just hoping my tired 1080ti can hold on for just a few more months. It's served me a good hard life for gaming and media server transcoding.
Surely the 5070 just have a same card as 4060 with DLSS 4.0 to generate 80% of the FPS
You say that as if it would be a bad thing.
@@itsaUSBline 4070 has the same CUDA cores as the 3070 RIP