@@arghpee It's about the same for the whole 40 series. 4070S vs the 3060, 4070TiS vs 3060Ti. Oddly, the 4060Ti is about 2x a 3050 but "only" a 1.6x price hike. The 3050 was already grossly overpriced, so it's really not good.
I totally get your point, but I think recent titles at 4K max settings can't be handled by anything. Even a 4090 needs DLSS to keep a solid 60fps in some titles, and it is literally the fastest GPU we have. I think it's more of a "these games are just fucking bonkers" and less of a "a 4080 Super is weak and slow"
@@KellerFkinRyan Well 2160p is for the newbies that don't know any better; that's why most informed higher end gamers go 1440p high-Hz, including ultrawide. If you go 2160p you will be buying the top end GPU every generation, and towards the end of that generation you'll be struggling for FPS.
@@Battleneter Yep, it's another marketing BS, just like frame gen!... More pixels better... Ooooooh... Except that never mind a $1000 card, not even a $2000 card can actually drive those pixels!
Game engines and tech (path tracing, etc.) are outpacing hardware releases. DLSS helps, but many new games are slamming CPUs harder than GPUs, and if you're CPU limited, DLSS doesn't matter. We need frame gen to be a standard option and to keep improving its flaws.
Their 2x performance increase is certainly nothing but a play on words for using framegen. Most consumers will not know this and therefore think yay great value.
That's why Nvidia kept DLSS 3.0 off the 3000 series (even though I think they could have updated it for 3000) so it makes the 4000 series look much stronger than they are@@Battleneter
@@genx156 FSR frame gen is done in software; Nvidia frame gen is done via the Tensor cores. Nvidia effectively "claim" the 30 series Tensor cores are too slow for Frame Gen. While I am suspicious that's BS, you should not spread potential misinformation.
No, you guys keep thinking the typical consumer takes everything written on the box or promos as the truth. People do their research, and if they're not interested in tech details, they ask their techie friends. The proof? Several 4000 series cards have not sold well at all. Also, the 4080 Super's MSRP is $200 lower than the 4080's, so if sales were so good they would have set it at like $1,299. Stop spreading nonsense about how people don't care and will just keep buying. Ever consider they are only buying because there's really only one other competitor on the market? It's not like shopping for a TV, dude
"Skip it" generation. To pay 43% more money over the $700 3080 10GB to get a 50% uplift just doesn't make sense. Soon this kind of performance will be available for ~$700 with the 5070 and then this will make sense where for the same $700 people will get a ~50% performance uplift. The $1000 price tag for an 80-class card is just absurd that Nvidia is pushing more and more with each generation. 1080 was $600, 2080 and 3080 were $700.
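The arithmetic in the comment above can be sanity-checked in a few lines. All figures (3080 at $700 as the baseline, 4080 Super at $1000 with a ~50% uplift) come from this thread, not from official benchmarks:

```python
# Perf-per-dollar sketch using the thread's figures (assumptions:
# 3080 = 1.0x performance at $700, 4080 Super = 1.5x at $1000).
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

rtx_3080 = perf_per_dollar(1.0, 700)
rtx_4080s = perf_per_dollar(1.5, 1000)

value_gain = rtx_4080s / rtx_3080 - 1
print(f"perf-per-dollar improvement: {value_gain:.0%}")  # 5%
```

So despite the 50% raw uplift, the value per dollar barely moves, which is the commenter's point.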
I am assuming you are unaware that the 30 series was the last series manufactured at Samsung's fabs, and Nvidia has now shifted to TSMC, who charge more. Forget about getting a $700 80 series card. Consumers will adapt to it. Yes, the 5070 will match the 4080 at $600, but the 5070 will be launched mid next year.
Will it make sense then? Wouldn't it be a little too late? To wait from 2020 to 2025, 5 years, for a 50% improvement at the same price? That kind of improvement should be generational, not every two generations.
@@pinakmiku4999 I am assuming that you have no idea about chip prices and chip sizes. Search Google for a silicon die-cost calculator and you will see that the chip for a card like the 3080 costs around $70 and for the 4080 around $120. Yes, TSMC is more expensive, but the chip itself is so cheap that this doesn't have much impact on the overall price. And the 4080's chip is also 40% smaller than the 3080's chip. Stop fooling yourself and stop excusing corporations - it's just stupid.
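The die-cost claim above can be roughly reproduced with the classic dies-per-wafer estimate. The die sizes are public (GA102 ~628 mm², AD103 ~379 mm²), but the wafer prices below are rough figures often circulated for Samsung 8nm vs TSMC 4N, used here purely as illustrative assumptions:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # First-order estimate: usable wafer area divided by die area,
    # minus a correction for partial dies lost at the wafer edge.
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Wafer prices below are unconfirmed, illustrative assumptions.
for name, area_mm2, wafer_cost_usd in [("GA102 / 3080", 628, 5000),
                                       ("AD103 / 4080", 379, 17000)]:
    n = dies_per_wafer(area_mm2)
    print(f"{name}: ~{n:.0f} dies/wafer, ~${wafer_cost_usd / n:.0f}/die at 100% yield")
```

This lands in the same ballpark as the commenter's $70 / $120 numbers once you account for yield losses, which is the point: even a tripled wafer price only moves the per-die cost by tens of dollars.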
The problem is game devs are poorly optimizing their games. (I don't know if they have any incentives from GPU vendors.) -Jensen: buy more to save more. We are using 3-4x more GPU power for 1.5x the graphics quality. Remember how RDR2 was released in 2018, when the best GPU we had was the 1080 Ti/2080 Ti. I really don't see any recently released games looking 3-4x better than RDR2.
RDR2 doesn't have any sort of ray tracing or any of the demanding UE5 features that a lot of the current gen performance-demanding games are being built with. Rockstar is just great at optimizing (for consoles at least) and UE5 needs more work
@@amartinez97 That's the point. Even with all the bells & whistles built into UE5, the Rockstar Advanced Game Engine still somehow looks as good in a 6-year-old title. And RAGE was only made because EA bought Criterion, which had RenderWare (GTA 3). So RAGE itself is as old as the 360 era.
You know what's nuts? In 2014 the _entry_ level GPU would run you about $100. That would get you 720p/High in most games, and 1080p/High in a ton of games that were incredibly popular like Skyrim, Mass Effect 2 and 3, Deus Ex: Human Revolution, etc., though they were obviously 1-6 years old at that point. A high end card would run you $350-$700 and would net you 4K 30 in pretty much everything.

Then in 2017 we had an entry level of $230, but this was really more midrange as opposed to entry level, and you could easily get 1080p/High 60FPS+ in just about everything on the market, with both the 580 8GB and the 6GB 1060 often getting 1440p/High 60FPS. So we had a big resolution bump from entry level 720p, and 60FPS became almost standard for a huge part of the player base who had been relegated to 720p or 900p if they wanted 60 frames per second along with medium graphics settings. At the high end we of course had the $500 1080 ($700 for its first two months on the market, at which point sales dried up; Nvidia knew there was a problem with Vega coming, so the 1080 Ti was on the way while the 1080 received an almost immediate $200 price drop). Now we had 4K/Medium-High 30-70 FPS on the market for $500, with some pretty darn modern graphics tricks, and 4K/High-to-Ultra 60 FPS+ (not that many people had 4K 60Hz+ displays at the time, but they existed, so...) for $719.99, along with incredible 1080p and 1440p/High-Ultra, crazy-tight frame time, high FPS gaming. This was incredible.

Let's jump forward 6.5 years this time instead of three. 1440p cards are now running people... $1000 USD? Seriously? People are buying the 4080 Super with the expectation of playing at 1440p? That's awful. The max common gaming resolution is... the same as it was 10 years ago. Frame rates have pretty much doubled there, at least with what people expect, but instead of the $700 780 Ti it's the $1600 4090. Instead of the $500 780 for Ultra 1440p gaming it's the $1000 4080.
No resolution improvements, so yeah, I would hope there was a substantial FPS jump over the course of 10 years along with much better graphics as a whole. Except I'd expect it for the same price - less, if we were not taking inflation into account.

The worst part is that with the increase in popularity of PC gaming over the last 10 years, one would think that economies of scale would have helped keep prices down. But because of home mining (which became very popular in 2014, oddly enough) and the margins it afforded, PC gamers - the vast, vast majority of whom have never mined a coin on Nicehash, never mind set up a bunch of mining rigs - are being forced to pay far higher prices because miners did. Both companies (Nvidia and AMD) lied to investors, and now they have to try to prove it really was gamers paying the inflated prices by keeping prices stupid high.

Sure, you can still get a great prebuilt HP with a 7800XT for $1000, but good luck building one with a current gen CPU. And no, not everyone wants to buy a B450 to flash the BIOS, put a 5800X3D in, etc., to build a PC with a 7800XT to see if they can squeeze into the $1000 area. Unfortunately motherboard prices for Zen 4 are still atrocious, coming in higher than Intel boards, which is certainly a change. And contrary to popular belief, I am still seeing people leave PC gaming for consoles. Not because they want to, but because they can't afford PCs that will play the games they want, the way they want, for what they consider a reasonable amount of money.

The 4080 Super (or 4080) would have been fine had it been released at $999.99 and *DROPPED* in price from there after the first few months, or even the first year, but this is just ugly at this juncture. It would be a lot less frustrating if the 3080s all had 16GB of VRAM, as people who had previously been considered _high-end_ gamers would have had a reasonable $700-$800 upgrade path this generation, but we didn't. Not from Nvidia, and not from AMD.
Nvidia is making fewer 80 class cards than I have ever seen, and they are still having a hard time selling them. The few people in retail I've talked to said they received a minuscule amount of 4080 Supers, which would explain why both Newegg and Amazon don't have any under $1100 atm, with Newegg bundling them with 7900X3Ds for $1399.99-$1484.98. I understand that the demand for the 4090 would seemingly push demand for the 4080/Super higher, but I am also pretty sure that the low number of units shipped is a pretty large factor. If that's only pushing models sold by Newegg and Amazon (not third parties) to $1100-$1300 for custom cards, with the way their algorithms work, they still can't be selling all that fantastically.

Lol, I've never rooted for a company I own shares in to do shitty before, but I really can't wait for the AI craze to settle down. I'm sure Google, Meta, Microsoft and a few other large players will be just fine, but once the literal thousands upon thousands of AI upstarts start to fall off, I'm sure demand will fall accordingly. At that point what happens though? Will gamers be expected to make up for the lost margins again? That didn't work all that well this gen; 4070 Supers and 4070 Ti Supers are already selling for below MSRP (like $10-$30 via instant savings/rebates, nothing substantial or official from Nvidia, mind you). I'd be hard pressed to believe either AMD or Nvidia would think that gamers would, as it seems they showed that they were not willing to, or were unable to, this generation.

At the same time, if AMD doesn't have a high end offering, topping out at, let's say, the 5070 Ultra Ti Super Mega GT level, will Nvidia even bother with 80 class cards at $1200 if they are not going to sell? Or do they basically paper launch them, only producing a couple of thousand for each major market, using the rest of their silicon for 5090s and the smaller dies they can still hold 60%+ margins on after the card is made?
Once again leaving everyone who was previously in that $700-$800 area with nothing worth buying, as I assume the regular 5070 will be around $800, probably come in 15% faster than the current 4070 Super, and essentially be a 4070 Ti Super side grade. Ahh well, I suppose all we can do is wait at this point. I have a $2K budget for next gen so _hopefully_ it won't affect me, but if the 5090 comes in at $2200 I'm not touching it. I'm also not spending $1000 on 4080 Super levels of performance. I'll buy a used 7900 XTX or 4090 if that's the case. It'd be great if everyone went that route, but then Nvidia would probably just delay the launch until Q4 2025 because my goodness are they stubborn, and with the size of the war chest I'm sure they have sitting in Jensen's leather jacket closet, they can certainly afford it.
The RX 6800/XT and 7800XT are still good 1440p cards for ultra settings (some titles only high-to-ultra though), and the 6700XT/6750XT and 7700XT are really good entry level 1440p cards. On Nvidia you have the... uhh, 3080 and 4070? But all in all you can still buy a midrange GPU for 1440p
@@justinwalker5237 It's also called margins going from 23% to 60%+. Have you looked at their investor notes? Considering your response I am going to assume the answer is _"No."_ I do mention inflation in the silly rant novel I wrote, and stated that $999.99 at release would have been acceptable _(to me)_ with prices dropping or going up based on demand. Nvidia *HOLDS BACK* units, cuts production of models etc to artificially keep prices higher than demand would normally allow. With AMD not even attempting to compete, instead, following Market Leader Monopolistic practices which should not be allowed under US law, instead of being fined 3 or 4 billion USD, we look the other way. That's a problem. A problem that most likely would not exist had Obama not signed the bill allowing corporations to donate as much money as they would like to PACs and receive tax credits for donating to a non-profit without repercussions. You and I? We can only get credit for $2300 or so. Nvidia? They can get tax breaks in the hundreds of millions. So sorry, my friend, it runs a lot deeper than _"It's more expensive to make the cards now so the price goes up."_ A lot, lot deeper.
@@justinwalker5237 that's not really how it works. The only thing that forces a price to increase is an increase in the labor cost to produce that good or service. Most items do not see 50%+ labor cost increases every 10 years lol. In capitalism, most price increases are voluntary margin increases. Then they eventually have to pay their labor more, because when enough people raise margins, labor (who are also the customers) may also receive a pay increase. But yeah, most things tend to get cheaper over time to produce as you get BETTER at making them, as most resources are actually quite abundant.
20 series: forced ray tracing on consumers even though it was only poorly implemented in a couple of games. Saved by the start of crypto mining. 30 series: better ray tracing performance, more games support it. Crypto mining and the chip shortage lead to stupid scalping pricing. 40 series: Nvidia is now the scalper, but no crypto mining to save them.
See one of the comments above... 4080 Supers are OoS everywhere and are only found by third parties selling for $1,500 and up. Scalpers are scalping on TOP of nVIDIA.
@@1x4x9 As far as I know, the price of the RTX 4080 Super is high because Nvidia themselves are limiting supply to graphics card retailers. They are trying to gauge how much users are willing to pay for their products after the disastrous sales of the non-Super RTX 40 series.
We've reached the fourth day of the 4080Super's release. As some of us predicted (well, as I predicted), the 4080 Super has basically gone OOS at retailers and is being scalped by third party sellers starting at around $1500. So much for its MSRP.
You can get one with little effort. I was able to purchase 4 of them without hassle, you just have to keep checking. I cancelled 3 of them, since I got the one I wanted.
I called it years ago; there's a reason the GTX 1080 Ti has 11GB of VRAM, and for a long time it was a 1440p card. Recently it's become a 1080p card because of the lack of GPU horsepower. Meanwhile the 3080 Ti is entry level 4K.
Did the same, from an RX 580 8GB to an RX 7900 XTX. Honestly, my mind is pleased, but I don't really know why I upgraded, since I've only played Stardew Valley and a bit of Mass Effect since I got the 7900 XTX, and my old RX 580 was doing just great with those games. Future proofing, I guess
Using FG to highlight "x performance increase" should be against consumer laws. Pretty sure in the UK this would breach consumer law because it's not actually improving performance; if you took this to the ASA they would probably slap Nvidia hard on that marketing. Even Nvidia are resorting to FG to sell their new GPUs. Let's imagine for one second that the 3000 series had FG: Nvidia wouldn't be able to make such claims about the 4000 series and make them look appealing to buyers. It's scummy af and should be illegal
And additionally, not all games offer Frame Generation. They should at least call it "up to 2x RTX 3080 Ti" with a footnote that it only applies to games with FG enabled.
I like your tests, thus I watch them. But maybe I can give you one piece of advice, even if the Nvidia "test sessions" are kind of over the way I see it. I also watch other tests, which are just gameplay GPU performance runs (at least some of them), and here's the advice: in most of them, I notice they use a different graphics API. Namely, instead of DX11 or DX12 (I'm not sure which one this game offers, if not both), the other testers chose the Vulkan option at 1080p (and possibly the other resolutions), and that allows the high end GPUs to hit 200+ FPS at 1080p, which I think is a more realistic representation of 1080p FPS for a "light" (by modern standards at least) game on a high or decent grade GPU with the rest of the appropriate components.
The 40 series with the Super refresh included was the consumer's opportunity to let NVIDIA know that we will not stand for them becoming the scalpers. We should have let the entire series rot on the shelves so they would get the message for the 50 series but people are their own worst enemies.
The consumer bought all the rtx40 cards by the hundreds of thousands, sorry your little boycott went nowhere. 1 in 10 gamers already had an rtx40 card before this refresh. Cope harder
@@Amzyy that's the price you want it to be. The 4080 Super wouldn't even be able to be kept in stock at $800. Cards cost more than they used to. Some of it is inflation, some is increased manufacturing costs, and of course some is greed, but not as much of it is greed as you think.
3080 is a beast tho with my 7800X3D. I just wanted a little bit more power but for the price it wasn’t the best decision but hey you only live once lol but yeah 1k is crazy but that’s why I work hard hahaha. Happy gaming !!!! 3080 with that cpu are beast nonetheless ❤
The real problem with most of current titles not hitting 60fps+ at 4K is due to Ray Tracing/Path Tracing, or because they are CPU bottlenecked like Spider-Man, Hogwarts Legacy, etc. But most Rasterized games can definitely run above 60fps at 4K on a 4080. DLSS & FG were created to allow players to get/feel like it's 60fps+ in RT/PT games but even with that it can be too demanding. We'll have to wait and see what Blackwell with a brand new architecture and then Vera Rubin will bring to the table, but as of now even a 4090 would need 2-3x performance to hit 60fps+ at Native 4K on Cyberpunk 2077 with Path Tracing... 😬
@@Bazylchuk_UA For sure! The 7900 XTX at $1000 is a much better deal for Rasterization than the 4090 at $1600 but that's just the way it is! Let's hope RDNA 5 will be more competitive...
I would completely disagree that people are picking the 3080 and 4080 for 1440p resolution. I bought the 3080 originally for 4K, and most people I see buying the 4080 Super claim they are buying it for 4K. The budget spent is a lot, and you expect the best performance at 4K. Granted, I bought a 3090 soon after I realized 10GB of VRAM was limiting my 4K experience, and now I am using a 4090. It is very disappointing that Nvidia only has a single GPU capable of 4K gaming with ray tracing.
Same here in Germany. The whole school runs on windows, everything, even the servers. But some idiot in the department of education decided that all the teachers get macbooks. but no MS office license with them because that's too expensive. Nothing but problems, many of the older colleagues just left the macbooks in their boxes and put them in the cupboard at home.
Not everyone uses 60fps. I just upgraded from a 10GB 3080 to a 4080 and it's a great upgrade at 2560x1440@120Hz. It's awesome to revisit old titles. I was almost always hitting the 10GB VRAM wall in newer titles; that was my main reason. Console ports from late 2023 and into the future will need that 16GB minimum buffer imo.
Thanks for the video. I'm currently on the 3080 12GB. Your video helped me make the decision to keep using it without upgrading to the 4080S. The only viable upgrade would be the 4090 at the moment.
Still rocking my EVGA 2080ti I grabbed for $400 in '22. I tend to wait until I can more than double my performance in similar price ranges for an upgrade. Is there anything reasonable in sight? New or used? Never purchased a brand new card outside of my 660ti way back when. I'm still team green.. somehow.. but I primarily look for stability and raster perf.
I personally will wait for the 50 series. Not only for frame generation but for any new tech they'll probably put into DLSS 4.0. I'm still on AM4, so I might have to jump to AM5 as well if the 50 series uses PCIe 5.0
I'll try using the 5090 with the 5800X3D; it should be fine for a while for my needs. A platform upgrade may be needed at some point for additional CPU performance, but that can wait.
@@grim8511 it performs exceptionally great but I did have some issues with the drivers unfortunately so I ended up returning it and got my money back and I went with a 4090 😭 I was so fed up hahahahaha
best part about owning a 3080 that I bought used for 500 6 months ago is..... I can straight up skip the 40 series and not deal with scalpers. Everyone should just buy last years tech used for half the price... Most of the games I play are fine at 1440p above 100 fps without RT.
Have a 3080ti. See no reason for upgrading at all. Lossless scaling is a lifesaver for games that are poorly optimized. GPU market has gone to shit. Newer GPUs use a bestial amount of watts (as do CPUs) and aren't actually that much faster. Feels like they are pushing bigger numbers as if they were overclocking older hardware.
I upgraded from an RTX 3080 10GB to an RTX 4080 Founders Edition and the performance improvement on my i5-13600KF was massive and very noticeable at 1440p. Instead of getting 60-90 fps, I am now getting a locked 120 fps in the majority of games I play, and with Frame Generation and the framerate cap off I see many games hitting the 158 fps cap of my 165 Hz display. I can't wait to replace my ageing 27" 165 Hz IPS monitor with a 360 Hz Dell QD-OLED monitor sometime this year, once the new models are released here in the UK, but the upgrade is more for the OLED goodness, so I can finally say goodbye to IPS glow and weak blacks plus have HDR, as opposed to the higher refresh rate. That said, the high refresh will allow games with Frame Gen to exceed the current 158 fps limit on my current display, which for me is a bonus.
It’s a dope upgrade. People just tend to hate because they haven’t done it 🤣🤣🤣 enjoy your extra performance and gaming too. I just upgraded from a 3080 10gb which I love and it’s a beast at 1440p to a 7900xtx sapphire nitro+ 🫶🏻🫶🏻🫶🏻
Old news. Nvidia GPUs have been known for years to deliver a 30% increase in performance for a 60% increase in cores between generations, and a 30% increase in performance within a generation. People who know computers know when to buy and what to buy. 😊
Graphics cards need more VRAM, like 32GB or even 64GB. The difference would be an improvement in speed and in handling larger textured scenes.
I had the 3080 10GB and made the jump to the 4070 Ti because of the extra VRAM. The additional VRAM made a difference in MSFS because I'm easily hitting over 11GB in most scenarios. Granted, Microsoft is coming up with a memory optimization update which might fix this issue, but for me, the upgrade made sense. Also, the FG mod on the 3080 worked OK as long as your baseline FPS are good. If you're only getting 20FPS or hitting the VRAM limit, you're going to get stutters with or without FG.
@@denvernaicker8250 This made me feel a bit better about purchasing a 24GB 3090 for half the price. It seems like people are realizing how important more memory is to the performance of high-end graphics cards.
Something is wrong with Nvidia. Every driver they release is full of bugs, and the next driver comes with more bugs without fixing the previous ones. Their control panel still has the same old UI and isn't very responsive on newer systems. On the other hand, AMD is doing its best to satisfy its customers. I never expected this from Nvidia.
2x performance gain sounds like a bold claim. I think a lot of this comes down to whether you already have a 3080 or not when buying. I'm coming from a 1650 Ti (or something like that, can't remember the exact name), and as a new computer would set me back a lot of money anyway, I went for the 4080 Super. If I already had the 3080 I wouldn't even be considering upgrading. I doubt a lot of people have that kind of disposable cash to buy a new gen graphics card as an upgrade whenever a new card comes out; I'd wait at least 2 generations before even considering it because it's so expensive. I'm very happy with my 4080 Super. It's stupid expensive, but I do really appreciate the kind of graphical fidelity it can deliver
Excited to get my 4080 super FE in the mail, been holding out for years and years to build a PC, I've been stuck with a laptop. I really like DLSS and driver support with NVIDIA especially for VR.
@@ZackSNetwork thanks dude, I actually went with a 7800x3d because I've always rocked AMD cpus and the 3d vcache is new and exciting. I've just been rolling with an r9 380 GPU until I get my 4080 in the mail, I've had it since launch in my closet lol
Oh man, going from a laptop to a high-end desktop feels amazing. I still remember when I first made that switch. Complete night and day difference. Enjoy man.
I think 50xx being 2x the 30xx series is generous. Looking at the 40xx vs. my 1080, I can "only" expect to get about double the performance now at 4K vs. what I get with my 1080. An interesting thing I keep seeing with the few 1080 benchmark videos I can find is they seem to get lower performance with seemingly faster CPUs than I do. I tend to run without any SSxx or AA, so this is perhaps the difference. The point remains that for me at least, based on all the data I can collate, a 4080 Super in most situations will yield 2x the performance I see with my 1080 at 4K. If the 50xx would yield 2x the 30xx performance, well... I'm definitely waiting, as that would be 4x the performance, but alas, I don't think this will happen. I think it will be 2.5 to 3x faster. As it is now, the 4090 seems to get bottlenecked by even the fastest CPU, and then depending upon which games you run, even the 4080 Super gets CPU limited (certainly at 1440 or below). I think we're starting to hit a raw performance ceiling more generally with current architecture. I would love to be wrong, but we are currently already seeing this with CPUs. I think we're not far off with GPUs now, if you consider the 4090 already shows signs of this.
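The "two generations of gains" reasoning in the comment above compounds multiplicatively. A quick sketch (the 1.5x and 1.7x factors are hypothetical round numbers for illustration, not measured results):

```python
# Compounding per-generation uplifts (hypothetical factors, not benchmarks).
def compound(uplifts):
    total = 1.0
    for u in uplifts:
        total *= u
    return total

# Two 1.5x generations compound to 2.25x, which is why "50xx = 2x the 30xx"
# is plausible without either single generation being a 2x jump.
print(compound([1.5, 1.5]))    # 2.25
print(compound([1.7, 1.25]))   # 2.125: uneven generations can land near 2x too
```

This is also why skipping a generation is usually where the worthwhile upgrades appear: a single 1.25x gen rarely justifies the price, but two of them multiplied together can.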
Sadly DLSS 3.5 frame gen is not working for VR right now, but you can use frame re-projection which is like the same idea. It doesn't look very good though.
Frame reprojection works with basically any old GPU, and it's best avoided. Frame generation (FSR3) now works on all RTX GPUs with the mod, so frame generation is a bad argument for buying a 40 series card, besides its incompatibility with VR.
Great work on all your videos Daniel 😇💪🥳! Thanks to all the fellow channel supporters, commenters etc for supporting an awesome content creator who provides amazing detailed reviews 😳🤩💪.
I am now super happy for some time to come with my 3080 Ti 12GB (bought about a year ago for USD 550), with the Nukem and LukeFZ Frame Generation mods now available for this card! Able to play 1440p Ultra/High with DLSS Q/RT/PT in all games. :)
I have a 10GB 3080 driving a 4K display. The 10GB of VRAM is a ticking time bomb that'll tank the card's value shortly and exacerbate the performance difference in the near future, where 10GB of VRAM is not enough for 4K.
@@robertofranco4515 The 2080Ti has a pretty bad GPU, not that much faster than a 1080Ti. That alone isn't entry level 4K. The 3080 doesn't have the GPU horse power of the 3080Ti/3090 for entry level 4K as well.
In fact, it is the same as always. Just choose your price point and upgrade every two generations and you are fine. Game settings are always flexible enough to let you game on the last gen GPU. Ok, the 3070 was a fail, but the 8gb were unchanged since the 1070, so that was predictable...
@@sorinpopa1442 It's not just RT. It's 16-bit float compute (i.e. AI compute) as well. The card this channel is shitting on is BETTER THAN AMD'S ABSOLUTE BEST LOLOLOLOL
@@Wobbothe3rd And are you delusional? Or just another brainwashed Nvidia fanboy? A 4070 Ti costs almost $1k in the EU, and a 4080 costs $1350 here! Nvidia is crazy overpriced! Ripoff!
About 4 years ago I was looking for an upgrade, but I stayed off the GPU market entirely from 2020 to 2023 due to the scalper/crypto era and kept running my old beast, the GTX 980 Ti, until I got an RTX 3070 for 450 bucks in August 2023. Then a 3080, and finally the 3080 Ti for about 650 bucks a month ago. The 980 Ti I bought new for roughly 750 dollars in 2015 (in Norway we pay 25% tax on everything), and that's the most I'm willing to pay for a high end enthusiast card with an xx80 Ti label on it. It took "only" 3 years for used prices to go back to normal (disregarding inflation, obviously).
I love this man. He always brings my consoomerism down to reality with these videos. Here in Australia the cheapest 4080 Super is about $1600 AUD, so in order for me to "upgrade" from my 3080 12GB, I'd have to sell my 3080 for about $650, then pay $1000 on top. There's not $1000 worth of upgrade in this. Not even close.
Great content, as usual. Currently on a 3080 12GB myself, so your video was interesting considering I am willing to upgrade, but not necessarily for desktop gaming (I have a 4K 120Hz OLED TV) but rather for PCVR, as the card shows its limits there. So if the 50% uplift would materialize in VR as well, then I am sold. Are you planning to get into VR at some point and do similar comparison exercises? That would be so helpful for me and certainly some of your followers. Thanks in advance
I have been playing VR on 3080 10 GB for a few months, and upgraded to 4080 a month ago. Performance increase is noticeable. But i got mine factory refurbished for 660 USD, so the price didn't affect me. If you can get similar deal i would say go for it if you want an upgrade right now. 3080 12 Gb (and 10 GB too) are still pretty great cards, also you can get some stutters in heavy VR games if you won't turn down resolution on a headset while playing on high. 4080 allows to have clearer image on ultra in 90% of games.
Would you mind sharing what your settings in nvidia control panel are for these tests? Or if you have them anywhere in a link 🙏 love the benchmarks as usual
Thought about waiting for 50 series to upgrade my 3080 10GB, but it does not seem like it will be coming anytime soon (not to mention bots/scalpers/etc).
This is a very useful video, better than Gamers Nexus, who seem to focus on 1080p and 1440p and just skim over the 4K RT tests, which is what us enthusiasts who buy the cards and who are actually invested in these videos are running.
Very informative and thorough as usual Daniel. I've been trying to decide between a 4080 super and a 7900 xtx but I think I'll just keep my 3080 Ftw3 for a little longer. I'm running a 1440p ultrawide and haven't had any issues at all playing modern titles, even with the 10gb version. I'll be ready for a 5080 though, or maybe a Battlemage if Intel maintains their current trajectory.
Generally when I'm upgrading a GPU I don't care how much faster it is, I care about breakpoints. For example, the 3080 isn't good enough for 4K gaming; it struggles. It struggles HARD in Cyberpunk 2077 with RT on, DLSS Performance. Also, the 40 series has more features that further boost FPS. The minimum to make that game playable with RT on is a 4080. Starfield and modded Skyrim (4K) also struggle at 30-40fps on a 3080. My CPU is an AMD R7 7800X3D. The difference between 30 and 60 fps is massive compared to 60 fps to 90 fps.
A lot of people are saying to wait for the 50 series cards. But when they come out, do you think they will be 'better' values? I really doubt anything will be under $1000.
Even if they release the 5080 for $1200 again, if it does at least 20% better than a 4080 Super (which is very likely, if not more), you'd have about the same performance-per-cost ratio. Though they may finally give the 80 series 24GB of VRAM, which would help the card in the long run. The 5080 will also surely have DLSS 4.0 stuff (the 4080 could get it as well, but suffer from the generational jump in improvements, like from the 20 to 30 series), plus whatever new thing they may introduce that may or may not come to the 40 series. Plus, it's less than a year away. Holding onto the 4080 Super money and saving $200 more by then shouldn't be an issue if $1k for a GPU isn't an issue now. And this is just comparing the same tiers: a 5070 would be roughly 4080 performance for significantly less (maybe the 5070 runs at $700), and still have the benefits of the 50 series. The biggest question is whether they will release the 5080 in the same price range as the 4080 or not. They'll likely use the 4080's $1200 retail price to market the 5080's "value"; they sure hinted at it when comparing the 4070 Super to the 3090. Tl;dr: if price-for-performance is roughly the same, and if one can wait, the 50 series could provide more value. It could even cost less to get the same 4080 performance by waiting for the 5070.
It's a "nice to have", but calling it 2x faster is a lie. It's not doing the same thing 2x faster, it's basically cheating off its own homework to give twice the output at a degraded quality. It's like two cars racing and one finishes twice as fast, but ends up at a different finish line.
@@jasonhurdlow6607 There really aren't any visual problems. Even the increased input latency isn't noticeable. It's definitely a setting I will always turn on if it's an option. If you compare a 7900 XT with FSR vs a 4080 with DLSS and frame gen in Cyberpunk at max settings 1440p, it is literally 40fps vs 100+fps.
@@thomasriess9208 It's still 40fps vs 100fps with the 7900 XTX. There is a mod now that lets you use frame generation on any GPU, and that will get you 60fps on the 7900 XTX, but I've heard the mod can cause crashing and has some visual problems that regular frame gen doesn't have.
@@calebadamu Ok, very interesting. I did not expect that much of a difference. I guess AMD is so far behind because they only have these features in software, without dedicated hardware.
Since the 4080 Super and the 3080 Ti have the same CUDA core count, I decided to underclock mine. I noticed that if I underclock my 4080 Super to 3080 Ti clocks, it performs no differently in rasterization than the 3080 Ti I once had, in benchmarks. I think there are no IPC gains across generations besides clock speed lol. But the 4080 Super becomes an extremely efficient 3080 Ti that pulls 3060 Ti power.
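For what it's worth, that observation lines up with the napkin math: theoretical FP32 throughput scales as 2 × cores × clock, so at matched core counts and matched clocks the numbers come out identical. A rough sketch (the core count is from the public spec sheets; the boost clocks here are approximate):

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: 2 FLOPs (one FMA) per core per cycle."""
    return 2 * cuda_cores * clock_ghz / 1000

CORES = 10240  # both the 3080 Ti and the 4080 Super have 10240 CUDA cores

stock_3080ti = fp32_tflops(CORES, 1.67)   # ~34 TFLOPS at the 3080 Ti boost clock
stock_4080s = fp32_tflops(CORES, 2.55)    # ~52 TFLOPS at the 4080 Super boost clock
underclocked = fp32_tflops(CORES, 1.67)   # 4080 Super held at 3080 Ti clocks

# At matched clocks the theoretical throughput is identical, which is
# consistent with seeing no rasterization IPC gain between the two cards.
print(stock_3080ti, underclocked)
```

This ignores architectural differences (cache, scheduling, RT/Tensor changes), which is exactly why raster benchmarks at matched clocks are an interesting test.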
Great video as usual, but I was really hoping you could've done a direct comparison to the 3080 Ti, as I have the EVGA FTW3 3080 Ti Ultra, which is probably one of the strongest of those GPUs and even beats the 3090 in many games. I've been watching many benchmarking and review videos to help determine whether upgrading to the new 4080 Super would be worth it (that also depends on what I could actually get money-wise for my 3080 Ti).
How does this memory bandwidth thing on the 40xx series affect performance? Also, the 4080 uses less power than the 3080/Ti. What's the catch? I assume you ran the tests without any undervolting.
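On the bandwidth question: the headline numbers actually went down slightly from the 3080 to the 4080-class cards (narrower bus, faster GDDR6X), and the Ada generation compensates with a much larger L2 cache, so fewer accesses hit VRAM at all. The spec-sheet arithmetic, as a quick sketch:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bus width in bits times per-pin data rate, over 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

bw_3080 = bandwidth_gb_s(320, 19)   # RTX 3080 10GB: 320-bit bus, 19 Gbps GDDR6X -> 760 GB/s
bw_4080s = bandwidth_gb_s(256, 23)  # RTX 4080 Super: 256-bit bus, 23 Gbps GDDR6X -> 736 GB/s
print(bw_3080, bw_4080s)
```

The big cache hides the lower peak number at 1080p/1440p; it tends to matter more at 4K, where working sets outgrow the cache.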
The problem is that Nvidia knows they can sell their video cards for outrageous prices and people will pay. Look at the $2000-$2500 RTX 4090s selling out all over the place. So the 5080, when it's released, will probably MSRP at $1200-$1500 and then be even MORE expensive from third-party vendors, with the 5090 sitting at over $2000 MSRP. This is what should be expected for new video card pricing.
I have an EVGA 3080 but I'm looking to upgrade to the 4080 Super. I got ripped off on the 3080 because of scalpers and COVID prices, but I can always sell my 3080 and put the difference toward the 4080. Is it worth waiting for the 50 series? Not really, because that's only if you're able to get your hands on one. If not, by that time you're going to tell yourself to wait for the 60 series lol.. Who wants to wait another 6-10 months for one?
If you listened to all the reviewers about how the 4080 Super was something to be upset about, you most likely didn't realize it was completely sold out and going for a 60% markup from resellers.
I bought my 3080 at launch and have not had a single issue with it. I planned on skipping at least one generation. The next upgrade I do will most likely include a new motherboard, RAM and CPU.
Can I ask, since I have a 4080 Super: in Avatar at 1440p ultra it says you're using no upscaling, but the only way my results match yours 1:1 is if I use biased upscaling at Ultra Quality. If I disable that I lose 6-10 fps, but if I change from biased to fixed Ultra Quality I hit near 100fps. The difference is that fixed uses native res when upscaling, while biased uses your native res and additionally tries to aim for 4K.
Is paying $600 over a used $400 10GB 3080 worthwhile to get a 4080 (S) for playing at 4K? I'm a bit conflicted. On one hand the 3080's value is really good, and while its performance isn't the best anymore, it's still a very capable card. The main issue for me is the lack of VRAM, and really the fact that I play at 4K. If I played at 1440p I would have gotten the 3080, no questions.
At 4K, the 3080's compute power is really getting pushed. Yeah, you'll run into VRAM issues occasionally, but the 3080 can't really push 4K ultra anymore in modern games even when it has the VRAM to. You'd have to use some form of DLSS/FSR for that. But imo, if you're playing older games or are fine dropping to medium for 4K, the 3080 10GB is absolutely fine. It makes zero sense to pay 2.5x more for just 1.5x more performance when they're both barely getting by for modern 4K gaming at ultra. Maybe just look at benchmarks of the games you play to see if the 3080 can hit 4K 60fps with some tweaked settings, and if it can, get that. Then you can probably sell the 3080 later for ~$300 when the RTX 50 series comes out this year for a proper upgrade, should you want a $1000-class GPU. The 4080 Super will have depreciated too much in value by then.
I just can't believe the sales of this card. It's sold out everywhere and keeps selling out as soon as it gets in stock. Meanwhile there are plenty of 4070 Ti Supers and 4070 Supers collecting dust on shelves now.
Probably a lot fewer 4080S units were made initially to test the market. Or it could be everyone saw the mediocre 4070 Ti Super benchmarks and decided the extra L2 cache was worth the extra CASH!
Daniel, I have a question! ~ 21:50 you can see the comparison of FG On / FG Off for Rogue City. Why is the overall PC latency higher with FG off? Somehow, this doesn't make sense to me.
The 3080 ti (and crypto) was the beginning of our current pricing problems, but at least it had 90% of the cores of the 3090. The 4080 has got us smoking on both ends
Companies always do this when making performance claims, especially when they compare to their previous generations. Display companies have been doing it for a while too: they'll claim 2500 nits peak brightness when in reality it's hitting that in a 5% window in vivid mode with every setting maxed out, for less than a few minutes.
I bought a 3080 Ti for £500 brand new and genuinely, it runs things pretty well. That high end GPU was the only one I could get in a crazy market in 2023. I don't think going to a new GPU is really worth it. If I do upgrade, I'm going with AMD.
The 3080 12GB was released in March 2022, the 3080 10GB in 2020. The 3080 12GB was almost a 3080 Ti, with 1000 fewer shaders but the same memory bandwidth as the Ti. The two versions are very different cards other than being called the 3080.
My 3080 in CP2077 with path tracing and DLSS on Balanced was getting 35fps. My 4080 Super that I just got today is getting 105-120. Double the raw mathematical power? No. Double the gaming frame output? Absolutely, easily, by far. Worth the upgrade vs waiting for the 50 series? No, of course not. Unfortunately, EVGA quality control really fell off at the end of their lifespan, so my 3080 had 2 dead fans when I decided to bite the bullet and move from a 13900K to an X3D build. You can say what you want about frame generation, but other than the increased input lag, I guarantee you won't notice any issues. Any game that requires fast enough inputs to justify turning frame gen off ALREADY runs at 400+fps these days.
Yeah, there's little difference between a 12GB 3080 and the Ti. In fact the FTW3 Ultra 12GB card, with a little boost, would match or beat a stock Ti in some tests. They had the same VRAM and memory bandwidth. It's not a big deal.
@@Battleneter For sure. I did buy one for my one PC on the LAN, but it was because it was over $300 cheaper than the Ti models at the time, and it was a very top end SKU, which was a no brainer. It's not in my main rig now, but it's still very capable, with reasonable settings.
Unpopular opinion: it's fair for them to say it's a 2x increase if they're referring to 4K ray tracing performance, since that's probably what most people who are buying a 4080 intend to use it for.
Great video, and definitely helpful for 3080 owners! I upgraded from a 3060 to a 4080S, although my CPU still needs to be upgraded from an i7 7700 to a 7800X3D to fully play at 4K. At the moment I'm playing at 1080p ultra settings at 75Hz with vsync. I have a G-Sync 4K TV, but with frame gen there is a bit of shimmer; enabling vsync through the Nvidia panel does alleviate that, but there is too much per-game customization. I'd rather have one monitor for gaming, 1440p 144Hz I think. Game customization with monitor synchronization... so many things to consider.
Any GPU that costs more than $500 is pointless. Gaming shouldn't cost an arm and a leg. Gaming is relaxation for most people, and people shouldn't be stressed about it because of how much this stuff costs.
Upgraded from a 3080 Strix 10GB to a 4080 Super TUF and the difference is massive. It's not 2x everywhere, but it is massive. Love it with my 1440p standard and ultrawide.
I use the 13-in-1 for my work laptop as well, works great!
It's closer to 2x a 3070 for 2x the price
And 0 performance/price ratio 😢
So a 2x of a GPU from 3 years ago... for double the price. XDDDDD
But now you can get a 3070 used for a quarter of the price for about half the performance, notwithstanding VRAM-limited situations.
Right on. I have an RTX 3070 FE and I will be buying the Ti Super @@coryray8436
$1k is ridiculous for the performance we see in the most up to date titles.
Agreed. It should be $299!
Recent FSR/DLSS frame gen mods with tweaks for the 30 series also go a long way to countering that 2x claim :P
I just upgraded from a 1080 Ti to a 4080 16GB for the equivalent of $920 USD brand new in Australia. To me it is worth it for sure.
No, you guys keep thinking the typical consumer takes everything written on the box or in promos as the truth. People do their research, and if they're not interested in tech details, they ask their techie friends. The proof? Several 4000 series cards have not sold well at all. Also, the 4080S MSRP is $200 lower than the 4080's, so if sales were so good they would have set it at like $1,299. Stop spreading nonsense about how people don't care and will just keep buying. Ever consider they're only buying because there's really just one other competitor on the market? It's not like shopping for a TV, dude.
"Skip it" generation.
To pay 43% more money over the $700 3080 10GB to get a 50% uplift just doesn't make sense.
Soon this kind of performance will be available for ~$700 with the 5070, and then it will make sense: for the same $700, people will get a ~50% performance uplift.
The $1000 price tag for an 80-class card is just absurd, and Nvidia pushes it higher with each generation. The 1080 was $600; the 2080 and 3080 were $700.
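The value math in this thread is easy to check. Using the launch MSRPs and the ~50% uplift cited above (the uplift is the video's estimate, not an official figure):

```python
price_3080, price_4080s = 700, 1000  # launch MSRPs in USD
uplift = 1.50                        # ~50% faster, per the benchmarks discussed above

price_ratio = price_4080s / price_3080   # ~1.43x the money (the "43% more" figure)
perf_per_dollar = uplift / price_ratio   # ~1.05x, i.e. barely any value gain

print(round(price_ratio, 2), round(perf_per_dollar, 2))  # 1.43 1.05
```

So a 50% uplift at 43% more money works out to roughly 5% better performance per dollar, which is the whole complaint in one number.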
I am assuming you are unaware that the 30 series was the last series manufactured at Samsung's fabs, and Nvidia has now shifted to TSMC, who charge more. Forget about getting a $700 80-series card; consumers will adapt to it. Yes, the 5070 will match the 4080 at $600, but the 5070 won't launch until the middle of next year.
Tbh it's always good to skip a gen; upgrading gen to gen is usually not worth it.
Stupid to upgrade every gen
Will it make sense then? Wouldn't it be a little too late? To wait from 2020 to 2025, five years, for a 50% improvement at the same price? That kind of improvement should be generational, not every two generations.
@@pinakmiku4999 I am assuming that you have no idea about chip prices and chip sizes.
Search Google for a silicon calculator and you will see that the chip for a card like the 3080 costs about $70 and the 4080's costs about $120. Yes, TSMC is more expensive, but the chip itself is so cheap that this doesn't have much impact on the overall price. You also apparently don't know that the 4080's chip is 40% smaller than the 3080's.
Stop fooling yourself and stop excusing corporations - it's just stupid.
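The die-cost claim can be sanity-checked with the standard dies-per-wafer estimate. Everything below is rough: the die areas are public (GA102 ≈ 628 mm², AD103 ≈ 379 mm², which is where the "40% smaller" figure comes from), but the wafer prices and yield are industry guesses, not actual foundry quotes:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic dies-per-wafer approximation with a simple edge-loss term."""
    radius = wafer_diameter_mm / 2
    area_term = math.pi * radius ** 2 / die_area_mm2
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(area_term - edge_term)

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float,
                      yield_frac: float = 0.8) -> float:
    """Per-die cost before packaging and testing; the yield figure is an assumption."""
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

# Die areas are public; the wafer prices below are rough estimates, not quotes.
ga102_cost = cost_per_good_die(5000, 628)    # 3080-class die, Samsung 8nm
ad103_cost = cost_per_good_die(17000, 379)   # 4080-class die, TSMC 4N
```

Under these assumptions the per-die cost lands in the same ballpark as the figures quoted above, and either way the silicon is only a small slice of a $1000 card.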
The problem is game devs poorly optimizing their games. (I don't know if they have any incentives from GPU vendors.) - Jensen: buy more to save more.
We are using 3-4x more GPU power for 1.5x the graphics quality. Remember that RDR2 was released in 2018, when the best GPUs we had were the 1080 Ti/2080 Ti. I really don't see any recently released games that look 3-4x better than RDR2.
Devs being lazy.
RDR2 Doesn't have any sort of ray tracing or any of the demanding UE5 features that a lot of the current gen performance demanding games are being built with. Rockstar is just great at optimizing (for consoles at least) and UE5 needs more work
@@amartinez97 That's the point. Even against all the bells & whistles built into UE5, the Rockstar Advanced Game Engine still somehow looks just as good in a 6-year-old title.
And RAGE was only made because EA bought Criterion, which had RenderWare (the GTA 3 engine).
So RAGE itself is as old as the 360 era.
@amartinez97 UE5 is fine, it's just lazy developers.
UE5 games are capable of running on mobile pretty well.
You know what's nuts? In 2014 the _entry_ level GPU would run you about $100. That would get you 720P/High in most games, and 1080P/High in a ton of games that were incredibly popular like Skyrim, Mass Effect 2 and 3, Deus Ex: Human Revolution etc., though they were obviously 1-6 years old at that point. A high end card would run you $350-$700 and would net you 4K 30 in pretty much everything. Then in 2017 we had an entry level of $230, but this was really more midrange as opposed to entry level, and you could easily get 1080P/High 60FPS+ in just about everything on the market, with both the 580 8GB and 6GB 1060 often getting 1440P/High 60FPS, so we had a big resolution bump from entry level 720P, and 60FPS became almost standard for a huge part of the player base who had been relegated to 720P or 900P if they wanted 60 frames per second along with medium graphics settings. At the high end we of course had the $500 1080 ($700 for its first two months on the market, at which point sales dried up; Nvidia knew there was a problem with Vega coming out, so the 1080Ti was on the way while the 1080 received an almost immediate $200 price drop). Now we had 4K/Medium-High 30-70 FPS on the market for $500, with some pretty darn modern graphics tricks, and 4K/High to Ultra 60 FPS+ (not that many people had 4K 60Hz+ displays at the time, but they existed, so...) for $719.99, along with incredible 1080P and 1440P/high-ultra, crazy-tight frame time, high fps gaming. This was incredible. Let's jump forward 6.5 years this time instead of three.
1440P cards are now running people... $1000 USD? Seriously? People are buying the 4080 Super with the expectation of playing at 1440P? That's awful. The max common gaming resolution is... the same as it was 10 years ago; frame rates have pretty much doubled there though, at least in what people expect, but instead of the $700 780Ti it's the $1600 4090. Instead of the $500 780 for Ultra 1440P gaming it's the $1000 4080. No resolution improvements, so yeah, I would hope there was a substantial FPS jump over the course of 10 years along with much better graphics as a whole. Except I'd expect it at the same price. Less, if we were not taking inflation into account.
The worst part is that with the increase in popularity of PC gaming over the last 10 years one would think that the economy of scale would have helped keep prices down, but because of home mining (which became very popular in 2014 oddly enough) and the margins it afforded, PC gamers, the vast, vast majority of whom have never mined a coin on Nicehash, never mind set up a bunch of mining rigs, are being forced to pay far higher prices because miners did. Both companies (Nvidia and AMD) lied to investors and now they have to try to prove it really was gamers paying the inflated prices by keeping the prices stupid high. Sure, you can still get a great prebuilt HP with a 7800XT for $1000, but good luck building one with a current gen CPU. And no, not everyone wants to buy a b450 to flash the bios, put a 5800X3D in etc to build a PC with a 7800X to see if they can squeeze into the $1000 area. Unfortunately motherboard prices for Zen 4 are still atrocious, coming in higher than Intel boards which is certainly a change and contrary to popular belief, I am still seeing people leave PC gaming for consoles. Not because they want to, but because they can't afford PC's that will play the games they want the way they want for what they consider a reasonable amount of money.
The 4080 Super (or 4080) would have been fine had it been released at $999.99 and *DROPPED* in price from there after the first few months, or even the first year, but this is just ugly at this juncture. It would be a lot less frustrating if the 3080s all had 16GB of VRAM, as people who had previously been considered _high-end_ gamers would have had a reasonable $700-$800 upgrade path this generation, but we didn't. Not from Nvidia, and not from AMD. Nvidia is making fewer 80-class cards than I have ever seen and they are still having a hard time selling them. The few people in retail I've talked to said they received a minuscule number of 4080 Supers, which would explain why both Newegg and Amazon don't have any under $1100 atm, with Newegg bundling them with 7900X3Ds for $1399.99-$1484.98. I understand that the demand for the 4090 would seemingly push demand for the 4080/Super higher, but I am also pretty sure that the low number of units shipped is a pretty large factor. If that's only pushing models sold by Newegg and Amazon (not third parties) to $1100-$1300 for custom cards, with the way their algorithms work, they still can't be selling all that fantastic. Lol, I've never rooted for a company I own shares in to do shitty before, but I really can't wait for the AI craze to settle down. I'm sure Google, Meta, Microsoft and a few other large players will be just fine, but once the literal thousands upon thousands of AI upstarts start to fall off I'm sure demand will fall accordingly.
At that point what happens though? Will gamers be expected to make up for the lost margins again? That didn't work all that well this gen, 4070 Supers and 4070 Ti Supers are already selling for below MSRP (like $10-$30 bucks via instant savings/rebates, nothing substantial or official from Nvidia mind you) I'd be hard pressed to believe either AMD or Nvidia would think that gamers would, as it seems they showed that they were not willing to, or were unable to this generation. At the same time, if AMD doesn't have a high end offering, topping out at, let's say the 5070 Ultra Ti Super Mega GT level, will Nvidia even bother with 80 class cards at $1200 bucks if they are not going to sell? Or do they basically paper launch them, only produce a couple of thousand for each major market, using the rest of their silicon for 5090's and the smaller dies they can still hold 60%+ margins on after the card is made? Once again leaving everyone who was previously in that $700-$800 area with nothing worth buying as I assume the regular 5070 will Be around $800, probably come in 15% faster than the current 4070 Super, and essentially be a Super Ti side grade.
Ahh well, I suppose all we can do is wait at this point. I have a $2K budget for next gen so _hopefully_ it won't affect me, but if the 5090 comes in at $2200 I'm not touching it. I'm also not spending $1000 on 4080 Super levels of performance. I'll buy a used 7900XTX or 4090 if that's the case. It'd be great if everyone went that route, but then Nvidia would probably just delay the launch until Q4 2025, because my goodness are they stubborn, and with the size of the war chest I'm sure they have sitting in Jensen's leather jacket closet, they can certainly afford it.
The RX 6800/XT and 7800 XT are still good 1440p cards for ultra settings (some games only high-to-ultra though), and the 6700 XT/6750 XT and 7700 XT are really good entry-level 1440p cards. On Nvidia you have the... uhh, 3080 and 4070? But all in all you can still buy a midrange GPU for 1440p.
That's called the cost of making the GPUs going up. Same as fuel, food, etc. Prices increase and that's how it works.
@@justinwalker5237 It's also called margins going from 23% to 60%+. Have you looked at their investor notes? Considering your response I am going to assume the answer is _"No."_ I do mention inflation in the silly rant novel I wrote, and stated that $999.99 at release would have been acceptable _(to me)_ with prices dropping or going up based on demand. Nvidia *HOLDS BACK* units, cuts production of models etc to artificially keep prices higher than demand would normally allow. With AMD not even attempting to compete, instead, following Market Leader Monopolistic practices which should not be allowed under US law, instead of being fined 3 or 4 billion USD, we look the other way. That's a problem. A problem that most likely would not exist had Obama not signed the bill allowing corporations to donate as much money as they would like to PACs and receive tax credits for donating to a non-profit without repercussions. You and I? We can only get credit for $2300 or so. Nvidia? They can get tax breaks in the hundreds of millions.
So sorry, my friend, it runs a lot deeper than _"It's more expensive to make the cards now so the price goes up."_ A lot, lot deeper.
@@justinwalker5237 That's not really how it works. The only thing that forces a price to increase is an increase in the labor cost to produce that good or service. Most items do not see 50%+ labor cost increases every 10 years lol. In capitalism, most price increases are voluntary margin increases. Then they eventually have to pay their labor more, because when enough companies raise margins, labor (who are also the customers) may also receive a pay increase. But yeah, most things tend to get cheaper over time to produce as you get BETTER at making them, since most resources are actually quite abundant.
I bought my 1070 at release in Europe for $500.
20 series: forced ray tracing on consumers even though only poorly implemented in a couple of games. Saved by the start of crypto mining.
30 series: better ray tracing performance, more games support it. Crypto mining and chip shortages led to stupid scalper pricing.
40 series: nvidia is now the scalper, but no crypto mining to save them.
See one of the comments above... 4080 Supers are OOS everywhere and are only found from third parties selling for $1,500 and up. Scalpers are scalping on TOP of Nvidia.
@@1x4x9 As far as I know, the price of the RTX 4080 Super is high because Nvidia itself is limiting supply to graphics card retailers. They are trying to gauge how much users are willing to pay for their products after the disastrous sales of the non-Super RTX 40 series.
@@wwk-1978 Not many. How many users will pay for a $700+ card? Very few.
Very dumb move by Nvidia.
They are about the same price now, right? A $134 difference from the 7900 XTX. I got the 4080 Super.
They don't need crypto; they have AI that needs the chips now.
We've reached the fourth day of the 4080Super's release. As some of us predicted (well, as I predicted), the 4080 Super has basically gone OOS at retailers and is being scalped by third party sellers starting at around $1500. So much for its MSRP.
Here in Germany it is still available, though more like €1150. Still cheaper than the 4080 before the Super release.
Keep wishing. This card is available at the ridiculous $1k MSRP if you actually want it. It would be a terrible investment for a scalper, period.
@@bencio22 And still below MSRP when you include tax.
You can get one with little effort. I was able to purchase 4 of them without hassle, you just have to keep checking.
I cancelled 3 of them, since I got the one I wanted.
@@zed0k 4 of them for what? And which model did you go with? Which site?
Imagine having bought a 3080 10GB and now getting bottlenecked in 1440p.
I called it years ago. There's a reason the GTX 1080 Ti has 11GB of VRAM, and for a long time it was a 1440p card. Recently it's become a 1080p card because of the lack of GPU horsepower.
Meanwhile the 3080Ti is entry level 4K.
They're both excellent 4K cards. Just turn settings down to avoid being bottlenecked.
What bottleneck are you talking about?
@@saricubra2867 Nah, entry-level 4K is the 4070 Ti Super.
@@yves1926 VRAM capacity bottlenecks in certain games, because 10GB is not enough.
The best performance jump for me was upgrading my RX 580 to an RX 7800 XT.
GTX 970 to 1080ti was nice as well for me lol
R9 270x to 1080 ti
GTX 1070 to RTX 5000/RX 8000(assuming it could be RTX 5070/RX 8700 XT) will definitely be my best one when it happens.
1050ti to 2080ti
Did the same, from an RX 580 8GB to an RX 7900 XTX. Honestly, my mind is pleased, but I don't really know why I upgraded, since I've only played Stardew Valley and a bit of Mass Effect since I got the 7900 XTX, and my old RX 580 was doing just great with those games. Future-proofing, I guess.
Using FG to claim an "x times performance increase" should be against consumer law. Pretty sure in the UK this would breach consumer law, because it's not actually improving performance; if you took this to the ASA they would probably slap Nvidia hard for that marketing. Even Nvidia are resorting to FG to sell their new GPUs. Let's imagine for one second that the 3000 series had FG: Nvidia wouldn't be able to make such claims about the 4000 series and make them look appealing to buyers. It's scummy af and should be illegal.
And additionally, not all games offer Frame Generation. They should at least call it "up to 2x the RTX 3080 Ti", with a footnote that it only applies to games with FG enabled.
lol only if you’re okay with AMD getting in legal trouble for marketing AFMF, the worst frame gen iteration, as a performance improvement
B- b- but Ayymd.. every, single, time 😂 @@canopylions
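The objection in this thread boils down to FG doubling displayed frames without doubling simulated ones, while input responsiveness still tracks the rendered rate. Purely illustrative numbers:

```python
rendered_fps = 50                   # frames the engine actually simulates per second
displayed_fps = rendered_fps * 2    # FG inserts one generated frame per real frame

render_frame_time_ms = 1000 / rendered_fps    # 20 ms: roughly what your inputs feel like
display_frame_time_ms = 1000 / displayed_fps  # 10 ms: what the fps counter shows

print(displayed_fps, render_frame_time_ms)  # 100 20.0
```

So an fps counter reading "2x" describes motion smoothness, not a 2x faster simulation, which is why calling it a plain performance increase is contentious.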
I made the jump from a 3080 to a 4080 Super. I can finally play Cyberpunk on my 4K monitor with a high frame rate. Worth it.
My card comes in next week! It’s literally why I made the jump. Cyberpunk. David should have lived.
I think nVidia mistook double the performance for double the price :D
I like your tests, thus I watch them. But maybe I can offer one piece of advice, even if the Nvidia "test sessions" are kind of over the way I see it. I also watch other tests, which are just gameplay GPU performance runs (at least some of them), and here's the thing: in most of them, I notice they use a different graphics API. Namely, instead of DX11 or DX12 (I'm not sure which one this game offers, if not both), the other testers chose the Vulkan option at 1080p (and possibly the other resolutions), and that allows the high-end GPUs to hit 200+ FPS at 1080p, which I think is a more realistic representation of 1080p FPS for a "light" (for modern times, at least) game on a high-end or decent GPU with the rest of the appropriate components.
The 40 series, the Super refresh included, was the consumer's opportunity to let Nvidia know that we will not stand for them becoming the scalpers. We should have let the entire series rot on the shelves so they would get the message for the 50 series, but people are their own worst enemies.
Perfectly summed it up. At this point you can't even blame Nvidia anymore, it's just the people being people, being stupid.
Consumers bought all the RTX 40 cards by the hundreds of thousands; sorry, your little boycott went nowhere. 1 in 10 gamers already had an RTX 40 card before this refresh. Cope harder.
The 4080 Super is only 2 fps better than the normal 4080 and it's called "Super" lol. Also, a card at $999 is way overpriced; an xx80 card should be $699.
Nvidia has been underpricing their cards. They sell out immediately.
@@Amzyy That's the price you want it to be. The 4080 Super wouldn't even be able to be kept in stock at $800. Cards cost more than they used to. Some of it is inflation, some is increased manufacturing costs, and of course some is greed, but not as much of it is greed as you think.
My EVGA FTW3 Ultra 3080 12gb is doing great paired with the 7800x3d.
I got a ventus 3080 10gb and I just bought a 7900xtx 😂
The 3080 is a beast though with my 7800X3D. I just wanted a little more power; for the price it wasn't the best decision, but hey, you only live once lol. And yeah, $1k is crazy, but that's why I work hard hahaha. Happy gaming!!!! The 3080 with that CPU is a beast nonetheless ❤
I have the 3080 Ti version with a 7800X3D and it beasts through everything. I'm skipping this generation.
@@tbreeze79 I’ll be skipping the next gen 🤣🤣🤣 I’ll be alright lmao
@@tbreeze79 That's a beast combo. That CPU is one of the best CPUs out there strictly for gaming. There are better ones for work and editing.
The real problem with most current titles not hitting 60fps+ at 4K is Ray Tracing/Path Tracing, or CPU bottlenecks like in Spider-Man, Hogwarts Legacy, etc. But most rasterized games can definitely run above 60fps at 4K on a 4080.
DLSS & FG were created to let players get, or at least feel like they're getting, 60fps+ in RT/PT games, but even with that it can be too demanding.
We'll have to wait and see what Blackwell with a brand new architecture and then Vera Rubin will bring to the table, but as of now even a 4090 would need 2-3x performance to hit 60fps+ at Native 4K on Cyberpunk 2077 with Path Tracing... 😬
The thing is that RT is the only selling point of Nvidia at this point, AMD offers much better value for rasterized performance
@@Bazylchuk_UA For sure! The 7900 XTX at $1000 is a much better deal for Rasterization than the 4090 at $1600 but that's just the way it is! Let's hope RDNA 5 will be more competitive...
I would completely disagree that people are picking the 3080 and 4080 for 1440p resolution. I bought the 3080 originally for 4K, and most people I see buying the 4080 Super claim they are buying it for 4K. The budget spent is a lot, and you expect the best performance at 4K. Granted, I bought a 3090 soon after I realized 10GB VRAM was limiting my 4K experience, and now I am using a 4090. It is very disappointing that Nvidia has only a single GPU capable of 4K gaming with Ray Tracing.
That UGREEN docking station is gold. I got one because my school district also gives me a MacBook with USB-C only.
Are you me? Lol
Both of you stop using macs! 😅
@@DKTD23 the school hands out work laptops. At least it's not a Chromebook lol
Same here in Germany. The whole school runs on windows, everything, even the servers. But some idiot in the department of education decided that all the teachers get macbooks. but no MS office license with them because that's too expensive. Nothing but problems, many of the older colleagues just left the macbooks in their boxes and put them in the cupboard at home.
Not everyone targets 60fps. I just upgraded from a 10GB 3080 to a 4080 and it's a great upgrade at 2560x1440@120Hz. It's awesome to revisit old titles. I was almost always hitting the 10GB VRAM wall in newer titles; that was my main reason. Console ports from late 2023 and onward will need that 16GB minimum buffer imo.
Thanks for the video. I'm currently on the 3080 12GB. Your video helped me decide to keep using it without upgrading to the 4080S; the only viable upgrade would be a 4090 at the moment.
Stupid to upgrade every gen
Unless you need 4k144 I'd stick with your 3080 12GB , it's still plenty enough (apart from lacking VRAM) .
Nonsense. The 4090 gets an easy ride. It’s even worse value, 60% more expensive than a 4080 for 30% more performance
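The value claim in that comment is just a perf-per-dollar ratio; a quick sketch (the 60% price and 30% performance figures are the commenter's illustrative numbers, not measured data):

```python
# Sketch: relative performance-per-dollar, using the commenter's
# illustrative figures (60% higher price, 30% more performance).
def perf_per_dollar(relative_perf, relative_price):
    return relative_perf / relative_price

baseline = perf_per_dollar(1.00, 1.00)  # 4080 Super as the reference point
halo = perf_per_dollar(1.30, 1.60)      # hypothetical 4090 relative numbers

print(f"4090 delivers {halo / baseline:.0%} of the value per dollar")
# ~81%: by this measure you pay a premium per frame at the top of the stack.
```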
Still rocking my EVGA 2080ti I grabbed for $400 in '22. I tend to wait until I can more than double my performance in similar price ranges for an upgrade. Is there anything reasonable in sight? New or used? Never purchased a brand new card outside of my 660ti way back when. I'm still team green.. somehow.. but I primarily look for stability and raster perf.
Nope, to double a 2080 Ti you would need a 4080/super or 7900xtx. Maybe once the XTX comes down to $800 that would be a decent deal.
3090/4070ti comes close but prolly a 4080 super will double
I personally will wait for the 50 series. Not only with Frame generation but any new tech they probably put into DLSS 4.0. Im still on AM4 so might have to jump to AM5 as well if the 50 series uses PCIE 5.0
Just fyi, pcie 5.0 will make no difference at all in performance, even if you buy a 5090 (as long as it has all 16 PCIe 4.0 lanes available).
I'll try using the 5090 with the 5800x3D, should be fine for a while for my needs.
A platform upgraded may be needed at some point for additional CPU performance, but can wait.
@@lorsch.I have a 5800X3D as well. I fully expect it to bottleneck a 5090. I think we'll need Zen5 to really let it run free.
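For context on the PCIe point: the raw link bandwidth per generation follows directly from the spec rates (16 GT/s per lane for PCIe 4.0, 32 GT/s for 5.0, both with 128b/130b encoding). A sketch of that arithmetic:

```python
# Theoretical x16 link bandwidth per PCIe generation.
# GT/s per lane * encoding efficiency (128b/130b) / 8 bits-per-byte * 16 lanes.
def x16_bandwidth_gbs(gt_per_lane):
    return gt_per_lane * (128 / 130) / 8 * 16

pcie4 = x16_bandwidth_gbs(16)   # ~31.5 GB/s
pcie5 = x16_bandwidth_gbs(32)   # ~63.0 GB/s
print(f"PCIe 4.0 x16: {pcie4:.1f} GB/s, PCIe 5.0 x16: {pcie5:.1f} GB/s")
# Current GPUs don't saturate even the 4.0 figure in games, hence
# the comment that 5.0 makes no practical difference today.
```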
My 3080 10GB died a few days ago, but I'm not convinced about getting Nvidia again. Looking at the 7900 XTX when the price drops.
Lol these comments are such lies.
Sorry to hear that. Mine is still running but I just pulled the trigger and got me the 7900xtx along with my 7800X3D I can’t wait.
@@xbox360J how does it perform? looking to get the exact cpu and gpu
@@grim8511 it performs exceptionally great but I did have some issues with the drivers unfortunately so I ended up returning it and got my money back and I went with a 4090 😭 I was so fed up hahahahaha
In terms of raw power and the fps it’s really amazing specially at 1440p.
best part about owning a 3080 that I bought used for 500 6 months ago is..... I can straight up skip the 40 series and not deal with scalpers. Everyone should just buy last years tech used for half the price... Most of the games I play are fine at 1440p above 100 fps without RT.
It's not available in all countries. The word "everyone" doesn't apply to everyone.
I don't know why Alan Wake 2 is used as the gold standard. It's all greyscale, and so dark; how can you make out detail?
by having a monitor with a high contrast ratio and an accurate eotf / gamma curve. makes a difference.
Have a 3080ti. See no reason for upgrading at all. Lossless scaling is a lifesaver for games that are poorly optimized. GPU market has gone to shit. Newer GPUs use a bestial amount of watts (as do CPUs) and aren't actually that much faster. Feels like they are pushing bigger numbers as if they were overclocking older hardware.
Same situation happened with the rtx 3080 being 2x faster than the rtx 2080. Which was BS.
Looks like Jensen needs to learn to count again.
I upgraded from an RTX 3080 10GB to an RTX 4080 Founders Edition and the performance improvement on my i5-13600KF was massive and very noticeable at 1440p. Instead of getting 60-90 fps, I am now getting a locked 120 fps in the majority of games I play, and with Frame Generation and the framerate cap off I see many games hitting the 158 fps cap of my 165 Hz display. I can't wait to replace my ageing 27" 165 Hz IPS monitor with a 360 Hz Dell QD-OLED monitor sometime this year, once the new models are released here in the UK, but the upgrade is more for the OLED goodness, so I can finally say goodbye to IPS glow and weak blacks plus have HDR, rather than for the higher refresh rate. That said, the high refresh rate will allow games with Frame Gen to exceed the current 158 fps limit of my display, which for me is a bonus.
It’s a dope upgrade. People just tend to hate because they haven’t done it 🤣🤣🤣 enjoy your extra performance and gaming too. I just upgraded from a 3080 10gb which I love and it’s a beast at 1440p to a 7900xtx sapphire nitro+ 🫶🏻🫶🏻🫶🏻
Old news. Nvidia GPU's have been known for years to have 30% increase in performance for a 60% increase in cores between generations and a 30% increase in performance within generation. People who know computers know when to buy and what to buy. 😊
Are you doing a 70s vs 70tis vs 80s video ?
Yes, I'm planning on doing a video of the whole super lineup and looking at the value and performance scaling.
@@danielowentech noice ! 4070ti super ftw
@@danielowentech That would be a good one for sure
@@danielowentech Any chance you can include the regular 4070 in on that since its a part of the line up?
Graphics cards need more VRAM, like 32GB or even 64GB. The extra memory would improve speed and allow bigger, more heavily textured scenes.
I had the 3080 10GB and made the jump to the 4070 Ti because of the extra VRAM. The additional VRAM made a difference in MSFS because I'm easily hitting over 11GB in most scenarios. Granted, Microsoft is coming up with a memory optimization update which might fix this issue, but for me the upgrade made sense. Also, the FG mod on the 3080 worked OK as long as your baseline FPS are good. If you're only getting 20FPS or hitting the VRAM limit, you're going to get stutters with or without FG.
Thanks this made me feel a bit better about purchasing a 4080s
@@denvernaicker8250 Delusion is a method of coping. Maybe the best one.
@@yzfool6639 same difference
@@denvernaicker8250 This made me feel a bit better about purchasing a 24GB 3090 for half the price. It seems like people are realizing how important more memory is to the performance of high-end graphics cards.
@@mrquicky for me its more DLSS3 and frame gen that is helping push 4k 60fps titles at the moment.
Something is wrong with Nvidia. Every driver they release is full of bugs, and the next driver brings more bugs without fixing the previous ones. Their Control Panel still has the same ancient UI and isn't very responsive on newer systems. AMD, on the other hand, is doing its best to satisfy its customers. Never expected this from Nvidia.
Imagine having a $1000+ GPU that can't even do normal fps without gimmicks. 😅
7900xtx 😊
2x performance gain sounds like a bold claim. I think a lot of this comes down to whether you already have a 3080 when buying. I'm coming from a 1650 Ti (or something like that, can't remember the exact name), and since a new computer would set me back a lot of money anyway, I went for the 4080 Super. If I already had the 3080 I wouldn't even be considering upgrading. I doubt a lot of people have that kind of disposable cash to buy a new-gen graphics card as an upgrade whenever a new card comes out; I'd wait at least two generations before even considering it because it's so expensive. I'm very happy with my 4080 Super. It's stupidly expensive, but I really do appreciate the kind of graphical fidelity it can deliver.
Excited to get my 4080 super FE in the mail, been holding out for years and years to build a PC, I've been stuck with a laptop. I really like DLSS and driver support with NVIDIA especially for VR.
Excellent choice man what is your CPU? I would recommend either the i9 14900k, i7 14700k or 7800x3d.
@@ZackSNetwork thanks dude, I actually went with a 7800x3d because I've always rocked AMD cpus and the 3d vcache is new and exciting. I've just been rolling with an r9 380 GPU until I get my 4080 in the mail, I've had it since launch in my closet lol
Oh man, going from a laptop to a high-end desktop feels amazing. I still remember when I first made that switch. Complete night and day difference. Enjoy man.
just got a new gaming laptop. Already tired of it. I'm building a system based on 4080 super and 5700x3d
@@aeromotive2 Very nice, I feel that having an upgrade path is a big deal. Also can't stand the hot temps on laptops, it limits performance a lot.
I think 50xx being 2x the 30xx series is generous. Looking at the 40xx vs. my 1080, I can "only" expect to get about double the performance now at 4K vs. what I get with my 1080. An interesting thing I keep seeing with the few 1080 benchmark videos I can find is they seem to get lower performance with seemingly faster CPUs than I do. I tend to run without any SSxx or AA, so this is perhaps the difference. The point remains that for me at least, based on all the data I can collate, a 4080 Super in most situations will yield 2x the performance I see with my 1080 at 4K.
If the 50xx would yield 2x the 30xx performance, well... I'm definitely waiting, as that would be 4x the performance, but alas, I don't think this will happen. I think it will be 2.5 to 3x faster. As it is now, the 4090 seems to get bottlenecked by even the fastest CPU, and then depending upon which games you run, even the 4080 Super gets CPU limited (certainly at 1440 or below).
I think we're starting to hit a raw performance ceiling more generally with current architecture. I would love to be wrong, but we are currently already seeing this with CPUs. I think we're not far off with GPUs now, if you consider the 4090 already shows signs of this.
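The "wait for 4x" reasoning in the comment is just compounding per-generation multipliers; a sketch (the uplift figures are hypothetical):

```python
import math

# Compound per-generation uplifts into a total speedup across skipped gens.
def total_speedup(per_gen_uplifts):
    return math.prod(per_gen_uplifts)

# If each of two generations really delivered 2x, skipping one generation
# would net 4x -- which is why the commenter is skeptical of the claim.
print(total_speedup([2.0, 2.0]))   # 4.0
# A more believable 1.5x per generation compounds to only 2.25x.
print(total_speedup([1.5, 1.5]))   # 2.25
```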
I just bought a 4080 Super to boost my VR experience. I wonder how Frame Gen makes VR look.
lol, same with me.
Sadly DLSS 3.5 frame gen is not working for VR right now, but you can use frame re-projection which is like the same idea. It doesn't look very good though.
@cherryfruit5492 good to know. Raw performance will still trounce my OG 4070.
Frame reprojection really works with any old gpu and it's best avoided.
Frame generation (FSR3) works on all RTX GPUs now with the mod, so frame generation is a bad argument to buy a 40 series card, besides the incompatibility with VR.
@@eeeeyuke How would you not know that when you have a 4070 already?
Holy crap! I had no idea my 3080 10gb was already turning into a paperweight! What's with all the sub-60fps averages in 1440p? That's depressing!
Great work on all your videos Daniel 😇💪🥳!
Thanks to all the fellow channel supporters, commenters etc for supporting an awesome content creator who provides amazing detailed reviews 😳🤩💪.
I am now super happy for some time to come, having a 3080 Ti 12GB (bought about a year ago for USD 550) with the now-available Nukem and LukeFZ Frame Generation mods for this card! Able to play at 1440p Ultra/High with DLSS Q/RT/PT in all games. :)
All I want is a GPU to max out my modded Bethesda Games at 4k 120 fps+
I don't care that much about FPS, but temps and powerdraw are far more competitive on the last generation.
Daniel, please use Dark Reader extension 🙏🥴
It's funny how the 3080 was the ultimate 4k card when launched and now it can barely hold mid 30's fps in 4k.....something seems fishy...
I have a 10gb 3080 driving a 4k display. The 10gb vram is ticking time bomb that’ll tank the value shortly and exacerbate the performance difference in the near future where 10gb vram is not enough for 4k.
Ampere is 3 years old; its value will tank anyway, especially when the 5000 series releases.
The 3080 never was a 4K card, the 3080Ti and anything above are 4K (entry level 4K).
@@saricubra2867 you must not remember reviews when the 3080 launched, or even when the 2080 ti was hailed as the first 4k card
@@robertofranco4515 The 2080Ti has a pretty bad GPU, not that much faster than a 1080Ti. That alone isn't entry level 4K.
The 3080 doesn't have the GPU horse power of the 3080Ti/3090 for entry level 4K as well.
@@Extreme96PLagreed. Now is the time to sell 3000 series but not necessarily buy this generation.
In fact, it is the same as always. Just choose your price point and upgrade every two generations and you are fine. Game settings are always flexible enough to let you game on the last gen GPU.
Ok, the 3070 was a fail, but the 8gb were unchanged since the 1070, so that was predictable...
'Medium to High is for gaming. Ultra is for screenshots' Tech Deals.
7900 XTX = 3080 12GB in RT performance... kinda awkward for AMD
But for its price it's good. Nvidia has good RT, but their prices are as high as an iPhone's. Nvidia madness.
Nvidia started the RT thing a generation before AMD, so they're still a generation ahead.
@@sorinpopa1442 It's not just RT. It's 16-bit float compute (i.e. AI compute) as well. The card this channel is shitting on is BETTER THAN AMD'S ABSOLUTE BEST LOLOLOLOL
@@zen9793 The 7900 XTX costs $1K at MSRP, the fuck you talking about!?
@@Wobbothe3rd And are you delusional, or just another brainwashed little Nvidia fanboy? A 4070 Ti costs almost $1k in the EU, and a 4080 costs $1350 here!
Nvidia is crazy overpriced! A rip-off!
About 4 years ago I was looking for an upgrade, but I stayed off the GPU market entirely from 2020 to 2023 due to the scalper/crypto era and kept running my old beast, the GTX 980 Ti, until I got an RTX 3070 for 450 bucks in August 2023. Then a 3080, and finally the 3080 Ti for about 650 bucks a month ago. The 980 Ti I bought new for roughly 750 dollars in 2015 (in Norway we pay 25% tax on everything), and that's the most I'm willing to pay for a high-end enthusiast card with an xx80 Ti label on it. It took "only" 3 years for second-hand prices to go back to normal (disregarding inflation, obviously).
Team 3080 12GB 😀
Sucks 😂
It's better than AMDs ABSOLUTE BEST, the 7900XTX is equal to a 3080 in RT and compute. So if the 3080 "sucks" AMDs entire line-up "sucks"
@@Wobbothe3rd Uh, 7900 XTX outperforms 3080 in raster. I have yet to see a benchmark where 3080 performs the same as 7900 XTX in raster alone.
I love this man. He always brings my consoomerism down to reality with these videos. Here in Australia the cheapest 4080 Super is about $1600 AUD, so in order for me to "upgrade" from my 3080 12GB, I'd have to sell my 3080 for about $650, then pay $1000 on top. There's not $1000 worth of upgrade in this. Not even close.
Right now I'm getting the same frame rate on my 4080 Super as on my 3080 Ti.
Great content, as usual
Currently on a 3080 12GB myself, so your video was interesting considering I am willing to upgrade, but not necessarily for desktop gaming (I have a 4K 120Hz OLED TV), rather for PCVR, as the card shows its limits there. So if the 50% uplift materializes in VR as well, then I'm sold.
are you planning to get into VR at some point and do similar comparison exercises? that would be so helpful for me and certainly some of your followers
thanks in advance
I have been playing VR on a 3080 10GB for a few months and upgraded to a 4080 a month ago. The performance increase is noticeable. But I got mine factory refurbished for 660 USD, so the price didn't affect me. If you can get a similar deal, I would say go for it if you want an upgrade right now. The 3080 12GB (and 10GB too) are still pretty great cards, though you can get some stutters in heavy VR games if you don't turn down the headset resolution while playing on high. The 4080 allows a clearer image on ultra in 90% of games.
@Tsukareda This was helpful. Thanks!
But... if you turn on FG on the 4080, why don't you use Lossless Scaling on the 3080???
Would you mind sharing what your settings in nvidia control panel are for these tests? Or if you have them anywhere in a link 🙏 love the benchmarks as usual
Thought about waiting for 50 series to upgrade my 3080 10GB, but it does not seem like it will be coming anytime soon (not to mention bots/scalpers/etc).
The latest rumors suggest the end of 2024 for the 5090, so I would say January or February 2025 for the 5080. Did you wait since your last comment?
This is a very useful video, better than Gamers Nexus, who seem to focus on 1080p and 1440p and just skim over the 4K RT tests, which is what us enthusiasts who buy these cards and are actually invested in these videos are running.
Very informative and thorough as usual Daniel. I've been trying to decide between a 4080 super and a 7900 xtx but I think I'll just keep my 3080 Ftw3 for a little longer. I'm running a 1440p ultrawide and haven't had any issues at all playing modern titles, even with the 10gb version. I'll be ready for a 5080 though, or maybe a Battlemage if Intel maintains their current trajectory.
Generally when I'm upgrading GPU I don't care how much faster it is, I care about breakpoints. For example, the 3080 isn't good enough for 4k gaming, it struggles. It struggles HARD in Cyberpunk2077 with RT on DLSS performance. Also, the 40-series has more features that further boost fps. The minimum to make that game playable with RT on requires a 4080. Starfield and modded Skyrim (4k) also struggle at 30-40fps with a 3080. My CPU is an AMD R7 7800x3d. The difference between 30 and 60 fps is massive compared to 60 fps to 90 fps.
A lot of people say to wait for the 50 series cards. But when they come out, do you think they'll be better values? I really doubt anything will be under $1000.
Even if they release the 5080 for $1200 again, if it does at least 20% better than a 4080 Super (which is very likely, if not more), you'd have about the same performance-per-cost ratio. Though they may finally give the 80 series 24GB of VRAM, which would help the card in the long run.
However, the 5080 will surely have the DLSS 4.0 stuff (the 4080 could get it as well, but may miss out on the generational jump in improvements, like from the 20 to 30 series), plus whatever new thing they may introduce that may or may not come to the 40 series.
Plus, it's less than a year away. Holding onto the 4080 Super money and saving $200 more by then shouldn't be an issue if $1k for a GPU isn't an issue now. And this is just factoring in the same tiers: a 5070 would be roughly 4080 performance for significantly less (maybe the 5070 lands at $700), and still have the benefits of the 50 series.
The biggest issue is whether they'll release the 5080 in the same price range as the 4080 or not. They'll likely use the 4080's $1200 retail price to market the 5080's "value." They sure as hell hinted at it when comparing the 4070 Super to the 3090.
TL;DR: if price-for-performance stays roughly the same, and you can wait, the 50 series could provide more value. It could even cost less to get the same 4080 performance by waiting for the 5070.
Why do people seem to love DLSS? For me it makes my 1440p monitor look like 1080p.
So glad I bought the 3080 12GB back when it came out. This will last a very long time judging by the direction Nvidia is going.
I agree. These corporations are ridiculous with these claims
As a 4080 owner the “2x speed” is advertising frame generation. A lot of games support it now, 100% worth it if you want max raytracing at 1440p-4k
It's a "nice to have", but calling it 2x faster is a lie. It's not doing the same thing 2x faster, it's basically cheating off its own homework to give twice the output at a degraded quality. It's like two cars racing and one finishes twice as fast, but ends up at a different finish line.
@@jasonhurdlow6607 there really isn't any visual problems. Even the increased input latency isnt noticeable. It's definitely a setting I will always turn on if its an option. If you compare 7900xt with FSR vs 4080 with DLSS and frame gen on cyberpunk with max settings 1440p it is litterally 40fps vs 100+fps
@@calebadamu I really hope you meant to write 7900xtx not 7900xt. Does make a significant difference
@@thomasriess9208 its still 40fps vs 100fps with 7900xtx. There is a mod now that lets you use frame generation on any gpu and that will get you 60fps on 7900xtx but I heard the mod can cause crashing and there's some visual problems that regular frame gen doesn't have
@@calebadamu ok, very interesting. I did not expect that much of a difference. I guess AMD is so far behind, because they only have these features in software without dedicated hardware
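A rough way to see both sides of this frame generation argument: presented frames roughly double, but input is still sampled at the base render rate, plus the interpolation has to hold a frame back. A sketch with a simplified, illustrative latency model (not measured figures):

```python
# Frame generation inserts one interpolated frame per rendered frame,
# so presented fps roughly doubles. Input latency still tracks the base
# render rate; holding back a frame for interpolation adds ~1 frametime.
# This is a simplified illustrative model, not benchmark data.
def with_frame_gen(base_fps):
    presented_fps = base_fps * 2
    base_frametime_ms = 1000 / base_fps
    latency_ms = base_frametime_ms * 2   # ~1 extra frametime of hold-back
    return presented_fps, latency_ms

fps, latency = with_frame_gen(60)
print(f"{fps} fps presented, ~{latency:.0f} ms render-side latency")
# Smoother motion, but responsiveness stays closer to the 60 fps baseline --
# which is why calling it "2x faster" is contested in the thread above.
```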
Hey, you need to test lossless scaling frame generation, it works with any game, video or emulator
Great video. The comparatively low wattage of the 40 series is amazing. I hope Nvidia and AMD continue to work on this.
Since the 4080 Super and 3080 Ti have the same CUDA core count, I decided to underclock mine. I noticed that if I underclock my 4080 Super to 3080 Ti clocks, it performs no differently in rasterization than the 3080 Ti I once had in benchmarks. So there seem to be no cross-generation IPC gains beyond clock speed lol. But the 4080 Super becomes an extremely efficient 3080 Ti that pulls 3060 Ti power.
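That observation fits a simple first-order model where raster throughput scales with CUDA cores × clock. A sketch using approximate boost clocks (both cards have 10240 CUDA cores; this ignores memory bandwidth, L2 cache, and architectural differences, so treat it as an illustration of the comment, not a benchmark):

```python
# First-order raster throughput model: cores * clock.
# Clocks are approximate rated boost clocks; real performance also
# depends on memory bandwidth, cache, and architecture.
def relative_throughput(cores, clock_ghz):
    return cores * clock_ghz

base = relative_throughput(10240, 1.67)    # ~3080 Ti boost clock
stock = relative_throughput(10240, 2.55)   # ~4080 Super boost clock

# With identical core counts, the model attributes the whole stock-vs-stock
# gap to clock speed -- consistent with the commenter's underclocking test.
print(f"Clock alone predicts ~{stock / base:.0%} of 3080 Ti throughput")
```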
Great video as usual, but I was really hoping you could do a direct comparison to the 3080 Ti, as I have the EVGA FTW3 3080 Ti Ultra, which is probably one of the strongest of those GPUs and even beats the 3090 in many games. I've been watching many benchmarking and review videos to help determine whether upgrading to the new 4080 Super would be worth it (that also depends on what I could actually get money-wise for my 3080 Ti).
How does the memory bandwidth of the 40xx series affect performance? Also, the 4080 uses less power than the 3080/Ti. What's the catch? I assume you ran the tests without any undervolting.
The problem that will occur is that Nvidia knows they can sell their video cards at outrageous prices and people will pay. Look at the $2000-$2500 RTX 4090s selling out all over the place.
So a 5080, when it's released, will probably MSRP at $1200-$1500 and then be even MORE expensive from third-party vendors, with a 5090 sitting at over $2000 MSRP.
This is what should be expected for new video card pricing.
Why so much hate on the 3080ti? I absolutely love mine. It wrecks every single game I have played at 4k
I have an EVGA 3080 but I'm looking to upgrade to the 4080 Super. I got ripped off on the 3080 because of scalpers and COVID prices, but I can always sell my 3080 and pay the difference for the 4080.
Is it worth waiting for the 5 series? Not really because that’s only if you’re able to get your hands on one. If not, by that time you’re going to tell yourself to wait for a 6 series lol.. Who wants to wait another 6-10 months for one?
I got my 4080 for 660 USD (factory refurbished) and I'm not disappointed at all with this GPU at that price. (Upgraded from a 3080.)
If you listened to all the reviewers saying the 4080 Super was something to be upset about, you most likely didn't realize it was completely sold out and going for a 60% markup from resellers.
i bought my 3080 at launch and have not had a single issue with it. I planned on skipping at least 1 generation. Next upgrade i do will most likely include a new Motherboard, RAM and CPU.
Can I ask, since I have a 4080 Super: in Avatar at 1440p ultra the video says you're using no upscaling, but the only way my results match yours 1:1 is if I use biased upscaling at ultra quality. If I disable that I lose 6-10 fps, but if I change from biased to fixed ultra quality I hit near 100fps.
The difference is between fixed (which uses native res when upscaling) vs biased (which uses your native res and additionally tries to aim for 4K).
Is paying $600 over a used $400 10GB 3080 worthwhile to buy a 4080 (S) for playing at 4K? I'm a bit conflicted. On one hand the 3080's value is really good, and while its performance isn't the best anymore, it's still a very capable card. The main issue for me is the lack of VRAM, and really the fact that I play at 4K; I mean, if I had a 1440p display I would have gotten the 3080, no questions.
At 4K, the 3080's compute power is really getting pushed.
Yeah, you'll run into VRAM issues occasionally, but the 3080 can't really push 4K ultra anymore in modern games even when it has the VRAM to. You'd have to use some form of DLSS/FSR for that.
But imo, if you're playing older games or are fine dropping to medium for 4K, the 3080 10GB is absolutely fine.
It makes zero sense to pay 2.5x more for just 1.5x more performance when both cards are barely getting by for modern 4K gaming at ultra.
Maybe just look at benchmarks of the games you play to see if the 3080 can hit 4K 60fps with some tweaked settings, and if it can, keep it.
Then you can probably sell the 3080 later for ~$300 when the RTX 50 series comes out this year for a proper upgrade, should you want a $1000-class GPU. The 4080 Super will have depreciated too much in value by then.
I just can't believe the sales of this card. It's sold out everywhere and keeps selling out as soon as it comes in stock. Meanwhile there are plenty of 4070 Ti Supers and 4070 Supers collecting dust on shelves.
Probably a lot fewer 4080S units were made initially to test the market. Or everyone saw the mediocre 4070 Ti Super benchmarks and decided the extra L2 cache was worth the extra CASH!
Is the 4080 super worth upgrading from a 3080 12gb?
id wait for the 5000 cards
Would it be worth an upgrade to jump from 3070 to a 4080 super ?
Wait for the next gen! It's coming this year.
I know! My 8GB VRAM is killing me!! @@IsraelSocial
Daniel, I have a question! At ~21:50 you show the FG On / FG Off comparison for Rogue City. Why is the overall PC latency higher with FG off? Somehow that doesn't make sense to me.
Good catch. I wonder if reflex didn't kick in properly when I had FG off.
The 3080 ti (and crypto) was the beginning of our current pricing problems, but at least it had 90% of the cores of the 3090. The 4080 has got us smoking on both ends
Pascal started overpricing
@@lifemocker85 That's true, but I'd argue that people were mostly wise to it and the original msrps for 3000 series reflect that
@lifemocker85 how so?
I always thought the 20 series was where the craziness began... with the RT and DLSS taxes.
@@slimal1 Yeah Turing was when the price increase got noticeably crazy.
@@slimal1 they raised prices
Companies always do this when making performance claims, especially when they compare to their previous generations. Display companies have been doing it for a while too. They'll claim 2500 nits peak brightness when in reality it hits that in a 5% window in vivid mode with every setting maxed out, for less than a few minutes.
I bought a 3080 Ti for £500 brand new and genuinely, it runs things pretty well. That high end GPU was the only one I could get in a crazy market in 2023. I don't think going to a new GPU is really worth it. If I do upgrade, I'm going with AMD.
The 3080 12GB was released in March 2022, the 3080 10GB in 2020.
The 3080 12GB was almost a 3080 Ti, with 1000 fewer shaders but the same memory bandwidth as the Ti.
The two versions are very different cards beyond both being called the 3080.
my 3080 on CP2077 with path tracing and DLSS on balanced was getting 35fps. my 4080 Super that I just got today is getting 105-120.
Double the raw mathematical power? No. Double the gaming frame output? Absolutely, easily, by far. Worth the upgrade vs waiting for the 50 series? No, of course not. Unfortunately, EVGA quality control really fell off at the end of their lifespan, so my 3080 had 2 dead fans when I decided to bite the bullet and move from a 13900K to an X3D build. You can say what you want about frame generation, but other than the increased input lag, I guarantee you won't notice any issues. Any game that requires inputs fast enough to justify turning frame gen off already runs at 400+ fps these days.
The 3070 ti not being 16 gigs was pure evil.
😂😂😂 Saying a gpu gets better fps because of frame gen.. is like calling a 1080p monitor upscaled to 4k a 4k monitor 😂😂😂
I've been gaming for 30 years, since the good old Viper GPU days. I have one rule: upgrade only when the next GPU is 2x faster or more.
Please test the 8000G APU lineup and compare it to 5600G/5700G systems, especially the 8500G vs the 5600G.
He doesn't do CPU benchmarks. Check out Hardware Unboxed or Gamers Nexus for that.
All of those APUs are significantly slower than my Alder Lake Core i7-12700K from 2021 CPU wise.
The 8700g is slower than a GTX 1060.
@@ZackSNetwork But anything is better than an RX6500XT.
Yeah, there's little difference between a 12GB 3080 and the Ti. In fact the FTW3 Ultra 12GB card, with a little boost, would match or beat a stock Ti in some tests. They had the same VRAM and memory bandwidth. It's not a big deal.
The 3080 12GB was an utterly pointless, slightly cut-down 3080 Ti; it was about extracting more $$ from consumers during the mining boom.
@@Battleneter For sure. I did buy one for my one PC on the LAN, but it was because it was over $300 cheaper than the Ti models at the time, and it was a very top end SKU, which was a no brainer. It's not in my main rig now, but it's still very capable, with reasonable settings.
Unpopular opinion: it's fair for them to say it's a 2x increase if they're referring to 4K ray tracing performance, since that's probably what most people buying a 4080 intend to use it for.
Daniel do you think worth upgrading from 6900 XT to 4080 super?
I'm not Daniel... but a big NO from me. Unless you have some specific applications that need NVidia. Wait until next generation.
yes, put the old gpu in the trash, tell me when you take out that trash, asap
it seems you have too much time and money at hand
I just bought a mint condition TUF 3080 12GB on ebay for $400 last month. Fantastic investment. Love this card.
Great video and definitely better for 3080 owners!
I upgraded from a 3060 to a 4080S, although my CPU still needs to be upgraded from an i7-7700 to a 7800X3D to fully play at 4K. At the moment I'm playing at 1080p ultra settings at 75Hz with vsync. I have a G-Sync 4K TV, but with frame gen there is a bit of shimmer; enabling vsync through the Nvidia panel and so on does alleviate that, but there's too much per-game customisation. I'd rather have one monitor for gaming: 1440p 144Hz.
So many things to consider with per-game customisation and monitor synchronisation.
4080 being 80% better than a 3080 is crazy
Any GPU that costs more than $500 is pointless. Gaming shouldn't cost an arm and a leg. Gaming is relaxation for most people, and they shouldn't be stressed about it because of how much stuff costs.
Upgraded from a 3080 Strix 10GB to a 4080 Super TUF and the difference is massive. It's not 2x everywhere, but it is massive. Love it with my standard 1440p and ultrawide monitors.