I had fully intended to buy a 5090, upgrading from my 3090. Then I saw the price hike. Even though I could afford it I won't support companies continuing to maximize profits at their customers' expense, so I figured I'd get a 5080 instead. Turns out Nvidia's bending over customers on the 5080 too. So now I'm just rolling with the 3090, and maybe I'll get one of the AMD cards instead when they drop. We as consumers are the only ones who can make these companies change, but we have to refuse to buy into the hype and stop giving them our money.
In NVIDIA’s defense, I don’t think they really intended the 5090 to be a gaming card with 32gb of ram. Seems like a productivity card. Hopefully they release a 5080ti with 18-24gb of vram on the newer 3gb memory modules when they become more available…. And hopefully those won’t cost 1400 bucks MSRP….
@ I think you're right. But i could see a scenario with a 5080 Super with 18gb being released with a 1000-1200 MSRP and a 24gb 5080 TI Super being 1400-1500.
I was really bummed I missed out on an FE from best buy yesterday but as the nvidia mania is fading away, Im actually glad I saved the grand. I really hope AMD pulls through
That was only true when a die shrink actually reduced production costs, but I think we might be reaching the limit of price to performance increases each generation
@@SamiJuntunen1 I did something similar with 980 ti to used 3080 ti. Why do you even feel the need to upgrade? unless you have a 4k 240hz monitor that can make use of 4x frame gen...
@@kayblis Well I used to play a lot when I bought 3080 and it was "only" 740€ and asus tuf. Now I just lower the graphics from ultra to high and works fine.
I bought a 5090 yesterday to replace my 4090. For fun I was curious what a used 4090 could sell for. Newegg offered my $1350 which means they're gonna sell it for at least $1500-1600 if not more which is crazy for a two year old card. Finding a $1200 4090 doesn't look like it'll be an option for anyone.
@ 8800GTX remained relevant way longer than the 1080ti.. 1080ti is overly hyped and its cult is delusional... by the time the 2000 series hit pascals arch was showing some hard limitations in newer shader heavy games like RDR2.
Yeah except it cost almost 3x the price compared to the 1080ti. What made the 1080ti and 980ti great cards isnt just their performance but also their price.
All of this has me really rethinking why I keep going with nvidia. Taking a step back from my excitement and dopamine from today’s attempt to really reflect on this. Great video as usual, keep up the good work!
Well, despite the lackluster generational improvements, AMD does not offer a more powerful gaming GPU than the 5080 or 5090. So if you want that level of performance you don't really have other options. The 7900 XTX isn't too far behind in raster performance, and costs less, but definitely falls behind in RT and upscaling quality. I'm curious what AMD has in store for the next gen, since realistically they are facing some of the same issues like node increases still being expensive. Perhaps AMD will be more willing to sacrifice margins to chase market share.
@@jannegrey There are actually published numbers, around 80-90% use DLSS when it exists, Frame Gen lags behind since it is mostly not needed except in Path Traced games and there are like 3.
We love Nvidia but at some point enough is enough but there's so many people like me who are Nvidia shareholders and have made an absolute shit ton of money... So idk lol
So just to help people 4080 was 4070Ti (Pretending to be a 4080) So Nvidia can charge xx80 price 5080 is a 5070 (Pretending to be a 5080) So Nvidia can charge xx80 price 6080? Well my guess it will be a 6060Ti. Nvidia has gradually lowered the bar of the xx80 class, the gpu that is mostly for gamers, where the main difference between the top tier and the xx80 was usually things like VRAM/Creator features. HUB released a video today showcasing this in more detail.
I cant believe people are actually making dumb comments like this and they get upvoted so much. Nothing about the 4080 made it a 4070ti lmao. It was plenty fast enough to make it a 4080.
I did, sold it used for $1600 and bought a 7900 XTX used off jawa for $800. Yes, the 4090 was better. But my only Ray Tracing game was Cyberpunk. I can 120fps at 4k with the 7900XTX in just about everything else I play with FSR.
I purchased several 4090 FE at below msrp for $1440 when Bestbuy used to have a 10% off code in 2022-2023 and sold them for $1800. For personal use, I purchased a used Galax 4090 in excellent condition for $1000 on local marketplace(seller was selling her ex husband stuff for cheap) and used it for a year before selling it for $1600 last year December when 50 series news started to pop. Currently on a low powered 6900xt that I purchased for $300 until I get a deal on used 5090. I hate paying full price on these depreciating assets.
Like every business in existence today, they tend to like you to be a regular customer. Through entitlement, Nvidia seems to be expected to a different standard than every other business today. Honestly, I'd take a 16gb 5080 with transformer upscaling, than a 24gb 7900xtx with features of the past. Sure I won't be able to turn on super duper far loading textures, but the rest of my image would look far superior. With that, I think the 5080 lasts farther into the future than a 7900xtx, especially with these RT minimum games lately, and upscaling as a requirement. (I don't think VRAM is the end all be all for the future of a GPU, VRAM capacity is just one component and many say a 4k texture isn't a crazy difference from a 2k texture, and Daniel Owen didn't even notice the texture swapping from 16gb to 24gb)
@@Mcnooblet yeah am going to go with a 5070ti or 4070ti super around the same price as a 7900xtx for me but sure the xtx does get about 5-10% better than 4070ti super but with dlss and the future i think 16gb vram is okay and 24gb is overkill and the older features in the xtx will kill you in 4-5 years
@robertl6747 I'm still playing at 1080p, which basically means I'm cpu-bound in most games anyway. Plus, I got it used for around $350. What is there for me to upgrade to with that price? 4060? Lol!
I mean it truly is. It’s on the same node and the supposedly new generation rt cores are nowhere to be found. It’s literally just a beefed up 4090 with gddr7 memory
no, it's just an RTX 5080 rebranded as the 5090 and sold for 2000$. The full die isn't being used to make room for an RTX 5090Ti. and the RTX 5080 is really a 70 class card
@ There won’t be a 5090Ti the cable is only rated for 600 watts and the base 5090 already pulls 575 to 600 watts while an overclocked can pull over 700. At 700 watts the connection point is close to 100 degrees celsius already. Unless they use 2 power connectors it won’t happen.
@@kerotomas1 "Unless they use 2 power connectors it won’t happen." here you go, 1200W RTX 5090TI confirmed ! this time with 100% more performance than the 5090 for only 100% more power consumption
NVIDIA wouldn't have released the 4090 if they knew AMD wasn't going to compete at the high end. I guarantee they are kicking themselves for the 4090 right now.
What I hate about the 5080 are the folks who are going to be saying it's a good GPU because you can get large gains by overclocking it. That's the customer risking their hardware by going over base spec at that point and not raw performance gains that the makers provided within spec and warranty.
The market is definitely screwed up, there's no turning back now, people are completely uninhibited and throw their money around without a second thought in search of meaning in their lives. In the years to come, when we'll be in a monopoly situation where we'll have to pay $10K for our entry-level GPU, which we'll have to renew every 2 years, justified by those people who “do what they want with their money ”.
There's already a monopoly. Do you want to do anything that needs video encoding? Nvidia 3D rendering? OptiX acceleration is unparalleled so you need Nvidia Anything that can be compiled with CUDA? Nvidia Raytracing in games? Nvidia Upscaling? Nvidia Anything involving matrix multiplication? Nvidia AMD GPUs compete with Nvidia for people who strictly game in the budget to mid tier space. Every one else HAS to go Nvidia because they have a monopoly
@@elk3407 You are a part of the mass Nvidia delusion. There's already a monopoly. Do you want to do anything that needs video encoding? AMD works fine, just not quite as fast. 3D rendering? AMD can do it, just depends which software suite, sometimes very fast. Anything that can be compiled with CUDA? Avoid CUDA software. Raytracing in games? 4 games that have a change for the positive, RT is still too in it's infancy. Upscaling? Both look worse than Native.
@@wewlad107 that sounds like a time saving development issue. All of a sudden Ray tracing minimum is required to run Indiana and now Doom. I guess once that becomes the standard of course you'll have no option other than to upgrade. Again it's all case by case and what kind of games you play.
@wewlad107 Demands haven't because developers are now spending less time on optimization due to the sheer amount of raw performance provided by hardware making optimization less "necessary"
5080 is underwhelming but nvidia seems to have messed up the clocks on it. Literally every single person who tries to increase its clocks goes to greater than 3.1 ghz and it gets close to 4090. All of this without even touching the voltage(which is already lower than 4080)
Ye but 4090 AIB manual overclock can also yield around 8-10% of the base 4090 fe model. Moreover, majority of consumer will never go anywhere close manual overclocking. Overall, still disappointed with this gen uplift, especially for the lower tiers
Unfortunately if AMD or Intel dont step up their game this situation is just going to get worse and worse. My only hope is that game devlopers will start scaling back on hardware requirements in order to reach more people on older hardware.
Tbh just buy more older games. At this point in time there's many hundreds of great games already released that dont require cutting-edge tech to run well.
AMD was planning to screw everyone over themselves. They benefit from Nvidia raising prices & lower expectations. At this point, I no longer feel AMD wants to "compete" at all. They just want to ride the coattails of Nvidia & capitalize on their greed (by just charging slightly less & hoping consumers think that's a W)
What pisses me off about graphics cards isn't just the price but more the shortage of them which pushes up the price. It doesn't happen with phones as you can easily get a s25 samsung etc on day 1 at retail price or under.
Literally why the rtx5080 is a hard pass for me, i wanted to buy one, but only if it's equal or faster then a 4090, but not only it's slower by a lot, the real price is a insult, and let's not mention the real fact the 5080 is a 5070 when you look @ specs, so not only they increase the price of XX80 card's they also downgraded them to a lower tier chip.
I want to see an overclocked 5080 vs an overclocked 4090 since some reviewers are able to overclocked a 5080 and get similar results to a 4090. I just want to see how big the gap would be.
We'll yeah. The 5080 is actually the 5070 and the 5070 is actually the 5060. Nvidia does it again. Sorry about all you guys that were excited about the price of the 5070, you're actually getting the 5060.
I have some money invested in Nvida so please buy the cards. Imagine all the fake frames you can get in the 5000series. Just forget about lossless scaling. We don't talk about LS
@@BlackTone91 If they released it as a free update to old gen cards everyiniie would be praising it ironically though, amd did it already. They released a 7800xt to replace a 6800xt and there 1%-8% performance gains, yet instead people proclaim WHAT A DEAL! the same price with more performance!
@BlackTone91 what you are saying is partly true. It's true tricks but all algorithms in pc and optimizing is tricks to run stuff on your pc. With out these "tricks" there will be no optimizing. It's stupid from Nvidia to lock these features on 50 series but at the same time these features need ai cores to run. And you can force it through the Nvidia app so it can run on almost all games.
The funny or sad thing about this whole release and production stopping, as a pooropean is that the 4060 I bought about a year ago is up almost 30% if i were to buy it now. (The exact same model)
6080 will most likely move to lower nm, so performance will be there. But will probably be 2k for 6080 and 3k for 6090 MSRP. So like 4k and 6k after paper launch day 2027. 😅
One thing I find interesting: the performance gain over last gen is pretty bad and people explain it with NVIDIA sticking to the same node and that makes sense. However we saw this exact situation before with the GTX 600, 700 and 900 series. All of them used the 28 nanometer process and hardly any increase in transistor density. The 680 had 12.1M/mm², the 780 had 12.6M/mm² and the 980 had 13.1M/mm². The GTX 780 offered - on average - 24% more performance compared to tthe 680 but it also used 28% more power (195W TDP vs. 250W TDP). So there was actually a decrease in efficiency. Then with the 900 series performance increased by ~38% from GTX 780 to GTX 980 however the power draw was reduced by 34% as well from 250W TDP to just 165W TDP - again all while STILL using tthe same node. I find that pretty interesting and wonder how they did that.
@@RobertZ1973 Its a very old card with low performance at this point but that doesnt change the fact that it was an amazing card for years. The most recent ''1080ti'' is really the 3080 and only for those that got it at MSRP.
@@RezzzTooth what was your pay check in 2017? what is your pay check today? .. its actualy close to the same ratio.. and I say today, not 2022 because its still the second best GPU on the planet
The end of Jan of 2024 I was contemplating getting a 4090 while everyone was saying to "wait for the 50 series, they are coming soon". Well it def was "soon". I'm glad I ignored all that talk. Especially when I was upgrading from a 2070 Super. I've now my Gigabyte 4090 for a year and seeing these videos about the 50 series makes me even happier about the purchase. The only thing I'll contemplate upgrading in the future is my 7950X3D CPU. I'll be keeping the 4090 for years to come.
I would strongly assume that with RTX 6000 we will see a pretty decent performance uplift. This gen's horrendous gain is caused by NVIDIA sticking to the same node. RTX 6000 however will switch to a new node. Reports seem to hint at N2 - from Samsung tho. N3 seems to offer around 10 to 15% more performance and ~30% less power draw than N5 (and before anyone comes at me stating: "But NVIDIA uses 4nm". No they don't. NVIDIA's process is called "4NP". If you check out the different nodes on TSMC's website you see that 4NP is also part of the 5nm process). N2 seems to offer the same improvement over N3. How exactly this translates to performance remains to be seen but it could be that we get a 50% performance uplift while the cards also draw quite a bit less power. Maybe we will see a 6090 that offers +50% performance but at 350 to 400 Watts. Would be nice to see if a similar performance and efficiency gain can be seen on the lower tier cards as well. Either way I don't expect the performance gain from 5080 to 6080 to be as bad as this gen. As many say: this is historically bad. I'll save this video and come back to it in...I guess late 2026/early 2027 and see how right or wrong I was lmao
@@megamix I'd argue that if NVIDIA is considering Samsung there seems to be at least a decent chance that Samsungs yields will be at least decent. They will most likely not be on TSMCs level but if they can offer good enough yields at good prices the deal is sealed. We will see. Also "shrinking" nodes is about a lot more these days than just making gates smaller. It's about cramming in more gates and shrinking the footprint of each transistor. The coming High-NA EUV with an increased numerical aperture of ~0.5 or 0.55 will be good for the coming 4 to 5 nodes while the succeeding Hyper-NA will PROBABLY also be good for 4 or 5 nodes. But that's hard to say. Additionally they're also working on switching to different materials like carbon. Switching to fiber optically operated chips is also being explored. We see what potential fiber optics has over carrying electricity fior data transfer in internet cables. Current tech is good for another 20ish years, after that other changes to chip manufacturing will enter the ring in 20, 30 or 40 years. Long story short: we're far from reaching the end.
No, 6000 series will kick the ish out of the 5000 and 4000 series. It’s going to be based on 3nm and that thing is awesome. I’m waiting for the 6,000 series exactly because of the 3nm architecture.
We all know that Nvidia is waiting for AMD to release their cards, then 5080Ti, and 5090Ti will be released. 5080Ti will be 15-20% faster than 5080 with 24GB Vram, and the 5090ti will be 20-30% with a price cut on the entire range (a free game or two for 5080 and 5090 owners to stop the crying). AMD can spice up the market, but they are a bit late with their releases. if the 9070 comes out with $450~480 MSRP 16-20GB Vram it will make even 1080~1080TI owners consider upgrading. I want AMD to step up, we have been wanting that on the GPU side ever since the Fury times, hell even on the 7900 series almost 10 years ago.
I was excited to upgrade and after this paper launch I'm gonna keep enjoying my current GPU till mid March. Let's see what happens once all the cards are on the table 😂
@parenthlete sure, but that means fuckery with the drivers that nvidia is accustomed with.. Like with 10,20 and probably 30 series too 1-2 fps less over 12 driver "optimizations" is nothing to sneezeabout
I am seeing some UA-cam reviewers overclocking the 5080 and it apparently overclocks VERY well and can get the same performance as a 4090. Could you do a review on that?
@@berkertaskiran I just watched an 4090 OC video from HU. 4090 gets about 5-10% increase from a mild overclock. 5080 gets what, average 15% increase on overclock? Sounds like we're nearly back at square one then arent we. Seeing as the 5080 gets handed its own ass by like 20% by the 4090 at stock already, overclocking both closes that gap every so slightly. Maybe the 5080ti super ti will do better. Also, you're overclocking $1000+ GPUs. Nobody cares about power draw at that point. We know the 5080 is more power efficient than a 4090, but if you want to get 4090 performance from a 5080 it sounds like you're going to need to give it 4090 power anyway.
Hey Daniel, can you do a comparison between old CNN DLSS Quality / Balanced and the new DLSS Transformer Performance, maybe with mid range cards like 4070 / 3060 TI? What will give you the best experience and performance at which resolution?
I love my 4090, HP had Palit really make a beast, the card itself is solid as a rock and like arch welded into the Omen 45L case, I mean you could land a 747 on this sucker, takes 8 screws to remove it, and the power cable is also very robust.
What about Jay doing the OC on the card, shot up quite massively - would be good if you could try one of the OC cards - just to see if its something you would recommend doing?
9:20 "i'm not going to do a lot of 1080p testing". You're testing 1080p by doing 4k with performance upscaling, People recommending 1k to 3K GPU's to still play at 1080p is laughable.
@@LordBattleSmurf At 1080p native, as this video shown, you'll get a base frame rate of 60 FPS average, not even close to get to the high refresh rate (with MFG x4) that those panels need, so it doesn't matter.
@ What are you talking about? It absolutely matters. There are tons of competitive games where 1080p and the flagship GPU make sense. Marvel Rivals for example. People who say "4090 or 5090 are not for 1080p" don't know what they are talking about. I'd take 1080p 480hz over 4k resolution anyday and I'd pair my 1080p 480hz monior with a 5090
@@LordBattleSmurf It doesn't matter because at 1080p with a 4090 and especially with a 5090 you'll have a CPU bottleneck in the vast majority of games, so you're relying on your CPU and not on the GPU. In Marvel Rivals with a 5090 at 4K low with DLSS Perf (1080p internal render), you'll get around 250 to 300 FPS, but that's only because it's UE5, and absolutely unoptimized, Don't give a damn about competitive games, they're not my thing, so never think of them when talking about GPU's.
Idk what anyone expected. Their closest competitor literally said they arent competing with flagship GPUs and the 80 series is one of the flagships. We are literally watching intel vs amd from 2010-2016 all over again. 5-10% perf uplifts with no real innovation other than smaller nm process.
The 60 series will most likely be built on a 2nm process, so the 6080 will probably curb stomp the 4090. The jump will probably be similar going from the 20 series to the 30 series. A massive node shrink like that opens the door for really big performance gains.
Daniel don't try to justify them. We know for a fact that a die of that size on the N4 process costs less than 150$. I'm sure they can do the board, memory and cooling for less than 850$. In fact they could sell the 5090 at the 5080 price and still make good profit. This is just Nvidia setting a precedent. They want to sell the 70 class at 80 class prices. It failed in the previous gen with the 12 GB 4080, but now they are actually doing it. It is the best moment for them to do this since AMD is not competing. Possibly so that it doesnt sell well, because that way they can allocate the wafers for AI which has higher margins.
Hi Daniel! You most likely will not see this, however I was wondering if you could do a video idea I had in mind… I trust you a lot so I would love to watch this video idea specifically from you. Would it be possible to do a video on what exactly to do after getting your new PC, or if you’ve been the fella that just gets a new pc, turns it on, then just plays mindlessly without changing any settings? For me personally, I don’t change pretty much anything after I get a new pc. I do my graphics card updates and Bios, then in the Windows setting I believe I change one of the power modes and that’s about it. I don’t know how to adjust settings to make sure I’m getting the best image quality, performance, etc. There are plenty of videos out there, but I know you would be thorough in going through EVERYTHING that complete noobs may miss, and I also trust you a bunch. Thanks!
@@noahleach7690 they apparently hit their perf target which was 1.03x of a rtx 4080. Since 5080 is barely any better, the amd card is probably within 10%
@@Fumblaps they cant target a card they have no information about from a whole year ago at least. But since 5070 ti will probably be 4080, it wont matter
MFG is a good cinematic thing, you could target ~18 fps and turn that into 60 on screen fps, which would give you the "ah" feeling of prerecorded cinematics from the last couple decades.
makes sense, the 4080/12 was a 4070. they literally just did this last round... the 4080 super should have been $800, not $1000 and the original 4089/16 should never have costs $1200+ but here we are.
@@sauhamm3821 why should the 4080 super be 800 usd when it matches the 7900xtx in raster and destroys it in RT with much better upscaling and FG features? 1k for the 4080 S was a fair price comapred to what AMD had to offer. And with the recent improvements of DLSS it turns out to eb the much better buy. Everybody who bought a 7900 xtx insteadd of a 4080 super must regret their decision
RTx 5080 is slower than a RTX 4090, RTX 5080 in europe is selling for around 2000 euro (810 euro above MSRP) while a 2nd hand 4090 is 1500-1600 euro. If this gen doesn't get much cheaper soon it's dead in the water.
@GM-Shenmue yeah msrp doesn't exist anymore, retailers are scalping themselves now and it's sickening. I was planning on getting a 5090 and moving my 4090 to my brother but at this rate I'd rather buy a 2nd 4090.
The only card Nvidia cares about is the 90 series and they aren't making them for gamers. I don't think the lack of 5090's is a production problem, I think they sold 90% of the stock behind the scenes to ai companies. There is a reason why the only meaningful uplift the 5090 has over the 4090 is in AI.
I am enjoying Frame Generation via Lossless Scaling so have zero need of a 5000 series. Two thumbs up for Nvidia with the new DLSS 4 though as that’s fantastic.
please test overclocking the 5080 - my Zotac Solid Gaming OC can easily OC to +450 core +500 mem and power limit to 110% and about 10-15% more additional performance i noticed from stock with really no noticeable power draw gain and temps are super good still as well. But then again this is a huge 5080 AIB card i got with better cooling being bigger as opposed to Founder's Edition. The bad part is that Ngreedia clearly locked the power limit of 5080 to 110% and its very very obvious it can easily push a lot further than that tbh
@pt-yt8322 I would argue it's the smart move... the only other card on the market that beats a 4090 is a 5090 , and since there's no chance of getting a 5090 anytime within the next 12months for MSRP, then the only next logical choice is to get a 4090 if you can get it for around MSRP. I wanted a 5090 because that's the only card that is worth the upgrade coming from a 4070. The only thing you're getting with 50 series is multi-frame-generation and to be honest it's not even something that I would use that often. Again it all depends on what games you are trying to to play and the card you currently own. 12gig of vram on a 4070 has been causing me some issues and so the only choice I have is 4090 or 5090. If some of these other cards like the 4070 had 16gig of vram and the 4080 had 20gig then people would have other cheaper options but Nvidia has screwed everyone into fighting over top tier cards because they limit vram in EVERY card except 90 class
@@KaySwiss21 oh yeah i agree 100%, i was actually considering the 5090 since it’s the only card that can surpass the 4090 and i had a gut feeling back in the early fall the 50 series would not compare to the 4090 just because of how powerful it was with nvidia having no high end competitors, i’m really glad i ended up getting the 4090 because as you said, i highly doubt the 5090 will be available for a reasonable amount of time, and nvidia had to have known this in advance by cutting production of the best 40 series card, essentially trapping consumers who want a nice high end gpu to get a 5080 or 5090 (which barely has any stock to begin with) and on top of that, it seems nvidia essentially gave all the features of a 50 series gpu to the 40 series except mfg, which i feel isn’t even that great in the current state due to it’s weak raw performance without dlss unless for some reason the 5090s drop a fair amount in value (either to over production or low demand), i will probably hold out on my 4090 since i can essentially do everything i want from 4k gaming, to ai training research, etc it’s kinda crazy to think that people essentially mass sold their 4090s in december and January (some for as cheap as like $1100-1200) with the hopes of securing a 5090 or a 5080 that was supposedly gonna surpass the 4090, and now they’re sitting at home without a gpu looking on the used market or waiting for the 50 series to go back in stock
@@LS-wy4bk I actually love when an AMD fan says "the only reason you want competition is so Nvidia lowers your prices for you to buy one!". Idk people say it as if it would be in a perfect world, but AMD shows when they are competitive, they increase the prices as well. Maybe competition would save people $50? $100? doesn't seem like a huge win. AMD is also competing fine for gaming, not so much for AI, so we have 2 markets now, and since AI isn't "free" and gamers want everything non raster "for free", I think it puts everyone non Nvidia in a weird position. Maybe someone does a B580 where the profit margins are so small, it isn't even worth producing a bunch, and that isn't a victory either.
Well seeing that 84 blackwell sm’s when over clocked comes close to 128 sm’s (65%less)in the 4090. Just by going to 3nm and keeping sm count the same the 6080 is beating the 4090
With how Nvidia is handling the 50 series, I highly doubt we will even see any 60 series even after the release. Especially of they include the 3nm. chips
Youre wrong, it will have Framemultiplication, where you can type any number from 2-100 and you get that amount of fake frames, for only double the price and rasterizationlevel of a 750ti!
Ah 5080 truely the 7700k of gpus. All these people going on about how moores law is over and the 4 cores at about 4ghz is just the maximum that is physically possible within the realms of affordable pysics for semi conductors. Wait sorry what were we talking about again
Sell your old GPU to fund your upgrade at Jawa! jawa.link/OwenFeb24 Use code OWEN10 for $10 off your first purchase!
HUB says RTX 5080 is really a RTX 5070.
"This deals getting worse all the time"
offered me 861$ for a 4090 😭
@8BitRetroRabbit ayo sussy baka
Owen, can you sell me your 4090 for 1k? We can cut out the middle man Jawa.
@@BruhMomentBobs ill give you 1k
Sorry Nvidia but I am not stupid. I wont buy rtx 5080 that is worse than rtx 4090. I will go with 5070 that will beat the shit out of 4090
yet ppl camp for days near stores to buy them the moment stock refills
No no, he's got a point
You go girl!
😂😂👍
They will read this and think ur serious..
Jensen will claim the 6050 with 8GB VRAM will beat the 5090 next time.
He won't be wrong. 6050 with 4K Lowest Settings + DLSS P + FG vs 5090 with 4K Max + No DLSS/FG
With 95% of AI generated frames! 😂
They can just add more fake frames each generation.
@@fearoxile2230 The sad thing is that this is true lol. Basically rendering at 160p native resolution and the rest is AI shit.
dlss 5 now generating 16 frames in between each real frame
If you keep this up, NVIDIA might not send you anymore GPUs to review 😂😂
nvidia is brave company, they know what everyone will say before sending lol.
If he keeps doing unfair comparisons I hope they stop doing it!
@@eldiestro1990 how is this an unfair comparison?
@@eldiestro1990 No way you're salty cos a billion dollar corpo is getting called out😂..fanboys are clowns fr
They don't have enough anyways
I had fully intended to buy a 5090, upgrading from my 3090. Then I saw the price hike. Even though I could afford it I won't support companies continuing to maximize profits at their customers' expense, so I figured I'd get a 5080 instead. Turns out Nvidia's bending over customers on the 5080 too. So now I'm just rolling with the 3090, and maybe I'll get one of the AMD cards instead when they drop. We as consumers are the only ones who can make these companies change, but we have to refuse to buy into the hype and stop giving them our money.
In NVIDIA’s defense, I don’t think they really intended the 5090 to be a gaming card with 32gb of ram. Seems like a productivity card. Hopefully they release a 5080ti with 18-24gb of vram on the newer 3gb memory modules when they become more available…. And hopefully those won’t cost 1400 bucks MSRP….
Bruh, EVERY company tries to make maximum profit at customers expense. Some companies can just charge more cause competitors are trash.
@@MusicalWhiskey You can bet every dime you have, if a 5080 ti came out with that much VRAM, it would be $1500.
Nvidia is busy filling up server centers with expensive AI chips, the gaming market is their last priority at this point.
@ I think you're right. But i could see a scenario with a 5080 Super with 18gb being released with a 1000-1200 MSRP and a 24gb 5080 TI Super being 1400-1500.
So basically the 5080 is a 60ti/70 class card. For 1000 dollars. yay!
It's true
I was really bummed I missed out on an FE from best buy yesterday but as the nvidia mania is fading away, Im actually glad I saved the grand. I really hope AMD pulls through
Its better than its predecessor for the same money. Seems good to me coming from my 2080, but what do I know.
That was only true when a die shrink actually reduced production costs, but I think we might be reaching the limit of price to performance increases each generation
@@Alex_Verso Considering that it outperforms its previous 80 series card, albeit not by much. No.
Just wait for the 5070 mr croc jacket told us it's gonna beat the shit out of 4090
It's true. 😂
I hope every single review of the 5070 gets compared against the 4090 AND ONLY the 4090!
Nah but it's going to be a dope GPU
@@Generationalwealth94dope GPU with 12 gb
Who’s ready for the 4070 super duper lol
Imagine if Nvidia werent a bunch of bastards and didnt stop 4090 production so we could buy a 4090 instead of the useless 5080
Bought 1080 and then 3080. Seems I am having a looong wait with this one. Works fine though. 4 and 5 series are bs imo.
@@SamiJuntunen1 I did something similar with 980 ti to used 3080 ti. Why do you even feel the need to upgrade? unless you have a 4k 240hz monitor that can make use of 4x frame gen...
@@kayblis Well I used to play a lot when I bought 3080 and it was "only" 740€ and asus tuf. Now I just lower the graphics from ultra to high and works fine.
I bought a 5090 yesterday to replace my 4090. For fun I was curious what a used 4090 could sell for. Newegg offered my $1350 which means they're gonna sell it for at least $1500-1600 if not more which is crazy for a two year old card. Finding a $1200 4090 doesn't look like it'll be an option for anyone.
@@kayblis3080 unless its the 12gb variant likely running out of vram with the 10gb edition
Apparently the 4090 is the 1080ti of the RTX lineup.
Yeah, this is increasingly looking to be the case. And that’s precisely why Nvidia cut production on it.
Oh cut it out... the 1080ti cult is delusional.
Indeed. The 4090 ranks in my personal 3 best Nvidia GPU's of all-time, along with the 1080ti and 8800GTX.
@ 8800GTX remained relevant way longer than the 1080ti.. 1080ti is overly hyped and its cult is delusional... by the time the 2000 series hit pascals arch was showing some hard limitations in newer shader heavy games like RDR2.
Yeah except it cost almost 3x the price compared to the 1080ti. What made the 1080ti and 980ti great cards isnt just their performance but also their price.
All of this has me really rethinking why I keep going with nvidia. Taking a step back from my excitement and dopamine from today’s attempt to really reflect on this. Great video as usual, keep up the good work!
Well, despite the lackluster generational improvements, AMD does not offer a more powerful gaming GPU than the 5080 or 5090. So if you want that level of performance you don't really have other options. The 7900 XTX isn't too far behind in raster performance, and costs less, but definitely falls behind in RT and upscaling quality. I'm curious what AMD has in store for the next gen, since realistically they are facing some of the same issues like node increases still being expensive. Perhaps AMD will be more willing to sacrifice margins to chase market share.
Because their software is really good and shits on AMDs
I quit soon.
@@inkedsleeve4226 Just how many gamers use this "software" though?
@@jannegrey There are actually published numbers, around 80-90% use DLSS when it exists, Frame Gen lags behind since it is mostly not needed except in Path Traced games and there are like 3.
Nvidia got Greedy with 40 series with 50 series they got even Greedier
Not even greedy, they're just going insane at this point...
We love Nvidia but at some point enough is enough but there's so many people like me who are Nvidia shareholders and have made an absolute shit ton of money... So idk lol
They need to be officially rebranded to Ngreedia
So just to help people
4080 was 4070Ti (Pretending to be a 4080) So Nvidia can charge xx80 price
5080 is a 5070 (Pretending to be a 5080) So Nvidia can charge xx80 price
6080? Well my guess it will be a 6060Ti.
Nvidia has gradually lowered the bar of the xx80 class, the gpu that is mostly for gamers, where the main difference between the top tier and the xx80 was usually things like VRAM/Creator features.
HUB released a video today showcasing this in more detail.
5080 is a 5070
5090 is 5080 (4080 is like 30% faster than 3090 Ti)
5070 Ti will be 5070
5070 will be 5060 Ti
5060 will be 5050
I don't mind that -> well, as long as the price is also lowered though...
@@Margelettothis is pretty wrong. Look at how much of the chip a 5090 is actually using.
Yields are low. The 5080 uses the full 203 die already.
@@jaylapointe1654 in terms of performance its right, the die is another topic
I cant believe people are actually making dumb comments like this and they get upvoted so much. Nothing about the 4080 made it a 4070ti lmao. It was plenty fast enough to make it a 4080.
Yea the 4090 might be the penultimate gpu of this decade , i envy those guys bought that beast at the msrp of $1600 XD
I did, sold it used for $1600 and bought a 7900 XTX used off jawa for $800. Yes, the 4090 was better. But my only Ray Tracing game was Cyberpunk.
I can 120fps at 4k with the 7900XTX in just about everything else I play with FSR.
which would be the ultimate then?
@jorellh5090
I purchased several 4090 FE at below msrp for $1440 when Bestbuy used to have a 10% off code in 2022-2023 and sold them for $1800. For personal use, I purchased a used Galax 4090 in excellent condition for $1000 on local marketplace(seller was selling her ex husband stuff for cheap) and used it for a year before selling it for $1600 last year December when 50 series news started to pop. Currently on a low powered 6900xt that I purchased for $300 until I get a deal on used 5090. I hate paying full price on these depreciating assets.
@jorellh 7900XTX and GRE models. for £600 /800 you got near 4090 performance without the RT
RT will be needed in few years, but as of now no.
Hi Daniel, your video makes me really happy as a 4090 owner. Thanks. Have a great weekend
The 5080 running out of VRAM already isn't looking good for the future of its expected time in peoples systems.
Like every business in existence today, they tend to like you to be a regular customer. Through entitlement, Nvidia seems to be expected to a different standard than every other business today. Honestly, I'd take a 16gb 5080 with transformer upscaling, than a 24gb 7900xtx with features of the past. Sure I won't be able to turn on super duper far loading textures, but the rest of my image would look far superior. With that, I think the 5080 lasts farther into the future than a 7900xtx, especially with these RT minimum games lately, and upscaling as a requirement. (I don't think VRAM is the end all be all for the future of a GPU, VRAM capacity is just one component and many say a 4k texture isn't a crazy difference from a 2k texture, and Daniel Owen didn't even notice the texture swapping from 16gb to 24gb)
@@Mcnooblet yeah am going to go with a 5070ti or 4070ti super around the same price as a 7900xtx for me but sure the xtx does get about 5-10% better than 4070ti super but with dlss and the future i think 16gb vram is okay and 24gb is overkill and the older features in the xtx will kill you in 4-5 years
@@PocoYZI'm convinced the 7900xtx will age like fine wine because of that huge Vram buffer
anyone want's to give me a 4090 in exchange of a 5070? it's the same performance
I got the Ryzen 4080xt for you
I got nvidia 730 DDR4 it plays all games at minimum 1fps and max infinate fps after crash
There's gonna be another GPU apocalypse soon. My 6900 XT will become even more precious to me at the rate were going.
I also was hoping to upgrade my 6900xt. But what to .. :/
@robertl6747 I'm still playing at 1080p, which basically means I'm cpu-bound in most games anyway. Plus, I got it used for around $350. What is there for me to upgrade to with that price? 4060? Lol!
@@nimaarg3066 $350 is crazy cheap, the cheapest I could find is $500, so I just got a 3090 for $600
5090 is a repackaged 4090ti. We all knew the 4090ti was coming.
I mean it truly is. It’s on the same node and the supposedly new generation rt cores are nowhere to be found. It’s literally just a beefed up 4090 with gddr7 memory
no, it's just an RTX 5080 rebranded as the 5090 and sold for 2000$. The full die isn't being used to make room for an RTX 5090Ti.
and the RTX 5080 is really a 70 class card
@ There won’t be a 5090Ti the cable is only rated for 600 watts and the base 5090 already pulls 575 to 600 watts while an overclocked can pull over 700. At 700 watts the connection point is close to 100 degrees celsius already.
Unless they use 2 power connectors it won’t happen.
5090 can get more than a 50% uplift at 8k compared to the 4090, so this whole TI shit is lame and a lie
@@kerotomas1 "Unless they use 2 power connectors it won’t happen." here you go, 1200W RTX 5090TI confirmed ! this time with 100% more performance than the 5090 for only 100% more power consumption
NVIDIA wouldn't have released the 4090 if they knew AMD wasn't going to compete at the high end.
I guarantee they are kicking themselves for the 4090 right now.
4090 was already not even a full die so they held back
thanks my 4080s will do me good for a few more years
What I hate about the 5080 are the folks who are going to be saying it's a good GPU because you can get large gains by overclocking it. That's the customer risking their hardware by going over base spec at that point and not raw performance gains that the makers provided within spec and warranty.
The market is definitely screwed up, there's no turning back now, people are completely uninhibited and throw their money around without a second thought in search of meaning in their lives. In the years to come, when we'll be in a monopoly situation where we'll have to pay $10K for our entry-level GPU, which we'll have to renew every 2 years, justified by those people who “do what they want with their money ”.
tbf there won't be a monopoly on entry level gpus, there's 3 companies making them and is the biggest market. high end is cooked tho.
There's already a monopoly.
Do you want to do anything that needs video encoding? Nvidia
3D rendering? OptiX acceleration is unparalleled so you need Nvidia
Anything that can be compiled with CUDA? Nvidia
Raytracing in games? Nvidia
Upscaling? Nvidia
Anything involving matrix multiplication? Nvidia
AMD GPUs compete with Nvidia for people who strictly game in the budget to mid tier space. Every one else HAS to go Nvidia because they have a monopoly
@@Simp_Supreme Definitely
how do people survive in the real world being as delusional as you are?
@@elk3407 You are a part of the mass Nvidia delusion.
There's already a monopoly.
Do you want to do anything that needs video encoding? AMD works fine, just not quite as fast.
3D rendering? AMD can do it, just depends which software suite, sometimes very fast.
Anything that can be compiled with CUDA? Avoid CUDA software.
Raytracing in games? 4 games that have a change for the positive, RT is still too in it's infancy.
Upscaling? Both look worse than Native.
I was hoping 6080 is a typo.... nope, not a typo, and somehow I started to agree yeah, it is not impossible.
Graphics in general have pretty much plateaued. I can see it being safe to skip a couple generations depending on your current card.
Also how is it that we haven't gotten fire to look decent. Still looks like jpegs
Visual appearances have plateaued, the resource demands have not
@@wewlad107 that sounds like a time saving development issue. All of a sudden Ray tracing minimum is required to run Indiana and now Doom. I guess once that becomes the standard of course you'll have no option other than to upgrade. Again it's all case by case and what kind of games you play.
@wewlad107 Demands haven't because developers are now spending less time on optimization due to the sheer amount of raw performance provided by hardware making optimization less "necessary"
@@wewlad107 Visuals haven't plateaued, we could have way way better visuals, hardware just can't keep up.
5080 is underwhelming but nvidia seems to have messed up the clocks on it. Literally every single person who tries to increase its clocks goes to greater than 3.1 ghz and it gets close to 4090. All of this without even touching the voltage(which is already lower than 4080)
Presumably so nvidia can crank up the core for the super/TI models.
@@godsgrasshopper272 The 5080s oc headroom is still present regardless. Hardware Unboxed did an OC video and the card easily ties the 4090.
@@godsgrasshopper272but the chip being used is already a full die. They havent released a super/ti that also doesn't have more cores.
Ye but 4090 AIB manual overclock can also yield around 8-10% of the base 4090 fe model. Moreover, majority of consumer will never go anywhere close manual overclocking. Overall, still disappointed with this gen uplift, especially for the lower tiers
I wonder if they did this deliberately to give themselves the option of releasing a 5080 ti at a later date?
Incredible comparative Daniel! I was waiting it so much, thank you!
Unfortunately if AMD or Intel dont step up their game this situation is just going to get worse and worse. My only hope is that game devlopers will start scaling back on hardware requirements in order to reach more people on older hardware.
why care. its just games. use your current pc and thats it.
Tbh just buy more older games. At this point in time there's many hundreds of great games already released that dont require cutting-edge tech to run well.
AMD was planning to screw everyone over themselves. They benefit from Nvidia raising prices & lower expectations. At this point, I no longer feel AMD wants to "compete" at all. They just want to ride the coattails of Nvidia & capitalize on their greed (by just charging slightly less & hoping consumers think that's a W)
What kind of replies are these? Why should the customer get penalized for the incompetence and greed of the companies?
@ctrl_x1770 These are sensible replies, the market will correct over stuff like this if people don't buy the latest and (sometimes) greatest
Appreciate the discussion and info.
What pisses me off about graphics cards isn't just the price but more the shortage of them which pushes up the price. It doesn't happen with phones as you can easily get a s25 samsung etc on day 1 at retail price or under.
It's called a "paper launch", and is working as intended.
Literally why the rtx5080 is a hard pass for me, i wanted to buy one, but only if it's equal or faster then a 4090, but not only it's slower by a lot, the real price is a insult, and let's not mention the real fact the 5080 is a 5070 when you look @ specs, so not only they increase the price of XX80 card's they also downgraded them to a lower tier chip.
I want to see an overclocked 5080 vs an overclocked 4090 since some reviewers are able to overclocked a 5080 and get similar results to a 4090. I just want to see how big the gap would be.
We'll yeah. The 5080 is actually the 5070 and the 5070 is actually the 5060. Nvidia does it again. Sorry about all you guys that were excited about the price of the 5070, you're actually getting the 5060.
facts !
I have some money invested in Nvida so please buy the cards. Imagine all the fake frames you can get in the 5000series. Just forget about lossless scaling. We don't talk about LS
:(
What do you mean? isn't 5070 have 4090 performance? LOL
Only with multi frame gen which is actually good . Even the latency isnt that bad. I dont know why you all hate on new technologies just because
@@basemmgtow7954 These are just software tricks in a few games that have nothing to do with the actual speed of the hardware
@@BlackTone91 If they released it as a free update to old gen cards everyiniie would be praising it ironically though, amd did it already.
They released a 7800xt to replace a 6800xt and there 1%-8% performance gains, yet instead people proclaim WHAT A DEAL! the same price with more performance!
@BlackTone91 what you are saying is partly true. It's true tricks but all algorithms in pc and optimizing is tricks to run stuff on your pc. With out these "tricks" there will be no optimizing. It's stupid from Nvidia to lock these features on 50 series but at the same time these features need ai cores to run. And you can force it through the Nvidia app so it can run on almost all games.
They lied.
The funny or sad thing about this whole release and production stopping, as a pooropean is that the 4060 I bought about a year ago is up almost 30% if i were to buy it now. (The exact same model)
6080 will most likely move to lower nm, so performance will be there. But will probably be 2k for 6080 and 3k for 6090 MSRP. So like 4k and 6k after paper launch day 2027. 😅
One thing I find interesting: the performance gain over last gen is pretty bad and people explain it with NVIDIA sticking to the same node and that makes sense. However we saw this exact situation before with the GTX 600, 700 and 900 series. All of them used the 28 nanometer process and hardly any increase in transistor density. The 680 had 12.1M/mm², the 780 had 12.6M/mm² and the 980 had 13.1M/mm². The GTX 780 offered - on average - 24% more performance compared to tthe 680 but it also used 28% more power (195W TDP vs. 250W TDP). So there was actually a decrease in efficiency. Then with the 900 series performance increased by ~38% from GTX 780 to GTX 980 however the power draw was reduced by 34% as well from 250W TDP to just 165W TDP - again all while STILL using tthe same node. I find that pretty interesting and wonder how they did that.
my biggest surprise is how the path tracing on the 5080 did not improve at all with blackwell
Why would it? Same node basically the same rt cores
Such an informative analysis, thanks
rtx 4090 is the new 1080 ti
Not really because it is still stupidly priced in comparison to the 1080ti on release
@@RezzzTooth 1080ti is an overhyped relic and the cult is delusional.
@@RezzzTooth 4090 is more like a TITAN X Pascal, which is $1200, nvidia just removed the 80ti card
@@RobertZ1973 Its a very old card with low performance at this point but that doesnt change the fact that it was an amazing card for years.
The most recent ''1080ti'' is really the 3080 and only for those that got it at MSRP.
@@RezzzTooth what was your pay check in 2017? what is your pay check today? .. its actualy close to the same ratio.. and I say today, not 2022 because its still the second best GPU on the planet
The end of Jan of 2024 I was contemplating getting a 4090 while everyone was saying to "wait for the 50 series, they are coming soon". Well it def was "soon".
I'm glad I ignored all that talk. Especially when I was upgrading from a 2070 Super. I've now my Gigabyte 4090 for a year and seeing these videos about the 50 series
makes me even happier about the purchase. The only thing I'll contemplate upgrading in the future is my 7950X3D CPU. I'll be keeping the 4090 for years to come.
I would strongly assume that with RTX 6000 we will see a pretty decent performance uplift. This gen's horrendous gain is caused by NVIDIA sticking to the same node. RTX 6000 however will switch to a new node. Reports seem to hint at N2 - from Samsung tho. N3 seems to offer around 10 to 15% more performance and ~30% less power draw than N5 (and before anyone comes at me stating: "But NVIDIA uses 4nm". No they don't. NVIDIA's process is called "4NP". If you check out the different nodes on TSMC's website you see that 4NP is also part of the 5nm process). N2 seems to offer the same improvement over N3. How exactly this translates to performance remains to be seen but it could be that we get a 50% performance uplift while the cards also draw quite a bit less power. Maybe we will see a 6090 that offers +50% performance but at 350 to 400 Watts. Would be nice to see if a similar performance and efficiency gain can be seen on the lower tier cards as well. Either way I don't expect the performance gain from 5080 to 6080 to be as bad as this gen. As many say: this is historically bad.
I'll save this video and come back to it in...I guess late 2026/early 2027 and see how right or wrong I was lmao
Node shrinks have diminishing performance improvements and are increasingly expensive. Also any new node from Samsung is vaporware.
@@megamix I'd argue that if NVIDIA is considering Samsung there seems to be at least a decent chance that Samsungs yields will be at least decent. They will most likely not be on TSMCs level but if they can offer good enough yields at good prices the deal is sealed. We will see. Also "shrinking" nodes is about a lot more these days than just making gates smaller. It's about cramming in more gates and shrinking the footprint of each transistor. The coming High-NA EUV with an increased numerical aperture of ~0.5 or 0.55 will be good for the coming 4 to 5 nodes while the succeeding Hyper-NA will PROBABLY also be good for 4 or 5 nodes. But that's hard to say. Additionally they're also working on switching to different materials like carbon. Switching to fiber optically operated chips is also being explored. We see what potential fiber optics has over carrying electricity fior data transfer in internet cables. Current tech is good for another 20ish years, after that other changes to chip manufacturing will enter the ring in 20, 30 or 40 years. Long story short: we're far from reaching the end.
It should be expected that eventually we would reach the point of diminishing returns, gains can´t infinitely continue at the same rate.
4090 best investment ever
Be interested to see this video again but with the 5080 OC. Seeing some rumblings that it’s getting some good gains via OC.
No, 6000 series will kick the ish out of the 5000 and 4000 series. It’s going to be based on 3nm and that thing is awesome. I’m waiting for the 6,000 series exactly because of the 3nm architecture.
Can you do some content on under volt, overclocking and ram running videos that’s part of being a “GPU” reviewer
10:14 Foreshadowing 16gb VRAM will be obsolete much faster than I anticipated.
We all know that Nvidia is waiting for AMD to release their cards, then 5080Ti, and 5090Ti will be released. 5080Ti will be 15-20% faster than 5080 with 24GB Vram, and the 5090ti will be 20-30% with a price cut on the entire range (a free game or two for 5080 and 5090 owners to stop the crying).
AMD can spice up the market, but they are a bit late with their releases.
if the 9070 comes out with $450~480 MSRP 16-20GB Vram it will make even 1080~1080TI owners consider upgrading.
I want AMD to step up, we have been wanting that on the GPU side ever since the Fury times, hell even on the 7900 series almost 10 years ago.
I was excited to upgrade and after this paper launch I'm gonna keep enjoying my current GPU till mid March. Let's see what happens once all the cards are on the table 😂
Thank you for all the content Daniel. Working and doing all this testing must be tiring. I'm sure everyone appreciates it.
I'm not worried at all I'll just skip the generations that don't have too much to offer in terms of performance uplift
Hi Daniel, some reviews say that the RTX 5080 is good for OC, making it very close to the 4090 performance. Any way you could check that ?
As a 4090 owner, i see this as a absolute win
No reason to upgrade for a while..
i agree 100% i love my 4090 more every day lol.
curious to know when other people with 4090 plan to upgrade? maybe when the 6090 comes out or even later?
It’s a short term win maybe , surely leaps in performance are better for everyone longer term?!
@parenthlete sure, but that means fuckery with the drivers that nvidia is accustomed with.. Like with 10,20 and probably 30 series too 1-2 fps less over 12 driver "optimizations" is nothing to sneezeabout
Great video, exactly what i was looking for
I am seeing some UA-cam reviewers overclocking the 5080 and it apparently overclocks VERY well and can get the same performance as a 4090. Could you do a review on that?
Couldn’t you also just overclock the 4090 then?
@@Looskidify The 4090 is already a power hog though. And it doesn't overclock well. The performance doesn't change much.
@@berkertaskiran I just watched an 4090 OC video from HU. 4090 gets about 5-10% increase from a mild overclock. 5080 gets what, average 15% increase on overclock? Sounds like we're nearly back at square one then arent we. Seeing as the 5080 gets handed its own ass by like 20% by the 4090 at stock already, overclocking both closes that gap every so slightly. Maybe the 5080ti super ti will do better.
Also, you're overclocking $1000+ GPUs. Nobody cares about power draw at that point. We know the 5080 is more power efficient than a 4090, but if you want to get 4090 performance from a 5080 it sounds like you're going to need to give it 4090 power anyway.
Hey Daniel, can you do a comparison between old CNN DLSS Quality / Balanced and the new DLSS Transformer Performance, maybe with mid range cards like 4070 / 3060 TI? What will give you the best experience and performance at which resolution?
2:07 smooth 🤣
I love my 4090, HP had Palit really make a beast, the card itself is solid as a rock and like arch welded into the Omen 45L case, I mean you could land a 747 on this sucker, takes 8 screws to remove it, and the power cable is also very robust.
Looks like I’ll be waiting for a 6080S/6080 Ti
Same thing is gonna happened 🥲
@ I may just take the time to save rn and buy a 6090 and not worry about upgrading for many years lol
Which graphics card is the best for 1440p Path tracing right now?
moores law is dead they can only create more gimmicks to sell new gens
yes
What about Jay doing the OC on the card, shot up quite massively - would be good if you could try one of the OC cards - just to see if its something you would recommend doing?
Hey Daniel, can you give us your updated thoughts on the 5080 based on all the talk popping up about its pretty nice Overclocking ability ?
9:20 "i'm not going to do a lot of 1080p testing".
You're testing 1080p by doing 4k with performance upscaling, People recommending 1k to 3K GPU's to still play at 1080p is laughable.
Critical thinking is almost long gone with people. The delusion is what they want.
1080p 480hz oleds are a thing
@@LordBattleSmurf At 1080p native, as this video shown, you'll get a base frame rate of 60 FPS average, not even close to get to the high refresh rate (with MFG x4) that those panels need, so it doesn't matter.
@ What are you talking about? It absolutely matters. There are tons of competitive games where 1080p and the flagship GPU make sense. Marvel Rivals for example. People who say "4090 or 5090 are not for 1080p" don't know what they are talking about. I'd take 1080p 480hz over 4k resolution anyday and I'd pair my 1080p 480hz monior with a 5090
@@LordBattleSmurf It doesn't matter because at 1080p with a 4090 and especially with a 5090 you'll have a CPU bottleneck in the vast majority of games, so you're relying on your CPU and not on the GPU.
In Marvel Rivals with a 5090 at 4K low with DLSS Perf (1080p internal render), you'll get around 250 to 300 FPS, but that's only because it's UE5, and absolutely unoptimized,
Don't give a damn about competitive games, they're not my thing, so never think of them when talking about GPU's.
sorry if I missed this.. are you running the 4090 with the new driver updates for diss 4?
So glad I bought a Strix 4090 back in August Let’s go !!
Idk what anyone expected. Their closest competitor literally said they arent competing with flagship GPUs and the 80 series is one of the flagships. We are literally watching intel vs amd from 2010-2016 all over again. 5-10% perf uplifts with no real innovation other than smaller nm process.
As a 4090 owner, I’m going to start saying that I have the “5080 Super.”
haha and 5090 owners should say they have the rtx 4090ti
5080 total junk with 10% performance uplift. And its out of stock😂😂😂
The 60 series will most likely be built on a 2nm process, so the 6080 will probably curb stomp the 4090. The jump will probably be similar going from the 20 series to the 30 series. A massive node shrink like that opens the door for really big performance gains.
As someone who has decided to hold off on buying a new tv until a 98 inch 8K 240HZ comes out a decent price, I sure hope what you're saying is correct
The 4000 series is already expired. The RTX 5090 is a great solution. Get more FPS for less money. 👌
@@Nvidia_RTX Not with that paper launch. And I'm not paying a scalper.
18:50 I’ve heard that before🤔
I remember the early "leaker rumors" just after the 40 series launched saying the 5090 would be 80% faster than the 4090.
what software you are using to display the pc latency?
A brand new $1000 5080 having to turn down any texture settings is highly unacceptable imo.
Yep, pretty happy with my 4090 FE. The only reason for me to update would be significantly higher efficiency, but this gen doesn't offer that at all.
Daniel don't try to justify them. We know for a fact that a die of that size on the N4 process costs less than 150$.
I'm sure they can do the board, memory and cooling for less than 850$.
In fact they could sell the 5090 at the 5080 price and still make good profit.
This is just Nvidia setting a precedent. They want to sell the 70 class at 80 class prices.
It failed in the previous gen with the 12 GB 4080, but now they are actually doing it.
It is the best moment for them to do this since AMD is not competing.
Possibly so that it doesnt sell well, because that way they can allocate the wafers for AI which has higher margins.
More like less than 300. That's where the 70% margin comes in.
Hi Daniel! You most likely will not see this, however I was wondering if you could do a video idea I had in mind…
I trust you a lot so I would love to watch this video idea specifically from you. Would it be possible to do a video on what exactly to do after getting your new PC, or if you’ve been the fella that just gets a new pc, turns it on, then just plays mindlessly without changing any settings?
For me personally, I don’t change pretty much anything after I get a new pc. I do my graphics card updates and Bios, then in the Windows setting I believe I change one of the power modes and that’s about it. I don’t know how to adjust settings to make sure I’m getting the best image quality, performance, etc.
There are plenty of videos out there, but I know you would be thorough in going through EVERYTHING that complete noobs may miss, and I also trust you a bunch.
Thanks!
I have been seeing leaks that the new AMD 9070 XT is less than 10% slower than the 5080 for half the price.
probably bs but it would be nice
@@noahleach7690 they apparently hit their perf target which was 1.03x of a rtx 4080. Since 5080 is barely any better, the amd card is probably within 10%
@@Donsmokeypapi Their target was the 5070ti
@@Fumblaps they cant target a card they have no information about from a whole year ago at least. But since 5070 ti will probably be 4080, it wont matter
@@Donsmokeypapi That's their plan though, hence the renaming of their cards.
MFG is a good cinematic thing, you could target ~18 fps and turn that into 60 on screen fps, which would give you the "ah" feeling of prerecorded cinematics from the last couple decades.
I slept hearing about 5080 and woke up to this and got a mini heart attack thinking i slept for god knows how long.
that's too real and scary i feel you
Given the OC headroom on the 5080 I'm very interested to see a comparison of both at max OC....
Mmh I'm pretty sure the 6080 will be more powerful 😅
Jesus, Jensen really wasn't kidding when he said Moore's law is dead
RTX 5080 is RTX 5070
makes sense, the 4080/12 was a 4070. they literally just did this last round...
the 4080 super should have been $800, not $1000 and the original 4089/16 should never have costs $1200+
but here we are.
@@sauhamm3821 why should the 4080 super be 800 usd when it matches the 7900xtx in raster and destroys it in RT with much better upscaling and FG features? 1k for the 4080 S was a fair price comapred to what AMD had to offer. And with the recent improvements of DLSS it turns out to eb the much better buy. Everybody who bought a 7900 xtx insteadd of a 4080 super must regret their decision
What is impressive is the 5080 overclocks 15-20 percent to 3.2-3.3ghz
I’m so happy I was able to get a second hand 20 day old 4090 for $1300 back in October 🎉
RTx 5080 is slower than a RTX 4090, RTX 5080 in europe is selling for around 2000 euro (810 euro above MSRP) while a 2nd hand 4090 is 1500-1600 euro. If this gen doesn't get much cheaper soon it's dead in the water.
And i saw 5090 reaching up to 3100€...this is ridiculous.
@GM-Shenmue yeah msrp doesn't exist anymore, retailers are scalping themselves now and it's sickening. I was planning on getting a 5090 and moving my 4090 to my brother but at this rate I'd rather buy a 2nd 4090.
Can i know what software do you use to measure these performance numbers? and what software to measure the gpu latency?
As long as people continue to line up to buy this trash, value will continue to decrease
Exactly people keep buying AMD trash it's crazy
The only card Nvidia cares about is the 90 series and they aren't making them for gamers. I don't think the lack of 5090's is a production problem, I think they sold 90% of the stock behind the scenes to ai companies. There is a reason why the only meaningful uplift the 5090 has over the 4090 is in AI.
I am enjoying Frame Generation via Lossless Scaling so have zero need of a 5000 series. Two thumbs up for Nvidia with the new DLSS 4 though as that’s fantastic.
please test overclocking the 5080 - my Zotac Solid Gaming OC can easily OC to +450 core +500 mem and power limit to 110% and about 10-15% more additional performance i noticed from stock with really no noticeable power draw gain and temps are super good still as well. But then again this is a huge 5080 AIB card i got with better cooling being bigger as opposed to Founder's Edition. The bad part is that Ngreedia clearly locked the power limit of 5080 to 110% and its very very obvious it can easily push a lot further than that tbh
this is exactly why I decided to not even bother with 50 series and grabbed a 4090 off eBay , brand new for $1900
same, i grabbed for a little cheaper than that and people were calling me an idiot online for getting it last year in september
@pt-yt8322 I would argue it's the smart move... the only other card on the market that beats a 4090 is a 5090 , and since there's no chance of getting a 5090 anytime within the next 12months for MSRP, then the only next logical choice is to get a 4090 if you can get it for around MSRP. I wanted a 5090 because that's the only card that is worth the upgrade coming from a 4070. The only thing you're getting with 50 series is multi-frame-generation and to be honest it's not even something that I would use that often. Again it all depends on what games you are trying to to play and the card you currently own. 12gig of vram on a 4070 has been causing me some issues and so the only choice I have is 4090 or 5090. If some of these other cards like the 4070 had 16gig of vram and the 4080 had 20gig then people would have other cheaper options but Nvidia has screwed everyone into fighting over top tier cards because they limit vram in EVERY card except 90 class
@@KaySwiss21 oh yeah i agree 100%, i was actually considering the 5090 since it’s the only card that can surpass the 4090 and i had a gut feeling back in the early fall the 50 series would not compare to the 4090 just because of how powerful it was with nvidia having no high end competitors, i’m really glad i ended up getting the 4090 because as you said, i highly doubt the 5090 will be available for a reasonable amount of time, and nvidia had to have known this in advance by cutting production of the best 40 series card, essentially trapping consumers who want a nice high end gpu to get a 5080 or 5090 (which barely has any stock to begin with)
and on top of that, it seems nvidia essentially gave all the features of a 50 series gpu to the 40 series except mfg, which i feel isn’t even that great in the current state due to it’s weak raw performance without dlss
unless for some reason the 5090s drop a fair amount in value (either to over production or low demand), i will probably hold out on my 4090 since i can essentially do everything i want from 4k gaming, to ai training research, etc
it’s kinda crazy to think that people essentially mass sold their 4090s in december and January (some for as cheap as like $1100-1200) with the hopes of securing a 5090 or a 5080 that was supposedly gonna surpass the 4090, and now they’re sitting at home without a gpu looking on the used market or waiting for the 50 series to go back in stock
Have you tried overlocking?
People keep buying Nvidia like hotcakes and even 7080 won't beat 4090.
Which AMD card will beat the 5080?
It would be good if there was more competion
@@LS-wy4bk I actually love when an AMD fan says "the only reason you want competition is so Nvidia lowers your prices for you to buy one!". Idk people say it as if it would be in a perfect world, but AMD shows when they are competitive, they increase the prices as well. Maybe competition would save people $50? $100? doesn't seem like a huge win. AMD is also competing fine for gaming, not so much for AI, so we have 2 markets now, and since AI isn't "free" and gamers want everything non raster "for free", I think it puts everyone non Nvidia in a weird position. Maybe someone does a B580 where the profit margins are so small, it isn't even worth producing a bunch, and that isn't a victory either.
Well seeing that 84 blackwell sm’s when over clocked comes close to 128 sm’s (65%less)in the 4090. Just by going to 3nm and keeping sm count the same the 6080 is beating the 4090
Yeah but a 7080 will frame Gen a 5fps game to 500fps at 5,000ms latency! FEEL THOSE FRAMES BROTHA
I'm pretty sure the next generation is definitely going to be on 3nm or 2nm which is delayed but supposedly still beginning production soon.
With how Nvidia is handling the 50 series, I highly doubt we will even see any 60 series even after the release. Especially of they include the 3nm. chips
and the price is gonna get a lot more expensive aswell on those nodes
Youre wrong, it will have Framemultiplication, where you can type any number from 2-100 and you get that amount of fake frames, for only double the price and rasterizationlevel of a 750ti!
Great video. I wish you’d have included Alan Wake 2 since it’s also path traced.
Ah 5080 truely the 7700k of gpus.
All these people going on about how moores law is over and the 4 cores at about 4ghz is just the maximum that is physically possible within the realms of affordable pysics for semi conductors.
Wait sorry what were we talking about again
Something is wrong with your latency, at 4x frame gen on the OC model I am getting about 10ms lower latency.