I don't get the hate against frame gen. AMD is working on it too, and it's likely the future. And there's Reflex to reduce latency. Are you really that great of a PvP FPS pro gamer that the difference between 15 and 40 ms of latency breaks your K/D ratio?
I'm happy with my 3080 12GB @ 1440p ultrawide, no need for ray tracing. Maybe for VR I would like more performance, but in that case only raster performance counts.
The thing about them training models for their cards and using that as a lead metric doesn't take into consideration that models can be trained that will eventually be better than theirs and run on any old standard CUDA core or any GPU core implementation. So the hardware itself not being a major leap in consumer devices is problematic for the price they want to charge; they could have trained the same thing to run on old hardware. AMD did it with FSR when the first generation of DLSS came out and Nvidia told everyone it wouldn't be backwards compatible, when in truth they set that limitation. Their cards should be priced $30 less for what they are giving consumers; just not worth it, and they've once again set the bar on keeping GPUs at an overpriced market value, which means their competition is going to keep their prices unnecessarily high. Nah, I'll stick with my 30 series for at least two more generations, and I bet games using AI are going to stay performant on them for a while. I think a lot of devs are seeing the form factor of mobile PCs like the Steam Deck and optimizing to accommodate them, never mind the advances in major game engines on performance. The current state of GPUs is not going to push devs to do more with GPUs, since we are not getting more power or orders of magnitude more memory; AI is going to be the optimization profile going forward, I think.
Brute-forced frames > guess-work frames. I am buying a GPU to render games, not ray-trace movies or use optical flow processing for automotive positioning calculations. If Nvidia wants to satisfy those segments they can make products for that. As a gamer I don't care much for 10-15% improvements every 2 years at 20-30% higher prices 😢
I still don't count these as apples to apples, because surely the 50 series will be more efficient in ray tracing and also better suited to the new DLSS transformer model. I'd be more interested in something like Indiana Jones native with no ray tracing etc. I'm assuming they won't do that, because the 5090 is probably only 20-25% faster than a 4090.
@dddddbbb Yes, but that's only the top-end, most expensive, highly overpriced halo cards. Everything below that is in the realm of 10-20%, typically edging toward the lower end of that range.
@@dddddbbb OK? Not seeing the relevance. Nvidia is claiming they're the same class, so you expect savings and performance gains gen-on-gen. That's typically how things go. In most computer development, products work on a tick-tock cycle, where the "tick" is the revolutionary or advanced product and the "tock" is the refresh. Everyone thought the 3000 series was the "tick" and the 4000 series was the "tock", which would make the 5000 series the next "tick". What we've got instead is: the 3000 series was a tick, the 4000 series was another, lesser "tick" with the introduction of the AI cores, FG, and more power, and now the 5000 series is a very weak "tock" on the 4000 series.
I just care about raster. Roll on the full reviews and data from the big channels so we can see where to put our money this generation. Hope y'all get what you prefer once they land on YouTube.
Every year the 90 series will be king in gaming, then get sandbagged over time through drivers... AMD will slowly gain traction, then the new gen comes out. Repeat. The new 50 series is an absolute joke, and the sheep that support AI "performance" are the problem we now face.
They compare the 5070 Ti to the 4070 Ti, not the Super, so in pure performance they might actually be super close. Have a 3080 and looking to upgrade because I play at 4K.
Was in the same situation. No card from September. Recently bought Battlemage, but not only for gaming; video editing and AI stuff as well. And I didn't want to buy from a monopolist company, AKA Ngreedia.
Numbers aside, Nvidia's launch was such a big slap in the face of consumers, like they take us for braindead idiots. It was a mockery, a circus. I'm not going to buy new Nvidia technology this generation come hell or high water. Not that AMD and Intel are much better; they seem to be taking their cue from the jacket lately.
I was going to upgrade my 3070 to a 5080, but all this talk of fake frames and increased latency has me sitting on the fence now. Gonna be smart this time and wait for reviews. I want an increase in raster mainly, and if the generational increase is meh I'll skip or go AMD for more VRAM.
Exactly! If you are gaming at 4K you are gaming at 60 FPS, so frame gen to 144 fps seems tone-deaf since there are not many people gaming on 4K 144 Hz monitors/TVs. Plus I'm curious about latency, since I know that using frame gen to go from 30 to 60 fps is a horrible experience unless you are using a controller.
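A rough back-of-envelope sketch of the arithmetic behind that latency point; the "one held-back frame" model is a simplifying assumption for interpolation-style frame generation, not a vendor-measured figure:

```python
# Rough frame-time arithmetic for interpolation-based frame generation.
# Assumption: the interpolator holds back one real frame so it has two
# endpoints to blend between (illustrative model, not measured data).

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

for base_fps in (30, 60):
    base_ms = frame_time_ms(base_fps)
    shown_fps = base_fps * 2                # 2x frame gen
    extra_lag_ms = base_ms                  # ~one held-back real frame
    print(f"{base_fps} fps base -> looks like {shown_fps} fps, "
          f"but adds roughly {extra_lag_ms:.0f} ms of latency")
```

Under this model a 30 fps base adds about 33 ms on top of already sluggish input sampling, while a 60 fps base adds only around 17 ms, which is roughly why frame gen from a low base feels so much worse.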
Sadly, clickbait. "The Truth Without DLSS"? There is none, since there are NO real benchmarks :(
Some YouTubers got the RTX 5090 already, but they are not allowed to show performance results till Jan 24.
@@henson2k I do understand, but I really hate that kind of scam and clickbait.
@@henson2k One of them did show the performance without AI in Cyberpunk 2077: PC Centric. Spoiler: not impressive. It's just one game, but it's not nothing.
For $2000 plus, it should blow the heatsink off the 4090! I really wish they would focus more on native performance and not on A.I. this, A.I. that crap.
@@mustangthearpy Unless there is real competition, the trend will continue.
Nvidia trying their best not to show a raw rasterization comparison.
Trying their best to forget the 40 super series too 😅
They try to avoid it so hard it's pathetic.
But rasterization only ever improves 30%-40% generation to generation anyway for the top-tier card, and it's always around 15%-20% as you go down the stack. I, for one, will be going from my 4090 to the 5090 FE, since once I have the 5090, selling the 4090 will cover well over half the cost. What I do not understand, though, is how Nvidia can fit the 5090 into a 2-slot card while all the AIBs are still creating 4-slot monsters?
@@ys053rious6 they will charge more and visually it should justify the extra expense
@@ys053rious6 It should be closer to 50%. 2080 Ti to 3090 was approx. 50%, and if I am not mistaken, so was 3090 to 4090.
Paying $2000 for a software upgrade seems like a disappointment.
The hardware seems like the license key, lol.
Excellent point. The RTX 4090 will receive a software update for free. I'm keeping my current card.
Lossless Scaling is 7 dollars on Steam and does 4x frame generation.
Don't pay that, so I can get the ASUS ROG Astral GeForce RTX 5090. Who needs a software update?
@@GSP-76 Exactly!
50 series sounds like a software gen and not a hardware gen. I am looking forward to the 3rd party testing and benchmarks to see the real vs fake perf
RIGHT!? Glad someone finally said it!
The most expensive driver updates since GPUs have existed. I'll gladly be skipping Ngreedia's scam, and I'll happily judge everyone who supports Ngreedia, because they have to lack common sense if they're backing any of this. People like Jensen Huang belong in jail for fraud and for intentionally causing inflation in the PC market.
Ngreedia fanboys are allowed to join his cell since they're so into these things. What better way to show more support than taking the cell next to Jensen's 🤣
for gaming: yes
for ai-work: no
So if not for die size, what are we paying extra for? The main reason GPUs jumped in price was their die size, right? What's the excuse now? I'm confused at this point. I'll sit and wait for reviews, or for the tech to get ironed out.
@ My answer was removed; YouTube is not in the common-sense business. Be aware, people: our governments and big corporations together want to erase common sense from the public, because that will make it easier for them to sell us lies at any price tag. That is why my previous comment was deleted; it contained nothing bad, just common sense about Ngreedia and Jensen Huang. Without free speech nobody will be able to stop this madness. Free speech is taken away from me every time one of my comments gets deleted. The big corpos are protecting each other by silencing people about the truth.
Looking at that, no wonder they didn't talk about rasterization at the launch...
Using a 7900 XT. No plans to upgrade for at least a few years.
6600xt bro 😅
Using 6900XT and no plans to upgrade
RTX 2080 Super. I don't play any modern games except Cyberpunk 2077, Remnant II, and Horizon. I get around 100 fps at 1440p in Cyberpunk.
@ The RTX 2080 Super is a more than capable card.
Same here. Got the ASRock 7900 XTX Taichi with the old 5800X3D, and I'm getting 100-170 fps in my current go-to game at 7680x2160. I won't use upscaling or frame gen; it looks like crap when you're a trained electronics repair tech...
My policy? Go native or reduce settings, skip at least one generation on the hardware side of things and save a fortune, and STILL get double-digit performance boosts with every upgrade, as opposed to stagnation or even regression and a huge amount of "buyer's remorse". E.g. I got a 35% fps uplift when switching out my 2700X for the 5800X, and a further increase of 15% when I got the 5800X3D after I tore a pin off the 5800X.
You want fakery? Get a cr*apsole and be done with it, cause once you see the artifacts you'll never NOT see them!
And yes, I've tried both FSR and frame gen on both AMD and Nvidia systems and CAN see the huge drop in detail and the artifacts related to them. Black Myth: Wukong looks like utter garbage and ghosts like a b***h with or without them.
Forza looks blurred to hell and gone. Ray tracing is nice, but I've lived without it for 20+ years now, and the performance hit (on ANY GPU) is too high to justify using it.
Sure, SOME of these games look fantastic, but when one runs like a Hanna-Barbera or Looney Tunes cartoon from the '50s-'90s? Forget that s**t! What's the point of ray tracing when the game gives you SERIOUS "uncanny valley" vibes because of all of this? Cars look great up close, but TAA makes FM look like I need my eyes tested and destroys ANY sense of immersion, whilst a game that runs on a toaster PC at triple-digit frame rates, whilst looking damned good, has sharper visuals right out to the horizon and doesn't have shadows causing the trees to change colour and shape right in front of your car. That, and AI that can barely stay on track... (Oops, soz for the rant, but it does prove a point.)
Roll on the 9800X3D I'm looking at this month, and the rumoured 300+ TFLOP RDNA5 multi-chip GPUs due in the next, ohh, 12-18 months.
That'd be roughly 5-6x the TFLOPs of the 7900 XTX. Yeah, yeah, I KNOW TFLOPs aren't a reliable metric, but when you go from this:
ua-cam.com/video/fv0EX3zvJUo/v-deo.htmlsi=y45FlgUzSVW2GmV1
To this,
ua-cam.com/video/tNrOUodxRs0/v-deo.htmlsi=5r4HLqffUoKjpwSj
Going from the 10 TFLOP 5700 XT to a 62 TFLOP 7900 XTX got me a 2-3x increase in fps (rough scaling math in the sketch after this comment). The fps can be seen bottom-left of those vids in green if you set playback to "4K", even though it's only 1080p; THEN imagine what you'd see at 57 inches diagonal.
Wait a generation and watch AMD really push the boat out on the GPU front.
Should run my G95NC 240 Hz dual-4K panel at close to or over its refresh rate...
Nvidia keeps selling the next big feature well before the hardware is actually ready for it, let alone decently capable of utilizing it, and now they're using the blatant deceit of different versions of DLSS to sell their fake frames (as do AMD, to be fair, though they still show raster whilst mentioning FSR etc.) and deliberately misleading people who know no better.
It's why I'll take this...
ua-cam.com/video/aAkI8jhMgos/v-deo.htmlsi=XZJq2SO0v1uX7Ixi
Over this....
ua-cam.com/video/3NlpCnDgIMk/v-deo.htmlsi=b7_vJSGsPuuhRBmN
Any day of the week.
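A minimal sketch of the TFLOP-versus-fps scaling described in the comment above; the TFLOP figures are the commenter's round numbers and the fps multipliers their own observation, so this is illustration, not benchmark data:

```python
# How much of a paper (TFLOP) uplift showed up as actual fps,
# per the 5700 XT -> 7900 XTX comparison in the comment above.
tflops_old, tflops_new = 10.0, 62.0
tflops_ratio = tflops_new / tflops_old      # ~6.2x on paper

for fps_ratio in (2.0, 3.0):                # observed 2-3x fps uplift
    realized = fps_ratio / tflops_ratio
    print(f"{fps_ratio:.0f}x fps from a {tflops_ratio:.1f}x TFLOP jump "
          f"-> only ~{realized:.0%} of the paper gain realized")
```

Roughly a third to a half of the paper gain shows up as frames, which is the commenter's point that TFLOPs alone are not a reliable performance metric.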
DLSS is not a performance metric. Performance = real power, RAW power. Can you fake HP in a car? NO.
You can, sort of, with a custom exhaust.
You can swap your noob driver for a pro racer; that's what frame gen is in comparison.
@ a pro driver won’t make more hp… framegen is a scam. Nvidia knows their silicon is almost at its limit. What’s next? 1500watts gpus? Lmao
Honda boys with their laptops, drinking a Red Bull, laughing in your face. Sometimes people say the most uneducated things; please think before you speak.
@@unavailable291 You can put a chip in your car and get better performance; you can upgrade to a turbo and intercooler and get more power. "RAW" performance for a car would be NA, not turbo- or supercharged. Saying frames are fake is equal to saying a turbo/supercharger or performance chip is "FAKE" HP!! Fact.
What clickbait, just showing all the Nvidia advertising.
Thx. I saved a few minutes. Now I can waste them reading comments xD
Nvidia trying their best not to show a raw rasterization comparison. Pure facts... Furthermore, the RTX 4090 Kingpin edition wasn't released for this same reason. An RTX 4090 + power mod + OC + Lossless Scaling 3.0 would close the gap dramatically, to within 10-15 percent. All aboard the "HYPE TRAIN". WE NEED A GPU THAT CAN DO PURE rasterization performance, period.
I know, it's crazy to be able to play a game like Cyberpunk with full path tracing/ray tracing at 4K maxed with all the eye candy and get an acceptable, smooth experience with under 50 ms of input lag... it's crazy, right? Because you're never gonna be able to do that on a console or any other GPU. Lol, I don't think people comprehend how much compute power is needed just to play any game using full path tracing.
Lossless Scaling works much worse than DLSS frame gen, if what has been shown is any indication.
I know the last LSC update with 3.0 frame gen has been very problematic for me in particular.
Rasterization gains will appear with the 60 series. They're squeezing as much as they can out of this chip. They scaled prices back on these; the 60 series will hit the wallet. If people think $2K is high, then they're in for a surprise. Then they'll be crying and whining like pure brokies.
I am waiting until the 60 series (using a 3080 Ti).
This is probably the last one. WW3 will stop all this.
Honestly, what can we expect from that? Nvidia is literally at their limits with available tech. We need 2nm TSMC wafers available at pre-pandemic prices, and higher power limits.
The 5090, with all its spec might, is going to be gimped because it's limited to just under the 600W capacity of its power cable. It's not even the full GB202.
We need 1500-2500W power supplies to handle real raw performance improvements, because the chip has to be enormous to pack in enough compute, and it will be power hungry.
We also need devs to stop stuffing games with all these unoptimized aesthetics...
So it's either that or bet on AI.
This generation might be a lot more interesting than I thought. If the 5080 is only 10-15% better than the 4080 at $999, then an RX 9070 XT at about equal perf. to a 4080 Super (rumored) at ~$550-600 (rumored) suddenly seems like a fantastic deal. Hoping FSR4 can perform on par with DLSS upscaling. If so, I don't see a major reason to go with Nvidia this gen.
Doubt it. I'm expecting around 4070 Ti Super raster performance out of the 9070 XT, which will be dangerously close to what it looks like the 5070 will be capable of. At $500, if the 5070 is within 5-10% of the 9070 XT's raster performance, people will buy the 5070 unless the 9070 XT undercuts it by $100 or more.
Nvidia just has a stronger position in terms of upscaling, ray tracing, and frame gen. Look at the 4080/4080S vs the 7900 XTX on the Steam survey: the 7900 XTX was generally a bit faster in raster, was cheaper than the 4080/4080S, and had more VRAM, but its sales were terrible in comparison.
The 9070 XT is expected to be about 5-10% slower than the 4080, and about a 4070 Ti Super in ray tracing @@scarx4181
I am starting to lean toward this too. Come on AMD, for once!!
Well, I'm keeping my money until I see benchmarks, but my opinion on AMD vs NVIDIA over the last 10 or so years is that unless you are getting NVIDIA's top card, AMD can offer similar performance for less money, and that is how it has been pretty much forever.
The RX 9070 XT should be $449. Undercut the 5070, cause that's what most gamers will be interested in.
"What another useless crap could we force on the gamers? Anyone?"
"Well... I'm not sure if I should say..."
"What..."
"How about adding a fake frame?"
"What..."
"How about adding one fake frame to the existing one, doubling the frame rate?"
"Damn... there is only one serious flaw with your idea..."
"And what is that, boss?"
"You think little... add ONE frame? MANY! MANY FRAMES!"
It's gonna be like 10% faster IRL lmao, truly a software upgrade, what a joke
right
go buy AMD then. What's with the crying
The only raster performance benchmark we have is Resident Evil 4, since it doesn't have DLSS. But remember, RE4 Remake is an AMD-sponsored title and runs worse on Nvidia cards. In other games, especially Nvidia-sponsored ones, the raster performance gains may be greater.
Both the 40 and 50 series would be crippled the same in RE4 regardless...
It has a DLSS mod, which looks tremendously better in RE4.
The New King of Fake Frames
Simply making a dedicated AI chip that processes the frames instead would make a huge difference.
I get the same feeling about all this tech as I did when cars went from plain V8s to V6s and inline 4s with electronic fuel injection, turbos, plastic engine parts, and computer control.
None of the charm and rush of raw performance is present in these designs.
Yup, this hobby is going to die out just like the car enthusiast space currently is, for these same reasons.
The 5080 aka 4080ti Super Duper.
I do plan on upgrading from the 3080 FE to the 5080 FE if I can get one at launch.
that should be a pretty nice upgrade, like 80% more performance?
@@darvader00 The RX 9070 XT might be a great alternative to the 5080 and 5070 Ti, unless DLSS 4 is a big selling point for you.
@SpankeyMcCheeks I'm curious about the video encoder on the 9070 XT and how well it performs for editing.
@@Mayeloski More like 40% in raw raster perf. A lot more with DLSS4, but that's just fake performance.
Omg, how on earth did you miss the fact that the 5080 is literally the worst case in this "software upgrade" lineup? Geez, if this is how an Nvidia fanboy's brain works, no wonder the company managed to grow despite all the obvious anti-consumer behavior.
I don't want a 600W turd on an old node with fake frames in my PC.
DLSS is fine when it comes to super resolution. It’s frame generation that is muddying the waters. I wish people would do a better job differentiating between the two. I never use frame generation so it’s pointless for me but I love super resolution as it improves performance without significant drawbacks.
Can you discuss NR and RTX 5090 performance as it relates to content creation, video editing, and 3D rendering, like in Unreal Engine? Not all of us are gamers, and we content creators don't care how many FPS we can get in a video game. Thanks :)
Oof, the 5080 and below only gaining 15% performance... That instantly makes the 5080 overpriced even at $999
Ball's in AMD's court, I hope they don't mess this up.
Is this the first step from having a gaming PC to just having a Fire Stick-type thing to play games? Everything will be AI with internet streaming...
I've been around computers so long that common tech from years ago doesn't even exist anymore, and I remember when common tech now, like graphics cards, became a thing. I was too broke to afford a graphics card when they first came out, but I bought a motherboard with an AGP slot so I could add one later, which I eventually did. So now I'm thinking that we gamers might be seeing the point where software takes the driver's seat away from rasterization. People are complaining about prices, but if software really can do what it took hundreds of dollars of components to do, then we should all be happy.
Why would that make you happy? You are still going to have to buy hardware at an accelerating cost, but now on top of that they are eventually going to start charging you a subscription fee to be able to use your own pc. Why do you think they keep trying to make cloud gaming a thing? This is the end goal, and if you disagree then you only need to look at everything else to understand that you are wrong. This is just the beginning of the end for the PC/Hardware enthusiast and that is nothing to celebrate.
@@Pleasiotic1 Let's think about this for a moment. Computer hardware manufacturers make their money by selling computer hardware that eventually becomes obsolete so they can sell you more. They have no interest in removing PC hardware enthusiasts. It's the exact opposite: they have figured out that enthusiasts will spend gobs of money to upgrade their equipment. If you think they want to kill this golden goose, you're kidding yourself.
Video game developers might want to go to a subscription model, but the hardware makers would pretend to be on the consumers' side just to keep their money rolling in. Legally we have been buying licenses for a while now anyway, and the world hasn't come to an end.
I like the idea of ownership, but it's temporary no matter which way you look at it. I own games that I can't play anymore anyway, unless I want to build a brand-new vintage system. That computer I talked about from decades ago that had an AGP slot? Well, it broke decades ago. I have lots of games in a box that I can't even load up anymore.
🤣
@@Odin029 Firstly, I have at minimum been involved with PC gaming and PC hardware since it has existed in its modern interpretation. My first PC was a Packard Bell 486SX 25 with 4 MB of RAM and a single-speed CD-ROM. I had been into PCs earlier than that but wasn't old enough to be financially involved. All that is to say I know exactly what makes a HARDWARE enthusiast tick. There is a growing market for retro PC gaming, so just because you don't see the value of older hardware doesn't mean others don't.
Secondly, Nvidia already tried their hand at cloud gaming, so they do want that business model, just like every other hardware mfr out there. I am a systems engineer and have been for many years, so I know EXACTLY how and why all of this is the way it is. I deal with it every single day... software licensing notwithstanding.
You also seem to think that Nvidia sees gaming as a golden goose, when in fact it is anything but for them (look at their earnings reports). They sell silicon at a tremendous discount in that channel compared to the datacenter market. If everyone were willing to spend $3-20k per card or even more in the gaming space, at the same volume as now, then you would have a point.
If you are okay with spending $100 a month to play games on your $4k PC because "progress", then know that people more intelligent than you don't feel the same way.
Nvidia embargoed the 5080 through the day of release. Hmmmm. 6:34 looks better with RTX OFF.
If you would've told people back in the day that a new card came out with like 10 percent better performance, they would've laughed at it.
I wonder if we would have gotten all those AI-driven technologies in the gaming segment if not for the demand for those types of cards in data centers. There's no point for Nvidia to pursue two different development tracks, is there?
Blackwell is a 40 super super refresh 😉
I just have an RTX 3080 10gig model. Not sure what new card would be an upgrade without spending too much money.
Dislike and blocked. Bring some benchmarks next time you suggest the “truth”
5090 --> 5080
5080 --> 5070 TI
5070 TI --> 5070
5070 --> 5060
FIXED
The 2-grand 5090 for gaming, when maybe one good AAA game will be released before it's old news, is for morons. The 4080/70 are watered down, with fake graphics.
Upscaling is not the problem, but this frame gen bullshit comparison is fake. Also, they're not comparing to the Super cards. Jensen bought a new jacket and drove his Harley at full 600W speed into a wall.
I'm just waiting on the used market to finally explode.
@5:45 AI and upscaling may be the future of graphics, and the trend is towards a dystopia where games are left unoptimized and rely on DLSS to be playable.
Anyone remember when AI was simply called a "software program"
Massive hyperbole
@@_n8thagr8_63 Not really.
Who would want 27 FPS as a base for upscaling and frame gen? This is a joke. Why is no journalist laughing at Nvidia?
The 5000 series is soooo bad, people just don't know that yet.
Until real-world benchmarks come out nothing is good or bad.
You say that because you can't afford one.
@ You are reta=rded.
I plan to go from a 3080ti to a 5090.
But 6090 will be faster.
@pandalayreal And the 7090 will be faster than that. What's your point? If you're constantly holding out on upgrading because "the next new thing" will be better than current then you're never going to upgrade.
@@vinny9500 The point is to catch a good series for longevity; this time we get the same 4N node and a huge 600W TDP.
@@pandalayreal The 4070 Ti is as good as the 3090, so jumping two gens to a 5090 will dominate.
The cards are basically the same in raster as the 40-series Super cards, other than the 5090. This might be the most boring generation if that really is true.
The fact that the 5080 is barely gonna be faster than a 7900 XTX just shows how much of a wet fart this generation is...
Only the 5090 has a decent performance increase; everything below it feels like another refresh.
How can this even be considered accurate when the 4080 cannot even run DLSS 4 multi frame gen? I call BS on these so-called benchmarks!
I've had the RX 6700 XT 12 GB RED DEVIL since 2021 and it still runs all games maxed at 1440p 60 fps. In games that have ray tracing, I use FSR to run them with max graphics and max ray tracing at 1440p 60 fps, while those that don't have ray tracing still run maxed at 1440p 60 fps without FSR. Which means the only reason for me to get an RTX 5070 would be max graphics and max ray tracing at 1440p 60 fps without FSR, but it is not worth giving so much money for it as I don't really need it.
MY MONSTER PC IS:
RYZEN 7 1700
RX 6700 XT 12 GB RED DEVIL, MP600 2TB (WRITE: 4950 MB/s, READ: 4250 MB/s), DELL P2416D 24'' 2560x1440 60 Hz IPS
It loads everything in 1 second (Windows, internet, games, movies/series, programs) and also has amazing graphics in games maxed at 1440p, and in movies and series too.
Which means it doesn't need any upgrades at this time.
Guys, a question... if the 5080 is just about 15% better than the 4080 while eating 15% more power, is it really an upgrade? Should I maybe consider buying a 4080 at this point?
5-12% raster increase on average at best. 15-30% for 5090
The 5090 is more like 30-35%; the others are around 10-15%, with the 5070 being about 20% faster than the 4070. That is completely normal, as the 4070 was 20% faster in raster vs the 3070. This is an average generational uplift, with a bigger focus on AI features and generated frames.
@@xblur17 Maybe for those two cards, but each new 80-series card ALWAYS beats the previous gen flagship by around 10-15%. The 4080 beat the 3080 by 49%, so no, this is not the average gen-over-gen uplift. If it were, the 5080 would beat the 4090 by 10-15%!! That would mean the 5080 being 35-40% faster than a 4080 to accomplish that, which is less than last gen. If it truly is 15% instead, it is the worst 80-series card to date by a MILE! Edit: the 4080, btw, beat the 3090 Ti by 12%.
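A quick worked version of the percentages argued in this thread; the ~22% 4090-over-4080 raster gap is an assumed round figure for illustration, and the perf-per-watt line uses the rumored +15%/+15% numbers from the question above:

```python
# Chain the generational ratios the thread is arguing about.
gap_4090_over_4080 = 1.22   # assumed raster gap, illustration only

for beat in (1.10, 1.15):   # "new 80-class beats old flagship by 10-15%"
    over_4080 = gap_4090_over_4080 * beat
    print(f"5080 beating the 4090 by {beat - 1:.0%} would mean "
          f"{over_4080 - 1:.0%} over the 4080")

# Perf-per-watt sanity check on the rumored 5080 numbers:
# +15% performance at +15% power is zero efficiency gain.
print(f"perf/W vs 4080: {1.15 / 1.15:.2f}x")
```

Chaining the ratios gives roughly 34-40% over the 4080, in line with the 35-40% claimed above, and the perf-per-watt check prints 1.00x, which is why the "is it really an upgrade?" question is fair.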
They are really trying to upsell to the 5090. With the 5080 also being what the 5070 should've been, this launch is disappointing as usual.
I think I will go for the RX 9070 XT if it is priced well.
You commenters need to do a better job using YouTube. There is at least ONE channel that got to test the 5090 themselves at Nvidia. He disabled frame gen even though he was told not to.
CP2077 ran at 62 fps on psycho settings at 4K with frame gen disabled.
Feels like they could've dropped AI cores for more CUDA cores; that way maybe we'll see a better raw performance improvement than a fake one...
Yeah, $2000 for 25% more rasterization performance. Plus your monitor will only display as fast as the Hz it is built for. Greedy green, and all the suckers giving them money.
So far, I'm looking at this in 3s.
The 10 series was a massive jump over the prior series. The 20 series brought a brand-new tech (DLSS) and minimal upgrades. The 30 series brought actual improvements to performance and updated the new tech to be better.
The 40 series was a massive jump compared to the previous gen. The 50 series is bringing brand-new tech (multi frame gen) but minimal gains performance-wise. The 60 series will potentially bring another gain in performance, in the form of multi-chip tech or however they plan to do it, plus improvements to the new tech.
Improving gaming performance is of minor concern for Nvidia now. Any hardware improvements going forward that improve gaming performance will only be a happy coincidence. Why do you think they started putting Tensor cores in their gaming products? DLSS and such is a shoehorn to get commercial value out of silicon that is otherwise useless from a gaming perspective. Nvidia also knows that the vast majority of gaming consumers will buy their products whether they are good or not, so they have little real-world reason to make any specific hardware changes in the interest of gaming.
I wouldn't be surprised to see them eventually move on from gaming if AI stays in a boom state, as gaming's primary value to them is marketing-based.
I am hoping they call it the 6900 :) and that it will perform like a monster ;)
300 fps is not something I need for flat-screen gaming; I need either the raw power or frame gen to work in VR.
Three things prevent me from even CONSIDERING this POS: 1) defective power connectors, 2) piss-poor Linux support, and 3) INSANE price gouging. No thanks. If I buy a new GPU this gen it will be a 9070 XT! (I do realize these aren't in the same class performance-wise; don't care.)
I can see why both companies and MS want to do frame generation. We are pretty close to the silicon cap; soon it will just be refinements unless they come out with some completely new design beyond silicon. Frame gen is the tech to strap a turbo onto the silicon we have so we can continue getting more out of it. As long as the picture is clear, smooth, and consistent, I don't care how we get improved performance.
When was the last time you saw a game with a "clear picture"? It's all a smearing mess (even without any AI tech).
Think you need to get a new monitor mate. My MSI 272CQR gives a very clear picture.
@@Deeps__ Try playing some older games from the 2016-2017 era and then you'll know what a clear image is. This is anything but clear these days.
I play on 4k WOLED btw, so no. I have no problems with screen quality.
I play a wide spread of games: WoW, D3, D4, Hearthstone, Factorio, PoE, and I have zero issues. Crystal clear for me, even in old WoW zones 😂
At this point, with all the games Nvidia is playing, I really hope Intel takes a segment of their business.
What do we think? I think Nvidia killed the glorious xx70 card; the 5070 is already outdated on day one, and useless.
A 30% raw upgrade is worth it for me, as Skyrim VR just kills the framerate. No gain from DLSS 4 there, but other games will hopefully show the improvements without image degradation. Also, something I seldom hear mentioned is that having more memory is a gain for AI. Currently I do not have enough video memory to run a game and also run an AI program such as Mantella, where characters get some improved abilities. Who knows what things will come up in the future that can also use this memory? Maybe a local AI assistant running in the background when you are not gaming. And as we have seen, Nvidia has a record of improving and adding features as they go along, some even backward-compatible. The software and drivers are one of the main reasons AMD has such an uphill battle.
All BS until real benchmarks come out.
I'm looking to do a full system replacement sometime over the next year, as I'm on a system that is really starting to show its age. I'm on the fence between targeting 2K on an ultrawide or 4K, but as of right now I'm waiting to see actual benchmarks on the 50 series and the AMD 9000 series.
40 series gets everything but the multi frame gen DLSS. Definitely a wait and see feature set imho.
I have a 2080 super and am very tempted to upgrade to either the 5090 or 5080
I have a 2080ti, I want to upgrade to a 5080 or wait for a 5080 S/Ti.... depending on the benchmarks and testing.
The real FPS is much lower for the 5090. The lesser versions will be so much lower. Will the 6000 series improve by just 15 extra frames?
Getting more FPS is great, but how many do we need? Most people are good with 60fps, going up to 120fps. Of course anyone buying a 5090 should be buying a 240Hz or 360Hz-capable monitor. But I would really prefer the AI to improve the quality of the graphics rather than the number of frames.
I want to know about raw power. DLSS is a topic I don't care about. Is it a brick, and does it build a house or a mansion?
Looks like we're waiting for the refresh again
The new TDP king. You can replace your furnace and game for the low low price of US$2000
A 4070Ti will only be worth about 400 quid. I’d go for that.
They could do 50%. When Jensen showed the full die at CES, it showed that at 100% it would be a 50% uplift, but instead they cut it down, called that the 5090, and everything else down the stack gets a minimal hardware uplift. They could have made the 5090 use the full Blackwell die for a 50%+ uplift over the 4090, and given the 5080 a 35-40% uplift over the 4080, i.e. 10-15% over the 4090, which is what it should have been, as every past 80-series card beat the previous flagship by that amount. Then they could have done the same down the stack and given proper VRAM. I am all for DLSS and AI, but only on top of a proper hardware uplift gen to gen. I was going to pick up a 5080, but I will not if it is slower than a 4090; only if it is faster. And 15% is actually 13% (vs the 4080 Super), which is not generational at all. I think they just want to be able to sell new cards every year, so they give the minimum, then do a Super/Ti release and screw those who just bought the gimped version a year ago. It is literally anti-consumer. I was hyped at first, but if these numbers hold true I will wait for the 60 series and only buy if its 80-series card beats the 5090 and has proper VRAM, or guess what, I will wait again. I have 300 games just in Steam that I can play and have a blast with, so I do not need a new card and won't buy unless it is great value with an uplift over the past gen that will last (VRAM).
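A quick sanity check of that chain of percentages, as a minimal Python sketch; the assumption that the 4090 is ~22% faster than the 4080 in raster is my own illustrative figure, not a benchmark:

```python
# If the 5080 had been 35-40% faster than the 4080, how far ahead of the
# 4090 would it land? ASSUMPTION: 4090 ~= 1.22x the 4080 in raster.
uplift_4090_over_4080 = 1.22

for uplift_5080 in (1.35, 1.40):
    rel = uplift_5080 / uplift_4090_over_4080
    print(f"5080 at +{(uplift_5080 - 1) * 100:.0f}% over 4080 "
          f"-> ~{(rel - 1) * 100:.0f}% over 4090")
# -> roughly +11% to +15% over the 4090, matching the 10-15% claimed above.
```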
I'm mostly considering 5080 or 5070 Ti. 5070 Ti seems like a sweet spot with most of what 5080 has - maybe 20% less cost and 20% less performance. No founders edition for 5070 Ti changes the equation; that could mean they cost $100+ over MSRP for good ones. 9070 XT is a wildcard but I can't see it catching 5070 Ti. It all pushes me back to 5080 even though it only has 16GB and we might have to wait a year to get a version with 24GB. The situation could be better. Thumbs up for great videos!
I cannot wait for the Nvidia PR spin when these new cards are tested with everything OFF. Just pure hardware.
All these leaks are looking like 10-20+% maaaaybe depending on the title.
Here in NZ, you basically cannot buy any 40-series or 7000-series cards; all retailers are "sold out".
The reviews will hold extra weight.
I don't get the hate against frame gen. AMD is working on it too and it's likely the future. There is the Reflex latency reducer. Are you really that great of a PVP FPS pro gamer that the difference between 15 and 40 ms latency breaks your k/d ratio?
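For what it's worth, here is a rough back-of-envelope model of why generated frames raise displayed fps without cutting input latency the same way. This is a minimal Python sketch with made-up illustrative numbers; the 5 ms generation overhead is my own assumption, not a measurement:

```python
# Toy model of multi-frame generation (MFG): displayed fps scales with
# the generation factor, but input is only sampled once per *rendered*
# frame, so latency stays tied to the base render rate.
# All numbers are illustrative assumptions, not measurements.

def mfg_estimate(base_fps: float, factor: int, gen_overhead_ms: float = 5.0):
    """Return (displayed fps, approximate input latency in ms)."""
    base_frame_ms = 1000.0 / base_fps   # time to render one real frame
    displayed_fps = base_fps * factor   # real + generated frames shown
    latency_ms = base_frame_ms + gen_overhead_ms
    return displayed_fps, latency_ms

for base in (30, 60):
    for factor in (2, 4):
        fps, lat = mfg_estimate(base, factor)
        print(f"base {base} fps, {factor}x -> ~{fps:.0f} fps shown, ~{lat:.0f} ms latency")
```

Under those assumptions, 30 fps with 4x generation shows ~120 fps but still carries ~38 ms of latency, which is roughly the gap being argued about here.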
No, your name is Charlotte!
well Blackwell looks disappointing, not worth the time of day
I'm happy with my 3080 12GB @ 1440p ultrawide; no need for ray tracing. Maybe for VR I would like more performance, but in that case only raster performance counts.
Thank you for deactivating autotranslate.
I'm thinking of upgrading my 3060 Ti to a 5070 Ti or 5080. We'll see what the prices end up at in Norway. We pay more :P
RTX 5090 the NEW KING?? *Yes, the new king of kidney sales!!* Not one normal gamer is gonna buy an RTX 5090! These will sit on shelves for years!
20% improvement plus fake frames (that no one wants), Nvidia getting all the money again.
Incredibly lazy reporting, bro.
There is so much more info out there now.
And you release this slop.
The thing about them training models for their cards and using that as a lead metric doesn't take into consideration that models can be trained that will eventually be better than theirs and run on any old standard CUDA core or any GPU implementation. So the hardware itself not being a major leap in consumer devices is problematic for the price they want to charge; they could have trained the same thing to run on old hardware. AMD did it with FSR when the first generation of DLSS came out, and Nvidia told everyone it wouldn't be backwards compatible, but in truth they set that limitation themselves. Their cards should be $30 less for what they are giving consumers; just not worth it, and they've once again set the bar on keeping GPUs at an overpriced market value, which means their competition is going to keep their prices unnecessarily high. Nah, I'll stick with my 30 series for at least two more generations, and I bet games using AI are going to stay performant on them for a while. I think a lot of devs are seeing the form factor of mobile PCs like the Steam Deck and optimizing to accommodate them, never mind the performance advances in the major game engines. The current state of GPUs is not going to push devs to do more with them, since we are not getting more power or orders of magnitude more memory; AI is going to be the optimization profile going forward, I think.
I'm tempted to upgrade from my 1080 Ti, but probably will wait till next gen.
Playing competitive games most of the time, I don't need FG or DLSS because those add lag; all I care about is pure raster power... simple as that!
Brute forced frames > guess-work frames.
I am buying a GPU to render games, not Ray-trace movies or use optical flow processing for automotive positioning calculations.
If Nvidia wants to satisfy those segments they can make products for that. As a gamer, I don't care much for 10-15% improvements every 2 years at 20-30% higher prices 😢
I still don't count these as apples to apples.
Because surely the 50 series will be more efficient in ray tracing and also better suited to the new DLSS transformer model.
I would be more interested in something like Indiana Jones running native with no ray tracing etc. I'm assuming they won't show that, because the 5090 is probably only 20-25% faster than a 4090.
I have an RTX 3070 and am thinking of getting an RTX 5090. I think the performance increase will be night and day, no? Or should I wait more...?
I wonder how the performance looks with that 3rd party MFG tool? Would the performance be at about parity +15-20%?
About 30% average raw uplift from the 4090 to the 5090.
@dddddbbb yes, but that's only the top-end, most expensive, highly overpriced halo cards. Everything below that is in the realm of 10-20%, typically edging toward the lower median range.
@@RyneLanders Well 4080 to 5080 is like $250 cheaper so not as surprising.
@@dddddbbb OK? Not seeing the relevance. Nvidia is claiming they're the same class, so you expect savings and performance gains gen-on-gen. That's typically how things go. In most computer development, products work on a tick-tock cycle, where the "tick" is the revolutionary or advanced product and the "tock" is the refresh. Everyone thought the 3000 series was the "tick" and the 4000 series was the "tock", which would make the 5000 series the next "tick". What we've got instead is: the 3000 series was a tick, the 4000 series was another, lesser "tick" with the introduction of the AI cores, FG, and more power, and now the 5000 series is a very weak "tock" on the 4000 series.
@ Gotta compare apples to apples - Easiest way to do that is by comparing flagships of each generation imo.
I just care about raster, roll on the full reviews n data from the big channels so we can see where to put our money this generation. Hope ya all get what you prefer once they land on YTube.
Every year the 90 series will be king in gaming, then sandbagged over time through drivers... AMD will slowly gain traction, then the new gen comes out. Repeat.
The new 50 series is an absolute joke, and the sheep that support AI "performance" are the problem we now face.
5070 (48 SM) vs 4070 (46 SM) = ~4% more
5070 Ti (70 SM) vs 4070 Ti Super (66 SM) = ~6% more
5080 (84 SM) vs 4080 Super (80 SM) = 5% more
5090 (170 SM) vs 4090 (128 SM) = ~33% more
If you can't get the 5090 it's not worth the upgrade. 🤷♂️
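Those deltas are easy to recompute; here is a minimal Python sketch using the SM counts quoted above. Note it assumes performance scales linearly with SM count, which is only a rough proxy:

```python
# Percent increase in SM count, new generation vs old (counts taken from
# the comment above). SM count is only a rough proxy for performance.
sm_counts = {
    "5070 vs 4070":             (48, 46),
    "5070 Ti vs 4070 Ti Super": (70, 66),
    "5080 vs 4080 Super":       (84, 80),
    "5090 vs 4090":             (170, 128),
}

for pair, (new, old) in sm_counts.items():
    print(f"{pair}: +{100 * (new - old) / old:.0f}% SMs")
# 5070: +4%, 5070 Ti: +6%, 5080: +5%, 5090: +33%
```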
They compare the 5070 Ti to the 4070 Ti, not the Super, so in pure performance they might actually be super close.
I have a 3080 and am looking to upgrade because I play at 4K.
I have no card. Hoping to get a 5090
I was in the same situation, no card since September. I recently bought Battlemage, but not only for gaming; video editing and AI stuff as well. And I didn't want to buy from the monopolist company AKA Ngreedia.
I do plan on upgrading from the 4070 to the 5080 but only if there is a major improvement :)
PC Centric has a video of it running Cyberpunk. It got 28 fps average with no frame gen, so about a 30% increase over the 4090.
I am upgrading from the AMD 3700 era to an RTX 5090 / AMD 9950X3D.
Numbers aside, Nvidia's launch was such a big slap in the face of consumers; it's like they take us for braindead idiots. It was a mockery, a circus. I'm not going to buy new Nvidia technology this generation, come hell or high water. Not that AMD and Intel are much better; they seem to be taking their cues from the jacket lately.
Just watched a short review in Cyberpunk on the 5090 - TLDR it's disappointing unless you're ok with input lag.
That wasn't my reading of that PC Centric video (if that's the one you are talking about). It was actually very impressive.
DF: 10 seconds into the video is all it takes till you see big, ugly artifacts in C2077.
@@lcg-channel For a $2k+ card I expected more, but DLSS 4 does look impressive if you're OK with the downsides.
What video do you mean? The input lag numbers i saw weren't too bad imo.
Paying for pixel soup with higher latency, and software that won't make it into games for 6 years, if ever. Mmmkay 💀
I have a 1080 Ti and am looking for the best card up to £700, which is the price I paid for the 1080 Ti. If not, I can hold out another year if needed.
Let’s hope AMD does a good job
I was going to upgrade my 3070 to a 5080, but all this talk of fake frames and increased latency has me sitting on the fence now. Gonna be smart this time and wait for reviews. I want an increase in raster mainly, and if the generational increase is meh I'll skip or go AMD for more VRAM.
Not upgrading from a 4070TiS. I still game at 60fps, so all this frame-gen stuff is meaningless to me.
Exactly! If you are gaming at 4K you are gaming at 60 FPS, so frame gen at 144 fps seems tone deaf since there are not many people gaming on 4K 144Hz monitors/TVs. Plus, I'm curious about latency, since I know using frame gen to go from 30 to 60 fps is a horrible experience unless you are using a controller.
@@keithroy1431 real gamers use controllers
Basically no generational boost once you account for the extra CUDA cores and higher power draw... Just don't buy this crap if you really don't need to.