0:25 THE real performance of 5090
try Cyberpunk at 4k no DLSS on a 4090, it'll fry your eggs faster than its FPS
@@NEKORID who da fuk cares about 4k? the way to go is smoothness , all the way 540 hz
do you see the graphics? that's visually stunning and performing better than a 4090. You're upset because they're finding new strategies to work around the minimum transistor size, and the eventual plateau of raw performance
@@kolebouk5502 zip it up when you're done.
😂🤣👍
How long before GPU AI RENDERING becomes subscription based?
Underrated comment.
It will happen as soon as cloud based gaming hits mass adoption.
dont give them ideas
Happened already in GeForce now
too late GFN is there :)
"What kind of graphics card do you have?“ is becoming like “what kind of car do you drive?“
If only lamborghinis were $2000...
cars doesnt use ai to push its speed
I don't get it
What do you mean by that!??
@@scriptkiddie1425 He means that the graphics cards these days cost as much as a car. It's not that hard to understand bro
I'm still rocking an RTX 2080, not upgrading. The price is outrageous.
$2000 graphics card is just ridiculous.
I agree, especially the price gap between the 5090 and 5080
It's not ridiculous. It's the same cost as two long vacations. It depends on your life priorities.... these cards are either for people who stay at home all the time or rich people
@@wasimali9207 we will have a 5080 Ti, 5080 Super, 5080 Ti Super and possibly more cards to fill that gap lol
will be incredible for running AI models, 14 times faster than my 3090
I mean.... the iPhone Pro Max costs above 1k too and isn't filled with groundbreaking AI tech... People still buy it every year...
Seems like I'll have to wait for 9090
or a 10070 since it will just cost 1/3 of the 9090 :P hehe- but Vram is Vram :P
Then start collecting money straight away: 9090, 128 gig, 1000 watts. The cost per month will be as high as a heating calorifier
I was thinking the same thing. I have a 4080, I'll just wait for the 8080.
Facts 😂
For 9999$
i'd love to see more games with just no frame gen or DLSS or any bs , just pure gpu performance
may i suggest amd?
why not Dlss? Do you want to use TAA?
DLSS is awesome. so whats your point exactly?
@@crateer you can "play 4k" but at 1080p native resolution
@@hamzakhalil-gs7oc DLSS is resolution scaling .
the fact most devs optimize for 5090 with all the ai slop is the most concerning part.
What even is optimisation anymore 😊
yes and no, because that'll be on dead AAA game flops that will have their cord pulled after 2 weeks
@@notaras1985 bruv, 4x the frames and so much higher quality frames. I'm hooked for the future; the images I'm seeing with my eyes look beautiful
0:25 "This is how incapable we currently are of running RT at full no bs 4k" And we charge you $2000 for it
@@NEKORID not their fault Devs don't optimise games anymore...
but its full rt mode
@@NEKORID $2000 is just a month worth of rent in the US….
@@MADED1TS Current hardware isn't powerful enough for full RT; the devs can't be blamed for that, it's impossible to "optimize" it much more without losing fidelity. Take a look at render times using ray tracing in software like Blender and Maya, and you'll be utterly amazed that real-time RT is a thing. (A SINGLE FRAME can take several hours to render, versus 0.05 seconds even in this choppy mess of a demo with DLSS off.)
That said, I agree with the main point: performance is too bad for current hardware, and they're pushing too hard.
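To put that render-time gap in numbers, here is a back-of-the-envelope comparison; the two-hour figure is just an assumed example for an offline Blender/Maya path-traced frame, and the 0.05 s figure is the rough no-DLSS frame time quoted above:

```python
# Back-of-the-envelope speedup estimate (illustrative numbers, not measurements)
offline_seconds_per_frame = 2 * 3600   # assume ~2 hours for one offline path-traced frame
realtime_seconds_per_frame = 0.05      # ~20 fps "choppy" real-time RT with DLSS off

speedup = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"Real-time RT here is roughly {speedup:,.0f}x faster per frame")  # ~144,000x
```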
@@exscape4060 People clowned on the card my fiancée uses while I use a 3090: she gets better frames than my 3090 at 1080p and 1440p, and at 4K I get about 35 more frames. I usually average 100 frames in most games with my 3090; she's usually around 50 frames at 4K, and at times 40, because of the low VRAM. The 3080 and 3090 are fine as hardware, but a 5070 at 500 bucks will demolish the 4090 at 1080p and 1440p as well, and be close to it at 4K. And people don't realize this, but 4K gaming isn't cheap and not for normal people: a proper 4K monitor runs 600, or if you get the 5090 your monitor will be 1500 dollars. A PC build plus monitor with a 5090 in it is looking at around 6 grand total.
They say 2000.00 MSRP, by the time it hits the market it will be pushing 3K.
brother 4090s are selling for 3k...
That's your power bill every week 🤣
no, it's $549
I agree. You better be one of the first buyers if u want to buy for 2k. It was the same with the AMD 9800X3D.
crazy right
Next they'll have DLSS 5 where you need a subscription to unlock its full potential. I'm saying it now. GL!
Am I the only one who liked the standart face more then the neural one? Surely it looks more realistic in terms of proportions but overall it seemed like it was poorly photoshopped to the head. Also facial movement and expressions looked more appealing with standart face while neural was acting with kind of delay
StandarD. StandarD face. StandarD with a D. "STANDARD" not "sTaNdArT". You sound incredibly dense, it's not even pronounced with a "T".
And yeah that's because of how the technology works. It literally swaps out the face for a 'more realistic' version or interpretation on-the-fly. Obviously it's not perfect, but look at how far ray-based effects and AI-based upscaling has come in recent years. It's also not just a blank, dead face slapped on top, it follows the expressions and speech of the target.
@@gopnikolai7483 🤓🤓
I don’t like none of them, but the neural face looked very uncanny and weird
@@gopnikolai7483 overreacting
@@gopnikolai7483 go touch the grass
My Graphic Card will eventually cost more than my car if they keep this up! 😅
really? New cars nowadays cost +35K...
@@TheFlagFilms he was probably talking about second hand, but the same thing applies: old cars in some cities are not allowed for "environment issues", which makes you have to buy a new car. The same thing happens with graphics cards: instead of optimizing games they just come out with a new graphics card every year ... disgusting ...
why do you need to go out if you have a 5090?
@@g33k_Tech to survive
@@TheFlagFilms $20k+ not $35k and up
One day they might make a decent game to use all these effects on?
I am so old cause I remember when the light, shadows & effects were free in game. My first card was a GTX285. I paid $319 US
dude i started on a commodore 64 back in the 80's - you aint old :)
GeForce 4 🤣
Vic 20 🤓
Spectrum
ATi Rage and GeForce 2 MX 👌🏼😂
0:25 so, the solution is not optimizing. It's the players buying a better graphic card.
its the hardware that is made and programmed to run the optimization software.
$2000 ? there goes the liver.
You only have one liver, so I wouldn't sell that for the 5090. And if you already sold a kidney for the 4090, may I suggest selling an eyeball? Although you wouldn't be able to enjoy that Ray Tracing as much so... maybe sell your testicles!
@@brucer.5403 HA! Fools, I bought the H100 (80gb Vram) but i sold half of my liver and 1 of my Kidneys, so jokes on the rest of u! since i will have a good G-card until 9090 comes out!
@@brucer.5403 I'd sell you🤣
@@brucer.5403 🤣🤣 or just start in OF
consoooooooom
Finally ! I can play Roblox Ultra 8K Setting with This New RTX 5000
Because Roblox is the only game ever made?
@@pelvist he’s joking
you can do that with a 4080
With 20 FPS without DLSS4
Victim
Can't wait for system requirements to go through the roof later
0:00 in this demo, 50% of the screen is black. The huge bars are there for a good reason...
samuel gordon profile picture!!!
@iphosux 😆
Lets take a step back
I personally hope one day, everyone will be able to reasonably afford this level of performance.
4k and raytracing, 60-120 fps.
And one day, we will, I feel.
With the 7900xtx, I can get stuff decently close to how this looks. But one day, these cards will be 200 dollars.
So figure in maybe 7 years, 5090 performance will be the standard. So people with 5090s are paving that way for us.
Yes, optimization is always an issue.
I'm in since 3dfx's Voodoo2. Yes, today's "wow" is always the standard in 10 years. ^^
@@ThadMiller1 openGL was pretty sick Back then
13:52 i saw this and ran away
$2000 graphics card for... 27 fps real ?
This is going to be amazing for my Vampire Survivors sessions!!!
imagine still playing vampire survivors in 2025.... at least play balatro lol.
Also doing that, great game!
You can play that on a 2050 laptop bro
Gameplay over Graphics... how many more times do we gamers need to repeat ourselves to be listened to?
great games and great movies are made with great game mechanics / stories, not graphics and CGI. realism through mechanics / story is much more valuable and engaging than realistic graphics. what do you do with photorealistic graphics that you cannot interact with? you can have great games with an infinite amount of fun with just 2D images if you want to, if its mechanics are great. older games were much better, despite worse graphics.
Yeah. I'm trying to escape the reality not go back to it!!!
When you have to install a second PSU just to run the graphics card.
I strongly advise you not to operate your components of the same system with 2 independent circuits.
You don't need to do that.
It's probably a joke@@KURK_KOKANE
Why would you need that? Even with 4090, the total power draw is maximum 7-800watts. Even a decent 850w psu is enough for that..
How about use AI technology to stop resellers from buying all the GPU'S with bots on release and selling them for 3x the MSRP.
Possible. But they just don't f*cking care.
The 40 series showed the start of real innovation and sold it at a high price, while the 50 series only showed frame gains from multi frame DLSS. Because RTX 40 delivered the shock, RTX 50 is a generation to skip.
Either an S-tier game that goes beyond AAA needs to come out, or a cheap, good-value graphics card aimed at the mass market needs to appear.
AI should be thought of as a different category.
Does this GPU come with a monitor, a CPU and a whole desk setup? Why $2000?
Remember when a new card would be coming out and they would showcase all the cool games you could play on it? Now we get Smart Zoi, a theoretical character that doesn't do much to change the gameplay and might or might not ever be in any game. Is this Project Milo from Lionhead Studios all over again?
idk if you guys are just pretending to be stupid or are actually stupid. Smart Zoi is meant to showcase Nvidia's new ACE technology that powers AI NPC behavior. It isn't a graphics card technology. Nvidia is also an AI and robotics company.
Also, this independent channel is putting together its own clips. It's not a 5090 showcase by Nvidia. That's why PUBG Ally is in there.
They showed 8 games that are perfect for showcasing graphics, is that not enough?
Wait?! You guys watching the graphics on a graphic cards?!
@@_MaxHeadroom_ Those aren't new games made to showcase the cards. They are games that are already out and getting a frame boost patch. Not that exciting. Remember that Indiana Jones game nobody bought. Now you can play it at 300 fps.
@@michaelblue4619 This was a graphics card showcase. I don't care if they make vacuums. They are trying to sell us a 2K graphics card. 5 years ago you could get 2 gaming laptops for that price. Sell me on it. You being so smart should have been able to figure that out.
03:18 am I the only one that thinks the reflections look better with RTX off 😂
Yes, this is all absolutely incredible, but who the hell cares about framerates in the 200s when the input latency will be locked to the base framerate?
I mean 200fps with ~60fps sound pretty amazing..
A lot of people do.
@@JenovaGirzz more specifically, I meant the first part showing a framerate in the 20s boosted to the 100s. A 60 fps lock boosted higher is fine, but not a base framerate in the 20s.
@@JenovaGirzz the f...
Anything that has a base framerate of 60 will have very good input lag, so I wouldn't complain about it. They are not showing how "low" the input lag can be but how smooth the game FEELS. This is not competitive gaming; most competitive games should be played without DLSS 4 or ray tracing anyway. The purpose of DLSS frame generation is the feeling of smoothness, just like motion blur
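A rough sketch of that point with made-up example numbers: frame generation multiplies the displayed framerate, but input is still sampled once per rendered (base) frame, so responsiveness follows the base rate:

```python
# Illustrative only: displayed fps vs. the interval at which input is actually sampled
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps, gen_factor in [(28, 4), (60, 4)]:    # assumed example values
    displayed_fps = base_fps * gen_factor           # what the fps counter shows
    print(f"base {base_fps} fps -> shows {displayed_fps} fps, "
          f"but input sampled every ~{frame_time_ms(base_fps):.0f} ms")
# base 28 fps -> shows 112 fps, but input sampled every ~36 ms
# base 60 fps -> shows 240 fps, but input sampled every ~17 ms
```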
I realized today no amount of money is enough to buy happiness. Watching a 344 fps video showcase on 120hz monitor.
and once you get your hands on one, you won't be able to enjoy the now old favourite game of yours
I know this all sucks and all, but I feel excited since I'm coming from a GTX 1070. I'm also coming from 1600x900 resolution, so an upgrade to 1080p ultrawide at 100hz/fps is such an awesome thing to me that I'm excited for the 5070.
why not get a 4090
These GPUs are monsters at traditional game rendering. But Path Tracing is another beast and the fact you can get such good quality with AI is insane. You would need a farm of GPUs to render this raw. I don't know why so many people don't seem to understand this. You can always turn RT off and play at 240. If the game allows you.
5:33 unless ya trick the boss few times it’ll glitch out of existence lmao 😅👍🏽
0:00 in this demo 50% of the screen is black bars. And that is for a good reason. :)
Both DLSS and FSR shouldn't exist, considering that their use in games is now taken as the default, so devs are putting less and less effort into optimizing their games for native gameplay.
But at least FSR isn't locked behind a paywall, being free and available to any card, while NVIDIA promotes its hardware solely around the use of DLSS.
Sure, go ahead, support NVIDIA by buying their cards; after all you will have better FPS than AMD with everything enabled. Just don't complain when the 60 series presents the same thing at a bigger scale and announces the 6090 for $2500 or more.
Not achieving native 30 FPS in Cyberpunk in 2025 with supposedly much better technology just proves my point, and keep in mind that game was optimized in partnership with NVIDIA.
Dude every 40 series GPU already has DLSS…
@@JayGotKeys-mb8ni Every Nvidia RTX card has had DLSS since the RTX series began in Sept 2018; that was not my point
@ be quiet
Should I play the game through the NVIDIA app to make that frame generation work?
I'm so glad my gaming days are over. Companies are openly fleecing the gamers because they know you want to remain on the cutting edge. They are literally robbing you.
I guess we can say goodbye to optimization in gaming.
The problem with this technology is that it gives an excuse to devs to not optimize their games entirely relying on AI.
Don't be dumb, they know they can't sell their game if they optimize it only for the 1% players who own such cards
@@MrHyonD The Monster Hunter Wilds stress test in Nov 2024 ran into this exact issue. People showed that it couldn't be played on a GTX 1080, and that it was almost necessary to use DLSS to get above 60 fps with certain 20 and 30 series RTX cards at medium or high settings (which should be considered the norm for most gamers). Pushing graphical fidelity is nice and all, but at the cost of gameplay or performance it's a step in the wrong direction IMO, especially if you have to rely on DLSS or FSR to get your game to a playable state.
DLSS4.... and how is it with native settings? How many FPS? ...
Who cares native since you get insane performance in dlss technology
30% more rasterizing then 4090 maybe, who knows.
@@techart5489 21 fps on the 4090, 28 fps on the 5090, without DLSS and the witchcraft
@@AngelGallardo-p5l yeah, this is the performance with path tracing on
without path tracing it's around 70/80 fps
It's around a 30% improvement over the 4090
"Where we landing boys" to yourself is crazy
can you play 8k with 60 fps?
@@iwantyou9944 no )
@iwantyou9944 def not with RT on
Depends on the game you two clowns
@RobertZ1973 Ironic, you calling us clown, looking at your comment history
@@CG-601 So triggered you turned to stalking, I love it.. always proves I struck a nerve.
This look cool on my *1080p monitor* and *480p phone screen* at *30fps*.
Throttle the push for graphics, let the studios catch up and figure out optimization techniques with what they currently have.
0:55 rtx 5090: am I a joke to you?
Think of DLSS 4 like this: an artist draws a picture by hand... he no longer has to draw multiple pictures by hand (wasteful). Instead he can now focus on the next shot. The AI (tensor cores) takes a "pic" of what the artist drew and handles generating multiple predictive pictures in between, until the artist draws (by hand) the next shot.
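A toy sketch of that analogy, just to make it concrete. This is not how DLSS 4 actually works internally (the real thing uses motion vectors and neural networks, not naive blending); it only illustrates "render two frames, generate the in-betweens":

```python
import numpy as np

def generate_in_betweens(frame_a, frame_b, n):
    """Naively blend n 'predicted' frames between two fully rendered frames."""
    return [(1 - t) * frame_a + t * frame_b
            for t in np.linspace(0.0, 1.0, n + 2)[1:-1]]   # interior points only

rendered_a = np.zeros((4, 4))      # stand-in for a real rendered frame
rendered_b = np.ones((4, 4))       # stand-in for the next rendered frame
generated = generate_in_betweens(rendered_a, rendered_b, n=3)  # 3 generated = "4x" output
print(f"{len(generated)} generated frames inserted between 2 rendered ones")
```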
But can it run Crysis :)
the AI boss stuff is cool. I want more of that. intelligent NPCs in games will be the next true revolution.
NPC behavior improvements being marketed as a hardware feature is peak corporate spin.. you'll do well not to be a sucker and fall for it. NPC AI doesn’t need RTX-exclusive tensor cores.. it’s just smarter code that could run even on older CPUs and GPUs if developers cared enough to implement it. As for DLSS 4, it’s a pure software feature; there’s no technical reason it couldn’t work on the 2000 series if NVIDIA wanted it to. The truth is, they lock features behind hardware generations to force upgrades, not because it’s impossible to backport. Let’s not pretend game-changing AI is exclusive to overpriced GPUs.. it’s a cash grab, plain and simple.
@@hvanmegen you are correct
@@hvanmegen STOP LYING! This is the biggest lie I've seen. Nvidia never locked anything purposefully. Their 2x frame gen could only run on the 4000 series because of the optical flow hardware in the 4000 series, so it's impossible to run it on older cards. 3x and 4x FG can only run on the 5000 series because of flip metering and massively boosted tensor performance. You don't know anything about this stuff. Why else would Nvidia release the massively upgraded DLSS, DLAA, ray reconstruction, and Reflex 2 with frame warp, and in future neural rendering features, to the 2000, 3000 and 4000 series too, plus the massively upgraded 2x FG for the 4000 series? The answer is simple: if a feature can physically run on older cards, it will be made available for older cards; if it can't, because it needs a new or upgraded hardware block that only the new series has, then it can physically run only on the new cards. Nvidia's FG technology is massively complex and it's a miracle it runs the way it does, and no, it's not only a software feature :DDD Go ask the competent engineers working hard to develop these features and they'll show you how it works and why these things are not only "software" :)
@@hvanmegen true. They've learnt a lot from the GTX 1080 Ti
21:04 Seeing this after that Valve twitter post my hopes for Half Life 3 got rekindled.
Praise be to Almighty Allah. He created the Earth with such detail that we still don't have the power or the means to reach even a speck of it. This is only from Allah's glory.
The AI being implemented in making games and better NPCs is a game changer, but another card with more graphics power that most of us can't afford doesn't really interest me. If you want to do something game-changing with these cards, make the features retroactive for the old cards, like the 2000 series with DLSS 4. That's where we really need it.
The problem is that 2000 series cards can't handle RT and the newer DLSS features very well, because they don't have the newer neural cores that the new cards have and keep adding more of, with the sole job of improving these techniques
Yep, nowadays we invest in AI rather than simply a video card. I strongly believe these cards won't be much of an upgrade, maybe only the 5090 which will cost around 3k euros.
I totally agree! I have a program that requires 80GB of VRAM to run properly. Looks like I’ll have to keep rendering on GPU farms for now, using something like Nvidia's H100 or better. Either that, or wait for the RTX 9090 to release before I can do anything from home. See you all in 2035! 😄
(C'mon Nvidia, how about dropping something massive next time, a real VRAM powerhouse would be amazing! Still loving the 5090, though. It's a beast, but a little more VRAM would go a long way!)
@@criteria9886 oh sure! I too have an H100 at home! lol, those are like $200,000 and aren't even meant for running games
@@criteria9886 why not run multiple GPUs?
I am NOT going to change my entire computer case for ts 😭
Facts
that is, the rendered image is already perfect on its own, but to pursue ray tracing we have to invent DLSS 4 or 5 or 56, as if we had to use foundry ovens to cook a potato
still not impressed by that ray tracing. I still think it isn't worth the performance drop; it's such a little quality improvement for such a high requirement. And yes, I'm using a 4080 on a 1440p monitor, so it's more than enough to support RT, but I still prefer a higher framerate over a "mega" improvement in graphics quality. If they're gonna make DLSS 4 a 50 series-only feature, like they did with frame generation only being available on 40 cards, I'm never gonna buy an Nvidia card again
They already had a slide at CES 2025 showing that DLSS4 Frame Gen and likely also the transformer model is in fact 50 series only.
I've been VERY happy with my AMD 7900 XT, optimal rasterization for the buck. Perfect for VR! Eff NV.
@@Psychx_ I think DLSS 4 is a 40 series feature as well, but multi frame gen is only 50 series. I'm not sure
@@ishaankumar4587 finally someone who watched the video
It literally looks as good as if it came straight out of high quality CGI and you are not impressed? Well enjoy your 2010 rasterized AMD graphics then
Can you compare the differences between 5090 and 5090D?
D is the cheaper, China exclusive alternative
overpriced games and now you can't even play them on lower end hardware...(frame-gen is only on top of the line gpus)
Cloud gaming + Fast Internet 😎
@@jaylenjames364 The resolution on cloud gaming is absolutely atrocious, its like playing in low 720p
@@jaylenjames364 Cloud gaming was killed in the egg. The industry pretty much gave up on it
dlss technology is designed to get the best results on budget graphics cards for this.
🎉 crazy! I remember getting a monster 3d II graphics card for Christmas as a child to play the first or second generation of 3d games on my first pc with bluescreen windows 98 😬
emm we just want the joy of gaming, not the joy of rendering
It was exist more, 5100 for military use, but the past, 270 mm^2, 96 billion of transistor.
@8:15 a game where i can control the bush size.. NICE!
Any leaks as to the Low Profile 75W versions of these cards? My A2000 12GB runs well but I could do with a jump to 5000 series.
Cant wait to buy this to play OSRS
can you please tell me the name of the game at the start of this video?
@@chandrakant731 the first game was black state
@@Zam___ maybe infero is possible
🔥 Absolutely Stunning
0:57 I don't see anything change here
That’s the point, dlss is that good
Because it's all bs
Something changed. 2000 bucks are missing from your wallet now.
@@Eddiea2024 It’s not, it’s not meant to improve visuals. It’s meant to showcase how dlss has no loss in visuals, which it obviously did well considering how none of you guys are getting it
serious question, can i live with one kidney?
Yes but you are more likely to have many complications.
1 kidney, 1 eye, 1 ball...pairs are overated
A few thousand USD is not that much, not to mention that a 5070 or 5080 will be more than enough for most people.
@@Dr.White_PHD hahahahaha thank you
When will they all realize that no one really cares about ultra-realistic 8K graphics? Only 1% of gamers even play at 4K as of now. Graphics mean nothing if the gameplay sucks.
Now every year they are gonna improve AI by 20-30% and ship it with new video cards for thousands of dollars. Kinda genius ngl 😂
Ngreedia at its best again and again and again...........
~30% performance increase for lower prices is greedy? Compared to AMD's 30% performance and vram decrease lol
@@Dempig 30% is nothing you clown.
550$ 5070? improving graphics and performance for everyone via a software update with dlss 4? greedy? what are you smoking
@@tysopiccaso8711 By the time the card comes to Europe it's not 550 over here, chump. Wtf are you smoking? That 5070 is going to be more than 800 euros, chump. It's called inflation.......
@@Dempig Let's first see when the card comes out and how it performs. I'll wait for the Gamers Nexus review; I don't trust Ngreedia anymore..
I am not going to lie, AMD has no response for any tech presented here. They are still trying to implement Anti-lag 2 for the love of god...
@@Chibicat2024 Usually AMD's response is to ask for a reasonable price instead of all your savings. I do agree though.
@JenovaGirzz although I agree with you as a whole, it's been a while since AMD gave us an actually competitive product. The 6000 series was an actual return to form. The 7000 series could be considered a reenactment of the 6000, not a significant progression. Pricing was pretty bad at the beginning of the 7000 series, although Nvidia made it look like a good deal with that BS of the 4080 for 1200.
@@JenovaGirzz Reasonable price? The 7900 XT launched at $900 lol. AMD prices are only reasonable 2 years later, when they are trying to dump all the old stock nobody bought before they release new cards
@@Dempig BRO IS SPEAKING FACTS
Buying a card for 400 thousand rubles just to look at statues
Or you could just buy a room in the Hermitage 🤣
I can't shake the thought that everything DLSS after 2.5 is something resembling a pyramid scheme)
@@ZhelezniyDrovosek that's how it's going to be, and in fact the cards themselves are nothing special
It costs so much because of market demand driven by the popularity of AI. In other words, the 5090 is a card for rich people and for companies developing AI. I think the 5070 will cost a bit more than its predecessor from the 40 series, and it will be quite enough
Next-gen consoles need to start using Nvidia GPUs. AMD will never beat DLSS, DLSS is too good. Tired of all these console games trying to upscale games with bad visuals just to improve the FPS.
Sony's proprietary upscaling solution is damn near DLSS quality and far better than XeSS and FSR
@@JenovaGirzz Nah, this is nonsense. Sony is FAR away from DLSS
are the leaves actually part of the game or just animation?
9:12 Can't wait for the new level 4 backpack 😂
2025. I still use the Nvidia GTX 670 video card. And I am happy with everything. And of the games, I only play Genshin. And despite my old video card, I see amazing graphics in the game.
Holding my 4090 here until 6000 series
what about pure performance? show honest 4k without dlss, for example, in Cyberpunk 2077 with path tracing.
What is the neural material about, have we had any explanation?
Will DLSS 4 only be available for the 5000 series, or will the 4000 series also be eligible for DLSS 4?
13:05
I mean, DLSS 4 really seems to be helping the framerate quite a lot. Quite excited for it, but scared about the price.
Hate the AI NPCs like in PUBG though. The fun comes from playing with other players; you don't need a zomboid NPC to do your stuff, just like in Sons of the Forest. ZOI looks interesting enough, and I'd like AI NPCs to be implemented in Witcher 4 and GTA VI to give them more animations and things to do, and hopefully one day you can voice-talk to them and have them answer (pretty much like Mantella for Skyrim and Fallout 4, but with actual real-time, GPU-generated answers).
Given that DLSS can substantially enhance gaming experiences, I'm puzzled as to why, when the hardware improvements are relatively minor, manufacturers aren't focusing on providing us with a more advanced DLSS version.
Wait...Did I just see a new Virtua Fighter! About time!
And the drivers for Linux ?
From ZX81 to this in 40 years. What will it be like 40 years from now? 👍
As an artist I find texture compression worth investing in. This could shrink our video games by about 7x: instead of a 200GB Call of Duty we would get a 28GB one! But when it comes to AI face swaps, I think they're uncanny, horrible Photoshop-style overlays that will get old quickly.
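The size claim is simple arithmetic, assuming the 7x ratio applied to the whole install (in practice only texture data would shrink that much; audio, meshes and code wouldn't):

```python
install_size_gb = 200      # the "200GB Call of Duty" example above
compression_ratio = 7      # the claimed ~7x texture compression
print(f"{install_size_gb} GB / {compression_ratio} ≈ {install_size_gb / compression_ratio:.1f} GB")
# -> 200 GB / 7 ≈ 28.6 GB, matching the ~28GB figure
```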
6:28 if that's the actual game with unpredictable playstyle then it's a win.
I have the feeling DLSS is going to be a MUST HAVE. Probably in a few years you won't even be able to turn it off. Such a shame.
please don't forget to add reflex 2 to iracing
Is it me or is it still raytracing they are trying to sell since the 3060 cards?
I will buy 5090 to play these games when 10090 come out.
Now we can watch a little character walking forward, hitting something with some thing against some other thing that moves, and picking up little power-ups at an extremely high frame rate...
We welcome the new technology so we can buy the old ones
It's really going amazing 🎉
Me playing these games while providing central winter heating for 2 households from my PC.
These GPUs are great, but hopefully we start seeing the $250 AI boards become a thing for games and creators. Having the option to have NPCs that can learn and be fairly autonomous will be great for gaming. Could be possible to run custom AI profiles as mods, too.
Imagine AI boss in Sekiro with unpredictable moves 🤣🤣
Someone knows what optimization is ?
Someone know how to adjust their settings?
I ain't spending over $500 for a GPU. My RX 7800 XT works great and will continue to work great for a few years.