Fake rasterization too? 😂😂😂 Bro, first off, chill. Upscaling is so good I can't distinguish it from native at times, even 4K Performance looks great, and it will look even better with the Transformer model. G-Sync just removes screen tearing without a hit to input lag, and you've got Reflex 2, which I think many are sleeping on. If all they did was improve raster performance they would be eaten by the competition. FrameGen seems like BS but I need to try it first and get a feel for it, and this is only the first gen of MFG, it's gonna get better. What's great is that if you have a big hard-on for raster performance you can just opt out of all of these. The laws of physics can't be bent out of shape, they have refined their GPU design so much over the years; the problem is Unreal 5 and all the horrible optimization.
Stop being so butthurt. Just FYI, all games are fake. Nothing is truly real, just an approximation of reality. So if you don't like it then just don't play, and go play in your corner.
The bigger news should be how DeepSeek just proved how much of a fad AI really is. Nvidia has been offering nothing in terms of AI and artificially charging the market insane amounts of money for it. It's high time they get taken down.
I scoped out the new Transformer setting in CP the other day. Hands down it looks way better than the original method. But I still wouldn't run lower than Quality unless I absolutely had to, as there still seems to be a pretty big drop-off in fidelity in this particular game. RT in CP2077 is still a complete disaster on my 3090 though. As it is, I've already been using DLSS Balanced in Darktide and it's not bad. Quality looks a little better, but the performance benefit of Balanced far outweighs the small penalty in fidelity. Is there more info on which games will be getting Transformer and when, or is it a driver-level update that can work on any DLSS game currently?
Right now you can use third-party software like DLSS Swapper and NV Profile Inspector to use it in pretty much any DLSS 2.0 game or newer. By Jan 30th, NVIDIA will release a driver update that will enable an option to force DLSS Transformer on compatible games via their own NVIDIA App. It's unclear right now how many games will be supported that way, but at least 75 if we are to believe NVIDIA's claims. And it's likely going to replace the third-party methods eventually.
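For anyone curious what the third-party route actually does: tools like DLSS Swapper essentially just replace the DLSS DLL the game shipped with. Here is a minimal Python sketch of that idea, with placeholder paths (the game install folder and download location are assumptions, and forcing the transformer preset itself is a separate per-game override done through NV Profile Inspector or the NVIDIA App, not shown here):

```python
# Minimal sketch of a manual DLSS DLL swap (what DLSS Swapper automates).
# Paths are placeholders -- point them at your own game install and DLL.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you downloaded

target = game_dir / "nvngx_dlss.dll"             # some games keep it in a subfolder
if target.exists():
    shutil.copy2(target, target.with_suffix(".bak"))  # back up the original first
shutil.copy2(new_dll, target)
print("Swapped DLSS DLL; restore the .bak copy if the game misbehaves.")
```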
There was no upgrade to the card. They just put some effort into making the feature work on older cards, which they could have and should have done before this.
More like: "Hey devs, now all gamers have that feature. Use it in your games." "OK boss, what about those whose GPUs can't handle it or don't support it?" "Bad luck for them." Gamers: "Now I must buy a new GPU because my old one with that feature looks and feels like shit. 300 FPS shit, I must admit."
DLSS finally does what Nvidia promised it would 6 years ago! But in fact still not completely: going too low in upscaling levels still causes issues in most games, even on the most common engines like Unreal, where a select few effects ignore DLSS and are always based on the real resolution, and too bad for UE5 that this concerns the entire lighting system, ghosting all over the place! Oopsie! For real, I'm also impressed by this new version, but it's STILL not real resolution. Also, Epic are in trouble, DLSS 4 is so sharp that their engine's flaws are really obvious to see now.
Agree. Good video review. Most will stick with their cards. DLSS4 will help millions of people with older cards. However, some of us could not get our hands on a 4080/4090 when first released. A lot of AMD driver issues for me. Good chance I will get a 5080 or 5090. I can wait 3-6 months. Patience is the key to see how this all works out.
It's concerning how many people run at 4K on something like a 4090 and refuse to touch DLSS. I'm not sure if it's an ego thing or what, but in the majority of games it's impossible to tell unless you're really looking hard. It'll look better on DLSS Quality in a lot of games due to how harsh TAA blurring is nowadays.
@@Thomas_Angelo then you need a visit to an optometrist. It comes in blocks with clear edges. It's awful.. if you can't tell the difference, that's on you, not me.
It's been stated that non-RTX 4000/5000 series GPUs will take a performance hit when using the new DLSS 4 transformer model; supposedly they lack the hardware to run it as fast. DLSS also works better at 4K resolution because it's not upscaling from 360p like 1080p output does, or 720p like 1440p does in Performance mode; I believe DLSS Performance mode at 4K upscales from 1080p. I could be wrong about these resolutions, but I believe that's roughly correct. Either way the transformer model is great and I really like DLSS. It's basically free performance and it lets me keep my RTX 4080 longer to get better value from it. I don't understand why so many people hate on DLSS; frame gen I understand, but upscaling has been used for a very long time in televisions and consoles. It's in way more things than people even realize. Imagine if Nvidia, Intel and AMD did not give us upscaling and still charged $1000 for a graphics card with the crappy performance they have. Some would argue games would run better because they would actually have to optimize them, which is probably true, but game optimization was lacking way before we got DLSS/FSR upscaling and frame gen. Greed is the component that is destroying the gaming industry and really the whole world.
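For reference, here is a small Python sketch of the commonly cited per-axis DLSS scale factors (these are the widely documented values, not something from the video, so treat the exact percentages as approximate):

```python
# Commonly cited per-axis DLSS render-scale factors; exact values may vary by game.
SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def render_res(out_w, out_h):
    """Internal render resolution for each DLSS mode at a given output resolution."""
    return {mode: (round(out_w * s), round(out_h * s)) for mode, s in SCALES.items()}

for out in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(out, render_res(*out))
# 4K Performance renders from ~1920x1080, while 1080p Performance is only ~960x540
# and 1080p Ultra Performance drops to roughly 640x360 -- which is why upscaling
# has much more to work with at 4K than at lower output resolutions.
```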
Yeah, the narrative that "games should be better optimized" always strikes me as wrong. Like, realistically, are they going to be? We had shitty ports way before we had any of these techs. Were these people not around during the Xbox 360/PS3 era? PC games ran like crap most of the time then. Frame-time consistency was nowhere to be found. I don't think it's the fault of DLSS/FSR that new engines are very reliant on resolution scaling, for instance. It was already heavily used even on the last-gen consoles. Regarding DLSS upscaling, I myself really disliked using it. It had pretty obvious artifacting, especially on Cyberpunk with Ray Reconstruction. Lines weren't quite straight, the image was overall blurry and ghosting was very apparent. STALKER 2 had similar issues, but mostly ghosting and blurriness. I really reject the narrative that DLSS Quality looked "native" on the previous model. Frame gen is a different beast entirely. It does have a bit of artifacting, it blurs the image a bit in motion and input latency is noticeably higher, especially on mouse input, but it was way more usable than DLSS upscaling IMO. It depends heavily on your performance before enabling it, though; it works well at over 80-90 FPS base. What sucks most about it is that NVIDIA sold it as "performance" back when the 40 series launched, which could not be further from the truth, and NVIDIA is selling it as performance once again with the 50 series MFG. The transformer model is a game-changer all around: DLSS upscaling is very much usable, ray reconstruction improved massively and FG was already good enough.
Pricing is based purely on demand, shortages and availability, never on true manufacturing cost. We tend to think about what value we get out of it, but the true shocker is the margins. Nvidia's gross margin to date in 2025 is around 75%, which means only about 25% of revenue goes to the cost of actually making and shipping the product; things like offices, staff and R&D are operating expenses that come out of that 75% gross profit (before taxes). Now imagine the GPU you bought for £2000 cost roughly £500 in components, labour etc. to make. So really that huge margin is funding their future research and then some. Bonus money. Look up Nvidia's profit margins on their H100 GPUs.
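Spelled out with the comment's own round numbers (a back-of-the-envelope sketch, not Nvidia's actual cost breakdown):

```python
# Rough gross-margin arithmetic using the round numbers from the comment above.
# Gross margin only nets out the cost of building and shipping the product;
# operating expenses (staff, offices, R&D) are paid out of the gross profit.
price = 2000.0           # what the buyer pays, in GBP
gross_margin = 0.75      # ~75% gross margin
cogs = price * (1 - gross_margin)     # ~500 to manufacture and ship the card
gross_profit = price * gross_margin   # ~1500 left before operating costs and tax
print(f"cost of goods ~£{cogs:.0f}, gross profit ~£{gross_profit:.0f}")
```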
Nvidia always has a reason when they "give" something away for "free". The AI compute market is still red-hot, and the GPUs used there make consumer graphics cards look dirt cheap ($10k+ cards, and companies like Meta and Microsoft place billion-dollar orders). Gamers just aren't that important to Nvidia's current priorities/bottom line (we'll always get lip service). This just helps Nvidia keep graphics card market share.
There is no update and there is no possible way Nvidia drivers can insert settings into games. All games have to release an update to accommodate the GPU settings
You've been able to use third-party software to swap DLSS versions ever since 2.0, and NVIDIA already showed the in-app DLSS version swap in their presentation. It's coming soon.
So gamers should stick to older-generation cards, since software updates allow for performance improvements, and leave the newer (and far more expensive) cards for AI modellers.
This is why none of you guys are computer experts. Did you ever consider the fact that when you get more frames the quality goes down, and the quality goes up when you have fewer frames? 😅 SMH 🙄
You're showing scenes that are easy for this tech to do well at; in fast-moving or busy scenes with plant life etc. it just falls apart. You don't get something for nothing in this world, and artifacts are what you get.
I'm hesitant to upgrade to the new drivers. I don't trust Nvidia, and I wouldn't be surprised if at some point they gimp their older GPU models to try to nudge people into spending money on their 5000 series...
Imagine someone saying the words Nvidia and free in the same sentence? Now that's the funniest thing I've ever heard! The more you buy the more you save!! If you buy 3 5090s you save so much it's like getting 2 free cards!!
I don't understand why you didn't switch to DLSS 4 Quality or Balanced in Cyberpunk to see what the performance hit would have been. You only showed Performance, which significantly lowers the resolution and of course gives great performance because of it.
Now my fingers are crossed that AMD releases an update that makes me be able to play Fortnite and other games without random freezes on my 9900X, FPS drops etc... Literally can be at 300 FPS and then go down to literally 1, freeze everything and I have no idea why. At least my temps are no longer insanely high...
All we need is a quality frame-gen model for the older cards, something the open source community should start working on. The problem is patching it into the game drivers, which Nvidia would never allow. Software upgrades are nice, but when buying a newer generation you shouldn't need them.
@AhmedSaeed-wx3pd Yep, but it needs more work; if it had the same driver-level integration as Nvidia's solution it would be better. But in a pinch it definitely can help. It's funny how Nvidia claims it's impossible to do frame gen on older hardware, but there is now an app, and FSR 3, that can do it.
@@justindressler5992 "frame gen on older hardware but there is now an app and FSR 3 that can do it." That is because FSR 3 isn't hardware AI, it is software AI, which is also why FSR in general is just worse than DLSS. FSR 4 is going the hardware AI route, which is why only the 9000 series is getting it.
It took literally 1 second after you switched to DLSS Performance to see smearing/noise on the ground..... It's really strange when people say there is no difference between native and Performance DLSS. Anything under Quality is kinda bad, even on the new updated DLSS 4. It's only my opinion tho.
I hate how discourse around image quality has been tainted by content creators in the tech/gaming space. DLSS 4 transformer is very much an improvement on DLSS 3. In ray reconstruction scenes it honestly is better than the previous CNN model by a lot. I'd go as far as to say that DLSS T Balanced in CP2077 (RR + path tracing) looks better and has fewer artifacts than DLAA CNN. But people often make such unrealistic statements like "better than native". Well, no shit. If the TAA is already destroying image quality, and DLSS destroys it a little less, then it's better than TAA. That's not a compliment. Also, people really understate how distracting artifacting can be.
@@patsk8872 the CNN was already some special sauce compared to FSR3. The TF model just bitch slapped whatever FSR4 thought it was gonna be lol absolute game changer. Even ultra performance is reasonable and looks better than CNN performance. Thank you Nvidia for continuing to push the envelope of quality. Except for 50 series.. what a letdown. That architecture was completely built for and paid for by AI. Gamers got nothing because it is an AI GPU. It's bad.
Yeah, but Nvidia might have just screwed their recent gen if this is true; imho there would be no reason to move to the 50 series even with multi frame gen, and the native performance uplift isn't that good. It must be the reason the 5070 is priced pretty low compared to previous gens, to make it more enticing to switch, and the 5090 makes even less sense. It would be nice if we get a full GPU re-evaluation from one of the bigger reviewers though, cause I'm interested in seeing those results.
I live in Canada, B.C. Buying an RTX 4090 from Amazon is literally $6K! How they are allowed to price it so high is beyond me. That honestly should be illegal.
One more thing, before everyone starts to say "oh, the 5090 isn't double the quality": we are at a point where we can't double anymore, we've reached our physical limit. I mean, I think there are other ways, but that's neither here nor there. So even 10% better is going to be ridiculous. People need to start using their brains 😊
I'll never understand why people want the absolute, non plus ultra, best graphics etc. Playing on a 1080p monitor is fine and you get good graphics plus performance, but 4K on your television?! Seriously?! For that you do need the latest hardware, BUT only if you want the best!!! People can spend their money however they want, but use your brain, don't just mindlessly spend thousands of bucks on a gaming PC! You can get a nice gaming PC for around $1,200!!! More than enough! It's not gonna be 4K with high FPS. Who cares 😑
I play on a 2060 Super. In Cyberpunk, with DLSS on for max non-RT settings at 1080p, I use DLSS Quality and that looks great; I cap my fps at 82 and it works amazingly. The transformer model usually only gets me 62 fps. It's much better on the newer cards, but on the older ones it doesn't look as good. Also, the grass in the badlands flickers like crazy on the transformer model, not so much on the CNN.
Only reason I’m getting a 5070ti is because I’ve saved for about 3 years whilst raising kids. Hope this is a product refresh practice Nvidia continue, because I won’t be able to upgrade again for a lifetime cos I’ve got grandkids now 😂
AMD can probably train against Nvidia model+ their own stuff. ML is tricky from the competitive standpoint of building a moat. What happened to OpenAI recently is a good example : 🍿
Just some notes here:
- Reports say the new driver coming soon should alleviate some of the performance loss when using the new TF model along with ray reconstruction.
- I haven't used OBS a ton for my content, so I didn't know that in the advanced video settings, color range set to "full" rather than "limited" produces a darkened image.
- For those wondering why my RTX 3090 appeared to be running much slower than it should:
Using OBS, even with NVENC, in tandem with RTX Broadcast actually incurs a BIG hit. So during my RTX 3090 demonstration I was getting like 30 FPS less.
I tested without any recording and with RTX Broadcast completely killed off in the process list, and I was able to attain around 95 FPS average with DLSS Balanced on the transformer model, with PATH TRACING in conjunction with the FSR frame-gen mod. That's like a 40% increase. Which is why I'm glad that for my test bench rig I'm using a capture card.
- Yes, yes, I know I put the labels at the bottom. I typically don't use this format style for my videos, so I'm still just familiarizing myself with it.
I think you're being a little charitable with your conclusion. If they had closed off DLSS 4 to the 5000 series, there might have been 4 games to support it... not only that, but they would have had to put two versions of DLSS and FG in game... I think if they could have closed it off to bring more value to the 5000 series they would have, but no one would have supported it.... vs them trying to be pro-consumer.
@@DannyzReviews was just thinking that the last driver update was 2 months ago.
I get more than a 40% uplift on my 3090, maybe this is your problem.
This is what DLSS should be used for. Older/weaker cards that struggle to run newer games. Not as a crutch for new cards that are lagging in generational uplift and bloated unoptimized games.
This is the only use I would ever have for it.
It's what happens because SLI support waned. Plus the price of GPUs.
Why in the world would Nvidia invest R&D money into something that helps consumers keep their old cards longer?
@MrValdemar4ik It wasn't necessarily for older cards. That's why the further back you go in generations, the less they get. It's for newer cards. The cost of newer cards is covering the cost of DLSS development. It also keeps you buying new-old-stock cards and keeps you team green. And it boosts lower-tier cards like the 4060, probably to help them beat the newer B580.
Who decided what it should or shouldn't be used for? Surely not a random guy on YouTube.
Man, I remember spending 500 to 700 on GPUs and being fine with that. That got me 80-class or 80 Ti-class cards; heck, my 3080 cost me £650 on release and I was fine paying that... the 4080 going to £1200 (IIRC it was 1150) was insane, the Super being £950 was still stupid. Now a 5080 that is barely faster than the 4080/Super is £980... I know inflation exists, I know it's getting more expensive to produce things. The issue is we are not getting paid more and everything is going up by stupid amounts. Yes, I can afford a 5080, but do I really want to drop that much??? Especially seeing as so many games are trash now anyway. Guess my 3080 is staying longer thanks to this DLSS update. Hope it's true the driver will help alleviate some of the FPS hit.
Prices went up just because by 2020 people started to pay whatever it took so they could have GPUs. Brands reacted with a huge rise in price. Speculation was the culprit, and people paying for it was the cherry on top.
If you keep buying at a high price, it'll stay. Market laws dictate the final price. Intel with the B580 showed you can have decent performance for not that much; AMD with the 7900 GRE showed great performance/price. It all depends on how buyers react. As the 3090/4090 sold amazingly well, Nvidia calculates people won't mind paying a kidney for a new GPU.
From my perspective, I'll keep my Legion Go and eGPU with a 3060 Ti for some time, and along the way buy a second-hand 4070 Ti for a decent price.
I'm still so happy with my 3080 Ti, won't change it for a 40 or a 50 series card. Gaming is for fun; I still think people are too focused on fps instead of fun. I hit 200 frames at 1440p in a lot of games; if it was 100 I couldn't care less. People look too much at pro players or influencers. It doesn't make me a better gamer if I put a 5090 in my PC. I'm 48, started playing on a C64, an Amiga and all the consoles and PCs, always with the best GPU you can imagine, but if I played on a friend's PC I still had the same fun. Optimized, good frametimes feel the same as high fps to me.
Yep, they got crazy high because people continue to pay it.
Do you remember the 2080ti being 1200 on release?? I’d say we’ve come a long way since then. Considering how weak it is compared to the 4080/5080.
exactly what I said on another channel (no likes/no replies)
$2,000 GPU + $1,000-$2,000 annual electricity bill for an extra fps you can't notice, and maybe some time saved rendering 4K etc.... at what cost? ($3,500-$4,000 in 2025, holy shht! GREED)
I will never buy Nvidia again (I'm on an RX 470 = I can't afford it anyway = I'll buy a car instead).
People that haven’t seen the new transformer model really need to wait a few days.
Trust me, I’ve seen it with my own 2 eyes.
It’s UNREAL
I have it. You're capping.
@ Fam, literally, Digital Foundry did a full investigation and it was nearly perfect in a large number of scenes.
I saw one guy download some custom preview versions of it for Cyberpunk and, I think, the Indiana Jones game. He was running Balanced DLSS and it looked like the Quality preset. It looked veeeerrrry good. Upscaling is looking like a viable option for squeezing out a lil more performance, which also eases up a lot of potential latency if you're trying to use FG.
@@happyteenpost5165 found the amd liar user
in 4K it is literal black magic.
DLSS was already excellent for me, even at 1080p (probably due to blurry forced TAA in many games).
But free improvements are welcome for sure, my 2070 super will stay strong for much longer.
It's true that TAA is blurry in many games. The easy fix is using ReShade, which comes with numerous post-processing filters while having very little impact on fps. The standard installation comes with 2 sharpening filters, AMD FidelityFX and LumaSharpen. Both work very well and have a finely adjustable slider to get the exact sharpness the user prefers.
I also like the tiny unobtrusive fps meter displayed in the upper right corner. In games with only TAA this is a must.
@@janchiskitchen2720 Sharpening filters can't fix ghosting though. Many TAA implementations struggle with it.
Plus, sharpening is not an ideal solution. It can't restore lost detail the way DLSS can, unfortunately.
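To make the distinction concrete, here is a toy unsharp-mask sharpen in Python (NumPy only, a minimal sketch rather than what ReShade actually ships): it boosts local contrast around edges, which is why sharpened TAA output looks crisper, but it has no extra information to work with, so detail lost to a low internal resolution stays lost, whereas a temporal upscaler reconstructs it from previous frames.

```python
import numpy as np

def box_blur(img, k=3):
    """Naive box blur used as the 'unsharp' base."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=0.65):
    """Add back the difference between the image and its blur: edges pop, no new detail."""
    blurred = box_blur(img.astype(float))
    return np.clip(img + amount * (img - blurred), 0, 255).astype(np.uint8)

frame = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in for a game frame
sharpened = unsharp_mask(frame)
```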
@ While all that is true, for me at 1440p the sharpened image is still better, especially running at 120+ fps when the frames are not too far apart.
@ I messed with all of the TAA/FXAA etc. settings in The Witcher 3 the other day and everything I did resulted in a smeary, blurry image. Running native without it is an aliasing nightmare.. all the grass looks horrible. Running DLSS Quality makes the image look better than native or AA in my opinion. It is crazy that the only thing we have that actually works is Nvidia's, lol. We are so reliant on Nvidia for quality.. it's very unfortunate.
@@christophermullins7163 you got the lazy devs and greedy shareholders to thank for that lol. Games nowadays are almost always developed with DLSS in mind.
The issue is that we have 4 generations of raytracing-capable hardware, and most of these cards cannot be used to play a five-year-old game with raytracing. The 4060 is third generation, it even says "essential" and "raytracing" on the box, yet it's unusable for that purpose. Why even bother? You need a 4070 Super / 4070 Ti level card to start having an acceptable experience. So, MAYBE, the fourth generation, the 5070, will be able to play a five-year-old game with raytracing. Maybe.
I've tested 14 games when using dlss swapper and the performance gain and image quality with the Transformer model is just insane. I'm gonna keep my 4080 and not do any upgrade cause for now this software update is doing it for me 🎉
My 3080 is now a 4K card again. The Performance and even the Ultra Performance modes are just so much better than on the CNN model.
are you saying my 3070 is going to evolve ? 0_o
My 2080 has evolved
Such a good side by side comparison. I agree, Nvidia did well by the people on this one. Seems like the inability to get on a new node made them prioritize software and AI upgrades, which in turn has panned out well for everyone.
Yep, I just hope game devs don't look at this and say "great, skip optimization and force DLSS+FG enabled by default and call it a day". Then we'll basically be back to square one.
@DannyzReviews Give them an inch, they take a mile type thing. Yeah, that would be a shame. I guess since AMD is falling behind on the software end, devs will still need to optimize to some degree to make things work well with APUs and the few Radeon GPUs that are out there.
@@DannyzReviews You know full well they will. Most (but not all thankfully) devs these days have really gotten into a quantity over quality approach, pumping out as much near unplayable unoptimized garbage as they can, instead of actually taking the time to put out a quality product. And people will buy them because game journos (who I'm convinced are paid off by the game dev studios) will write a review of a game, praising it like it's the 2nd coming, and then, of course, their favorite streamers will also play it, act like they love it (doesn't help most of those streamers are also sponsored AND have rigs with 4080's, bare minimum in them) and act like you're "not cool" if you don't play said unoptimized crapfest.
It's a cycle of sheeple being sheeple, and unless consumers vote with their wallets, nothing is going to change.
@@DannyzReviews They already do that ...
I think it's important to stick to stocks that are immune to economic policies. AI stocks that have the potential to power and transform future technologies. It seems AI is the trajectory most companies are taking, including even established FAANG companies. Maybe there are other recommendations?
I bought into NVIDIA around September last year because my financial advisor recommended it to me. She said the company is selling shovels in a gold rush. It accounted for almost 80% of my market return this year.
@@JoeWilmoth-k2w That's a great analogy and I love the insight. Professionals could make a really big difference in investing, and I think everyone should have one. There are aspects of market trends that are difficult for the untrained eyes to see.
@@TerrencesSheldons That's a great tip. I'm setting out 50k to invest in the market this year. Any particularly useful tips you could offer to me?
@@LucaMurgia-j7b There are many independent advisors to choose from. But I work with MARGARET MOLLI ALVEY and we've been working together for almost four years and she's fantastic. You could pursue her if she meets your requirements. I agree with her.
@@TerrencesSheldons Thank you for this Pointer. It was easy to find your handler, She seems very proficient and flexible.
This makes me reassured that my combo for the foreseeable future will be a PC and a Nintendo Switch 2. These upgraded upscaling technologies from Nvidia are really awesome.
I’ve been trying the performance mode in Cyberpunk on my 3090. It’s seriously impressive how good it is. I used DLSS swapper and Stalker 2 looks so much better as well, even on performance mode. Over 100 fps in that game. I’m thinking I might skip 5000 series. At least the 5090. The 5000 series has 2000 series written all over it imo.
So performance mode with the new transformer mode looks better than quality looked in the old model on stalker 2? What res you playing at?
All we need now is 2x frame generation coming to at least the 30 series.
Lossless Scaling, it works on any GPU and the new version is improved, even if it's not as good as Nvidia's.
@thechurchofsupersampling I don't know about the new version but I wasn't impressed with the old version's image quality or latency, and I don't see how it can be good without integration into the game to get information like motion vectors.
@@quantumdot7393 It's not. No offense to anyone with older hardware; I see this all the time in comments: "You don't need any new GPUs, just use Lossless for framegen". The simple truth is that these people have never sat with DLSS frame gen for hours and then swapped to Lossless. There is no comparison. Lossless technically works and is maybe not terrible, but DLSS 3 frame gen is great, as good as frame gen can be. Again.. with all respect.. the "Lossless is DLSS" idea is straight-up coping with having older hardware and ignorance of the quality of good FG.
@@quantumdot7393 Tbh, I like how Lossless upscales and sharpens now more than actual DLSS or NIS. I'm not into competitive play and only use a 60 Hz monitor, so it may not fit everyone.
@@thechurchofsupersampling I agree, it's $7.
No, they didn't!! The Transformer model costs you additional performance, you don't gain frames, you lose some!! I just hate these clickbait titles!
You can use lower quality modes at the same visual quality, improving performance overall. Is any more nuance than "NUmBER bIgGeR" too hard for you to understand?
Well, actually you do gain performance, because you can use Balanced or Performance mode while getting Quality-level quality...
@@Frozoken DLSS Quality on CNN vs DLSS Quality on the Transformer model: you lose frames, so what's so hard for you to understand? Nvidia didn't upgrade anybody's GPU. I'm playing in 4K on a big OLED TV and I won't use anything less than DLSS Quality. DLSS got better with its 4th instalment, so why wouldn't I want to take advantage of its better quality??!! Just use Ultra Performance and set everything to low and enjoy your "upgraded" GPU with higher FPS. I don't care what anyone is doing or how they play; my point was that DLSS Quality on the Transformer model won't bring you any performance vs the old CNN type, get it?
@@AexoeroV But CNN Quality mode looks about as good as Transformer Balanced and just a bit better than Transformer Performance, so just turn down the setting to get the same performance.
@AexoeroV Ok, just because you're too ignorant to use anything less doesn't change the fact that when visually matched you're getting more fps. End of story.
A good quality upgrade for the price of a performance-mode drop for slower cards. If you are on a 70 or 80 class it's nice, but if you are on a 60 or 50 class and already use Performance, the games will now run slower.
Use the old model then
I have been playing Cyberpunk at 4K with Performance TF DLSS, frame gen and path tracing on a 4070 Ti Super. It fluctuates between 70 and 90 fps and is absolutely beautiful. It is rare to be able to pick out obvious signs that it is being upscaled from 1080p. Saying upscaling or frame gen is bad for games is just wrong imo.
People like me are complaining not because of DLSS and its benefits, but because:
- Nvidia is comparing performance between new GPUs with fake FPS and older GPUs rendering real frames, trying to make people believe that fake frames are as good as real frames. They're not.
- Developers use those things as an excuse to leave games unoptimized, so instead of getting 60 fps real and >120 fake, we get 30 fps fake...
Everything else about DLSS and FSR is great and helps achieve better frame rates in newer games.
@@ricarmig I have heard it every day, long before Jensen got on stage and lied like a villain to sell 50 series AI chips to gamers, because they no longer design gaming GPUs. There are issues with devs these days, but the existence of these technologies is not the problem. The problem is that Nvidia uses a ton of die space on AI and designs the architecture specifically for AI instead of just making GPUs game fast enough. Because of the garbage chips Nvidia released, devs are required to use it, since the die is dedicated to AI and the hardware is not fast enough to keep up with the visuals that devs think gamers need in order to be happy with a game. If Nvidia made an "FPS-first GPU" on 4N with a 750 mm² die like the 5090, it would get massively more performance. AI funded all the R&D for Blackwell. I suspect soon Nvidia will branch off and make gaming/RT GPUs separate from AI, or perhaps a gaming-only line and a hybrid that allows for advanced AI features. Unless games become fully neural-rendered... then you're really gonna have something to talk about when raster is completely gone and it's all a mix of neural and RT. Whatever that means.
If you play pleb games you can use frame gen, be my guest; if you actually need to be good at a game and have any feel whatsoever, you hate the delay.
About DLSS TF "not being free" / "has increased cost over previous DLSS"... This is a similar situation to what happened when moving from FSR 2 to FSR 3 on older AMD cards like the RX 580 (and probably any older cards from the era).
It basically meant that FSR 2's "Quality" setting's performance would have to be dropped down to "Balanced" on FSR 3 to get the same frame rate.
I tested this in so many games, it was consistent across the board, and probably why AMD was going to limit FSR 3's rollout to RX 5000 series and up (Navi and up).
*I bet FSR 4 will have increased cost, too, but that's a tradeoff for better tech.*
Another outlet on YouTube showed that older-generation RTX cards, especially the 2000 series, take a performance hit with TF upscaling due to the small number of tensor cores relative to newer cards. The 4000 series has enough tensor cores to handle the TF model without an issue. That performance hit on older cards is just the extra time the tensor cores need for the DLSS pass itself; it doesn't slow down the rest of the rendering.
So when the new Nvidia driver is released, we don't have to do anything except install the new driver, and every DLSS-supported game will automatically start using this DLSS 4 TF model for upscaling?
This is very good news! My 3090 will live on for quite some time with this better upscaling, and even fsr framegen like you said! Maybe when I get around to playing cyberpunk I'll be able to run path tracing after all haha
I was looking at some 5090 MSI cards. Then I did the math. A card costing $2,500 plus $250 CA tax = $2,750, and if I paid $200 a month for 15 months at a credit rate of 22.8% I would pay $467.51 in interest = $3,217.51 for a 24-30% raw power increase over a 4090, with the bonus of fake frames.
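For anyone checking the math, here is a rough Python sketch of that financing scenario (fixed $200 payments on a $2,750 balance at 22.8% APR, compounded monthly; real card statements accrue daily, so treat it as an approximation, and note it actually takes about 16 payments to clear the balance):

```python
# Approximate interest on a fixed-payment credit card balance.
balance, apr, payment = 2750.00, 0.228, 200.00
monthly_rate = apr / 12
total_interest, months = 0.0, 0

while balance > 0:
    interest = balance * monthly_rate          # interest charged this month
    total_interest += interest
    balance = balance + interest - min(payment, balance + interest)
    months += 1

print(f"{months} payments, ~${total_interest:.2f} interest, "
      f"~${2750 + total_interest:.2f} all-in for the card")
# Lands within a dollar or so of the quoted ~$467.51 in interest.
```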
Brother ew Brother ew what’s that?
I could do the same calculation, but as a European, I just buy it, and pay it with the money in my bank account.
I never get the buying on credit for non-essential products. Especially if they aren't that expensive.
In Europe we usually only get a credit for things like a house or a car.
If you can afford $60 to $80 games, you can afford a $600 GPU. I get a kick out of gamers that play new AAA titles and complain about the price of GPUs.
Man you should go work for Nvidia, cards and games are both overpriced, especially when games get released while being completely broken. You're a joke.
I'm using an RTX 4070 laptop, which is basically a slightly less powerful 4060 Ti with a few more CUDA cores but less power and slightly slower memory bandwidth. It's amazing that I get around 35-40 FPS with path tracing in Performance mode in Cyberpunk, and 60-70 fps with FG. Ultra Performance mode is quite good now as well. It's slightly worse than native but still very playable, with a few extra artifacts and blurrier far distances, but it looks decent enough considering it's only rendering at about 480p in that mode! With frame gen, Ultra Performance mode and path tracing, I get 80-100 FPS. It's crazy it can squeeze out this level of fidelity at these tiny resolutions.
Nice video, thank you!
So how do I upgrade my 3080TI with this new model of software?
wait for new drivers
it is not out yet
And now, can you believe it, the 3080 Ti is faster than the 5080.
@ Sweet
so dlss4 is not exclusive to 50 series?
I'm more interested in what the frames look like when panning. Right now I prefer to dial down my settings than to use DLSS to get the frame rates I want.
That shimmering on foliage is fixed if you use Ray Reconstruction in conjunction with the new DLSS transformer model; however, the new RR will tank performance on 20/30 series cards, so there's a slight trade-off in image quality on older-gen cards vs the 40/50 series.
6:23 That building in the distance looks better on the right. On the left the windows look much worse and overall much blurrier. On top of that it barely gains any performance over old quality mode. I don't see what's so great about it.
"I don't believe Nvidia made these changes for older cards out of the kindness of their hearts. It's more likely that developers across many gaming companies pressured Nvidia, especially since the release of the 40-series generation, to consider making these changes, and they succeeded."
All it is is just a .dll file. No one pressured anyone; stop coming up with these conspiracies, it's not that deep.
the words of conspiracy theorist hater.
More like: hey, I don't need the 5090, I will get the 5070 that I can afford. NVIDIA wins; more cards bought in the mid-tier range = more market share.
It isn't just foliage that is an issue for the transformer model. The moving signs are more unstable from a distance than with DLSS quality with the CNN model. I also tested the transformer model on Dying Light 2 and The Witcher 3 and foliage is an issue in those too.
Of course it's free. To get hooked on fake frames forever ;) Nvidia going the full drug dealer business model^^
And AMD is not doing the same with FSR 4?
@@adlibconstitution1609 they gotta compete somehow
FSR same thing
True gamers hate both. Fk fake frames@@adlibconstitution1609
DLSS is not Frame Generation, it does not add fake frames, nor extra latency. It's just an AI resolution upscaler.
It's for everyone? Should be great on an ARC card then, right?
RTX 3090 - midrange in 2025? oh ok.
Yea its the same performance as 4070 super
AND it’s now almost 5 years old, in the tech world 5 years is considered old hardware IMO
3090 is definitely mid ranged now.
@@alvin6572 by price? this is a nightmare
@ 4070 super is around 3080 in 1440p/2k resolutions.
np. I would buy ur 3090 for 4070 Super price!
The new version looks smoother but more blotchy; the older version seems to preserve texture depth better, but it's stutterier and has more artifacts.
You said everyone gets an upgrade, but I’m still rocking a GTX 1080 🙈No more love for Pascal? 😕
No tensor cores on pascal, but try out lossless scaling.
@ I know, truth is I may just have to shell out some cash and get an RTX because the GTX 1080 is having a hard time keeping up with 1440p 😑
The Lossless Scaling app on Steam made me feel like my 3090 Ti got an upgrade. Just inserting FG into anything is great enough, but being able to limit heat and GPU usage by capping the game at, say, 60 FPS and then setting Lossless Scaling's FG to x2 outputs 120 FPS. Smooth as silk, especially with Reflex and Boost on.
So with the increased resource usage, does it turn DLSS Quality from a performance boost at no visual cost into more of a visual boost at no performance cost?
Also, if I was only just able to hold a stable 60 FPS at 4K max settings with DLSS Quality on the CNN model previously, will I have to drop to Balanced to get the same framerate? Or would Balanced land roughly where Quality used to be?
My 2080 still lives. DLSS Swapper plus Lossless Scaling, it's the GOAT.
This is cool. Really good for us on older GPUs. I kinda wish Nvidia would've added a frame-gen-for-any-game feature like AMD did. I mean, I guess there is Lossless Scaling, but I think Nvidia would've made it better. I wonder if AMD will back-port FSR 4 to older FSR titles. Either way, this is good for Nvidia, it makes them look a lot less scummy.
First, multi frame gen will look like trash on anything but a 5090, so you shouldn't want it. Second, cards older than the 40 series do not have the hardware (the optical flow accelerator) to do single frame gen the way they coded it. You can use FSR or some kind of hack, but that's a hack. Nvidia is the bad guy for innovating and adding new hardware features, then using them? Kind of a weird take.
@@patsk8872 Never said they were bad guys, just scummy. They're obviously taking advantage of the market by limiting VRAM and raising prices well beyond average inflation over time. I've seen a few of your comments. You seem to really love defending Nvidia against criticism. Pretty weird to defend a multi-billion-dollar corporation that doesn't care whether you live or die. Weird flex but okay.
The new driver is the key, I think; the uplift is noticeable!!! RTX 3090 user here.
1:46 taking hallucinated frames to the next level
gonna be ridin my rtx4090 on native 1440p res till a proper RAW performance upgrade gets developed hopefully rtx6090. fck em Fake frames, fck em Fake Res, fck em Gsyncs and fck em Vsyncs.
Why so angry at Gsync and Vsync?
Fake rasterization too? 😂😂😂
Bro, first off, chill. Upscaling is so good I can't distinguish it from native at times, even 4K Performance looks great, and it will look even better with the transformer model. Gsync just removes screen tearing without a hit to input lag, and you've got Reflex 2, which I think many are sleeping on.
If all they did was just improve raster performance then they will be eaten by the competition. FrameGen seems like bs but I need to try it first and get a feel, but this is the first gen MFG, it’s gonna get better, and what’s great is that if you have a big hard on for raster performance you can just opt out of all of these, the laws of physics can’t be bent out of shape, they have perfected their gpu design so much over the years, the problem is Unreal5 and all the horrible optimization.
Gsync is good. Smoothens the image with no tearing.
Stop being so butthurt. Just Fyi all games are fake. Nothing truly real but an approximation of reality. So if you don't like it then just don't play and go play in your corner
@ damn in the corner bro 💀💀💀
DLSS4 (super resolution) is a thing. Can't wait to see comparison to FSR 4 (super resolution)
The bigger news should be how Deepseek just proved how much of a fad Ai really is. Nvidia has been offering nothing in terms of Ai and artificially charging the market insane amounts of money for it. It's high time they get taken down.
I scoped out the new Transformer setting in CP the other day. Hands down looks way better than the original method. But I still wouldn't run lower than "quality" unless I absolutely had to, as there seems to still be a pretty big drop off in fidelity, on this particular game. RT in CP2077 is still a complete disaster on my 3090 though.
As it is though, I've already been using DLSS Balanced in Darktide and it's not bad. Quality looks a little better, but the performance benefit of Balanced far outweighs the small penalty in fidelity.
Is there more info on what games will be getting Transformer and when, or is it a driver level update that can work on any DLSS game currently?
Right now you can use third-party software like DLSS Swapper and NV Profile Inspector to use it in pretty much any DLSS 2.0 game or newer. By Jan 30th, NVIDIA will release a driver update that will enable an option to force DLSS Transformer on compatible games via their own NVIDIA App. It's unclear right now how many games will be supported that way, but at least 75 if we are to believe NVIDIA's claims.
And it's likely going to replace the third-party methods eventually.
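For the curious, what tools like DLSS Swapper essentially automate is a DLL swap: DLSS 2.0+ games ship an nvngx_dlss.dll that can be replaced with a newer one. A rough sketch of the manual version is below; the paths are placeholders you would point at your own game install and at a newer DLL you've sourced (e.g. from another game's install), and whether a given game or anti-cheat tolerates the swap varies, so back up first.

```python
# Rough sketch of the manual DLL swap that tools like DLSS Swapper automate.
# Paths below are hypothetical placeholders; adjust them for your own setup.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Cyberpunk 2077\bin\x64")   # hypothetical game install path
new_dll  = Path(r"C:\Downloads\nvngx_dlss.dll")        # newer DLSS DLL to drop in

target = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)    # keep the original so you can roll back
shutil.copy2(new_dll, target)       # swap in the newer DLL
print(f"Replaced {target} (backup at {backup})")
```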
@band0lero Cool, I can wait a few more days for official support. I guess I'll have to finally break down and install their app though.
@ Their app is pretty ok. Nothing like the GeForce Experience bloatware.
@@band0lero Thanks, that's good to know. I never installed GeForce Experience because I don't want any bloatware 🤣
There was no upgrade to the card. They just put some effort in making the feature work on older cards which they could have and should have done before this.
more like
- "Hey, devs, now all gamers have that future. Use it in your games"
- "OK boss, how about those who have GPS that can't handle or don't support it?"
- "Bad luck for them."
Gamers:
- "Now i must buy new GPU because my old one with that feature looks and feels like shit. 300 FPS shit i must admit."
Transformer model works 100 percent better again with ray reconstruction on
Can you use DLSS 4 in every game or just the ones that come with it?
is the new driver from nvidia coming out on jan 30th of this week?
DLSS finally does what Nvidia promised it does 6 years ago!
But in fact still not completely: going too low on the upscaling level still causes issues in most games, even on the most common engines like Unreal, where a select few effects ignore DLSS and are always based on the real resolution, and too bad for UE5 that this concerns the entire lighting system, ghosting all over the place! Oopsie! For real, I'm also impressed by this new version, but it's STILL not real resolution. Also, Epic are in trouble: DLSS 4 is so sharp that their engine's flaws are really obvious to see now.
Go take your meds.
Eh few things are truly perfect.
If any
complaining about something FREE?! i've seen it all
@@HumbleDude46995 it’s just cool to hate on nVidia.
Dude shut up. LOL.
Features that make the game look better without a performance hit? In no way is that a bad thing.
Agree. Good video review. Most will stick with their cards. DLSS4 will help millions of people with older cards. However, some of us could not get our hands on a 4080/4090 when first released. A lot of AMD driver issues for me. Good chance I will get a 5080 or 5090. I can wait 3-6 months. Patience is the key to see how this all works out.
Ah, I hoped Nvidia invented a way to upgrade everyone to 24 GB VRAM... I need more VRAM for experimenting with local LLM models.
Buy those A6000. hopefully you guys stop buying those 3090/4090/5090 class of gpu. That way nvidia will drop the price back to the usual $700.
And for games that don't support frame generation natively you could just get lossless scaling
It's concerning how many people run at 4k and use like a 4090 or something and refuse to touch DLSS. I'm not sure if it's an ego thing or what, but the majority of games it's impossible to tell unless you're really looking hard. It'll look better on DLSS Quality in a lot of games due to how harsh TAA blurring is nowadays.
Nvidia is conditioning us to accept these fake gains. By doing this, some gamers will be defaulting on to this new BS.
What resolution does DLSS 4 upscale from in Quality / Balanced / Performance modes? Is it 1440p, 1080p, 720p?
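Assuming the transformer model keeps the long-standing per-axis DLSS scale factors (roughly 67% for Quality, 58% for Balanced, 50% for Performance, 33% for Ultra Performance; an assumption, since NVIDIA could adjust them), the internal render resolution works out like this:

```python
# Internal render resolutions implied by the commonly reported DLSS scale
# factors. Treat the factors as an assumption, not an official DLSS 4 spec.
SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_res(width: int, height: int) -> None:
    """Print the approximate internal resolution for each DLSS preset."""
    print(f"Output {width}x{height}:")
    for mode, s in SCALE.items():
        print(f"  {mode:>17}: {round(width * s)} x {round(height * s)}")

internal_res(3840, 2160)   # 4K output: Quality ~2560x1440, Performance 1920x1080
internal_res(2560, 1440)   # 1440p output: Performance renders at 1280x720
```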
"Nvidia Upgraded Everyone's GPU"
Me sitting on an AMD GPU
just wait a minute, new FSR will come soon as well :)
@@HybOj FSR4 AI upscaling is only supported on RDNA4 GPUs
@@HybOj Not for older cards. Only the 9000 series will be getting actual FSR4.
@@aflyingcowboy31 For now. AMD said they will work on bringing FSR4 to older cards.
runs worse, is jittery and sometimes gives unrealistic results compared to native
Ah yes complain for the sake of complaining.
Then don't use it in situations where it happens .
Simple
But Jensen must be worshipped
@GameslordXY it's cool fannyboi
Can native do path tracing in 4K while maintaining at least 60 FPS?
@rPenek not sure any card can without dlss
This will let me keep my 2080ti a little longer.. now I am definitely waiting for a 5080 ti/s variant to come out before I upgrade.
The way the asphalt renders in Cyberpunk is godawful with DLSS. How can people play with this on? I tried it and it sucks so hard.
Why do you look at the asphalt? Don't you feel the difference in smoothness and look around and actually play the game? Or do you just stare at asphalt lol?
It's probably a thing with your monitor. Roads look phenomenal on my pc with dlss performance
@Kachapuo I actually look at every texture for fun, and road looks good to me.
@@Thomas_AngeloAgreed
@@Thomas_Angelo then u need a visit to an optometrist. it comes in blocks with clear edges. its awful.. if u cant tell the difference thats on u, not me.
It's been stated that non-RTX 4000/5000 series GPUs will take a performance hit when using the new DLSS 4 transformer model; supposedly they lack the hardware to run it efficiently. However, DLSS actually works better at 4K because it's not upscaling from 540p as it does at 1080p, or 720p as at 1440p, in performance mode: DLSS performance mode at 4K upscales from 1080p. I could be wrong about these resolutions, but I believe that's correct.

Either way, the transformer model is great and I really like DLSS. It's basically free performance and it lets me keep my RTX 4080 longer to get better value from it. I don't understand why so many people hate on DLSS. Frame gen I understand, but upscaling has been used for a very long time in televisions and consoles; it's in way more things than people even realize.

Imagine if Nvidia, Intel and AMD did not give us upscaling and still charged $1000 for a graphics card with the crappy performance they have. Some would argue games would run better because developers would actually have to optimize them, which is probably true, but game optimization was lacking way before we got DLSS/FSR upscaling and frame gen. Greed is the component that is destroying the gaming industry, and really the whole world.
Yeah, the narrative that "games should be better optimized" always strikes me as wrong. Like, realistically, are they going to be? We had shitty ports way before we had any of these techs. Were these people not around during the Xbox 360/PS3 era? PC games ran like crap most of the time back then. Frame-time consistency was nowhere to be found. I don't think it's the fault of DLSS/FSR that new engines are very reliant on resolution scaling; it was already heavily used even on the last-gen consoles.
Regarding DLSS upscaling, I myself really disliked using it. It had pretty obvious artifacting, especially in Cyberpunk with Ray Reconstruction. Lines weren't quite straight, the image was overall blurry and ghosting was very apparent. STALKER 2 had similar issues, but mostly ghosting and blurriness. I really reject the narrative that DLSS Q looked "native" on the previous model.
Frame gen is a different beast entirely. It does have a bit of artifacting, it blurs the image a bit in motion and input latency is noticeably higher, especially with mouse input, but it was way more usable than DLSS upscaling IMO. It also depends heavily on your performance before enabling it; it works well at over 80-90 FPS base. What sucks most about it is that NVIDIA sold it as "performance" back when the 40 series launched, which could not be further from the truth. NVIDIA is selling it as performance once again with the 50 series MFG.
The transformer model is a game-changer all around: DLSS upscaling is very much usable, ray reconstruction improved massively and FG was already good enough.
Pricing is based purely on demand, shortages and availability, never on true manufacturing cost. We tend to think about what value we get out of it, but the true shocker is the margins. Nvidia's reported gross margin in 2025 to date is around 75%, which means roughly 25% of revenue covers the cost of actually making the products and 75% is left over. To be precise, gross margin only accounts for manufacturing, components and labour; offices, staff and R&D come out of that remaining 75%, and what's left after those is the operating profit. Now imagine that GPU you bought for £2000 cost £500 in components, labour etc. to make. The rest funds their research, overheads and, largely, just sits as profit. Look up Nvidia's profit margins on their H100 GPUs.
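To put that in numbers, here is a quick sketch of the margin math using the illustrative figures from that comment; these are not Nvidia's actual costs, just an example of how gross margin relates to sale price and cost of goods sold.

```python
# Illustrative margin math using the figures from the comment above
# (not Nvidia's actual cost breakdown).
sale_price = 2000.0   # what the buyer pays (GBP)
cogs = 500.0          # components, assembly, labour (cost of goods sold)

gross_profit = sale_price - cogs
gross_margin = gross_profit / sale_price
print(f"Gross profit: £{gross_profit:.0f} ({gross_margin:.0%} gross margin)")
# Operating costs (staff, offices, R&D) then come out of that gross profit,
# leaving operating profit; gross margin alone isn't money "in the bank".
```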
Nvidia always has a reason when they "give" something away for "free". I think it's that the AI compute market is still red-hot, where the GPUs being sold make consumer graphics cards look dirt cheap ($10k+ cards, and companies like Meta and Microsoft place billion-dollar orders). Gamers aren't currently that important to Nvidia's priorities/bottom line (we'll always get lip service). This just helps Nvidia keep graphics card market share.
There is no update and there is no possible way Nvidia drivers can insert settings into games. All games have to release an update to accommodate the GPU settings
You can use third-party software to swap DLSS versions ever since 2.0 and NVIDIA already showed the in-app DLSS version swap in their presentation. It's coming soon.
@@band0lero Yeah but it doesn't officially exist now from NVIDIA so all these youtube channels need to cut the crap with these types of videos
that 5090 is no way going to be 2k. Even the 4090 is going for $3500 in some places.
Will this also apply to ol' 1080 Ti too ?
So gamers should stick to old generations cards since software updates allow for performance improvements and leave newer (and far more expensive) cards for AI modellers.
Covering the on-screen labels with yourself is probably not the best idea.
for the whole 15 years, the games still stuck at around 30fps at Max Settings...WTF is this crap???? Literally I mean...makes no sense at all...
This is why none of you guys are computer experts. Did you ever consider the fact that when you get more frames the quality goes down, and the quality goes up when you have fewer frames? 😅 SMH 🙄
You're showing such easy scenes for this tech to do well at; in fast-moving or busy scenes with plant life etc. it just falls apart. You can't have something for nothing in this world, and artifacts are what you get.
I game on a GTX 1060 6GB. Is DLSS going to help my GPU?
I'm hesitant to upgrade to the new drivers. I don't trust Nvidia, and I wouldn't be surprised if at some point they gimp their older GPU models to try to nudge people into spending money on the 5000 series...
What? You can always roll back the driver to an older version if you want to.
@@TheKims82 Haha, this guy thinks it is a phone.
Does Vermintide 2 support new Transformer upscaler?
Imagine someone saying the words Nvidia and free in the same sentence ? Now that's the funniest thing I've ever heard ! The more you buy the more you save !! If you buy 3 5090s you save so much its like getting 2 free cards !!
I don't understand why you didn't switch to DLSS 4 Quality or Balanced in Cyberpunk to see what the performance hit would have been. You only showed Performance mode, which significantly lowers the resolution, and of course it gives great performance because of it.
Now my fingers are crossed that AMD releases an update that makes me be able to play Fortnite and other games without random freezes on my 9900X, FPS drops etc... Literally can be at 300 FPS and then go down to literally 1, freeze everything and I have no idea why. At least my temps are no longer insanely high...
All we need is a quality frame gen model for the older cards, something the open source community should start working on. The problem is patching it into the game drivers, which Nvidia would never allow. Software upgrades are nice, but when buying a newer generation you shouldn't need them.
Lossless Scaling is pretty cool
@AhmedSaeed-wx3pd Yep, but it needs more work; if it had the same driver-level integration as Nvidia's solution it would be better. But in a pinch it definitely can help. It's funny how Nvidia claims it's impossible to do frame gen on older hardware, yet there is now an app and FSR 3 that can do it.
@@justindressler5992 Nvidia has stated it may be possible to bring FG / MFG to older RTX cards.
@@justindressler5992 "frame gen on older hardware but there is now a app and FSR 3 that can do it."
That is because FSR 3 isn't hardware AI, it's software AI, which is also why FSR in general is just worse than DLSS. FSR 4 is going the hardware AI route, which is why only the 9000 series is getting it.
It took literally one second after you switched to DLSS Performance to see smearing/noise on the ground... It's really strange when people say there is no difference between native and Performance DLSS. Anything under Quality is kinda bad, even on the new updated DLSS 4. It's only my opinion tho.
I hate how discourse around image quality has been tainted by content creators in the tech/gaming space. DLSS 4 transformer is very much an improvement on DLSS 3. In ray reconstruction scenes it is honestly better than the previous CNN model by a lot. I'd go as far as to say that DLSS T Balanced in CP2077 (RR + path tracing) looks better and has fewer artifacts than DLAA CNN.
But people often make such unrealistic statements like "better than native", well, no shit. If the TAA is already destroying image quality, if DLSS destroys it a little better, then it's better than TAA. That's not a compliment. Also, people really understate how much artifacting can be distracting.
@@band0lero Yeah, the TAA meta now is also cancer, I agree.
Can I ask what application you're using to get the fps and frametime stats in the corner?
Crazy if both fsr4 and dlss4 performance mode still beat the crap out of fsr 3.1😂
My RX 580 will live an extra-long life at 1440p UW
I'll be sticking with my RTX 3090, thank you. I'm still having plenty of fun with it. There's no point upgrading yet.
Transformer model upscaling looks amazing from what I've seen so far.
@@patsk8872 the CNN was already some special sauce compared to FSR3. The TF model just bitch slapped whatever FSR4 thought it was gonna be lol absolute game changer. Even ultra performance is reasonable and looks better than CNN performance. Thank you Nvidia for continuing to push the envelope of quality. Except for 50 series.. what a letdown. That architecture was completely built for and paid for by AI. Gamers got nothing because it is an AI GPU. It's bad.
Ya, but Nvidia might have just screwed their recent gen if this is true; IMHO there would be no reason to move to the 50 series even with multi frame gen, and the native performance uplift isn't that good. It must be the reason the 5070 is priced pretty low compared to previous gens, to make it more enticing to switch, and the 5090 makes even less sense.
It would be nice if we get a full gpu re-evaluation from one of the bigger reviewers though cause im interested in seeing those results.
I live in Canada, B.C. Buying an RTX 4090 from Amazon is literally $6K! How they are allowed to price it so high is beyond me. That honestly should be illegal.
it's because the 4090 is no longer being manufactured and the supply is low, of course the price is going to go up so much
Does anyone know what tool he’s using to see all of his PC’s specs and component temperatures in the top left?
msi afterburner
DLSS Swapper plus Lossless Scaling; actually tried it with Dynasty Warriors Origins and Palworld, it's so good.
I have an RTX 2060 6GB. Does this improvement include me? (I have DLSS)
why would you think it doesnt ?
@@damasterpiece08 You know, my GPU is old and most updates don't come to the RTX 2060. Thanks for the answer btw.
@ you're welcome; your gpu is the same generation as the 2080 and works on the same driver and has the same cores, only less of them
@ thanks again.
Where is this upgrade?
One more thing, before everyone starts to say "oh, the 5090 isn't double the quality": we're at a point where we can't double anymore, we've reached our physical limit. I mean, I think there are other ways, but that's neither here nor there, so even 10% better is going to be ridiculous. People need to start using their brains 😊
I’ll never understand why people want the absolute, non plus ultra, best Graphics etc.
Playing on a 1080p monitor is fine and you get good graphics plus performance but 4K on your television?! Seriously?!
For that you do need the latest hardware BUT only if you want the best!!!
People can spend their money however they want, but use your brain, don't just mindlessly spend thousands of bucks on a gaming PC!
You can get a nice gaming PC for round about 1200$!!! More than enough! It’s not gonna be 4K with high FPS. Who cares😑
So my RTX 4090 24GB is going even further beyond? Great!!
I play on a 2060 Super. In Cyberpunk, with DLSS on at max non-RT settings at 1080p, I use DLSS Quality and that looks great; I cap my FPS at 82 and it works amazingly. The transformer model usually only gets me 62 FPS. It's much better on the newer cards, but on the older ones it doesn't look as good. Also the grass in the badlands flickers like crazy on the transformer model, not so much on the CNN.
That is a monitor pixel-density issue, not an in-game issue.
Dlss 4 performance looks like quality with the new update! Great job nvidia. Runs great on my 4090
Only reason I’m getting a 5070ti is because I’ve saved for about 3 years whilst raising kids. Hope this is a product refresh practice Nvidia continue, because I won’t be able to upgrade again for a lifetime cos I’ve got grandkids now 😂
AMD can probably train against Nvidia model+ their own stuff. ML is tricky from the competitive standpoint of building a moat. What happened to OpenAI recently is a good example : 🍿