DLDSR 1.78x +DLSS Quality (1.78x * 0.444% = about 80% the amount of native pixels) is a close contender to DLAA with similar ±performance in case DLAA is not available (although there now seem to be mods to hack it in). In my experience DLAA tends to ghost quite a bit on around edges
@@bluesharpie9744 I tried dldsr 2.25 for Death Stranding director's cut and it wasn't good. Not as bad as TAA alone but still a lot behind DLSS alone. PS: it's only one game in 1080p, I have test every game
Old comment I know, but DLAA is literally just DLSS Quality except a little bit better under a microscope. It's technically the best image quality possible, but I think most people would argue it's not worth the FPS loss compared to Quality (unless you're at 1080p, as at that output even Quality dips too low in terms of internal pixels) Oh and the ghosting issue was fixed with Preset F in recent versions, there was a period between 2.5.1 and 3.6 (iirc) where the default preset caused ghosting in small objects, but they have since fixed it. You can use DLSS Tweaks to force a preset in any game.
havent played that many modern games recently but good lord war thunder has a really bad problem with anti aliasing. Its so bad, SSAA is mandatory. Even with that, theres so much noise you will have a real hard time spotting enemies. TAA helps too but certain texture still blinks though its not as bad as it was previously. You have to sacrifice so much performance
@@CC-vv2ne So add sharpness, what's the issue? You seen TAA in R6 with a bit of sharpness (right under the TAA)? It looks CRYSTAL CRISP. It's AMAZING. I wish all games had such good anti-aliasing and sharpness.
@@Dyils Ubisoft is usually among the best technically. It would be like saying DX12 has not been a severe issue because WD Legion managed to make dx12 running better than dx11. TAA is usually badly implemented, R6 isn't a proof that most games could achieve the same thing.
Yup, dlaa looks really really good with native (what I use with Hogwarts legacy )with a 27" 1440p monitor. I found dlss quality looks quite bad in Hogwarts, noticeably softer and blurrier, especially indoors. I did not realize it was possible to enable on your own for any game with dlss, will have to try that in other games.
You have to disable then re-enable AA in Death Stranding Directors Cut edition each time you launch the game, otherwise it'll be without AA, even though in the graphics menu it says it's on. The standard Death Stranding didn't had this issue.
I knew something was wrong with the results. I played the standard edition of DS a few months ago and thought the TAA implementation was actually good. And I played at 1080P!
Awesome information. This channel has been especially hitting it out of the park with valuable and hard to find information. I know these videos are a ton of work, and I'm hoping others can appreciate and find this content as helpful and useful as I do.
Yeah for real, and these guys are truly independent and not willing to shine a turd. Like it'd be funny watching them seeing a GTX 1630 in action for the cost. It confuses me because I think they're pretty much fair, and rightfully slammed teh 6500XT as being a garbage product, yet lots of butthurt fanboys had to cope about what seemed like blatantly obvious common sense to me 8gb was not enough. I'm confused by techpowerup though it says a 6500XT performs the same as a 5500XT? Why really?? Funny because mostly AMD was the best brand last gen, except that one stinking turd. I automatically expect Gamersnexus and Hardware Unboxed to be utterly unswayed by corporate bullshit artists.
Meanwhile Lius techtips uses 1/20th of the effort to make a video, but includes some hand held camera movements, some gpu dropping, some sponsourshots, some jokes .
I will never understand why the DLSS or FSR files are not updated... there are really big differences in quality and it's not a big job for the DEVs. but the result is all the greater.
you can't just boot the game and test a couple scenes to verify it works correctly, there might be a specific area or shader that doesn't play well with it, it's not that straight forward
@@DavidFregoli I agree but I personally use the latest DLSS version with the games that support DLSS but are stuck with older version (eg. RDR2 ) and the game looks native with more fps. DLSS 2.5.1 or any 3.x.x looks so good
As a developer, it comes to : there is no ticket for that. Anyway, the nvidia SDK states that the dll is drop-in replaceable and that is a major feature of their sdk. Even have an update function you can call but nobody ever does.
I honestly just assumed going into this video that Native > DLSS and FSR in terms of quality, so it was pretty interesting to see how it resolved some issues with some games like DS. Great video!
Well you assumed correctly, because Tim makes a common mistake, Native > Upscaling (ANY), but DLAA > AA (ANY), because all AA is trying to imitate DOWNscaling. Render @ 8k, Display @ 4k gives you the best AA you can get, at cost of huge performance hit, and every AA ever invented tries to mimic this taxing AA method, without high FPS hit. DLAA is training the AI on Downscaling, to do better AA, without upscaling the render. So all this video shows is what we already know DLAA is "Better than any other AA. This is worst video Tim has ever made, especially as we can now force DLAA, in any game that has DLSS support (How? See my other posts in these comments).
Unbelievable content. Need to continue to showcase games relative performance between DLSS, fsr and native twice per year to keep up with the updates and new games coming out.
would love to see DLDSR together with DLSS added at some point in the future, i often use both together at 1440p and it further reduces flickering/aliasing.
Maybe instead of or in addition to further optimizing DLSS and FSR image quality, maybe developers should just spend more time properly fixing the native rendering in many cases... Basically every single time DLSS was better, it wasn't because it was so good, but because native TAA just was so bad. We have seen that before with stuff like AMD's CAS which seemed to work just a lot better than what ever developers built themselves...
@@jkahgdkjhafgsd Even on a 140ppi monitor jagged edges looks worse than everything, I'd rather take bad TAA that flickers a lot, but it's better than literally everything looking absolutely awful... If you want something good, look at something like SOTTRs SMAA + TAA implementations. They are expensive, but it works wonders, not a lot of flickering and other artifacts, just plain solid.
No anti aliasing is almost perfect during movement, it absolutely slams the clarity of even dlaa. Aliasing patterns are a problem because the pixels are only rasterized in the centers, which is pretty wasteful. It's better to rasterize random locations inside the pixels, which also randomizes the aliasing patterns and makes them a lot smoother in real time. To improve the clarity even further, foveated supersampling/downsampling is needed, on top of backlight strobing/black frame insertion and eye movement compensated motion blur, which make 100 hz look and feel like 1000 hz in a non destructive way
I think you should consider testing more games with manually updated DLL file for the lastest version of DLSS. Since as you said, just by updating its version can change the result from worse than Native to match or even better, so I wonder if this is a trend across the board, and since this is a comparison of DLSS vs Native, I think it's fair to use the best and lastest version of it, since it can be manually done by users for free, not depending on games developers to implement. If the results are indeed positive for all, then it's gonna be truly a game-changing technology, since it'd be a truly free, stable and noticeable performance boost toggle that comes at no cost, and might even improve visual, simply a much better solution than overclocking since it's not dependent on silicone lottery.
Some other channels could learn from your image quality. Always sharp and clear 4k. Probably the best looking channel I watch. Why do other channels not look this good? Lol. Keep it up!
DLSS has a wonderful effect on surfaces that get shimmering lighting/shading or shimmering/bubbling screen space reflections that is pretty common on games of the last 3 years, my guess is that the temporal effect of it smooths it out, in sometitles it's very obvious like Cyberpunk 2077 where this kind of shimmering appears on a lot of surfaces. I guess theoretically a good TAA setting would be able to do the same, but it would have to be able to understand know what kinds of objects it should do it to, or it would cause ghosting... i guess this is where DLSS comes in, since the machine learning can make it understand what parts should reasonably not be shimmering... and modern 3.1 dlss is very low in noticeable ghosting anyway... also now with dlsstweaks which is a fabulous tool, and now it even comes with a UI program for people who might feel intimdated by editing the text files directly. With this, you can set your own internal resolutions... i've been playing around with it a lot... most games can run on really weird internal resolutions, like rendering at 3:2 3240x2160, which i've tried for 4k, so it is DLAA vertically, and DLSS horizontally... in games where you would need just a bit more performance with dlss but you are not satisfied with the "quality" option... or in games where you want something in between the available settings. I hope in the future this can be baked in as a slider in games, so that instead of having a number of options "quality, balanced, performance and ultra performance", you can just have a slider from full internal 1x resolution (DLAA) down to even below the ultra performance (x0.333333) level... I've had great use of this tool to max out for example The Last Of Us Part 1, so that I can set a resolution just high enough to look very close to native (so above the DLSS quality setting, which is x0.6666667) but still fitting within my VRAM limitations for a certain set of settings, that i wouldn't have been able to do if i ran it at native. DLSS really saves the day and i've tried using FSR and XeSS to see if they are any better in any game, but i haven't yet found any case where they look better. Having gone over to 4k tv for gaming and content viewing recently, I wouldn't even be able to run a lot of the games i like in 4k without dlss, it's really the main thing that would keep me from thinking of going over to AMD cards if i was looking for a new graphics card now.
Great vid as always- There's also this strange micro stutter and ghosting in the DLSS my eyes can't unsee it, especially when it's scrolling past something, it's not smooth. 11:39 In hogwards legacy, the freaking coat judders and all the small things that move around like birds have massive ghosting. 12:44 judder judder judder. 13:35 rain ghosting, the rain is literally ghosting across the screen.
Great video ! People need to remember that DLSS / FSR / XeSS "ADD" performance with much better FPS. Which mean much better frametime and input lag that lead to smoother gameplay. When you can pull 30-40fps on native 4K/1440p/1080p in games you play and you can turn on DLSS to get much better fps or even stable 60fps that should be no brainer to turn on based on the 24 game comparison. With DLSS is edging closer to native in quality mode, FSR and XeSS catching up just mean better for costumer. The real problem just like the video point out, that new games release (hogwarts, deadspace, deathstranding, last of us) in native have much worse anti aliasing method implemented on their game in NATIVE. This feels the norm now that need more attention that developer making unoptimized game like when you can pull 14gb VRAM on the last of us just LOOKING AT A WALL.
I'd say it's a no brainer on Ties and DLSS wins. on Native+ it's the best tradeoff to get some performance, while on Native++ and Native+++ games you activate it only if you absolutely need, and potentially test lowering the quality settings first
You should see the native without TAA - you'll see jaggies everywhere and there will be shimmering on all the edges. There's a reason games use TAA....
6:40 ummm what is happening when it pans down on the spiderman brick wall with DLSS? It is hitching badly and looks so glitchy. If I saw that in game, I would immediately turn off whatever is causing it (so DLSS). Why wasn’t this incredibly obvious artifact mentioned? Am I confused here and missed you addressing it? Thx
Will you be making the same concept with new enhanced dlss super resolution which uses Transformer model? It would be such a good concept to see the evolution of dlss and whether it has been worth to use after all these years of development.
In some games I honestly prefer 4K native without any temporal AA/ similar techniques . FSR, DLSS and TAA when poorly implemented show lots of ghosting and shimmering which is super annoying
It bothers me that taa has become the goto AA technique. I realize it takes a lot less power than smaa, but it consistently looks like ass. Luckily, you can turn it off in most games.
22:10 While in theory automatic updates sounds good, as a developer - in practice, we never want to release an untested update. In normal development cycle (and most of the cases), after releasing version 1.0 active development stops and we move to a support phase - where we do not add any new features and focus only on fixing bug and glitches. Most of the developers move to another projects (which may be even DLC for that game, but that also count as a different project) and since there is less human resources, you can't just constantly update such an important library like DLSS (important since it directly affects visuals of the game). If something worked before it's not guaranteed that it will still work after update - e.g. there may be a particle effect at a waterfall which after update starts producing ugly artifacts, QA may not catch this (mostly due to reduced resources), but gamers surely will - that's a big risk. Risk which can be avoided by not updating that library.
That sounds pretty logical. And yet if upscaling techniques are being employed by more and more games and gamers and the version of the upscaling library directly affects the enjoyment of the game, it seems logical for a project management to allocate resources to keeping it updated after release, or at least evaluate an update some time after release. I'm not sure one visual glitch in a waterfall matters if the rest of the game looks better and is slightly faster than before.
If the result is better in 99% of the cases: Why not just always update and let the tested version as an option by default ? In many games, you can chose which version of dlss/fsr you want in the game settings. Players would probably love to see some minors updates after release. Especially when it doesn't require a heavy DL.
I’ve felt that native is the only true sense of what a gpu can do, but I can’t tell the difference in SOME scenarios. If I’m in the middle of a game, I’d say there’s only certain things like the fence and things flickering that would get on my nerves. So it seems like it’ll end up being a game by game thing for me as to whether or not I’d use the tech or just play with native rendering.
@@dante19890 I am basing my statement on what I see in the video. I haven’t yet compared them for myself in the games I play. I tend to play in native res without the frame generation on anyways. The one time I tried it, it wasn’t impressive to me, so I haven’t bothered since. Also, with whatever extra processing youtube does to these videos, and depending on the screen someone is watching on, it can be difficult to tell if I’m seeing things the same way. Especially since “seeing 4k” on a tablet vs a monitor or tv is dependent on those devices. What I should do is watch this video again on the pc and monitor I’d be gaming on. And I did say “SOME” scenarios. Not all. There’s a clear difference in many of the examples he showed.
@@benfowler2127 ye but either way its gonna be a little better, same or a little worse depending on the DLSS version and the game, but you are getting a huge performance boost so its a instant net win no matter how u look at it.
Problem is it's not just fences, it's bloom, rain, depth of field, air particles, god rays, wide color gamuts, OLED contrast, etc. on top of increased latency. There're lots of things AI can't do, and it certainly can't do native graphics better, just TAA better, when TAA is bad. Hitman in the rain being the worst offender here, even when DLSS is technically better as an AA solution it makes lighting look awful.
The dragon neon sign on Hitman is losing a lot of brightness and bloom compared to native. The ~1-2 pixel wide neon information is lost and thus doesn't produce the light it was supposed to. That's pretty impactfull in my opinion. It's completely extinguished/muted on FSR. This is something that will be noticed a lot in Cyberpunk, where 1-2 pixel elements (neon signs again) have to convey bright, contrasted information and is further enhanced by bloom post processing.
Yes, lighting effects take a hit when using upscaling. This is why I ran Death Stranding at native despite the shimmering, the lights look "wrong" when using DLSS. The more bloom light sources have the more it is noticeable. Neon lights often look like LED lights with DLSS.
There's a gamma shift and overal loss in color accuracy when scaling is performed, which is to be expected due to obvious reasons. And sadly this is not a problem most people, be it developers or consumers, are concerned about. Or what is actually sad is that I'm definitely not one of them as I'm very sensitive to such things.
@@Superdazzu2 Yes, it is. He is absolutely correct, from a technical perspective, on what the actual image processing techniques involved in the upscaling are doing. There is also a *side effect* that because it's starting with a lower resolution image, you get an FPS bump - but the fundamentals of the image processing that's going on, are, *unavoidably* that it's doing extremely computationally inefficient anti-aliasing.
@@Superdazzu2 no it doesn’t lmao, dlss3 the latest iteration of dlss actually REDUCES performance due to 10-15x the latency, fsr actually increases performance/quality. (Source - HUB, AdoredTV, NotAnAppleFan, PCworld etc)
You know, I had been playing newer games with DLSS quality activated at 4K and 1440p with my 4090 and 3080 respectively, as they came as default. They looked fine. However, a few days ago I went back to native and I was surprised by how much crisper everything looked.
Well, it's a trade-off. You easily get used to less crispy image, but the smoothness makes up for it and once that extra smoothness is gone you IMMEDIATELY notice it.
because AA sucks, when no AA (TAA, dLAA, DLSS) games look way sharper... but sadly these day you are forced to have at least taa in like 90% of games.. even thought at 4K i would personnaly rather have the slight shimmering than the smoothness.. basically thiscomparison is between TAA and DLSS/fsr, native doesn't mean s*** anymore..
incredible result for DLSS lets be honest, just being a tie to native would have been incredibly impressive but to pull wins out the bag on multiple occasions, wow.
That native image instability and flickering bothers me a lot once I notice it and DLSS Quality fixes it in most cases so I'm using it almost all the time. To me that looks better than native TAA and I can't notice the ghosting tbh + less power draw and lower temps with better performance so its a win-win for me. Even on my lowly 29" 2560x1080 ultrawide Quality mode looks good enough to me, usually I manually update the DLSS dll with the 2.5.1 version tho I'm yet to check the newer versions.
When I have performance left… I tend to do this trick… DLDSR 2.25x + DLSS Quality… I play on a 4K monitor, so the game would be rendered at 6K and DLSS Quality would make the game render internally at 4K and upscale to 6K… This makes the image look a lot sharper and better than regular 4K actually! Even on 6K Performance mode (which renders internally at 1620p), it usually looks almost like native 4K and lots of times even better than it by a bit.
It's called supersampling, also be used in video/anime industry. Render video at 4k and downscale to 1080p or 8k downscale to 4k, that's why you see those studio use 8k/12k high end camera.
Another use case for these upscalling techniques is to reduce the workload and power consumption on your graphics card. If it looks the same to you, just enable it.
That edition was "helpful-just-in-time". Getting these days frustrated with DLSS stuttering and lacking quality in FlightSim you mentioning the cpu limitation as deal breaker was very helpful. Enjoying the scenery low and slow with a real-life viewing distance means to run the simulation in a CPU limit more often then not. Your confirmation of the title looking much better in Native Res will have serious implications: the 7800X3d has to come to rescue. 😄💙PS: Regarding VRAM, the title (even in Mid Textures) uses more then 12GB ded. in dense areas.... the 3070 wasnt long in my system! Thanks
I would love if you tested DLSS balanced also. Would be nice to know whether it would be a good tradeoff or whether it in general is something to be avoided.
The reason I stuck with Nvidia last year was due to DLSS, not too bothered over Raytracing, just DLSS. I use a 50 inch 4k TV and my old 2060 super using DLSS got stable 60fps at 1440p. My 3070ti on Dying light 2 gets stable 60fps at 4k using DLSS. Only thing I don't like is the vram only 8gig on 3070ti, Nvidia messed up with vram.
I'm super surprised how close it is. My mind doesn't want to believe it can be this close while giving so much better performance... To the point I keep forgetting it. Brain just doesn't want to get the picture, it doesn't fit with my expectations. Thanks for such patience diving deep into these topics! Making these questions as close to objective, and best explained as possible. Super amazing work taking something like subjective visual quality in upscaling, and digesting it into something nigh objective! Your commentary about what game devs should do is also super concise and actionable. If I could give a gold star award to your channel, I would. I'm having to seriously consider donating or subscribing on Patreon, even though that's something I rarely (just about never) do. Thank you again for your tremendous efforts over the years. Gamers can be so much more informed now than any time I can recall in history. And it's due to dedicated reviewers such as yourselves.
A couple of things... is there a reason that the DLSS version of Miles Morales videos started stuttering at 6:35-6:47, and God of War did the same at 12:45-13:00. You'd think DLSS would perform much smoother than native resolution. Also, thank you for making a point about games updating their DLSS versions. Trying to update DLSS in RDR 2 is almost impossible since Rockstar Launcher ALWAYS check to see if files have been moderated every time you play, and even if you update the DLSS version, Rockstar Launcher will ALWAYS replace it with its older v2.2.10 file. If you try and deny write/edit changes to the file, Rockstar Launcher won't even launch the game; It will stay stuck trying to 'update' the file.
Well mate, you can always use EMPRESS edition for single player campaign of RDR 2, she has cracked a new version with DLSS so I would say F u c k off Rockstar launcher!!!! and enjoy my game anyway :-)
Considering DLSS is objectively rendering at lower resolution then attempting to upscale to native, it doesn't even make sense for it to ever be better than native.
Bro fsr suc*s. I use nvidia but most of the times i prefer native over dlss q especially in cyberpunk. 80% of the time i wud go with native n 20% of the time i wud go with dlss quality. Never fsr it is horrible
Can we take a second to talk about the jumpy vertical scroll at the end of spiderman using dlss. Is that an actual thing? Or trouble during editing/encoding?
I just pick 1440p dlss quality + sharpening all day vs forced TAA in latest titles where you cant even pick another AA (cp2077 as example). Dlaa is good alternative
@@mryellow6918 Main reason why I upgraded from 1080p...simply because no matter which settings you use you cant see anything. Blur mess vs pixelated mess. Lately my friend and me tried the Division 1 and we were astonished about AA implementation. Both 1080p and 1440p. I think we should blame devs fot AA at this point.
I used DLSS when first playing Cyberpunk on a 1080p ultrawide. Set a higher resolution than my panel and balanced it so it would render natively at the resolution of my panel; My panel would then downscale the higher game resolution image back to its native resolution. This gave me more distance clarity and a bit more stable image in Cyberpunk (that a month after release of CP, was before DLAA was an official thing in the driver or the game).
IMO when testing native using post-process TAA (temporal anti-alising) is not showing native at its best, TAA when compared to DLSS or FSR, lacks a sharpening filter, it slightly blurs the image instead and worse creates artifacts that wouldn't exist with no AA. MSAA (multi-sampling anti-alising) would be the best AA option to use for native, when comparing image quality, but unfortunately, modern games have no MSAA support at all and forcing it on the drivers, only works for older D3d9 (direct X 9) games and that is a real tragedy to me. You can of course, still use SSAA (super sampling anti-alising) in any game, by just rendering it at a higher resolution than the one you display at, and that has the best image quality of all but also destroys performance.
Pretty obvious that we see way more ghosting with DLSS that FSR in these examples. I can’t say I’ve noticed any “jump off the screen” problems with FSR yet (commented at 11:26 mark) as I watch.
In my personal experience I noticed that the newer DLAA anit-aliasing algorithm works incredibly well with solving flickering issues and making the image look high res. I think this test shows that the Nvidia AA algorithm on DLSS handles scenes much better than TAA for example hence why DLSS looks better than native in many cases. I wonder if we will ever get a test comparing DLSS with DLAA. DLAA native scenes should theoretically be better than DLSS.
Last of Us and Marvel's Spiderman look amazing even on a 24 inch 1080p screen with DLAA. If every game looked like that, 1080p resolution wouldn't be a problem.
Aren't we talking about antialiasing quality though? I can tell the difference in Hogwarts for example between DLSS performance and Native in terms of overall resolution, DLSS performance looks blurred in comparison. The aliasing is cleaner for sure but the image is noticeably lower res imo.
Standard TAA is often absolutely awful. I'll never understand how it was ever acceptable. DLSS (and fsr) often does a better job dealing with artifacts like ghosting, but obviously it can't compete with the sheer pixel count of native. A prime example is often flowing water. Transparencies often don't create motion vectors so any moving texture gets smeared like crazy with TAA, but dlss is capable of keeping the texture crystal clear in cases
It kind of depends. For a long time in M&B Bannerlord for example, even DLSS quality gives off major shimmering with water and ripples, and default AA at native looks significantly better. But later DLSS versions fixed some of that. What they should have done is to swap the DLL to latest version for all the comparisons.
@@zxbc1 for sure, it's not always better. Spiderman for example has very good temporal anti aliasing tech so these issues aren't as apparent. As for older dlss versions, it's extremely easy to replace the dll with a newer version so I don't really consider the older versions anymore
@@prman9984 To fix aliasing, like you would at any resolution. 4K doesn't somehow fix aliasing, it just makes the aliasing size smaller. For me it's absolutely required for a lot of games because I game on a 48" LG C2 at desktop distance, and I can see aliasing very clearly at this distance. If you game on a big screen in living room couch distance you probably won't need it, although for some games that have very bad aliasing, you will still notice severe shimmering even if you can't see the exact aliasing artifact.
@@zxbc1 I game on a big screen in 4K since 2016 and hate AA especially TAA with passion. It destroys image quality - its why I HATE these people that say DLSS is BETTER than Native because its NOT. It may be better (image quality) than Native with AA but NOT without.
I read a funny thing somewhere which i also think is true. When you want high visual quality, you enable highest settings possible right?! If your machine can't handle it, you reduce the video settings, right. All in native. If you activate the upscalers while selecting the high visual settings you are literally reducing the video quality. So why enable dlss/fsr instead of just keeping native with slight lower visual settings. By keeping native you aren't compromising anything in terms of video quality and you aren't demanding anything extra from the gpu either (by forcing a silly upscale calculation)
always update dlss to the newest version and test if its better. most of the time it will have great improvement in image quality (there are exceptions though)
This test is weird I don't understand. Instead of creating two graphs one where its FSR vs Native and DLSS vs Native you combine all 3 into one so that only DLSS is shown... why? What advantage does that serve us the consumers of this video? For non-RTX users (GTX, AMD, Intel) how do they know which games FSR is better than native in? The only technology they can use. That would be useful information. This test only benefited NVIDIA RTX users because you've left out useful results for everyone else. If FSR is ALSO better than Native in that game that's good information to know rather than simply just knowing what's best.
I would also like to see DLAA results and rendering the game at a higher resolution then down scaling it. I.e 4k to 1440p and seeing how those two compare to traditional TAA.
you people demand a lot without compensating them for it. what's stopping you from testing yourself? I'll give you a hint, downscaling pretty much always gives you the best image quality.
@@gozutheDJ if we were to follow your brilliant logic, this video would not exist. Hw does technical analysis that most people are not capable of, so it's interesting to see H.W opinion. about the lol reward the channel earns money with the videos by adsense
@@gozutheDJ It seems like you didn't understand anything I said. Hardware Unboxed goes much deeper into the differences in graphics than most people, in summary, the channel offers a much better analysis. And if someone were to do their own analysis on dozens of games, it would take dozens of hours, I know this because I analyzed 5 games. It's much easier to watch a video that provides a great summary of this issue than for a person to spend dozens of hours analyzing games. Is it difficult for you to understand this?
@@tvcultural68 because this is literally just content dude. its fun to watch but no one is using this to tune their games. because to tune their game they would just .... do it themselves. i have eyes. i dont need a channel to tell me what looks better to me. its all coming down to personal preference as i said, unless you just go along with what other people tell you is better. and also, your monitor factors heavily into this as well. and that's something they can't account for, they can only tell you how it looked to them on their monitors, but you might have a completely different result, so at the end of the day you STILL need to test it for yourself.
To be fair, I think the "better than native" really comes into place in scenes with limited motion. It could be an RTS game, like The Riftbreakers, a slow dialogue focused cutscene like the Witcher 3, puzzle games like Lego Builders or just when standing still, detail studying environments or characters in Cyberpunk. In those cases the temporal resolve has a super-sampled "better than native" quality to it. With FSR showcasing flickering instability with sub-pixel geometry (due to the feature lacking a temporal smoothing pass, which DLSS has. However, this extra pass often causes extra ghosting in DLSS as a result). There are so many components to upscaling, and there's just a lot of subjective opinion of the image quality that you can't or don't really need to quantify. For me personally I rather play Cyberpunk with FSR Quality over the native image at 1440p due to the super sampled temporal resolve in cutscenes, dialogues, or just when detail studying the environment. During motion it's close to native, but with added flickering and slight fuzz. which I don't mind when I get into the game. So even if flickering CAN be distracting, to me it isn't all that important, and I value the benefits of FSR, over its drawbacks. And that's something that you can't really quantify even in an image comparison, because it really depends on the scene, what you value, how much you are into the game and what best depicts the artists intended look for the game. and so on.
It would be good to point out that the image differs a lot based on the DLSS version which can be mostly replaced in a matter of minutes, for example the DLSS in RDR2 is trash but when replaced with version 2.5.1 it looks amazing.
unless there's a convenient method that scales for every owner (possibly GF Experience integration) or an always up-to-date table somewhere that matches game to dlss version this is an impractical solution that only a few enthusiast will research. Nvidia hasn't even been able to stay on top ReBar On/Off per-game settings as shown by HUB so not much to place trust on here.
He covered that with that exact game in the final thoughts. It maybe deserved a section of its own, as people probably don't expect new information to be provided in the final thoughts section.
It boils down to anti-aliasing, which can be pretty bad at native. Interesting that the FSR flaws to which Tim most often points are usually also present in the native/TAA presentation. This suggests to me that complaints about FSR are overblown. Yeah, FSR is worse than DLSS, but it's still pretty close to, and in rare cases better than, native quality. And given that modern games tend to have glaring issues otherwise--perhaps micro-transactions, brain-dead gameplay, bad AI/scripting, and/or things like persistent shadow/texture/mesh pop-in--given all of that, I have a hard time stressing out about minute differences in image quality. Still, I prefer to run native when I can.
Which for addition makes up additional point. Unlike DLSS, FSR works with just what it gets, not some magic box of trained data to mask lower resolution source. So, if native have shimmering and stability issues, FSR would get them as well, while sometimes exaggerating, sometimes suppressing. It's not that FSR is bad. It is just that DLSS creates stuff that wasn't there originally. It's also not a bad thing, but by itself it shouldn't make FSR worse technology. And cost of such creation (and reliance for TAA and AI) is ghosting (more for DLSS), or not fixing original shimmering (more for FSR). Probably that DLSS/FSR comparsion should've went with native as well (just not including it in results). Maybe overall conclusion would've not been so negative towards FSR. It is still inferior for multiple reasons, as, for example, there is no magic box to draw stuff from air with. But it doesn't "destroy" playing experience.
Agree. People forget how bad games looked 15 years ago. Its no longer about graphics or resolution. Gameplay is stagnating more. Physics implementations dont get enough attention nor audio. Worst offender is a.i. Most games have brain dead a.i especially if its a tacked on singleplayer with main focus being multiplayer. Which then make multiplayer bots suck too.
@@DimkaTsv I think they do have to train FSR 2 for games somehow, as it cannot be injected into any game like FSR 1 can be with the Magpie mod. I don't know how they do the training though. You make some very good points.
This just makes me hope that AMD eventually releases a version of FSR that utilizes AI to enhance the quality of the image similar to DLSS. Perhaps it might only work on newer AMD gpus, and would otherwise revert back to normal FSR. But knowing AMD it will also work on any gpu capable of utilizing tensor ML cores. That would be the true "DLSS Killer"
Interesting video. If anyone had asked me, I would never have thought any of the upscaling solutions would ever be better native. But it does make sense with titles that haven’t implemented proper high-end graphic features. Might even be a great way to visually overhaul older games if you can simply tack DLSS/FSR on to it, rather than having to update the core mechanics of the engine...
"Better than native" presentations are made possible due to shortcomings in the anti-aliasing solutions most games use, and it doesn't have anything to do with the games' age or graphical feature support. And DLSS/FSR2 require quite a bit of work on the engine side to get a good implementation of since these are temporal upscalers. You need extensive motion vector support at the very least. You do not simply tack these features on. And with motion vectors, you can also do good-quality TAA, which would reduce the need for DLSS/FSR to begin with.
It makes sense if you think about modern anti-aliasing techniques like TAA, which are essentially DLSS without (usually) the upscaling element, i.e., they look at past frames to reconstruct a smoother image than the native raster. Otoh I think those would be tough to implement in older games without some quite noticeable artefacts: what makes DLSS good is that it can look at depth information, motion vectors, and so on in order to create a coherent image, and old engines just don't have the capacity to communicate that info.
The fact alone that we got an antialiasing-technique now that is able to provide better image quality than native while at the same time giving a performance boost is just mind boggling. A few years ago i would have never belived this. It was either a mediocre performance hit with unbearably shimmering foliage and good geometry or an absurd performace hit (e.g. super-sampling) with an overall good quality while high-frequency textures still tended towards shimmering. DLSS is a real game changer.
whats your point? pause at 14.36 and you will see DLSS is much much better than native or FSR. Like the whole video tries to explain to you, every game is different and all of them have their ups and downs in image quality but FSR is not in the discussion, its trash, finding one example is desperation lol
I don't understand how updating from a lower resolution to a higher one can look better than the higher resolution running natively. That's like saying a blu-ray player upscaling to 4k looks better than an actual 4k player.
This is unreal... you guys are putting out informative in-depth videos so frequently from hardware unboxed, hub clips and monitors unboxed, I have difficulty catching up and watching all of them! I highly suspect DLSS 4.0 video generation trickery has been used =P
The questions that arise from this are: 1) Is the gain in visual quality when using DLSS quality, solely lies in the anti aliasing technic used in native rendering? i.e would compering dlss quality to 4k native with DLAA provide similar results as the ones in this video? I think not, but there are only a handful of games that support DLAA. 2) Does it matter? We already have concluded that "ultra" settings are not worth it, isn't dlss quality(or even lower) essentially similar to that? You get marginally worse image fidelity (50% of the time) and a massive 30-40%+ fps boost. 3) Is there any instance, where if you are forced to render (due to performance constrains of your gpu) on a lower than your monitor's native resolution, you shouldn't use DLSS and opt instead on relying on "dumb" gpu scaling or even letting your monitor to do the upscaling? for example, if I have a 4k monitor but i am only able to run the game at 1080p is there a case where it's better to render at 1080p instead of using dlss performance (internal 1080p rendering) ? Building on this: If I have the option of buying a 4k monitor or a 2k monitor, and money is not a concern, even though i don't have the hardware to run 4k native games but can run 2k native, why should i trouble myself with that and not just get the 4k one and use dlss to render at aprox 2k, effectively rendering the big old question of "who buys 4k when its so hard to run" irrelevant (notice that the difference in price of 4k screen as compared to a 2k one is much lower than the difference of a 4k vs a 2k capable gpu + a screen "ages" much slower than a gpu.
in my opinion the boost in FPS is ALWAYS worth using DLSS in quality at minimum. sure in some games the ghosting will be noticeable but with higher fps you see more detail than at a lower fps and higher (native) res.
@@2xKTfc Ghosting can be fixed with a simple DLSS dll replacement, unless you are on multiplayer competitive title.... which you would probably have plenty of FPS to begin with.
@@2xKTfc I've got a g-sync monitor so tearing isnt exactly an issue for me. It's up to you if you think playing at a lower framerate, with blurry motion and higher input lag is worse than a small amount of ghosting.
Im interest about the Wattage used in all modes as well in times like we have today. Hope we can see a bit more information as well in the modes. Thx anyway for the good work :)
That's very interesting. I didn't expect DLSS or FSR to match and even beat the native rendering. I guess it's because the TAA is not a great anti-aliasing technology... and hey, I'll take free performance with a better image any time! Thanks for the detailed look!
@@robertcarhiboux3164 most people hate the jaggies, including me. It just looks so bad that I can't stop looking at the jaggies moving around over edges instead of enjoying the game
@@damara2268 TAA is blurry and even worse than jaggies. I prefer MSAA but when the game does not have it I use DSR 4.0x when my GPU can handle it (with minimum smoothing factor like 6%). I also like very much the hitman resolution upscale. It's the only game in which it really improved the sharpness of the image with no particular downside. Even performancewise it was better than DSR.
Its true in some games that do not have an anti aliasing solution. For example, nioh 2 on PC looks better because the game has no anti aliasing and needs dlss to clean up the image.
When hes staying "Stability" are we speaking about anti aliasing flickering? because it seems rather confusing talking about stability because its often associated with camera/display shaking.
First off I just wanna say I really appreciate the effort you guys put into these videos to provide us with such in depth analysis and information on topics. That said, in this instance I think a more useful and widely applicable analysis would be to compare the image quality of native 1080p or 1440p to that of higher but upscaled resolutions, at quality settings where the upscalers render at or close to 1080p or 1440p respectively. For example: - Native 1080p vs 4K Upscaled on Performance (1080p render res) - Native 1440p vs 4K Upscaled on Quality (1440p render res) - 1440p Upscaled on Performance vs 1080p Upscaled on Quality (720p render res) I think these comparisons could be more useful because most gamers have hardware that actually allows them to use those setting combinations at playable framerates, whereas very few actually have the hardware that affords them the luxury of "choosing" between native 4K and upscaled 4K. Dlss/fsr were always meant to be tools to boost performance at a given res(or enable playable fps at higher res) while minimizing losses in image quality, more so than tools to outright boost image quality itself. Personally from my own sample size of 1, I have found that running games like MW2, Cyberpunk and Hogwarts at 2160p Performance or even 1620p Quality(also ~1080p render res) actually produces a more stable and sometimes better looking image than native 1080p, particularly in Cyberpunk. This makes the ~5-8% fps hit of running 2160p Performance Mode worth it over native 1080p. It would be nice to see what you guys think about this and if others experience the same thing.
Absolutely! Originally DLSS was there to take you to 4K with a GPU that wasn't powerful enough to render native 4K. Now if you're running a 4090, sure you can choose native vs DLSS, but that's a far cry from most people's need. Native 4K doesn't perform nearly the same as DLSS 4k.
@13:43 Pretty sure the rain shouldn't be THAT obvious, I don't remember white streaks falling down from the sky when it rains, it's much more subtle than that. I think you guys are off there, there's this thing called "too much of a good thing". And both upscalers seem to exaggerate the rain visuals.
@@prman9984 DLSS improves performance so I don't see how it could be stuttering when native doesn't, especially since Tim didn't mention anything as he tested it & would have felt it. Ig it could just be a bug while capturing gameplay.
Better is subjective. What bothers you more? Blurry? jagged edges? Flickering fences? ghosting? Sharpened/oversharpened? SHimmering? Every one of these might have s different weight for each of us. I personally find DLSS fine on quality most of the time, and if I need it more aggressive than that then I drop resolution. But at times when flickering or ghosting is introduced I simply drop resolution instead. I can bear sharpening problems and blurriness, that's why I prefer resolution change to the other artifacts. However I don't play much of these games. A lot of these have bad TAA by default.
hard to cross reference same area unless you pause, I'd rather see more stills than playback for fine detail, but playback also has its place to highlight image stability and shimmering. also YT compression doesn't help.
I have a theory about some of your test data, for Spider-Man for example. I wonder if the game engine and texture packs are only optimized and designed for 1080p and 2160p. So when the game renders natively at 1440p, it’s just using something like a checkerboard technique to up the image to 1440p, whereas DLSS can upscale to 1440p from 1080p (quality mode) using its magical mathematical algorithms, which would also explain why the native 4K presentations were better than DLSS upscaling from 1440p, because the game engine already adjusted the image from 1080p to 1440p prior to DLSS then upscaling from 1440p to 2160p, which could explain why in Spider-Man and last of us, 4K native was so much better in it’s fine details then DLSS/FSR but native 1440p was much closer to DLSS/FSR. 🤷♂️🤷♂️ maybe this is something to look at in certain games, do certain resolutions offer up certain pros/cons and do some of those go away with certain DLSS/FSR settings ?? Is this just a DLSS/FSR issue, or is it also a possible game engine issue? Do some developers just not take the time to properly develop 1440p textures and details, which would make the DLSS quality mode using an already less optimized image ??
Since I have a RTX 4060 in my G14 I don't have the luxury of running modern titles maxed out at 1080p and up, but I have been seeing a noticeable improvement modding DLSS 2 titles with DLSS 3 at 1080p, which is where the differences of DLSS will be more prominent. So far I've tried with Metro: Exodus and Hellblade: Senua's Sacrifice. I'm hoping it's not a placebo effect but would like to see a review on the matter to rule it out.
There is reason why VR games use MSAA, it simply provides far superior fine detail compared to any of the temporal AAs. Also, while MSAA is expensive, it's NOT the same thing as super-resolution and is not as expensive. Any time I've accidentally turned on TAA in a VR game it made the game terribly blurry. In motions, TAA and DLSS get noticeable blur penalty compared to no change in native. MSAA further adds detail and reduces aliasing. I wonder if we can find a game with both MSAA and DLSS because MSAA is the actual final boss of clarity.
I use 4k + DLDSR 1.78x with DLSS P. It runs better than native and worse than 4k DLSS Q but looks miles better than DLSS Q and better than native. Using a 4090 so VRAM is no issue Edit: So it renders in 1440p (dlss p) -> 5k (dldsr) -> 4k Instead of 1440p (dlss q) -> 4k
I can't really tell the difference ,but on my 1080p monitor , I prefer using 1620p+dlss quality(Dldsr) than 4k+dlss performance(dsr) same being upscaled at 1080p res.
FSR seems to always have a shimmer going on. I have both setups (5900x 3060) and (5800x 6700xt), and the FSR shimmer is really obvious in some games like RE4.
I think you might have gotten your pics mixed up for the Spider-Man comparison as the hair on the DLSS Quality mode looks betther than the native 4k pic @5:09
This was a very needed video for quite some time, a lot of people still think Native is always better, but it's not, and even in games where it really is, most of the time as shown in the video, it's just slightly better, and in my opinion, the boost in fps is well worth enabling dlss quality in these cases.
Good video but 1. There has to be something wrong with the Death Stranding results, I played it recently and the TAA implementation was the best I've seen in years. Maybe it's bugged in the Director's cut? And 2. We seem to have gone backwards in regards to AA quality. I remember playing Dead Space 3 and Battlefield 3 on a 720P TV and the image quality was spotless for both with just post-process AA. DS3 used SMAA, not sure about BF though. At least we've moved on from the hot garbage that was FXAA, 90% of the time the raw image was better (TAA isn't radically better but to its credit, it is "low-cost" and usually better than not having any AA)
AA is not the only problem, a lot of modern games are running screen space reflections, ambient oclusion, and other ''game parts'' on half or quarter resolution compared to native, this can look ok on 1440p, but not below that, in some modern games like Ghostwire: Tokyo, reflections break on 1080p. In Cyberpunk 2077 Digital foundry has noticed that on 1440p you could lower reflections to get better performance, but on 1080p you have to set reflections at least on high for them to look good. Metro Enhanced looks extremely blury on my 1080p monitor. but with DLSS/DLDSR 2880X1620 balanced looks amazing for same performance as native 1080p without DLSS.
I would asume that a native 4K image is better than an upscaled 4K image. The question is how much better (or not) looks a 4K Quality to a native 1440p. Because that is what 4K will render in Quality. Or 4K Performance vs 1080p native. So do you gain a better image with 4K even if you still use Performance compared to staying at native 1080p or native 1440p?
In my experience, 4k quality always gives way better resolution of details compared to native 1440p. That is the best use case of dlss, getting pretty close to 4k visuals with 1440p performance cost. I wouldn't expect 4k dlss quality to be better than native 4k unless native has a bad aa technique in use.
@@WhiteCrowInteractive which is why I always turn off AA tech on 4k. At least for me it isn't necessary and is like putting Vaseline on your camera's lens.
You can't compare 4k dlss quality to 1440p native, as you will get significant less fps. To my eyes, 4k dlss balance upscaled resolution is about 1440p native. 1440p native will have more fps than 4k performance WITH better image. Same thing to 4k performance is not relevant to upscaled 1080p, they are marketing scam. At least to what I've tested in cyberpunk2077.
@@WhiteCrowInteractive Because 4k quality in practical, not upscaled from 1440p, maybe around 1850p, ultra quality about 2000p. These are marketing scam. Look at the actual fps you will get from both 4k dlss quality and 1440p will get you the answer.
Fake image is still fake image, The GPU card I have can render the graphic the way it was meant to be immersed in or it just can't... not using fake stand-in. Sorry but I'm sticking with native.
for some people over sharpened image with denoised textures with low flickering is better than non sharpened images with normal textures with flickers only seen when zoomed too far. But for me the later is better.when ghosting is minused image looks little bad at the edges
Can we just take a moment and appreciate that this kind of upscaling is even possible? I still remember being blown away when DLSS 2 came out. It looked like magic to me honestly.
this was very interesting. see i play mostly at 1440 because i like high fps and ive found i really like the sharpness that dlss or even fsr can give. like hogwarts and rdr2 and a couple others im not really thinking of imo just look better at 1440 than taa
Amazing job on the video! However I can't really see a difference between the 3.😂 So if the game performs well at Native Il'play on Native. If it needs extra push then I go DLSS / FSR Quality.
Sure, if you don't care about smoothness and just want to take snapshots all day I can see that being ok. I use DLSS for smoothness and cause the flickering in native drives me nuts.
@@huggysocks I agree. Upscalers help a lot with 1% lows also. And yea it smooths out the expirience so sometimes it's good even if you have enough frames at Native.
Cool video! I've been wondering about this for some time so it was useful to me. I also noticed that DS was far better looking with DLSS enabled. The aliasing with native was atrocious but with DLSS it looks fantastic. HU please do this grading for all games in existence so we can just look up the game we're gonna play in a database and set our game accordingly.
Guys, I'm shocked by the scale of the work done. Thanks a lot. I would also like to see a comparison of DLAA with other AAs in native resolution
DLAA would also removed the "bad AA implementation" bias when comparing native to upscaling methods
DLDSR 1.78x +DLSS Quality (1.78x * 0.444% = about 80% the amount of native pixels) is a close contender to DLAA with similar ±performance in case DLAA is not available (although there now seem to be mods to hack it in).
In my experience DLAA tends to ghost quite a bit on around edges
@@bluesharpie9744 I tried dldsr 2.25 for Death Stranding director's cut and it wasn't good. Not as bad as TAA alone but still a lot behind DLSS alone.
PS: it's only one game in 1080p, I have test every game
DLAA will always be better
Old comment I know, but DLAA is literally just DLSS Quality except a little bit better under a microscope. It's technically the best image quality possible, but I think most people would argue it's not worth the FPS loss compared to Quality (unless you're at 1080p, as at that output even Quality dips too low in terms of internal pixels)
Oh and the ghosting issue was fixed with Preset F in recent versions, there was a period between 2.5.1 and 3.6 (iirc) where the default preset caused ghosting in small objects, but they have since fixed it. You can use DLSS Tweaks to force a preset in any game.
This is really a test of bad anti-alias has become in games. So much that resolution scaling becomes the only means of avoiding shimmering jaggies.
havent played that many modern games recently but good lord war thunder has a really bad problem with anti aliasing. Its so bad, SSAA is mandatory. Even with that, theres so much noise you will have a real hard time spotting enemies. TAA helps too but certain texture still blinks though its not as bad as it was previously. You have to sacrifice so much performance
@@quantumsage4008 TAA is worst thing to happen to games, every single TAA implementation makes images noticebly blurry
@@CC-vv2ne So add sharpness, what's the issue? You seen TAA in R6 with a bit of sharpness (right under the TAA)? It looks CRYSTAL CRISP. It's AMAZING. I wish all games had such good anti-aliasing and sharpness.
@@Dyils Ubisoft is usually among the best technically.
It would be like saying DX12 has not been a severe issue because WD Legion managed to make dx12 running better than dx11.
TAA is usually badly implemented, R6 isn't a proof that most games could achieve the same thing.
@@CC-vv2ne FH5 is the one example I can think of where TAA looks way better than even MSAA. But it is more of an outlier
This gets interesting when some game devs enable DLSS as an antialiasing solution (DLAA) while retaining the native render resolution.
The Finals enabled DLAA for its beta and it looked incredible
don't wait for developers to learn the nvidia sdk. Instead you can use the project dlss tweaks to enable DLAA into any game with the nvidia dll.
Yup, dlaa looks really really good with native (what I use with Hogwarts legacy )with a 27" 1440p monitor. I found dlss quality looks quite bad in Hogwarts, noticeably softer and blurrier, especially indoors.
I did not realize it was possible to enable on your own for any game with dlss, will have to try that in other games.
Dlaa looks absolutely horrible to me...
@@SweatyFeetGirl Too soft on lower resolutions, unlike msaa which works great
You have to disable then re-enable AA in Death Stranding Directors Cut edition each time you launch the game, otherwise it'll be without AA, even though in the graphics menu it says it's on. The standard Death Stranding didn't had this issue.
Wow, great observation. Doesn’t change the overall result but the devil is in the details.
Who fucking plays walking simulator!!
I knew something was wrong with the results. I played the standard edition of DS a few months ago and thought the TAA implementation was actually good. And I played at 1080P!
@ people without legs?
@@starcultiniser Imagining it, is its own journey
Awesome information. This channel has been especially hitting it out of the park with valuable and hard to find information. I know these videos are a ton of work, and I'm hoping others can appreciate and find this content as helpful and useful as I do.
Yeah for real, and these guys are truly independent and not willing to shine a turd. Like it'd be funny watching them seeing a GTX 1630 in action for the cost. It confuses me because I think they're pretty much fair, and rightfully slammed teh 6500XT as being a garbage product, yet lots of butthurt fanboys had to cope about what seemed like blatantly obvious common sense to me 8gb was not enough. I'm confused by techpowerup though it says a 6500XT performs the same as a 5500XT? Why really?? Funny because mostly AMD was the best brand last gen, except that one stinking turd. I automatically expect Gamersnexus and Hardware Unboxed to be utterly unswayed by corporate bullshit artists.
Meanwhile Lius techtips uses 1/20th of the effort to make a video, but includes some hand held camera movements, some gpu dropping, some sponsourshots, some jokes .
I will never understand why the DLSS or FSR files are not updated... there are really big differences in quality and it's not a big job for the DEVs. but the result is all the greater.
what you mean updated? to lastest versions or implemented in other games?
you can't just boot the game and test a couple scenes to verify it works correctly, there might be a specific area or shader that doesn't play well with it, it's not that straight forward
@@DavidFregoli I agree but I personally use the latest DLSS version with the games that support DLSS but are stuck with older version (eg. RDR2 ) and the game looks native with more fps. DLSS 2.5.1 or any 3.x.x looks so good
As a developer, it comes to : there is no ticket for that. Anyway, the nvidia SDK states that the dll is drop-in replaceable and that is a major feature of their sdk. Even have an update function you can call but nobody ever does.
@@TechTusiast Also dont get confused, NVIDIA DLSS DLL 3.1.11 is not DLSS 3 which we know.. Its DLSS 2 only but with version name 3.1.11
I honestly just assumed going into this video that Native > DLSS and FSR in terms of quality, so it was pretty interesting to see how it resolved some issues with some games like DS. Great video!
That's not always the case sometimes dlss 2 will have better image Integrity than native
in fact its true from this video you can see the only thing that fsr and dlss do better is antialiasing
Well you assumed correctly, because Tim makes a common mistake, Native > Upscaling (ANY), but DLAA > AA (ANY), because all AA is trying to imitate DOWNscaling.
Render @ 8k, Display @ 4k gives you the best AA you can get, at cost of huge performance hit, and every AA ever invented tries to mimic this taxing AA method, without high FPS hit.
DLAA is training the AI on Downscaling, to do better AA, without upscaling the render. So all this video shows is what we already know DLAA is "Better than any other AA.
This is worst video Tim has ever made, especially as we can now force DLAA, in any game that has DLSS support (How? See my other posts in these comments).
Unbelievable content. Need to continue to showcase games relative performance between DLSS, fsr and native twice per year to keep up with the updates and new games coming out.
would love to see DLDSR together with DLSS added at some point in the future, i often use both together at 1440p and it further reduces flickering/aliasing.
Yeah I play on 1080p plasma and the picture is just perfect when using DLDSR and DLSS quality combo.
I love DLDSR even at 4K. It performs even better than native and looks significantly better in most titles.
@@8Paul7 so you DLDSR it to 4K then do DLSS performance on it or?
@@Lollakuk you can't dldsr to 4k on a 1080 screen, but yes the idea is correct
@@impc8265 you definitely can. I have DSRd to 2160p on my 1080p monitor.
Maybe instead of or in addition to further optimizing DLSS and FSR image quality, maybe developers should just spend more time properly fixing the native rendering in many cases... Basically every single time DLSS was better, it wasn't because it was so good, but because native TAA just was so bad. We have seen that before with stuff like AMD's CAS which seemed to work just a lot better than what ever developers built themselves...
@@jkahgdkjhafgsd it does absolutely not, not even on a 4K monitor
@@jkahgdkjhafgsd Even on a 140ppi monitor jagged edges looks worse than everything, I'd rather take bad TAA that flickers a lot, but it's better than literally everything looking absolutely awful... If you want something good, look at something like SOTTRs SMAA + TAA implementations. They are expensive, but it works wonders, not a lot of flickering and other artifacts, just plain solid.
No anti aliasing is almost perfect during movement, it absolutely slams the clarity of even dlaa. Aliasing patterns are a problem because the pixels are only rasterized in the centers, which is pretty wasteful. It's better to rasterize random locations inside the pixels, which also randomizes the aliasing patterns and makes them a lot smoother in real time. To improve the clarity even further, foveated supersampling/downsampling is needed, on top of backlight strobing/black frame insertion and eye movement compensated motion blur, which make 100 hz look and feel like 1000 hz in a non destructive way
@@jkahgdkjhafgsd dlaa
I noticed that some DLSS version give you heavy ghosting. If i notice this in a new released game I replace the DLSS file with 2.5.0 or 2.5.1 version.
Where to get those dlss files?
I like those versions too! It's gives the sharpest and the least ghosting image.
Latest is 3.1.11, why not use that?
I use the 3.1.11 dll with DLSSTweaks to change the preset to C (preset used in 2.5.1). So it's even better than 2.5.1!
@@jemborg Precisely for the reason stated, duh. "Some DLSS versions give you heavy ghosting"
I think you should consider testing more games with manually updated DLL file for the lastest version of DLSS. Since as you said, just by updating its version can change the result from worse than Native to match or even better, so I wonder if this is a trend across the board, and since this is a comparison of DLSS vs Native, I think it's fair to use the best and lastest version of it, since it can be manually done by users for free, not depending on games developers to implement. If the results are indeed positive for all, then it's gonna be truly a game-changing technology, since it'd be a truly free, stable and noticeable performance boost toggle that comes at no cost, and might even improve visual, simply a much better solution than overclocking since it's not dependent on silicone lottery.
Couldn't agree more
How do you manually update that?
24 games, that's a lot of work) great job, Tim!
Some other channels could learn from your image quality. Always sharp and clear 4k. Probably the best looking channel I watch. Why do other channels not look this good? Lol. Keep it up!
Biased. They're all the same.
DLSS has a wonderful effect on surfaces that get shimmering lighting/shading or shimmering/bubbling screen space reflections that is pretty common on games of the last 3 years, my guess is that the temporal effect of it smooths it out, in sometitles it's very obvious like Cyberpunk 2077 where this kind of shimmering appears on a lot of surfaces. I guess theoretically a good TAA setting would be able to do the same, but it would have to be able to understand know what kinds of objects it should do it to, or it would cause ghosting... i guess this is where DLSS comes in, since the machine learning can make it understand what parts should reasonably not be shimmering... and modern 3.1 dlss is very low in noticeable ghosting anyway...
also now with dlsstweaks which is a fabulous tool, and now it even comes with a UI program for people who might feel intimdated by editing the text files directly.
With this, you can set your own internal resolutions... i've been playing around with it a lot... most games can run on really weird internal resolutions, like rendering at 3:2 3240x2160, which i've tried for 4k, so it is DLAA vertically, and DLSS horizontally... in games where you would need just a bit more performance with dlss but you are not satisfied with the "quality" option... or in games where you want something in between the available settings. I hope in the future this can be baked in as a slider in games, so that instead of having a number of options "quality, balanced, performance and ultra performance", you can just have a slider from full internal 1x resolution (DLAA) down to even below the ultra performance (x0.333333) level... I've had great use of this tool to max out for example The Last Of Us Part 1, so that I can set a resolution just high enough to look very close to native (so above the DLSS quality setting, which is x0.6666667) but still fitting within my VRAM limitations for a certain set of settings, that i wouldn't have been able to do if i ran it at native. DLSS really saves the day and i've tried using FSR and XeSS to see if they are any better in any game, but i haven't yet found any case where they look better. Having gone over to 4k tv for gaming and content viewing recently, I wouldn't even be able to run a lot of the games i like in 4k without dlss, it's really the main thing that would keep me from thinking of going over to AMD cards if i was looking for a new graphics card now.
Great vid as always- There's also this strange micro stutter and ghosting in the DLSS my eyes can't unsee it, especially when it's scrolling past something, it's not smooth. 11:39 In hogwards legacy, the freaking coat judders and all the small things that move around like birds have massive ghosting. 12:44 judder judder judder. 13:35 rain ghosting, the rain is literally ghosting across the screen.
Great video !
People need to remember that DLSS / FSR / XeSS "ADD" performance with much better FPS. Which mean much better frametime and input lag that lead to smoother gameplay. When you can pull 30-40fps on native 4K/1440p/1080p in games you play and you can turn on DLSS to get much better fps or even stable 60fps that should be no brainer to turn on based on the 24 game comparison. With DLSS is edging closer to native in quality mode, FSR and XeSS catching up just mean better for costumer. The real problem just like the video point out, that new games release (hogwarts, deadspace, deathstranding, last of us) in native have much worse anti aliasing method implemented on their game in NATIVE. This feels the norm now that need more attention that developer making unoptimized game like when you can pull 14gb VRAM on the last of us just LOOKING AT A WALL.
I'd say it's a no brainer on Ties and DLSS wins. on Native+ it's the best tradeoff to get some performance, while on Native++ and Native+++ games you activate it only if you absolutely need, and potentially test lowering the quality settings first
VRAM is reserved, it doesn't matter if you look at a wall or look at a very filled place
VRAM already loaded as soon as you started the game, it dozen matter if you just staring at a wall or having intense gunfights.
You should see the native without TAA - you'll see jaggies everywhere and there will be shimmering on all the edges. There's a reason games use TAA....
6:40 ummm what is happening when it pans down on the spiderman brick wall with DLSS? It is hitching badly and looks so glitchy. If I saw that in game, I would immediately turn off whatever is causing it (so DLSS). Why wasn’t this incredibly obvious artifact mentioned? Am I confused here and missed you addressing it? Thx
Will you be making the same concept with new enhanced dlss super resolution which uses Transformer model? It would be such a good concept to see the evolution of dlss and whether it has been worth to use after all these years of development.
Thank you for all your effort in producing these comparison videos it is much appreciated
Very well done balanced comparison! Thanks for the timely update on where we are at with the technologies.
In some games I honestly prefer 4K native without any temporal AA/ similar techniques . FSR, DLSS and TAA when poorly implemented show lots of ghosting and shimmering which is super annoying
In most games.
yes that's the point of the video
@@DavidFregoli No he compares native with the best AA which is typically TAA, if you actually listened to the video lmao
It bothers me that taa has become the goto AA technique. I realize it takes a lot less power than smaa, but it consistently looks like ass. Luckily, you can turn it off in most games.
Glad to see people are waking up to TAA looking like garbage.
22:10 While in theory automatic updates sounds good, as a developer - in practice, we never want to release an untested update.
In normal development cycle (and most of the cases), after releasing version 1.0 active development stops and we move to a support phase - where we do not add any new features and focus only on fixing bug and glitches. Most of the developers move to another projects (which may be even DLC for that game, but that also count as a different project) and since there is less human resources, you can't just constantly update such an important library like DLSS (important since it directly affects visuals of the game).
If something worked before it's not guaranteed that it will still work after update - e.g. there may be a particle effect at a waterfall which after update starts producing ugly artifacts, QA may not catch this (mostly due to reduced resources), but gamers surely will - that's a big risk. Risk which can be avoided by not updating that library.
It helps if you make a backup of the original dll before you update... in case it looks worse.
That sounds pretty logical. And yet if upscaling techniques are being employed by more and more games and gamers and the version of the upscaling library directly affects the enjoyment of the game, it seems logical for a project management to allocate resources to keeping it updated after release, or at least evaluate an update some time after release. I'm not sure one visual glitch in a waterfall matters if the rest of the game looks better and is slightly faster than before.
Outdated thinking
If the result is better in 99% of the cases: Why not just always update and let the tested version as an option by default ? In many games, you can chose which version of dlss/fsr you want in the game settings.
Players would probably love to see some minors updates after release. Especially when it doesn't require a heavy DL.
I’ve felt that native is the only true sense of what a gpu can do, but I can’t tell the difference in SOME scenarios. If I’m in the middle of a game, I’d say there’s only certain things like the fence and things flickering that would get on my nerves. So it seems like it’ll end up being a game by game thing for me as to whether or not I’d use the tech or just play with native rendering.
in terms of judging how performant a card is, 100% native. in terms of playing and enjoying a game, whatever looks best to you.
Why would u sacrifice performance if you can't tell the difference anyway.
@@dante19890 I am basing my statement on what I see in the video. I haven’t yet compared them for myself in the games I play. I tend to play in native res without the frame generation on anyways. The one time I tried it, it wasn’t impressive to me, so I haven’t bothered since. Also, with whatever extra processing youtube does to these videos, and depending on the screen someone is watching on, it can be difficult to tell if I’m seeing things the same way. Especially since “seeing 4k” on a tablet vs a monitor or tv is dependent on those devices. What I should do is watch this video again on the pc and monitor I’d be gaming on. And I did say “SOME” scenarios. Not all. There’s a clear difference in many of the examples he showed.
@@benfowler2127 ye but either way its gonna be a little better, same or a little worse depending on the DLSS version and the game, but you are getting a huge performance boost so its a instant net win no matter how u look at it.
Problem is it's not just fences, it's bloom, rain, depth of field, air particles, god rays, wide color gamuts, OLED contrast, etc. on top of increased latency.
There're lots of things AI can't do, and it certainly can't do native graphics better, just TAA better, when TAA is bad.
Hitman in the rain being the worst offender here, even when DLSS is technically better as an AA solution it makes lighting look awful.
The dragon neon sign on Hitman is losing a lot of brightness and bloom compared to native. The ~1-2 pixel wide neon information is lost and thus doesn't produce the light it was supposed to. That's pretty impactfull in my opinion. It's completely extinguished/muted on FSR.
This is something that will be noticed a lot in Cyberpunk, where 1-2 pixel elements (neon signs again) have to convey bright, contrasted information and is further enhanced by bloom post processing.
The intensity of bloom is, sadly, often related to the native resolution, so when that drops, you get less bloom...
Yes, lighting effects take a hit when using upscaling. This is why I ran Death Stranding at native despite the shimmering, the lights look "wrong" when using DLSS. The more bloom light sources have the more it is noticeable. Neon lights often look like LED lights with DLSS.
There's a gamma shift and overal loss in color accuracy when scaling is performed, which is to be expected due to obvious reasons. And sadly this is not a problem most people, be it developers or consumers, are concerned about. Or what is actually sad is that I'm definitely not one of them as I'm very sensitive to such things.
good call i didnt even notice that.
@@juanblanco7898 you can't adjust gamma settings in game?
DLSS is just really good anti aliasing, that's why they added DLAA, which is just DLSS without upscaling
This
no it's not lol, it also gives back huge chunks of fps, ranging from 20+ (quality) to almost double with performance
@@Superdazzu2 Yes, it is. He is absolutely correct, from a technical perspective, on what the actual image processing techniques involved in the upscaling are doing. There is also a *side effect* that because it's starting with a lower resolution image, you get an FPS bump - but the fundamentals of the image processing that's going on, are, *unavoidably* that it's doing extremely computationally inefficient anti-aliasing.
@@Superdazzu2 no it doesn’t lmao, dlss3 the latest iteration of dlss actually REDUCES performance due to 10-15x the latency, fsr actually increases performance/quality. (Source - HUB, AdoredTV, NotAnAppleFan, PCworld etc)
@@Superdazzu2 Yeah because DLSS is applying that anti aliasing to a lower resolution, instead of native
You know, I had been playing newer games with DLSS quality activated at 4K and 1440p with my 4090 and 3080 respectively, as they came as default. They looked fine. However, a few days ago I went back to native and I was surprised by how much crisper everything looked.
Well, it's a trade-off. You easily get used to less crispy image, but the smoothness makes up for it and once that extra smoothness is gone you IMMEDIATELY notice it.
because AA sucks, when no AA (TAA, dLAA, DLSS) games look way sharper... but sadly these day you are forced to have at least taa in like 90% of games.. even thought at 4K i would personnaly rather have the slight shimmering than the smoothness.. basically thiscomparison is between TAA and DLSS/fsr, native doesn't mean s*** anymore..
@@beachslap7359 I'm already getting well over 60 fps. I don't care if I get 80 or 120.
@@sebastien3648Suck? You mean look sharper when you take a screen shot and look at the resulting image for 5 minutes?
Graphics in games are brilliant.
@@cptnsx Calm your tits. I wrote "to mitigate", not "to fix".
incredible result for DLSS lets be honest, just being a tie to native would have been incredibly impressive but to pull wins out the bag on multiple occasions, wow.
That native image instability and flickering bothers me a lot once I notice it and DLSS Quality fixes it in most cases so I'm using it almost all the time.
To me that looks better than native TAA and I can't notice the ghosting tbh + less power draw and lower temps with better performance so its a win-win for me.
Even on my lowly 29" 2560x1080 ultrawide Quality mode looks good enough to me, usually I manually update the DLSS dll with the 2.5.1 version tho I'm yet to check the newer versions.
When I have performance left…
I tend to do this trick…
DLDSR 2.25x + DLSS Quality…
I play on a 4K monitor, so the game would be rendered at 6K and DLSS Quality would make the game render internally at 4K and upscale to 6K…
This makes the image look a lot sharper and better than regular 4K actually!
Even on 6K Performance mode (which renders internally at 1620p), it usually looks almost like native 4K and lots of times even better than it by a bit.
What is DLDSR?
It's called supersampling, also be used in video/anime industry. Render video at 4k and downscale to 1080p or 8k downscale to 4k, that's why you see those studio use 8k/12k high end camera.
@@shockwav3xx opposite of upscaling…
It downscales the game instead, this gives a sharper and crisper image than native.
I often use 1.78x which already looks great
@@Dionyzos Thats what I am wondering is it better to do 1.78x with Quality or 2.25x with Balanced/Performance etc..
Another use case for these upscalling techniques is to reduce the workload and power consumption on your graphics card. If it looks the same to you, just enable it.
It increases the cpu usage though. Some people shouldn't if they are using ancient hardware.
@Eternal Being33 A GPU consumes more power than CPU. You can also combine it with frame generation which will give it breathing room.
That edition was "helpful-just-in-time". Getting these days frustrated with DLSS stuttering and lacking quality in FlightSim you mentioning the cpu limitation as deal breaker was very helpful. Enjoying the scenery low and slow with a real-life viewing distance means to run the simulation in a CPU limit more often then not. Your confirmation of the title looking much better in Native Res will have serious implications: the 7800X3d has to come to rescue. 😄💙PS: Regarding VRAM, the title (even in Mid Textures) uses more then 12GB ded. in dense areas.... the 3070 wasnt long in my system! Thanks
than*
I would love if you tested DLSS balanced also. Would be nice to know whether it would be a good tradeoff or whether it in general is something to be avoided.
Depends on your starting resolution. At 4k balanced works well. 2k it's ok. 1080p I'd stick to quality
The reason I stuck with Nvidia last year was due to DLSS, not too bothered over Raytracing, just DLSS. I use a 50 inch 4k TV and my old 2060 super using DLSS got stable 60fps at 1440p. My 3070ti on Dying light 2 gets stable 60fps at 4k using DLSS. Only thing I don't like is the vram only 8gig on 3070ti, Nvidia messed up with vram.
8gb is fine if you dont max out the textures.
Well, the RTX 3070/Ti is a 1440p video card, so 8 GB VRAM is exactly what you need.
@@eternalbeing3339Cope
@@vladvah77Fanboys should stay silent.
I'm super surprised how close it is. My mind doesn't want to believe it can be this close while giving so much better performance... To the point I keep forgetting it. Brain just doesn't want to get the picture, it doesn't fit with my expectations.
Thanks for such patience diving deep into these topics! Making these questions as close to objective, and best explained as possible. Super amazing work taking something like subjective visual quality in upscaling, and digesting it into something nigh objective! Your commentary about what game devs should do is also super concise and actionable.
If I could give a gold star award to your channel, I would. I'm having to seriously consider donating or subscribing on Patreon, even though that's something I rarely (just about never) do. Thank you again for your tremendous efforts over the years. Gamers can be so much more informed now than any time I can recall in history. And it's due to dedicated reviewers such as yourselves.
Disable taa and dlss and fsr doesn't stand a chance when compared to native.
@@Sal3600 incorrect. If u play native without TAA, DLSS will actually have a better image stability
@@dante19890Why would you even do that lol. TAA has a very slight performance hit which is not even noticeable.
@@AnuragCrafts A lot of old school pc gamers hate TAA and just run native resolution with no image treatment or AA
@@dante19890 Better add sharpening from control panel or sharpening reshade or any mod for that particular game. It should be fine.
A couple of things... is there a reason that the DLSS version of Miles Morales videos started stuttering at 6:35-6:47, and God of War did the same at 12:45-13:00. You'd think DLSS would perform much smoother than native resolution.
Also, thank you for making a point about games updating their DLSS versions. Trying to update DLSS in RDR 2 is almost impossible since Rockstar Launcher ALWAYS check to see if files have been moderated every time you play, and even if you update the DLSS version, Rockstar Launcher will ALWAYS replace it with its older v2.2.10 file. If you try and deny write/edit changes to the file, Rockstar Launcher won't even launch the game; It will stay stuck trying to 'update' the file.
Trash D.R.M.
Well mate, you can always use EMPRESS edition for single player campaign of RDR 2, she has cracked a new version with DLSS so I would say F u c k off Rockstar launcher!!!! and enjoy my game anyway :-)
Considering DLSS is objectively rendering at lower resolution then attempting to upscale to native, it doesn't even make sense for it to ever be better than native.
Bad antialiasing
very useful info, would love a revisit with more recent games someday !
My takeaway from this is I can’t really tell the difference so I can just go for fps ❤
Yea with DLSS. Good luck not being able to tell the difference FSR2.
@@kilmor3151 FSR 2 is pretty good on 4K, I think its okay to use it for 4K quality, below that I am not sure.
@@Verpal heavily depends on the game. Unusable in RE4 for me.
Bro fsr suc*s. I use nvidia but most of the times i prefer native over dlss q especially in cyberpunk. 80% of the time i wud go with native n 20% of the time i wud go with dlss quality. Never fsr it is horrible
Can we take a second to talk about the jumpy vertical scroll at the end of spiderman using dlss. Is that an actual thing? Or trouble during editing/encoding?
I just pick 1440p dlss quality + sharpening all day vs forced TAA in latest titles where you cant even pick another AA (cp2077 as example). Dlaa is good alternative
Forced taa is so fkn annoying it's the reason I could never play the newer battlefields past hardline can't stand any type of aa it's all just blury
@@mryellow6918 Main reason why I upgraded from 1080p...simply because no matter which settings you use you cant see anything. Blur mess vs pixelated mess. Lately my friend and me tried the Division 1 and we were astonished about AA implementation. Both 1080p and 1440p. I think we should blame devs fot AA at this point.
I used DLSS when first playing Cyberpunk on a 1080p ultrawide.
Set a higher resolution than my panel and balanced it so it would render natively at the resolution of my panel; My panel would then downscale the higher game resolution image back to its native resolution. This gave me more distance clarity and a bit more stable image in Cyberpunk (that a month after release of CP, was before DLAA was an official thing in the driver or the game).
so you're basically using dldsr.
@@kiburikikiam before that was part of the driver, yes.
IMO when testing native using post-process TAA (temporal anti-alising) is not showing native at its best, TAA when compared to DLSS or FSR, lacks a sharpening filter, it slightly blurs the image instead and worse creates artifacts that wouldn't exist with no AA.
MSAA (multi-sampling anti-alising) would be the best AA option to use for native, when comparing image quality, but unfortunately, modern games have no MSAA support at all and forcing it on the drivers, only works for older D3d9 (direct X 9) games and that is a real tragedy to me.
You can of course, still use SSAA (super sampling anti-alising) in any game, by just rendering it at a higher resolution than the one you display at, and that has the best image quality of all but also destroys performance.
Pretty obvious that we see way more ghosting with DLSS that FSR in these examples. I can’t say I’ve noticed any “jump off the screen” problems with FSR yet (commented at 11:26 mark) as I watch.
In my personal experience I noticed that the newer DLAA anit-aliasing algorithm works incredibly well with solving flickering issues and making the image look high res. I think this test shows that the Nvidia AA algorithm on DLSS handles scenes much better than TAA for example hence why DLSS looks better than native in many cases. I wonder if we will ever get a test comparing DLSS with DLAA. DLAA native scenes should theoretically be better than DLSS.
Last of Us and Marvel's Spiderman look amazing even on a 24 inch 1080p screen with DLAA. If every game looked like that, 1080p resolution wouldn't be a problem.
Aren't we talking about antialiasing quality though? I can tell the difference in Hogwarts for example between DLSS performance and Native in terms of overall resolution, DLSS performance looks blurred in comparison. The aliasing is cleaner for sure but the image is noticeably lower res imo.
Standard TAA is often absolutely awful. I'll never understand how it was ever acceptable. DLSS (and fsr) often does a better job dealing with artifacts like ghosting, but obviously it can't compete with the sheer pixel count of native.
A prime example is often flowing water. Transparencies often don't create motion vectors so any moving texture gets smeared like crazy with TAA, but dlss is capable of keeping the texture crystal clear in cases
It kind of depends. For a long time in M&B Bannerlord for example, even DLSS quality gives off major shimmering with water and ripples, and default AA at native looks significantly better. But later DLSS versions fixed some of that. What they should have done is to swap the DLL to latest version for all the comparisons.
@@zxbc1 for sure, it's not always better. Spiderman for example has very good temporal anti aliasing tech so these issues aren't as apparent.
As for older dlss versions, it's extremely easy to replace the dll with a newer version so I don't really consider the older versions anymore
Why even use AA at 4k? It hurts performance and looks worse.
@@prman9984 To fix aliasing, like you would at any resolution. 4K doesn't somehow fix aliasing, it just makes the aliasing size smaller. For me it's absolutely required for a lot of games because I game on a 48" LG C2 at desktop distance, and I can see aliasing very clearly at this distance. If you game on a big screen in living room couch distance you probably won't need it, although for some games that have very bad aliasing, you will still notice severe shimmering even if you can't see the exact aliasing artifact.
@@zxbc1 I game on a big screen in 4K since 2016 and hate AA especially TAA with passion. It destroys image quality - its why I HATE these people that say DLSS is BETTER than Native because its NOT. It may be better (image quality) than Native with AA but NOT without.
I read a funny thing somewhere which i also think is true.
When you want high visual quality, you enable highest settings possible right?! If your machine can't handle it, you reduce the video settings, right. All in native.
If you activate the upscalers while selecting the high visual settings you are literally reducing the video quality.
So why enable dlss/fsr instead of just keeping native with slight lower visual settings.
By keeping native you aren't compromising anything in terms of video quality and you aren't demanding anything extra from the gpu either (by forcing a silly upscale calculation)
always update dlss to the newest version and test if its better. most of the time it will have great improvement in image quality (there are exceptions though)
This test is weird I don't understand. Instead of creating two graphs one where its FSR vs Native and DLSS vs Native you combine all 3 into one so that only DLSS is shown... why? What advantage does that serve us the consumers of this video?
For non-RTX users (GTX, AMD, Intel) how do they know which games FSR is better than native in? The only technology they can use. That would be useful information. This test only benefited NVIDIA RTX users because you've left out useful results for everyone else. If FSR is ALSO better than Native in that game that's good information to know rather than simply just knowing what's best.
Love the video just some feedback
I would also like to see DLAA results and rendering the game at a higher resolution then down scaling it. I.e 4k to 1440p and seeing how those two compare to traditional TAA.
you people demand a lot without compensating them for it.
what's stopping you from testing yourself?
I'll give you a hint, downscaling pretty much always gives you the best image quality.
@@gozutheDJ if we were to follow your brilliant logic, this video would not exist. Hw does technical analysis that most people are not capable of, so it's interesting to see H.W opinion. about the lol reward the channel earns money with the videos by adsense
@@tvcultural68people are not capable of turning a setting on, checking how it looks in game, turning it off, and comparing? wtf?
@@gozutheDJ It seems like you didn't understand anything I said. Hardware Unboxed goes much deeper into the differences in graphics than most people, in summary, the channel offers a much better analysis. And if someone were to do their own analysis on dozens of games, it would take dozens of hours, I know this because I analyzed 5 games. It's much easier to watch a video that provides a great summary of this issue than for a person to spend dozens of hours analyzing games. Is it difficult for you to understand this?
@@tvcultural68 because this is literally just content dude. its fun to watch but no one is using this to tune their games. because to tune their game they would just .... do it themselves. i have eyes. i dont need a channel to tell me what looks better to me. its all coming down to personal preference as i said, unless you just go along with what other people tell you is better.
and also, your monitor factors heavily into this as well. and that's something they can't account for, they can only tell you how it looked to them on their monitors, but you might have a completely different result, so at the end of the day you STILL need to test it for yourself.
To be fair, I think the "better than native" really comes into place in scenes with limited motion. It could be an RTS game, like The Riftbreakers, a slow dialogue focused cutscene like the Witcher 3, puzzle games like Lego Builders or just when standing still, detail studying environments or characters in Cyberpunk. In those cases the temporal resolve has a super-sampled "better than native" quality to it. With FSR showcasing flickering instability with sub-pixel geometry (due to the feature lacking a temporal smoothing pass, which DLSS has. However, this extra pass often causes extra ghosting in DLSS as a result).
There are so many components to upscaling, and there's just a lot of subjective opinion of the image quality that you can't or don't really need to quantify. For me personally I rather play Cyberpunk with FSR Quality over the native image at 1440p due to the super sampled temporal resolve in cutscenes, dialogues, or just when detail studying the environment. During motion it's close to native, but with added flickering and slight fuzz. which I don't mind when I get into the game. So even if flickering CAN be distracting, to me it isn't all that important, and I value the benefits of FSR, over its drawbacks. And that's something that you can't really quantify even in an image comparison, because it really depends on the scene, what you value, how much you are into the game and what best depicts the artists intended look for the game. and so on.
It would be good to point out that the image differs a lot based on the DLSS version which can be mostly replaced in a matter of minutes, for example the DLSS in RDR2 is trash but when replaced with version 2.5.1 it looks amazing.
unless there's a convenient method that scales for every owner (possibly GF Experience integration) or an always up-to-date table somewhere that matches game to dlss version this is an impractical solution that only a few enthusiast will research. Nvidia hasn't even been able to stay on top ReBar On/Off per-game settings as shown by HUB so not much to place trust on here.
He covered that with that exact game in the final thoughts. It maybe deserved a section of its own, as people probably don't expect new information to be provided in the final thoughts section.
@@DavidFregoli there is an app called DLSS swapper that makes it dead simple
It boils down to anti-aliasing, which can be pretty bad at native. Interesting that the FSR flaws to which Tim most often points are usually also present in the native/TAA presentation. This suggests to me that complaints about FSR are overblown. Yeah, FSR is worse than DLSS, but it's still pretty close to, and in rare cases better than, native quality. And given that modern games tend to have glaring issues otherwise--perhaps micro-transactions, brain-dead gameplay, bad AI/scripting, and/or things like persistent shadow/texture/mesh pop-in--given all of that, I have a hard time stressing out about minute differences in image quality.
Still, I prefer to run native when I can.
Native + blur
It's not just that, DLSS is trained on a 16K resolution dataset which is why it can generate fine detail better than native 4k sometimes.
Which for addition makes up additional point.
Unlike DLSS, FSR works with just what it gets, not some magic box of trained data to mask lower resolution source.
So, if native have shimmering and stability issues, FSR would get them as well, while sometimes exaggerating, sometimes suppressing.
It's not that FSR is bad. It is just that DLSS creates stuff that wasn't there originally. It's also not a bad thing, but by itself it shouldn't make FSR worse technology.
And cost of such creation (and reliance for TAA and AI) is ghosting (more for DLSS), or not fixing original shimmering (more for FSR).
Probably that DLSS/FSR comparsion should've went with native as well (just not including it in results). Maybe overall conclusion would've not been so negative towards FSR. It is still inferior for multiple reasons, as, for example, there is no magic box to draw stuff from air with. But it doesn't "destroy" playing experience.
Agree. People forget how bad games looked 15 years ago. Its no longer about graphics or resolution. Gameplay is stagnating more. Physics implementations dont get enough attention nor audio. Worst offender is a.i. Most games have brain dead a.i especially if its a tacked on singleplayer with main focus being multiplayer. Which then make multiplayer bots suck too.
@@DimkaTsv I think they do have to train FSR 2 for games somehow, as it cannot be injected into any game like FSR 1 can be with the Magpie mod. I don't know how they do the training though. You make some very good points.
I'd love to see something like that for 1080p. Is it crap or is it any good?
it is good.
@@kanakabp Old DLSS was crap, but 2.5.0.+ is very good.
This just makes me hope that AMD eventually releases a version of FSR that utilizes AI to enhance the quality of the image similar to DLSS. Perhaps it might only work on newer AMD gpus, and would otherwise revert back to normal FSR. But knowing AMD it will also work on any gpu capable of utilizing tensor ML cores. That would be the true "DLSS Killer"
Interesting video. If anyone had asked me, I would never have thought any of the upscaling solutions would ever be better native. But it does make sense with titles that haven’t implemented proper high-end graphic features. Might even be a great way to visually overhaul older games if you can simply tack DLSS/FSR on to it, rather than having to update the core mechanics of the engine...
"Better than native" presentations are made possible due to shortcomings in the anti-aliasing solutions most games use, and it doesn't have anything to do with the games' age or graphical feature support. And DLSS/FSR2 require quite a bit of work on the engine side to get a good implementation of since these are temporal upscalers. You need extensive motion vector support at the very least. You do not simply tack these features on. And with motion vectors, you can also do good-quality TAA, which would reduce the need for DLSS/FSR to begin with.
It makes sense if you think about modern anti-aliasing techniques like TAA, which are essentially DLSS without (usually) the upscaling element, i.e., they look at past frames to reconstruct a smoother image than the native raster. Otoh I think those would be tough to implement in older games without some quite noticeable artefacts: what makes DLSS good is that it can look at depth information, motion vectors, and so on in order to create a coherent image, and old engines just don't have the capacity to communicate that info.
The fact alone that we got an antialiasing-technique now that is able to provide better image quality than native while at the same time giving a performance boost is just mind boggling. A few years ago i would have never belived this. It was either a mediocre performance hit with unbearably shimmering foliage and good geometry or an absurd performace hit (e.g. super-sampling) with an overall good quality while high-frequency textures still tended towards shimmering. DLSS is a real game changer.
11:15 the falling leaves on DLSS leave a long trail behind them and FSR creates a distortion effect but no trail
whats your point? pause at 14.36 and you will see DLSS is much much better than native or FSR. Like the whole video tries to explain to you, every game is different and all of them have their ups and downs in image quality but FSR is not in the discussion, its trash, finding one example is desperation lol
@@jinx20001 no point just wanted to show some difference, no need to be aggresive about it :)
@@jinx20001 How boi , somebody is angry they touched their favorite upscaling solution 😂
@@A.Froster bro i dont need any upscaling, bet you can guess why.
I don't understand how updating from a lower resolution to a higher one can look better than the higher resolution running natively. That's like saying a blu-ray player upscaling to 4k looks better than an actual 4k player.
This is unreal... you guys are putting out informative in-depth videos so frequently from hardware unboxed, hub clips and monitors unboxed, I have difficulty catching up and watching all of them!
I highly suspect DLSS 4.0 video generation trickery has been used =P
The questions that arise from this are:
1) Is the gain in visual quality when using DLSS quality, solely lies in the anti aliasing technic used in native rendering? i.e would compering dlss quality to 4k native with DLAA provide similar results as the ones in this video? I think not, but there are only a handful of games that support DLAA.
2) Does it matter? We already have concluded that "ultra" settings are not worth it, isn't dlss quality(or even lower) essentially similar to that? You get marginally worse image fidelity (50% of the time) and a massive 30-40%+ fps boost.
3) Is there any instance, where if you are forced to render (due to performance constrains of your gpu) on a lower than your monitor's native resolution, you shouldn't use DLSS and opt instead on relying on "dumb" gpu scaling or even letting your monitor to do the upscaling? for example, if I have a 4k monitor but i am only able to run the game at 1080p is there a case where it's better to render at 1080p instead of using dlss performance (internal 1080p rendering) ? Building on this: If I have the option of buying a 4k monitor or a 2k monitor, and money is not a concern, even though i don't have the hardware to run 4k native games but can run 2k native, why should i trouble myself with that and not just get the 4k one and use dlss to render at aprox 2k, effectively rendering the big old question of "who buys 4k when its so hard to run" irrelevant (notice that the difference in price of 4k screen as compared to a 2k one is much lower than the difference of a 4k vs a 2k capable gpu + a screen "ages" much slower than a gpu.
in my opinion the boost in FPS is ALWAYS worth using DLSS in quality at minimum. sure in some games the ghosting will be noticeable but with higher fps you see more detail than at a lower fps and higher (native) res.
It's funny how everyone complains to no end about tearing/vsync but now crappy ghosting is somehow acceptable. Gamers really buy into everything..
@@2xKTfc Ghosting can be fixed with a simple DLSS dll replacement, unless you are on multiplayer competitive title.... which you would probably have plenty of FPS to begin with.
@@2xKTfc I've got a g-sync monitor so tearing isnt exactly an issue for me.
It's up to you if you think playing at a lower framerate, with blurry motion and higher input lag is worse than a small amount of ghosting.
Tim and Steve giving people the goodies! Good job!
watching this 4k comparison video on 720p youtube quality on a 1080p screen, gg me and probably 90% of viewers
Im interest about the Wattage used in all modes as well in times like we have today. Hope we can see a bit more information as well in the modes. Thx anyway for the good work :)
Reading the comments, it would seem many haven't actually watched the video???
Mudslinging…mudslinging everywhere…
Only thing I see here is how fucking great FSR is, and that it does not need meme hardware to actually just work.
That's very interesting. I didn't expect DLSS or FSR to match and even beat the native rendering. I guess it's because the TAA is not a great anti-aliasing technology... and hey, I'll take free performance with a better image any time! Thanks for the detailed look!
native is better than TAA most of the time, it creates blurryness, I even prefer the jaggies to this technology.
@@robertcarhiboux3164 most people hate the jaggies, including me.
It just looks so bad that I can't stop looking at the jaggies moving around over edges instead of enjoying the game
@@robertcarhiboux3164 U have to use Native with TAA or otherwise u get a very unstable image
@@damara2268 I hate jaggies, but TAA even more. I prefer MSAA x4 but it's less and less available.
@@damara2268 TAA is blurry and even worse than jaggies. I prefer MSAA but when the game does not have it I use DSR 4.0x when my GPU can handle it (with minimum smoothing factor like 6%). I also like very much the hitman resolution upscale. It's the only game in which it really improved the sharpness of the image with no particular downside. Even performancewise it was better than DSR.
Love the content!
Keep up the good work! 💪🏻
This tech has gotten so good lately, the very small issues that can be seen if you really look for it do not bother me.
Fr I can play with Dlss performance and I dont mind the “issues” because of the FPS gains
Its true in some games that do not have an anti aliasing solution. For example, nioh 2 on PC looks better because the game has no anti aliasing and needs dlss to clean up the image.
Native is GIGACHAD, no downscaling/upscaling, pure raw performance ftw!
When hes staying "Stability" are we speaking about anti aliasing flickering? because it seems rather confusing talking about stability because its often associated with camera/display shaking.
First off I just wanna say I really appreciate the effort you guys put into these videos to provide us with such in depth analysis and information on topics.
That said, in this instance I think a more useful and widely applicable analysis would be to compare the image quality of native 1080p or 1440p to that of higher but upscaled resolutions, at quality settings where the upscalers render at or close to 1080p or 1440p respectively.
For example:
- Native 1080p vs 4K Upscaled on Performance (1080p render res)
- Native 1440p vs 4K Upscaled on Quality (1440p render res)
- 1440p Upscaled on Performance vs 1080p Upscaled on Quality (720p render res)
I think these comparisons could be more useful because most gamers have hardware that actually allows them to use those setting combinations at playable framerates, whereas very few actually have the hardware that affords them the luxury of "choosing" between native 4K and upscaled 4K. Dlss/fsr were always meant to be tools to boost performance at a given res(or enable playable fps at higher res) while minimizing losses in image quality, more so than tools to outright boost image quality itself.
Personally from my own sample size of 1, I have found that running games like MW2, Cyberpunk and Hogwarts at 2160p Performance or even 1620p Quality(also ~1080p render res) actually produces a more stable and sometimes better looking image than native 1080p, particularly in Cyberpunk. This makes the ~5-8% fps hit of running 2160p Performance Mode worth it over native 1080p. It would be nice to see what you guys think about this and if others experience the same thing.
Absolutely! Originally DLSS was there to take you to 4K with a GPU that wasn't powerful enough to render native 4K. Now if you're running a 4090, sure you can choose native vs DLSS, but that's a far cry from most people's need. Native 4K doesn't perform nearly the same as DLSS 4k.
@13:43 Pretty sure the rain shouldn't be THAT obvious, I don't remember white streaks falling down from the sky when it rains, it's much more subtle than that. I think you guys are off there, there's this thing called "too much of a good thing". And both upscalers seem to exaggerate the rain visuals.
The motion performance of both brothers me, I notice it immediately. Obviously they're always improving it but I personally choose not to enable them.
agree
There's one clip in the video where he's saying DLSS is superior, while it's stuttering on DLSS but not in native.
TAA has the same problem so I guess no AA for you
@@prman9984 could just be the capture.
@@prman9984 DLSS improves performance so I don't see how it could be stuttering when native doesn't, especially since Tim didn't mention anything as he tested it & would have felt it. Ig it could just be a bug while capturing gameplay.
Better is subjective. What bothers you more? Blurry? jagged edges? Flickering fences? ghosting? Sharpened/oversharpened? SHimmering?
Every one of these might have s different weight for each of us. I personally find DLSS fine on quality most of the time, and if I need it more aggressive than that then I drop resolution. But at times when flickering or ghosting is introduced I simply drop resolution instead. I can bear sharpening problems and blurriness, that's why I prefer resolution change to the other artifacts. However I don't play much of these games. A lot of these have bad TAA by default.
I could barely tell any difference in most cases lol. but that took a lot of effort so massive respect! 🙌🏻
True unless i use like performance settings both on fsr and dlss
That's why videos on this are a plain moronic concept. You won't ever be able to tell in a video.
hard to cross reference same area unless you pause, I'd rather see more stills than playback for fine detail, but playback also has its place to highlight image stability and shimmering. also YT compression doesn't help.
Great content. Thanks for enourmous effort put into this video. Please keep up the great work. I learn a lot from your content.
I have a theory about some of your test data, for Spider-Man for example. I wonder if the game engine and texture packs are only optimized and designed for 1080p and 2160p. So when the game renders natively at 1440p, it’s just using something like a checkerboard technique to up the image to 1440p, whereas DLSS can upscale to 1440p from 1080p (quality mode) using its magical mathematical algorithms, which would also explain why the native 4K presentations were better than DLSS upscaling from 1440p, because the game engine already adjusted the image from 1080p to 1440p prior to DLSS then upscaling from 1440p to 2160p, which could explain why in Spider-Man and last of us, 4K native was so much better in it’s fine details then DLSS/FSR but native 1440p was much closer to DLSS/FSR. 🤷♂️🤷♂️ maybe this is something to look at in certain games, do certain resolutions offer up certain pros/cons and do some of those go away with certain DLSS/FSR settings ?? Is this just a DLSS/FSR issue, or is it also a possible game engine issue? Do some developers just not take the time to properly develop 1440p textures and details, which would make the DLSS quality mode using an already less optimized image ??
Since I have a RTX 4060 in my G14 I don't have the luxury of running modern titles maxed out at 1080p and up, but I have been seeing a noticeable improvement modding DLSS 2 titles with DLSS 3 at 1080p, which is where the differences of DLSS will be more prominent. So far I've tried with Metro: Exodus and Hellblade: Senua's Sacrifice.
I'm hoping it's not a placebo effect but would like to see a review on the matter to rule it out.
performance upscaling is only usable when you are lacking performance badly. there is no point to use it on high end gpus
@@McLeonVP do online games lacking performance on high end gpus? ppl usually prefer to set lower settings there so performance isnt an issue
There is reason why VR games use MSAA, it simply provides far superior fine detail compared to any of the temporal AAs. Also, while MSAA is expensive, it's NOT the same thing as super-resolution and is not as expensive. Any time I've accidentally turned on TAA in a VR game it made the game terribly blurry. In motions, TAA and DLSS get noticeable blur penalty compared to no change in native. MSAA further adds detail and reduces aliasing. I wonder if we can find a game with both MSAA and DLSS because MSAA is the actual final boss of clarity.
I use 4k + DLDSR 1.78x with DLSS P. It runs better than native and worse than 4k DLSS Q but looks miles better than DLSS Q and better than native.
Using a 4090 so VRAM is no issue
Edit:
So it renders in 1440p (dlss p) -> 5k (dldsr) -> 4k
Instead of 1440p (dlss q) -> 4k
Yeah. I test games with DLDSR + DLSS myself and they look much better than DLSS quality 4k.
wait, there is DSR version that uses DLSS? (I'm still on that GTX grind)
@@GewelReal yeah there is and dldsr 1.78x looks almost like dsr 4x in 4k + it runs so much better
(dldsr = deep learning dsr) really cool tech 😃
DLDSR is the truth. In combination with DLSS it looks ALWAYS better than native 4K.
I can't really tell the difference ,but on my 1080p monitor , I prefer using 1620p+dlss quality(Dldsr) than 4k+dlss performance(dsr) same being upscaled at 1080p res.
FSR 2.1 is more superior because of the fact that you can use it in any gpu
FSR seems to always have a shimmer going on. I have both setups (5900x 3060) and (5800x 6700xt), and the FSR shimmer is really obvious in some games like RE4.
I agree in the last of us in the dark it's horrible
FSR needs some work still but thx to FSR i can game on 1440p with a 1060 6gb thx AMD
@@UA-camcensorsmesince2015 bro what lmao omg your poor 1060 is literally cryin
@@Eleganttf2 bro he just not a sheep like you are
@@Eleganttf2 bro stfu not everyone have daddy's money like you
I think you might have gotten your pics mixed up for the Spider-Man comparison as the hair on the DLSS Quality mode looks betther than the native 4k pic @5:09
I always noticed that DLDSR + DLSS quality looked way better than native 1440p while performing better or the same
What's up with the weird stuttering with DLSS as the camera pans down at 6:43?
This was a very needed video for quite some time, a lot of people still think Native is always better, but it's not, and even in games where it really is, most of the time as shown in the video, it's just slightly better, and in my opinion, the boost in fps is well worth enabling dlss quality in these cases.
It's better than buying a 4090!
@@BlackJesus8463 lord jenson disapproves this comment!
massive cope
Good video but 1. There has to be something wrong with the Death Stranding results, I played it recently and the TAA implementation was the best I've seen in years. Maybe it's bugged in the Director's cut? And 2. We seem to have gone backwards in regards to AA quality. I remember playing Dead Space 3 and Battlefield 3 on a 720P TV and the image quality was spotless for both with just post-process AA. DS3 used SMAA, not sure about BF though. At least we've moved on from the hot garbage that was FXAA, 90% of the time the raw image was better (TAA isn't radically better but to its credit, it is "low-cost" and usually better than not having any AA)
AA is not the only problem, a lot of modern games are running screen space reflections, ambient oclusion, and other ''game parts'' on half or quarter resolution compared to native, this can look ok on 1440p, but not below that, in some modern games like Ghostwire: Tokyo, reflections break on 1080p. In Cyberpunk 2077 Digital foundry has noticed that on 1440p you could lower reflections to get better performance, but on 1080p you have to set reflections at least on high for them to look good. Metro Enhanced looks extremely blury on my 1080p monitor. but with DLSS/DLDSR 2880X1620 balanced looks amazing for same performance as native 1080p without DLSS.
I would asume that a native 4K image is better than an upscaled 4K image. The question is how much better (or not) looks a 4K Quality to a native 1440p. Because that is what 4K will render in Quality. Or 4K Performance vs 1080p native. So do you gain a better image with 4K even if you still use Performance compared to staying at native 1080p or native 1440p?
4k quality looks a lot better than 1440p on a 4k screen, it's not even close
In my experience, 4k quality always gives way better resolution of details compared to native 1440p. That is the best use case of dlss, getting pretty close to 4k visuals with 1440p performance cost. I wouldn't expect 4k dlss quality to be better than native 4k unless native has a bad aa technique in use.
@@WhiteCrowInteractive which is why I always turn off AA tech on 4k. At least for me it isn't necessary and is like putting Vaseline on your camera's lens.
You can't compare 4k dlss quality to 1440p native, as you will get significant less fps. To my eyes, 4k dlss balance upscaled resolution is about 1440p native. 1440p native will have more fps than 4k performance WITH better image. Same thing to 4k performance is not relevant to upscaled 1080p, they are marketing scam. At least to what I've tested in cyberpunk2077.
@@WhiteCrowInteractive Because 4k quality in practical, not upscaled from 1440p, maybe around 1850p, ultra quality about 2000p. These are marketing scam. Look at the actual fps you will get from both 4k dlss quality and 1440p will get you the answer.
Fake image is still fake image, The GPU card I have can render the graphic the way it was meant to be immersed in or it just can't... not using fake stand-in. Sorry but I'm sticking with native.
DLSS Quality is really good most of the time so I usually turn it on when available.
If there is no rt or smoke in the game, Upscaling works fine. Otherwise, a lot of noise is created
for some people over sharpened image with denoised textures with low flickering is better than non sharpened images with normal textures with flickers only seen when zoomed too far. But for me the later is better.when ghosting is minused image looks little bad at the edges
Can we just take a moment and appreciate that this kind of upscaling is even possible? I still remember being blown away when DLSS 2 came out. It looked like magic to me honestly.
It’s still magic to me tbh
It's not magic ^^
this was very interesting. see i play mostly at 1440 because i like high fps and ive found i really like the sharpness that dlss or even fsr can give. like hogwarts and rdr2 and a couple others im not really thinking of imo just look better at 1440 than taa
Amazing job on the video! However I can't really see a difference between the 3.😂 So if the game performs well at Native Il'play on Native. If it needs extra push then I go DLSS / FSR Quality.
Sure, if you don't care about smoothness and just want to take snapshots all day I can see that being ok. I use DLSS for smoothness and cause the flickering in native drives me nuts.
@@huggysocks I agree. Upscalers help a lot with 1% lows also. And yea it smooths out the expirience so sometimes it's good even if you have enough frames at Native.
Cool video! I've been wondering about this for some time so it was useful to me. I also noticed that DS was far better looking with DLSS enabled. The aliasing with native was atrocious but with DLSS it looks fantastic. HU please do this grading for all games in existence so we can just look up the game we're gonna play in a database and set our game accordingly.
or you could take the two seconds to check the game yourself. why are you so lazy?