FSR vs DLSS vs XESS!! Which Upscaler is Best and When to Use Each One
- Published May 31, 2024
- JOIN THE DISCORD!
/ discord
Link to Spreadsheet if you want a closer look at the performance numbers:
docs.google.com/spreadsheets/...
Upscalers aren't talked about enough, but they are extremely important for reviving older hardware or playing at higher resolutions. Intel's XeSS is relatively new on the scene and it has to compete against AMD's FSR and Nvidia's DLSS. FSR is open-source, which means it's available on almost every GPU, while DLSS is locked to newer Nvidia GPUs. DLSS is typically seen as the best upscaler overall, but what if you don't have Nvidia? FSR and XeSS are the next best options, so how do they compare? Let's figure this out.
Intel XESS explanation: • Intel Arc Graphics | I...
GN on AMD restricting competitors: • HW News - Bad Timing f...
www.amd.com/en/technologies/r...
www.rockpapershotgun.com/conf...
videocardz.com/newz/intel-xes...
0:00- These upscalers are VERY competitive
0:42- Explanation of testing
1:20- FSR vs XESS vs DLSS in visuals
4:50- What if you can't use DLSS?
7:38- FSR vs XESS vs DLSS- FPS on Nvidia GPUs
10:50- FSR vs XESS- FPS on AMD GPUs
14:06- FSR vs XESS- FPS on Intel GPUs
16:08- Is UE5's built-in upscaler better than DLSS?
17:37- Where and when to use each upscaler
20:30- AMD upscaler exclusivity drama
Hope this helps! I left a link to the spreadsheet in the description if you wanna take a closer look at the performance numbers
Maybe I'll revisit this with more games (sry games be expensive) or to see if it's EVER worth using XESS on a non-Intel GPU, because the visual quality can be worse using the DP4A fallback layer. That is possibly the reason they are able to maintain a decent performance uplift on non-Intel. However, I didn't look at that extensively here.
also there was a booger in my nose at the beginning, now I can't unsee it 🤢
The motion on the grates is a pretty common issue with all 3; it comes down to how they handle it. If the image is multiple frames of the grate slowly moving position, you will get flickering. If you notice, XeSS has a softer look to it; then look at DLSS, where the edges are more defined. That's just a big difference I've seen from DLSS 3.1 vs XeSS 1.1, and I'm sure when FSR 3 comes out this will all change up again. Just for reference, XeSS performs almost the same as DLSS for me at XeSS Ultra Quality, which is a higher internal resolution than DLSS Quality (this is very location dependent). I only run DLSS if I need frame gen on due to path tracing; otherwise I run XeSS at Ultra with 50% on the sharpen slider (which may be replaced with a contrast adaptive sharpening shader that causes almost zero artifacts, I just hate ReShade).
My GTX 1650 loses FPS using XeSS on Shadow of the Tomb Raider unless I use XeSS Performance.
Btw I take back what I said at 5:50. FSR looks better than XESS on the palm trees. I wasn’t looking at the footage while filming and I made assumptions I shouldn’t have. That’s my bad
Overall though, XeSS does still solve many problems FSR has
Great video, you touched on some important aspects many fail to report 👍
...but you also didn't test one critical aspect:
FSR on AMD GPU vs FSR on Nvidia GPU!
From my testing, FSR always runs, and most of the time looks, better on an AMD GPU compared to the competition.
I know people use Nvidia cards to test both because only RTX cards can run all three upscalers, for consistency and to save time, but since you took the time and effort to compare using all three vendors you already have the data 😊
You should really do a follow-up testing FSR on two comparable AMD and Nvidia GPU (say an RX 7600 vs RTX 4060).
Often people using Radeon cards feel left out on the upscaler front, when it's really not the case IMO.
The RX 6700 is basically the PS5 GPU, hence FSR performance is better
I can clearly see the difference in a 144p video. Love the video 👍
did you mean 1440p?
@@ManuSaraswat nope
@@ManuSaraswat did he stutter?
@@priyanshusharma1812 no
Unfortunately, only shows half the difference in 240p 😩
Really wasn't expecting a performance difference between the different upscalers. I wonder if DLSS was locked down to tensor cores for the very reasons demonstrated by XeSS. Kudos to Intel for not locking XeSS down to Intel-only GPUs with hardware acceleration.
Intel "unlocking" XeSS for other hardware is mostly for marketing purposes. The best version of XeSS still only available on intel Arc only. The DP4 fallback the highest quality will run with much worse FPS than native resolution. Think about that for a second: you already run below native res it should be faster and yet it is slower in term of performance. This is what happen when game rendering and AI computation have to compete for resource from the same shaders. This is why nvidia decided to make their DLSS exclusive for tensor core only despite early version of DLSS using in control are being accelerated by gpu shaders instead of tensor core.
Yeah Control had DLSS 1.9 which ran on traditional shaders and it looked much worse than the DLSS 2 we know of today.
No, it was locked down to Tensor Cores because that is what is needed to run the neural network model Nvidia uses to upscale. Tensor Cores and XMX cores (and AMD's AI cores) are all essentially the same thing: they do tensor math faster and more accurately than regular GPU compute cores. Eventually, possibly starting with the next generation, AMD will also be using tensor math cores to do their upscaling. How do I know this? Simple: AMD added their version of Tensor Cores (they call them AI cores) to the 7000 series but aren't actually using them, even though you paid for them. So either they plan on using them in the near future... or they were complete idiots for including them when they just take up space and cost the consumer money with no added benefit.
Tensor/XMX/AI cores are really no different from the dedicated circuitry added to CPUs (MMX, SSE, AVX, 3DNow, etc.) a couple of decades ago that eventually became standard across both Intel and AMD CPUs. Eventually these cores will be standardized, likely around Nvidia's model, because it's simply the most developed thanks to a several-year head start. Lots of the standards used in graphics started out that way; for example, tessellation and HBAO were both once locked to specific GPUs but are now standardized and run on all GPUs.
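For context on the "DP4a" discussed in this thread: it's a packed int8 dot-product-with-accumulate operation that most modern GPUs expose (e.g. via Shader Model 6.4's packed dot-product intrinsics, or CUDA's __dp4a). Below is a minimal C++ sketch of what a single DP4a computes, not any vendor's actual kernel; the point is that the XeSS fallback runs enormous numbers of these on the same ALUs that are shading the frame, so upscaling and rendering fight for the same throughput.

```cpp
#include <cstdint>
#include <cstdio>

// What one DP4a instruction computes: a 4-wide dot product of packed
// signed 8-bit integers, added to a 32-bit accumulator. GPUs with
// matrix units (XMX / tensor cores) do this kind of math on dedicated
// silicon; the DP4a fallback does it on the ordinary shader ALUs.
int32_t dp4a(uint32_t a, uint32_t b, int32_t acc) {
    for (int i = 0; i < 4; ++i) {
        int8_t ai = int8_t((a >> (8 * i)) & 0xFF);
        int8_t bi = int8_t((b >> (8 * i)) & 0xFF);
        acc += int32_t(ai) * int32_t(bi);
    }
    return acc;
}

int main() {
    uint32_t a = 0x04030201;       // packed int8x4: (1, 2, 3, 4)
    uint32_t b = 0x08070605;       // packed int8x4: (5, 6, 7, 8)
    printf("%d\n", dp4a(a, b, 0)); // 1*5 + 2*6 + 3*7 + 4*8 = 70
}
```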
Upscalers are like televisions. As long as there isn't a better one playing next to it for you to see, the one you have will look just fine.
I would agree and disagree; you're going to notice that flickering
I use XeSS sometimes, but I have an A770 16GB LE graphics card. The main difference there is that it runs in a multi-channel-like mode utilizing all 32 XMX cores, vs running in single-channel, effectively using a single core, for other GPU manufacturers. I can't wait for XeSS on Battlemage and Celestial though. It's good to have some more competition.
I have a 6950 XT, and if I need to upscale I use XeSS over FSR if available. FSR has so much shimmering in some games that it is borderline unusable. I think AMD did a great job competing with Nvidia for the first two years after DLSS came out, but it is now clear that AI upscaling has clear advantages, and I'm afraid the gap will only widen if AMD does not pull out something really impressive to improve FSR.
@@mastroitek What do you expect from AMD brother haha, they even failed to deliver the simple HYPR-RX feature that's literally a combination of what already exists
@@Eleganttf2 AMD added AI cores to RDNA3. Looks like they are getting ready to introduce an FSR AI version
@@GewelReal I'm talking about HYPR-RX; AMD themselves said it's gonna release in H1 but it's already H2, and they lied just like with the Phoenix laptops
XeSS is 100% like DLSS in this respect: if you run it in DP4a mode, which works on most GPUs, quality is reduced and performance dies. You need Intel XMX to run XeSS correctly at full performance and quality, the same as you need tensor cores to run DLSS correctly. The same would likely happen with DLSS: running on DP4a would mean no performance gain.
Even at this early stage XeSS is not bad.
It utilizes 1 virtual XMX core on other GPUs vs all 32 XMX cores on the A770, and there it is really good. The benefit of Arc is I can use XeSS and FSR to their full potential.
@@phoenixrising4995 No, FSR uses CAS, which is optimized for AMD GPUs. FSR will get the best performance on an AMD GPU. Arc or GeForce cannot use FSR to its full potential.
No one tell him that DLSS doesn't even use matrix computations on the tensor cores; it's just paywalled software by Nvidia
If you run XeSS on SM 6.4, visually it's gulag tier and performance is nuked from orbit.
Dude, how does this guy not have more subscribers? First off he has unbiased opinions, second his videos are always entertaining, and third, he never makes a bad video.
He's gaining subs fast. As he should.
There was that one opinion about how Hardware Unboxed sucked at Fortnite. Loved his followup video on that one.
Let him cook
No bad videos? That’s debatable 😂
There was a hardwareunboxed or digitalfoundry comparison that had the interesting result that XeSS had worse quality on non-Intel cards.
The DP4 fallback actually decreased image quality.
I might take a look at that too. That would make sense because the performance hit isn’t as dramatic as you would think
The Hardware Unboxed video is from 8 months ago. It's been updated since then.
@@bronsondixon4747 I suspect as much, however since upscaling is implemented on a per-game basis, it might not be fixed/updated in every one of them.
@@bronsondixon4747 XeSS outside of Arc simply does not run nor perform the same. I strongly doubt Intel has been working on XeSS for someone else's card with how much they already have on their plate.
But is the DP4a mode better or worse than FSR?
I used XESS on The Witcher 3 with a 6950xt recently. I changed it back to FSR but it worked fine on XESS.
It does use DP4a instead of dedicated cores, and I do believe there is a quality difference between the 2. Perhaps with RDNA 3 it could use its AI cores to run the same XeSS as Intel Arc does.
@@AndersHass Those XMX instructions are proprietary, made exclusively for Intel Arc hardware. The universal version of XeSS uses the DP4a instruction.
@@arenzricodexd4409 That doesn't mean it is impossible. But yes, so far it only works on Intel's proprietary cores. Possibly it is not worth the hassle to make it work on anything else.
XeSS is on par with DLSS when run on Arc GPUs using XMX units.
Exactly; otherwise on other GPUs it runs in single-channel mode, more or less like having one virtual XMX core. That's why it may be half the performance or less on other GPUs. It works very well on the A770 and will only get better with Battlemage. It's nice to have more competitors to help keep Nvidia's dominance at bay.
Yeah, I wouldn't exactly call XeSS on par with DLSS, but it's not bad at all, way better than FSR 1
@@kingiument4627 To get the most out of it you have to use it on Intel hardware. You could honestly probably say the same thing about the other upscalers: if they weren't run on their own hardware they would probably run worse.
@@fancyfox5847 Eh, on Intel hardware it depends on the game; some work really well with XeSS, some don't
This was a really well made video. For someone so thorough in their analysis, you're very underrated!
I find when playing Cyberpunk, enabling XeSS helped to improve the image clarity a lot.
I would usually put on my glasses because the blurry image from FSR felt a bit jarring on the eyes, whereas XeSS helped to reduce that issue (in my case anyway)
Awesome video @vextakes, thx a lot!
Just wish you could do even more games comparing FSR vs XeSS, on non-Intel hardware in particular, as the newer version 1.1 claims to have improved on the non-native/alternative DP4a render path too.
This video was perfect as I understand more about upscaling now. Thanks!
It is important to note that the same preset name can mean different starting resolutions for different vendors
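To make that concrete, here's a small illustrative C++ sketch using FSR 2's documented per-axis scale factors (DLSS's matching presets use essentially the same ratios; XeSS's ratios have differed between versions, so they're deliberately left out here):

```cpp
#include <cstdio>

// Internal render resolution implied by each FSR 2 preset at a 1440p
// output. "Quality" on one vendor is not guaranteed to start from the
// same internal resolution as "Quality" on another.
struct Preset { const char* name; double scale; };

int main() {
    const Preset presets[] = {
        {"Quality",           1.5},
        {"Balanced",          1.7},
        {"Performance",       2.0},
        {"Ultra Performance", 3.0},
    };
    const int outW = 2560, outH = 1440; // target (output) resolution
    for (const Preset& p : presets)
        printf("%-17s -> %.0f x %.0f internal\n",
               p.name, outW / p.scale, outH / p.scale);
}
```

At 1440p, Quality renders internally at roughly 1707x960, which is why preset names alone don't tell you what you're actually comparing.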
Great video, just what I was looking for. I have a much better understanding of the technology now, cheers.
Genuinely informative video Vex, thank you. 🤜
You gained a sub; straight to the point and covering a whole lot of ground for every single viewer out there. ❤
The problem with upscalers being so common now is that we are nearing the scenario where game devs rely on the upscaler to let them cram more on screen, forcing people to use the upscaler just to get "playable" frame rates (i.e. not really that playable, 30-40fps) at much lower resolutions even. It sort of defeats the whole point of the upscaler in the first place (to achieve 4K, or 8K, without the hardware to do 4K or 8K native), because now these games will require most of us to use the upscaler just to run 1080p at middling frame rates. Mark my words, this is where the industry is headed, and it's disgusting.
I'm fine with it, devs have too much shit to do. Stop being poor
@@echo5827 go upscale your washed corporatist brain!
@@echo5827 how does that boot taste?
@@echo5827 your comment doesn't even make any sense, it's literally their job to optimise their games
Great video breakdown. Much appreciated
you all have no idea how much i wanted this comparison
YES !!! spread sheets !!
DLSS works wonders on my 3070 for 4K CoD.
Video I need, thanks!
Nothing to discuss here. DLSS is much better in every aspect possible. It makes the original image even better.
Correction: That FidelityFX CAS setting in Tomb Raider is *not* some form of FSR. CAS stands for Contrast Adaptive Sharpening, and yeah it's just a glorified sharpening filter.
Exactly, CAS is very misleading. Like they did with Rage 2.
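For anyone curious what "contrast adaptive" means in CAS, here's a simplified, hypothetical single-pixel sketch of the idea in C++, not AMD's actual FidelityFX CAS shader: it's an unsharp-mask step whose strength is scaled down where the local neighborhood already has high contrast, which is what keeps it from ringing on edges the way a dumb sharpen filter does.

```cpp
#include <algorithm>

// Simplified contrast-adaptive sharpen for one pixel (luminance values
// assumed normalized to 0..1). n/s/e/w are the 4 cross neighbors, c is
// the center pixel, sharpness is a user slider in 0..1.
float casPixel(float n, float s, float e, float w, float c,
               float sharpness) {
    float mn = std::min({n, s, e, w, c});
    float mx = std::max({n, s, e, w, c});
    float contrast = mx - mn;                     // local contrast, 0..1
    float amount = sharpness * (1.0f - contrast); // adapt: strong edges
                                                  // get LESS sharpening
    float average = (n + s + e + w) * 0.25f;
    float result = c + amount * (c - average);    // unsharp-mask step
    return std::clamp(result, mn, mx);            // suppress halos
}
```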
As a Linux user, the best thing about FSR is the fact that you can use it on pretty much any game that runs through Proton, regardless of official support for it.
Nice
I mean, you don't even have to be on Linux. If a game has DLSS and not FSR (or not the newest version of FSR), there is a 99% chance that someone already made a mod that works as FSR under the DLSS option, or that updates the FSR version. I remember I used an FSR 2 mod for CP77 when it only had FSR 1 for quite a long time.
Very Good Video, Awesome Thank you 👏👏👏
Just started the video and I'm gonna call it now: which one works best just depends on the game.
I'm still waiting for FSR 3.0 with frame generation.
Still skeptical about FSR3 sadly. If DLSS 3 frame gen still has a bit of artifacting or ghosting even with a dedicated optical flow accelerator, what kind of blurry mess will FSR3 be? AMD needs to stop playing this "good guy" BS about keeping everything open and available for all and, just for once, try to compete at the premium level with Nvidia, heck, even with Intel as well
@@Eleganttf2 that's probably why we've heard exactly 0 about it since the announcement. It was probably a panicky reaction to Frame Generation, and now they're desperately trying to figure out how to actually do it in a way that doesn't look worse than FSR1.
Framegen is not even good
Just need real performance
@@ryanspencer6778 It was announced for Q3/Q4; it should be out by the end of the year, so it's not even running late yet. Based on the info, it's not creating fake frames like DLSS3 by interpolating between the last frame and a held-back frame, but is instead using motion vectors to create a frame ahead of the current frame, or using reprojection, possibly a combination.
When the DLSS 3 one looks blurry, I can't imagine how bad it would look with FSR
Man, this is a really good video.
I bought my Arc A770, and I hope Intel creates some competition in the market; we desperately need this. I actually have an RX 5700 (which I will sell) that runs some games well, but I would like to test something different.
Whoa this is really good! These technologies are kind of a huge deal. Good to know.
I remember when the times were like : "omg the graphics are so good"
"playing gta 3"
I personally try to avoid upscaling where possible.
It’s most useful at 4K.
At 1440p output and below I always notice too much flickering and other artefacts for my liking.
DLSS in Death Stranding causes the game to crash without fail in the same area every time, and still hasn’t been fixed.
Both DLSS and FSR cause weird pixelated waterfalls in Uncharted collection, just to name a couple of examples with issues.
These upscaling techniques are nice to have, but I personally recommend tweaking settings to achieve 60fps first, and use upscaling as a last resort.
Yes, it's a nice feature to have, but I don't think it's nice yet
I live in South East Asia. Due to how hot it is, DLSS helps a lot with heat issues
@@backlogbuddies Can relate, DLSS reduces the temps of my 2060 by 3-5 degrees 😅
Upscaling can actually enhance image quality; you just need to use it right. For example, if you have a 1080p screen and an RTX card, enable DLDSR and set the in-game resolution to 1440p, then enable DLSS on Quality. This way the game will be rendered at 960p, upscaled to 1440p via DLSS, and downscaled to 1080p. Although it sounds weird, the end result is very impressive: you get a much more stable, detailed and cleaner image than at native 1080p, while performance is on par with native 1080p or even a bit better.
Same goes for a 1440p screen: enable DLDSR so games can be rendered at 4K, enable DLSS, and get better-than-native 1440p image quality with roughly the same performance.
@@PatGamingApasStyle It'll drop it 5 C for me too. Which is helpful when you're in a tiny cement room.
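The arithmetic behind that DLDSR + DLSS combo, as a tiny sketch using the numbers from the comment above (DLDSR's 1.78x option to reach 1440p from a 1080p monitor, and DLSS Quality's 1.5x per-axis ratio; illustrative only, not an Nvidia API):

```cpp
#include <cstdio>

// Resolution chain for the DLDSR + DLSS trick described above.
int main() {
    const double monW = 1920, monH = 1080; // physical monitor
    const double dldsrAxis = 4.0 / 3.0;    // DLDSR 1.78x area = 1.333x/axis
    const double dlssQuality = 1.5;        // DLSS Quality per-axis ratio

    double targetW = monW * dldsrAxis;
    double targetH = monH * dldsrAxis;
    double renderW = targetW / dlssQuality;
    double renderH = targetH / dlssQuality;

    printf("DLSS internal render: %.0f x %.0f\n", renderW, renderH); // 1707 x 960
    printf("DLDSR target:         %.0f x %.0f\n", targetW, targetH); // 2560 x 1440
    printf("Monitor output:       %.0f x %.0f\n", monW, monH);       // 1920 x 1080
}
```

So the game renders at roughly 960p, DLSS reconstructs to 1440p, and DLDSR downscales that to the 1080p panel, which is why it can look better than native 1080p at similar cost.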
NOTE! FSR works on every game in Linux.
Great work done on that video; for most people your conclusion will work!
But it is subjective, because you focused on details you think are important.
This kind of comparison should be based on a prepared list of metrics (sharpness, detail preservation, stability, artifacts, damaged shapes, false materials, etc.), and each should be measured fairly (in relation to native, not to each other).
There is also a bad situation in games where TAA is enabled by default with no easy way to disable it; TAA is temporal reconstruction too (most of its ideas are used in DLSS and FSR 2), so it is not a truly native picture to compare against.
Also, the upscaled result may look different to people because of pixel density: on a 27-inch 4K display XeSS may always look better, while on a 25-inch 1440p monitor FSR will win.
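For what it's worth, the "in relation to native" part of that suggestion is straightforward to make objective with a full-reference metric. A minimal C++ sketch (hypothetical helper, not from the video) computing PSNR of an upscaled frame against a native-resolution capture of the same frame; note this only covers sharpness/detail, while stability and flicker would need temporal, multi-frame metrics:

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// PSNR in dB between a native-res reference frame and an upscaler's
// output, both as interleaved 8-bit channel buffers of equal size.
// Higher is closer to native; identical images give infinity.
double psnr(const std::vector<uint8_t>& native,
            const std::vector<uint8_t>& upscaled) {
    double mse = 0.0;
    for (size_t i = 0; i < native.size(); ++i) {
        double d = double(native[i]) - double(upscaled[i]);
        mse += d * d;
    }
    mse /= double(native.size());
    if (mse == 0.0) return INFINITY;
    return 10.0 * std::log10(255.0 * 255.0 / mse);
}
```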
Nice work bro.
AMD needs to add machine learning to FSR that enables automatically when capable hardware is available.
They claim the AI cores aren't necessary, but the lack of quality in comparison is not backing up that claim. Hopefully they can fix it without adding more hardware to GPUs.
Good quality upscaling requires GPU power to work properly, but that would defeat the purpose of FSR.
@@mindrover777 I think you're missing the point of why upscalers exist: not only performance but also keeping quality high, else you may as well just lower your resolution without using any upscaling.
On my low-end PC, the biggest difference between FSR and XeSS in Cyberpunk is that FSR smears the text of the LED signboard on modern police cars, whereas XeSS makes screen space reflections pixelated (similar to the video). Performance-wise, there's really no difference; they both boost performance, but I found myself not using the upscalers at all to avoid some visual glitches.
From my experience XeSS can work really well as a substitute for anti-aliasing and tends to have a whole lot less flickering and more consistency in-game. But that's also helped by how blurry it looks; it permanently has that level of blur, so if you're fine with that I would say XeSS wins for non-DLSS cards.
Awesome job dude! I have a GPU with DLSS
Got a GTX 1650; I far prefer XeSS to FSR in CP2077. Sadly it's a lot slower in Quality mode (42 vs 48 FPS avg), but it looks so much better that I prefer to take the FPS hit.
I finished the first Watch Dogs at 30fps. If you get used to it, 40fps is very much more than playable. I would also prefer the better looks with XeSS.
GTX1650 is not so bad. I have that GPU. You just install the proper driver for that and it should be better.
I've been testing out Nvidia DLAA in Modern Warfare 2 and it ain't that bad; while the fps don't increase as much compared to DLSS, I've noticed a much lower visual impact compared to DLSS Quality. From what I've experienced with DLSS, textures in Performance mode often shimmer or have unusual shading.
For DLSS to look better, I've found that going into the Nvidia control panel and setting DLSS to upscale before downscaling removes a lot of the awful fuzzy features. While performance gains may only be 10% compared to 15%, it makes for a much better immersive experience and doesn't feel like the game is running at a lower resolution.
DLAA is machine learning anti-aliasing, not an upscaler. If you turn on DLAA you play at native resolution. DLSS is a superior upscaler, producing native or better image quality.
DLAA/DLSS are both machine learning. DLAA is for native resolution anti aliasing using tensor cores, and DLSS for upscaled resolution anti aliasing using tensor cores. DLAA is the best visual quality while DLSS is when you are going for performance.
The funny thing is how the 1st iteration of Intel's RT is better than AMD's: their cards lose relatively less performance than AMD's cards.
Also funny is how XeSS is better than FSR. The first generation of Intel's GPUs is, in terms of features, already almost as good as AMD's after so many years of development. That is sad.
I wonder if you could test DXVK (dxvk-gplasync is the best current version, due to async but without the stutters) in games across the 3 GPU manufacturers? I've heard it makes Jedi: Fallen Order a much smoother experience.
It usually helps out the CPU (by lowering draw calls), it's not really about helping out the GPU.
@@markjacobs1086 That's the exact issue Jedi: Survivor has, aside from stutters; on release the 4090 was stuck at 50% utilization because of the CPU limits
I tested it on Yakuza: Like a Dragon and it makes the game stutter significantly more than using the default DirectX. I'm using an i7 3rd gen & RX 6600, so I'm more CPU limited.
It also deeply depends on what you're playing on. On my TV, from my bed, I can't see a difference between native and Ultra Performance in Jedi: Survivor, and it took me from 30-38fps to 90-110fps at 4K max settings. However, when I put it on my monitor the artifacts become visible.
00:04 Comparison of FSR, DLSS, and XESS upscalers for visuals and performance.
02:05 DLSS provides the best upscaling performance and image quality compared to FSR and XESS.
04:05 DLSS outperforms other upscalers in terms of performance
06:09 FSR vs DLSS vs XESS: a comparison of upscalers in gaming
08:15 FSR and DLSS provide higher performance uplift compared to XESS on RTX 3080
10:31 DLSS is the best upscaler to use with Nvidia GPUs
12:52 FSR and XESS provide significant performance improvements in cyberpunk
15:00 XeSS on Intel GPUs provides similar performance to FSR, making it a better choice if available.
17:09 DLSS is the go-to upscaler and performs better than TSR and FSR.
19:01 XeSS is recommended for Intel users due to its deep learning capabilities and minimal performance impact.
20:43 Upscalers in games can vary and may have limitations based on sponsorship.
22:40 Comparison of FSR, DLSS, and XESS upscalers
It is worth mentioning that XeSS on Arc GPUs and non-Arc GPUs uses different versions.
Not only is performance different, but even image quality.
Do you still prefer HEVC, or is AV1 with the new cards good?
Thank you for pointing out these little details, as we can't see the difference when watching through YouTube
Maybe not on a phone, but def on a bigger screen. The zooms really help
@@vextakes Yes, and also the details go by so fast. Thanks again for slowing down the clips
It's a shame that such a good upscaler (DLSS) is not open source.
That's Nvidia for you, they would privatize your breathing if they could.
Nvidia makes me want AMD and Intel to win, but for sure AMD, as they're the underdog
Nothing wrong with competition. Blame AMD for the lack of it.
Uh, wut? This video actually shows why DLSS is exclusive to Nvidia. XeSS is using a software fallback for the AI upscaling and it eats a big chunk out of FPS gains. While it would be nice if DLSS was open-source, clearly it would have lackluster FPS gains without native hardware AI support.
To be fair, with how DLSS works, I doubt it would run well on other brands' GPUs, as IIRC it relies heavily on the tensor cores. There is a case to be made that with ROCm on AMD GPUs it could be used in decent capacity; however, ROCm isn't consumer-ready yet, and I believe the first consumer GPU with support is the 7900 XTX.
There would be a case to be made once AMD has a decent selection of mainstream consumer GPUs with ROCm enabled... but we aren't that far into the future yet. Once we are there, though, it would be a shame if DLSS wasn't made open source for GPUs supporting their version of "tensor cores" (ROCm)
Tell me you don't know tech without actually telling me you don't know tech. Seems all that's in your head is "oh wow FSR works on all GPUs unlike DLSS! Damn those Nvidia for gatekeeping their tech!!"
I'm really looking forward to FSR3.0.
I must say, I really love your taste when it comes to games, judging from the music used in the video at least
can you tell me the name of the song used?
@@Omelletr hollow knight - hornet, oneshot - phosphor are the only ones i caught
Are you running XeSS on an Arc GPU? The pipeline is nerfed (because of no dedicated silicon) on other GPUs, which leads to lower performance AND lower graphics quality
Did you watch the video
I've used DLSS 3 and FSR 2, and FSR 2 is very noticeable when on, especially in smoke textures in Cyberpunk
Good work
Thank you for the in-depth comparison. Lots of heat over the Starfield AMD "exclusive" thing right now. There are a lot of "FSR2 works just fine on Nvidia" and "stop complaining, it looks the same" comments. Clearly, it does not. I'm surprised that XeSS was able to salvage as much lost detail and reduce artifacts as it did compared to FSR despite using a software-only solution. This makes FSR2 seem even more outdated since AMD could have pushed some AI into FSR to better compete with DLSS. It also shows that an "open" DLSS that runs on everything would not provide very impressive FPS gains if it had to fallback to software.
I DO think that open software might work very well (if all companies would agree on one solution, so they can support it on their hardware too); the problem here is simply that AMD alone doesn't seem to be capable of, or doesn't care about, producing one that is on par with its competition.
That's the case for productivity work, for their video encoders and also for their upscalers. It's a bad look for AMD if Intel comes in new and kicks the ass (software-wise) of a company that has done dedicated GPU stuff for decades.
@@CallMeTeci AMD users waited FOREVER for FSR2. FSR1 was as garbage as DLSS 1.0, just a smudge-fest. Now that FSR2 feels old, FSR3 is... where? AMD keeps hyping it up without so much as a screenshot. If FSR is supposed to be the "good enough" open solution, it needs to actually be good enough.
@@Aurummorituri It almost looks like AMD had no intention of developing FSR. But after Nvidia started to implement DLSS 3 they hastily made a presentation about how good FSR 3 is, and now they are developing it.
@@Ddofik If it is just frame generation I am not super excited. There needs to be some AI DL goodness added to the base FSR to actually improve image reconstruction before worrying about fake frames. Those fake frames in DLSS3 are also AI generated. How is AMD going to create theirs? Probably why it is taking so long. You can’t create convincing fake frames with just temporal and spatial data.
It’s nice to get in the weeds of it. And I do in my games graphic settings. But at regular speed I can’t tell between all of these most of the time. Maybe a little bit barely.
I wouldn't say it's up in the air; HUB showed that DLSS is the best overall. All of them have come a long way and it's nice to see them get better and better with each update.
Considering that on AMD cards you can use FSR on the Balanced setting for the same performance as XeSS on Performance, that should've been one of the comparisons IMO. The Witcher 3 has something clearly wrong in the FSR/XeSS implementation with the High preset: it is just not possible to get that kind of performance gain, as the base resolution they should be rendering at has a hard time running that well. On the Ultra settings this does not seem to be the case.
Xess is a miracle when it comes to cyberpunk on amd. Fsr is weird, blurry, and it sometimes removes some finer details. It’s bad, especially on performance mode. Xess looks way better. It’s a lot more clear, looks amazing, and I didn’t notice any loss of detail. Even on performance mode, it still looked pretty good.
Great video
@vextakes I like the video; however, versions are quite important in these comparisons
If XeSS and FSR are available on all GPUs,
shouldn't one use the 'native' upscaler?
Like XeSS for Intel GPUs, FSR for AMD GPUs and DLSS for Nvidia GPUs?
I mean, if you think about it, DLSS does not support AMD cards, right? But FSR can be run on RTX cards, so tell me who's blocking whom?
The shimmering is most likely due to reflections. Idk if you had ray tracing enabled, but depending on the reflection data, and if the exposure setting it's tuned to is lower than the in-game exposure ratio for eye adaptation, it will cause flickering silver spots like that anywhere roughness determines a reflection is meant to appear. The upscaler can only do so much to filter this out, considering it's probably not aware it's even an issue, let alone whether CD Projekt cares about it being a problem. Also don't forget that CD Projekt never fixed the multi-threading in Cyberpunk 2077; you get about a 15-30fps gain if you fix it in the exe file.
Ray tracing would have severely crippled performance in Cyberpunk 2077 so I doubt it was turned on. When rendering graphics images to the screen you generally have to "dither" pixels to get fine details to pan across the screen without shimmering. By "dithering" I mean that you essentially render a larger picture and then down sample it so that some pixels are averaging data from 2 or more pixels in the original scene. Modern graphics drivers do this with fonts and it makes 6pt and smaller fonts highly legible. Using dithering the luminosity of an object will be equal when it is centered in the middle of a pixel vs. when it is halfway between two pixels. I suspect that the better upscalers are somehow decompositing the picture into constituent graphics shapes (iron bars for when there is a fence, checkerboards for when a building's windows are far away) and then dithering them back into the larger picture.
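The "render a larger picture and then down-sample" idea in that reply is plain supersampling with a box filter. A toy C++ sketch, assuming an 8-bit grayscale image with even dimensions: each output pixel averages a 2x2 block of source pixels, so sub-pixel detail (thin fence bars, distant windows) contributes fractional coverage instead of popping in and out as it moves.

```cpp
#include <cstdint>
#include <vector>

// 2x2 box-filter downsample of a w x h grayscale image (w, h even).
std::vector<uint8_t> downsample2x(const std::vector<uint8_t>& src,
                                  int w, int h) {
    std::vector<uint8_t> dst((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; ++y)
        for (int x = 0; x < w / 2; ++x) {
            int sum = src[(2 * y) * w + (2 * x)]
                    + src[(2 * y) * w + (2 * x + 1)]
                    + src[(2 * y + 1) * w + (2 * x)]
                    + src[(2 * y + 1) * w + (2 * x + 1)];
            dst[y * (w / 2) + x] = uint8_t(sum / 4); // average 4 samples
        }
    return dst;
}
```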
You could also test the FidelityFX upscaling in the AMD drivers
I think there is a little bit of an issue with the methodology, because DLSS has an outright advantage running on a 3080, i.e. it is hardware-accelerated upscaling. So it becomes kind of apples and oranges.
I understand that this is probably the only way to test it, as you must have an Nvidia graphics card to run all these comparisons at once, but then it becomes a matter of comparing "software" upscaling and "hardware" upscaling. Meaning it is less about how good the upscaler is and more about software vs. hardware, and to the surprise of no one, hardware wins.
That said, in my experience FSR works better on newer AMD GPUs (6000 and 7000 series). So not only does FSR work on all GPUs, not only will FSR3 work on ALL GAMES using DX11 and DX12, but it also works better on its native hardware, which again isn't really that surprising.
My comparison comes from using DLSS on a 3080 Ti and FSR 2 on an RX 6900 XT. At native these cards are almost identical; the 3080 Ti is ~1-2% faster, but because I have the FE it just sucks: it is hot and loud and needs a lot of power, so 1-2% better FPS isn't really that important to me, compared to the RX 6900 XT, which is a Strix OC LC that sips power and is cool and quiet... but anyway. When it comes to performance in the games I play at 5K (I have a 49" 240Hz Samsung Odyssey screen), FSR works better on AMD than DLSS works on Nvidia. On average I am getting 60% more FPS with AMD + FSR and 48% more FPS with Nvidia + DLSS. Yet if I run FSR on Nvidia, I get ~44% more FPS and worse image quality as well. That is why I am saying your methodology of only using Nvidia is a little bit flawed, as it gives an advantage to Nvidia.
I also didn't mention the settings I am using, because frankly it is almost irrelevant with FSR: most of the FPS improvement comes from simply turning it on. Between no FSR and FSR Quality I already get like a 58% FPS boost, and then going all the way to Performance gains only an extra 2%. By the way, this was mostly tested on Escape From Tarkov, which is a real bitch to test because it is hard to control the environment, but I got similar results in CP2077, The Witcher, SCUM and a few other games I own. It is quite different for DLSS though, which seems to make the game look gradually worse while improving FPS gradually as well, whereas with FSR you really don't get much extra performance past Quality, and even visually it is a minor change if you keep it on Quality.
Anyway, my main point and conclusion: DLSS is the best if you have a 3000 or 4000 series Nvidia card, ideally an xx80+ version that can actually make the most of DLSS. If you have ANYTHING else, then FSR is the answer: best GPU support, best game support and a very decent performance uplift. I have even used a very old AMD R9 Fury X in modern games that have FSR available and gotten completely playable FPS. So DLSS is only relevant for a relatively small number of people, I would argue maybe less than 10% of gamers, whereas FSR is relevant for 90%. And it does a better job in 60% of the cases either way.
What it comes down to in the end, and why these technologies exist, is AMD and Nvidia battling over card sales. So the question is what is better: get a 4080 or a 7900 XTX? One will let you play with DLSS, the other with FSR. Sadly, I don't have a definitive answer here, but what I know is that if FSR didn't exist then the answer would be the 4080 every time. Now that AMD has FSR, suddenly the 7900 XTX is basically on an equal playing field. That was the goal and it was achieved. A positive side effect is that everyone choosing any other graphics card also benefits from FSR.
I have an RTX 2050 in my laptop which supports DLSS and runs great with it, so it's not just for higher-end graphics cards. That said, DLSS 3.0 is more improved, and maybe it wasn't out at the time of this video.
I agree with most of the video besides the palm trees point. Maybe it's because of the recording or something else, but the leaves and hairs of the palm trees don't look better on XeSS: the tips of the leaves have a huge amount of flickering compared to FSR and native. They are sharper than FSR, but the flickering is a huge throw-off.
I know I am 9 days late, but here is something to keep in mind, because I did not catch it in the video (sorry if I simply missed it):
XeSS is still at its 1st gen.
FSR has already been on the market for like 2 years (at version 2.2) and DLSS for 4 years (at DLSS 3.0 now).
With that in mind, Intel is getting really amazing scores at this point.
Imagine if XeSS had been with us since like 2020 or 2021...
intel have the best name backwards
period
That's why their A770 LE card has more neon lights than the red light district in CP2077. 🤣🤣
Issues with all scalers: devs don't explain the usage of each sharpening pass, and don't allow us to use different quality presets in the trained models for each AI upscaler. And then there's the fact that DLSS 3 can run frame generation at the same time as DLAA, but Cyberpunk has this disabled. It's safe to assume it's due to the Overdrive setting: Nvidia wants DLSS to look good, and FG+DLAA is another way for them to sell the RTX 5k series cards.
What's DLAA? next version of DLSS?
all positive comments aside.....
THAT HALLOWNEST MUSIC THO. Team Cherry you made a masterpiece that runs at 1080p at high on even igpus and I'm here for it.
Native 1440P 125FPS cap. I don't have to deal with whether DLSS degrades the image, improves it or suffers from ghosting.
Please make a part two of this when FSR3 comes out
Just wondering, when you talk FSR, which FSR did you use: 1, 2 or 3?
Well, what I understood was that the upscaling technologies have problems maintaining quality, but they're not that far from what they should be; it's a very small difference for 50% more performance. Anyway, FSR 3 still needs to be released, as does Nvidia's DLSS 3.5, which should improve the quality of these technologies and further increase performance.
I can't use DLSS in Cyberpunk; it's greyed out for some reason. My 3050 had no problem enabling DLSS but my 4070 can't. I've already updated drivers, reinstalled the game, and HAGS is enabled. No problem in other games, only Cyberpunk. How do I fix this?!
For XeSS AI (XMX) mode you need an Intel Arc GPU;
if not, the quality is the same or less compared with FSR
To get more accuracy on the difference between DLSS and FSR, try them in Dying Light 2.
How do u use XeSS? Do u copy all the files to the game folder, or what?
FSR also has massive ghosting in some games. The first time I used it in RDR2 I had massive ghosting, and textures became muddy af just by rotating the camera
On my XTX, I only saw a 10-15% bump with FSR over XESS (I had RT on). But XESS just looks much better at 1440p, imo.
Which TSR quality did you use vex?
So how is this going to work with my 4090 on Starfield? Can I still use FSR?
AMD FSR is just raw power.
DLSS is also really really good
Dude, DLSS really is amazing. I compared it to FSR too and thought it beat it out of the water. XeSS was a close second. But the problem is it seems Nvidia likes cutting their cards back now while charging more, banking on DLSS making the generational jump and not the card itself. A lot of people run 1080p or don't want to upscale, and games don't always support DLSS, so Nvidia is really dropping the ball. With DLSS AND a stronger GPU, Nvidia could have given us so much this generation.
Was this with frame gen (DLSS 3) and FSR 2.0?
*edit
Guess it's just DLSS 2, cos the 3080 can't use DLSS 3; if using an RTX 3080 then I'm also guessing FSR 2 for Radeon GPUs... either way good vid :)
FSR 3 (which comes with their version of frame gen) is not released yet at all; it can't be tested as it's not there yet. Every FSR thing you see is FSR 2 or one of its predecessors.
If you're playing Warzone 2, use FSR 2.1 on Performance. Gives clear quality and an fps boost. Better than DLSS imo
Will TSR improve enough in the future to rival DLSS? Since if it does, then it just makes sense to get an AMD GPU and use TSR rather than using Nvidia and DLSS
Yeah, I agree with you, dlss is the best
Especially for a handheld PC, because DLSS at 360p demolishes FSR at 360p
When testing TSR in Fortnite, did you use TSR low, medium, high or epic? Just wondering if that makes a difference.
Used medium. I would actually have to test that, but I can't visually see a major difference between them. They may also have a greater performance hit.
Yeah, it would be interesting to know, if the visuals change in any significant way, and if the higher settings are worth the performance hit.
Hi, sorry to butt in, but I've been doing extensive testing between TSR and DLSS on my 5800X and RTX 3070 computer to find that stable 60fps spot with good image quality. Between Low and Ultra TSR, it's better to use High or Ultra if rendering from a lower resolution to 1080p, while at 4K Performance (50% 3D resolution) Low TSR should do a good job without a big performance hit. That also depends on PC specs, since TSR seems to be more CPU-bound than DLSS, which is entirely GPU-bound; TSR may even perform worse if its settings are too high. Hope this helps.
@@cesar9485 Very interesting, thanks for sharing.
Hardware vs software for almost a decade. At least you're on point.
thanks
Wish these upscalers could be universal and let people decide. DLSS is the best; XeSS (I have an Arc A770) is really good and not that far from DLSS; and FSR is a bit of a mixed bag, but it works on every piece of hardware, which is its main strength.
I have done detailed testing with a 4080 and Cyberpunk. DLSS 1st, FSR 2nd, XeSS last.
Can't wait for FSR 3, very interesting 😂
Evil West Lets You Enable CAS and FSR, I Still Don't Know Or See The Benefit Of Using Both But Game Is Crisp As AF @4k With Or W/o Upscaling.
Intel will start using CPU chips in its graphics cards, or something similar, to get performance close to the high-end cards for cheaper
Please elaborate on how IGT works along with all these options...?
“Hey, me in another shirt” had me dying