Join *r/MotionClarity* to discuss this issue & to find workarounds: www.reddit.com/r/MotionClarity/ & also watch my new video on the subject: ua-cam.com/video/LiUvA3cTdhg/v-deo.html And a few things to note - I didn't get everything correct, made some minor errors, I noticed them as recording but kept going, I did this video in one take. (At one point I called something dithered for example, but I was trying to say it looks dithered, because thin geometry can look that way sometimes) - I did not address everything I wanted, such as going into more detail on solutions and also providing more solutions, because the video was getting long. Another video will come out regarding that - From 11:38 onwards I do provide useful information for those curious, although it was mostly meant for developers. The comparison part of the video is all you need to watch if you want to see examples of the problem with brief explanations. - Sorry for the length of the video and if their was any word fumbling. I know it hurts viewer retention but I wanted this video to cover everything, for everyone, those who know nothing about it, those who already know a lot, devs, gamers, etc. But let's hope some UA-camrs who are better at captivating people and making entertaining videos can discuss this as well.
An important note is that TAA is a lot better when it uses a history buffer that is 200% screen resolution. 100% will immediately leak to other pixels, while 200% will leak to sub pixels first and blur less in the final result. (Edit: you can do it right away with 4x DSR (0% smoothness) and a 50% input upscaler, like DLSS performance without sharpening). Unreal engine has the console command r.temporalaa.historyscreenpercentage 200 for this, but it's only rarely used in games. Not even cinematic TAA utilizes it by default. Epic and cinematic TSR use a similar console command (r.tsr.history.screenpercentage 200), but they are very expensive: about 1.7 ms at 1080p on my 3070. It's not perfect though. 1080p will look like 960p in motion, so the blur is minimized and probably worth the greatly improved stability, but smearing cannot be avoided on changing surface colors and (semi) transparency, including grids and foliage with lots of detail. Either the foreground or the background will smear in motion, for at least one frame. Other console commands I use are r.tsr.history.samplecount 8 (to make the TSR less agressive and minimize blur even further) and r.tsr.shadingrejection.samplecount 0 (to minimize the said smearing due to parallax disocclusion). To avoid smearing due to vertex animation, you need to enable 'output velocities due to vertex deformation' in the project settings and use the console command r.BasePassForceOutputsVelocity 1 in unreal engine 4. This automatically corrects time based vertex animation and skeletal meshes. For other things like interactions and texture UV deformation, you need a previous frame switch to tell the compiler the difference between the current and previous frame. This goes into the world position offset, since motion vectors are calculated per vertex and interpolated to one value per pixel Sorry for my long comment. It's not necessary to understand all the technical stuff, but TAA is pretty involved and there are a lot of misconceptions about it. Even among game devs
@@normaalewoon6740 I'm aware of that console command, when people say their TAA looks bad in UE4 games that's one of the first commands I recommend they use to mitigate blur
@@normaalewoon6740 Also can I get your opinion of the CVARs "r.TSR.Subpixel.DepthMaxAge" & "r.TSR.Subpixel.Method"? I've tested them myself, I want to know what you use/think. Also r.TSR.History.GrandReprojection was a good thing but I think it's been removed in newer versions of UE5
@@HybredI actually did not see a difference when I tested them, I think you know more than I. The CVARs I mentioned did the trick. I go for the weakest settings that do the job, at the highest quality I can afford to keep the motion clarity as good as possible. All I need is a consistent 85 fps for backlight strobing, v-sync + an fps cap, motion blur during fast camera rotation only, high quality upscaling from 100 to 200% screen resolution and reprojection disabled on transparent surfaces, if that provides a clearer result
This is exactly what has been driving me insane with modern games and why I often even avoid them, I just want a crisp experience, I already have bad eyesight, if I wanted this effect I would just take off my glasses. I really dislike blur of any sort in games.
@@Odyssey636DLSS still uses temporal anti aliasing.What are you talking about mate? Edit: It can look better than native TAA SOMETIMES depending on how bad the TAA implementation is. I am on AMD and what I do if I can is set my display resolution to 4k and play it on my 1080p screen. This is better than native 1080p TAA
@@west5385 i try that, but my pc cant rull all games at 4k. My old rx 5600 xt couldn't run Jedi Fallen Order at 4k60, but the blur was manageable at a bit lower resolution, while keeping 60fps. Same with red dead 2, i have to use 150% resolution scaling, but that drops my fps to like 40
Wow this is what I mean! This 4k games looking blurry has been bothering for so long, I can't think why anyone think this was a good idea. It's literally looking at a picture with bad eyes, I don't care if this is creator content or not but just get rid of it!
Yes but it's getting worse and worse as the more it's used, the less options/control we get over it, and the worse it looks when we disable it. No one cared when it was optional. Much like DLSS/FSR is being relied on for performance now, TAA is the same problem before that was an issue.
@@kylerclarke2689 Technically even longer than that since the first prototype of temporal AA was found in Crysis 2 in 2011. Far Cry 4's only blurry AA is NVIDIA's TXAA. And that one was indeed awful.
All that extra horsepower to run the game just for it to look worse than a game from a decade ago, absolute dogshit visual clarity, not to mention the ghosting that already exit on non OLED monitors...
@@AverageDoomer69 Exactly. You would atleast think they optimize the games for consoles and that those games don't require much horsepower... nope, go buy an RTX 4090
Dude what. I’ve been wondering why the hell every game has been super blurry for years. I didn’t even think TAA was to blame. Thank you for bringing this to the front of my mind.
TSR is the new default in UE5 because upscaling from a lower resolution gets you better performance, especially with lumen/nanite being hard on your GPU. I'm sure it's not helping a lot with the blurring if you're not tuning it.
Yup, the information is gone. And you can't get it back. You can use sophisticated algorithms to guess information to add back into the image, but it is gone. Sharpening a blurry image doesn't help with aliasing or flicker, stuff still looks muddy, but also has way to hard edge contrast.
spot on - not to mention (idk if its the same in modern tv's) but back in the day there was a literal "sharpness" option on your TV. So in theory today's games would be doubly sharpened through the game itself then the TV.
One thing I REALLY dislike is that with the introduction of TAA and DLAA, devs have started removing filtering and other things on foliage and trees it seems, so a lot of games are really blurry with TAA, but look like a jagged mess without it, even if you turn on other AA options like SMAA or MSAA.
@@Vifnis palworld is unoptimised garbage and it's AA settings literally have no impact on performance. No matter what AA you choose, it still looks like shit, and you're better off turning it off and using reshade post processing AA and fine tuning it to make it look half decent. But palworld has much bigger issues than AA, one of them being subpixel geometry. Because it's in effect just an asset flip, with bough and free assets thrown haphazardly into a "game", there's basically no optimization, and thusly pretty much no LOD work other than what already came with the assets. If there's no or not enough LOD, this leads to polygons in distance that are smaller than the actual pixels they're being rendered into, and this causes massive performance drops (as the GPU has to "work out" how to render something smaller than a pixel, a few million times each frame) and of course decreased visual fidelity, as geometric accuracy is lost, leading to inaccurate pixel fill.
One thing that annoys me is that because it destroys detail, you’re still rendering all that detail, then smoothing it out. So you’re actually doing extra (albeit very easy) processing for less detail
@@GewelReal "TAA is one of the performance killers in modern games" I wonder if this is a feature for the future and not meant for present day.... might have to dig thru arXiv for this one... what I'm saying is, on a 1080p display it might be HUGE loss in detail, but for those using an 8K Gaming set-up (lmfao) it probably adds a lot of performance for the processing required... tbh FXAA just looks like crap and TAA easily adds a realism to games I play personally, BUT PalWorld coming out I recntly learned of TSR (Temporal Super Resolution, istelf tbh is kinda simplier and safer-and we have been doing it for years on desktops)
@@Vifnis why not just use smth like FSR or DLSS? It does both antialiasing and FPS boost. Theres no way you'll ever get MSAA8x-like image quality while keeping acceptable framerate. Miracles dont happen, you gotta pay with something. TAA is an amazing technology that allows you to completely eliminate pixelating at a very low price. AI upscaling will replace it completely however. There is no need to have both.
What annoys me even more is when games don't even have an option to disable it. Days Gone doesn't even have any anti aliasing settings. You have to mess with ini files but that's also removes HUD prompts
i got so used to the blurry look of modern games that whenever i go back to older games, it just feels like i entered a new dimension of visual clarity and crispiness. also, you talked about how TAA gets rid of specular highlights, and well, nfs heat had so much firefly artifacts from it's specular materials that you were pretty much forced to play with TAA on. it reaally sucked because i preferred the clearer image, but the visual glitches were too distracting
Haha, man I thought it was just me. A majority of these games look blurry and that's why I always cranked up sharpness even though that's frowned upon.
Gaming went through a really weird direction. I wish more games would go for compelling visuals like Dishonored instead of trying to make everything look "realistic".
hell on the topic of NFS, NFS Most Wanted (2005) on the Xbox 360 still looks great to this day because while they put a lot of detail into the game, especially for a 360 launch title, they didn't go overboard in trying to make the game look realistic and aging absolutely horribly and took a stylized approach to the game, having specular highlights on car paint be bigger than it'd be in real life, exaggerated particle effects, and the (in)famous color grading and bleach bypass filter
Interesting, since I only operate 4k monitors the whole thing is something I never considered because in 4k you dont need ANY AA at all. I haven't found a game where it would be necessary. I cant see the difference honestly .
@@Vanadium stop parroting crap from 10 years ago to self assure your sense of snobbery. That objectively isn't true for all games because not every game renders at native screen res first and foremost, and secondly there are shaders and post process effects that absolutely need AA to render in a way that is visually cohesive.
You see, to help with the temporal blurring issue what we need now is upscaling artifatcs, and AI generated frames for more artifacting, but then we add a sharpening filter that deepfries the image and it all goes back to looking good
Or you know just use dlss and frame gen instead of amds trash and not have artifacts at all lol. There is a reason software based frame generation was never made by nivida. You can't do it right. AMD has shown the results of software vs hardware based upscaling.
@@donkeymoo1581 FSR has improved significantly, as well as being free for literally everyone to use. I think you should look moreso at the reason for its creation versus the obviously superior but less accessible DLSS. Competition isn’t always about goin for the same goal, it can be about providing for different markets that don’t have access to that technology yet.
@@LieftheDragon Stop coping lol. Even intel xess since launch has been far superior to fsr and intel can't do anything right anymore. AMD fanboys argue "they don't need upscaling" now every game uses upscaling as a baseline. AMD fanboys argue RT performance doesn't matter. Now every other big game has RT baked into the graphics so you can't even turn it off. There is a reason AMD are pulling out of the mid to high range of gpus. They have nothing. They have no features they have no real world performance they are just going full focus on cpus as they should. Focus your money where your are leading the race. Don't keep chasing a market where even intel gpus are beating you in performance and features lol
@ it’s not cope dude it’s literally as simple as a google search away to see these things. Battlemage is not more powerful than AMD gpus yet, it gets up past the 4060. Which is good, I’m all for intel, but you’re making baseless claims because those cards haven’t even come out yet. Also, XESS is indeed quite good. I never said it wasn’t. I simply said that FSR still works very well now, and it has its purpose that doesn’t require buying new gen hardware to use, and that’s COOL. I’m just as pissed as the next guy that games are using sub-res textures and anti aliasing/upscaling to make the games look good. I was just pointing out the issue.
Whenever I play a newer game, I always check the graphic settings to disable things that intentionally blur parts of the image. Our eyes blur things, like rapid motion and distant things, naturally. Trying to emulate something that our eyes do automatically makes the image look bad at best, and headache inducing at worst.
Your eyes can't blur a screen stopped in place like motion blur, people hate it because it's not well implemented, but just compare ubisoft's new star wars game with respawn's star wars game and you can instantly see how not having blut makes movement feel robotic, now they can do per object blur or screen motion blur which is usually bad because it just blurs everything,but can't blame people that instantly turn it off from ptsd because it's trash in most games
I was sixteen when Fallout 4 came out and it was the first experience I ever had with TAA. I remember it to this day if that says anything. I was simply mind blown that nobody else could notice it. All my friends thought I was insane!! At the time, the game hurt my eyes so bad that I couldn't play it. Little did I know, the entire industry would adopt this actual game ruining trend.
That's how I felt about LCDs. Still do. I can't wait to dance on their grave when the technology is finally burried. CRTs illuminated pixels very briefly, just as the electron beam passed by. This meant perfect motion clarity as long as your framerate matched your refresh rate. On a constant illumination monitor like a typical LCD the image is stationary while your eyes move following something on the screen. This is caused persistence blur. If you take a 100 Hz 1440p monitor and focus on an object moving across the screen in 1 second it will move 26 pixels/frame. This means that your eyes move 26 pixels while the object is stationary, each frame. This means 26 pixels of linear blur. It's going to look like shit. It's not going to match the motion clarity of a CRT before it gets into the 1000 Hz range and even then there may just inherently be too much ghosting/inverse ghosting and other artifacts with LCDs to ever reasonably do that. At 500 Hz they don't really keep up. OLED can do this; the pixels switch very quickly and if you have the bandwidth you can crank out 1000 crisp frames per second without any ghosting or overdrive artifacts. MicroLED can do very high brightness for brief durations. Laser projectors are a bit of a wild card and might be able to do very high refresh rates.
I remember having headaches from TAA in Enderal (a Skyrim total conversion mod). Solution was to use both TAA and 4k, DSR option enabled in nvidia panel.
first time I boot up Mass Effect Andromeda, I was shocked to see how blurry it is... ruin the vibe of space scifi from the initial trilogy (not to mention that the game is also bad). to my surprise, here in 2024, game dev don't make object more clear, they opted to optimize TAA in a form of upscaling. The bastard Nvidia is also suspect of this, remember when they tease DLSS as optional, more like premium option? Now all games requires it to run smoothly.
@@PrefoX the thing is, a game from decade ago may benefit from AA off. And early PS4 titles such as Uncharted 4 or TLOU remastered doesn't even use TAA on launch. Those game looks absolutely pristine. And then games like RDR2 came along with advanced TAA implementation that became a staple standard of modern graphics, BUT somehow other dev can't keep up with it without sacrificing performance because of poor optimization. I recently played Alan Wake 2, it has DLSS on auto, and implement DLAA... Looks blurry as hell except the character faces. Any object located more than 5 feet of your character reminds me of taking off my glasses, not to mention how hard it is to even have 60fps on mid range hardware.
This explains a few things. Ever since upgrading my computer and enabling TAA in games I've been noticing games are suddenly really blurry in motion. Was really confused as higher specs usually means a clearer image, or so you'd think.
Yeah when I use the anti aliasing in mhw they always make that game looks so shit that u needed to search in the community the perfect right settings on gpu to make that game looks sharp
this is one of the reasons why 1080p feels lackluster in modern games. playing a game from 10 years ago at 1080p looks crisp in a way that can be only achieved with 1440p or 2160p in modern games.
@@TheNerd this is why PC gamers are such hypocrites. You say old gens hold gaming behind, and then you gatekeep technology that most people can't afford. "Oh old monitor? Why don't you have 2024 8k with $1000 GPU??? Bad graphics is your fault."
I always prefer options. More of those options is always better. I've always disliked motion blur and TAA blur. I like to see the detail and the cost of bad looking hair and jagged lines everywhere is worth it to me.
Yes, options. Devs just don't like to put effort(money) into it. But they really should. Players will choose for themselves what is preferable to them. I always disable motion blur. But sometimes I have to use some kind of AA. Glimmering in some games may be a huge issue. I tried to play first Shadow Warrior and it was just really bad. Really good AA is when you increase the resolution by 4x and then scale it down, which is really costing. TAA wins here 'cause it a much more "light". But blurry. And while sharpening may not be the best, it's better than without it. I play a lot of warframe and I've tried different settings, TAA with full sharpness is more nice to my eyes. P.S I've got some "easy to be tired" eyes. I can play some games 14 hours a day and be ok. But if it has glimmering, excessive blur, 30 fps(not 60), stuttering or something - 1 hour of it can destroy my will to play a game with a severe headache and eyestrain. So for me it's often not the choice "it looks prettier". It's "I won't die with migraine with these settings". And TAA helps. If my videocard is not good enough for better AA AND if it has sharpening. But I've seen only warframe to implement it. Other times I had to turn it on in nvidia drivers and it was somewhat ok. Really, if devs just try they can make blur really nice or make game not glimmer so much without it. P.P.S. I think that gta V has really good blur in games. I hate depth of field, motion blur etc, but in this game(and only) I even chose to to set it on.
The main problem is that developers aren’t at all in love with their games, they have corporate assholes botching their creations for extra profit which in return makes the devs say fuck it it’s not mine now anyways let’s just push this thing out and get paid. It’s not like back in the day when every person working at the company had love for what they were doing
You can break aliasing down into three general sources. 1. texture aliasing, this was solved completely in the 90's with mip mapping and anisotropic filtering so you may not even be aware it was a thing. on any kind of modern graphics hardware plain textures will never alias. 2. geometry aliasing, since polygons are mathemathical vector data, they will always be undersampled on any kind of digital display, msaa is the perfect solution for this type of aliasing, but even simple filters like smaa can have a high success rate with this type of aliasing, there are even post effects specifically optimized for geometry aliasing that aren't applied to the whole image such as SRAA and GPAA. 3. shader aliasing, this covers a lot of possible effects like normal mapping, shadow edges, rim lighting, HDR (not monitor), refractions and reflections, etc. there is not a good universal solution for every single one, but techniques like pre-resolve tone mapping, LEAN/CLEAN mapping, toksvig mapping etc. can effectively clean up the most common ones, and if you have some special requirement you can always do in-shader supersampling or custom filtering. TAA is attractive because it is a simple "drop-in" effect that requires little to no effort, but if you are simply careful, all aliasing can be adressed with not much more effort. If you simply solve each source of aliasing using well known techniques you will be left with a clean, sharp image that does not require agressive post processing.
Hey thanks for this comment. I have a developer resource on how to combat aliasing issues in games and improve AA quality. I hope you can give it a look and possibly contribute any sort of tips you might have that you think is missing: www.reddit.com/r/MotionClarity/s/S9t1LgQwgz
@@heksogen4788 probably not even the devs, but the companies themselfs most of the time, why would they spend more money on finding proper solutions if there is just one single drop in solution most people wont even notice is actually bad. its not like we were expecting much from those companies at this point since they have been milking the same games over and over by basicly releasing dlc's or updates as new games whilst somehow still making the new games more expensive and worse then thier predecesor.
@@heksogen4788Unfortunately, companies are made by time and deadlines. It's like how games come out buggy. If a game isn't being released, they're losing money. Made by time, not looks. If something is easy and cheap? Do it. If it takes hours to optimize, ignore it, get it out the door and fix it later. It's honestly disappointing as I feel a game should be as optimized as possible like how N64, GameCube, PS2 and everything had limitations people worked around to get the best out of their system like Banjo and Donkey Kong Country. Amazing games for their times pushing the system to the limits. Now? Probably some lag in that new triple A game that's pushing limits for the wrong reasons.
I was noticing these modern games looking softer than they should. Thank you for this video!! I perfer a sharp imagine but i see why TAA exists because these consoles cant handle native 4k in high frame rate modes so you would see alot of jaggies with it off but that might actually be better. TAA gets rid of so much detail that it makes it look like we are going backwards in graphics evolution
@sergiovinicius2221 PS5 uses it alot but because games run at higher resolution than ps4 the muddyness will be less obvious. But games like marvel spiderman 2 in the visaul mode looks stunning but the performance mode looks more blurry. Performance modes in games make games look more muddy. I haven't been playing much games these past few years but GTA 5 enhanced does have that muddy look in performance RT mode but zero jaggies. But it's so nice to play gta 5 at 60fps and game still looks good despite it using TAA. TAA negative effects are less pronounced at higher resolution
The higher the resolution, the less noticeable jaggies are thankfully. When I switched to 4K years ago, if a game didn't support real (non-post-processed) anti-aliasing like MSAA/SGSSAA or good old-fashioned brute force Super Sampling, I just turned AA off. TAA/FXAA just make everything look way too blurry. You hardly notice the jaggies, even now when I switched from 27" 4K monitor to playing on a 4K 65" OLED. Sure, it's a little more noticeable using the OLED, but that's because I only sit about 6 feet away, which is pretty close. It sucks though that if I wanna play my Series X or PS5 instead of PC, or are playing a shit console port on PC, that you can't usually turn AA off.
I think 4K was introduced to the general public much too soon. Displays can output to that resolution, but even for simple video streaming many people don't have adequate bandwidth, and for gaming the cards need radiators that take up a fraction of the case.
2 things that usually come with that in the same package: - motion blur (although, that u can usually turn off thankfully) - ultra low unchangeable fov that literally makes my eyes hurt
A 70 degree field of view is the standard. Most people say 90 degrees is the sweet spot. This is the equivalent of standing 2 meters away from a large window and taking one step forward, sure you can see a bit more of what's outside but it's not earth shattering. High FOV looks ridiculous, like you're a goat with eyes on the side of your head. How is that the less sickness inducing version for you? 😂 I'm not trying to be a jerk, I've just got a good nose for poor excuses. MOST modern games have a generous enough FOV or the ability to change it. So what's the real issue mate? 😂
Thank you for your work. I wish more people would be aware of developer's annoying abuse of TAA (using pixelated transparent assert assets and then blurring the whose whole image with excess I excessive TAA for us to not see). I'm grateful both for this video and any mod you worked on. Anyone combating this insanity deserves a medal. TAA itself has benefits, but developers rely on it completely for the game to be even tolerable, the worst case being perhaps RDR2, where it's unplayable with no TAA even at higher resolutions (I tested 2k + MSAA + SMAA). Another issue is that they make TAA too strong, but perhaps the technology itself is bad and should vbe replaced as it would ruin textures in any case. Third issue and a very big one is - Using TAA in upscaling comparison materials to cheat and exaggerate the result. Upscaling uses a TAA lacking image to create a bigger resolution one. Then they compare it to a non upscaled lower resolution image with TAA ENABLED ramvbling that upscaling has added back details magically as the smaller resolution image with TAA lacks them. Yet it is a silly lie, the main difference is that TAA blurs a lot of details and a lower resolution image has them hidden, whereas the higher resolution AI upscaled image has no such issue due to upscaling a non TAA version of a lower resolution image (so TAA haven't ruined all textre details yet). Really dishonest marketing of what upscaling can actually do.
For those wondering why this is so prevalent: This is not because because developers are lazy, but more so due to the fact that most modern engines use deferred rendering, whereas most engines prior used forward rendering. Deferred rendering supports more lights in a given scene which are also more accurate, allowing for complex scenes to be rendered relatively quickly. However it does not work well with traditional anti-aliasing methods because it apparently messes up the final render. The lighting of modern gaming is truly impressive, which is the main strength that deferred rendering brings to the table. It just means they more or less have to rely on TAA to fix the jaggies. Forward rendering can achieve similar levels of lighting by baking lightmaps and such, but that can increase development time, especially for complex scenes, and you will reduce the ability to have dynamic light interactions. Developers are not necessarily being lazy, they are just trading off one area of quality for another. I do hate TAA though. The Godot engine uses "forward+" rendering which apparently uses some deferred rendering techniques along with a "clustered lighting" to deal with complex lighting, but is still capable of using MSAA. Might be worth looking into for anyone who is interested.
Its still a developer laziness problem, because theirs deferred rendered games with thin geometry that either have decent TAA or good non-temporal options, because they're not using generic engine defaults and they actually tweak it and analyze the image in motion.
For me, I mostly solved the issue by using Dynamic Super Sampling (DSR Factors) in Nvidia control panel @ 1.5x resolution and 50% smoothness, and then combining that with DLSS Quality. It then doesn't cost much FPS, but it makes a MASSIVE difference in RDR2 specifically where I can then still have TAA on at a low setting. That 1.5x is because I'm already on 3440x1440. If you use 16:9 @ 1080p, then increase the DSR factor further.
😂 And your PC is nice and stable. Comfortable temps and everything is just coasting along?? Not a chance if you also have more than 70 frames. I don't know why folks like you try to use cope for RDR2's terrible TAA. And the whole thing about putting it on low, doesn't ease the issue. It introduces more. DRS is great. But then you said DLSS. Both of them together in RDR2 produces ghosting. And trees look terrible at night. DLSS in RDR2 is bad. No amount of cope can fix that@@elmhurstenglish5938
As a solo game developer, I find it funny if some big studio makes a game without offering an option to use your preferred anti aliasing or switch it completely off.
I really hate it when I have to edit game files or find a modder that already did it; I wanna play the game but my pc is shit and then I can't disable it... Bruh
apparently in Unreal Engine , epic have made many rendering features effectively tied to TAA so that it needs to be enabled for some things to work. I heard once that some studios rely on it for fast transparency, as they use dithering and rely on TAA to blur the dithering together
In big game, targeted at casual players. These stuff makes no sense to them and doesn't really matter. Kinda why any choice are less the more mainstream things is. Take Minecraft for example. Their devs seems like doing nothing. Even if they do it's like decades since the game released but nothing much changed. While stuff like terraria who targets semi niche audience has all these crazy things on them every .1 update.
The problem really started when the Deferred Rendering started getting more popular, towards the end of the 00s. While DR provides WAY faster lighting effects and other great optimizations compared to the older Direct Rendering pipeline, it also came with a bunch of new problems: it broke the MSAA and made transparency effects extra difficult or tasking to perform. The former, together with the increasing playing resolutions and rising amount of shader-oriented noise and artifacts, resulted the birth of these blurry Post-Processing based anti-aliasing methods, such as FXAA and now TAA. And all that on top of the already then ongoing "bloom & blur" visual trends, and you now got games that are like senior ladies wearing waaay too thick layer of tacky makeup. You only realize how far down we've fallen when you go back and play some ~2004-2005 AAA games again. The crisp, sharp, non-obscured visibility truly is refreshing.
This deferred rendering is the root cause here. It basicly allows easier way to scale up amount of light sources in scene but in reality, every game out there can be made with direct rendering pipeline. Engines can be written in way that each light source affect group of objects that are close enough and priorize them, and of course static lights in static objects can be precalculated to textures.
The deferred rendering tech demos back in the day were really misleading. Most game scenes don't actually feature a huge amount of tiny point lights, and light sources in games will almost always reach all the walls in a room. Lights IRL don't just affect small bubbles of space- they subtly spread out over very large distances.
Deferred rendering allows us to scale up the number of light sources, but that is only a single of the many benefits of a deferred lighting render pipeline. Forward lighting models mean that for each fragment / pixel on a geometry we do all the lighting computations on a single pass, so drawing geometry if it passes a basic depth test we do all the expensive lighting calculations. This is bad because geometry is not in order because sorting geometry is a CPU task and really slow and not at all cache optimized. In busy scenes this can mean that for just a pixel you end up running the lighting calculations anywhere from just once to over fifty times. Then consider that for all the lights you have a single run through the full lighting calculations means considering the contribution of every light and generally with an additive light model summing the results and then correcting in post processing. With a deferred model we still consider each light source, but we now have the guarantee that we only need to calculate lighting for each pixel/fragment once and only once, because we use the first pass for depth testing and doing simple drawing of all sorts of world data into different frame buffers, and on frame 1 we have all the data spatially localised and we have done the depth tests to make sure we are only doing a single correct lighting calculation for that pixel/fragment. Because we basically divided our work load into a tiny fraction of what it would be in a forward rendering pipeline, we actually have the time budget to put in any kind of decent graphics effects. Immediate rendering simply does not allow us to put the modern standards of graphic fidelity into games. MSAA isn't really visually "better" than other anti aliasing solutions. Deferred rendering doesn't mean games made with that pipeline be "blurry". It however make it way faster to do things like SSAO, SSR and layered transparency. Baked lighting is good, but does not work if you have moving light sources (This is most emissive light sources in modern games btw, think explosions, muzzle flash, swinging lamps, etc). It does not work if you have moving geometry (player models, cars, etc). It means that making changes to levels during development takes a lot of time unless you batch changes together to do a nightly bake or whatever, but either way it introduces dev ops complexity and often makes QA testing slower. It's a powerful tool but not a universal one. Also, MSAA is not "broken" or impossible in a deferred renderer, it's just not really worth doing because visually it has some pretty obvious shimmering that people like to forget about when getting nostalgic. You can use the world position gbuffer and do edge detection quite easily with a shader, then do a standard super sampling on the found interesting pixels/fragments. It's maybe slightly more time expensive than SMAA in a forward renderer but still cheap. But.. why bother adding it? It's added complexity for a mediocre antialiasing algorithm.
I always had that feeling the "older" games have FELT better visually. They didn't look better in terms of texture or hyper realistic lighting, but somehow offered a more pleasing and cleaner overall visual experience. 4K high refresh rate gaming was looking pretty optimistic (which would solve a lot of aliasing issues) but then they started pushing the "next gen" technology before we had hardware to support it. Path tracing and every other tech that promises next generation visuals just ruined performance and clarity. We are now gaming in sub 1080p with sub 60 fps, looking at a blurry mess. Developers just turn on every engine feature, slap it on their game without thinking and expect a miracle, where older games had to manually craft the lighting and visuals to min-max the existing technology.
Just posted one a few moments ago, its still marked as unlisted as I wait for it to process & create a thumbnail for it. So coming very soon, either later today or a few days from now
TAA is one of the first settings i disable in a game. Not only boosts performance but looks sharper. DLSS and FSR add the same blurryness but the performance gains greatly offset that.
This video cleared up for me why so many newer games on my 1440p monitor seem to look "blurrier" than some older games I play. It started to feel like upgrading from 1080p a few years back was pointless but I guess even that is worse now
I always used to put graphics on ultra and then wonder why it looks more blurred than on medium. When I finally figured out it was TAA I started changing it to SMAA. Even though the edges of some objects are still slightly pixelated on SMAA the overall image is considerably sharper and easier to look at (you don't loose the detail). DLAA is the best though.
Here is a list of effects that I always turn off. They have little to no performace cost but greatly detract from the experience when turned on. 1. Bloom (if it''s too glowie) 2. Film Grain 3. Chromatic Aboration/ Lens Distortion 4. Motion Blur 5. Lens Effects/ Lens Flare 5. Vignetting 6. Depth of Field if implemented badly (LEGO Batman I'm lookin at you!) 7. Screen Effects such as Blood and Dirt All of these effects are only for making it feel as if you are watching a film. I never understood why they keep implementing Camera effects into games.
@@ScoutReaper-zn1rz Yeah, pretty much all post processing effects. They have little to no impact on performance but they make the game cinematic. Imo it's fine for an action game like DMC or something like that where they add to the overall "bling" but for an FPS where you have to focus on things on the screen it's distracting.
I remember when TAA was invented, and how cool it sounded. I'm a graphics programmer, so I keep track of that stuff, read papers, etc. It seemed like a pretty genius solution to aliasing. I think, as you said a few times, it really comes down to how well it's done in each game specifically. Bad TAA could be as bad as the shots you're showing in the video, or it can be well done in other instances. It's debatable to say that information is lost in the same way, and in the same amount as, say, a gaussian blur. TAA isn't simply a blurring algorithm. Implementations that can properly use motion vector fields and other techniques to "correct" for the blurriness would be good examples of TAA working well, probably.
This. And often its just horrible settings. Like in Fallout 4, where they used the worst settings possible for TAA. Especially in F4 VR this was an issue, the blurriness literally caused headaches and it turns out you just need to adjust 3-4 settings in an .ini file to fix it and still have good anti-aliasing effects...
one game that does TAA very well is War Thunder surprisingly. The TAA effect makes the game look so smooth that it feels jittery to look at without it.
@r1zmy I genuinely enjoy TAA in WT as it's a very well-done anti-aliasing option compared to DLSS. I turn on DLSS for the sole reason of playing 'spot the dot' in GRB/ARB when I'm trying to spot air targets.
Also a programmer: TAA was massively oversold when it starting being implemented. Always has been shite, always will be shite. It's the crutch bad devs lean on who can't be assed to properly spec their assets, tune hot spots, and build performant games.
RDR2 is a real pain in that regard, I'm not surprised you included it. It's the only effective AA on that game, and honestly blur is not the worst, it's the ghosting with camera movement. Especially with light sources at night, it almost feel like in older versions of windows when it froze and you had that window cascade effect
The problem with DLSS in RDR2 is Rockstar locked it to an ancient version of DLSS 2 so you have to hack it and forget about online if you want to upgrade the DLL to a newer much more capable version. It's the only game that I know of that I can't just drop in the latest nvngx_dlss.dll and be done with it .... I'm pretty much done with Rockstar and Take Two and their sh*tty business practices
its because by default in an engine like unreal engine for example, UE5 turns on FSR/DLSS(or another scaler) by default AND applies an aggressive TAA and their motion blur, it can all be disabled very easily but as more games are made with unreal as it gets more popular(like unity) we're going to see people keeping the default 70% scaling with taa on because they couldnt be bothered to disable it or just didnt know
@@QU141. I know I'm late, but the only situation imo in which motion blur could be turned on, is when playing a very hefty story game with under 60 fps. Like if you have a 144hz+ monitor and enough frames for the monitor in an fps game, motion blur will ALWAYS be terrible.
I turn TAA off on any game that has it. I first picked up on this issue with Skyrim. It gives me headaches as my eyes are constantly trying to focus and deblur the deliberately blurred image.
I have a powerful PC and I turn off most graphics settings like bloom, motion blur, and depth of field. All I need is textures, shadows, and MSAA. The other settings are just unnecessary processing and the game actually looks better without them.
@@luca4870 Like he said problem isn't existence of TAA but absence of any choice of other way to do AA. It isn't and shouldn't be TAA or no AA discussion. You must be too young to remember when games offered three or more AA options to choose from.
I'm pretty sure this is a UE5 problem. I used it on a few games and it kept crashing at first and then when I did get it running it didn't perform very well. Strange that engine runs like that. I've seen amazing presentations from that engine but compared to actually running games in it like Source 2 with Half Life Alyx and Counter Strike 2 I'm kind of disappointed with UE5 right now.
@@gorky_vk Good comment. Another huge problem is that most people today don't understand the difference between _true_ AA (which _increases_ detail/accuracy) and fake AA (which decreases detail/accuracy). Therefore they believe that all antialiasing causes blur to one degree or another, which is of course false.
@@gorky_vk I'm probably old enough to be your dad but devs don't put other AA methods because incompatibility issue and those old AA methods were so costly only people who had latest and greatest could use them anyway.
Didn't realize you saw this video of mine as well. Thank you! This is an issue in every Call of Duty starting with Vanguard btw. BOCW & MW2019 are the last to give us full anti-aliasing options
I always try to turn off post processing. I had myopia for 15 years, I wore glasses, then I did laser correction and enjoyed my life. I can’t understand how someone can voluntarily worsen visibility like that.
Thank you for making an educational video about this topic, I hate TAA so much and I don't see enough protesting from players about being forced to use it in game and accepting it's overwhelmingly negative side effects
Oh and lazy developers relying on it to undersample and then literally just blur it back into existence, instead of actually optimising games and writing good code
Yeah well the main problem is most people don’t realize it’s a problem. They don’t know what’s causing this issue. Another thing that will become a large issue for people like me and you is the fact the modern game engines like UE5 are designed with technology like TAA in mind and basically all games on this engine use this method of AA to cover up all the artifacts caused by the new lighting solutions being pushed onto hardware that isn’t truly ready to handle it. So all the problematic lighting and effects are rendered at a low res and are being blurred for the sake of covering up the fact that your hardware can’t convincingly pull it off yet.
@@maixyt no. The sad reality is that many expensive effects are greatly reliant on some temporal smoothing, and TAA is just that. GPUs are not efficient enough (especially with memory latency and with register counts) to allow for such features without introducing exceptionally harsh performance impacts. Some of these effects include: -PCSS -SSAO -Stochastic-opaque transparencies -Stochastic rough SSR or rough raytraced reflections -Volumetric fog/lighting and other kind of raymarched effects, especially with volumetric grids Do notice how all of these are reliant on large sample counts and/or large memory traversals. Devs aren't lazy, they simply cannot beat the inherent limitation of GPUs, and this is the only solution (especially on consoles) when the company/studio pressures them to make graphics as rich as possible. The sole reason why we have such large advancements in graphics tech as of late is because it became more affordable to implement due to TAA. "Pick your poison", so to speak
@@Kolyasisan I wasn't downplaying the role played by TAA within making graphical strides. However, I should've elaborated more on my use of "lazy", I wasn't referring to devs abilities to optimise these graphical techniques for all types of machines not just the latest and most performant hardware, because that would be really hard or outright impossible, especially with the size of dev teams resulting in extra communication time within a time limited scenario and the difficulty of integrating all of the graphical effects without any conflict resulting in unintended glitches or artifacting. I was referring to Step 0 within the process of creating anything, what is the scope? At it's core The Finals is a fun, action filled, fast paced shooter, with emphasis on the importance of being able to see quick moving enemies and items clearly (and invisible light classes). Why would I need the latest and most technologically advanced GFX which require highly performant hardware? There are perfectly good more 'traditional' rendering methods and effects which don't require temporal filters to work properly. Below I will list additional accounts of how I view modern devs "laziness". However I'm going to skip over things such as motion sickness, blurry and out of date information, and having the option to configure TAA values, in game, and The Finals already having the ability to turn off TAA from within the engine.ini file, and then blocking that config file while not giving an option to disable it in game, showing that turning off TAA is possible and works just fine (I did this for the closed beta), but blocking players access to it (yes I know that engine.ini was used for cheating exploit, but you can whitelist specific commands, as talked about by Hybred in a newer video of his). I would much rather HAVE THE OPTION to decide how the game looks or performs for myself, as in I would pick either 1:1 sampled effects or even undersampled effects over TAA being plastered on my screen any day. Also, there is absolutely no way that temporal effects can't be implemented to ONLY certain effects such as volumetric smoke. Picking out a few of the effects, you say that volumetric smoke, PCSS, and SSAO require temporal filters to hide undersampling artifacts? Then how come Watch Dogs 2 (which I've played through recently) with all of these effects doesn't force any type of AA let alone TAA on my entire screen? Yes there may be temporal effects used with those GFX but they DO NOT affect everything, and they are NOT just an overlay covering my entire screen. In my opinion this is what a well developed graphics system looks like, it's within the scope of the game being a thirdperson, moderately active shooter with elements of story, with graphical effects that not only look great but are also implemented well. It is also in collaboration with nVidia and very well documented if you want to search it up. Or even the Ghostrunner games, they look absolutely gorgeous while being one of the most fast faced genres out there, managing to stay away from TAA and any sort of AA for that matter, staying within the devs scope of aiblity and making the most of what they know and what is available to the dev team, enabling me to play with maxed settings while still enjoying the beautiful experience along with the fast paced combat. Now this is not what I consider a well DEVELOPED graphics system, this is what I consider a perfectly tasteful and skillful use of a limited number of the more traditional effects, to get something that looks graphically competitive with modern effects while also wiping the floor with the FPS achieved in comparison. Also also, I don't know much about alternative methods out there, but I am willing to place a substantial bet that there are much better solutions which are unfortunately being overlooked because TAA has become the easy to implement and established status quo. At it's basis my issues with the modernity of using TAA to fix issues introduced by undersampled effects, is that TAA isn't only a mediocre solution, but that it is forced upon players. I don't give a damn if my game looks jagged or the smoke looks a bit funky because it shimmers. I grew up on extremely limited hardware (GT730) which required going into config files (take for example Black Ops 3) to lower the settings further than the ingame menu allowed, I was playing at a resolution of 40% of 720p, aka 288p. This only made me more familiar with how tech works, and got me interested in optimisation and effective use of graphical effects. I would much rather have the option to customise the experience of any game for myself. Playing DayZ and Uncharted at maxed settings in so issue, because it's within the scope of the games, DayZ is a mostly slow - medium paced shooter, however since it's quite old it uses traditional techniques to the best of their abilities, even using volumetric lighting, resulting in a very playable ~130 average fps on maxed settings while looking beautiful. While Uncharted and singleplayer games alike such as God Of War, don't have much if any multiplayer interactions, therefore it is within the scope of the game to run them at ~60 fps with great visuals. And even then none of those rely on a full screen TAA filter to my knowledge. Finally, TAA just annoys me and I personally don't like the look of it, therefore, if there is ever an option I would prefer to turn it off, the issue comes when there isn't an option, and that is unfortunately becoming a standard. Why wouldn't you want to have a choice? If you're accepting TAA as the new standard, that's cool, but I'm not accepting it. TAA isn't the be all and end all solution, there's always a compromise or alternative.
TAA in motion make many current games unplayable... no point of having a fast high hz monitor edit: It's incredible that they make so many effects with non-native resolution and yet the games are optimized (run) like shit
a lot of effects actually need a non-native res buffer in order to not have to sample a ridiculous amount of times, i.e. you get the same effect doing a lower sample gaussian blur at a lower resolution and letting hardware bilinear filtering smoothen it out than doing it at native res and having to sample way more times for the blur to remain smooth
Well you’ve never experienced a game that doesn’t optimize for performance. So what you think is terrible optimization is more just not great optimization.
When we gamed on Televisions there wasn't the 'clarity' we get from the digital TVs today. It took me sometime to learn to ignore the 'pixel' and really only notice it now in screen shots. As the resolutions kept getting better I was impressed but now as you mention: It would seem they have outpaced the ability to program the detail needed for modern resolutions and have returned to the tricks which reminds me of the warm fuzziness of old CRT technology.
Legit. I think people forget we all werent gaming 4k 10 years ago. Most of us were probably still on a 720p or 1080p, some people still not even having HD yet. I still remember when HD became a thing.
Thank you for this, damn. I thought I was going crazy mentioning this to friends who didn't really mind. The last CoD literally needed nvidia configs to not look blurred out. I feel like the more a game relies on dlss the more it lacks the option for a clear picture. Personally I'd wish we kept the counter-strike source clarity and went graphically up from there without skipping steps with pictures that look "ok" on a makro level but disgustingly blurry closer up. How anyone could care for 4k without pointing to this problem is wild to me.
This! You are so right! Games, even in ridiculously low resolutions for today’s standards, used to look so clear and nice (i.e. the 2D eras of SNES and Mega Drive). Today we have crazy amounts of polygons, very high resolutions, incredible artists creating textures and environments, only to have it all become a blurry crappy mess due to the over usage of so many different types of “image enhancement” systems that even overlap each other. It’s ridiculous.
Thank you for shedding light on this. My brain is wired so that whenever I see a blurry image, my eyes just instinctively try to adjust their focus to make it not blurry. Now I know why I can't play modern AAA games without getting severe eyestrain, even if I turn off any "motion blur" options in the graphics settings.
This has been a huge issue for me as i have somewhat impared vision and the blurring of any antialiasing, and espectially taa, makes it much harder to see anything, ive just had to stop playing newer games and its incredibly frustrating
@@gwyneveresnow5781for game like that it’s the developers fault for only optimizing for one platform (most likely console) then the pc port gets half assed most of the time. Tho ports have gotten better
In RDR2 specifically I found it looks better with FSR on the highest quality setting and sharpening cranked to the max. That compensates for the blurriness, you still get AA (it's forced on when you turn on FSR), and you get better FPS as a bonus.
@@256shadesofgreyYou should also enable RSR or DSR/DLDSR and use 1440p or 1620p as native resolution to have even less aliasing and blurryness. I don't know why but FSR looks better than DLSS on RDR II, I swear, not only looks better, I get more FP/s. Playing on an RTX 4070.
I use the upscaler mod with DLAA settings, still very blurry when turning the camera at 3440x1440, is DLAA just as bad as TAA when it comes to blurring?
Only semi related, but you reminded me of another Unreal thing that really bugs me and that the games industry has apparently just accepted as normal and fine; aggressive occlusion culling. It produces constant 'flickering' as level geometry is loaded in on the fly and you briefly see the, usually bright white, background of the level environment. It's less noticeable at higher frame-rates, but at 60 it very much is.
Doom uses this too, and its is not noticeable at least. I only found out by getting a bug where the fov was incorrect, allowing me to see the culling on the edges of the screen. Unreal engine does end up 1 frame late on the culling it seems, so you get that frame of nothing before it shows up
Developers are desperate to make relatively cheap consoles appear much more capable than they really are :) Look kids, if you want to game in 4K at 120 FPS with high-res textures rendered at maximum detail then you're gonna need a $1,500 computer with a $2,000 graphics card and that's that. Nothing's free, especially not hardware!
Pretty much every game uses some kind of occlusion culling technique; unloading stuff that the player can't see, it's just particularly aggressive in ue 4/5. I believe it's possible for the dev to tune it, but it seems the standard configuration makes it noticeable. It wouldn't be so bad if the background you see was a dark colour, on night or underground levels it's barely noticable. It wonder if it's possible to make a shader that always fills a negative space left by culling with a colour that blends in better.
@@irritablerodent Today's computer graphics hardware and software techniques are SOOO astonishingly brilliant it blows my mind. Technically and creatively it's such a mind-blowing art form and that's really all I can think of to say about it. The people that think this stuff up, and then conceive & then manufacture the hardware, write and debug the device drivers, and write, debug and implement the software to produce modern PC gaming graphics are absolute geniuses. It's 100% pure, distilled human GENIUS in addition to being an absolutely mind-bending amount of work. It's a shame that it's only major use is video games, which when you're honest with yourself about it, are mostly a waste of time. And, I *LIKE* gaming! :) I'll say straight-up that as far as massively time-consuming comfort-distractions go, it's probably better to waste your life on PC gaming than it is to waste it on heroin, for example. But, the people that made it all possible, especially with regards to graphics, are incredibly brilliant. Just pure concentrated human genius.
I normally don't mind as much in a chill single player game, but in an fps where visibility and colors are minimal, I hate that this exists and only adds to how annoying it can be to see some player models. That's why I like colorful games lmao
For years now, TAA is the first thing I disable after Chromatic Aberration in any game that has it on. If there’s no straightforward option, I desperately look for a ini file edit or some sort of workaround. I simply cannot stand those. Thanks for raising awareness!
I am glad you covered this! I was heavily modding Skyrim and noticed how anti-aliasing blurs the entire-scene rather than just the 'jaggy edges'. Since then, I swore to keep anti-aliasing off wherever I go. I don't care if I return from the future with Nvidia Quantum-RTX-9080 Ti. The blurriness is just not the way. If I am going to lose performance, might as well just get a higher-resolution monitor since they are more affordable these days for actual, TRUE less jagged edges.
first time experiencing really terrible TAA when playing halo infinite. so terrible that i had to google what's wrong. is it my settings? my PC? the game? why it is so freaking blurry only when i'm moving? that's where i first learned what TAA is. the worse part is, it can't be turned off at all.
I noticed this in some newer games especially in motion since I always turn off motion blur and the stupid 'film effects' before I event start. I had to go back and check a few times I actually had motion blur off since I could tell something was off when moving. Good to know im not crazy lol
Even after removing all that in Warzone, once MW 3 integration happened on December 6th, even will all blur turned off it was very blurry and weird. Then i tried AMD's CAS, and omg everything looks way better, like even guns in your hand...
@@fishfood8711I think the film grain is helpful to give better aspect to smooth flat texture: on Warzone (1), I think that some weapons, or the table on the gunsmith, were looking like plastic (way to smooth). But maybe I was using TAA, since they're telling in parameters that is really good.
I'm so glad to have watched this video. I have felt for quite some time now that games are really blurry and yet when I go back to older titles that blur is nowhere to be seen. I think it's probably more noticeable to those of us with excellent vision.
I would recommend turning Anti Alias OFF altogether if you're playing at 4K native, as you won't notice the pixels or jagged edges as much. When playing at 2K, it really depends on the implementation. I find myself requiring TAA at 1080p for every single game, as the pixels are huge and the noise plus the jagged edges are very noticeable. Also I find other AA methods to be less blurry but still leave a lot of jagged edges in place or hurt performance too much.
@@tharusmc9177 this makes no sense. 1080p to 4k is a perfect integer scaling, so 1080p signal should be identical. To take advantage of this, you need to use GPU scaling. Monitor scaling on the other hand will not use integer scaling and with it the picture will be blurred, so avoid it.
I've noticed this in No Man's Sky VR. Turning it on felt like I was not wearing my glasses, while turning it off was much clearer but jagged. So i preferd the off option.
VR and a lot of post-processing effects just do not mix at all. the effects aren't designed to recreate real vision or accommodate the depth that comes with VR. So many Unreal Engine VR games look really bad due to this
Wow. You just explained the _feeling_ I've been getting from every recent 3d game I've played. I thought it was just a trend that everyone was trying to jump on the motion blur bandwagon. This is extremely noticable to me, because i used to play a bunch of very fast competitive games. Their goal was to be as fast and clear as possible, then grab for as much effects as your computer could handle second. It wasn't uncommon for people to turn off things like motion blur and bloom - to keep visibility. (I chose to keep bloom, because it made immersion and beautiful scenes so much better). Ps moving through grass is where i find this the most noticeable. Moving feels like someone took the speed blur effect from a racing game and cranked it beyond max. I wonder if nvidia per game settings can force an override? (But could cause visual issues for some games) It seems like they are just grabbing way too much past info. I think you are right about dialing it way down and combining it with msaa or similar.
This has been a thing for, what, ten years? Unfortunately forcing MSAA simply doesn't work in many cases. I'd say most cases, but I haven't played AAA games in a while.
this is like a miracle video shown to me, as someone who only played older games because of pc limitations i had, everything looked normal, but ever since i started playing modern games, it felt odd and uncomfortable to play, because i saw aliasing so bad it felt like looking at bad pixel art, so i tried turning towards anti aliasing, and putting the settings on "max", which was most of the time taa, and now i had a new issue, it felt like i couldn't see anything properly. and this just makes so much sense
I've just resorted to using DSR.. It's a brute force fix that shouldn't be necessary, but games look so much better when rendered at a higher than native resolution. I guess it's basically like using MSAA.. wait, what ever happened to MSAA?? Games always used to have it. Sure, it was expensive, but it looked good..
The issue is theirs more stuff for MSAA to super sample so it got more expensive therefore it's not used. But if some people are super sampling anyways maybe they might as well bring it back? I know MSAA doesn't work well on certain effects either but I found a resource that worked around that issue but I can't remember the papers name. I'll have to look
MSAA can't effectively do it's thing in a deferred renderer (most game engines nowadays). You'll end up with a mismatch in the amount of samples for a given pixel between the geometry edges (however many AA samples you select, 2 ,4 or 8.) & the lighting since they're rendered in different passes which will look either very ugly or you match the samples in the lighting pass & give up a ton of performance (you'll be shading every affected pixel 2,4 or 8 times depending on the amount of samples). You can get around this however, but it's a lot of work & that'll only give you anti aliasing on geometry edges (which is the core reason it's defunct, it actually does next to nothing to actually solve modern aliasing problems which comes for the majority from shaders, not the geometry).
@@Hybred also GPUs nowadays aren't just built the same.. on the PS2 and X360 they had giant rasterizers and high bandwidth DRAM so overpowered you could do free MSAA or just abuse the fillrate to get an effect.. now we have barely enough bit width to feed the chip without undersampling things.
DSR is even more resource intensive than MSAA. On my 1080p monitor I find having to go 4x DSR for good results, which means the game is literally being rendered at 4k and downscaled down to 1080p. Too bad with my puny GPU I can only do this with really old or "oldschool grahics" games. And for games which only support borderless instead of true full screen, like Dread Templar, I have to change my desktop resolution to 4k first. But even top of the line GPUs these days don't seem to be expected to run modern AAA games at 4k natively, and I don't think there would be much point using DSR combined with FSR or DLSS...
THank you for the detailed review! It was a big surprise for me to see so blurrish Cyberpunk on 1080p monitor (RTX 4060, all High/1080p native with DLSS Quality and DLSS Off) compared to crystal clear Witcher 1 from 2008 and Crysis 2 from 2012. And it's crazy that 1080p native graphics quality looks like HD at best with raytracing/space screen reflections/grain/chromatic aberration/depth of field - all of them swtiched off. And only enabling DL DSR (1080p -> QHD -> 1080p round trip) made a picture a bit clearer but with at the cost of performance downgrade. If that blurrish nightmare continues I will just step aside rather than buy 4090-alike videocard and 4K monitor to have small details not being washed away with TAA-alike filters. I'm pretty sure there are still many owners of 1080p IPS pretty good quality monitors not wishing to swap it with QHD/UltraHD ones.
I remember when Battlefield 1 came out, I always thought it looked amazing, but was always blown away by how effective the resolution scale option was, 1080p with 200% resolution scale looked insane. Any game with TAA still has this issue, even some games that don't use TAA still have the problem. I now play on a 4k monitor and so many games still look so blurry. I go back to GTA 5 and 1080p, 1440p and 4k look pretty close, the upgrades are subtle but there, but on RDR2 I quite literally can't play it at anything below 4k now. It's not that 4k shows a ton of detail that isn't there are lower resolutions, it just cleans up the blurriness of TAA
One game that actually has good anti-aliasing is Warframe with it's SMAA option. It has options of FXAA and also TAA that you can also adjust. TAA makes the game look blurry for sure but SMAA has a really nice crisp image with only very minor aliasing that is non-bothersome IMO. I'm not surprised Warframe has these options as the previous creative director Steve was always a very big fan of graphics and they're always optimizing and adding new graphics tech.
I agree, I remember missing around with it's setting , their SMAA is perfect . No jaginess and no blur . I bet they are using their own implementation of the algorithms because it's clean as fuck
@@choppachino Most modern games has no option to change "Anisotropic filtering", but it instead is usually combined with other stuff under "Post Processing effects" switch. Although fun stuff is that it is possible technically force-enable the older filtering options via graphics card settings (like NVIDIA Control Panel)
Thank you so much for mentioning the motion sickness. It's so frustrating but tons of game insist on "realistic" motion in their games and add blurring to it and its ten times worse.
THANK YOU!!! I've always talked about that between my colleagues and they always says that "is not that bad". But I'm still playing in a 1080p monitor and it is VERY noticeable and annoying. All the blurriness in modern games is just so counter-intuitive. What is the point of having giant 4k/8k textures if we're blurrying everything? Look at Dota 2, the game looks crisp and sharp, very well defined, even without MSAA on....I hope we have a solution for this in the near future.
It's not just TAA. It all went south when deferred rendering became the only option in DX11 and up. TAA and DLSS are the only ways to get rid of aliasing, those "staircase shimmering" one might remember from Witcher 3, Just Cause series, GTA 5, or Rise of the Tomb Raider. DX9 games had a lot more options, like multisampling or sparse-grid supersampling, but they are almost as costly as running it in 4k or 8k.
What do you mean "the only option"? Deferred rendering is just a technique for rendering scenes, specifically for decoupling some operations away from scene geometry and frag shaders rendering on them. It's just that it provided very important benefits to performance due to the way how GPUs' fixed function hardware works, which still continues in a lot of games and tech. You can do deferred in DX9 as well (and on the original xbox, too).
That is true, much of this is the fault of deferred rendering being used simply because "it allows for more dynamic lights" to be used. Meanwhile, they ignore that fact that it rules out the ability to use a bunch of other AA techniques effectively... and the fact the fake (aka pre-baked) lighting and shadow techniques have already gotten to the point of being good enough (though not perfect) but much easier to run on older/weaker hardware. So in other words, it would really just be better to stick to old, but refined techniques... as it would allow for higher framerates instead and better performance on older/weaker hardware (which also helps the environment by keeping that hardware relevant instead of requiring users to upgrade shit) and also handheld gaming consoles (and smartphones) which have much stricter power constraints.
I want to add some technical context here why TAA has been so popular. Simple answer is that it requires very little GPU power to implement. FXAA is still pretty fast, faster than TAA usually, but it doesnt necessarily "catch" all edges in an image as it does so mostly by contrast. Edges with little contrast may get no AA blending at all. Like TAA its relatively recent, even if its not as recent. Its mostly sharp, but doesnt do any subpixel rendering like MSAA does, but rather by just looking at the angle and approximating from there, which can lose sub-pixel details, but not to a noticeable degree. MSAA is something youll probably run in on any game since the 2000s. Itll detect edges based on the actual geometry being rendered and thus typically catches all edges on the geometry and then samples subpixels to smooth the egde. It doesnt do transparency (Morphological Anti-Aliasing comes in here, but thats a different topic) but other than that its pretty much a brute-force approach and on higher settings gets a little intense. And I dont just mean the choice between 2x and 4x or even 8x, I mean it in the sense that modern games have a lot more geometry going on, which means more edges, which means more MSAA work. Especially with grass like on the Halo Infinite scenery this can easily become ridiculous. But on older games the performance hit was noticeable, but also handled by much weaker GPUs just fine. In image quality either will beat a blur filter like TAA, MSAA especially, but FXAA is "fine" as well, low contrast edges dont stick out so much and quality can usually be adjusted to make it fairly pleasing to look at as well. MSAA its just not worth the performance penalty. FXAA though? It smooths all edges, transparency, shader-related, anything, and goes fast. Why isnt it used instead of TAA? I have no clue. What Im looking forward to is FSR 3 though, because itll come with a mode that uses their upscaling method but uses it as a replacement for AA without actually needing to upscale, so image quality should stay similar to what MSAA and FXAA deliver without loss of detail and certainly without blur. Upscaling has been doing this for a while, has to, because if it didnt FSR and DLSS would be useless and just as ugly as no upscaling at all, so it smooths jaggies as it goes. Problem is that FSR especially still has issues with certain stuff, foliage, transparency, particle effects, shimmering, and running it purely as AA doesnt fix that. And DLAA for Nvidia is still only supported by a few games, even if its results are much better.
ive been a fan of FXAA for years. i am equally as confused as you are as to why developers have slowly refused to include it in games. its fast and it gets the job done just fine. its the "its better than nothing and less stressful than msaa" option so its a no brainer.
Thanks for discussing this, I have been saying for years that TAA is not an ideal solution. Whenever I find a game that has it implemented poorly, one of the first things I will always do is try and find a config file to manually disable it. I would rather have the jaggies than blurry textures. The amount of games that come out and have awful TAA implementation is simply staggering, I don't know why the industry is gravitating towards this method. Surely these devs are aware of the obvious drop in quality it creates in many cases.
Because it's the only practical method that can handle spatial and temporal aliasing/noise without completely gutting performance. MSAA no longer works with the bulky lighting pipelines that modern games use and SMAA is only a spatial post-process AA filter that cannot handle the temporal aliasing/noise that modern games have. That leaves us with TAA which, yes, we could apply only to the problematic lighting or foliage components of the image and use spatial AA at the end to clean up the rest, but this only half solves the problem as you'd still end up with blurry and smeary lighting, foliage, animated textures, etc. Rather than _everything_ looking blurry in that RDR2 shot at the start of the video you'd instead have all foliage being blurry which is still bad.
@@jcm2606 The solution to the problem is SMAAT2X. Not blurry and it fixes temporal aliasing. Great cost to quality ratio. I don't understand why they don't use it more, only a handful of games implemented it.
Run a modern title on Series S where some if not all textures cap at 2K mips, and compare it to a PC or Series X and full 4K. This is necessary for lower memory bandwidth, most importantly when a game is frequently streaming assets in and out. It's so much easier to move a 2-4mb 2K texture than it is to move a 25-35mb 4K texture. Esp when a FULLPBR asset at it's most optimized form may have at least 3 textures (albdo with colour information, normal map for bump information, and composite which stores ambient occlusion, roughness and metallic information inthe three RGB greyscale channels to reduce draw call). The difference between 2K and 4K visuall is night and day, especially if any text element was baked into a texture like artwork instead of a higher res decal sitting on top regardless of TAA. Source: I'm a first party developer for Microsoft Flight Simulator, and content we produce for Series S gets capped at 2K due to memory constraints.
Thanks for your comment. We've talked before on r/OptimizedGaming / Reddit, happy this hit your algorithm! I used my consumer psychology degree to get this video as big as possible however I do hope more gaming/tech channels cover it, its the only way the industry will stop heading in this direction is if enough people let them know their dissatisfied or worse even sick (headaches, eye fatigue, etc) I hope to see more options for everyone going forward, I think having options is not only pro-accessability but is fundamental to the PC platform where these decisions should be the gamers choice.
lol fxaa blurs the image but stops at that while taa damages everything, fxaa used to be a joke of an anti aliasing when smaa and msaa rolled and now we be lucky if fxaa is even supported. What a dumb age of gaming
I know a lot of people use TAA, but I've always thought it looks terribly blurry and sacrifices so much overall color and saturation just to look like it's adding a blur filter over the screen and hoping it looks good. It may sound silly (because it's also known for blurriness by those who like TAA, in my experience), but I'm a HUGE advocate for FXAA; I feel like it's a good mix between antialiasing and sharpness without losing visual detail and quality, including colors and shininess on reflective objects. Great video
This is an important topic. Thank you for bringing some much needed attention to it. Hopefully through voices like yours and ours we'll be able to affect some change in modern gaming anti aliasing implementations. As it stands, lately it feels like we're going backwards in terms of visual clarity. Many people feel this without even knowing what TAA is or how it's actively undermining visual clarity.
@@XenoX_98 Multisampling kinda fixes TAA too. In general I feel like PC gaming is lagging behind consoles which is why TAA is so bad here, many people still use FullHD monitors while current gen consoles are running at 4K or 4K Upscaled, which makes TAA look better.
@@XenoX_98Far Cry 4 was the last Far Cry game with MSAA, i remember when they moved to TAA in Far Cry 5 how awful the power lines looked unless I played on 4K, yeah it’s trash
@@lukkkasz323Not at all, PC gamers play at 1080p but consoles are nowhere near 4K, lots of games use FSR or TAAU from sub-1080p resolutions, some of them even sub-720p (this is not that common tho) like SW:JS that renders at 648p. Consoles are actually getting the worst TAA scenarios, it's just that PC players are more aware of the situation and tend to use DSR when a game looks jagged on their 1080p screen to render the games at higher resolution by lowering other settings, also console players tend to play away from their screen so they don't even notice Anti-Aliasing on games like GTAV (PS4) that was literally a jaggie & blurry mess thanks to PS4's HRAA, RDR2 which is also a blurry mess, even on Xbox One X, and GTAV (PS5) which also looks ultra blurry, especially on the Performance RT, but Fidelity Mode still looks bad. The only game I've actually seen with a good looking TAA is Horizon: Forbidden West.
@@aeon7748Far Cry 5 also offers SMAA, which is the AA method used on consoles, it's not as effective for the removal of jaggies but it's better than nothing, and does a very good job if your PPI is high. I don't honestly find Far Cry 4's MSAA great at all, it does a terrible job compared to GTAV's, not even TXAA does a good job in this game, I find SMAA to be the best looking one once again.
I already knew about TAA being trash in games, it was enabled by default on some games, and I quickly realized by swapping to FXAA that my games were looking way cleaner with a sharper image instead of being blurry.
FXAA is just Fullscreen Anti-aliasing... which means it only works on the pixels shown on screen... not frame-by-frame... it is a very 'quick and dirty' solution, and the industry moved away from it as there wasn't much to be done with it anymore... but it is still useful for comparisons... "Temporal" in this sense refers to frame-by-frame in the Graphics pipeline, if we wanted a way to do FXAA "Temporally" here it would just end up looking like TAA just the same lol... the "Temporal" part is the problem, we could generate A PERFECT 2160p gameplay experience of anything but it might just take TOO LONG hahaha... anyway, in the words of the Protoss from Starcraft, "You Must Construct Additional Pylons" as really there is no way around this except more powerful GCards and/or CPUs
again, just to moan more about TAA x FXAA... The REASON the Gaming industry switched off from FXAA is because there was no way to imrpove it, and SUDDEN changes on screen meant that FXAA had no way to compete with TAA which could *already process frames before they were shown*
@@Vifnis I always switch to FXAA on any game i install because it always looks better with better performance. As old as it might be, the new TAA and SSAO are just terrible in comparision. I don't think they are good substitutions.
I needed this video and the great info it contained. I'm 53 and my eyes are getting older. Blurry graphics is my biggest gaming enemy these days. I decrease the antiasiling and try to use Reshade in most of my games now to get a crisper sharper image.
Wow I learned a lot from this video! I always had some weird problem with my games being blurry but I just thought I was getting older and more nitpicky as my pc specs improved and I wanted higher quality graphics. This was really interesting and helped my understanding of what some of the settings do in the graphics panel that I just judge by eye test before implementing. The sliding images really helped me see the phenomenon in a way I hadn’t before, which was really appreciated.
Honestly in VR there are so many things at play that it's hard to point out a single culprit. If you're using something like air link to play a steam game we could be talking about the image being scaled two to three times, plys image reconstruction with spacewarp and so on
Thank you! It's been a while since I noticed many VR games in UE4/5 are a blurry mess even with super sampling. I just tried UE5, if you do a standard project (without lumen enabled) and enable VR, the TAA + some more settings make everything so bad. If you use their VR template, it's perfectly fine
I play a lot of Skyrim, which was pointed out already, uses TAA. However, a mod came out that allows for using DLAA. It is incredible on how it reduces jaggies without making a completely blurry mess. It requires a sharpener as well, but does a much better job than TAA (though does cost in performance).
Thank you for making this video essay, it was very thorough and balanced. I am an indie game dev who did not know that this technology existed or that there were some problems with it. I will be saving this video for my notes.
I'm actually starting to understand this because I couldn't tell the difference between TAA and no TAA now I do you made a great video and keeping up 👍💯
After all these years I just assumed I had really bad eyesight causing things to just turn into a blurry mess but after seeing that Witcher 3 comparison, it was a night and day difference.
@@lmAIoneI definitely think so as well. My eyesight has gotten so much worse since TAA was introduced, and I started getting horrible headaches from games whereas I never had issues before. Pure speculation on my part, but if I go back and play older games, I don’t have these issues.
This video has blown my mind. I always use TAA since I've always had a higher end machine and would always set settings to max. Since TAA is always at the end of the list I assumed it was "the best quality" one and would always use it. Now I totally understand what's been making my games so blurry, definitely going to FXAA or SMAA from now on. Great video!
Funny thing you mentioned those guys, FXAA just blurs your whole screen while SMAA blurs just the edges by detecting the edges but at the end of the day it's still applying blur. You can't escape the blurriness of today's modern games, unless you play at 4K with no DLSS because that still applies TAA by default and renders at lower resolution.
@@LazyBoyA1 I don't think FXAA is *supposed* to blur the entire screen. It's supposed to find sharp contrasts, which could indicate an edge, and blur just that. Of course, it doesn't always work as intended.
@@stale2665 You're absolutely right, but have you seen modern games with no anti-aliasing, it's all jaggies! FXAA just blurs the whole screen as a result.
@@stale2665 Dude, FXAA literally stands for Fast ApproXimate AA, it works by blurring everything even subtitles on your game to achieve an approximation of anti aliasing.
This video precisely articulates my feelings toward new games. It feels like they just put in a blur filter and that's it. It feels cheap, and you can tell there is less effort put into it.
You can mess around with the game files but more times than not crappy things start to surface like constant dithering and extreme sharpening (RE Engine, UE, AC Engine) so even if you have a choice to disable it, games are just built around it so turning it off results in a way worse image @@QUANTPAPA
Join *r/MotionClarity* to discuss this issue & to find workarounds: www.reddit.com/r/MotionClarity/ & also watch my new video on the subject: ua-cam.com/video/LiUvA3cTdhg/v-deo.html
And a few things to note
- I didn't get everything correct, made some minor errors, I noticed them as recording but kept going, I did this video in one take. (At one point I called something dithered for example, but I was trying to say it looks dithered, because thin geometry can look that way sometimes)
- I did not address everything I wanted, such as going into more detail on solutions and also providing more solutions, because the video was getting long. Another video will come out regarding that
- From 11:38 onwards I do provide useful information for those curious, although it was mostly meant for developers. The comparison part of the video is all you need to watch if you want to see examples of the problem with brief explanations.
- Sorry for the length of the video and if their was any word fumbling. I know it hurts viewer retention but I wanted this video to cover everything, for everyone, those who know nothing about it, those who already know a lot, devs, gamers, etc. But let's hope some UA-camrs who are better at captivating people and making entertaining videos can discuss this as well.
An important note is that TAA is a lot better when it uses a history buffer that is 200% screen resolution. 100% will immediately leak to other pixels, while 200% will leak to sub pixels first and blur less in the final result. (Edit: you can do it right away with 4x DSR (0% smoothness) and a 50% input upscaler, like DLSS performance without sharpening). Unreal engine has the console command r.temporalaa.historyscreenpercentage 200 for this, but it's only rarely used in games. Not even cinematic TAA utilizes it by default. Epic and cinematic TSR use a similar console command (r.tsr.history.screenpercentage 200), but they are very expensive: about 1.7 ms at 1080p on my 3070. It's not perfect though. 1080p will look like 960p in motion, so the blur is minimized and probably worth the greatly improved stability, but smearing cannot be avoided on changing surface colors and (semi) transparency, including grids and foliage with lots of detail. Either the foreground or the background will smear in motion, for at least one frame.
Other console commands I use are r.tsr.history.samplecount 8 (to make the TSR less agressive and minimize blur even further) and r.tsr.shadingrejection.samplecount 0 (to minimize the said smearing due to parallax disocclusion). To avoid smearing due to vertex animation, you need to enable 'output velocities due to vertex deformation' in the project settings and use the console command r.BasePassForceOutputsVelocity 1 in unreal engine 4. This automatically corrects time based vertex animation and skeletal meshes. For other things like interactions and texture UV deformation, you need a previous frame switch to tell the compiler the difference between the current and previous frame. This goes into the world position offset, since motion vectors are calculated per vertex and interpolated to one value per pixel
Sorry for my long comment. It's not necessary to understand all the technical stuff, but TAA is pretty involved and there are a lot of misconceptions about it. Even among game devs
@@normaalewoon6740 I'm aware of that console command, when people say their TAA looks bad in UE4 games that's one of the first commands I recommend they use to mitigate blur
@@normaalewoon6740 Also can I get your opinion of the CVARs "r.TSR.Subpixel.DepthMaxAge" & "r.TSR.Subpixel.Method"? I've tested them myself, I want to know what you use/think.
Also r.TSR.History.GrandReprojection was a good thing but I think it's been removed in newer versions of UE5
@@HybredI actually did not see a difference when I tested them, I think you know more than I. The CVARs I mentioned did the trick. I go for the weakest settings that do the job, at the highest quality I can afford to keep the motion clarity as good as possible. All I need is a consistent 85 fps for backlight strobing, v-sync + an fps cap, motion blur during fast camera rotation only, high quality upscaling from 100 to 200% screen resolution and reprojection disabled on transparent surfaces, if that provides a clearer result
also F***TAA subreddit needs mention
This is exactly what has been driving me insane with modern games and why I often even avoid them, I just want a crisp experience, I already have bad eyesight, if I wanted this effect I would just take off my glasses. I really dislike blur of any sort in games.
🤣🤣
@@Odyssey636DLSS still uses temporal anti aliasing.What are you talking about mate?
Edit: It can look better than native TAA SOMETIMES depending on how bad the TAA implementation is. I am on AMD and what I do if I can is set my display resolution to 4k and play it on my 1080p screen. This is better than native 1080p TAA
@@west5385 Yep, that's what I do: Use DSR (Dynamic Super Sampling).
@@west5385 i try that, but my pc cant rull all games at 4k. My old rx 5600 xt couldn't run Jedi Fallen Order at 4k60, but the blur was manageable at a bit lower resolution, while keeping 60fps. Same with red dead 2, i have to use 150% resolution scaling, but that drops my fps to like 40
Wow this is what I mean! This 4k games looking blurry has been bothering for so long, I can't think why anyone think this was a good idea. It's literally looking at a picture with bad eyes, I don't care if this is creator content or not but just get rid of it!
The worst part is that this has been going on for like 7 years.
Yes but it's getting worse and worse as the more it's used, the less options/control we get over it, and the worse it looks when we disable it.
No one cared when it was optional. Much like DLSS/FSR is being relied on for performance now, TAA is the same problem before that was an issue.
longer than that. Far Cry 4 (2014) has like 5 different kinds of AA and they all look awful
@@kylerclarke2689 Technically even longer than that since the first prototype of temporal AA was found in Crysis 2 in 2011.
Far Cry 4's only blurry AA is NVIDIA's TXAA. And that one was indeed awful.
@@Scorpwind
Crysis 2 dof was the worst for making the image blurry, it was absolutely horrendous.
@@originalityisdead.9513 I wasn't talking about DOF but sure, that too to some degree as well, I guess.
"8k ultra-HD mega textures, 250gb game" ran through so much post-processing that it might as well be 720p.
All that extra horsepower to run the game just for it to look worse than a game from a decade ago, absolute dogshit visual clarity, not to mention the ghosting that already exit on non OLED monitors...
@@AverageDoomer69
Worst example is Borderlands 3.
@@AverageDoomer69 Exactly. You would atleast think they optimize the games for consoles and that those games don't require much horsepower... nope, go buy an RTX 4090
720p would look better.
I think a lot of games rely on upscaling so they literally DO run 720p and then upscale to 4k lmao
Dude what. I’ve been wondering why the hell every game has been super blurry for years. I didn’t even think TAA was to blame. Thank you for bringing this to the front of my mind.
That’s kinda funny, it bothered me so much I googled immediately how to fix it.
I though I was going insane or blind cause I swear to god Crysis 1 has more satisfying graphics than newer games
Motion blur is awful and you are stupid if you enjoy it. Or visually impaired
Feels like all Unreal Engine games have massive amounts of TAA, its just really annoying
as of version 5.1 something is making it blurry by default.. maybe they turned it off in 5.3 bouta check next week@@PugoSixtyFour
You can see this with that awful kong game, the effect is cranked to 11. Is like an Unreal engine curse.
TSR is the new default in UE5 because upscaling from a lower resolution gets you better performance, especially with lumen/nanite being hard on your GPU. I'm sure it's not helping a lot with the blurring if you're not tuning it.
TAA is needed because without it LODs cause a bunch of flickering with Nanite. I don't know if there are effective alternatives.
FACTS its so horrid
Sharpening an image to "make it more detailed" is similar to treble heavy headphones that aim to give the illusion they are more detailed.
Yup, the information is gone. And you can't get it back. You can use sophisticated algorithms to guess information to add back into the image, but it is gone. Sharpening a blurry image doesn't help with aliasing or flicker, stuff still looks muddy, but also has way to hard edge contrast.
it's like the CSI "zoom in, enhance" meme
Both are high frequency noise, yeah
spot on - not to mention (idk if its the same in modern tv's) but back in the day there was a literal "sharpness" option on your TV. So in theory today's games would be doubly sharpened through the game itself then the TV.
Reminds me of the overly sharp and contrast heavy Half-Life 2 textures.
One thing I REALLY dislike is that with the introduction of TAA and DLAA, devs have started removing filtering and other things on foliage and trees it seems, so a lot of games are really blurry with TAA, but look like a jagged mess without it, even if you turn on other AA options like SMAA or MSAA.
If you have it, PalWorld just does a Temporal Super Resolution (TSR) and tbh it looks much better than TAA and runs just as fast too...
The ultimate issue is square pixels on monitors and TVs. Projectors have natural anti-alialising which makes them better than TVs for cinematic games.
@@Vifnis palworld is unoptimised garbage and it's AA settings literally have no impact on performance. No matter what AA you choose, it still looks like shit, and you're better off turning it off and using reshade post processing AA and fine tuning it to make it look half decent.
But palworld has much bigger issues than AA, one of them being subpixel geometry. Because it's in effect just an asset flip, with bough and free assets thrown haphazardly into a "game", there's basically no optimization, and thusly pretty much no LOD work other than what already came with the assets. If there's no or not enough LOD, this leads to polygons in distance that are smaller than the actual pixels they're being rendered into, and this causes massive performance drops (as the GPU has to "work out" how to render something smaller than a pixel, a few million times each frame) and of course decreased visual fidelity, as geometric accuracy is lost, leading to inaccurate pixel fill.
@@mikehawk6918 wut? TAA isn't great but DLAA is good. I don't have blurriness issues with DLAA
Hold on. DLAA is for down-sampling, so you're suppose to be starting with a resolution higher than your monitor.
If you use glasses, you always have the option for GOAA (Glasses Off Anti-Aliasing)
Or DGAA, Dirty Glasses Anti-Aliasing.
haha and thats why dlss, taa etc are making the game actually more realistic for me by making it blurry as hell. like rl
or NIEAA (Nut in eyes Anti-Aliasing)
This happened to me but no this problem, fallout 4
bro why did i say that wtf ☠️☠️
One thing that annoys me is that because it destroys detail, you’re still rendering all that detail, then smoothing it out. So you’re actually doing extra (albeit very easy) processing for less detail
TAA is one of the performance killers in modern games
@@GewelReal "TAA is one of the performance killers in modern games" I wonder if this is a feature for the future and not meant for present day.... might have to dig thru arXiv for this one... what I'm saying is, on a 1080p display it might be HUGE loss in detail, but for those using an 8K Gaming set-up (lmfao) it probably adds a lot of performance for the processing required... tbh FXAA just looks like crap and TAA easily adds a realism to games I play personally, BUT PalWorld coming out I recntly learned of TSR (Temporal Super Resolution, istelf tbh is kinda simplier and safer-and we have been doing it for years on desktops)
@@Vifnis why not just use smth like FSR or DLSS? It does both antialiasing and FPS boost. Theres no way you'll ever get MSAA8x-like image quality while keeping acceptable framerate. Miracles dont happen, you gotta pay with something.
TAA is an amazing technology that allows you to completely eliminate pixelating at a very low price. AI upscaling will replace it completely however. There is no need to have both.
fsr and dlss will never look native@@fureimu_64
What annoys me even more is when games don't even have an option to disable it. Days Gone doesn't even have any anti aliasing settings. You have to mess with ini files but that's also removes HUD prompts
i got so used to the blurry look of modern games that whenever i go back to older games, it just feels like i entered a new dimension of visual clarity and crispiness. also, you talked about how TAA gets rid of specular highlights, and well, nfs heat had so much firefly artifacts from it's specular materials that you were pretty much forced to play with TAA on. it reaally sucked because i preferred the clearer image, but the visual glitches were too distracting
nfs heat was specially bad for me with the forced chromatic aberration on top of the TAA blur. Had to scale to 1440p and trade some performance
Haha, man I thought it was just me. A majority of these games look blurry and that's why I always cranked up sharpness even though that's frowned upon.
Gaming went through a really weird direction. I wish more games would go for compelling visuals like Dishonored instead of trying to make everything look "realistic".
hell on the topic of NFS, NFS Most Wanted (2005) on the Xbox 360 still looks great to this day because while they put a lot of detail into the game, especially for a 360 launch title, they didn't go overboard in trying to make the game look realistic and aging absolutely horribly and took a stylized approach to the game, having specular highlights on car paint be bigger than it'd be in real life, exaggerated particle effects, and the (in)famous color grading and bleach bypass filter
One of my favorite options is to combine 4x4 SSAA with 16x CSAA for a perfectly smooth, flicker free and sharp image.
The fact that TAA in motion removes light/reflections is really heart breaking, because this often give objects character and texture.
Ferdinand Habsburg for Kaiser?
POV:
_* RTX died whilst escaping TAA_
@@locinolacolino1302 i am für den KAISER ! 🇩🇪🦅
Interesting, since I only operate 4k monitors the whole thing is something I never considered because in 4k you dont need ANY AA at all. I haven't found a game where it would be necessary. I cant see the difference honestly .
@@Vanadium stop parroting crap from 10 years ago to self assure your sense of snobbery. That objectively isn't true for all games because not every game renders at native screen res first and foremost, and secondly there are shaders and post process effects that absolutely need AA to render in a way that is visually cohesive.
You see, to help with the temporal blurring issue what we need now is upscaling artifatcs, and AI generated frames for more artifacting, but then we add a sharpening filter that deepfries the image and it all goes back to looking good
Or you know just use dlss and frame gen instead of amds trash and not have artifacts at all lol. There is a reason software based frame generation was never made by nivida. You can't do it right. AMD has shown the results of software vs hardware based upscaling.
Native Resolution or bust. No fcking round
@@donkeymoo1581 FSR has improved significantly, as well as being free for literally everyone to use. I think you should look moreso at the reason for its creation versus the obviously superior but less accessible DLSS. Competition isn’t always about goin for the same goal, it can be about providing for different markets that don’t have access to that technology yet.
@@LieftheDragon Stop coping lol. Even intel xess since launch has been far superior to fsr and intel can't do anything right anymore. AMD fanboys argue "they don't need upscaling" now every game uses upscaling as a baseline. AMD fanboys argue RT performance doesn't matter. Now every other big game has RT baked into the graphics so you can't even turn it off.
There is a reason AMD are pulling out of the mid to high range of gpus. They have nothing. They have no features they have no real world performance they are just going full focus on cpus as they should. Focus your money where your are leading the race. Don't keep chasing a market where even intel gpus are beating you in performance and features lol
@ it’s not cope dude it’s literally as simple as a google search away to see these things. Battlemage is not more powerful than AMD gpus yet, it gets up past the 4060. Which is good, I’m all for intel, but you’re making baseless claims because those cards haven’t even come out yet.
Also, XESS is indeed quite good. I never said it wasn’t. I simply said that FSR still works very well now, and it has its purpose that doesn’t require buying new gen hardware to use, and that’s COOL.
I’m just as pissed as the next guy that games are using sub-res textures and anti aliasing/upscaling to make the games look good. I was just pointing out the issue.
Whenever I play a newer game, I always check the graphic settings to disable things that intentionally blur parts of the image. Our eyes blur things, like rapid motion and distant things, naturally. Trying to emulate something that our eyes do automatically makes the image look bad at best, and headache inducing at worst.
Yeah blur in games is a cancer
exactly!
and that is why me and some of my friends call it "artificial blur"
I think motionblur is trying to emulate the effect of a camera using a slower shutter speed.
Only time i like blur is depth of field in cutscenes (ex. Ghost of Tsushima), otherwise keep that TAA far away
Your eyes can't blur a screen stopped in place like motion blur, people hate it because it's not well implemented, but just compare ubisoft's new star wars game with respawn's star wars game and you can instantly see how not having blut makes movement feel robotic, now they can do per object blur or screen motion blur which is usually bad because it just blurs everything,but can't blame people that instantly turn it off from ptsd because it's trash in most games
I was sixteen when Fallout 4 came out and it was the first experience I ever had with TAA. I remember it to this day if that says anything. I was simply mind blown that nobody else could notice it. All my friends thought I was insane!! At the time, the game hurt my eyes so bad that I couldn't play it. Little did I know, the entire industry would adopt this actual game ruining trend.
man I am convinced Fo4's TAA singlehandedly caused half the headaches i got around that time.
literally the same story here, that game is disgusting to look at without mods
That's how I felt about LCDs. Still do. I can't wait to dance on their grave when the technology is finally burried. CRTs illuminated pixels very briefly, just as the electron beam passed by. This meant perfect motion clarity as long as your framerate matched your refresh rate. On a constant illumination monitor like a typical LCD the image is stationary while your eyes move following something on the screen. This is caused persistence blur. If you take a 100 Hz 1440p monitor and focus on an object moving across the screen in 1 second it will move 26 pixels/frame. This means that your eyes move 26 pixels while the object is stationary, each frame. This means 26 pixels of linear blur.
It's going to look like shit. It's not going to match the motion clarity of a CRT before it gets into the 1000 Hz range and even then there may just inherently be too much ghosting/inverse ghosting and other artifacts with LCDs to ever reasonably do that. At 500 Hz they don't really keep up. OLED can do this; the pixels switch very quickly and if you have the bandwidth you can crank out 1000 crisp frames per second without any ghosting or overdrive artifacts. MicroLED can do very high brightness for brief durations. Laser projectors are a bit of a wild card and might be able to do very high refresh rates.
@@soylentgreenb Holy shit, I'm so glad that it isn't just me.
I remember having headaches from TAA in Enderal (a Skyrim total conversion mod).
Solution was to use both TAA and 4k, DSR option enabled in nvidia panel.
i thought my eyesight was getting bad but I'm glad you all have noticed the blur in these modern games i miss the crisp clear look from older games.
first time I boot up Mass Effect Andromeda, I was shocked to see how blurry it is... ruin the vibe of space scifi from the initial trilogy (not to mention that the game is also bad).
to my surprise, here in 2024, game dev don't make object more clear, they opted to optimize TAA in a form of upscaling. The bastard Nvidia is also suspect of this, remember when they tease DLSS as optional, more like premium option? Now all games requires it to run smoothly.
you love aliasing right? well just turn of TAA/FSR/DLSS and have fun with the freaking bad graphics
@@PrefoX the thing is, a game from decade ago may benefit from AA off. And early PS4 titles such as Uncharted 4 or TLOU remastered doesn't even use TAA on launch. Those game looks absolutely pristine.
And then games like RDR2 came along with advanced TAA implementation that became a staple standard of modern graphics, BUT somehow other dev can't keep up with it without sacrificing performance because of poor optimization.
I recently played Alan Wake 2, it has DLSS on auto, and implement DLAA... Looks blurry as hell except the character faces. Any object located more than 5 feet of your character reminds me of taking off my glasses, not to mention how hard it is to even have 60fps on mid range hardware.
Easy turn it off
@@nickochioneantony9288 rdr2 TAA is bad as well. And MSAA causes weird artifacts like trees glitching.
The thing about TAA is not just blurring the static image, but it makes frame transition looks smoother.
This explains a few things. Ever since upgrading my computer and enabling TAA in games I've been noticing games are suddenly really blurry in motion. Was really confused as higher specs usually means a clearer image, or so you'd think.
Yep, this is why I don't use TAA. Some newer games offer DLAA if you have an Nvidia card which usually looks quite a bit better.
Yeah when I use the anti aliasing in mhw they always make that game looks so shit that u needed to search in the community the perfect right settings on gpu to make that game looks sharp
So this problem only in pc games?
@@arthurmorgan8794 No as for consolesit's on by default and you can't turn it off. PC games at least have a chance to change settings.
Depends also if your monitor doing ghosting
this is one of the reasons why 1080p feels lackluster in modern games. playing a game from 10 years ago at 1080p looks crisp in a way that can be only achieved with 1440p or 2160p in modern games.
U can "dlss" 1080p to 1440 and it will look better than 1080 native.
Someone telling themselves that a 1080P Monitor from 2011 "is still okay".
@@TheNerd this is why PC gamers are such hypocrites. You say old gens hold gaming behind, and then you gatekeep technology that most people can't afford. "Oh old monitor? Why don't you have 2024 8k with $1000 GPU??? Bad graphics is your fault."
@@TheNerd1080p is great. Small screen and easy to get super high frame rates for high refresh rate monitors
Dude not just 1080p.Even in most old games 720p looks more sharper than todays 1080p native resolution.
I always prefer options. More of those options is always better. I've always disliked motion blur and TAA blur. I like to see the detail and the cost of bad looking hair and jagged lines everywhere is worth it to me.
I think the same the best quality and detail for me is without any filter, althought i get aliasing it doesn't bother me
supersampling is best of both worlds... if your pc doesnt overheat and explode.
Yes, options. Devs just don't like to put effort(money) into it. But they really should. Players will choose for themselves what is preferable to them. I always disable motion blur. But sometimes I have to use some kind of AA. Glimmering in some games may be a huge issue. I tried to play first Shadow Warrior and it was just really bad. Really good AA is when you increase the resolution by 4x and then scale it down, which is really costing. TAA wins here 'cause it a much more "light". But blurry. And while sharpening may not be the best, it's better than without it. I play a lot of warframe and I've tried different settings, TAA with full sharpness is more nice to my eyes.
P.S I've got some "easy to be tired" eyes. I can play some games 14 hours a day and be ok. But if it has glimmering, excessive blur, 30 fps(not 60), stuttering or something - 1 hour of it can destroy my will to play a game with a severe headache and eyestrain. So for me it's often not the choice "it looks prettier". It's "I won't die with migraine with these settings". And TAA helps. If my videocard is not good enough for better AA AND if it has sharpening. But I've seen only warframe to implement it. Other times I had to turn it on in nvidia drivers and it was somewhat ok. Really, if devs just try they can make blur really nice or make game not glimmer so much without it.
P.P.S. I think that gta V has really good blur in games. I hate depth of field, motion blur etc, but in this game(and only) I even chose to to set it on.
You guys should jump on 4k. I dont see the difference on this resolution for anything AA related. So you never need to enable AA.
I've never minded jaggies. I hate anti-aliasing. There's only one sort I'm alright with, and that's FXAA.
The main problem is that developers aren’t at all in love with their games, they have corporate assholes botching their creations for extra profit which in return makes the devs say fuck it it’s not mine now anyways let’s just push this thing out and get paid. It’s not like back in the day when every person working at the company had love for what they were doing
You can break aliasing down into three general sources.
1. texture aliasing, this was solved completely in the 90's with mip mapping and anisotropic filtering so you may not even be aware it was a thing. on any kind of modern graphics hardware plain textures will never alias.
2. geometry aliasing, since polygons are mathemathical vector data, they will always be undersampled on any kind of digital display, msaa is the perfect solution for this type of aliasing, but even simple filters like smaa can have a high success rate with this type of aliasing, there are even post effects specifically optimized for geometry aliasing that aren't applied to the whole image such as SRAA and GPAA.
3. shader aliasing, this covers a lot of possible effects like normal mapping, shadow edges, rim lighting, HDR (not monitor), refractions and reflections, etc. there is not a good universal solution for every single one, but techniques like pre-resolve tone mapping, LEAN/CLEAN mapping, toksvig mapping etc. can effectively clean up the most common ones, and if you have some special requirement you can always do in-shader supersampling or custom filtering.
TAA is attractive because it is a simple "drop-in" effect that requires little to no effort, but if you are simply careful, all aliasing can be adressed with not much more effort. If you simply solve each source of aliasing using well known techniques you will be left with a clean, sharp image that does not require agressive post processing.
Hey thanks for this comment. I have a developer resource on how to combat aliasing issues in games and improve AA quality. I hope you can give it a look and possibly contribute any sort of tips you might have that you think is missing: www.reddit.com/r/MotionClarity/s/S9t1LgQwgz
So you are saying that we get the modern slop purely because the devs are lazy to use proper techniques?
@@heksogen4788 probably not even the devs, but the companies themselfs most of the time, why would they spend more money on finding proper solutions if there is just one single drop in solution most people wont even notice is actually bad. its not like we were expecting much from those companies at this point since they have been milking the same games over and over by basicly releasing dlc's or updates as new games whilst somehow still making the new games more expensive and worse then thier predecesor.
@@heksogen4788Unfortunately, companies are made by time and deadlines. It's like how games come out buggy. If a game isn't being released, they're losing money. Made by time, not looks. If something is easy and cheap? Do it. If it takes hours to optimize, ignore it, get it out the door and fix it later.
It's honestly disappointing as I feel a game should be as optimized as possible like how N64, GameCube, PS2 and everything had limitations people worked around to get the best out of their system like Banjo and Donkey Kong Country. Amazing games for their times pushing the system to the limits. Now? Probably some lag in that new triple A game that's pushing limits for the wrong reasons.
I turn HP enhance on my monitor and fixes everything close enough, then turn up contrast and lower sharpness lol@@Hybred
I was noticing these modern games looking softer than they should. Thank you for this video!! I perfer a sharp imagine but i see why TAA exists because these consoles cant handle native 4k in high frame rate modes so you would see alot of jaggies with it off but that might actually be better. TAA gets rid of so much detail that it makes it look like we are going backwards in graphics evolution
PS5 console still need TAA ? or is it just ps4 console era ?
@sergiovinicius2221 PS5 uses it alot but because games run at higher resolution than ps4 the muddyness will be less obvious. But games like marvel spiderman 2 in the visaul mode looks stunning but the performance mode looks more blurry. Performance modes in games make games look more muddy. I haven't been playing much games these past few years but GTA 5 enhanced does have that muddy look in performance RT mode but zero jaggies. But it's so nice to play gta 5 at 60fps and game still looks good despite it using TAA. TAA negative effects are less pronounced at higher resolution
it look like we are going backwards in graphics evolution
yeah that's on point, it's like we are progressing back to 7 gen fidelity-wise
The higher the resolution, the less noticeable jaggies are thankfully. When I switched to 4K years ago, if a game didn't support real (non-post-processed) anti-aliasing like MSAA/SGSSAA or good old-fashioned brute force Super Sampling, I just turned AA off. TAA/FXAA just make everything look way too blurry. You hardly notice the jaggies, even now when I switched from 27" 4K monitor to playing on a 4K 65" OLED. Sure, it's a little more noticeable using the OLED, but that's because I only sit about 6 feet away, which is pretty close. It sucks though that if I wanna play my Series X or PS5 instead of PC, or are playing a shit console port on PC, that you can't usually turn AA off.
I think 4K was introduced to the general public much too soon. Displays can output to that resolution, but even for simple video streaming many people don't have adequate bandwidth, and for gaming the cards need radiators that take up a fraction of the case.
2 things that usually come with that in the same package:
- motion blur (although, that u can usually turn off thankfully)
- ultra low unchangeable fov that literally makes my eyes hurt
Also overdone DOF
The unchangeable low FOV has turned me off of modern games entirely. It literally makes me feel sick.
@@Sanguivore same. Often you can manually change it by editing configs, or find mods to fix it.
@@trapper1211 Yeah, when that’s an option, I definitely go looking for it if it’s a game I really wanna play.
A 70 degree field of view is the standard. Most people say 90 degrees is the sweet spot.
This is the equivalent of standing 2 meters away from a large window and taking one step forward, sure you can see a bit more of what's outside but it's not earth shattering.
High FOV looks ridiculous, like you're a goat with eyes on the side of your head. How is that the less sickness inducing version for you? 😂
I'm not trying to be a jerk, I've just got a good nose for poor excuses. MOST modern games have a generous enough FOV or the ability to change it. So what's the real issue mate? 😂
Thank you for your work. I wish more people would be aware of developer's annoying abuse of TAA (using pixelated transparent assert assets and then blurring the whose whole image with excess I excessive TAA for us to not see). I'm grateful both for this video and any mod you worked on. Anyone combating this insanity deserves a medal.
TAA itself has benefits, but developers rely on it completely for the game to be even tolerable, the worst case being perhaps RDR2, where it's unplayable with no TAA even at higher resolutions (I tested 2k + MSAA + SMAA). Another issue is that they make TAA too strong, but perhaps the technology itself is bad and should vbe replaced as it would ruin textures in any case.
Third issue and a very big one is - Using TAA in upscaling comparison materials to cheat and exaggerate the result. Upscaling uses a TAA lacking image to create a bigger resolution one. Then they compare it to a non upscaled lower resolution image with TAA ENABLED ramvbling that upscaling has added back details magically as the smaller resolution image with TAA lacks them. Yet it is a silly lie, the main difference is that TAA blurs a lot of details and a lower resolution image has them hidden, whereas the higher resolution AI upscaled image has no such issue due to upscaling a non TAA version of a lower resolution image (so TAA haven't ruined all textre details yet). Really dishonest marketing of what upscaling can actually do.
For those wondering why this is so prevalent:
This is not because because developers are lazy, but more so due to the fact that most modern engines use deferred rendering, whereas most engines prior used forward rendering. Deferred rendering supports more lights in a given scene which are also more accurate, allowing for complex scenes to be rendered relatively quickly. However it does not work well with traditional anti-aliasing methods because it apparently messes up the final render. The lighting of modern gaming is truly impressive, which is the main strength that deferred rendering brings to the table. It just means they more or less have to rely on TAA to fix the jaggies. Forward rendering can achieve similar levels of lighting by baking lightmaps and such, but that can increase development time, especially for complex scenes, and you will reduce the ability to have dynamic light interactions. Developers are not necessarily being lazy, they are just trading off one area of quality for another. I do hate TAA though. The Godot engine uses "forward+" rendering which apparently uses some deferred rendering techniques along with a "clustered lighting" to deal with complex lighting, but is still capable of using MSAA. Might be worth looking into for anyone who is interested.
Its still a developer laziness problem, because theirs deferred rendered games with thin geometry that either have decent TAA or good non-temporal options, because they're not using generic engine defaults and they actually tweak it and analyze the image in motion.
baking lightmaps? oh man i havent heard of that since source engine
For me, I mostly solved the issue by using Dynamic Super Sampling (DSR Factors) in Nvidia control panel @ 1.5x resolution and 50% smoothness, and then combining that with DLSS Quality. It then doesn't cost much FPS, but it makes a MASSIVE difference in RDR2 specifically where I can then still have TAA on at a low setting.
That 1.5x is because I'm already on 3440x1440. If you use 16:9 @ 1080p, then increase the DSR factor further.
😂 And your PC is nice and stable. Comfortable temps and everything is just coasting along?? Not a chance if you also have more than 70 frames. I don't know why folks like you try to use cope for RDR2's terrible TAA. And the whole thing about putting it on low, doesn't ease the issue. It introduces more. DRS is great. But then you said DLSS. Both of them together in RDR2 produces ghosting. And trees look terrible at night. DLSS in RDR2 is bad. No amount of cope can fix that@@elmhurstenglish5938
they still do that in source 2, quite based though @@mahuba2553
As a solo game developer, I find it funny if some big studio makes a game without offering an option to use your preferred anti aliasing or switch it completely off.
I really hate it when I have to edit game files or find a modder that already did it; I wanna play the game but my pc is shit and then I can't disable it... Bruh
because as a solo developer you’re closer to the work and actually listen to people like bro in the vid. thank you for noticing
apparently in Unreal Engine , epic have made many rendering features effectively tied to TAA so that it needs to be enabled for some things to work.
I heard once that some studios rely on it for fast transparency, as they use dithering and rely on TAA to blur the dithering together
In big game, targeted at casual players. These stuff makes no sense to them and doesn't really matter.
Kinda why any choice are less the more mainstream things is.
Take Minecraft for example. Their devs seems like doing nothing. Even if they do it's like decades since the game released but nothing much changed.
While stuff like terraria who targets semi niche audience has all these crazy things on them every .1 update.
@@Nanatajaa Yeah ofc Minecraft had no patches...
Please change your Name asap.
The problem really started when the Deferred Rendering started getting more popular, towards the end of the 00s.
While DR provides WAY faster lighting effects and other great optimizations compared to the older Direct Rendering pipeline, it also came with a bunch of new problems: it broke the MSAA and made transparency effects extra difficult or tasking to perform.
The former, together with the increasing playing resolutions and rising amount of shader-oriented noise and artifacts, resulted the birth of these blurry Post-Processing based anti-aliasing methods, such as FXAA and now TAA. And all that on top of the already then ongoing "bloom & blur" visual trends, and you now got games that are like senior ladies wearing waaay too thick layer of tacky makeup.
You only realize how far down we've fallen when you go back and play some ~2004-2005 AAA games again. The crisp, sharp, non-obscured visibility truly is refreshing.
You know to trust a commenter when he has a f*** google plus profile picture.
Well said!
Yeah those PS2 graphics are really next level
This deferred rendering is the root cause here. It basicly allows easier way to scale up amount of light sources in scene but in reality, every game out there can be made with direct rendering pipeline. Engines can be written in way that each light source affect group of objects that are close enough and priorize them, and of course static lights in static objects can be precalculated to textures.
The deferred rendering tech demos back in the day were really misleading. Most game scenes don't actually feature a huge amount of tiny point lights, and light sources in games will almost always reach all the walls in a room. Lights IRL don't just affect small bubbles of space- they subtly spread out over very large distances.
Deferred rendering allows us to scale up the number of light sources, but that is only a single of the many benefits of a deferred lighting render pipeline.
Forward lighting models mean that for each fragment / pixel on a geometry we do all the lighting computations on a single pass, so drawing geometry if it passes a basic depth test we do all the expensive lighting calculations. This is bad because geometry is not in order because sorting geometry is a CPU task and really slow and not at all cache optimized. In busy scenes this can mean that for just a pixel you end up running the lighting calculations anywhere from just once to over fifty times. Then consider that for all the lights you have a single run through the full lighting calculations means considering the contribution of every light and generally with an additive light model summing the results and then correcting in post processing.
With a deferred model we still consider each light source, but we now have the guarantee that we only need to calculate lighting for each pixel/fragment once and only once, because we use the first pass for depth testing and doing simple drawing of all sorts of world data into different frame buffers, and on frame 1 we have all the data spatially localised and we have done the depth tests to make sure we are only doing a single correct lighting calculation for that pixel/fragment.
Because we basically divided our work load into a tiny fraction of what it would be in a forward rendering pipeline, we actually have the time budget to put in any kind of decent graphics effects. Immediate rendering simply does not allow us to put the modern standards of graphic fidelity into games.
MSAA isn't really visually "better" than other anti aliasing solutions. Deferred rendering doesn't mean games made with that pipeline be "blurry". It however make it way faster to do things like SSAO, SSR and layered transparency.
Baked lighting is good, but does not work if you have moving light sources (This is most emissive light sources in modern games btw, think explosions, muzzle flash, swinging lamps, etc). It does not work if you have moving geometry (player models, cars, etc). It means that making changes to levels during development takes a lot of time unless you batch changes together to do a nightly bake or whatever, but either way it introduces dev ops complexity and often makes QA testing slower. It's a powerful tool but not a universal one.
Also, MSAA is not "broken" or impossible in a deferred renderer, it's just not really worth doing because visually it has some pretty obvious shimmering that people like to forget about when getting nostalgic. You can use the world position gbuffer and do edge detection quite easily with a shader, then do a standard super sampling on the found interesting pixels/fragments. It's maybe slightly more time expensive than SMAA in a forward renderer but still cheap. But.. why bother adding it? It's added complexity for a mediocre antialiasing algorithm.
I always had that feeling the "older" games have FELT better visually. They didn't look better in terms of texture or hyper realistic lighting, but somehow offered a more pleasing and cleaner overall visual experience. 4K high refresh rate gaming was looking pretty optimistic (which would solve a lot of aliasing issues) but then they started pushing the "next gen" technology before we had hardware to support it.
Path tracing and every other tech that promises next generation visuals just ruined performance and clarity. We are now gaming in sub 1080p with sub 60 fps, looking at a blurry mess.
Developers just turn on every engine feature, slap it on their game without thinking and expect a miracle, where older games had to manually craft the lighting and visuals to min-max the existing technology.
First time I hear someone other than me say that TAA affects highlights. This was a great video. Keep them coming.
Just posted one a few moments ago, its still marked as unlisted as I wait for it to process & create a thumbnail for it. So coming very soon, either later today or a few days from now
MY BROTHER IN CHRIST. THANK YOU.
I was noticing the blur for years and it has been driving me nuts. Didn't have a clue that TAA was tge issue
TAA may not be the issue; it depends if the game is even using it. AA done improperly is worse than no AA, no matter the resolution.
TAA is one of the first settings i disable in a game. Not only boosts performance but looks sharper. DLSS and FSR add the same blurryness but the performance gains greatly offset that.
TAA off looks sharper LOL jagged isnt sharper
@@wallacesousuke1433 you're not that bright.
@@RequiemOfSolo jagged edges and shimmering are the exact opposite of sharp, dear brainlet
@@wallacesousuke1433 do you know what sharp means?
@@thepastarat tell me again how jagged/pixelated edges and insane shimmering equal "sharper", smartass
Really informative and straightforward! Great video, appreciate your hard work and effort.
This video cleared up for me why so many newer games on my 1440p monitor seem to look "blurrier" than some older games I play. It started to feel like upgrading from 1080p a few years back was pointless but I guess even that is worse now
I always used to put graphics on ultra and then wonder why it looks more blurred than on medium. When I finally figured out it was TAA I started changing it to SMAA. Even though the edges of some objects are still slightly pixelated on SMAA the overall image is considerably sharper and easier to look at (you don't loose the detail). DLAA is the best though.
Here is a list of effects that I always turn off. They have little to no performace cost but greatly detract from the experience when turned on.
1. Bloom (if it''s too glowie)
2. Film Grain
3. Chromatic Aboration/ Lens Distortion
4. Motion Blur
5. Lens Effects/ Lens Flare
5. Vignetting
6. Depth of Field if implemented badly (LEGO Batman I'm lookin at you!)
7. Screen Effects such as Blood and Dirt
All of these effects are only for making it feel as if you are watching a film. I never understood why they keep implementing Camera effects into games.
@@ScoutReaper-zn1rz Yeah, pretty much all post processing effects. They have little to no impact on performance but they make the game cinematic. Imo it's fine for an action game like DMC or something like that where they add to the overall "bling" but for an FPS where you have to focus on things on the screen it's distracting.
My eyes have been sorta opened. Well not really my eyes are kinda bad..
This was exactly me aswell !!!! Lol
I remember when TAA was invented, and how cool it sounded. I'm a graphics programmer, so I keep track of that stuff, read papers, etc.
It seemed like a pretty genius solution to aliasing.
I think, as you said a few times, it really comes down to how well it's done in each game specifically.
Bad TAA could be as bad as the shots you're showing in the video, or it can be well done in other instances.
It's debatable to say that information is lost in the same way, and in the same amount as, say, a gaussian blur. TAA isn't simply a blurring algorithm.
Implementations that can properly use motion vector fields and other techniques to "correct" for the blurriness would be good examples of TAA working well, probably.
This. And often its just horrible settings. Like in Fallout 4, where they used the worst settings possible for TAA. Especially in F4 VR this was an issue, the blurriness literally caused headaches and it turns out you just need to adjust 3-4 settings in an .ini file to fix it and still have good anti-aliasing effects...
one game that does TAA very well is War Thunder surprisingly. The TAA effect makes the game look so smooth that it feels jittery to look at without it.
@r1zmy I genuinely enjoy TAA in WT as it's a very well-done anti-aliasing option compared to DLSS. I turn on DLSS for the sole reason of playing 'spot the dot' in GRB/ARB when I'm trying to spot air targets.
@@r1zmya little bit off topic: can I enjoy War Thunder without spending money? Or is it heavily p2w?
Also a programmer: TAA was massively oversold when it starting being implemented. Always has been shite, always will be shite. It's the crutch bad devs lean on who can't be assed to properly spec their assets, tune hot spots, and build performant games.
RDR2 is a real pain in that regard, I'm not surprised you included it. It's the only effective AA on that game, and honestly blur is not the worst, it's the ghosting with camera movement. Especially with light sources at night, it almost feel like in older versions of windows when it froze and you had that window cascade effect
You can use dlsstweaks to force DLAA in the game, or just use DLSS quality depending on the resolution.
ye, DLSS Q fixes blur.
theres MSAA and DLAA in RDR2
the TAA implementation on RDR2 is only half-decent on DX12
Vulkan's TAA looks like someone is smearing vaseline on my display
The problem with DLSS in RDR2 is Rockstar locked it to an ancient version of DLSS 2 so you have to hack it and forget about online if you want to upgrade the DLL to a newer much more capable version. It's the only game that I know of that I can't just drop in the latest nvngx_dlss.dll and be done with it .... I'm pretty much done with Rockstar and Take Two and their sh*tty business practices
I recall first noticing this is Skyrim, and modders fixed it with custom shaders. Good video, great summary at the end there! :)
As soon as I found out that TAA used past frames I knew it would blur at motion, and began turning it off in fast paced or fps games
its because by default in an engine like unreal engine for example, UE5 turns on FSR/DLSS(or another scaler) by default AND applies an aggressive TAA and their motion blur, it can all be disabled very easily but as more games are made with unreal as it gets more popular(like unity) we're going to see people keeping the default 70% scaling with taa on because they couldnt be bothered to disable it or just didnt know
@@HalbeargameZThank you. I'll look into that myself.
I turned off motion blur in mw2019 and it looked so bad for some reason, blur just makes everything more realistic
@@QU141. I know I'm late, but the only situation imo in which motion blur could be turned on, is when playing a very hefty story game with under 60 fps. Like if you have a 144hz+ monitor and enough frames for the monitor in an fps game, motion blur will ALWAYS be terrible.
Yeah me too, especially in r6s. These days I turn off AA completely off in any fps game I play
I turn TAA off on any game that has it. I first picked up on this issue with Skyrim.
It gives me headaches as my eyes are constantly trying to focus and deblur the deliberately blurred image.
I have a powerful PC and I turn off most graphics settings like bloom, motion blur, and depth of field. All I need is textures, shadows, and MSAA. The other settings are just unnecessary processing and the game actually looks better without them.
@@One.Zero.One101I get motion blur and depth of field, but bloom? I find bloom quite nice
Yea taa is pure trash
bruh
@@tdoyrbloom is nasty and annoying
Impressive presentation! I never knew how bad TAA can make a game look
shimmering and aliasing is way worse than TAA also at higher resolutions TAA looks less blurry
@@luca4870 Like he said problem isn't existence of TAA but absence of any choice of other way to do AA. It isn't and shouldn't be TAA or no AA discussion.
You must be too young to remember when games offered three or more AA options to choose from.
I'm pretty sure this is a UE5 problem. I used it on a few games and it kept crashing at first and then when I did get it running it didn't perform very well. Strange that engine runs like that. I've seen amazing presentations from that engine but compared to actually running games in it like Source 2 with Half Life Alyx and Counter Strike 2 I'm kind of disappointed with UE5 right now.
@@gorky_vk Good comment. Another huge problem is that most people today don't understand the difference between _true_ AA (which _increases_ detail/accuracy) and fake AA (which decreases detail/accuracy).
Therefore they believe that all antialiasing causes blur to one degree or another, which is of course false.
@@gorky_vk I'm probably old enough to be your dad but devs don't put other AA methods because incompatibility issue and those old AA methods were so costly only people who had latest and greatest could use them anyway.
Woah!! You got some views man! Great job!!! 👏 love the content
Didn't realize you saw this video of mine as well. Thank you! This is an issue in every Call of Duty starting with Vanguard btw. BOCW & MW2019 are the last to give us full anti-aliasing options
I always try to turn off post processing. I had myopia for 15 years, I wore glasses, then I did laser correction and enjoyed my life. I can’t understand how someone can voluntarily worsen visibility like that.
What kind of laser correction? Been thinking about getting it myself
-opens game
-graphics settings
-advanced options
-turn off antialising
-turn off MSAA
-turn off DLSS
Yep, it's gaming time
It's sad that so few appreciate crisp aliased polygons these days.
@@shru_u you just need an eye trained on ps1 jumpy pixelated stuff
dont forget turning off vsync
Chromatic aberration too. That one fucks with my head so bad
@@shru_u I would rather see those delicious crispy pixel borders than a washed-out image that looks like an oil painting.
Thank you for making an educational video about this topic, I hate TAA so much and I don't see enough protesting from players about being forced to use it in game and accepting it's overwhelmingly negative side effects
Oh and lazy developers relying on it to undersample and then literally just blur it back into existence, instead of actually optimising games and writing good code
Genuinely moronic thing to say. You know jack shit about graphics rendering.@@maixyt
Yeah well the main problem is most people don’t realize it’s a problem. They don’t know what’s causing this issue. Another thing that will become a large issue for people like me and you is the fact the modern game engines like UE5 are designed with technology like TAA in mind and basically all games on this engine use this method of AA to cover up all the artifacts caused by the new lighting solutions being pushed onto hardware that isn’t truly ready to handle it. So all the problematic lighting and effects are rendered at a low res and are being blurred for the sake of covering up the fact that your hardware can’t convincingly pull it off yet.
@@maixyt no. The sad reality is that many expensive effects are greatly reliant on some temporal smoothing, and TAA is just that. GPUs are not efficient enough (especially with memory latency and with register counts) to allow for such features without introducing exceptionally harsh performance impacts.
Some of these effects include:
-PCSS
-SSAO
-Stochastic-opaque transparencies
-Stochastic rough SSR or rough raytraced reflections
-Volumetric fog/lighting and other kind of raymarched effects, especially with volumetric grids
Do notice how all of these are reliant on large sample counts and/or large memory traversals. Devs aren't lazy, they simply cannot beat the inherent limitation of GPUs, and this is the only solution (especially on consoles) when the company/studio pressures them to make graphics as rich as possible. The sole reason why we have such large advancements in graphics tech as of late is because it became more affordable to implement due to TAA.
"Pick your poison", so to speak
@@Kolyasisan I wasn't downplaying the role played by TAA within making graphical strides. However, I should've elaborated more on my use of "lazy", I wasn't referring to devs abilities to optimise these graphical techniques for all types of machines not just the latest and most performant hardware, because that would be really hard or outright impossible, especially with the size of dev teams resulting in extra communication time within a time limited scenario and the difficulty of integrating all of the graphical effects without any conflict resulting in unintended glitches or artifacting. I was referring to Step 0 within the process of creating anything, what is the scope? At it's core The Finals is a fun, action filled, fast paced shooter, with emphasis on the importance of being able to see quick moving enemies and items clearly (and invisible light classes). Why would I need the latest and most technologically advanced GFX which require highly performant hardware? There are perfectly good more 'traditional' rendering methods and effects which don't require temporal filters to work properly. Below I will list additional accounts of how I view modern devs "laziness".
However I'm going to skip over things such as motion sickness, blurry and out of date information, and having the option to configure TAA values, in game, and The Finals already having the ability to turn off TAA from within the engine.ini file, and then blocking that config file while not giving an option to disable it in game, showing that turning off TAA is possible and works just fine (I did this for the closed beta), but blocking players access to it (yes I know that engine.ini was used for cheating exploit, but you can whitelist specific commands, as talked about by Hybred in a newer video of his).
I would much rather HAVE THE OPTION to decide how the game looks or performs for myself, as in I would pick either 1:1 sampled effects or even undersampled effects over TAA being plastered on my screen any day.
Also, there is absolutely no way that temporal effects can't be implemented to ONLY certain effects such as volumetric smoke. Picking out a few of the effects, you say that volumetric smoke, PCSS, and SSAO require temporal filters to hide undersampling artifacts? Then how come Watch Dogs 2 (which I've played through recently) with all of these effects doesn't force any type of AA let alone TAA on my entire screen? Yes there may be temporal effects used with those GFX but they DO NOT affect everything, and they are NOT just an overlay covering my entire screen. In my opinion this is what a well developed graphics system looks like, it's within the scope of the game being a thirdperson, moderately active shooter with elements of story, with graphical effects that not only look great but are also implemented well. It is also in collaboration with nVidia and very well documented if you want to search it up. Or even the Ghostrunner games, they look absolutely gorgeous while being one of the most fast faced genres out there, managing to stay away from TAA and any sort of AA for that matter, staying within the devs scope of aiblity and making the most of what they know and what is available to the dev team, enabling me to play with maxed settings while still enjoying the beautiful experience along with the fast paced combat. Now this is not what I consider a well DEVELOPED graphics system, this is what I consider a perfectly tasteful and skillful use of a limited number of the more traditional effects, to get something that looks graphically competitive with modern effects while also wiping the floor with the FPS achieved in comparison.
Also also, I don't know much about alternative methods out there, but I am willing to place a substantial bet that there are much better solutions which are unfortunately being overlooked because TAA has become the easy to implement and established status quo.
At it's basis my issues with the modernity of using TAA to fix issues introduced by undersampled effects, is that TAA isn't only a mediocre solution, but that it is forced upon players. I don't give a damn if my game looks jagged or the smoke looks a bit funky because it shimmers. I grew up on extremely limited hardware (GT730) which required going into config files (take for example Black Ops 3) to lower the settings further than the ingame menu allowed, I was playing at a resolution of 40% of 720p, aka 288p. This only made me more familiar with how tech works, and got me interested in optimisation and effective use of graphical effects. I would much rather have the option to customise the experience of any game for myself. Playing DayZ and Uncharted at maxed settings in so issue, because it's within the scope of the games, DayZ is a mostly slow - medium paced shooter, however since it's quite old it uses traditional techniques to the best of their abilities, even using volumetric lighting, resulting in a very playable ~130 average fps on maxed settings while looking beautiful. While Uncharted and singleplayer games alike such as God Of War, don't have much if any multiplayer interactions, therefore it is within the scope of the game to run them at ~60 fps with great visuals. And even then none of those rely on a full screen TAA filter to my knowledge.
Finally, TAA just annoys me and I personally don't like the look of it, therefore, if there is ever an option I would prefer to turn it off, the issue comes when there isn't an option, and that is unfortunately becoming a standard. Why wouldn't you want to have a choice? If you're accepting TAA as the new standard, that's cool, but I'm not accepting it. TAA isn't the be all and end all solution, there's always a compromise or alternative.
Just wanted to say that you have a good speaking voice and presented your information succinctly. Very informative video and easy to listen to!
TAA in motion make many current games unplayable... no point of having a fast high hz monitor
edit: It's incredible that they make so many effects with non-native resolution and yet the games are optimized (run) like shit
You still need a high refresh rate monitor to reduce sample & hold motion blur. Sample & hold blur is worse than TAA blur.
That's not how refreshrates work
a lot of effects actually need a non-native res buffer in order to not have to sample a ridiculous amount of times, i.e. you get the same effect doing a lower sample gaussian blur at a lower resolution and letting hardware bilinear filtering smoothen it out than doing it at native res and having to sample way more times for the blur to remain smooth
Well you’ve never experienced a game that doesn’t optimize for performance. So what you think is terrible optimization is more just not great optimization.
i have and it's not great
When we gamed on Televisions there wasn't the 'clarity' we get from the digital TVs today. It took me sometime to learn to ignore the 'pixel' and really only notice it now in screen shots. As the resolutions kept getting better I was impressed but now as you mention: It would seem they have outpaced the ability to program the detail needed for modern resolutions and have returned to the tricks which reminds me of the warm fuzziness of old CRT technology.
Modern lcd is the problem, way i ferrior to CRT
Age old debate between analog vs digital.
Legit. I think people forget we all werent gaming 4k 10 years ago. Most of us were probably still on a 720p or 1080p, some people still not even having HD yet. I still remember when HD became a thing.
Thank you for this, damn. I thought I was going crazy mentioning this to friends who didn't really mind.
The last CoD literally needed nvidia configs to not look blurred out.
I feel like the more a game relies on dlss the more it lacks the option for a clear picture.
Personally I'd wish we kept the counter-strike source clarity and went graphically up from there without skipping steps with pictures that look "ok" on a makro level but disgustingly blurry closer up.
How anyone could care for 4k without pointing to this problem is wild to me.
This! You are so right! Games, even in ridiculously low resolutions for today’s standards, used to look so clear and nice (i.e. the 2D eras of SNES and Mega Drive).
Today we have crazy amounts of polygons, very high resolutions, incredible artists creating textures and environments, only to have it all become a blurry crappy mess due to the over usage of so many different types of “image enhancement” systems that even overlap each other. It’s ridiculous.
Use AMD's FidelityFx CAS at max, it's so good... i cannot play without it anymore.
Thank you for shedding light on this. My brain is wired so that whenever I see a blurry image, my eyes just instinctively try to adjust their focus to make it not blurry. Now I know why I can't play modern AAA games without getting severe eyestrain, even if I turn off any "motion blur" options in the graphics settings.
This has been a huge issue for me as i have somewhat impared vision and the blurring of any antialiasing, and espectially taa, makes it much harder to see anything, ive just had to stop playing newer games and its incredibly frustrating
Why not just disable AA if it bothers you?
@@potatofuryy i do, but in a lot of new games it literally isn't an option, thats the problem
@@gwyneveresnow5781for game like that it’s the developers fault for only optimizing for one platform (most likely console) then the pc port gets half assed most of the time. Tho ports have gotten better
Now i understand why i felt like my rdr2 had motion blur even tho i turned motion blur off
In RDR2 specifically I found it looks better with FSR on the highest quality setting and sharpening cranked to the max. That compensates for the blurriness, you still get AA (it's forced on when you turn on FSR), and you get better FPS as a bonus.
@@256shadesofgreyYou should also enable RSR or DSR/DLDSR and use 1440p or 1620p as native resolution to have even less aliasing and blurryness. I don't know why but FSR looks better than DLSS on RDR II, I swear, not only looks better, I get more FP/s. Playing on an RTX 4070.
I use the upscaler mod with DLAA settings, still very blurry when turning the camera at 3440x1440, is DLAA just as bad as TAA when it comes to blurring?
Only semi related, but you reminded me of another Unreal thing that really bugs me and that the games industry has apparently just accepted as normal and fine; aggressive occlusion culling. It produces constant 'flickering' as level geometry is loaded in on the fly and you briefly see the, usually bright white, background of the level environment. It's less noticeable at higher frame-rates, but at 60 it very much is.
Doom uses this too, and its is not noticeable at least. I only found out by getting a bug where the fov was incorrect, allowing me to see the culling on the edges of the screen. Unreal engine does end up 1 frame late on the culling it seems, so you get that frame of nothing before it shows up
Developers are desperate to make relatively cheap consoles appear much more capable than they really are :) Look kids, if you want to game in 4K at 120 FPS with high-res textures rendered at maximum detail then you're gonna need a $1,500 computer with a $2,000 graphics card and that's that. Nothing's free, especially not hardware!
this issue is the main reason why I avoid ue4 games, it's so fucking annoying
Pretty much every game uses some kind of occlusion culling technique; unloading stuff that the player can't see, it's just particularly aggressive in ue 4/5. I believe it's possible for the dev to tune it, but it seems the standard configuration makes it noticeable. It wouldn't be so bad if the background you see was a dark colour, on night or underground levels it's barely noticable. It wonder if it's possible to make a shader that always fills a negative space left by culling with a colour that blends in better.
@@irritablerodent Today's computer graphics hardware and software techniques are SOOO astonishingly brilliant it blows my mind. Technically and creatively it's such a mind-blowing art form and that's really all I can think of to say about it. The people that think this stuff up, and then conceive & then manufacture the hardware, write and debug the device drivers, and write, debug and implement the software to produce modern PC gaming graphics are absolute geniuses. It's 100% pure, distilled human GENIUS in addition to being an absolutely mind-bending amount of work.
It's a shame that it's only major use is video games, which when you're honest with yourself about it, are mostly a waste of time. And, I *LIKE* gaming! :) I'll say straight-up that as far as massively time-consuming comfort-distractions go, it's probably better to waste your life on PC gaming than it is to waste it on heroin, for example.
But, the people that made it all possible, especially with regards to graphics, are incredibly brilliant. Just pure concentrated human genius.
You're a saint for this vid. I had no idea why everything seemed burry
going back and playing older games from 10+ years ago shows how much clearer everything was
I normally don't mind as much in a chill single player game, but in an fps where visibility and colors are minimal, I hate that this exists and only adds to how annoying it can be to see some player models. That's why I like colorful games lmao
I wonder who had the great idea of forcing TAA in Doom Eternal
For years now, TAA is the first thing I disable after Chromatic Aberration in any game that has it on. If there’s no straightforward option, I desperately look for a ini file edit or some sort of workaround. I simply cannot stand those.
Thanks for raising awareness!
I am glad you covered this!
I was heavily modding Skyrim and noticed how anti-aliasing blurs the entire-scene rather than just the 'jaggy edges'.
Since then, I swore to keep anti-aliasing off wherever I go. I don't care if I return from the future with Nvidia Quantum-RTX-9080 Ti. The blurriness is just not the way. If I am going to lose performance, might as well just get a higher-resolution monitor since they are more affordable these days for actual, TRUE less jagged edges.
first time experiencing really terrible TAA when playing halo infinite. so terrible that i had to google what's wrong. is it my settings? my PC? the game? why it is so freaking blurry only when i'm moving? that's where i first learned what TAA is. the worse part is, it can't be turned off at all.
Yea , ı cant play the games that wont alow the turn off the AA like halo infiniti and bf2042
I noticed this in some newer games especially in motion since I always turn off motion blur and the stupid 'film effects' before I event start. I had to go back and check a few times I actually had motion blur off since I could tell something was off when moving. Good to know im not crazy lol
if there’s one setting that i can confidently call useless, it’s definitely film grain
Even after removing all that in Warzone, once MW 3 integration happened on December 6th, even will all blur turned off it was very blurry and weird. Then i tried AMD's CAS, and omg everything looks way better, like even guns in your hand...
@@fishfood8711I think the film grain is helpful to give better aspect to smooth flat texture: on Warzone (1), I think that some weapons, or the table on the gunsmith, were looking like plastic (way to smooth). But maybe I was using TAA, since they're telling in parameters that is really good.
This helps me understand why modern games have so much smudge compared to older games
I'm so glad to have watched this video. I have felt for quite some time now that games are really blurry and yet when I go back to older titles that blur is nowhere to be seen. I think it's probably more noticeable to those of us with excellent vision.
I would recommend turning Anti Alias OFF altogether if you're playing at 4K native, as you won't notice the pixels or jagged edges as much.
When playing at 2K, it really depends on the implementation.
I find myself requiring TAA at 1080p for every single game, as the pixels are huge and the noise plus the jagged edges are very noticeable. Also I find other AA methods to be less blurry but still leave a lot of jagged edges in place or hurt performance too much.
Just use TAA + overrided sharpness = huge profit.
It's because you're using a 4k panel to run 1080p, if you run 1080p on 1080p panel, you wont need anti alias at all.
@@tharusmc9177 this makes no sense. 1080p to 4k is a perfect integer scaling, so 1080p signal should be identical. To take advantage of this, you need to use GPU scaling.
Monitor scaling on the other hand will not use integer scaling and with it the picture will be blurred, so avoid it.
@@Shajirr_ yeah but most people don't use gpu scaling and yeah you're right about it not making sense, idk what I was thinking heh
2k + MSAA would be better..
Pretty sure this is also a health issue, focusing your eyes on blurry images is horrible for them.
it’s not, but you’ll get tired more quickly.
I've noticed this in No Man's Sky VR.
Turning it on felt like I was not wearing my glasses, while turning it off was much clearer but jagged. So i preferd the off option.
VR games in general*
Specially on Skyrim VR, and if you use DLSS / DLAA... it's even worse.
Antialiasing isn't necessary as resolution increases. At 8K, without AA, you can't even see the jaggies even if you're looking.
@@WarningStrangerDanger yeah, but the angular resolution on VR is way too low
@@WarningStrangerDanger Yep, that's why DSR (Dynamic Super Sampling) is quite effective for this.
VR and a lot of post-processing effects just do not mix at all. the effects aren't designed to recreate real vision or accommodate the depth that comes with VR.
So many Unreal Engine VR games look really bad due to this
Awesome showcase of the issues. Hope more developers see this and make a push for it
Wow. You just explained the _feeling_ I've been getting from every recent 3d game I've played.
I thought it was just a trend that everyone was trying to jump on the motion blur bandwagon.
This is extremely noticable to me, because i used to play a bunch of very fast competitive games. Their goal was to be as fast and clear as possible, then grab for as much effects as your computer could handle second.
It wasn't uncommon for people to turn off things like motion blur and bloom - to keep visibility. (I chose to keep bloom, because it made immersion and beautiful scenes so much better).
Ps moving through grass is where i find this the most noticeable. Moving feels like someone took the speed blur effect from a racing game and cranked it beyond max.
I wonder if nvidia per game settings can force an override? (But could cause visual issues for some games)
It seems like they are just grabbing way too much past info. I think you are right about dialing it way down and combining it with msaa or similar.
This has been a thing for, what, ten years? Unfortunately forcing MSAA simply doesn't work in many cases. I'd say most cases, but I haven't played AAA games in a while.
i almost always turn off bloom because it makes most games look worse.
a little bloom is fine, but most games have too much bloom
this is like a miracle video shown to me, as someone who only played older games because of pc limitations i had, everything looked normal, but ever since i started playing modern games, it felt odd and uncomfortable to play, because i saw aliasing so bad it felt like looking at bad pixel art, so i tried turning towards anti aliasing, and putting the settings on "max", which was most of the time taa, and now i had a new issue, it felt like i couldn't see anything properly. and this just makes so much sense
I've just resorted to using DSR.. It's a brute force fix that shouldn't be necessary, but games look so much better when rendered at a higher than native resolution. I guess it's basically like using MSAA.. wait, what ever happened to MSAA?? Games always used to have it. Sure, it was expensive, but it looked good..
The issue is theirs more stuff for MSAA to super sample so it got more expensive therefore it's not used.
But if some people are super sampling anyways maybe they might as well bring it back? I know MSAA doesn't work well on certain effects either but I found a resource that worked around that issue but I can't remember the papers name. I'll have to look
MSAA can't effectively do it's thing in a deferred renderer (most game engines nowadays). You'll end up with a mismatch in the amount of samples for a given pixel between the geometry edges (however many AA samples you select, 2 ,4 or 8.) & the lighting since they're rendered in different passes which will look either very ugly or you match the samples in the lighting pass & give up a ton of performance (you'll be shading every affected pixel 2,4 or 8 times depending on the amount of samples). You can get around this however, but it's a lot of work & that'll only give you anti aliasing on geometry edges (which is the core reason it's defunct, it actually does next to nothing to actually solve modern aliasing problems which comes for the majority from shaders, not the geometry).
@@Hybred also GPUs nowadays aren't just built the same.. on the PS2 and X360 they had giant rasterizers and high bandwidth DRAM so overpowered you could do free MSAA or just abuse the fillrate to get an effect.. now we have barely enough bit width to feed the chip without undersampling things.
DSR is even more resource intensive than MSAA. On my 1080p monitor I find having to go 4x DSR for good results, which means the game is literally being rendered at 4k and downscaled down to 1080p. Too bad with my puny GPU I can only do this with really old or "oldschool grahics" games. And for games which only support borderless instead of true full screen, like Dread Templar, I have to change my desktop resolution to 4k first.
But even top of the line GPUs these days don't seem to be expected to run modern AAA games at 4k natively, and I don't think there would be much point using DSR combined with FSR or DLSS...
MSAA doesnt work well with deffered rendering which almost all games use now
THank you for the detailed review! It was a big surprise for me to see so blurrish Cyberpunk on 1080p monitor (RTX 4060, all High/1080p native with DLSS Quality and DLSS Off) compared to crystal clear Witcher 1 from 2008 and Crysis 2 from 2012. And it's crazy that 1080p native graphics quality looks like HD at best with raytracing/space screen reflections/grain/chromatic aberration/depth of field - all of them swtiched off. And only enabling DL DSR (1080p -> QHD -> 1080p round trip) made a picture a bit clearer but with at the cost of performance downgrade. If that blurrish nightmare continues I will just step aside rather than buy 4090-alike videocard and 4K monitor to have small details not being washed away with TAA-alike filters.
I'm pretty sure there are still many owners of 1080p IPS pretty good quality monitors not wishing to swap it with QHD/UltraHD ones.
I remember when Battlefield 1 came out, I always thought it looked amazing, but was always blown away by how effective the resolution scale option was, 1080p with 200% resolution scale looked insane. Any game with TAA still has this issue, even some games that don't use TAA still have the problem. I now play on a 4k monitor and so many games still look so blurry. I go back to GTA 5 and 1080p, 1440p and 4k look pretty close, the upgrades are subtle but there, but on RDR2 I quite literally can't play it at anything below 4k now. It's not that 4k shows a ton of detail that isn't there are lower resolutions, it just cleans up the blurriness of TAA
One game that actually has good anti-aliasing is Warframe with it's SMAA option. It has options of FXAA and also TAA that you can also adjust. TAA makes the game look blurry for sure but SMAA has a really nice crisp image with only very minor aliasing that is non-bothersome IMO.
I'm not surprised Warframe has these options as the previous creative director Steve was always a very big fan of graphics and they're always optimizing and adding new graphics tech.
They got a disable Anti Aliasing aswell.
At a high enough resolution, Warframe hardly needs AA in my opinion. I play without it. Very sharp overall.
I agree, I remember missing around with it's setting , their SMAA is perfect . No jaginess and no blur . I bet they are using their own implementation of the algorithms because it's clean as fuck
The TRUE wors thing is that most games dont have an option to disable this
Or having to configure internal files
you can add sharpening by your tv options (definition), on pc you can do it by reshade (luma sharpening)
You can disable in all games on PC
Lol, what are your ‘most games’?
@@choppachino Most modern games has no option to change "Anisotropic filtering", but it instead is usually combined with other stuff under "Post Processing effects" switch. Although fun stuff is that it is possible technically force-enable the older filtering options via graphics card settings (like NVIDIA Control Panel)
Thank you so much for mentioning the motion sickness. It's so frustrating but tons of game insist on "realistic" motion in their games and add blurring to it and its ten times worse.
THANK YOU!!! I've always talked about that between my colleagues and they always says that "is not that bad". But I'm still playing in a 1080p monitor and it is VERY noticeable and annoying. All the blurriness in modern games is just so counter-intuitive. What is the point of having giant 4k/8k textures if we're blurrying everything? Look at Dota 2, the game looks crisp and sharp, very well defined, even without MSAA on....I hope we have a solution for this in the near future.
It's not just TAA. It all went south when deferred rendering became the only option in DX11 and up.
TAA and DLSS are the only ways to get rid of aliasing, those "staircase shimmering" one might remember from Witcher 3, Just Cause series, GTA 5, or Rise of the Tomb Raider.
DX9 games had a lot more options, like multisampling or sparse-grid supersampling, but they are almost as costly as running it in 4k or 8k.
What do you mean "the only option"? Deferred rendering is just a technique for rendering scenes, specifically for decoupling some operations away from scene geometry and frag shaders rendering on them. It's just that it provided very important benefits to performance due to the way how GPUs' fixed function hardware works, which still continues in a lot of games and tech. You can do deferred in DX9 as well (and on the original xbox, too).
That is true, much of this is the fault of deferred rendering being used simply because "it allows for more dynamic lights" to be used. Meanwhile, they ignore that fact that it rules out the ability to use a bunch of other AA techniques effectively... and the fact the fake (aka pre-baked) lighting and shadow techniques have already gotten to the point of being good enough (though not perfect) but much easier to run on older/weaker hardware. So in other words, it would really just be better to stick to old, but refined techniques... as it would allow for higher framerates instead and better performance on older/weaker hardware (which also helps the environment by keeping that hardware relevant instead of requiring users to upgrade shit) and also handheld gaming consoles (and smartphones) which have much stricter power constraints.
I want to add some technical context here why TAA has been so popular.
Simple answer is that it requires very little GPU power to implement.
FXAA is still pretty fast, faster than TAA usually, but it doesnt necessarily "catch" all edges in an image as it does so mostly by contrast. Edges with little contrast may get no AA blending at all. Like TAA its relatively recent, even if its not as recent. Its mostly sharp, but doesnt do any subpixel rendering like MSAA does, but rather by just looking at the angle and approximating from there, which can lose sub-pixel details, but not to a noticeable degree.
MSAA is something youll probably run in on any game since the 2000s. Itll detect edges based on the actual geometry being rendered and thus typically catches all edges on the geometry and then samples subpixels to smooth the egde. It doesnt do transparency (Morphological Anti-Aliasing comes in here, but thats a different topic) but other than that its pretty much a brute-force approach and on higher settings gets a little intense. And I dont just mean the choice between 2x and 4x or even 8x, I mean it in the sense that modern games have a lot more geometry going on, which means more edges, which means more MSAA work. Especially with grass like on the Halo Infinite scenery this can easily become ridiculous. But on older games the performance hit was noticeable, but also handled by much weaker GPUs just fine.
In image quality either will beat a blur filter like TAA, MSAA especially, but FXAA is "fine" as well, low contrast edges dont stick out so much and quality can usually be adjusted to make it fairly pleasing to look at as well. MSAA its just not worth the performance penalty. FXAA though? It smooths all edges, transparency, shader-related, anything, and goes fast. Why isnt it used instead of TAA? I have no clue.
What Im looking forward to is FSR 3 though, because itll come with a mode that uses their upscaling method but uses it as a replacement for AA without actually needing to upscale, so image quality should stay similar to what MSAA and FXAA deliver without loss of detail and certainly without blur. Upscaling has been doing this for a while, has to, because if it didnt FSR and DLSS would be useless and just as ugly as no upscaling at all, so it smooths jaggies as it goes. Problem is that FSR especially still has issues with certain stuff, foliage, transparency, particle effects, shimmering, and running it purely as AA doesnt fix that. And DLAA for Nvidia is still only supported by a few games, even if its results are much better.
ive been a fan of FXAA for years. i am equally as confused as you are as to why developers have slowly refused to include it in games. its fast and it gets the job done just fine. its the "its better than nothing and less stressful than msaa" option so its a no brainer.
fxaa is blurry, smaa is much better @@afaqahmed43
@@afaqahmed43Probably have to do with its game engine?
Finally! People are talking about this and I'm happy!
Thanks for discussing this, I have been saying for years that TAA is not an ideal solution. Whenever I find a game that has it implemented poorly, one of the first things I will always do is try and find a config file to manually disable it. I would rather have the jaggies than blurry textures. The amount of games that come out and have awful TAA implementation is simply staggering, I don't know why the industry is gravitating towards this method. Surely these devs are aware of the obvious drop in quality it creates in many cases.
Because it's the only practical method that can handle spatial and temporal aliasing/noise without completely gutting performance. MSAA no longer works with the bulky lighting pipelines that modern games use and SMAA is only a spatial post-process AA filter that cannot handle the temporal aliasing/noise that modern games have. That leaves us with TAA which, yes, we could apply only to the problematic lighting or foliage components of the image and use spatial AA at the end to clean up the rest, but this only half solves the problem as you'd still end up with blurry and smeary lighting, foliage, animated textures, etc. Rather than _everything_ looking blurry in that RDR2 shot at the start of the video you'd instead have all foliage being blurry which is still bad.
a n i m e
n
i
m
e
@@jcm2606 The solution to the problem is SMAAT2X. Not blurry and it fixes temporal aliasing. Great cost to quality ratio. I don't understand why they don't use it more, only a handful of games implemented it.
Ok, whats the point of 4k textures when they're destroyed by TAA?
It's for to waste more VRAM and force you to upgrade GPU. 😅😂
Run a modern title on Series S where some if not all textures cap at 2K mips, and compare it to a PC or Series X and full 4K. This is necessary for lower memory bandwidth, most importantly when a game is frequently streaming assets in and out. It's so much easier to move a 2-4mb 2K texture than it is to move a 25-35mb 4K texture. Esp when a FULLPBR asset at it's most optimized form may have at least 3 textures (albdo with colour information, normal map for bump information, and composite which stores ambient occlusion, roughness and metallic information inthe three RGB greyscale channels to reduce draw call).
The difference between 2K and 4K visuall is night and day, especially if any text element was baked into a texture like artwork instead of a higher res decal sitting on top regardless of TAA.
Source: I'm a first party developer for Microsoft Flight Simulator, and content we produce for Series S gets capped at 2K due to memory constraints.
It's for people like me. Where I disable the TAA and can enjoy a crispy game.
Advertising. Most people can’t tell the difference between 4K and 1080p and I’ve seen people call 720p images 4K.
Who uses TAA in 2024, wtf? It's not 2014. I don't get what is happening in comments under this video, you don't know what DLSS is?
This. This only came up for me now, but I absolutely agree. TAA, or any upscaler - Every time I can play native with good enough FPS, I will.
Thanks for your comment. We've talked before on r/OptimizedGaming / Reddit, happy this hit your algorithm!
I used my consumer psychology degree to get this video as big as possible however I do hope more gaming/tech channels cover it, its the only way the industry will stop heading in this direction is if enough people let them know their dissatisfied or worse even sick (headaches, eye fatigue, etc)
I hope to see more options for everyone going forward, I think having options is not only pro-accessability but is fundamental to the PC platform where these decisions should be the gamers choice.
That is the reason I roll with FXAA most of the time. Clears up some of the jaggy edges but leaves the clarity of the image in-tact
lol fxaa blurs the image but stops at that while taa damages everything, fxaa used to be a joke of an anti aliasing when smaa and msaa rolled and now we be lucky if fxaa is even supported. What a dumb age of gaming
Finally someone said it!
Its annoying as hell to see a new game feeling like its running on lowest settings, even if the settings are kept to high.
I know a lot of people use TAA, but I've always thought it looks terribly blurry and sacrifices so much overall color and saturation just to look like it's adding a blur filter over the screen and hoping it looks good. It may sound silly (because it's also known for blurriness by those who like TAA, in my experience), but I'm a HUGE advocate for FXAA; I feel like it's a good mix between antialiasing and sharpness without losing visual detail and quality, including colors and shininess on reflective objects. Great video
I'm impressed the TAA image on the thumbnail isn't even clickbait at all
it literally is...
@@blaxrader112 8:55
I have to turn off blur, it makes my stomach no feel good when I turn too much and too quick lol. Visual motion sickness? lol.
This is an important topic. Thank you for bringing some much needed attention to it. Hopefully through voices like yours and ours we'll be able to affect some change in modern gaming anti aliasing implementations. As it stands, lately it feels like we're going backwards in terms of visual clarity. Many people feel this without even knowing what TAA is or how it's actively undermining visual clarity.
I had no idea TAA was that bad, never really messed with AA settings before.
Im using MSAA over TAA in any game i can! If not, i rather turn off AA...
@@XenoX_98 Multisampling kinda fixes TAA too. In general I feel like PC gaming is lagging behind consoles which is why TAA is so bad here, many people still use FullHD monitors while current gen consoles are running at 4K or 4K Upscaled, which makes TAA look better.
@@XenoX_98Far Cry 4 was the last Far Cry game with MSAA, i remember when they moved to TAA in Far Cry 5 how awful the power lines looked unless I played on 4K, yeah it’s trash
@@lukkkasz323Not at all, PC gamers play at 1080p but consoles are nowhere near 4K, lots of games use FSR or TAAU from sub-1080p resolutions, some of them even sub-720p (this is not that common tho) like SW:JS that renders at 648p. Consoles are actually getting the worst TAA scenarios, it's just that PC players are more aware of the situation and tend to use DSR when a game looks jagged on their 1080p screen to render the games at higher resolution by lowering other settings, also console players tend to play away from their screen so they don't even notice Anti-Aliasing on games like GTAV (PS4) that was literally a jaggie & blurry mess thanks to PS4's HRAA, RDR2 which is also a blurry mess, even on Xbox One X, and GTAV (PS5) which also looks ultra blurry, especially on the Performance RT, but Fidelity Mode still looks bad. The only game I've actually seen with a good looking TAA is Horizon: Forbidden West.
@@aeon7748Far Cry 5 also offers SMAA, which is the AA method used on consoles, it's not as effective for the removal of jaggies but it's better than nothing, and does a very good job if your PPI is high. I don't honestly find Far Cry 4's MSAA great at all, it does a terrible job compared to GTAV's, not even TXAA does a good job in this game, I find SMAA to be the best looking one once again.
I already knew about TAA being trash in games, it was enabled by default on some games, and I quickly realized by swapping to FXAA that my games were looking way cleaner with a sharper image instead of being blurry.
FXAA is just Fullscreen Anti-aliasing... which means it only works on the pixels shown on screen... not frame-by-frame... it is a very 'quick and dirty' solution, and the industry moved away from it as there wasn't much to be done with it anymore... but it is still useful for comparisons... "Temporal" in this sense refers to frame-by-frame in the Graphics pipeline, if we wanted a way to do FXAA "Temporally" here it would just end up looking like TAA just the same lol... the "Temporal" part is the problem, we could generate A PERFECT 2160p gameplay experience of anything but it might just take TOO LONG hahaha... anyway, in the words of the Protoss from Starcraft, "You Must Construct Additional Pylons" as really there is no way around this except more powerful GCards and/or CPUs
again, just to moan more about TAA x FXAA...
The REASON the Gaming industry switched off from FXAA is because there was no way to imrpove it, and SUDDEN changes on screen meant that FXAA had no way to compete with TAA which could *already process frames before they were shown*
@@ItalicMaze Nope im talking about FXAA. Which is more than enough on a 1440p display.
@@ItalicMaze oh okay
@@Vifnis I always switch to FXAA on any game i install because it always looks better with better performance. As old as it might be, the new TAA and SSAO are just terrible in comparision. I don't think they are good substitutions.
FINALLY i find someone talking about this, i thought it was only me😩😩
I needed this video and the great info it contained. I'm 53 and my eyes are getting older. Blurry graphics is my biggest gaming enemy these days. I decrease the antiasiling and try to use Reshade in most of my games now to get a crisper sharper image.
Piracy | Torrents folder on the hotbar is sick
Standart Bruh, porsting all the games with shittx price politics is key
Pirating*
Thank you for this! I couldn’t tell why my older games looked so much crisper and this is exactly it
Your voice is soothing. I could fall asleep listening to you talk about anything
Wow I learned a lot from this video! I always had some weird problem with my games being blurry but I just thought I was getting older and more nitpicky as my pc specs improved and I wanted higher quality graphics. This was really interesting and helped my understanding of what some of the settings do in the graphics panel that I just judge by eye test before implementing. The sliding images really helped me see the phenomenon in a way I hadn’t before, which was really appreciated.
It’s even worse in VR where the image is right in front of your eyes.
Honestly in VR there are so many things at play that it's hard to point out a single culprit. If you're using something like air link to play a steam game we could be talking about the image being scaled two to three times, plys image reconstruction with spacewarp and so on
Most vr games try to not use taa precisely because of this and use async timewarp instead, which game are you thinking about ???
@@javiergimenez40 Skyrim uses taa
Thank you!
It's been a while since I noticed many VR games in UE4/5 are a blurry mess even with super sampling.
I just tried UE5, if you do a standard project (without lumen enabled) and enable VR, the TAA + some more settings make everything so bad.
If you use their VR template, it's perfectly fine
I play a lot of Skyrim, which was pointed out already, uses TAA. However, a mod came out that allows for using DLAA. It is incredible on how it reduces jaggies without making a completely blurry mess. It requires a sharpener as well, but does a much better job than TAA (though does cost in performance).
Thank you for making this video essay, it was very thorough and balanced. I am an indie game dev who did not know that this technology existed or that there were some problems with it. I will be saving this video for my notes.
I'm actually starting to understand this because I couldn't tell the difference between TAA and no TAA now I do you made a great video and keeping up 👍💯
After all these years I just assumed I had really bad eyesight causing things to just turn into a blurry mess but after seeing that Witcher 3 comparison, it was a night and day difference.
Im almost certain this TAA stuff has been making our eyesight worse for years.
Kinda wild we are just finding this out now.
@@lmAIoneI definitely think so as well. My eyesight has gotten so much worse since TAA was introduced, and I started getting horrible headaches from games whereas I never had issues before. Pure speculation on my part, but if I go back and play older games, I don’t have these issues.
This video has blown my mind. I always use TAA since I've always had a higher end machine and would always set settings to max. Since TAA is always at the end of the list I assumed it was "the best quality" one and would always use it. Now I totally understand what's been making my games so blurry, definitely going to FXAA or SMAA from now on. Great video!
Funny thing you mentioned those guys, FXAA just blurs your whole screen while SMAA blurs just the edges by detecting the edges but at the end of the day it's still applying blur. You can't escape the blurriness of today's modern games, unless you play at 4K with no DLSS because that still applies TAA by default and renders at lower resolution.
@@LazyBoyA1 I don't think FXAA is *supposed* to blur the entire screen. It's supposed to find sharp contrasts, which could indicate an edge, and blur just that. Of course, it doesn't always work as intended.
@@stale2665 You're absolutely right, but have you seen modern games with no anti-aliasing, it's all jaggies! FXAA just blurs the whole screen as a result.
@@stale2665
Dude, FXAA literally stands for Fast ApproXimate AA, it works by blurring everything even subtitles on your game to achieve an approximation of anti aliasing.
@@LazyBoyA1 on a 4k display, the jaggies are so tiny, that you usually won't even notice them.
This video precisely articulates my feelings toward new games. It feels like they just put in a blur filter and that's it. It feels cheap, and you can tell there is less effort put into it.
you can just turn of TAA
Not on console
No you can't, most games nowadays come with that sht forced on or they drag other graphics along with it @@QUANTPAPA
@@TheRhalf can you name some games bro, maybe you can force it away in config file ore using Nvdia side tools
You can mess around with the game files but more times than not crappy things start to surface like constant dithering and extreme sharpening (RE Engine, UE, AC Engine) so even if you have a choice to disable it, games are just built around it so turning it off results in a way worse image @@QUANTPAPA