We have three things to ask: 1. Watch this video with the streaming quality set to 4K, as any lower setting's compression will drastically hide the visual issues we are trying to demonstrate. 2. Please excuse the audio issues; our studio founder Kevin was recently hit by a drunk driver and is still recovering. He did not want to delay the release of this video any further. 3. If you agree with us and want to be a part of the solution, donate here: threatinteractive.wordpress.com/donate *Read the bottom note explaining why we're not actively seeking donations yet*
@@nzeu725 As in the streaming quality, since it's a YouTube compression issue. You'll be able to see the issues on a 720p monitor as long as you have the streaming quality set to 4K.
They want us to buy the newest GeForce and Radeon GPUs. You can't run ray tracing? Buy our new GeForce RTX 666 now for $3000. Then you find out it only runs at 40fps. But wait, you forgot about frame generation! Let's double the framerates!
@@saricubra2867 *"I game on a 4:3 1440p CRT monitor from 2002, what is aliasing?"* Come on, man! I gamed on an FW900 (2304x1440) until late 2014, and then a 22" FP1350X (1920x1440) until mid 2018 when I couldn't find a suitable CRT replacement and bought a 55" 4K OLED. If you think there's essentially no aliasing on a CRT, you either have a really cruddy CRT or really cruddy eyesight, or both! Sure, aliasing may inherently be a _bit_ less annoying on a CRT but it's still aliasing. If it weren't visible at all, it would mean the display is extremely blurry --- which is the subject of this video after all. And why I have always despised FXAA and all of the other screen-space blur filters. I've always used 4x MSAA on CRT no matter how high the resolution. After all, it comes down to _angular_ resolution, and not _display_ resolution --- how large the pixels actually appear to your eye. Which is a function of screen size and viewing distance. I've always had my face right up against my monitors in order to maximize Display FoV and thus maximize immersion. I grant that if you sit quite far from your display, your angular resolution will be higher, and any aliasing will be relatively less annoying. But that's no way to game! P.S. If you're reading this and you are into maximizing immersion, ask me about High FoV Gaming --- it's probably not what you think.
Optimization has gone down the drain. There were 720p 60fps games on the original Xbox and 1080p60 on the PS3, and now, almost two decades later, we are told 540p on the Series S is just fine thanks to upscaling.
Basically, TAA is implemented to mask all the graphics effects that flicker like hell with a soft blur that's almost literally like playing without glasses.
Back in the day I treated FXAA as garbage, but now I prefer it over TAA, as the latter somehow manages to be even more blurry. It's like somebody smeared my monitor with soap.
GTX 750Ti 🤝 R9 380X "Yeah, 4GB of VRAM should be able to run anything!" In all honesty, too many games nowadays ask for WAY too much considering what they visually produce. And since more accessible games mean more sales for publishers, these incredibly demanding games make absolutely no sense to me.
Some history: Killzone on PS4 was the first big game with a temporal reprojection pipeline. They rendered odd/even scanlines and reprojected the previous frame with motion vectors for the missing pixels. This technique allowed 60Hz multiplayer in a game with a 30Hz single-player campaign, which was great for gamers. None of these long smear artifacts existed, as each pixel was fully refreshed every other frame, and there were no noise-based optimizations. The PS4 Pro was the big reason these techniques got popular. It was heavily marketed as a 4K console, and the 4K sticker on the box required a 4K framebuffer. But the PS4 Pro only had a 2x faster GPU for a 4x higher pixel count. Checkerboard rendering was invented as an improved version of Killzone's scanline technique: 50% of pixels refreshed every frame. I'd say modern temporal AA was first used by Ubisoft's For Honor in 2017. It was basically a modified TAA shader that accumulated into a native-res buffer. This solved some of the checkerboard issues, like the sawtooth pattern in occluded areas. UE4.19's TAA was similar to this algorithm. When people got used to 4K rendering with checkerboard and TAAU, they didn't want to spend half of the PS5's GPU gains on disabling these PS4 Pro 4K tricks. Thus temporal reconstruction became the standard way to render 4K on consoles. Meanwhile people had started to use TAA (no upscaling) as their general filter for everything. This was needed because games were moving away from static baked lighting to dynamic solutions. Dynamic solutions were more expensive and lower quality; TAA helped hide the issues. There still isn't enough performance to calculate high-quality fully dynamic GI and reflections without any tricks, and that would be even slower without TAA. If you want dynamic stuff, that's the cost. UE5 forcing dynamic stuff on all users is not great. Not all games need fully dynamic lighting and reflections. TAA also made it possible to add completely new effects.
Such as volumetrics and volumetric clouds. Horizon Forbidden West had gameplay inside volumetric cloud caves, for example. For most games it's not worth spending that many GPU cycles on volumetrics rendering; clouds can be faked just fine using existing techniques if you are not flying inside big cloud caves. Their new volumetrics tech leans heavily on temporal accumulation. Was it worth it for Horizon Forbidden West to have reprojected volumetric cloud rendering tech? That's up to players to decide; it made new gameplay possible. However, if UE5 introduces similar tech and forces it on all users, then we have a problem. Simple, efficient solutions are still useful and don't require temporal accumulation. I just wanted to emphasize that people don't do temporal jitter because they are lazy. It's super difficult to make these algorithms fully real-time; the added GPU cost is huge and you have to compromise. All of this is especially true for ray tracing. You are tracing thin rays, and each gives you very little data. Randomizing the rays randomizes memory access patterns, which hurts caches, so you want to minimize the number of rays you cast. Temporal accumulation is required for acceptable quality. Even offline ray tracers use temporal techniques and denoisers. If we want to go there, we have to accept this. I personally feel that ray tracing is too expensive right now compared to the gains. On an RTX 4090 it's usable due to HW advances and sheer HW power, but still far from efficient. Raster algos (and baking) provide similar results for much cheaper GPU cost.
Baked lighting is great for runtime (if your content is mostly static), but it has other issues. AAA games today have big game worlds and lots of content, and game downloads are already hitting 100GB. Baked lighting takes a lot of storage space; dynamic solutions don't. Baking lightmaps for small game levels was fine, but today's massive BR maps require baking times too long to keep iteration fast for level designers and lighting artists. A dynamic solution allows developers to see lighting changes immediately, so productivity is better. Developers want to patch these big BR maps in production, which would require a new light bake -> massive patch to all users. That's not really feasible for games like Fortnite that are constantly updated. There are a lot of developers waiting for a future where you don't need to bake any lighting: super fast workflows and iteration times. The downside of this future is higher HW requirements for players (temporal techniques make the perf hit more bearable). It's a compromise, but many believe it's a necessary compromise to produce more and more complex and polished content. Team sizes are massive today, and productivity is a real bottleneck.
No joke - a perfect example is the Xbox One X enhanced Xbox 360 games (9x resolution with increased AF, and in some cases improved textures and maybe LODs). Ninja Gaiden II and Gears of War 3 are perfect examples. I highly suggest checking the comparisons.
The fundamental problem here, and the reason the mainstream has adopted TAA as the default for a decade now, is that most effects in modern games happen per pixel, not per vertex. This means MSAA can only be the first step in a long pipeline of ALIASING AVOIDANCE. After MSAA gives you clean geometry, you have to watch out for every new pixel shader effect so it doesn't add noticeable aliasing to the image once more. Yes, over the last decade we're also seeing a lot of noisy temporal sampling, but these techniques were invented because TAA is the standard, not vice versa! For flawless image quality without supersampling you have to PREVENT aliasing at the shader level. By moving some effects back to the vertex shader you can leverage the additional samples MSAA gives you. And then pixel shaders can still be used for effects that create smooth gradients or are generally subtle (low contrast, low frequency). The only alternative I can imagine is sort of the opposite extreme: an engine built around supersampling. You minimize the cost of each pixel so you can render at least 4x, then apply some kind of post-process AA (SSAAx4 is still visibly flawed), then scale back down to the screen resolution. Maybe this could get you *close enough* to 8x quality, but performance will be a problem.
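To make the "engine built around supersampling" idea concrete, here's a minimal numpy sketch of the resolve/downscale step only (not engine code; the function name and shapes are made up for illustration). The box average of each block is what turns hard supersampled edges into smooth coverage values:

```python
import numpy as np

def downsample_ssaa(img, factor=2):
    """Box-filter downscale of a supersampled framebuffer.

    img: (H*factor, W*factor, C) float array rendered at higher res.
    Averaging each factor x factor block is the 'resolve' step of SSAA.
    """
    h, w, c = img.shape
    assert h % factor == 0 and w % factor == 0
    return img.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# A hard black/white vertical edge at 4x supersampling resolves to
# intermediate gray coverage values at native resolution (the AA'd edge).
hi = np.zeros((8, 8, 1))
hi[:, 3:] = 1.0               # edge partway through the first output column
lo = downsample_ssaa(hi, factor=4)   # shape (2, 2, 1), left column ~0.25 gray
```

The real cost problem is of course the `img` buffer itself: 4x supersampling means shading 4x the pixels before this cheap averaging ever runs.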
One root cause of the problem is that everyone uses deferred shading; some weird mindset drives developers to use it. I prefer a forward rendering pipeline. It allows high performance, good MSAA, and no need to recompile shaders, so it is actually simpler. You only need to take care of overdraw by having occluders in the scene, and have some logic to decide which dynamic light sources light which assets, because of that 8-light-source limit per mesh.
@@gruntaxeman3740 Devs largely switched to deferred shading because it scales better with many light sources. With open-world games, and games in general, becoming larger, traditional forward rendering is a problem, and the 8th-gen consoles had a big memory increase, allowing deferred to become the standard.
@@lordanonimmo7699 I'm confident that in most cases that is a huge misstep. The light source limitation is not "8 light sources per scene"; it is "8 light sources in the fixed pipeline". What is required is to assign light-source states to every area in 3D space based on each light source's influence. So in reality we are talking about a max of 8 light sources per mesh, and there is really no issue having plenty of light sources in a scene. It is not only AA that suffers in deferred shading. From a performance point of view, the issue is memory bandwidth. A forward rendering pipeline requires much less memory bandwidth, and that is the bottleneck. It was the bottleneck in 8th-gen consoles and it still is, because memory is shared between the CPU and GPU.
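The per-mesh light assignment described above can be sketched roughly like this (a hypothetical Python illustration; the helper name, light record format, and inverse-square influence metric are my assumptions, not anyone's actual engine code):

```python
def pick_lights_for_mesh(mesh_center, lights, max_lights=8):
    """Classic forward-rendering light selection: rank all scene lights
    by their approximate influence at the mesh and keep only the
    strongest max_lights for that mesh's draw call. Influence here is
    intensity with inverse-square distance falloff."""
    def influence(light):
        x, y, z = light["pos"]
        cx, cy, cz = mesh_center
        d2 = (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
        return light["intensity"] / max(d2, 1e-6)  # avoid divide-by-zero
    ranked = sorted(lights, key=influence, reverse=True)
    return ranked[:max_lights]

# The scene can hold dozens of lights; each mesh still only shades with 8.
scene_lights = [{"pos": (i, 0.0, 0.0), "intensity": 1.0} for i in range(20)]
chosen = pick_lights_for_mesh((0.0, 0.0, 0.0), scene_lights)
```

This is the sense in which the limit is per mesh, not per scene: the selection runs per draw call on the CPU, so the total light count in the level is unconstrained.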
You were right, this really is one of the most important videos about modern game engines. Thanks for putting our feelings into words. I hope you go far. What a colossal tragedy that the vid has so few views...
As an Unreal developer I'd say you said the right things. I wish I could add support for SMAA or optimize TAA/TSR, but these are forgotten topics with no support on the forums, no documentation, and no tutorials. Trying to learn it from scratch is an expensive task that would cost even me, as an indie, thousands of dollars and months in R&D. But why huge AAA studios don't do it, I honestly have no idea.
Simple, because it costs time and money that publishers would rather not spend. And considering how long it takes games to come out these days, I really can't blame them in the slightest. They're practically getting death threats for already spending 5 plus years per title. And from a corporate perspective, what kind of fool spends time and money on something that doesn't immediately make them more money back?
I'm sure there are many papers out on reference implementations for FXAA and SMAA. My first place to look would be ReShade shaders that are commonly available. I'm assuming if you know how to code UE pixel shaders, you can port it over.
Regarding ghosting in TAA, 90% of those cases are improperly implemented TAA, with the motion vectors not being reliable. For example, guns leaving behind ghosting happens because the motion vectors ignore the gun: it is drawn by a separate camera on top of the main camera, which is done to prevent the gun from intersecting the world. Blurriness can also be significantly mitigated by "unjittering" the textures in screen space, which a lot of implementations don't do, so that only the edges of geometry jitter, not textures. Another big thing is using bicubic sampling for the reprojection instead of hardware bilinear, which helps much more with sharpness than a sharpness filter does. Just those things alone bring TAA quality up a significant amount. For some reason the standard implementations don't really do these things, probably because the texture "unjittering" specifically is a little tricky to implement in an engine like Unreal/Unity, since every shader that samples a texture to the screen would need to implement it.
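For readers following along: the resolve step being discussed (blend the reprojected history with the current frame, after rejecting history that no longer matches) can be sketched like this. This is a minimal grayscale numpy illustration, not any engine's actual shader; the function name, alpha value, and the min/max neighborhood clamp are my assumptions about a common TAA formulation:

```python
import numpy as np

def taa_resolve(current, history, alpha=0.1):
    """One TAA resolve step on a grayscale frame (minimal sketch).

    history is assumed already reprojected into the current frame via
    motion vectors. Clamping it to the 3x3 neighborhood min/max of the
    current frame rejects stale samples; this is what limits ghosting
    when motion vectors are wrong or missing (e.g. the separate gun camera).
    """
    h, w = current.shape
    # 3x3 neighborhood min/max of the current frame
    padded = np.pad(current, 1, mode='edge')
    stack = np.stack([padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    nmin, nmax = stack.min(axis=0), stack.max(axis=0)
    clamped = np.clip(history, nmin, nmax)
    # Exponential accumulation: smaller alpha = more smoothing, more blur.
    return alpha * current + (1.0 - alpha) * clamped
```

Note how a history value far outside the current neighborhood (a would-be ghost) gets clamped back before blending, so bad motion vectors cost you sharpness rather than leaving long smears.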
We really appreciate your high-effort comment. There is the argument that these are "improperly implemented", but at the same time you can find hundreds of games, from AAA to indie, that display these issues. And what legitimizes them is the standardized abuse of these flaws in the name of "optimizing" several effects, when that is now clearly an invalid excuse with no significant impact on 9th gen. RE: "hardware bilinear which helps much more with sharpness than using a sharpness filter." Guessing you mean better clarity? You can add a sharpening filter to a game with or without TAA. We strive for clarity, since you can't add a "clarity" filter to either 😉👍 (we'll look into it). We looked into the counter-jitter technique for textures but didn't have too much concern over it, since we are big fans of the sample pattern found in the Decima AA research, which stated that the samples are so close together that texture alteration wasn't necessary. But maybe we should still test it ourselves, considering their standards at the time (releasing with no motion vectors & causing severe ghosting). We don't have an official chat or online group, but you might be interested in joining the r/f***taa Discord server, which has a channel dedicated to developers looking for improvements in this area: discord.gg/W7bWWfF5RW We always like to remind people we are not anti-TAA, only against flawed TAA that can be abused to cover hideous issues with other pipeline steps.
@@ThreatInteractive Regarding bilinear: you can instead use bicubic, another interpolation algorithm, in place of the hardware-default bilinear. It reduces the blurriness significantly; bilinear is by nature very inaccurate and blurry. I completely agree with how TAA gets abused through effects like SSR, SSAO, and SSGI: they are generally poorly implemented, and then they rely on TAA to temporally blur them together, effectively denoising. Even in deferred rendering some people opt to use stochastic transparency and TAA to blend the pixels to implement transparency. And things like TAAU, while the idea is honestly great, are yet another thing abusing TAA. Temporal upscaling is "fake performance", not an optimization. It's a great option to have, but DLSS and similar shouldn't be the standard; they should be an option for those with incredibly old/poor hardware. Optimization in modern games bothers me a lot. Things like Unreal's Nanite shouldn't ever be used for games: it doesn't improve performance, and it hardly even improves quality; it just enables developers to slap in 500-billion-polygon models consuming a billion gigabytes of data. Even things like Lumen: why do we need such a degree of dynamic lighting when lightmaps can look even better, support changing daylight cycles, and are almost free to render?
@@pakumies By default, when you sample a texture the GPU uses bilinear sampling. You can instead write your own bicubic sampler in the shader, which generally produces better results.
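As a concrete illustration of what such a sampler computes: a common bicubic choice is Catmull-Rom, built from four taps per axis weighted like this (Python sketch of the weight math only; an actual shader would fetch four texels per axis and combine them with these weights, often optimized down to fewer bilinear fetches):

```python
def catmull_rom_weights(t):
    """Catmull-Rom weights for the 4 taps around a sample point, where
    t in [0, 1) is the fractional offset past the second tap. Sharper
    than bilinear because the negative lobes preserve edge contrast.
    The four weights always sum to 1."""
    t2, t3 = t * t, t * t * t
    return (
        -0.5 * t3 +       t2 - 0.5 * t,        # tap at offset -1
         1.5 * t3 - 2.5 * t2 + 1.0,            # tap at offset  0
        -1.5 * t3 + 2.0 * t2 + 0.5 * t,        # tap at offset +1
         0.5 * t3 - 0.5 * t2,                  # tap at offset +2
    )
```

At t=0 the weights collapse to (0, 1, 0, 0), i.e. an exact texel hit passes through unfiltered, which is exactly the property that keeps repeated TAA reprojection from smearing detail the way repeated bilinear resampling does.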
@@michaels851 You talk about lightmaps looking better, but I have never seen that before... not in Cyberpunk or any game that offers ray tracing or path tracing.
i think it's also important to note that a lot of the examples of ghosting with taa shown here come from the fact that the velocity vectors for those pixels are not being correctly calculated. if this is done correctly, and unusable samples from previous frames are correctly discarded, then ghosting becomes a much smaller issue. of course some blurriness and artifacts still remain because it's a temporal method (which the video mentions in detail), and i agree that taa is probably not ideal for visual clarity.
Is it even realistic to expect the velocity vectors to be calculated correctly? The examples are extreme, but this effect appears all the time. At first I thought my monitor is shit, but then I noticed that not every game has that ghosting. Personally, I'd take a grainy image over those artifacts any day.
Great video. It's clear that modern 1080p games look way blurrier than 1080p back in the day, and it's factually visible. Game devs are relying too much on upscalers, lighting gimmicks, and temporal solutions to cut the cost of development and cheap out on effects. At this rate, new AAA 4K games will be as clear as any 480p PS2 game.
In former days developers just reduced the output resolution and didn't use any advanced reconstruction at all, leading to most games of the PS3 era looking absolutely terrible. Either people forgot, or are simply too young to have experienced it. We're currently having the best image quality since CRTs went out of style.
@@NeovanGoth It only started to be an issue in the latter half of the Xbox 360 era; the early games were all 720p with MSAA and looked great. And it was more common on multiplatform PS3 games, since devs didn't know how to optimize for the PS3. None of that changes the fact that on PC, players were still playing at 1024x768, 1280x720, 768p, 900p, and 1080p. Games of those times looked a lot sharper than current games. No way are we having the best image quality when games not only look blurry, but have horrible smearing in motion, which is the biggest problem.
@@NeovanGoth Console shit has not been cutting edge on any level post the PS2 days. The PS3-era PC games, that is DX9.0c to DX11, have some of the best art x graphical effects balance in history, with sharp, high-contrast shading, deep parallax maps, high enough polycounts and texture resolutions... etc. Again, it's mostly the console peasants driving themselves into a corner with their "TERAFLOPS! 4K AND 8K GAYMING ON A 799 USD CONSOLE!!" claims that have resulted in a new-found ""need"" to use lazy tricks and hacks to make their poorly designed, poorly optimized garbage run even remotely passably. Hell, take 2015 games like MGSV, Witcher 3, Talos Principle... etc. Custom engines, gorgeous visuals, and all of them run smooth as butter at native 1080p, 60fps, on 1GB GDDR5 GPUs.
@@GugureSux I'm pretty sure that Witcher 3 definitely didn't run in 60 fps on any console back then. I've played it on PS4, and it was ok, but not great. Also "console shit" _never_ was cutting edge, because consoles always have to compromise between performance, and cost. There was perhaps _one_ "no compromises" console in history, the Neo Geo, which was 100% on par with SNK's high end arcade machines, and so expensive that it failed spectacularly in terms of sales. Could it perhaps be that you, like so many people commenting the same phrases over and over, simply don't know a lot about gaming history, technology, and software development in general?
Some constructive feedback : At many points you really pass over points way too fast without actually elaborating what the issue is. At pretty much every point you barely give us, the viewer, any time, without pausing the video, to comprehend what the actual issue is, let alone recognize differences in the shown scenes. It would be better if you narrate what exactly the issue is and let people recognize what you are showing, instead of switching to a completely different scene every 5 seconds (exaggerated). I do realize that it's possible for the viewer to stop the video, rewind, and step through each part manually frame-by-frame to see what the difference is, but having to pause and manually compare two scenes every 3~10 seconds is not a pleasant viewer experience. While I do realize that the video would be significantly longer with you elaborating each scene comparison, it would be tremendously more helpful for the viewer to comprehend and have time to understand what you are talking about and what the problems are that need to be solved, as not every viewer will recognize the differences in the shown scenes (especially regarding ambient occlusion, which is rather subtle).
Video length was definitely a concern. The video bridges a lesser-known connection between two topics: TAA and fake optimization. We are currently considering making YouTube Shorts that slowly explain these snippets, showing why modern versions are so flawed compared to older/faster deferred implementations. It's especially difficult when YouTube defaults to compression quality below the 4K setting.
@@ThreatInteractive Bro, you're going to have to realize video length doesn't matter anymore. You have youtubers that do 8+ hour dissertations on ONE movie. Now, you're talking about something that actually matters, so you have all the time in the world to express your evidence and clearly show each point, drilling the details into the viewers' heads. You gotta realize how important getting every fact into visual detail is, especially for getting people on board with a fork of the UE game engine, let alone getting people to contribute to the project. All in all, I'm on your side on this issue and hope your ideas get it resolved.
Disagree with the comments on this thread; the video was the perfect length. The only improvement would be to point to more in-depth videos where we can learn more.
Fundamentally, there is absolutely nothing wrong with TAA. Pretty much all anti-aliasing methods have a level of jank, because you're attempting to approximate something that is properly done by downscaling a higher resolution to your monitor (SSAA); everything else is merely an approximation of that. Upscalers are not exactly fake optimization; the problem is they just aren't good enough yet to be used as heavily as they are right now. They are a VERY promising technology, especially for older hardware. But games should not RELY on TAA or upscalers, as relying on a post-processing effect for pretty much anything is never a good idea. Your game should be able to stand on its own two feet, and THEN add in things like TAA and upscaling. There are also different levels of TAA: some games' TAA looks like literal motion blur, while others, such as DOOM 2016's TSSAA, are beautiful and produce next to no artifacts, from aliasing or from the anti-aliasing effect itself. TAA CAN be done right; it's just that if you rely on it as a crutch from day one, you are tricking yourself into thinking your game runs better than it actually does. As a mobile dev attempting to bring AAA graphics to older phones like the 6S, TAA is a godsend. Yes it looks blurry, yes there is a lot of artifacting, but the game would not run on the hardware otherwise, and it looks pretty bad without it. However, that is not my main platform; it is merely an option for consumers. I believe every developer who uses TAA and upscalers should put in a warning saying that you CAN run this game on older hardware, but be warned there will be artifacting. But the bottom line is that developers should not rely on TAA; it should only be added in after the fact, and games should DEFINITELY add an option for other methods (FXAA + 1 previous frame actually looks pretty awesome). Personally I hate jaggies, and SMAA or MSAA are just not effective enough.
But FXAA is really quite amazing, especially considering it's basically a free effect.
Nice little trick: using an underscore before and after a word makes it italic, like in _this_ example. And an asterisk before and after a word makes it bold, like in *this* example. It looks a little nicer than all uppercase if you wanna put emphasis on a single word.
True transparency, or single-layer transparency, has many disadvantages: sorting, shader cost, clipping, culling, and so on. Temporal dithering is a simple opacity-mask technique. It uses a small tiling noise texture and multiplies it with a gradient fade. It needs TAA to avoid flickering or visible pixelated dithering. The advantage? Fast, low shader cost, no switching the model to a blend mode.
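The core of that opacity-mask trick is a one-line threshold test. Here's a minimal Python/numpy sketch (uniform random noise stands in for the tiling noise texture mentioned above; in a real shader this would be a per-pixel clip/discard):

```python
import numpy as np

def dither_mask(opacity, noise):
    """Screen-door / dithered transparency: a pixel is drawn fully
    opaque iff the tiled noise value at that pixel is below the surface
    opacity. No sorting or blend state is needed; TAA (or supersampling)
    then averages the binary pattern into apparent translucency."""
    return noise < opacity

rng = np.random.default_rng(0)
noise = rng.random((64, 64))      # stand-in for a tiling noise texture
mask = dither_mask(0.3, noise)    # roughly 30% of pixels pass the test
```

The statistical coverage of the mask matches the opacity, which is why the pattern reads as 30% transparency once something blurs it together, and why it flickers so badly when nothing does.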
@@UlrichThümmler You forgot the downside: it looks absolutely shit. Subtle things that need transparency, like hair, have been completely ruined. Not only do they need TAA to look even remotely acceptable, but even with TAA the hair looks like total crap compared to any game from 10 or sometimes even 15 years ago. Yeah, hair in the old days with its transparency had problems too, such as the totally flawed interaction with depth of field (hair goes out of focus even if the character's head is in focus), but even with these drawbacks the end result was still so absurdly better than the garbage we have today that it's one of the best arguments for why the industry is going down the wrong path: characters' hair.
In a highly stylized game, like Genshin Impact, I think it's appropriate and, IMHO, actually adds to the charm, but in a game going for realism, it's wholly inappropriate.
VR could be a major driving force behind TAA's decline in popularity, especially as the mobile virtual reality market grows. Alyx has a very clean MSAA look.
I don't think so. With VR you can't have fast-moving action-oriented games like Metal Gear Rising or Titanfall, since they would cause motion sickness in the majority of players. Plus the cost of entry will always prevent VR's mass adoption.
Agreed. Because the user's camera (head) is *always* moving, temporal effects can suffer; with something like TAA it turns into a blurry mess quite fast. MSAA is the cleanest, but also sadly super expensive. With most of VR moving to mobile hardware as well (like the Quest 2/3), there needs to be a solid solution for fast anti-aliasing that doesn't look awful.
@@guybrush3000 It's ideal, but in some cases not performant; I believe URP in Unity takes a significant performance hit when using MSAA on Quest.
Oh shit, instant subscribe! Finally someone speaking hard truths about this current BS and making lazy, greedy devs and companies squirm.
"FXAA addresses stair stepping but introduces blurring" A slight correction. FXAA does NOT address stair-stepping and instead just introduces blurring. Also does nothing for temporal aliasing. Which is, all on its own, one of the worst blemishes on 3D graphics... blur + temporal aliasing = vomit. FXAA was the worst thing to ever be introduced to 3D graphics. (For those that don't know. Temporal Aliasing refers to that lovely jaggies-shimmer you see at all resolutions... even 4k.
@@ali32bit42 epic is the entity of all entities at this point and it's partially owned by chinese equity. Things aren't going to get any better. All it takes is them making a few icky changes in their ToS.
You can also say "Gone are the days of building your own library for running a game... You just import libraries with what you need." Yeah, for a good reason: it is becoming more and more expensive to build a proprietary game engine. The effort, cost, and time to update a proprietary engine grow exponentially, while the gains in performance and graphical fidelity are diminishing... If you want more proprietary game engines, you should be prepared to pay $100 per game... Even CDPR is dropping their engine; it is simply far too expensive for them to continue it, and it leads to far too many issues compared to what they really want to do. And look at Starfield. It is so cobbled together that many of the old problems of the earlier titles still plague it to this day. It ended up being terrible for its use case and actually became worse, because it still relies on optimizations and game design from Morrowind, which is from 2002... The loading zones could essentially be hidden and dropped today if they built something new, but stapling that onto their already improvised engine was likely to make it too unstable. There isn't really anything "amazing" about proprietary engines, other than the studio having to add another 2-3 years onto the development time. Quite a few studios go bankrupt or have to severely compromise to survive, because it is hella expensive. Most studios that built proprietary engines throughout history did it with the intent to sell them on... Serious Sam only exists because it is quite literally a tech demo for the Serious Engine. Luckily for them, Serious Sam was a success, because the engine didn't really sell... They also retired their engine... because again: exponential effort, cost, and time... diminishing gains...
I'm a Quest standalone developer and the main technical artist for our company, and this video makes me feel pretty annoyed with other studios abusing these effects. What was supposed to be used for good was mishandled. The advantage of TAA comes with supersampling the image; other than that, it is detrimental, like smearing Vaseline over the screen. We are going from lossless techniques to lossy ones, and it is very sad. I hope Unreal Engine can improve the future of games, but not like this: games coming out wasting more electricity than they should and gatekeeping players with low-end hardware.
Sorry to hear about Kevin, I hope he recovers and is doing well. Great video, btw. My guess is that game developers are in cahoots with NVIDIA and other hardware manufacturers that push for the implementation of these features in videogames to justify the insane prices of recent GPUs. I've had this suspicion for years and haven't been able to prove it, but every time I look at games from previous generations I cannot deny that modern games pale in comparison. And to be frank, I'd rather have a game with more stylized graphics that runs well than one with ultra-photorealistic graphics that runs like garbage anyway. This industry is sick to the core.
Incredibly well done video, man. This is gonna be a great channel; absolutely subbed. Good luck with the game!! I would also like to add that devs also rely on sample-and-hold motion blur to hide the noise in motion. Games are already blurrier than they should be because of modern LCDs.
Thank you for bringing attention to this. I feel like my investment in a 4K display was foiled by developers taking shortcuts. TAA looks like shit, even at native 4K. It's visibly blurry, totally unacceptable. Older games legitimately look better.
Yes, let's dismiss everything he said, and imo proved, about TAA destroying motion clarity and the benefits of SMAA (despite its limitations) or FXAA + a previous frame over TAA, just to focus on a single SMAA issue...
@@Argoon1981 Oh no, I do agree that TAA looks bad at a low resolution, low framerate or just with a bad implementation in general. It's just funny how this is the second video I've seen by this dude where he uses SMAA as an example of good AA, then shows a frame that looks worse than even FXAA
@@michaels851 There's always a way to optimize everything. You'll never find a solution to problems if you don't try stuff out. Just experiment. The only limitation is if the engine itself allows you to change and experiment with your own technologies. That's why I always prefer my own engine, there's no one telling me what I can and can't do.
I'm throwing in some support and I look forward to seeing what this engine you propose will look and work like. Speaking the truth in these modern times is a revolutionary act!
THANK YOU, I JUST CAN'T THANK YOU ENOUGH. I legit thought my mind was playing games with me, but no, it's actually that bad. All those marketing materials got me bad....
Amazing video. I have been playing around in the caustics branch of Unreal for ray traced caustics and DXGI, and the feature set seems a lot more responsive than Epic's native build. I hope to see this project progress further, as I too am sick of blurry games in motion. Renders look amazing as stills and slow-moving images, but motion just makes everything a blurry mess which only VRR and DLSS + Ray Reconstruction can resolve.
1:30 the moment I first played Timesplitters 1 on PS2 in like 2002, I was like "ah, so a solid 60fps with no stutter, ever, is going to be the new standard from now on and gamers will NEVER accept anything less." And yet here we are.
Console Gamers - "Ah, remember the days when the mud we had to drink didn't even have any dead bugs floating in it?" PC Gamers - "...No? Why were you drinking mud?"
I actually like upscalers in how they can help older systems, BUT I feel like developers are abusing them by basically making them mandatory to get decent performance in modern games rather than doing actual optimization. Upscaling is NOT a devtool (as claimed by some). It's an OPTION for end users and should never be mandatory to get your game to run decently.
This is BY FAR the most extensive and well put together documentation of these problems. Thanks so much for making this. It's the perfect video to share around regarding the issues that plague modern AAA game visuals.
Once again, love your content and I resonate with what you're trying to convey. I see how UE5 can be a problem; their Radiance Caching for real-time GI presentation at SIGGRAPH together with this video are very eye-opening in that regard. In principle, I've been complaining about very similar things under DF videos since at least 2022; it was around the launch of TLOUP1 and DLSS3 where the guys there really started to nonsensically advocate and push for RT in games: they wanted, demanded, 4K together with RT, not just on PC, but on consoles! How can these "experienced" professionals be lacking this much awareness? Going back to the main point: where would you put Ray Tracing in all this? Could we say that the sudden and premature push for Ray Tracing has had a huge role in the way things developed, further exacerbating the situation? And related to that, what kind of role did nVidia have in all this (all things RTX), in your opinion? I was very surprised by the ending of this video, where you advocate for nVidia's much faster DDGI: wouldn't that still be subject to the same issues that we have with modern "high quality" dynamic GI techniques, Lumen included, since they still need temporal accumulation in order to work/be viable when it comes to performance? I can understand that, if we will have to bear shitty IQ due to temporal reuse, the least we can do is make it faster and smarter in order to mitigate its downsides, but still. I am just trying to understand, thanks!
Keep fighting the good fight! I must add, The Finals may perform well, but the amount of dithering and ghosting is still massively unacceptable. I recently saw a video on r/FuckTAA where they went through just normal gameplay and highlighted the crazy amount of ghosting the game has. I truly feel UE5 is the worst thing to happen to gaming when it comes to performance and visual clarity. Like you mentioned, it is so easily available, why wouldn't a company use it. I have been playing Gray Zone Warfare recently and that game looks like shit, and has terrible performance. DLSS and FG are a necessity, but they are trying to upscale a mess of pixels, so it is impossible to see anything clearly. Even with upscaling off (kinda...?? I'm not sure if that's even really an option, just things like FSR AA and TSR) it looks horrible. They don't even have a render resolution slider, just some weird one (can't remember what it's called) that defaults to 62. The smoothness is off too. I will play an older game at 80-100fps and it feels smoother than GZW at the same fps... How the hell did we come to having games that look worse AND perform worse?? It kills me seeing comments from people saying shit like "my game runs perfect. 60 fps on my 4080 with high settings. It must be your PC"... I feel like I'm going crazy! 60 fps is NOT ok for a PC I spent $2,000 on!! Developers are putting upscaling in the system requirements now, FFS... I feel like this is exactly what is happening when you say they are trying to make this the new standard and to be expected, but it's all BS. I can't even be excited for new AAA games anymore. They lack visual clarity and run like shit. We have taken serious steps backwards and I don't think it's gonna get better, only worse.
This is neat, but why base off of unreal when you can push these features to open source engines like Bevy, Godot, or O3D? I figure you must have experience in unreal that would make implementing features like this easier than learning those other engines. Would you consider making reference designs of your effects available separate from your unreal code, so that those of us without unreal licenses can implement your effects in other engines?
@@vegitoblue2187 That's a good reason, but compromising for existing projects doesn't strike me as the goal to strive for in this case; the chances of existing projects spending a bunch of time and money adding these improvements to their existing products seem unlikely to me. But I could see more devs using it for new Unreal projects than would be willing to learn an entirely new engine for graphics improvements.
Unreal is more scalable than any of those engines. O3DE is the only exception, because it's based on CryEngine 3.8/Lumberyard-derived code, and probably doesn't have the TAA plague to the extent modern Unreal has.
The TAA mod for ALIEN: Isolation known as ALIAS: Isolation is AWESOME, that's for sure! It's one of the best implementations of TAA in gaming EVER! My respects to the modder who made it!
"Modern" blurry games that run at ~45fps@medium on hardware that runs Cyberpunk at 60+ fps @high is just not acceptable as "a good game" no matter what 4090 users say.
And people would call me crazy and say I was bitching at nothing when I complained as games started using TAA years ago. I am in no way a graphics engineer, just a guy who has played PC games since 2000. I just could never get used to or stand TAA, and would always switch it off and play with no AA, or use supersampling when my hardware could handle it, up until games started looking horrible without TAA even at 200% SS. Games just look bad now. We pay so much for hardware that is extremely powerful, and games still look and run like dogshit. It's really bad that I can't get a decent 1080p image (as far as 1080p goes) with a 3080, when 10 years ago you could get a decent image without artifacts on a GTX 960.
Thank you and kudos for bringing light to this issue. I hope this video goes viral. Modern graphics have very obviously gotten better, but TAA brings blur into the equation so you can't even see those graphics. It's like taking two steps forward but three steps back. I always wondered why games like Warhammer 3, Cyberpunk, and Witcher 3 looked blurry when I built my new PC, even with motion blur turned off. BRING BACK REAL ANTI-ALIASING! BRING BACK MSAA!
MSAA only anti-aliases a small part of the image and leaves the rest untouched, not to mention the big performance hit. And like he said, many of the lighting and shading effects modern games use rely on temporal anti-aliasing/denoising.
MSAA isn't really viable anymore. It only applies to geometric edges, so shader/post-processing effects will be completely unaffected, not to mention aliasing in surface detail like specular shimmering. There is also the massive performance hit that MSAA incurs. My favourite AA at the moment is SMAA, but barely any games implement it. It suffers from some of the drawbacks of MSAA but is pretty much free in terms of performance and doesn't exhibit excessive blur like TAA.
It also fails to handle HDR with the default MSAA resolve/tonemap to framebuffer. When the dynamic range is large enough, the samples may fall completely out of the tone-mapped brightness range. This is quite visible when a dark object is in front of a bright object (i.e. the sun). To get this to work better, one has to fix it manually.
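A small numeric sketch of the resolve-order problem described above (illustrative only; the tone map and sample values are my own assumptions, not any engine's actual pipeline). Averaging raw HDR samples and then tone mapping is not the same as tone mapping each sample before averaging, which is why edge pixels between a dark object and the sun resolve incorrectly with the default resolve:

```python
def reinhard(x):
    """Simple Reinhard tone map: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

# Hypothetical 4x MSAA samples on an edge pixel:
# two from a dark object, two from a very bright one (the sun).
samples = [0.05, 0.05, 500.0, 500.0]

# Default resolve: average linear HDR values, then tone map.
naive = reinhard(sum(samples) / len(samples))

# Manual fix: tone map each sample, then average.
correct = sum(reinhard(s) for s in samples) / len(samples)

print(round(naive, 3))    # edge pixel is nearly pure white
print(round(correct, 3))  # a plausible mix of dark and bright
```

The dark samples are numerically swamped by the bright ones before tone mapping, so the naive resolve loses the dark edge entirely, which matches the dark-object-in-front-of-sun artifact the comment describes.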
Omfg this video makes me EMOTIONAL 😂. FINALLY my anxiety, hatred, confusion, etc. have been put forth and presented all in one. I've been saying this for years in arguably way worse and less informed ways, but having this video to lay out the entire problem, the solutions, etc. is amazing. God's work! Let this be seen!
This is an incredibly important video for ANYONE who cares about the state of gaming to watch, and I'm deeply saddened by the fact the algorithm hasn't spread this around way more homepages already. I can only hope this video blows up belatedly :D Very well explained with the visual examples provided; as somebody who is an avid gamer but is familiar with the technical aspect of games in only an 'intuitive' way, I found it easy to follow along with the line of thought. It's incredible that as a life-long PC gamer I can confidently say I would NOT have built my first PC in 2020 had I known about the trends to come. I don't even feel like I would be missing out on anything with inferior hardware; I couldn't run Cyberpunk on launch above 60 fps WITH DLSS, Elden Ring was (still probably is) locked at 60 fps, ray tracing barely shows up in games and when it does it's not usable unless you have THE most expensive card (I have a 3070 and don't even run RT on Control because the performance hit is just so noticeable), DLSS is cool but, like illustrated by this video, it's a band-aid. I got Helldivers 2 and I went from running the game at 100+ fps in April to now getting 60 fps lmao. It's just a joke, and sadly it's a pattern I am noticing in many other aspects of culture as well..
I felt like I was going CRAZY. Even in Fortnite, which became Unreal Engine's flagship, you just can't escape that blur. Just use a car and watch under the wheels: there's always ghosting. Even with TAA disabled or whatever, I can't manage to get rid of it. So many games where there's that blur and somehow... everyone seems fine with it?? Every time I mentioned it, it was brushed off as if I was just nitpicking. But this looks just horrible. Thank God this video just came in so I can rest easy, knowing that I'm not crazy.
Quite the opposite. I'm pretty sure most people who like to complain about modern anti-aliasing and reconstruction techniques never connected a PS3 to a 1080p display, otherwise they'd know how terrible most games looked back then. And yes, even on the PC, because literally nobody could run Crysis smoothly at whatever native resolution common LCDs had back then. I guess most people here are simply too young to have even tried it.
@@NeovanGoth Emm, no. I always played mainly on PC, always at native res. Even when I was a child, the difference between native and even one resolution below was outstanding.
What is wrong with the Digital Foundry video? Just watched it again and it is super accurate. He also mentions downfalls with TAA and how he thinks you should always have alternatives such as super sampling.
A lot of these people don’t like DF and think they’re shills just because they don’t share their primal hatred for TAA lol. Then you also have fanboys that think DF is biased against their favorite brand (Xbox vs PS, Nvidia vs AMD) even though there’s nothing I’ve seen from them that indicates they arbitrarily favor any given platform. If they favor one in a comparison it’s because it provided the best quality and performance, not because they wanna “own the Xbots, Sony Ponies, AMDumbs, Nvidiots, etc…” Their lack of bias should be evident by the fact that every single time they do anything there’s a different group in the comments claiming they were wronged by DF’s comparison or tech dive.
It's defending a broken mess, they use the typical shill tactic of using neutral sounding language to mask their bias. It only works if you're stupid btw.
@@Brawltendo Oh they absolutely shill for nVidia, Richard is way more tepid about it lately, but Battaglia is a massive fanboy. It took me a while to open my eyes, as I myself was a fan for long time, until I started to see things going against basic common sense, and nVidia marketing being shoved into analyses. From that point on, I've been way more critical towards the stuff they say. A couple of examples: do you recall Battaglia having a video talking about and comparing DLSS FG vs FSR FG? No? Ask yourself why. Or more recently: since a few videos now, he keeps selling nVidia's Ray Reconstruction at every chance he gets, that's because he knows that the upcoming battle will be about denoisers, with RR pretty much being the only tech left setting nVidia apart. He did that in his Megalights "explanation" (lmao) video, and in his latest Silent Hill 2 Remake video. Don't get me wrong, I still enjoy all of his videos, his delta time investigation in SH2 was brilliant, still, the guy got a platform and he's making full use of it, for good and bad. Hopefully he doesn't push his own luck too far, it takes long time to build a reputation, and very little time to destroy it. With me he definitely lost a few points already, but hey, I am nobody, so yea.
@@ctrlectrld "do you recall Battaglia having a video talking about and comparing DLSS FG vs FSR FG? No? Ask yourself why" Because Rich already did it. Alex afaik already talked about FSR FG positively in forums if it was implemented well in a certain game. "with RR pretty much being the only tech left setting nVidia apart" What are you talking about? DLSS is also still unrivaled from a quality standpoint, by far actually. "since a few videos now, he keeps selling nVidia's Ray Reconstruction at every chance he gets" That's completely plausible in a video about very obvious standard denoiser artifacts like in SH2. Your feelings are massively deceiving you.
Nice informative video, I had the impression that TAA had become the computer graphics equivalent of generating electricity by boiling water to make the steam spin a turbine: too damn common and not efficient.
How is it that whenever something derails too much, someone always comes along to rectify it — never the parties involved, but someone always comes along! I knew something was off with the new games, but I don't know enough to put it into words. God bless, hopefully you can help.
The worst part of UE5 isn't even TAA. It's the nasty ass blurriness of reflections. Anything shiny looks like it's being perpetually spray-painted with an undulating haze. There is no clarity anymore. So ridiculous.
It makes me upset that there are these systems forced upon developers that don’t need them most of the time, shouldn’t need them at all. I think a lot about indie devs that have been enthralled by the ease of development in UE5 because of its robust Visual Scripting systems, but their games in the engine don’t have super detailed models and the games look weird in UE’s light systems. Unity as well. I remember back at the start of Unity and in UE3, these engines prided themselves on making their engines scalable from the highest end PCs of the time to being able to run in a web browser. I miss those days. I miss them a lot.
While I'm sure all this is true, you highlighted the key problem here - cost. If developers can do things inefficiently but get the game released quicker, particularly the bigger publishers will force them to do it. UE5 seems to be hugely focused on development speed, not efficiency.
As an indie developer currently utilizing Unreal Engine 5 in my project's development: your words and evidence alone are damning. I use this engine on a semi-daily basis, and I hate how true this all is. Blatant disregard for traditional rendering techniques that were in use years ago, only to be thrown to the wayside in favor of noisy, badly designed TAA-dependent effects. I used Unreal Engine back when it was on 4.24; it was a breath of fresh air when we didn't have to deal with any of this crap. But now, here we are. Where has it all gone? Just to save a quick buck? If you do manage to get a custom engine branch going, I am all in; I want to use that branch over Epic's inherently flawed branch of their engine. I wish you the best of luck in securing funding and investment. And oh yeah, I am less than pleased about Epic Games' behavior and stance on all of this. Instead of pushing the industry to do better, they want to do things in a way that costs more time to process on next-gen consoles and PCs alike. It's unacceptable, having to find undocumented loopholes to fix the issues with this engine. And seeing everybody ditching proprietary engines over this? Why? Just why? Also, can we like, stop adopting TAA, like... for good? And not use TAA-dependent effects at all?
Back in the late 90's early 2000's I dreamt of the great battle between John Carmack and Tim Sweeney culminating in amazing engine choices and freedom. The world needs Carmack now more than ever!
@@gstreetgames2530 I think any competitor to Unreal would be great now, because back then its competition with id Tech was what drove the engines to innovate and become more optimised and accessible. I think Unity in the early days was also a big factor in making Unreal more accessible, especially for indie devs. But sadly Unity is fumbling now and Godot is still getting on its feet, so I'm hopeful that there'll eventually be a worthy game engine to compete with Unreal in order to make it good again.
Have you guys considered putting your efforts towards improving Godot engine? An open source project like that could use people with your talent and knowledge
Yeah, I think that the effort might be more useful to the industry if put towards a truly open engine like Godot. On the other hand - Godot is not targeted at AAA studios, and indie games have a bit different needs. On yet another hand - Godot is open and modular, so it wouldn't be that crazy to maintain a fork.
@@sowa705 What do you mean? It's a FOSS framework. You can do anything with it. Seen the Road To Vostok game project, that's being made by one guy in his free time? Yeah, you can do anything with the engine.
Because modern programmers can't actually program. There's an entire generation of programmers now that believe that putting their website inside a web wrapper is "coding a desktop application". See Discord, Spotify, Teams, etc. It's all just crap running in Chromium requiring 1GB RAM just to render text on your screen. Discord literally takes longer to launch on a 5800X3D with 32GB RAM and an Nvme SSD than MSN Messenger on a Core 2 Duo from 2007 on spinning rust. I wouldn't be at all surprised to walk into a game dev studio today and see an entire room of game devs using Blueprints with one or two actual programmers with C++ knowledge desperately trying to optimise everything by themselves.
Wow, thank you for the explanation! This whole situation just hit me full blast with one of my favorite game franchises: Codemasters' rally sims like Dirt Rally and recently EA WRC. Since Dirt Rally 1 I have been playing these games in VR exclusively. In VR all these temporal solutions that generate frames are pretty much useless. The blur makes you feel like you need glasses, and ghosting is super obvious. I used to play Dirt Rally 2 (EGO engine) with my VR headset (Quest 2) 4 years ago on a 2080 Ti with mid-high settings and 4x MSAA at 90Hz. Not an average of 90fps: there were no dips below 90fps. A while ago the same studio released EA WRC, this time using Unreal Engine 4. The game looks the same as Dirt Rally 2 (2020) at best, but often worse. It has only TAA, which cannot be turned off in game, and it runs soooo bad. I have a 4080S. I barely reach 90fps with low settings and reduced resolution, and it will stutter and dip below 90fps no matter what. But the absolute worst thing is the horrible TAA that cannot be turned off... It's simply not playable in VR, and the whole community just wishes they would have made new stages/cars for Dirt Rally 2.0 instead. Weird times^^
Funny you say this, because I've seen Digital Foundry question why Nintendo never uses any anti-aliasing in their games, and I wonder if Nintendo understands this and that's why they don't.
Nintendo may be a-holes regarding copyright on their own franchises, but at the end of the day, they are probably the least scammy actor among the « titans » of the game industry.
easy sub, this is so important to me. I've been complaining for years that there's no engine version or engine setting that concentrates on higher performance and sharper image quality.
Very informative stuff. I’ve played a lot of games that look like hell and never thought twice about it but honestly if I’m paying 60 bucks for a game and however much more for DLC I want to pay for higher quality. Thanks for the info.
Not only are you paying 60 and more recently 70 bucks +DLC, but when people are purchasing powerful hardware such as 9th gen consoles or modern GPUs around the $300 range, people deserve way better results considering the standard set on far lower tier hardware.
It's crazy how low priority it seems to be in 2024 for games to have a remotely clean/crisp final image. Boot up an older game that predates TAA and run it at native 4k and just have your mind blown at how clean an image CAN look. In the race to push new features we've sacrificed so much. With that said, I do believe that Ray Tracing and these other features are still in their infancy and may be another generation or even 2 before fully RT enabled games are performant enough to where we can afford to boost internal resolutions and not rely EXCLUSIVELY on upscaling. Also despite ALLLL of their issues, the quality of the upscalers is going to rapidly improve
I spent quite a while trying to figure out why new games, despite higher texture resolution, still looked worse than Crysis 3 to me, even with all these fancy new lighting technologies. Finally realized it was the inclusion of TAA and lack of MSAA.
MSAA doesn't work well with modern games even without the TAA jitter. The reason stems from pixel shaders. MSAA is expensive and looks bad today. There's a reason why the last major game to use MSAA was Forza Horizon 5 and even then it didn't look great and misses many edges. FH5 didn't have TAA at launch either
@@crestofhonor2349 FH5 looks far better than Motorsport. That's just plain bullshit. MSAA is the best AA besides SSAA which is extremely demanding. SMAA can supplement it nicely and that combo is 10000x better looking than TAA+FXAA. No. It's pure laziness. They are using 2012 rendering tech underneath all that fancy lighting. Deferred rendering is obsolete in the face of Clustered Forward.
@@riston7264 Did you read? That's why I said it can be supplemented nicely by SMAA if you like the inner textures to be antialiased. And many of us don't, by the way.
I didn't know how TAA worked, and it's literally been the bane of every attempt I've made to develop post-process shaders. Thank you. Also, for a moment I thought you were a MetaHuman plus some magical settings Epic didn't tell me about.
As the video expected Wukong's performance was horrible. Constant FPS drop and various input lag issues. Yes the game sold a lot but as a general tech guy playing Wukong was painful.
Not to mention it destroys AMD GPUs by default, even without RT. My RTX 2050 laptop gets the same result as a desktop 6500 XT; the latter is 30% faster, btw. I'm not a tech guy, but I do know current game devs make the game calculate graphical effects for real rather than using visual tricks. Can we go back to when games were full of visual tricks to make the game look better at a lighter performance cost? Because game devs now for sure don't even bother with that.
@@3dcomrade We will never go back to the good old times because companies keep looking for cost cutting, and investing in an in-house game engine costs too much money. In addition, skilled game engine developers are very expensive. I tried to study game engine development and it's no joke at all.
This video could've been 3 hours long and I would've watched the whole thing. I hope you continue making videos on this subject; I feel like you barely scratched the surface.
Well, I'm convinced. It's a very interesting issue. But I can't see your solution getting much traction. Not because it's a bad idea, but because of the scale and scope of dealing with Unreal Engine. Forks come and go, and most people don't know they exist or that they even should use them. Best case scenario: you get hired by Epic and can make waves there to add appropriate alternative configurations. Worst case: keep making videos like this and they will get seen by the Epic devs. I hope you can get good funding, and even better devs. That won't be an easy feat. I think if you can hammer home the point with even more videos like this one, Epic's devs won't be able to ignore these issues. All devs love a good "stop making it worse" video about optimization.
I'm happy people with great technical understanding and skillsets are also upset about this issue. Almost every use case of temporal effects in new games give me motion sickness due to blurry appearances or ghosting, and even cranking the settings down to lowest don't solve the performance issues. One game I played recently performed worse on lowest settings without DLSS, compared to max settings with DLSS. Things become exponentially worse when it's a game labeling itself as a competitive FPS, which should prioritize visual clarity above all.
What I find very weird about all this is that Epic Games never used to be this bad when it came to managing their Unreal Engine. Back in the old days from Unreal 1 up to 4, they were managing the engine very well, as well as innovating it a lot with new features and optimisations, they also had good games to showcase it like Unreal Tournament, Gears of War, Infinity Blade, etc. Nowadays it seems that Epic doesn't care about Unreal Engine as much as they used to anymore and instead care more about Fortnite and making it look more presentable and profitable
Thank you very much for all of this work. This is really impactful for the community and the future of games. I hope more people get away from the mainstream echo chambers claiming that Nanite and Lumen are great and the future. I was wondering if all those issues I found while playing around with settings in my personal projects were my fault for setting things up incorrectly, but it turns out it just is as it is.
In a society that only values profit at the expense of all other motives, doing things optimally is never the end result especially when it comes to art. Games are built lazily and cheaply not because of incompetence, but because shareholders want a return on their investment and demand that profit be the only motivation so they force management to uphold that mandate. Welcome to the inevitable result of our capitalist society that caters everything around the wishes of the ultra wealthy.
I will say this: injecting AMD CAS with ReShade will clear up a lot of the TAA blur that happens to textures. I now do it in almost every game I play, even with DLSS. I do generally agree. UE4 and 5 look nice but don't play nice.
I'm a gamedev using UE5; this video is quite interesting. I wouldn't call devs lazy though. I think the business part of gamedev creates certain dynamics and pressures. But also, as you've noted, engine companies have different motivations and value TAA differently.
We wanted to put out a first video explaining why, then we planned on doing a second video detailing our funding goal and how we intend to use it. But since people are already asking, we are setting up a funding site now.
@@ThreatInteractive So if you did raise $900k for this, and you successfully modified the engine, then what? Let's say you modified the UE5.4 branch successfully, but then newer versions come out with more features; your improvements would only be for that version. You'd have to raise funding all over again just to update?
@@Hybred When we successfully modify the engine we will use it for our game & make it available for other developers (there are already many who want something like this). Take Epic Games with Fortnite (income of well over 3 billion dollars): their new partners (CD Projekt Red) have abused flawed TAA just as much & barely improved the engine. They have had years to prove their worth & the vast number of games that have used their engines have suffered for it. portal.productboard.com/epicgames/1-unreal-engine-public-roadmap/tabs/46-unreal-engine-5-0 This is the link to their UE5 roadmap. Over the years we tested all the "improvements" & found 90% had barely any performance gains or depended on bad TAA. They have admitted to their desire to phase out traditional rendering for Nanite only & recently, in forums.unrealengine.com/t/lumen-gi-and-reflections-feedback-thread/501108/1727, the Lumen developers stated they will not support traditional optimization because "they use Nanite". This forceful & arrogant behavior is why developers need to form independence from Epic Games. Even if the improvements paid for by crowdfunding don't get integrated into UE5-main (we're not just fixing AA, we're going to be proposing other effects like stable GI), say we had done this with 4.27: there would only be so much we would bring over from newer versions. If by some off chance they do improve newer versions of UE5 with new game-related technology (like something seen with the motion matching tech), that is something we'll consider updating our version to include. We would not have to raise money to integrate our improvements into newer engines since, by the time Epic adds anything useful, we'll be independent of funding through our own game (assuming our game is successful).
I don't even hate TAA, but I almost never see developers do the most fundamental basics of the technique correctly, like masking out absolutely everything that is moving or animated. You don't want to accumulate those; it's what gives you the awful ghosting artefacts. And I hate the way modern games do hair and transparency... it's an ultimatum between a) everything is blurry or b) everyone has leukaemia and is in chemo.
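The masking idea in the comment above can be sketched for a single pixel (a minimal illustration under my own assumptions, not any shipping TAA implementation): blend the current frame into an exponential history buffer, but reject the history wherever the pixel's screen-space motion is too large, instead of blindly accumulating and producing ghost trails.

```python
def taa_resolve(current, history, velocity, blend=0.1, max_motion=2.0):
    """Resolve one pixel of a toy TAA.

    current / history: this frame's color and the accumulated color (floats).
    velocity: screen-space motion for this pixel, in pixels per frame.
    blend: weight of the current frame in the exponential history.
    max_motion: motion threshold above which history is discarded.
    """
    if abs(velocity) > max_motion:
        # Masked out: fast-moving pixel, drop the stale history
        # so it cannot smear into a ghost trail.
        return current
    # Standard exponential accumulation for (near-)static pixels.
    return blend * current + (1.0 - blend) * history

static = taa_resolve(current=1.0, history=0.0, velocity=0.0)
moving = taa_resolve(current=1.0, history=0.0, velocity=5.0)
print(static)  # slowly converges toward the new value over many frames
print(moving)  # history rejected outright: no ghosting on fast motion
```

Real implementations use per-pixel motion vectors, neighborhood color clamping, and softer rejection than this hard threshold, but the trade-off is the same one the comment points at: accumulate for stability where things are still, reject where they move.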
TAA is the reason why I need glasses nowadays, since after Control released, every single stupid game has had it enabled by default. And the worst part? It's tied to the EFFECTS slider. And you know what AA and AAA games NEED TO HAVE to look REAL? FREAKING AO. And they bundle BOTH together; you CANNOT decide between ONE or the OTHER, it's BOTH OR NONE!!! It's so absurdly dogshit, especially for people who don't have rigs that can even handle the game at 1080p....
We have three things to ask:
1. Watch this video in 4K in the streaming settings, as the compression at any lower quality will drastically hide the visual issues we are trying to demonstrate.
2. Please excuse the audio issues as our studio founder Kevin was recently hit by a drunk driver and is still currently recovering.
He did not want to delay the release of this video any further.
3. If you agree with us and want to be a part of the solution donate here: threatinteractive.wordpress.com/donate
*Read the bottom note explaining why we're not actively seeking donations yet*
Very few people can view it on a 4K monitor though
@@nzeu725 as in streaming quality since it's a youtube compression issue. You'll be able to view the issues on 720p monitors as long as you have the streaming setting set to 4k.
@@ThreatInteractive oh ok
CDPR is actually rewriting the core of Unreal. I know because I have contacts there.
Best wishes to Kevin, I hope he does alright.
When you have to run 8K resolution for a game to look as good as old games did on 1440p, you know something is very wrong.
LITERALLY
old ???
They want us to buy the newest GeForce and Radeon GPUs. You can't run ray tracing? Buy our new GeForce RTX 666 now for $3000. Then you find out it only runs at 40fps. But wait, you forgot about frame generation! Let's double the framerates!
I game on a 4:3 1440p CRT monitor from 2002, what is aliasing?
@@saricubra2867 *"I game on a 4:3 1440p CRT monitor from 2002, what is aliasing?"*
Come on, man! I gamed on an FW900 (2304x1440) until late 2014, and then a 22" FP1350X (1920x1440) until mid 2018 when I couldn't find a suitable CRT replacement and bought a 55" 4K OLED.
If you think there's essentially no aliasing on a CRT, you either have a really cruddy CRT or really cruddy eyesight, or both! Sure, aliasing may inherently be a _bit_ less annoying on a CRT but it's still aliasing. If it weren't visible at all, it would mean the display is extremely blurry --- which is the subject of this video after all. And why I have always despised FXAA and all of the other screen-space blur filters.
I've always used 4x MSAA on CRT no matter how high the resolution. After all, it comes down to _angular_ resolution, and not _display_ resolution --- how large the pixels actually appear to your eye. Which is a function of screen size and viewing distance. I've always had my face right up against my monitors in order to maximize Display FoV and thus maximize immersion.
I grant that if you sit quite far from your display, your angular resolution will be higher, and any aliasing will be relatively less annoying. But that's no way to game!
P.S. If you're reading this and you are into maximizing immersion, ask me about High FoV Gaming --- it's probably not what you think.
Optimization has gone down the drain. There were 720p 60fps games on the original Xbox and 1080p60 on the PS3, and now almost two decades later we are told 540p on series S is just fine thanks to upscaling.
0:00 Intro
0:26 Creativity
1:26 Performance
2:30 TAA On Issues (Visuals)
6:10 TAA Off Issues (Visuals)
7:28 TAA "Optimizations" (Visuals)
9:05 DLAA & TSR (Visuals)
11:25 TAA Design Requisites
11:41 Epic Games & UE5 Problems
19:09 Threat Interactive Info
Basically TAA is implemented to mask all the graphics effects flickering like hell, with a soft blur that's almost literally like playing without glasses
what you described is fxaa, taa is a tad bit more complex than that
yes, they basically "Blur it until no jaggies = better".
@@eon96 Idk man, TAA on Monster Hunter World gives me an ungodly amount of flickering while FXAA just gives me a clean image.
@@eon96 You're right, TAA also causes trails behind moving objects
Back in the days I treated FXAA as garbage, but now I prefer it over TAA as the latter somehow manages to be even more blurry. It's like somebody smeared my monitor with soap.
Me watching with 750ti: "Yeah you tell them!"
That garbage
@@xr.spedtech give me money for a new one then
GTX 750Ti 🤝 R9 380X "Yeah, 4GB VRAM should be able to run anything!" In all honesty, too many games nowadays ask for WAY too much considering what they visually produce. And more accessible games mean more sales for publishers, which makes these incredibly demanding games make absolutely no sense to me.
Me watching with GT 620 : :)
Don't worry. Games aren't worth shit these days and it's even worse with the DEI crap.
Some history: Killzone on PS4 was the first big game with a temporal reprojection pipeline. They rendered odd/even scanlines and reprojected the previous frame with motion vectors for the missing pixels. This technique allowed 60Hz multiplayer in a game with a 30Hz single-player campaign, which was great for gamers. None of these long smear artifacts existed, as each pixel was fully refreshed every other frame. And there were no noise-based optimizations.
The PS4 Pro was the big reason for these techniques to get popular. It was heavily marketed as a 4K console, and the 4K sticker on the box required a 4K framebuffer. But the PS4 Pro only had a 2x faster GPU for a 4x higher pixel count. So checkerboard rendering was invented: an improved version of the scanline technique of Killzone, with 50% of pixels refreshed every frame.
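The interleaving principle described above can be illustrated with a toy 1D version. This is only a sketch with made-up names: real scanline/checkerboard renderers reproject the reused half with motion vectors and work on 2D pixel quads, but the core idea is the same — half the pixels are freshly shaded each frame, and parity alternates so every pixel is refreshed every other frame.

```python
def interleaved_resolve(new_pixels, prev_frame, parity):
    """Merge a half-resolution render with the previous frame.

    new_pixels[i] holds fresh shading only at indices where
    i % 2 == parity; the other half is reused from prev_frame
    (ideally reprojected with motion vectors first). Because
    parity flips each frame, no sample survives longer than two
    frames, which is why long smear trails cannot accumulate.
    """
    return [new_pixels[i] if i % 2 == parity else prev_frame[i]
            for i in range(len(prev_frame))]
```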
I’d say modern temporal AA was first used by Ubisoft’s For Honor in 2017. It was basically a modified TAA shader that was accumulating into a native res buffer. This solved some of the checkerboard issues like sawtooth pattern in occluded areas. UE4.19 TAA was similar to this algorithm.
When people got used to 4K rendering with checkerboard and TAAU, they didn’t want to spend half of the PS5 GPU gains by disabling these PS4 Pro 4K tricks. Thus temporal reconstruction become the standard way to render 4K on consoles.
Meanwhile people had started to use TAA (no upscaling) as their general filter for everything. This was needed because games were moving away from static baked lighting to dynamic solutions. Dynamic solutions were more expensive and lower quality; TAA helped hide the issues. There still isn't enough performance to calculate high-quality fully dynamic GI and reflections without any tricks, and it would be even slower without TAA. If you want dynamic stuff, that's the cost. UE5 forcing dynamic stuff on all users is not great. Not all games need fully dynamic lighting and reflections.
TAA also made it possible to add completely new effects, such as volumetrics and volumetric clouds. Horizon Forbidden West had gameplay inside volumetric cloud caves, for example. For most games it's not worth using that many GPU cycles for volumetrics rendering; clouds can be faked using existing techniques just fine if you are not flying inside big cloud caves. Their new volumetrics tech leans heavily on temporal accumulation. Was it worth it for Horizon Forbidden West to have reprojected volumetric cloud render tech? That's up to players to decide. It made new gameplay possible. However, if UE5 introduces similar tech and forces it on all users, then we have a problem. Simple, efficient solutions are still useful and don't require temporal accumulation.
I just wanted to emphasize that people don’t do temporal jitter because they are lazy. It’s super difficult to make these algorithms fully real time. Added GPU cost is huge and you have to compromise.
All of this is especially true for ray-tracing. You are tracing thin rays. Each gives you very little data. Randomizing the rays randomizes memory access patterns which hurts caches. You want to minimize the amount of rays you cast. Temporal accumulation is required for acceptable quality. Even offline ray-tracers use temporal techniques and denoisers. If we want to go there, we have to accept this. I personally feel that ray-tracing is too expensive right now compared to the gains. On RTX 4090 it’s usable due to HW advances and sheer HW power, but still far from efficient. Raster algos (and baking) provide similar results for much cheaper GPU cost.
Baked lighting is great for runtime (if your content is mostly static), but it has other issues.
AAA games today have big game worlds and lots of content. Game downloads are already hitting 100GB. Baked lighting takes a lot of storage space; dynamic solutions don't.
Baking lightmaps for small game levels was fine, but today's massive BR maps require baking times too long to keep iteration fast for level designers and lighting artists. A dynamic solution allows developers to see lighting changes immediately. Productivity is better.
Developers want to patch these big BR maps in production, which would require a new light bake -> a massive patch to all users. That's not really feasible for games like Fortnite that are constantly updated.
There’s a lot of developers waiting for a future where you don’t need to bake any lighting. Super fast workflows and iteration time. The downside of this future is higher HW requirements for players (temporal makes perf hit more bearable). It’s a compromise, but many believe it’s a necessary compromise to produce more and more complex and polished content. Team sizes are massive today. Productivity is a real bottleneck.
i would rather have visual clarity & FPS than ray-tracing and the promise of 8K at 5FPS
We have gotten to a point where firing up a game from 2006 and running it in 4k looks better than latest UE5 titles
I can run Crysis 3 in 240 FPS native 4K, while Wukong struggles to get 60 FPS in 1080p and looks like smeary shit
Even UE3 games from 2006 look better than most UE5 games today
any care to compare gmod w/ and w/o TAA for us???
I had this experience with DoA on the OG Xbox (emulator). I was impressed by how good it looked on a system 20 years old.
No joke - perfect example are Xbox One X enhanced games from Xbox360 (9x resolution with increased AF and in some cases textures and maybe LOD).
Ninja Gaiden II or Gears of War 3 are perfect examples. I highly suggest checking comparisons.
The fundamental problem here, and the reason the mainstream has adopted TAA as the default for a decade now, is that most effects in modern games happen per pixel, not per vertex. This means MSAA can only be the first step in a long pipeline of ALIASING AVOIDANCE. After MSAA gives you clean geometry, you have to watch out for every new pixel shader effect not to add noticeable aliasing to the image once more.
Yes, over the last decade we're also seeing a lot of noisy temporal sampling, but these techniques have been invented because TAA is the standard, not vice versa! For flawless image quality without supersampling you have to PREVENT aliasing at the shader level. By moving some effects back to the vertex shader you can leverage the additional samples MSAA gives you. And then pixel shaders can still be used for effects that create smooth gradients or are generally subtle (low contrast, low frequency).
The only alternative I can imagine is sort of the opposite extreme, an engine built around supersampling. You minimize the cost of each pixel so you can render at least 4x, then apply some kind of post-process-AA (SSAAx4 is still visibly flawed), then scale back down to the screen resolution. Maybe this could get you *close enough* to 8x quality, but performance will be a problem.
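The supersample-then-downscale step that comment describes is, at its core, just a box filter over an N× render. A minimal sketch (the names and the simple box filter are mine; production resolves often use better-shaped filters):

```python
def ssaa_downsample(hi, w, h, factor=2):
    """Resolve a factor-times supersampled image down to w x h.

    hi is a row-major list of grayscale values at resolution
    (w*factor) x (h*factor). Each output pixel averages its
    factor x factor block of subsamples, which is what removes
    the jaggies: edge pixels become partial-coverage grays.
    """
    out = []
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(factor):
                for dx in range(factor):
                    acc += hi[(y * factor + dy) * (w * factor) + (x * factor + dx)]
            out.append(acc / (factor * factor))
    return out
```

The cost problem the comment raises is visible here too: the inner loops touch factor² shaded samples per output pixel, so 4x SSAA shades 4x the pixels.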
I'd appreciate the video creator's input on this
One root cause of the problem is that everyone uses deferred shading. There is some weird mindset that drives developers to use it.
I prefer a forward rendering pipeline. It allows high performance, good MSAA, and no need to recompile shaders, so it is actually simpler. You only need to take care of overdraw by having occluders in the scene, and have some logic to handle which dynamic light sources light which meshes, because of the 8-light-source-per-mesh limitation.
The video doesn't talk about MSAA at all. It mentions SMAA, a per pixel, post-process antialiasing that works with pixel-shaders.
@@gruntaxeman3740 Devs largely switchrd to deferred shading because it scales better with multiple light points and with open world games and games in general becoming larger traditional forward rendering is a problem and the 8th gen consoles had a big memory increase allowing deferred to become the standard.
@@lordanonimmo7699
I'm confident that in most cases that is a huge misstep. The light-source limitation is not "8 light sources per scene"; it is "8 light sources in the fixed pipeline".
What is required is to assign light-source state to every area in 3D space, based on each light's influence. So in reality we are talking about a max of 8 light sources per mesh; there is really no issue having plenty of light sources in the scene.
It is not only AA that suffers in deferred shading. From a performance point of view, the issue is memory bandwidth. A forward rendering pipeline requires much less memory bandwidth, and that is the bottleneck. It was the bottleneck on 8th-gen consoles and it still is, because memory is shared between the CPU and GPU.
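The per-mesh light limit described above is typically handled by ranking lights by their influence at each mesh and keeping only the strongest few. A toy sketch of that selection, with made-up names and a simple inverse-square falloff (real engines use attenuation radii and spatial structures, not a full sort):

```python
def pick_lights(mesh_pos, lights, max_lights=8):
    """Keep the max_lights strongest lights for one mesh.

    lights: list of ((x, y, z), intensity) tuples.
    Influence is approximated as intensity / distance^2 at the
    mesh position; a classic forward renderer would feed the
    surviving lights into the mesh's shader constants.
    """
    def influence(light):
        pos, intensity = light
        d2 = sum((a - b) ** 2 for a, b in zip(pos, mesh_pos)) + 1e-6
        return intensity / d2
    return sorted(lights, key=influence, reverse=True)[:max_lights]
```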
You were right, this really is one of the most important videos about modern game engines.
Thanks for putting our feelings into words. I hope you go far.
What a colossal tragedy that the vid has this low views...
As an Unreal developer I'd say you got things right. I wish I could add support for SMAA or optimize TAA/TSR, but these are forgotten topics that have no support on the forums, no documentation and no tutorials. And trying to learn it from scratch is an expensive task that would cost even me, as an indie, thousands of dollars and months in R&D. But why huge AAA studios don't do it, I honestly have no idea.
Geez...
Don't we have open-source implementations of this kind of stuff, like SMAA? Should be able to reference those when implementing SMAA in Unreal.
Simple, because it costs time and money that publishers would rather not spend. And considering how long it takes games to come out these days, I really can't blame them in the slightest. They're practically getting death threats for already spending 5 plus years per title. And from a corporate perspective, what kind of fool spends time and money on something that doesn't immediately make them more money back?
I'm sure there are many papers out on reference implementations for FXAA and SMAA. My first place to look would be ReShade shaders that are commonly available. I'm assuming if you know how to code UE pixel shaders, you can port it over.
@@NighttimeJuneau Like 90% of UE devs only code in Blueprints. 4% know they can code shaders; 2% know how to do it.
Regarding ghosting in TAA: 90% of those cases are improperly implemented TAA, with the motion vectors not being reliable.
For example, guns leaving behind ghosting happens because the motion vectors ignore the gun: it is drawn by a separate camera on top of the main camera, which is used to prevent the gun intersecting the world.
Blurriness can also be significantly mitigated by "unjittering" the textures in screen space, which a lot of implementations don't do, so that only the edges of geometry jitter, not the textures.
Another big thing is using bicubic sampling for the reprojection instead of hardware bilinear, which helps much more with sharpness than a sharpening filter does.
Just those things alone bring TAA quality up a significant amount. It's just that for some reason the standard implementations don't really do these things, probably because it's a little tricky to implement the texture "unjittering" specifically in an engine like Unreal/Unity, since every shader that samples a texture to the screen would need to implement it.
We really appreciate your high-effort comment.
There is the argument that these are "improperly implemented", but at the same time you can find hundreds of games that display these issues, from AAA to indie. And what legitimizes them is the standardized abuse of these flaws in the name of "optimizing" several effects, when that is now clearly an invalid excuse with no significant impact on 9th gen.
RE: "hardware bilinear which helps much more with sharpness than using a sharpness filter."
Guessing you mean better clarity? You can add a sharpening filter to a game with or without TAA. We strive for clarity, since you can't add a "clarity" filter to either 😉👍 (we'll look into it)
We looked into the counter-jitter technique for textures but didn't have too much concern over it, since we are big fans of the sample pattern found in the Decima AA research, which stated that the samples are so close together that texture alteration wasn't necessary. But maybe we should still test it ourselves, considering their standards at the time (releasing with no motion vectors & causing severe ghosting).
We don't have an official chat or online group, but you might be interested in joining the r/f***taa discord server, which has a channel dedicated to developers looking for improvements in this area: discord.gg/W7bWWfF5RW
We always like to remind people we are not anti-TAA, only against flawed TAA that can be abused to cover hideous issues with other pipeline steps.
@@ThreatInteractive Regarding the bilinear: you can use bicubic, another interpolation algorithm, instead of bilinear, which is the hardware default.
It reduces the blurriness significantly; bilinear is by nature very inaccurate and blurry.
I completely agree with how TAA gets abused through effects like SSR, SSAO, and SSGI: they are generally poorly implemented, and then they rely on TAA to temporally blur them together, effectively as a denoiser. Even in deferred rendering some people opt to use stochastic transparency and TAA to blend the pixels to implement transparency.
And things like TAAU, while the idea is honestly great, are yet more things abusing TAA. Temporal upscaling is "fake performance", not an optimization. It's a great option to have, but DLSS and similar shouldn't be the standard; they should be an option for those with incredibly old/poor hardware.
Optimization in modern games bothers me a lot. Things like Unreal's Nanite shouldn't ever be used for games.
It doesn't improve performance, hell, it hardly even improves quality; it just enables developers to slap in 500-billion-polygon models consuming a billion gigabytes of data.
Even things like Lumen: why do we need such a degree of dynamic lighting when lightmaps can look even better, support changing daylight cycles, and are almost free to render?
@@michaels851 You mentioned using bicubic sampling, can you give more context on this? How do you change the reprojection type?
@@pakumies By default, when you sample a texture the GPU uses bilinear sampling. You can instead write your own bicubic sampler in the shader, which generally produces better results.
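For anyone curious, a common choice for such a custom sampler is a Catmull-Rom cubic; the 2D texture version applies the same kernel separably along each axis. Here is a 1D sketch of the generic formula (not code from any particular engine):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom interpolation between p1 and p2, t in [0, 1].

    p0 and p3 are the neighboring samples on either side. Unlike
    linear interpolation, which low-pass filters the history a
    little more every frame (accumulating blur in TAA), the cubic
    passes through the samples and preserves more sharpness.
    """
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t)
```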
@@michaels851 You talk about Lightmaps looking better, but I have never seen that before... not in cyberpunk or any game that gives ray tracing or path tracing
i think it's also important to note that a lot of the examples of ghosting with TAA shown here come from the velocity vectors for those pixels not being correctly calculated. if this is done correctly and unusable samples from previous frames are correctly discarded, then ghosting becomes a much smaller issue. of course some blurriness and artifacts still remain because it is a temporal method (which the video mentions in detail), and i agree that TAA is probably not ideal for visual clarity.
Is it even realistic to expect the velocity vectors to be calculated correctly? The examples are extreme, but this effect appears all the time. At first I thought my monitor is shit, but then I noticed that not every game has that ghosting. Personally, I'd take a grainy image over those artifacts any day.
@@tofunoodles it's more image stability than grain that TAA fixes
Which remains the source of this problem. Fake optimization, and those methods being used to hide the lame work.
Not to mention that devs intentionally try to hide those issues with the worst lighting decisions ever.
Great video. It's clear that modern 1080p games look way blurrier than 1080p did back in the day, and it's a visible fact.
Game devs are relying too much on upscalers, lighting gimmicks and temporal solutions to cut the cost of development and cheap out on effects.
At this rate, new triple-A 4K games will be as clear as any 480p PS2 game.
In former days developers just reduced the output resolution and didn't use any advanced reconstruction at all, leading to most games of the PS3 era looking absolutely terrible. Either people forgot, or are simply too young to have experienced it. We're currently having the best image quality since CRTs went out of style.
@@NeovanGoth It only started to be an issue in the late half of the Xbox 360 era. The early games were all 720p with MSAA and looked great. And on multiplatform PS3 games it was more common, since devs didn't know how to optimize for the PS3.
None of that changes the fact that on PC, players were still playing at 1024x768, 1280x720, 768p, 900p and 1080p. Games of that time looked a lot sharper than current games. No way are we having the best image quality when games not only look blurry but have horrible smearing in motion, which is the biggest problem.
It's a tradeoff: for a bit more blur we get better, more realistic lighting.
@@NeovanGoth Console shit has not been cutting edge on any level post the PS2 days. The PS3 era PC games, that is DX9.0c to DX11, have some of the best art x graphical effects balance in history, with some sharp, high contrast shading, deep parallax maps, high enough polycounts and texture resolutions... etc.
Again, it's mostly the console peasants driving themselves into a corner with their "TERAFLOPS! 4K AND 8K GAYMING ON A 799 USD CONSOLE!!" claims, that have resulted a new-found ""need"" to use lazy tricks and hacks to make their poorly designed, poorly optimized garbage to run even remotely passably.
Hell, take 2015 games like MGSV, Witcher 3, Talos Principle... etc. Custom engines, gorgeous visuals, and all of them run smooth as butter at native 1080p, 60fps, on 1GB GDDR5 GPUs.
@@GugureSux I'm pretty sure that Witcher 3 definitely didn't run in 60 fps on any console back then. I've played it on PS4, and it was ok, but not great.
Also "console shit" _never_ was cutting edge, because consoles always have to compromise between performance, and cost. There was perhaps _one_ "no compromises" console in history, the Neo Geo, which was 100% on par with SNK's high end arcade machines, and so expensive that it failed spectacularly in terms of sales.
Could it perhaps be that you, like so many people commenting the same phrases over and over, simply don't know a lot about gaming history, technology, and software development in general?
Some constructive feedback : At many points you really pass over points way too fast without actually elaborating what the issue is. At pretty much every point you barely give us, the viewer, any time, without pausing the video, to comprehend what the actual issue is, let alone recognize differences in the shown scenes. It would be better if you narrate what exactly the issue is and let people recognize what you are showing, instead of switching to a completely different scene every 5 seconds (exaggerated). I do realize that it's possible for the viewer to stop the video, rewind, and step through each part manually frame-by-frame to see what the difference is, but having to pause and manually compare two scenes every 3~10 seconds is not a pleasant viewer experience. While I do realize that the video would be significantly longer with you elaborating each scene comparison, it would be tremendously more helpful for the viewer to comprehend and have time to understand what you are talking about and what the problems are that need to be solved, as not every viewer will recognize the differences in the shown scenes (especially regarding ambient occlusion, which is rather subtle).
Video length was definitely a concern. It's bridging a lesser-known connection between two topics: TAA and fake optimization. We are currently considering doing YouTube shorts that explain these snippets slowly, showing why modern versions are so flawed compared to older/faster deferred implementations.
It's especially difficult when YouTube defaults to a streaming quality below 4K, with heavier compression.
@@ThreatInteractive Bro, you're going to have to realize video length doesn't matter anymore. You have youtubers that do 8+ hour dissertations on ONE movie. Now, you're talking about something that actually matters, so you have all the time in the world to present your evidence and clearly show each point, drilling the details into the viewers' heads.
You gotta realize how important getting every fact into visual detail is, especially for getting people on board with a fork of the UE game engine, let alone getting people to contribute to the project.
All in all, I'm on your side on this issue and hope your ideas get it resolved.
i had to reduce the playback speed to properly see everything and not feel overwhelmed
@@dothex agree with this. take your time and explain well
Disagreed with the comments on this thread, the video was the perfect length, only improvement would be to point to more in depth videos where we can learn more
Fundamentally, there is absolutely nothing wrong with TAA. Pretty much all anti-aliasing methods have a level of jank, because you're attempting to achieve something that is meant to be done by downscaling a higher resolution to your monitor (SSAA). Everything else is merely an approximation of that.
Upscalers are not exactly fake optimization; the problem is they just aren't good enough yet to be used as heavily as they are right now. They are a VERY promising technology, especially for older hardware. But games should not RELY on TAA or upscalers, as relying on a post-processing effect for pretty much anything is never a good idea. Your game should be able to stand on its own two feet, and THEN add in things like TAA and upscaling.
There are also different levels of TAA: some games' TAA looks like literal motion blur, while others, such as DOOM 2016's TSSAA, are beautiful and produce next to no artifacts, from aliasing or from the anti-aliasing effect itself. TAA CAN be done right; it's just that if you rely on it as a crutch from day one, you are tricking yourself into thinking your game runs better than it actually does.
Being a mobile dev attempting to bring AAA graphics to older phones like the 6S, TAA is a godsend. Yes it looks blurry, yes there is a lot of artifacting, but the game would not run on the hardware otherwise and it looks pretty bad without it. However, that is not my main platform; it is merely an option for consumers. I believe every developer who uses TAA and upscalers should put in a warning saying that you CAN run this game on older hardware, but be warned there will be artifacting.
But the bottom line is that developers should not rely on TAA; it should only be added in after the fact, and games should DEFINITELY add an option for other methods (FXAA +1 frame actually looks pretty awesome). Personally I hate jaggies, and SMAA or MSAA are just not effective enough.
But FXAA is really quite amazing, especially considering it's basically a free effect.
Nice little trick: using an underscore before and after a word makes it italic, like in _this_ example. And an asterisk before and after a word makes it bold, like in *this* example. Looks a little nicer than all uppercase if you want to put emphasis on a single word.
My biggest complaint with modern graphics is the horrible transparency methods we have now that lean towards dithering
True transparency, or single-layer transparency, has many disadvantages: sorting, shader cost, clipping, culling, and so on. Temporal dithering is a simple opacity-mask technique. It uses a small tiling noise texture and multiplies it with a gradient fade. It needs TAA to avoid flickering or showing pixelated dithering. The advantage? Fast, low shader cost, no switching the model to a blend pass.
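The noise-threshold test described above boils down to comparing opacity against a tiled threshold pattern. A minimal sketch using a 2x2 Bayer matrix (illustrative names only; real implementations use larger matrices or blue noise, and jitter the pattern over time so TAA averages the discard pattern into smooth transparency):

```python
# Normalized 2x2 Bayer thresholds, tiled across the screen.
BAYER_2X2 = [[0.0, 0.5],
             [0.75, 0.25]]

def dither_visible(x, y, opacity):
    """Screen-door transparency: keep the fragment iff its opacity
    beats the tiled threshold. Over a tile, the fraction of kept
    pixels approximates the opacity, with no sorting or blending."""
    return opacity > BAYER_2X2[y % 2][x % 2]
```

For example, at 60% opacity roughly 3 out of every 4 pixels in a tile survive, which reads as ~75% coverage at this coarse matrix; larger matrices quantize opacity more finely.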
Yeah, that was my #1 complaint too, on par with ugly ssr, until unreal additionally ruined every game with ghosting and ugly ai lighting
it's 2024 and the standard for transparency is: doing what super mario world for the super nintendo did for its water effect
@@UlrichThümmler You forgot the downside: it looks absolutely shit. Subtle things that need transparency, like hair, have been completely ruined. Not only do they need TAA to look even remotely acceptable, but even with TAA the hair looks like total crap compared to any game from 10 or sometimes even 15 years ago. Yeah, hair in the old days with its transparency had problems too, such as the flawed interaction with depth of field (hair goes out of focus even if the character's head is in focus), but even with those drawbacks the end result was still so absurdly better than the garbage we have today. It's one of the best arguments for why the industry is going down the wrong path: characters' hair.
In a highly stylized game, like Genshin Impact, I think it's appropriate and, IMHO, actually adds to the charm, but in a game going for realism, it's wholly inappropriate.
VR could be a major driving force behind TAA's decline in popularity, especially as the mobile virtual reality market grows. Alyx has a very clean MSAA look.
I don't think so. With VR you can't have fast moving action-oriented games like Metal Gear Rising since they would introduce motion sickness in the majority of players, as well as with games like Titanfall. Plus the cost of entry will always prevent the popularity of VR's mass adoption.
Agreed, because the user's camera (Head) is *always* moving, temporal effects can suffer, with something like TAA, it would get turned into a blurry mess quite fast. MSAA is the cleanest one, but also sadly super expensive. With most of VR moving to mobile hardware as well (Like the Quest 2 / 3), there needs to be a solid solution for fast anti-aliasing that doesn't look awful.
@@Barbaroossa thats just not true. People play high speed games all the time in VR. You are literally making things up
@@Camobiwon Most VR already uses MSAA. Standard setup for Quest games is to use 4x or 2x MSAA
@@guybrush3000 It's ideal, but in some cases not performant; I believe URP in Unity takes a significant performance hit when using MSAA on Quest.
Oh shit, instant subscribe! Finally someone coming out hard to speak the truth about this current BS and making lazy, greedy devs and companies squirm.
"FXAA addresses stair stepping but introduces blurring"
A slight correction: FXAA does NOT address stair-stepping; it just introduces blurring. It also does nothing for temporal aliasing, which is, all on its own, one of the worst blemishes on 3D graphics... blur + temporal aliasing = vomit. FXAA was the worst thing ever introduced to 3D graphics.
(For those that don't know: temporal aliasing refers to that lovely jaggies-shimmer you see at all resolutions... even 4K.)
I prefer to use gaussian blur tbh
Gone are the days of proprietary game engines. It's kinda sad that we have an "all-arounder", but it doesn't excel at anything.
There are still many, just not that many.
this is RenderWare all over again. All it takes is some abuse of power, or somebody buying Epic Games, and the entire industry crashes.
@@ali32bit42 epic is the entity of all entities at this point and it's partially owned by chinese equity. Things aren't going to get any better. All it takes is them making a few icky changes in their ToS.
You can also say "Gone are the days of building your own library for running a game... You just import libraries with what you need"
Yeah, for a good reason... because it is becoming more and more expensive to build a proprietary game engine.
The effort, cost, and time to update a proprietary game engine becomes exponential, while the performance gain / improvement to graphical fidelity is diminishing...
If you want more proprietary game engines, you should be prepared to pay $100 per game...
Even CDPR is dropping their engine. It is simply far too expensive for them to continue it, and it leads to far too many issues compared to what they really want to do.
And look at Starfield. It is so cobbled together that many of the old problems of the earlier titles still plague it to this day, and it ended up being terrible for its use case, and actually became worse, because it is still relying on optimizations and game design from Morrowind, which is from 2002... The loading zones, today, could essentially be hidden and dropped if they built something new, but stapling that onto their already improvised engine was likely to make it too unstable.
There isn't really something "amazing" about proprietary engines, other than the studio having to add another 2-3 years onto the development time. Quite a few studios go bankrupt or have to severely compromise to survive, because it is hella expensive.
Most studios that build proprietary engines throughout history did it with the intent to sell it on... Serious Sam only exists because it is quite literally a tech demo for The Serious Engine.
Luckily for them, Serious Sam was a success, because the engine didn't really sell...
They also retired their engine... because again, exponential effort, cost, and time... diminishing gains...
I'd argue it excels at running like shit while looking like shit on anything but the highest settings
I'm a Quest standalone developer and the main technical artist for our company, and this video makes me pretty annoyed with other studios abusing these effects. What was supposed to be used for good was mishandled. The advantage of TAA comes with supersampling the image; other than that, it is detrimental, like smearing Vaseline over the screen.
We are going from lossless techniques to lossy ones, and it is very sad. I hope Unreal Engine can improve the future of games, but not like this: games coming out wasting more electricity than they should and gatekeeping players with low-end hardware.
Sorry to hear about Kevin, I hope he recovers and is doing well. Great video, btw. My guess is that game developers are in cahoots with NVIDIA and other hardware manufacturers, who push for the implementation of these features in video games to justify the insane prices of recent GPUs. I've had this suspicion for years and haven't been able to prove it, but every time I look at games from previous generations, I cannot deny that modern games pale in comparison. And to be frank, I'd rather have a game with stylized graphics that runs well than one with ultra-photorealistic graphics that runs like garbage anyway. This industry is sick to the core.
Everyone thought it would be planned obsolescence via drivers. Nope, it's modern GPU pipelines and image quality that are doing the dirty work.
Incredibly well done video man this is gonna be a great channel, absolutely subbed. Good luck with the game!!
I would also add that devs rely on sample-and-hold motion blur to hide the noise in motion. Games are already blurrier than they should be because of modern LCDs.
Thank you for bringing attention to this. I feel like my investment in a 4K display was foiled by developers taking shortcuts. TAA looks like shit, even at native 4K. It's visibly blurry, totally unacceptable. Older games legitimately look better.
I thought I was going crazy with today's games looking worse than pre-2013 ones. Thanks for the video, brother in Christ.
I know this whole video is about TAA but the main thing that stuck with me is how shit SMAA grass looks
Yes, let's dismiss everything he said and, imo, proved about TAA destroying motion clarity, and the benefits of SMAA (despite its limitations) or FXAA + a previous frame over TAA, just to focus on a single SMAA issue...
@@Argoon1981 Oh no, I do agree that TAA looks bad at a low resolution, low framerate or just with a bad implementation in general. It's just funny how this is the second video I've seen by this dude where he uses SMAA as an example of good AA, then shows a frame that looks worse than even FXAA
Why not use different solutions for different problems?
@@SFNB-f5s It's not so simple to apply them independently like that; it would be quite a lot of work and come with a hefty performance cost.
@@michaels851 There's always a way to optimize everything. You'll never find a solution to problems if you don't try stuff out. Just experiment. The only limitation is if the engine itself allows you to change and experiment with your own technologies. That's why I always prefer my own engine, there's no one telling me what I can and can't do.
Thank you for mentioning THE FINALS! I was thinking of Embark the whole time during this video. They made that game with a total team size of ~30!
I hope you take over the game industry in terms of standardization
This needs to reach as many gamers as possible. The image clarity of games nowadays is abysmal.
I'm glad to see more people talking about what TAA does to image quality!
it's worse than I thought.
bless you.
I'm throwing in some support and I look forward to seeing what this engine you propose will look and work like. Speaking the truth in these modern times is a revolutionary act!
THANK YOU, I JUST CANT THANK YOU ENOUGH. I legit thought my mind was playing games with me, but no its actually that bad, all those marketing materials got me bad....
Amazing Video. I have been playing around in the caustics branch of unreal for ray traced caustics and DXGI and the feature set seems a lot more responsive than epic's native build. I hope to see this project progress further, as I too am sick of blurry games in motion. Renders look amazing, stills, slow images. Moving just makes everything a blurry mess which only VRR and DLSS + Ray Reconstruction can resolve.
1:30 The moment I first played TimeSplitters 1 on PS2 in like 2002, I thought, "Ah, so a solid 60 fps with no stutter, ever, is going to be the new standard from now on, and gamers will NEVER accept anything less." And yet here we are.
Console Gamers - "Ah, remember the days when the mud we had to drink didn't even have any dead bugs floating in it?"
PC Gamers - "...No? Why were you drinking mud?"
The lack of MSAA options forces me to use SSAA through custom resolutions on almost every game.
These studios need to get their stuff together.
I actually like upscalers in how they can help older systems BUT I feel like developers are abusing them in basically making them mandatory to get decent performance in modern games rather than doing actual optimizations. Upscaling is NOT a devtool (as claimed by some). It's an OPTION for endusers and should never be mandatory to get your game to run decently.
Yes, thank you for this video, UE is so frustrating.
This is BY FAR the most extensive and well put together documentation of these problems. Thanks so much for making this. It's the perfect video to share around regarding the issues that plague modern AAA game visuals.
Once again, love your content and I resonate with what you're trying to convey. I see how UE5 can be a problem; their SIGGRAPH presentation on Radiance Caching for real-time GI, together with this video, is very eye-opening in that regard. In principle, I've been complaining about very similar things under DF videos since at least 2022; it was around the launch of TLOUP1 and DLSS3 that the guys there really started to nonsensically advocate and push for RT in games: they wanted, demanded, 4K together with RT, not just on PC but on consoles! How can these "experienced" professionals lack this much awareness?
Going back to the main point: where would you put Ray Tracing in all this? Could we say that the sudden and premature push for Ray Tracing has had a huge role in the way things developed, further exacerbating the situation? And related to that, what kind of role did nVidia have in all this (all things RTX), in your opinion? I was very surprised by the ending of this video, where you advocate for the much faster nVidia's DDGI: wouldn't that still be subject to the same issues that we have with modern "high quality" dynamic GI techniques, Lumen included, since they still need temporal accumulation in order to work/be viable when it comes to performance? I can understand that, if we will have to bear shitty IQ due to temporal reuse, the least we can do is to at least make it faster and smarter in order to mitigate its downsides, but still. I am just trying to understand, thanks!
Keep fighting the good fight! I must add, The Finals may perform well, but the amount of dithering and ghosting is still massively unacceptable. I recently saw a video on r/FuckTAA where they went through normal gameplay and highlighted the crazy amount of ghosting the game has. I truly feel UE5 is the worst thing to happen to gaming when it comes to performance and visual clarity. Like you mentioned, it is so easily available, why wouldn't a company use it?
I have been playing Gray Zone Warfare recently, and that game looks like shit and has terrible performance. DLSS and FG are a necessity, but they are trying to upscale a mess of pixels, so it is impossible to see anything clearly. Even with upscaling off (kinda...? I'm not sure that's even really an option, just things like FSR AA and TSR), it looks horrible. They don't even have a render resolution slider, just some weird one (can't remember what it's called) that defaults to 62. The smoothness is off too: I will play an older game at 80-100 fps and it feels smoother than GZW at the same fps...
How the hell did we come to having games that look worse AND perform worse?? It kills me seeing comments from people saying things like "my game runs perfect, 60 fps on my 4080 with high settings, it must be your PC"... I feel like I'm going crazy! 60 fps is NOT okay for a PC I spent $2,000 on!! Developers are putting upscaling in the system requirements now, FFS... This is exactly what you mean when you say they are trying to make this the new standard, but it's all BS. I can't even be excited for new AAA games anymore. They lack visual clarity and run like shit. We have taken serious steps backward, and I don't think it's gonna get better, only worse.
this needs more views
This is neat, but why base it off of Unreal when you could push these features to open-source engines like Bevy, Godot, or O3DE?
I figure you must have experience in unreal that would make implementing features like this easier than learning those other engines.
Would you consider making reference designs of your effects available separate from your unreal code, so that those of us without unreal licenses can implement your effects in other engines?
Unreal is currently just the go-to engine in this industry.
Many people have already built codebases in Unreal which they may not be able to simply port to other engines.
@@vegitoblue2187 That's a good reason, but compromising for existing projects doesn't strike me as the goal to strive for here; the chances of existing projects spending a bunch of time and money adding these improvements to their existing products seem unlikely to me.
But I could see more devs using it for new Unreal projects than would be willing to learn an entirely new engine for graphics improvements.
@@nullvoid3545 yeah, people underestimate the efforts of getting used to a new engine, making sure you are doing things right as well.
Unreal is more scalable than any of those engines. O3DE is the only exception, because it's based on CryEngine 3.8/Lumberyard-derived code, and probably doesn't have the TAA plague to the extent modern Unreal has.
The TAA mod for ALIEN: Isolation, known as ALIAS: Isolation, is AWESOME, that's for sure! It's one of the best implementations of TAA in gaming EVER!
To the modder who made this mod: my respects!
TAA can eat a fat one
Wow, never knew i needed to hear this, learned so much! Thank you for bringing this up, great video man!
Man I thought I was going crazy with how often I've found light reflections acting so weird. This helps explain it.
"Modern" blurry games that run at ~45fps@medium on hardware that runs Cyberpunk at 60+ fps @high is just not acceptable as "a good game" no matter what 4090 users say.
Fighting the good fight! 💪🏽
A shame a video like this isn't mainstream in the gaming industry.
And people would call me crazy and say I was bitching about nothing when I complained as games started using TAA years ago. I am in no way a graphics engineer, just a guy who has played PC games since 2000. I could never get used to or stand TAA and would always switch it off and play with no AA, or use supersampling when my hardware could handle it, up until games started looking horrible without TAA even at 200% SS. Games just look bad now. We pay so much for hardware that is extremely powerful, and games still look and run like dog shit. It's really bad that I can't get a decent 1080p image (as far as 1080p goes) with a 3080, when 10 years ago you could get a decent image without artifacts on a GTX 960.
Thank you and kudos for bringing light to this issue. I hope this video goes viral. Modern graphics have very obviously gotten better, but TAA brings blur into the equation so you can't even see those graphics. It's like taking two steps forward but three steps back. I always wondered why games like Warhammer 3, Cyberpunk, and The Witcher 3 looked blurry when I built my new PC, even with motion blur turned off. BRING BACK REAL ANTI-ALIASING! BRING BACK MSAA!
MSAA only anti-aliases part of the image and leaves the rest untouched, not to mention the big performance hit. And like he said, many of the lighting and shading effects modern games use rely on temporal anti-aliasing/denoising.
MSAA isn't really viable anymore. It only applies to geometric edges, so shader/post-processing effects are completely unaffected, not to mention aliasing in surface detail like specular shimmering. There is also the massive performance hit that MSAA incurs. My favourite AA at the moment is SMAA, but barely any games implement it. It suffers from some of the drawbacks of MSAA but is pretty much free in terms of performance and doesn't exhibit excessive blur like TAA.
It also fails to handle HDR with the default MSAA resolve/tonemap to framebuffer.
When the dynamic range is large enough, some samples may fall completely out of the tone-mapped brightness range.
This is quite visible when a dark object is in front of a bright object (i.e. the sun).
To get this to work better, one has to fix it manually.
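The resolve-order problem described above can be shown with a few lines of arithmetic. This is an illustrative sketch, not any engine's actual code; the tone-mapping operator and the sample values are assumptions picked for demonstration. With the standard "average the linear HDR samples, then tone map" resolve, one very bright sample dominates the edge pixel; tone mapping each sample before averaging keeps the dark object's contribution visible.

```python
def reinhard(x):
    """Simple Reinhard tone-mapping operator: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

# Two MSAA samples covering one edge pixel: a bright sun sample
# and a dark-object sample (hypothetical luminance values).
samples = [100.0, 0.01]

# Default resolve: average linear HDR samples, then tone map.
naive = reinhard(sum(samples) / len(samples))

# Manual fix: tone map each sample, then average.
fixed = sum(reinhard(s) for s in samples) / len(samples)

print(round(naive, 3))  # ~0.98  -> edge is almost as bright as the sun itself
print(round(fixed, 3))  # ~0.5   -> edge actually blends the two surfaces
```

The "fixed" order has its own cost (tone mapping per sample instead of per pixel), which is one reason engines don't just do it by default.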
Omfg, this video makes me EMOTIONAL 😂. FINALLY my anxiety, hatred, confusion, etc. have been put forth and presented all in one place. I've been saying this for years in arguably worse and less informed ways, but having this video lay out the entire problem, the solutions, etc. is amazing. God's work! Let this be seen!
I always knew that TAA is garbage, but now I know exactly how it destroys everything.
Wow, a lot of stuff covered here. Thanks a lot, keep it up!
This is an incredibly important video for ANYONE who cares about the state of gaming to watch, and I'm deeply saddened that the algorithm hasn't spread it around way more homepages already. I can only hope this video blows up belatedly :D
Very well explained with the visual examples provided. As somebody who is an avid gamer but familiar with the technical aspect of games in only an 'intuitive' way, I found it easy to follow along with the line of thought.
It's incredible that, as a life-long PC gamer, I can confidently say I would NOT have built my first PC in 2020 had I known about the trends to come.
I don't even feel like I would be missing out on anything with inferior hardware; I couldn't run Cyberpunk on launch above 60 fps WITH DLSS, Elden Ring was (and still probably is) locked at 60 fps, ray tracing barely shows up in games, and when it does it's not usable unless you have THE most expensive card (I have a 3070 and don't even run RT in Control because the performance hit is so noticeable). DLSS is cool but, as illustrated by this video, it's a band-aid.
I got Helldivers 2 and went from running the game at 100+ fps in April to now getting 60 fps, lmao. It's just a joke, and sadly it's a pattern I am noticing in many other aspects of culture as well...
There could be some monetary enticement behind that makes sure this video does NOT trend and become viral. I'm sure you can catch my drift.
A hidden gem of a channel..
Keep up the good work...
I felt like I was going CRAZY. Even in Fortnite, which became Unreal Engine's flagship, you just can't escape that blur. Just use a car and watch under the wheels; there's always ghosting. Even with TAA disabled or whatever, I can't manage to get rid of it.
So many games have that blur, and somehow... everyone seems fine with it?? Every time I mentioned it, it was brushed off as if I was just nitpicking. But it looks just horrible. Thank God this video came along so I can rest easy, knowing that I'm not crazy.
So much truth in this video 😂. The problem is most people are casually blind or too young to know what a really good-looking, sharp game is.
Best thing you can do is like, sub and share in any place possible.
Thanks for the support!
Quite the opposite. I'm pretty sure most people who like to complain about modern anti-aliasing and reconstruction techniques never connected a PS3 to a 1080p display, otherwise they'd know how terrible most games looked back then. And yes, even on the PC, because literally nobody could run Crysis smoothly at whatever native resolution common LCDs had back then. I guess most people here are simply too young to have even tried it.
@@NeovanGoth Emm, no.
I always played mainly on PC, always at native res. Even when I was a child, the difference between native and even one resolution below was striking.
Lol bro great intro. You should wear a black shirt. Black is safe and powerful. Black is... Professional.
What is wrong with the Digital Foundry video? Just watched it again and it is super accurate. He also mentions downfalls with TAA and how he thinks you should always have alternatives such as super sampling.
A lot of these people don’t like DF and think they’re shills just because they don’t share their primal hatred for TAA lol. Then you also have fanboys that think DF is biased against their favorite brand (Xbox vs PS, Nvidia vs AMD) even though there’s nothing I’ve seen from them that indicates they arbitrarily favor any given platform. If they favor one in a comparison it’s because it provided the best quality and performance, not because they wanna “own the Xbots, Sony Ponies, AMDumbs, Nvidiots, etc…” Their lack of bias should be evident by the fact that every single time they do anything there’s a different group in the comments claiming they were wronged by DF’s comparison or tech dive.
It's defending a broken mess, they use the typical shill tactic of using neutral sounding language to mask their bias. It only works if you're stupid btw.
Why count pixels of the first frame after a scene change, where the TAA hasn't kicked in yet?
@@Brawltendo Oh, they absolutely shill for nVidia. Richard is way more tepid about it lately, but Battaglia is a massive fanboy. It took me a while to open my eyes, as I myself was a fan for a long time, until I started to see things going against basic common sense, and nVidia marketing being shoved into analyses. From that point on, I've been way more critical of the stuff they say. A couple of examples: do you recall Battaglia having a video comparing DLSS FG vs FSR FG? No? Ask yourself why.
Or more recently: for a few videos now, he keeps selling nVidia's Ray Reconstruction at every chance he gets; that's because he knows the upcoming battle will be about denoisers, with RR pretty much being the only tech left setting nVidia apart. He did that in his Megalights "explanation" (lmao) video and in his latest Silent Hill 2 Remake video. Don't get me wrong, I still enjoy all of his videos (his delta-time investigation in SH2 was brilliant); still, the guy got a platform and he's making full use of it, for good and bad. Hopefully he doesn't push his luck too far: it takes a long time to build a reputation and very little time to destroy it. With me he's definitely lost a few points already, but hey, I am nobody, so yeah.
@@ctrlectrld "do you recall Battaglia having a video talking about and comparing DLSS FG vs FSR FG? No? Ask yourself why"
Because Rich already did it. Alex afaik already talked about FSR FG positively in forums if it was implemented well in a certain game.
"with RR pretty much being the only tech left setting nVidia apart"
What are you talking about? DLSS is also still unrivaled from a quality standpoint, by far actually.
"since a few videos now, he keeps selling nVidia's Ray Reconstruction at every chance he gets"
That's completely plausible in a video about very obvious standard denoiser artifacts like in SH2.
Your feelings are massively deceiving you.
You've pointed out so many things I never knew about. Thank you for the amazing work
Nice, informative video. I had the impression that TAA had become the computer-graphics equivalent of generating electricity by boiling water to spin a turbine: far too common and not efficient.
How is it that when something derails too much, someone always comes along to rectify it? Never the parties involved, but someone always comes along!
I knew something was off with the new games, but I didn't know enough to put it into words.
God Bless, hopefully you can help.
The worst part of UE5 isn't even TAA. It's the nasty ass blurriness of reflections. Anything shiny looks like it's being perpetually spray-painted with an undulating haze. There is no clarity anymore. So ridiculous.
It makes me upset that there are these systems forced upon developers that don’t need them most of the time, shouldn’t need them at all. I think a lot about indie devs that have been enthralled by the ease of development in UE5 because of its robust Visual Scripting systems, but their games in the engine don’t have super detailed models and the games look weird in UE’s light systems. Unity as well.
I remember back at the start of Unity and in UE3, these engines prided themselves on making their engines scalable from the highest end PCs of the time to being able to run in a web browser. I miss those days. I miss them a lot.
While I'm sure all this is true, you highlighted the key problem here - cost. If developers can do things inefficiently but get the game released quicker, particularly the bigger publishers will force them to do it. UE5 seems to be hugely focused on development speed, not efficiency.
me never coding a game in my life: Thank god I know this now
I don't really understand a lot of this, but you're saying complex words and stuff, so I believe you.
You shouldn't believe someone just because you don't understand it.
The comparisons should give you a solid idea of what the fuss is about.
Great documentary - I hope you secure funding.
As an indie developer currently using Unreal Engine 5 in my project's development, your words and evidence alone were damning. I use this engine on a semi-daily basis, and I hate how true this all is: blatant disregard for traditional rendering techniques that were in use years ago, thrown to the wayside in favor of noisy, badly designed TAA-dependent effects.
I used Unreal Engine back when it was on 4.24; it was a breath of fresh air when we didn't have to deal with any of this crap. But now, here we are. Where has it all gone? Just to save a quick buck?
If you do manage to get a custom engine branch going, I am all in. I want to use that branch over Epic's inherently flawed branch of the engine. I wish you the best of luck in securing funding and investment.
And oh yeah, I am less than pleased about Epic Games' behavior and stance on all of this, instead of pushing the industry to do better. They want to do things in a way that costs more time to process on next-gen consoles and PCs alike. It's unacceptable, having to find undocumented loopholes to fix the issues with this engine.
And seeing everybody ditching proprietary engines over this? Why? Just why? Also, can we stop adopting TAA, like... for good? And not use TAA-dependent effects at all?
Back in the late 90's early 2000's I dreamt of the great battle between John Carmack and Tim Sweeney culminating in amazing engine choices and freedom. The world needs Carmack now more than ever!
@@gstreetgames2530 I think any competitor to Unreal would be great now, because back then its competition with id Tech was what drove both engines to innovate and become more optimized and accessible. I think Unity in its early days was also a big factor in making Unreal more accessible, especially for indie devs. Sadly, Unity is fumbling now and Godot is still getting on its feet, so I'm hopeful there'll eventually be a worthy game engine to compete with Unreal and make it good again.
@@gstreetgames2530 Carmack creating a startup to take on Epic games would be amazing
@@miguelpereira9859 Indeed it would be, I wonder what he is working on now. Hopefully he lost his fixation with the VR fad and his ties to fedbook.
Thank you so much for this, I've been an unreal engine hater for a while but now I finally have actual points to bring up
Have you guys considered putting your efforts towards improving Godot engine? An open source project like that could use people with your talent and knowledge
Have you considered our Lord and Savior Gentoo GNU/Linux?
Yeah, I think that the effort might be more useful to the industry if put towards a truly open engine like Godot. On the other hand - Godot is not targeted at AAA studios, and indie games have a bit different needs.
On yet another hand - Godot is open and modular, so it wouldn't be that crazy to maintain a fork.
godot is not fixable for large scale games
@@sowa705 What do you mean? It's a FOSS framework. You can do anything with it.
Seen the Road To Vostok game project, that's being made by one guy in his free time? Yeah, you can do anything with the engine.
damn, son is pissed... rightfully so...
How did videos like this not go viral in the game dev space?
EVEN AN END USER LIKE ME NOTICED THAT DEPENDING ON TEMPORAL SOLUTIONS IS BAD MAN
Because modern programmers can't actually program. There's an entire generation of programmers now who believe that putting their website inside a web wrapper is "coding a desktop application". See Discord, Spotify, Teams, etc. It's all just crap running in Chromium, requiring 1 GB of RAM just to render text on your screen. Discord literally takes longer to launch on a 5800X3D with 32 GB of RAM and an NVMe SSD than MSN Messenger did on a Core 2 Duo from 2007 on spinning rust.
I wouldn't be at all surprised to walk into a game dev studio today and see an entire room of game devs using Blueprints with one or two actual programmers with C++ knowledge desperately trying to optimise everything by themselves.
Wow, thank you for the explanation! This whole situation just hit me full blast with one of my favorite game franchises: Codemasters' rally sims like Dirt Rally and, recently, EA WRC. Since Dirt Rally 1, I have played these games exclusively in VR. In VR, all these temporal solutions that generate frames are pretty much useless; the blur makes you feel like you need glasses, and ghosting is super obvious. I used to play Dirt Rally 2 (EGO engine) with my VR headset (Quest 2) 4 years ago on a 2080 Ti with mid-high settings and 4x MSAA at 90 Hz. Not an average of 90 fps; there were no dips below 90 fps.
A while ago the same studio released EA WRC, this time using Unreal Engine 4. The game looks the same as Dirt Rally 2 (2020) at best, but often worse. It has only TAA, which cannot be turned off in-game, and it runs soooo badly. I have a 4080S and barely reach 90 fps with low settings and reduced resolution, and it stutters and dips below 90 fps no matter what. But the absolute worst thing is the horrible TAA that cannot be turned off... It's simply not playable in VR, and the whole community just wishes they had made new stages/cars for Dirt Rally 2.0 instead. Weird times ^^
Anyone who has ever played a Switch game (which typically don't use any AA) will know how crisp 1080p graphics can actually look on a 4k display...!
Mario kart on 4k TV looks really good
What's funny is a lot of those Switch games you mention are made in Unreal 😂
Funny you say this, because I've seen Digital Foundry question why Nintendo never uses any anti-aliasing in their games, and I wonder if Nintendo understands this and that's why they don't.
Nintendo may be a-holes regarding copyright on their own franchises, but at the end of the day they are probably the least scammy actor among the "titans" of the game industry.
@@Clockwork0nions They might have changed the source code
easy sub, this is so important to me. I've been complaining for years that there's no engine version or engine setting that concentrates on higher performance and sharper image quality.
Very informative stuff. I’ve played a lot of games that look like hell and never thought twice about it but honestly if I’m paying 60 bucks for a game and however much more for DLC I want to pay for higher quality. Thanks for the info.
Not only are you paying 60 and more recently 70 bucks +DLC, but when people are purchasing powerful hardware such as 9th gen consoles or modern GPUs around the $300 range, people deserve way better results considering the standard set on far lower tier hardware.
It's crazy how low a priority it seems to be in 2024 for games to have a remotely clean/crisp final image. Boot up an older game that predates TAA, run it at native 4K, and have your mind blown by how clean an image CAN look. In the race to push new features, we've sacrificed so much. With that said, I do believe ray tracing and these other features are still in their infancy, and it may be another generation or even two before fully RT-enabled games are performant enough that we can afford to boost internal resolutions and not rely EXCLUSIVELY on upscaling. Also, despite ALLLL of their issues, the quality of upscalers is going to rapidly improve.
I spent quite a while trying to figure out why new games, despite higher texture resolution, still looked worse to me than Crysis 3, even with all these fancy new lighting technologies. I finally realized it was the inclusion of TAA and the lack of MSAA.
MSAA doesn't work well with modern games even without the TAA jitter. The reason stems from pixel shaders. MSAA is expensive and looks bad today. There's a reason the last major game to use MSAA was Forza Horizon 5, and even there it didn't look great and misses many edges. FH5 didn't have TAA at launch, either.
@@crestofhonor2349 FH5 looks far better than Motorsport. That's just plain bullshit. MSAA is the best AA besides SSAA which is extremely demanding. SMAA can supplement it nicely and that combo is 10000x better looking than TAA+FXAA. No. It's pure laziness. They are using 2012 rendering tech underneath all that fancy lighting. Deferred rendering is obsolete in the face of Clustered Forward.
@@eclipsegst9419 MSAA doesn't even work properly nowadays; it doesn't anti-alias pixel-shader effects, and that's like 90% of graphical fidelity nowadays.
@@riston7264 Did you read? That's why I said it can be supplemented nicely by SMAA if you like the inner textures to be antialiased. And many of us don't, by the way.
@@eclipsegst9419 Sure, but from what I've experienced, normal SMAA barely helps and is temporally unstable.
Thanks for voicing these issues. We need way more people educated about this, and even more bugging the engine devs to hopefully improve it.
This needs 1 million views at least.
Indeed.
I didn't know how TAA worked, and it's literally been the bane of every attempt I've made to develop post-process shaders. Thank you.
Also, for a moment I thought you were a MetaHuman plus some magical settings Epic didn't tell me about.
As the video predicted, Wukong's performance was horrible: constant FPS drops and various input lag issues. Yes, the game sold a lot, but as a general tech guy, playing Wukong was painful.
Not to mention it destroys AMD GPUs by default without RT
My RTX 2050 laptop gets the same result as a desktop 6500 XT. The latter is 30% faster, btw.
I'm not a tech guy, but I do know current game devs make games calculate graphical effects for real rather than using visual tricks.
Can we go back to when games were full of visual tricks that made them look better at a lighter performance cost? Because game devs now don't even seem to bother with that.
@@3dcomrade We will never go back to the good old times, because companies keep cutting costs, and investing in an in-house game engine costs too much money. In addition, skilled game engine developers are very expensive. I tried to study game engine development, and it's no joke at all.
This video could've been 3 hours long and I would've watched the whole thing. I hope you continue making videos on this subject; I feel like you barely scratched the surface.
Well, I'm convinced. It's a very interesting issue.
But I can't see your solution getting much traction. Not because it's a bad idea, but because of the scale and scope of dealing with Unreal Engine. Forks come and go, and most people don't know they exist or that they should even use them. Best case scenario: you get hired by Epic and can make waves there, adding appropriate alternative configurations. Worst case: keep making videos like this and they will get seen by the Epic devs.
I hope you can get good funding, and even better devs. That won't be an easy feat.
I think if you can hammer home the point with even more videos like this one, Epic's devs won't be able to ignore these issues. All devs love a good "stop making it worse" video for optimizations.
I'm happy people with great technical understanding and skillsets are also upset about this issue.
Almost every use of temporal effects in new games gives me motion sickness due to the blurry appearance or ghosting, and even cranking the settings down to the lowest doesn't solve the performance issues.
One game I played recently performed worse on lowest settings without DLSS, compared to max settings with DLSS.
Things become exponentially worse when it's a game labeling itself as a competitive FPS, which should prioritize visual clarity above all.
What I find very weird about all this is that Epic Games never used to be this bad at managing Unreal Engine. Back in the old days, from Unreal 1 up to 4, they managed the engine very well and innovated a lot with new features and optimisations; they also had good games to showcase it, like Unreal Tournament, Gears of War, Infinity Blade, etc. Nowadays it seems that Epic doesn't care about Unreal Engine as much as they used to, and instead cares more about Fortnite and making it more presentable and profitable.
Thank you very much for all of this work.
This is really impactful for the community and the future of games. I hope more people get away from the mainstream echo chambers claiming that Nanite and Lumen are great and the future.
I was wondering if all those issues I found while playing around with settings in my personal projects were my fault for setting things up incorrectly, but it turns out that's just how it is.
In a society that only values profit at the expense of all other motives, doing things optimally is never the end result especially when it comes to art. Games are built lazily and cheaply not because of incompetence, but because shareholders want a return on their investment and demand that profit be the only motivation so they force management to uphold that mandate.
Welcome to the inevitable result of our capitalist society that caters everything around the wishes of the ultra wealthy.
Got my like and subscribe. I hope you will be able to change the industry for the better.
I will say this. Injecting AMD CAD with ReShade will clear up a lot of the TAA blur that happens to textures. I now do it in almost every game I play, even with DLSS. I do generally agree. UE 4 and 5 look nice but don't play nice.
ok uhhh just to be sure, did you mean AMD CAS?
Problem is, that's something the devs should already be doing in the base game, not us.
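For context on the sharpening pass being discussed: CAS-style filters sharpen strongly in low-contrast areas and back off near already-hard edges, which is why they can counteract TAA's texture blur without obvious ringing. Below is a rough single-channel NumPy sketch of that idea; it is a simplified approximation of contrast-adaptive sharpening, not AMD's exact CAS kernel.

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Simplified contrast-adaptive sharpening (approximation, not AMD's exact CAS).
    img: 2-D float array in [0, 1], single channel."""
    # pad with edge replication so every pixel has 4 cross neighbours
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]
    n = p[:-2, 1:-1]; s = p[2:, 1:-1]
    w_ = p[1:-1, :-2]; e = p[1:-1, 2:]
    mn = np.minimum.reduce([c, n, s, e, w_])
    mx = np.maximum.reduce([c, n, s, e, w_])
    # adaptive amount: small local contrast range -> strong sharpening,
    # near-clipping ranges -> amount falls toward zero
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-5), 0.0, 1.0))
    wgt = -amp * sharpness / 8.0  # negative weight on the cross neighbours
    out = (c + wgt * (n + s + e + w_)) / (1.0 + 4.0 * wgt)
    return np.clip(out, 0.0, 1.0)
```

A flat region is left unchanged (the numerator and denominator scale identically), while soft gradients get pushed apart; that adaptivity is what distinguishes this family of filters from a plain unsharp mask.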
I'm a gamedev using UE5, and this video is quite interesting.
I wouldn't call devs lazy, though. I think the business side of gamedev creates certain dynamics and pressures.
But also, as you've noted, engine companies have different motivations and value TAA differently.
You said you need crowd funding but provided no donation link?
Also - good video
We wanted to put out a first video explaining why; then we planned on doing a second video detailing our funding goal and how we intend to use it. But since people are already asking, we are setting up a funding site now.
Here is our donate page: threatinteractive.wordpress.com/donate
@@ThreatInteractive So if you did raise $900k for this, and you successfully modified the engine, then what?
Let's say you modified the UE5.4 branch successfully, but then newer versions come out with more features; your improvements would only apply to that version, so you'd have to raise funding all over again just to update?
@@cyclonebee2205 How do you know?
@@Hybred When we successfully modify the engine, we will use it for our game & make it available for other developers (there are already many who want something like this). Take Epic Games with Fortnite (income of well over 3 billion dollars): their new partners (CD Projekt Red) have abused flawed TAA just as much & barely improved the engine. They have had years to prove their worth, & the vast number of games that have used their engines have suffered for it.
portal.productboard.com/epicgames/1-unreal-engine-public-roadmap/tabs/46-unreal-engine-5-0
This is the link to their UE5 roadmap. Over the years we tested all the "improvements" & found 90% had barely any performance gains or depended on bad TAA. They have admitted their desire to phase out traditional rendering in favor of Nanite only, & recently (forums.unrealengine.com/t/lumen-gi-and-reflections-feedback-thread/501108/1727) the Lumen developers stated they will not support traditional optimization because "they use Nanite". This forceful & arrogant behavior is why developers need to form independence from Epic Games.
Even if the improvements paid for by crowdfunding don't get integrated into UE5-main, they will still matter (we're not just fixing AA; we're also going to propose other effects, like stable GI). Say we had done this with 4.27: there would only be so much we would need to bring over from newer versions. If by some off chance they do improve newer versions of UE5 with genuinely game-related technology (like the motion matching tech), that is something we'll consider updating our version to include.
We would not have to raise money to integrate our improvements to newer engines since by the time Epic adds anything useful we'll be independent of funding through our own game (assuming our game is successful).
I don't even hate TAA, but I almost never see developers get the most fundamental basics of the technique right, like masking out absolutely everything that is moving or animated.
You don't want to accumulate those; that's what gives you the awful ghosting artefacts.
And I hate the way modern games do hair and transparency... an ultimatum between a) everything is blurry or b) everyone looks like they have leukaemia and are in chemo.
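The masking idea above can be shown in a toy single-channel resolve. This is illustrative only; real TAA resolves also reproject the history buffer with motion vectors and clamp it against a neighbourhood colour box, none of which is modelled here.

```python
import numpy as np

def taa_resolve(history, current, motion_mask, alpha=0.1):
    """Toy TAA resolve: exponential blend of the accumulated history with the
    current frame, plus a rejection mask for moving/animated pixels.
    history, current: float arrays of the same shape; motion_mask: bool array,
    True where history should be discarded."""
    # exponential accumulation: each frame contributes only alpha of new colour
    blended = alpha * current + (1.0 - alpha) * history
    # where the mask flags motion, fall back to the current frame entirely
    # so stale colour cannot smear into ghost trails
    return np.where(motion_mask, current, blended)
```

With history = 1.0 and current = 0.0, the unmasked pixel resolves to 0.9 (90% of the stale bright colour lingers, which is exactly the ghost trail), while the masked pixel snaps straight to 0.0.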
TAA is the reason why I need glasses nowadays. Ever since Control released, every single stupid game has had it enabled by default. And the worst part? It's tied to the EFFECTS slider. And you know what AA and AAA games NEED TO HAVE to look REAL? FREAKING AO, and they bundle BOTH together, so you CANNOT decide between ONE or the OTHER; it's BOTH OR IT'S NONE!!!
It's so absurdly dogshit, especially for people who don't have rigs that can even tank the game at 1080p...
We've just gone back to how Castlevania on the Game Boy rendered its translucent backgrounds. Gotta love the classics.
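For anyone curious, the classic trick being referenced is "screen-door" transparency: draw the translucent layer only on a checkerboard of pixels and flip the pattern every frame, letting the slow LCD response (or modern TAA) blend the two layers into a fake 50% alpha. A toy sketch of the coverage rule (the function name is my own, not from any actual game code):

```python
def screen_door_visible(x, y, frame):
    """50% screen-door transparency: the translucent layer covers only the
    pixels whose checkerboard parity matches, and the parity flips each frame
    so both layers get shown at every pixel over a 2-frame cycle."""
    return (x + y + frame) % 2 == 0
```

Over any two consecutive frames, each pixel shows the translucent layer exactly once and the background exactly once, which is what averages out to the fake 50% blend.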