"I don't think the average gamer will notice" I think the average gamer notices, but doesn't have the ability to communicate exactly what the issue is.
@@GeneralPretzle Which is intentional, one must presume, given how both console and graphics card manufacturers want you to always be buying the newest stuff.
Yeah, but most gamers don't notice how disconnected rasterization makes everything look. RT and a lot of these new methods fix that, but at a cost. It's kind of a transitional phase.
It’s so crazy how people were recommending expensive 4k tvs with hdmi 2.1 bandwidth to take advantage of 4k120hz when the 9th gen consoles launched. Years later and your average unreal engine 5 title is running at 900p 30fps on these machines lmao.
But then you get lower resolution textures and you golems will start posting "LOOKS LIKE PS3 LOLOLOL" in trailers of games, the developers can't win. You morons brought this on yourselves with your obsession with graphics.
@user-hc1oi2bq8u I just learned the reason the Wii was blurry wasn't that the console wasn't good, it was that you need to use component cables instead of composite cables. It also uses interlacing and an anti-flicker filter (designed for CRT TVs) by default, and once you disable those through mods the console looks way crisper and responds better.
Last time I loaded Dragon's Dogma: Dark Arisen, I went "shieet, is this 1080p?" while checking my settings, because it looked really sharp. New games don't look that sharp even at 1440p.
People in the chat were like "yeah but 4ms is nothing!", and it really highlights just how out of touch gamers are and why they don't raise their voices against these types of issues. To run a game at 60 frames per second, you need to compute *EVERYTHING* in about 16 milliseconds. By turning on a feature that alone takes 4ms, you leave only 12ms for *EVERYTHING* else: game logic, physics, the whole rendering process, literally everything. 4ms is roughly 1/4 of your 16ms budget if you want a game to run at 60fps. On paper 4ms doesn't sound like a lot, but eating a quarter of the frame budget is MASSIVE.
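The budget arithmetic in this comment is easy to verify yourself. A quick sketch (plain Python, using only the numbers quoted above; the 4 ms cost is the hypothetical feature being discussed):

```python
def fps_after_extra_cost(base_fps: float, extra_ms: float) -> float:
    """Frame rate after adding a fixed per-frame cost in milliseconds."""
    base_frame_ms = 1000.0 / base_fps   # 60 fps -> ~16.67 ms budget
    return 1000.0 / (base_frame_ms + extra_ms)

# A 4 ms feature eats about a quarter of a 60 fps frame budget...
print(round(4 / (1000 / 60) * 100))          # -> 24 (percent of the budget)
# ...and drags a locked 60 fps down to roughly 48 fps:
print(round(fps_after_extra_cost(60, 4), 1))  # -> 48.4
```

So the "60 to 50 fps" figure quoted elsewhere in the thread is actually slightly optimistic: a flat 4 ms cost on top of a full 16.7 ms frame lands closer to 48 fps.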
Yeah, a lot of people aren't educated enough to know, and then there are people who think they're educated and become obstinate against anything that conflicts with what they think they know. I've watched videos from this guy Threat Interactive. He explains things quite clearly, shows his homework, and writes it all down so you can follow the process. One of my favorite videos of his was another one kind of crapping on Unreal Engine, where he showed how unoptimized things were: there was this pre-built area with the worst light mapping possible, overlapping so much that it dragged your frames down to pretty much nothing. He went and fixed it, made it look better using some old-school techniques, and suddenly instead of 13 frames a second he was in the hundreds.
One cannot expect the general public to understand that 4ms in rendering is a lot. TI gave an example in the video, saying those 4ms were the difference between 60 and 50 fps. We need people who can explain these pitfalls of the current industry in a simplified, easy-to-understand manner, and have that get actual attention.
We're talking about the same people that spent 90% of the video talking about how the Threat Interactive developer looks weird and is mad. Don't expect them to be smart enough to understand anything when that's all they can talk about.
The following are non-negotiable: 1. 60fps min framerate, no exceptions. 2. Zero stutter. 3. Absolutely no ghosty/smeary/noisy upscaler. I refuse to believe we can't have these today. We HAD these 10 years ago!
Play at low settings then if you want all of that. 10 years ago we didn't have the graphics we have today. 60 fps max at 1080p, baked lighting, lower res textures, etc.
@@agusr32 Baked lighting looks better than real-time RT, because real-time RT is so expensive that it has to be downgraded heavily just to run each frame. Baked lighting is also ray tracing, but computed offline, which means the quality can be much better since there's no per-frame time budget.
@@LtdJorge Yeah, but you need static environments, and you have to store a ton of lightmap data. In open-world games with day-night cycles plus weather effects, baking lights is not ideal. Also, I don't think it's easy to maintain consistent lighting resolution across meshes of different scales: you might bake lighting at coarse resolution for buildings and terrain, then rely on cheap but non-realistic methods to light smaller objects and moving meshes.
No, they really aren't. Most of the time, DLSS at 1440p is near indistinguishable from native resolution and uses less VRAM to do the same thing. That's a win for the gamer.
@@Krypto1211 except you need an Nvidia card for DLSS. Not a win for the gamer when you’re forced to pay Nvidia prices. The gaming industry is using a proprietary feature as a crutch that not everyone has access to (console gamers for instance). FSR has abysmal ghosting, even XeSS looks better than that.
@@kruz3d573 Source? Some random guy on twitter said so. I am deadass. Gamers thinking they know how games are done is still one of the funniest shit there is.
@lopesmorrenofim There are multiple videos covering and explaining the mass exodus at CDPR. If you actually want to know more, watch Dr. Disaster's video on The Witcher 4.
I've heard similar things somewhere. The Witcher 3 team is very different from The Witcher 4 team, and the reason they use UE is that literally anyone can get it and learn it themselves, without any cost. You can hire way, way more people and don't have to teach them your own engine before they can get any work done. It's a logical step, but CDPR has already noticed how crazy bad UE5 is for open-world games. I've heard they gave an entire game developers conference talk about how the engine literally runs worse the more assets are used in open-world games, so they have to heavily modify the engine with Epic's help.
The fact that the other developers at 30:25 are screaming for a solution and Epic isn't doing anything is a classic example of why relying on another vendor for your tech stack can suck. If you write your own stuff, at least you can come up with better solutions, as opposed to just sitting around waiting for Epic to do something. I expect this trend of Epic not listening to developers to continue as they sway more studios to switch to Unreal. To them, it's not worth fixing if they're banking on those licenses and fees from studios.
Unreal Engine is proprietary but source-available. In other words, any company using it is free to change the code however they see fit, if they have the know-how. They either don't want to, or have possibly forced the more experienced devs out of the company to save money. Also, it appears to me Epic is trying to steer UE into being less of a game engine and more of a tool for visual effects/cinematography. So the UE devs are doing plenty; they've developed probably the most advanced game engine to date, but the AAA game developers (and some indies) are trying to maximize profits rather than make good games.
I enjoyed playing Jedi Survivor and 100% it, but it did not need to be 135 gb when fallen order was 45 gb I think. I’m really not a fan of single player or even multiplayer games being over 100 gb.
Great game ... I just played it on PS4 lol... They had to drop resolution to get it to run well but I enjoyed it .. though the ending wasn't that great
I was surprised how crisp that game looked, and that was just watching gameplay on YouTube... Then we get mullet order recently, which runs worse and looks like PS3.
Unless it's a massive solar-system sized game, you're right, they do not need to be over 100gb. No Man's Sky puts a lot of these games to shame, looking a lot better (at least aesthetically) and being much larger. Even Star Citizen is only 125gb -- the fact Call of Duty games are twice or three times that size and nowhere near as big is just embarrassing.
Sony and Microsoft: Our new console will handle 4K at 120fps for real next-gen graphics! Eight years later... Sony and Microsoft: Our new consoles will actually handle 4K at 120fps for REAL next-gen graphics! Eight years later... Sony and Microsoft: Real, native 4K at 120fps doesn't matter. It's all about our amazing AI processing for REAL next-gen graphics!
I truly feel that devs are just going to keep doing this more and more until it becomes the norm. It’s so crazy how most modern triple A games genuinely look worse than games made 10 years ago because of all this fuzzy frame generation bs.
They will do whatever enough people throw Top Dollar at them to do. I don't want to hear people complaining about the very things they are funding with their own money.
I don’t know if it’s even sustainable, honestly. The more insane hardware requirements get, the smaller the potential playerbase. Not everyone is going to upgrade hardware at the rate the requirements grow.
The question people should be asking is why development costs have skyrocketed in recent years. Game quality and creativity hasn’t increased commensurate with increased costs. I would also caution anyone to argue that costs increased because of an increase in talent. “Talent” implies creativity, productivity, and/or innovation, qualities that aren’t readily apparent in general these days.
@@Khrist75 Total budget of Spider-Man 2: $315 million; Spider-Man 1: $100 million. As far as I found in research, those are the numbers: roughly a 215% increase, i.e. 3.15x the original cost of production.
The diminishing returns graph applies to graphics but not really to performance. I don't expect a big leap in visuals from the ps4 generation to the ps5, but I do expect a much larger leap in performance, especially in the CONSISTENCY of performance. Why are so many ps5 games still only "targeting" 60 fps but not always managing to hit it? And why do they need dynamic resolution scaling to be able to hit that fps target? This was understandably an issue for last gen, but we should be past this now.
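For anyone wondering what "dynamic resolution scaling" actually does mechanically: the engine watches recent GPU frame times and nudges the internal render resolution toward whatever hits the frame-time target. This is a toy sketch of that idea under a crude pixel-count cost model, not any real engine's controller:

```python
def drs_step(scale: float, gpu_ms: float, target_ms: float = 16.7,
             lo: float = 0.5, hi: float = 1.0) -> float:
    """Nudge the resolution scale toward the frame-time target.

    Assumes GPU cost is roughly proportional to pixel count (scale**2),
    which is a crude but common first-order model.
    """
    # Estimate cost at full scale, then solve for the scale hitting target.
    cost_at_full = gpu_ms / (scale ** 2)
    ideal = (target_ms / cost_at_full) ** 0.5
    # Move only part of the way there to avoid oscillating every frame.
    new_scale = scale + 0.25 * (ideal - scale)
    return max(lo, min(hi, new_scale))

# A 20 ms frame at full resolution: scale drops below 1.0
print(round(drs_step(1.0, 20.0), 3))  # -> 0.978
# A 10 ms frame: already under budget, stays clamped at full res
print(drs_step(1.0, 10.0))            # -> 1.0
```

The complaint in the comment is essentially that consoles lean on this feedback loop routinely rather than as a last resort.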
Last gen most games were 30fps, and often a shaky 30fps at that. This gen most games hold a much more stable rate, and all but a handful have 60fps options on console... that IS a big leap.
@@themightyant. This is patently ridiculous. The consoles aren't twice as fast, they're 6.5 times as fast. Try gaming on PCs: compare even a top-of-the-line one from 2013 to a low-end one from 2020, and the 2020 machine will run those games at over 200 fps where the 2013 PC runs at 40.
@@themightyant. The current gen consoles run on hardware that's more than capable of running these games consistently at high frame rates. PCs with similar specs already do.
Unfortunately, there's plenty of room for devs to add more graphics (not much you'll notice but still costly in performance) so I don't think they'll start targeting 60 yet
A year ago I got a gaming PC and have been playing on a 1440p monitor at 1440p, and I began seeing things that made me think my graphics card was acting up. But no, it's just part of the experience now.
Going back and seeing games like Detroit: Become Human or Uncharted 4 running on a base PS4 (hell, they even unveiled and launched TLoU Part 2 on it first), it makes no sense that so many games are uglier and run terribly, not just on "current gen" hardware but, in some cases depending on the game, on high-end PCs far more powerful than those machines. Upscaling was sold as the savior of old hardware; instead it's being used as a crutch to make stuff playable on new hardware. Another thing is tech like Lumen being pushed much earlier than it should be, when UE5 can't even handle streaming in worlds without stuttering, and the fact that studios are having to find a fix for it now instead of Epic themselves is an absolute joke. EDIT: Also, I don't know if I'm wrong or not, but haven't we had a cost-saving technology for geometry called tessellation for nearly two decades now, which devs seemingly forgot existed? Instead you either get terrible pop-in or, in the case of UE5, Nanite, which is part of the stutter problem.
And now FF7R part 2 won't be able to run at all on my RX 5700 XT because it doesn't have RT cores, despite the game looking much, much uglier than part 1 or the Crisis Core remake.
I never noticed all the issues with TAA until my buddy pointed them out to me while I was playing one day, and now I can't stop seeing them everywhere!
@RusticRonnie Understandable! I think we don't properly notice TAA's side effects while actually playing games, compared to rewatching our recorded gameplay though. Edit: Plus, I prefer TSR since it looks better and runs smoother than TAA (its side effects are rarer), but sadly it's only usable in UE5 games... I don't know why they talked about TAA more than TSR though.
18:43 No, rather it's the fact that a lot of the "gamers" are post-Cyberpunk gamers; they haven't seen the days when everything was clear. Just think about it: vegetation in The Division 2 is the same as in The Last of Us Part 1, but the hardware requirement is so much higher. Why?
The Upscalers have become crutches, look at UE5 games, they struggle on even High end rigs, you can barely get 60fps on all ultra with a 4090 in some games, but look at Indiana Jones, it’s beautifully optimized where you can run the game on a Laptop RX 6700 at 1440p low settings at 60-70fps
I've been playing Life is Strange: Double Exposure lately, and that title convinced me that UE5 is just a tool to make games more quickly with less effort, at least most of the time. Obviously it looks much better than earlier Life is Strange games, but it still has very simple graphics by modern standards. And guess what? It still runs like absolute ass. I'm running it with an overclocked RX 6900 XT and it can't even maintain 50 FPS on high settings at 1440p, let alone the highest "cinematic" settings. After the SH2 remake, this is the second game where performance is somehow even worse indoors than outdoors. How on God's green Earth can games like HFW, The Last of Us remake, Plague Tale: Requiem, and many more run miles better with way higher fidelity? Even another UE5 title, Lords of the Fallen, has much better performance.
@@Rexperto6454 I hate most Unreal Engine 5 games. I was surprised, though, with how well Chivalry 2 is optimized on it. And Codemasters seem to have improved the release of their WRC game. It's really not good for open-world games, though.
This problem now has developers creating games that rely on upscalers as standard practice, rather than upscalers being tools to enhance gamers' systems. That's really bad.
Threat Interactive is tilting a lot of folks. And even the guys at DF are REFUSING to just talk about the subject. Why? The number of people exposed is incredible... It's true that publishers are largely to blame, but let's not ignore the fact that the lower echelons are also showing disproportionate indignation at the truth. That means something...
Because he doesn't know what he is talking about. There were multiple actually professional developers debunking his claims. He has surface level understanding of Nanite, Lumen, LODs and other tech, and in many cases his conclusions are simply wrong.
I'm gonna say right now, everything Threat Interactive says has truth to it... but at the same time there are a lot of lies by omission, and he's generally not telling the whole story. Let's talk about TAA, something Threat Interactive has covered a lot because it makes games blurry and has artifacts compared to older AA methods. He has conveniently not mentioned WHY we stopped using the older methods. SMAA, MSAA, and SSAA are bloody expensive compared to TAA, and SMAA and MSAA don't necessarily have the best results with deferred rendering. FXAA was brought about to deal with the fact that older techniques didn't play nice with deferred rendering, but it had MORE ISSUES THAN TAA. We need a new solution better than TAA, yes. Game devs absolutely over-rely on it and abuse the hell out of it. But older techniques are not the answer. I hope Threat Interactive helps push us into a better-optimized era, because it's awful right now... but take everything said there with a HEAVY dose of salt, because it's hella myopic.
It was, indeed. I was kinda surprised they managed to make that game look that good while running on the PS4's hardware. The only thing I didn't like about that game was that they made it too on-rails.
@@jamesFX3 Ironically, I thought it being so linear would be a con. Replayed it recently, and it was nice to have such a straightforward game with good atmosphere.
"CDPR has competent people who worked on the red engine for years and can tell Epic what needs to be changed." Bro, those competent people (a bunch of directors included) are long gone, most of them even before cyberpunk released, and especially now with several dozen more leaving CDPR to make two more studios of their own. Why do you think they are switching to unreal in the first place? In conclusion, CDPR is cooked and UE5 is still trash.
As a born PC player, on console I always go with 60fps performance mode to reduce risk of the controller physically entering the game via the display device
Is this an AI, or Google Translate? In any case, I assume you mean reduce input latency? Because the stuff you wrote doesn't really read like coherent English.
@@bazzy5644 Well, you just lack basic reading comprehension and the ability to contextualize what you read. What he said is: "as an originally PC master race gigachad, I always set my console games to 60fps performance mode, because anything lower makes me so disgusted and angry that I will throw the gamepad at the TV if it does the thing again".
@@bazzy5644 Remember the times when "shit hit the fan" could be rephrased as "excrement" and "rotary device", and everyone would get the joke because they passed elementary school? Good times.
Epic has always put developers first and gamers last. Incentivizing developers to EGS with a lower cut when none of that matters to the regular gamer because it will be the same price anyway but now you have to use their store and can't use cool features like workshop, remote play, community controller presets etc. Same with the engine, they make it easier and less costly to make games for developers but gamers will just have to upgrade to the latest 5090ti super if you want playable framerates.
The options Unreal Engine gives us as developers are absolutely hard-shafted in one direction or the other: you either give players a decent experience and completely screw yourself as a developer, or you give yourself as a developer a great experience and completely screw half your potential playerbase (and the other half has to deal with horrible temporal smearing and/or upscaling from sub-1080p resolutions). There's no middle ground. It's infuriating because now with so many horrible AAA releases with horrible performance, quality matters more than ever, but with Unreal (which is what I've been using for the past few years) it's very difficult to deliver on the performance side. I'm experienced in programming so building a custom engine is hypothetically within my reach, but still far from easy, not to mention having to rebuild all my tools from scratch, setting back my ship date by a few years, and crowdfund the ability to work on it full time and hire a graphics programmer for assistance (which I absolutely can't, I'd never be lucky enough to get the popularity needed to raise that much money). So while hypothetically possible, it's definitely not realistic. All in all it's really not a good situation. I'll do what I can with what I have though.
@@CactousMan Brother, without Unreal Engine most indies couldn't make a game. Unreal has the best deal and tools for hobbyists and indies. They offer cheaper games and actually advance gaming. It's like the hate Nvidia gets when they're actively developing technologies that improve stuff for gamers. AMD is SLIGHTLY cheaper but just piggybacks off Nvidia tech and is always years behind. Steam basically fucked off with all their money; nothing is released on the cheap. Where is Source 2 for developers? More gambling simulators for Valve games and overpriced goods like the Index and Steam Deck.
@@ramsaybolton9151 Yeah, UE5 has some advantages in that case, but that's no excuse for its bad performance and lack of better optimization options, from such a wealthy company with more than enough resources to improve their software.
We all jumped from 1080p to 4K way too fast without waiting for hardware to catch up. For some reason 1440p just didn't roll off the tongue for casual consumers, and 4K was the MAGA of resolution names. We needed at least 10 years of peaceful 1440p gaming AND TV before even thinking about moving to 4K. Now everything is messed up and no one knows where to start fixing it all.
Also, people don't understand that entry-level GPU (RTX x060) isn't meant for native 1440p, it's for 1080p with DLSS. And high-end GPU (RTX x090) is meant for 4K@60hz with DLSS Quality.
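For context on what those upscaler modes render internally: DLSS "Quality" is commonly cited as rendering at roughly 2/3 of the output resolution per axis and "Performance" at 1/2 (exact ratios can vary by version, so treat these as approximations):

```python
def internal_res(out_w: int, out_h: int, axis_scale: float) -> tuple[int, int]:
    """Internal render resolution for a given per-axis upscaler ratio."""
    return round(out_w * axis_scale), round(out_h * axis_scale)

# 4K output in Quality mode (~2/3 per axis) renders near 1440p:
print(internal_res(3840, 2160, 2 / 3))  # -> (2560, 1440)
# 1440p output in Performance mode renders at 720p:
print(internal_res(2560, 1440, 0.5))    # -> (1280, 720)
```

This is why "4K with DLSS" and "native 4K" are very different workloads: the GPU is shading roughly half (Quality) to a quarter (Performance) of the output pixels.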
Nah - that's blaming people for upgrading, and hardware for not being strong enough But the point made is, that there is too little effort put into optimizing the game itself. You can put your million poly models into the game and expect Nanite to figure out how to render the stuff - or you put in the effort as a dev and use LoD and/or reduce the polygon amount at least somewhat. Hardware does not get stronger just so devs can be lazier!
@@Hyp3rSon1X Plebs think optimization means high framerate. It doesn't; optimization means high framerate without compromising visual quality. LODs compromise visual quality because you notice the switch between LOD levels, while Nanite always shows the most optimal geometry. You only get pop-in with Nanite when your PC can't handle streaming the content.
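To make the LOD-switching point being debated concrete: classic discrete LOD picks one of a few pre-made meshes by camera distance, and the visible "pop" happens exactly at those thresholds, whereas Nanite varies detail continuously. A minimal sketch with made-up distance thresholds:

```python
def pick_lod(distance_m: float, thresholds=(25.0, 75.0, 200.0)) -> int:
    """Return the LOD index for a camera distance.

    LOD 0 is the full-detail mesh; each higher index is a coarser one.
    The sudden jump between indices at a threshold is the 'pop' players see.
    """
    for lod, limit in enumerate(thresholds):
        if distance_m < limit:
            return lod
    return len(thresholds)  # coarsest mesh beyond the last threshold

print([pick_lod(d) for d in (10, 50, 150, 500)])  # -> [0, 1, 2, 3]
```

Careful threshold placement (plus cross-fading) is exactly the hand-tuning work that Nanite promises to eliminate, which is the trade-off this sub-thread is arguing about.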
I think this is a miss. The problem isn't waiting for hardware to catch up; more often than not, it is the design and implementation of the game/systems. And I know this is the problem because it filters all the way down the "tech tree". I'm not a higher-end gamer: I don't play graphically demanding games, and I don't play on a high-end computer. But I have still noticed that many of my newer games run _worse_ than comparable older games. Edit: fixed typos
@@chillin5703 Which games? Name old games that are comparable with the new ones. In most cases, they are less detailed or more static than new games.
The video seems to imply that one of the solutions is baked lighting. Baked lighting takes up a lot of space so it's not something we're going to go back to for huge games. It was largely not seen as a problem going into the PS3 generation because we suddenly had so much space. Jak 3 was 3.25 GB, The Last of Us was 34.55GB. 5GB of baked lighting is no problem at this point, but what happens when you want 4 times the fidelity on the baked lighting and also the map size is 5 times bigger. Suddenly the baked lighting alone is taking up 100GB
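The storage argument here is simple multiplicative scaling and is easy to sanity-check with the comment's own numbers (5 GB of lightmaps, 4x the fidelity, 5x the map area):

```python
def baked_lighting_gb(base_gb: float, fidelity_mult: float, area_mult: float) -> float:
    """Rough storage estimate: baked-lighting data scales with both
    lightmap fidelity (texels per surface) and total surface area."""
    return base_gb * fidelity_mult * area_mult

# 5 GB of lightmaps, 4x the fidelity, a 5x bigger map:
print(baked_lighting_gb(5.0, 4, 5))  # -> 100.0
```

The model is deliberately crude (it ignores compression and the fact that fidelity can scale per-axis), but it captures why baked lighting that was negligible on PS3-era maps becomes a real storage problem at modern scales.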
@@Callsign_Sturm The game is 67GB, so compared to The Last of Us on the PS3 it's huge. The foliage shadows are done in screen space, which is part of why they're so imprecise. The rest of the shadows are really low resolution as well. It was 30fps capped on the PS4, and it's a UE game, which means it also uses TAA. It's a nice-looking game, even if it looks a bit dated at this point, but it's also an example of the type of game this video would complain about.
@@TheDuzx If you see Threat Interactive’s videos on Days Gone you can see it’s a very well optimized game. Given it’s open world with 200+ enemy hordes coming after you without any noticeable frame drops the game is very impressive.
@@theatheistbear3117 I find that strange, as the game had most of the "issues" highlighted in this video when it launched on the PS4, and it still has a few of them. The experience is way better on the PS5, not because the game was optimized further, but because the PS5 is just more powerful. The same will be true of most PS5 games they complain about today when we play them on the PS6.
Series X is a MASSIVE LEAP over the One X; new games aren't even playable on the One X in most cases! Go play any new COD on a last-generation console. They shouldn't even be allowed to sell it for last gen, it's that bad!
As an ordinary gamer who doesn't know anything about this stuff, I was appalled at the MHWilds PS5 visuals during the beta they ran. It genuinely looks and runs worse than MHW (a 6-year-old game, btw). I'm not sure if it's due to the things this video illustrates or just because it isn't finished yet. I'm praying for the latter.
18:30 People do pick up on it; they just register it as "something is off" without the ability to articulate what exactly is off. It can throw people off and they won't even know it.
In the '90s, game programmers needed to be creative to find ways to reduce the computational power required to render each frame. Now they ignore optimization at every step of the pipeline and just rely on having enough compute cores to do the math.
I'm glad you are sharing this video. I've seen it a few days back and despite the somewhat goofy technical execution of the video (sound and image quality) the contents are super interesting and very on point, factual. Happy to see this gain more and more spotlight.
AC Unity has such impressive-looking lighting because they baked everything and fixed it at one specific time of day; they couldn't figure out a way to make a day-night cycle look as pretty. Obviously the textures and models leave a lot to be desired, but in the interiors it still looks as good as or better than current games on half the hardware. I wish they had this type of non-temporal bullshit for STATIC maps.
@@Gaming_Legend2 And they continued to develop their lightprobe-based approach after that. It's not as precise as actual RT, of course, but it's an absolutely great compromise between visuals and performance. Now we have certain people praising RT, like: "everything is ray traced, wow". But it's some game again with static environments and mostly static lighting. Latest Indiana Jones is a great example. Sure, the tech is impressive, but at the end of the day I've seen comparable games with lightmaps, light probes, etc. with much better performance and only slightly worse visuals. I don't care if it's not "real-time all the time". All I care about is how the final games plays and looks.
The pure irony is that more tools were introduced to help assist and speed up development, but all they did was actually hurt it instead. You have games from 10 years ago that hold up much better visually than games today, because everything was worked on organically, by hand. Now there are so many in-engine tools made to speed up the process that just ended up costing them in the long run. Sure, doing one thing with one tool might help the product finish faster now, but it ends up breaking or causing so many issues down the line after release that it's almost pointless, because you then spend the post-development time trying to fix it. That completely negates all those shortcuts. All these tools are really doing is bloating the file sizes of games.
Agreed. People keep glazing UE5 for its easy accessibility for devs but overlook the clear problem: it allows laziness and over-reliance on a game engine designed for a wide range of uses, instead of dedicated engines that specialize in an area/genre.
It's not useless in the end, wtf are you talking about???? They get your money into their pockets faster, through ea, and by not having to pay 500 employees but 100, cuz they put out slop QUICKLY rather than slowly.
@zenoomorph it's not misinformation, it's a fact. No one's glazing EA or any other company. Most games these days are unoptimised garbage regardless of if they use UE5 because they all use the same shitty solutions. I'm of the opinion that studios should use their own in house engines.
@@SilverSpectre266 False. The bloated file sizes of games come from baked lighting, which many games still have as a fallback. RTGI doesn't need lightmaps; the file sizes are big because of 4K textures.
In the PlayStation 1, 2, and 3 eras, though, it felt like the attitude was to squeeze as much as you could out of the resources you had, whereas now the attitude seems to be to brute-force with the resources you've got and not even optimize the software at all. I feel like that has got to have an impact on these diminishing returns.
Also, producers put more effort into bogging the whole game down with feature creep instead of getting the best possible output from the feature set they already have.
Guys, don't believe everything on the internet. Threat Interactive is asking for $900k to make his own no-blur TAA for UE5, with no info in his crowdfunding about his previous experience. UE5's source is available, and you can modify it however you like to remove bloat in games. And one more thing to consider: why haven't more accomplished studios with bottomless money achieved this by now, instead of relying on traditional optimization plus DLSS and FSR?
I'm now used to not owning games, game publishers are going to have get used to the fact that I'm not paying $70+ for a broken/unoptimized GAME RENTAL!!
I played Mad Max two days ago and let me tell you, I was still in awe at how the graphics hold up and at the visual style of the game. I think art style matters more than photorealistic graphics.
20:33 This whole Nanite discussion is so frustrating because he doesn't understand the premise of Nanite in the first place. Yes, Epic is selling it as a silver bullet, which it isn't. But it opens up the possibility of insanely geometry-rich worlds that weren't possible before. It has a ton of issues, sure, like the relatively large base cost, but it was not designed to make geometry free; it was designed to allow for MORE geometry. All his tests are based on the idea that Nanite will make geometry cheap, and not a single test he's done is in a real-world game level, only ever small test areas.
It's like we went full circle, from using dithering to simulate blur and shade on old CRTs, to using dithering again to simulate blur and shade on modern TVs because of TAA 🤣
I think it would help if he were a bit more matter-of-fact about presenting his findings; he gets progressively more worked up with every new video and it's not helping his case. Given the numerous UE5 games with weird issues and the lack of shutdowns from other devs (that I've seen), I'm inclined to believe him, but I'm just a layman. I'm also on the side of reducing game budgets and having more reasonable hardware targets, and it feels like UE5 is in the way of that. I first noticed what devs were doing with TAA in 2018 with The Crew 2; that was the first game I played that forced you to use TAA, so naturally I wanted to fix the blurry image by turning it off. I did the config edit or whatever it was, and parts of the graphics broke, like shadows and reflections; they turned into a glitchy, grainy mess. Shortly after, they released a patch which stopped you from being able to remove TAA, which, while extreme, is fine because the game looked horrible without it. But their insanely strong use of it made the image blurry and in some cases warbly as the effects and some geometry like trees tried to reconstruct themselves. I also noticed it in the RE2 remake, except the option to turn off TAA was in the settings, and turning it off caused Leon's hair to become a glitchy mess. The fact that this is STILL what it's being used for is ridiculous.
I was a layman, and now I'm an indie dev. It's not just Epic's fault; you would be amazed how many AAA studios don't do the basics of optimization. They just throw garbage meshes in and think Nanite will solve it. Sometimes they don't even use hardware Lumen just because it's off by default; you just need to click a checkbox. It's beyond stupid.
@@mrxcs I remember when UE5 was first shown off and Nanite was presented as if you could stick original million polygon models in the game and it would just sort it out for you :|
@@cikame I left a comment on Threat Interactive's original video noting that his content would perhaps be better received if he were less "aggressive" in his presentation. There's an air of indignance that I find off-putting, and it lessens the value of the message he's trying to put out. If his studio's efforts yield the benefits he suggests, then let the work speak for itself. Beating your chest is a waste of energy.
@@cikame The likely reason he's getting worked up is that a lot of people who know more about game development tech have been telling him he's been wrong on multiple occasions about the stuff he's trying to present, and he can't handle the criticism levied against him. He also had a track record of incorrect takes before he made his vids.
What's wild with the PS5 is that all of its best-looking games are ones that also had to be optimized for the PS4 to ensure sales and were just patched for PS5, while the actual PS5 exclusives have arguably less visual detail and unplayable performance modes.
In an era where graphics have plateaued, why would anyone buy a new console or a new GPU? Oh right, it's because the games run worse while looking the same! Big companies can't make money selling hardware if what you already have can run every game out there.
My favorite game in decades has been Borderlands 2 and because of the art style it still looks pretty damn good, if they slapped a new coat of paint on it and added BL3's movement mechanics I'd pay full price for it
I like this video from Threat Interactive, but I have some doubts. I think he takes this topic way too lightly. And to say that Digital Foundry is spreading misleading information is a hefty claim. TAA was invented because all the other AA methods caused much bigger problems. I'm from the older days, and no one seems to remember how awful FXAA was, or how insanely performance-hungry MSAA was.
My friends worked at game companies in the early 2000s. They mentioned that back then the bar for game devs was really high; companies even had dedicated optimization teams full of code wizards. Back then, devs often used their in-house engines or had a deep understanding of the third-party engines and the code. Thanks to UE3 and UE4, the bar for game devs dropped significantly, but now most devs have no real understanding of the engine or code they work with; they've become more like assembly-line workers. Compare their work to old-school devs like Valve and you'll see huge differences, but when I bring that up with younger gens, they just claim it's an art-style thing, which makes me wonder how their eyes or brains got damaged beyond repair.
@@cloverraven yeah. When he started talking about the DF video as though it was sucking off TAA, I knew this kid likely wasn't worth listening to since he was incapable of presenting facts that I knew to be true, so wouldn't be able to trust that he's not misrepresenting anything else he says. No matter what else he says, whether or not it's accurate on an individual point, I can't trust his conclusions in the gestalt.
There's a massive post on the UE5 subreddit responding to his most recent video (just look up threat interactive on UE5 subreddit if you're interested) Turns out the 'optimizations' he presented in the video are just standard practice when developing a game in UE5. He has no clue how to use Unreal Engine correctly so his results are inaccurate at best and deliberately misleading at worst. When I mentioned this response in the comments section of his video he removed my comment. Pretty much says all that needs to be said about this grifter. I'm surprised people are falling for this crap. He labelled Digital Foundry's TAA video as 'Ignorant' when Alex clearly pointed out there are benefits and issues with TAA and even said it should be an option to toggle off. He's clearly not arguing in good faith so DF shouldn't bother responding
@@vast9467 he misrepresents the content of the Digital Foundry video. Don't know why you jump straight to the word "lie" when I'm only saying his misrepresentation makes him a source I cannot trust, given the very basic nature of the facts being misrepresented, and the triviality of checking the claims he makes about what Digital Foundry says. Other stuff could be argued as differences of opinion based on different areas of expertise, but since I can't trust him not to misrepresent what DF say (by ignoring the existence of large parts of that video) then I can't trust the other things he says, since I can't tell if he may be misrepresenting things there too in ways that I cannot validate. As to the lie aspect: I don't go into his motivations. He may be blinded by his biases, or might just not know what he's talking about. It's also possible that he's just a liar. I don't have any insight into that, so I'm not going to speculate on why, nor cast aspersions on his character. You should probably ask yourself why you jump straight to the conclusion that my judgement of his reliability as a technical expert is a judgment on his character, and double check that it's not supporting your own biases that lead you to do so.
Happy to see people talking about these (in my honest opinion) problems with modern graphics in AAA games. A game can be realistic while presenting those graphics in different ways... and it needs optimization, not upscaling. The vision should not hinder the execution from being well made.
Definitely just the one guy. He always says “his studio” but there’s no external funding and no other team members. And no game or preview of a game. I think he makes valid critiques of TAA and unreal but also misses some critical trade offs. It’s a very surface level and one-sided overview with lots of spin. The “our studio” spiel is the cherry on top 😂
@@SoulbentAnime No, seriously, the guy has the Dunning-Kruger effect. Thing is, he hasn't made anything, not a game or a tech demo; literally all he does is run Intel's analysis tool on a game and complain. His optimization video was really shallow, only showing an insanely badly optimized scene that he easily improved with a few super simple, very basic techniques. But that really works on people who don't understand this stuff; they just see 2x better fps. If he made a different rendering engine, or a rendering plugin that sped up rendering, I'd have a different opinion. Yeah, TAA has its trade-offs, but if it were really that bad, the whole game industry wouldn't use it. A different issue is that devs force Lumen, Nanite, and ray-traced shadows in games just because they're enabled in UE5 settings by default. Also heard his Discord has a cult-like following where he bans everyone who even tries to say something different or ask questions.
@@HenrichAchberger That's not the Dunning-Kruger effect. The Threat Interactive guy is unlikeable and comes off as a whiny know-it-all, but he is right about several things. He's not completely full of crap. He's got some points, like how TAA is terrible; I've been saying that since before he was probably born. Also, you don't need to have made a game to critique one. We wouldn't be able to criticize anything if we went by that metric. I don't need to have made a game to say that a game sucks; I just need to play it. If he's right about some things (which he is), then your argument doesn't hold as much weight.
The first time I noticed these kinds of problems was Dragons Dogma 2. It was the first time that it felt like I had to "fix" something before I could even tolerate playing it, let alone enjoy playing it.
Threat Interactive is a Patreon grifter exploiting the layperson's annoyance with bad implementations in modern games. The problems he highlights are all real, but he proposes panaceas and intentionally ignores the drawbacks of his own approaches.
He's an idiot who deserves to be demonetized for his disrespect toward creators who have been in the game longer than he has. Just an all-round whiny child.
tbf Epic is even bugging in Fortnite. I'm on a 4090 and can't finish a second match on DX12 (which is required for all the max graphical settings like Nanite and Lumen lighting) without crashing. Even then, natively it's really unoptimized and runs at 70-85 fps, but can have heavy stutters. It's also weird because all of this only started with the last Halloween update; ever since then I've gotten crashes and worse performance. Now I'm on DX11 and can't even use ray tracing, Lumen, or Nanite.
That’s something I’ve noticed too. Unreal engine is now being used in movies and other stuff not related to games and it’s becoming a jack of all trades, master of none.
This is exactly it. Seems like all they want nowadays is to offload everything to the hardware based automation like DLSS, AI and all that crap, which the end user has to end up paying for.
LIES! Threat Interactive is asking for 1M dollars in donations. I agree with his message but DO NOT DONATE TO HIM. He has already closed replies on Twitter, closed his Discord server where some of his answers compromised him, etc.
Honestly, I wish developers relied more on art direction and scaled back the games a little. There were so many tricks developers used. Sure, it took more time, but they did so much more with less. I am currently playing Sleeping Dogs: The Definitive Edition. While you can tell it is an older game, the NPC count and art direction, and animation variety do more to immerse me more than most modern games. I also love games like Sifu with the stylized textures. While it doesn't look realistic, it is one of my favorite looking games in modern times. Unicorn Overlord is another gem that had amazing art direction. I feel like a style like that could allow more AAA devs to allocate more resources to gameplay. I felt more immersed in these games. They also have better visual clarity overall.
The thing is that people also hate delays of games. There are so many things that can go wrong during the development of a game, and these performance optimizations are the first to go, because not enough people care to skip buying a game over them. It's a typical case where the nerds are correct in theory, but in practice nobody cares enough to shift priorities away from all the other things that have a bigger impact on what people end up buying.
We should really start making game studios develop their games primarily on mid-range hardware comparable to what the majority of people actually run. Force them to live with the problems; then they'll want change.
See this is why I have a problem with upscaling and frame generation in general. It becomes a selling point for hardware making that more expensive overall while at the same time, everything said in this video. They aren't optimizing for native resolutions which should be the first priority. I don't wanna use DLSS or FSR if I don't have to.
Hi, game dev here, putting in my 2 cents. While it's nice to see an ambitious young man, his claim that he is going to revolutionize gaming and that everyone else is making terrible choices and errors is nonsensical. The reality is this person has developed nothing so far and doesn't even show off concept or pre-alpha builds of any project he actually plans to turn into a real game. While he does seem to have an above-average understanding of development, it's not like he is ready to begin development of some large, groundbreaking product like he seems to think. It's also easy for laymen on this subject to listen to what he says and simply nod along, because he sounds informed and intelligent, and that gives a false sense of him being correct.

While I have no desire to accuse this young man of any ill intent outright, there is a worrisome red flag here. He claims to have a development studio and seems to imply he can do what all of us cannot. Where exactly has he obtained the funding for this? Almost certainly nowhere, and no investor or publisher is going to provide it. He almost certainly has zero people employed by the studio currently; if he did, his studio website would likely show it off and mention whatever experience his no-doubt-small team has. No such thing exists. Everything going on here is a red flag and suggests this young man is going to crowdfund. Even assuming good intent, any project as ambitious as he claims would go bankrupt very quickly if crowdfunded, and he has shown no management skills either. Even experienced dev teams make this mistake: look at Unsung Story, where Yasumi Matsuno and a team of Square Enix vets failed to deliver the game they crowdfunded. Even the best of intent and tons of experience isn't always enough.
Worst case scenario, however, is that this young man doesn't have good intent, and how many times have we seen people just like him swindle through crowdfunding? While he has yet to actually do this, would anyone be surprised? While I think you personally mean well, you shouldn't be showcasing him and building trust between your audience and him, as this increases the chances of your own fans getting burned. Whether he has good intent or not, the most obvious path this goes down is people losing their money and getting nothing in return. It's a matter of when, not if, unless people like yourself warn audiences ahead of time.

Now, sadly, this is getting extremely long, but I'll touch a little on the actual subject of development. The way he talks, he seems hyper-focused on graphical elements. Graphics are of course important, but even a baby in the industry knows there is more going on in a game than graphical fidelity. Think of it like a car: just because you don't see what's going on under the hood, or beneath the body, doesn't mean those things aren't extremely important. For example, games now have much more CPU-intensive work going on, and much larger RAM requirements, than in the previous generation. It's also important to note that RAM is shared between the CPU and GPU on the PS5. Now let's say Capcom wanted to port Dragon's Dogma 2 to the PS4 for whatever reason. It simply wouldn't be possible without greatly reducing the utilization of the CPU and RAM and making massive sacrifices way beyond graphics. He, however, seems completely unaware of this side of game development. This is why Dragon's Dogma 2, even on the current gen of consoles, takes such noticeable performance hits in towns. (The fact that he calls the current gen "next-gen" is embarrassing; no one does this except for marketing purposes, which reveals he has zero industry experience.)
It's worth noting that graphical returns will always be diminishing, and we've been seeing this since the PS3/360 era. Let's say the PS6 was announced tomorrow and it had 4x the power of the PS5 Pro — no gimmicks, genuinely 4x in every single metric. That doesn't mean you'd see 4x the graphical effects, textures, framerate, or resolution. It's also worth noting that insane graphics, even more so in games that try to go beyond the standard, carry extreme costs in both budget and development time. Even today's standard is extremely costly, so going beyond it is crippling, and how exactly is this young man going to overcome that as well?

Lastly, his "realism" complaint. Other graphical styles are in no way inferior, as he seems to make it sound. It really depends on what the game itself is aiming for, mixed with genre, all the way down to subgenre. A serious sports title is typically a better experience looking as realistic as possible; Madden obviously should aim for this, but a silly sports title like Mario Golf would be bizarre and off-putting with realistic graphics. Forspoken is a great example of what happens when you mismatch these. The game is a fantasy RPG, something that typically doesn't aim for a realistic look, and rightfully so. Forspoken did the opposite, and if you play it, it just feels off to have these unrealistic creatures and magic while the game tries to look as graphically real as possible. Anyhow, if you or anyone else read this, thanks for reading it through. Just be cautious if this young man starts crowdfunding like I suspect he will. Hopefully he is just young and a tad overconfident and ambitious.
I agree. Unfortunately, hate always gains more traction on the internet. Gamers don't understand how game development works, and it's been like that for eternity. Threat Interactive is blatantly lying and pulling things out of context. Some of his points are valid, but he puts them in a different context and exaggerates the issue.
I say modern gaming is in a rough spot right now. So right now I've been replaying my old Wii games and it's always fascinating being able to play them at 4K and still be able to turn the game speed up to 400% for slow scenes or to speed up loading screens. Plus, no TAA. It's kinda funny that playing an old game on an emulator looks sharper than a modern game made with the most advanced graphics.
I just learned the reason the Wii was blurry wasn't that the console wasn't good; it was that you needed to use component cables instead of composite cables. Additionally, it uses interlacing by default, plus anti-flicker (designed for CRT TVs), and once you disable those through mods, the console looks way crisper and responds better. Still 480p, but 480p instead of 480i. But yes, emulated is a superior experience otherwise.
Create the problem (better graphics which are not optimised and tank performance), then sell the solution (upscalers which bring the performance back, but with degraded visual clarity, cancelling out the better graphics that tanked performance in the first place). Genius.
The thing really is that in the past, game developers explicitly worked to understand the hardware they were targeting and find ways to use it to its maximum potential, hence why console titles were always extremely optimized, and why games were optimized in general. Currently no such effort is made whatsoever for the wide majority of 'AAA' titles, where optimization is a complete non-priority and no attempt is made to understand the hardware directly, never mind use it to its fullest potential. Think of extremely late PS2 titles vs. early PS3 titles: on paper one should be superior to the other, but it wasn't, because by the end the PS2's developers were so familiar with the console's architecture that they knew how to squeeze every droplet of performance out of it — enough that some PS2 titles were more than on par with PS3 quality and performance at the time of its release. The same goes for the transition from PS3 to PS4.

Obviously much of this doesn't apply to PC, but even there the priority was always optimization, because if somebody can't run a game smoothly, that's a lost customer — especially with the prevalence of Steam and its pretty much guaranteed under-2-hour refund policy. Even outside that scenario, if you made a game that ran like shit for the wide majority of people, it would become a very well known fact that the game ran like shit. Which obviously applies today, since it is a very well known fact that 9th-gen games often run like shit and are miserably optimized, not only in performance but also in storage space.
The people who tell us upscaling is so good that we can't spot a difference are the same people that used to tell us it's impossible for human eyes to tell the difference between 30fps and 60fps.
There is a visible difference in resolution and fps. However, game content can be intentionally made for 30fps, a lower resolution, or a 4:3 aspect ratio. The whole idea that all games should be 60fps at native resolution and aspect ratio is wrong.
@@gruntaxeman3740 What? 60fps should be the standard, end of story. We had it before; we should have it back. I'm not paying to play an underperforming blurry mess.
You just don't know the basics of visual storytelling. End of story. There's a very good reason why almost all movies are shot at 24fps: that blurriness hides acting, bad CGI, and inaccuracies in physics effects. That means lower fps can look better and be more immersive. The same thing happens in games. If a game is based on action, that doesn't work at 30fps; it works better at 60fps. But if a game is slow-paced and camera panning is controlled, 30fps can look better, just like in movies. Aspect ratio is another thing. The Jurassic Park movies are a good example: the very first movie was the best one, and it was not shot at a 2.39:1 aspect ratio. The aspect ratio was 1.85:1 because dinosaurs happen to be tall, so they fill the frame better. Image sharpness should not be maxed out on every piece of content. Game graphics are game developers communicating with the player, and it has long been known that, for example, horror works better if the monster is not clean and sharp. An image can be intentionally made black and white, or soft, for storytelling. It's just the dumbest idea ever to make all games 5K resolution, 21:9 aspect ratio, and 60fps, because that's not ideal for all content.
@@gruntaxeman3740 Sure, but now you have to question whether the developers are processing inputs and tick functions using delta time or not, because a lot of games run horribly at 30 when they could be very responsive if programmed well. Of the 30+ games I've tried and modified on the Steam Deck, only about five or six do it correctly, to where the impact of a low framerate is minimized.
Computers and consoles are so fast that they could easily run games at a constant 60fps if desired, including the Steam Deck. Games are just doing something incredibly stupid.
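The delta-time point in this thread can be sketched in a few lines: scale per-frame movement by the measured frame time and the simulated outcome becomes frame-rate independent, which is why well-programmed games stay playable even at 30fps. A minimal, purely illustrative example:

```python
# Minimal sketch of delta-time-scaled updates. With dt scaling, the same
# simulated outcome falls out regardless of frame rate; without it, game
# speed would be tied to fps. All numbers are illustrative.

def simulate(frames, dt, speed=5.0):
    """Advance a position by speed * dt each frame; return final position."""
    pos = 0.0
    for _ in range(frames):
        pos += speed * dt  # dt shrinks as fps rises, so distance/second is fixed
    return pos

# One simulated second at 60fps and at 30fps lands in the same place:
at_60 = simulate(frames=60, dt=1 / 60)
at_30 = simulate(frames=30, dt=1 / 30)
```

Input handling works the same way: sample input every frame but apply it scaled by `dt`, and a 30fps game feels sluggish only in display smoothness, not in game speed.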
I think the main problem is that "managers" who might not have a "technical background" decide where to focus and what the timelines are, so I won't hold my breath that other studios will invest in what you shared in this video.
Hey Luke, I don't know you; your video simply popped up in my recommended. A bunch of what TI is saying is either not true or misleading. There are a LOT of inaccuracies in his research that many people have tried to point out across his videos, and he either ignores them completely or brushes them off as if they didn't matter anyway, because "nanite bad", "epic bad". Let me know if you are interested in learning more, and I'll post some of that stuff here.
Yup, TI is a walking example of the Dunning-Kruger effect. He knows just enough to convince regular gamers that he's an expert with more knowledge than render engine devs with decades of experience. His videos feed into the gamer outrage over TAA, and the fact that their GTX 1080s can't run games made in 2024-25.
@@Arms2 Guys, this issue is easy to solve if you know so much about TAA: just make your own video explaining the issue with real examples and why TI is wrong. You are free to enlighten everyone on the subject. Until then, you just waste everyone's time by calling the guy ignorant without giving concrete examples of what he's wrong about or why.
@@Anon20855 The very DF video about TAA that TI references already talked about all the benefits *and costs*. The TLDR, though, is that TAA-style solutions are required to actually fix temporal aliasing (shimmer on thin detail) in motion, and can resolve subpixel detail when done well. And once you have it, you have a much better-performing way to implement pretty much all your post effects, to the point that implementing a toggle for TAA means you need a completely separate implementation of *all* those effects (and all their options, etc.). It's mostly upside, but hard to get right in every situation, as it requires good motion vectors that tell it where a thing was last frame, and that's especially hard when you're dealing with transparencies and reflections. Done poorly, you get blur and smearing. Personally, I prefer occasional smearing to temporal aliasing, but yes, it should be an option; clearly there are plenty of people who don't notice the latter as much.
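The accumulation step described above ("where the thing was last frame") reduces, at its simplest, to an exponential blend between a reprojected history value and the current frame's sample. A stripped-down sketch — real TAA adds reprojection via motion vectors, neighborhood clamping to reject stale history, and jittered sampling; the `alpha` value here is just illustrative:

```python
# Bare-bones temporal accumulation, the core of TAA. An edge pixel whose
# samples flicker 0/1 every frame converges toward its true coverage
# (~0.5) instead of strobing. Real TAA also reprojects the history with
# motion vectors and clamps it against the current-frame neighborhood.

def taa_resolve(history, current, alpha=0.1):
    """Exponential blend: smaller alpha = more history = smoother, smearier."""
    return (1.0 - alpha) * history + alpha * current

h = 0.0
for frame in range(100):
    aliased_sample = float(frame % 2)  # edge pixel flickering on/off
    h = taa_resolve(h, aliased_sample)
```

The blur-vs-stability trade-off the comment describes lives entirely in `alpha`: raise it and the flicker returns; lower it and moving objects smear, because stale history dominates the blend.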
@@Anon20855 Ah yes, good idea! Let me just type out a long essay in a comment section so that I can try to convince one YouTube commenter named "Anon" that TI isn't exactly a good source of information. Only to have that one commenter still tell me I'm wrong, that games looked better 10 years ago, and that modern devs are lazy. lol
The worst thing is that without seeing what games would look like without AI upscaling and TAA and the rest, you'd think that's just how the game looks by itself, but in reality that's just not the case. I remember playing Cyberpunk using an upscaler, and for some reason it just felt wrong, like I couldn't get the visuals right. After some tinkering I turned off upscaling, and it looked and actually felt so much better to me.
yeah, the devs might be interchangeable studio to studio but what those devs don't realize is that just means you're easily replaceable by overseas workers willing to do your job for 1/4th the cost.
Honestly, he's a smart and talented kid, but I love how these "experts" have zero experience with how a development pipeline with hundreds of people works.
I understand wanting to take a compromise to smooth development out but this is the same route modern car manufacturers have gone and it's why they're all not worth buying now.
Graphics, in my opinion, are important in some games but not others. But whether you're for more photorealism or you really don't care, we should all agree that games need better optimization, not shortcuts.
It's ironic how Digital Foundry themselves never ever said that "UE5 and Epic are the problem", only highly regarded "investigative journalists" say that, citing DF.
@@LardoAG Digital Foundry run games at 4K on ultra on their high-end PC with a 4090, where these issues don't exist thanks to the high resolution, and they love to comment on how everything looks when they use high zoom, so of course they don't talk about these issues.
@@Extreme96PL Even if that were true, which it clearly isn't, they were always talking about shader compilation, traversal stutters, and uneven frame pacing. Those issues exist regardless of hardware, and your average dev wouldn't even know about them if it wasn't for DF. Also, DF always talks about TAA and upscaling, thoroughly comparing them in motion, and they clearly have a multitude of test configurations, not just a "high-end 4090". Their latest video on The Thing featured a Windows XP PC with a Radeon 9700.
They've made numerous videos talking about unoptimized Unreal Engine games. They have dedicated videos on multiple topics like TAA, the stutter struggle, PC optimization, etc. Do you even watch DF?
@@theanimerapper6351 That's the point, "unoptimized UE games" "UE is the problem". "Unoptimized UE games" implies that devs didn't optimize them, not that UE somehow prevented them from optimizing the games. I've heard DF cautiously suggest that the fact that this is a wide-spread problem indicates some fundamental issues with the engine, but they're waiting for CDPR to confirm that, since they're way more competent than the average dev.
@@LardoAG if what you said is true then how are digital foundry unreasonable? Cdpr even did a talk discussing the optimizations they are making for the Witcher 4 that will hopefully help other ue5 games. Idk what else you want them to say 😂
21:51 I've been around long enough to have seen the era before TAA was introduced and the kinds of graphics solutions that were being presented at conferences and in papers. One of the things about the graphics world is that graphical artifacts are a game of whack-a-mole: when you try to create a solution, you also create a different problem. In film CG, the trade-off can be made favorable because you just solve enough of the problems to get through that scene, then do something completely different for the next one. But in games, it has to do everything flexibly in every scenario, so in so many instances, the solution is a non-solution by convention: "we'll drop it on the floor. The models are going to pop in and the characters are going to clip through each other and the lighting will glitch out in certain places." AAA can often push things a little more in the direction of a movie by running different engine code for different scenes, but that's the kind of thing that really does blow up your technical budget when you go deep into it, and it creates a new set of tradeoffs when you switch between engines - is it a load screen, is it a hidden load where you walk through a tunnel, how does it handle gameplay scenarios during the transition and so on. The downside of having to wear those trade-offs is that it does open the floor to gamer conspiracy. Whichever thing you chose, the negatives of it will be attacked, and grifters will appear proposing easy answers. TAA happens to be a visible target now, and there are some valid reasons for that. At the same time, it was a solution that created a lot of flexibility elsewhere in the shading pipeline - AA solutions are part of a compositing stack, and have to be thought through in terms of "how does this impact geometry, how does this impact textures," and so on right up through the final post-processing filters. 
Early 3D games were texture-heavy - Quake and the like, very low-poly angular spaces - so those engines and the GPUs focused really intensively on aliasing in textures (bilinear, trilinear, anisotropic filtering, mipmapping...). When a scene is made with Nanite, you get the opposite situation, where almost all the pixels are geometry edges, so you need a solution that works well for edge aliasing. TAA does OK with this - it's the kind of glue that you can fit in wherever. From an art direction standpoint it's often just shuffling papers - you're still constrained in what's easy to show and the cost of the result, just in different ways. The whole game graphics space exists to sell you more chips, and the AAA game industry sits downstream of that: Nvidia says what the next gen is supposed to look like, and everyone gets in line and implements what their researchers published. As a dev you can always opt to ignore Nvidia and take the graphics in a much different direction, but in many cases this means "missing out" on something they do help with. It's actually easier to make that jump as an indie, since you can assume your scenes need to stay simple in production cost and asset density, so there's no burning need to push the engine to render more stuff.
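The texture-era fix mentioned above (mipmapping) boils down to one formula: choose the prefiltered texture level from how many texels a single screen pixel covers, so minified textures sample an already-averaged copy instead of shimmering. A hedged sketch of just that math - real GPUs derive the pixel footprint from screen-space UV derivatives, while this toy version takes it as a parameter:

```python
import math

# Toy mip-level selection: level 0 is full resolution, and each level
# halves it. A pixel covering N texels wants level log2(N), so the
# prefiltered texels roughly match the pixel footprint. Real GPUs
# compute the footprint from screen-space UV derivatives per pixel.

def mip_level(texels_per_pixel):
    """Continuous mip level; trilinear filtering blends the two nearest."""
    if texels_per_pixel <= 1.0:
        return 0.0  # magnification: stay at full resolution
    return math.log2(texels_per_pixel)
```

The comment's contrast follows from this: geometry edges have no prefilterable copy to fall back on, which is why edge aliasing needs a different mechanism (MSAA, supersampling, or temporal accumulation) instead.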
Most people discussing the fall of graphics, and the unreasonable increase in their quality demands and performance cost, lack an understanding of why it has happened and keeps happening. TAA is not at fault. It can worsen the situation, but it was a reasonable solution to aliasing on the 8th-generation consoles (which had a lot of memory, but mediocre GPUs and horrible CPUs). It could also be used as a denoiser in several stages of the pipeline, further saving computation time. The move to Unreal Engine is also a factor, but it didn't come from nowhere. The cost of developing a AAA game skyrocketed, and many studios could not maintain their own tech, so their teams moved to third-party engines (Unreal, CryEngine, Unity, etc.). Those engines aren't bad, but they aren't made for what most of those studios want to use them for either. That's where the studios should use the access to the engine's source code (which they all pay for) to make it fit their needs. But here's the catch: they abandoned their own tech (if it even existed to begin with) because they couldn't maintain or upgrade it, so how could they modify a third-party engine to fit, or fit better? There are still a lot of technically competent games being made, but the vast majority of those come from the teams that have the expertise and their own tech. There was always a gap between those teams and the rest, but now it feels like an abyss, an insurmountable one.
yeah I agree with some of what he says, but I can't get onboard with the argument that TAA is the cause of these problems. There are legitimate reasons why it's used. It's much better than FXAA, MSAA just isn't a viable option with modern lighting engines, and supersampling is way too performance intensive.
If that were true, then they should manage their investments better. You have AAA games that cost hundreds of millions of dollars going the easy route; where is all that money going?
@screaminjesus To be fair, I never said it was *the* problem, just part of it. I even pointed out why (in my opinion, at least) it became so widely used. MSAA is a possibility even today, but the cost of every kind of denoiser for various effects, plus the extra time to integrate it all, made it hard to justify to most studios (I believe Forza Horizon still uses it primarily, and it looks good).
@Gatitasecsii On that front I can't comment, as I am not in the video game industry, and anything I could say would just be an opinion based on an outside perspective. I am, however, a software developer (and had some interest in computer graphics in college). From that perspective, it is simply perplexing that most games aren't developed with agile software development methods, which means it is really hard to shift direction once development starts. Any heavy alterations can really impact the cost and schedule of the project. Capcom has started to try agile development with some success, so 🤞 the industry will catch up. As for the cost, it is to be expected. Games are now huge projects with a thousand-plus workers across programmers, artists, writers, etc. Any project that size, lasting for years, would cost tens of millions of dollars at minimum. In a country like the US, that cost can easily reach into the hundreds of millions.
To be honest, we should've been at a point where games dropping to 60fps was considered poor performance and 100+fps was standard on recent hardware. Even if graphics didn't change much, performance should've improved generally
It is easier for that generation to belittle the work of others while benefiting from their efforts for minor financial gain through content generation.
Most AAA games have a $250 million or larger budget; they should not be excused for delivering a blurry, smeary mess that runs worse than 10-year-old games while also charging $70.
Self-publish and don't work with ego-inflated credentialists. I'm a solo dev building out a garage to work with other local devs, using a pay-based-on-release-success model - for people who pay their bills by other means, so they aren't pressured. The game is CRISP and even runs well on media-center PCs. It's a shooter.
"We use Unreal Engine" sounds to me like corpo speak for "our working conditions push experienced optimizers away, so we need new developers more than 60FPS". I wish AAA would just make shorter games with great pacing at higher base price, instead of microtransaction storefronts with filler content for +66% Steam discount.
I think that we should also think about ways to convince executives at publishers (non-engineers, non-developers, etc) that this is a big enough issue that warrants attention and funding to fix, since ultimately they make the final decisions when it comes to finances and cost cutting, and are probably the reason many studios switch to unreal. They’re not gamers nor developers, so they won’t really understand the problems even if they watched every Threat Interactive video. We need to speak their language somehow lol.
Luke, I wish it was true, but unfortunately studios are laying off a lot of employees and looking for people with several years of experience and at least 2 released titles, and now there is no room for new developers, even interns.
The layoffs we've seen were in response to the inflation issues. Now that inflation is down, there should be more positions opening up relatively soon.
@@Smash_ter That's not how it works, or what caused it (kinda, but not just that). Feel free to look up the number of layoffs that happened in December in gamedev. Then do some basic number crunching: the total layoffs over the past two years, the total of game-dev students graduating, and the number of open positions. Do that homework and let it sink in.
@mercai Reminder that during the lockdowns more people were being hired into these positions. It was after the lockdowns ended and inflation kicked in that the layoffs began unfolding across the entire industry. I honestly don't like it, but we could see them start hiring again now that inflation is under control.
Probably the funniest thing about console gaming (and gaming in general, really) is that turning anti-aliasing off gives you better performance AND makes the game look better, yet a lot of games don't even let you turn it off.
Upscalers are a crutch. When I built my first PC, the whole point was to avoid the upscaling tech that was widely used on consoles. I just bought Alan Wake 2, and while optimizing my settings to make it run as well as possible (4K on my 6950 XT), I have the game set to the low preset with other stuff like enhanced shadows turned off, and it still looks incredible. Then we have FF7 Rebirth, which looks like absolute fucking garbage if you want decent performance.
"I don't think the average gamer will notice"
I think the average gamer notices, but doesn't have the ability to communicate exactly what the issue is.
Exactly, I know a few people like that.
@@khatdubell I feel like it's also easy for people to assume the blame falls with their outdated hardware/console limitations.
@@GeneralPretzle Which is intentional, one must presume, given how both console and graphics-card manufacturers want you to always be buying the newest stuff.
Yea, but most gamers don't notice how disconnected rasterization makes everything. RT and a lot of these new methods fix that, but at a cost. It's kind of a transitional phase.
They notice. Everybody is complaining about newer games looking "fuzzy" or "unpolished", or in any case less clear than in the previous generation.
It's so crazy how people were recommending expensive 4K TVs with HDMI 2.1 bandwidth to take advantage of 4K 120Hz when the 9th-gen consoles launched. Years later, your average Unreal Engine 5 title is running at 900p 30fps on these machines lmao.
I've got a 4090 laptop (~ Desktop 4070 TI). There's no way I'm getting more than 60 FPS at 4K Ultra, RT off.
This generation of gaming is fuckd
@@Rapunzel879 but you do get 4k120 in games older than 5years, right? So up until ps5 launch.
I thought Spider Man games had awesome graphics. I changed my mind
But but, muh fancy RT lightings 😂
Studios should prioritize 60 FPS, no ghosting, and no blur.
Yeah, but they won't. Even Rockstar, with RDR2, had quite a bit of blur and ghosting.
But then you get lower resolution textures and you golems will start posting "LOOKS LIKE PS3 LOLOLOL" in trailers of games, the developers can't win. You morons brought this on yourselves with your obsession with graphics.
@@cxngo8124 RDR2 is the blurriest game ever made imo
that would cost time and money so uh no
Yes, and at native on the most common consumer hardware being purchased when the game is released.
I thought my eyes were just getting old when I started to notice some games were blurry.
@@user-hc1oi2bq8u just go back and play an older game that was based around MSAA.
FC4 vs FC6 for example
Just play Half-Life 2 for comparison - crisp and clear!
@user-hc1oi2bq8u I just learned the reason the Wii was blurry wasn't that the console was bad - it's that you need to use component cables instead of composite cables. It also uses interlacing by default, plus an anti-flicker filter (designed for CRT TVs), and once you disable those through mods the console looks way crisper and responds better.
okay, im not alone in this.
Last time I loaded Dragon's Dogma: Dark Arisen I went "shieet, is this 1080p?" when I was checking my settings - it looked really sharp. New games don't look that sharp at 1440p.
People in the chat were like "yeah but 4ms is nothing!" - And it really highlights just how out of touch gamers are and why they don't raise their voices against these types of issues.
To play a game at 60fps, you need to compute *EVERYTHING* in about 16.7 milliseconds. By turning on a feature that alone takes up 4ms, you leave only ~12.7ms for *EVERYTHING* else - game logic, physics, the whole rendering process, LITERALLY EVERYTHING. 4ms is roughly 1/4 of your budget if you want a game to run at 60fps. On paper 4ms doesn't sound like a lot, but it eats about a quarter of the frame at 60fps. That is MASSIVE.
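The arithmetic in the comment above is easy to check yourself. A quick sketch - the 60 fps target and 4 ms feature cost are just the numbers from the comment, and real engines overlap CPU and GPU work, so treat this as a simplification:

```python
# Frame-budget math: at a 60 fps target, every frame must finish in ~16.7 ms.

def fps_from_frame_time(ms: float) -> float:
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / ms

budget_ms = 1000.0 / 60.0            # ~16.67 ms per frame at 60 fps
feature_cost_ms = 4.0                # the hypothetical feature discussed above

remaining_ms = budget_ms - feature_cost_ms   # left for logic, physics, rendering...
new_fps = fps_from_frame_time(budget_ms + feature_cost_ms)

print(f"frame budget: {budget_ms:.2f} ms")
print(f"left for everything else: {remaining_ms:.2f} ms")
print(f"if the 4 ms is pure overhead, 60 fps drops to ~{new_fps:.0f} fps")
```

This also lines up with the "60 down to roughly 50 fps" figure cited elsewhere in the thread: adding 4 ms to a 16.7 ms frame lands you near 48 fps.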
Yeah, a lot of people aren't educated enough to know, and then there are people who think they're educated and become obstinate against anything that conflicts with what they think they know. I've watched videos from this guy at Threat Interactive. He explains it quite clearly, shows his homework, and writes it all down so you can see the process.
One of my favorite videos he did was another one kind of crapping on Unreal Engine, because he showed how unoptimized things were: there was a pre-built area with the worst light mapping possible, overlapping so much that it dragged your frames down to pretty much nothing. He went and fixed it, made it look better using some old-school techniques, and suddenly instead of 13 frames a second he was in the hundreds.
One cannot expect the general public to understand that 4ms in rendering is a lot. TI gave an example in the video, saying those 4ms were the difference between 60 and 50 fps.
We need people who can explain these pitfalls of the current industry in a simplified, easy to understand manner and have that get actual attention.
We're talking about the same people that spent 90% of the video talking about how the Threat Interactive developer looks weird and is mad. Don't expect them to be smart enough to understand anything when that's all they can talk about.
False - you always have 16ms for the CPU and 16ms for the GPU, since they work in parallel. Some CPU tasks are multi-threaded, which can take some load off the main thread.
@ how does doing something in parallel contradict doing everything in 16ms?
The following are non-negotiable:
1. 60fps min framerate, no exceptions.
2. Zero stutter.
3. Absolutely no ghosty/smeary/noisy upscaler.
I refuse to believe we can't have these today. We HAD these 10 years ago!
Play at low settings then, if you want all of that. 10 years ago we didn't have the graphics we have today: 60 fps max at 1080p, baked lighting, lower-res textures, etc.
@@agusr32 Baked lighting looks better than RT, because real-time RT is so expensive that it has to be downgraded heavily to run at playable speeds. Well, baked lighting is also RT, just computed offline, which means the quality can be much better since there's no frame-time budget constraint.
@@LtdJorge Yeah, but you need to have static environments, and save a ton of files for the lighting. In open world games with day-night cycles + weather effects, baking lights is not ideal. Also, I don't think it is easy to maintain the lighting resolution for meshes of different scales. You might bake lighting with coarse resolution for buildings and terrain, and rely on cheap but nonrealistic methods to calculate lighting for smaller objects and moving meshes.
@@agusr32 did you watch the video or just decide to post?
@@Archikuus I watched the video and decided to post. Anything else? preferably on topic?
Games should be running on native resolution, upscalers are a plague.
Devs should be optimizing for native as well
No, they really aren't. Most of the time, DLSS at 1440p is near indistinguishable from native resolution and uses less VRAM to do the same thing. That's a win for the gamer.
@@Krypto1211 except you need an Nvidia card for DLSS. Not a win for the gamer when you’re forced to pay Nvidia prices. The gaming industry is using a proprietary feature as a crutch that not everyone has access to (console gamers for instance). FSR has abysmal ghosting, even XeSS looks better than that.
@@Pårchmēntôs DLSS is built into RTX cards, so obviously you need one to use it.
FSR and XESS are available to everyone.
Native res is too taxing for gamers to achieve 4k on their PC rigs. You would all have to buy expensive cards to be part of the pcmr.
CDPR is moving to Unreal Engine because the entire team responsible for developing their in-house engine has left the studio.
@@Asynthetic source?
@@kruz3d573 Source? Some random guy on twitter said so. I am deadass.
Gamers thinking they know how games are done is still one of the funniest shit there is.
@lopesmorrenofim There are multiple videos covering and explaining the mass exodus at CDPR. If you actually want to know more, watch Dr. Disaster's video on The Witcher 4.
I've heard similar things somewhere. The Witcher 3 team is very different from the Witcher 4 team, and the reason they use UE is that literally anyone can get it and learn it themselves, without any cost. You can hire way, way more people and don't have to teach them your own engine before they can get any work done. It's a logical step, but CDPR has already noticed how crazy bad UE5 is for open-world games. I've heard they gave an entire game-dev conference talk about how the engine literally runs worse the more assets are used in an open world, so they have to heavily modify the engine with Epic's help.
They are moving to UE5 because Red Engine has severe problems; it's not that Red Engine is a bad engine, it's that UE5 simply does more.
The fact that the other developers at 30:25 are screaming for a solution and Epic isn't doing anything is a classic example of why relying on another vendor for your tech stack can suck. If you wrote your own stuff, at least you'd be able to come up with better solutions instead of just sitting around waiting for Epic to act. I expect this trend of Epic not listening to developers to continue as they sway more studios to switch to Unreal. To them, it's not worth fixing if they're already banking those licenses and fees from studios.
Unity being scummy really played into Epic's hand...
Unreal Engine is proprietary and open source. In other words, any company using it is free to change the code however they see fit, that is if they have the know-how. They either don't want to or have possibly forced the more experienced devs out of the company to save money. Also, it appears to me Epic is trying to steer UE into being less of a game engine and more as a tool for visual effects/cinematography. So UE devs are doing plenty. They've developed probably the most advanced game engine to date, but the AAA game developers (and some Indies) are trying to maximize their profits rather than make good games.
@@benhunter8551 Proprietary and Open Source don't go together. Unreal is not Open Source.
I enjoyed playing Jedi Survivor and 100%'d it, but it did not need to be 135 GB when Fallen Order was 45 GB, I think. I'm really not a fan of single-player or even multiplayer games being over 100 GB.
Great game ... I just played it on PS4 lol... They had to drop resolution to get it to run well but I enjoyed it .. though the ending wasn't that great
I was surprised how crisp that game looked . And that was just watching game play on U2b...
Then we get mullet order recently, runs worse, and looks like PS3
Every developer seems to be moving to UE5, so I doubt we'll see below 100 GB in single-player games for a long time.
There should be a way to download the game at the texture resolution you want. I imagine 4K textures are a big chunk of this.
Unless it's a massive solar-system sized game, you're right, they do not need to be over 100gb. No Man's Sky puts a lot of these games to shame, looking a lot better (at least aesthetically) and being much larger. Even Star Citizen is only 125gb -- the fact Call of Duty games are twice or three times that size and nowhere near as big is just embarrassing.
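The hunch above that 4K textures dominate install size checks out with napkin math. A rough sketch - the 1 byte/pixel figure assumes a BC7-style block-compressed format, and the 4/3 factor approximates a full mipmap chain; both are general assumptions, not data from any specific game:

```python
# Back-of-envelope texture sizes: doubling resolution quadruples the bytes.

def texture_size_mib(side_px: int, bytes_per_px: float, mips: bool = True) -> float:
    """Approximate on-disk size of a square texture in MiB."""
    size = side_px * side_px * bytes_per_px
    if mips:
        size *= 4 / 3  # each mip level is a quarter of the previous one
    return size / (1024 ** 2)

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_size_mib(side, 1.0):.1f} MiB compressed")
```

A few thousand 4K textures at roughly 21 MiB each is how installs climb past 100 GB, which is why resolution-tiered downloads would help.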
Sony and Microsoft: Our new console will handle 4K at 120fps for real next-gen graphics!
Eight years later...
Sony and Microsoft: Our new consoles will actually handle 4K at 120fps for REAL next-gen graphics!
Eight years later...
Sony and Microsoft: Real, native 4K at 120fps doesn't matter. It's all about our amazing AI processing for REAL next-gen graphics!
For the PS5 and Series X they advertised 8K 120fps, yet what we get is 1440p upscaled to 4K at 30fps.
@@marlon.8051 Pretty much. Often, not even that high.
@@marlon.8051 900-720p or even LESS in some cases!
I truly feel that devs are just going to keep doing this more and more until it becomes the norm. It’s so crazy how most modern triple A games genuinely look worse than games made 10 years ago because of all this fuzzy frame generation bs.
They will do whatever enough people throw Top Dollar at them to do. I don't want to hear people complaining about the very things they are funding with their own money.
@@nandoman4769 it's already the norm tho
Devs don't do this, or rather try not to. Executives do, and will do, as you said.
It’s either this or basic graphics with frame pacing issues like Elden Ring. Most developers have completely forgotten or chose to forgo optimization
I don't know if it's even sustainable, honestly. The more insane hardware requirements get, the smaller the potential playerbase. Not everyone is going to update hardware at the rate the requirements grow.
The question people should be asking is why development costs have skyrocketed in recent years. Game quality and creativity hasn’t increased commensurate with increased costs. I would also caution anyone to argue that costs increased because of an increase in talent. “Talent” implies creativity, productivity, and/or innovation, qualities that aren’t readily apparent in general these days.
100% this. Spider-Man 2 cost $200 million more than Spider-Man 1. Why?
@@georgep.2352 BS... $200 million is its whole budget, not the gap with the first game.
indeed, especially when 90% of natural game environments are made with free Quixel megascans assets
Higher fidelity assets that take longer to create, and larger scopes, and bad management. Those are the major reasons.
@@Khrist75 Total budget of Spider-Man 2 = $315 million.
Spider-Man 1 = $100 million. As far as I found in my research, those are the numbers: roughly a 215% increase over the original. Math-wise it's 3.15× the original production cost (the 100% baseline plus the 215% increase).
The diminishing returns graph applies to graphics but not really to performance. I don't expect a big leap in visuals from the ps4 generation to the ps5, but I do expect a much larger leap in performance, especially in the CONSISTENCY of performance. Why are so many ps5 games still only "targeting" 60 fps but not always managing to hit it? And why do they need dynamic resolution scaling to be able to hit that fps target? This was understandably an issue for last gen, but we should be past this now.
Last gen most games were 30fps, and often a shaky 30fps. This gen most games hold a much more stable rate, and all but a handful have 60fps options on console... that IS a big leap.
@@themightyant. This is patently ridiculous - the consoles aren't twice as fast, they are 6.5 times as fast. Try gaming on a PC: even a top-of-the-line one from 2013 runs those games at 40 fps, where a low-end one from 2020 runs them at over 200.
@@themightyant. The current-gen consoles run on hardware that's more than capable of running these games consistently at high frame rates. PCs with similar specs already do.
Unfortunately, there's plenty of room for devs to add more graphics (not much you'll notice but still costly in performance) so I don't think they'll start targeting 60 yet
@@jace_albers I mean, to be fair, a low-end PC is better than it's ever been, but any game that gets 200 fps on a 4060 also runs well on a PS5.
A year ago I got a gaming PC. I've been playing on a 1440p monitor at native resolution, and began seeing things that made me think my graphics card was acting up - but no, it's just part of the experience now.
Then you have me, buying an Nvidia graphics card BECAUSE of DLSS... never again u.u
@@Gatitasecsiia little bit of dlss is ok
@@Limbaugh_
Yeah but at 8GB of VRAM some games actually end up being unable to load textures after like an hour of playing...
@@Gatitasecsii We both bought the 4060 for dlss huh?
@silentlore2458
Yeah... feelsbadman.
Well at least my excuse is that I passed my old graphics card to my little brother but still... 8GB of vram...
Going back and seeing games like Detroit: Become Human running on a base PS4, or Uncharted 4 - hell, they even unveiled and launched TLoU Part 2 on it first - it makes no sense that so many games are uglier and run terribly, not just on "current gen" hardware but, depending on the game, on high-end PCs much more powerful than those machines. Upscaling was sold as the savior of old hardware; instead it's being used as a crutch to make shit playable on new hardware. Another thing is shit like Lumen being pushed much earlier than it should have been, when UE5 can't even handle streaming in worlds without stuttering, and the fact that studios are having to find a fix for it instead of Epic fixing it themselves is an absolute joke.
EDIT: Also, I don't know if I'm wrong or not, but haven't we had a cost-saving technology for geometry called tessellation for nearly two decades now, which devs seemingly forgot existed? Instead you either get terrible pop-in or, in the case of UE5, Nanite, which is part of the stutter problem.
It would help if we had another engine besides Unreal... problem is, it's a huge expense that no one else wants to shoulder.
And now FF7R part 2 won't run at all on my RX 5700 XT because it doesn't have RT cores, despite the game looking much, much uglier than part 1 or the Crisis Core remake.
Detroit become human is not a game, it's an interactive movie.
Nanite is literally just a fancy version of tessellation.
Tessellation only makes flat surfaces look like detailed 3D geometry. It doesn't cause pop-in.
I never noticed all the issues with TAA until my buddy pointed them out to me while I was playing one day, and now I can't stop seeing them everywhere!
Do you play on a TV in your living room? I really didn't notice it much until I started playing on a monitor.
@RusticRonnie Understandable! I think we often don't properly notice TAA's side effects while actually playing, compared to rewatching our recorded gameplay.
Edit: Plus, I prefer TSR, since it looks better and runs smoother than TAA (its side effects are rarer), but sadly it's only available in UE5 games... idk why they talked about TAA more than TSR, though.
@@RusticRonnie I play on a monitor and I somehow still can't see what TAA ruins, unless it's badly implemented with obvious ghosting and such.
But I could also point out all the flaws of old rendering techniques as well. It's always a trade off.
@@alyasVictorio because tsr is TAA with a spatial upscaling bit tacked on
18:43 No - rather, it's that a lot of "gamers" are post-Cyberpunk gamers; they never saw the days when everything was clear. Just think about it: the vegetation in Division 2 is the same as in The Last of Us Part 1, but the hardware requirement is so much higher. Why?
768p BO2 on my old GT740m laptop used to look sharper than the current games i play on my 4080 at 1440p. Sh|t is absurd
Upscalers have become crutches. Look at UE5 games: they struggle even on high-end rigs, and in some games you can barely get 60fps on all ultra with a 4090. But look at Indiana Jones - it's beautifully optimized, to where you can run the game on a laptop RX 6700 at 1440p low settings at 60-70fps.
So if the Indiana Jones team decided to make their game engine public/open source, you think NO ONE would make a single unoptimized game with it?
I've been playing Life is Strange: Double Exposure lately, and that title convinced me that UE5 is mostly just a tool to make games more quickly with less effort. Obviously it looks much better than earlier Life is Strange games, but it still has very simple graphics by modern standards. And guess what? It still runs like absolute ass. I'm running it with an overclocked RX 6900 XT and it can't even maintain 50 FPS on high settings at 1440p, let alone the highest "cinematic" settings. After the SH2 remake, this is the second game where performance is somehow even worse indoors than outdoors.
How on God's green Earth can games like HFW, The Last of Us remake, or A Plague Tale: Requiem (and many more) run miles better with way higher fidelity? Even another UE5 title, Lords of the Fallen, has much better performance.
@@Rexperto6454 I hate most Unreal Engine 5 games. I was surprised, though, by how well Chivalry 2 is optimized on it. And Codemasters seem to have improved the release of their WRC game. It's really not good for open-world games, however.
When you use RT in that game, it still drops to 60fps and below at upscaled 4K on a 4090.
You know PUBG and many other games are made in Unreal Engine.
This problem now has developers creating games that rely on upscalers as standard practice, rather than upscalers being tools to enhance gamers' systems. That's really bad.
Threat Interactive is tilting a lot of folks. Even the guys at DF are REFUSING to just talk about the subject - why? The number of people being exposed is incredible...
It's true that publishers are largely to blame. But let's not ignore the fact that the lower echelons are also showing disproportionate indignation at the truth - that means something...
Absolutely. Threat Interactive is the best YouTube channel related to gaming right now, and it makes a lot of studios look like clowns.
Alex from DF is pretty open about his frustrations about UE5 stutter issues.
Because he doesn't know what he is talking about. There were multiple actually professional developers debunking his claims. He has surface level understanding of Nanite, Lumen, LODs and other tech, and in many cases his conclusions are simply wrong.
I'm gonna say right now: everything Threat Interactive says has truth to it... but at the same time there are a lot of lies by omission, and he generally doesn't tell the whole story. Let's talk about TAA, something Threat Interactive brings up a lot because it makes games blurry and has artifacts compared to older AA methods. He has conveniently not mentioned WHY we stopped using the older methods. SMAA, MSAA, and SSAA are bloody expensive compared to TAA, and SMAA and MSAA don't necessarily have the best results with deferred rendering. FXAA was brought about to deal with the fact that the older techniques didn't play nice with deferred rendering, but it had MORE ISSUES THAN TAA. We need a new solution better than TAA, yes. Game devs absolutely over-rely on it and abuse the hell out of it.
Older techniques are not the answer. I hope Threat Interactive helps push us into a better-optimized era, because it's awful right now... but take everything said there with a HEAVY dose of salt, because it's hella myopic.
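For anyone wondering what the ghosting trade-off mentioned above actually is, mechanically: at its core, TAA is an exponential moving average over previous frames. A toy single-pixel sketch - the 0.1 blend weight is a made-up but plausible value, and real implementations add reprojection and history clamping on top of this:

```python
# Why naive temporal blending ghosts: stale history decays slowly.

ALPHA = 0.1      # weight of the current frame; 90% of old history survives

history = 1.0    # the pixel was bright (an object covered it last frame)
current = 0.0    # the object moved away; the pixel is now dark

for frame in range(5):
    history = (1 - ALPHA) * history + ALPHA * current
    print(f"frame {frame}: displayed value {history:.3f}")
# after 5 frames the pixel still shows ~0.59 of the old value -> a visible trail
```

The same averaging is what smooths jagged edges and denoises stochastic effects across frames, which is why the technique is so hard to replace wholesale.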
@@Reverie_Blaze Haven't seen any "professional developer" with an actual shipped product that runs UE5 with Nanite and Lumen at 4K60 without upscaling.
This dude looks like he's had one of those Samsung smoothing filters applied to his entire face. Like, it's uncannily smooth.
@@Earthserpent89 optimized face lol
He looks like from my point view the Jedi are evil.
I thought he was A.I.
He's super young. Literally a kid.
Dude, he's just young. I swear y'all only bring this up to distract yourselves from the fact that he's RIGHT.
The Order: 1886 looked amazing on PS4. We've got the PS5 Pro and we're still waiting for games like this.
don't make me sad reminding me what they took from us.. if any game of the PS4 era deserved a sequel it was 1886.
It was, indeed. I was kinda surprised they managed to make that game look that good while running on the PS4's hardware.
The only thing I didn't like about that game was that they made it too on rails.
@@jamesFX3 Ironically, I thought it being so linear would be a con. Replayed it recently, and it was nice to have such a straightforward game with good atmosphere.
Add DRIVECLUB to the amazing list 😃
@@chrisbrown113096 That's why I like Metro 2033 and Last Light better than Exodus. They're the perfect length.
"CDPR has competent people who worked on the RED Engine for years and can tell Epic what needs to be changed." Bro, those competent people (a bunch of directors included) are long gone - most of them left even before Cyberpunk released, and now several dozen more have left CDPR to found two more studios of their own. Why do you think they're switching to Unreal in the first place? In conclusion: CDPR is cooked and UE5 is still trash.
As a born PC player, on console I always go with 60fps performance mode to reduce risk of the controller physically entering the game via the display device
Is this AI, or Google Translate? In any case, I assume you mean reduce input latency? Because the stuff you wrote doesn't really read like coherent English.
@@bazzy5644 Well, you just lack basic reading comprehension and the ability to contextualize what you read. What he said is: "as an originally PC-master-race gigachad, I always set my console games to 60fps performance mode, because anything lower disgusts and angers me to the point that I will throw the gamepad into the TV if it does the thing again."
@bazzy5644 It makes perfect sense my dude. He’s holding back from throwing his controller through the TV
@@thecompanioncube4211 Well, when you put it that way it makes sense.
@@bazzy5644 Remember the times when "shit hit the fan" could be rephrased as "excrement" and "rotary device", and everyone would get the joke because they passed elementary school? Good times.
Epic has always put developers first and gamers last: incentivizing developers onto EGS with a lower cut, when none of that matters to the regular gamer because the price is the same anyway - but now you have to use their store and lose cool features like Workshop, Remote Play, community controller presets, etc. Same with the engine: they make games easier and less costly for developers to make, but gamers will just have to upgrade to the latest 5090 Ti Super if they want playable framerates.
The options Unreal Engine gives us as developers are absolutely hard-shafted in one direction or the other: you either give players a decent experience and completely screw yourself as a developer, or you give yourself as a developer a great experience and completely screw half your potential playerbase (and the other half has to deal with horrible temporal smearing and/or upscaling from sub-1080p resolutions). There's no middle ground.
It's infuriating because now with so many horrible AAA releases with horrible performance, quality matters more than ever, but with Unreal (which is what I've been using for the past few years) it's very difficult to deliver on the performance side. I'm experienced in programming so building a custom engine is hypothetically within my reach, but still far from easy, not to mention having to rebuild all my tools from scratch, setting back my ship date by a few years, and crowdfund the ability to work on it full time and hire a graphics programmer for assistance (which I absolutely can't, I'd never be lucky enough to get the popularity needed to raise that much money). So while hypothetically possible, it's definitely not realistic. All in all it's really not a good situation. I'll do what I can with what I have though.
This is a truly idiotic opinion.
@@ramsaybolton9151 It is the truth, and shipped games confirm it, as does the current "AAA" situation in the industry.
@@CactousMan Brother, without Unreal Engine most indies couldn't make a game. Unreal has the best deal and tools for hobbyists and indies. They offer cheaper games and have actually advanced gaming. It's like the hate Nvidia gets when they're actively developing technologies that improve stuff for gamers. AMD is SLIGHTLY cheaper but just piggybacks off Nvidia's tech and is always years behind.
Steam basically fucked off with all their money. Nothing is released on the cheap. Where is Source 2 for developers? More gambling simulators for Valve games and overpriced goods like the Index and Steam Deck.
@@ramsaybolton9151 Yeah, UE5 has some advantages in that case, but that's no excuse for its bad performance and lack of better optimization options, coming from such a wealthy company with more than enough resources to improve its software.
we all jumped from 1080p to 4K way too fast without waiting for hardware to catch up
for some reason 1440p just didn't roll off the tongue for casual consumers and 4K was the MAGA of resolution names.
we needed at least 10 years of peaceful 1440p gaming AND TV.. before even thinking about moving to 4K
now everything is messed up and no one knows where to start fixing it all
Also, people don't understand that entry-level GPU (RTX x060) isn't meant for native 1440p, it's for 1080p with DLSS. And high-end GPU (RTX x090) is meant for 4K@60hz with DLSS Quality.
Nah - that's blaming people for upgrading, and hardware for not being strong enough
But the point made is, that there is too little effort put into optimizing the game itself.
You can put your million poly models into the game and expect Nanite to figure out how to render the stuff - or you put in the effort as a dev and use LoD and/or reduce the polygon amount at least somewhat.
Hardware does not get stronger just so devs can be lazier!
@@Hyp3rSon1X plebs think optimization means high framerate, it doesn't. Optimization means high framerate without compromising visual quality. LODs compromise visual quality because you will notice switching between LODs, but Nanite always shows the geometry that is most optimal. You only get pop-in with Nanite when your PC can't handle streaming the content.
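As a side note for anyone unfamiliar with the discrete LOD scheme this thread keeps contrasting with Nanite, classic distance-based LOD selection is roughly the sketch below. The distance thresholds and mesh names are invented for illustration, not from any engine:

```python
# Classic discrete LOD: pick one of a few pre-authored meshes by camera
# distance. Thresholds and names here are made up for illustration.
LOD_THRESHOLDS = [
    (20.0, "lod0_high"),          # full-detail mesh up close
    (60.0, "lod1_mid"),           # reduced mesh at medium range
    (float("inf"), "lod2_low"),   # cheapest mesh beyond that
]

def pick_lod(distance):
    # The hard switch at each threshold is exactly what produces the
    # visible "pop" mentioned above; Nanite instead varies cluster
    # density continuously.
    for max_dist, mesh in LOD_THRESHOLDS:
        if distance <= max_dist:
            return mesh

print(pick_lod(15.0))  # "lod0_high"
print(pick_lod(45.0))  # "lod1_mid"
```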
I think this is a miss. The problem isn't waiting for hardware to catch up; more often than not, it is the design and implementation of the game/systems. And I know this is the problem because it filters all the way down the "tech tree".
I'm not a higher-end gamer: I don't play graphically demanding games, and I don't play on a high-end computer. But I have still noticed that many of my newer games run _worse_ than comparable older games.
Edit: fixed typos
@@chillin5703 Which games? Name old games that are comparable to the new games. In most cases, they are less detailed or more static than new games.
The video seems to imply that one of the solutions is baked lighting. Baked lighting takes up a lot of space so it's not something we're going to go back to for huge games. It was largely not seen as a problem going into the PS3 generation because we suddenly had so much space. Jak 3 was 3.25 GB, The Last of Us was 34.55GB. 5GB of baked lighting is no problem at this point, but what happens when you want 4 times the fidelity on the baked lighting and also the map size is 5 times bigger. Suddenly the baked lighting alone is taking up 100GB
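The storage math in the comment above is easy to make explicit. Note the 5 GB baseline, 4x fidelity, and 5x map-size figures are the commenter's hypotheticals, not measured data:

```python
# Back-of-envelope sketch: baked-lightmap storage scales roughly linearly
# with texel density (fidelity) and with map area. All inputs here are
# the hypothetical figures from the comment above, not real measurements.
def lightmap_storage_gb(base_gb, fidelity_factor, area_factor):
    return base_gb * fidelity_factor * area_factor

# 5 GB of lightmaps, 4x the fidelity, 5x the map size:
print(lightmap_storage_gb(5, 4, 5))  # -> 100 (GB)
```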
Days Gone wasn't too bad. The file size was kind of crazy, but I think most of that was the texture or audio formats.
@@Callsign_Sturm The game is 67GB so compared to The Last Of Us on the PS3 it's huge. The foliage shadows are done in screen space which is part of why they're so imprecise. The rest of the shadows are really low resolution as well. It was 30fps capped on the PS4. It's a UE game which means it also uses TAA
It's a nice looking game even if it looks a bit dated at this point, but it is also an example of the type of game that this video would complain about.
he has a recent video about dynamic lighting where he showcased a more efficient way of rendering it than what UE5 has
@@TheDuzx If you see Threat Interactive’s videos on Days Gone you can see it’s a very well optimized game. Given it’s open world with 200+ enemy hordes coming after you without any noticeable frame drops the game is very impressive.
@@theatheistbear3117 I find that strange, as the game had most of the "issues" highlighted in this video when it launched on the PS4, and it still has a few of them. The experience is way better on the PS5. Not because the game was optimized further, but because the PS5 is just more powerful. The same is going to be true of most PS5 games they complain about today when we play them on the PS6.
The Series X is a MASSIVE LEAP over the One X; new games aren't even playable on it in most cases! Go play any new COD on a last-generation console. They shouldn't even be allowed to sell it for last gen, it's that bad!
LOL bo6 on last gen is great on all except XBone
As an ordinary gamer who doesn't know anything about this stuff, I was appalled at the MHWilds PS5 visuals during the beta they ran. It genuinely looks and runs worse than MHW (6yo game btw). I'm not sure if it's due to things that this video illustrates or if its just because it isn't finished yet. I'm praying for the latter.
18:30 People do pick up on it; they just register it as "something is off" without the ability to articulate what exactly is off
it can throw people off and they won't even know it
I recently played the original Alan Wake, and the baked-in lighting is amazing!!
Alan Wake runs and looks very poor on PS5. It drops to pixelated 360p all the time during gameplay.
@doanamo PC 😁
In the 90's game programmers needed to be creative to find ways to reduce computation power required to render each frame. Now they just ignore optimizations on any and all steps in the pipeline, and just rely on enough compute cores to do the math.
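The 90s-era creativity the comment above describes is best shown by example. Here is a Python re-creation of one famous trick of that era, the Quake III "fast inverse square root", which traded an expensive divide-plus-sqrt for integer bit manipulation (the original was C on 32-bit floats; this is only an illustrative sketch):

```python
import struct

def fast_inv_sqrt(x):
    """Approximate 1/sqrt(x) using the Quake III bit trick."""
    # Reinterpret the 32-bit float's bits as an unsigned integer.
    i = struct.unpack("<I", struct.pack("<f", x))[0]
    i = 0x5F3759DF - (i >> 1)            # the famous "magic constant" step
    y = struct.unpack("<f", struct.pack("<I", i))[0]
    return y * (1.5 - 0.5 * x * y * y)   # one Newton-Raphson refinement

print(fast_inv_sqrt(4.0))  # close to 0.5 (about 0.2% error)
```

The point isn't this exact trick, but the mindset: when every cycle mattered, devs replaced expensive math with clever approximations instead of buying it back with hardware.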
120fps should be the industry standard because persistence blur ruins everything and BFI is just a band-aid.
the critical flicker fusion frequency is around 83 Hz, so 166 Hz would be nicer
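For anyone wondering why the persistence blur mentioned above halves at 120 fps: on a full-persistence sample-and-hold display, the perceived smear while your eye tracks a moving object is roughly tracking speed times frame time. The 960 px/s speed below is just an illustrative number:

```python
# Rough sample-and-hold motion blur estimate: each frame is held on
# screen for 1/refresh_hz seconds while the eye keeps moving, smearing
# the image across that many pixels. Illustrative model only.
def persistence_blur_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

print(persistence_blur_px(960, 60))   # 16.0 px of smear at 60 Hz
print(persistence_blur_px(960, 120))  # 8.0 px: 120 Hz halves the blur
```

This is also why BFI (black frame insertion) helps: it shortens the hold time without raising the framerate, at the cost of brightness and flicker.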
It would be amazing if we could unblur the lines between advertising BS and blatant lying.
I'm glad you are sharing this video. I've seen it a few days back and despite the somewhat goofy technical execution of the video (sound and image quality) the contents are super interesting and very on point, factual. Happy to see this gain more and more spotlight.
AC Unity has such impressive-looking lighting because they baked everything and set it to one specific time of day, because they couldn't figure out a way to make a day/night cycle that looked as pretty. Obviously the textures and models leave a lot to be desired, but in the interiors it still looks as good as or better than current games on half the hardware. I wish they had this type of non-temporal bullshit for STATIC maps
@@Gaming_Legend2 And they continued to develop their lightprobe-based approach after that. It's not as precise as actual RT, of course, but it's an absolutely great compromise between visuals and performance. Now we have certain people praising RT, like: "everything is ray traced, wow". But it's some game again with static environments and mostly static lighting. Latest Indiana Jones is a great example. Sure, the tech is impressive, but at the end of the day I've seen comparable games with lightmaps, light probes, etc. with much better performance and only slightly worse visuals. I don't care if it's not "real-time all the time". All I care about is how the final games plays and looks.
23:30 maybe they shouldn’t have fired their talent for disagreeing with them ideologically
🎯💯
SSSSSSSSSHHHHHHHHHHHHHHHH . He doesn't like to talk truths in his videos or even mention the W word
The pure irony is that more tools were introduced to help assist and speed up the process of development, but at the same time all it did was actually hurt development instead.
You have games from 10 years ago that hold up much better visually than games today, because everything was worked on organically, by hand. Now there are so many in-engine tools that were made to speed up the process but just ended up costing them in the long run. Sure, doing this one thing with this one tool might help the product finish faster now, but it ends up breaking or causing so many issues down the line after release that it's almost pointless, because then you spend that post-development time trying to fix it. It completely defeats the purpose of all those shortcuts.
All they are really doing with all these tools is bloating the file sizes of games.
agreed. people keep glazing UE5 for its easy accessibility for devs, but overlook the clear problem: it allows laziness and over-reliance on a game engine designed for a wide range of uses, instead of dedicated engines that specialise in an area/genre.
It's not useless in the end, wtf are you talking about????
They get your money into their pockets faster,
through ea and not having to pay 500 employees but 100, cuz they put out slop QUICKLY rather than slowly.
Please stop spreading MISINFORMATION!!!!
@zenoomorph it's not misinformation, it's a fact. No one's glazing EA or any other company. Most games these days are unoptimised garbage regardless of whether they use UE5, because they all use the same shitty solutions. I'm of the opinion that studios should use their own in-house engines.
@@SilverSpectre266 False, the bloated file sizes of games are from baked lighting, which many games still have as a fallback. RTGI doesn't need lightmaps; the file sizes are big because of 4K textures.
Pop in along with a lot of these titles making even a 4090 struggle at native 4K has been really taking me out of it on recent games. It’s crazy.
On the PlayStation 1 2 and 3 era though, it felt like the attitude was to squeeze as much as you could out of what resources you had, whereas now the attitude seems to be to brute force with the resources you've got and not even optimize the software at all. I feel like that has got to have an impact on these diminishing returns.
Also, producers put more effort into bogging the whole game down with feature creep instead of getting the best possible output from the feature set they already have
Guys, don't believe everything on the internet. Threat Interactive is asking for 900k to make his own no-blur TAA in UE5, with no info about his previous experience in his crowdfunding.
UE5's source is available and you can modify it however you like to remove bloat in games.
And one more thing to think about: why haven't more accomplished studios with bottomless money achieved this by now, instead of relying on the traditional way of optimising and adding DLSS and FSR?
I started watching Linus when he was doing videos out of some back room at NCIX. Of course, I'm Canadian and middle aged, so I used to shop at NCIX.
I'm now used to not owning games; game publishers are going to have to get used to the fact that I'm not paying $70+ for a broken/unoptimized GAME RENTAL!!
I played Mad Max two days ago and let me tell you, I was still in awe of how the graphics hold up and of the visual style of the game
I think art style matters more than photorealistic graphics.
Mad Max has photorealistic graphics though, at least in my eyes. It doesn't really have a "style" like Fortnite or something like that.
20:33 this whole Nanite discussion is so frustrating because he doesn't understand the premise of Nanite in the first place. Yes, Epic is selling it as a silver bullet, which it isn't. But it opens up the possibility of insanely geometry-rich worlds that were not possible before. It has a ton of issues, sure, like the relatively large base cost, but it was not designed to make geometry free; it was designed to allow for MORE geometry. All his tests are based on the idea that Nanite will make geometry cheap, and not a single test he's done is in a real-world game level, only ever small test areas.
I think it’s crazy this kid knows so much and is so determined to change the bad gaming trends that are metastasising, I really admire him
It's like we went full circle, from using dithering to simulate blur and shade on old CRTs, to using dithering again to simulate blur and shade on modern TVs because of TAA 🤣
TAA was the answer for Deferred Shading putting an end to the use of MSAA. I miss MSAA and SGSSAA.
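The comment above compresses a real technical story: with deferred shading, every G-buffer attachment must store one value per MSAA sample, so memory and bandwidth scale with the sample count. A rough sketch (the 20 bytes/pixel G-buffer layout is an assumed, typical-ish figure, not from any specific engine):

```python
# Why MSAA became impractical with deferred renderers: G-buffer memory
# scales linearly with the MSAA sample count. Bytes-per-pixel here is
# an assumed illustrative figure, not from a specific engine.
def gbuffer_mb(width, height, bytes_per_pixel, msaa_samples):
    return width * height * bytes_per_pixel * msaa_samples / (1024 ** 2)

no_aa = gbuffer_mb(1920, 1080, 20, 1)  # ~39.6 MB
msaa4 = gbuffer_mb(1920, 1080, 20, 4)  # ~158.2 MB before shading even starts
print(no_aa, msaa4)
```

TAA sidestepped that cost by amortizing samples over time, which is exactly where the smearing trade-off comes from.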
I think it would help if he was a bit more matter of fact about presenting his findings, he gets progressively more worked up with every new video and it's not helping his case.
Given the numerous UE5 games with weird issues and the lack of shutdowns from other devs (that I've seen), I'm inclined to believe him, but I'm just a layman. I'm also on the side of reducing game budgets and having more reasonable hardware targets, and it feels like UE5 is in the way of that.
I first noticed what devs were doing with TAA in 2018 with The Crew 2, the first game I played that forced you to use TAA. Naturally I wanted to fix the blurry image by turning it off, so I did the config edit or whatever it was, and parts of the graphics broke, like shadows and reflections; they turned into a glitchy, grainy mess. Shortly after, they released a patch which stopped you from being able to remove TAA, which, while extreme, is fine because the game looked horrible without it. But their insanely strong use of it made the image blurry and in some cases warbly as the effects and some geometry, like trees, tried to reconstruct themselves. I also noticed it in the RE2 remake, except there the option to turn off TAA was in the settings, and doing so caused Leon's hair to become a glitchy mess. The fact that this is STILL what it's being used for is ridiculous.
I was a layman, and now I'm an indie dev. It's not just Epic's fault; you would be amazed how many AAA studios don't do the basics of optimization. They just throw in garbage meshes and think Nanite will solve it. Sometimes they don't use hardware Lumen just because it's off by default, and you only need to click a checkbox. It's beyond stupid.
@@mrxcs I remember when UE5 was first shown off and Nanite was presented as if you could stick original million polygon models in the game and it would just sort it out for you :|
@@mrxcs You're an indie dev and you believe AAA game dev studios don't do any mesh optimization? lol
@@cikame I left a comment on Threat Interactive's original video, noting that perhaps his content would be better received if he was less "aggressive" in his presentation. There's an air of indignance that I find off-putting. It lessens the value of the message he's trying to put out.
If hIs studio's efforts yield the benefits he suggested, then let the work speak for itself. Beating your chest is a waste of energy.
@@cikame The likely reason he's getting worked up is that a lot of people who know more about game development tech have been telling him he's wrong, on multiple occasions, about the stuff he's trying to present, and he cannot handle the criticism levied against him. He also had a track record of incorrect takes before he made his videos.
What's wild with the PS5 is that all of its best-looking games are ones that also had to be optimized for the PS4 to ensure sales and were just patched for PS5, while the actual PS5 exclusives have arguably less visual detail and unplayable performance modes.
In an era where graphics have plateaued, why would anyone buy a new console or a new GPU? Oh right, it's because the games run worse while looking the same! Big companies can't make money selling hardware if what you already have can run every game out there.
Because games nowadays are programmed by offloading all the optimization onto the hardware's automated processes.
My favorite game in decades has been Borderlands 2 and because of the art style it still looks pretty damn good, if they slapped a new coat of paint on it and added BL3's movement mechanics I'd pay full price for it
i like this video from Threat Interactive but i have some doubts. I think he takes this topic way too lightly. And to say that Digital Foundry is spreading misleading information is hefty. TAA was invented because all the other AA methods caused much more trouble. I am from the older days and no one seems to remember how awful FXAA was, or how insanely performance-hungry MSAA was.
My friends worked at game companies in the early 2000s. They mentioned that back then the bar for game devs was really high; companies even had dedicated optimization teams full of code wizards. Back then game devs often used their in-house engines or had a deep understanding of the third-party engines and code they worked with.
Thanks to UE3 and UE4, the bar for game devs has dropped significantly, but now most devs have no real understanding of the engine or code they work on; they've become more like assembly-line workers. Compare their work to old-time devs like Valve and you will see huge differences. But when I talk about that with the younger generation, they just claim it's just art style, which makes me wonder how their eyes or brains got damaged beyond repair.
Those wizards are probably expensive and hard to replace. Big companies hate that.
They want replaceable workers who are cheap.
Also, as far as the Digital Foundry video on TAA goes: they spent 15 minutes highlighting what's good about it, and 15 minutes highlighting its many flaws. So.
@@cloverraven yeah. When he started talking about the DF video as though it was sucking off TAA, I knew this kid likely wasn't worth listening to, since he was misrepresenting facts that I knew to be true, so I wouldn't be able to trust that he's not misrepresenting anything else he says.
No matter what else he says, whether or not it's accurate on an individual point, I can't trust his conclusions in the gestalt.
There's a massive post on the UE5 subreddit responding to his most recent video (just look up threat interactive on UE5 subreddit if you're interested)
Turns out the 'optimizations' he presented in the video are just standard practice when developing a game in UE5. He has no clue how to use Unreal Engine correctly so his results are inaccurate at best and deliberately misleading at worst.
When I mentioned this response in the comments section of his video he removed my comment. Pretty much says all that needs to be said about this grifter.
I'm surprised people are falling for this crap. He labelled Digital Foundry's TAA video as 'Ignorant' when Alex clearly pointed out there are benefits and issues with TAA and even said it should be an option to toggle off. He's clearly not arguing in good faith so DF shouldn't bother responding
@@maximusprime1994 If only they were standard practices.
@@NathanaelRouillard what exactly has he lied about?
@@vast9467 he misrepresents the content of the Digital Foundry video. Don't know why you jump straight to the word "lie" when I'm only saying his misrepresentation makes him a source I cannot trust, given the very basic nature of the facts being misrepresented, and the triviality of checking the claims he makes about what Digital Foundry says.
Other stuff could be argued as differences of opinion based on different areas of expertise, but since I can't trust him not to misrepresent what DF say (by ignoring the existence of large parts of that video) then I can't trust the other things he says, since I can't tell if he may be misrepresenting things there too in ways that I cannot validate.
As to the lie aspect: I don't go into his motivations. He may be blinded by his biases, or might just not know what he's talking about. It's also possible that he's just a liar. I don't have any insight into that, so I'm not going to speculate on why, nor cast aspersions on his character. You should probably ask yourself why you jump straight to the conclusion that my judgement of his reliability as a technical expert is a judgment on his character, and double check that it's not supporting your own biases that lead you to do so.
Happy to see people talking about what are, in my honest opinion, real problems with modern graphics in AAA games. A game can be realistic with different ways of showing those graphics... and it needs optimization... not upscaling.
The vision should not hinder the execution from being well made.
it's just funny how he always talks about Threat Interactive as a company while it's clearly just him, or am i wrong?
Definitely just the one guy. He always says “his studio” but there’s no external funding and no other team members. And no game or preview of a game. I think he makes valid critiques of TAA and unreal but also misses some critical trade offs. It’s a very surface level and one-sided overview with lots of spin. The “our studio” spiel is the cherry on top 😂
@@PaulSmith-nd7gd I agree with you. The Dunning Kruger effect is on display.
@@Chris-ik2jp not if he's right about TAA and unreal
@@SoulbentAnime No, the guy seriously has the Dunning-Kruger effect. The thing is he hasn't made anything, not a game or even a tech demo; literally all he does is run Intel's analysis tool on a game and complain. His optimization video was really shallow, only showing an insanely badly optimized scene that he easily improved with a few super simple, very basic techniques. But that really works on people who don't understand the subject; they just see 2x better fps. If he made a different rendering engine, or a rendering plugin that sped up rendering, I would have a different opinion. Yeah, TAA has its tradeoffs, but if it was really that bad, the whole game industry wouldn't use it. A separate matter is that devs force Lumen, Nanite, and ray-traced shadows in games just because they're enabled in UE5's settings by default. I've also heard his Discord has a cult-like following where he bans everyone who even tries to say something different or ask questions.
@@HenrichAchberger That's not the Dunning-Kruger effect. The Threat Interactive guy is unlikeable and comes off as a whiny know-it-all, but he is right about several things. He's not completely full of crap. He's got some points, like how TAA is terrible; I've been saying that since way before he was probably born. Also, you don't need to have made a game to critique one. We wouldn't be able to criticize anything if we went by that metric. I don't need to have made a game to say that a game sucks, I just need to play it. If he's right about some things (which he is), then your argument doesn't hold as much weight.
The first time I noticed these kinds of problems was Dragons Dogma 2. It was the first time that it felt like I had to "fix" something before I could even tolerate playing it, let alone enjoy playing it.
Threat Interactive is a Patreon grifter exploiting the layperson's annoyance with bad implementations in modern games. The problems he highlights are all correct, but he proposes a panacea and intentionally ignores the drawbacks of his own approaches
he's an idiot who deserves to be demonetized for his disrespectful attitude toward creators who have been in the game longer than he has. Just an all-round whiney child
Bait used to be believable
Stop the yap, you say he is right, so he is right. I don't care if he is an ass.
tbf Epic's stuff is buggy even in Fortnite. I'm on a 4090 and can't finish a second match on DX12, which is required for all of the max graphical settings like Nanite and Lumen lighting, without crashing. Even then, natively it's really unoptimized and runs at 70-85 fps with heavy stutters. It's also weird because all of this only started with the last Halloween update; ever since then I've gotten crashes and worse performance. Now I'm on DX11 and can't even use ray tracing, Lumen, or Nanite.
It's simple. They are offloading the cost of development onto you by making you need a 1200-watt rig with insane specs to run their POS software.
Epic is building UE not for Fortnite, but for Disney
That’s something I’ve noticed too. Unreal engine is now being used in movies and other stuff not related to games and it’s becoming a jack of all trades, master of none.
Glad to see this channel and video being shared and talked about. A glimmer of light in the dark.
Hate TAA, hate forced upscaler. Simple as.
TAA ghosting on Squad is VERY noticeable. When they started using it I wondered if my GPU was dying…
The more the technology grows, the lazier the devs become. Technical limitations were a gift.
This is exactly it. Seems like all they want nowadays is to offload everything to the hardware based automation like DLSS, AI and all that crap, which the end user has to end up paying for.
LIES! Threat Interactive is asking for 1M dollars in donations. I agree with his message but DO NOT DONATE TO HIM. He has already closed replies on Twitter, closed his Discord server where some of his answers compromised him, etc.
Honestly, I wish developers relied more on art direction and scaled back the games a little.
There were so many tricks developers used. Sure, it took more time, but they did so much more with less.
I am currently playing Sleeping Dogs: The Definitive Edition. While you can tell it is an older game, the NPC count, art direction, and animation variety do more to immerse me than most modern games.
I also love games like Sifu with the stylized textures. While it doesn't look realistic, it is one of my favorite looking games in modern times. Unicorn Overlord is another gem that had amazing art direction.
I feel like a style like that could allow more AAA devs to allocate more resources to gameplay. I felt more immersed in these games. They also have better visual clarity overall.
The thing is that people also hate game delays. And there are so many things that can go wrong during the development of a game, and these performance optimizations will be the first to go, because not enough people care enough to skip buying a game over them. It is a typical case where the nerds are correct in theory, but in practice nobody cares enough to shift priorities away from all the other things that have a bigger impact on what people end up buying.
I’m not going to act like I understand one fourth of what he’s saying. But, I get the gist.
We should really start making game studios develop their games primarily on mid ranged hardware that’s comparative to what the majority of people actually run. Force them to live with the problems, then they’ll want change
DF video on TAA is much better, it explains better, provides historical context and better arguments for TAA and against TAA.
Zoomers prefer concise, confirmation-biased content over factual information.
And that's why they refuse to even mention Threat Interactive's arguments?
@@kaloryfer99999 They do mention issues with TAA ?
@@kaloryfer99999what arguments?
Personally I have a very hard time enjoying a game using frame generation, it just feels so weird when you play it.
See this is why I have a problem with upscaling and frame generation in general. It becomes a selling point for hardware making that more expensive overall while at the same time, everything said in this video. They aren't optimizing for native resolutions which should be the first priority. I don't wanna use DLSS or FSR if I don't have to.
Yes, the downside of forcing people onto one set of tools is that it limits you. Real talent comes from the worker, not the tools he uses.
Hi, game dev here putting my 2 cents to this.
While it's nice to see an ambitious young man, his claim that he is going to revolutionize gaming and that everyone else is making terrible choices and errors is nonsensical. The reality is that this person has developed nothing so far and doesn't even seem to show off concept or pre-alpha builds of any project that he actually plans to turn into a real game. While he does seem to have an above-average understanding of development, it's not like he is ready to begin development of some large, groundbreaking product like he seems to think. It's also easy for laymen on this subject to listen to what he says and simply nod along, because he sounds informed and intelligent, and that gives a false sense of him being correct. While I have no desire to outright accuse this young man of any ill intent, there is a worrisome red flag here. He claims to have a development studio and seems to imply he can do what all of us cannot, but, like I previously said, where exactly has he obtained the funding to do this? It's almost certain that he has not, and no investor or publisher is going to provide it. He almost certainly has zero people employed by the studio currently, and if he did, his studio website would likely show them off and mention whatever experience his current and no doubt small team has... this does not exist at all, however.
Everything that is going on here is a red flag, and it sounds like this young man is going to crowdfund. Even if we assume good intent, realistically any ambitious project like the one he claims he could make would go bankrupt very quickly if crowdfunded. Even experienced dev teams make this mistake, and this young man has not demonstrated any management skills either. Meanwhile we've seen games like Unsung Story, under Yasumi Matsuno and a team of Square Enix vets, fail to deliver the game they crowdfunded. Even the best of intent and tons of experience isn't always enough. The worst-case scenario, however, is that this young man doesn't have good intent, and how many times have we seen people just like him swindle through crowdfunding? While he has yet to actually do this, would anyone be surprised?
While I think you personally mean well, you shouldn't be showcasing him and building a form of trust between your audience and him, as this increases the chances of your own fans losing money. Whether he has good intent or not, the most obvious path this goes down is people losing their money and getting nothing in return. It's a matter of when, not if, unless people like yourself warn audiences ahead of time.
Now, sadly this is getting extremely long, but I'll touch a little on the actual subject of development. The way he talks, he seems hyper-focused on graphical elements. Graphics are of course important, but even a baby in the industry would know there is more going on in a game than graphical fidelity. Think of it like a car: just because you don't see what is going on under the hood or the entire body of the car doesn't mean those things aren't extremely important. For example, there are much more CPU-intensive workloads going on, and much larger RAM requirements as well, than in the previous generation. It's also important to note that the RAM is shared between the CPU and GPU of the PS5. Now let's say Capcom wanted to port Dragon's Dogma 2 to the PS4 for whatever reason. It simply wouldn't be possible without greatly reducing the utilization of the CPU and RAM and making massive sacrifices way beyond graphics. He, however, seems completely unaware of this element of game development. This is why in Dragon's Dogma 2, even on the current gen of consoles (the fact that he calls the current gen "next-gen" is embarrassing; no one does this except for marketing purposes, which reveals he has zero industry experience), performance takes such noticeable hits in towns.
It's worth noting that graphical returns will always be diminishing, and we have been seeing this since the PS3/360 era of games. Let's say the PS6 was announced tomorrow and it had 4x the power of the PS5 Pro. No gimmicks, genuinely 4x in every single metric. This doesn't mean you would see 4x the graphical effects and textures, framerate, or resolution capabilities. It's also worth noting that insane graphics, even more so in games that try to go beyond the standard, come at extreme cost in both budget and development time. Even today's standard is extremely costly, so going beyond it is crippling, and how exactly is this young man going to overcome that as well?
Lastly, his "realism" complaint. Other graphical styles are in no way inferior, as he seems to make them sound. It really depends on what the game itself is aiming for, mixed with genre all the way down to subgenre. A serious sports title is typically going to be a better experience looking as realistic as possible; Madden obviously should aim for this, but a silly sports title like Mario Golf would be bizarre and off-putting if it were made with realistic graphics. Forspoken is a great example of what happens when you mismatch these and how it usually doesn't work. The game is a fantasy RPG, something that typically doesn't aim for a realistic look, and rightfully so. Forspoken did the opposite, however, and if you play the game it just feels off to have these unrealistic creatures and magic while trying to look as graphically real as possible at the same time.
Anyhow, if you or anyone else read this, thanks for reading it through. Just be cautious if this young man starts crowdfunding, like I suspect he will. Hopefully he is just young and a tad overconfident and ambitious.
I agree. Unfortunately, hate always gains more traction on the internet. Gamers don't understand how game development works, and it's been like that for an eternity. Threat Interactive is blatantly lying and pulling things out of context. Some of his points are valid, but he puts them in a different context and exaggerates the issue.
The guy is a snake oil salesman. It's crazy how he elevated himself to a position of authority by lying and tapping into gamer rage.
I'd say modern gaming is in a rough spot right now. So lately I've been replaying my old Wii games, and it's always fascinating being able to play them at 4K and still be able to turn the game speed up to 400% for slow scenes or to speed up loading screens. Plus, no TAA. It's kinda funny that playing an old game on an emulator looks sharper than a modern game made with the most advanced graphics.
I just learned the reason the Wii was blurry wasn't that the console wasn't good; it's that you need to use component cables instead of composite cables. It also uses interlacing by default, plus anti-flicker (designed for CRT TVs), and once you disable those through mods the console looks way crisper and responds better.
Still 480p, but 480p instead of 480i.
But yes, emulated is a superior experience otherwise.
Create the problem (better graphics which are not optimised and tank performance), then sell the solution (upscalers which bring the performance back, but with degraded visual clarity, cancelling out the better graphics that tanked performance in the first place). Genius.
The thing really is that in the past, game developers explicitly worked to understand the hardware they were targeting and find ways to use it to its maximum potential, hence why console titles were always extremely optimized, and why games were optimized in general. Currently, no such effort is made whatsoever for the wide majority of 'AAA' titles, where optimization is a complete non-priority and no attempt is made to understand the hardware directly, never mind use it to its fullest potential. Think of extremely late PS2 titles vs. early PS3 titles: on paper one should be superior to the other, but it wasn't, because the last PS2 developers were incredibly familiar with the console's architecture and knew how to squeeze every droplet of performance out of it, enough that some PS2 titles were more than on par with PS3 quality and performance at the time of its release. The same goes for the transition from PS3 to PS4. Obviously much of this doesn't apply to PC, but even there the priority was always optimization, because if somebody can't run a game smoothly, that's a lost customer, especially given the prevalence of Steam and its practically guaranteed under-two-hour refund policy. Even outside that scenario, if you made a game that ran like shit for the wide majority of people, it would become very well known that the game ran like shit. That clearly applies today, since it is a very well known fact that 9th-gen games often run like shit and are miserably optimized, not only in performance but also in storage space.
The people who tell us upscaling is so good that we can't spot a difference are the same people that used to tell us it's impossible for human eyes to tell the difference between 30fps and 60fps.
There is a visible difference in resolution and fps.
However, game content can be intentionally made for 30fps, a lower resolution, or a 4:3 aspect ratio. The whole idea that all games should be 60fps at native resolution and aspect ratio is wrong.
@@gruntaxeman3740 What? 60fps should be the standard, end of story. We had it before; we should have it back. I'm not paying to play an underperforming, blurry mess.
You just don't know the basics of visual storytelling. End of story.
There is a very good reason why almost all movies are shot at 24fps: that blurriness hides acting flaws, bad CGI, and inaccuracies in physics effects. That means lower fps can look better and be more immersive. The same thing happens in games. Games can, however, be based on action, and that doesn't work at 30fps; it works better at 60fps. But if a game is slow-paced and the camera panning is controlled, 30fps can look better, just like in movies.
Aspect ratio is another thing. The Jurassic Park movies are a good example: the very first movie was the best one, and it was not shot in a 2.39:1 aspect ratio. The aspect ratio was 1.85:1, because dinosaurs happen to be tall, so they fill the frame better.
Image sharpness should not be maxed out on every kind of content. Game graphics are the developers communicating with the player, and it has long been known that, for example, horror works better if the monster is not clean and sharp. An image can be intentionally made black and white, or soft-looking, for storytelling reasons.
It is just the dumbest idea ever to insist that all games be 5K resolution, 21:9 aspect ratio, and 60fps, because that is not ideal for all content.
@@gruntaxeman3740 Sure, but now you have to question whether the developers are processing inputs and tick functions using delta time, because a lot of games run horribly at 30 when they could be very responsive if programmed well. Of the 30+ games I've tried and modified on the Steam Deck, only about five or six do it correctly, to where the impact of a low framerate is minimized.
Computers and consoles are so fast that they can easily run games at a constant 60fps if wanted, including the Steam Deck. Games are just doing something incredibly stupid.
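For anyone wondering what "using delta time" in the comment above actually means, here's a minimal sketch (names and numbers are made up for illustration, not from any particular engine): per-frame movement is scaled by the frame's duration, so gameplay speed and responsiveness don't depend on the framerate.

```python
# Frame-rate-independent update: movement is scaled by delta time (dt),
# so simulating 2 seconds at 30fps covers the same distance as at 60fps.

SPEED = 100.0  # units per second (hypothetical value)

def simulate(fps: int, seconds: float) -> float:
    """Advance a position using a per-frame delta-time step."""
    dt = 1.0 / fps
    position = 0.0
    for _ in range(int(seconds * fps)):
        position += SPEED * dt  # scaled by dt, not a fixed per-frame step
    return position

# Both runs cover 2 seconds of game time and land in the same place:
print(simulate(30, 2.0))  # 200 units
print(simulate(60, 2.0))  # 200 units
```

A game that instead moves a fixed amount per frame would run at half speed (and feel twice as sluggish) at 30fps, which is exactly the badly-programmed case described above.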
Optimizing shaders please wait.......
So funny, because the game ends up running like crap anyway.
I think the main problem is that managers who might not have a technical background decide where to focus and what the timelines are, so I won't hold my breath that other studios will invest in what you shared in this video.
Hey Luke, I don't know you; your video simply popped up in my recommendations. A bunch of what TI is saying is either untrue or misleading. There are a LOT of inaccuracies in his research that many people have tried to point out across his videos, and he either ignores them completely or brushes them off as if they didn't matter anyway, because "nanite bad", "Epic bad".
Let me know if you are interested in learning more, and I'll post some of the stuff here.
Yup, TI is a walking example of the Dunning-Kruger effect. He knows just enough to convince regular gamers that he's an expert with more knowledge than render engine devs with decades of experience. His videos feed into the gamer outrage over TAA, and the fact that their GTX 1080s can't run games made in 2024-25.
@@Arms2 Guys, this issue is easy to solve if you know so much about TAA: just make your own video explaining, with real examples, why TI is wrong. You are free to enlighten everyone on the subject. Until then, you just waste everyone's time by calling the guy ignorant without giving concrete examples of what he's wrong about or why.
@@Anon20855 You say this as though making a video and getting it in front of millions of eyes is easy.
@@Anon20855 The very DF video about TAA that TI references already talked about all the benefits *and costs*.
The TLDR, though, is that TAA solutions are required to actually fix temporal aliasing (shimmer on thin detail) in motion and can resolve subpixel detail when done well. And once you have TAA, you have a much better-performing way to implement pretty much all your post effects, to the point that implementing a toggle for TAA means you need a completely separate implementation for *all* those effects (and all their options, etc.).
It's mostly upside, but hard to get right in every situation, as it requires good motion vectors that tell it where each thing was last frame, and that's especially hard when you're dealing with transparencies and reflections. Done poorly, you get blur and smearing.
Personally, I prefer occasional smearing to temporal aliasing, but yes, it should be an option; clearly there are plenty of people who don't notice the latter as much.
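For context, the core temporal-accumulation idea behind TAA can be sketched in a few lines. This is a toy 1-D illustration with made-up numbers, not any engine's actual implementation: each new frame is blended into a history buffer that has been reprojected using motion vectors, which averages shimmer away over time, and which smears when the reprojection is wrong.

```python
# Toy 1-D temporal accumulation (illustrative only). A "frame" is a row
# of pixel intensities; the history buffer is reprojected by a motion
# offset, then blended with the current frame.

ALPHA = 0.1  # weight of the new frame; lower = smoother but smearier

def reproject(history, motion):
    # Shift history by the per-frame motion (whole pixels here; real
    # TAA resamples at sub-pixel offsets from a motion-vector buffer).
    n = len(history)
    return [history[(i - motion) % n] for i in range(n)]

def taa_resolve(history, current, motion):
    prev = reproject(history, motion)
    return [(1 - ALPHA) * p + ALPHA * c for p, c in zip(prev, current)]

# A flickering thin edge: pixel 4 alternates 0/1 every frame
# (classic temporal aliasing on sub-pixel detail).
history = [0.0] * 8
for frame in range(20):
    current = [0.0] * 8
    current[4] = float(frame % 2)
    history = taa_resolve(history, current, motion=0)

# The resolved pixel settles near the temporal average (~0.5)
# instead of flickering between 0 and 1 every frame.
print(history[4])
```

The failure mode the thread is arguing about falls out of the same math: if the motion passed to `reproject` doesn't match how the scene actually moved (transparencies, reflections, bad motion vectors), stale history gets blended in, and that is the ghosting/smearing people complain about.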
@@Anon20855 Ah yes, good idea! Let me just type out a long-ass essay in a comment section so that I can try to convince one YouTube commenter named "Anon" that TI isn't exactly a good source of information, only to have that one commenter still tell me I'm wrong, that games looked better 10 years ago, and that modern devs are lazy. lol
The worst thing is that without seeing what games would look like without AI upscaling and TAA and whatever, you'd think that's just how the game looks by itself, but in reality that's just not the case. I remember playing Cyberpunk using an upscaler, and for some reason it just felt wrong, like I couldn't get the visuals right. After some tinkering I turned off upscaling, and it looked and actually felt so much better to me.
Dude's face looks HD. Uncanny. Optimized.
his face hasn't fully rendered yet
Looks HD? What?
Yeah, the devs might be interchangeable from studio to studio, but what those devs don't realize is that it just means they're easily replaceable by overseas workers willing to do the job for a quarter of the cost.
Honestly, he’s a smart and talented kid, but I love how the “experts” out there have zero experience with how a development pipeline with hundreds of people works.
I understand wanting to take a compromise to smooth development out, but this is the same route modern car manufacturers have gone, and it's why none of them are worth buying now.
We don't need to "understand" service providers when the service they provide is dogshit.
Fire the management
Graphics, in my opinion, are important in some games but not others. Whether you're for more photorealism or you really don't care, we should all agree that games need better optimization, not shortcuts.
It's ironic how Digital Foundry themselves never ever said that "UE5 and Epic are the problem", only highly regarded "investigative journalists" say that, citing DF.
@@LardoAG Digital Foundry run games at 4K on ultra on their high-end PC with a 4090, where these issues don't exist thanks to the high resolution, and they love to comment on how everything looks using heavy zoom, so of course they don't talk about these issues.
@@Extreme96PL Even if that were true, which it clearly isn't, they were always talking about shader compilation, traversal stutters, and uneven frame pacing. Those issues exist regardless of hardware, and your average dev wouldn't even know about them if it weren't for DF. Also, DF always talks about TAA and upscaling, thoroughly comparing them in motion, and they clearly have a multitude of test configurations, not just a "high end 4090". Their latest video on The Thing featured a Windows XP PC with a Radeon 9700.
They've made numerous videos talking about unoptimized Unreal Engine games. They have dedicated videos on multiple topics like TAA, the stutter struggle, PC optimization, etc. Do you even watch DF?
@@theanimerapper6351 That's the point: "unoptimized UE games" vs. "UE is the problem". "Unoptimized UE games" implies that devs didn't optimize them, not that UE somehow prevented them from optimizing the games. I've heard DF cautiously suggest that the fact that this is a widespread problem indicates some fundamental issues with the engine, but they're waiting for CDPR to confirm that, since they're way more competent than the average dev.
@@LardoAG If what you said is true, then how is Digital Foundry being unreasonable? CDPR even did a talk discussing the optimizations they are making for The Witcher 4 that will hopefully help other UE5 games. Idk what else you want them to say 😂
21:51 I've been around long enough to have seen the era before TAA was introduced and the kinds of graphics solutions that were being presented at conferences and in papers. One of the things about the graphics world is that graphical artifacts are a game of whack-a-mole: when you try to create a solution, you also create a different problem. In film CG, the trade-off can be made favorable because you just solve enough of the problems to get through that scene, then do something completely different for the next one. But in games, it has to do everything flexibly in every scenario, so in so many instances, the solution is a non-solution by convention: "we'll drop it on the floor. The models are going to pop in and the characters are going to clip through each other and the lighting will glitch out in certain places." AAA can often push things a little more in the direction of a movie by running different engine code for different scenes, but that's the kind of thing that really does blow up your technical budget when you go deep into it, and it creates a new set of tradeoffs when you switch between engines - is it a load screen, is it a hidden load where you walk through a tunnel, how does it handle gameplay scenarios during the transition and so on.
The downside of having to wear those trade-offs is that it does open the floor to gamer conspiracy. Whichever thing you chose, the negatives of it will be attacked, and grifters will appear proposing easy answers.
TAA happens to be a visible target now, and there are some valid reasons for that. At the same time, it was a solution that created a lot of flexibility elsewhere in the shading pipeline - AA solutions are part of a compositing stack, and have to be thought through in terms of "how does this impact geometry, how does this impact textures," and so on right up through the final post-processing filters. Early 3D games were texture-heavy - Quake and the like, very low-poly angular spaces - so those engines and the GPUs also focused really intensively on aliasing in textures (bilinear, trilinear, anisotropic filtering, mipmapping...). When the scene is made with Nanite, you have the opposite situation, where almost all the pixels are geometry edges, so you need a solution that works well for edge aliasing. TAA does OK with this - it's the kind of glue that you can fit in wherever.
From an art direction standpoint it often is just shuffling papers - you still are constrained in what's easy to show and the cost of the result, just in different ways. The whole game graphics space exists to sell you more chips, and the AAA game industry sits downstream of that: Nvidia says what the next gen is supposed to look like, and everyone just gets in line and implements what their researchers published. As a dev you can always opt to ignore Nvidia and take the graphics in a much different direction, but in many cases this means "missing out" on something they do help with. It's actually easier to make that jump as an indie since you can assume your scenes need to stay simple in their production cost and density of assets, so there's no burning need to push the engine to render more stuff.
Most people discussing the fall of graphics, and the unreasonable increases in cost for the quality and performance of said graphics, lack an understanding of why it has happened and keeps happening.
TAA is not at fault. It can worsen the situation, but it was a reasonable solution to aliasing on the 8th-generation consoles (which had a lot of memory, but mediocre GPUs and horrible CPUs). It could also be used as a denoiser at several stages of the pipeline, further saving computation time.
The move to Unreal Engine is also a factor, but it didn't come from nowhere. The cost of developing a AAA game skyrocketed and many studios could not maintain their own tech, so their teams moved to third-party engines (Unreal, CryEngine, Unity, etc.). Those engines aren't bad, but they aren't made for everything those studios want to use them for either.
That's when the studios should use their access to the engine's source code (which they all pay for) to make it fit their needs. But here's the catch: they left their own tech behind (if it even existed to begin with) because they could not maintain or upgrade it, so how could they change someone else's engine to fit, or fit better?
There are still a lot of technically competent games being made, but those are mostly made by teams that have the expertise and their own tech. There was always a gap between those teams and the rest, but now it feels like an abyss, and an insurmountable one.
Yeah, I agree with some of what he says, but I can't get on board with the argument that TAA is the cause of these problems. There are legitimate reasons why it's used: it's much better than FXAA, MSAA just isn't a viable option with modern lighting engines, and supersampling is way too performance-intensive.
If that were true, then they should manage their investments better. You have AAA games that cost hundreds of millions of dollars going the easy route; where is all that money going?
@screaminjesus To be fair, I never said it was the problem, just part of it. I even pointed out why (in my opinion, at least) it became so widely used. MSAA is a possibility even today, but the cost of every type of denoiser for various effects, plus the extra time to integrate it all, makes it hard to justify for most studios (I believe Forza Horizon still uses it primarily, and it looks good).
@Gatitasecsii On that front I can't comment, as I am not in the video game industry, and anything I could say about it would just be an opinion based on an outside perspective. I am, however, a software developer (and had some interest in computer graphics in college). From that perspective, it is simply perplexing that most games aren't developed with agile software development methods, which means it is really hard to shift direction once development starts. Any heavy alterations can really impact the cost and schedule of the project. Capcom has started to try agile development with some success, so 🤞 the industry will catch up.
As for the cost, it is to be expected. Games now are huge projects with a thousand-plus workers across programming, art, writing, etc. Any project that size, lasting for years, would cost tens of millions of dollars at minimum. In a country like the US, that cost can easily reach into the hundreds of millions.
To be honest, we should be at a point where a game dropping to 60fps is considered poor performance and 100+fps is standard on recent hardware. Even if graphics didn't change much, performance should have improved in general.
Can't wait to see Threat Interactive create a game under budget with all those new visuals, unless it's a tic-tac-toe game :P
It is easier for that generation to belittle the work of others while benefiting from their efforts for minor financial gain through content creation.
Most AAA games have budgets of $250 million or more. They should not be excused if they are delivering a blurry, smeary mess that runs worse than 10-year-old games while also charging $70.
I have a feeling no one will ever see their game.
Self-publish and don't work with ego-inflated credentialists.
I'm a solo dev building out a garage to work with other local devs, using a pay-based-on-release-success model, for people who pay their bills by other means so they aren't pressured.
The game is CRISP and runs well even on media center PCs. It's a shooter.
id Software's engines are one of the exceptions. The Indiana Jones game has path tracing and runs well.
DOOM Eternal was sooo good broo, why can't all games run like that 😭😭
"We use Unreal Engine" sounds to me like corpo speak for "our working conditions push experienced optimizers away, so we need new developers more than 60FPS".
I wish AAA would just make shorter games with great pacing at higher base price, instead of microtransaction storefronts with filler content for +66% Steam discount.
I think that we should also think about ways to convince executives at publishers (non-engineers, non-developers, etc) that this is a big enough issue that warrants attention and funding to fix, since ultimately they make the final decisions when it comes to finances and cost cutting, and are probably the reason many studios switch to unreal. They’re not gamers nor developers, so they won’t really understand the problems even if they watched every Threat Interactive video. We need to speak their language somehow lol.
Luke, I wish it was true, but unfortunately studios are laying off a lot of employees and looking for people with several years of experience and at least 2 released titles, and now there is no room for new developers, even interns.
The layoffs we've seen were in response to the inflation issues. Now that inflation is down, there should be more positions opening up relatively soon.
@@Smash_ter That's not how it works, nor what caused it (kinda, but not just that).
Feel free to look up the amount of layoffs that happened in December in gamedev.
And then do some basic number crunching on this, the total of layoffs in past two years, the total of game-dev students graduating, and the amount of open positions.
Do that homework and let that sink in.
@mercai Reminder that during the lockdowns more people were being hired into these positions. It was after the lockdowns ended and inflation kicked in that the layoffs began unfolding across the entire industry. I honestly don't like it, but we could see them start hiring again since inflation is now under control.
Probably the funniest thing about console gaming, and gaming in general really, is that turning anti-aliasing off gives you better performance and makes the game look better, and a lot of games don't let you turn it off.
Upscalers are a crutch. When I built my first PC, the whole point was to avoid the upscaling tech that was widely used on consoles. I just bought Alan Wake 2, and while optimizing my settings to make it run as well as possible (4K on my 6950 XT), I have the game set to the low preset with other stuff like enhanced shadows turned off, and the game still looks incredible. Meanwhile, FF7 Rebirth looks like absolute fucking garbage if you want decent performance.