All I hope when watching DF videos is that the developers and publishers are listening and taking notes. You guys do such a good job of offering advice through your videos that would benefit everyone.
Such a shame they haven’t fixed this stuttering. With DDR4, DDR5, and high-bandwidth cards with DirectStorage support, stutters should be a thing of the past!
Just found this channel, and I know I'm in the right place when the viewers are actually smart and not just leaving bot comments. I don't think many people caught this because it's complicated 😂
The shader comp is insane in games like Fortnite, which on a fresh install means a few games that are literally a slideshow going from 144 FPS to 20 constantly until you play a few matches.
Omg is this why my Fortnite stutters? I never play but when I do get on I get constant stutters despite having a good pc and no stutters in any other game
@@wanshurst2416 If you've been playing for a while it's not an issue. If you do a cold boot it will stutter while it builds shaders. I tested how bad it can be with a 10900K/4090 PC vs a 7800X3D/4090, and while the AMD CPU was noticeably better in frametimes and average FPS, it's still bad for the first 5+ games on a fresh install.
I will take frames and no stutter over a window reflection or a better-looking brick wall every single time. It's really WILD that they are just now working on multi-core performance improvements considering how long multi-core CPUs have been around. It's also still wild to me that we are so obsessed with pushing for technologies that can't even run well at the standard resolutions we use now (4K is the TV standard, 1440p the PC standard) without upscaling, and even then you need top-tier equipment for an experience plagued by stuttering, just to look at a window reflection or stare at a brick wall to see how the sun reflects. I'm not saying we shouldn't be pushing for these new plateaus, but c'mon guys, let's build on solid foundations here and not sand.
Not to mention you need to be well versed in overclocking. This was well said tho. I’m one of the few that will build a god rig just to run it at 1080p since I care about input lag
In my opinion, the reason they want raytracing to work so badly is how much time it saves in development. Without raytracing, lighting needs to be baked and requires lots of trial and error to get scene lighting to look right. With raytracing, the engine and GPU do almost all of it for you in real time. It reduces dev time and saves money.
Since there are still several years until the release of these games (with further improvements to the game engine coming out in the meantime) and also because I trust in CDPR's abilities, I remain quite confident.
@@Z3t487 There's no guarantee that those developers are updating the engine in line with their game code; they usually pick one version of the engine and build from there. If they try to update the engine during development, many of the game's source files will break and it will be exceedingly hard to pinpoint bugs. This is why Unreal Engine 4 games are still being released: it took those projects that long to complete.
I'm an audio enthusiast much more than a graphics enthusiast, and I'm always afraid technology will forget us. Especially with so many people today using smart phones and portable consoles with headsets (or heaven forbid without them), and even home theaters moving towards "sound bars" and all sorts of other magic gimmicks. Which, on the other hand, I guess makes it easy to understand why it might be frustrating to develop audio for games. You'll spend your time carefully capturing all the foley sounds, recording crickets at night, composing and conducting symphony orchestras, mastering everything to a T, figuring out how to place everything correctly in a surround or atmos system etc etc., meanwhile knowing a large portion of your players will be playing those carefully crafted soundscapes through a "surround" bar made from soap box plastic in a giant, barren concrete room.
They've changed how Lumen works, but I doubt they went through the city project and tweaked things for 5.4, so things that were not problems in 5.0 will become issues in 5.4. Zero surprise there, really; it's just what happens.
That being said, it's been known for a while now not to use strong emissive values on small sources with Lumen, so this would never be an issue in a real game (if the devs are competent). It's doing real-time GI, god damn it; it's insane it even works at all at reasonable performance. I feel like people quickly forget how impressive and truly next-gen UE5 still is. It's not without problems for sure, but holy hell, which other free engine has a tool suite this impressive? Spoiler: there are none that compare at all.
@@quackcharge Unity compares. In fact, their Screen Space Global Illumination has very stable emissives. Maybe Epic can take a look at it. And runtime animation rigging since 2019. And before you mention Lumen, Unity has had realtime GI since 2017, except the geometry positions are baked, so light won't spill through an open door. Gray Zone Warfare is a disaster so far. They are realising nobody can run these games.
4:05 good performance gains, but what seems like massive downgrade in indirect lighting. There's flickering everywhere under that bridge, where it was fine in 5.0
I saw a video recently -- "Optimizing my Game so it Runs on a Potato" by @Blargis3d. He's making an indie game and was having the same compilation stutter problem. He solved it in kind of a genius way. When the game starts, before every level, he has a black loading screen. Thing is --- it's a trick. What he's *actually* doing behind that black screen, is playing the game at 10 times speed, walking in every room, loading every texture and killing every enemy. That way, all of the hitching and loading that has to happen, happens during that period. Then, when the 'loading' completes and the player plays the level, it's actually the *second* time that all of those assets are loaded. Thus -- complete elimination of compilation stutter. This was the first time I ever heard of this (I didn't even know such a thing was possible) and thought it was really cool. Thus, sharing it here. :)
The game is Bloodthief by @Blargis3d. The shader compilation trick is in the video "Optimizing my Game so it Runs on a Potato" and is indeed very cool!
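The reason the trick above works can be shown with a toy model (all names hypothetical; in a real engine the compile happens inside the graphics driver, not in game code): the first request for a shader permutation pays the compile cost, so touching everything behind a fake loading screen means real gameplay only ever hits the cache.

```python
class ShaderCache:
    """Toy model of a shader cache: the first request for a permutation
    'compiles' it (the source of the hitch); repeat requests are free."""

    def __init__(self):
        self.compiled = {}
        self.gameplay_hitches = 0  # compiles the player would actually feel

    def get(self, permutation, behind_loading_screen=False):
        if permutation not in self.compiled:
            self.compiled[permutation] = f"binary:{permutation}"  # stand-in for a real compile
            if not behind_loading_screen:
                self.gameplay_hitches += 1
        return self.compiled[permutation]


def warm_up(cache, level_permutations):
    """The 'fake loading screen' trick: request every permutation the level
    can possibly use before the player sees a single frame."""
    for p in level_permutations:
        cache.get(p, behind_loading_screen=True)


perms = ["wall_dry", "wall_wet", "enemy_hit_flash", "water_lit"]
cache = ShaderCache()
warm_up(cache, perms)
for p in perms:  # actual gameplay: every permutation is already cached
    cache.get(p)
print(cache.gameplay_hitches)  # 0: every compile happened during 'loading'
```

The hard part in practice, as the original comment notes, is reliably enumerating everything the level can trigger, which is why the dev plays the level at 10x speed rather than listing assets by hand.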
Because it's not that easy to do it properly, and a lot of studios using a ready-made engine do so because they lack the technical know-how and/or financial resources to either develop their own engine or dive deep into the quirks of a complex monster like UE. They import their art, design their levels, write their game logic in Blueprints, click the "build project" button, and hope for the best. Many devs aren't even aware of the typical issues, which is why DF does such an important job in explaining it over and over again.
The work involved isn't just a checkbox and a screen or something: the way shaders work in Unreal (and every other engine, really) is that you dynamically stitch them together, with potentially dynamic external parameters: think weather systems, characters getting wet or dirty, blending between different ground materials, adding effects for power-ups, that sort of thing (and those are just the most obvious ones). Unreal has a big list of these snippets, but doesn't know how they will be combined or with what values until you actually tell it to use them. Doing a shader precompile is basically the developer finding all the final shader combinations they use in a game (easily thousands nowadays), often by just running through it and trying everything, and telling the engine to compile them all up front. Unreal can't easily fix this without breaking the entire workflow that pretty much every artist has used for about 20 years now, and developers need to perform a huge amount of work for what they might not see as high value, since "it's only the first time".
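To make the combinatorics concrete, here is a hedged toy count (the feature axes are invented for illustration, but the multiplicative blow-up is the real problem):

```python
from itertools import product

# Hypothetical feature axes a single artist-authored material might be
# compiled against; every combination is a separate shader permutation
# that has to be found and precompiled.
weather  = ["dry", "wet", "snowy"]
surface  = ["clean", "dirty", "bloody"]
powerup  = ["none", "fire", "ice", "shield"]
platform = ["d3d12", "vulkan"]

permutations = list(product(weather, surface, powerup, platform))
print(len(permutations))  # 3 * 3 * 4 * 2 = 72 variants, for ONE material
```

Multiply that by hundreds of materials in a real game and "easily thousands" of final combinations is, if anything, conservative.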
Have you played The Last of Us Part 1 remake? I had to wait 40-45 minutes just in loading for the shader cache. Imagine waiting that long, haha. No one would wait that long; they'd either play with stutter or refund and delete the game.
@@ChillieGaming I would genuinely rather wait 48 hours for a game to load than have it shipped unfinished. This is why they get away with it, coz y’all buy it regardless.
The Matrix tech demo settings were similar but seemingly something changed under the hood in UE 5.4. Most likely this artifacting is somehow connected to the render parallelization improvement. Kinda seems like something is out of order/not synched in the pipeline.
@@Waffle4569 I kind of agree. We're comparing performance between UE5 iterations though so if the way lighting is processed helps performance while also making things prettier/uglier, that's what we're seeing. The *Matrix* game/demo settings are the same, just the engine changed. Apparently those lighting artifacts/fizzles are from MORE rays being processed into the scene and weren't there before due to those lights casting nearly no raytraced light before.
the music at the very start of the video is my favorite track in Unreal Tournament. Brings back the good memories from 2001 when I first played the game.
UE5 is really giving off those classic cryengine 2/3 vibes. Great to look at, with performance issues. For them to not have parallelization right out of the box is baffling.
@@coffin7904 It's extremely wrong and you are very mistaken. It's like calling Skrillex "house music", or Aphex Twin "techno", or Lynyrd Skynyrd "country". "Breakcore" and "drum and bass" are distinct genres, and saying the difference is minute shows that you really don't know much about this type of music. You could say that both are part of the same super-genre of Jungle, but that's not what we're talking about. If you're going to claim that breakcore and D&B have a "minute" difference, what's the point of even using distinct genre terms at all? Let's just call it all "electronic music" and skip the specificity altogether! Christ.
@@azazelleblack It turns me into the Hulk whenever I hear kids put the word 'core' at the end of any arbitrary word. Like calling frutiger aero 'cleancore' or something.
@@coffin7904 Absolutely! Before we get started, it's important to understand that these genre terms are sort of poorly-defined and used pretty loosely. With that said, there ARE definitions to these terms, and in particular "Breakcore" is a distinct subgenre under the heading of "Drum & Bass" or "Jungle" music. People disagree over whether Jungle or D&B came first (and thus deserves to be the super-genre), but both evolved from earlier Breakbeat and Rave Hardcore music. Jungle music had heavy influence from Dancehall and usually was lower tempo, with an MC and party vibe, while Drum & Bass was focused on literally just rumbling basslines and breakbeat drum loops. Drum & Bass as a genre typically refers to the early works of artists like Technical Itch, Q Project, Grooverider, and many, many others. This style originated in the early-to-mid 1990s and was still very much party music. It's danceable, and while it has a much darker vibe than something like Rave Hardcore (or especially that genre's successor, Happy Hardcore), it's still chill enough that you can zone out and relax to it. Breakcore, meanwhile, is an evolution at least three stages removed from the original Drum & Bass sound. Around the same time people were coming up with D&B, other artists were experimenting with new sounds and creating what was then called the very stupid name "Intelligent Dance Music," or "IDM". IDM is often harsh, atonal, and challenging to listen to, and despite the name, it's almost entirely undanceable. The early crossover efforts between D&B's successors (known commonly as "Darkcore", see: Dieselboy) and IDM were mockingly called "Drill & Bass", but this style became somewhat popular within its niche and has numerous artists. You get crossover from both D&B guys and IDM guys in this genre. 
Drill & Bass eventually gave way to Breakcore, the successor genre that takes the relatively technical and stripped-down Drill & Bass and turns it up to 11 with influences from Hardcore (both EDM and Punk), industrial music, and even avant-garde noise music. True Breakcore is a brutal and harsh genre that's hard to enjoy for most people. It features gruesomely mutilated breakbeats with little rhythmic coherence and sharp, distorted sounds that can be like audio jumpscares or just constant stressors. Even if you say "Drum & Bass" is the super-genre, nobody is thinking about Breakcore when they say "Drum & Bass". They're thinking about Aphrodite, about LTJ Bukem, about Dieselboy, about Evol Intent, and so on. Breakcore artists are guys like Venetian Snares, Bogdan Raczynski, Sh!tmat, Rotator, and so on. Foregone Destruction is absolutely not Breakcore, lol.
I think they should put all of their resources into fixing the performance before improving rendering. It makes the most sense, even from a marketing perspective.
How on earth did it take 3 years to add multi-core rendering support? Hey, we're going to build a house without a roof, release it, and 3 years later we'll add the roof. Didn't they run this on a PS5? The demo of Matrix on a 7800X3D with a 4090 is unacceptable even today with 5.4 given that this is the best you can buy.
Great content, Alex. Beyond the technical talk (my understanding of which has grown over the years thanks to Digital Foundry), you did a great job illustrating the points you made. Bravo.
Problem with the Frankenstein PC is it's missing the co-processing from the I/O chip in the console. The PS5 I/O complex has the equivalent of 11 PS5 CPU cores dedicated to decompression and Direct Memory Access, plus an additional processor dedicated to SSD I/O and another dedicated to memory mapping. On the Frankenstein PC, all of these tasks have to be handled on the CPU or GPU, depending on how they decided to tackle it. As we move into fully fledged PS5/XSX titles, the gap between the consoles and this PC will significantly increase as the I/O block in the console gets more and more use.
I'm holding out hope that we can have some kind of machine learning model to handle shader compilation and that can run on the GPU. A model trained specifically to do shader compilation could be infinitely more efficient than the dumb "on demand" way it's done now.
The historical trend says that by UE 7.4 the stutter will be a constant up and down between 16ms and 100+ms, and the entire industry will pretend it isn't happening, or if it is happening it's not a big deal, or if it is a big deal it's impossible to solve and we just have to accept it anyway. No version of Unreal has ever stuttered LESS than its predecessor.
7:30 "Developers are just not thorough" - you know that "shader pre-compilation" is essentially just spitting out all shaders and effects onto the screen and putting a progress bar on top of it?
This stuttering issue has me quite worried about CDPR’s next games on it. It would be so sad if the new Witcher released with terrible shader compilation or traversal stuttering considering how great CDPR’s games ran on high end PC’s with RED Engine.
UE stutter became most apparent to me with Lords of the Fallen... even after 40+ patches, the game STILL suffers from hitching, micro-stutters, and the like. It begs the question: Which engine can rise up and offer a robust set of "future-proof" features, provide affordable licensing, etc.?
The comparison at 4:00 does have more FPS in 5.4, but global illumination is way more noisy under the bridge. So Lumen is doing a worse job at calculating those emissive materials. (Oh, it's mentioned at the end... hehehe)
Hearing the Unreal Tournament music makes my heart melt ♥️. It truly brings me back, and reminds me that Epic Games is a juggernaut when it comes to graphics with Unreal Engine.
It has just been mind-blowing to me that these companies can put sooo much time, money, energy, and human resources into making these amazing pieces of art, just to get to the very end with a finished product and accept shader compilation stutter and traversal stutter. It's unfortunate, frustrating, and just upsetting. Millions of dollars go into these projects, and us consumers spend thousands on PCs just to have all these little hitches. Hope this will come to an end.
I doubt they could even if they wanted to, and for all we know the de-listing might have something to do with Tencent. And never forget that the original series of Unreal games were co-developed and directed by Digital Extremes. The only Unreal games made entirely by Epic Games were UT3 and UC2.
@@AlexanTheMan 99% of large companies like Epic are after one thing: money. Don't know what else you would expect. An Unreal game would only get the boomers interested, and they're less likely to spend money on FOMO and battle passes and dump hundreds of hours into a game, unlike the kids that play Fortnite.
Dude, thank you. I hope you cover additional versions as they release. This was very informative on whether UE5 is ready for the project I'm starting! It is not there yet, sadly!
Overall it's great to see improvements, but it still needs a lot of work. No wonder all new UE5 games look and run the way they do. However, I still prefer path tracing with DLSS RR over UE5's global illumination, reflections, shadows... They have that unstable flickering "boiling" look that I find distracting. 🤔
Yeah, it's noticeable in Ark. Software Lumen is a big difference from path tracing: software raytracing, then hardware raytracing, then path tracing. Epic has said they want to optimize the engine to the point where hardware Lumen is as expensive as software Lumen is today. If they can achieve this, we'll get better performance or resolution in UE5 games, and better RT quality for 30fps modes. Something tells me they want software Lumen working on phones, mobile gaming PCs, and the Switch 2.
This is why it saddens me to see so many developers dropping their in-house engines for UE5. I always had at least 1-2 consoles every generation but always played big titles on PC; the current generation makes me rethink this, because it seems like the only way to dodge stutters is to switch to consoles...
Me with my 5800X3D and 4070 Super: *Plays Fortnite in 1080p, performance mode and low settings* I cannot suffer through a few games just to cache my shaders in dx12. It is a miserable experience.
I've been refusing to play UE titles since Jedi Fallen Order, and this is more confirmation to continue to avoid UE games. Unless DF/Alex confirms a UE title doesn't have stutter, then I refuse to play/purchase.
What engine developers won't tell you is that most often when they "optimise" some process or effect, they actually downgrade it. For example, there's a clear issue with the GI/radiance on all point lights in 5.4 compared to the stable 5.0, and reflections and stuff are blotchier. And that's only what I can see.
I know engines get patches and updates throughout their life, but I'm surprised by how poorly UE5 handles multi-threading. Like, it's nothing new; it's been around for years.
It needs a 2D shader animation for 3D buildings at a distance, and an option to lock frames at 40 and 45 FPS. It would also be nice to integrate frame generation, so you could have something like an internal 52fps lock upscaled to 60fps with frame generation.
The unstable frametimes in UE5 are a huge problem. It makes me angry that half the industry is switching to this engine when it's impossible for a UE5 game to actually run smoothly, and Epic isn't doing anything about it! When I played The Talos Principle 2 I got so sick of it I ended up limiting the framerate to 60 fps (with my system easily capable of 100+) and would still experience visible frametime spikes pretty regularly.
"when it's impossible for a UE5 game to actually run smoothly" Not true at all. "and Epic isn't doing anything about it!" - Also not true at all. There are many features in recent iterations of the engine that combat this. You even see them in this video. It will take time for developers to get onto those versions of the engine, though. Hell, Epic has done talks on this issue.
@@Cinnamon1080 I'll believe it when I see results. It looks only marginally improved at best in the video here. They've done lots of talking about shader compilation stutter too, but you can also see how poorly that's going in the video here. My personal experiences playing UE5 games so far have been very negative - frequent frametime spikes and poor image quality due to Lumen artifacts, on top of the ever-present shader compilation and traversal stutter that were inherited from UE4. Upscalers other than TSR look inexplicably poor compared to other engines too. It's just one big disaster as far as I'm concerned.
Why isn't it possible for the engine to download the shaders from a CDN (Steam/Nvidia/AMD), matched to the video card? Steam does this for some games (apparently not for UE5 games), and consoles do it too.
They depend not only on the card, but also on the driver. It might be possible, but it would be a lot weaker a system than it is on, let's say, a Steam Deck.
@@Wobbothe3rd The processor of a 4060 is the same no matter if it's from Asus or MSI. I'm a software dev, and I'm pretty sure it's even simpler than that; I mean, it's just a compiler. It's like compiling C code for i386, i486, i586, i686: there is no need to compile it differently for every single SKU, only for the architecture (Turing, Ada Lovelace, etc.). So that's about five or six different architectures. UNLESS shaders contain preprocessor statements which check for VRAM size or number of shader cores, which would IMO be very weird.
@@iurigrang You raised a good point with the driver. But the fun fact is, that's because one essential part of the driver is the shader compiler. It's likely that not every small driver increment changes the output of the compiler, so Nvidia/AMD would need to keep a lookup table: source shader hash + architecture + driver version => compiled shader hash. That would be a finite number of files to deliver via CDN. They could say they offer the service only for the latest WHQL driver and the latest upstream driver; that would be even fewer files to host, with a fallback to self-compiling. I guess the problem is nobody wants to pay for the CDN, not the game developer and not Nvidia/AMD. ^^
It's pretty shameful that a company like Epic, with all its resources, still has problems with shader compilation and traversal stutter on PC. If I were Jensen at Nvidia, I would be calling Epic every day pressuring them to put in the work. So many studios use UE, so to the average person, who has never heard of shader stutter, booting up a UE game would just put a black mark on PC gaming, and that would have a negative effect on PC hardware sales. I bought a PS5 for its small number of exclusives, but mainly to play the UE5 games that interest me just to avoid them on my PC (7800X3D/4090). And this hurts smaller studios that can't afford to make their own engine; UE5 has so many great features, which attracts smaller studios who want to make something more than a side scroller.
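The lookup table described above amounts to a content-addressed key. A sketch (the key scheme is purely illustrative, not any vendor's actual format):

```python
import hashlib

def shader_cache_key(source: str, architecture: str, driver_version: str) -> str:
    """Map (shader source, GPU architecture, driver version) to a CDN lookup key.
    A change to any component must yield a different key, since a driver
    update can change the shader compiler's output."""
    h = hashlib.sha256()
    for part in (source, architecture, driver_version):
        h.update(part.encode())
        h.update(b"\x00")  # separator so ("ab", "c") and ("a", "bc") differ
    return h.hexdigest()

same_shader = "float4 main() { ... }"
k_old = shader_cache_key(same_shader, "ada_lovelace", "551.23")
k_new = shader_cache_key(same_shader, "ada_lovelace", "551.86")
print(k_old != k_new)  # True: a driver update invalidates the cached blob
```

This also illustrates the comment above about stutter returning after every driver update: the old cached blobs simply no longer match any valid key.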
It's a difficult problem because of how flexibly artists can author new shaders in the engine, with potentially tons of different permutations. There is effectively no way of knowing ahead of time which permutations of those shaders will be used at runtime, and the engine was structured for APIs like DirectX 11 where the driver automatically recompiled/restructured pipelines for different state and shader combinations with less overhead. There is even a new Vulkan extension that reintroduces part of this dynamic driver recompilation to subvert the issue, but DirectX 12 still requires you to compile your pipelines ahead of time for all possible shader/state combinations you could encounter during runtime if you want to avoid this problem.
@@TheSandvichTrials I assume you were talking about the dynamic rendering extension for Vulkan? If so DX12 doesn't yet have something like that that I'm aware of, but it does have an equivalent of Vulkan's new shader objects where you can compile individual shaders ahead of time and mix-and-match them at runtime, opting out of certain optimisations. The new work graphs feature of DX12 makes use of its equivalent of shader objects.
@@TheSandvichTrials Yeah, it's real fucking neat and could open up a ton of possibilities for GPU computation. Apparently Epic was playing around with porting Nanite to work graphs and held a presentation at GDC with their findings, so I'm hoping the footage of that is released, if it isn't already.
@@jcm2606 I watched that presentation, they showed Unreal running in that megascans canyon/desert environment, though they refused to share any details. But it seemed to work!
I would rather take 40fps with no stutters than 144fps with micro-stutters. I don't care about this new graphical leap, just make sure my games don't stutter. PC is supposed to be the best platform, yet we always get the short end of the stick with stutters.
They would need to use newer versions of UE. The Immortals of Aveum devs said they're going to upgrade the engine. Also, Epic has said they want to make both software Lumen and hardware Lumen cheaper.
Lol no, only cross-gen games (PS4 & PS5) were able to afford 60fps, but now more and more new PS5-only games are basically 30fps by design. FF16 is an example: performance mode there is a joke, at 720p with drops to 30-40fps, absolutely not worth it. Minimal playable framerates (30fps on consoles) will always be the baseline for developers; especially after PS4 support is dropped, they will simply aim for 30fps on PS5.
Developers COULD design every game to run at 60fps... It would just require scaling back graphic features, but it seems gamers prefer more 'next gen' visuals and are willing to accept unstable 30fps performance at Dynamic resolutions that dip below 720p internal rendering. The games are being produced for KNOWN HARDWARE, so it's just a design decision not to prioritize 60fps.
@@techsamurai11 On PS5 I've heard it runs at 720p 24fps. Maybe it was 720p upscaled to 1440p. On the Frankenstein PC they test at 1080p upscaled to 4K; for FSR that's double the visible resolution, plus the big cost of upscaling to 4K. The problem is with the games: in Cyberpunk, 1080p looks like 720p because of all the blur.
@@techsamurai11 Yes, I may have remembered the low points. Edit: watched the video, you were right. So the PS5 is even faster, because the Frankenstein PC did 1080p native. But still, it's strange: if it cannot keep 30fps while driving, that would mean a lower res than 1440p, I'd say 1080p.
On PS5/Series X it was dynamic 1620p and most often 1440p before being upscaled with TSR. Whereas Series S was dynamic 1080p and most often 720p before TSR. Also as you can see in the video consoles were only 24fps in cutscenes. The rest of the game was consistent 30fps except driving fast or crashing the car which would tank frame rates.
11:41 The denoiser seems to struggle substantially more in 5.4 as visible in the lights under the bridge/overpass. So some of that extra performance may have been gained by reducing overall image quality
We'll land on Mars before shader compilation stutters are fixed in UE.
The stutter is fixed once game devs can recognize the limits and work around them. There is no magic involved in 3D engines... everything still has limits.
@@krz9000 Yeah it’s silly to rely on game engines to do all the work for you, game devs still need to be competent programmers and find solutions to any type of stuttering that the engine can’t deal with on its own
@@krz9000 Not even Epic itself can get rid of the stutters in their own damn game, as it's shown in this very video.
Maybe... I bet we don't land on Mars for at least 15 years.
@@Z3uS2 When I see a game with Unreal Engine, I already know it's going to stutter like crazy. If I see CryEngine, I know I'm in for a good time with great visuals. Sadly, not many CryEngine games.
Yeesh the lighting under the bridge in 5.4 is fireworks
Been seeing that in Gray Zone Warfare, too. It's incredibly distracting.
@@SHABAD0O i'm watching this hoping they update to 5.4 so we get a few more fps in GZW 🤣
It’s either the upscaler being wonky (TSR isn’t very good atm), or Lumen using an extremely low sample count, or both. Could also be changes to temporal values in the Lumen denoiser. You can see traditional upscalers destroy RT lighting quality and reflections in basically every game this way, especially at 50% resolution. This is why Ray Reconstruction exists: it allows the denoiser to run on the full-resolution image rather than the lower pre-upscale resolution. One would have to compare the Matrix demo using the NvRTX branch to test this, as it uses its own systems.
I was noticing it before he got to the subject. It's worse at far render distances, it seems.
What did they change to make it look so much worse? Now better performance but worse visuals. I guess we just won't get both
That demo was 3 years ago... god damn whered the time go
better question, where did the frames go? oh wait, they were never really there.
And still no games lol
@@yc_030 We have Tekken 8 with its crappy graphics 😂
damn, really? wtf 😅 feels like a year
3 years and we are getting there, almost usable! Closing in! The performance optimization is certainly good, but the shader compilation issue is taking way too damn long to get fixed.
One note to the Reflection issue showing at 15:38:
Lumen has a CVar, r.Lumen.Reflections.MaxRoughnessToTrace, which is a cutoff that determines whether ray-traced or screen-space reflections (you called them probe-based) should be used by Lumen. It is set globally so that materials above a certain roughness value do not use ray-traced reflections. In UE 5.4 this is now a setting in the scene's post-processing. It is possible that UE 5.0 used a different value for this setting than the default of 0.4 and it is now being overridden. If, for example, 0.3 or lower had been used, noise would be reduced significantly, which in turn reduces accuracy but improves performance.
/\ This guy lumens.
|
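For anyone who wants to try the cutoff described a couple of comments up: CVars like this can typically be persisted in the engine's ConsoleVariables.ini or set live from the in-game console. The 0.3 here is just the illustrative value from the comment (0.4 being the default it mentions), not a recommendation:

```
; ConsoleVariables.ini -- illustrative value from the comment above.
; Materials rougher than this cutoff skip ray-traced reflections,
; trading accuracy for less noise and better performance.
[Startup]
r.Lumen.Reflections.MaxRoughnessToTrace=0.3
```

At runtime you can also type `r.Lumen.Reflections.MaxRoughnessToTrace 0.3` into the console to compare values live.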
DF to Epic... what about shader stuttering?
Epic to DF...Ye,Ye,Yes.
💀RIP to the performance for all next gen games
I remember an Epic engineer years ago saying on a DF video that they were working hard on the Fortnite shader stuttering.. a few years later and it's just as bad. Wtf are they doing
@@sven957 nothing, nothing at all, naturally.
@@sven957 making it more noticeable.
the epic store on pc struggles when you search. we're talking about the store page frontend here, not 3d graphics.
12:29 that guy is really interested in the parking rules
I thought he was reading the bus stop time table lol
I'm dying.
Epic devs trying to find the instructions to fix shader stutter.
He likes those +500% zoom views.
I hope he finds what he's looking for
It's also worth pointing out, that every driver update will bring back that stuttering too. Which is a real pain. It's not just a "first play experience", it's a "first play experience every few weeks".
I like the Steam Deck solution to this, which is to crowdsource the shader cache. Steam keeps a constantly-updating shared blob of compiled shaders that get automatically uploaded and downloaded to and from Steam Deck users for each game. I'll bet this functionality could one day come to GFE if people were actually excited to use it.
@@GallileoPaballa That's a great solution.
Would it handle shaders for every game / driver / card combination?
All I hope when watching DF videos is that the developers and publishers are listening and taking notes. You guys do such a good job of offering advice through your videos that would benefit everyone.
Me too.
UE 5.4: "Ddddid I stutter? Well yes I did..."
Sigma
Fortnite could be shifting to Unreal Engine 5.4 soon
Such a shame they haven’t fixed this stuttering; with ddr4, ddr5 and high bandwidth cards with DirectStorage support, stutters should be a thing of the past!
@@Multimeter1 Seriously, at this point there’s no excuse.
Say what again!
'a frametime graph that looks like post-modern art'
never change Alex... never change
5:21 the frame-time graph looks like the city its trying to render
Just found this channel and I know I'm in the right place when viewers are actually smart and not just leaving bot comments. I don't think many people caught this because it's complicated 😂
The shader comp is insane in games like Fortnite, which on a fresh install means a few games that are literally a slideshow going from 144 FPS to 20 constantly until you play a few matches.
I always had rock solid 240fps? 🤔
Omg is this why my Fortnite stutters? I never play but when I do get on I get constant stutters despite having a good pc and no stutters in any other game
@@wanshurst2416 If you've been playing for a while it's not an issue. If you do a cold boot it will stutter while it builds shaders. I tested how bad it can be with a 10900K + 4090 PC vs a 7800X3D + 4090, and while the AMD CPU was noticeably better in frametime and avg fps, it's still bad for the first 5+ games on a fresh install.
@@wanshurst2416 update your drivers and you're back to the stutterfest
The fact that stuttering still isn't fixed is surreal, unreal you might even say!
*ba dum tsss*
That Unreal Tournament music slaps
loved that game series
It's Foregone Destruction, I believe! Such a good track!
You gotta remember your roots
Every time I see Forgone Destruction’s map image in Fortnite I smile. I’m sad I keep missing the track in the shop. It’s an all time classic
I love Unreal Tournament and the fact that EPIC removed the series from all storefronts still keeps me salty AF
With that music it's impossible not to imagine a UT capture the flag match on Facing Worlds with these graphics.
I'm still salty at Epic for abandoning the next Unreal Tournament game. (and Paragon, for that matter)
I wish they’d update the demo on consoles like kind of a UE5.4 tech demo for users.
Just for info: the devs need to pay Sony/MS for every update.
@@Swisshostthat sucks
@@Swisshost That was in the 7th gen, but it hasn't been a thing for at least a decade.
I will take frames and no stutter over a window reflection or a better looking brick wall every single time. It's really WILD that they are just now working on multi-core performance improvements considering how long multi-core CPUs have been around. It's also still wild to me that we are so obsessed with pushing for technologies that can't even run well at the standard resolutions we use now (4K is the TV standard, 1440p the PC standard) without upscaling, and even then you need top tier equipment for an experience plagued by stuttering, just to look at a window reflection or stare at how the sun reflects off a brick wall. I'm not saying we shouldn't be pushing for these new plateaus, but c'mon guys, let's build on solid foundations here and not sand.
Not to mention you need to be well versed in overclocking. This was well said tho. I’m one of the few that will build a god rig just to run it at 1080p since I care about input lag
In my opinion, the reason they want raytracing to work so badly is because of how much time it saves in development. Without raytracing, lighting needs to be baked and requires lots of trial and error to get scene lighting to look right/good. With raytracing, the engine and GPU do almost all of it for you in real-time. It reduces dev time and saves money.
@@silfrido1768 1080p looks good on an actual 1080p monitor. 1080p on a 1440p or 4K display is horribly blurry.
intro track id for those wondering: Michiel van den Bos - Foregone Destruction
Adore the Young Frankenstein bit with the thunder at each mention.
Can't wait to see all the stutter in Witcher 4 and Cyberpunk 2 :/
Since there are still several years until the release of these games (with further improvements to the game engine coming out in the meantime), and also because I trust in CDPR's abilities, I remain quite confident.
@@Z3t487There's no guarantee that those developers are updating the engine in line with their game code, they usually pick one version of the engine and build from there.
If they try to update the engine during development, many of the game's source files will break and it will be exceedingly hard to pinpoint bugs. This is why Unreal Engine 4 games are still being released; it took those projects that long to complete.
@@AlexanTheMan according to cdpr, it's implied that the engine will be improved throughout development of the game
@@aquaneon8012 Source?
@@Z3t487 shader stutter has been a thing in UE since like 2010, i doubt its going to be fixed in the next 10 years
Super Castlevania IV Simon’s theme kicks ass in the background!
Castlevania goated
Castlevania has stupidly good music. Symphony of the Night and IV are something truly amazing.
I think Alex is mixing up Frankenstein and Dracula :D
Agreed, SC4 had without a doubt the best music on the Super NES. Even today it sounds amazing coming from 8 sound channels. @@WH250398
I looove these tech spotlight videos from Alex. Absolutely makes my day!
Trying to figure out UE5 as an audio person has been fun 😂
Jesus Christ, I was trying to get Atmos working on 5.3 and in the end I just gave up
I'm an audio enthusiast much more than a graphics enthusiast, and I'm always afraid technology will forget us. Especially with so many people today using smart phones and portable consoles with headsets (or heaven forbid without them), and even home theaters moving towards "sound bars" and all sorts of other magic gimmicks. Which, on the other hand, I guess makes it easy to understand why it might be frustrating to develop audio for games. You'll spend your time carefully capturing all the foley sounds, recording crickets at night, composing and conducting symphony orchestras, mastering everything to a T, figuring out how to place everything correctly in a surround or atmos system etc etc., meanwhile knowing a large portion of your players will be playing those carefully crafted soundscapes through a "surround" bar made from soap box plastic in a giant, barren concrete room.
in the 5.4 shots there is a TON more GI sparkling..
they've changed how lumen works but I doubt they went through the city project and tweaked things for 5.4. So things that were not problems in 5.0 will become issues in 5.4. zero surprise there really, it's just what happens
that being said, it's been known for a while not to use strong emissive values on small sources with Lumen; this would never be an issue in a real game (if the devs are competent). It's doing real-time GI, god damn it, it's insane it even works at all at reasonable performance. I feel like people quickly forget how impressive and truly next gen UE5 still is. It's not without problems for sure, but holy hell, which other free engine has a tool suite this impressive? spoilers: there are none that compare at all
Yeah it looks blotchy
@@quackcharge Unity compares. In fact, their Screen Space Global Illumination has very stable emissives. Maybe Epic can take a look at it. And runtime animation rigging since 2019.
And before you mention Lumen, Unity has had realtime GI since 2017, except the geometry positions are baked, so light won't spill through an open door.
Gray Zone Warfare is a disaster so far. They are realising nobody can run these games.
I am happy you covered the emissive lighting noise. I noticed that and did not know what was causing it.
4:05 good performance gains, but what seems like massive downgrade in indirect lighting. There's flickering everywhere under that bridge, where it was fine in 5.0
15:00 ish: Phew I'm glad you addressed the ceiling lights. They were driving me mad in the performance comparison :D
that car deformation is actually the thing that impresses me most of the Matrix Demo
I saw a video recently -- "Optimizing my Game so it Runs on a Potato" by @Blargis3d. He's making an indie game and was having the same compilation stutter problem. He solved it in kind of a genius way. When the game starts, before every level, he has a black loading screen. Thing is --- it's a trick. What he's *actually* doing behind that black screen, is playing the game at 10 times speed, walking in every room, loading every texture and killing every enemy. That way, all of the hitching and loading that has to happen, happens during that period. Then, when the 'loading' completes and the player plays the level, it's actually the *second* time that all of those assets are loaded. Thus -- complete elimination of compilation stutter.
This was the first time I ever heard of this (I didn't even know such a thing was possible) and thought it was really cool. Thus, sharing it here. :)
The game is Bloodthief by @Blargis3d. The shader compilation trick is in the video "Optimizing my Game so it Runs on a Potato" and is indeed very cool!
@@HildingL Thank you so much for the video! I'll update my comment to reflect this info. Thanks again! :)
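The trick described above boils down to populating a compile-on-first-use cache before the player ever sees a frame. A minimal sketch of the idea, with Python standing in for engine code and every name here invented for illustration:

```python
# Sketch of the "hidden warm-up pass": shaders compile on first use, so
# touching every material behind a fake loading screen means real gameplay
# only ever sees cache hits. All names are illustrative, not engine API.

class ShaderCache:
    def __init__(self):
        self.compiled = {}   # shader key -> "compiled" blob
        self.misses = 0      # each miss would be a visible hitch in-game

    def get(self, key):
        if key not in self.compiled:
            self.misses += 1                 # the expensive compile happens here
            self.compiled[key] = f"blob:{key}"
        return self.compiled[key]

def warm_up(cache, level_materials):
    """Run behind the black loading screen: touch every material once."""
    for mat in level_materials:
        cache.get(mat)

materials = ["rock", "water", "enemy_skin", "muzzle_flash"]
cache = ShaderCache()

warm_up(cache, materials)            # all compiles happen here, hidden
hitches_hidden = cache.misses

for mat in materials:                # actual gameplay: pure cache hits
    cache.get(mat)

print(hitches_hidden, cache.misses)  # gameplay added zero new misses
```

The "play the level at 10x speed" variant is just a thorough way of making sure `warm_up` really does visit every material and effect the level can trigger.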
Alex's pronunciation of Frankenstein is peak DF content 🤌
Frrrankenstein. The German is strong in this. :D
Actually the correct pronunciation
That emoji doesn't mean what you think it means. :-)
@@colaboytje It means different things to people from different cultures.
@@andreasheinze9685 That is the Italian dead duck emoji, no?
0:03 Unreal Tournament - Forgone Destruction ❤️
Thank you for using Unreal Tournament soundtrack for this video.
So why don't all developers add a shader cache loading step before you even start playing?
coz they don't need to, people will still buy the game, and paying devs to do something that does not increase profit makes no sense to them.
Because it's not that easy to do properly, and a lot of studios use a ready-made engine precisely because they lack the technical know-how and/or financial resources to either develop their own engine or dive deep into the quirks of a complex monster like UE. They import their art, design their levels, write their game logic, click the "build project" button, and hope for the best. Many devs aren't even aware of the typical issues, which is why DF does such an important job explaining it over and over again.
The work involved isn't just a checkbox and a screen or something: the way shaders work in Unreal (and every other engine, really) is that you dynamically stitch them together, with potentially dynamic external parameters: think weather systems, characters getting wet or dirty, blending between different ground materials, adding effects for power-ups, that sort of thing (and those are just the most obvious ones).
Unreal has a big list of these snippets, but it doesn't know how they will be combined or with what values until you actually tell it to use them. Doing a shader compile pass is basically the developer finding all the final shader combinations the game uses (easily thousands nowadays), often by just running through it and trying everything, and telling the engine to compile them all ahead of time.
Unreal can't easily fix this without breaking the entire workflow that pretty much every artist has used for about 20 years now. Developers need to perform a huge amount of work for what they might not see as being high value, since "it's only the first time".
have u played last of us part 1 remake?
i had to wait 40-45 mins just loading the shader cache
imagine waiting that long haha
no one would wait that long and either play with stutter or refund and delete the game
@@ChillieGaming I would genuinely rather wait 48 hours for a game to load than have it shipped unfinished. This is why they get away with it, coz y’all buy it regardless
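The "easily thousands" of combinations mentioned a few comments up comes from permutation counts multiplying across independent features, which is why exhaustive pre-compilation is so much work. A toy illustration (the feature names are made up):

```python
# Toy illustration of shader permutation explosion: each independent
# feature the engine can stitch into a shader multiplies the number of
# final variants. Feature names here are invented for the example.
from itertools import product
from math import prod

features = {
    "weather": ["dry", "wet", "snow"],
    "wetness": ["off", "on"],
    "ground":  ["grass", "mud", "rock", "sand"],
    "powerup": ["none", "glow"],
}

# Every concrete shader the engine might need is one combination.
variants = list(product(*features.values()))

print(len(variants))  # 3 * 2 * 4 * 2 = 48 variants from just 4 small features
assert len(variants) == prod(len(opts) for opts in features.values())
```

Four tiny toggles already give 48 variants; a real game's material graph multiplies dozens of such axes, which is how the count reaches thousands.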
Using classic UT music at the beginning is amazing!
The entire video used UT music except the Frankenstein part
4:04 "Exact same content and settings"
> Extremely noticeable artifacting on the ceiling
The Matrix tech demo settings were similar but seemingly something changed under the hood in UE 5.4. Most likely this artifacting is somehow connected to the render parallelization improvement. Kinda seems like something is out of order/not synched in the pipeline.
I thought the same but he explains why this is the case at the end of the video.
@@Cheynanigans__ It is the same, but it kind of invalidates performance comparisons when there's such a difference.
@@Waffle4569 I kind of agree. We're comparing performance between UE5 iterations though so if the way lighting is processed helps performance while also making things prettier/uglier, that's what we're seeing. The *Matrix* game/demo settings are the same, just the engine changed. Apparently those lighting artifacts/fizzles are from MORE rays being processed into the scene and weren't there before due to those lights casting nearly no raytraced light before.
It's because of lumen using variable rate shading i would say
Alex, you have no idea how much I appreciate your choice of music for this video!
That Unreal Tournament track brought back some great memories! I wish Epic would revive the series.
the music at the very start of the video is my favorite track in Unreal Tournament. Brings back the good memories from 2001 when I first played the game.
UE5 is really giving off those classic cryengine 2/3 vibes. Great to look at, with performance issues. For them to not have parallelization right out of the box is baffling.
I saw some kid describe Foregone Destruction as "breakcore" a while back. I almost turned into The Hulk.
I mean it is drum and bass but the difference between breakcore and dnb is minute. It's not exactly wrong to call it breakcore.
@@coffin7904 It's extremely wrong and you are very mistaken. It's like calling Skrillex "house music." Or Aphex Twin "techno". Or Lynyrd Skynyrd "country". "Breakcore" and "drum and bass" are distinct genres, and saying the difference is minute shows that you really don't know much about this type of music. You could say that both are part of the same super-genre of Jungle, but that's not what we're talking about. If you're going to claim that breakcore and D&B have a "minute" difference, what's the point of even using distinct genre terms at all? Let's just call it all "electronic music" and skip the specificity altogether! Christ.
@@azazelleblack can you explain what the difference is?
@@azazelleblack It turns me into the Hulk whenever I hear kids put the word 'core' at the end of any arbitrary word. Like calling frutiger aero 'cleancore' or something.
@@coffin7904 Absolutely!
Before we get started, it's important to understand that these genre terms are sort of poorly-defined and used pretty loosely. With that said, there ARE definitions to these terms, and in particular "Breakcore" is a distinct subgenre under the heading of "Drum & Bass" or "Jungle" music.
People disagree over whether Jungle or D&B came first (and thus deserves to be the super-genre), but both evolved from earlier Breakbeat and Rave Hardcore music. Jungle music had heavy influence from Dancehall and usually was lower tempo, with an MC and party vibe, while Drum & Bass was focused on literally just rumbling basslines and breakbeat drum loops.
Drum & Bass as a genre typically refers to the early works of artists like Technical Itch, Q Project, Grooverider, and many, many others. This style originated in the early-to-mid 1990s and was still very much party music. It's danceable, and while it has a much darker vibe than something like Rave Hardcore (or especially that genre's successor, Happy Hardcore), it's still chill enough that you can zone out and relax to it.
Breakcore, meanwhile, is an evolution at least three stages removed from the original Drum & Bass sound. Around the same time people were coming up with D&B, other artists were experimenting with new sounds and creating what was then called the very stupid name "Intelligent Dance Music," or "IDM". IDM is often harsh, atonal, and challenging to listen to, and despite the name, it's almost entirely undanceable.
The early crossover efforts between D&B's successors (known commonly as "Darkcore", see: Dieselboy) and IDM were mockingly called "Drill & Bass", but this style became somewhat popular within its niche and has numerous artists. You get crossover from both D&B guys and IDM guys in this genre.
Drill & Bass eventually gave way to Breakcore, the successor genre that takes the relatively technical and stripped-down Drill & Bass and turns it up to 11 with influences from Hardcore (both EDM and Punk), industrial music, and even avant-garde noise music. True Breakcore is a brutal and harsh genre that's hard to enjoy for most people. It features gruesomely mutilated breakbeats with little rhythmic coherence and sharp, distorted sounds that can be like audio jumpscares or just constant stressors.
Even if you say "Drum & Bass" is the super-genre, nobody is thinking about Breakcore when they say "Drum & Bass". They're thinking about Aphrodite, about LTJ Bukem, about Dieselboy, about Evol Intent, and so on. Breakcore artists are guys like Venetian Snares, Bogdan Raczynski, Sh!tmat, Rotator, and so on. Foregone Destruction is absolutely not Breakcore, lol.
What a trainwreck of an engine when it comes to the shader stuff. Like holy shit
I think they should put all of their resources to fixing the performance before improving rendering. Makes the most sense even from a marketing perspective
12:31 Alex in the background looking for high frequency detail textures
Man, that Unreal Tournament song brings back memories. I wish there was a new Unreal Tournament
Epic should bring back Unreal Tournament but inside Fortnite to advertise the capabilities of UEFN.
Starting the video with the facing worlds track. Nice!
Opening with Foregone Destruction - instant throwback to the good ol' UT99 days.
How on earth did it take 3 years to add multi-core rendering support? Hey, we're going to build a house without a roof, release it, and 3 years later we'll add the roof.
Didn't they run this on a PS5? The demo of Matrix on a 7800X3D with a 4090 is unacceptable even today with 5.4 given that this is the best you can buy.
If it was the easiest thing it wouldn't have been an issue to begin with.
Traversal stutter has been in UE for, what… 20 years now maybe?
Great content, Alex. Instead of the technical talk (which I’ve grown in understanding over the years due to Digital Foundry) you did a great job illustrating the points you made. Bravo.
Problem with the Frankenstein PC is it's missing the co-processing from the I/O chip in the console. The PS5 I/O has the equivalent of 11 PS5 CPU cores dedicated to decompression and Direct Memory Access, plus an additional processor dedicated to the SSD's I/O, and another dedicated to memory mapping.
On the Frankenstein PC, all of these tasks have to be handled on the CPU or GPU depending on how they decided to tackle it.
As we move into full-fledged PS5/XBSX titles, the gap between the consoles and this PC will significantly increase as the I/O hardware in the console gets more and more use.
Careful, the PC fanboys will attack u
making me think of ctf-face while at work... thanks guys I needed that.
The music from the original Unreal Tournament is epic.
Bring back the matrix demo for consoles!
the day the gpu handles almost everything is gonna be nice
Doubt it will ever happen. Been saying this for years now
@@SPG8989 Alan wake 2 did a pretty good job with it though hey, unless you mean unreal engine specifically.
I'm holding out hope that we can have some kind of machine learning model to handle shader compilation and that can run on the GPU. A model trained specifically to do shader compilation could be infinitely more efficient than the dumb "on demand" way it's done now.
we are getting close just 4 to 5 years and ue5 will be playable
Too bad games are being made using ue5 as we speak.
On the next gen after this maybe
Well boys, maybe by Unreal Engine 7.4 we can finally get a stutter-free experience.
the historical trend says that by UE 7.4 the stutter will be a constant up and down between 16ms and 100+ ms and the entire industry will pretend it isn't happening or if it is happening it's not a big deal or if it is a big deal it's impossible to solve and we just have to accept it any way. no version of Unreal has ever stuttered LESS than its predecessor.
Epic really needs to get stuttering fixed and ASAP. It utterly ruins games on PC. Makes me not want to touch the engine.
I dread every UE 5 release as a PC gamer. I always wonder how bad the stutter will be. Not if it will stutter.
7:30 "Developers are just not thorough" - you know that "Shader Pre-Compilation" is essentially just spitting out all the shaders and effects onto the screen and putting a progress bar on top of it?
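That comment is roughly right about the mechanics: a pre-compilation screen just walks the full list of known shader permutations, compiles each one up front, and drives the progress bar off the count. A hedged sketch of that loop, with all names invented:

```python
# Sketch of a shader pre-compilation screen: iterate every known shader
# permutation, compile it, and report progress for the loading bar.
# compile_shader is a stand-in for the real (slow) driver compile call.

def compile_shader(key):
    return f"pso:{key}"   # pretend this takes tens of milliseconds each

def precompile(shader_keys, on_progress):
    cache = {}
    total = len(shader_keys)
    for i, key in enumerate(shader_keys, start=1):
        cache[key] = compile_shader(key)
        on_progress(i / total)   # e.g. redraw the progress bar
    return cache

progress = []
cache = precompile(["ui", "skybox", "foliage", "water"], progress.append)
print(len(cache), progress[-1])  # 4 shaders compiled, bar at 100%
```

Whether the engine literally draws each material on screen or compiles the pipeline objects off-screen, the structure is the same: a known list, a loop, and a fraction for the bar.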
This stuttering issue has me quite worried about CDPR’s next games on it. It would be so sad if the new Witcher released with terrible shader compilation or traversal stuttering considering how great CDPR’s games ran on high end PC’s with RED Engine.
You can manually fix it, as Alex says. You need to force pre-compilation of all the shaders.
It won't be gone; it will probably be minimized if the game is well developed, but it will still be present
gear up, the performance is gonna be worse than you can possibly imagine.
The Hitcher
We're going back to Witcher 1 levels of stutter baby lets go
UE stutter became most apparent to me with Lords of the Fallen... even after 40+ patches, the game STILL suffers from hitching, micro-stutters, and the like.
It begs the question: Which engine can rise up and offer a robust set of "future-proof" features, provide affordable licensing, etc.?
The comparison at 4:00 does have more fps in 5.4, but global illumination is way more noisy under the bridge. So Lumen is doing a worse job at calculating those emissive materials. (oh, it's mentioned at the end... hehehe)
This could be an issue with the setup. He literally just ported the demo to the new version. Some config tweaks could be necessary to fix this.
Hearing the Unreal Tournament music makes my heart melt ♥️. Truly brings me back, and gets reminded that Epic Games is a juggernaut when it comes to their graphics from the Unreal Engine.
The Frankenstein PC had me laughing, well done!
It has just been mind-blowing to me that these companies can put sooo much time, money, energy, human resources into making these amazing pieces of art, just to get to the very end finished product and accepting shader compilation stutter and traversal stutter. It's unfortunate, frustrating and just upsetting. Millions of dollars go into these projects and us consumers spend thousands on PCs just to have all these little hitches. Hope this will come to an end
Play on consoles.
@@борисрябушкин-з9н Consoles aren't exempt from these issues.
you need to overclock lol, no other way around it
@@silfrido1768 lol. Nope. That is not the issue at all.
The shader compilation issue is the reason why I still play games on consoles. Looking forward to the Nintendo Switch 2.
Can Epic do an Unreal remake now? Is it still too soon?
That would be incredible.
I doubt they could even if they wanted to, and for all we know the de-listing might have something to do with Tencent. And never forget that the original series of Unreal games were co-developed and directed by Digital Extremes. The only Unreal games made entirely by Epic Games were UT3 and UC2.
Yeah, Epic wants nothing to do with Unreal anymore. Shows the kind of company they are.
@@sulphurous2656 This is why I hope Blizzard never follows up on D2R.
@@AlexanTheMan 99% of large companies like Epic are after one thing: money. Don't know what else you would expect. An Unreal game would only get the boomers interested, and they're less likely to spend money on FOMO and battle passes and dump 100s of hours into a game, unlike the kids that play Fortnite.
Love how clear and concise this video is Alex! The Matrix demo still looks so good. Accurate lighting is so powerful.
Whoahhhh digital foundry with the jungle/ambient/intelligent DnB music at the start !!! Cool!
Not bad. Now if they can make the Epic Games Store run 60% faster too...
Dude, thank you! I hope you cover additional versions as they release. This was very informative about whether UE5 is ready for the project I'm starting! It is not there yet, sadly!
Blessed forgone destruction music
you're not gonna talk about the weird lighting in 5.4? well well well
Overall it's great to see improvements, but it still needs a lot of work. No wonder all new UE5 games look and run the way they do. However, I still prefer path tracing with DLSS RR over UE5's global illumination, reflections, shadows... They have that unstable flickering "boiling" look that I find distracting. 🤔
Yeah, it's noticeable in Ark. Software Lumen is a big difference from path tracing. Software raytracing, hardware raytracing, then path tracing. Epic has said they want to optimize the engine to the point where hardware Lumen is as cheap as software Lumen is today. So if they can achieve this, we'll get better performance or resolution in UE5 games. Also we'll get better RT quality for 30fps modes. Something tells me they want software Lumen working on phones, mobile gaming PCs and the Switch 2.
This is why it saddens me to see so many developers dropping their in-house engines for UE5. I always had at least 1-2 consoles for every generation, but always played big titles on PC, but current generation makes me rethink this bc it seems like the only way to dodge stutters is to switch to consoles...
Me with my 5800X3D and 4070 Super: *Plays Fortnite in 1080p, performance mode and low settings*
I cannot suffer through a few games just to cache my shaders in dx12. It is a miserable experience.
You can download the shaders for fortnite
Last time i played the game it worked just fine?
@@timodeurbroeck9957 even downloading the shaders doesn't always help. You need to compile them for your specific PC configuration.
Great video. Loved the UT and Castlevania IV music!
I've been refusing to play UE titles since Jedi Fallen Order, and this is more confirmation to continue to avoid UE games. Unless DF/Alex confirms a UE title doesn't have stutter, then I refuse to play/purchase.
You play games because they're fun. If this is your concern, then you're robbing yourself of good experiences.
What engine developers won't tell you is that most often when they "optimise" some process or effect, they actually downgrade it. For example, there's a clear issue with the GI/radiance on all point lights in the 5.4 version compared to the stable 5.0, and reflections and such are blotchier. And that's only what I can see
Always great stuff Alex 👍
Always fun to see what you guys come up with
They really need to finally fix this god-awful stuttering. It ruins the gaming experience big time
4:17 hearing Castlevania music on a DF video is a surprise to be sure, but a welcome one
I know engines get patches and updates throughout their life, but I'm surprised by how poorly UE5 can work with multi-threading. Like its nothing new, its been around for years
It needs a 2D shader animation for 3D buildings at a distance, and an option to lock frames at 40fps and 45fps. It would also be nice to integrate frame generation, so having something like an internal 52fps lock upscaled to 60fps with frame generation..
Games still look a blurry mess, seems like realistic effects are all based on vaseline vision for devs.
"Frankenstein PC" *thunder sounds* - lol love you guys at DF
lmao its 2024 and we are still talking about Unreal Engine Stuttering? 😂😂😂😂 UNREAL!
Only 3 years to optimise the UE mess. Well done, Epic :)
The unstable frametimes in UE5 are a huge problem. It makes me angry that half the industry is switching to this engine when it's impossible for a UE5 game to actually run smoothly, and Epic isn't doing anything about it! When I played The Talos Principle 2 I got so sick of it I ended up limiting the framerate to 60 fps (with my system easily capable of 100+) and would still experience visible frametime spikes pretty regularly.
"when it's impossible for a UE5 game to actually run smoothly"
Not true at all.
"and Epic isn't doing anything about it!" - Also not true at all. There are many features in recent iterations of the engine that combat this. You even see them in this video. It will take time for developers to get onto those versions of the engine, though. Hell, Epic has done talks on this issue.
@@Cinnamon1080 I'll believe it when I see results. It looks only marginally improved at best in the video here. They've done lots of talking about shader compilation stutter too, but you can also see how poorly that's going in the video here. My personal experiences playing UE5 games so far have been very negative - frequent frametime spikes and poor image quality due to Lumen artifacts, on top of the ever-present shader compilation and traversal stutter that were inherited from UE4. Upscalers other than TSR look inexplicably poor compared to other engines too. It's just one big disaster as far as I'm concerned.
We are proud to release Unreal Engine 5.FFFFFFFF4! 🐛
Gonna be 6.0 soon at this rate
Big up having forgone destruction at the beginning
Why isn't it possible for the engine to download the shaders from a cdn/steam/nvidia/amd, matching to the video card?
Steam does this for some games (apparently not for UE5 games).
And consoles do it too.
There are literally hundreds of SKUs of PC GPUs.
They depend not only on the card, but also the driver.
It might be possible, but it would be a much weaker system than in, let's say, a Steam Deck.
@@Wobbothe3rd the processor of a 4060 is the same, no matter if it’s from asus or msi.
I’m a software dev, and I'm pretty sure it's even simpler. I mean, it's just a compiler. It's like compiling C code for i386, i486, i586, i686. There's no need to compile it differently for every single SKU, only for the architecture: Turing, Ada Lovelace, etc. So that's about five or six different architectures.
UNLESS shaders contain preprocessor statements which check for VRAM size or number of shader cores, which would IMO be very weird.
@@iurigrang you raised a good point with the driver. But the fun fact is, that’s because one essential part of the driver is the shader compiler.
It's likely that not every small driver increment changes the output of the compiler.
So nvidia/amd needs to keep a lookup table:
source shader hash + architecture + driver version => compiled shader hash.
That would be a finite number of files to deliver via cdn.
I mean, they could say they offer that service only for the latest WHQL driver and the latest upstream driver. That would be even fewer files to host.
With a fallback to self compiling.
I guess the problem is, nobody wants to pay for the CDN, not the game developer and not Nvidia/AMD. ^^
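The lookup-table idea described above (source shader hash + architecture + driver version => compiled shader, with a fallback to self-compiling) could be sketched roughly like this. Everything here is illustrative: the function names are made up, and a plain dict stands in for the CDN.

```python
import hashlib

def cache_key(shader_source: str, architecture: str, driver_version: str) -> str:
    """Derive a stable key identifying one compiled-shader artifact."""
    blob = f"{shader_source}|{architecture}|{driver_version}".encode()
    return hashlib.sha256(blob).hexdigest()

def fetch_compiled_shader(shader_source, architecture, driver_version, cdn, compile_locally):
    """Try the CDN first; fall back to compiling locally on a miss."""
    key = cache_key(shader_source, architecture, driver_version)
    compiled = cdn.get(key)                        # CDN lookup (here just a dict)
    if compiled is None:
        compiled = compile_locally(shader_source)  # self-compile fallback
        cdn[key] = compiled                        # could be uploaded for other users
    return compiled

# Usage: a trivial lambda stands in for the driver's shader compiler.
cdn = {}
blob = fetch_compiled_shader("float4 main() { ... }", "ada", "551.23",
                             cdn, lambda src: b"compiled:" + src.encode())
```

The key point is that the number of distinct keys is bounded by (architectures × supported driver versions × shaders shipped), which is finite, as the comment above argues.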
Just compile the shaders on-device before you start playing; that's way easier.
0:03 ah, Facing Worlds OST from OG UT. Good to know that Epic tries to erase UT from its history.
It's pretty shameful that a company like Epic, with all its resources still has problems with shader compilation and traversal stutter on PC
If I were Jensen at Nvidia, I would be calling Epic every day pressuring them to put in the work. So many studios use UE. To the average person, who has never heard of shader stutter, booting up a UE game would just put a black mark on PC gaming. This would have a negative effect on PC hardware sales.
I bought a PS5 for its small number of exclusives, but mainly to play the UE5 games that interest me there, just to avoid them on my PC (7800X3D/4090).
And this hurts smaller studios that can't afford to make their own engine. UE5 has so many great features, which attracts smaller studios who want to make something more than a side-scroller.
It's a difficult problem because of how flexibly artists can author new shaders in the engine, with potentially tons of different permutations. There is effectively no way of knowing ahead of time which permutations of those shaders will be used at runtime, and the engine was structured for APIs like DirectX 11 where the driver automatically recompiled/restructured pipelines for different state and shader combinations with less overhead. There is even a new Vulkan extension that reintroduces part of this dynamic driver recompilation to subvert the issue, but DirectX 12 still requires you to compile your pipelines ahead of time for all possible shader/state combinations you could encounter during runtime if you want to avoid this problem.
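The permutation problem described above can be made concrete with a toy example: every boolean "switch" an artist can toggle in a material roughly doubles the number of shader variants, so ahead-of-time compilation of every combination quickly becomes intractable. The feature names below are invented for illustration.

```python
from itertools import product

# Hypothetical static switches an artist might toggle per material.
features = ["normal_map", "emissive", "subsurface",
            "two_sided", "vertex_fog", "decal_blend"]

# Every on/off combination is a distinct shader permutation that would
# need its own compiled pipeline under a strict ahead-of-time model.
permutations = list(product([False, True], repeat=len(features)))
print(len(permutations))  # 2**6 = 64 variants for just six switches
```

With dozens of switches across hundreds of materials, plus pipeline state (blend modes, vertex formats, etc.) multiplying on top, it's clear why engines try to record only the permutations actually encountered (PSO caches) rather than compiling everything up front.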
@@TheSandvichTrials I assume you were talking about the dynamic rendering extension for Vulkan? If so DX12 doesn't yet have something like that that I'm aware of, but it does have an equivalent of Vulkan's new shader objects where you can compile individual shaders ahead of time and mix-and-match them at runtime, opting out of certain optimisations. The new work graphs feature of DX12 makes use of its equivalent of shader objects.
@@jcm2606 Interesting, I didn't know about that yet. The work graph stuff in general looks pretty nutso...
@@TheSandvichTrials Yeah, it's real fucking neat and could open up a ton of possibilities for GPU computation. Apparently Epic was playing around with porting Nanite to work graphs and held a presentation at GDC with their findings, so I'm hoping the footage of that is released, if it isn't already.
@@jcm2606 I watched that presentation, they showed Unreal running in that megascans canyon/desert environment, though they refused to share any details. But it seemed to work!
i would rather take 40 fps with no stutters than 144 fps with micro stutters. idc about this new graphical leap, just make sure my games don't stutter. PC is supposed to be the best platform, yet we always get the short end of the stick with stutters.
I wonder if we will see more 60 FPS games on console now
No
They would need to use newer versions of UE. The immortals of aveum devs said they're gonna upgrade the engine. Also Epic has said they wanna make both software lumen and hardware lumen cheaper.
Lol no, only cross-gen games (PS4 & PS5) were able to afford 60fps, but more and more current-gen-only PS5 games are basically 30fps by design. FF16 is an example: its performance mode is a joke, at 720p with drops to 30-40fps, absolutely not worth it. Minimal playable framerates (30fps on consoles) will always be the baseline for developers; once PS4 support is dropped, they will simply aim for 30fps on PS5.
Developers COULD design every game to run at 60fps... It would just require scaling back graphic features, but it seems gamers prefer more 'next gen' visuals and are willing to accept unstable 30fps performance at Dynamic resolutions that dip below 720p internal rendering. The games are being produced for KNOWN HARDWARE, so it's just a design decision not to prioritize 60fps.
@@StreetPreacherr Having less CPU limitation in the engine could influence design choices.
Very detailed and thorough analysis, as always.
Great job on the video, guys. Thanks for analyzing 5.4. The Matrix demo runs at a 720p base resolution; a 1080p base resolution would run slower :).
Is that 720p? In that case, it looks amazing. It might be the only case where 720p is acceptable, as long as it runs over 60fps.
@@techsamurai11 On ps5 I've heard it works in 720p 24fps. Maybe it was 720p upscaled to 1440p. On Frankenstein pc they test in 1080p upscaled to 4k. For fsr it's double visible resolution. Also big cost of upscaling to 4k. The problem is with games - for cyberpunk 1080p looks like 720p because of all the blur.
@@michahojwa8132 that's not what Alex showed right? He had a version that was at 30fps on the PS5
@@techsamurai11 Yes, I may have remembered the low points. Edit: watched the video, you were right. So the PS5 is even faster, because the Frankenstein PC ran 1080p native. But it's still strange: if it can't hold 30fps while driving, that would imply a resolution lower than 1440p, I'd say 1080p.
On PS5/Series X it was dynamic 1620p and most often 1440p before being upscaled with TSR.
Whereas Series S was dynamic 1080p and most often 720p before TSR.
Also as you can see in the video consoles were only 24fps in cutscenes. The rest of the game was consistent 30fps except driving fast or crashing the car which would tank frame rates.
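The resolution figures thrown around in this thread are easier to compare as raw pixel counts. A quick sketch (purely arithmetic, no engine specifics assumed): "double the visible resolution" per axis means 4x the pixels, which is why a 50% internal scale is such a heavy cut for an upscaler and denoiser to recover from.

```python
def pixels(width: int, height: int) -> int:
    """Total pixel count of a resolution."""
    return width * height

# 1080p internal upscaled to 4K: 2x per axis, 4x the pixel count.
scale_4k = pixels(3840, 2160) / pixels(1920, 1080)

# 720p internal upscaled to 1440p: also 2x per axis, 4x the pixels.
scale_1440p = pixels(2560, 1440) / pixels(1280, 720)

print(scale_4k, scale_1440p)  # 4.0 4.0
```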
11:41 The denoiser seems to struggle substantially more in 5.4, as visible in the lights under the bridge/overpass. So some of that extra performance may have been gained by reducing overall image quality.