We've reached a point where a game being announced as made on UE5 can be taken as bad news. The only developer I trust 100% with Unreal Engine is The Coalition; they always make the best possible use of each generation of the engine.
They're still working on UE5, it's effectively early access and is being updated and added on to all the time. Any AAA game deciding to use it while trying to tax it is choosing to run into issues.
It is at least good to see that competent devs are able to make good-running UE5 games, which shows the engine is very capable in the right hands. I also don't understand why they don't simply pre-load some assets in this game: when a cut-scene starts, just let it load for one second and everything should be fine. Otherwise it always falls back to the on-the-fly asset/texture loading UE does by default when no special flags are set. So again, lazy work.
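(For what it's worth, UE does expose an API for exactly this kind of warm-up. A minimal sketch in UE5 C++; the asset paths and the helper function are hypothetical, not from this game:)

```cpp
#include "CoreMinimal.h"
#include "Engine/AssetManager.h"
#include "Engine/StreamableManager.h"

// Hypothetical helper: request the cutscene's heavy assets up front, then
// start the sequence from the completion callback once they are resident,
// instead of letting the sequence trigger on-demand streaming mid-playback.
void PreloadThenPlayCutscene(TFunction<void()> PlayCutscene)
{
    TArray<FSoftObjectPath> Assets;
    Assets.Add(FSoftObjectPath(TEXT("/Game/Cinematics/Boss_Intro.Boss_Intro")));  // made-up path
    Assets.Add(FSoftObjectPath(TEXT("/Game/Characters/Boss/T_Boss_D.T_Boss_D"))); // made-up path

    FStreamableManager& Streamable = UAssetManager::Get().GetStreamableManager();
    // Async load; the delegate fires only after everything in Assets is loaded.
    Streamable.RequestAsyncLoad(Assets, FStreamableDelegate::CreateLambda(
        [PlayCutscene]() { PlayCutscene(); }));
}
```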
Bend Studio (Days Gone) did an amazing job as well. Although they used UE4, their game runs fine (no stutters), looks great and performs pretty adequately. A Ryzen 2600 + 1660 Super can easily run that game at 1080p/High/60+fps. What a shame Sony cancelled the sequel; it could have been amazing.
This seems to be one of the better performing UE games (especially with such good graphics). I mean he got 60+ FPS on a 9600K, even if at low settings, and this engine has usually been CPU-limited (architecture much more important than core count). Newest CPUs are over 50% faster in terms of single-threaded performance. Outside of stutters, pop-in and crashes, the game runs pretty well.
Interesting that they decided to make the High setting for Reflection Quality and Global Illumination ray traced. The devs should probably rename those tiers to "Medium, High, Ray Traced/Lumen", because it could confuse people until they either read the description or search online. Another solution would be a more noticeable warning in red text or something like that.
That's how Unreal 5 works when using Lumen: once you apply high settings it uses Lumen. I don't see any problem with that. If the high settings are Lumen-based, then they're Lumen-based. How can people complain that high settings are actually "high"?
@@tbunreall Using a completely different illumination technique at higher settings isn't the same as "high settings being high", especially considering that RT affects different cards in different ways.
@@HunterTracks Huh? What a weird way to look at things. It's a graphical setting; the method behind each preset is irrelevant as long as it's tied to its named option, e.g. reflection quality changes reflections.
@@HunterTracks You and the OP are splitting hairs, really. Settings affecting cards differently isn't unheard of before RT. And even so, I fail to see the relevance of the argument: if it works for you on High, awesome; if not, you can lower it to Medium. That's what you're supposed to do when you want better performance, and it works the same with all the other settings (with each setting having a different impact, of course).
What I've learned over the last few years is that you really shouldn't buy a game right at release. Most of them seem to "ship" with a ton of problems of one sort or another, and some are in an outright unfinished state. It's best to wait a while, and you'll have less frustration to deal with as a reward for your patience. (Plus, I'm too poor for these new titles that require at least a 70-series card just to consider playing at native 1080p and 60fps...)
This year I bought only... RE4 Remake, Street Fighter 6, Mortal Kombat 1, Jedi Survivor and the Cyberpunk DLC... Alan Wake 2 next... Don't trust anybody... Jedi Survivor was such a pain to play...
Best practice in gaming today: wait 3-6 months, buy the game after it's received a dozen patches fixing bugs and performance, and get it on sale for 30-50% off. If most people did this, it would HIGHLY incentivize publishers to let the devs FINISH the game and release an acceptable product at launch. As a consumer, it's a win-win if you just wait a few months. In the case of Cyberpunk I waited almost 2-3 years and the wait was definitely worth it. That's pretty much the case for almost every PC release nowadays.
You can tell most commenters didn't even bother to watch the video and just came here to rant: Throughout the whole video, even when he's focusing on some problems, Daniel is saying it's not that bad. But then you look at the comment section and people aren't responding to that at all, just post a preformulated sweeping Unreal Engine rant. I understand being wary of UE5, but at least watch the video and respond to it before you comment. Don't act like badly written NPCs.
Lumen is RT. It's a different kind of RT in the way it does things, but it's still RT; that's why dropping resolution boosts fps A LOT. The only thing hardware Lumen adds is precision (better shadowing precision and light bouncing).
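(For context, the software/hardware split is just an engine toggle. A sketch of the kind of DefaultEngine.ini lines that control it; cvar names and defaults vary by engine version, so treat these as illustrative rather than this game's actual config:)

```ini
; DefaultEngine.ini -- illustrative only
[/Script/Engine.RendererSettings]
; 1 = use Lumen for dynamic global illumination
r.DynamicGlobalIlluminationMethod=1
; 1 = trace on RT cores where available, 0 = software tracing
r.Lumen.HardwareRayTracing=1
```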
So the game is playable on a $300 7 year old GPU at 1080p native and people are mad at the performance and are saying its unoptimised/bad/engine sucks? I just want to throw out a sanity check here but in 2007 the Radeon 3870 released on the 19th of november for just under $300 and in a game released 6 days before on the 13th called Crysis at 1080p it could deliver around 13 fps. Might not be the fairest apples to apples comparison there for sure but for a game made on a new engine to be playable on 7 year/4 generation old low end hardware is if anything impressive.
I tried it on my Ryzen 5 5600 (non-X) and RX 6700 XT. It was a stutterfest with the Ultra preset, so I let the game auto-detect settings, set resolution scale to 100, and applied FSR2 Quality with sharpening at 60; now it performs really well. No stutters. Finally, I used the October 10 AFMF drivers, which further improved the smoothness.
I think most of the complaints are caused by people using hardware ray traced Lumen in the game without knowing it. The game doesn't have a switch for hardware ray tracing; it just gets enabled once you put certain settings to high or ultra. This is the biggest reason people are complaining about low fps, imo: they want to play on high settings like they're used to in every other game, but have, say, an RX 6700 XT, RTX 2070 or RTX 3060, and those GPUs get destroyed when heavy hardware RT is used.
@@ProtossOP I agree, but I only used it because of the sharpness. NGL, the details in most games these days are so blurry that I just toggle these settings as I please.
@@ProtossOP True, these devs are so lazy when it comes to optimization. Just slap on some Lumen and Nanite, no need for shader pre-compilation, and done. Release game, make money, fix later.
The game is great imo. I'm playing on a 3060 Ti and it runs pretty well after some tweaking; the auto settings are for sure trash. Having a blast with the game, it feels good, and combat feels very impactful and weighty. It definitely needs more optimization, but it's still playable on a lot of hardware for how good it looks, which is crazy. It's easy to get lost in this game, time-wise and literally lol. Keep up these videos dude, I always look for them when a new game comes out, love to see your tests and thoughts on these things!
This is purely Epic's problem, and they have to fix UE5's performance issues. Third-party engines are there to reduce programming overhead for devs, not increase it. Also, it's not an open-source engine (custom license, not GPL or MIT).
@@thegreatimu If their specs are above minimum then it is absolutely the engine's fault. Also, if there are games that look and run a lot better on the same hardware, then it's also the engine's fault. DICE's games look just as good and run at least double the framerate. RDR2 runs a lot better and also looks just as good.
@@Godmode_ON24 You can play this game at 1080p 50-60fps on a GTX 1060, which is the minimum requirement for the game. If that's too bad, then your expectations of what a 7-year-old budget graphics card can do are way too high.
Epic just needs to work on threading in the engine. We are in the age of 8-16 core (or more) CPUs, and the engine still doesn't care: it only uses a few threads for the main game work, and the rest just aren't doing much. And of course there's the eternal UE loading stutter/shader stutter. Maybe in 5.4+ we can hope to see them tackle these massive leftover issues. The rest of the engine is pretty damn great.
You could say that about the whole industry. I think you can count on one hand those that use multithreading... and by one hand I mean not even the whole hand. In fact, I can't even think of a game that noticeably benefits from multithreading.
You are right, there are not a lot. DX12 has overall been a proper fiasco for the gaming industry; the devs just don't seem to know what to do with it. There are maybe a handful of games that do it properly, like the Nixxes ports, Metro Exodus, Doom Eternal @@riczz4641
With the launch of 5.3 they added a refactor of the hardware API for multithreading. They said we may see the beginning of this in 5.4, but it will take a long time to fully implement.
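(For context on what "spreading the workload" from this thread means in UE terms: the engine does already expose a fork-join helper for embarrassingly parallel work. A toy sketch; the function and data names are mine, not from any shipping game:)

```cpp
#include "CoreMinimal.h"
#include "Async/ParallelFor.h"

// Toy example: per-element work with no cross-element dependencies can be
// fanned out to the task graph's worker threads instead of the game thread.
void UpdateParticleAges(TArray<float>& Ages, float DeltaSeconds)
{
    ParallelFor(Ages.Num(), [&Ages, DeltaSeconds](int32 Index)
    {
        Ages[Index] += DeltaSeconds; // each index is touched by exactly one worker
    });
}
```

The hard part the thread is pointing at is that most engine work (gameplay, actor ticking, rendering setup) has dependencies, so it can't be parallelised this trivially; that's what the 5.3+ refactor is for.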
Daniel, I have a serious question: How did this video go up in its current state? In most of the footage-including the very first setup-the game is running at 60fps or higher. The video is encoded and uploaded at 60fps. _Why is all of the video captured at 30fps??_ I skipped around and every moment of the video was clearly 30fps capture, regardless of game performance and regardless of the video's encoded framerate. Was this intentional, or just an oversight?
That’s the problem why release it when it’s not ready mind you games are costing 70 dollars now and we’re getting worse quality. Literally indie games are better then AAAs now in terms of quality especially on release day.
@@inowtf4299 honestly I don't even remember a single game I've ever played that didn't have some issues, even older games had issues. At least these days they can fix the issues with updates. Older games they couldn't.
@@inowtf4299 But I do agree, indie games are kind of the way to go these days. Most console exclusives I have no interest in, and most AAA games I find boring now. Don't get me wrong, I loved Elden Ring, but even it was a hot mess on release on all platforms; it had more issues at release than I have encountered so far with Lords of the Fallen.
@@kurrupoppo6937 I honestly don't remember many bugs in the old games I used to play back on PS2, PS3, GameCube etc. The ones there were felt more like you died or had to restart a level, but now the bugs make the game unplayable or completely ruin the experience. Sony has done pretty well with their AAAs: Spider-Man 2 came out and has had a few graphical bugs so far, but that's about it, and Ragnarok had no bugs for me on release. Time and quality control need to be a priority for sure nowadays.
Lumen in this game uses software RT only, the game is not built with shader support for hardware RT and force-enabling it will just lead to crashes due to lack of necessary shader packages.
Owen, one thing that really needs to be pointed out is that the fps always drops when a boss/enemy starts to swing or do any attack. People used to a fluid 60 in other souls games see this as a slowdown that needs to be fixed, because when you dodge/roll to avoid attacks while the fps drops, it always feels bad and janky.
Aside from ds2 sotfs, none of fromsoft's games are anything to write home about in terms of optimization. Maybe sekiro has decent optimization, but the rest are poorly optimized. And they stutter and lag as well. With fromsoft not even offering upscaling as an option.
@@enricod.7198 Only in a few specific loading zones where there's no combat. It used to be unplayable near release though, with stuttering far worse than this.
@@MLWJ1993 Only because you've compiled all the shaders; the first run is full of stutters, and the performance is still suboptimal for how it looks (I'm running an RTX 3070, 32GB of RAM and a 5800X3D at 1080p ultrawide lol). But you get a lot of hate from the usual delusional, toxic FromSoftware fans for saying the game runs like crap, even though they sold what, 20 million copies? FromSoft just isn't capable from the technical POV.
@@enricod.7198 Nah, I can wipe all my drivers & shader cache. It'll not stutter, it was never shader compilation stutter to begin with. Elden Ring is all traversal stutter & most of that is fixed now. It is however very GPU intensive as well. Likely due to the massive amounts of foliage (hence why "grass quality" is the biggest performance determining setting in the game).
I want to see some dev opinions on UE5. Is this on the devs, or on Epic for overpromising and underdelivering on what their engine is capable of? It might just be chance and modern industry habits that have caused all UE5 games to be full of issues, or it might be the engine itself. I've only seen pure speculation so far.
Fortnite is very optimized even on potato PCs, so idk if it is entirely Epic's fault. Most likely both parties: devs tend to be lazy nowadays, and UE5 is probably hard to work with. Optimization from both is needed.
"t might just be chance and modern industry habits that have caused all ue5 games to be full of issues" we're talking as if every singe pc users has the same high end pcs or what?
@@PhilosophicalSock Fortnite has a simplistic artstyle but it can look very good. It just has very scalable graphics settings so you can go from very nice visuals to very bad and roughly get the performance you expect. It also probably helps that it's developed by Epic. They own/work on the engine and must be very familiar with it at this point, where even if it is separate dev teams internally, they probably still have privileges and understanding other studios don't. The game also came out in 2017, so it's got plenty of work done to it post-launch. I'd imagine it also helps that its a competitive online game so they have to keep a performance oriented audience in mind at all times. Let's also not forget the amount of money that Fortnite makes and the amount of resources Epic are incentivized to keep pouring back into the game.
It's a new engine made for the next 10 years of games. In its early stages it's still not fully optimised, and lower-end hardware is not up to the task of driving the new features yet. Within a gen or two most of these issues should disappear; UE4 had similar issues with early titles. A lot of it is still on the devs not optimising their games properly. Regardless, it's probably overly optimistic to expect the likes of a 1060 to drive UE5 games well.
Pitfall of performance reviewers: you can't judge this game's performance from the first area. There are more graphically sophisticated areas later, but more importantly, you don't get many enemies in the Umbral world in the first section of the game. I think this game has an issue with the Umbral world: enemies constantly spawn/despawn there, and as far as I can see, the longer you stay in that world, the more performance deteriorates.
He did say this game likely has worse-performing areas later on. If it's this bad at the start, you can only imagine the later areas. If the 4090 is struggling at 4K ultra with just 50-60fps, it'll drop to 40-50fps later on.
Yes you can. Excusing a game's bad performance in one area is dumb. Every game should run at top performance no matter the area, you dumb NPC.
Funny you should say that. Generally you would be right, but as luck has it, the area he is playing in is the most performance-draining area of the game. It sits above the main hub, and the game constantly loads and unloads four areas when you run around there (Upper Calrath, Craneport, Skybridge, Skybridge catacombs); that is why he sees stutters when running up and down.
I've got pretty decent hardware, running the game around 70-80fps with a high/ultra mix, but for some reason it just never feels smooth. That's even setting the loading stutters aside; when it's just running normally, the frame pacing feels completely jacked. Might be worth capping it externally with RTSS or the like.
Feeling exactly like you my friend, and I run it at 150fps+... Never feels really smooth. I tried to cap FPS with RivaTuner and it still feels the same
The frametimes/frame pacing in this game just aren't good. It's tolerable but for me the movement just feels so inferior to Lies of P. I'm hoping everything will get better over time but so far Lies of P set the bar high and Lords of the Fallen just isn't keeping up.
I've been running the game for 15+ hours now on a 5600X + 6950 XT, and outside of 2-3 crashes, it's been seriously fine. 3440x1440, everything on high except GI and Reflections (so no RT), and I'm basically locked at 60 fps. It seems 70% of the people having major issues are on GeForce cards.
For me the problem was solved by enabling Resizable BAR on my Vega 56, by applying a ReBAR config for legacy ASICs and reinstalling the 23.9.2 drivers afterwards. You can make the ReBAR config yourself; just type this into Notepad and save it as a .reg file:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"KMD_EnableReBarForLegacyASIC"=dword:00000001
"KMD_RebarControlMode"=dword:00000001
"KMD_RebarControlSupport"=dword:00000001

Not a single stutter afterwards :) Hope this helps someone...
The issue I think many have is related to unrealistic expectations based on "benchmarks". People see, for example, that the RTX 4090 can achieve 100FPS at 4K in a cluster of current games. However, deciding on the "optimal" resolution should be based on performance in the most demanding current games, NOT the average. Had more people done this, they would understand that 1440p, not 4K, is really the optimal (90FPS+) resolution for max settings and ray tracing going forward from 2023 with UE5, etc. Because many did not, they are left blaming "game developers" for "unoptimized" games when they can't achieve optimal performance at 4K.

Me: 1440p / 90FPS+ > 4K / ~60FPS. This comment is never popular (people generally feel inclined to justify their purchase...) but it's true regardless. If you simply want to game on your ~55" 4K TV or whatever and are fine with ~60FPS, more power to you. I'm just saying 1440p would be a more sensible option for a high-frame-rate (90FPS+) experience for many, as these games aren't going to become any less demanding. 1080p to 1440p is a far more dramatic jump than 1440p to 4K, and 1440p on a ~27" monitor looks amazing, especially when you can run everything maxed with ray tracing at 90FPS+.

I also have a 55" LG OLED 4K / 120Hz / VRR which I've gamed on with my RTX 4090 / R9 7900X setup. It never felt like gaming at 4K was worth sacrificing the performance I get on my 1440p / 240Hz / G-Sync monitor.
@@justfun5479 "Premium experience"? Yeah, 1440p / 90FPS+ is certainly a more premium experience than 4K / 60FPS. Also, up to 240FPS in a first-person shooter (not talking esports) is a more premium experience at 1440p than ~120FPS at 4K. Regardless, there's this concept called "diminishing returns" beyond 1440p when it comes to games. I've gamed plenty at both 1440p and 4K, and the premium experience is at 1440p with the much higher frame rates.
@@justfun5479 High fps is the premium experience and the reason why many people buy high-end gear to begin with. If you just wanna game at 4k, even a PS5 or Xbox will do the trick. The only games that I play at 4k are those that aren't demanding (Diablo 4, Hitman) and can still hit 100+ fps or games that are hardlocked to 60fps (Elden Ring). And let's be real, anybody that can't detect the difference between 60fps and 120 or 165 can't detect the difference between medium and ultra settings either and is therefore effectively wasting money for zero benefits...I wouldn't be buying high-end hardware if it weren't for the performance, you can get 90% of the graphics on consoles for much, much cheaper. High fps? Not so much, only very few console games even have 120fps modes so this is where the $$$-draining starts. 60fps is subjectively very choppy and anything but a "premium" experience to me.
After watching videos from many different youtubers and reading the comments, I see that everyone forgets what UE5's sales pitch was: devs can use raw, high-definition 3D assets, and all the optimization happens inside UE5 (Nanite, Lumen). What does that translate to? Devs no longer have to care about optimizing the data footprint of 3D assets.

Before UE5 they had to optimize the assets (convert the raw data to optimized, compressed formats like DXT1/DXT3/DXT5). It's also why install sizes have ballooned: games used to hover around 10-30 GB, and now they hover around 50 to nearly 300 GB. Can't run at high fps? Just turn on TSR, DLSS or FSR.

With optimized formats, the data load is minimal: data moves from SSD/HDD to CPU to GPU cheaply, leaving more compute for actually rendering the scene. In UE5, the CPU, GPU and even the SSD spend more of their budget just moving unoptimized asset data around, and then the CPU/GPU have to chew through all of it to render the scene. No wonder stutter and lower fps happen. Without DLSS/FSR/TSR, stutter comes from the SSD/HDD loading assets and the CPU/GPU working on big data; with them, it comes from the same loading plus the engine discarding unneeded data as fast as it can.
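(For a sense of the scale involved, a back-of-the-envelope sketch. The block-compression ratios are the standard ones: BC1/DXT1 packs a 4x4 block into 8 bytes, BC3/DXT5 into 16, versus 4 bytes per pixel for raw RGBA8; the 4K texture size is just an example:)

```cpp
#include <cstdio>

int main() {
    const long long w = 4096, h = 4096, pixels = w * h;
    // Bytes per pixel: RGBA8 uncompressed = 4; BC3 (DXT5) = 1; BC1 (DXT1) = 0.5
    const double rgba8 = 4.0, bc3 = 1.0, bc1 = 0.5;
    const double mips = 4.0 / 3.0; // a full mip chain adds roughly 33%
    std::printf("RGBA8: %6.1f MiB\n", pixels * rgba8 * mips / (1 << 20)); // ~85 MiB
    std::printf("BC3  : %6.1f MiB\n", pixels * bc3  * mips / (1 << 20)); // ~21 MiB
    std::printf("BC1  : %6.1f MiB\n", pixels * bc1  * mips / (1 << 20)); // ~11 MiB
}
```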
The game has traversal stutters, which comes down to the devs: the assets are not optimized enough to support level streaming. Moreover, the game needed a shader precompilation step alongside the existing asynchronous runtime shader compilation, which would reduce the number of shaders the game has to compile every time it loads an area. Even the Unreal Engine editor itself compiles around 7000 shaders when I start it for the first time. I feel like Epic Games should have better documentation regarding shader compilation and asset streaming. This is my experience with Unreal Engine as I've been learning it over the past few months...
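(For reference, the UE PSO/shader pipeline cache this comment is describing is config-driven. A minimal sketch of the kind of DefaultEngine.ini entries involved; cvar names and defaults vary across UE4/UE5 versions, so treat these as illustrative:)

```ini
; DefaultEngine.ini -- illustrative only
[SystemSettings]
; enable the pipeline cache so PSOs recorded during QA playthroughs can be
; precompiled at startup/load instead of hitching mid-gameplay
r.ShaderPipelineCache.Enabled=1
; compile cached PSOs in bigger batches while a loading screen is up
r.ShaderPipelineCache.BatchSize=50
```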
No one plays at 50% resolution scale. At 1080p it's 100% native or, at most, 66% via DLSS or FSR Quality, and even quality-mode DLSS or FSR will look bad at 1080p. So 50% resolution scale is a very bad idea.
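(For reference, the internal resolutions behind the usual upscaler presets are easy to work out; the ratios below are the commonly documented ones, roughly 67/58/50%:)

```cpp
#include <cstdio>

int main() {
    const int outW = 1920, outH = 1080; // output resolution (1080p example)
    const struct { const char* name; double scale; } modes[] = {
        {"Quality     (~67%)", 2.0 / 3.0},
        {"Balanced    (~58%)", 0.58},
        {"Performance ( 50%)", 0.50},
    };
    for (const auto& m : modes)
        std::printf("%s -> %.0f x %.0f internal\n",
                    m.name, outW * m.scale, outH * m.scale);
    // Quality at 1080p output renders internally at about 1280x720,
    // which is why quality-mode upscaling to 1080p still looks soft.
}
```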
Man, is it just me, or does the FSR algorithm actually resolve *less* detail than the engine's own temporal reconstruction? I definitely think AMD needs to leverage those AI modules in the compute shader to compete with even the baked-in reconstruction technique now.
Upscalers look like dog crap in motion and only look barely acceptable when the output is 4K. We as gamers need to stop supporting devs who "develop with upscalers in mind". It looks so bad/blurry/pixelated/smeary in motion because it blends so many frames together. Gamers with crap standards are making upscalers the new industry standard, not devs.
4070ti / 7600x Ultra settings @ Native 1440p 65-75fps ~4hrs in to the game. So far, no crashes, bugs, or glitches. Game seems pretty stable and really looks fantastic visually. Had to force vsync off in nVidia control panel after recent game patch disabled vsync toggle in settings.
Same with a 4080 / i9 10850K. A couple of crashes, but besides that it has been running great, even at 4K. Auto settings give me Ultra everything with 2 random low settings. Around 70-90 FPS with DLSS Balanced.
Good to know. I'm going into this area of hardware with a PC I'm currently building, so, yeah. The TechPowerUp Relative Performance chart suggests I'll see about a 300% uplift over my current setup.
@@jasonvorhees6592 my girlfriend has the 2070 Super Laptop Version which is just a tiny bit faster than a desktop 2060 and I tested it out at 1440P all settings medium with dlss and it ran okay-ish. What CPU do you have? She has i7 10700H and 32 GB Ram
The "auto" settings seem to be very conservative when it comes to texture quality. That 3060 12GB had 7GiB of VRAM utilization at medium settings. There was plenty of space left to go higher. Texel performance isn't a limiting factor nowadays either.
This is also a very early area of the game. It's likely the settings are using some kind of rounded average based on tester's full game playthroughs, then offset likely 1GB down because running out of VRAM is such a showstopper when it comes to performance.
Hi Dan, watching your video. I have been playing since launch day, and personally I was more impressed with the performance than I thought I would be. The detail on display in this game is another level on good hardware. I use a 4080, 11th-gen Intel, fast DDR4 and a fast SSD, and I get extremely solid performance. I keep DLSS on DLAA or DLSS Quality at 4K output. Only the alternative world causes some dips; generally it's super smooth. I cap the game at 90 with V-sync on. I put in the code to allow frame gen again, but I don't use it anyway. At 4K that 7800X3D is no real boost; I found that out when I had one. It's only a gain over other CPUs at 1080p. Keep up the good work. The game is generally top tier... such enjoyment to play (so far).
Resolution scaling makes sense as the solution to 4K at high framerates. It also might make sense for pushing a mid-range GPU up to 1440p. But resolution scaling really shouldn't be a thing at 1080p. This game seems to think that xx60 = 1080p, xx70/80 = 1440p, and xx90 = 4K, with all the cards using upscaling. Personally I would pair the same GPUs with those resolutions but without the need for upscaling. Upscaling should just let you jump one resolution category, or move into triple-digit FPS, without being a baseline requirement.
@@niks660097 Cuz the only thing Unreal Engine is good at is trailers, demos and the cinematics department. They try to sneak it into gaming, but if you look closely it's not optimized for gaming at all, and all those "cool new features" are cinematics-focused.
I've said it from the very start, two years ago when I first saw UE5: UE5 will become the cancer of PC gaming if they can't fix these stutter issues. Not to mention UE5 in general is just WAY too heavy; it's the only engine that crawls to a halt on the Steam Deck. UE5 has some awesome features, but current hardware and consoles aren't ready for it at high resolutions, PLUS it has awful stuttering issues. I really fear for CD Projekt Red's next games, which have switched to UE5 instead of their incredible REDengine. So sad to see so many devs on UE5.
Many move to UE5 because Epic allows them to integrate some of their own engine features into UE. Game scale is getting bigger, and it becomes harder for a studio to maintain and update its own engine while also trying to create good games. The Witcher 2 and 3 were well received from the get-go, but many consider that CDPR screwed up big time with CP2077, be it the game itself or the engine. The real issue is that there is no true alternative competing with UE head-to-head as a triple-A engine now. Unity in general caters more towards indies, and its recent fiasco is also quite a problem. CryEngine is another good triple-A engine, but the Cevat brothers messed up the company years ago; it seems they would rather focus on creating their own games now than develop the engine to compete with the likes of UE.
Their next title probably won't see the light for 4-5 years. By then there will be a new gen of consoles and 2 gens ahead on graphics cards. Epic will have 4 years + to further optimise the engine as well. It won't be a problem by the time it launches. Expecting a steam deck to run the latest cutting edge graphics engine well is asking a bit much?
What was so incredible about REDengine? Cyberpunk launched in a broken state and required at least two years of post-launch support to get working properly. Maintaining and upgrading REDengine was a major factor in why Cyberpunk's development was so difficult.
@@arenzricodexd4409 It's not even an engine issue. There are UE games that don't have shader compilation stutter, but again, it requires devs to actually acknowledge it and work around it.
-Hey Mike how's the framerate on a 4090 & 7800X3D? -Just about 45-60 frames Joe -Ok Mike let's ship the game -But Joe we didn't even optimize it -Shush Mike, it's 2023, it's gonna be ok.
45 - 60... with maxed out graphic settings and 4K Native resolution. ... yeah that still sounds pretty bad come to think of it considering the hardware. :/
Maybe a little off topic, but I personally feel like UE5 games (like Remnant 2) don't look that horrible at low/medium graphics settings compared with games on other engines. So maybe instead of using FSR/DLSS to lower the resolution further, especially when you're already playing at native 1080p, you should just lower the graphics settings. And you do get a significant performance boost from turning high down to medium, unlike in a lot of other games where you gain just a marginal performance boost for a much bigger graphics trade-off.
"overall low/medium graphic settings doesn't look that horrible compare with other game engines' game" indeed but because of that it also has barelly any peformance gain as you lower the settings
Yes. People need to understand that the playing field has changed. Medium is the PS5/Series X setting; this is no longer 2012, when consoles were trash and medium settings were disgusting. Medium is really good. Medium is a PS5, meaning medium is the RX 6700 PC equivalent. PC gamers are a bit delusional thinking their GTX 1660s and GTX 1060s are still top dogs.
@@riczz4641 I mean, people's delusions are valid. Just look at all these new games: they're all demanding yet barely look better. Why? Because of the new engine. And if so, why use it when the old one could achieve the same visuals while being less demanding? Consoles being more powerful means devs need to optimize less for them, which in turn means even worse PC ports.
@@cxngo8124 Bro, c'mon. The 1060 is a GPU from 2016... you are literally the equivalent of a guy with a PS4 complaining that PS5 games don't run well. Don't give me the money BS; what can I tell you, this is a luxury. If you can afford $60 games then you can afford a GPU. I'm from a third-world country where the minimum wage is $350 per month; if I can save for a last-gen card then you can too.
I assume the number on the right in the 'RAM' stats is the amount of RAM used only by the game? How can I turn it on? I see only the total system RAM used...
Playing on a system with a 7800X3D and 7900 XTX, I had everything set to Ultra at 4K with FSR Balanced. For the system it is, I expected more than the roughly 70fps (at the lowest, in heavy areas) I was getting. I noticed AMD haven't got their driver up for the game yet because of the Anti-Lag stuff with CS2, so I'm hoping that brings some more performance. I did notice FSR makes some of the lighting look really sparkly, to the point where it's really distracting, so I turned it down to 1440p with no FSR; now I'm getting a better frame rate and the game looks much better. It definitely needs some work.
Bro it's ultra. Those settings are pretty much future proofing in games. I don't understand why people think they should be playing ultra settings at 100 fps
@@tbunreall I understand that, and I also don't expect over 100fps at ultra settings. I just thought I'd get more out of the upscaling, but it looks really bad anyway, so now I'm playing at 1440p without it. I get that the engine is for future games, but isn't it reasonable to think they should be aiming Ultra settings at current high-end cards? Save those extra Lumen pixels for the sequel.
@@tbunreall Sub-60fps. You can straight up see a 4090/7800X3D struggle to maintain 60fps, and at 4K that's what most will target unless they have higher-refresh displays. If you're struggling to keep 60fps afloat on the BEST HARDWARE currently available, I want to know what they even tested this game on to OK it.

Also, I'm sick of people arguing that "ultra is future proofing". It isn't, end of. RDR2 did exactly this, and how many people returned to that game when the 3000 series dropped? Yeah, exactly. People play games in the moment, not "when they have hardware capable of playing them". I wouldn't even consider going back to LotF when the 5000 series drops, because by then more demanding titles will be out that keep people interested and away from older titles riddled with performance problems.

Calling it "future proofing" is a cop-out. The entire point of Ultra, if you haven't realized, was so people could leverage FSR 3 and DLSS 3 frame generation. FSR 3 isn't in yet (but is confirmed) and DLSS 3 got removed. That's why Ultra exists: so anyone can run Ultra using frame gen for "smoother" fps at higher fidelity. It has absolutely nothing to do with future proofing. It's not a live-service game where they'll maintain support for years to come; it's a done deal. Give it 12 months and they'll have moved on in terms of bug fixing/support.
This game is a lot of fun especially if you are playing with friends and like Soulslike titles. Reduce global illumination to medium and maybe shadows if you’re having performance problems. I’m rotating 5 friends and we’re all having no issues just having a good time.
UE5's core feature set scales on a per-pixel basis; we've already observed this with Remnant 2 and Immortals of Aveum. As always, I'm more concerned about the uneven frametimes and poor CPU performance of UE5. This honestly has me worried a lot about upcoming releases.
Yeah, the multi-core/multi-threaded performance is crap; you are lucky if it uses four cores. All they care about is single-core performance that goes vroom vroom, rather than spreading the workload out amongst all the cores.
My son is playing this on a GTX 1070 and Ryzen 1600 (original) and hasn't crashed once so far and is enjoying it. He's playing on default settings and has 4-5 hours in now. From what I can tell it looks like its running great. Didn't turn on telemetry as it appears fluid enough visually.
@@pomps8085 On high settings + DLSS Balanced it runs well, though for a stable 60 fps all of the time you have to go DLSS Performance. Also, it crashes sometimes, but that's the game itself.
@@pomps8085 It runs well for me at 1440p DLSS Quality with my 3080, with all settings on high. I also use a 60fps cap; without it I got stutters, so a slightly lower but locked frame rate is much better than a higher but unstable one.
Great video in terms of analysing the performance, but you kinda completely avoided to mention what visual impact these settings actually have. I was watching this in 1440p60 and it's obviously very hard to judge anything from YT, but I didn't really notice the game looking any better on the 4k Ultra vs whatever resolution on low/medium. Especially didn't see anything with the raytraced settings compared to nonraytraced.
Not gonna lie, I'm very underwhelmed with UE5 thus far. Visually it just doesn't look better than some of the big triple-A games like RDR2, Horizon Forbidden West, Cyberpunk, etc. Idk, maybe CD Projekt Red will prove me wrong with their next game that's supposed to use UE5.
Did you buy any of the unreal 5 games so you can see first hand? I have not bought any unreal 5 games yet but from what I have been hearing from reviewers and seeing on youtube is that there is an improvement. I will wait until 6 months or more after the release to buy a few unreal 5 games that have come out so far.
I haven't had any issues. I'm running a 7900 XT and a Ryzen 3700X. I have most everything at its highest, other than a few things I don't really care about. I can't remember all my specific settings off hand, but if you've got a similar setup and want my settings to see if they work for any of y'all, I can post 'em when I get home. I'm loving this game, and I'd hate to be held back by crashes or bad lag.
One thing I've been wondering: what is the point of games having this level of graphical fidelity and features if, for most people, they're going to look like a pixelated mess due to devs relying on upscaling? I'd rather have a game that looks 'dated' but runs well and can be maxed out.
Game runs perfectly UNLESS you enable G-Sync. Then there are frame drops every time you kill a mob, sprint, or dodge. Was able to consistently recreate this. I’m aware UE5 breaks if you apply any OC to GPU (it crashed like usual when I tested), but stock boost meant no crashing.
The bigger disappointment for me is that UE5 still has the same stutter issues. The loading stutter can be tolerable depending on level design and how frequent it occurs, but is still very disappointing. The cutscene stutter for me is always a huge red flag. It's a scripted sequence. There has to be a way to avoid stutter and pop-in, right? Really does not bode well for future UE5 games, and supports my overall pessimism about the quality of software and hardware declining. Seems financialization is ruining everything by creating the wrong incentives and terrible workplace environments.
At Ultra settings it is actually using ray traced global illumination. Considering this, it doesn't run too badly 😀 High settings give a big performance boost.
Ultra settings are almost always for hardware multiple generations down the line. People get confused because they can play 3-6 year old games on Ultra, so why can't their current game run 4K Ultra? 😂
It's the engine. Ask yourself why games with custom engines running DX12 have no issues, but every UE5 release lately has them. Oh, and some of these games stutter so badly it can make you physically sick.
It IS very bad. For the fidelity it offers, the performance hit is so bad despite the best efforts of developers. LotF looks good, but it doesn't merit running at only about 70fps average on a 7800xt.
Hello, I've had an RTX 3060 since 2021 or so, and I've noticed the temperature rises a lot when playing at 2K/60fps: it reaches almost 80C, and the fan noise is very annoying. What I do is lock to 45 fps, which keeps the temperature under 75, but I would still like to play at 60 fps. Should I do a complete repaste, or could I just clean the fans? I don't dare open it completely, at the risk of being left with nothing. Thanks, Daniel.
50% render resolution is pretty terrible, but I guess for a 1060 that may be the normal way to run this game; some people may flip that to 100%. Video recording on the SAME machine can cause crashing and performance issues, even when using hardware encoding. The only way around this is a very high quality capture card, or perhaps outputting the recording to a networked machine for processing; that may be more stable.
For a game that doesn't look that great, the performance leaves something to be desired, and it's clear the devs don't understand which parts of the game consume the most resources. What I also find weird is that the frametime goes crazy, from as low as 12 ms up to 25 ms on the lowest capable hardware, while only a few things are happening.
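(For anyone curious how overlay numbers like these are derived, a rough sketch. Definitions of "1% low" vary between tools; this one uses "average fps over the slowest 1% of frames", and the frametime log is made up:)

```cpp
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

int main() {
    // Hypothetical frametime log in milliseconds, e.g. exported from an overlay tool
    std::vector<double> ms = {12, 13, 12, 14, 12, 25, 12, 13, 24, 12};

    double total = 0;
    for (double m : ms) total += m;
    double avgFps = 1000.0 * ms.size() / total; // average fps = frames / seconds

    // Slowest 1% of frames (at least one frame), averaged into a "1% low" fps
    std::sort(ms.begin(), ms.end(), std::greater<double>());
    size_t n = std::max<size_t>(1, ms.size() / 100);
    double worst = 0;
    for (size_t i = 0; i < n; ++i) worst += ms[i];
    double lowFps = 1000.0 * n / worst;

    // A couple of 24-25 ms spikes drag the 1% low far below the ~67 fps average,
    // which is exactly the "it never feels smooth" effect being described.
    std::printf("avg: %.1f fps, 1%% low: %.1f fps\n", avgFps, lowFps);
}
```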
Since the UE source code is available on GitHub, have any dev studios figured out how to solve the stutter? I vaguely remember that for UE4 even Gears 5 had stutter that The Coalition didn't eliminate (maybe they eventually fixed some of it post-launch?).
Everyone can't be The Coalition; they have always had to optimize Epic's mess. Also, it's not an open-source license: they have their own custom license, where pushing code isn't allowed.
I absolutely love this game so far (got almost 3 beacons out of 5). I do play on a 4090, but I haven't noticed any stutter. The game is great for anyone who misses the Dark Souls 1 feel that has been missing from later FromSoft games. The map is huge, the exploration is top notch, and there are so many branching paths and things to find.
Also, they used to have frame gen (deactivated because of issues, I guess), but with frame gen on at 4K DLSS Quality I almost always maxed out my 144Hz 4K monitor.
Me too. I downloaded a performance fix mod and the game hasn't stuttered or crashed since, and I'm really enjoying it. It needs optimisation for sure though; on my specs I should be getting way more fps.
But it's not their game to iterate on. It's someone else's game. I can't get my head around that; you can't just clone something and change a few things here and there, can you??
UE5 is a very young engine - I think it makes sense that we'll see growing pains for this first generation of UE5 games. I'm sure it'll improve as devs get more time with the new tools. I don't see why people are so against using upscalers, though.
I can explain that :) First: I have a 1080p monitor, and I very much notice every pixel rendered below native and upscaled; even at FSR Quality (720p internal resolution) I immediately lose a lot of fine detail (hair/fur, for example). Second: I will not accept that upscaling to 1080p is necessary just because the devs aren't given enough time by the publishers to optimize the game thoroughly. These are my reasons for refusing to use upscaling :)
Agreed that UE5 is a young engine, and I think many people miss that it's the first (big) engine to include ray tracing, via Lumen, as a core part of the design. It's really an engine for the next 10+ years of hardware, not the last 10.

As for the aversion to upscaling: personally it reminds me of consoles lying to consumers about what resolution they ran at, and the period where Xbox (360 era) owners kept trying to tell me their 540/720p console had better graphics than my natively rendering 1080p PC. The term is just ingrained in me as lesser, fake, a trick to scam the uninformed, and honestly that is still true in some cases. We still get people saying they are 'rendering' at 4K while they have DLSS Performance on, and they certainly are not rendering at that resolution, so they are misled.

Also, having used a bunch of DLSS now that it's an option, I can confidently say that, by the evidence of my own eyes, it's a visual downgrade overall, even if in some areas it works really, really well. My best example is Darktide, which I can play native at 1440p at a stable-ish 60fps, or enable DLSS (or FSR) to push higher. It's a fast-paced game, so that's useful, but it comes at the cost of visual errors, especially on fast-moving objects, and that's nearly all objects with my play style. In Cyberpunk I can play native without RT, or basic RT with DLSS, and even though the lighting is better with RT+DLSS, the game overall looks and feels worse. I think too many people do the zoomed-in-pixels compressed YouTube comparison thing and not enough actually play a game with the setting on; both sides of the argument might end up somewhere more reasoned (in the middle) after doing so.

The other issue is that people spend way more on GPUs now and don't feel the value. They are used to playing at native settings for much less money, and if you spent more on this GPU than the last one but feel you are playing on lesser settings, it creates a bad feeling. Thanks to GPU prices, I think that's every single one of us now. Ultimately it's all Nvidia's fault: they pushed up prices and said "you now render at lower resolutions than before, suck it chumps", and it's just taking the market a little while to get used to being bent over this far. Give it enough time and everyone will be rendering at 720p on 5K/6K monitors and be happy with that because it's 'better than native', just like how console owners used to brag about the TV saying 1080p with their 540p consoles. It just (about) works.

Upscaling on old hardware is amazing, and AMD supporting old cards is such a win for gamers, but it really has no place at the high end unless you're doing something extreme like trying to play at 8K.
@@notwhatitwasbefore I personally found that path-tracing in Cyberpunk increased the visuals significantly whereas upscaling (at 4K at least) had very little effect on visuals. Ray reconstruction made reflections look better for the most part - I always found reflections in the rain to be frustrating in that game because they were so noisy.
@@sapedibeppo Yeah, path tracing looks nice, but as my GPU delivers a staggering frame rate of around 240 FPM (frames per minute) with it, I was specifically referring to the difference between no RT at all and just turning the setting on at its lowest level: ray tracing, not path tracing.

Also, I'm at 1440p, and upscaling to 1440p does have a visual impact. It's one of those things: if you're upscaling from a good starting point it's fine, but as you drop down the resolutions it has less and less data to upscale from, and so isn't as good. I personally don't see a huge difference between 1440p and 4K when actually playing a game rather than staring at zoomed-in screenshots, so yeah, I agree that at 4K the difference is minimal if your render resolution is already 1440p or higher, because 1440p looks good/great.

On a 3070, with a slight tweak down from ultra, I get a stable 60fps with no DLSS and no RT. Just turning RT on with all the individual RT settings off (reflections, sun shadows, local shadows and lighting) means I require DLSS, and if I turn on reflections I can expect dips below 30fps even on Performance DLSS in some places. As you mention you're at 4K with path tracing, it's fairly safe to assume your GPU cost more than my whole PC, and yet we are rendering Cyberpunk at the same resolution. You just get nicer lighting, and, no offence, that's hilarious: you essentially paid like a grand more for nicer-looking puddles.

When I played Cyberpunk for the first time, using FSR to hit a playable frame rate at 1080p on an RX 480 (4GB!!), I was astounded at what was possible on my old, cheap (for my normal spend) GPU. In situations like that I am a huge proponent of upscaling, even if the visual quality is a bit iffy. But not once has that satisfaction been repeated using a better upscaler on a card that cost twice the money, and if I had spent twice as much again (or more, tbh) on a GPU, I would be so mad if I ever felt the need to turn on an upscaler while the card was current. Of course, in like five years, trying to stretch the life of a GPU for one more AAA game? Yeah, great tech, glad it exists.
Well, the 4090 is able to max it out? Just because you're not over 60 fps doesn't mean it can't run the game at max. For me it would be OK, because I can play at 30 fps; after you play for like 10 minutes you don't even notice.
Bashing "unoptimised" games these days seems to be a trend but i think we have collective amnesia because back in the days (2000, 2010's) if you take a 6 years old GPU, nothing AAA in 3D that came out was running on it, like at all. What we need to realise is: Yes 3rd party engines have a performance overhead, and the more complex/graphic intensive your game, the more it seems to be unoptimised, it is what it is, it's not dev fault, it's game companie's fault there, UE games were always a disaster in some areas since inception of UE as a third party engine. For a good couple of years PC hardware became stupidly cheap and now prices are skyrocketing to pre-2000 era prices, all mid-tier, low end hardware is just crap to keep market segmentation and justify the actual product at an inflated price, the tech market is now saturated, honeymoon is over it's time for profit.
I’m already tired of devs making their games with dlss required. A game should not come out where the absolute max hardware can’t handle it at a rock solid 60 maxed out.
The funny thing is I recently checked how my PC hardware ranks on the Timespy distribution, and my score was better than 67 percent of other people's. I have a mid-range PC and I'm starting to struggle in the newest games, but I also find it wild that PCs more than twice as powerful as mine are still having issues. It makes PC gaming so cringe.
Could you do a 4090 vs 7900XTX comparison when it comes to RT in this game? I've noticed that many UE5 games seem to do RT perfectly fine on AMD hardware, and you're probably the only one who's going to bother talking about this game. Also: The youtube compression algorithm really makes quality comparisons impossible. On my 768p laptop I could only barely tell a difference between 1080p Low with 50% render scale and 4k Ultra except for the better lighting. I'll have to watch it again on my desktop computer later... but I'm on the couch :P
Fwiw, I have a 7900xtx/7800x3d and I'm in the 50s with everything on Ultra at native 4k. FSR Quality at those settings locks pretty well to 60, at least fairly early in the game where I'm at.
@@millerc82 Thanks. That means +60FPS at 1440p Native which is the answer applicable to my situation. That also means that the XTX scales to the 4090 with RT on as it does in pure raster.... So another instance where "the Nvidia RT advantage" doesn't exist.
@@andersjjensen Lumen is software based ray tracing. However, it can be hardware accelerated in some games. This game probably doesn't utilize hardware acceleration for lumen.
@@TonyHerson People say HW acceleration doesn't make it run faster, but gives better visual quality (more detail). Take Fortnite: just around a 30% fps difference between the XTX and the 4090 with HW acceleration on. Real path tracing opens a gap of hundreds of percent, because Lumen is far from that level of RT workload.
The problem is not the engine but the game developers. They either don't have enough experience or are just too lazy to make a decently running game. The Coalition and Bend made top-class games (in terms of performance, optimization and smoothness) using UE4, while others just don't care.
Basically, if it's not made by Nixxes it's 90% likely to be trash (and even Nixxes isn't 100% flawless). (Also kinda funny/ironic how Sony usually has the good ports.) Maybe Alan Wake 2 will be fine, but it doesn't even come out on Steam.
Could just be Unreal Engine 5. It has legs to run, but it's walking due to features not being used; technology can't keep up with the engine. Nixxes isn't the end-all be-all.
@@gavinderulo12 The requirements just dropped and they're worse than I expected: you need a 6700 XT/3070 for Medium, upscaling from 540p to 1080p, if you want to achieve 60fps. It might be the heaviest game so far, considering that's with no ray tracing.
It doesn't seem so badly optimized. I mean, it's already next gen, guys. It's a tough pill to swallow, but we can't expect a GTX 1060 to get 60 fps, not even on low. In fact, this one gives you exactly that with upscaling, which is better than nothing... I think it's pretty generous.
@@Scott99259 Go play Cyberpunk at native 4K with maxed RT... You won't just get drops to 40fps, you'll get 40 average... Go try Remnant 2: same. The Witcher 3 next gen? 30 fps, maybe, outside the cities. I don't know what bubble you live in, but NOT EVEN a 4090 is ready for 4K native ray tracing. You are delusional if you think that's true. And if so, then name a game at this level of visuals that runs natively at 4K with RT. There are none...
It's quite sad to see that DLSS, once announced as an optional path to high-FPS 4K gaming, is nowadays pretty much required at 1080p if we want decent FPS. So instead of bringing high-framerate gaming to consoles, they brought 30/60 FPS gaming to PC.
The game runs incredibly poorly on series x as well. They released a second patch today that has definitely helped, but it still looks like a 360 game that you have to pay $70 for.
Isn’t it crazy that most demanding game CP 77 with all settings maxed doesn’t stutter, whilst some UE 4/5 game on medium settings stutters at every corner?
@@jameswayton2340 Yeah well, PC gamers are some of the biggest douchebags on the market. So forgive me if I couldn't care less about your game only running at 80fps 😭😭
No, it still has a lot of work ahead of it before it feels like a polished release. With the frequency of the patches, I see that they are trying, and I hope they do get there sooner rather than later. FPS drops all over the place and pixelated fire is not the final state of the game, I would hope, and that's not even getting into the gameplay balance issues they need to address as well
The game runs fine on my 4070 at ultra settings. The issues are probably because people are running old GPUs; since UE5 is demanding, it's to be expected.
Mine is fine. I've got an RTX 4070 running everything on ultra and getting an average of 80fps. Ryzen Threadripper 2950X (16 cores / 32 threads), ASUS ROG Zenith Extreme motherboard, 64GB Corsair RGB 3200MHz, 2TB Western Digital Black M.2 SSD, Alienware 38" ultrawide at 3840x1600.
I've had a few conversations about UE5 performance around the Talos Principle 2 demo, and a lot of people seem to think that whatever system they have should run the game on high or ultra at 60, and that if it doesn't, it's unoptimized. There's this one guy where auto-detect puts him at low everything, and he's complaining he can't get a solid 60 fps on high. Stutters are an issue, but a lot of this optimization talk comes from people who think their 3/5/7-year-old hardware should run the game at much higher settings than it can actually support years after it came out. On the one hand you get the guy with a 1080 Ti who's happy he can still play new games in 2023 and doesn't need to upgrade; on the other, the guy with a 3060 complaining his three-year-old mid-range card can't run the game on high.
Nobody expects that. We do expect competent games that don't have massive traversal/shader compilation stutter and we do expect games to be reasonably optimized. When you look at games like Cyberpunk, Metro Exodus and Dying Light 2 offering full ray tracing suites and top tier visuals in massive open worlds and then look at the state of Unreal Engine games over the last 3 years there's clearly an issue.
@@landfillao4193 Cyberpunk? The game that's used as the default Nvidia benchmark because at max settings even a 4090 can't get 4K 60fps? That Cyberpunk? The one that was so buggy on release it knocked over 75% off CD Projekt's stock? Who do you think you're lying to, mate? Because it sure as hell isn't me. Might be the guy you see in the mirror every morning, though.

CD Projekt almost went under because of Cyberpunk, and it took them three years to fix most of the bugs. And you know what else they did? They updated the minimum requirements while adding a new high end that shames even the highest-end cards. Three years of constant post-release development to get the game where it should have been at launch, new minimum specs that actually reflect the performance of the delivered product, and settings that go so high no card released yet can run them. And that's the first game your mind goes to when you think of the opposite of Unreal's performance issues?

That game suffers from everything people accuse UE5 games of without being on Unreal, but it's the game you use to showcase the problems with Unreal? The game that shows this isn't an Unreal issue but a wider issue systemic to the industry is what you're using as a critique of UE5's performance? Do you like Cyberpunk so much that you've been literally blind to its issues since launch?
It seems the early UE5 games coming out are just badly optimised; all the ones I've seen have limited vis blocking, choosing to have vast open spaces, which results in additional performance hits. I'm talking from game dev experience way back in 2004 or so with the Quake 4 engine, but the same dynamics apply, I'm sure. The early games seem to be focusing on visuals at the expense of smart level design to minimise tri count.
Dear Devs,
When I said I missed the games of 25 years ago, I wasn't talking about the resolutions you had to play at WITHOUT a 3d card.
640x480 at 15-20 FPS?
Didn't get you, mind rephrasing? Did you mean you want games which run without a dedicated GPU like in the good old days? Well, if that's the case I do agree with you... Nowadays it's all about graphics; game design, good music, sound effects and story are in the back seat, and I could have played with no issues on my integrated GPU, like Max Payne 1 and 2, the GTA series up to San Andreas, Need for Speed U1, U2, Carbon, Most Wanted etc., the Hitman series etc.
@@bmqww223 I think he is saying that he is playing at the same resolutions as 25 years ago, only with a GPU, because the game runs so badly.
Well, to be honest I also do that. Since I can't afford these expensive GPUs I run an RX 6400 at 1366x768, but I am able to play my favourite games with it, like RDR2 at ultra settings... :D @@CaZuaLDeMoN
@@NS-jj1tl People played at 320x240 with 10-15 fps on their Pentium II PCs running Windows 98.
I just can't believe they still haven't got around to fixing the Unreal Engine stutters after all these years.
I played the Talos Principle 2 demo (made with UE5) yesterday and it looks fantastic BUT I also noticed stuttering in that game unfortunately.
Epic doesn't care, devs don't care. Don't buy, so they will care.
@@gaetanoisgro6710 devs can do only so much if the game engine is the problem. This is a fantastic game, though.
Star Wars Jedi: Survivor and Hogwarts Legacy are the worst-performing Unreal games I've ever played.
@@garrusvakarian8709 the funny thing is the prequel had textures that were already below the industry standard. The sequel has better textures but they could also be a lot better.
it's absolutely mind blowing to me that a 4090/7800X3D setup only gets 50-60fps with everything cranked in one of the earliest areas. Upscaling seems like it's being used as a crutch rather than a tool.
Yeah, and then that title...
They are building the UE5 games with upscaling in mind.
@@dante19890 this is NOT what they were created to do. Epic again does something that in the end damages the consumer.
It does that as the game is using Lumen (software/hybrid ray tracing), Nanite (software-based mesh shading for high-polygon models) and Niagara (particle and fire effects).
People are a bit mistaken about what a 4090 can do... Dude, not even a 4090 can handle native 4K RT locked at 60fps; just go look at Cyberpunk, Hitman, The Witcher 3, Plague Tale. All of them will have drops below 60 at native 4K.
Sure you can do native 4k high refresh but not with RT.
Hold on, anyone else peep Daniel's skill at the very first boss? He is way better in this game than he is in Call of Duty 😅
He mentioned in one of his Elden Ring videos that FromSoftware games are some of his favourites. That's why he's good at this one.
Yup! :D
Sorry to kinda hijack your comment here as well but has anyone managed to fix or help the stutters that cut your fps in half when you roll or attack? I'll be getting 110 fps and then it'll dip down to 60 when I swing my sword, makes no sense and I have not been able to find ANYTHING about this specific issue online.
We reached a point where if a game is announced to be made on UE5 you can take that as bad news. The only developer I trust 100% with Unreal Engine is The Coalition. They are always the ones to make the best possible use of any generation of the engine
They're still working on UE5, it's effectively early access and is being updated and added on to all the time. Any AAA game deciding to use it while trying to tax it is choosing to run into issues.
It is at least good to see that competent devs are able to make well-running UE5 games. This shows us the engine is very capable in the right hands. I also don't understand why they don't simply pre-load some assets in this game: when a cut-scene starts, just allow it to load some stuff for 1 second and everything should be fine; otherwise it will always revert to the on-the-fly asset/texture loading UE does by default when no special flags are set. So again, lazy work.
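For what it's worth, a minimal sketch of that pre-load idea, assuming a UE5 C++ game module; the cutscene hook and the one-second budget are hypothetical, while IStreamingManager::StreamAllResources is standard engine API:

// Hypothetical cutscene hook: block-stream tracked resources up front
// instead of letting textures trickle in on the fly during the scene.
#include "ContentStreaming.h"

void OnCutsceneStarted()
{
    // Spend up to ~1 second fully streaming whatever the streaming
    // manager already tracks before the cutscene renders its first frame.
    IStreamingManager::Get().StreamAllResources(1.0f);
}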
Bend Studio (Days Gone) did an amazing job as well. Although they used UE4, their game runs fine (no stutters), looks great and performs pretty adequately. A Ryzen 2600 + 1660 Super can easily run that game at 1080p/High/60+fps.
What a shame Sony cancelled the sequel. It could be amazing.
Literally, yeah, I only trust the Gears of War devs (The Coalition) with that engine.
This seems to be one of the better performing UE games (especially with such good graphics). I mean he got 60+ FPS on a 9600K, even if at low settings, and this engine has usually been CPU-limited (architecture much more important than core count). Newest CPUs are over 50% faster in terms of single-threaded performance.
Outside of stutters, pop-in and crashes, the game runs pretty well.
Interesting that they decided to make the High graphics setting ray traced for Reflection Quality and Global Illumination. The devs should probably change that to "Medium, High, Ray Traced/Lumen", because it could confuse some people until they either read the description or search online. Another solution would be a more noticeable warning in red text or something like that.
That's how Unreal 5 works when using Lumen. Once you apply high settings it uses Lumen. I don't see any problem with that. If the high settings are Lumen-based, then they're Lumen-based. How are people going to complain that high settings are actually "high"?
@@tbunreall Using a completely different illumination technique at higher settings isn't the same as "high settings being high", especially considering that RT affects different cards in different ways.
There is an auto-detect button in the settings menu for less savvy people. It doesn't turn on those ray tracing options.
@@HunterTracks huh? What a weird way to look at things. It's a graphical setting; the method behind each preset is irrelevant as long as it's tied to its named option, e.g. reflection quality changes reflections.
@@HunterTracks You and the OP are splitting hairs, really. Settings affecting cards differently isn't unseen before RT. And even so, I fail to see the relevance of the argument. If it works for you on High, then awesome, if not, you can lower it to medium. That's what you're supposed to do when you want better performance, and it works the same with all the other settings (of course, with each setting having different impact)
What I've learned over the last few years is that you really shouldn't buy a game right at release. Most of them seem to "ship" with a ton of problems of one sort or another; some are actually in an unfinished state. It's best to wait a while, and you'll have less frustration to deal with as a reward for your patience. (Plus, I'm too poor for these new titles that require at least a 70-series card to be able to consider playing at native 1080p and 60fps...)
I don't feel sorry for those who pre-order.
Except for playstation exclusives. They always launch extremely polished.
A lot of devs are lazy as fuck
This year I bought only... RE4 Remake, Street Fighter 6, Mortal Kombat 1, Jedi Survivor and the Cyberpunk DLC... Alan Wake 2 next... Don't trust anybody... Jedi Survivor was such a pain to play...
@@gavinderulo12 That depends; FF16 is still choppy as hell in framerate mode.
Best practice in gaming today: wait 3-6 months, buy the game after it's received a dozen patches fixing bugs and performance, and get it on sale for 30-50% off. If most people would do this, it would HIGHLY incentivize publishers to let the devs FINISH the game and release an acceptable product when they launch it. As a consumer, it's a win-win if you just wait a few months. In the case of Cyberpunk I waited almost 2-3 years and the wait was definitely worth it. Pretty much the case for almost every PC release nowadays.
Can't wait to see what a stutterfest Ark SA will be. It was a nightmare in UE4; it can only get worse.
Man, is there still no news except the shareholder thing that forced them to release the game this month? This sucks hard and is not reassuring.
thankfully we'll probably never get to see it
You can tell most commenters didn't even bother to watch the video and just came here to rant: Throughout the whole video, even when he's focusing on some problems, Daniel is saying it's not that bad. But then you look at the comment section and people aren't responding to that at all, just post a preformulated sweeping Unreal Engine rant.
I understand being wary of UE5, but at least watch the video and respond to it before you comment. Don't act like badly written NPCs.
Lumen is RT; it's a different kind of RT in the way it does things, but it's still RT. That's why resolution boosts fps A LOT. The only thing hardware Lumen does is make Lumen more precise (better shadowing precision and light bouncing).
So the game is playable on a $300, 7-year-old GPU at 1080p native, and people are mad at the performance and saying it's unoptimised/bad/the engine sucks? I just want to throw out a sanity check here: in 2007 the Radeon 3870 released on the 19th of November for just under $300, and in a game released 6 days before on the 13th called Crysis, at 1080p it could deliver around 13 fps.
Might not be the fairest apples to apples comparison there for sure but for a game made on a new engine to be playable on 7 year/4 generation old low end hardware is if anything impressive.
I tried it on my Ryzen 5 5600 (non-X) and RX 6700 XT. It was a stutterfest with the Ultra preset, but then I let the game auto-detect, set Resolution to 100, applied FSR2 Quality with Sharpening at 60, and it performs really well. No stutters. Finally I used the October 10 AFMF drivers, which further improved the smoothness.
I mean, the fact you have to play with an upscaler on genuinely good, modern hardware to have an enjoyable experience is actually really bad.
I think most of the complaints are caused by people using hardware ray tracing Lumen in the game without knowing they're using it.
The game doesn't have a switch for hardware ray tracing; it just gets enabled once you put certain settings to high or ultra.
This is the biggest reason why people are complaining about low fps imo: they want to play on high settings as they're used to from all other games, but have, say, an RX 6700 XT, RTX 2070 or RTX 3060, and those GPUs get destroyed when heavy hardware RT gets used (see the note below).
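As an aside, UE5 exposes that software/hardware distinction through a standard console variable; here's a minimal sketch of querying it, assuming a build where the console is reachable (shipped games often lock it down) and with LogLumenMode being a hypothetical helper:

// Query the standard Lumen CVar to see which tracing mode is active.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

void LogLumenMode()
{
    // r.Lumen.HardwareRayTracing: 0 = software tracing (screen/distance-field),
    // 1 = hardware ray tracing where the GPU and build support it.
    if (IConsoleVariable* HwLumen =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing")))
    {
        UE_LOG(LogTemp, Log, TEXT("Lumen hardware RT: %d"), HwLumen->GetInt());
    }
}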
@@ProtossOP I agree, but I only used it because of sharpness. NGL, the details in most games these days are so blurry that I just toggle them as I please.
@@damara2268 True. I do wish the options said what is really heavy on the hardware.
@@ProtossOP True, these devs are so lazy when it comes to optimization. Just slap on some Lumen and Nanite, no need for shader pre-compilation, and done. Release game, make money, fix later.
The game is great imo. I'm playing on a 3060 Ti and it runs pretty well after some tweaking; the auto settings are for sure trash. Having a blast with the game; it feels good, and combat feels very impactful and weighty. It definitely needs more optimization, but it's still playable on a lot of hardware for how good it looks, which is crazy. It's easy to get lost in this game, time-wise and literally lol. Keep up these videos, dude; I always look for them when a new game comes out. Love to see your tests and thoughts on these things!
auto settings be like "oh you have gtx 1050 and intel core i3, let's give you ULTRA RAT-TRACING AND 4K UPSCALED TO 8K SETTINGS"
This is purely Epic's problem and they have to fix UE5's performance issues. Third-party engines are there to reduce programming overhead for devs, not increase it. Also, it's not an open-source engine (custom license, not GPL or MIT)...
If someone plays the game with a low end pc, and has stutters, is that UE5's fault or the pc user's fault?
@@thegreatimu RDR2 looks phenomenal and runs smoothly on lower-end machines; an in-house engine can make a lot of difference for optimisation.
@@thegreatimu the user's fault
@@thegreatimu if their specs are above minimum then it is absolutely the engine's fault.
Also, if there are games that look and run a lot better on the same hardware, then it's also the engine's fault.
DICE's games look just as good and run at at least double the frame rate. RDR2 runs a lot better and also looks just as good.
@@Godmode_ON24 you can play this game at 1080p 50-60fps on a GTX 1060, which is the minimum requirement for the game.
If that's too bad, then your expectations of what a 7-year-old budget graphics card can do are way too high.
Epic just needs to work on threading in the engine. We are in the age of 8-16 core (or more) CPUs and the engine still doesn't care, because it only needs a few threads for the main game work and the rest are just not doing much. And of course the eternal UE loading stutter/shader stutters. Maybe in 5.4+ we can hope to see them tackle these massive leftover issues. The rest of the engine is pretty damn great.
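For readers unfamiliar with what "spreading the work out" would look like, a rough sketch using UE's own ParallelFor helper from Core; the function and data here are hypothetical, not Epic's code, and real gameplay work is harder to parallelize because of shared state:

// Fan a per-element update out across the task-graph worker threads
// instead of burning one game thread while the other cores idle.
#include "Async/ParallelFor.h"
#include "Containers/Array.h"

void UpdateTimersInParallel(TArray<float>& Timers, float DeltaSeconds)
{
    ParallelFor(Timers.Num(), [&Timers, DeltaSeconds](int32 Index)
    {
        Timers[Index] += DeltaSeconds; // per-element work must be thread-safe
    });
}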
You could say that about the whole industry. I think you can count on one hand those that use multithreading... and by one hand I mean not even the whole hand... in fact I can't even think of any game that noticeably benefits from multithreading.
I believe Unreal doesn't even use 6 cores properly yet.
@@riczz4641 You are right, there are not a lot. DX12 has overall been a proper fiasco for the gaming industry; the devs just don't seem to know what to do with it. There are maybe a handful of games that do it properly, like the Nixxes ports, Metro Exodus, Doom Eternal.
@@riczz4641 Cyberpunk
With the launch of 5.3 they added a refactor of the hardware API for multithreading; they said we may see the beginning of this in 5.4, but it will take a long time to fully implement.
Daniel, I have a serious question: How did this video go up in its current state? In most of the footage-including the very first setup-the game is running at 60fps or higher. The video is encoded and uploaded at 60fps. _Why is all of the video captured at 30fps??_ I skipped around and every moment of the video was clearly 30fps capture, regardless of game performance and regardless of the video's encoded framerate. Was this intentional, or just an oversight?
It's definitely got its issues, but I'm loving it still. Seems like the devs care and will fix lots of the problems with future patches.
That's the problem: why release it when it's not ready? Mind you, games are costing 70 dollars now and we're getting worse quality. Literally, indie games are better than AAAs now in terms of quality, especially on release day.
@@inowtf4299 honestly I don't even remember a single game I've ever played that didn't have some issues, even older games had issues. At least these days they can fix the issues with updates. Older games they couldn't.
@@inowtf4299 But I do agree, indie games are kind of the way to go these days. Most console-exclusive games I have no interest in, and most AAA games I find boring now. Don't get me wrong, I loved Elden Ring, but even it was a hot mess on release for all platforms; it had more issues on release than I have encountered so far with Lords of the Fallen.
I think if they cared they would have done more testing before release. Selling a defective game and then fixing it shouldn't get praised. It destroyed our trust.
@@kurrupoppo6937 I honestly don't remember many bugs in the old games I used to play back on PS2, PS3, GameCube etc., and the ones that existed were more like you died or had to restart a level; but now the bugs make the game unplayable or just completely ruin the experience. Sony has done pretty well with their AAAs: Spider-Man 2 came out and has had a few graphical bugs so far but that's about it, and Ragnarok had no bugs for me on release. But time and quality control need to be a priority for sure nowadays.
Lumen in this game uses software RT only, the game is not built with shader support for hardware RT and force-enabling it will just lead to crashes due to lack of necessary shader packages.
I actually blacklist games using UE, I just can’t stand the stuttering anymore.
I'm 8 hours in and never saw a stutter. I'm actually really sensitive about it and always looking at the frametime graph with Afterburner.
The 1060 is a seven-year-old GPU; not sure it was a sensible choice for a min spec.
Owen, one thing that really needs to be pointed out is that the fps always drops when the boss/enemy starts to swing or do any attack. People used to a fluid 60 in other souls games see these as slowdowns that need to be fixed, because when you dodge/roll to avoid attacks while the fps drops, it always feels bad and janky.
like elden ring doesn't stutter..
Aside from ds2 sotfs, none of fromsoft's games are anything to write home about in terms of optimization. Maybe sekiro has decent optimization, but the rest are poorly optimized. And they stutter and lag as well. With fromsoft not even offering upscaling as an option.
@@enricod.7198 Only in a few specific loading zones where there's no combat.
Used to be unplayable near release though with stuttering far worse than this.
@@MLWJ1993 only because you compiled all shaders, first run it's full of stutters and the performance is still suboptimal for how it looks (I'm running an rtx 3070, 32gb of ram and a 5800x3d at 1080p ultrawide lol). But you get a lot of hate from the usual delusional fromsoftware toxic fans for saying that the game runs like crap even if they sold what, 20 milliln copies? Fromsoft aren't capable from the technical pov.
@@enricod.7198 Nah, I can wipe all my drivers & shader cache. It'll not stutter, it was never shader compilation stutter to begin with. Elden Ring is all traversal stutter & most of that is fixed now.
It is however very GPU intensive as well. Likely due to the massive amounts of foliage (hence why "grass quality" is the biggest performance determining setting in the game).
Pretty sure the stuttering is a UE5 problem.
I want to see some dev opinions on UE5. Is this on the devs, or on Epic for overpromising and underdelivering on what their engine was capable of?
It might just be chance and modern industry habits that have caused all UE5 games to be full of issues. Or the engine itself. I have only seen pure speculation so far.
Fortnite is very optimized even on potato PCs, so idk if it is entirely Epic's fault. Most likely both parties. Devs tend to be lazy nowadays, and UE5 probably is hard to work with. Optimization from both is needed.
@@quadragoo8484 Bad example. Fortnite has literally 0 graphics compared to new games
"t might just be chance and modern industry habits that have caused all ue5 games to be full of issues" we're talking as if every singe pc users has the same high end pcs or what?
@@PhilosophicalSock Fortnite has a simplistic artstyle but it can look very good. It just has very scalable graphics settings so you can go from very nice visuals to very bad and roughly get the performance you expect.
It also probably helps that it's developed by Epic. They own/work on the engine and must be very familiar with it at this point, where even if it is separate dev teams internally, they probably still have privileges and understanding other studios don't. The game also came out in 2017, so it's got plenty of work done to it post-launch. I'd imagine it also helps that its a competitive online game so they have to keep a performance oriented audience in mind at all times. Let's also not forget the amount of money that Fortnite makes and the amount of resources Epic are incentivized to keep pouring back into the game.
It's a new engine made for the next 10 years of games. In its early stages it's still not fully optimised, and lower-end hardware is not up to the task of driving the new features yet. Within a gen or 2 most of these issues should disappear. UE4 had similar issues with early titles. A lot of it is still on the devs not optimising their game properly. Regardless, it's probably overly optimistic to expect the likes of a 1060 to drive UE5 games well.
Pitfall of performance reviewers... You can't judge this game's performance from the first area of the game. There are more graphically sophisticated areas, but more importantly, what I notice is that you do not get many enemies in the Umbral world in the first section of the game. So I think this game has an issue with the Umbral world: in that world enemies constantly spawn/despawn, and as far as I can see, the longer you stay there, the more the performance deteriorates.
He did say this game likely has worse-performing areas later on. If it's this bad at the start, then you can only imagine the later areas. If the 4090 is struggling at 4K ultra with just 50-60fps, it'll drop to 40-50fps later on.
Yes you can. Excusing a game for performing badly in one area is dumb. Every game should run at top performance no matter the area of the game, you dumb NPC.
Funny you say that; generally you would be right, but as luck has it, the area he is playing in is the most performance-draining area of the game. This is the area over the main hub; the game loads and unloads 4 areas constantly when you run around there (Upper Calrath, Craneport, Skybridge, Skybridge catacombs). That is why he sees stutters when running up and down there.
I've got pretty decent hardware, running the game around 70-80fps with a high/ultra mix, but for some reason it just never feels smooth. That's even taking the loading stutters into consideration, when it's just running normally it feels like the frame pacing is completely jacked. Might be worth capping it externally with RTSS or the like.
Feeling exactly like you my friend, and I run it at 150fps+... Never feels really smooth. I tried to cap FPS with RivaTuner and it still feels the same
The frametimes/frame pacing in this game just aren't good. It's tolerable but for me the movement just feels so inferior to Lies of P. I'm hoping everything will get better over time but so far Lies of P set the bar high and Lords of the Fallen just isn't keeping up.
That's just the typical Unreal Engine experience unfortunately
@@TonyHerson I'm pretty sure I already played UE games that felt really smooth but if what you said is true, well that's a shame
@@TonyHerson typical Unreal Engine experience? You mean super-high-end Unreal Engine games that are not supposed to run well on low-end PCs?
I've been running the game for 15+ hours now on a 5600X + 6950 XT, and outside of 2-3 crashes, it's been seriously fine. 3440x1440, everything on high except GI and Reflections (so no RT), and I'm basically locked at 60 fps. It seems 70% of the people having major issues are on GeForce cards.
For me the problem was solved by enabling Resizable BAR on my Vega 56: I downloaded a ReBAR config for legacy cards and reinstalled the 23.9.2 drivers afterwards. You can make the ReBAR config yourself; just type this in Notepad and save it as a .reg file:
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"KMD_EnableReBarForLegacyASIC"=dword:00000001
"KMD_RebarControlMode"=dword:00000001
"KMD_RebarControlSupport"=dword:00000001
Not a single stutter afterwards :) Hope this helps someone...
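(A hedged usage note for anyone unsure how to apply the above: double-click the .reg file in Explorer to merge it, or run "reg import yourfile.reg" from an elevated command prompt with whatever filename you saved, then reinstall the driver as described and reboot. Edit the registry at your own risk.)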
The issue I think many have is related to unrealistic expectations based on “benchmarks”.
People see for example the RTX 4090 can achieve 100FPS in a cluster of current games at 4K.
However, deciding on the “optimal” resolution should be based on performance via the most “demanding” current games NOT the “average”.
Had more people done this, they would understand that 1440p, not 4K, is really the "optimal / 90FPS+" resolution for max settings / ray tracing going forward 2023+ w/ UE5, etc.
Now because many did not do this they are left to blame “game developers” for “unoptimized” games due to not being able to achieve optimal performance at 4K..
Me: 1440p/ 90FPS+ > 4K/~ 60FPS
This comment is never popular (ppl generally feel inclined to justify said purchase..) but it's true regardless.
Now if you simply just want to game on your ~ 55” 4K TV or whatever and fine with ~ 60FPS more power to you.
Just saying 1440p would be a more sensible option (for a high frame rate 90FPS+ experience) for many as these games aren’t going to become any less demanding..
1440p to 4K is nowhere NEAR as dramatic as 1080p to 1440p.
1440p on a ~ 27” monitor looks amazing especially when you can run everything to the max w/ ray tracing @ 90FPS+
I also have a 55” LG OLED 4K / 120Hz / VRR which I’ve gamed on with my RTX 4090 / R9 7900X / etc.. setup.
Never felt like gaming at 4K was worth sacrificing the performance I can get on my 1440p / 240Hz / G-Sync monitor.
This is your opinion. If I pay 1600 bucks for a GPU, I expect a premium experience for the next 4 years minimum.
@@justfun5479
“Premium experience”
Yeah 1440p / 90FPS+ is certainly a more “Premium experience” vs 4K / 60FPS
Also, up to 240FPS in First Person Shooter (not talking e sports) is a more “Premium experience” at 1440p vs. ~ 120FPS at “4K”
Regardless, there's this concept called "diminishing returns" beyond 1440p when it comes to games.
I’ve gamed plenty at both 1440p & 4K and the “premium experience” is at 1440p with the much higher performance / frame rates.
@@justfun5479 not anymore
@@justfun5479 High fps is the premium experience and the reason why many people buy high-end gear to begin with.
If you just wanna game at 4k, even a PS5 or Xbox will do the trick. The only games that I play at 4k are those that aren't demanding (Diablo 4, Hitman) and can still hit 100+ fps or games that are hardlocked to 60fps (Elden Ring).
And let's be real, anybody that can't detect the difference between 60fps and 120 or 165 can't detect the difference between medium and ultra settings either and is therefore effectively wasting money for zero benefits...I wouldn't be buying high-end hardware if it weren't for the performance, you can get 90% of the graphics on consoles for much, much cheaper. High fps? Not so much, only very few console games even have 120fps modes so this is where the $$$-draining starts.
60fps is subjectively very choppy and anything but a "premium" experience to me.
@@gameurai5701 I agree on the FPS part, but not on the resolution.
1600 bucks need to deliver also in 4K.
After watching many videos from different youtubers and reading the comments, I see that everyone forgets what the UE5 sales pitch was.
The UE5 sales pitch was "devs can use raw high-definition 3D assets; all the optimization would be in UE5 (Nanite, Lumen)".
What does this translate to?
Devs don't have to care about optimizing the data footprint of 3D assets.
Before UE5 they had to optimize the 3D assets (convert the raw data to an optimized format, aka compressed formats like DXT1, DXT3, DXT5, etc. — rough numbers in the sketch below).
That is why we see games go up to hundreds of GB of data.
Before UE5, games hovered around 10 to 30 GB;
now games hover around 50 to just under 300 GB.
But now, can't run at high fps? Just turn on TSR, DLSS, FSR.
With optimized formats the data load would be minimal; data going from SSD/HDD to CPU to GPU would leave more computational headroom for those devices to render the scene. Now in UE5 the CPU, GPU and even the SSD need more of their capacity just to move the asset data around, and then the CPU and GPU have to work with all that unoptimized data to render the scene. No wonder stutter happens and fps is lower.
Without DLSS, FSR, TSR: stutter caused by the SSD/HDD loading assets and the CPU/GPU working on big data.
With DLSS, FSR, TSR: stutter caused by the SSD/HDD loading assets and the CPU/GPU working in the engine to discard unneeded data as fast as they can.
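To put rough numbers on the compression point above, a back-of-the-envelope sketch using the standard block-compression rates (RGBA8 = 32 bits/pixel, DXT1/BC1 = 4, DXT5/BC3 = 8); the 4096x4096 size is just an example and mip chains are ignored:

// Back-of-the-envelope footprint of one 4096x4096 texture.
#include <cstdio>

int main() {
    const long long px  = 4096LL * 4096LL;   // 16,777,216 pixels
    const long long MiB = 1024LL * 1024LL;
    std::printf("RGBA8 uncompressed: %lld MiB\n", px * 4 / MiB); // 64 MiB
    std::printf("DXT1/BC1          : %lld MiB\n", px / 2 / MiB); // 8 MiB
    std::printf("DXT5/BC3          : %lld MiB\n", px     / MiB); // 16 MiB
    return 0;
}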
The game has traversal stutters, which comes down to the devs; the assets are not optimized enough to support level streaming. Moreover, the game needed a shader precompilation step alongside the existing asynchronous runtime shader compilation, which would reduce the number of shaders the game has to compile every time it loads an area (see the sketch below).
Even the Unreal Engine editor itself compiles around 7000 shaders when I start it for the first time. I feel like Epic Games should have better documentation regarding shader compilation and asset streaming. This is my experience with Unreal Engine as I've started learning it over the past few months...
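A minimal sketch of the precompilation step being described, assuming a UE5 C++ game module; the loading-screen hook is hypothetical (and whether this game records a usable PSO cache is an assumption), while the FShaderPipelineCache calls are standard engine API:

// Hypothetical loading-screen hook: compile the recorded PSO cache up
// front so pipeline states aren't compiled mid-gameplay.
#include "ShaderPipelineCache.h"
#include "HAL/PlatformProcess.h"

void PrecompileShadersDuringLoadScreen()
{
    // Batch compiles as fast as possible while the load screen hides the cost.
    FShaderPipelineCache::SetBatchMode(FShaderPipelineCache::BatchMode::Fast);

    // Hold the loading screen until the cached PSOs are done compiling.
    while (FShaderPipelineCache::NumPrecompilesRemaining() > 0)
    {
        FPlatformProcess::Sleep(0.1f); // real code would keep ticking the engine here
    }

    // Drop back to low-impact background batching for gameplay.
    FShaderPipelineCache::SetBatchMode(FShaderPipelineCache::BatchMode::Background);
}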
Non-UE5 games are also huge. Look at Jedi Survivor, which is UE4 and is 120GB. Call of Duty Modern Warfare is also over 100GB.
UE5 is killing gaming forever and I don't know how people continue making games on that engine.
No one plays at 50% res scale.
It's 100% native,
or 66% (DLSS or FSR Quality) at 1080p.
At 1080p even quality DLSS or FSR will look bad,
so 50% res scale is a very bad idea.
Man, is it just me or does the FSR algorithm actually resolve *less* detail compared to the engine's temporal reconstruction?
I definitely think AMD needs to leverage those AI modules in the compute shader to compete with now even the baked-in reconstruction technique.
Not just you, TSR did look better to me too, although neither actually looks "good" at those crazy scaling factors.
You're looking at a compressed video stream as well
Upscalers look like dog crap in motion and only look slightly acceptable when the output is 4k.
We as gamers need to stop supporting devs who "develop with upscalers in mind".
It looks so bad/blurry/pixelated/smeary in motion because it blends so many frames together.
Gamers with crap standards are making upscalers the new industry standard, not devs.
4070ti / 7600x
Ultra settings @ Native 1440p
65-75fps
~4hrs into the game. So far, no crashes, bugs, or glitches. The game seems pretty stable and really looks fantastic visually. Had to force vsync off in the Nvidia control panel after a recent game patch disabled the vsync toggle in settings.
Same with 4080 / i9 10850K
Couple crashes but besides that has been running great even at 4K
Auto settings give me Ultra Everything with 2 random low settings
Around like 70-90 FPS with DLSS balanced
On my 2060S it is a stuttery mess.
Good to know. I'm going into this area of hardware with a PC I'm currently building. So, yeah. The TechPowerUp relative performance chart suggests I will see a ~300% uplift over my current setup.
@@jasonvorhees6592 my girlfriend has the 2070 Super Laptop Version which is just a tiny bit faster than a desktop 2060 and I tested it out at 1440P all settings medium with dlss and it ran okay-ish.
What CPU do you have? She has i7 10700H and 32 GB Ram
Unreal engine making hardware companies rich. Who needs optimization when you can brute force your way to a subjectively stable playable experience 🤪!
The "auto" settings seem to be very conservative when it comes to texture quality. That 3060 12GB had 7GiB of VRAM utilization at medium settings. There was plenty of space left to go higher. Texel performance isn't a limiting factor nowadays either.
This is also a very early area of the game.
It's likely the settings are using some kind of rounded average based on tester's full game playthroughs, then offset likely 1GB down because running out of VRAM is such a showstopper when it comes to performance.
Hi Dan, watching your video. I have been playing since launch day. Personally I was more impressed with the performance than I thought I would be.
The detail on display in this game is another level on good hardware.
I use a 4080/11th-gen Intel, fast DDR4 and a fast SSD, and I get extremely solid performance. I keep DLSS on DLAA or DLSS Quality... native 4K. Only the alternative world causes some dips...
Generally it's super smooth. I cap the game at 90 with Vsync on.
I put in the code to Allow frame Gen again...but I don't use it anyway.
At 4K that 7800x3d is no real boost. I found that out when I had one. It's only good at 1080p over other CPUs.
Keep up the good work. The game is generally top tier...such enjoyment to play (so far).
Resolution scaling makes sense as the solution for 4K with high framerates. It also might make sense for pushing a mid-range GPU to 1440p. But resolution scaling shouldn't really be a thing at 1080p.
This game seems to think that xx60 = 1080p, xx70/80 = 1440p, and xx90 = 4K, with all the cards using upscaling. Personally I would pair the same GPUs with those resolutions but without the need for upscaling. Upscaling should just let you jump one resolution category or move into triple-digit FPS, not be required to hit the baseline.
Will the Unreal Engine ever run smooth?
nope. trash engine
Only Unreal Engine 3, and Mortal Kombat to some extent (on 4 or 5).
32GB of RAM and a fast NVMe helps.
UE4 still has the stutter problem. I don't know why they are wasting money on making demos for marketing; use that to fix their damn engine...
@@niks660097 Cuz the only things Unreal Engine is good at are trailers, demos and the cinematics department. They try to sneak it into gaming, but if you look closely it's not optimized for gaming at all, and all those "cool new features" are cinematics-focused.
Daniel, wait, I saw those dodges on Pieta, you're really good??
If you read the FSR description, it's FSR 3 and has Frame Gen included, but it's not working so well.
Probably it's still being worked on.
wait for real?
"Is it really that bad"?
Short answer: Yes
Long answer: Yes
I've said it from the very start, 2 years ago when I saw UE5: UE5 will become the cancer of PC gaming if they can't fix these stutter issues. Not to mention UE5 in general is just WAY too heavy; it's the only engine that just crawls to a halt on the Steam Deck too. UE5 has some awesome features, but current hardware and consoles aren't ready for it at high resolutions, PLUS it has awful stuttering issues. I really fear for CD Projekt Red's next games, which have switched to UE5 instead of their incredible REDengine. So sad to see so many devs on UE5.
Many move to UE5 because Epic allows them to integrate some of their engine features into UE. Game scale is getting bigger; it has become harder for a game studio to maintain and update its own engine and at the same time try to create good games. The Witcher 2 and 3 were well received from the get-go, but many consider that CDPR screwed up big time with CP2077, be it the game itself or the engine as well. The only issue is that there is no real alternative to compete head to head with UE as a triple-A game engine now. Unity in general caters more towards indies, and the recent fiasco is also quite a problem. CryEngine is another good triple-A engine, but the Cevat brothers messed up the company years ago; it seems they'd rather focus on creating their own game now than develop the engine to compete with the likes of UE.
Their next title probably won't see the light for 4-5 years. By then there will be a new gen of consoles and graphics cards will be 2 gens ahead. Epic will have 4+ years to further optimise the engine as well. It won't be a problem by the time it launches. Expecting a Steam Deck to run the latest cutting-edge graphics engine well is asking a bit much.
What was so incredible about REDengine? Cyberpunk launched in a broken state and required at least 2 years of post-launch support to get working properly. Maintaining and upgrading REDengine was a major factor in why Cyberpunk's development was so difficult.
@@mojojojo6292 Epic had almost a decade to fix UE4's stutter issues. Now UE5 still has those issues.
@@arenzricodexd4409 It's not even an engine issue. There are UE games that don't have shader compilation stutter, but again, it requires devs to actually acknowledge it and work around it.
Hmm, WRT min spec, did the game default to 1080 with 50% scale, AKA 540? The screenshot shown listed 720 as min spec rez.
-Hey Mike how's the framerate on a 4090 & 7800X3D?
-Just about 45-60 frames Joe
-Ok Mike let's ship the game
-But Joe we didn't even optimize it
-Shush Mike, it's 2023, it's gonna be ok.
45 - 60... with maxed out graphic settings and 4K Native resolution.
... yeah that still sounds pretty bad come to think of it considering the hardware. :/
It is loading stutter. At 16:32 when you go up the stairs, the skybox changes, and when it does (loading a new area) the stutter occurs.
I love this game. It's like Soul Reaver meets souls games. I would like more AoE abilities though, since this game throws so many enemies at you at a time.
Longswords are so satisfying. Cleaving through hordes. It’s such an awesome game that’s getting what I see as unjustified hate
Ain't that the fucking truth
I did not notice which game version patch you are testing. That might be useful to know.
Maybe a little off topic, but I personally feel that in UE5 games (like Remnant 2) the overall low/medium graphics settings don't look that horrible compared with games on other engines. So maybe instead of using FSR/DLSS to lower the resolution further, especially when you are playing at native 1080p already, you should just lower the graphics settings. And you do get a significant performance boost from turning high down to medium, unlike a lot of other games where you gain just a marginal performance boost for a much bigger graphics trade-off.
"overall low/medium graphic settings doesn't look that horrible compare with other game engines' game" indeed but because of that it also has barelly any peformance gain as you lower the settings
Yes. People need to understand that the playing field has changed. Medium is the PS5/Series X setting; this is no longer 2012, where consoles were trash and medium settings were disgusting.
Medium is really good; medium is a PS5, meaning medium is the RX 6700 PC equivalent.
PC gamers are a bit delusional thinking their GTX 1660s and GTX 1060s are still top dogs.
@@riczz4641 I mean, people's delusions are valid; just look at all those new games: they are all demanding yet barely look better. Why? Because of the new engine. If so, why use it when the old one could achieve the same visuals and be less demanding? Consoles being more powerful means they need to optimise the game less, which further means even worse PC ports.
@@riczz4641 How are PC gamers delusional? Just because not everyone has money doesn't mean a game shouldn't be able to scale well.
@@cxngo8124 Bro, c'mon. The 1060 is a GPU from 2016... you are literally the equivalent of a guy with a PS4 complaining that PS5 games don't run well... Don't give me the money BS. What can I tell you; this is a luxury. If you can afford $60 games then you can afford a GPU.
I'm from a third-world country where the minimum wage per month is $350; if I can save for a last-gen card then you can too.
I assume that the number on the right in the 'RAM' stats is the amount of RAM used only by the game?
How can I turn it on? I see only the total system RAM used...
Playing on a system with a 7800X3D and 7900 XTX; I had everything set to Ultra at 4K with FSR Balanced. For the system it is, I expected more performance than the around 70fps (at the lowest, in heavy areas) I was getting. I noticed AMD haven't got their driver up for the game yet because of the Anti-Lag stuff with CS2, so I'm hoping that brings some more performance. I did notice FSR makes some of the lighting look really sparkly, to the point where it's really distracting, so I turned it down to 1440p with no FSR; I'm getting a better frame rate and the game looks much better. It definitely needs some work.
Bro it's ultra. Those settings are pretty much future proofing in games. I don't understand why people think they should be playing ultra settings at 100 fps
Ultra includes RT, the fps killer; change it to medium and your fps will be better haha.
@@tbunreall I understand that; I also don't expect over 100fps at ultra settings. I just thought I'd get more out of the upscaling, but it looks really bad anyway, so now I'm playing at 1440p without it. I get that the engine is for future games, but isn't it realistic to think they should be aiming Ultra settings at current high-end cards? Save those extra Lumen pixels for the sequel.
@@i11usiveman97 Well I think 60+fps on ultra is perfectly reasonable.
@@tbunreall Sub-60fps: you can straight up see a 4090/7800X3D struggle to maintain 60fps, and at 4K that's what most will target unless they have higher-refresh-rate displays. If you're struggling to keep 60fps afloat on the BEST HARDWARE currently available, I want to know what they even tested this game on to OK this.
Also, I'm sick of people arguing the whole "ultra is future-proofing" angle. It isn't, end of. RDR2 did exactly this, and how many returned to that game when the 3000 series dropped? Yeah, exactly. Many play games in the moment, not "when they have hardware capable of playing them". I wouldn't even consider going back to LotF when the 5000 series drops, because by then more demanding titles will be out that will keep people interested and away from older titles riddled with performance problems.
It's a cop-out calling it "future proofing". The entire point of Ultra, if you haven't realised, was so people could leverage FSR 3 and DLSS 3 frame-gen tech. FSR 3 isn't in yet (but is confirmed) and DLSS 3 got removed. That's why Ultra exists: so anyone can run Ultra using frame gen to give them "smoother" fps at higher fidelity. Absolutely nothing to do with "future proofing". It's not a live-service game where they will maintain support for years to come; it's a done deal. Give it 12 months and they will have moved on in terms of bug fixing/support.
This game is a lot of fun especially if you are playing with friends and like Soulslike titles. Reduce global illumination to medium and maybe shadows if you’re having performance problems. I’m rotating 5 friends and we’re all having no issues just having a good time.
UE5's core feature set scales on a per-pixel basis. We've already observed this with Remnant 2 and Immortals of Aveum.
As always I'm more concerned about the uneven frametimes and poor CPU performance of UE5. This honestly has me worried a lot about upcoming releases.
Yeah the multi core/ threaded performance is crap, you are lucky if it uses 4 cores.
All they care about is single core performance that goes vroom vroom rather than spreading the workload out amongst all the cores instead of relying on speed
My son is playing this on a GTX 1070 and Ryzen 1600 (original) and hasn't crashed once so far and is enjoying it. He's playing on default settings and has 4-5 hours in now. From what I can tell it looks like its running great. Didn't turn on telemetry as it appears fluid enough visually.
The VRAM consumption is very good. I'm playing on my 3080 10G at 4K, and it is nice to see that UE5 doesn't use a huge amount of VRAM.
Does it run well? I have the same card and I'm interested in buying this
@@pomps8085 On high settings + DLSS Balanced it runs well, though for a stable 60 fps all of the time you have to go DLSS Performance. Also, it crashes sometimes, but that is the game itself.
@@Fantomas24ARM Thanks. Is that at 2K resolution? I might wait until Xmas for a few more patches.
@@pomps8085 This is 4K. I would recommend you wait a couple of months indeed; those crashes are pretty frequent.
@@pomps8085 It runs well for me at 1440p quality with my 3080.
I set all settings to high. I also use a 60fps cap, without it I found I got stutters so having a little lower but locked frame rate is much better than a higher but unstable one
Great video in terms of analysing the performance, but you kinda completely avoided mentioning what visual impact these settings actually have. I was watching this in 1440p60 and it's obviously very hard to judge anything from YT, but I didn't really notice the game looking any better at 4K Ultra vs whatever resolution on low/medium. I especially didn't see anything with the ray-traced settings compared to non-ray-traced.
Not gonna lie, I'm very underwhelmed with UE5 thus far. Visually it just doesn't look better than some of the big triple-A games like RDR2, Horizon Forbidden West, Cyberpunk, etc. Idk, maybe CD Projekt Red will prove me wrong with their next game that's supposed to use UE5.
Did you buy any of the unreal 5 games so you can see first hand? I have not bought any unreal 5 games yet but from what I have been hearing from reviewers and seeing on youtube is that there is an improvement. I will wait until 6 months or more after the release to buy a few unreal 5 games that have come out so far.
I haven't had any issues. I'm running a 7900 XT and a Ryzen 9 3700X. I have most everything at its highest, other than a few things that I don't really care about. I can't remember offhand all my specific settings, but if you've got a similar setup and you want my settings to see if they work for any of y'all, I can post 'em when I get home. I'm loving this game, and I'd hate to be held back by crashes or bad lag.
One thing I've been thinking is: what is the point of games having this level of graphical fidelity and features if, for most people, they're gonna look like a pixelated mess due to devs relying on upscaling? I'd rather have a game that looks 'dated' but runs well and can be maxed out.
1080p at 50% resolution scale is not 1080p cut in half (540p); it's actually 1280 x 720 (720p).
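For reference, a quick calculation of what "50% resolution scale" can mean at 1080p under the two conventions games actually use; which one this particular game uses is not something the comment settles:

// 1920x1080 under two interpretations of a 50% scale slider.
#include <cmath>
#include <cstdio>

int main() {
    const double w = 1920.0, h = 1080.0, s = 0.5;
    // Per-axis (what UE's screen percentage means): each dimension halved.
    std::printf("per-axis: %.0f x %.0f\n", w * s, h * s);                       // 960 x 540
    // Per-area: half the pixel count, each axis scaled by sqrt(0.5).
    std::printf("per-area: %.0f x %.0f\n", w * std::sqrt(s), h * std::sqrt(s)); // ~1358 x 764
    return 0;
}

Note that neither convention lands exactly on 1280 x 720; 720p is roughly 44% of 1080p's pixel count.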
2023: The year of unoptimized games.
Game runs perfectly UNLESS you enable G-Sync; then there are frame drops every time you kill a mob, sprint, or dodge. I was able to consistently recreate this. I'm aware UE5 breaks if you apply any OC to the GPU (it crashed like usual when I tested), but at stock boost there was no crashing.
Are you sure that your GPU OC was fully stable in the first place?
@@Psychx_ Yes. On any UE5 game I’ve played I can’t apply any OC at all or it crashes. Not even +15. Normally I do +90.
Would've loved to see more radeon gpus tested, hoping for a video on that soon. Keep it up man, great work!
The bigger disappointment for me is that UE5 still has the same stutter issues. The loading stutter can be tolerable depending on level design and how frequent it occurs, but is still very disappointing. The cutscene stutter for me is always a huge red flag. It's a scripted sequence. There has to be a way to avoid stutter and pop-in, right? Really does not bode well for future UE5 games, and supports my overall pessimism about the quality of software and hardware declining. Seems financialization is ruining everything by creating the wrong incentives and terrible workplace environments.
At Ultra settings it is actually using ray-traced global illumination.
Considering this, it doesn't run too badly 😀
High settings give a big performance boost
Ultra settings are almost always for multiple generations down the line.
People are becoming confused because they can play 3 to 6 year old games on Ultra.
"So why can't my current game run 4K Ultra?" 😂
It's the engine. Ask yourself why games with custom engines running DX12 have no issues, but every UE5 release lately has issues. Oh, and some of their games stutter so badly it can make you physically sick.
When UE 5.0 advertised that its real-time lighting and reflections could make GTX cards useful again, it was complete BS.
I really forgot about this...
Maybe the GTX 1080 Ti feels alive, kind of.
It IS very bad. For the fidelity it offers, the performance hit is so bad despite the best efforts of developers. LotF looks good, but it doesn't merit running at only about 70fps average on a 7800xt.
Lord of Fallen Frames
Unreal Engine 5 seems like a disaster
Hello, you know, I have had an RTX 3060 since 2021 or so and I have noticed that the temperature rises a lot when playing at 2K at 60 fps...
it reaches almost 80C, and the fan noise is very annoying...
what I do is lock it at 45 fps and keep the temperature under 75... but I would still like to play at 60 fps.
I will have to do a complete repaste, or I could just clean the fans...
since I don't dare to open it completely, with the risk of being left with nothing. Thanks, Daniel
50% render resolution is pretty terrible, but I guess for a 1060 that may be the normal way to run this game. Some people may flip that to 100%.
Video recording on the SAME machine can cause crashing and performance issues, even when using hardware encoding.
The only way around this is to get a very high quality capture card, or perhaps output the recording data to a networked machine for processing; that may be more stable.
50% scaling *is really, really terrible*. You literally played at 540p. It's lower than PS2-generation resolution. No excuse can save this.
I'm recording on a separate PC with a capture card.
For a game that doesn't look that great, the performance leaves something to be desired, and it's clear that the devs are not capable of understanding which parts of a game consume the most resources. Also, what I find weird is the frametime going crazy, from as low as 12ms to 25ms on the lowest capable hardware, while only a few things are happening.
The shadows don't look the best tbh.
I'm afraid HL2 had better shadows🤣
Since the UE source code is available on GitHub, have any dev studios figured out how to solve the stutter? I vaguely remember for UE4 that even Gears 5 had stutter, which the Coalition didn't eliminate (maybe they eventually fixed some of it post launch?).
Everyone can't be the Coalition; they always have to optimize Epic's mess. Also, it's not an open-source license; they have their own custom license, where pushing code is not allowed...
Days gone??? Maybe 🤔
I absolutely love this game so far (got almost 3 beacons out of 5). I do play on a 4090, but I haven't noticed any stutter. The game is great for anyone who misses the Dark Souls 1 feel that has been missing from later FromSoft games. The map is huge, the exploration is top notch, and there are so many branching paths and things to find.
Also, they used to have frame gen (deactivated because of issues, I guess), but with frame gen on at 4K DLSS Quality I almost always maxed out my 144Hz 4K monitor.
Me too. I downloaded a performance fix mod and the game hasn't stuttered or crashed since, and I'm really enjoying it. It needs optimisation for sure though; on my specs I should be getting way more fps.
But it's not their game to iterate on. It's someone else's game. I can't get my head around that, you can't just clone something and change a few things here and there can you??
@@DuckAlertBeats What? FromSoftware doesn't own soulslike games.
To think, we went from 1080p to 540p... Seems to be kinda backwards...
UE5 is a very young engine - I think it makes sense that we'll see growing pains for this first generation of UE5 games. I'm sure it'll improve as devs get more time with the new tools. I don't see why people are so against using upscalers, though.
I can explain that :)
First: I have a 1080p Monitor and I very much notice every pixel less calculated and upscaled because even at FSR Q (720p internal resolution) I immediately lose a lot of fine details (hair/fur for example).
Second: I will not accept that upscaling to 1080p is necessary because the devs are not given enough time to optimize the game thoroughly by the publishers.
These are my reasons why I refuse to use upscaling :)
@@cpt.mccartman that’s fair - I was only really thinking of it from a 4K perspective, but I agree that upscaling looks terrible at lower resolutions.
Agreed that UE5 is a young engine, and I think many people miss that it's the first engine (big engine at least) to include ray tracing via Lumen as a core part of the design. It's really an engine for the next 10+ years of hardware, not the last 10 or so.
As for the aversion to upscaling: personally it reminds me of consoles lying to consumers about what resolution they were, and the period when Xbox (360 era) owners kept trying to tell me how their 540/720p console had better graphics than my natively-rendering-at-1080p PC. The term is just ingrained in me as lesser or fake, a trick to scam the uninformed, and honestly that is still true, at least in some cases. We still get people saying they are 'rendering' at 4K while they have DLSS Performance on, and well, they certainly are not 'rendering' at that resolution, so they are misled.
Also, having used a bunch of DLSS now that it's an option, I can confidently say the evidence of my own eyes is that it's a visual downgrade overall, even if in some areas it can work really, really well. My personal best example is Darktide, which I can play native at 1440p at a stable-ish 60fps, or enable DLSS (or FSR) to push higher — and it is a fast-paced game, so that's useful — but it comes at the cost of some visual errors, especially on fast-moving objects, and that's nearly all objects because of my play style. In Cyberpunk I can play without RT natively, or with basic RT plus DLSS, and even though the lighting is better with RT+DLSS, the game overall looks and feels worse. I think too many people do the zoomed-in-pixels compressed-YouTube-video thing, and not enough actually play a game with the setting on; both sides of the argument might end up somewhere more reasoned (in the middle) after doing so.
The other issue is that people spend way more on GPUs now and don't feel the value of them, as they are used to playing games at native settings for much less money, and if you spent more on this GPU than the last one but feel you are playing on lesser settings, it creates a bad feeling. Thanks to GPU prices, I think that's every single one of us now. Ultimately it's all Nvidia's fault: they pushed up prices and said you now render at lower resolutions than before, suck it chumps; it's just taking the market a little while to get used to being bent over so far. Give it enough time and everyone will be rendering at 720p on 5/6K monitors and be happy with that because it's 'better than native', just like how console owners used to brag about the TV saying 1080p with their 540p consoles. It just (about) works.
Upscaling on old hardware is amazing, and AMD supporting old cards is such a win for gamers, but it really has no place at the high end unless you're doing something extreme like trying to play in 8K.
@@notwhatitwasbefore I personally found that path-tracing in Cyberpunk increased the visuals significantly whereas upscaling (at 4K at least) had very little effect on visuals. Ray reconstruction made reflections look better for the most part - I always found reflections in the rain to be frustrating in that game because they were so noisy.
@@sapedibeppo Yeah, path tracing looks nice, but as my GPU delivers a staggering frame rate of around 240 FPM (frames per minute) with it, I was specifically referring to the differences between no RT at all and just turning the setting on at its lowest level; not path tracing, just ray tracing. Also, I'm at 1440p, and upscaling to 1440p does have a visual impact.
It's one of those things: if you're upscaling from a good starting point it's fine, but as you drop down the resolutions it gets less and less data to upscale from, and so isn't as good. I personally don't see a huge difference between 1440p and 4K when actually playing a game and not staring at zoomed-in screenshots, so yeah, I agree that at 4K the difference is minimal if your render resolution is already 1440p or higher, because 1440p looks good/great.
On a 3070 with a slight tweak from ultra I get a stable 60fps with no DLSS and no RT, but just turning on RT and leaving all the other RT settings off (reflections, sun shadows, local shadows and lighting all set to off) means I require DLSS, and if I turn on reflections I can expect dips below 30fps even on Performance DLSS in some places.
As you mention you're at 4K and use path tracing, it's fairly safe to assume your GPU cost more than my whole PC, and yet we are rendering Cyberpunk at the same resolution; you just get nicer lighting, and I mean no offence, but that's hilarious, as you essentially paid like a grand more for nicer-looking puddles.
When I played Cyberpunk for the first time, using FSR to hit a playable frame rate at 1080p on an RX 480 (4GB!!), I was astounded at what was possible on my old, cheap (for my normal spend) GPU. In situations like that I am a huge proponent of upscaling, even if the visual quality is a bit iffy, but not once has that satisfaction been repeated using a better upscaler on a card that cost twice the money. And if I had spent twice as much again (or more, tbh) on a GPU, I would be so mad if I ever felt the need to turn on an upscaler while the card was current; of course, in like 5 years, trying to stretch the life of a GPU to one more AAA game, yeah, great tech, glad it exists.
Hey! How come you don’t like dlss at 1080p?
The fact that a 4090 is not able to max this game out is CRAZY. Like, you would expect a card like that to be able to max out games for at least 3-4 years.
That has never been a thing in the PC market.
Then why even pay over $1600 for it? smh
@@ZackSNetwork
He was testing at 4K, and the DLSS seems to work properly compared to other recent releases; all things considered, the devs did a good job.
Well, the 4090 IS able to max it out? Just because you're not over 60 fps doesn't mean it can't run the game on max. For me it would be fine, because I can play at 30 fps; once you've played for like 10 minutes you don't even notice.
Why do people think the 4090 is so much different from any other GPU? In 3-4 years the 4090 will need DLSS just to play at 1440p.
Bashing "unoptimised" games seems to be a trend these days, but I think we have collective amnesia: back in the day (the 2000s and 2010s), if you took a six-year-old GPU, nothing AAA in 3D that came out would run on it, like, at all.
What we need to realise is:
Yes, third-party engines have a performance overhead, and the more complex/graphically intensive your game, the more "unoptimised" it seems. It is what it is; it's not the devs' fault, it's the game companies' fault. UE games have always been a disaster in some areas, ever since the inception of UE as a third-party engine.
For a good couple of years PC hardware became stupidly cheap, and now prices are skyrocketing back to pre-2000-era levels. All mid-tier and low-end hardware is just crap, there to keep market segmentation and justify the actual product at an inflated price. The tech market is now saturated; the honeymoon is over, and it's time for profit.
I’m already tired of devs making their games with dlss required. A game should not come out where the absolute max hardware can’t handle it at a rock solid 60 maxed out.
The funny thing is, I recently checked how my PC hardware ranks on the Timespy distribution, and my score was better than 67 percent of other people's. I have a mid-range PC and I'm starting to struggle in the newest games, but I also found it wild that PCs more than twice as powerful as mine are still having issues. It makes PC gaming so cringe.
JUST STOP BUYING NEW GAMES. For companies to change they need to feel it in their pockets.
Could you do a 4090 vs 7900XTX comparison when it comes to RT in this game? I've noticed that many UE5 games seem to do RT perfectly fine on AMD hardware, and you're probably the only one who's going to bother talking about this game.
Also: The youtube compression algorithm really makes quality comparisons impossible. On my 768p laptop I could only barely tell a difference between 1080p Low with 50% render scale and 4k Ultra except for the better lighting. I'll have to watch it again on my desktop computer later... but I'm on the couch :P
Fwiw, I have a 7900xtx/7800x3d and I'm in the 50s with everything on Ultra at native 4k. FSR Quality at those settings locks pretty well to 60, at least fairly early in the game where I'm at.
@@millerc82 Thanks. That means 60+ FPS at 1440p native, which is the answer applicable to my situation. It also means the XTX scales against the 4090 with RT on the same way it does in pure raster... so another instance where "the Nvidia RT advantage" doesn't exist.
@@andersjjensen Yeah, 1440p was nice and smooth at 60 with max settings; I haven't unticked vsync yet to see what it's actually hitting.
@@andersjjensen Lumen is software-based ray tracing by default. However, it can be hardware-accelerated in some games. This game probably doesn't utilize hardware acceleration for Lumen.
@@TonyHerson People say HW acceleration doesn't make it run faster, but gives better (more detailed) visual quality.
Take Fortnite: there's only around a 30% fps difference between the XTX and the 4090 with HW acceleration on.
Real path tracing opens a gap of hundreds of percent, because Lumen is far from that level of RT workload.
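For anyone who wants to poke at the software-vs-hardware Lumen question themselves: UE5 games read console-variable overrides from the user's Engine.ini, and Lumen's hardware path sits behind the real cvar r.Lumen.HardwareRayTracing. Below is a minimal sketch of that edit; "GameName" is a placeholder for the game's actual config folder, and whether this particular game ships hardware-Lumen support at all is an assumption (many titles don't).

```python
# Hedged sketch, not an official workaround: append a Lumen HWRT override to
# the game's user Engine.ini. r.Lumen.HardwareRayTracing is a real UE5 cvar,
# but whether this specific game honors it is an assumption.
from pathlib import Path

# Assumption: standard UE5 user-config location; "GameName" is a placeholder.
ini = (Path.home() / "AppData" / "Local" / "GameName"
       / "Saved" / "Config" / "Windows" / "Engine.ini")

ini.parent.mkdir(parents=True, exist_ok=True)
with ini.open("a", encoding="utf-8") as f:  # append so existing settings survive
    f.write("\n[SystemSettings]\nr.Lumen.HardwareRayTracing=1\n")
print(f"Appended Lumen HWRT override to {ini}")
```

If the toggle changes nothing, the game most likely shipped without the hardware path enabled, which would line up with the XTX holding its own here.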
Hi Daniel, I have a video request: Can you check out the latest AMD preview driver from 3 days ago? AFMF "should" be improved. Thanks for your videos!
UE5 is embarrassing. But what do you expect from the developer of Fortnite?
You're a hater?
What😂 Fortnite runs well
The problem is not the engine, but the game developers. They either don't have enough experience or are just too lazy to make a decently running game.
Coalition and Bend made top-class games (in terms of performance, optimization and smoothness) using UE4, while others just don't care.
If a game shipped with no stutters at all, I guess people wouldn't mind lowering some settings on their hardware to play at an acceptable fps.
Basically, if it's not made by Nixxes, it's 90% likely to be trash (and even Nixxes isn't 100% flawless). Also kinda funny/ironic how Sony usually has the good ports...
Maybe Alan Wake 2 but it doesn't even come out on Steam
Alan Wake 2 will be another juggernaut; it's Remedy's Northlight Engine, plus it's exclusive to the new-gen consoles.
Could just be Unreal Engine 5. It has legs to run, but it's walking because its features aren't being used; technology can't keep up with the engine. Nixxes isn't the be-all and end-all.
Ratchet & Clank: Rift Apart is so optimized it's crazy; I'm getting 130 fps on max settings on an RX 6800 XT.
@@marcelosoares7148 They also already said they are targeting 30fps on consoles, so it's definitely going to be heavy.
@@gavinderulo12 The requirements just dropped and they're worse than I expected: you need a 6700 XT/3070 for Medium, upscaling from 540p to 1080p, if you want to hit 60fps. It might be the heaviest game so far, considering that's with no ray tracing.
Running super smooth at 3440x1440, all maxed, DLSS Quality, on a 4070 Ti. The game looks exquisite. The dual worlds are so cool.
It doesn't seem so badly optimized. I mean, it's already next-gen, guys. It's a tough pill to swallow, but we can't expect a GTX 1060 to get 60 fps, not even on low. In fact, this one gives you exactly that with upscaling, which is better than nothing... I think that's pretty generous.
That absolutely sounds like coping.
@riczz4641 An RTX 4090 dropping to the mid-40s fps at 4K max without upscaling and you call it optimized? Lmao, what are you smoking?
@@Scott99259 Go play Cyberpunk at 4K native, maxed, with RT... you won't get drops to 40fps, you'll get 40 average. Go try Remnant 2: same. The Witcher 3 next gen? Maybe 30 fps outside cities.
I don't know what bubble you live in, but NOT EVEN a 4090 is ready for 4K native ray tracing. You're delusional if you think that's true... And if so, then name a game that runs natively at 4K with RT at this level of visuals. There are none...
@@Scott99259 You don't have to play at max settings. Other games simply lock these settings out so people like you won't complain.
And where are the next-gen graphics?
It's quite sad to see that DLSS, once announced as an optional extra for 4K high-FPS gaming, is nowadays pretty much required at 1080p if we want a decent FPS.
So instead of bringing high-framerate gaming to consoles, they bring 30/60 FPS gaming to PC.
The game runs incredibly poorly on Series X as well. They released a second patch today that has definitely helped, but it still looks like a 360 game that you have to pay $70 for.
Supposedly UE 5.2 has fixes for these shader-compilation stutters, but developers have to implement them.
There is no shader-compilation stutter in this game in my experience, just normal traversal stutter.
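To illustrate the distinction: shader-compilation stutter is a one-off hitch the first time a new shader/material is drawn, and the UE 5.2-era fix (PSO precaching, which developers have to opt into) just moves that cost to the loading screen. Traversal stutter comes from streaming world chunks as you move, so precaching alone doesn't touch it. A toy sketch with made-up frame costs, not engine code:

```python
# Toy model of first-use shader compilation. The numbers are invented purely
# to show the shape of the problem; this is not how UE implements it.
compiled = set()

def draw(material: str) -> float:
    """Return a frame time in ms; first use of a material pays a compile cost."""
    cost = 8.0                    # normal frame: ~8 ms (125 fps)
    if material not in compiled:
        cost += 120.0             # one-off driver compile: the visible hitch
        compiled.add(material)
    return cost

# Without precaching: the first frame that shows a new effect hitches.
print([draw(m) for m in ["rock", "rock", "sword_vfx", "rock"]])
# -> [128.0, 8.0, 128.0, 8.0]

# With precaching: pay the compiles during loading, off the critical path.
compiled.clear()
for m in ["rock", "sword_vfx"]:
    draw(m)
print([draw(m) for m in ["rock", "rock", "sword_vfx", "rock"]])
# -> [8.0, 8.0, 8.0, 8.0]
```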
Omg, I'm so fed up. Every game coming out is shit and has optimisation problems.
Isn't it crazy that the most demanding game, CP 77, with all settings maxed doesn't stutter, while some UE 4/5 game on medium settings stutters at every corner?
The game plays beautifully on console after the performance patch. Graphically amazing as well.
I speak on behalf of all PC gamers here: thanks for letting us know. Very useful information.
@@jameswayton2340 Yeah, well, PC gamers are some of the biggest douchebags on the market, so forgive me if I couldn't care less about your game only running at 80fps 😭😭
No, it still has a lot of work ahead of it before it feels like a polished release. Given the frequency of the patches, I can see they're trying, and I hope they get there sooner rather than later. FPS drops all over the place and pixelated fire are not the final state of the game, I would hope, and that's not even getting into the gameplay balance issues they need to address as well.
The game runs fine on my 4070 at ultra settings. The issues are probably because people are running old GPUs; UE5 is demanding, so that's to be expected.
mine is fine.
I've got an RTX 4070 running everything on ultra and getting 80fps on average. Specs:
Ryzen Threadripper 2950X (16 cores / 32 threads)
ASUS ROG Zenith Extreme motherboard
64GB Corsair RGB 3200MHz
2TB Western Digital Black M.2 SSD
Alienware 38" ultrawide, 3840x1600
@@malcolmswift1889 Everyone knows 4070 is for 1080p and 1440p gaming, don't play in resolution higher than that.
@@DhruvRed Quick link to view it in ultrawide:
ua-cam.com/video/gb0FeeEbvJQ/v-deo.html
I've had a few conversations about UE5 performance around the Talos Principle 2 demo, and a lot of people seem to think that whatever system they have should run the game on high or ultra at 60, and if it doesn't, it's unoptimized. There's this one guy whose auto-detect puts him at low everything, and he's complaining he can't get a solid 60 fps on high. Stutters are an issue, but a lot of this optimization talk comes from people who think their 3/5/7-year-old hardware should run the game at far higher settings than it can actually support years after it came out. On the one hand you get the guy with a 1080 Ti who's happy he can still play new games in 2023 and doesn't need to upgrade; on the other you get the guy with the 3060 complaining his three-year-old mid-range card can't run the game on high.
Nobody expects that. We do expect competent games without massive traversal/shader-compilation stutter, and we do expect games to be reasonably optimized. When you look at games like Cyberpunk, Metro Exodus and Dying Light 2 offering full ray-tracing suites and top-tier visuals in massive open worlds, and then look at the state of Unreal Engine games over the last three years, there's clearly an issue.
@@landfillao4193 Cyberpunk? The game that's used as the default Nvidia benchmark because at max settings even a 4090 can't get 4K 60fps? That Cyberpunk? The one that was so buggy on release it knocked CD Projekt's stock down by over 75%?
Who the fuck do you think you're lying to, mate? Because it sure as hell isn't me. Might be the guy you see in the mirror every morning, though.
CD Projekt almost went under because of Cyberpunk, and it took them three years to fix most of the bugs. And you know what else they did? They updated the minimum requirements for the game while adding a new high end that shames even the highest of high-end cards. Three years of constant post-release development to get the game where it should have been at launch, new minimum specs that actually reflect the performance of the delivered product, and settings that go so high no card released to date can run them, and that's the first game your mind goes to as the opposite of Unreal's performance issues? That game suffers from everything people accuse UE5 games of without being on Unreal Engine, yet that's the game you use to showcase the problems with Unreal? The game that shows this isn't an Unreal issue but a wider issue systemic to the industry is what you're using as a critique of UE5's performance? Do you like Cyberpunk so much that you've been literally blind to its issues since it launched?
It seems the early UE5 games coming out are just badly optimised; all the ones I've seen have limited vis-blocking, choosing vast open spaces instead, which results in additional performance hits. I'm talking from game-dev experience way back in 2004 or so with the Quake 4 engine, but I'm sure the same dynamics apply. The early games seem to focus on visuals at the expense of smart level design that minimises tri count.
4k ultra is crazy demanding
I'd say it's definitely playable on older hardware.
Assuming you're comfortable with less than 100hz
Not on an i5 6600K and 8GB of RAM, it isn't.
@@formarkv isn't that even _older than_ the stated minimum specs?