Your PC isn't ready for this Unreal Engine 5 game! Remnant 2 PC Performance

  • Published 14 Oct 2024

COMMENTS • 1.9K

  • @A.Froster 1 year ago +1992

    No game should require upscaling to be playable on modern hardware. A 4090 struggling at 1440p without upscaling is a joke and a sign of incompetence.

    • @ManuSaraswat 1 year ago +193

      I was really excited to try it, as the first game in the franchise was kind of a hidden gem, but after trying this on my 6700 XT @ 1080p and not being able to hit even 60, I promptly returned the game. Do not support such bullshit levels of incompetent optimization.

    • @julianmalarz5227 1 year ago +18

      @@ManuSaraswat No kidding? I have a 6700 XT and it smashes the first game. Have yet to grab this, but it seems like I'll be waiting while they just let the public bug-test it.

    • @KeepAnOpenMind 1 year ago +23

      You are just not educated enough to know that not everything is possible without upscaling these days.

    • @moichamoy2274 1 year ago +160

      @@KeepAnOpenMind You shouldn't need upscaling with a 3080 at 1080p. That is just bad.

    • @oogwaythesussyturtle 1 year ago +85

      And the fact that the game doesn't even look good is basically salt in the wound. I mean, come on, RDR 2 wipes the floor with this game in visuals.

  • @Zecuto 1 year ago +46

    Ah, the magical process of seeing my 3080 turn into a 1080p card with each new release.

  • @KurisuKun 1 year ago +880

    That's just outright foul. Upscaling was supposed to help lower-mid tier cards gain playable framerates, not serve as a crutch for poor optimization.

    • @J3f3R20n 1 year ago +117

      It was bound to happen. I knew some devs would start doing this when Control came out.
      The real problem with Remnant 2 is the system requirements claiming you can play with a GTX 1650, with an RTX 2060 as the "recommended" card. This is straight-up BS to lure people into buying the game.

    • @Biaccovich 1 year ago +6

      At least DLSS looks really good in Remnant 2. The game looks crisp even in Performance mode.

    • @west1329 1 year ago +7

      That, my good sir, is why DLSS and FSR were created: for poorly optimized games!

    • @Steelninja77 1 year ago +1

      That's so true.

    • @GewelReal 1 year ago +2

      At least it doesn't stutter as long as you are not using an ancient CPU

  • @plesinsky 1 year ago +56

    I barely hit 75 (my monitor's refresh rate) at 2560x1080 21:9 on a 13600K and RTX 3090, all ultra, no DLSS. I was very surprised by this "performance". The previous game ran flawlessly on my GTX 1060 back then. Today's gaming is disgusting; devs basically say "just buy a 4090 bro, what's the problem?"

    • @GeneralS1mba 1 year ago +9

      Yeah, just buy a 4090 and turn on DLSS Balanced as well

    • @GeneralS1mba 1 year ago +2

      Being locked to 1440p native with this level of hardware is insane; bet there will be optimization guides for good frame rates

    • @lucidbarrier 1 year ago +1

      Yeah, Remnant from the Ashes looked amazing on an i7-4790K and GTX 1070. Don't know why they wanted to use this engine on their small player base

  • @theartofgaming905 1 year ago +632

    An Unreal Engine 5 game that looks like an Unreal Engine 4 game and has the performance of an Unreal Engine 5 game. Amazing!

    • @RichLifeStories 1 year ago +50

      Haha yes!!! I didn't even realize it was an Unreal Engine 5 game until today. Had me scratching my head

    • @AntonioLexanTeh 1 year ago +54

      It actually has the performance of a game made 5 years in the future. Mind-blowing 😂😂😂

    • @RichLifeStories 1 year ago +6

      @@AntonioLexanTeh 😂😂 that is the truth.

    • @teddyholiday8038 1 year ago +4

      Science.

    • @enricod.7198 1 year ago +2

      Even UE5 runs like shit, and it should not run like that.

  • @hYpYz 1 year ago +36

    "designed with upscaling in mind" translates to "did not optimise". It's pretty much that simple and this video proves it.

    • @syncmonism 1 year ago

      It is NOT that simple. It's way harder than you think it is.

    • @hastesoldat 1 year ago +5

      @@syncmonism Then put the resources, money, and time into it.
      It's not normal that such a tiny fraction of the budget goes to optimization when it's one of the most crucial aspects of games.
      You can make the best game in the world, but if it runs like shit, it's not gonna be worth playing. It's a pure waste.

    • @bigturkey1 1 year ago

      What game have you played in the last year that maxed out a 4K 120Hz panel without using DLSS?

    • @Alex.Holland 1 year ago

      I think what they said is true. DLSS works better in this game than in any other game I have seen. It's flawless to the point I can't even tell if it's on or off. I believe it's likely they put a ton of work into getting the game to run this way. The end result is odd, in that it only looks a bit better than the original did and objectively runs much worse.

    • @hYpYz 1 year ago

      @@Alex.Holland If you look at the video, it looks like it's stuttering no matter what. Comparing it to other gameplay videos on this channel, it still looks like it's not running that well. Even if a lot of work went into this, I would bet money it was chosen as the easier, cost- or time-saving option. It's lazy and it shows. If you create a game that runs like shit on top-spec equipment, it's shit no matter how you dress it up in excuses.

  • @andrewwowk9385 1 year ago +530

    The fact that a game which looks as average as this NEEDS upscaling is just ridiculous.

    • @teddyholiday8038 1 year ago +51

      I have a feeling this is only gonna get worse

    • @edzymods 1 year ago +26

      I would say Shadow of War looks better LMAO

    • @L_e_t_h_e_r_e_a_l 1 year ago +4

      @@edzymods I just watched a video of kevduit playing it, and I was like "damn, this game still looks good as hell"

    • @enricod.7198 1 year ago +37

      @@teddyholiday8038 DLSS was only the beta of shitty game optimization; frame generation will be the magnum opus of shitty game optimization. Devs will skip any proper multithreading and optimization and slap DLSS on it. Man, I hate this industry.

    • @BladeCrew 1 year ago +9

      @@edzymods I would say that Torchlight 2 looks better than this.

  • @cyclonous6240 1 year ago +62

    If you see an unoptimized, buggy game, just don't buy it and wait for it to get fixed. People who buy broken games are the main reason devs can comfortably launch broken, buggy games without any consequences.

  • @aftereight9143 1 year ago +594

    A new game coming out barely looking better than some decade-old games while running like absolute garbage?
    Now that's a certified modern dev moment

    • @japanesesamurai4945 1 year ago +27

      @@Noob._gamer On PS5 the game runs at 720p 60 fps

    • @playcloudpluspc 1 year ago +18

      @@japanesesamurai4945 It upscales to 1440p though, and at least it's much cheaper. As a PC gamer I don't like being shortchanged.

    • @stangamer1151 1 year ago +25

      If this game used Lumen in conjunction with hardware RT, this kind of GPU and CPU performance could be justified. But, man, in its current state Remnant II is probably the worst game in terms of performance/graphics quality ratio. It surpassed even the horribly optimized TLoUP1 in this regard.

    • @mihai2174 1 year ago +24

      @@stangamer1151 Well, if you want to be amazed, check the Steam ratings... it looks like it is very well received (at the time of writing this)... so I guess people don't care about low performance or using upscalers? It really makes me question existence haha

    • @RickSFfan 1 year ago +8

      I was kind of surprised at that also. Compared to all the photorealistic tech demos and the hype, I fear Unreal 5 isn't actually gonna mean much for a while in terms of better-looking graphics.

  • @nekov4ego 1 year ago +14

    720p gaming is back with a new name! Not only does it not look special at native res, but you can't see any detail with such aggressive upscaling.

  • @SeventhCircle77 1 year ago +143

    Yep, companies are gonna get lazy and lean on upscaling and frame generation vs just making the game properly. The game looks good but not enough to require upscaling

    • @Hombremaniac 1 year ago

      No wonder, when Nvidia is working their butts off to promote DLSS 3 as something everybody must use. Damn greedy bastards.

    • @JABelms 1 year ago +4

      I keep telling noobs who have been partying with DLSS and FSR like it's the greatest thing ever. It's native or go home for me.

  • @Steven-ex3ne 1 year ago +24

    One of your best videos yet, a fantastic real-world look at performance across various options.

  • @ferrino4145 1 year ago +417

    Imagine paying $400-ish for GPU and being able to play new games at 1080p 30fps 💀

    • @FloVelo941 1 year ago +46

      That's exactly what happened to me, unfortunately: bought a Radeon RX 6700 XT 12GB and I'm playing at 30fps, and even with that my PC crashes after about 30 minutes. Fuck Gunfire Games

    • @chadjr2004 1 year ago +1

      @@FloVelo941 playing this game? Or in general?

    • @denks7849 1 year ago +57

      @@chadjr2004 This game. The 6700 XT is a great 1080p card and can run 1440p games quite well, too, when they're not unoptimized garbage like this game.

    • @PeeNutButHer 1 year ago +30

      @@denks7849 Yeah, the 6700 XT is a beast budget card; would recommend it to anyone

    • @parker4447 1 year ago +5

      Well, not really. You can use DLSS and have over 60 fps, and if you buy a 40-series card you can use DLSS 3 frame generation; it works well with only a minor latency hit, and it also negates any CPU bottleneck, so you can get away with a weaker CPU. But of course it sucks. Also, everyone is talking like an 8GB GPU won't work for future UE5 games, and this game runs perfectly on 8GB of VRAM; VRAM is not the issue.

  • @Star_Wars_Galaxy 1 year ago +5

    I really like this style of video, where you alternately step your way up in performance through GPUs and CPUs. It really lets you get a feel for how the game performs on most hardware. Would love to see another CPU option, although I know that's more work than just slotting in another graphics card, or would require another system.

  • @ericjr2601 1 year ago +143

    The funny thing is, this game has no ray tracing. Usually upscaling is kind of required when you kick in a couple of RT solutions. Imagine what it would be like if they had some RT implemented. Crazy.

    • @nossy232323 1 year ago +13

      Start putting money aside for the RTX 5090 :)

    • @MechAdv 1 year ago +8

      UE5 is the reason I skipped this graphics generation. Everything below a 4080 or XTX is gonna be obsolete by next year if you play AAA titles.

    • @nossy232323 1 year ago +4

      @@MechAdv I also expected something like this. But I had to upgrade; my GTX 1080 + 6700K just wasn't enough for me anymore. I upgraded to an RTX 4090 + 7950X3D. And I changed my monitor to that new Asus 240Hz 1440p OLED monitor. That way at least it will stay relevant for some time.

    • @lombredeshakuras1481 1 year ago +1

      Lumen would absolutely destroy performance. It's such a shame devs aren't going for a normal approach to optimization and just slap upscaling on to smooth that out without caring about players.

    • @zaczeen1121 1 year ago

      @@lombredeshakuras1481 85% disagree, I guess lol

  • @Legnalious 1 year ago +5

    Digital Foundry's video about this game on the consoles says that the resolution is around ~1200p upscaled to 1440p for 30 fps, 792p for balanced, and 720p for performance. It's unoptimized, and they're using upscaling as a crutch for playable framerates.

    • @Stardomplay 1 year ago +2

      *sigh* Same story with Forspoken, Jedi Survivor, and now this. I guess we should be getting used to this as the norm for most of us with mid-range PCs, though I do plan on building a 7900 XT build in the future.

    • @arenzricodexd4409 1 year ago +2

      @@Stardomplay Turn the settings to medium and most of the problem will be solved (unless it is a game engine issue, like that shader compilation stutter stuff). The problem is I see people take ultra settings as a measure of whether the game is well optimized or not, when ultra has always been built to punish even the fastest hardware available at the time.

  • @gamerforlife7903 1 year ago +144

    The requirements to play these games are increasing exponentially with very little advancement in actual graphical quality. The PC gaming market is in dire need of another revolution, which I don't think will be happening anytime soon

    • @penumbrum3135 1 year ago +15

      The problem is most improvements this gen are in the background of modern titles. Devs try to use the new hardware to create systems that make their jobs easier at the cost of performance.
      Considering how many companies in many industries try to cut costs now, I'm not surprised devs try to use the most convenient and cost-effective method of manufacturing games.

    • @UniverseGd 1 year ago +4

      Funny that the next revolution is in AI, which ironically goes back to upscaling technologies.

    • @GewelReal 1 year ago +1

      DLSS and framegen are that revolution

    • @megamanx1291 1 year ago +47

      @@GewelReal No it's fucking not

    • @100500daniel 1 year ago +4

      RT, upscalers, and frame gen are those "revolutions" lol.
      Although we BADLY need an RPG like Fallout with smart AI that can actually interact with the player's speech.

  • @astoraan6071 1 year ago +9

    It's embarrassing how little of a graphical upgrade there is with this much of a performance hit

  • @teddyholiday8038 1 year ago +101

    People warned us about DLSS being used as a crutch. Looks like we've reached that point.

    • @kanta32100 1 year ago +3

      Since they invented Java, nobody cares about optimization. They just add more cores.

    • @bigturkey1 1 year ago

      How do you max out your 4K monitor without using DLSS?

    • @bigturkey1 1 year ago

      @@Xcepter I don't know what you are talking about. Why can't a 4090 get 60fps? What is CP?

    • @Robyne_u61 1 year ago

      @@bigturkey1 Cyberpunk

    • @bigturkey1 1 year ago

      @@Xcepter good

  • @VegetarianZombie 1 year ago +9

    Me: I'm going to buy a 4090 so I can play games on ultra at 1440p without ever having to lower settings or use upscaling!
    Remnant 2: I'm going to ruin this man's whole career

  • @imo098765 1 year ago +21

    Great work Daniel, love these types of vids
    Edit: The floating torso pointing to a part of the screen is always great

  • @osohista 1 year ago +1

    I like how he glides around the screen to point at things like Casper the Friendly Ghost.

  • @tomthomas3499 1 year ago +16

    It goes hand in hand then: devs becoming more reliant on upscaling tech, while GPU makers sell said proprietary upscaling tech with little to no effort toward actually increasing performance... hmmm

    • @photonboy999 1 year ago +6

      I'd say that's partly true. For some game devs, having a DYNAMIC render resolution setup is low-hanging fruit. Spend months optimizing different parts of the game to find those areas where you're dropping to 20 FPS or lower at times, or just let the game drop as low as it needs to, even 360p, causing massive blurriness.
      Hopefully people vote with their wallets and make it clear this is unacceptable. I certainly wanted the new Jedi game but didn't buy it.

  • @THU31 1 year ago +8

    A small AA studio was able to include all three upscaling technologies, and frame generation? Where did they find the resources to do that? It must have taken so much time and effort, they probably had to starve themselves. Thank heavens big AAA studios are not going for such sacrifices.

  • @stylie473joker5 1 year ago +49

    The game doesn't even look good enough to justify the requirements and poor optimization; even the scaling between graphical options is bad

    • @DragonOfTheMortalKombat 1 year ago +6

      No game looks good enough to justify poor optimization. Some games do, however, look good enough to justify subpar performance.

    • @Aleph-Noll 1 year ago +4

      Same, I was like, what is it even rendering? Everything looks like ass

    • @zaczeen1121 1 year ago

      @@Aleph-Noll 85% disagree lol, glad you're in the minority

  • @iaindyer1629 1 year ago +9

    Definitely would like to see the same tests on the AMD range of GPUs. Currently have a 6750 XT. Don't care how long the video is, very interesting stuff. Well done sir 👍🏻.

  • @sunlightheaven 1 year ago +27

    I'm convinced that CULLING is not done on certain meshes in certain dungeons/areas, the worst offender being the dungeons in the N'Erud world.

    • @wallacesousuke1433 1 year ago

      What's culling?

    • @sunlightheaven 1 year ago +7

      @@wallacesousuke1433 Objects not in your field of view don't get rendered, which increases performance
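      (As an illustration of that idea, here is a minimal, engine-agnostic sketch of view-frustum culling in C++; every type and name in it is made up for the example, not Unreal's actual API. Each object's bounding sphere is tested against the six planes of the camera's frustum, and anything fully outside any plane is skipped before a draw call is ever issued.)

      #include <array>
      #include <cstddef>
      #include <vector>

      // A plane ax + by + cz + d = 0, normal (a, b, c) pointing into the frustum.
      struct Plane  { float a, b, c, d; };
      struct Sphere { float x, y, z, r; };

      // Signed distance from the sphere's center to the plane.
      static float Dist(const Plane& p, const Sphere& s) {
          return p.a * s.x + p.b * s.y + p.c * s.z + p.d;
      }

      // Keep only the objects whose bounding sphere touches the view frustum;
      // everything else is never submitted for rendering at all.
      std::vector<std::size_t> CullToFrustum(const std::array<Plane, 6>& frustum,
                                             const std::vector<Sphere>& bounds) {
          std::vector<std::size_t> visible;
          for (std::size_t i = 0; i < bounds.size(); ++i) {
              bool inside = true;
              for (const Plane& p : frustum) {
                  if (Dist(p, bounds[i]) < -bounds[i].r) { inside = false; break; }
              }
              if (inside) visible.push_back(i);
          }
          return visible;
      }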

    • @mimimimeow 1 year ago +3

      Nanite automatically culls everything not visually perceptible in real time, so the argument doesn't make sense. That's why LOD isn't relevant in these games; they simply cull CG-quality assets in real time. It will, however, try to load your GPU with geometry as long as there is headroom. So I think what happens here is that Nanite can't gauge how much headroom your GPU has, a common theme in PC space. It seems to work very well on consoles.

    • @captainthunderbolt7541 1 year ago

      I am 100% certain that CULLING *IS* being done. Do you know how I know? I know because the shadows in this game are a screen-space effect, and as soon as the source of a shadow is out of the frame, the shadow itself disappears!!

    • @Franku40keks 1 year ago

      @@mimimimeow Don't forget that consoles also heavily rely on upscaling, and the game has a performance and quality mode there too.

  • @n1lknarf 1 year ago +4

    Your testing is also very generous; fill the screen with particle effects and more characters running their AI behavior and timers, and you'll definitely see the frame drops.

  • @ojhuk 1 year ago +19

    I want to play this, but I'll be holding off on buying it for now until they get around to finishing it. I miss the days when an optimized game was implied by the term "release version". Thanks for the heads-up, Daniel!

    • @iPeanutJelly 1 year ago

      The game runs fine if you aren't pushing over 60fps. I pre-ordered and played the game fine enough. Small stutters here and there, but nothing bad until the final boss: less than 30 fps on medium settings

    • @MaxIronsThird 1 year ago +7

      That's going to take a while.

    • @moldyshishkabob 1 year ago +28

      @@iPeanutJelly "If you aren't pushing over 60fps"
      "I pre-ordered"
      Pfft, okay. Really talking to the masses with this. The latter makes it sound to me like you're justifying the waste of money after the fact.

    • @ojhuk 1 year ago +3

      @@MaxIronsThird If the past few years have taught me anything it's patience.

    • @DragonOfTheMortalKombat 1 year ago +6

      @@iPeanutJelly Go to consoles if you don't want more than 60FPS. And STOP advising others to waste their money like you did.

  • @ThatsMySkill 1 year ago +2

    When DLSS first released I loved the idea of it giving us a boost, but sadly it's now being used as a lazy way to optimize. Years ago we managed to run games fine without DLSS, but now it's like a mandatory thing. It's so lazy. We could have insane graphics combined with insane fps, but studios don't wanna optimize their games anymore. That's why I appreciate the last two Doom games and also Atomic Heart for running so smoothly despite looking really good.

  • @R4in84 1 year ago +17

    Yeah, I should not need to have DLSS on to get above 60fps at 1080p on a 3070, but here we are; finished the game yesterday. Really hoping the devs can solve the performance issues.

    • @SALTINBANK 1 year ago

      Worth it? How many hours, mate?

  • @Donnie_Oculus 1 year ago

    Love the pointing and the moving of your cam in OBS, made me smile xP

  • @yeahbuddy300lbs 1 year ago +48

    I can't believe I switched from console to PC because console games were starting to be very low-res, upscaled by FSR, and here I am with my new PC about to experience the same thing 😂 fml

    • @Boris-Vasiliev 1 year ago +11

      You don't have to play those few unoptimized games on PC. Luckily you can play thousands of other games on PC, compared to like 50-100 on any console. And most popular games on PC run perfectly smoothly even on low-end systems.

    • @papichuckle 1 year ago +2

      Plus you end up spending more, and the devs' advice for optimizing their games is throwing money at your PC.

    • @yeahbuddy300lbs 1 year ago +1

      @@garrusvakarian8709 Yep, that's exactly why I sold mine. I think Jedi Survivor is like 800p internally on Xbox and PS5 😂💀

    • @DragonOfTheMortalKombat 1 year ago +1

      @@yeahbuddy300lbs 648p

    • @DragonOfTheMortalKombat 1 year ago +1

      @@garrusvakarian8709 Like Forsbroken, which comes from that same Square Enix 🤣

  • @draxrdax7321 1 year ago +2

    I have a modest computer, a Ryzen 3 with an RX 6600, and it can run this game at high 1080p. But there are stutters from time to time; using FSR on Quality makes it work really well at 60fps. But I agree this is not a solution; the game should be playable as is, not to mention people with older cards that don't benefit from upscaling. And the game isn't exceptionally demanding-looking in itself, unlike some Capcom games with facial rendering where you can't even tell if the characters are 3D models or real people.
    PS: It's even weirder that this game looks so similar to the first Remnant game in terms of graphics, yet they managed to make it require so much more resources. The first game ran even on a potato.

  • @sherlockholmes7630 1 year ago +23

    Those devs' faces should be slapped inside out, to teach them how to code and optimize so that the game still looks great and runs like butter.

    • @photonboy999 1 year ago +3

      Ya, that's rarely how it works. Games are complicated, especially if you're learning new software. USUALLY the issue is upper management setting deadlines that are just not possible. I wish people would stop blaming the coders.

    • @metroided34 1 year ago +6

      @@photonboy999 I'd usually agree with this stance; sadly it doesn't work in this case, as the devs outright stated the game was designed with DLSS/FSR in mind.

    • @bigturkey1 1 year ago

      What game have you played in the last year that maxed out a 4K 120Hz panel without using DLSS?

  • @user-mh6ie9wm6m 1 year ago +1

    Search for Exploit Protection in Windows and add Remnant 2 under Program settings. Click on the new entry you've added, click Edit, and scroll down until you see Control Flow Guard (CFG). Override system settings and turn it off for Remnant. It's case-by-case with hardware, but CFG is known to cause micro-stutters in Unreal Engine games. Borderlands 3 is super notorious for this.
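    (The same per-game CFG override can be scripted instead of clicked through. A minimal sketch, assuming Windows 10 1709+ with the ProcessMitigations PowerShell module, an elevated prompt, and that the game's process is named Remnant2.exe; verify the real process name in Task Manager first.)

    #include <cstdlib>

    int main() {
        // Per-app override equivalent to the GUI steps above; needs admin rights.
        // "Remnant2.exe" is an assumption; substitute the game's actual process name.
        return std::system(
            "powershell -Command "
            "\"Set-ProcessMitigation -Name Remnant2.exe -Disable CFG\"");
    }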

  • @BPMa14n 1 year ago +32

    My brothers and sisters with last-gen mid-range cards, this is a blessing: the backlog has been waiting for us for years. Maybe in 2025 we can play this game at 1440p 100+ fps, as God intended PC games to be played.

    • @RGInquisitor 1 year ago +3

      Tell that to the people who are still waiting for a GPU worthy enough to get a stable 60fps on Attila: Total War. The game came out 8 years ago, and the developers claimed it was "made with future GPUs in mind" when they received backlash for the performance.

    • @beri4138 1 year ago +1

      @@RGInquisitor That game runs almost exclusively on your CPU.
      The GPU is irrelevant. I played it on a GTX 260 back in the day.

    • @RGInquisitor 1 year ago +1

      @@beri4138 Indeed it does. It still runs like shit even with the latest CPUs. Newer and older games perform much better. Very bad optimization regardless.

    • @escape808 1 year ago

      I play at 1440p and get an average of 120fps on a 3080 and 10900K. Just need to upgrade, my guy

    • @RGInquisitor 1 year ago

      @@escape808 No, I do not. You also do not get an average of 120fps in Attila on any hardware.

  • @PixelShade 1 year ago +3

    I am afraid of the UE5 future... It feels like developers get a lot of convenience out of the engine, like offering unlimited detail with Nanite, a ridiculous number of layers for physically based materials, and cinematic lighting and camera effects. Yet we simulated all of that stuff for a fraction of the cost 10 years ago. I think what will happen is that development studios will just cut corners in terms of staff, and publishers will profit more. The only AAA games that have impressed me A LOT these last couple of years are Half-Life: Alyx (and Cyberpunk, which also has its own in-house engine). Alyx looks pre-rendered in VR. They use highly optimized old-school techniques, and it's impressive that they reach that level of quality while rendering two scenes simultaneously, one for each eye, at insane resolutions and high refresh rates. Heck, I play at 5400x2700 at a solid 90fps with an RX 6800. The game renders wonderfully at 4320x2160 on an RX 6600 XT. Yes, they use a combination of baked and dynamic lighting; yes, they use parallax-corrected cube maps for reflections. But honestly, why not, when they look that good?
    I'm looking forward to Counter-Strike 2, just to see how good Source 2 will look and how performant the game will be. :)

  • @AntonioLexanTeh 1 year ago +56

    The game looks good, but not good enough to require any amount of upscaling. A Plague Tale: Requiem ran better at native on my system than this game does with FSR Quality, INSANE. And we all know that game has some of the most beautiful graphics we have today. Plus, when you change quality settings, there is no clear visual change. Basically this game is trash

    • @nannnanaa 1 year ago

      It doesn't have AA. Upscaling looks great with some sharpening

    • @nannnanaa 1 year ago

      Have you played the game?

    • @christonchev9762 1 year ago +13

      The game looks like a PS4 game, not PS5, let alone UE5

    • @DragonOfTheMortalKombat 1 year ago +1

      @@nannnanaa "Upscaling looks great" and you look like a corporate shill. Of course upscaling looks great when your native resolution looks trash.

    • @spitfire1789 1 year ago +3

      It's crazy, because I thought even A Plague Tale should have run better when it came out, since even though it looks gorgeous, it's a really linear, straightforward game. This game doesn't look anywhere near as good, yet runs worse. Gaming is in the mud.

  • @cookie_of_nine 1 year ago +13

    I have a feeling Nanite is responsible for the dramatic improvement made by the upscaling compared to other games.
    Nanite likely tries to put more detail into an object the more pixels it takes up on screen (provided there is more detail to be had). Thus decreasing the resolution, either manually or via upscaling, has the side effect of changing the amount of work the GPU/CPU needs to do for the scene in general, on top of the fewer pixels it needs to fill.
    Not sure if Nanite has a multiplier for how much detail it tries to put on screen and the Remnant devs set it too high, or maybe Nanite doesn't provide that knob (or make it easy to change), so a resolution change is unfortunately the "easiest" way to trade off detail for performance.
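    (A toy model of that pixel-proportional behavior, purely illustrative and not Nanite's actual interface: pick the coarsest detail cut whose triangle count still roughly matches the pixels covered, so rendering at 50% resolution sheds geometry work along with shading work.)

    #include <cstdio>

    // Pick the coarsest level of detail whose triangle count does not exceed
    // the number of pixels the object covers; each level halves the triangles.
    int PickLod(double pixelsCovered, double lod0Triangles) {
        double tris = lod0Triangles;
        int level = 0;
        while (tris > pixelsCovered && tris > 1.0) {
            tris /= 2.0;  // coarser cut: half the triangles
            ++level;
        }
        return level;
    }

    int main() {
        const double lod0 = 2000000.0;  // a 2M-triangle "CG-quality" asset
        // The same object covering 800x600 pixels at native resolution vs. a
        // quarter of that when rendered at 50% resolution before upscaling:
        std::printf("native  : LOD %d\n", PickLod(800.0 * 600.0, lod0));
        std::printf("upscaled: LOD %d\n", PickLod(400.0 * 300.0, lod0));
    }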

    • @seanmaclean1341 1 year ago +2

      this is 100% it.

    • @Larathu 1 year ago +1

      For your first point you are correct, but from my understanding of Nanite you are incorrect on the second.
      For Nanite to work as intended you need a target framerate:
      if FPS < target FPS: Nanite lowers detail
      if FPS > target FPS: Nanite increases detail
      Uncapped FPS = max detail at all times

  • @MaxIronsThird 1 year ago +21

    Hope to see a follow-up with some AMD GPUs like the 6700 XT and 7900 XTX, and the Intel A750.
    Also other CPU configs like a 12400, 12600K, 7600, 5600X, 5800X, and the 5800X3D: how strong does the CPU need to be to stop the stutters?

    • @Rexxxed 1 year ago +1

      With a 6800 XT and a 5800X3D I get occasional stutters, but nothing too crazy. Going into a new area can cause stutters for a few moments, but then they go away, with an average frame rate of 70-100 fps, typically in the 80s and 90s. This is on mostly high settings with shadows and effects on medium, with FSR Quality. I was honestly pretty sad about the performance. It's playable, but I figured I'd get better numbers. I feel for anyone with a weaker PC; it must be miserable in some cases.
      Edit: forgot to mention this is at 1440p. Haven't tested 1080p, as it screws with my second monitor

    • @MaxIronsThird 1 year ago +1

      @@Rexxxed Use XeSS, it looks better and gives you more performance in this game.

    • @Rexxxed 1 year ago +1

      @@MaxIronsThird I tried using it, and while it does look better it actually decreased performance by a few frames in my case

    • @Kage0No0Tenshi 1 year ago

      An R7 5800X3D or i5-13600K is the minimum to remove the bottleneck

  • @lethargicmosquito 1 year ago +4

    I played it for a couple of hours on my 3070, 10700K, 32GB RAM rig; at 1440p I was getting an average of 60 to 80 frames with my settings being a mixture of Ultra and High and DLSS on Quality. The game is pretty solid and the art style is awesome; I hope they continue to work on the performance so that a lot of people get to try it eventually. Being gatekept out of content due to hardware sucks.

  • @JohnDoe-cw9oo 1 year ago +18

    Seeing how the CPU affects GPU performance is eye-opening

  • @spurio187 1 year ago +12

    Nanite can be really heavy, and its cost is directly tied to the number of pixels being pushed; it's inherent to how the tech works. It basically tries to cull the number of polys down to the number of pixels for a per-pixel effect; that's why upscaling works so well here. Now imagine when we get games with Lumen as well. Hopefully the tech will mature nicely, though, and we will see big performance jumps.

    • @Franku40keks 1 year ago +2

      This is a good explanation, thank you.

    • @Alex.Holland 1 year ago +1

      While I was disappointed with the mandatory DLSS, I was blown away by how well DLSS worked in this game compared to any other implementation I had seen before. Other than the standard parallel-line distortion effect, I legit cannot tell when DLSS is on or off, or what setting it's on. I'm on a 3070, and I tweaked settings to get a rock-solid 72 fps at 1440p; the game is extremely playable.

  • @Cygnus-Phi 1 year ago +13

    Awesome work! The one thing that's missing is checking it with a fast/modern Intel processor (or a fast/modern non-X3D AMD) to see if it's just pure raw power that's needed or something 3D V-Cache related, and then a 5800X3D just to cross-check. Also, why no AMD cards checked?

  • @UncannySense 1 year ago +1

    I've adapted by only getting "new releases" about 5 years after launch, when they're actually optimized, have a fully developed DLC/GOTY edition on sale, and can run on current low-to-mid-range hardware.

  • @einstien2409 1 year ago +12

    Can you see if XeSS and FSR have the same effect on frame rate stability?

    • @sergtrav 1 year ago

      XeSS is significantly faster in this game for some reason.

    • @einstien2409 1 year ago

      @@sergtrav have to try it out then, thanks.

  • @Aaron_Jason 1 year ago +6

    I find capping the framerate usually fixes terrible frametimes in most games; I usually have to do it with RivaTuner, but sometimes the in-game limiter works.
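    (A minimal sketch of what such a limiter does, with target_fps and the render callback as illustrative stand-ins: pace every frame to a fixed time budget, so frametimes stay even instead of swinging between fast and slow frames.)

    #include <chrono>
    #include <thread>

    // Pace each frame to 1/target_fps seconds: render, then sleep off any
    // leftover budget so the next frame starts on a fixed cadence.
    void RunCappedLoop(double target_fps, void (*render_frame)()) {
        using clock = std::chrono::steady_clock;
        const auto budget = std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / target_fps));
        auto deadline = clock::now() + budget;
        for (;;) {
            render_frame();                           // draw one frame
            std::this_thread::sleep_until(deadline);  // burn the leftover budget
            deadline += budget;                       // fixed step, no drift
        }
    }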

    • @Robyne_u61 1 year ago +4

      Ah yea cap it at 30 fps 😂😂😂

    • @ChrisM541 1 year ago +1

      Did you watch the video? If you did, you'll know the problem is with LOW frame rates. Just how much lower would you like it capped? ;)

  • @paulcoffield2102 1 year ago +8

    I remember only a couple of years ago the 3080 was seen as THE card to have in terms of price to performance for 1440p and 4K gaming. Now it's been relegated to a 1080p card by this game. I have a 4090 and 7950X, and I can barely hold 45-50 FPS at native 4K, even when it's overclocked to 3GHz and pulling over 500W. Just insanity. I could understand it a little more if the game looked ridiculously good, but it doesn't.

    • @latambox 1 year ago +1

      Yes, true, and Red Dead Redemption 2 has better graphics, as does Star Wars Jedi: Survivor

    • @beri4138 1 year ago

      @@latambox Both of those games are demanding as fuck

    • @pliat 1 year ago +2

      @@beri4138 Nowhere near this game. I can run RDR2 with everything maxed at native 5120x2160 at ~110fps with my 4090. Not very demanding

  • @stewie410 1 year ago

    I've made a few comments on the game's subreddit -- I'm well aware my rig is quite old at this point, however it's served me great since I built it:
    - Intel Core i7-8700K
    - 32GB DDR4-3200 CL14 (4x8)
    - GeForce GTX 1080 Ti
    - Acer Predator 1440p, 144Hz + G-Sync
    - Game is installed on an HDD currently (though a storage upgrade is planned soon)
    I'm able to get a _mostly_ stable ~85fps on Low with AMD FSR Performance, which is playable and smooth, albeit grainy. I'm also trialing the hidden "DynamicResolution" setting from the INI configuration file (which exposes more granular graphics settings than the in-game UI). Additionally, outside of framerate changes, I can't say I really notice any difference between the preset settings -- they all look the same with upscaling disabled...but the framerate certainly changes...and changes enough that G-Sync doesn't help.
    I'm able to enjoy the game at Low+FSR, but it definitely makes me question my sanity a bit compared to my machine's performance with _literally_ every other game I've ever thrown at it (barring Cyberpunk 2077 at launch)...and it does concern me, especially given how almost perfectly the previous game ran. Frankly, I was worried it was finally time to build another overly-expensive, 5-year+ rig...but hearing that even near-top-spec machines can struggle makes my original un-optimized hypothesis seem more accurate.
    I've also heard that the move to UE5 was made late in the development cycle -- if true, this could help explain why the game runs the way it does compared to other UE5 titles on the market today. Again, if true, I'd wonder if the publisher required UE5 for launch, to have that to brag about in their portfolio (as I wouldn't expect that of Gunfire, to be honest).

  • @wekkimeif7720 1 year ago +3

    Those spikes are most likely the game building new shaders. The areas you already visited have no stutter, as the shaders are already loaded. It's a common Unreal Engine issue, and 99% of the time devs don't optimize shader compilation

    • @BlackParade01 1 year ago +1

      Unreal Engine 5.2 is supposed to have a built-in solution for this, and Remnant 2 is built on 5.2.

    • @wekkimeif7720 1 year ago +3

      @@BlackParade01 Unreal Engine 4 already had a solution for this: just build the shaders before the game starts. But this is just another developer studio that doesn't care about performance at all. They admitted in their own post that they had done no optimization and relied fully on temporal upscaling.

  • @RazielAU 1 year ago

    Worth noting that a lot of Nanite geometry actually renders on the CPU, not the GPU. All the small geometry from Nanite is software rendered. It's only the larger polygons that the engine sends to the GPU (along with any non-Nanite geometry like characters and such).
    It's done this way as GPUs are extremely inefficient at rendering small triangles. The way GPUs are designed to work is that each triangle will have a lot of pixels to render, which is extremely efficient and able to be executed in parallel across different compute units. When a triangle is only a single pixel on the screen (which most Nanite triangles are), what happens is that the triangle is queued for rasterisation, but almost instantly finishes rendering on a single compute unit. The GPU then idles until the next triangle is queued. Instead, if the triangle covered 200 pixels, then different pixels could be processed on different compute units, and it would give enough time for the next triangle to get processed/queued, this is how GPUs are meant to work.
    So Nanite is only really possible because of Epic's custom software renderer that deals with Nanite geometry on the CPU. This is why you're seeing the results you're seeing. A higher end CPU with higher clock speeds and more cores should in theory help with Nanite rendering performance.
    I vaguely recall Epic saying that an 8-core CPU is the minimum recommended spec for running Nanite games, but that may have changed as I'm talking about very early versions of the engine predating the final public release. For most people, Nanite games will pretty much require upscaling as it's not possible to do software rendering at higher resolutions on most consumer CPUs. As Nanite games become more common, you might find your next big 'graphics' upgrade not being a new GPU, but a higher end CPU.
    One thing I'm not sure of is how the engine decides whether geometry should go to the CPU or the GPU; I'm a little vague on those details, but it's fair to assume there will be some smarts around that. Also worth noting that, over time, I would expect improvements to the software renderer to increase performance further.
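    (A toy sketch of the split described above; the Tri struct, the 32-pixel threshold, and both queues are invented for the example, not Unreal's API. Triangles are binned by screen coverage: pixel-sized ones go to a software-raster queue, large ones stay on the classic hardware path, which needs many pixels per triangle to stay busy.)

    #include <cmath>
    #include <vector>

    struct Tri { float x0, y0, x1, y1, x2, y2; };  // screen-space vertices, in pixels

    // Triangle area in pixels, via the shoelace formula.
    float Coverage(const Tri& t) {
        return std::fabs((t.x1 - t.x0) * (t.y2 - t.y0) -
                         (t.x2 - t.x0) * (t.y1 - t.y0)) * 0.5f;
    }

    // Tiny triangles starve the hardware rasterizer, so route them to a
    // software-raster queue and keep the big ones on the classic path.
    void BinTriangles(const std::vector<Tri>& tris,
                      std::vector<Tri>& softwareQueue,
                      std::vector<Tri>& hardwareQueue,
                      float thresholdPixels = 32.0f) {
        for (const Tri& t : tris) {
            (Coverage(t) < thresholdPixels ? softwareQueue : hardwareQueue)
                .push_back(t);
        }
    }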

  • @cosmickirby580 1 year ago +7

    I was talking to my friend a while ago about my hatred of FSR and DLSS.
    They are not bad technologies and they have their uses, but this exact situation is why I despise them.
    DLSS frame generation and the way they marketed it was the giveaway for me. The writing was on the wall; I just can't believe it wasn't limited to the GPU companies and their desire to sell underpowered hardware at higher prices.

    • @hastesoldat 1 year ago

      So you think GPU makers are secretly bribing game developers to sabotage their games' optimization? oO

    • @cosmickirby580 1 year ago +1

      @hastesoldat Nah, not that deep. With the technology that is available, game devs don't have to fully optimize their games anymore.
      Essentially, "Who cares if the game doesn't get fps over 60? DLSS frame generation makes up for it."
      It's oversimplified, but that's the point. It boils down to laziness or cutting corners.

    • @hastesoldat 1 year ago

      @@cosmickirby580 I still don't see how GPU makers and engineers are at fault there. It's the game devs that are incompetent and lazy as usual and are using the technologies wrong.

    • @cosmickirby580 1 year ago

      @hastesoldat Again, GPU makers and engineers are not conspiring with game devs; it's not that deep.
      What I'm trying to tell you is that the technology, regardless of its intended purpose, allows for corners to be cut both on the manufacturing side and on the actual data being processed.
      In oversimplified words:
      GPUs don't need to be as powerful, because supersampling makes up for it.
      Games don't need to be as optimized, because supersampling makes up for it.
      This does not mean that they have to be conspiring with each other. In reality, they are working towards the same goal, making more money with less time invested, and this technology simply allows it to be done.

  • @Markoul11 1 year ago +1

    On a CPU-limited system, locking the FPS to, say, 60, 50, or 40 will take strain off the CPU for smoother gameplay. Do not leave the FPS uncapped in the game settings! Also, as crazy as it sounds, on a CPU-limited system the DLSS Balanced or even Quality setting can give you better performance than the Performance setting, since DLSS takes strain off the GPU but increases the strain on the CPU.

  • @stratuvarious8547 1 year ago +7

    Given the recommended specs, I don't see why higher-end systems would need upscaling. I'm running a Ryzen 7 3800X/6900 XT, which is well above the "recommended", especially on the GPU side. This is a real problem. I've been trying to give developers the benefit of the doubt when it comes to PS5/Series ports, but it doesn't seem like Remnant 2 has that excuse. Eventually, this is going to become a problem for developers, because people are quickly getting sick of the idea of release now, optimize later.

    • @rahulpandey9472 1 year ago +1

      If we go by what the devs said regarding the use of upscaling tech, then those requirements make perfect sense: DLSS Performance at 1080p for 60 fps in the case of the 2060, and FSR Performance at 1080p for 30 fps in the case of the 1650.

  • @MrLathor 1 year ago +2

    It's one thing for Cyberpunk 2077 path tracing to require upscaling; that is bleeding-edge, forward-looking graphics technology. It's another for some typical-looking UE5 game.

  • @nomustacheguy5249 1 year ago +5

    How does the game perform on AMD GPUs? Any chance you will make another video?

  • @filip9587 1 year ago +1

    The moving head is a great addition. Really made me crack up at how genius it is. Hope to see more floating Owen.

  • @CryonKing1410 1 year ago +4

    Hey Daniel! First of all, fantastic video, and I hope to see more from you in the future. I love your channel! Keep it up :)
    I want to share my experience with Remnant 2.
    I have a 4070 Ti, Ryzen 7 7700X, and 32GB DDR5 6000MHz CL40 on an MSI B650M Mortar WiFi.
    At ULTRA settings with DLSS Quality and Frame Generation at 2560x1440, I go from 144 FPS (my native refresh rate) down to a whopping 60/50 FPS (around) in intense scenes, and believe me, you can feel that drop. This game should not run like that.
    I can accept that they rely on upscalers, etc. (I like to run games natively and don't like the idea of being forced to use an upscaler to enjoy the game.) But if they "made the game with upscalers in mind", the game should not drop from 144 to 60/50 fps with all that "fancy" Nvidia upscaling/AI technology.
    My friend who plays on a laptop with a 3080 10GB 125W TDP (MSI GP66) gets a reasonable 60/70 FPS at 2560x1440 on medium-ish settings with upscaling, but it is very stuttery nonetheless. In intense scenes (where mine are 60/50 FPS), he gets around 30/20 FPS.
    We played yesterday before the global release. We hope the performance will get better soon, because we love the gameplay of Remnant 2, and we loved the first game and hoped to play this one with ease :/

    • @franciscojavierolivaresval3574 1 year ago

      If you are playing at more than 144fps, I don't understand what the problem is. Anyway, the technologies are there to be used: why play natively if you can improve performance without sacrificing graphical quality? I also have the RTX 4070 Ti, and it handles 2K and 4K plenty well for me without sacrificing image quality, since DLSS is very well executed in this game

  • @blackdoc6320 1 year ago

    People definitely underestimate what percentage of GPU usage shadows take up. In an Unreal project, simply enabling virtual shadow maps over standard shadow maps can cut fps in half in some scenes, but there is literally nothing crisper or more realistic short of ray-traced shadows. Shadows are a huge part, and Nanite definitely takes a chunk too. Now, I know there are a lot of people saying "oh, it looks like a UE4 game"; that's just because devs were really good at tricking your eyes' perception of low-poly objects using normal maps and ambient occlusion. Nanite allows for almost true-to-life detail in objects that otherwise would've had it baked into a normal map.

  • @ThisAintPizzaHut445 1 year ago +4

    Super interested in seeing your AMD benchmarks. Other channels have shown the 7900 XTX matching or even beating the 4090 in this game.

    • @GeneralS1mba 1 year ago

      Probably, since it wasn't optimized properly; it just inherently works better on AMD because of the consoles

  • @kwinzman 1 year ago +2

    Extremely informative, systematic testing. Thank you!

  • @J0ttaD 1 year ago +7

    It's crazy how this game barely looks any better than the first game but crushes that system... !?!?!? WHAT

    • @FloVelo941 1 year ago +1

      It literally crashes my system, and I just got a $400 AMD 16GB GPU for this game. I have to stay at 30fps with the lowest settings. At this point Mad Max looks better on my computer than this game.

    • @goblinphreak2132 1 year ago +2

      Remnant, the first one: 7800X3D and 7900 XTX and 64GB DDR5 6000 CL30, max settings at 3440x1440, I get 180-300fps. Looks good.
      Remnant 2, same 3440x1440, same ultra settings (no FSR), and I get 56fps AVERAGE. Turn on FSR Quality and I get 110-120fps, but the game looks like literal shit. The game isn't that much prettier than the first game.

  • @Azalis1701 1 year ago +1

    I think the big problem with PC ports lately is that you can't brute-force performance as well anymore. The last-gen systems were so weak compared to a PC, especially on the CPU side. This generation is actually really solid, even by PC standards. The PS5 and Series X are running a Zen 2 CPU roughly equivalent to a Ryzen 3800 and an RDNA2 GPU somewhere close to an RX 6700. Then they optimize for that hardware at 4K 30 FPS or 1080p-1440p 60 FPS, and it doesn't leave a lot of room for PC hardware to brute-force past it and get 120+ FPS.

  • @Jakesmith943 1 year ago +3

    Thank you SOOOO much for this review, saved me from buying it and being disappointed yet again with a new AAA title on PC. There is no way I am paying money now for something that will be fixed later. I didn't see this issue raised in any of the reviews I watched. Again, thanks!

  • @RGInquisitor 1 year ago +1

    I remember almost ten years ago being able to max out almost all the latest games and get a stable 60+fps @1080p with my $250 mid-range (at the time) GPU. Metal Gear Solid V: The Phantom Pain, Fallout 4, Battlefield 4, and Far Cry 3 all ran smooth as butter. I paired it with an i5-4690K and 8GB of RAM (later 16GB).
    That 4GB R9 380 was my first ever GPU, and I feel like I got more out of it then than I do now with my RTX 3080. These days I have to scour the internet to figure out why the hell my games are stuttering, suffering frame drops, or crashing, or even how to get 60fps without the need for upscaling. My 12-core/24-thread CPU barely gets utilized, games are uncompressed and take up a lot of storage space, and they run like refrigerators. I also remember during those times wishing I could afford a high-end system so I could use all those fancy graphics mods and play at higher resolutions, and now that I have a pretty damn good PC, it's a pain in the ass!
    But the laziness of these developers is not new. Even during those times we had a few of them. Remember when Total War: Attila came out and they claimed that the game was made with future GPUs in mind, and that the shitty performance was as intended? Well, 8 years later, why does the game still run like shit, Creative Assembly? I still can't get a stable 60 in that game, regardless of graphics settings and resolution. If I could go back and benchmark my 380 and compare it to my 3080 in that game, it would seem that there has been no performance improvement in GPUs at all!
    I am also of the opinion that we should stop blaming all these terrible releases on "management" alone. There are for sure "hardworking" developers who are to blame for some of the issues that have been plaguing the industry, but they keep getting away with it because a lot of people treat them like they can do no wrong and the fault rests entirely on management.

    • @papichuckle 1 year ago +1

      I can't see myself sticking to PC gaming; this sort of stuff is free advertising for consoles.
      With how expensive it is to live in the modern world, and games not being something we need to survive, PC gaming isn't viable for average people anymore when we have to throw money at our PCs just for a game to work, even when the games don't even use our PCs properly in the first place.
      At least with a console you have a much higher chance of a game being optimised, and if it's not, you know it's not on your end, since everyone has the same hardware. When you talk about a PC game online, the PC community will blame your hardware every time rather than actually admit the game they like is badly optimised.
      You can never trust positive reviews on Steam either: there are countless "overwhelmingly positive" reviews for games with optimisation problems. I'm convinced the PC community has insanely low standards

    • @RGInquisitor 1 year ago

      @@papichuckle I cannot disagree with you, really. However, I am still unable to return to consoles. I bought a Series S last year and it barely gets any use. I cannot go back to 30fps releases, and these days games are broken even on console. Gaming in general has been in a rough spot. I'm expecting Armored Core and Starfield to have rough releases as well, but hopefully I'm wrong.

    • @papichuckle 1 year ago +1

      @@RGInquisitor Oh yeah, like how fortunately I had the option to play Star Wars Jedi: Survivor on Xbox Series X in performance mode ("60fps") because of how unoptimised it was on PC. Even on console it was definitely dipping below 60fps often, and the FSR on it was really aggressive, but even then it was in much better condition than the PC version, and of course that's just one example.
      I couldn't imagine only having a PC; it's nice to also have a Series X or PS5, so when I see an Unreal Engine 4 game, or if it's just unoptimised anyway, I can go with the console version and have less of a headache trying to fix it.
      And even Starfield being locked at 30fps on Series X is depressing; they didn't give an option for a performance mode, but knowing it's Bethesda, I'm 100% sure the game will be a CPU-bound, bottlenecked mess, so it's going to be an absolute unoptimised heap of shit on PC. Plus I think it's sponsored by AMD, so that's a red flag as well.
      Another example was Dead Island 2. You'll see the PC community sucking it off, but it will use like one CPU core at 53% whilst the others go to sleep, and the GPU usage is an unstable mess as well. Obviously that's AMD-sponsored as well, so there's only FSR to use, which when used actually makes my performance worse (it was Unreal Engine 4)

  • @stirfrysensei 1 year ago +14

    Add another one to the pile, I guess… Regardless of what the devs claim, this is poorly optimized and shouldn't have been released until it was actually finished. Sadly, people will use this info in their GPU wars instead of holding the devs accountable.

  • @jackshephard1693 1 year ago +1

    This game is heavy and unoptimized on consoles too: to reach 60fps the resolution goes down to 792p, and it still dips into the mid-30s fps, according to DF.

  • @SkeeziOS 1 year ago +3

    Something tells me the ultra setting just adds unnecessary stuff the devs left there so players can get the best experience possible from the game if their system can support it, but it's not needed for most people.
    Still, the game is too demanding and needs some serious revision, but for now I would like to see some testing at medium and high settings.
    Either way, it's DOA imo

    • @FloVelo941 1 year ago

      The ultra setting crashes Remnant 2 on the spot for me, and with this new GPU I got, I can play Forza 5 at near-ultra settings without hiccups.

  • @kathrynck 1 year ago +2

    I think running Remnant II on a GTX 1650 (as per their listed "minimum requirements") was incredibly optimistic of the devs.
    Remnant 1 (UE4) was very playable with a GTX 1080 Ti at 1440p native. But it didn't exactly flat-line against the monitor refresh cap (mostly 60-90 fps).
    And for reference, a 1080 Ti makes a 1650 look like a bad joke. I seriously doubt a 1650 could even come close to running Remnant 1 at 1440p.
    The footage of Remnant II (at 540p upscaled to 1080p) looks pretty bad. Lots of stutter, and the eye candy isn't even as pretty as Remnant 1 on UE4.
    Remnant 1 was _terrible_ at pre-compiling content as you approached new terrain. It would crash somewhat frequently because it didn't pre-load early, then freaked out when you moved forward and suddenly needed new render content. And that was with 11GB on a 1080 Ti.
    Excellent testing, by the way. Clearly the CPU/mobo/RAM is critical for this game. There are a LOT of gaming PCs out there with a 9600K or less. That's gonna hurt Remnant II a lot.
    Adding "eye candy" (UE5) while relying on huge amounts of upscaling to make it work is counter-productive. It actually looks worse than if they had just made the game on UE4, running at native res.
    Also, wow, a beastly system with a 4060 @ 1080p/medium can't hit 60fps? That's horrendously unoptimized.
    And a 4090 @ 1440p ultra only hits 70 fps, with no ray tracing in the game????? That's just broken.
    This may be the "hardest-to-run" game in existence. Generally Cyberpunk with max RT is the lowest fps you can get in a game. But this beats Cyberpunk for low FPS.
    You really NEED very high fps to play Remnant, too. It's got a lot of content which requires 'immediate' reflexes. It's basically a gun-toting version of Dark Souls with much better multiplayer.
    I have to say this is not a very good PC port. Which is a pity, because Remnant is an amazing game.

  • @zretil 1 year ago +3

    I was expecting system requirements similar to the previous game's; graphically this game looks pretty much identical to the first one. I don't see any benefit from using UE5, other than bumping new GPU sales of course.

  • @recordatron
    @recordatron Рік тому +1

    It raises some interesting questions. On paper I don't personally have any issue with how I get to decent image quality at the performance I'm targeting, given that most of today's graphics are primarily post-processing and image reconstruction anyway. Where it gets murky is that, depending on which upscaling technique(s) you have access to, you can end up with a noticeably worse experience from game to game, often for arbitrary reasons outside your control.

  • @BUCCIMAIN
    @BUCCIMAIN Рік тому +3

    People are saying ''the game doesn't even look good'', but graphically it does; the art direction is just VERY specific, and liking it is subjective. The performance argument is valid, but there's no need to bash the game for no reason.

    • @j-swag7438
      @j-swag7438 Рік тому

      The graphics look average/decent to me, but nowhere near good enough to warrant those system requirements. There's a level in Black Ops 3 zombies called "Zetsubou No Shima" with a similar art style to what's shown in this video, and it runs perfectly on my GTX 970 at pretty much the same graphical quality. And that game released 8 years ago.

    • @BUCCIMAIN
      @BUCCIMAIN Рік тому

      @@j-swag7438 I took a look at a few YouTube videos of Zetsubou No Shima and no, that isn't ''pretty much the same''. The detail and overall quality are obviously far better in Remnant 2. Like I said, the argument about performance is valid, but saying it looks 'bad' is subjective.

  • @RubbingPotatoes
    @RubbingPotatoes Рік тому +2

    @5:30 we later determined that it was CPU-bottlenecked. But why doesn't the CPU utilization show 100% usage on any of the cores?
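
    One common answer, sketched here as a toy model rather than a claim about this game's engine: a single saturated thread (a real CPU bottleneck) gets migrated between cores by the OS scheduler, so a monitor that averages per-core load over its sampling window shows moderate load everywhere and 100% nowhere.

    ```python
    import random

    # Toy model: one fully saturated game thread on an 8-core CPU.
    # Each scheduler quantum, the OS may place the thread on any core,
    # so no single core accumulates 100% load over the sampling window.
    N_CORES = 8
    QUANTA = 1000  # scheduler time slices in one sampling window

    busy = [0] * N_CORES
    for _ in range(QUANTA):
        busy[random.randrange(N_CORES)] += 1  # thread runs somewhere each slice

    per_core = [100 * b / QUANTA for b in busy]
    print([f"{u:.0f}%" for u in per_core])  # roughly 12-13% on every core
    # Yet the thread never slept: the game is still CPU-bound.
    ```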

  • @spg3331
    @spg3331 Рік тому +51

    Game devs being lazy? NO WAY! The state of gaming these last few years has been unacceptable.

    • @TheVanillatech
      @TheVanillatech Рік тому +1

      Yet you keep slapping that pre-order button!
      And "few years" is a bit of a stretch. Try almost 10 years, since The Witcher III, basically. That was the first big test-the-waters "FUCK YOU" from developers: NDAs on reviews because of the EPIC downgrade in visuals and the blatant "drop port" the PC version was of the console versions, with CDPR protecting their pre-order millions by threatening that German website with legal action for leaking the truth days before release - a leak made in a desperate attempt to expose CDPR and let people cancel pre-orders and get their money back in time over FALSE ADVERTISING.
      Now the devs don't even try to hide it anymore - cos y'all KEEP ON SLAPPING that pre-order button.
      Oh, and you keep paying Nvidia $400 for entry-level GPUs too. And $300 for their media-centre options.
      GG. WP.

    • @Rythm1337
      @Rythm1337 Рік тому +2

      He's right, devs are lazy. You can see what's wrong with the optimization right there in the stats.

    • @nannnanaa
      @nannnanaa Рік тому +1

      have you played the game?

    • @BladeCrew
      @BladeCrew Рік тому +5

      Buy indie games; those devs make sure you can run their game even on a potato or a toaster.

    • @TheVanillatech
      @TheVanillatech Рік тому +2

      @@BladeCrew Just buy games that are already out, that have been reviewed, and where the reviewers said "Great game! Well optimized!" - SURE, take my money!
      Not pre-ordering nonsense 2 years in advance, getting trash like this, and then spending $500 on DLC while you wait for the trash to get patched! XD
      Gamers have all lost their minds! XD

  • @dtrjones
    @dtrjones Рік тому

    The first two comparisons, with the RTX 2060 and RTX 3060, showed the RTX 2060 taking a 10% utilisation drop on top of the 20 fps drop, with stutters. That tells me the overhead of DLSS Performance is contributing to the frame-time stutters, not just the CPU making the GPU wait. While the game is playable with a 2060, in my eyes the minimum GPU should really be an RTX 2070 or RTX 3060.

  • @jeremyf1901
    @jeremyf1901 Рік тому +6

    I hope Digital Foundry cover this game.

  • @dra6o0n
    @dra6o0n Рік тому

    Modding Nanite off doesn't improve frame rates across the board. It's largely a placebo: a powerful GPU performs better simply because Nanite is no longer occupying it, while a lower-end GPU will see actual frame drops with Nanite modded off.
    The true culprit is texture streaming ("stream boost"). Modding the stream-boost value down, or off, gains FPS, but quality takes a massive hit.
    That's because UE5 works with textures on the fly, and Remnant 2 ships only 4K-or-higher textures, heavily compressed, in its PAK files. It constantly loads and streams them, always needing new files, so your VRAM and RAM work overtime and latency creeps in.
    The minimum and recommended specs for this game assume an UPSCALER at performance presets. The minimum spec is NOT for native rendering.
    The developers didn't want to author lower-resolution assets for proper LODs, so Nanite manages the LOD transitions instead. But Nanite is also known for squeezing out as much quality as possible by eating whatever free GPU/CPU resources it can find, targeting a 60 fps game, not 120 fps or higher.
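
    If texture streaming really is the culprit as described above, the knobs involved map onto stock UE texture-streaming console variables, which UE games conventionally let you override from a user-level Engine.ini. A minimal sketch, assuming Remnant 2 honors these cvars and uses the usual per-project config path (both unverified):

    ```ini
    ; %LOCALAPPDATA%\Remnant2\Saved\Config\Windows\Engine.ini  (path is an assumption)
    [SystemSettings]
    r.Streaming.PoolSize=3072         ; texture streaming pool in MB; size it to your VRAM
    r.Streaming.LimitPoolSizeToVRAM=1 ; don't let the pool overcommit VRAM
    r.Streaming.Boost=0.5             ; request lower mips (engine default 1.0): fewer streaming stalls, softer textures
    ```

    Worth stressing: these are generic UE streaming knobs, not a documented fix for this game, and if the stutter is CPU-side they will do little.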

  • @jacksonoliveira2813
    @jacksonoliveira2813 Рік тому +4

    When I bought my RTX 3060 Ti I was super happy: it could run every current game, and in QHD at that (some with DLSS's help to keep frametimes stable). Now barely any time has passed and I feel like I'm back on my old GTX 970 (and boy, did that card last). These optimizations are downright filthy.

    • @KING0SISQO
      @KING0SISQO Рік тому +1

      Still have a 970 ☠️

    • @jacksonoliveira2813
      @jacksonoliveira2813 Рік тому

      @@KING0SISQO In the GPU world the 970, 980, 1070 and 1080 are kings... but bad optimization and those sh*t upscalers and frame gen kill all that good raw GPU power.

  • @SunnyJCR
    @SunnyJCR Рік тому +1

    I just finished a playthrough of Doom Eternal before swapping over to Remnant 2 yesterday.
    Let's just say that few experiences in my life have been quite so jarring.

    • @kolppi
      @kolppi Рік тому

      id Software's engines have always been superior to Unreal's, IMO.

  • @HoldinContempt
    @HoldinContempt Рік тому +7

    Since they won't optimize the game to run on current hardware, just refuse to buy the shitty game until 3 years later, when it's 80% off.

  • @jaceofspades.
    @jaceofspades. Рік тому +1

    Dang man, thank you for all the work you put in for us!

  • @cgerman
    @cgerman Рік тому +5

    It could be that developers are getting lazy: instead of writing better code, they rely on upscaling, or simply say "just buy a better GPU". It's insane that even the 4090 can't keep up.

    • @Leahi84
      @Leahi84 Рік тому +3

      That's the attitude a lot of people I've run into in the PC gaming community have too. You're just expected to upgrade or stop complaining, which is horrible with the economy the way it is.

  • @polokthelegend
    @polokthelegend Рік тому +1

    13600K here. Stuttery mess no matter what; GPU utilization on my 7900 XT never goes above 80 percent. Makes me wonder if there's an Intel CPU issue as well, since the higher-end system tested used an AMD CPU.
    Update: After more testing, I found that removing my undervolt seemed to fix the stuttering. Upscaling is still, obviously and unfortunately, required. My undervolt settings have never been an issue in any other application or game (including other UE5 titles); this is the only title I own where undervolting caused stutters. Just wanted to update in case any other struggling Intel users with an undervolt see this.

  • @erickelly4107
    @erickelly4107 Рік тому +3

    As I've been saying for months, the RTX 4090 makes more sense paired with a 1440p/240Hz monitor (for a wide variety of games: single-player RPGs at max settings with ray tracing, etc., at ~90+ FPS and up to 240 FPS). This upsets some people, as 4K monitors have become much more popular (many own them and want to justify the purchase, obviously), but it really shouldn't come as a big surprise given that even older games such as Cyberpunk 2077, Control, and Halo Infinite are plenty demanding at 1440p if you want the corresponding optimal ~90+ FPS experience on an RTX 4090 PC.
    So why anybody would be surprised that newer UE5 games further drive home the point that the RTX 4090 is better suited to 1440p is bewildering. 1080p/1440p/4K do NOT mean what they used to, an issue many seem not to grasp.
    Having both a 4K/120Hz/VRR OLED and a 1440p/240Hz G-Sync display, I've preferred gaming on the 1440p monitor overall with my RTX 4090 PC. The largest leap in visuals is going from 1080p to 1440p; 1440p to 4K isn't nearly as mind-blowing (diminishing returns past 1440p) when it comes to gaming. For example, given the choice of 4K/60FPS or 1440p/90FPS, I'd choose 1440p EVERY time.
    The consoles have more VRAM (~13.5GB "usable" for games) than they literally know what to do with, and IMO that's the reason for so many terrible "PC ports", which is a shame. The GPUs in these consoles (~RTX 2070 at best) aren't nearly powerful enough to justify 4K gaming, hence the upscaling/dynamic-resolution gimmicks, and even then the performance is quite abysmal at ~30FPS.
    Regardless of this pandering to ignorance and "4K" marketing (4K more better...) on a $500 budget, game developers should NOT use it as a lame excuse to release terrible PC ports that run like crap due to absurd VRAM requirements, just so the consoles can run high textures at ~30FPS. Pretty visuals don't mean much when your in-motion performance is sub-par.
    As for CPU requirements, again, no real surprise that the new platform (AM5/DDR5) is starting to show its advantages over the older DDR4 platform as time passes. People love to complain (a favorite pastime of many; Nvidia, AMD, and Intel are out to get me, etc.) about the price of admission for AM5/DDR5, but it's not as if spending more brings no benefit well into the future. It pays not to be short-sighted.
    The RTX 4060 is not nearly as terrible as many reviews/benchmarks have made it out to be. The problem, by and large, is that most reviewers don't provide proper context and instead pander to the outrage mob, who demand outrage porn. Very EASY to do given the horrendous state of the economy in general; just look at which comments get the most "likes" and the point becomes crystal clear.
    People often don't like hearing the reality of things (it usually involves more $$$), but it is what it is, regardless of what the mob wants to hear.
    As for having to use upscaling to play the game optimally, I find it hard to be remotely bothered by this. If the game still looks and plays well, is it really that big a deal? Who would have thought that, over time, games would become more demanding to run… Again, comparing 1080p/1440p/4K of the past to now is comparing apples to oranges.
    Another thing people seem not to understand is that watching YouTube videos (compression, 60FPS cap, etc.) is NOT the same as playing the game natively on a high-refresh monitor at home. So the comments about how the game doesn't look good are pretty irrelevant and further demonstrate the appetite for outrage porn: people are broke, life sucks, and why did I buy this 4K monitor again?

    • @Wobbothe3rd
      @Wobbothe3rd Рік тому +2

      Good comment, but the consoles don't actually have as much VRAM as you think. Very few games on Xbox Series X or PS5 actually use more than 8GB for video, 10 or 11GB absolute max. The PS5 reserves a huge chunk (over 3GB) of the shared RAM for the OS, and obviously not ALL of the remaining RAM is used for video!

    • @wallacesousuke1433
      @wallacesousuke1433 Рік тому

      Holy wall of text...

    • @erickelly4107
      @erickelly4107 Рік тому

      @@wallacesousuke1433 Yeah, but I space it out to make it easier to read.

    • @leoSaunders
      @leoSaunders Рік тому +1

      Just for everyone else: a 2070 is roughly a 6600.
      6600 = 2070 <= 3060 < 6600 XT < 6650 XT

  • @krzysiekj2522
    @krzysiekj2522 Рік тому +1

    I have a 2080, a 5800X3D, and 16GB RAM.
    I play at 1440p medium with shadows low. I get 58-68 fps in open areas and 75-80 fps in dungeons.
    My frametime graph looks way better than what you show with the 3060 Ti, though: it's not totally flat, but I didn't have spikes anywhere near as frequent or as huge as you're showing.
    Also, their hotfixes are working; the game runs a bit better today than it did on the 23rd. On the 23rd, to get 60fps at 1440p I had to go all low with DLSS Performance.
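
    For anyone comparing frametime graphs like these, the spikes matter more than the average. A minimal sketch of how average fps and "1% low" fps are typically derived from a frame-time log in milliseconds, one value per line; the file name and format here are assumptions (e.g. a CapFrameX/PresentMon-style export):

    ```python
    # Read a plain list of frame times in milliseconds, one per line.
    with open("frametimes.csv") as f:
        frametimes_ms = [float(line) for line in f if line.strip()]

    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)

    # 1% low = fps over the slowest 1% of frames. A handful of 100 ms
    # spikes barely moves the average but craters this number, which is
    # why a flat frametime graph matters more than a high fps counter.
    worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    low_fps = 1000 * len(worst) / sum(worst)

    print(f"avg: {avg_fps:.1f} fps | 1% low: {low_fps:.1f} fps")
    ```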

  • @derekn615
    @derekn615 Рік тому +5

    This game doesn't have RT, but it makes heavy use of UE5 Nanite. It's likely that having the option to turn it off would dramatically improve performance (a cvar sketch follows this thread). You touched on this in your Fortnite video when they moved to UE5: ua-cam.com/video/WcCUL3dR_V0/v-deo.html

    • @Wobbothe3rd
      @Wobbothe3rd Рік тому +1

      You can't just turn nanite off lol

    • @WeaselPaw
      @WeaselPaw Рік тому +1

      With Nanite "off" it would probably just run at 4 fps, as it would have more stuff to render.

    • @arenzricodexd4409
      @arenzricodexd4409 Рік тому

      @@WeaselPaw Depends on the game's implementation. They could go back to the older way of doing things, which performs better but with less detail.

    • @WeaselPaw
      @WeaselPaw Рік тому

      @@arenzricodexd4409 Yeah, but then you can't just offer an on/off toggle; they would have to implement both paths, which would defeat the purpose of using Nanite in the first place.

    • @arenzricodexd4409
      @arenzricodexd4409 Рік тому

      @@WeaselPaw Implementing both the old and the new method is common while everyone is still in a transition phase. That's why, when a new DirectX version is released, most games don't use the new API exclusively; they ship on both the old and the new API, with the older one serving as a way for people on older or slower hardware to play. And as usual, some gamers, especially those with newer hardware, will accuse developers of being lazy for not going all-in on the new API's features when they do this. I think something similar is happening here.
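
    For anyone who wants to test this thread's question directly: `r.Nanite` is a stock UE5 console variable, and it can be tried from the same user-level Engine.ini as any other cvar. Whether Remnant 2 leaves it unlocked, and whether its Nanite fallback meshes end up cheaper or more expensive to render, is exactly what is being debated above, so treat this as an experiment rather than a fix:

    ```ini
    ; %LOCALAPPDATA%\Remnant2\Saved\Config\Windows\Engine.ini  (folder name is an assumption)
    [SystemSettings]
    r.Nanite=0   ; render Nanite meshes via their coarser fallback meshes instead
    ```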

  • @i11usiveman97
    @i11usiveman97 Рік тому +1

    Unreal 5 is an engine built for the coming years, so I wouldn't expect even the highest-end GPU to just crank everything to Ultra at 4K and hold over 60fps.
    My 7900 XTX performs very similarly to that 4090, though FSR doesn't claw back as many frames as DLSS. I'm actually quite surprised how well the 7900 XTX is performing.

    • @FuelX
      @FuelX Рік тому +1

      Same here, my 7900 XTX loves this game.

  • @__-fi6xg
    @__-fi6xg Рік тому +4

    Oh oh, but mister Owen, my PC is ready; clearly you missed the part where I didn't buy a 1440p or 4K monitor, so my new PC feels stacked in every game at 1080p.

  • @MechAdv
    @MechAdv Рік тому +1

    While this game doesn't appear to be breathtaking visually, remember that UE5 demos WITHOUT Nanite or Lumen were crushing the 3090 two years ago. Just because the devs didn't do a great job artistically doesn't make the engine any less challenging to push frames through. UE5 is the reason I'm waiting until next gen to upgrade my GPU; there isn't enough performance on offer this generation to get 3-4 years out of a card.

  • @nebsi4202
    @nebsi4202 Рік тому +4

    I use a 4080 at 1080p with a 13600K, and this is one of the games that actually pushes my GPU to the max. Without DLSS/frame gen I was averaging around 100 fps AT 1080p, and in some areas it dropped as low as 74.

    • @bigturkey1
      @bigturkey1 Рік тому

      You should game at 4K with a 4080.

    • @nebsi4202
      @nebsi4202 Рік тому

      @@bigturkey1 I'm gonna game at whatever resolution I want, it's my PC xdd

    • @bigturkey1
      @bigturkey1 Рік тому

      @@nebsi4202 that's dumb

    • @nebsi4202
      @nebsi4202 Рік тому

      @@bigturkey1 My money, my GPU; I can do what I want with it. Simple.

  • @6ch6ris6
    @6ch6ris6 Рік тому +1

    The biggest issue is that this could very well be a problem internal to the UE5 engine in general. And Unreal is the leading game engine, so we might see a lot of games with similar issues.

    • @mannyc19
      @mannyc19 Рік тому

      Tim Sweeney is no John Carmack.

  • @callmekrautboy
    @callmekrautboy Рік тому +5

    Thanks for putting in the work to get this out. Great overview. 100% not supporting this type of development.

  • @CHA0SBLEEDS
    @CHA0SBLEEDS Рік тому +2

    The recommended CPU is a 6-core/12-thread part; the 9600K is only 6-core/6-thread, so that's probably why it was stuttering so much.

    • @Wobbothe3rd
      @Wobbothe3rd Рік тому

      This channel is as biased towards inferior AMD CPUs as it is towards inferior AMD GPUs.

    • @CHA0SBLEEDS
      @CHA0SBLEEDS Рік тому

      @@Wobbothe3rd He said he wasn't sure why the 9600K was stuttering, and I was just letting him know it's possibly because the CPU he chose doesn't meet the recommended spec.

  • @KidGacy
    @KidGacy Рік тому +5

    They didn't make the game with upscaling in mind; they're hiding their incompetence behind upscaling. This is ridiculous. The visuals are nice but nothing to write home about, and games should be made for current hardware; it's like they never looked at the Steam hardware survey.

  • @ch0ketv_
    @ch0ketv_ Рік тому

    The game actually runs pretty well on my system without upscaling. I have a 7800X3D and RTX 4080 and it runs around 80fps at native 1440p without DLSS.

  • @anchannel100
    @anchannel100 Рік тому +16

    PC Gaming is turning into a rat race

  • @nannnanaa
    @nannnanaa Рік тому +1

    Everyone is taking this the wrong way. The game doesn't have anti-aliasing in native mode, and there's even some weird artifacting at native res. It looks great with FSR.

  • @mchonkler7225
    @mchonkler7225 Рік тому +5

    In other words, the devs were lazy and tried to cut costs by having hardware brute force it rather than spending time optimizing.

    • @bigturkey1
      @bigturkey1 Рік тому

      What game have you played in the last year that maxed out a 4K 120Hz panel without using DLSS?

  • @markus.schiefer
    @markus.schiefer Рік тому +1

    I wouldn't have a problem if DLSS and FSR were used to make games look even better out of the box, but the issue with this game specifically is that it doesn't look special at all.
    In this case I'd say the developer indeed skipped large portions of performance optimization.
    So yes, they were either lazy or simply out of time.
    Which it was will be decided by the extent of their post-launch optimizations.