99% GPU utilization in the menu, around 60% average in game... wonderful.
This should have been exactly the opposite, great optimization!!!!!
It’s magical.😂
I won’t miss 😂
Well, at least the menu is optimized, right? That's a good thing, right?
To be fair, Fallen Order wasn't optimized either. While I'm not shocked, it sucks that this game was released to us in this state.
‘I won’t miss’ 😂
The game should be running at 120fps+ consistently maxed out 4k on a 4090 without raytracing.
ahhhh, with DLSS things would def look different, I agree, but devs are not very smart
kinda looks like the devs skipped the whole optimization part entirely
It's not even the GPU's fault. It's hilarious how all these new cutting-edge CPUs are coming out, but these fucking PC ports don't use any of the CPU power at all, bottlenecking my 4090. 🙄
Great content. Really like these port reports. Keep it up
It's 2023 and these are the ports we're still getting on PC.
Just found out there is also a ray tracing bug. Sometimes you must go to the in-game menu and turn it off then on again, as it will turn itself off at times despite saying it's on. Another fix is to turn off GPU overclocks; once out of the tutorial area, GPU usage will go up to the 90s. A bug also exists with achievements not unlocking. To fix it, log out of the EA launcher then log back in; sometimes the achievements won't unlock when you earn them, but only after logging back in and then going to a meditation point.
it was working for him though, the GPU util went up and down when toggled.
Apparently you have to restart the game after switching the ray tracing option to off in order for it to actually be off... even the options menu is not optimized...
That's not unheard of; Hogwarts Legacy did the same thing.
That has been fixed, and the GPU utilization won't change with the bug.
Gameplay looks so jittery and unsmooth without motion blur.
It’s a dog's dinner on console too: screen tearing, sub-1080p in places, and massive frame drops.
I'm just a victim of bad timing. Bought a PC with an i9 and RTX 3080 in Oct 2022. Since then I've bought Gotham Knights, Callisto Protocol, Dead Space Remake, Hogwarts, Forspoken, and now this lol. Thought PC was the master race. 😂
Releasing games in states like this only promotes piracy. PC gamers don't deserve to deal with such terrible ports at launch. Another very good recent example was The Last of Us; its port was handled by the same studio that did the Arkham Knight port back in the day, and that was a disaster too!
The poles that you swing off are being flogged to death in games.
Tried it for a few minutes this morning. 7800X3D / 7900 XTX rig. Settings: 4K, all epic settings, no ray tracing.
I was getting 55-62 fps at 80% GPU utilization with FSR off. Yes, FSR does work, because after enabling it my GPU utilization went to 100% and I was getting over 100 fps. 🤷🏾♂️
Did anyone else notice that the recommended specs listed the RX 6700 XT / Nvidia GeForce RTX 2070 as the GPUs and the recommended VRAM as 16GB? I don't know who they should fire at EA, but neither of those cards comes with that amount of VRAM; the 6700 XT only has 12GB while the 2070 has 8GB. Even the 3080 only has 12GB of VRAM, meaning to get a GPU with this amount of VRAM you would need at least a 4080 or RX 7900 XT.
Nvidia is forcing developers to use more VRAM so people start buying the latest GPUs. It's all business.
The first one ran maxed out, locked at 1080p60 native, with an 8700K (10600K) and an RX 480 8GB OC'd to 1342MHz (RX 580 8GB), and locked at 4K120 with a 6900 XT TOXIC.
Apparently disabling SMT or Hyper-Threading dramatically improves the performance.
Is this something you experienced, or do you happen to have a source for this info?
@@balgoth18 hi, I don't have the game, but there is a thread on the Steam discussion page for this game
This game will run between 60-80 fps on a 13700K. Since the IPC on that CPU is over 50% better than your 9900K, I'd expect you'd get close to 100% utilization on your 4090. Not to say that this makes the game okay or reasonable performance-wise, but just noting that raw power can kick this game above 60 fps on its day 1 patch
Even a 13900K cannot max out a 4090 in this game, let alone a 13700K.
Ya, my 3090 is only at 50-60% utilization at 60fps. It has room to average over 100fps if they did any optimization AT ALL.
Another trash port made with the Unreal Trash, I'm so glad Capcom ditched it, all their games run like 😘
Same shit as with Callisto Protocol. I used DSR and ran the game at 8K trying to give the CPU some breathing room. The Callisto Protocol benchmark average dropped by 2fps. Embarrassing.
I mean, instead of picking it up on PS5, you can just change your settings to medium/high and play at 1440p, and there you go... just like playing it on PS5. You'll get 60 fps on that build.
FSR adds some CPU load; you will not get more frames if the CPU is already overwhelmed.
If your game is running at
‘I won’t miss’ 😅😂🤣😅😂🤣Roflmao!
Why is ma man still on 9900k?
So many bad PC ports recently 😢
"Useless FSR" and "dogshit raytracing" funniest shit i have heard.
Works for me, gives me 100+fps enabled
Similar set up here; I'm getting the same results.
Should have expected as much, being an EA published game.
Rushing the developer to release the game, destroying their reputation in the process.
I get limited CPU utilization on an ultrawide, but my GPU stays around 80 to 99% utilization with everything maxed, and the fps averages low 70s to 110 on the first and second worlds. GPU paired with a 13700K. So far I'm enjoying the game with minimal to no stutters, unless a new area has to load. I do have to say, though, the VRAM usage is out of this world, climbing up to 22GB max on the first world.
Ryzen 5600, 7900 XT, 32GB RAM, and the fps is all over the place, not even using half my GPU for most of the game. When the game decides to use my whole GPU I get 120 fps at 1440p. That's rare tho.
Yeah, EA claims it's because people are pairing powerful GPUs with lower-end CPUs and using Win 10. Well, I am on Win 11 with a 4090 / 12900K / 64GB DDR5, and I can confirm the exact same frame drops, maxed out at 4K.
Enable Rebar in Nvidia Profile Inspector
That's how I fixed hogwarts legacy. Same issue here, huh?
Same here on my 13900K with a 4090: 55fps with ray tracing on. Crap!!
Aye, I'm thinking the same thing, J. If it runs solid on PS5 I will just buy it there; no way I'm playing on PC with that performance, yikes. Thanks!
The console versions aren't a consistent 60fps and you're also looking at a much lower resolution and some pared back graphics settings.
Will the community fix the game before EA will do it? They did it with Hogwarts, they did it with pathtraced Cyberpunk. Will they do it again?
Oh no
Are you pairing a 4090 with an old ass 9900K?
They didn't do any optimization; they just copied assets from the console engine to the PC engine and called it a day.
The second planet gets smooth, and GPU utilization is above 90, even 99. Tested with an R5 7600 + 4070 Ti at 1440p.
stop lying
@@HardWhereHero What lie? Do you have the exact same system?
@@HardWhereHero ua-cam.com/video/njYael9go9A/v-deo.html&ab_channel=HM here's the proof.
With FSR off it looked worse to me. With it on quality it looked better. I'm on ultrawide 1440p. You could see a difference in his hair. I thought it was weird, as it seems backwards.
Joker, what 4K monitor do you game on? The model? Just got myself a 4090 FE and was thinking about a monitor upgrade.
Samsung G7
I am afraid that the next generations of GPUs will be hard bottlenecked by next-gen CPUs, because GPU progression is much faster than CPU progression. Unless Intel and AMD somehow figure out how to double current CPU performance, the RTX 5090 and a potential RX 8900 XTX will be hard bottlenecked even at 4K.
GPU workloads are much easier to parallelize than CPU workloads. The trend of GPU progression being faster than CPU will continue.
We need two more cpu generations to bring my 4090 to 100%
@@FunKaYxxD1sCO Or, you know, maybe these AAA titles should just utilize at least 8 cores as a standard...
Seems to run well on mine: 7700X, RTX 4080, 4K FSR quality, RT on, epic settings, 60 fps.
Theoretically if we ever get to 100% GPU usage and near 80 fps, do you think the graphics justify that kind of performance? The game doesn’t look that great to me so far. 🤔
I would be fine with that.
Weirdly with FSR enabled I do get 100% GPU usage and over 100fps
Been waiting for your review. I like the no-BS approach. I'm running a 4080 with a 5800X3D at 4K, so gonna wait a while till it's patched.
Yeah, me as well. The game really is just fucking badly optimized.
I think it might be your PC. I reach 99-100% GPU utilization with RT on and no FSR on a 4080, and I have a much higher frame rate.
It does get better when you get off the first planet but that does not excuse the piss poor port.
I mean, it's a UE4 game, so idk what you guys are expecting.
Plenty of good UE4 games.... It just requires a development studio to actually put effort into the PC port.
But alas. They take our money and give us the bare minimum.
dead island 2 just came out using UE4 and that runs amazingly.
The previous game, Jedi Fallen Order was also on UE4 and it ran great. Maybe not on release date but I did a second playthrough 2 months ago and I was getting 120+ fps with no stuttering on a puny 3070. Well, let's give Respawn a month or two and it should get better.
@@MrMrTravman This game runs badly on PS5 as well; it's not laziness, it's the engine.
I got it from AMD Rewards; should I return it?
Calling a game playable with a 4090. What is even going on here :D
This is running on UE4. The jury is still out on whether or not that's a dogshit engine. Lol
Can someone explain why people don’t like vsync? Wtf is the downside other than performance? I can’t stand screen tearing how tf y’all keep that turned off
V-sync adds more input lag, that's the only issue. On PC we've had VRR (G-sync/Freesync) on monitors for around 8 years already, so as long as you set that up properly you don't get tearing, the framerate can go up and down as it likes and it will still be smooth. Large frametime spikes, shader compilation stutter and traversal stutter will still look bad, but not variable framerates. For games that have very high fps you can just cap the fps in Rivatuner just below your screen's max refresh rate. The only time I use v-sync anymore is if I'm playing on a bit older TV that doesn't have VRR support.
@@Mogura87 how would I set up g sync?
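The Rivatuner cap trick described above boils down to simple arithmetic; here is a quick sketch (the 3-fps margin is a common rule of thumb, not an official figure, and the function name is just illustrative):

```python
def vrr_fps_cap(refresh_hz: float, margin_fps: float = 3.0) -> float:
    """Cap a few fps below the panel's max refresh so the framerate
    never hits the VRR ceiling, where v-sync (and its input lag)
    would kick back in."""
    return refresh_hz - margin_fps

# A 144 Hz panel would be capped at 141 fps,
# keeping frametimes inside the VRR window.
cap = vrr_fps_cap(144)
frametime_ms = 1000 / cap  # ~7.09 ms per frame
```

Any limiter that caps fps slightly below max refresh (RivaTuner, a driver-level limit, or an in-game cap) achieves the same thing.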
This morning it was running slow with stutter issues, but tonight it was running smooth at ultra settings, ray tracing on, 4K. Wonder if a patch already came out to fix it? CPU is a 5800X3D and an RTX 4090 for the GPU.
Nope no patch you just lucked out.
Oh, how sad, for a change they didn't cater to Nvidia this time... These games play great on AMD and Radeon...
This is EA after all.
What did you expect!!!
Looking at the overlay, only one CPU thread is loaded properly and the rest are usually around 20 to 30%.
Once that issue is fixed, I imagine the fps issue will be alleviated.
But if something as fundamental as that is broken in the game... imagine what else is in need of a patch?
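The overlay reading described above (one thread pegged, the rest at 20-30%) is a classic serial bottleneck. A rough Amdahl's-law sketch shows why throwing more cores at it barely helps; the serial fraction here is a made-up illustrative number, not a measurement from the game:

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Maximum speedup when `serial_fraction` of each frame's CPU work
    runs on a single thread and the rest parallelizes perfectly."""
    parallel = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel / cores)

# If ~70% of each frame's CPU time were single-threaded,
# going from 8 to 16 cores would barely move the needle.
s8 = amdahl_speedup(0.7, 8)    # ~1.36x
s16 = amdahl_speedup(0.7, 16)  # ~1.39x
```

That's why a patch that spreads the main-thread work across cores would do far more than any hardware upgrade.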
Lost all respect for AMD after they paid to keep DLSS and frame gen out, both of which would make this game playable for many more people.
Having a faster CPU would, in the case of this terrible port, help you massively with GPU utilization, upping the FPS to near 70-80. But this shouldn't need to be the case when it's your job to make a good port, not to make people buy better hardware. The game doesn't look near good enough to warrant this kind of performance anyway. Brute-forcing a game should not be the new standard.
Honestly, kinda disappointed by the visuals in game; they're not worth 15GB of VRAM, that's for sure.
It runs better on AMD, probably due to less driver overhead. I saw gameplay in a CPU-bound area on a 7900 XTX and it was over 60 fps the whole time.
P.S. Developers should stop using UE4, especially in open-world games, because it runs badly. The only exceptions of good games on UE4 are Gears 4/5 (and they're not open world, though they have big maps). And I don't think frame generation would help this game, because under 60fps (I think fake frames should only be used above 70fps) you'll notice the fake frames and feel, while playing, that it's not smooth.
Game is terrific but my god did they screw the port up
It's not just the pc port, the consoles also run like shit 😭
Nice-looking and well-done game, but I laughed out loud when I saw the double jump. 🤣
Does EA EVER release a new PC game that DOESN'T HAVE ISSUES FOR MONTHS????
The last of us is a bad port.
EA: hold my beer.
Naughty Dog: We good, boyz, we haven't made the worst port of 2023! 😁
Jokes aside, though, I finished TLoU two days after release and my personal experience was flawless: no crashes, no stutters (Ryzen 7 5800X3D + RTX 4070 Ti). So... it seems like this Jedi game is far, far worse...
7900 XTX and 5800X3D. The game does not look that good; seriously, it doesn't. 1440p with ray tracing: 60-90 fps at 60-90 percent GPU utilization. 22.5GB VRAM allocation. Unacceptable. Screw EA, and they'd better not get the Star Wars exclusivity deal extended.
EA made a shit console to PC port? Say it ain't so, Sir!
Under 60FPS on a 4090! Does anyone test this shit before release? OK, I was being rhetorical!
Have a great day, Sir! And God Bless! o7
It's been somewhat enjoyable on PC, for me that is, and we're getting a patch today or tomorrow. Overall I'm enjoying the gameplay, and as long as my framerates are 50fps or higher, I'm happy. No way I'm playing this on PS5.
You already got the patch lol. It came out right before release. This is the performance for at least a week or two when they patch it again.
Joker, I noticed you mentioned that using FSR didn't help despite having GPU headroom. Actually, DLSS and FSR only help when games are GPU limited; otherwise you won't see an improvement in framerates.
Pls guys, don't buy this, and return it if you bought it on Steam!
It seems they developed the game on AMD hardware. Nvidia users are beta testers.
I could have used this video a day or two ago lol. This is the EXACT same BS I dealt with when I stupidly pre-bought Forspoken. Exactly the same. I wasn't / wouldn't play that until they fixed it. So pathetic and sooooo inexcusable!
Thumbs up for the awesome title
The intro killed me
This "The Crap of Us"... PC ports are really well made. 🤮
The console versions seem better, as always. So they developed more for consoles and then tried to convert it to a PC version.
I'm not sure what to think about this game and the performance results people are getting. I'm running it on a 5800X, RTX 4070 Ti, 32GB DDR4-3600 CL14, off a Gen 3 NVMe SSD under Windows 10, and it's running really well. I'm seeing 100% GPU utilization and all CPU cores/threads heavily used. It's running really fast with ray tracing enabled, at 1440p with a mix of high/epic settings. Max VRAM usage so far is 7.6GB (HWMonitor). There have been a couple of small hitches, but overall performance is quite good for me.
Honestly, it's fucking shit of AMD to lock the game out of DLSS (since, as you mentioned, Nvidia doesn't ONLY allow DLSS in its sponsored games), because DLSS frame gen would circumvent the CPU bottleneck. This game just screams for it lol.
Welp, PS5 drops to 35 at like 1200p, far from 4K, so it's bad on all platforms, but on PC it still might be the worst port ever.
Too farken late mate!!
I doubt DLSS would even help. I watched another review, and FSR isn't working on AMD GPUs either.
This is the second rant... I mean review video that I have watched for this dumpster fire. I will not watch any more videos on this topic because my ears may begin to bleed. lol
FSR is working on my 7900 XTX. Went from 60fps to over 100fps by enabling FSR.
@@cjpp78ytube It does not work and that was not a question. ua-cam.com/video/yujdHuskhSk/v-deo.html&ab_channel=OhNoItsAlexx
Jedi Fallen Survivor ? Haha.
Why is 40-50 frames so bad for this type of game? It's not ideal, I get it, but it's still very playable for the time being. What did you guys do in the PS1 and PS2 era when everything was 30fps lol.
It's less the frame rate and more how the game runs in general on high-end systems, while lower-end ones run poorly unless they have a good CPU.
Not to mention the game crashes and stutters regardless, unless you're using the recommended requirements.
When you play a game at 120+ fps and you go back to under 60 fps, it feels like a slideshow. Also, I can't stand the blurriness/ghosting of an LCD display at low frame rates.
@@dramaticmudderer5208 Ah, I see. I guess I'm lucky. I have 6 hours in and haven't crashed once. I got 40-50 frames on the first planet; on the 2nd planet I'm getting around 70-80.
@Robert R. I'm in the same boat as you, my man. I'm still able to find enjoyment with the lower frames. It's playable, just not ideal at the moment. I'm gonna play it through, then once they patch it up, replay it, and we will have some nice sweet mods out by then :). Fallen Order was awesome with the Skywalker reshade mod, the added dismemberment mod, the scorch marks mod, and that custom saber editor mod.
@@lamNoJedi Yeah, the first planet is a mess but the second is playable. Hey, how long did it take you to beat the first planet? I'm curious how much 20 fps on Deck I have to endure before it performs similar to Fallen Order.
EA, play this port you should.
I've played indie 3D platformers on UE4 that had RT and DLSS with no stutter issues and great fps. AAA gaming on PC gets worse by the minute. Don't even get me started on AMD-sponsored weak-ass games with shite RT and dog-crap FSR only. Oh, and in case no one knows: if you alter the graphics presets, it changes the resolution to be upscaled, I presume by TAAU. You need to set the preset to epic and then adjust each setting down on its own to avoid this F-up, my god 🤣🤣🤣🤦🤦🤦🤡🤡🤡.
Most AAA games are soulless, empty, lack features, and are wank anyways.
That's EA for you at its finest, when they promote TF out of this overrated franchise instead of making Titanfall 3 possible. 🖕🖕 EA. At least Armored Core 6 will fill that Titanfall 3 void for now. Let's hope its PC port will be optimized like Sekiro's and unlike ER's.
Adding DLSS to this version of Unreal Engine is literally just ticking a checkbox during the build. There is only one explanation for these AMD-sponsored titles, and it's not a conspiracy theory.
1:00 Ray tracing typically adds significant load on the CPU, in addition to the GPU. So if you turn it on, you'll be worse off if you're already bottlenecked by the CPU without it.
12:42 Likewise, AI upscaling such as FSR or DLSS requires a small performance hit on the CPU, so in a CPU-bottlenecked situation you'll typically see lower performance with them on. They are mainly only useful in a GPU-bottlenecked situation.
These results make sense since you are bottlenecked by the CPU. They need to better utilize multithreading and improve CPU utilization, and possibly further optimize CPU performance, since there is no clear reason why such a game would be that heavy on the CPU even with the lack of multithreading.
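The CPU-bound vs GPU-bound behavior this comment describes can be modeled as a toy max-of-frametimes calculation (all frame times below are made up for illustration; nothing is measured from the game). Upscaling shrinks the GPU's per-frame cost but slightly raises the CPU's, so it only helps when the GPU is the slower stage:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Delivered framerate is limited by whichever stage is slower."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case: upscaling cuts GPU time, fps goes up.
gpu_bound_before = fps(cpu_ms=8.0, gpu_ms=16.0)   # 62.5 fps
gpu_bound_after  = fps(cpu_ms=8.5, gpu_ms=10.0)   # 100.0 fps

# CPU-bound case: GPU time drops, but the CPU wall (plus the
# upscaler's small CPU overhead) keeps fps flat or slightly lower.
cpu_bound_before = fps(cpu_ms=16.0, gpu_ms=12.0)  # 62.5 fps
cpu_bound_after  = fps(cpu_ms=16.5, gpu_ms=7.5)   # ~60.6 fps
```

This is the same reason the reviewer saw no gain (or a small loss) from FSR while GPU utilization sat well below 100%.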
That's right, everybody who lacks a sixth sense is blaming everything on the RTX 4090 for one buggy, ridiculous EA game, whereas the RTX 4090 is a monster that could easily run more than 200 fps at ultra 4K settings with a 13900K or KS combo in most of the AAA titles out there. People lose their ability to think about whose fault it is and where the bottleneck is.
The thing is, a 13900K or 7800X3D barely clears 60fps with RT on; this game is just ridiculously unoptimized, probably the most unoptimized game since Arkham Knight (as much as I love that game).
@@h1tzzYT yep agree