Yay but at least remake had 120 fps cutscenes. The stuttering in the slums was annoying but at least it was not the whole game. Here Always alternate between 30 and higher is not great in my opinion
@@MrMrAragoare the cutscenes even realtime in this demo? The GPU load became very low whenever the cutscenes played. So I assumed it's just a pre rendered video.
@@r4vd0the way they made the cutscenes is similar to the ps1 ones where they layer different types of cg videos with realtime graphics to make them seamless while allowing large scale heavy stuff without resorting to huge high quality videos. Thats the reason they cap it to 30 because the videos being layered are sampled at 30
@@r4vd0 there are a lot of cases where none in real time rendering has been patched to 60 fps through mods. It's even moreso possible now with AI which makes artificial frames in-between. Though mods like those might take a bit longer to make.
oh, let me guess, nvidia prepares their 5090 stock to sell. Why not force AMD out with money to developers. Nothing stops devs to spend an hour installing FSR 3 in a game, except... money.
@@Jakiyyyyy i hate sponsorships. Because then you get situations where you get worse performance than the other brand of GPUs because you went with either team Red or Team Green. Watched the Star wars Outlaws "gameplay tailer" and that all they talked about was all the nvidia features, there was probably actually 15 seconds of gameplay footage shown entirely...
there are lots of things to work out but the difference in visual quality over the ps5 is monumental. moving and panning the camera in 4k at over 100fps is just amazing. black myth and ff16 demo on the same day feels like a holiday where you get a raise and free ice cream!
Make sure you're on the latest Nvidia driver or this demo is completely broken performance wise (It's a stuttering mess). Overall though had a great time with this demo although it's a tad blurry even using DLAA at 1440p. 30fps cutscenes and no UW support needs to be fixed though for launch...
@@GameoftheYear-fx4mqWhy are you even here on this channel running damage control for a demo and dismissing any issues people are having? Get a life, bro.
DLAA iat Nvidias anti aliasing tech, using DLSS tech for AA. But its not upscaling, it doesnt improve performance, it increases visual fidelity. If you want increased perf, you need DLSS upscaling.
Cutscenes in this port are atrocius. Its killing this game for me if they don't fix it. I mean... This game runs at 60-80fps (max setings, 1440p on 4070ti) and suddenly cutscene at 30fps but it does not feel like 30. It stutters and for me looks like 20fps or less but the counter stays at stable 30...
@@Fronioll9973 It only looks bad if the framerate is really low during gameplay but in the cutscenes, it works really well believe it or not. Also a tip when using Lossless Scaling with this game, is try to get the framerate from the game as high as possible preferably to 70fps and the app will try to smooth out those extreme low dips with it's frame generation. I actually had better luck and lower latency using it this way over the in game frame generation. Don't use Vsync in the game but use it in the app.
6700xt 1080p medium??? Man wtf is going on with these games, this card was running games at 1440p high not so long ago, games that look better than this!
That was only on last gen PS4 ports. This game gets down to 720p on the PS5. It's an actual next gen game. It getting to 60 fps on a 1070 means it scales well.
Japanese developers and games lacking optimization: name a more iconic duo.. PS: for those complaining about Wukong, come here and see this mess...60 fps on a 4090 with no RT...
I'm not sure what they do to the lighting in this game but it looks fantastic at times. When there are overhead clouds you can see the scenes dim and brighten. I have a feeling that lighting is making a pretty big hit on performance. But also, yeah, this game is not well optimized either.
It's good they are releasing demos now in games. It can help too smooth out issues before the full game launch and I appreciate the chance to see games before release from the buyers perspective.
So the game is now prettier while crisper, smooth quick cam pans, responsive and not blurry. A joy to play with dlss dlaa and FG. Would wish 60fps 4k for the ingame sequences.. its far better now than on the PS5. 4090/13900ks
I'm playing the demo right now and even on the 7900XT it's not that bad of an experience using 4k, but I'm probably gonna go back to 1440p cause of my monitor and it did drop to 40 in some sequences. Still, aside from the cutscenes, not a badly optimized game at all. Then again, the graphics look like PS4, don't think it even uses RT.
Crazy because I just finished the demo on a 3060ti and it ran ultra 1440p with a locked 60, If I had a higher refresh rate monitor, would bet I get at least a stable 100 fps. Had no stuttering either
@@GameoftheYear-fx4mq i have a gpu vastly superior to yours and i can tell you, you in fact did not run the game at ultra 1440 with a locked 60. You used the upscalers and or FG. At native 1080p max settings the game was dropping to 55 at points on my 7700xt, which is a far more powerful card more in line with a 3080. This is exactly why i hate upscalers and FG because now it muddies the waters when you want to talk about performance and you will have people spouting off about how great the game runs when in reality they were using all the crutches available and not rendering at native res and native frame rate.
It's not surprising to see modern AAA Games running low fps even on the most expensive gaming Geforce RTX4090 Card, blurry screen + lower fps than 60fps
just tried it. these 30 fps in cutscenes looks absolutely horrible. main problem that you get huge stuttering when game lowers fps. and they forgot to lock mouse inside window when you play. not tested it in fullscreen, only borderless window. and game looks all fuzzy in 1080p, even without upscalers. upd. i played for 2 hours, and when long cutscenes started, my screens turn off because of 10 min default windows timer. so they even forgot to ignore windows power settings... and when i touched mouse, window with "operation successful" appeared and game closed. i was in fullscreen mode, so game was always active.
The game was incredibly blurry for me until I disabled dynamic resolution and forced DLSS on quality. Still not crisp at 1080p but better, and I can keep almost consistent 60 fps without frame generation.
ran beautifully for me. rtx 2080ti (9900k and 32gb of ram). around 55-60fps at 1440p max with DLSS quality. G-sync works down to 30fps, so the cutscenes looked great also (but is admittedly a NVIDIA only feature)
I'm so happy. I bought a PS5 to play this game but it was a mess with the necessary 60fps mode on PS5. Sold my PS5. Now I can play the game perfectly with my RTX 3080 instead. Fantastic game.
Blurry as hell, 30FPS in-engine cutscenes like Stranger of Paradise, mediocre graphics for the performance it demands (looks like Witcher 3.. from 2015), random stuttering/framepacing issues, no ultrawide support.. Yeeeah, this engine seems worse than Luminous.
Cbu3 doesn't like to shift to new tech very quickly.... I'm sure it'd be a much faster developed game if they made it from scratch in unreal and asked the kh team for help.
My PC is pretty beefy. Ryzen 9 with a 4090. The cutscenes were locked to 30fps which I didn't mind but the rest of the game felt like I was still playing 4K 30fps. This game really needs more optimization I should be able to hit 4K 120fps easily.
runs fine for me with rtx 2080ti and 9900k (32gb of ram). so it's odd that he is having issues. Im getting about 60fps all maxed out, at 1440p with dlss quality
@@legendp2011 I have the same specs and it runs like crap unless I use the Lossless Scaling app with it. The Lossless Scaling app will also make the 30fps cutscenes run at 60fps.
I cant even get passed the first loading screen without artifacts. Running a 3090ti with 12900k and 32gbs of ram. Just updated the driver to latest. All other games running fine. Thinking bout rolling back the driver update. SE is stupid. Update : well i fixed it. Apparently rBAR and ultra low latency was turned on in nvidia inspector and that is what was causing the issue. Turned them off and the game ran swell. Couldnt get 60 fps without dlss quality but good nuff. Game looks crisp in dlaa though. Wish it ran at 60 with dlaa.
I'm on the latest drivers with a 12700k and a 4090, and the game is a stuttery mess that is almost painful in cutscenes. It is so bad that I am having trouble even understanding what is happening on the screen at times. I'm getting an average of 70fps with DLAA native 4k, but even outside of cutscenes the stutters are constant. I'll wait for more patches to hopefully iron out the performance.
After making comments earlier I had to try. I get ~53 fps on my 7900 GRE (Manual OC), native 4K, Ultra with no upscaling or frame gen BS. Not bad considering the 4090 got low 60's. However I can confirm this is far more demanding on a GPU than it should be for the level of quality and quantity of what is going on, not by a small margin, but huge.
The game is absolutely beautiful but it is quite heavy. Those cutscenes especially are very heavy with those very detailed character models. The lighting is also very good
I think it would be nice if they add a toggle to switch between 30 and 60 fps on cinematic cutscenes at launch. Would be a nice option to have for those who have the high-end GPU to run it.
elden ring still does not support ultrawide (even with the new DLC that just realesed). there are mods that easily add ultrawide, but it trips the anticheat
@@legendp2011 well luckily there is (I think) no online with the single player component so hopefully they'll be a mod out super quick to add ULTRAWIDE to enjoy this game in it's full glory... Literally my only gripe and a very puzzling omission smh
I had a few minor stutters on my newly built rig but overall the experience was breathtaking! What a game and story so far. Definitely sold on this and knew Yoshi-P and CB3 would deliver. I'm happy I kept myself spoiler-free but true, did see some minor stutters and pauses on rare occasions. The 30 FPS cutscenes I didn't mind as they are integrated so well it really didn't take me out of it so to speak but will still be eager to see how they continue to improve the game. Thanks for an early video, too!
at least they are not 24fps cutscenes! it hasn't bothered me too much for some reason, something about this game the 30fps cutscenes work better than older games that had this problem
Yeah I mean again, went into the demo on drivers 537.xx and it froze up after the beginning so just had to ensure I grabbed one of the latest drivers and then the game worked just fine. No crashes or other issues at all, just like one or two stutters here or there, nothing really serious.
@@TheRealLink like a lot of games, play it with windowed mode, not full screen mode, and it won't crash (and maybe avoid MSI Afterburner if you get crashes and see if that makes a difference)
@@TheRealLink to be fair, demo did warn me that I need to update driver, because when I ran it pre-update, game was a glitchy mess of colors. So driver update is a must.
Hey look at he new digs! Is that the new basement studio? I'm really happy for your success Dan. I remember when you first came up on the scene, and damn you exploded. I bet your students are so proud of you, just like your fellow teachers. Cheers bro
I played it at 30 fps on ps5. I pretty much just assumed PC was looking at 60 fps stable at best. PC can go higher, mine is new also, but the game was very demanding.
DEMO SIZE 16GB STEAM SAY 16.6GB SO CLOSE TO 17GB A HUGE THANKS I COULDNT FIND THIS INFO ANYWHERE.. BUT WATCHING YOUR VIDEOS I ALWAYS GET THE NR1 INFO ABOUT THE STUFF THE VIDEO IS ABOUT!! GREEEEAAAT
I got about 55-60fps on a 2080ti max out, 1440p dlss quality (9900k cpu and 32gb of ram). I didn't have much stuttering (so perhaps there was a driver update)
I don't enjoy the 30fps cutscenes (apparently, they were on ps5, too), but by god, I've been waiting so long for this game and am just happy it's finally here.
This game is 10x better looking than Forspoken. This is a quality game, sure it has some issues, but FF games are innovative and new every one of them and I can confirm this game is special despite any flaws. A must play.
As a self-proclaimed Final Fantasy fan who has never heard of Tactics or Ivalice or played 14, I will be boycotting the game for being dark fantasy and trying to copy Game of Thrones.
@@GetterRay ? It's a world at war. It isn't Game of Thrones. Fantasy games where everyone is at war have happened before. I don't understand. The world for FF14 is the same for FF12, the single player game.
Game works surprisingly very well on my RX6600 paired with a Ryzen 5600. I got 150fps in the opening, around 100 in the first city and 120 in the swap area. Everything on high but shadows using FSR on quality + FG. No stutters is what surprised me the most.
When you launched the demo, did a message about updating drivers appear? I checked mine and I have the most recent driver but that message still appears
Daniel, great video as always. Did you see Nvidia launched specific drivers for FF16 demo and main game earlier today (Tuesday aug 20). Wonder if that cures some of the frame issues.
Well, I'm glad to see that it runs on your 6700 XT -- entire thing is so busted on my 7900 XTX with the AFMF2 preview drivers that I couldn't get through the initial brightness configuration.
i have a radeon 6800 and i just accepted that the game wouldn't run at native 1440p high above 60fps all that much. In fact, it's dropping below that so i had to resort to using FSR 3 to get it to a level of performance i'm satisfied with. It's quite a shame i need an upscaler, i prefer to avoid them whenever possible. What's worse is FSR 3 is still nowhere near as good as DLSS and it kind of frustrates me a bit. I hope AMD manages to catch up to nvidia soon because with linux drivers from nvidia finally being functionnal, i may no longer have a reason to stick to radeon in the future if everything requires an upscaler
@@Stars-Mine it's a GAME cutscene, not film, framerate doesn't matter as much in film, but when you get 30 fps with essentially a shutter angle of 0° it looks like crap
@@dahahaka I know its not film. but people saying 30fps for a custscene looks like crap even with it not doing all the tricks a camera does, is not being serious.
@@Stars-Mine it's very jarring... And it's just completely unnecessary. If it were some crazy engineering challenge sure, but the game is already running at higher framerates
Performance is all over the place no matter the settings. What I did was turned Vsync off and tried to get the framerate to over 70fps when it was possible but since performance was so bad, the framerate would dip between the 40s fps and 70s fps. Then I used the "Lossless Scaling" app from Steam and set the frame gen in the app to X2 as well as set Vsync to on in the "Lossless Scaling" app not the game. This really helped try to keep those framerate dips to a minimum as well as keep the fps close to 60. Another good benefit is the "Lossless Scaling" app ran the cutscenes fantastically to 60fps!!! It took awhile to dial in the settings but after a few hours of back and forth with settings I finally found something that worked well enough.
Thanks for the headsup Daniel! Stuttering fullstop really kills my desire to play some games (looking at you Jedi Fallen Order) so hopefully they can fix that, 30 frames for cutscenes is also an odd pick in the modern age (I guess they really wanted to show off the engine in cutscenes). P.S I spot a nice Les Paul or Gibson in the background?
You should do your testing on the summons battle segments of the game, I reckon those would be the most demanding ones I think there is one at the end of the demo
@@thelazyworkersandwich4169 It didn't come day one because of exclusivity deals, something the developers wouldn't have much say in to begin with. This specific team have also been working on PC for years with XIV. As their first big AAA PC port, it's already miles ahead of most other releases, and is a better game than most modern slop.
@danielowentech I think they updated the demo. Cutscenes now try to run at 60fps, but the framepacing is still very uneven. I only noticed one little stutter during gameplay so far and it runs at 200fps+ with my 7800x3d and 7900gre... Maybe they tried to fix the framepacing in cutscenes with a 30fps cap, but it didn't work like intended... 1% low was at 29 fps (during cutscene). Maybe you should give it another try?
I believe DF said the combat in performance mode (60fps) went down to 720p on the ps5. Also on my playthrough in performance mode traversal you can feel the frames drop
I noticed my friend runs this demo horribly on his 5600X + 4080 Super with low GPU usage/stutters but my 5800X3D + base 4080 keeps the GPU pegged at 100% with little to no stuttering (3D cache heavy title?)
I'm about to be done with pc gaming... digital foundry just released a video and black myth wukong has stuttering issues also and I know it won't be fix.
@@Mr_CopaceticGamer not really, its the same engine, so same HW req, but Treyarch (this years cod developer) has been using it for 4 year now on this years cod. so its gonna be pretty different in terms of art style and gameplay. Last gen.. yeah its sad, but its still like 50% of all players, you don't want to loose that
Update: Official 24.8.1 AMD Drivers & FF16 Demo v1.01 +Updated Buttery Smooth Settings with more Visual Fidelity! AMD Users: The Official 24.8.1 drivers fixed my FF16 issues. I had to factory reset to not crash though & not change any AMD Software Gamming tab settings. I also used AMD Cleanup Utility before driver install. Much smoother FPS now! No crashes or long stutters even at Upscaling = Native AA & Dynamic Rez = OFF. My Rig is an AM4 Ryzen 7 5800 (Non-X ...Basically 5700X) on a RX7800XT, 16 GB DDR4@3600Mhz & B550 Mobo. On Windows 10 Pro 22H2 ...If anyone was curious. Only changed from default settings: At Max settings: 4K (Borderless) I'm getting fairly steady ~30FPS with Dynamic Rez = OFF, Upscaling = Native AA. If FSR3 Frame Gen = ON.~40-60FPS Smooth Settings: 4K (Borderless) with Dynamic Rez = OFF, Upscaling = Quality, FSR3 Frame Gen ON & Frame Rate = 30.00 FPS ~60FPS with very occasional drops. >>>Buttery Smooth 60 FPS Settings + Lossless Scaling>SOLID 60 FPS
Isn‘t it crazy that my GTX 1070 was able to push around 90-110 FPS at native 1440p max settings in Doom 2016. Here in FF16 with an internal resolution of 720p a quarter of 1440p total pixels, it‘s not even able to stay at 60 FPS…
@@3dcomradeyou mean like every single generation that wants to push graphics forward by being able to do stuff in real time that wasn’t possible before
@@crestofhonor2349 yes, but the demand of current gen maximum is unreachable. In an age where a new generation doesnt give much extra raw performance Crysis is out in am age of constant 40%+ extra performance per new generation. We dont live in such times anymore
I had a very good experience on the 7900xtx. With Fsr enabled (no FG) I'd get over 100fps all the time, but some stuff wasn't working properly. Hair on some characters was a bit washed up. Disabled it and was still getting between 80 to 120fps depending on the scenario. Was playing on 1440p, but I'll likely play it on 4k on my TV when it releases and just lock the fps to 60 for a smooth experience.
FSR3 frame-gen gets me a stable 120fps with zero noticeable artifacting (aside from ghosting which already was on the ps5 version), runs amazing, loving it so far
Just tested this game right now so maybe there has been an update since I just downloaded the demo but I had absolutely no stutter while playing (4090 13900k). Hopefully at launch the cutscenes are gonna be 60 fps, or maybe a modder will fix that, that's my only complaint from this demo.
7800x3d + 4070S here and I play on oled 4k 120hz tv I found out that if you set dynamic res to 60fps + 60fps lock on Riva tuner you get an almost flat frametime line As soon as you turn off the in game framerate limiter it stutter like crazy...
Probably defaulting to fxaa with motion blur. Fxaa blurs the gamr to make jaggies less obvious. Either that or its bad post processing which can hopefully be tuned off.
Nothing is crazy since the introduction of 4K. Those textures and cutscenes weight hell of a lot. Idk, maybe they ought to give people to choose if they want to install 4K or not. But until then - better get used to it.
I will never understand gamers simultaneously complaining about graphics and game sizes, especially since SSDs are so damn cheap. You want to play ps1 Final Fantasies on PC? That doesn't take much space indeed. But if you want latest gen graphics, why are you complaining about size? You do understand you can get a 4TB SSD for 200 bucks these days? What's the issue?
@@jonirischx8925 Most people have around 1TB maybe 2TB, 4TB is enthusiast or content creator-grade storage space. I don't really mind the big install size either btw
@@stygiann Yeah, so get a 2TB SSD for a 100 bucks. That's what I have, because I really didn't see any sense going lower than 2TB, and I'm by no means wealthy. Also, it's not like you need to have all your games installed simultaneously. I make music, and have humongous sample libraries on one partition, multimedia, including games, on another partition, 1TB each. I still have a lot of leftover space even with the 2TB. Like... What are you guys hoarding on your hard drives to not be able to accommodate a 170GB game? People don't remember how expensive strorage used to be. 100 bucks for 2TB is INSANE. We are so spoiled in terms of storage, that it seems an even weirder point to complain about. Graphics cards and CPUs are expensive. Storage is really not.
Who would want to play a game in 1080 rez? That is 90's gaming not 2024!!!!! That is the rez people in the 90's played at on PC and here consoles are 4k and PC gamers 1080? That is insane and that PC industry keeps pushing 1080 on everything is insane and reviewers and every mouthpiece out there!!!! 1440 and 4k is all that should be said in today's world in gaming. In a few years only 4k no more anything less than other than scaling from rezes.
It looks like they didn't change or improve anything and it's basically just the ps5 port with the same damn issues it had last year! They also promised to fix the console version but never did that so I honestly never had any high hopes for the pc port anyways.
That's almost every port now. Barely any difference except from uncapped resolution/fps, twice as demanding to run. Absolutely dogshit PC treatment as usual.
The performance seems a bit lower on those cards than I'd expect for 1080p. Maybe it still needs optimizing. But good on them for releasing a demo. Now to see how it runs on a 7900 XTX in Linux... lol
its pretty rough jumps all over the place at 1080p high, native, no upscaling, no FG im seeing fps all over the place in the 60 to 130 range on a 7700xt. I smell a day 1 patch or driver, or both tbh.
I'm going to try this out. I've been avoiding spoilers. I just hope the combat is more interesting than FF15. And should be a given that they fixed the magic
@@Goujiki The magic system was overturned in FF15 but it wasn’t bad. You could do damage to your teammates so you had to be careful when you cast. I actually did a no magic run on my first play-through.
If you played devil may cry you already played FF16. DMC combat is also way more advanced in comparison. FF16’s story is also sleep-inducing, lots of cutscenes that don’t need to be present
Dawntrail has had broken DLSS for two months now, they didn't even put camera jitter into the cutscenes so they're all aliased unless you "cheat" by fixing the broken game via an addon. Yoshi P is not a good studio overseer if you want your game to have technology from the past 10 years (or a functioning game world, but thats another story)
I hope the menu separates max framerate and target framerate into different options. I'd like the game to scale to target 60/72 but maybe hit 100 when the scene gives me the headroom. Currently, if you wish to set the "frame rate" option to a high refresh(100hz+) with dynamic resolution on, the game will automatially set the internal resolution a lower preset of whatever upscaler you're using in a vain attempt to push pass CPU limitations.
@@richardsmith9615 it these GPUs were from like 2018-2019 or 2020 maybe 60+ fps native in medium would be ok and understandable but this is just too much
I agree but. On ps5 I played through the game for the first time and thought the graphics were kinda mid. But I did a 2nd playthrough and thought the visuals were amazing. I definitely think the game looks great. Only in the 30fps mode anyway.
I’m pretty sure that denuvo is more than likely going to be the reason those stutters are happening. You really can’t do a proper benchmark until it’s removed because it’s been known to cause problems 😣 I’m pretty sure it’s the same case with wukong.
@@imAgentR FF16 uses a heavily modified FF14 engine. They tried going with UE but UE could not support the texture detail they wanted. This is mentioned in their technical presentation.
The optimization of this game is an absolute joke. How can a game with such outdated graphics be so demanding? It's laughable that an amateur-made port of Spider-Man 2 for PC runs far better than so-called "professionally optimized" games. It really makes you question the competence of these developers-are they even trying, or are they just relying on hardware brute force to cover up their lackluster work?
Both. Games are getting more demanding because the devs are pushing graphics in certain games not all games and the devs seem like they are not spending enough time fixing the games before they release but fixing them after release or at least try to for months or maybe 1 year.
@@psobbtutorials6792 You must be sniffing way too much RAM if you think this game is top-tier. Are you out of your mind? Horizon Forbidden West annihilates this game graphically. Even a double-A title like Banishers: Ghosts of New Eden has better visuals. It’s ridiculous to even compare them-one is a visual masterpiece, while the other struggles with mediocre textures and uninspired environments. Get real.
@@psobbtutorials6792 tell me you haven’t played cyberpunk without telling me you haven’t played cyberpunk 💀 💀💀 (Which btw runs SIGNIFICANTLY better while looking insane)
@@tvcultural68 If running this mostly linear game gave your pc trouble I doubt you low wage bob ever experienced cyber punk at max settings. It doesn't look better at all, the attention to detail is really bland in gay punk
The fact that pre-rendered cutscenes are locked at 30 FPS is atrocious in this day and age. The rest is okay but I can't be jumping from 120 down to 30, back to 120, back to 30.
the characters' models are amazing, very natural and detailed, but the environments and the scenes are like from graphics 10 years ago, it's a bit disappointing given the requirement of hardware is not that low
The environments are extremely detailed. If you actually looked at games 10 years ago on period accurate hardware you would see they are less detailed. But the ones most fondly remembered hold up well because of strong art direction. FF16 can have all the detail it wants but it will still look bad when the art direction is super boring. The art direction is good in some areas and the game looks amazing, but then sometimes like in the hub of the full game or in the opening castle area it is really boring.
There are very detailed environments. Some areas do look bad but there are alot of moments where the game looks amazing. When you get back to your castle later on in the game the geometry detailed is very good.
there has to be something not configured right with your PC as I get around 95FPS (in certain areas) without any FG on a 3080 at 1440p with DLSS Quality mode.
Native 1440p is really the "optimal" resolution for the RTX 4090 for the latest AAA Games / UE5, etc. No need for increased input lag bought about via "frame gen" nor visual artifacts brought about via "AI Upscaling". Just a pristine NATIVE image with no negative impact to input lag as the PC Gods intended. I have both a 55" 4K 120Hz VRR OLED and 27" 1440p 240Hz IPS G SYNC Display. Never have I felt Native 1440p was somehow "inferior" to gaming at 4K on my RTX 4090 - R7 7800X3D - 2 X 16GB 6200 MT/S CL 26 36 32 44 - 2200 FCLK - 55.6ns latency Aida64 system. IMO game developers should replace the "Ultra" requirements with NATIVE 1440p / 90 FPS instead of 4K / 60 FPS. Never would I prefer 4K / 60 FPS over 1440p / 90 FPS. Many simply don't want to come to terms with this reality as they prematurely jumped the "4K hype train" and now think 1440p is somehow beneath them but they are only hurting themselves with this absurd mentality. Of course I'm referring to most of us running your standard 27-34'" desktop monitor. If you're running something like a TV at 48+ inches well it is what it is..I'm not suggesting 4K is horrible or anything (it's a viable option for an RTX 4090 build) just that for many 1440p would be the better option if you're running a standard desktop monitor up to ~ 34".
At least outside of UE5 which is designed to run with supersampling, 4090 walks 4K native and 4K with DLAA (which is an extremely stable image) even without frame gen in a huge number of games. The raw grunt of the 4090 is basically wasted at 1440p in a non raytraced/360FPS+ setting. Frame gen input lag can be a real hit and miss, in a game like GOT you have to turn it off because it throws off reactions. But honestly here where you're already at a nice base and it's a visually driven game, it's not particularly intrusive.
It's interesting how game requirements progressed within the same gpu gen. When I bought my 7900XT it was considered an entry-level 4k GPU, it could max out most games and still get 60fps, basically a 3090 TI clone. Now, before even the 5xxx series is out, even the 4090 turned into a 1440p GPU, albeit a high refresh rate one while mine went to overkill 1440p to just 1440p/60. I also spent a grand on it on release day, which is a real bummer. For that price, I could get a 4080/7900XTX nowadays so I do have some buyer's remorse. Alas, my own fault for being dumb. I feel like the system requirements increase cannot really go any further at this point, simply because you cannot upgrade PS5 or Xbox hardware and it's still years away from a new console gen.
back in 90s and early 2000s, demos what led me to buy most of the games i brought. its nice in the last few years more devs are releasing demos, early access is good and all but i still rather a playable functional demo over WIP early access.
@@RyviusRan So true man....and I'm not defending my PC because it's pretty old (RTX 2060) but even with my GPU I can't even hit a stable 30 fps on medium graphics at 1080p with DLSS enabled......yeah terribly optimized.
@@RyviusRan PS5 often runs games at custom settings not available on PC, often settings that are below the lowest PC settings. In videos they look somewhat similar, but not really in this case even with youtube compression so if the consoles used PC settings, they would also barely hit 1080p/30. Moreover, keep in mind that games since the PS4 have been optimized for AMD hardware. Generally speaking, AMD performs better at low settings vs Ultra compared to nVidia at most games for some reason.
Running 7950X3D, 7900XTX, 64GB Ram, (3) 1440p 240HZ monitors. On max settings in quality mode, my game caps at 180 for some reason and sits there while having it at 240fps. In the castle area, I sit at around ~90-130fps. In the battle area with the goblins I sit around 120-150 with random fps drops to ~80-90fps. If I turn on Dynamic Resolution and FSR Frame Generation, castle sits around ~200fps, in battle it juggles between 160-280fps. Max fps I've gotten are ~390fps, lowest fps have been around ~75fps on a spike drop. So the way I have it now, I turn off Dynamic resolution, leave FSR FG on and cap my fps at 120 and it just sits at 120fps for most of the playthrough. I also have never gotten any screen tearing like I see in your video. So the game looks smooth for me, but characters seem to appear blurry/foggy in cutscenes and when zoomed into a character. STILL better than on PS5 though lol.
In this day and age it takes a lot of guts and faith in your product to release a demo like this.
It could easily make or break peoples enthusiasm.
In this case break
Beautiful game
Jokes aside, yes, but I hope modders unlock 30fps cutscenes and also ultrawide support
Worked fine for me and I was surprised
@@SamiJuntunen1 It made it for me
I appreciate the fact that they put a demo with all these graphical options because i remember the FF7 Remake pc port… its was something
Yay but at least remake had 120 fps cutscenes. The stuttering in the slums was annoying but at least it was not the whole game. Here Always alternate between 30 and higher is not great in my opinion
@@rivianbaguette9114my bet is that there will be a mod which fixes that on the first few days of the game release
@@MrMrAragoare the cutscenes even realtime in this demo? The GPU load became very low whenever the cutscenes played. So I assumed it's just a pre rendered video.
@@r4vd0the way they made the cutscenes is similar to the ps1 ones where they layer different types of cg videos with realtime graphics to make them seamless while allowing large scale heavy stuff without resorting to huge high quality videos. Thats the reason they cap it to 30 because the videos being layered are sampled at 30
@@r4vd0 there are a lot of cases where none in real time rendering has been patched to 60 fps through mods. It's even moreso possible now with AI which makes artificial frames in-between. Though mods like those might take a bit longer to make.
is that a Gibson Mr. Owen? Teacher, Rockstar, PC news, what's next astronaut doctor lawyer? lol
you add chef and you got a regular ole sims 4 character. ^^
Why are devs STILL refusing to use FSR 3.1, come on now
Probably because the upscaler was implemented early in the porting process. FSR 3.1 has only been out for like a month.
oh, let me guess, nvidia prepares their 5090 stock to sell. Why not force AMD out with money to developers. Nothing stops devs to spend an hour installing FSR 3 in a game, except... money.
@@saynotomanifestv3101 Strong sponsorships sometimes. FSR usually coming out late, in future patches. 🥴
no es fsr es el mal rendimiento que tiene en general
@@Jakiyyyyy i hate sponsorships. Because then you get situations where you get worse performance than the other brand of GPUs because you went with either team Red or Team Green.
Watched the Star wars Outlaws "gameplay tailer" and that all they talked about was all the nvidia features, there was probably actually 15 seconds of gameplay footage shown entirely...
there are lots of things to work out but the difference in visual quality over the ps5 is monumental. moving and panning the camera in 4k at over 100fps is just amazing. black myth and ff16 demo on the same day feels like a holiday where you get a raise and free ice cream!
The PS5 version has this slight blurry image to it, I wonder if the pc one does to.
not on my 3070 😭😭😭
gonna need an upgrade soon
ps5 60 fps mode is stutter trash
@@yobro6053NO
It’s PC my guy we get all the bells and whistle
@@yobro6053 There is a blur setting on PS5, you can turn that off or down if you prefer
The game tanks its performance by changing graphic settings in game for some reason. A restart fixed that.
PC issue, get that checked
@@GameoftheYear-fx4mqYou make me feel a hundred times smarter
@@kalle8960lmfao
@@GameoftheYear-fx4mq🥸🤓luckily the ps5 version doesn’t run like shit… oh wait it does
I'm VERY VERY surprised that it's only $50.
Square Enix still wants $70 for the stuttery FF7 remake port, which was a PS4 game.
They want you to buy on PS5 because there is at $15.
your pc must use a igpu from 2018 if you're stuttering on ff7rm that game will run on android with a windows emu
They want to actually get good sales this time
@xxzenonionnex7658 the FF7 Remake wasn't a great port. Not as bad as Starfield or Dragons Dogma.
Always getting my games on sale with many sales and deals.
Make sure you're on the latest Nvidia driver or this demo is completely broken performance wise (It's a stuttering mess). Overall though had a great time with this demo although it's a tad blurry even using DLAA at 1440p. 30fps cutscenes and no UW support needs to be fixed though for launch...
Does everyone have ultrawide or something? Why is that an issue, you're in the minority with this one
@@GameoftheYear-fx4mqWhy are you even here on this channel running damage control for a demo and dismissing any issues people are having? Get a life, bro.
DLAA iat Nvidias anti aliasing tech, using DLSS tech for AA. But its not upscaling, it doesnt improve performance, it increases visual fidelity. If you want increased perf, you need DLSS upscaling.
@@GameoftheYear-fx4mq You're missing out. Also, being in a minority doesn't mean problems don't matter.
@@CaptainKenway my life is ensuring people don't spread misinformation, if I wasn't paid for this, I wouldn't bother.
Cutscenes in this port are atrocius. Its killing this game for me if they don't fix it. I mean... This game runs at 60-80fps (max setings, 1440p on 4070ti) and suddenly cutscene at 30fps but it does not feel like 30. It stutters and for me looks like 20fps or less but the counter stays at stable 30...
use lossless scaling for the 30fps cutscenes. night and day difference
Fact
why not amd fg or dlss fg . lossless scaling makes the img quality very bad
@@Fronioll9973 It only looks bad if the framerate is really low during gameplay but in the cutscenes, it works really well believe it or not. Also a tip when using Lossless Scaling with this game, is try to get the framerate from the game as high as possible preferably to 70fps and the app will try to smooth out those extreme low dips with it's frame generation. I actually had better luck and lower latency using it this way over the in game frame generation. Don't use Vsync in the game but use it in the app.
It doesn't work for me, it just crashes
6700xt 1080p medium??? Man wtf is going on with these games, this card was running games at 1440p high not so long ago, games that look better than this!
That was only on last gen PS4 ports. This game gets down to 720p on the PS5. It's an actual next gen game. It getting to 60 fps on a 1070 means it scales well.
Looks outdated tho. @@jorge69696
Well this game's really fucking good so it might get a pass imo
I think this game must be running some software based ray tracing, there's no other explanation.
@@okami9039 no it doesn't it looks like a ps4 game but runs 2x worse
Japanese developers and games lacking optimization: name a more iconic duo.. PS: for those complaining about Wukong, come here and see this mess...60 fps on a 4090 with no RT...
Lord
im getting 60fps on my 2080ti at 1440p (dlss quality). it just struggles to go beyond 60fps even if I drop to 720p. so could be single core limited
@Antares-dw9iv True
I'm not sure what they do to the lighting in this game but it looks fantastic at times. When there are overhead clouds you can see the scenes dim and brighten. I have a feeling that lighting is making a pretty big hit on performance. But also, yeah, this game is not well optimized either.
I love the "Binding Arbitration and Waiver of Class Action" (.14 in EULA) applies to customers and not towards Square Enix itself - VERY COOL
Huh. Good call.
It's good they are releasing demos now in games. It can help too smooth out issues before the full game launch and I appreciate the chance to see games before release from the buyers perspective.
So the game is now prettier while crisper, smooth quick cam pans, responsive and not blurry. A joy to play with dlss dlaa and FG. Would wish 60fps 4k for the ingame sequences.. its far better now than on the PS5. 4090/13900ks
I'm playing the demo right now and even on the 7900XT it's not that bad of an experience using 4k, but I'm probably gonna go back to 1440p cause of my monitor and it did drop to 40 in some sequences. Still, aside from the cutscenes, not a badly optimized game at all. Then again, the graphics look like PS4, don't think it even uses RT.
@@gameurai5701 The game only uses software RT, kinda like Lumen as far as I know.
I wouldn't call fg a joy to play.
@@Masaim6it’s not bad on this game in my experience. Gives me a stable 60 at 1440p fsr 3 quality with 3070 and 5900x
@@Masaim6the mission markers and health bars look choppy however
Another unoptomized SquareEnix pc port, wow I'm so shocked. >_>
you expecting to play on a potato PC eh?
Crazy because I just finished the demo on a 3060ti and it ran ultra 1440p with a locked 60, If I had a higher refresh rate monitor, would bet I get at least a stable 100 fps. Had no stuttering either
@@GameoftheYear-fx4mq i have a gpu vastly superior to yours and i can tell you, you in fact did not run the game at ultra 1440 with a locked 60. You used the upscalers and or FG. At native 1080p max settings the game was dropping to 55 at points on my 7700xt, which is a far more powerful card more in line with a 3080. This is exactly why i hate upscalers and FG because now it muddies the waters when you want to talk about performance and you will have people spouting off about how great the game runs when in reality they were using all the crutches available and not rendering at native res and native frame rate.
@@thefilmdirector1 I meant high settings, I don't think ultra was even an available setting
@@thefilmdirector1 my 2080Ti does 4kDLSS balanced (1260p) solid 60 with 10% overhead, maxed. Fix your PC.
It's not surprising to see modern AAA Games running low fps even on the most expensive gaming Geforce RTX4090 Card, blurry screen + lower fps than 60fps
just tried it. these 30 fps in cutscenes looks absolutely horrible. main problem that you get huge stuttering when game lowers fps. and they forgot to lock mouse inside window when you play. not tested it in fullscreen, only borderless window. and game looks all fuzzy in 1080p, even without upscalers.
upd. i played for 2 hours, and when long cutscenes started, my screens turn off because of 10 min default windows timer. so they even forgot to ignore windows power settings... and when i touched mouse, window with "operation successful" appeared and game closed. i was in fullscreen mode, so game was always active.
the blur at native res is most likely forced TAA sadly.
@@thefilmdirector1 forced to use fsr3, xess has same blur mess...
The game was incredibly blurry for me until I disabled dynamic resolution and forced DLSS on quality. Still not crisp at 1080p but better, and I can keep almost consistent 60 fps without frame generation.
ran beautifully for me. rtx 2080ti (9900k and 32gb of ram). around 55-60fps at 1440p max with DLSS quality. G-sync works down to 30fps, so the cutscenes looked great also (but is admittedly a NVIDIA only feature)
I'm so happy. I bought a PS5 to play this game but it was a mess with the necessary 60fps mode on PS5. Sold my PS5. Now I can play the game perfectly with my RTX 3080 instead. Fantastic game.
Blurry as hell, 30FPS in-engine cutscenes like Stranger of Paradise, mediocre graphics for the performance it demands (looks like Witcher 3.. from 2015), random stuttering/framepacing issues, no ultrawide support..
Yeeeah, this engine seems worse than Luminous.
Cbu3 doesn't like to shift to new tech very quickly.... I'm sure it'd be a much faster developed game if they made it from scratch in unreal and asked the kh team for help.
@@Jay-mx5ky They tried UE4/5 and it couldn't handle their PBR assets. Hence the modified FF14 engine.
My PC is pretty beefy. Ryzen 9 with a 4090. The cutscenes were locked to 30fps which I didn't mind but the rest of the game felt like I was still playing 4K 30fps. This game really needs more optimization I should be able to hit 4K 120fps easily.
I was thinking about picking up this game on PC but I'm not so sure about the optimization right now. May wait for a sale and some patches.
It's only an issue if you have a shitty pc
@@AXL.Stuttering with the 4090 as seen in the video. Sure thing buddy
@@AXL. I guess fuck everyone who can't afford a high end PC, right?
runs fine for me with rtx 2080ti and 9900k (32gb of ram). so it's odd that he is having issues. Im getting about 60fps all maxed out, at 1440p with dlss quality
@@legendp2011 I have the same specs and it runs like crap unless I use the Lossless Scaling app with it. The Lossless Scaling app will also make the 30fps cutscenes run at 60fps.
Can't wait to see how the game holds up in the fight against Titan at 4K even with a 4090 with DLSS enabled, lol.
16 seconds after this went live. crazy refresh chance.
😅
Big brother Google n it's ai
Man of Culture, that's a beautiful Tweed amp in the background.
15:37 6700xt is now a 1080p medium 60fps gpu 😂
yes it is, it is basically the Xbox Series X GPU
don't worry, you upscale it from 1080p to 4k, so 4k upscaled 60fps is pretty good looking
@@Garrus-w2hThe PS5 and Series X are both equivalent to a RX 6700 in rasterization performance. They are equivalent to a RTX 2060 in raytracing.
Weird I just played through the whole game on ultra @1440p with the 6700xt. And I had 90+ fps the entire time.
I cant even get passed the first loading screen without artifacts. Running a 3090ti with 12900k and 32gbs of ram. Just updated the driver to latest. All other games running fine.
Thinking bout rolling back the driver update. SE is stupid.
Update : well i fixed it. Apparently rBAR and ultra low latency was turned on in nvidia inspector and that is what was causing the issue. Turned them off and the game ran swell. Couldnt get 60 fps without dlss quality but good nuff. Game looks crisp in dlaa though. Wish it ran at 60 with dlaa.
am i the only one looking at that 170 gigs and going 😵💫
Finish BG3 so you can delete it off your PC bro
God of war ragnarok is 190 gigs.....
It's half size on PS5. Until Microsoft and Nvidia can actually deliver proper HW decompression support, this problem will continue.
MY M.2 SSD!
Bold of you to assume you can ever finish BG3, you cannot@@GameoftheYear-fx4mq
I'm on the latest drivers with a 12700k and a 4090, and the game is a stuttery mess that is almost painful in cutscenes. It is so bad that I am having trouble even understanding what is happening on the screen at times. I'm getting an average of 70fps with DLAA native 4k, but even outside of cutscenes the stutters are constant. I'll wait for more patches to hopefully iron out the performance.
Try using the Lossless Scaling app with it. It helped a lot for me. It even made the cutscenes 60fps.
After making comments earlier I had to try. I get ~53 fps on my 7900 GRE (Manual OC), native 4K, Ultra with no upscaling or frame gen BS. Not bad considering the 4090 got low 60's. However I can confirm this is far more demanding on a GPU than it should be for the level of quality and quantity of what is going on, not by a small margin, but huge.
The game is absolutely beautiful but it is quite heavy. Those cutscenes especially are very heavy with those very detailed character models. The lighting is also very good
I think it would be nice if they add a toggle to switch between 30 and 60 fps on cinematic cutscenes at launch. Would be a nice option to have for those who have the high-end GPU to run it.
It runs so good on my PC I wish I could delete my memories of it and play it for the first time
No Ultrawide support is inexcusable....i can't name a single recent modern AAA pc game that does not support ultrawide...i hope they fix it
Yeah, seriously. But I'm not surprised because FF7 Remake Intergrade didn't support ultrawide either.
@@JorgeMartinez-dp3im I haven't played that one yet, but they still haven't added ULTRAWIDE support for that game!? And is that the same studio?
elden ring still does not support ultrawide (even with the new DLC that just realesed). there are mods that easily add ultrawide, but it trips the anticheat
@@legendp2011 well luckily there is (I think) no online with the single player component so hopefully they'll be a mod out super quick to add ULTRAWIDE to enjoy this game in it's full glory... Literally my only gripe and a very puzzling omission smh
These games are never designed for ultrawide. You wasted your money on a meme monitor. Cope, seethe, and rethink your life choices.
I had a few minor stutters on my newly built rig but overall the experience was breathtaking! What a game and story so far. Definitely sold on this and knew Yoshi-P and CB3 would deliver. I'm happy I kept myself spoiler-free but true, did see some minor stutters and pauses on rare occasions. The 30 FPS cutscenes I didn't mind as they are integrated so well it really didn't take me out of it so to speak but will still be eager to see how they continue to improve the game. Thanks for an early video, too!
at least they are not 24fps cutscenes! it hasn't bothered me too much for some reason, something about this game the 30fps cutscenes work better than older games that had this problem
Yeah I mean again, went into the demo on drivers 537.xx and it froze up after the beginning so just had to ensure I grabbed one of the latest drivers and then the game worked just fine. No crashes or other issues at all, just like one or two stutters here or there, nothing really serious.
@@TheRealLink like a lot of games, play it with windowed mode, not full screen mode, and it won't crash (and maybe avoid MSI Afterburner if you get crashes and see if that makes a difference)
@@TheRealLink to be fair, demo did warn me that I need to update driver, because when I ran it pre-update, game was a glitchy mess of colors. So driver update is a must.
This game seems a bit too demanding for the graphics it's offering
Looks great, especially the boss battles
HFW looks better and is far less demanding
@@saiibox3174 yeah even some textures looks better on the first descendant and that's a free2play
Game engine.
@@saiibox3174lying through your teeth, HFW recommends a 3060
Hey look at he new digs! Is that the new basement studio? I'm really happy for your success Dan. I remember when you first came up on the scene, and damn you exploded. I bet your students are so proud of you, just like your fellow teachers. Cheers bro
This is crazy, this needs way better optimisation I would push this back a little to December...
1000% 60fps at 1080p "Expected" with a 6700xt. It's like we're all being slapped in the face.
@@richardsmith9615 the ps5 has essentially a 6700xt as a gpu and it can't maintain 60 fps, what made you think your pc would?
I played it at 30 fps on ps5. I pretty much just assumed PC was looking at 60 fps stable at best. PC can go higher, mine is new also, but the game was very demanding.
@daiyousei3847 what are your specs?
нвидиа отвалила сони не малые бабки за отсутствие оптимизации...ну вы поняли, хомячки)
DEMO SIZE 16GB STEAM SAY 16.6GB SO CLOSE TO 17GB
A HUGE THANKS I COULDNT FIND THIS INFO ANYWHERE..
BUT WATCHING YOUR VIDEOS I ALWAYS GET THE NR1 INFO ABOUT THE STUFF THE VIDEO IS ABOUT!!
GREEEEAAAT
MORE TESTING!! In tthe castle area please!! :D Great Video btw.. ;)
if the pc demo also has the dungeon demo in the ps then they should also test that as it has a boss fight
I got about 55-60fps on a 2080ti max out, 1440p dlss quality (9900k cpu and 32gb of ram). I didn't have much stuttering (so perhaps there was a driver update)
I don't enjoy the 30fps cutscenes (apparently, they were on ps5, too), but by god, I've been waiting so long for this game and am just happy it's finally here.
30 FPS cutscenes is 100% a dealbreaker, I'll wait until it's patched to at least 60 FPS
This game is 10x better looking than Forspoken. This is a quality game, sure it has some issues, but FF games are innovative and new every one of them and I can confirm this game is special despite any flaws. A must play.
As a self-proclaimed Final Fantasy fan who has never heard of Tactics or Ivalice or played 14, I will be boycotting the game for being dark fantasy and trying to copy Game of Thrones.
@@GetterRay ? It's a world at war. It isn't Game of Thrones. Fantasy games where everyone is at war have happened before. I don't understand. The world for FF14 is the same for FF12, the single player game.
@@GetterRay (11, 12, 14, 15, and 16 are all in the same world??? not sure, it never really mattered to me)
Tactics 1 is the same world as 12. 14 is its own thing entirely. Although as an mmo it has a ton of crossovers with other titles @@Garrus-w2h
Game works surprisingly very well on my RX6600 paired with a Ryzen 5600. I got 150fps in the opening, around 100 in the first city and 120 in the swap area. Everything on high but shadows using FSR on quality + FG.
No stutters is what surprised me the most.
They're gearing up for Rebirth and KH4, expect the same high polish with those
1080p resolution?
@@wumpratt yep 1080p
@@matrix255 What fps do you get without FG? You know, FG fps is not the same as real fps
When you launched the demo, did a message about updating drivers appear? I checked mine and I have the most recent driver but that message still appears
Daniel, great video as always. Did you see Nvidia launched specific drivers for FF16 demo and main game earlier today (Tuesday aug 20). Wonder if that cures some of the frame issues.
Well, I'm glad to see that it runs on your 6700 XT -- entire thing is so busted on my 7900 XTX with the AFMF2 preview drivers that I couldn't get through the initial brightness configuration.
Really? Runs very well on my 7800XT
ran really well on my 2080ti. I';m guessing drivers are still behind for many cards
i have a radeon 6800 and i just accepted that the game wouldn't run at native 1440p high above 60fps all that much. In fact, it's dropping below that so i had to resort to using FSR 3 to get it to a level of performance i'm satisfied with. It's quite a shame i need an upscaler, i prefer to avoid them whenever possible. What's worse is FSR 3 is still nowhere near as good as DLSS and it kind of frustrates me a bit. I hope AMD manages to catch up to nvidia soon because with linux drivers from nvidia finally being functionnal, i may no longer have a reason to stick to radeon in the future if everything requires an upscaler
30fps lock in cutscenes in 2024 is a bad joke
its a cutscene, not gameplay.
@@Stars-Mine it's a GAME cutscene, not film, framerate doesn't matter as much in film, but when you get 30 fps with essentially a shutter angle of 0° it looks like crap
@@dahahaka I know its not film. but people saying 30fps for a custscene looks like crap even with it not doing all the tricks a camera does, is not being serious.
@@Stars-Mine it's very jarring... And it's just completely unnecessary. If it were some crazy engineering challenge sure, but the game is already running at higher framerates
You watch movies in the cinema at 24fps lol
Performance is all over the place no matter the settings. What I did was turned Vsync off and tried to get the framerate to over 70fps when it was possible but since performance was so bad, the framerate would dip between the 40s fps and 70s fps. Then I used the "Lossless Scaling" app from Steam and set the frame gen in the app to X2 as well as set Vsync to on in the "Lossless Scaling" app not the game. This really helped try to keep those framerate dips to a minimum as well as keep the fps close to 60. Another good benefit is the "Lossless Scaling" app ran the cutscenes fantastically to 60fps!!! It took awhile to dial in the settings but after a few hours of back and forth with settings I finally found something that worked well enough.
I got a rx 6650xt with ryzen 5700x so just under the recommended so will give the demo a try when I get home.
I've got 6600rx, ryzen 7 7700, and 32GB RAM. Downloading right now. Hope I can run it decently 🤞🏼
Thanks for the headsup Daniel! Stuttering fullstop really kills my desire to play some games (looking at you Jedi Fallen Order) so hopefully they can fix that, 30 frames for cutscenes is also an odd pick in the modern age (I guess they really wanted to show off the engine in cutscenes).
P.S I spot a nice Les Paul or Gibson in the background?
live now on my 3080 runs well
performance isnt bad at all cant wait to play release
You should do your testing on the summons battle segments of the game, I reckon those would be the most demanding ones
I think there is one at the end of the demo
To be fair they only had over a year to solely work on the pc version. /s
Watch modders fix this shit in a week
Throwback to ff7 remake. Square enix put the dynamic res to always on and we need a mod to turn it off 😂
It's just SE being SE. They don't much care for PC, Why do you think it didn't come out day one.
@@thelazyworkersandwich4169that’s on Sony paying them to release on PS5 only. That’s also why these games don’t come to Xbox either post FFXV
@@thelazyworkersandwich4169 It didn't come day one because of exclusivity deals, something the developers wouldn't have much say in to begin with. This specific team have also been working on PC for years with XIV. As their first big AAA PC port, it's already miles ahead of most other releases, and is a better game than most modern slop.
@danielowentech I think they updated the demo. Cutscenes now try to run at 60fps, but the framepacing is still very uneven. I only noticed one little stutter during gameplay so far and it runs at 200fps+ with my 7800x3d and 7900gre... Maybe they tried to fix the framepacing in cutscenes with a 30fps cap, but it didn't work like intended... 1% low was at 29 fps (during cutscene). Maybe you should give it another try?
I believe DF said the combat in performance mode (60fps) went down to 720p on the ps5. Also on my playthrough in performance mode traversal you can feel the frames drop
I’ll wait till it improves but I’ll be excited to play it on PC
I noticed my friend runs this demo horribly on his 5600X + 4080 Super with low GPU usage/stutters but my 5800X3D + base 4080 keeps the GPU pegged at 100% with little to no stuttering (3D cache heavy title?)
XIV adores the X3D chips so it might be the case, but honeslty low GPU usage looks like bad/old driver issue.
Bottleneck 🤔 ?
Im convinced these modern final fantasy titles profit almost exclusively off nostalgia because they are the uninspired AAA slop
Spot on, FF16 is forgettable
I'm about to be done with pc gaming... digital foundry just released a video and black myth wukong has stuttering issues also and I know it won't be fix.
I'm sure it will be fixed over time. Black myth wukong is game sciences first big AAA game and they are using unreal engine 5. Give them some time.
Probably DRM protection causing problem..again, nothing you can do unless they remove it.
@@tomthomas3499 exactly
Very interested in this game, would love to see more testing.
Black Ops 6 requirements video next? Hate to sound like a pest though
yeah pls
same as all previous cods. same engine.
Lmao why
same engine and it’s on last GEN consoles 😂 totally waste of time 🤦
@@Mr_CopaceticGamer not really, its the same engine, so same HW req, but Treyarch (this years cod developer) has been using it for 4 year now on this years cod. so its gonna be pretty different in terms of art style and gameplay. Last gen.. yeah its sad, but its still like 50% of all players, you don't want to loose that
Update:
Official 24.8.1 AMD Drivers & FF16 Demo v1.01
+Updated Buttery Smooth Settings with more Visual Fidelity!
AMD Users: The Official 24.8.1 drivers fixed my FF16 issues. I had to factory reset to not crash though & not change any AMD Software Gamming tab settings. I also used AMD Cleanup Utility before driver install.
Much smoother FPS now! No crashes or long stutters even at Upscaling = Native AA & Dynamic Rez = OFF.
My Rig is an AM4 Ryzen 7 5800 (Non-X ...Basically 5700X) on a RX7800XT, 16 GB DDR4@3600Mhz & B550 Mobo. On Windows 10 Pro 22H2
...If anyone was curious.
Only changed from default settings:
At Max settings: 4K (Borderless) I'm getting fairly steady ~30FPS with Dynamic Rez = OFF, Upscaling = Native AA. If FSR3 Frame Gen = ON.~40-60FPS
Smooth Settings: 4K (Borderless) with Dynamic Rez = OFF, Upscaling = Quality,
FSR3 Frame Gen ON & Frame Rate = 30.00 FPS ~60FPS with very occasional drops.
>>>Buttery Smooth 60 FPS Settings + Lossless Scaling>SOLID 60 FPS
Isn‘t it crazy that my GTX 1070 was able to push around 90-110 FPS at native 1440p max settings in Doom 2016. Here in FF16 with an internal resolution of 720p a quarter of 1440p total pixels, it‘s not even able to stay at 60 FPS…
Blame the shift to "doing graphical effects for real" rather than merely emulating it like older games
Doom 2016 graphic was not that demanding even in 2016.
Doom 2016 is doing significantly less from a graphical standpoint than Final Fantasy 16
@@3dcomradeyou mean like every single generation that wants to push graphics forward by being able to do stuff in real time that wasn’t possible before
@@crestofhonor2349 yes, but the demand of current gen maximum is unreachable. In an age where a new generation doesnt give much extra raw performance
Crysis is out in am age of constant 40%+ extra performance per new generation. We dont live in such times anymore
That's the most stable in-game fps limiter I've seen. The frame cadence is good imho, no issues there :)
I had a very good experience on the 7900xtx. With Fsr enabled (no FG) I'd get over 100fps all the time, but some stuff wasn't working properly. Hair on some characters was a bit washed up. Disabled it and was still getting between 80 to 120fps depending on the scenario. Was playing on 1440p, but I'll likely play it on 4k on my TV when it releases and just lock the fps to 60 for a smooth experience.
What is your CPU?
Same experience here on a 7900 XTX, very smooth gameplay.
FSR3 frame-gen gets me a stable 120fps with zero noticeable artifacting (aside from ghosting which already was on the ps5 version), runs amazing, loving it so far
The 30 FPS cut scenes are absolutely egregious, most certainly that would be corrected in the final release.? 🧐
Just tested this game right now so maybe there has been an update since I just downloaded the demo but I had absolutely no stutter while playing (4090 13900k). Hopefully at launch the cutscenes are gonna be 60 fps, or maybe a modder will fix that, that's my only complaint from this demo.
Oh no... I was SO looking forward to this game and I waited more than a year for PC release... this stuttering is f*
I’ve played it on PS5. It’s not that great, they turned FF into Devil May Cry and it didn’t end well. Story is very underwhelming as well
7800x3d + 4070S here and I play on oled 4k 120hz tv
I found out that if you set dynamic res to 60fps + 60fps lock on Riva tuner you get an almost flat frametime line
As soon as you turn off the in game framerate limiter it stutter like crazy...
10 year old graphics for the 1/4 of the performance, total garbage
Yeah i'm done with modern GAYming
You've hit the nail on the head
13600k and 3070 and it runs awfully :( I don't need 120fps but when it's going below 20 in cutscenes it kills all immersion
Game is really blurry. Something is off.
yup
New to modern gaming?, thats how games been looking even at native sometimes.
Choose the upscaling quality when you choose FSR or DLSS. It's auto by default which makes it look like dog water. Native preferable.
Probably defaulting to fxaa with motion blur. Fxaa blurs the gamr to make jaggies less obvious. Either that or its bad post processing which can hopefully be tuned off.
disable fsr, dlss and other upscaling features
Would have bought this if it came out on PC initially.
Already watched the story on youtube which is the main point of any FF game.
ouch, yeah, shouldn't have watched it...
170GB is absolutely crazy
Nothing is crazy since the introduction of 4K. Those textures and cutscenes weight hell of a lot. Idk, maybe they ought to give people to choose if they want to install 4K or not. But until then - better get used to it.
I will never understand gamers simultaneously complaining about graphics and game sizes, especially since SSDs are so damn cheap. You want to play ps1 Final Fantasies on PC? That doesn't take much space indeed. But if you want latest gen graphics, why are you complaining about size?
You do understand you can get a 4TB SSD for 200 bucks these days? What's the issue?
@@jonirischx8925 Most people have around 1TB maybe 2TB, 4TB is enthusiast or content creator-grade storage space. I don't really mind the big install size either btw
@@jonirischx8925 Plus, $200 could get you started on a CPU, RAM or GPU upgrade, I don't think it's an adequate price for the average builder either.
@@stygiann Yeah, so get a 2TB SSD for a 100 bucks. That's what I have, because I really didn't see any sense going lower than 2TB, and I'm by no means wealthy. Also, it's not like you need to have all your games installed simultaneously.
I make music, and have humongous sample libraries on one partition, multimedia, including games, on another partition, 1TB each. I still have a lot of leftover space even with the 2TB. Like... What are you guys hoarding on your hard drives to not be able to accommodate a 170GB game?
People don't remember how expensive strorage used to be. 100 bucks for 2TB is INSANE. We are so spoiled in terms of storage, that it seems an even weirder point to complain about. Graphics cards and CPUs are expensive. Storage is really not.
Who would want to play a game in 1080 rez? That is 90's gaming not 2024!!!!!
That is the rez people in the 90's played at on PC and here consoles are 4k and PC gamers 1080?
That is insane and that PC industry keeps pushing 1080 on everything is insane and reviewers and every mouthpiece out there!!!!
1440 and 4k is all that should be said in today's world in gaming. In a few years only 4k no more anything less than other than scaling from rezes.
looks like another dogwater PC port courtesy of Squeenix
It’s a meh game anyway so not a big deal
I was surprised how well my 3080 10GB OC ran it at 4K ultra. And I didn't mind the 30fps cutscenes.
It looks like they didn't change or improve anything and it's basically just the ps5 port with the same damn issues it had last year!
They also promised to fix the console version but never did that so I honestly never had any high hopes for the pc port anyways.
That's almost every port now. Barely any difference except from uncapped resolution/fps, twice as demanding to run. Absolutely dogshit PC treatment as usual.
The performance seems a bit lower on those cards than I'd expect for 1080p. Maybe it still needs optimizing. But good on them for releasing a demo. Now to see how it runs on a 7900 XTX in Linux... lol
its pretty rough jumps all over the place at 1080p high, native, no upscaling, no FG im seeing fps all over the place in the 60 to 130 range on a 7700xt. I smell a day 1 patch or driver, or both tbh.
This game look like witcher 3 with texture and color filter mod applied and performce alot worse
I'm going to try this out. I've been avoiding spoilers. I just hope the combat is more interesting than FF15. And should be a given that they fixed the magic
This is action based combat (similar to DMC or maybe a bit closer to TW3). also…“fixed the magic“ wtf are you talking about ?
@@massterwushu9699 Magic system in FF15 was bad. Magic like Fire, Ice, Thunder, etc.
It's basically mild-DMC
@@Goujiki The magic system was overturned in FF15 but it wasn’t bad. You could do damage to your teammates so you had to be careful when you cast.
I actually did a no magic run on my first play-through.
If you played devil may cry you already played FF16. DMC combat is also way more advanced in comparison. FF16’s story is also sleep-inducing, lots of cutscenes that don’t need to be present
Dawntrail has had broken DLSS for two months now, they didn't even put camera jitter into the cutscenes so they're all aliased unless you "cheat" by fixing the broken game via an addon.
Yoshi P is not a good studio overseer if you want your game to have technology from the past 10 years (or a functioning game world, but thats another story)
I hope the menu separates max framerate and target framerate into different options. I'd like the game to scale to target 60/72 but maybe hit 100 when the scene gives me the headroom. Currently, if you wish to set the "frame rate" option to a high refresh(100hz+) with dynamic resolution on, the game will automatially set the internal resolution a lower preset of whatever upscaler you're using in a vain attempt to push pass CPU limitations.
this is unplayable in budget GPUs like 4060 or 7600 XT then
Embarrassing if they think that people would be content running 720p at 30fps in 2024 even if on entry level gpu's
It is playable at lower settings just not the higher settings.
@@richardsmith9615 it these GPUs were from like 2018-2019 or 2020 maybe 60+ fps native in medium would be ok and understandable but this is just too much
@@DelgaDude you need to play at medium + fsr quality to get 50ish to 60 fps
The Demo is a good test. I hope they will fix the issues when it's out! Loved it on Ps5
Ff16 is a weird mix of some really high quality visuals, combined with some really drab art direction.
i agree the art direction is terrible. i'm skipping this game. not my cup of tea anyways
i'd rather play tales of arise than this crap
*The future of gaming graphics*
@@polarvortex6601 Tales of Arise is utter dogshit though lol Vesperia forever
Nah man it looks amazing
I agree but. On ps5 I played through the game for the first time and thought the graphics were kinda mid. But I did a 2nd playthrough and thought the visuals were amazing. I definitely think the game looks great. Only in the 30fps mode anyway.
3060 ti, 1440p dlss balanced, high settings, 40-60 fps, eikon fight 30s, dynamic ress kinda helps
I’m pretty sure that denuvo is more than likely going to be the reason those stutters are happening. You really can’t do a proper benchmark until it’s removed because it’s been known to cause problems 😣 I’m pretty sure it’s the same case with wukong.
Interesting; we’ll only find out once it’s cracked. Until then, I see a UE title, I blame working with UE to develop a game.
Its not UE@@imAgentR
@@imAgentR FF16 uses a heavily modified FF14 engine. They tried going with UE but UE could not support the texture detail they wanted. This is mentioned in their technical presentation.
@@mimimimeow thanks for letting me know. I’m looking into that now for when I get done with the last presentation about remake part 3
@@imAgentRthis is a heavily modified version of Crystal Tools not unreal
i rly wish this and FF7 Rebirth would have FSR 3.1. Especially Rebirth....
The optimization of this game is an absolute joke. How can a game with such outdated graphics be so demanding? It's laughable that an amateur-made port of Spider-Man 2 for PC runs far better than so-called "professionally optimized" games. It really makes you question the competence of these developers-are they even trying, or are they just relying on hardware brute force to cover up their lackluster work?
This game is top of the line when it comes to graphics, what sort of crack do you snort?
Both. Games are getting more demanding because the devs are pushing graphics in certain games not all games and the devs seem like they are not spending enough time fixing the games before they release but fixing them after release or at least try to for months or maybe 1 year.
@@psobbtutorials6792 You must be sniffing way too much RAM if you think this game is top-tier. Are you out of your mind? Horizon Forbidden West annihilates this game graphically. Even a double-A title like Banishers: Ghosts of New Eden has better visuals. It’s ridiculous to even compare them-one is a visual masterpiece, while the other struggles with mediocre textures and uninspired environments. Get real.
@@psobbtutorials6792 tell me you haven’t played cyberpunk without telling me you haven’t played cyberpunk 💀 💀💀
(Which btw runs SIGNIFICANTLY better while looking insane)
@@tvcultural68 If running this mostly linear game gave your pc trouble I doubt you low wage bob ever experienced cyber punk at max settings. It doesn't look better at all, the attention to detail is really bland in gay punk
The fact that pre-rendered cutscenes are locked at 30 FPS is atrocious in this day and age. The rest is okay but I can't be jumping from 120 down to 30, back to 120, back to 30.
the characters' models are amazing, very natural and detailed, but the environments and the scenes are like from graphics 10 years ago, it's a bit disappointing given the requirement of hardware is not that low
The environments are extremely detailed. If you actually looked at games 10 years ago on period accurate hardware you would see they are less detailed. But the ones most fondly remembered hold up well because of strong art direction.
FF16 can have all the detail it wants but it will still look bad when the art direction is super boring. The art direction is good in some areas and the game looks amazing, but then sometimes like in the hub of the full game or in the opening castle area it is really boring.
There are very detailed environments. Some areas do look bad but there are alot of moments where the game looks amazing. When you get back to your castle later on in the game the geometry detailed is very good.
I have seen absolutely nothing that justifies such high hardware requirements.
It's not optimized that well but the lighting and materials quality is pretty high in this game.
Nothing visually warrants it from a quality or quantity standpoint. It doesn't do anything special and should be performing twice as much
3090 here, 1080p, 15 fps, uninstalled.
No way are you being serious ?
You need a 4080
You might be CPU bound. What CPU do you have?
if a 3090 is getting 15 fps at 1080p then you have system problems up the wazoo
there has to be something not configured right with your PC as I get around 95FPS (in certain areas) without any FG on a 3080 at 1440p with DLSS Quality mode.
Native 1440p is really the "optimal" resolution for the RTX 4090 for the latest AAA Games / UE5, etc.
No need for increased input lag bought about via "frame gen" nor visual artifacts brought about via "AI Upscaling". Just a pristine NATIVE image with no negative impact to input lag as the PC Gods intended.
I have both a 55" 4K 120Hz VRR OLED and 27" 1440p 240Hz IPS G SYNC Display. Never have I felt Native 1440p was somehow "inferior" to gaming at 4K on my RTX 4090 - R7 7800X3D - 2 X 16GB 6200 MT/S CL 26 36 32 44 - 2200 FCLK - 55.6ns latency Aida64 system.
IMO game developers should replace the "Ultra" requirements with NATIVE 1440p / 90 FPS instead of 4K / 60 FPS. Never would I prefer 4K / 60 FPS over 1440p / 90 FPS.
Many simply don't want to come to terms with this reality as they prematurely jumped the "4K hype train" and now think 1440p is somehow beneath them but they are only hurting themselves with this absurd mentality.
Of course I'm referring to most of us running your standard 27-34'" desktop monitor. If you're running something like a TV at 48+ inches well it is what it is..I'm not suggesting 4K is horrible or anything (it's a viable option for an RTX 4090 build) just that for many 1440p would be the better option if you're running a standard desktop monitor up to ~ 34".
At least outside of UE5 which is designed to run with supersampling, 4090 walks 4K native and 4K with DLAA (which is an extremely stable image) even without frame gen in a huge number of games. The raw grunt of the 4090 is basically wasted at 1440p in a non raytraced/360FPS+ setting.
Frame gen input lag can be a real hit and miss, in a game like GOT you have to turn it off because it throws off reactions. But honestly here where you're already at a nice base and it's a visually driven game, it's not particularly intrusive.
For unreal engine 5 games yes but most other games the 4090 can run native 4k.
It's interesting how game requirements progressed within the same gpu gen.
When I bought my 7900XT it was considered an entry-level 4k GPU, it could max out most games and still get 60fps, basically a 3090 TI clone. Now, before even the 5xxx series is out, even the 4090 turned into a 1440p GPU, albeit a high refresh rate one while mine went to overkill 1440p to just 1440p/60. I also spent a grand on it on release day, which is a real bummer. For that price, I could get a 4080/7900XTX nowadays so I do have some buyer's remorse. Alas, my own fault for being dumb.
I feel like the system requirements increase cannot really go any further at this point, simply because you cannot upgrade PS5 or Xbox hardware and it's still years away from a new console gen.
back in 90s and early 2000s, demos what led me to buy most of the games i brought. its nice in the last few years more devs are releasing demos, early access is good and all but i still rather a playable functional demo over WIP early access.
When 80% of the people won't even be able to play at a smooth framerate despite having good 1080p gpu's... What the hell are they thinking
It was a PS5 game targeting 30fps first and foremost. That was their main audience.
@@teehundeart PS5 is barely 30-40% better than a GTX 1070 yet that GPU can barely run at 720P 30fps. Obviously the game is unoptimized on PC.
@@RyviusRan So true man....and I'm not defending my PC because it's pretty old (RTX 2060) but even with my GPU I can't even hit a stable 30 fps on medium graphics at 1080p with DLSS enabled......yeah terribly optimized.
@@RyviusRan PS5 often runs games at custom settings not available on PC, often settings that are below the lowest PC settings. In videos they look somewhat similar, but not really in this case even with youtube compression so if the consoles used PC settings, they would also barely hit 1080p/30. Moreover, keep in mind that games since the PS4 have been optimized for AMD hardware. Generally speaking, AMD performs better at low settings vs Ultra compared to nVidia at most games for some reason.
upgrade your GPU
Running 7950X3D, 7900XTX, 64GB Ram, (3) 1440p 240HZ monitors. On max settings in quality mode, my game caps at 180 for some reason and sits there while having it at 240fps. In the castle area, I sit at around ~90-130fps. In the battle area with the goblins I sit around 120-150 with random fps drops to ~80-90fps. If I turn on Dynamic Resolution and FSR Frame Generation, castle sits around ~200fps, in battle it juggles between 160-280fps. Max fps I've gotten are ~390fps, lowest fps have been around ~75fps on a spike drop. So the way I have it now, I turn off Dynamic resolution, leave FSR FG on and cap my fps at 120 and it just sits at 120fps for most of the playthrough. I also have never gotten any screen tearing like I see in your video. So the game looks smooth for me, but characters seem to appear blurry/foggy in cutscenes and when zoomed into a character. STILL better than on PS5 though lol.
7600x and 6800xt gives me 1440p native 60+ max settings. demo is lots of fun.. might buy this 1