I think what the video is missing is an explanation of the "Fixed" and "Biased" upscaling modes. Usually you just change the resolution for DLSS and FSR with the quality modes (Quality, Balanced, Performance, etc.), but the "Biased" scaling mode is an additional scaling modifier that depends on the output resolution and is also inconsistent. I'd need a chart to fully explain it. But in summary, the internal resolution in "Biased" mode is a lot higher than the usual DLSS and FSR screen percentages. For example, Biased-Performance is higher res than Fixed-Quality, at least at 1080p and 1440p. At 2160p there is no difference between Fixed and Biased. Also, no combination seems to drop below a 720p internal resolution, so at 1080p, DLSS Ultra Performance has the same 720p internal resolution as DLSS Quality mode. This probably leads to many users thinking the game performs worse than it does, because "DLSS Performance" doesn't give nearly as much extra fps in biased mode as it usually does, and users don't know it's actually running at a much higher resolution.
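A rough sketch of the 720p-floor behavior described above. The quality-mode scale factors are the standard DLSS/FSR ones, but the floor logic is only my reading of the comment, not confirmed game behavior, and the full "Biased" mode is clearly more involved than this simple clamp:

```python
# Sketch of the "Biased" scaling floor described above.
# SCALE holds the standard DLSS/FSR quality-mode factors; the 720p floor
# is an assumption based on the comment, not confirmed game logic.

SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

MIN_HEIGHT = 720  # no combination seems to drop below 720p internally


def internal_height(output_height: int, mode: str, biased: bool = True) -> int:
    """Internal render height for a given output height and quality mode."""
    h = round(output_height * SCALE[mode])
    if biased:
        # Biased mode appears to clamp the internal resolution upward.
        h = max(h, MIN_HEIGHT)
    return h


# At 1080p, Quality and Ultra Performance both land on 720p internally:
print(internal_height(1080, "quality"))            # 720
print(internal_height(1080, "ultra_performance"))  # 720 (clamped up from 360)
# At 2160p the clamp never triggers, so Fixed and Biased match:
print(internal_height(2160, "performance", biased=False))  # 1080
```

This illustrates why "Performance" gives so little extra fps at 1080p in biased mode: it may be rendering at the same internal resolution as "Quality".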
On a PC the user is tasked with optimizing a game for their unique hardware setup. If you aren't willing to do that yourself and just want to use presets, then go buy a console and be done with it.
@@longjohn526 Very true. I optimize all the PC gaming videos on my own channel according to my own setup. It's good to have a starting point for optimizing rather than wasting time though.
Very true, but I gotta wonder who the target audience is. People with a 10-series card or an AMD card that doesn't support RT? I'd surmise that even the highest-end non-RT card doesn't have enough power to run this game with software RT at a constant 30 fps or higher.
@@Lock2002ful A 1660 Ti can run this at 1080p30 with optimized settings, so a GTX 1080 or RX 5700 should manage 1080p60. It looks like they're just running custom code to perform the ray tracing on the exact same scene as the HWRT path, the same way Crytek did.
I'm playing it on PS5 now on a QD-OLED TV (A95L) and it looks phenomenal. Played on a 4090 top spec PC prior but it's only connected to a Mini LED monitor so didn't look as good. Display in this case made the bigger difference: PS5 on top end QD-OLED TV > PC max settings on Mini LED monitor
Picked up Avatar: Frontiers of Pandora on PS5 and man, the graphics are absolutely stunning! It's the first game since Horizon Forbidden West that has blown me away; the world of Pandora is so lush, and the natural environment and lighting are top notch. Flying around on the Ikran is a treat. Far Cry Primal is one of my all-time favorite games, and Avatar: Frontiers of Pandora captures the vibe of that game perfectly.
Awesome video Alex! Really detailed as usual, but a quick thing I noticed: for the judder with FSR3 & VRR there is a simple fix. Just limit the fps to slightly under the max refresh of the screen and the problem is gone. I know you've got a lot of work, and this video was surely a lot of work as well since you went through so much stuff, but you can't spend a minute telling us that you can fix DLSS by doing "x and y" and not investigate whether the FSR3 VRR problem can be solved. It's not optimal, of course, I know, but the workaround is not the worst thing either.
For anyone else missing the usual "farewell and auf Wiedersehen" at the end of the video and wondering, as I was, what we got instead: I believe it was "farewell and Frohe Weihnachten", i.e. "farewell and Merry Christmas", for any fellow German illiterates like myself.
I love this game's settings page: it shows good pictures of what each setting does and has a lot of options. I really like the audio settings too, because it plays a sound effect when adjusting the levels, so you don't have to guess whether voice or music is as loud as you want it.
The most visually mind-blowing and mind-boggling game of the decade! This shit blows everything else out of the water! Every tiny little detail to every grand cascading vista is just insanely beautiful, and the lighting is just insane and pulls it all together so well! Playing with Unobtanium settings on my RTX 4080 and having a blast, and even on Ultra settings it looks insanely good already. The RTGI is just incredible and the shadows are next level. Every little plant, blade of grass and even the tiny little moss growing on trees has shadows, it's UNREAL (actually it's SNOWDROP!) XD. The world is SUPER immersive, traversal is WILD with its freedom, fluidity and verticality, the gameplay overall feels really great (love the unexpected depth and scope of the cooking, crafting and gathering systems), and Exploration Mode is the cherry on top that makes questing super immersive, tough and satisfying when you actually find where you need to go WITHOUT a floating quest marker, just using your brain cells and decoding the map with clues! A true Na'vi simulator for all the Avatar fans out there who have been wanting to feel like a Na'vi for over a decade. Here's your chance, don't miss it! Lots of love to the MASSIVE team from a huge fan!! Hope this game does amazingly! PS: The Hunter's Guide has really turned me into a zoologist and botanist overnight, it brings a whole other level of immersion to the game!
I agree. I have both a PS5 and a 4070 PC and games on the PS5 look amazing considering the cost difference in systems. PS5 owners have nothing to feel bad about from the fanboy haters. I can't wait to see what the PS5 Pro can deliver!
The devs need to be commended for making both PC and console look absolutely phenomenal. Frankly I can't believe how good the game looks on PS5, and with VRR it runs great 99% of the time. Even the image quality is great considering what the game does. Very, very good job.
It is similar. Different architectures, but the 2070 Super basically performs like an RX 6700 across the board, and that's basically the consoles' GPU.
@@rob4222 It's similar in performance almost across the board, and 90 percent of people use Nvidia on PC; that's why. Though I think an AMD card like you mentioned would be good for them to use, for educational purposes and for the people who own them.
@@sevrent2811 There used to be a big advantage on consoles with low-level access, but PC closed the gap a lot with Vulkan and DX12. Consoles still have an advantage, but it's really small now if developers optimise the game well on both PC and consoles, which many do poorly. I do agree that an AMD card should have been thrown into the mix for the optimised settings to get an idea of how it performs. Yes, we know the 2070 performs like the 6700, but some games perform much better on AMD or Nvidia GPUs of the same class, so we don't really know how it runs on AMD or Intel hardware; we can only assume it performs like the same class of Nvidia GPU. Honestly, I think Alex could have done better here by also showing the optimised settings on AMD and Intel hardware. After all, showing only Nvidia settings encourages gamers to lean towards buying Nvidia hardware, and then Nvidia will bump the price up more. These optimised settings are most useful for gamers who don't understand what a lot of these settings mean. But to be fair, Alex has always been in the Nvidia camp, so it's no surprise.
Gotta respect ubi for this one. When they get up from their lazy asses, they can really deliver great games. Personally love ac odyssey and fc5. Will for sure try this one soon
I think Ubisoft's PC versions are generally very good. Not perfect (Why does the Dunja Engine _still_ have traversal stutter?), but very solid. Games run well, they know how to properly implement HDR, and even the PS5 controller is fully supported, including adaptive triggers via Bluetooth.
I checked Like a Dragon Gaiden with FSR3; it has a similar implementation, but I noticed that if you cap your frame rate at something less than your max refresh rate (and hit that target) but within the VRR window, you do get smooth frame times. I had judder with V-sync, so I capped the frame rate 1 fps below the max refresh and it worked perfectly with FSR3: a straight line on the frame times.
You'd expect DF to investigate and find the best settings for FSR3, but they didn't. Instead they wrote it off already, judging by what was said in this video. There's a reason DF is called Nvidia Foundry by many. I've tested FSR3 with Nukem's dlssg-to-fsr3 mod, and Cyberpunk 2077, Witcher 3, etc. are buttery smooth. I also have the frame rate limit set to 2 fps less than my monitor frequency.
@@Dark-qx8rk I mean, that's the issue that really doesn't let AMD go mainstream: you shouldn't need to do all these extra things to get a good experience (and most people don't), so I do understand DF's perspective from a user-experience standpoint.
@@Relex_92 I agree. I got a VRR monitor so that I'd never have to worry about V-sync and its judder again. FSR3 is unacceptable to me in its current form, and that's fine; it's not like I even own an AMD GPU, so it's nice that it's available at all on Nvidia cards... But I won't be using it until/unless it improves.
TBH, capping fps a few frames under the max refresh rate (or under what one's hardware can handle) is what I'd expect to be standard, basic knowledge for everyone with a VRR monitor.
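As a minimal sketch of that rule of thumb (the ~3 fps margin is a common community recommendation for keeping VRR engaged below the V-sync ceiling, not an official figure):

```python
def vrr_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    """Frame rate cap slightly below the display's max refresh, so the game
    stays inside the VRR window instead of colliding with the V-sync ceiling.
    The ~3 fps margin is a rule of thumb, not an official spec."""
    return max_refresh_hz - margin


print(vrr_fps_cap(144))  # 141
print(vrr_fps_cap(165))  # 162
```

The commenters above use margins of 1 or 2 fps; any small margin that keeps the frame rate inside the VRR window serves the same purpose.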
@@mryellow6918 Sure, but I will still say that it requires hardware manufactured by AMD to prove it. AMD and Nvidia are not necessarily the most truthful corporations.
If you are using Radeon Anti-Lag and an in-game frame limiter set to half the desired framerate, then it will be smooth. It is the same with AFMF. Playing with unstable framerates that range from 70fps - 120fps isn't going to be a good experience.
Overall I think using hardware manufactured by the company can confirm or negate these issues. Neither AMD nor Nvidia is really being honest, so using AMD hardware is always a plus to increase the accuracy of any assumptions.
Legend, for focusing the optimization on "older, mid-range PC components". GPU prices have been crazy in recent years, obviously, since the boom in computer hardware enthusiasm has made demand huge.
Thank you for the tip on using DLSS. The smearing issue is terrible otherwise, especially in the plains, and I'm currently using FSR because of that. Frame generation is not usable anyway because it causes blocking artifacts on the UI, which is way too distracting. Playing on a 4080 coupled with a 7700X at 3440x1440, I get around 90ish fps with FSR Balanced. Decent, but I'm gonna start with the optimized settings and work up from there. Also wanted to mention, for anyone who wants to play the game: it's a decent classic FPS open world, but it's GOLD for Avatar fans.
Odd... I'm getting the same to slightly better with a 5800X and a 4070 Ti at DLSS Quality (I never go below that at 1440p because it defeats the purpose of better graphics) and the same monitor resolution. However, I have depth of field and motion blur turned off, and in the CFG file I have chromatic aberration and vignetting turned off; all features I don't like, which also cost performance and cause blurrier, less defined graphics. I also hacked the EXE so ultrawide is enabled in in-engine cutscenes, and it seems to have zero effect on performance, so I really don't understand why it's not enabled when all it took was changing 4 bytes of code. In fact, squashing them down to 16:9 seems to degrade the graphics, like changing the Ultra preset down to High or Medium.
It is sad that the most exciting thing for me is the lack of shader stutter. Unreal really needs to fix that; games like this prove it is totally avoidable.
That is not Unreal Engine's biggest issue. Shader stutter can be avoided; what cannot be avoided is traversal stutter, and there is barely any UE game that doesn't have it. Jedi Survivor and Hogwarts Legacy are examples of UE4's traversal stutter issues, and Immortals of Aveum has the same problem every time something loads in. The Finals doesn't have it, but I guess that's because levels are just fully loaded. In Fortnite it's not very bad today, but I still think it could be done much better, with a better parallelized load on the CPU. This Snowdrop engine does it perfectly: CPU-wise, a high-end CPU can run this game at 200 fps throughout. An Unreal Engine 5 game with similar graphics would certainly not reach that.
@@oropher1234 Nah, shader stutter is still pervasive despite claims that it's easy enough to fix. Traversal stutter may be equally, if not more, ubiquitous in UE titles, but it has way less of an impact compared to shader compilation issues.
@@DroneCorpse Shader stutter is an issue just because devs are often careless about it; if they actually implement a precompilation step, it's not that big of a deal. Fortnite does it in a weird way today, though: it seems slower CPUs stay busy for too long and don't finish before the first match actually starts. I don't understand why Fortnite doesn't precompile in the menu before loading the game.
One thing I wish devs would leave in on PS5 is the motion blur setting. I am one of the few who actually prefers to play with motion blur, as it helps with my motion sickness. Disabling it for the performance mode is just a crime; I've had cases where my head would start to spin when playing for more than an hour. I wish I could play with it and adjust its strength. Not a fan of devs deciding what is best for the user. Not sure who decided that motion blur is bad, because I always loved it.
I don't think it's a case of deciding what is best for the user here, but more a case of deciding the best way to get performance out of the console. I suspect intense motion blur scenes caused some frame drops that made them deactivate it on console, where they try to hit 60fps or 30fps exactly. On PC, if you play at 90 frames and drop to 85 for a little while, it's not much of an issue.
@@snowpuddle9622 For real, they spend all day knowing they are spewing bullshit information. The one that gets me every time is when they go to a random frame of a door or whatever and count the "pixels" to see what resolution the game is running at 😂😂😂 that's completely illogical math 😂
I didn't realize this was Massive!! I assumed this was just another reskinned Far Cry game. Once I saw it was Massive, I went out and bought the PS5 Gold edition, and I'm loving the game so far.
The studio's earlier game, The Division, looks kind of amazing even by today's standards. Especially when you compare it to some modern "next gen" titles like Redfall, it really goes to show how some developers have actually regressed.
@@Goodbutevilgenius When you actually play the game, there are so many locations that totally live up to the trailers with the lighting, geometry detail, destruction and particles. I’d rather invite you to take a look at what the new “From the ground up” Forza game ended up looking like compared to the reveal trailer. That’s like Ubisoft marketing on steroids.
Nice video as always. I’d be very thankful if the 3080 would get some love in these comparisons and for AMD users something comparable to the 2070-2080 and 3070-3080. Testing only for two gpus and only nvidia for that is a bit narrow tbf..
I think we should keep the more limited purpose of this video in mind. The performance/return optimised settings derived in this video will hold for most systems and GPUs, although of course the need for such optimisations declines as one moves further up the performance stack. Between the videos DF has put out about this game I think we can put together a general picture of the game's performance profile at both the mid-range and high-end, but perhaps a summarizing paragraph would be a useful addition to wrap things up, to the effect that the game remains both performant and attractive on console-class hardware while also scaling up to challenge even top-tier hardware.
It provides you with a baseline - you can just increase resolution or some of the settings that provide the biggest impact. I think Alex could make note of the 3-4 settings that have the most visual impact so that 3080 or 3090 owners can bump those up.
Loved what Massive did with The Division 2 on PC. Put in over 130hrs with that one, which uses the same Snowdrop Engine, the only decent Gfx Engine that Ubisoft has, IMO.
Rockstar must also incorporate Unobtanium-style settings in their upcoming GTA VI, since it will likely stick around for another decade before the next GTA!
I think devs just need to include a default performance/optimized settings or console-equivalent preset in their PC versions at this point. This would hugely help a lot of users achieve more stable frame rates with minimal quality loss, without having to tweak every single setting in the game for hours.
I was annoyed that throughout the entire intro sequence there was no way for me to access the settings. It gave me anxiety because I knew the game could look better in that part, and rather than paying attention to the story setup I was worried about the game's settings the entire time. The menu was locked away from me, and the only settings on the initial screen were simple QOL-type options.
I really wish devs just added a "console" preset. It would be the optimized settings by default. Good visuals with optimal performance. Then based on the user's setup, they can increase or decrease settings.
Man, with the great optimization of this game and the engine they've built, as a Star Wars fan I have big faith in the upcoming Ubisoft Star Wars Outlaws coming later...
Man, it's one of those rare cases where a game everybody wanted to fail so badly delivers and shuts everybody up. Even the PC version runs flawlessly, with no shader compilation stutters. Gameplay-wise it might not be everybody's cup of tea, but I've yet to hear valid criticisms; so far it is a really solid game that is very underrated.
Guaranteed, if Ubisoft had added Unobtanium settings to the menu by default with a warning, people would still have complained online about the game being unoptimized. I very much appreciate games that offer settings that scale to future GPUs, even if they have to hide them.
I noticed the same fog artefacts in Alan Wake 2 with DLSS Quality, behind the Valhalla nursing home. Maybe it has more to do with DLSS than with the game itself?
Pretty blown away by this game's quality; the bioluminescence at night on my Alienware OLED, churned out by my 5800X3D and 7900 XT, is stunning. So far I think my $70 is well spent. So glad FSR3 frame gen is working, I am getting smooth gameplay 📈 🤩
10:10 You can use a specific refresh rate if you want to fix the frametime issues with FSR3 frame gen; the fps just has to be locked. I'm playing at 70 fps, locked via NVCP, with VRR and DLDSR on my 4090. The frametime issues with FSR frame gen are way more obvious when completely GPU-bound. You can easily test this with downsampling like DLDSR and unlocked fps.
German game site article I saw indicated that fsr3 frame gen only has this problem on nvidia gpus for some reason, at least in this game. Their frametime graph from a 7000-series Radeon was fine. In any case Avatar seems to be proof that their solution can work properly with the right implementation.
@@harryarmstrong5728 Yeah, after comparing several videos, it seems certain that NV GPUs have frametimes issues with FG enabled in this game. These zigzag lines in the frametime graph are only there on NV cards.
I have a 3080 Ti, which isn't a budget card though it's nearly 3 years old. I'm surprised I can run Avatar at standard max settings at 3440x1440, a fairly consistent 60fps, DLSS performance. Considering the fidelity of the visuals and all that RT, Massive did a great job optimizing the game. All Snowdrop really needs is an equivalent to Nanite to eliminate any noticeable LOD pop-in.
I have a 3080ti as well, if you’re willing I am curious if you have ALL settings maxed out? Reflections as well? Trying to optimize my settings. Any tips would be greatly appreciated:)
@@Gimpy17 It's all relative. Avatar is using a lot of RT and the draw distance and density at max settings is impressive. Image quality is good and performance is consistent. Compare with Alan Wake 2, not an open world, on my PC its performance is all over the place on medium settings.
@@eternalbeing3339 Depends on the game. A 3080 Ti can run lots of games at native 4K/60+fps/max settings, like Stray, Elden Ring (no RT), Gears Tactics, and Lies of P. Demanding new games like Alan Wake 2 and Jedi Survivor, not so much, but with some concessions they can still look and perform much better than current consoles. RT performance is the main issue for 30-series GPUs, which the 40 series addresses with frame gen. That, or Alex's optimized settings ;) But as DF often says, raw pixel count isn't as important for image quality these days with modern reconstruction techniques. An internal render of 1440p or even 1080p reconstructed to 4K looks great, and it allows older/less demanding games to run maxed out at very high fps, or newer titles like the aforementioned to run with all the RT bells and whistles. I'm skipping the 40 series and will try to hold out till the 60 series, but I might cave when the 50s come out lol. Frame gen is just such a game changer.
Frame pacing is broken if you are using overlays. From the AMD documentation: "One thing to note is that utility overlays provided by third parties will incur a cost which the FSR 3 frame pacing algorithm may be unable to measure and consider. Overlays should always be implemented efficiently, with minimal GPU resource cost to minimize impact. We also recommend that Hardware Accelerated GPU Scheduling be turned on if supported for best frame pacing results."
Hence why it would also be nice if Digital Foundry invested in a high-end AMD GPU as well, maybe a 6950 XT or a 7900 XTX, so we can see how VRR performs along with FreeSync Premium, for example, and whether there's any variation between that and using an Nvidia graphics card with FSR and variable refresh rate.
Imagine if it properly supported explicit multi-gpu, with PCIe Gen5 bandwidth available etc. I still remember the ye old days of GN putting two Titan Vs to the test in AoS. One can wish.
3:00 Agree 100%! I'm always happy when games future-proof themselves by providing graphics options that exceed what even the beefiest high-end systems at the time of release can handle. Sadly the "PC master race" crowd is extremely dumb and hates games for doing so, calling it "bad optimization" (oh Lord, how I hate this term). Hiding such options behind a "secret" param is _perfect_.
I think all PC games need a bang-for-buck preset: settings that are more or less maxed, but cut back in areas most gamers won't notice, offering performance improvements. Console settings are usually a good starting point, but sometimes you can go even lower and still keep the core visual look of max settings.

Like Alex said, many PC gamers get obsessed with maxing the settings out; I know this because I've got two brothers who are like that. But the performance cost for the visual gain is, in a lot of cases, not worth it. With a lot of games, you can more or less get the core max visual look without actually maxing the settings out, which works much better on modest hardware. From what I've seen, max settings in most games are a waste: the quality improvement is small but the performance impact is usually quite big, so it's not really worth it unless you have the hardware to power through it. Seriously, do a test with a few gamers at different visual settings and see if they can tell which is better; my brothers and I tried this with a few games years ago, going as low as medium, where it can be hard to tell them apart in real gameplay. That varies from game to game, though; some games can drop settings a lot further than others while maintaining good visuals.

I also wonder if AMD and Intel gamers should look for alternative sites when it comes to finding the right settings. After all, Alex is supposed to represent PC gamers, but it feels like he represents Nvidia above all else; go back through the year of his videos and a lot are about Nvidia tech.
Loving the tech in this game. 4090 on an OLED with HDR is stunning. For whatever reason, HDR on my Samsung G9 OLED doesn't look that good. The lack of DLSS Frame Generation was the only obvious (and embarrassing) mistake.
Hi Alex! Since you mentioned DLSSTweaks in the video, I think it would have added a lot if you had taken a look at the DLSS 3 Frame Generation mods available for the game, as a comparison to FSR 3, especially in the frame time consistency department. I've noticed that FSR 3 did not feel nearly as smooth as DLSS 3's Frame Generation, even though the frame time graph was quite smooth for me with both. Nevertheless, another awesome video! Thank you for all the effort you put into it, and have a Merry Christmas and some well-deserved rest!
Ubisoft's menus, showing you what your settings are actually doing, should be a standard in PC games. This is a beautiful-looking game, so seeing what you can get with the right settings, ones that don't have a big impact on performance, is very helpful.
Agree. I may not be a major fan of Ubisoft gameplay design, but I love their interface design for being so complete.
It's not perfect though, as you don't have a live preview of the actual game graphics in the background, so you have no direct feedback when changing options.
@@NeovanGoth Still better than nothing at all, which is closer to perfect than anything, at least in that regard.
For me Gears 5 is still the gold standard: a very indicative benchmark, visual representations of where the load will be and on what, where the stress points are in the benchmark, etc.
@@NeovanGoth For those of us who know what to look for in each change, absolutely: set up the camera, change an option, see it change in real time. But for people who know less, a direct photo example can show less knowledgeable users what exactly is changing.
Still just shocked to see Ubisoft putting out such an amazing PC port. Massive is easily their best studio.
Yeah, and their Snowdrop engine seems to be competitive now with UE5. Except for UE5's virtualized geometry, where Snowdrop (like other engines) still uses LOD instead.
@@cube2fox Yeah, Nanite is really just UE5's biggest selling point now; everything else can be had in other engines as well. I hope they'll get competition there soon too, because Nanite really removes another obstacle to games becoming photoreal and usually even increases performance as well (it's pretty aggressive in removing stuff that doesn't have to be rendered due to occlusion and/or distance from the player).
Agreed. It seems other custom engines have their own versions of software RT now which run well, making UE5 even more unappealing, as Nanite is the only real draw. With UE's poor CPU utilisation and shader compilation stutter still not solved (embarrassing at this point and a deal breaker for me), I am less and less excited for each UE5 release and far more interested in what these "custom" engines can do.
What is it ported from?
@@minbari73 Well, even though it releases on PC at the same time, it's probably basically ported from consoles, as that's clearly their main target. That's the fate of most games nowadays, except for the few that are PC-exclusive or PC-focused, like Cyberpunk (which seems PC-first and then ported to consoles, with mixed results...).
Devs need to have a console settings preset like Horizon/God of War did. Makes it easy to tweak and compare.
Yeah. It would be great to see a separate preset named "PS5 quality" in the graphics menu.
I mean sure but what market is interested in that?
@@Decenium The majority of the PC market is on mid- to low-end hardware.
@@Decenium The absolutely enormous number of gamers still on 10-series and 20-series cards.
@@Decenium A lot of people; presets are very useful.
Good to see Massive's PC developer roots are still there. Wish we could get another World In Conflict though.
There is still much love for this game although it's from 2008
Oh wow, I didn't realize it was the same developer. That takes me back. Now that was a LAN party game back in college for me ha.
@@TechDiffuse It's a shame it's not on Steam. But at least it's on GOG. (The Complete Edition is on sale for 2.50 right now, even.)
Yeah man, that was a great game. I don't want spiritual sequels, I want WIC 2.
@@bb5307 I downloaded the Complete Edition for free from one of the platforms. I forget which it was, but they just gave it away for free.
I bought this game because of Alex's first video on the graphics and his emphasis on the explorer mode and crysis style gameplay. First Ubi game I've been happy with in a while.
I bought the game literally because of that video too.... Earlier I was confused, but halfway through that video and the game was in my cart 😂
Haha same for me. I was not expecting the game to look so good. So i didn't bother checking it. But after seeing that video, instantly purchased the game to see how it looks at Max setting on my 4090. Will have to say, one of the best looking game I've played.
@@Pranjalchoudhary100 How does it perform at unobtanium settings? I've a 3060 ti and 3600x and I play most of my games at 4k maxed out 30 fps lock Dlss perf(for a smoother frametime). I was thinking of upgrading my rig to 7800x3d and 4080/4090, but I'm not sure if I should just wait for the Nvidia 5000 series to drop.
@@prateekmishra6140 Should probably wait tbh. The US export ban on AI chips has caused the price of the 4090 to skyrocket and the 4080's price/perf is not good this gen. Bought my 4090 in May and I could sell it on the secondhand market for 100's more than I paid lol.
@@prateekmishra6140 What is your budget? I have an all-white full tower build with a Gigabyte Aero OC 4090, a 7800X3D and 64 gigs of DDR5, which I will clean soon (I clean my PC every 2 weeks to 1 month at the latest). When the 5090 releases, it will go through what the 4090 is going through now: well above MSRP and hard to find. So I would recommend the 4090/7800X3D build now, or wait until next year when the new cards come out and hopefully 4090 prices come down. I don't think the 5090 is coming out anytime soon. If you really need the 4090 upgrade, then get it now.
Merry Christmas, Alex and the whole DF team! You're doing a great job!
What loci said.
I think what the video is missing is an explanation of the "Fixed" and "Biased" upscaling modes. Usually you just change the resolution for DLSS and FSR with the quality modes (Quality, Balanced, Performance, etc.), but the "Biased" scaling mode is an additional scaling modifier that depends on the output resolution and is also inconsistent.
I need a chart to fully explain it.
But summarized, the internal resolution in "Biased" mode is a lot higher than the usual DLSS and FSR screen percentages. For example Biased-Performance is higher res than Fixed-Quality.
At least for 1080p and 1440p. At 2160p there is no difference between Fixed and Biased.
Also no combination seems to drop below 720p internal resolution. So at 1080p, DLSS Ultra Performance has the same 720p internal resolution as DLSS Quality mode.
This probably leads to many users thinking the game performs worse than it does, because "DLSS Performance" doesn't give you nearly as much extra fps in the biased mode as it usually does, and the users don't know that it's actually running at a way higher resolution.
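The floor behaviour described above can be sketched roughly. To be clear, this is a reconstruction from the observations in this thread, not the game's documented formula: the function name is made up, the percentages are the usual DLSS/FSR scale factors, and the 720p clamp is an assumption.

```python
# Rough sketch of the "Biased" resolution floor described above.
# ASSUMPTIONS: standard DLSS/FSR scale factors per mode, plus a 720p
# floor on internal resolution, clamped so it never exceeds the output.
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}
FLOOR = 720  # no combination was observed to drop below 720p internal

def internal_height(output_h: int, mode: str) -> int:
    """Estimate internal render height for a given output height and mode."""
    nominal = round(output_h * SCALE[mode])
    # Clamp up to the floor, but never above the output resolution itself.
    return max(nominal, min(FLOOR, output_h))

print(internal_height(1080, "quality"))            # 720
print(internal_height(1080, "ultra_performance"))  # 720 (360p clamped up)
print(internal_height(2160, "performance"))        # 1080 (floor irrelevant at 4K)
```

Under these assumptions, at 1080p output Ultra Performance lands at the same 720p internal resolution as Quality mode, matching the observation above, and at 2160p the floor never kicks in.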
[20:13] Thanks for testing, analyzing and showing the PC optimized settings. This is going to save a lot of time for so many people.
On a PC the user is tasked with optimizing a game for their unique hardware setup. If you aren't willing to do that yourself and just want to use presets, then go buy a console and be done with it.
@@longjohn526 Very true. I optimize all the PC gaming videos on my own channel according to my own setup. It's good to have a starting point for optimizing rather than wasting time though.
I'm actually surprised how well the software RT holds up visually, compared to how it performs
Very true but I gotta wonder who is the target audience.
People with a 10XX card or an AMD card that doesn’t support RT?
I’d surmise that even the highest non rt card doesn’t have enough power to run this game with software RT for a constant 30 fps or higher.
@@Lock2002ful A 1660 Ti can run this at 1080p30 with optimized settings, so it should be getting 1080p60 on a GTX 1080 or an RX 5700.
It looks like they are just running custom code to perform the ray tracing on the exact same scene as the HWRT one, the same way Crytek did.
This game uses software RT?
@@matacachorro4090 8:03, listen bro, it uses hardware RT to increase performance.
@@matacachorro4090 Yes as shown by Alex in the video
Kudos to Massive for the quality work.
And the co-developing studios too; Ubisoft and the other devs say it was a team effort led by Massive.
It is Ubisoft, really. They are getting a break from the public bashing, at least for now.
I am playing on PS5 in Performance mode, on an OLED television.
It looks absolutely stunning
I'm playing it on PS5 now on a QD-OLED TV (A95L) and it looks phenomenal. Played on a 4090 top spec PC prior but it's only connected to a Mini LED monitor so didn't look as good. Display in this case made the bigger difference: PS5 on top end QD-OLED TV > PC max settings on Mini LED monitor
@@vgnvideogameninja2930 Now connect a PC to a TV and enjoy the ultimate experience some people have had for years. 😉
@@vgnvideogameninja2930 If you have enough dough to buy an RTX 4090, you owe it to yourself to get an LG C3.
Picked up Avatar Frontiers of Pandora on PS5, and man, the graphics are absolutely stunning! It's the first game since Horizon Forbidden West that has blown me away; the world of Pandora is so lush, and the natural environment and lighting are top notch. Flying around on the ikran is a treat. Far Cry Primal is one of my all-time favorite games, and Avatar Frontiers of Pandora captures the vibe of that game perfectly.
Love that you used the RTX2070 Super for performance showcase!
An RTX 2070 Super is roughly the same GPU power as a PS5.
@@longjohn526 ...but an AMD GPU like an RX 6600 XT would be an even better match.
@@hassosigbjoernson5738 The PS5 GPU is basically an RX 6700 non-XT with a slightly higher memory clock.
@@hookgr It's a mix of the 6600-6800.
Awesome video Alex! Really detailed as usual, but a quick thing I noticed: for the judder with FSR3 & VRR there is a simple fix. Just limit the fps to slightly under the max refresh of the screen and the problem is gone. I know you have a lot of work, and this video was surely a lot of work as well since you went through so much stuff, but you can't spend a minute telling us that you can fix DLSS by doing "x and y" without investigating whether the FSR3 VRR problem can be solved. It's not optimal, of course, I know, but the workaround is not the worst thing either.
Hi, does this work even if you don't reach the max refresh rate of 120 Hz? My FPS caps out around 90 fps with FSR3 frame generation.
An FPS limit does nothing when the stuttering happens below the screen refresh rate. What are you talking about?
For anyone else missing the usual "farewell and auf Wiedersehen" at the end of the video, who was left wondering as I was about what we got instead: I believe it was "farewell and Frohe Weihnachten", i.e. "farewell and Merry Christmas", for any other German illiterates like myself.
I love this game's settings page; it shows good pictures of what each setting does and has a lot of options. I really like the audio settings too, because it plays a sound effect when adjusting the levels, so you don't have to guess whether voice or music is as loud as you want it.
Hat tip for the use of "Above the Canopy" from Dark Void as your background music. A fitting throwback to a game of similar setting and design.
I agree with Alex; many new PC gamers don't know much. They underestimate the settings and overestimate the power of their system.
Frankly a lot of them aren't much smarter than console gamers.
@@lycanwarrior2137 A lot of them don't even know how a PC works and still talk.
Great work Alex, thanks a bunch for these settings and side-by-side comparisons :)
The most visually mind blowing and mind boggling game of the decade! This shit blows everything else out of the water! Every tiny little detail to every grand cascading vista is just insanely beautiful, and the lighting is just insane and pulls it all together so well! Playing with Unobtanium settings on my RTX 4080 and having a blast, and even with Ultra settings it already looks insanely good. The RTGI is just incredible and the shadows are next level. Every little plant, blade of grass and even the tiny moss growing on trees has shadows; it's UNREAL (actually it's SNOWDROP!) XD. The world is SUPER immersive, traversal is WILD with its freedom, fluidity and verticality, the gameplay overall feels really great (love the unexpected depth and scope of the cooking, crafting and gathering system), and Exploration Mode is the cherry on top that makes questing super immersive, tough and satisfying when you actually find where you need to go WITHOUT a floating quest marker, just using your brain cells and decoding the map with clues! A true Na'vi simulator for all the Avatar fans out there who have been wanting to feel like a Na'vi for over a decade. Here's your chance, don't miss it! Lots of love to the MASSIVE team from a huge fan!! Hope this game does amazingly!
PS: The Hunter's Guide has really turned me into a zoologist and botanist overnight; it brings a whole other level of immersion to the game!
Avatar: Frontiers of Pandora and Dead Island 2 have been rather pleasant surprises of 2023.
Honestly most of the ps5 reductions you just cant see unless you look closely for them. Meaning the devs did a phenomenal job overall.
I agree. I have both a PS5 and a 4070 PC and games on the PS5 look amazing considering the cost difference in systems. PS5 owners have nothing to feel bad about from the fanboy haters. I can't wait to see what the PS5 Pro can deliver!
@@TB-vb1stthere won’t be a pro
The devs need to be commended to making both PC and console look absolutely phenomenal. Frankly I cant believe how good the game looks on PS5 and with VRR it runs 99% of the time great. And even the image quality is great considering what the game does. Very very good job.
Most Ubisoft games are very well optimized on console even though they're trash
So performing basically as expected on both consoles and PC. A similar PC to a PS5 gets an equal or even better result. Great.
A 2070 Super isn't similar to console hardware which uses AMD.
I cannot understand why Alex doesn't use a 6650 XT with 10.8 TFLOPS instead.
It is similar. Different architectures, but the 2070 Super basically performs like an RX 6700 across the board, and that's basically the consoles' GPU.
It also debunks the myth of magical console optimization. A similarly specced PC and the console basically perform the same.
@@rob4222 It's similar in performance almost across the board and 90 percent of people use Nvidia on PC, that's why. Though I think an amd card like you mentioned would be good for them to use for educational purposes and people that use them.
@@sevrent2811 There used to be a big advantage on consoles with low level access, but the PC closed the gap a lot with Vulkan and DX12 that even though consoles still have an advantage, the advantage is really small now if developers optimise the game well on PC and consoles, which many do a poor job.
I do have to agree that an AMD card should have been thrown into the mix for the optimised settings, to get an idea of how it performs. Yes, we know the 2070 performs like the 6700, but some games perform much better on AMD or Nvidia GPUs of the same class, so we don't really know how it runs on AMD or Intel hardware; we can only assume it performs like the same class of Nvidia GPU. Honestly, I think Alex could have done better here by also showing the optimised settings on AMD and Intel hardware. After all, showing only Nvidia settings encourages gamers to lean towards buying Nvidia hardware, and then Nvidia will bump the price up more. These optimised settings are most useful for gamers who don't understand what a lot of these settings mean. But to be fair, Alex has always been in the Nvidia camp, so it's no surprise.
3:14 This is SO TRUE HAHAHA; people will turn on ultra path tracing and complain that their 3080 doesn't run it well.
Merry Christmas!🎄
i'm so happy that Alex was finally able to breathe in the end of this grueling year for PC ports with this and Alan Wake 2 🙏
Looks beautiful and runs great! I hope all upcoming Ubisoft PC Ports are of this quality or beyond
Gotta respect ubi for this one. When they get up from their lazy asses, they can really deliver great games. Personally love ac odyssey and fc5. Will for sure try this one soon
I think Ubisoft's PC versions are generally very good. Not perfect (Why does the Dunja Engine _still_ have traversal stutter?), but very solid. Games run well, they know how to properly implement HDR, and even the PS5 controller is fully supported, including adaptive triggers via Bluetooth.
I checked Like a Dragon Gaiden with FSR3; it has a similar implementation, but I noticed that if you cap your frame rate at something less than your max refresh rate (and hit that target) but within the VRR window, you do get smooth frametimes. I had judder with v-sync, so I capped the frame rate 1 fps below the max refresh and it worked perfectly with FSR3: a straight line on the frametime graph.
You'd expect DF to investigate and find the best settings for FSR3, but they didn't. Instead they wrote it off already, judging by what was said in this video. There's a reason DF is called "Nvidia Foundry" by many.
I've tested FSR3 with Nukem's dlssg-to-fsr3 mod and Cyberpunk 2077, Witcher 3, etc are buttery smooth. I also have the frame rate limit set to 2 fps less than my monitor frequency.
@@Dark-qx8rk I mean, that's the issue that really doesn't let AMD be mainstream: you shouldn't need to do all these extra things to get a good experience (and most people don't), so I do understand DF's perspective from a user-experience standpoint.
@@Relex_92 I agree. I got a VRR monitor so that I'd never have to worry about v-sync and its judder again. FSR3 is unacceptable to me in its current form, and that's fine; it's not like I even own an AMD GPU, so it's nice that it's available at all on Nvidia cards... But I won't be using it until/unless it improves.
TBH, capping fps few frames under max refresh rate (or under what one's hardware can handle) is what I'd expect to be a standard and a basic knowledge of everyone with VRR monitor.
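As a sketch of that rule of thumb: the function and the default 3 fps margin below are just a common community convention for keeping VRR engaged, not an official figure from any vendor.

```python
# Common VRR rule of thumb: cap the frame rate a few fps below the
# display's maximum refresh so the game never collides with the v-sync
# ceiling and VRR stays engaged. The 3 fps default margin is an
# assumption; some people use slightly larger margins.
def vrr_fps_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    if margin_fps >= max_refresh_hz:
        raise ValueError("margin must be smaller than the refresh rate")
    return max_refresh_hz - margin_fps

print(vrr_fps_cap(144))  # 141
print(vrr_fps_cap(120))  # 117
```

The cap can then be applied with whatever limiter one prefers (in-game, driver, or an external tool), as long as the game actually holds the target.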
@@msoltyspl PC Master Race 2023 is embarrassing
FSR 3 works flawlessly with my 7900 xtx and variable refresh. I think not using an AMD 7000 series is an issue when evaluating an AMD tech.
not if they say it works on all
@@mryellow6918 Sure, but I will still say that it requires hardware manufactured by AMD to prove it. AMD and Nvidia are not necessarily the most truthful corporations.
lol RTX 4090
If you are using Radeon Anti-Lag and an in-game frame limiter set to half the desired framerate, then it will be smooth. It is the same with AFMF. Playing with unstable framerates that range from 70fps - 120fps isn't going to be a good experience.
Overall I think using hardware manufactured by the company can confirm or rule out these issues. Neither AMD nor Nvidia is really being honest, so using AMD hardware is always a plus to increase the accuracy of any conclusions.
I’ve been waiting for someone to do an optimized settings video. Thank you so much digital foundry
Solid breakdown thanks Alex!
All I want is to play this game in VR. Just hope someone makes a VR mod for this.
I want to play this in 3rd person view.
With the Far Cry Primal feel of it you probably won't like it, but idk anything
@@XZ-III NPC
@@XZ-III I already put 50 hours in this game. I like it.
@@djp1234 Not what I'm talking about; just because I said it's like another game doesn't mean I said it's bad.
I wish Snowdrop Engine was mainstream instead of Unreal.
This is such a great, in depth and informative video. Well done Alex. Great job.
Amazing work there from DF.
Great and in-depth analysis from Alex as always . 👍
Legend for focusing optimization for "older, mid range PC components". GPU prices have been crazy in recent years obviously since the boom in computer hardware enthusiasts and demand has been huge.
Thank you for the tip on using DLSS. The smearing issue is terrible otherwise, especially in the plains, and I’m currently using FSR because of that.
Frame generation is not usable anyway, because it causes blocking issues on the UI, which is way too distracting.
Playing it on a 4080 coupled with a 7700X, 3440x1440, I do get around 90ish fps with FSR balanced. Decent, but I’m gonna go with the optimized settings and go up from there.
Also wanted to mention that for anyone that wants to play the game, it’s a decent classic fps open world, but it’s GOLD for Avatar fans.
Yeah no one cares
What avatar fans….
@@samgoff5289 the first movie made 2.9 billion and the 2nd 2.3 billion so I'm gathering there's probably a couple of fans.
Odd... I'm getting the same or slightly better with a 5800X and a 4070 Ti at DLSS Quality (I never go below that at 1440p because it defeats the purpose of better graphics) and the same monitor resolution. However, I have Depth of Field and Motion Blur turned off, and in the CFG file I have Chromatic Aberration and Vignetting turned off, all features I don't like that also take away performance and cause blurrier, less defined graphics. I also hacked the EXE so ultrawide is enabled in game-rendered cutscenes, and it seems to have zero effect on performance, so I really don't understand why it's not enabled when all it took was changing 4 bytes of code. In fact, squashing them down to 16:9 seems to degrade the graphics like changing the Ultra preset down to High or Medium.
Damn, the insulting kids/bots that have nothing to do in their life are here already. Never thought they'd be so fast!
It is sad that the most exciting thing for me is the lack of shader stutter. Unreal really needs to fix that; this game proves it is totally avoidable.
That is not Unreal Engine's biggest issue.
Shader stutter can be avoided, what can not be avoided is traversal stutter.
And there is barely any UE game that doesn't have that.
Jedi Survivor and Hogwarts Legacy are examples of UE 4's traversal stutter issues.
But Immortals of Aveum has the same issue every time something loads in.
The Finals doesn't have it, but I guess that's because levels are just fully loaded.
In Fortnite it's not very bad today, but I still think it could be done much better, with a better parallelized load on the CPU.
This snowdrop engine does it perfectly... CPU wise a high end CPU can run this game at 200 fps throughout.
An Unreal Engine 5 game with similar gfx would certainly not reach that.
@@oropher1234 Nah, shader stutter is still pervasive despite claims that it's easy enough to fix. Traversal stutter may be equally, if not more, ubiquitous in UE titles, but it has way less of an impact than shader compilation issues.
@@DroneCorpse Shader stutter is an issue just because devs are often careless about it; if they actually implement a precompilation step, it's not that big of a deal.
Fortnite today does it in a weird way though: it seems slower CPUs stay busy for too long and don't finish before the first match actually starts.
I don't understand why Fortnite doesn't precompile in the menu before loading the game.
Love playing this on my 4070ti super and ryzen 7 7800x3d. Looks and plays great.
These videos are an incredible service. Thank you.
One thing I wish devs would leave in on PS5 is the motion blur setting. I am one of the few who actually prefers to play with motion blur, as it helps with my motion sickness. Disabling it for the performance mode is just a crime; I've had cases where my head would start to spin when I play for more than an hour. I wish I could play with it and adjust its value. Not a fan of devs deciding what is best for the user. Not sure who decided that motion blur is bad, because I always loved it.
I don't think it's a case of deciding what is best for the user here, but more a case of deciding the best way to get performance out of the console.
I suspect intense motion blur scenes caused some frame drops that made them decide to deactivate it on console where they try to hit that 60fps or 30fps exactly.
On PC, if you play at 90 frames and drop to 85 for a little, it's not much of an issue.
Frohe Weihnachten to you as well
The "Frohe Weihnachten" at the end killed me.😂😂😂
So happy DF have good sponsors, they put so much work into every single video, you deserve everything
wrong
is MSI a good sponsor? specifically mentioning only Intel and Nvidia for configurations? idk.....seems a tad....messed up
@@snowpuddle9622 frl, they spend all day knowingly spewing bullshit information.
The one that gets me every time is when they go to a random frame of a door or whatever and count the "pixels" to see what resolution the game is running at 😂😂😂 That's completely illogical math 😂
I didn't realize this was Massive!! I assumed it was just another reskinned Far Cry game. Once I saw it was Massive, I went out and bought the PS5 Gold edition and I'm loving the game so far.
Love how "Cromulent" is now part of the DF vocabulary after that DF Direct Episode
The studio's earlier game, The Division, looks kind of amazing even by today's standards. Especially when you compare it to some modern "next gen" titles like Redfall, it really goes to show how some developers have actually regressed.
To say nothing of the imaginary Division from the trailers...
@@Goodbutevilgenius When you actually play the game, there are so many locations that totally live up to the trailers with the lighting, geometry detail, destruction and particles.
I’d rather invite you to take a look at what the new “From the ground up” Forza game ended up looking like compared to the reveal trailer. That’s like Ubisoft marketing on steroids.
Nice video as always.
I’d be very thankful if the 3080 would get some love in these comparisons and for AMD users something comparable to the 2070-2080 and 3070-3080.
Testing only two GPUs, and only Nvidia at that, is a bit narrow tbf.
I think we should keep the more limited purpose of this video in mind. The performance/return optimised settings derived in this video will hold for most systems and GPUs, although of course the need for such optimisations declines as one moves further up the performance stack. Between the videos DF has put out about this game I think we can put together a general picture of the game's performance profile at both the mid-range and high-end, but perhaps a summarizing paragraph would be a useful addition to wrap things up, to the effect that the game remains both performant and attractive on console-class hardware while also scaling up to challenge even top-tier hardware.
Personally i think they should get as close to a card on performance to the PS5 and Xbox Series X as possible.
It provides you with a baseline - you can just increase resolution or some of the settings that provide the biggest impact.
I think Alex could make note of the 3-4 settings that have the most visual impact so that 3080 or 3090 owners can bump those up.
Im sure there are other videos for lower end systems.
Loved what Massive did with The Division 2 on PC. Put in over 130hrs with that one, which uses the same Snowdrop Engine, the only decent Gfx Engine that Ubisoft has, IMO.
Rockstar must also incorporate Unobtanium settings in their upcoming GTA VI, since it will most likely hang around for another decade before the next GTA arrives!
I think devs just need to include default performance/optimized settings or console-equivalent presets in their PC versions at this point. This would hugely help a lot of users achieve more stable frame rates with minimal quality loss, without having to tweak every setting in the game for hours.
I was annoyed that throughout the entire intro sequence, there was no way for me to access the settings and it was giving me anxiety because I knew the game could look better in that part and rather than paying attention to the story setup I was worried about the game's settings the entire time. But the menu was locked away from me and the only settings I had in the initial screen were simple QOL type settings.
Having fun with this game, about 40 hours in, and I will probably put in maybe 15 more 😊
This is a great game, running very very well at ultrawide 1440p, high settings, DLSS Balanced, on a 2080 Ti and an old 9700K. Looks amazeballz.
I really wish devs just added a "console" preset. It would be the optimized settings by default.
Good visuals with optimal performance. Then based on the user's setup, they can increase or decrease settings.
Man, with the great optimization of this game and the engine they built, as a Star Wars fan I have big faith in Ubisoft's upcoming Star Wars Outlaws coming later...
Another excellent DF video. Keep it up :)
Great video as always, Alex! I just miss a Radeon vs Nvidia performance comparison like in the old days of DF.
So happy to not only see another third party engine being continued, but also due to its quality! Hopefully, more games will be developed using this!
Can't wait for this to go on sale, so many people are going to be surprised.
I am so thankful you mentioned the frametimes with FSR3. I thought I was going out of my mind
Great video, as always!
Great breakdown
Will there be a full FSR3 FG test with different GPUs or is this it?
Excellent detailed analysis. As usual. Thanks.
720p internal resolution is why we need “pro” consoles. Even bumping it up to 900p would be a big difference
Why though? The Protato would be running only 900p and within a year would be back to 720p.
@@fcukugimmeausername why ever upgrade anything, games get more demanding as time goes on
Man, it's one of those rare cases when a game everybody wanted to fail so badly delivers, and shuts everybody up. Even the PC version runs flawlessly, with no shader compilation stutters. Gameplay-wise it might not be everybody's cup of tea, but I've yet to hear valid criticism; so far it is a really solid game that is very underrated.
Guaranteed if Ubisoft had added Unobtainium settings to the menu by default with a warning, people would have still complained online about the game being unoptimized. I very much appreciate games that offer settings that scale to future GPU’s, even if they have to hide it.
Alex, the best as always
why not show fps on every image?
Can you verify FSR3 FG with VRR on an AMD card? I'm using a 7900 XT and not seeing the frametime issues presented in the video.
Thank you for the video; optimised settings videos are the best kind for PC.
Excited to see what this means for the upcoming Star Wars game.
Good work alex❤
do you not own an RX7000 GPU? show FSR3 on native hardware?
Very impressive work 👍
Thank you DF!
You the man alex!
What we need is a quality mode with 40-45, maybe 50 fps on consoles, with future development of FSR 3.1/3.2 etc.!
I noticed the same fog artefacts in Alan Wake 2 with DLSS Quality, behind the Valhalla nursing home. Maybe it has more to do with DLSS than with the game itself?
“Best pc port this year” in December is truly high praise.
Pretty blown away by this game's quality; the bioluminescence at night on my Alienware OLED, churned out by my 5800X3D and 7900 XT, is stunning. So far I think my $70 is pleasing me. So glad FSR3 Frame Gen is working; I am getting smooth gameplay 📈 🤩
I have a 7900 XT and a 3440x1440 monitor; this game looks incredible.
10:10 You can use a specific refresh rate if you want to fix the frametime issues with FSR3 frame gen; the FPS just has to be locked. I'm playing at 70 fps locked via NVCP with VRR and DLDSR on my 4090. The frametime issues with FSR frame gen are way more obvious when completely GPU bound. You can easily test this with downsampling like DLDSR and unlocked FPS.
A German gaming site article I saw indicated that FSR3 frame gen only has this problem on Nvidia GPUs for some reason, at least in this game; their frametime graph from a 7000-series Radeon was fine. In any case, Avatar seems to be proof that their solution can work properly with the right implementation.
@@harryarmstrong5728 Yeah, after comparing several videos, it seems certain that NV GPUs have frametimes issues with FG enabled in this game. These zigzag lines in the frametime graph are only there on NV cards.
I have a 3080 Ti, which isn't a budget card though it's nearly 3 years old. I'm surprised I can run Avatar at standard max settings at 3440x1440, a fairly consistent 60fps, DLSS performance. Considering the fidelity of the visuals and all that RT, Massive did a great job optimizing the game. All Snowdrop really needs is an equivalent to Nanite to eliminate any noticeable LOD pop-in.
I have a 3080ti as well, if you’re willing I am curious if you have ALL settings maxed out? Reflections as well? Trying to optimize my settings. Any tips would be greatly appreciated:)
Honestly, 60 fps with dlss on perf. Is kinda bad and from a 3080ti? That's terrible optimization.
Dlss performance looks so bad. 3080 ti is fine for 1440p but not 4k. My 3090 ti can barely get 4k 60fps in new games.
@@Gimpy17 It's all relative. Avatar is using a lot of RT and the draw distance and density at max settings is impressive. Image quality is good and performance is consistent. Compare with Alan Wake 2, not an open world, on my PC its performance is all over the place on medium settings.
@@eternalbeing3339 Depends on the game. A 3080 Ti can run lots of games at native 4k/60+fps/max settings, like Stray, Elden Ring(no RT), Gears Tactics, and Lies of P. Demanding new games like Alan Wake 2 and Jedi Survivor, not so much, but with some concessions can still look and perform much better than current consoles. RT performance is the main issue for 30 series GPUs which the 40 series addresses with frame gen. That or Alex's optimized settings ;)
But as DF often claims, raw pixel count isn't as important these days for image quality with modern image clarity techniques. An internal render of 1440p or even 1080p reconstructed to 4k looks great and allows older/less demanding games to run maxed out at very high fps or newer titles like the aforementioned run with all the RT bells and whistles. I'm skipping the 40 series and will try to hold out till the 60s but might cave when the 50s come out lol. Frame gen is just such a game changer.
Frame pacing is broken if you are using overlays. Per AMD's documentation: "One thing to note is that utility overlays provided by third parties will incur a cost which the FSR 3 frame pacing algorithm may be unable to measure and consider. Overlays should always be implemented efficiently, with minimal GPU resource cost to minimize impact. We also recommend that Hardware Accelerated GPU Scheduling be turned on if supported for best frame pacing results."
21:20 I thought PS5 lacked mesh shaders?
Hence why it would also be nice if Digital Foundry invested in a high-end AMD GPU as well, maybe a 6950 XT or a 7900 XTX, so we can see how VRR performs along with FreeSync Premium, for example, and whether there's any variation between that and using an Nvidia graphics card with FSR and variable refresh rate.
What?
Couldn’t they do that with a midrange AMD GPU?
Imagine if it properly supported explicit multi-gpu, with PCIe Gen5 bandwidth available etc. I still remember the ye old days of GN putting two Titan Vs to the test in AoS. One can wish.
Merry Christmas ;)
Sweden should be proud of this. Malmö FTW
3:00 Agree 100%! I'm always happy when games future-proof themselves by providing graphics options that exceed what even the beefiest high-end systems at the time of release can handle. Sadly, the "PC master race" is extremely dumb and hates games for doing so, calling it "bad optimization" (oh Lord, how I hate this term). Hiding such options behind a "secret" param is _perfect_.
I think all PC games need a bang-for-buck settings preset: basically max settings, but cut back in areas most gamers would barely notice, in exchange for performance improvements.
Consoles are usually a good starting point but sometimes you can go even lower and still keep the core visual look of max settings.
Like Alex said, many PC gamers get obsessed with maxing the settings out. I know this because I've got two brothers who are like that, but in a lot of cases the performance cost isn't worth the visual tradeoff.
With a lot of games, you can more or less keep the core max-settings look without actually maxing everything out, and that usually works much better on more modest hardware. From what I've seen, max settings in most games are a waste: the quality improvement is so small and the performance impact usually so big that it's not worth it unless you've got the hardware to power through it all.
But seriously, do a test with a few gamers at different visual settings and see if they can tell which is better. My brothers and I tried it with a few games years ago, going as low as medium, where it can be hard to tell the settings apart in real gameplay. That varies from game to game, though; some games can drop settings a lot further than others while maintaining good visuals.
I also wonder if AMD and Intel gamers should look for alternative sites when it comes to finding the right settings. After all, Alex is supposed to represent PC gamers, but it feels like he represents Nvidia above all else; if you go back through his videos from this year, a lot of them are about Nvidia tech.
Loving the tech in this game. 4090 on an OLED with HDR is stunning. For whatever reason, HDR on my Samsung G9 OLED doesn't look that good. The lack of DLSS Frame Generation was the only obvious (and embarrassing) mistake.
Hey, can you guys do a tech review of The Finals? It's UE5 with destruction.
That "Frohe Weihnachten" sounded really good. I guess German relatives?
UE5 is greatly overrated
Black Myth Wukong says no.
Wukong's final chapter looks and runs like shit. Anything remotely open-world in UE5 that isn't just rocks will look and run much worse. @@TheT0nedude
Hi Alex! Since you mentioned DLSSTweaks in the video, I think it would have added a lot if you had also looked at the DLSS 3 Frame Generation mods available for the game as a comparison to FSR 3, especially in the frame time consistency department. I've noticed that FSR 3 did not feel nearly as smooth as DLSS 3's Frame Generation, even though the frame time graph was quite smooth for me with both.
Nevertheless, another awesome video! Thank you for all the effort you put into it, and have a Merry Christmas and some well deserved rest!