I just unlocked the "Unobtanium" setting on my 14900K/4090 MSI Liquid Suprim/GSkill 64GB of 6000MT/s RAM. My goal is to hit 4K Full UnOb settings 60fps. Turned on FSR3 with frame gen and removed motion blur (yuck) and ran the benchmark to an average of 60fps with lows dropping to 55-56fps. Loaded my save and the game will maintain 60fps for the majority of the time, but if you go look at water with RTX reflections on UnOb, the FPS will drop to the low 50's. I switched back to full ULTRA with FSR3 w/frame generation and the game is a solid lock to 60fps. I switched back and forth between Ultra and UnOb and the difference is so subtle it is not worth the fluctuating frame-times. So now I'm running Ultra+. Cranked up the texture density from the UnOb settings and the game will use 22GB of my 24GB of VRAM. Zoinks! I also added UnOb particle effects, volumetric clouds, shadows and a few other settings from UnOb. My current game is a near rock solid 60fps just running around in awe at Avatar's visual beauty, with a slight dip to 56-58fps here and there. With HDR on, I swear shots in this game look like they're straight from the films. Pure Sublime Eye P**n!!!
I play it on ultra settings with FSR 3 frame gen on ultra quality, and playing it with a controller it feels really smooth on my RTX 4070; definitely the most demanding but most beautiful game I ever played (note: I noticed that putting a frame limit in the Nvidia Control Panel 2 frames below your monitor's refresh rate makes the game feel a lot smoother with frame generation on. I put mine at a 142 fps frame limit because the game started to feel stuttery when going above 144 fps)
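For anyone wondering why capping a couple of frames below the refresh rate helps, here's the rough arithmetic (a sketch assuming a 144 Hz display; the smoothness also depends on VRR/G-Sync behaviour, which this doesn't model):

```python
# Back-of-the-envelope numbers for capping FPS just below the refresh rate
# (values here assume a 144 Hz display and a 142 FPS cap, as in the comment above).
refresh_hz = 144
cap_fps = refresh_hz - 2  # common rule of thumb

refresh_interval_ms = 1000 / refresh_hz   # ~6.94 ms between display refreshes
capped_frametime_ms = 1000 / cap_fps      # ~7.04 ms between delivered frames

print(f"Display refresh interval: {refresh_interval_ms:.2f} ms")
print(f"Frame time at {cap_fps} FPS cap: {capped_frametime_ms:.2f} ms")
# The cap keeps frame delivery just slower than the display can refresh, so frames
# aren't queued up waiting on the monitor - one reason a 142 FPS cap can feel
# smoother than an uncapped 150+ FPS with frame generation on.
```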
Your CPU is seriously bottlenecking your GPU at every resolution, especially at 1080p. An RTX 4090 is not meant for the Ryzen 5000 series. You should upgrade to a Ryzen 9 7950X3D, then benchmark it again, and you'll see a massive improvement. And lastly, FSR is useless in your case because FSR works more efficiently only with AMD GPUs. I know because I own one. And yes, the game needs more patches for a smoother experience.
@@2hotflavored666 yet you reply negatively to a comment while not using the same knowledge/logic that you claim to have had. Adds zero value. Screams “I have nothing even close to a 4090 and I’m salty”.
i have a 4090 on a 3440x1440 screen. unobtanium gives 45fps and is literally the same as high except for darker midtones in shadows, and high is 120fps. don't mistake unobtanium for a next gen setting, it's literally like high.
So, I will say that they did a great job at making the lower settings look really good in this game; the differences between the higher and lower settings are more subtle (that's awesome) - but there definitely are differences between unobtanium and even ultra settings, they are just *super* hard to see unless you are looking for them. Definitely not worth the major performance impact to some people, but I wouldn't say it's comparable to high settings, personally.
this is like the mode in doom eternal where it supports 1000FPS. it is there mostly for the future, if someone wants to play this thing decked out in 5 years, here you go!
that doesn't look that good tho, Ubisoft is so late on graphics and the gameplay is pretty bad. Red Dead 2 looks better and it's a 2019 game on PC, Star Citizen looks better, Death Stranding looks better. This is so bad for an end-of-2023 game
It may be using a lot of vram if it has it to use, for extra texture streaming or whatever. That does not necessarily mean it will not work well on lesser vram cards, like a 16GB 4080 for example. Cool video though ! I remember running this on my laptop 4090 and it brought it to its knees, haha!
That is true, but I think even if it were using the extra vram for texture streaming that would cause the game to either stutter like crazy or run at a much lower FPS while it waits for the VRAM and RAM to clear - that's what games like Hogwarts Legacy did. And while everything ran like crap, the textures were super blurry
@@MrJays Possibly, but it's not always the case. Some games use what's available for things that may not bring any tangible benefit. I remember a COD from like 10 years ago or so that had a feature to use basically whatever amount of VRAM you had. When I had an 8GB 3070 laptop I never had any significant issues in Hogwarts at 1600p with higher textures.
Someone having a 4090, and playing in 1080p... would be an incredibly rare hardware mismatch. Something is strange about your benchmarks though. You're not hitting max utilization of gpu or cpu either one. Not even close. I'm guessing the game is just horribly optimized and is really cooking just one or two cores on your 5950X. Single core performance just never seems to stop being relevant.
Nope, the game isn't horribly optimized at all; what you're noticing is the effect I mentioned in the video of being CPU limited, or CPU bottlenecked. With a high core count CPU, you don't need to see 100% CPU usage to bottleneck the GPU. I only saw proper GPU usage, and the bottleneck was removed, when I went to the higher resolutions. It's what happens when games don't fully utilize a CPU with 16 cores, but only something like 4 cores.
@@MrJays *nod*, I just meant that it not utilizing more cores is less than ideal if it's going to bottleneck on the CPU like that. In general new games built on the latest engines are very badly optimized though. There's a lot of 'copy-pasta' coding, and a lack of expertise in implementing "close to the metal" tech in DX12. Close to the metal is theoretically capable of greater efficiency, but it's more complex on the coder side of things, and for the most part, it's not leveraged well. You see it a lot in UE5 games especially. Though this game looks like it has its own engine. It feels like game developers are leaning hard on FSR or DLSS to save them the work of coding more properly. The 5950X is slightly slow on single core performance (paired with a 4090 anyway); a 7950X or 7800X3D would probably help "some" with the CPU bottleneck, but probably by only 20% or so. Heavy use of ray tracing seems like a bad idea in general too. Sure the new cards can actually run RT at playable frame rates, but it's still a huge resource hog, so using it excessively is still going to tank fps. Cyberpunk at truly max settings kills fps, and it's mostly the RT's fault. At least 4090-crushing and 4nm cpu-crushing games are mostly limited to extreme settings though. I dunno :)
@@kathrynck I do agree that there have been a lot of incredibly unoptimized games this year, and in the past few years as well, yeah. And I really wish we had more games that fully utilized a high core count CPU, that would be awesome! The biggest thing for me with games is when the CPU won't go past ~35% usage, but my frames drop to 50 FPS lol. And some devs are definitely using FSR and DLSS to mask terrible optimization, you are 100% right - but I gotta say - I am a huge fan of raytracing and pathtracing lol. As someone who loves the best graphics, I will take that hit, and just turn on frame generation :D
@@MrJays Hehe, fair enough :) You've got a 4090, so you can spoil yourself with RT/PT pretty well hehe. I got a 4090 as well, but I mainly got it for a move to 4k. So I don't run into cpu bottlenecks too much, cuz 4k really tortures the gpu ;) I did go 7900X for the cpu though, looking to max out on single-core performance. Intel might still have a slight edge on single core, but I really needed more pcie lanes than intel is willing to include. I gotta say that all the new 2023 hardware is really heavily overjuiced & overclocked at factory settings. I de-tuned both slightly in my system. Comparing them apples to apples with non-OC older hardware is a little unfair since they're running so deep into the redline out of the box. I found both my 7900X and 4090 perform _better_ with undervolt, but i took that edge away with lower thermal max's, cuz I want them to last.
Played the game during the holiday and it looked amazing on the Xbox Series S and looked amazing on PC at max settings. Ubisoft really made one of the most beautiful games I ever played.
When you have a high core count CPU - like higher than 8 cores - you won't see 100% usage during a game, because games are typically optimized for 4 or 8 cores. So for my 5950x I don't see more than 60-70% usage on the absolute highest end. And you can tell it's a CPU bottleneck when the GPU usage drops below about 80%. In this video, at the low resolutions, the GPU hardly hits 90% usage, and averages 50% usage because it's just waiting for the CPU to finish.
@@MrJays Check single-core performance; if it's not at 100% it's underperforming, not a bottleneck. I have an old CPU with the latest GPU, all cores at max, only producing 80 fps at 1080p. So I changed to a 4K monitor and it still renders around 80 fps, so I didn't change the CPU. So upgrade to a better resolution, don't spend more on the CPU.
Since I'm watching this via a YouTube video, I gotta ask: did you find this Unobtanium mode much, much better? Is the performance cost at 1440p justifiable?
I must say that it's a perfect game for 1440p, but I would not run the unobtanium setting with anything below a 4090 or 7900 XTX. I play at 1440 ultrawide, and because the unobtanium settings are not optimized I have had crashes, so I find myself keeping just ultra on. Hope this helps, as the game is amazing either way
So, in my experience at 1080p, I can say that the game looks stunning without the unobtanium settings and you get an extra, like, 30 FPS! So that is great! But I can say that if you want *slightly* higher texture quality, or better shadows, without caring about your frame rate, then it is worth it. But as another comment said, they are very hard to run, and can cause crashing. Overall thoughts - Eh, not really needed, but still nice to have if someone wants to run them lol
No. Barely any difference. Disagreeing with the guy here. I have a 16GB 4090 laptop and I can run unobtanium with some tweaks, DLSS ultra quality, 4K, FG on, at about 50 fps. There's barely any difference besides maybe volumetric fog. Also, he brings up Lords of the Fallen, and that's just unoptimized trash; UE5 is also known to be very poorly optimized.
@@joshuam4993 totally respect your opinion! I can agree there isn't much difference unless you are specifically looking for the differences. But while I do agree UE5 games tend to have issues, Lords of The Fallen ran pretty great for me in my benchmark video! :D
If you are saying you can only be CPU limited if a CPU is being pushed to 100%, that is false. If a game only uses 4 cores, the CPU will show very low total usage, like it did in the video, but those 4 cores will be pushed to 100% usage, and that will drop your GPU usage because it's waiting on your CPU most of the time. Just like in the video you can see my GPU usage dropping down to 70 or 80% even at 1440p, because the 4090 is waiting for the CPU - that's a CPU bottleneck. A CPU with faster cores would give better FPS due to each core being faster, so those 4 cores the game wants to use would be able to produce more frames. Or if games utilized more CPU cores, it would give better performance as well, typically.
So we have games continuing to push VRAM demands while Nvidia cranks out midrange GPUs with 8-12GB of VRAM on them. We are entering an era of gaming where people are going to miss the days of SLI. You could get two RTX 4070s with a combined 24 GB of VRAM for about half the cost of an RTX 4090
Honestly, I'd love it if SLI came back, but I don't think it will. We can create more powerful single GPUs than two GPUs combined, and game devs aren't optimizing for it anymore
cheers mate! thanks for the review. just got a 4090 installed today and looking for titles to throw on. why you upload in 1080p? please 2k minimum i cant see shiat 😆🤙
Thanks for watching the video! And I hope you enjoy your 4090! I love mine! Also I apologize for the video being 1080p, I tried to record the footage at 4K, but Nvidia saved all the footage at 1080p, for some reason
@@MrJays it saved it in 1080 because you forgot to set it to 4k 🤪 nevermind. i already heard avatar looks nice but the game itself is bad, so i don't want to buy it anymore. so sad they missed the chance to make something better with such stunning graphics
High core count CPUs don't show 100% usage while gaming even when they're the bottleneck, unless the game is crazy unoptimized. I have 16 cores, so if a game only uses 4 cores, that's 25%; so 20-25% usage means all 4 of the cores the game can use are loaded, and that's why the GPU usage wasn't above 80-90% like it should be
Why are you running in Borderless Fullscreen? Usually exclusive fullscreen skips some windows management code paths and has better performance and stability overall...
I always run borderless fullscreen since I typically stream my gameplay, so I need to alt-tab way more than others would. But from what I've noticed in the last few years, fullscreen doesn't offer any FPS improvement over borderless fullscreen; the best case scenario I've seen is that it reduces stutters a bit in some games
I apologize, I tried to record in 4K, but Nvidia only recorded in 1080p for some reason :/ In the future, benchmarks will be in 4K if they include 4K gameplay, just as they normally do on the channel!
I normally do record in 4K, and I tried to for this video too, but Nvidia had an error and only recorded in 1080p. Other benchmarks and videos that have 4K footage will be in 4K on the channel, as usual.
If you are assuming you can only be CPU limited if a CPU is being pushed to 100% that is false. If a game only uses 4 cores, the CPU will show very low total usage, like it did in the video, but those 4 cores will be pushed to 100% usage, and that will drop your GPU usage because it's waiting on your CPU most of the time. Just like in the video you can see my GPU usage dropping down to 70 or 80% even in 1440p, because the 4090 is waiting for the CPU - that's a CPU bottleneck. A CPU with faster cores would give better FPS due to each core being faster, so those 4 cores the game wants to use would be able to produce more frames. Or if games utilized more CPU cores, it would give better performance as well, typically, or at least it would show more utilization for the CPU
My 1440p monitor can only hit 75hz with freesync. I also can only focus with one eye at a time. So visual looks are not a big deal to me. Have a 5800x and a 7800 xt. Probably start with high and see how it goes.
What a massive change 😲 Hey, out of curiosity, if it's ok to ask, what do you do to earn income aside from YouTube/Twitch/streaming? Sometimes I wonder what people do to afford such amazing PC builds 😅 Not to be disrespectful! I really hope to grow up and have the income to afford such beauties for me and my siblings so we can game together, I feel like it would be awesome, but with how the world and economy are going, it's so hard 😢
A pc with this GPU to build yourself should cost around $3k. You don't need an amazing job to afford that, especially if you save, stop buying dumb stuff, wait for sales, etc. Also not sure what job you have but if you are working an entry level retail/food service job I highly recommend getting a trade job. Look for landscaping, electrician, plumber, construction work in your area and shop around. If you find the right boss/company/mentor you can very quickly learn the trade and you'll make considerably more than working say retail or food service.
@charismatic9467 Thanks for the response! I live in the UK, and job scarcity is a very big thing. I am a student currently studying CompSci and hardware engineering, and I'm involved in building FPGAs for companies such as Kone Industries and Hynix (memory), and in software and cyber security engineering for businesses such as Meta and Crowdstrike. Since I am just a student, I can only earn so much from doing part-time work, internships and contractual jobs for these companies (my main 'agreement' is currently Lattice & TSMC Superconductor and Crowdstrike). But even after my degree is done in a couple of years, I will still be in a range of £35,000 minimum while paying off student debt and supporting my family during this cost of living crisis. I understand my fortunate skills and position can earn me high-paying jobs in the future, but regardless of my experience, some things are just how they are; I will need a few years of industrial experience to be earning an average of £75,000 given my potential roles from my practical experience, knowledge and hands-on understanding. I don't spend money on material clothes, or partying here and there, and I try to save where I can. Hoping to get into investments such as stocks and shares or S&P 500s, but alas. A decent enough 4090 here costs at least £2000 given how rare it is/can be to obtain brand new, and I want the parts alongside it to be epic, especially since it's not just for gaming; it supports me in rendering, AI computation, work and study. Hence my question was about the field the YouTuber works in and how their parts were accessible for them (given the history of the channel, they own multiple decent parts collectively, which is awesome!). No gripe here, I just admire people nowadays and wish to learn more, even if it is silly to do this over a YouTube comment. I'd rather love, ask and admire than hate or criticise anything
I agree, I would like to see DLSS support added; I'm kind of interested in the game. If it's an AMD sponsored game, it will never add Nvidia's DLSS. I have an i9-13900K + RTX 4090 running a 240Hz 3440x1440 monitor (~5MP) vs 4K (~8MP); I would presume 60-70 FPS running in Unobtanium and maybe +15 FPS running Ultra. Off the top of my head, I don't know how our CPUs compare to one another in this game. CPU-Z tnx299
I think the difference is you probably aren't trying to run games at 4K 144hz on the 3070, and then getting mad when it doesn't work, like some people are on the internet lol
So I don't have a 4090 or the game, but even if I did I'd have ray tracing off, and as far as 4K is concerned, I don't see any difference between 1080p and 4K, so I don't use it
CPU bottleneck doesn't need to be above 90% usage. That's the old meaning of it when all we had were low core count CPUs. Now, with high core count CPUs, when a game only uses 4 cores, that's only 25% of a 16 core cpu. So the total usage won't show 90% or higher. What shows a CPU limitation nowadays is GPU usage, the GPU slows down to wait for the CPU. Look at the GPU usage slowing down to 50-60% at 1080p, or even 70-80% at 1440p. It cannot work faster due to waiting for the CPU.
Don't know much about this but I would guess it's that most games use between 4-8 cores. So it doesn't matter that he has a 16-core processor because it's only using half at most. Newer CPUs like the 7800X3D have a lower core count (only 8) but higher clock speeds and extra V-Cache to help specifically with gaming.
Not bad for a 5950X, but once you go to the 7000 series of CPUs you get much more FPS. I'm getting ready to play this on my 5800X3D paired with a 4090. Wish me luck 😆
As I said in the video it's a secret settings mode, so you have to add the commands I mention in the video into the launch arguments in the Ubisoft launcher
I find it very frustrating that they hide it and don't include it in the settings. I've already finished the game, and Far Cry was a fun and interesting experience, unlike this one. The game is boring, and there's actually not much to do after completing the 20 levels near the home base
As I mentioned in the video, I think the reason they removed these settings was because even the highest end PCs can barely run them. I do wish they had added them in just for poops and giggles, but, they didn't want people to think the game was heavily unoptimized due to experimental settings
I tried it. And could get it to run at 4K, like 50 fps, with some tweaks on a 4090 laptop. Honestly pointless. There's almost no difference between it and High/ULTRA, EXCEPT for maybe the volumetric fog. Other than that - the settings barely do anything and just eat your FPS
I honestly agree to a certain extent, yeah. I notice a slight difference in the shadows and texture quality as well, but you are definitely correct, it EATS your FPS which for some is just not worth it for slightly better visuals
1080p? A game like this deserves 1440p minimum. Also something's off... it's barely using the CPU and the GPU is only at 60-ish percent. Need to stick that card in a 7800X3D setup...
I love 1080p with high refresh rate! But I'll be getting 1440p 240hz as soon as I get a new CPU :D And yeah, this game and others really show the bottleneck my CPU is causing on the 4090 at lower resolutions LOL
If you check out some of the shots at the end of the video, I do have a couple night time ones! The night time is gorgeous with the bioluminescence, and the light from the planet makes it so bright and beautiful!
The graphics look insane, the world itself looks great. I just couldn't get into the gameplay; I stopped after 3 hours. It's part of Ubisoft's subscription service, so at least I didn't waste $70.
Nice PC specs you have there. How come you get such low framerates? Mine is a 5800X3D, RTX 4090, 32 GB RAM, 990 Pro, X570S mainboard from Gigabyte, 1500 W BeQuiet! PSU, 27" 1440p monitor. So, lower specs than you, but my framerate is like 120 and above..? So... what is it that you are doing wrong here?
You don't have lower specs than him. The 5800X3D is much faster than the 5950X for gaming. That's what the 3D cache is for. The 5950X wins for non-gaming tasks though.
I honestly don't understand why people like FSR & DLSS. It basically gives you fake FPS, and with that technology being the new standard in games it effectively lowers the -real- fps your computer calculates. It makes games look way worse than they could actually look in order to give you a performance boost to an fps rate of like 120, which should be standard without upscaling or frame generation. It's basically reducing your render scale, which you could also do manually by making your resolution lower than what your monitor can do, but it does it a bit better than that. So even though you win performance, the game doesn't look sharp anymore, and that's not the goal when playing games on max settings with a 3000-4000 € computer.

I think it's a very bad thing to have been invented, because it gives developers the excuse of not being forced to fully optimize their games anymore, since 80% of the fps don't have to be calculated anymore anyway because they are fake... This leads to players accepting bad graphics for good fps on a high end computer that should be capable of giving them good fps on the highest graphics possible. Try to change my mind on this, but I don't think you can, since I can definitely see how bad a game looks with DLSS on compared to DLSS off, because it's real fps VS fake fps.

Don't get me wrong, I also see the potential behind all this, and maybe in 20-30 years the technology will reach a point where AI can easily create perfect images that are sharp and without any difference to real calculated FPS, so that we can have like 5x the possible performance, since the computer only needs to generate 10-20% of the frames... but that's the future. The way it is now, and the way it's gonna be for probably the next 10 years, it sucks and only works as an improvised solution for bad performance.

You already pay 3x the price of a high end graphics card for the same amount of performance. When you compare the best card of each generation you will realise that the amount of performance you get for the price you pay has dropped to like 1/3 of what it once was (price-performance ratio). - The 1080 Ti was the best card in 2017 and cost around 800€, while you could play games with 120+ fps on highest settings... - The 4090 is the best card in 2023 for 2200€, but it basically gives the same FPS in games that don't even look significantly better than games did 7-8 years ago + you have to use DLSS on top of that, which makes it worse.
So, in my opinion, your argument is built off of the basis that DLSS makes games look worse, which if you think so, that's fair - I disagree though. I don't notice any visual difference, and in fact, I've used DLSS to fix bad antialiasing some games have. And in fact I spent all this money on my PC for the high frame rate, so whatever helps me achieve that while running max settings I'm happy with! That being said though, I still have my 1080 Ti, even though I don't use it, I absolutely LOVED that card! It was incredible for that time I do agree - but in my opinion - the 4090 is that card for today's generation, I love it!! :D Much respect!
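For context on the "reducing your render scale" point above, here's the rough math for what the upscalers actually render internally at 4K, using commonly cited per-axis scale factors (approximate values assumed here, not taken from the video):

```python
# Rough internal render resolutions when upscaling to 4K output, using the
# commonly cited per-axis scale factors (approximate, and assumed here):
# Quality ~0.667, Balanced ~0.58, Performance ~0.50.
output = (3840, 2160)
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for mode, scale in modes.items():
    w, h = int(output[0] * scale), int(output[1] * scale)
    pixel_share = scale * scale  # fraction of the output pixels actually rendered
    print(f"{mode:>11}: renders ~{w}x{h} "
          f"({pixel_share:.0%} of the output pixels, then upscaled to 4K)")
```

Whether the upscaled result looks worse than native is the subjective part being argued above; the math only shows how much rendering work is skipped.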
Hey thanks! Glad you enjoyed the video! And, the CPU usage being so low is only a result of my 16 core CPU - so I won't see 100% usage even when it's the bottleneck. Most PC games only use about 4 cores, so that's 25% of 16 :D So my 25% usage with 16 cores is like 50% usage to someone with only 8 cores, and so on
So, I'm not sure why the video is only in 1080p, I apologize for that. I tried to record in 4K, but Nvidia messed up and it only recorded in 1080p. Other benchmarks on the channel, now and in the future, will be in 4K if there's 4K footage in the video!
so basically your gpu is way more expensive than your entire pc...... nice :)
GPUs have always been the most expensive part of a PC, or at least one of the most expensive. The only thing that comes kind of close is a CPU. So I'm not really sure what you mean. But yep, I spent a pretty penny on the entire system!
@@MrJays Not always but tend to be these days.
Dude, if your 4090 won't get up to 90%+ even with this Unobtanium-Setting, there's something wrong.
- Maybe it's the programming of the game... or optimization, or something on the hardware side.
I got the 3090 Ti and some games scale strangely, because the impact of going from 2K resolution to 4K resolution is not a 50% performance drop, even though the pixel count more than doubles. For example, in Red Dead Redemption 2 it is only a 10 to 15 frames per second loss from 2K to 4K resolution.
Going from 4K resolution down to 2K the game seems very badly optimized, but in the opposite direction it seems very well optimized, because from 2K up to 4K you only lose 10 to 15 frames per second - c'mon.^^
120fps @ 2K, to almost 105fps AVG in 4K - what is that?^^
I have the AMD Ryzen 9 7950X3D, but for gaming I keep CCD1 mostly deactivated if the game is not that core-count-hungry.
- So let's say I play mostly with an overclocked 7800X3D, because it's running at ~5,400 MHz all-core, so it has higher clocks than the regular one.
So, I mean that the performance must scale in the right direction, and then you can see whether it's the code of the game causing some strange performance behavior, or whether it's down to the hardware. - In most cases.
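Quick pixel math on the scaling argument above (a sketch that assumes "2K" means 2560x1440 and that GPU cost scales roughly linearly with pixel count):

```python
# Pixel math for the 1440p vs 4K comparison discussed above.
res_1440p = 2560 * 1440   # 3,686,400 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels

ratio = res_4k / res_1440p
print(f"4K has {ratio:.2f}x the pixels of 1440p")  # 2.25x

# If the GPU were the only limit and cost scaled linearly with pixels,
# 120 FPS at 1440p would land around 120 / 2.25 ≈ 53 FPS at 4K.
print(f"Naive GPU-only estimate: {120 / ratio:.0f} FPS at 4K")

# Seeing ~105 FPS at 4K instead (as in the RDR2 example above) usually means the
# 1440p number was never GPU-limited in the first place - the CPU was capping it -
# so the drop when adding pixels looks much smaller than the raw pixel ratio.
```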
were you trying to start a fire? lol
Tried those settings on my son's PC. Fantastic first frame. Looking forward to seeing the second frame some time in the next couple of days.
Every frame is a rollercoaster of emotions.
Lmao Gawd this is wins comment of the year
i'm crying with this one lol@@themagustava6662
A fellow toaster owner I see
This comment made my morning lol
This is the kind of game you use to warm up your room in winter after playing for a couple hours.
That's actually what I've been doing while I stream the game LOL It heats up my room so nicely :D
lol me in the morning warming myself with the Alienware, gaming Black Desert maxed out
Or the kind of game you do not even go near in summer
sadly there is no winter here, it's either hot or super hot
I Wake up turn the Pc on. Load up metro exodus..
Go get washed up & make a drink & come back to a warm room xD
I turned specular reflections to "Very High" as that was impacting frame rate massively on Max or Ultra - I saw no change to visuals but was able to play 4K 100fps with DLSS Quality. 👍
Crysis really did push the gaming industry and technology forward in 2007; we can thank Crytek for the quality we have in games now. They set the bar.
My PC still can't run Crysis //s
yeah we need a new one to set a new era
Far Cry and Crysis were more tech demos for their engine development. Shame the engine itself kinda turned into a janky mess over the years, despite them selling off the IPs to Ubisoft and focusing entirely on it. Modern gaming wise, it's gonna be rare to see anything that actually pushes hardware and is fully optimized. The industry has shifted focus over the past decade and a half to target the widest audience possible to maximize profits. This means development will mostly be focused on dated console hardware, rather than the bleeding edge GPUs on the market.
Don't give Crytek any props for pushing anything forward. The reason why "can it run Crysis" is even a meme is because their game engine was so unoptimized that it didn't matter how good your hardware was, it didn't run very well. They didn't even just take shortcuts that didn't work out. They straight up made a bad engine that had the game rendering tessellation in areas the camera doesn't even see, when it was very rudimentary at the time to avoid that problem. Why do you think it's only "can it run Crysis" and not "can it run Crysis 2" or even 3, which had even better graphics? It's because those games were actually optimized. And it's not like no one else was pushing the bar with graphics. Crytek is only famous for being bad.
It used to be id, you needed a 486 for doom, upgrade or fuck off. Then quake came out and you needed a pentium, upgrade or fuck off. Then quake 3, buy a video card with good opengl drivers or fuck off. Then doom 3, better upgrade your whole system or fuck off. Rage was a similar but more complicated story due to consoles. Ah, the good old days 👴🏻
Edit: Very important to know that all of these frame interpolation technologies will almost always increase latency (decreased latency is very rare). It is motion smoothing, not performance boosting. Perception and expectation will be different for everyone, so it is best to test it out for yourself to see if you get a better experience.
2:11 I know I'm late to the party, and not trying to be nit-picky, just trying to get correct information out as a lot of people seem to get these two confused, but it's VERY important to note that AMD Fluid Motion Frames and FSR 3 Frame Generation are *NOT* the same thing.
FSR 3 is AMD's super resolution package that includes individually toggleable/tweakable upscaling and frame generation. This method must be integrated within the game, and has minimum requirements (though it may still work on older GPUs) of AMD Radeon RX 5000+, NVIDIA GeForce GTX 1000+, and Intel Arc+, with recommended requirements being a generation above that, excluding Intel Arc because it's already new enough and a 1st generation product.
AMD FSR 3 has access to UI elements and motion vectors to better prevent garbling and visual quality degradation.
AMD Fluid Motion Frames (AFMF) is driver-level frame generation within the AMD Radeon Adrenalin software for RX 6000+. It was available for a while via a preview driver, but is now available on the release 24.1.1 drivers.
AFMF does NOT have to be integrated within a specific game in order for it to work; the game just has to be DirectX 11/12, and usually V-Sync has to be off.
And unlike DLSS 3 FG and FSR 3 FG, this means it does NOT have access to motion vectors and other game data that can reduce latency or preserve image quality. It also tends to dynamically disable itself when gaps between frames are too large and during fast movement, in order to try its best to preserve that image quality.
To counter this, it's also recommended that those using AFMF use HYPR-RX, which includes a dynamic (bilinear, if I remember correctly) resolution scaling that only toggles itself on during large amounts of motion.
Hope this helps some people out!
Okay, short explanation of CPU/GPU utilization: If you have a set of settings that GPU bottlenecks you (99% GPU util) and that makes your CPU work at like 8% util, then the only logical thing that follows is that if you reduce graphical quality, the CPU utilization increases as your FPS increases, because the CPU has to work for every frame. It needs to simulate the world, and "place" (calculate positions/culling/simulate) all the rendered objects so your GPU knows what to render where. This has nothing to do with "poor" optimization but with how games are simulated. This is exactly why games like CS or Valorant, which have like 20 dynamic entities, can run in the 500s of fps, while games like The Witcher, AC, battle royales and Avatar can only achieve so many fps, because they sometimes simulate up to hundreds of dynamic entities.
Also, the CPU percentage in RivaTuner is somewhat misleading, as it takes all the cores and averages utilization, while maybe the render thread (which drives your fps) is at 100% utilization. Similar problems can occur with GPUs nowadays, because with raytracing involved you have different cores handling different things (CUDA for raster, RT for raytracing etc). This means that if the RT cores can only deliver 60 fps, you might have CUDA cores sleeping, because the 4090 does not have a nice ratio of those for every game ever released.
Also, turning on raytracing increases CPU load, as more calculations are needed to composite the frame.
And by frame I mean a non-generated frame, as FSR and DLSS frames are generated entirely on the GPU.
Anyway, I am happy you have fun with the game, just don't get hung up on the numbers too much.
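A toy model of the explanation above (the per-entity and per-frame costs are made-up numbers, just to show how the slower of the CPU and GPU sets the frame rate):

```python
# Toy model: the CPU pays a cost per simulated entity every frame, the GPU pays a
# cost per frame at a given resolution/settings, and whichever side is slower sets
# the frame rate. Numbers are illustrative only, nothing measured from this game.
def estimate_fps(entities, cpu_us_per_entity, gpu_frame_ms):
    cpu_frame_ms = entities * cpu_us_per_entity / 1000.0
    frame_ms = max(cpu_frame_ms, gpu_frame_ms)  # the slower side is the bottleneck
    return 1000.0 / frame_ms, cpu_frame_ms

for name, entities, gpu_ms in [("CS/Valorant-like", 20, 1.5),
                               ("open-world-like", 600, 8.0)]:
    fps, cpu_ms = estimate_fps(entities, cpu_us_per_entity=15.0, gpu_frame_ms=gpu_ms)
    limiter = "CPU" if cpu_ms > gpu_ms else "GPU"
    print(f"{name}: ~{fps:.0f} FPS "
          f"(CPU {cpu_ms:.1f} ms vs GPU {gpu_ms:.1f} ms per frame, {limiter}-bound)")
```

With the assumed costs, the low-entity game comes out GPU-bound in the hundreds of fps, while the entity-heavy one ends up CPU-bound around 110 fps even though the GPU could go faster - which is the pattern described in the comment.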
The in-game frame gen is using motion vector data to generate frames, like DLSS frame gen. Fluid Motion is an AMD driver-level frame gen that works on all DX 11 & 12 games. Fluid Motion only uses the rendered frames themselves to generate new frames. Fluid Motion is only available on AMD cards, while FSR frame gen works on both AMD and Nvidia.
once you game in 4k there is no going back
I look forward to trying to one day :D
Key words here: "Your Opinion"
Aren't opinions great :D
Don't be fooled by those crazy presets in games, most of the time you won't see much difference between High preset and Ultra but your FPS counter does. Unobtanium is just the unoptimized graphic preset to make people talk about the game.
Yep, the devs admitted that the purpose of it is for the game to be a benchmark for future GPU's years from now.
The difference is definitely noticeable, but the game looks beautiful enough on lower settings too that you won't complain. I wouldn't call it unoptimized, it is just incredibly demanding. If you want to know how an unoptimized game runs, check out Cities: Skylines 2 benchmarks.
i always see a difference bc i have 42" 4k oled
Avatar has RT/RTAO built into the game with no option to turn it off, even at the medium setting. Even the XSS has RT/RTAO. I don't think they had a choice; it's too much data to do baked-in lighting for this game when everything emits light.
You are correct, thank you for letting me know! I was going with it being determined by settings because the in game graphics menu says it's raytraced when set to ultra
this is exactly why i just got a ps5, demanding pc games can't even run properly on a 4090, so what's the point in spending thousands just to run these games with less performance than a console
4090 perfect experience guide:
Step 1: unlock unobtanium settings.
Step 2: download the DLSS frame generation mod with the DLSS depth fix.
Step 3: set everything to max, motion blur to off, specular reflections to ultra, and environmental reflections quality to high. Max doesn't change anything for either of these two settings and has a performance hit. (You can gain about 7-8 fps putting these 2 settings 1 step down with 0 visual sacrifices, literally - not like the eye-blind people who say that settings that do make a difference make none.
These 2 settings are not currently working properly; it's free performance guys, just take them 1 step down.)
Set DLSS to balanced, and enable DLSS frame generation with preset C through the ReShade command of the mod.
Enjoy an unobtanium settings 90-110 fps 4K experience that looks exactly like native.
ah yes, let's heavily rely on DLSS with a 2.5K-ish video card. Braindead Ngreedia buyers.
unobtanium settings super 25 FPS 👍 where Crysis came out on Ultra 12 FPS with the high-end hardware of the time 😉
Found the only person who likes this game
Nice test! I noticed that by reducing the game details slider from 25 to let's say 0 or 5, you pick up a massive boost in fps, without that much of a visual difference. It's like the name says; only details. I am playing this on my 3060 and 4060 laptop, which still looks amazing with a couple of tweaks. I do have some loading issues on the desktop 3060 computer, but that's because of the VRAM (12GB) and normal RAM limitations (16GB). Still playable.
Thank you! I have noticed that as well in other Ubi titles! The extra details slider is typically the non-raytracing option that reduces the FPS the most, besides shadows! I do love those extra details, but sometimes they can strain your GPU so much that you lose like 40 FPS!
@@MrJays exactly, and that's too much. Especially on lower-end GPUs like mine
I don't understand your CPU bottleneck claim. The CPU works at like 20% of its capacity, and the GPU isn't actually maxed out either from what I saw on the 1080p test. This game at 1080p is in fact not 100%-ing your hardware at any point at all, for any sort of "bottleneck" to appear. But still... Isn't a CPU bottleneck when the CPU hits 100% utilization "first" and the GPU stays >below< that at all points? For example, how I know my CPU of choice (and financial capabilities) has a little bit of GPU "room to grow" is that when I play the game that I play with V-Sync on (I am not telling at what refresh rate for a 1080p monitor because it will make you sad :P), the CPU is at like 15~20-30% and the GPU is at like 45~52%... When I uncap (remove the V-Sync), the GPU hits 95%+ (a.k.a. does all that it can) and the CPU still goes up to a respectable 60%. So I am thinking if I put in a GPU a grade or few above, I'd hit something like 90%+ CPU at 100% GPU. If I "overkill" the GPU upgrade and the CPU becomes the bottleneck, it'd be something like 100% CPU, 80% GPU (I have been there before... because I had a CPU much, much, much older than the GPU, and they were in theory both at 100/100 on the same game (for a lower result than the 20/45 capped on my current hardware), but in reality, if the CPU had been newer the GPU would probably have juiced some FPS on top out of its 100%). Why would I buy a PC that plays the game that I want with 20% CPU and 45% GPU utilization... the way I've always wanted to... Well, there are many reasons. It is worth it. Also the game has moments with like up to 5-6 times the requirements of 85%+ of the gameplay time (varying with personal preference and choices).
So, in some cases you are correct, using 100% of your CPU and not using 100% of your GPU is a CPU bottleneck, but that only typically happens with low core count CPUs in my experience, like on my old i7 6700k with only 4 cores and 8 threads. But, on CPUs with a high core count like my current 5950x with 16 cores and 32 threads, I only ever see it hitting MAYBE 75% on games like Division 2; everything else is lower. But I know I am still CPU bottlenecked when my GPU usage is less than about 80%, meaning the GPU is not using its full capabilities because it's waiting for the CPU to catch up. So even though I was only using about 20-25% CPU, the GPU drops to about 50-70% because it is MUCH faster than the CPU is. It's also the fault of game developers not truly supporting high core count CPUs. They only account for about 4-8 cores in most games, so CPUs with more than that just have half of the CPU idling while the other half is struggling to run the game. Hope this helps :D
Don't listen to what OP says. He has 0 clue what he is talking about...
If an 8-core CPU has 1 core at 100% and 7 cores at 0%, it's a CPU bottleneck, even though the CPU is at 12.5%. This is because 8 x 12.5 = 100%. 2 cores at 100% would also be a CPU bottleneck even though the CPU is at 25% of its capacity. 4 cores at 100% would also be a bottleneck even though the CPU is at 50% capacity. The CPU in this video has 16 cores so a CPU bottleneck could easily be at less than 20% capacity. The overlay in the game isn't showing a breakdown of what each core is doing so you can't judge on whether there's a bottleneck or not. Although it did look like there was in places when the GPU usage dropped to the low 80s.
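To make the arithmetic in this thread concrete, here's a tiny sketch (the per-core numbers are hypothetical, just to show how a 16-core chip can read ~25-30% overall while the handful of cores the game uses are maxed):

```python
# The arithmetic from the comment above: overall CPU % can look low even when
# the few cores the game actually uses are pegged.
def overall_usage(per_core):
    return sum(per_core) / len(per_core)

# Hypothetical per-core numbers for a 16-core CPU running a game that loads ~4 cores:
per_core = [100, 100, 95, 90] + [5] * 12
print(f"Overall usage: {overall_usage(per_core):.1f}%")   # ~27.8% - looks almost idle
print(f"Busiest core:  {max(per_core)}%")                 # 100% - the real limit

# And the 8-core example from the comment: 1 core at 100%, 7 at 0% -> 12.5% overall.
print(f"1 of 8 cores maxed -> overall {overall_usage([100] + [0] * 7):.1f}%")
```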
It's kind of ridiculous and insulting to the end user to include these options knowing that most setups wouldn't be able to use them. What's worse is that even with all the settings turned up to max, the game isn't jaw-droppingly beautiful. If you'd never seen the game before, you wouldn't walk away saying "OMG, I totally see where all that processing power went." The game looks unimpressive, IMHO, at any setting.
Well, I think that's why these Unobtanium settings were removed and can only be accessed through the code in the launcher: because they knew it wouldn't be - no pun intended - obtainable for most modern PCs right now lol. But also, I will say that during my time with the game, and other friends of mine agree, we've all seen views that made us go "WOW, that is gorgeous!" I think it's something you need to see for yourself, maybe? It's like the first time you play Ghost of Tsushima and you see those big fields of grass all flowing in the wind with the leaves. It's just like that feeling for me :D
That GPU usage can be due to an engine limitation or poor multicore optimization; I don't think the game can bottleneck that CPU that much.
I thought a game was cpu bottlenecked when the cpu was too slow to give the videocard resources to draw more frames. Usually that would happen at low settings, not max or near max settings. Or if there was just a terrible mismatch between a weak cpu and a fast video card.
For example if you ran this game at 640x480 lowest settings, on a 2080 and a 4080, and get the same frame rate for both, that would be a cpu bottleneck, even if it was like 300fps. And you know this because the 4080 is undoubtedly an order of magnitude more powerful.
Yes, but even if you put in the most powerful CPU we have now, it won't make a big difference in this scenario. If he showed per-core usage, it wouldn't be using all of the cores; judging by the usage, I would assume the game uses maybe 6 cores at most. @@tnutz777
I do not think it is a CPU bottleneck; I think a part not shown on the graph is limiting it, for example system RAM throughput, which just getting a better CPU will not fix. RAM is rarely the issue in games, but the reason I am saying system RAM is that the entire game should have loaded into RAM by that point, so it is either the GPU, the CPU, or the RAM. You didn't run out of capacity, but possibly of throughput. It would be interesting to see how different system RAM speeds with the same CAS latency affect performance.
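For what it's worth, a rough sketch of the theoretical system-RAM bandwidth figure behind this kind of claim, using the usual formula (transfer rate × 8 bytes per 64-bit transfer × channels); the kits shown are just examples, not the one in the video:

```python
# Rough theoretical peak bandwidth for DDR memory:
# MT/s * 8 bytes per 64-bit transfer * number of channels.
def peak_bandwidth_gb_s(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # GB/s

print(peak_bandwidth_gb_s(3200))  # DDR4-3200 dual channel -> ~51.2 GB/s
print(peak_bandwidth_gb_s(6000))  # DDR5-6000 dual channel -> ~96.0 GB/s
```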
Too many people confuse necessary VRAM with ALLOCATED VRAM. Just because it’s allocating 20 GB doesn’t mean it’s using all of that.
Very true!
I honestly don't notice the difference, and most people don't stop at every single leaf to see the quality difference, which is why I think combining ray tracing into the quality presets is dumb. They should have given it its own tab, because from experience RT is never worth the hit to performance.
I do agree that the graphical differences are very subtle, and that games should offer an RTX tab so players can toggle it only if they want to. Only bad part about that is it means devs will have to place all the lights in the game manually, then remove them if the player turns on RTX; essentially making even more work for them when creating levels
video is in 1080p bro..
This game doesn't even look that good to begin with. The low FPS at 4K with Unobtanium settings is more the result of a lack of optimization. A 4090 can do so much better than that.
Motion of the foliage doesn't look very realistic though. Hope that improves a bit in later updates.
I just unlocked the "Unobtanium" setting on my 14900K/4090 MSI Liquid Suprim/GSkill 64GB of 6000MT/s RAM. My goal is to hit 4K Full UnOb settings 60fps.
Turned on FSR3 with frame gen and removed motion blur (yuck) and ran the benchmark to an average of 60fps, with lows dropping to 55-56fps.
Loaded my save and the game will maintain 60fps most of the time, but if you go look at water with RTX reflections on UnOb, the FPS will drop to the low 50's. I switched back to full ULTRA with FSR3 w/frame generation and the game is a solid lock at 60fps. I switched back and forth between Ultra and UnOb and the difference is so subtle it is not worth the fluctuating frame-times. Running Ultra+. Cranked up the texture density from the UnOb settings and the game will use 22GB of my 24GB of VRAM. Zoinks! I also added UnOb particle effects, volumetric clouds, shadows and a few other settings from UnOb. My current game is a near rock solid 60fps just running around in awe at Avatar's visual beauty, with a slight dip to 56-58fps here and there. With HDR on, I swear shots in this game look like they're straight from the films. Pure Sublime Eye P**n!!!
That's awesome to hear! This game is amazingly gorgeous and I totally agree, I was awe struck by its beauty constantly! :)
I sold my PS5 for a new-gen Xbox console; now I regret it, Xbox exclusives are boring.
Too bad, you did not upload the video in 4K.
Gotta admit, this is by far the best and most fitting setting designation ever
LOL
I play it on ultra settings with FSR 3 frame gen on ultra quality, and playing with a controller it feels really smooth on my RTX 4070. Definitely the most demanding but most beautiful game I ever played. (Note: I noticed that putting a frame limit in the Nvidia Control Panel 2 frames below your monitor's refresh rate makes the game feel a lot smoother with frame generation on. I put mine at a 142 fps frame limit because the game started to feel stuttery when going above 144 fps.)
Your CPU is seriously bottlenecking your GPU at every resolution, especially 1080p. An RTX 4090 is not meant for the Ryzen 5000 series. You should upgrade to a Ryzen 9 7950X3D, then benchmark it again; you'll see a massive improvement. And lastly, FSR is useless in your case because FSR works more efficiently only on AMD GPUs. I know because I own one. And yes, the game needs more patches for a smoother experience.
Looks like it runs good to me. Been flawless on my 4090 + 13900ks.
Same here! Looks incredible, and runs amazingly! Glad you're having a good time with the game also!
"Looks like it runs good to me" "4090" Hmmm... I wonder why that is 🤔
@@2hotflavored666 The thumbnail has a 4090 in it and the title states that your PC isn’t ready. Get it?
@@solidus317 The title is just clickbait. You would know that if you watched UA-cam for more than 2 months. Get it?
@@2hotflavored666 yet you reply negatively to a comment while not using the same knowledge/logic that you claim to have had. Adds zero value. Screams “I have nothing even close to a 4090 and I’m salty”.
I've got every setting maxed out on my 7900XTX and AMD's FSR3 and FMF on....100+FPS at 3440x1440.
Dang that's awesome!! I'm happy the game runs so smooth for you! :D
I have a 4090 on a 3440x1440 screen. Unobtanium gives 45fps and is literally the same as High except for darker midtones in shadows, and High is 120fps. Don't mistake Unobtanium for a next-gen setting; it's literally like High.
So, I will say that they did a great job at making the lower settings look really good in this game; the differences between the higher and lower settings are more subtle (that's awesome). But there definitely are differences between Unobtanium and even Ultra settings, they are just *super* hard to see unless you are looking for them. Definitely not worth the major performance impact for some people, but I wouldn't say it's comparable to High settings, personally.
This is no longer a game, it's real-time movie-level graphics
this is like the mode in doom eternal where it supports 1000FPS. it is there mostly for the future, if someone wants to play this thing decked out in 5 years, here you go!
why is the video only in 1080?
That doesn't look that good, though. Ubisoft is so late on graphics, and the gameplay is pretty bad. Red Dead Redemption 2 looks better and it's a 2019 game on PC; Star Citizen looks better; Death Stranding looks better. This is so bad for an end-of-2023 game.
Tru
It is impossible to see the difference. The video is uploaded at 1080p while trying to show footage at 2160p.
My pc doesn't want any Ubisoft copy/paste games.
It may be using a lot of VRAM simply because it has it available, for extra texture streaming or whatever. That does not necessarily mean it will not work well on cards with less VRAM, like a 16GB 4080 for example. Cool video though! I remember running this on my laptop 4090 and it brought it to its knees, haha!
That is true, but I think even if it were using the extra vram for texture streaming that would cause the game to either stutter like crazy or run at a much lower FPS while it waits for the VRAM and RAM to clear - that's what games like Hogwarts Legacy did. And while everything ran like crap, the textures were super blurry
@@MrJays Possibly, but it's not always the case. Some games use what's available for things that may not bring any tangible benefit. I remember a COD from like 10 years ago that had a feature to use basically whatever amount of VRAM you had. When I had an 8GB 3070 laptop, I never had any significant issues in Hogwarts at 1600p with higher textures.
CPU bottleneck for sure. The next gen cpu's will take gaming to another level.
Someone having a 4090, and playing in 1080p... would be an incredibly rare hardware mismatch.
Something is strange about your benchmarks though. You're not hitting max utilization of gpu or cpu either one. Not even close. I'm guessing the game is just horribly optimized and is really cooking just one or two cores on your 5950X. Single core performance just never seems to stop being relevant.
Nope, the game isn't horribly optimized at all; what you're noticing is the effect I mentioned in the video of being CPU limited, or CPU bottlenecked. With a high core count CPU, you don't need to see 100% CPU usage to bottleneck the GPU. So I only saw proper GPU usage, and the bottleneck was removed, when I went to the higher resolutions. It's what happens when games don't fully utilize a CPU with 16 cores, but only something like 4 cores.
@@MrJays *nod* , I just meant that it not utilizing more cores is less than ideal if it's going to bottleneck on the cpu like that.
In general, new games built on the latest engines are very badly optimized though. There's a lot of 'copy-pasta' coding, and a lack of expertise in implementing "close to the metal" tech in DX12. Close to the metal is theoretically capable of greater efficiency, but it's more complex on the coder side of things, and for the most part it's not leveraged well. You see it a lot in UE5 games especially. Though this game looks like it has its own engine.
It feels like game developers are leaning hard on FSR or DLSS to save them the work of coding more properly.
5950X is slightly slow on single core performance (paired with a 4090 anyway) 7950X or 7800X3D would probably help "some" with cpu bottleneck. but probably by only 20% or so.
Heavy use of ray tracing seems like a bad idea in general too. Sure the new cards can actually run RT at playable frame rates, but it's still a huge resource hog, so using it excessively is still going to tank fps. Cyberpunk at truly max settings kills fps, and it's mostly the RT's fault.
At least 4090-crushing and 4nm cpu-crushing games are mostly limited to extreme settings though.
I dunno :)
@@kathrynck I do agree that there have been a lot of incredibly unoptimized games this year, and in the past few years as well, yeah. And I really wish we had more games that fully utilized a high core count CPU, that would be awesome!
That's been the biggest thing for me with games: when the CPU won't go past ~35% usage, but my frames drop to 50 FPS lol.
And some devs are definitely using FSR and DLSS to mask terrible optimization, you are 100% right - but I gotta say - I am a huge fan of raytracing and pathtracing lol. As someone who loves the best graphics, I will take that hit, and just turn on frame generation :D
@@MrJays Hehe, fair enough :) You've got a 4090, so you can spoil yourself with RT/PT pretty well hehe.
I got a 4090 as well, but I mainly got it for a move to 4k. So I don't run into cpu bottlenecks too much, cuz 4k really tortures the gpu ;) I did go 7900X for the cpu though, looking to max out on single-core performance. Intel might still have a slight edge on single core, but I really needed more pcie lanes than intel is willing to include. I gotta say that all the new 2023 hardware is really heavily overjuiced & overclocked at factory settings. I de-tuned both slightly in my system. Comparing them apples to apples with non-OC older hardware is a little unfair since they're running so deep into the redline out of the box. I found both my 7900X and 4090 perform _better_ with undervolt, but i took that edge away with lower thermal max's, cuz I want them to last.
Played the game during the holiday and it looked amazing on the Xbox Series S and on PC at max settings. Ubisoft really made one of the most beautiful games I've ever played.
I totally agree! It's absolutely stunning!
How is that a CPU bottleneck if it only uses 20%
100% usage is considered a bottleneck for both the CPU and the GPU
When you have a high core count CPU - higher than 8 cores - you won't see 100% usage during a game, because games are typically optimized for 4 or 8 cores. So on my 5950X I don't see more than 60-70% usage at the absolute highest end. And you can tell it's a CPU bottleneck when the GPU usage drops below about 80%. In this video, at the low resolutions, the GPU hardly hits 90% usage and averages around 50%, because it's just waiting for the CPU to finish.
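A toy sketch of the rule of thumb being described here, with the ~80% threshold treated as an assumption rather than a hard rule (a frame cap or V-Sync also lowers GPU usage, so the check only makes sense uncapped):

```python
# Naive rule of thumb, not a real profiler: if the frame rate is uncapped and
# the GPU sits well below full utilisation, it is probably waiting on the CPU.
def likely_cpu_bound(gpu_util_percent, fps_capped=False, threshold=80):
    if fps_capped:
        return False  # capped/V-Synced GPU usage tells you nothing here
    return gpu_util_percent < threshold

print(likely_cpu_bound(55))  # True  -> e.g. the low-resolution runs described above
print(likely_cpu_bound(98))  # False -> the GPU itself is the limit, e.g. at 4K
```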
@@MrJays Check single-core performance; if it's not at 100%, it's underperforming, not bottlenecking. I have an old CPU with the latest GPU, all cores at max, only producing 80 fps at 1080p. So I changed to a 4K monitor and it still renders around 80 fps, so I didn't change the CPU. So upgrade to a better resolution instead of spending more on a CPU.
My pc is not ready for sure, my 1060 is melting right now and this is just a video
I just got a 4090 Nvidia card and upgraded to an Intel i9-14900K, and this would run like butter on it, but the only issue is I am NOT into Avatar movies.
Thanks for this video, really great and a brilliant explanation.
You're very welcome! Glad you enjoyed! Thank you for watching! :)
Since I'm watching this via a UA-cam video, I gotta ask: did you find this Unobtanium mode much, much better? Is the performance cost at 1440p justifiable?
I must say that it's a perfect game for 1440p, but I would not run the Unobtanium settings with anything below a 4090 or 7900 XTX. I play at 1440p ultrawide, and because the Unobtanium settings are not optimized I have had crashes, so I find myself keeping just Ultra on. Hope this helps, as the game is amazing either way.
So, in my experience at 1080p, I can say that the game looks stunning without the Unobtanium settings and gives an extra, like, 30 FPS! So that is great! But I can say that if you want *slightly* higher texture quality, or better shadows, without caring about your frame rate, then it is worth it. But as another comment said, they are very hard to run, and can cause crashing.
Overall thoughts - Eh, not really needed, but still nice to have if someone wants to run them lol
@@MrJays gotcha!
No. Barely any difference. Disagreeing with the guy here.
I have a 16GB 4090 laptop and I can run Unobtanium with some tweaks - DLSS ultra quality, 4K, FG on - at about 50 fps. There's barely any difference besides maybe the volumetric fog.
Also he brings up Lords of the Fallen and that's just unoptimized trash, UE5 is also known to be very poorly optimized.
@@joshuam4993 totally respect your opinion! I can agree there isn't much difference unless you are specifically looking for the differences. But while I do agree UE5 games tend to have issues, Lords of The Fallen ran pretty great for me in my benchmark video! :D
It's not CPU bottlenecked. This guy doesn't know what he's talking about.
And u don’t listen @3:00
If you are saying you can only be CPU limited when a CPU is being pushed to 100%, that is false. If a game only uses 4 cores, the CPU will show very low total usage, like it did in the video, but those 4 cores will be pushed to 100% usage, and that will drop your GPU usage because the GPU is waiting on your CPU most of the time. Just like in the video, you can see my GPU usage dropping down to 70 or 80% even in 1440p, because the 4090 is waiting for the CPU - that's a CPU bottleneck. A CPU with faster cores would give better FPS due to each core being faster, so those 4 cores the game wants to use would be able to produce more frames. Or if games utilized more CPU cores, that would typically give better performance as well.
Will need a 5090 lol
Oh man.. games in 10 years ? WILL BE WILD !
So we have games continuing to push VRAM demands while Nvidia cranks out midrange GPUs with 8-12GB of VRAM on them. We are entering an era of gaming where people are going to miss the days of SLI. You could get two RTX 4070s with a combined 24 GB of VRAM for about half the cost of an RTX 4090.
Honestly, I'd love it if SLI came back, but I don't think it will. We can create more powerful single GPUs than two GPUs combined, and game devs aren't optimizing for it anymore
Imagine, playing it in VR, you’ll be on PANDORA 😭
A dinosaur death animation glitching out and stuck in mid air was maybe not the best clip to pick for hyping up the game
What clip are you talking about?
Cheers mate! Thanks for the review. Just got a 4090 installed today and looking for titles to throw at it. Why did you upload in 1080p? Please, 2K minimum, I can't see shiat 😆🤙
Thanks for watching the video! And I hope you enjoy your 4090! I love mine! Also I apologize for the video being 1080p, I tried to record the footage at 4K, but Nvidia saved all the footage at 1080p, for some reason
@@MrJays It saved it in 1080 because you forgot to set it to 4K 🤪
Never mind. I already heard Avatar looks nice but the game itself is bad, so I don't want to buy it anymore. So sad they missed the chance to make something better with such stunning graphics.
Running it on my 3090 with all the settings turned to high, it looks so good. Love this game, I hope they make some DLC or work on a second game.
They have a season pass on the Ubisoft store for the game that says they have two DLCs planned! So let's hope they're good!
As an intel integrated user the title hits hard.
How can you say you are CPU limited but the CPU usage is only hovering around 16 - 20% ? GPU usage was around 79 - 80 something %
High core count CPUs don't bottleneck at 100% usage while gaming, unless the game is crazy unoptimized. I have 16 cores, so if a game only uses 4 cores, that's 25%; so 20-25% usage means it's using all 4 of the cores the game can use, and that's why the GPU usage wasn't above 80-90% like it should be.
I believe my RTX 2080 Super should be able to handle this at medium settings at 1080p
Why are you running in Borderless Fullscreen? Usually exclusive fullscreen skips some windows management code paths and has better performance and stability overall...
I always run borderless fullscreen since I typically stream my gameplay, I need to alt tab way more than others would. But from what I've noticed in the last few years, fullscreen doesn't offer any more FPS improvements over borderless fullscreen; best case scenario I've seen is it reduces stutters a bit in some games
Borderless fullscreen doesn't have any disadvantages.
1.Capturing Gameplay in 4k with maximum details.
2. Upload to UA-cam in 1080p.
3. ???
4. Profit
I apologize, I tried to record in 4K, but Nvidia only recorded in 1080p for some reason :/
In the future, benchmarks will be in 4K if they include 4K gameplay, just as they normally do on the channel!
MrJay: We're going to compare normal to Unobtanium settings in a video
Also MrJay: I uploaded this in 1080p. Have fun.
I normally do record in 4K, and I tried to for this video too, but Nvidia had an error and only recorded in 1080p. Other benchmarks and videos that have 4K footage will be in 4K on the channel, as usual.
Nice game. Good video. Keep up the good work🎉
Thank you so much for the kind words! I am glad you enjoyed the video! Comments like these make my day! :)
CPU at 22%... This dude: we are definitely CPU limited. What the fuck are you talking about...
If you are assuming you can only be CPU limited if a CPU is being pushed to 100% that is false. If a game only uses 4 cores, the CPU will show very low total usage, like it did in the video, but those 4 cores will be pushed to 100% usage, and that will drop your GPU usage because it's waiting on your CPU most of the time. Just like in the video you can see my GPU usage dropping down to 70 or 80% even in 1440p, because the 4090 is waiting for the CPU - that's a CPU bottleneck. A CPU with faster cores would give better FPS due to each core being faster, so those 4 cores the game wants to use would be able to produce more frames. Or if games utilized more CPU cores, it would give better performance as well, typically, or at least it would show more utilization for the CPU
My 1440p monitor can only hit 75hz with freesync. I also can only focus with one eye at a time. So visual looks are not a big deal to me. Have a 5800x and a 7800 xt. Probably start with high and see how it goes.
What a massive change 😲
Hey, out of curiosity, if it's OK to ask, what do you do to earn income aside from UA-cam/Twitch/streaming? Sometimes I wonder what people do to afford such amazing PC builds 😅
Not to be disrespectful! I really hope to grow up and have the income to afford such beauties for me and my siblings so we can game together. I feel like it would be awesome, but with how the world and economy are going, it's so hard😢
A pc with this GPU to build yourself should cost around $3k. You don't need an amazing job to afford that, especially if you save, stop buying dumb stuff, wait for sales, etc. Also not sure what job you have but if you are working an entry level retail/food service job I highly recommend getting a trade job. Look for landscaping, electrician, plumber, construction work in your area and shop around. If you find the right boss/company/mentor you can very quickly learn the trade and you'll make considerably more than working say retail or food service.
@charismatic9467 Thanks for the response!
I live in the UK, and job scarcity is a very big thing.
I am a student currently studying CompSci and hardware engineering, involved in building FPGAs for companies such as Kone Industries and Hynix (memory), and in software and cyber security engineering for businesses such as Meta and Crowdstrike.
Since I am just a student, I can only earn so much from doing part time work, internships and contractual jobs for these companies (my main 'agreement' is currently Lattice & TSMC Superconductor and Crowdstrike)
But even after my degree is done in a couple of years, I will still be in a range of £35,000 minimum while paying off student debt and supporting my family during this cost of living crisis. I understand my fortunate skills and position can earn me high-paying jobs in the future, but regardless of my experience, some things are just how they are: I will need a few years of industry experience to be earning an average of £75,000, given my potential roles, practical experience, knowledge and hands-on understanding.
I don't spend money on material clothes, or partying here and there, and I try to save where I can. Hoping to get into investments such as stocks and shares or the S&P 500, but alas.
A decent 4090 here costs at least £2,000 given how rare it is/can be to obtain brand new, and I want the parts alongside it to be epic, especially since it's not just for gaming: it supports me in rendering, AI computation, work and study. Hence my question was about the field the UA-camr works in and how their path was accessible for them (given the history of the channel, they own multiple decent parts collectively, which is awesome!).
No gripe here, I just admire people nowadays and wish to learn more, even if it is silly to do this over a youtube comment. I'd rather love, ask and admire than hate or criticise anything.
I agree, I would like to see DLSS support added; I'm kind of interested in the game. But if it's an AMD-sponsored game, it will never add Nvidia's DLSS.
I have an i9-13900K + RTX 4090 running a 240Hz 3440x1440 monitor (~5MP). Versus 4K (~8MP), I would presume 60-70 FPS running in Unobtanium and maybe +15 FPS running Ultra. Off the top of my head, I don't know how our CPUs compare to one another in this game.
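The pixel-count comparison in that comment works out roughly like this (quick illustrative arithmetic, nothing specific to this game):

```python
# Pixels per frame for the two resolutions being compared.
def megapixels(width, height):
    return width * height / 1_000_000

uwqhd = megapixels(3440, 1440)   # ~4.95 MP
uhd_4k = megapixels(3840, 2160)  # ~8.29 MP
print(uwqhd, uhd_4k, uhd_4k / uwqhd)  # 4K pushes roughly 1.67x the pixels
```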
CPU-Z tnx299
I think you would get more FPS with that kind of setup, bro, just saying 😅
I got a 3070 and so far there hasn't been a game I couldn't run on ultra.
I think the difference is you probably aren't trying to run games at 4K 144hz on the 3070, and then getting mad when it doesn't work, like some people are on the internet lol
I think you weren’t upscaling and only using frame gen at native resolution.
I do believe I was yeah, I must've forgotten to change that. But that just makes it all the more impressive :D
I play this on my 4070ti using the Ultra preset with FSR 3 Frame gen and I get an average of 100 fps which is just fine for this style of game.
That's awesome! :D
So I don't have a 4090 or the game, but even if I did, I'd have ray tracing off. And as far as 4K is concerned, I don't see any difference between 1080p and 4K, so I don't use it.
Correction: my bank is not ready for AVATAR'S UNOBTANIUM MODE. lol
LOL I feel that, it's getting really expensive
I have a 5950X and a 4090, so now I don't need to do this benchmarking, thanks! I dialed in the settings to play well at 100-120fps on my 144Hz 4K display.
How is that using 15% CPU is a CPU bottleneck while the GPU is reaching almost 90%? 🤔
Dude, how the f are you getting a CPU bottleneck at 20% usage.....
A CPU bottleneck doesn't need to be above 90% usage. That's the old meaning of it, from when all we had were low core count CPUs. Now, with high core count CPUs, when a game only uses 4 cores, that's only 25% of a 16 core CPU, so the total usage won't show 90% or higher. What shows a CPU limitation nowadays is GPU usage: the GPU slows down to wait for the CPU. Look at the GPU usage dropping to 50-60% at 1080p, or even 70-80% at 1440p. It cannot work faster because it's waiting for the CPU.
Not using the CPU is not good; it was always good when balanced. A minimum of 60%+ CPU usage was always perfect.
Can someone please explain to me why 20% of CPU used is classed as 'bottlenecking'? There is still another 80% left?
Don't know much about this, but I would guess it's that most games use between 4-8 cores. So it doesn't matter that he has a 16-core processor because the game is only using half at most. Newer CPUs like the 7800X3D have a lower core count (only 8) but higher clock speeds and extra V-Cache to help specifically with gaming.
@@Ku5hFi3nd Thanks
So unobtanium just means under 60fps close to 45fps. Come on developer, pull that 4090 to 25fps.
Not bad for a 5950X, but once you go to the 7000 series of CPUs you get much more FPS. I'm getting ready to play this on my 5800X3D paired with a 4090. Wish me luck 😆
I opened the game and I don't have that settings mode. I mean I can't select Unobtanium; the maximum is Ultra. So what do I need to do, guys?????
As I said in the video it's a secret settings mode, so you have to add the commands I mention in the video into the launch arguments in the Ubisoft launcher
I find it very frustrating that they hide it and don't include it in the settings. I've already finished the game, and Far Cry was a fun and interesting experience, unlike this one. The game is boring, and there's actually not much to do after completing the 20 levels near the home base.
As I mentioned in the video, I think the reason they removed these settings was because even the highest end PCs can barely run them. I do wish they had added them in just for poops and giggles, but, they didn't want people to think the game was heavily unoptimized due to experimental settings
@@MrJays I tested it but could not see any difference. The game was running fine without any issues.
I tried unobtanium mode on my XTX at 1440p and only got around 100fps. But it looks SO beautiful 😍
That's awesome! Glad it ran great for you! :D
I'm loving the game too!
I tried it, and could get it to run at 4K at like 50 fps with some tweaks on a 4090 laptop. Honestly pointless. There's almost no difference between it and High/Ultra, except for maybe the volumetric fog. Other than that, the settings barely do anything and just eat your FPS.
I honestly agree to a certain extent, yeah. I notice a slight difference in the shadows and texture quality as well, but you are definitely correct, it EATS your FPS which for some is just not worth it for slightly better visuals
1080p? A game like this deserves 1440p minimum. Also, something's off... it's barely using the CPU, and the GPU is only at 60-ish percent. Need to stick that card in a 7800X3D setup...
I love 1080p with high refresh rate! But I'll be getting 1440p 240hz as soon as I get a new CPU :D
And yeah, this game and others really show the bottleneck my CPU is causing on the 4090 at lower resolutions LOL
"oh we see cpu bottleneck" meanwhile cpu sleeps xD
I have a 16 core CPU, that means CPU bottlenecks look different from an 8 core CPU. You don't need 100% usage to have a bottleneck.
What about visuals at night time? Does this game have night? If yes, do things glow at night like they do in the movie?
If you check out some of the shots at the end of the video, I do have a couple night time ones! The night time is gorgeous with the bioluminescence, and the light from the planet makes it so bright and beautiful!
The graphics look insane, the world itself looks great. I just couldn't get into the gameplay. I stopped after 3 hours. At least it's part of Ubisoft's subscription service, so I didn't waste $70.
A setting for those who don't play the game; it's for photographers.
You were not lying: 20 FPS on a 3700X + 3070 Ti and 32GB RAM at 4K.
Yeah, a lot of modern games will push your hardware to its limit nowadays!
All games should have extra settings to push the hardware. It would be even better if games could push hardware down to 5fps or less.
My non-existent PC is ready for everything and nothing at the same time 😂😅😢
Nice PC specs you have there.
How come you get so low framerates?
Mine is a 5800X3D, RTX 4090, 32 GB RAM, 990 Pro, X570S mainboard from Gigabyte, 1500 W BeQuiet! PSU, 27" 1440p monitor.
So, lower specs than you, but my framerate is like 120 and above..?
So...what is it, that you are doing wrong here?
CPU is slow
@@alphanecha I would think a 16 Core CPU is faster than an 8 core, wouldn't you agree?
You don't have lower specs than him. The 5800X3D is much faster than the 5950X for gaming. That's what the 3D cache is for.
The 5950X wins for non-gaming tasks though.
@@Nayah9 You don't say! ^^
I honestly don't understand why people like FSR & DLSS. It basically gives you fake FPS, and with that technology being the new standard in games it effectively lowers the -real- FPS your computer calculates. It makes games look way worse than they could actually look in order to give you a performance boost to an FPS rate of like 120, which should be standard without upscaling or frame generation.
It's basically reducing your render scale, which you could also do manually by making your resolution lower than what your monitor can do, but it does it a bit better than that (see the render-scale sketch below this comment).
So even though you win performance, the game doesn't look sharp anymore, and that's not the goal when playing games on max settings with a 3000-4000 € computer.
I think it's a very bad thing to have been invented, because it gives developers an excuse to no longer fully optimize their games, since 80% of the FPS doesn't have to be calculated anymore anyway, because it's fake...
This leads to players accepting bad graphics for good FPS on a high-end computer that should be capable of giving them good FPS at the highest graphics possible. Try to change my mind on this, but I don't think you can, since I can definitely see how bad a game looks with DLSS on compared to DLSS off, because it's real FPS vs fake FPS.
Don't get me wrong, I also see the potential behind all this, and maybe in 20-30 years the technology will reach a point where AI can easily create perfect images that are sharp and indistinguishable from really calculated frames, so that we can have like 5x the possible performance, since the computer only needs to generate 10-20% of the frames... but that's the future.
The way it is now, and the way it's going to be for probably the next 10 years, it sucks and only works as an improvised solution for bad performance.
You already pay 3x the price of a high-end graphics card for the same amount of performance. When you compare the best card of each generation, you will realise that the amount of performance you get for the price you pay has dropped to like 1/3 of what it once was (price-performance ratio).
- The 1080 Ti was the best card in 2017 and cost around 800€, while you could play games at 120+ fps on highest settings...
- The 4090 is the best card in 2023 for 2200€, but it basically gives the same FPS in games that don't even look significantly better than games did 7-8 years ago + you have to use DLSS on top of that, which makes it worse.
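For context, a small sketch of the render-scale arithmetic the comment above is getting at, using the per-axis scale factors commonly published for upscaler quality modes (treat the exact ratios as assumptions; individual games can override them):

```python
# Internal render resolution for a given output resolution and per-axis scale.
def internal_resolution(out_w, out_h, scale):
    return int(out_w * scale), int(out_h * scale)

# Commonly cited per-axis factors for DLSS/FSR quality modes (assumed, not
# taken from this game): Quality ~0.667, Balanced ~0.58, Performance ~0.5.
for mode, scale in [("Quality", 0.667), ("Balanced", 0.58), ("Performance", 0.5)]:
    print(mode, internal_resolution(3840, 2160, scale))
# Quality     (2561, 1440)  -> roughly 1440p upscaled to 4K
# Balanced    (2227, 1252)
# Performance (1920, 1080)  -> 1080p upscaled to 4K
```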
So, in my opinion, your argument is built on the premise that DLSS makes games look worse, and if you think so, that's fair - I disagree though. I don't notice any visual difference, and in fact I've used DLSS to fix the bad antialiasing some games have. And I spent all this money on my PC for the high frame rate, so whatever helps me achieve that while running max settings, I'm happy with! That being said, I still have my 1080 Ti, and even though I don't use it, I absolutely LOVED that card! It was incredible for its time, I do agree - but in my opinion, the 4090 is that card for today's generation, I love it!! :D
Much respect!
I don't even bother with Unobtanium or FSR, frame gen, etc. 1440p raster, ultra settings, and the game is gorgeous....
This game really is stunning no matter what settings you run it on :D
Lol when a game increases your electricity bill.... 2024
Good Video. Might just be me, but I've never looked at 10-25% CPU usage as being CPU bottlenecked.
Hey thanks! Glad you enjoyed the video! And the CPU usage being so low is just a result of my 16 core CPU - so I won't see 100% usage even when there's a bottleneck. Most PC games only use about 4 cores, and that's 25% of 16 :D
So my 25% usage with 16 cores is 50% usage to someone with only 8 cores, and so on
Perfect game for the cold winter
Does it have VR mode?
Definitely a great game for the winter, but sadly no VR mode
Running AMD Athlon XP 2500+ and GeForce 2 MX. Do I need to upgrade or am I good to go?
How do we get this setting?
1:39 :)
Just find the game in ubisoft, view the game details, settings, and then find the game launch argument section at the bottom and add it in
@@MrJays cool thank you.