Hogwarts Legacy PC Performance Analysis
- Published Jul 6, 2024
- Benchmarking GPUs in Hogwarts Legacy is useful, but can also be misleading, because the CPU can easily be the limiting factor in performance, especially with ray tracing enabled and in certain locations in the game. Here I explain the issues and benchmark a wide variety of GPUs at a variety of graphics settings and resolutions in the CPU-demanding Hogsmeade location. The GPUs tested include the RTX 4090, RTX 3060 Ti, RX 6700 XT, GTX 1060 6GB, and RX 7900 XTX.
Test system specs:
CPU: Ryzen 7700X amzn.to/3ODM90l
Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
RAM: 32GB Corsair Vengeance DDR5 6000 CL36 amzn.to/3u563Yx
SSD: Samsung 980 Pro amzn.to/3BfkKds
Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
Mouse: Logitech G305 amzn.to/3gDyfPh
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.
Chapters:
0:00 Heavily CPU limited especially with ray tracing enabled (RTX 4090, Ryzen 7700X)
3:44 DLSS 3 Frame Generation can help when CPU limited
5:43 CPU limit is reduced with ray tracing turned off (Still RTX 4090)
8:03 3060 Ti Testing
12:27 RX 6700 XT Testing
16:20 GTX 1060 Testing (60 fps is possible if your PC can keep up)
19:09 7900 XTX Testing Ray Tracing OFF
20:44 7900 XTX Testing Ray Tracing ON
Here's Part 2 where I had time to test some other CPUs: ua-cam.com/video/hO94Ksz6bNo/v-deo.html
I noticed that big frame drops coincide with a big drop in GPU power usage. The clocks stay the same, but wattage will drop like 30-50%. When wattage returns to normal, so do your FPS. Your frame buffer % also drops at the same time.
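For anyone who wants to verify this on their own machine: you can log power draw once per second with `nvidia-smi` and flag the dips programmatically. A minimal sketch (the 30% threshold is an assumption based on the drop described above):

```python
import statistics

# Log power draw once per second (run separately, Ctrl+C to stop):
#   nvidia-smi --query-gpu=timestamp,power.draw,clocks.sm,memory.used \
#              --format=csv,noheader -l 1 > gpu_log.csv

def find_power_dips(watts, drop_fraction=0.3):
    """Return the indices of samples where power draw falls more than
    `drop_fraction` below the median, i.e. the dips that coincide with
    the frame drops."""
    baseline = statistics.median(watts)
    floor = baseline * (1.0 - drop_fraction)
    return [i for i, w in enumerate(watts) if w < floor]
```

If the indices returned line up with the timestamps of your FPS drops, it supports the idea that the GPU is stalling (waiting) rather than running out of compute.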
Hey Daniel, can you try the initial RTX 4090 test again, but this time with DLSS completely disabled?
What are you doin here Steve?
Just fired it back up to check. At native 4K (TAA High, DLSS completely off) Ultra with RT Ultra I'm seeing a bit lower average since it is not fully CPU limited at native 4K. Dropping to 1080p render scale (no dlss) shows about the same ~60fps CPU limit I was seeing in this video. I also tried on my 3440x1440 ultrawide and hit the same CPU limit. Are you seeing different results? I don't have another comparable test bench to test this against.
quickest pin in the west
@@MrIsmaeltaleb85 What are you doing step Steve?
@@danielowentech I'm actually testing with the 7700X as well. I'm looking into this now. I wouldn't say I have anything different yet. I haven't tested that section of the game enough, it just looks odd. So you've got me interested now ;)
Hardware Unboxed has looked into the CPU limit I was seeing here, and has confirmed it. Edit: Originally he thought results were better on Intel, but then confirmed there is actually a menu bug that was enabling Frame Generation without it looking like it was enabled! So if you think you are getting double the FPS that I'm showing here on an RTX 4000 series card maybe try enabling DLSS and Frame Gen, and then turning them both back off, to confirm frame gen is actually off. Link to relevant tweets from Hardware Unboxed: twitter.com/HardwareUnboxed/status/1623793684436381698?cxt=HHwWhMDT1Zbw74gtAAAA
twitter.com/HardwareUnboxed/status/1623619530143723521?cxt=HHwWgsDTiYjXoIgtAAAA
CPU, GPU and RAM are all below 50% usage. The game is poorly optimized and not using the hardware properly; it's likely bottlenecked by the engine's use of the hardware, its communication with the hardware, or poor high-level API optimization. 60 fps while upscaling from 1080p on a 4090 is disgusting.
You basically say it's not a GPU limitation because you don't see the usage high, but then make an excuse for the CPU even though it's acting similar. That's stupid.
Also, you are using AI frame generation to get a more reasonable FPS, but it still runs like crap considering your hardware versus the graphics quality. As usual, the PC version of a game is getting released in an incomplete form... go figure.
DLSS is going to be a handicap for gaming, but it's being sold as a golden goose.
Lol, well, a few of us already said that, including me. No issues here, but better RAM is needed. AMD proves again it comes with built-in stuttering.
@@antipathy17 DLSS would be a golden goose if devs didn't just say "oh look, we can avoid doing any optimization"
His latest tweet shows that frame gen was accidentally enabled on the Intel test system because of a menu bug
I’m running a 3080 with an AMD Ryzen 9 5900X and 32GB of RAM. Almost zero stuttering issues, and fps averages 120-130 on ultra settings at 1440p.
Thanks so much for still testing the 1060, literally the most useful benchmark for most people including myself.
Speak for yourself. I'm on 4090.
@@MrHoojaszczyk GTX 1060, still the most used card on Steam despite its age. RTX 4090: you and 100 others, nobody cares.
@@MrHoojaszczyk What was the point of your comment? To display your insecurity? Good job, you support companies price gouging their customers, and are the reason GPUs are about to cost $2000. How's that boot taste?
@@TahaJelani He can't tell how it tastes, unless his rectum has a tongue.
@@MrHoojaszczyk this comment was made from momma's basement
Love that in the game it uses half the GPU but in the menu you finally get close to 100%
I'm getting over twice your performance (110fps+) on my tuned 12900K at the same settings and same area with the 4090. You're just completely CPU/RAM bottlenecked.
I can record a video and post it on my channel if you want proof.
Thanks for the nice range of performance analyses. Hope your kid feels better soon.
So basically frame generation was meant for unoptimized games.
😅😅
Actually it's for CPU heavy games, such as Flight Simulator and Racing Simulator.
I’m using frame generation on my 4070ti and still having issues
a pleasure as always, thanks for keeping us all in the loop
It's going to be VERY interesting to see how the game performs once Denuvo is cracked. Watch us have another issue with it like RE8.
A friend of mine said that the companies actually pay the crackers for the game not to be cracked for some time. It's usually 1 or 2 years depending on the game.
RE8 was a stutter issue, not a GPU usage issue. And that wasn't Denuvo; it was Capcom's own proprietary DRM on top of all the other crap companies think we need.
@@samhhhhh I think Empress (even with all her issues) still cracks Denuvo. She actually stated she'd try to crack Hogwarts as fast as she can, if the news I saw was true.
@@samhhhhh A few days ago she said it would be cracked within the next 10 days. So within like 4 more days or so, if she wasn't lying, it will be cracked.
@@samhhhhh Only 1 left. It's true, a lot of Denuvo games remain uncracked... but the last one standing really wants to crack this one, so it will likely happen.
Would love to know how much of an impact Denuvo is making here.
I've watched quite a few vids about Denuvo and benchmarks with/without Denuvo and on most games it seemed to affect loading times and 0.1% lows the most. Guessing it works the same here, not sure about that.
@@GentlyUsedFrog It causes stutters though
@@alexorth5172 Yes, I mentioned that when talking about the 0.1% lows.
Stfu and buy the game
In my experience Denuvo doesn't really effect average fps too much, but it does create awful frame time spikes if not carefully optimized. Removing Denuvo might help a bit, but it won't fix the problem entirely.
Hi, did you experience stutter with cutscenes, new environments, new UI elements?
I'm getting 100-120 fps with stable frametimes at 4K DLSS Balanced, high settings, ray tracing off, but whenever I see something new or enter a new area I get stutters. It looks like an Unreal Engine shader issue to me: the game does have a pre-cache step, but I don't think it's enough, because it takes such a short time to compile the shaders. For example, Star Ocean The Divine Force has stutters if you don't cache the shaders in the options, but if you choose to cache them it takes 20 minutes for the game to do it. That's why I don't think Hogwarts Legacy is caching all its shaders, and that causes my stutters.
RTX 3070
Ryzen 5600X
32GB 3600MHz RAM
Samsung 980 Pro SSD
Seasonic Focus Gold 750W PSU.
Thank you. Always appreciate the time and effort you put into making these videos for us!! Keep up the good work bud!👍
Between The Callisto Protocol, the Dead Space remake, and now Hogwarts Legacy, it seems PC ports are suffering, and it's really sad to see. I understand everyone in the comments seems to be running fine, but you're not everyone, and there are people with issues. It's sad that most reviews for Dead Space or Hogwarts don't mention these issues. The easy option is to just turn off RT, right? Well, if your game offers an option, I'm going to judge it as a whole and not pick and choose what I review. There are games that handle RT implementation well. This is not one of them as of right now, and it should be counted as a con, not glossed over with "just turn it off". That's not a solution.
Should we add Denuvo into the equation? Considering both titles have DRM in them.
Dead Space remake runs fine compared to the other two. I played it on my 4080 at 4k ultra RTX on and on my 2080ti at 4k medium dlss quality and the game had a consistent frame rate in each area.
Simple explanation: we've come full circle. Back in the 360/PS3 days most console-to-PC ports sucked. We're basically going back to those days... good times.
@@ryanespinoza7297 Again, that's great dude. But you're not everyone, are you? Search around. You'll see people with massive CPU and other frame issues in Dead Space. People with 4090s.
@@JarvisJarvis Denuvo is known to be shit for the consumer. It murders everyone's enjoyment of games while offering those that pirate them the better experience 😆
I have a feeling this game was made with direct storage in mind on consoles, and the PC port is either not using it or using it poorly. It looks like it loads and releases assets from the disk constantly, and it just can't load them fast enough
I think it's just an open world game issue, Modern OWG titles are too demanding. On console this game doesn't even run native 4K
@@Kizzster on PS5 comparable hardware this game chugs on PC. It's not performing the same
its just typical Stutter Engine 4. next with those issues will be jedi: survivor (just like jedi: fallen order had it)
Crap engine.
Great video!! Did you check the cpu clock speed? That should be an important factor
One thing I find in games is that crowds tend to tank performance. As the game starts managing the movement of many NPCs it hits the CPU; you can see this kind of thing in Cyberpunk by changing crowd density. You could try dropping shadows back, as they seem to hit the CPU hard even at higher resolutions.
I think I might get this game just to play around with things :)
But Cyberpunk 2077 has WAY, WAY more NPCs than this area of Hogwarts Legacy. It screams bad CPU optimization.
On a 12-core Ryzen 9 5900X, you can push Cyberpunk's crowd density really high and still have a way-higher-than-60fps experience.
It's not only here, too: we have games like The Witcher, Red Dead Redemption 2 and Cyberpunk, which all handle much bigger crowds with way better performance. I thought Cyberpunk was a demanding game, but it's child's play compared to how demanding Hogwarts Legacy is.
@@saricubra2867 It's horrible CPU optimization; I don't know why my CPU usage is at 30% with a 12900K, getting 70fps in Hogsmeade with no RT on a 4090.
Something is causing the game to have persistent fog everywhere. It's not just faded-out textures like DLSS would produce, but actual fog. You can see the game without fog for a split second when you exit an interior like the Three Broomsticks, before the fog pops back in. So it's a filter doing this, because filters switch to create better ambience in different rooms.
i think i saw a mod that got rid of the fog or something like that
Is this only on pc or consoles as well?
Also, I must note this is set in the UK; persistent fog is a common occurrence.
@@Oshaoxin that's just not true lol. I live here and it's not foggy
@@JudeTheUA-camPoopersubscribe Jokes are always true. Even if they're not
Hey, great video! I have a question for you about my system's performance. I have an RTX 3080 (10GB), a Ryzen 5800X, and 64GB of DDR4 3200MHz RAM. I have played around with the settings a bunch and I am unable to achieve what you achieved in Hogsmeade with the exact same settings you used with your 3060 Ti. You were floating around 70-90fps; I can only achieve 50-60fps with dips into the 30-40s and slight jumps into the 70s. I thought it might be a CPU bottleneck, but I saw a UA-cam video of someone benchmarking the Ryzen 5600X / 5800X / 5900X / 5950X with modern GPUs as high as the RTX 3090, and they all achieved the same performance. So I assume I am not bottlenecked by my CPU. I am lost as to where my performance loss can be.
hello Daniel, what is the name of the app which you are using for the measuring (on the left side of the screen)? I want to check it out and then test it also on my pc
I did a full analysis of this game myself, just for fun, using Radeon GPU Profiler (RGP). I was looking for a good comparison video to base my findings on; you hadn't posted yet when I tweeted my findings and wrote them up on Reddit. Still very good, since it basically validates some of them. So I'll share what I found here as well.
When there is no VRAM oversubscription (video memory spilling into system memory), you become CPU limited, like what happens with the 4090. Even on mid-range GPUs, if there's no VRAM oversubscription you become CPU limited. RGP is very good at telling whether you're GPU or CPU limited, but I think it only works with AMD GPUs; I know Nvidia has an equivalent too, but I don't know if it has this feature.
However, the stutters and performance issues are mostly issues with the game itself. The game may need more than 12GB of VRAM for 1440p on some occasions¹; if you have less than that, it will spill over into system memory and cause massive stutter and frame drops, as well as low GPU usage and even low CPU usage. Because of the way the AMD driver reports GPU usage, you may still see it report 100% GPU usage, but that's because AMD reports based on busy time². If you look at GPU temperature and power draw, you can see those values dropping whenever you experience massive FPS drops. RGP also claims you're GPU limited in this case, but that's not true: the GPU is busy waiting on atomic synchronization, and the real bottleneck is system memory; more on that below.
The really complete explanation, summarized: the game uses a lot of video memory, which eventually gets spilled into system memory. The game then tries to access the spilled memory, which causes the driver to issue memory synchronization calls to ensure correct synchronization of system memory (that's a complex thing, so I won't make assumptions about why it is so damn slow; I would have to investigate). Those calls use GPU time but don't really use computational power (in simple terms, the GPU is basically sleeping, although that's not the correct explanation of atomic synchronization). It may seem that the CPU is the bottleneck when system memory is the bottleneck, but the VRAM shouldn't spill over to begin with. The game is managing memory badly.
The high video memory usage mostly comes from things that I think didn't need to be in VRAM to begin with (Radeon Memory Visualizer can show what is being placed in VRAM). Increasing resolution increases VRAM usage, so decreasing resolution reduces it, but only to an extent, since there is a baseline determined by the structures placed in VRAM at the start of the game. That baseline is probably constant and there is no way to reduce it. And since this is done at game start, it will always stay mapped in VRAM, and the other things will get spilled over into RAM.
¹: Your video showed that at 1440p it didn't reach the 12GB mark, but based on some reports it may at some point. So 12GB is enough for 1440p; just don't expect to never see issues. You may, although rarely.
²: AMD measures the time the GPU is busy doing something, which also includes waiting on memory synchronization. The GPU cannot tell whether it's waiting on VRAM or system memory, so it reports both as GPU usage. It's even more complex, because memory synchronization instructions may be waiting on other work-groups running on the GPU; the driver cannot really tell whether it's waiting on actual work or just on data being copied over, since there's no distinction (there may be a way for drivers to do this, but AFAIK, after looking at Radeon's ISA and how it should work, it may not be practical).
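The busy-time accounting described above can be sketched with a toy model (purely illustrative; this is not AMD's actual driver code):

```python
def reported_vs_effective(compute_ms, sync_wait_ms, frame_ms):
    """Toy model of busy-time accounting: time spent waiting on memory
    synchronization still counts as 'busy', so the overlay can report
    high GPU usage even when little real work is being done."""
    busy = compute_ms + sync_wait_ms
    reported = min(busy / frame_ms, 1.0) * 100   # what the overlay shows
    effective = compute_ms / frame_ms * 100      # actual compute work
    return reported, effective

# A 16 ms frame where the GPU computes for 4 ms but waits 12 ms on
# spilled memory: reported usage is 100%, effective compute only 25%.
```

This is why power draw and temperature are the more honest tell: they track effective work, not busy time.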
Who writes so much
No shittt, so my 3080 FE is already dated? Fuck PC gaming lmaoo.. Sooo many bullshit optimization issues I never experienced on console
@@XxLE6ITSWAGxX Dude, chill. If it's a game problem it will be patched; if it's a driver problem there will be an update to fix the issue. PC gaming is the best. I have a modest 1060 and it can play any new game at full HD, so your 3080 is not dated and will not be for a long time. A PS5 is close to an RTX 2070, so you are way beyond what consoles can do.
Any reason why you would think my 5800X3D + 4090 system is literally producing double his fps in most cases, if not triple on occasion?
@@davidjavier7688 Hahah, I'll take your word, man. The Unreal Engine games kill me though. Thanks for the small bit of optimism;
hoping you're right.
Also, a 1060? That's pretty badass to hear it's still kicking strong. Hope it lasts you a few more years until (and if) you upgrade.
Love the breakdowns with the different cards. Sounds like your kid has what my kids have. Hope they get better soon!
Hi, interesting: you have the most powerful setup that currently exists, and it barely plays at 60 fps with RT on. Outdoors it doesn't bother me much, because the effect of RT is almost invisible, but indoors (in Hogwarts particularly) is where RT shows the difference. Sad that we can't even play with it enabled without dropping to really low fps (between 10 and 25 fps in my case).
I have a 5900X CPU and an RTX 3080, and without RT in Hogsmeade I get the same fps you do, between 80 and 90 fps, with ultra settings and DLSS Quality at native 1440p; but with RT activated I drop to around 30 or lower.
Other question: does frame generation only work with the 40 series, or do we need to activate something in the Nvidia settings somewhere to be able to use it with a 3080?
Did you change the refresh rate? I have a 1080p monitor with a 6900 XT and a 5800X. I changed the refresh rate to 240Hz, since I have a 240Hz monitor. I am still getting between 85 and 189 fps even in Hogsmeade. My game performance is as smooth as silk with everything on.
I have a 4090 and AMD 7900X and besides the PBO 200mhz overclock, I spent a decent amount of time overclocking my RAM and infinity fabric which has really helped a lot. Hardware Unboxed did a segment with how bad AM5 motherboards do with the secondary timings which make a huge difference for zen4. I was able to get my latency and bandwidth numbers almost 25% better for each and in a benchmark like Tomb Raider Shadows at 1080p highest I am able to get average 303 fps, which is an approximate 20% increase vs no PBO and memory overclock, meaning the memory overclock is the main contributor to that. Would be interesting to see how much it helps you here, if at all. Great video as always.
HDR is bugged on the newest Nvidia drivers. I lost 10 frames having HDR on in Doom Eternal. In Dead Space it didn't seem to matter; I didn't see a hit. Worth mentioning.
Currently running HL using an i5-6500K on a B150 chipset with 16GB of DDR4 3200MHz memory and a 2060 6GB. My target is 60FPS at High at 1080p with DLSS Quality mode on and RTX off.
Currently hitting around 25-35FPS in most areas of the game after all optimisations (DLSS swapped to 2.5.1, CPU priority set to High, V-sync set externally, Control Flow Guard, latest patch and drivers).
I am CPU bottlenecked. The GPU rarely hits 50%, but the CPU is flat out at 100% in a lot of areas. Socket 1151 can support up to a 7th-gen Intel processor, something like an i7-7700K. Would I still be CPU bottlenecked? Or should I just upgrade to an i5-12400F with a new chipset + 32GB of memory?
Many thanks!
Have you tried limiting the CPU threads to 12 and/or disabling SMT? Could this be a PC port problem, and could mimicking the console CPUs help?
I've personally noticed quite a bit of difference by manually switching the DLSS version to 2.5.1. It definitely didn't fix all of it, but I am curious how much difference your setups would show, if you're not already using 2.5.1 in this video of course! Would love to see a comparison video :)
How do you swap it?
Hey Daniel; I’m using a 4090 with a 5800X3D; I got about 20-30% better performance by updating the DLSS version from the one included with the game to the latest version (2.5.1 AFAIK) as well as “evening” the frame times. Could you try that next?
I can confirm that with DLSS 2.5.1 you get better performance and also fewer graphical issues with Frame Generation enabled; the game comes with a pretty old version, 2.3.11. My computer has a 5950X and a 4090. The game is also allocating a lot of memory: 22GB of system RAM used on my 64GB system, and 16GB of VRAM.
@@andreiga76 How do you change the DLSS version?
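For those asking how the swap works: DLSS titles ship an `nvngx_dlss.dll` alongside the game binaries, and upgrading is just replacing that file with a newer build. A hedged Python sketch (the folder layout is an assumption; for UE4 games the DLL usually sits under the game's Binaries/Win64 folder, and you should keep a backup):

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> Path:
    """Back up the game's nvngx_dlss.dll, then overwrite it with a newer
    build (e.g. 2.5.1). `game_dir` must be the folder that actually
    contains the DLL; check before running."""
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    if target.exists() and not backup.exists():
        shutil.copy2(target, backup)   # keep the shipped version, just in case
    shutil.copy2(new_dll, target)      # drop in the newer DLL
    return backup
```

Restoring is just copying the `.bak` file back; a game patch or file verification may silently revert the swap.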
ward
It could be RAM (both size and bw) limited, did you try rebar on/off?
What is the tool you use to measure frames in the top left? I cannot find the name of it for the love of god. I also want to say: I have a 13900K and a Gigabyte 4090, and at high refresh rates and high frame rates the stutters in this game are way more visible. I am sure they still happen at lower frame rates; the fewer the frames, the less likely you are to even notice a difference. I tried setting the in-game 75fps limit and it seemed less stuttery.
It's MSI Afterburner; you just have to configure the settings so that you have that information in the OSD. It's under the Monitoring tab.
It's RivaTuner Statistics Server; it gets bundled in with MSI Afterburner too.
I got it for PC at first, but I had really weird performance problems on an RTX 3070 / Ryzen 5900, medium-high settings at 1440p.
Then I decided to try it on PS5, and the PS5 runs this game perfectly.
The only problem is the dynamic resolution in the 60 fps mode is a bit much sometimes.
Are you using dlss?
@@bearpuns5910s I had a good frame rate, but it would drop like crazy to 20 fps sometimes, and the game felt like it stuttered.
Never seen this before, and the PS5 is just a perfect 60 fps with no stuttering or any other problems.
Played for about 8 hours already, so I decided to refund the PC version and keep the PS5 one.
@@bearpuns5910 DLSS barely does anything; I'm playing at 1080p with medium-high settings and I'm getting anywhere from 40 to 100 fps.
@@darkbustergt8085 Isn't that because the game is not optimized well? Because DLSS helps a lot in many games.
@@bearpuns5910 you should never use dlss garbage. It's fake frames and makes the game laggy and blurry
Hi, do you think the devs can fix this cpu/ram bottleneck? Because if not, then the system requirements are misleading
Exactly what I'm thinking; no way it should need that much RAM and be unstable at only 1080p. I'm on a 4070 Ti.
Not really, it's a last-gen game; it's supposed to be about gameplay. Not once did they boast about amazing graphics.
@@calamdumr Especially when the PS5 has 16GB of memory allocated between GPU and RAM.
@@japanesesamurai4945 The PS5 has guaranteed SSD performance. This means developers can depend on its SSD without having to allocate large caches in RAM. There are no guarantees for high-performance storage on PC. I guess current-gen developers are relying on RAM to offset storage performance on PC now, hence the high RAM usage for games designed for current-gen consoles.
@@bltzcstrnx But this is the first game where we're seeing such a problem only on PC and not on consoles.
The RAM committed over the entire video was about 20GB, and actual usage was around 14GB. It would be interesting to see whether lowering the total to 16GB would reproduce some of the problems people have mentioned. Even though most of Windows is lowered in priority during games, could having only 2GB to spare for the rest of the system be the reason people are having problems?
Is the in Game time of day different between the testing on the 4090 vs the 6700xt? The lighting looks way fainter with the AMD card.
Edit: it’s FSR. You can see the difference distinctly when he’s testing the 7900xtx
If they allowed exclusive fullscreen, I am sure that would give a few more frames vs. windowed fullscreen. I don't understand why they didn't add this option as well.
I'm using a modest 10850K, DDR4 and a 3080, and I'm getting about the same performance at 1440p, but I do see stutters once in a while.
Surprised a system like yours isn't performing much better.
I got similar results on 1440p dlss quality, all on ultra rt off
My pc has a 12700, 3080 and 32gb ddr4 3600mh, frames never go under 75 in hogsmeade, and the gpu utilization is super low btw, like 60% or 70% most of the time 🙄
I'm playing on a 4K TV. Is there a difference between setting my desktop resolution to 1080p and playing the game like that, or leaving my TV at 4K and using the 1080p rendering resolution setting? I'm playing on a Ryzen 2600X and RTX 2060 Super.
Whats the name of the program that shows all that FPS info? cheers.
Hmm. I guess I'd be interested in seeing the performance of different CPUs.
I wonder if Unreal 5 putting ray tracing on the software side will improve performance in future games, especially for the console versions. In Fortnite it looked like the AMD GPUs were seeing a huge jump in performance with UE5 RT.
Game developers will probably ditch software-based RT altogether in the future: use hardware-accelerated RT, or no RT at all. It's similar to years ago with DX9, DX10 and DX11. Consoles were still using DX9 by the time DX11 arrived in 2009, so rather than building their game against each API, most developers ended up skipping DX10.
Another great video! Do you think the RT is actually working correctly? I have it turned on with my 4090 and honestly it doesn't feel like it is working as it should be.
The RT shadows look really good, and the reflections do too, although they are a bit too noisy. Analistadebits did a pretty good RT comparison video for this game. I'll probably make one too.
You have to restart to see it. It's also bad, console-grade RT with tons of noise. It's not good looking. I turned mine off and things smoothed out a little bit.
What software do you use to see the frame rate values?
Thanks for making a no-nonsense video strictly about the performance of this game. I wonder if this engine isn't fully utilizing GPU because the developers have designed it to be more compatible with the Nintendo Switch and other GPU-limited systems. But I guess the RAM and CPU of the Switch aren't that advanced either, so there's a good chance I have no idea what I'm talking about. Wishing good health and magical gaming memories to your family.
Wondering what the increased L3 on the 5800x3d does to the game. Very educational again Daniel, thank you for testing! Hope the little one gets better soon :)
Not much at all. RAM bandwidth scales a lot more with RT especially with UE4/5. Ideal config is a raptorlake cpu with hynix ddr5 and tuned subtimings.
@@nepnep6894 Could just be how powerful the 4090 is, but with my 5800X3D and FTW3 Ultra 3080, all ultra settings, DLSS Quality, I'm always at full GPU utilization in this game at 3840x1620 ultrawide.
It runs basically the same as a 5800X; the game needs CPU power, not cache.
Have the same config; runs like shit.
@@Wonkanator2 with max settings and RT enabled, you'll probably see similar dips below 60 in this area even if you lower resolution / increase dlss to get outside the gpu limit.
Daniel, have you checked whether it's a RAM bottleneck? I've noticed 32GB of RAM gets maxed out with the pagefile disabled. Physical RAM sits around 50%, but virtual RAM (commit) maxes out, which is weird. I will never understand that virtual RAM business. So I have to enable the pagefile, and some games then write to the SSD. I feel that is part of the issue; you guys should try using 64GB of RAM, or disabling the pagefile if it doesn't crash.
Which testing software is that, showing the CPU/GPU stats and FPS? You don't list it in the description.
I have a 5950X with 64GB of DDR4 RAM and an RTX 3070... the game was hitting 10-20 fps in Hogsmeade and outside. This is a game or driver issue imo.
What the fuck... I have a 3080 12GB and a 5900X, and I run ultra at 4K with Quality DLSS, RT off. RT KILLED the game at random points, like down from 60 to 5 fps for up to 30 seconds at seemingly random times; with RT off I've only had that happen once in like 20 hrs...
Not finished the video yet. But interesting that “new” titles which heavily rely on frame generation like The Witcher 3 NG and now Hogwarts Legacy are CPU bottlenecked. Coincidence? I don’t think so. (But I love my theories).
Interesting, either way, but are you saying that you think devs are being lazy because they don’t have to optimize as much, or that nvidia is making them cpu bottleneck the game in order to push dlss3/fram-gen?
no, its because of denuvo
@@maxieroo629 Yeah, this is my theory.
@@xtr.7662 Could be, but I have my theory don’t take away from me please.
It's more a case of old game engines and ray tracing make for a bad combination
Could DLSS Performance / 1080p be the reason for the low GPU utilization? I know 1080p is often used to test CPU performance by intentionally making the CPU the bottleneck.
Depends on the GPU and CPU. DLSS will keep my GPU utilization under 100%, but I've noticed about a 10-20% increase in CPU usage.
Running 1440p natively on my 5800X I use roughly 25% CPU, and with DLSS around 45% CPU usage.
My issue seems to be solely memory-based: how much gets thrown into VRAM and then carried over into system RAM.
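For context on why "DLSS Performance at 4K" behaves like a 1080p CPU test: DLSS modes render internally at a fixed fraction of the output resolution before upscaling. A quick sketch using the commonly cited DLSS 2 scale factors (treat the exact values as assumptions; titles can override them):

```python
# Commonly cited DLSS 2 internal-resolution scale factors (assumed here).
DLSS_SCALES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Resolution the GPU actually renders before the upscale pass."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

# 4K output with DLSS Performance renders 1920x1080 internally, so the
# CPU sees the same per-frame load as native 1080p.
```

This is why enabling DLSS raises CPU usage per unit of time: the GPU finishes frames faster, so the CPU has to prepare more of them per second.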
Thanks for the review. Can you confirm whether 32GB of RAM is a necessity for 1440p at high/ultra settings?
I have a 3080 Ti with 12GB VRAM, 32GB of 3600MHz memory, and a 5800X processor.
Native 1440p ultra for me uses 12GB of VRAM and as high as 17GB of system memory for the game alone.
I have managed to drop some settings down and keep some at ultra with ray tracing. On DLSS Balanced I can keep about 40-60 fps at 1440p, but the VRAM is always at 12GB and system memory can range from 4GB to 11GB.
If I only had 16GB, I would not get the game at this time. There is a lot of wonky stuff going on with this game that I have never seen before.
Tell your daughter thanks from us enthusiasts for letting you do this today 😂 and we hope she feels better. I feel lucky to have a 40-series for these demanding titles. I feel like a shill, but Nvidia does give a great experience, ONLY IF you aren't hung up on frames per dollar and just want the best overall "experience". I love AMD and hope that FSR 3.0 is as good as Frame Gen/DLSS 3.0.
Same lol. I have an AMD cpu but the performance numbers are what they are and I’m loyal to performance.
Having owned a bunch of GPUs over the past decades, and having tested a multitude more, I can say there is no visible difference in picture quality between AMD and Nvidia cards today. Then again, I just run games native and don't use FSR or DLSS. I have tested ray tracing, but it's only a slight difference in picture quality, and I wouldn't say it's actually better, just a bit different. If it didn't tank the framerate I'd say I'm neither here nor there about it, but considering that fact I don't see a reason to ever activate it.
There is no reason why this game should be this demanding.
It looks like Uncharted but runs like Cyberpunk.
@@user-ye7lp9lg1c It looks worse than Uncharted.
I have a question, don't know if you can answer it or not. With DLSS on I was getting about 144 frames inside and about 80 outside with 50% GPU usage. I turned DLSS off and got about a 30% jump in FPS at the same settings (1440p ultra) with a 3700X, 32GB of DDR4 at CL14, and an EVGA 3080 Ti Ultra Gaming. With DLSS off the GPU usage went up to 80-85%, but so did my frames. I thought DLSS was supposed to get you more frames, so why do I get fewer? Is it because I am CPU bound with DLSS?
Great video! Can you test the RX 6800 XT?
It'll be interesting when I try this on a 7950x system with an ARC A770. Not expecting greatness there lol. However, I do have a 7900xtx on the way, which will go into a 7950X3D system next month when that monster comes out.
I've seen a video from a smaller channel and the A770 performs really well. Would be good to see it tested as well!
Thank you so much for this. Same problems with a 3090 and 9900k. Huge stutters, awful frames, and super low gpu utilization. The intro gets you and then reality hits. It’s pretty awful.
Just got a 3090... Not good to hear this, as I was going to get it on PS5. But as usual, we are getting a bad experience on PC while it's more playable on consoles.
@@colinparks619 It's pretty awful. Keep RT off and the stutters will be less.
How did you manage making the GPU close to 100% utilization? I have a 3080 and I am stuck at 70% max, no matter what settings I change, both in game and out. My CPU is 5900X.
I am having a problem starting the game. Yesterday I could play, but I set my settings to high with a GTX 1060. Today the game doesn't start; it just remains a black screen. Can I change the settings from outside the game, or is there another option? My CPU is an Intel i7 8700K and I have 16GB of RAM at 3200MHz.
It sucks that RT is so heavy on CPU and VRAM tbh. It’s strange to me that 8 gig cards are now in hot water when it comes to their VRAM budget. Nvidia skimped too hard during the 3000 series it would seem. Imo 3070 should’ve had at least 10 gigs while 3080 should’ve had at least 12.
The ironic thing is the 30 series were lightyears ahead of the 20 series for RT. I still think RT is a work in progress even with the latest 40 series generation. Plus for Hogwarts I think they set the minimum system requirements far too low and should have been far more granular with their specification requirements for RT on or off.
Well, mining happened. Because of mining, Nvidia ended up cancelling their 3070 16GB, 3080 20GB and 3080 Ti 20GB. Nvidia is probably afraid of the post-mining craze, where cheap cards with lots of VRAM flood the market. Imagine a 3080 20GB ending up at 300-400 on the used market. If VRAM stays limited to 10GB and lower, some people will still hesitate to buy those cards and will be "encouraged" to get the new 40 series with more VRAM.
@@anorax001 RT on the 4000 series is great and really usable; I'm playing the game at 160 FPS with the 4090 (but with DLSS 3.0 activated) and the game looks gorgeous.
I thought I'd be safe with the 5800X3D without upgrading for the next 5 years, but quickly noticed this. It seems DLSS 3 will become a necessity soon for playing with RT; shame it's locked to the 40 series.
@@merlin7800 Yeah, anything with DLSS looks great, it really is free FPS. Can't imagine what they can do with it in the future.
For me, I'm on 4080 (with the same CPU, 7700x).
I was able to play the game in full 4K, Ultra but RT off... and able to get easily with 70-90FPS.
If I want to use RT on, I need to use DLSS for sure... and the FPS will be low...
But I'm totally fine to play it full 4K Ultra and with no RT. And the game is so pretty!
Same here; I didn't really notice too much of a difference with RT in this game, so I ended up turning it off. On my 5800X3D/4080 I went with 4K Quality DLSS and I'm able to reach the 116 cap pretty frequently, although in some areas I've seen it dip into the 80s.
How can you tell it's a CPU limit when it's not maxing out its usage? PC channels have always told us that cpu doesn't matter in higher resolutions but will it matter in a setting where you are rendering at 1080p with dlss on a 4k monitor?
I have an RTX 3060 12GB. Is there a big difference between the Ti and the normal version?
Hogsmeade brought my 3080 5900x system to its knees. I hope it gets better with patches/drivers.
Just play 1080p.
@@SuperYtc1 no
It will get better with crack
@jkteddy77 anything more than 1080p and he will go over 10gb of vram. Even at 1080p high I've seen it use 9.5 Gb of vram. It's probably not a cpu bottleneck, it's probably a vram one. That's a real bummer
@Shane Garmon yeah I think you're right friend of mine has a 6800xt and he has no issues. I'm gonna try messing with settings more when I play tonight.
I'd still love to know what systems they even test their games on, when hardware like the 7000 series CPUs and 4000 series GPUs have only been out for a few months, hardly enough time to get hold of them and test on them. And if performance is this bad, I'd really love to see their internal results.
Do Microsoft and Sony have developer consoles? If so, I would guess they mainly test it on those. The rest would most likely just be high-end PCs with Quadros or other high-memory GPUs that they also use for modeling/rendering.
How are you getting 60+ FPS with an RTX 3060 Ti in Hogsmeade at 1440p? I have an RTX 3080 with an i7 10700K and 32GB of RAM, and I'm running at 1440p too, but I'm getting 30 FPS. What's happening with my system? My CPU utilization is hovering around a constant 50% and my GPU usage is around 35%, but I notice my VRAM usage is pretty much maxed out at 10GB. Could that be the issue? If it is, that would be strange, because you're getting better performance with an RTX 3060 Ti...
I'm running a 5900X, 3080 and 32GB of RAM with max FPS set to 60. My CPU and GPU utilization aren't at max at all, but sometimes the GPU jumps to 100% and I get stutters; I can't identify what triggers it.
With my 14-core i5-13600K it runs absolutely fine without any bottleneck, even with a 3060, but the RAM management is terrible: 20-24GB of RAM for 1080p ultra, lol!
That's exactly my rig, i5-13600K Stock + RTX 3060 12GB Asus Strix + I got 16GB RAM 3,600mhz (2x8GB), I haven't bought the game but if I ended up doing so it would be to stream it, I wouldn't mind streaming at 720p60 instead of the usual 936p60 seeing how much RAM it's consuming. How much RAM do you have in total?? (I've seen people on twitch with apparently 16GB of RAM streaming the game at like 1080p60)
@@ZealousPumpkinTV i have 32gb of ram, you can check full benchmark on my channel.
It wouldve been nice to have the A770 LE included in this test!
Nobody uses that 😂
@@fm1798 ? It only makes sense to include it.
@@fm1798 BS, it runs this game perfectly. ua-cam.com/video/beT2EBXDPpY/v-deo.html
@@fm1798 I use one and would definitely like to see how it performs with this game before I spend the $$ on it
@@OUBrent1 check it yourself and then refund or wait for crack
Have you tried doing a comparison between running the game in fullscreen instead of borderless?
How has AMD not released a driver update for 3 months now? And will they update the driver for this? I have an RX 6600 XT and I'm really hoping a driver update at least helps the performance a bit.
My experience: 5700G, 16gb 3200mhz cl16, RX 6600XT, 1080p 165hz display - game is consuming about 15+gb system RAM and reached maximum capacity and had a hard system crash. Second time running it, closed Chrome and other apps and the game was stable with system RAM at 15gb total usage. But wow, I’ve never seen a game this hungry for system RAM.
What about locking FPS to 60 with RivaTuner?
@@asifhasan4678 Locking FPS had no effect on system RAM usage. I did notice the RAM usage slowly ramped down as the game progressed, plateauing around 13GB.
On my 64GB system it is using 22GB; this game uses more RAM than professional applications.
Wait a year and the game will run much better
Hello Daniel, can you test Hogwarts Legacy at the recommended Full HD 60+ FPS settings with a 1080 Ti or RX 5700 XT?
Were you using Windows 10 or Windows 11? I have the exact same specs as you (7700X, 4090, 32GB DDR5 6000) and I was getting 80-90 frames in Hogsmeade at 4K Ultra with no upscaling or ray tracing on Windows 10. I just updated to Windows 11 and now I'm getting 120-130 frames, still 4K Ultra with no upscaling or ray tracing. Maybe it's a Windows scheduling issue.
This is, in a way, a reflection of how many games go out into the wild these days -- not fully baked. This is to say that we can see evidence of this with pressures to realize a faster ROI starting with things like open "early access" in lieu of longer, staged betas from the past. Fully launched releases for even triple-A games now need months of post-release fixes for optimizations at best... and disaster recoveries at worst. And as teams have no time before DLCs need to also go out the door, the current state of things has become standard fare across gaming, which in turn has normalized them.
Memory leakage is definitely an issue in this game. I saw 20gb of ram used with the 7900XTX. That is absurd
There isn't a memory leak. It is using available resources for asset streaming.
It hits 20GB when you get to the menu. But then I would say it goes up a little the longer you play, and then back down depending on what area you're in.
Seeing the ram usage just in this part alone is between 19-23gb. Isn't the minimum specs for this game 16gb?
@@icy1007 20GB is more than consoles have; consoles have 16GB of shared memory, not separate VRAM and RAM, so consoles should crash with memory usage that high.
@@Extreme96PL - and if you have less total RAM then the game will use less RAM.
Hi, I have the same problem, my game is CPU limited. Can this be fixed with future updates of the game?
Does the 3090 have frame generation or only the 40 series?
The issue is denuvo
The stutters have been the biggest issue for me. I use an RX 6600, and it's definitely capable of a lot in this game, but it stutters particularly badly in Hogwarts and Hogsmeade. Interested to see what the devs do.
Yeah, certain areas are a bit funky. I use a Strix 2080 Super and play at 1080p ultra on the DLSS Quality setting, with a consistent 90-110 FPS in Hogwarts (two spots where it drops to 40) and 60-70 FPS in Hogsmeade.
Also cutscenes sometimes Tank fps like crazy, dropping to below 20fps
I have that gpu. Sad to see it struggle in this horribly optimised game.
Glad the stutters don't bother me, since it's not a competitive game anyway...
Wait for crack
@@user-ye7lp9lg1c nah. I just wrote a dlss update. Fixed the issues
Despite the very clear disclaimers, do you end up using DLSS 3.0?
Could this difference between AMD/Nvidia CPU usage be caused by Nvidia offloading its scheduler to the CPU?
I built my wife a 5800X/3060/32gb rig specifically to play this game. She’s barely touched PC gaming and knows nothing about FPS. I’m pretty interested to see what her gameplay experience will be like untainted by the quest for bigger numbers.
cap her at 30fps, she won’t even notice, my girlfriend doesn’t tend to either
Could've gotten 6700xt ma man 🤦
@@yellowflash511 some people will do anything but buy an amd gpu. I dont get it.
@@deagle7602 less features worse support
@@LegendaryGooseling same support, way more performance, same price 🤡
It's good to know there is potential to unlock higher GPU utilization. Maybe they'll fix it in a driver update.
Potentially doubling the performance will be very exciting on the 4090 and future GPUs
Hopefully! Running a 3090 and 12900K and stuck at like 60-70% GPU utilization in Hogsmeade.
@@Trophykage time for CPU and ram oc
@@SlimedogNumbaSixty9 the game is not even using our cpus though. Why upgrade when the game doesn't even use a fraction of what we already have?
@@kenshinhimura9387 The less the GPU is utilized, the more the CPU needs to pick up. This happens often at lower resolutions/settings. Having a better CPU in these situations will let the GPU stretch its legs (or let you play at a higher resolution/settings with a similar frame rate). I have no issues getting 100% GPU utilization, but the 4090 is just too strong; you can't make it work harder when it's bottlenecked by the CPU in these scenarios.
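The back-and-forth above about GPU utilization and CPU limits can be summarized with a simple model: the delivered frame rate is roughly the minimum of what each stage can sustain, and GPU utilization approximates the ratio of the two when the CPU is the limiter. A rough sketch with hypothetical numbers (the function names and figures are illustrative, not measurements from the video):

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is capped by the slower of the two stages.
    cpu_fps: frames/sec the CPU can prepare (game logic, draw calls, RT setup).
    gpu_fps: frames/sec the GPU could render at the current settings."""
    return min(cpu_fps, gpu_fps)

def gpu_utilization(cpu_fps: float, gpu_fps: float) -> float:
    """Approximate GPU busy fraction; below 1.0 means a CPU bottleneck."""
    return min(1.0, cpu_fps / gpu_fps)

# Hypothetical: a CPU that can feed 70 fps paired with a GPU capable of
# 120 fps yields ~70 fps with the GPU only ~58% busy.
print(delivered_fps(70, 120), round(gpu_utilization(70, 120), 2))  # 70 0.58
```

This is why raising resolution or settings (lowering `gpu_fps`) pushes utilization back toward 100% without changing the delivered frame rate much.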
Do you have a 7950X that you can test it on?
What happens when you go to a higher resolution? My understanding is that lower rez puts more strain on the CPU than the GPU.
Runs so much better on Radeon in comparison, wow. And people still yap about "muh nvidia drivers"
Ray tracing kills radeon gpus in this game
With low level API's there's much more in the developers hands than there is in Nvidia's... So before pointing fingers at drivers one should really wait for a driver update that fixes it (if it is a driver issue) or the game just remains terrible at even utilising a single thread on your CPU for the rest of eternity (so not a driver issue there)...
@@anuzahyder7185 who cares about rtx?? is just a gimmick boy
@@sorryyourenotawinner2506 ok u r a 6000 series radeon user. Sorry we r not.
When CPU bound, fast and tight RAM matters the most.
FYI Nvidia users, Nvidia DLAA at Native Res is actually faster than Native TAA High and also looks cleaner on fine hair details in this game.
All Users - Config file can also have RT variables tweaked for higher res reflections and better RTAO coverage. News sites are beginning to report this.
RT reflections render at 33% Default of Native Res, you can set up to 100%, 66%, 50%,.. whatever you like.
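The config tweaks described above are typically made in the game's Unreal Engine `Engine.ini` file. A hedged sketch using standard UE4 console variables; the file path, and whether this particular game honors each variable, are assumptions on my part, so back up the file before editing:

```ini
; Typical location (assumption): %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini
[SystemSettings]
; Render RT reflections at full resolution instead of the ~33% default
; mentioned above (66 or 50 are cheaper middle grounds).
r.RayTracing.Reflections.ScreenPercentage=100
; Denser ray-traced ambient occlusion coverage (standard UE4 cvar).
r.RayTracing.AmbientOcclusion.SamplesPerPixel=4
```

Note that raising these values increases GPU load, which may actually help in the CPU-limited scenarios the video describes, since the GPU has headroom to spare.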
DLAA is nice although it can introduce ghosting in some cases like Forza Horizon 5 for example but it's overall pretty good.
This game needs a big patch for the graphics optimization.
I don't think many people here tune their ram xd
Hey can you try playing the game at the minimum requirements to see how it holds up?
Do you have a standard 3060 to test with?
AMD GPUs have a hardware scheduler. Nvidia GPUs don't, they off load that function to the CPU. That's why AMD GPUs perform better in CPU limited situations.
Resize bar baby!!!!
@@bluej511 Woohoo!
I mean, the game isn't even out yet. I'm trusting and hoping they put out some fixes soon, but I'm playing the game at 1440p on a mid-range system just fine: I get 80-120 FPS with DLSS at high settings and medium crowd density with no RT. I also think it's a RAM issue, with whatever the hell is going on under the hood. My buddy with the same system only has 16GB whereas I have 48GB, and he can't run the game without it crashing. So I don't know, dude, lol. We will see when it gets some patches.
We really need to go back to the standard of releasing finished, polished games. Can't just keep letting them do this expecting a patch that rarely comes.
Until then I'd stick with games from smaller companies. (3x the quality 1/3 the price)
@@sham5614 I get that, but the game isn't technically out yet. It's weird because some lower-end systems are running it fine. There aren't even driver tweaks out for the game yet; Nvidia and AMD usually push out driver optimizations for large titles when they launch, and those probably won't drop until Friday or Saturday. And it's not like the game is incomplete or unpolished; I've been playing it fine and I'm 8 hours in now with very few stutters.
@@dakelpoandgames2593 The official release for preorders was 07.02.2023. Plenty of people have already finished the game. How is it not out yet?
People pay extra to get a worse version of the game 3 days early it's wild.
Did you test using the latest Nvidia driver? It looks like a driver issue too, as the AMD GPUs are always at full load when the Nvidia GPUs are not, especially the high-end ones.
Glad I found your channel, Daniel. It helped me decide which GPU to buy :)
Definitely not the most optimized game, but I think the 3060 Ti getting nearly 60 FPS at ultra 1440p (maybe with one or two settings turned down) is more than adequate performance. Just making sure people aren't witch hunting for the sake of it.
People are already witch hunting for various reasons with this game.
No you aren't. The reason I KNOW you are lying is because you only have 8gb of VRAM. There is NO way you aren't getting performance drops into the single digits ESPECIALLY during the sorting ceremony. People who bought this game on PC KNOW you are lying.
@@stardomplays1386 I don't have the game but I watched this video and was expecting way worse numbers given all the performance hubbub. I imagine by the time it goes on sale (when I'll get it) it'll be marginally better with patches.
Glad your daughter gave you permission to post this video.
You Westerners are so cringe. Where I'm from, this kind of talk doesn't exist; we hold our parents and elders in such great respect that these pathetic jokes are nowhere to be found.
Such a shame what you guys have devolved to.
Great content like always, Daniel! Hope your child gets better! Greetings from ARG.
I get horrific, near game-breaking FPS drops, primarily when entering certain areas like Hogsmeade, that last for around 10 seconds, yet elsewhere my FPS is perfectly reasonable. My rig is a 3080 with an AMD Ryzen 9 3900X CPU. Am I being bottlenecked by the CPU here? I am not using ray tracing, and the issue persists both with and without DLSS at 1440p.