Its performance is a bit lacking in places but from what I've played it does still mostly feel quite smooth. Haven't run it with any card with less than 12gb VRAM yet though.
You can't see it in the readout, but 12GB is already not enough in this game with RT turned on; it simply needs more space. Maybe enabling DLSS 3.0 will give this card a second life in this game, though.
What I saw was the 4070 Ti used 11GB with EVERYTHING turned to max at 4K with DLSS, and it ran better than the 7900 XT, which was using 15GB of VRAM. That shows you VRAM doesn't mean much at all if the GPU is using it the right way....
@@anthonylong5870 This is both correct and incorrect at the same time. Yes, Nvidia is better at memory management than AMD, which is why they've gotten away with less VRAM for years now. No, that isn't really the reason here. As I stated to Slawomir directly, what you see is only the RESERVED VRAM, not the actually USED amount. That is lower: between 8-9 GB at 1440p vs. the 10-11 GB reserved.
@@MonkeyDBenny It really pisses me off that Nvidia doesn't let the 30 series or even 20 series use DLSS 3, just to protect sales of the 40 series. Can't wait for FSR 3.0 with frame generation, which even 30-series cards will be able to use.
Why the hell are the top new GPUs performing so weakly? 1440p gaming and they don't even reach 60fps! God! Oh man, I can't believe this! And next year will bring just a 30% increase, but the games will be heavier too, so we'll still be stuck at a stupid 40 fps at 1440p. God!
@Ripcord157 Yeah, but not everyone is playing at 4K. If you look at 1440p, the 4070 Ti is overall faster with less power. The 4070 Ti is faster than the 7900 XT, but its VRAM is limiting it at 4K.
I would have shown it, but as FSR 3 isn't out yet it would be unfair testing. I can say that this game runs into VRAM problems with the 4070 Ti at 4K RT which DLSS does not resolve, so frame gen can't help in that situation.
@@fpscomputers976 It's not really unfair: if it hasn't come out yet, then it simply isn't something you're getting right now for your money, and you aren't even saving time, because when it comes out you might have to redo the video anyway. It's just misleading.
Well well well, the RX 7900 XT with FSR on is faster than the 4070 Ti with DLSS on, both on quality (remember, this was supposed to be the 4080 lol), and in ray tracing no less; we already knew it was way faster in normal raster. I still say ray tracing is a gimmick and will be for another 5+ years, and these two cards are both $799 now, so please don't be an NVidiot. I am rocking a GTX 1080 Ti and I refuse to upgrade at the stupid, unjustified prices of these cards. I saw a cost breakdown, and it only costs Nvidia less than $400 to make the 4090; the 4080 was around $250-$300 to produce. Talk about screwing over consumers.
Most new features were gimmicks in the beginning. When hardware T&L came with the GeForce 256, it was nice to look at but dropped performance like nothing else. The same thing happened with tessellation when it came around. Now it's all so standard that you don't lose a frame using it. It will take some time, but my guess is that in four years, or two generations, turning on RT will cost almost no performance, and after that point every game will simply ship with it on.
@@anthonylong5870 AMD fanboy? Wow, you really are an NVidiot, aren't you? Or did you miss the part where I said I'm on a GTX 1080 Ti? You sure showed your true colors, didn't you?
This guy doesn't even use frame generation with the 4070 Ti lol. Guess people are scared to see their favorite brand getting humiliated by DLSS 3. One side note: Ryzen 5000 series CPUs have worse performance on Windows 11 in general, compared to Windows 10.
Why do you need RT on? Nobody even plays with that. If you have a good HDR monitor you don't even need RT on; the game will look amazing at ultra 1440p on an HDR600 IPS panel.
Actually insane that two of the most powerful graphics cards that ever existed only achieve slightly above 60fps in "real" 4K in this game. The optimization is insanely bad.
These graphics cards are aimed more at 1440p if you want ray tracing. At 4K with ray tracing you need a stronger card: 7900 XTX, 4080, or 4090. Optimization for the game is fine; your expectations are not.
Reasons I bought the 4070 Ti: - As you see here, way lower power usage (235 vs 300 W); if you run at a 95% power target you're looking at around 200 W, even. You may not consider that now, but perhaps you should: this might mean the 4070 Ti is effectively getting cheaper over its first 7 months or so, and everything afterwards is straight-up savings on what you'd otherwise pay in power with AMD. - Many think mining is dead, but the 4070 Ti and its Ada architecture turned out to be an absolute MONSTER for mining, even in today's market; it's the only card in the RTX 40 OR 30 lineup that not only pays for the energy consumed but actually MAKES money. If you're willing to let some mining happen in the background, its low power usage makes even more of a margin, and that's not even considering that crypto might go back up. - This one is more personal: as a mechatronics engineer and data scientist, Nvidia has AI cores and features for 3D rendering and simulation that AMD simply doesn't support (yet).
Software-reported wattage isn't nearly as accurate as you'd think. The only TRUE way to measure system draw is to hook a Kill A Watt up to the system or have an in-line monitor on the GPU's cables, but even the latter would be incomplete since it wouldn't account for the up-to-75W pull from the PCIe slot.
@@Dracconus It's efficient enough that I'm willing to allow a really wide margin of error, and it would still be the most efficient card in the RTX 40 series, and far more efficient than AMD.
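The wall-vs-software measurement point above comes down to the PSU's conversion losses sitting between the meter and the components. A minimal sketch, assuming a 90% PSU efficiency figure (roughly 80 Plus Gold territory at typical load; the numbers here are illustrative, not measured):

```python
def wall_draw_watts(dc_load_watts: float, psu_efficiency: float = 0.90) -> float:
    """What a wall meter (e.g. a Kill A Watt) sees for a given DC-side load:
    the PSU must pull extra power from the wall to cover conversion losses."""
    return dc_load_watts / psu_efficiency

# A system drawing 500 W on the DC side pulls roughly 556 W at the wall,
# so software-reported component wattage will always undershoot the meter.
print(round(wall_draw_watts(500), 1))  # → 555.6
```

So even a perfectly accurate software readout of DC draw would sit noticeably below a Kill A Watt reading.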
The 4070 Ti's price is bad, but so is the 7900 XT's; what you get for buying Nvidia is working drivers. I had a 7900 XT I traded for a 4070 Ti because I had black screens 24/7.
FG makes fake frames, not real frames. If a game is running at 30fps with FG, it's still rendering at 30fps; it just looks like it's running at 60fps, and you still have about the same latency as at 30fps. So if one GPU hits 60fps without FG and you compare it to a GPU at 30fps using FG to show 60fps, the one hitting a real 60 will feel better. Sorry for bad English.
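The latency argument above can be sketched with simple frame-time arithmetic. This is a simplified model with illustrative numbers: frame generation doubles the displayed rate, but input latency still tracks the interval between *rendered* frames (and interpolation additionally has to hold one real frame back before display):

```python
def frametime_ms(fps: float) -> float:
    """Milliseconds between consecutive frames at a given frame rate."""
    return 1000.0 / fps

base_fps = 30                              # what the GPU actually renders with FG on
displayed_fps = 2 * base_fps               # 60 fps shown on screen
render_interval = frametime_ms(base_fps)   # ~33.3 ms between real frames
native_60_interval = frametime_ms(60)      # ~16.7 ms on a GPU rendering a real 60

# Input is only sampled at the render rate, so FG-60 still responds on a
# ~33 ms cadence, while native 60 responds on a ~17 ms cadence.
```

That gap (plus the held-back frame) is why a generated 60fps feels closer to 30fps than to a native 60fps.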
This video was being held back in the hopes of a Radeon game-ready driver and maybe a patch, but none came. If AMD's game-ready driver meaningfully changes things, I might do an updated test or pull this one completely.
Not yet. AMD's last driver notes, a couple of weeks ago, stated "AMD is working with Hogwarts Legacy to improve ray tracing performance," so it will come. I'm only level 5 on my second house (Hufflepuff) with another 120 hours of game time left; hopefully some more patches and drivers come out before I finish all the house storylines. *OTHERWISE AVALANCHE GAMES FINNA GET AVADA KEDAVRA!!!*
Loving my RX 7900 XT
Which one do you have?
Same, got the Hellhound edition.
Love your RX 7900 XT, and mine too 😂
Okay... First, the 7900 XT has 20GB of VRAM and works perfectly for 1440p or 2160p. Second, the 4070 Ti has less VRAM but still gives good performance as well; looking at prices, the 7900 XT can be found at $799 or even $819 (the 4070 Ti too, but it has less VRAM and requires a special cable connector). AMD's is worth it.
I own a gaming cafe and I have both a 4070 Ti and a 7900 XT PC.
To be fair, it looks like AMD is better. But in reality, that's not the case. When you turn on RT, the Nvidia card actually looks and feels like it delivers with ease, despite the 12 GB overflowing at times.
I am extremely impressed with what Nvidia is able to offer with just 12 GB of VRAM. Also, where I am, the 7900 is more expensive than the 4070 Ti, about 5% more than Nvidia's.
@@1989rs500 There's also DLSS 3 to consider, which is really a game changer at higher fps. It's a difficult choice tbh; both have their pros and cons. I think at this point in time the 4070 Ti is better, as the VRAM is sufficient for now, but will it be enough in a year or two? That I cannot answer, and honestly no one can. FSR 3 also might arrive, or it might never come, or it could appear and be inferior to DLSS 3. Honestly, both of them are hard sells due to their inflated pricing.
@@Adrian-is6qn
Yeah, I do agree with that. Being a resource management engineer, all I have to say is that games demanding more and more VRAM nowadays reflects an utter lack of resource management on the programming and development side.
Actually, with newer code and APIs, VRAM usage should be lower while giving better quality.
But it seems the new generation of programmers and developers are taking the easy route.
GTA V and The Witcher are standard benchmarks; the worlds are huge, but VRAM is not the limiting factor. It's rasterization.
I had the 4070 Ti: regular lag, with texture quality dropping to PS2-era levels in that game for a few seconds at a time. Swapped it for a 7900 XTX; runs smooth af, all issues gone.
Yeah, I've had that a few times inside parts of the castle. Frame Gen doesn't fix it either because it's overflowing vram afaik.
@@fpscomputers976 I just hope Nvidia stops giving GPUs insufficient VRAM. I had a 1080 Ti before that card; it could still handle the game on a Samsung Odyssey G9 (5120x1440) on medium and a few low-ish settings, just because they gave it 11GB back then.
@@L1ghtbirdy Yeah, it was an OP card and has aged really well. Testing Games recently did a head-to-head between the 1080Ti and 4060Ti, and the 1080Ti holds up!
Hogwarts Legacy is not the most optimized game.
Well, I played Hogwarts with ultra settings and DLSS set to Performance mode.
What I found was that with RT on low, the game averaged 70 fps consistently with near-zero VRAM overflow.
I own a gaming cafe, so I had a 7900 XT in another PC paired with a 5800X CPU.
I recently tried Hogwarts on that, and I have to say I was not blown away.
It gave a 78 fps average with the same settings as on my 4070 Ti / 12700K PC, but the caveat was that the upscaling was really bad at times. Nvidia's DLSS smooths surfaces and textures better, while FSR has this shimmery, non-linear look on meshes and the like; it looked bad, and I had to switch to Quality FSR, which did better, but again at the cost of some fps.
So in my view, play Hogwarts at 4K with DLSS at Performance; it keeps VRAM overflow minimal, and frankly the quality isn't bad at all compared with FSR.
Please reset the Avg and 1%Low when you change settings
Why is the 4070 Ti so much blurrier? For example, at 5:20, if you compare the two, it's a night-and-day difference in clarity/visuals with the same settings.
I think it's a dynamic volumetric effect. You can see at 00:17 that it's present for the 7900 XT. In some runs it's in neither/both.
Also search for "hardware unboxed 16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit". In cases of insufficient VRAM, texture quality can be downgraded to still fit into the VRAM.
The 4070 Ti can't even ship with 16GB of VRAM, at $800?
And it still has about the same performance as the 20GB 7900 XT, so where's the point?
@@schwazernebe You will see the point in a year or two.
Seems the 12gb VRAM thing got blown out of proportion. 1% lows and frametimes are no worse than the 7900xt with 20gb VRAM. I mean 12gb is still on the small side, but by no means is the card broken.
EXACTLY. The VRAM never hit its usage cap, and the AMD card was actually using 5 GB more but still had lower fps at 4K with ray tracing.
@@anthonylong5870 It wasn't "using" 5GB more, it was allocating 5GB more. There's no chart here actually showing VRAM usage, only allocation.
@Varity Topic I do that and my 4070ti still gets over 60fps and zero stutters....
@@anthonylong5870 What the fk do you mean, what does a VRAM limit look like? If your VRAM usage is at 11GB and your card has 12GB, it's already at its cap; that is showing the real use of the VRAM, not the allocated VRAM, and it can cause a lot of problems like stuttering or very bad 1% lows, which end up as stuttering too.
Both these cards are 1440p cards, where they work very well, but fire up a game like Avatar and the 4070 Ti will struggle a lot with its 12GB of VRAM. It's just a matter of time until the smaller VRAM shows its limits. And this has nothing to do with "badly optimized games"; it has more to do with "buy the right hardware for your needs" instead of mindlessly buying the wrong product for your needs like an Apple fanboy and being happy with whatever you get.
Wow, this VRAM usage is insane. 10-11 GB used at 1080p?!? Personally, I'm glad to see games that finally use more than 8GB VRAM, but it's still surprising.
Allocating.
@@BrunoFerreira-fp1vb Yes, but it's still being used. Some people mistakenly think that what some call "the allocated VRAM" is not actually in use, but it is; part of it is simply used by programs other than the game. Windows and other background system processes use smaller amounts of VRAM that contribute to the total. You can't have your game using 100% of your card's VRAM, because then Windows won't have any left over.
@@selohcin Okay, some things to clear up, because you're trying to lecture people but clearly have no clue yourself. VRAM allocation is software dependent; an application will claim as much as it wants. Any DirectX 12 application handles its own VRAM allocation: it allocates as much as it feels like, and later loads things (assets etc.) into that VRAM if need be. It does not necessarily "use" or need that much VRAM; it claims it simply because it's there. Some applications would allocate 100GB if your GPU somehow had 100GB of VRAM. The percentage of VRAM taken depends mostly on the application, and it's up to the application not to take too much, given the card's capabilities. Also, you're looking purely at VRAM capacity while forgetting the actual bandwidth.
So, no, the allocated VRAM is not actually being fully used; it's just reserved in case it's needed, since the VRAM is available anyway. Stop belittling people when you're just some random tech hobbyist who has no clue what he's talking about once the topic goes any deeper than buzzwords and what your favorite tech YouTubers have explained to you.
@@LeegallyBliindLOL Apparently, you struggle with reading comprehension. Go back and read what I actually wrote instead of what you've convinced yourself that I wrote. I did not belittle anyone.
@@LeegallyBliindLOL FINALLY!!!! OMG DUDE THANK YOU!!!! Someone who realizes and understands the VRAM argument is bullshit
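The allocated-vs-used distinction argued in this thread can be illustrated with a toy model (all numbers hypothetical): a DX12-style application reserves a budget proportional to whatever VRAM exists, while what it actually touches per frame is bounded by the scene's working set. This is a sketch of the idea, not how any real engine sizes its heaps:

```python
def planned_allocation_mb(total_vram_mb: int,
                          os_reserved_mb: int = 500,
                          budget_fraction: float = 0.9) -> int:
    """Toy model: the app reserves a large slice of whatever VRAM is available,
    leaving a little for Windows and background processes."""
    return int((total_vram_mb - os_reserved_mb) * budget_fraction)

def resident_assets_mb(working_set_mb: int, allocated_mb: int) -> int:
    """What a frame actually touches never exceeds the scene's working set."""
    return min(working_set_mb, allocated_mb)

scene = 9000  # hypothetical working set for one area of the game, in MB
for card, vram in [("12GB card", 12288), ("20GB card", 20480)]:
    alloc = planned_allocation_mb(vram)
    used = resident_assets_mb(scene, alloc)
    print(f"{card}: allocates {alloc} MB, actually touches {used} MB")
```

Under this model, the 20GB card shows a much larger "VRAM usage" number in an overlay despite touching exactly the same assets, which is the behavior the benchmark readouts reflect.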
OK, I have a question: how are the new AMD cards with VR? I know their last models didn't work too great with it; how are the new ones?
2:24 80fps at 239W draw, versus 66fps at 300W draw... Overall, relative to its time, RDNA2 was the better product than RDNA3. I had one here, but as soon as I booted up an old game it sat at 40% GPU utilization yet 243W draw, so the card went back. Three weeks later a 4070 Ti arrived; same game, 98W, and the fans didn't even turn on after 20 minutes. Sorry AMD, I didn't want to support Nvidia's politics here, but the 4070 Ti is just the overall better product for me, since you don't seem to care much about power consumption and scaling.
Rt on ultra there. Not sure if rdna2 would do 66 fps with the same settings.
@@laszlodajka5946 It would not, but that's not the point for me.
At 1440p the 4070 Ti uses less RAM, VRAM, and power, and runs faster than the 7900 XT, but at 4K with no DLSS 3 and no RT, the 7900 XT wins because of its VRAM.
But the 4070 Ti is faster if you enable RT and use DLSS at a lower render resolution than 4K.
Idk why this game demands so much memory; it's not like it looks as impressive as Cyberpunk. Its memory demands don't match the visual return.
The 4070 Ti is faster at 1440p ultra because of its ray tracing; without it the 7900 XT wins at all resolutions.
FSR 3 is on the way to help AMD.
I have a Ryzen 9 5900X CPU, a 7900 XT graphics card, and a 1440p monitor. With ray tracing turned off, I'm not getting over 100 FPS. What am I missing?
I have an R7 5800X and the same card/monitor; it might just be your render resolution set to 200%.
Not bad. I'm guessing I'll play at 4K ultra with ray tracing and FSR Balanced to stay at 60 fps or higher? I kind of wish you had shown that, but if all else fails, 1440p is fine, if ray tracing is even needed.
I'd take the 7900 XT over the 4070 Ti any day. It has a wider memory bus and more VRAM, making it more suitable for 4K gaming. I expect a 4K gaming GPU if I'm spending $800 or more; the 4070 Ti is DOA at $800+. I don't give a 💩 about DLSS 3.
I just got the 7900 XT over the 4070 Ti. I got an XFX Merc 310 Black Edition for under MSRP, so that's pretty good. I love the way that version is built. Great card!
@Brian O hope you enjoy it! Mine has been great!
Reasons I bought the 4070 Ti:
- As you see here, way lower power usage (235 vs 300 W); if you run at a 95% power target you're looking at around 200 W, even. You may not consider that now, but perhaps you should: this might mean the 4070 Ti is effectively getting cheaper over its first 7 months or so, and everything afterwards is straight-up savings on what you'd otherwise pay in power with AMD.
- Many think mining is dead, but the 4070 Ti and its Ada architecture turned out to be an absolute MONSTER for mining, even in today's market; it's the only card in the RTX 40 OR 30 lineup that not only pays for the energy consumed but actually MAKES money. If you're willing to let some mining happen in the background, its low power usage makes even more of a margin, and that's not even considering that crypto might go back up.
- This one is more personal: as a mechatronics engineer and data scientist, Nvidia has AI cores and features for 3D rendering and simulation that AMD simply doesn't support (yet).
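The power-cost argument above is easy to put numbers on. A minimal sketch using the 235 vs 300 W figures from the video; the electricity price and gaming hours are assumptions, not claims from the thread:

```python
def extra_energy_cost(delta_watts: float, hours_per_day: float,
                      days: int, price_per_kwh: float) -> float:
    """Cost of the additional power draw over a period,
    in the same currency as the per-kWh price."""
    kwh = delta_watts / 1000.0 * hours_per_day * days
    return kwh * price_per_kwh

# 300 W vs 235 W, 4 hours of gaming a day for a year,
# at an assumed $0.30/kWh electricity price:
print(round(extra_energy_cost(300 - 235, 4, 365, 0.30), 2))  # → 28.47
```

So under these assumptions the draw difference is worth on the order of $30 a year; whether that closes any price gap depends entirely on local rates and hours played.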
@@alessandromorelli5866 You don't need to repost your comment everywhere here. It's your point of view, because you're absolutely a fan of Nvidia; you prove nothing here about the 4070 Ti over the Radeon 7900. It's everyone's own decision; no one is smarter here.
@@zq7246 Right there with you. Purchased the Black off Amazon since it has a higher clock speed than the "Ultra" and was $40 cheaper. It's replacing my RTX 2080 Super, which is going into my living-room PC, and I'll be giving the RTX 2060 from that to a friend upgrading from an RX 580 Nitro+ Special Edition.
The thing most people are forgetting is that the days of getting a high-end GPU for under a grand are GONE. It's just not feasible to expect that, but this card delivers on that capability more than anything team green has to offer under nearly two grand at the moment, so it's an incredible value.
Sure, the ray tracing performance is weaker, and there are certain AI features, encoding capabilities, etc. that it's slower at or simply can't do; but most people aren't buying a gaming card for production, so that's a moot point. At the end of the day, when it comes to encoding, what you're saving in cash and heat output more than makes up for the extra time (imho). For its price-to-performance ratio, based purely on this gen vs last, it's not a bad deal in the least.
You'd think there would be stuttering on the 4070 Ti with it maxing out its VRAM.
Cache dumping is a magical thing, isn't it?
Although it's allocating more VRAM than the GPU has, today's system RAM speeds have become efficient enough that, paired with an adequate system, the GPU doesn't strictly "need" onboard memory as an overflow buffer in this kind of situation.
That said, it does show the card's just not made for 4K at moderately high or max settings due to that limitation, and in a very short time that's going to start showing as game engines, code optimization, workflow optimization, etc. push further.
It does at 4K without DLSS watch?v=tk189Wp115M
I had the 4070 Ti: regular lag, with the graphics downgrading to PS2-era levels because the VRAM was full with RT active and it dumped some assets. Swapped to a 7900 XTX; all issues gone.
I am an AMD fanboy, but the 7900 XT's power usage is insanely high.
It is, but I think the 7900 XT should be compared to the 4080, not the 4070 Ti. Still, it's high.
@@videosuu300 this time Nvidia GPUs are better undervolters than AMD's. You can run that 4070 Ti at like 2% less performance and 180 W.
@@enricod.7198 undervolting RDNA2 and RDNA3 does not reduce power usage; instead it allows the GPU to boost higher because of the extra power headroom from the undervolt.
I was thinking of buying a 7900 XT right now, but seeing this comment I wanted to ask: I'm sure it's enough, but my 1000 W power supply is enough for it, right?
@@Venenata of course it is enough. Even 800 W would be enough.
Isn't this spot easy? I have seen the XTX getting 10 fps outside with 4K RT.
Shit. I built a PC with a 4070 Ti specifically to play this game. Should I get rid of it and replace it with a 7900 XT?
Just don't use RT (which is not that great in this game either way) and you are golden. Also, if you want 4K, enable DLSS.
@@vaghatz what is DLSS?
@@RagnarCrumpets he's not using dlss3 aka frame generator on purpose, 4070ti is faster than 7900xt with dlss3 on.
@@MonkeyDBenny i’ve just been trying it out after I tweek the settings a bit it seems to run just fine. Awesome game
@@vladchenkov9215 my first upgrade was a dying 6800xt from Amazon, I took it back and now have a 4070ti on the way, so far the comments have me excited, it seems to not run so hot and it's very power efficient, i run a 240hz 1080p monitor so it should be perfect for it
My RTX 4070 Ti runs stable at 3 GHz. How much OC potential does the RX 7900 XT have?
Huh? I play on Ultra with RTX on and I get 140 FPS in Hogwarts... most of the time. Why is it so low here?
ive got a 4070ti btw
@@Pannencop You're the only person on earth getting those numbers.
@@fpscomputers976 no... seems your PC is not good, or you're faking it :D
@@Pannencop actually investigate the channel before saying silly things.
I have an i7 10700 and an MSI Suprim X 4070 Ti, and I only reach 55-60 fps with RTX off at 3440x1440.
Good test.
Just... looking around for the people claiming this game would perform so much better than "Forsbroken."
Its performance is a bit lacking in places but from what I've played it does still mostly feel quite smooth. Haven't run it with any card with less than 12gb VRAM yet though.
I don't understand why people keep referring to the 4070 Ti as a 1440p card when it clearly can game at 4K.
The 4070 Ti looks choppy at 4K 60 fps.
6:40
It looks choppier in general; no idea why nobody sees this. Even when the 7900 XT drops to 30 fps, it's still smoother than the 4070 Ti.
Song?
But sadly the rasterization mode doesn't use path-traced GI like A Plague Tale did; obviously biased toward RTX.
well rip my 3070 with 8 gb vram
My 3070 (with 32 GB of system RAM) can only handle this game at its best at 1080p ultra, 60 fps max; otherwise it just shits the bed frame-rate wise.
You can't see it in the readout, but 12 GB is already not enough in this game with RT turned on; it comes down to space requirements. However, maybe enabling DLSS 3.0 will give this card a second life in this game.
What I saw was the 4070 Ti using 11 GB when EVERYTHING was turned to max at 4K with DLSS, and it ran better than the 7900 XT, which was using 15 GB of VRAM. That shows you the VRAM number doesn't mean much at all if the GPU is using it the right way....
@@anthonylong5870 This is both correct and incorrect at the same time. Yes, Nvidia is better at memory management than AMD, which is why they have gotten away with lower VRAM for years now. But no, that isn't really the reason here. As I told Slawomir directly, what you see is only the RESERVED VRAM, not the actually USED amount. That is lower: between 8-9 GB at 1440p, vs. the 10-11 GB reserved.
once you turn on DLSS3, it's not even gonna be close
Indeed, he didn't on purpose. Because, you know, Nvidia is greedy; better to blame them.
@@MonkeyDBenny really pisses me off that Nvidia doesn't let 30-series or even 20-series cards use DLSS 3, just to protect sales of the 40 series. Can't wait for FSR 3.0 with frame generation, which even 30-series cards will be able to use.
These are both 1080p-ultra-for-high-refresh or 1440p-high cards. Unfortunately the 4090 is the king, but way too much money.
Why the hell are the top new GPUs performing so weakly? 2K gaming and it doesn't even reach 60 fps! God! Oh man, I can't believe this!
And next year will bring just a 30% increase, but the games will be heavier too, so it will still be the same stupid 2K at 40 fps. God!
Power draw difference is insane...
@Ripcord157 yeah, but not everyone is playing at 4K. If you look at 1440p, the 4070 is overall faster with less power; the 4070 Ti is faster than the 7900 XT, but its VRAM is limiting it at 4K.
Yeah, but it's much faster at raster.
@@crescentmoon256 12 GB is sufficient for almost all 4K games. Nvidia also has better memory compression and management technology.
@@crescentmoon256 again, it's only faster with ray tracing at 1440p.
@@simon6658 yeah, in future titles and techniques, but as for this cross-gen stuff, which is inefficient as hell, the 12 gigs are getting hammered.
Frame generation? Reflex?
Would have shown it, but as FSR 3 isn't out yet, it would be unfair testing. I can say that this game runs into VRAM problems with the 4070 Ti at 4K RT, which DLSS does not resolve, so frame gen can't help in that situation.
@@fpscomputers976 it's not really unfair; if it hasn't come out yet, then it simply isn't something you are getting right now for your money, and you aren't even saving time, because when it does come out you might have to do the video again anyway. It's just misleading.
You know about DLSS 3? What's the point of not showing it?
Well well well, the RX 7900 XT with FSR on is faster than the 4070 Ti with DLSS on, both on Quality (remember, this was supposed to be the "4080" lol), in ray tracing. We already knew it was way faster in normal raster. I still say ray tracing is a gimmick and will be for another 5+ years, and these two cards are both $799 now, so please don't be an NVidiot. I am rocking a GTX 1080 Ti and I refuse to upgrade at the STUPID, UNJUSTIFIED prices of these cards. Saw a breakdown, and it only costs Nvidia LESS THAN $400 to make the 4090! The 4080 was around $250-$300 to produce. Talk about screwing over consumers.
Umm... watch the last benchmark again. The AMD card loses badly with everything turned on at 4K... Sorry, AMD fanboi.
Most new features were gimmicks in the beginning. When hardware T&L came with the GeForce 256, it was nice to look at but dropped performance like nothing else. The same thing happened with tessellation when it came around. Now it's all so standard that you don't lose a frame using it. It will take some time, but my guess is that in 4 years, or 2 generations, turning on RT will have almost no performance cost, and after that point every game will only ship with that option.
@@anthonylong5870 AMD fanboi? WOW, you are an NVidiot, aren't you? Or did you miss the part where I said I am on a GTX 1080 Ti? You sure showed your true colors, didn't you?
This guy doesn't even use FG with the 4070 Ti, lol. Guess people are scared to see their favorite brand getting humiliated by DLSS 3.
One side note: Ryzen 5000 series CPUs have worse performance on W11 in general compared to W10.
Why do you need RTX on? Nobody even plays with that. If you have a good HDR monitor you don't even need RTX on; the game will look amazing at ultra 1440p on an HDR600 IPS panel.
Love my 4070 Ti RT performance ❤❤❤❤
At 1080p I would take the 4070 Ti all the way, but at higher resolutions, the 7900 XT all the way.
If you're buying for 1080p I would just go for the 6700 XT for the price; not sure why you would pay $800 for 1080p.
Frame generation disabled because...?
Actually insane that two of the most powerful graphics cards that ever existed only achieve slightly above 60fps in “real” 4K at this game.
The optimization is insanely bad.
These graphics cards are aimed more at 1440p if you want ray tracing. At 4K with ray tracing you need a stronger card: a 7900 XTX, 4080, or 4090... The game's optimization is fine. Your expectations are not.
Turn on frame generation and you can flush the AMD card down the toilet. Without it, they are equal.
This game has such bad optimisation; the fps is soooo unstable.
the VRAM LOLILOL
Reasons I bought the 4070 Ti:
- As you see here, way lower power usage (235 vs. 300 W); if you run it at 95% power, you are looking at around 200 W, even. You may not consider that now, but perhaps you should: it might mean the 4070 Ti is effectively getting cheaper over the first 7 months or so, and everything afterwards is straight-up savings on what you would be paying in power with AMD.
- Many think that mining is dead, but the 4070 Ti and its Ada architecture turned out to be an absolute MONSTER for mining, even in today's market. It's the only card in the RTX 40 OR 30 lineup that not only pays for the energy consumed but actually MAKES money in today's mining market. If you are willing to let some mining happen in the background, with its low power usage it makes even more money, and that's not even considering that crypto might go back up.
- This one is more personal: as a mechatronics engineer and data scientist, I rely on the AI cores and features for 3D rendering and simulation that Nvidia has and AMD simply doesn't support (yet).
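The power-cost argument above is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming an illustrative electricity price of $0.30/kWh and 4 hours of gaming per day (both numbers are my assumptions, not from the video; only the 300 W vs. 235 W figures come from the benchmark):

```python
def yearly_cost_usd(watts, hours_per_day=4, price_per_kwh=0.30):
    """Estimated yearly electricity cost of a component drawing `watts` while gaming."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 300 W (7900 XT under load, per the video) vs. 235 W (4070 Ti)
savings = yearly_cost_usd(300) - yearly_cost_usd(235)
print(f"~${savings:.2f} saved per year")  # roughly $28/year under these assumptions
```

Under these assumptions the 65 W gap is worth roughly $28 a year, so whether it "pays off the price difference in 7 months" depends heavily on local electricity prices and hours of use.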
Software-reported wattage isn't nearly as accurate as you'd think. The only TRUE way to measure system draw is to hook a Kill A Watt up to the system, or to use an in-line monitoring system on the GPU's cables; but even the latter would be inaccurate, since it wouldn't account for the up-to-75 W pull from the PCIe slot.
@@Dracconus it's efficient enough that I'm willing to allow a really wide margin of error, and it would still be the most efficient card in the RTX 40 series, and far more efficient than AMD.
looks way better without RT
Agree.
copium
4070 Ti pricing is bad, but so is the 7900 XT's; what you get for buying Nvidia is working drivers. I had a 7900 XT I traded for a 4070 Ti because I had black screens 24/7.
skill issue
@@Crawdoodle Where did you get your 7900 from for $750? And was it brand new?
@@Crawdoodle cheers, so cheap
A lot of people praised AMD and its drivers... yeah, I don't know about that.
@@Crawdoodle ROFL, I paid $250 for my 4070 Ti after I dumped a 6800 XT off on an AMD simp... Now isn't THAT 🤣
AMD's performance-per-watt is actually really bad here.
AMD WILL GET BETTER DRIVERS IN TIME!!!
In my region the 4070 Ti is $100 cheaper and has better ray tracing. Good card :)
Now kick on FG & DLSS and watch the 4070 Ti run circles around the 7900 XT.
FG makes fake frames, not real frames. If a game is running at 30 fps with FG, it's still running at 30 fps; it just looks like it's running at 60 fps, and you still have about the same latency as at 30 fps. So if you had a GPU that hit 60 fps without FG and compared it to one at 30 fps with FG displaying 60 fps, the GPU hitting 60 without FG would feel better. Sorry for bad English.
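The latency point above can be sketched with simple frame-time arithmetic. This is a simplification (real frame-generation latency also depends on the render queue and the game's input pipeline), but it shows why a generated 60 fps doesn't feel like a native 60 fps:

```python
def frame_time_ms(fps):
    """Time between truly rendered frames, in milliseconds."""
    return 1000 / fps

# Native 60 fps: a new real frame (with a fresh input sample) every ~16.7 ms.
native_60 = frame_time_ms(60)

# 30 fps base + frame generation: the display shows 60 fps, but the game
# still simulates and samples input only every ~33.3 ms.
fg_from_30 = frame_time_ms(30)

print(f"native 60 fps: {native_60:.1f} ms between real frames")
print(f"FG from 30 fps base: {fg_from_30:.1f} ms between real frames")
```

So responsiveness with frame generation tracks the base frame rate, not the displayed one, which matches the comment's point.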