Pretty solid numbers from both GPUs running this game at native 4K. I hope you all had a great Christmas and holidays and continue to do so. Cheers!
For the RTX 4090 you should choose higher settings: try DL-DSR and DLSS + Frame Generation while leaving Full Ray Tracing enabled, since that's what this game is optimized for. There's a patch available that gets Frame Generation working and unlocks the frame rate in cutscenes. Overall the image quality is way, way better compared to the standard GI!
Regardless of which card you're running on this game, you're getting a great game and a great experience.
Very true and it runs really well on weaker GPUs too. Radeon are at a bit of a disadvantage not having FSR in yet, because the dynamic res scaling doesn't look particularly great, but yeah. We were able to do native 4K 60 in the 6800XT video too.
This speaks volumes about how well optimized the base RT GI is in this game.
On average when RT GI is enabled the 4090 is about 60% faster than the 7900XTX, and here the difference is not much more than the normal raster difference between the 4090 and 7900XTX, which is around 20-25%.
Yes very true. I found the XTX to be closer than I'd thought considering RT GI is maxed out. This game is quite well optimized imo. Very scalable. A bit VRAM hungry for 8GB cards but a worthy tradeoff for how good it looks and runs.
@@TerraWare absolutely! The fact that it can do native 1080P 60fps with RTGI, on an rtx 2060 when tweaking the right settings, speaks volumes.
I will always applaud a game with scalability in its settings. Lately it feels like the only setting worth changing is the upscaler preset; everything else seems to give small performance uplifts for small visual cutbacks.
In this game low settings still looks “decent” and grants a decent deal of performance back!
Great video, just found your channel and subscribed.
I would also have loved, even for academic purposes only, to see how path tracing runs on the 7900XTX here btw!
@@lawyerlawyer1215 Thanks for the sub. Appreciate it. It would be interesting to see how PT would run were it available. Hopefully it will be soon. My guess is that, being a proprietary engine, it's probably not ready; going by what the devs said about FSR and their Vulkan engine, it requires a bit more work.
that power draw difference is insane.
forced RT makes rdna3 consume a lot more
Radeon is garbage. What else is new?
Really? So it's the damn price difference @@TheBlackIdentety
@@adrianomart It's a Nitro+ model I've manually overclocked. Not your average 7900XTX
Bro is trying to call Radeon garbage. He doesn't even realize that the 7900XTX is an RTX 4080 competitor, not a 4090 competitor. I myself have a 7900XTX and love it; the performance and VRAM for the price can't be beat.
Nice to see 4090 being only up to 35% faster than 7900XTX while being double the price.
It's a halo product. 4080 Super is around same raster level as 7900 XTX and similar price.
One wise man said it clearly: "7900 XTX is not a halo card because it can't compete with the 4090, which is. Therefore there's no justification for AMD charging a halo price for the 7900 XTX, just like there's no justification for making excuses for either NVIDIA or AMD increasing prices simply because they want to." So nope, calling the card that costs from 900 to 1100 "halo product" is just dumb. And yeah I can easily find Sapphire Nitro+ for 999€ which is actually the best 7900XTX out here. The cheapest 4080S here costs ~1150€. Minus 8GB vram that hurts performance at 4k full RT+PT. I would rather spend 1000 for Radeon than Ngreedia.
@@kinzie3915 The halo product was in reference to my 4090, not the 4080. I'm not sure what you're on about because the 7900XTX is a 4080 Super competitor and they both perform relatively similar, with the XTX being a bit higher in raster but worse in RT and upscaling. Pick your poison I guess. I genuinely don't care what you buy btw. Whatever makes you happy. I'm happy with what I buy. I enjoy all of them.
@@TerraWare If I should grab one of these, it would be that Nitro+ for 999€, just because of the 24GB of VRAM.. on the other hand, if the 4080S was at ~1000.. I would try that GPU. Sadly it's too late for them anyway because new GPUs are coming.
@2:16 seems to be the issue I covered in my video, right? Those small blips on the frametime graph as you turn. Doesn't seem to affect it as badly as Plague Tale, but the Radeon GPU doesn't have that.
Probably yea. I tried running through this same area, at 3440 X 1440 this time so overall FPS is higher but no blips really.
Wow... 30% more performance for 120W+ less power consumption is quite insane. EDIT: Actually I guess it wavers between ~70-80W and over 120W depending on the scene, still insane.
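For a rough sense of what that means in perf-per-watt terms, here's a minimal sketch using the ballpark wattages mentioned elsewhere in the thread (~460 W for the overclocked Nitro+ XTX, ~370 W for the 4090) and a nominal 30% performance lead; the FPS values are placeholders, not measurements from the video:

```python
# Perf-per-watt sketch with placeholder numbers: XTX normalized to 100 "fps",
# 4090 assumed ~30% faster, wattages taken from ballpark figures in the thread.
xtx_fps, xtx_watts = 100.0, 460.0
rtx4090_fps, rtx4090_watts = 130.0, 370.0

xtx_fps_per_watt = xtx_fps / xtx_watts                # ~0.217 fps per watt
rtx4090_fps_per_watt = rtx4090_fps / rtx4090_watts    # ~0.351 fps per watt

# Roughly a 1.6x efficiency advantage under these assumptions.
print(f"4090 perf/W advantage: {rtx4090_fps_per_watt / xtx_fps_per_watt:.2f}x")
```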
Wow.. Double the price for 30% more perf 😂
Good video bud. 7900XTX doing very well here considering RTGI is maxed out. Merry Christmas to you and yours!
Yes it does quite well. Merry Christmas to you and your family as well.
idk what they did with the last driver.. but my 7900XTX runs like 15% faster than before lol
What is up with the power draw on the XTX?? I thought it was like 350-something max.
Nvm just read what kind of XTX it was but still 450 watts is wild.
Why is the Frame time graph for the 7900xtx flatter than the 4090?
XTX is pretty smooth.
I have a question on your 7900 XTX
Is it OCed? Also - what is the hot spot delta vs the edge temp? I am asking because I have the ASRock Phantom Gaming one and it's great overall, but while edge temps are like 64 degrees, the hot spot is at 96 or so. Which is weird to me.
I've got the Sapphire Nitro+ and yeah it's manually OCed, both GPUs are. My delta between edge and hotspot is around 20C which is quite normal. Yours appears a bit higher than I would've expected, but I don't know a ton about the Phantom Gaming. I don't think it's anything worth worrying about as long as it's around 95C; as far as I'm aware that's within spec.
Hey! I have the Phantom Gaming and I made a custom fan curve. Otherwise I had the same delta. Now with the custom fan curve my difference is between 12-17C.
@@TerraWare hi! Are you running a custom fan curve with the fans always on and zero fan RPM disabled?
@@nikolaterziev93 I see.
The default is like that. I decided to undervolt it a bit and power limit it - this now made the temps 60/88.
Which, while still a high delta - I can't hear the GPU anymore so it's fine.
@nikolaterziev93 Yes custom with zero RPM enabled.
Maybe it's just me. But look at the image at 7:05. I think the one with RT off looks better.🤔
I think it's just you lol, but that's fine if that's what you think looks better. Full RT has ray traced subsurface scattering on plants, simulating sunlight and how it lights up and travels through semi-transparent surfaces like plant leaves; it's why there's a greenish glow in the jungle area with full RT enabled. Indirect lighting.
I agree
4090 is an absolute beast.
Would you advise ppl to play on a monitor or a TV for PC, and which one?
On a desktop I prefer a monitor. I use a 34 inch 3440 x 1440 ultrawide OLED display. I've tried my 48" OLED TV and it's way too big for me. I think a 40 inch max monitor or TV could work just fine though. It all depends on what works for you. My little brother for example loves using a big TV lol. I don't like to have to move my head to look at the entire picture.
Merry Christmas, Man!
Runs smoother on my ps5 pro, then again consoles are dedicated games consoles.
Have there been any patches that improve the path tracing on AMD? Or is that just pointless? Merry Christmas, btw!
Thanks, you too. The PT or Full RT settings aren't available on Radeon, as of right now anyway, but if they were it'd pretty much be pointless imo, which has been my conclusion in all the PT games I've looked at. That said, the game does use ray traced global illumination, which is maxed out in this video. The XTX does pretty well.
There's nothing that can save trashdeon cards while PT is ON. Trashdeon are obsolete cards guys, don't buy them
@@OmnianMIU oh dear Nvidia fanboy
@@johnsmith-i5j7i really? Why? Why call me a fanboy?? I use the best stuff I can get. What about you? Which is your GPU? A Radeon card? Why pay for garbage? For a card that gives glitches, driver problems, a sh it upscaler, RT performance = 0, and considering that RT rendering will be the standard...
@OmnianMIU lol driver problems? Nope. Never had issues with AMD drivers
also something that bothers me about the 7900 XTX is the power usage: 460W vs ~370W on the 4090
After the last update, on my 4070 Ti Super the FPS goes down when I turn on FG. When path tracing launched I could use ultra settings with the 3/4 path tracing option; now that's impossible because the game runs at 20fps, so I have to use high with path tracing.
If I had to guess you could be running into a VRAM limit. Try lowering Texture Pool Size to ULTRA. Restart the game and see what happens.
@@TerraWare thx for tips.
a graph showing how much VRAM each single option uses could be helpful, like in Halo and many other games.
@Analog_AI Yeah I agree.
But why are you comparing 2 cards that are not competing with each other? The 4080 is the 7900XTX competitor
I didn't know there were rules about what you are and aren't allowed to compare? I've compared my 3080Ti to the 4090 and the 6800XT to the 7900XTX. Besides, the 4090 and 7900XTX are the best current Nvidia and AMD GPUs.
But to answer your question, it's what I want to do, and a lot of people subscribe and watch these comparisons, which I've been doing for quite a while now on top of other types of content I like to make.
@TerraWare I understand, but it's like taking the 4080 to compare with the 4090. So I won't sub since it's a way-off comparison to what AMD and NVIDIA have claimed.
@@5thdimensionfpv390 👍
Just as a side note, my Sapphire Nitro 7900XTX clocked at 3GHz averaged 88fps in the forest scene, with around the same 450W of power consumption. I think it's a big thumbs up for AMD considering both the price and this being an Nvidia-optimized title.
What does your profile look like, if you don't mind me asking?
I have the max clock at 2.9GHz, 1100mV, +15% power
@@TerraWare +15% power, 1050mV, 2750MHz RAM. 1000W Seasonic Prime. I didn't touch anything else. Usual in-game clock is around 2985-3010 MHz
The situation changes dramatically once PT with FG is enabled. There Radeon doesn't stand a chance.
@sawomirbartosiak2487 Yeah PT is extremely demanding on both but especially Radeon.
When FSR 4 is announced (along with RDNA4) I hope AMD puts it in a lot games that have already released :)
Really impressive perf considering the GCD on the 7900XTX is 304.35 mm². Really wondering, had AMD not cheaped out and made it around ~400 mm², whether we'd have better perf. Instead they took the savings per wafer and didn't pass them down to the customer lol.
Radeon make some stupid-ass decisions sometimes. It honestly pisses me off. The hardware is good, the software is good, but pricing at launch and especially marketing need a revamp.
I would've marketed the heck out of their FSR FG advantage this generation and, in a lot of cases, smooth frametimes. Benchmark the heck out of F2P games and advertise it. Now they change the naming again if the 9070XT is true. That's dumb and confusing.
@@TerraWare Yeah AMD does that, high price at launch and slowly drop prices. Get initial negative reviews but then become good budget offerings. Reason? IDK.
Also don't get me started on their marketing LOL. I like what Intel is doing with naming, it's alphabetical defining the generation, then numerical to delineate the model.
AMD is just all over the place, including their CPUs. Went from 1, 2, 3000 to jumping a thousand to 5000. Mobile parts? May as well be a random number/letter generator obfuscating products to confuse potential customers.
What are they called now? Ryzen AI 300 series Pro Max ?? Must be illegal for AMD to have consistent naming/branding.
The developers stated in an interview there will be updates with more ray tracing enabled. There are a few issues that cripple even the best hardware too much, they said.
At 1080p the 7800XT should be able to handle path tracing, though maybe not at a full 60fps, more like 40fps
$1500 more for an extra 30fps isn't worth it.
The frame time on the 7900XTX is perfect; on Nvidia it's good too, but you can still see micro frame time spikes every second or so in some areas. And idk why, but I like AMD's image quality more, in the detail with camera movement and in the colors. I don't have an AMD GPU right now to test for myself, so it could be the recording software, but it's not the first time I've noticed it.... As for performance, I thought there was more of a gap; probably the Vulkan API runs better on AMD
Not really a good comparison, tbh. Anyone, like myself, that has a 4090 is going to enable full path-tracing in Indiana Jones. That's the whole point of having a 4090, no compromises. And of course the 7900xtx would've got absolutely crushed in that scenario. That's why it's a bad comparison. A much better comparison would've been the 4090 vs 4080 I think. Both can do rt well and you can enable path-tracing without the 4080 dying a miserable death.
Not solid numbers at all for AMD. Considering the 7900 XTX is under the 4090's performance, it can consume upwards of 100+ watts more for much less performance than the 4090. That says a lot.
The higher power consumption is fair enough, but the 4090 isn't a competing product to the XTX on price anyway.
@@beyondearth6418 the 7900 XTX is really half the price of the 4090. How is that not solid?
The 7900XTX is a $900 card when the 4090 is a $3000 card, at today's prices.
In today's market, the 4090 is a terrible purchase. I got mine at launch & I wouldn't dare pay the price it is today
The 4090 has been $1800 for most of the generation and the 7900XTX $900. It’s even more ridiculous now, with 4090s mysteriously costing $3K with the 5090 likely just a few weeks away.
Here the 7900 XTX needs new drivers to increase raw performance by around 10%.
Good video, but the 7900XTX is on par with the 4080 Super. So maybe that test would be a better comparison? I own both plus the 4090. Just a thought.
In my testing the 7900XTX does a better job and gets higher fps without path tracing.
Thank you for taking the time to do this.
It is a 4080 Super competitor, I don't have one though. It's more of a best of one brand vs best of the other, not necessarily two competing products if that makes sense.
@TerraWare if you need a 4080 Super I can send you the card. But of course I would need you to ship it back, my friend. I'm in the USA.
Hope you had a Merry Christmas 🎅
I own a business and build computers etc. for others, as well as doing IT work.
Awesome. Merry Christmas. Thanks. I appreciate the offer but it's not really necessary.
I'm not happy with the devs tbh
To not have upscaling or RT for AMD isn't good at all & the game's been out for a while now
FSR & DLSS should be day one & same with RT Options
Is it true also that the 12GB GPUs don't get any RT options?
@@RJTHEGAME it’s not laziness. They had a sponsorship with nvidia
@@jacobmoneymillyrock3665 Yeah wouldn't surprise me if Nvidia paid them off to do it
For me it's just anti-consumer to still not have both
It doesn't even matter if AMD are not good at RT; next gen GPUs might be, so that's why
The devs did an interview with DF and were asked about FSR. They said they didn't have enough time to include it at launch; because of their engine and Vulkan it was more complicated to integrate, and they have to test that it works properly across the whole game, so it will be coming later.
Full RT works with 12GB GPUs. I was running it on my 3080Ti at 1440P. As for why Full RT isn't available on Radeon yet, I don't think it's some conspiracy, because that doesn't make any sense; Radeon aren't competitive in PT anyway. It could be leading to crashes or not running properly given it's a modified version of id Tech, would be my guess. I don't know. Something similar was happening with Wukong: when running Full RT on Radeon the shadows and AO didn't look right and the hit to performance was immense, and that was a UE5 game, very well supported by AMD and Nvidia.
I wish all this stuff had made it in at release obviously, but it seems like Microsoft wanted the game to come out when it did and weren't willing to wait any longer. I'd guess because of the TGAs.
@@TerraWare Fair enough
Hopefully that's all true then, because upscalers should be a standard feature these days
@@TerraWare Given Cyberpunk and Alan Wake have "full RT" and it runs fine on the 7900 XTX (and will only get better in time with newer GPUs and Intel GPUs), it should be available nonetheless
For me it's the fact this game can use so much VRAM. 16GB cards like the 4080s are held back in this title because of the lack of said VRAM.
And the 5080 will be 16GB too. Nvidia are making it clear they do not care about the end user.
I hear ya, this game's a bit of an outlier though, since it uses a lot of VRAM and the 4080 runs out at 4K with everything maxed out, PT and DLSS FG.
It's quite the predicament, because Nvidia can be stingy with VRAM, especially in the low end, and the competition is generous with VRAM but not great in heavy RT or PT, which is when the 4080 could run into VRAM issues in the future.
@@TerraWare Even if the 5080 was 20GB it would lengthen its life significantly.
Would have been interesting to see PT numbers. The game has some ugly visual issues without it.
Well, PT settings aren't available on Radeon. That said though I did do a Full Ray Tracing breakdown video here ua-cam.com/video/jlKNV78uHHI/v-deo.htmlsi=PcBDmpVYvfFWh6C0
Did you download the AMD chipset driver so the 3D V-Cache is activated for the 9800X3D?
Of course
I have the 4090 in my PC, but since I bought the GPU I've only used it with a max undervolt, DLSS Quality at 4K, and ray and path tracing off all the time, because I just don't see that much of a difference with ray tracing on vs off. On top of that, a 120 FPS limit for 120Hz, so I get 116 FPS with Ultra latency settings. Yes, I'm also interested in a 5090, but I'd use the 5090 the same way, very limited!
If you've no interest in RT/PT the 4090 will be fine for quite a while, or maybe the 5080 if that's an upgrade over the 4090, which it doesn't really seem to be, but it's just rumors as of right now.
@TerraWare yes maybe I'll sell the 4090 and go down to a 4080/4080 super or even a 7900 XTX
@@Serandi1987 Now would be the time to sell a 4090; they're going for a lot of money and you could probably get back whatever you paid for it, but I'd hold on to it. DLSS is quite nice and you have plenty of VRAM for the years ahead.
@TerraWare yes, it's time for me to sell the 4090 and it is already on a second-hand selling platform. If I can sell it for up to 2000€ I will try to get a 5090 in the summer.
AMD at 100 watts more is a no-no for me.
that AMD card wattage is disgusting. They really need to invest more in R&D for power efficiency
It's because this game uses RT, which makes any RDNA3 card consume more; a little research is also welcomed
Also the guy has overclocked his 7900XTX and Sapphire cards tend to allow much more than most AIBs
It's a Nitro+ model I've manually overclocked as well. It's made to be pushed hard.
*For a limited time, get the Digital Premium Edition of Indiana Jones and the Great Circle™ (a $99.99 value) with the purchase of a qualifying NVIDIA GeForce RTX™ 4090, 4080 SUPER, 4080, 4070 Ti SUPER, 4070 Ti, 4070 SUPER, or 4070 desktop or graphics card, or laptop with a GeForce RTX™ 4090 Laptop GPU, RTX 4080 Laptop GPU, RTX 4070 Laptop GPU. Technology varies by GPU.
Yeah its an Nvidia sponsored game. Not sure what it has to do with anything though.
Don't care about RT. Don't see it when playing an engaging game!
Fair enough but you need RT hardware to run this game and going forward there will be more and more games like this.
It will get there. When I play Cyberpunk path traced, I choose to turn it on and sacrifice elsewhere. My first RT experiences on PS5 and weaker hardware left a lot to be desired, and made RT not look impressive at all. It isn't the tech that is bad though, it is the performance and application. With a path traced Cyberpunk, when you shut it off, you can clearly see how the lights are kind of in boxes and don't spread out well vs path traced gameplay. The lighting ends up looking fake without PT.
it's 36% more fps
not 136%, because the 7900 XTX is 100%
Um. Yeah. Relative performance. If the XTX is 100%, the 4090 is 136%. It's *not* 136% faster. That would make it 236%.
@TerraWare you are right, but just saying the RTX 4090 is 36% faster than the XTX is easier to understand, and the XTX is at 73.5% of the 4090's performance.
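To spell out the percentage math being discussed here, a quick sketch with made-up average FPS values; only the ratios matter, the numbers themselves are placeholders:

```python
# Relative-performance math: XTX as the 100% baseline, 4090 averaging 36% more fps.
xtx_fps = 100.0
rtx4090_fps = 136.0

faster_pct = (rtx4090_fps / xtx_fps - 1) * 100     # 36.0  -> "36% faster"
relative_pct = (rtx4090_fps / xtx_fps) * 100       # 136.0 -> "136% relative performance"
xtx_of_4090_pct = (xtx_fps / rtx4090_fps) * 100    # ~73.5 -> XTX delivers 73.5% of the 4090

print(faster_pct, relative_pct, round(xtx_of_4090_pct, 1))  # 36.0 136.0 73.5
```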
NVIDIA continues to disregard crashing issues in Marvel Rivals and God of War Ragnarök with the latest NVIDIA drivers.
While the latest driver actually claims to fix a crashing issue in GoW, it did absolutely nothing about it.
My guess is that they don't bother testing for longer than an hour, which is when this issue pops up.
Consistently, somewhere within one to two hours, the game image freezes with the audio still playing and the PC responding normally.
Is that right? I can test out GOW, I am playing through it. Marvel Rivals I have installed but haven't really played. Is it the 566.45 hotfix that still crashes?
@@TerraWare Well, I am not entirely sure if everyone has it, but there are quite a number of people with the same issue.
The hotfix mentions 2 games but not GOW or Marvel Rivals.
The 566.36 driver does mention it fixes crashing in GOW but the freezing issue never changed for me.
The freezing issue disappears on 561.09, but that gives me terrible performance in the realm between realms.
The number of reports on this issue is not very big, but considering it only happens after 1 to 2 hours of play, that may not be too strange.
Most people reporting the issue had high-end systems btw.
@@oropher1234 I see. I'll look into it. Sometimes it's a matter of raising enough awareness. I believe the 2 videos I did about the frame pacing issues in Indiana Jones, narrowing it down, probably had some influence on the hotfix, since those videos got quite a bit of views and we were also talking about it on X. Also sent it to my contact.
@@TerraWare Thanks a lot!
@@TerraWare And? Did you encounter the issue as well?
7900xtx for losers 😂
No way. The XTX is a pretty solid GPU
What is the purpose of this video? This comparison doesn't actually have any valuable information. Comparing completely different price point of GPU? Why?
Well, because it's what I want to do.
@@TerraWare But this is misleading. These GPUs are not comparable. It would make much more sense to show their results separately or maybe compare 4080 to 7900 XTX (even though 4080 is still more expensive than 7900 XTX).
You can compare whatever you want as long as you're transparent about it. In this context it's the best of Nvidia and AMD head to head. You may not like it and that's fine but I and many subscribers of mine like these comparisons.
@@TerraWare So why not to include Intel Arc B580 into this comparison?
@@GameTechLead Here's a better idea. Why don't you make your own channel, buy the hardware with your own money, and compare and do whatever you want.
Nvidia overclocked vs AMD stock lol, nice benchmark
A Sapphire Nitro+ 7900XTX running at 2.8ghz pulling 450 watts is stock?