Kryzzp, I'm a long-time viewer, from my old account to now. I spent multiple years in hospital after a depressive episode, and your videos and positivity, even around negative issues such as bad framerates and playing at 7p, really brought happiness to me at a time when nothing really did. Thank you so much for that
I'm glad you're better now, and I'm happy to know I could help in your recovery :)
I didn't know my videos could have such an impact. Wow!
Stay strong my friend! 💪😃
@@zWORMzGaming Thank you Kryzzp, your happiness is genuinely infectious :)
Kryzzp's content never lets you down
Me too man, nothing serious, but I was in the hospital for 3 days and your videos kept me happy 😃
did you know depression is something you manifest and isn't real
glad to see new videos from you!
🍻😃
@@zWORMzGaming Pedro bro, please try gaming on the Lenovo V15 laptop
I own the Asrock Taichi White version of the 7900XTX. I can't put it into words how much I love that card. Much love to the channel from someone who never comments on anything!
I bought this beast last year and I haven't regretted it one bit. I'm finally playing games in 4k resolution, which has always been my wish. A real refresher after the RX 460 that I used until last year.
Went from Intel HD 530 to an RTX 4080 here.
It's a 4k 60-100fps ultra (95% of the time) card! Nothing to complain about :D
I'll never get the hate on these things. lol
Man, I hope you upgraded ur CPU as well, lol.
@@Oozaru85 it's a 13700K
What monitor do you have btw, is it 4k 60Hz or 120Hz?
Let's goo, finally some 7000 series content!
I'm very happy about this 😃
@@zWORMzGaming yo brody, teach me the AMD and Nvidia hierarchy.
Like all of the GPUs' power levels or something like that (I wanna know everything).
If I get this GPU I would stay on a 1440p monitor, cuz most games are not well optimized and you can lose fps if you play at 4k
@@apcetojako8219 FSR exists :D
damn, bro's channel grew from the GT 210 to the RX 7900 XTX. nice 1, keep it up bro!
Was that the first card he used on this channel??
@@silentwitness536 it was the gtx 760
bought this GPU last year, it's a BEAST
if this is a beast then the 4090 is a literal god
@@sutnack7537 no, the 4090 is an expensive beast.
@@sutnack7537 nope. For gaming this is a beast, and also value for money
@@sutnack7537 that's why I'm waiting for the 5090, that card is gonna be 50-60% better 😂
@@sutnack7537 the 4090 is 1600. What GPU are you using?
Despite so many subscribers, your vibe suggests you are the most humble YouTuber in this range, catering to budget users and enthusiasts alike. Kudos to that
This is nice to see, the 7900 XTX is my favourite GPU ever, but it's still sad we've not seen the legendary GTX 980. It hit the 10-year mark this month :C it deserves some appreciation in 2024
I wasn't expecting to get this from AMD, so the 980 video got pushed to October!
As someone with a 7900 XTX, I have to ask, why is it your favourite GPU?
@@jherr856 Competitive pricing, amazing performance, and great features. Here in Japan the 4080 and 4080 Super cost 200+ dollars more than it. I would love to have a 7900 XTX compared to my aged 6900 XT. Also, no fire-hazard connectors, they still use the classic, trustworthy 8-pins, and their drivers mature like a fine wine. As a 10-year Nvidia user, I noticed when I switched to my 6900 XT that AMD had potential I wasn't aware of. For the last 3 years I've been team Radeon, and it's half the price of the 4090 here. So the second-best GPU in raw performance for half the price of the best GPU is pretty appealing, at least to me
Man, the amount of vids you release every week that are somehow good quality is honestly mind-blowing.
Sadly AMD won't make graphics cards on this level next gen
Yeah :(
Wait, why ?!!
Whyyyy?!??
@@Rizzmon1405 It's not worth the time and money. Next gen they're focusing on midrange cards and might return to the high end after that.
Why 😢
damn that CS testing was actually so good
That little upside down “shall we”, and the Bob thing gets me all the time. It always makes me laugh, so never stop doing it. lol
The CS gameplay was quite decent, love to see it
subbed btw
Thanks 😃
Waiting all week for this one! Thank you Kryzzp
I remember swapping from a 3090ti to a 7900XTX and being told that it'd be a worse experience. A year later, and I can confidently say that I get almost *double* the frames in some games, using the same settings. For someone completely uninterested in ray tracing, the 79xx lineup is just too good to be true. Even the GRE is phenomenal at its price range.
I just bought my 7900 XTX, I currently have a 3060 Ti, hope I made the right choice! Cheers
Just picked up this exact card and I love it!
I've been watching you for about 5 years now and I don't know why, but there's just something refreshing about the way you review GPUs.
4K max settings and 60 fps in any game, that would be a dream. I'm still stuck with my GTX 1060
That 1060 is getting old! Legend of a card though 😃
😂😂😂😂
Can probably run older games in 4K
If possible, save up some cash and upgrade to a 1660 Ti. Or better yet, a 2060 Super.
@@Reluctant_Hero Getting a 4060 since it's affordable enough and has low power consumption
I love seeing you test the 7900 XTX! I decided on this card versus a 4080 Super and I'm glad I did. I really enjoyed your 🤴🏼GTX 1080 TI🤴🏼 videos, as that was the card I upgraded from. Still have it! It kept
The previous version of this XFX model was beautiful. I've no idea what they were thinking designing this one.
Thank you for the entertaining videos, I've been a subscriber for almost 2 years now
Great video, and perfect timing too! I'm picking this card up in a couple of days to upgrade from an rtx 3060. I've got that urge to watch every benchmark video before putting the new pc together lol
Don't make a mistake! Better get the 4080 Super instead!
@@LW80 The 4080 super is stupidly overpriced and not worth it
@@LW80 I went for the 4070 Ti Super and am somewhat regretting it.
Should have probably gone with the 7900 XTX.
@@dawienel1142 You should have probably gone for the 4080 Super instead.
@@LW80 why would you spend 1.5k or more for 16GB of VRAM, are you dumb? 💀
as an Intel HD Graphics 5500 user, every game you benchmark looks sexy to me 💀
i have the Sapphire version of the 7900 XTX, and it's glorious
just bought one yesterday for $800 off amazon. can't wait for it to arrive today 😁
@@c0Ld nice grabbed the taichi myself. lovin it
@@mewre2062 I have kinda mixed feelings on it. The performance is great but the drivers and games crash way more often than my gtx 1080 did
@@c0Ld bet. loved my 1080ti.
@@mewre2062 How's your XTX been?
Bruuh, I absolutely Love my 7900 XTX. Gaming on it is a delight 😊 Everything basically plays at 4k native or Very close to it, maxed out @60fps+. Mine's paired with a 7950X3D, 64Gb DDR5 6000 CL 30 and 4TB WD 850X. It's really powerful with plenty of VRAM to go with it and that is USED at higher resolutions and settings. Anyway, if money is no object, buy the 4090. Now, if you'd prefer basically my entire PC for about the same price as that card alone, I highly recommend the 7900 XTX 💯
But I care about RT, especially in Cyberpunk, so I should go for the RTX 4080 Super cuz it's about the same price in China.
@@OPXENOVGAMER cool, I don't play Cyberpunk. As long as ur happy and chillin homie, all good
@@OPXENOVGAMER You can still play it with RT at 1440p well enough, and even at 4k, but you'll have to customize the settings to reach 60fps or close. However, if you're big into RT, Nvidia is the obvious choice.
my dude, I've said it before, but only you can make benchmark videos this entertaining, keep up the great work!
Glad to see my GPU featured! Except mine is the Sapphire Nitro Plus. Hopefully it will hold up until AMD makes a true successor at the high end.
probably the best one out there, since it has a higher boost clock.
Oh my god! He is doing GPU testing… that's the first time I'm seeing this.
finally a new review, hopefully you can do the RX 7800 XT, I plan to purchase it later
Had one since day one. great card. 0 issues almost 2 years now
Your benchmarks are fun and informative at the same time, and the conclusion hits home. AMD needs to do better, at least with upscaling, to be able to compete with Nvidia, but it's also the devs, they keep implementing older versions of FSR and that is frustrating (I'm an AMD owner as well). And yes, we do need a 1440p video, haha! 😀
I really liked your conclusion.😊
I will watch this video tomorrow while having breakfast. For some reason your videos are good breakfast content for me.
Haha, enjoy it tomorrow 😃
Have a great sunday!
I thought it was only me 😂😂 having lunch or breakfast while watching this kind of video is freaking entertaining
Got one of these monsters last year and it's such a damn good card
I have the Nitro+ Vapor-X 7900 XTX and it consumes around 340-380W. I have it limited to the stock power limit (AMD reference), as AIB cards like to factory OC/increase power draw. Using it at 4k.
Same here. My Nitro+ pulls ~400-550W at stock settings and I gain very little in performance, so I dial back the power limit and undervolt to a max of 380W.
Are you folks undervolting? I can go down to 1045mV and it makes a huge difference in power consumption.
@@skylab4404 Nope, which is why I am saying these cards don't have to draw as much power as people make them out to be.
😮 The Sapphire 7900 XTX Nitro+ maxes out at 42.9 teraflops vs 44.6 for the 4090. The 7900 XTX has better ROPs and pixel rate, and the same memory; the 4090 has a ~30% higher texture rate. Max power is 550W vs 600W for the 4090. It's just a hidden 4090 for 1000 bucks. You have gold in your hands
@@Accuaro CPU limited??? You must have a 30,000 Time Spy CPU score
And another 1. Wow. And this 1 a long 1 also. Broooo. What. A goat. Gj.
Keep up the great work my g
I have the same GPU, the XFX 7900 XTX, it's a beast, I play all games at 4k 60fps!! It's an amazing one, and I'm happy that you tested it in many games!! Very nice video
Yup. This mf can handle any game and will be able to up until 2028
@@_P_T_H_ no it can't
@@Dempig yes it can
@@Your_Friend_Devan No it can't. The 7900 XTX can't play any UE5 games at 4k 60. The 4090 can't even do that.
@@Dempig I play Tekken 8 at 4k at 60, ur just lying
Those GPU temps are just crazy :D
Loving my Asus TUF 7900 XTX. Went from an RTX 3080 after 15+ years of being team green. Honestly it's a great card and brilliant value, but what really does it for me is the software and drivers. I was literally infuriated when Nvidia brought out frame gen but wouldn't let 30-series card owners have it, basically "f you, buy a new card", and I found myself having to use AMD's framegen software as a hack. But since going with AMD there have been several free updates with two different versions of their framegen software. For example, now with Cyberpunk I can use a feature in Adrenalin to limit the power draw while having everything maxed out, no upscaling, and doubling my fps with framegen to 120fps (what my monitor can do), and it's perfect. Every couple of months there'll be something new for free that just works great, but with Nvidia I always felt that once you bought the card they'd be happy for you to go jump off a bridge.
Just curious. What setting do you use in adrenalin to limit power draw?
@@michaelgooley229 I think it’s called iChill, it’s an fps limiter rather than a power limiter, but the effect is the same. Set that to 60fps (so 120fps with the second driver for framegen 2) plus an undervolt and the power draw is way less. It doesn’t really affect performance that much, it mainly just stops the card from trying to max out the fps in parts of games where there’s not much happening on screen, in menus and some cut scenes, etc.
@@sargonsblackgrandfather2072 I went from a Gigabyte 3070 to a Sapphire Pulse 7900 XTX. I might try Radeon Chill. My refresh rate is 165Hz though, so I might have to do some more research
@@michaelgooley229 nice I had a Gigabyte too. Really nice card actually. Only upgraded because the memory was too low and I wanted to go nuts with mods for Cyberpunk lol. Yeah iChill definitely worth playing around with.
@@sargonsblackgrandfather2072 unfortunately my 3070 had some very irritating coil whine. I mainly upgraded because my 3070 wasn't doing very well in Hunt: Showdown anymore since they updated the game engine. Love my Pulse 7900 XTX though, it has zero coil whine, so that's a bonus
I didn’t even think this build would be able to run games in native 4K on ultra settings at 60+ frames without an upscaler. I’m pleasantly surprised, what a beast. It would be interesting to see how much FPS there would be with the FSR 3.1 upscaler and frame generation.
I think AMD needs to focus more on developing FSR than on the raw power of their GPUs at this point. They clearly have the horsepower to go head to head with Nvidia, but with most modern titles needing some kind of upscaling, Nvidia still has a pretty overwhelming advantage with DLSS
Native raster power over upscaling all day. Nvidia's DLSS is the main reason game devs are adding it to the recommended requirements of games.
Best advice when purchasing a GPU is to buy one that runs natively at the fps you want first, then turn on upscaling if needed. Don't purchase a GPU that gets you the fps you want only when upscaling is turned on.
That's not even considering whether the game devs even implement the latest upscaling tech. There have been many games lately that get first- and second-gen DLSS/FSR at launch, with the newest versions released 3-6 months later.
@@mattblyther5426 Hard to say that in the modern market, when hitting your resolution targets for some modern releases is nearly impossible. Hell, we've got devs releasing benchmarks including features not even intended to be used the way they're being recommended, like MHW requiring frame gen to reach 60 fps. I hate it as much as you do, but we've gotten to a point where you can't ignore upscaling, and it really is an important feature in a GPU since it's almost a given you'll be using it on modern titles.
The upload time is perfect! Got to watch it while eating Subway. Cheers Kryzzp!
Nice! I really like subway as well, sometimes I go there after riding my bike and eat 2 footlong subs😂
Cheers mate!
Need a 1440p video on this card pls
I have one and I can max out my 165hz monitor pretty much all the time
Like the 4080 Super, it should crush every game at 90-120 fps (as long as it's not UE5 or terribly optimized)
4k gaming is insane.. I play Escape from Tarkov, and the map "Streets of Tarkov", it looks so freaking good.
Glad I picked mine up for £750, when the 4080 was £1000 and the 4090 was £2200. Does everything I want, not fussed about RT.
Great high end price to performance card. A beast in 4k
I own the Gigabyte Gaming OC version of this card, it is an absolute beast, love the video mate!
his honor in RDR2 is at -1000000
I am getting mine delivered tomorrow. I am so excited
I've had an ASRock Radeon RX 7900 XTX Phantom Gaming OC since August of 2023 and I know I made the right choice. There's NOTHING it cannot play, period.
In the future, will it be able to play GTA 6?
@@UM_88-s2c ofc
Really underrated channel and one of the best GPU reviewers... Hope you get more subs Kryzzp 🎉
I don't know why I'm watching this, I have my final exams tomorrow and I've just studied one topic. Even though I have a gaming PC, I only have an RX 580, and have nowhere near the amount I need to upgrade even to an RTX card. Although I enjoy this because this guy's videos are amazing
Well, since you're going to school, you're on the right track to be able to afford the best card available when the time comes to upgrade. It'll be well worth it, especially if you don't have to go into debt just to afford it.
Glad Kryzzp at last did an extended review of this card. I have the 7900 XTX Nitro+ and everything else is exactly the same as in Kryzzp's PC, same mobo, same CPU, even the SSDs lol.
My new GPU since last week ❤
How are you liking it so far? 😃
@@zWORMzGaming Friend, this GPU is a monster! I love it! Definitely one for running 4k/144Hz 👌🏻
Great! Need more games to test, such as Hellblade 2, Space Marine 2, Nobody Wants to Die, Star Wars Outlaws.
Nice t-shirt Kryzzp! Love Maiden
Thanks! My favorite band 🤘
Waited like 7 months for this video! ❤
bro you are lowkey cracked in cs :D
Lots of hours playing it back in 2014 +/- 😃
@@zWORMzGaming you follow competitive CS?
The speeds of these uploads are crazy bro
7900 XTX, here to represent
💪💪
22:53 Bro Really Wanted To Kill Bob🤣🤣🤣
just bought this card. gets here Wednesday. Can't wait!!!!
I am disappointed in the ray tracing performance of the AMD flagship. To me, the reason to get a flagship nowadays is to go with ray tracing.
It's also interesting how 24GB of memory on a wide memory bus doesn't seem to help as much as you might think. Although it did seem to fix all the stuttering in Fortnite.
What is the 24GB supposed to solve?
@@laszlodajka5946 well, it's future proofing. I have an 8GB RX 580, and it has had a second life recently with games that use more than 4GB of VRAM. It could be that no game uses 24GB, but soon they will use more than 12 at least, and that is where having more than 8GB or 12GB will let this card breathe.
Although I think the sun is setting on rasterization. AMD really needs to put more focus on ray tracing.
@@Layla-p2h games that basically require RT are coming out, and remakes of old games with RT are now a huge reason to get ray tracing. Even my RX 580 can play Cyberpunk, for example. But the compelling difference would be having RT in Cyberpunk.
Ray tracing 😅. It's ugly. And not realistic
Nice video, it's really a monster GPU.
Great video
Thank you!
literally a beast, waiting for the Bodycam game test...
I run this card with a 5800X3D, no reason to upgrade anytime soon.
Works for me as well. Moved to a water-cooled setup (720mm custom loop) and the GPU runs 3GHz core and 2.8GHz mem, undervolted. Good results so far.
@@skylab4404 nice bro! I have the stock cooler on mine, I keep mine at stock clocks but undervolted and it never goes above 70°C, I'm happy with that.
Those are really weird cyberpunk numbers. I'm running and driving around the same part of the map with a 5800X3D and reference 7900XTX at Native 4k ultra settings with psycho ssr no RT and I am getting 85-95 fps. That's so bizarre. Awesome vid!
Wormz is known for benchmarking everything other than your household appliances, but what he's not known for is his gaming ability. Dude's a gamer, man. Just casually racking up triple kills in Fortnite while concentrating on frame time graphs and power consumption numbers.
Crazy.
your cs gameplay is so relaxing to watch
the raw performance of the latest AMD cards is really crazy for their price, sadly they are not that good at raytracing
Gosh dude if you made a channel where you just played CS or an FPS title of your choice I would eat that up. You're genuinely good at these games
That PC will give you a high electric bill 🤣
Yeah 😅
@@zWORMzGaming What 4k monitor do you use for daily gaming? 🤔
Just not in Russia. We have cheap electricity))
Think he said it's an LG C2 OLED but not sure @@TheMostSavageOfAllGeese
I have mine undervolted to 700mV at 2250MHz. It runs at 280 watts on average. Unfortunately some unoptimized games can be very sensitive, but most games I've tried work fine for hours.
He is alive!!!! Finally a new video!! How u doing man?
If you're pairing this with an older Ryzen, like the very popular 5600, the XTX is faster than the 4090. That's because the XTX handles scheduling on the card, whereas the 4090 shunts it to the CPU, and the 4090 comes with a whopping scheduling burden.
XTX is a great card, but come on now… it’s not faster than a 4090 lol
@@Rhamses220 It is with a modest CPU. The 4090's scheduling burden is so high.
Of course in practice how many Ryzen 5600 owners use an RTX4090 and then run it at 1080p? Very few I think! However, AMD's ongoing decision to keep scheduling on the GPU does have benefits.
Finally you got a 7900 xtx
I think the ray tracing performance is similar to a 4070 Ti or 3090, so it's good, but not "$1000 GPU" good, and I personally would get the 4080 Super for that price. But you can't deny the 7900 XTX is a legendary GPU to have from 2024 onwards.
Bought this last year. Great investment, what a card.
If you can't upgrade the gpu, downgrade the monitor 🗿
I can upgrade the gpu, but I'm still using dell 19" monitor with 1366x768 60hz resolution 😂
@@CasuallyHuman099 Damn, what are the rest of your PC specs?
@@CasuallyHuman099 I use that res too, but I can't upgrade the GPU lol
@@DebelakaPG3D a Ryzen 5 4600G and an RTX 4060; I know my CPU is weak, but for 768p, 60fps is enough for me
@@Capp0 I feel you bro, I've been saving for 2 years just for a GPU upgrade
Shadow quality in Ghost of Tsushima: set it to the one below ultra. Minimal visual downgrade, 30% to 40% improved performance.
Awesome video and an amazing 🤩 GPU, the RX 7900 XTX by Radeon 💪🥰👍. I agree it needs to be a bit cheaper than the 4080 Super, as Nvidia offers better features with DLSS and ray tracing and waaaay better production software support 😇👍.
I have the 7900XT, it's perfect for 1440p and can even do 4k on less demanding games. The XTX is around 20% faster or so on average, but my card is still fine enough for gaming.
The best benchmark channel❤
Thank you for the video, will buy this card on black friday, was super curious how it is nowadays with the drivers
Love ur vids man, they are the best GPU test videos on planet earth, definitely subbing ❤
Thank you so much 😃
You should reboot your Minecraft survival series with different GPUs per episode because of the new 1.21 Tricky Trials update, just a suggestion :)
I'm still rocking a 6700 XT and have no complaints lol. Maybe I'll upgrade to an 8000 series next year. Great vid too btw.
Agreed about the smoothness aspect of Battlefield.
I have played Battlefield 2042 at 900p 60 fps and was able to maintain a K/D of 0.8-0.9. The game just felt smooth enough to rush in and have some fun, get some kills, etc.
entertaining benchmarks as always
WOOHOO!! DO MORE ON RX 7700 XT, RX 7800 XT, RX 7900 XT, and RX 7600 XT only if you can.
Just got done playing some Cyberpunk on my 7900xtx :)
I owned a GT 730 when I first saw your videos, then I bought a used RX 470, but sadly you never made a video on the RX 470. I bought an RTX 3070 a few months back. Love from India ☺️
Fantastic video brother just subbed
Glad you liked it! Thank you 😃
In Rust btw, can't quite remember what setting it was, but that fixes the stuttering
I just got a Hellhound 7900 XTX for £700, it's a no-brainer over Nvidia as I want the VRAM for my Quest 3 and my 4k monitors. Less than half the price of a 4090, which seems to have spiked to around 2k as stock runs out since it's EOL
I just got the XFX 7900 XTX for £670! Absolute bargain. I owned a 3090 previously and this matches it for ray tracing, and it easily matches the 4080 Super in raster
FINALLY, AMD's top model! I've been wishing for a video like this, and it’s even the exact same card I have. I’ve personally downclocked it to 2GHz, as it doesn’t lose much performance in most games, and since I only need 1440p at 120Hz, it’s more than enough. Plus, it only draws 250-260W under load instead of 400W, so I’m really happy with the card. The only downside is idle, where it draws 100W no matter the clock speed, which is a lot for idle.
Didn't AMD fix the idle consumption with an update? My 7900 XTX draws around 40 W at idle. You can also try this video ua-cam.com/video/4B0Vt_c-7C8/v-deo.html&ab_channel=AncientGameplays
@@heavydee6795 Yes, this issue is often mentioned, and they’ve claimed it’s fixed in driver updates, but not in my case or for many others. I have a Samsung Odyssey G93SC OLED 49” ultrawide in PBP mode at 1440p 120Hz, along with an older 1080p 60Hz monitor above. The bottom monitor is connected via HDMI and DP, and the top via HDMI with an HDMI to DP adapter. With all three connected, my GPU idles at 100W, but without the 1080p monitor, it drops to 7-30W. It seems to be related to the VRAM clock, which stays at 2500MHz when the top monitor is connected. I’ve tried everything.
Finally a "GPU card" in xxxx year!
It's been a while.
Hey Pedro, I just heard about the fires in Portugal, hope you are safe man
You should do more benchmarks with Minecraft shaders, especially with the 7900 GRE, XT, and XTX cards