I'm sure someone else has already pointed this out, but the card has DisplayPort 2.1, not 1.4.
Just have to wait for monitor manufacturers to actually put a DisplayPort 2.0/2.1 port on a device. Good to know AMD future-proofed the 7900 XTX for future displays.
Quite relevant these days with the 4k 240hz oleds...
It is kind of pointless for now. Like Thunderbolt 5, PCIe 5.0, M.2 Gen 5, Wi-Fi 7... DP 2.1 is good to have but not something you actually need to use yet. 4K 240Hz is kind of expensive, and I'm not sure the other manufacturers will put DP 2.1 on their monitors for the time being.
@@phucvo4325 not at all. It is important to have the proper bandwidth for the proper resolution and refresh rate...
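For anyone curious, here's a rough back-of-the-envelope check of why the bandwidth matters. This is only a sketch: the link rates are the VESA-quoted max payload figures, blanking overhead is ignored, and UHBR13.5 is (as far as I know) the DP 2.1 tier RDNA3 cards actually ship with.

```python
# Rough check: does 4K 240Hz fit through each DisplayPort generation?
# VESA-quoted max payload rates; blanking overhead is ignored, so real
# requirements are slightly higher than this.

def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

dp14_hbr3    = 25.92  # DP 1.4 (32.4 Gbps raw, 8b/10b coding)
dp21_uhbr135 = 52.22  # DP 2.1 UHBR13.5 - the tier on RDNA3 cards
dp21_uhbr20  = 77.37  # DP 2.1 UHBR20 - full-spec tier

need = uncompressed_gbps(3840, 2160, 240, 30)  # 4K 240Hz at 10-bit colour
print(f"4K 240Hz 10-bit needs ~{need:.1f} Gbps uncompressed")
for name, rate in [("DP 1.4", dp14_hbr3), ("DP 2.1 UHBR13.5", dp21_uhbr135),
                   ("DP 2.1 UHBR20", dp21_uhbr20)]:
    print(f"{name}: {'fits' if need <= rate else 'needs DSC'}")
```

So 4K 240Hz at 10-bit works out to roughly 60 Gbps uncompressed: DP 1.4 only gets there with DSC compression, while DP 2.1 handles it natively on UHBR20 (UHBR13.5 still needs light DSC at 10-bit but is comfortable at 8-bit).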
@@phucvo4325 I have a Gen 5 SSD and two GPUs with enough lanes to handle all three....
Seriously, the Crucial T700 Gen 5 SSD is 20% faster than any other Gen 5 SSD, and my mobo didn't even officially support it.
I've owned a Sapphire RX 580 and I currently own a Sapphire 6600 XT; they're always so quiet and perform amazingly.
Sapphire makes amazing cards.
@@prophetgoogle7071 And really sharp-looking. I pulled my RX 580 a few months ago and still open the box from time to time just to look at the thing.
@@crhoads4278 :) My HD 6570 is still working; I've had it since way back in 2011 or 2012. Cool..
I have a Sapphire RX 580 Nitro lol
The Sapphire 580 Nitro didn't run very cool ;)
Staying under the four-figure price bracket is important to buyer psychology; what's a couple hundred more when you're already committed to spending over a grand?
The $999 price really gave the AIBs nowhere to go.
I love Sapphire AMD hardware, they always make great coolers. This is a good model for sure.
Sadly the new AMD range isn't wonderfully exciting, so there are limits to what they can do, but I can see this one being in the top 5% if you really wanted a 7900 XTX.
Yup I love my Sapphire RX6800 🥳
The 4080 is a better card, with proper support for ray tracing and DLSS 3.
@@johnerikson2443 Depends on the game.
It really comes down to personal preference and game choice. Lastly, AMD is "cheaper" while both cards are ridiculously overpriced..
@@johnerikson2443 Good to see you shilling across multiple comments LoL
I've been very happy with my Nitro card too... getting a very stable 15% overclock with just the Adrenalin software. A ton of other outlets are saying the AIB cards aren't better than the reference (without testing it themselves, of course), but that's not my experience. Maybe I got a golden sample though, who knows.
How can AIB models be worse when the reference cards overheat? 😛
That would've been the worst-engineered generation of GPUs in a long time.
@@MLWJ1993 Ignorant of the facts much?
Did you look into why it has such an issue?
Nvidia GPUs also have high idle power draw if you connect multiple 4K displays at 240Hz.
None of the reviewers talk about the overheating because it's a bug you can easily bypass.
If it's such a big deal, why aren't more tech channels posting about it like Nvidia's melting cables?
@@dra6o0n We're not even considering the idle power draw here, which is very much a thing on both manufacturers' GPUs. That's something I'm not going to deny, and I definitely don't appreciate you putting it as if I ever claimed otherwise.
What is an issue, however, is a cooler design that lets memory modules overheat, something that was a problem with Nvidia's reference designs for the 30XX series as well.
The melting power cables seem to be very much user error, i.e. something that can be avoided by connecting your GPU properly. A poorly performing cooling solution, however, has no real fix outside of replacing it; so long as AMD doesn't offer that, it's as much a "don't buy" verdict as it is for reference 30XX GPUs from Nvidia. The problem is that the better-designed AMD GPUs this generation (the AIB models) happen to be marked up to the point that they're no longer anywhere near MSRP, i.e. as lacklustre in value as Nvidia's 40XX GPUs right now.
@@MLWJ1993 Perhaps AMD was expecting people to not overclock their reference cards or try to break the power limit through soft power plan hacks.
@@dra6o0n There are people with stock reference cards experiencing overheating and even shutdowns. Just take a look over at r/AMD. Surely you're not going to be the ignorant one calling me ignorant, right?
On the plus side, the fan serviceability is a great step towards a design with easily replaceable third-party slim fans that someone might consider manufacturing, and the extra connectors for PWM control of nearby case fans based on GPU thermals/fan curves might pave the way for more solutions like it...
Yeah, it's a good idea; I was happy to see it.
@@innocentiuslacrim2290 There was a Noctua edition of some of the 3000 series NVIDIA cards. The lack of 4000 series updates suggests it didn't do as well as we would have hoped, unfortunately.
Chiming in from the future, and we have Noctua 4000 series here.
Does this change your opinion on this topic?
Been looking forward to seeing a custom RX 7900 for a while.
Thanks for all the great efforts! I really appreciate the sound of the card.
We are living in an era of enormous graphics cards.
Very true. I have a 4080, but what you lose in size you make up for in temps and noise. Mine is whisper-quiet and never breaks 60°C.
Mine is supposed to ship today or tomorrow. It's going to be like trying to squeeze a console into a PC case. These flagship cards are massive.
@@D90NAS77 These huge cards must have great passive heat dissipation. I wonder if I can play some games with the fans at zero RPM.
These cards are so quiet it is really impressive.
I guess you didn't hear that coil whine, right? It's not terrible, but I wouldn't call that "quiet."
@@selohcin This card gave me no coil whine at all. Maybe some frequencies of my ears are shot, dunno, but so far the 7900 XTX has been a beast with no whine or stuttering, running alongside a Ryzen 9 7900.
looking forward to seeing which AIB model is the best, this one seems like a pretty solid candidate
Yeah, I'd say this is right up there. I refuse to give Nvidia my money for the 4000 series; it's massively overpriced.
Yeah this one is the best, Sapphire are brilliant, always have been.
@@hairsprungdutchtechnique9626 Bruh, so is AMD with its 7000 series, what's your point? Just hating?
@@hairsprungdutchtechnique9626 Both are overpriced rn, but at least the XTX is slightly better value.
@@Hietakissa This card costs £1300, which is more than even the best £1200 ASUS TUF and Gaming X Trio 4080s. So how is it better despite having only equal raster, 50-60% less RT in Cyberpunk, Dying Light 2 and The Witcher 3, worse upscaling, no frame gen, no RTX Remix, less efficiency and SIGNIFICANTLY worse VR performance?
Awesome card and another great review Dominic, good way to end 2022!
Love seeing these cards without the heatsinks, as that is bleeding edge tech you’re seeing right there.
Just the other day I nearly had a heart attack when a thumbscrew fell into the cooler housing while I was trying to install the card.
Personally, I thought the SAPPHIRE NITRO+ RX 5700 XT was one of their best-looking designed GPUs
I miss my Nitro+ 5700 XT, one of my favourite ever cards!
May I correct an error in the spec rundown? It's not 2x DisplayPort 1.4, it's 2x DisplayPort 2.1, as you can read in the Sapphire specs on their site.
Certainly worthy of correction. Just an empty spec on this gen :/
My Sapphire 7900 XTX idles at 29 to 31 degrees; under load while gaming it's about 60-62. It's a badass card visually as well. I switched from Nvidia back to AMD graphics recently. The 7900 XTX is powerful.
Do you regret it? Everyone says the Nvidia cards perform a lot better.
@@RTXRetroGo ...man... I regret just buying a PC... it takes up too much of my time.
@@RTXRetroGo Nvidia and AMD are basically tied. Or they were before the 4090 came out; overall AMD is better because of the price and extra VRAM. That's my personal opinion though.
@@RTXRetroGo Ngreedia is better only if you care about RT, DLSS or power consumption. If you don't care about those, Radeon cards are better. I just wish AMD could make FSR equivalent to DLSS.
@@RTXRetroGo I talked to someone who bought a 2080 Super about my build (7600X/7800XT); he was surprised I went with AMD. I told him I've had no driver issues, no crashes, and get 2K 144 FPS with RT Psycho, ultra settings, in Cyberpunk, and he was kinda surprised. I then told him the game doesn't even give players anything higher than FSR 2.1 while FSR 3.1 exists, so he knows the card absolutely destroys any Nvidia card in the same price range.
I got mine a few days ago and the coil whine is crazy. I'll be sending it back.
@@Player7BR If you are talking about a whining sound (and not just the fans), it's luck-based.
Even within the exact same model, some will have it while others won't.
What size is your PSU? I had coil whine until I went with a 1000W ASUS.
Funny thing... the coil whine turned out to be from the PSU, not the GPU. Only noticed it after I decided to run a test with the PSU outside the case.
@@siveonfarcloud4190 Glad ya solved it at least lol, these 7900 XTXs are power hungry and spike hard.
Merry Christmas Dominic, thanks for all your hard work; I've enjoyed your in-depth, honest reviews this year. Happy 2023!
So around 9:16, is that BIOS 1 or 2? Because it looks like you're using BIOS 1 and it's drawing more than 100W extra for a 1 to 4 fps increase. To say that's stupid would be an understatement. Please specify for all benchmarks which BIOS you're using, and if possible I'd like to see the power draw compared to the reference card. I was about to buy it until I saw this video. I can do without Nvidia's DLSS and RT performance, but stupid power consumption is where I draw the line. I was fine with 50-60W more than a 4080, but that's insane.
I keep it on BIOS 2, overclocked and undervolted. Power consumption is 390W at max. I get an extra ~12% fps when running FurMark with this setup: 109 fps on stock BIOS 2 with 350W power draw, up to 122 fps at 390W.
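For what it's worth, those figures check out as an efficiency-neutral tune, not just extra fps - a trivial sanity check on the numbers above:

```python
# Perf-per-watt check using the FurMark figures from the comment above.
stock_fps, stock_w = 109, 350   # stock BIOS 2
tuned_fps, tuned_w = 122, 390   # overclocked + undervolted

print(f"fps:   +{(tuned_fps / stock_fps - 1) * 100:.1f}%")   # ~ +11.9%
print(f"power: +{(tuned_w / stock_w - 1) * 100:.1f}%")       # ~ +11.4%
print(f"perf/W: {stock_fps / stock_w:.3f} -> {tuned_fps / tuned_w:.3f} fps/W")
```

Roughly 12% more fps for 11% more power, so perf/W stays basically flat - a far better trade than the BIOS 1 behaviour questioned above.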
@@Shulcyo What are your settings, if you don't mind sharing please?
This is maybe the best GPU visual design I have ever seen. Sapphire - congrats!
I do not like the rounded edges of the card. A better-looking card, imo, was Sapphire's 6950 XT Nitro+ Pure. Of course, MSI's SUPRIM is the nicest-looking card of all.
Best = Zotac 4090
I've just bagged a new one of these for £999.99; happy with that now I realise it's £300 cheaper than at launch...
Funny thing is, at 320mm long it's actually short compared to the other custom cards.
But 3.5 slots 😊
Love the fan header on the card 🔥🔥
Amazinggggg review... the power draw is a crucial part which many reviewers can't even explain... love u
Looks like a really high quality card.
I'm still running a Sapphire Nitro+ 8GB RX 570, still a 1080p beast.
I have a reference PowerColor 7900 XTX, and it runs and overclocks just like this card does. The hotspot usually doesn't hit more than 84°C, and the average hotspot is in the 60s... sometimes low 70s depending on the game. I think the thermals are being blown out of proportion by anti-AMD outlets.
4k is more demanding than 1080p
Apparently GPU orientation (i.e. whether it's horizontal or vertical) and contact pressure are causing most of the hotspot temperature issues, rather than the coolers themselves being insufficient.
@@zuhaeraziz7966 so I've heard. Mine is horizontal.
Nice n=1 vs dozens of reports confirming the issue.
@@syed2694 I don't doubt people's cards are overheating, I just doubt it's as bad as the news outlets are saying. It's hard to tell. I'm certainly willing to accept it's bad, but I imagine it's a small percentage of users who will actually end up affected.
A powerful brand and a first-class giant on the AMD side. AMD should be grateful for them.
My Nitro topped out at 526 watts; pretty insane when you see how easily this card does that.
Interesting to see the power draw, thanks!
I wanted this card originally but the price is way too high; if it were around MSRP it would make more sense. People slate the 4080 for its pricing, but the XTX AIB models are proving to be the same price, so I would rather take a 4080 then.
Considering how much power this AIB card can draw, with its 500-watt range under load compared to the RTX 4080 FE (around 300 watts in the graphs here), the RTX 4080 will be "cheaper" over time than those 3x 8-pin 7900 XTX cards.
Where I live it is very unlikely that electricity prices will go down within 5 years, so all these high-end cards will not only be overpriced, but also have too high a running cost.
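To put a number on that, here's a rough running-cost sketch - the kWh price and daily hours are assumptions, swap in your own:

```python
# Back-of-the-envelope extra running cost: ~500W overclocked XTX vs
# ~300W RTX 4080 FE under load. All inputs are assumptions - adjust.
delta_watts   = 500 - 300   # extra draw under gaming load, worst case
hours_per_day = 3           # assumed daily gaming time
eur_per_kwh   = 0.40        # assumed electricity price

yearly_kwh = delta_watts / 1000 * hours_per_day * 365
print(f"~{yearly_kwh:.0f} kWh/year extra -> ~{yearly_kwh * eur_per_kwh:.0f} EUR/year")
print(f"over 5 years: ~{5 * yearly_kwh * eur_per_kwh:.0f} EUR")
```

At those assumptions it's around 220 kWh, or roughly 90 EUR a year - a few hundred over five years - though at stock power limits the gap between the two cards is much smaller.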
This card wouldn't make sense at MSRP at all. It's more expensive to make than the reference model (just look at it, it's easy to see), it's proven to be much quieter and cooler, and it overclocks better by about 7% or so. It's Sapphire's premium line, so if you want cheaper you go for their Pulse line. This is currently their best-engineered card until the Toxic comes out which, if it does, will be even higher quality. Sheesh, it's like buying a Calvin Klein shirt and expecting it to cost the same as Hanes. Learn what materials were used, the labor costs (good engineers deserve to get paid well), and the margins needed to keep making good products and pay for the research that goes into better ones. Even the fans on this Nitro+ are top-tier technology and better than the reference. You sound like an uneducated consumer. Go buy a cheap, loud, warm card instead.
@@vigilant_1934 OK, but how big a premium is justified - 10, 20, 30% or more?
Yes, but the 4080 AIB cards are also priced higher than reference.
@@beeping2blipping Bruh, are you comparing the max wattage when overclocked to a 300-watt stock number? If you've got the new AMD driver and you don't overclock, it's about the same. But the AMD card has the headroom to overclock.
thanks for the noise testing
Where is the Sapphire Pulse card? Waiting for something other than a reference card that can fit in my case.
I am happy with my 7900 XTX purchase so far. I never use ray tracing, and I use it in my Lenovo P620 case.
This is why I am leaning toward an AMD card for the first time. From a pure rasterization standpoint, the 7900 XTX seems like the best card on the market for price per frame.
@@stevecolianni1499 100%
Can we get a card like this with only 2x 8-pin, please?
Yeah, seems like keeping it at 350 watts plus some undervolt is the way to go for thermals and noise (perf doesn't increase much anyway).
That is a cool video card. Thanks for the review.
I’m really hoping I made the right choice. Got the card off Amazon last night on sale.
Waiting for the 7700/XT Nitro model ^^
And a £300 premium! AIBs are really taking the piss!
The whole GPU market is taking the piss!
Better cooling, better overclocking headroom, a vapor chamber, plus it's a lot quieter than the reference model. It's worth 1200 at the very least.
It is hands down the best-looking Radeon card I've owned. It was never going to go for 1K, was it?
AIBs are forced to because Nvidia are total dickheads and leave them with almost no profit margin unless they crank the prices
@@Bang4BuckPCGamer But no better performance!
@@Bang4BuckPCGamer Well, based on the OC results in this video, this card was never worth the price hike, was it?
How does the coil whine change on BIOS 2?
I don't have a lot of money, which is why I hesitate more over my purchases, but would you say it's a good buy at €950, without knowing what's coming in terms of future prices or new models?
How far away from the GPU was the microphone during the coil whine test?
Are you limiting the OC/UV potential, or are the 7900 XTXs less tunable than the 7900 XTs? I have the XFX Merc 7900 XT. I am able to stably run Time Spy Extreme @ 3050 MHz undervolted to 960mV. I am also able to OC the memory to 2810 MHz with a +15% power limit. I set the memory first. Then I set the max OC to 3.3 GHz @ 1V. I run the program and look for the max frequency it actually produces. I lower the max OC to 50-100 MHz over the observed frequency and begin lowering the voltage until I hit instability, then back off 10mV. I can get the memory to 2828 without performance loss in Port Royal.
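That routine is essentially a manual downward search on voltage. Purely to make the steps explicit, here's the same procedure as a Python-style sketch - the set_*/run_benchmark helpers are hypothetical stand-ins for the manual steps in your tuning software, not a real API:

```python
# Sketch of the manual OC/UV routine described above. set_memory_clock,
# set_max_core_clock, set_voltage_mv and run_benchmark are HYPOTHETICAL
# stand-ins for the manual steps you perform in your tuning tool.

def tune(set_memory_clock, set_max_core_clock, set_voltage_mv, run_benchmark):
    set_memory_clock(2810)                      # 1. dial in memory first
    set_max_core_clock(3300)                    # 2. generous ceiling...
    set_voltage_mv(1000)                        #    ...at 1 V

    observed = run_benchmark()["max_core_mhz"]  # 3. note the peak it really hits
    set_max_core_clock(observed + 75)           # 4. cap 50-100 MHz above that

    mv = 1000
    while run_benchmark()["stable"]:            # 5. walk the voltage down...
        mv -= 10
        set_voltage_mv(mv)
    mv += 10                                    # 6. ...and back off 10 mV
    set_voltage_mv(mv)
    return mv
```

In practice each run_benchmark is a full Time Spy Extreme or Port Royal pass, so the loop takes hours - and the silicon lottery decides where it terminates.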
It can be very hit and miss - silicon lottery, really. That's a good result you got, Robert.
@@amdforlife5591 Do you have a 7900 xt?
Lowest I can go with this card is 1090mV. Tried 1050mV and it crashed 5 times in a row in-game. 960mV seems like a fairy tale.
Wait, so you didn't turn on the tightened memory timings on the card?
When overclocked and undervolted, did you stay on BIOS 1 or 2?
Would love a 270mm pulse edition
Is an 850-watt PSU enough for this card and a 7800X3D?
Yes, I run that setup.
I'd go with a 1000W PSU to be on the safe side.
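A rough sanity check on that (all assumed figures, not measurements):

```python
# Rough PSU budget: 7900 XTX Nitro+ with a 7800X3D. Assumed figures.
gpu_peak = 520   # fully overclocked draw seen in the video (~430W closer to stock)
cpu_peak = 90    # 7800X3D is a low-power CPU
rest     = 60    # board, RAM, drives, fans - rough allowance

steady = gpu_peak + cpu_peak + rest
for psu in (850, 1000):
    print(f"{psu}W PSU: ~{psu - steady}W headroom over ~{steady}W worst case")
```

So an 850W unit should work, especially at stock, but these cards spike hard (as mentioned elsewhere in the thread), so 1000W buys comfortable transient headroom if you plan on the full ~500W overclock.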
Sapphire & PowerColor are the real beasts of AMD cards 💪
Hello. First of all, nice video, but a question: @4:25-ish you say that the card offers 2x HDMI 2.1 and 2x DP 1.4 - is this correct? Aren't the new cards supposed to have DP 2.1?!
Bought this on launch day on the assumption it would be best in class. Very much a bitter pill being £100 more than we'd have liked... but Nvidia's open source/Linux support is an utter disgrace, so it really wasn't a contest. And, of course, it wasn't in stock, so it won't even arrive for at least another week.
Here's hoping the drivers mature enough to make this not feel like a giant waste of money.
It's a solid card. Just not efficient, cool or quiet. But it'll perform, for sure.
@@OscyJack- It didn't seem loud to me.
@@xXDESTINYMBXx Coil whine is quite loud on the 5 samples I got.
How're you finding it now that you've had some time with it?
@@damianwright3690 It's been close to perfect. Running with mesa-git on CachyOS (Arch), and other than a few days of broken RT [caused by Mesa, not the GPU], it's wildly exceeded expectations. Easily the most stable card we've ever owned, and performance is a vast step up from the 5700 XT. If you were holding off just to be sure, go ahead and buy.
Excellent review. Thanks for checking and documenting the coil whine. That level of whine is a definite no-go for me.
Coil whine like this only happens during benchmarks due to high power draw, not in gameplay. The same is true for Nvidia cards too.
@@sudeshryan8707 Yeah, it's true, it's not often that prevalent when gaming, but it's good to point it out as a 'worst case scenario' - it wouldn't put me off buying one.
You will likely only get that whine if you are running at over 200 fps with a high refresh monitor in some games.
As I am in the market for a new card to push even higher fps for competitive gaming, I do believe the whine may well occur with this card. Wouldn't you say?
Just grabbed it for $733 in the US on Amazon.
This card took the place of the 4090 in my cart for the next upgrade. I don't feel like paying 2k to Nvidia anymore. After a lifetime of being faithful to Nvidia, it's time to say goodbye.
Agree with Dom this is a really nice designed graphics card.
While I have a Gaming OC 4090, I'm gonna pick one of these up early next year to tinker with.
Can all 4 ports be enabled for 4 simultaneous monitors at the same time?
I’d love to see one of these on chilled water. I got a 3090 Kingpin Hydro Copper, and I might grab one of these for some daily ice water running through it. As for performance, I don’t think it’s anything huge over my current GPU, due to my overclock. But I still really like this Nitro+ GPU.
It WILL be a huge upgrade over your current GPU. The 3090 is no match whatsoever for the 7900XTX.
You said those are DisplayPort 1.4, but isn't it supposed to be DP 2.1?
Which BIOS is the default? I don't feel like opening my PC back up to check, and I installed mine a couple of weeks ago. I want to make sure I'm getting the highest performance.
The Quiet BIOS will be off by default as it reduces performance.
I wish you would also add Linux benchmarks in the future :) Thanks for the review. I think I will buy this card.
Thank you, it's something we would love to add, but sadly we are always fighting against the clock!
This card is now $999 USD, which is huge for someone considering it vs a 4090.
17% more power for maybe 8% max, probably 5% average, improvement at over-4080 prices; it's only going to sell to a few.
The price is high on all the cards, but you've got to factor in the prices of Nvidia's cards. If you want a new card you will be paying through the ying-yang regardless.
Hi
I want to buy an RX 7900xtx and I'm hesitating between the nitro+ which is 1300 euros, the XFX Magnetic Air which is 1000 euros and the Taichi which is 1150 euros.
Are you still satisfied with your graphics card? Have you encountered any problems such as coil whine, like we heard in the video, etc.?
Can you give me the temperatures?
Thank you
I have the 6800 XT Sapphire Pulse... was considering this card. Thoughts? I have the 5900X CPU.
The thing with Nvidia is that they don't give their partners enough headroom in terms of pricing, so AIBs are forced to cut corners everywhere to earn at least a little from their custom GPUs. That means Nvidia cards usually come with lower-quality materials, coolers, etc. It's one reason EVGA quit its cooperation with Nvidia.
AMD leaves a lot more headroom in that respect, and AIBs can provide more quality - and usually do, with proper GPUs.
Killer review. But man, that thing is a beastie! Watch your power meter spin when you're gaming!
Is it normal for the backplate to get extremely hot? So hot that I can't even put my finger on the card around the chip. The temperatures in games are between 70-80°C and the wattage around 400W. Should I use more fans for better cooling in the case to keep the backplate cooler?
70-80°C is fine.
What's the PCB length? Looking for a waterblock.
These cards have the newer DisplayPort 2.1, not DisplayPort 1.4.
Great work and merry Christmas.
You mentioned coil whine: have you considered doing a video on it? The subject in general, not just the whine of this card, that is - like the frequencies of the whine. Someone who's deaf above 8,000Hz may be tolerant of a whine that someone who's deaf above 10,000Hz is not, if that whine is at 9,000Hz.
Here's looking to more great content in 2023!
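In the meantime, anyone can crudely map where their own hearing cuts off with a generated test tone. A stdlib-only sketch - the frequency and level are arbitrary test values, and mind your volume:

```python
# Write a 3-second 9 kHz sine wave to test.wav to check whether a whine
# at that frequency would even be audible to you. Stdlib only.
import math, struct, wave

RATE, FREQ, SECS, AMP = 44100, 9000, 3, 0.2   # modest level - protect your ears

with wave.open("test.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)        # 16-bit PCM
    w.setframerate(RATE)
    for n in range(RATE * SECS):
        sample = int(AMP * 32767 * math.sin(2 * math.pi * FREQ * n / RATE))
        w.writeframes(struct.pack("<h", sample))
print("wrote test.wav - raise FREQ until you stop hearing it")
```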
Coil whine is generally the inductor coils in the power delivery (around the voltage regulators) physically vibrating as large, rapidly switching currents pass through them; it's typically worst under extremely high power draw.
It's usually more noticeable at massively high frame rates too, like 200+.
Would someone do an ML benchmark with this card, pretty please? 24GB of VRAM makes it a good candidate, and with Nvidia's ridiculous pricing I would go with the red team if the trade-off is not that bad.
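Seconding this. For anyone who wants to try it themselves: PyTorch runs on these cards via ROCm on Linux, and the ROCm build still exposes the device as "cuda". A minimal throughput probe might look like the sketch below - the matrix size and dtype are arbitrary choices, and you'd need the ROCm build of PyTorch installed:

```python
# Minimal fp16 matmul throughput probe with PyTorch on ROCm.
import time
import torch

dev = torch.device("cuda")   # the ROCm build maps the Radeon GPU here
N = 8192
a = torch.randn(N, N, dtype=torch.float16, device=dev)
b = torch.randn(N, N, dtype=torch.float16, device=dev)

for _ in range(3):           # warm-up so clocks and caches settle
    a @ b
torch.cuda.synchronize()

iters = 20
t0 = time.perf_counter()
for _ in range(iters):
    c = a @ b
torch.cuda.synchronize()
dt = time.perf_counter() - t0

flops = 2 * N**3 * iters     # a matmul is ~2*N^3 floating-point ops
print(f"~{flops / dt / 1e12:.1f} TFLOPS fp16")
print(f"VRAM allocated: {torch.cuda.memory_allocated() / 2**30:.1f} GiB")
```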
When additionally overclocked, does the Adrenalin power consumption display show power over 430W?
THAT COIL WHINE THOUGH!!!
How is the Linux performance?
Not many people test Linux performance; I guess it would take a lot of time for a small audience. I use Linux too, but I base my results on Windows. You can get decent performance with these cards on Linux.
The multiple BIOSes seem silly - why not have this as a software setting?
Modern GPUs may still include physical VBIOS switches for several reasons. First is safety and recovery: the physical switch serves as a fail-safe, so if a software VBIOS update fails you're not stuck with a bricked card. Physical switches also ensure compatibility in scenarios where a specific VBIOS version is necessary, especially in legacy systems or non-standard configurations. Security is a concern too, since software-based VBIOS changes could be exploited by malicious software, and we definitely don't need some GPU-frying virus going around. Specialized applications, older systems without software-based updating capabilities, and industries where precise GPU configuration control is vital also benefit from the reliability and consistency that physical switches offer. In essence, physical VBIOS switches provide a robust and secure way to manage GPU configurations in situations where software-based alternatives may not be sufficient. Having the setting in the motherboard's BIOS, communicating with the GPU to change its VBIOS, would arguably be better, but that might take extra, unnecessary work.
Why do cards have coil whine? Does that mean the coil whine will always be there?
What do you do about the hotspot temps hitting 110°C?
Custom cards don't have that issue... only reference cards.
Yeah, only some reference cards have this issue; I was one of the unlucky ones. I got a refund and bought the card in this video.
4:00 2x DP 1.4? 🤒🤦🤦‍♂️ Or 2.1?
2x DisplayPort 2.1
Performance-wise, how does it compare to the Red Devil 7900 XTX? I kinda like this one's aesthetics even more.
It's an awesome card.
Hahaha, I was just about to comment on your last video to check this out. Beat me to it. Hey Bang, if Dominic is hitting just over 500W when fully overclocked, that probably explains why yours sits at 464W all the time.
So this card is $1000 less than the 4090 FE but only 10% slower?
Although you said it's similar to the 4080 in terms of performance and price, the 24GB is definitely a win over the 16GB that the 4080 offers, especially with how demanding games are getting on VRAM, and also for game development and AI tool usage. Taking that into consideration, the XTX is much more worth it: you sacrifice only a few frames in ray tracing but get all the other benefits that come with more VRAM.
If you don't run RT, the 7900 XTX is definitely the better pick compared to the 4080. If you run RT, then even the 4080 and 4090 barely manage in 4K, and in 4K gaming the 4080's VRAM won't outlast the 7900 XTX's. DLSS was a bad argument and still is, but if you care about it, FSR 3 is proving to be very potent.
The only use case I've seen for the 4080 is 1440p RT gaming, and even then Cyberpunk with RT runs at 50 fps, for instance. Not a big selling point to me.
@@psobbtutorials6792 How about RT at 1080p with a 7900 XTX?
@@Good_Username 6800 XT is for you... 7900 XT / XTX is better for 4K
@@gabrielfpi3046 I plan on connecting it to a TV for couch gaming as well, and the price difference is like $300, so the 7900 XTX is more versatile and future-proof.
Without doing any overclocking, would this card outperform the RTX 4080 Super? I refuse to overclock :) Thanks.
It'd trade blows, but it's generally just a smidge ahead in raster and well behind in RT.
@@KitGuruTech Thank you so much! This will be my first time going with Nvidia.
Bro's eyelashes are on point.
This beauty is the card that got me out of the nightmare of a Zotac Trinity 3090 that ran at 77°C with fans at the speed of a raging McLaren.
I have a Red Devil 7900 XTX Limited Edition... and it runs really well. It pulls 430W at around 2900MHz, GPU temp 45°C, junction temp 77°C... I must say the Red Devil runs great.
Hello, can I put this video card in an Aorus Elite B650M AX (microATX), or do you need an ATX motherboard? Thank you.
4090 AIBs are much, much larger. 135mm height? Try 150-162mm on Nvidia cards. That's also what's causing them not to fit in cases.
Just bought one for 200 less than a 4080... love this card.
500W - so much for that performance-to-efficiency ratio ;).
It's a solid card, just hope the fanboys will pipe down now.
Not saying the 7900 XTX's stock power efficiency is anything earth-shattering, but if you deliberately raise the power limits and overclock any piece of hardware, you opt to throw efficiency out the window for some extra perf. Obviously this card delivers much better perf/Watt at stock than when overclocked at 500W.
@@Hugh_I Except at idle ;););).
I'm really just using the fanboys' arguments against them, since their deludedness deserves it after six weeks of AMD worship and Nvidia inquisition pre-launch.
@@OscyJack- Then don't make it worse and act like those people. Both Nvidia and AMD are out to screw us.
@@theHeartlessNooB of course they are out to screw us, they are a corporation. Which is why I said fanboys. Being a fanboy is a recipe to waste money and make foolish statements. Calling out a fanboy is what needs to happen. That's what I did. Seems like you don't disagree....
Love my Sapphire 6900XT Liquid ... really nice brand.
Sapphire's cooler implementation is excellent compared to the reference design.
They always make great coolers - one of the better partner cards.
Kinda crazy how Sapphire lets you pull 500+ watts with overclocking. This seems like it could be a Kingpin-style product, designed to push the clocks to the limit.
Hey #KitGuruTech, the DisplayPorts are 2.1, not 1.4. Got to read the box before writing the news script.
I am going to buy a Nitro card and I have a TT 750W 80 Plus PSU. Is this PSU enough for this card?
DisplayPort 1.4? Not 2.1?
AMD & Nvidia need to rein in this ridiculous arms race of inefficient (dare I say hot and loud) GPUs. People should at the very least be put off by needing an 800W-or-greater PSU for their flagship cards. Not to mention these gigantic cards are paired with equally grotesque heatsinks, making them vastly more expensive along with being big, ugly power hogs.
Yet there are some people who I am sure are eagerly awaiting next year's 7-slot, 1200W, $3000 behemoth.
The 4090 is actually shockingly efficient depending on the demand, and 800W has been the recommended PSU size for flagships for 4+ years.
Definitely expensive
It's really not that efficient as a basic GPU design; it's one of the biggest monolithic dies found on GPUs in general, and Nvidia is taking a big hit from the wafer failure rate.
Hence the $1600 price tag and such.
I wouldn't praise Nvidia in 2024 though; they're at the end of their monolithic builds - they can't get any bigger or smaller - and may start to resort to cheap tactics to hurt AMD like they did 10-20 years ago.
So after the 4090, they may start to produce the same GPU but with double the tensor and RT cores, maybe? And push DLSS 3 or 4 to fake as much of the imagery as possible, because the hardware can't get better, only work differently - like an illusion for the visuals.
AMD will need to keep scaling chiplets; RDNA 4 will be much better, with more efficient RT, but Nvidia will likely pull some strings again to gimp RT gaming performance overall, so you need a ton of RT hardware to overcome the wall they make developers put up.
YouTube randomly deletes my comments referring to old articles about what Nvidia does wrong...
Nvidia abused tessellation.
Sometimes I wonder if Nvidia is secretly on the CCP's payroll; stuff keeps disappearing behind censorship.
The Sapphire 7900XTX Power Hog King!
🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥