Beautiful, thanks for sharing and covering the RTX 4090's uses for production and content creation. It's rare to see, since so many YouTubers only review gaming benchmarks.
Thanks for watching! And yeah, I noticed that too. It's like 95% gaming and 5% productivity, ha.
I think the productivity gains in Blender make the 4090 a must-have for 3D modelers.
Agreed, it delivers great performance for 3D modeling.
I think you're doing your calculations incorrectly. For example, in the chart at 5:18 you calculated how much better the 4090's FPS is than a single RTX 3090's as (1 - (19.1/30.7)) * 100 ≈ 38%, and in the chart at 5:30 you calculated how much better the 4090's FPS is than dual RTX 3090s' as (1 - (65.9/115)) * 100 ≈ 43%. However, that calculation does not give you the percentage increase in performance of the 4090 over another card. The calculation at 5:18 should be ((30.7/19.1) * 100) - 100 ≈ 60.7%, and the calculation at 5:30 should be ((115/65.9) * 100) - 100 ≈ 74.5%.
Think of it like this: if you have $100 and I have $50, you have double (2x) the amount of money I have, or 100% more money than me: (($100/$50) * 100) - 100 = 100%. The other calculation, (1 - (50/100)) * 100 = 50%, would tell you that you have 1.5x as much money as me, which I do not think is correct.
However, the chart at 5:54 does use the correct calculation to find a reduction in time of the 4090 over the dual 3090s: (1 - (28/45)) * 100 ≈ 38%.
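To make the two formulas concrete, here is a quick Python sketch using the numbers above (my own illustration for the thread, not anything from the video):

```python
def pct_faster(new, old):
    # Percentage increase in performance: how much faster `new` is than `old`.
    return (new / old - 1) * 100

def pct_shortfall(new, old):
    # The formula used in the video: the old result's gap as a share of the new result.
    return (1 - old / new) * 100

print(pct_faster(30.7, 19.1))     # ~60.7 -> 4090 vs one 3090, the right number
print(pct_shortfall(30.7, 19.1))  # ~37.8 -> the ~38% shown in the chart at 5:18
print(pct_faster(115, 65.9))      # ~74.5 -> 4090 vs dual 3090s
print(pct_faster(100, 50))        # 100.0 -> $100 really is 100% more than $50
```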
Regardless, good video.
Thanks for the breakdown - I'll have to investigate the code that's setup for the percentage calculator in my spreadsheet. If the percentages are wrong then I apologize, the numbers and data still remains the same. The performance for productivity in video production is still quite low from the generational jump from the 3090 to the 4090 which is my main point. Appreciate you looking out and thanks for watching!
Don't you also have an 8K monitor? Could you make a video on how it performs at 8K in games that are a few years old?
I do, and it's pretty much the ONLY 8K monitor on the market. It LOOKS amazing, but there really aren't any games that perform well at 8K yet with the current GPU lineup. Even the 4090 only plays 8K games at around 30fps if you're lucky.
Great overview of the difference in real world performance, just what I looked for.
The wording used to describe percentage gains might be incorrect. From the stats in the video, it sounds like you are using percentage reduction in time and percentage increase in speed interchangeably, when they don't mean the same thing.
1.88 FPS to 7.28 FPS is ~300% faster
19.1 FPS to 30.7 FPS is ~60% faster
64.1 FPS to 115 FPS is ~80% faster / gains
68 sec to 38 sec is ~79% increase in speed
The wording about reduction in rendering time is correct in the video:
45 seconds to 28 seconds is a ~38% reduction in rendering time.
You say on the last slide that a 4090 is 32% slower than 2x 3090s; that is correct, but 8:25 is either ~6% slower (in speed) or ~7% more time than 7:53, not 9%.
The numbers in the examples make it clear that you use percentage increase in speed and percentage gains to mean percentage reduction in time spent, but anyone not calculating the numbers can get the wrong impression.
The 4090 uses ~40% less time than the 3090 for the same number of frames in DaVinci Resolve, which means it is about ~70% faster, or has a ~70% gain/improvement over the 3090 in DaVinci Resolve.
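A quick sketch of that conversion, plus the last-slide times, in Python (my own numbers-check, not from the video):

```python
def speedup_from_time_saved(reduction):
    # A fractional reduction in time, expressed as a percentage increase in speed.
    return (1 / (1 - reduction) - 1) * 100

print(speedup_from_time_saved(0.40))  # ~66.7 -> "40% less time" is roughly 70% faster

def to_seconds(mmss):
    # Parse an "m:ss" render time into seconds.
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

t_4090, t_dual = to_seconds("8:25"), to_seconds("7:53")
print((t_4090 / t_dual - 1) * 100)  # ~6.8 -> the 4090 takes ~7% more time, not 9%
```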
The main point of the video still stands: the 3090 is more than good enough for DaVinci Resolve, the 4090's improvement in real-world feel when actually using DaVinci Resolve is not much different from the 3090, and the reduction in rendering time is only about ~40%. My language nitpick is only because this is the niche benchmarking field, where the standards are different. I appreciate that you specify that the user experience of using DaVinci Resolve is basically the same for the 3090 and 4090; that is what matters, and it does not show up in the graphs.
Yes, I used the wrong Excel formula for the percentages. The numbers/data still stand, though, and the 'feel' of the performance still isn't quite there compared to dual 3090s. Thanks for watching!
Awesome video! Thanks man! Real-time results are way better to see than just benchmarks. Staying with my RTX 3080 Ti for now, and thanks for saving me $3000 CAD 😀
Thx for the video! Glad to see a video focused on editing w/ Premiere & Resolve. I have a 2080 Ti and need to upgrade because of 10-bit 4K footage... haha
The 4090 will be a solid upgrade from the 2080 Ti, especially in Resolve. Thanks for watching!
@@District7DrewGeraci Thnx!! Did you install it in a large case? Wondering if it will fit in my mid-size one, LOL
@@ramdogproductions Yeah, you likely need an E-ATX case to fit any of the 4090s. It's a solid 3-4 inches longer than the 3090, which was already huge. I'm using the Meshify 2 XL and it fits perfectly.
@@District7DrewGeraci Thank you for those recommendations!
Just got this 4090 (Zotac) at Micro Center. Upgraded from a 3090 Gaming Trio X. I was concerned about the width (HYTE Y60 case), but it fit perfectly, with enough room for airflow. Some versions barely have room between the fans and the glass. The card itself looks great; however, in a Y60 case the curves make it look out of place, since the rest of the case is straight/angular. For a regular case where the front of the card is not on display, it wouldn't matter. The RGB accents look really nice and the warranty seems solid. The packaging DOES look SUPER premium. Two thumbs up! 🤣
Yeah, it's a great card! It's really improved with the newer drivers too.
Dude!!! You are a GOD!! The first video where real-life performance has been tested with the Neat Video noise plugin!!
A shame that you don't have the setup I need...
Intel i9-13900K
Haha, right on man. I would imagine with the 13900K you'll see even better performance, because that CPU allows for higher memory bandwidth - you'll love it. Thanks for watching!
@@District7DrewGeraci You are welcome :)
This is the one I want. I can't find one for MSRP. I've never owned a Zotac before, but I think it's the best looking of all the 4090s. Normally I'd be all over the Strix card, but I ain't paying that tax anymore. Especially for the 4090 Strix, which looks like a shoebox with RGB.
True! The Strix looks really ugly. Same for the TUF. I have an RTX 3080 Ti TUF and it looks so good! The new TUF design is garbage in my eyes. But that Zotac card looks better than any other - except for the FE.
@@DanteBellin I've had nothing but Asus cards since the 660 Ti. I had an Asus 680 TOP, a ROG Matrix 780 Ti, and Strix OC 1080 Ti and 2080 Ti cards. All of them looked pretty good, especially the Matrix 780 Ti. I really liked the look of the Strix 30 series, but that tax just got outta hand. I currently have an EVGA FTW3 Ultra 3090 Ti and 3080 Ti. Call me shallow, but I really like how those cards say RTX 30** on the side. I'm curious as to what the EVGA cards would have looked like this time around.
@@OhItsThat Yes... I never really liked EVGA's designs. I've actually never liked any brand's designs. The FE 2080 was the first beautiful card imo. And the 30xx and 40xx FE are stunning. But those other cards cost 2000 bucks and they look like a toy for a 10-year-old boy...
YES, IT'S NOT BOXY-LOOKING OR OUT OF PLACE
Thanks for the video, but your math could use some work.
5:14
100 × (7.28/1.88) = 387% (287% increase)
5:40
100 × (115/66) = 174% (74% increase)
As for render time reduction, remember you're dealing with a denominator. Render TIME is 1/(work rate), so the calculation is:
100 × ((1/28) / (1/43)) = 153% (53% increase)
100 × ((1/38) / (1/55)) = 144% (44% increase)
100 × ((1/38) / (1/68)) = 179% (79% increase)
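The same rate-based arithmetic as a tiny Python sketch (my own check on the figures above, not anything from the video):

```python
def rate_increase(t_old, t_new):
    # Render time is 1/(work rate), so compare rates: (1/t_new) / (1/t_old) = t_old / t_new.
    return ((1 / t_new) / (1 / t_old) - 1) * 100

for t_old, t_new in [(43, 28), (55, 38), (68, 38)]:
    print(f"{t_old}s -> {t_new}s: ~{rate_increase(t_old, t_new):.1f}% higher work rate")
# 43s -> 28s: ~53.6%; 55s -> 38s: ~44.7%; 68s -> 38s: ~78.9%
```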
Thanks, yes, I was using the wrong code in my spreadsheet for those percentages.
Probably a gazillion dollars here in Australia!... lol. I don't play games, only edit in Resolve (8K timelapses). So I'm surprised that it will mainly be beneficial to gamers. Thanks for the great review, Drew! You just saved me a lot of $$$$$$. I'll stick with my RTX 3090 Ti for now! 👍🤣
Yeah, the playback performance for 8K RAW photos (Sony) is almost identical to the 3090's. I'm a big timelapse guy myself, and I just don't see this card adding any real meaningful performance to my workflow... especially at this price. If it were an $800 card I might keep it, but as it stands I'm pretty sure I'm returning mine too.
Also, thanks for watching!!
Considering jumping up to the 4090 from an RTX 2080 Super. I skipped the 30-series mainly because of scalpers/availability. The dual encoders on the 4090 look pretty nice and DaVinci Resolve performance looks pretty good overall. Probably a better buy than a Quadro card at this point, especially if I want to do some gaming on the side.
Can you do a benchmark for the Topaz Labs video enhancer?
Sure, you want it compared to the 3090?
@@District7DrewGeraci That would be awesome if you could, bro.
@@MagicXStallion It's not optimized for the 4090 yet. It's literally taking about 10x as long on the 4090 as on the 3090. There are definitely driver issues.
@@District7DrewGeraci thank you for the lab results
What CPU do you have? On 12th-gen Intel, timeline decoding falls onto the iGPU, so the 4090 really doesn't provide anything extra for decoding, but it should help with the other items.
These tests were performed on the AMD Threadripper Pro 3975WX.
Is an i7-12700K + 3000MHz DDR4 enough for this card?
CPU gonna bottleneck
@@alexl4626 I don't think the CPU is the problem, since it's the most recent model. But I guess the RAM ain't on par with the GPU horsepower.
Sheesh! Thanks for the video. I just got a Zotac 4090 at MSRP last night and am looking forward to it. I currently have a 1080 Ti :)
If you're coming from the 1080 Ti, you're going to LOVE the 4090!! Thanks for watching!
Does this card use the same vapour-chamber cooler as the Zotac 4090 Extreme?
Great question - I don't really know. Temps are quite low under full load though, 50°C max. It idles around 27-30°C with no load.
Upgraded from a GTX 1080 - it was worth waiting for!
Would be the same for me if I could grab one. I'll wait and see what AMD brings to the table then. Can't buy a 4090 here anyways. :(
Awesome - I wanted to know if the dual encoders helped in regular 4K and HD vs a single 3090, and they do. I mostly game, but video encoding is something I use quite often, and I'm not patient.
Have you installed the NVIDIA Studio driver for the benchmark runs? On the 3090, my 3D rendering workflows run better with that driver.
Yes, these tests were all done with the Studio driver. And yes, for 3D modeling the 4090 may actually be a decent upgrade.
How can I change the BIOS?
512 GB of RAM? That's insane.
Great test, thank you 👍 (upgrading from a 1080 Ti)
Zotac makes good stuff. The only reason I never got a card from them again after my first one is that no store I shopped at carried them. Enjoy your RTX 4090, but like you said in your video, the performance is good but not great, and as more and more people put these cards through their paces, more are realizing Nvidia pulled a fast one. DLSS 3.0, anyone? Btw, subscribed because you seem honest about things.
Thanks man! Appreciate the sub and the kind words. I'm likely going to return the 4090 and wait for the 4090Ti to release.
Excellent review Drew!! Thank you!
Any recommendations on GPU & CPU?
Upgrading to a new PC soon with the prime focus on video editing in DaVinci Resolve Studio. Ever since upgrading the camera, a better PC is key, especially for 4K & 8K with new codecs and 10-bit 4:2:2.
Any recommendations on GPU, CPU and anything else in mind that will be key to build a new PC?
Thank you!
0:18 uhhhh, that would mean you didn't have a choice 😂😂
Haha yeah, there was no choice.
Was really hoping to see some Fusion tests. Fusion comes with templates on the Fusion page, and the particle stuff especially can bog down the system.
Quite honestly there have been major driver issues with the 4090; the most recent studio version actually crippled performance in DR/Fusion. I'll make sure to do some fusion tests once they've figured out the bugs though.
I got a 3080 Ti, and I would need this card to be under $900 for me to consider upgrading. For DaVinci Resolve, is it worth adding a second GPU? 3080s are cheap now.
Buying two 3090s would give you about the same performance as one 4090, so yeah, double up!
Does the Zotac 4090 have coil whine under load?
No, not that I've encountered. It's running quite smoothly actually!
@@District7DrewGeraci Thank you!
I'm seriously looking at this 4090, as I can't get a 4090 FE from anywhere. Still debating whether or not the 4090 would be choked by the 5800X3D, as I have no plans to move up to the AM5 platform anytime soon. I currently have a 3090 FE that's mainly used with Microsoft Flight Simulator 2020. Any thoughts?
It's a solid card (the Zotac 4090), but honestly you're best off waiting for the 5000 series to drop this fall, or buying a used 3090 FE and NVLinking them. Not sure it's worth buying the 4090 at this point unless you find a great deal on one.
@@District7DrewGeraci Little early for the 5000 series to be dropping this fall?
@@jamesm568 Maybe... you never know ;)
@@jamesm568 I think we'll get some fun announcements... definitely 4090 Tis, BUT they could leapfrog AMD and announce the 5000 series (likely an early 2024 release).
@@District7DrewGeraci I just need a good GPU that's solid at 4K. The 3090 is close, but I do everything in 4K, and lower resolutions are irrelevant to me.
Not sure if I missed it, but how's the fan noise/coil whine on this card? :)
Haven't heard a peep in 2 weeks! No noise at all.
Do you recommend the Zotac model?
I'm 50/50 on it. It's crashed a number of times but it's slowly gotten better.
How does it do with Noise Reduction?
Check it out at 04:37 (Benchmarks for Resolve (Neat Video))
Does this Zotac 4090 come with a 3-way or 4-way split PCIe cable adapter?
4-way split from 1
@@District7DrewGeraci Great, thanks. Could you also take a look at the maximum power % you can set in MSI Afterburner? For example, the 4090 Palit GameRock OC comes with a 4-way splitter, but the power % is limited to only 111%, compared to the Nvidia Founders Edition's 133%.
@@rouder9237 Great question, I'll see if I can find that information out for you.
@Peyton Cole Which Zotac version did you get? (There are multiple.)
What about the fan noise?
None. I have it running at full power in my Meshify 2 XL case and hear nothing.
Hey, thanks for the great video. How is the coil whine on this model? Apparently 4090s suffer from horrible coil whine, especially the high-end models, but I see positive comments about the Zotac and Gigabyte models.
Something is really, really bugging me. In some tests you were seeing 35-45% increases in performance; however, you then say at the end that you don't see it being an upgrade. Those are very big performance increases for a single generational jump. How are you coming to that conclusion? Plus, you're complaining about the price. Yeah, two 3090s, really a great value there. (A conclusion, might I add, which is different from any other reviewer you can find right now. No one else has said it's a small upgrade.)
It comes down to the actual 'real-world' performance of the device. Sure, the benchmarks indicate a performance increase but if you physically can't feel that increase in performance while you're using it, then it means it's not that great of a performance increase. For example, coming from a 2080ti to the 3090 I saw vast improvements in render speeds, fluidity on the timeline (no stuttering) and a great improvement on the number of nodes you can use on full-res footage without causing playback issues. With the 4090 there is only a marginal increase in actual noticeable performance. As my machine was using dual 3090s I quite honestly didn't see a need to spend another nearly $2k on a new GPU that was only slightly faster and you're right, I invested A LOT into 3090s so why would I want to pay more for something that barely feels like an upgrade? I'm not saying the 4090 is bad, I'm saying if you're physically using it you're just going to feel like it's a 'meh' upgrade. By all means, get the 4090 if you think it will improve your workflow, for me, it just doesn't cut it.
Is 850W enough for the Zotac 4090?
You could get away with it.
I think the 4090's done okay there. I read that as significant performance gains over a single 3090, while still being better than dual 3090s.
I work with Blender, photogrammetry and rendering. We create a lot of animations that are 1 to 5 minutes long. We have four 3090s in a 5975WX, 256GB system. The appeal of moving to 4090s is that we'd only need two cards to beat our four 3090s, and we could then expand if we need to. The downside is we'd need water blocks, which are a hassle.
I think the other issue with the 4090 is that it doesn't have enough VRAM. It's quite easy to use 24GB of VRAM, and then the card struggles. The new A6000 will have 48GB, but the cost will be much higher - over 2x.
I think Nvidia has learned from last time, when many production houses, CAD shops, etc. used 3090s instead of the Quadro range. Now they're holding back on VRAM in the gaming cards to force "pros" onto the workstation range. Maybe that's also partially behind the crazy cooler size.
Anyway, good review, looking forward to your future content 👍
Did you use Studio drivers?
Yes
It's crazy - I got my TUF OC 4090 and absolutely no one wanted this AIB.
I think you are being rather unfair, considering it has only had one firmware update, and the fact that it still manages to beat two cards only one generation older is actually quite impressive. Yes, it's an expensive card, but that's because it can run games at 50% better FPS in most cases at 4K. It's not really designed for video editing; that's just another nice trick. And surely, 30 seconds saved here and there add up considerably over time spent editing.
I don't think I'm being unfair, especially since the gains from the 2080 Ti to the 3090 were triple to quadruple the performance of what we're seeing here with the 4090. Since I paid $2k for my 3090, I would expect (at a minimum) a doubling of performance, which just isn't happening (in my case, for video production). There are definitely large gains for 3D modeling and gaming, but that's not what I bought the card for (only mild 3D modeling work). Hopefully they deliver new drivers soon that boost performance, but it's hard to say at this point. And to say this card isn't meant for video editing would be incorrect. The CUDA cores, VRAM, and GPU clock speed all have a direct (and highly impactful) influence on how fast things are rendered, played back, and processed in real time in 'most' video editors (excluding Adobe).
@@District7DrewGeraci I stated that it isn't designed for video editing, not that it isn't meant for it; of course it can be used for it. But the high-end Nvidia 4090 cards, like the Titan cards in the past, were/are built around overclocking and top-end performance in games. That is what they are designed to do, and in games they are 77% better than the 3090, and 55-60% faster than even the 3090 Ti (a card that, six months ago, cost more than the 4090). This is not a bad card at all; in fact, it is the biggest performance uplift in GPUs since the GTX 10 series - in gaming, I might add. Hence why I say it is not designed for video editing. You would be better off going the Radeon 7000 route if that's what you want to do with it. :)
Also, the 20-series cards were known to be very underperforming; it was the first time they stretched their legs on RTX and ray tracing, so the 30 series was a massive performance jump. The 10 to the 20 series was awful in rasterization performance; what made it novel was the fact that it could do ray tracing, just not well.
Neat is a monster, but I found it's better to use a lower setting to keep the sharpness. I use 480p files only, 'cause I'm an upscaler.
So you're saying if you have a 3090 there's no point to upgrade to a 4090?
If you can find a cheap or MSRP 4090, yes, it's worth the upgrade, but comparatively it doesn't really add that much performance in video/photo apps. Dual 3090s run almost identically (and you can snag one for around $650-800 now) and might be better. 4090 Tis are right around the corner too.
I would buy the 4090 only because I have an FTW 1080. I do a lot of photo and video editing and will consider the 4090, but not at current prices.
It's definitely worth upgrading to from the 1080 - a huge performance increase. The price is definitely still high, though, and with the 4090 Ti coming out this summer it might be best to wait.
Never been a fan, but never used one? I know we can be strange with stuff like that, but it's odd how that happens.
I've never been a fan because of what I've heard from others who have purchased Zotac - its reputation isn't the best. So far, though, it's performing as it should.
Looks like it can handle 8K video footage with ease! Wow! That's amazing. Perhaps a bit overkill for playing Tetris. 🔷️ But can it mine Bitcoin?
It definitely handles 8k footage wonderfully (but so does the 3090). And I'm sure if bitcoin were up, every miner out there would be snagging these cards!
Yeah, I got the same one. There were 18 cards left and 18 people in line. It was funny - up until then they kept saying "we're about to run out, guys", then at the last minute it was "ahh well, good and bad news. Good news: you get a card. Bad news: you get a Zotac." Lmao. Idk if it's good or bad, but as long as it's not loud and does at least FE performance, I'll be cool. Never had a Zotac, but all the reviews looked OK last gen except on the 3090 and 3090 Ti, so I'm hoping they resolved that.
Yeah, I wasn't thrilled, but it's as fast as the 4090 FE, just quite a bit bigger. The cooling is good on it though; the idle temp is 31°C and full load is 51°C.
@@District7DrewGeraci Well, that is good cooling; at least that cooler is doing something, versus the Zotac 3090 Ti, which from what I heard was not great. I started a new PC build about two weeks ago and have been exchanging parts and deciding what to end up with. I think I've probably gone too far - I'm considering taking it all back, building a 5800X3D system, saving myself $2k, and going with a 3080 Ti, which would be head and shoulders above my 1700X and 1080.
How about comparing it to a 3090 Ti?
I don't have a 3090 Ti, so I couldn't compare it.
512GB of DDR4 - I thought it was a typo 😂🙃
My boot drive is 512GB, and you have that much in RAM alone?! 😂
Yeah, not a typo, haha. My workstation is used for VFX/3D modeling, etc., which is memory-hungry. Might upgrade to 1TB soon, ha.
I just picked up a Zotac 4090 to replace my 3080 Ti... At 1440p ultrawide, my CPU struggles to load it to 70% in Warzone... I think I'm going to return it, since I can't afford to completely rebuild my rig right now to fully utilize it... One heck of a card, but you need a top-of-the-line chip to actually use it completely.
What’s your CPU ???
You're literally running into CPU bottlenecks. This card is meant for 4K and up with the best CPUs.
@@romxxii Dude, I bought a 4090... What makes you think I won't buy the newest Intel processor 🤣
@@Xyz_Litty I'm not responding to you, I'm responding to the OP who is trying to game at 1440p and is clearly getting CPU bottlenecked. Good for you for buying a high end CPU, but OP sure as fuck didn't.
The 4090 is a 4K card, lol.
When you say it is 37% faster than a 3090, what numbers are you using? On the graph I see it go from 1.88 fps to 7.28 fps; for me that is ~3.9 times as fast (roughly 290% faster), not 37%! So IT IS a gigantic increase, and not even close to only 37%.
While I did have the percentages backward, in the real world it's still a meaningless upgrade. Noticing 7 FPS instead of 2 FPS (even at nearly 4x the speed) is negligible when you're actually using the program.
@@District7DrewGeraci OK, thank you. I ordered one, so my 5900X + 3090 PC is already half replaced with a 7950X + 4090 PC (the 4090 isn't here yet). I hope that the weakness I see in my 5900X + 3090 PC, which is the playback of some 4K 60 H.265/H.264 formats, or two parallel 8K RAW files with noise removal and other things like that enabled, will get a big improvement and will play back much smoother. The new motherboard/CPU/RAM already brought an improvement, and I hope there will be a similar improvement again with the 4090. If I see no big improvement, then maybe I'll keep the 3090, but I think I'll get what I'm hoping for.
@@iliaskapatos A newer CPU will definitely help boost the 4090's performance; I think I have around a 31% bottleneck with the Threadripper Pro. That being said, I just tested it out on a friend's 7950X as well, and the performance was pretty much identical (in terms of playback and fluidity). It's not bad by any means, but coming from dual 3090s it's hardly an upgrade (for me).
@@District7DrewGeraci Thank you for this information 👍, it is great to hear that the 7950X is similar in fluidity to a 32-core Threadripper. I got the AM5 system also because there are rumors that the next generation of AMD CPUs, the 8950X, will double the core count, so my thought was that at least then an AM5 system should be faster than the current 32-core Threadrippers, just with some limitations like a max of 128GB RAM. If the 7950X is already as fast in the video editing program I use, then that is great and surprising to me 😃. And if everything is already fluid, then I won't have to upgrade my PC anymore. Can't wait to get my 4090.
Unrealistic expectations, but thanks for the video.
I agree the expectations were unrealistic to a point, but when you spend $1800 on a GPU that is supposedly meant to be a 'big' improvement on its predecessor and it's not, that's the real problem. My expectations would be more reasonable if the price matched the performance - something in the $950-1200 range would be the appropriate price point.
@@District7DrewGeraci Thanks for the reply. Sorry for the original terse comment. I will clarify.
1. Still early, so the drivers should improve over time.
2. We are at the point where Moore's law is running out of steam. In order for these GPUs, and CPUs as well, to perform as well as they do, the sizes and power consumption are increasing to absurd levels.
3. The price/performance compared to past generations needs to be calculated so we can say with more clarity whether it is worth it (a rough sketch follows after this comment).
4. Define 'big' improvement.
5. Inflation.
6. Ultimately, this is primarily a gamer GPU, which it clearly is, even if it is marketed as a productivity GPU. Compare it to the Quadro line to see how the price/performance stacks up.
This single GPU is being stacked up against two 3090s - the previous best GPU, which was selling for over $2000 at points during the pandemic, a price paid even by non-miners. So in that context, someone might have paid between $3500 and $4500 for two new 3090s, and I think we can agree that a single 4090, which you can buy at retail for roughly $1600, gives better (or at least equal) performance, fits in a mid-tower, uses less power, makes less noise, is more modern, and will hold its value better (since you didn't have to pay an inflated price).
I haven't run the numbers to make a graph to see if the expected improvements in productivity compared to past trends are being met, but I am sure someone on the net has that benchmark data.
Pardon me if my comments sound confrontational. They are not meant to be in any way. Wish you the best, and happy holidays!
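On point 3, here is a back-of-the-envelope perf-per-dollar sketch using the Resolve playback numbers quoted in the comments above; the launch MSRPs (3090: $1499, 4090: $1599) are my own assumption, not figures from the video:

```python
# Rough fps per $1000 from the two Resolve charts discussed above.
# MSRPs are assumed launch prices, not pandemic street prices.
chart_518 = {"RTX 3090": (19.1, 1499), "RTX 4090": (30.7, 1599)}        # single-card chart
chart_530 = {"2x RTX 3090": (65.9, 2 * 1499), "RTX 4090": (115, 1599)}  # dual vs single
for chart in (chart_518, chart_530):
    for name, (fps, usd) in chart.items():
        print(f"{name}: {fps / usd * 1000:.1f} fps per $1000")
```

At those assumed prices the 4090 comes out ahead on both charts; at pandemic street prices the gap would be even wider.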
FYI, I got the same card and it popped last week. Once I hit 350 watts, it would freeze the PC. I RMA'd it, and the new card would turn off the PC in games, and I have a new MSI Gen 5 PSU, so it's not the power. I was able to get my money back and got an MSI 4090, and guess what - three days, no problems. I also just found out today that you cannot use Nvidia's drivers; only the ones from Zotac work on this card, and they're way behind on their drivers. Save your money and get a good GPU.
Never used Zotac, never been a fan of Zotac. Makes sense!
It was a joke because it was the only one available ;)
35-50% faster, 40% lower heat and energy, and you're not satisfied?
C'mon, man...
I'll be putting out another video shortly. Drivers seemed to be the biggest issue; however, performance hasn't increased that much (for the price point). I'm very happy with it for gaming and 3D development, though. I just wish it worked better in certain production applications.
Didn't watch the video solely because of the cringe "mistakes were made!" cliché small-YouTuber clickbait. Please, for the love of God, don't.
Thanks for commenting! And welcome to 2022, where clickbait is the only way to get people to watch videos.
@@District7DrewGeraci I liked it but won't watch another video because of that comment. Be better. Content matters, not just titles.
@@bobbywiederhold Well, I'm glad you liked it, and I agree content matters (which is why I try to produce solid content), but honestly it's all about YT's algorithm now. While I 100% agree with you, if you don't play the game, you don't see results. Would still love to have you as a follower though!
I got the Asus one.
If you're not happy with the card, I will buy it from you. Just saying.
Mac M1 Max ❤
Just get a Mac Studio - much faster if you're all-in on video rendering.
Negative. I have a Mac Studio and can confirm it's nowhere near as fast. Having that dedicated GPU makes a HUGE difference when it comes to real-time playback and performance on any 8k+ production. The studio is great for handling 4k footage though with minor VFX applied.
@@District7DrewGeraci Oh, so the Mac Studio M1 Max can't handle 8K? Sorry, I only deal with 4K. 8K rendering must be so power-hungry that a 4090 is required.
Anyway, how is this Zotac's temp/noise/power consumption? I'm looking to buy one to upgrade from a 4070 Ti TUF, but I'm not sure about the Zotac brand.
@@thanatosor the Ultra Studio can definitely handle basic playback of 8K without any issues, but when you start compounding VFX layers, additional nodes, and long-GOP footage, it begins to slow down tremendously. It's a great system, but it has limitations compared to standalone PCs with dedicated GPUs.
Dual 3090s are so pointless.
For gaming, yes. For productivity, no.
nope :)
NVIDIA sucks so bad, I'm done with them.